This paper investigates the lag length selection problem for a vector error correction model, using a convergent information criterion and tools based on the Box-Pierce methodology recently proposed in the literature. The performance of these approaches in selecting the optimal lag length is compared via Monte Carlo experiments. The effects of a misspecified deterministic trend or cointegrating rank on lag length selection are studied. Since such processes often exhibit nonlinearities, both iid and conditionally heteroscedastic errors are considered. Strategies for avoiding misleading outcomes are proposed.
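The two selection devices compared in the paper can be illustrated with a minimal sketch: choosing the lag length by minimizing a convergent (BIC-type) information criterion, and checking the selected model's residuals with a multivariate Box-Pierce portmanteau statistic. The simulated bivariate VAR(2), its coefficient matrices, and the helper functions below are hypothetical choices made for illustration, not the paper's actual designs; the same mechanics underlie lag selection in a vector error correction model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a stationary bivariate VAR(2); coefficients are illustrative only
T, K = 400, 2
A1 = np.array([[0.4, 0.1], [0.0, 0.3]])
A2 = np.array([[0.35, 0.0], [0.0, 0.35]])
y = np.zeros((T, K))
for t in range(2, T):
    y[t] = A1 @ y[t - 1] + A2 @ y[t - 2] + rng.standard_normal(K)

def fit_var(y, p):
    """OLS fit of a VAR(p) with intercept; returns residuals and parameter count."""
    T, K = y.shape
    Y = y[p:]
    X = np.hstack([np.ones((T - p, 1))] + [y[p - i:T - i] for i in range(1, p + 1)])
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return Y - X @ B, K * (K * p + 1)

def bic(y, p):
    """Convergent (BIC-type) criterion: log|Sigma_hat| + log(T)/T * n_params."""
    resid, n_params = fit_var(y, p)
    Teff = resid.shape[0]
    sigma = resid.T @ resid / Teff
    return np.log(np.linalg.det(sigma)) + np.log(Teff) / Teff * n_params

# Pick the lag length minimizing the criterion over a candidate range
p_hat = min(range(1, 7), key=lambda p: bic(y, p))

def box_pierce(resid, m):
    """Multivariate Box-Pierce portmanteau statistic on residuals up to lag m."""
    T, K = resid.shape
    u = resid - resid.mean(axis=0)
    C0inv = np.linalg.inv(u.T @ u / T)
    Q = 0.0
    for h in range(1, m + 1):
        Ch = u[h:].T @ u[:-h] / T  # lag-h residual autocovariance
        Q += np.trace(Ch.T @ C0inv @ Ch @ C0inv)
    return T * Q

resid, _ = fit_var(y, p_hat)
Q = box_pierce(resid, 10)  # compare to a chi-square with K**2 * (10 - p_hat) df
print(p_hat, Q)
```

A portmanteau-based strategy would instead increase the lag length until Q is no longer significant; under conditionally heteroscedastic errors the standard chi-square critical values of this test are known to be unreliable, which is one of the distortions the paper's experiments quantify.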