In this paper we study an order reduction phenomenon arising in Nordsieck methods when they are applied to ordinary differential equations on nonuniform grids. This phenomenon causes difficulties in applying stepsize selection strategies in practical computations. We prove that the problem is a consequence of the fact that the concepts of consistency and quasi-consistency are not equivalent for such methods. Numerical examples confirming the presented theory are also provided.

CEMAT - Center for Computational and Stochastic Mathematics