Computational Mathematics and Mathematical Physics, 40(9) (2000), 1255-1275

A technique for controlling the integration step size in multistep methods is developed. It is based on a combined use of the principal terms of the local and global errors to compute the optimal step size and control the accuracy of the numerical solution. An advantage of the new technique is its ability to automatically attain any prescribed accuracy. The algorithm is theoretically substantiated and verified on test problems.
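The abstract gives no formulas, but the classical local-error-based step-size rule that such techniques refine can be sketched as follows. This is a generic illustration, not the paper's combined local/global algorithm; the function name `propose_step` and all numeric parameters (safety factor, clipping bounds) are assumptions for the sketch.

```python
def propose_step(h, err_est, tol, order, safety=0.9, fac_min=0.2, fac_max=5.0):
    """Classical step-size proposal for a method of the given order.

    Scales the current step h by (tol / err_est)**(1/(order + 1)),
    where err_est approximates the principal term of the local error.
    The factor is damped by a safety coefficient and clipped to avoid
    abrupt step-size changes. Returns (new_step, accepted).
    """
    if err_est == 0.0:
        # Error estimate vanished: grow the step as much as allowed.
        return h * fac_max, True
    factor = safety * (tol / err_est) ** (1.0 / (order + 1))
    factor = min(fac_max, max(fac_min, factor))
    # The step is accepted only when the estimated error meets the tolerance.
    return h * factor, err_est <= tol

# Example: the error estimate is below tolerance, so the step is
# accepted and the proposed step size grows.
h_new, accepted = propose_step(h=0.1, err_est=1e-7, tol=1e-6, order=2)
```

Controllers of the kind described in the paper additionally track the global error, using its principal term to choose a tolerance per step so that a user-prescribed accuracy of the final solution is met.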