We present high-order splitting time integrators for nonlinear evolution
equations. Methods for splitting into two or three operators are
introduced together with a rigorous error analysis and
asymptotically correct defect-based
error estimators. Our error analysis applies to equations of
Schrödinger type and extends to parabolic problems within the
appropriate functional-analytic framework. The proposed methods
are further demonstrated to perform well in a parallel environment,
scaling favorably with the number of processors, and to yield
efficient and accurate simulations of intricate dynamics, as in
models of pattern formation.
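As a minimal illustration of the splitting idea underlying such integrators (a sketch only, not the high-order schemes introduced and analyzed in this work), consider the classical second-order Strang splitting applied to the cubic nonlinear Schrödinger equation i u_t = -u_xx + |u|^2 u, where the linear dispersive part is advanced exactly in Fourier space and the nonlinear part is advanced exactly pointwise; the discretization and parameters below are illustrative assumptions.

```python
import numpy as np

def strang_step(u, dt, k2):
    """One Strang splitting step for i u_t = -u_xx + |u|^2 u (sketch).

    The flow is composed as B(dt/2) A(dt) B(dt/2), where
    A: linear subproblem i u_t = -u_xx, solved exactly in Fourier space,
    B: nonlinear subproblem i u_t = |u|^2 u, solved exactly pointwise
       (|u| is constant along this subflow, so it is a pure phase rotation).
    """
    # half-step of the nonlinear part: u -> u * exp(-i |u|^2 dt/2)
    u = u * np.exp(-0.5j * dt * np.abs(u) ** 2)
    # full step of the linear part via the Fourier multiplier exp(-i k^2 dt)
    u = np.fft.ifft(np.exp(-1j * dt * k2) * np.fft.fft(u))
    # second half-step of the nonlinear part
    u = u * np.exp(-0.5j * dt * np.abs(u) ** 2)
    return u
```

Both substeps are norm-preserving (unit-modulus phase factors and a unitary Fourier transform), so the L2 norm of the numerical solution is conserved up to roundoff, mirroring the mass conservation of the continuous equation.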