Geomagnetic jerks are rapid time variations of the magnetic field at the Earth's surface that are thought to be of primarily internal origin. Jerks are relevant for studies of the Earth's interior: they likely give information on core dynamics and possibly on mantle electrical conductivity. Such studies require a precise determination of the jerk occurrence time and its error bar at each observatory. We analyze the best-known global jerks (1969, 1978, and 1991) and a possible local jerk in 1999, considering all three components of the magnetic field (X, Y, and Z). Different data sets are investigated: annual means, 12 month running averages of observatory monthly means in rotated geomagnetic dipole coordinates, and data representing the core field contribution synthesized from the CM4 time-dependent field model. The secular variation in each component of the field around the time of a jerk was modeled by two straight line segments, using both least squares and 1-norm methods. The 1969, 1978, and 1991 jerks were globally detected, while the 1999 event was only locally identified. Using this simple method enables us to calculate error bars in the jerk occurrence times and to quantify their nonsimultaneous behavior. We find that our error bars are not, in general, symmetric about the mean occurrence time and that the mean errors on the X and Z components (1.7 and 1.5 years, respectively) are larger than that on the Y component (1.1 years). Generally, the error bars were found to be larger at Southern Hemisphere observatories. Our results are necessary prerequisites for further studies of the inverse problem that attempt to determine mantle electrical conductivity from variations in jerk occurrence times. Copyright © 2011 by the American Geophysical Union.
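The two-segment fit described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it grid searches over candidate breakpoints and, at each candidate, least-squares fits a continuous piecewise-linear model (intercept, initial slope, slope change after the breakpoint). The synthetic secular-variation series, its slopes, and the 1969 breakpoint are invented for demonstration only.

```python
import numpy as np

def fit_two_segments(t, y):
    """Estimate a jerk occurrence time by fitting y(t) with two straight
    line segments joined at a breakpoint, chosen by grid search over the
    interior sample times to minimize the least-squares misfit."""
    best_misfit, best_t0 = np.inf, None
    for k in range(2, len(t) - 2):          # keep >= 2 points per segment
        t0 = t[k]
        # Design matrix: intercept, slope, and extra slope active after t0
        # (the third column makes the model continuous at the breakpoint).
        A = np.column_stack([np.ones_like(t),
                             t - t0,
                             np.where(t > t0, t - t0, 0.0)])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        misfit = np.sum((A @ coef - y) ** 2)
        if misfit < best_misfit:
            best_misfit, best_t0 = misfit, t0
    return best_t0

# Synthetic annual-mean secular variation with a slope change at 1969
t = np.arange(1960.0, 1980.0, 1.0)
y = 10.0 + 2.0 * (t - 1969.0) + np.where(t > 1969.0, -5.0 * (t - 1969.0), 0.0)
print(fit_two_segments(t, y))  # recovers the 1969 breakpoint
```

A 1-norm variant, as mentioned in the abstract, would replace the least-squares solve with a minimization of the sum of absolute residuals, which is less sensitive to outliers; error bars on the occurrence time could then be obtained by resampling or by mapping out the misfit around the minimum.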