We incorporate a maximum entropy image reconstruction technique into the modelling of the time-dependent geomagnetic field at the core–mantle boundary (CMB). To deal with small lengthscales that are unconstrained by the data, core field models are commonly regularized using a priori quadratic norms in both space and time. This artificial damping leads to an underestimation of power at large wavenumbers, and to a loss of contrast in the reconstructed picture of the field at the CMB. The entropy norm, recently introduced to regularize magnetic field maps, yields models with better contrast while involving a minimum of a priori information about the field structure. However, this technique was developed to build only snapshots of the magnetic field. It was previously formulated in the spatial domain; here we show how to implement it in the spherical harmonic domain, and we extend it to the time-dependent problem, where both spatial and temporal regularizations are required. We apply our method to model the field over the interval 1840–1990 from a compilation of historical observations. Applying the maximum entropy method in space (for a fit to the data similar to that obtained with a quadratic regularization) effectively reorganizes the magnetic field lines to give a map with better contrast, associated with a less rapidly decaying spectrum at large wavenumbers. Applying the maximum entropy method in time permits us to model sharper temporal changes, associated with larger spatial gradients in the secular variation, without producing spurious fluctuations on short timescales. The method also avoids smearing back in time field features that are not constrained by the data. Prospects for future applications of the method are also discussed.
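The abstract does not write out the entropy norm itself. As a rough illustrative sketch only, the snippet below implements one form that has been used to regularize signed images such as radial field maps, namely the positive/negative-image entropy of Hobson & Lasenby (2004) with "default" parameter d; the paper's own definition may differ in sign convention or normalization, and the grid values `b` here are hypothetical.

```python
import numpy as np

def signed_entropy(b, d):
    """Entropy S of a signed image b (e.g. radial field values on a grid),
    in the positive/negative form of Hobson & Lasenby (2004).  d > 0 sets
    the scale below which the norm acts quadratically.  S <= 0 everywhere,
    with S = 0 only when b is identically zero; regularization maximizes S."""
    psi = np.sqrt(b**2 + 4.0 * d**2)
    return np.sum(psi - 2.0 * d - b * np.log((psi + b) / (2.0 * d)))

def entropy_penalty(b, d):
    """-S: the term added to the data misfit in place of a quadratic norm."""
    return -signed_entropy(b, d)
```

For |b| much smaller than d the penalty reduces to the quadratic sum of b**2 / (4 d), but for |b| much larger than d it grows only like |b| log |b|, which is why strong field features are damped far less than under a quadratic norm and the reconstructed map keeps its contrast.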