This paper studies Laplace-type estimators that are based on simulated moments. It shows that confidence intervals using these methods may have coverage far from the nominal level. A neural network may be used to reduce the dimension of an initial set of moments to the minimum number that maintains identification. When Laplace-type estimation and inference are based on these neural moments, confidence intervals have statistically correct coverage in most cases studied, with only small departures from correct coverage. The methods are illustrated by an application to a jump diffusion model for returns of the S&P 500 index.
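The dimension-reduction step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a toy model (a normal sample with parameters mu and sigma), an arbitrary overidentifying vector of six sample statistics, and a small one-hidden-layer network trained by plain gradient descent to map the raw moments to the parameters. The fitted network output then serves as a moment vector whose dimension equals the number of parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_moments(theta, n=200):
    """Simulate a sample from N(mu, sigma) and return an
    overidentifying vector of 6 summary statistics (raw moments).
    The toy model and the choice of statistics are illustrative."""
    mu, sigma = theta
    x = rng.normal(mu, sigma, n)
    return np.array([x.mean(), x.std(), np.median(x),
                     np.mean(np.abs(x - x.mean())),
                     np.quantile(x, 0.25), np.quantile(x, 0.75)])

# Training set: draw parameters from a uniform prior, simulate moments.
S = 2000
thetas = np.column_stack([rng.uniform(-1.0, 1.0, S),   # mu
                          rng.uniform(0.5, 2.0, S)])   # sigma
Z = np.array([simulate_moments(t) for t in thetas])

# Standardize the network inputs.
Zm, Zs = Z.mean(0), Z.std(0)
Zn = (Z - Zm) / Zs

# One-hidden-layer net (6 -> 16 -> 2), full-batch gradient descent on MSE.
H = 16
W1 = rng.normal(0, 0.3, (6, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.3, (H, 2)); b2 = np.zeros(2)
lr = 0.05
for _ in range(3000):
    A = np.tanh(Zn @ W1 + b1)      # hidden activations
    P = A @ W2 + b2                # predicted parameters
    E = P - thetas                 # prediction error
    gW2 = A.T @ E / S; gb2 = E.mean(0)
    dA = (E @ W2.T) * (1 - A**2)   # backprop through tanh
    gW1 = Zn.T @ dA / S; gb1 = dA.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

def neural_moments(raw):
    """Map the 6 raw moments to 2 neural moments (parameter dimension)."""
    z = (raw - Zm) / Zs
    return np.tanh(z @ W1 + b1) @ W2 + b2

# The reduced moment vector has the same dimension as the parameter vector,
# and (for this easy toy problem) roughly recovers the true parameters.
m = neural_moments(simulate_moments((0.3, 1.0), n=5000))
print(m.shape)  # (2,)
```

In an estimation context, these fitted outputs would replace the original six statistics as the moment conditions, so the moment vector is exactly identified while retaining the information relevant for the parameters.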