Neural circuits generate representations of the external world from multiple information streams. The navigation system provides an exceptional lens through which to gain insight into how such computations are implemented. Neural circuits in the medial temporal lobe construct a map-like representation of space that supports navigation. This computation integrates multiple sensory cues and is also thought to require cues related to the individual's movement through the environment. Here, we identify multiple self-motion signals, related to the position and velocity of the head and eyes, encoded by neurons in a key node of the navigation circuitry of mice, the medial entorhinal cortex (MEC). The representation of these signals is highly integrated with that of other cues in individual neurons. Such information could be used to compute the allocentric location of landmarks from visual cues and to generate internal representations of space.
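As a minimal illustrative sketch of the final point (the symbols below are assumptions for exposition, not quantities reported in the text), the allocentric location of a landmark, $\mathbf{x}_L$, could in principle be recovered by combining the animal's allocentric position $\mathbf{x}_A$, its head direction $\theta$, an eye-position (gaze) offset $\phi$, and the egocentric distance $d$ to the landmark estimated from vision:
\[
\mathbf{x}_L \;=\; \mathbf{x}_A \;+\; d \begin{pmatrix} \cos(\theta + \phi) \\ \sin(\theta + \phi) \end{pmatrix}.
\]
In this toy 2D formulation, the self-motion signals described above (head and eye position and velocity) supply $\theta$ and $\phi$ and their updates over time, while visual cues supply $d$; their combination yields a landmark position in world-centered coordinates.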