Object orientation influences visual and haptic recognition differently. This difference could arise because the two modalities access different object representations, or because of differences in how each modality acquires information. These alternatives were investigated using sequential haptic matching tasks. Matches presented the same object twice; mismatches presented two similarly shaped objects. The two objects were presented either at the same orientation or rotated 90° in depth from each other. Experiment 1 manipulated exploration time to test whether longer durations weakened orientation sensitivity by allowing orientation-invariant representations to be extracted. This hypothesis was not supported. Experiment 2 investigated whether the same-orientation advantage resulted from general spatial or motor-action cueing rather than from orientation-specific object representations. To distinguish these possibilities, participants performed a secondary task interleaved within the matching task: they reported the orientation of a fork or spoon presented between the first and second objects. The main axis of the fork or spoon matched that of the final object, equating spatial and motor cueing across the same-orientation and orientation-change conditions. Nevertheless, matching remained orientation-sensitive. Together, these results suggest that vision and haptics access separate stored, orientation-specific perceptual representations of objects.