This past summer (2010) I worked on independent study research in HCI. After an extensive process of finding the best way to use the hardware I had available, I developed an Ogre3D application: a 3D scene containing a simple mesh, viewed through a virtual camera. The camera was driven by the position and orientation of infrared markers tracked by OptiTrack infrared cameras using the Tracking Tools software and API; the tracking data was streamed over the NatNet protocol into my C++ Ogre3D application. I ran into some difficulties with the limitations of the hardware itself, but I managed to construct a rather interesting enhanced reality interface. From this experience I wrote a report that you can find here, and I recorded this video demonstrating the use of the interface. I apologize in advance for my disheveled appearance in the video; it was about 9pm when I recorded it, and I had been working in the lab for about 12 hours.