Mediated Perception
Sept 2013 - May 2014
Team of 2
Mediated Perception is an augmented reality headset for depth analysis and real-world motion tracking. Once completed, the device would have been made available to software developers to create games and other apps.
The device was planned to use the BeagleBone Black to communicate with the DE2-115 board. The board would use two cameras to capture images and apply a stereo vision algorithm to estimate depth. The device would also use an mbed to capture real-world orientation data such as pitch, roll, and yaw. For more information, see the paper and presentation below.
Unfortunately, due to lack of time and inexperience, the project was not completed.
Contributions
- Configured I2C communication with the HMC5883L 3-axis digital compass
- Calibrated the compass to read magnetic north
- Implemented an FPGA design to link the DE2-115 with the D5M camera
- Researched FPGA designs for stereo vision
- FPGA code written in Verilog HDL
- Mbed code written in C++
Completed
- The Inertial Measurement Unit (IMU) is completed
- Android installed on the BeagleBone Black
- DE2-115 board communicating with a single camera
Credits
Johnny Sim - Firmware & Verilog Programmer
Cody Harris - Firmware & Linux Programmer