Cellbots is a collection of hobbyists who want to free your cell phone from your pocket and let it connect with and move through the real world by integrating it with a robot.
Now Anthony Francis, a member of Google’s Cloud Robotics team and a volunteer at Cellbots.com, has been given the green light to contribute the sensory integration code he wrote to Cellbots’s codebase. The code was originally developed as part of a cooperative project between Google and Hasbro to make robot phone docks.
The PerceptionManager is a Java class that abstracts the raw Android sensor API into higher-level “percepts,” effectively translating hundreds of samples a second from the accelerometer and gyroscope into binary features like “shaking” or “upside down.”
Yes, you could write these yourself – but why should you have to? The PerceptionManager also supports intermediate percepts like “movement in space” and “vertical motion” that you can build your own percepts on top of, along with math functions for Verlet integration and vector math you can use to develop your own sensory processing.
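To give a flavor of what turning raw sensor samples into a binary percept involves, here is a minimal, self-contained sketch of a “shaking” detector over raw accelerometer readings. This is a hypothetical illustration, not the PerceptionManager’s actual API: the class name, thresholds, and method signature here are invented for the example, and a real implementation would hook into Android’s SensorManager rather than take a list of samples.

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Hypothetical sketch of deriving a binary "shaking" percept from raw
 * accelerometer samples. The real PerceptionManager's API differs; this
 * only illustrates the kind of work it saves you from writing yourself.
 */
public class ShakePercept {
    private static final double GRAVITY = 9.81;   // expected magnitude at rest, m/s^2
    private static final double THRESHOLD = 4.0;  // deviation from gravity that counts as a jolt
    private static final int REQUIRED_JOLTS = 3;  // jolts in a window needed to call it "shaking"

    /**
     * Each sample is a {x, y, z} acceleration in m/s^2. Returns true if
     * enough samples in the window deviate strongly from resting gravity.
     */
    public static boolean isShaking(List<double[]> samples) {
        int jolts = 0;
        for (double[] s : samples) {
            double magnitude = Math.sqrt(s[0] * s[0] + s[1] * s[1] + s[2] * s[2]);
            if (Math.abs(magnitude - GRAVITY) > THRESHOLD) {
                jolts++;
            }
        }
        return jolts >= REQUIRED_JOLTS;
    }

    public static void main(String[] args) {
        List<double[]> still = new ArrayList<>();
        List<double[]> shaken = new ArrayList<>();
        for (int i = 0; i < 10; i++) {
            still.add(new double[] {0.1, 0.0, 9.8});       // phone sitting on a table
            double spike = (i % 2 == 0) ? 18.0 : 1.0;      // alternating violent jolts
            shaken.add(new double[] {spike, 0.0, 9.8});
        }
        System.out.println(isShaking(still));   // false
        System.out.println(isShaking(shaken));  // true
    }
}
```

Even this toy version has to pick a window, a threshold, and a debouncing rule – exactly the fiddly details a shared percept library lets you skip.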