Open source perception manager

Cellbots is a collection of hobbyists who want to free your cell phone from your pocket and let it connect and move with the real world by integrating it with a robot.

Now Anthony Francis, a member of Google’s Cloud Robotics team and a volunteer at Cellbots.com, has been given the green light to contribute the sensory integration code he wrote to Cellbots’s codebase. The code was originally developed as part of a cooperative project between Google and Hasbro to make robot phone docks.

Anthony writes:

The Perception Manager is a Java class that abstracts the raw Android Sensor API into higher-level ‘percepts,’ effectively translating hundreds of samples a second from the accelerometer and gyroscope into binary features like “shaking” or “upside down”.
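To illustrate the idea of turning a raw sample stream into a binary percept, here is a minimal sketch in plain Java. The class name, threshold value, and method names are hypothetical, not taken from the Perception Manager's actual API: it simply thresholds accelerometer magnitude to produce a "shaking" feature.

```java
// Hypothetical sketch of a binary "shaking" percept, NOT the actual
// Perception Manager API. Feed it raw accelerometer samples; it exposes
// a single boolean feature.
public class ShakePercept {
    // Assumed threshold in m/s^2; gravity alone is ~9.81, so anything
    // well above that suggests the phone is being shaken.
    private static final float SHAKE_THRESHOLD = 15.0f;

    private boolean shaking = false;

    // Process one accelerometer sample (x, y, z in m/s^2).
    public void onSample(float x, float y, float z) {
        double magnitude = Math.sqrt(x * x + y * y + z * z);
        shaking = magnitude > SHAKE_THRESHOLD;
    }

    public boolean isShaking() {
        return shaking;
    }
}
```

On Android, `onSample` would be called from a `SensorEventListener` registered for `Sensor.TYPE_ACCELEROMETER`; the abstraction the Perception Manager offers is that downstream code only ever sees the boolean.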

Yes, you could write these yourself – but why should you have to? Furthermore, the PerceptionManager has support for higher-level sensors like “movement in space” or “vertical motion” which you can use to build up your own percepts, or math functions for Verlet integration and vector math you can use to develop your own sensory processing.
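For readers unfamiliar with Verlet integration, the technique mentioned above: it advances position from the two previous positions and the current acceleration, x(t+dt) = 2x(t) - x(t-dt) + a*dt^2, without tracking velocity explicitly. The following is a generic one-dimensional sketch (class and method names are my own, not the Perception Manager's):

```java
// Minimal 1-D position-Verlet integrator, a generic illustration of the
// technique rather than the Perception Manager's implementation.
public class Verlet1D {
    private double prev; // position at t - dt
    private double curr; // position at t

    public Verlet1D(double x0, double v0, double dt) {
        curr = x0;
        prev = x0 - v0 * dt; // back-step to seed the two-position scheme
    }

    // Advance one step given the current acceleration; returns the new position.
    public double step(double accel, double dt) {
        double next = 2.0 * curr - prev + accel * dt * dt;
        prev = curr;
        curr = next;
        return next;
    }
}
```

With zero acceleration the scheme reproduces uniform motion exactly, which makes it a cheap, stable way to track motion in space from accelerometer data.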

This entry was posted in Android, code, robotics, sensors.

Comments

  1. bearmos says:

    before reading the excerpt, i thought this was about managing social “perceptions” somehow... the automated gesture interpretation is much more likely to work correctly!
