I was a Roboticist at Romotive, where we built a small iPhone-based robot to help teach children programming concepts through a lovable embodied character named Romo. While at Romotive I led a software team of seven, owning Romo's personality and autonomy and coordinating the robot's software architecture.
One of my primary contributions to Romo was a best-in-class iOS framework for realtime computer vision called RMVision. This framework allowed our robot to track faces, follow lines on the floor, detect changes in brightness, and learn, through natural training by a person, to chase brightly colored objects. By combining OpenCV with hardware-accelerated OpenGL shaders, the framework squeezed every bit of performance out of both legacy and modern iOS devices.
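RMVision itself was built in Objective-C on OpenCV and GPU shaders, but the core idea behind color-object tracking of this kind is simple: threshold each frame by hue and chase the centroid of the matching pixels. As an illustrative sketch only (the function name and parameters below are hypothetical, not RMVision's API), here is a minimal pure-Python version:

```python
import colorsys

def find_colored_blob(pixels, target_hue, hue_tol=0.05, min_sat=0.5, min_val=0.5):
    """Return the (x, y) centroid of pixels near target_hue, or None.

    pixels: 2-D list of (r, g, b) tuples, components in 0..255.
    target_hue: hue in 0..1 (e.g. 0.0 for red, 1/3 for green).
    """
    xs, ys = [], []
    for y, row in enumerate(pixels):
        for x, (r, g, b) in enumerate(row):
            h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
            # Hue is circular, so measure wrap-around distance
            # (reds sit near both 0 and 1).
            d = min(abs(h - target_hue), 1 - abs(h - target_hue))
            if d <= hue_tol and s >= min_sat and v >= min_val:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# A tiny 3x3 "frame" with one red pixel in the center.
frame = [[(0, 0, 0)] * 3 for _ in range(3)]
frame[1][1] = (255, 0, 0)
print(find_colored_blob(frame, target_hue=0.0))  # → (1.0, 1.0)
```

A production version would do this per pixel on the GPU (or with OpenCV's vectorized `inRange` and `moments`), feeding the centroid into the robot's steering loop so it turns toward the target each frame.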
iOS development, OpenCV, hardware-accelerated computer vision, Flash