Ever wish you had a friend who would always be there to listen to and sympathize with all your human troubles? Well, next time you’ve exhausted all your human friends with gripes about work, parents, or your roommate, you can unload your burdens onto Mimbo. This miniature bot from Will Langford promises to always be there to commiserate, using facial recognition software to track your expressions and mimic them on his iPhone face. Mimbo is the latest in emotive, reactive mechanical devices.
To build your own version of Mimbo, you’ll need a sheet of cardboard, a spare iPhone 3GS (or 4/4s), some Processing skills, and an X-Acto knife. Start by cutting out the paper pattern and folding it into place.
Download the TouchOSC Editor and the TouchOSC app for your iPhone. To add a control, right-click the canvas and choose an object type; selecting an object lets you edit its color, position, name, value range, and other parameters. Mimbo’s face will have six features: four LEDs (for the eyes), one empty label box (for eyebrows/eyelids), and one multi-fader (for the mouth). All of these facial features are controlled programmatically from the Processing code. Download Langford’s interface in Step 2 and hit the “Sync” button on your iPhone to update the layout.
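To make the “controlled programmatically” part concrete, here is a minimal sketch of driving the six face controls from Processing. It assumes the oscP5 library is installed, and the control addresses (`/1/led1` through `/1/led4`, `/1/label1`, `/1/multifader1`) follow TouchOSC’s default naming, which may differ from the names in Langford’s actual layout; the IP address is a placeholder for your phone’s.

```processing
import oscP5.*;
import netP5.*;

OscP5 oscP5;
NetAddress phone;

void setup() {
  oscP5 = new OscP5(this, 8000);                 // local listening port
  phone = new NetAddress("192.168.1.20", 9000);  // placeholder IP, TouchOSC's incoming port

  // Light the four "eye" LEDs.
  for (int i = 1; i <= 4; i++) {
    OscMessage led = new OscMessage("/1/led" + i);
    led.add(1.0);
    oscP5.send(led, phone);
  }

  // Set the eyebrow/eyelid label text.
  OscMessage label = new OscMessage("/1/label1");
  label.add("^ ^");
  oscP5.send(label, phone);

  // Shape the mouth: each strip of the multi-fader is addressed individually.
  for (int i = 1; i <= 5; i++) {
    OscMessage mouth = new OscMessage("/1/multifader1/" + i);
    mouth.add(0.5);
    oscP5.send(mouth, phone);
  }
}
```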
Download and install the oscP5 library for Processing to enable communication with TouchOSC over Open Sound Control (OSC). Once you figure out the basic structure of messages, you’ll find that OSC is a really simple and effective way of talking to all kinds of multimedia devices. In the sketch’s setup, you’ll need to initialize oscP5 and tell it what port to listen on for incoming messages, as well as declare a remote address (the IP address of the phone) and outgoing port. You can find these two values in the TouchOSC app’s settings. Find some messaging examples in Step 3.
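The setup described above can be sketched as follows. The port numbers match TouchOSC’s defaults (it sends on 8000 and listens on 9000), and the phone’s IP address is a placeholder you copy from your own TouchOSC settings screen.

```processing
import oscP5.*;
import netP5.*;

OscP5 oscP5;       // handles sending and receiving OSC messages
NetAddress phone;  // where outgoing messages go

void setup() {
  // Listen on port 8000 for messages arriving from TouchOSC.
  oscP5 = new OscP5(this, 8000);
  // Remote address: the phone's IP and the port TouchOSC listens on.
  phone = new NetAddress("192.168.1.20", 9000);  // placeholder IP
}

void draw() {
  // Example outgoing message: set an assumed "/1/led1" control to full brightness.
  OscMessage msg = new OscMessage("/1/led1");
  msg.add(1.0);
  oscP5.send(msg, phone);
}
```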
Mimbo uses the accelerometer feature in TouchOSC to send the phone’s acceleration sensor values back to the Processing sketch. To receive these OSC messages, you’ll need an event-handler method. The simplest way to debug OSC event handling is to print out whatever is received, using the code outlined in Step 4.
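A minimal version of that event handler looks like this. oscP5 calls `oscEvent()` for every incoming message; when TouchOSC’s accelerometer option is enabled, it sends three floats (x, y, z) to the `/accxyz` address. This snippet belongs in the same sketch where oscP5 was initialized in setup.

```processing
void oscEvent(OscMessage m) {
  // Print every incoming message for debugging.
  println("addr: " + m.addrPattern() + "  typetag: " + m.typetag());

  // TouchOSC's accelerometer arrives on /accxyz as three floats.
  if (m.checkAddrPattern("/accxyz")) {
    float ax = m.get(0).floatValue();
    float ay = m.get(1).floatValue();
    float az = m.get(2).floatValue();
    println("accel: " + ax + ", " + ay + ", " + az);
  }
}
```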
Once you have the basics of sending and receiving OSC messages between Processing and TouchOSC working, you should end up with this code and Mimbo’s cute face on your phone. If you’re having trouble, the controls are well documented here.
Finally, set Mimbo to track your face and mimic your facial expressions using FaceOSC, which scans your face with a webcam and sends OSC messages to a specific port. You can then either read these messages in directly in Processing or route them through OSCulator for additional scaling and routing.
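Reading FaceOSC directly in Processing can be sketched like this. FaceOSC sends to port 8338 by default, and addresses such as `/found` and `/gesture/mouth/height` are part of its standard message set; how you map the mouth value onto Mimbo’s multi-fader is up to you and is only hinted at in a comment here.

```processing
import oscP5.*;

OscP5 faceIn;
boolean faceFound = false;
float mouthHeight = 0;

void setup() {
  // FaceOSC sends its messages to port 8338 by default.
  faceIn = new OscP5(this, 8338);
}

void oscEvent(OscMessage m) {
  if (m.checkAddrPattern("/found")) {
    // 1 while a face is being tracked, 0 otherwise.
    faceFound = m.get(0).intValue() == 1;
  } else if (m.checkAddrPattern("/gesture/mouth/height")) {
    mouthHeight = m.get(0).floatValue();
    // Scale and forward this value to the multi-fader "mouth" here.
  }
}

void draw() {
  // Empty draw() keeps the sketch running so messages keep arriving.
}
```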
Visit the Instructables How-To for further instruction, more detailed photographs and tips on where to buy materials.