Pixy magically recognises objects for Arduino
Essentially, the Pixy is a fast vision sensor that you can “teach” to find objects through the use of colour codes, and it reports its findings through several simple interfaces, say its creators.
Featuring an Omnivision OV9715 1/4-inch image sensor with 1280×800 resolution, it’s powered by an NXP LPC4330 dual-core ARM processor that can process images at 50 frames per second.
The Pixy is a joint development between Carnegie Mellon University and Austin, Texas-based Charmed Labs.
“We tried to make Pixy as easy to use as possible. We think this will make it popular with the robotics and maker communities,” said Anthony Rowe, CMU faculty member.
“We’ve opened up the design by using the Open Source Hardware licensing model. You get source code, schematics, board layouts, everything,” said Rich LeGrand, Charmed Labs president.
It has interfaces for UART serial, SPI, I2C, and digital or analogue I/O.
The product is part of a Kickstarter campaign and is available by contributing $59 or more. At the time of writing, with 24 days to go, it has already busted through its target: $45,262 has been pledged against a $25,000 goal.
About Kickstarter, the creators write:
We’ve done our best to keep the cost of Pixy as low as possible. Improvements in technology deserve much of the credit, but this Kickstarter campaign is a big help also. The Kickstarter funds allow us to manufacture in sufficient quantity to get the parts and manufacturing costs down. The result is that Pixy is available to a wider audience, which has always been the point of the CMUcam: to put a capable, easy to use vision sensor in the hands of lots of people.
Accompanying Pixy is PixyMon, an application that runs on a PC or Mac. It lets you share the eyes of the Pixy, seeing what it sees, either as raw or processed video. You can also configure your Pixy, for example, set the output port and manage colour signatures. Communication with the Pixy is over mini USB.
PixyMon is great for debugging your application. You can plug a USB cable into the back of Pixy and run PixyMon and then see what Pixy sees while it is hooked to your Arduino or other microcontroller — no need to unplug anything. PixyMon is open source, like everything else. It’s written using the Qt framework.
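To give a feel for the Arduino side, here is a minimal sketch using the Pixy Arduino library published by Charmed Labs, which reports detected objects as “blocks” over SPI via the `Pixy` class and its `getBlocks()` call. Treat this as an illustrative sketch of the library's usage, not a definitive reference; pin wiring and coordinate ranges are as documented by Charmed Labs at the time.

```cpp
// Minimal Pixy-on-Arduino sketch: print each detected block's
// colour signature and centre position to the serial monitor.
// Assumes the Pixy Arduino library (Pixy.h) from Charmed Labs,
// communicating over the Arduino's default SPI pins.
#include <SPI.h>
#include <Pixy.h>

Pixy pixy;  // one Pixy object, talking over SPI

void setup()
{
  Serial.begin(9600);
  pixy.init();  // initialise communication with the Pixy
}

void loop()
{
  // getBlocks() returns the number of objects Pixy currently sees;
  // details of each are filled into the pixy.blocks[] array
  uint16_t blocks = pixy.getBlocks();

  for (uint16_t i = 0; i < blocks; i++)
  {
    Serial.print("sig ");
    Serial.print(pixy.blocks[i].signature);  // which colour signature matched
    Serial.print(" at (");
    Serial.print(pixy.blocks[i].x);          // block centre, x
    Serial.print(", ");
    Serial.print(pixy.blocks[i].y);          // block centre, y
    Serial.println(")");
  }
}
```

Because Pixy does the image processing on its own dual-core processor, the Arduino only ever receives this small, pre-digested list of blocks, which is why even an 8-bit microcontroller can keep up at 50 frames per second.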
Image below: “Pixy being taught. When the colour of the LED matches the colour of the object, release the button. Pixy will then find objects that match.”