Teresa Escrig

News and opinion about Cognitive AI & Robotics

Human-Computer (or Robot) interface through Rough Sketches


A team from Brown University in Rhode Island and the Technical University of Berlin has created software that analyzes users’ crude, cartoony sketches and figures out what they are trying to draw.

To develop the system, the researchers started with a set of 250 object categories. Then, using Amazon’s Mechanical Turk crowd-sourcing service, they hired people to make rough sketches of objects from each of those categories. The resulting 20,000 sketches were then fed to recognition and machine learning algorithms, in order to teach the system which general sorts of sketches correspond to which categories. After seeing numerous examples of how various people drew a rabbit, for instance, it would learn that certain combinations of shapes usually meant “rabbit.”
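To make the idea concrete, here is a minimal sketch of that kind of pipeline: train a multi-class classifier on labeled sketch feature vectors, then ask it to name the category of a new sketch. This is not the authors’ actual system; the feature extraction step is assumed to have already happened, the data below is synthetic, and the toy numbers (25 categories, 20 sketches each) stand in for the real 250 categories and 20,000 sketches.

```python
# Illustrative only: a toy "sketch -> category" classifier.
# Assumes each sketch has already been converted into a fixed-length
# feature vector (the real work used far richer sketch features).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_categories = 25          # the real dataset used 250 categories
sketches_per_category = 20 # the real dataset had ~80 sketches per category
feature_dim = 64           # assumed length of each sketch's feature vector

# Synthetic stand-in data: each category gets a cluster of feature vectors
# scattered around its own random "prototype" sketch.
prototypes = rng.normal(size=(n_categories, feature_dim))
X = np.vstack([
    prototypes[c] + 0.5 * rng.normal(size=(sketches_per_category, feature_dim))
    for c in range(n_categories)
])
y = np.repeat(np.arange(n_categories), sketches_per_category)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y
)

# A multi-class SVM learns which feature combinations tend to mean which category.
clf = SVC(kernel="rbf", gamma="scale")
clf.fit(X_train, y_train)

print("held-out accuracy:", clf.score(X_test, y_test))
print("predicted category of one new sketch:", clf.predict(X_test[:1])[0])
```

The design choice mirrors the article’s description: the system never matches a drawing against photographs directly, it learns from many human-made examples which shape combinations people tend to use for each object.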

Check out the video showing the performance of the application. It is amazing! This technology has broad and very deep implications in many areas; robotics is just one of them.

The research is available online, together with a library of sample sketches and other materials. The team is currently considering a ‘Pictionary’-type open-source game to expand the system’s drawing reference library.

