Teresa Escrig

News and opinion about Cognitive AI & Robotics

Archive for the ‘research’ tag

Is the long anticipated shift in robotics finally happening?

2 comments

Whew… with so many exciting things happening in the robotics field lately, I just couldn’t remain silent anymore…

Kiva robots carrying shelves in a warehouse.

We were all wowed by Amazon’s 2012 acquisition of Kiva Systems for $775 million. Kiva’s clever self-propelled robots scoot around warehouses in a precisely choreographed dance, retrieving and carrying entire shelf units of items to their proper packing stations.

In December 2013 and January 2014, Google bought seven robotics companies for an undisclosed sum. The Internet giant and pioneer of self-driving cars is serious about a robot-filled future. However, we don’t know much about Google’s intent with all these acquisitions. They’re all part of the Google X division, which is top secret by definition. Most of these companies have closed down their websites and retreated into stealth mode. My guess is that they are grouping up to decide the direction they’ll take to serve Google’s goals.

The robotics team is led by Andy Rubin, who recently stepped down as head of Android.

Here is a brief summary of all of Google’s acquisitions (and a bunch of links to dig deeper):

Arm manipulator of Industrial Perception, Inc.

The biped robot at Schaft, Inc.

  • Industrial Perception, Inc. (IPI) – spun off from the Menlo Park robotics company Willow Garage. They have a 3D vision-guided robot for use in manufacturing and logistics.
  • Schaft Inc. – the Japanese team that got its start at the University of Tokyo. They took the top prize at the DARPA Robotics Challenge Trials with their bipedal robot.
  • Redwood Robotics – started as a joint venture between Meka Robotics, SRI International, and Willow Garage (IPI’s parent). Redwood wants to build the “next generation arm” for robots.
  • Meka Robotics – a torso robot with very sophisticated hands on a wheeled mobile platform.

  • Bot & Dolly  – a design and engineering studio that specializes in automation, robotics, and filmmaking. They use robots to help film commercials and movies like Gravity.
  • Holomini – The only thing we know about them is that they are creators of high-tech wheels for omnidirectional motion.
Bot & Dolly arm with camera.

Holomini’s wheels.

  • Boston Dynamics – the most high-profile of all the robotics companies that Google has acquired so far. Their best-known robots include ATLAS, a sophisticated humanoid; BigDog, a rough-terrain quadruped; and Cheetah, which can reach 28 mph.
ATLAS robot from Boston Dynamics.

BigDog from Boston Dynamics.

In the middle of January 2014, Google acquired Nest for $3.2 billion.

  • Nest – a home-automation startup whose product is a smoke and carbon monoxide alarm that talks.

And at the end of January, Google acquired DeepMind for more than $500 million (after having beaten out Facebook):

  • DeepMind – an AI research company out of London founded by neuroscientist Demis Hassabis, Skype developer Jaan Tallinn, and researcher Shane Legg. They use the best techniques from machine learning and systems neuroscience to build powerful general-purpose learning algorithms.

In 2012 Google hired Ray Kurzweil to work on machine learning and language processing: to actually understand the content of web pages and provide a better way to rank them than counting how many times a site is mentioned on other sites. According to Dr. Kurzweil, you will be able to “ask it more complex questions that might be a whole paragraph… It might engage in a dialogue with you to find out what you need… It might come back in two months if it finds something useful.”

The butler robot from the Imperial College London Robotics Lab

And now Sir James Dyson (the bagless vacuum cleaner inventor) is investing £5 million in Imperial College London to develop a new generation of “intelligent domestic robots” (an Iron Man-style robot), with a further £3 million investment from various sources over the next five years.

After working on a robotic vacuum cleaner to go along with his company’s famous bagless line for as long as a decade, Dyson remains frustrated at his prototypes’ inability to navigate simple household obstacles. Indeed, even the greatest Roomba finds itself at a loss under a tangle of dining room chairs, and would shrug its shoulders when faced with a flight of stairs.

Is the tide finally turning in robotics?

Many robotic prototypes are built; few make it to market

one comment

I really enjoyed the following video from iRobot, which shows their museum of all of the robotic prototypes and applications they’ve been working on over the years.

It’s amazing stuff, and it’s very important to realize how much work it takes to prove a concept. Even a proven robot may fail to meet some of the user’s needs, and never become a best seller.

Next time you buy a sophisticated toy or a small (not so intelligent) vacuum cleaner, remember all of the time, money, research and work behind it!

Thank you iRobot for showing us this treasure!

Written by Teresa Escrig

February 11th, 2013 at 11:33 pm

What are the benefits of Artificial Intelligence in Robotics?

one comment

Happy New Year to all!  It’s been a while since my last post. Too busy. Now, I’m back.

————————————————————————————-

Robotics is not only a research field within artificial intelligence, but also a field of application, one where all areas of artificial intelligence can be tested and integrated into a final result.

Amazing humanoid robots exhibit elegant and smooth motion capable of walking, running, and going up and down stairs.  They use their hands to protect themselves when falling, and to get up afterward.  They’re an example of the tremendous financial and human capital that is being devoted to research and development in the field of electronics, control and the design of robots.

Very often, the behavior of these robots consists of a fixed number of pre-programmed instructions that are repeated regardless of any changes in the environment. These robots have no autonomy or adaptation to a changing environment, and therefore do not show intelligent behavior. We are amazed by the technology they embody, which is fantastic! But we cannot infer, just because the robots are physically so realistic and their movements so precise and gentle, that they are able to do what we (people) do. Read the rest of this entry »

Cognitive Robots’ Cognitive Brain for Service Robotics has been successfully incorporated into Robosoft’s Kompai companion robot

leave a comment

Last week the results of the ECHORD C-Brain experiment were presented at the IROS’12 conference in Portugal.

The overall goal of the project is to enhance the Kompai companion robotic platform from Robosoft (picture on the left) with the Cognitive Brain for Service Robotics ® (CBRAIN) from Cognitive Robots (picture on the right). The existing functionalities of the KOMPAI platform will remain and be enhanced with the cognitive capabilities of the CBRAIN.

The original capabilities of the Kompai at the beginning of the project were:

  1. Autonomous navigation solution based on traditional techniques such as laser-based SLAM (Simultaneous Localization and Mapping).
  2. Linear obstacle detection at the height of the laser.
  3. Advanced dialog: the robot can receive verbal commands and give verbal responses.

The initial limitations that were identified in the Kompai platform and addressed in this project were:

  • No automatic map building. A technician needs to manually create the map of each new environment (half a day of work). Every single time the layout of the home changes, the technician needs to go back to re-learn the map of the environment for the robot.
  • No 3D obstacle avoidance. The current sensor of the Kompai is a laser, which provides linear distance measurements of obstacles at the height of the laser. Read the rest of this entry »
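To make the second limitation concrete: a planar laser only reports distances in one horizontal slice of the world, so any safety check built on it is blind to obstacles above or below that slice. The sketch below shows the kind of obstacle test such a sensor supports; the function and parameter names are illustrative, not Robosoft’s actual API.

```python
import math

def detect_obstacles(ranges, angle_min, angle_step, safety_radius=0.5):
    """Flag laser returns that fall inside a safety radius around the robot.

    ranges: distances (metres) from one planar scan, evenly spaced in angle.
    Returns (x, y) points of detected obstacles in the robot frame.
    """
    obstacles = []
    for i, r in enumerate(ranges):
        if not (0.0 < r < safety_radius):
            continue  # invalid reading or far enough away to ignore
        theta = angle_min + i * angle_step
        obstacles.append((r * math.cos(theta), r * math.sin(theta)))
    return obstacles

# A chair leg 0.3 m ahead, crossing the scan plane, shows up:
scan = [1.2, 0.3, 2.0]
print(detect_obstacles(scan, angle_min=-math.pi / 4, angle_step=math.pi / 4))
# → [(0.3, 0.0)]
```

A table top hanging above the scan plane never appears in `ranges` at all, which is exactly why 3D obstacle avoidance requires a different kind of sensing.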

Open-source humanoid platform from NimbRo to compete in RoboCup’s TeenSize league

leave a comment

Once upon a time, when I finished my PhD dissertation, I went to the IJCAI conference in Kyoto, Japan, where the RoboCup competition was taking place in the same venue. I absolutely fell in love with the Aibo dog and cat robots from Sony that were on display at the competition (before they were widely used in the competition itself).

At that event I decided that I wanted to apply the results of my PhD to bring intelligence to robots. And that is what I did. I started a research group at Jaume I University. My students played with the Aibos for years. While working with my students on one of the challenges of the RoboCup competition, I put all the dots together, and after ten years of research since my PhD was finished, the seed of Cognitive Robots was born. That technology became patent-pending for our company and, as far as we know, is still ahead of the rest of the technology that brings intelligence to robots.

I have great memories of the RoboCup competition. I agree that it is a great playground for integrating and testing technologies in the areas of AI and Robotics. And it is certainly much more than a toy test.

By , October 8, 2012

University of Bonn’s Team NimbRo is commercializing a humanoid platform, the NimbRo-OP, for €20,000 (US$26,000) to compete in RoboCup’s TeenSize league. It sounds rather expensive, but it will save teams the trouble of prototyping their own, and the untold hours of research and development that would normally require.

Read more >

AISOY1 II, a programmable inexpensive robot with emotions

leave a comment

By , September 19, 2012

Spanish start-up Aisoy Robotics is marketing a new robot that, while it may look similar to the famous Furby, is actually a fully programmable research and development platform.

The Aisoy1 II robot comes with a variety of sensors (touch, light, position, temperature, and camera), microphone and speaker, RGB LEDs in its body, and a 70 mini-LED matrix display (for animated lips). Four servos control the robot’s neck rotation, eyelids, and eyebrows. The platform doesn’t move.

The package includes a dialogue system for speech recognition and synthesis, as well as computer vision software for stuff like face and object recognition, all running on the Linux operating system. The company claims even complete novices can take advantage of these functions without having to learn how to code thanks to DIA, its visual programming tool. The program runs in HTML5 compatible browsers, allowing you to select nodes that control the robot’s various sensors and behaviors.

Read more >

As with the Thymio II, a specific non-standard programming language works against the robotics community’s efforts toward standardization. However, the fact that it runs in HTML5-compatible browsers contributes to the creation of the Robotics App Economy.

The most important feature of the Aisoy1 II, which is not mentioned in the article above, is its emotional engine, a very interesting AI feature at the service of developers for a very low price. As its creators said: “humans would not take decisions without emotions.” This emotional engine could be a key factor in the development of the robotics industry.

Cute, inexpensive little robots like this can help promote robotics education at schools and colleges.

Electronic nose to detect harmful airborne agents

leave a comment

A prototype electronic nose that detects harmful airborne agents such as pesticides, biological weapons, gas leaks and other unwanted presences has been developed at the University of California.

The “electronic nose” will eventually be developed into three platforms: a handheld device for environmental monitoring, a smaller wearable version for monitoring air quality, and a smartphone-integrated system that, the team reports, could detect a potentially harmful airborne agent.

This is a very important sensor to include in robots, as well.

By , August 23, 2012

Read more >

Surfing Robot Tells Scientists Where the Sharks Are

one comment

Researchers at Stanford University have developed a Wave Glider robot which tracks the migratory patterns of great white sharks off the California coast, near San Francisco.

Stanford marine scientists have spent the past 12 years tracking the migratory patterns of sharks by placing acoustic tags on the animals that send a signal to a receiver when they pass within 1,500 feet.


Their goal is to use revolutionary technology that increases our capacity to observe our oceans, take a census of populations, improve fisheries management models, and monitor animal responses to climate change.

The surfing robot receives acoustic data from the sharks’ tags and then propels itself through the water to follow the animals unobtrusively. The surfboard section acts like a WiFi hotspot, pinging the research team with the latest data about the sharks’ movements.

The Stanford team has released a new iPhone and iPad app called Shark Net to model the sharks’ patterns and offer real-time notifications when the robot crosses paths with certain sharks. The idea behind the app is to allow everyone to explore the places where these sharks live, and to get to know them just like their friends on Facebook.

Read more >

By August 20, 2012

This Little Robot Could Totally Transform The Way Humanity Shops

leave a comment

by Jill Krasny  Jul. 20, 2012

The AndyVision Future of Retail Project at Carnegie Mellon University involves in-store digital signage that lets customers browse the store’s 3D planograms, as well as an autonomous store-operations robot that assists with inventory management, including out-of-stock detection.

AndyVision manages inventory, but its influence might go further than that, reports Motherboard’s Adam Clark Estes. Researchers say the lightweight, red-hoodied robot was built to “transform the shopping experience.”

Here, Estes explains how the “mechanized messenger” works:

“With the help of a video camera and an onboard computer that combines image-processing with machine learning algorithms, it can patrol the aisles counting stock and scanning for misplaced items … The data from the inventory scans are all sent to a large touchscreen, where customers can browse through what’s available in the store.”

Read more >

Biologically accurate robotic legs get the gait right

leave a comment

Very impressive video of the biologically accurate robotic legs in action.

By , July 10, 2012

The machine comprises simplified versions of the human neural, musculoskeletal and sensory feedback systems.

The robotic legs are unique in that they are controlled by a crude equivalent of the central pattern generator (CPG) – a neural network located in the spinal cord at the abdominal level and responsible for generating rhythmic muscle signals. These signals are modulated by the CPG as it gathers information from different body parts responding to external stimuli. As a result, we are able to walk without ever giving the activity much thought.

The most basic form of a CPG is called a half center and is made up of two neurons rhythmically alternating in producing a signal. An artificial version of a half center produces signals and gathers feedback from sensors in the robotic limbs, such as load sensors that notice when the angle of the walking surface has shifted.
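The half-center described above is easy to sketch in code. Below is a minimal Matsuoka-style oscillator: two neurons with mutual inhibition and self-fatigue that take turns firing. It is an illustrative model, not the controller running on these robotic legs, and all parameter values are assumptions chosen to produce a stable rhythm.

```python
def half_center(steps=4000, dt=0.005):
    """Simulate a half-center: two mutually inhibiting neurons with fatigue.

    Returns the net output (neuron 1 minus neuron 2 firing rate) over time,
    which alternates sign as the two neurons take turns being active.
    """
    tau, tau_prime = 0.25, 0.5   # membrane and fatigue time constants
    beta, w, s = 2.5, 2.5, 1.0   # fatigue gain, mutual inhibition, tonic drive
    u1, u2, v1, v2 = 0.1, 0.0, 0.0, 0.0  # slight asymmetry kicks off the rhythm
    out = []
    for _ in range(steps):
        y1, y2 = max(u1, 0.0), max(u2, 0.0)   # rectified firing rates
        du1 = (-u1 - beta * v1 - w * y2 + s) / tau
        du2 = (-u2 - beta * v2 - w * y1 + s) / tau
        dv1 = (-v1 + y1) / tau_prime           # fatigue builds while firing
        dv2 = (-v2 + y2) / tau_prime
        u1 += du1 * dt
        u2 += du2 * dt
        v1 += dv1 * dt
        v2 += dv2 * dt
        out.append(max(u1, 0.0) - max(u2, 0.0))
    return out

signal = half_center()
```

Plotting the returned signal shows a steady alternation between positive (one neuron firing) and negative (the other firing): the basic rhythm that, with sensory feedback such as the load sensors mixed in, modulates the legs’ gait.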

Read more >