Teresa Escrig

News and opinion about Cognitive AI & Robotics

Archive for the ‘robots’ tag

This Little Robot Could Totally Transform The Way Humanity Shops


by Jill Krasny, Jul. 20, 2012

AndyVision is the Future of Retail project at Carnegie Mellon University. It involves in-store digital signage that lets customers browse the store’s 3D planograms, as well as an autonomous store-operations robot that assists with inventory management, including out-of-stock detection.

AndyVision manages inventory, but its influence might go further than that, reports Motherboard’s Adam Clark Estes. Researchers say the lightweight, red-hoodied robot was built to “transform the shopping experience.”

Here, Estes explains how the “mechanized messenger” works:

“With the help of a video camera and an onboard computer that combines image-processing with machine learning algorithms, it can patrol the aisles counting stock and scanning for misplaced items … The data from the inventory scans are all sent to a large touchscreen, where customers can browse through what’s available in the store.”
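To make that pipeline concrete, here is a minimal, hypothetical Python sketch of the scan-and-report loop Estes describes. The shelf IDs, planogram data, and the detect_items stub are invented for illustration; the real AndyVision robot replaces that stub with camera images run through image-processing and machine-learning classifiers, and pushes its reports to the in-store touchscreen.

```python
# Hypothetical sketch of the aisle-scanning loop described above.
# The planogram, item names, and detection stub are invented for
# illustration; the real system classifies camera images instead.
from collections import Counter

# Expected stock per shelf according to the store's planogram (assumed data).
PLANOGRAM = {
    "aisle1-shelf2": {"cereal": 12, "oatmeal": 8},
    "aisle1-shelf3": {"coffee": 10},
}

def detect_items(shelf_id):
    """Stand-in for the robot's camera + classifiers; returns item labels seen."""
    observed = {
        "aisle1-shelf2": ["cereal"] * 12 + ["coffee"],  # one misplaced coffee bag
        "aisle1-shelf3": ["coffee"] * 3,                # running low
    }
    return observed.get(shelf_id, [])

def scan_shelf(shelf_id):
    counts = Counter(detect_items(shelf_id))
    expected = PLANOGRAM[shelf_id]
    report = {"shelf": shelf_id, "out_of_stock": [], "misplaced": [], "counts": dict(counts)}
    for item in expected:
        if counts[item] == 0:                 # nothing of this item detected
            report["out_of_stock"].append(item)
    for item in counts:
        if item not in expected:              # item does not belong on this shelf
            report["misplaced"].append(item)
    return report  # in the real system, this would feed the store's touchscreen

if __name__ == "__main__":
    for shelf in PLANOGRAM:
        print(scan_shelf(shelf))
```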

Read more >

MIT creates intelligent car co-pilot that only interferes if you’re about to crash


July 13, 2012

Mechanical engineers and roboticists working at MIT have developed an intelligent automobile co-pilot that sits in the background and only interferes if you’re about to have an accident. If you fall asleep, for example, the co-pilot activates and keeps you on the road until you wake up again.

Like other autonomous and semi-autonomous solutions, the MIT co-pilot [research paper] uses an on-board camera and laser rangefinder to identify obstacles. These obstacles are then combined with various data points — such as the driver’s performance, and the car’s speed, stability, and physical characteristics — to create constraints. The co-pilot stays completely silent unless you come close to breaking one of these constraints — which might be as simple as a car in front braking quickly, or as complex as taking a corner too quickly. When this happens, a ton of robotics under the hood take over, only passing back control to the driver when the car is safe.
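As a rough illustration of that "stay silent until a constraint is about to be broken" behavior, here is a hypothetical Python sketch. The state fields, thresholds, and override action are assumptions for illustration only; the MIT system derives its constraints from the camera, the laser rangefinder, driver performance, and a detailed vehicle model.

```python
# Hypothetical sketch of constraint-based intervention: the co-pilot does
# nothing unless the car is about to violate a safety constraint.
from dataclasses import dataclass

@dataclass
class CarState:
    speed: float             # m/s
    gap_to_obstacle: float   # m, e.g. from the laser rangefinder
    lateral_accel: float     # m/s^2, proxy for how hard a corner is being taken

MAX_LATERAL_ACCEL = 7.0      # assumed grip limit
MIN_TIME_GAP = 1.5           # assumed minimum headway, in seconds

def violates_constraints(state: CarState) -> bool:
    time_gap = state.gap_to_obstacle / max(state.speed, 0.1)
    return time_gap < MIN_TIME_GAP or abs(state.lateral_accel) > MAX_LATERAL_ACCEL

def co_pilot_step(state: CarState, driver_command: str) -> str:
    if violates_constraints(state):
        # Placeholder for the intervention; the real system plans an evasive
        # manoeuvre and hands control back once the car is safe again.
        return "CO-PILOT OVERRIDE: brake and steer back inside the safe envelope"
    return driver_command    # otherwise the driver keeps full control

if __name__ == "__main__":
    print(co_pilot_step(CarState(30.0, 80.0, 2.0), "driver steering"))  # no action
    print(co_pilot_step(CarState(30.0, 20.0, 2.0), "driver steering"))  # override
```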

Read more >

A next step for autonomous cars: instincts are important to reproduce the skill of a human race car driver


Autonomous cars can help people drive when necessary.

Watch this TED talk in which Chris Gerdes (Director of the Center for Automotive Research at Stanford, CARS) reveals how he and his team are developing robotic race cars that can drive at 150 mph while avoiding every possible accident. And yet, in studying the brainwaves of professional racing drivers, Gerdes says he has gained a new appreciation for their instincts.

Instincts, as well as common-sense knowledge, have always been the most difficult kind of knowledge for Artificial Intelligence to capture. That has been the work of the team at Cognitive Robots for over 20 years. The results are included in its Cognitive Brain for Service Robotics, which captures common-sense knowledge through qualitative representation and reasoning.
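As a purely illustrative sketch (not Cognitive Robots' actual system), qualitative representation typically replaces precise metric readings with a small set of symbolic labels that commonsense rules can then reason over. The labels and interval boundaries below are assumptions.

```python
# Illustrative qualitative distance representation: metric sensor readings
# are mapped to symbolic labels, and a commonsense rule operates on labels
# rather than on raw numbers. Labels and thresholds are assumed values.
DISTANCE_LABELS = [            # (label, upper bound in metres)
    ("touching", 0.1),
    ("very_close", 0.5),
    ("close", 2.0),
    ("far", 10.0),
    ("very_far", float("inf")),
]

def qualitative_distance(metres: float) -> str:
    """Map a metric reading to a qualitative distance label."""
    for label, upper in DISTANCE_LABELS:
        if metres <= upper:
            return label
    return "very_far"

def should_slow_down(obstacle_distance_m: float) -> bool:
    # A tiny commonsense rule expressed over qualitative labels.
    return qualitative_distance(obstacle_distance_m) in {"touching", "very_close", "close"}

if __name__ == "__main__":
    print(qualitative_distance(0.3), should_slow_down(0.3))   # very_close True
    print(qualitative_distance(7.5), should_slow_down(7.5))   # far False
```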

Written by Teresa Escrig

July 13th, 2012 at 7:11 pm

Skippy is an internet-controlled robot that skips stones across a pond


We are going to be amazed by the number and variety of applications that people will come up with in service robotics…

Look at this video of Skippy, an internet-controlled robot that skips stones across a pond.

July 11, 2012

Read more >

Biologically accurate robotic legs get the gait right


Very impressive video of the biologically accurate robotic legs in action.

July 10, 2012

The machine comprises simplified versions of the human neural, musculoskeletal and sensory feedback systems.

The robotic legs are unique in that they are controlled by a crude equivalent of the central pattern generator (CPG) – a neural network located in the spinal cord at the abdominal level and responsible for generating rhythmic muscle signals. These signals are modulated by the CPG as it gathers information from different body parts responding to external stimuli. As a result, we are able to walk without ever giving the activity much thought.

The most basic form of a CPG is called a half center and is made up of two neurons rhythmically alternating in producing a signal. An artificial version of a half center produces signals and gathers feedback from sensors in the robotic limbs, such as load sensors that notice when the angle of the walking surface has shifted.
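To show the half-center idea in code, here is a small Python sketch of a Matsuoka-style oscillator: two mutually inhibiting units whose rectified outputs alternate rhythmically, much like the flexor/extensor signals described above. The parameters are assumed values chosen so the toy model oscillates; the actual robotic legs use a richer neural model driven by sensory feedback from the load sensors.

```python
# Toy half-center oscillator (Matsuoka-style): two units inhibit each other,
# and an adaptation term lets the currently active unit tire so the other
# can take over, producing an alternating rhythm. Parameters are assumptions.
def half_center(steps=3000, dt=0.01,
                tau_r=0.25, tau_a=0.5, beta=2.5, w=2.5, drive=1.0):
    x = [0.1, 0.0]   # membrane potentials (small asymmetry starts the rhythm)
    v = [0.0, 0.0]   # adaptation (fatigue) states
    outputs = []
    for _ in range(steps):
        y = [max(0.0, xi) for xi in x]                     # rectified firing rates
        dx = [(-x[i] - beta * v[i] - w * y[1 - i] + drive) / tau_r for i in range(2)]
        dv = [(-v[i] + y[i]) / tau_a for i in range(2)]
        for i in range(2):                                 # simple Euler integration
            x[i] += dx[i] * dt
            v[i] += dv[i] * dt
        outputs.append((y[0], y[1]))
    return outputs

if __name__ == "__main__":
    trace = half_center()
    for y1, y2 in trace[::250]:    # coarse view of the alternating outputs
        print(f"unit1={y1:.2f}  unit2={y2:.2f}")
```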

Read more >

Why did Amazon acquire Kiva?


by Mark P. Mills, 3/23/2012

Amazon’s enormous, automated and well-organized warehouses are the stuff of legend, as are their path-breaking joint ventures with vendors, repair operations and UPS shipping. Still, physical order fulfillment reportedly costs nearly 9 percent of their $40 billion in global revenues.

Amazon was amongst the first to build data centers at Cloud scale – a scale that Google engineers labeled “warehouse scale computing.” But to disrupt traditional retail, Amazon had to do more than create a customer-friendly Web interface for their warehouse-scale computers. They had to solve the old-fashioned physical warehouse problem in order to distribute the objects they sold.

Enter Kiva’s robots, and their inevitable progeny: the logical connection between the cyber and physical worlds. Think of Kiva bots as the hands and feet of the Cloud. They are not autonomous Star-Trek-like agents, but are wirelessly connected to and controlled by the Cloud in real time.

When you tap “place your order” on your iPad’s touch-screen, you are literally reaching through the Cloud to become one with Kiva to grab a box in the warehouse. Such robots are practical today because of a confluence of enabling technologies: cheap and powerful processing and communications, advanced electro-motive power, and clever software. All this is the domain of computing and squarely in Amazon’s wheelhouse.

Amazon needs to own Kiva for the same reason they own computing.

Read more >

How NASA plans to land a 2000 pound rover on Mars


A month from now, the Mars Science Laboratory (Curiosity) rover is set to touch down on the surface of the Red Planet and begin its mission to learn more about the possible existence of life – past or present. Curiosity will attempt to touch down using a complex and unusual landing sequence unlike any other used for previous Mars rovers … here’s how the plan will unfold.

The entire process will be executed completely autonomously, with no human intervention.

The technology behind the landing is an interplay of hardware and software. On the software side, the computer algorithms that guide each part of the craft can be tested from Earth, simulations can be run, and new software updates can be installed – the final stable version was uploaded in the last few days of May.

Testing the hardware was not nearly as easy, since the right conditions can’t be recreated on Earth…

July 5, 2012

Read more >

The rapidly evolving world of robotic technology


June 25 (Bloomberg) — The Institute for the Future’s Marina Gorbis discusses the rapidly evolving world of robotic technology and how humans will interact with, and learn from, robots over the next five to ten years. She speaks with Adam Johnson on Bloomberg Television’s “Bloomberg Rewind.” (Source: Bloomberg)

Marina Gorbis is the Executive Director of the Institute for the Future (IFTF).

Marina’s biography – During her tenure at IFTF, and previously with SRI International, Marina has worked with hundreds of organizations in business, education, government, and philanthropy, bringing a future perspective to improve innovation capacity, develop strategies, and design new products and services. A native of Odessa, Ukraine, Marina is particularly suited to see things from a global perspective. She has worked all over the world and feels equally at home in Silicon Valley, Europe, India, or Kazakhstan. Before becoming IFTF’s Executive Director in 2006, Marina created the Global Innovation Forum, a project comparing innovation strategies in different regions, and she founded the Global Ethnographic Network (GEN), a multi-year ethnographic research program aimed at understanding the daily lives of people in Brazil, Russia, India, China, and Silicon Valley. She also led IFTF’s Technology Horizons Program, focusing on the interaction between technology and social organizations. She has been a guest blogger on BoingBoing.net and writes for IFTF and major media outlets. She is a frequent speaker on future organizational, technology, and social issues. Marina holds a Master’s Degree from the Graduate School of Public Policy at UC Berkeley.

DARPA looks at developing robots to sew uniforms


U.S. military uniforms may not be the most fashionable of clothes, but there are a lot of them. Every year, the Pentagon spends US$4 billion on uniforms, and over 50,000 people are employed in their production. In an effort to cut costs and increase efficiency, DARPA has awarded a US$1.25 million contract to SoftWear Automation, Inc. to develop “complete production facilities that produce garments with zero direct labor” – in other words, a robot factory that can make uniforms from beginning to end without human operators.

Sewing is a very complex task. I would love to know how they are going to do it!

June 18, 2012

Read more >

ESA tests autonomous rover in Chilean desert ahead of ExoMars mission


With remote control of rovers on Mars out of the question due to radio signals taking up to 40 minutes to make the round trip to and from the Red Planet, the European Space Agency (ESA) has developed a vehicle that is able to carry out instructions fully autonomously.

With Mars lacking any GPS satellites to help with navigation, the rover must determine how far it has moved relative to its starting point. However, as ESA’s Gianfranco Visentin points out, any errors in this “dead reckoning” method can “build up into risky uncertainties.”

To minimize these uncertainties, the team sought to fix the rover’s position on a map to an accuracy of one meter (3.28 ft). To build a 3D map of its surroundings, assess how far it had traveled and plan the most efficient route to avoid obstacles, Seeker relied on its stereo vision.
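A hypothetical simulation makes the problem tangible: with dead reckoning alone, a small heading error per step compounds over a 6 km traverse, whereas an occasional vision-based map fix, idealized here as an absolute correction to within about one metre, keeps the estimate bounded. The noise levels and the correction step are assumptions, not ESA's actual figures or algorithm.

```python
# Illustration of dead-reckoning drift versus periodic map-based position
# fixes. Noise magnitudes and the fix model are assumed for illustration.
import math
import random

def simulate(n_steps=600, step_len=10.0, heading_err_deg=1.0, fix_every=None):
    """Drive 6 km in a nominal straight line; return final position error (m)."""
    random.seed(1)
    true_x = true_y = est_x = est_y = 0.0
    heading = 0.0
    for i in range(n_steps):
        true_x += step_len                       # true motion: straight along x
        heading += math.radians(random.gauss(0.0, heading_err_deg))
        est_x += step_len * math.cos(heading)    # estimate drifts with heading error
        est_y += step_len * math.sin(heading)
        if fix_every and (i + 1) % fix_every == 0:
            # Vision-based map fix: snap the estimate to within ~1 m of truth
            # and re-anchor the heading estimate.
            est_x = true_x + random.gauss(0.0, 1.0)
            est_y = true_y + random.gauss(0.0, 1.0)
            heading = 0.0
    return math.hypot(est_x - true_x, est_y - true_y)

if __name__ == "__main__":
    print(f"dead reckoning only : {simulate():7.1f} m error after 6 km")
    print(f"with periodic fixes : {simulate(fix_every=50):7.1f} m error after 6 km")
```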

“We managed 5.1 km (3.16 miles), somewhat short of our 6 km goal, but an excellent result considering the variety of terrain crossed, changes in lighting conditions experienced and most of all this was ESA’s first large-scale rover test – though definitely not our last.”

“The difficulty comes with follow-on missions, which will require daily traverses of five to ten times longer,” he says. “With longer journeys, the rover progressively loses sense of where it is.”

June 19, 2012

Read more >