Teresa Escrig

News and opinion about Cognitive AI & Robotics

Archive for the ‘decision making’ tag

What are the benefits of Artificial Intelligence in Robotics?

one comment

Happy New Year to all!  It’s been a while since my last post. Too busy. Now, I’m back.

————————————————————————————-

Robotics is not only a research field within artificial intelligence, but also a field of application, one where all areas of artificial intelligence can be tested and integrated into a final result.

Amazing humanoid robots exhibit elegant, smooth motion: they can walk, run, and go up and down stairs.  They use their hands to protect themselves when falling, and to get up afterward.  They are an example of the tremendous financial and human capital being devoted to research and development in electronics, control, and robot design.

Very often, the behavior of these robots consists of a fixed set of pre-programmed instructions that are repeated regardless of any changes in the environment. These robots have no autonomy and no adaptation to a changing environment, and therefore do not show intelligent behavior. We are amazed by the technology behind them, which is fantastic! But we cannot infer that, just because these robots are physically so realistic and their movements so precise and gentle, they are able to do what we (people) do. Read the rest of this entry »

The rapidly evolving world of robotic technology

leave a comment

June 25 (Bloomberg) — The Institute for the Future’s Marina Gorbis discusses the rapidly evolving world of robotic technology, how humans will interact with robots, and how we will learn from them over the next five to ten years. She speaks with Adam Johnson on Bloomberg Television’s “Bloomberg Rewind.” (Source: Bloomberg)

Marina Gorbis is the Executive Director of the Institute for the Future.

Marina’s biography – During her tenure at IFTF, and previously with SRI International, Marina has worked with hundreds of organizations in business, education, government, and philanthropy, bringing a future perspective to improve innovation capacity, develop strategies, and design new products and services. A native of Odessa, Ukraine, Marina is particularly suited to see things from a global perspective. She has worked all over the world and feels equally at home in Silicon Valley, Europe, India, or Kazakhstan. Before becoming IFTF’s Executive Director in 2006, Marina created the Global Innovation Forum, a project comparing innovation strategies in different regions, and she founded the Global Ethnographic Network (GEN), a multi-year ethnographic research program aimed at understanding the daily lives of people in Brazil, Russia, India, China, and Silicon Valley. She also led IFTF’s Technology Horizons Program, focusing on the interaction between technology and social organizations. She has been a guest blogger on BoingBoing.net and writes for IFTF and major media outlets. She is a frequent speaker on future organizational, technology, and social issues. Marina holds a Master’s Degree from the Graduate School of Public Policy at UC Berkeley.

ESA tests autonomous rover in Chilean desert ahead of ExoMars mission

leave a comment

With remote control of rovers on Mars out of the question due to radio signals taking up to 40 minutes to make the round trip to and from the Red Planet, the European Space Agency (ESA) has developed a vehicle that is able to carry out instructions fully autonomously.

With Mars lacking any GPS satellites to help with navigation, the rover must determine how far it has moved relative to its starting point. However, as ESA’s Gianfranco Visentin points out, any errors in this “dead reckoning” method can “build up into risky uncertainties.”

To minimize these uncertainties, the team sought to fix the rover’s position on a map to an accuracy of one meter (3.28 ft). To build a 3D map of its surroundings, assess how far it had traveled and plan the most efficient route around obstacles, the Seeker rover relied on its stereo vision.

“We managed 5.1 km (3.16 miles), somewhat short of our 6 km goal, but an excellent result considering the variety of terrain crossed, changes in lighting conditions experienced and most of all this was ESA’s first large-scale rover test – though definitely not our last.”

“The difficulty comes with follow-on missions, which will require daily traverses of five to ten times longer,” he says. “With longer journeys, the rover progressively loses sense of where it is.”
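
To make the drift problem described in these quotes concrete, here is a minimal sketch, in Python, of planar dead reckoning from noisy odometry. It is not ESA’s or Seeker’s navigation software; the function, step length and noise values are illustrative assumptions. It shows why a position estimate built only from the rover’s own motion measurements drifts further from the truth the longer the traverse, and why periodic position fixes against a vision-built map are needed.

    # Illustrative sketch only (not ESA/Seeker code): integrate noisy motion
    # increments and compare the estimated position with the true position.
    import math
    import random

    def dead_reckoning_error(steps, step_len=0.5, turn_noise=0.0005, len_noise=0.01):
        """Return the final distance between the true and the estimated position."""
        true_x = true_y = est_x = est_y = 0.0
        true_heading = est_heading = 0.0
        for _ in range(steps):
            turn = random.gauss(0.0, 0.02)          # the rover's actual small heading change
            true_heading += turn
            true_x += step_len * math.cos(true_heading)
            true_y += step_len * math.sin(true_heading)
            # odometry measures the turn and the distance with small errors,
            # and the error in the heading estimate accumulates step by step
            est_heading += turn + random.gauss(0.0, turn_noise)
            measured_len = step_len + random.gauss(0.0, len_noise)
            est_x += measured_len * math.cos(est_heading)
            est_y += measured_len * math.sin(est_heading)
        return math.hypot(est_x - true_x, est_y - true_y)

    # 10,000 steps of 0.5 m is roughly the 5 km scale of the Seeker trial; with these
    # toy noise values the final estimate typically ends up far more than one metre
    # from the true position unless it is corrected against landmarks or a map.
    print(dead_reckoning_error(10000))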

June 19, 2012

Read more >

The Future of Robotics: personal point of view

2 comments

The future of robotics is advancing towards the incorporation of increasing intelligence.

Intelligence includes, among other things: perception (interpreting the environment and extracting the most relevant information from it); reasoning (inferring new knowledge from what we perceive, i.e. if we know that A implies B, and B implies C, then we can infer that A implies C); learning (as many people have pointed out in this thread already); and decision making, to implement solutions for particular applications (such as security, companion and tele-presence robots, autonomous scrubber machines, vacuum cleaners, etc.).
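
As a toy illustration of the reasoning component described above (and only that; this is not Cognitive Robots’ inference engine), the following Python sketch closes a set of implication facts under transitivity, so that “A implies B” and “B implies C” yield “A implies C”:

    # Toy example of transitive inference over implication facts.
    def transitive_closure(implications):
        """implications: a set of (antecedent, consequent) pairs such as ("A", "B")."""
        closure = set(implications)
        changed = True
        while changed:
            changed = False
            for (a, b) in list(closure):
                for (c, d) in list(closure):
                    if b == c and (a, d) not in closure:
                        closure.add((a, d))    # from a->b and b->d, infer a->d
                        changed = True
        return closure

    print(transitive_closure({("A", "B"), ("B", "C")}))
    # prints a set containing ('A', 'B'), ('B', 'C') and the inferred ('A', 'C')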

At Cognitive Robots, we have developed the first embryonic brain, called the “Cognitive Brain for Service Robotics” (CR-B100), which integrates all four of these aspects in patent-pending software.

We have tested the “brain” in several “bodies” with excellent results.

Please, check this post for more information.

We are actively looking for partnerships and investment capital to bring our company Cognitive Robots to the next level.

If you know of a visionary mind with capital to invest, please, pass that person my email: mtescrig@c-robots.com

We are planning to turn to crowdfunding platforms like Kickstarter and to offer our own robotic platform (brain and body) for research, and a smaller version for education. What are your thoughts on that?

Cognitive Robots enhances Kompai’s capabilities by incorporating its “Cognitive Brain for Service Robotics”

leave a comment

Since February 2011, Cognitive Robots and Robosoft have been collaborating within the framework of a European project, ECHORD C-Kompai. The objective of the project is to enhance the companion robot Kompai with the cognitive capabilities provided by the “Cognitive Brain for Service Robotics ®” – CR-B100 – of Cognitive Robots.

The intent behind the improvement of the Kompai platform is to better serve the users – the elderly.

We have identified three aspects of Kompai’s functionality to be improved in this project:

Read the rest of this entry »

We need Service Robots to feed disabled students

6 comments

Dear Teresa, My name is Paul Doyle and I am Head of Access R&D at Hereward College in Coventry. Hereward is a residential college that supports disabled students. We have for some years developed a keen interest in the use of robotics as an assistive technology.

I have been in contact with many providers of robots over the years, from the PR2 at Willow Garage to the Care-O-bot by Fraunhofer, with little tangible progress. What we have failed to achieve to date is to embed and evaluate an actual device in a real care/living/education environment such as Hereward, to see if it actually works and if it is financially viable!

I would like to challenge any robot, for example, to help with the scenario I posted recently on a LinkedIn forum:

“Today when I was having lunch in our refectory I observed a number of students (with a variety of physical disabilities) waiting in an orderly queue for a human carer to help feed them their lunchtime meal. Due to a shortage of carers, some of the students waited a very long time before a staff member could ask what the student wanted from the menu, pick up the chosen meal from the counter and then feed the student in an appropriate manner (food at the right temperature, consistency and rate).
This situation led me to ponder the questions: could a robot have helped carry out these tasks to some degree, and, bearing in mind the care staff are paid not much over minimum wage, when (if ever) will a robot alternative be financially viable?”

I would hope manufacturers could see this exposure to a group of users as a development resource, as we have a residential care and education setting where such technologies can be tested in a managed and safe environment.

Many of the young people at Hereward will eventually be the recipients of assistive robot technologies if and when they come online, so hearing what they need and want would, I imagine, provide useful insight to product developers.

Read the rest of this entry »

The SICK laser sensor is currently mandatory for autonomous robots – if we want the ability to perceive the world, and therefore show a bit of intelligence

8 comments

The SICK safety laser sensor is currently mandatory for autonomous robots, if we want them to be able to perceive the world and therefore show a bit of intelligence. It costs almost 3,000 euros. While not without its drawbacks, this sensor represents the state of the art and is the most expensive component in a current autonomous robot. As long as we produce robots as prototypes, not on a large scale, we cannot provide inexpensive robots yet.

James Falasco – I am curious about the comment that the SICK sensor is mandatory. How so?

Teresa – Jim, the SICK laser sensor is still mandatory for robots or vehicles that need to show intelligence because:

  • it’s the most reliable distance sensor for medium-to-long distances, much more so than sonar or infrared (which are basically useful only for very short distances).
  • it’s necessary to perceive the boundaries of the environment in order to autonomously build a map of it; the map is what lets the robot know where things are (a minimal sketch of this step appears after the list).
  • A planar laser such as the SICK also has drawbacks. The main one is that it only perceives a single scan line.
  • The best way to go would be to obtain and interpret all the needed information from a camera, which would be much less expensive and would provide richer information.
  • Although we have developed a cognitive vision system that gives meaning to the objects in an image, and with two cameras you can get distances to objects, we still need further development and some integration to rely on cameras alone.
  • We have also integrated the Kinect sensor into the Cognitive Brain with great success. It gives us depth in a conical area in front of the robot, although with short reach (we can’t see the limits of the rooms), and it is very sensitive to light changes (not good in outdoor settings yet).
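
As mentioned in the second point above, here is a minimal, generic sketch of how a planar laser scan can be turned into map boundary points: a list of ranges at known angles, taken from a known robot pose, becomes a set of 2D points in the map frame that can be accumulated into a map of the environment. This is an illustrative assumption of how such a step can look, written in Python; the function and parameter names are invented for the example, and this is not the CR-B100 implementation.

    # Illustrative only: convert one planar laser scan into 2D boundary points.
    import math

    def scan_to_points(ranges, angle_min, angle_step, pose, max_range=30.0):
        """ranges: distances in metres; pose: (x, y, heading) of the robot in the map frame."""
        x, y, heading = pose
        points = []
        for i, r in enumerate(ranges):
            if r <= 0.0 or r >= max_range:   # drop invalid or out-of-range returns
                continue
            angle = heading + angle_min + i * angle_step
            points.append((x + r * math.cos(angle), y + r * math.sin(angle)))
        return points

    # Example: a five-beam scan of a wall roughly two metres ahead of a robot at the origin.
    print(scan_to_points([2.0, 2.1, 2.2, 2.1, 2.0],
                         angle_min=-0.2, angle_step=0.1, pose=(0.0, 0.0, 0.0)))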

Summary: we use laser, Kinect and camera sensors. We can’t avoid the laser yet, and it is by far the most expensive component of the whole robot.

I am sure that with more development we can make the camera work well enough to completely substitute for the laser. I would love to do it.

Comments of other experts on the subject are very welcome. Thanks.

Read the comments.

Read the rest of this entry »

Cognitive Robots is actively seeking working partnerships and investment capital

3 comments

My name is Teresa Escrig (TeresaEscrig.com).  I’m the founder and CEO of Cognitive Robots.

We’ve successfully developed the world’s first truly autonomous Cognitive Brain, and have focused our efforts on Service Robotics.

We’re actively seeking both working partnerships and investment capital.

Highlights to date include:

  • A part of the Cognitive Brain for Service Robotics has been successfully incorporated into a commercial floor scrubber machine, as well as a Pioneer research platform (investment from different sources).
  • Our ‘Manual Assisted Driver’ has been successfully incorporated into forklifts and buses (funded by the Spanish government).
  • We have integrated the Cognitive Brain into our own service robotics platform.  This will be launched in the next few months, and can be used for a variety of applications, including companion, security, marketing, air contamination detection, etc. (funded by the Spanish government).
  • The Cognitive Brain is being incorporated into Robosoft’s companion robot Kompai (funded by a European project).


If you’d like further information, we’ve prepared a PDF document that explains in detail what we have and what we are offering.

If you are interested, please, contact me at mtescrig@c-robots.com

Kind Regards, Teresa Escrig, PhD, CEO Cognitive Robots

The Intelligence Revolution: Visions of the Future

2 comments

Dr. Michio Kaku is a theoretical physicist, best-selling author, and popularizer of science. He’s the co-founder of string field theory (a branch of string theory), and continues Einstein’s search to unite the four fundamental forces of nature into one unified theory.

In this incredibly well-done film, he explains how Artificial Intelligence is affecting our lives now, how our kids are spending more time in virtual worlds, such as “World of Warcraft”, than with their real friends, and how this will affect our lives in the near future.

It is an amazing review of some of the scientific research related to Artificial Intelligence that is taking place around the world.

There is at least one thing that I do not agree with at all: that, in the near future, humans are going to have more robotic parts than human parts incorporated into their bodies.  To me, this final idea is nonsense.

Design of a robot for the elderly: appearance and functionalities

leave a comment

If you were going to design a companion robot, what would it look like?

What would it need to do?

What would you call it?

How would it change the life of the elderly?

I asked these questions of my colleagues in the LinkedIn groups related to Artificial Intelligence and Robotics.

I will be posting their answers here. Thank you very much for all your contributions!  Keep an eye on it…

by Elad Inbar (LinkedIn Group: IEEE Robotics and Automation Society (IEEE RAS))

Check out this new movie… I think it will give you many answers.

http://singularityhub.com/2012/02/24/new-robot-and-frank-movie-looks-like-a-realistic-portrayal-of-the-not-too-distant-future/

I have posted the trailer below: Frank Langella and Liv Tyler on their Sundance hit ‘Robot and Frank,’ about an elderly man living with a home health aide robot. (March 23)

Read the rest of this entry »