Teresa Escrig

News and opinion about Cognitive AI & Robotics

The SICK laser sensor is currently mandatory for autonomous robots – if we want the ability to perceive the world, and therefore show a bit of intelligence

8 comments

The SICK safety laser sensor is currently mandatory for autonomous robots if we want them to perceive the world and therefore show a bit of intelligence. It costs almost 3,000 euros. While not without its drawbacks, this sensor represents the state of the art, and it is the most expensive component in a current autonomous robot. As long as we produce robots as prototypes rather than at large scale, we cannot yet offer inexpensive robots.

James Falasco – I am curious about the comment that the SICK sensor is mandatory. How so?

Teresa – Jim, the SICK laser sensor is still mandatory for robots or vehicles that need to show intelligence because:

  • it’s the most reliable distance sensor at medium-to-long range, much more so than sonar or infrared (which are basically only useful at very short range)
  • it’s necessary to perceive the boundaries of the environment in order to autonomously build a map of it. The map is what lets the robot know where things are (a toy sketch of this mapping step follows the summary below).
  • A linear laser such as the SICK also has drawbacks. The main one is that it only perceives a single scan line.
  • The best way forward would be to obtain and interpret all the information we need from a camera, which would be much less expensive and would provide richer information.
  • We have developed a cognitive vision system that gives meaning to the objects in an image, and with two cameras you can obtain distances to objects (see the stereo sketch right after this list), but we still need further development and integration before we can rely on cameras alone.
  • We have also integrated the Kinect sensor into the Cognitive Brain with great success. It gives us depth in a conical area in front of the robot, although with short reach (we can’t see the limits of a room) and high sensitivity to light changes (not yet good in outdoor settings).
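
On the two-camera point above, here is a minimal stereo-depth sketch (illustrative only, not our production code): given a calibrated, rectified image pair, depth follows from disparity. The file names, focal length and baseline below are placeholder values, not real calibration data.

    # Minimal stereo-depth sketch (illustrative only).
    # Assumes a calibrated, rectified stereo pair; the file names and the
    # focal length / baseline values below are placeholders.
    import cv2
    import numpy as np

    left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
    right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

    # Block-matching stereo: numDisparities must be a multiple of 16.
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = matcher.compute(left, right).astype(np.float32) / 16.0

    focal_px = 700.0    # focal length in pixels (from calibration)
    baseline_m = 0.12   # distance between the two cameras, in metres

    # depth = focal * baseline / disparity; ignore pixels with no valid match.
    valid = disparity > 0
    depth_m = np.zeros_like(disparity)
    depth_m[valid] = focal_px * baseline_m / disparity[valid]
    print("median depth of valid pixels: %.2f m" % np.median(depth_m[valid]))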

Summary: We use laser, Kinect and camera sensors. We can’t avoid the laser yet, and it is by far the most expensive component of the whole robot.
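
As a rough illustration of the mapping point in the list above, the toy sketch below marks the endpoints of a single 2D laser scan in an occupancy grid. The range readings, beam angles and grid parameters are all made up for the example; a real driver for a SICK or Hokuyo scanner would supply the actual data.

    # Toy occupancy-grid update from one 2D laser scan (illustrative only).
    # The ranges below are fabricated; a real scanner driver would supply
    # the measured ranges (in metres) and the corresponding beam angles.
    import math
    import numpy as np

    ranges = [2.0, 2.1, 2.3, 4.8, 5.0, 5.1]            # placeholder readings
    angles = np.linspace(-math.pi / 4, math.pi / 4, len(ranges))

    cell_size = 0.1                                    # 10 cm per grid cell
    grid = np.zeros((100, 100), dtype=np.uint8)        # 10 m x 10 m map
    origin = (50, 50)                                  # robot at the grid centre

    for r, a in zip(ranges, angles):
        # Convert each beam endpoint to grid coordinates and mark it occupied.
        x = int(origin[0] + (r * math.cos(a)) / cell_size)
        y = int(origin[1] + (r * math.sin(a)) / cell_size)
        if 0 <= x < grid.shape[0] and 0 <= y < grid.shape[1]:
            grid[x, y] = 1                             # 1 = obstacle / boundary

    print("occupied cells:", int(grid.sum()))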

I am sure that with more development we can make the camera work well enough to completely replace the laser. I would love to do it.

Comments of other experts on the subject are very welcome. Thanks.

Read the comments.

Comments:

by Adel Djellal (LinkedIn Group: IEEE Robotics and Automation Society (IEEE RAS)) – Yes, it is mandatory; it helps the robot perceive the entire environment. But the problem is that it is very expensive, so we have to look for less expensive techniques. If we want to design cheaper robots at commercial scale, we need to minimize expenses as much as we can.

by Jim Falasco (LinkedIn Group: IEEE Robotics and Automation Society (IEEE RAS)) – One of the reasons the robotics niche has never fully matured is that we continue to debate technology and don’t talk about what people are really willing to pay for and fund. See The Ten Top Reasons Why The Market Has Never Developed at http://falascoj.blogspot.com/
Here is a teaser:
10. Pervasive Internet
With the all-encompassing reach of the Internet, old ideas resurface and morph into a reflection of the current environment. With all the attention paid to drones and other related technology, people have rediscovered robotics and, using the connectivity of the Internet, are building a worldwide community. This factor gives the movement some glue that didn’t exist when the last crazes hit.

by Seth Kaufman (LinkedIn Group: Robotics Trends Professional Network) – While lidar in general is a common, useful sensor, SICK is not the only brand. Hokuyo makes an IP67-rated lidar unit, for example.

by Harri Vartiainen (LinkedIn Group: Robotics Guru) – Robert, Google Street View cars have at least two SICK scanners. Compare for yourself:
http://ekstreme.com/images/google-streetview-camera-1.jpg
https://mysick.com/partnerPortal/ProductCatalog/DataSheet.aspx?ProductID=33772#
It might not be exactly the same model, but the body is the same.

by Bryn Wolfe (LinkedIn Group: Robotics Guru) – There seems to be some confusion here about what “Google car” means. The Street View car is what Google uses to generate street views for Google Maps; that’s the one Harri V. shows with the SICK sensors on it.
The self-driving car from Google has a Velodyne HDL-64E S2 sensor (shown here:
http://www.youtube.com/watch?v=YaGJ6nH36uI ), which costs about $70K.
SICK used to be the only player in the lidar market, but there are plenty of alternatives now. Unfortunately, these sensors from any vendor are still going to cost you around $5,000.

by David McMillan (LinkedIn Group: Robotics Guru) – One large driver of the cost is the fact that this is safety-related hardware. The testing and certification for these units is much more expensive than for components that do not directly impact human safety. There’s also the legal liability issue, especially in the US market; it’s bad enough that I’ve worked on systems inside the USA where the safety certification had to be performed by someone hired from the EU, specifically because there was no US company willing to accept the legal liability involved.
One side effect of the legal liability issue is that a very high barrier to entry is created, limiting innovation; only entities with very deep pockets and large R&D&T resources will try to get into the safety-hardware market, and even those entities will tend to settle on a “if it’s not broke, don’t tinker with it” policy as soon as they have a product that works well enough.

by Arnout Appelo (personal email) – One of my findings is that for correct navigation you do need something like a laser range scanner, but the thing is way too expensive to make a viable business case for the cleaning industry. Perhaps, though, new solutions will appear soon that will enable the case.

by R. Martin Spencer (LinkedIn Group: Cognitive Modeling) – That’s not really true. We used sensor fusion quite a bit to eliminate the high-cost SICK indoors. We have fused short-range IR on a scanning array with fixed long-range IR and sonar and achieved “loose crowd” levels of autonomy.
Here is an example from our elder care personal robot alpha trials:
http://www.geckosystems.com/timeline/?year=2010
Outdoors, the SICK does dominate, but we believe our CSA technology can be migrated outdoors. Again, the point is to use sensor fusion and not rely on any single sensor system to achieve “actionable situation awareness.”
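
To make the fusion idea concrete, here is a hedged sketch of the general approach Mr. Spencer describes: keep the most conservative (closest) credible reading per bearing sector and stop when anything is near. The sensor names, ranges, threshold and readings are invented for the example; this is not GeckoSystems’ actual CSA technology.

    # Minimal sensor-fusion sketch (illustrative only).
    # Each sensor reports a distance (metres) per bearing sector; we keep
    # the closest credible reading per sector and stop if anything is near.
    SENSOR_MAX = {"ir_short": 0.8, "ir_long": 5.0, "sonar": 6.0}  # assumed ranges
    STOP_DISTANCE = 0.5  # metres; placeholder safety threshold

    def fuse_sector(readings):
        """readings: dict sensor_name -> measured distance for one sector."""
        credible = [d for name, d in readings.items()
                    if 0.0 < d <= SENSOR_MAX[name]]   # drop out-of-range echoes
        return min(credible) if credible else float("inf")

    sectors = [
        {"ir_short": 0.9, "ir_long": 1.2, "sonar": 1.1},  # fabricated data
        {"ir_short": 0.4, "ir_long": 0.5, "sonar": 0.6},
        {"ir_short": 0.0, "ir_long": 3.8, "sonar": 4.1},
    ]

    for i, s in enumerate(sectors):
        d = fuse_sector(s)
        action = "STOP" if d < STOP_DISTANCE else "continue"
        print("sector %d: nearest obstacle %.2f m -> %s" % (i, d, action))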

by Hudhaifa Jasim (LinkedIn Group: IEEE Robotics and Automation Society (IEEE RAS)) – Currently, I’m working on an Intelligent Ground Vehicle, and I’m glad we have reached the level where we can do obstacle avoidance using only machine vision. I admit the accuracy is not yet good enough, but it’s a matter of time (and hard work).

by Ronald Howell (LinkedIn Group: Robotics Guru) – I’ve used the Leuze sensor and had only one problem: a false-reflection issue, which was corrected by turning up the time delay to about 50 ms. (The factory defaults are set very low, probably for liability reasons.) The software is logical but not all that intuitive (German). The price was a little less than the SICK’s. I have also used the Keyence and liked that unit for the price we got; we bought a couple of dozen at about half the price of the SICK. SICK does not like to negotiate.

by Meysar Zeinali (LinkedIn Group: IEEE Robotics and Automation Society (IEEE RAS)) – I think we can do the job with vision and more intelligence. Nature is a good example to learn from: almost all animals use a vision system and intelligence (although some, like bats, use ultrasound). Vision can provide more features of the obstacle and of the surrounding environment.
I have developed a data-analysis and control system that has two components. They can be interpreted as the integration of a fast reaction to immediate feedback information with a reaction based on knowledge already built into the controller’s knowledge base, i.e., the information encapsulated in the fuzzy rules. So we do not even need high-density data streams to make a decision. The built-in knowledge can be updated and modified based on long-term observation. My point is to rely on prediction rather than on high-density data streams, which cost more.
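
As a toy sketch of the two-component idea described above (a fast reflex on immediate feedback plus a pre-built rule base standing in for the fuzzy rules), consider the following; the rules, thresholds and sample inputs are invented for illustration and are not the commenter’s actual controller.

    # Toy two-layer controller (illustrative only): a fast reflex on the
    # latest range reading, plus a small hand-written rule base that
    # encodes prior knowledge and needs no dense data stream.
    def reflex_layer(front_distance_m):
        # Immediate reaction to the most recent measurement.
        if front_distance_m < 0.3:
            return "brake"
        return None  # defer to the knowledge-based layer

    def rule_base_layer(front_distance_m, goal_bearing_deg):
        # Coarse rules standing in for fuzzy rules built into the controller.
        if front_distance_m < 1.0:
            return "turn_away"
        if abs(goal_bearing_deg) > 10:
            return "turn_toward_goal"
        return "go_forward"

    def decide(front_distance_m, goal_bearing_deg):
        return (reflex_layer(front_distance_m)
                or rule_base_layer(front_distance_m, goal_bearing_deg))

    # Fabricated sample inputs:
    for dist, bearing in [(0.2, 0.0), (0.8, 25.0), (3.0, 2.0)]:
        print(dist, bearing, "->", decide(dist, bearing))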

8 Responses to 'The SICK laser sensor is currently mandatory for autonomous robots – if we want the ability to perceive the world, and therefore show a bit of intelligence'


  1. Again I ask, who is making it mandatory? You are making it sound like FAA or other rules apply. They don’t. Also, with the advent of GPS, the location of the robot can be determined just as a cell phone or iPad can be located. The real discussion needs to focus on the application and, most importantly, on what people are willing to pay for. Not thinking this way is one of the reasons this niche hasn’t become a real industry like the aerospace and automotive verticals.

    James Falasco

    3 May 12 at 9:36 pm

  2. Totally true. One of my friend’s startups, Gray Orange Robotics, faces the exact same problem. They have to charge ~$8.5k per robot (warehouse automation bots), of which the major cost component is simply the SICK laser.

    Naresh

    5 May 12 at 1:11 am

  3. I think many researchers use Hokuyo rangefinders, which are cheaper (<1000 euros for the cheapest) and have a lower spec, but are adequate for most indoor ranges (<= 5 m). The Kinect also has a similar range, and though it has problems in outdoor environments, it is potentially more flexible than a (2D) laser-type rangefinder and certainly much lower cost!

    Mick Walters

    5 May 12 at 10:54 am

  4. While lidar in general is a common, useful sensor, SICK is not the only brand. Hokuyo makes an IP67-rated lidar unit, for example.

    Seth Kaufman

    5 May 12 at 12:50 pm

  5. The SICK laser has a safety protocol to ensure that the vehicle it is installed in will stop before it reaches an obstacle (or a person). The whole vehicle inherits this safety protocol once the sensor is installed.

    We have tried the Hokuyo laser; it has less range, which is not enough for our application with scrubber machines in commercial environments, but it is adequate for indoor applications in small spaces.

    In our trials, the SICK laser has performed well outdoors where it has a wall to bounce its beam off of.

    Teresa Escrig

    8 May 12 at 4:58 pm

  6. Dave, the Velodyne lidar looks good. Can you please describe here its main advantages for Service Robotics?

    Teresa Escrig

    8 May 12 at 5:01 pm

  7. Yes, it is really expensive. I also think it is very important to improve current methods so that camera vision can be used like human eyes. I am not convinced by the 64-line laser sensor on the Google car; it’s hard to imagine its practicality in today’s mass-produced cars. But how do we solve visual perception better, when camera vision still cannot reach the level of human eyes?

    Forrestg Wang

    18 May 12 at 11:07 am

  8. They aren’t mandatory. My college robotics team just placed 5th in an autonomous robot competition (the Intelligent Ground Vehicle Competition). We did not use a SICK sensor or any kind of laser scanner. We used two $50 webcams for stereo vision.

    Philip

    13 Jun 12 at 12:28 pm
