How Intel’s bleeding-edge Loihi 2 chips help robots perceive the world

“The Center for Robotics is a huge lab”

Computers destroy humans at chess, but there’s not a robot in the world you could send into an unfamiliar house and tell it to feed the dog; the general intelligence and adaptability of the human brain remain unrivalled. Intel’s research-grade Loihi 2 neuromorphic chips are designed to help close the gap, drawing inspiration from nature’s greatest necktop supercomputer.

We spoke to Queensland University of Technology researcher Dr. Tobias Fischer about his work integrating these cutting-edge chips into autonomous robots, where they’re outperforming resource-draining supercomputers on certain tasks. Fischer’s team is working specifically on localization and navigation – helping robots work out where they are in unfamiliar situations.
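Localization of this kind is often framed as a matching problem: the robot compares a compact descriptor computed from its current camera view against descriptors of places it has seen before, and takes the best match as its location. The sketch below illustrates only that matching step; the place names and three-number descriptors are invented for illustration, and real systems use learned image features rather than raw vectors:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two descriptor vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Hypothetical map: descriptors of places the robot has already visited.
place_map = {
    "kitchen":  [0.9, 0.1, 0.3],
    "hallway":  [0.2, 0.8, 0.5],
    "workshop": [0.4, 0.4, 0.9],
}

def localise(query_descriptor, place_map):
    """Return the mapped place whose descriptor best matches the query."""
    return max(
        place_map,
        key=lambda name: cosine_similarity(query_descriptor, place_map[name]),
    )

current_view = [0.85, 0.15, 0.35]  # descriptor from the current camera frame
print(localise(current_view, place_map))  # closest stored place: "kitchen"
```

The appeal of neuromorphic hardware for this task is that such matching can run continuously at very low power, rather than on a bulky GPU.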


“The Center for Robotics is a huge lab,” said Dr. Fischer. “Over 100 people. We do everything from manipulation – grasping objects and picking them up – to space robotics, a bit of human interaction and the social elements needed when you talk with humans. We do a lot of research on vision techniques, using cameras and sensors to help robots perceive the world similarly to how we do with our eyes. Taking a series of pixel intensities and giving it a higher-level meaning, to say that’s a car, that’s a chair. Super simple, even for a five-year-old, but incredibly hard for a computer.”
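The pixels-to-meaning problem Fischer describes can be shown in miniature: treat each image as a grid of intensity values and label it by its closest stored example. The tiny 2×2 "images" and labels below are invented purely for illustration; real perception systems learn features rather than comparing raw pixels like this:

```python
# Toy nearest-neighbour classifier over raw pixel intensities.
# Each "image" is a flattened 2x2 grid of brightness values in [0, 1].
labelled_examples = [
    ([0.9, 0.9, 0.1, 0.1], "car"),
    ([0.1, 0.9, 0.1, 0.9], "chair"),
]

def classify(pixels, examples):
    """Label an image with the class of its nearest stored example,
    measured by the sum of squared pixel differences."""
    def distance(example):
        return sum((p - q) ** 2 for p, q in zip(pixels, example[0]))
    return min(examples, key=distance)[1]

print(classify([0.8, 0.95, 0.2, 0.05], labelled_examples))  # prints "car"
```

Even this trivial version hints at why the task is hard: a real image has millions of pixels, and the same chair looks completely different from another angle or in different light.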

Read more: New Atlas