April 12, 2026

by Tor Odland

We're going to one of the biggest robotics shows in the world to talk about people

MODEX draws around 40,000 visitors to Atlanta every two years. The floor is enormous with hall after hall of robots moving pallets, scanning shelves, sorting parcels. It is, in every sense, a show about machines. So it feels right that we're showing up with a sign that says: Humans detected.


Meet the Sonair team at booth #B15851


The problem with how robots see

Most robots working in warehouses and factories today rely on 2D safety sensors: flat, horizontal slices of the world at roughly knee height. The robot knows if something is beside it. It does not know what is above, below, or leaning in from an angle. It cannot detect a child crouching, a low pallet partially hidden around a corner, or a bag left on the floor. This is not a failure of engineering. It was a reasonable starting point. But as robots move deeper into spaces shared with people, a flat picture of the world is no longer enough.

We built ADAR to fix that.

Sound sees what light misses

ADAR stands for acoustic detection and ranging. Instead of light, it uses sound. Specifically, MEMS-based ultrasonic arrays that emit pulses in multiple directions and build a true 3D picture of the environment in real time.

The result is a 180° × 180° field of view at a four-metre range. Not a slice. A hemisphere.
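The timing principle behind any ultrasonic range measurement is simple to sketch: a pulse travels out, reflects, and returns, and the round-trip delay gives the distance. A minimal illustration, assuming sound at roughly 343 m/s in air at room temperature (the constant and function names are ours, not Sonair's implementation):

```python
# Illustrative sketch of ultrasonic time-of-flight ranging.
# Assumes dry air at ~20 °C; real systems compensate for temperature
# and humidity, which shift the speed of sound by a few percent.

SPEED_OF_SOUND_M_S = 343.0

def echo_delay_to_range(delay_s: float) -> float:
    """Range in metres from a round-trip echo delay in seconds.

    The pulse covers the distance twice (out and back),
    hence the division by two.
    """
    return SPEED_OF_SOUND_M_S * delay_s / 2.0

# An object at the sensor's 4 m limit returns an echo after ~23 ms:
delay = 2 * 4.0 / SPEED_OF_SOUND_M_S
print(f"{delay * 1000:.1f} ms -> {echo_delay_to_range(delay):.2f} m")
```

At a four-metre range, the full round trip takes under 25 milliseconds, which is why an array like this can refresh its whole 3D picture many times per second.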

And because it works through sound rather than light, it does not care about the conditions that cause cameras and LiDAR to struggle: darkness, dust, reflective floors, humid air, transparent surfaces. The environments where autonomous robots most commonly operate are exactly the environments where optical sensing most commonly fails.

ADAR also carries no cameras. It captures no images. In a world where robots are increasingly deployed in schools, hospitals, sports venues, and government facilities, that matters. Not just for safety, but for privacy and data compliance.

Already in production

Technology is easy to promise. Harder to prove. At MODEX, we will point to something real: Cleanfix's RA660 Navi XL, a next-generation autonomous floor-cleaning robot now shipping with ADAR as its primary safety sensor. Where earlier versions of the platform relied on up to 14 separate linear ultrasonic sensors, each covering a limited area, ADAR consolidates all of that into a single unit, generating a full 3D point cloud.

RA660 Navi XL by Cleanfix

"For us, it was crucial that perception remains stable even under difficult conditions," says Roger Kaiser, Head of Robotics at Cleanfix. "Only then can autonomous cleaning be reliably scaled in everyday operation."

Fewer interruptions. Smoother motion. Robots that work through the night without stopping for a dust particle.

Cleanfix is exhibiting at MODEX: Booth A7005.

The team in Atlanta

Four of us are making the trip: Christian Clausen (CCO), Mathias Madsen (Robotics Engineer), Marta Haro (Sales Support Manager), and Tor Odland (CMO). We will be on the show floor talking to engineers, operators, fleet managers, and robot builders: the people who decide what goes on a machine and why.

We expect they will want to understand not just what the sensor does, but why it is built the way it is. That question is the one we find most interesting too.


Come find us

We are at MODEX in Atlanta this week (April 13-16, 2026). If you are there, come by and see ADAR in person.

If you are not at the show but are building autonomous machines that operate around people, we would still like to talk.

Get in touch →