Caltech’s aquatic robot uses AI to navigate the oceans


The ocean is big, and our attempts to understand it are still largely surface-level. According to the National Oceanic and Atmospheric Administration, about 80 percent of the big blue is “unmapped, unobserved and unexplored.”

Ships are the primary means of collecting information about the seas, but sending them out frequently is costly. More recently, robotic buoys called Argo floats have drifted with the currents, diving up and down to take various measurements at depths of up to 6,500 feet. But new aquatic robots from a Caltech laboratory could go further, undertaking more tailored underwater missions.

“We envision an approach for global ocean exploration where you take swarms of little robots of different types and fill the ocean with them for tracking, for climate change, for understanding the physics of the ocean,” explains John O. Dabiri, a professor of aeronautics and mechanical engineering at the California Institute of Technology.

Enter CARL-Bot (Caltech Autonomous Reinforcement Learning Robot), a palm-sized aquatic robot that looks like a cross between a pill capsule and a dumbo octopus. It has motors for swimming, is weighted to stay upright, and carries sensors that measure pressure, depth, acceleration, and orientation. Everything CARL does is powered by a microcontroller inside, which has a 1-megabyte processor smaller than a postage stamp.

CARL is the Dabiri Lab’s latest ocean-exploring innovation, designed and 3D-printed at home by Caltech graduate student Peter Gunnarson. The first tests Gunnarson ran with it were in his bathtub, since Caltech’s labs were closed in early 2021 due to COVID.

[Related: These free-floating robots can monitor the health of our oceans]

At present, CARL can still be controlled remotely. But to really reach the deepest parts of the ocean, there can be no hand-holding. That means no researcher giving CARL directions; it has to learn to navigate the mighty ocean on its own. Gunnarson and Dabiri sought out computer scientist Petros Koumoutsakos, who helped develop AI algorithms that could teach CARL to orient itself based on changes in its immediate environment and on its past experiences. Their research was published this week in Nature Communications.

CARL can decide on the fly to adjust its route to dodge strong currents and still reach its destination. Or it can hold position at a designated spot using minimal energy from its lithium-ion battery.

CARL’s power is in its memories

The set of algorithms Koumoutsakos developed can perform the orientation calculations on board the small robot. The algorithms also take advantage of the robot’s memory of previous encounters, such as how to get past a whirlpool. “We can use this information to decide how to handle these situations in the future,” explains Dabiri.

CARL’s programming allows it to remember similar paths it has taken on previous missions and, “over repeated experiments, get better and better at sampling the ocean with less time and less energy,” adds Gunnarson.

Much of machine learning is done in simulation, where every data point is clean. But transferring that to the real world can be tricky: the sensors are sometimes overwhelmed and may not pick up all the necessary readings. “We’re just starting the testing in the physical tank,” Gunnarson says. The first step is to test whether CARL can perform simple tasks, such as repeated dives. A short video on Caltech’s blog shows the robot awkwardly wobbling and diving in a tank of still water.
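One common way to shrink that sim-to-real gap is to make the simulator's sensors imperfect on purpose. The sketch below is a hypothetical illustration (the class, parameters, and noise levels are invented, not from the Caltech work): it wraps a clean simulated depth reading in Gaussian noise and occasional dropouts, so a controller trained against it has already seen the kind of flaky data a real sensor produces.

```python
import random

# Hypothetical sketch: a simulator returns clean readings, but real sensors
# are noisy and occasionally overwhelmed. Wrapping simulated measurements in
# noise and dropouts is a common sim-to-real trick; all names and values
# here are invented for illustration.
class NoisyDepthSensor:
    def __init__(self, true_depth_fn, noise_std=0.05, dropout_prob=0.1):
        self.true_depth_fn = true_depth_fn  # ground-truth depth from the simulator
        self.noise_std = noise_std          # Gaussian measurement noise (meters)
        self.dropout_prob = dropout_prob    # chance the sensor misses a reading
        self._last = None                   # most recent reading, for dropouts

    def read(self, t):
        if self._last is not None and random.random() < self.dropout_prob:
            return self._last  # sensor overwhelmed: repeat the stale reading
        self._last = self.true_depth_fn(t) + random.gauss(0.0, self.noise_std)
        return self._last

# Example: a robot holding a constant 5 m depth sees jittery, occasionally
# repeated readings instead of a flat line of perfect 5.0 values.
sensor = NoisyDepthSensor(lambda t: 5.0)
readings = [sensor.read(t) for t in range(100)]
```

Training against the wrapped sensor rather than the raw simulator state is a cheap first step before moving, as the lab is doing, to a physical tank.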


As the tests progress, the team plans to place CARL in a pool-shaped tank with small jets that can generate horizontal currents for it to navigate. Once the robot graduates from there, it will move to a two-story facility that can mimic upwelling and downwelling currents. There, CARL will have to figure out how to hold a certain depth in a region where the surrounding water flows in all directions.

[Related: Fish sounds tell us about underwater reefs—but we need better tech to really listen]

“Ultimately, though, we want CARL in the real world. It will leave the nest and go out into the ocean, and with repeated trials there, the goal is for it to learn to navigate on its own,” explains Dabiri.

During testing, the team will also adjust the sensors in and on CARL. “One of the questions we asked ourselves was what is the minimum set of sensors that you can put on board to accomplish the task,” explains Dabiri. When a robot is equipped with tools like LiDAR or cameras, “it limits the ability of the system to stay in the ocean for a very long time before needing to change the battery.”

By lightening the load on the sensors, researchers could extend CARL’s lifespan and open up space to add scientific instruments to measure pH, salinity, temperature, and more.

CARL’s software could inspire the next bionic jellyfish

Early last year, Dabiri’s group published an article about how they were using electric zaps to control the movements of a jellyfish. It’s possible that adding a chip that houses machine learning algorithms similar to CARL’s would allow researchers to better direct jellies across the ocean.

“Finding out how this navigation algorithm works on a real live jellyfish can take a lot of time and effort,” says Dabiri. In that regard, CARL provides a test vessel for algorithms that could eventually be loaded into mechanically modified creatures. Unlike robots and rovers, these jellies have essentially no depth limit: biologists know they can exist in the Mariana Trench, some 30,000 feet below the surface.

[Related: Bionic jellyfish can swim three times faster]

CARL, on its own, can still be a useful asset for ocean monitoring. It can work alongside existing instruments like Argo floats, and take on solo missions for finer-grained exploration, since it can get close to the seabed and other fragile structures. It can also tag along with biological organisms, like a school of fish, to track them.

“You might one day imagine 10,000 or a million CARLs (we’ll give them different names, I guess) all going out into the ocean to measure areas that we just can’t access today, simultaneously, so that we have a well-resolved picture of how the ocean is changing,” says Dabiri. “This is going to be really essential for modeling climate predictions, but also for understanding how the ocean works.”


