A Conversation With Brendan Englot, Stevens Institute’s New Director for Artificial Intelligence

Englot studies how to develop navigation, sensing, and mapping systems for robots in GPS-denied environments.

Stevens Institute of Technology

Brendan Englot is the new director of Stevens Institute for Artificial Intelligence (SIAI)
Englot spoke with Robotics 24/7 as part of National Robotics Week and discussed his research and new role as Stevens Institute's new director for artificial intelligence.

With National Robotics Week soon coming to a close, we at Robotics 24/7 wanted to get some additional academic perspective.

We spoke with Brendan Englot, an associate professor of mechanical engineering at the Stevens Institute of Technology, to discuss his research in submarine robotics and his role as an educator.

“I just think robotics is a great way to get a hands-on introduction to all the different career paths that are the result of STEM education,” he told us.  

In addition to teaching robotics courses and running his own robotics lab, Englot was recently named the new director of the Stevens Institute for Artificial Intelligence (SIAI). It's a role he is excited about as AI continues to permeate into new subject areas.

Englot is also a senior member of the Institute of Electrical and Electronics Engineers (IEEE) and the American Society of Mechanical Engineers (ASME).

Editor's note: This interview has been edited and condensed for clarity.

Can you give some background on who you are and your area of specialty?

Brendan Englot 

I would broadly describe my area of expertise as autonomous navigation for mobile robots – any kind of robot that moves, whether that's on the ground, underwater, or in the air. I'm concerned with how those robots can localize themselves, figure out where to go, how to get there, and do so in degraded conditions they might encounter in outdoor environments and in GPS-denied situations as well.

A lot of that work in recent years has involved underwater robots because that is really where some of the toughest unsolved problems are right now in robot navigation. There are a lot of unique challenges in that domain.

Submerged underwater, you don't really have access to the electromagnetic spectrum, so you can't communicate via Wi-Fi or radio. You can't receive GPS information. Your ability to see what is around you is degraded as well. Cameras don't work as well underwater. … So we have to take advantage of unconventional sensing and communication modalities for robots to have situational awareness.

That's a major thrust of my research, helping underwater robots, and in general, robots that are operating in degraded conditions achieve better situational awareness…

In the course of doing that, we also integrate a lot of recent developments in AI and machine learning into our solutions. By virtue of that, I'm also serving as the director of the Stevens Institute for Artificial Intelligence. As of this year, I've been director of that center in addition to running my robotics lab at Stevens.

You recently published an academic journal article that describes the work you and a group of students completed regarding submerging autonomous mobile robots in a crowded Long Island marina. Could you speak about that research and your findings?

That article was published in the IEEE Journal of Oceanic Engineering, and it describes a multi-year undertaking to customize an underwater robot and equip it with the capabilities, both the navigation and decision-making capabilities, needed to explore unknown environments completely on its own, completely autonomously.

The goal was to introduce it into a part of a marina it had never seen before. There would be obstacles surrounding it everywhere – piers, boat hulls, sea walls, all kinds of hazardous structures. And the goal was for the robot, on its own, to get turned on, wake up, discover its surroundings, and then figure out how to build a complete map of the entire environment within a bounding box that we provided it. The only thing it knows is the designated boundaries, and it has to try to map everything within those boundaries.

It took many years to get to that point where we had that working successfully. We started out working in simulation. We used simulation tools to try to develop algorithms that we'd thought would run efficiently on the real robot. We wanted to capture the fact that this robot would be computationally constrained. …

At the conclusion of the project, we did have a fully autonomous exploration capability, where our robot was able to map the surrounding environment using its sonar, plan a trajectory through the environment, choose the next places to explore while avoiding collisions along the way, and, as it navigated, extend the boundaries of its map and populate it with new information. It ended up coming out of it with a very accurate map as well.

We did a systematic study where we compared our approach with a few other classic approaches for robots to explore unknown environments. The key innovation with our method was that the robot's goal was not just to do this quickly, but the goal was to build the most accurate map possible. The robot needed to be able to predict its own navigation precision as it went along the way. …

Throughout this experiment, our robot was basically predicting its own uncertainty step by step, and trying to make decisions that would keep that uncertainty ellipse as small as possible and allow it to build the most accurate map.
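The idea of picking exploration goals that keep predicted uncertainty small can be illustrated with a toy sketch. This is not the algorithm from the published paper – the noise values, measurement gains, and candidate routes below are all made up for illustration – but it shows the basic trade-off: a longer route past good sonar features can yield a smaller final uncertainty ellipse than a short route through featureless open water.

```python
import numpy as np

def propagate_uncertainty(cov, motion_noise, measurement_gain):
    """One simplified filter step: motion inflates the covariance,
    then a measurement shrinks it by a fractional gain."""
    cov = cov + motion_noise                 # prediction: motion adds noise
    return (1.0 - measurement_gain) * cov    # update: sensing reduces it

def choose_next_goal(cov, candidates):
    """Pick the candidate whose predicted path ends with the smallest
    uncertainty ellipse, summarized by the trace of the covariance."""
    best_goal, best_trace = None, float("inf")
    for goal, steps, gain in candidates:
        c = cov.copy()
        for _ in range(steps):
            c = propagate_uncertainty(c, 0.05 * np.eye(2), gain)
        if np.trace(c) < best_trace:
            best_goal, best_trace = goal, np.trace(c)
    return best_goal

# Hypothetical frontiers: a 10-step route with strong sonar features
# (gain 0.4) vs. a 4-step route through open water (gain 0.05).
cov0 = 0.1 * np.eye(2)
candidates = [("featured-route", 10, 0.4), ("open-water", 4, 0.05)]
print(choose_next_goal(cov0, candidates))  # → featured-route
```

Despite being longer, the feature-rich route wins because each sonar update claws back more uncertainty than the extra motion adds.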

I know the whole point of National Robotics Week is to get students inspired about robotics and STEM. Can you speak about how students engaged with this project?

 We've had students involved in that project over the years at all different levels. Undergraduate students have gotten involved. Master's students and Ph.D. students have gotten involved as well. I would say the main technical contributors to this research over the years have been the Ph.D. students who made a long-term commitment of several years.  

This research ended up being the content of their doctoral theses, and they went off and started their own careers in research afterward. There have been some pivotal contributions to the project in that category.

But we also have had every year new undergraduate students join our project and help us in ways that really turned out to be critical, helping us design and build new components into our robot, helping us add additional sensor payloads to the robot, helping us launch and recover it from outdoor environments safely.

Anytime we dropped the robot into the water and pulled it back, we wanted to make sure that we were doing it in a way that protected the equipment and was safe for all the people operating the robot.

We've had undergraduate students over the years help us improve and optimize the user interface so that we could look at our computer and see exactly what the robot is doing, and get the best picture of what is going on below.

This has led to great career opportunities as well for students who get involved in this work because robotics is a multidisciplinary experience that allows them to see a little bit of mechanical design, electrical design, and programming algorithm design. They come out with a great skill set that ends up being very helpful in looking for jobs later on.

Tell us more about your role as the new director of Stevens Institute for Artificial Intelligence.

It's a unique opportunity because SIAI is a unique center at Stevens, and is very, very broad in scope. It's probably the broadest collection of faculty that we have involved in one center at Stevens. AI touches so many different disciplines in science and engineering, where you would expect it to, but then even outside of those as well. We have faculty in the arts and humanities, music, business, and systems and enterprise, who all think about AI in different ways and study it in different applications.

Our goal is to bring together all those researchers, who have very different perspectives on AI, and enable ambitious multidisciplinary collaborations that might not otherwise have happened. We may not normally have had a media or music or arts professor talking to a computer science professor, but through our center, those kinds of connections happen.

With technologies like ChatGPT, it seems like AI is becoming more graspable to regular people. You are talking about bringing this multi-disciplinary approach to understanding AI. Was that intentional on your part knowing that these things are converging, or was it kind of like a happy accident?  

I think it is just kind of a happy accident that we were sitting in the right place at the right time for all of this to happen. Everyone who belongs to our center right now was already working on AI in various areas. All the new developments that now we are seeing week by week with these generative AI capabilities that are coming out and having such impact, it's just motivated us to embrace new ways we can brainstorm and team up to better understand these tools.

Do you teach specific classes at the university?

The whole time I've been at Stevens, I have taught one core undergraduate course, which is the senior design capstone. It's a really fun course to teach. It's intended to mimic the real-world engineering experience you might have when you go out into the industry and you work on a real project with an engineering team.

It's one of the most unique experiences students get to have because everyone does it differently. Everyone chooses a different project or project sponsor.

I teach a section of that course where all the students are working on robotics projects. It involves multidisciplinary teams of mechanical, electrical, and computer engineering students working together. It's a really fun class to teach because I get to work with students over the course of a whole year as they're building these exciting robot projects.

In addition to that undergraduate course, I also teach a graduate course that I've developed basically as the course I use to train students in my own lab. The title of the course is basically my area of research specialization – autonomous navigation for mobile robots.

In that course, I teach all the fundamentals of the different types of algorithms autonomous vehicles need to navigate successfully. We talk about path planning. We talk about localization and so on. At the end of the semester, students have kind of seen the gamut of different algorithms that an autonomous vehicle needs to navigate successfully.
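Path planning, one of the course topics Englot mentions, can be illustrated with a minimal sketch. This is a generic teaching example, not material from his course: A* search on a small occupancy grid, using Manhattan distance as an admissible heuristic for 4-connected motion.

```python
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected occupancy grid (1 = obstacle, 0 = free).
    Returns the shortest path as a list of (row, col) cells, or None."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    frontier = [(h(start), 0, start, [start])]  # (f = g + h, g, node, path)
    visited = set()
    while frontier:
        _, cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                step = (nr, nc)
                heapq.heappush(frontier,
                               (cost + 1 + h(step), cost + 1, step, path + [step]))
    return None  # no route exists

# A wall of obstacles forces the planner to route around it.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))
```

Localization, the other topic mentioned, is typically covered with filtering methods (e.g. Kalman or particle filters) that maintain a belief over the robot's pose rather than a single path.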

You work with students who are far along in their education and understand they want to go into robotics, but it seems the whole point of National Robotics Week is to speak to a younger audience. What advice would you have for students thinking about going into robotics as a career but are still young and trying to figure out what it is that they want to do?

I really enjoy interacting with pre-college students, especially students who are still figuring out what they want to do, what they want to major in. I've had the opportunity to give a lot of lectures to high school audiences and a few middle school audiences as well. Robotics is just an exciting vehicle for students to learn about science and engineering and all the different ways they can get involved in STEM in the future since it's so multidisciplinary.

I’m thrilled that there are now things like FIRST Robotics that are so ubiquitous that you can get involved in so early in your K-12 trajectory. Maybe not everyone who does FIRST Robotics will go on to become a robotics researcher, but I think it will help them learn a lot about what they might want to study in the future because they are going to get exposure to physics, a bit of mechanical engineering, electrical engineering, computer science, and math as well. It's channeling all of it toward a really fun application, a really fun goal – trying to accomplish a really tough mission with a custom-built robot.

What are some exciting developments happening in your research area?

One of the coolest things that's happening, and that I'm paying very close attention to, is the amount of maturity we are seeing in legged locomotion. We're starting to see robotic legs that can do very sophisticated things, like Boston Dynamics' Spot and its various competitors around the world – companies building quadrupedal robots and humanoid robots as well. …

It's going to be interesting to see the ways those begin to have an impact and get into places no wheeled robot was able to go before, maybe no winged robot was able to go before.

I'm also excited to see what happens with these new generative AI tools that are coming out week by week. We're getting more powerful tools put into the public domain. And those tools can also have an impact in robotics. I'm really interested to see what robotics researchers can do with some of these tools.

And you know, there's some challenges we have to confront as well. These really large models that are the basis of ChatGPT have to be connected to the cloud to be used.

There are some really tough and really exciting robotics applications where you might not be able to be connected to the internet at all. … Those kinds of robots also need to be able to use AI and machine learning, and there's potential for impact.

I'm interested in how we can make progress in that area, where you have systems – sometimes these are called SWaP-constrained systems, for size, weight, and power – that have to be small, lightweight, and low power and not connected to the internet. They may have to be fully embedded systems.

I'm curious to see what's going to happen in the future. Maybe we take all the processing power of the best wheeled robots today and shrink it down to the size of an insect.

The last one I should mention is human-robot collaboration. As we get better and more sophisticated robot platforms, with better sensing, better computing, and better actuation capabilities, they are going to resemble the agility and perception of humans more and more.

I think we're just going to end up having better robot collaborators in the future.


About the Author

Cesareo Contreras
Cesareo Contreras was associate editor at Robotics 24/7. Prior to working at Peerless Media, he was an award-winning reporter at the Metrowest Daily News and Milford Daily News in Massachusetts. Contreras is a graduate of Framingham State University and has a keen interest in the human side of emerging technologies.
