“What Can Robots Do? Things That Smartphones Won’t Be Able to Do” – An Interview with MIT Professor Sangbae Kim

Robots have been with us for a very long time in literature, film, comics, and even factories. Yet it is still difficult to see them in our daily lives. When will they enter our daily life, and what will they be like? We asked these questions of Professor Sangbae Kim, who serves as a technical consultant at NAVER LABS and leads the MIT Biomimetic Robotics Lab.

Q. Why is it still difficult to see robots helping people in daily life?

Robots have long been confined to factories, performing simple, predefined tasks. Recently, however, many algorithms have advanced at a rapid pace thanks to artificial intelligence (AI) and deep learning. Building on these advances, many engineers are thinking more deeply about how to bring robots out of the factory so they can assist and interact with people directly.

It is going to happen in the not-so-distant future. Nevertheless, it is still very difficult to make robots genuinely helpful in daily life. Daily life settings are quite different from factories. In factories, robots are optimized for simple tasks such as picking up and moving things, and accurate position control alone was enough to solve many problems.

However, when you look at what we do in our daily life, things are not structured at all. For instance, we do the dishes differently every day. Dishes vary widely in weight and fragility. If you want to do the dishes quickly, you often knock them against one another, so you have to take extra care that they do not break. Even for such a menial task, many things are still missing from today's robotic algorithms.

In fact, the human brain, and what it does, is truly amazing. But from our point of view it is taken for granted, so we do not recognize how difficult things can be. For example, say I put my hand in my pocket and take out a 500 won coin. What exactly have my fingers done, and how? It is perplexing and difficult to explain; the brain, or somewhere in the nervous system, has simply done it on its own. We do not recognize that this hidden process and function are exactly what robots need. There is some wishful thinking that if you take a robotic arm from the factory, put a camera on it, and run some machine learning, the robot will immediately work like a human. From my point of view, something big is missing, and we should keep looking into what it is.

When developing robotic technology, there are times when some things do not receive adequate attention while others receive undeserved attention. For instance, everyone's mind was blown when they saw the Mini Cheetah backflip. Truth be told, that is only impressive from a human's point of view. Programmatically, it is tens of thousands of times more difficult to make the robot walk straight without tripping than to make it backflip. Many of these kinds of tasks will eventually matter for robots in daily life.

Q. What are some projects and/or hot topics that you’ve been recently thinking about?

What many professors are talking about these days is whether we should optimize existing technology for a certain task or give up on previous techniques and replace them with machine learning. If we mix the two, the question becomes one of architecture. Once a task gets even a little complex, a robot will not function as intended without the right architecture. When you talk about AI today, most problems are solved by machine learning, deep learning, and neural networks, and people are looking forward to advances in combining various networks. Many specialized neural networks will be created, such as one that only recognizes faces or one that only avoids people. Most discussions are about how to combine them and align them for more complex functions. In the end, it comes down to whether we go data-driven or model-driven, and how we can mix and combine the two well.
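The mix of model-driven and data-driven control described above is often structured as a nominal model-based command plus a learned correction. The sketch below is a minimal, hypothetical illustration of that architecture only — the function names, gains, and the linear stand-in for a trained network are all assumptions for exposition, not code from MIT or NAVER LABS.

```python
import numpy as np

def model_based_torque(q, q_dot, q_target, kp=40.0, kd=2.0):
    """Model-driven part: a PD law derived from a (simplified) dynamics model."""
    return kp * (q_target - q) - kd * q_dot

def learned_residual(q, q_dot, weights):
    """Data-driven part: a learned correction term. A linear map stands in
    here for a trained neural network, purely for illustration."""
    features = np.concatenate([q, q_dot])
    return weights @ features

def hybrid_controller(q, q_dot, q_target, weights):
    """Architecture: nominal model-based command plus a learned residual."""
    return model_based_torque(q, q_dot, q_target) + learned_residual(q, q_dot, weights)

# Example: a 2-joint leg. With the residual weights at zero, the controller
# reduces to pure model-based PD behavior; training would fill in the residual.
q = np.array([0.1, -0.2])        # joint angles (rad)
q_dot = np.array([0.0, 0.0])     # joint velocities (rad/s)
q_target = np.zeros(2)           # desired joint angles
weights = np.zeros((2, 4))       # untrained residual
tau = hybrid_controller(q, q_dot, q_target, weights)
```

The appeal of this decomposition is that the model-based term guarantees sensible behavior from the start, while the learned residual only has to capture what the model misses.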

Q. What role will robots play in our life?

Eventually, I think it will be labor, because smartphones already do almost everything that does not require labor. What are the things a smartphone cannot do? Although smartphones already do intelligent things such as delivering information and communicating with people, they do not have motors. In other words, a smartphone cannot physically bring a cup of water to someone who is physically challenged.

Of course, today's robots still lack physical intelligence, so there is a lot more to research. In any case, it will begin with services that do not require complex actions or interactions. That is why many people are now thinking about delivery: not to the extent of robots knocking on doors, but performing labor useful to people, like carrying something from point A to point B.

This is an attempt to connect technology to value. When a robot has an autonomous driving capability, people try to connect that capability to a delivery service in which the robot brings things to people. As such, it is important to identify how we connect a robot's capabilities to the valuable aspects of our lives.

Q. With regard to research in robot mobility, how far has it come and how will it advance?

We developed the MIT Mini Cheetah as the latest in the Cheetah series of quadruped robots we have researched, so its performance is better than its predecessors'. It is also small, which makes it very good at withstanding impacts; this robot can run experiments a hundred times a day. Furthermore, detecting an impact without a dedicated sensor requires an extremely well-designed robot, and the Mini Cheetah is great in this regard as well.

Recently, we have been developing an algorithm that maintains balance while controlling forces according to directional commands. When we applied this algorithm, the Mini Cheetah began to show outstanding movements that were hard to imagine in other robots before. It is like a superhero in the robot community: you can drop it from 2 meters above the ground, and it can take 10 steps in 1 second. The Mini Cheetah can perform movements that other robots have found difficult.

I think these technological elements in the Mini Cheetah can be applied to biped robots in the future. They will be able to walk around while using their arms in places like construction sites. One advantage of biped robots is that they are taller: while a quadruped robot would have to grow in size to pick something up from a bookshelf or a ledge, a biped robot can stay smaller and perform movements similar to a human's. Although it is still difficult and dangerous to bring biped robots into our daily life, I think it is the ultimate direction, and many of the technological elements in the Mini Cheetah will be used for it.

I think research on robots with functions that can actually help people should continue for years to come.

Related Articles >
“AI for Robotics,” A Global Workshop on the Future of AI and Robots
MIT Mini-Cheetah Workshop
Professor Kim Sangbae, a Global Robotics Engineer, joins NAVER LABS as Technical Consultant

