Satish Shukla – Hello listeners, welcome to a brand-new episode of Rawbotics. We have with us today Mr. Adityanag, Co-founder and Chief Business Officer at Mowito, a company at the forefront of vision-picking solutions. Adityanag has a strong foundation in computer science engineering and brings a wealth of experience and expertise in the field of hardware engineering. Welcome to our podcast, Adityanag.
Adityanag – Happy to be here, Satish.
Satish Shukla – So Adityanag, you are a software engineer by trade, and we usually see that it is mechanical engineers or electrical engineers or civil engineers who get into software. But in your case, a software engineer went into hardware. Could you share with our listeners how this happened?
Adityanag – Yeah, so I was working at Sensara Technologies, a Bengaluru-based startup in the TV and media space, and we were building accessories to enhance the TV viewing experience. We had built a TV guide which you could load on your mobile to figure out what was playing on each channel, and then switch the channel on your TV. But what we heard from users is that they needed a way to switch the channel right from the mobile. And that's when we started our hardware journey: how do we build an accessory for a mobile phone that helps you switch the channel without having to reach for your TV or set-top box? On that journey we built five different products over five years, worked with the likes of Xiaomi, Airtel and Panasonic, and sold over half a million units of our hardware. That's how I got initiated into the hardware world, so to speak.
Satish Shukla – Okay, that’s an interesting journey. And from there, how did you go from hardware to vision picking and Mowito?
Adityanag – So it was mid-2020 when I met Puru. (So this was the time of Covid.) Yes, it was the time of Covid, and there was an article in a leading daily about what Puru was doing. Puru had studied and worked in the US and come back to India, and he was building navigation software for AMRs. I read the article and, just out of the blue, reached out to him. We kept chatting over a few months, and I felt that I should be a part of it. That's how we got together and started Mowito. And once I was a part of Mowito, we started visiting multiple warehouses across all the different segments: grocery, fashion, jewelry. We visited over 20 different warehouses of every kind. What we realized is that picking is very repetitive in every kind of warehouse. It includes two main actions: one is walking around the warehouse, and the other is actually reaching for the item and grabbing it from the shelf or bin. But what we also realized at the same time is that picking is still nascent. Yes, there are a few companies, RightHand Robotics among them, that have started working very seriously on picking. But considering AI itself is not a very old technology, it was still early days for picking in general. And that's how we got into vision-based picking; we saw the opportunity there.
Satish Shukla – Very courageous of you to reach out to someone after reading about them in the newspaper; not everyone does that. And you definitely require a different type of courage, a bit of audacity, to start a robotics company.
Adityanag – Yeah, yeah. You know, I'm surprised at how well it all worked out.
Satish Shukla – Yeah, it's great that it all worked out. So, coming back to the warehouse: in the warehouse there are a lot of repetitive jobs, the dull, dirty and dangerous jobs, which you need to automate. And there are a lot of companies, including Addverb, doing automation. Yes. Now, there have already been picking robots, if I may say. Yeah. There has been the articulated robot, which does heavy pallet loading. Yes. Also used in welding. Then there is the picking robot, such as a SCARA robot, which does pick-and-place operations. Yes. So, if you could share with our listeners: how does vision change these picking operations, and where do you come into the picture?
Adityanag – Yes. So, the manufacturing industry has been using robotic arms for a very long time now. Yeah. Especially automobile. Yes. Exactly. So, if we look at Maruti's assembly lines, they'll have a whole bunch of robots, (Yeah) but most of it, so far, has been programmed to work on exact point-A-to-point-B kinds of actions. There was no intelligence built into it, because it was being used as a hardware tool which would just move from one point to another, do a small task, and come back to the same point as before. So, these robots were not perceiving the world around them; they were not doing anything more than that. And then if you look at how the general world around us is structured, it's not structured at all. You will have items thrown around your shelf, or your wardrobe, or a warehouse. Things are not neatly stacked all the time. There could be a heap of items; it's never neatly kept. (And then you have more than 10,000 to 12,000 SKUs in e-commerce or grocery.) Yeah, actually it turns out to be more than that. In e-commerce, it's millions. Correct. So, the world is not structured in a way that a robot can always handle with just, you know, A-to-B kinds of movement. What you need is a level of intelligence where you can perceive the object, and then a vision and manipulation stack that will help you pick the item. And in picking itself, you have to look at what kind of items I'm picking, where am I picking them from, and what am I using to pick them with. So, it could be: I'm picking from a bin, I'm picking from a shelf, I'm picking from a carton, I'm picking from a pallet. All of these are use cases. And then, am I picking fruits and vegetables? Am I picking cosmetics, or am I picking jewelry? Is the object round, is it square, is it a cuboid? And then, what's the gripper that has to go with it? Is it a suction-based gripper?
Is it a two-finger gripper? Now, if you take the sum total of all of these use cases, you cannot solve them one by one. (So, what do you mean, you cannot solve them one by one?) What I mean is that the base architecture of a solution like this has to be able to scale to these use cases over time. You cannot re-do it for each one, one after another. (So, your architecture should be such that as the data set increases, it should not grow in complexity; it should still be able to give you the throughput that you want.) Yes, exactly. And it should not require us to re-do everything from scratch for each of these use cases. So, our quest has been to build that layer of intelligence, where we know what we need to pick and what's the best way to pick it. To go more into bin picking itself: if I look at a bin, I'm getting a 2-D image, and since I'm using a depth camera, I'm getting some information on depth as well. From that image, I have to figure out where the objects are inside the bin and which is the best object to pick. There could be 10 items; I have to rank them so that I pick the one that is easiest to pick first. And then, even on that item, I have to figure out what's the best pick point. If there's a spot where, say, a cap comes in, I cannot place my suction there; it will never grip it. So, I have to figure out which is the best pick point on the item.
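The bin-picking steps Adityanag walks through (depth image in, rank the detected objects, choose a flat spot for the suction cup) could be sketched roughly as below. Everything here is an illustrative assumption, not Mowito's actual system: the masks stand in for a learned segmentation model, and the ranking and pick-point heuristics are deliberately simple.

```python
import numpy as np

def rank_objects(depth, masks):
    """Rank detected objects: shallower mean depth (closer to the camera,
    i.e. on top of the heap) first -- usually the easiest to pick."""
    mean_depths = [depth[m].mean() for m in masks]
    return list(np.argsort(mean_depths))

def pick_point(depth, mask, patch=1):
    """Choose a suction pick point: the most locally flat spot on the
    object's visible surface (lowest depth variance in a small window)."""
    best, best_var = None, np.inf
    for y, x in zip(*np.where(mask)):
        window = depth[max(0, y - patch):y + patch + 1,
                       max(0, x - patch):x + patch + 1]
        if window.var() < best_var:
            best, best_var = (y, x), window.var()
    return best

# Toy bin: two "objects" at different heights in a 10x10 depth map.
depth = np.full((10, 10), 1.0)   # bin floor is 1.0 m from the camera
depth[1:4, 1:4] = 0.6            # object A, on top of the pile
depth[5:9, 5:9] = 0.8            # object B, lower down
masks = [depth == 0.6, depth == 0.8]

order = rank_objects(depth, masks)        # object A ranked first
y, x = pick_point(depth, masks[order[0]]) # flattest spot on object A
```

In a real system the masks would come from an instance-segmentation network, and the ranking would also weigh occlusion and graspability, but the shape of the pipeline (detect, rank, choose a pick point) stays the same.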
Satish Shukla – Yeah, that’s very important. You should not damage the product.
Adityanag – Exactly. And on top of this is the application layer of intelligence. Somebody might say, in the grocery world, that I don't want to pick items which are already rotten. If you're doing fruits and vegetables, I cannot apply the same suction pressure on a tomato; it will implode. I have to build this level of intelligence so that it can handle all these items, just like a human does. Like, you know how to pick up an egg versus how you pick a tomato. How do you give that kind of intelligence to a machine?
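That application layer can be pictured as a table of per-item handling rules. The sketch below is purely illustrative: the item classes, pressures, and speeds are made-up numbers, not Mowito's parameters.

```python
# Hypothetical per-item-class grip profiles (illustrative values only).
GRIP_PROFILES = {
    "egg":    {"gripper": "suction", "pressure_kpa": 15, "max_speed_mps": 0.10},
    "tomato": {"gripper": "suction", "pressure_kpa": 25, "max_speed_mps": 0.15},
    "carton": {"gripper": "suction", "pressure_kpa": 60, "max_speed_mps": 0.50},
}

def grip_params(item_class):
    """Unknown items fall back to the gentlest profile, on the principle
    that being too careful is cheaper than imploding a tomato."""
    return GRIP_PROFILES.get(item_class, GRIP_PROFILES["egg"])
```

The design choice worth noting is the fallback: a picker facing millions of SKUs will always meet items it has never seen, so the safe default is the most delicate handling, not the fastest.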
Satish Shukla – So, the vision system acts as the intelligence. The robots, basically, will also be able to see, just like humans can see, and take that decision. With us it's very intuitive: see, and handle. So, you would also be in the process of designing grippers for different kinds of picking applications?
Adityanag – Yes, we would. Right now we are using suction, which is working very well for the warehouse use case we're going after, which is bin picking. But going forward, we do envision that we might need different kinds of grippers for different use cases, and we'll have to build them if they don't exist.
Satish Shukla – Because with each object and orientation, the gripper requirement would be very different. (Yes.) Very good. So, what were some of the learnings you had during this journey of building this solution, or what were some of the Eureka moments, if you could share with our audience?
Adityanag – Yeah, so, while all of this sounds very easy, right? The kind of items we encounter in everyday life, the range is very, very high. We don't seem to realize it. But the way you pick a visiting card, or the way you pick a needle, or the way you pick a very heavy atta bag, they're all very different. So, one of the things we've come to realize is that even if the number of SKUs in a dark store is maybe 5,000 or 6,000, it's still very hard to come to a point where one picker can handle everything. Yeah. There's a great deal of variety in the market. Even after I have all the intelligence on how to handle an item, the gripper required for a heavy payload is very different from something that handles a 200-gram payload. (Very correct.) But a human can do it very easily, right? The same picker can handle everything. So, our biggest learning has been that making robotic picking universal, covering the whole range of SKUs to the point where it really becomes ubiquitous in every warehouse, is still a hard, long journey.
Satish Shukla – Okay. And do you build this intelligence on the robot, or do you build it on the cloud? How do you handle that?
Adityanag – So, the way we have built it, it's completely independent of the robot. Okay. You can choose the arm based on the application. (So, it's hardware agnostic.) Yes, it's totally hardware agnostic. And we're using Intel RealSense, which is not even expensive vision equipment; it's easily available. So, the idea has always been to use existing hardware tech, which is already mature, and see how far you can get with that itself.
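One common way to get the hardware independence Adityanag describes is to write the intelligence layer against narrow interfaces, with a thin adapter per camera or arm vendor. The sketch below is an assumption about how such a layer could be structured, not Mowito's code; the interface names and the stand-in "perception" step are invented for illustration.

```python
from typing import Protocol, Tuple
import numpy as np

class DepthCamera(Protocol):
    def capture(self) -> Tuple[np.ndarray, np.ndarray]:
        """Return an (rgb, depth) image pair."""
        ...

class Arm(Protocol):
    def move_to(self, x: float, y: float, z: float) -> None: ...
    def release(self) -> None: ...

class PickingBrain:
    """The intelligence layer depends only on the two interfaces above,
    so any vendor's camera or arm can be plugged in via an adapter."""

    def __init__(self, camera: DepthCamera, arm: Arm):
        self.camera = camera
        self.arm = arm

    def pick_once(self) -> None:
        rgb, depth = self.camera.capture()
        # Stand-in perception: target the topmost (smallest-depth) pixel.
        y, x = np.unravel_index(int(np.argmin(depth)), depth.shape)
        self.arm.move_to(float(x), float(y), float(depth[y, x]))
        self.arm.release()
```

Swapping a robot then means writing one small adapter that satisfies `Arm`, while the vision and picking logic stays untouched.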
Satish Shukla – Okay. So, since you impart intelligence to robots through this vision system, do you think it would be easier for humans and vision-enabled robots to work side by side? Do you have such use cases?
Adityanag – Right now, we have not seen too many use cases where the human and the robot have to work side by side. But I do envision that, with cobots coming along and becoming more affordable, we can have them work side by side. (So, cobots would be collaborative robots?) Yes, collaborative robots. In cosmetics, for example, you're doing packing. (Yeah.) So, a person can make a box, and the cobot can pick items and drop them in, improving the throughput and reducing the workload on the person. Yeah. These are use cases that can be tapped into as we proceed. Because making the box is maybe a little bit harder for a robot, since it needs more dexterity. Yes. So, we leave that to the human, but have the robot drop items into the box, which then goes back onto the conveyor. Those use cases will definitely be explored.
Satish Shukla – Yeah. You did mention that a lot of products are already there for point-A-to-point-B movement, but picking is probably the frontier which will require more innovation. Yes. Because today, that is the most manpower-intensive process at warehouses. Yeah. So, with the kind of experience and expertise you have gained in developing a vision solution, in making robots intelligent, what are the kind of products you would like to develop in future, or what is the roadmap you feel you would like to follow?
Adityanag – So, I think the key is how quickly we can deploy. The holy grail here is: I am doing an action, there's a camera here, and it's learning from watching me and can tell the robot to do the same thing. Yeah. If we can get there, then things will be super-fast in terms of how quickly we can deploy these solutions. Because we do believe that the hardware cost will keep coming down. Okay. Over time, with more usage of robotic arms, the costs will come down, but having to train them for everything, which takes months, is a no. So, how quickly can we enable this intelligence, where from a real human demonstration the robot can just duplicate the action? That is where I believe our R&D will be.
Satish Shukla – Okay. That will be interesting. Yeah. So, we have a lot of listeners who come from different engineering colleges. Yeah. And this is a field where a lot of students are looking to do some work and some projects. So, since you've worked extensively on vision, if somebody wants to do a project, or wants to learn how to go about this field, what would be some of the hacks you would like to share with them?
Adityanag – So, most of the graduates we talk to these days are already doing a lot of projects. Everybody is at least part of a Formula racing team, and every college nowadays has a robotics club; you cannot find one without it. So, they're already doing projects, and that's a great start. Be a part of as many of these clubs in your college or school as you can, and get your hands dirty. I think it requires the maker-breaker mentality: it's good to make and break things early on. Then, when you actually are in a position to start something, you're already equipped with some basic skills of how to put things together, how to build your first prototype, and so on. As far as the skill set goes, people have started programming in school, but I think what we need to encourage more is the maker side. Once we have people trying to build things, even without the worry of whether it works, I guess we will progress a long way, and we'll see a lot more companies attempting these kinds of problems.
Satish Shukla – Very true. So, for all the entrepreneurs out there, if you would like to share some learnings based on your journey, like if somebody wants to start a robotic company, what would some of those learnings be?
Adityanag – Patience. First, patience. I think it's a much longer journey than a lot of other start-up areas, I feel, primarily because it's very cross-sectional. You have to bring in mechanical engineering, electrical engineering, software. You have to put all the pieces of the puzzle together, and that takes time. So, that is point number one. On product, and this has always been my quirk, you could say: if you can put your product in the back of your car and demonstrate it for a customer, it's a great thing. I don't know if our product qualifies for that, but if you're building, let's say, a floor-cleaning solution, you can just put it in the back of the car and demonstrate it. Demonstrability, or ease of demonstrability, plays a huge role in cracking those early deals and getting a nod from a customer, rather than having to bring them over to your place to demonstrate. The other thing is integration. Are you dependent on some other system being there? (Yeah.) Do you have to integrate very deeply with some software stack they're using? That kind of thing adds friction to the whole sales cycle. That's one thing I've learned. And as far as the skill set goes, you have to get the best engineers, and look for people who have built things.
Satish Shukla – So, that was very interesting. Yeah. To conclude on a lighter note: as I understood, with your vision system the robot gets, sort of, eyes, and is then able to see things around it. Right. So, can we expect that with your intelligence, these robots can help pick partners for humans someday?
Adityanag – I think that's a job best left to humans at present. I don't want to run that risk.
Satish Shukla – But today that job is also being done by dating sites. Could we have a robot do it instead?
Adityanag – Not without human validation. It can get you to a date, but not beyond that. I think we should leave it at that for now, and probably stick to solving problems in factories and warehouses. Otherwise somebody will be cursing us for the rest of their lives.
Satish Shukla – Very true. So, thank you. Thank you for your time. It was really fun interaction with you.
Adityanag – Thank you so much for having me here.
Satish Shukla – Thank you.