Maker Faire NYC? We were there, and we took to the tents to explore who on hand was making life better. We spoke with everyone from teens to the well-traveled, each tackling issues of access. One group of tech whizzes is 3D-printing InMoov robots and tweaking them to perform different tasks, so we grabbed a quick [tech-y] conversation with Boston-based InMoov "contributor, enthusiast, promoter" Kevin Waters.

NIN: What are you adding to the InMoov robot that is different to what other people are doing?

Kevin: InMoov being a combination of MyRobotLab and the InMoov parts, one of the things I really wanted to do was the Oculus Rift integration.

So, I took a JNA [Java Native Access] binding of the Oculus Rift APIs that exposed those APIs to Java, and I integrated it as a service into MyRobotLab. I'm one of the committers who actually submits code to MyRobotLab and works on the software side of it. So, basically, anybody who now has an InMoov or a stereo vision system can right-click in MyRobotLab, start an Oculus Rift service, and begin tracking their head position (tilt, roll, pitch, all that sort of stuff).
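To make the head tracking Kevin describes concrete: a Rift-style headset reports its orientation as a quaternion, which has to be converted into tilt, roll, and pitch angles before it can drive neck servos. Below is a minimal, self-contained sketch of that conversion; the function name and axis conventions are illustrative assumptions, not MyRobotLab's or the Oculus SDK's actual API.

```python
import math

def quaternion_to_euler(w, x, y, z):
    """Convert a unit quaternion (as a headset might report) into
    roll, pitch, and yaw angles in degrees. Axis conventions here
    are an assumption for illustration."""
    # Roll: rotation about the x-axis.
    roll = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    # Pitch: rotation about the y-axis (clamped to avoid domain errors).
    pitch = math.asin(max(-1.0, min(1.0, 2 * (w * y - z * x))))
    # Yaw: rotation about the z-axis.
    yaw = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return tuple(math.degrees(a) for a in (roll, pitch, yaw))

# Identity quaternion -> no rotation on any axis.
print(quaternion_to_euler(1.0, 0.0, 0.0, 0.0))  # (0.0, 0.0, 0.0)
```

Angles like these could then be mapped to servo positions for the robot's neck joints.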

It's just really fun.

Kevin Waters' InMoov bot

Sorry, how long did it take you to print the entire bot?

Kevin: I've been working on this particular robot for about 8 or 9 months now. I would say it's nearly a month's worth of continuous printing, and though I'd like to say I printed every part just once, sometimes I made mistakes or whatever, so I had to re-print a few of them.

I've got a MakerBot Replicator 2. So mine's printed out in PLA [a bioplastic], because I don't have a heated bed on mine. But the other two InMoovs that are here, they're ABS. I believe one of them was done on a [MakerBot] Replicator 2X. 

"The first open-source 3D-printed life-size robot"

What have you seen other people doing within the InMoov community?

Kevin: It is a community.

And you share code and share software?

Kevin: Share code, share ideas, share software. Some of the exciting things I've seen lately are people coming up with new gearing mechanisms. One guy came up with servo motors to replace some of the stepper motors, so it's much faster, much stronger. Some other people have changed out their worm gears for the shoulders and created a planetary gear drive.

How many pieces make up the robot?

Kevin: In total, there's probably over 200 parts in the entire InMoov project at this point.

And have you seen anyone doing stuff to offset a disability?

Kevin: There are some people who are looking at using the arms as prostheses. There may actually be one person here [at Maker Faire] who has one.

I think we know him. Yeah, we met one guy at Maker Faire Bay Area - Nicolas Huchet - who created his own bionic prosthesis together with an InMoov hand.

Kevin: Yeah, I saw pictures of that.

He had about 25 Duracells strapped to his forearm to power it.

Kevin: (laughs) The true Iron Man. The true "InMoov man," I guess, right?

When he touched it, it almost burned him, so I don't know if it's necessarily replicable.

Kevin: (laughs) Well, you know, it's functional if not practical, right?

Have you seen any other devices where someone's using InMoov as an assistive device?

Kevin: To assist movement? Mostly, it's just been prostheses for the hand; that's the most obvious use. For me, my hope, once the chassis is finished and all the software is done, is to have a sort of telepresent, telemanipulative robot, so if I were disabled, for example, I could potentially drive this thing to the supermarket, get a cup of tea or whatever, and bring it back home. Or, in my case, go to the liquor store and get a six-pack and bring it back for me. So that's actually my goal: to drive this to the liquor store (laughs).

You're using Microsoft's gaming Kinect camera?

Kevin: The Kinect definitely gives us the depth perception in front of the InMoov. What I have really been focusing on is using it for skeleton tracking. I think that once you put the Oculus Rift on, you can see out of the robot's eyes and you've got the head tracking from the Oculus Rift but the other part of the equation is actually articulating the rest of the torso. So, by having the Kinect there, you can actually track and mimic your arm movements from the Kinect and relay that to the robot. I see that as a new interface, perhaps, between man and machine. You know, that human-robot interface is going to have to mature over the next couple of years.
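The skeleton tracking Kevin mentions boils down to geometry: a Kinect reports 3-D positions for joints like the shoulder, elbow, and wrist, and the angle between those points can be relayed to the corresponding servo. A small sketch of that calculation, assuming hypothetical joint coordinates rather than any particular Kinect SDK:

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b formed by 3-D points a-b-c,
    e.g. shoulder-elbow-wrist from a skeleton frame."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(p * q for p, q in zip(v1, v2))
    n1 = math.sqrt(sum(p * p for p in v1))
    n2 = math.sqrt(sum(q * q for q in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

# A fully extended arm: shoulder, elbow, and wrist in a straight line.
print(joint_angle((0, 0, 0), (1, 0, 0), (2, 0, 0)))  # 180.0
```

Running this per frame for each tracked limb is one plausible way to mirror a person's arm movements on the robot.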

If you could make something possible that's currently not possible, what would it be?

Kevin: I'd make the robot automatically calibrate its positions and movements and stuff. That's got to be the next thing we're going to start looking at. The bigger long-term goal is actually to make it a full-blown biped (two-legged, walking) robot. When you start getting into that, you start getting into kinematics and force-feedback mechanisms. That's going to be the real tough task.
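The kinematics problem Kevin is pointing at can be illustrated with the simplest possible case: a planar two-link limb (think thigh plus shin), where a closed-form inverse-kinematics solution follows from the law of cosines. This is a toy sketch under that assumption, not anything from the InMoov project itself.

```python
import math

def two_link_ik(x, y, l1, l2):
    """Inverse kinematics for a planar 2-link limb: given a foot
    target (x, y) relative to the hip and link lengths l1, l2,
    return hip and knee angles in radians."""
    d2 = x * x + y * y
    # Law of cosines gives the knee bend directly.
    cos_knee = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= cos_knee <= 1.0:
        raise ValueError("target out of reach")
    knee = math.acos(cos_knee)
    # Hip angle: direction to the target, corrected for the knee bend.
    hip = math.atan2(y, x) - math.atan2(l2 * math.sin(knee),
                                        l1 + l2 * math.cos(knee))
    return hip, knee

# A fully extended leg reaching straight out along the x-axis.
print(two_link_ik(2.0, 0.0, 1.0, 1.0))  # (0.0, 0.0)
```

A real biped adds many joints, balance, and the force feedback Kevin mentions, but each leg still reduces to solving chains like this one.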

 

All photos by Elliot V. Kotek for NotImpossibleNow