Tuesday, April 7, 2009

Ant-like hexapod

Very cool control of a Lynxmotion Phoenix Hexapod base. It gives me half a mind to buy new servos for mine. That is, if I had the tons of money required to do that.

Particularly impressive is the demonstration of having control over a chosen center of rotation and holding that there. Very fluid movement, as well.

Client/Robot Control

Particularly when dealing with smaller, lightweight robots, the issue of processing power in such a small package becomes problematic - especially with budget in mind. Recently, netbooks (such as my beloved EEE PC, the original one with the 7-inch screen) became a relatively cheap solution. Capable of being mounted on robots at least the size of a remote control car, they can provide a full OS of built-in control and up to 2 GHz of processing power. Never mind that the "reduced" and "cheap" cost I'm talking about is still in the $200-$400 range.

Still, small development boards with tremendous processing power are becoming more popular, particularly with the dropping price and rising power of the ARM architecture. My own experience with the TS-7800 during my senior project provided a low-cost and surprisingly capable board that controlled my robot. Recent discoveries, such as Tin Can Tools' sexy, sexy 40-pin DIP Linux board, provide still more packaged processing power at an even lower $150 a pop. Gumstix provides even more solutions starting as low as around $100.

But there is still a size limit, and the omnipresent requirement of a limited budget as well. It becomes necessary, then, to offload the processing from the robot to a networked "client". This client, typically another computer, has more processing power and can control the robot from afar, much like a human controls a remote control car.
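To make the idea concrete, here is a minimal sketch of the client side of such a setup. The wire format (newline-terminated ASCII commands) and the class name are hypothetical - any real robot bridge defines its own protocol - but the shape is the same: heavy computation lives on the client, and only short commands cross the network.

```python
import socket

class RobotClient:
    """A networked 'client' that sends commands to a remote robot."""

    def __init__(self, host, port):
        # The robot end could be a serial-to-wifi bridge, a netbook,
        # or any device listening on a TCP port.
        self.sock = socket.create_connection((host, port))

    def send_command(self, command):
        # Vision, planning, MATLAB, etc. all run here on the client;
        # the robot only ever receives short motor commands like this.
        self.sock.sendall((command + "\n").encode("ascii"))

    def close(self):
        self.sock.close()
```

The same structure works over Bluetooth or a serial port by swapping out the transport in the constructor.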

My current and final robotics engineering class (Unified Robotics IV) at WPI is an excellent example of this. In order to produce affordable platforms for student use without sacrificing significant capabilities, we are using a serial-to-wifi bridge that goes from an Atmel Atmega644p microcontroller to our computers via a LAN. (It is worth noting, however, that I took a different route and merely used my Bluetooth DIP module to connect to my robot, mostly due to my own impatience for the finished platform.)

With this, we students can bring the unleashed power of our workstation computers (all 2.4 GHz PCs) to bear on our robots through a Client Framework. This includes the integration of programs such as MATLAB and video processing tools for use in the lab.

Why do I bring this up?

The Framework has been the problem of late. The original provided Client Framework - made in C++ - was disorganized and unfinished. Several students, myself included, picked our favorite languages and began designing our own; I chose to create an object-oriented Framework in Python. Though I initially wrote it out of haste and necessity, I have had a blast designing the framework and taking in all the implications of an object-oriented, modular approach to designing robots. The possibilities for future projects seem almost endless, and I plan on exploring this side project after I graduate from here.

The Pitch

My goal is to create a simple object-oriented Python robot framework for which it is easy to develop a matching robot-side framework. Case in point - my current C robot framework developed for the Atmega644p. A very probable future case in point - my iRobot Create and its Python interface.

"Aren't there already projects like that?" you ask. Yes, I am well aware of Pyro and Pyrobot, and will discuss those in some other post. I believe I am still bringing something new to the table.

The Framework will have three primary goals.

1. Create an object oriented approach to viewing a Robot. This part of the framework needs to accomplish the following:
  • Provide a simple, one-command interface for basic functions of the Robot: turning, going straight, etc.
  • Allow the robot to determine where it is from its sensors.
  • Develop and expand a Map of the robot's surroundings.
  • Handle simple reflexes, such as a bumper response.
2. Allow for control of one or multiple robots at once via an easily used Python script. Each robot maintains its own connection, allowing this to be done seamlessly.

3. Allow creation of an AI object - each with their own set of goals, behaviors, and reactions to situations. The AI object is given a robot or set of robots to control.

With the Framework taking a standardized approach, AI research (or simple games, since whatever work I do will be trivial compared to current AI research) can be easily explored with little to no overhead holding back the initial exploration.
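The three goals above can be sketched in a few lines of Python. Every class and method name here is hypothetical - an illustration of the shape I have in mind, not the actual course framework:

```python
class Robot:
    """Goal 1: an object-oriented view of one physical robot."""

    def __init__(self, connection):
        # Goal 2: each Robot owns its own connection to the hardware,
        # so several robots can be driven from one script.
        self.connection = connection
        self.position = (0.0, 0.0)  # would be updated from sensor data

    def drive_straight(self, distance):
        # Simple one-command interface for a basic function.
        self.connection.send("DRIVE %f" % distance)

    def turn(self, angle):
        self.connection.send("TURN %f" % angle)

    def on_bump(self):
        # Simple reflex response: back off when the bumper triggers.
        self.drive_straight(-0.1)


class AI:
    """Goal 3: an AI object given one or more robots to control."""

    def __init__(self, robots):
        self.robots = robots

    def step(self):
        # Trivial placeholder behavior: send every robot forward.
        for robot in self.robots:
            robot.drive_straight(1.0)
```

The point of the standardization is that the AI object only ever sees the Robot interface, so swapping the Atmega644p platform for an iRobot Create should be a matter of writing a new connection class.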

For now?

I will be focusing on my coursework, of course. The framework will be put together primarily to accomplish the goals of the laboratory exercises in Unified Robotics IV. The work I do, however, can be expanded upon and used as a launching point. There are already numerous pieces of code to control the iRobot Create - some I wrote for the TEPRA Autonomous Robot Challenge (in which I did well), and more from projects such as pycreate. I can easily adapt these to the Robot Framework and use the Create as an easy, low-maintenance platform for learning navigation and AI.

As always, more to come...

Saturday, April 4, 2009

Final Project for Unified Robotics III

So yes, this is a repost from my older blog, but I wanted to preserve this in the new blog. On top of that, I think it's still cool. A little bit of background this assumes you know - I am (for now) a student at Worcester Polytechnic Institute (WPI) going for Robotics Engineering (RBE). There are four primary robotics courses for RBE - the Unified Robotics courses, I through IV. The following is a post about the final project for Unified Robotics III:

~~~

Not only a demonstration of my lack of video editing skills (the false start, sorry about that), but also the demonstration of the final one week project for Unified Robotics (RBE3001) at WPI. The robot arm must detect and localize the block while it is moving on the conveyor belt, determine the inverse kinematics to move the arm into the right position, and then pluck up the block. Once it lifts the block, it determines which of two possible weights the block is, and then drops the block into the correct bin.
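The inverse-kinematics step has a tidy closed-form answer for the common two-link planar arm case. The sketch below is illustrative only - the two-link assumption and link lengths are mine, not the actual RBE3001 arm:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Closed-form inverse kinematics for a 2-link planar arm.

    Given a target (x, y) and link lengths l1, l2, return joint
    angles (theta1, theta2) placing the end effector at the target.
    Returns the elbow-up branch; a real arm may need the other.
    """
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle directly.
    cos_t2 = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if abs(cos_t2) > 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(cos_t2)
    # Shoulder angle: direction to the target, minus the offset
    # introduced by the bent elbow.
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2
```

On the conveyor task the target point comes from the block localization, so this computation has to be redone as the block moves until the arm commits to a grab.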

Development was done on an AVR Atmega644p with custom control board for the robot arm. The arm itself is also custom to the class.

Lifting is done via an electromagnet.



And a classmate's project:



What is this here for?

So recently I began work on a school project that, I realize, can very easily be branched out into something bigger. On top of that, I am beginning to realize that when I move to a 9-5 job (wherever I end up) I shall probably have far more time than I do now as an engineering student. As such, I am planning on exploring the parts of robotics to which I haven't been able to dedicate as much time as I would have liked.

First off - who the heck am I? I'm an engineer (or will be when I graduate this May) possessing an undergraduate degree in Robotics Engineering. I'm also a huge geek who loves doing technical things for fun. I think that about covers that...

More to come later.