STEM Workshops

Thanks for attending my STEM presentation!

We talked about the many exciting and fun opportunities there are in science and engineering careers, and we saw how much can be done with a little understanding of how to connect electronic parts together and with programming no more complex than what you saw in Hour of Code.

While technology devices keep shrinking, jobs in computing, engineering, and science are growing at an astounding rate.
It’s truly an exciting time and careers like those we’ve talked about will let you participate in and shape the next generation of the technology people use every day.

Below are some additional resources that you may be interested in.

After-School Workshop Interests Form

If you’d like to attend the after-school workshop, I’d like to know what you’d find most interesting. Please click on the form link below to give me your thoughts.

After-School Workshop Form

More About the uArm Robot

The robotic tic-tac-toe player is an ongoing project and you can follow all the progress here.

This short video gives a good overview of the vision system.

You can play tic-tac-toe against the actual game algorithm programmed into the robot by clicking here, or on the image below.

Challenge the Robot now on the Web

If you are a school administrator or a parent and are interested in more information, please email me.

Robot Tic-Tac-Toe

Robot-in-Training

uArm Metal Autonomous Tic-Tac-Toe Robot

Welcome to the project page for the Autonomous Tic-Tac-Toe Robot. I’ll update this page periodically with any new progress if you want to follow along.

Goals

The overarching goal for the robot is to play fully autonomously, like a human. That goal breaks down into five major components I decided were needed to fulfill it.

  • Understand the Game
    The robot must understand how the game is played and have some sense of a strategy for winning.
  • Sense the Board
    The robot must be able to interpret the moves that the player makes and understand the positions of the pieces on the board.
  • Move the Pieces
    The robot must make its own moves at the appropriate times, using whichever marker has been assigned.
  • Play Autonomously
    All the programming must reside within the robot itself and it should play without any connected computer or external signal.
  • Convey Emotion
    This is a “stretch goal” to see if the robot can convey emotion to the player, based on the status of the game.

About the Robot Arm

The robot arm is a uArm Metal, made by UFactory. It was created initially as part of a Kickstarter campaign and is now in production.

The uArm Metal

The uArm is a 4-axis robotic arm with three degrees of freedom as well as a hand-rotation axis. It’s based on a design for an industrial pallet-loader and is thus best suited for positioning and stacking items in a 180-degree operating area.

The uArm is controlled by an on-board Arduino micro-controller and is fully open source.

Understanding the Game of Tic-Tac-Toe

There are lots of resources on the web that will tell you how to play tic-tac-toe, and there are many ways to teach a computer to implement game strategy. For reasons related to the memory available on the micro-controller, I wrote an algorithm based on logical human strategies such as “move to an open corner such that a block does not result in two in a row for the opponent.” The computer must understand both how to translate that strategy into an appropriate move and when to apply that particular strategy.
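
The robot’s exact rule list isn’t reproduced here, but to give a concrete sense of the approach, here is a minimal sketch of a rule-based move chooser written in Arduino-style C++. The board encoding, function names, and rule ordering are my own illustration rather than the code running on the robot.

    // Minimal rule-based tic-tac-toe chooser (an illustrative sketch, not the robot's code).
    // board[i] is 0 = empty, 1 = robot, 2 = human; cells are numbered 0..8, row by row.
    const int LINES[8][3] = {
      {0, 1, 2}, {3, 4, 5}, {6, 7, 8},   // rows
      {0, 3, 6}, {1, 4, 7}, {2, 5, 8},   // columns
      {0, 4, 8}, {2, 4, 6}               // diagonals
    };

    // If 'who' already has two in a line and the third cell is open, return that cell.
    int findTwoInARow(const int board[9], int who) {
      for (int i = 0; i < 8; i++) {
        int count = 0, open = -1;
        for (int j = 0; j < 3; j++) {
          int c = LINES[i][j];
          if (board[c] == who) count++;
          else if (board[c] == 0) open = c;
        }
        if (count == 2 && open != -1) return open;
      }
      return -1;
    }

    // Human-style priority rules: win, block, take the center, a corner, then a side.
    int chooseMove(const int board[9]) {
      int m = findTwoInARow(board, 1);                    // 1. complete our own line
      if (m != -1) return m;
      m = findTwoInARow(board, 2);                        // 2. block the opponent's line
      if (m != -1) return m;
      if (board[4] == 0) return 4;                        // 3. take the center
      const int corners[4] = {0, 2, 6, 8};
      for (int i = 0; i < 4; i++)
        if (board[corners[i]] == 0) return corners[i];    // 4. take an open corner
      const int sides[4] = {1, 3, 5, 7};
      for (int i = 0; i < 4; i++)
        if (board[sides[i]] == 0) return sides[i];        // 5. take an open side
      return -1;                                          // board is full
    }

Rules like the quoted corner rule need one extra step of look-ahead before committing to a square, but they slot into the same kind of priority list.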

The initial version of the strategy algorithm worked so well that the robot was unbeatable and therefore, noted my daughter Cassie, no fun at all.

Challenge the Robot now on the Web

A final version of the robot’s game logic incorporates three difficulty levels, based on characters I hope to explore further in the emotion section.
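
The three levels themselves aren’t detailed here, but one simple way to turn a strong rule-based player into a friendlier opponent, offered purely as an illustration, is to have the easier characters occasionally ignore the strategy and play a random open square. The chooseMoveWithDifficulty() helper and the mistake percentages below are hypothetical, not the robot’s actual tuning.

    // Hypothetical sketch: lower difficulty means a higher chance of a random move.
    // chooseMove() is the rule-based chooser above; difficulty runs 1 (easy) to 3 (unbeatable).
    int chooseMoveWithDifficulty(const int board[9], int difficulty) {
      int mistakeChance = (3 - difficulty) * 30;        // easy: 60%, medium: 30%, hard: 0%
      if (random(100) < mistakeChance) {
        int open[9], n = 0;
        for (int i = 0; i < 9; i++)
          if (board[i] == 0) open[n++] = i;             // collect the open squares
        if (n > 0) return open[random(n)];              // play one of them at random
      }
      return chooseMove(board);                         // otherwise play the strategy move
    }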

You can play tic-tac-toe against the actual game algorithm programmed into the robot by clicking here, or on the image of the game board.

Sensing the Board

Computer vision is computationally expensive and beyond the reach of most small micro-controllers. The robot solves this problem with the Pixy (CMUcam5) from Charmed Labs, which employs a dedicated on-board processor for simple object recognition.

This short video gives a good overview of the vision system.

The Pixy gives me, the programmer, access to a much simpler stream of information, in which blobs of different colors are represented as on-screen rectangles. This helps a lot.
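
To give a concrete idea of what that simpler stream looks like, here is a minimal Arduino sketch using the standard Pixy library. The color signatures and the board-grid math are placeholder assumptions of mine, not the values the robot actually uses.

    #include <SPI.h>
    #include <Pixy.h>   // Charmed Labs' Arduino library for the original Pixy (CMUcam5)

    Pixy pixy;

    void setup() {
      Serial.begin(9600);
      pixy.init();
    }

    void loop() {
      // Each detected blob arrives as a rectangle: signature (trained color), x, y, width, height.
      uint16_t count = pixy.getBlocks();
      for (uint16_t i = 0; i < count; i++) {
        // Assumed setup: the camera sees the whole board, so dividing the Pixy's
        // 320x200 block-coordinate frame into thirds maps a blob to a row and column.
        int col = pixy.blocks[i].x * 3 / 320;
        int row = pixy.blocks[i].y * 3 / 200;
        Serial.print("signature ");
        Serial.print(pixy.blocks[i].signature);
        Serial.print(" at row ");
        Serial.print(row);
        Serial.print(", col ");
        Serial.println(col);
      }
      delay(100);
    }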

Moving the Pieces

Moving the pieces was pretty straightforward using the built-in programming of the uArm, but I thought it could be better. I made a number of improvements to the way the arm worked and contributed those changes back to the open-source project where they’re now part of the arm’s programming. Pretty cool!

The video below also shows an example of common engineering math. You’ll find that it’s not really so hard!
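
I won’t spoil the video, but to give a flavor of that math: the uArm thinks in terms of a rotation angle and a stretch distance rather than x/y positions, so placing a marker on a board square is largely a Cartesian-to-polar conversion. Here’s a rough sketch. The board dimensions are made up, and the setPosition() call reflects my reading of the UF_uArm library, so treat both as assumptions.

    #include <math.h>
    #include <UF_uArm.h>   // UFactory's library for the uArm Metal

    UF_uArm uarm;   // assumes uarm.init() has been called in setup()

    // Assumed board layout: square centers on a 40 mm grid, with the board's center
    // sitting 180 mm straight out from the arm's base. Real numbers would be measured.
    const float BOARD_CENTER_Y = 180.0;   // mm out from the base
    const float CELL_SPACING   = 40.0;    // mm between square centers

    // Convert a board square (row, col in 0..2) into the arm's polar coordinates.
    void moveToSquare(int row, int col) {
      // Cartesian position of the square's center, with the arm base at the origin.
      float x = (col - 1) * CELL_SPACING;                   // left/right of center
      float y = BOARD_CENTER_Y + (1 - row) * CELL_SPACING;  // distance out from the base

      // Cartesian-to-polar: how far to reach, and how far to rotate.
      float stretch  = sqrt(x * x + y * y);                 // mm
      float rotation = atan2(x, y) * 180.0 / 3.14159265;    // degrees off straight ahead

      // Assumed signature: setPosition(stretch, height, armRotation, handRotation).
      uarm.setPosition(stretch, 10.0, rotation, 0);
    }

That’s it: a square root and an arctangent, the same trigonometry you see in a high-school math class.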

Playing Autonomously

Combining the vision system with the robot’s movement system is the next challenge!

The robot’s micro-controller is part of a custom circuit board developed specifically for the uArm and is therefore optimized for controlling the robot. Without access to the input/output pins normally available on an Arduino micro-controller, the options for interfacing the vision system with the robot’s hardware are quite limited.

Thus, I haven’t yet been able to run both vision and movement from the single robot micro-controller. I have some ideas, though!

Conveying Emotion

If you have seen my STEM talk on Computer Engineering or the movement video above, you know the robot is capable of expressing at least a few different emotions. I hope to build out this capability once the other issues are solved.

Final Update

Thanks to a comment on one of the YouTube videos, I realized today that it had been more than a year since I promised to update this page with progress. So here’s what happened.

Even if the two halves work flawlessly on their own, there will be unforeseen issues when they try to work together. And when the only communication channel your on-board computer has is being used for the camera, it is impossible to debug a robot.

My approach was to rig a software serial port off of two unused uArm board pins strung to another ‘debug’ Arduino that would simply proxy data from the uArm to the computer via USB. Once robot debugging was complete, the external Arduino could be disconnected and the robot would run autonomously.
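
For anyone curious what that rig looks like in code: the standard SoftwareSerial library turns two spare digital pins into an extra serial port, and the second Arduino simply forwards whatever it hears on to the PC. Here is a stripped-down version; the pin numbers and baud rate are placeholders, not the wiring I actually used.

    // On the uArm: send debug messages out a software serial port on two spare pins.
    #include <SoftwareSerial.h>

    SoftwareSerial debugPort(10, 11);   // RX, TX -- placeholder pins

    void setup() {
      debugPort.begin(9600);
    }

    void loop() {
      debugPort.println("board state, vision results, or whatever needs inspecting");
      delay(1000);
    }

    // On the 'debug' Arduino: copy everything from the software port out over USB.
    #include <SoftwareSerial.h>

    SoftwareSerial fromRobot(10, 11);   // RX, TX -- placeholder pins

    void setup() {
      Serial.begin(9600);     // USB link to the computer
      fromRobot.begin(9600);  // link to the uArm's spare pins
    }

    void loop() {
      if (fromRobot.available())
        Serial.write(fromRobot.read());   // proxy byte-for-byte to the PC
    }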

In the end, grounding problems between the Arduinos and glitchy Pixy performance due to its not getting enough juice off the uArm board were enough to ground the project.

I’m more a software guy. When it comes to low-level things like wire protocols and power systems, I want someone else to handle them. I had made each half work, and that was all I needed! 🙂