Category Archives: arduino

Pythagoras Clock

Illustration of the Pythagorean triple 3, 4, 5, and units of time

The traditional clock shows the hour divided into 12 periods of 5 minutes, 6 periods of 10 minutes, 4 periods of 15 minutes (a quarter of an hour), 3 periods of 20 minutes, and 2 periods of 30 minutes (half an hour). This neat divisibility is a consequence of the factorizations of the number of hours on the clock face and minutes in an hour: one clock day = 12 hours = 3 × 4 hours – resulting in integer factors of 2, 3, 4, and 6 hours. One hour = 60 minutes = 3 × 4 × 5 minutes – resulting in integer factors of 2, 3, 4, 5, 6, 10, 15, 20, 30 minutes. If there were 61 instead of 60 minutes in an hour, then neither half an hour nor a quarter of an hour – nor any of the others – would correspond to a whole number of minutes.
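The divisibility argument is easy to check mechanically. Here is a small C++ sketch (not part of the clock's firmware) that counts the whole-number divisors of 60 and 61:

```cpp
#include <cassert>
#include <vector>

// Return every whole-number divisor of n. Each divisor corresponds to
// one way of splitting an hour (or a 12-hour clock day) into equal
// integer-length periods.
std::vector<int> divisors(int n) {
    std::vector<int> d;
    for (int i = 1; i <= n; ++i)
        if (n % i == 0) d.push_back(i);
    return d;
}
```

divisors(60) gives the twelve values 1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, 60, while divisors(61) gives only 1 and 61, since 61 is prime – so a 61-minute hour would have no even subdivisions at all.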

There is something else special about these numbers: 3, 4 and 5 make up the smallest Pythagorean triple: a right-angled triangle can be drawn with sides 3, 4, and 5 units long.

A few years ago, I designed the Pythagoras Clock, based on this coincidence.

The clock is powered by an Atmel ATmega168. The clock display is made of laser cut acrylic, housing 19 colored LEDs laid out in two intersecting Pythagorean triangles.

Pythagoras Clock

The two lines in the upper right corner – excluding the diagonal – represent fractions of a day, and the three lines in the bottom left corner plus the diagonal, represent fractions of an hour.

Pythagoras clock with explanatory overlay

When the clock boots, it starts showing 9 o’clock. The following video shows the clock working at increasingly accelerated speed, starting from 9 o’clock:

The top line, of white LEDs, divides each day up into four quarters. If the leftmost of those quarters is lit, then we are in the first quarter of the day, i.e. between twelve o’clock and three o’clock; if the rightmost quarter is lit, then we are in the fourth quarter of the day, i.e. between nine o’clock and twelve o’clock. The orange line on the right edge divides each of those quarters of a day into three thirds. If the topmost of those LEDs is lit, then we are in the first third of the quarter of the day represented by the top line, and if the bottommost is lit, then we are in the last third of the quarter of a day.

In the diagram, the second white LED is lit, meaning the time is between three and six, and the second orange LED is lit, meaning the time is in the second third of that range, i.e. it is four. More mathematically, the hour is 1 × 12/4 + 1 × (12/4)/3 = 4.

Similarly, the bottom line counts time in units of a quarter of an hour, the leftmost line counts time in units of a third of a quarter of an hour (five minutes), and the diagonal counts time in fifths of thirds of quarters of an hour, i.e. minutes.

So in the diagram the hour is 4, as we have already seen, and the minutes are zero quarters (bottom line), two thirds of a quarter (left line), and four fifths of a third of a quarter (diagonal), i.e. 0 × 60/4 + 2 × (60/4)/3 + 4 × ((60/4)/3)/5 = 14.
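The decoding rule can be written out as plain arithmetic. This is a hypothetical reimplementation for illustration, not the clock's actual firmware – the indices are 0-based counts of elapsed periods, as in the worked example above:

```cpp
#include <cassert>

// Decode the hour from the day triangle: dayQuarter indexes the top
// (white) line, quarterThird indexes the right (orange) line.
int hourFromLeds(int dayQuarter, int quarterThird) {
    return dayQuarter * (12 / 4) + quarterThird * ((12 / 4) / 3);
}

// Decode the minutes from the hour triangle: bottom line (quarters of
// an hour), left line (thirds of a quarter), diagonal (fifths of a
// third, i.e. single minutes).
int minuteFromLeds(int hourQuarter, int quarterThird, int thirdFifth) {
    return hourQuarter * (60 / 4)
         + quarterThird * ((60 / 4) / 3)
         + thirdFifth * (((60 / 4) / 3) / 5);
}
```

For the diagram's reading, hourFromLeds(1, 1) gives 4 and minuteFromLeds(0, 2, 4) gives 14.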

I am releasing the source code and designs under open source licenses – the source code is available under an MIT license, and the designs are licensed under a Creative Commons Attribution-ShareAlike 4.0 International license. You can download them from GitHub.

The circuit assigns one pin for each LED, and one for the button which is used to set the clock or place it in demo mode.

Pythagoras clock circuit

The clock spent a long time on a protoboard, but I finally did get some boards made. If you’d like one, let me know.

An assembled circuit for the Pythagoras Clock.

The body of the clock is a sandwich of eight pieces of acrylic, which I cut with a laser cutter at TechShop. The CorelDRAW file is on GitHub.

The acrylic cuts.

The LEDs each sit in a little pocket, with a rectangular slit on top, set toward the edge of the pocket.

During assembly of the acrylic.

This gives a uniform rectangle of light.

Light rectangle

Lady Ada Lovelace Day

According to its organizers, “Ada Lovelace Day is an international day of blogging to celebrate the achievements of women in technology and science”.

So, I would like to celebrate the open source hardware achievements of Limor Fried, whose store sold me my first Arduino, the first serious electronics kit I soldered together, a USBtinyISP AVR Programmer Kit, and whose instructions and forums have been an invaluable resource in my Arduino and AVR-based projects.

NPR for free on a well-tempered Arduino

I just built a little audio amp using an LM386 chip and the Little Gem amplifier circuit. I plugged it into my Arduino and programmed up a little well-tempered scale. I found to my surprise that if the Arduino was off and I touched the amplifier input, then I got very faint speech out of my little speaker, which I am pretty sure is NPR!
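I didn't post the scale code, but the arithmetic behind an equal-tempered scale (loosely, "well-tempered") is compact: each semitone multiplies the frequency by the twelfth root of two. A minimal sketch of that math, assuming A4 = 440 Hz – not the original Arduino sketch:

```cpp
#include <cassert>
#include <cmath>

// Equal-tempered pitch: frequency of the note n semitones above
// (or below, for negative n) concert A at 440 Hz.
double noteFrequency(int semitonesFromA4) {
    return 440.0 * std::pow(2.0, semitonesFromA4 / 12.0);
}
```

On an Arduino the frequency would then be turned into a square wave by toggling a pin every half-period, or by calling tone() with the rounded frequency.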

Hard to get machine-readable weather data

Today, I interfaced an Arduino to a stepper motor – the hardest bit in the end was figuring out which pins of the unipolar stepper motor do what. The motor, available as Jameco 171601, has six wires – yellow, red, orange, black, green, brown – which come out in a connector. The most useful reference I found was Tom Igoe’s Stepper Motor Control page. In the color sequence above, the wires are those numbered 1, 5, 2, 3, 6, 4 in Igoe’s diagram.

As a demonstration I wanted to turn it into a little weather toy, but it has been very difficult finding machine-readable real-time weather data on the web. The NOAA site is chaos. Eventually, I settled on this:

NOAA CA weather data.

It was easy to grep and sed the data I wanted out of its text and tables. God forbid that whoever produces this should take a more literary turn – then I would need NLP! 🙂

Experimental 3d Sonar Map

Using the EZ-4 sonar on my robot, I attempted to get a depth map of my view of the opposite wall of our office at home. The Arduino was taking sonar readings at a rate of 20 Hz and sending them down the serial port to my Mac. I pointed at the wall and then scanned across at an even speed clockwise – taking about 6 seconds to cover 30 degrees of arc – then put my hand in front of the sonar to get some easily recognizable low readings to mark the end of the row, tilted the Arduino up 5 degrees, and scanned right to left at about the same rate. I repeated this backwards-and-forwards scanning, increasing the angle from the floor at the end of each row, until I had 9 rows.

I normalized the data by assuming that each sweep (left-right or right-left) covered an equal angle, but possibly at a different speed – resulting in a different number of readings per row. The average number of readings per row was about 120 (6 seconds at 20 Hz).
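That normalization amounts to spreading each row's readings evenly over the sweep angle. Here is a sketch of the idea – my after-the-fact reconstruction, not the original script – assuming a 30-degree sweep centered on the viewing direction:

```cpp
#include <cassert>
#include <cmath>

// Angle (in degrees from the centre of view) assigned to reading i of
// a row of n readings, assuming the sweep covered `span` degrees at a
// constant speed. Right-to-left rows are mirrored so that all rows
// share one frame of reference.
double readingAngle(int i, int n, double span, bool rightToLeft) {
    double a = span * i / (n - 1) - span / 2.0; // centre the sweep on 0
    return rightToLeft ? -a : a;
}
```

With a 121-reading row and a 30-degree span, the first reading lands at -15 degrees and the last at +15, regardless of how fast that particular sweep was.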

Here is the raw data.

The readings were now – with a little work – in 3D polar form, with variables phi, the angle the sonar beam made to the center of my point of view when projected onto a plane parallel to the floor; theta, the angle the beam was pointing above the floor (0 ≈ parallel to the floor); and r, the sonar range in inches. With a fair amount of pain, I converted the readings into XYZ coordinates with (0,0,0) where I was sitting, and plotted them in Grapher.
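For reference, here is one plausible version of that conversion. The axis conventions are my assumption, since the original code isn't shown: y points into the scene, z points up, and the viewer sits at the origin.

```cpp
#include <cassert>
#include <cmath>

struct XYZ { double x, y, z; };

// phi: horizontal angle from the centre of view (degrees),
// theta: elevation above the floor (degrees), r: range in inches.
XYZ sonarToXyz(double phiDeg, double thetaDeg, double r) {
    const double d2r = M_PI / 180.0;  // degrees -> radians
    double phi = phiDeg * d2r, theta = thetaDeg * d2r;
    return { r * std::cos(theta) * std::sin(phi),
             r * std::cos(theta) * std::cos(phi),
             r * std::sin(theta) };
}
```

A reading straight ahead and level with the floor, sonarToXyz(0, 0, 100), lands at (0, 100, 0): one hundred inches into the scene.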

The data is… hard to interpret. I just bought a copy of “Probabilistic Robotics” by Sebastian Thrun, Wolfram Burgard and Dieter Fox, and although I have hardly had a chance to look at it, I did notice that the sonar maps they present are very noisy. It looks like my sonar map is noisy too. The data points resolve into a few coherent blocks. Closest to the viewpoint in the picture below is a block of points where a tall filing cabinet is. A little deeper into the picture, and not extending so high (up the screen in the picture), there is a sofa. The other points are very noisy but are bounded behind by a wall. I expect that the wall is quite a good reflector of sonar, and so results in very poor readings.
Robot maps floor

Robot-derived floorplan

At the weekend I calibrated the robot. In some driving tests I found that it drives 7.5″ to the right for every 30″ it drives forwards – corrected by adjusting the left wheel speed to be about 80% that of the right wheel – that it drives forwards at a rate of 6″ per second, and that it turns on the spot at a rate of 96 degrees per second.
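Those calibration figures turn straight into timing constants for dead-reckoned moves. A sketch of the arithmetic – the function names are mine, and the motor control itself is omitted:

```cpp
#include <cassert>

// How long to run the motors, in milliseconds, for a given move,
// using the calibrated rates: 6 inches/second driving forwards and
// 96 degrees/second turning on the spot.
long driveMillis(double inches)  { return (long)(inches / 6.0 * 1000.0); }
long turnMillis(double degrees)  { return (long)(degrees / 96.0 * 1000.0); }
```

So a 30″ drive takes 5 seconds, and a quarter turn takes a little under a second.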

I made some changes to the software to get my robot to use the sonar range data to produce a map of the terrain it traverses. Initially the robot was saving the map to onboard RAM, but I found that plugging the Arduino in to the Mac to read the map out over the serial port would reboot the Arduino and erase the data. The next version saved the map to on-board EEPROM, which properly survived the reboot and could be read out. At first the results were very obscure and disappointing – until I realized that degrees != radians and fixed the trig appropriately. The map is a little hard to interpret – I plan to make changes to the software to help that – but considering that the robot has been only roughly calibrated, the results are quite impressive (to me – I built the robot and wrote the software, so I might be a little biased).
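For anyone who hits the same bug: the standard-library trig functions take radians, so a heading tracked in degrees must be converted before computing a map point from a sonar range. A minimal illustration – not the robot's actual code:

```cpp
#include <cassert>
#include <cmath>

struct Point { double x, y; };

// Project a sonar range from the robot's estimated pose to an
// obstacle position. The crucial line is the degree-to-radian
// conversion; without it, cos/sin are fed garbage.
Point mapPoint(double xRobot, double yRobot,
               double headingDeg, double rangeIn) {
    double h = headingDeg * M_PI / 180.0;  // the fix: convert first
    return { xRobot + rangeIn * std::cos(h),
             yRobot + rangeIn * std::sin(h) };
}
```

A 10″ reading at heading 0 puts the obstacle 10″ along x; at heading 90 it lands 10″ along y, as it should.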

The picture is of my robot-derived floorplan. The height of the bumps is proportional to the time, from the start of the drive, at which the robot added that map point. Note that new map points could overwrite old ones.

I think that the ridge of high bumps which crosses the middle of the picture parallel to the X axis corresponds to the same part of the hallway as the longer ridge which was found much earlier in the run and which appears below it in the picture. The high ridge comes from the robot’s second tour around the hallway.

Experimenting with the Google AJAX API, I uploaded a graph which progressively shows the obstacles the robot detected as it drove. Note, these are the obstacles detected, and not the robot’s estimated path. This gives some idea of how the robot was moving around, and also of the accumulating inaccuracies in its estimation of where it and the obstacles were.

Little Robot Moves Around

The next thing little robot needed was the ability to sense its environment. I ordered a MaxBotix EZ4 sonar from SparkFun for less than $30, and with a little nervous soldering today got it connected up to the robot. The obstacle avoidance code is very simple – if the robot detects an obstacle sufficiently close, it turns left a little. An earlier algorithm turned both left and right, but needs some work to prevent excessive oscillation.
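The rule really is that simple. A sketch of the decision step, with an illustrative 12-inch threshold (the post doesn't give the actual value):

```cpp
#include <cassert>

enum Action { DRIVE_FORWARD, TURN_LEFT };

// Keep driving unless the sonar range drops below the threshold,
// in which case turn left a little. Threshold is illustrative.
Action decide(double rangeInches) {
    const double kTooClose = 12.0;
    return rangeInches < kTooClose ? TURN_LEFT : DRIVE_FORWARD;
}
```

The robot's loop just reads the sonar, calls the decision, and runs the corresponding motor routine for a short burst.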

Below is a little video of the robot navigating around the hallway and avoiding a moving beslippered object (me).


And here is an interactive visualization of the obstacles the robot sensed.

Little Baby Bot Lives!

Today at around 4pm, a little baby bot was born. Oliver helped me with some of the design decisions, and with measuring and calibrating the little guy. He weighs about 2lb (including batteries), and is 14″ long. At this stage, he is blind. He does, however, diligently follow instructions fed to him as a program down a USB cable. After each programming, we remove the USB cable and he is reborn, newly obedient.

One of the lessons I learned with this robot was that the electronics and programming are at most half the challenge. What caused the most problems was the mechanical part. In the end, I settled for an old iPhone box for the body. We carefully cut holes in the sides for the motors, making the motors fit as snugly as possible to minimize wobble. Even so, a couple of folded pieces of paper and an earplug are used to pack the motors more tightly.

My first little Robot
My first little Robot

Little Robot Steps

It works! I built the beginnings of a little robot today. I have an Arduino Diecimila connected to a circuit built on a breadboard around a TI SN754410 Quadruple Half-H Driver, connected to two Pololu gearmotors. I followed recommendations found in various places on the net to reduce motor noise by wiring each motor with 3 capacitors – fiddly work with such small motors – and followed a very clear example I found in a course on the web to hook up the Arduino to a motor via the H-bridge.

I wrote some code to drive each motor forwards or backwards at a chosen speed, and on top of this wrote routines to drive the robot forwards, backwards, clockwise or anticlockwise. After a little debugging it all works. Only problem now is… the robot has no body, just guts.
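The usual way to drive a motor through an SN754410 is two direction inputs plus a PWM duty cycle on the enable pin; a signed speed maps onto them naturally. This is a sketch of that mapping with invented names, not the exact code I wrote:

```cpp
#include <cassert>
#include <cstdlib>

// One motor channel of the SN754410: two direction inputs (1A/2A)
// and a PWM duty for the enable pin. Speed is signed, -255..255.
struct HBridge { bool in1, in2; int pwm; };

HBridge motorCommand(int speed) {
    HBridge h;
    h.in1 = speed > 0;         // forward: 1A high, 2A low
    h.in2 = speed < 0;         // backward: 1A low, 2A high
    h.pwm = std::abs(speed);   // duty cycle on the enable pin
    return h;
}
```

The robot-level routines then compose two of these: forwards is both motors forward, clockwise is left forward and right backward, and so on.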

I had in mind a Ferrero Rocher box for a transparent, light, appropriately sized body, but the particular – formerly ubiquitous – size which I wanted is nowhere to be found. I may have to cannibalize some tupperware.