7168 LEDs Out of Control
After years of sterling service they were made redundant and ignominiously thrown on the scrap heap before finding shelter at Swindon HackSpace. Mounted as 32 boards, each with 7 rows of 32 columns, in 4 curved enclosures, try as we might we just couldn't get any control over the blighters. We tried coaxing them into action with the WinSign configuration program that they were reputedly chums with in days gone by, but they were clearly in full rebellion against their hapless serial controller board. There was nothing for it but to liberate them into free-ranging boards and power supplies. We hoped they might become more cooperative in a one-to-one relationship.
Damian and Rob looked into the boards' background, perusing the datasheets of the ICs (integrated circuits) they were using, hoping to spot some hints as to how the boards might best be approached. One IC seemed to promise significant influence over the LEDs, as it had 4 control pins for setting the outputs of 16 LEDs. There were 2 of these ICs on each board together with 32 columns of LEDs, which was suggestive, and a bit more reverse engineering revealed 7 extra pins to select each of the 7 rows of LEDs. Rob, with his C# and trusty Netduino, tried tickling up those pins and finally got those LEDs behaving in a slightly more orderly fashion. But the Netduino just couldn't keep up with the flighty LEDs, which flickered horribly – we needed a more nimble taskmaster. This picture illustrates how slowly each row is lit in turn; the camera got bored waiting for all the rows to light up.
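To make the idea concrete, here is a very rough sketch of that row-scan scheme written for Arduino rather than the original Netduino/C#. The pin numbers are invented, and the two column-driver ICs are assumed to behave like a serial-in shift-register chain (data / clock / latch) – an assumption for illustration, not something read off the real board.

// Hypothetical pin assignments and column-driver interface.
const int DATA_PIN  = 2;                            // serial column data into the two driver ICs
const int CLOCK_PIN = 3;
const int LATCH_PIN = 4;
const int ROW_PINS[7] = {5, 6, 7, 8, 9, 10, 11};    // one select pin per row

byte frame[7][4];   // 7 rows x 32 columns, packed 8 columns per byte; filled elsewhere

void showRow(int row) {
  for (int r = 0; r < 7; r++) digitalWrite(ROW_PINS[r], LOW);   // all rows off
  digitalWrite(LATCH_PIN, LOW);
  for (int b = 3; b >= 0; b--)                                  // clock out 32 column bits
    shiftOut(DATA_PIN, CLOCK_PIN, MSBFIRST, frame[row][b]);
  digitalWrite(LATCH_PIN, HIGH);                                // latch the columns
  digitalWrite(ROW_PINS[row], HIGH);                            // light this row only
}

void setup() {
  pinMode(DATA_PIN, OUTPUT);
  pinMode(CLOCK_PIN, OUTPUT);
  pinMode(LATCH_PIN, OUTPUT);
  for (int r = 0; r < 7; r++) pinMode(ROW_PINS[r], OUTPUT);
}

void loop() {
  // Scanning from loop() leaves the refresh at the mercy of whatever else
  // the sketch is doing, which is where the flicker comes from.
  for (int row = 0; row < 7; row++) showRow(row);
}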
Was the Arduino the answer? It runs C, which is fast, and with an ASCII character lookup table it was able to persuade a board of LEDs to display text, but still they flickered noticeably. The Arduino was simply doing too much background housekeeping and not giving the LEDs enough of its attention. We needed to make servicing the LEDs the utmost priority, and the Arduino's interrupts might be ideal. An interrupt fires at regular intervals; when it fires it suspends all other processes while it runs some service code. If the service code simply toggles the LED pins to set and illuminate one row, and on each interrupt we cycle on to the next of the 7 rows, then that's about as simple and quick as we can make things. The human eye doesn't detect flicker at 16 times a second or faster, so each row must be refreshed at least that often. But we can only display one of the seven rows at a time, so the interrupt must fire 7 times as fast (at least 112 times per second), making a convenient rate of 128 Hz ideal.
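As a minimal illustration of the interrupt setup (not the project's actual code), a 16 MHz Arduino Uno could use Timer1 in CTC mode to get a roughly 128 Hz tick, with the service routine calling the hypothetical showRow() helper from the earlier sketch:

// Timer1 in CTC mode with a /256 prescaler ticks at 62.5 kHz; a compare value
// of 487 fires the interrupt every 488 ticks, i.e. about 128 times a second.
volatile int currentRow = 0;

void setupRefreshInterrupt() {        // call this once from setup()
  noInterrupts();
  TCCR1A = 0;
  TCCR1B = (1 << WGM12) | (1 << CS12);   // CTC mode, prescaler 256
  OCR1A  = 487;                          // 62500 / 488 = ~128 Hz
  TIMSK1 = (1 << OCIE1A);                // enable the compare-match interrupt
  interrupts();
}

ISR(TIMER1_COMPA_vect) {
  showRow(currentRow);                   // set the columns and light one row
  currentRow = (currentRow + 1) % 7;     // next row on the next interrupt
}

// The main code is then free to fill frame[][] from the ASCII character
// lookup table without worrying about the refresh timing.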
And so it proved. The Arduino kept the LEDs in tight check so they finally displayed rock-steady text. By connecting four boards of LEDs in sequence we achieved a much longer string of text. In theory we should now be able to connect all 32 boards in sequence, but we don't yet have the power supplies in a safe condition. We also need to check that the interrupt can service 32 boards 128 times per second.
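A back-of-the-envelope check, using assumed figures rather than measurements from the real hardware, suggests the answer hinges on how the column bits are shifted out:

// Rough budget for chaining all 32 boards (assumed figures, not measurements).
constexpr unsigned      BOARDS         = 32;
constexpr unsigned      COLS_PER_BOARD = 32;
constexpr unsigned      BITS_PER_ROW   = BOARDS * COLS_PER_BOARD;  // 1024 bits = 128 bytes
constexpr unsigned long PERIOD_US      = 1000000UL / 128;          // ~7812 us between interrupts

// shiftOut() bit-bangs at very roughly 10 us per bit on a 16 MHz AVR, so 1024
// bits would take on the order of 10 ms -- longer than the 7.8 ms period.
// Hardware SPI at 4 MHz moves a byte in about 2 us, so 128 bytes would take
// well under half a millisecond, leaving plenty of headroom for the main loop.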
Displaying fixed text is fine, but using a second Arduino interrupt that fires 5 times per second we can scroll the text one LED column at a time, left or right. In future we might choose to set individual characters to flash, or animate them in some other way. Or indeed we might physically arrange the boards differently so that they could be used as a general graphics display.
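One way the scroll could work, sketched with invented names (the real code may well differ): the 5 Hz tick just advances an offset into the message's column bitmap, and the 128 Hz refresh reads its columns through that offset.

// scrollTick() would be called 5 times a second -- from a second timer
// interrupt as described above, or by dividing down the 128 Hz refresh
// interrupt (128 / 26 is close to 5 Hz).
const unsigned MESSAGE_COLUMNS = 6 * 40;   // e.g. a 40-character message, 6 columns per character
volatile unsigned scrollOffset = 0;        // message column shown at the left edge of the display

void scrollTick() {
  scrollOffset = (scrollOffset + 1) % MESSAGE_COLUMNS;   // step one LED column
}

// In the refresh code, display column c then shows message column
// (scrollOffset + c) % MESSAGE_COLUMNS, so the text marches across the boards.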
So far we have a modicum of control over our thousands of LEDs, but they are still unemployed. We need some creative thinking, as well as some creative action. The practical to-do list includes building a housing for the boards and their power supplies so they can easily be used together, plus some higher-level Arduino code to tailor the LEDs to a specific purpose. Volunteers please.
One suggestion is to use the LEDs as part of a window display for the Museum of Computing in some fashion. By themselves the LEDs are rather tame, but maybe they could be used to interact in some way with members of the public as they walk past the museum. Which suggests a related project: detecting passers-by using a Kinect sensor. We have a Kinect working with a Windows machine, but ideally we would drive it from a Raspberry Pi. Any volunteers for getting a Kinect sensor working with a Pi?