After spending a few weeks working on the software, it’s time to get back to the hardware. My first task is to hook up the Arduino and see if I can reliably trigger off the interrupts.
The answer to that question is yes; setting the trigger on “FALLING” works reliably. But I came across the following behavior that is a bit puzzling:
These are the row and column strobes. On the left, you can see the top strobe happens first, and then the bottom strobe. If I zoom in, I can see that they occur when data is present on the bus (bus signals not shown). Just as it should be.
But what is going on with the signals on the right? Here’s a bigger picture:
This is weird; first we see a column/row strobe with some crappy looking bus data, and then a row/column strobe with some good looking bus data. This would seem to indicate that part of the time, the CPU sends data and then sends different data soon after, and other times it just sends the data. The hardware in the machine won’t care; the time that the first data is present is so short that it should not affect the behavior of the lights in a noticeable way.
But is this what is really going on, or is it a glitch in my logic analyzer? Time to find out…
BTW, the big square wave in the middle is an output from the Arduino. An interrupt service routine is hooked up to the bottom signal on the trace, and the lag between the negative edge on the strobe and the positive edge on the Arduino trace is the time it takes the interrupt to happen plus the time it takes to turn on a pin. So, it appears the decision not to do this purely in software on the Arduino is a good one.
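For the curious, that latency probe needs almost no code; a minimal sketch along these lines (the pin choices here are arbitrary, with the strobe on interrupt 0 / pin 2 and the marker on pin 8) raises a marker pin as the very first thing the ISR does:

// Minimal latency probe: raise a marker pin as the first action in the
// ISR, drop it in loop(), and measure strobe edge -> marker edge on the
// logic analyzer.
void strobeHandler() {
digitalWrite(8, HIGH);
}

void setup() {
pinMode(2, INPUT);
pinMode(8, OUTPUT);
attachInterrupt(0, strobeHandler, FALLING); // interrupt 0 = pin 2 on an Uno
}

void loop() {
digitalWrite(8, LOW); // clear the marker so the next edge is visible
}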
To figure out what is going on, I wrote an Arduino program:
unsigned long startTime = 0;

const int ArraySize = 128;
volatile unsigned long rowTimes[ArraySize];
volatile unsigned long columnTimes[ArraySize];
volatile int rowIndex = 0;
volatile int columnIndex = 0;

void setup() {
pinMode(2, INPUT); // row strobe (interrupt 0)
pinMode(3, INPUT); // column strobe (interrupt 1)
pinMode(8, OUTPUT); // timing marker for the row handler
pinMode(9, OUTPUT); // timing marker for the column handler
attachInterrupt(0, rowDataHandler, FALLING); // interrupt 0 = pin 2
attachInterrupt(1, columnDataHandler, FALLING); // interrupt 1 = pin 3
Serial.begin(9600);
}

void rowDataHandler()
{
digitalWrite(8, HIGH);
if (rowIndex < ArraySize)
{
rowTimes[rowIndex] = micros();
rowIndex++;
}
digitalWrite(8, LOW);
}

void columnDataHandler()
{
digitalWrite(9, HIGH);
if (columnIndex < ArraySize)
{
columnTimes[columnIndex] = micros();
columnIndex++;
}
digitalWrite(9, LOW);
}

// the loop function runs over and over again forever
void loop() {
// wait until both capture buffers are full
while (rowIndex < ArraySize && columnIndex < ArraySize)
{
Serial.print(".");
}

Serial.println();
Serial.println("Times");
for (int i = 0; i < ArraySize; i++)
{
Serial.print(rowTimes[i] - startTime);
Serial.print(",");
Serial.println(columnTimes[i] - startTime);
}
Serial.println();

startTime = micros();
rowIndex = 0;
columnIndex = 0;
}
Basically, it sets up interrupt handlers for both row and column strobes and then starts saving the time when the interrupt occurs. When it has collected 128 of each, it stops collecting them, and the main loop sends them out over the serial port. Then the process repeats itself.
I took the data and pulled it into Excel for analysis. The table below shows the delta time between successive interrupts; all times are in microseconds.
| Row Delta (µs) | Col Delta (µs) |
| --- | --- |
| 2064 | 2112 |
| 44 | 44 |
| 1992 | 1992 |
| 44 | 44 |
| 2048 | 2044 |
| 44 | 44 |
| 1952 | 1956 |
| 44 | 44 |
| 2000 | 2000 |
| 44 | 44 |
| 2008 | 2008 |
| 44 | 44 |
| 2000 | 1996 |
| 44 | 44 |
| 1996 | 2000 |
| 44 | 44 |
| 2000 | 1956 |
| 2048 | 2048 |
| 2056 | 2056 |
| 2052 | 2096 |
| 44 | 44 |
| 1988 | 1984 |
| 44 | 44 |
| 2008 | 2008 |
| 44 | 44 |
| 1996 | 1996 |
| 44 | 44 |
| 2000 | 2000 |
| 44 | 44 |
| 2000 | 2004 |
There are times with a double interrupt, and other times with only a single one. Sometimes the row interrupt comes first, sometimes the column one does. The period is right around 2 ms, so the frequency is pretty close to 500 Hz; exactly what I had measured before. The time period between the first interrupt and the second is either 44 or 48 µs; I think the difference depends on the ordering.
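Incidentally, the spreadsheet step is optional; a small variation on the print loop in the sketch above would emit the deltas directly:

// Print successive differences instead of raw timestamps.
for (int i = 1; i < ArraySize; i++)
{
Serial.print(rowTimes[i] - rowTimes[i - 1]);
Serial.print(",");
Serial.println(columnTimes[i] - columnTimes[i - 1]);
}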
The ratio of the two time periods is 44/2000, or just about 2%. Based on my extensive experience dimming lights by microcontroller, I can tell you that 2% isn’t really a noticeable difference in how a light looks; you *might* be able to detect the difference between 98% and 100% brightness in a controlled setting, but it would be hard to do. I’m confident that you can’t tell the difference between 0% and 2%, because on an incandescent light, 2% is just off.
I did a little graph in Excel that looked pretty much exactly like the logic analyzer image, so I won’t share it here.
How to deal with this? Well, I think it will actually turn out to be pretty simple. Instead of doing work directly off of the row or column interrupts, I will set up a short timer interrupt (let’s say 100 µs or so) whenever a row/column strobe interrupt fires. Any new strobe that arrives before the timer expires will restart it, so the net effect is that the timer interrupt is serviced a short time after the last row/column strobe.
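Here’s a rough sketch of that retriggerable one-shot, assuming an Uno-class (ATmega328) board and driving Timer2 directly; the register names are the ATmega328’s, dataReady is just a stand-in for the real work, and taking over Timer2 this way does steal PWM from pins 3 and 11:

volatile bool dataReady = false;

// Set up Timer2 as a 100 us one-shot: CTC mode, /8 prescaler
// (0.5 us per tick at 16 MHz), compare interrupt armed on demand.
void oneShotSetup() {
noInterrupts();
TCCR2A = (1 << WGM21); // CTC mode: count to OCR2A, then interrupt
TCCR2B = (1 << CS21); // prescaler /8
OCR2A = 200; // 200 ticks * 0.5 us = 100 us
TIMSK2 = 0; // compare interrupt off until a strobe arrives
interrupts();
}

// Called from the row/column strobe ISRs. A strobe that lands before
// the timeout restarts the count, so the compare interrupt only fires
// 100 us after the *last* strobe in a burst.
void oneShotRetrigger() {
TCNT2 = 0; // restart the count
TIFR2 = (1 << OCF2A); // clear any pending compare flag
TIMSK2 = (1 << OCIE2A); // arm the compare interrupt
}

ISR(TIMER2_COMPA_vect) {
TIMSK2 = 0; // disarm: one shot per burst
dataReady = true; // or call the real handler right here
}

With that in place, the row and column strobe handlers shrink to little more than a call to oneShotRetrigger(), and the real work happens only after the bus data has settled.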