Wednesday, January 10, 2018

Decoding GM's ALDL with Teensy 3.6

I want to log my Jeep's Engine Control Module (ECM) diagnostic data reliably and inexpensively any time the vehicle is running so I can tune it to pass emissions tests and reduce pollution.

To do that, I am using a new Teensy 3.6 microcontroller to first decode the data stream out of the ECM and then store it on a microSD card for later retrieval.

My Jeep is running a retrofitted General Motors #1227747 ECM, which is the brains of a mid-80's Throttle Body Injection (TBI) system found on Chevy and GMC trucks.

The ECM, which predates OBDII systems, spits out a serial data stream called Assembly Line Diagnostic Link (ALDL) at 160 baud. This data stream protocol is nothing like plain old RS-232 serial protocol, however...

What is the RS-232 Protocol?

Each byte of data transmitted in RS-232 protocol is preceded by one start bit. Typically each byte is terminated by one stop bit, with no parity bit (8N1), though other combinations are possible.

All bit transmissions take the same time, based on the configured baud rate, and their values are indicated by voltage levels, often 12V and -12V for 0 and 1, respectively.
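To make the framing concrete, here's a small host-side sketch (my own illustration, not code from the project) that expands a byte into the 8N1 bit sequence a UART would put on the wire, least significant bit first:

```cpp
#include <cstdint>
#include <vector>

// Frame one byte as 8N1: a start bit (0), eight data bits sent
// least-significant-bit first, and a stop bit (1).
std::vector<int> frame8N1(uint8_t b) {
    std::vector<int> bits;
    bits.push_back(0);                    // start bit
    for (int i = 0; i < 8; ++i)
        bits.push_back((b >> i) & 1);     // data bits, LSB first
    bits.push_back(1);                    // stop bit (line returns to idle/mark)
    return bits;
}
```

Framing 0x41 ('A'), for example, yields the ten-bit sequence 0 1 0 0 0 0 0 1 0 1 — one start bit, the data bits reversed from how we usually write them, and a stop bit.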

TTL serial protocol, used by microcontrollers, differs only in voltage levels, using 0V to represent a 0 and Vcc (5V, 3.3V) for a 1. The ALDL protocol also uses TTL voltages but the similarities end there.

What is the ALDL Protocol?

At 160 baud, each bit is transmitted in 6250 microseconds (μsec) but instead of indicating value with voltage level, ALDL indicates it with the length of a logic-low pulse (sort of like 1-Wire).

A short duration pulse represents logic 0 while a long duration pulse represents logic 1. All bits start with the falling edge of this low pulse.

My ECM represents a 0 with a 368 μsec pulse and a 1 with a 4400 μsec pulse, but timing differs between ECM part numbers, and some variation between individual ECMs can be expected. I captured timing samples with a Saleae Logic Pro 8, below.

"0" bit timing

"1" bit timing
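Since only the low-pulse width matters, classifying a bit boils down to comparing that width against a threshold somewhere between the short and long pulses. A sketch (names are mine, not from the project code):

```cpp
#include <cstdint>

// Classify an ALDL bit by the width of its logic-low pulse.
// A threshold between the short (~368 us) and long (~4400 us) pulses
// leaves generous margin for timing variation between ECMs.
const uint32_t kThresholdUs = 2000;

int classifyAldlBit(uint32_t lowPulseUs) {
    return (lowPulseUs > kThresholdUs) ? 1 : 0;  // long pulse = 1, short = 0
}
```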

How To Decode ALDL Bits

I tasked one of my Teensy 3.6 Kickstarter rewards with decoding the protocol. It has a microSD card slot, making data logging a breeze.

One way to decode is to use a standard UART set to 2400 or 1600 baud, treating each ALDL bit as a UART byte. Just count the logic 1 bits in the received character.

ALDL 160 baud and async serial 1600 baud

For example, using 1600 baud, an ALDL 0 bit is received as a 11111111 byte and a 1 bit is received as 00000111, depending on how the timing lines up.

In fact, the timing doesn't quite line up, though it can be made to work, as evidenced by WinALDL and other software.
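Counting set bits is what makes the trick robust to that imperfect alignment. A host-side sketch of the idea (hypothetical names, assuming the 1600 baud case): the short ALDL low pulse is swallowed by the UART start bit so the data bits read mostly 1, while the long pulse holds the line low through most of the frame so they read mostly 0.

```cpp
#include <cstdint>

// Decode one ALDL bit from a byte received by a UART running at 1600 baud.
// A mostly-high frame (e.g. 0xFF) means the low pulse was short: ALDL 0.
// A mostly-low frame (e.g. 0x07) means the low pulse was long: ALDL 1.
// Counting set bits tolerates the exact alignment of pulse and frame.
int aldlBitFromUartByte(uint8_t b) {
    int ones = 0;
    for (int i = 0; i < 8; ++i) ones += (b >> i) & 1;
    return (ones >= 4) ? 0 : 1;
}
```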

Instead, the Teensy measures the timing directly in software with a combination of timers and interrupts doing the bulk of the work. It was easy to code and performance isn't an issue, so why not?

To detect the signal's falling edge (the start of a bit) I use attachInterrupt() to run an interrupt handler function.

attachInterrupt(digitalPinToInterrupt(AldlPin), handlePinChange, FALLING);

In the handler, I start a timer that calls a function, doSample(), 2000 μsec later to read the signal and determine if the bit is a 1 or 0.

timer.begin(doSample, 2000);

Why 2000 μsec? It falls roughly midway between the short (≈368 μsec) and long (≈4400 μsec) pulses, so the sample still lands correctly even when bit timing differs between ECMs.

In doSample(), I shift the current byte to make room for the new bit, and then set the new least significant bit to 1 only if the signal is still low (representing ALDL 1). Otherwise it stays 0.

uint16_t myByte = 0;

void doSample() {
  timer.end(); // IntervalTimer is periodic; stop it so we sample once per bit
  myByte <<= 1;
  if (digitalRead(AldlPin) == LOW) {
    // Signal still low at 2000 μsec: bit is a 1
    digitalWrite(LoPin, HIGH);
    myByte |= 1;
  } else {
    // Signal already high again: bit is a 0
    digitalWrite(LoPin, LOW);
  }
  myByte &= 0x1ff;

But wait. In that last line of code, why am I keeping 9 bits of data? And why is myByte actually a 16-bit word?

Synchronizing ALDL

The ALDL data stream is transmitted as an unending series of bits, organized into 20-byte frames, so how do you know which bit is the first in a byte? Or which byte is first in a frame?

How do you guarantee synchronization with the ALDL data stream? By sending a sync character.

To distinguish the sync character from any other byte, which could have any value from 0x00 to 0xff, GM uses 9-bit bytes and 0x1ff as the sync character, the only byte with the first bit set to 1.

As an added bonus, the sync character is the only time the ECM sends nine 1 bits in a row, so your receiver can reliably re-synchronize just by counting strings of "1" bits.
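That re-synchronization trick needs nothing more than a run counter. A minimal sketch of the idea (my own illustration):

```cpp
// Track consecutive "1" bits in the incoming stream. Every data byte
// starts with a "0" bit, so only the sync character can produce nine
// 1's in a row -- hitting nine pinpoints a byte boundary no matter
// where in the stream decoding started.
struct SyncDetector {
    int run = 0;
    // Feed one decoded bit; returns true right after a sync character.
    bool feed(int bit) {
        run = bit ? run + 1 : 0;
        return run == 9;
    }
};
```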

In the doSample() handler, after shifting a bit into myByte, additional code checks to see if it has received the sync character and, if so, resets the bit counter to 0 and begins counting out 9 bits at a time to form the bytes. The code doesn't need to count the bytes in a frame.

  // Sync to the sync character
  // GM 160 ALDL sends a sync character starting with
  // a "1" bit (all other bytes start with a "0" bit)
  // followed by 0xff.
  if (myByte == 0x1ff) {
    digitalWrite(HiPin, HIGH);
    Serial.println("***** SYNC *****");
    bitCount = 0;
    myByte = 0;
  } else if (++bitCount >= 9) { // Received next 9 bits?
    // Mask off the low byte and print
    digitalWrite(HiPin, LOW);
    Serial.println(int(myByte & 0x0ff));
    bitCount = 0;
  }
}

This code is transmitting the decoded bytes over its serial peripheral for testing. Speaking of which...
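The shift/mask/sync logic is also easy to exercise off-hardware. Here's a host-side mirror of it (a sketch of mine, not the project code) that can be fed a synthetic bit stream:

```cpp
#include <cstdint>
#include <vector>

// Host-side mirror of the decoder: shift bits in, keep the 9 most recent,
// and emit a data byte after each group of 9 bits following a sync character.
// Relies on the sync character being the only run of nine 1 bits.
std::vector<uint8_t> decodeBits(const std::vector<int>& bits) {
    std::vector<uint8_t> out;
    uint16_t word = 0;
    int bitCount = 0;
    bool synced = false;
    for (int b : bits) {
        word = ((word << 1) | b) & 0x1ff;  // keep only the last 9 bits
        if (word == 0x1ff) {               // sync character: nine 1's
            synced = true;
            bitCount = 0;
            word = 0;
        } else if (synced && ++bitCount >= 9) {
            out.push_back(word & 0xff);    // drop the leading "0" bit
            bitCount = 0;                  // masking handles the rest
        }
    }
    return out;
}
```

Feeding it nine 1 bits (sync) followed by a 9-bit byte should reproduce that byte, which is a handy sanity check before touching the bench ECM.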


Testing The Decoder

To find the inevitable bugs, I needed a test ECM to generate ALDL data.

ECM test points: check engine light, ALDL out, 12V, GND.
I have a couple of spare ECMs, so I powered one up with a spare PC power supply that I converted into a bench supply for this project. I soldered wires onto the bottom of the ECM board to tap into the pins for power, check engine, and ALDL.

The project was good motivation to convert my spare PC power supply into a bench supply.
With an ECM just sitting there powered by 12V with no sensors hooked up, it spits out a predictable, always-repeating frame of bytes that is perfect for testing.

The Teensy code prints out each byte decoded from the ALDL stream to my workstation. I compare the Teensy stream against two things:
  • Manual decoding
  • TunerPro decoding
Using my Saleae Logic Pro 8, I captured the datastream coming out of the test ECM on the workbench and then manually decoded the 1's and 0's after finding a sync character. Turning the stream of bits into bytes verified that both streams were identical.

And, just to be sure, I also used TunerPro (which I use for tuning and ad-hoc data logging) to print the stream for comparison.

After fixing a couple issues, everything checked out. Yay! Now I can do some tuning with the Wideband O2 sensor.
