I recently bought an Adafruit Gemma. It’s a little programming board that is slightly bigger than a 10p coin and it costs about GBP 7.
It uses an ATtiny85 micro and is Arduino compatible, so the way you’re encouraged to program is to use the Arduino GUI tools and all that good stuff.
I took a more traditional approach instead. By “traditional approach” I mean grumpy old man approach. I don’t like GUIs much. I can’t use vi in the editor, the rendering of the font is terrible, and the syntax highlighting burns my eyes.
So you can just use the command line tools, right? Right. On Ubuntu you can apt-get install gcc-avr and then use the avr-gcc compiler to compile your C code. You’ll need avr-objcopy (from the binutils-avr package) to convert your .elf file into an Intel .hex file, and you’ll need avrdude (from the avrdude package) to flash the device. The gory details are captured in this Makefile, and I got those gory details by switching the Arduino GUI into its verbose mode and watching it compile my project.
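The pipeline boils down to three steps: compile for the target MCU, convert the ELF to Intel hex, flash with avrdude. A minimal Makefile sketch along these lines (the flags and programmer type are my assumptions, not the contents of the actual Makefile: the Gemma's bootloader speaks avrdude's usbtiny protocol, and the ATtiny85 on the Gemma runs at 8 MHz):

```make
MCU   = attiny85
F_CPU = 8000000
CFLAGS = -Os -mmcu=$(MCU) -DF_CPU=$(F_CPU)

main.hex: main.elf
	avr-objcopy -O ihex main.elf main.hex

main.elf: main.c
	avr-gcc $(CFLAGS) -o main.elf main.c

flash: main.hex
	avrdude -c usbtiny -p $(MCU) -U flash:w:main.hex
```

The -Os matters later: it's what lets the compiler collapse constant bit operations into single instructions.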
My first demo project is also an exercise in avoiding the Arduino libraries. Mostly because when I was working out how to use the command line tools, I didn’t want the hassle of dealing with multiple files and libraries and things.
So this is also an example of how to program the ATtiny85 (and more or less any AVR type micro) without using heavyweight libraries. The Gemma has a built-in red LED on PB1. This was definitely one of the things that attracted me to the Gemma: I can program it to do something without needing to plug any extra hardware in. Specifically, I can flash the LED.
Flashing an LED is a matter of using a GPIO pin and driving it high (on) and low (off). The assembler programmer would do that with the SBI (Set Bit in I/O register) and CBI (Clear Bit in I/O register) instructions. So I’m thinking “Can we have reasonable looking C code generate The Right Instruction?”.
The C code to set a bit is generally of the form *p |= b where p is a pointer to some memory location and b is a number with a single bit set (a power of 2 in other words). Similarly the C code to clear a bit is *p &= ~b. As long as we give the compiler enough information, it should be able to compile the code *p |= b into an SBI instruction.
And so it can. Through some fairly tedious but also fairly ordinary C macros, I can write BIT_SET(PORTB, 1) to set pin 1 of PORTB (PB1, the pin with the LED attached), and it gets converted into roughly: *(volatile uint8_t *)0x38 |= 2; which is basically saying modify memory location 0x38 by setting its bit 1. In a little oddity of the AVR architecture, the SBI and CBI instructions operate on I/O addresses, which are mapped into memory from location 0x20 onwards: I/O address n appears at memory location n + 0x20. The upshot of this is that memory location 0x38 is modified with a SBI 0x18 instruction (this mystified me for about 2 hours last night, and I realised what was wrong just as I was drifting off to sleep).
Because in the code *(volatile uint8_t *)0x38 |= 2; both the location, 0x38, and the value, 2, are constant, the compiler has everything it needs to generate the right SBI instruction. And it does!
drj$ avr-objdump -d main.elf 2>&1 | sed -n '/<main>:/,$p' | sed 9q
00000040 <main>:
  40: 1f 93       push  r17
  42: b9 9a       sbi   0x17, 1   ; 23
  44: c1 9a       sbi   0x18, 1   ; 24
  46: 88 ec       ldi   r24, 0xC8 ; 200
  48: 90 e0       ldi   r25, 0x00 ; 0
  4a: f2 df       rcall .-28      ; 0x30 <delay>
  4c: c1 98       cbi   0x18, 1   ; 24
  4e: 88 ec       ldi   r24, 0xC8 ; 200
You can see at the beginning of the disassembly of main the SBI 0x17, 1 instruction, which is the result of the macro BIT_SET(DDRB, 1) (setting the pin to be a digital output). And you can see SBI 0x18, 1 to drive the pin high and light the LED, and CBI 0x18, 1 to drive the pin low. The compiler has even subtracted 0x20 from the addresses.
avr-gcc -Os FTW!