I’ve decided to dabble in hardware again. The grand plan is to turn an RC aeroplane into a semi-autonomous UAV. I’ve used PIC microcontrollers before, but this time I thought I’d switch and try one of the Atmel microcontrollers instead. Microcontrollers have got a bit nicer since the last time I used them. For example, generating PWM signals for servos used to require software bit-banging but now there is hardware support for it.
I settled on the Atmega16L, which runs at 8MHz and has 16k of flash for programs and 1k of SRAM for calculations. It has onboard analogue-to-digital converters, which is perfect for plugging in “what angle is the plane at” sensors. It has 3 PWM outputs, which’ll be perfect for controlling the servos that move the rudder and elevator, plus the electronic speed controller. Plus, it supports in-circuit programming, which means you don’t have to keep dragging the chip out of the circuit and plopping it into a hardware programmer.
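To give a flavour of what the hardware PWM support looks like, here’s a minimal sketch of driving a servo from Timer1 on the Atmega16 (assuming avr-gcc and the 8MHz clock; the pulse widths and the choice of the OC1A/PD5 pin are just placeholders for whichever servo ends up wired where):

```c
#include <avr/io.h>

/* Minimal servo PWM sketch for an Atmega16 at 8MHz (assumes avr-gcc).
 * Timer1 runs in fast PWM mode with ICR1 as TOP, giving a 50Hz frame,
 * and OCR1A sets the pulse width on the OC1A pin (PD5). */
int main(void)
{
    DDRD |= (1 << PD5);          /* OC1A pin as output                  */

    ICR1  = 19999;               /* 20ms frame at a 1MHz timer tick     */
    OCR1A = 1500;                /* 1.5ms pulse = roughly servo centre  */

    /* Fast PWM, TOP = ICR1, non-inverting output on OC1A, clk/8 = 1MHz */
    TCCR1A = (1 << COM1A1) | (1 << WGM11);
    TCCR1B = (1 << WGM13) | (1 << WGM12) | (1 << CS11);

    for (;;) {
        /* Writing 1000..2000 into OCR1A sweeps the servo end to end. */
    }
    return 0;
}
```

No bit-banging in sight: once the timer is set up, the pulse train just comes out of the pin while the CPU gets on with other things.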
At first, I’m going to go for something simple as a proof of concept – automatic landing lights. An ultrasonic range sensor will point downward from the plane and switch on some super-bright LEDs when the height drops below two metres. It’s nothing complicated, but it’s a good starting point to let me figure out how this stuff will fit into an RC plane, and how to make it robust enough to survive the inevitable crash landings.
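Roughly speaking, the landing-light logic could look like the sketch below. It assumes the range sensor presents an analogue voltage on ADC channel 0 (with a lower reading meaning a lower height) and that the LEDs are switched from PB0; the threshold standing in for “two metres” is entirely made up and would have to be calibrated against the real sensor.

```c
#include <avr/io.h>
#include <stdint.h>

#define LED_PIN        PB0   /* assumed wiring: LEDs switched via PB0       */
#define TWO_METRES_ADC 300   /* placeholder threshold; depends on the sensor */

/* Read one conversion from ADC channel 0 (assumes the range sensor
 * outputs an analogue voltage related to height). */
static uint16_t read_range(void)
{
    ADMUX  = (1 << REFS0);                       /* AVcc reference, channel 0 */
    ADCSRA = (1 << ADEN) | (1 << ADSC)
           | (1 << ADPS2) | (1 << ADPS1);        /* enable, start, clk/64     */
    while (ADCSRA & (1 << ADSC))
        ;                                        /* wait for the conversion   */
    return ADC;
}

int main(void)
{
    DDRB |= (1 << LED_PIN);
    for (;;) {
        if (read_range() < TWO_METRES_ADC)
            PORTB |= (1 << LED_PIN);             /* below two metres: lights on */
        else
            PORTB &= ~(1 << LED_PIN);
    }
    return 0;
}
```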
Beyond that, I’m going to use a spare RC channel to allow me to switch an ‘autopilot mode’ on and off. After all, I want to be able to get manual control back if the software crashes! When autopilot is on, the microcontroller will take input from a range of sensors (IR horizon finding or an inclinometer/gyro combo for attitude, a pressure sensor for altitude, GPS for position). It’ll figure out what it wants to do next, and move the control surfaces appropriately. Things like automatic landing modes have been done before and sound pretty achievable. There are also lightweight wireless video systems which people have put into planes. And you can buy fairly cheap 433MHz radio link chips which’d allow telemetry to be sent from the plane to a ground station.
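The mode switch itself boils down to measuring the pulse width on that spare channel and comparing it against the mid-point. A rough sketch of the idea (the pin choice and the 1500µs threshold are assumptions, and a real version would use Timer1’s input capture unit rather than the crude polling shown here):

```c
#include <avr/io.h>
#include <stdint.h>

#define MODE_PIN PD2          /* assumed wiring: spare receiver channel on PD2 */

enum mode { MANUAL, AUTOPILOT };

/* Crudely measure the incoming pulse width in microseconds by polling the
 * pin while Timer1 free-runs at 1MHz (8MHz clock, /8 prescaler). */
static uint16_t measure_pulse(void)
{
    uint16_t start;

    TCCR1A = 0;
    TCCR1B = (1 << CS11);                 /* normal mode, 1 tick = 1us  */

    while (PIND & (1 << MODE_PIN)) ;      /* let any current pulse end  */
    while (!(PIND & (1 << MODE_PIN))) ;   /* wait for the rising edge   */
    start = TCNT1;
    while (PIND & (1 << MODE_PIN)) ;      /* wait for the falling edge  */
    return TCNT1 - start;
}

int main(void)
{
    DDRD &= ~(1 << MODE_PIN);             /* channel pin is an input */

    for (;;) {
        /* RC pulses run roughly 1000-2000us; treat >1500us as 'autopilot on'. */
        enum mode m = (measure_pulse() > 1500) ? AUTOPILOT : MANUAL;

        if (m == AUTOPILOT) {
            /* read sensors, compute control outputs, drive the servos */
        } else {
            /* pass the receiver's signals straight through to the servos */
        }
    }
    return 0;
}
```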
What’s the point of this? It’s a blend of three of my interests. I’m interested in the physics/engineering aspects of flight and aerodynamic design. I enjoy building simple hardware systems, and nowadays simple hardware systems can do lots of cool stuff. Finally, I like working on reliable/critical software systems, so dealing with the realtime aspects of this project will be new and fun. As an aside, I’ve been reading about RTLinux and RTAI, which are pretty cool. It reminds me of the first time I saw the SoftICE debugger. There’s something remarkable about being able to pause an entire operating system!
On the software side, I’d be happier in a high-level language. But embedded systems are traditionally done in C, Forth or assembly in order to meet hard realtime demands. If you used a garbage-collected language, a GC pause could mean that you miss a critical event. There has been research into incremental collectors with low overhead for embedded systems, though, and typically you have less need for dynamic memory allocation in an embedded system anyway. But people have tried targeting high-level languages at embedded platforms: someone had a go at running the nhc98 Haskell runtime on a Palm PDA, and someone else did Scheme on a PIC (PDF). If you were to use a high-level language in an embedded system, you’d want some guarantees about its time and space performance, and this is what the Hume project at St Andrews uni is looking at. And if you think that high-level languages can’t live on the bare metal, take a look at this Haskell OS project.
Finally, the “testing” side of this project is pretty interesting. I can obviously unit test the software before it goes onboard the plane, but how do you do integration testing? One answer is to place your flight-control software into a virtual world and make it think that it is actually flying. To achieve this, you can take advantage of the fact that FlightGear can send information about the plane (position, inclination, speed etc.) out on a network port. You can grab this information, repackage it and send it to the flight-control software running inside a simulator (no hardware involved). The flight-control software ultimately sends signals to the servos, so you need to read those signals, map them back into FlightGear-speak and push them across the network to FlightGear. This way, you can see how the flight-control software behaves in high winds without risking the model plane itself!
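The receiving end of that bridge could be as simple as the sketch below. It assumes FlightGear has been configured to stream a few fields (say altitude, pitch and roll) as comma-separated text over UDP; the port number and field layout here are made up and would need to match whatever output protocol you actually set up.

```c
#include <stdio.h>
#include <string.h>
#include <sys/types.h>
#include <sys/socket.h>
#include <netinet/in.h>

#define PORT 5500   /* must match the port FlightGear is told to send to */

int main(void)
{
    char buf[512];
    struct sockaddr_in addr;
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0) { perror("socket"); return 1; }

    memset(&addr, 0, sizeof(addr));
    addr.sin_family      = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port        = htons(PORT);
    if (bind(sock, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
        perror("bind");
        return 1;
    }

    for (;;) {
        ssize_t n = recv(sock, buf, sizeof(buf) - 1, 0);
        if (n <= 0)
            continue;
        buf[n] = '\0';

        /* Assumed packet format: "altitude,pitch,roll\n" as plain text. */
        double altitude, pitch, roll;
        if (sscanf(buf, "%lf,%lf,%lf", &altitude, &pitch, &roll) == 3) {
            /* Hand the state to the flight-control code here; its servo
               commands then get translated back and sent to FlightGear. */
            printf("alt=%.1f pitch=%.1f roll=%.1f\n", altitude, pitch, roll);
        }
    }
}
```

The return path is the same thing in reverse: take the servo commands the flight-control code produces, package them up in whatever format you’ve configured, and send them back to FlightGear over another socket.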
Ah, that’s all for now. I’m well aware that I often start projects with grand plans and then get distracted by Other Things. This project is pretty amenable to the old ‘put it on the shelf for a few months’ treatment. It’s not “all or nothing” like some of my previous projects. So I am pretty hopeful of building something k3wl over the next few months and having a semi-autonomous plane buzzing around the skies.