Highlights of CSEE 4840 Embedded System Design
Spring 2004
Columbia University, Computer Engineering Program
Prof. Stephen A. Edwards

The main focus of CSEE 4840 Embedded System Design is an independent group project on the design and implementation of a small system involving hardware and software. The students implemented the projects on an FPGA board (the XSB-300E) from XESS Corporation. This board contains a Xilinx Spartan-IIE FPGA (an XC2S300E) capable of holding both a 32-bit RISC microprocessor core (a ``MicroBlaze'') and quite a lot of student-designed custom logic. Virtually every project incorporated a combination of C code running on the processor and custom logic written in the VHDL hardware description language.

Below are selected projects from the spring 2004 course to give you an idea of the breadth and complexity of these projects. I must say, I was very impressed with what these students could create in half a term.


Terrormouse: A MIDI Synthesizer
Gabriel Glaser    Scott Arfin    Ron Weiss

Gabe, Scott, and Ron did a great job on this perfect example of hardware/software codesign. They decoded the high-level MIDI stream, which consists mostly of note-on/note-off events, in software and used this information to control twelve oscillators: six based on the Karplus-Strong string synthesis algorithm, the other six on FM synthesis. By changing the FM synthesis parameters, such as modulation frequency and depth, they implemented ten different ``patches'' (sounds) ranging from a pure sine wave to a metallic-sounding organ, all of which sounded very good.
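
If you have not seen Karplus-Strong synthesis before, the core idea is just a recirculating delay line whose length sets the pitch. The C sketch below is only an illustration under assumed parameters (sample rate, pitch, and damping factor are my own choices); the group's oscillators were implemented as custom hardware, not this code.

    /* Minimal Karplus-Strong plucked-string sketch (illustrative only). */
    #include <stdio.h>
    #include <stdlib.h>

    #define SAMPLE_RATE 48000
    #define FREQ        440                    /* desired pitch in Hz (assumed) */
    #define N           (SAMPLE_RATE / FREQ)   /* delay-line length sets the pitch */

    int main(void)
    {
        float delay[N];

        /* "Pluck" the string: fill the delay line with noise. */
        for (int i = 0; i < N; i++)
            delay[i] = (float)rand() / RAND_MAX - 0.5f;

        /* Each output sample is the head of the delay line; feed back the
         * slightly damped average of two neighbors, which produces the
         * characteristic decaying string tone. */
        for (int n = 0, head = 0; n < SAMPLE_RATE; n++) {
            float out = delay[head];
            int next = (head + 1) % N;
            delay[head] = 0.996f * 0.5f * (delay[head] + delay[next]);
            printf("%f\n", out);               /* stand-in for sending to a DAC */
            head = next;
        }
        return 0;
    }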

Their final report (PDF)      Listen to their synthesizer (.wav file)

Muddrover: A line-following robot with video vision
Akshay Kumar     Justin Titi     Athena Ledakis     Ron Coleman     Josef Bryks Schenker

This large group did one of the most ambitious projects, integrating a Lego Mindstorms-based robot, a video camera, and custom hardware and software to produce a robot that could run laps around a black line on white paper. They took the raw video signal in through the Philips SAA7019 video decoder chip on the XSB-300E board, processed it, divided the screen into ninths, and used information about how much black was in each area to decide whether to advance or turn the robot. These commands were fed out a serial port to an IR tower to control the Lego robot. Prof. Sklar lent the group the video camera and provided extensive advice.
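
To make the ``ninths'' idea concrete, here is a small C sketch of the kind of decision logic described above: the frame is split into a 3x3 grid and the amount of black in each cell drives a steering command. The thresholds, command encoding, and function names are my assumptions for illustration, not the group's actual code.

    enum command { FORWARD, TURN_LEFT, TURN_RIGHT, STOP };

    /* black[row][col] holds the count of black pixels in each of the nine cells. */
    enum command decide(const int black[3][3], int threshold)
    {
        /* Look at the bottom row of cells, closest to the robot. */
        int left   = black[2][0];
        int center = black[2][1];
        int right  = black[2][2];

        if (center >= threshold)
            return FORWARD;        /* line is straight ahead */
        if (left >= threshold && left > right)
            return TURN_LEFT;      /* line has drifted left  */
        if (right >= threshold)
            return TURN_RIGHT;     /* line has drifted right */
        return STOP;               /* lost the line          */
    }

The resulting command would then be what gets sent out the serial port to the IR tower.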

Their final report (PDF)      Watch their robot in action (QuickTime .mov file)

TAMF: Thing-a-ma-Flipper Video Effects Generator
Essa Farhat     Eveliza Herrera     Rhonda Jordan     Amon Wilkes

This hard-working group produced a dynamic video effects generator. Their hardware could scale and distort a still video image stored in memory under software control; the result was a digital fun-house mirror. They deserve extra credit for using a picture of themselves as the sample image and still being able to look at themselves at the end of the term.
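
As a rough idea of what a ``fun-house mirror'' warp involves, the C sketch below resamples each row of a source image at a horizontally scaled, sinusoidally wobbled position. The image size and distortion parameters are assumptions of mine, and the group did this in custom hardware rather than software.

    /* Illustrative software sketch of a fun-house-mirror style distortion. */
    #include <math.h>

    #define W 640
    #define H 480
    #define TWO_PI 6.28318530718

    void funhouse(unsigned char src[H][W], unsigned char dst[H][W])
    {
        for (int y = 0; y < H; y++) {
            /* The horizontal scale factor wobbles with the row, like a warped mirror. */
            double scale = 1.0 + 0.4 * sin(TWO_PI * y / 64.0);
            for (int x = 0; x < W; x++) {
                int sx = (int)((x - W / 2) * scale) + W / 2;   /* scaled source column */
                dst[y][x] = (sx >= 0 && sx < W) ? src[y][sx] : 0;
            }
        }
    }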

Their final report (PDF)      Watch their video (QuickTime .mov file)

DAQ-T: SAE Student Vehicle Telemetry System
Laxmikant Joshi     Sangeeta Das     Stephen Tarzia     Waclaw Aleksander Godycki

This unique project involved an unusual peripheral: a formula-style racecar built on a space-frame chassis and powered by a 600cc motorcycle engine. Their goal was real-time remote data acquisition from the car, letting them capture data such as engine RPM, throttle position, and manifold air pressure from a safe, quiet, still location. They bought an off-the-shelf RF transmitter/receiver pair, interfaced it with a PIC microcontroller, and designed and built their own analog signal conditioning hardware. Very impressive, although their car ran out of gas during their demo, a challenge only this group could face.
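
A telemetry link like this ultimately comes down to packing sensor readings into small framed packets that survive a noisy radio channel. The C sketch below is purely hypothetical: the frame layout, field names, and checksum scheme are my own illustrations, not the group's protocol, which ran on their PIC microcontroller.

    /* Hypothetical sketch of framing telemetry samples for an RF link. */
    #include <stdint.h>
    #include <stddef.h>

    struct telemetry {
        uint16_t rpm;         /* engine RPM              */
        uint8_t  throttle;    /* throttle position, %    */
        uint8_t  map_kpa;     /* manifold pressure, kPa  */
    };

    /* Build a simple framed packet: start byte, payload, XOR checksum. */
    size_t build_frame(const struct telemetry *t, uint8_t out[8])
    {
        size_t n = 0;
        out[n++] = 0xAA;                   /* start-of-frame marker */
        out[n++] = t->rpm >> 8;
        out[n++] = t->rpm & 0xFF;
        out[n++] = t->throttle;
        out[n++] = t->map_kpa;

        uint8_t sum = 0;
        for (size_t i = 1; i < n; i++)     /* checksum over the payload */
            sum ^= out[i];
        out[n++] = sum;
        return n;                          /* bytes to hand to the transmitter */
    }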

Their final report (PDF)

SDE: Stereo Depth Extractor
Ang Cui     Jeng-Ming Hwang     Yen Yen Ooi     Kashif Siddiqui     Ting-Hsiang Wu

This group implemented an idea based on work by the CS Department's Prof. Shree K. Nayar: extracting depth information from a video camera pointed at a mirror (see his publications page for details). They took the video from the camera, identified the two images of a spot that a laser pointer projected onto a piece of paper, and used the parallax between them to determine how far away the spot was. They revised their initial mirror configuration after reading Shree's paper on the subject.
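
The parallax step boils down to standard stereo triangulation: two views of the same spot, separated on the image plane by a disparity d, give depth Z = f * B / d. The C sketch below shows that calculation with generic variable names and units of my choosing; the mirror geometry determines the effective baseline B, and this is not the group's actual code.

    /* Depth from disparity, the core of the parallax calculation. */
    double depth_from_disparity(double focal_px,    /* focal length in pixels     */
                                double baseline_m,  /* effective baseline (m)     */
                                double x_left,      /* spot position in view 1    */
                                double x_right)     /* spot position in view 2    */
    {
        double disparity = x_left - x_right;        /* pixels */
        if (disparity <= 0.0)
            return -1.0;                            /* degenerate: no usable parallax */
        return focal_px * baseline_m / disparity;   /* depth in meters */
    }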

Their final report (PDF)

Video Game

Dagna Harasim     Charles Finkel     David Soofian

This group implemented a video game based very loosely on Pac-Man. The display uses a combination of character tiles for the maze and sprites for the moving characters; the game logic is implemented in C. They did a good job, but I considered taking off a few points because I kept dying.
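
The tile-plus-sprite split typically looks something like the C sketch below: the maze lives in a grid of character codes and the moving characters are sprites whose pixel positions are checked against that grid. Tile size, maze encoding, and names here are illustrative assumptions, not this group's code.

    #define TILE  8                /* pixels per character tile (assumed)  */
    #define COLS 32
    #define ROWS 24

    static char maze[ROWS][COLS];  /* '#' = wall, ' ' = corridor, walls assumed at edges */

    struct sprite { int x, y; };   /* pixel position of a game character */

    /* Move a sprite by (dx, dy) only if the destination tile is not a wall. */
    void try_move(struct sprite *s, int dx, int dy)
    {
        int nx = s->x + dx, ny = s->y + dy;
        if (maze[ny / TILE][nx / TILE] != '#') {
            s->x = nx;
            s->y = ny;
        }
    }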

Their final report (PDF)      Watch their game in action (QuickTime movie)

Video Game

Ke Xu     Eric Li     Winston Chao

This group also implemented a video game based very loosely on Pac-Man, again using character tiles for the maze and sprites for the moving characters, with the game logic implemented in C. They did a good job, but I considered taking off a few points because I kept dying.

Their final report (PDF)      Watch their game in action (QuickTime movie)

JAYcam: Real-time video system
Yaniv Schiller     Avrum Tilman     Joshua Weinberg

This group exemplified good engineering: their project appeared not to be doing anything, but in fact was very complicated. It took real-time black-and-white video, reduced its resolution, and transmitted it over a 100 Mbit/s Ethernet link to a simple Java program that displayed it on the screen. What was amazing about this project was the number of different, complicated standards it implemented: NTSC video, Ethernet, and UDP.
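
For a sense of the UDP side, here is a POSIX-sockets sketch of shipping a low-resolution frame one scan line per datagram. This is only a high-level illustration: the group built their Ethernet/UDP path on the FPGA board itself rather than on top of an operating system's socket API, and the address, port, and frame dimensions below are my assumptions.

    /* Illustrative sketch: send a small grayscale frame over UDP. */
    #include <arpa/inet.h>
    #include <sys/socket.h>
    #include <string.h>
    #include <unistd.h>

    #define FRAME_W 160
    #define FRAME_H 120

    int send_frame(const unsigned char frame[FRAME_H * FRAME_W],
                   const char *dest_ip, int dest_port)
    {
        int fd = socket(AF_INET, SOCK_DGRAM, 0);
        if (fd < 0)
            return -1;

        struct sockaddr_in dst;
        memset(&dst, 0, sizeof dst);
        dst.sin_family = AF_INET;
        dst.sin_port = htons(dest_port);
        inet_pton(AF_INET, dest_ip, &dst.sin_addr);

        /* One scan line per datagram keeps each packet well under the
         * Ethernet MTU; the receiver (their Java program) reassembles the frame. */
        for (int y = 0; y < FRAME_H; y++)
            sendto(fd, frame + y * FRAME_W, FRAME_W, 0,
                   (struct sockaddr *)&dst, sizeof dst);

        close(fd);
        return 0;
    }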

Their final report (PDF)

NORTSAM: Real-time audio spectrum analyzer
Tecuan Flores     Joshua Mackler     Philip Coakley

This group really pushed the computing power of the XESS board. They implemented a real-time graphical spectrum analyzer able to perform a 2048-point fast Fourier transform (FFT) on a pair of 48 kHz audio signals in real time, bin the results, and display them graphically. Most of the FFT was done in software, but they implemented a very fast complex-multiplier peripheral to speed up the inner loop.
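
The FFT's inner loop is the radix-2 butterfly, whose cost is dominated by one complex multiply per butterfly, which is exactly the operation a hardware complex multiplier can take over. The C sketch below shows that multiply and butterfly in software form; types and names are illustrative, and in the group's design the multiply was handed to their custom peripheral instead.

    /* Radix-2 FFT butterfly with the complex multiply written out. */
    typedef struct { float re, im; } cplx;

    /* (a + jb)(c + jd) = (ac - bd) + j(ad + bc): four multiplies and two
     * adds, which a hardware complex multiplier performs in one step. */
    static cplx cmul(cplx a, cplx b)
    {
        cplx r = { a.re * b.re - a.im * b.im,
                   a.re * b.im + a.im * b.re };
        return r;
    }

    /* One butterfly: combine two FFT points using twiddle factor w. */
    static void butterfly(cplx *xi, cplx *xj, cplx w)
    {
        cplx t = cmul(*xj, w);
        cplx a = { xi->re + t.re, xi->im + t.im };
        cplx b = { xi->re - t.re, xi->im - t.im };
        *xi = a;
        *xj = b;
    }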

Their final report (PDF)      Watch their spectrum analyzer run (QuickTime movie)