The video functionality was the most interesting and most rewarding aspect of the Phoenix hardware design.
I decided to use composite video output ("analog TV") for Phoenix, as this is what most Z80-based computers tended to use, and it's fairly easy to generate using a microcontroller (at least for monochrome).
Some articles I used as inspiration:
- http://www.batsocks.co.uk/readme/art_SerialVideo_1.htm
- http://www.rickard.gunee.com/projects/video/pic/howto.php
- http://sbc.rictor.org/vid3.html
A composite video signal contains horizontal sync, vertical sync, brightness and color information. If we just look at monochrome (no colors or even greyscales) the signal has 3 voltage levels: 0.3 volts for black, 1.0 volts for white, and 0.0 volts for synchronization pulses (both horizontal and vertical). It's easy to generate these voltage levels using 2 digital output pins and a few diodes and resistors, as described in the articles above.
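The arithmetic behind such a two-pin DAC can be sketched as follows. The resistor values (1 kΩ for the sync pin, 470 Ω for the video pin, into the TV's 75 Ω termination) are an assumption borrowed from similar designs, not confirmed values from this project:

```python
# Two-pin resistor DAC into a 75-ohm terminated composite input.
# Resistor values (1 kOhm sync, 470 Ohm video) are assumptions borrowed
# from similar hobby designs, not confirmed values from this project.
VCC, R_LOAD = 5.0, 75.0
R_SYNC, R_VIDEO = 1000.0, 470.0

def par(a, b):
    """Parallel resistance of two resistors."""
    return a * b / (a + b)

def level(sync_high, video_high):
    """Voltage at the 75-ohm load, by superposition of the two pins.
    A low pin sinks current, so it appears as a resistor to ground."""
    v = 0.0
    if sync_high:
        v += VCC * par(R_LOAD, R_VIDEO) / (R_SYNC + par(R_LOAD, R_VIDEO))
    if video_high:
        v += VCC * par(R_LOAD, R_SYNC) / (R_VIDEO + par(R_LOAD, R_SYNC))
    return v

sync_level = level(False, False)   # 0.0 V during sync pulses
black = level(True, False)         # ~0.30 V
white = level(True, True)          # ~0.95 V
```

With these values the three levels land close to the 0.0 V / 0.3 V / 1.0 V targets, which is why this resistor pair shows up so often in composite-video projects.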
I decided to use an ATmega328p as the microcontroller for the video signal generation. It may seem like cheating to use a modern microcontroller in a retrocomputing project; on the other hand many Z80-based computers used custom ICs for video. Programming a microcontroller is just a different way of making a "custom IC".
I started with generating the sync pulses. This is basically just an exercise in careful timing. Rather than using the AVR timers, I wrote this in AVR assembler as a single loop with pin toggles and delay loops. Counting the clock cycles for each instruction makes this a relatively easy (but very boring) process. The result on the logic analyzer:
On the left a few normal horizontal sync pulses (the last few scan lines of a frame), followed by the series of pulses making up a vertical sync: these are half a scan line long (6 half-lines of short pulses, followed by 6 half-lines of long pulses, and then 6 short ones again).
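The cycle counting boils down to converting the standard pulse durations into CPU cycles. A sketch of that conversion, using textbook NTSC timing (the monitor used later is NTSC) and assuming an 18 MHz clock; the values in the real firmware may differ slightly:

```python
# Sync pulse durations expressed in CPU cycles at 18 MHz. The microsecond
# values are standard NTSC numbers; the actual firmware may round
# differently.
F_CPU = 18_000_000

def us_to_cycles(us):
    return round(us * F_CPU / 1_000_000)

LINE       = us_to_cycles(63.5)           # one full scan line
HALF_LINE  = us_to_cycles(63.5 / 2)       # pulses in the vertical interval
HSYNC      = us_to_cycles(4.7)            # normal horizontal sync pulse
EQUALIZING = us_to_cycles(2.3)            # "short" vertical-interval pulse
BROAD      = us_to_cycles(63.5 / 2 - 4.7) # "long" vertical-interval pulse
```

Each delay loop in the assembler then just has to burn the corresponding number of cycles, minus the cycles spent on the pin toggles themselves.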
Now we need the actual image. Since I grew up with a ZX Spectrum, I decided on a resolution of 256x192 pixels. For 256x192 monochrome pixels a framebuffer of 256/8*192 = 6144 bytes (6 KB) is needed. Since Phoenix already has 32 KB of RAM, I can just set aside 6 KB for the framebuffer, although simultaneous access from the video circuitry and the CPU needs to be figured out (more on that below).
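The framebuffer arithmetic can be sketched like this, assuming a simple row-major layout (the layout Phoenix actually uses isn't stated here; the ZX Spectrum, for comparison, famously used an interleaved one):

```python
# Framebuffer geometry: 256x192 at 1 bit per pixel.
WIDTH, HEIGHT = 256, 192
BYTES_PER_ROW = WIDTH // 8          # 32 bytes per scan line
FB_SIZE = BYTES_PER_ROW * HEIGHT    # 6144 bytes = 6 KB

def pixel_location(x, y):
    """Byte offset and bit number for pixel (x, y), assuming a plain
    row-major layout. Bit 7 is taken to be the leftmost pixel."""
    return y * BYTES_PER_ROW + x // 8, 7 - (x % 8)
```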
An ATmega328p is not fast enough to read from memory and produce the brightness signal. Instead, I used a slightly different approach: the ATmega will just put addresses on the bus, and since each byte in the frame buffer contains 8 pixels, a 74HC165 8-bit parallel-in-serial-out shift register is used to output the pixels one at a time:
Each group of 8 pixels is basically generated as follows (assuming the ATmega328p has full access to the memory):
- place correct framebuffer address on the address bus
- load byte from data bus into shift register (this outputs the first bit in the byte)
- shift the bits in the shift register repeatedly to display the remaining 7 pixels
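The load-then-shift step above can be modelled in a few lines. Whether the most significant bit ends up as the leftmost pixel depends on how the shift register is wired; MSB-first is assumed here:

```python
# Software model of the 74HC165 step: one parallel load followed by seven
# shifts, emitting pixels MSB-first (an assumption about the wiring).
def shift_out_byte(byte):
    pixels = [(byte >> 7) & 1]      # load pulse: first pixel appears at once
    for _ in range(7):              # seven shift pulses for the rest
        byte = (byte << 1) & 0xFF
        pixels.append((byte >> 7) & 1)
    return pixels
```

For example, `shift_out_byte(0b10110001)` yields `[1, 0, 1, 1, 0, 0, 0, 1]`: the byte's bits, one pixel at a time.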
To get uniform pixels, the load and shift pulses must happen exactly evenly spaced at the pixel clock frequency.
To get roughly square pixels the pixel clock needs to be about 6 MHz. Toggling a pin on and off requires 2 clock cycles on the AVR. However, we also need to increment the address and put it on the bus every 8 pixels. To keep an even pixel size, we therefore need 3 clock cycles per pixel to do all these operations. At a 6 MHz pixel clock, this means the ATmega needs to run at 18 MHz, which is within its specifications.
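The arithmetic from the paragraph above, written out:

```python
# Cycle budget per pixel at 18 MHz: the load/shift pin toggle takes 2
# cycles, and the third cycle (x 8 pixels) pays for incrementing the
# address and putting it on the bus.
F_CPU = 18_000_000
CYCLES_PER_PIXEL = 3
PIXEL_CLOCK = F_CPU // CYCLES_PER_PIXEL            # 6 MHz
ACTIVE_US = 256 * CYCLES_PER_PIXEL / F_CPU * 1e6   # visible part of a line
```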
The logic analyzer shows this pattern: a load pulse every eight pixels, and shift pulses for the 7 remaining pixels. The pixel data from the shift register is updated at a 6 MHz rate, and the address bus is updated every 8 pixels.
The only remaining issue is making sure the Z80 CPU and the ATmega play nice together for access to the RAM. The solution I chose is not very sophisticated: the Z80 has a "bus request" signal which allows an external device (such as the ATmega in this case) to request access to the address and data bus. After asserting this signal, it may take a few clock cycles for the Z80 to grant access, which it indicates using the "bus acknowledgement" signal.
Given the tight pixel timing requirements, we have to request the bus for the entire period of the 256 horizontal pixels. This means that while pixel data is being sent to the screen the Z80 is effectively halted. This works out to approximately 50% of the time, so even though the Z80 is clocked at 6 MHz, it performs like a 3 MHz one.
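A rough sanity check of that 50% figure, assuming standard NTSC frame timing and that the bus is held only for the 256 visible pixels of each of the 192 active lines (in practice it is probably held slightly longer for setup):

```python
# Rough estimate of the fraction of time the Z80 is off the bus,
# assuming standard NTSC timing (262 lines of 63.5 us per frame).
LINES_PER_FRAME = 262
LINE_US = 63.5
ACTIVE_LINES = 192
HELD_US = 256 * 3 / 18.0   # 256 pixels, 3 cycles each, at 18 MHz

halted_fraction = ACTIVE_LINES * HELD_US / (LINES_PER_FRAME * LINE_US)
# comes out just under 0.5: the Z80 really does lose roughly half its cycles
```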
The final bit of the video design involves taking control of the bus as soon as the Z80 grants it. This is implemented using a 74HC541 buffer whose output is gated by the Z80 bus acknowledgement signal. As soon as the Z80 acknowledges the bus request, the 74HC541 puts the MREQ, IORQ, RD, WR, A15 and A14 signals in a defined state.
The AVR assembler code for this project can be found on GitHub.
And finally two pictures:
Early prototype of the video circuit: the ATmega328p and 74HC165 wired together. There's no memory, instead I'm using the address lines as the input to the shift register.
Final version of Phoenix showing a test pattern on an Adafruit 4" NTSC monitor.