Chroma Transients

Colour looks great when there are four or more horizontally-adjacent pixels. But when an overlay pixel sits isolated in a scanline (e.g. as part of a vertical line or the upright stem of a text character), the colour is faint, as you can see in the image below.

Altimeter pointer

Notice how much darker the vertical and diagonal lines are compared to the solid green horizontal lines. Also, although it’s not visible here, the green appears to “bleed” into the adjacent left and right pixels from the video feed.

This may be the result of signal processing in the decoder, the encoder, or my video monitor. Or it could be a consequence of the analogue video encoding itself, which squeezes three signals (one luminance and two chrominance) down one wire with limited bandwidth. The ADV7184 datasheet states:

Due to the higher bandwidth, the signal transition of the luma component is usually much sharper than that of the chroma component. The color edge is not sharp and can be blurred, in the worst case, over several pixels.

The decoder has a Chroma Transient Improvement (CTI) block, and the encoder has various Digital Noise Reduction (DNR) filters. I will need to experiment with all of them. The next step will be to add a pushbutton handler that progressively enables and disables the CTI and DNR settings so I can observe what difference, if any, they make.
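A minimal sketch of that pushbutton handler, in C. Note the register sub-address and bit positions below are placeholders, not taken from the datasheet, and `adv7184_write` is a stand-in for the project's real I2C routine:

```c
#include <stdint.h>

/* ASSUMPTION: placeholder register map. The real CTI/DNR control
 * sub-address and enable bits must come from the ADV7184 datasheet. */
#define CTI_DNR_CTRL_REG  0x4D
#define CTI_ENABLE_BIT    (1u << 0)
#define DNR_ENABLE_BIT    (1u << 5)

/* Stub for the project's I2C write routine; on the real board this
 * clocks one byte out to the decoder's user map over I2C. */
static uint8_t last_reg_value;
static void adv7184_write(uint8_t subaddr, uint8_t value)
{
    (void)subaddr;
    last_reg_value = value;
}

/* Called from the (debounced) pushbutton interrupt. Each press
 * advances to the next CTI/DNR combination so the effect of each
 * filter can be observed on the monitor. */
static uint8_t mode; /* 0: both off, 1: CTI only, 2: DNR only, 3: both */
static void cycle_filters(void)
{
    mode = (mode + 1u) & 0x03u;
    uint8_t reg = 0;
    if (mode & 0x01u) reg |= CTI_ENABLE_BIT;
    if (mode & 0x02u) reg |= DNR_ENABLE_BIT;
    adv7184_write(CTI_DNR_CTRL_REG, reg);
}
```

Cycling through all four combinations with a single button keeps the experiment hands-free while watching the overlay on screen.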

Frame buffer

The STM32F407 has 128K of SRAM (plus an additional 64K of core-coupled memory). 128K is just enough to implement a double frame buffer at 300 × 200 resolution. This does not quite cover the full screen (the grey strips at the left and right indicate the available area for graphics).

I could increase the resolution by changing to a 4 bits-per-pixel colour space, giving a palette of 16 colours. Currently, at 8 bpp, I have 64 simultaneous colours available.
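The memory arithmetic behind those figures can be sketched as:

```c
#include <stdint.h>

/* Frame-buffer budget on the STM32F407: 128 KiB of main SRAM
 * (ignoring the 64 KiB CCM). At 8 bpp each pixel takes one byte;
 * at 4 bpp two pixels pack into one byte. */
enum { WIDTH = 300, HEIGHT = 200, SRAM_BYTES = 128 * 1024 };

static uint32_t double_buffer_bytes(uint32_t bits_per_pixel)
{
    return 2u * (WIDTH * HEIGHT * bits_per_pixel) / 8u;
}
```

At 8 bpp the double buffer takes 120,000 bytes, just under the 131,072 bytes available; halving the depth to 4 bpp frees 60,000 bytes that could instead buy a larger drawing area.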

The other option would be to “race the beam”: generate raster scans in real time without using a frame buffer, the way the early game consoles did it. I did some experiments earlier and, while it is possible, I found I was running out of clock cycles when doing anything other than very simple graphics. A single raster scan is equivalent to around 12,000 clock cycles on the STM32F407.
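As a sanity check on that budget, here is a sketch of the arithmetic, assuming the F407's maximum 168 MHz clock and a 64 µs PAL scanline; it lands just under 11,000 cycles, the same ballpark as the figure above:

```c
#include <stdint.h>

/* Rough per-scanline cycle budget: cycles = clock rate x line period.
 * ASSUMPTION: 168 MHz (the STM32F407 maximum) and a 64 us PAL line. */
enum { CPU_HZ = 168000000, PAL_LINE_NS = 64000 };

static uint32_t cycles_per_line(void)
{
    /* 168 cycles per microsecond x 64 us = 10,752 cycles */
    return (uint32_t)((uint64_t)CPU_HZ * PAL_LINE_NS / 1000000000u);
}
```

10,752 cycles sounds like a lot until you subtract interrupt entry/exit, memory wait states, and the per-pixel output itself, which is exactly where the “very simple graphics” limit comes from.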

What I have already tried

Previously I tried another approach: using an LM1881 sync-separator IC, I extracted the timing signals from a composite video source and used a microcontroller connected to an AD724 IC to generate the composite video I wished to overlay. Using a fast analogue switch, I could then switch between the two signals on a per-pixel basis.

The challenge with this approach is that the overlay and source video need to be in phase with each other. Composite video encodes colour using a phase-modulated subcarrier. This remarkable achievement allowed colour TV broadcasts to be watchable on the black-and-white sets that people already owned (and black-and-white broadcasts to be watchable on a colour TV set). But it also means you can’t do simple pixel switching with colour composite video signals: if the subcarriers aren’t in phase with each other, the colours will be wrong.

Daryl Rictor determined that using an MC44144 you can generate a clock signal which is in phase with the colour subcarrier. And if you use that to clock an AD724, you will have video in correct phase with the source that you wish to overlay. I was able to confirm that this does work, but I decided not to pursue it for two reasons: firstly the MC44144 is an obsolete part, and secondly it only works reliably for NTSC video (we use PAL in my part of the world and I want to support both standards).

I believe it could still be a viable approach if the problem with PAL can be solved, but my prototype had 5 ICs on a breadboard not including the microcontroller and I have my doubts that it could be made small enough.

Progress so far

To date I have built a proof of concept that shows colour overlay is possible and within reach of the hobbyist. I believe the hardware problems are solved; now I am working on the software.

I’m calling my proof of concept a “Video Experimenter’s Board”. It consists of two video ASICs from Analog Devices: an ADV7184 decoder, which converts a composite video signal (such as an FPV camera would output) into a digital byte stream, connected back-to-back with an ADV7341 encoder, which turns the byte stream back into composite video. On its own this back-to-back arrangement is not very useful: why would you want to convert analogue to digital, then straight back to analogue?

The reason is that the ADV7184 also supports SCART overlay.

Among other things, the European SCART standard allows a device such as a set-top box to perform pixel switching. By switching rapidly between its own video and that of the TV set, it can build up a text or graphic overlay. Exactly what we want for an OSD system! Using a microcontroller, we can generate RGB video which the ADV7184 will combine with the composite video from an FPV camera. As long as our RGB video is synchronised with the timing signals from the ADV7184, we will have a full-colour OSD.
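The synchronisation requirement boils down to: on every horizontal sync from the decoder, stream one scanline of pixel data out to the port feeding the RGB inputs. A minimal sketch, with hypothetical names (on the real board this copy would be done by DMA paced by a pixel-clock timer, not by the CPU):

```c
#include <stdint.h>

/* Sketch of sync-locked pixel output. 'gpio_odr' stands in for a
 * GPIO output data register driving the resistor-network RGB DAC;
 * the frame-buffer layout (one byte per pixel) is an assumption. */
enum { WIDTH = 300, HEIGHT = 200 };

static uint8_t framebuffer[HEIGHT][WIDTH];

/* Called on each HSYNC from the ADV7184: push one scanline of the
 * overlay out, one pixel per write, in step with the video timing. */
static void hsync_handler(uint16_t line, volatile uint8_t *gpio_odr)
{
    if (line >= HEIGHT)
        return;                 /* scanline outside the overlay area */
    const uint8_t *src = framebuffer[line];
    for (int x = 0; x < WIDTH; x++)
        *gpio_odr = src[x];     /* one pixel per register write */
}
```

The overlay only looks right if every write lands in the same place on every field, which is why the whole design hangs off the decoder's timing outputs rather than a free-running clock.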

My Video Experimenter’s Board is not a stand-alone device. As well as a microcontroller, it needs a dual-voltage power supply and a resistor network to convert the digital pixel data into analogue RGB. However, any microcontroller should be able to interface with it, even an Arduino! Shortly I will provide schematics and design files for the VEB and its supporting modules.

By using a modular approach, I will be able to experiment with different components without committing to a final design. Once I do have a final design I will produce another board with all components, small and light enough to mount on an RC aircraft.

Assembly text rendering

I am testing an assembly routine for rendering text, as I need a significant performance boost: the C text-render routine uses too many cycles. Meanwhile, fast-blank pixel switching stopped working. It was only switching into contrast-reduction mode; pixels were not appearing, as if the voltage level on the FB pin were not high enough to switch pixels in, although according to my scope the voltage levels were correct. This appears to have happened when I connected the ground for the FB DAC, which I had previously left unconnected. After I removed the connection, it started working again. I was relieved to see it work once more, although in fact the ground is supposed to be connected.

I think the reason it needs GND disconnected is that in contrast-reduction mode, one of the two FB pins sits at level 0 (i.e. ground), so it acts as a second ground. That is why the DAC still works without its own GND connection. This behaviour of digital pins at level 0 means I have probably miscalculated the resistors in the other DACs as well, although the colours seem OK.
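Returning to the text renderer: the idea I'm pursuing can be sketched in C before committing it to assembly. This is an illustration, not the actual routine, and the font format (one byte per 8-pixel glyph row, MSB leftmost) is an assumption:

```c
#include <stdint.h>

/* Sketch of a faster glyph renderer: expand one 8-pixel font row into
 * eight framebuffer bytes with a fully unrolled body, instead of the
 * per-bit shift-and-test loop of a naive C renderer. 'fg'/'bg' are
 * the foreground and background colour bytes. */
static void draw_glyph_row(uint8_t bits, uint8_t fg, uint8_t bg,
                           uint8_t *dst)
{
    /* Unrolled: one conditional store per pixel, no loop overhead. */
    dst[0] = (bits & 0x80) ? fg : bg;
    dst[1] = (bits & 0x40) ? fg : bg;
    dst[2] = (bits & 0x20) ? fg : bg;
    dst[3] = (bits & 0x10) ? fg : bg;
    dst[4] = (bits & 0x08) ? fg : bg;
    dst[5] = (bits & 0x04) ? fg : bg;
    dst[6] = (bits & 0x02) ? fg : bg;
    dst[7] = (bits & 0x01) ? fg : bg;
}
```

In hand-written assembly the same unrolling can go further, e.g. keeping the font row and both colours in registers and using conditional-select instructions, which is where the real cycle savings should come from.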

What is Weka OSD?

Update: new approach by André using DDS, see below.

This will be a record of my attempts to develop a colour graphical OSD for radio-controlled aircraft. An OSD system is a device which overlays flight information onto the video feed used by the pilot of an RC aircraft, so that they are able to see their altitude, airspeed, GPS co-ordinates, remaining battery power or fuel, or direction back home.

There are a number of OSD systems available. Many of these are based on the Minim OSD board, originally designed by 3D Robotics but released as an open-hardware project, so there are many variations available and it is inexpensive. However, it is entirely text-based: the MAX7456 chip at its heart was intended to display black-and-white text over security-camera feeds.

There have been projects to create true graphical OSD systems, such as AlceOSD. While it is relatively straightforward to superimpose a black-and-white or greyscale image over a video feed using a modern microprocessor, doing the same with a colour image is much more challenging. The only widely available OSD capable of it is the EagleTree Vector, a closed-source commercial product. Edit: the MyFlyDream Crosshair autopilot, another closed-source product, also features a colour OSD.

I am aiming to create something of comparable performance for the FPV community, fully open so that anybody can build on or customise it. At present I have two working solutions: one using the ADV7184 decoder and ADV7341 encoder connected back to back, and one using the AD724 locked to an external subcarrier. (Yes, Analog Devices are the last, best maker of the buggy whip that is analogue video technology.) Neither solution is entirely satisfactory to me; the first uses too much current and the second too many components. So the search goes on. Please keep reading below to follow my progress, and if you have any ideas, feel free to share them!