New Boards Complete

It took a long time to get these boards made up and populated, and I ran into issues along the way due to some errors in the design. The first mistake was using 1.27mm pin header footprints instead of 2.54mm. This might have been great from a miniaturisation point of view, but not for convenience of prototyping: I needed to make custom jumpers to connect the boards, and special adapters to work with the DuPont connectors that are standard for prototyping. More troublesome was the use of a single AC coupling capacitor on the video input. Each IC that receives the video signal needs its own AC coupling capacitor, because each applies its own bias to the incoming signal and they will interfere with each other. I had to perform surgery by scraping off the soldermask, cutting traces and soldering in extra caps before the LM1881 sync separator would extract meaningful sync signals. Once I get some free time I will correct the design files, but for now please don't anybody use them! (Update: the design files have since been corrected.)

I have PAL and NTSC versions of the timing board. I intend to start with the NTSC version and use it to drive an AD724 video generator in sync with the colour subcarrier. I will then use another board (not shown) with a video switch controlled by the MCU to switch pixels.

Another possibility is to ditch the AD724 and generate composite video with discrete components, by applying a delay to the colour subcarrier. This is how the early home computers did it back in the 1980s. If time permits I will explore both options, but for now the aim is to get a proof of concept working. The next task on the agenda is to revisit the WekaOSD code and adapt it for the new hardware.


New Board Designs

The new design is a timing board, which extracts all the signals needed for genlock but does not generate any video. There will also be a fast video switch on a separate board using a MAX4313, which is both a buffer and a switch. Its switching time is around 40ns, much slower than the TI switch I was planning to use, but this may still be OK, and it does not require a negative supply. I will still make the negative supply board though, and if the MAX4313 is not suitable I can make another board with the TI switch.

This is a modular approach. I intend to try generating the CVBS from the uC, but if this does not work out I can make another board with an AD724 as originally planned.

Circuit schematics are complete; I am currently working on the routing.

Assuming 52us of active analog video per line, a horizontal resolution of 320 pixels means each pixel lasts 162.5ns. This means the MAX4313, which takes 40ns to switch, will spend around 1/4 of a pixel switching.
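As a sanity check, here is that arithmetic as a minimal C snippet (the 52us active line and 320-pixel figures are the assumptions stated above):

    /* Back-of-envelope pixel timing: 52us of active video per line, 320
       horizontal pixels, and the MAX4313's ~40ns switching time. */
    #include <stdio.h>

    int main(void)
    {
        double active_us = 52.0;   /* active (visible) portion of one line */
        double pixels    = 320.0;  /* horizontal resolution */
        double switch_ns = 40.0;   /* MAX4313 switching time */

        double pixel_ns = active_us * 1000.0 / pixels;  /* 162.5 ns */
        printf("pixel period: %.1f ns\n", pixel_ns);
        printf("switch time as a fraction of a pixel: %.2f\n", switch_ns / pixel_ns); /* ~0.25 */
        return 0;
    }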

Video timing board design is uploaded to Github:

https://github.com/Batperson/video_timing_board

Testing the Clamp Output

Connecting my video monitor to the clamp's output results in a distorted signal (as shown on the scope) and nothing visible on the monitor. Without the monitor connected, the waveform on the scope looks normal.

With a 0.1uF capacitor, the black level and porches look “tilted”, with the left end at a slightly lower voltage than the right.

With a 4.7uF capacitor, the tilt is no longer visible. Same with a 220uF cap. But still no steady video on the monitor.

Putting a 75 ohm terminating resistor in the circuit, as I would have in a real circuit, halves the amplitude of the video signal. I'm guessing that 0.1uF into 75 ohms as the input stage to a gain-of-2 op amp, as I plan to use, would be OK.

It seems a CVBS waveform can pass through a capacitor OK, but not with enough power to drive a monitor. It needs buffering.

Clamping and Measuring

With the AD724 (on a Uzebox I have lying around), I measured 0mV at black level and -200mV at sync tip. With the gameloader screen active, I got -100mV at sync tip and still 0V at black. I noticed some movement up and down.

With my small PAL FPV camera, I measured around 180mV at black level, 50mV at sync tip.

With my Runcam action camera, I measured 20mV at sync tip and 180mV at black level.

In conclusion, everything I have generates CVBS with a different DC offset. TV monitors do not care; they adapt to whatever the offset is. But if I am going to mix signals, they will both need to be clamped to a common DC offset. Sync tips at 0V should be fine, so I will try to make a clamp circuit using just a diode and a capacitor.

I made up a clamp circuit on a breadboard using a BAT48 Schottky diode and a 0.1uF ceramic cap. It did bring the sync tips to the same voltage for my two cameras, whereas unclamped, each camera's signal had a different DC offset. However, the sync tips were not at exactly 0 volts as expected, but appeared to be 20-40mV below zero. This must be due to the diode's forward voltage drop.

Now I am not sure what value of capacitor to use in a real system. Many video circuits just use 0.1uF, but according to what I have read, at least 220uF is needed to pass the full video bandwidth, in particular the low-frequency end, without tilt.
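For reference, the high-pass corner of a coupling cap is 1/(2*pi*R*C). A minimal calculation, assuming the cap works into a 75 ohm video load (my assumption for a real system, not what was on the breadboard):

    /* High-pass corner frequency f = 1/(2*pi*R*C) of an AC coupling cap
       driving an assumed 75 ohm video load. */
    #include <stdio.h>

    static double corner_hz(double r_ohm, double c_farad)
    {
        const double pi = 3.141592653589793;
        return 1.0 / (2.0 * pi * r_ohm * c_farad);
    }

    int main(void)
    {
        printf("0.1uF into 75R: %.0f Hz\n", corner_hz(75.0, 0.1e-6));  /* ~21 kHz */
        printf("4.7uF into 75R: %.0f Hz\n", corner_hz(75.0, 4.7e-6));  /* ~450 Hz */
        printf("220uF into 75R: %.1f Hz\n", corner_hz(75.0, 220e-6));  /* ~9.6 Hz */
        return 0;
    }

The ~21kHz corner for 0.1uF lines up with the tilt seen earlier, while 220uF reaches well below the field rate.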

Revisiting the AD724

I made a design for an OSD board using a genlocked AD724. When I attempted this previously, one issue seemed to be mismatched DC levels between the input video and the OSD video. To get them matched, both signals will need to be clamped. Normally, AC-coupled video signals are clamped so that black level sits at 0V, which puts the sync tips at -0.3V. So I need a circuit to do that.

Design Note DN327 from Linear Technology has a clamp circuit using a diode and a capacitor-resistor network. I'm not sure what DC level it actually clamps to; it is probably optimized for the amplifier in the circuit, which would have its sync tips at >= 0V.
I'm wondering if this actually matters. Assuming the next stage down from the OSD (the monitor, or the VTx) is going to be AC-coupled, the DC level is going to be stripped out anyway. As long as the black levels of both signals are equal, it is probably OK.

If I do clamp to a DC offset with no negative excursion, maybe I won’t need a negative supply for the amplifier and pixel switch either.

My plan for the time being is to experiment with clamp circuits before continuing with the OSD board design.

Current Draw

Using my multimeter, I measured 436mA drawn before the LDOs for the OSD circuit. This does not include MCU current.

I measured 260mA drawn after the LDOs on the 3.3V supply, and 70mA on the 1.8V supply (all using the amps scale, on the 10A unfused input).

If this is accurate, then I might be able to achieve around 360mA using a switching power supply with 80% efficiency (not including MCU current draw).
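One back-of-envelope way to arrive at a figure in that region (a sketch only; it assumes that whatever part of the 436mA does not flow into the two LDOs is load drawn directly from the 5V rail):

    /* Rough estimate of 5V input current if the two LDOs were replaced with
       80%-efficient switchers. Assumes 436mA pre-regulator minus the LDO
       input currents is load that stays on the 5V rail (my assumption). */
    #include <stdio.h>

    int main(void)
    {
        double i_pre_ldo = 0.436, i_3v3 = 0.260, i_1v8 = 0.070; /* measured, A */
        double v_in = 5.0, eff = 0.80;

        double i_direct = i_pre_ldo - i_3v3 - i_1v8;          /* ~106 mA stays on 5V  */
        double p_rails  = i_3v3 * 3.3 + i_1v8 * 1.8;          /* ~0.98 W on the rails */
        double p_total  = p_rails / eff + i_direct * v_in;    /* ~1.76 W from 5V      */
        printf("estimated input current: %.0f mA\n", p_total / v_in * 1000.0); /* ~350 mA */
        return 0;
    }

Under those assumptions the estimate comes out at roughly 350mA, in the same ballpark as the 360mA figure above.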

New Approach, New Ideas

Now I am looking at making a new board, using the AD724 clocked in sync with the incoming video signal's colourburst. This is looking like quite a challenge; the basic principle has been shown to work, but it requires a lot of ICs. I am considering using the AD8001 as the pixel switch, as it has "excellent video characteristics" according to the datasheet, which also gives a reference circuit for exactly that. However, it requires a dual +/-5V supply and I am not sure of the best way to provide this. More research needed. Could it be as simple as two +5V LDOs, with the positive rail of one joined to the negative rail of the other to become GND?

I am also wondering how much of the 550mA used by my existing design is due to losses in the inefficient linear voltage regulators. If I changed them to switch-mode regulators, could I bring the draw down to 300mA? And could I do this while keeping noise within acceptable limits? I will try to measure the actual current drawn by the video chips from the LDO regulators. To date I have only measured the total current drawn from the 5V USB power supply, which includes losses from the LDOs.

Revised font system and other changes

Before attempting new hardware I wanted to finish 2018 with a better, more performant font system and tidy up other loose ends in the code. These changes are now complete. I am now storing fonts as 2bpp bitmaps which allows me to store the outline along with the character data, giving much better performance when rendering in outline mode.
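To illustrate the idea, here is a minimal sketch of how one row of a 2bpp glyph might be rendered. The value assignments (0 = transparent, 1 = black outline, 2 = foreground) are my assumption for the example; the actual WekaOSD encoding may differ:

    #include <stdint.h>

    #define PX_TRANSPARENT 0   /* leave the underlying video pixel alone */
    #define PX_OUTLINE     1   /* black outline                          */
    #define PX_FOREGROUND  2   /* character colour                       */

    /* One byte holds four 2-bit pixels, most significant pair first. */
    static void render_glyph_row(uint8_t packed, uint8_t fg, uint8_t *dst)
    {
        for (int i = 0; i < 4; i++) {
            uint8_t px = (packed >> (6 - 2 * i)) & 0x03;
            if (px == PX_OUTLINE)         dst[i] = 0x00;  /* black  */
            else if (px == PX_FOREGROUND) dst[i] = fg;    /* colour */
            /* PX_TRANSPARENT: dst[i] is left untouched */
        }
    }

Packing the outline into the same bitmap means a glyph can be drawn in a single pass, which is presumably where the speed-up in outline mode comes from.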

Normally a black outline is needed around all character data, as well as most graphics, because they tend to disappear against a white or light background. An OSD must be easy to read against any kind of background – you don't want to have to nose down in order to read the altimeter because it is unreadable against the sky. Now almost everything has a black outline to avoid this problem.

I also implemented a variometer and experimented with yellow-white text and graphics.

I now have a font editor which I will be uploading to Github soon. It is very far from production quality code, simply the bare minimum necessary for me to be able to edit fonts instead of hand-coding them in raw binary as I was doing previously.

Now that these tasks are out of the way I will focus on the new hardware design. I will start by using a genlocked AD724.

Progress, and some new ideas

With altitude, airspeed and heading indicators as well as an RTH arrow and a better attitude indicator, it is starting to look like a proper HUD. I have made the indicators white-green to reduce the effects of limited chroma bandwidth, and the Armed warning is now black on yellow for the same reason.

Progress.
Pitch ladders, heading, RTH arrow – it's starting to look good. Pity the Armed warning is covered up, but that will soon be fixed. It's time to get rid of the rainbow test pattern.

The attitude indicator needs a boresight, and text rendering needs some tidying up – you can see the imperfections if you look closely. The next item to implement will be a battery meter. It will use colour to show battery status, progressing from green to yellow, orange and red as the voltage drops. I will also try to implement a generic "indicator" which can be used to display any kind of parameter. It will have a text label and an optional colour swatch or icon, which can change in response to the parameter. It can be used to show things such as GPS satellite count, temperature, current and so on.
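As a sketch of the battery meter's colour progression (the thresholds here are placeholders, not values I have settled on):

    /* Map a per-cell voltage to a meter colour: green -> yellow -> orange -> red.
       The threshold voltages are illustrative placeholders only. */
    typedef enum { COLOUR_GREEN, COLOUR_YELLOW, COLOUR_ORANGE, COLOUR_RED } colour_t;

    static colour_t battery_colour(float cell_volts)
    {
        if (cell_volts >= 3.8f) return COLOUR_GREEN;
        if (cell_volts >= 3.6f) return COLOUR_YELLOW;
        if (cell_volts >= 3.4f) return COLOUR_ORANGE;
        return COLOUR_RED;
    }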

Some time in the future I would also like to have analog gauges as an alternative to HUD-style tapes. You will be able to have the traditional “6-pack” instruments if you want to.

After some discussion with airbanana at RC Groups I am going to make another attempt to generate the overlay by genlocking and pixel switching, instead of using digital decoder and encoder ASICs. If this works it should use less current (at the moment my dev board and MCU draw about 0.5A). There are two ways I could do this. One is the method I tried earlier, using an AD724 clocked in sync with the colour subcarrier. Another possibility is the approach used here. Rossum is generating the entire composite signal, including the colourburst, using the SPI port. To do that he needed to overclock the MCU to a multiple of the colourburst frequency. But the SPI port can also be driven by an external clock: I can obtain a clock signal in sync with the colourburst using an MC44144 and a comparator, as I did previously, and try using that to drive the SPI. It will be some time before I can try either approach as I will need to fabricate a new board. In the meantime I will continue working on the graphics.
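As a rough sketch of the externally clocked SPI idea (assuming an STM32 HAL project; DMA setup, pin mux and buffer management are omitted and the names are placeholders, so this is an outline of the approach rather than working code):

    #include "stm32f4xx_hal.h"

    extern SPI_HandleTypeDef hspi1;  /* SCK pin fed by the subcarrier-locked clock */

    /* Configure the SPI as a slave so the bit clock comes from outside the MCU. */
    void video_spi_init(void)
    {
        hspi1.Instance            = SPI1;
        hspi1.Init.Mode           = SPI_MODE_SLAVE;
        hspi1.Init.Direction      = SPI_DIRECTION_2LINES;
        hspi1.Init.DataSize       = SPI_DATASIZE_8BIT;
        hspi1.Init.CLKPolarity    = SPI_POLARITY_LOW;
        hspi1.Init.CLKPhase       = SPI_PHASE_1EDGE;
        hspi1.Init.NSS            = SPI_NSS_SOFT;
        hspi1.Init.FirstBit       = SPI_FIRSTBIT_MSB;
        hspi1.Init.TIMode         = SPI_TIMODE_DISABLE;
        hspi1.Init.CRCCalculation = SPI_CRCCALCULATION_DISABLE;
        HAL_SPI_Init(&hspi1);
    }

    /* Stream one pre-built line of video out on MOSI via DMA. */
    void video_spi_send_line(uint8_t *line, uint16_t len)
    {
        HAL_SPI_Transmit_DMA(&hspi1, line, len);
    }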

Full Screen

I have finished porting the code over to the new STM32F413 Nucleo board and now have a frame buffer big enough to cover the whole screen. You can see the output here.

Full frame buffer
360 * 288 frame buffer, instruments in white. Notice how much better the text looks.
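For a rough idea of memory use, here is the buffer size at that resolution for a couple of pixel depths (the pixel depths are my assumption for illustration; the actual format is not stated here):

    /* Frame buffer size at 360*288 for two candidate pixel depths. */
    #include <stdio.h>

    int main(void)
    {
        unsigned w = 360, h = 288;
        printf("4bpp: %u bytes\n", w * h / 2);  /* ~51 KB  */
        printf("8bpp: %u bytes\n", w * h);      /* ~101 KB */
        /* Either fits comfortably in the STM32F413's 320 KB of SRAM. */
        return 0;
    }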

Word from Analog Devices regarding the chroma transient issue I mentioned earlier is that it is a result of the conversion to a YCrCb 4:2:2 digital format by the decoder. This format reduces the amount of video data significantly, but at the expense of chroma information: chroma is sampled at half the horizontal rate of luma. Chroma is sacrificed because the human eye is much more sensitive to luma detail than chroma, so the loss is barely noticeable in a scene from ordinary television. However, with coloured text and graphics it does become noticeable. From what I have observed, the effect is most noticeable with primary colours (pure red, blue or green). White or black pixels are the least affected, and intermediate colours (yellow, magenta), while still affected, look better than primary colours. This makes sense when you consider that white pixels are almost entirely luma in the data stream, while primary colours contain the greatest proportion of chroma information.
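For illustration (not the decoder's actual implementation), this is what 4:2:2 packing looks like: each pair of pixels shares a single Cb and Cr sample, so a coloured edge can only change chroma every second pixel:

    #include <stdint.h>

    typedef struct { uint8_t y, cb, cr; } pixel444_t;

    /* Pack two 4:4:4 pixels into four 4:2:2 bytes (Cb Y0 Cr Y1 ordering,
       as used in BT.656-style streams), averaging the shared chroma. */
    static void pack422(const pixel444_t *p0, const pixel444_t *p1, uint8_t out[4])
    {
        out[0] = (uint8_t)((p0->cb + p1->cb) / 2);  /* Cb, shared */
        out[1] = p0->y;                             /* Y0         */
        out[2] = (uint8_t)((p0->cr + p1->cr) / 2);  /* Cr, shared */
        out[3] = p1->y;                             /* Y1         */
    }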

For the time being I will have to live with this limitation. I will try to mitigate it by outlining text with black pixels (as most OSDs already do) and making coloured output at least 4 pixels wide (which will be 2 pixels in my half-resolution overlay).

It would make sense to render the altimeter, airspeed and AHI in white, and reserve colour for warning messages and status icons so that is what I am doing now. Ultimately this will be the sort of thing users can set according to their own preferences.

The way to avoid the chroma issue altogether would be to stream the data as RGB 4:4:4, which has no chroma subsampling, or as YCrCb 4:4:4. The ADV7341 encoder I am using understands RGB 4:4:4, but the ADV7184 decoder cannot generate it and does not have a wide enough pixel bus.