FPVLAB


Thread: FPV max resolution?

  1. #1

    FPV max resolution?

Does anyone know off the top of their head the max resolution you can transmit via our analog VTxs? This is under absolute best conditions, obviously. I vaguely remember reading somewhere that it's around 360p, or 640x480? Any help would be gladly appreciated!

  2. #2
There is no hard-set maximum in the video standard itself; the maximum comes from the components in use: VTx/VRx and viewing device. Most viewing devices are very low resolution and have to downscale even standard 480i/576i composite video.

Practically all the cheap DVD/GPS/navigation/small-TV screens out there are really only 240p (not 480). What's funny is that many of the 480p screens don't actually deinterlace the image; they just display the interlaced 240-line fields. Very few commonly found goggles have much resolution at all (none of the FPV ones I'm aware of, though I don't claim to have handled them all). For example, Fatshark has long claimed 640x480 LCDs in each eye and 922,000 pixels. Yeah, right; more likely 640x234. Unless you feed the source to a high-resolution TV/monitor that WON'T rescale the image, you're always losing a lot of quality.

  3. #3
    Quote Originally Posted by Toysrme View Post
there is no hard set maximum to the actual video standard itself...
So, based on a common Tx and Rx like ImmersionRC, what would these two components limit the resolution to?

  4. #4
    Navigator
    Join Date
    Aug 2011
    Location
    Brazil
    Posts
    685
    Here's the long answer...

It's not measured in resolution, but in scan or television lines. Television lines are considered a measure of "image quality": http://en.wikipedia.org/wiki/Analog_television
You could also call this the information transferred per frame.

Most cameras are NTSC or PAL standard, producing 525 or 625 scan lines respectively. Some cameras, like the TBS69, are rated at 690 TV lines, so they offer superior image quality. More is explained here: http://www.securitycameraking.com/se...vl-resolution/
That also states the importance of the receiver equipment.

690 lines on a 320x200 display looks very sharp. 690 lines on an HD TV clearly shows the pixelation that occurs in the transfer. Our analog signals map well onto a 640x480 screen. A 320x200 screen would have compressed the image; an HD resolution would show pixelation, because the source signal doesn't have enough information to provide the relevant bits to each pixel.

Analog signals, like digital signals, are limited in how much information they can transfer by the choices made for "bandwidth" and "refresh rate" (frame rate). It's easier to explain with pixels, so let's assume every analog signal discussed hereafter is perfectly mapped to a digital image without losing or compressing information. When you transmit an image of alternating on/off pixels, you're transmitting the maximum amount of information a single image can hold, because the signal swings from fully on to fully off as fast as possible.

Now, when you transfer 30 'frames' per second (a refresh rate of 30 Hz in the analog world), you can easily calculate how much time you have to produce a single image: 33 ms. Because you can't stop time, you must choose some method where a 'signal' alternates between different states to transfer the information. This is a sequence of horizontal video lines: http://en.wikipedia.org/wiki/File:Video-line.svg
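That timing budget is easy to check yourself. A quick sketch (Python; the 525-line figure is the NTSC scan-line total mentioned above, not something specific to any FPV gear):

```python
# Frame and line timing for 30 Hz analog video.
FRAME_RATE = 30          # frames per second (NTSC-style refresh)
LINES_PER_FRAME = 525    # total scan lines in an NTSC frame

frame_time_ms = 1000.0 / FRAME_RATE                    # time budget for one frame
line_time_us = frame_time_ms * 1000 / LINES_PER_FRAME  # time budget for one scan line

print(f"one frame: {frame_time_ms:.1f} ms")  # one frame: 33.3 ms
print(f"one line:  {line_time_us:.1f} us")   # one line:  63.5 us
```

That ~63.5 µs per line is the window into which all of a line's horizontal detail has to fit, which is why the link's bandwidth matters so much.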

Analog signal transmission is trickier than it appears. You can see that the line has a couple of "plateaus" in it where some information could perhaps be placed. They're there because the signal above is how it's designed, not necessarily how it's actually transferred in the real world. In the real world you won't typically see "step" signals like that; the signal always has a slope. This is why a bit of time is left to let the signal "settle" before accurate measurements are taken (for example, the color burst).

Onwards... so one frame is transferred in 33 ms. The luminance and chrominance signals in the picture above change their amplitude over time; what you see there is the 'time domain' view of the signal. You could also decompose it into the frequencies that make up the same signal, called the 'frequency domain' view (obtained through a Fourier transform). This view shows you which frequencies are used to generate the signal. It just so happens that when you want to squeeze more information into an analog signal in a given period of time, you need to vary the signal faster over time, which increases the highest frequency occurring in that signal.
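You can see that relationship numerically with NumPy's FFT. A sketch (the two sine waves are just illustrative stand-ins for a slowly and a quickly varying signal, not real video):

```python
import numpy as np

# Sample a "slow" and a "fast" signal over the same one-second window
# and compare the dominant frequency in each spectrum.
fs = 1000                        # sample rate, Hz
t = np.arange(0, 1, 1 / fs)      # one second of samples
slow = np.sin(2 * np.pi * 5 * t)   # varies 5 times per second
fast = np.sin(2 * np.pi * 50 * t)  # varies 50 times per second

def dominant_frequency(signal):
    """Return the frequency bin with the most energy (the 'frequency view')."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    return freqs[np.argmax(spectrum)]

print(dominant_frequency(slow))  # 5.0
print(dominant_frequency(fast))  # 50.0
```

The faster-varying signal needs a higher frequency to describe it; squeezing more picture detail into the same 33 ms does exactly the same thing to the video signal.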

Video is a "baseband" signal. This means it starts at 0 Hz, but the highest frequency existing in the signal (see the explanation above) determines its "bandwidth". So the above signal contains frequencies varying from 0 MHz up to x MHz; for baseband video, x is between 4 and 6 MHz. A baseband audio signal is only 20 kHz, so peanuts compared to that.

But wait, that's not all. There are cases where the signal varies a bit too fast (a step), which generates frequencies much higher than the 6 MHz upper limit. If you attempted to transmit the baseband signal as is, you'd get a transmitter occupying a very wide part of the spectrum, so the higher frequencies need to be filtered out at baseband. These electronic filters aren't perfect, however; they have a certain slope or roll-off in their filtering. To sufficiently reduce the power at 6 MHz, you'd have to start filtering at 2-4 MHz, and then you'd also lose 10-15% of the higher in-band frequencies. That basically means losing contrast and brightness in the parts of the image with lots of variation.

What the engineers did instead is 'extend' the bandwidth of the signal by a factor of ~3: on our transmitters, video is sent in 20 MHz slots. Now the filter can be activated at 6 MHz, which doesn't attenuate the wanted frequencies, while still getting rid of the higher frequencies beyond that which would interfere with signals in the next 20 MHz slot. All that's left is to modulate the baseband video onto a 2.4, 5.8, or 1.2 GHz carrier and transmit.
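The roll-off being described can be illustrated with the textbook first-order low-pass response (a sketch only; real transmitter filters are sharper multi-pole designs, and the 6 MHz corner is just the figure from this post):

```python
import math

def first_order_lowpass_gain(f_hz, cutoff_hz):
    """Magnitude response of an ideal first-order RC low-pass filter."""
    return 1.0 / math.sqrt(1.0 + (f_hz / cutoff_hz) ** 2)

cutoff = 6e6  # filter corner placed at 6 MHz
for f in (2e6, 4e6, 6e6, 12e6, 20e6):
    gain = first_order_lowpass_gain(f, cutoff)
    print(f"{f / 1e6:4.0f} MHz: gain {gain:.2f}")
# Prints gains of 0.95, 0.83, 0.71, 0.45, 0.29: even well below the
# corner the filter already eats into the signal, which is why the corner
# is pushed out rather than placed inside the video band.
```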

Thus, you get a video signal that's essentially 4-6 MHz in bandwidth, contained in a "20 MHz bandwidth slot" on the transmitter to separate the signals. The only way to increase the "resolution" of the eventual digital image is to increase the frequencies used for 'scanning' the image, which significantly increases bandwidth. The only other option is to reduce the frame rate. Those are the choices.

A rule of thumb for how much bandwidth you need for an analog signal that is more or less "correctly" transferred onto a digital display of x*y pixels, without losing or compressing information, is this:

    SF = (( width * height * refreshrate ) / 2) * 3

width = number of horizontal pixels, height = number of lines, refresh rate = frame rate.

    Example:
    (640 * 480 * 30 / 2) * 3 = 13.8 MHz.

Thus, to transfer 1280*720@30 analog video, you'd need (assuming equipment that could actually do this):
    (1280 * 720 * 30 / 2) * 3 = 41.5 MHz
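The rule of thumb can be wrapped in a small helper to reproduce the numbers above (Python; this is just the post's arithmetic, nothing more):

```python
def analog_bandwidth_mhz(width, height, refresh_rate):
    """Rule-of-thumb baseband bandwidth in MHz for mapping an analog
    signal onto a width x height digital display at the given frame rate.
    The /2 reflects two pixels per signal cycle (one on/off alternation);
    the *3 is the extension factor described in the post."""
    return (width * height * refresh_rate / 2) * 3 / 1e6

print(analog_bandwidth_mhz(640, 480, 30))   # 13.824  (~13.8 MHz)
print(analog_bandwidth_mhz(1280, 720, 30))  # 41.472  (~41.5 MHz)
```

Plugging in a 20 MHz slot and 30 fps, the same arithmetic run backwards shows why standard-definition is roughly where analog FPV tops out.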

  5. #5
Thanks for this great knowledge! This will definitely become one of those often-referenced posts.

  6. #6
You mentioned compressing. I've heard that compression can create laggy video, but would it be possible to do a slight compression to boost resolution a little beyond what the analog bandwidth allows, or would this make video transmission too slow for FPV? Also, do all VTx/VRx for FPV (ImmersionRC, Bosscam, Lawmate, Racewood) offer roughly the same resolution, or in your opinion is one better than the others?

  7. #7
    Navigator
    Join Date
    Aug 2011
    Location
    Brazil
    Posts
    685
Whoops. "Compression" here was used in the context of resizing the image. It's the same effect as projecting a 4x4 thing onto a 1x1 surface, or "zooming out" of an image on a computer. That's what I meant in this context.

You could theoretically compress a bit, but I doubt the complexity of doing this and building your own hardware is worth the couple of extra pixels you could squeeze out. Compression always introduces a little latency, because you need to wait for data that's still coming in to use it effectively, on top of having to process the information together. It makes sense when bandwidth is a costly resource.

    Anyway, if you do it, you also need to get access to display devices that can decode this (or goggles). I don't see that happening.

  8. #8
    Crash Test Dummy
    Join Date
    Mar 2012
    Location
    Los Angeles
    Posts
    51
    Great post! Thanks for taking the time.

    Quote Originally Posted by radialmind View Post
Here's the long answer...

  9. #9
    And the short answer?

For those of us less tech savvy, using easily obtainable/standard equipment, what's the maximum number of resolved lines in each direction (also known as resolution, assuming of course you're not using 320x200 goggles or something ridiculous)? Basically what the OP asked; I don't feel it's really been answered here.

I did a quick test using patterns such as this: http://www.gpsinformation.org/jack/iso-gd-cb-955s.jpg (or just Google "resolution test picture"), and found my setup managed just below 400x400 resolution. It wasn't an accurate test by any means, but it gives a good indication.
Equipment:
- 600TVL Sony Super HAD from SC2K
- ImmersionRC 600mW 5.8 GHz (whip antenna)
- ImmersionRC Uno v2 receiver (whip antenna)
- 1080p HDTV (can't remember brand/model) with composite input

Is higher doable? I certainly hope so.

  10. #10
    Navigator rdbell's Avatar
    Join Date
    Jul 2011
    Location
    Nashville, TN
    Posts
    33
    Quote Originally Posted by radialmind View Post
    Here's the long answer...
Thanks for that great answer! I just took a signal processing class last semester, so at least some of it made sense. I'd been wondering how a video signal is transferred.


