bjm at lvcm
Jan 23, 2007, 3:42 PM
Daniel Kristjansson wrote:
> On Fri, 2007-01-19 at 12:21 +0100, Hans Verkuil wrote:
>> Note regarding ivtv VBI support: it is flaky in all current ivtv
>> versions. Basically when VBI capturing is on it is possible for MPEG or
>> VBI data to turn up in the wrong stream. This is a firmware bug for
>> which only the current ivtv subversion trunk code contains a
>> workaround. This code will become available Real Soon Now for kernels
>> 2.6.18 and up. It is extremely unlikely that it will ever be backported
>> to older kernels since it required a huge interrupt/DMA rewrite in
>> the driver.
>> Due to these problems I would recommend that for ivtv VBI is turned off
>> by default in MythTV if possible.
But thousands of people use it every day right now. Imposing a
rule that they are not allowed to use VBI because you say it is
imperfect is blowing this way out of proportion.
> Do you mean VBI embedding in the MPEG stream should be turned
> off by default, or the VBI device (which we don't use) should
> be disabled? Also can you recommend some way we can detect if
> we have an ivtv driver with a non-buggy VBI?
I would first want to see a reproducible test case of something
that does or does not work depending on if VBI is "turned off"
(whatever that means). If there is something that this fixes
and isn't throwing out the baby with the bath water, there should
be an option to turn it off per input rather than global (the
current setting affects all cards of all types on all backends
and not only affects recording but tells the frontend that they
are not allowed to try to read VBI). The default for the per
input option should be "on" so as not to impose a change on
users, but allow them to turn VBI off to address these issues
that I haven't seen myself.
>> Another cause of problems in MythTV is the default resolution of 480x480
>> instead of 720x480/576. The MPEG encoder introduces a small amount of
>> ghosting when it has to scale.
First, if this is a reference to the doubled-image broken frame,
I've already stated that turning off VBI had no impact on the
problem.
Next, "cause of problems" is an odd characterization. If ivtv
can't record a sample rate other than 720 samples per scan line
then it is clearly a bug in the driver/firmware/compression
hardware. Bt8x8 absolutely can digitize analog NTSC/PAL at any
horizontal sample rate and the bttv driver has no issues doing so.
Lastly, as has been covered many, many times here over the years,
NTSC/PAL is not very high resolution. As the electron beam sweeps
across the screen, the analog resolution of broadcast equipment
and TV sets is in a range equivalent to 300-500 vertical scan
lines.
The leading commercial DVRs use 544x480 for their highest quality.
Because of the law of diminishing returns, higher sample rates are
overkill and in a blindfold test, you can't tell the difference in
recorded broadcast television at higher rates.
But here's the downside. If you increase the resolution (for no
benefit) and leave the compression bit rate the same so that the
recorded files are the same size, there will be a lower bit per
pixel ratio. This means that there will be more lossiness in
the compression. The result is that, because of the compression
artifacts, there is less(!) detail on the screen and it is lower
quality even though the "resolution" numbers are set higher. In
order to counteract this, the bitrate would need to increase in
proportion to the sampling dimensions. This results in bigger
files but no better quality than at lower resolutions.
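To put rough numbers on that trade-off (a back-of-the-envelope sketch; the 4.5 Mbit/s bitrate and 29.97 fps are illustrative assumptions, not settings from this post):

```shell
# bits per pixel = bitrate / (width * height * framerate)
for res in 480x480 544x480 720x480; do
    w=${res%x*} h=${res#*x}
    awk -v w="$w" -v h="$h" 'BEGIN {
        printf "%dx%d: %.3f bits/pixel\n", w, h, 4500000 / (w * h * 29.97)
    }'
done
```

At the same bitrate, 720x480 gets only two thirds of the bits per pixel that 480x480 does, which is exactly where the extra compression artifacts come from.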
So what is "720"? It is an arbitrary maximum, already well past
the point of overkill and waste. It could just as well have been
1024 or 2048, creating huge files and stressing system resources
for a picture that looks exactly like 544x480 or even 480x480
when displayed on a TV set.
> I'll fix this, it's just a hangover from frame buffer recording
> profiles. I always change this to 720x480 when I set up a MythTV
> system.
Ah, here's the reason I wanted to reply to this message ;-)
In "2954-v3.patch" the "1178" appears to look through the
codecparams table and deletes existing settings.
Please, please do not do this.
I can see changing the default for new systems for IVTV only
(do not change the bttv default from 480x480) as a workaround
for these crappy cards. However, existing users may have tweaked
their settings and are fine with what they have now and do not
need to be ambushed by a change they did not ask for. As above,
this may cause more artifacts and an overall lower quality
picture if they don't increase the bitrate (and, they may not
want to increase the bitrate). Anyone is welcome to follow a
suggestion of trying 720 but this should not be blindly imposed
on all existing users.
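If the patch does go in anyway, a one-line safeguard before upgrading (the mythconverg database and codecparams table names are from this discussion; add -u/-p credentials for your setup):

```shell
# Dump the current encoder settings so they can be restored if an
# upgrade wipes the codecparams table
mysqldump mythconverg codecparams > codecparams-backup.sql
```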
PS I've been using myth since 0.5 or 0.6 and I got a PVR-250
before tmk started what became the ivtv driver. There was a
time around 0.10 or earlier where I was the only one doing
testing for ivtv cards to verify things before releases. In the
years since, there hasn't been a day when I honestly believed
that hardware encoders do a better job than software encoders.
I never mention this on the lists because the "common knowledge"
belief is that hardware encoders must be better and I don't need
lynch mobs taking shots at me and I don't want to do hand-holding
support for tuning software encoders. However, due to several
more aggravations recently, I'll come out and say it:
Hardware encoders suck!
Clearly any form of digital broadcast is always going to be
better than digitizing analog broadcasts so discussion of
"quality" is like comparing phonograph records to cassette
tapes when we have CDs.
Robert once made a comment a few years ago that stuck with me
where he believed the picture was better from PVR cards because
the chips were "mechanically" better. After all, if you look
at an ivtv card output with default settings and a bttv card
output with default settings, the ivtv picture is much better.
Cory has posted several times that hardware encoders are better
because the encoder chip on his card is better than the bt8x8
of his bttv card. He then goes on to compare the characteristics
of these two different chips as they digitize. This, of course,
is comparing apples and oranges. I have a bt8x8 hardware encoder
and bytesex.org has drivers for software encoders with different
chips.
Given the same chip on both types of cards, the software approach
has to be better because you have more control over the picture
and compression. The reason bttv looks worse is that there are
bad defaults and the hardware encoders address these right out
of the box.
There is a well-known bug in the bt8x8 chip where the max luma
is set to 235 rather than 253 (duh!). This makes the colors
weak in brighter areas and gives the picture a dull, washed
out look. Raising the saturation doesn't fix this and just makes
the picture look weird. Myth addresses this with the "adjust"
filter and this can be used to make further tweaks to the
chroma and luma ranges before compression. In fact, you can
apply any current or future filter before compressing with
software encoding whereas hardware encoding can only use
filters built into the card.
The default brightness and contrast of 50% (32768) are way off.
The contrast needs to be cut significantly and the brightness
raised so the whitecrush and blackcrush will fit in range. The
default makes near white appear to be the same as completely
white.
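Both can be set the same way as the other attributes (the percentages here are illustrative starting points, not my actual values; tune by eye):

```shell
# Cut contrast and raise brightness so whitecrush/blackcrush fit in range
# (illustrative values; adjust per card and input)
v4lctl -c /dev/video0 setattr contrast 40%
v4lctl -c /dev/video0 setattr bright 60%
```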
The bttv driver includes options that improve the image that
AFAIK are not available for ivtv. "Automatic Gain Control" or
agc normalizes the luma levels between cards, and perhaps
channels, that are out of whack. This gives me a more consistent
picture from recording to recording. I don't see this option with
ivtv but it does do something annoying with the picture settings.
Often, about 2 seconds after a scene change, the brightness will
instantly shift down a small amount, and it is noticeable.
Bttv has a comb filter that fixes that annoying multi-colored
flicker you see on striped shirts or small white lettering.
Ivtv does this too but the bttv filter set to 2 seems to do a
better job.
Bttv has an option to use "full_luma_range" which spreads the
digital range out to 0-255 to get the maximum contrast rather
than the normal limited range.
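In other words (a toy sketch of the mapping, not what the driver literally computes; the hardware does this per pixel):

```shell
# Map studio-swing luma (16..235) onto the full 0..255 range, clamped
expand_luma() {
    awk -v y="$1" 'BEGIN {
        v = (y - 16) * 255 / 219
        if (v < 0) v = 0
        if (v > 255) v = 255
        printf "%d\n", v
    }'
}
expand_luma 16    # -> 0
expand_luma 235   # -> 255
```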
Bttv has "coring" that does a digital cutoff for the black
level; this cleans up noise and artifacts and makes black truly
black. The contrast and brightness need to be set so that dark
grey or near black are above this point but black areas are
cleaner and compression is more efficient.
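The cutoff idea, as a toy sketch (the threshold value is an assumption; the real one is a chip register, not something computed in software):

```shell
# Coring: force luma below a cutoff to true black
core_luma() {
    awk -v y="$1" -v cut="${2:-32}" 'BEGIN {
        if (y < cut) y = 0
        print y
    }'
}
core_luma 20    # -> 0
core_luma 80    # -> 80
```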
Bttv has a "uv ratio" which is another axis of tint. I find
that 52% makes yellow and gold look more vivid and makes
flesh tones more natural.
Then there is the compression. For a given filesize, ffmpeg
MPEG-4 certainly has fewer artifacts than MPEG-2 from the ivtv
card at the same resolution and bitrate. The MPEG-4
artifacts tend to appear smooth like blurring whereas the MPEG-2
artifacts look like harsh little boxes. Ffmpeg has made several
improvements over the past four years. The algorithms burned
into the PVR-250 card are exactly the same as the day I bought it.
Meanwhile, the bttv driver is fully functional, done, and
untouched since April 21st, 2004. Ivtv is still a moving target
and will continue to be for the foreseeable future.
For bttv, I set these modprobe options:
options bttv chroma_agc=1 combfilter=2 full_luma_range=1 coring=1 video_nr=0 vbi_nr=0
The last two force the devices to be /dev/video0 and /dev/vbi0
(these are to defeat udev, which I despise even more than ivtv ;-).
I also set these options when a backend is started:
v4lctl -c /dev/video0 setattr 'mute' off > /dev/null 2>&1
v4lctl -c /dev/video0 setattr 'uv ratio' 52 > /dev/null 2>&1
v4lctl -c /dev/video0 setattr 'whitecrush upper' 253 > /dev/null 2>&1
v4lctl -c /dev/video0 setattr 'luma decimation filter' on > /dev/null 2>&1
My Default profile for myth is 496x480, 4800 bitrate with
"Scale bitrate for frame size" turned on, quality 0, 15, 3,
and the four encoding options turned off (two of these are
entirely broken). My picture settings for the coax input
are contrast = 26100 and brightness = 39400. For s-video
from a STB contrast = 26300 and brightness = 43350 (YMMV).
I also use "quickdnr" for channels with poor signal to clean
them up before compression.
With these settings I get recordings of about 1.7GB per hour
that look great and better than any recording I've seen from
ivtv. I keep my ivtv card as the last choice with lower input
priority. It sometimes records. I usually regret it =).
- Playback startup is slow (as is changing to this input in
live TV).
- ~10% to 20% of the time, audio sync is off by about 0.2sec.
- The texture leans toward grainy and is worse with noise in the
signal. (Oh, and sharpness. Sharpness is adding controlled
noise to the signal to make edges between light and dark areas
overshoot, making the edge look more abrupt (ya know, that black
circle of distortion around a golf ball in PGA coverage). Ivtv
output seems to have sharpness added but no option. This makes
the image more harsh and annoying and I can't find a way to turn
it off.)
- Harsh motion artifacts. Really bad if the black level is too
high and noise near black is being compressed.
- One or two seconds to sync A/V after each seek.
- High speed FF/Rew can often freeze if used for more than a
few seconds.
I don't have any of these problems when software encoding.
The main issue with software is that frames will be dropped
if the CPU is pegged. I don't have problems with this because
I know not to run a compiler on a machine while it is recording.
The parameters above seem to use about the equivalent CPU time
of a ~700MHz AMD. I used Duron and Athlon 1.3GHz chips for
years, and with a 2000, 2400, 3000 or more it is absolutely no
problem. You can hardly buy a chip slower than 2GHz these days,
and if quality and reliability are the goal, CPU shouldn't be
an issue for recording with bttv.
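One way to hedge against a pegged CPU during a recording (a sketch; mythbackend as the process name and the priority values are assumptions for a typical setup):

```shell
# Give the recording backend more scheduling priority than batch work
# (negative niceness requires root)
renice -n -5 -p "$(pidof mythbackend)"
# ...or run the batch work (compiles, transcodes) at the lowest priority:
nice -n 19 make -j2
```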
So, given that bttv or ivtv pale in comparison to HDTV, I
honestly believe that I get better looking video and a lot fewer
hassles with software encoding, and I will use bttv to record
NTSC for the foreseeable future.
mythtv-dev mailing list
mythtv-dev [at] mythtv