
Mailing List Archive: MythTV: Dev

IVTV VBI reading

 

 



g8ecj at gilks

Jan 18, 2007, 1:31 AM

Post #1 of 28 (15909 views)
IVTV VBI reading

Greetings

Is it possible to turn off the reading of /dev/vbi0 from an ivtv stream to
alleviate the ivtv driver problems if the capture resolution is not 'full
screen' (720x576 here in PAL country)? I'd like to at least check that not
reading the vbi stream gets rid of the video artifact hassles that the
last few versions of the driver have suffered.

Cheers

--
Robin Gilks



ijr at case

Jan 18, 2007, 9:35 AM

Post #2 of 28 (15744 views)
Re: IVTV VBI reading [In reply to]

On Thursday 18 January 2007 4:31 am, Robin Gilks wrote:
> Greetings
>
> Is it possible to turn off the reading of /dev/vbi0 from an ivtv stream to
> alleviate the ivtv driver problems if the capture resolution is not 'full
> screen' (720x576 here in PAL country). I'd like to at least check that not
> reading the vbi stream gets rid of the video artifact hassles that the
> last few versions of the driver have suffered.

We don't read from /dev/vbiX with ivtv cards. It asks the driver to embed the
stream in the mpeg data. You can disable _that_ by just setting the 'VBI
Format' setting in mythtv-setup to 'none'.

Isaac


bjm at lvcm

Jan 18, 2007, 12:18 PM

Post #3 of 28 (15738 views)
Re: IVTV VBI reading [In reply to]

Isaac Richards wrote:
> On Thursday 18 January 2007 4:31 am, Robin Gilks wrote:
>> Greetings
>>
>> Is it possible to turn off the reading of /dev/vbi0 from an ivtv stream to
>> alleviate the ivtv driver problems if the capture resolution is not 'full
>> screen' (720x576 here in PAL country). I'd like to at least check that not
>> reading the vbi stream gets rid of the video artifact hassles that the
>> last few versions of the driver have suffered.

I assume this is the issue where frames will be flashed that
are misaligned near the bottom with a doubled image offset by
the delta between 720 and the recording width.

> We don't read from /dev/vbiX with ivtv cards. It asks the driver to embed the
> stream in the mpeg data. You can disable _that_ by just setting the 'VBI
> Format' setting in mythtv-setup to 'none'.

In a quick test, this doesn't seem to affect the driver.

I stopped the MBE, ran mythtv-setup and set "VBI format:" to
"None", saved the restarted. I set a record rule with the
preferred input for my ivtv card which is last because it is
so lame compared to well tuned bttv recording.

"Top" shows that the ivtv-enc-vbi thread is running (and there
are still broken frames but I don't yet know if they are related).
Pressing "T" reports "No captions". I stopped, reset to NTSC
Closed Caption and restarted the MBE. Now "T" shows the captions
from the recording made while it was set to "None".

-- bjm


g8ecj at gilks

Jan 18, 2007, 4:23 PM

Post #4 of 28 (15734 views)
Re: IVTV VBI reading [In reply to]

> In a quick test, this doesn't seem to affect the driver.
>
> I stopped the MBE, ran mythtv-setup and set "VBI format:" to
> "None", saved the restarted. I set a record rule with the
> preferred input for my ivtv card which is last because it is
> so lame compared to well tuned bttv recording.
>
> "Top" shows that the ivtv-enc-vbi thread is running (and there
> are still broken frames but I don't yet know if they are related).
> Pressing "T" reports "No captions". I stopped, reset to NTSC
> Closed Caption and restarted the MBE. Now "T" shows the captions
> from the recording made while it was set to "None".

I had the same result - the ivtv-enc-vbi thread is still running during
recording after a restart of the backend having set VBI format to None. I
haven't tried unloading the ivtv driver and starting it fresh though...


--
Robin Gilks




ijr at case

Jan 18, 2007, 4:40 PM

Post #5 of 28 (15721 views)
Re: IVTV VBI reading [In reply to]

On Thursday 18 January 2007 7:23 pm, Robin Gilks wrote:
> > In a quick test, this doesn't seem to affect the driver.
> >
> > I stopped the MBE, ran mythtv-setup and set "VBI format:" to
> > "None", saved the restarted. I set a record rule with the
> > preferred input for my ivtv card which is last because it is
> > so lame compared to well tuned bttv recording.
> >
> > "Top" shows that the ivtv-enc-vbi thread is running (and there
> > are still broken frames but I don't yet know if they are related).
> > Pressing "T" reports "No captions". I stopped, reset to NTSC
> > Closed Caption and restarted the MBE. Now "T" shows the captions
> > from the recording made while it was set to "None".
>
> I had the same result - the ivtv-enc-vbi thread is still running during
> recording after a restart of the backend having set VBI format to None. I
> haven't tried unloading the ivtv driver and starting it fresh though...

It may be cached in the driver (we don't explicitly _disable_ vbi
encapsulation if it's set to None), or they just run that thread all the
time.

Isaac


g8ecj at gilks

Jan 18, 2007, 11:37 PM

Post #6 of 28 (15745 views)
Re: IVTV VBI reading [In reply to]

> It may be cached in the driver (we don't explicitly _disable_ vbi
> encapsulation if it's set to None), or they just run that thread all the
> time.
>
> Isaac

I can confirm that unloading and reloading the ivtv driver stops the
vbi-enc process from coming alive.

BTW - is it reasonable to look for VBI data from the s-video or composite
input of a pvr150/500 card? Perhaps just turn VBI on when a channel on a
tuner input is selected?

--
Robin Gilks



bjm at lvcm

Jan 19, 2007, 12:57 AM

Post #7 of 28 (15752 views)
Re: IVTV VBI reading [In reply to]

Robin Gilks wrote:
>
>> It may be cached in the driver (we don't explicitly _disable_ vbi
>> encapsulation if it's set to None), or they just run that thread all the
>> time.
>>
>> Isaac
>
> I can confirm that unloading and reloading the ivtv driver stops the
> vbi-enc process from coming alive.

Right but this doesn't help with the broken frames in any way.

-- bjm


g8ecj at gilks

Jan 19, 2007, 1:36 AM

Post #8 of 28 (15729 views)
Re: IVTV VBI reading [In reply to]

> Robin Gilks wrote:
>>
>>> It may be cached in the driver (we don't explicitly _disable_ vbi
>>> encapsulation if it's set to None), or they just run that thread all
>>> the
>>> time.
>>>
>>> Isaac
>>
>> I can confirm that unloading and reloading the ivtv driver stops the
>> vbi-enc process from coming alive.
>
> Right but this doesn't help with the broken frames in any way.
>
> -- bjm

I was going to reboot tomorrow into a 2.6.18 kernel to check this - I'm
on 2.6.16 with ivtv-0.6.6 at present to avoid the tearing pictures...

Looks like it's native resolution then - no big deal, just a tweak of
recording profiles.

--
Robin Gilks




hverkuil at xs4all

Jan 19, 2007, 3:21 AM

Post #9 of 28 (15722 views)
Re: IVTV VBI reading [In reply to]

On Friday 19 January 2007 01:40, Isaac Richards wrote:
> On Thursday 18 January 2007 7:23 pm, Robin Gilks wrote:
> > > In a quick test, this doesn't seem to affect the driver.
> > >
> > > I stopped the MBE, ran mythtv-setup and set "VBI format:" to
> > > "None", saved, then restarted. I set a record rule with the
> > > preferred input for my ivtv card which is last because it is
> > > so lame compared to well tuned bttv recording.
> > >
> > > "Top" shows that the ivtv-enc-vbi thread is running (and there
> > > are still broken frames but I don't yet know if they are related).
> > > Pressing "T" reports "No captions". I stopped, reset to NTSC
> > > Closed Caption and restarted the MBE. Now "T" shows the captions
> > > from the recording made while it was set to "None".
> >
> > I had the same result - the ivtv-enc-vbi thread is still running
> > during recording after a restart of the backend having set VBI
> > format to None. I haven't tried unloading the ivtv driver and
> > starting it fresh though...
>
> It may be cached in the driver (we don't explicitly _disable_ vbi
> encapsulation if it's set to None), or they just run that thread all
> the time.

It's cached by the driver (it's a persistent setting), so it needs to be
explicitly turned off.
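
For illustration, a minimal sketch of what an explicit turn-off could
look like through the V4L2 sliced VBI API (a sketch only, assuming the
driver honors an empty service set; types are from linux/videodev2.h):

#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

/* Sketch: request no sliced VBI services so the persistent
   driver-side setting is explicitly cleared. */
static int disable_sliced_vbi(int fd)
{
    struct v4l2_format fmt;

    memset(&fmt, 0, sizeof(fmt));
    fmt.type = V4L2_BUF_TYPE_SLICED_VBI_CAPTURE;
    fmt.fmt.sliced.service_set = 0;
    return ioctl(fd, VIDIOC_S_FMT, &fmt);
}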

Note regarding ivtv VBI support: it is flaky in all current ivtv
versions. Basically when VBI capturing is on it is possible for MPEG or
VBI data to turn up in the wrong stream. This is a firmware bug for
which only the current ivtv subversion trunk code contains a
workaround. This code will become available Real Soon Now for kernels
2.6.18 and up. It is extremely unlikely that it will ever be backported
to older kernels since it required a huge interrupt/DMA rewrite in
ivtv.

Due to these problems I would recommend that for ivtv VBI is turned off
by default in MythTV if possible.

Another cause of problems in MythTV is the default resolution of 480x480
instead of 720x480/576. The MPEG encoder introduces a small amount of
ghosting when it has to scale. I also do not understand why MythTV
would want to scale when capturing MPEG; it can only degrade picture
quality since MPEG encoders are generally optimized for full resolution
captures. Scaling will also break VBI capturing on a PVR150/500 for
which no easy solution exists. It would be nice if MythTV would default
to full resolution if possible.

It would also be very nice if the ivtv header included in MythTV would
replace the very old incorrect ioctls with the correct ones:

#ifdef __FreeBSD__
#define IVTV_IOC_G_CODEC _IOR ('V', 73, struct ivtv_ioctl_codec)
#define IVTV_IOC_S_CODEC _IOWR ('V', 74, struct ivtv_ioctl_codec)
#else
#define IVTV_IOC_G_CODEC 0xFFEE7703
#define IVTV_IOC_S_CODEC 0xFFEE7704
#endif

The bit under __FreeBSD__ should be the default as well for linux. In
fact, all IVTV ioctls that are just a hex number should be replaced
with the correct _IO defines. Only drivers < 0.2.0 still need these and
those drivers are ANCIENT. The problem is that with newer drivers I
still need to check for these old ioctls to prevent them from being
(mis)handled by the v4l2 subsystem. These checks WILL disappear once
ivtv enters the kernel. So I would appreciate it very much if this
header could be updated.
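
Concretely, the codec ioctls quoted above would then collapse to the
__FreeBSD__ variants for all platforms, along the lines of:

/* Corrected defines, valid for ivtv >= 0.2.0 on Linux and FreeBSD;
   the raw hex fallbacks are dropped entirely. */
#define IVTV_IOC_G_CODEC _IOR ('V', 73, struct ivtv_ioctl_codec)
#define IVTV_IOC_S_CODEC _IOWR('V', 74, struct ivtv_ioctl_codec)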

Thanks!

Regards,

Hans


danielk at cuymedia

Jan 19, 2007, 6:51 AM

Post #10 of 28 (15723 views)
Re: IVTV VBI reading [In reply to]

On Fri, 2007-01-19 at 12:21 +0100, Hans Verkuil wrote:
> Note regarding ivtv VBI support: it is flaky in all current ivtv
> versions. Basically when VBI capturing is on it is possible for MPEG or
> VBI data to turn up in the wrong stream. This is a firmware bug for
> which only the current ivtv subversion trunk code contains a
> workaround. This code will become available Real Soon Now for kernels
> 2.6.18 and up. It is extremely unlikely that it will ever be backported
> to older kernels since it required a huge interrupt/DMA rewrite in
> ivtv.
>
> Due to these problems I would recommend that for ivtv VBI is turned off
> by default in MythTV if possible.

Do you mean VBI embedding in the MPEG stream should be turned
off by default, or the VBI device (which we don't use) should
be disabled? Also can you recommend some way we can detect if
we have an ivtv driver with a non-buggy VBI?

> Another cause of problems in MythTV is the default resolution of 480x480
> instead of 720x480/576. The MPEG encoder introduces a small amount of
> ghosting when it has to scale.
I'll fix this, it's just a hangover from frame buffer recording
profiles. I always change this to 720x480 when I set up a MythTV
machine.

> It would also be very nice if the ivtv header included in MythTV would
> replace the very old incorrect ioctls with the correct ones:
I just went through the header and did this, the only
possible problem I saw was with these ioctls:

#define IVTV_IOC_PAUSE_BLACK _IO ('@', 35)
#define IVTV_IOC_STOP _IO ('@', 36)

#define IVTV_IOC_S_VBI_MODE _IOWR('@', 35, struct ivtv_sliced_vbi_format) /* old ioctl */
#define IVTV_IOC_G_VBI_MODE _IOR ('@', 36, struct ivtv_sliced_vbi_format) /* old ioctl */

Is this a conflict? I'm assuming the first two are for
PVR-350 output, and the VBI ones are for recording.

> The bit under __FreeBSD__ should be the default as well for linux. In
> fact, all IVTV ioctls that are just a hex number should be replaced
> with the correct _IO defines. Only drivers < 0.2.0 still need these and
> those drivers are ANCIENT.
I have no problem with dropping support for pre-0.2.0 drivers
at this point.

> The problem is that with newer drivers I
> still need to check for these old ioctls to prevent them from being
> (mis)handled by the v4l2 subsystem. These checks WILL disappear once
> ivtv enters the kernel. So I would appreciate it very much if this
> header could be updated.
Will there be some way to detect the non-backward compatible
drivers? We will have to treat it like the V4L vs. V4L2
transition and will need some way to detect which API to
use...

-- Daniel



hverkuil at xs4all

Jan 19, 2007, 8:36 AM

Post #11 of 28 (15729 views)
Re: IVTV VBI reading [In reply to]

On Friday 19 January 2007 15:51, Daniel Kristjansson wrote:
> On Fri, 2007-01-19 at 12:21 +0100, Hans Verkuil wrote:
> > Note regarding ivtv VBI support: it is flaky in all current ivtv
> > versions. Basically when VBI capturing is on it is possible for
> > MPEG or VBI data to turn up in the wrong stream. This is a firmware
> > bug for which only the current ivtv subversion trunk code contains
> > a workaround. This code will become available Real Soon Now for
> > kernels 2.6.18 and up. It is extremely unlikely that it will ever
> > be backported to older kernels since it required a huge
> > interrupt/DMA rewrite in ivtv.
> >
> > Due to these problems I would recommend that for ivtv VBI is turned
> > off by default in MythTV if possible.
>
> Do you mean VBI embedding in the MPEG stream should be turned
> off by default, or the VBI device (which we don't use) should
> be disabled? Also can you recommend some way we can detect if
> we have an ivtv driver with a non-buggy VBI?

The usage of VBI should be turned off by default. It doesn't matter
whether the VBI is embedded or read straight from /dev/vbi, for the
hardware it's all the same.

Non-buggy VBI will be available in ivtv drivers with version number
(obtain with VIDIOC_QUERYCAP) >= KERNEL_VERSION(0, 10, 0).
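
A sketch of that check (assuming the driver packs its version with the
KERNEL_VERSION macro, as described above; error handling kept minimal):

#include <string.h>
#include <sys/ioctl.h>
#include <linux/version.h>
#include <linux/videodev2.h>

/* Sketch: nonzero if the ivtv driver behind fd reports a version
   that includes the VBI firmware-bug workaround (>= 0.10.0). */
static int ivtv_vbi_ok(int fd)
{
    struct v4l2_capability cap;

    memset(&cap, 0, sizeof(cap));
    if (ioctl(fd, VIDIOC_QUERYCAP, &cap) < 0)
        return 0;
    return cap.version >= KERNEL_VERSION(0, 10, 0);
}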

I hope to make a release candidate available of this version some time
this weekend.

>
> > Another cause of problems in MythTV is the default resolution of
> > 480x480 instead of 720x480/576. The MPEG encoder introduces a small
> > amount of ghosting when it has to scale.
> I'll fix this, it's just a hangover from frame buffer recording
> profiles. I always change this to 720x480 when I set up a MythTV
> machine.

Much appreciated!

>
> > It would also be very nice if the ivtv header included in MythTV
> > would replace the very old incorrect ioctls with the correct ones:
> > replace the very old incorrect ioctls with the correct ones:
> I just went through the header and did this, the only
> possible problem I saw was with these ioctls:
>
> #define IVTV_IOC_PAUSE_BLACK _IO ('@', 35)
> #define IVTV_IOC_STOP _IO ('@', 36)
>
> #define IVTV_IOC_S_VBI_MODE _IOWR('@', 35, struct ivtv_sliced_vbi_format) /* old ioctl */
> #define IVTV_IOC_G_VBI_MODE _IOR ('@', 36, struct ivtv_sliced_vbi_format) /* old ioctl */
>
> Is this a conflict? I'm assuming the first two are for
> PVR-350 output, and the VBI ones are for recording.

Ouch, that's old! The IVTV_IOC_G/S_VBI_MODE are used in pre-ivtv-0.4
versions. Newer versions from ivtv-0.4 onward use the v4l2 sliced VBI
API to specify which VBI types should be captured. The V4L2 sliced VBI
API was introduced in kernel 2.6.14. It is pure luck that MythTV works
with ivtv-0.4: if you turn on VBI embedding and no sliced VBI types are
selected, then ivtv defaults to CC for NTSC and the wide screen signal
for PAL.
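
A sketch of making that selection explicit instead of relying on the
driver fallback (service flags are from the V4L2 sliced VBI API in
kernel 2.6.14 and later):

#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

/* Sketch: explicitly request the services ivtv falls back to:
   Closed Captions on NTSC, widescreen signalling (WSS) on PAL. */
static int select_vbi_services(int fd, int is_pal)
{
    struct v4l2_format fmt;

    memset(&fmt, 0, sizeof(fmt));
    fmt.type = V4L2_BUF_TYPE_SLICED_VBI_CAPTURE;
    fmt.fmt.sliced.service_set =
        is_pal ? V4L2_SLICED_WSS_625 : V4L2_SLICED_CAPTION_525;
    return ioctl(fd, VIDIOC_S_FMT, &fmt);
}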

> > The bit under __FreeBSD__ should be the default as well for linux.
> > In fact, all IVTV ioctls that are just a hex number should be
> > replaced with the correct _IO defines. Only drivers < 0.2.0 still
> > need these and those drivers are ANCIENT.
> I have no problem with dropping support for pre-0.2.0 drivers
> at this point.

Thanks again!

>
> > The problem is that with newer drivers I
> > still need to check for these old ioctls to prevent them from being
> > (mis)handled by the v4l2 subsystem. These checks WILL disappear once
> > ivtv enters the kernel. So I would appreciate it very much if this
> > header could be updated.
> Will there be some way to detect the non-backward compatible
> drivers? We will have to treat it like the V4L vs. V4L2
> transition and will need some way to detect which API to
> use...

Why would you want to check for this? Once you drop the numeric ioctls
it will work fine for all versions from 0.2.0 onwards!

BTW, just to inform you: once ivtv-0.10.0 is done I will proceed with
the final step that is needed to get ivtv into the kernel: the
ivtv-specific MPEG decoding API will be redesigned into a full V4L2
MPEG decoding API. There are also still a few remaining ivtv-specific
MPEG encoding ioctls, those too will be converted (most likely).

Ian Armstrong has been doing a lot of work lately in improving the
framebuffer device for the OSD (used by the PVR350). It is unlikely at
this stage that the few ivtv-specific OSD ioctls will be converted to
the V4L2 API. They will probably be improved a bit, but nothing big.
Almost all OSD functionality can be accessed with the linux framebuffer
API.

Anyway, the API changes will become effective once the driver is merged
into the kernel (2.6.2x where x >= 2 :-) ).

I know it will be a hassle for MythTV, but at least the good news is
that the new API will be V4L2 compliant and is no longer tied to a
single hardware platform.

Regards,

Hans


danielk at cuymedia

Jan 20, 2007, 6:34 AM

Post #12 of 28 (15695 views)
Re: IVTV VBI reading [In reply to]

On Fri, 2007-01-19 at 17:36 +0100, Hans Verkuil wrote:
> On Friday 19 January 2007 15:51, Daniel Kristjansson wrote:
> The usage of VBI should be turned off by default. It doesn't matter
> whether the VBI is embedded or read straight from /dev/vbi, for the
> hardware it's all the same.
>
> Non-buggy VBI will be available in ivtv drivers with version number
> (obtain with VIDIOC_QUERYCAP) >= KERNEL_VERSION(0, 10, 0).

Ok, I'll just disable captions unless it passes this test,
that will avoid the 0.2 VBI ioctl as well.

> > I'll fix this, it's just a hangover from frame buffer recording
> > profiles. I always change this to 720x480 when I set up a MythTV
> > machine.
> Much appreciated!

I have a patch with this that I'm running locally. I just
created a ticket with the patch here:
http://svn.mythtv.org/trac/ticket/2954


> > #define IVTV_IOC_PAUSE_BLACK _IO ('@', 35)
> > #define IVTV_IOC_STOP _IO ('@', 36)
> >
> > #define IVTV_IOC_S_VBI_MODE _IOWR('@', 35, struct ivtv_sliced_vbi_format) /* old ioctl */
> > #define IVTV_IOC_G_VBI_MODE _IOR ('@', 36, struct ivtv_sliced_vbi_format) /* old ioctl */

> Ouch, that's old! The IVTV_IOC_G/S_VBI_MODE are used in pre-ivtv-0.4
> versions. Newer versions from ivtv-0.4 onward use the v4l2 sliced VBI
> API to specify which VBI types should be captured. The V4L2 sliced VBI
> API was introduced in kernel 2.6.14. It is pure luck that MythTV works
> with ivtv-0.4: if you turn on VBI embedding and no sliced VBI types are
> selected, then ivtv defaults to CC for NTSC and the wide screen signal
> for PAL.

We use the V4L2_CAP_SLICED_VBI_CAPTURE ioctl if IVTV_IOC_S_VBI_MODE
ioctl fails. But considering the problems I'll just drop support
for captions unless the user is using the latest drivers. This
isn't in the patch yet. I want to test the changes for this
against the new drivers.
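
(Strictly speaking, V4L2_CAP_SLICED_VBI_CAPTURE is a capability flag
rather than an ioctl; a sketch of probing it before attempting either
API might look like this:)

#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

/* Sketch: does the driver advertise sliced VBI capture at all? */
static int has_sliced_vbi(int fd)
{
    struct v4l2_capability cap;

    memset(&cap, 0, sizeof(cap));
    if (ioctl(fd, VIDIOC_QUERYCAP, &cap) < 0)
        return 0;
    return (cap.capabilities & V4L2_CAP_SLICED_VBI_CAPTURE) != 0;
}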

> > > The problem is that with newer drivers I
> > > still need to check for these old ioctls to prevent them from being
> > > (mis)handled by the v4l2 subsystem. These checks WILL disappear once
> > > ivtv enters the kernel. So I would appreciate it very much if this
> > > header could be updated.
> > Will there be some way to detect the non-backward compatible
> > drivers? We will have to treat it like the V4L vs. V4L2
> > transition and will need some way to detect which API to
> > use...
>
> Why would you want to check for this? Once you drop the numeric ioctls
> it will work fine for all versions from 0.2.0 onwards!
>
> BTW, just to inform you: once ivtv-0.10.0 is done I will proceed with
> the final step that is needed to get ivtv into the kernel: the
> ivtv-specific MPEG decoding API will be redesigned into a full V4L2
> MPEG decoding API. There are also still a few remaining ivtv-specific
> MPEG encoding ioctls, those too will be converted (most likely).

But if these ivtv-specific IOC's change it will affect us since
we will be using them... We also try to support drivers until
they are 2 years old. I don't mind dropping support for something
auxiliary like captions, but I don't want to drop support for
basic recording.

> Ian Armstrong has been doing a lot of work lately in improving the
> framebuffer device for the OSD (used by the PVR350). It is unlikely at
> this stage that the few ivtv-specific OSD ioctls will be converted to
> the V4L2 API. They will probably be improved a bit, but nothing big.
> Almost all OSD functionality can be accessed with the linux framebuffer
> API.

Unfortunately the PVR-350 output support in MythTV is completely
unmaintained, if this stops working we'll probably just drop
PVR-350 output support. It's already pretty broken. (Unless Ian
wants to write some patches :)

> Anyway, the API changes will become effective once the driver is merged
> into the kernel (2.6.2x were x >= 2 :-) ).
>
> I know it will be a hassle for MythTV, but at least the good news is
> that the new API will be V4L2 compliant and is no longer tied to a
> single hardware platform.

Yep, I'm quite happy with the kernel merge. Once a kernel
with ivtv support has been out a couple years we can get
rid of all the ivtv specific calls in MythTV !

-- Daniel



hverkuil at xs4all

Jan 20, 2007, 7:12 AM

Post #13 of 28 (15735 views)
Re: IVTV VBI reading [In reply to]

On Saturday 20 January 2007 15:34, Daniel Kristjansson wrote:
> On Fri, 2007-01-19 at 17:36 +0100, Hans Verkuil wrote:
> > On Friday 19 January 2007 15:51, Daniel Kristjansson wrote:
> > The usage of VBI should be turned off by default. It doesn't matter
> > whether the VBI is embedded or read straight from /dev/vbi, for the
> > hardware it's all the same.
> >
> > Non-buggy VBI will be available in ivtv drivers with version number
> > (obtain with VIDIOC_QUERYCAP) >= KERNEL_VERSION(0, 10, 0).
>
> Ok, I'll just disable captions unless it passes this test,
> that will avoid the 0.2 VBI ioctl as well.
>
> > > I'll fix this, it's just a hangover from frame buffer recording
> > > profiles. I always change this to 720x480 when I set up a MythTV
> > > machine.
> >
> > Much appreciated!
>
> I have a patch with this that I'm running locally. I just
> created a ticket with the patch here:
> http://svn.mythtv.org/trac/ticket/2954
>
> > > #define IVTV_IOC_PAUSE_BLACK _IO ('@', 35)
> > > #define IVTV_IOC_STOP _IO ('@', 36)
> > >
> > > #define IVTV_IOC_S_VBI_MODE _IOWR('@', 35, struct ivtv_sliced_vbi_format) /* old ioctl */
> > > #define IVTV_IOC_G_VBI_MODE _IOR ('@', 36, struct ivtv_sliced_vbi_format) /* old ioctl */
> >
> > Ouch, that's old! The IVTV_IOC_G/S_VBI_MODE are used in
> > pre-ivtv-0.4 versions. Newer versions from ivtv-0.4 onward use the
> > v4l2 sliced VBI API to specify which VBI types should be captured.
> > The V4L2 sliced VBI API was introduced in kernel 2.6.14. It is pure
> > luck that MythTV works with ivtv-0.4: if you turn on VBI embedding
> > and no sliced VBI types are selected, then ivtv defaults to CC for
> > NTSC and the wide screen signal for PAL.
>
> We use the V4L2_CAP_SLICED_VBI_CAPTURE ioctl if IVTV_IOC_S_VBI_MODE
> ioctl fails. But considering the problems I'll just drop support
> for captions unless the user is using the latest drivers. This
> isn't in the patch yet. I want to test the changes for this
> against the new drivers.

I think I would prefer that VBI is off by default for anything but the
latest drivers (>= 0.10.0). If someone wants it for older drivers, then
that is still possible. Completely removing it is overkill; I know that
some people are using it to their satisfaction.

> > > > The problem is that with newer drivers I
> > > > still need to check for these old ioctls to prevent them from
> > > > being (mis)handled by the v4l2 subsystem. These checks WILL
> > > > disappear once ivtv enters the kernel. So I would appreciate it
> > > > very much if this header could be updated.
> > >
> > > Will there be some way to detect the non-backward compatible
> > > drivers? We will have to treat it like the V4L vs. V4L2
> > > transition and will need some way to detect which API to
> > > use...
> >
> > Why would you want to check for this? Once you drop the numeric
> > ioctls it will work fine for all versions from 0.2.0 onwards!
> >
> > BTW, just to inform you: once ivtv-0.10.0 is done I will proceed
> > with the final step that is needed to get ivtv into the kernel: the
> > ivtv-specific MPEG decoding API will be redesigned into a full V4L2
> > MPEG decoding API. There are also still a few remaining
> > ivtv-specific MPEG encoding ioctls, those too will be converted
> > (most likely).
>
> But if these ivtv-specific IOC's change it will affect us since
> we will be using them... We also try to support drivers until
> they are 2 years old. I don't mind dropping support for something
> auxiliary like captions, but I don't want to drop support for
> basic recording.

I thought you meant the removal of the numeric ioctls. Sorry for the
misunderstanding.

It will definitely be possible to test for the new ioctls using the ivtv
version with QUERYCAP. I will almost certainly move to version 1.0.0 or
something like that once the driver is merged into the kernel.

>
> > Ian Armstrong has been doing a lot of work lately in improving the
> > framebuffer device for the OSD (used by the PVR350). It is unlikely
> > at this stage that the few ivtv-specific OSD ioctls will be
> > converted to the V4L2 API. They will probably be improved a bit,
> > but nothing big. Almost all OSD functionality can be accessed with
> > the linux framebuffer API.
>
> Unfortunately the PVR-350 output support in MythTV is completely
> unmaintained, if this stops working we'll probably just drop
> PVR-350 output support. It's already pretty broken. (Unless Ian
> wants to write some patches :)

I'll ask him gently :-)

>
> > Anyway, the API changes will become effective once the driver is
> > merged into the kernel (2.6.2x where x >= 2 :-) ).
> >
> > I know it will be a hassle for MythTV, but at least the good news
> > is that the new API will be V4L2 compliant and is no longer tied to
> > a single hardware platform.
>
> Yep, I'm quite happy with the kernel merge. Once a kernel
> with ivtv support has been out a couple years we can get
> rid of all the ivtv specific calls in MythTV !
>
> -- Daniel

Thanks!

Hans


f-myth-users at media

Jan 21, 2007, 11:35 PM

Post #14 of 28 (15695 views)
IVTV VBI reading [In reply to]

> Date: Sat, 20 Jan 2007 16:12:11 +0100
> From: Hans Verkuil <hverkuil [at] xs4all>

> On Saturday 20 January 2007 15:34, Daniel Kristjansson wrote:
> > On Fri, 2007-01-19 at 17:36 +0100, Hans Verkuil wrote:
> > We use the V4L2_CAP_SLICED_VBI_CAPTURE ioctl if IVTV_IOC_S_VBI_MODE
> > ioctl fails. But considering the problems I'll just drop support
> > for captions unless the user is using the latest drivers. This
> > isn't in the patch yet. I want to test the changes for this
> > against the new drivers.

> I think I would prefer that VBI is off by default for anything but the
> latest drivers (>= 0.10.0). If someone wants it for older drivers, then
> that is still possible. Completely removing it is overkill, I know that
> some people are using it to their satisfaction.

Yes, -please- don't -force- users with older drivers not to have
captions at all; make it possible to enable them. Maybe issue a
warning about the sorts of issues seen when they -are- enabled, or
something if necessary.

I'm currently using ivtv 0.4.1 with a flock of PVR-250's and CC is
working just fine for me, and I'd be really annoyed if a working
configuration was forced to not work.

(Even very recent Ubuntu releases have moderately old kernels by the
standards of the very-fast-moving ivtv timeline, and the newest ivtv
won't work in kernels < 2.6.18.x, which would mean [if I understand you
correctly, and I'm not sure that I do] you'd force captions not to
work unless users are running distributions with kernels newer than
the distribution expects [hence users would be forced to compile their
own kernels, couldn't get automatic security updates for them, etc etc].
That in turn would mean that I, for one, would never run that version
of Myth until my distro could catch up to the kernel version required,
since my research requires the closed-captioning data.)


luke.hart at birchleys

Jan 23, 2007, 3:54 AM

Post #15 of 28 (15688 views)
Re: IVTV VBI reading [In reply to]

Robin Gilks wrote:
> BTW - is it reasonable to look for VBI data from the s-video or composite
> input of a pvr150/500 card? Perhaps just turn VBI on when a channel on a
> tuner input is selected?
>
Yes, it's quite possible for a signal on s-video or composite to contain
information in the VBI. At one point at least one channel on NTL digital
in the UK contained a minimal teletext service (with details of music
tracks you could phone up and request). Whether anyone uses this anymore
is probably a better question.

Luke



bjm at lvcm

Jan 23, 2007, 3:42 PM

Post #16 of 28 (15685 views)
Re: IVTV VBI reading [In reply to]

Daniel Kristjansson wrote:
> On Fri, 2007-01-19 at 12:21 +0100, Hans Verkuil wrote:
>> Note regarding ivtv VBI support: it is flaky in all current ivtv
>> versions. Basically when VBI capturing is on it is possible for MPEG or
>> VBI data to turn up in the wrong stream. This is a firmware bug for
>> which only the current ivtv subversion trunk code contains a
>> workaround. This code will become available Real Soon Now for kernels
>> 2.6.18 and up. It is extremely unlikely that it will ever be backported
>> to older kernels since it required a huge interrupt/DMA rewrite in
>> ivtv.
>>
>> Due to these problems I would recommend that for ivtv VBI is turned off
>> by default in MythTV if possible.

But thousands of people use it every day right now. To impose
on them that they are not allowed to use VBI because you say
it is imperfect is blown way out of proportion.

> Do you mean VBI embedding in the MPEG stream should be turned
> off by default, or the VBI device (which we don't use) should
> be disabled? Also can you recommend some way we can detect if
> we have an ivtv driver with a non-buggy VBI?

I would first want to see a reproducible test case of something
that does or does not work depending on if VBI is "turned off"
(whatever that means). If there is something that this fixes
and isn't throwing out the baby with the bath water, there should
be an option to turn it off per input rather than global (the
current setting affects all cards of all types on all backends
and not only affects recording but tells the frontend that they
are not allowed to try to read VBI). The default for the per
input option should be "on" to not impose a change on users but
allow them to turn VBI off to address these issues that I don't
know about.

>> Another cause of problems in MythTV is the default resolution of 480x480
>> instead of 720x480/576. The MPEG encoder introduces a small amount of
>> ghosting when it has to scale.

First, if this is a reference to the doubled image broken frame,
I've already stated that turning off VBI had no impact on the
doubled frame.

Next, "cause of problems" is an odd characterization. If ivtv
can't record a sample rate other than 720 samples per scan line
then it is clearly a bug in the driver/firmware/compression
hardware. Bt8x8 absolutely can digitize analog NTSC/PAL at any
horizontal sample rate and the bttv driver has no issues doing
so.

Lastly, as has been covered many, many times here over the years,
NTSC/PAL is not very high resolution. As the electron beam sweeps
across the screen, the analog resolution of broadcast equipment
and TV sets are in a range equivalent to 300-500 vertical scan
lines.

http://www.ntsc-tv.com/ntsc-index-04.htm

The leading commercial DVRs use 544x480 for their highest quality.
Because of the law of diminishing returns, higher sample rates are
overkill and in a blindfold test, you can't tell the difference in
recorded broadcast television at higher rates.

But here's the downside. If you increase the resolution (for no
benefit) and leave the compression bit rate the same so that the
recorded files are the same size, there will be a lower bit per
pixel ratio. This means that there will be more lossiness in
the compression. The result is that, because of the compression
artifacts, there is less(!) detail on the screen and it is lower
quality even though the "resolution" numbers are set higher. In
order to counteract this, the bitrate would need to increase in
proportion to the sampling dimensions. This results in bigger
files but no better quality than at lower resolutions.
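
As a back-of-the-envelope illustration (the numbers are hypothetical,
not taken from any particular recording):

#include <stdio.h>

/* Sketch: bits per pixel for a fixed 4800 kbit/s stream at two
   capture sizes; 720x480 has 50% more pixels than 480x480, so the
   same bitrate is spread correspondingly thinner (~0.70 vs ~0.46
   bits per pixel at NTSC frame rate). */
int main(void)
{
    const double bitrate = 4800e3, fps = 29.97;
    printf("480x480: %.2f bits/pixel\n", bitrate / (480 * 480 * fps));
    printf("720x480: %.2f bits/pixel\n", bitrate / (720 * 480 * fps));
    return 0;
}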

So what is "720"? It is the arbitrary maximum point where it
is absolutely overkill and wasteful. This could have been 1024
or 2048 that would have created huge files and stressed system
resources for a picture that looks exactly like 544x480 or even
480x480 when displayed on a TV set.


> I'll fix this, it's just a hangover from frame buffer recording
> profiles. I always change this to 720x480 when I set up a MythTV
> machine.

Ah, here's the reason I wanted to reply to this message ;-)

In "2954-v3.patch" the "1178" appears to look through the
codecparams table and delete existing settings.

Please, please do not do this.

I can see changing the default for new systems for IVTV only
(do not change the bttv default from 480x480) as a workaround
for these crappy cards. However, existing users may have tweaked
their settings and are fine with what they have now and do not
need to be ambushed by a change they did not ask for. As above,
this may cause more artifacts and an overall lower quality
picture if they don't increase the bitrate (and, they may not
want to increase the bitrate). Anyone is welcome to follow a
suggestion of trying 720 but this should not be blindly imposed
on all existing users.

-- bjm

PS I've been using myth since 0.5 or 0.6 and I got a PVR-250
before tmk started what became the ivtv driver. There was a
time around 0.10 or earlier where I was the only one doing
testing for ivtv cards to verify things before releases. In the
years since, there hasn't been a day when I honestly believed
that hardware encoders do a better job than software encoders.

I never mention this on the lists because the "common knowledge"
belief is that hardware encoders must be better and I don't need
lynch mobs taking shots at me and I don't want to do hand-holding
support for tuning software encoders. However, due to several
more aggravations recently, I'll come out and say it:

<rant>

Hardware encoders suck!

Clearly any form of digital broadcast is always going to be
better than digitizing analog broadcasts so discussion of
"quality" is like comparing phonograph records to cassette
tapes when we have CDs.

Robert once made a comment a few years ago that stuck with me
where he believed the picture was better from PVR cards because
the chips were "mechanically" better. After all, if you look
at an ivtv card output with default settings and a bttv card
output with default settings, the ivtv picture is much better.

Cory has posted several times that hardware encoders are better
because the encoder chip on his card is better than the bt8x8
of his bttv card. He then goes on to compare the characteristics
of these two different chips as they digitize. This, of course,
is comparing apples and oranges. I have a bt8x8 hardware encoder
and bytesex.org has drivers for software encoders with different
chip sets.

Given the same chip on both types of cards, the software approach
has to be better because you have more control over the picture
and compression. The reason bttv looks worse is that there are
bad defaults and the hardware encoders address these right out
of the box.

There is a well-known bug in the bt8x8 chip where the max luma
is set to 235 rather than 253 (duh!). This makes the colors
weak in brighter areas and gives the picture a dull, washed
out look. Raising the saturation doesn't fix this and just makes
the picture look weird. Myth addresses this with the "adjust"
filter and this can be used to make further tweaks to the
chroma and luma ranges before compression. In fact, you can
apply any current or future filter before compressing with
software encoding whereas hardware encoding can only use
filters built into the card.

The default brightness and contrast of 50% (32768) are way off.
The contrast needs to be cut significantly and the brightness
raised so the whitecrush and blackcrush will fit in range. The
default makes near white appear to be the same as completely
white.

The bttv driver includes options that improve the image that
AFAIK are not available for ivtv. "Automatic Gain Control" or
agc normalizes the luma levels between cards and perhaps channels
that are out of whack. This gives me a more consistent picture
from recording to recording. I don't see this option with ivtv
but it does do something annoying with the picture settings.
Often after a scene change, about 2 seconds in, the brightness
will instantly down shift a small amount but it is noticeable
and annoying.

Bttv has a comb filter that fixes that annoying multi-colored
flicker you see on striped shirts or small white lettering.
Ivtv does this too but bttv filter set to 2 seems to do a
better job.

Bttv has an option to use "full_luma_range" which spreads the
digital range out to 0-255 to get the maximum contrast rather
than the normal limited range.

Bttv has "coring" that does a digital cutoff for the black
level; this cleans up noise and artifacts and makes black truly
black. The contrast and brightness need to be set so that dark
grey or near black are above this point but black areas are
cleaner and compression is more efficient.

Bttv has a "uv ratio" which is another axis of tint. I find
that 52% makes yellow and gold look more vivid and makes
flesh tones more natural.

Then there is the compression. For a given bitrate/filesize,
ffmpeg MPEG-4 certainly has fewer artifacts than MPEG-2 from
the ivtv card at the same resolution and bitrate. The MPEG-4
artifacts tend to appear smooth like blurring whereas the MPEG-2
artifacts look like harsh little boxes. Ffmpeg has made several
improvements over the past four years. The algorithms burned
into the PVR-250 card are exactly the same as the day I bought it.

Conversely, the bttv driver is fully functional and is done and
untouched since April 21st, 2004. Ivtv is still a moving target
and will continue to be for the foreseeable future.

For bttv, I set these modprobe options:

options bttv chroma_agc=1 combfilter=2 full_luma_range=1 coring=1 video_nr=0 vbi_nr=0

The last two force the devices to be /dev/video0 and /dev/vbi0
(these are to defeat udev which I despise even more than ivtv ;-).
I also set these options when a backend is started:

v4lctl -c /dev/video0 setattr 'mute' off > /dev/null 2>&1
v4lctl -c /dev/video0 setattr 'uv ratio' 52 > /dev/null 2>&1
v4lctl -c /dev/video0 setattr 'whitecrush upper' 253 > /dev/null 2>&1
v4lctl -c /dev/video0 setattr 'luma decimation filter' on > /dev/null 2>&1

My Default profile for myth is 496x480, 4800 bitrate with
"Scale bitrate for frame size" turned on, quality 0. 15. 3,
and the four encoding options turned off (two of these are
entirely broken). My picture settings for the coax input
are contrast = 26100 and brightness = 39400. For s-video
from a STB contrast = 26300 and brightness = 43350 (YMMV).
I also use "quickdnr" for channels with poor signal to clean
them up before compression.

With these settings I get recordings of about 1.7GB per hour
that look great and better than any recording I've seen from
ivtv. I keep my ivtv card as the last choice with lower input
priority. It sometimes records. I usually regret it =).

- Playback startup is slow (as is changing to this input in
live TV).

- ~10% to 20% of the time, audio sync is off by about 0.2sec.

- The texture leans toward grainy and is worse with noise in the
signal. [Oh, and sharpness. Sharpness is adding controlled
noise to the signal to make edges between light and dark areas
wrap, making the edge look more abrupt (ya know, that black circle
of distortion around a golf ball in PGA coverage). Ivtv output
seems to have sharpness added but no option. This makes the
image harsher and more annoying and I can't find a way to turn
it off].

- Harsh motion artifact. Really bad if the black level is too
high and noise near black is being compressed.

- One or two seconds to sync A/V after each seek.

- High speed FF/Rew can often freeze if used for more than a
few seconds.

I don't have any of these problems when software encoding.
The main issue with software is that frames will be dropped
if the CPU is pegged. I don't have problems with this because
I know not to run a compiler on a machine while it is recording.
The parameters above seem to use about the equivalent CPU time
of an AMD ~700Mhz. I used Duron and Athlon 1.3Ghz chips for
years and with a 2000, 2400, 3000 or more it is absolutely no
problem. You can hardly buy a chip less than 2GHz these days
and if quality and reliability are the goal, CPU shouldn't be
an issue for recording with bttv.

So given that bttv or ivtv pale in comparison to HDTV, I honestly
believe that I get better looking video and a lot fewer hassles
with software encoding and will use bttv to record NTSC for the
foreseeable future.

</rant>


tom at redpepperracing

Jan 24, 2007, 8:36 AM

Post #17 of 28 (15687 views)
Re: Hardware encoders (was: IVTV VBI reading) [In reply to]

Bruce Markey wrote:
>
> Hardware encoders suck!

snip...

> <!rant>

Thanks for that discourse Bruce, it was very enlightening. Would you say
that this applies to most bttv based cards? Any to avoid, or
alternatively, that you prefer? I am going to be making some big changes
to my myth setup, and this could help me in some ways.

Tom


hverkuil at xs4all

Jan 24, 2007, 11:13 AM

Post #18 of 28 (15659 views)
Re: IVTV VBI reading [In reply to]

On Wednesday 24 January 2007 00:42, Bruce Markey wrote:
> Daniel Kristjansson wrote:
> > On Fri, 2007-01-19 at 12:21 +0100, Hans Verkuil wrote:
> >> Note regarding ivtv VBI support: it is flaky in all current ivtv
> >> versions. Basically when VBI capturing is on it is possible for
> >> MPEG or VBI data to turn up in the wrong stream. This is a
> >> firmware bug for which only the current ivtv subversion trunk code
> >> contains a workaround. This code will become available Real Soon
> >> Now for kernels 2.6.18 and up. It is extremely unlikely that it
> >> will ever be backported to older kernels since it required a huge
> >> interrupt/DMA rewrite in ivtv.
> >>
> >> Due to these problems I would recommend that for ivtv VBI is
> >> turned off by default in MythTV if possible.
>
> But thousands of people use it every day right now. To impose
> on them that they are not allowed to use VBI because you say
> it is imperfect is blown way out of proportion.

Just to clarify: for ivtv-based cards the MythTV VBI setting should be
off by default (i.e. after installing MythTV from scratch) for driver
version <0.10.0. You should of course always be able to turn it on if
you want.

> > Do you mean VBI embedding in the MPEG stream should be turned
> > off by default, or the VBI device (which we don't use) should
> > be disabled? Also can you recommend some way we can detect if
> > we have an ivtv driver with a non-buggy VBI?
>
> I would first want to see a reproducible test case of something
> that does or does not work depending on if VBI is "turned off"
> (whatever that means). If there is something that this fixes
> and isn't throwing out the baby with the bath water, there should
> be an option to turn it off per input rather than global (the
> current setting affects all cards of all types on all backends
> and not only affects recording but tells the frontend that they
> are not allowed to try to read VBI). The default for the per
> input option should be "on" to not impose a change on users but
> allow them to turn VBI off to address these issues that I don't
> know about.

The information that I had suggests that this setting was off in
previous versions but was turned on in 0.20. I may be wrong. If it is a
global setting that would also affect other non-ivtv cards, then it is
probably not worth the effort of making this change, especially since
the next ivtv release will fix this issue. However, this fix will only
be available for ivtv together with kernels >= 2.6.18.

> >> Another cause of problems in MythTV is the default resolution of
> >> 480x480 instead of 720x480/576. The MPEG encoder introduces a
> >> small amount of ghosting when it has to scale.
>
> First, if this is a reference to the doubled image broken frame,
> I've already stated that turning off VBI had no impact on the
> doubled frame.

No, nothing to do with that.

> Next, "cause of problems" is an odd characterization. If ivtv
> can't record a sample rate other than 720 samples per scan line
> then it is clearly a bug in the driver/firmware/compression
> hardware. Bt8x8 absolutely can digitize analog NTSC/PAL at any
> horizontal sample rate and the bttv driver has no issues doing
> so.

It can record it perfectly, but scaling does introduce a slight amount of
ghosting. Whether this is a driver, firmware or hardware bug is
something that I need to look into one of these days.

> Lastly, as has been covered many, many times here over the years,
> NTSC/PAL is not very high resolution. As the electron beam sweeps
> across the screen, the analog resolution of broadcast equipment
> and TV sets are in a range equivalent to 300-500 vertical scan
> lines.

Out of curiosity, is this also true when you record with S-Video from a
DVD player, for example?

> http://www.ntsc-tv.com/ntsc-index-04.htm
>
> The leading commercial DVRs use 544x480 for their highest quality.
> Because of the law of diminishing returns, higher sample rates are
> overkill and in a blindfold test, you can't tell the difference in
> recorded broadcast television at higher rates.
>
> But here's the downside. If you increase the resolution (for no
> benefit) and leave the compression bit rate the same so that the
> recorded files are the same size, there will be a lower bit per
> pixel ratio. This means that there will be more loss-iness in
> the compression. The result is that, because of the compression
> artifacts, there is less(!) detail on the screen and it is lower
> quality even though the "resolution" numbers are set higher. In
> order to counteract this, the bitrate would need to increase in
> proportion to the sampling dimensions. This results in bigger
> files but no better quality than at lower resolutions.

Absolutely true.

> So what is "720"? It is the arbitrary maximum point where it
> is absolutely overkill and wasteful. This could have been 1024
> or 2048 that would have created huge files and stressed system
> resources for a picture that looks exactly like 544x480 or even
> 480x480 when displayed on a TV set.

Well, a 720x480 MPEG stream 1) has the correct display ratio and 2) can
easily be burned to DVD without need for resizing. The default bitrate
set by ivtv is sufficient for good quality encoding at that resolution.

> > I'll fix this, it's just a hangover from frame buffer recording
> > profiles. I always change this to 720x480 when I set up a MythTV
> > machine.
>
> Ah, here's the reason I wanted to reply to this message ;-)
>
> In "2954-v3.patch" the "1178" appears to look through the
> codecparams table and deletes existing settings.
>
> Please, please do not do this.
>
> I can see changing the default for new systems for IVTV only
> (do not change the bttv default from 480x480) as a workaround
> for these crappy cards. However, existing users may have tweaked
> their settings and are fine with what they have now and do not
> need to be ambushed by a change they did not ask for. As above,
> this may cause more artifacts and an overall lower quality
> picture if they don't increase the bitrate (and, they may not
> want to increase the bitrate). Anyone is welcome to follow a
> suggestion of trying 720 but this should not be blindly imposed
> on all existing users.

For the record: these were just suggestions for improving (IMHO) the
default settings of MythTV for ivtv after a new installation based on
user experiences I received. These settings are specific to cards with
a cx23415/6 MPEG encoder.

Regards,

Hans Verkuil

>
> -- bjm
>
> PS I've been using myth since 0.5 or 0.6 and I got a PVR-250
> before tmk started what became the ivtv driver. There was a
> time around 0.10 or earlier where I was the only one doing
> testing for ivtv cards to verify things before releases. In the
> years since, there hasn't been a day when I honestly believed
> that hardware encoders do a better job than software encoders.
>
> I never mention this on the lists because the "common knowledge"
> belief is that hardware encoders must be better and I don't need
> lynch mobs taking shots at me and I don't want to do hand-holding
> support for tuning software encoders. However, due to several
> more aggravations recently, I'll come out and say it:
>
> <rant>
>
> Hardware encoders suck!
>
> Clearly any form of digital broadcast is always going to be
> better than digitizing analog broadcasts so discussion of
> "quality" is like comparing phonograph records to cassette
> tapes when we have CDs.
>
> Robert once made a comment a few years ago that stuck with me
> where he believed the picture was better from PVR cards because
> the chips were "mechanically" better. After all, if you look
> at a ivtv card output with default settings and a bttv card
> output with default settings, the ivtv picture is much better.
>
> Cory has posted several times that hardware encoders are better
> because the encoder chip on his card is better than the bt8x8
> of his bttv card. He then goes on to compare the characteristics
> of these two different chips as they digitize. This, of course,
> is comparing apples and oranges. I have a bt8x8 hardware encoder
> and bytesex,org has drivers for software encoders with different
> chip sets.
>
> Given the same chip on both types of cards, the software approach
> has to be better because you have more control over the picture
> and compression. The reason bttv looks worse is that there are
> bad defaults and the hardware encoders address these right out
> of the box.
>
> There is a well know bug in the bt8x8 chip where the max luma
> is set to 235 rather than 253 (duh!). This makes the colors
> weak in brighter areas and gives the picture a dull, washed
> out look. Raising the saturation doesn't fix this and just makes
> the picture look weird. Myth addresses this with the "adjust"
> filter and this can be used to make further tweaks to the
> chroma and luma ranges before compression. In fact, you can
> apply any current of future filter before compressing with
> software encoding whereas hardware encoding can only use
> filters built into the card.
>
> The default brightness and contrast of 50% (32768) are way off.
> the contrast needs to be cut significantly and the brightness
> raised so the whitecrush and blackcrush will fit in range. The
> default makes near white appear to be the same as completely
> white.
>
> The bttv driver includes options that improve the image that
> AFAIK are not available for ivtv. "Automatic Gain Contrail" or
> agc normalizes the luma levels between cards and perhaps channels
> that are out of whack. This give me a more consistent picture
> from recording to recording. I don't see this option with ivtv
> but it does do something annoying with the picture settings.
> Often after a scene change, about 2 seconds in, the brightness
> will instantly down shift a small amount but it is noticeable
> and annoying.
>
> Bttv has a comb filter that fixes that annoying multi-colored
> flicker you see on stripped shirts or small white lettering.
> Ivtv does this too but bttv filter set to 2 seems to do a
> better job.
>
> Bttv has an option to use "full_luma_range" which spreads the
> digital range out to 0-255 to get the maximum contrast rather
> than the normal limited range,
>
> Bttv has "coring" that does a digital cutoff for the black
> level, this cleans up noise and artifact and makes black truly
> black. The contrast and brightness need to be set so that dark
> grey or near black are above this point but black areas are
> cleaner and compression is more efficient.
>
> Bttv has a "uv ratio" which is another axis of tint. I find
> that 52% make yellow and gold look more vivid and makes
> flesh tones more natural.
>
> Then there is the compression. For a given bitreate/filesize,
> ffmpeg MPEG-4 certainly has fewer artifacts than MPEG-2 from
> the ivtv card at the same resolution and bitrate. The MPEG-4
> artifacts tend to appear smooth like blurring whereas the MPEG-2
> artifacts look like harsh little boxes. Ffmpeg has made several
> improvements over the past four years. The algorithms burned
> into to PVR-250 card are exactly the same as the day I bought it.
>
> Conversely, the bttv driver is fully functional and is done and
> untouched since April 21st, 2004. Ivtv is still a moving target
> and will continue to be for the foreseeable future.
>
> For bttv, I set these modprobe options:
>
> options bttv chroma_agc=1 combfilter=2 full_luma_range=1 coring=1
> video_nr=0 vbi_nr=0
>
> The last two force the devices to be /dev/video0 and /dev/vbi0
> (these are to defeat udev, which I despise even more than ivtv ;-).
> I also set these options when a backend is started:
>
> v4lctl -c /dev/video0 setattr 'mute' off > /dev/null 2>&1
> v4lctl -c /dev/video0 setattr 'uv ratio' 52 > /dev/null 2>&1
> v4lctl -c /dev/video0 setattr 'whitecrush upper' 253 > /dev/null 2>&1
> v4lctl -c /dev/video0 setattr 'luma decimation filter' on > /dev/null 2>&1
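>
> (A hedged aside: "modinfo -p bttv" lists the module parameter
> names your bttv build actually accepts, which is a quick way to
> catch a typo in the options line before reloading the driver.)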
>
> My Default profile for myth is 496x480, 4800 bitrate with
> "Scale bitrate for frame size" turned on, quality 0, 15, 3,
> and the four encoding options turned off (two of these are
> entirely broken). My picture settings for the coax input
> are contrast = 26100 and brightness = 39400. For s-video
> from an STB, contrast = 26300 and brightness = 43350 (YMMV).
> I also use "quickdnr" for channels with poor signal to clean
> them up before compression.
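>
> (Back-of-the-envelope math on that profile, assuming "Scale
> bitrate for frame size" scales against a 640x480 reference
> frame, which is an assumption on my part:
>
> scaled bitrate = 4800 * (496*480) / (640*480) = 3720 kbps
> video per hour = 3720 kbps * 3600 s / 8 = ~1.67 GB
>
> plus audio, which lands right around the figure below.)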
>
> With these settings I get recordings of about 1.7GB per hour
> that look great, and better than any recording I've seen from
> ivtv. I keep my ivtv card as the last choice with lower input
> priority. It sometimes records. I usually regret it =).
>
> - Playback startup is slow (as is changing to this input in
> live TV).
>
> - ~10% to 20% of the time, audio sync is off by about 0.2sec.
>
> - The texture leans toward grainy and is worse with noise in the
> signal. [Oh, and sharpness. Sharpness is adding controlled
> noise to the signal to make edges between light and dark areas
> overshoot, making the edge look more abrupt (ya know, that black
> circle of distortion around a golf ball in PGA coverage). Ivtv
> output seems to have sharpness added but no option to control it.
> This makes the image harsher and more annoying and I can't find
> a way to turn it off.]
>
> - Harsh motion artifacts. Really bad if the black level is too
> high and noise near black is being compressed.
>
> - One or two seconds to sync A/V after each seek.
>
> - High speed FF/Rew can often freeze if used for more than a
> few seconds.
>
> I don't have any of these problems when software encoding.
> The main issue with software is that frames will be dropped
> if the CPU is pegged. I don't have problems with this because
> I know not to run a compiler on a machine while it is recording.
> The parameters above seem to use about the equivalent CPU time
> of a ~700MHz AMD. I used Duron and Athlon 1.3GHz chips for
> years, and with a 2000, 2400, 3000 or more it is absolutely no
> problem. You can hardly buy a chip slower than 2GHz these days,
> and if quality and reliability are the goal, CPU shouldn't be
> an issue for recording with bttv.
>
> So, given that both bttv and ivtv pale in comparison to HDTV, I honestly
> believe that I get better looking video and a lot fewer hassles
> with software encoding and will use bttv to record NTSC for the
> foreseeable future.
>
> <!rant>
> _______________________________________________
> mythtv-dev mailing list
> mythtv-dev [at] mythtv
> http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-dev
_______________________________________________
mythtv-dev mailing list
mythtv-dev [at] mythtv
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-dev


ijr at case

Jan 24, 2007, 11:27 AM

Post #19 of 28 (15657 views)
Permalink
Re: IVTV VBI reading [In reply to]

On Wednesday 24 January 2007 2:13 pm, Hans Verkuil wrote:
> Just to clarify: for ivtv-based cards the MythTV VBI setting should be
> off by default (i.e. after installing MythTV from scratch) for driver
> version <0.10.0. You should of course always be able to turn it on if
> you want.

It is, and always has been, off by default.

Isaac
_______________________________________________
mythtv-dev mailing list
mythtv-dev [at] mythtv
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-dev


hverkuil at xs4all

Jan 24, 2007, 12:15 PM

Post #20 of 28 (15658 views)
Permalink
Re: IVTV VBI reading [In reply to]

On Wednesday 24 January 2007 20:27, Isaac Richards wrote:
> On Wednesday 24 January 2007 2:13 pm, Hans Verkuil wrote:
> > Just to clarify: for ivtv-based cards the MythTV VBI setting should
> > be off by default (i.e. after installing MythTV from scratch) for
> > driver version <0.10.0. You should of course always be able to turn
> > it on if you want.
>
> It is, and always has been, off by default.

In that case it seems that this is a non-issue. Thanks for the info!

Regards,

Hans
_______________________________________________
mythtv-dev mailing list
mythtv-dev [at] mythtv
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-dev


bjm at lvcm

Jan 24, 2007, 12:55 PM

Post #21 of 28 (15641 views)
Permalink
Re: Hardware encoders [In reply to]

Tom Lichti wrote:
> Bruce Markey wrote:
>> Hardware encoders suck!
>
> snip...
>
>> <!rant>
>
> Thanks for that discourse Bruce, it was very enlightening. Would you say
> that this applies to most bttv based cards? Any to avoid, or
> alternatively, that you prefer? I am going to be making some big changes
> to my myth setup, and this could help me in some ways.

An underlying premise is that a bt8x8 is a bt8x8 and they
are all capable of outputting the same raw frame given the
same register settings. The questions are which features are
turned on or off and how those frames are post-processed.

AFAIK all of the module parameters for bttv apply to any
bt8x8 chip on cards supported by bttv. However, you need to
be aware of audio differences. If you use s-video or composite,
you can plug into the line-in stereo mini jack and the stereo
channels will pass through on any card. If you use the coax tuner
input, only cards that specify "Stereo TV" or "DBX" can decode
stereo from this input. WinTV "GO" is the low-end model and does
not decode stereo from cable, but a higher-end card like the WinTV
model 401 does. Same for AverTV. Budget models don't do stereo.

Another difference that has drawn too much attention over the
years is "btaudio". Some card (very few) have a DSP on board
and audio can be recorded from this device with the btaudio
kernel module but this has limitations. If I had a card that
worked with btaudio and an open line-in on a sound card, I'd
still plug into the sound card to use it's DSP to digitize the
audio.

I'd say before "making some big changes", get one card and
experiment with it. See what does and doesn't work well for you.

-- bjm



_______________________________________________
mythtv-dev mailing list
mythtv-dev [at] mythtv
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-dev


tom at redpepperracing

Jan 24, 2007, 7:04 PM

Post #22 of 28 (15646 views)
Permalink
Re: Hardware encoders [In reply to]

> Tom Lichti wrote:
>> Bruce Markey wrote:
>>> Hardware encoders suck!
>>
>> snip...
>>
>>> <!rant>
>>
>> Thanks for that discourse Bruce, it was very enlightening. Would you say
>> that this applies to most bttv based cards? Any to avoid, or
>> alternatively, that you prefer? I am going to be making some big changes
>> to my myth setup, and this could help me in some ways.
>
> An underlying premise is that a bt8x8 is a bt8x8 and they
> are all capable of outputting the same raw frame given the
> same register settings. The questions are which features are
> turned on or off and how are those frames post processed.

snip...

>
> I'd say before "making some big changes", get one card and
> experiment with it. See what does and doesn't work well for you.
>

Again, thanks for the info. The 'big changes' are mostly unrelated to
this (moving from PVR cards and analog cable to DVB-S), so not to
worry; I am not doing anything I wasn't going to do anyway. I just
figured that if I could get a cheap 'backup' card that worked well,
I'd probably go the bttv route.

Tom


_______________________________________________
mythtv-dev mailing list
mythtv-dev [at] mythtv
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-dev


bjm at lvcm

Jan 26, 2007, 12:07 PM

Post #23 of 28 (15591 views)
Permalink
Re: IVTV VBI reading [In reply to]

Hans Verkuil wrote:
> On Wednesday 24 January 2007 00:42, Bruce Markey wrote:
> > Daniel Kristjansson wrote:
> > > On Fri, 2007-01-19 at 12:21 +0100, Hans Verkuil wrote:
> > >> Note regarding ivtv VBI support: it is flaky in all current ivtv
> > >> versions. Basically when VBI capturing is on it is possible for
> > >> MPEG or VBI data to turn up in the wrong stream. This is a
> > >> firmware bug for which only the current ivtv subversion trunk code
> > >> contains a workaround. This code will become available Real Soon
> > >> Now for kernels 2.6.18 and up. It is extremely unlikely that it
> > >> will ever be backported to older kernels since it required a huge
> > >> interrupt/DMA rewrite in ivtv.
> > >>
> > >> Due to these problems I would recommend that for ivtv VBI is
> > >> turned off by default in MythTV if possible.
> >
> > But thousands of people use it every day right now. To impose
> > on them that they are not allowed to use VBI because you say
> > it is imperfect is blown way out of proportion.
>
> Just to clarify: for ivtv-based cards the MythTV VBI setting should be
> off by default (i.e. after installing MythTV from scratch) for driver
> version <0.10.0. You should of course always be able to turn it on if
> you want.

I'm lost. I've been looking for any VBI setting "for ivtv-based cards"
and I'm not finding one. There is no reference to VBI in "./configure
--help". The only thing that I know of to 'turn off VBI' is:

mysql> select * from settings where value like '%VBI%';
+-----------+---------------------+----------+
| value     | data                | hostname |
+-----------+---------------------+----------+
| VbiFormat | NTSC Closed Caption | NULL     |
+-----------+---------------------+----------+

which is set from mythtv-setup General, second page. This turns
off VBI for all cards on all hosts regardless of type and disables
Closed Caption for all frontends even if the information is already
stored in the recordings.
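
(For reference, flipping that row by hand is a single statement.
This is a sketch against the schema shown above, assuming 'None'
is the stored string for the off setting, so back up the settings
table first:

mysql> update settings set data = 'None' where value = 'VbiFormat';

and restart the backend to pick up the change.)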


> It can record it perfectly,

I disagree.

> but scaling does introduce a slight amount of
> ghosting. Whether this is a driver, firmware or hardware bug is
> something that I need to look into one of these days.

This seems to be a severe issue if only the extreme maximum
resolution is considered to be acceptable. A 720x480 recording
file with the same relative bitrate as 480x480 would be 50% larger
for no (or very little) visible difference.
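
(The 50% is just the pixel ratio: 720x480 is 345600 pixels against
230400 for 480x480, and 345600/230400 = 1.5, so at the same bits
per pixel the file is half again as large.)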

> > Lastly, as has been covered many, many times here over the years,
> > NTSC/PAL is not very high resolution. As the electron beam sweeps
> > across the screen, the analog resolution of broadcast equipment
> > and TV sets are in a range equivalent to 300-500 vertical scan
> > lines.
>
> Out of curiosity, is this also true when you record
> with S-Video from a DVD player, for example?

Yes, the output of the NTSC analog signal has no digital
characteristics, but that isn't even the point. The signal directly
from a DVD player may be clearer and depending on the TV set, you
may be able to get a little more out of the law of diminishing
returns by using a higher number of samples per scan line as the
(now analog) signal is re-sampled. However, the limitations are not
just the input signal but the output of the display device.

> > http://www.ntsc-tv.com/ntsc-index-04.htm

If you looked at this page, you may have been distracted by an
animation demonstrating the Kell factor.

There are a couple of ways of looking at this, but there are limits
to how detailed a beam flying across a screen can be. The example I
like is to imagine a black picture with a vertical line in the
middle that is extremely bright and extremely narrow. On a TV
screen it will be no brighter than the maximum brightness of the TV
set. When the beam is on the line, the line will be as wide as the
beam itself. Further, as the beam approaches the line, the power
needs to go from nothing to max and takes some time and distance for
the amplitude to change. As it leaves the line it again takes time
to ramp down.

Both the width of the beam and the time it takes to change limit
the horizontal resolution. Even a 'good' large TV tube isn't
going to be much better than 400 lines of resolution, but this
is a vague comparison because it is an analog signal. But however
it is measured, you could have thousands of digital values per
scan line but the picture can't go from black to white to black
to white in less than the width of the beam or even get distinct
vertical lines at near the width of the beam.

> Well, a 720x480 MPEG stream has 1) the correct display ratio,

You mean square pixels? Four thirds of 480 is 640 and 720
is nowhere near 4:3 for PAL. I know of nothing where this
is the correct display ratio for anything.

> 2) can easily be burned to DVD without need for resizing.

1) 704x480 has been used as a standard resolution for DVDs, and
any of several resolutions could be written to DVD.

2) I can set my bt8x8 chip to output 720 samples per scan line from
a bttv card but that driver isn't broken at all other resolutions.
Therefore, I don't have to generate huge files for no benefit to
workaround a bug. A 720x480 recording file with the same relative
bitrate as 480x480 would be 50% larger and that doesn't even
account for differences between mpeg2 and mpeg4.

3) Something well beyond 99% of the TV shows recorded by MythTV are
not burned to DVD and I would be surprised if it were as high as 1
in 100,000 or if as many as 1% of myth users have ever burned a
DVD. Regardless of the statistics, the possibility of burning a DVD
does not negate the existence of the bug and sounds more like an
attempt to rationalize rather than a legitimate benefit.

> The default bitrate
> set by ivtv is sufficient for good quality encoding at that resolution.

Now you've piqued my curiosity. What is this default bitrate
and why is it a good choice?

-- bjm
_______________________________________________
mythtv-dev mailing list
mythtv-dev [at] mythtv
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-dev


hverkuil at xs4all

Jan 26, 2007, 1:54 PM

Post #24 of 28 (15608 views)
Permalink
Re: IVTV VBI reading [In reply to]

On Friday 26 January 2007 21:07, Bruce Markey wrote:
> Hans Verkuil wrote:
> > On Wednesday 24 January 2007 00:42, Bruce Markey wrote:
> > > Daniel Kristjansson wrote:
> > > > On Fri, 2007-01-19 at 12:21 +0100, Hans Verkuil wrote:
> > > >> Note regarding ivtv VBI support: it is flaky in all current
> > > >> ivtv versions. Basically when VBI capturing is on it is
> > > >> possible for MPEG or VBI data to turn up in the wrong stream.
> > > >> This is a firmware bug for which only the current ivtv
> > > >> subversion trunk code contains a workaround. This code will
> > > >> become available Real Soon Now for kernels 2.6.18 and up. It
> > > >> is extremely unlikely that it will ever be backported to older
> > > >> kernels since it required a huge interrupt/DMA rewrite in
> > > >> ivtv.
> > > >>
> > > >> Due to these problems I would recommend that for ivtv VBI is
> > > >> turned off by default in MythTV if possible.
> > >
> > > But thousands of people use it every day right now. To impose
> > > on them that they are not allowed to use VBI because you say
> > > it is imperfect is blown way out of proportion.
> >
> > Just to clarify: for ivtv-based cards the MythTV VBI setting should
> > be off by default (i.e. after installing MythTV from scratch) for
> > driver version <0.10.0. You should of course always be able to turn
> > it on if you want.
>
> I'm lost. I've been looking for any VBI setting "for ivtv-based
> cards" and I'm not finding one. There is no reference to VBI in
> "./configure --help". The only thing that I know of to 'turn off VBI'
> is:
>
> mysql> select * from settings where value like '%VBI%';
> +-----------+---------------------+----------+
> | value     | data                | hostname |
> +-----------+---------------------+----------+
> | VbiFormat | NTSC Closed Caption | NULL     |
> +-----------+---------------------+----------+
>
> which is set from mythtv-setup General, second page. This turns
> off VBI for all cards on all hosts regardless of type and disables
> Closed Caption for all frontends even if the information is already
> stored in the recordings.

That's the one. In the end it turned out that this setting is off on a
new installation so there's no problem. I was told that it was on
initially, but that was obviously wrong information.

>
> > It can record it perfectly,
>
> I disagree.

I think that bttv will get a better picture because, as you correctly
say, it can be calibrated better, both on the btxxx chip and (probably) in
the software mpeg encoder. However, my experience is that 90% of the
quality is determined by your cabling, interference, the actual tuner
on the card, the TV you're using and the quality of your antenna or
cable signal. Most people probably won't see the difference between
bttv and ivtv except if everything else is optimal. I'm pretty sure I
wouldn't. Basically the PAL/NTSC quality is pretty lousy to begin with.

But an MPEG encoder is very handy for making a reasonable-quality
recording with very little CPU load. And I really like the TV-out
of my PVR350, where I get very nice quality.

> > but scaling does introduce a slight amount of
> > ghosting. Whether this is a driver, firmware or hardware bug is
> > something that I need to look into one of these days.
>
> This seems to be a severe issue if only the extreme maximum
> resolution is considered to be acceptable. A 720x480 recording
> file with the same relative bitrate as 480x480 would be 50% larger
> for no (or very little) visible difference.

To put this into perspective: the driver has been around for several
years and I have had only a handful of complaints about this. I'm
pretty sure it can be fixed somehow but it's simply not high prio at
the moment.

> > > Lastly, as has been covered many, many times here over the years,
> > > NTSC/PAL is not very high resolution. As the electron beam sweeps
> > > across the screen, the analog resolution of broadcast equipment
> > > and TV sets are in a range equivalent to 300-500 vertical scan
> > > lines.
> >
> > Out of curiosity, is this also true when you record
> > with S-Video from a DVD player, for example?
>
> Yes, the output of the NTSC analog signal has no digital
> characteristics, but that isn't even the point. The signal directly
> from a DVD player may be clearer and depending on the TV set, you
> may be able to get a little more out of the law of diminishing
> returns by using a higher number of samples per scan line as the
> (now analog) signal is re-sampled. However, the limitations are not
> just the input signal but the output of the display device.
>
> > > http://www.ntsc-tv.com/ntsc-index-04.htm
>
> If you looked at this page you may have been distracted by an
> animation demonstrating the Kell factor.
>
> There are a couple of ways of looking at this, but there are limits
> to how detailed a beam flying across a screen can be. The example I
> like is to imagine a black picture with a vertical line in the
> middle that is extremely bright and extremely narrow. On a TV
> screen it will be no brighter than the maximum brightness of the TV
> set. When the beam is on the line, the line will be as wide as the
> beam itself. Further, as the beam approaches the line, the power
> needs to go from nothing to max and takes some time and distance for
> the amplitude to change. As it leaves the line it again takes time
> to ramp down.
>
> Both the width of the beam and the time it takes to change limit
> the horizontal resolution. Even a 'good' large TV tube isn't
> going to be much better than 400 lines of resolution, but this
> is a vague comparison because it is an analog signal. But however
> it is measured, you could have thousands of digital values per
> scan line but the picture can't go from black to white to black
> to white in less than the width of the beam or even get distinct
> vertical lines at near the width of the beam.
>
> > Well, a 720x480 MPEG stream has 1) the correct display ratio,
>
> You mean square pixels? Four thirds of 480 is 640 and 720
> is nowhere near 4:3 for PAL. I know of nothing where this
> is the correct display ratio for anything.

Sorry, I'm wrong here. The 720x480 is derived from the MPEG Main Level
maximum resolution. Apparently typical MPEG recording resolutions for
NTSC are 352x480, 544x480, 640x480, 704x480 and 720x480 (according to
my 'Video Demystified, 4th Edition').

> > 2) can easily be burned to DVD without need for resizing.
>
> 1) 704x480 has been used as a standard resolution for DVDs, and
> any of several resolutions could be written to DVD.

You're right again.

> 2) I can set my bt8x8 chip to output 720 samples per scan line from
> a bttv card but that driver isn't broken at all other resolutions.
> Therefore, I don't have to generate huge files for no benefit to
> workaround a bug. A 720x480 recording file with the same relative
> bitrate as 480x480 would be 50% larger and that doesn't even
> account for differences between mpeg2 and mpeg4.
>
> 3) Something well beyond 99% of the TV shows recorded by MythTV are
> not burned to DVD and I would be surprised if it were as high as 1
> in 100,000 or if as many as 1% of myth users have ever burned a
> DVD. Regardless of the statistics, the possibility of burning a DVD
> does not negate the existence of the bug and sounds more like an
> attempt to rationalize rather than a legitimate benefit.

Listen, I thought that it was easy to change this setting specifically
for ivtv cards. Apparently it isn't, so then just keep it as is. I'll
just make a note in the driver README that if people notice ghosting
with MythTV then they can work around it by setting the resolution to
720x480, until I find the time to look for the real problem.

>
> > The default bitrate
> > set by ivtv is sufficient for good quality encoding at that
> > resolution.
>
> Now you've piqued my curiosity. What is this default bitrate
> and why is it a good choice?

Let me see: VBR 6Mbps, peak 8Mbps. Audio 224 kbps. I believe these
values were originally derived from the Windows driver, and I've never
had any complaints :-) According to Video Demystified MPEG-2 was
targeted for broadcast-quality video at bit rates of 4-9 Mbps, so ivtv
is within that range. Nothing terribly scientific, I'm afraid.
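
(For anyone wanting to inspect or override these on a running
card: the bitrates are exposed as V4L2 MPEG controls. The control
names below are assumptions based on what v4l2-ctl typically
reports, so verify with --list-ctrls on your own setup:

v4l2-ctl -d /dev/video0 --list-ctrls
v4l2-ctl -d /dev/video0 --set-ctrl video_bitrate=6000000
v4l2-ctl -d /dev/video0 --set-ctrl video_peak_bitrate=8000000

Values are in bits per second.)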

Regards,

Hans
_______________________________________________
mythtv-dev mailing list
mythtv-dev [at] mythtv
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-dev


bjm at lvcm

Jan 28, 2007, 2:43 PM

Post #25 of 28 (15531 views)
Permalink
Re: IVTV VBI reading [In reply to]

Hans Verkuil wrote:
> On Friday 26 January 2007 21:07, Bruce Markey wrote:
...
>> which is set from mythtv-setup General, second page. This turns
>> off VBI for all cards on all hosts regardless of type and disables
>> Closed Caption for all frontends even if the information is already
>> stored in the recordings.
>
> That's the one. In the end it turned out that this setting is off on a
> new installation so there's no problem. I was told that it was on
> initially, but that was obviously wrong information.

Okay. The concern I discovered is that this setting affects both
the frontend and the backend. These should be decoupled, and
possibly the backend settings could be per card type, per card,
or per input.

>>> It can record it perfectly,
>> I disagree.
>
> I think that bttv will get a better picture because as you correctly say

My terse point wasn't that bttv, ivtv, myth, mpeg4, mpeg2,
ffmpeg or anything else is perfect but that working around
issues is not the same as perfection.

>>> but scaling does introduce slight amount of
>>> ghosting. Whether this is a driver, firmware or hardware bug is
>>> something that I need to look into one of these days.
>> This seems to be a severe issue if only the extreme maximum
>> resolution is considered to be acceptable. A 720x480 recording
>> file with the same relative bitrate as 480x480 would be 50% larger
>> for no (or very little) visible difference.
>
> To put this into perspective: the driver has been around for several
> years and I have had only a handful of complaints about this. I'm
> pretty sure it can be fixed somehow but it's simply not high prio at
> the moment.

1) I didn't have your email address, 2) I'm complaining now, and
3) I could assemble a lynch mob if that's what is needed ;-).
There was another thread on these lists about ghosting so there
have been at least ten messages on this issue just this week.
If the response is always that you must record at 720x480 then
people will do what they have to do but that doesn't mean that
there isn't a problem.

What if I were to tell you that you could create files two-thirds
the size with no visible loss in quality? That your current disks
could hold 50% more hours of recordings and that there would be
less network traffic for remote playback, lower decode CPU time,
faster file copy, faster deletes, less I/O contention, fewer
prebuffer pauses, faster commercial flagging, faster transcoding
or no need for transcoding, etc.? I think everyone would like that
even if they are not complaining now. I think a slick marketing
department could put out an impressive press release about the
vast improvements of the new version if this were fixed =).

>>> The default bitrate
>>> set by ivtv is sufficient for good quality encoding at that
>>> resolution.
>> Now you've piqued my curiosity. What is this default bitrate
>> and why is it a good choice?
>
> Let me see: VBR 6Mbps, peak 8Mbps. Audio 224 kbps. I believe these
> values were originally derived from the Windows driver, and I've never
> had any complaints :-) According to Video Demystified MPEG-2 was
> targeted for broadcast-quality video at bit rates of 4-9 Mbps, so ivtv
> is within that range. Nothing terribly scientific, I'm afraid.

Good answer. I recall that early versions used 16000 as the
peak because this was the max and 8000 as the average I guess
because that was half the max. The myth default resolution was
480x480 so this was overkill by any measure. At the time, most
people assumed that the default must be right and would complain
about how big hardware encoded files were (this is before any
HDTV). I changed the myth defaults to 4500/6000 at 480x480 and
this wasn't terribly scientific either. I did some test recording
to find a bit rate that would create hardware encoded files of
approximately the same size as my software encoded files at the
time. 4500/6000 at 480x480 is similar to 6000/8000 at 720x480
but a slightly higher relative bitrate. However, I suspect that
many people use 4500/6000 at 720x480. This is perfectly acceptable
but may have some artifacts with higher motion like sporting
events.
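
(To put numbers on "slightly higher": 4500 kbps spread over
480x480's 230400 pixels is about 0.0195 kbps per pixel, while
6000 kbps over 720x480's 345600 pixels is about 0.0174, so the
480x480 profile carries roughly 12% more bits per pixel.)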

In my myth profiles, I set LiveTV to a slightly lower 432x480 but
a higher bitrate. This keeps the file size (processing, throughput)
down but does an even better job with high motion and I use this
profile for sports. I set the Low Quality profile for 'talking
heads' news, old b&w movies and animation. Animation is easy to
compress and using 352x480 with a bitrate on the order of 1800,
I get files of about 3/4GB/hr that look perfectly fine. This is
another area where being locked to 720x480 only would be a little
disappointing.


While googling for info about the sampling clock for the bt8x8
I found this interesting article and thought I'd pass it along.
While this isn't the article I was looking for and doesn't have
the info I was actually seeking, it has gads of info about
sampling rates, resolutions and capture sizes.

http://www.doom9.org/index.html?/capture/sizes_advanced.html

-- bjm


_______________________________________________
mythtv-dev mailing list
mythtv-dev [at] mythtv
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-dev
