
Mailing List Archive: MythTV: Users

Refresh Rate reported for interlaced output (was: New nVidia 169.07 and interlace changes)

 

 



digitalaudiorock at gmail

Dec 26, 2007, 9:38 AM

Post #1 of 7 (3108 views)
Permalink
Refresh Rate reported for interlaced output (was: New nVidia 169.07 and interlace changes)

On Dec 21, 2007 11:52 AM, Tom Dexter <digitalaudiorock [at] gmail> wrote:
> On Dec 21, 2007 10:37 AM, Michael T. Dean <mtdean [at] thirdcontact> wrote:
> > On 12/21/2007 10:26 AM, Tom Dexter wrote:
> > > Speaking of that patch, I'd still love to know why the frontend
> > > reports 60Hz output as 30Hz when the output is interlaced....which is
> > > what that patch changes. I posted a question to the dev list about it
> > > some time ago and got no response:
> > >
> > > http://www.gossamer-threads.com/lists/mythtv/dev/302387
> > >
> >
> > 1080i60 is 1920x1080 pixel frames shown at 30 frames per second. Each
> > frame is composed of 2 fields--each having half the number of lines of a
> > frame.
> >
> > 720p60 is 1280x720 pixel frames shown at 60 frames per second.
> >
> > There is no such thing as 1080i120 or 1080p60 in the ATSC
> > specification. There is, however, 1080p30 and 1080p24, as well as
> > 720p30 and 720p24.
> >
> > http://www.hdtvprimer.com/ISSUES/what_is_ATSC.html
> >
> > Mike
> >
>
> Maybe I'm missing something...you're talking about frame rates
> correct? I was talking about the refresh rate.
>
> The frontend reports a frame rate of 30 for 1080i60, which is correct,
> and the patch we were referring to doesn't change that. However it
> also reports a refresh rate of 30 rather than 60. Isn't it supposed to
> be 60, as in the 60 fields per second?
>
> In any case, that reported refresh rate causes the frontend to
> disallow bob de-interlacing when running at 1080i, which works
> perfectly otherwise, and produces the best display of 1080i content I
> can get by far. Seth will concur.
>
> Tom
>

Does anyone have any insight on this one...this still has me really perplexed.

Michael's response above is a bit like the response that was posted to
Viktor's bug 2903 (with the patch), where it was pointed out that a
50Hz interlaced display only has a 25Hz frame rate. The frame rate
being reported isn't what's being questioned at all, but rather the
refresh rate. The two values are stored separately in the frontend,
and the frame rates for 60Hz and 50Hz displays are in fact stored as
30 and 25 respectively, but the refresh rates are also stored as 30
and 25 rather than 60 and 50.

I understand that 60Hz interlaced displays have a frame rate of 30,
but do they not have a refresh rate of 60 rather than the 30 reported?
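
Just to illustrate the distinction I'm drawing (this is only my own sketch of the terminology, not anything from the MythTV code):

def rates(refresh_hz, interlaced):
    # refresh_hz: how many times per second the screen is scanned
    # (once per field for interlaced modes, once per frame for progressive)
    frame_rate = refresh_hz / 2 if interlaced else refresh_hz
    return frame_rate, refresh_hz

print(rates(60, interlaced=True))   # 1080i60: frame rate 30, refresh rate 60
print(rates(60, interlaced=False))  # 720p60:  frame rate 60, refresh rate 60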

What's even more confusing is the fact that bob deinterlacing appears
to work with the patch, and in fact looks great for those of us using
it with interlaced displays. Everything I read about bob indicates
that it relies on doubling the frame rate, which certainly doesn't
sound like something that would work with 30-frame-per-second video on
an interlaced display. I'd imagine that's probably what the reply to
bug 2903 was driving at. I'm not sure what actually ends up happening
at the display for those of us doing this, but it certainly looks
great.

Tom
_______________________________________________
mythtv-users mailing list
mythtv-users [at] mythtv
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


halovanic at gmail

Dec 26, 2007, 8:17 PM

Post #2 of 7 (3008 views)
Permalink
Re: Refresh Rate reported for interlaced output (was: New nVidia 169.07 and interlace changes) [In reply to]

My understanding was that for most purposes X doesn't really distinguish
between a progressive and interlaced display; it outputs a 1920x1080 picture
at ~ 60fps and then the hardware device handles sending the interlaced
picture to the tv, presumably by simply discarding half the field lines for
each frame.
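
In other words, I assume something like this happens downstream of X (purely my mental model, not actual driver code):

import numpy as np

def to_interlaced_signal(frames):
    # frames: (1080, 1920) progressive images X outputs at ~60fps; the device
    # keeps alternate lines on alternate refreshes, so only half of each
    # frame actually reaches the TV as a 540-line field
    return [frame[(i % 2)::2, :] for i, frame in enumerate(frames)]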

Most of the time when you just display an interlaced video in myth, the fields
being displayed for the video don't match up with the fields being sent
to the device. Applying interlacing to a 60fps video gives you effectively
about 30fps for moving objects; applying another round of interlacing
introduced by the unsynced display makes it more like 15fps and very juddery
for motion.

Bob2x deinterlaces the picture by showing all of the 1080i fields at 60fps. It
does this by showing just the odd field for one frame: the 1920x540 field is
stretched out vertically to 1920x1080 (making each field line 2 pixels high
instead of 1), and the even field is shown the same way on the next frame.
This causes some very slight bounciness in thin elements or at the edge of
motion, as some pixels are rapidly switching between showing an edge and
showing the space adjacent to it.
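
Roughly, the idea is something like this (just a sketch for illustration, not MythTV's actual implementation):

import numpy as np

def bob(frame):
    # frame: a (1080, 1920) array holding one interleaved 1080i frame
    top = frame[0::2, :]      # lines 0, 2, 4, ... = one 1920x540 field
    bottom = frame[1::2, :]   # lines 1, 3, 5, ... = the other field
    # line-double each field back to 1080 lines, so 30 frames/s become
    # 60 full-height images/s, one per field
    return np.repeat(top, 2, axis=0), np.repeat(bottom, 2, axis=0)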

Even with just displaying static elements, you still necessarily get this
bounciness on a 1080i display (check out some of the menu elements in the
myth main menus for example). I think that any bounciness introduced by
bobbing is covered up by the inherent bounciness in the 1080i display, even
when the fields are being displayed in reverse order, and you're basically
now just interlacing a progressive 60fps video, instead of reinterlacing an
already interlaced video. Therefore: a perfect 1080i display at the expense
of a lot of apparently unnecessary CPU work.
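
Continuing that sketch (again, my reasoning rather than anything in the MythTV source), you can see why the bobbed output survives being re-interlaced by the display when the field phase happens to match:

import numpy as np

frame = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)  # stand-in 1080i frame
field = frame[0::2, :]                  # the 540-line field the broadcaster sent
bobbed = np.repeat(field, 2, axis=0)    # what bob draws on the 1080-line screen
sent_to_tv = bobbed[0::2, :]            # the interlaced output keeps only these lines
assert (sent_to_tv == field).all()      # in phase, the TV gets the original field back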

Incidentally, I finally figured out how I got a perfect 1080i picture without
bob. It absolutely required the double refresh rate patch, otherwise the
video drifts in and out of sync every few seconds even with everything else
identical. I've watched the CBS broadcast of the Buffalo-Cleveland NFL game
for over 30 minutes and 100,000 interlaced frames now and it's never once
lost sync. I'd appreciate it if others could test this, especially with display
devices other than the onboard component out (such as DVI with the predefined
1920x1080_60 tv modeline).
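
For reference, the generic CEA-861 1080i timing is usually written as a modeline along these lines (I can't say whether it matches the driver's predefined TV mode exactly):

ModeLine "1920x1080_60i" 74.250 1920 2008 2052 2200 1080 1084 1094 1125 Interlace +HSync +VSync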

My set-up:
-XV video sync enabled in nvidia-settings
-OpenGL video sync enabled (I require both to stop tearing, ymmv)
-Use Video As Timebase enabled (this is crucial or else the fps drifts all
over the place)
-No zoom or overscan in Mythfrontend's playback settings, at least vertically
(there's no point trying to sync things if their sizes aren't identical)
-The double refresh rate patch applied to SVN (0.20 will probably do fine as
well)

Obviously you can't be dropping frames from a bad recording or a loaded CPU or
hard disk, and I'm not sure if it will work if you're stuck with a station
that is broadcasting a combination of interlaced and progressive frames.
When you start playing the recording, there's a good chance it won't be in
sync. If it's not, pause and unpause the video, check any motion for judder,
and keep pausing and unpausing until it starts synced up. I think it's about
a 50% chance of it being synced whenever it starts or restarts playback so it
shouldn't take more than 5 pause-unpauses to get it right. Now, sit back and
watch the perfect 1080i picture, resisting the urge to pause or skip around
and throw things back out of sync ;)


-Alex
_______________________________________________
mythtv-users mailing list
mythtv-users [at] mythtv
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


digitalaudiorock at gmail

Dec 29, 2007, 1:50 PM

Post #3 of 7 (2999 views)
Permalink
Re: Refresh Rate reported for interlaced output (was: New nVidia 169.07 and interlace changes) [In reply to]

On Dec 26, 2007 11:17 PM, Alex Halovanic <halovanic [at] gmail> wrote:
> My understanding was that for most purposes X doesn't really distinguish
> between a progressive and interlaced display; it outputs a 1920x1080 picture
> at ~ 60fps and then the hardware device handles sending the interlaced
> picture to the tv, presumably by simply discarding half the field lines for
> each frame.
>
> Most of the time when you just display an interlaced video in myth, the fields
> being displayed for the video don't match up with the fields being sent
> to the device. Applying interlacing to a 60fps video gives you effectively
> about 30fps for moving objects; applying another round of interlacing
> introduced by the unsynced display makes it more like 15fps and very juddery
> for motion.
>
> Bob2x deinterlaces the picture by showing all of the 1080i fields at 60fps. It
> does this by showing just the odd field for one frame: the 1920x540 field is
> stretched out vertically to 1920x1080 (making each field line 2 pixels high
> instead of 1), and the even field is shown the same way on the next frame.
> This causes some very slight bounciness in thin elements or at the edge of
> motion, as some pixels are rapidly switching between showing an edge and
> showing the space adjacent to it.
>
> Even with just displaying static elements, you still necessarily get this
> bounciness on a 1080i display (check out some of the menu elements in the
> myth main menus for example). I think that any bounciness introduced by
> bobbing is covered up by the inherent bounciness in the 1080i display, even
> when the fields are being displayed in reverse order, and you're basically
> now just interlacing a progressive 60fps video, instead of reinterlacing an
> already interlaced video. Therefore: a perfect 1080i display at the expense
> of a lot of apparently unnecessary CPU work.
>

Thanks for the explanation, Alex. That makes sense. Actually, even
with bob I find the CPU usage to be very acceptable; it uses
less CPU than kernel deinterlacing. I'm not using XvMC...for some
reason it actually uses more CPU than just using libmpeg2 on my
system.

I'd still love to know why MythTV reports that refresh rate as 30.

> Incidentally, I finally figured out how I got a perfect 1080i picture without
> bob. It absolutely required the double refresh rate patch, otherwise the
> video drifts in and out of sync every few seconds even with everything else
> identical. I've watched the CBS broadcast of the Buffalo-Cleveland NFL game
> for over 30 minutes and 100,000 interlaced frames now and it's never once
> lost sync. I'd appreciate it if others could test this, especially with display
> devices other than the onboard component out (such as DVI with the predefined
> 1920x1080_60 tv modeline).
>
> My set-up:
> -XV video sync enabled in nvidia-settings
> -OpenGL video sync enabled (I require both to stop tearing, ymmv)
> -Use Video As Timebase enabled (this is crucial or else the fps drifts all
> over the place)
> -No zoom or overscan in Mythfrontend's playback settings, at least vertically
> (there's no point trying to sync things if their sizes aren't identical)
> -The double refresh rate patch applied to SVN (0.20 will probably do fine as
> well)
>
> Obviously you can't be dropping frames from a bad recording or a loaded CPU or
> hard disk, and I'm not sure if it will work if you're stuck with a station
> that is broadcasting a combination of interlaced and progressive frames.
> When you start playing the recording, there's a good chance it won't be in
> sync. If it's not, pause and unpause the video, check any motion for judder,
> and keep pausing and unpausing until it starts synced up. I think it's about
> a 50% chance of it being synced whenever it starts or restarts playback so it
> shouldn't take more than 5 pause-unpauses to get it right. Now, sit back and
> watch the perfect 1080i picture, resisting the urge to pause or skip around
> and throw things back out of sync ;)
>
>
> -Alex

It appears I was able to duplicate your test there. I have a 1080i RP
CRT connected via DVI. I'm still using a modeline, as I'm on the
100.14.11 driver and the built-in modes give me that half-screen
bug. The only setting you had there that I had never tried is the
'use video as timebase'. I've never been crazy about using video as
the timebase only because video is much more forgiving about dropping
stuff than audio.

I think you're correct...when watching 1080i shows that don't mix
frame rates (as NBC seems to do with dramas and comedies all the
time), CBS shows for example, it appears that, once it gets in sync,
it does seem to stay there if you just let it play. I was curious
though...you're not suggesting that's a viable way to use myth, are you
:D. If I were going to abandon pausing and time shifting, I'd just
flick my DVI switch over to my old Samsung HD receiver to get true
1080i :D.

An interesting test nonetheless. I don't know if you've ever
noticed this, but a user on the nVidia Linux forum has noticed that,
when outputting 1080i with the nVidia Linux drivers, the following:

nvidia-settings -q RefreshRate

...reports 60.05 even though xorg sees it as 59.9. That very well may
be at the heart of the whole problem of interlacing drifting in and
out of sync at fairly steady intervals. Apparently it reports
60.05 regardless of your modeline. Jeez...I really wish those folks
at nVidia would act like we're alive, you know?

Thanks again.

Tom
_______________________________________________
mythtv-users mailing list
mythtv-users [at] mythtv
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


halovanic at gmail

Dec 29, 2007, 11:01 PM

Post #4 of 7 (2980 views)
Permalink
Re: Refresh Rate reported for interlaced output (was: New nVidia 169.07 and interlace changes) [In reply to]

Well, I think nvidia is reporting the refresh rate wrong in nvidia-settings, as
mine shows 29.97 Hz for the component out. I guess it's almost right, since
59.94 Hz / 2 for interlacing = 29.97 full frames per second, but that's not a
real refresh rate by the correct definition. Maybe they just show TVs as always
29.97 and DVI devices at ~60, since they're both assumed to be about the
same? However, it seems from both of our experiences that it is in fact
producing the correct refresh rate, otherwise I don't think the video would
sync at all, certainly not at 60.05, so my conclusion is that nVidia's got it
just about right and MythTV's the one that's at fault.

I think the crucial thing with 'use video as timebase' is that the audio
tends to drift slightly and is not the best source to sync against, as I
end up seeing the fps in the verbose playback output going up and down a lot.
It's too bad the internal player causes DVDs to stutter horribly for me with
this option on, or I would use it.

Another problem I suspect is that MythTV doesn't do anything to sync to the
top field, so you end up with completely reversed sync half the time. If that
could be solved (I think I saw a rough patch for it once, but can't for the
life of me find it again) then that would eliminate the pause-unpause hack.

>I was curious
>though...you're not suggesting that's a viable way to use myth are you
>:D.  If I were going to abandon pausing and time shifting, I'd just
>flick my DVI switch over to my old Samsung HD receiver to get true
>1080i :D.
Of course not; I was actually advising pausing a LOT more, especially when you
time-shift ;)

Alex
Attachments: signature.asc (0.18 KB)


halovanic at gmail

Dec 30, 2007, 1:49 PM

Post #5 of 7 (2970 views)
Permalink
Re: Refresh Rate reported for interlaced output (was: New nVidia 169.07 and interlace changes) [In reply to]

I finally found the other patch for vsync Mark Kendall made:
http://www.gossamer-threads.com/lists/mythtv/users/270570#270570

Applying this one on top of all the other tweaks in this thread brings things
quite a bit closer to good interlaced playback. It's now coming out of a
single pause to a synced picture nearly 100% of the time. Curiously, almost
every time I start playing a recording it begins unsynced. Pausing it
fixes it, then any time I skip around it goes back to being unsynced very
reliably. However, a couple of times when I began playback it started up
perfectly synced and then maintained it when I skipped around! I suspect if
I could just determine a way to get it to start up synced properly and also
resync whenever it drops frames it would be just about there.

I've attached an updated version of Mark's patch against SVN since I'm not
sure it would apply cleanly.

Alex
Attachments: sync-15260.diff (6.02 KB)


mark.kendall at gmail

Jan 1, 2008, 5:16 PM

Post #6 of 7 (2934 views)
Permalink
Re: Refresh Rate reported for interlaced output (was: New nVidia 169.07 and interlace changes) [In reply to]

On 31/12/2007, Alex Halovanic <halovanic [at] gmail> wrote:
> I finally found the other patch for vsync Mark Kendall made:
> http://www.gossamer-threads.com/lists/mythtv/users/270570#270570
>
> Applying this one on top of all the other tweaks in this thread brings things
> quite a bit closer to good interlaced playback. It's now coming out of a
> single pause to a synced picture nearly 100% of the time. Curiously, almost
> every time I start playing a recording it begins unsynced. Pausing it
> fixes it, then any time I skip around it goes back to being unsynced very
> reliably. However, a couple of times when I began playback it started up
> perfectly synced and then maintained it when I skipped around! I suspect if
> I could just determine a way to get it to start up synced properly and also
> resync whenever it drops frames it would be just about there.

My first time online for almost 6 weeks, so I may be out of date here
and I definitely haven't had a chance to test the latest binary
drivers...

That patch was one of my earlier attempts to fix interlacing sync. If
you can use the opengl video renderer, the 'Interlaced' deinterlacer
should give you a more consistent result. It seems to be display
dependent (i.e. TV brand/model, not GPU, etc.) but it seems to cope better
with minor playback glitches, and with some hardware it will stay in
sync 100% of the time.

Now that (I think) there is double rate software deinterlacer support
in trunk (as needed for the greedy/yadif deinterlacers), I intend to
write a CPU-based version of the same filter in the near future, which
should give a more accessible solution for all.

Regards

Mark
_______________________________________________
mythtv-users mailing list
mythtv-users [at] mythtv
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


halovanic at gmail

Jan 1, 2008, 8:05 PM

Post #7 of 7 (2928 views)
Permalink
Re: Refresh Rate reported for interlaced output (was: New nVidia 169.07 and interlace changes) [In reply to]

> My first time online for almost 6 weeks, so I may be out of date here
> and I definitely haven't had a chance to test the latest binary
> drivers...
>
> That patch was one of my earlier attempts to fix interlacing sync. If
> you can use the opengl video renderer, the 'Interlaced' deinterlacer
> should give you a more consistent result. It seems to be display
> dependent (i.e. TV brand/model, not GPU, etc.) but it seems to cope better
> with minor playback glitches, and with some hardware it will stay in
> sync 100% of the time.

Ah, I was wondering what that filter did when I first saw it. Unfortunately
my onboard graphics can't even cope with SD and the opengl renderer at
1920x1080.

> Now that (I think) there is double rate software deinterlacer support
> in trunk (as needed for the greedy/yadif deinterlacers), I intend to
> write a CPU-based version of the same filter in the near future, which
> should give a more accessible solution for all.

That would be great, and I would be very willing to test anything you come up
with.


-Alex
Attachments: signature.asc (0.18 KB)
