
Mailing List Archive: MythTV: Users

Advanced 2X not working after system upgrade

 

 



justin.johnson3 at gmail

Oct 2, 2012, 6:35 PM

Post #1 of 4
Advanced 2X not working after system upgrade

Hi everyone,
I've upgraded from Mythbuntu 10.04.3 to 12.04.1 this weekend and am no
longer able to successfully use the VDPAU advanced 2x deinterlacer.
I am using the techniques outlined in the Judder free wiki page to
synchronize refresh rates. My TV's supported modes are:
*1920x1080 @ 6
*1920x1080 @ 2
*1920x1080 @ 23
*1920x1080 @ 60
*1920x1080 @ 59
*1920x1080 @ 59
*1280x720 @ 6

In Xorg.0.log I see the screen change to 59.94Hz when I start
playback, and I can confirm this with the info display on my TV and
with xrandr.
However, the Advanced2X deinterlacer is not working. I can see the
interlacing artifacts on the screen, and have seen at least once in
the playback log something along the lines of "monitor does not
support double framerate". The Advanced1X deinterlacer does work,
though it doesn't appear to be perfectly smooth.
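
For reference, the checks behind the observations above look roughly
like this (assuming the Mythbuntu default log directory /var/log/mythtv):

# confirm the refresh rate the X screen is actually running at
xrandr --current | grep '\*'
# look for the deinterlacer fallback message in the frontend log
grep -i "double framerate" /var/log/mythtv/*.log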

Important version numbers:
NVIDIA driver: 304.51
Linux kernel: 3.2.0-31-generic
Xorg: 1.11.3
MythTV: v0.25.2-31-g33c34da; compiled myself, with the only change
from fixes/0.25 being
http://code.mythtv.org/trac/changeset/21306c39/mythtv

I posted my full xorg.conf at http://pastebin.com/yM3smYbP

Any help troubleshooting would be very much appreciated.

--
Justin Johnson
_______________________________________________
mythtv-users mailing list
mythtv-users [at] mythtv
http://www.mythtv.org/mailman/listinfo/mythtv-users


salstrom at gmail

Oct 3, 2012, 6:15 AM

Post #2 of 4
Re: Advanced 2X not working after system upgrade [In reply to]

On Tue, Oct 2, 2012 at 6:35 PM, Justin Johnson
<justin.johnson3 [at] gmail> wrote:
> Hi everyone,
> I've upgraded from Mythbuntu 10.04.3 to 12.04.1 this weekend and am no
> longer able to successfully use the VDPAU advanced 2x deinterlacer.
>
> <<SNIP>>

The 3XX.XX series of Nvidia drivers handles modelines totally
differently. I've not been able to get it working well myself so I'm
on 295.75. I either lose modes or HD-audio (long story there).
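
If you want to see exactly which modes the new driver accepts or
rejects, its mode-validation logging is handy (a generic sketch; the
option goes in whatever "Device" section your xorg.conf already has):

# xorg.conf Device section: make the driver log mode validation in detail
Option "ModeDebug" "true"

# after restarting X, the accepted/rejected modes show up in the X log
grep -i "mode" /var/log/Xorg.0.log | less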

Here is a quote from an email chain I've had going on with an Nvidia developer:

"There was a significant change to the mode validation process in the
302.* driver series: previously, a "mode" consisted of a set of
"back-end timings," i.e. the timings the GPU actually sent to the
monitor, and "front-end timings," which were reported to the X server.
The front-end timings were basically only used for their height and
width, with the rest of the parameters ignored. That led to all
kinds of confusion when, for example, programs would calculate and
report the "refresh rate" of the front-end timings, even when that was
a lie and the back-end timings were completely different.

302.* did away with all that. Now, the modes reported by the driver
are the real timings sent to the monitor. This means that only modes
that your monitor can support are advertised. In particular, if your
flat panel reports that only its native mode is supported, then only
its native mode will appear in the mode pool. To get scaling, you can
use the new ViewPortIn and ViewPortOut attributes on MetaModes,
documented here:
ftp://download.nvidia.com/XFree86/Linux-x86/304.37/README/configtwinview.html

You can also get the driver to automatically include MetaModes that
scale to common resolutions via the IncludeImplicitMetaModes option
(which is on by default). These MetaModes are reported through the
RandR 1.1 mode list: you can query them with "xrandr --q1" and set
them with "xrandr -s <size number> -r <fake refresh rate number>".
They are *not* reported through RandR 1.2, since 1.2 gives
applications complete control over the display configuration.

I know it's confusing. We're working on making nvidia-settings better
at coping with the new configuration flexibility. In the meantime,
let me know if you have any questions about how to configure your
displays."
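
To make that concrete, a MetaModes line using ViewPortIn/ViewPortOut
might look like the sketch below (the resolutions are only examples,
not taken from the mail above), and the xrandr commands are the ones
he mentions:

# xorg.conf Screen section: drive the panel at its native 1920x1080
# timings while presenting a 1280x720 desktop, scaled by the GPU
Option "MetaModes" "1920x1080 { ViewPortIn=1280x720, ViewPortOut=1920x1080+0+0 }"

# list the implicit MetaModes exposed through RandR 1.1 ...
xrandr --q1
# ... and switch to one by size index and (possibly fake) refresh rate
xrandr -s 1 -r 60
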
_______________________________________________
mythtv-users mailing list
mythtv-users [at] mythtv
http://www.mythtv.org/mailman/listinfo/mythtv-users


justin.johnson3 at gmail

Oct 3, 2012, 4:11 PM

Post #3 of 4
Re: Advanced 2X not working after system upgrade [In reply to]

On Wed, Oct 3, 2012 at 8:15 AM, Neil Salstrom <salstrom [at] gmail> wrote:
> On Tue, Oct 2, 2012 at 6:35 PM, Justin Johnson
> <justin.johnson3 [at] gmail> wrote:
>> <<SNIP>>
>
> The 3XX.XX series of Nvidia drivers handles modelines totally
> differently. I've not been able to get it working well myself so I'm
> on 295.75. I either lose modes or HD-audio (long story there).
>
> Here is a quote from an email chain I've had going on with an Nvidia developer:
>

<<SNIP>>

Well that explains why the modes reported by xrandr changed to be more
in line with the actual frequencies. I'll roll back the driver to
295.XX and try that for a while. Sometimes the judder isn't noticeable,
and when I was playing with the playback filters last night it seemed
to go away, but I can't say for sure.

I guess my first reaction should have been to try an older driver.
Anyway, thanks for your help! You've certainly saved me a lot of time
and eye strain.

-- Justin
_______________________________________________
mythtv-users mailing list
mythtv-users [at] mythtv
http://www.mythtv.org/mailman/listinfo/mythtv-users


jyavenard at gmail

Apr 18, 2013, 4:38 PM

Post #4 of 4
Re: Advanced 2X not working after system upgrade [In reply to]

Hi


On 3 October 2012 23:15, Neil Salstrom <salstrom [at] gmail> wrote:

>
>
> The 3XX.XX series of Nvidia drivers handles modelines totally
> differently. I've not been able to get it working well myself so I'm
> on 295.75. I either lose modes or HD-audio (long story there).
>
> Here is a quote from an email chain I've had going on with an Nvidia
> developer:
>
> "There was a significant change to the mode validation process in the
> 302.* driver series: previously, a "mode" consisted of a set of
> "back-end timings," i.e. the timings the GPU actually sent to the
> monitor, and "front-end timings," which were reported to the X server.
> The front-end timings were basically only used for their height and
> width, with the rest of the parameters ignored. That led to all
> kinds of confusion when, for example, programs would calculate and
> report the "refresh rate" of the front-end timings, even when that was
> a lie and the back-end timings were completely different.
>


Well, I had a deeper look into it, and the above is certainly not what I'm
seeing at all.

The judder free wiki page works just fine; however, there seems to be
a bug in the NVCtrl X extension, in that the first entry of the
available modelines gets overridden from time to time, in which case
you can't switch to it.

I have amended the judder free wiki page with a workaround and
reported the issue to Nvidia; we'll see what they say.

In the meantime, it works perfectly for me with the modeline workaround.
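
For anyone who can't reach the wiki page, one generic way of pinning an
explicit modeline so the driver always has it available looks roughly
like the snippet below. This is only a sketch, not necessarily the
exact workaround described on the wiki; the identifier and the CVT
timings are placeholders (generate real ones with "cvt 1920 1080 60"
or take them from your TV's EDID):

Section "Monitor"
    Identifier "TV"
    # placeholder CVT timings for 1920x1080 @ 60 Hz, generated with "cvt 1920 1080 60"
    ModeLine "1920x1080_60.00" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync
EndSection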
