Re: What renderer, interlacer for CrystalHD card
tom at graniteskies
Dec 9, 2010, 9:13 AM

On Sat, Oct 16, 2010 at 1:55 PM, Raymond Wagner <raymond [at] wagnerrp> wrote:
> On 10/16/2010 14:45, dave [at] 0bits wrote:
>> Got my crystalhd card and recompiled with trunk/svn. All went well and
>> i've set up the decoder as crystalhd in the settings. but unsure what i
>> should use for the renderer, and primary/secondary deinterlacer. My
>> understanding that the crystalhd can do deinterlacing also but there doesn't
>> seem to be an option for it or do i just set it to 'none' ?
> The CrystalHD is a decoder, not an output card. It is a replacement for
> the 'standard' decoder. You need to use Xv or OpenGL just like before.
> mythtv-users mailing list
> mythtv-users [at] mythtv
I've pretty much used the defaults since I started with Myth, so forgive
me if this is an obvious question.
Assuming I want to use the CrystalHD to decode anything 720p and above... I
should have the "Match criteria" set to >= 1280 x 720, correct?
And if I wanted it to do everything, it would be <= 1920 x 1088?
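In other words, something like this pair of entries in the playback
profile editor (a rough sketch of what I have in mind; the renderer and
deinterlacer values are just my guesses, not anything I've tested):

  Priority  Match criteria  Decoder    Renderer  Deinterlacer
  1         >= 1280 720     crystalhd  opengl    none
  2         <= 1920 1088    ffmpeg     xv-blit   linearblend

My understanding is that the rules are checked in order and the first
match wins, so rule 1 would grab anything 720p and up for the CrystalHD
and rule 2 would catch everything else. Is that right?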