1080i HD-A2 with 1080P Full HD? - High-Def Digest Forums
#1 - 06-05-2007, 10:57 AM
ScorpionKingCT (Senior Member, Thread Starter)

I just wanted to bring this up to clear up some myths and truths.

It has been said that you don't have to spend more $$ on Blu-ray, which is full 1080p; the HD-A2 and many other players are only 1080i.

It has also been said that if I watch an HD DVD that is in 1080p and play it in a 1080i HD player, using a 1080p TV, the signal is reconstructed back to 1080p.
#2 - 06-05-2007, 11:32 AM
Feyd (Senior Member)

Quote:
Originally Posted by ScorpionKingCT
I just wanted to bring this up to clear up some myths and truths.

It has been said that you don't have to spend more $$ on Blu-ray, which is full 1080p; the HD-A2 and many other players are only 1080i.

It has also been said that if I watch an HD DVD that is in 1080p and play it in a 1080i HD player, using a 1080p TV, the signal is reconstructed back to 1080p.
Yes. Most people would be extremely hard-pressed to see the difference between a 1080i and 1080p signal.
#3 - 06-05-2007, 02:47 PM
Member

Quote:
Originally Posted by ScorpionKingCT

It has also been said that if I watch an HD DVD that is in 1080p and play it in a 1080i HD player, using a 1080p TV, the signal is reconstructed back to 1080p.
I believe this is correct (someone chime in)... 1080p TVs (in fact, all LCD TVs) display their images progressively, so a 1080i input over component/HDMI/DVI is ultimately deinterlaced by the TV and displayed at whatever the TV's native resolution is (typically 1366x768 or 1920x1080).

So, using an HD-A2, the picture actually goes through several processes: the content gets interlaced and output in 1080i, then is deinterlaced by the TV. Whether or not this is noticeable depends on the quality of the scalers/converters involved.
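
If it helps to picture why film-sourced content can survive that round trip, here is a rough Python/NumPy sketch (toy frame data and made-up names, not anything from an actual player or TV) of splitting a progressive frame into two fields and having the display weave them back together:

Code:
import numpy as np

# Stand-in for one decoded 1080p frame (1080 rows x 1920 columns of pixel values)
frame_1080p = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)

# Player side: interlace into a top field (even rows) and a bottom field (odd rows)
top_field = frame_1080p[0::2, :]     # 540 x 1920
bottom_field = frame_1080p[1::2, :]  # 540 x 1920

# TV side: a "weave" deinterlacer stitches the two fields back into one frame
rebuilt = np.empty_like(frame_1080p)
rebuilt[0::2, :] = top_field
rebuilt[1::2, :] = bottom_field

assert np.array_equal(rebuilt, frame_1080p)  # no picture information is lost

Of course, a real deinterlacer works on a live stream and has to figure out which fields belong together, which is exactly where cheap scalers fall down.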
#4 - 06-05-2007, 06:53 PM
Arkadin (Senior Member)

If the native resolution of the HDTV is 1080i, then a signal from the A2 is unaffected. If the HDTV can actually display a 1080p image, then the 1080i signal is deinterlaced and displayed in 1080p by the TV. But many so-called 1080p TVs DO NOT display 1080p; they only accept 1080p signals (sad, but true). So you really have to know exactly what type of TV you have.
#5 - 06-09-2007, 02:04 PM
Member

As long as your TV has an accurate deinterlacer (you can go to hometheatermag.com; there are a few articles showing which 1080p TVs pass which tests), there is NO difference in the image you will see whether it's getting 1080i or 1080p. On a 60Hz TV (I'll go into 24fps-capable TVs later), each 24fps film frame needs to be displayed 2.5 times on average. This is where 3:2 pulldown comes into play. A 1080p/60Hz signal from a film source will output the following sequence of frames: 1:1:1:2:2:3:3:3:4:4:5:5:5:...:23:23:23:24:24. Basically, the odd frames are displayed for 1/20th of a second (instead of the accurate 1/24th) and the even frames for 1/30th. This is what causes film judder, which is noticeable in slow panning scenes (end credits are a perfect example).
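
To put rough numbers on that cadence, here is a toy Python sketch (the function name is just made up for illustration):

Code:
# Toy model of 3:2 pulldown: mapping 24fps film frames onto 60Hz display refreshes
def pulldown_3_2(num_frames):
    """Return the 60Hz refresh sequence for num_frames of 24fps film (3:2 cadence)."""
    sequence = []
    for frame in range(1, num_frames + 1):
        repeats = 3 if frame % 2 == 1 else 2   # odd frames get 3 refreshes, even frames 2
        sequence.extend([frame] * repeats)
    return sequence

print(pulldown_3_2(4))        # [1, 1, 1, 2, 2, 3, 3, 3, 4, 4]
print(len(pulldown_3_2(24)))  # 60 refreshes for 24 frames, i.e. one second at 60Hz
# Odd frames sit on screen for 3/60 = 1/20 s, even frames for 2/60 = 1/30 s -> judder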

A 1080i signal, on the other hand, only sends half the information (one field, either the odd or the even lines) with every clock cycle. What you get then is 1(odd):1(even):1(odd):2(even):2(odd)...

All the information contained in the original frames is still there, so if you have a proper deinterlacer it combines the odd and even fields back into the original frames and displays them in the same manner as a 1080p/60Hz signal: 1:1:1:2:2...
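
Here is the same toy sketch extended to the 1080i/60 case (again, made-up names, just to show the cadence):

Code:
# Toy model of the same film cadence carried as 1080i/60: each 60Hz tick delivers one
# 540-line field (half a frame), alternating odd/even line sets
def fields_1080i_60(num_frames):
    """Field sequence (frame number, field parity) for 24fps film in a 1080i/60 signal."""
    fields = []
    parity = 0  # alternates on every tick, independent of frame boundaries
    for frame in range(1, num_frames + 1):
        repeats = 3 if frame % 2 == 1 else 2
        for _ in range(repeats):
            fields.append((frame, "odd" if parity == 0 else "even"))
            parity ^= 1
    return fields

print(fields_1080i_60(2))
# [(1, 'odd'), (1, 'even'), (1, 'odd'), (2, 'even'), (2, 'odd')]  -- matches the post above
# Pairing an odd and an even field from the same frame rebuilds the full frame, so a proper
# deinterlacer ends up showing the identical 1:1:1:2:2... picture a 1080p/60 feed would give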

A lot of people claim that having a player that outputs 1080p/24 is beneficial, when in fact it isn't. The reason a 24fps player output is irrelevant is that any TV that can display 24fps material (72, 96, or 120Hz refresh, typically) will be able to get the same result from a 1080i/60 or 1080p/60 signal, assuming its video processor functions properly (which, to the best of my knowledge, all 24fps-capable TVs do).
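
In code-sketch terms (again a toy illustration, not how any actual video processor is written), undoing the 3:2 cadence looks like this:

Code:
# Toy version of inverse telecine: collapse a 3:2 pulldown sequence back to the
# original 24fps film frames so the TV can re-time them for its own refresh rate
def inverse_telecine(sequence_60hz):
    frames = []
    for frame in sequence_60hz:
        if not frames or frames[-1] != frame:
            frames.append(frame)   # keep each film frame once, in order
    return frames

print(inverse_telecine([1, 1, 1, 2, 2, 3, 3, 3, 4, 4]))  # [1, 2, 3, 4]
# A real processor has to detect the cadence from the picture content itself (it doesn't
# get frame numbers handed to it), which is why deinterlacer quality matters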

Having a TV that refreshes at 72, 96, or (best of all) 120Hz is where you'll see a performance improvement; 120Hz is the best because it is a multiple of both 24 and 30, so it can properly display both film- and video-sourced content. The reason is that instead of performing a 3:2 pulldown, the TV's video processor can do a 5:5 pulldown (at 120Hz). As such, you'll get the following sequence: 1:1:1:1:1:2:2:2:2:2:3:3:3:3:3...
Each frame is now displayed for 1/24 of a second, eliminating any film judder.
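
And the 5:5 version of the earlier sketch (same made-up naming), just to show the timing:

Code:
# Toy model of 5:5 pulldown on a 120Hz panel: every film frame gets exactly 5 refreshes
def pulldown_5_5(num_frames):
    return [frame for frame in range(1, num_frames + 1) for _ in range(5)]

print(pulldown_5_5(3))        # [1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 3, 3, 3, 3, 3]
print(len(pulldown_5_5(24)))  # 120 refreshes for 24 frames -> one second at 120Hz
# Each frame is on screen for 5/120 = 1/24 s, so the uneven 1/20 s vs 1/30 s timing is gone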

James