High-Def Digest Forums


bboy412 10-14-2007 04:31 PM

1080i VS 1080p converted to 1080i
 
I was just wondering: if I'm watching 1080i content displayed on a 1366 x 768 TV, would it look better or worse than native 1080p content played in 1080i on that same TV? For example, broadcast HD vs. a Blu-ray played in 1080i.

Arkadin 10-14-2007 05:09 PM

ONLY CRT sets are CAPABLE of displaying 1080i. If you have an LCD or plasma, this is NOT an option. In other words, you most likely aren't watching 1080i; it is converted to 720p by your set. So the signals from both sources would be converted to 720p, and any difference will most likely be indiscernible.

punkerpat 10-18-2007 12:24 AM

Quote:

Originally Posted by Arkadin (Post 363747)
ONLY CRT sets are CAPABLE of displaying 1080i. If you have an LCD or plasma, this is NOT an option. In other words, you most likely aren't watching 1080i; it is converted to 720p by your set. So the signals from both sources would be converted to 720p, and any difference will most likely be indiscernible.

Dude, you have no idea what you are talking about. There are plenty of LCD and Plasma displays that show true 1080i/1080p.

However, you are correct that it will convert the input res to the display's native res.
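
To make the "converts the input res to the display's native res" point concrete, here is a rough sketch of what a scaler conceptually does when it maps a 1920 x 1080 frame onto a 1366 x 768 panel. This is plain Python/numpy with naive nearest-neighbour picking, purely for illustration; real TV scalers use much better filtering, and none of the names below come from any actual product.

Code:

import numpy as np

# Illustration only: map a 1920x1080 frame onto a 1366x768 panel by
# nearest-neighbour row/column picking. Real scalers filter far more carefully.
def nn_scale(frame, out_h, out_w):
    in_h, in_w = frame.shape[:2]
    rows = np.arange(out_h) * in_h // out_h   # which source row feeds each output row
    cols = np.arange(out_w) * in_w // out_w   # which source column feeds each output column
    return frame[rows][:, cols]

frame_1080 = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)
print(nn_scale(frame_1080, 768, 1366).shape)  # (768, 1366): the panel's native grid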

Aurora 10-18-2007 12:31 AM

Quote:

Originally Posted by punkerpat (Post 370819)
Dude, you have no idea what you are talking about. There are plenty of LCD and Plasma displays that show true 1080i/1080p.

However, you are correct that it will convert the input res to the display's native res.

LCD and Plasma displays cannot display interlaced signals because they are, by nature, progressive display devices. Any interlaced signal that is input must be deinterlaced to progressive before it can be displayed.

CRTs can, however, display an interlaced signal without any conversion required.
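
To illustrate what that deinterlacing step involves, here is a minimal sketch of a naive "bob" deinterlacer in Python/numpy, assuming each 540-line 1080i field is simply line-doubled into a full 1080-line frame. Actual sets use motion-adaptive processing, so treat this as an illustration rather than how any particular display works.

Code:

import numpy as np

# Naive "bob" deinterlacer: each incoming 1080i field (540 lines) becomes a
# full progressive frame on its own by repeating every line. Illustration only.
def bob_deinterlace(field):
    return np.repeat(field, 2, axis=0)

field = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)  # one 1080i field
print(bob_deinterlace(field).shape)  # (1080, 1920): a progressive frame per field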

Arkadin 10-18-2007 12:33 AM

The information I have read many times has indicated that LCD and plasma are inherently progressive-scan technologies. If other members also dispute this, I will gladly research it further.

Oh, thank god! Aurora to the rescue. :) I was really wondering if I had been giving out false info. :confused:

DSquared 10-18-2007 12:36 AM

Quote:

Originally Posted by punkerpat (Post 370819)
Dude, you have no idea what you are talking about. There are plenty of LCD and Plasma displays that show true 1080i/1080p.

However, you are correct that it will convert the input res to the display's native res.

I think what he means is that LCD and plasma screens always display the content in progressive scan. Even if the signal is interlaced, the TV converts it to progressive. This is why LCD screens are so much easier on your eyes: they don't have to work so hard to overcome the "flicker" that is created by the interlacing.

I have a 19" LCD monitor on my home computer but use dual 17" flat-screen CRTs at work. Sometimes I work from home, and when I go back and stare at those CRTs, I can actually feel my eyes having to adjust back to them.

A good way to test this is to hook up dual monitors on your computer (if your video card is so equipped), one a flat-panel LCD and the other a CRT, and then just shift your eyes back and forth. You will notice a huge difference between the interlaced CRT and the progressive LCD.

This is a big reason why it is really unnecessary to spend the extra money to buy a 1080p player (HD-A2 vs. HD-A20) if you are watching on a plasma or LCD screen. You get progressive scan anyway. :)

Helo 10-18-2007 04:36 AM

The difference is in the signal processing...
 
My guess would be that the difference in quality depends mostly on how many times the signal is converted back and forth between progressive and interlaced. Although 1080i and 1080p are the same resolution, 1080i carries roughly half the information in the signal at any one time, so the TV has to deinterlace it before displaying it. It stands to reason that the end result comes down to the TV's and the player's own processing capabilities.

BD/HD disc progressive -> HD player converts to interlaced -> TV converts to progressive.
Or:
BD/HD disc progressive -> HD player progressive -> TV progressive.

I've heard that some TVs and players do it like this (I hope it's not true):

BD/HD disc progressive -> HD player progressive -> HD player interlaces the signal in order to process it -> HD player reprocesses it back to progressive -> TV receives the progressive signal -> converts it to interlaced -> converts it back to progressive -> displays the progressive signal.
(And we wonder why we get lip-sync incompatibility. :))

Although I can't state which TVs and players process things in which way, I can tell you that my Pioneer PDP-427XA can handle/process 720p, 1080i, and 1080p/24Hz and displays them at 1024 x 768 (it cannot process 1080/60).
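
As a toy illustration of why the number of progressive/interlaced round trips can matter, here is a small Python/numpy sketch using simple field splitting and weaving (made-up helper names, not any real player's or TV's algorithm). If both fields come from the same instant, the round trip is lossless; if they come from different instants, or get shuffled by an extra conversion pass, simple weaving produces combing artifacts.

Code:

import numpy as np

def to_fields(frame):
    """Top field = even-numbered lines, bottom field = odd-numbered lines."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Interleave two fields back into one progressive frame."""
    out = np.empty((top.shape[0] * 2, top.shape[1]), dtype=top.dtype)
    out[0::2], out[1::2] = top, bottom
    return out

frame = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)

# p -> i -> p with both fields from the same instant: the round trip is lossless.
top, bottom = to_fields(frame)
assert np.array_equal(weave(top, bottom), frame)

# If the bottom field comes from a different instant (motion, or an extra
# conversion pass), weaving no longer reconstructs the original frame.
moved = np.roll(frame, 8, axis=1)             # pretend the picture shifted sideways
combed = weave(top, to_fields(moved)[1])
print((combed != frame).mean())               # > 0: "combing" on the odd lines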

"NTSC" DVD = 720 x 480 pixels (0.35 megapixel)
720p = 1280 x 720 pixels (0.92 megapixel)
1080i & 1080p= 1920 x 1080 pixels (2.1 megapixel)

1080i = 1920 x 540 pixels per refresh. (half as much information refreshed/processed at a time.)
1080p = 1920 x 1080 pixels per refresh.(the highest amount of information processed)
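
The arithmetic behind those figures is easy to check; this quick Python snippet just reproduces the pixel counts above.

Code:

# Reproduces the pixel counts listed above (nothing more than arithmetic).
formats = {
    '"NTSC" DVD':  (720, 480),
    "720p":        (1280, 720),
    "1080p frame": (1920, 1080),
    "1080i field": (1920, 540),   # half the lines per refresh
}
for name, (w, h) in formats.items():
    print(f"{name:12} {w} x {h} = {w * h / 1e6:.2f} megapixels")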

P.S. I'll add/edit if I'm wrong.

