1080i vs 720p, It seems 1080i is better for gaming - High-Def Digest Forums
  #1  
Old 09-19-2007, 09:17 PM
vahdyx
Senior Member
Thread Starter
 
Join Date: Feb 2007
Posts: 848
1080i vs 720p, It seems 1080i is better for gaming

Guys, I know this issue has been discussed countless times.

I have a problem though.

Every site I've visited says 1080i is inferior to 720p, but when I compare the two, the difference is mild, and 1080i actually seems sharper and brighter.

I just don't see why 720p is better.

I do notice 1080i being inferior with DVD movies (on the Xbox 360) because I get interlacing artifacts, but not with HD DVD.

It's frustrating that I can't get a definite answer. AHHHH! It's killing me. I just need to sell my WD-52525, get a Samsung HLT5087S, and put everything at 1080p.

Last edited by vahdyx; 09-19-2007 at 09:40 PM.
  #2  
Old 09-19-2007, 09:29 PM
Arkadin
Senior Member
 
Join Date: Mar 2007
Posts: 12,401

Part of the confusion arises because there are really two different kinds of 1080i, if you will. So-called "Full HD" sets have a native resolution of 1920x1080; if you were watching 1080i on one of those, you would be watching what I call "true" 1080i, and that form of 1080i is definitely superior to 720p. Sets with a native resolution of 768 lines (1366x768 or similar) are the ones sold as 720p/1080i sets, and on those the two resolutions end up essentially identical. I have furthermore read that unless you have a CRT set, you cannot even get real 1080i: plasma and LCD panels cannot physically display an interlaced image, so you are effectively getting 720p every time. So basically it boils down to this:

Best: 1080p
Next best: 1080i (from true Full HD sets)
Third best: 1080i (effectively the same as 720p)

Hope this clears things up.
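
For anyone who wants to see the raw numbers behind that ranking, here's a quick sketch of the pixel math (a toy illustration only; the 1366x768 figure for the 720p/1080i-type panels is just a typical value from that era, not the spec of anyone's particular set):

[CODE]
# Rough pixel math behind the 1080p > "true" 1080i > 720p ranking.
# Assumes a 1366x768 panel for the so-called 720p/1080i sets (a common
# figure for that era, not a spec for any particular model).

formats = {
    "1080p frame":           (1920, 1080),  # full progressive frame
    "1080i field (true HD)": (1920, 540),   # each interlaced field carries half the lines
    "720p frame":            (1280, 720),
    "720p/1080i panel":      (1366, 768),
}

for name, (w, h) in formats.items():
    print(f"{name:24s} {w}x{h} = {w * h:>9,} pixels")
[/CODE]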
  #3  
Old 09-19-2007, 09:36 PM
vahdyx
Senior Member
Thread Starter
 
Join Date: Feb 2007
Posts: 848

Sort of. My TV's native resolution is 1280x720, but it accepts both 1080i and 720p.


It's the Mitsubishi WD-52525 http://www.mitsubishi-tv.com/img/220...ifications.pdf <--- More info

So that was helpful, but now I feel a bit more confused.
  #4  
Old 09-19-2007, 09:37 PM
Aurora
Senior Member
 
Join Date: Aug 2007
Posts: 1,818

You should see less "flicker" with 720p because it is progressive, and so, in theory, it should be better for gaming. However, it's all completely dependent on the framerate being output.

1080i30: This is what HD channels broadcast when they say "1080i"; ATSC standards don't allow 1080i broadcasts at 60 Hz. Unfortunately, this is poor for fast motion, so you'll notice blur and ghosting. ESPN, ABC, and Fox don't broadcast in 1080i because of the amount of sports they show; the fast-moving players and objects in sports come out much better in...

720p30: In terms of motion, equivalent to 1080i60 when the latter is deinterlaced properly. Otherwise, this is "twice as good" as 1080i30: there are twice as many frames per second, so motion will appear much smoother.

720p60: Blows away the two previous ones. 1080p60 is equivalent in terms of motion ability, but they don't broadcast TV in that format yet.

1080i should be sharper for stationary shots because it has more lines of resolution. However, because the TV has to deinterlace (HDTVs, except for CRTs, are by nature progressive displays), and doing so filters out some detail, 720p and 1080i end up pretty close in appearance even when 1080i delivers twice the frames that 720p does.
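
To put rough numbers on the motion side of this, here's a small tabulation of the formats above (a sketch only; it assumes the usual convention that broadcast 1080i carries 60 fields per second, and whether those fields behave like 60 motion samples or 30 depends entirely on the deinterlacer, which is exactly what's being argued here):

[CODE]
# A rough tabulation of the formats mentioned above.
# US broadcast "1080i" carries 60 interlaced fields per second (30 full frames);
# whether those fields act like 60 motion samples or 30 depends on how well
# the TV deinterlaces.

formats = {
    # name: (width, lines per delivered image, delivered images per second)
    "1080i (broadcast)": (1920, 540, 60),   # fields
    "720p30":            (1280, 720, 30),   # frames
    "720p60":            (1280, 720, 60),   # frames
    "1080p60":           (1920, 1080, 60),  # frames
}

for name, (w, lines, per_sec) in formats.items():
    print(f"{name:18s} {per_sec:>3} images/s at {w}x{lines} "
          f"= {w * lines * per_sec:>12,} pixels/s")
[/CODE]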
  #5  
Old 09-19-2007, 09:53 PM
Arkadin
Senior Member
 
Join Date: Mar 2007
Posts: 12,401

So basically you have two factors that come into play: the actual native resolution of the set, and the frame rate (defined in Hz, or frames per second). Obviously Aurora knows way more than I do about the latter.

Also, many sets can ACCEPT 1080i signals but not be able to OUTPUT them.
Likewise, many sets can ACCEPT 1080p signals but not be able to OUTPUT them. The CE companies have lots of fun with this. On top of that, what the TV's display readout shows is often what the TV is accepting, not what it is outputting, and sometimes vice versa. This also confuses lots of people.

Last edited by Arkadin; 09-19-2007 at 10:11 PM.
  #6  
Old 09-19-2007, 10:37 PM
Aurora
Senior Member
 
Join Date: Aug 2007
Posts: 1,818

Quote:
Originally Posted by Arkadin
Also, many sets can ACCEPT 1080i signals but not be able to OUTPUT them.
Likewise, many sets can ACCEPT 1080p signals but not be able to OUTPUT them. The CE companies have lots of fun with this. On top of that, what the TV's display readout shows is often what the TV is accepting, not what it is outputting, and sometimes vice versa. This also confuses lots of people.
LCDs, DLPs, and plasma TVs are all fixed-resolution displays, while CRTs can change the size of the projected pixels. The "highest resolution" a non-CRT can display is called its native resolution, and it is the only output the TV can actually produce. Furthermore, non-CRT TVs are all progressive by nature, while CRTs these days can be either. A tidbit of history: interlacing was created because early CRTs (and the bandwidth feeding them) couldn't draw every line of every frame without problems, so the fix was to draw half the lines each pass, at twice the desired frame rate. Improvements in technology fixed that, and now CRTs are capable of displaying true progressive images.
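
If it helps to picture what "draw half the lines each pass" means, here's a toy sketch of splitting a frame into fields and weaving them back together (pure illustration; it is not how any particular TV's deinterlacer works):

[CODE]
# Toy illustration of interlacing: a frame is split into odd/even line fields,
# and a simple "weave" deinterlacer interleaves them back together.
# This is only lossless when nothing moved between the two fields.

def split_into_fields(frame):
    """frame is a list of scan lines; return (top_field, bottom_field)."""
    top = frame[0::2]      # lines 0, 2, 4, ...
    bottom = frame[1::2]   # lines 1, 3, 5, ...
    return top, bottom

def weave(top, bottom):
    """Interleave two fields back into a full progressive frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame.append(t)
        frame.append(b)
    return frame

frame = [f"line {i}" for i in range(8)]       # stand-in for 8 scan lines
top, bottom = split_into_fields(frame)
assert weave(top, bottom) == frame            # perfect only for static images
print(top, bottom, sep="\n")
[/CODE]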

The native resolution of the TV is usually what they advertise. A 720p TV displays only 720p images. Such TVs can usually accept 1080i/p signals, but cannot display them directly: a 1080i signal must be deinterlaced and scaled to 720p, and a 1080p signal must be downconverted to 720p, before the TV can display the image. If you know the native resolution of your TV, you know what it is outputting regardless of what you feed it.
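
Put another way, here's a simplified sketch of what a fixed 1280x720 panel has to do with each kind of incoming signal (the resolutions and processing steps are illustrative, not a description of any specific set):

[CODE]
# What a fixed 1280x720 panel has to do with each incoming signal
# (a simplified view of the points above; real TVs add their own processing).
NATIVE = (1280, 720)

def handle(signal):
    w, h, scan = signal          # e.g. (1920, 1080, "i")
    steps = []
    if scan == "i":
        steps.append("deinterlace")   # panel can only show progressive frames
    if (w, h) != NATIVE:
        steps.append(f"rescale {w}x{h} -> {NATIVE[0]}x{NATIVE[1]}")
    return steps or ["display as-is"]

for sig in [(1280, 720, "p"), (1920, 1080, "i"), (1920, 1080, "p"), (720, 480, "p")]:
    print(sig, "->", handle(sig))
[/CODE]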

The truth is that LCDs, DLPs, and plasmas cannot display true 1080i at all, because they need a progressive image. They convert it to either 720p or 1080p for display purposes.

That CRTs can change the size of their pixels is why a DVD that has not been upscaled will look better on a CRT than on one of the other types of display. The CRT can natively display a 480p image through remapping, while the others have to convert it to a 720p image, which induces blur, since the TV just kinda zooms in rather than applying a true upconversion algorithm like a dedicated scaler uses. If we had stuck with CRT technology, there would in all likelihood be no DVD upscaling.
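
Here's a crude sketch of the "just kinda zooms in" point: plain pixel-repeat scaling versus even a simple linear interpolation, on one made-up row of pixel values (nothing specific to any real scaler chip):

[CODE]
# Crude illustration of "zooming in" (nearest-neighbor / pixel repeat) versus a
# simple linear interpolation, scaling one row of 4 pixel values up to 8.
# Real scalers are far more sophisticated; this only shows why plain pixel
# repetition looks blockier than a proper upconversion.

src = [10, 200, 30, 120]          # made-up brightness values for 4 source pixels
out_width = 8

nearest = [src[int(x * len(src) / out_width)] for x in range(out_width)]

linear = []
for x in range(out_width):
    pos = x * (len(src) - 1) / (out_width - 1)   # map output pixel back to source
    i = int(pos)
    frac = pos - i
    j = min(i + 1, len(src) - 1)
    linear.append(round(src[i] * (1 - frac) + src[j] * frac))

print("nearest:", nearest)
print("linear: ", linear)
[/CODE]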
  #7  
Old 09-19-2007, 10:54 PM
mrkiller
Senior Member
 
Join Date: Feb 2007
Posts: 3,545

I have a 1080p set, and 1080i looks a whole lot better than 720p (for HD DVD).

1080i also gets 1:1 pixel mapping while 720p doesn't (on some 1080p sets).
  #8  
Old 09-20-2007, 09:55 AM
Senior Member
 
Join Date: Jun 2007
Posts: 116

I have my HD DVR set to automatically change the resolution to what the station is broadcasting in.

It will auto-switch between 1080i and 720p.

My question is, do you think this can harm the TV over time? I'm constantly channel surfing, and I wonder whether having the resolution change so often has any effect on the TV.
  #9  
Old 09-20-2007, 01:35 PM
vahdyx
Senior Member
Thread Starter
 
Join Date: Feb 2007
Posts: 848

So basically, you're saying my TV (since its native resolution is 720p) is displaying 720p even when I make my Xbox pump out 1080i? Then why is there a difference in image quality?
  #10  
Old 09-20-2007, 03:20 PM
Aurora
Senior Member
 
Join Date: Aug 2007
Posts: 1,818

Quote:
Originally Posted by mrkiller
I have a 1080p set, and 1080i looks a whole lot better than 720p (for HD DVD).

1080i also gets 1:1 pixel mapping while 720p doesn't (on some 1080p sets).
720p will have to be converted up to 1080p for display purposes. Fixed-pixel displays--everything that's not a CRT--CANNOT resize their pixels. What they do is double up certain pixels to stretch the image to the needed size.
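
To make the "double up certain pixels" bit concrete, here's a tiny sketch of how a 1280-wide line maps onto a 1920-wide panel with simple nearest-neighbor repetition (an illustration only; real sets use better filtering than this):

[CODE]
# 1280 -> 1920 is a 1.5x stretch, so with simple nearest-neighbor mapping
# every other source pixel ends up duplicated on the panel.

src_width, panel_width = 1280, 1920

# For each panel pixel, which source pixel does it come from?
mapping = [px * src_width // panel_width for px in range(panel_width)]

print(mapping[:9])   # [0, 0, 1, 2, 2, 3, 4, 4, 5] -> every other pixel doubled
duplicated = sum(mapping.count(i) > 1 for i in range(src_width))
print(f"{duplicated} of {src_width} source pixels get doubled")
[/CODE]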

Quote:
My question is, do you think this can harm the TV over time? I'm constantly channel surfing, and I wonder whether having the resolution change so often has any effect on the TV.
It shouldn't. The 1080i signal never actually makes it to the pixels. It gets converted before that and then displayed.

Quote:
So basically, you're saying my TV (since its native resolution is 720p) is displaying 720p even when I make my Xbox pump out 1080i? Then why is there a difference in image quality?
Yes. Maybe you perceive a difference because you think there should be one. You'd be amazed at how the mind works like that--if you think it should be better, you find that it is, even if it's really not.


