Let's look at the effect of bitrate on image quality - High-Def Digest Forums
  #1  
Old 04-18-2007, 02:34 AM
Senior Member
Thread Starter
 
Join Date: Jan 2007
Posts: 1,381
Let's look at the effect of bitrate on image quality

We talk a lot about the effect bitrate has on the quality of an image. I wanted to test this. After looking all over the web I could not find a source of sufficient quality to make for a good test. Then I had an idea.

Some games allow you to export screen caps. I recorded about a 5-minute loop in Far Cry, then had it replay and save out each frame of the loop, generating about 9,500 JPEGs of roughly 2MB each. That is very low compression and pretty much lossless relative to the original frames. I used these to generate an uncompressed AVI file, ~30GB. I clipped off the uninteresting beginning and end, and the result was a bit under 3 minutes (5,000 frames).
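
If anyone wants to reproduce the AVI step, something like this works, driving ffmpeg from a small Python script. ffmpeg is just one tool that can do it, and the frame-name pattern, frame rate, and output name below are placeholders; adjust them to match your own dump.

Code:
# Sketch: stitch a numbered JPEG dump into an uncompressed AVI with ffmpeg.
# The frame pattern, frame rate, and output name are placeholders.
import subprocess

subprocess.run([
    "ffmpeg",
    "-framerate", "30",        # source frame rate
    "-i", "frame%05d.jpg",     # numbered frame pattern
    "-vcodec", "rawvideo",     # uncompressed video
    "-pix_fmt", "yuv420p",     # a typical encoder input format
    "source_uncompressed.avi",
], check=True)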

This source is fairly difficult for a codec because of the high-frequency detail textures the game uses. You can see this in the detail of the ground and walls. It is slightly easier to encode than film grain, but more difficult than a typical CGI film, which is ray traced.

The next step was to do some encoding. Thankfully Ben, the VC-1 expert, had this to say over at AVS. He was also nice enough to provide links and suggested settings for VC-1, which worked very well.
Quote:
Fully understood. Which is why I keep suggesting that folks download our free encoder and make some samples for themselves to see for themselves .
http://www.avsforum.com/avs-vb/showt...&#post10220450


For AVC I downloaded a utility called MeGUI, which uses the x264 codec for encoding. It took me a few tries to get the settings right and get decent results, since I didn't have any good suggested settings to start from.

Now on to the results. I compressed the clip several times. I used VC-1 at 10, 15, and 20Mbps. The public VC-1 encoder is limited to 20Mbps, so I could not test higher bitrates. For AVC I tried 15, 20, 25, 30, and 35Mbps. I used variable bitrate for both, allowing the encoder to do its best to distribute the bits. The AVC encoder let me specify a peak rate, so I used the 40Mbps peak that BD allows as the cap.
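
In x264 terms, each AVC run amounts to a two-pass VBR encode with the VBV peak capped at 40Mbps. This is not the exact command line MeGUI generated for me, but a rough sketch of the equivalent, scripted from Python, looks like this. The buffer size and file names are placeholders, and your x264 build needs lavf/Avisynth support to read an AVI directly; otherwise feed it raw YUV.

Code:
# Sketch: two-pass x264 VBR encodes at several average bitrates, with the
# peak capped at the 40 Mbps that Blu-ray allows (a VBV constraint).
# Use "NUL" instead of "/dev/null" on Windows.
import subprocess

SOURCE = "source_uncompressed.avi"  # placeholder name

for kbps in (15000, 20000, 25000, 30000, 35000):
    common = [
        "x264",
        "--bitrate", str(kbps),           # target average bitrate (kbps)
        "--vbv-maxrate", "40000",         # 40 Mbps peak cap
        "--vbv-bufsize", "30000",         # decoder buffer size (assumed)
        "--stats", f"pass_{kbps}.stats",  # shared first-pass statistics
    ]
    # First pass: analyze only, throw the video away.
    subprocess.run(common + ["--pass", "1", "-o", "/dev/null", SOURCE],
                   check=True)
    # Second pass: use the stats to put bits where they help most.
    subprocess.run(common + ["--pass", "2", "-o", f"avc_{kbps}.264", SOURCE],
                   check=True)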

I picked one frame from the original source, then used Media Player Classic to capture that same frame from each encode. I would suggest downloading the images and using the Windows preview feature to flip between them. It makes the differences easier to see.
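
If you'd rather script the grabs than step through them by hand, the same frame number can be pulled out of every encode with ffmpeg. The frame number, container, and file names below are placeholders.

Code:
# Sketch: pull the same frame out of each encode as a lossless PNG so the
# captures line up exactly. Frame number and file names are placeholders.
import subprocess

FRAME = 1234
ENCODES = ["vc1_10", "vc1_15", "vc1_20",
           "avc_15", "avc_20", "avc_25", "avc_30", "avc_35"]

for name in ENCODES:
    subprocess.run([
        "ffmpeg",
        "-i", f"{name}.mkv",               # placeholder container name
        "-vf", f"select=eq(n\\,{FRAME})",  # keep only frame FRAME
        "-vsync", "0",                     # don't duplicate or drop frames
        "-vframes", "1",                   # write a single image
        f"{name}_frame{FRAME}.png",
    ], check=True)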

Each image that follows is losslessly compressed, about 4MB each.


VC-1 @ 10Mbps
VC-1 @ 15Mbps
VC-1 @ 20Mbps

AVC @ 10Mbps [coming soon]
AVC @ 15Mbps
AVC @ 20Mbps
AVC @ 25Mbps
AVC @ 30Mbps
AVC @ 35Mbps


Finally, the reference frame.

I will not tell you what to make of the above images. But I am curious which of them everyone thinks are “good enough” and which they would actually “want to watch”.

Last edited by enigma; 04-18-2007 at 01:19 PM.
  #2  
Old 04-18-2007, 02:58 AM
Balian
Senior Member
 
Join Date: Jan 2007
Posts: 3,207

To be honest, there is not enough difference between 20Mbps VC-1 and 30Mbps AVC to worry about which one is better. In most details the AVC is a little bit sharper, as in the finer details of the grass. However, on the barrel the VC-1 image was truer to the original. Whether this is a good thing or not is arguable, because it was blocky in the original image, which does not seem natural on a rounded surface. If you compare [email protected] vs [email protected], then there is a huge difference.

The original image was not ideal for comparing the two codecs.

Last edited by Balian; 04-18-2007 at 03:01 AM.
  #3  
Old 04-18-2007, 04:04 AM
onyxx
Senior Member
 
Join Date: Jan 2007
Posts: 329

Very interesting stuff. I compared the [email protected] vs the [email protected] and found the AVC quite a bit sharper. Like Balian says, the difference isn't huge, but it is there...
  #4  
Old 04-18-2007, 05:34 AM
Indy_aka_Rex
Senior Member
 
Join Date: Feb 2007
Posts: 19,228

Is it just me, or does VC-1 at 20Mbps look crisper than AVC at 30Mbps?
  #5  
Old 04-18-2007, 07:25 AM
Banned
 
Join Date: Oct 2006
Posts: 4,955

I found the AVC 30 to be the crispest, but only by a tiny margin, when looking at the long grass in the distance.
  #6  
Old 04-18-2007, 09:32 AM
Senior Member
 
Join Date: Dec 2006
Posts: 4,973

Although there are too many variables in this test to make any final judgement on the quality of these codecs, I still have to say thank you for taking the time to make this effort.

Looking at the screen grabs, the first thing I noticed was the white symbol on the barrel in the bottom right area. It really doesn't look good in the AVC sample until it hits the 30Mbps level. The VC-1 version looked pretty good at 20Mbps. At 15Mbps, the AVC version was complete mush and totally unrecognizable. Although the VC-1 codec got very blocky at 10Mbps, it retained the overall shape and detail fairly well, at least to the point of recognizing what the object is.

But again, due to the array of variables when encoding video, it's really impossible to choose a "winner" in an example like this. A solid A/B comparison could perhaps be accomplished by getting experts with professional encoding tools to work with uncompressed source files and optimize for different bitrates and such.

While this experiment is certainly interesting, scientifically it simply proves that Enigma, with the tools and tool skills available to him at this time, can generate a VC-1 video file that is probably closer to the master than he can with AVC. I certainly wouldn't pass judgement on either codec based on this test.
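
For what it's worth, one way to take some of the eyeballing out of an A/B like this is to compute a crude objective number such as PSNR for each capture against the reference grab. It's no substitute for actually looking, but it is repeatable. A rough Python sketch; the file names are placeholders:

Code:
# Sketch: PSNR of each capture against the reference frame. Higher means
# closer to the reference; it's a crude proxy, not a substitute for eyes.
# Images must all have the same dimensions.
import math
import numpy as np
from PIL import Image

ref = np.asarray(Image.open("reference_frame.png"), dtype=np.float64)

for name in ("vc1_20_frame.png", "avc_30_frame.png"):
    img = np.asarray(Image.open(name), dtype=np.float64)
    mse = np.mean((ref - img) ** 2)
    psnr = 10 * math.log10(255.0 ** 2 / mse)  # assumes 8-bit channels
    print(f"{name}: {psnr:.2f} dB")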
  #7  
Old 04-18-2007, 01:22 PM
Senior Member
Thread Starter
 
Join Date: Jan 2007
Posts: 1,381

35Mbps added.

It's interesting. I wasn't attempting to discuss AVC vs VC-1. After doing this test I was convinced that it really doesn't matter which one you use, as long as you give it enough bits and the choice isn't MPEG-2.

What I was trying to determine is whether there is a noticeable difference in quality as bitrate goes up with the advanced codecs. I saw a lot of claims that the extra bits would not make a difference.
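
One way to put a number on the "extra bits" question is to plot quality against bitrate and look for the knee where the curve flattens out. Something like this, assuming you have already computed a metric such as PSNR for each encode; the values below are placeholders, not my measurements:

Code:
# Sketch: plot quality against bitrate to see where extra bits stop
# paying off. The PSNR numbers are placeholders, not measured results.
import matplotlib.pyplot as plt

bitrates = [15, 20, 25, 30, 35]        # average bitrate, Mbps
psnr = [38.1, 39.4, 40.2, 40.7, 41.0]  # placeholder values, dB

plt.plot(bitrates, psnr, marker="o")
plt.xlabel("Average bitrate (Mbps)")
plt.ylabel("PSNR vs. reference (dB)")
plt.title("Quality vs. bitrate")
plt.savefig("rate_quality.png")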

BTW: If someone has access to a more film-like source, I would be happy to compress it to see how it turns out. This was simply the best I had to work with.
  #8  
Old 04-18-2007, 02:10 PM
AV_Integrated
Senior Member
 
Join Date: Oct 2006
Posts: 7,153

Actually, I'm surprised that MPEG-2 isn't included here, because it is the other of the three major codecs, and in comparisons made at one point on AVS it was apparent that MPEG-2 was delivering a sharper image than VC-1, albeit at a higher bitrate.

I found that one of the best areas of the image for seeing compression effects, especially with AVC, is in the upper right corner, where there is a piece of netting with a camo-painted wall behind it. That section contains two low-contrast, very similar colors, and it is extremely apparent how much compression is taking place there across almost all of the encodes. Yes, sharp edges and contrast hold up well in finely detailed areas, but areas with subtle hues are clearly affected by the encoding.
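
If anyone wants to stare at just that patch, cropping the same region out of every capture makes flipping between them easier. A quick sketch with Python and PIL; the box coordinates are made up, so adjust them to where the netting actually sits:

Code:
# Sketch: crop the same region (e.g. the netting in the upper right) out
# of every capture for easier flipping. Box coordinates are made up.
from PIL import Image

BOX = (1500, 50, 1900, 350)  # (left, upper, right, lower), in pixels

for name in ("vc1_20_frame.png", "avc_30_frame.png", "reference_frame.png"):
    Image.open(name).crop(BOX).save(name.replace(".png", "_crop.png"))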

Of course, it's still very difficult to say how the encoders used in this example really compare to those being used by production studios.
  #9  
Old 04-18-2007, 02:54 PM
Senior Member
 
Join Date: Jan 2007
Posts: 2,012

To the right of the screen there is a tree with a red flower next to it. The flower looks sharper in the AVC 35Mbps encode than in the VC-1 20Mbps one. How about a higher-bitrate VC-1 to compare to the higher-bitrate AVC?

Last edited by ckelly79; 04-18-2007 at 03:02 PM.
  #10  
Old 04-18-2007, 09:43 PM
Senior Member
 
Join Date: Jan 2007
Posts: 452

Quote:
Originally Posted by enigma View Post
BTW: If someone has access to a more film-like source, I would be happy to compress it to see how it turns out. This was simply the best I had to work with.

This link leads to a page with files recorded from the Panasonic HVX-200, which are more "movie-like" than video game footage. The files, I understand, are massive.

http://www.dvinfo.net/conf/showthread.php?p=660257

The bad news: the files are recorded in HDPRO-100, so the resolution is 1280x1080, and they are 24p files... Also, one must register to get into the download section.


Good luck,

Bob Diaz