  1. #751
    Blinx123 (Member; joined Mar 2007; 9,787 posts)

    Quote Originally Posted by Badger3920 View Post
    So I went in the case this morning and the screws holding the 212 in there were pretty loose, and the cooler could be jiggled and moved, which is bad. So I recentered it and screwed it in a bit more tightly. Idle temps have dropped, but it's obvious I need to reapply thermal paste. Still in the upper 30s at idle, and hitting 70 in StarCraft 2. So, enough of that. Gotta go get some Arctic Silver 5
    You didn't remove the cooler before shipping it off to NYC?
    You should consider yourself lucky. Some people destroy their mainboard and/or all of their cards.
    PSN/XBL: BLINX1234

    Bernie 2016!

    Interim Project Leader @ Platinum Arts

    Currently working on the next generation of Platinum Arts Sandbox.
    An easy-to-use, multi-platform gaming engine for Windows, Mac OS X, and GNU/Linux (next-generation console releases TBA)

  2. #752
    Badger3920 (Member; joined Jan 2008; SF Bay Area; 23,659 posts)

    Quote Originally Posted by Blinx123 View Post
    You didn't remove the cooler before shipping it off to NYC?
    You should consider yourself lucky. Some people destroy their mainboard and/or all of their cards.
    Live and learn
    Home Theater: Panasonic PT-ae4000u, 110" Carada Brilliant White Screen, PS4
    Gaming PC: Intel i5 2500k, EVGA GTX 1080 FTW, 16GB DDR3 RAM, 256GB Samsung 840 Pro SSD, 1TB Samsung 850 EVO SSD

  3. #753
    Favelle (Member; joined Nov 2007; Victoria, BC, Canada; 18,155 posts)

    Quote Originally Posted by Blinx123 View Post
    The Noctua NT-H1 is the one I'm going to use, as I like the way it's applied (apply a single drop to the centre, then lower the cooler onto the CPU. Done).
    ?? They are pretty much all applied like that. Even the CoolerMaster paste that comes WITH the Hyper 212 is applied like that.
    Quote Originally Posted by twonunpackmule View Post
    You don't keep friends who use low resolutions.
    Quote Originally Posted by Boston007 View Post
    "You don't keep friends who use low volume."
    Intel i7 4770k (@4.5Ghz) + CrossFire Gigabyte R9 290x's w/ 4GB video RAM! + 16GB DDR3 RAM (STEAM/Origin: Nealon_Greene)
    Elitist >> mindless peon
    Yamaha 6260 - 7.1 Polks - 2 LLT subs powered by 1000W
    Optoma HD20 with 123" fixed screen for movies

  4. #754
    Blinx123 (Member; joined Mar 2007; 9,787 posts)

    Quote Originally Posted by Favelle View Post
    ?? They are pretty much all applied like that. Even the CoolerMaster paste that comes WITH the Hyper 212 is applied like that.
    Not all of them. Liquid metal/diamond ones aren't.
    The only ones (that I know of) that are all applied like that are ceramic pastes.

    It all depends on the consistency of the paste.

  5. #755
    CorruptedDragon (joined Jan 2007; Portland, Maine; 22,927 posts)

    Quote Originally Posted by Badger3920 View Post
    Live and learn
    I've never removed the cooler in the past when moving and shipping. Never had a problem.
    Even Sponsored by Crony Mule admits:
    Quote Originally Posted by twonunpackmule View Post
    Uncharted peaked with 2 and each one just gets worse.

    I'll say this...I actually finished 3, I couldn't bring myself to finish 4. It was so boring.

  6. #756
    Blinx123 (Member; joined Mar 2007; 9,787 posts)

    Darn. I have just discovered the new MSI Z68A-GD80 (G3), which is quite a bit cheaper.
    Perhaps I should've gone with that one? It sports a small-footprint Linux as well (not that I would need that; booting Puppy Linux from CD is just as fine).

    What really confuses me is how PCIe 3.0 works on each of those mainboards.
    The MSI one sports 2 PCIe 3.0 slots (just like all the lower-end ASRock Extreme series boards), running at x16 and x8 respectively, while the ASRock is limited to one PCIe 3.0 slot running at x16.

    Now MSI says their board can power up to two NVIDIA cards in SLI but can also do triple CrossFire, while ASRock promises triple SLI as well.
    Could that be due to the SLI bridge not stretching far enough on the MSI board?

    Also puzzled as to how SLI with NVIDIA's Kepler will work on my ASRock Extreme7 board. Simultaneously running one card in PCIe 3.0 mode and the other in PCIe 2.0 mode probably won't work, but will triple SLI (switching that Gen 3 slot to PCIe 2.0)? And will there even be a difference between PCIe 2.0 x16/x16 and PCIe 3.0 x16/x8?

    Questions and nothing but questions. Guess I will have to experiment a bit.

    EDIT: Read up on the PCIe specifications. PCIe 3.0 x8 and PCIe 2.0 x16 are the same as far as bandwidth is concerned. That's quite a relief. It might not concern me at all though, as I'm thinking about going "Ultra" (single-PCB/multi-GPU) with Kepler.

    On an even more positive note: the ASRock Extreme7's OC seems to fare much better.

    I guess the better OC, three-way SLI, and the five PCIe x16 slots (4 PCIe 2.0 + 1 PCIe 3.0, as opposed to 3 on that MSI board) were worth the 40-euro premium. Also, I know ASRock's customer support, and I know they are good at supporting multiple CPU generations on a single mainboard platform. Can't say the same about MSI (my brother is a big MSI buff, while I have never had a single MSI board on any modern platform).

    What I would have really liked on my new board are 2 PCI slots. Guess I will have to use that PCIe x1 slot instead (I'm talking about a simple daughterboard with no actual electrical PCI/PCIe connector, so I guess it doesn't matter how or where I insert it).
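    The bandwidth equivalence from the EDIT above checks out with quick arithmetic. A minimal sketch, assuming the nominal PCIe per-lane transfer rates and encoding overheads (5 GT/s with 8b/10b for 2.0, 8 GT/s with 128b/130b for 3.0; the figures are from the PCIe specs, not from this thread):

    ```python
    def lane_bandwidth_gbps(transfer_rate_gt, encoded_bits, payload_bits):
        """Usable one-lane bandwidth in GB/s: payload fraction of the raw rate, 8 bits per byte."""
        return transfer_rate_gt * (payload_bits / encoded_bits) / 8

    # PCIe 2.0: 5 GT/s, 8b/10b encoding -> 0.5 GB/s per lane
    pcie2_lane = lane_bandwidth_gbps(5, 10, 8)
    # PCIe 3.0: 8 GT/s, 128b/130b encoding -> ~0.985 GB/s per lane
    pcie3_lane = lane_bandwidth_gbps(8, 130, 128)

    print(f"PCIe 2.0 x16: {pcie2_lane * 16:.2f} GB/s")  # 8.00 GB/s
    print(f"PCIe 3.0 x8:  {pcie3_lane * 8:.2f} GB/s")   # 7.88 GB/s
    ```

    So a Gen 3 slot running at x8 delivers roughly the same bandwidth as a Gen 2 slot at x16.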
    Last edited by Blinx123; 09-18-2011 at 04:18 PM.

  7. #757
    Badger3920 (Member; joined Jan 2008; SF Bay Area; 23,659 posts)

    I'd stick with your ASRock board and call it a day if it runs both SLI and CrossFire

  8. #758
    Badger3920 (Member; joined Jan 2008; SF Bay Area; 23,659 posts)

    Resolved my issue. I apparently installed the backplate like a dumbass when I built the rig. I removed the cooler (and found that some of the backplate nuts had completely fallen off into the case) and the motherboard, cleaned all the old TIM off, reinstalled the backplate in a competent manner, put everything back together, and my temperatures seem much better

  9. #759
    Blinx123 (Member; joined Mar 2007; 9,787 posts)

    Quote Originally Posted by Badger3920 View Post
    I'd stick with your ASRock board and call it a day if it runs both SLI and CrossFire
    Any reason why it shouldn't? Every Z68 board can do SLI and CrossFire.

    But yeah. Sticking with the ASRock.
    MSI can't even get their homepage right, apparently. According to their listing, the board has 2 + 3 PCIe x1 slots (which almost turned me into a madman, because I figured the first 2 were PCIe 3.0 x1 or 2.0 x2. Guess that isn't the case; from what I see in all the photographs, the board has fewer PCIe slots than mine).

    How does a PCIe x4 card work on a board that only has one PCIe x1 slot and 5 PCIe x16 slots, btw? Assuming I insert two GPUs and one additional input card into those x16 slots, will this limit the GPUs (make them drop to PCIe x8)?
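    On the x4-card question, one thing that holds for PCIe in general: a link trains to the widest width that both the slot's electrical wiring and the card support, so an x4 card in a physical x16 slot simply runs at x4. Whether the GPUs then drop to x8 depends on how that specific board shares lanes between slots, which only its manual can answer. A toy sketch of the negotiation rule (the function name is mine, not from any board's documentation):

    ```python
    def negotiated_width(slot_electrical_lanes, card_lanes):
        """A PCIe link trains to the widest width supported by both link partners."""
        return min(slot_electrical_lanes, card_lanes)

    print(negotiated_width(16, 4))   # 4: an x4 card in an electrically x16 slot runs at x4
    print(negotiated_width(4, 16))   # 4: an x16 card in an electrically x4 slot also runs at x4
    ```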

  10. #760
    CorruptedDragon (joined Jan 2007; Portland, Maine; 22,927 posts)

    Quote Originally Posted by Blinx123 View Post
    Any reason why it shouldn't? Every Z68 board can do SLI and CrossFire.

    But yeah. Sticking with the ASRock.
    MSI can't even get their homepage right, apparently. According to their listing, the board has 2 + 3 PCIe x1 slots (which almost turned me into a madman, because I figured the first 2 were PCIe 3.0 x1 or 2.0 x2. Guess that isn't the case; from what I see in all the photographs, the board has fewer PCIe slots than mine).

    How does a PCIe x4 card work on a board that only has one PCIe x1 slot and 5 PCIe x16 slots, btw? Assuming I insert two GPUs and one additional input card into those x16 slots, will this limit the GPUs (make them drop to PCIe x8)?
    Even if they did drop to x8, there has been no measurable difference in games between x8 and x16.

  11. #761
    Blinx123 (Member; joined Mar 2007; 9,787 posts)

    Quote Originally Posted by CorruptedDragon View Post
    Even if they did drop to x8, there has been no measurable difference in games between x8 and x16.
    Sure. But this might change with Kepler and other PCIe 3.0 compliant cards.

    Just checked, though: the PCIE5 slot is supposed to be PCIe x4 capable.
    Guess I'm fine then.

  12. #762
    Favelle (Member; joined Nov 2007; Victoria, BC, Canada; 18,155 posts)

    Quote Originally Posted by Badger3920 View Post
    Resolved my issue. I apparently installed the backplate like a dumbass when I built the rig. I removed the cooler (and found that some of the backplate nuts had completely fallen off into the case) and the motherboard, cleaned all the old TIM off, reinstalled the backplate in a competent manner, put everything back together, and my temperatures seem much better
    Perfect! You're golden. Now get that puppy to 4.5 GHz and call it a day!

  13. #763
    Blinx123 (Member; joined Mar 2007; 9,787 posts)

    Quote Originally Posted by Favelle View Post
    Perfect! You're golden. Now get that puppy to 4.5 GHz and call it a day!
    Why not 4.8 or 5 GHz? What's so special about 4.5 GHz?

  14. #764
    Badger3920 (Member; joined Jan 2008; SF Bay Area; 23,659 posts)

    Quote Originally Posted by Favelle View Post
    Perfect! You're golden. Now get that puppy to 4.5 GHz and call it a day!
    Yeah, much better: idle in the low-to-mid 30s and load in the mid 50s, occasionally reaching 59

  15. #765
    CorruptedDragon (joined Jan 2007; Portland, Maine; 22,927 posts)

    Quote Originally Posted by Blinx123 View Post
    Why not 4.8 or 5 GHz? What's so special about 4.5 GHz?
    4.5 is pretty easy. Any higher and you may need to adjust voltages, which they wanted to avoid.
