

Getting a Gaming PC...


718 replies to this topic

#701 Frigidman™

Frigidman™

    Eye Sea Yew

  • Admin
  • 3262 posts
  • Steam ID:frigidman
  • Location:East mahn, East!
  • Pro Member:Yes

Posted 05 December 2013 - 01:44 PM

I'll be finally putting all my popsnizzle together this weekend.

I quickly hooked up the new MB, CPU, and PSU on my desk so I could see if any were DOA. Fired up and went to BIOS just fine. The new BIOS is pretty slick compared to what I'm used to. It's very graphical and even has a mouse-input option (if you can imagine that! I can't). Pages upon pages of settings, a lot of new things in there. Crazy. Anyhow, the BIOS HW monitor was showing idle at 40°C, but after a couple of on/off cycles it idled at 35°C. This was with the crappy Intel cooler slapped on real quick (with its heapin' thermal paste mountain) just so the CPU wouldn't burn up during the DOA testing.

Will be doing the real surgery this weekend and I can't wait! WOO... fun way to spend a winter storm :)

-Fm [1oM7]
"I'm not incorruptible, I am so corrupt nothing you can offer me is tempting." - Alfred Bester


#702 Frigidman™

Frigidman™

    Eye Sea Yew

  • Admin
  • 3262 posts
  • Steam ID:frigidman
  • Location:East mahn, East!
  • Pro Member:Yes

Posted 08 December 2013 - 02:20 PM

Well, after a hiccup with a DOA drive and a run to Microcenter in a snowstorm (instead of waiting two weeks for a Newegg RMA)... I was finally able to boot Win7 and run some initial CPU tests on stock BIOS settings.

CoreTemp.exe readings:
Ambient 24°C
CPU Idle 25-26°C
CPU Load 51-57°C

HWMonitor64.exe readings:
Ambient 26°C
CPU Idle 27-30°C
CPU Load 52-57°C

Looks like my 4th core sits much cooler than the other three (by ~5°). Odd that CoreTemp reads totally different 'idle' temps, but I've read that when it's near the 26° mark the readings get a bit wonky, since that's pretty much the baseline of the sensors. Both monitoring apps read about the same when the CPUs are loaded down. I was using the IntelBurnTest app and Prime95.

Also, the time to drop from load to idle temps is about 3 seconds with my HSF. That's how long it takes to cool from 57° down to 29° after I kill the burn tests.

So far, I'm pleased with the results. However, I am not overclocking it yet or adding any more voltage than normal.

-Fm [1oM7]
"I'm not incorruptible, I am so corrupt nothing you can offer me is tempting." - Alfred Bester


#703 Tetsuya

Tetsuya

    Master Blaster

  • Members
  • PipPipPipPipPipPip
  • 2034 posts
  • Location:MI

Posted 08 December 2013 - 11:10 PM

What cooler are you using?

#704 Frigidman™

Frigidman™

    Eye Sea Yew

  • Admin
  • 3262 posts
  • Steam ID:frigidman
  • Location:East mahn, East!
  • Pro Member:Yes

Posted 09 December 2013 - 06:41 AM

This one:
http://www.newegg.co...N82E16835242008

Same one I had bought for my first i7; amazingly they had the foresight to supply multiple mounting holes, and it works for the new LGA1150 socket just fine!

Currently I put this fan on it:
http://www.newegg.co...N82E16835608037

Not a real fan of the fan, because it's not as powerful as so many people make it out to be. I have a smaller 90mm fan that pushes more air with force, and runs just as quiet at double the RPM. Go figure. So I'll prolly be changing out the fan at some point if I see temps get too high and want more airflow through the heatsink.

-Fm [1oM7]
"I'm not incorruptible, I am so corrupt nothing you can offer me is tempting." - Alfred Bester


#705 Sneaky Snake

Sneaky Snake

    Official Mascot of the 1988 Winter Olympics

  • IMG Writers
  • 2772 posts
  • Steam Name:SneakySnake
  • Steam ID:sneaky_snake
  • Location:Waterloo, Canada

Posted 26 January 2015 - 10:22 PM

Nvidia's shady marketing tactics strike again.

The GTX 970 is advertised as a 4 GB card, when in reality it starts to choke after 3.5 GB (there's a 132-page thread on the Nvidia forums). It's causing a performance drop in many games, and even prompted an official response from Nvidia.

It seems like in reality the 970 is actually a 3.5 GB card with a 500 MB buffer (to vastly oversimplify how its memory works). This is different from the GTX 980, which is able to use a full 4 GB no problem.

I love PhysX and CUDA and stuff, but the constant shady tactics from Nvidia really make me not want to buy their products.
- Snake

Mac: 2.3 GHz Quad i7 || 650M 1GB (rMBP 10,1)
PC:   2.5 GHz Quad i7 || 970M 3GB (MSI GS70-2QE Stealth Pro)

#706 Frigidman™

Frigidman™

    Eye Sea Yew

  • Admin
  • 3262 posts
  • Steam ID:frigidman
  • Location:East mahn, East!
  • Pro Member:Yes

Posted 27 January 2015 - 09:30 AM

This is why I flip-flop whenever I have to go and upgrade. Whichever company is pulling less "popsnizzle" when I'm ready to buy is the one I go with. None of them have earned any real loyalty from me, because of the crap they've pulled over the years.

-Fm [1oM7]
"I'm not incorruptible, I am so corrupt nothing you can offer me is tempting." - Alfred Bester


#707 macdude22

macdude22

    Like, totally awesome.

  • IMG Pro Users
  • PipPipPipPipPip
  • 854 posts
  • Steam Name:Rakden
  • Location:Iowa
  • Pro Member:Yes

Posted 27 January 2015 - 09:39 AM

Ugh, but Catalyst. It's gotten so bad. I'm still annoyed that the chipset drivers for my Gigabyte board use Catalyst. And then it gets all grumpy that there are no graphics detected. Ok, I am one of like 5 people that didn't get the APU, y have you forsaken me AMD.

Barring this memory issue I totally don't understand, is the 970 generally a good value, at least at screen resolutions not requiring 47 GB of VRAM? :happy:
http://raptr.com/rakden

Enterprise (MacPro 3,1): 8 Xeon Cores @ 2.8 GHz || 14 GB RAM || EVGA GTX 680 2GB (Mac Edition) || 480GB Crucial M500 (Macintosh Boot) + 2TB Seagate SSHD (Macintosh Data) + 240 GB OWC SSD (Windows Boot) + 2TB Western Digital Black (Windows Data) || Apple Studio Display
Defiant (MacBookPro 9,1): Core i7 @ 2.3 GHz || 8GB RAM || nVidia GT 650M 512MB || 512GB Toshiba SSD Dual Boot

#708 Frigidman™

Frigidman™

    Eye Sea Yew

  • Admin
  • 3262 posts
  • Steam ID:frigidman
  • Location:East mahn, East!
  • Pro Member:Yes

Posted 27 January 2015 - 09:55 AM

My 780 has 3 GB and I've really not run into issues... but then, my screen is a 1080p monitor. Maybe if I had one of those 4K monitors, and then wanted to SLI... and... well... yeah, then we're going down a whole nother road ;)

-Fm [1oM7]
"I'm not incorruptible, I am so corrupt nothing you can offer me is tempting." - Alfred Bester


#709 Tetsuya

Tetsuya

    Master Blaster

  • Members
  • PipPipPipPipPipPip
  • 2034 posts
  • Location:MI

Posted 27 January 2015 - 01:29 PM

Sneaky Snake, on 26 January 2015 - 10:22 PM, said:

Nvidia's shady marketing tactics strike again.

The GTX 970 is advertised as a 4 GB card, when in reality it starts to choke after 3.5 GB (there's a 132-page thread on the Nvidia forums). It's causing a performance drop in many games, and even prompted an official response from Nvidia.

It seems like in reality the 970 is actually a 3.5 GB card with a 500 MB buffer (to vastly oversimplify how its memory works). This is different from the GTX 980, which is able to use a full 4 GB no problem.

I love PhysX and CUDA and stuff, but the constant shady tactics from Nvidia really make me not want to buy their products.

This is so absurdly overblown.  The thing that causes this (fewer memory controllers on a cheaper card; in this case, instead of the last 500 MB of VRAM having its own set of controllers, it shares one of the controllers with the previous 500 MB) has been present in every generation of cards from both manufacturers for 5+ years.  Because the last 500 MB can't be accessed as quickly, it is used as a buffer instead of swapping data to system RAM.  It's been a common practice by both chipmakers when they cut down capabilities to make a cheaper card.  The only reason it is even coming to light is because people are deliberately trying to overload the card and use it as a single-card 4K solution, which it was never meant to be.

Running at 1080p or 1440p, even at max default settings (testers reproduced these issues by rendering at 4K and then displaying at 1440p with enhanced texture packs and modpacks), you will never run into this 'issue'.

TL;DR: overblown.

#710 Tetsuya

Tetsuya

    Master Blaster

  • Members
  • PipPipPipPipPipPip
  • 2034 posts
  • Location:MI

Posted 27 January 2015 - 01:36 PM

macdude22, on 27 January 2015 - 09:39 AM, said:

Barring this memory issue I totally don't understand, is the 970 generally a good value, at least at screen resolutions not requiring 47 GB of VRAM? :happy:

Yes, it is still hands-down the best price/performance card on the market.  It can usually be had for $329 US and is (by benchmark) the second or third fastest card available.  Both cards that are faster are almost double the price (or more) and only perform about 20% faster.

#711 macdude22

macdude22

    Like, totally awesome.

  • IMG Pro Users
  • PipPipPipPipPip
  • 854 posts
  • Steam Name:Rakden
  • Location:Iowa
  • Pro Member:Yes

Posted 27 January 2015 - 01:41 PM

I understood some of those words, Tetsuya.

Considering my basement PC has two monitors at 1440x900 and my Mac Pro drives a Cinema Display at 1680x1050 (even my TV is just standard 1080p if I jack in a computer), I won't ever haz issues. Even that meager 750 Ti Superclocked on the basement PC can pretty much run everything on high at those resolutions lol.
http://raptr.com/rakden

Enterprise (MacPro 3,1): 8 Xeon Cores @ 2.8 GHz || 14 GB RAM || EVGA GTX 680 2GB (Mac Edition) || 480GB Crucial M500 (Macintosh Boot) + 2TB Seagate SSHD (Macintosh Data) + 240 GB OWC SSD (Windows Boot) + 2TB Western Digital Black (Windows Data) || Apple Studio Display
Defiant (MacBookPro 9,1): Core i7 @ 2.3 GHz || 8GB RAM || nVidia GT 650M 512MB || 512GB Toshiba SSD Dual Boot

#712 Sneaky Snake

Sneaky Snake

    Official Mascot of the 1988 Winter Olympics

  • IMG Writers
  • 2772 posts
  • Steam Name:SneakySnake
  • Steam ID:sneaky_snake
  • Location:Waterloo, Canada

Posted 27 January 2015 - 02:58 PM

Tetsuya, on 27 January 2015 - 01:29 PM, said:

This is so absurdly overblown.  The thing that causes this (fewer memory controllers on a cheaper card; in this case, instead of the last 500 MB of VRAM having its own set of controllers, it shares one of the controllers with the previous 500 MB) has been present in every generation of cards from both manufacturers for 5+ years.  Because the last 500 MB can't be accessed as quickly, it is used as a buffer instead of swapping data to system RAM.  It's been a common practice by both chipmakers when they cut down capabilities to make a cheaper card.  The only reason it is even coming to light is because people are deliberately trying to overload the card and use it as a single-card 4K solution, which it was never meant to be.

Running at 1080p or 1440p, even at max default settings (testers reproduced these issues by rendering at 4K and then displaying at 1440p with enhanced texture packs and modpacks), you will never run into this 'issue'.

TL;DR: overblown.

The GTX 970 is certainly one of the best-performing cards for the dollar, but that doesn't excuse Nvidia from flat-out lying to consumers in order to make the 970 appear better than it actually is. The image below shows what the GTX 970 actually is, compared to what it was released as. I wouldn't really care that much, except that Nvidia has a track record of stuff like this (they deleted the 130+ page thread that I linked to in an earlier post).

This issue doesn't really affect users who are using a single 970 at resolutions below 4K, but it does affect those who have 970 SLI setups (or were considering adding a 2nd 970 later).

Is there a source for this being common practice by both AMD and Nvidia in the past?


Image Source (Anandtech Article)
- Snake

Mac: 2.3 GHz Quad i7 || 650M 1GB (rMBP 10,1)
PC:   2.5 GHz Quad i7 || 970M 3GB (MSI GS70-2QE Stealth Pro)

#713 Frigidman™

Frigidman™

    Eye Sea Yew

  • Admin
  • 3262 posts
  • Steam ID:frigidman
  • Location:East mahn, East!
  • Pro Member:Yes

Posted 27 January 2015 - 03:23 PM

I love how my 780 heats my house when I play games. Saves me money on my gas bill.

-Fm [1oM7]
"I'm not incorruptible, I am so corrupt nothing you can offer me is tempting." - Alfred Bester


#714 Frost

Frost

    The Last Word

  • Forum Moderators
  • PipPipPipPipPipPipPipPipPipPip
  • 5452 posts
  • Steam ID:CaptFrost
  • Location:Republic of Texas
  • Pro Member:Yes

Posted 28 January 2015 - 03:16 AM

Frigidman™, on 27 January 2015 - 03:23 PM, said:

I love how my 780 heats my house when I play games. Saves me money on my gas bill.
Hahaha, same with my Titan. At least it doesn't sound like a blow dryer like AMD's high-end cards, but it sure does put out the heat like one. Looking at it with a FLIR camera while it was running full-out was pretty cool.
Cypher (PowerMac G5 Quad) – 2x2.5 GHz PPC 970MP / 16GB ECC RAM / 1TB WDC Velociraptor, 2TB STX Constellation ES.2 / QuadroFX 4500 512MB
Kestrel (Falcon NW Tiki) – 4.0 GHz Ci7 4790K / 16GB RAM / 512GB Crucial M550 M.2, 2x480GB Intel 730 (RAID0), 3TB STX Barracuda / GeForce GTX TITAN X 12GB
Chromium (MacBook Pro Early 2008) – 2.6 GHz C2D T9500 / 4GB RAM / 1TB Micron M600 / GeForce 8600M GT 512MB
Antimony (PowerBook G4 Titanium) – 1.0 GHz PPC 7455 / 1GB RAM / 512GB Crucial M550 / Radeon 9000 64MB

Eric5h5:
When there's a multiplayer version, I'm going to be on Frost's team. Well, except he doesn't seem to actually need a team...I mean, what's the point? "Hey look, it's Frost and His Merry Gang of Useless Hangers-On!" Or something.

#715 Atticus

Atticus

    Legendary

  • Members
  • PipPipPipPipPip
  • 902 posts
  • Location:Sternum's Ribcage

Posted 28 January 2015 - 10:43 AM

I have two 970s in SLI in the rig I built last Oct/Nov and haven't run into any performance issues.  I'm also not trying to drive the cards at 300fps on a ten-monitor setup at 10,0000 x 56,300000 rez. :-)

So, yes, shame on Nvidia for fudging the numbers (they claim it was a "miscommunication" between PR and engineering), but my 970s chew through Shadow of Mordor on Ultra settings ftw. I'm happy with them.
"I'm standing in the middle of life with my pants behind me."

#716 Atticus

Atticus

    Legendary

  • Members
  • PipPipPipPipPip
  • 902 posts
  • Location:Sternum's Ribcage

Posted 28 January 2015 - 02:59 PM

And looks like Nvidia is going to fix the problem with a driver update, and has offered refunds/exchanges to unhappy customers. Pitchforks and torches can be returned to storage. :-)
"I'm standing in the middle of life with my pants behind me."

#717 the Battle Cat

the Battle Cat

    Carnage Served Raw

  • Admin
  • 16163 posts
  • Location:Citadel City, Lh'owon
  • Pro Member:Yes

Posted 28 January 2015 - 07:04 PM

::shows up wearing Nvidia hard core riot gear and spots Atticus::

You can't hide, rebel scum, Nvidia has resources!

::powers up his Benford 7000 megawatt taser::

LEEEERRRRROY JENNNNNNNKINS.
Gary Simmons
the Battle Cat

#718 Atticus

Atticus

    Legendary

  • Members
  • PipPipPipPipPip
  • 902 posts
  • Location:Sternum's Ribcage

Posted 29 January 2015 - 12:52 PM

NOOOOOOOOOOooooooooooooobleaaarrrrraaaaauuuuuggghhfsadfjasdlfkmas;flzzzmmmm.nnnn.....---xx........... ..   ... ...  .
"I'm standing in the middle of life with my pants behind me."

#719 the Battle Cat

the Battle Cat

    Carnage Served Raw

  • Admin
  • 16163 posts
  • Location:Citadel City, Lh'owon
  • Pro Member:Yes

Posted 29 January 2015 - 03:34 PM

Atticus, on 29 January 2015 - 12:52 PM, said:

NOOOOOOOOOOooooooooooooobleaaarrrrraaaaauuuuuggghhfsadfjasdlfkmas;flzzzmmmm.nnnn.....---xx........... ..   ... ...  .

I was hoping for a "Don't taze me bro!" but this is entirely acceptable.  ::tazes Atticus again to make sure he knows where Nvidia stands on the matter::
Gary Simmons
the Battle Cat