

All New Mac Pro Available Starting Tomorrow (19/12-2013)


66 replies to this topic

#21 Smoke_Tetsu

Smoke_Tetsu

    Uberspewer

  • Members
  • 3318 posts
  • Steam Name:Tetsu Jutsu
  • Steam ID:smoke_tetsu
  • Location:Cyberspace

Posted 19 December 2013 - 09:13 AM

the Battle Cat, on 18 December 2013 - 09:40 AM, said:

Man, those things look awful.  And where is the foot pedal for when you want to throw trash into it?

Someone made a trash icon for the dock from a picture of the new Mac Pro: http://9to5mac.com/2...-your-mac-dock/
--Tetsuo

Alex DeLarge, A Clockwork Orange said:

It's funny how the colors of the real world only seem really real when you viddy them on the screen.

the Battle Cat said:

Slower and faster? I'm sorry to hear such good news?

Late 2012 27 inch iMac, Core i7 Quad 3.4GHz, 16GB RAM, Nvidia GeForce GTX 680MX 2GB, 3TB HDD - Mavericks

Late 2009 27 inch iMac, Core i5 2.6GHz, 12GB RAM, ATI Radeon 4850HD 512MB, 1TB HDD - Mavericks

Mac Mini, PowerPC G4 1.4GHz, 1GB RAM, Radeon 9200 32MB, 256GB HDD - Leopard

Dell Inspiron 1200 Notebook: 1.2GHz Celeron, 1.2GB RAM, Intel GMA915, 75GB HDD - Ubuntu

Generic Black Tower PC, Dual Core 64-bit 2.4GHz, 4GB RAM, GeForce 9600 GT 512MB - Windows 7


#22 the Battle Cat

the Battle Cat

    Carnage Served Raw

  • Admin
  • 17376 posts
  • Location:Citadel City, Lh'owon
  • Pro Member:Yes

Posted 19 December 2013 - 09:38 AM

Smoke_Tetsu, on 19 December 2013 - 09:13 AM, said:

Someone made a trash icon for the dock from a picture of the new Mac Pro: http://9to5mac.com/2...-your-mac-dock/

Ha!  Awesome.  Someone at Apple needs to throw that model design into that trash can.  And please, secure delete.
Gary Simmons
the Battle Cat

#23 Tetsuya

Tetsuya

    Master Blaster

  • Members
  • 2172 posts
  • Location:MI

Posted 19 December 2013 - 11:46 AM

Frost, on 19 December 2013 - 01:54 AM, said:

That 12-core Xeon EP is $2750 at retail, which means Apple is getting it at least a few hundred cheaper. All the rest of the components excluding the GPUs and including the case probably don't exceed $1000.

12-core is supposed to start at $7000 without any other options... those FirePros are either absurdly expensive even at wholesale, or Apple's profit margin on the new Mac Pro is batpopsnizzle crazy.

Well, the FirePros (the W9000 that's the spec-equivalent of the D700, anyway) are $3,100 at retail.  So even if Apple saves $1,000 on each, you're probably looking at ~$5,000 of that $7,000 machine's cost in CPU and GPUs alone.
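A back-of-envelope sketch of that arithmetic in Python, using only the retail prices quoted in this thread; the $1,000-per-part discount is purely the post's assumption, not a known figure:

```python
# Rough component-cost arithmetic for the $7,000 12-core Mac Pro,
# using the retail prices quoted in this thread. The per-part
# OEM discount is an assumption, not a known number.
cpu_retail = 2750        # 12-core Xeon E5, retail (Frost's figure)
gpu_retail = 3100        # FirePro W9000, the D700's spec-equivalent
assumed_discount = 1000  # hypothetical savings per part for Apple

retail_total = cpu_retail + 2 * gpu_retail              # CPU plus two GPUs
discounted_total = retail_total - 3 * assumed_discount  # after assumed savings

print(retail_total)      # 8950 at straight retail
print(discounted_total)  # 5950 with the assumed discounts
```

Even with those assumed discounts, the big silicon plausibly eats most of the machine's price.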

#24 Tesseract

Tesseract

    Unmanageable Megaweight

  • Members
  • 3512 posts
  • Pro Member:Yes

Posted 19 December 2013 - 01:45 PM

Realising that I have more USB devices connected to my current machine than the new Mac Pro has ports: priceless.

#25 mattw

mattw

    Legendary

  • Members
  • 828 posts

Posted 19 December 2013 - 02:41 PM

It will be interesting to see some real-world testing and benchmarks, but looking at the BTO GPU options available and their specs in terms of memory bus width, bandwidth and stream processors, when you aren't working and are having fun gaming:

D700 is going to perform like a Radeon 7970

D500 like a Radeon 7950

D300 like my Radeon 5870 from 2010?  

Assuming of course there is no magic solution for games to use both cards.

I know the rate of progress these days isn't so great as tablets and portable machines are vastly outselling desktops but does that seem a bit disappointing to anyone else?
Mac Pro 09 (now a 5,1): 2 x 3.06GHz Xeon X5675, 24GB, R9 280X 3GB, 480GB SSD, 16TB HD, macOS 10.12.6

#26 Frost

Frost

    Secretary of Offense

  • Forum Moderators
  • 6075 posts
  • Steam ID:CaptFrost
  • Location:Republic of Texas
  • Pro Member:Yes

Posted 19 December 2013 - 03:08 PM

"Shipping February"

Looks like they missed their ship date by a couple miles.

mattw, on 19 December 2013 - 02:41 PM, said:

I know the rate of progress these days isn't so great as tablets and portable machines are vastly outselling desktops but does that seem a bit disappointing to anyone else?
And there you have yet another reason I threw my hands up and gave my Mac Pro money to Falcon Northwest a couple months ago. Megabucks for... a 7970, with no option to upgrade? Not just no, hell no.

For gaming, that's only marginally better than the 680 Mac Edition EVGA has on sale for current Mac Pros.
Kestrel (Falcon NW Tiki) – 4.0 GHz i7 4790K / 16GB RAM / 512GB Samsung 950 Pro M.2, 2x480GB Intel 730 (RAID0), 10TB STX BarraCuda Pro / GeForce GTX TITAN X 12GB
Iridium (MacBook Pro Mid-2012) – 2.7 GHz i7 3820QM / 16GB RAM / 2TB Samsung 850 Pro / GeForce GT 650M 1GB

Eric5h5:
When there's a multiplayer version, I'm going to be on Frost's team. Well, except he doesn't seem to actually need a team...I mean, what's the point? "Hey look, it's Frost and His Merry Gang of Useless Hangers-On!" Or something.

#27 Tetsuya

Tetsuya

    Master Blaster

  • Members
  • 2172 posts
  • Location:MI

Posted 19 December 2013 - 03:09 PM

If you will even get that level of performance... given that the drivers will undoubtedly be optimized for workstation work.

I'm puzzled as all get out by their decision to use AMD, honestly.  nVidia is pretty soundly kicking AMD in the butt right now, all over the board... and CUDA seems to be WAY more used in Workstation-class apps than OpenCL.

#28 Tesseract

Tesseract

    Unmanageable Megaweight

  • Members
  • 3512 posts
  • Pro Member:Yes

Posted 19 December 2013 - 04:15 PM

mattw, on 19 December 2013 - 02:41 PM, said:

It will be interesting to see some real-world testing and benchmarks, but looking at the BTO GPU options available and their specs in terms of memory bus width, bandwidth and stream processors, when you aren't working and are having fun gaming:

D700 is going to perform like a Radeon 7970

D500 like a Radeon 7950

D300 like my Radeon 5870 from 2010?
If you look up benchmarks for the equivalent FirePros (W7000, W8000, W9000) under Windows, the results are rather odd. The W7000 often outperforms not just the W8000 but also the comparable Nvidia card. It may be just a quirk of the drivers but who knows. I haven't read any followups that answer that yet.

They say you're mainly paying for the drivers with the pro-class cards. Hopefully the Mac drivers greatly surpass Apple's historical standard, otherwise it's not clear just what you are paying for.

#29 bobbob

bobbob

    Uberspewer

  • Members
  • 3367 posts

Posted 19 December 2013 - 06:31 PM

Tetsuya, on 19 December 2013 - 03:09 PM, said:

nVidia is pretty soundly kicking AMD in the butt right now, all over the board
Uh, what? AMD seems to be higher performance for cheaper, at least until nVidia makes/made some permanent price drops and rebrands.

#30 Smoke_Tetsu

Smoke_Tetsu

    Uberspewer

  • Members
  • 3318 posts
  • Steam Name:Tetsu Jutsu
  • Steam ID:smoke_tetsu
  • Location:Cyberspace

Posted 19 December 2013 - 06:34 PM

Nvidia is perhaps kicking AMD's butt in Windows PCs, but it seems to me that in OS X AMD is kicking their butts in terms of drivers. It's kind of like a reverse situation. It's usually Nvidia that's having issues with games in OS X, as far as I can tell.


#31 Sneaky Snake

Sneaky Snake

    Official Mascot of the 1988 Winter Olympics

  • IMG Writers
  • 3298 posts
  • Steam Name:SneakySnake
  • Steam ID:sneaky_snake
  • Location:Waterloo, Canada

Posted 19 December 2013 - 09:40 PM

Not sure where all this talk about Nvidia kicking AMD's butt is coming from.

The 700 series vs the R9 series is a very close battle. The 780 Ti is the fastest single-GPU card, but AMD's 290X beats the Titan by a few percentage points, beats the 780 by a few more, and is a few hundred cheaper than the 780 Ti. Once the 290X is available with decent aftermarket coolers, it'll be a tough decision between the 290X and the 780 Ti. (The Ti will give a few percentage points more performance, but cost you a couple hundred for those few points.)

Sure, Nvidia gives you more gimmicks that you'll probably never use (PhysX is used in like 5 popular games; G-Sync requires a $400 monitor...), and this is all ignoring the fact that pretty much every AAA game for the next 5+ years will be optimized for AMD's GPU architecture first, since that's what the next-gen consoles are using. There's also AMD's tech like Mantle.

Does anyone know if it's possible to CrossFire the GPUs in the Mac Pro when booted into Windows? Are there even CrossFire drivers for the workstation cards? Or do they just fall back to the traditional gaming-card drivers when gaming?
2015 13" rMBP: i5 5257U @ 2.7 GHz || Intel Iris 6100 || 8 GB LPDDR3 1866 || 256 GB SSD || macOS Sierra
Gaming Build: R5 1600 @ 3.9 GHz || Asus GTX 1070 8 GB || 16 GB DDR4 3000 || 960 Evo NVMe, 1 TB FireCuda || Win10 Pro
Other: Dell OptiPlex 3040 as VMware host || QNAP TS-228 NAS || iPhone 6S 64GB

#32 Smoke_Tetsu

Smoke_Tetsu

    Uberspewer

  • Members
  • 3318 posts
  • Steam Name:Tetsu Jutsu
  • Steam ID:smoke_tetsu
  • Location:Cyberspace

Posted 19 December 2013 - 10:13 PM

From what I've always heard Nvidia has a reputation for having better drivers in Windows especially with OpenGL. As I said earlier at least from my experience that doesn't hold true in OS X however.

Also, it remains to be seen what kind of splash Mantle will make or whether it'll venture beyond Windows. Specialized stuff like that doesn't tend to from what I can tell.

Lastly, I highly doubt the GPUs in the new Mac Pro are CrossFire-capable even in Windows. There's more to CrossFire or SLI than just software support: the cards have to be compatible, on a CrossFire-capable motherboard, and hooked together with a bridge cable, from what I've heard. The kind of hardware they're putting into the Mac Pro isn't the type that has gaming-specific features.


#33 Sneaky Snake

Sneaky Snake

    Official Mascot of the 1988 Winter Olympics

  • IMG Writers
  • 3298 posts
  • Steam Name:SneakySnake
  • Steam ID:sneaky_snake
  • Location:Waterloo, Canada

Posted 19 December 2013 - 10:47 PM

Smoke_Tetsu, on 19 December 2013 - 10:13 PM, said:

From what I've always heard Nvidia has a reputation for having better drivers in Windows especially with OpenGL. As I said earlier at least from my experience that doesn't hold true in OS X however.

Also, it remains to be seen what kind of splash Mantle will make or whether it'll venture beyond Windows. Specialized stuff like that doesn't tend to from what I can tell.

Lastly, I highly doubt the GPUs in the new Mac Pro are CrossFire-capable even in Windows. There's more to CrossFire or SLI than just software support: the cards have to be compatible, on a CrossFire-capable motherboard, and hooked together with a bridge cable, from what I've heard. The kind of hardware they're putting into the Mac Pro isn't the type that has gaming-specific features.

People complain about AMD's drivers, but I've never had an issue with them; I've been using AMD cards for years. Pretty much every motherboard released in the last few years with two PCIe x16/x8 slots is CrossFire-compatible. The new AMD R9 cards don't require a bridge to CrossFire; not sure what architecture the Dx00 cards are using, though.

#34 bobbob

bobbob

    Uberspewer

  • Members
  • 3367 posts

Posted 19 December 2013 - 10:51 PM

Sneaky Snake, on 18 December 2013 - 11:10 PM, said:

What I also know however, is that doing a Xeon E5-1620 build with dual D300's isn't cheap
The processor's $294 wholesale. It's less than 10% of the retail price. My CPU would have been maybe 15% of Dell's retail. Apple is not giving you a deal.

Frost, on 19 December 2013 - 01:54 AM, said:

That 12-core Xeon EP is $2750 at retail, which means Apple is getting it at least a few hundred cheaper. All the rest of the components excluding the GPUs and including the case probably don't exceed $1000.
$2614. They add $3500 for that on top of the $294 base chip, for $3794 total on the CPU, retail. I wouldn't worry about them making money.
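Spelled out, the margin math above (using the $294 base-chip figure from earlier in this post) is:

```python
# Apple's effective CPU pricing per bobbob's figures. The base machine
# ships with a Xeon E5-1620 (~$294 wholesale); the 12-core BTO option
# adds $3,500 on top of that, while the 12-core chip retails at $2,614.
base_cpu = 294          # E5-1620, wholesale
upgrade_charge = 3500   # Apple's 12-core BTO upcharge
twelve_core = 2614      # 12-core Xeon, retail

effective_charge = base_cpu + upgrade_charge   # what you pay for the CPU
margin_vs_retail = effective_charge - twelve_core

print(effective_charge)   # 3794
print(margin_vs_retail)   # 1180 over the chip's own retail price
```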

#35 Tetsuya

Tetsuya

    Master Blaster

  • Members
  • 2172 posts
  • Location:MI

Posted 19 December 2013 - 11:11 PM

Sneaky Snake, on 19 December 2013 - 09:40 PM, said:

Not sure about all this talk about Nvidia kicking AMD's butt is coming from.

Tom's Hardware, AnandTech, a few other places.  Nowhere important.  The 290X, real-world, isn't doing as well as the golden samples sent out to review sites. Almost everyone has updated their reviews with real-world models, and they're finding almost across the board that the cards throttle themselves and don't perform nearly as well as the press samples they got.

Quote

The 700 series vs the R9 series is a very close battle. The 780 Ti is the fastest single-GPU card, but AMD's 290X beats the Titan by a few percentage points, beats the 780 by a few more, and is a few hundred cheaper than the 780 Ti.

Not sure where you're getting that....

http://www.videocard...h_end_gpus.html

See that R290X down there, about a thousand points below the Titan and 780... and about 1500 points below the 780 Ti?  I sure do.  And the costs aren't that disparate: as of today, the 780 is $499 and the R290X is $573; the 780 Ti is only about $120 more at $699.

Quote

Once the 290X is available with decent aftermarket coolers, it'll be a tough decision between the 290X and the 780 Ti. (The Ti will give a few percentage points more performance, but cost you a couple hundred for those few points.)

Yeah, real-world benchmarks aren't bearing any of that out.  Look back at the chart; it's repeated pretty well down the line.  The R280X/7970 is going for $299 but benching nearly identically to a GTX 760, which is about 80 bucks cheaper, and it's beaten soundly by the GTX 770, which is only $20 more expensive.

As you go down the line, at pretty much every price point, nVidia is matching or beating AMD on price/performance.

Quote

Sure, Nvidia gives you more gimmicks that you'll probably never use (PhysX is used in like 5 popular games; G-Sync requires a $400 monitor...),

And a bunch I will, like lossless screen recording, now with both mic and game sound inputs, and direct Twitch streaming with less than a 5% performance loss.  That's... pretty freaking awesome.  All from one piece of software, and you can include overlays if you need.

Quote

and this is all ignoring the fact that pretty much every AAA game for the next 5+ years will be optimized for AMD's GPU architecture first, since that's what the next-gen consoles are using.

If you think this next generation of PCs is going to be held back by consoles, I'm not sure we're even discussing the same world.  In the previous generation, when the consoles came out, they met or exceeded what high-end PCs were capable of at the time.  And because they were such a large part of the market, and for 70% of their lifespans near-equivalent to PCs or even better, yeah, PC games were ported back from consoles and console development became the limiting factor.

The problem is, this generation started with the consoles already weaker than a low-end PC.  I can build an i3-based system with a better GPU than the consoles for about the price of the PS4.  I seriously doubt PC development is going to be held back and optimized for a two-generation-old low-midrange AMD GPU (let's hear it for that Radeon 7850... bleeding-edge console tech in 2013).  You're going to see a lot of games backported to consoles this time, because people will want to sell games to PC users that actually make use of the fact that PC hardware is already burying the new consoles.

Quote

There's also AMD's tech like Mantle.

Because PhysX did so well.  Vendor-specific APIs are about as relevant as a mole on my backside.

I'm not trying to come across as an nVidia fanboy.  I go where the performance is.  If AMD's cards had been anywhere *near* as good, price/performance/heat/power-wise, I'd be rocking an AMD GPU in my PC.  I'm not, because they weren't.

If I had the equivalent AMD GPU in my machine (provided I could even have found one on a sub-8" PCB; the custom coolers those things require pretty much definitively rule that out), my PSU might not be able to handle it and I'd certainly have more heat problems than I currently do (though a pair of 60mm SilenX fans is on the way to fix that; I got a cheapie 4000rpm 60mm and it cut about 5 degrees C right out of the case, so the pair of silent fans should move slightly more air than the single loud-ass jobbie).

AMD just isn't cutting it, and their future roadmaps don't show any real progress.

#36 Sneaky Snake

Sneaky Snake

    Official Mascot of the 1988 Winter Olympics

  • IMG Writers
  • 3298 posts
  • Steam Name:SneakySnake
  • Steam ID:sneaky_snake
  • Location:Waterloo, Canada

Posted 20 December 2013 - 12:03 AM

Quote

Tom's Hardware, AnandTech, a few other places.  Nowhere important.  The 290X, real-world, isn't doing as well as the golden samples sent out to review sites. Almost everyone has updated their reviews with real-world models, and they're finding almost across the board that the cards throttle themselves and don't perform nearly as well as the press samples they got.

We have different definitions of butt-kicking. I thought you meant butt-kicking similar to the Intel vs AMD situation. The 290/290X reviews used cherry-picked cards, but in the real world they are still standing out as excellent price-for-performance options. The 290 is almost as fast as a 780 while being $100 cheaper. A good cherry-picked 290X (I think the retail version is a rather stupid card with those temps) performs pretty much identically to the Titan. The ones with aftermarket coolers coming up clock a lot faster (due to having manageable temps) than the review samples in most cases, so they perform at least on par with the cherry-picked samples, if not faster.

Quote

Not sure where you're getting that....

http://www.videocard...h_end_gpus.html

See that R290X down there, about a thousand points below the Titan and 780... and about 1500 points below the 780 Ti?  I sure do.  And the costs aren't that disparate: as of today, the 780 is $499 and the R290X is $573; the 780 Ti is only about $120 more at $699.

I got it from the HardwareCanucks review (a pretty large Canadian tech site, decently respected; they probably make the best case reviews you'll see on YouTube). Here is the review, linking to the conclusion page. It's using the cherry-picked review sample, obviously, so take it with a grain of salt, but either a good retail 290X or an aftermarket one competes with the Titan pretty well in gaming, and definitely beats the 780 (non-Ti).

At pretty much every price point below $400, Nvidia and AMD are both competitive. A 280X is definitely a good margin faster than the 760 and is only $50 more (in Canada). I think you're exaggerating the performance of the 760 and 770 a bit. They're both great cards, but it's pretty well known that the 760 beats the 270X handily, is on par with the 7950, and loses to the 280X (or 7970 GHz Edition). The 280X trades blows with the 770, and each gets victories in different games.


Quote

Yeah, real-world benchmarks aren't bearing any of that out.  Look back at the chart; it's repeated pretty well down the line.  The R280X/7970 is going for $299 but benching nearly identically to a GTX 760, which is about 80 bucks cheaper, and it's beaten soundly by the GTX 770, which is only $20 more expensive.

As you go down the line, at pretty much every price point, nVidia is matching or beating AMD on price/performance.

And yet bobbob's chart shows something completely different. To be honest, that chart isn't very accurate in showing gaming performance. It puts the 690 below the 670, while the 690 is essentially dual 670s and is also well known (along with the 7990) to be the fastest single-card graphics solution (note I said single card, not single GPU), although that might have changed with the 780 Ti; I haven't looked at the reviews of the 780 Ti vs the 7990 and 690.

Quote

And a bunch I will, like lossless screen recording, now with both mic and game sound inputs, and direct Twitch streaming with less than a 5% performance loss.  That's... pretty freaking awesome.  All from one piece of software, and you can include overlays if you need.

That is a really cool feature, but also one that almost no one will actually care about. I know probably a hundred guys who PC game regularly. Most don't have internet anywhere close to good enough to stream, but there are a few who like to stream and have good enough internet for it. I will definitely be recommending Nvidia GPUs to them.

If you're seriously thinking of spending $400 on a G-Sync monitor, why wouldn't you just get a $150 monitor and use that extra $250 to get a 780 instead of a 760, and play at a rock-solid 60 fps with vsync? G-Sync shines in the 30 < framerate < 60 zone, but it's really only worth it if you're gaming at 2560x1440 or higher, already have a top-of-the-line card (like a 780 Ti), and are struggling to hold a solid 60 fps in certain games with all of the settings maxed out.

Spending money on G-Sync when you have a midrange GPU is a pretty big waste, since you could put those hundreds into your GPU to fix your framerate issues, instead of into a monitor that makes a midrange card look a little smoother.
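The trade-off above reduces to simple budget arithmetic; the monitor prices and the 760-to-780 gap are the figures quoted in this post:

```python
# Same total spend, two ways to split it, per the post above:
# a G-Sync monitor with a 760, or a plain monitor with a 780.
gsync_monitor = 400   # G-Sync panel price quoted above
plain_monitor = 150   # ordinary monitor
gpu_step_up = 250     # quoted price gap from a 760 to a 780

# The monitor savings exactly cover the GPU upgrade:
savings = gsync_monitor - plain_monitor
print(savings == gpu_step_up)  # True
```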

Quote

If you think this next generation of PCs is going to be held back by consoles, I'm not sure we're even discussing the same world.  In the previous generation, when the consoles came out, they met or exceeded what high-end PCs were capable of at the time.  And because they were such a large part of the market, and for 70% of their lifespans near-equivalent to PCs or even better, yeah, PC games were ported back from consoles and console development became the limiting factor.

The problem is, this generation started with the consoles already weaker than a low-end PC.  I can build an i3-based system with a better GPU than the consoles for about the price of the PS4.  I seriously doubt PC development is going to be held back and optimized for a two-generation-old low-midrange AMD GPU (let's hear it for that Radeon 7850... bleeding-edge console tech in 2013).  You're going to see a lot of games backported to consoles this time, because people will want to sell games to PC users that actually make use of the fact that PC hardware is already burying the new consoles.

You can't really build a PC that soundly beats the PS4 for $400. An i3 + 7870 already puts you at $250-$300; then you need a mobo, RAM, case, PSU, and HDD for $100-$150.
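The cost ranges in that claim bracket the PS4's price; a quick check using only the numbers quoted above:

```python
# Cost check for a budget i3 + 7870 build vs the $400 PS4,
# using the ranges quoted in the post above.
cpu_gpu_low, cpu_gpu_high = 250, 300   # i3 + Radeon 7870
rest_low, rest_high = 100, 150         # mobo, RAM, case, PSU, HDD
ps4_price = 400

build_low = cpu_gpu_low + rest_low     # 350
build_high = cpu_gpu_high + rest_high  # 450

# The PS4's price falls inside the build's cost range:
print(build_low <= ps4_price <= build_high)  # True
```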

I never implied that the PS4 and Xbone were going to hold back PCs (at least I hope I didn't). My point was that the PS4 and Xbone are going to have a much larger install base than PC gaming for the immediate future. They are also using AMD GPUs, so it's pretty natural to assume devs are going to spend a little more time optimizing for AMD's graphics architecture over Nvidia's. I'm not saying Nvidia cards will get bad performance, just that I'm expecting more optimization for AMD cards (which I think is a pretty fair assumption, but we can disagree on this).

Quote

Because PhysX did so well.  Vendor-specific APIs are about as relevant as a mole on my backside.

I'm not trying to come across as an nVidia fanboy.  I go where the performance is.  If AMD's cards had been anywhere *near* as good, price/performance/heat/power-wise, I'd be rocking an AMD GPU in my PC.  I'm not, because they weren't.

If I had the equivalent AMD GPU in my machine (provided I could even have found one on a sub-8" PCB; the custom coolers those things require pretty much definitively rule that out), my PSU might not be able to handle it and I'd certainly have more heat problems than I currently do (though a pair of 60mm SilenX fans is on the way to fix that; I got a cheapie 4000rpm 60mm and it cut about 5 degrees C right out of the case, so the pair of silent fans should move slightly more air than the single loud-ass jobbie).

AMD just isn't cutting it, and their future roadmaps don't show any real progress.

We'll have to wait and see about Mantle. I'm not thinking it's going to be a golden goose for AMD, but it's something to consider. PhysX is hardly supported by anything, yet look at how heavily (and successfully) Nvidia markets it.

In summary, I'm really just disagreeing that AMD GPUs are getting their ass kicked by Nvidia. I think the benchmarks show pretty clearly that across the various price points, AMD and Nvidia trade the performance crown. I'd have trouble recommending a 290X to someone, but the 280X, on the other hand, is an excellent card for the price, and pretty much every review of it says so. (For example: Tom's Hardware gives the 280X a "2013 Smart Buy" award.)

Sorry for the wall of text.

#37 The Liberator

The Liberator

    Liberate Tutemet Ex Infernis

  • Members
  • 3707 posts
  • Steam Name:Meriones
  • Location:Melbourne, Australia

Posted 20 December 2013 - 03:14 AM

Sneaky Snake, on 18 December 2013 - 08:19 PM, said:

It's an apple's to oranges comparison to compare gaming hardware to workstation hardware.
I see what you did. :)

In all seriousness, I was rather surprised at the initial BTO offering the store is giving, but I quickly remembered it's been the same for older Mac Pros and other Macs in previous years and models. It will take a while to see what will eventually be showcased and available for the black, bin-looking evolution of the "tower".

Lib.

iMac: 2.8GHz i7 | 16GB RAM | 10.10.5 | ATI Radeon HD 4850M | 512MB VRAM

Custom: 3.4 GHz i5 | 16GB RAM | Win 7 SP 1 | nVidia GeForce GTX 660 OCII | 2GB VRAM


We hang in D.C. with them CIA killers

Baraka Flacka Flames - Head of the State


#38 mattw

mattw

    Legendary

  • Members
  • 828 posts

Posted 20 December 2013 - 05:30 AM

Interesting thoughts, guys. Personally I do think improvements on the desktop will be slight for a while, as the growth is mainly in tablets, phones and notebooks while desktops move to a longer update cycle. That's why my 5-year-old machine is still OK today, whereas last time I had to upgrade to avoid being left behind.

Sure, there will be a niche of uber gaming PCs with quad-SLI graphics and overclocked water-cooled CPUs, but developers are looking at the more mainstream laptop specs and consoles when targeting their product.

#39 Tesseract

Tesseract

    Unmanageable Megaweight

  • Members
  • 3512 posts
  • Pro Member:Yes

Posted 20 December 2013 - 12:00 PM

Frost, on 19 December 2013 - 03:08 PM, said:

"Shipping February"

Looks like they missed their ship date by a couple miles.
Depends how early you got your order in. The earliest orders for stock configs are shipping December 30 (which still isn't "Fall", of course). Choosing BTO made it slip to early January, and then the dates rapidly pushed out as the orders came in; now it's February.

#40 Frost

Frost

    Secretary of Offense

  • Forum Moderators
  • 6075 posts
  • Steam ID:CaptFrost
  • Location:Republic of Texas
  • Pro Member:Yes

Posted 20 December 2013 - 02:08 PM

Oh yeah? Well AMD GPUs are gerls' GPUs. Real manly men with hairy chests and fist-sized Adam's apples use nVidia. Just like PowerPC.

Sorry. :P




