Deus Ex: Mankind Divided for macOS now in final stages of development


65 replies to this topic

#21 marksatt

marksatt

    Fanatic

  • Members
  • PipPip
  • 59 posts

Posted 03 December 2017 - 05:43 PM

Mr Pink, on 03 December 2017 - 01:28 PM, said:

To be honest, I'd rather have Apple build their own graphics chips for the Mac, like they do for the newest iPhones with their A11 chip.

So you'd be happy only playing iOS F2P games and ports of Xbox 360/PS3 games then? Right now that's all Apple's GPU design is really capable of running and making a bigger one with more power won't actually change that. It is *remarkable* what Feral have done with Rome Total War and GRID Autosport on iOS - and DXMD is a much newer, more demanding title.

ipickert55, on 03 December 2017 - 02:04 PM, said:

Would that hurt the development process for porting games, though? Another hardware milestone to break through? I don't know the answer; genuinely wondering.

Mr Pink's suggestion could kill it stone dead.

A modern Deferred Renderer, as used by all AAA desktop/console titles, assumes that the GPU renders the whole frame directly to VRAM and that changing render-targets is cheap. These are properties of an Immediate-Mode GPU design like those from AMD, Nvidia & Intel. Apple's design is similar to the PowerVR designs it replaced in that it does not render the whole frame at once, instead saving silicon by rendering only a tiny sub-rect (a "Tile") at a time. This is called a Tiled-Binned-Deferred (TBDR) GPU design - such GPUs tend to be very power-efficient and therefore great for phones & tablets, but they make Deferred Rendering impractical (i.e. sub 10 fps).

On a TBDR GPU, rendering to an array of render-targets - which Deferred Rendering requires - reduces the tile-size, and that drastically reduces rendering efficiency. TBDR GPUs must also flush all outstanding work to change render-targets, which is much less efficient than on Immediate GPUs, which can more naturally allow future vertex processing to overlap with outstanding rasterisation work (as they have *much* more memory and memory bandwidth).

I haven't even mentioned the absence of any volumetric rendering support (so no lit-translucency, point-light shadows or atmospheric effects in UE4 on iOS!), or the absence of geometry shaders, or buffer writes from vertex-shaders, etc. You get the picture - iOS GPUs lack a lot of the features of modern desktop GPUs. Adding them would be *hard* and you can't guarantee that Apple would.
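A back-of-envelope sketch of the tile-size pressure described above (all figures are illustrative assumptions, not real hardware numbers): each extra G-buffer render-target adds bytes stored per pixel, so fewer pixels fit in the fixed on-chip tile memory and the GPU has to process many more, smaller tiles per frame.

```python
# Illustrative sketch of why extra render-targets hurt a tiled (TBDR) GPU.
# The tile memory budget and G-buffer layouts are assumptions for the example.

TILE_MEMORY_BYTES = 32 * 1024  # assumed per-tile on-chip memory budget


def max_tile_pixels(bytes_per_pixel: int) -> int:
    """How many pixels fit in tile memory for a given per-pixel footprint."""
    return TILE_MEMORY_BYTES // bytes_per_pixel


# Forward renderer: one RGBA8 colour target + 32-bit depth = 8 bytes/pixel
forward = max_tile_pixels(4 + 4)

# Deferred renderer: four RGBA8 G-buffer targets + 32-bit depth = 20 bytes/pixel
deferred = max_tile_pixels(4 * 4 + 4)

print(forward, deferred)  # 4096 vs 1638 pixels per tile
```

With these assumed numbers the deferred G-buffer cuts the tile to well under half the size, meaning more than twice as many tiles (and tile flushes) to cover the same frame.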

And before anyone mentions the UE4 AR demo at WWDC 2017: I do all the Metal development at Epic, including this. We demonstrated the Desktop Forward Renderer originally developed for VR which is far less feature rich than our Deferred Renderer and doesn't require quite so many features that are unavailable (or fundamentally impossible) on the iOS GPU. Even then it was incredibly hard work to get it to an acceptable state and features were disabled that can't be done on iOS (lit translucency and atmospheric effects). It is not viable to expect any developer to retool each game engine & all game assets like this to run on both a Deferred Renderer for proper platforms (Windows, PS4, Xbox, AMD/Nvidia Macs) and a cut-down Forward Renderer for Mac & iOS with Apple's GPU. It can make sense if you are developing a low (or at least lower) end version for mobile (iOS/Android) and a high-end version for desktop, but probably only if you are the original developer - it'd be a huge undertaking for anyone after the fact.

Suffice it to say I think that it would be a huge mistake for Apple to adopt their current iOS GPU for Macs. Should they want to they would need to make considerable, fundamental, changes to make it a viable replacement for GPUs from AMD.

And with that I've derailed Ellie's thread enough. I will simply reiterate my congratulations on their breakthrough & progress on DXMD and hope there are more announcements to come!

#22 Mr Pink

Mr Pink

    Heroic

  • Members
  • PipPipPipPip
  • 448 posts

Posted 03 December 2017 - 06:11 PM

marksatt, on 03 December 2017 - 05:43 PM, said:

Mr Pink's suggestion could kill it stone dead.

I agree with you and apologise. I had no background knowledge on this topic.
I just hope some day soon we will be able to buy a new Mac in which GPUs can be swapped easily, to avoid situations like this one.

Anyway, I'm glad 'Mankind Divided' is coming to the Mac, even though it's only for Macs with AMD Graphics for now.

#23 marksatt

marksatt

    Fanatic

  • Members
  • PipPip
  • 59 posts

Posted 03 December 2017 - 06:40 PM

Mr Pink, on 03 December 2017 - 06:11 PM, said:

I agree with you and apologise. I had no background knowledge on this topic.
I just hope some day soon we will be able to buy a new Mac in which GPUs can be swapped easily, to avoid situations like this one.

Anyway, I'm glad 'Mankind Divided' is coming to the Mac, even though it's only for Macs with AMD Graphics for now.

No need to apologise.

This has been waiting to come off my chest for a while, what with the continued blather in other forums about Apple using their GPUs in Macs. It'd be insane at this point without a huge amount of work, and I don't believe for a moment that they'd do this without taking the time to do it *right*.

So in the short and maybe medium term we need faster AMD GPUs, a real Mac Pro & more eGPU users!

#24 macdude22

macdude22

    Like, totally awesome.

  • Forum Moderators
  • PipPipPipPipPipPip
  • 2131 posts
  • Steam Name:Rakden
  • Location:Iowa
  • Pro Member:Yes

Posted 03 December 2017 - 08:17 PM

marksatt, on 03 December 2017 - 06:40 PM, said:

So in the short and maybe medium term we need faster AMD GPUs, a real Mac Pro & more eGPU users!

Does anyone here have experience with the current state of eGPUs? I see Sonnet offers their breakout boxes and some new package deals with a box and card (puck). Are the Sonnet Thunderbolt 3 (USB-C) boxes compatible with Thunderbolt 2 devices via TB3-2 adapter? I realize there would be less bandwidth available but still better than nothing.
IMG Discord Server | http://www.trueachie....com/Rakden.htm
Enterprise (MacPro 3,1): 8 Xeon Cores @ 2.8 GHz || 14 GB RAM || Radeon 4870 || 480GB Crucial M500 + 2TB WD Black (Fusion Drive) || 144hz Asus Mon
Defiant (MacBookPro 9,1): Core i7 @ 2.3ghz || 8GB RAM || nVidia GT 650M 512MB || 512GB Toshiba SSD

#25 DirtyHarry50

DirtyHarry50

    Special Snowflake

  • Members
  • PipPipPipPipPipPip
  • 1789 posts
  • Steam Name:DirtyHarry
  • Steam ID:dirtyharry2
  • Location:North Carolina, USA

Posted 03 December 2017 - 08:38 PM

marksatt, on 03 December 2017 - 06:40 PM, said:

So in the short and maybe medium term we need faster AMD GPUs, a real Mac Pro & more eGPU users!

That was, as usual, really great and informative stuff leading up to your concluding remark which I'd argue was on topic all things considered. All of that is related to the release and the questions and sometimes mistaken assumptions people have made about what drives outcomes for this stuff. I love reading your posts because I almost always learn something new and interesting. I gain a better understanding of why things are as they are at a level of detail I genuinely appreciate. I can't imagine Ellie or anyone at Feral minding your added and relevant comments. I posted, "Go Feral!" earlier and while this is off-topic (I am so bad for this), the hell with everything: Go Epic! :)
“The time you enjoy wasting is not wasted time.” — Bertrand Russell

#26 marksatt

marksatt

    Fanatic

  • Members
  • PipPip
  • 59 posts

Posted 03 December 2017 - 11:19 PM

macdude22, on 03 December 2017 - 08:17 PM, said:

Does anyone here have experience with the current state of eGPUs? I see Sonnet offers their breakout boxes and some new package deals with a box and card (puck). Are the Sonnet Thunderbolt 3 (USB-C) boxes compatible with Thunderbolt 2 devices via TB3-2 adapter? I realize there would be less bandwidth available but still better than nothing.

I have one of the Sonnet eGPU boxes plugged into my Mac Pro at work via a Thunderbolt 2 cable and TB2->TB3 dongle. Works fine, and a faster GPU is still faster than what's inside the Darth Pro. Of course, how much performance is lost depends on the bandwidth sensitivity of the application or game you run, so YMMV. I may of course be running super in-development drivers with an off-the-shelf next-gen PC GPU... so I couldn't possibly confirm actual performance numbers! ;)
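For a rough sense of the bandwidth trade-off being discussed, here's a back-of-envelope comparison. These are nominal link rates (the PCIe figure is an approximation of usable PCIe 3.0 x16 throughput); real-world numbers are lower still.

```python
# Nominal GPU link rates in Gbit/s (illustrative, not benchmarked figures).
links_gbps = {
    "PCIe 3.0 x16 (internal GPU)": 128,
    "Thunderbolt 3": 40,
    "Thunderbolt 2 (via adapter)": 20,
}

baseline = links_gbps["PCIe 3.0 x16 (internal GPU)"]

# Show each link as a fraction of a desktop's internal PCIe slot.
for name, rate in links_gbps.items():
    print(f"{name}: {rate} Gbit/s ({rate / baseline:.0%} of internal)")
```

So a TB2-adapted eGPU gets roughly a sixth of the internal-slot bandwidth, which is why bandwidth-sensitive games lose more performance than ones that mostly keep assets resident in VRAM.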

#27 jeannot

jeannot

    Heroic

  • Members
  • PipPipPipPip
  • 321 posts

Posted 04 December 2017 - 12:31 AM

Apple has usually favoured the "good" GPU series over the bad ones at any given time. They used the GeForce 8800 instead of the Radeon HD 3XXX, then switched to the excellent Radeon HD 48XX and 5XXX later, completely ignoring the infamous Nvidia Fermi cards, then they used Nvidia's Kepler. But this changed in 2014, when they should have used Maxwell. They have never dropped a GPU vendor for such a long amount of time. It was first speculated that it was to favour OpenGL over CUDA, but I don't buy this explanation. If that were the case, they would have ditched Nvidia in 2009-2010 when they introduced OpenCL.
Some at Ars Technica speculate that the RISC architecture of GCN really helps with Metal. I'm not convinced either.
I find the "political" explanation more convincing.

#28 jeannot

jeannot

    Heroic

  • Members
  • PipPipPipPip
  • 321 posts

Posted 04 December 2017 - 05:14 AM

marksatt, on 03 December 2017 - 06:40 PM, said:

No need to apologise.

This has been waiting to come off my chest for a while, what with the continued blather in other forums about Apple using their GPUs in Macs. It'd be insane at this point without a huge amount of work, and I don't believe for a moment that they'd do this without taking the time to do it *right*.
The thing is, would we be able to tell if Apple had been developing their own desktop GPU over the last few years? I suppose not, since this work would have been kept secret. And Apple certainly have the money for such development (the expertise is another matter).
At least it's in Apple's philosophy to develop custom hardware and APIs to use it. They already have the latter (Metal).

#29 marksatt

marksatt

    Fanatic

  • Members
  • PipPip
  • 59 posts

Posted 04 December 2017 - 07:50 PM

jeannot, on 04 December 2017 - 05:14 AM, said:

The thing is, would we be able to tell if Apple had been developing their own desktop GPU over the last few years? I suppose not, since this work would have been kept secret. And Apple certainly have the money for such development (the expertise is another matter).
At least it's in Apple's philosophy to develop custom hardware and APIs to use it. They already have the latter (Metal).

No-one outside of Apple would know for obvious reasons. They do have some very smart people on their GPU team so there is no reason they couldn't develop their own GPU for Mac. It would just need to be a quite different beast to their current iOS design to maintain functional parity with existing Macs. They couldn't rush into this.

#30 Mr Pink

Mr Pink

    Heroic

  • Members
  • PipPipPipPip
  • 448 posts

Posted 06 December 2017 - 02:09 PM

Deus Ex: Mankind Divided will be out on December 12. System Requirements are out too.

Officially supported are:
  • 15" MacBook Pro (2016)
  • 21.5" iMac 4K (2017)
  • 27" iMac 5K (late 2014)
  • Mac Pro (late 2013)
The 15" MacBook Pro (mid 2015) with AMD Radeon R9 M370X can run the game, but is not officially supported.

#31 mattw

mattw

    Legendary

  • Members
  • PipPipPipPipPip
  • 863 posts

Posted 06 December 2017 - 03:11 PM

Hopefully, now I've upgraded to an RX 580, I'll be able to play this and stave off obsolescence just that bit longer.

So far only Mafia 3 fails to launch, but that must be specific to dual-socket machines, as I read of a 6-core 3.33GHz owner who can run it.
Mac Pro 09 (now a 5,1): 2 x 3.06GHz Xeon X5675, 24GB, RX580 8GB, 480SSD, 16TB HD, macOS 10.13.1

#32 ozzy

ozzy

    Heroic

  • Members
  • PipPipPipPip
  • 441 posts
  • Steam Name:ozzy
  • Location:London, UK

Posted 06 December 2017 - 03:34 PM

macdude22, on 03 December 2017 - 08:17 PM, said:

Does anyone here have experience with the current state of eGPUs? I see Sonnet offers their breakout boxes and some new package deals with a box and card (puck). Are the Sonnet Thunderbolt 3 (USB-C) boxes compatible with Thunderbolt 2 devices via TB3-2 adapter? I realize there would be less bandwidth available but still better than nothing.

I have an Akitio Node eGPU chassis with an Nvidia 1070 and an Apple TB2-TB3 adapter running on my mid-2012 15” Retina MBP. There is definitely performance loss, but it’s still light years better than the integrated card. I can run anything in macOS flawlessly at the highest settings in 1080p and pretty well at 1440p, with it being flawless at 1440p in Windows. Now it really depends on what model Mac you have. Mine isn’t that hard to set up, but it actually worked better in 10.12 than it does in 10.13. And the Nvidia drivers are pretty crap in macOS (hence Hitman, Deus Ex and others not being supported). Right now an AMD card (580) is probably the best bet for support and drivers, unless your Mac has an integrated Nvidia card, in which case it won’t work yet in 10.13.

All of this is to say that eGPUs work great but aren’t without flaws: there’s a chance an OS update breaks compatibility, and you have to do some research to find out what combo works on your particular Mac model. The site egpu.io has all the resources you need to research and get it working, along with a very helpful forum if you want to make a foray.

#33 jeannot

jeannot

    Heroic

  • Members
  • PipPipPipPip
  • 321 posts

Posted 09 December 2017 - 06:04 PM

I don't want to start a new thread for that, so since it's about Deus Ex (Human revolution), here it is.
This may sound stupid, but I can't find how to buy praxis kits at limb clinics. I can't talk to anyone at the counter nor can I look into the computer. What am I supposed to do? :unsure:

EDIT: apparently, I wasn't at the right spot when I was trying to talk to the nurse. :cool:

#34 Mr Pink

Mr Pink

    Heroic

  • Members
  • PipPipPipPip
  • 448 posts

Posted 12 December 2017 - 11:03 AM

Congratulations on the release of 'Deus Ex: Mankind Divided' for macOS.
Will the game also be released on the Mac App Store in the near future?
Any chance future macOS updates would make Intel/nVidia support for older Macs more likely?

#35 UmarOMC1

UmarOMC1

    Legendary

  • Members
  • PipPipPipPipPip
  • 1468 posts
  • Location:NYC

Posted 12 December 2017 - 09:35 PM

While not silky smooth on my setup it's certainly playable at Medium settings at 1920x1080... and holy cow it was like a 75GB install, initial launch took around eight minutes, a lot shorter after that. I think I spent about 1 hour between playing and trying different graphic settings. I like it so far although I wish I could toggle Aim Down Sight.

Emphasis on 'cow.'
macOS 10.13.x/Windows 7 Pro/2009 MacPro 4,1 Xeon W3580 3.33GHz/16GB RAM/EVGA GTX1070 8GB

#36 jeannot

jeannot

    Heroic

  • Members
  • PipPipPipPip
  • 321 posts

Posted 13 December 2017 - 06:33 AM

Yeah, the first launch of the game takes very long. I think it's the time taken for Metal shader compilation. But this step uses less than a single CPU core. I'm not sure why it doesn't saturate a whole core (or several), and take less time as a result.
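A long first launch that doesn't even saturate one core is, in principle, the kind of workload that could be farmed out across cores, assuming each shader compiles independently. A minimal sketch, with `compile_shader` as a hypothetical stand-in for the real Metal compile step (a real engine would shell out to the compiler, which lets other threads run meanwhile):

```python
# Sketch: spreading independent shader-compile jobs across cores instead of
# running them serially. `compile_shader` and the shader list are hypothetical.
from concurrent.futures import ThreadPoolExecutor
import os


def compile_shader(name: str) -> str:
    # Placeholder for an expensive, independent compile job
    # (e.g. invoking an external compiler with subprocess.run).
    return f"compiled:{name}"


shaders = [f"shader_{i}.metal" for i in range(100)]

# Queue every job on a pool sized to the machine's core count.
with ThreadPoolExecutor(max_workers=os.cpu_count()) as pool:
    binaries = list(pool.map(compile_shader, shaders))

print(len(binaries))  # 100
```

Whether the game's pipeline is actually structured this way is unknown; serialization could also come from a shared on-disk cache or driver-side locking.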

#37 ozzy

ozzy

    Heroic

  • Members
  • PipPipPipPip
  • 441 posts
  • Steam Name:ozzy
  • Location:London, UK

Posted 13 December 2017 - 01:15 PM

UmarOMC1, on 12 December 2017 - 09:35 PM, said:

While not silky smooth on my setup it's certainly playable at Medium settings at 1920x1080... and holy cow it was like a 75GB install, initial launch took around eight minutes, a lot shorter after that. I think I spent about 1 hour between playing and trying different graphic settings. I like it so far although I wish I could toggle Aim Down Sight.

Emphasis on 'cow.'

And this is on a GTX1070 if I'm reading your signature right? Excellent news - maybe there is hope for me playing in OS X then.

#38 Camper-Hunter

Camper-Hunter

    Heroic

  • Members
  • PipPipPipPip
  • 399 posts
  • Steam Name:Rorqual
  • Steam ID:Rorqual
  • Location:Paris, France

Posted 13 December 2017 - 04:38 PM

Something's very wrong if a GTX 1070 can only handle Medium settings at 1920x1080. Like awfully poor drivers. On Windows, you can get 73 fps on high quality at this resolution...

#39 Mr Pink

Mr Pink

    Heroic

  • Members
  • PipPipPipPip
  • 448 posts

Posted 14 December 2017 - 04:05 AM

So basically macOS and its crappy Nvidia drivers are the real problem.
And even if the upcoming modular Mac Pro does support Nvidia cards, Apple will have to work on their drivers quite a bit.

#40 UmarOMC1

UmarOMC1

    Legendary

  • Members
  • PipPipPipPipPip
  • 1468 posts
  • Location:NYC

Posted 14 December 2017 - 05:16 AM

A GTX1070 in macOS gives about the performance of maybe a GTX680 in Windows, and noticeably not as smooth. Apple just really, really suck ass when it comes to gaming support. Anyone's welcome to blame NVIDIA, AMD or whatever but where was Apple when Microsoft were/are honing DirectX? Sprockets? Lagging OpenGL support? Let's reinvent the wheel and introduce Metal and introduce non-upgradeable hardware and keep the word 'Pro' in the name!

At the price of an iMac 'Pro', I can't imagine an actual 2018 Mac Pro costing significantly more, as I can't envision Apple putting a lot of effort into creating a cool, upgradeable PC case with macOS installed and better GPU options, which is what I'd really want.

I KNOW the options.
macOS 10.13.x/Windows 7 Pro/2009 MacPro 4,1 Xeon W3580 3.33GHz/16GB RAM/EVGA GTX1070 8GB