
Deus Ex: Mankind Divided for macOS now in final stages of development


65 replies to this topic

#41 jeannot

jeannot

    Heroic

  • Members
  • PipPipPipPip
  • 330 posts

Posted 14 December 2017 - 10:39 AM

UmarOMC1, on 14 December 2017 - 05:16 AM, said:

A GTX1070 in macOS gives about the performance of maybe a GTX680 in Windows, and noticeably not as smooth. Apple just really, really suck ass when it comes to gaming support.
The web drivers that you use are the sole responsibility of nVidia. Why should Apple care if these drivers are poorly optimised? Pascal cards are not even officially compatible with Macs.

#42 UmarOMC1

UmarOMC1

    Legendary

  • Members
  • PipPipPipPipPip
  • 1477 posts
  • Location:NYC

Posted 14 December 2017 - 03:19 PM

jeannot, on 14 December 2017 - 10:39 AM, said:

The web drivers that you use are the sole responsibility of nVidia. Why should Apple care if these drivers are poorly optimised? Pascal cards are not even officially compatible with Macs.
I'D HAVE THE SAME MESSAGE WERE I USING AN AMD CARD, WOULDN'T I? UNLESS, OF COURSE, I WAS ONLY TALKING ABOUT F1 2017. WOOT.
macOS 10.14.x/Windows 7 Pro/2009 MacPro 4,1 Xeon W3580 3.33GHz/16GB RAM/EVGA GTX1070 8GB (testing a Vega 56)

#43 jeannot

jeannot

    Heroic

  • Members
  • PipPipPipPip
  • 330 posts

Posted 14 December 2017 - 03:27 PM

Why would you have the same message if you were using an AMD card? As a matter of fact, the game performs quite well on my AMD iMac. It's only around 15% slower than on DX11. I'd say that the macOS drivers are pretty good considering that Metal is not as mature as DX11 and drivers are not custom-optimised for specific games like they are on Windows.

#44 UmarOMC1

UmarOMC1

    Legendary

  • Members
  • PipPipPipPipPip
  • 1477 posts
  • Location:NYC

Posted 14 December 2017 - 07:40 PM

...and NVIDIA's drivers are pretty good considering Apple doesn't support their GPUs with kext updates like they do AMD cards.

#45 marksatt

marksatt

    Fanatic

  • Members
  • PipPip
  • 59 posts

Posted 14 December 2017 - 11:43 PM

UmarOMC1, on 14 December 2017 - 07:40 PM, said:

...and NVIDIA's drivers are pretty good considering Apple doesn't support their GPUs with kext updates like they do AMD cards.

Apple supply updated drivers for all the GPUs they have ever shipped in a Mac, I don't see why you'd expect them to supply anything else.

jeannot, on 14 December 2017 - 10:39 AM, said:

The web drivers that you use are the sole responsibility of nVidia. Why should Apple care if these drivers are poorly optimised? Pascal cards are not even officially compatible with Macs.

Exactly.

Unlike with the old OpenGL stack, there is no Apple layer between the application developer and the GPU vendor driver with Metal. If the game isn't running well on Nvidia's new GPUs (Maxwell & Pascal) when using the WebDriver, that is something that needs to be communicated to Nvidia, as Apple aren't involved at all.

I'm intrigued as to what the underlying problems are...

#46 jeannot

jeannot

    Heroic

  • Members
  • PipPipPipPip
  • 330 posts

Posted 15 December 2017 - 12:52 AM

marksatt, on 14 December 2017 - 11:43 PM, said:

Apple supply updated drivers for all the GPUs they have ever shipped in a Mac, I don't see why you'd expect them to supply anything else.



Exactly.

Unlike with the old OpenGL stack, there is no Apple layer between the application developer and the GPU vendor driver with Metal.

What was this layer, some sort of translation of OpenGL calls into some other calls consumed by the driver?

Quote

If the game isn't running well on Nvidia's new GPUs (Maxwell & Pascal) when using the WebDriver, that is something that needs to be communicated to Nvidia, as Apple aren't involved at all.

I'm intrigued as to what the underlying problems are...

I believe the root cause of the problems is the absence of cooperation between the API and driver developers, as their relationship appears to be in pretty bad shape these days.

#47 UmarOMC1

UmarOMC1

    Legendary

  • Members
  • PipPipPipPipPip
  • 1477 posts
  • Location:NYC

Posted 15 December 2017 - 01:46 AM

marksatt, on 14 December 2017 - 11:43 PM, said:

Apple supply updated drivers for all the GPUs they have ever shipped in a Mac, I don't see why you'd expect them to supply anything else.
Why do you think I expect Apple to do anything? I have a GTX1070 in my MacPro, you think I just slapped it in haphazardly?

#48 jeannot

jeannot

    Heroic

  • Members
  • PipPipPipPip
  • 330 posts

Posted 15 December 2017 - 02:24 AM

UmarOMC1, on 15 December 2017 - 01:46 AM, said:

Why do you think I expect Apple to do anything? I have a GTX1070 in my MacPro, you think I just slapped it in haphazardly?
Was "Mac" indicated anywhere on the box of the card?

#49 UmarOMC1

UmarOMC1

    Legendary

  • Members
  • PipPipPipPipPip
  • 1477 posts
  • Location:NYC

Posted 15 December 2017 - 12:25 PM

jeannot, on 15 December 2017 - 02:24 AM, said:

Was "Mac" indicated anywhere on the box of the card?
Who's stupid enough to think I bought a GTX1070 expecting support from Apple? Come on, who? I did my homework, children. See this post for answers to stupid questions. Please consider your questions before posting them.

#50 jeannot

jeannot

    Heroic

  • Members
  • PipPipPipPip
  • 330 posts

Posted 15 December 2017 - 12:50 PM

So why again are you blaming Apple for the poor performance of this card on macOS?

Quoting you again:

Quote

A GTX1070 in macOS gives about the performance of maybe a GTX680 in Windows, and noticeably not as smooth. Apple just really, really suck ass when it comes to gaming support.

It does read like you expected support from Apple.

#51 UmarOMC1

UmarOMC1

    Legendary

  • Members
  • PipPipPipPipPip
  • 1477 posts
  • Location:NYC

Posted 15 December 2017 - 02:26 PM

Sorry, I've been using Macs for a DAMNED LONG TIME and, really, you tell me, how have Apple treated gaming in general? Did you start using Macs last year? I apologize for believing others can see the bigger picture that I have over the years. Read that quote... "GAMING SUPPORT."

"GAMING SUPPORT!" I personally don't see how that's confusing enough to warrant so much passive aggressive snark. Was 'Mac' on the side of the box? Of Goddamn course it wasn't! Anything didactic to add?

#52 jeannot

jeannot

    Heroic

  • Members
  • PipPipPipPip
  • 330 posts

Posted 15 December 2017 - 02:42 PM

So the topic has shifted from the poor performance of some non-Apple drivers to how Apple have been treating gaming over the years?
That was unclear at best. Even though I have been using Macs for the last 20 years, for work and gaming, I don't really care how poorly Apple used to support gaming years ago.

According to others, gaming support has improved substantially since the release of Metal.
And it shows in Feral games.

#53 elowel

elowel

    Fan

  • Members
  • Pip
  • 41 posts

Posted 15 December 2017 - 02:56 PM

Is it possible to cap Mankind Divided FPS at 30? Wildly fluctuating numbers are not a totally enjoyable experience.

#54 jeannot

jeannot

    Heroic

  • Members
  • PipPipPipPip
  • 330 posts

Posted 15 December 2017 - 03:12 PM

elowel, on 15 December 2017 - 02:56 PM, said:

Is it possible to cap Mankind Divided FPS at 30? Wildly fluctuating numbers are not a totally enjoyable experience.
Apparently the game offers no such option. And unfortunately, macOS drivers do not support game-specific profiles like they do on Windows. Maybe you can find a way to reduce the monitor refresh rate with special software like SwitchResX (which requires disabling SIP).
You can also ask Feral directly.
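For what it's worth, the general technique any frame limiter (in-game or external) uses is simple: time each frame and sleep away whatever remains of the frame budget. A minimal, game-agnostic sketch in Python, where `render_frame` is a hypothetical stand-in for the real rendering work:

```python
import time

def run_capped(render_frame, target_fps=30.0, frames=5):
    """Run a render loop capped at target_fps: measure each frame's cost
    and sleep away whatever is left of the frame budget."""
    budget = 1.0 / target_fps          # ~33.3 ms per frame at 30 FPS
    timestamps = []
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                 # stand-in for the actual rendering work
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)   # idle out the rest of the budget
        timestamps.append(time.perf_counter())
    return timestamps
```

A sleep-based cap smooths frame pacing but cannot tear-free sync to the display the way a lowered refresh rate (or a driver-level limiter, which macOS lacks) can.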

#55 marksatt

marksatt

    Fanatic

  • Members
  • PipPip
  • 59 posts

Posted 16 December 2017 - 12:08 PM

jeannot, on 15 December 2017 - 12:52 AM, said:

What was this layer, some sort of translation of OpenGL calls into some other calls consumed by the driver?

Pretty much. Apple's OpenGL stack dated back to one they bought in the 90s for Classic Mac OS. It separated the front end, which supplied all the OpenGL function calls, from the driver level, which was meant to be a much thinner, simpler layer for the GPU vendors to provide, as they no longer had to worry about all the complexities of OpenGL's specification. The intention was good. The result is that we now have Metal, where the GPU vendors implement the whole widget...
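For readers unfamiliar with what that split means in practice, here is a purely conceptual sketch in Python (none of these class or method names correspond to Apple's or Nvidia's real code): the front end owns the big API surface, state tracking, and validation, and forwards simplified work to a small vendor-supplied driver interface. Under Metal, that entire front-end role moves into the vendor's driver.

```python
# Conceptual sketch of the layering described above -- NOT Apple's actual
# code. The "front end" owns the full OpenGL-style API surface, while
# vendors only implement a small driver interface underneath it.

class VendorDriver:
    """The thin layer each GPU vendor had to supply."""
    def submit_draw(self, vertex_count, state):
        raise NotImplementedError

class FakeVendorDriver(VendorDriver):
    """Hypothetical vendor back end that just records submitted work."""
    def __init__(self):
        self.submitted = []
    def submit_draw(self, vertex_count, state):
        self.submitted.append((vertex_count, dict(state)))

class GLFrontEnd:
    """Apple-owned layer: exposes the API, tracks state, validates calls,
    then hands simplified work to the vendor driver."""
    def __init__(self, driver):
        self.driver = driver
        self.state = {"blend": False}
    def gl_enable_blend(self):
        self.state["blend"] = True
    def gl_draw_arrays(self, vertex_count):
        if vertex_count <= 0:
            raise ValueError("invalid draw call rejected by front end")
        self.driver.submit_draw(vertex_count, self.state)
```

In this sketch the front end rejects bad calls before the driver ever sees them; with Metal there is no such shared layer, so validation, state tracking, and optimisation all live in vendor code, which is why a slow WebDriver is Nvidia's to fix.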

#56 ozzy

ozzy

    Heroic

  • Members
  • PipPipPipPip
  • 445 posts
  • Steam Name:ozzy
  • Location:London, UK

Posted 16 December 2017 - 02:53 PM

I bought the game through MGS so Feral will get the money, but looks like I will be playing in Windows. It's amazing the nVidia drivers are this horrific in OS X :(.  Here are some FPS numbers from the benchmark on my 2012 15" Retina MBP with a GTX 1070 eGPU:

Mac OS X 10.13.2:
1440p High Settings - Average 3.6 FPS Minimum 2.8 FPS Maximum 6.3 FPS

Windows 10 DirectX 12:
1440p High Settings - Average 42.4 FPS Minimum 31.0 FPS Maximum 57.1 FPS
1440p Very High Settings (what I play it in) - Average 37.8 FPS Minimum 25.8 FPS Maximum 50.5 FPS
1440p Ultra Settings - Average 34.1 FPS Minimum 24.6 FPS Maximum 44.8 FPS
1080p High Settings - Average 45.3 FPS Minimum 31.3 FPS Maximum 66.4 FPS
1080p Very High Settings - Average 42.4 FPS Minimum 26.4 FPS Maximum 67.6 FPS

#57 marksatt

marksatt

    Fanatic

  • Members
  • PipPip
  • 59 posts

Posted 16 December 2017 - 04:19 PM

ozzy, on 16 December 2017 - 02:53 PM, said:

I bought the game through MGS so Feral will get the money, but looks like I will be playing in Windows. It's amazing the nVidia drivers are this horrific in OS X :(.  Here are some FPS numbers from the benchmark on my 2012 15" Retina MBP with a GTX 1070 eGPU:

Mac OS X 10.13.2:
1440p High Settings - Average 3.6 FPS Minimum 2.8 FPS Maximum 6.3 FPS

Windows 10 DirectX 12:
1440p High Settings - Average 42.4 FPS Minimum 31.0 FPS Maximum 57.1 FPS
1440p Very High Settings (what I play it in) - Average 37.8 FPS Minimum 25.8 FPS Maximum 50.5 FPS
1440p Ultra Settings - Average 34.1 FPS Minimum 24.6 FPS Maximum 44.8 FPS
1080p High Settings - Average 45.3 FPS Minimum 31.3 FPS Maximum 66.4 FPS
1080p Very High Settings - Average 42.4 FPS Minimum 26.4 FPS Maximum 67.6 FPS

Crikey. I know that Nvidia are a bit behind as they aren't shipping new hardware in Macs and so the WebDriver is more a labour of love unsupported by Apple, but I wonder what is going on there. It looks more like the game is running on the internal NV 650 GPU and just feeding the frame-buffer to the eGPU rather than actually rendering on the 1070.

#58 ozzy

ozzy

    Heroic

  • Members
  • PipPipPipPip
  • 445 posts
  • Steam Name:ozzy
  • Location:London, UK

Posted 16 December 2017 - 04:47 PM

Yea, I’ve been wondering that too, especially since Hitman suffers the same fate in 10.13 (e.g. under 5 FPS) but worked flawlessly in 10.12. I checked iStat Menus, though, and it showed all activity happening on the 1070, not the 650m. It seems that either Metal in 10.13 is hugely problematic on nVidia, or Metal doesn’t render on an eGPU in 10.13 for some reason. OpenGL games and benchmarks seem to be fine in 10.13, though.

#59 ozzy

ozzy

    Heroic

  • Members
  • PipPipPipPip
  • 445 posts
  • Steam Name:ozzy
  • Location:London, UK

Posted 17 December 2017 - 10:33 AM

marksatt, on 16 December 2017 - 04:19 PM, said:

Crikey. I know that Nvidia are a bit behind as they aren't shipping new hardware in Macs and so the WebDriver is more a labour of love unsupported by Apple, but I wonder what is going on there. It looks more like the game is running on the internal NV 650 GPU and just feeding the frame-buffer to the eGPU rather than actually rendering on the 1070.

So I dug into this a little bit more and you might be right. It almost looks like in 10.13 anything using Metal is being rendered on the 650m, while anything using OpenGL is being rendered on the eGPU. That would help explain why Hitman performance tanked when I upgraded (to less than 5 FPS). Here's a post I wrote on the eGPU.io forums after doing a little sleuthing, in case anyone here is interested:

------------

I’m beginning to think Metal games are not rendering on the nVidia eGPU in 10.13 the way they did in 10.12. This is on my 2012 15″ MBP Retina with nVidia 650. I have an Akitio Node with GTX 1070 as my eGPU.

The reason for my original suspicion is that Hitman (which uses Metal) ran flawlessly on my eGPU at 1440p in 10.12, but now in 10.13 it’s unplayable (under 5 FPS). I just bought and installed Deus Ex, which also uses Metal, and am seeing the same thing – the benchmark in OS X is ~3 FPS while in Windows it is ~40 FPS (see results below). The strange thing is that after quitting the game, the iStat menu shows the 1070 as the one with all the processing power and memory usage, not the 650m. However, things that use OpenGL still work great in 10.13 (e.g. Heaven benchmark, Civ VI).

I’ve also run the GFX OpenGL and GFX Metal benchmarks today. Interestingly, the Metal results are not really any different from my 650m's when I ran them back in February on 10.12.x, and the iStat menu shows the 650m being used. But the OpenGL results are significantly higher, and the iStat menu shows the 1070 being used.

Has anyone else noticed this? Or can anyone else check? I’m thinking the kext to enable the nVidia eGPU may not actually work as well in 10.13 as the script from Goalque did in 10.12. Any thoughts?

Benchmark results:

Deus Ex:
  • Mac OS X 10.13.2:
    • 1440p High Settings – Average 3.6 FPS Minimum 2.8 FPS Maximum 6.3 FPS
  • Windows 10 DirectX 12:
    • 1440p High Settings – Average 42.4 FPS Minimum 31.0 FPS Maximum 57.1 FPS
    • 1440p Very High Settings (what I play it in) – Average 37.8 FPS Minimum 25.8 FPS Maximum 50.5 FPS
    • 1440p Ultra Settings – Average 34.1 FPS Minimum 24.6 FPS Maximum 44.8 FPS
    • 1080p High Settings – Average 45.3 FPS Minimum 31.3 FPS Maximum 66.4 FPS
    • 1080p Very High Settings – Average 42.4 FPS Minimum 26.4 FPS Maximum 67.6 FPS

GFX OpenGL:
  • 1070 eGPU (10.13.2) Manhattan Onscreen: 58.915 FPS, Manhattan Offscreen 128.06 FPS, T-Rex Onscreen 59.53 FPS, T-Rex Offscreen 672.81 FPS
  • 650m (10.12.x) Manhattan Onscreen: 23.91 FPS, Manhattan Offscreen 56.09 FPS, T-Rex Onscreen 49.19 FPS, T-Rex Offscreen 153.1 FPS

GFX Metal:
  • 1070 eGPU (10.13.2) Manhattan Onscreen: 36.01 FPS, Manhattan Offscreen 67.91 FPS, T-Rex Onscreen 43.69 FPS, T-Rex Offscreen 124.46 FPS
  • 650m (10.12.x) Manhattan Onscreen: 22.70 FPS, Manhattan Offscreen 67.95 FPS, T-Rex Onscreen 44.82 FPS, T-Rex Offscreen 125.62 FPS


#60 jeannot

jeannot

    Heroic

  • Members
  • PipPipPipPip
  • 330 posts

Posted 17 December 2017 - 02:48 PM

10.13 was supposed to bring eGPU support, yet it effectively works less well than 10.12?
Would you be able to test with a Radeon GPU?