UmarOMC1, on 14 December 2017 - 05:16 AM, said:


Deus Ex: Mankind Divided for macOS now in final stages of development
#41
Posted 14 December 2017 - 10:39 AM
#42
Posted 14 December 2017 - 03:19 PM
jeannot, on 14 December 2017 - 10:39 AM, said:
macOS 10.15.x/Manjaro KDE/3.7GHz i7-8700K Hackintosh/64GB RAM/Gigabyte RADEON VII
(my 'world of hurt' that my kids built in a day & is easier to maintain than Windows)
macOS 10.14.x/3.33GHz Xeon W3580 cMacPro (5,1 flash)/64GB RAM/PowerColor RedDevil RX580
#43
Posted 14 December 2017 - 03:27 PM
#44
Posted 14 December 2017 - 07:40 PM
#45
Posted 14 December 2017 - 11:43 PM
UmarOMC1, on 14 December 2017 - 07:40 PM, said:
Apple supply updated drivers for all the GPUs they have ever shipped in a Mac; I don't see why you'd expect them to supply anything else.
jeannot, on 14 December 2017 - 10:39 AM, said:
Exactly.
Unlike with the old OpenGL stack, with Metal there is no Apple layer between the application developer and the GPU vendor driver. If the game isn't running well on Nvidia's new GPUs (Maxwell & Pascal) when using the WebDriver, that is something that needs to be communicated to Nvidia, as Apple aren't involved at all.
I'm intrigued as to what the underlying problems are...
#46
Posted 15 December 2017 - 12:52 AM
marksatt, on 14 December 2017 - 11:43 PM, said:
Exactly.
Unlike with the old OpenGL stack there is no Apple layer between the application developer and the GPU vendor driver with Metal.
What was this layer, some sort of translation of OpenGL calls into other calls consumed by the driver?
Quote
I'm intrigued as to what the underlying problems are...
I believe the root cause of the problems is the absence of cooperation between the API and driver developers, as their relationship appears to be in pretty bad shape these days.
#47
Posted 15 December 2017 - 01:46 AM
marksatt, on 14 December 2017 - 11:43 PM, said:
#49
Posted 15 December 2017 - 12:25 PM
jeannot, on 15 December 2017 - 02:24 AM, said:
#50
Posted 15 December 2017 - 12:50 PM
Quoting you again:
Quote
It does read like you expected support from Apple.
#51
Posted 15 December 2017 - 02:26 PM
"GAMING SUPPORT!" I personally don't see how that's confusing enough to warrant so much passive aggressive snark. Was 'Mac' on the side of the box? Of Goddamn course it wasn't! Anything didactic to add?
#52
Posted 15 December 2017 - 02:42 PM
That was unclear at best. Even though I have been using Macs for the last 20 years, for work and gaming, I don't really care about how poorly Apple used to support gaming years ago.
According to others, gaming support has improved substantially since the release of Metal.
And it shows in Feral games.
#53
Posted 15 December 2017 - 02:56 PM
#54
Posted 15 December 2017 - 03:12 PM
elowel, on 15 December 2017 - 02:56 PM, said:
You can also ask Feral directly.
#55
Posted 16 December 2017 - 12:08 PM
jeannot, on 15 December 2017 - 12:52 AM, said:
Pretty much. Apple's OpenGL stack dated back to one they bought in the 1990s for Classic Mac OS; it separated the front-end, which supplied all the OpenGL function calls, from the driver level, which was meant to be a much thinner, simpler layer for the GPU vendors to provide, as they no longer had to worry about all the complexities of OpenGL's specification. The intention was good. The result is that we now have Metal, where the GPU vendors implement the whole widget...
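The difference described above is visible from the application side: under Metal an app enumerates the vendor-driver-backed devices directly, with no Apple dispatch layer in between. A minimal Swift sketch (Apple's Metal framework on macOS; `isRemovable` requires 10.13+, and the "eGPU"/"integrated" labelling here is just an illustration):

```swift
import Metal

// Enumerate every Metal device the installed vendor drivers expose.
// The app talks to these MTLDevice objects directly; there is no
// Apple-provided OpenGL-style front-end in between.
for device in MTLCopyAllDevices() {
    // eGPUs report isRemovable == true (macOS 10.13+);
    // integrated GPUs report isLowPower == true.
    let kind = device.isRemovable ? "eGPU" : (device.isLowPower ? "integrated" : "discrete")
    print("\(device.name) [\(kind)]")
}
```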
#56
Posted 16 December 2017 - 02:53 PM

Mac OS X 10.13.2:
1440p High Settings - Average 3.6 FPS Minimum 2.8 FPS Maximum 6.3 FPS
Windows 10 DirectX 12:
1440p High Settings - Average 42.4 FPS Minimum 31.0 FPS Maximum 57.1 FPS
1440p Very High Settings (what I play it in) - Average 37.8 FPS Minimum 25.8 FPS Maximum 50.5 FPS
1440p Ultra Settings - Average 34.1 FPS Minimum 24.6 FPS Maximum 44.8 FPS
1080p High Settings - Average 45.3 FPS Minimum 31.3 FPS Maximum 66.4 FPS
1080p Very High Settings - Average 42.4 FPS Minimum 26.4 FPS Maximum 67.6 FPS
#57
Posted 16 December 2017 - 04:19 PM
ozzy, on 16 December 2017 - 02:53 PM, said:

Mac OS X 10.13.2:
1440p High Settings - Average 3.6 FPS Minimum 2.8 FPS Maximum 6.3 FPS
Windows 10 DirectX 12:
1440p High Settings - Average 42.4 FPS Minimum 31.0 FPS Maximum 57.1 FPS
1440p Very High Settings (what I play it in) - Average 37.8 FPS Minimum 25.8 FPS Maximum 50.5 FPS
1440p Ultra Settings - Average 34.1 FPS Minimum 24.6 FPS Maximum 44.8 FPS
1080p High Settings - Average 45.3 FPS Minimum 31.3 FPS Maximum 66.4 FPS
1080p Very High Settings - Average 42.4 FPS Minimum 26.4 FPS Maximum 67.6 FPS
Crikey. I know that Nvidia are a bit behind as they aren't shipping new hardware in Macs and so the WebDriver is more a labour of love unsupported by Apple, but I wonder what is going on there. It looks more like the game is running on the internal NV 650 GPU and just feeding the frame-buffer to the eGPU rather than actually rendering on the 1070.
#58
Posted 16 December 2017 - 04:47 PM
#59
Posted 17 December 2017 - 10:33 AM
marksatt, on 16 December 2017 - 04:19 PM, said:
So I dug into this a little bit more and you might be right. It almost looks like in 10.13 anything using Metal is being rendered on the 650M, while anything using OpenGL is being rendered on the eGPU. That would help explain why Hitman performance tanked when I upgraded (to less than 5 FPS). Here's a post I wrote on the eGPU.io forums after doing a little sleuthing, in case anyone here is interested:
------------
I'm beginning to think Metal games are not rendering on the Nvidia eGPU in 10.13 the way they did in 10.12. This is on my 2012 15" MBP Retina with Nvidia 650M. I have an Akitio Node with GTX 1070 as my eGPU.
The reason for my original suspicion is that Hitman (which uses Metal) ran flawlessly on my eGPU at 1440p in 10.12, but now in 10.13 it's unplayable (under 5 FPS). I just bought and installed Deus Ex, which also uses Metal, and am seeing the same thing: the benchmark in OS X is ~3 FPS while in Windows it is ~40 FPS (see results below). The strange thing is that after quitting the game, the iStat menu shows the 1070 as the one with all the processing power and memory usage, not the 650M. However, things that use OpenGL still work great in 10.13 (e.g. Heaven benchmark, Civ VI).
I've also run the GFX OpenGL and GFX Metal benchmarks today. Interestingly, the Metal results are not really any different from my 650M's when I ran them back in February on 10.12.x, and the iStat menu shows the 650M as the one being used. But on the OpenGL benchmark the results are significantly higher, and the iStat menu shows the 1070 being used.
Has anyone else noticed this? Or can anyone else check? I'm thinking the kext to enable the Nvidia eGPU may not actually work as well in 10.13 as the script from Goalque did in 10.12. Any thoughts?
Benchmark results:
Deus Ex:
- Mac OS X 10.13.2:
- 1440p High Settings – Average 3.6 FPS Minimum 2.8 FPS Maximum 6.3 FPS
- Windows 10 DirectX 12:
- 1440p High Settings – Average 42.4 FPS Minimum 31.0 FPS Maximum 57.1 FPS
- 1440p Very High Settings (what I play it in) – Average 37.8 FPS Minimum 25.8 FPS Maximum 50.5 FPS
- 1440p Ultra Settings – Average 34.1 FPS Minimum 24.6 FPS Maximum 44.8 FPS
- 1080p High Settings – Average 45.3 FPS Minimum 31.3 FPS Maximum 66.4 FPS
- 1080p Very High Settings – Average 42.4 FPS Minimum 26.4 FPS Maximum 67.6 FPS
GFX OpenGL:
- 1070 eGPU (10.13.2) Manhattan Onscreen: 58.915 FPS, Manhattan Offscreen 128.06 FPS, T-Rex Onscreen 59.53 FPS, T-Rex Offscreen 672.81 FPS
- 650m (10.12.x) Manhattan Onscreen: 23.91 FPS, Manhattan Offscreen 56.09 FPS, T-Rex Onscreen 49.19 FPS, T-Rex Offscreen 153.1 FPS
GFX Metal:
- 1070 eGPU (10.13.2) Manhattan Onscreen: 36.01 FPS, Manhattan Offscreen 67.91 FPS, T-Rex Onscreen 43.69 FPS, T-Rex Offscreen 124.46 FPS
- 650m (10.12.x) Manhattan Onscreen: 22.70 FPS, Manhattan Offscreen 67.95 FPS, T-Rex Onscreen 44.82 FPS, T-Rex Offscreen 125.62 FPS
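One way to test the suspicion above is to ask Metal directly which device an app gets by default versus which devices are present. A hedged Swift sketch (macOS 10.13+, Metal framework; a game that calls `MTLCreateSystemDefaultDevice()` will render on whatever that returns, which could be the internal 650M rather than the eGPU):

```swift
import Metal

// Which device does a Metal app get if it asks for the system default?
if let def = MTLCreateSystemDefaultDevice() {
    print("Default Metal device: \(def.name)")
}

// Which devices are actually present? eGPUs report isRemovable == true.
for device in MTLCopyAllDevices() where device.isRemovable {
    print("External GPU present: \(device.name)")
}
```

If the default device printed here is the integrated GPU while the eGPU shows up only in the enumeration, that would match the Metal-on-650M behaviour described above.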
#60
Posted 17 December 2017 - 02:48 PM
Are you able to test with a Radeon GPU?