

Some Performance observations re: Fortnite Battle Royale in macOS and Win10


7 replies to this topic

#1 Irishman

Irishman

    Fanatic

  • Members
  • PipPip
  • 114 posts

Posted 16 January 2018 - 08:11 AM

So, during this past Christmas break, my sons and I played quite a bit of Fortnite Battle Royale (them mostly on Xbox One, and me on my iMac, dual-booting High Sierra and Boot Camp Windows 10). For a few days we were playing it so frequently in Win10 that I just left Windows running without rebooting back into macOS. My iMac's specs are as follows: late 2012, 21.5" 1080p screen, 2.9 GHz i5, 8 GB RAM, 1 TB HD, Nvidia GT 650M 512 MB GPU. I'm running Nvidia's Web Drivers (up to date as of this morning).

We have Google Fiber gigabit internet, which is hard-wired straight into my Gigabit-equipped iMac.

The reason I tried out the Windows version is that my initial experience running the game in macOS was so negative it had me concerned that maybe my Mac was too long in the tooth to play it well. What I found out instead was a little different.

Here are my observations after playing Fortnite on both OSes:

1. The loading screens and transitions are much smoother and more reliable in Windows. In High Sierra, I experience random errors (mostly failure-to-join-a-game pop-ups). Sometimes I would have to quit out of Fortnite, then quit out of the Epic Launcher, both of which are laborious and slow to wait for. My son, at one point, while waiting to join a game in High Sierra, asked me if my Mac had frozen up.

2. After playing around with the settings to achieve the best results, I learned that it will only play on lowest settings at 640x480 in High Sierra, with a frame rate that varies wildly between 5 and 60 FPS, even on loading screens. On Windows, I can run at 720p with all settings on lowest except draw distance, which I max out for obvious reasons. These settings give me a frame rate of between 40 and 60 FPS. Even with the in-game frame cap turned up to 120 FPS, we noticed no difference in measured frame rate.

3. Once in-game, the differences between the two OSes mostly melted away, giving an enjoyable playing experience. The only hitch I encountered was a slight stutter when scoping in with the game's sniper rifle, an action which was smooth in Windows.

4. Regarding menu performance under High Sierra: one thing I quickly learned not to do is bring up the menu to change graphics settings once in-game. Doing so introduced the same transition slowness I noted in #1 above. Sometimes the game would freeze up and I was forced to Command-Tab out of it and restart my Mac.

5. Please note that I am not shocked that a Mac of this vintage experiences problems playing the game. Also, please note that I am fully aware that the game is in Early Access, which means it's not fully optimized. What I hope to do is present my sons' and my experiences playing the game. The differences between the performance in Windows 10 and macOS High Sierra are disappointing, and I can only hope that further Metal development brings performance gains more on par with Windows 10.
COMING SOON: Finding the Ark of the Covenant by Brian Roberts, in the iBooks Store on iTunes, a new investigation into the Hebrews' Most Sacred Relic!

#2 Camper-Hunter

Camper-Hunter

    Heroic

  • Members
  • PipPipPipPip
  • 415 posts
  • Steam Name:Rorqual
  • Steam ID:Rorqual
  • Location:Paris, France

Posted 16 January 2018 - 02:16 PM

Irishman, on 16 January 2018 - 08:11 AM, said:

I can only hope that further Metal development brings performance gains more on par with Windows 10.

While there is life there is hope! But life is short... NVIDIA drivers are unfortunately bad on macOS. I think it's a bit better with AMD, but Windows still has better performance.

#3 marksatt

marksatt

    Fanatic

  • Members
  • PipPip
  • 59 posts

Posted 16 January 2018 - 09:22 PM

I'm focused on keeping Metal support in UE4 moving forward with the rest of UE4 (it's a big team, so it keeps changing a lot!) so optimisation of the games is handled by others. As such there may well be more going on, but below I've summarised the obvious things that come to mind. I'll also note that we pay a penalty of 10-20% just for running on macOS/Metal rather than Windows/D3D11 which is often the difference between one resolution or quality level and another.

Irishman, on 16 January 2018 - 08:11 AM, said:

So, during this past Christmas break, my sons and I played quite a bit of Fortnite Battle Royale (them mostly on Xbox One, and me on my iMac, dual-booting High Sierra and Boot Camp Windows 10). For a few days we were playing it so frequently in Win10 that I just left Windows running without rebooting back into macOS. My iMac's specs are as follows: late 2012, 21.5" 1080p screen, 2.9 GHz i5, 8 GB RAM, 1 TB HD, Nvidia GT 650M 512 MB GPU. I'm running Nvidia's Web Drivers (up to date as of this morning).

5. Please note that I am not shocked that a Mac of this vintage experiences problems playing the game. Also, please note that I am fully aware that the game is in Early Access, which means it's not fully optimized. What I hope to do is present my sons' and my experiences playing the game. The differences between the performance in Windows 10 and macOS High Sierra are disappointing, and I can only hope that further Metal development brings performance gains more on par with Windows 10.

The lack of VRAM on that GPU is really going to hurt on macOS: we use more VRAM on macOS, so we'll be paging a lot more between the CPU & GPU, which is bad. That's a necessary evil to avoid other inefficiencies caused by trying to map a D3D11-oriented engine onto the Metal API.
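To get a feel for why 512 MB of VRAM is so tight, here's a toy Python sketch (not engine code; the texture counts and sizes are made-up illustrations) of how quickly a scene's texture working set outgrows a small budget, which is what forces the CPU<->GPU paging described above:

```python
# Illustrative sketch: estimating how far a scene's texture working set
# exceeds a small VRAM budget. All numbers here are invented examples.

def texture_bytes(width, height, bytes_per_pixel=4, mip_chain=True):
    """Approximate size of one texture; a full mip chain adds ~1/3 extra."""
    base = width * height * bytes_per_pixel
    return base * 4 // 3 if mip_chain else base

def working_set_mb(textures):
    """Total VRAM needed for a list of (width, height) textures, in MiB."""
    return sum(texture_bytes(w, h) for w, h in textures) / (1024 * 1024)

# A modest scene: a few dozen 1K/2K textures quickly outgrows 512 MiB.
scene = [(2048, 2048)] * 20 + [(1024, 1024)] * 60
needed = working_set_mb(scene)          # ~747 MiB for this example scene
budget_mb = 512
overcommit = max(0.0, needed - budget_mb)  # amount that must be paged
```

Anything in `overcommit` has to be shuttled between system RAM and the GPU on demand, and if the macOS path genuinely uses more VRAM per resource, the overcommit grows accordingly.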

Irishman, on 16 January 2018 - 08:11 AM, said:

1. The loading screens and transitions are much smoother and more reliable in Windows. In High Sierra, I experience random errors (mostly failure-to-join-a-game pop-ups). Sometimes I would have to quit out of Fortnite, then quit out of the Epic Launcher, both of which are laborious and slow to wait for. My son, at one point, while waiting to join a game in High Sierra, asked me if my Mac had frozen up.

Behind the loading screen on macOS it will be doing a heck of a lot more shader compilation than it has to under Windows, and unfortunately that is currently a highly serial, single-threaded process.
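The serial-vs-parallel difference can be sketched in a few lines of Python. This is purely structural: `compile_shader` is a stand-in for an expensive backend compile, not a real Metal call, and real engines would use native threads rather than a Python pool.

```python
# Hypothetical sketch of serial vs pooled shader compilation.
from concurrent.futures import ThreadPoolExecutor

def compile_shader(source):
    # Stand-in for an expensive backend compile; returns a fake binary.
    return ("BIN", hash(source))

def compile_serial(sources):
    # The single-threaded path: each shader blocks the loading screen.
    return [compile_shader(s) for s in sources]

def compile_parallel(sources, workers=8):
    # The same work farmed out to a pool; results keep their order.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(compile_shader, sources))

shaders = [f"shader_{i}" for i in range(100)]
assert compile_serial(shaders) == compile_parallel(shaders)
```

Same results either way; the only difference is whether the load screen waits on one core or many.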

Irishman, on 16 January 2018 - 08:11 AM, said:

2. After playing around with the settings to achieve the best results, I learned that it will only play on lowest settings at 640x480 in High Sierra, with a frame rate that varies wildly between 5 and 60 FPS, even on loading screens. On Windows, I can run at 720p with all settings on lowest except draw distance, which I max out for obvious reasons. These settings give me a frame rate of between 40 and 60 FPS. Even with the in-game frame cap turned up to 120 FPS, we noticed no difference in measured frame rate.

The varying frame rate will stabilise somewhat if you play on the same build for long enough, as the local shader cache builds up entries. The price is longer load times, of course. Metal (like both D3D12 & Vulkan) exposes the developer to the reality of how shaders are actually compiled for GPUs and expects developers to optimise around this. Unfortunately, most engines are built on the abstraction provided by D3D11, which inherited the D3D9 model of separate shaders with near-zero runtime compilation cost, achieved by the driver vendors investing the man-hour equivalent of tens of millions of dollars to aggressively optimise their runtime shader compilers and make their D3D driver fully asynchronous (so games don't block when calling D3D) via substantial multi-threading. Often they optimised their shader compilers to generate code that is *trivial* to patch when render state (like render-target or texture formats) changes, or even outright replaced the game's shaders with their own "specially optimised" versions.

The new APIs force/encourage driver vendors to optimise their shader compilers to generate "perfect" GPU shader code, even if it makes shader compilation much slower, as that is now the game developer's problem and not directly the vendor's. That is not to say the vendors don't care - merely that the APIs send you down a particular implementation route.
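The warm-up effect described above can be sketched as simple memoization. The class and names below are illustrative, not UE4 or Metal API: the point is just that the first use of each (shader, render state) combination pays the compile cost (a visible hitch), and every later use is a cheap lookup.

```python
# Toy model of a local shader/pipeline cache warming up.
class PipelineCache:
    def __init__(self):
        self._cache = {}
        self.compiles = 0   # how many real (slow) compiles we paid for

    def get(self, shader, render_state):
        key = (shader, render_state)
        if key not in self._cache:
            self.compiles += 1                      # slow path: hitch here
            self._cache[key] = f"PSO({shader},{render_state})"
        return self._cache[key]                     # fast path once warm

cache = PipelineCache()
# First session: every new (shader, state) combination hitches once...
for state in ("opaque", "translucent"):
    cache.get("sniper_scope", state)
# ...replaying the same combinations later is free.
for state in ("opaque", "translucent"):
    cache.get("sniper_scope", state)
assert cache.compiles == 2   # two unique keys, two compiles, ever
```

This is also why a game patch resets the pain: new shader builds mean new cache keys, so the hitches return until the cache warms up again.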

Another cause of fluctuating frame rates is that D3D11's GPU resource management is also heavily abstracted away from the developer, with the vendor able to do a lot of under-the-hood optimisations as they control the implementation. Metal puts that all on the game developer, a bit like Vulkan (though the API & semantics are quite different), and right now we aren't as efficient at allocating resources as D3D, which can cause hitches on the CPU.
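One standard way engines reduce those allocation hitches is a free-list pool: recycle previously freed buffers instead of asking the driver for a fresh one every frame. A toy sketch (the `BufferPool` class and sizes are invented for illustration, not an engine or Metal API):

```python
# Toy free-list resource pool: reuse freed buffers, avoid driver calls.
class BufferPool:
    def __init__(self):
        self._free = {}          # size -> list of recycled buffers
        self.driver_allocs = 0   # expensive allocations we actually made

    def acquire(self, size):
        bucket = self._free.get(size)
        if bucket:
            return bucket.pop()          # cheap: reuse a recycled buffer
        self.driver_allocs += 1          # expensive: real driver allocation
        return {"size": size}

    def release(self, buf):
        self._free.setdefault(buf["size"], []).append(buf)

pool = BufferPool()
for _ in range(100):                 # per-frame transient buffer use
    b = pool.acquire(64 * 1024)
    pool.release(b)
assert pool.driver_allocs == 1       # only the first frame paid the cost
```

The trade-off is the same as with the shader cache: you spend memory holding recycled resources to buy predictable frame times.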

Irishman, on 16 January 2018 - 08:11 AM, said:

3. Once in-game, the differences between the two OSes mostly melted away, giving an enjoyable playing experience. The only hitch I encountered was a slight stutter when scoping in with the game's sniper rifle, an action which was smooth in Windows.

That'll be shader compilation or resource allocation, which is more expensive on Metal than D3D. If the game plays well in-game on macOS, then really that's the important part.

Irishman, on 16 January 2018 - 08:11 AM, said:

4. Regarding menu performance under High Sierra: one thing I quickly learned not to do is bring up the menu to change graphics settings once in-game. Doing so introduced the same transition slowness I noted in #1 above. Sometimes the game would freeze up and I was forced to Command-Tab out of it and restart my Mac.

Well, yep, that'll be the engine recompiling all the shaders & shader-pipelines ;)

#4 jeannot

jeannot

    Heroic

  • Members
  • PipPipPipPip
  • 330 posts

Posted 17 January 2018 - 01:16 AM

Interesting. Do you think that when/if UE4 is optimised for DX12, Metal should benefit as well?

#5 marksatt

marksatt

    Fanatic

  • Members
  • PipPip
  • 59 posts

Posted 17 January 2018 - 12:39 PM

jeannot, on 17 January 2018 - 01:16 AM, said:

Interesting. Do you think that when/if UE4 is optimised for DX12, Metal should benefit as well?

It is a bit more complicated than that. You can optimise D3D12 by changing D3D12-specific code, which won't benefit Metal, or by changing the common, higher-level code, which might. If we change high-level code to benefit D3D12 or Vulkan it could help Metal but it very much depends on what we are changing and how. What I will say is that performance will continue to improve incrementally in future releases but I can't promise how that will translate to individual games nor will I promise any radical improvements.
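The split between backend-specific and shared code can be sketched with a minimal layering example. The class names below are illustrative, not actual UE4 types: the point is that an optimisation made above the common interface (here, sorting draws) reaches every backend, while a change inside `D3D12RHI` never touches `MetalRHI`.

```python
# Toy model of a layered renderer: common code above, backends below.
class RHI:                      # common interface all backends implement
    def draw(self, mesh):
        raise NotImplementedError

class D3D12RHI(RHI):
    def draw(self, mesh):       # backend-specific path: D3D12-only
        return f"d3d12:{mesh}"

class MetalRHI(RHI):
    def draw(self, mesh):       # backend-specific path: Metal-only
        return f"metal:{mesh}"

def render_scene(rhi, meshes):
    # A high-level optimisation (e.g. sorting draws to reduce state
    # changes) lives here, above the backends, so both benefit from it.
    return [rhi.draw(m) for m in sorted(meshes)]

out_metal = render_scene(MetalRHI(), ["b", "a"])
out_d3d = render_scene(D3D12RHI(), ["b", "a"])
```

So "optimised for DX12" only helps Metal when the work happens in `render_scene`-level code rather than inside the D3D12 backend itself.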

#6 Irishman

Irishman

    Fanatic

  • Members
  • PipPip
  • 114 posts

Posted 19 January 2018 - 12:15 PM

marksatt, on 17 January 2018 - 12:39 PM, said:

It is a bit more complicated than that. You can optimise D3D12 by changing D3D12-specific code, which won't benefit Metal, or by changing the common, higher-level code, which might. If we change high-level code to benefit D3D12 or Vulkan it could help Metal but it very much depends on what we are changing and how. What I will say is that performance will continue to improve incrementally in future releases but I can't promise how that will translate to individual games nor will I promise any radical improvements.


Thanks, Mark, for putting our experience into some context other than "Macs suck for gaming! What did you expect?"

:)

#7 MrUNIMOG

MrUNIMOG

    Fanatic

  • Members
  • PipPip
  • 58 posts
  • Location:Hamburg, Germany

Posted 14 April 2018 - 02:25 PM

marksatt, on 16 January 2018 - 09:22 PM, said:

I'm focused on keeping Metal support in UE4 moving forward with the rest of UE4 (it's a big team, so it keeps changing a lot!) so optimisation of the games is handled by others. As such there may well be more going on, but below I've summarised the obvious things that come to mind. I'll also note that we pay a penalty of 10-20% just for running on macOS/Metal rather than Windows/D3D11 which is often the difference between one resolution or quality level and another.

[...]

Interesting read, thanks for sharing those insights!

I've noticed that with my MBP's Iris Plus 650, Fortnite generally achieves pretty decent frame rates under macOS (mostly between 40 and 70 FPS), except for severe frame drops every few minutes, which others seem to have experienced on Macs with Intel GPUs as well.

Is there something specific to Intel's GPU architecture that might cause this, or could it be a driver issue?
Devices I play games on:

MacBook Pro (13-inch, Late 2016, 4 TBT3) — 3.1 GHz i5-6287U | Iris 550 | 16 GB RAM | 512 GB Flash | Space Gray
iMac (27-inch, Late 2012) — 3.4 GHz i7-3770 | GTX 680MX | 8 GB RAM | 1 TB Fusion Drive
iMac (Flat Panel) — 700 MHz PPC 7450 (G4) | GeForce2 MX | 640 MB RAM | 40 GB HDD
iMac (5 Flavors) — 333 MHz PPC 750 (G3) | ATI Rage Pro | 512 MB RAM | 6 GB HDD | Tangerine

iPhone X — 256 GB | Silver
iPhone 5s — 32 GB | Space Gray
iPad Pro (10.5-inch) — 256 GB | Space Gray
iPad Air 2 — 64 GB | Space Gray
iPad — 16 GB
iPod touch (3rd generation) — 32 GB
iPod touch — 8 GB
Apple Watch (1st generation) — 42 mm | Stainless Steel (Yes, there's games for watchOS)

#8 Irishman

Irishman

    Fanatic

  • Members
  • PipPip
  • 114 posts

Posted 30 November 2018 - 07:59 AM

Just an update on performance differences between Win10 running in Boot Camp and macOS Mojave (10.14.1). Both OSes up to date as of last night. Fortnite up to date as of yesterday.

After all the build-up about Mojave, I was hoping to see a continued shrinking of the performance delta between Win10 and macOS, and hopefully something more competitive. Not to be, apparently. And that's frustrating. The gap is worsening for me: I can play at 40-60 FPS in Win10 at 1080p on minimum settings, while I struggle to get 2-20 FPS in Mojave at the lowest resolution I can crank it down to, on minimum settings.

What the hell!?!?