Apple's M1 chip - huge GPU gain over Iris graphics

think different


#21 Matt Diamond

Matt Diamond

    50 carat

  • Forum Moderators
  • 3259 posts
  • Location:PA; US
  • Pro Member:Yes

Posted 10 November 2020 - 10:06 PM

Bad news - the current M1 Macs don't support eGPUs. (Not clear if this is permanent, but it's looking dicey.)

Apple giveth, and Apple taketh away.

www.mindthecube.com

Current setup: macOS 10.14.x/2018 Mac Mini 3.2GHz i7/16GB RAM/Sonnet Breakaway 650 eGPU w Sapphire Radeon VEGA 56 8GB


#22 Thain Esh Kelch

Thain Esh Kelch

    Admin

  • Members+
  • 4568 posts
  • Steam ID:thaineshkelch
  • Location:Denmark

Posted 11 November 2020 - 03:29 AM

Anandtech has a really good writeup (as usual) on what we can expect from the M1.

I was ready to pull the trigger on a Mac Mini, but with no eGPU support, only HDMI 2.0, and too few supported monitor resolutions/refresh rates, it's a no. I will wait it out and hope for a mid-range Mac tower or a Pro mini-whatever.
"They're everywhere!" -BOB

#23 jeannot

jeannot

    Taunting a Second Time

  • Members+
  • 466 posts

Posted 11 November 2020 - 06:23 AM

Sneaky Snake, on 10 November 2020 - 01:28 PM, said:

BareFeats has some benchmarks of the 13" MBP running the same Shadow of the Tomb Raider benchmark at 1280x800 (slightly lower resolution). The Intel 13" MBP gets 18 fps in that benchmark, so if we apply a 2.9x boost to simulate the M1 chip, we get 52.2 fps. The 16" MBP with the 5500M in the same benchmark gets 66 fps. This puts the 5500M at around 25% faster than the GPU in the M1 chip. It also places the M1 chip as more powerful than the older Radeon GPUs in the previous 15" MBPs, such as the 560X and Vega 20.
You have to consider that the game was running under Rosetta.
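(For reference, here is the scaling arithmetic from that quote as a quick Swift sketch; the 2.9x uplift is the quoted post's assumption, not a measured GPU figure, and the Rosetta overhead jeannot mentions is not modelled.)

import Foundation

// Rough sketch of the scaling estimate quoted above. Assumption: the 2.9x
// factor is the uplift used in the quoted post, not a measured GPU figure.
let intelMBP13Fps = 18.0        // Intel 13" MBP, SotTR at 1280x800 (BareFeats)
let assumedM1Uplift = 2.9       // assumed multiplier from the quoted post
let estimatedM1Fps = intelMBP13Fps * assumedM1Uplift   // 52.2 fps
let mbp16Fps = 66.0             // 16" MBP with the 5500M, same benchmark
let lead = mbp16Fps / estimatedM1Fps - 1.0             // about 0.26
print(String(format: "Estimated M1: %.1f fps; 5500M lead: ~%.0f%%",
             estimatedM1Fps, lead * 100))
// Prints: Estimated M1: 52.2 fps; 5500M lead: ~26%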

#24 Matt Diamond

Matt Diamond

    50 carat

  • Forum Moderators
  • 3259 posts
  • Location:PA; US
  • Pro Member:Yes

Posted 11 November 2020 - 10:26 AM

Thain Esh Kelch, on 11 November 2020 - 03:29 AM, said:

Anandtech has a really good writeup (as usual) on what we can expect from the M1.
Great article, but most of it was way above my head. The gist is captured by the last graph on page 4, and these sentences:

Quote

Whilst in the past 5 years Intel has managed to increase their best single-thread performance by about 28%, Apple has managed to improve their designs by 198%, or 2.98x (let’s call it 3x) the performance of the Apple A9 of late 2015.

Apple’s performance trajectory and unquestioned execution over these years is what has made Apple Silicon a reality today. Anybody looking at the absurdness of that graph will realise that there simply was no other choice but for Apple to ditch Intel and x86 in favour of their own in-house microarchitecture – staying par for the course would have meant stagnation and worse consumer products.

www.mindthecube.com

Current setup: macOS 10.14.x/2018 Mac Mini 3.2GHz i7/16GB RAM/Sonnet Breakaway 650 eGPU w Sapphire Radeon VEGA 56 8GB


#25 Ichigo27

Ichigo27

    NSFW o_O

  • Members+
  • 2298 posts
  • Location:pingas

Posted 11 November 2020 - 03:06 PM

Since there's no ARM support for Godot, I wonder about Unity3D. Unity Technologies did say in that WWDC ARM announcement video stream that they are porting the engine to a native ARM binary for Big Sur, but they didn't show enough of it to give a good idea of overall performance, especially when you take into account that the first ARM Macs don't have discrete AMD graphics cards.
What is a man?

#26 Sneaky Snake

Sneaky Snake

    Official Mascot of the 1988 Winter Olympics

  • IMG Writers
  • 3505 posts
  • Steam Name:SneakySnake
  • Steam ID:sneaky_snake
  • Location:Waterloo, Canada

Posted 11 November 2020 - 03:28 PM

The performance Apple is hinting at is definitely impressive. However, I expect reviews to come out and all of the hardcore hardware crowd to laugh and point when the M1 gets outclassed by an AMD 4900HS or an Intel 1185G7 (Tiger Lake) - though those chips run at MUCH higher wattage (2-3x), and as such should naturally perform much better.

What will be really interesting is next year, when we have an M2/M1X or whatever they call the chip with a higher TDP, more CPU and GPU cores, and more memory - the chip destined for the 4-port 13" MBP, iMac, and 16" MBP.

Another notable performance improvement with the new chips is that memory speed is likely massively increased, seeing as the RAM is part of the same package and likely closer to GDDR than to regular DDR RAM. Essentially, the M1 is closer in design to the console chips, which have their CPU/GPU/memory as a single unit, with both the CPU and GPU having direct access to it. The huge increase in memory speed likely plays a large role in the GPU outclassing Intel Iris graphics by so much. This comes at the cost of a 16GB RAM ceiling, but future chips destined for more powerful Macs would almost certainly have access to 32GB or more.
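(To make that shared-pool point concrete, here is a minimal Metal sketch of what unified memory means at the API level; makeBuffer and .storageModeShared are standard Metal API, and the buffer size is arbitrary.)

import Metal

// Minimal sketch: with unified memory, a single allocation backs both CPU
// and GPU work, so nothing is staged into a separate VRAM pool.
guard let device = MTLCreateSystemDefaultDevice() else { fatalError("no Metal device") }

let count = 1024
// .storageModeShared places the buffer in memory both the CPU and the GPU
// can address directly; on a discrete GPU this data would instead have to
// be copied into dedicated VRAM.
let buffer = device.makeBuffer(length: count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

// The CPU writes straight into the buffer...
let values = buffer.contents().bindMemory(to: Float.self, capacity: count)
for i in 0..<count { values[i] = Float(i) }
// ...and a later compute or render pass reads the very same bytes, with no
// blit encoder pass or managed-mode didModifyRange() call in between.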
16" MBP: i9 9880H @ 2.3 GHz || Radeon 5500M 8GB || 32 GB DDR4 || 1TB SSD
Desktop: 5600X || RTX 3070 || 32 GB DDR4 || 1TB 970 EvoPlus + 1TB Seagate FireCuda
Other: 30TB Plex Server || Series X || PS5 || iPhone 12

#27 mindnoise

mindnoise

    Born On Board

  • Members+
  • 755 posts
  • Steam ID:Exploding-Bob
  • Location:Delusive-Ville

Posted 11 November 2020 - 07:05 PM

Basically what has been said: only two TB3 ports and 16GB RAM do not cut it. The M2 or M3 generation will become interesting, I guess.
hey, don't worry - it's only red pixels on your hands...

#28 Tetsuya

Tetsuya

    Colonel Chaos

  • Members+
  • 2315 posts
  • Location:MI

Posted 11 November 2020 - 07:52 PM

Quote

Whilst in the past 5 years Intel has managed to increase their best single-thread performance by about 28%, Apple has managed to improve their designs by 198%, or 2.98x (let’s call it 3x) the performance of the Apple A9 of late 2015.

Apple’s performance trajectory and unquestioned execution over these years is what has made Apple Silicon a reality today. Anybody looking at the absurdness of that graph will realise that there simply was no other choice but for Apple to ditch Intel and x86 in favour of their own in-house microarchitecture – staying par for the course would have meant stagnation and worse consumer products.



Only problem with this is that it assumes that that upward trend will continue (which is highly suspect, to say the least, considering other ARM platforms have flattened out A LOT), and that Intel and AMD will do nothing.

They didn't have to stay with Intel to stay x86.

AMD has been doing wonders, and Tiger Lake is showing quite a bit of promise.

And for full-up desktops, it's hard to imagine a low-power big.LITTLE configuration is going to somehow compete with 5+ GHz 16-core chips from AMD, particularly for production workloads.

For a daily driver, I imagine these will be very adequate to superb.

Doing "real" work on them seems like a real hit-or-miss prospect when the "competition" has double or triple the core count and significantly higher clock speeds (up to 2 GHz faster). This applies mainly to desktops/workstations, of course, and not ultraportables or the Mini, where thermal constraints wouldn't let you fit that in anyway.

#29 Homy

Homy

    Recharable Bionic Buttocks

  • Members+
  • 494 posts
  • Location:Sweden

Posted 11 November 2020 - 10:02 PM

M1 GPU is on par with Radeon Pro 560X: https://browser.geek...compute/1802234

#30 Janichsan

Janichsan

    Jugger Bugger

  • Forum Moderators
  • 8714 posts
  • Steam Name:Janichsan
  • Location:over there

Posted 12 November 2020 - 12:55 AM

Homy, on 11 November 2020 - 10:02 PM, said:

M1 GPU is on par with Radeon Pro 560X: https://browser.geek...compute/1802234
The synthetic CPU benchmarks aren't too shabby either.

"We do what we must, because we can."
"Gaming on a Mac is like women on the internet." — "Highly common and totally awesome?"


#31 Thain Esh Kelch

Thain Esh Kelch

    Admin

  • Members+
  • 4568 posts
  • Steam ID:thaineshkelch
  • Location:Denmark

Posted 12 November 2020 - 02:32 AM

I get why Apple chose Baldur's Gate 3 as a showcase for the M1, and it was impressive, but it was running at 10-25 fps at 1080p, which didn't make it look very good, IMHO. Of course, this was with all the bells and whistles, so that at least was impressive.

I wonder if Apple is adding hardware raytracing in the M1X or whatever that will be called?
"They're everywhere!" -BOB

#32 Homy

Homy

    Recharable Bionic Buttocks

  • Members+
  • 494 posts
  • Location:Sweden

Posted 12 November 2020 - 11:44 AM

Thain Esh Kelch, on 12 November 2020 - 02:32 AM, said:

I get why Apple chose Baldur's Gate 3 as a showcase for the M1, and it was impressive, but it was running at 10-25 fps at 1080p, which didn't make it look very good, IMHO. Of course, this was with all the bells and whistles, so that at least was impressive.

I wonder if Apple is adding hardware raytracing in the M1X or whatever that will be called?

Where did you get those frame rates? From this video that someone else posted here but is now gone? BG3 runs smoothly at Ultra settings at 1080p on the M1, at 6:45: https://developer.ap...ch-talks/10859/

#33 Thain Esh Kelch

Thain Esh Kelch

    Admin

  • Members+
  • 4568 posts
  • Steam ID:thaineshkelch
  • Location:Denmark

Posted 13 November 2020 - 06:20 AM

Homy, on 12 November 2020 - 11:44 AM, said:

Where did you get those frame rates? From this video that someone else posted here but is now gone? BG3 runs smoothly at Ultra settings at 1080p on the M1, at 6:45: https://developer.ap...ch-talks/10859/
My very scientifically based subjective opinion of that video. I wrote that based on my first impression during the keynote, but seeing the footage again, I stick to the lower values, though I'll raise the high end. There's clear frame dropping when they zoom in on the characters in the beginning, and there's also a noticeable drop in every scene when the red guy starts running. Nonetheless, still very impressive for an iGPU.

Speaking of which, does it share VRAM with the system RAM?
"They're everywhere!" -BOB

#34 Homy

Homy

    Recharable Bionic Buttocks

  • Members+
  • 494 posts
  • Location:Sweden

Posted 13 November 2020 - 10:42 AM

Thain Esh Kelch, on 13 November 2020 - 06:20 AM, said:

Speaking of which, does it share VRAM with the system RAM?

Yes, unified memory: "M1 also features our unified memory architecture, or UMA. M1 unifies its high-bandwidth, low-latency memory into a single pool within a custom package. As a result, all of the technologies in the SoC can access the same data without copying it between multiple pools of memory. This dramatically improves performance and power efficiency. Video apps are snappier. Games are richer and more detailed. Image processing is lightning fast. And your entire system is more responsive."

One upside is that when you choose more RAM at purchase, you automatically upgrade your VRAM too. So an iMac with 32 GB could have 20-30 GB of VRAM. :)

At 14:14 https://youtu.be/2lK0ySxQyrs?t=854
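(If you want to check this on any given Mac, Metal reports it directly; hasUnifiedMemory and recommendedMaxWorkingSetSize are real MTLDevice properties, shown here in a small Swift sketch.)

import Metal

// Query the memory setup of whatever GPU the machine has.
if let device = MTLCreateSystemDefaultDevice() {
    print("GPU: \(device.name)")
    print("Unified memory: \(device.hasUnifiedMemory)")    // true on the M1
    // Upper bound on the memory the GPU should use; on unified-memory Macs
    // this scales with system RAM, which is why a bigger RAM configuration
    // also buys you more "VRAM".
    print("Max working set: \(device.recommendedMaxWorkingSetSize / (1 << 20)) MB")
}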

#35 Thain Esh Kelch

Thain Esh Kelch

    Admin

  • Members+
  • 4568 posts
  • Steam ID:thaineshkelch
  • Location:Denmark

Posted 13 November 2020 - 11:15 AM

Interesting. I will definitely not skimp on the RAM then! It will be interesting to see benchmarks, though.
"They're everywhere!" -BOB

#36 dr.zeissler

dr.zeissler

    Computing and Humanity

  • Members+
  • 443 posts
  • Location:GERMANY

Posted 13 November 2020 - 12:52 PM

I'm considering an M1 Mac Mini 16GB and a good display (not sure what to buy)... The M1 on par with the 560X, that's GREAT!!!

When they showed the architecture of the M1, I instantly thought: what if game devs use the extra ML cores as a "blitter/copper", the way the Amiga guys used their custom chips back in the day... an advantage due to custom chips/features... that would be even bigger than the M1 alone.

If the performance is that good already, I'm not thinking about an eGPU... waiting for some game benchmarks and some info from the game porters!
iMac 27" Late 2012 3,4Ghz i7 NT 680MX
Macbook Air Late 2010 320M
MacMinis 2005 - 2011

#37 Camper-Hunter

Camper-Hunter

    Wabbit Swayer

  • Members+
  • 568 posts
  • Steam Name:Camper-Hunter
  • Steam ID:Rorqual
  • Location:Paris, France

Posted 13 November 2020 - 02:00 PM

Homy, on 13 November 2020 - 10:42 AM, said:

One upside is that when you choose more RAM at purchase, you automatically upgrade your VRAM too. So an iMac with 32 GB could have 20-30 GB of VRAM. :)

Yeah, but 20-30 GB of slow VRAM. Dedicated graphics cards, especially high-end ones, use much faster VRAM than a Mac's or PC's main RAM.

#38 dr.zeissler

dr.zeissler

    Computing and Humanity

  • Members+
  • 443 posts
  • Location:GERMANY

Posted 13 November 2020 - 02:04 PM

As far as I understand, the RAM is included in the M1 package, so the paths are fast and short, and there's not even any copying of the framebuffer.
I think they've opened a BIG box here. I expect lots of great products to follow...
iMac 27" Late 2012 3,4Ghz i7 NT 680MX
Macbook Air Late 2010 320M
MacMinis 2005 - 2011

#39 Homy

Homy

    Recharable Bionic Buttocks

  • Members+
  • 494 posts
  • Location:Sweden

Posted 13 November 2020 - 02:59 PM

Camper-Hunter, on 13 November 2020 - 02:00 PM, said:

Yeah, but 20-30 GB of slow VRAM. Dedicated graphics cards, especially high-end ones, use much faster VRAM than a Mac's or PC's main RAM.

True, but Apple must have a dGPU in development, at least for the Mac Pro.

#40 jeannot

jeannot

    Taunting a Second Time

  • Members+
  • 466 posts

Posted 14 November 2020 - 02:32 PM

Homy, on 11 November 2020 - 10:02 PM, said:

M1 GPU is on par with Radeon Pro 560X: https://browser.geek...compute/1802234
And in graphics (not compute) tasks, the M1 could be on par with the Radeon Pro 570X (and way ahead of the 560X).
There are reasons for that: the TBDR architecture of the M1 (for which Metal has been tailored) benefits graphics more than it benefits compute. Also, Apple GPUs can use 16-bit numbers in shaders to boost efficiency, which PC GPUs can't.
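(A small sketch of that 16-bit point: the Metal Shading Language has a native half type, and on Apple GPUs using it cuts register pressure and bandwidth versus float. The toy kernel below is purely illustrative and is compiled from source at runtime; makeLibrary(source:options:) and makeComputePipelineState(function:) are standard Metal API.)

import Metal

// Toy kernel whose arithmetic stays in 16-bit `half` precision throughout;
// the `h` suffix marks half-precision literals in the Metal Shading Language.
let source = """
#include <metal_stdlib>
using namespace metal;

kernel void scale_half(device half *data [[buffer(0)]],
                       uint id [[thread_position_in_grid]])
{
    data[id] = data[id] * 2.0h; // 16-bit multiply end to end
}
"""

let device = MTLCreateSystemDefaultDevice()!
let library = try! device.makeLibrary(source: source, options: nil) // runtime compile
let function = library.makeFunction(name: "scale_half")!
let pipeline = try! device.makeComputePipelineState(function: function)
print("Built 16-bit kernel; threadExecutionWidth = \(pipeline.threadExecutionWidth)")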