

Tetsuya

Member Since 23 Jan 2003
Offline Last Active Today, 09:47 AM

Posts I've Made

In Topic: macOS 10.14 - Mojave

07 June 2018 - 11:26 AM

ipickert55, on 06 June 2018 - 09:56 PM, said:



Much like fire in California.

How about “Wildfire”?  That rolls off the tongue better.

In Topic: macOS 10.14 - Mojave

06 June 2018 - 09:15 PM

ipickert55, on 05 June 2018 - 05:18 PM, said:



What about 10.15 "The Entire Damn State is on Fire"?

Doesn't have the same ring to it.  “Earthquake” is bold, quick, and evocative.

In Topic: The Last Jedi [SPOILERS GALORE]

03 June 2018 - 09:09 PM

Tibur, on 21 February 2018 - 10:38 AM, said:

The prequels lost me at midichlorians. Sorry. The Force is not biological.

You're correct, it isn't.  Maybe pay closer attention to the dialogue.  (I realize this sounds standoffish.  Not my intent.  But this is a sore subject with me, because it's an invented issue if you actually pay attention to what is said.)

Quote

You aren't more powerful in the Force because you have a higher concentration of microscopic magical space bean creatures in your blood.

You're correct again!  You have a higher concentration of microscopic space bean creatures in your blood because you are more powerful in the Force.  

Quote

Otherwise, there'd be a huge market for blood transfusions from those with high concentrations of them in their blood.  Senator Palpatine would be essentially a vampire, going around draining the blood of anyone who showed the slightest ability and young Anakin would have been an empty husk at the beginning of Ep. II.</end rant>

The number of people who didn't understand the rather plain dialogue that midi-chlorians are attracted to the Force, not the cause of it, is staggering.  They don't cause the Force; they are merely attracted to it, and are therefore present anywhere it is strong.

Janichsan, on 03 June 2018 - 10:39 AM, said:

To re-activate and repurpose this thread: I don't know if anyone else has already seen "Solo", but I would suggest saving the money and waiting for it to become streamable or available on disc. It's really not great.

I mean, it's not a disaster either (as you might have expected): it's certainly competently made, passes the time, and has a few good moments, but not enough to make it really worthwhile. It's really lacking thrill and humour, and is utterly devoid of charm. And the "big" surprise at the end feels completely unnecessary.

Basically my thoughts on the movie as well.  It's... a movie.  You could have changed the names and made it into any other generic sci-fi movie.  It's not horrible, it's not great.

And the surprise at the end is only a surprise to non-fans - or, at least, people whose only exposure is the theatrical movies.

The reveal in question is not even a reveal if you follow the rest of the canon (and we're not even talking the books - the TV shows).

In Topic: Razer releases entry-level $299 eGPU enclosure with macOS support

24 May 2018 - 04:37 PM

macdude22, on 22 May 2018 - 07:19 PM, said:

I don't get what everybody's circle jerk around nvidia is. GeForce Experience sucks.

So don't use it.  You don't even have to install it.  Just uncheck the little box next to "GeForce Experience".  I've never installed it after I realized it was just a badly made optimization tool that I didn't need.

Quote

Force people to sign in to get drivers? lolwut.

Wut?  I've never done any such thing.  Ever.  I have no Nvidia account or anything like it.  And yet, I still have up-to-date drivers (just got new ones today, even).

Quote

Their proprietary popsnizzle sucks.

G-Sync is superior to FreeSync in every conceivable way.

Quote

PhysX doesn't add anything to games.

So don't use it.

Quote

Best thing apple did was toss those honkeys off the free money apple tit.


Yeah, boat-anchoring themselves to a company that cannot remotely compete at the high end, can't build a decent laptop-capable card, and whose GPUs suck down power like a thirsty linebacker was a pro move.  (Some irony here, though, as the Hades Canyon NUC's hybrid Intel i7/Radeon Vega M chips are seriously good...)

Or they could be shipping MBPs that have full-fat desktop GPUs in them for power users, and Max-Q laptops for people who need battery-sipping efficiency.

AMD is struggling to even stay relevant.  The -only- thing keeping them afloat in the GPU market right now is that they got Sony and Microsoft to pony up a huge pile of cash to have their consoles run on a cut-down RX 580 (for the PS4 Pro) and a full-fat 580 (for the Xbox One X), and had the vanilla models running on a cut-down RX 460.

Trust me, I rather wish AMD had anything remotely compelling to offer in the GPU market.  I'm not a fanboy.  I will always buy the best bang for the buck, and that hasn't been AMD in about 6+ years, and isn't going to be AMD for the foreseeable future.  I'd like them to be more relevant just to create pressure on the market and lower prices, if nothing else.  (Unlike the CPU market, where Ryzen is making great inroads against Intel.  If they can break the 4-ish GHz barrier with Zen 2, Intel is in real trouble.)

Sneaky Snake, on 22 May 2018 - 05:47 PM, said:

If you are planning on going with an RX 580 / 1060 or below, then you don't really lose anything by being stuck on AMD. The 580 and 1060 are pretty much identical. Going much higher than that probably isn't very worth it on an eGPU due to the limited bandwidth, but I'm just speculating.

I'd have to look, but I know JayzTwoCents and LTT have both done videos with eGPUs all the way up to a 1080 Ti.  It doesn't appear that there's a particular card that is not worth using - there's just a flat performance loss on anything much past a 1050 Ti/RX 460.  So a 1080 Ti will still outperform a 1080, etc.  It also depends on whether you're using it attached to an external monitor or piping it back to the internal display.  There's a MUCH bigger loss when piping back to the internal display (because it loses one of the channels so it can transmit in both directions, cutting bandwidth roughly in half).

They weren't recent videos... probably six-ish months old or more.

In Topic: Apple CPUs to Replace Intel in Macs?

02 May 2018 - 09:21 PM

Sneaky Snake, on 23 April 2018 - 07:50 AM, said:

Feel like there is a lot of bad information in here about ARM's performance that is mostly based off of the failed Windows RT or horribly optimized apps. Also comparing the iPad's ARM chip to something like the 15W 8650U is just ludicrous. The iPad has a system TDP of 5 watts versus the single chip TDP of 15W for the 8650U. The iPad's CPU is probably closer to 2-3 watts. Comparing it to something from Intel that uses 5-8x as much power and generates 5x the heat just is not fair at all. I also don't find the WindowsRT failure argument that convincing. Windows has had literally decades of optimization for x86 whereas their ARM port of Windows was rushed out the door and then canceled before there was any real time for improvement. Granted, I think WindowsRT was an absolutely terrible idea.

Here are some actual benchmarks of ARM vs x86 in a server environment.

TL;DR: Single-core performance, Intel wins by a good margin. Multi-core performance has ARM near the top (due to its architecture being able to scale to way more cores easily). ARM also has the lowest power consumption by a good margin.

Here is the conclusion from that linked article:

Ummm... who mentioned Windows RT?  I certainly didn't.  I'm talking about the very current, just-released Windows on ARM.  Like... six weeks old at most.

And it sucks.  A lot.