Pumpkin spice latte (Linux to Mac)

Bigger and more expensive than the mini PC above.

At least it is reported to have adequate cooling.

Size and weight are separate issues; I have a Panasonic Toughbook that’s small and feather-light but pretty chunky. The new Ryzen APUs have increasingly nifty onboard graphics too, and minimal power requirements compared to monster desktop-replacement laptops where battery life isn’t the top priority.

From what I’ve read the M1 performance hype was a tad overblown, especially the graphics/video side.

Yea…

And against a Ryzen 7 iGPU:

It would help if he wasn’t comparing the power draw of just the 3090 GPU in the PC with the overall power draw of the whole Mac system, which includes power supply (in)efficiency and the rest of the components (CPU, fans, storage).

What power was the PC actually drawing at the wall? I’m guessing it was a lot more.
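To make the point concrete, here’s a minimal sketch of why a GPU-only figure understates what the PC pulls at the wall. All the numbers below are assumptions for the sake of the example, not measurements from the video:

```python
# Illustrative sketch: why GPU-only draw understates wall draw.
# Every figure here is a made-up round number, not real data.

def wall_draw(component_watts, psu_efficiency):
    """Estimate power drawn at the wall from the sum of DC component
    draws and the PSU's efficiency at that load."""
    return sum(component_watts.values()) / psu_efficiency

pc = {
    "gpu_3090": 350.0,           # the kind of GPU-only figure quoted
    "cpu": 90.0,                 # hypothetical CPU load during the run
    "fans_storage_board": 40.0,  # everything else, roughly
}

# A good 80 Plus Gold PSU is roughly 90% efficient at typical loads.
total = wall_draw(pc, psu_efficiency=0.90)
gpu_only = pc["gpu_3090"]

print(f"GPU-only figure: {gpu_only:.0f} W")
print(f"Estimated wall draw: {total:.0f} W")
print(f"Hidden by the GPU-only number: {total - gpu_only:.0f} W")
```

Even with generous assumptions, well over a hundred watts never shows up in the GPU-only number, which is exactly what gets compared against the Mac’s whole-system draw.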

Having said that, Apple did oversell the raw 3D power of the GPU.

But I don’t choose what to buy based on marketing materials, and I wouldn’t be buying a laptop if I didn’t care about portability and battery life.

Don’t really care about gaming benchmarks either. The video decoding capabilities are what makes the macbook air and macbook pro excellent choices for the sort of work I’m involved in.

How can you be sure that those puddings weren’t over-egged as well?

I’ve not noticed it when editing video.

What would be your best example of a laptop in the same size, portability, and battery-life class that does better? Or the same price class.

And what compromises do they make when compared with the apple competition?

You and I have different ideas of what makes something too big to be in the same portability category. I’d much rather have a 15.6" screen, for example, than have to squint at an 11" or 13" device, and it really is a slightly strange concept to me that an extra couple of inches and a couple hundred grams suddenly make something meaningfully less portable.

Plus Apple laptops these days need a Santa sack of dongles to match the connectivity of a lot of laptops from other manufacturers. The smaller the form factor, the fewer ports are available, and the less portable the machine becomes in other respects.

I’d need to see evidence that the M1s are better at hardware decoding/encoding too. From here it looks like the Ryzen does more: https://www.cpu-monkey.com/en/compare_cpu-amd_ryzen_7_6800h-vs-apple_m1

Apple doesn’t provide any means of measuring GPU-only power draw, as can be done on a Windows PC. So how did Apple get the numbers for their graph? Likely the same way this guy did. Complaints, you say? :wink: He is trying to recreate how Apple might have arrived at their graph by using industry-standard GPU-only benchmarks, since Apple doesn’t provide any info about their testing. There is some explanation of the testing method at around 12:40 in the video.

That’s not any kind of benchmark; all sorts of bias can creep in there. Also, you mentioned that you are editing ProRes, with no mention of encoding to H.264 or another format.

Lots of mentions out there that the encoding on the M1 is not good in terms of quality and less than hyped in terms of speed.

I’m not trying to slap down Apple’s M1 for what it actually is, but there is obviously a load of hype that needs shoveling away before you can get to a fair comparison.

And yes, the Ryzen does look better overall. I wish I could have found something with a 6800H and a 17" 4K display (for running at 200% scaling) with good response time and contrast, a good touchpad and speakers, and Linux compatibility, that doesn’t cost an arm and a leg.

Also, I wonder if integer scaling was ever implemented for the 680M in the Ryzen 7. I know it was being requested, but I haven’t seen any follow-up.

I can encode overnight, in software if I want to. That’s not what slows me down when I’m actually using the computer.

Check this out: a comparison involving some professional media tasks, with an M2 laptop and a Mac Pro.

The fact that any of the figures favor the M2 is incredible. And that’s against the much, much bigger brother of the GPU in the AMD APU, and it’s not just one benchmark.

None of that makes testing one system at the wall plug vs. another with GPU-only figures a fair comparison.

Not saying Apple was right in their graph, but that test proves nothing useful.

Then your complaint is with Apple’s testing.

No, it’s with faulty methodology. Faulty methodology is faulty.

What did Apple publish about how they measured GPU power usage?

They didn’t?

So, now we’re assuming, as well as using faulty methodology.

None of that adds up to getting useful figures about what’s actually really going on.

What apple did or didn’t do doesn’t change that.

It’s a best guess recreation of Apple’s methodology. Your complaint is with Apple for not providing any info on their testing. You can’t blame this guy for trying his best to recreate it.

Nope, it’s with using faulty methodology to compare apples and oranges.

Two wrongs don’t make a right.

The figures are worthless.

Unless you have a way to allow for non-GPU power usage and losses in the power supply, you’re not comparing the same thing.

What Apple did claim was GPU performance per watt. I don’t believe their claims stack up, but this test doesn’t get us any closer to the truth of how much performance per watt you actually get from the respective GPUs.
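The arithmetic behind that complaint is simple enough to sketch. The scores and wattages below are invented round numbers purely for illustration, as is the 60% guess at the Mac GPU’s share of system draw; the point is how the choice of denominator swings the comparison:

```python
# Illustrative sketch of why the denominator matters for perf/W.
# Scores and watts are made-up numbers, not real benchmark data.

def perf_per_watt(score, watts):
    return score / watts

mac_score, mac_wall_watts = 100.0, 90.0     # whole system at the wall
pc_score, pc_gpu_only_watts = 180.0, 320.0  # GPU-only internal figure

# Naive comparison mixes a whole-system figure with a GPU-only one.
naive_mac = perf_per_watt(mac_score, mac_wall_watts)
naive_pc = perf_per_watt(pc_score, pc_gpu_only_watts)

# To compare like with like you'd need the Mac's GPU-only draw, which
# there is no published way to measure. Assume (purely for
# illustration) the GPU accounts for 60% of the Mac's wall draw.
adjusted_mac = perf_per_watt(mac_score, mac_wall_watts * 0.60)

print(f"naive Mac perf/W:    {naive_mac:.2f}")
print(f"naive PC perf/W:     {naive_pc:.2f}")
print(f"adjusted Mac perf/W: {adjusted_mac:.2f}")
```

With these made-up inputs the Mac’s perf/W jumps by two thirds once you strip out the non-GPU draw, which is exactly the correction the test never makes.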

Also, the second video seems to show CPU and GPU power draw on the Mac systems used.

Not really. We at least know his methodology; that is something concrete you could reproduce at home yourself. Apple’s graph gives you nothing to go on, other than an indication of how they very likely measured power draw. Drink the Kool-Aid if you like, but this test is a best-attempt recreation of Apple’s own testing for their published graph.

With what CPU? How many fans? How efficient is his PSU compared to mine?

I know his methodology.

Compare the total power usage of one computer against just the GPU usage of another…

And speculate about how much power the Apple GPU actually used.

Fantastic.

If that’s all I could do, I’d say I had no real idea about the difference in performance per watt until I could allow for the confounding factors.

This is where people like the guys at Gamers Nexus are really handy. They actually dig into stuff like that when it matters.

That is why he made the video: because Apple didn’t provide any such info. It’s ludicrous. He at least shows the internally measured power draw of the 3090 and the software used, so anyone can recreate the numbers he arrived at. Apple didn’t do that.
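For what it’s worth, that internal GPU-only measurement is easy to reproduce on the PC side. A sketch of one way to do it, assuming an NVIDIA card: log `power.draw` with `nvidia-smi` and average the samples over the benchmark run. The log contents below are invented stand-in values, since the sketch has to run without a GPU:

```python
# Sketch: averaging GPU-only power draw from an nvidia-smi log.
# On a real system the log could be captured with something like:
#   nvidia-smi --query-gpu=power.draw --format=csv,noheader,nounits -l 1 > power.log
# The string below stands in for that file; the values are invented.

sample_log = """\
312.45
348.10
351.72
346.03
298.88
"""

def average_power(log_text):
    """Average the per-sample power.draw readings (watts) in the log."""
    samples = [float(line) for line in log_text.splitlines() if line.strip()]
    return sum(samples) / len(samples)

print(f"mean GPU draw: {average_power(sample_log):.1f} W")
```

Nothing comparable is possible on the Mac side, which is the whole dispute: there is no published counter to log.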

Correct. But the power draw of a 3090 vs. the power draw of a complete computer, with all the losses in the PSU, isn’t a particularly useful comparison.

If it makes you feel better I’ll admonish Apple harshly, but that doesn’t make his figures any more useful to me in understanding the relative performance per watt of the two different GPUs.

There is only one way to do that, and it’s by knowing how much power the Apple GPU is actually using.