Pumpkin spice latte (Linux to Mac)

Because when you're doing a shoot, it can be handy to do some quick mock-ups of the end goal and have a look before you decide what to do next.

Also handy for short projects, advertising work specifically, to be able to give the client something pretty close to the finished product before they leave.

If I was doing the editing back in the studio, I wouldn’t be doing it on a laptop at all.

Lots of mentions out there that the M1's hardware encoding is both slow and poorer quality than software encoding for H.264 (the common standard). So I would imagine a PC with a discrete GPU would be the way to go for that. If you're not doing any encoding for end users (ProRes capture to ProRes editing, at that Apple tax), it wouldn't be an issue.
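If you want to sanity-check that claim on your own footage rather than take forum chatter at face value, one rough way is to encode the same ProRes clip with the software and hardware encoders at a matched bitrate and score both against the source with VMAF. A minimal sketch, assuming an ffmpeg build with libx264, h264_videotoolbox (swap in h264_nvenc or h264_vaapi on a PC) and libvmaf; the filenames and bitrate are placeholders:

```python
# Rough A/B of software vs hardware H.264 at the same bitrate, scored with VMAF.
# Assumes ffmpeg is built with libx264, h264_videotoolbox and libvmaf.
import subprocess

SOURCE = "clip_prores.mov"   # hypothetical ProRes source clip
BITRATE = "10M"              # matched target bitrate for both encoders

def encode(encoder: str, out: str) -> None:
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE,
         "-c:v", encoder, "-b:v", BITRATE,
         "-an", out],
        check=True)

def vmaf_score(distorted: str) -> None:
    # libvmaf prints a pooled VMAF score in the log for each run
    subprocess.run(
        ["ffmpeg", "-i", distorted, "-i", SOURCE,
         "-lavfi", "libvmaf", "-f", "null", "-"],
        check=True)

for encoder, out in [("libx264", "sw.mp4"), ("h264_videotoolbox", "hw.mp4")]:
    encode(encoder, out)
    vmaf_score(out)
```

A single-pass -b:v encode isn't how you'd deliver, but it keeps the bitrate matched so any quality difference comes down to the encoder.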

Amazing what happens when adequate cooling is added to a machine:

But I guess using space for a heatsink and fan robs this device of ultimate portability. :face_with_diagonal_mouth:

Potential performance wasted. But if portability really is something that matters, then it might be worth it. Cost is a factor too: better cooling in the same form factor has costs in noise, power usage, and money.

It does get silly if you buy a very high-power-draw CPU that is never going to give you value for money with a given cooling setup, when you could have bought a cheaper one that gives you the same or similar performance because it can run at its full potential.

I really don’t see that as being the case here.

For a mini PC (think Mac Studio), I doubt that ultimate portability is an issue for the vast majority of people, else they would be using a small laptop, not a desktop monitor, keyboard, mouse (and the desk itself). Cost-wise, performant mini PCs aren't exactly the best value anyway; adding $50 to the price for a heatsink and fan so that the thing can run at full performance isn't exactly off-putting. And a fan can be controlled for the situation, so noise is really a non-issue: off when not needed, on when something heavy needs to be crunched.
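On the noise point, a Linux box will usually let you drive the fan straight from the hwmon sysfs interface, so it can sit off at idle and only spin up when something heavy is being crunched. A rough sketch of that idea; the hwmon path, thresholds and PWM values are guesses and vary per board, so check `sensors` and your driver docs first:

```python
# Crude temperature-driven fan curve via the Linux hwmon sysfs interface.
# Paths, thresholds and PWM values are assumptions -- they differ per board.
import time

HWMON = "/sys/class/hwmon/hwmon2"   # hypothetical device; varies per machine

def read_temp_c() -> float:
    with open(f"{HWMON}/temp1_input") as f:
        return int(f.read()) / 1000          # sysfs reports millidegrees C

def set_pwm(value: int) -> None:             # 0 = off, 255 = full speed
    # most drivers also need pwm1_enable written to 1 for manual control
    with open(f"{HWMON}/pwm1", "w") as f:
        f.write(str(value))

while True:
    t = read_temp_c()
    if t < 55:
        set_pwm(0)        # silent when idle
    elif t < 75:
        set_pwm(120)      # gentle hum under light load
    else:
        set_pwm(255)      # full blast when crunching something heavy
    time.sleep(5)
```

In practice you'd just let fancontrol do this for you, but the principle is the same.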

Bigger and more expensive than the mini PC above.

At least it is reported to have adequate cooling.

Size and weight are separate issues: I have a Panasonic Toughbook that's light as a feather but pretty chunky. The new Ryzen APUs have increasingly nifty onboard graphics too, and minimal power requirements compared to monster desktop-replacement laptops where battery life isn't a top priority.

From what I’ve read the M1 performance hype was a tad overblown, especially the graphics/video side.

Yea…

And against the Ryzen 7 iGPU:

It would help if he wasn't comparing the power draw of just the 3090 GPU in the PC with the overall power draw of the whole Mac system, which includes power supply (in)efficiency and the rest of the components (CPU, fans, storage).

What power was the PC actually drawing at the wall? I’m guessing it was a lot more.

Having said that, Apple did oversell the raw 3D power of the GPU.

But I don’t choose what to buy based on marketing materials, and I wouldn’t be buying a laptop if I didn’t care about portability and battery life.

Don't really care about gaming benchmarks either. The video decoding capabilities are what make the MacBook Air and MacBook Pro excellent choices for the sort of work I'm involved in.

How can you be sure that those puddings weren’t over-egged as well?

I’ve not noticed it when editing video.

What would be your best example of a laptop in the same size, portability, and battery-life class that does better? Or the same price class.

And what compromises do they make when compared with the Apple competition?

You and I have different ideas of what makes something too big to be in the same portability category. I'd much rather have a 15.6" screen, for example, than squint at an 11" or 13" device, and it really is a bit of a strange concept to me that an extra couple of inches and a couple or few hundred grams suddenly make something meaningfully less portable.

Plus Apple laptops these days need a Santa sack of dongles to get similar connectivity to a lot of laptops from other manufacturers. The smaller the form factor, the fewer ports are available and the less portability there is in other areas.

I'd need to see evidence that the M1s are better at hardware decoding/encoding too. From here it looks like the Ryzen does more: https://www.cpu-monkey.com/en/compare_cpu-amd_ryzen_7_6800h-vs-apple_m1
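Spec sheets aside, it's easy enough to check what a given box actually exposes to software. A small sketch that asks the local ffmpeg build which H.264/HEVC hardware encoders and hwaccel methods it reports (it assumes ffmpeg is on the PATH and built with the relevant backends, and being listed says nothing about speed or quality):

```python
# List the hardware-accelerated H.264/HEVC encoders and hwaccel methods that
# the local ffmpeg build reports. Shows only what's compiled in / detected.
import subprocess

def ffmpeg_lines(*args: str) -> list[str]:
    out = subprocess.run(["ffmpeg", "-hide_banner", *args],
                         capture_output=True, text=True)
    return (out.stdout + out.stderr).splitlines()

hw_names = ("nvenc", "vaapi", "qsv", "videotoolbox", "amf")
hw_encoders = [line.strip() for line in ffmpeg_lines("-encoders")
               if ("h264" in line or "hevc" in line)
               and any(name in line for name in hw_names)]

print("HW encoders:", *hw_encoders, sep="\n  ")
print("HW accel methods:", *ffmpeg_lines("-hwaccels")[1:], sep="\n  ")
```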

Apple doesn't provide any means for measuring GPU-only power draw as can be done on PC/Windows. So how did Apple get the numbers for their graph? Likely the same way that this guy did. Complaints, you say? :wink: He is trying to recreate how Apple might have arrived at their graph by using industry-standard benchmarks for the GPU only, since Apple doesn't provide any info about their testing. There is some explanation of the testing method at around 12:40 in the video.
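For what it's worth, the GPU-only number on the PC side normally comes from the card's own telemetry rather than a wall meter, which is what makes it a GPU-only figure in the first place. A minimal sketch of the NVIDIA flavour, polling nvidia-smi while a benchmark runs; the sampling interval, duration and single-GPU assumption are mine:

```python
# Poll the GPU's own power telemetry while a workload runs -- the kind of
# "GPU-only" figure that gets quoted for PCs. NVIDIA-specific.
import subprocess
import time

def gpu_power_watts() -> float:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True)
    return float(out.stdout.strip().splitlines()[0])

samples = []
end = time.time() + 60      # sample for a minute while the benchmark runs
while time.time() < end:
    samples.append(gpu_power_watts())
    time.sleep(1)

print(f"avg {sum(samples)/len(samples):.1f} W, peak {max(samples):.1f} W")
```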

That's not any kind of benchmark. All sorts of bias can have an influence there. Also, you mentioned that you are editing ProRes. No mention of encoding to H.264 or another format.

Lots of mentions out there that encoding on the M1 is not good in terms of quality, and slower than the hype suggests.

Not trying to slap down Apple's M1 for what it actually is. But obviously there is a load of hype that needs shoveling before you can get to a fair comparison.

And yes, Ryzen does look better overall. Wish I could have found something with a 6800H, a 17" 4K display (for running at 200% scaling) with good response time and contrast, a good touchpad and speakers, and Linux compatibility, that doesn't cost an arm and a leg.

Also, I wonder if integer scaling was ever implemented for the 680M in Ryzen 7. I know it was being requested but have not seen any follow-up.

I can encode overnight, in software if I want to. That’s not what slows me down when I’m actually using the computer.

Check this out: a comparison involving some professional media tasks, with an M2 laptop and a Mac Pro.

The fact that any of the figures favor the M2 is incredible. And that's against the big, big, big brother of the GPU in the AMD APU, and it's not just one benchmark.

None of that makes testing one system at the wall plug vs. another with GPU-only figures a fair comparison.

Not saying Apple was right in their graph, but that test proves nothing useful.

Then your complaint is with Apple’s testing.

No, it’s with faulty methodology. Faulty methodology is faulty.

What did Apple publish about how they measured GPU power usage?

They didn’t?

So, now we’re assuming, as well as using faulty methodology.

None of that adds up to getting useful figures about what’s actually really going on.

What Apple did or didn't do doesn't change that.

It's a best-guess recreation of Apple's methodology. Your complaint is with Apple for not providing any info on their testing. You can't blame this guy for trying his best to recreate it.