
Wednesday, 11 November 2020

Comments

I find that the most sober, precise, and readable assessments are found on TidBITS.com.

Eolake

The main thing with this new system chip is getting performance comparable or better than the old Intel CPUs while using much much less power.

The hardware is derived from the chips that Apple has been making for their mobile devices, which over the last few years have gone from being power-efficient but somewhat slower than "desktop" CPUs to as fast or faster than regular CPUs while still being power-efficient.

I think for most normal users, once this is all baked, you'll notice things are zippier, but the main thing you'll notice is >10-hour battery life in a laptop ... and how the laptop doesn't seem to warm you up as much as your current one does, if you have one.

Mike, as with computers and cameras and cell phones... only Geeks rush in. Life as a troglodyte has my computer upgrades being dissed because I'm using OS X High Sierra to run my Photoshop CS6 and my Safari is out of date; I am happy with my M10 and older lenses even though I am way behind in the megapixel count; and I use an iPhone 7 that is more than I will ever need. I will eventually have to upgrade my phone when the battery finally becomes too troublesome to deal with, won't change cameras until my eyesight dictates a change to AF, and when I change my computer and operating system I will bury my CS6 and go with something else that doesn't indenture me to Adobe. This is a big change for Apple; wait until the hoopla is over and saner voices have their say. Until then, if you need a new computer, the sales on the older models should start soon!

RAM is limited to 16GB because it's on the M1 chip. I have 64GB in my main Win 10 PC - and need it.

Not all existing programs will run under emulation (which is Rosetta 2), and some will take a long time to be recoded.

Some of the performance claims are likely to be marketing hyperbole...

As for the number of 'transistors', 16 billion: my new graphics card has 28 billion.

Oh, and have fun connecting anything to them. I am an MBP, iPad, and iPhone user too, so by no means Apple-phobic, and I remember my Quadra with Trinitron monitor very fondly.

Bottom line: The M1 is just a new, faster than ever, Mac. That's it.


Do you remember when the Mac abandoned the PowerPC chip for the Intel chip? Not a lot of difference for those of us who just use a Mac; it was just a faster Mac. The Intel-to-M1 conversion will be a lot like that.


For the software geeks who appreciate the finer details of computer architecture (e.g., me), it is a fascinating step forward, lots of reasons to geek-out. But all of this is hidden under the covers.


Several years from now, it is likely that Intel-based applications will stop working. Users of these applications will be forced to upgrade or find alternative applications. We just went through this with MacOS Catalina: 32 bit applications no longer work. I'm still grumbling that I was forced to upgrade my ancient but adequate-for-me copy of Photoshop Elements.

Intel processors use enormous amounts of power in part because the actual binary instructions are so terribly hard to parse and decode. The processor translates them into an internal instruction set that it actually executes.

This is partly a legacy of it being a scale-up of the Intel 8080 CPU used in the first hobbyist computers. The smallest instructions are one byte, but some are two bytes, there are all sorts of override prefixes, etc.

This was an advantage when RAM was slow, small, and expensive: the processor spent less time waiting for memory when reading instructions. With caches, that's now irrelevant. Now it's just a compatibility albatross around Intel's neck. The advantage of x86 chips is that they run Windows.
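The decode-complexity point above can be illustrated with a toy sketch. The opcode tables here are invented for the example (they are not real x86 or ARM encodings): with fixed-width instructions, every instruction boundary is known up front and decoders can work in parallel, while a variable-length encoding forces a serial scan because each boundary depends on decoding everything before it.

```python
# Toy illustration: why variable-length instruction encodings
# complicate decoding. Opcode tables are hypothetical.

FIXED_WIDTH = 4  # ARM-style: every instruction is 4 bytes

# Variable-length scheme: the first byte determines total length
VAR_LENGTHS = {0x01: 1, 0x02: 2, 0x03: 3, 0x0F: 4}

def fixed_boundaries(code: bytes) -> list[int]:
    """Instruction starts are known without reading the bytes at all:
    hardware can decode many instructions in parallel."""
    return list(range(0, len(code), FIXED_WIDTH))

def variable_boundaries(code: bytes) -> list[int]:
    """Each boundary depends on decoding every earlier instruction,
    so finding them is inherently sequential."""
    starts, i = [], 0
    while i < len(code):
        starts.append(i)
        i += VAR_LENGTHS[code[i]]
    return starts

prog = bytes([0x01, 0x03, 0, 0, 0x02, 0, 0x0F, 0, 0, 0])
print(fixed_boundaries(bytes(8)))   # -> [0, 4]
print(variable_boundaries(prog))    # -> [0, 1, 4, 6]
```

Real x86 decoders are far more elaborate (prefixes, ModRM bytes, and so on), which is exactly the commenter's point: all that machinery costs transistors and power before any useful work happens.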

The ARM processor instruction set isn't a pile of history like the x86 instruction set, and was designed to be efficiently implemented with modern silicon technology and design concepts. This is why it has been such a success in the cellphone market, where the power budget is incredibly tight.

Software availability will be the initial challenge for the M1, just like the PowerPC to Intel transition for Apple. Sure, all of Microsoft's software will quickly be native for M1, since they already support ARM for the cheaper Surface machines. But lots of software will have to run under the Rosetta x86 emulation layer.
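As an aside on that emulation layer: macOS exposes whether the current process is being translated by Rosetta via the `sysctl.proc_translated` key. A minimal sketch of checking it from Python follows; the fallback behavior and message wording on non-macOS systems are my own assumptions.

```python
import platform
import subprocess

def rosetta_status() -> str:
    """Report whether this process runs natively or under Rosetta 2.

    On macOS, sysctl.proc_translated is 1 under Rosetta translation
    and 0 when native. On other systems the key doesn't exist, so we
    just report the machine architecture instead.
    """
    if platform.system() == "Darwin":
        try:
            out = subprocess.run(
                ["sysctl", "-n", "sysctl.proc_translated"],
                capture_output=True, text=True, check=True,
            ).stdout.strip()
            return "translated (Rosetta 2)" if out == "1" else "native"
        except subprocess.CalledProcessError:
            return "native (key absent: pre-Apple Silicon macOS)"
    return f"not macOS ({platform.machine()})"

print(rosetta_status())
```

This is how some apps decide whether to nag you to install their native Apple Silicon build.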

Since Apple has to license Rosetta, it will disappear from MacOS in a few years, just like the version for emulating PowerPC did.

I doubt there will ever be a way to run Photoshop CS6 on an M1 machine. Just like you can't run Nikon Scan (PowerPC only) natively on an Intel Mac anymore.

> Personally I like the fact that the Air finally dumps the unsatisfying butterfly keyboard

Note that all of the current MacBook Air and MacBook Pro models, M1 or not, have a non-butterfly keyboard design, as far as I know. They changed this over the last year or two.

The Pros retain the "Touch Bar" thingy at the top, which people either love, or hate, or just think is kinda useless.


In support of what psu wrote above, this from AnandTech ...

Apple’s performance trajectory and unquestioned execution over these years is what has made Apple Silicon a reality today. ... there simply was no other choice but for Apple to ditch Intel and x86 in favour of their own in-house microarchitecture – staying par for the course would have meant stagnation and worse consumer products.

https://www.anandtech.com/show/16226/apple-silicon-m1-a14-deep-dive/4

The irony, in some ways, is that after years of being a dyed-in-the-wool computer power user, my primary (and, practically speaking, *only*) “computer” now is an iPad Air 3rd gen with the Smart Keyboard cover and a 1st gen Pencil (with separate keyboard and trackpad I can use when I’m at my desk, which is not at all often).

That even includes photo and video work. Between Photos, Files, Pixelmator Photo, Affinity Photo and LumaFusion, plus a nice big hard drive on a network storage device, the iPad has pretty much everything I do there covered. Being able to interact directly with the displayed image just feels right and the screen is the best I’ve ever used, size be damned. I could use more onboard storage, but that’ll be corrected on the next upgrade cycle (lessons learned, and all that).

I’ve made a very sharp turn deep into the Apple world in the last 6 months, but owning an actual Mac is barely on the radar any more. For the handful of tasks I can’t do yet using the iPad, I have my old Win10 laptop stored away; I haven’t powered it up in a couple of months...

It’s still interesting news though, I’d bet the line between iPadOS and MacOS will be getting just a little fuzzier going forward.

I have been using MacBook Pros for about a dozen years - but I run Windows because my work software doesn't work with the Mac OS. Boot Camp makes it seamless. I don't see anything about the M1 supporting Windows, so I may be done with Macs eventually.

I remain firmly on the trailing edge of technology. I don't expect to transition to Apple Silicon anytime soon.

I still run an older MacPro that has four internal boot drives ranging from Snow Leopard to Mojave. This allows me to run both current and legacy software. I don't want to give up this "feature" just to have a newer, faster machine.

Because, in my case, the workflow bottleneck is not the computer, it's the operator.

Mike, I am like you and have used Macs for years and clearly am too old to change. But, there is always a but: I have a five-year-old MacBook Air and an iMac. I love my old MacBook Air. I encouraged my wife to replace her ten-year-old (non-updatable) Mac with a new MacBook Pro. She doesn't like it and I don't either. The keyboard is terrible! My old Air has a great keyboard and I love typing on it. Also, mine has all the ports on the side necessary for us photogs. My iMac's keyboard is fine, better than her Pro's but not as good as my Air's. I hope that whatever Apple comes out with, they provide a good keyboard. That is a small but important part of a computer, at least in my world. All the best. Eric

All the available (non-Apple provided) data shows even the iPhone/iPad versions of the chips are faster than the chips we previously had in any of these Macs. That's without even considering these are scaled up in terms of cores and power from the iOS versions.

This is Apple's third CPU transition, and with the big jump in speed, devs were reporting better performance on the dev kits in emulation than on the older hardware running native code in many cases. In any case, Apple's ability to move the developer community is dramatically different now versus back in the PowerPC days. In addition to Microsoft, Adobe has been very aggressive in porting its code stack over to iOS, which lays the foundation for running on Apple Silicon.

*Clarification: Rosetta 2 isn't licensed; it's an Apple in-house technology.

The RAM limitation might be the biggest problem but it's certainly not an Apples-to-(old) Apples comparison even when it comes to RAM.

A performance chart from AnandTech. The Intel CPUs being charted are each year's top-of-the-line. The measure is single-threaded integer performance, which carries its own caveats about applicability.

https://pbs.twimg.com/media/EmioHgHXcAExFq_?format=png&name=medium

It's going to have to be pretty spectacular. I've supported Macs for a university's art and media programs for a while, and I'm sure we'll keep using them there. Works well. But it looks like an $800 to $900 price premium for an M1 Macbook Pro vs. a Thinkpad with some similar specs (16GB of RAM, 512GB SSD, three years of warranty with accidental damage). And the new Macs can't support an external GPU, so there may be some creatives who can't switch right away if the onboard GPU doesn't perform similarly. And it's hard for me to see how the extra $900 adds value as a general purpose computer. I picked up an i7 Thinkpad X390 on sale with similar specs (16/512) with 3 years of warranty for about $1200 last month, and it runs Photoshop great. With a docking station and my BenQ SW240, it's a really nice setup.

I prefer Apple for tablets/phones, but as comfortable as I am with Macs, I don't quite get the appeal vs. the price.

Paul Glover:
"I’d bet the line between iPadOS and MacOS will be getting just a little fuzzier going forward."

That has already been announced. You will soon be able to run iOS apps on the Apple ARM based computers.

I got my first Mac on April 18, 1984. I've been through many Mac iterations and even a brief flirtation with Windows in the '90s.

Recently I wondered about the potential growing pains of a new chipset for the new Mac and decided to get an Intel based 2020 iMac, all tricked out. No doubt that the new chips will become the norm in a few years but by then, I may be ready for a new machine.

I've not forgiven Apple for the design and functional disaster that was the 2013 Mac Pro (trashcan). I paid a lot of money for it and ended up spending even more on lots of peripherals (OWC boxes for hard disks to store my TBs of photos). I recently sold the 2013 Mac Pro; it was working just fine, but it never felt right. The 2020 iMac just feels better.

Will the M1 chip run the current Photoshop and Lightroom Classic, and oh, by the way will my 3880 printer still work?

Jay Peters writes at The Verge ...

The reason (the new Macs) can natively run iOS apps is because the new Apple M1 is based on the Arm instruction set, just like your smartphone, instead of the x86-64 instructions used in Macs and Windows PCs. But the reverse is also true: we’re currently taking Apple’s word that existing Mac apps will work well when they don’t run natively. Yesterday’s was the second presentation in a row where we saw canned demos and unlabeled graphs instead of actual benchmarks and performance comparisons.

https://www.theverge.com/2020/11/11/21559515/apple-silicon-arm-computer-m1-chip-transition-microsoft-surface-rt

I apply what I call the "ASOEE" rule in these cases; viz. don't buy the latest version of anything computer related, as there are usually kinks. With new Apple products interoperability issues with non-Apple hardware and software often arise. I learned my lesson the expensive way after my main and business critical Konica Minolta printing unit (i.e. not photo printer) could not be used with a new Apple acquisition for over 7 months when I upgraded in disregard of ASOEE (and my work-around defeated the purposes of the upgrade anyway). So I recommend you wait, if you can, until at least the second generation (or more) of M1 chipped Macs for any kinks to be resolved ASOEE (i.e. at someone else's expense) before you buy in.

I do agree that this whole chip thing basically boils down to Apple's drive to control everything in its product line. Its constant (and in my opinion largely unnecessary) updates further that drive -- they make it really hard to have a stable relationship between a non-Apple piece of software and an updated Mac. As a novelist, I recently ran into a new bug, created because Apple's Pages is not completely compatible with Word for Mac anymore. It's too complicated to explain here, but it nearly drove me crazy, created an immense amount of extra work for me on deadline, and has me thinking seriously about moving to Windows. The problem there is, Windows has always baffled me. And it's ugly. It's like moving from Nikon to Ricoh. 8-)

A historical note: Apple is nowhere near the "wealthiest and most influential company in human history." I think those honors would have to go to the British East India Company, which once controlled an estimated half of all world trade, and basically owned India, with a private army of 260,000 men; or perhaps the Dutch East India Company, with a net worth that in modern terms would amount to about $7.9 trillion.

Many claims for computer performance are similar to the claims for cameras - performance is more than megahertz and megapixels...

Moore's law still holds and integration allows more functions on smaller chips that use less power. Integrating on a single chip means less board space too.

Another thing the chip designer can do is optimize the hardware/software operations leading to efficiency, speed and reliability.

But this is a terrible time to get me to say something nice about Apple. Fixing bugs caused by their upgrades has cost me most of my time for two weeks. None of what happened was unique to me; searches showed plenty of other users had the same problem. Lots of suggestions on how to fix it, but nothing worked except wiping and starting over.

Total time to date: about 50-60 hours.

So I guess I'm not really enthusiastic about brand new Macs right now.

Advice: Remember to keep backups - several, and in several formats.

JH - computing since IBM computers used vacuum tubes.

I’m waiting to hear what Joe Holms has to say, but having control of both the operating system and the CPU means that you can do things that aren’t otherwise possible. Having an 8‑core CPU, 8-core GPU, and 16‑core Neural Engine sharing 16GB of RAM, with Apple supplying developer tools to exploit that, is exciting.

I fondly remember developing software on the Amiga, where you could xor 16 meg of data in one clock tick with the blitter chip in 1986, when that was a big deal. App-specific hardware is a wonderful thing.

The real win for Apple is that, by providing both tools and hardware, there will be applications that can only run on the Mac, while the Mac will still run other software from legacy x86 OSes.

The real question is: what’s left to do on computers that they don’t do well enough now, other than the same stuff they do now but with bigger data, faster?

Mea culpa. John is quite correct that the Dutch East India Company was the wealthiest company in human history and, in even conservative modern normalized terms, had coffers stuffed many times the size of Apple’s.

The early benchmarks say that these are the fastest Macs ever at single-threaded jobs. Things get more problematic once you start using multiple threads and the GPUs. There's also the issue of RAM being maxed out at 16GB because it's built into the M1 chip itself. For simpler, less memory-intensive tasks, these Macs fly in a league of their own. Run Photoshop with multiple layers on a large file with filters that call the GPU, though, and the Intel+Radeon Macs are better, particularly if you've stuffed them with memory.

A few other comments to the comments:
* Yes, you want the MacBook Pro 13" if you're going to run sustained tasks that max out the CPU. That's because the fan lets it sustain peak speeds longer.
* If you're running legacy software that isn't being updated, your mileage may vary. Adobe will have an M1-enabled version of Lightroom shortly, and then we'll be able to do some real side-by-side comparisons.
* People are forgetting that Apple lent developers Apple Silicon Macs for the last few months. I suspect we're going to see fairly rapid software evolution once the final M1 Macs ship.
* Pages isn't a good substitute for Word on the Mac, and never has been. The gold standard for that is Nisus Writer Pro. And Nisus Writer Pro shows you how much Microsoft has ignored in Word, particularly when it comes to really big documents.
* As I wrote on my site: these are LOW-END Macs, and they're faster than a lot of high-end Macs. We're all waiting to see just what happens when Apple goes upscale with their chips, because what just happened to the MacBook Air and low-end Mini is pretty spectacular.

The new Mac Mini interests me most because I can repurpose the many storage drives and other external components currently attached to my aging iMac. I would also attach a fine color-critical NEC monitor.
That all said, I'm going to wait to see how the re-platformed Lightroom runs on these Apple Silicon models. I'm willing to wait a year for higher-end Macs to be released if these entry-level models aren't muscular enough.
We will see significant convergence with iOS devices in 1.5-2 years, when 5G is included, with the MacBook Pro models to start.

It can add. Your computer can add. It can make decisions depending on the result of an addition. Your computer can do this too. Perhaps it will be faster than your computer (but since it can have only 16GB of memory, it will not be faster for demanding tasks unless your computer is old and tiny: my MacBook from 2013 also has 16GB, such is progress). It will not be faster if you must run old binaries under emulation, so better have current subscriptions to Photoshop etc. If you spend a lot of time waiting for computations (not for your network), being faster may matter to you. It may use less power than the computer you have, and this may matter if you have a laptop and use it away from power a lot.

But let us jump up and down and be excited: it is a shiny new toy.

Addendum: Apple products that use the iOS, iPadOS, tvOS, watchOS, and bridgeOS operating systems have been colloquially referred to as iDevices.

Jailbreaking is when a user modifies, or patches, the iDevice's system to gain access to resources that are not usually made available to the user. Many use jailbreaking to install apps that are not available from Apple's App Store.

The reliability and security of jailbroken devices can be viewed as questionable since the patching process could also allow third-parties the ability to execute malicious code.

[Thanks Alex. --Mike]

I've got some experience working with ARM chips, emulation, software - that sort of stuff. And I have two reactions to these announcements: they are neat computers that bridge the lightweight, low-power use of iPads to the more flexible macOS use; and never buy the first version! Or never buy it with your own money if you're a middle-income earner; the first version always has quirks.

I think there's interesting future potential here, and we should even see lower costs if the competition can push Apple that way. But the software especially is new, and many apps won't work well, so I'd wait a bit and see.

The M1 is a seismic shift that goes way beyond Apple. The MacBook Air now matches or beats a MacBook Pro 16" with an Intel i9 in performance! Let that sink in.

The three computers that now have the M1 make up more than 90% of Mac sales. In terms of sales volume, the transition is almost over already.

There are many reasons for that, among them the astonishing fact that where Intel can place one transistor, Apple can place four. AMD as well, but they have to carry the x86 baggage.

The MacBook Air with an M1 is cheaper and much faster than a Microsoft Surface Laptop. And is silent. And has way better battery life.

All that at a time when Windows compatibility is rapidly becoming less important due to the move to web-based apps and remote Windows apps hosted in Azure or AWS that can be used with a browser, an iPad, a Mac or an Android tablet.

Apple just showed how uncompetitive Intel has become, extending that to the whole PC market. Now either Microsoft gets their act together regarding Windows on ARM or life will become pretty hard for several PC builders who will only have the low-end junk to sell with very little margins.

TidBITS is still around? I used it to try to dial into the Hong Kong Supernet in 1993(?), and I was told I was the second Mac user there connected to the world, using the TidBITS book and, more importantly, its floppy disk. No one knew how to do it then, I was told, and you two sorted it out somehow. Most of the instructions were for PCs using US Robotics modems.

Btw, https://kosmofoto.com/2020/11/camera-used-to-take-1950s-candids-of-marilyn-monroe-goes-on-sale/?fbclid=IwAR0PofldW073kyGd1Rkac6VVJ2XPTmR407DJBpmsBUNE7wvJJBfYlE9EVk0 - does anyone know this American camera? I've never seen it before. Not a Leica clone?

My gripe with Mac (and Windows) is stuff like this: https://sneak.berlin/20201112/your-computer-isnt-yours/

There is only a small selection of good photo editors on Linux, but at least I am not paying a lot of money for a computer that spies on me...

Robert Hudma writes: "If your current MacBook Pro or Air has a failure, your data on the SSD is essentially not recoverable since it is soldered on the logic board." It's astonishing to me that people still keep important data on un-backed-up devices. SSDs fail just like other components, so if you're relying on repairability as a strategy, you WILL lose data. The only question is when. I have both a local backup drive and cloud backup for my photo archives, and two distinct off-site cloud backups with two separate providers for all important files.

As to repairability, that's been an issue for decades. Back around 1970 the diaphragm in my Chevy's fuel pump broke, disabling the pump. The pump was held together with maybe a dozen or so screws that enabled you to replace the diaphragm. But diaphragms were no longer available. I had to replace the entire pump, with all those screws replaced by a crimped edge. What were the trade-offs? Reduced manufacturing costs versus increased repair costs. Soldered-together motherboards have trade-offs too. Connectors are a hot spot for failures; soldering increases reliability. Soldering reduces size and weight as opposed to using connectors that would permit changing out the SSD or the CPU. You may or may not like the trade-offs, but there's something in it for the consumer as well as for the manufacturer.

Miss the chime? It's back, as an option, in Big Sur. You can toggle it on and off through the "Sound" preference.

The comments to this entry are closed.
