There are times I miss Ctein, and this is one.
My first Mac was...well, the first Mac, the 128k Macintosh (as it was called then) of 1984, that Orwellian year. I also had an original ImageWriter, Apple's first dot-matrix printer for the Macintosh. I bailed early on the Macintosh Classic, which was a pricing breakthrough but just too limited for its time. I replaced it with an early Ur-laptop, then called a "portable," the PowerBook 160 of 1992. I loved that thing, and clung to it for too long. Still have it, actually. I worked for years putting magazines together on a super-reliable Quadra 605 (with a black-and-white monitor, no joke). I jumped on board with the original Bondi Blue iMac G3 (the "Blueberry" and other candy colors came a bit later), Steve Jobs' first consumer product after his (Messianic? MacArthurian?) return to Apple in '97. That one ate a CD and wouldn't give it up. I practically had to take a crowbar to it. And so on. All in all, between school, work, and home, I think I've owned or been assigned to work on a total of about 17 Macs over the years. Maybe 18. I've kinda lost count, if I'm honest.
But I'm not a computerphile. Never got into it. Never subscribed to any of those Mac magazines back in the '80s and '90s (Macworld, MacUser). Never been an enthusiast. I used computers for creative work, but just as tools. Never knew much about it, although I probably know more than I think I do after all these years. In other words, I was pretty much "that guy" the Macintosh was aimed at in the first place.
So I hope the real Mac geeks amongst ye will pitch in and help me out here. I don't have the deep chops to put the new Apple Silicon M1 chip into perspective. I watched the whole Event this time, and some impressive claims are being made—
Packed with an astonishing 16 billion transistors, the new M1 chip integrates the CPU, GPU, Neural Engine, I/O, and so much more onto a single tiny chip. Combined with the new macOS Big Sur, M1 delivers category-smashing speed, mind-bending graphics, and power efficiency and battery life that defy belief.
—But where on the spectrum between truth and marketing hyperbole do those claims fall? I wouldn't know.
Ctein is probably all over it.
Anyway, here are some pre-order links to the new products with the M1 chip:
Late 2020 13.3" MacBook Pro with Retina Display
Late 2020 13.3" MacBook Air with Retina Display
And here's B&H's Explora Page for the new M1 products if you want to read what they have to say.
Looking into my crystal ball, some future M1 iMac will probably be my next computer. What can I say? At this point it's just tradition.
Mike
Original contents copyright 2020 by Michael C. Johnston and/or the bylined author. All Rights Reserved. Links in this post may be to our affiliates; sales through affiliate links may benefit this site. As an Amazon Associate I earn from qualifying purchases.
(To see all the comments, click on the "Comments" link below.)
Featured Comments from:
Eolake Stobblehouse: "I find that the most sober, precise, and readable assessments are found on TidBITS.com."
Mike replies: TidBITS suggests that the one to choose might be the M1 MacBook Air, because it has no fan and because the two M1 MacBook Pros might not offer enough of a performance premium. (The two higher-level MacBook Pros retain their Intel chips, presumably because the M1 chip is limited to 16 GB of RAM, and perhaps because power users are less likely to want to be forced into macOS 11 right away.) Personally I like the fact that the Air finally dumps the unsatisfying butterfly keyboard—never liked that—and brings back the traditional Apple startup chime. Always liked that chime. :-)
Kenneth Tanaka: "I'm a long-time devoted Mac user, although I don't consider myself a 'geek.' But I don't think you really need to be a techie to understand this move. It's all about power and control.
"Yes, I'm certain that Apple's new processor brings advancements to the party. Power management, for example, has been one of the big issues since computers became predominantly batter-powered. The M1 claims to offer benefits here. And since gaming has also become a big priority for customers 3-D graphics processing has also taken a front seat with the new M1 processor.
"But ever since Apple left the PowerPC platform for Intel's platform 10+ years ago the writing seemed engraved on the wall; it would be just a matter of time before Apple would want to completely control (i.e. manufacture) its own CPUs and associated chip sets. To use a weak theoretical automotive analogy, how long would General Motors buy all their engines from Toyota? Since the time of that Intel partnership Apple's value has increased exponentially. They are now the wealthiest and most influential company in human history. So the move into the 'M1' was inevitable, even if it delivered no measurable performance advancements (which is possible).
"From a practical consumer perspective look for some relative bargains in Apple's Intel-based computers early next year as retailers try to clear inventories."
Richard T: "Jason Snell is the former editor of Macworld magazine and for several years now runs his own website at Sixcolors.com—excellent coverage that's also quite accessible. For old-school Mac users like Mike (or those wanting to learn more about Apple history!), Jason has started a great YouTube/article/podcast series called '20 Macs for 2020.' Includes wonderful articles/videos/podcasts. See here for the current list."
"...And to address the original question: yes, these chips appear to be extremely impressive. But note that Apple has announced a two-year transition window, so, as usual for tech products, it pays to wait if you have no pressing need to upgrade. The current M1-series-based lineup is very traditional (essentially identical hardware compared to the Intel equivalents) and future products (e.g. iMacs) may also provide hardware innovations."
Alex Mercado: "It boils down to latency and security—or in more pragmatic words, speed and control.
"The M1 is an integrated circuit (a system on a chip, or SoC) that incorporates much of the necessary components of the computer. Since it is integrated, the data transfer speeds amongst those components are significantly faster than on a traditional motherboard. As an analogy, think of SoC as a Costco, while a motherboard is akin to a shopping mall. You can find almost everything in the Costco rather quickly compared to roving a mall.
"As for security, Apple now controls the microcode the transistors on the SoC use to perform. Apple had to wait for Intel to create patches to address the Meltdown and Spectre threats that affected the CPUs a few years back, which, arguably, left Apple and their users susceptible to security threats.
"The M1 is completely managed by Apple, which significantly limits access to third parties. For comparison, iPod touch, iPhone, and iPad already work this way because Apple controls the A-series SoC in those devices.
"Yes an iDevice can be jailbroken, but its reliability and security are less than optimal for the average non-enthusiast. [Ed. does not understand this sentence. —Ed.]
"The marketing hyperbole for the M-series has much truth to it."
Robert Hudyma: "Be aware that Apple has been moving to control its entire Ecosystem slowly but surely. Saying good-bye to Intel and hello to ARM is part of that strategy.
"Current products are not upgradable or repairable by end-users or independent repair facilities.
"If your current MacBook Pro or Air has a failure, your data on the SSD is essentially not recoverable since it is soldered on the logic board.
"Also, the new macOS Big Sur is not supported on most Apple products older than eight years old so you can no longer keep your Operating System current.
"Another example is the new iPhone 12: individual modules such as the camera or battery are not end-user replaceable since they are serialized to each device and only Apple has the magic wand to make it work again.
"I think the right to repair is important and I am not purchasing any technology that cannot be repaired by me or a qualified technician. Louis Rossman has a popular YouTube channel that focuses on Apple product repairs. He is informative, outspoken and knowledgeable. He is worth a watch."
Mike asks: Cameras and cars too, etc.? Because 'right to repair,' as you call it, is getting a lot rarer everywhere, in products of all sorts, right? I learned last year that I don't have RtR for my five-year-old clothes washing machine, for example. A broken on-off switch(!) will total the motherboard, which will total the whole machine.
Steve C: "Mike—I think, given how close the specs are, the most notable difference between the Air and Pro laptops is that the Air will probably have very similar peak performance to the Pro, but won't be able to sustain it for anywhere near as long without fans.
"@Robert Hudyma—right [to] repair is a concept I haven't thought much about but is really interesting. The usual thinking on the business side is the trade-off between modularity and integration. Integrated product/systems tend toward higher performance (and higher price), modular ones towards low cost. Alongside that, modular designs are inherently more repairable.
"It's the integration between components, their housings, the hardware and software (and now services) that makes Apple products unique. Ecosystem control is a benefit to both the company, and to users who don't value repairability as much as they do convenience and performance.
"The services approach to software, where everything is being updated all the time, that so many people hate is also what allows the company to stay ahead of the field. It prevents them from becoming weighted down by an ever-accumulating legacy of old versions, software interfaces, methods, system quirks, and security issues.
"The down side is that one day your otherwise perfectly fine device / software is no longer supported, and that sucks. The up side is that the company gets to keep pressing forward at speed. They could choose not to (which I expect many people here would love—myself included many days), but the press itself is relentless. Another company will simply take up the torch.
"Microsoft suffered badly under the weight of backward compatibility (between Windows and its many associated applications, drivers, systems) for many, many years. It's not a mistake, but it's a choice that has clear consequences.
"In slower-moving industries / technologies / products (hello white-goods!), integration around key parts is probably very deliberately about creating new purchases through planned failure and obsolescence. But Apple's desire to differentiate their whole multi-product, multi-service user experience such that each product makes the others better, kinda requires integration. I doubt they could wring such high size / performance from their camera system without it."
Steve C: "For anyone who's interested and moderately technically-minded, this piece on Apple's M1 and the A-series chips that lead to it is really impressive. The desire, vision, strategy, organisational focus and multi-year execution needed to run down a goal and outcome like this, in this space, is hard to over-estimate. To give you an idea of scale, the R&D investment that preceded the A-series semiconductor fabrication was in the order of building an aircraft carrier, or a New York City block every quarter."
Steve C: "One final comment on this—I promise—about the limited RAM. I've read in several places now, speculation that the combination of massive advances in the system-on-a-chip latency, and the integrated high-end SSDs make RAM much less of a bottleneck. Modern SSD memory is faster than RAM from only a few years ago. Storage and 'memory' are converging."
Andrew: "Shout out also to Google's Pixel line which has been blazing the computational photography route as well—what those phones achieve with relatively humdrum lenses and sensors shows, as Richard T commented, how much of the heavy-lifting can be done by the software these days. (And this isn't to invoke a Google/Apple, Android/iOS flame war—the best from both is seriously impressive.)
"Personally I loved the brief period just before the software took over from the hardware—fantastically-engineered phones like the 41MP Nokia 1020 and 1-inch-sensored Panasonic CM-1 have (mostly) been outclassed now, but how they went about their imaging business was a lot more on show than today."
Thom Hogan (partial comment): "As I wrote on my site: these are low-end Macs, and they're faster than a lot of high-end Macs. We're all waiting to see just what happens when Apple goes upscale with their chips, because what just happened to the MacBook Air and low-end Mini is pretty spectacular."
[For the full text of Partial Comments, see the full Comments section. —Ed.]
I find that the most sober, precise, and readable assessments are found on TidBITS.com.
Eolake
Posted by: Eolake Stobblehouse | Wednesday, 11 November 2020 at 08:27 AM
The main thing with this new system chip is getting performance comparable to or better than the old Intel CPUs while using much, much less power.
The hardware is derived from the chips that Apple has been making for their mobile devices, which over the last few years have gone from being power-efficient but somewhat slower than "desktop" CPUs to as fast or faster than regular CPUs while still being power-efficient.
I think for most normal users, once this is all baked, you'll notice things are zippier, but the main thing you'll notice is >10 hour battery life in a laptop ... and how the laptop doesn't seem to warm you up as much as your current one does, if you have one.
Posted by: psu | Wednesday, 11 November 2020 at 08:28 AM
Mike, as with computers and cameras and cell phones... only Geeks rush in. Life as a troglodyte has my computer upgrades being dissed because I'm using macOS High Sierra to run my Photoshop CS6 and my Safari is out of date. I am happy with my M10 and older lenses even though I am way behind in the megapixel count, and I use an iPhone 7 that is more than I will ever need. I will eventually have to upgrade my phone when the battery finally becomes too troublesome to deal with, won't change cameras until my eyesight dictates a change to AF, and when I change my computer and operating system I will bury my CS6 and go with something else that doesn't indenture me to Adobe. This is a big change for Apple; wait until the hoopla is over and saner voices have their say. Until then, if you need a new computer, the sales on the older models should start soon!
Posted by: Rick in CO | Wednesday, 11 November 2020 at 08:44 AM
RAM is limited to 16GB because it's on the M1 chip. I have 64GB in my main Win 10 PC - and need it.
Not all existing programs will run under emulation (that's Rosetta 2), and some will take a long time to be recoded.
Some of the performance claims are likely to be marketing hyperbole...
As for the number of 'transistors', 16 billion: my new graphics card has 28 billion.
Oh, and have fun connecting anything to them. I am an MBP, iPad, iPhone user too, so by no means Apple-phobic, and I remember my Quadra with Trinitron monitor very fondly.
Posted by: Trevor Johnson | Wednesday, 11 November 2020 at 09:17 AM
Bottom line: The M1 is just a new, faster-than-ever Mac. That's it.
Do you remember when the Mac abandoned the PowerPC chip for the Intel chip? Not a lot of difference for those of us who just use a Mac; it was just a faster Mac. The Intel-to-M1 conversion will be a lot like that.
For the software geeks who appreciate the finer details of computer architecture (e.g., me), it is a fascinating step forward, lots of reasons to geek-out. But all of this is hidden under the covers.
Several years from now, it is likely that Intel-based applications will stop working. Users of these applications will be forced to upgrade or find alternative applications. We just went through this with macOS Catalina: 32-bit applications no longer work. I'm still grumbling that I was forced to upgrade my ancient but adequate-for-me copy of Photoshop Elements.
Posted by: kevin willoughby | Wednesday, 11 November 2020 at 09:42 AM
Intel processors use enormous amounts of power because the actual binary instructions are so terribly hard to parse and decode. They translate them into an internal instruction set that the processor actually executes.
This is partly a legacy of it being a scale-up of the Intel 8080 CPU used in the first hobbyist computers. The smallest instructions are one byte, but some are two bytes, there are all sorts of override prefixes, etc.
This was an advantage when RAM memory was slow, small, and expensive. The processor spent less time waiting for memory when reading the instructions. With caches, that's now irrelevant. Now it's just a compatibility albatross around Intel's neck. The advantage of x86 chips is that they run Windows.
The ARM processor instruction set isn't a pile of history like the x86 instruction set, and was designed to be efficiently implemented with modern silicon technology and design concepts. This is why it has been such a success in the cellphone market, where the power budget is incredibly tight.
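[Ed. note: to make John's variable-length point concrete, here's a toy Python sketch of mine. The byte values are the standard architectural encodings for each instruction, but everything around them is purely illustrative.

# x86-64 instructions run anywhere from 1 to 15 bytes; the decoder can't
# know where one ends until it has parsed prefixes, opcode, and operands.
x86_64 = {
    "nop":                 bytes([0x90]),              # 1 byte
    "add rax, rbx":        bytes([0x48, 0x01, 0xD8]),  # 3 bytes
    "mov rax, 64-bit imm": bytes([0x48, 0xB8])
                           + (0x1122334455667788).to_bytes(8, "little"),  # 10 bytes
}

# ARM64 instructions are always exactly 4 bytes, so the decoder always knows
# where the next one starts. (Loading a full 64-bit constant takes a movz
# plus three movk instructions, but each of those is still 4 bytes.)
arm64 = {
    "nop":            (0xD503201F).to_bytes(4, "little"),
    "add x0, x0, x1": (0x8B010000).to_bytes(4, "little"),
}

for name, enc in x86_64.items():
    print(f"x86-64  {name:20} {len(enc):2} bytes: {enc.hex(' ')}")
for name, enc in arm64.items():
    print(f"ARM64   {name:20} {len(enc):2} bytes: {enc.hex(' ')}")

That irregularity is why wide, parallel instruction decode is so much cheaper to build on ARM designs like the M1. —Ed.]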
Software availability will be the initial challenge for the M1, just like the PowerPC to Intel transition for Apple. Sure, all of Microsoft's software will quickly be native for M1, since they already support ARM for the cheaper Surface machines. But lots of software will have to run under the Rosetta x86 emulation layer.
Since Apple has to license Rosetta, it will disappear from MacOS in a few years, just like the version for emulating PowerPC did.
I doubt there will ever be a way to run Photoshop CS6 on an M1 machine. Just like you can't run Nikon Scan (PowerPC only) natively on an Intel Mac anymore.
Posted by: John Shriver | Wednesday, 11 November 2020 at 10:33 AM
> Personally I like the fact that the Air finally dumps the unsatisfying butterfly keyboard
Note that all of the current MacBook Air and MacBook Pro models, M1 or not, have a non-butterfly keyboard design, as far as I know. They changed this over the last year or two.
The Pros retain the "touchbar" thingy at the top which people either love, or hate, or just think is kinda useless.
Posted by: psu | Wednesday, 11 November 2020 at 10:49 AM
In support of what psu wrote above, this from AnandTech ...
Apple’s performance trajectory and unquestioned execution over these years is what has made Apple Silicon a reality today. ... there simply was no other choice but for Apple to ditch Intel and x86 in favour of their own in-house microarchitecture – staying par for the course would have meant stagnation and worse consumer products.
https://www.anandtech.com/show/16226/apple-silicon-m1-a14-deep-dive/4
Posted by: Speed | Wednesday, 11 November 2020 at 11:13 AM
The irony, in some ways, is that after years of being a dyed-in-the-wool computer power user, my primary (and, practically speaking, *only*) “computer” now is an iPad Air 3rd gen with the Smart Keyboard cover and a 1st gen Pencil (with separate keyboard and trackpad I can use when I’m at my desk, which is not at all often).
That even includes photo and video work. Between Photos, Files, Pixelmator Photo, Affinity Photo and LumaFusion, plus a nice big hard drive on a network storage device, the iPad has pretty much everything I do there covered. Being able to interact directly with the displayed image just feels right and the screen is the best I’ve ever used, size be damned. I could use more onboard storage, but that’ll be corrected on the next upgrade cycle (lessons learned, and all that).
I’ve made a very sharp turn deep into the Apple world in the last 6 months, but owning an actual Mac is barely on the radar any more. For the handful of tasks I can’t do yet using the iPad, I have my old Win10 laptop stored away; I haven’t powered it up in a couple of months...
It’s still interesting news though, I’d bet the line between iPadOS and MacOS will be getting just a little fuzzier going forward.
Posted by: Paul Glover | Wednesday, 11 November 2020 at 11:16 AM
I have been using MacBook Pros for about a dozen years - but I run Windows because my work software doesn't work with the Mac OS. Bootcamp makes it seamless. I don't see anything about the M1 supporting Windows so I may be done with Macs eventually.
Posted by: Malcolm Leader | Wednesday, 11 November 2020 at 11:20 AM
I remain firmly on the trailing edge of technology. I don't expect to transition to Apple Silicon anytime soon.
I still run an older MacPro that has four internal boot drives ranging from Snow Leopard to Mojave. This allows me to run both current and legacy software. I don't want to give up this "feature" just to have a newer, faster machine.
Because, in my case, the workflow bottleneck is not the computer, it's the operator.
Posted by: DavidB | Wednesday, 11 November 2020 at 11:22 AM
Mike, I am like you: I have used Macs for years and clearly am too old to change. But, there is always a but, I have a five-year-old MacBook Air and an iMac. I love my old MacBook Air. I encouraged my wife to replace her 10-year-old (non-updatable) Mac with a new MacBook Pro. She doesn't like it and I don't either. The keyboard is terrible. My old Air has a great keyboard and I love typing on it. Also mine has all the ports on the side necessary for us photogs. My iMac's keyboard is fine, better than her Pro's but not as good as my Air's. I hope whatever Apple comes out with, they provide a good keyboard. That is a small but important part of a computer, at least in my world. All the best. Eric
Posted by: albert erickson | Wednesday, 11 November 2020 at 11:34 AM
All the available (non-Apple provided) data shows even the iPhone/iPad versions of the chips are faster than the chips we previously had in any of these Macs. That's without even considering these are scaled up in terms of cores and power from the iOS versions.
This is Apple's third CPU transition, and with the big jump in speed, devs were in many cases reporting better performance on the dev kits in emulation than on the older hardware running native code. In any case, Apple's ability to move the developer community is dramatically different now versus back in the PowerPC days. In addition to Microsoft, Adobe has been very aggressive in porting its code stack over to iOS, which lays the foundation for running on Apple Silicon.
*Clarification: Rosetta isn't licensed; it's an Apple in-house technology.
The RAM limitation might be the biggest problem but it's certainly not an Apples-to-(old) Apples comparison even when it comes to RAM.
Posted by: Jandrewyang | Wednesday, 11 November 2020 at 12:19 PM
A performance chart from AnandTech. The Intel CPUs being charted are each year's top-of-the-line. The measure is single-threaded integer performance, which implies its own set of applicability.
https://pbs.twimg.com/media/EmioHgHXcAExFq_?format=png&name=medium
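[Ed. note: "single-threaded integer performance" means one core grinding through a chain of dependent integer operations, with no help from extra cores, the GPU, or much of the memory system. A toy Python sketch of mine showing the shape of such a microbenchmark; real suites like SPECint are vastly more elaborate.

import time

def integer_work(n: int) -> int:
    # A chain of dependent integer operations: one core, no parallelism,
    # little memory traffic, isolating raw per-core integer speed.
    acc = 0
    for i in range(n):
        acc = (acc * 31 + i) & 0xFFFFFFFF
    return acc

start = time.perf_counter()
integer_work(10_000_000)
print(f"{time.perf_counter() - start:.3f} s of single-threaded integer work")

Numbers like these say nothing about multi-core or GPU-bound workloads, which is the "applicability" caveat above. —Ed.]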
Posted by: Jandrewyang | Wednesday, 11 November 2020 at 12:37 PM
It's going to have to be pretty spectacular. I've supported Macs for a university's art and media programs for a while, and I'm sure we'll keep using them there. Works well. But it looks like an $800 to $900 price premium for an M1 Macbook Pro vs. a Thinkpad with some similar specs (16GB of RAM, 512GB SSD, three years of warranty with accidental damage). And the new Macs can't support an external GPU, so there may be some creatives who can't switch right away if the onboard GPU doesn't perform similarly. And it's hard for me to see how the extra $900 adds value as a general purpose computer. I picked up an i7 Thinkpad X390 on sale with similar specs (16/512) with 3 years of warranty for about $1200 last month, and it runs Photoshop great. With a docking station and my BenQ SW240, it's a really nice setup.
I prefer Apple for tablets/phones, but as comfortable as I am with Macs, I don't quite get the appeal vs. the price.
Posted by: Paul Coen | Wednesday, 11 November 2020 at 12:44 PM
Paul Glover:
"I'd bet the line between iPadOS and MacOS will be getting just a little fuzzier going forward."
That has already been announced. You will soon be able to run iOS apps on Apple's ARM-based computers.
Posted by: KeithB | Wednesday, 11 November 2020 at 01:40 PM
I got my first Mac on April 18, 1984. I've been through many Mac iterations and even a brief flirtation with Windows in the 90's.
Recently I wondered about the potential growing pains of a new chipset for the new Mac and decided to get an Intel-based 2020 iMac, all tricked out. No doubt the new chips will become the norm in a few years, but by then I may be ready for a new machine.
I've not forgiven Apple for the design and functional disaster that was the 2013 Mac Pro (trashcan). I paid a lot of money for it and ended up spending even more on lots of peripherals (OWC boxes for hard disks to store my TBs of photos). I recently sold the 2013 Mac Pro; it was working just fine but it never felt right. The 2020 iMac just feels better.
Will the M1 chip run the current Photoshop and Lightroom Classic, and oh, by the way will my 3880 printer still work?
Posted by: Eric Brody | Wednesday, 11 November 2020 at 04:03 PM
Jay Peters writes at The Verge ...
The reason (the new Macs) can natively run iOS apps is because the new Apple M1 is based on the Arm instruction set, just like your smartphone, instead of the x86-64 instructions used in Macs and Windows PCs. But the reverse is also true: we’re currently taking Apple’s word that existing Mac apps will work well when they don’t run natively. Yesterday’s was the second presentation in a row where we saw canned demos and unlabeled graphs instead of actual benchmarks and performance comparisons.
https://www.theverge.com/2020/11/11/21559515/apple-silicon-arm-computer-m1-chip-transition-microsoft-surface-rt
Posted by: Speed | Wednesday, 11 November 2020 at 05:03 PM
I apply what I call the "ASOEE" rule in these cases; viz., don't buy the latest version of anything computer-related, as there are usually kinks. With new Apple products, interoperability issues with non-Apple hardware and software often arise. I learned my lesson the expensive way after my main and business-critical Konica Minolta printing unit (i.e., not a photo printer) could not be used with a new Apple acquisition for over seven months when I upgraded in disregard of ASOEE (and my work-around defeated the purposes of the upgrade anyway). So I recommend you wait, if you can, until at least the second generation (or more) of M1-chipped Macs, for any kinks to be resolved ASOEE (i.e., at someone else's expense), before you buy in.
Posted by: Michael Bearman | Wednesday, 11 November 2020 at 07:34 PM
I do agree that this whole chip thing basically boils down to Apple's drive to control everything in its product line. Its constant (and in my opinion largely unnecessary) updates further that drive -- they make it really hard to have a stable relationship between a non-Apple piece of software and an updated Mac. As a novelist, I recently ran into a new bug, created because Apple's "Pages" is not completely compatible with Word for Mac anymore. It's too complicated to explain here, but it nearly drove me crazy, created an immense amount of extra work for me on deadline, and has me thinking seriously about moving to Windows. The problem there is, Windows has always baffled me. And it's ugly. It's like moving from Nikon to Ricoh. 8-)
A historical note: Apple is nowhere near the "wealthiest and most influential company in human history." I think those honors would have to go to the British East India Company, which once controlled an estimated half of all world trade and basically owned India, with a private army of 260,000 men; or perhaps the Dutch East India Company, with a net worth that in modern terms would amount to about $7.9 trillion.
Posted by: John Camp | Wednesday, 11 November 2020 at 08:04 PM
Many claims for computer performance are similar to the claims for cameras - performance is more than megahertz and megapixels...
Moore's law still holds and integration allows more functions on smaller chips that use less power. Integrating on a single chip means less board space too.
Another thing the chip designer can do is optimize the hardware/software operations leading to efficiency, speed and reliability.
But this is a terrible time to get me to say something nice about Apple. Fixing the bugs caused by their upgrades has cost me most of my time for two weeks. None of what happened was unique to me; searches showed plenty of other users had the same problem. Lots of suggestions on how to fix it, but nothing worked except wiping and starting over.
Total time to date: about 50-60 hours.
So I guess I'm not really enthusiastic about brand-new Macs right now.
Advice: remember to keep backups - several, and in several formats.
JH - computing since IBM computers used vacuum tubes...
Posted by: JH | Wednesday, 11 November 2020 at 10:14 PM
I'm waiting to hear what Joe Holmes has to say, but having control of both the operating system and the CPU means that you can do things that aren't otherwise possible. Having an 8-core CPU, 8-core GPU, and 16-core Neural Engine sharing 16 gig of RAM, with Apple supplying developer tools to exploit all that, is exciting.
I fondly remember developing software on the Amiga, where you could XOR 16 meg of data in one clock tick with the blitter chip in 1986, when that was a big deal. App-specific hardware is a wonderful thing.
The real win for Apple is that by providing tools and hardware, there will be applications that can only run on a Mac, while the Mac still runs other software from legacy x86 OSs.
The real question is: what is left for computers to do that they don't do well enough now, other than the same stuff they do now but with bigger data, faster?
Posted by: hugh crawford | Wednesday, 11 November 2020 at 11:39 PM
Mea culpa. John is quite correct that the Dutch East India Company was the wealthiest company in human history and, in even conservative modern normalized terms, had coffers stuffed many times the size of Apple’s.
Posted by: Kenneth Tanaka | Thursday, 12 November 2020 at 07:34 AM
The early benchmarks say that these are the fastest Macs ever at single thread jobs. Things get more problematic once you start using multiple threads and the GPUs. There's also the issue of RAM being maxed out at 16GB because it's built into the M1 chip itself. For simpler, less memory intensive tasks, these Macs fly in a league of their own. Run Photoshop with multiple layers on a large file with filters that call the GPU, well, the Intel+Radeon Macs are better, particularly if you've stuffed them with memory.
A few other comments to the comments:
* Yes, you want the MacBook 13" if you're going to run sustained tasks that max out the CPU. That's because the fan makes it so that it sustains peak speeds longer.
* If you're running legacy software that isn't being updated, your mileage may vary. Adobe will have an M1-enabled version of Lightroom shortly, and then we'll be able to do some real side-by-side comparisons.
* People are forgetting that Apple lent developers Apple Silicon Macs for the last few months. I suspect we're going to see fairly rapid software evolution once the final M1 Macs ship.
* Pages isn't a good substitute for Word on the Mac, and never has been. The Gold Standard for that is NisusWriter Pro. And NisusWriter Pro shows you how much Microsoft has ignored in Word, particularly when it comes to really big documents.
* As I wrote on my site: these are LOW-END Macs, and they're faster than a lot of high-end Macs. We're all waiting to see just what happens when Apple goes upscale with their chips, because what just happened to the MacBook Air and low-end Mini is pretty spectacular.
Posted by: Thom Hogan | Thursday, 12 November 2020 at 08:13 AM
The new Mac Mini interests me most because I can repurpose the many storage drives and other external components currently attached to my aging iMac. I would also attach a fine color-critical NEC monitor.
That all said, I'm going to wait to see how the newly ported Lightroom runs on these Apple Silicon models. I'm willing to wait a year for higher-end Macs to be released if these entry-level models aren't muscular enough.
We will see significant convergence with iOS devices in 1.5-2 years when 5G is included, starting with the MacBook Pro models.
Posted by: Michael Elenko | Thursday, 12 November 2020 at 11:45 AM
It can add. Your computer can add. It can make decisions depending on result of addition. Your computer can do this too. Perhaps it will be faster than your computer (but since it can have only 16GB memory it will not be faster for demanding tasks unless your computer is old and tiny: my macbook from 2013 also has 16GB, such is progress). It will not be faster if you must run old binaries under emulation, so better have current subscriptions to Photoshop etc. If you spend lot of time waiting for computations (not for your network) being faster may matter to you. It may use less power than the computer you have, and this may matter if you have a laptop and use it away from power a lot.
But let us jump up and down and be excited: it is a shiny new toy.
Posted by: Zyni Möe | Thursday, 12 November 2020 at 01:29 PM
Addendum: Apple products that use the iOS, iPadOS, tvOS, watchOS, and bridgeOS operating systems have been colloquially referred to as iDevices.
Jailbreaking is when a user modifies, or patches, the iDevice's system to gain access to resources that are not usually made available to the user. Many use jailbreaking to install apps that are not available from Apple's App Store.
The reliability and security of jailbroken devices can be viewed as questionable since the patching process could also allow third-parties the ability to execute malicious code.
[Thanks Alex. --Mike]
Posted by: Alex Mercado | Thursday, 12 November 2020 at 02:01 PM
I've got some experience working with ARM chips, emulation, software - that sort of stuff. And I have two reactions to these announcements: they are neat computers that bridge the lightweight, low-power use of iPads to the more flexible MacOS use; and, never buy the first version! Or never buy with your own money if you're a middle-income earner; the first version always has quirks.
I think there's interesting future potential here and we should even see lower costs if the competition can push Apple to do that. But especially the software is new and many apps won't work well, so I'd wait a bit and see.
Posted by: Oskar Ojala | Thursday, 12 November 2020 at 07:07 PM
The M1 is a seismic shift that goes way beyond Apple. Now the MacBook Air matches or beats the performance of a MacBook Pro 16" with an Intel i9! Let that sink in.
The three computers that now get the M1 account for more than 90% of Mac sales. In terms of sales volume, the transition is almost over already.
There are many reasons for that, among them the astonishing fact that where Intel can place one transistor, Apple can place four. AMD can as well, but they have to carry the x86 baggage.
The MacBook Air with an M1 is cheaper and much faster than a Microsoft Surface Laptop. And is silent. And has way better battery life.
All that at a time when Windows compatibility is rapidly becoming less important due to the move to web-based apps and remote Windows apps hosted in Azure or AWS that can be used with a browser, an iPad, a Mac or an Android tablet.
Apple just showed how uncompetitive Intel has become, and that extends to the whole PC market. Now either Microsoft gets their act together regarding Windows on ARM or life will become pretty hard for several PC builders, who will only have the low-end junk to sell, with very slim margins.
Posted by: Stéphane Bosman | Friday, 13 November 2020 at 08:46 AM
TidBITS is still around? I used them to try to dial into the Hong Kong supernet in 1993(?), and I was told I was the second Mac user there connected to the world, using the TidBITS book and, more importantly, its floppy disk. No one knew how to do it then, I was told, and you two sorted it out somehow. Most instructions were for PCs using US Robotics modems.
BTW, anyone know this American camera? Never seen this before. Not a Leica clone? https://kosmofoto.com/2020/11/camera-used-to-take-1950s-candids-of-marilyn-monroe-goes-on-sale/?fbclid=IwAR0PofldW073kyGd1Rkac6VVJ2XPTmR407DJBpmsBUNE7wvJJBfYlE9EVk0
Posted by: Dennis ng | Friday, 13 November 2020 at 09:26 PM
My gripe with Mac (and Windows) is stuff like this: https://sneak.berlin/20201112/your-computer-isnt-yours/
There is only a small selection of good photo editors on Linux, but at least I am not paying a lot of money for a computer that spies on me...
Posted by: Christoph | Sunday, 15 November 2020 at 12:18 PM
Robert Hudyma writes: "If your current MacBook Pro or Air has a failure, your data on the SSD is essentially not recoverable since it is soldered on the logic board." It's astonishing to me that people still keep important data on un-backed-up devices. SSDs fail just like other components, so if you're relying on repairability as a strategy, you WILL lose data. The only question is when. I have both a local backup drive and cloud backup for my photo archives, and two distinct off-site cloud backups with two separate providers for all important files. As to repairability, that's been an issue for decades. Back around 1970 the diaphragm in my Chevy's fuel pump broke, disabling the pump. The pump was held together with maybe a dozen or so screws that enabled you to replace the diaphragm. But diaphragms were no longer available. I had to replace the entire pump, with all those screws replaced by a crimped edge. What were the trade-offs? Reduced manufacturing costs versus increased repair costs. Soldered-together motherboards have trade-offs too. Connectors are a hot spot for failures; soldering increases reliability. Soldering reduces size and weight as opposed to using connectors that would permit changing out the SSD or the CPU. You may or may not like the trade-offs, but there's something in it for the consumer as well as for the manufacturer.
Posted by: Bill Tyler | Sunday, 15 November 2020 at 02:10 PM
Miss the chime? It's back, as an option, in Big Sur. You can toggle it on and off through the "Sound" preference.
Posted by: Bob Curtis | Tuesday, 17 November 2020 at 09:24 AM