"The idea of Apple existing at the intersection of technology and liberal arts was central to the late Steve Jobs’ conception of Apple and, without question, a critical factor when it came to Apple’s success: at a time when technology was becoming accessible to consumers and their daily lives Apple created products — one product, really, the iPhone — that appealed to consumers not only because of what it did but how it did it.
"That said, it was telling that this artwork and the sentiment it signified were not referenced in the keynote itself; after a humorous skit about a world without apps, Tim Cook delivered platitudes about how Apple and its developers were on a “collective mission to change the world”, and immediately launched into what he said were six important announcements. It was not dissimilar to Sundar Pichai’s opening at Google I/O: when the announcements that matter are grounded in the realities of a company’s core competencies and position in the market, vision can feel extraneous."
Ben Thompson, Stratechery
"There’s a story told of the theoretical physicist Wolfgang Pauli: a friend showed him the paper of a young physicist that he suspected was not very good, but on which he wanted Pauli’s views. Pauli remarked sadly, “It is not even wrong.” For a theory even to be wrong, it must be predictive and testable and falsifiable. If it cannot be falsified - if it does not make some prediction that could in theory be tested and proven false - then it does not count as science.
"I've always liked this quote in its own right, but it's also very relevant to talking about new technology and the way that people tend to dismiss and defend it. For as long as people have been creating technology, people have been saying it'll never amount to anything. As we create more and more - as 'software eats the world', the urge to dismiss seems only to get stronger, and so does the urge to defend. However, these conversations tend to follow a fairly predictable sequence, and quickly become unhelpful:
- That’s just a toy
- Successful things often started out looking like toys
- That’s just survivor bias - this one really is a toy
- You can't know that
- So tech is just a lottery?
"The problem with both of these lines of argument is that they have no predictive value. It is unquestionably true that many of the most important technology advances looked like toys at first - the web, mobile phones, PCs, aircraft, cars and even hot and cold running water at one stage looked like faddish toys for the rich or the young. Even video games, which literally are toys, are also largely responsible for the GPUs that now power the take-off of machine learning. But it's also unquestionably true that there were always lots of things that looked like toys and never did become anything more. So how do we tell? Is it that 'toys' occasionally turn into something else through some unpredictable chance? Do we throw up our hands and shrug? William Goldman famously said of Hollywood “Nobody knows anything”, but that feels like an abdication of reason and judgement. We should try to do better."
Benedict Evans, ben-evans.com
"On Monday, the most awaited and rumored device of Apple’s developer conference was finally announced as the last item of an over-two-hour-long keynote: HomePod.
"A little later in the day, in a room that is probably as large as my family room at home, I had the opportunity to listen to HomePod and compare its performance to an Amazon Echo and a Sonos Play 3. I listened to five songs across the three devices: Sia’s “The Greatest,” “Sunrise” by Norah Jones, “Superstition” by Stevie Wonder, “DNA” by Kendrick Lamar and a live performance of The Eagles’ “Hotel California.” The sound coming from HomePod was crisper and the vocals clearer than the Sonos. The comparison with Echo was the harsher of the two. No matter where I stood in the room, the music sounded great. What I did not get to do was talk to Siri! Even the demo was run from an iPad, which would imply there is Bluetooth support with HomePod."
Carolina Milanesi, Tech.pinions
"I have generally posted about things that I have been directly involved with — either code I wrote or projects I managed. In this post I am taking a different tack and writing about my perspective on the underlying causes of the Windows Vista (codename Longhorn) debacle. While this happened over a decade ago, it was a crucial period in the shift to mobile and had long-running consequences internally at Microsoft. I have found many of the descriptions of Microsoft’s problems, especially around the shift to mobile, to be unconvincing and not to mesh with my understanding or experience of what went wrong. Vanity Fair’s article Microsoft’s Lost Decade ascribed it to bureaucratic rot and infighting (“life … had become staid and brutish”) or to culture rot driven by the negative effects of a competitive stack-ranking evaluation system. A more recent article in The Atlantic describes it as a classic “Innovator’s Dilemma” story."
Terry Crowley, Hackernoon
"For some, Apple’s WWDC keynote event went as they hoped, with the company introducing exciting new products and technologies that hit all the sweet spots in today’s dramatically reshaped tech environment. Augmented reality (AR), artificial intelligence, smart speakers, digital assistants, convolutional neural networks, machine learning and computer vision were all mentioned in some way, shape or form during the address.
"For others, the event went like they expected, with Apple delivering on virtually all the big rumors they were “supposed” to meet: updated Macs and iPads, a platform for building AR apps on iOS devices, and a Siri-driven smart speaker."
Bob O'Donnell, Tech.pinions