If “gushed” seems derogatory, it’s not meant to be: Sinofsky, who knows a thing or two about shipping product, waxed eloquent on the topic in a long series of tweets. It’s worth reading his views on how unprecedented Apple’s execution has been.
What is it about Apple that causes Sinofsky to declare, “Under the hood, [Apple product engineering] is a team that over time has done more and executed better than any I can name, ever”? That’s incredibly high praise, especially given the source. Sinofsky homes in on three things that Apple has done remarkably well:
Fearless multi-year strategy
Clear unified planning/prioritization
Wildly unprecedented execution
Take the first one to start. Every company of size produces a multi-year strategy. Few, however, hold to those multi-year projections beyond the immediate fiscal year, as Sinofsky points out. In Apple’s case, its announced two-year move to ARM-based architectures for Macs is tremendously aggressive, and dangerous as well. In Sinofsky’s words:
First, that’s like no time at all. Second, that’s an incredibly long time to tell everyone how long it will take and that they should be patient. Seriously. But really that is incredibly brave when so much could potentially change, more importantly could go wrong….The big thing about this is how Apple’s overall model…enables this to work. Every aspect of the system has to come together to create an environment where choices can be made AND supported that allow these plans to have integrity.
It’s precisely because Sinofsky has attempted to do this at a company known for solid execution that he’s well-positioned to comment on Apple’s achievement. Microsoft’s deliveries of various Windows operating systems, he related, were constantly (and consistently) delayed. As for Microsoft’s shift to 64-bit? “It took 20 years for it to happen…[and] still isn’t done.”
This isn’t to demean Microsoft. Far from it: The company has a different customer base and has tended to prioritize backward compatibility over most other things. “We didn’t do anything wrong,” Sinofsky said. Even so, this commitment to easing customers forward, while preserving their ability to take a long time to get there, he said, “does…make it much less interesting/important for customers to move forward with you.”
It’s not about agile, scrum, or waterfall
How does Apple pull this off? According to Sinofsky, it’s by fixating everyone across a massive engineering organization (he estimates it at 20,000) on the whole, rather than the parts: “It is incredibly clear that everyone at Apple puts strategy requirements above anything ‘local’. When you wonder why there isn’t more new in Notes or why Mail is missing stuff it’s because supporting a multi-year strategy trumps individual teams and that’s a good thing.”
Microsoft’s approach, by contrast, wasn’t worse, he stressed, but very different. “Microsoft operated much more locally and hence was far more resilient, in many other businesses, and served many different customer types. Some would even say [it was] more responsive to customers.”
It is this laser-sharp focus on a holistic strategy that makes Apple unique, Sinofsky argued. It’s not about a particular product development methodology. It’s all about the supremacy of strategy: “This isn’t scrum or agile….[M]ost would call it waterfall BUT IT IS NOT. It is planning, iterating, prioritizing, discarding, restarting, and more. I argued most of my career that having a strategy and prioritizing is the only way to execute to have this impact.”
So could you do the same? Possibly. To do so requires exceptional people, but also exceptionally strong commitment to a central strategy over prolonged periods of time. That turns out to be really hard to pull off in practice, which is why Sinofsky celebrated Apple in the singular, not an array of Apples.
Disclosure: I work for AWS, but the views expressed here are mine and don’t reflect those of my employer.
Siri, Apple’s personal digital assistant, uses machine learning and natural speech to answer questions, return relevant search information, perform actions, and more. Here’s the lowdown on Siri.
Occasionally, seemingly small innovations pack tremendous impact. Certainly, that’s proven true with Apple’s Siri personal digital assistant. The voice-activated concierge so significantly reshapes the way people interact with devices that Alphabet technical adviser Eric Schmidt has stated the feature poses a threat to Google’s underlying search business.
Apple announced at its 2018 Worldwide Developers Conference (WWDC) that Siri is being updated to include predictive guidance and recommendations. Machine learning advancements, Siri’s voice recognition capability, and the ability to learn from users’ behaviors and routines meld together to make it all possible. macOS, iOS, and watchOS updates usher in a new era in which Siri Shortcuts and improved watchOS integration make it even easier for users to create custom Siri reminders and receive predictive notifications and customized recommendations without having to expend much, if any, additional effort.
Whether you want to shorten the time required to answer a question, schedule a ride, check a flight’s status, take an alternative route home due to traffic congestion, send a note letting others know you’re running behind, or send a text message or obtain navigation information without having to type, Siri offers intelligent assistance that adapts to the individual user’s nuances over time. Siri is available in all of Apple’s operating systems (iOS, macOS, watchOS, and tvOS), and users can customize the digital concierge with different voices and change the way its services are activated.
Siri was originally introduced as a standalone iOS app by Siri Inc., which Apple acquired in April 2010. The feature was integrated into iOS beginning with version 5, after which it was steadily rolled into Apple’s other platforms, including watchOS, tvOS, and macOS. Siri now supports some 20 languages in dozens of countries.
What is Siri? Siri is a digital personal assistant that performs searches and completes actions in response to an end user’s natural voice commands and learns from a user’s behavior and routines to provide predictive recommendations and information.
Why does Siri matter? Siri introduced an innovative, voice-first approach to search and instruction, since adopted by competitors (including Amazon’s Alexa, Google Assistant, and Microsoft’s Cortana), that changes the way users interact with devices and obtain information. By leveraging machine learning and artificial intelligence capabilities, the virtual assistant becomes more useful without requiring additional user interaction.
Who does Siri affect? Users of any Apple device, whether the equipment is a smartphone, tablet, desktop computer, laptop, Apple TV, iPod touch, watch, or HomePod audio speaker, can access Siri capabilities, which help leverage the investments the user has made in digital content and material across all Apple devices and services using an Apple ID.
What are the potential privacy and security risks of using Siri? Artificial intelligence, merged with machine learning and voice recognition capabilities within a virtual assistant, raises multiple significant privacy and security concerns. The virtual assistant collects and leverages intimate knowledge and details of each user’s personal and professional life. With such treasured information come great safeguarding responsibilities, but Apple claims to be up to the task.
How do you get and use Siri? Siri is integrated within iOS, macOS, watchOS, and tvOS. Users can customize settings for the virtual assistant, which is automatically integrated within contemporary Apple devices.
Siri is a digital personal assistant, integrated within Apple device operating systems, that enables Apple device users to get answers to questions, check the weather, confirm flights, perform searches, complete actions, send messages, and much more. The time-saving feature uses natural language and doesn’t require learning sophisticated or unfamiliar commands. Also, Siri adapts to a user’s nuances, learns from previous operations, and leverages a device’s existing capabilities to extend usefulness with a minimum of user instruction or interaction.
Siri is not a utility to be used in hectic, noisy environments, or a tool for performing complex commands, such as editing videos or photos. Instead, the digital concierge excels at performing time-saving commands (“Hey Siri, please text my spouse that I am running five minutes behind”), opening a specific file (“Hey Siri, please open the 2018 budget spreadsheet”), accessing specific photos (“Hey Siri, please open the new product shots photo album”), learning whether you need to take an umbrella to your client meeting (“Hey Siri, is it going to rain at 3:00 pm?”), and similar tasks.
Don’t sell Siri’s capabilities short, though–Apple touts Siri’s ability to book rides, make payments, and display specific files, among other actions, too. The more time you spend with Siri, the more you’ll learn how it can be used to perform new and creative tasks.
Certainly, Apple will continue investing in the AI assistant. Apple announced at WWDC 2017 that it is using deep learning to improve Siri’s operation. Voice intonation and inflection tweaks help create a more natural sounding voice, while the technology also benefits from on-device learning to enable it to better respond to questions, provide more relevant information, and even recommend suggested articles, text changes, and search strings based on the user’s previous behavior.
And at WWDC 2018, Apple announced new watchOS innovations that make Siri even easier to use. Users need only to raise their wrist and start speaking–they don’t have to say “Hey Siri,” to begin issuing commands and questions to the virtual assistant. The watchOS Siri Face will begin supporting interactions with third-party apps, too, and includes such enhancements as estimating commute times and providing contextual updates, such as for sporting events.
At WWDC 2019, Apple announced that Siri would adopt neural text-to-speech, also known as Neural TTS. Whereas Apple previously pieced together short audio clips recorded by voice talent to form words, phrases, and sentences, with Neural TTS the resulting speech sounds more like normal human talking, with natural emphasis and cadence. The effect is particularly noticeable when Siri speaks longer, more complex statements.
At WWDC 2020, Apple announced an improved design for Siri in iOS 14 and iPadOS 14. On both iPhones and iPads, Siri no longer takes over the entire screen when summoned; instead, a small, orb-like icon pops up at the bottom of the screen, allowing you to still see whatever you were working on before initiating Siri.
Additional notable Siri features include the ability to send audio messages, dictate on your device, and translate between languages with the new Translate app. Siri also has a bigger bank of responses to common questions and more facts available than ever before.
Instead of users having to stop what they’re doing, navigate to various menus and applications, access the keyboard, type specific instructions, and browse and occasionally revise results, Siri enables users to deliver simple and natural voice commands to Apple devices. Whether seeking to play a video, open a file, obtain navigational information, view a specific photo album, or perform other tasks, users can quickly perform all these actions using Siri via minimally disruptive voice commands.
Because users configure Apple devices to join their iCloud and iTunes accounts, the content (spreadsheets, documents, presentations, PDFs, videos, photos, movies, TV shows, music, and other files) available to all their Apple devices becomes accessible to Siri. The result is a much more collaborative, efficient, and productive relationship between end users, their digital content (files, photos, videos, music, applications, cloud services, etc.), and their devices (Apple TV, iPhone, Mac, automobile entertainment system, Apple Watch, and iPad), which often require only minimal voice interaction to sort, locate, view, and access that content.
Siri also simplifies the task of leveraging other Apple technologies. For example, an iPhone user on the go can instruct Siri to schedule a 2:00 pm client appointment on Tuesday. When the Apple user returns to the office, powers on his or her Mac, and opens Apple Calendar, the meeting will already be present on the calendar, assuming the user has configured Apple Calendar properly on all of his or her devices. Apple has quickly closed the gap between entering such information on the go and entering and synchronizing such data using simple voice commands. The ramifications are wide-ranging.
At WWDC 2017, Apple announced the release of a new Siri speaker. Called HomePod, the Bluetooth-enabled, self-adjusting high-fidelity device sports six microphones to extend Siri functionality. Apple users can leverage Siri voice interaction technologies (think voice commands), enabling the device to play Apple Music, control smart home accessories, answer general knowledge inquiries, set clocks and timers, obtain news and weather information, and even get traffic reports and translations.
At its annual 2018 WWDC, Apple announced the introduction of Siri Shortcuts. The feature permits any app to gain access to Siri. Users can assign key phrases to specific apps, such as “Siri, I lost my keys,” to enable Siri to work with Tile to provide the physical location of the missing keys in question. Using Shortcuts, users can also create custom reminders and choose from hundreds of preformatted shortcut routines.
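For developers, exposing an in-app action to Siri Shortcuts comes down to “donating” an activity so Siri can suggest it or let the user bind a phrase to it. The sketch below uses Apple’s NSUserActivity donation API (available since iOS 12); the activity type, title, and invocation phrase are hypothetical examples, not anything from Apple’s announcement.

```swift
import Foundation
import Intents

// A minimal sketch of a Shortcuts donation for a hypothetical
// "find my keys" feature. Donating tells the system the user performed
// this action, so Siri can suggest it later or attach a recorded phrase.
func donateFindKeysActivity() -> NSUserActivity {
    let activity = NSUserActivity(activityType: "com.example.find-keys") // hypothetical identifier
    activity.title = "Find my keys"
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true                // allow Siri suggestions (iOS 12+)
    activity.suggestedInvocationPhrase = "I lost my keys"  // phrase offered when recording a shortcut
    activity.becomeCurrent()                               // donate to the system
    return activity
}
```

In a real app this donation would typically happen at the moment the user performs the action, so Siri’s suggestions reflect genuine habits rather than a one-off call.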
Users of Apple devices, including iPhones, iPads, Macs, Apple TVs, and Apple Watches, are affected by Siri innovations.
Customers who purchase automobiles equipped with Apple CarPlay also benefit; Siri integrates with the car’s audio system and links a user’s iPhone more tightly with the vehicle, simplifying obtaining directions, making calls, listening to books, sending and receiving messages, and listening to music. And, as announced at WWDC 2017 and WWDC 2018, Siri takes an increasingly prominent role in the watchOS 4 and watchOS 5 platforms, respectively.
Everyone from business users seeking to coordinate schedules and keep pace with the modern workplace, to retirees seeking to monitor investments, to students working to keep busy academic and personal lives on track will find the virtual assistant, which learns from their behaviors and routines, a welcome addition to their increasingly frenetic responsibilities. As Siri, with its machine learning and artificial intelligence capabilities, integrates ever more deeply into Apple users’ lives, the personal assistant could soon prove a necessity.
Developers are also impacted, as software manufacturers benefit when their applications are integrated with Siri. Apple’s SiriKit assists developers with the process. SiriKit consists of two frameworks that developers can leverage to tie their applications and services with Siri.
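To give a sense of what that tie-in looks like in practice, here is a minimal sketch of a SiriKit handler built on Apple’s Intents framework, using the system “send message” intent. The messaging logic itself is a stubbed placeholder; in a real app extension, the work would be handed off to the app’s own service.

```swift
import Intents

// A minimal Intents app-extension handler for the system
// "send message" intent. Siri resolves recipients and content with
// the user, then calls handle(intent:completion:) on confirmation.
class SendMessageHandler: NSObject, INSendMessageIntentHandling {

    func handle(intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // Placeholder: a real app would pass intent.content and
        // intent.recipients to its own messaging backend here.
        let response = INSendMessageIntentResponse(code: .success, userActivity: nil)
        completion(response)
    }
}
```

The second SiriKit framework, Intents UI, lets the app draw a custom interface inside the Siri conversation; the handler above covers only the Intents side.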
Apple’s WWDC 2019 conference touted Siri refinements both within iOS 13 and CarPlay. SiriKit makes it easier for developers to integrate Siri functionality within their apps. CarPlay is one example, as Pandora and Waze support Siri beginning with iOS 13.
Siri improvements in iOS 13 include deeper Shortcuts support. Shortcuts, a quick method for automating instructions (such as pulling up directions to the next appointment on your calendar), are integrated within iOS 13 to provide more powerful access to all shortcuts, including those added to Siri.
Those using Siri for navigation will find the AI assistant improving over time, as well. With iOS 13, instead of saying “take a right in 700 feet,” Siri will simply say “take the next right.” The directions are more natural and, consequently, more quickly understood. When traveling to large venues, such as arenas or airports, Siri guides you closer to your actual intended destination within that location.
But even everyday actions benefit from Siri. Whether you’re using Podcasts or Maps, Siri better guides users by providing more accurate and contextual suggestions and recommendations. Users can also leverage Siri for more common tasks, such as tuning in to a specific radio station.
What are the potential privacy and security risks of using Siri?
In the wake of Facebook’s massive data leaks, which revealed comprehensive profile and behavior information for identifiable individual users, privacy and security concerns are receiving heightened awareness. In fact, digital privacy and security issues are likely to prove among the most publicized stories of 2018 and the next several years.
At its WWDC 2018 conference, Apple renewed its commitment to privacy and security, but concerns remain. Whenever a technology is entrusted with as much intimate, personal, sensitive, and strategic information as Siri is for each user, that information proves significant, tremendously valuable to a variety of constituents. Thus, the challenge for Apple, which states it’s committed to safeguarding this sensitive data, is to avoid the type of questionable alliances and leaks that continue plaguing Facebook.
By resisting the temptation to sell user data to advertisers or for data mining purposes, and by presenting roadblocks to the release of complete profile information for a user, Apple makes it much more difficult for third-party app developers, websites, and other partners to track user behavior and mine its users’ information.
At WWDC 2019, Apple doubled down on privacy. Having stated that privacy is a fundamental human right, the company increasingly positions its technologies as designed from the ground up to preserve and protect user privacy, whereas such competitors as Google and Facebook openly collect user data to better target users with ads and promotions.
Apple has a history of developing loyal relationships with its users. Many Apple professionals are so loyal to the platform that they use Macs in the office, iPads at home, and iPhones everywhere in between. Mating Siri with the digital wearable (Apple Watch) and home speaker (HomePod) further increases the “stickiness” within the relationship that’s so prized by marketers.
Users seeking to leverage Siri’s capabilities need to purchase a contemporary Mac, iPad, iPhone, iPod touch, Apple Watch, or Apple TV. Siri settings can be customized using an iPhone’s or iPad’s Settings menu, a Mac’s System Preferences screen, or the Settings menu on an Apple TV.
The default method of accessing Siri on an iPhone or an iPad is to hold down the Home button. To summon Siri on a macOS Sierra-equipped Mac, you can use a keyboard shortcut assigned within System Preferences or click the Siri icon on the menu bar (after configuring your Mac’s Siri preferences to enable its appearance). macOS Sierra also places a Siri icon in the Dock for easy access. On an Apple Watch, you can ask Siri a question by pressing and holding the Digital Crown, or by raising the Watch or tapping the screen and saying “Hey Siri.” In watchOS 5 and newer, users need only raise their wrist and begin speaking.