A Mobile Developer’s Key Takeaways and Learnings from WWDC 2017

By Pallak Grewal

Pallak at WWDC

This year I got a chance to attend WWDC (Apple’s World Wide Developers Conference) for the first time. Glancing out the airplane window at the quickly changing California landscape, I thought about the products, features, and updates I expected Apple to announce – advances in machine learning and a product to compete with the home assistants currently on the market.

WWDC 2017 did not disappoint.

This year, WWDC moved back to San Jose after 15 years in San Francisco. Around 5,000 people from all over the world attended the conference. The attendees included not just seasoned developers but also designers, project managers, developers who want to learn iOS/macOS development, and educators. WWDC had something for everyone. It was a jam-packed week of learning sessions during the day and parties organized by various companies in the evening. WWDC also featured a talk by Michelle Obama on Tuesday morning and special lunch speakers throughout the week. On Thursday, Apple organized an attendee-only event to close out the conference – the Bash, which featured a concert by Fall Out Boy (one of my favourite bands, so I really enjoyed the front-row experience).

I reached the conference venue incredibly early for the keynote. It was still dark outside, but even at 5 AM there were already more than a hundred people lined up! Even though the wait was long, it was an excellent opportunity to get to know other people. I talked to them about their expectations for the keynote and the conference overall.

As always, Apple announced better specs for almost all of their products. Since Apple is really consumer-focused, there was a lot of emphasis on providing a better user experience.

The new iMac and iPad got a great response from the attendees. The iPad Pro demo really sold the product. The new iPad boasts a dynamic refresh rate of up to 120Hz. This means that animations are smoother than ever, and apps which do not require the higher refresh rate use a lower one, extending battery life. The iPad uses the A10X Fusion chip (an improved version of the A10 Fusion processor in the iPhone 7) and is equipped with the same cameras as the iPhone 7. The dock has been revamped for easier app switching, and app pairings are now preserved. The iPad Pro also uses the new Apple File System (APFS), which aims to fix some core issues present in its predecessor, HFS+. With the new refresh rate and iOS 11, the Apple Pencil is even more responsive and useful with the iPad Pro.

Apple also introduced the next version of macOS – High Sierra (I expected someone to intervene, say the name was a joke, and announce the real one, but we only got a whole bunch of puns instead). High Sierra looks pretty similar to Sierra visually but adds a lot of improvements behind the scenes, like using machine learning in Safari and Photos.

Next up was the announcement of iOS 11. iOS 10 has seen an impressively high adoption rate, with 86% of devices running it. In iOS 11, the Control Centre has been redesigned for faster access. The App Store also saw its first major redesign since launch. The Featured tab is being replaced by the Today tab, with apps and daily stories curated by editors from around the world. Games now have their own tab. With in-app purchases rising in popularity, there is a section featuring top in-app purchases. The search tab now supports using app details as search keywords.

Finally, Apple unveiled the new HomePod – a Siri-controlled speaker. The HomePod demo highlighted its sound quality and spatial awareness – how it adjusts its sound when placed next to a wall or another HomePod. Similar to Google Home and Amazon Echo, the HomePod doesn’t just play music but can also control smart devices and do anything that Siri can. Knowing Apple, the HomePod can be expected to be a solid piece of hardware. Its current drawbacks are its price and the fact that it only supports Apple Music.

From a developer’s perspective, these are the things that I am extremely excited about:

Xcode 9

The Xcode 9 source editor has been completely rewritten in Swift. Xcode is now faster, especially for Swift, and supports Markdown. Remember all the times you’ve received a warning or error in Xcode, only to have it cut off in the code editor? That has now been fixed. There is a new smart fix-it button, which not only fixes errors and warnings but can also insert missing protocol methods and fill in missing switch cases with a single click. Colours can now be added to the asset catalog and referenced by name in classes and storyboards.

The new fix button in Xcode 9
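
Here’s a minimal sketch of the new named-colour workflow in iOS 11; “BrandBlue” is a hypothetical colour assumed to have been added to the asset catalog:

```swift
import UIKit

// "BrandBlue" is a hypothetical colour assumed to exist in the asset catalog.
let brandBlue = UIColor(named: "BrandBlue")  // returns nil if the name is missing

let label = UILabel()
label.textColor = brandBlue ?? .black        // fall back gracefully if the lookup fails
```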

Xcode 9 also brings over the tokenized editing experience from Swift Playgrounds. It has more semantic awareness and smart highlighting. For example, clicking on an if statement will highlight the code block in that statement. Xcode 9 also has a new refactoring system that takes advantage of tokenized editing. Right-clicking on a token, like if, brings up a contextual menu with different actions. If you use an action frequently, you can even bind it to a key. Renaming a class or method is more user-friendly: before applying the change, Xcode shows a preview of the new name in all the files that use it, even storyboards.

Xcode 9 introduces support for different Swift versions in different targets, so you can update your project code to Swift 4 but still use third-party libraries written in Swift 3. You can also run multiple simulators and devices simultaneously for testing. Another big feature is wireless development – no need to plug in your phone anymore. This is especially helpful for devices like the Apple TV, which may already be plugged in and set up in a room. In collaboration with GitHub, Apple has also added built-in GitHub integration to Xcode 9.

Relevant sessions

Swift

Migrating from Swift 3 to Swift 4 is considerably less effort than moving from Swift 2 to Swift 3. If you are still using Swift 2 in your projects, now is a good time to update to Swift 4.

Swift 4 introduces a number of improvements over version 3. These tweaks may be small, but they definitely have a big impact. For starters, private properties are now accessible in same-file extensions, so there’s no need to use “fileprivate” for all your properties. Strings are finally a collection again (loud cheer from developers everywhere), so all collection operations can be performed on them directly, without fetching the string contents as a characters collection. You can also now declare that an object is a specific class and conforms to one or more protocols.

Strings in Swift 3

Strings in Swift 4 – collection again!
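
Here’s a minimal sketch of all three changes; the Counter and Shakeable names are purely illustrative:

```swift
import UIKit

// 1. Strings are a collection again – no .characters detour needed.
let greeting = "Hello, WWDC"
let letterCount = greeting.count              // Swift 3: greeting.characters.count
let reversed = String(greeting.reversed())
let noLs = greeting.filter { $0 != "l" }      // filter returns a String in Swift 4

// 2. private members are now visible in same-file extensions,
//    so fileprivate is no longer required for this pattern.
struct Counter {
    private var count = 0
}
extension Counter {
    mutating func increment() { count += 1 }  // compiles in Swift 4
}

// 3. Class and protocol composition in a single type annotation.
protocol Shakeable {}
func shake(_ view: UIView & Shakeable) {
    // Both UIView API and Shakeable requirements are available on `view`.
}
```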

The new Codable protocol makes it easy to automatically encode and decode data types for JSON compatibility, as long as any nested properties also conform to Codable. The article from Apple below provides a quick overview of how flexible the Codable protocol is.
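
For instance, here’s a minimal sketch of round-tripping JSON with Codable; the Session and Speaker types are made up for illustration:

```swift
import Foundation

// Both types conform to Codable, so encoding/decoding is synthesized for us.
struct Speaker: Codable {
    let name: String
}

struct Session: Codable {
    let title: String
    let speaker: Speaker   // nested property also conforms to Codable
}

// Multiline string literals are themselves new in Swift 4.
let json = """
{ "title": "What's New in Swift", "speaker": { "name": "Jane" } }
""".data(using: .utf8)!

do {
    let session = try JSONDecoder().decode(Session.self, from: json)
    print(session.speaker.name)                     // "Jane"

    let encoded = try JSONEncoder().encode(session) // back to JSON Data
    print(String(data: encoded, encoding: .utf8)!)
} catch {
    print("Coding failed: \(error)")
}
```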

Relevant sessions & information

Drag and Drop API

One of the reasons the iPad Pro demo was such a hit was the new drag and drop functionality. On the iPhone, drag and drop only works within an app, but on the iPad, it works seamlessly between different apps. Users can use multi-touch to drag and group data objects and then drop them as needed. The most impressive thing about drag and drop is how simple it is to integrate into your app. I would definitely recommend checking out all the drag and drop sessions and playing around with the API. Even if drag and drop is not core functionality for your app, it can tremendously improve the user experience when interacting with other apps.
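
To give a taste of how little code is involved, here’s a minimal sketch of making a label draggable; the view controller and label are illustrative:

```swift
import UIKit

// A minimal sketch of adding drag support to a view (iOS 11).
class PhotoViewController: UIViewController, UIDragInteractionDelegate {

    let draggableLabel = UILabel()

    override func viewDidLoad() {
        super.viewDidLoad()
        draggableLabel.text = "Drag me"
        draggableLabel.isUserInteractionEnabled = true
        view.addSubview(draggableLabel)

        // Attaching a UIDragInteraction is all it takes to become a drag source.
        draggableLabel.addInteraction(UIDragInteraction(delegate: self))
    }

    // Provide the payload; NSString already conforms to NSItemProviderWriting.
    func dragInteraction(_ interaction: UIDragInteraction,
                         itemsForBeginning session: UIDragSession) -> [UIDragItem] {
        let text = draggableLabel.text ?? ""
        return [UIDragItem(itemProvider: NSItemProvider(object: text as NSString))]
    }
}
```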

Relevant sessions

Machine Learning

Apple already uses machine learning internally in their products, like Spotlight. So far, these APIs had been private, but now Apple wants to enable developers to use machine learning easily by providing an interface through Core ML. A majority of apps may not need machine learning for their core product, but they can provide a better user experience by leveraging these new APIs.

Apple doesn’t want developers to worry about building machine learning models; it wants to enable them to apply existing models easily. You can take a trained model in any of the common formats and use Apple’s open source Python package – coremltools – to convert it to the .mlmodel format. You can then drag the converted model into Xcode, which generates a class for it, and start using it. That’s how simple it is to use the Core ML framework.
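
Here’s a minimal sketch of what using a converted model might look like; FlowerClassifier is a hypothetical model, and its input name and outputs depend entirely on the model you convert:

```swift
import CoreML
import CoreVideo

// FlowerClassifier is the class Xcode would generate from a hypothetical
// FlowerClassifier.mlmodel file; the input name "image" is model-specific.
func classify(_ pixelBuffer: CVPixelBuffer) {
    do {
        let model = FlowerClassifier()
        let output = try model.prediction(image: pixelBuffer)
        print("Prediction: \(output.classLabel)")
    } catch {
        print("Prediction failed: \(error)")
    }
}
```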

Apple also provides ready-to-use, domain-specific APIs built on top of the Core ML framework:

Natural language processing through the NSLinguisticTagger API

The NSLinguisticTagger API has been available since iOS 5 and has been used internally for keyboard prediction, Spotlight, and even Safari. Now, with more powerful public features, it can identify the language of a piece of text, tokenize it at various levels (word, sentence, paragraph), and perform lemmatization. It works for multiple languages and promises high accuracy.
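
Here’s a minimal sketch of the new iOS 11 surface – detecting the dominant language and lemmatizing word tokens:

```swift
import Foundation

let text = "Apple introduced several exciting frameworks at WWDC"

// The unit-based enumeration API and dominantLanguage are new in iOS 11.
let tagger = NSLinguisticTagger(tagSchemes: [.language, .lemma], options: 0)
tagger.string = text

print(tagger.dominantLanguage ?? "unknown")  // "en"

let range = NSRange(location: 0, length: text.utf16.count)
let options: NSLinguisticTagger.Options = [.omitPunctuation, .omitWhitespace]

// Enumerate word tokens and print each word's lemma (dictionary form).
tagger.enumerateTags(in: range, unit: .word, scheme: .lemma, options: options) { tag, tokenRange, _ in
    let token = (text as NSString).substring(with: tokenRange)
    print("\(token) -> \(tag?.rawValue ?? "?")")  // e.g. "introduced -> introduce"
}
```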

Image analysis and computer vision through the Vision framework

The Vision framework allows you to easily identify faces, detect facial features (face landmarks), find text, and scan barcodes. It even supports tracking objects and faces in real time.
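
Here’s a minimal sketch of face detection with Vision; the image is assumed to come from elsewhere in your app:

```swift
import UIKit
import Vision

// A minimal sketch of face detection with the Vision framework (iOS 11).
func detectFaces(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    let request = VNDetectFaceRectanglesRequest { request, error in
        let faces = request.results as? [VNFaceObservation] ?? []
        // boundingBox is in normalized coordinates (0...1, origin bottom-left).
        for face in faces {
            print("Face at \(face.boundingBox)")
        }
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])  // Vision work belongs off the main thread
    }
}
```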

The best part is that Core ML processes data entirely on the device. This offers offline functionality as well as a high level of privacy.

Relevant sessions & information

ARKit

Another new framework added this year is ARKit. ARKit detects planes through the camera and lets you place virtual objects in the real world. It can also apply the correct amount of lighting to your virtual objects with minimal effort. You can move objects, scale them, and interact with them in multiple other ways.

With ARKit, objects don’t just seem to be floating in space but are actually anchored to a surface or to smaller feature points, the way an actual object would be. This can be used to build some pretty realistic and complex scenes. For rendering the objects, you can use Metal, SpriteKit, or SceneKit. In addition to these Apple frameworks, Unity and Unreal will also support the full ARKit feature set.
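
Here’s a minimal sketch of starting a plane-detecting AR session and placing a virtual box with SceneKit; the ARSCNView outlet is assumed to be wired up in a storyboard:

```swift
import UIKit
import ARKit
import SceneKit

class ARViewController: UIViewController {

    @IBOutlet var sceneView: ARSCNView!  // assumed to be connected in the storyboard

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // World tracking with horizontal plane detection and light estimation.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        configuration.isLightEstimationEnabled = true
        sceneView.session.run(configuration)
    }

    // Place a 10 cm virtual cube half a metre in front of the session origin.
    func addBox() {
        let box = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0)
        let node = SCNNode(geometry: box)
        node.position = SCNVector3(0, 0, -0.5)
        sceneView.scene.rootNode.addChildNode(node)
    }
}
```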

Relevant sessions

WWDC 2017 definitely forecasts an interesting year ahead! I can’t wait to see what people create with the new frameworks and APIs that are now available.

Welcome to WWDC17

To wrap up, here’s a fun fact about WWDC: before each talk, there was an animation of people walking around. This mesmerizing animation also played on various screens around the conference as a screensaver. On closer inspection, one could tell that it was not a looping animation. In fact, it was a Swift app built by Apple. The people moving around generated their own paths to avoid obstacles and walk across the screen. Each person used a state machine to either keep moving, stop and become an obstacle, or interact with another person in their path.
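
Purely as an illustration of that idea (my guess at the structure, not Apple’s actual code), here’s how such a state machine might look with GameplayKit:

```swift
import GameplayKit

// Hypothetical states for one animated person; the names are guesses.
class Walking: GKState {
    override func isValidNextState(_ stateClass: AnyClass) -> Bool {
        return stateClass == Stopped.self || stateClass == Interacting.self
    }
}
class Stopped: GKState {}      // standing still: becomes an obstacle for others
class Interacting: GKState {}  // chatting with another person in the path

let person = GKStateMachine(states: [Walking(), Stopped(), Interacting()])
person.enter(Walking.self)
person.enter(Stopped.self)
print(person.currentState is Stopped)  // true
```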

Other Recommended Sessions

Special Talks (Some talks may not be available yet)