Ryan O’Connor ’19 Wins Scholarship to WWDC17 (Part 2)

By Ryan O’Connor ’19

Monday, June 5

Monday was the most important day of the entire week. On this day, Apple held its keynote to announce the year's most important new software features and products.

Inside the convention center.

The presentation kicked off with a description of some minor updates to tvOS, the operating system for Apple TV, and some new watch faces and redesigns for watchOS, the operating system for Apple Watch.

During the keynote, Apple CEO Tim Cook also highlighted that the youngest developer attending was a 10-year-old boy from Australia who has already produced five apps on the App Store, while the oldest was an 82-year-old woman from Japan who had published her first app earlier this year.

The excitement really started when the announcements for macOS, Apple's Mac operating system, began. Enormous speed improvements across the lineup were highlighted.

After watching the keynote via live stream for years, it was truly amazing to be part of the excitement and cheering throughout the entire event. A notable addition to the new version of macOS was the move to the Apple File System (APFS), which replaced a file system that was decades old. This update had already come to devices running iOS earlier this year (iOS 10.3 or later).

Interesting hardware updates followed, with some speed improvements to existing models and the announcement of a radically new product, the iMac Pro. This device, geared towards professional video and graphics editors, packed an astonishing 18-core CPU and 128 GB of memory!

The peak of the event followed, focused on iOS, the mobile operating system for iPhone and iPad. Some interesting design and feature additions were made. One section that particularly interested me covered the Vision and machine learning frameworks. This functionality lets developers analyze (using machine learning) what a camera sees in real time, such as a person's face or a guitar they are holding, and then render an object on top of that person as if it were floating in real life. There are endless applications for these new features, which I am very excited to explore.

iOS 11 presentation during the Keynote.

Another section especially stuck out to me because of its importance to students. This year, Apple added a Files app to iPad, which will allow students to see all of their files from all of their apps in one place. Furthermore, drag and drop is now supported between different apps across the whole system – for example, dragging text, images, website URLs, and much more – to compile rich content on iPad. Additionally, many more apps can now be stored in the Dock, which can be accessed from anywhere. This is a huge update for students, and I think it will be extremely helpful when it launches publicly this fall.

Star Wars Augmented Reality on the new 10.5” iPad Pro.

Apple also launched two more brand-new products. First, a never-before-seen 10.5-inch iPad Pro with astonishing graphics and computing capabilities was released. The company demonstrated this device being used for Augmented Reality. One example involved placing a coffee cup, lamp, or plant on a table in real time through the device's camera. Then, developers took it even further by rendering full scenes through the camera that you could walk around and interact with. I certainly enjoyed playing around with this framework during the week, and I even used the template of a spaceship floating in real time to develop my own breadcrumb trail out of Apple logos (haha!), an AR solar system, and much more. Expect many more AR and "learning" apps in the near future thanks to the new frameworks Apple provides to make these great apps!
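To give a rough idea of how little code ARKit needs to place a virtual object on a real table, here is a minimal Swift sketch. This is only an illustrative outline, not the demo's actual code: it assumes an ARSCNView has already been wired up in a storyboard, and a simple box stands in for the coffee cup or lamp from the demo.

```swift
import ARKit
import UIKit

class ARViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!  // assumed to be connected in the storyboard

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Ask ARKit to track the world and look for flat surfaces like tabletops.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // Called whenever ARKit detects a new surface; anchor a small box on it.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        let box = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                           length: 0.1, chamferRadius: 0))
        box.position = SCNVector3(0, 0.05, 0)  // rest the box on top of the plane
        node.addChildNode(box)
    }
}
```

ARKit handles the hard parts – tracking the device's position and finding surfaces – so the app only has to decide what to draw once a plane is found.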

The second new product that Apple released was called HomePod, a high-quality speaker with smart capabilities that let you use voice commands to change your music or ask for things like the weather forecast. Apple is entering this market alongside companies such as Google, with Google Home, and Amazon, with the Alexa-powered Echo.

HomePod preview on display.

Following the keynote, we had the chance to get a preview of the soon-to-be-released HomePod and the iMac Pro in a stealthy Space Gray color, and to play with some more of the Augmented Reality features in action.

The next event of the day was the Platforms State of the Union. I won't go into too much detail simply because of how much ground it covers. It focuses on the technical and code-level changes behind the new features, so it was great for Apple to show us how we can implement them in our own apps. I would, however, like to mention the Swift Playgrounds app for iPad, which is an amazing and powerful resource for anyone interested in learning how to code. The event also covered how I, as a developer, could add features to my apps such as Siri, Drag & Drop and Split Screen, Machine Learning, Vision and AR, and much more.

Monday was also a great day to start meeting some of the great scholars and other attendees from around the world who are just as interested as I am in walking around a conference hall holding a phone up in the air to try out an AR scene we had just programmed! Shown in the picture below, everyone is eagerly playing with the new betas, or pre-release software, to get their hands on the new features and code.

Scholarship Lounge – attendees play with new betas, so they can get their hands on the new features and code

Tuesday, June 6

From Tuesday through Friday, hundreds of sessions and labs were held for us to attend. While there, we learned more about new technologies, asked for help from Apple engineers, or just hung out and coded.

Tuesday stood out, however, because we received a surprise visit from former First Lady Michelle Obama.

Michelle Obama speaking to developers.

In her talk, she touched on various topics, such as what she has been doing since her husband left office and some initiatives she has undertaken. She told us developers and entrepreneurs that our work cannot be done alone: it is important to have someone who shares your values alongside you, because one person cannot see every angle or possibility of a situation.

Mrs. Obama also discussed how making a difference often starts with small ideas and actions. You can try to act on a large scale, but where it really matters is at the local, community level. The former First Lady then talked about the importance of women advocating for themselves, because they are just as important in the industry as everyone else. Finally, Mrs. Obama said that as app developers, we must be role models to the world, as people all across the globe use our apps.

Afterwards, I attended sessions on topics like Machine Learning and Privacy. The Machine Learning session was particularly interesting. I learned all about the great tools that Apple has provided us with – for example, I experimented with one of the many pre-trained models that can detect things like people, places, and objects using a camera. Then, I simply imported the model, and with a few added lines of code, it was ready to go!

Core Machine Learning Session

Fundamentally, the system takes input from the Vision (camera) or Natural Language Processing (text) frameworks, analyzes it using Core ML, and then displays feedback to the user either as simple text or by overlaying objects on the scene using AR.

For example, I was shown how Vision can detect someone holding a guitar through the camera, and how the ARKit framework can then overlay a bubble with a guitar emoji right next to the person's head!
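The "import a model and add a few lines of code" workflow can be sketched in Swift roughly like this. The model name MobileNet is a hypothetical stand-in for whichever pre-trained classification model you drop into the Xcode project (Xcode generates a Swift class for the model automatically):

```swift
import UIKit
import Vision
import CoreML

// Classify the contents of an image using a pre-trained Core ML model.
// Assumes a model file (here called MobileNet.mlmodel) has been added
// to the project, giving us the generated MobileNet class.
func classify(image: UIImage) {
    guard let cgImage = image.cgImage,
          let visionModel = try? VNCoreMLModel(for: MobileNet().model) else { return }

    // The request runs the model and hands back ranked classification labels.
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let best = (request.results as? [VNClassificationObservation])?.first
        else { return }
        print("Saw: \(best.identifier) (\(Int(best.confidence * 100))% confident)")
    }

    // Vision scales and crops the image to the model's expected input size for us.
    let handler = VNImageRequestHandler(cgImage: cgImage)
    try? handler.perform([request])
}
```

The appeal is exactly what the session stressed: Vision and Core ML handle the image preprocessing and model execution, so the app's own code stays down to a handful of lines.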

I also attended some labs, where Apple engineers helped me with specific questions pertaining to things like my tvOS (Apple TV) app, iMessage apps, and data storage.

The last part of my day was a User Interface Consultation; each developer is entitled to one during the conference. There, two Apple designers sat down with me to talk about my design practices and how I can improve them. While they were quite critical, I was happy to get lots of feedback on improving my design techniques. Specifically, we talked about the new app I am working on and how I can make the experience more intuitive, simple, and consistent. They stressed that I must prioritize my material according to how much I think each section will be used.