WWDC 17


WWDC

One of the many benefits of working at Steelcase is that each year every developer is allotted $5000 to attend conferences. This year I entered the Apple lottery for WWDC tickets and I won! Over the course of 5 days I learned an immense amount about Swift 4, Xcode 9, iOS 11, Natural Language Processing, Computer Vision, and Machine Learning with CoreML. What follows is a massive brain dump of all the notes and highlights that I took during the conference.

Swift 4

Xcode 9

Swift refactoring in the Xcode IDE is now on par with refactoring in Android Studio, including renaming a symbol across an entire project, even across Swift and Objective-C files.

Build times have been reduced through indexing that happens simultaneously with building and through precompiled bridging headers. Building PA from a cleaned build folder to app start takes 121 seconds with Xcode 8 and 117 seconds with Xcode 9. Successive launches of the app take 8 seconds with Xcode 8 and 4 seconds with Xcode 9.

New simulators have been added, with manipulable hardware buttons, dynamic resizing, and support for running multiple simulators in parallel.

Wireless debugging is possible with devices running iOS 11+ that are on the same wireless network as the Mac running Xcode. To connect, make sure the device is plugged in first, then in the menu bar select Window > Devices and Simulators, select your device, and check Connect via network. You can then unplug the device and debug it wirelessly over the network. Once setup is complete, it does not need to be repeated.

In the Xcode view debugger you can now choose to show clipped areas for enhanced visual debugging.

You can now wrap logical groups of UI test commands into named activities with XCTContext.runActivity(named:block:).
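A minimal sketch of grouping UI test steps into an activity; the app flow, field identifiers, and credentials here are hypothetical:

```swift
import XCTest

class LoginUITests: XCTestCase {
    func testLogin() {
        let app = XCUIApplication()
        app.launch()

        // Group the login steps so they appear as one collapsible activity in the test report.
        XCTContext.runActivity(named: "Log in as test user") { _ in
            app.textFields["username"].tap()
            app.textFields["username"].typeText("tester")
            app.secureTextFields["password"].tap()
            app.secureTextFields["password"].typeText("secret")
            app.buttons["Log In"].tap()
        }

        XCTAssertTrue(app.staticTexts["Welcome"].exists)
    }
}
```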

You can now have Xcode automatically take screenshots during UI tests with XCUIElement.screenshot() and XCUIScreen.screenshot(). Both are deleted if the test passes unless you explicitly dictate otherwise.
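A minimal sketch of keeping a screenshot even when the test passes, by attaching it with XCTAttachment and a .keepAlways lifetime; the attachment name is arbitrary:

```swift
import XCTest

class ScreenshotUITests: XCTestCase {
    func testCapturesScreenshot() {
        let app = XCUIApplication()
        app.launch()

        // Capture the whole screen (an individual XCUIElement also supports screenshot()).
        let screenshot = XCUIScreen.main.screenshot()

        // Attach it to the test; .keepAlways prevents Xcode from deleting it when the test passes.
        let attachment = XCTAttachment(screenshot: screenshot)
        attachment.name = "Launch screen"
        attachment.lifetime = .keepAlways
        add(attachment)
    }
}
```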

XCUIElement.waitForExistence(timeout:) is now recommended over explicitly calling sleep().
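A short sketch of the waiting pattern; the "Done" button identifier and the 5-second timeout are hypothetical:

```swift
import XCTest

class WaitingUITests: XCTestCase {
    func testWaitsForButton() {
        let app = XCUIApplication()
        app.launch()

        // Wait up to 5 seconds for the button to appear instead of sleeping for a fixed amount of time.
        let doneButton = app.buttons["Done"]
        XCTAssertTrue(doneButton.waitForExistence(timeout: 5))
        doneButton.tap()
    }
}
```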

iOS 11

Privacy

To uniquely identify an app instance, use let id = UUID() (the constructor is natively supported by Foundation).
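The note only covers generating the UUID; a minimal sketch of one way to keep the identifier stable across launches by persisting it in UserDefaults (the helper name and key are arbitrary):

```swift
import Foundation

// Hypothetical helper: generate the identifier once and reuse it on subsequent launches.
func appInstanceIdentifier() -> UUID {
    let key = "appInstanceIdentifier"
    if let stored = UserDefaults.standard.string(forKey: key),
       let id = UUID(uuidString: stored) {
        return id
    }
    let id = UUID()
    UserDefaults.standard.set(id.uuidString, forKey: key)
    return id
}
```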

As of iOS 11 it is necessary to define both the “When in Use” and “Always” location permission strings in the Info.plist file.

Apple assigns each device two bits of per-device, per-developer state that can be set and queried at run time through the DeviceCheck API.
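A minimal sketch of the device-side half, assuming your server handles the actual call to Apple's DeviceCheck service; the helper name is hypothetical:

```swift
import DeviceCheck

// Generate an ephemeral token on the device; your server sends this token to Apple's
// DeviceCheck servers to read or update the two per-device bits.
func fetchDeviceCheckToken(completion: @escaping (Data?) -> Void) {
    guard DCDevice.current.isSupported else {
        completion(nil) // e.g. simulators are not supported
        return
    }
    DCDevice.current.generateToken { token, error in
        if let error = error {
            print("DeviceCheck token error: \(error)")
        }
        completion(token)
    }
}
```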

NLP

Vision

The Vision API enables Image Registration (stitching images together), Rectangle Recognition, and Object Tracking. It supplements existing computer vision pre-processing libraries in CoreImage and AVCapture as a high quality, but slower, alternative.

There are three steps to using Vision:

  1. Create the request.
  2. Run the request.
  3. Use the results.

Vision supports CIImage, CGImage, NSURL, and NSData formats. Make sure to use a completion handler for requests.
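A minimal sketch of the three steps using rectangle detection on a CGImage; error handling is abbreviated and the input image is assumed to exist:

```swift
import Vision

func detectRectangles(in cgImage: CGImage) {
    // 1. Create the request, with a completion handler that receives the results.
    let request = VNDetectRectanglesRequest { request, error in
        guard let observations = request.results as? [VNRectangleObservation] else { return }
        // 3. Use the results (normalized bounding boxes of the detected rectangles).
        for rectangle in observations {
            print("Found rectangle at \(rectangle.boundingBox)")
        }
    }

    // 2. Run the request with an image request handler.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    do {
        try handler.perform([request])
    } catch {
        print("Vision request failed: \(error)")
    }
}
```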

CoreML

To allow use of models trained outside of Core ML (e.g. with Keras, Caffe, scikit-learn, libsvm, or XGBoost), execute pip install -U coremltools on the command line and use Apple's coremltools package to convert them to the Core ML format.
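Once a converted .mlmodel file has been added to the Xcode project it can be used from Swift; below is a minimal sketch pairing it with Vision for image classification, assuming a hypothetical Xcode-generated model class named FlowerClassifier:

```swift
import Vision
import CoreML

func classify(_ cgImage: CGImage) {
    // FlowerClassifier is a hypothetical class that Xcode generates from the .mlmodel file.
    guard let coreMLModel = try? VNCoreMLModel(for: FlowerClassifier().model) else { return }

    let request = VNCoreMLRequest(model: coreMLModel) { request, _ in
        guard let best = (request.results as? [VNClassificationObservation])?.first else { return }
        print("\(best.identifier): \(best.confidence)")
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```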
