WWDC has started, and there have already been some brilliant announcements for Xcode 9. If you’re a registered developer, you can go and download the beta now, which includes Swift 4. Two SDKs were revealed that I can’t wait to mess around with: CoreML and ARKit.
CoreML allows you to build more intuitive apps. That’s right, finally the iPhone will gain consciousness and rise up and rule us all. Maybe not: CoreML is Apple’s foundational machine learning technology, used across Apple products from Siri to QuickType. Basically, it allows you to add deep learning models into your apps. Two APIs built on it were revealed by Apple: the Vision API and the Natural Language API.
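To give a flavour of what “adding a deep learning model into your app” looks like, here is a minimal sketch using the Core ML `MLModel` API. The model name `FlowerClassifier` is hypothetical; in practice you would drag an `.mlmodel` file into Xcode, which compiles it into an `.mlmodelc` bundle and generates a typed wrapper class for you.

```swift
import CoreML

// A minimal sketch, assuming a hypothetical compiled model named
// "FlowerClassifier.mlmodelc" is bundled with the app.
func classify(features: MLFeatureProvider) throws -> MLFeatureProvider? {
    guard let url = Bundle.main.url(forResource: "FlowerClassifier",
                                    withExtension: "mlmodelc") else {
        return nil // Model not found in the bundle.
    }
    let model = try MLModel(contentsOf: url)
    // Run a single on-device prediction; the shape of the input and
    // output features depends entirely on the model you bundle.
    return try model.prediction(from: features)
}
```

In most apps you would use the Xcode-generated class rather than the raw `MLModel` interface, but the underlying call is the same.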
Vision allows face tracking, face detection, landmarks, text detection, rectangle detection, barcode detection, object tracking, and image registration. This API is going to be so much fun to play with; I’m downloading the Xcode 9 beta right now — come on, download faster, I need my apps to recognise my face. The Natural Language API includes language identification, tokenization, lemmatization, part-of-speech tagging, and named entity recognition.
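As a taste of the face detection mentioned above, here is a small sketch using the Vision framework’s `VNDetectFaceRectanglesRequest`. The function name and the decision to just print results are my own; a real app would draw overlays on the image instead.

```swift
import Vision

// Sketch: detect face rectangles in a CGImage using the Vision framework.
func detectFaces(in image: CGImage) {
    let request = VNDetectFaceRectanglesRequest { request, error in
        guard let faces = request.results as? [VNFaceObservation] else { return }
        for face in faces {
            // boundingBox is in normalised image coordinates (0...1),
            // with the origin at the bottom-left.
            print("Found a face at \(face.boundingBox)")
        }
    }
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```

The same request/handler pattern applies to the other detectors — swap in `VNDetectBarcodesRequest`, `VNDetectRectanglesRequest`, and so on.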
Now let’s talk ARKit; personally, this is my favourite one. This SDK allows your device to sense how it moves within a room, achieving this by combining CoreMotion data with Visual Inertial Odometry (VIO). It has some powerful features: for example, it can estimate the total amount of light available in a scene and apply the correct amount of lighting to virtual objects — how amazing is that, and it does this with the camera sensor. You can take advantage of the optimisations for ARKit in Metal, SceneKit, and third-party tools like Unity and Unreal Engine. ARKit is going to add a lot of fun to every app that includes it.
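Getting world tracking running is only a few lines. Here is a hedged sketch of starting an ARKit session in an `ARSCNView` (the SceneKit-backed AR view); note that configuration class names have shifted between beta releases, so check the headers in your Xcode 9 beta.

```swift
import ARKit
import SceneKit

// Sketch: start world tracking in an ARSCNView.
// Assumes this is called from a view controller that owns `sceneView`.
func startARSession(in sceneView: ARSCNView) {
    // World tracking uses the camera plus CoreMotion data (VIO).
    let configuration = ARWorldTrackingConfiguration()
    // Let SceneKit apply ARKit's light estimate to virtual objects.
    sceneView.automaticallyUpdatesLighting = true
    sceneView.session.run(configuration)
}
```

From there, any `SCNNode` you add to `sceneView.scene` is anchored in the real world and lit to match the room.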
There is so much information coming out of WWDC 2017 that it’s hard to keep track, but in case you missed it, Apple put together a nice opening to the event this year.