Apple's event this past week had no "one more thing" routine, and no new iPhone announcement to follow it, but that is no reason to sleep on what was shown. Apple has long been direct with developers about what they can expect to see improved or added to their toolkits, and this September event was no different. The new iPad Air rounded off the show, and deservedly so, for it brings landmark features for Apple developers to build with.
iPad Air Brings Neural Engine With 2x The Cores
The iPad Air was the vehicle for the biggest changes. Its new A14 Bionic chip includes a 16-core Neural Engine that greatly accelerates Apple's Core ML machine-learning framework. Apple's documentation covers a whole host of supporting frameworks, also developed by Apple, that developers can pair with it. Vision, for example, is a computer-vision framework that lets applications perform face detection, landmark detection, and other image-analysis and tracking tasks.
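To give a feel for how little code the Vision framework demands, here is a minimal sketch of face detection on an image; the function name `detectFaces` is our own, but the Vision types are Apple's real API:

```swift
import UIKit
import Vision

// A minimal sketch: detect face bounding boxes in a UIImage using Vision.
func detectFaces(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    // The request's completion handler receives one observation per face found.
    let request = VNDetectFaceRectanglesRequest { request, error in
        guard let faces = request.results as? [VNFaceObservation] else { return }
        for face in faces {
            // boundingBox is in normalized coordinates (0...1, origin at bottom-left).
            print("Face at \(face.boundingBox)")
        }
    }

    // The handler runs one or more Vision requests against a single image.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

On A-series chips, Vision can dispatch this kind of work to the Neural Engine, which is exactly where the A14's 16 cores come in.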
More abilities that build on Core ML include the (clearly named) Natural Language framework for natural language processing, Speech for converting audio to text, and Sound Analysis for identifying sounds in recorded audio. Under the hood, Core ML draws on Apple's existing Accelerate framework and its BNNS (Basic Neural Network Subroutines) library to deliver the performance Apple is promising developers.
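The Natural Language framework is similarly compact to use. As a minimal sketch, part-of-speech tagging a sentence takes only a tagger and an enumeration call (the sample sentence is ours; `NLTagger` and its methods are Apple's real API):

```swift
import NaturalLanguage

// A minimal sketch: part-of-speech tagging with the NaturalLanguage framework.
let text = "Apple announced the new iPad Air."

let tagger = NLTagger(tagSchemes: [.lexicalClass])
tagger.string = text

// Walk the string word by word, asking for each word's lexical class
// (noun, verb, adjective, and so on), skipping punctuation and whitespace.
tagger.enumerateTags(in: text.startIndex..<text.endIndex,
                     unit: .word,
                     scheme: .lexicalClass,
                     options: [.omitPunctuation, .omitWhitespace]) { tag, range in
    if let tag = tag {
        print("\(text[range]): \(tag.rawValue)")
    }
    return true // keep enumerating
}
```

The same framework also handles language identification, tokenization, and named-entity recognition through other tag schemes.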
Apple’s Previews of Core ML Features in Action
As a proof of concept, Apple gave the spotlight to Karim Morsy, who showed off his creation, djay Pro AI, on the new iPad Air: DJing in mid-air, with the user's hand movements tracked and translated, via Core ML-powered machine learning, into turntable scratches and similar controls. This kind of movement tracking shows off the enormous potential of Core ML on hardware like the new iPad Air.
Apple also showed Pixelmator Photo, a photo-editing app for the iPad, running on the A14 Bionic and using ML Super Resolution to upscale photos and recover detail. Its developers credit the A14 Bionic with making the feature possible to implement.
Xcode 12 Will Include All Core ML Functionality for Developer Use
Developers looking to use Core ML will also appreciate that all of these features are available in the production release of Xcode 12, which is likewise required for submitting apps that target iOS 14 and iPadOS 14, both rolling out to Apple devices right away. The week was probably a bit hectic for iOS developers whose apps had compatibility issues with iOS 14, but it also brought them a host of new features with which to expand the potential of their creations.
‘Apple Silicon’ For, and of, The Future
This push for machine learning that makes full use of its 'Apple Silicon' chips also signals Apple's focus on the one product line still running on Intel CPUs: the Mac. These features should carry over directly to Apple Silicon Macs as soon as those machines are available, potentially giving users the very same experience there as well. The shift is very favorable for Apple: with this scalability it can truly unify its software ecosystem from one device to another, to the point where even an iPad could serve as a vehicle for developing apps for all of its other platforms.
Apple has handed its developers its best tools to play with and learn from, and it is now up to them to push those tools to their limits. The iPad Air is the first of many new iPads (and now Macs!) to ship with these capabilities. We will surely see the results very soon, and it will be fascinating to watch new machine-learning applications like the ones Apple previewed at its event arrive in the future.