Apple announced at its annual WWDC developer conference that it was launching improved Spotlight search and a new Live Text feature, both powered by on-device intelligence. The App Store now has 600 million weekly visitors, and Apple has paid $230 billion to developers. With Live Text, users can point their camera at an object or scene and immediately interact with any text visible on the screen.
Apple delivered its developer pitch with quiet composure at WWDC 2021 – without going out of its way, but making sure it did not come across as tone-deaf. Thanks in part to its battle against targeted user tracking, where its reputation was only bolstered by Facebook's efforts to bring it down, Apple is widely viewed as a pro-customer company. Even among developers, Apple is often portrayed as a friendly mentor – independent and smaller developers have frequently credited the company with helping fuel their growth.
Since Apple does all the processing on the iPhone or iPad itself, using on-device intelligence, none of the information about those images, or what appears in your camera viewfinder, should ever leave your device. Apple also shared some notable figures: a developer payout of $230 billion over the App Store's nearly 13-year history, and more than 600 million average weekly visitors. The announcements included a new Events page in the App Store, which will advertise upcoming in-app events to attract more users. In effect, this is Apple creating another avenue for developers to draw users to their apps, made possible by the App Store's might.