In my last post about Siri, I want to cover the ability to do all voice interpretation offline for pre-set languages.
Ever since Craig Federighi, the Senior Vice President who oversees all of Apple's software, took over Siri, it has been slowly getting better.
One of the biggest issues with Siri to date is that you still need an internet connection so Apple's servers can convert your voice to text before Siri can act on it.
Apple recently introduced offline music controls via Siri, but this doesn't work for me, and even if it did, it would only cover music.
Why do we need an internet connection for AirPods or an Apple Watch to use Siri to send an SMS to someone? It makes no sense.
The best way for Apple to enable this would be to let users download languages for offline use in Settings, based on the languages they speak. I'd imagine each language would take up around 1GB of download and storage space on your device, but making the download optional, especially for users with little free storage, would cover all scenarios.
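Apple already exposes a building block for this to developers: since iOS 13, the Speech framework lets an app check whether a locale supports on-device recognition and force audio to stay off Apple's servers. A minimal sketch of what that looks like, assuming permissions are already granted and with a placeholder audio file path:

```swift
import Speech

// Placeholder URL for a recorded audio file; not a real path.
let audioFileURL = URL(fileURLWithPath: "/path/to/recording.m4a")

// Ask for a recognizer for a specific locale, the same way a
// downloadable offline language pack would presumably work.
if let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
   recognizer.supportsOnDeviceRecognition {
    let request = SFSpeechURLRecognitionRequest(url: audioFileURL)
    // Keep recognition entirely on the device; no audio is sent to Apple.
    request.requiresOnDeviceRecognition = true

    recognizer.recognitionTask(with: request) { result, error in
        if let result = result, result.isFinal {
            print(result.bestTranscription.formattedString)
        }
    }
}
```

So the transcription piece can already run offline for supported languages; what's missing is Siri itself using it system-wide.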
If your AirPods or Apple Watch are connected via Bluetooth to an iPhone that has a language saved offline, recognition should run through the iPhone rather than saving the files to each device so they can work independently.
Maybe they could allow it separately on Apple Watches that aren't paired to an iPhone via Bluetooth at the time, but it's possible they will only allow this on newer models with more built-in storage.
That brings me to my half-way point of this series of posts.
The second half will have posts on iOS, watchOS, Xcode/development, and maybe one on the Mac.
If you are enjoying them please like them and follow me here and on Twitter.