Mike Rockwell, the Apple Vision Pro chief, has replaced John Giannandrea as the executive in charge of Siri, in a shakeup meant to rescue Apple's flailing AI efforts.
I work on AI systems that integrate into other apps and make contextual requests. That’s the big feature that Apple hasn’t launched, and it’s very much a problem that others have solved before.
I’d have to know more specifics to really answer, but the gist is that it will cost an exorbitant amount for very little gain. There’s no magic to be had, and every single honest survey shows people overwhelmingly don’t want it.
Yeah, but a lot of those studies are about stupid stuff like an LLM in-app to look at grammar, or a diffusion model to throw stupid clip art into things. No one gives a shit about that stuff. You can easily just copy and paste from OpenAI’s experience and get access to more tools there.
That said, being able to ask an OS to look at a local vectorized DB of texts, images, and documents, recognize context, then compose and complete tasks based on that context (roughly the flow sketched after this comment)? That shit is fucking cool.
The thing is, a lot of people haven’t experienced that yet, so when they get asked about “AI,” their responses are framed by what they’ve experienced.
It’s the “faster horse” analogy. People who don’t know about cars, buses, and trains will ask for a faster horse when you ask them to envision a faster mode of transport.
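To make that concrete, here is a minimal sketch of that kind of on-device semantic lookup, using Apple’s built-in NLEmbedding sentence embeddings. The indexed snippets, the query, and the LocalSemanticIndex type are all made up for illustration; a real assistant would index texts, photos, and documents, not a handful of strings.

```swift
import Foundation
import NaturalLanguage

// Minimal sketch of an on-device semantic index over personal content.
// NLEmbedding.sentenceEmbedding is Apple's built-in sentence-embedding model;
// everything stays local, which is the point of the "local vectorized DB" idea.
struct LocalSemanticIndex {
    private let embedding: NLEmbedding
    private var entries: [(text: String, vector: [Double])] = []

    init?() {
        guard let embedding = NLEmbedding.sentenceEmbedding(for: .english) else { return nil }
        self.embedding = embedding
    }

    mutating func add(_ text: String) {
        guard let vector = embedding.vector(for: text) else { return }
        entries.append((text, vector))
    }

    // Return the stored items most similar to the query, by cosine similarity.
    func search(_ query: String, topK: Int = 3) -> [String] {
        guard let q = embedding.vector(for: query) else { return [] }
        return entries
            .map { (text: $0.text, score: cosine($0.vector, q)) }
            .sorted { $0.score > $1.score }
            .prefix(topK)
            .map { $0.text }
    }

    private func cosine(_ a: [Double], _ b: [Double]) -> Double {
        let dot = zip(a, b).map(*).reduce(0, +)
        let magA = sqrt(a.map { $0 * $0 }.reduce(0, +))
        let magB = sqrt(b.map { $0 * $0 }.reduce(0, +))
        return dot / (magA * magB + 1e-9)
    }
}

// Usage: index a few personal snippets, then resolve a vague request against them.
var index = LocalSemanticIndex()
index?.add("Vet appointment for Bucky on Friday at 3pm")
index?.add("Grandma's new address: 14 Elm Street")
index?.add("Flight confirmation ABC123 to Denver")
print(index?.search("when is the dog going to the vet?") ?? [])
```

The retrieval half is the easy part; the "compose and complete tasks" half is where the assistant has to chain results like these into actual app actions.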
Okay, fair enough, but I don’t want anything going to anyone. If it’s Apple, it’s at least covered by that DPA and it’s part of the whole deal, but OpenAI can gtfo of my phone. That’s a hard no.
Even then - what, find all the pictures of my dog Bucky and message them to grandma? I guess? Put pushpin icons on the map where all my phot- oh wait, it already does that. Umm... how many of my phone contacts are Masons? ¯\_(ツ)_/¯ I dunno - I just don’t know what I’d use it for.
I think enterprise needs will ensure that people develop solutions to this.
Companies can’t have their data creeping out into the public, or even into other parts of the org. If your customer, roadmap, or HR data got into the wrong hands, that could be a disaster.
Apple, Google, and Microsoft will never get AI into the workplace if AI is sharing confidential enterprise data outside the organization. And all of these tech companies desperately want their tools to be used in enterprises.
“It” is doing a lot of heavy lifting there. I agree a lot of the “AI” features being pushed are junk that I don’t want (e.g. Image Playground), but there are some that I do think will help a lot, as we discussed in the other chain.
The problem is that Apple’s extensive marketing of Apple Intelligence has led to expectations that far surpass what the final product is likely to be.
Most people think generative AI is magic coming out of a hat, so even if Apple delivers at the same level as other companies, people will feel like they’ve been misled.
Apple’s big problem is that Apple Intelligence’s two most interesting features - contextual awareness and Siri finally having deep app integration - never got reliable enough to make it into a public or developer beta this year.
That was the thing everyone wanted, but they basically only got LLM summaries/drafting and image generators. They got the stuff that is easy to make. (See the sketch after this comment for what the deep-integration side would look like for developers.)
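For a sense of what that deep integration means on the developer side, here is a hedged sketch using Apple’s AppIntents framework, which is the mechanism Apple has pointed to for exposing app actions to Siri. The intent name, parameters, and dialog are hypothetical; only the protocol shapes are Apple’s.

```swift
import AppIntents

// Hypothetical App Intent: the kind of hook an app exposes so that a
// context-aware Siri could fold it into a larger request
// ("send the photos of Bucky to Grandma").
struct SharePhotosIntent: AppIntent {
    static var title: LocalizedStringResource = "Share Matching Photos"

    @Parameter(title: "Search text")
    var query: String

    @Parameter(title: "Recipient")
    var recipient: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would query the photo library and hand the results to the
        // messaging layer; this sketch just reports what it would do.
        return .result(dialog: "Sending photos matching '\(query)' to \(recipient).")
    }
}
```

The part that never shipped is the assistant layer that picks an intent like this and fills in its parameters from personal context; the app-side plumbing has existed for a while.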
Like what, specifically? I think most people will be very happy if the only practical improvement is Siri working a lot better, which should be achievable.
Why can’t it work?