LLMs on their own are not a viable replacement for assistants because you need a working assistant core to integrate with other services. An LLM layer on top of the assistant, for better handling of natural-language prompts, is what I imagined would happen. What Gemini is doing seems ridiculous, but I guess that’s Google developing multiple competing products again.
It is a replacement for a specific portion of a very complicated ecosystem-wide integration involving a ton of interoperability sandwiched between the natural language bits. Why this is a new product and not an Assistant overhaul is anybody’s guess. Some blend of complicated technical issues and corporate politics, I bet.
1. Convert the incoming voice to text.
2. Pre-parse the text against a library of exact voice commands. If any match, execute them, pass the confirmation along, and jump to 6.
3. If no valid commands match, pass the input to the LLM.
4. Have the LLM heavily trained on the commands and on the API output for them; if none apply, it produces ordinary responses.
5. Check the response for API outputs, handle them appropriately and send the confirmation forward; otherwise pass the output on.
6. Convert to voice.
The LLM part obviously also needs all kinds of sanitization on both sides, like they do now, but exact commands should preempt the LLM entirely, if you’re insisting on using one.
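A minimal sketch of that dispatch order, assuming hypothetical speech_to_text, call_llm, text_to_speech, and dispatch_api_call stand-ins for the real ASR, LLM, TTS, and service layers; the only point it illustrates is that the exact-command table short-circuits before the LLM is ever invoked:

```python
import json
import re

# Hypothetical stand-ins: a real assistant would call an ASR engine,
# an LLM endpoint, and a TTS engine here.
def speech_to_text(audio: str) -> str:
    return audio  # pretend the "audio" already arrives transcribed

def call_llm(prompt: str) -> str:
    # Pretend the LLM decided this needs an API call and emitted JSON.
    return json.dumps({"api": "weather.lookup", "args": {"query": prompt}})

def text_to_speech(text: str) -> str:
    return f"[spoken] {text}"

# Step 2: the exact-command library, checked before the LLM ever runs.
COMMAND_LIBRARY = {
    r"^turn (on|off) the lights$": lambda m: f"Lights turned {m.group(1)}.",
    r"^set a timer for (\d+) minutes$": lambda m: f"Timer set for {m.group(1)} minutes.",
}

def dispatch_api_call(call: dict) -> str:
    # Hypothetical executor: route {"api": ..., "args": ...} to the right
    # backend service and return a confirmation string.
    return f"Done: {call['api']}."

def handle(audio: str) -> str:
    text = speech_to_text(audio).strip().lower()             # step 1

    # Step 2: pre-parse against the command library; on a hit, execute,
    # pass the confirmation along, and jump straight to step 6 (TTS),
    # bypassing the LLM entirely.
    for pattern, action in COMMAND_LIBRARY.items():
        match = re.match(pattern, text)
        if match:
            return text_to_speech(action(match))

    # Steps 3-4: no exact command matched, so hand the input to an LLM
    # trained/prompted to emit either a structured API call or plain text.
    llm_output = call_llm(text)

    # Step 5: check the response for an API call; handle it and forward
    # the confirmation, otherwise pass the LLM's text through unchanged.
    try:
        api_call = json.loads(llm_output)
    except json.JSONDecodeError:
        api_call = None
    if isinstance(api_call, dict) and "api" in api_call:
        return text_to_speech(dispatch_api_call(api_call))   # step 6
    return text_to_speech(llm_output)                         # step 6

print(handle("Turn on the lights"))        # exact command; LLM never invoked
print(handle("Do I need an umbrella?"))    # falls through to the LLM path
```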