Using @OpenAI gpt-realtime-2 to get a glimpse of future voice-first experiences.
A market dashboard you don’t click through.
You direct it.
Say, “Focus on Apple,” and the whole interface changes.
Ask, “How did it do over the last 30 days?” and the chart updates.
Say, “Go back,” and the market view returns.
No menus.
No filters.
No hunting around.
Just intent.
What makes this model especially interesting is the interaction loop: you can interrupt it, add more context, change direction, and it keeps reasoning in real time while updating the experience around you.
The interface doesn’t ask you to navigate.
It just takes you there.