
Add streaming support for real-time AI responses #63

@jhs

Description

Overview

Enable streaming support for real-time, token-by-token display of AI responses in both the Python and TypeScript implementations.

Benefits

  • Improved user experience: responses render progressively instead of arriving all at once
  • Reduced perceived latency, since tokens appear as soon as they are generated
  • Especially valuable for long responses, which otherwise block until complete

Implementation Notes

  • Must maintain isomorphic implementation across Python and TypeScript
  • Should integrate with existing Interviewer.go() flow
  • Consider how streaming interacts with field validation and tool calls
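As a rough illustration of the second note, a minimal sketch of how an `onToken` callback could thread through `Interviewer.go()` without breaking non-streaming callers. The `Interviewer` class body, `GoOptions` shape, and token source below are assumptions for illustration, not the real API:

```typescript
// Hypothetical sketch: the names Interviewer, go(), and onToken come from this
// issue; everything else (GoOptions, the token source) is assumed.
type GoOptions = {
  onToken?: (token: string) => void; // invoked once per token when streaming
};

class Interviewer {
  // Stand-in token source; the real implementation would stream from the LLM.
  constructor(private tokens: string[]) {}

  // When onToken is provided, each token is surfaced as it arrives; the full
  // response is still returned, so existing non-streaming callers see no change.
  async go(opts: GoOptions = {}): Promise<string> {
    let full = "";
    for (const token of this.tokens) {
      full += token;
      opts.onToken?.(token);
    }
    return full;
  }
}
```

Keeping the callback optional would let validation and tool-call handling continue to operate on the completed response, with streaming purely additive on top.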

Scope

  • Python implementation
  • TypeScript implementation
  • React hook integration (via onToken callback)
  • Documentation and examples

Related

  • Mentioned during the React integration design discussion
  • Would enhance useChatfield hook with streaming: true option
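On the React side, the hook's internal streaming state could be modeled as a small reducer. This is a hypothetical sketch of what `useChatfield` might manage when `streaming: true` is set; the state and action shapes below are assumptions, not the hook's actual interface:

```typescript
// Hypothetical internal state for a streaming useChatfield hook.
// Only the hook name and the streaming: true option come from this issue.
type StreamState = { text: string; done: boolean };
type StreamAction =
  | { type: "token"; token: string } // one token from onToken
  | { type: "done" };                // response finished

function streamReducer(state: StreamState, action: StreamAction): StreamState {
  switch (action.type) {
    case "token":
      // Append each token so the component re-renders progressively.
      return { ...state, text: state.text + action.token };
    case "done":
      return { ...state, done: true };
  }
}
```

A pure reducer like this would plug directly into React's `useReducer`, with the `onToken` callback dispatching `token` actions as they arrive.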

Metadata

Assignees: No one assigned
Labels: No labels
Projects: No projects
Milestone: No milestone
Relationships: None yet
Development: No branches or pull requests