
Conversation

@laciferin2024 (Contributor) commented Nov 9, 2025

PR Checklist

Please read and check all that apply.

Changesets

  • This PR includes a Changeset under .changeset/ describing the change (patch/minor/major) with a clear, user-focused summary
  • OR this PR is docs/tests-only and I added the skip-changeset label

Quality

  • UI builds locally: bun run build (Vite)
  • E2E or unit tests added/updated where applicable (playwright, vitest if used)
  • No breaking changes to public interfaces without a major bump

Notes

  • Add a changeset via: bun run changeset
  • Policy and examples: see aidocs/changesets.md

Summary by CodeRabbit

  • New Features

    • Enhanced billing transparency with improved cost tracking and header-based pricing data in receipt exports.
    • Upgraded streaming response handling for improved reliability.
  • Bug Fixes

    • Added graceful error handling for streaming failures and abort scenarios with UI rollback.
    • Improved text extraction from diverse response formats.
  • Chores

    • Configured SPA routing for improved deployment behavior.

vercel bot commented Nov 9, 2025

The latest updates on your projects.

Project: console
Deployment: Building
Preview: Comment
Updated (UTC): Nov 9, 2025 5:25pm

@coderabbitai (Contributor) commented Nov 9, 2025

Caution

Review failed

The pull request is closed.

Walkthrough

The changes refactor the OpenAI streaming flow from fetch-based to axios-based with incremental chunk processing, implement transparent billing by extracting costs from response headers, add an output text extraction utility to normalize diverse response shapes, and update string literal types from single to double quotes. A Vercel SPA rewrite configuration was also added.
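The header-based cost extraction described above might look roughly like the sketch below. The header names (openai-input-cost, openai-output-cost) come from the PR's change summary; the BillingMeta shape and the parse helper are illustrative assumptions, not the PR's actual code.

```typescript
// Sketch of pulling per-request costs from response headers into billing
// metadata. Header names are from the PR summary; the BillingMeta shape
// and the parse helper are assumptions for illustration.

interface BillingMeta {
  inputCost?: number;
  outputCost?: number;
}

function extractBillingMeta(
  headers: Record<string, string | undefined>
): BillingMeta {
  // Headers arrive as strings; keep only values that parse to finite numbers.
  const parse = (v: string | undefined): number | undefined => {
    const n = v === undefined ? NaN : Number(v);
    return Number.isFinite(n) ? n : undefined;
  };
  return {
    inputCost: parse(headers["openai-input-cost"]),
    outputCost: parse(headers["openai-output-cost"]),
  };
}
```

Note that axios lower-cases response header names, so the lookups above assume the normalized form.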

Changes

ChatPlayground streaming and billing overhaul — src/components/ChatPlayground.tsx
Replaced the fetch-based /responses streaming with an axios post using onDownloadProgress for chunked parsing; implemented an SSE-like buffer parser to extract data blocks; added header-based cost extraction (openai-input-cost, openai-output-cost) into lastRun.BillingMeta; introduced an extractOutputText helper to normalize OpenAI response shapes; updated string literal types to double quotes; enhanced error handling for streaming abort/failure scenarios.

Vercel SPA configuration — vercel.json
Added a rewrite rule routing all paths (`/(.*)`) to /index.html to enable single-page application behavior.
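The rewrite described above corresponds to a vercel.json along these lines (the diff itself is not shown here, so treat this as a reconstruction of the standard Vercel SPA rewrite pattern):

```json
{
  "rewrites": [
    { "source": "/(.*)", "destination": "/index.html" }
  ]
}
```

This routes every path to index.html so that client-side routing handles navigation instead of Vercel returning 404s on deep links.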

Sequence Diagram

sequenceDiagram
    participant User
    participant ChatPlayground
    participant axiosClient
    participant OpenAI API

    User->>ChatPlayground: Submit chat request
    ChatPlayground->>axiosClient: post with onDownloadProgress
    axiosClient->>OpenAI API: Stream request

    loop Chunk Reception
        OpenAI API-->>axiosClient: SSE-like data chunks
        axiosClient->>ChatPlayground: onDownloadProgress callback
        ChatPlayground->>ChatPlayground: Buffer & parse chunks
        ChatPlayground->>ChatPlayground: Extract delta, append to message
        ChatPlayground->>User: Render streamed content
    end

    OpenAI API-->>axiosClient: Response complete + headers
    axiosClient->>ChatPlayground: Extract billing headers<br/>(openai-input-cost, etc.)
    ChatPlayground->>ChatPlayground: Store in lastRun.BillingMeta
    ChatPlayground->>ChatPlayground: extractOutputText(response)
    ChatPlayground->>User: Display final message + receipt
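The "Buffer & parse chunks" step in the loop above can be sketched as follows. This is a minimal illustration of an SSE-like incremental parser; the name parseSSEChunks is hypothetical (the PR's actual parser is inline in ChatPlayground.tsx), and the key concern it demonstrates is a data: block split across two onDownloadProgress callbacks.

```typescript
// Minimal sketch of an SSE-like incremental parser for use with axios's
// onDownloadProgress. parseSSEChunks is an illustrative name, not the
// PR's actual identifier.

/** Split a growing buffer into complete `data:` payloads plus a remainder. */
function parseSSEChunks(buffer: string): { events: string[]; rest: string } {
  const events: string[] = [];
  // SSE events are separated by a blank line; the trailing element may be a
  // partial event, so carry it over as `rest` for the next callback.
  const parts = buffer.split("\n\n");
  const rest = parts.pop() ?? "";
  for (const part of parts) {
    for (const line of part.split("\n")) {
      if (line.startsWith("data:")) {
        const payload = line.slice(5).trim();
        // "[DONE]" is OpenAI's stream terminator, not a JSON payload.
        if (payload && payload !== "[DONE]") events.push(payload);
      }
    }
  }
  return { events, rest };
}

// Usage (sketch): keep `rest` between onDownloadProgress calls, prepend it
// to each new chunk, JSON.parse each event, and append the delta to the
// in-progress message before rendering.
```

In the browser, axios's onDownloadProgress fires on an accumulating XHR responseText, so the caller must also track how much of the buffer has already been consumed.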

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~40 minutes

  • Streaming refactor in ChatPlayground.tsx: Careful review of the axios onDownloadProgress handler, buffer management, and SSE-like parser logic to ensure chunk fragmentation/edge cases are handled correctly.
  • Header extraction and billing metadata: Verify that cost headers are correctly parsed, stored in BillingMeta, and passed to receipt/export logic without data loss or type mismatches.
  • Output text extraction logic: extractOutputText utility must robustly handle multiple response shapes; verify it gracefully falls back across all documented OpenAI response variants.
  • Error handling paths: Confirm AbortError and APIUserAbortError are caught and UI state is correctly rolled back without orphaned messages.
  • Type system consistency: Ensure double-quoted string literals are consistently applied across all internal types and that ExpandableHash/toggleHashExpansion signatures align throughout.
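The fallback behavior expected of extractOutputText (third bullet above) can be sketched as below. The real helper lives in src/components/ChatPlayground.tsx and may differ; the field names here assume common OpenAI Responses API and chat-completions shapes rather than the PR's exact code.

```typescript
// Illustrative sketch of an output-text normalizer across several OpenAI
// response shapes. Field names are assumptions based on public API shapes,
// not verified against the PR.

type AnyResponse = Record<string, unknown>;

function extractOutputText(resp: AnyResponse): string {
  // 1. Convenience field exposed by the Responses API.
  if (typeof resp.output_text === "string") return resp.output_text;

  // 2. Structured output array: [{ content: [{ type: "output_text", text }] }].
  const output = resp.output;
  if (Array.isArray(output)) {
    const texts: string[] = [];
    for (const item of output) {
      const content = (item as AnyResponse)?.content;
      if (!Array.isArray(content)) continue;
      for (const part of content) {
        const text = (part as AnyResponse)?.text;
        if (typeof text === "string") texts.push(text);
      }
    }
    if (texts.length) return texts.join("");
  }

  // 3. Chat-completions-style fallback: choices[0].message.content.
  const choices = resp.choices;
  if (Array.isArray(choices)) {
    const msg = (choices[0] as AnyResponse)?.message as AnyResponse | undefined;
    if (typeof msg?.content === "string") return msg.content;
  }

  // Nothing recognizable: return empty rather than throwing mid-render.
  return "";
}
```

A review pass would check that each branch above matches a documented response variant and that the empty-string fallback cannot mask a real error.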


Poem

🐰 From fetch to axios we now stream so fast,
Chunks buffered and parsed, no droplets get past!
Billing transparency in headers we find,
Costs captured, costs logged, costs all aligned—
The tunnel grows brighter, responses more kind! ✨


📜 Recent review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 47c287d and c837d87.

⛔ Files ignored due to path filters (1)
  • bun.lock is excluded by !**/*.lock
📒 Files selected for processing (2)
  • src/components/ChatPlayground.tsx (20 hunks)
  • vercel.json (1 hunks)


@laciferin2024 laciferin2024 merged commit 48fe80e into main Nov 9, 2025
3 of 7 checks passed
