🤖 fix: stabilize pre-stream workspace status indicator #3132
Conversation
@codex review

Codex Review: Didn't find any major issues. Nice work!

ℹ️ About Codex in GitHub: Your team has set up Codex to review pull requests in this repo. If Codex has suggestions, it will comment; otherwise it will react with 👍. Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".
@codex review

Codex Review: Didn't find any major issues. Nice work!
Force-pushed 059759f to 75c5245
@codex review

Codex Review: Didn't find any major issues. What shall we delve into next?
@codex review

Force-pushed 75c5245 to 8c3bd13

Codex Review: Didn't find any major issues. Another round soon, please!
Keep the workspace sidebar status label anchored through the pre-stream handoff by reserving the loader slot and by reusing the pending requested model before stream-start. Add component and store coverage for the new pending-model sidebar path.

---

_Generated with `mux` • Model: `openai:gpt-5.4` • Thinking: `xhigh` • Cost: `$5.46`_
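The loader-slot reservation described above can be sketched roughly as follows. This is a minimal illustration, not the actual mux component API: `StreamPhase`, `StatusRowLayout`, and `statusRowLayout` are hypothetical names.

```typescript
// Hypothetical sketch: what the sidebar status row renders per phase.
// StreamPhase / statusRowLayout are illustrative names, not mux's API.
type StreamPhase = "idle" | "starting" | "streaming";

interface StatusRowLayout {
  // Keep the slot mounted through the starting -> streaming handoff so
  // the model label does not jump left when the spinner goes away.
  reserveLoaderSlot: boolean;
  // The spinner itself only animates while the turn is starting.
  showSpinner: boolean;
}

function statusRowLayout(phase: StreamPhase): StatusRowLayout {
  return {
    reserveLoaderSlot: phase === "starting" || phase === "streaming",
    showSpinner: phase === "starting",
  };
}
```

With a shape like this, the spinner disappearing at stream-start only empties the slot instead of collapsing it, which is what keeps the label anchored.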
Force-pushed 8c3bd13 to 5969820
@codex review

Codex Review: Didn't find any major issues. Another round soon, please!
Summary
Stabilize the workspace sidebar's pre-stream status indicator so it no longer flashes when the turn moves from `starting` into `streaming`.

Background

The sidebar indicator showed a visible handoff flash before streaming began. The `starting` phase rendered a spinner inline with the model label, so removing it at stream-start caused the text to jump left. The sidebar also used the fallback model during `starting`, so the label could change again at stream-start when a pending send had requested a different model.

Implementation
- Thread `pendingStreamModel` through `WorkspaceSidebarState` so the sidebar can use the actual requested pre-stream model
- Reserve the loader slot across the `starting -> streaming` handoff so the live transition stays smooth while the steady streaming layout remains unchanged
- Add coverage for the `starting -> streaming` handoff and for pending requested model propagation into sidebar state

Validation
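The pending-model path above might look roughly like this. Only `pendingStreamModel` and the `WorkspaceSidebarState`-style snapshot are from this PR; the other field names and the selection helper are assumptions for illustration.

```typescript
// Hypothetical sketch of the sidebar's model-label choice. Only
// pendingStreamModel comes from the PR; the other fields are assumed.
interface SidebarSnapshot {
  phase: "idle" | "starting" | "streaming";
  pendingStreamModel?: string; // model requested by the pending send
  streamModel?: string; // model reported once the stream starts
  fallbackModel: string; // workspace default model
}

function sidebarModelLabel(s: SidebarSnapshot): string {
  if (s.phase === "starting") {
    // Prefer the actual requested model before stream-start so the
    // label does not change again when streaming begins.
    return s.pendingStreamModel ?? s.fallbackModel;
  }
  return s.streamModel ?? s.fallbackModel;
}
```

Selecting the pending model during `starting` is what removes the second label change at the handoff: the text the user sees before the stream begins is already the text the stream will confirm.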
- `make static-check`
- `bun test ./src/browser/components/WorkspaceStatusIndicator/WorkspaceStatusIndicator.test.tsx`
- `bun test ./src/browser/stores/WorkspaceStore.test.ts --filter "stream starting state"`
- `bun test ./src/browser/components/AgentListItem/AgentListItem.test.tsx`
- `make typecheck`
- `make lint`

Risks
Low risk. The change is scoped to the sidebar status indicator and the derived sidebar snapshot shape in `WorkspaceStore`. The new tests cover both the visual handoff logic and the pending-model state path.

---

_Generated with `mux` • Model: `openai:gpt-5.4` • Thinking: `xhigh` • Cost: `$5.46`_