Conversation

@pinkeshmars I just had a glance, and I think this should belong directly under the App Events folder because it falls under event-driven paradigms. We shouldn't create new folders in the concepts section for new features unless they don't fit anywhere else.

No problem. I moved it under the App Events folder now.
ayushflow left a comment
GenUI Chat Documentation Review
Overall this is solid documentation for a major feature — the writing is clear, the best practices section is genuinely excellent, and the FAQ coverage is thorough. The main issues are structural (wrong nesting, audience mismatch in the architecture section) and a gap in the "paradigm shift" narrative that this feature deserves.
Top priorities
- Restructure nesting — GenUI is a widget/capability, not a subcategory of App Events
- Lead with the paradigm shift — the "Build Primitives, Not Paths" framing should open the doc, not be buried in best practices
- Add one end-to-end tutorial — the single biggest gap; both examples are aspirational, not instructional
- Move architecture internals to a separate page or collapsible section
- Fill info gaps — model identity in setup, cost implications, conversation persistence, debugging guidance
Detailed comments on specific lines below.
> # GenUI Chat
Structure: GenUI shouldn't live under app-events/
GenUI is a widget/capability that uses app events as one of its three pillars — it's not an app event concept itself. This nesting implies GenUI is a subcategory of app events, which undersells the feature and confuses the information architecture. GenUI should be a top-level concept under ff-concepts/ (sibling to app-events, not a child of it).
> # GenUI Chat
>
> Usually, applications follow a fixed model: developers design screens, define navigation, and hard-code interactions. Users are limited to these predefined flows, and anything outside those paths simply isn’t supported.
The paradigm shift is undersold
The intro sketches the vision in two paragraphs then jumps to a mechanical example. The "Build Primitives, Not Paths" framing (currently buried in Best Practices at line 175) is the clearest articulation of the paradigm shift — that before/after comparison should be the opening of this doc, not a best practice.
Also missing: an explicit differentiation from chatbots. Most developers have seen chat widgets. GenUI is fundamentally different — the AI renders real UI components, not text bubbles. A reader skimming might think "oh, another chatbot widget." Even a single callout saying "GenUI is not a chatbot" would immediately reframe expectations.
|  | ||
|
|
||
| :::note | ||
| This doesn’t replace traditional UI. Navigation, dashboards, and structured flows still play an important role. GenUI introduces a **new layer,** dynamic, adaptive, and conversational, that handles the long tail of use cases traditional interfaces can’t efficiently cover. |
nit: "dynamic, adaptive, and conversational, that handles" — the comma after "conversational" creates a comma splice. Should be an em-dash:
GenUI introduces a **new layer** — dynamic, adaptive, and conversational — that handles the long tail...
> Follow the steps below to add GenUI Chat to your app:
>
> 1. Make sure you’ve completed the [Firebase integration](../../../ff-integrations/firebase/connect-to-firebase-setup.md), including the [initial setup](../../../ff-integrations/authentication/firebase-auth/auth-initial-setup.md) and configuration files.
> 2. Go to **Firebase Console > AI Logic** and enable it.
The user doesn't know what model they're using
Step 2 says "enable AI Logic" but the reader has no idea what LLM powers this until they reach the Architecture section at line 156 (where it's mentioned parenthetically). This is the setup section — it should say upfront:
GenUI is powered by Google Gemini via Firebase AI Logic.
Also missing: any mention of Firebase Blaze plan requirements or cost implications. Users will discover this when they hit billing errors, which is a bad first experience.
> <p></p>
>
> ### Customization
Customization section has no fallback content
This section is just an Arcade embed with no text explanation of what can be customized. If the embed fails to load (corporate firewalls, reader is offline, Arcade goes down), the reader gets a heading and nothing else. Add at least a bullet list of the customizable properties (colors, avatars, header text, input placeholder, spacing, etc.).
> - Navigation context
> - Device or sensor updates
>
> The runtime listens on `FFAppEventService.instance.localEventsStream` and converts matching events into hidden `InternalMessage`s.
Implementation detail in user-facing docs
FFAppEventService.instance.localEventsStream is an internal class/stream name that users never interact with. This line should describe the behavior, not the implementation:
GenUI automatically listens for matching local events and converts them into hidden context messages for the conversation.
> ## Message Construction
>
> Each listener has a `message_template`. GenUI resolves it in this order:
Message Construction section is confusing
"FlutterFlow variable binding, if configured" and "Literal input value, if configured" are unexplained. How does the user configure a variable binding vs. a literal value? What does this look like in the Properties panel? A screenshot or brief description of the configuration UI would make this actionable.
Also line 39: Event data: ${event.data?.toMap()} — showing Dart template syntax is unhelpful for users. Reframe as behavior:
If the event carries payload data, GenUI automatically appends it to the message sent to the model.
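To make the precedence actionable, the page could pair the prose with a short sketch of the resolution order. The following is purely illustrative (TypeScript for neutrality; the field names and `resolveMessage` are hypothetical, not the actual generated code):

```typescript
// Hypothetical sketch of the resolution order described in the doc.
// All names here are illustrative, not FlutterFlow's generated code.
type Listener = {
  variableBinding?: () => string; // FlutterFlow variable binding, if configured
  literalValue?: string;          // Literal input value, if configured
  messageTemplate: string;        // Fallback: the raw message template
};

function resolveMessage(
  listener: Listener,
  eventData?: Record<string, unknown>,
): string {
  // 1. A configured variable binding wins.
  // 2. Otherwise a configured literal value.
  // 3. Otherwise the raw message template.
  let message =
    listener.variableBinding?.() ??
    listener.literalValue ??
    listener.messageTemplate;
  // If the event carries payload data, it is appended to the message
  // sent to the model.
  if (eventData !== undefined) {
    message += `\nEvent data: ${JSON.stringify(eventData)}`;
  }
  return message;
}
```

Even a diagram of this fallback chain would make the section actionable without exposing implementation details.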
> ## Serialization Rules
>
> The generated code serializes common FlutterFlow data types into model-friendly JSON:
nit: The serialization rules would be more useful with at least one concrete example showing the actual JSON output:
Color(0xFF4CAF50) → "#4CAF50"
DateTime(2024, 3, 15) → "2024-03-15T00:00:00.000"
Right now a reader knows that Color becomes a CSS string, but not what it looks like.
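For instance, the color and list rules could be demonstrated with a tiny sketch like this (TypeScript for illustration only; the real serializer is generated Dart and may differ in details):

```typescript
// Illustrative sketch of the serialization rules above — not the actual
// generated serializer. Shows Color -> CSS-style hex string, and the
// item-by-item rule for lists.
function serializeColor(argb: number): string {
  // Drop the alpha byte and render the RGB channels as "#RRGGBB",
  // e.g. Color(0xFF4CAF50) -> "#4CAF50".
  return "#" + (argb & 0xffffff).toString(16).toUpperCase().padStart(6, "0");
}

function serializeColorList(colors: number[]): string[] {
  // List return values are serialized item-by-item using the same rules.
  return colors.map(serializeColor);
}
```

A two-column "input value / serialized JSON" table would serve the same purpose if the authors prefer not to show code.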
> List return values are serialized item-by-item using the same rules.
>
> ## Best Practices
Missing: error handling behavior
What happens when a tool throws an exception at runtime? This is only documented in the main page's FAQ ("What happens when a tool fails?"). Since this is the dedicated tools page, it should mention the error handling behavior here — even a single sentence:
If a tool throws an exception, the error is caught and sent back to the model as a structured error payload. The UI remains stable and the model can explain the failure or suggest alternatives.
> **3. App Event Integration:** Your app’s events provide real-time context to the AI. Things like user actions, state changes, or backend updates can trigger responses. With auto-response enabled, the AI doesn’t wait for input; it proactively reacts and updates the experience as things happen.
>
> 
nit: Image filename three-pillers.avif has a typo — should be three-pillars.avif.
## Description

Add GenUI Chat docs

**Linear ticket and magic word:** Fixes DEVR-1240

## Type of change