Embedded AI vs Atlas
Atlas is our centralized conversational UI. Embedded AI is the broader practice of applying AI to in-app experiences and to individual Anvil2 components. An experience may use one or both approaches, depending on which is appropriate.
Within the app, use embedded AI patterns. Within the conversational interface, or for actions that lead directly into it, use the Atlas Framework.
Embedded AI in Components
Anvil components are steadily gaining AI labeling integrated into the component itself. When embedded AI elements are enabled, a component receives a combination of visual and behavioral updates. See AI Mark for more.
User awareness of AI
Users should be made aware that an interaction involves AI. The AI Mark is usually sufficient disclosure, but more complex workflows may need additional disclosures.
How should disclaimers be used?
The mark itself denotes AI. If an element carries the AI Mark, you do not need tooltips or popovers restating that AI was involved.
Instead, use tooltips and popovers on AI Marks that appear on first page load to give the user additional information about how AI has touched something and why it matters. For example, a summary card on first load might include a tooltip disclosing that AI generated the summary from the last three months of data. Think first about what helps a user understand what has changed from UI norms, or take action in a given scenario.
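The first-load disclosure above can be sketched as data rather than copy guidelines. This is a minimal, hypothetical shape — `AiMarkDisclosure` and its fields are illustrative assumptions, not the real Anvil2 component API:

```typescript
// Hypothetical disclosure config for an AI Mark; the real Anvil2 API may differ.
interface AiMarkDisclosure {
  // When the tooltip/popover should appear.
  trigger: "first-load" | "hover";
  // What AI did and why it matters to the user.
  message: string;
}

// Example: a summary card disclosing the data window behind its AI summary.
const summaryDisclosure: AiMarkDisclosure = {
  trigger: "first-load",
  message: "This summary was generated by AI from the last three months of data.",
};

console.log(summaryDisclosure.message);
```

The point of the shape is that the message explains what changed and why, rather than merely repeating that AI was involved.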
How many AI marks are acceptable on one page?
We recommend one mark per page. Favor marking the places where AI meaningfully changes the outcome, and avoid treating the mark as decoration. If you do have multiple marks and the screen feels crowded, reduce marks before you shrink copy or layout. Use judgment and keep density tied to real AI involvement, not page real estate.
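The density guidance above can be expressed as a small planning step: mark only the places where AI meaningfully changes the outcome, then flag pages that still exceed the recommendation. This is a hypothetical sketch — `AiMarkCandidate`, `planMarks`, and the "meaningful" flag are illustrative names, and how "meaningful" is judged is up to your product:

```typescript
// Hypothetical candidate for an AI Mark on a page element.
interface AiMarkCandidate {
  elementId: string;
  // Whether AI meaningfully changes this element's outcome (product judgment).
  aiMeaningfullyChangesOutcome: boolean;
}

// Keep only meaningful marks, and flag pages exceeding the recommended single mark.
function planMarks(candidates: AiMarkCandidate[]): {
  marks: string[];
  overRecommended: boolean;
} {
  const marks = candidates
    .filter((c) => c.aiMeaningfullyChangesOutcome)
    .map((c) => c.elementId);
  return { marks, overRecommended: marks.length > 1 };
}

const plan = planMarks([
  { elementId: "summary-card", aiMeaningfullyChangesOutcome: true },
  { elementId: "nav-menu", aiMeaningfullyChangesOutcome: false },
]);
console.log(plan); // → { marks: ["summary-card"], overRecommended: false }
```

When `overRecommended` comes back true, the guidance is to reduce marks before shrinking copy or layout.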
Last modified on April 14, 2026