AI Toolkit

Learn how to add the AI Toolkit chatbot to your Flutter application.

Hello and welcome to the Flutter AI Toolkit!

The AI Toolkit is a set of AI chat-related widgets that make it easy to add an AI chat window to your Flutter app. It's organized around an abstract LLM provider API, which makes it easy to swap in whichever LLM provider you'd like your chat to use. Out of the box, it comes with support for Firebase AI Logic.

Key features

  • Multiturn chat: Maintains context across multiple interactions.
  • Streaming responses: Displays AI responses in real-time as they are generated.
  • Rich text display: Supports formatted text in chat messages.
  • Voice input: Allows users to input prompts using speech.
  • Multimedia attachments: Enables sending and receiving various media types.
  • Function calling: Supports tool calls to the LLM provider.
  • Custom styling: Offers extensive customization to match your app's design.
  • Chat serialization/deserialization: Store and retrieve conversations between app sessions.
  • Custom response widgets: Introduce specialized UI components to present LLM responses.
  • Pluggable LLM support: Implement a simple interface to plug in your own LLM (see the sketch after this list).
  • Cross-platform support: Compatible with Android, iOS, web, and macOS platforms.
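
To illustrate the pluggable LLM support, here's a minimal sketch of a custom provider. It assumes the LlmProvider interface exposes streaming generate/send methods plus a chat history, which matches recent versions of the package; check the package's API reference for the exact signatures.

dart
import 'package:flutter/foundation.dart';
import 'package:flutter_ai_toolkit/flutter_ai_toolkit.dart';

// A toy provider that echoes the prompt back. A real provider would
// forward the prompt (and any attachments) to an LLM and stream its reply.
class EchoProvider extends ChangeNotifier implements LlmProvider {
  final _history = <ChatMessage>[];

  @override
  Stream<String> generateStream(
    String prompt, {
    Iterable<Attachment> attachments = const [],
  }) async* {
    // Yield in chunks to exercise the chat view's streaming UI.
    yield 'echo: ';
    yield prompt;
  }

  @override
  Stream<String> sendMessageStream(
    String prompt, {
    Iterable<Attachment> attachments = const [],
  }) async* {
    // A full implementation would also append the user and LLM messages
    // to _history and call notifyListeners() as the response arrives.
    yield* generateStream(prompt, attachments: attachments);
  }

  @override
  Iterable<ChatMessage> get history => _history;

  @override
  set history(Iterable<ChatMessage> history) {
    _history
      ..clear()
      ..addAll(history);
    notifyListeners();
  }
}

An instance then plugs into the chat view just like the Firebase provider shown below: LlmChatView(provider: EchoProvider()).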

Demo

Here's what the demo app looks like running the AI Toolkit:

[Screenshot: AI demo app]

The source code for this demo is available in the repo on GitHub.

Or, you can open it in Firebase Studio, Google's full-stack AI workspace and IDE that runs in the cloud:

Try in Firebase Studio

Get started

  1. Installation

    Add the following dependencies to your pubspec.yaml file:

    yaml
    dependencies:
      flutter_ai_toolkit: ^latest_version
      firebase_ai: ^latest_version
      firebase_core: ^latest_version
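
    Alternatively, you can add the dependencies from the command line, which resolves and pins the latest published versions for you:

    sh
    flutter pub add flutter_ai_toolkit firebase_ai firebase_core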
    
  2. Configuration

    The AI Toolkit supports both the Gemini API endpoint (for prototyping) and the Vertex AI endpoint (for production). Both require a Firebase project and an initialized firebase_core package, as described in the Get started with the Gemini API using the Firebase AI Logic SDKs docs.

    Once that's complete, integrate the new Firebase project into your Flutter app using the flutterfire CLI tool, as described in the Add Firebase to your Flutter app docs.
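
    Concretely, that means running the following from your project directory; the configure command generates the firebase_options.dart file used in the next step:

    sh
    dart pub global activate flutterfire_cli
    flutterfire configure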

    After following these instructions, you're ready to use Firebase to integrate AI in your Flutter app. Start by initializing Firebase:

    dart
    import 'package:firebase_core/firebase_core.dart';
    import 'package:firebase_ai/firebase_ai.dart';
    import 'package:flutter_ai_toolkit/flutter_ai_toolkit.dart';
    
    // ... other imports
    
    import 'firebase_options.dart'; // from `flutterfire config`
    
    void main() async {
      WidgetsFlutterBinding.ensureInitialized();
      await Firebase.initializeApp(options: DefaultFirebaseOptions.currentPlatform);
      runApp(const App());
    }
    
    // ...app stuff here
    

    With Firebase properly initialized in your Flutter app, you're now ready to create an instance of the Firebase provider. You can do this in one of two ways. For prototyping, use the Gemini API endpoint:

    dart
    import 'package:firebase_ai/firebase_ai.dart';
    import 'package:flutter_ai_toolkit/flutter_ai_toolkit.dart';
    
    // ... app stuff here
    
    class ChatPage extends StatelessWidget {
      const ChatPage({super.key});
    
      @override
      Widget build(BuildContext context) => Scaffold(
            appBar: AppBar(title: const Text(App.title)),
            // create the chat view, passing in the Firebase provider
            body: LlmChatView(
              provider: FirebaseProvider(
                // Use the Google AI endpoint
                model: FirebaseAI.googleAI().generativeModel(
                  model: 'gemini-2.5-flash',
                ),
              ),
            ),
          );
    }
    

    The FirebaseProvider class exposes the Firebase AI Logic SDK to the LlmChatView. Note that you provide a model name (you have several options from which to choose), but you do not provide an API key. All of that is handled as part of the Firebase project.
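
    The generativeModel factory also accepts optional tuning parameters. Here's a hedged sketch (parameter names are from the firebase_ai package; check its API reference for the current surface) that sets a system instruction and a generation config:

    dart
    FirebaseProvider(
      model: FirebaseAI.googleAI().generativeModel(
        model: 'gemini-2.5-flash',
        // Steer the model's persona and tone.
        systemInstruction: Content.system('You are a helpful Flutter assistant.'),
        // Lower temperature for more deterministic replies.
        generationConfig: GenerationConfig(temperature: 0.5),
      ),
    )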

    For production workloads, it's easy to swap in the Vertex AI endpoint:

    dart
    class ChatPage extends StatelessWidget {
      const ChatPage({super.key});
    
      @override
      Widget build(BuildContext context) => Scaffold(
            appBar: AppBar(title: const Text(App.title)),
            body: LlmChatView(
              provider: FirebaseProvider(
                // Use the Vertex AI endpoint
                model: FirebaseAI.vertexAI().generativeModel(
                  model: 'gemini-2.5-flash',
                ),
              ),
            ),
          );
    }
    

    For a complete example, check out the gemini.dart and vertex.dart examples.

  3. Set up device permissions

    To enable your users to take advantage of features like voice input and media attachments, ensure that your app has the necessary permissions (an iOS example follows this list):

    • Network access: To enable network access on macOS, add the following to your *.entitlements files:

      xml
      <plist version="1.0">
        <dict>
          ...
          <key>com.apple.security.network.client</key>
          <true/>
        </dict>
      </plist>
      

      To enable network access on Android, ensure that your AndroidManifest.xml file contains the following:

      xml
      <manifest xmlns:android="http://schemas.android.com/apk/res/android">
          ...
          <uses-permission android:name="android.permission.INTERNET"/>
      </manifest>
      
    • Microphone access: Configure according to the record package's permission setup instructions.

    • File selection: Follow the file_selector plugin's instructions.

    • Image selection: To take a picture on or select a picture from their device, refer to the image_picker plugin's installation instructions.

    • Web photo: To take a picture on the web, configure the app according to the camera plugin's setup instructions.
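
    On iOS, for example, several of these features come down to usage-description entries in your Info.plist. The keys below are the standard ones those plugins document; the strings are placeholders:

    xml
    <key>NSMicrophoneUsageDescription</key>
    <string>Used to capture voice prompts.</string>
    <key>NSCameraUsageDescription</key>
    <string>Used to take photos to attach to the chat.</string>
    <key>NSPhotoLibraryUsageDescription</key>
    <string>Used to attach photos from your library.</string>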

Examples

firebase_options.dart

To use the Vertex AI example app, place your Firebase configuration details into the example/lib/firebase_options.dart file. You can generate this file from within the example directory with the flutterfire CLI tool, as described in the Add Firebase to your Flutter app docs.
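
For reference, the generated file defines a DefaultFirebaseOptions class shaped roughly like the sketch below; the values shown are placeholders that the flutterfire CLI replaces with your real project settings:

dart
import 'package:firebase_core/firebase_core.dart';

// Generated by the flutterfire CLI; all values here are placeholders.
class DefaultFirebaseOptions {
  static FirebaseOptions get currentPlatform {
    // The real file switches on the target platform to return the
    // options for Android, iOS, web, or macOS.
    return android;
  }

  static const FirebaseOptions android = FirebaseOptions(
    apiKey: 'YOUR-API-KEY',
    appId: 'YOUR-APP-ID',
    messagingSenderId: 'YOUR-SENDER-ID',
    projectId: 'your-project-id',
  );
}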

Note: Be careful not to check the firebase_options.dart file into your git repo.

Feedback

As you use this package, please log issues and feature requests, and submit any code you'd like to contribute. We want your feedback and contributions to make the AI Toolkit as robust and useful as it can be for your real-world apps.