At Composio, we have been receiving a lot of emails lately regarding tech support, feature requests, collaboration, and other related matters. Managing them manually was becoming difficult.
So, I built an AI tool to route emails to the right person and Slack channel based on the content of the email.
Here’s how I did it.
But first, here's a quick introduction to who we are.
Composio is an open-source tooling infrastructure for building robust and reliable AI applications. We provide 100+ tools and integrations across industry verticals, from CRM, HRM, and sales to productivity, dev, and social media. With any large language model, you can integrate third-party services like GitHub, Slack, Gmail, Discord, etc., for free.
Please help us with a star. 🥹
It would help us to create more articles like this 💖
{% cta https://dub.composio.dev/hFeX2WP %}Star the Composio repository ⭐{% endcta %}
As mentioned, this project finds relevant emails and routes them to respective Slack channels and email IDs.
Here is the overall workflow.
To complete the project, you will need the following.
For the LLM, we will use OpenAI's GPT-4o.
To get an OpenAI API key, visit their website and create one. You may also need to add some credits.
You may also use Google's Gemini models.
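If you go with Gemini, you would swap the chat model we create later in agent.py. Here is a minimal sketch, assuming the langchain-google-genai package is installed and a GOOGLE_API_KEY environment variable is set:

# Hypothetical swap: Gemini instead of GPT-4o via LangChain
# (assumes `pip install langchain-google-genai` and GOOGLE_API_KEY in the environment)
from langchain_google_genai import ChatGoogleGenerativeAI

llm = ChatGoogleGenerativeAI(model="gemini-1.5-pro")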
The project has three parts: the project setup, the AI bot with its API back end, and the React front end.
To quickly get started, clone this repository.
Go to the backend directory and run the setup script. This will create a virtual environment and download the necessary libraries.
(Note: if you cannot execute the script, grant permission with `chmod +x setup.sh`.) You'll then be prompted to log in to Composio, link Gmail, and connect your Slack workspace.
Add your API keys to the `.env` file.
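For reference, the `.env` file holds the keys the code reads later: `COMPOSIO_API_KEY` is read via `os.environ` in `agent.py`, and `OPENAI_API_KEY` is the standard variable that `ChatOpenAI` picks up. The values below are placeholders:

# .env — placeholder values; replace with your own keys
OPENAI_API_KEY=sk-your-openai-key
COMPOSIO_API_KEY=your-composio-key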
This is the setup file.
#!/bin/bash
# Create a virtual environment
echo "Creating virtual environment..."
python3 -m venv ~/.venvs/gmail_support_bot
# Activate the virtual environment
echo "Activating virtual environment..."
source ~/.venvs/gmail_support_bot/bin/activate
# Install libraries from requirements.txt
echo "Installing libraries from requirements.txt..."
pip install -r requirements.txt
# Login to your account
echo "Login to your Composio account"
composio login
# Add Gmail tool
echo "Add Gmail tool"
composio add gmail
# Add Slackbot tool
echo "Add Slakbot tool"
composio add slackbot
# Copy env backup to the .env file
if [ -f ".env.example" ]; then
  echo "Copying .env.example to .env..."
  cp .env.example .env
else
  echo "No .env.example file found. Creating a new .env file..."
  touch .env
fi
# Prompt user to fill the .env file
echo "Please fill in the .env file with the necessary environment variables."
echo "Setup completed successfully!"
This will create a Python virtual environment and install the libraries from requirements.txt. You will also be prompted to log in to Composio, which will redirect you to the Composio login page.
Create an account on Composio and paste the displayed key into the terminal to log in to your Composio account.
You will then be redirected to the Google authentication page to add the Gmail and Google Sheets integrations.
Once the integrations are complete, you can visit the Composio dashboard and monitor them.
Execute the setup script.
cd backend && ./setup.sh
This part is optional; you may skip this section.
We will use Firebase for user authentication and authorization.
So, import the libraries and authenticate using a service account; the account credentials are stored in a JSON file.
import firebase_admin
from firebase_admin import credentials, auth, firestore
from pathlib import Path
import os
cred = credentials.Certificate(f"{Path.cwd()}/firebase/support-bot-49f93-94ae307979d3.json")
firebase_admin.initialize_app(cred)
This initializes the Firebase Admin app with the required credentials.
Once authenticated, we can initialize a Firestore client:
db = firestore.client()
This allows us to query and manipulate documents in Firestore collections.
def get_user_by_username(username):
    users_ref = db.collection('users')
    query = users_ref.where('uid', '==', username).limit(1)
    docs = query.get()
    for doc in docs:
        return doc.to_dict()
    return False
The function fetches a user document from Firestore by querying the `users` collection on the `uid` field.
def update_row(uid, new_row):
    users_ref = db.collection('users')
    query = users_ref.where('uid', '==', uid).limit(1)
    docs = query.get()
    for doc in docs:
        try:
            doc.reference.update({'sheetsConfig.row': str(new_row)})
            return True
        except Exception as e:
            print(f"Error updating user row: {e}")
            return False
    print(f"User with uid {uid} not found")
    return False
The above function updates a specific field, `sheetsConfig.row`, in the user's document.
def update_spreadsheet_id(username: str, spreadsheet_id: str):
    users_ref = db.collection('users')
    query = users_ref.where('username', '==', username).limit(1)
    docs = query.get()
    for doc in docs:
        try:
            doc.reference.update(
                {'sheetsConfig.spreadsheet_id': spreadsheet_id})
            print(f"Successfully updated spreadsheet_id for user {username}")
            return True
        except Exception as e:
            print(f"Error updating spreadsheet_id for user {username}: {e}")
            return False
    print(f"User {username} not found")
    return False
Similar to the row update, this function updates the `sheetsConfig.spreadsheet_id` field in the user's Firestore document.
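For illustration, here is how these helpers might be used together; the UID, username, and spreadsheet ID below are hypothetical:

# Hypothetical usage of the Firestore helpers defined above
user = get_user_by_username("user-uid-123")         # fetch the user document
if user:
    update_spreadsheet_id("alice", "1AbCdEfGhIjK")  # attach a sheet to the user's config
    update_row(user["uid"], 2)                      # point sheetsConfig.row at row 2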
Let’s now build the AI bot.
First, we will define two prompt builders in the `prompts.py` file. Each is a function that takes the email payload (and, for the second, the user's keyword configuration) and returns the prompt text, so the values can be filled in at runtime.
def prompt1(payload):
    return f"""
    1. Send an automatic reply to the sender of the email with the following message:
    "Thank you for your email. We have received it and will get back to you shortly"
    using the GMAIL_REPLY_TO_THREAD action & if any attachments are present, use GMAIL_SEND_EMAIL to send those too.
    2. Check if the email subject (subject) or body (messageText) in the payload contains any of the keywords specified in this dictionary: {[{'slackChannel': 'dev-channel', 'email': '[email protected]', 'keywords': 'bugs, errors, issues'}, {'slackChannel': 'growth-channel', 'email': '[email protected]', 'keywords': 'Collaboration, partnership, sponsor'}, {'slackChannel': 'hrishikesh-channel', 'email': '[email protected]', 'keywords': 'bill'}]}.
    3. If a keyword match is found:
        a. Check if the original email contains any attachments.
        b. If attachments are present, use the GMAIL_GET_ATTACHMENT action to download them.
        c. Send the email payload to the corresponding email address and Slack channel. If attachments are present, include the downloaded attachments.
    message: 'Forwarded email: subject & body.'
    Payload: {payload}
    """
def prompt2(payload, keywords):
    return f"""
    1. Check if the email subject (subject) or body (messageText) in the payload contains any of the keywords specified in this dictionary: {keywords}.
    2. If a keyword match is found:
        a. Check if the original email contains any attachments.
        b. If attachments are present, use the GMAIL_GET_ATTACHMENT action to download them.
        c. Send the email payload to the corresponding email address and Slack channel. If attachments are present, include the downloaded attachments.
    message: 'Forwarded email: subject & body.'
    Payload: {payload}
    """
Here's what each prompt does: `prompt1` first sends an automatic acknowledgement to the sender and then forwards the email based on keyword matches, while `prompt2` only performs the keyword-based forwarding; it is used when the sender is the user themselves, so no auto-reply is needed.
Next, in `agent.py`, we will define the agent and the workflow.
Import the libraries and load the environment variables.
import os
import re
import glob
import json
from composio.client.collections import TriggerEventData
from composio_crewai import Action, ComposioToolSet
from crewai import Agent, Crew, Task, Process
from crewai_tools.tools.base_tool import BaseTool
from langchain_openai import ChatOpenAI
from dotenv import load_dotenv
from typing import Any, Dict
import requests
from firebase.init import update_row
from firebase.init import db
# get_user_by_username
from pathlib import Path
from prompts import prompt1, prompt2
load_dotenv()
Now, create instances of the OpenAI chat model and the Composio toolset.
llm = ChatOpenAI(model="gpt-4o")
# Trigger instance
composio_toolset1 = ComposioToolSet(api_key=os.environ.get("COMPOSIO_API_KEY"))
Initialize the trigger event listener.
listener = composio_toolset1.create_trigger_listener()
Next, define a callback function with the event listener decorator.
@listener.callback(filters={"trigger_name": "GMAIL_NEW_GMAIL_MESSAGE"})
def callback_new_message(event: TriggerEventData) -> None:
...
Inside the decorator, the trigger name is set to `GMAIL_NEW_GMAIL_MESSAGE`, so whenever a new email arrives, the trigger forwards the email content to the callback function.
Now, this callback function receives the event data and pre-processes the email contents. Based on the content, the bot will route the emails to appropriate email IDs and Slack channels.
First, from the payload, extract the sender’s email.
@listener.callback(filters={"trigger_name": "GMAIL_NEW_GMAIL_MESSAGE"})
def callback_new_message(event: TriggerEventData) -> None:
    print("Received new email")
    payload = event.payload
    sender_email = payload['sender']
    sender_email = sender_email.strip()
Now, fetch the user's details from Firestore using the connection's unique user ID.
@listener.callback(filters={"trigger_name": "GMAIL_NEW_GMAIL_MESSAGE"})
def callback_new_message(event: TriggerEventData) -> None:
    print("Received new email")
    payload = event.payload
    sender_email = payload['sender']
    sender_email = sender_email.strip()

    def get_user_by_username(username):
        users_ref = db.collection('users')
        query = users_ref.where('username', '==', username).limit(1)
        docs = query.get()
        for doc in docs:
            return doc.to_dict()
        return False

    user = get_user_by_username(event.metadata.connection.clientUniqueUserId)
    uid = user['uid']
    keywords = user['keywords']
    user_email = user['email']
Now, define the Composio toolset with required actions.
@listener.callback(filters={"trigger_name": "GMAIL_NEW_GMAIL_MESSAGE"})
def callback_new_message(event: TriggerEventData) -> None:
    print("Received new email")
    payload = event.payload
    sender_email = payload['sender']
    sender_email = sender_email.strip()

    def get_user_by_username(username):
        users_ref = db.collection('users')
        query = users_ref.where('username', '==', username).limit(1)
        docs = query.get()
        for doc in docs:
            return doc.to_dict()
        return False

    user = get_user_by_username(event.metadata.connection.clientUniqueUserId)
    uid = user['uid']
    keywords = user['keywords']
    user_email = user['email']

    # Tools
    composio_toolset = ComposioToolSet(
        api_key=os.environ.get("COMPOSIO_API_KEY"),
        output_dir=Path.cwd() / "attachments",
        entity_id=event.metadata.connection.clientUniqueUserId)
    tools = composio_toolset.get_actions(actions=[Action.GMAIL_SEND_EMAIL,
                                                  Action.GMAIL_GET_ATTACHMENT,
                                                  Action.GMAIL_REPLY_TO_THREAD,
                                                  Action.SLACKBOT_SENDS_A_MESSAGE_TO_A_SLACK_CHANNEL])
Here are the actions we will be using:

- `GMAIL_SEND_EMAIL`: sends an email from the connected Gmail account.
- `GMAIL_GET_ATTACHMENT`: downloads attachments from an email.
- `GMAIL_REPLY_TO_THREAD`: sends a reply within an existing email thread.
- `SLACKBOT_SENDS_A_MESSAGE_TO_A_SLACK_CHANNEL`: posts a message to a Slack channel.
Now, define the CrewAI agent.
    # Agent (defined inside callback_new_message)
    email_assistant = Agent(
        role="Email Assistant",
        goal="Process incoming emails, send auto-replies, and forward emails based on keywords",
        backstory="You're an AI assistant that handles incoming emails, sends automatic responses, and forwards emails to appropriate recipients based on content, including attachments.",
        verbose=True,
        llm=llm,
        tools=tools,
        allow_delegation=False,
    )
The agent is given a specific role, goal, and backstory. These add extra context for the LLM before it completes the task. The agent also receives the LLM instance and the tools.
Now, define the task and the Crew.
    # (continuing inside callback_new_message)
    task_description = prompt1(payload) if user_email != sender_email else prompt2(payload, keywords)
    process_new_email = Task(
        description=task_description,
        agent=email_assistant,
        expected_output="Summary of email processing, including confirmation of auto-reply sent, whether the email was forwarded & message sent to slack based on keyword matching, and if any attachments were included in the forwarded email.",
    )
    email_processing_crew = Crew(
        agents=[email_assistant],
        tasks=[process_new_email],
        verbose=1,
        process=Process.sequential,
    )
    result = email_processing_crew.kickoff()
    return result
This is what is happening in the above code block: the task description is `prompt1(payload)` when the email comes from someone other than the user (so the sender gets an auto-reply before routing) and `prompt2(payload, keywords)` otherwise; the task is assigned to the agent, wrapped in a sequential Crew, and `kickoff()` runs the workflow and returns the result.
print("Email trigger listener activated!")
listener.listen()
This will start the event listener.
Run this script, and you will have the active event listener ready to fetch emails and orchestrate the workflow.
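Assuming the file is saved as `agent.py` (as referenced above), start the listener with:

python agent.py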
Now, we will define our API endpoints in the `main.py` file.
Import the libraries and modules and create the FastAPI app.
from fastapi import FastAPI, HTTPException, Request, Depends
from fastapi.security import HTTPBearer, HTTPAuthorizationCredentials
from pydantic import BaseModel
from fastapi.middleware.cors import CORSMiddleware
from firebase.init import auth
from composio_config import createNewEntity, isEntityConnected, enable_gmail_trigger
import logging
from initialise_agent import initialise
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)
app = FastAPI()
Add origins and middleware.
origins = [
"http://localhost",
"http://localhost:5173",
]
app.add_middleware(
CORSMiddleware,
allow_origins=origins,
allow_credentials=True,
allow_methods=["*"],
allow_headers=["*"],
)
Define Pydantic models.
# Pydantic models
class UserData(BaseModel):
    username: str
    appType: str


class NewEntityData(BaseModel):
    username: str
    appType: str
    redirectUrl: str


class EnableTriggerData(BaseModel):
    username: str


class InitialiseAgentData(BaseModel):
    username: str
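Note that the endpoints below depend on a `verify_token` function that isn't shown here. A minimal sketch using the Firebase Admin SDK, assuming the client sends the Firebase ID token as a Bearer header, could look like this:

# Hypothetical sketch of the verify_token dependency used by the endpoints below.
# It validates the Firebase ID token sent in the Authorization header.
security = HTTPBearer()


async def verify_token(credentials: HTTPAuthorizationCredentials = Depends(security)) -> dict:
    try:
        # `auth` comes from firebase.init (firebase_admin.auth)
        return auth.verify_id_token(credentials.credentials)
    except Exception as e:
        raise HTTPException(status_code=401, detail=f"Invalid token: {e}")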
Define the following endpoints.
@app.post("/newentity")
async def handle_request(user_data: NewEntityData,
decoded_token: dict = Depends(verify_token)):
user_id = decoded_token['uid']
username = user_data.username
appType = user_data.appType
redirectUrl = user_data.redirectUrl
res = createNewEntity(username, appType, redirectUrl)
return res
@app.post("/enabletrigger")
async def handle_request(user_data: EnableTriggerData,
decoded_token: dict = Depends(verify_token)):
user_id = decoded_token['uid']
username = user_data.username
res = enable_gmail_trigger(username)
return res
@app.post("/checkconnection")
async def handle_request(user_data: UserData,
decoded_token: dict = Depends(verify_token)):
user_id = decoded_token['uid']
username = user_data.username
appType = user_data.appType
res = isEntityConnected(username, appType)
return res
@app.post("/initialiseagent")
async def handle_request(user_data: InitialiseAgentData,
decoded_token: dict = Depends(verify_token)):
username = user_data.username
res = initialise(username)
return res
@app.get("/")
async def handle_request():
return "ok"
Here are the descriptions of each endpoint.

- `POST /newentity`: Creates a new Composio entity for the given `username` and `appType`, using the provided `redirectUrl`, authenticated using the decoded token.
- `POST /enabletrigger`: Enables a Gmail trigger for the specified `username`, based on the provided data and authentication token.
- `POST /checkconnection`: Checks whether the entity for the given `username` and `appType` is connected, authenticated using the decoded token.
- `POST /initialiseagent`: Initializes an agent for the provided `username`, authenticated using the decoded token.
- `GET /`: A simple health check that returns "ok" to confirm that the service is running.

Finally, define the Uvicorn server.
if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="0.0.0.0", port=8000)

# Start the server (if running locally)
# Run the following command in your terminal: uvicorn main:app --reload
Start the Uvicorn server by running the script:
python main.py
This will start your server on port 8000.
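You can sanity-check the server with a quick request. The health check needs no token, while the other endpoints expect a Firebase ID token (placeholder values below):

curl http://localhost:8000/
# -> "ok"

curl -X POST http://localhost:8000/initialiseagent \
  -H "Authorization: Bearer <FIREBASE_ID_TOKEN>" \
  -H "Content-Type: application/json" \
  -d '{"username": "alice"}'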
For brevity, we will not go deep into the front-end code. Let's look at the important pages.
This is going to be our home page.
import Hero from "../components/Hero";
import Benefits from "../components/Benefits";
import FAQ from "../components/FAQ";
import Working from "../components/Working";
import ActionButton from "../components/ActionButton";
const Home = () => {
    return (
        <section className="bg-white dark:bg-gray-900 mt-12">
            <div className="py-8 px-4 mx-auto max-w-screen-xl text-center lg:py-16 lg:px-12">
                <Hero />
                <Benefits />
                <Working />
                <FAQ />
                <div className="mt-20">
                    <ActionButton displayName={"Get started"} link={"#"} />
                </div>
            </div>
        </section>
    );
};

export default Home;
This will create a simple home page like the following image.
Next is the Settings page; you can check out its code here.
On this page, users can add their Slack and Gmail accounts and configure keywords with their respective Slack channels and Gmail IDs.
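The configuration saved here is what the agent later reads back as `user['keywords']`. Going by the dictionary embedded in `prompt1`, each entry maps a set of keywords to a Slack channel and an email address; the values below are hypothetical:

# Hypothetical keyword-routing config, matching the shape used in prompt1
keywords = [
    {"slackChannel": "dev-channel", "email": "dev@example.com", "keywords": "bugs, errors, issues"},
    {"slackChannel": "growth-channel", "email": "growth@example.com", "keywords": "collaboration, partnership, sponsor"},
]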
Here is the `app.jsx` file.
import { BrowserRouter, Routes, Route, Navigate } from "react-router-dom";
import { onAuthStateChanged } from "firebase/auth";
import { auth } from "./config/firebase";
import Navbar from "./components/Navbar";
import Home from "./pages/Home";
import Footer from "./components/Footer";
import ScrollToTop from "./components/ScrollToTop";
import { useState, useEffect } from "react";
import Login from "./pages/Login";
import Settings from "./pages/Settings";
import NotFound from "./pages/NotFound";
import SkeletonLoader from "./components/SkeletonLoader";
import { SnackbarProvider } from 'notistack'
const ProtectedRoute = ({ user, children }) => {
    if (!user) {
        return <Navigate to="/login" replace />;
    }
    return children;
};

const App = () => {
    const [user, setUser] = useState(null);
    const [loading, setLoading] = useState(true);

    useEffect(() => {
        const unsubscribe = onAuthStateChanged(auth, (user) => {
            setUser(user);
            setLoading(false);
        });
        return () => unsubscribe();
    }, []);

    if (loading) {
        return <SkeletonLoader />;
    }

    return (
        <BrowserRouter>
            <SnackbarProvider autoHideDuration={3000} preventDuplicate={true} anchorOrigin={{ vertical: 'bottom', horizontal: 'center' }}>
                <Navbar user={user} />
                <ScrollToTop />
                <Routes>
                    <Route path="/login" element={<Login />} />
                    <Route path="/Settings" element={
                        <ProtectedRoute user={user}>
                            <Settings user={user} />
                        </ProtectedRoute>
                    } />
                    <Route path="/" element={<Home />} />
                    <Route path="*" element={<NotFound />} />
                </Routes>
                <Footer />
            </SnackbarProvider>
        </BrowserRouter>
    );
};

export default App;
Here's what the app does:

- It uses `onAuthStateChanged` to track the user's authentication status and conditionally render protected routes (like `Settings`) based on whether the user is logged in.
- It uses `react-router-dom` to handle the different routes (like `/login`, `/settings`, and `/`), with protected routes redirecting unauthenticated users to the login page.
- Notifications are handled with `notistack`.

Now, here is `main.jsx`, the entry point to the app.
import { StrictMode } from 'react'
import { createRoot } from 'react-dom/client'
import App from './App.jsx'
import './index.css'
import ResponsiveMessage from './components/ResponsiveMessage.jsx'
createRoot(document.getElementById('root')).render(
    <StrictMode>
        <ResponsiveMessage />
        <App />
    </StrictMode>,
)
Once you are done with everything, launch the application using the following `npm` command.
npm run dev
This will start the front-end dev server at localhost:5173 (the Vite default, which the back end's CORS config allows).
You can now visit the app, configure the agent, and see it in action.
In this article, you built a complete AI tool that handles support emails and routes them to the respective email IDs and Slack channels.
If you liked the article, explore and star the Composio repository for more AI use cases.
{% cta https://dub.composio.dev/hFeX2WP %}Star the Composio repository ⭐{% endcta %}
Thank you for reading the article!