In a recent hackathon, I developed Agri Bot, an AI-powered chatbot designed to help farmers and agriculture enthusiasts by providing accurate, multilingual farming-related information. This article walks you through the features, architecture, and code behind Agri Bot, showing how it leverages modern technologies to create a user-friendly experience. In the agriculture sector, access to timely and accurate information is crucial for farmers and agriculture enthusiasts. Enter Agri Bot, an AI-in-agriculture chatbot designed to bridge the knowledge gap by providing multilingual support and real-time data.
This is the UI of the Streamlit Agri Bot app; it is a multilingual, conversational, real-time bot:

Key Features of Agri Bot: AI for Farmers
Agri Bot comes with several standout features that make it a valuable AI for farmers:
- Multilingual Support: Communicates in multiple languages, including English, Hindi, Telugu, Tamil, Bengali, Marathi, and Punjabi.
- AI-Powered Conversations: Uses the Llama 3-70B model to deliver intelligent, contextual responses.
- Real-Time Information Retrieval: Integrates with Wikipedia, Arxiv, and DuckDuckGo to fetch the latest agricultural information.
- Context-Aware Memory: Remembers previous interactions, ensuring a seamless user experience.
- User-Friendly Interface: Built with Streamlit, the interface is intuitive and easy to navigate.
Tech Stack for Agri Bot
The tech stack for Agri Bot includes:
- Frontend: Streamlit (Python)
- Backend: LangChain, OpenAI-compatible LLM client (via the Groq API)
- Search Tools: Wikipedia, Arxiv, DuckDuckGo
- Translation: Google Translator (via deep_translator)
- Memory Management: LangChain ConversationBufferMemory
Steps Involved in Building the Agri Bot
Here's a breakdown of the code that powers Agri Bot:
1. Importing Libraries
import os
import time
import streamlit as st
from langchain.memory import ConversationBufferMemory
from langchain.agents import initialize_agent, AgentType
from langchain.chat_models import ChatOpenAI
from langchain.schema import SystemMessage, HumanMessage, AIMessage
from langchain_community.tools import WikipediaQueryRun, ArxivQueryRun, DuckDuckGoSearchRun
from langchain_community.utilities import WikipediaAPIWrapper, ArxivAPIWrapper, DuckDuckGoSearchAPIWrapper
from langdetect import detect
from deep_translator import GoogleTranslator
from dotenv import load_dotenv, find_dotenv
We start by importing the necessary libraries. streamlit is used for the web interface, while langchain provides the tools for building conversational agents. The deep_translator library handles language translation.
2. Loading Setting Variables
load_dotenv(find_dotenv())
This line loads environment variables from a .env file, which contains sensitive information such as API keys.
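To confirm the key was actually picked up before launching the app, a quick sanity check can help (a minimal sketch; GROQ_API_KEY is the same variable name used in load_llm below):

import os
from dotenv import load_dotenv, find_dotenv

load_dotenv(find_dotenv())

# Fail fast if the key required by the Groq-backed LLM is missing
assert os.getenv("GROQ_API_KEY"), "GROQ_API_KEY is missing from your .env file"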
3. Initializing AI Tools
wiki = WikipediaQueryRun(api_wrapper=WikipediaAPIWrapper(top_k_results=1, doc_content_chars_max=200))
arxiv = ArxivQueryRun(api_wrapper=ArxivAPIWrapper(top_k_results=1, doc_content_chars_max=200))
duckduckgo_search = DuckDuckGoSearchRun(api_wrapper=DuckDuckGoSearchAPIWrapper(region="in-en", time="y", max_results=2))
tools = [wiki, arxiv, duckduckgo_search]
Here, we initialize the tools for fetching information from Wikipedia, Arxiv, and DuckDuckGo. Each tool is configured to return a limited number of results to keep responses fast.
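Each tool can also be tried on its own before wiring it into the agent; a quick check (the queries are just examples):

# LangChain tools expose .run(query) and return plain text
print(wiki.run("crop rotation"))
print(duckduckgo_search.run("current wheat prices in India"))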
4. Loading the Language Model
def load_llm():
    return ChatOpenAI(
        model_name="llama3-70b-8192",
        temperature=1,
        openai_api_key=os.getenv("GROQ_API_KEY"),
        openai_api_base="https://api.groq.com/openai/v1"
    )
This function loads the language model via the Groq API, which exposes an OpenAI-compatible endpoint, so we can use ChatOpenAI. The temperature parameter controls the randomness of the model's responses.
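A quick way to confirm the Groq connection works is to call the model directly (a minimal sketch; the prompt is only an example):

llm = load_llm()
# On recent LangChain versions ChatOpenAI accepts a plain string via invoke();
# on older versions use llm.predict(...) instead
reply = llm.invoke("In one sentence, what is drip irrigation?")
print(reply.content)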
5. Translation Functions
def translate_to_english(text):
    try:
        detected_lang = detect(text)  # Detect language
        if detected_lang == "en":
            return text, "en"  # No translation needed
        translated_text = GoogleTranslator(source=detected_lang, target="en").translate(text)
        return translated_text, detected_lang  # Return translated text and original language
    except Exception as e:
        return text, "unknown"  # Return original text if translation fails

def translate_back(text, target_lang):
    try:
        if target_lang == "en":
            return text  # No translation needed
        return GoogleTranslator(source="en", target=target_lang).translate(text)
    except Exception as e:
        return text  # Return original if translation fails
These functions handle translating user input to English and back to the original language. They use the deep_translator library to perform the translations.
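A quick usage example (the Hindi question is just an illustration; langdetect can occasionally misidentify very short inputs):

english_text, original_lang = translate_to_english("गेहूं की बुवाई कब करनी चाहिए?")
print(original_lang)   # e.g. "hi"
print(english_text)    # the question translated into English
print(translate_back("Wheat is usually sown in November.", original_lang))  # back to Hindi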
6. Memory Management
if "chat_memory" not in st.session_state:
st.session_state.chat_memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
This code ensures the chat memory persists across Streamlit reruns, allowing the bot to remember previous interactions.
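Under the hood, ConversationBufferMemory simply keeps a list of messages; a minimal sketch of how it is written to and read back (the sample exchange is only an example):

memory = st.session_state.chat_memory
memory.save_context(
    {"input": "What is crop rotation?"},
    {"output": "Growing different crops in sequence on the same field."}
)
# Returns {"chat_history": [HumanMessage(...), AIMessage(...)]} because return_messages=True
print(memory.load_memory_variables({}))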
7. Creating the Conversational Agent
def get_conversational_agent():
    llm = load_llm()
    return initialize_agent(
        tools=tools,
        llm=llm,
        agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
        memory=st.session_state.chat_memory,
        verbose=True,
        return_intermediate_steps=False,
        max_iterations=5,
        handle_parsing_errors=True
    )
This function initializes the conversational agent with the loaded language model and the information-retrieval tools.
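Once the app is running (the memory lives in st.session_state), the agent can be exercised directly; a minimal sketch (the question is just an example):

agent = get_conversational_agent()
result = agent.invoke({"input": "Which crops are grown in Punjab during the kharif season?"})
print(result["output"])  # the agent's final English answer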
8. Streamlit Chat UI
def main():
    # Set Background Image
    ...
    st.title("🌾 Agri Bot (Multilingual) 🌾")
    st.subheader("Your Smart Assistant for Farming and Agriculture")

    if st.button("Reset Conversation"):
        st.session_state.chat_memory.clear()
        st.session_state.messages = []
        st.success("Chat history cleared!")

    if "messages" not in st.session_state:
        st.session_state.messages = []

    # Display past chat history
    for message in st.session_state.messages:
        st.chat_message(message["role"]).markdown(message["content"])

    # Get user input
    prompt = st.chat_input("Ask your farming-related question here (in any language)...")

    if prompt:
        st.chat_message("user").markdown(prompt)
        st.session_state.messages.append({"role": "user", "content": prompt})

        try:
            translated_query, original_lang = translate_to_english(prompt)
            st.write(f"🔍 *Detected Language:* {original_lang.upper()}")  # Show detected language
            st.write(f"🔄 *Translated Query:* {translated_query}")  # Show translated query

            agent = get_conversational_agent()

            def trim_chat_memory(max_length=5):
                """Keep only the last `max_length` messages in memory."""
                chat_history = st.session_state.chat_memory.load_memory_variables({})["chat_history"]
                if len(chat_history) > max_length:
                    st.session_state.chat_memory.chat_memory.messages = chat_history[-max_length:]
                return chat_history

            # Apply trimming before invoking the agent
            chat_history = trim_chat_memory(max_length=5)
            conversation_context = "\n".join([msg.content for msg in chat_history])

            full_prompt = f"""
            Previous conversation:
            {conversation_context}
            User: {prompt}
            Assistant: Think carefully. You are allowed to search a maximum of 2 times strictly.
            If you have found enough information from previous searches, STOP searching and generate a convincing answer using the available data.
            """

            # Retry in case of rate-limit errors
            max_retries = 3
            for attempt in range(max_retries):
                try:
                    response = agent.invoke({"input": full_prompt})
                    break  # Exit loop if successful
                except Exception as e:
                    st.warning(f"⚠ API Rate Limit! Retrying {attempt + 1}/{max_retries}...")
                    time.sleep(2)  # Wait and retry

            response_text = response["output"] if isinstance(response, dict) and "output" in response else str(response)
            final_response = translate_back(response_text, original_lang)  # Translate back to original language
            st.chat_message("assistant").markdown(final_response)
            st.session_state.messages.append({"role": "assistant", "content": final_response})

        except Exception as e:
            st.error(f"Error: {str(e)}")
Code Explanation
Let's break down the code's functionality, step by step:
1. Streamlit Setup
The code initializes a Streamlit application, creating the user interface for the chatbot.
2. Chat Input
st.chat_input creates a text input area where the user can type their messages.
3. User Message Handling
When the user submits a message:
- The message is captured.
- translate_to_english converts the user's message to English. This is essential for consistent interaction with the English-centric LLM.
- The original (user-language) and translated (English) messages are displayed in the chat window using st.chat_message.
4. LangChain Agent Query
- get_conversational_agent is called to retrieve or initialize a LangChain agent. This agent is designed to handle conversational queries using the LLM and the search tools.
- The current conversation history (from st.session_state.chat_memory) is included in the prompt sent to the agent. This context is essential for a coherent conversation.
- The agent processes the prompt (including the user's translated message and the chat history) and generates a response in English.
5. Response Handling
- The English response from the agent is captured.
- translate_back converts the agent's English response back to the user's original language.
- The translated response is displayed in the chat window using st.chat_message.
6. Context Management
trim_chat_memory is called to limit the conversation history stored in st.session_state.chat_memory. This prevents the context from becoming too large for the LLM to handle, which is a common limitation. It keeps only the most recent messages.
7. Retry Mechanism
The code includes a retry loop. If the API call to the LLM or translation service fails (e.g., due to rate limiting or temporary network issues), the code retries the request a fixed number of times before giving up. This makes the chatbot more robust.
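One limitation of the loop in main() is that if every attempt fails, response is never assigned and the error only surfaces later. A slightly more defensive variant (a sketch, not the version used in the app) re-raises after the last attempt and backs off exponentially:

import time

def invoke_with_retries(agent, payload, max_retries=3):
    """Retry agent.invoke with exponential backoff, re-raising the final error."""
    for attempt in range(max_retries):
        try:
            return agent.invoke(payload)
        except Exception:
            if attempt == max_retries - 1:
                raise  # give up after the final attempt
            time.sleep(2 ** attempt)  # wait 1s, 2s, 4s, ...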
8. Error Handling
The code includes try…except blocks to catch potential errors during API calls and other operations. This prevents the application from crashing and provides a more user-friendly experience (e.g., displaying an error message).
9. Session State
st.session_state is used to store data that persists across user interactions. Specifically, it stores the chat_memory, which holds the conversation history. This allows the chatbot to maintain context over multiple turns. Without st.session_state, the conversation would start fresh with every new message.
Testing the Agri Bot: An AI for Farmers
This is the UI of the Streamlit app.

Here, I asked the bot "What are the crops grown in Haryana?" and we can see that it detected the language as "en".

Now it gives real-time answers using the web search, Wikipedia, and Arxiv agents, and presents them precisely.

This image shows that Agri Bot can understand different regional languages and reply in them; here the detected language code is "te" (Telugu), and the output is also in that language.

Future Enhancements in AI for Farmers
While Agri Bot is functional, there are several areas for improvement:
- Voice Input and Responses: Adding support for voice interactions could improve accessibility (see the sketch after this list).
- Domain-Specific Fine-Tuning: Fine-tuning the model on agricultural data could improve response accuracy.
- UI/UX Enhancements: Further improvements to the user interface could provide a better user experience.
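As a rough illustration of the first item (purely hypothetical; the SpeechRecognition package is not part of the current project), spoken questions could be captured and passed into the existing translate_to_english pipeline:

import speech_recognition as sr

def listen_for_question() -> str:
    """Capture one spoken question from the microphone and return it as text."""
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        audio = recognizer.listen(source)
    # Google's free web recognizer; the text can then flow through translate_to_english()
    return recognizer.recognize_google(audio)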
Conclusion
Agri Bot is a powerful tool that leverages AI and multilingual capabilities to help farmers and agriculture enthusiasts. The combination of real-time information retrieval, language translation, and conversational memory makes it a unique and useful resource. I look forward to developing this project further and exploring new features to enhance its role in AI for agriculture.