This project is a full-stack application for running natural language processing workloads entirely locally, built around the DSPy framework developed by StanfordNLP. It combines a FastAPI backend for processing with a Streamlit frontend for interactive use, and relies on OpenAI or Cohere for language and embedding models, Weaviate for vector storage, and Arize Phoenix for observability. The sections below cover backend setup, frontend setup, and an optional Docker Compose workflow.
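Before diving into setup, here is a rough, hypothetical sketch of the kind of DSPy retrieval-augmented generation module the backend builds on. The class and field names are illustrative rather than taken from this repository:

```python
import dspy

# Illustrative signature; the backend's actual signatures may differ.
class GenerateAnswer(dspy.Signature):
    """Answer a question using retrieved context."""
    context = dspy.InputField(desc="relevant passages retrieved from the vector store")
    question = dspy.InputField()
    answer = dspy.OutputField()

class RAG(dspy.Module):
    def __init__(self, num_passages=3):
        super().__init__()
        # dspy.Retrieve uses whatever retrieval model is configured in dspy.settings
        self.retrieve = dspy.Retrieve(k=num_passages)
        self.generate_answer = dspy.ChainOfThought(GenerateAnswer)

    def forward(self, question):
        context = self.retrieve(question).passages
        return self.generate_answer(context=context, question=question)
```

In this project, the language model behind DSPy is OpenAI or Cohere and the retriever is backed by Weaviate, as configured in the backend.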
First, navigate to the backend directory:
cd backend/
Second, set up the environment:
poetry config virtualenvs.in-project true
poetry install
poetry shell
Specify your environment variables in a .env file in the backend directory. Example .env file:
ENVIRONMENT=<your_environment_value>
INSTRUMENT_DSPY=<true or false>
COLLECTOR_ENDPOINT=<your_arize_phoenix_endpoint>
OPENAI_API_KEY=<your_openai_api_key>
CO_API_KEY=<your_cohere_api_key>
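As a minimal sketch of how the backend could consume these variables (the actual project may use a different configuration mechanism), assuming python-dotenv is available:

```python
import os
from dotenv import load_dotenv  # python-dotenv

load_dotenv()  # read the .env file from the current working directory

ENVIRONMENT = os.getenv("ENVIRONMENT", "development")
INSTRUMENT_DSPY = os.getenv("INSTRUMENT_DSPY", "false").lower() == "true"
COLLECTOR_ENDPOINT = os.getenv("COLLECTOR_ENDPOINT")  # Arize Phoenix collector
OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")          # used by the OpenAI client
CO_API_KEY = os.getenv("CO_API_KEY")                  # used by the Cohere client
```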
Third, run this command to create embeddings of the data located in the data/example folder:
python app/utils/load.py
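Conceptually, this loading step reads the documents under data/example, embeds them, and stores the vectors in Weaviate. The sketch below illustrates the idea with the OpenAI embeddings API; the actual load.py may use Cohere or let Weaviate vectorize the documents itself, and the model name is only an example:

```python
from pathlib import Path
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

texts = [path.read_text() for path in Path("data/example").glob("*.txt")]
response = client.embeddings.create(model="text-embedding-3-small", input=texts)
vectors = [item.embedding for item in response.data]
# ...the texts and their vectors would then be written into a Weaviate collection.
```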
Then run this command to start the FastAPI server:
python main.py
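Once the server is up, the API can be exercised over HTTP. The route and payload below are placeholders (the default uvicorn port 8000 is assumed); check the interactive documentation FastAPI serves at /docs for the actual endpoints:

```python
import requests

response = requests.post(
    "http://localhost:8000/api/rag",  # hypothetical route
    json={"query": "What does the example data describe?"},  # hypothetical payload
)
print(response.json())
```

With the backend running, the frontend can be set up next.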
First, navigate to the frontend directory:
cd frontend/
Second, set up the environment:
poetry config virtualenvs.in-project true
poetry install
poetry shell
Specify your environment variables in a .env file in the frontend directory. Example .env file:
FASTAPI_BACKEND_URL=<your_fastapi_address>
Then run this command to start the Streamlit application:
streamlit run about.py
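As a rough sketch of how a Streamlit page such as about.py (or another page in the app) might forward user input to the backend, assuming the same hypothetical /api/rag route used above:

```python
import os
import requests
import streamlit as st

backend_url = os.getenv("FASTAPI_BACKEND_URL", "http://localhost:8000")

question = st.text_input("Ask a question about the indexed documents")
if st.button("Submit") and question:
    # The route and payload are placeholders; adapt them to the backend's API.
    resp = requests.post(f"{backend_url}/api/rag", json={"query": question})
    st.write(resp.json())
```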
This project now supports Docker Compose for easier setup and deployment, including the backend services and Arize Phoenix for query tracing.

Run
python -m app.utils.load
from the backend folder to create embeddings for the data located in the data/example folder.

Run
docker-compose -f compose.yml up
to spin up the services for the backend and Phoenix.

Run
docker compose down
to spin down the services.

The FastAPI and Streamlit integration allows for seamless interaction between the user and the NLP backend. Use the FastAPI endpoints for NLP tasks, and visualize results and interact with the system through the Streamlit frontend.
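If INSTRUMENT_DSPY is enabled, DSPy calls can be traced into the Phoenix collector configured by COLLECTOR_ENDPOINT. A minimal sketch of that wiring, assuming the arize-phoenix-otel and openinference-instrumentation-dspy packages and Phoenix's default local collector address, might look like:

```python
from openinference.instrumentation.dspy import DSPyInstrumentor
from phoenix.otel import register

# The endpoint below assumes Phoenix's default local collector; in this project
# it would come from the COLLECTOR_ENDPOINT environment variable.
tracer_provider = register(endpoint="http://localhost:6006/v1/traces")
DSPyInstrumentor().instrument(tracer_provider=tracer_provider)
```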
This example is a fork of dspy-rag-fastapi by @diicellman, and credit for the implementation goes to them.