A FastAPI project demonstrating integration with Ollama, an open-source, lightweight runtime for running large language models (LLMs) locally. This application lets you interact with Ollama models through FastAPI endpoints, making it easy to build AI-powered APIs.
- Python 3.8 or higher
- Ollama installed locally
```bash
git clone https://github.com/DevanshuSave/FastAPI.git
cd FastAPI
```

Install the Python dependencies:

```bash
pip install -r requirements.txt
```

Install Ollama by following the official installation guide for your OS at https://ollama.ai/download.
Start Ollama on your local machine:

```bash
ollama serve
```

Then start the FastAPI app:

```bash
uvicorn main:app --reload
```

The FastAPI app will be available at http://127.0.0.1:8000.
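To confirm Ollama is reachable before starting the API, you can call its HTTP endpoint directly. The sketch below assumes Ollama's default `/api/generate` endpoint on its default port 11434; the helper names (`build_payload`, `generate`) are illustrative and not part of this repository:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default port

def build_payload(model: str, prompt: str) -> bytes:
    # Ollama's /api/generate takes a JSON body with the model name and prompt;
    # "stream": False requests a single JSON response instead of a stream.
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode("utf-8")

def generate(model: str, prompt: str) -> str:
    # Send the prompt to the local Ollama server and return the generated text.
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

For example, `generate("llama2", "Why is the sky blue?")` should return a text completion once `ollama serve` is running and the model has been pulled.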
Open your browser and navigate to http://127.0.0.1:8000/docs to access the interactive API documentation and try out endpoints.