FastAPI Project with Ollama Integration

Description

A FastAPI project demonstrating integration with Ollama, an open-source, lightweight runtime for running large language models (LLMs) locally. This application lets you interact with Ollama models through FastAPI endpoints, making it easy to build AI-powered APIs.

Setup Instructions

Prerequisites

  • Python 3.8 or higher
  • Ollama installed locally

1. Clone the Repository

git clone https://github.com/DevanshuSave/FastAPI.git
cd FastAPI

2. Install Python Dependencies

pip install -r requirements.txt

3. Install and Run Ollama

Follow the official Ollama installation guide for your operating system at https://ollama.ai/download

Start Ollama on your local machine:

ollama serve
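Before starting the FastAPI app, you can verify that the Ollama server is reachable with a few lines of standard-library Python. Port 11434 is Ollama's default; adjust the URL if your install differs:

```python
# Quick reachability check for a local Ollama server.
# The default port 11434 is an assumption based on Ollama's standard install.
import urllib.error
import urllib.request

def ollama_is_up(base_url: str = "http://localhost:11434") -> bool:
    """Return True if an HTTP server answers at base_url, else False."""
    try:
        with urllib.request.urlopen(base_url, timeout=2) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

if __name__ == "__main__":
    print("Ollama reachable:", ollama_is_up())
```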

4. Run the FastAPI Application

uvicorn main:app --reload

The FastAPI app will be available at http://127.0.0.1:8000. The --reload flag restarts the server automatically when source files change, which is convenient during development but should be dropped in production.

5. Test the API

Open your browser and navigate to http://127.0.0.1:8000/docs to access the interactive API documentation and try out endpoints.
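Beyond the Swagger UI, you can exercise an endpoint from a short script. The `/generate` route and JSON body below are hypothetical placeholders for whatever endpoints the app actually exposes:

```python
# Build (and optionally send) a JSON POST request to the running app.
# The /generate route and {"prompt": ...} body are illustrative assumptions.
import json
import urllib.request

API_URL = "http://127.0.0.1:8000/generate"  # hypothetical endpoint

def make_request(url: str, prompt: str) -> urllib.request.Request:
    """Construct a JSON POST request for the given prompt."""
    body = json.dumps({"prompt": prompt}).encode()
    return urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )

# To actually send it (requires the server to be running):
# with urllib.request.urlopen(make_request(API_URL, "Hello")) as resp:
#     print(json.load(resp))
```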
