👉 In this post we'll show how to connect these APIs to a real production database (PostgreSQL) using SQLAlchemy or SQLModel, with environment variables for security.
Moving from SQLite to PostgreSQL is exactly what you'd do in a real-world project or production API.
Let’s go step-by-step for both Flask and FastAPI, showing how to:
- Connect to PostgreSQL securely
- Use environment variables for credentials
- Perform CRUD operations (just like before)
🧱 Overview
| Framework | ORM | DB | Connection Method |
|---|---|---|---|
| Flask | SQLAlchemy | PostgreSQL | via psycopg2 driver |
| FastAPI | SQLModel (built on SQLAlchemy) | PostgreSQL | via asyncpg or psycopg2 |
🐘 Before You Start
🧩 1. Install PostgreSQL
If it isn't installed already (Debian/Ubuntu):
sudo apt install postgresql postgresql-contrib
Then start the service and create a test DB:
sudo -u postgres psql
CREATE DATABASE mydb;
CREATE USER myuser WITH PASSWORD 'mypassword';
GRANT ALL PRIVILEGES ON DATABASE mydb TO myuser;
\q
🧩 2. Install required Python libraries
For both frameworks:
pip install psycopg2-binary sqlalchemy
Then per framework:
| Flask | FastAPI |
|---|---|
| pip install flask | pip install fastapi uvicorn sqlmodel |
🧩 3. Use Environment Variables for DB credentials
Create a .env file (never commit this in real projects):
DB_USER=myuser
DB_PASS=mypassword
DB_HOST=localhost
DB_PORT=5432
DB_NAME=mydb
And install dotenv:
pip install python-dotenv
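With these variables in place, the connection URL is just string assembly. Here is a minimal sketch of that step (the `build_postgres_url` helper is our own illustration, not part of SQLAlchemy or python-dotenv), falling back to the tutorial's defaults when a variable is unset:

```python
import os

# Illustrative helper (not a library function): assemble the SQLAlchemy
# connection URL from environment variables, using the tutorial's defaults
# when a variable is not set.
def build_postgres_url(driver: str = "psycopg2") -> str:
    user = os.getenv("DB_USER", "myuser")
    password = os.getenv("DB_PASS", "mypassword")
    host = os.getenv("DB_HOST", "localhost")
    port = os.getenv("DB_PORT", "5432")
    name = os.getenv("DB_NAME", "mydb")
    return f"postgresql+{driver}://{user}:{password}@{host}:{port}/{name}"

print(build_postgres_url())
```

The same helper works for an async setup by passing `driver="asyncpg"`; in the code below we simply inline the f-string instead.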
🐍 1️⃣ Flask + PostgreSQL (with SQLAlchemy)
# flask_postgres_api.py
from flask import Flask, jsonify, request
from flask_sqlalchemy import SQLAlchemy
from dotenv import load_dotenv
import os

# Load environment variables
load_dotenv()

app = Flask(__name__)

# Build PostgreSQL connection string
POSTGRES_URL = f"postgresql+psycopg2://{os.getenv('DB_USER')}:{os.getenv('DB_PASS')}@{os.getenv('DB_HOST')}:{os.getenv('DB_PORT')}/{os.getenv('DB_NAME')}"

app.config['SQLALCHEMY_DATABASE_URI'] = POSTGRES_URL
app.config['SQLALCHEMY_TRACK_MODIFICATIONS'] = False

db = SQLAlchemy(app)

# Define model
class Item(db.Model):
    __tablename__ = "items"
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(100), nullable=False)
    price = db.Column(db.Float, nullable=False)

    def to_dict(self):
        return {"id": self.id, "name": self.name, "price": self.price}

# Initialize DB
with app.app_context():
    db.create_all()

# CRUD Routes
@app.route("/items", methods=["GET"])
def get_items():
    items = Item.query.all()
    return jsonify([item.to_dict() for item in items])

@app.route("/items", methods=["POST"])
def add_item():
    data = request.get_json()
    new_item = Item(name=data["name"], price=data["price"])
    db.session.add(new_item)
    db.session.commit()
    return jsonify({"message": "Item added", "item": new_item.to_dict()}), 201

@app.route("/items/<int:item_id>", methods=["PUT"])
def update_item(item_id):
    data = request.get_json()
    item = Item.query.get(item_id)
    if not item:
        return jsonify({"error": "Item not found"}), 404
    item.name = data.get("name", item.name)
    item.price = data.get("price", item.price)
    db.session.commit()
    return jsonify({"message": "Item updated", "item": item.to_dict()})

@app.route("/items/<int:item_id>", methods=["DELETE"])
def delete_item(item_id):
    item = Item.query.get(item_id)
    if not item:
        return jsonify({"error": "Item not found"}), 404
    db.session.delete(item)
    db.session.commit()
    return jsonify({"message": f"Item {item_id} deleted"})

if __name__ == "__main__":
    app.run(debug=True)
✅ Run:
python flask_postgres_api.py
Your data will now persist in PostgreSQL, not in memory or SQLite.
⚡ 2️⃣ FastAPI + PostgreSQL (with SQLModel)
# fastapi_postgres_api.py
from fastapi import FastAPI, HTTPException
from sqlmodel import SQLModel, Field, Session, create_engine, select
from dotenv import load_dotenv
import os

# Load environment variables
load_dotenv()

# PostgreSQL connection
POSTGRES_URL = f"postgresql+psycopg2://{os.getenv('DB_USER')}:{os.getenv('DB_PASS')}@{os.getenv('DB_HOST')}:{os.getenv('DB_PORT')}/{os.getenv('DB_NAME')}"

engine = create_engine(POSTGRES_URL, echo=True)

# Define model
class Item(SQLModel, table=True):
    id: int | None = Field(default=None, primary_key=True)
    name: str
    price: float

# Create table if not exists
SQLModel.metadata.create_all(engine)

app = FastAPI()

# CREATE
@app.post("/items")
def add_item(item: Item):
    with Session(engine) as session:
        session.add(item)
        session.commit()
        session.refresh(item)
        return {"message": "Item added", "item": item}

# READ
@app.get("/items")
def get_items():
    with Session(engine) as session:
        items = session.exec(select(Item)).all()
        return items

# UPDATE
@app.put("/items/{item_id}")
def update_item(item_id: int, updated_item: Item):
    with Session(engine) as session:
        item = session.get(Item, item_id)
        if not item:
            raise HTTPException(status_code=404, detail="Item not found")
        item.name = updated_item.name
        item.price = updated_item.price
        session.commit()
        session.refresh(item)
        return {"message": "Item updated", "item": item}

# DELETE
@app.delete("/items/{item_id}")
def delete_item(item_id: int):
    with Session(engine) as session:
        item = session.get(Item, item_id)
        if not item:
            raise HTTPException(status_code=404, detail="Item not found")
        session.delete(item)
        session.commit()
        return {"message": f"Item {item_id} deleted"}
✅ Run:
uvicorn fastapi_postgres_api:app --reload
Then visit:
- API: http://127.0.0.1:8000/items
- Swagger docs: http://127.0.0.1:8000/docs
⚔️ Flask vs FastAPI (with PostgreSQL)
| Feature | Flask | FastAPI |
|---|---|---|
| ORM | SQLAlchemy | SQLModel (on top of SQLAlchemy) |
| Performance | Sync | Async ready |
| Input Validation | Manual | Auto via type hints |
| Auto Docs | ❌ | ✅ /docs |
| Recommended For | Simpler apps | Modern APIs & ML/AI services |
🧠 Summary
✅ You now have:
- Persistent CRUD APIs connected to PostgreSQL
- Secure connection via .env
- Ready-to-deploy production-style structure
Next, let's containerize these APIs using Docker so you can run Flask/FastAPI and PostgreSQL together easily. That's the real-world step before cloud deployment, and we'll build your understanding from scratch to deployment.
We’ll cover:
- Core REST API Concepts Recap
- Flask + PostgreSQL REST API (CRUD Example)
- FastAPI + PostgreSQL REST API (CRUD Example)
- Docker Deep Dive: what it is, why it’s used, and how to dockerize both apps
🧩 1️⃣ REST API — Quick Recap
A REST API (Representational State Transfer) allows two systems to communicate using HTTP methods.
| HTTP Method | Purpose | Example |
|---|---|---|
| GET | Read data | /users |
| POST | Create new data | /users |
| PUT/PATCH | Update data | /users/1 |
| DELETE | Remove data | /users/1 |
Each request returns a JSON response.
🧱 2️⃣ Flask + PostgreSQL Example
We’ll use:
- Flask → web framework
- psycopg2 → PostgreSQL connector
- PostgreSQL → database
- Docker → to containerize later
📦 Install
pip install flask psycopg2-binary
🧠 Flask Code: flask_app.py
from flask import Flask, request, jsonify
import psycopg2

app = Flask(__name__)

# PostgreSQL connection
conn = psycopg2.connect(
    host="localhost",
    database="testdb",
    user="postgres",
    password="yourpassword"
)

@app.route("/users", methods=["GET"])
def get_users():
    cur = conn.cursor()
    cur.execute("SELECT * FROM users;")
    rows = cur.fetchall()
    cur.close()
    return jsonify(rows)

@app.route("/users", methods=["POST"])
def create_user():
    data = request.json
    name = data["name"]
    age = data["age"]
    cur = conn.cursor()
    cur.execute("INSERT INTO users (name, age) VALUES (%s, %s)", (name, age))
    conn.commit()
    cur.close()
    return jsonify({"message": "User added successfully!"})

if __name__ == "__main__":
    app.run(debug=True)
🗃️ Table creation SQL (run in PostgreSQL shell):
CREATE TABLE users (
    id SERIAL PRIMARY KEY,
    name VARCHAR(100),
    age INT
);
⚡ 3️⃣ FastAPI + PostgreSQL Example
We’ll use:
- FastAPI → web framework
- psycopg2 → PostgreSQL connector (SQLAlchemy with asyncpg is the async alternative)
- Uvicorn → server for running the app
📦 Install
pip install fastapi uvicorn sqlalchemy psycopg2-binary
🧠 FastAPI Code: fastapi_app.py
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel
import psycopg2

app = FastAPI()

class User(BaseModel):
    name: str
    age: int

conn = psycopg2.connect(
    host="localhost",
    database="testdb",
    user="postgres",
    password="yourpassword"
)

@app.get("/users")
def get_users():
    cur = conn.cursor()
    cur.execute("SELECT * FROM users;")
    rows = cur.fetchall()
    cur.close()
    return {"users": rows}

@app.post("/users")
def create_user(user: User):
    cur = conn.cursor()
    cur.execute("INSERT INTO users (name, age) VALUES (%s, %s)", (user.name, user.age))
    conn.commit()
    cur.close()
    return {"message": "User added successfully!"}
Run:
uvicorn fastapi_app:app --reload
🐳 4️⃣ Deep Understanding of Docker (and Dockerizing API)
💡 What is Docker?
Docker is a containerization platform — it packages your code + dependencies + environment into lightweight containers that run anywhere (no “works on my machine” issue).
Think of a container as:
A mini virtual machine running only what’s needed for your app.
⚙️ Core Docker Concepts
| Term | Meaning |
|---|---|
| Image | A snapshot (template) of your app (like a recipe). |
| Container | A running instance of an image. |
| Dockerfile | A text file defining how to build your image. |
| Docker Hub | Public registry where images are stored. |
| docker-compose.yml | Defines multiple containers (e.g. app + db). |
🧩 Dockerfile for Flask or FastAPI
Create a file named Dockerfile in your project folder:
# Use official Python base image
FROM python:3.10
# Set working directory
WORKDIR /app
# Copy files
COPY . /app
# Install dependencies
RUN pip install -r requirements.txt
# Expose port (Flask 5000, FastAPI 8000)
EXPOSE 8000
# Command to run the app (FastAPI shown; for Flask use e.g. CMD ["python", "flask_app.py"])
CMD ["uvicorn", "fastapi_app:app", "--host", "0.0.0.0", "--port", "8000"]
🐘 Docker Compose for App + PostgreSQL
Create a file docker-compose.yml:
version: '3.8'
services:
  db:
    image: postgres:15
    container_name: postgres_db
    environment:
      POSTGRES_DB: testdb
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: yourpassword
    ports:
      - "5432:5432"
  api:
    build: .
    container_name: fastapi_app
    ports:
      - "8000:8000"
    depends_on:
      - db
▶️ Build and Run
docker-compose up --build
Now your FastAPI + PostgreSQL stack is running in containers:
- API → http://localhost:8000
- PostgreSQL → localhost:5432
🧠 Docker Summary
| Concept | Description |
|---|---|
| Why Docker? | Consistency, portability, isolated environment |
| Images vs Containers | Image = blueprint, Container = running instance |
| Dockerfile | Defines how to build your image |
| Compose | Runs multi-container apps easily |
| Common Command | docker ps, docker build, docker run, docker exec -it <container> bash |
Next, we'll persist PostgreSQL data using Docker volumes and connect through the SQLAlchemy ORM (models, queries, etc.) instead of raw SQL. This is the next real-world step for API + database + Docker projects.
Let’s continue with a complete and production-style FastAPI + PostgreSQL stack using SQLAlchemy ORM and Docker volumes for data persistence.
🧱 Overview of What You’ll Build
✅ FastAPI app — RESTful CRUD API for users
✅ PostgreSQL — running inside Docker
✅ Docker Volume — persistent DB storage
✅ SQLAlchemy ORM — model-based DB access
✅ Docker Compose — manages both containers together
⚙️ 1️⃣ Project Structure
fastapi_postgres_docker/
│
├── app/
│ ├── main.py
│ ├── models.py
│ ├── database.py
│ ├── schemas.py
│ ├── crud.py
│ └── __init__.py
│
├── requirements.txt
├── Dockerfile
└── docker-compose.yml
🧩 2️⃣ requirements.txt
fastapi
uvicorn
sqlalchemy
psycopg2-binary
pydantic
🧠 3️⃣ database.py — PostgreSQL Connection
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker, declarative_base  # declarative_base moved here in SQLAlchemy 1.4+

DATABASE_URL = "postgresql://postgres:yourpassword@db:5432/testdb"

engine = create_engine(DATABASE_URL)
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
Base = declarative_base()
Note: the hostname db refers to the PostgreSQL service defined in docker-compose.yml.
🧬 4️⃣ models.py — ORM Models (Tables)
from sqlalchemy import Column, Integer, String
from .database import Base

class User(Base):
    __tablename__ = "users"
    id = Column(Integer, primary_key=True, index=True)
    name = Column(String(100))
    age = Column(Integer)
🧾 5️⃣ schemas.py — Request/Response Validation
from pydantic import BaseModel

class UserBase(BaseModel):
    name: str
    age: int

class UserCreate(UserBase):
    pass

class UserOut(UserBase):
    id: int

    class Config:
        orm_mode = True  # Pydantic v1; in v2 use model_config = ConfigDict(from_attributes=True)
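What Pydantic buys you here is automatic validation and coercion of the request body before your route code runs. A rough stdlib approximation of that behaviour (a deliberate simplification for intuition, not how Pydantic is actually implemented):

```python
from dataclasses import dataclass

@dataclass
class UserBase:
    name: str
    age: int

    def __post_init__(self):
        # Hand-rolled stand-in for Pydantic-style validation/coercion
        if not isinstance(self.name, str):
            raise TypeError("name must be a string")
        self.age = int(self.age)  # coerce "25" -> 25, much like Pydantic would
```

With the real `BaseModel`, FastAPI performs these checks on every request and returns a 422 response automatically when they fail.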
🧩 6️⃣ crud.py — Database Operations
from sqlalchemy.orm import Session
from . import models, schemas

def get_users(db: Session):
    return db.query(models.User).all()

def create_user(db: Session, user: schemas.UserCreate):
    db_user = models.User(name=user.name, age=user.age)
    db.add(db_user)
    db.commit()
    db.refresh(db_user)
    return db_user
⚡ 7️⃣ main.py — FastAPI Routes
from fastapi import FastAPI, Depends
from sqlalchemy.orm import Session
from . import models, schemas, crud
from .database import SessionLocal, engine

models.Base.metadata.create_all(bind=engine)

app = FastAPI()

def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()

@app.get("/users", response_model=list[schemas.UserOut])
def read_users(db: Session = Depends(get_db)):
    return crud.get_users(db)

@app.post("/users", response_model=schemas.UserOut)
def create_user(user: schemas.UserCreate, db: Session = Depends(get_db)):
    return crud.create_user(db, user)
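The get_db dependency is a generator: FastAPI runs the code up to the yield before the request and the finally block after the response, so the session is always closed. The same pattern in plain Python, with a fake session object standing in for SessionLocal() so the sketch runs without a database:

```python
from contextlib import contextmanager

class FakeSession:
    """Stand-in for a SQLAlchemy session (illustration only, no DB needed)."""
    def __init__(self):
        self.closed = False
    def close(self):
        self.closed = True

@contextmanager
def get_db():
    db = FakeSession()
    try:
        yield db      # handed to the caller, like Depends(get_db) in a route
    finally:
        db.close()    # cleanup always runs, even if the body raises

with get_db() as db:
    assert not db.closed   # open while the "request" is being handled
```

After the with block exits, db.closed is True, mirroring how every request gets a fresh session that is reliably cleaned up.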
🐳 8️⃣ Dockerfile — FastAPI App
FROM python:3.10
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
# Copy the code as a package named "app" so the relative imports
# (from . import models, ...) resolve correctly
COPY ./app ./app
EXPOSE 8000
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
⚙️ 9️⃣ docker-compose.yml — FastAPI + PostgreSQL + Persistent Volume
version: '3.8'
services:
  db:
    image: postgres:15
    container_name: postgres_db
    environment:
      POSTGRES_DB: testdb
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: yourpassword
    ports:
      - "5432:5432"
    volumes:
      - pgdata:/var/lib/postgresql/data
  api:
    build: .
    container_name: fastapi_app
    ports:
      - "8000:8000"
    depends_on:
      - db
volumes:
  pgdata:
▶️ 10️⃣ Run the Application
docker-compose up --build
✅ Now visit:
- http://localhost:8000/docs → Swagger UI
- GET /users → list users
- POST /users → add user (JSON body: {"name": "John", "age": 25})
💾 Persistent Data with Docker Volumes
- The line pgdata:/var/lib/postgresql/data under volumes: means that even if you delete the container, the PostgreSQL data remains stored on your machine.
- You can verify using: docker volume ls
- To remove all data: docker-compose down -v
🔍 How the System Works Together
| Layer | Description |
|---|---|
| FastAPI | Defines API endpoints |
| SQLAlchemy ORM | Handles DB models & queries |
| PostgreSQL | Stores data persistently |
| Docker | Packages everything in isolated containers |
| Docker Compose | Orchestrates multiple containers |
| Volume | Ensures data persistence |
🧠 Key Takeaways
- ORM removes manual SQL, gives class-based DB interaction.
- Docker Compose makes multi-container development seamless.
- Volumes keep your DB safe from container deletion.
- FastAPI + PostgreSQL + Docker is a modern stack for microservices or data/AI backends.
Next, we'll go from basic CRUD to a full production-grade API using:
✅ Full CRUD operations (GET, POST, PUT, DELETE)
✅ Database migrations using Alembic (so you can evolve your DB safely)
✅ All running in Docker
🧩 Overview
You’ll extend your existing FastAPI + PostgreSQL + SQLAlchemy + Docker project.
Let’s jump step by step 👇
🧠 1️⃣ Update crud.py — Full CRUD Operations
from sqlalchemy.orm import Session
from . import models, schemas

# ✅ Read All
def get_users(db: Session):
    return db.query(models.User).all()

# ✅ Read One
def get_user_by_id(db: Session, user_id: int):
    return db.query(models.User).filter(models.User.id == user_id).first()

# ✅ Create
def create_user(db: Session, user: schemas.UserCreate):
    db_user = models.User(name=user.name, age=user.age)
    db.add(db_user)
    db.commit()
    db.refresh(db_user)
    return db_user

# ✅ Update
def update_user(db: Session, user_id: int, updated_user: schemas.UserCreate):
    user = db.query(models.User).filter(models.User.id == user_id).first()
    if user:
        user.name = updated_user.name
        user.age = updated_user.age
        db.commit()
        db.refresh(user)
    return user

# ✅ Delete
def delete_user(db: Session, user_id: int):
    user = db.query(models.User).filter(models.User.id == user_id).first()
    if user:
        db.delete(user)
        db.commit()
    return user
⚡ 2️⃣ Update main.py — Add Routes for All Operations
from fastapi import FastAPI, Depends, HTTPException
from sqlalchemy.orm import Session
from . import models, schemas, crud
from .database import SessionLocal, engine

models.Base.metadata.create_all(bind=engine)

app = FastAPI()

def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()

# ✅ Create user
@app.post("/users", response_model=schemas.UserOut)
def create_user(user: schemas.UserCreate, db: Session = Depends(get_db)):
    return crud.create_user(db, user)

# ✅ Read all users
@app.get("/users", response_model=list[schemas.UserOut])
def read_users(db: Session = Depends(get_db)):
    return crud.get_users(db)

# ✅ Read single user by ID
@app.get("/users/{user_id}", response_model=schemas.UserOut)
def read_user(user_id: int, db: Session = Depends(get_db)):
    user = crud.get_user_by_id(db, user_id)
    if not user:
        raise HTTPException(status_code=404, detail="User not found")
    return user

# ✅ Update user
@app.put("/users/{user_id}", response_model=schemas.UserOut)
def update_user(user_id: int, user: schemas.UserCreate, db: Session = Depends(get_db)):
    updated = crud.update_user(db, user_id, user)
    if not updated:
        raise HTTPException(status_code=404, detail="User not found")
    return updated

# ✅ Delete user
@app.delete("/users/{user_id}")
def delete_user(user_id: int, db: Session = Depends(get_db)):
    deleted = crud.delete_user(db, user_id)
    if not deleted:
        raise HTTPException(status_code=404, detail="User not found")
    return {"message": "User deleted successfully"}
🧪 Test All Endpoints (via Swagger)
Run:
docker-compose up --build
Then open:
👉 http://localhost:8000/docs
You’ll see and can test:
| Method | Endpoint | Description |
|---|---|---|
| POST | /users | Create user |
| GET | /users | Read all |
| GET | /users/{id} | Read one |
| PUT | /users/{id} | Update |
| DELETE | /users/{id} | Delete |
⚙️ 3️⃣ Alembic — Database Migrations
When you later add or modify database columns, you’ll need to version-control schema changes.
This is what Alembic does (like Git for databases).
📦 Install Alembic
Add alembic to requirements.txt, or install it directly:
pip install alembic
⚙️ Initialize Alembic
From your project root:
alembic init alembic
This creates:
alembic/
├── versions/
│   └── (empty for now)
├── env.py
└── script.py.mako
alembic.ini   (created in the project root)
🧩 Configure Alembic
In alembic.ini, update DB URL:
sqlalchemy.url = postgresql://postgres:yourpassword@localhost:5432/testdb
⚠️ If using Docker, replace localhost with db.
Modify alembic/env.py
At the top, add your models import:
from logging.config import fileConfig
from sqlalchemy import engine_from_config, pool
from alembic import context
from app.models import Base # 👈 Import your models here
from app.database import DATABASE_URL
config = context.config
fileConfig(config.config_file_name)
target_metadata = Base.metadata
config.set_main_option("sqlalchemy.url", DATABASE_URL)
🧱 Generate and Apply Migration
Step 1 — Create migration script:
alembic revision --autogenerate -m "create users table"
Step 2 — Apply migration:
alembic upgrade head
Alembic will now create or alter your tables automatically. 🎯
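For reference, the autogenerated revision script in alembic/versions/ looks roughly like this (the revision identifiers and column list are placeholders; yours will be generated from your models). It is a scaffold that only runs under Alembic's migration context:

```python
"""create users table"""
from alembic import op
import sqlalchemy as sa

# Revision identifiers, generated by Alembic (placeholders shown here)
revision = "abc123456789"
down_revision = None

def upgrade():
    # Applied by `alembic upgrade head`
    op.create_table(
        "users",
        sa.Column("id", sa.Integer, primary_key=True, index=True),
        sa.Column("name", sa.String(100)),
        sa.Column("age", sa.Integer),
    )

def downgrade():
    # Applied by `alembic downgrade -1`
    op.drop_table("users")
```

Always read the generated script before applying it; autogenerate is a diff heuristic, not a guarantee.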
🔍 4️⃣ Typical Workflow Going Forward
| Task | Command |
|---|---|
| Make model changes | edit models.py |
| Generate migration | alembic revision --autogenerate -m "message" |
| Apply migration | alembic upgrade head |
| Rollback (undo last) | alembic downgrade -1 |
📁 5️⃣ Docker + Alembic Integration (Optional Pro Tip)
You can auto-run Alembic migrations inside Docker by adding a script entry before uvicorn.
Example in Dockerfile:
CMD ["sh", "-c", "alembic upgrade head && uvicorn main:app --host 0.0.0.0 --port 8000"]
This ensures that whenever the container starts, migrations are applied automatically before the server comes up.
🧠 Key Takeaways
| Concept | Description |
|---|---|
| CRUD | Core create/read/update/delete operations |
| SQLAlchemy ORM | Cleaner Python-based DB handling |
| Alembic | Keeps DB schema in sync with code |
| Docker Volumes | Keeps data safe between rebuilds |
| Compose | Runs multi-container environments easily |
✅ Now your stack is production-ready:
- FastAPI app with full CRUD
- PostgreSQL database with persistence
- Alembic-managed schema
- Dockerized and portable