In this post we will show how to connect these APIs to a real production database (PostgreSQL) using SQLAlchemy or SQLModel, with environment variables for security.
Moving from SQLite to PostgreSQL is exactly what you'd do in a real-world project or production API.
Let's go step by step for both Flask and FastAPI, showing how to:
- Connect to PostgreSQL securely
- Use environment variables for credentials
- Perform CRUD operations (just like before)
Overview
| Framework | ORM | DB | Connection Method |
|---|---|---|---|
| Flask | SQLAlchemy | PostgreSQL | via psycopg2 driver |
| FastAPI | SQLModel (built on SQLAlchemy) | PostgreSQL | via asyncpg or psycopg2 |
Before You Start
1. Install PostgreSQL
If not already installed:
```bash
sudo apt install postgresql postgresql-contrib
```
Then start the service and create a test DB:
```bash
sudo -u postgres psql
```
```sql
CREATE DATABASE mydb;
CREATE USER myuser WITH PASSWORD 'mypassword';
GRANT ALL PRIVILEGES ON DATABASE mydb TO myuser;
-- On PostgreSQL 15+, also grant schema-level rights so the user can create tables:
GRANT ALL ON SCHEMA public TO myuser;
\q
```
2. Install the required Python libraries
For both frameworks:
```bash
pip install psycopg2-binary sqlalchemy
```
Then per framework:
| Flask | FastAPI |
|---|---|
| `pip install flask` | `pip install fastapi uvicorn sqlmodel` |
3. Use environment variables for DB credentials
Create a `.env` file (never commit this in real projects):
```ini
DB_USER=myuser
DB_PASS=mypassword
DB_HOST=localhost
DB_PORT=5432
DB_NAME=mydb
```
And install python-dotenv:
```bash
pip install python-dotenv
```
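Once the variables are loaded, the connection URL is just string assembly. A minimal sketch of that step on its own (the helper name `build_postgres_url` and the fallback defaults are ours, not part of any library):

```python
import os

def build_postgres_url(env=None):
    """Assemble a SQLAlchemy-style PostgreSQL URL from environment variables."""
    env = os.environ if env is None else env
    return (
        "postgresql+psycopg2://"
        f"{env.get('DB_USER', 'myuser')}:{env.get('DB_PASS', 'mypassword')}"
        f"@{env.get('DB_HOST', 'localhost')}:{env.get('DB_PORT', '5432')}"
        f"/{env.get('DB_NAME', 'mydb')}"
    )

# With no variables set, the defaults mirror the .env example above:
print(build_postgres_url({}))
# postgresql+psycopg2://myuser:mypassword@localhost:5432/mydb
```

Keeping the assembly in one small function makes it easy to swap hosts (e.g. `localhost` vs a Docker service name) without touching the rest of the app.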
1. Flask + PostgreSQL (with SQLAlchemy)
```python
# flask_postgres_api.py
from flask import Flask, jsonify, request
from flask_sqlalchemy import SQLAlchemy
from dotenv import load_dotenv
import os

# Load environment variables from .env
load_dotenv()

app = Flask(__name__)

# Build the PostgreSQL connection string from the environment
POSTGRES_URL = (
    f"postgresql+psycopg2://{os.getenv('DB_USER')}:{os.getenv('DB_PASS')}"
    f"@{os.getenv('DB_HOST')}:{os.getenv('DB_PORT')}/{os.getenv('DB_NAME')}"
)
app.config['SQLALCHEMY_DATABASE_URI'] = POSTGRES_URL
app.config['SQLALCHEMY_TRACK_MODIFICATIONS'] = False

db = SQLAlchemy(app)

# Define the model
class Item(db.Model):
    __tablename__ = "items"
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(100), nullable=False)
    price = db.Column(db.Float, nullable=False)

    def to_dict(self):
        return {"id": self.id, "name": self.name, "price": self.price}

# Create tables if they don't exist
with app.app_context():
    db.create_all()

# CRUD routes
@app.route("/items", methods=["GET"])
def get_items():
    items = Item.query.all()
    return jsonify([item.to_dict() for item in items])

@app.route("/items", methods=["POST"])
def add_item():
    data = request.get_json()
    new_item = Item(name=data["name"], price=data["price"])
    db.session.add(new_item)
    db.session.commit()
    return jsonify({"message": "Item added", "item": new_item.to_dict()}), 201

@app.route("/items/<int:item_id>", methods=["PUT"])
def update_item(item_id):
    data = request.get_json()
    item = db.session.get(Item, item_id)  # Query.get() is legacy in SQLAlchemy 2.x
    if not item:
        return jsonify({"error": "Item not found"}), 404
    item.name = data.get("name", item.name)
    item.price = data.get("price", item.price)
    db.session.commit()
    return jsonify({"message": "Item updated", "item": item.to_dict()})

@app.route("/items/<int:item_id>", methods=["DELETE"])
def delete_item(item_id):
    item = db.session.get(Item, item_id)
    if not item:
        return jsonify({"error": "Item not found"}), 404
    db.session.delete(item)
    db.session.commit()
    return jsonify({"message": f"Item {item_id} deleted"})

if __name__ == "__main__":
    app.run(debug=True)
```
Run:
```bash
python flask_postgres_api.py
```
Your data will now persist in PostgreSQL, not in memory or SQLite.
2. FastAPI + PostgreSQL (with SQLModel)
```python
# fastapi_postgres_api.py
from fastapi import FastAPI, HTTPException
from sqlmodel import SQLModel, Field, Session, create_engine, select
from dotenv import load_dotenv
import os

# Load environment variables from .env
load_dotenv()

# PostgreSQL connection string
POSTGRES_URL = (
    f"postgresql+psycopg2://{os.getenv('DB_USER')}:{os.getenv('DB_PASS')}"
    f"@{os.getenv('DB_HOST')}:{os.getenv('DB_PORT')}/{os.getenv('DB_NAME')}"
)
engine = create_engine(POSTGRES_URL, echo=True)

# Define the model
class Item(SQLModel, table=True):
    # `int | None` requires Python 3.10+; use Optional[int] on older versions
    id: int | None = Field(default=None, primary_key=True)
    name: str
    price: float

# Create tables if they don't exist
SQLModel.metadata.create_all(engine)

app = FastAPI()

# CREATE
@app.post("/items")
def add_item(item: Item):
    with Session(engine) as session:
        session.add(item)
        session.commit()
        session.refresh(item)
        return {"message": "Item added", "item": item}

# READ
@app.get("/items")
def get_items():
    with Session(engine) as session:
        items = session.exec(select(Item)).all()
        return items

# UPDATE
@app.put("/items/{item_id}")
def update_item(item_id: int, updated_item: Item):
    with Session(engine) as session:
        item = session.get(Item, item_id)
        if not item:
            raise HTTPException(status_code=404, detail="Item not found")
        item.name = updated_item.name
        item.price = updated_item.price
        session.commit()
        session.refresh(item)
        return {"message": "Item updated", "item": item}

# DELETE
@app.delete("/items/{item_id}")
def delete_item(item_id: int):
    with Session(engine) as session:
        item = session.get(Item, item_id)
        if not item:
            raise HTTPException(status_code=404, detail="Item not found")
        session.delete(item)
        session.commit()
        return {"message": f"Item {item_id} deleted"}
```
Run:
```bash
uvicorn fastapi_postgres_api:app --reload
```
Then visit:
- API: http://127.0.0.1:8000/items
- Swagger docs: http://127.0.0.1:8000/docs
Flask vs FastAPI (with PostgreSQL)
| Feature | Flask | FastAPI |
|---|---|---|
| ORM | SQLAlchemy | SQLModel (on top of SQLAlchemy) |
| Performance | Sync | Async ready |
| Input Validation | Manual | Auto via type hints |
| Auto Docs | No | Yes, at `/docs` |
| Recommended For | Simpler apps | Modern APIs & ML/AI services |
Summary
You now have:
- Persistent CRUD APIs connected to PostgreSQL
- A secure connection configured via `.env`
- A ready-to-deploy, production-style structure
Would you like me to show how to containerize these APIs using Docker (so you can run Flask/FastAPI + PostgreSQL together easily)?
That's the next real-world step before cloud deployment.
Perfect. We'll go step by step and build your understanding from scratch to deployment.
We'll cover:
- Core REST API Concepts Recap
- Flask + PostgreSQL REST API (CRUD Example)
- FastAPI + PostgreSQL REST API (CRUD Example)
- Docker Deep Dive: what it is, why it's used, and how to dockerize both apps
1. REST API: Quick Recap
A REST API (Representational State Transfer) allows two systems to communicate using HTTP methods.
| HTTP Method | Purpose | Example |
|---|---|---|
| GET | Read data | /users |
| POST | Create new data | /users |
| PUT/PATCH | Update data | /users/1 |
| DELETE | Remove data | /users/1 |
Each request returns a JSON response.
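To make the method-to-action mapping concrete before any framework is involved, here is a toy, framework-free dispatcher over an in-memory dict (the names `handle` and `users` are purely illustrative):

```python
users = {}  # in-memory "table": id -> record

def handle(method, path, body=None):
    """Route an HTTP-style (method, path) pair to a CRUD action."""
    parts = path.strip("/").split("/")
    if parts[0] != "users":
        return None
    if method == "GET" and len(parts) == 1:   # GET /users -> read all
        return list(users.values())
    if method == "POST":                      # POST /users -> create
        new_id = max(users, default=0) + 1
        users[new_id] = {"id": new_id, **body}
        return users[new_id]
    user_id = int(parts[1])
    if method in ("PUT", "PATCH"):            # PUT /users/1 -> update
        users[user_id].update(body)
        return users[user_id]
    if method == "DELETE":                    # DELETE /users/1 -> remove
        return users.pop(user_id, None)

handle("POST", "/users", {"name": "Ada", "age": 36})
handle("PUT", "/users/1", {"age": 37})
print(handle("GET", "/users"))  # [{'id': 1, 'name': 'Ada', 'age': 37}]
```

Flask and FastAPI do exactly this routing for you, plus parsing, validation, and JSON serialization.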
2. Flask + PostgreSQL Example
We'll use:
- Flask: web framework
- psycopg2: PostgreSQL connector
- PostgreSQL: database
- Docker: to containerize later
Install
```bash
pip install flask psycopg2-binary
```
Flask Code: flask_app.py
```python
# flask_app.py
from flask import Flask, request, jsonify
import psycopg2

app = Flask(__name__)

# PostgreSQL connection (a single global connection is fine for a demo;
# use a connection pool in production)
conn = psycopg2.connect(
    host="localhost",
    database="testdb",
    user="postgres",
    password="yourpassword"
)

@app.route("/users", methods=["GET"])
def get_users():
    cur = conn.cursor()
    cur.execute("SELECT * FROM users;")
    rows = cur.fetchall()
    cur.close()
    return jsonify(rows)

@app.route("/users", methods=["POST"])
def create_user():
    data = request.json
    name = data["name"]
    age = data["age"]
    cur = conn.cursor()
    cur.execute("INSERT INTO users (name, age) VALUES (%s, %s)", (name, age))
    conn.commit()
    cur.close()
    return jsonify({"message": "User added successfully!"})

if __name__ == "__main__":
    app.run(debug=True)
```
Table creation SQL (run in the PostgreSQL shell):
```sql
CREATE TABLE users (
    id SERIAL PRIMARY KEY,
    name VARCHAR(100),
    age INT
);
```
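The `%s` placeholders in the INSERT above let psycopg2 escape values for you, which is what prevents SQL injection. The same idea demonstrated with the stdlib `sqlite3` driver (which uses `?` placeholders), so you can try it without a PostgreSQL server:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, age INT)")
# Pass values separately -- never interpolate them into the SQL string yourself.
conn.execute("INSERT INTO users (name, age) VALUES (?, ?)", ("Ada", 36))
rows = conn.execute("SELECT name, age FROM users").fetchall()
print(rows)  # [('Ada', 36)]
```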
3. FastAPI + PostgreSQL Example
We'll use:
- FastAPI: web framework
- psycopg2 for the DB connection (the async route would use SQLAlchemy + asyncpg instead)
- Uvicorn for running the server
Install
```bash
pip install fastapi uvicorn sqlalchemy psycopg2-binary
```
FastAPI Code: fastapi_app.py
```python
# fastapi_app.py
from fastapi import FastAPI
from pydantic import BaseModel
import psycopg2

app = FastAPI()

class User(BaseModel):
    name: str
    age: int

# Global connection (demo only; use a pool or per-request sessions in production)
conn = psycopg2.connect(
    host="localhost",
    database="testdb",
    user="postgres",
    password="yourpassword"
)

@app.get("/users")
def get_users():
    cur = conn.cursor()
    cur.execute("SELECT * FROM users;")
    rows = cur.fetchall()
    cur.close()
    return {"users": rows}

@app.post("/users")
def create_user(user: User):
    cur = conn.cursor()
    cur.execute("INSERT INTO users (name, age) VALUES (%s, %s)", (user.name, user.age))
    conn.commit()
    cur.close()
    return {"message": "User added successfully!"}
```
Run:
```bash
uvicorn fastapi_app:app --reload
```
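Because `create_user` takes a `User` model, FastAPI validates the JSON body against those type hints before your function runs. Conceptually it works something like this rough, hand-rolled sketch using only the stdlib (this is not how pydantic is actually implemented):

```python
from dataclasses import dataclass

@dataclass
class User:
    name: str
    age: int

def validate(payload: dict) -> User:
    """Reject payloads whose fields don't match the annotated types."""
    user = User(**payload)
    for field, expected in User.__annotations__.items():
        if not isinstance(getattr(user, field), expected):
            raise TypeError(f"{field} must be {expected.__name__}")
    return user

print(validate({"name": "Ada", "age": 36}))  # User(name='Ada', age=36)
```

In the real framework, a failed validation becomes an automatic `422 Unprocessable Entity` response instead of an exception in your code.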
4. Deep Understanding of Docker (and Dockerizing the API)
What is Docker?
Docker is a containerization platform: it packages your code, dependencies, and environment into lightweight containers that run anywhere (no "works on my machine" issue).
Think of a container as a mini virtual machine running only what's needed for your app.
Core Docker Concepts
| Term | Meaning |
|---|---|
| Image | A snapshot (template) of your app (like a recipe). |
| Container | A running instance of an image. |
| Dockerfile | A text file defining how to build your image. |
| Docker Hub | Public registry where images are stored. |
| docker-compose.yml | Defines multiple containers (e.g. app + db). |
Dockerfile for Flask or FastAPI
Create a file named `Dockerfile` in your project folder:
```dockerfile
# Use the official Python base image
FROM python:3.10

# Set the working directory
WORKDIR /app

# Copy project files
COPY . /app

# Install dependencies
RUN pip install -r requirements.txt

# Expose the port (Flask: 5000, FastAPI: 8000)
EXPOSE 8000

# Command to run the app (pick the one for your framework)
CMD ["uvicorn", "fastapi_app:app", "--host", "0.0.0.0", "--port", "8000"]
```
Docker Compose for App + PostgreSQL
Create a file docker-compose.yml:
```yaml
version: '3.8'

services:
  db:
    image: postgres:15
    container_name: postgres_db
    environment:
      POSTGRES_DB: testdb
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: yourpassword
    ports:
      - "5432:5432"

  api:
    build: .
    container_name: fastapi_app
    ports:
      - "8000:8000"
    depends_on:
      - db
```
Build and Run
```bash
docker-compose up --build
```
Now your FastAPI + PostgreSQL stack is running in containers:
- API: http://localhost:8000
- PostgreSQL: localhost:5432
Note: inside the `api` container, the database hostname is `db` (the Compose service name), not `localhost`, so adjust the connection settings in your app accordingly.
Docker Summary
| Concept | Description |
|---|---|
| Why Docker? | Consistency, portability, isolated environment |
| Images vs Containers | Image = blueprint, Container = running instance |
| Dockerfile | Defines how to build your image |
| Compose | Runs multi-container apps easily |
| Common Commands | `docker ps`, `docker build`, `docker run`, `docker exec -it <container> bash` |
Would you like me to take the next step:
show how to persist PostgreSQL data using Docker volumes, and how to connect using the SQLAlchemy ORM (models, queries, etc.) instead of raw SQL?
That's the next real-world step for API + database projects.
Perfect. You're now entering real-world API + Database + Docker territory.
Let's continue with a complete, production-style FastAPI + PostgreSQL stack using the SQLAlchemy ORM and Docker volumes for data persistence.
Overview of What You'll Build
- FastAPI app: a RESTful CRUD API for users
- PostgreSQL: running inside Docker
- Docker volume: persistent DB storage
- SQLAlchemy ORM: model-based DB access
- Docker Compose: manages both containers together
1. Project Structure
```
fastapi_postgres_docker/
├── app/
│   ├── main.py
│   ├── models.py
│   ├── database.py
│   ├── schemas.py
│   ├── crud.py
│   └── __init__.py
├── requirements.txt
├── Dockerfile
└── docker-compose.yml
```
2. requirements.txt
```
fastapi
uvicorn
sqlalchemy
psycopg2-binary
pydantic
```
3. database.py: PostgreSQL Connection
```python
from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import declarative_base  # in SQLAlchemy 2.x, import from sqlalchemy.orm
from sqlalchemy.orm import sessionmaker

DATABASE_URL = "postgresql://postgres:yourpassword@db:5432/testdb"

engine = create_engine(DATABASE_URL)
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
Base = declarative_base()
```
Note: the hostname `db` refers to the PostgreSQL service defined in `docker-compose.yml`.
4. models.py: ORM Models (Tables)
```python
from sqlalchemy import Column, Integer, String
from .database import Base

class User(Base):
    __tablename__ = "users"
    id = Column(Integer, primary_key=True, index=True)
    name = Column(String(100))
    age = Column(Integer)
```
5. schemas.py: Request/Response Validation
```python
from pydantic import BaseModel

class UserBase(BaseModel):
    name: str
    age: int

class UserCreate(UserBase):
    pass

class UserOut(UserBase):
    id: int

    class Config:
        orm_mode = True  # in Pydantic v2 this is `from_attributes = True`
```
6. crud.py: Database Operations
```python
from sqlalchemy.orm import Session
from . import models, schemas

def get_users(db: Session):
    return db.query(models.User).all()

def create_user(db: Session, user: schemas.UserCreate):
    db_user = models.User(name=user.name, age=user.age)
    db.add(db_user)
    db.commit()
    db.refresh(db_user)
    return db_user
```
7. main.py: FastAPI Routes
```python
from fastapi import FastAPI, Depends
from sqlalchemy.orm import Session
from . import models, schemas, crud
from .database import SessionLocal, engine

models.Base.metadata.create_all(bind=engine)

app = FastAPI()

def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()

@app.get("/users", response_model=list[schemas.UserOut])
def read_users(db: Session = Depends(get_db)):
    return crud.get_users(db)

@app.post("/users", response_model=schemas.UserOut)
def create_user(user: schemas.UserCreate, db: Session = Depends(get_db)):
    return crud.create_user(db, user)
```
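The `get_db` dependency is a generator: FastAPI runs everything up to `yield`, hands the session to the route, then runs the `finally` block once the response is done, so the session is always closed. The same lifecycle in plain Python via `contextlib` (the names here are illustrative):

```python
from contextlib import contextmanager

events = []

@contextmanager
def get_resource():
    events.append("open")        # runs before the caller's body (like SessionLocal())
    try:
        yield "session"
    finally:
        events.append("close")   # always runs afterwards (like db.close())

with get_resource() as r:
    events.append(f"use {r}")

print(events)  # ['open', 'use session', 'close']
```

Because the cleanup sits in `finally`, it runs even if the route raises, which is exactly why this pattern is used for DB sessions.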
8. Dockerfile: FastAPI App
```dockerfile
FROM python:3.10
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY ./app /app
EXPOSE 8000
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]
```
9. docker-compose.yml: FastAPI + PostgreSQL + Persistent Volume
```yaml
version: '3.8'

services:
  db:
    image: postgres:15
    container_name: postgres_db
    environment:
      POSTGRES_DB: testdb
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: yourpassword
    ports:
      - "5432:5432"
    volumes:
      - pgdata:/var/lib/postgresql/data

  api:
    build: .
    container_name: fastapi_app
    ports:
      - "8000:8000"
    depends_on:
      - db

volumes:
  pgdata:
```
10. Run the Application
```bash
docker-compose up --build
```
Now visit:
- http://localhost:8000/docs: Swagger UI
- GET /users: list users
- POST /users: add a user (JSON body: `{"name": "John", "age": 25}`)
Persistent Data with Docker Volumes
- The mapping `pgdata:/var/lib/postgresql/data` under `volumes:` means that even if you delete the container, the PostgreSQL data remains stored on your machine.
- You can verify this with `docker volume ls`.
- To remove all data: `docker-compose down -v`.
How the System Works Together
| Layer | Description |
|---|---|
| FastAPI | Defines API endpoints |
| SQLAlchemy ORM | Handles DB models & queries |
| PostgreSQL | Stores data persistently |
| Docker | Packages everything in isolated containers |
| Docker Compose | Orchestrates multiple containers |
| Volume | Ensures data persistence |
Key Takeaways
- ORM removes manual SQL and gives you class-based DB interaction.
- Docker Compose makes multi-container development seamless.
- Volumes keep your DB safe from container deletion.
- FastAPI + PostgreSQL + Docker is a modern stack for microservices or data/AI backends.
Would you like the next-level tutorial:
- adding PUT, DELETE, and GET by ID (full CRUD)
- plus how to handle schema updates using Alembic (version control for your database)?
Perfect. Now we'll go from basic CRUD to a full production-grade API using:
- Full CRUD operations (GET, POST, PUT, DELETE)
- Database migrations using Alembic (so you can evolve your DB safely)
- Everything running in Docker
Overview
You'll extend your existing FastAPI + PostgreSQL + SQLAlchemy + Docker project.
Let's go step by step.
1. Update crud.py: Full CRUD Operations
```python
from sqlalchemy.orm import Session
from . import models, schemas

# Read all
def get_users(db: Session):
    return db.query(models.User).all()

# Read one
def get_user_by_id(db: Session, user_id: int):
    return db.query(models.User).filter(models.User.id == user_id).first()

# Create
def create_user(db: Session, user: schemas.UserCreate):
    db_user = models.User(name=user.name, age=user.age)
    db.add(db_user)
    db.commit()
    db.refresh(db_user)
    return db_user

# Update
def update_user(db: Session, user_id: int, updated_user: schemas.UserCreate):
    user = db.query(models.User).filter(models.User.id == user_id).first()
    if user:
        user.name = updated_user.name
        user.age = updated_user.age
        db.commit()
        db.refresh(user)
    return user

# Delete
def delete_user(db: Session, user_id: int):
    user = db.query(models.User).filter(models.User.id == user_id).first()
    if user:
        db.delete(user)
        db.commit()
    return user
```
2. Update main.py: Add Routes for All Operations
```python
from fastapi import FastAPI, Depends, HTTPException
from sqlalchemy.orm import Session
from . import models, schemas, crud
from .database import SessionLocal, engine

models.Base.metadata.create_all(bind=engine)

app = FastAPI()

def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()

# Create user
@app.post("/users", response_model=schemas.UserOut)
def create_user(user: schemas.UserCreate, db: Session = Depends(get_db)):
    return crud.create_user(db, user)

# Read all users
@app.get("/users", response_model=list[schemas.UserOut])
def read_users(db: Session = Depends(get_db)):
    return crud.get_users(db)

# Read a single user by ID
@app.get("/users/{user_id}", response_model=schemas.UserOut)
def read_user(user_id: int, db: Session = Depends(get_db)):
    user = crud.get_user_by_id(db, user_id)
    if not user:
        raise HTTPException(status_code=404, detail="User not found")
    return user

# Update user
@app.put("/users/{user_id}", response_model=schemas.UserOut)
def update_user(user_id: int, user: schemas.UserCreate, db: Session = Depends(get_db)):
    updated = crud.update_user(db, user_id, user)
    if not updated:
        raise HTTPException(status_code=404, detail="User not found")
    return updated

# Delete user
@app.delete("/users/{user_id}")
def delete_user(user_id: int, db: Session = Depends(get_db)):
    deleted = crud.delete_user(db, user_id)
    if not deleted:
        raise HTTPException(status_code=404, detail="User not found")
    return {"message": "User deleted successfully"}
```
Test All Endpoints (via Swagger)
Run:
```bash
docker-compose up --build
```
Then open http://localhost:8000/docs. You'll see, and can test:
| Method | Endpoint | Description |
|---|---|---|
| POST | /users | Create user |
| GET | /users | Read all |
| GET | /users/{id} | Read one |
| PUT | /users/{id} | Update |
| DELETE | /users/{id} | Delete |
3. Alembic: Database Migrations
When you later add or modify database columns, you'll need to version-control schema changes.
That's what Alembic does (think of it as Git for your database schema).
Install Alembic
Add it to your requirements, then install:
```bash
pip install alembic
```
Initialize Alembic
From your project root:
```bash
alembic init alembic
```
This creates:
```
alembic/
├── versions/
│   └── (empty for now)
├── env.py
└── script.py.mako
alembic.ini
```
Configure Alembic
In `alembic.ini`, update the DB URL:
```ini
sqlalchemy.url = postgresql://postgres:yourpassword@localhost:5432/testdb
```
Note: if running inside Docker, replace `localhost` with `db`.
Modify alembic/env.py
At the top, import your models:
```python
from logging.config import fileConfig
from sqlalchemy import engine_from_config, pool
from alembic import context
from app.models import Base  # import your models here
from app.database import DATABASE_URL

config = context.config
fileConfig(config.config_file_name)

target_metadata = Base.metadata
config.set_main_option("sqlalchemy.url", DATABASE_URL)
```
Generate and Apply a Migration
Step 1: create the migration script:
```bash
alembic revision --autogenerate -m "create users table"
```
Step 2: apply the migration:
```bash
alembic upgrade head
```
Alembic will now create or alter your tables automatically.
4. Typical Workflow Going Forward
| Task | Command |
|---|---|
| Make model changes | edit models.py |
| Generate migration | alembic revision --autogenerate -m "message" |
| Apply migration | alembic upgrade head |
| Rollback (undo last) | alembic downgrade -1 |
5. Docker + Alembic Integration (Optional Pro Tip)
You can auto-run Alembic migrations inside Docker by running them before uvicorn starts.
Example CMD in the Dockerfile:
```dockerfile
CMD ["sh", "-c", "alembic upgrade head && uvicorn main:app --host 0.0.0.0 --port 8000"]
```
This ensures that migrations are applied automatically whenever the container starts.
Key Takeaways
| Concept | Description |
|---|---|
| CRUD | Core create/read/update/delete operations |
| SQLAlchemy ORM | Cleaner Python-based DB handling |
| Alembic | Keeps DB schema in sync with code |
| Docker Volumes | Keeps data safe between rebuilds |
| Compose | Runs multi-container environments easily |
Now your stack is production-ready:
- FastAPI app with full CRUD
- PostgreSQL database with persistence
- Alembic-managed schema
- Dockerized and portable