Create a docker-compose.yml at the root of your project, with the following instructions:

```yaml
version: "3.9"
services:
  web:
    build: .
    depends_on:
      - db
    ports:
      - "80:80"
    env_file:
      - postgres.env
      - .env
    volumes:
      - ./app:/app
  db:
    image: postgres
    restart: always
    volumes:
      - data-volume:/var/lib/postgresql/data
    env_file:
      - postgres.env
  pg_admin:
    image: dpage/pgadmin4
    environment:
      - PGADMIN_DEFAULT_EMAIL=[email protected]
      - PGADMIN_DEFAULT_PASSWORD=password
    ports:
      - "81:80"
volumes:
  data-volume:
```
This file defines three services:

- The web service builds the image from the Dockerfile at the root of the project, waits for the database to be up, exposes the API on port 80, reads its environment from postgres.env and .env, and mounts the ./app folder into the container.
- The db service runs a postgres image using the same postgres.env file as the API to configure the database. That way we can keep our environment variables DRY. It also uses the data-volume declared at the end of the file to persist the data.
- The pg_admin service runs pgAdmin, the web interface we'll use to take a look at the database. It is exposed on port 81.
Both the web and db services read the database configuration from a postgres.env file:

```
# postgres.env
POSTGRES_PASSWORD=password
POSTGRES_USER=poll
POSTGRES_DATABASE=poll
POSTGRES_HOST=db
```
Run `docker-compose up` and you should see the output for each service. Now you can access pgAdmin at localhost:81. To connect to the postgres database, simply add a new server and enter the information from postgres.env (host db, user poll, password password, database poll).

Next, let's install our dependencies: in the app folder run `poetry add sqlalchemy psycopg2-binary alembic pydantic`.
SQLAlchemy is the ORM we'll use to define our models and query the database. psycopg2-binary is the postgres driver it relies on (and also used by Django when working with Postgres). alembic will take care of the migrations, and pydantic provides a class called BaseSettings that we'll use to manage our configuration.
Add a config.py file to app/app with the following code:

```python
# app/app/config.py
from pydantic import BaseSettings


class Settings(BaseSettings):
    POSTGRES_USER: str
    POSTGRES_PASSWORD: str
    POSTGRES_DATABASE: str
    POSTGRES_HOST: str


settings = Settings()
```
When the Settings class is instantiated, pydantic will read the variables from the environment and validate them against the types we defined, raising an error if anything is missing. This class is actually a great selling point of pydantic, offering tons of other functionalities like automatic parsing of list and dict types. More information is available in the pydantic documentation.
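To make this behaviour concrete, here is a minimal sketch you can run in a Python shell; the DemoSettings class and its ALLOWED_HOSTS field are just illustrations, they are not part of our app:

```python
import os
from typing import List

from pydantic import BaseSettings, ValidationError


class DemoSettings(BaseSettings):
    POSTGRES_USER: str
    # Complex types are parsed from JSON strings found in the environment
    ALLOWED_HOSTS: List[str] = []  # illustrative field, not used by the app


os.environ["POSTGRES_USER"] = "poll"
os.environ["ALLOWED_HOSTS"] = '["localhost", "example.com"]'
print(DemoSettings().ALLOWED_HOSTS)  # ['localhost', 'example.com']

# A missing required variable fails loudly at start-up instead of later
del os.environ["POSTGRES_USER"]
try:
    DemoSettings()
except ValidationError as exc:
    print(exc)  # tells us POSTGRES_USER is a required field
```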
Next, create an app/app/database.py file and put the following code in it:

```python
# app/app/database.py
from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

from . import config

# The declarative base we'll use to create our models
Base = declarative_base()
```
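The snippet above only defines the declarative base, which is all alembic needs. If you also want to open sessions from the API, a minimal sketch of the rest of the file could look like this; the SessionLocal name and the create_engine options are assumptions on my part, not taken from the post:

```python
# app/app/database.py (possible continuation - a sketch, not the post's exact code)
settings = config.Settings()

# Build the connection URL from the environment-driven settings,
# using the same format as alembic's env.py later in this post
SQLALCHEMY_DATABASE_URL = (
    f"postgresql://{settings.POSTGRES_USER}:{settings.POSTGRES_PASSWORD}"
    f"@{settings.POSTGRES_HOST}/{settings.POSTGRES_DATABASE}"
)

engine = create_engine(SQLALCHEMY_DATABASE_URL)

# Session factory the API can use to talk to the database
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
```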
Alembic will play the role of Django's makemigrations and migrate commands, but we need to initialize it first! cd into the app directory and run `alembic init alembic`. This will create the configuration scripts for alembic: an alembic folder containing env.py and the migration versions, plus an alembic.ini file. We also tell alembic where to find that configuration file inside the container through the ALEMBIC_CONFIG environment variable:

```
# .env
ALEMBIC_CONFIG=/app/app/alembic.ini
```
Alembic uses the env.py file to configure the connection to the database. If you open this file, you'll see that it's mostly composed of two functions, corresponding to the two available modes of alembic:

- run_migrations_online: configures the online mode. This is the one that we will be using.
- run_migrations_offline: configures the offline mode. This mode allows the user to generate SQL instructions instead of running the migrations directly against the database.

Instead of hardcoding the database URL in alembic.ini, we'll use the Settings class we created earlier to generate it from our environment variables, like so:

```python
from app.config import Settings

settings = Settings()

...

def run_migrations_offline():
    ...
    url = f"postgresql://{settings.POSTGRES_USER}:{settings.POSTGRES_PASSWORD}@{settings.POSTGRES_HOST}/{settings.POSTGRES_DATABASE}"
    context.configure(
        url=url,
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
        compare_server_default=True,
    )
    ...

def run_migrations_online():
    ...
    url = f"postgresql://{settings.POSTGRES_USER}:{settings.POSTGRES_PASSWORD}@{settings.POSTGRES_HOST}/{settings.POSTGRES_DATABASE}"
    connectable = engine_from_config(
        {"sqlalchemy.url": url},
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )
```
To check that everything is working, run `alembic revision -m "test revision"`. This will create an empty revision file in the alembic/versions folder:

```python
"""test revision

Revision ID: 2af1b91bea53
Revises: 27259876f63d
Create Date: 2021-11-12 18:21:26.442182

"""
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision = '2af1b91bea53'
down_revision = '27259876f63d'
branch_labels = None
depends_on = None


def upgrade():
    pass


def downgrade():
    pass
```
We'll create two models:

- The Question model: this will hold the question text.
- The Choice model: a possible answer for a question.

To define the Question and Choice models, add a models.py file to the polls folder, with the following code in it:

```python
# /app/polls/models.py
from datetime import datetime

from sqlalchemy import Column
from sqlalchemy import DateTime
from sqlalchemy import ForeignKey
from sqlalchemy import Integer
from sqlalchemy import String
from sqlalchemy.orm import relationship

from app.database import Base


class Question(Base):
    __tablename__ = "poll_question"

    id = Column(Integer, index=True, primary_key=True)
    question_text = Column(String(200), nullable=False)
    pub_date = Column(DateTime, nullable=False, default=datetime.utcnow)
    choices = relationship("Choice", backref="question")


class Choice(Base):
    __tablename__ = "poll_choice"

    id = Column(Integer, index=True, primary_key=True)
    choice_text = Column(String(200), nullable=False)
    votes = Column(Integer, default=0, nullable=False)
    question_id = Column(Integer, ForeignKey("poll_question.id"))
```
A few things to note if you're used to the django-orm:

- SQLAlchemy doesn't generate the table name for us, we have to set it through the __tablename__ class attribute. Here I followed the Django convention <app_name>_<model_name>, but you're free to do as you please!
- Columns are nullable by default, so required fields have to be marked with nullable=False.
- The relationship defined on Question lets us access a question's choices through question.choices, and a choice's question through the question backref (as sketched below).
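To see what that relationship gives us, here is a small, self-contained sketch; the in-memory SQLite engine and the session setup are only there for illustration, the real app talks to the postgres service:

```python
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

from app.database import Base
from polls.models import Choice, Question

# Illustrative in-memory database, just to exercise the models
engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

question = Question(question_text="What's new?")
question.choices = [Choice(choice_text="Not much"), Choice(choice_text="The sky")]
session.add(question)
session.commit()

print([c.choice_text for c in question.choices])  # ['Not much', 'The sky']
print(question.choices[0].question is question)   # True, thanks to the backref
```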
Before generating the migration, we need to import our models into the env.py file. This is a bit like registering a new app into a django app's settings.py. We also import Base, as it is our declarative base that contains all the information for building the tables corresponding to our models. Each time a class inherits from Base, it adds its own instructions to Base.metadata. These instructions are passed to Alembic through the target_metadata variable, so we need to assign it to Base.metadata.

```python
# app/app/alembic/env.py

# We import the models and our declarative base
import polls.models
from app.database import Base

...

# Find this line near the top of the file
# and replace None with Base.metadata
target_metadata = Base.metadata
```
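As a quick illustration of what target_metadata now contains, you could run something like this from the app directory; it is purely a sanity check, not a step of the setup:

```python
import polls.models  # importing the models registers them on Base.metadata
from app.database import Base

# Both tables are now known to the metadata object, which is exactly
# what alembic's autogenerate will compare against the database
print(sorted(Base.metadata.tables))  # ['poll_choice', 'poll_question']
```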
Now run `alembic revision --autogenerate -m "create question and choice models"` to automatically create the migration file:

```python
"""create_question_and_choice

Revision ID: 27259876f63d
Revises:
Create Date: 2021-11-11 10:52:50.386678

"""
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision = '27259876f63d'
down_revision = None
branch_labels = None
depends_on = None


def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.create_table('poll_question',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('question_text', sa.String(length=200), nullable=False),
        sa.Column('pub_date', sa.DateTime(), nullable=False),
        sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_poll_question_id'), 'poll_question', ['id'], unique=False)
    op.create_table('poll_choice',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('choice_text', sa.String(length=200), nullable=False),
        sa.Column('votes', sa.Integer(), nullable=False),
        sa.Column('question_id', sa.Integer(), nullable=True),
        sa.ForeignKeyConstraint(['question_id'], ['poll_question.id'], ),
        sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_poll_choice_id'), 'poll_choice', ['id'], unique=False)
    # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_index(op.f('ix_poll_choice_id'), table_name='poll_choice')
    op.drop_table('poll_choice')
    op.drop_index(op.f('ix_poll_question_id'), table_name='poll_question')
    op.drop_table('poll_question')
    # ### end Alembic commands ###
```
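Remember the offline mode mentioned earlier: if you'd rather review the SQL than run it, a hedged sketch using alembic's Python API looks like this (you could equally pass --sql on the command line; the config path is an assumption based on our ALEMBIC_CONFIG value):

```python
from alembic import command
from alembic.config import Config

# Path taken from ALEMBIC_CONFIG in .env - adjust if you run this elsewhere
cfg = Config("/app/app/alembic.ini")

# sql=True switches alembic to offline mode: the CREATE TABLE statements
# are printed to stdout instead of being executed against the database
command.upgrade(cfg, "head", sql=True)
```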
Finally, run `alembic upgrade head` to apply the migration, then head over to pgAdmin, and you should see the new tables!
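If you prefer checking from code rather than pgAdmin, a small sketch like the following, run where the database is reachable (inside the web container for instance), lists the freshly created tables; the script itself is not part of the project:

```python
from sqlalchemy import create_engine, inspect

from app.config import Settings

settings = Settings()
url = (
    f"postgresql://{settings.POSTGRES_USER}:{settings.POSTGRES_PASSWORD}"
    f"@{settings.POSTGRES_HOST}/{settings.POSTGRES_DATABASE}"
)

# After `alembic upgrade head` this should print poll_choice, poll_question
# and alembic's own bookkeeping table, alembic_version
print(inspect(create_engine(url)).get_table_names())
```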