Alembic version table in a separate schema - python-3.x

I have hundreds of Alembic version tables for different applications in Postgres. Some applications use a dedicated username for migrations, and a search_path can be set in Postgres for those migrations; because of that search_path tied to the database username, their version tables end up in different Postgres schemas. Other applications share a common username with no search_path pointing at a specific schema, so their version tables collide in the public schema. How do I make Alembic create the version table in a specific Postgres schema?

The Alembic EnvironmentContext.configure method takes an argument version_table_schema which allows you to specify the schema to put the version table in. (Docs)
For example, in your env.py:
from alembic import context
...
def run_migrations_online():
    connectable = config.attributes.get('connection', None)
    with connectable.connect() as connection:
        context.configure(
            connection=connection,
            target_metadata=target_metadata,
            version_table='alembic_version',
            version_table_schema="my_schema",  # <-- Set the schema name here
        )
        ...
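Since many applications share the same env.py pattern here, one way (a sketch, not part of the original answer) to make the schema configurable per application is to pass it on the command line with Alembic's -x option and read it with context.get_x_argument; the schema name app_one below is just a placeholder, and the rest mirrors the snippet above:
from alembic import context
...
# e.g. run: alembic -x version_schema=app_one upgrade head
x_args = context.get_x_argument(as_dictionary=True)
version_schema = x_args.get('version_schema', 'public')  # fall back to public

def run_migrations_online():
    connectable = config.attributes.get('connection', None)
    with connectable.connect() as connection:
        context.configure(
            connection=connection,
            target_metadata=target_metadata,
            version_table='alembic_version',
            version_table_schema=version_schema,
        )
        with context.begin_transaction():
            context.run_migrations()
Alembic will then create (or look up) alembic_version inside the schema given on the command line, which avoids the name conflict in public. Note that Alembic does not create the schema itself, so it has to exist already.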

Related

In the Python Pyramid web framework, how can I drop all db table rows before seeding?

I am using a cookiecutter to make a pyramid web app.
It has a function to seed the db here:
https://github.com/Pylons/pyramid-cookiecutter-starter/blob/latest/%7B%7Bcookiecutter.repo_name%7D%7D/%7B%7Bcookiecutter.repo_name%7D%7D/sqlalchemy_scripts/initialize_db.py#L15
But if I run it twice, or change the entries that I am adding, I get duplicate entries and errors. I am using a sqlite db with sqlalchemy.
What code can I add inside setup_models that will drop all db rows before writing the new model instances?
It would be great if this looped over all models and deleted all instances of them.
def setup_models(dbsession):
    """
    Add or update models / fixtures in the database.
    """
    model = models.mymodel.MyModel(name='one', value=1)
    dbsession.add(model)
I am updating the db by running:
# to run the initial migration that adds the tables to the db, run this once
venv/bin/alembic -c development.ini upgrade head
# seed the data, I want to be able to keep editing the seed data
# and re-run this command and have it wipe the db rows and insert the seed data defined in setup_models
venv/bin/initialize_suppah_db development.ini
By default, SQLite does not enforce foreign key constraints at the engine level (even if you have declared them in the table DDL), so you could probably just use something as simple as:
from sqlalchemy import inspect

# engine is the Engine bound to your dbsession, e.g. dbsession.get_bind()
insp = inspect(engine)
with engine.begin() as conn:
    for table_name in insp.get_table_names():
        conn.exec_driver_sql(f'DELETE FROM "{table_name}"')
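If you later switch to a database that does enforce foreign keys (or turn enforcement on in SQLite), a possible variation (a sketch, assuming the same engine as above and that your declarative Base lives in ..models.meta, as in the cookiecutter) is to delete in reverse dependency order via the metadata:
from ..models.meta import Base

# Base.metadata.sorted_tables is ordered parents-first,
# so reversed() deletes child rows before their parents
with engine.begin() as conn:
    for table in reversed(Base.metadata.sorted_tables):
        conn.execute(table.delete())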
One can do this by:
looping over all model classes
marking all instances of those classes for deletion
committing the session/transaction to delete them
THEN seeding the data
The code below does this:
import transaction
from ..models.meta import Base

def delete_table_rows(dbsession):
    model_classes = [cls for cls in Base.__subclasses__()]
    with transaction.manager:
        for model_class in model_classes:
            for instance in dbsession.query(model_class).all():
                dbsession.delete(instance)
        # the transaction manager commits on exiting the with block

def setup_models(dbsession):
    """
    Add or update models / fixtures in the database.
    """
    delete_table_rows(dbsession)
    # your custom seed code here
    model = models.mymodel.MyModel(name='one', value=1)
    dbsession.add(model)

What is similar to postgresql uuid DEFAULT uuid_generate_v4() in a Django Model?

What would be the equivalent of the PostgreSQL schema below in a Django model? With the current model, an insert into the database gives the error null value in column "account_uuid" violates not-null constraint. When the table is created directly in PostgreSQL, the insert works fine.
PostgreSQL schema example:
CREATE TABLE accounts (
    account_uuid uuid DEFAULT uuid_generate_v4()
);
Django model attempt:
import uuid
from django.db import models

class Account(models.Model):
    account_uuid = models.UUIDField(
        default=uuid.uuid4,
        editable=False)
Was the field added while you already had data in the DB? If so, you need to supply an account_uuid value for each existing record.
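If the column was added to a table that already has rows, one common pattern is to add the field as nullable first, backfill it in a data migration, and only then make it non-nullable. The sketch below is only an outline; the app label 'accounts' and the migration name in dependencies are placeholders for your own:
import uuid
from django.db import migrations

def backfill_uuids(apps, schema_editor):
    # use the historical model state, not a direct import
    Account = apps.get_model('accounts', 'Account')  # placeholder app label
    for account in Account.objects.filter(account_uuid__isnull=True):
        account.account_uuid = uuid.uuid4()
        account.save(update_fields=['account_uuid'])

class Migration(migrations.Migration):
    dependencies = [
        ('accounts', '0002_account_account_uuid'),  # placeholder previous migration
    ]
    operations = [
        migrations.RunPython(backfill_uuids, migrations.RunPython.noop),
    ]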

Dynamically change schema of DB SQLAlchemy instance

I have a Flask App which uses multiple schemas on the same MySQL database. Each schema has the same tables with the same structure and it represents a different "instance" used by the app for different accounts connecting to the application.
Is it possible to dynamically tell the db object which schema to use?
In order to follow SO rules I will also paste here the relevant part of the Flask-SQLAlchemy documentation on the topic.
Multiple Databases with Binds
Starting with 0.12 Flask-SQLAlchemy can
easily connect to multiple databases. To achieve that it preconfigures
SQLAlchemy to support multiple “binds”.
What are binds? In SQLAlchemy speak a bind is something that can
execute SQL statements and is usually a connection or engine. In
Flask-SQLAlchemy binds are always engines that are created for you
automatically behind the scenes. Each of these engines is then
associated with a short key (the bind key). This key is then used at
model declaration time to associate a model with a specific engine.
If no bind key is specified for a model the default connection is used
instead (as configured by SQLALCHEMY_DATABASE_URI).
Example Configuration
The following configuration declares three
database connections. The special default one as well as two others
named users (for the users) and appmeta (which connects to a sqlite
database for read only access to some data the application provides
internally):
SQLALCHEMY_DATABASE_URI = 'postgres://localhost/main'
SQLALCHEMY_BINDS = {
    'users': 'mysqldb://localhost/users',
    'appmeta': 'sqlite:////path/to/appmeta.db'
}
Creating and Dropping Tables
The create_all() and drop_all() methods by default operate on all declared binds, including the
default one. This behavior can be customized by providing the bind
parameter. It takes either a single bind name, 'all' to refer to
all binds or a list of binds. The default bind
(SQLALCHEMY_DATABASE_URI) is named None:
>>> db.create_all()
>>> db.create_all(bind=['users'])
>>> db.create_all(bind='appmeta')
>>> db.drop_all(bind=None)
Referring to Binds
If you declare a model you can specify the bind to use with the __bind_key__ attribute:
class User(db.Model):
    __bind_key__ = 'users'
    id = db.Column(db.Integer, primary_key=True)
    username = db.Column(db.String(80), unique=True)
Internally the bind key is stored in the table’s info dictionary as
'bind_key'. This is important to know because when you want to create
a table object directly you will have to put it in there:
user_favorites = db.Table('user_favorites',
    db.Column('user_id', db.Integer, db.ForeignKey('user.id')),
    db.Column('message_id', db.Integer, db.ForeignKey('message.id')),
    info={'bind_key': 'users'}
)
If you specified the bind_key on your models you can use them
exactly the way you are used to. The model connects to the specified
database connection itself.
Here's a link to the official documentation.
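Tying the quoted docs back to the question: since MySQL addresses each schema as a database name in the connection URL, one possible sketch (not from the docs; the names account_a/account_b and the credentials are placeholders) is to give each per-account schema its own bind and attach its models with __bind_key__. Note that binds are fixed at model declaration time, so this maps accounts to schemas statically rather than switching per request:
from flask import Flask
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'mysql+pymysql://user:pw@localhost/main'
app.config['SQLALCHEMY_BINDS'] = {
    # each MySQL "schema" is simply a database name in the URL
    'account_a': 'mysql+pymysql://user:pw@localhost/account_a',
    'account_b': 'mysql+pymysql://user:pw@localhost/account_b',
}
db = SQLAlchemy(app)

class AccountAUser(db.Model):
    __bind_key__ = 'account_a'  # this model reads/writes the account_a schema
    __tablename__ = 'users'
    id = db.Column(db.Integer, primary_key=True)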

Generate SQLite database schema from Java code

Is there any way to generate SQLite database model from Java code using JOOQ?
You can generate DDL statements like CREATE TABLE .. or ALTER TABLE .. ADD CONSTRAINT .. using the DSLContext.ddl() API, for instance:
// SCHEMA is the generated schema that contains a reference to all generated tables
Queries ddl =
    DSL.using(configuration)
       .ddl(SCHEMA);

for (Query query : ddl.queries()) {
    System.out.println(query);
}
This is documented here:
https://www.jooq.org/doc/latest/manual/sql-building/ddl-statements/generating-ddl/

Database schema with sails.js and sails-postgresql

Is there any way to set database schema with sails-postgresql waterline adapter?
By default the postgres adapter always chooses the default public schema in the database, but I want to connect to another schema.
For example, I have a database dev, a schema test in that database, and a table users in the schema test.
Now I want to select all data from the table users; in SQL syntax I can simply write:
SELECT * FROM test.users
How to make it work in sails ?
When I write a model that uses the postgres adapter, the method Users.find() will look for the table users in the default public schema. I want to change it to look in the schema test without having to change anything in my Postgres database.
Is it possible?
There is support for this, although it is as-yet undocumented. You can set the schema name for a model using the meta.schemaName property, eg:
module.exports = {
    tableName: 'users',
    meta: {
        schemaName: 'test'
    },
    attributes: {
        ...
    }
};
Update
It turns out this functionality was essentially broken for several versions, but it has been revamped and released in Sails-Postgresql v0.11.0. The syntax is the same as above. The main caveat is that it will not work with multiple Waterline models sharing the same tableName.
It appears that this is a bit buggy on the latest 1.0.3 version, but I found a way to accomplish it by doing:
postgresql: {
    adapter: require('sails-postgresql'),
    url: 'postgresql://postgres:blablabla@localhost:5432/simplerp',
    schemaName: 'radius',
}
in your config/datastores.js file.
Peace, out!
