I am using mongoengine v0.15.0. How do I fetch the name of the database I am connected to? Of course, I supplied the name in the URI string, but is there a way to query Mongo and find it?
Thanks,
Harsha
All the information about the DB connection created by mongoengine can be found by calling get_db(), which returns a pymongo.database.Database object. You can then read the database name from its name attribute. Here is an example.
from mongoengine.connection import connect, get_db

connect("test_db")  # registers the default connection

# Then, somewhere where you want to get the DB name:
db = get_db()  # returns a pymongo.database.Database
print("Database name:", db.name)
The output:
Database name: test_db
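If you register more than one connection, get_db() also takes the connection alias as an argument; a small sketch (the alias and database names here are just examples):

from mongoengine.connection import connect, get_db

# Register a second, named connection alongside the default one.
connect("reporting_db", alias="reporting")

db = get_db("reporting")  # the pymongo Database for that alias
print(db.name)  # reporting_db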
Related
What would be similar to the PostgreSQL schema below in a Django model? The current model, when doing an insert into the database, gives the error: null value in column "account_uuid" violates not-null constraint. When generating the table directly in PostgreSQL, the insert works fine.
PostgreSQL schema example:
CREATE TABLE accounts (
    account_uuid uuid DEFAULT uuid_generate_v4()
);
Django model attempt:
import uuid
from django.db import models

class Account(models.Model):
    account_uuid = models.UUIDField(
        default=uuid.uuid4,
        editable=False)
Did you add this field while you already had data in the DB? If so, you need to backfill the account_uuid value for each existing record, as sketched below.
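For illustration, the backfill can be done in a data migration; a minimal sketch, assuming the field was first added with null=True (so existing rows can be filled before making it non-null), and with "myapp" and the migration names as placeholders:

import uuid
from django.db import migrations

def backfill_uuids(apps, schema_editor):
    # Use the historical model, as recommended inside data migrations.
    Account = apps.get_model("myapp", "Account")
    for account in Account.objects.filter(account_uuid__isnull=True).iterator():
        account.account_uuid = uuid.uuid4()
        account.save(update_fields=["account_uuid"])

class Migration(migrations.Migration):
    dependencies = [("myapp", "0001_initial")]  # adjust to your history
    operations = [
        migrations.RunPython(backfill_uuids, migrations.RunPython.noop),
    ]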
I have a Flask App which uses multiple schemas on the same MySQL database. Each schema has the same tables with the same structure and it represents a different "instance" used by the app for different accounts connecting to the application.
Is it possible to dynamically tell the db object which schema to use?
In order to follow SO rules I will also paste here the relevant part of the Flask-SQLAlchemy documentation on the topic.
Multiple Databases with Binds
Starting with 0.12 Flask-SQLAlchemy can
easily connect to multiple databases. To achieve that it preconfigures
SQLAlchemy to support multiple “binds”.
What are binds? In SQLAlchemy speak a bind is something that can
execute SQL statements and is usually a connection or engine. In
Flask-SQLAlchemy binds are always engines that are created for you
automatically behind the scenes. Each of these engines is then
associated with a short key (the bind key). This key is then used at
model declaration time to associate a model with a specific engine.
If no bind key is specified for a model the default connection is used
instead (as configured by SQLALCHEMY_DATABASE_URI).
Example Configuration
The following configuration declares three
database connections. The special default one as well as two others
named users (for the users) and appmeta (which connects to a sqlite
database for read only access to some data the application provides
internally):
SQLALCHEMY_DATABASE_URI = 'postgres://localhost/main'
SQLALCHEMY_BINDS = {
    'users': 'mysqldb://localhost/users',
    'appmeta': 'sqlite:////path/to/appmeta.db'
}
Creating and Dropping Tables
The create_all() and drop_all() methods by default operate on all declared binds, including the
default one. This behavior can be customized by providing the bind
parameter. It takes either a single bind name, 'all' to refer to
all binds or a list of binds. The default bind
(SQLALCHEMY_DATABASE_URI) is named None:
>>> db.create_all()
>>> db.create_all(bind=['users'])
>>> db.create_all(bind='appmeta')
>>> db.drop_all(bind=None)
Referring to Binds
If you declare a model you can specify the bind to use with the __bind_key__ attribute:
class User(db.Model):
    __bind_key__ = 'users'
    id = db.Column(db.Integer, primary_key=True)
    username = db.Column(db.String(80), unique=True)
Internally the bind key is stored in the table’s info dictionary as
'bind_key'. This is important to know because when you want to create
a table object directly you will have to put it in there:
user_favorites = db.Table('user_favorites',
    db.Column('user_id', db.Integer, db.ForeignKey('user.id')),
    db.Column('message_id', db.Integer, db.ForeignKey('message.id')),
    info={'bind_key': 'users'}
)
If you specified the __bind_key__ on your models, you can use them exactly the way you are used to. The model connects to the specified database connection itself.
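Applied to the question's setup, a minimal sketch might look like the following (the URIs, bind keys, and model are assumptions, not your real schema names):

from flask import Flask
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
# One bind per account schema on the same MySQL server:
app.config["SQLALCHEMY_DATABASE_URI"] = "mysql://localhost/main"
app.config["SQLALCHEMY_BINDS"] = {
    "account_a": "mysql://localhost/account_a",
    "account_b": "mysql://localhost/account_b",
}
db = SQLAlchemy(app)

class CustomerA(db.Model):
    __bind_key__ = "account_a"  # this model always uses the account_a schema
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(80))

Note that __bind_key__ is fixed at model declaration time, so this maps each model to one schema; fully dynamic per-request schema switching would need a different mechanism.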
Here's a link to the official documentation.
Is it possible to append data to a subcollection in the Cloud Firestore database using the Python Firebase Admin SDK? If so, what am I missing?
I am trying to append data into a subcollection of a specific document located in Google's Cloud Firestore (not the Realtime Database). I have done a fair amount of research, with this being the most prevalent resource so far:
add data to firestore, which does not explicitly say it isn't possible.
I am able to create documents, read documents, etc. I just cannot seem to append to or access subcollections without getting the whole document.
The flow of my code goes as follows:
import time
import json
import firebase_admin
from firebase_admin import credentials
from firebase_admin import db
from firebase_admin import firestore

# authentication
cred = credentials.Certificate('key.json')
app = firebase_admin.initialize_app(cred)
cloudDb = firestore.client()  # open connection; uses 'app' implicitly

doc = cloudDb.collection(collection).document(document)  # names defined elsewhere

data = {}
data['date'] = firestore.SERVER_TIMESTAMP
data['payload'] = [0, 1, 2, 3, 4, 5]
doc.collection("data").set(data)  # *
#--testing (collection)
# |
#----docId (document)
# |
#------company (field)
#------accountinfo(field)
#------data (subcollection)
# |
#--------date (field)
#--------payload (array)
This, however, fails on the last line (the line with the asterisk) with the error:
('Cannot convert to a Firestore Value', <object object at 0xb6b07e60>, 'Invalid type', <class 'object'>)
OK, so I solved my own question, thanks to Doug getting me to write a script that could be run on another machine (i.e., a smaller script with less going on).
My issue was trying to set the collection as my data object instead of creating a document within the collection and setting that, i.e.:
doc.collection("data").set(data) #error
doc.collection("data").document().set(data) #success
I created a new table in the Bluemix SQL Database service by uploading a csv (baseball.csv) and took the default table name of "baseball".
I created a simple app in Node.js which is just trying to select data from the table with select * from baseball, but I keep getting the following error:
[IBM][CLI Driver][DB2/NT] SQL0204N "USERxxxx.BASEBALL" is an undefined name
Why can't it find my database table?
This issue seems independent of Bluemix; rather, it is a usage error.
This error is possibly caused by the following:
The object identified by name is not defined in the database.
User response
Ensure that the object name (including any required qualifiers) is correctly specified in the SQL statement and it exists.
try running "list tables" from command prompt to check if your table spelling is correct or not.
http://www-01.ibm.com/support/knowledgecenter/SSEPGG_9.7.0/com.ibm.db2.luw.messages.sql.doc/doc/msql00204n.html?cp=SSEPGG_9.7.0%2F2-6-27-0-130
I created the table from the SQL Database web UI in Bluemix and took the default name of baseball. It looks like this creates a case-sensitive table name.
Unfortunately for me, the sql_db library (and all DB2 clients, I believe) auto-capitalizes the unquoted table name, turning the query into "SELECT * FROM BASEBALL".
The solution was to either:
A. explicitly name my table BASEBALL in the web UI; or
B. modify my SQL query by quoting the table name:
select * from "baseball"
More info at http://www.ibm.com/developerworks/data/library/techarticle/0203adamache/0203adamache.html#N10121
I wish to safely pass a schema name that must be double-quote escaped to the database engine. In this case, when constructing a GRANT statement, I want to safely pass a variable containing test to the database.
GRANT SELECT ON ALL TABLES IN SCHEMA "test" TO readuser
I'm unsure how to do this from SQLAlchemy.
If it helps, I am using psycopg2 to connect to PostgreSQL.
I've never tried issuing database maintenance queries like GRANT through SQLAlchemy. I suppose the ORM won't issue this kind of query, so I guess you want to issue it textually using Session.execute. If so, the examples in the documentation are pretty straightforward about how to do it:
session.execute(
    "GRANT SELECT ON ALL TABLES IN SCHEMA :param TO readuser",
    {"param": "test"}
)
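One caveat worth knowing: bound parameters escape values, not identifiers, so :param above is rendered as the string literal 'test' rather than the double-quoted identifier "test", which PostgreSQL rejects after IN SCHEMA. Since psycopg2 is in play, its sql module can quote the identifier safely; a minimal sketch, assuming session is a psycopg2-backed SQLAlchemy Session:

from psycopg2 import sql

schema_name = "test"

# Unwrap the raw DBAPI (psycopg2) connection from the Session.
raw_conn = session.connection().connection

# sql.Identifier renders as a safely double-quoted identifier: "test"
stmt = sql.SQL("GRANT SELECT ON ALL TABLES IN SCHEMA {} TO readuser").format(
    sql.Identifier(schema_name)
)
with raw_conn.cursor() as cur:
    cur.execute(stmt)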