I am running:
sequelize db:migrate
on my local machine and it outputs:
Sequelize CLI [Node: 15.10.0, CLI: 6.2.0, ORM: 5.8.6]
Loaded configuration file "src/db/db.config.js".
Using environment "development".
but no migration takes place on my local machine. The same code and command migrate the DB normally on a Google Cloud instance. How do I debug this problem? My machine is a MacBook Pro running macOS High Sierra 10.13.6.
Note: the database is already created. Even when I delete the DB and run sequelize db:create, nothing happens.
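A reasonable first step (a sketch, assuming sequelize-cli 6.x and the config file shown in the output) is to ask the CLI which migrations it considers pending and to confirm which connection settings it actually loaded:
# list executed vs. pending migrations for the active environment
npx sequelize db:migrate:status

# print the "development" settings the CLI is using
node -e "console.log(require('./src/db/db.config.js'))"
If db:migrate:status reports everything as "up", the CLI may be pointing at a different database than the one you are inspecting.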
I tried the following command; it returns no errors, but the data is not imported into my Postgres database.
Database is already created in Postgres.
pgloader mysql://user:password@127.0.0.1/db_name postgresql://pgloader_pg:password@127.0.0.1/pg_db_name
This is the result:
LOG pgloader version "3.6.7~devel"
I am not sure what the issue is.
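Two things are worth checking (a sketch; db_name and pg_db_name are the placeholders from the command above). First, re-run with verbose logging, which pgloader supports:
pgloader --verbose mysql://user:password@127.0.0.1/db_name postgresql://pgloader_pg:password@127.0.0.1/pg_db_name
Second, pgloader typically loads MySQL tables into a Postgres schema named after the source database rather than into public, so the data may already be there:
psql -d pg_db_name -c '\dt db_name.*'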
I was trying to run the Node.js quickstart for Google Cloud Spanner. I started the emulator instance by running the command below on my development server:
docker run -p 9010:9010 -p 9020:9020 gcr.io/cloud-spanner-emulator/emulator
On the development server I could also create instances as follows:
# configuration first
gcloud config configurations create emulator
gcloud config set auth/disable_credentials true
gcloud config set project my-project
gcloud config set api_endpoint_overrides/spanner http://localhost:9020/
# creating instance
gcloud spanner instances create test-instance \
--config=emulator-config --description="Test Instance" --nodes=1
I could successfully create the instance.
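As a quick sanity check (assuming the emulator gcloud configuration above is still active), you can list instances against the emulator endpoint:
gcloud spanner instances list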
Now I am trying to run the quickstart samples from a different machine on the same network. I made the following changes in the schema.js file (line number 30).
const spanner = new Spanner({
  projectId: projectId,
  apiEndpoint: 'http://dev-server-ip',
  port: 9020
});
And I ran the program as follows using Node.js:
node schema.js createDatabase test-instance example-db my-project
I got the following error:
schema.js createDatabase <instanceName> <databaseName> <projectId>

Creates an example database with two tables in a Cloud Spanner instance.

Options:
  --version  Show version number  [boolean]
  --help     Show help            [boolean]
Error: Could not load the default credentials. Browse to https://cloud.google.com/docs/authentication/getting-started for more information.
    at GoogleAuth.getApplicationDefaultAsync (D:\work\gcloud-connectors\nodejs-spanner\samples\node_modules\google-auth-library\build\src\auth\googleauth.js:183:19)
    at processTicksAndRejections (node:internal/process/task_queues:96:5)
    at async GoogleAuth.getClient (D:\work\gcloud-connectors\nodejs-spanner\samples\node_modules\google-auth-library\build\src\auth\googleauth.js:565:17)
    at async GrpcClient._getCredentials (D:\work\gcloud-connectors\nodejs-spanner\samples\node_modules\google-gax\build\src\grpc.js:145:24)
    at async GrpcClient.createStub (D:\work\gcloud-connectors\nodejs-spanner\samples\node_modules\google-gax\build\src\grpc.js:308:23)
EDIT
Issue resolved. You need to set the following environment variable:
export SPANNER_EMULATOR_HOST=dev-server-ip:9010
Note that the port is the gRPC port, 9010, not the REST port, 9020. The code change to the Spanner constructor is also not necessary.
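With that variable set, the client library targets the emulator automatically, so the sample's default constructor works unchanged (a sketch, using the project ID from above):
const {Spanner} = require('@google-cloud/spanner');

// SPANNER_EMULATOR_HOST=dev-server-ip:9010 redirects all calls to the emulator
const spanner = new Spanner({ projectId: 'my-project' });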
I am using Python on Google App Engine.
Could you tell me how I can run a Python 3 Google App Engine app with ndb on my local system?
Help me:
https://cloud.google.com/appengine/docs/standard/python3
Please try this:
Go to the service account page (https://cloud.google.com/docs/authentication/getting-started) and create a JSON key file.
Then install this pip package:
$ pip install google-cloud-ndb
Now, on Linux, open a terminal and run:
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/credentials.json"
On Windows, open a command prompt and run:
set GOOGLE_APPLICATION_CREDENTIALS=C:\path\to\credentials.json
Run this code in Python 3 in your terminal/command prompt:
from google.cloud import ndb

# the model must be defined before an entity can be stored
class Contact(ndb.Model):
    name = ndb.StringProperty()
    phone = ndb.StringProperty()
    email = ndb.StringProperty()

client = ndb.Client()
with client.context():
    contact1 = Contact(name="John Smith",
                       phone="555 617 8993",
                       email="john.smith@gmail.com")
    contact1.put()
You will see the result in Datastore in the Google Cloud Console.
App Engine is a serverless service provided by Google Cloud Platform where you can deploy your applications and configure Cloud resources like instance CPU, memory, scaling method, etc. This provides the architecture to run your app.
This service is not meant to be used on local environments. Instead, it is a great option to host an application that (ideally) has been tested on local environments.
Put differently: you don't run a Django application with Datastore dependencies on App Engine locally; you run a Django application with Datastore (and other) dependencies locally and then deploy it to App Engine once it is ready.
Most GCP services have client libraries so we can interact with them via code, even in local environments. The ndb library you asked about belongs to Google Cloud Datastore and can be installed in Python environments with:
pip install google-cloud-ndb
After installing it, you will be ready to interact with Datastore locally. Please find details about setting up credentials and code snippets in the Datastore Python Client Library reference.
Hope this is helpful! :)
You can simply create an emulator instance of Datastore on your local machine:
gcloud beta emulators datastore start --project test --host-port "0.0.0.0:8002" --no-store-on-disk --consistency=1
And then use it in the code in your main app file:
import google.auth.credentials
from unittest import mock

from google.cloud import ndb

# config and ENVIRONMENTS are app-specific settings that tell the
# local environment apart from production
def get_ndb_client(namespace):
    if config.ENVIRONMENT != ENVIRONMENTS.LOCAL:
        # production
        db = ndb.Client(namespace=namespace)
    else:
        # localhost: the emulator accepts mock credentials
        credentials = mock.Mock(spec=google.auth.credentials.Credentials)
        db = ndb.Client(project="test", credentials=credentials, namespace=namespace)
    return db

ndb_client = get_ndb_client("ns1")
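For the local branch to actually reach the emulator, the client library also needs DATASTORE_EMULATOR_HOST (and the related variables) pointing at it; gcloud can emit the right exports (assuming the emulator started above is running):
$(gcloud beta emulators datastore env-init)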
I have an Alpine Linux virtual machine and I want to install MongoDB. I added the MongoDB package using apk add mongodb. I started the Mongo daemon by running mongod in one terminal, then opened another terminal with the Mongo shell using mongo --disableJavaScriptJIT. I tried adding data to the database and reading it back, and that worked fine. But when I run sudo service mongodb restart, I get the following output:
* Caching service dependencies ... [ ok ]
* Starting mongodb ...
* start-stop-daemon: failed to start `/usr/bin/mongod' [ !! ]
* ERROR: mongodb failed to start
The first thing you should do is read the log file. I suspect you'll see there that mongodb doesn't have the rights to access some files. When you started it manually, you didn't run it as the mongodb user, did you?
If this hypothesis is right, then the solution is to fix the owner (and group) of /var/lib/mongodb (recursively).
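A sketch of that check and fix (the log path is an assumption; /etc/conf.d/mongodb shows the one your service actually uses):
# look for permission errors in the log
tail -n 50 /var/log/mongodb.log

# give the service user ownership of the data directory, recursively
chown -R mongodb:mongodb /var/lib/mongodb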
I have an app recently deployed on an Ubuntu server using Dokku. This is a Node.js app with a MongoDB database.
For the site to work properly I need to load a GeoJSON file into the database. On my development machine this was done from the Ubuntu command line using the mongoimport command. I can't figure out how to do this in Dokku.
I also need to add a geospatial index. This was done from the mongo console on my development machine. I also can't figure out how to do that on the Dokku install.
Thanks a lot @Jonathan. You helped me solve this problem. Here is what I did.
I used mongodump on my local machine to create a backup file of the database. It defaulted to a .bson file.
I uploaded that file to my remote server. On the remote server I put the bson file inside a folder called "dump", then tarred that folder. I initially used the -z flag out of habit, but mongo/dokku didn't like the gzip, so I used tar with no compression, like so:
tar -cvf dump.tar dump
Next I ran the dokku mongo import command:
$ dokku mongo:import mongo_claims < dump.tar
2016-03-05T18:04:17.255+0000 building a list of collections to restore from /tmp/tmp.6S378QKhJR/dump dir
2016-03-05T18:04:17.270+0000 restoring mongo_claims.docs4 from /tmp/tmp.6S378QKhJR/dump/docs4.bson
2016-03-05T18:04:20.729+0000 [############............] mongo_claims.docs4 22.3 MB/44.2 MB (50.3%)
2016-03-05T18:04:22.821+0000 [########################] mongo_claims.docs4 44.2 MB/44.2 MB (100.0%)
2016-03-05T18:04:22.822+0000 no indexes to restore
2016-03-05T18:04:22.897+0000 finished restoring mongo_claims.docs4 (41512 documents)
2016-03-05T18:04:22.897+0000 done
That did the trick. My site immediately had all the data.
mongodump will export all the data + indexes from an existing database.
https://docs.mongodb.org/manual/reference/program/mongodump/
Then mongorestore will restore a mongodump with indexes to an existing database.
https://docs.mongodb.org/manual/reference/program/mongorestore/
mongorestore recreates indexes recorded by mongodump.
You can run both commands from your dev machine against the Dokku database, as sketched below.
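A sketch of that round trip (host, port, and credentials are placeholders; dokku mongo:info on the server shows the real connection details, and the service must be exposed to be reachable from outside):
# dump the local development database, including index definitions
mongodump --db mydb --out ./dump

# restore it, with indexes, into the Dokku-managed database
mongorestore --host <server> --port <exposed-port> -u <user> -p <password> --db mydb ./dump/mydb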
Importing works well, but since you mentioned the mongo console, it's nice to know that you can also connect to your Mongo instance if you use https://github.com/dokku/dokku-mongo's mongo:list and mongo:connect...
E.g.:
root@somewhere:~# dokku mongo:list
NAME VERSION STATUS EXPOSED PORTS LINKS
mydb mongo:3.2.1 running 1->2->3->4->5 mydb
root@somewhere:~# dokku mongo:connect mydb
MongoDB shell version: 3.2.1
connecting to: mydb
> db
mydb
Mongo shell!
> exit
bye
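Once connected like this, you can also create the geospatial index the question asks about, e.g. (a sketch: the collection name comes from the import shown earlier, and the field name assumes the conventional GeoJSON layout):
> db.docs4.createIndex({ geometry: "2dsphere" })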
For dokku v0.5.0+ and dokku-mongo v1.7.0+
Use mongodump to export your data into an archive:
mongodump --db mydb --gzip --archive=mydb.archive
Use mongo:import to import your data from the archive:
dokku mongo:import mydb < mydb.archive
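The plugin also offers the reverse direction if you ever need to pull data back out (a sketch; dokku-mongo provides mongo:export, which writes a dump to stdout):
dokku mongo:export mydb > mydb.dump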