I'm playing around with the Ghost blogging platform (https://github.com/TryGhost/Ghost/issues/769). It uses the knex module for Node.js to interact with an SQLite3 database. The rollback migration looks like this. Is there a way to run this from the sqlite3 console, and if not, how would I roll back migrations?
down = function () {
  return when.all([
    knex.schema.dropTableIfExists('posts_tags'),
    knex.schema.dropTableIfExists('roles_users'),
    knex.schema.dropTableIfExists('permissions_users'),
    knex.schema.dropTableIfExists('permissions_roles'),
    knex.schema.dropTableIfExists('users')
  ]).then(function () {
    return when.all([
      knex.schema.dropTableIfExists('roles'),
      knex.schema.dropTableIfExists('settings'),
      knex.schema.dropTableIfExists('permissions'),
      knex.schema.dropTableIfExists('tags'),
      knex.schema.dropTableIfExists('posts')
    ]);
  });
};

exports.up = up;
exports.down = down;
exports.constraints = constraints;
There isn't currently a way to run this through the command line. We keep both the up and down migrations defined in case we need them in future, but there isn't a use for them at present.
The migration system in Ghost is also currently being rewritten before any actual migrations are done in version 0.4.
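That said, if the goal is simply to undo what the initial migration created, the equivalent effect can be had from the sqlite3 console by dropping the same tables by hand. This is my own sketch rather than an official Ghost procedure; open the database file your Ghost config points at and run:
-- mirrors the down migration above: drop join tables first, then the base tables
DROP TABLE IF EXISTS posts_tags;
DROP TABLE IF EXISTS roles_users;
DROP TABLE IF EXISTS permissions_users;
DROP TABLE IF EXISTS permissions_roles;
DROP TABLE IF EXISTS users;
DROP TABLE IF EXISTS roles;
DROP TABLE IF EXISTS settings;
DROP TABLE IF EXISTS permissions;
DROP TABLE IF EXISTS tags;
DROP TABLE IF EXISTS posts;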
Let's say I have a node in a secondary realtime database called "test" with a value of "foobar".
I want to set up a function that prevents it from being deleted. More realistically this node would have several child nodes, where the function first checks if it can be deleted or not. However, here we never allow it to be deleted to keep the code as short as possible.
So I add a function that triggers onDelete and just rewrites the value.
In short:
Secondary database has: {"test":"foobar"}
onDelete function:
exports.testDelete = functions.database
  .instance("secondary")
  .ref("test")
  .onDelete(async (snap, context) => {
    await snap.ref.set(snap.val());
  });
When running this with the emulators, I would expect that when I delete the node, it simply reappears in the secondary database, which is what happens when the function is deployed to production. In the emulators, however, the node reappears in the main (default) database instead of the secondary one. The only fix I can see is to replace snap.ref.set(snap.val()) with admin.app().database("https://{secondarydatabasedomain}.firebasedatabase.app").ref().child("test").set(snap.val()), which looks rather cumbersome just to get the emulators to work.
Am I doing something wrong here?
I am using node 14, and firebase CLI version 9.23.0
To specify the instance and path, you have followed the documented syntax:
For an instance named "my-app-db-2": functions.database.instance('my-app-db-2').ref('/foo/bar')
You have named the instance explicitly (otherwise the trigger would point at the default database), so the syntax seems correct.
For handling the event data, the signature is:
onDelete(handler: (snapshot: DataSnapshot, context: EventContext) => any): CloudFunction
For example, you can refer to the documentation:
// Listens for new messages added to /messages/:pushId/original and creates an
// uppercase version of the message at /messages/:pushId/uppercase
exports.makeUppercase = functions.database.ref('/messages/{pushId}/original')
  .onCreate((snapshot, context) => {
    // Grab the current value of what was written to the Realtime Database.
    const original = snapshot.val();
    functions.logger.log('Uppercasing', context.params.pushId, original);
    const uppercase = original.toUpperCase();
    // You must return a Promise when performing asynchronous tasks inside a function,
    // such as writing to the Firebase Realtime Database.
    // Setting an "uppercase" sibling in the Realtime Database returns a Promise.
    return snapshot.ref.parent.child('uppercase').set(uppercase);
  });
If all of the above syntax has been followed correctly, then I would recommend reporting a bug on the repo with a minimal repro, including the entire cloud function, as mentioned by Frank in a similar scenario.
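In the meantime, a workaround in the spirit of what the question already describes is to write back through an explicit reference to the secondary instance rather than through snap.ref. A minimal sketch, assuming the Admin SDK is initialized; the database URL below is a placeholder you would replace with your own:
const functions = require("firebase-functions");
const admin = require("firebase-admin");

admin.initializeApp();

exports.testDelete = functions.database
  .instance("secondary")
  .ref("test")
  .onDelete(async (snap, context) => {
    // Target the secondary instance explicitly so the emulator cannot
    // fall back to the default database. The URL is a placeholder.
    const secondary = admin
      .app()
      .database("https://<your-secondary-db>.firebasedatabase.app");
    return secondary.ref("test").set(snap.val());
  });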
I am writing tests for a service that uses knex; however, since the code under test makes several calls to knex.fn.now(), my tests produce varied results over time. I'm wondering if it's possible to mock/spy/hijack the inner calls to knex.fn.now() with something I can control, while letting the rest of the code keep its 'real' implementation. I can only find examples of mocking knex completely, which would defeat the purpose of my tests.
So I'm wondering if it's possible to have Jest listen for a specific function call and insert another value in its stead.
You can mock the knex package by creating the file __mocks__/knex/index.js.
Inside this file you can require the real knex implementation, change it, and export it.
It should look something like this:
// __mocks__/knex/index.js
// Require the real knex module, then patch fn.now to return a fixed
// timestamp so anything built with it during tests is deterministic.
const knex = require('knex');
const fixedTime = new Date();
knex.fn.now = () => fixedTime;
module.exports = knex;
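Since knex lives in node_modules, Jest should pick up this manual mock automatically, without an explicit jest.mock('knex') call. A quick sanity-check sketch of what tests then see (the test file name is hypothetical):
// service.test.js (hypothetical test file)
const knex = require('knex'); // resolves to the manual mock above

test('knex.fn.now() is frozen to a single timestamp', () => {
  // Both calls return the same fixedTime Date from the mock,
  // so values produced at different moments stay identical.
  expect(knex.fn.now()).toBe(knex.fn.now());
});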
I searched a lot, but have not found the specific solution to my problem yet.
I have an async function that polls a database from a Node script that should run forever. It does so under Node 12, but under Node 14 something changed and the script exits immediately after running once.
(async function pollDatabase() {
  const db = new Db();
  return async.forever(async function (pollFn) {
    const query = "SELECT * FROM templates WHERE data->>'isProcessed'='0' LIMIT 1;";
    const templates = await db.query(query);
    if (!templates || !templates.length) {
      setTimeout(pollFn, 1000);
      return;
    }
    await doSomethingWithTemplateAndReExecuteThisFunction(templates[0], pollFn);
  });
})();
The weird thing is that, for example, an Express server just stays running, but I have not yet figured out how that works, and I was not planning to convert this background script into a server. Any thoughts on the best way to make this run forever as some kind of background task? At the moment it is a Docker container containing just this script.
I have solved it myself in the end. I don't know exactly why it happened with the switch to Node 14, but the combination of an outdated pg-promise and bluebird library seems to have been the culprit. I guess it probably has to do with the way Promises are now handled in Node 14.
Old versions:
pg-promise#3.2.6 and bluebird#2.10.2
New versions:
pg-promise#^10.9.4 with the native Promise implementation
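For context on why the process can exit at all: Node keeps running only while the event loop has pending work (timers, sockets, and so on), so if the promise chain driving the poller settles without scheduling anything further, the script simply ends. As a library-free alternative, here is a sketch of the same polling loop using native promises, which always keeps a timer or query pending (Db and the query are stand-ins from the question; processTemplate is a hypothetical name for the processing step):
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

(async function pollDatabase() {
  const db = new Db(); // stand-in for the question's Db class

  // An awaited sleep or query is always pending, so the event loop
  // never drains and the process keeps running.
  while (true) {
    const query = "SELECT * FROM templates WHERE data->>'isProcessed'='0' LIMIT 1;";
    const templates = await db.query(query);

    if (!templates || !templates.length) {
      await sleep(1000); // nothing to do yet; wait and poll again
      continue;
    }

    await processTemplate(templates[0]); // hypothetical processing step
  }
})();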
I'm trying to do a simple command line database transformation with node.js and sequelize. I've simplified my errant code down to the following, but it never returns:
// Set up database connection and models
var models = require('../models_sequelize');

models.User.findOne()
  .then(a => {
    console.log(a.name);
  });
I get a name printed, but then the script hangs. What is wrong? How do I debug this to see what's stuck? I get the impression that there's an orphan promise that's not being fulfilled, but I don't understand where or why. I must be missing something obvious.
If I run the same interactively from the node console, it returns fine.
Sirko's comment re: close() gave me something to go on. I can stop the hanging with the following code:
var models = require('../models_sequelize');

models.User.findOne()
  .then(a => {
    console.log(a.name);
    models.sequelize.close();
  });
Alternatively, this seems to work too as I guess it's doing exactly the same thing:
var models = require('../models_sequelize');

models.User.findOne()
  .then(a => {
    console.log(a.name);
  })
  .finally(() => {
    models.sequelize.close();
  });
I also found something about connection pooling timeouts, but I don't think that affects my simple use case; I imagine it only comes into play in more complicated examples.
My guess is that the script hangs because Sequelize's connection pool keeps open handles that prevent the Node process from exiting until close() releases them, but I would still like to find a good reference explaining why this is necessary rather than relying on my guess.
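For the "how do I see what's stuck" part of the question, one approach I have seen used (my suggestion, not something from the Sequelize docs) is the why-is-node-running package, which prints the active handles keeping the event loop alive:
// npm install --save-dev why-is-node-running
// should be the first require so it can instrument handle creation
const log = require('why-is-node-running');
var models = require('../models_sequelize');

models.User.findOne()
  .then(a => {
    console.log(a.name);
    // After the query resolves, dump whatever is still keeping the
    // process alive (expected: the connection pool's open sockets).
    setTimeout(() => log(), 100);
  });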
I am running several LoopBack tests via Mocha (let's call them test1.js, test2.js and test3.js).
When I run them independently, everything works well. However, when I ask Mocha to run them all, the records created in the in-memory DB by the first test collide with tests run later on (test 2 or 3).
Is there a way to ensure each test starts with an empty DB? Something like:
app.dataSources.db.reset()
Thanks a lot!
UPDATE: What I ended up doing: I looked at the DataSource code and found that you can run automigrate on the memory DB.
before("wipe DB (if used with other tests)", function(done) {
app.dataSources.db.automigrate(function(err) {
done(err);
});
});
Get hold of the db via app.dataSources.db and run automigrate, as in:
before("wipe DB (if used with other tests)", function(done) {
app.dataSources.db.automigrate(function(err) {
done(err);
});
});
Cheers.
Normally you should clean up after each test.
You can use hooks, like afterEach, as sketched below.
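A minimal sketch combining the two suggestions, assuming the same in-memory datasource name (db) used above; the file path in the comments is hypothetical:
// test/common.js (hypothetical shared test setup)
const app = require('../server/server'); // path is an assumption

// Recreate the in-memory schema after every test so data from one
// test cannot leak into the next.
afterEach(function(done) {
  app.dataSources.db.automigrate(function(err) {
    done(err);
  });
});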