How to find Google Spanner table creation date and time? - google-cloud-spanner

I am trying to find the table creation date and time in Google Spanner, but I don't see any option for it. There is no information available in INFORMATION_SCHEMA either.

In Spanner, you cannot get a creation time for a table, and I believe that is due to how Spanner tables are designed and split across multiple regions, so a reliable create_time cannot be provided. A workaround is to record the time when you create a table and save it to another table; that way you keep a register of all your table creation times. Code for this would be:
// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

// Creates a client
const projectId = "PROJECTID";
const instanceId = "INSTANCEID";
const databaseId = "DATABASEID";
const spanner = new Spanner({projectId});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

// create the table TablesMeta that will hold your tables' creation times
const tableMetaschema = [
  'CREATE TABLE TablesMeta (' +
  '  tableId INT64 NOT NULL,' +
  '  TableName STRING(1024),' +
  '  TableCreateTime DATE' +
  ') PRIMARY KEY(tableId)'
];
database.updateSchema(tableMetaschema);

// create a new table and add its creation time to the TablesMeta table
const singerSchema = [
  'CREATE TABLE Singers (' +
  '  SingerId INT64 NOT NULL,' +
  '  FirstName STRING(1024),' +
  '  LastName STRING(1024),' +
  '  SingerInfo BYTES(MAX)' +
  ') PRIMARY KEY(SingerId)'
];
database.updateSchema(singerSchema).then(() => {
  database.runTransaction(async (err, transaction) => {
    if (err) {
      console.error(err);
      return;
    }
    try {
      const [rowCount] = await transaction.runUpdate({
        sql:
          'INSERT TablesMeta (tableId, TableName, TableCreateTime) VALUES (1, @TableName, @TableCreateTime)',
        params: {
          TableName: 'Singers',
          TableCreateTime: Spanner.date(),
        },
      });
      console.log(
        `Successfully inserted ${rowCount} record into the TablesMeta table.`
      );
      await transaction.commit();
    } catch (err) {
      console.error('ERROR:', err);
    } finally {
      // Close the database when finished.
      database.close();
    }
  });
});
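To read a table's recorded creation time back later, you can query the TablesMeta register. A minimal sketch, assuming the same database handle as above and that a TablesMeta row was inserted for the table:

// Look up the recorded creation time for the Singers table
database
  .run({
    sql: 'SELECT TableCreateTime FROM TablesMeta WHERE TableName = @TableName',
    params: {TableName: 'Singers'},
  })
  .then(([rows]) => {
    rows.forEach(row => console.log(row.toJSON()));
  })
  .catch(err => console.error('ERROR:', err));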

Related

Why do we use put id as a param in put method

I am trying to increase a value in the database:
const handleStockUpdate = (event) => {
  event.preventDefault();
  const newQuantity = event.target.restock.value;
  const quantity = { quantity: newQuantity };
  setNewCount({ ...book, quantity: book.quantity + parseInt(newQuantity) });
  if (newQuantity < 0) {
    toast("Please enter a valid number");
  } else {
    axios.put(`http://localhost:5000/stock/${`idHeare`}`, { quantity });
  }
};
The API:
app.put('/stock/:id', async (req, res) => {
  const id = req.params.id;
  console.log(id);
  const quantity = req.body.quantity.quantity;
  const book = await bookCollections.findOne({ _id: ObjectId(id) });
  const newUpdate = parseInt(book.quantity) + parseInt(quantity);
  const result = await bookCollections.updateOne(
    { _id: ObjectId(id) },
    { $set: { quantity: newUpdate } }
  );
  res.send(result);
});
My question is: why should I use the id, and where can I get it? Without the id I get a network error.
The PUT method is used to update data.
The id you provide to the API is used to retrieve the row in your database that you want to update. As you may know, every data row in a database table must have a unique id.
So for your book object, if you retrieved it from the database in the first place, you should already have an id with that book data.
You have to provide that book id in the PUT request.
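As a minimal sketch of that flow (the /books endpoint and helper names here are assumptions for illustration, not from your code), the _id comes back with the book when you fetch it, and that same _id is what goes into the PUT URL:

// Fetch the book first; the MongoDB _id comes back with it
const loadBook = async (bookId) => {
  const { data: book } = await axios.get(`http://localhost:5000/books/${bookId}`);
  return book; // book._id is the unique id you need for later updates
};

// Use that _id when updating the stock
const updateStock = (book, newQuantity) => {
  // body shape matches the server, which reads req.body.quantity.quantity
  return axios.put(`http://localhost:5000/stock/${book._id}`, {
    quantity: { quantity: newQuantity },
  });
};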

How do I run an update while streaming using pg-query-stream and pg-promise?

I am trying to load 50000 items with text in them from the database, tag them, and update the tags.
I am using pg-promise and pg-query-stream for this purpose.
I was able to get the streaming part working properly, but updating has become problematic with so many update statements.
Here is my existing code:
const QueryStream = require('pg-query-stream')
const JSONStream = require('JSONStream')

function prepareText(title, content, summary) {
  let description
  if (content && content.length) {
    description = content
  } else if (summary && summary.length) {
    description = summary
  } else {
    description = ''
  }
  return title.toLowerCase() + ' ' + description.toLowerCase()
}

async function tagAll({ db, logger, tagger }) {
  // you can also use pgp.as.format(query, values, options)
  // to format queries properly, via pg-promise;
  const qs = new QueryStream(
    'SELECT feed_item_id,title,summary,content FROM feed_items ORDER BY pubdate DESC, feed_item_id DESC'
  )
  try {
    const result = await db.stream(qs, (s) => {
      // initiate streaming into the console:
      s.pipe(JSONStream.stringify())
      s.on('data', async (item) => {
        try {
          s.pause()
          // eslint-disable-next-line camelcase
          const { feed_item_id, title, summary, content } = item
          // Process text to be tagged
          const text = prepareText(title, summary, content)
          const tags = tagger.tag(text)
          // Update tags per post
          await db.query(
            'UPDATE feed_items SET tags=$1 WHERE feed_item_id=$2',
            // eslint-disable-next-line camelcase
            [tags, feed_item_id]
          )
        } catch (error) {
          logger.error(error)
        } finally {
          s.resume()
        }
      })
    })
    logger.info(
      'Total rows processed:',
      result.processed,
      'Duration in milliseconds:',
      result.duration
    )
  } catch (error) {
    logger.error(error)
  }
}

module.exports = tagAll
The db object is the one from pg-promise, while tagger simply extracts an array of tags from the given text, which ends up in the tags variable.
Too many update statements are executing, from what I can see in the diagnostics. Is there a way to batch them?
If you can do everything with one SQL statement, you should! Here you're paying the price of a round trip between Node and your DB for each row of your table, and that will take up most of the time of your query.
Your request can be implemented in pure SQL:
update feed_items set tags=case
when (content = '') is false then lower(title) || ' ' || lower(content)
when (summary = '') is false then lower(title) || ' ' || lower(summary)
else title end;
This request updates your whole table at once. I'm sure it would be orders of magnitude faster than your method. On my machine, with a table containing 100000 rows, the update takes about 600 ms.
Some remarks:
You don't need to ORDER BY to update; since ordering is quite slow, it's better not to.
I guess the LIMIT part was there because it was too slow? If that's the case, you can drop it: 50000 rows is not a big table for Postgres.
I bet this pg-stream thing does not really stream rows out of the DB; it only lets you use a stream-like API over results it has already gathered. No problem with that, but I thought there might be a misconception here.
This is the best I could come up with to batch the queries inside the stream so that we don't need to load all the data in memory or run too many queries. If anyone knows a better way to batch, especially with t.sequence, feel free to add another answer.
const BATCH_SIZE = 5000

async function batchInsert({ db, pgp, logger, data }) {
  try {
    // https://vitaly-t.github.io/pg-promise/helpers.ColumnSet.html
    const cs = new pgp.helpers.ColumnSet(
      [
        { name: 'feed_item_id', cast: 'uuid' },
        { name: 'tags', cast: 'varchar(64)[]' },
      ],
      {
        table: 'feed_items',
      }
    )
    const query =
      pgp.helpers.update(data, cs) + ' WHERE v.feed_item_id=t.feed_item_id'
    await db.none(query)
  } catch (error) {
    logger.error(error)
  }
}

async function tagAll({ db, pgp, logger, tagger }) {
  // you can also use pgp.as.format(query, values, options)
  // to format queries properly, via pg-promise;
  const qs = new QueryStream(
    'SELECT feed_item_id,title,summary,content FROM feed_items ORDER BY pubdate DESC, feed_item_id DESC'
  )
  try {
    const queryValues = []
    const result = await db.stream(qs, (s) => {
      // initiate streaming into the console:
      s.pipe(JSONStream.stringify())
      s.on('data', async (item) => {
        try {
          s.pause()
          // eslint-disable-next-line camelcase
          const { feed_item_id, title, summary, content } = item
          // Process text to be tagged
          const text = prepareText(title, summary, content)
          const tags = tagger.tag(text)
          queryValues.push({ feed_item_id, tags })
          if (queryValues.length >= BATCH_SIZE) {
            const data = queryValues.splice(0, queryValues.length)
            await batchInsert({ db, pgp, logger, data })
          }
        } catch (error) {
          logger.error(error)
        } finally {
          s.resume()
        }
      })
    })
    await batchInsert({ db, pgp, logger, data: queryValues })
    return result
  } catch (error) {
    logger.error(error)
  }
}
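For anyone curious about the t.sequence route mentioned above, here is a rough sketch of batching by paging inside a single pg-promise transaction instead of streaming. The page size, function name, and ORDER BY clause are my own assumptions, and it reuses prepareText and tagger from the question; treat it as a sketch, not a drop-in replacement:

const PAGE_SIZE = 5000

function tagAllPaged({ db, pgp, tagger }) {
  const cs = new pgp.helpers.ColumnSet(
    [
      { name: 'feed_item_id', cast: 'uuid' },
      { name: 'tags', cast: 'varchar(64)[]' },
    ],
    { table: 'feed_items' }
  )
  return db.tx('tag-all-paged', (t) =>
    // t.sequence keeps calling the source function until it resolves with undefined
    t.sequence(async (index) => {
      const rows = await t.any(
        'SELECT feed_item_id,title,summary,content FROM feed_items ORDER BY feed_item_id LIMIT $1 OFFSET $2',
        [PAGE_SIZE, index * PAGE_SIZE]
      )
      if (!rows.length) {
        return undefined // no more pages: end the sequence
      }
      const data = rows.map((r) => ({
        feed_item_id: r.feed_item_id,
        tags: tagger.tag(prepareText(r.title, r.content, r.summary)),
      }))
      const update =
        pgp.helpers.update(data, cs) + ' WHERE v.feed_item_id=t.feed_item_id'
      // resolve with the page size so the sequence continues
      return t.none(update).then(() => rows.length)
    })
  )
}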

ES6 Async/Await, ExpressJS and Postgres transactions

REVISED QUESTION
I've revised the question, in the hope of getting a clearer answer.
I'm trying to process data in ExpressJS, based on the incoming req.body and the existing data in the table.
I'm receiving a req.body that contains a JSON list of updated fields. Some of those fields are stored as JSONB in Postgres. If an incoming field is JSONB, then the form (external code) that is making the request has already run a jsonpatch.compare() to generate the list of patches, and it is these patches and not the full values that are being passed in. For any non-JSONB values, incoming values just need to be passed through to the UPDATE query.
I have a working version, as below, that pretends that the existing JSONB values in the table ARE NULL. Clearly, this is NOT what is needed; I need to pull the values from the DB. The version that does not query the current values, with a bare-minimum router, looks like this:
const express = require('express')
const bodyParser = require('body-parser')
const SQL = require('sql-template-strings')
const { Client } = require('pg')
const dbConfig = require('../db')
const jsonpatch = require('fast-json-patch')
const FormRouter = express.Router()
I have some update code:
const patchFormsRoute = (req, res) => {
  const client = new Client(dbConfig)
  const { id } = req.body
  const parts = []
  const params = [id]
  // list of JSONB fields for the 'forms' table
  const jsonFields = [
    'sections',
    'editors',
    'descriptions',
  ]
  // list of all fields, including JSONB fields in the 'forms' table
  const possibleFields = [
    'status',
    'version',
    'detail',
    'materials',
    ...jsonFields,
  ]
  // this is a DUMMY RECORD instead of the result of a client.query
  let currentRecord = { 'sections': [], 'editors': [], 'descriptions': [] }
  possibleFields.forEach(myProp => {
    if (req.body[myProp] != undefined) {
      parts.push(`${myProp} = $${params.length + 1}`)
      if (jsonFields.indexOf(myProp) > -1) {
        const val = currentRecord[myProp]
        jsonpatch.applyPatch(val, req.body[myProp])
        params.push(JSON.stringify(val))
      } else {
        params.push(req.body[myProp])
      }
    }
  })
  const updateQuery = 'UPDATE forms SET ' + parts.join(', ') + ' WHERE id = $1'
  client.connect()
  return client
    .query(updateQuery, params)
    .then(result => res.status(200).json(result.rowCount))
    .catch(err => res.status(400).json(err.severity))
    .then(() => client.end())
}

FormRouter.route('/')
  .patch(bodyParser.json({ limit: '50mb' }), patchFormsRoute)

exports.FormRouter = FormRouter
I promise that this is working code, which does almost what I need. However, I want to replace the dummy record with the data already in the table, fetched contemporaneously. My issue is that multiple clients could be updating a row at the same time (but looking at orthogonal elements of the JSONB values), so I need the fetch, calc, and update to happen as a SINGLE TRANSACTION. My plan is to:
BEGIN a transaction
Query Postgres for the current row value, based on the incoming id
For any JSONB fields, apply the patch to generate the correct value for that field in the UPDATE statement.
Run the UPDATE statement with the appropriate param values (either from the req.body or the patched row, depending on whether the field is JSONB or not)
COMMIT the transaction, or ROLLBACK on error.
I've tried implementing the answer from @midrizi; maybe it's just me, but the combination of awaits and plain testing of res sends the server off into hyperspace... and ends in a timeout.
In case anyone is still awake, here's a working solution to my issue.
TLDR; RTFM: A pooled client with async/await minus the pooling (for now).
const patchFormsRoute = (req, res) => {
  const { id } = req.body
  // list of JSONB fields for the 'forms' table
  const jsonFields = [
    'sections',
    'editors',
    'descriptions',
  ]
  // list of all fields, including JSONB fields in the 'forms' table
  const possibleFields = [
    'status',
    'version',
    'detail',
    'materials',
    ...jsonFields,
  ]
  const parts = []
  const params = [id]
  ;(async () => {
    const client = await new Client(dbConfig)
    await client.connect()
    try {
      // begin a transaction
      await client.query('BEGIN')
      // get the current form data from DB
      const fetchResult = await client.query(
        SQL`SELECT * FROM forms WHERE id = ${id}`,
      )
      if (fetchResult.rowCount === 0) {
        res.status(400).json(0)
        await client.query('ROLLBACK')
      } else {
        const currentRecord = fetchResult.rows[0]
        // patch JSONB values or update non-JSONB values
        let val = []
        possibleFields.forEach(myProp => {
          if (req.body[myProp] != undefined) {
            parts.push(`${myProp} = $${params.length + 1}`)
            if (jsonFields.indexOf(myProp) > -1) {
              val = currentRecord[myProp]
              jsonpatch.applyPatch(val, req.body[myProp])
              params.push(JSON.stringify(val))
            } else {
              params.push(req.body[myProp])
            }
          }
        })
        const updateQuery =
          'UPDATE forms SET ' + parts.join(', ') + ' WHERE id = $1'
        // update record in DB
        const result = await client.query(updateQuery, params)
        // commit transaction
        await client.query('COMMIT')
        res.status(200).json(result.rowCount)
      }
    } catch (err) {
      await client.query('ROLLBACK')
      res.status(400).json(err.severity)
      throw err
    } finally {
      client.end()
    }
  })().catch(err => console.error(err.stack))
}
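Since the solution above is explicitly "minus the pooling (for now)", here is a rough sketch of what the pooled variant could look like with pg's Pool, checking a client out per request and releasing it in a finally block. The helper name and structure are my own assumptions, not part of the original answer:

const { Pool } = require('pg')
const dbConfig = require('../db')

// one shared pool for the whole process
const pool = new Pool(dbConfig)

const withTransaction = async (work) => {
  const client = await pool.connect()
  try {
    await client.query('BEGIN')
    const result = await work(client)
    await client.query('COMMIT')
    return result
  } catch (err) {
    await client.query('ROLLBACK')
    throw err
  } finally {
    // return the client to the pool instead of closing the connection
    client.release()
  }
}

// usage inside patchFormsRoute, replacing the per-request Client:
// const rowCount = await withTransaction(async (client) => {
//   const fetchResult = await client.query(SQL`SELECT * FROM forms WHERE id = ${id}`)
//   // ...patch JSONB fields and build updateQuery/params as above...
//   const result = await client.query(updateQuery, params)
//   return result.rowCount
// })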

Node.js: querying SQLite with 'sqlite'

I'm trying to get the hang of Node (I mainly use Python), so I'm working on a small project to read and write data to an SQLite database.
Luckily I'm having no issue writing to the database, but I cannot seem to get queries to work at all. I've tested the queries in the SQL terminal and they are successful.
So far, I have something like:
const fs = require("fs");
const util = require("util");
const sqlite = require("sqlite");
const Promise = require("bluebird")

// const DATABASE = ":memory:";
const DATABASE = "./database.sqlite";

function insertDataIntoDatabase(transactions, db) {
  // Write each transaction into the database.
  let sqlStatement = "INSERT INTO Trx \
    (name, address, amount, category) \
    VALUES "
  for (var i = 0; i < transactions.length; ++i) {
    let trx = transactions[i];
    sqlStatement += util.format(
      "('%s', '%s', %d, '%s'), ",
      trx.name,
      trx.address,
      trx.amount,
      trx.category,
    );
  }
  sqlStatement = sqlStatement.substring(0, sqlStatement.length - 2);
  db.then(db => db.run(sqlStatement))
    .catch((err) => console.log(err));
}

function getTransactions(db, category) {
  // Return an array of valid transactions of a given category.
  let where = "";
  if (category) {
    where = util.format("WHERE category='%s'", category);
  }
  let sqlStatement = util.format("SELECT * from Trx %s", where);
  sqlStatement = "SELECT * from Trx"; // Trying to figure out whats happening
  console.log(sqlStatement);
  db.then(db => {
    db.all(sqlStatement)
      .then((err, rows) => {
        console.log(rows); // undefined
        console.log(err); // []
      })
  })
}

// Set up the db connection
const db = sqlite.open(DATABASE, { cached: true })
  .then(db => db.migrate({ force: 'last' }));

// Read transactions and write them to the database
fs.readFile("transactions.json", "utf8", (err, data) => {
  let transactions = JSON.parse(data).transactions;
  insertDataIntoDatabase(transactions, db);
})

// Get transaction data
getTransactions(db, 'credit');

// Close connection to DB
db.then(db => db.close());
Looking at this again, I think the issue is the async nature of Node. The query was successful, but at that point in time I had not yet inserted the data from the JSON file into the database, hence the empty result.
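A minimal sketch of that ordering fix, reusing the same sqlite calls from the question but sequencing everything with async/await so the insert finishes before the query runs (the parameterized INSERT and promisified readFile are my own substitutions):

const fs = require("fs");
const util = require("util");
const sqlite = require("sqlite");

const readFile = util.promisify(fs.readFile);

async function main() {
  // open (and migrate) the database once, then reuse the handle
  const db = await sqlite.open("./database.sqlite", { cached: true });
  await db.migrate({ force: "last" });

  // insert first, and wait for it to finish...
  const data = await readFile("transactions.json", "utf8");
  const transactions = JSON.parse(data).transactions;
  for (const trx of transactions) {
    await db.run(
      "INSERT INTO Trx (name, address, amount, category) VALUES (?, ?, ?, ?)",
      [trx.name, trx.address, trx.amount, trx.category]
    );
  }

  // ...then query; db.all resolves with the rows directly
  const rows = await db.all("SELECT * FROM Trx WHERE category = ?", ["credit"]);
  console.log(rows);

  await db.close();
}

main().catch(err => console.error(err));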

Azure Node.js Table Entity Update

I'm trying to update a table entity during data insertion using a server-side Node script. Here is what I'm trying to do: I have an Address table which has a column named geolocation of type "geography". When a user updates the Address, I'm using node-geocoder to get the latitude and longitude to update the geolocation column.
Here is the code snippet.
var table = module.exports = require('azure-mobile-apps').table();
table.dynamicSchema = true;

table.insert(function (context) {
  var address = context.item.lines1 + ' ' + context.item.lines2 + ' ' + context.item.city + ' ' + context.item.state + ' ' + context.item.zip;
  var geocoderProvider = 'google';
  var httpAdapter = 'https';
  var extra = {
    apiKey: '',
    formatter: null
  };
  var geocoder = require('node-geocoder')(geocoderProvider, httpAdapter, extra);
  geocoder.geocode(address)
    .then(function(res) {
      var geolocation = "POINT(" + res[0].longitude + " " + res[0].latitude + ")";
      console.log("Value of Geolocation is ", geolocation);
      context.item.geolocation = geolocation;
    })
    .catch(function(err) {
      console.log("Error ", err);
    });
  return context.execute();
});
However, I don't see the table being updated with the geolocation. Any pointers?
I looked at a few samples available online, but they are mostly based on the previous Mobile Services, where the insert method signature is different, for example:
function insert(item, user, request) {
  var queryString = "INSERT INTO Place (title, description, location) VALUES (?, ?, geography::STPointFromText('POINT(' + ? + ' ' + ? + ')', 4326))";
  mssql.query(queryString, [item.title, item.description, item.longitude.toString(), item.latitude.toString()], {
    success: function() {
      request.respond(statusCodes.OK, {});
    }
  });
}
context.execute() is being called before the geocode() promise is resolved. Move context.execute() inside the callback and return the promise from the function...
table.insert(function (context) {
  // ...
  var geocoder = require('node-geocoder')(geocoderProvider, httpAdapter, extra);
  return geocoder.geocode(address)
    .then(function(res) {
      var geolocation = "POINT(" + res[0].longitude + " " + res[0].latitude + ")";
      console.log("Value of Geolocation is ", geolocation);
      context.item.geolocation = geolocation;
      return context.execute();
    })
    .catch(function(err) {
      console.log("Error ", err);
    });
});
In my experience, you could try using the update operation instead of the insert operation on the table to update an existing object. And if the table already has the geolocation column, enabling the dynamicSchema property does not seem necessary.
Meanwhile, you can check the table access property for the update operation; see the sections "How to: Require authentication for access to tables" and "How to: Disable access to specific table operations" in the doc https://azure.microsoft.com/en-us/documentation/articles/app-service-mobile-node-backend-how-to-use-server-sdk/.
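As a rough sketch of that suggestion (reusing the geocoding setup from the answer above; the snippet is illustrative, not taken from the Azure docs), the update operation follows the same pattern of returning the promise chain that ends in context.execute():

table.update(function (context) {
  var address = context.item.lines1 + ' ' + context.item.lines2 + ' ' +
    context.item.city + ' ' + context.item.state + ' ' + context.item.zip;
  var geocoder = require('node-geocoder')('google', 'https', { apiKey: '', formatter: null });
  return geocoder.geocode(address)
    .then(function (res) {
      // set the geography column before the update statement runs
      context.item.geolocation = "POINT(" + res[0].longitude + " " + res[0].latitude + ")";
      return context.execute();
    })
    .catch(function (err) {
      console.log("Error ", err);
    });
});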
