Adding an Entity to an Azure Cosmos Table with a JavaScript Azure Function

In Azure, I have a JavaScript HTTP trigger Function App with:
const azure = require('azure-storage')
const tableSvc = azure.createTableService(
  process.env.COSMOS_TABLE_ACCOUNT,
  process.env.COSMOS_TABLE_KEY,
  process.env.COSMOS_TABLE_ENDPOINT
)
const entGen = azure.TableUtilities.entityGenerator
const testData = {
  PartitionKey: entGen.String('test'),
  RowKey: entGen.String(1),
  name: entGen.String('It works!')
}
const insertTestData = () => new Promise((resolve, reject) => {
  tableSvc.insertEntity('tests', testData, (error, res) => {
    if (error) return reject(error)
    resolve(res)
  })
})
...
I've confirmed that the environment variables are all set and populated with the values from Azure Cosmos DB -> Cosmos Table Instance -> Connection String.
I've also tried connecting with:
const tableSvc = azure.createTableService(
  process.env.COSMOS_TABLE_CONNECTION_STRING
)
When I call insertTestData(), I'm getting an error back from the .insertEntity callback with an empty object: {}. No entities are being added to my tests table, as confirmed by the Data Explorer.
Any ideas on how to perform this operation, or how to get more information in my debugger? I have an Application Insights monitor attached to the process, but it reports a successful completion.

I noticed that you're passing a numeric value for the RowKey attribute:
RowKey: entGen.String(1)
When I ran the code, it complained about that.
When I changed the code to:
RowKey: entGen.String('1')
I was able to insert the entity.
Here's my complete code:
const azure = require('azure-storage')
const tableSvc = azure.createTableService(
  'account-name',
  'account-key',
  'https://account-name.table.core.windows.net'
)
const entGen = azure.TableUtilities.entityGenerator
const testData = {
  PartitionKey: entGen.String('test'),
  RowKey: entGen.String('1'),
  name: entGen.String('It works!')
}
console.log(testData);
const insertTestData = () => new Promise((resolve, reject) => {
  tableSvc.insertEntity('test', testData, (error, res) => {
    if (error) return reject(error)
    resolve(res)
  })
})
console.log('----------------');
insertTestData()
  .then((result) => {
    console.log('result');
    console.log(result);
  })
  .catch((error) => {
    console.log('error');
    console.log(error);
  })
I used the azure-storage npm package (version 2.10.2).
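As a side note on the debugger question: if the error is coming through as {}, it may be because an Error's message and stack are non-enumerable, so serializing it directly prints an empty object. A generic Node.js trick (nothing specific to azure-storage) is to serialize the error's own property names, roughly like this sketch:
tableSvc.insertEntity('tests', testData, (error, res) => {
  if (error) {
    // Include non-enumerable properties such as message and stack in the output.
    console.log(JSON.stringify(error, Object.getOwnPropertyNames(error)));
    return;
  }
  console.log(res);
});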

Related

Error code 42601 in postgreSQL pg-promise, in NodeJS

I'm trying to bulk insert/update data in a database, but I get this error and I don't know why. This is my code.
This is the promise function that performs the bulk insert:
async bulkInsertFeatureTreatment(req, res) {
  const { featuresTreatments } = await decryptRequest(req);
  await db
    .tx((t) => {
      const queries = featuresTreatments.map((featureTreatment) => {
        return t.none(
          featuresTreatmentsDB.insertFeatureTreatment(featureTreatment)
        );
      });
      return t.batch(queries);
    })
    .then((data) =>
      cryptedResponse(res, response(200, INSERT_DATA_SUCCESS, data))
    )
    .catch((error) =>
      cryptedResponse(res, response(500, INSERT_DATA_NOT_SUCCESS, error))
    );
}
This is the insert/update function that the function above calls:
insertFeatureTreatment: (featuresTreatments) =>
  `INSERT INTO ${TABLE_NAME} (${COL_ID_TREATMENT}, ${COL_ID_FEATURE}, ${COL_VALUE})
   VALUES ('${featuresTreatments.id_treatment}', '${featuresTreatments.id_feature}', '${featuresTreatments.value}')
   ON CONFLICT (${COL_ID_TREATMENT}, ${COL_ID_FEATURE})
   DO UPDATE ${TABLE_NAME}
   SET ${COL_VALUE} = '${featuresTreatments.value}'
   WHERE ${COL_ID_TREATMENT} = '${featuresTreatments.id_treatment}'
   AND ${COL_ID_FEATURE} = '${featuresTreatments.id_feature}'`,
OK, that was a syntax error; it was about the ON CONFLICT syntax. This is the result:
insertFeatureTreatment: (featuresTreatments) =>
  `INSERT INTO ${TABLE_NAME} (${COL_ID_TREATMENT}, ${COL_ID_FEATURE}, ${COL_VALUE})
   VALUES ('${featuresTreatments.id_treatment}', '${featuresTreatments.id_feature}', '${featuresTreatments.value}')
   ON CONFLICT (${COL_ID_TREATMENT}, ${COL_ID_FEATURE})
   DO UPDATE SET ${COL_VALUE} = '${featuresTreatments.value}'`,
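One thing worth adding (a sketch, untested against this schema): since the values are interpolated directly into the SQL string, any quote inside a value will break the query, and it is open to SQL injection. pg-promise query methods accept a values array, so the same upsert can use positional parameters and let the driver do the escaping:
// Sketch: same upsert with positional parameters; the table/column constants are
// the ones from the question, and the caller passes the values separately.
insertFeatureTreatment: () =>
  `INSERT INTO ${TABLE_NAME} (${COL_ID_TREATMENT}, ${COL_ID_FEATURE}, ${COL_VALUE})
   VALUES ($1, $2, $3)
   ON CONFLICT (${COL_ID_TREATMENT}, ${COL_ID_FEATURE})
   DO UPDATE SET ${COL_VALUE} = $3`,

// ...and in the transaction:
// t.none(featuresTreatmentsDB.insertFeatureTreatment(),
//        [featureTreatment.id_treatment, featureTreatment.id_feature, featureTreatment.value])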

Using Cloud Functions to filter data by string from Cloud Firestore

I need to filter Firestore data by string, and since Firestore doesn't have a proper way to do this with its built-in tools (something like the SQL %LIKE% operator), our new strategy is to use Cloud Functions to filter the data with a regex and retrieve it afterwards.
But we're having some trouble:
1. Can we use async functions inside Cloud Functions?
2. After this function runs, how can we retrieve the data? (We're trying to use fetch(), but it doesn't seem to work.)
Here's my Cloud Function:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp(MY CREDENTIALS)
exports.filterAdverts = functions.https.onRequest( (req, res) => {
  let vm = this;
  let filteredResults = [];
  let {filter, id_client} = JSON.parse(req.body);
  let regex = new RegExp(filter, 'i');
  admin
    .firestore()
    .collection('database')
    .doc(id_client)
    .collection('classifieds')
    .where('active', '==', 1)
    .get().then(snapshot => {
      snapshot.docs.forEach(doc => {
        if (doc.data().txt_comentary.match(regex)) {
          filteredResults.push(doc.data())
        }
      })
    });
  res.send(filteredResults.toString())
});
As you can see, I need to filter the txt_comentary field with the regex. Here's the fetch function:
filterAds: function(){
  let filter = {filter: this.form_seach.filtro_comentary, id_cliente: this.id_cliente};
  fetch('Cloud-function-URL', {
    headers: {'Content-Type': 'application/x-www-form-urlencoded'},
    method: 'GET',
    mode: 'no-cors'
  }).then(async (response) => {
    await console.log(response)
  }).catch(error => {
    console.log(error)
  })
}
I'm really stuck on this, can anybody help me?
I reached a result: I used an onCall function to trigger it. Here's the Cloud Function code:
exports.filterAds = functions.https.onCall( async (data, context) => {
  let vm = this;
  let filteredResults = [];
  let {filter, id_client} = data;
  let regex = new RegExp(filter, 'i');
  await admin.firestore()
    .collection('database')
    .doc(id_client)
    .collection('classifieds')
    .where('active', '==', 1)
    .get().then(snapshot => {
      snapshot.docs.forEach(doc => {
        if (doc.data().txt_comentary.match(regex)) {
          filteredResults.push(doc.data())
        }
      })
    })
  return filteredResults;
});
and here's the function to trigger it:
filter_Ads: function () {
  const vm = this;
  const filteredValues = firebase.functions().httpsCallable("filtrarAnuncios");
  filteredValues({
    filter: *input string*,
    id_cliente: *part of path in my firebase search*,
  })
    .then(async (response) => {
      console.log(response)
    })
    .catch((error) => {
      console.log(error);
    });
},
With this I can filter by the regex and recover the result from the Cloud Function.
Yes, you can specify the version of Node.js and use a modern one that handles async/await.
You should call res.send inside your .then(snapshot => {...}), not after it, because right now you are calling it before the data has come back from the database.
Also, in your test, instead of await console.log(response) you should do console.log(await response.text()).
A few other things are surprising in your code, like how you .toString() your array instead of sending it as JSON.
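Putting those points together, the onRequest version would look roughly like this sketch (untested; it keeps the variable names from the question and adds only minimal error handling):
exports.filterAdverts = functions.https.onRequest((req, res) => {
  const { filter, id_client } = JSON.parse(req.body);
  const regex = new RegExp(filter, 'i');
  admin.firestore()
    .collection('database')
    .doc(id_client)
    .collection('classifieds')
    .where('active', '==', 1)
    .get()
    .then(snapshot => {
      const filteredResults = snapshot.docs
        .map(doc => doc.data())
        .filter(data => regex.test(data.txt_comentary));
      // Respond inside .then, once the data is actually available, and send JSON.
      res.json(filteredResults);
    })
    .catch(error => res.status(500).send(error.message));
});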

How to prevent Knex from hanging on insert

I am using Knex.js to insert values from an array into a PostgreSQL database. The problem I keep running into is that Knex will hang after inserting rows in the database.
I've been struggling with this for several hours, and have tried a variety of solutions, including Get Knex.js transactions working with ES7 async/await, Make KnexJS Transactions work with async/await, and Knex Transaction with Promises.
No matter which flavor I try, I come back to the hang. I'm pretty sure I'm missing something obvious, but it's possible I haven't had enough coffee.
Here's my test code:
const testArray = [
  {line: 'Canterbury Tales'},
  {line: 'Moby Dick'},
  {line: 'Hamlet'}
];
const insertData = (dataArray) => {
  return new Promise( (resolve, reject) => {
    const data = dataArray.map(x => {
      return {
        file_line: x.line
      };
    });
    let insertedRows;
    db.insert(data)
      .into('file_import')
      .then((result) => {
        insertedRows = result.rowCount;
        resolve(insertedRows);
      })
  });
}
const testCall = (b) => {
  insertData(b).then((result) => {
    console.log(`${result} rows inserted.`);
  })
}
testCall(testArray);
This returns the following:
3 rows inserted.
EDIT: Updating with solution
Thanks to @sigmus, I was able to get this working by adding db.destroy(). Here's the updated code block, fully functional:
const testArray = [
  {line: 'Canterbury Tales'},
  {line: 'Moby Dick'},
  {line: 'Hamlet'}
];
const insertData = (dataArray) => {
  return new Promise( (resolve, reject) => {
    const data = dataArray.map(x => {
      return {
        file_line: x.line
      };
    });
    let insertedRows;
    db.insert(data)
      .into('file_import')
      .then((result) => {
        insertedRows = result.rowCount;
        resolve(insertedRows);
      })
      .finally(() => {
        db.destroy();
      });
  });
}
const testCall = (b) => {
  insertData(b).then((result) => {
    console.log(`${result} rows inserted.`);
    process.exit(0);
  })
}
testCall(testArray);
If you add process.exit(0); right after console.log(`${result} rows inserted.`); the script should exit.
It may be a connection pool issue; try using destroy, as explained here: https://knexjs.org/#Installation-pooling
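For reference, the same flow written with async/await and the pool teardown in a finally block might look like the sketch below (untested; db is assumed to be the configured Knex instance from the question):
const insertData = async (dataArray) => {
  const data = dataArray.map(x => ({ file_line: x.line }));
  try {
    // Without .returning(), the pg driver result exposes rowCount, as in the question.
    const result = await db.insert(data).into('file_import');
    return result.rowCount;
  } finally {
    // Release the connection pool so the process can exit instead of hanging.
    await db.destroy();
  }
};

insertData(testArray)
  .then(rows => console.log(`${rows} rows inserted.`))
  .catch(console.error);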

TypeError: firestoreService.snapshot_ is not a function

I've been using firebase-functions-test to do some testing on my functions. I have some code that is supposed to post a document to Firestore, basically in the same way the Realtime Database examples show:
exports.addMessage = functions.https.onRequest((req, res) => {
  const original = req.query.text;
  admin.firestore()
    .collection('messages')
    .add({ original })
    .then(documentReference => res.send(documentReference))
    .catch(error => res.send(error));
});
For my test, I've spoofed some basic functionality using sinon, mocha and chai. Here is my current test, which is failing with the error message: TypeError: firestoreService.snapshot_ is not a function
describe('addMessage', () => {
  // add message should add a message to the database
  let oldDatabase;
  before(() => {
    // Save the old database method so it can be restored after the test.
    oldDatabase = admin.firestore;
  });
  after(() => {
    // Restoring admin.database() to the original method.
    admin.firestore = oldDatabase;
  });
  it('should return the correct data', (done) => {
    // create stubs
    const refStub = sinon.stub();
    // create a fake request object
    const req = {
      query: {
        text: 'fly you fools!'
      }
    };
    const snap = test.firestore.makeDocumentSnapshot({ original: req.query.text }, 'messages/1234');
    // create a fake document reference
    const fakeDocRef = snap._ref;
    // create a fake response object
    const res = {
      send: returnedDocRef => {
        // test the result
        assert.equal(returnedDocRef, fakeDocRef);
        done();
      }
    };
    // spoof firestore
    const adminStub = sinon.stub(admin, 'firestore').get(() => () => {
      return {
        collection: () => {
          return {
            add: (data) => {
              const secondSnap = test.firestore.makeDocumentSnapshot(data, 'messages/1234');
              const anotherFakeDocRef = secondSnap._ref;
              return Promise.resolve(anotherFakeDocRef);
            }
          }
        }
      }
    });
    // call the function to execute the test above
    myFunctions.addMessage(req, res);
  });
});
My question is how the heck do I fix this?
I previously had a test that was just passing the first snap and fakeDocRef, and my test was passing fine, but as soon as I resolve the promise with the new fake document reference, it fails...
Any help would be appreciated! Thanks!
There are three different types of calls, and they behave differently:
Operating on collections.
Operating on documents.
Operating on the results of a query.
They have to be used consistently.
Please refer to the documentation to see the difference between operating on a collection and operating on a document.
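To make the distinction concrete, here is a small sketch of the three shapes with the Admin SDK (assuming admin.initializeApp() has already been called). The important part for the test is that a stubbed add() should resolve with something shaped like a DocumentReference, because that is what the real call resolves with:
const db = admin.firestore();

const messagesRef = db.collection('messages');    // CollectionReference
const oneMessageRef = messagesRef.doc('1234');    // DocumentReference

// Operating on a collection: add() resolves with a DocumentReference, not a snapshot.
messagesRef.add({ original: 'fly you fools!' })
  .then(ref => ref.get())                         // DocumentSnapshot
  .then(snapshot => console.log(snapshot.data()));

// Operating on a query: get() resolves with a QuerySnapshot whose .docs are snapshots.
messagesRef.where('original', '==', 'fly you fools!').get()
  .then(querySnapshot => querySnapshot.docs.forEach(doc => console.log(doc.data())));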

Export a dynamic variable

I'm trying to export a variable in node.js like this:
let news = [];
const fetchNews = new Promise((resolve, reject) => {
  let query = 'SELECT id, name FROM news';
  mysql.query(query, [], (error, results) => {
    if (error)
      reject({error: `DB Error: ${error.code} (${error.sqlState})`})
    results = JSON.parse(JSON.stringify(results));
    news = results;
    resolve(results);
  });
});
if (!news.length)
  fetchNews
    .then(results => {news = results})
    .catch(err => {console.log('Unable to fetch news', err)});
exports.news = news;
When I use this code in some other module like this:
const news = require('./news.js').news;
console.log(news);
//returns [];
Can somebody point out my mistake in the first piece of code?
There are a couple of things that seem odd in the way you are doing this:
You have an async operation, but you want just the value without actually awaiting the operation's completion. Try something like this:
module.exports = new Promise((resolve, reject) => {
  mysql.query('SELECT id, name FROM news', (error, results) => {
    if (error)
      reject({error: `DB Error: ${error.code} (${error.sqlState})`})
    resolve(JSON.parse(JSON.stringify(results)));
  });
});
Then to get the news:
var getNewsAsync = require('./news')
getNewsAsync.then(news => console.log(news))
It would be cleaner/shorter if you actually used async/await with the mysql lib.
Update:
With Node 8 and above you should be able to promisify the MySQL lib's methods, although there might be better npm options out there for this. Here is an untested version:
const mysql = require('mysql');
const util = require('util');
const conn = mysql.createConnection({yourHOST/USER/PW/DB});
const query = util.promisify(conn.query).bind(conn);
module.exports = async () => {
  try { return await query('SELECT id, name FROM news') } finally { conn.end() }
}
To get the news:
var getNewsAsync = require('./news')
console.log(await getNewsAsync())
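To consume that from a plain CommonJS module, where top-level await isn't available, one option is to wrap the call in an async IIFE, roughly like this sketch:
const getNewsAsync = require('./news');

(async () => {
  try {
    const news = await getNewsAsync();
    console.log(news);
  } catch (err) {
    console.log('Unable to fetch news', err);
  }
})();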
