New to Node and having trouble with the Youtube Data API - node.js

Using Node.js for the first time and working with the YouTube Data API. I can't quite get the data I want from the API when I make a request to it. This is what the data coming back from the API is supposed to look like:
/**
 * API response
 */
{
  "kind": "youtube#commentThreadListResponse",
  "etag": "\"VPWTmrH7dFmi4s1RqrK4tLejnRI/yVL3QyyDwJFkFNOcCd4KZCcTFDw\"",
  "nextPageToken": "QURTSl9pMlQySG1zcHRKb0dNZ3dWdlYtcUhyRDFDVlJXaHFmdVFiMUlaUFJfTTNjdTFpQzFNWUNuWjhBY0d2ZV8tTGR2aHFXRXRJVDZRQVpRM0YzNndWVXlQVFNwOU94UVFCWVd2empIVUlGdHlFR25keU8=",
  "pageInfo": {
    "totalResults": 20,
    "resultsPerPage": 20
  },
  "items": [
    {
      "kind": "youtube#commentThread",
      "etag": "\"VPWTmrH7dFmi4s1RqrK4tLejnRI/OqxtT8nFAjcFFrHa4DbZrY_NItM\"",
      "id": "z13bwzmokuzcxtcqn04cclqbiozixldh21o"
    },
    {
      "kind": "youtube#commentThread",
      "etag": "\"VPWTmrH7dFmi4s1RqrK4tLejnRI/1B_usKd_ZpCLxG5l5nL7QfUtG3o\"",
      "id": "z13puhijunbzytdcn22lstwptmybyzwdl"
    },
    {
      "kind": "youtube#commentThread",
      "etag": "\"VPWTmrH7dFmi4s1RqrK4tLejnRI/h8sS5KTOFa7CQWU5Je2Fp5UQ0bk\"",
      "id": "z13dfbwzjyrpiznqc04cgjlpbyn0wtaiqpw0k"
    },
    {
      "kind": "youtube#commentThread",
      "etag": "\"VPWTmrH7dFmi4s1RqrK4tLejnRI/FQEl6XU95FHiM1ijRxC5fqngmqk\"",
      "id": "z12atro51wfhzvmp104cihfytveyshbr4s40k"
    },
    { ... AND SO ON
I then use the following code in an attempt to console.log() this data from the YouTube API:
var DATABASE = youtube.commentThreads.list(
  { 'videoId': '7YcW25PHnAA', 'part': 'id, replies' },
  function (err, data) {
    if (err) {
      console.error('Error: ' + err);
    }
  });
var resp = JSON.stringify(DATABASE);
console.log(resp);
But this is my output instead:
{
  "uri": {
    "protocol": "https:",
    "slashes": true,
    "auth": null,
    "host": "www.googleapis.com",
    "port": null,
    "hostname": "www.googleapis.com",
    "hash": null,
    "search": "?videoId=7YcW25PHnAA&part=id%2C%20replies&key=AIzaSyDTTnj4HncXQCM3U-9XUvHyIf7kE9f2ZUk",
    "query": "videoId=7YcW25PHnAA&part=id%2C%20replies&key=AIzaSyDTTnj4HncXQCM3U-9XUvHyIf7kE9f2ZUk",
    "pathname": "/youtube/v3/commentThreads",
    "path": "/youtube/v3/commentThreads?videoId=7YcW25PHnAA&part=id%2C%20replies&key=AIzaSyDTTnj4HncXQCM3U-9XUvHyIf7kE9f2ZUk",
    "href": "https://www.googleapis.com/youtube/v3/commentThreads?videoId=7YcW25PHnAA&part=id%2C%20replies&key=AIzaSyDTTnj4HncXQCM3U-9XUvHyIf7kE9f2ZUk"
  },
  "method": "GET",
  "headers": {
    "User-Agent": "google-api-nodejs-client/0.10.0",
    "host": "www.googleapis.com",
    "accept": "application/json"
  }
}
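The list() call is asynchronous: its return value is the underlying HTTP request object, which is exactly what gets stringified above, while the actual API response only exists inside the callback. A minimal sketch of the fix, assuming the same youtube client object as in the question:
youtube.commentThreads.list(
  // note: no space after the comma; the encoded space (%2C%20) visible in the
  // logged query string can make the API reject the part parameter
  { videoId: '7YcW25PHnAA', part: 'id,replies' },
  function (err, data) {
    if (err) {
      console.error('Error: ' + err);
      return;
    }
    // data is the parsed commentThreadListResponse shown at the top of the question
    console.log(JSON.stringify(data, null, 2));
  }
);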

Related

Unable to update CosmosDB via LogicApps

I am unsure why I am constantly facing the following errors when trying to use Logic Apps to update a document in CosmosDB:
1. PartitionKey extracted from document doesn't match the one specified in the header
2. One of the specified inputs is invalid
For error 1, I sent the following request via Logic Apps:
{
  "method": "post",
  "headers": {
    "x-ms-documentdb-is-upsert": "True",
    "x-ms-documentdb-raw-partitionkey": "12347"
  },
  "path": "/dbs/bc-gamification-management/colls/bcpoints/docs",
  "host": {
    "connection": {
      "name": <omitted as this shouldn't matter>
    }
  },
  "body": {
    "curr_point": 500,
    "id": "12347",
    "overall_point": 1400
  }
}
I'm not too sure where I got this idea, but for error 2 I omitted the partition key from the request body:
{
  "method": "post",
  "headers": {
    "x-ms-documentdb-is-upsert": "True",
    "x-ms-documentdb-raw-partitionkey": "12347"
  },
  "path": "/dbs/bc-gamification-management/colls/bcpoints/docs",
  "host": {
    "connection": {
      "name": <omitted as this shouldn't matter>
    }
  },
  "body": {
    "curr_point": 500,
    "overall_point": 1400
  }
}
I have tried troubleshooting this using https://learn.microsoft.com/en-us/azure/cosmos-db/sql/troubleshoot-bad-request and various other methods, like using "id" and "/id" as the partition key value instead of the actual value of the partition key, but none of these methods worked and I am not too sure why.
FYI, the CosmosDB collection has items like the following sample:
{
  "id": "12347",
  "overall_point": 1200,
  "curr_point": 300,
  "_rid": <omitted as this shouldn't matter>,
  "_self": <omitted as this shouldn't matter>,
  "_etag": <omitted as this shouldn't matter>,
  "_attachments": <omitted as this shouldn't matter>,
  "_ts": <omitted as this shouldn't matter>
}
The "id" field is also the Partition Key for the Collection. Please advice :")
What you need is something like this
{
  "method": "post",
  "headers": {
    "x-ms-documentdb-is-upsert": "True",
    "x-ms-documentdb-raw-partitionkey": "\"12347\""
  },
  "path": "/dbs/testdb/colls/testcoll/docs",
  "host": {
    "connection": {
      "name": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
    }
  },
  "body": {
    "curr_point": 500,
    "id": "12347",
    "overall_point": 1400
  }
}
You need to put your partition key within quotes, as shown above: since the partition key here is a string, the header value must be the JSON-encoded string \"12347\".
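Put differently, the raw partition key header carries the JSON literal of the key value, so a string key keeps escaped quotes inside the header value:
"x-ms-documentdb-raw-partitionkey": "\"12347\""
For a numeric partition key the literal would be written without the inner quotes ("12347"); that variant is an inference from JSON encoding rules rather than something stated in this answer.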

How to parse content of BLOB-file in Azure Logic App

I'm working on an Azure Logic App which is triggered every time a JSON file is added to a blob storage. The JSON file contains a CustomerId, and based on this Id I want to send the contents of the JSON file to a different endpoint using an HTTP request.
My Azure Logic App currently looks like this:
I've been researching and trying a lot of things for the entire morning, but I can't get my head around this. I've tried things like:
json(body('Get_blob_content_using_path'))
and
decodeBase64(body('Get_blob_content_using_path'))
and just the default option, as visible in the screenshot. But I can't figure out how to do this. All I want is to branch left or right based on the CustomerId.
So, for clarity: the problem lies within the condition step of the Logic App. I can retrieve the blob file from the storage, but the issue is with parsing the CustomerId from the JSON so I can validate it within the condition. Does anybody have an idea on how I can fix this?
In the end I was able to solve the issue by adding a Compose step between the steps that get the content of the blob file and the condition. The Compose gets the content of the blob file, which I can then validate against the CustomerId I want. This topic got me in the right direction for my solution.
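A minimal sketch of that idea, with the action and expression names taken from the full code below:
"Compose": {
  "inputs": "@base64ToString(body('Get_blob_content').$content)",
  "runAfter": {
    "Get_blob_content": [ "Succeeded" ]
  },
  "type": "Compose"
}
The condition then does a plain substring check on the decoded text, as the contains expression in the full code shows.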
UPDATE:
The final logic app looks like this:
It is created with the following logic app code:
{
  "$connections": {
    "value": {
      "azureblob": {
        "connectionId": "<snip>",
        "connectionName": "azureblob",
        "id": "<snip>"
      },
      "slack": {
        "connectionId": "<snip>",
        "connectionName": "slack",
        "id": "<snip>"
      }
    }
  },
  "definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "actions": {
      "Compose": {
        "inputs": "@base64ToString(body('Get_blob_content').$content)",
        "runAfter": {
          "Get_blob_content": [ "Succeeded" ]
        },
        "type": "Compose"
      },
      "Condition_2": {
        "actions": {
          "Condition_3": {
            "actions": {
              "Delete_blob_3": {
                "inputs": {
                  "host": {
                    "connection": {
                      "name": "@parameters('$connections')['azureblob']['connectionId']"
                    }
                  },
                  "method": "delete",
                  "path": "/datasets/default/files/@{encodeURIComponent(encodeURIComponent(triggerBody()?['Id']))}"
                },
                "runAfter": {},
                "type": "ApiConnection"
              }
            },
            "else": {
              "actions": {
                "Copy_blob_2": {
                  "inputs": {
                    "host": {
                      "connection": {
                        "name": "@parameters('$connections')['azureblob']['connectionId']"
                      }
                    },
                    "method": "post",
                    "path": "/datasets/default/copyFile",
                    "queries": {
                      "destination": "/<some-blob-container>/@{triggerBody()?['Name']}",
                      "overwrite": false,
                      "queryParametersSingleEncoded": true,
                      "source": "@triggerBody()?['Path']"
                    }
                  },
                  "runAfter": {
                    "Post_message_2": [ "Succeeded" ]
                  },
                  "type": "ApiConnection"
                },
                "Delete_blob_4": {
                  "inputs": {
                    "host": {
                      "connection": {
                        "name": "@parameters('$connections')['azureblob']['connectionId']"
                      }
                    },
                    "method": "delete",
                    "path": "/datasets/default/files/@{encodeURIComponent(encodeURIComponent(triggerBody()?['Id']))}"
                  },
                  "runAfter": {
                    "Copy_blob_2": [ "Succeeded" ]
                  },
                  "type": "ApiConnection"
                },
                "Post_message_2": {
                  "inputs": {
                    "host": {
                      "connection": {
                        "name": "@parameters('$connections')['slack']['connectionId']"
                      }
                    },
                    "method": "post",
                    "path": "/chat.postMessage",
                    "queries": {
                      "channel": "<snip>",
                      "text": "<some-message>"
                    }
                  },
                  "runAfter": {},
                  "type": "ApiConnection"
                }
              }
            },
            "expression": {
              "or": [
                {
                  "equals": [ "@outputs('HTTP_2')['statusCode']", 200 ]
                },
                {
                  "equals": [ "@outputs('HTTP_2')['statusCode']", 202 ]
                }
              ]
            },
            "runAfter": {
              "HTTP_2": [ "Succeeded", "Failed" ]
            },
            "type": "If"
          },
          "HTTP_2": {
            "inputs": {
              "authentication": {
                "password": "<some-password>",
                "type": "Basic",
                "username": "<some-username>"
              },
              "body": "@outputs('Compose')",
              "headers": {
                "Content-Type": "application/json"
              },
              "method": "POST",
              "uri": "<some-url>"
            },
            "runAfter": {},
            "type": "Http"
          }
        },
        "else": {
          "actions": {
            "Condition_4": {
              "actions": {
                "Delete_blob_5": {
                  "inputs": {
                    "host": {
                      "connection": {
                        "name": "@parameters('$connections')['azureblob']['connectionId']"
                      }
                    },
                    "method": "delete",
                    "path": "/datasets/default/files/@{encodeURIComponent(encodeURIComponent(triggerBody()?['Id']))}"
                  },
                  "runAfter": {},
                  "type": "ApiConnection"
                }
              },
              "else": {
                "actions": {
                  "Copy_blob_3": {
                    "inputs": {
                      "host": {
                        "connection": {
                          "name": "@parameters('$connections')['azureblob']['connectionId']"
                        }
                      },
                      "method": "post",
                      "path": "/datasets/default/copyFile",
                      "queries": {
                        "destination": "/<some-blob-container>/@{triggerBody()?['Name']}",
                        "overwrite": false,
                        "queryParametersSingleEncoded": true,
                        "source": "@triggerBody()?['Path']"
                      }
                    },
                    "runAfter": {
                      "Post_message_3": [ "Succeeded" ]
                    },
                    "type": "ApiConnection"
                  },
                  "Delete_blob_6": {
                    "inputs": {
                      "host": {
                        "connection": {
                          "name": "@parameters('$connections')['azureblob']['connectionId']"
                        }
                      },
                      "method": "delete",
                      "path": "/datasets/default/files/@{encodeURIComponent(encodeURIComponent(triggerBody()?['Id']))}"
                    },
                    "runAfter": {
                      "Copy_blob_3": [ "Succeeded" ]
                    },
                    "type": "ApiConnection"
                  },
                  "Post_message_3": {
                    "inputs": {
                      "host": {
                        "connection": {
                          "name": "@parameters('$connections')['slack']['connectionId']"
                        }
                      },
                      "method": "post",
                      "path": "/chat.postMessage",
                      "queries": {
                        "channel": "<snip>",
                        "text": "<some-message>"
                      }
                    },
                    "runAfter": {},
                    "type": "ApiConnection"
                  }
                }
              },
              "expression": {
                "or": [
                  {
                    "equals": [ "@outputs('HTTP_3')['statusCode']", 200 ]
                  },
                  {
                    "equals": [ "@outputs('HTTP_3')['statusCode']", 202 ]
                  }
                ]
              },
              "runAfter": {
                "HTTP_3": [ "Succeeded" ]
              },
              "type": "If"
            },
            "HTTP_3": {
              "inputs": {
                "authentication": {
                  "password": "<some-password>",
                  "type": "Basic",
                  "username": "<some-username>"
                },
                "body": "@outputs('Compose')",
                "headers": {
                  "Content-Type": "application/json"
                },
                "method": "POST",
                "uri": "<some-url>"
              },
              "runAfter": {},
              "type": "Http"
            }
          }
        },
        "expression": {
          "and": [
            {
              "contains": [
                "@outputs('Compose')",
                "\"CustomerId\":\"00000000-0000-0000-0000-000000000000\""
              ]
            }
          ]
        },
        "runAfter": {
          "Compose": [ "Succeeded" ]
        },
        "type": "If"
      },
      "Get_blob_content": {
        "inputs": {
          "host": {
            "connection": {
              "name": "@parameters('$connections')['azureblob']['connectionId']"
            }
          },
          "method": "get",
          "path": "/datasets/default/files/@{encodeURIComponent(encodeURIComponent(triggerBody()?['Path']))}/content",
          "queries": {
            "inferContentType": true
          }
        },
        "runAfter": {},
        "type": "ApiConnection"
      }
    },
    "contentVersion": "1.0.0.0",
    "outputs": {},
    "parameters": {
      "$connections": {
        "defaultValue": {},
        "type": "Object"
      }
    },
    "triggers": {
      "When_a_blob_is_added_or_modified_(properties_only)": {
        "inputs": {
          "host": {
            "connection": {
              "name": "@parameters('$connections')['azureblob']['connectionId']"
            }
          },
          "method": "get",
          "path": "/datasets/default/triggers/batch/onupdatedfile",
          "queries": {
            "folderId": "<some-generated-folderid>",
            "maxFileCount": 100
          }
        },
        "metadata": {
          "<some-generated-folderid>": "/<some-blob-container>"
        },
        "recurrence": {
          "frequency": "Minute",
          "interval": 1
        },
        "splitOn": "@triggerBody()",
        "type": "ApiConnection"
      }
    }
  }
}
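A possible refinement, purely a suggestion and not part of the original solution: since the composed output is JSON text, the condition could parse it and compare the field directly instead of matching a substring:
@equals(json(outputs('Compose'))?['CustomerId'], '00000000-0000-0000-0000-000000000000')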

GeoJSON parse node

I am getting a 500 error 'unexpected problem has occurred' when I parse the GeoJSON data from this weather API site using a node app.
The code is a simple proxy server: it receives a request from the client for weather info on a particular site and makes an async request to the weather API; when the response is received, it is sent to the client. When I replace the URL with something that returns plain JSON it works. The issue is when the response data is GeoJSON.
I would appreciate it if anyone could help shed some light on how to parse the GeoJSON response in node JavaScript.
Thank you in advance.
Here is my node app code:
var http = require('http');
var request = require('request');

function initialize() {
  // Set the URL and headers for the request
  var options = {
    url: 'https://api.weather.xxx/points/39.7456,-97.0892',
    headers: {
      'User-Agent': 'request'
    }
  };
  // Return a new promise
  return new Promise(function (resolve, reject) {
    // Do the async job
    request.get(options, function (err, resp, body) {
      if (err) {
        reject(err);
      } else {
        resolve(JSON.parse(body));
      }
    });
  });
}

http.createServer(function (req, res) {
  var initializePromise = initialize();
  initializePromise.then(function (result) {
    var geoDetails = result;
    console.log("Initialized Geo details");
    // Use the geo details from here
    console.log(geoDetails);
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    res.write('request successfully proxied!' + '\n' +
      JSON.stringify(geoDetails, null, 2));
    res.end();
  }, function (err) {
    console.log(err);
  });
}).listen(9000);
Here is the GeoJSON data:
{
  "@context": [
    "...",
    {
      "wx": "...",
      "s": "...",
      "geo": "...",
      "unit": "...",
      "@vocab": "...",
      "geometry": {
        "@id": "s:GeoCoordinates",
        "@type": "geo:wktLiteral"
      },
      "city": "s:addressLocality",
      "state": "s:addressRegion",
      "distance": {
        "@id": "s:Distance",
        "@type": "s:QuantitativeValue"
      },
      "bearing": {
        "@type": "s:QuantitativeValue"
      },
      "value": {
        "@id": "s:value"
      },
      "unitCode": {
        "@id": "s:unitCode",
        "@type": "@id"
      },
      "forecastOffice": {
        "@type": "@id"
      },
      "forecastGridData": {
        "@type": "@id"
      },
      "publicZone": {
        "@type": "@id"
      },
      "county": {
        "@type": "@id"
      }
    }
  ],
  "id": "...api.weather.xxx/points/39.7456,-97.0892",
  "type": "Feature",
  "geometry": {
    "type": "Point",
    "coordinates": [
      -97.0892,
      39.7456
    ]
  },
  "properties": {
    "@id": "...api.weather.xxx/points/39.7456,-97.0892",
    "@type": "wx:Point",
    "cwa": "TOP",
    "forecastOffice": "...api.weather.xxx/offices/TOP",
    "gridX": 31,
    "gridY": 80,
    ...
    "relativeLocation": {
      "type": "Feature",
      "geometry": {
        "type": "Point",
        "coordinates": [
          -97.086661,
          39.679376
        ]
      },
      "properties": {
        "city": "Linn",
        "state": "KS",
        "distance": {
          "value": 7366.9851976444,
          "unitCode": "unit:m"
        },
        "bearing": {
          "value": 358,
          "unitCode": "unit:degrees_true"
        }
      }
    },
    ...
  }
}
I am interested in getting all the Properties in plain text or JSON.
Modify your headers to accept JSON.
var options = {
  url: 'https://api.weather.gov/points/39.7456,-97.0892',
  headers: {
    'user-agent': 'request',
    'accept': 'application/json'
  }
};
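GeoJSON is itself valid JSON, so once JSON.parse succeeds the domain data sits under the properties member of the feature. A minimal sketch using the variable names from the question (the extraction itself is an illustration, not part of the answer):
// geoDetails is the parsed response from initialize()
var props = geoDetails.properties;
console.log(JSON.stringify(props, null, 2)); // all properties as pretty-printed JSON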

Azure Logic for getting data from SQL to FTP

I have the task of taking data from SQL and uploading it as a CSV file to an FTP server.
Now I've done this for a single SQL row just fine. The problem I'm having is looping over all rows (foreach loop) and inserting these rows as the content of the CSV file. I've tried an FTP Create File task inside a foreach loop, but I can only access a single row at a time to set as the file's content - I need all the rows!
Also keep in mind that these files will have 200k+ rows.
I could, of course, just write a C# console app for this, but the ease with which I got this far without writing any code makes it seem like a worthwhile endeavor.
We recently added a "Table" primitive for this scenario. Support in the designer is still work in progress, but you can use it in code view.
In the scenario below, I'm getting rows from a table in SQL Azure, producing a CSV with two columns using data from the SQL query (First Name, Last Name), then sending it via e-mail.
"Get_rows": {
"inputs": {
"host": {
"api": {
"runtimeUrl": "https://logic-apis-southcentralus.azure-apim.net/apim/sql"
},
"connection": {
"name": "#parameters('$connections')['sql']['connectionId']"
}
},
"method": "get",
"path": "/datasets/default/tables/#{encodeURIComponent(encodeURIComponent('[SalesLT].[Customer]'))}/items",
"queries": {
"$top": 10
}
},
"runAfter": {},
"type": "ApiConnection"
},
"tableCsv0": {
"inputs": {
"columns": [
{
"header": "First Name",
"value": "#item()?['FirstName']"
},
{
"header": "Last Name",
"value": "#item()?['LastName']"
}
],
"format": "csv",
"from": "#body('Get_rows')?['value']"
},
"runAfter": {
"Get_rows": [
"Succeeded"
]
},
"type": "Table"
},
"Send_an_email": {
"inputs": {
"body": {
"Body": "#body('tableCsv0')",
"Subject": "Subject",
"To": "deli#microsoft.com"
},
"host": {
"api": {
"runtimeUrl": "https://logic-apis-southcentralus.azure-apim.net/apim/office365"
},
"connection": {
"name": "#parameters('$connections')['office365']['connectionId']"
}
},
"method": "post",
"path": "/Mail"
},
"runAfter": {
"tableCsv0": [
"Succeeded"
]
},
"type": "ApiConnection"
}
So, just following up to show how Derek's answer helped me get a large number of rows up to a file on an FTP server. I ended up using the output body of the Execute Stored Procedure action, as the Get rows action was limited to 512 rows.
NOTE: As the Table action is not available in the designer yet, do everything in the code view; opening the designer caused issues and at one point deleted all my code.
"actions": {
"Create_file": {
"inputs": {
"body": "#body('tableCsv0')",
"host": {
"api": {
"runtimeUrl": "https://logic-apis-northeurope.azure-apim.net/apim/ftp"
},
"connection": {
"name": "#parameters('$connections')['ftp']['connectionId']"
}
},
"method": "post",
"path": "/datasets/default/files",
"queries": {
"folderPath": "transactions/ready/ecommerce/tickets_test/",
"name": "grma_tickets_#{formatDateTime(utcNow(),'yyyyMMdd_hhmmss')}.csv"
}
},
"runAfter": {
"tableCsv0": [
"Succeeded"
]
},
"type": "ApiConnection"
},
"Execute_stored_procedure": {
"inputs": {
"host": {
"api": {
"runtimeUrl": "https://logic-apis-northeurope.azure-apim.net/apim/sql"
},
"connection": {
"name": "#parameters('$connections')['sql']['connectionId']"
}
},
"method": "post",
"path": "/datasets/default/procedures/#{encodeURIComponent(encodeURIComponent('[Scheduledjob].[GetBArcodesForGRMA]'))}"
},
"runAfter": {},
"type": "ApiConnection"
},
"tableCsv0": {
"inputs": {
"columns": [
{
"header": "EventDateTime",
"value": "#item()?['EventDateTime']"
},
{
"header": "EventName",
"value": "#item()?['EventName']"
}
],
"format": "csv",
"from": "#body('Execute_stored_procedure')['ResultSets']['Table1']"
},
"runAfter": {
"Execute_stored_procedure": [
"Succeeded"
]
},
"type": "Table"
}
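For reference, the from path above points at the first result set the stored procedure returns, which the SQL connector surfaces under ResultSets. Extra CSV columns are just extra entries in the columns array; a sketch with a hypothetical BarcodeNumber field, not from the original workflow:
{
  "header": "Barcode",
  "value": "@item()?['BarcodeNumber']"
}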

CouchDB document update handler truncating URL parameters, despite example from documentation

According to the documentation here: https://wiki.apache.org/couchdb/Document_Update_Handlers#Request
I can specify an update handler that accepts query string parameters, e.g. (straight from the docs):
handlerName: function(doc, req) {
  var field = req.query.field;
  var value = req.query.value;
  var message = 'set ' + field + ' to ' + value;
  doc[field] = value;
  return [doc, message];
}
However, when I look at the req object, there is no query.value field. My cURL command looks like this:
curl -X PUT http://127.0.0.1:5984/map_reduce2/_design/mp2/_update/test/1?field=THEFIELD&value=THEVALUE
And the resultant document looks like this:
{
  "_id": "1",
  "_rev": "24-06f05b375e4da9ec2fc88e28711fff7d",
  "x": {
    "info": {
      "db_name": "map_reduce2",
      "doc_count": 30,
      "doc_del_count": 4,
      "update_seq": 168,
      "purge_seq": 0,
      "compact_running": false,
      "disk_size": 970856,
      "data_size": 187724,
      "instance_start_time": "1461749557106546",
      "disk_format_version": 6,
      "committed_update_seq": 168
    },
    "id": "1",
    "uuid": "cb9251557f16b50c26ef2abd8200a727",
    "method": "PUT",
    "requested_path": [
      "map_reduce2",
      "_design",
      "mp2",
      "_update",
      "test",
      "1?field=THEFIELD"
    ],
    "path": [
      "map_reduce2",
      "_design",
      "mp2",
      "_update",
      "test",
      "1"
    ],
    "raw_path": "/map_reduce2/_design/mp2/_update/test/1?field=THEFIELD",
    "query": {
      "field": "THEFIELD"
    },
    "headers": {
      "Accept": "*/*",
      "Host": "127.0.0.1:5984",
      "User-Agent": "curl/7.43.0"
    },
    "body": "undefined",
    "peer": "127.0.0.1",
    "form": {},
    "cookie": {},
    "userCtx": {
      "db": "map_reduce2",
      "name": null,
      "roles": []
    },
    "secObj": {}
  }
}
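A likely explanation, since no answer is recorded here: the URL in the cURL command is unquoted, so the shell treats the & as a command separator and value=THEVALUE never reaches CouchDB. That matches the req object above, whose query contains only field and whose requested_path ends at "1?field=THEFIELD". Quoting the URL should pass both parameters through:
curl -X PUT 'http://127.0.0.1:5984/map_reduce2/_design/mp2/_update/test/1?field=THEFIELD&value=THEVALUE'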
