universal-analytics: visitor.event() call is failing with label values - node.js

I am using universal-analytics for Google Analytics, but the visitor.event() call fails when the label value is passed as a string.
The document below says the value should be non-negative; does that mean it only accepts integers, or are strings accepted as well?
https://github.com/peaksandpies/universal-analytics/blob/master/AcceptableParams.md
FYI:
visitor.event(category, action, label).send(); // Working and events are reported.
Example: {"category":"APISvc","action":"URLStats","label":"Called"}
visitor.event(category, action, label, labelValue).send(); // NOT Working.
Example: {"category":"APISvc","action":"URLStats","label":"Success","labelValue":"2S"}
Context: Using this in Cloud Functions (NodeJS) on Google Firebase.
Appreciating your help!

The value component of a Google Analytics event, if it is defined, must be an integer (https://support.google.com/analytics/answer/1033068?hl=en).
Example:
{"category":"APISvc","action":"URLStats","label":"Success","labelValue":2}

Related

Problem accessing a dictionary value on Dialogflow fulfillment

I am using the fulfillment section on Dialogflow in a fairly basic program I have started, to prove to myself that I can do a bigger project on Dialogflow.
I have an object set up that is a dictionary.
I can store its keys in a constant through
const KEYS=Object.keys(overflow);
I am going through the values using
if (KEYS.length > 0) {
  var dictionary = overflow[KEYS[i]]
If I stringify the dictionary using
JSON.stringify(item);
I get:
{"stack":"overflow","stack2":"overflowtoo", "stack3":3}
This leads me to believe I am actually dealing with a dictionary, hence the name of the variable.
I am trying to access a string value such as stack (unlike stack3, which is a number).
Everything I have read online tells me
dictionary.stack
Should work since
JSON.stringify(item);
Shows me:
{"stack":"overflow","stack2":"overflowtoo","stack3":3}
Whenever I try to add the variable to the response string, or append it to a string using output += `${item.tipo}`;, I get an error saying the function crashed. I can replace that line with the stringify call and it works and gives me the JSON shown above, so the issue isn't there.
Dictionary values are created like this before being accessed in another function:
dictionary[request.body.responseId] = {
  "stack": "overflow",
  "stack2": "overflowtoo",
  "stack3": 3
};
Based on the suggestion here, I saw that the properties were being accessed properly but their type was undefined. Going over things repeatedly, I found that the properties were defined as lists rather than single values.
Dot notation works once they stop being lists.
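For anyone hitting the same thing, a minimal sketch of the mix-up (the data below is illustrative, not the actual Dialogflow payload):
// When a property is stored as a list, dot notation returns the array, not the string
var dictionary = { "stack": ["overflow"], "stack3": 3 };
console.log(dictionary.stack);    // [ 'overflow' ]
console.log(dictionary.stack[0]); // 'overflow'
// Once the property is a single value, dot notation returns the string directly
dictionary = { "stack": "overflow", "stack3": 3 };
console.log(dictionary.stack);    // 'overflow'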
Thanks for guiding me towards the problem.

Google Cloud Datastore Cursor with google.cloud.ndb

I am working with Google Cloud Datastore using the latest google.cloud.ndb library.
I am trying to implement pagination using a Cursor with the following code, but it is not fetching the data correctly.
[1] To Fetch Data:
query_01 = MyModel.query()
f = query_01.fetch_page_async(limit=5)
This code works fine and fetches 5 entities from MyModel
I want to implement pagination that can be integrated with a web frontend.
[2] To Fetch Next Set of Data
from google.cloud.ndb._datastore_query import Cursor
nextpage_value = "2"
nextcursor = Cursor(cursor=nextpage_value.encode()) # Converts to bytes
query_01 = MyModel.query()
f = query_01.fetch_page_async(limit=5, start_cursor=nextcursor)
[3] To Fetch Previous Set of Data
previouspage_value = "1"
prevcursor = Cursor(cursor=previouspage_value.encode())
query_01 = MyModel.query()
f = query_01.fetch_page_async(limit=5, start_cursor=prevcursor)
The code in [2] and [3] does not fetch paginated data; it returns the same results as [1].
Please note I'm working with Python 3 and using the latest google.cloud.ndb client library to interact with Datastore.
I have referred to the following link: https://github.com/googleapis/python-ndb
I am new to Google Cloud, and appreciate all the help I can get.
Firstly, it seems to me that you are expecting the wrong kind of pagination. You are trying to use numeric page values, whereas the Datastore cursor provides cursor-based pagination.
Instead of passing in byte-encoded integer values (like 1 or 2), Datastore expects opaque tokens that look similar to this: 'CjsSNWoIb3Z5LXRlc3RyKQsSBFVzZXIYgICAgICAgAoMCxIIQ3ljbGVEYXkiCjIwMjAtMTAtMTYMGAAgAA=='
You can obtain such a cursor from the first call to the fetch_page() method, which returns a tuple:
(results, cursor, more), where results is a list of query results, cursor points just after the last result returned, and more indicates whether there are (likely) more results after that.
Secondly, you should be using fetch_page() instead of fetch_page_async(), since the latter does not return the cursors you need for pagination. Internally, fetch_page() calls fetch_page_async() to get your query results.
Thirdly and lastly, I am not entirely sure whether the "previous page" use case is doable with the Datastore-provided pagination. It may be that you need to implement that yourself by storing some of the cursors.
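To make the cursor flow concrete, here is a minimal sketch using fetch_page(); it assumes a stand-in model, assumes Cursor(urlsafe=...) and cursor.urlsafe() behave as in the legacy ndb API, and omits the web-framework glue for passing the token around:
from google.cloud import ndb

client = ndb.Client()

class MyModel(ndb.Model):  # stand-in model definition; use your real MyModel
    name = ndb.StringProperty()

with client.context():
    query_01 = MyModel.query()

    # First page: no start_cursor
    results, cursor, more = query_01.fetch_page(5)

    # Hand this opaque token to the frontend (e.g. in a "next page" link)
    next_token = cursor.urlsafe() if cursor else None

    # Next page: rebuild the cursor from the token the frontend sends back
    if more and next_token:
        start = ndb.Cursor(urlsafe=next_token)
        results, cursor, more = query_01.fetch_page(5, start_cursor=start)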
I hope that helps and good luck!

Azure Application Insights: custom attribute length restriction

I'm using Azure Application Insights as a logging tool and store log data with the following code:
private void SendTrace(LoggingEvent loggingEvent)
{
    loggingEvent.GetProperties();
    string message = "TestMessage";
    var trace = new TraceTelemetry(message)
    {
        SeverityLevel = SeverityLevel.Information
    };
    trace.Properties.Add("TestKey", "TestValue");
    var telemetryClient = new TelemetryClient();
    telemetryClient.Context.InstrumentationKey = this.InstrumentationKey;
    telemetryClient.Track(trace);
}
Everything works well. I see the logged record in App Insights as well as in App Insights Analytics (in the traces table). My custom attributes are written in a special App Insights row section, customDimensions. For example, the code above adds a new attribute with the key "TestKey" and the value "TestValue" to the customDimensions section.
But when I try to write some big text (for example a JSON document with more than 15k characters), it still succeeds without any exceptions, but the text is cut off after a certain length. As a result, the custom attribute value in the customDimensions section is cropped too and holds only the first part of the document.
As I understand it, there is a restriction on the maximum text length that can be written to an App Insights custom attribute.
Does anyone know how I can work around this?
The message field has the highest allowed limit, 32,768 characters. For items in the Properties collection, each value has a maximum length of 8,192 characters.
So you can try one of the following options:
Use the message field to the fullest by putting the big text there.
Split the data into multiple chunks and add them to the Properties collection separately.
eg:
trace.Properties.Add("key_part1", "Bigtext1_upto8192");
trace.Properties.Add("key_part2", "Bigtext2_upto8192");
Reference: https://github.com/MicrosoftDocs/azure-docs/blob/master/includes/application-insights-limits.md

Getting arguments/parameters values from api.ai

I'm now stuck on the problem of getting the user input (what the user says) in my index.js. For example, the user says: please tell me if {animals} can live between temperature {x} to {y}. I want to get the exact values (as strings) for animals, x and y so that I can check them against my own server. I am wondering how to do that, since the entities need to map to some exact key values if I annotate these three parameters with some entity category.
The methods for ApiAiApp are very limited: https://developers.google.com/actions/reference/nodejs/ApiAiApp
And from my perspective, none of the listed methods work in this case.
Please help!
Generally API.AI entities are for some set of known values, rather than listening for any value and validating in the webhook. First, I'd identify the kinds of entities you expect to validate against. For the temperatures (x and y), I'd use API.AI's system entities. Calling getArgument() for those parameters (as explained in the previous answer) should return the exact number value.
For the animals, I'd use API.AI's developer entities. You can upload them in the Entity console using JSON or CSV. You can enable API.AI's automated expansion to allow the user to speak animals which you don't support, and getArgument() in the webhook will then return the new value recognized by API.AI. You can use this to validate and respond with an appropriate message. For each animal you can also specify synonymous names, and when any of these are spoken, getArgument() will return the canonical entity value for that animal.
Extra tip, if you expect the user might speak more than one animal, make sure to check the Is List box in the parameter section of the API.AI intent.
If "animals", "x", and "y" are defined as parameters in your Intent in API.AI, then you can use the getArgument() method defined in ApiAiApp.
So the following should work:
function tempCheck(app) {
  var animals = app.getArgument('animals');
  var x = app.getArgument('x');
  var y = app.getArgument('y');
  // Check the range and then use something like app.tell() to give a response.
}
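Building on the first answer, a minimal sketch of validating the recognized animal in the webhook (the supported-animal list and the responses are assumptions for illustration):
function tempCheckValidated(app) {
  // Hypothetical list of animals your own server has data for
  var SUPPORTED_ANIMALS = ['penguin', 'camel', 'lizard'];
  var animal = app.getArgument('animals');
  var x = app.getArgument('x');
  var y = app.getArgument('y');
  if (SUPPORTED_ANIMALS.indexOf(animal) === -1) {
    app.tell('Sorry, I have no data for ' + animal + '.');
    return;
  }
  // Otherwise check the x..y range against your own server and respond
  app.tell('Checking whether ' + animal + ' can live between ' + x + ' and ' + y + '.');
}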

Array to function parameters

I'm using the node.js Redis library and I'm attempting to bulk-subscribe to many keys. I've got a dynamic array, i.e.
var keys {'key1','key2',...,'keyN'}
and I want to feed each element in as a parameter to subscribe in the Redis library, which takes one or more strings. I've tried the apply function in JS using:
redisClient.subscribe.apply(this,keys);
but it doesn't cause a subscription. Any suggestions on how I can get over this issue?
Your example data is totally invalid JS, but I'm assuming you have it correct in your code.
You need to set the proper function context:
redisClient.subscribe.apply(redisClient, keys);
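On Node versions with ES2015 support, spread syntax does the same thing and avoids having to think about the function context (assuming keys really is an array of channel names):
// Equivalent call using spread syntax; `this` stays bound to redisClient
redisClient.subscribe(...keys);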
