I have a local Countly installation and have made it multi-tenant: any developer in my organization can create an account on it, create applications under that account, and send data to them.
We would like to add some more data to our dashboard, such as "Gender", "Age", etc., which I assume will take the form of another metric like "device details" and "carriers". How do we add a custom metric?
I am new to Node.js and Express, but I am getting the hang of it, so elementary instructions will do. I am reading the code at the moment, so if I work out how to do it I'll post the answer here as well.
There is a generic piece of code that takes the params from the incoming HTTP request and writes them to the DB; like all the other params, the new collection is also created by this process automatically.
In the file api/parts/data/usage.js:
var predefinedMetrics = [
    { db: "devices", metrics: [{ name: "_device", set: "devices", short_code: common.dbUserMap['device'] }] },
    { db: "carriers", metrics: [{ name: "_carrier", set: "carriers", short_code: common.dbUserMap['carrier'] }] },
    { db: "device_details", metrics: [
        { name: "_os", set: "os", short_code: common.dbUserMap['platform'] },
        { name: "_os_version", set: "os_versions", short_code: common.dbUserMap['platform_version'] },
        { name: "_resolution", set: "resolutions" }
    ] },
    { db: "app_versions", metrics: [{ name: "_app_version", set: "app_versions", short_code: common.dbUserMap['app_version'] }] },
    { db: "gender", metrics: [{ name: "_gender", set: "gender", short_code: common.dbUserMap['gender'] }] }
];
I just added gender to this list of predefined metrics, and now when I send gender along with my HTTP request it gets saved without any trouble.
I am now working on rendering it on the dashboard.
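For reference, sending the new metric is just a matter of including it in the metrics JSON of Countly's /i request. A minimal Node sketch of building such a request URL; the host, app key, and device id are placeholders:

```javascript
// Sketch: building a Countly /i request URL that carries a custom
// "_gender" metric alongside a standard one. Host/app key/device id
// are hypothetical placeholders.
function buildMetricsUrl(host, appKey, deviceId, metrics) {
  const qs = new URLSearchParams({
    app_key: appKey,
    device_id: deviceId,
    metrics: JSON.stringify(metrics), // Countly expects metrics as a JSON string
  });
  return `${host}/i?${qs.toString()}`;
}

const url = buildMetricsUrl('https://countly.example.com', 'APP_KEY', 'dev-1', {
  _os: 'Android',
  _gender: 'M', // picked up by the new predefinedMetrics entry
});
```

Any HTTP client (or the Countly SDKs, which do this internally) can then issue a GET to that URL.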
I have built a schema that should handle multi-language content in MongoDB.
It looks something like this:
{
  name: 'product-en',
  code: 'xyz',
  translation: [
    { lang: 'ar', name: 'product-ar' },
    { lang: 'fr', name: 'product-fr' }
  ]
}
But suppose I want to add a category for this product. The category also has a translation, and I will use referencing, so when I populate the document the result will look like this:
{
  name: 'product-en',
  code: 'xyz',
  category: {
    name: 'category-en',
    translation: [
      { lang: 'ar', name: 'category-ar' },
      { lang: 'fr', name: 'category-fr' }
    ]
  },
  translation: [
    { lang: 'ar', name: 'product-ar' },
    { lang: 'fr', name: 'product-fr' }
  ]
}
I'm using NestJS, and my idea was an interceptor that applies only the requested language: if the user wants the products in English it simply removes the translation key, and if they want Arabic it looks up the Arabic entry in the translation array and merges its values into the document (e.g. replacing name).
I tried this with Lodash's merge method and it worked, but not for the populated key category: it only looks at doc.translation and ignores doc.category.translation, and if there are deeper paths like doc.pla.plapla.translation it doesn't know it should merge those either.
So, what should I do to solve this issue?
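One way out is to make the merge recursive rather than relying on Lodash's top-level merge: walk the whole (populated, plain-object) document and apply the translation array wherever one appears. A minimal sketch, assuming every translatable object carries a translation array shaped like the schema above; with Mongoose documents you would call this on a .toObject()/.lean() result before returning it from the interceptor:

```javascript
// Depth-first "localize": recurses into every nested object/array
// (so doc.category.translation and deeper paths are handled), merges
// the matching translation entry over the object, and strips the
// translation key from the output.
function localize(doc, lang) {
  if (Array.isArray(doc)) return doc.map((d) => localize(d, lang));
  if (doc === null || typeof doc !== 'object') return doc;

  const { translation, ...rest } = doc;
  const out = {};
  for (const [key, value] of Object.entries(rest)) {
    out[key] = localize(value, lang); // recurse into category, etc.
  }
  if (Array.isArray(translation)) {
    const match = translation.find((t) => t.lang === lang);
    if (match) {
      const { lang: _ignored, ...overrides } = match;
      Object.assign(out, overrides); // e.g. overwrite name with the translated one
    }
  }
  return out;
}
```

In a NestJS interceptor this would be the function inside `map(...)`, with the language taken from a header or query param.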
I am using the Analytics Reporting API v4 with Node.js.
I need to get the number of triggered events for a group of dimensions.
For example:
dimensions: date, source, medium, campaign
metrics: pageviews, totalEvents
where eventAction = "Test Action"
When I combine the two metrics pageviews and totalEvents, the result shows the wrong numbers; when I use either one separately, it works well.
True results for metrics:
total pageviews - 32 (but shows 17)
total events - 9
Does anyone know why? Maybe it doesn't count pageviews where the user didn't perform the action ("Test Action")? And how can I do this correctly?
Response example - http://i.imgur.com/BUkqiQG.png
Request code:
reportRequests: [{
viewId: viewId,
dateRanges: [{
startDate: '2020-02-10',
endDate: '2020-02-10',
}],
dimensions: [
{
name: 'ga:date'
},
{
name: 'ga:source'
},
{
name: 'ga:medium'
},
{
name: 'ga:campaign'
}
],
metrics: [
{
expression: 'ga:pageviews'
},
{
expression: 'ga:totalEvents'
},
],
orderBys: [{
fieldName: 'ga:date',
sortOrder: "ASCENDING"
}],
dimensionFilterClauses: [{
filters: [
{
dimensionName: 'ga:eventAction',
operator: 'EXACT',
expressions: ["Test Action"]
}
]
}]
}]
This is because you are filtering the pageviews down to only those that have the event.
The source, medium, and campaign dimensions are all session-level. Therefore, when you report on them with just pageviews, you get total pageviews.
However, once you filter the results to eventAction = "Test Action", only the pageviews where that event action occurred are returned.
Instead of this, I would suggest using a segment, something like:
"segments": [{
"dynamicSegment": {
"sessionSegment": {
"segmentFilters": [{
"simpleSegment" :{
"orFiltersForSegment": [{
"segmentFilterClauses":[{
"dimensionFilter": {
"dimensionName": "ga:eventAction",
"expressions": ["Test Action"]
}
}]
}]
}
}]
}
}
}]
More info: https://developers.google.com/analytics/devguides/reporting/core/v4/rest/v4/reports/batchGet#DynamicSegment
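Note that when a request contains segments, the v4 API also requires the ga:segment dimension in the dimensions list. A sketch of how the segment above could slot into the original request (viewId and the dimension/metric lists are abridged placeholders):

```javascript
// Sketch: attach a segment to a v4 reportRequest, adding the
// ga:segment dimension that the API requires whenever `segments`
// is present.
function withSegment(reportRequest, segment) {
  return {
    ...reportRequest,
    dimensions: [...reportRequest.dimensions, { name: 'ga:segment' }],
    segments: [segment],
  };
}

const base = {
  viewId: '12345', // placeholder
  dimensions: [{ name: 'ga:date' }, { name: 'ga:source' }],
  metrics: [{ expression: 'ga:pageviews' }, { expression: 'ga:totalEvents' }],
};

const segment = {
  dynamicSegment: {
    sessionSegment: {
      segmentFilters: [{
        simpleSegment: {
          orFiltersForSegment: [{
            segmentFilterClauses: [{
              dimensionFilter: {
                dimensionName: 'ga:eventAction',
                expressions: ['Test Action'],
              },
            }],
          }],
        },
      }],
    },
  },
};

const request = withSegment(base, segment);
```

The resulting object replaces the entry in reportRequests; dimensionFilterClauses is then no longer needed for the event filter.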
In my data model I have an instance, db.items, which has a hasMany relation with db.images and db.services, so one item can have many images and many services.
I would like to update my instance with the following call:
let res = await db.items
.upsert(
{
id: req.body.id,
type: req.body.type,
name: req.body.name,
images: req.body.images,
services: req.body.services
},
{
include: [
{
model: db.images,
as: "images"
},
{
model: db.services,
as: "services"
}
]
}
);
The problem is that none of the included associations (the aliases) are updated, while the item's name and type are. What could be the problem? include works perfectly with db.create(), but with upsert, maybe there is a different way of doing things?
I have a basic user collection that consists of documents like so:
{
user: 3949fj9fn9rjhfne93,
name: "Jerry",
country: 'au',
friends: ['20fn39r4hf93', 'g9en3irhr934', '4i5henifuw92']
}
Each of those friends also has a document of the same shape, and there is a large collection of countries that can be queried by country code:
{
code: 'AU',
name: 'Australia'
}, {
code: 'NZ',
name: 'New Zealand'
}
My question is: how would I include the full document for each of those array items within the result, like so:
{
user: 3949fj9fn9rjhfne93,
name: "Jerry",
country: {
code: 'AU',
name: 'Australia'
},
friends: [{
user: 20fn39r4hf93,
name: "Bob",
friends: ['2049429fr', 'djwij393i4']
}, {
user: g9en3irhr934,
name: "Foo",
friends: []
}, {
user: 4i5henifuw92,
name: "Bar",
friends: ['2049429fr']
}]
}
I am using Mongoose in my application, and I understand that this could be done with a simple for loop, pushing the results onto the user object and then returning it in Node with res.json(user). But what if the user had hundreds or even thousands of friends? The query load would be huge, and I need to do this in multiple places within my API.
Is there a more efficient way to achieve this?
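In Mongoose this is what populate() is for, e.g. User.findById(id).populate('friends'), provided friends is declared with a ref to the User model; under the hood it issues one batched $in query per referenced field rather than one query per friend. A plain in-memory sketch of that batching idea (the data and names are illustrative, not from a real database):

```javascript
// Sketch of the batching idea behind Mongoose's populate():
// collect the referenced ids, fetch them in one $in-style lookup,
// and stitch the full documents back into the parent. A Map stands
// in for the single batched query here.
const usersById = new Map([
  ['20fn39r4hf93', { user: '20fn39r4hf93', name: 'Bob' }],
  ['g9en3irhr934', { user: 'g9en3irhr934', name: 'Foo' }],
]);

function populateFriends(userDoc, lookup) {
  // One batched lookup instead of one query per friend.
  const friends = userDoc.friends
    .map((id) => lookup.get(id))
    .filter(Boolean); // drop ids with no matching document
  return { ...userDoc, friends };
}

const jerry = {
  user: '3949fj9fn9rjhfne93',
  name: 'Jerry',
  friends: ['20fn39r4hf93', 'g9en3irhr934'],
};
const populated = populateFriends(jerry, usersById);
```

For very large friend lists, pagination (e.g. $slice on the array, or limiting the populate) keeps the response size bounded. The country field stores a code rather than an ObjectId, so it would need either its own ref/localField-foreignField setup or a separate lookup.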
The following code represents an Account model in Sails.js v0.9.4.
module.exports = {
attributes: {
email: {
type: 'email',
unique: true,
required: true
},
password:{
type: 'string',
minLength: 6,
maxLength: 15,
required:true
}
}
};
When I send two POST requests and a PUT request via Postman to localhost:8080/account, the unique property of the email fails.
Specifically, I send the following HTTP requests from Postman:
POST http://localhost:8080/account?email=foo#gmail.com&password=123456
POST http://localhost:8080/account?email=bar#gmail.com&password=123456
PUT http://localhost:8080/account?id=1&email=bar#gmail.com
GET http://localhost:8080/account
The last GET request shows me:
[
{
"email": "bar#gmail.com",
"password": "123456",
"createdAt": "2013-09-30T18:33:00.415Z",
"updatedAt": "2013-09-30T18:34:35.349Z",
"id": 1
},
{
"email": "bar#gmail.com",
"password": "123456",
"createdAt": "2013-09-30T18:33:44.402Z",
"updatedAt": "2013-09-30T18:33:44.402Z",
"id": 2
}
]
Should this happen?
(For those who don't know: Waterline by default generates an id that auto-increments on every insertion.)
This is because your schema has not been updated in your disk database (".tmp/disk.db").
You need to shut down Sails, drop your DB, and restart Sails.
The DB will be rebuilt with the correct schema.
Warning: the data will be dropped too!
If you want to keep your data, you can instead update just the schema part of ".tmp/disk.db".
What I did to keep the data and have Sails.js rebuild the schema:
copy ".tmp/disk.db"
clean ".tmp/disk.db"
shut down Sails.js
start Sails.js
-> the database is empty and the schema is updated
copy the old "counters" part back
copy the old "data" part back
For the unique field, you must have this in your schema (the "schema" part of the ".tmp/disk.db" file):
"xxx": {
"type": "string",
"unique": true
},
I hope this helps.
I ran into this same issue. To solve it, you have to avoid the 'disk' ORM adapter; for some reason it doesn't appear to support uniqueness checks.
Other adapters such as mongo and mysql do support uniqueness checks, so this shouldn't be an issue outside of development.
For development, change the default adapter in config/adapters.js from 'disk' to 'memory'. It should look like this:
module.exports.adapters = {
// If you leave the adapter config unspecified
// in a model definition, 'default' will be used.
'default': 'memory',
// In-memory adapter for DEVELOPMENT ONLY
memory: {
module: 'sails-memory'
},
...
};
I'm not certain this is the issue, but have you added schema:true to your models and adapters?
My mongo adapter config looks like this:
module.exports.adapters = {
'default': 'mongo',
mongo: {
module: 'sails-mongo',
url: process.env.DB_URL,
schema: true
}
};
And my User model looks like this (trimmed a bit):
module.exports = {
schema: true,
attributes: {
username: {
type: 'string',
required: true,
unique: true
}
//...
}
};
There is no need to delete the current database to solve this; instead, change the Waterline migrate option from safe to alter, and the underlying database will be updated to match your models.
I wouldn't recommend migrate: 'alter' in a production environment, though. ;)
Here is my /config/local.js:
module.exports = {
...
models: {
migrate: 'alter'
},
}
According to the official Sails documentation, you should set the "migrate" option to "alter" so that the schemas are created with their indexes:
There's nothing wrong with adding or removing validations from your models as your app evolves. But once you go to production, there is one very important exception: unique. During development, when your app is configured to use migrate: 'alter', you can add or remove unique validations at will. However, if you are using migrate: safe (e.g. with your production database), you will want to update constraints/indices in your database, as well as migrate your data by hand.
http://sailsjs.com/documentation/concepts/models-and-orm/validations
var InvoiceSchema = new Schema({
  email: { type: String, required: true },
  name: { type: String }
});

InvoiceSchema.index({ email: 1 }, { unique: true });

This is how you set a unique index on email with Mongoose in Node.js.