How to filter data from database table? - node.js

How can I filter data inside a database table and get rows based on my conditions or filters?
Table: customers
Let's assume a situation: I have a table named customers.
One column stores the customer address (the address column is of type object, with properties country, state, city, etc.), and another column stores dob (date of birth).
Now I want to take certain inputs or filters from the user; all the filters are optional.
For example, let's assume there are four filters: country, fromDob, toDob, and state. The fromDob and toDob parameters may or may not be present, depending on whether the user wants to fetch data between specific dates of birth, for example between 2009-12-12 and 2011-12-12. There are some more filters as well: for example, alongside the dob filters there is a filter for country, and if the country should be US, then I need to fetch the rows from the customers table where dob is between 2009-12-12 and 2011-12-12 and the country inside the address object is US.
There is one more filter named state. Sometimes the user may pass all the filters, and sometimes the user just wants to fetch data with a few specific filters. All the filters are optional: the user can use any of them, a few of them, or all of them. My task is to fetch the data based on the filters the user provided.
But I am not sure how to do so.
I am implementing all the code in Node.js with Knex.js.
Any help?
Thanks

You can use an if condition inside the where condition builder:
knex('users')
  .where((builder) => {
    if (fromDob) builder.where('dob', '>', fromDob);
    if (toDob) builder.where('dob', '<', toDob);
    if (state) builder.where('state', state);
  })
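The same optional-filter logic can be sanity-checked in plain JavaScript before wiring it into Knex — a minimal sketch with illustrative field names, no database required:

```javascript
// Apply only the filters the caller actually provided; all are optional.
function filterCustomers(customers, { country, state, fromDob, toDob } = {}) {
  return customers.filter((c) => {
    if (country && c.address.country !== country) return false;
    if (state && c.address.state !== state) return false;
    // ISO YYYY-MM-DD strings compare correctly as plain strings
    if (fromDob && c.dob < fromDob) return false;
    if (toDob && c.dob > toDob) return false;
    return true;
  });
}

const customers = [
  { id: 1, dob: '2010-05-01', address: { country: 'US', state: 'CA' } },
  { id: 2, dob: '2008-01-15', address: { country: 'US', state: 'NY' } },
  { id: 3, dob: '2010-07-20', address: { country: 'CA', state: 'ON' } },
];

console.log(filterCustomers(customers, {
  country: 'US', fromDob: '2009-12-12', toDob: '2011-12-12',
}).map((c) => c.id)); // [ 1 ]
```

Each `if` mirrors one optional `builder.where(...)` call: a missing filter simply adds no condition.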

OK, to filter data between dates of birth in MongoDB, you can use the $filter aggregation operator. Note that $filter works on an array field inside a document; to filter whole documents in a collection you use $match instead.
$filter: {
  input: <array>,
  cond: <expression>,
  as: <string>,
  limit: <number expression>
}
input — an expression that resolves to the array to filter.
cond — an expression that resolves to a boolean value, used to determine whether an element should be included in the output array. The expression references each element of the input array.
as — optional. A name for the variable that represents each individual element of the input array. If no name is specified, the variable name defaults to this.
limit — optional. A number expression that restricts the number of matching array elements that $filter returns. You cannot specify a limit less than 1. The matching array elements are returned in the order they appear in the input array.
Example — since here we are filtering whole customer documents by dob range and country, a $match stage does the job:
db.customers.aggregate([
  {
    $match: {
      dob: { $gt: "2009-12-12", $lt: "2011-12-12" },
      "address.country": "US"
    }
  }
]);
Result
| id | dob | Address |
|:---- |:------:| -----:|
| 1 | "date" | {country: "US", state: "", address: ""} |
I hope this helps.

Related

NodeJS with MS-SQL

In Node.js with MS-SQL, I want to retrieve data from two or three tables in the form of an array of objects.
Hello there, my name is Shaziya, please help me out (●´⌓`●)
Actually I have learned Node.js from YouTube.
I want to learn Node.js with MS-SQL. Do you or any friends have a course for advanced understanding?
Like how to connect 4 or 5 tables and show the data in array-of-objects format,
and how to make nested queries or subqueries like...
I mean, if I want to match two tables, product and order,
joining the product table with the order table on the product ID:
data : {
  productId : 1,
  productName : "abc",
  orders : [{
    orderId : 1,
    orderName : "xyz"
  },
  {
    orderId : 2,
    orderName : "pqr"
  }]
}
I hope at least to get some course or solution for where I am stuck.
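One common driver-independent approach is to run a SQL JOIN of the product and order tables, then group the flat result rows into nested objects in Node.js. A sketch with made-up row data (column names are illustrative, not from any specific schema):

```javascript
// Group flat JOIN rows (one row per product/order pair) into
// products that each carry a nested array of their orders.
function nestOrders(rows) {
  const byProduct = new Map();
  for (const row of rows) {
    if (!byProduct.has(row.productId)) {
      byProduct.set(row.productId, {
        productId: row.productId,
        productName: row.productName,
        orders: [],
      });
    }
    byProduct.get(row.productId).orders.push({
      orderId: row.orderId,
      orderName: row.orderName,
    });
  }
  return [...byProduct.values()];
}

// Rows shaped the way a JOIN would return them:
const rows = [
  { productId: 1, productName: 'abc', orderId: 1, orderName: 'xyz' },
  { productId: 1, productName: 'abc', orderId: 2, orderName: 'pqr' },
];

console.log(JSON.stringify(nestOrders(rows), null, 2));
```

The same grouping works regardless of which MS-SQL driver runs the JOIN, since it only touches the plain row objects.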

MongoDB sort by custom calculation in Node.JS mongodb driver

I'm using the Node.js MongoDB driver. I have a collection of job listings with salary and number of vacancies. I want to sort them according to one rule: if either the salary or the number of vacancies is greater, the listing gets higher priority in the sort order. I came up with this simple formula:
( salary / 100 ) + num_of_vacancies
eg:
Top priority ones
{ salary: 5000 , num_of_vacancies: 500 } // value is 550
{ salary: 50000 , num_of_vacancies: 2 } // value is 502
And lower priority for
{ salary: 5000 , num_of_vacancies: 2 } // value is 52
But my problem is: as far as I know, MongoDB's sort takes only a property to sort by and whether to sort in ascending or descending order. How do I sort by a custom expression?
The data in MongoDB looks like this: // not the full version
{
title:"job title",
description:"job description",
salary:5000,
num_of_vacancy:50
}
This is just an option; adjust it for the Node.js MongoDB driver.
With $addFields we create the field to sort on, named toSortLater just for semantic purposes.
Then add a $sort stage and sort high values first. Change -1 to 1 for the opposite behaviour.
db.collection.aggregate([
  {
    $addFields: {
      toSortLater: {
        $add: [
          { $divide: ["$salary", 100] },
          "$num_of_vacancies"
        ]
      }
    }
  },
  { $sort: { toSortLater: -1 } }
])
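The scores produced by the $addFields stage can be sanity-checked in plain JavaScript, using the same formula and the example values from the question:

```javascript
// Same formula as the $addFields stage: salary / 100 + num_of_vacancies
const score = (job) => job.salary / 100 + job.num_of_vacancies;

const jobs = [
  { title: 'A', salary: 5000, num_of_vacancies: 500 }, // 550
  { title: 'B', salary: 50000, num_of_vacancies: 2 },  // 502
  { title: 'C', salary: 5000, num_of_vacancies: 2 },   // 52
];

// Descending by score, matching { $sort: { toSortLater: -1 } }
const sorted = [...jobs].sort((a, b) => score(b) - score(a));
console.log(sorted.map((j) => j.title)); // [ 'A', 'B', 'C' ]
```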

Split out JSON input and apply same JSON field name as column name in Alteryx workflow

I'm using Alteryx 2019.3 and looking to build a workflow which uses JSON as input. When it reads the JSON, it puts the JSON key-value pairs into columns called JSON_Name and JSON_ValueString.
In an example I have mocked up, the field names in the JSON below look like this in the JSON_Name column:
customer.0.name
customer.0.contactDetails.0.company
customer.0.contactDetails.0.addressDetails.0.address
customer.0.contactDetails.0.addressDetails.0.addressType
customer.0.departments.0.name
What I want to do is split it out into different tables and use the last part of the JSON_Name value as the column name, so it looks something like this (caps show table names):
CUSTOMER
customerId
CONTACTDETAILS
customerId
company
ADDRESSDETAILS
customerId
address
addressType
DEPARTMENTS
customerId
name
How do I do this in Alteryx, and how can I get it to work when there can be multiple entries in the JSON lists?
Thanks for any help
JSON input (mock up for example)
{
  "id": "1234",
  "contactDetails": [{
    "company": "company1",
    "addressDetails": [{
      "address": "City1",
      "addressType": "Business"
    }]
  }],
  "departments": [{
    "name": "dept1"
  }]
}
You can do this with a Text To Columns tool and then a series of Filter tools to split the records into different datasets (tables). You probably want to use Cross Tab tools to get the format of each table right.
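The parsing rule itself is straightforward to sketch outside Alteryx: drop the numeric index segments of a JSON_Name path, take the last segment as the column name and the segment before it as the table name. A JavaScript sketch of that rule (the function name and ROOT fallback are illustrative assumptions):

```javascript
// Turn an Alteryx JSON_Name path such as
//   "customer.0.contactDetails.0.addressDetails.0.address"
// into a (table, column) pair: numeric array indices are dropped,
// the last remaining segment is the column, the one before it the table.
function splitJsonName(jsonName) {
  const parts = jsonName.split('.').filter((p) => !/^\d+$/.test(p));
  const column = parts.pop();
  const table = (parts.pop() || 'ROOT').toUpperCase();
  return { table, column };
}

console.log(splitJsonName('customer.0.contactDetails.0.addressDetails.0.address'));
// { table: 'ADDRESSDETAILS', column: 'address' }
console.log(splitJsonName('customer.0.departments.0.name'));
// { table: 'DEPARTMENTS', column: 'name' }
```

In the workflow, the Text To Columns split on "." plus a Filter per table name plays the role of this function.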

MongoDB Data Structure

I'm a bit of a noob with MongoDB, so would appreciate some help with figuring out the best solution/format/structure in storing some data.
Basically, the data that will be stored will be updated every second with a name, value and timestamp for a certain meter reading.
For example, one possibility is water level and temperature in a tank. The tank will have a name, and then the level and temperature will be read and stored every second. Overall, there will be hundreds of items (i.e. tanks), each with millions of timestamped values.
From what I've learnt so far (and please correct me if I'm wrong), there are a few options as how to structure the data:
A slightly RDBMS-like approach:
This would consist of two collections, Items and Values
Items : {
_id : "id",
name : "name"
}
Values : {
_id : "id",
item_id : "item_id",
name : "name", // temp or level etc
value : "value",
timestamp : "timestamp"
}
The more document db denormalized method:
This method involves one collection of items each with an array of timestamped values
Items : {
_id : "id",
name : "name",
values : [{
name : "name", // temp or level etc
value : "value",
timestamp : "timestamp"
}]
}
A collection for each item
Save all the values in a collection named after that item.
ItemName : {
_id : "id",
name : "name", // temp or level etc
value : "value",
timestamp : "timestamp"
}
The majority of read queries will be to retrieve the timestamped values for a specified time period of an item (i.e. tank) and display in a graph. And for this, the first option makes more sense to me as I don't want to retrieve the millions of values when querying for a specific item.
Is it even possible to query for values between specific timestamps for option 2?
I will also need to query for a list of items, so maybe a combination of the first and third option with a collection for all the items and then a number of collections to store the values for each of those items?
Any feedback on this is greatly appreciated.
Don't store a separate timestamp if you are not modifying the ObjectId: the ObjectId itself has a timestamp embedded in it, so you will save a lot of space by relying on that.
MongoDB Id Documentation
In case you don't require the previous data, you can use an update query in MongoDB to update the fields every second instead of storing new documents.
If you want to store the updated data each time, then instead of updating, store it in a flat structure:
{
  "_id" : ObjectId("XXXXXX"),
  "name" : "ItemName",
  "value" : "ValueOfItem",
  "created_at" : "timestamp"
}
Edit 1: Added timestamp as per the comments
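With the flat structure, a time-range read for one tank is a single range condition; in MongoDB that would be something like `db.readings.find({ name: "tank1", created_at: { $gte: from, $lt: to } })` (collection name illustrative). The same selection sketched in plain JavaScript:

```javascript
// Select readings for one item within [from, to) — the flat-structure
// equivalent of a MongoDB range query on created_at.
function readingsInRange(readings, name, from, to) {
  return readings.filter(
    (r) => r.name === name && r.created_at >= from && r.created_at < to
  );
}

const readings = [
  { name: 'tank1', value: 21.5, created_at: '2020-01-01T00:00:00Z' },
  { name: 'tank1', value: 21.7, created_at: '2020-01-01T00:00:01Z' },
  { name: 'tank2', value: 5.0,  created_at: '2020-01-01T00:00:00Z' },
];

console.log(
  readingsInRange(readings, 'tank1',
    '2020-01-01T00:00:00Z', '2020-01-01T00:00:01Z').length
); // 1
```

With an index on (name, created_at), MongoDB answers this shape of query without scanning the millions of values belonging to other items.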

Index multiple MongoDB fields, make only one unique

I've got a MongoDB database of metadata for about 300,000 photos. Each has a native unique ID that needs to stay unique to protect against duplicate insertions. It also has a time stamp.
I frequently need to run aggregate queries to see how many photos I have for each day, so I also have a date field in the format YYYY-MM-DD. This is obviously not unique.
Right now I only have an index on the id property, like so (using the Node driver):
collection.ensureIndex(
{ id:1 },
{ unique:true, dropDups: true },
function(err, indexName) { /* etc etc */ }
);
The group query for getting the photos by date takes quite a long time, as one can imagine:
collection.group(
{ date: 1 },
{},
{ count: 0 },
function ( curr, result ) {
result.count++;
},
function(err, grouped) { /* etc etc */ }
);
I've read through the indexing strategy docs, and I think I need to also index the date property. But I don't want to make it unique, of course (though I suppose it's fine to make it unique in combination with the unique id). Should I create a regular compound index, or can I chain the .ensureIndex() function and specify uniqueness only for the id field?
MongoDB does not have "mixed" type indexes which can be partially unique. On the other hand, why don't you use _id instead of your id field, if possible? It's already indexed and unique by definition, so it will prevent you from inserting duplicates.
Mongo can only use a single index in a query clause - important to consider when creating indexes. For this particular query and these requirements, I would suggest having a separate unique index on the id field, which you get for free if you use _id. Additionally, you can create a non-unique index on the date field only. If you run a query like this:
db.collection.find({"date": "01/02/2013"}).count();
Mongo will be able to use the index alone to answer the query (a covered index query), which is the best performance you can get.
Note that Mongo won't be able to use a compound index on (id, date) if you are searching by date only. Your query has to match an index prefix first, i.e. if you search by id then the (id, date) index can be used.
Another option is to pre-aggregate in the schema itself: whenever you insert a photo, increment a per-day counter. This way you don't need to run any aggregation jobs. You can also run some tests to determine whether this approach is more performant than aggregation.
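The pre-aggregation idea can be sketched in plain JavaScript; in MongoDB the equivalent would be an upsert with `$inc` on a per-day counter document (the collection and field names here are illustrative assumptions):

```javascript
// Keep a per-day photo counter up to date at insert time, so daily
// counts are read directly instead of grouping 300k documents.
// Illustrative MongoDB equivalent:
//   db.daily_counts.updateOne({ date: day }, { $inc: { count: 1 } }, { upsert: true })
function recordPhoto(counts, photo) {
  const day = photo.timestamp.slice(0, 10); // "YYYY-MM-DD"
  counts[day] = (counts[day] || 0) + 1;
  return counts;
}

const counts = {};
recordPhoto(counts, { id: 'a', timestamp: '2013-02-01T10:00:00Z' });
recordPhoto(counts, { id: 'b', timestamp: '2013-02-01T12:30:00Z' });
recordPhoto(counts, { id: 'c', timestamp: '2013-02-02T09:00:00Z' });

console.log(counts); // { '2013-02-01': 2, '2013-02-02': 1 }
```

The trade-off: one extra write per insert in exchange for constant-time daily counts at read time.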