Split out JSON input and apply same JSON field name as column name in Alteryx workflow - alteryx

I'm using Alteryx 2019.3 and looking to build a workflow which uses JSON as input. When it reads the JSON, it puts the JSON key/value pairs into columns called JSON_Name and JSON_ValueString.
In an example I have mocked up, the field names in the JSON below look like this in the JSON_Name column:
customer.0.name
customer.0.contactDetails.0.company
customer.0.contactDetails.0.addressDetails.0.address
customer.0.contactDetails.0.addressDetails.0.addressType
customer.0.departments.0.name
What I want to do is split it out into different tables and have the last part of the JSON_Name value as the column name, so it looks something like this (caps show table names):
CUSTOMER
customerId
CONTACTDETAILS
customerId
company
ADDRESSDETAILS
customerId
address
addressType
DEPARTMENTS
customerId
name
How do I do this in Alteryx, and how can I get it to work when there can be multiple entries in the JSON lists?
Thanks for any help
JSON input (mock up for example)
{
    "id": "1234",
    "contactDetails": [{
        "company": "company1",
        "addressDetails": [{
            "address": "City1",
            "addressType": "Business"
        }]
    }],
    "departments": [{
        "name": "dept1"
    }]
}

You can do this with a Text to Columns tool and then a series of Filter tools to split it into different datasets (tables). You will probably want to use Cross Tab tools to get the format of the tables right.
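Alteryx itself is drag-and-drop, but the splitting logic can be sketched in plain JavaScript to make the intent concrete (the `rows` array and function names here are illustrative, not Alteryx APIs):

```javascript
// Each parsed row has a dotted path in JSON_Name and a value in JSON_ValueString,
// e.g. "customer.0.contactDetails.0.company" -> "company1".
const rows = [
  { JSON_Name: "customer.0.name", JSON_ValueString: "Acme" },
  { JSON_Name: "customer.0.contactDetails.0.company", JSON_ValueString: "company1" },
  { JSON_Name: "customer.0.contactDetails.0.addressDetails.0.address", JSON_ValueString: "City1" },
];

// Group rows into tables: the last path segment becomes the column name and
// the nearest non-numeric segment above it becomes the table name, mirroring
// the Text to Columns + Filter approach.
function splitIntoTables(rows) {
  const tables = {};
  for (const { JSON_Name, JSON_ValueString } of rows) {
    const parts = JSON_Name.split(".");
    const column = parts[parts.length - 1];
    // Drop numeric list indexes; what remains are the nested list names.
    const names = parts.slice(0, -1).filter((p) => isNaN(Number(p)));
    const table = names[names.length - 1].toUpperCase();
    (tables[table] ??= []).push({ column, value: JSON_ValueString });
  }
  return tables;
}

console.log(splitIntoTables(rows));
```

Because the table name is derived from the path rather than a hardcoded index, this handles multiple entries per list (customer.0, customer.1, ...) the same way.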

Related

How to filter data from database table?

How can I do the filtering inside a database table and get the data based on my conditions or filters?
Table: customers
Let's assume a situation: I have a table named customers. It has a column that stores the customer address (the address column is an object with properties country, state, city, etc.) and another column for dob (date of birth).
Now I want to take certain inputs or filters from the user, and all the filters can be optional.
For example, let's assume there are four filters: country, fromDob, toDob, and state. We may or may not have the fromDob or toDob parameters; the user might want to fetch the data between specific dates of birth, for example between 2009-12-12 and 2011-12-12. There are some more filters as well: along with the dob filters there is a filter for country, and if the country should be US, then I need to fetch the data from the customers table where dob is between 2009-12-12 and 2011-12-12 and the country inside the address object is US.
There is one more filter named state. Sometimes the user may pass all the filters and sometimes only a few of them; all the filters are optional. The user can use any of them, a few of them, or all of them. My task is to fetch the data based on the filters the user provides.
But I am not sure how to do this.
I am implementing all the code in Node.js with Knex.js.
Any help?
Thanks
You can use an if condition inside the where builder:
knex('users')
    .where((builder) => {
        if (fromDob)
            builder.where('dob', '>', fromDob);
        if (state)
            builder.where('state', state);
    });
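If it helps to see the same optional-filter idea without a database, here is a minimal plain-JavaScript sketch; the `customers` array is made up for illustration, and each filter that the user did not provide is simply skipped, just like the conditional `where()` calls above:

```javascript
// Apply only the filters that were actually provided; a missing filter
// never excludes anything.
function filterCustomers(customers, { country, state, fromDob, toDob } = {}) {
  return customers.filter((c) => {
    if (country && c.address.country !== country) return false;
    if (state && c.address.state !== state) return false;
    if (fromDob && c.dob < fromDob) return false; // ISO date strings compare lexicographically
    if (toDob && c.dob > toDob) return false;
    return true;
  });
}

const customers = [
  { dob: "2010-01-15", address: { country: "US", state: "NY" } },
  { dob: "2005-06-30", address: { country: "US", state: "CA" } },
];

console.log(filterCustomers(customers, { country: "US", fromDob: "2009-12-12", toDob: "2011-12-12" }));
// only the first customer falls inside the date range
```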
OK, for you to be able to filter the data between dates of birth, you can use this approach in MongoDB:
$filter: {
    input: <array>,
    cond: <expression>,
    as: <string>,
    limit: <number expression>
}
input - An expression that resolves to an array of documents.
cond - An expression that resolves to a boolean value, used to determine if an element should be included in the output array. The expression references each element of the input array.
as - Optional. A name for the variable that represents each individual element of the input array. If no name is specified, the variable name defaults to this.
limit - Optional. A number expression that restricts the number of matching array elements that $filter returns. You cannot specify a limit less than 1. The matching array elements are returned in the order they appear in the input array.
Example (a $match stage expresses the "dob between two dates and country is US" condition over documents):
db.customers.aggregate([{
    $match: {
        "dob": { $gt: "2009-12-12", $lt: "2011-12-12" },
        "address.country": "US"
    }
}]);
Result
| id | dob    | Address                                 |
|----|--------|-----------------------------------------|
| 1  | "date" | {country: "US", state: "", address: ""} |
I hope this helps.

NodeJS with MS-SQL

In Node.js with MS-SQL, I want to retrieve two or three tables' data in the form of an array of objects.
Hello there, my name is Shaziya; please help me out (◕´⌓`◕)
Actually, I have learned Node.js from YouTube.
I want to learn Node.js with MS-SQL. Do you or any friends have such a course for advanced understanding?
Like how to connect 4-5 tables and show the data in array-of-objects format,
and how to write nested queries or subqueries.
I mean, if I want to match the Product table with the Order table by product ID, the result should look like:
data: {
    productId: 1,
    productName: "abc",
    orders: [{
        orderId: 1,
        orderName: "xyz"
    },
    {
        orderId: 2,
        orderName: "pqr"
    }]
}
It would help if I at least got some course or solution for where I'm stuck.
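Once the joined rows come back from MS-SQL, the nesting itself is plain JavaScript. A hedged sketch (the flat row shape below is assumed, one row per order with the product columns repeated, as a Product/Order JOIN would typically return):

```javascript
// Flat JOIN result: product columns repeat for each of its orders.
const rows = [
  { productId: 1, productName: "abc", orderId: 1, orderName: "xyz" },
  { productId: 1, productName: "abc", orderId: 2, orderName: "pqr" },
];

// Group flat join rows into one object per product with a nested orders array.
function nestOrders(rows) {
  const byProduct = new Map();
  for (const r of rows) {
    if (!byProduct.has(r.productId)) {
      byProduct.set(r.productId, {
        productId: r.productId,
        productName: r.productName,
        orders: [],
      });
    }
    byProduct.get(r.productId).orders.push({ orderId: r.orderId, orderName: r.orderName });
  }
  return [...byProduct.values()];
}

console.log(JSON.stringify(nestOrders(rows), null, 2));
```

The same grouping works for 4-5 joined tables: one Map per level of nesting.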

Spark SQL expand array to multiple columns

I am storing JSON messages for each row update from an Oracle source in S3.
The JSON structure is as below:
{
    "tableName": "ORDER",
    "action": "UPDATE",
    "timeStamp": "2016-09-04 20:05:08.000000",
    "uniqueIdentifier": "31200477027942016-09-05 20:05:08.000000",
    "columnList": [{
        "columnName": "ORDER_NO",
        "newValue": "31033045",
        "oldValue": ""
    }, {
        "columnName": "ORDER_TYPE",
        "newValue": "N/B",
        "oldValue": ""
    }]
}
I am using Spark SQL to find the latest record for each key based on the max value of uniqueIdentifier.
columnList is an array with the list of columns for the table. I want to join multiple tables and fetch the records which are latest.
How can I join the columns from the JSON array of one table with columns from another table? Is there a way to explode the JSON array to multiple columns? For example, the JSON above would have ORDER_NO as one column and ORDER_TYPE as another. How can I create a data frame with multiple columns based on the columnName field?
For example, the new RDD should have the columns (tableName, action, timeStamp, uniqueIdentifier, ORDER_NO, ORDER_TYPE).
The values of the ORDER_NO and ORDER_TYPE fields should be mapped from the newValue field in the JSON.
I have found a solution for this by programmatically creating the schema using the RDD APIs:
Dataset<Row> dataFrame = spark.read().json(inputPath);
dataFrame.printSchema();
JavaRDD<Row> rdd = dataFrame.toJavaRDD();
SchemaBuilder schemaBuilder = new SchemaBuilder();
// get the schema column names in appended format
String columnNames = schemaBuilder.populateColumnSchema(rdd.first(), dataFrame.columns());
SchemaBuilder is a custom class which takes the RDD details and returns delimiter-separated column names.
Then, using a RowFactory.create call, the JSON values are mapped to the schema.
Doc reference http://spark.apache.org/docs/latest/sql-programming-guide.html#programmatically-specifying-the-schema
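Independent of Spark, the reshaping itself (columnList entries becoming top-level columns) can be sketched in plain JavaScript, using the message from the question:

```javascript
const message = {
  tableName: "ORDER",
  action: "UPDATE",
  timeStamp: "2016-09-04 20:05:08.000000",
  uniqueIdentifier: "31200477027942016-09-05 20:05:08.000000",
  columnList: [
    { columnName: "ORDER_NO", newValue: "31033045", oldValue: "" },
    { columnName: "ORDER_TYPE", newValue: "N/B", oldValue: "" },
  ],
};

// Pivot columnList into top-level columns keyed by columnName, taking each
// column's newValue, as the question describes. The schema is derived from
// the data, which is why Spark needs it built programmatically.
function flatten(msg) {
  const { columnList, ...rest } = msg;
  const cols = Object.fromEntries(columnList.map((c) => [c.columnName, c.newValue]));
  return { ...rest, ...cols };
}

console.log(flatten(message));
```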

Azure Stream Analytics query language get value by key from array of key value pairs

I am trying to extract a specific value from an array property in the Stream Analytics query language.
My data looks as follows:
"context": {
    "custom": {
        "dimensions": [{
            "MacAddress": "ma"
        },
        {
            "IpAddress": "ipaddr"
        }]
    }
}
I am trying to obtain a result that has "MacAddress", "IpAddress" as column titles and "ma", "ipaddr" as rows.
I am currently achieving this with this query:
SELECT
    GetRecordPropertyValue(GetArrayElement(MySource.context.custom.dimensions, 0), 'MacAddress') AS MacAddress,
    GetRecordPropertyValue(GetArrayElement(MySource.context.custom.dimensions, 1), 'IpAddress') AS IpAddress
FROM [ffapi-track-events] as MySource
I am trying to use CROSS APPLY, but so far no luck. Below is the CROSS APPLY query:
SELECT
    flat.ArrayValue.MacAddress as MacAddress,
    flat.ArrayValue.IpAddress as IpAddress
FROM
    [ffapi-track-events] as MySource
    CROSS APPLY GetArrayElements(MySource.context.custom.dimensions) as flat
This one produces two rows instead of one:
MacAddress, IpAddress
ma ,
, ipaddr
so I'm missing precisely the flattening when writing it like that.
I would like to avoid hardcoding the index 0, as it's not guaranteed that MacAddress won't switch places with IpAddress... So I need something like a find-element-in-array by condition, or some means to join with the dimensions array.
Is there such thing?
Thank you.
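For comparison, the order-independent lookup being asked for is easy to express outside the ASA query language. A plain-JavaScript sketch of flattening the dimensions array so each key can be read by name instead of by a hardcoded array index:

```javascript
// The array of single-key records from the question.
const dimensions = [{ MacAddress: "ma" }, { IpAddress: "ipaddr" }];

// Merge the single-key records into one record; element order no longer
// matters because each value is looked up by its key.
function flattenDimensions(dims) {
  return Object.assign({}, ...dims);
}

const flat = flattenDimensions(dimensions);
console.log(flat.MacAddress, flat.IpAddress);
// → ma ipaddr
```

Reversing the array produces the same flattened record, which is exactly the property the hardcoded GetArrayElement indexes lack.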

Logic for reverse search result

I have a use case where, for a given result value, I want to reverse-lookup all the search conditions defined that would give this value as a result.
So, I have a set of search conditions defined in a table as key-value lists. Each entry in this table is a search query. Now, I have a random value in a dataset which can be the result of any of the search entries defined in the table. I want to look up that table so that for this value I can get all the possible search queries where this value would appear as a result.
The search table consists of the fields search_conditions and search_table_id, among other fields.
Schema would be like
Search_Table
id (long)
search_table_id (long)
search_conditions (json array as text)
This is value of one such search condition
[
{
"key": "name",
"operator": "equals",
"value": "jeff"
},
{
"key": "age",
"operator": "between",
"value": [
20,
40
]
}
]
The value I have to search can be, say, a random user {"name": "mr x", "age": 12}.
This may not be exactly a technology-based question, but its solution may require technology. Any help will be appreciated. The concern is mostly about optimization, as this has to be done in real time.
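A minimal sketch of the reverse lookup in JavaScript, using the operator names from the example condition above. This is a linear scan over the stored searches, so a production version would need indexing to meet the real-time constraint:

```javascript
// Each stored search query is an array of conditions that must all hold.
const searches = [
  [
    { key: "name", operator: "equals", value: "jeff" },
    { key: "age", operator: "between", value: [20, 40] },
  ],
  [{ key: "age", operator: "between", value: [10, 15] }],
];

// Evaluate one condition against a record.
function matches(record, { key, operator, value }) {
  switch (operator) {
    case "equals":
      return record[key] === value;
    case "between":
      return record[key] >= value[0] && record[key] <= value[1];
    default:
      return false; // unknown operators never match
  }
}

// Return every stored search whose conditions all match the record.
function reverseLookup(record, searches) {
  return searches.filter((conds) => conds.every((c) => matches(record, c)));
}

console.log(reverseLookup({ name: "mr x", age: 12 }, searches));
// → only the second search matches
```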
