I'm fairly new to Azure Functions.
I've created a C# WebHook / Azure Function (I guess that's the right term) to take my JSON content and convert it into a collection of simple POCO/DTO objects.
public static class GenericWebHookCSharp
{
[FunctionName("GenericWebHookCsharpOne")]
public static async Task<HttpResponseMessage /* object */> Run([HttpTrigger(WebHookType = "genericJson")]HttpRequestMessage req, TraceWriter log)
{
try
{
log.Info(string.Format("C# GenericWebHookCsharpOne about to process a request. ('{0}')", DateTime.Now.ToLongTimeString()));
//IUnityContainer container = new UnityContainer();
//container.RegisterType<IJsonToPersonRequestWrapperConverter, JsonToPersonRequestWrapperConverter>();
//IJsonToPersonRequestWrapperConverter jsonConvtr = container.Resolve<IJsonToPersonRequestWrapperConverter>();
//ICollection<Person> emps = await jsonConvtr.ConvertHttpRequestMessageToPersonCollection(req);
/* above commented code is my "real" code where I take the INPUT request-body-as-json and convert it into a ICollection of Person(s) */
/* below code, I just fake-creating some persons */
string jsonContent = await req.Content.ReadAsStringAsync();
ICollection<Person> persons = new List<Person>();
for(int i = 0; i< 10; i++)
{
persons.Add(new Person() { PersonUuid = Guid.NewGuid(), LastName = "LN" + i.ToString(), FirstName = "FN" + i.ToString(), BirthDate = DateTimeOffset.Now.AddYears(-1 * (20 + i))});
}
string serializedJson = Newtonsoft.Json.JsonConvert.SerializeObject(persons);
log.Info(string.Format("C# GenericWebHookCsharpOne finished a request. ('{0}')", DateTime.Now.ToLongTimeString()));
return req.CreateResponse(HttpStatusCode.OK , serializedJson);
}
catch (Exception ex)
{
string errorMsg = ex.Message;// ExceptionHelper.GenerateFullFlatMessage(ex);
log.Error(errorMsg);
return req.CreateResponse(HttpStatusCode.BadRequest, errorMsg);
}
}
}
In another .cs file
public class Person
{
public Guid PersonUuid { get; set; }
public string LastName { get; set; }
public string FirstName { get; set; }
public DateTimeOffset? BirthDate { get; set; }
}
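For reference, the commented-out converter at the top of the function essentially boils down to something like this; a minimal sketch assuming Newtonsoft.Json, not the exact implementation:
public static async Task<ICollection<Person>> ConvertHttpRequestMessageToPersonCollection(HttpRequestMessage req)
{
    // read the request body and deserialize it into the Person collection
    string jsonContent = await req.Content.ReadAsStringAsync();
    return Newtonsoft.Json.JsonConvert.DeserializeObject<List<Person>>(jsonContent);
}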
If I debug it in Visual Studio, it works fine.
So I added this as a step on my Logic App as seen below
So I want to add a new step that is a "for each" step. Here is what I get now (image below). All I see is "body" from the initial trigger and the "convert" webhook function (that I have above).
How do I get the "persons" collection to show up and be available for the next step in the Logic App (so I can do a for-each-person)?
EDIT/APPEND:
The end game is to push a service-bus-message for "each" of my Person(s).
As requested, here is the "person json":
[{
"PersonUuid": "7ec8cc4d-831c-4c89-8516-47424ee2658d",
"LastName": "LN0",
"FirstName": "FN0",
"BirthDate": "1997-08-17T09:46:16.9839382-04:00"
},
{
"PersonUuid": "275264bc-5a86-476d-a189-512afa1e3ce4",
"LastName": "LN1",
"FirstName": "FN1",
"BirthDate": "1996-08-17T09:46:16.9844385-04:00"
},
{
"PersonUuid": "e522b827-2d2e-465d-a30a-c4b619d2e8e4",
"LastName": "LN2",
"FirstName": "FN2",
"BirthDate": "1995-08-17T09:46:16.9844385-04:00"
},
{
"PersonUuid": "f16bce36-3491-4519-bc82-580939f61b2e",
"LastName": "LN3",
"FirstName": "FN3",
"BirthDate": "1994-08-17T09:46:16.9844385-04:00"
},
{
"PersonUuid": "42456057-39ef-45aa-bd7c-ad6a8fa74fd1",
"LastName": "LN4",
"FirstName": "FN4",
"BirthDate": "1993-08-17T09:46:16.9844385-04:00"
},
{
"PersonUuid": "14088a6e-3c44-4cb0-927d-19f5eda279c4",
"LastName": "LN5",
"FirstName": "FN5",
"BirthDate": "1992-08-17T09:46:16.9844385-04:00"
},
{
"PersonUuid": "332a5cde-3cd1-467a-9dfc-2b187d6ae32e",
"LastName": "LN6",
"FirstName": "FN6",
"BirthDate": "1991-08-17T09:46:16.9844385-04:00"
},
{
"PersonUuid": "6debe134-19e6-4b16-a91d-05ded511eff6",
"LastName": "LN7",
"FirstName": "FN7",
"BirthDate": "1990-08-17T09:46:16.9844385-04:00"
},
{
"PersonUuid": "e61ef8a1-09d3-4c5b-b948-df8e0858cd29",
"LastName": "LN8",
"FirstName": "FN8",
"BirthDate": "1989-08-17T09:46:16.9844385-04:00"
},
{
"PersonUuid": "e9b27632-d3a4-4fe8-8745-04edfa8854f7",
"LastName": "LN9",
"FirstName": "FN9",
"BirthDate": "1988-08-17T09:46:16.9844385-04:00"
}]
Ok.
I gotta get this written down before I forget. Wow, what a ride.
First the C# code on the WebHook. Note the "anonymousPersonWrapper" code.
public static class GenericWebHookCSharp
{
[FunctionName("GenericWebHookCsharpOne")]
public static async Task<HttpResponseMessage /* object */> Run([HttpTrigger(WebHookType = "genericJson")]HttpRequestMessage req, TraceWriter log)
{
try
{
log.Info(string.Format("C# GenericWebHookCsharpOne about to process a request. ('{0}')", DateTime.Now.ToLongTimeString()));
///////* below code, I just fake-creating some persons */
string jsonContent = await req.Content.ReadAsStringAsync();
ICollection<Person> persons = new List<Person>();
for (int i = 0; i < 3; i++)
{
persons.Add(new Person() { PersonUuid = Guid.NewGuid(), LastName = "LN" + i.ToString(), FirstName = "FN" + i.ToString(), BirthDate = DateTimeOffset.Now.AddYears(-1 * (20 + i)) });
}
/* the below is the "trick" to allow the for-each to work in the Logic-App. at least in my experience */
var anonymousPersonWrapper = new
{
personWrapper = persons
};
string personWrapperJsonString = JsonConvert.SerializeObject(anonymousPersonWrapper);
log.Info(string.Format("C# GenericWebHookCsharpOne finished a request. ('{0}')", DateTime.Now.ToLongTimeString()));
HttpResponseMessage returnReq = req.CreateResponse(HttpStatusCode.OK , personWrapperJsonString );
return returnReq;
}
catch (Exception ex)
{
string errorMsg = ex.Message;
log.Error(errorMsg);
return req.CreateResponse(HttpStatusCode.BadRequest, errorMsg);
}
}
}
By putting a breakpoint on "return returnReq;", I was able to see that personWrapperJsonString contained the JSON below:
{
"personWrapper": [{
"PersonUuid": "31fb318d-a9bf-4c2f-ad16-0810ddd73746",
"LastName": "LN0",
"FirstName": "FN0",
"BirthDate": "1997-08-17T15:10:08.9633612-04:00"
},
{
"PersonUuid": "73fdacc7-e1e8-48ff-b161-1bd8b5f4aec1",
"LastName": "LN1",
"FirstName": "FN1",
"BirthDate": "1996-08-17T15:10:08.9633612-04:00"
},
{
"PersonUuid": "d18b4324-2d3e-41ca-9525-fe769af89e9c",
"LastName": "LN2",
"FirstName": "FN2",
"BirthDate": "1995-08-17T15:10:08.9633612-04:00"
}]
}
Ok.
Then I added a "Parse Json" action (below image)
Then I set up the Parse JSON action. Below.
The above parse-json setup is not complete.
Click on the button "Use sample payload to generate schema" and that will pop a new window. Paste in your "personWrapper" json from earlier. As seen in the below image.
The above will of course create the json-schema that you need (that is for-each friendly). As seen below.
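I can't paste the screenshot here, but the schema generated from that personWrapper payload comes out roughly like this (an approximation of what the designer produces, not a verbatim copy):
{
    "type": "object",
    "properties": {
        "personWrapper": {
            "type": "array",
            "items": {
                "type": "object",
                "properties": {
                    "PersonUuid": { "type": "string" },
                    "LastName": { "type": "string" },
                    "FirstName": { "type": "string" },
                    "BirthDate": { "type": "string" }
                },
                "required": [ "PersonUuid", "LastName", "FirstName", "BirthDate" ]
            }
        }
    }
}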
Now we're so close.
Add a For-Each (using the "More" button when you add a new step) (as seen below)
Now we set up the for-each. Look what showed up! The "personWrapper" (below image).
For grins, I made the SessionId be the PersonUuid value (just to show that I can get hold of one of the scalar properties of the object). (image below)
And now the json as the Content of the Service Bus message. (below image)
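For anyone reading without the screenshots: in code view, the resulting for-each plus send-message actions look roughly like the snippet below. The action names ("Parse_JSON", "For_each", "Send_message") are assumptions based on my designer's defaults, and the Service Bus connection host/path are omitted, so treat it as a sketch rather than the exact definition:
"For_each": {
    "type": "Foreach",
    "foreach": "@body('Parse_JSON')?['personWrapper']",
    "runAfter": { "Parse_JSON": [ "Succeeded" ] },
    "actions": {
        "Send_message": {
            "type": "ApiConnection",
            "inputs": {
                "body": {
                    "SessionId": "@items('For_each')?['PersonUuid']",
                    "ContentData": "@{base64(items('For_each'))}"
                }
            }
        }
    }
}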
I then published the Azure-Functions and deployed the Logic-App, and sent a request to the trigger.
Back to azure portal. The PersonUuid showed up as the SessionId! (image below)
And a quick check in Service Bus Explorer to "peek" the contents of the message (image below)
Ok, a few breadcrumbs:
I got a hint from here about putting the collection inside a "wrapper".
Json.NET validate JSON array against Schema
A few errors I got along the way
"Invalid type. Expected Object but got Array."
UnsupportedMediaType "The WebHook request must contain an entity body formatted as JSON."
"this output is an array" "a foreach cannot be nested inside of another foreach"
'Json' expects its parameter to be a string or an XML. The provided value is of type 'Array'.
As Steven Van Eycken mentioned, we can parse the string to an array with the json function in the Logic App. In your case we could either parse the string to an array in the Logic App or return a JArray directly from the Azure Function. We can choose either of the following ways to do that; I also tested it on my side and it works correctly.
In the Logic App:
json(body('Your action name'))
Or
Return a JArray directly from the Azure Function:
// requires: using Newtonsoft.Json.Linq;
var jarry = JArray.Parse(Newtonsoft.Json.JsonConvert.SerializeObject(persons));
log.Info(string.Format("C# GenericWebHookCsharpOne finished a request. ('{0}')", DateTime.Now.ToLongTimeString()));
return req.CreateResponse(HttpStatusCode.OK, jarry);
Is it possible to consume Flux endpoint as Observable or Flowable in Retrofit?
What I am aiming to achieve is to emit items from the endpoint to the consumer.
Spring boot WebFlux endpoint
@RestController
@RequestMapping("user")
class UserController {
private val repo = listOf(
UserDoc().apply {
id = 1
name = "test1"
createdAt = LocalDateTime.now()
},
UserDoc().apply {
id = 2
name = "test2"
createdAt = LocalDateTime.now()
},
UserDoc().apply {
id = 3
name = "test3"
createdAt = LocalDateTime.now()
}
)
@GetMapping
fun findAll(): Flux<UserDoc> = Flux.just(*repo.toTypedArray())
}
Sample response
[
{
"id": 1,
"name": "test1",
"createdAt": "2022-10-04T15:25:34.540953"
},
{
"id": 2,
"name": "test2",
"createdAt": "2022-10-04T15:25:34.540976"
},
{
"id": 3,
"name": "test3",
"createdAt": "2022-10-04T15:25:34.54098"
}
]
Retrofit client
fun main() {
val userApi = Retrofit.Builder()
.addCallAdapterFactory(RxJava2CallAdapterFactory.createWithScheduler(Schedulers.io()))
.addConverterFactory(GsonConverterFactory.create())
.baseUrl("http://localhost:2020/")
.client(OkHttpClient())
.build()
.create(UserApi::class.java)
val compositeDisposable = CompositeDisposable()
compositeDisposable.add(
userApi.findAll()
.subscribeOn(Schedulers.io())
.subscribe {user: User ->
println(user)
}
)
Thread.currentThread().join()
}
interface UserApi {
#GET("user")
fun findAll(): Flowable<User>
}
data class User(
val id: Long,
val name: String
)
The current implementation returns this error:
com.google.gson.JsonSyntaxException: java.lang.IllegalStateException: Expected BEGIN_OBJECT but was BEGIN_ARRAY at line 1 column 2 path $
If I change the Flowable<User> to Flowable<List<User>>, it works fine.
Is it possible to subscribe to the list user by user? Or should I create a WebSocket and my own custom Observable?
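Not a full answer, but a minimal sketch of the workaround hinted at above: keep the Retrofit return type as Flowable<List<User>> (which matches the JSON array the endpoint returns) and flatten it back to per-item emissions with the standard RxJava 2 flatMapIterable operator. The names userApi and compositeDisposable are the ones from the code above.
interface UserApi {
    @GET("user")
    fun findAll(): Flowable<List<User>>   // the endpoint returns a JSON array, so deserialize it as a list
}

// flatten the single List<User> emission into one emission per user
compositeDisposable.add(
    userApi.findAll()
        .subscribeOn(Schedulers.io())
        .flatMapIterable { it }
        .subscribe { user -> println(user) }
)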
{
"data": [
{
"id": 10,
"title": "Administration",
"active": true,
"type": {
"id": 2,
"name": "Manager"
}
},
{
"id": 207,
"title": "MCO - Exact Match 1",
"active": true,
"type": {
"id": 128,
"name": "Group"
}
},
{
"id": 1201,
"title": "Regression",
"active": false,
"type": {
"id": 2,
"name": "Manager"
}
}
]
}
I am trying to create a tuple in the format below using LINQ. I'm not sure how to start with group/aggregate. Any help is appreciated. I went over a few threads and could not find something similar to this.
var tuple = new List<Tuple<int, List<Dictionary<int, bool>>>>();
2 10, true
1201, false
128 207, true
Here is a full working code:
var o = new {
data = new [] {
new {
id = 10,
title = "Administration",
active = true,
type = new {
id = 2,
name = "Manager"
}
},
new {
id = 207,
title = "MCO - Exact Match 1",
active = true,
type = new {
id = 128,
name = "Group"
}
},
new {
id = 1201,
title = "Regression",
active = false,
type = new {
id = 2,
name = "Manager"
}
}
}
};
var result = o.data.GroupBy(
item => item.type.id, // the group key
item => new Dictionary<int, bool>() {{ item.id, item.active }}, // the transformed elements in the group
(id, items) => new Tuple<int, List<Dictionary<int, bool>>>(id, items.ToList()) // transformation of grouping result to the final desired format
).ToList();
// check correctness
foreach (var entry in result) {
Console.Write(entry.Item1);
foreach (var dict in entry.Item2) {
foreach (var kvp in dict)
Console.WriteLine("\t\t" + kvp.Key + "\t" + kvp.Value);
}
}
And this is how it works:
o is the data model, represented using anonymous types. You can obviously use a strongly typed model here, if you already have it;
on o we apply the four-argument version of GroupBy, described in detail in the official docs from Microsoft. Basically:
the first lambda expression selects the group key;
the second lambda defines the elements that are part of each group;
the third lambda transforms each (group key, enumeration of group elements) into the Tuple<int, List<Dictionary<int, bool>>> format;
at the end we call ToList() to compute the result and store it as a list of tuples.
the last part prints the result (did not spend much time prettifying it, but it does its job validating the code).
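For what it's worth, with the sample data above the check loop should print something close to this (group key first, then each id/active pair in that group):
2		10	True
		1201	False
128		207	True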
I'm using lowdb https://github.com/typicode/lowdb.
I have a small database that looks like this
{
"orders": [
{
"id": "0",
"kit": "not a real order"
},
{
"id": "1",
"kit": "kit_1"
}
],
"total orders": 21,
"216862330724548608": 1
}
Is it possible to change "kit": "x" to "kit": "y"?
x and y are user input, so I can't just use replace because I don't know what the value will be equal to.
I did try to use some kind of replace but it didn't work:
let updateOrders = (items, id, newValue) => {
    const { orders } = items;
    orders.forEach((item) => {
        // update only the order with the matching id;
        // drop the if to update every order instead
        if (item.id === id) {
            item.kit = newValue;
        }
    });
    console.log(orders);
};

// "data" stands for the parsed database object shown above (the one containing the "orders" array)
updateOrders(data, '1', 'kit_2');
Hopefully this helps.
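Since this is lowdb, it may also be cleaner to let the lodash chain do the lookup and update instead of mapping over the array yourself. A rough sketch, assuming db is your lowdb instance and the order id plus the new kit value come from user input:
// locate the order by id, overwrite its kit, then persist to the JSON file
function updateKit(db, id, newKit) {
    db.get('orders')
        .find({ id: id })
        .assign({ kit: newKit })
        .write();
}

updateKit(db, '1', 'kit_2');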
I am new to Elasticsearch.
The data in Elasticsearch is in a parent-child model, and I want to search it using the Java API.
The parent type contains author details and the child type contains book details like book name, book publisher, and book category.
While searching on child details, I need to get the parent details as well, and vice versa. Sometimes the search conditions will be on the parent type as well as the child, e.g. search for books written by author1 and of type Fiction.
How can I implement this in Java? I have referred to the Elasticsearch documentation but was not able to find a solution.
Please help.
First set up your index with the parent/child mapping. In the mapping below I have also added an untokenized field for categories so you can execute filter queries on that field. (For creating the index and documents I'm using the JSON API, not the Java API, as that was not part of the question.)
POST /test
{
"mappings": {
"book": {
"_parent": {
"type": "author"
},
"properties":{
"category":{
"type":"string",
"fields":{
"raw":{
"type":"string",
"index": "not_analyzed"
}
}
}
}
}
}
}
Create some author documents:
POST /test/author/1
{
"name": "jon doe"
}
POST /test/author/2
{
"name": "jane smith"
}
Create some book documents, specifying the relationship between book and author in the request.
POST /test/book/12?parent=1
{
"name": "fictional book",
"category": "Fiction",
"publisher": "publisher1"
}
POST /test/book/16?parent=2
{
"name": "book of history",
"category": "historical",
"publisher": "publisher2"
}
POST /test/book/20?parent=2
{
"name": "second fictional book",
"category": "Fiction",
"publisher": "publisher2"
}
The Java class below executes 3 queries:
Search on all books that have the term 'book' in the title and
return the authors.
Search on all authors that have the terms 'jon doe' in the name and
return the books.
Search for books written by 'jane smith' and that are of type Fiction.
You can run the class from the command line, or import into Eclipse and right click on the class and select 'Run As > Java Application'. (You'll need to have the Elasticsearch library in the classpath.)
import java.util.concurrent.ExecutionException;
import org.elasticsearch.action.search.SearchRequestBuilder;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.client.Client;
import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.transport.InetSocketTransportAddress;
import org.elasticsearch.index.query.FilterBuilders;
import org.elasticsearch.index.query.HasChildQueryBuilder;
import org.elasticsearch.index.query.HasParentQueryBuilder;
import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.index.query.TermFilterBuilder;
public class ParentChildQueryExample {
public static void main(String args[]) throws InterruptedException, ExecutionException {
//Set the Transport client which is used to communicate with your ES cluster. It is also possible to set this up using the Client Node.
Settings settings = ImmutableSettings.settingsBuilder()
.put("cluster.name", "elasticsearch").build();
Client client = new TransportClient(settings)
.addTransportAddress(new InetSocketTransportAddress(
"localhost",
9300));
//create the searchRequestBuilder object.
SearchRequestBuilder searchRequestBuilder = new SearchRequestBuilder(client).setIndices("test");
//Query 1. Search on all books that have the term 'book' in the title and return the 'authors'.
HasChildQueryBuilder bookNameQuery = QueryBuilders.hasChildQuery("book", QueryBuilders.matchQuery("name", "book"));
System.out.println("Exectuing Query 1");
SearchResponse searchResponse1 = searchRequestBuilder.setQuery(bookNameQuery).execute().actionGet();
System.out.println("There were " + searchResponse1.getHits().getTotalHits() + " results found for Query 1.");
System.out.println(searchResponse1.toString());
System.out.println();
//Query 2. Search on all authors that have the terms 'jon doe' in the name and return the 'books'.
HasParentQueryBuilder authorNameQuery = QueryBuilders.hasParentQuery("author", QueryBuilders.matchQuery("name", "jon doe"));
System.out.println("Exectuing Query 2");
SearchResponse searchResponse2 = searchRequestBuilder.setQuery(authorNameQuery).execute().actionGet();
System.out.println("There were " + searchResponse2.getHits().getTotalHits() + " results found for Query 2.");
System.out.println(searchResponse2.toString());
System.out.println();
//Query 3. Search for books written by 'jane smith' and type Fiction.
TermFilterBuilder termFilter = FilterBuilders.termFilter("category.raw", "Fiction");
HasParentQueryBuilder authorNameQuery2 = QueryBuilders.hasParentQuery("author", QueryBuilders.matchQuery("name", "jane smith"));
SearchResponse searchResponse3 = searchRequestBuilder.setQuery(QueryBuilders.filteredQuery(authorNameQuery2, termFilter)).execute().actionGet();
System.out.println("There were " + searchResponse3.getHits().getTotalHits() + " results found for Query 3.");
System.out.println(searchResponse3.toString());
System.out.println();
}
}
You can use Parent-Child documents for this.
Let's create an index bookstore with simple mappings for author documents and book documents. You can add more fields as per your requirements. See this for more information about indexing parent/child documents.
PUT bookstore
{
"mappings": {
"author": {
"properties": {
"authorname": {
"type": "string"
}
}
},
"book": {
"_parent": {
"type": "author"
},
"properties": {
"bookname": {
"type": "string"
}
}
}
}
}
Now let's add two authors:
PUT bookstore/author/1
{
"authorname": "author1"
}
PUT bookstore/author/2
{
"authorname": "author2"
}
Now let's add two books of author author1:
PUT bookstore/book/11?parent=1
{
"bookname": "book11"
}
PUT bookstore/book/12?parent=1
{
"bookname": "book12"
}
Now let's add two books of author author2:
PUT bookstore/book/21?parent=2
{
"bookname": "book21"
}
PUT bookstore/book/22?parent=2
{
"bookname": "book22"
}
We're done indexing documents. Now let's start searching.
Search all books authored by author author1 (Read more about this here)
POST bookstore/book/_search
{
"query": {
"has_parent": {
"type": "author",
"query": {
"term": {
"authorname": "author1"
}
}
}
}
}
Search the author of book book11 (Read more about this here)
POST bookstore/author/_search
{
"query": {
"has_child": {
"type": "book",
"query": {
"term": {
"bookname": "book11"
}
}
}
}
}
Search for books named book12 and authored by author1. You need to use bool queries to achieve this. (There can be a better example for this scenario with more fields in the documents)
POST bookstore/book/_search
{
"query": {
"bool": {
"must": [
{
"has_parent": {
"type": "author",
"query": {
"term": {
"authorname": "author1"
}
}
}
},
{
"term": {
"bookname": {
"value": "book12"
}
}
}
]
}
}
}
I've done something similar with the "spring-data-elasticsearch" library. There are heaps of samples available in their test suite.
Follow this link on GitHub: https://github.com/spring-projects/spring-data-elasticsearch/blob/master/src/test/java/org/springframework/data/elasticsearch/NestedObjectTests.java
List<Car> cars = new ArrayList<Car>();
Car saturn = new Car();
saturn.setName("Saturn");
saturn.setModel("SL");
Car subaru = new Car();
subaru.setName("Subaru");
subaru.setModel("Imprezza");
Car ford = new Car();
ford.setName("Ford");
ford.setModel("Focus");
cars.add(saturn);
cars.add(subaru);
cars.add(ford);
Person foo = new Person();
foo.setName("Foo");
foo.setId("1");
foo.setCar(cars);
Car car = new Car();
car.setName("Saturn");
car.setModel("Imprezza");
Person bar = new Person();
bar.setId("2");
bar.setName("Bar");
bar.setCar(Arrays.asList(car));
List<IndexQuery> indexQueries = new ArrayList<IndexQuery>();
IndexQuery indexQuery1 = new IndexQuery();
indexQuery1.setId(foo.getId());
indexQuery1.setObject(foo);
IndexQuery indexQuery2 = new IndexQuery();
indexQuery2.setId(bar.getId());
indexQuery2.setObject(bar);
indexQueries.add(indexQuery1);
indexQueries.add(indexQuery2);
elasticsearchTemplate.putMapping(Person.class);
elasticsearchTemplate.bulkIndex(indexQueries);
elasticsearchTemplate.refresh(Person.class, true);
SearchQuery searchQuery = new NativeSearchQueryBuilder().build();
List<Person> persons = elasticsearchTemplate.queryForList(searchQuery, Person.class);
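For completeness, the Person/Car entities used in that snippet would be mapped roughly like this with spring-data-elasticsearch annotations; the index name is an assumption and getters/setters are omitted:
// imports: java.util.List, org.springframework.data.annotation.Id,
//          org.springframework.data.elasticsearch.annotations.Document / Field / FieldType
@Document(indexName = "person-index")
public class Person {
    @Id
    private String id;
    private String name;

    // nested mapping keeps each car's fields paired together in queries
    @Field(type = FieldType.Nested)
    private List<Car> car;
    // getters/setters omitted
}

// Car.java (in its own file)
public class Car {
    private String name;
    private String model;
    // getters/setters omitted
}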
How can I make ObjectGraphBuilder build my class instance from a string? I mean, if I have
String myString = """invoices{
invoice(date: new Date(106,1,2)){
item(count:5){
product(name:'ULC', dollar:1499)
}
item(count:1){
product(name:'Visual Editor', dollar:499)
}
}
invoice(date: new Date(106,1,2)){
item(count:4) {
product(name:'Visual Editor', dollar:499)
}
}
"""
how can I turn this string (myString) into an instance of the invoice class? (I assume I have to use ObjectGraphBuilder, but how?)
Given an instance of the class invoice (with all of its nested properties), how can I turn that instance into a string like myString?
I also want to be able to serialize and deserialize from a text file too, but I assume that is the same as with the string.
You can work with GroovyShell to evaluate the string and delegate the methods called in the script to an ObjectGraphBuilder. I repeated the "invoices" method. If this is unacceptable, take a look at Going to Mars with Domain-Specific Languages, by Guillaume Laforge, where he teaches how to customize the compiler.
I also created an Invoices class, because of the way ObjectGraphBuilder works. If this will be dynamic for you, take a look at its resolvers.
import groovy.transform.ToString as TS
@TS class Invoices { List<Invoice> invoices=[] }
@TS class Invoice { List<Item> items=[]; Date date }
@TS class Item { Integer count; Product product }
@TS class Product { String name; Integer dollar; Vendor vendor }
@TS class Vendor { Integer id }
String myString = """
invoices {
invoice(date: new Date(106,1,2)){
item(count:5){
product(name:'ULC', dollar:1499)
}
item(count:1){
product(name:'Visual Editor', dollar:499)
}
}
invoice(date: new Date(106,1,2)){
item(count:4) {
product(name:'Visual Editor', dollar:499)
}
}
}
"""
invoicesParser = { Closure c ->
new ObjectGraphBuilder().invoices c
}
binding = new Binding( [invoices: invoicesParser] )
invoices = new GroovyShell(binding).evaluate myString
assert invoices.invoices.size() == 2
Update: as for your second question, I'm not aware of, and couldn't find, any way back to the object-graph-builder representation. You can roll your own, but I think you will be better off trying something like JSON. Does your use case permit you to do so?
use( groovy.json.JsonOutput ) {
assert invoices.toJson().prettyPrint() == """{
"invoices": [
{
"date": "2006-02-02T02:00:00+0000",
"items": [
{
"product": {
"vendor": null,
"dollar": 1499,
"name": "ULC"
},
"count": 5
},
{
"product": {
"vendor": null,
"dollar": 499,
"name": "Visual Editor"
},
"count": 1
}
]
},
{
"date": "2006-02-02T02:00:00+0000",
"items": [
{
"product": {
"vendor": null,
"dollar": 499,
"name": "Visual Editor"
},
"count": 4
}
]
}
]
}"""
}
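And for the way back, a rough sketch of reading that JSON text again with JsonSlurper; it yields plain maps/lists rather than the original classes, so mapping back to the Invoices graph is done by hand (jsonString is assumed to hold the JSON text, e.g. read from a file):
import groovy.json.JsonSlurper

def slurped = new JsonSlurper().parseText(jsonString)   // or: .parse(new File('invoices.json'))

// rebuild the object graph manually; the invoice dates would need their own parsing
def rebuilt = new Invoices(
    invoices: slurped.invoices.collect { inv ->
        new Invoice(
            items: inv.items.collect { itm ->
                new Item(count: itm.count,
                         product: new Product(name: itm.product.name, dollar: itm.product.dollar))
            }
        )
    }
)

assert rebuilt.invoices.size() == 2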