Index and Analyze XML Data in Graylog - graylog2

Is it possible to import and index XML data with Graylog?
I know that it's built on Elasticsearch (ES), which only indexes fields of JSON objects.
Is there a way to do it out of the box, or do I need to manually convert XML to JSON?
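As far as I know there is no out-of-the-box XML input, so converting the XML to JSON before shipping it to Graylog is the usual route. A minimal sketch, assuming the org.json library is on the classpath (the sample event fields are made up):

    import org.json.JSONObject;
    import org.json.XML;

    public class XmlToJson {
        public static void main(String[] args) throws Exception {
            String xml = "<event><host>web01</host><level>ERROR</level>"
                       + "<message>disk almost full</message></event>";
            // org.json ships a small XML-to-JSON converter; the resulting
            // JSON object can then be sent to Graylog, e.g. as a GELF message.
            JSONObject json = XML.toJSONObject(xml);
            System.out.println(json.toString(2));
        }
    }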

Related

SoapUI Groovy script to compare DB values against JSON response

Hi guys, I am new to API automation in SoapUI. I want to know whether it is possible to compare data received from the JDBC step against the JSON response. Currently I define an XPath for each tag, save the values in variables, and compare them. Is there a way to automate this generically, without defining XPaths for each element in the XMLs?
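One generic approach - a sketch of the idea, not a ready-made SoapUI feature - is to flatten each document into a map of leaf paths to values and then compare the maps, so no per-element XPaths are needed. A JDK-only Java version for the XML side (the sample XML is invented; the JSON response could be flattened into the same path/value shape with any JSON library and compared entry by entry):

    import java.io.ByteArrayInputStream;
    import java.nio.charset.StandardCharsets;
    import java.util.LinkedHashMap;
    import java.util.Map;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Node;
    import org.w3c.dom.NodeList;

    public class FlattenXml {
        // Recursively collect "path -> text" entries for every leaf element.
        // Note: repeated sibling names would need an index suffix in real use.
        static void flatten(Node node, String path, Map<String, String> out) {
            NodeList children = node.getChildNodes();
            boolean hasElementChild = false;
            for (int i = 0; i < children.getLength(); i++) {
                Node child = children.item(i);
                if (child.getNodeType() == Node.ELEMENT_NODE) {
                    hasElementChild = true;
                    flatten(child, path + "/" + child.getNodeName(), out);
                }
            }
            if (!hasElementChild) {
                out.put(path, node.getTextContent().trim());
            }
        }

        public static void main(String[] args) throws Exception {
            String xml = "<Results><Row><NAME>Alice</NAME><AGE>30</AGE></Row></Results>";
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
            Map<String, String> dbValues = new LinkedHashMap<>();
            flatten(doc.getDocumentElement(),
                    "/" + doc.getDocumentElement().getNodeName(), dbValues);
            // Flatten the JSON response the same way, then compare the two
            // maps key by key instead of writing one assertion per XPath.
            dbValues.forEach((k, v) -> System.out.println(k + " = " + v));
        }
    }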

Flat File Schema (Delimited) Creation Wizard in BizTalk Server 2015

I want to add a nested child record with elements in a positional file (Flat File Schema Creation Wizard) using BizTalk.
For example, the generated instance should look like this: <Root><Child_Rec1><Child_Ele>ELEMENT</Child_Ele></Child_Rec1></Root>
Sorry, what you are asking for is not possible with the Flat File Disassembler.
It is not supported to have delimited content within positional content.
The way to work around this is to split the delimited content in a Map and set the target fields.

CSV to XML transformation using CloverETL

I want to transform my CSV file to XML using CloverETL.
I have gone through the basic tutorial; it only explains direct mapping from CSV to XML, where the CSV header column names are used as the XML element names.
I have one complex XSD, and I want to map the CSV to the XML generated from that XSD.
When I generate metadata from my XSD, 213 .fmt files are generated in CloverETL.
How do I map all these together?
I saw an option to map individually, one CSV metadata to one .fmt file. Do I have to create 213 mappings that way and combine them all?
I assume you have two components: UniversalDataReader and XMLWriter. The edge between them should not have the metadata from the XSD schema (you do not have to extract metadata from the XSD at all), it should have metadata extracted from the input CSV file. Otherwise, you would not be able to read the file in the first place.
Then, in XMLWriter, you can set the XSD schema or create the mapping manually. For more information, see http://doc.cloveretl.com/documentation/UserGuide/topic/com.cloveretl.gui.docs/docs/extxmlwriter.html#xsd-mapping
If the CSV and the result XML are simple enough, you do not need the XSD schema at all.
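If it helps to see the shape of the transformation outside CloverETL, here is a generic JDK-only sketch of the same idea - read CSV rows, then emit the element structure the target schema expects. The "people"/"person" element names are made up for illustration; in CloverETL this structure is what the XMLWriter mapping expresses:

    import java.io.BufferedReader;
    import java.io.StringReader;
    import java.io.StringWriter;
    import javax.xml.stream.XMLOutputFactory;
    import javax.xml.stream.XMLStreamWriter;

    public class CsvToXml {
        public static void main(String[] args) throws Exception {
            String csv = "name,age\nAlice,30\nBob,25\n";
            BufferedReader in = new BufferedReader(new StringReader(csv));
            String[] header = in.readLine().split(",");

            StringWriter out = new StringWriter();
            XMLStreamWriter xml = XMLOutputFactory.newInstance()
                    .createXMLStreamWriter(out);
            xml.writeStartDocument();
            xml.writeStartElement("people");           // root element of the target schema
            String line;
            while ((line = in.readLine()) != null) {
                String[] fields = line.split(",");
                xml.writeStartElement("person");       // one record per CSV row
                for (int i = 0; i < header.length; i++) {
                    xml.writeStartElement(header[i]);  // element named after the CSV column
                    xml.writeCharacters(fields[i]);
                    xml.writeEndElement();
                }
                xml.writeEndElement();
            }
            xml.writeEndElement();
            xml.writeEndDocument();
            xml.close();
            System.out.println(out);
        }
    }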

Why we use JSON and XML in Web API

Why do we use JSON and XML for a Web API rather than other things like jQuery and arrays?
It is an interview question and I was unable to answer it.
JSON and XML are data-interchange formats.
jQuery is a library built on top of JavaScript, so it has nothing to do with data interchange.
An array, by definition, is a structure for storing information; in other words, you can use arrays with whatever data format you choose.
In conclusion, a Web API is a service that provides or gathers data. In general you can exchange data between client and server using JSON, XML, or whatever data format you want, such as HTML. jQuery can be used to call the Web API, which can return an array of data in any format.
It is not compulsory to return the result in JSON or XML, but most of the time we return values in these formats because there are media type formatters for JSON and XML. The job of a media type formatter is to read a CLR object from an HTTP message and to write a CLR object into an HTTP message. This is the main reason we return values in JSON and XML format.
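To make the media type idea concrete, here is a hedged sketch - a plain JDK HTTP server, not ASP.NET Web API's formatter pipeline - of a handler that returns the same record as JSON or XML depending on the client's Accept header. The /user endpoint and the payload are invented for illustration:

    import com.sun.net.httpserver.HttpServer;
    import java.net.InetSocketAddress;
    import java.nio.charset.StandardCharsets;

    public class ContentNegotiationDemo {
        public static void main(String[] args) throws Exception {
            HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
            server.createContext("/user", exchange -> {
                String accept = exchange.getRequestHeaders().getFirst("Accept");
                String contentType;
                String body;
                // Same data, two wire formats - the representation is chosen
                // from the client's Accept header, like a media type formatter.
                if (accept != null && accept.contains("application/xml")) {
                    contentType = "application/xml";
                    body = "<user><name>Alice</name><age>30</age></user>";
                } else {
                    contentType = "application/json";
                    body = "{\"name\":\"Alice\",\"age\":30}";
                }
                byte[] bytes = body.getBytes(StandardCharsets.UTF_8);
                exchange.getResponseHeaders().set("Content-Type", contentType);
                exchange.sendResponseHeaders(200, bytes.length);
                exchange.getResponseBody().write(bytes);
                exchange.close();
            });
            server.start();
        }
    }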

How do I store the contents of an array in Table Storage

I need to store the contents of an array in Azure Table Storage. The array will have between 0 and 100 entries. I don't want to have to create 100 different elements, so is there a way I can pack up the array, store it, and unpack it later? Any examples would be much appreciated. I just don't know where to start :-(
You need to serialize the array into binary or XML and then use the appropriate column type to store the data (binary object or XML).
XML will be the most flexible because you can still query the values while they are in storage. (You can't query binary data - not easily, anyway.) Here is an example of serializing, and here is one for inserting the value into a table.
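As a language-neutral illustration of the serialization step (the original answer is about .NET, but the idea is the same; this sketch uses only the JDK's java.beans.XMLEncoder), round-tripping an array through XML looks something like this:

    import java.beans.XMLDecoder;
    import java.beans.XMLEncoder;
    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.util.Arrays;

    public class ArrayRoundTrip {
        public static void main(String[] args) throws Exception {
            String[] values = {"alpha", "beta", "gamma"};

            // Serialize the array to XML - this string is what you
            // would store in the column.
            ByteArrayOutputStream buffer = new ByteArrayOutputStream();
            try (XMLEncoder encoder = new XMLEncoder(buffer)) {
                encoder.writeObject(values);
            }
            String xml = buffer.toString("UTF-8");
            System.out.println(xml);

            // Later, read the column back and deserialize ("unpack") it.
            try (XMLDecoder decoder = new XMLDecoder(
                    new ByteArrayInputStream(xml.getBytes("UTF-8")))) {
                String[] restored = (String[]) decoder.readObject();
                System.out.println(Arrays.toString(restored));
            }
        }
    }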
Some detail on XML support in Azure:
The xml Data Type

SQL Azure Database supports the xml data type, which stores XML data. You can store xml instances in a column or in a variable of the xml type.

Support for XML Data Modification Language

The XML data modification language (XML DML) is an extension of the XQuery language. XML DML adds the following case-sensitive keywords to XQuery, and they are supported in SQL Azure Database:

insert (XML DML)
delete (XML DML)
replace value of (XML DML)

Support for xml Data Type Methods

You can use the xml data type methods to query an XML instance stored in a variable or column of the xml type. SQL Azure Database supports the following xml data type methods:

query() Method (xml data type)
value() Method (xml data type)
exist() Method (xml data type)
modify() Method (xml data type)
nodes() Method (xml data type)
If you really are starting out in Azure Table Storage, then there are a few nice "simple" tutorials around - e.g. http://blogs.msdn.com/b/jnak/archive/2008/10/28/walkthrough-simple-table-storage.aspx
Once you are happy with reading/writing entities, there are several ways you can map your array to Table Storage:
1. If you ever want to access each element of your array separately from persistent storage, then you should create separate entities - one entity per array element (up to 100) - in the table store.
2. If you don't ever want to access them separately, then you can store the whole array in a single entity (row) in the table - e.g. using PartitionKey="MyArrays", RowKey="", with another column containing the array serialised to e.g. JSON (see the sketch after this list).
3. As a variation on 2, you could also store the array items - 0 to 99 - in separate columns ("Array_0", ..., "Array_99") in the row. There are ways you could map this to a nice C# array property using the Reading/Writing events on the table storage entity - but this might not be the best place to start if you're beginning with Azure.
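A minimal sketch of what option 2's entity could look like. The entity is modelled as a plain property map rather than a real SDK entity class, and the "MyArrays"/"array-001"/"Payload" names are purely illustrative:

    import java.util.LinkedHashMap;
    import java.util.Map;

    public class ArrayEntitySketch {
        // Hypothetical shape of a Table Storage row: the whole array is
        // serialised into one string property instead of 100 columns.
        static Map<String, String> toEntity(double[] values) {
            StringBuilder json = new StringBuilder("[");
            for (int i = 0; i < values.length; i++) {
                if (i > 0) json.append(',');
                json.append(values[i]);
            }
            json.append(']');
            Map<String, String> entity = new LinkedHashMap<>();
            entity.put("PartitionKey", "MyArrays");
            entity.put("RowKey", "array-001");
            entity.put("Payload", json.toString()); // unpack by parsing the JSON
            return entity;
        }

        public static void main(String[] args) {
            System.out.println(toEntity(new double[] {1.5, 2.0, 3.25}));
        }
    }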
Be careful: besides the 1 MB entity limit, there is a per-field limit as well (I think it's 64 KB).
Your best bet is to use the Lokad Fat Entity
http://code.google.com/p/lokad-cloud/wiki/FatEntities
