U-SQL JSON error - Azure

I'm trying to extract data from Data Lake Store that is saved as JSON. When I try to submit the script I get the error:
Vertex failure triggered quick job abort. Vertex failed:
SV1_Extract[0] with error: Vertex user code error.
I will add that I used:
https://github.com/Azure/usql/tree/master/Examples/DataFormats/Microsoft.Analytics.Samples.Formats
REFERENCE ASSEMBLY [jsonextr];
REFERENCE ASSEMBLY [Newtonsoft.Json];
@searchlog =
EXTRACT city string
FROM "/weather/logs/2016/09/23/08_0_a26cf4d21dd24f53b7903a9206195c58.json"
USING new Microsoft.Analytics.Samples.Formats.Json.JsonExtractor();
@res =
SELECT *
FROM @searchlog;
OUTPUT @res
TO "/datastreamanalitics/SearchLog-from-Data-Lake.json"
USING new Microsoft.Analytics.Samples.Formats.Json.JsonOutputter();
The JSON:
{"city":{"id":7532702,"name":"Brodnica","coord":{"lon":19.406401,"lat":53.2579},"country":"PL","population":0},"cod":"200","message":0.0067,"cnt":1,"list":[{"dt":1474624800,"temp":{"day":15.97,"min":10,"max":17.14,"night":10.01,"eve":17.02,"morn":10},"pressure":1025.32,"humidity":79,"weather":[{"id":802,"main":"Clouds","description":"scattered clouds","icon":"03d"}],"speed":3.22,"deg":271,"clouds":32}],"EventProcessedUtcTime":"2016-09-23T08:04:06.9372695Z","PartitionId":0,"EventEnqueuedUtcTime":"2016-09-23T08:04:05.2300000Z"}

Related

Azure Data Factory Copy Data using XML Source

Let's assume I have a simple XML file source which I've mapped to a corresponding sink in my SQL Server database.
<Date Date="2020-03-13Z">
<Identification>
<Identifier>Maverick</Identifier>
</Identification>
<Pilot HomeAirport="New York">
<AirportICAOCode>USA</AirportICAOCode>
</Pilot>
</Date>
And then the schema:
CREATE TABLE pilots (
identifier VARCHAR(20),
ICAO_code VARCHAR(3)
)
I created a stored procedure in my SQL Server database that takes an input of the user-defined table type pilots_type, which corresponds to the above schema, to merge my data correctly.
But the pipeline fails when run with the error:
{
"errorCode": "2200",
"message": "ErrorCode=UserErrorInvalidPluginType,'Type=Microsoft.DataTransfer.Common.Shared.PluginNotRegisteredException,Message=Invalid type 'XmlFormat' is provided in 'format'. Please correct the type in payload and retry.,Source=Microsoft.DataTransfer.ClientLibrary,'",
"failureType": "UserError",
"target": "Sink XML",
"details": []
}
Here the source is a blob that contains the XML.
Is XML not supported as a source after all?
XML is supported as a source.
I made the same test with your sample XML file and SQL table successfully.
I created a Table Type named ct_pilot_type:
CREATE TYPE ct_pilot_type AS TABLE(
identifier nvarchar(MAX),
ICAO_code nvarchar(MAX)
)
I created the stored procedure named spUpsertPolit:
CREATE PROCEDURE spUpsertPolit
@polit ct_pilot_type READONLY
AS
BEGIN
MERGE [dbo].[pilot_airports] AS target_sqldb
USING @polit AS source_tblstg
ON (target_sqldb.identifier = source_tblstg.identifier)
WHEN MATCHED THEN
UPDATE SET
identifier = source_tblstg.identifier,
ICAO_code = source_tblstg.ICAO_code
WHEN NOT MATCHED THEN
INSERT (
identifier,
ICAO_code
)
VALUES (
source_tblstg.identifier,
source_tblstg.ICAO_code
);
END
I set the sink in the Copy activity.
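In pipeline JSON terms, the sink configuration would look roughly like this sketch (the property names are the standard Copy-activity SQL sink settings; the parameter name is an assumption based on the procedure above):

"sink": {
    "type": "AzureSqlSink",
    "sqlWriterStoredProcedureName": "spUpsertPolit",
    "sqlWriterTableType": "ct_pilot_type",
    "storedProcedureTableTypeParameterName": "polit"
}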
I set the mapping, and the copy succeeded; the result shows the merged rows (screenshots omitted).

Error while connecting DB2/IDAA using ADFV2

I am trying to connect to DB2/IDAA using ADFv2. While executing the simple query "select * from table" I get the error below:
Operation on target Copy data from IDAA failed: An error has occurred on the Source side. 'Type = Microsoft.HostIntegration.DrdaClient.DrdaException, Message = Exception or type' Microsoft.HostIntegration.Drda.Common.DrdaException 'was thrown. SQLSTATE = HY000 SQLCODE = -343, Source = Microsoft.HostIntegration.Drda.Requester, '
I have checked a lot and tried various options, but it's still an issue.
I tried the query "select * from table with ur" (a read-only call) but still get the above result.
If I use a query like "select * from table; commit;" the activity succeeds but no records are fetched.
Does anyone have a solution?
I have my linked service set up like this; the additional connection properties value is: SET CURRENT QUERY ACCELERATION = ALL

Not able to use useBeamSchema for automatically converting PCollection to table row schema

// The following code reads a file from a GCS bucket, transforms it, and writes to BigQuery
PCollection<Quote> quotes = ... // get transformed data
quotes.apply(BigQueryIO
.<Quote>write()
.to("my-project:my_dataset.my_table")
.useBeamSchema()
.withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_TRUNCATE));
// Error:
// Exception in thread "main" java.lang.IllegalArgumentException: Unable to infer a coder and no Coder was
// specified. Please set a coder by invoking Create.withCoder() explicitly or a schema by invoking
// Create.withSchema().
I think you have to set a schema in your PCollection. Please see example below.
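A minimal sketch of one such approach: annotate the element class so Beam can infer a schema from its fields (the fields of Quote below are assumptions for illustration):

import org.apache.beam.sdk.schemas.JavaFieldSchema;
import org.apache.beam.sdk.schemas.annotations.DefaultSchema;

// Annotating the element type lets Beam infer a schema (and a coder),
// which is what useBeamSchema() relies on when writing to BigQuery.
@DefaultSchema(JavaFieldSchema.class)
public class Quote {
    public String source;  // illustrative fields
    public double price;

    public Quote() {}      // JavaFieldSchema needs a no-arg constructor
}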

Data Factory V2 Query Azure Table Storage but use a lookup Value

I have a SQL watermark table which contains the last date in my destination table.
My source data is coming from an Azure Storage Table, and the date time is a string.
I set up the date time in the watermark table to match the format in the Azure Table storage.
I created a lookup and a copy task.
If I hard-code the date into the source query, this works fine: CreatedAt ge '2019-03-06T14:03:11.000Z'
But obviously I don't want to hard-code this value; I want to use the date from the lookup.
But when I replace the hard-coded date with the lookup value
CreatedAt ge 'activity('LookupWatermarkOld').output'
I get an error
{
"errorCode": "2200",
"message":"ErrorCode=FailedStorageOperation,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=A
storage operation failed with the following error 'The remote server returned an error: (400) Bad Request.'.,Source=,
''Type=Microsoft.WindowsAzure.Storage.StorageException,Message=The remote server returned an error: (400) Bad Request.,
Source=Microsoft.WindowsAzure.Storage,StorageExtendedMessage=Syntax
error at position 42 in 'CreatedAt ge 'activity('LookupWatermarkOld').output''.\nRequestId:8c65ced9-b002-0051-79d9-d41d49000000\nTime:2019-03-07T11:35:39.0640233Z,,''Type=System.Net.WebException,Message=The remote server returned an error: (400) Bad Request.,Source=Microsoft.WindowsAzure.Storage,'",
"failureType": "UserError",
"target": "CopyMentions"
}
Can anyone help me with this? How do you use the Lookup value in an Azure Table query?
check this out:
1) Lookup activity. Query field:
SELECT MAX(WatermarkColumnName) as LastId FROM TableName;
Also, make sure that you checked "First row only" option.
2) In Copy Data activity use query. Query field:
@concat('SELECT * FROM TableName as s WHERE s.WatermarkColumnName > ''', activity('LookupActivity').output.firstRow.LastId, '''')
Finally I got some help on this, and it works with:
CreatedAt gt '@{activity('LookupWatermarkOld').output.firstRow.WaterMarkValue}'
WaterMarkValue is the column name from the SQL lookup table.
The Lookup returns an array, so you have to specify firstRow from that array,
and wrap it in '' so it's used as a string value.
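Equivalently, the same query can be built with concat, mirroring the pattern from the first answer (a sketch using the same activity and column names):
@concat('CreatedAt gt ''', activity('LookupWatermarkOld').output.firstRow.WaterMarkValue, '''')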
For recent ADFv2:
Use the watermark/lookup output value in a parameter.
Example: ParamUserCount = @{activity('LookupActivity').output.count}
Then you can use it in the query as:
Example: "select * from userDetails where usercount = {$ParamUserCount}"
Make sure you enclose the query in " " so it is set as a string, and that the parameter in the query is enclosed in { }.

Unable to filter parquet file using where clause.... error "unsafe symbol Unstable"

I am unable to filter the given parquet file.
I have a dataframe with "family_id" of String type and "lastStagedTs" of Date type, i.e. in the format 2018-11-30.
I am trying to filter the parquet file as below, i.e. select the data that is greater than 2018-11-23:
import sparkSession.implicits._ // needed for the $"column" syntax

val lastRun: String = "2018-11-23"
val parquetDf = sparkSession.read.format("parquet")
  .load(parquetFile)
  .select("family_id", "lastStagedTs")
  .where($"lastStagedTs" > lastRun)
When I run the above code I get the error below:
unsafe symbol Unstable (child of <none>) in runtime reflection universe
at scala.reflect.internal.Symbols$Symbol.<init>(Symbols.scala:205)
at scala.reflect.internal.Symbols$TypeSymbol.<init>(Symbols.scala:3030)
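As a hedged note: this "unsafe symbol Unstable" reflection error is commonly reported when the project's Scala version does not match the Scala version the Spark build targets (for example, Scala 2.12 code against a 2.11 Spark build, or a stray scala-reflect on the classpath), so aligning those versions is the usual first step. For completeness, the same filter can be written with explicit column functions, a sketch assuming the existing sparkSession and parquetFile:

import org.apache.spark.sql.functions.{col, lit}

// Equivalent filter using explicit column functions instead of the
// $"..." interpolator, so sparkSession.implicits._ is not needed.
val lastRun = "2018-11-23"
val parquetDf = sparkSession.read.format("parquet")
  .load(parquetFile)
  .select("family_id", "lastStagedTs")
  .where(col("lastStagedTs") > lit(lastRun))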
