Databricks and Academic Graph access - Azure

I am using Databricks to extract data from the Microsoft Academic Graph. When I run the "Get affiliations" query given in their documentation:

Affiliations = MAG.getDataframe('Affiliations')
Affiliations = Affiliations.select(Affiliations.AffiliationId, Affiliations.DisplayName)
Affiliations.show(3)

I receive the error:

shaded.databricks.org.apache.hadoop.fs.azure.AzureException: java.lang.IllegalArgumentException: Invalid characters in hostname

How do I resolve this issue?

Make sure you have followed this tutorial without missing any steps. By following the tutorial, I'm able to get Affiliations as follows.
Hope this helps. Do let us know if you have any further queries.

I had the same problem. I found the mistake in the key for my storage account name: there was a space at the end of the value of the key.
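
For anyone hitting this: the storage account name gets embedded into the blob endpoint hostname, so a stray space makes the hostname invalid. A minimal sketch of the check, assuming hypothetical config values along the lines of the MAG Databricks samples (spark is the ambient Databricks session):

# Hypothetical values; the trailing space below reproduces the error.
AzureStorageAccount = 'magstore '    # note the trailing space
AzureStorageAccessKey = '<access-key>'

# Strip whitespace before the account name is baked into the hostname
# '<account>.blob.core.windows.net'.
account = AzureStorageAccount.strip()
spark.conf.set(
    f'fs.azure.account.key.{account}.blob.core.windows.net',
    AzureStorageAccessKey.strip())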

Related

Azure Synapse and Jira REST API Error 21155

I am trying to call the REST API from Jira via the REST connector of Synapse.
I always get error 21155:

Error occurred when deserializing source JSON file. Check if data is in valid JSON. Unexpected character encountered while parsing value: <. Path '', line 0, position 0.

Does anybody know how to solve this problem?
I checked in Postman, and there it worked and the result is valid JSON.
I tried to reproduce your scenario in my environment: like you, I get output in Postman but face the error in ADF, as below:

[Screenshot: getting output in Postman]
[Screenshot: facing a similar error in ADF]

I believe the error message implies that the source data's non-standard JSON format prevents ADF from deserializing it; JSON that isn't valid cannot be copied using ADF. When I tried another API with correct JSON output, I was able to preview the data, as below:

The connector documentation states that ADF supports a Jira connector. Perhaps you could give that a go.
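
A leading '<' at position 0 usually means the endpoint returned HTML or XML (for example a login or error page) rather than JSON. A quick sanity check outside ADF, with a hypothetical Jira Cloud URL and API-token credentials:

import json
import requests

# Hypothetical endpoint and credentials.
url = 'https://yourcompany.atlassian.net/rest/api/2/search'
resp = requests.get(url, auth=('user@example.com', '<api-token>'),
                    headers={'Accept': 'application/json'})

print(resp.status_code, resp.headers.get('Content-Type'))
json.loads(resp.text)  # raises ValueError if the body is not valid JSON

If the Content-Type printed here is text/html, the problem is the URL or authentication, not ADF.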

Logic Apps and Cosmos DB

So I'm trying to do something I expected to be simple: using a Logic App to insert a JSON object into Cosmos DB.
I've created a Cosmos DB account (Core (SQL) API) and a container called Archive with the partition key /username (which is in my JSON object).
Now to Logic App.
First, I YOLO'ed the simple approach.
Which gave me the error: "The input content is invalid because the required properties - 'id; ' - are missing".
But my JSON object doesn't have an id field. Maybe use the IsUpsert parameter? I don't feel like manipulating my input document to add a field called 'id'.
Which gave me the error: "One of the specified inputs is invalid". Okay, that feels even worse.
I then tried to use the other Logic App connector (not V2, but the original),
which gave me the error: "The partition key supplied in x-ms-partitionkey header has fewer components than defined in the collection."
I saw that this connector (unlike the V2 one) has a Partition Key Value parameter in the UI, which I added to pass the value of my username,
which gave me the error "The input content is invalid because the required properties - 'id; ' - are missing" again.
At this point I thought, let me just give the bloody machine what it wants, so I added "id" to my JSON object,
and yes, that actually worked.
So my questions are:
With the Logic Apps connectors, is it impossible to insert JSON objects into Cosmos DB without that magic field "id" in the message payload?
If the partition key is required, why is it not available as a parameter in the V2 connector's UI?
Thanks.
The v1 version of this connector doesn't have a partition key because it's so old that Cosmos DB didn't have partitioned containers yet. You will want to use the v2 preview; I think it will go GA at some point soon, since it's been in preview for a while now.
And yes, with Cosmos DB you can't insert anything without its id, so you will need to generate one in whatever calls your endpoint and pass it in the body of your HTTP request.
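
A minimal sketch of that caller-side fix in Python, assuming a hypothetical HTTP-triggered Logic App URL and payload shape:

import uuid
import requests

# Hypothetical payload; 'username' matches the container's partition key.
doc = {'username': 'alice', 'message': 'hello archive'}

# Cosmos DB requires a unique string 'id' on every document.
doc['id'] = str(uuid.uuid4())

# Hypothetical 'When a HTTP request is received' trigger URL.
requests.post(
    'https://prod-00.westeurope.logic.azure.com/workflows/<id>/triggers/manual/paths/invoke',
    json=doc)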

Azure Consumption Usage API - Filter Not Working

I have an issue trying to filter usage details for Azure Consumption, as specified in the official documentation, based on:

properties/resourceName eq '{resourceName}'

The complete URI is:

$ConsumtionUsagesUri = "https://management.azure.com/subscriptions/$subId/providers/Microsoft.Billing/billingPeriods/$BillingPeriod/providers/Microsoft.Consumption/usageDetails?$expand=meterDetails,additionalProperties&$filter=properties/resourceName eq '{resourceName}'&api-version=2019-10-01"

The query returns all results regardless of the filter. Thanks everyone for your help!
This may be because your syntax is not correct. If you would like to filter by resource name, you have to follow this syntax (note that case sensitivity may return different results):

properties/instanceName eq '{instanceName}'

For other syntax, please check the Azure Consumption API repo. I believe this will work with the Azure Consumption API.
It will be like: ?api-version=2021-10-01&$expand=meterDetails&metric=amortizedcost&$filter=tags/Vendor eq '******'
The API documentation is not correct.
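
One more pitfall worth ruling out: in a PowerShell double-quoted string, $expand and $filter are expanded as (empty) variables unless escaped with a backtick, which silently strips the filter from the request. A sketch of the same call in Python, where nothing gets interpolated, using the instanceName syntax suggested above (subscription id, token, and resource name are placeholders):

import requests

sub_id = '<subscription-id>'
token = '<bearer-token>'
url = ('https://management.azure.com/subscriptions/%s'
       '/providers/Microsoft.Consumption/usageDetails' % sub_id)
params = {
    'api-version': '2019-10-01',
    '$expand': 'meterDetails,additionalProperties',
    '$filter': "properties/instanceName eq 'myResourceName'",  # hypothetical name
}
resp = requests.get(url, params=params,
                    headers={'Authorization': 'Bearer ' + token})
print(resp.status_code, len(resp.json().get('value', [])))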

DocumentDB Data Migration Tool, can't migrate from db to db

I'm using the DocumentDB Data Migration Tool to migrate a DocumentDB database to a newly created DocumentDB database. The connection strings verify OK.
It doesn't work: no data is transferred (0 transferred), but no failure is written in the log file (Failed = 0).
Here is what is done:
I've tried many things, such as:
migrating/transferring a collection to a JSON file
migrating to a partitioned/non-partitioned DocumentDB database
for the target indexing policy, taking the source indexing policy (JSON taken from the Azure portal, DocumentDB collection settings)
...
Actually nothing is working, but I have no error logs. Maybe a problem with the DocumentDB version?
Thanks in advance for your help.
After debugging the solution from the tool's repo, I figured out that the tool fails silently if you mistyped the database's name, like I did.
DocumentDBClient just returns an empty async enumerator:

// If the source database name doesn't resolve, the tool quietly yields nothing.
var database = await TryGetDatabase(databaseName, cancellation);
if (database == null)
    return EmptyAsyncEnumerator<IReadOnlyDictionary<string, object>>.Instance;
I can import from an Azure Cosmos DB DocumentDB API collection using the DocumentDB Data Migration Tool.
Besides, based on my test, if the name of the collection that we specify for the source DocumentDB does not exist, no data is transferred and no error log is written.

[Screenshot: import result]

Please make sure the source collection that you specified exists. If possible, try to create a new collection, import data from this new collection, and check whether the data can be transferred.
I've faced the same problem, and after some investigation I found that the internal document structure had changed. Therefore, after migration with the tool, the documents are present but can't be found with Data Explorer (although with Query Explorer, using select *, they are visible).
I migrated the collection through the Mongo API using MongoChef.
#fguigui: To help troubleshoot this, could you please re-run the same data migration operation using the command-line option? Just launch dt.exe from the Data Migration Tool folder to see the required syntax. Then, after you launch it with the required parameters, please paste the output here and I'll take a look at what's broken.
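
Since the tool fails silently on a mistyped database or collection name, a quick sanity check with the Cosmos DB Python SDK can confirm the source names before running dt.exe (endpoint, key, and names below are placeholders):

from azure.cosmos import CosmosClient

client = CosmosClient('https://<account>.documents.azure.com:443/', '<key>')
db = client.get_database_client('<source-database>')

# List the containers actually present, to catch typos in the collection name.
print([c['id'] for c in db.list_containers()])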

Azure Table Storage query

When I post data and query a table with the database as development storage (emulator), it works.
When I post data to a table with the data in the Azure database (have an account), it works.
When I get data from a table with the data in the Azure database (have an account), it does not work.
In both cases the code is the same, except for the key and account credentials.
Is there anything I should do to query?

var query = azure.TableQuery
    .select().from('dummytable').where('PartitionKey eq ?', key);

Can anyone suggest why the query is not working? Should there be anything else that needs to be done?
From Storage Explorer it works; I am able to see the entities.
Only from the program I am not able to get the response. But in the same program the "PUT" operation is working.
The same was happening to me. I did an upgrade of the azure npm package from 0.6.1 to 0.6.7 and now it works; hope this helps.
I would look at the value of your partition key. There are some values that aren't on Azure's list of invalid characters but that it still has issues with. For example, before SDK 1.7 you could safely insert a % in a key, but if you queried for it specifically, it wouldn't work. To test whether this is the problem, try running your query without the filter and make sure your row is returned.
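
As a sketch of that test, here it is with the current Python Table SDK (azure-data-tables) rather than the old Node azure package from the question; the connection string and table name are placeholders:

from azure.data.tables import TableClient

table = TableClient.from_connection_string('<connection-string>',
                                           table_name='dummytable')

# First, list everything: if the row shows up here but not in the filtered
# query, the partition key value itself is the problem.
for entity in table.list_entities():
    print(entity)

# Then retry with the filter that fails in the application.
for entity in table.query_entities("PartitionKey eq 'somekey'"):
    print(entity)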
After reading the MSDN mailing lists, I upgraded the azure npm package to the latest version, 0.6.7, and it works. It looks to be an issue with azure.
