Azure Data Factory Dataflows - dynamic sink parameters - azure

I'm working on dataflows that will handle my dimension loads.
I wanted them to be as parameterized as possible, so I created a generic source and sink (both Azure Synapse).
In the dataflow debug settings I can supply the requested values (tableName and schema name).
It works for the source without issue, but for some reason the sink is not reading the values.
I got
Connection failed
{ "Message": "No value provided for Parameter 'tableName'" } - RunId: 27be90a3-294a-48fa-93f0-d3fc2d6df3f5
but it is provided in the debug parameters.
Does anyone know how to fix it?
Debug settings

I tried to reproduce your issue, and configuring a default value in the sink dataset parameter solved it:
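For illustration, a parameterized Synapse sink dataset with defaults looks roughly like this (a sketch: tableName and schemaName mirror the question's parameters, everything else is a placeholder):

{
    "name": "GenericSynapseSink",
    "properties": {
        "type": "AzureSqlDWTable",
        "linkedServiceName": {
            "referenceName": "SynapseLinkedService",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "tableName": { "type": "string", "defaultValue": "DimDate" },
            "schemaName": { "type": "string", "defaultValue": "dbo" }
        },
        "typeProperties": {
            "schema": { "value": "@dataset().schemaName", "type": "Expression" },
            "table": { "value": "@dataset().tableName", "type": "Expression" }
        }
    }
}

With defaultValue set, the sink no longer fails when the parameter value is not passed through to it.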

Related

Azure Data Factory - Error Code 11402 - value of property invalid

I just stumbled upon an error regarding the linked services (under connections) of my Data Factory. Today I just wanted to create a new connection to a Dynamics CRM System, but encountered the following error code:
Error code 11402
Details: The value of the property 'Organization Name' is invalid: 'Organization cannot be null or empty.
Parameter name: Organization Name'.
Activity ID: ccdf3a23-f43a-4256-8ecc-af46ca17638d.
For some reason this even occurs if I duplicate an existing Dataverse connection. Now none of my connections work anymore, and I wonder if I screwed up some configuration by accident. Unfortunately, I can't seem to find an option to enter the Organization Name, nor does a parameter with that name exist (e.g. in a dataset).
Does anyone have a hint on where I can enter the Organization Name, or how to bypass this problem?
Thanks in advance!
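For context, in the linked service JSON (visible via the code view in ADF) the field surfaces as the organizationName type property of the Dynamics connector. It normally applies only to on-premises/IFD deployments; Online deployments derive the organization from serviceUri instead. A rough sketch with placeholder values:

{
    "name": "DynamicsLinkedService",
    "properties": {
        "type": "Dynamics",
        "typeProperties": {
            "deploymentType": "OnPremisesWithIfd",
            "hostName": "crm.contoso.com",
            "port": 443,
            "organizationName": "contoso",
            "authenticationType": "Ifd",
            "username": "user@contoso.com",
            "password": { "type": "SecureString", "value": "<password>" }
        }
    }
}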

Error when debugging a data factory pipeline with global parameters

I added a few global parameters to a data factory pipeline. However, when starting a debug session, I get the following error. Any idea?
{"code":"BadRequest","message":null,"target":"pipeline//runid/xxxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxxxx","details":null,"error":null}
Removing the global parameters fixes the issue.
Any idea how to fix this issue?
Thanks
Have you tried going into Monitor in Data Factory and checking the issue under the error section?
I faced a similar issue once when I debugged a pipeline, and resolved the error by getting the details from the Monitor error view and making the necessary changes to the pipeline.

Invalid resource name creating a connection to azure storage

I'm trying to create a project in the labeling tool of the Azure Form Recognizer. I have successfully deployed the web app, but I'm unable to start a project. I get this error every time I try:
I have tried with several app instances and changing the project name and connection name; none of those worked. The only common factor here is that the error is related to the connection.
As I see it:
1) I can either start a new project or use one on the cloud:
First I tried to create a new project:
I filled the fields with these values:
Display Name: Test-form
Source Connection: <previously created connection>
Folder Path: None
Form Recognizer Service Uri: https://XXX-test.cognitiveservices.azure.com/
API Key: XXXXX
Description: None
And got the error from the question's title:
"Invalid resource name creating a connection to azure storage"
I tried several combinations of names; none of them worked.
Then I tried with the option: "Open a cloud project"
Got the same error instantly, hence I deduce the issue is with the connection settings.
Now, in the connection settings I have this:
At first glance, since the values are accepted and the connection is created, I guess it is correct, but it is the only point of failure I can think of.
Regarding the storage container settings, I added the required CORS configuration, and I have used the container to train models with the Form Recognizer, so that part does work.
At this point I'm pretty much stuck, since the error message does not give me many clues about where the error is.
I was facing a similar error today.
You have to add the container name before "?sv..." in your SAS URI in the connection settings:
https://****.blob.core.windows.net/**trainingdata**?sv=2019-10-10..
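In other words (the account and token are masked, the container here is trainingdata):

Without the container segment the URI points at the account root and fails:
https://<account>.blob.core.windows.net?sv=2019-10-10&...
With the container name before the query string it works:
https://<account>.blob.core.windows.net/trainingdata?sv=2019-10-10&...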

Jira Rest API Calls in Azure Data Factory

Good Day
I configured a Copy Data pipeline job in Azure Data Factory to extract data from Jira with an API call, using the REST API connector in Azure.
When I configure and test the connection, it is successful.
Now, when I try to preview the data in the Copy Data activity, I get the following error.
Does anyone know what this error means and how to bypass it?
I believe I am not the first one trying to extract data from Jira via the REST API.
Thank you and Regards
Rayno
Error occurred when deserializing source JSON file. Check if the data is in valid JSON object format. Unexpected character encountered while parsing value: <. Path ...
I think the error already indicates the root cause: your data is not in valid JSON format. The unexpected < character typically means the endpoint returned HTML (for example a login or error page) rather than JSON. You could try simulating the REST API call yourself to confirm; ADF can't handle the invalid deserialization for you.
In addition, according to the connector docs, ADF supports a Jira connector. Maybe you could give that a try.
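To rule ADF out, you can call the endpoint yourself and check whether the body parses as JSON. A minimal C# sketch (the URL, credentials, and JQL below are placeholders for whatever your copy source actually uses):

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using Newtonsoft.Json.Linq;

class JiraProbe
{
    static void Main()
    {
        using var client = new HttpClient();
        // Jira Cloud expects basic auth with an API token.
        var token = Convert.ToBase64String(
            Encoding.ASCII.GetBytes("user@example.com:<api-token>"));
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Basic", token);

        // Use the same URL you configured in the ADF copy source.
        var body = client
            .GetStringAsync("https://yoursite.atlassian.net/rest/api/2/search?jql=project=TEST")
            .GetAwaiter().GetResult();

        try
        {
            JToken.Parse(body); // throws if the payload is not valid JSON
            Console.WriteLine("Valid JSON");
        }
        catch (Newtonsoft.Json.JsonReaderException ex)
        {
            // A leading '<' usually means the server returned HTML instead of JSON.
            Console.WriteLine("Not JSON: " + ex.Message);
            Console.WriteLine(body.Substring(0, Math.Min(200, body.Length)));
        }
    }
}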

SQL Azure unexpected database deletion/recreation

I've been scratching my head on this for hours, but can't seem to figure out what's wrong.
Here's our project basic setup:
MVC 3.0 Project with ASP.NET Membership
Entity Framework 4.3, Code First approach
Local environment: local SQL Server with 2 MDF database files attached (aspnet.mdf + entities.mdf)
Server environment: Windows Azure + 2 SQL Azure databases (aspnet and entities)
Here's what we did:
Created local and remote databases, modified web.config to use SQLEXPRESS connection strings in debug mode and SQL Azure connection strings in release mode
Created a SampleData class extending DropCreateDatabaseAlways<Entities> with a Seed method to seed data (a sketch of it follows the directives below).
Used System.Data.Entity.Database.SetInitializer(new Models.SampleData()); in Application_Start to seed data to our databases.
Ran app locally - tables were created and seeded, all OK.
Deployed, ran remote app - tables were created and seeded, all OK.
Added pre-processor directives to stop destroying the Entity database at each application start on our remote Azure environment:
#if DEBUG
// Local runs: drop, recreate, and seed the database on every start.
System.Data.Entity.Database.SetInitializer(new Models.SampleData());
#else
// Azure/release: disable the initializer so the database is never touched.
System.Data.Entity.Database.SetInitializer<Entities>(null);
#endif
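For context, the SampleData initializer mentioned above looks roughly like this (a sketch: Entities is our context, the seeded entity is a made-up example):

using System.Data.Entity;

public class SampleData : DropCreateDatabaseAlways<Entities>
{
    protected override void Seed(Entities context)
    {
        // Runs after the database has been dropped and recreated.
        context.Categories.Add(new Category { Name = "Sample" });
        context.SaveChanges();
    }
}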
Here's where it got ugly
We enabled Migrations using NuGet, with AutomaticMigrationsEnabled = true;
Everything was running smooth and nice. We left it cooking for a couple of days.
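For reference, enabling Migrations scaffolds a Configuration class that looks roughly like this (a sketch against our Entities context):

using System.Data.Entity.Migrations;

internal sealed class Configuration : DbMigrationsConfiguration<Entities>
{
    public Configuration()
    {
        // Let EF apply model changes automatically, without explicit migration files.
        AutomaticMigrationsEnabled = true;
    }
}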
Today, we noticed an unknown bug on the Azure environment:
we have several classes deriving from a superclass SuperClass
the corresponding Entity table stores all of these objects in the same SuperClass table, using a discriminator to know which column to feed from when loading the various classes
While the loading went just fine before today, it doesn't anymore. We get the following error message:
The 'Foo' property on 'SubClass1' could not be set to a 'null' value. You must set this property to a non-null value of type 'Int32'.
After a quick check, our SuperClass table has columns Foo and Foo1. Logical enough, since SuperClass has 2 subclasses SubClass1 and SubClass2, each with a Foo property. In our case, Foo is NULL but Foo1 has an int32 value. So the problem is not with the database - rather, it would seem that the link between our Model and Database has been lost. The discriminator logic was corrupted.
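To illustrate the mapping with simplified, made-up members (only the Foo properties are from the actual model):

public abstract class SuperClass
{
    public int Id { get; set; }
}

public class SubClass1 : SuperClass
{
    public int Foo { get; set; } // mapped to one of the columns "Foo"/"Foo1"
}

public class SubClass2 : SuperClass
{
    public int Foo { get; set; } // mapped to the other column
}

// With table-per-hierarchy mapping, EF stores both subclasses in the SuperClass
// table, adds a discriminator column, and renames the clashing Foo properties to
// the columns Foo and Foo1. A row fills one column and leaves the other NULL,
// which is why a flipped mapping produces the 'null' value error above.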
Trying to find indications on what could've gone wrong, we noticed several things:
Even though we never performed any migration on the SQL Azure Entity database, the database now has a __MigrationHistory table
The __MigrationHistory table has one record:
MigrationID: 201204102350574_InitialCreate
CreatedOn: 4/10/2012 11:50:57 PM
Model: <Binary data>
ProductVersion: 4.3.1
Looking at other tables, most of them were emptied when this migration happened. Only the tables that were initially seeded with SampleData remained untouched.
Checking in with the SQL Azure Management portal, our Entity database shows the following creation date: 4/10/2012 23:50:55.
Here is our understanding
For some reason, SQL Azure deleted and recreated our database
The __MigrationHistory table was created in the process, registering a starting point to test the model against for future migrations
Here are our Questions
Who / What triggered the database deletion / recreation?
How could EF re-seed our sample data since Application_Start has System.Data.Entity.Database.SetInitializer<Entities>(null);?
EDIT: Looking at what could've gone wrong, we noticed one thing we didn't respect in this SQL Azure tutorial: we didn't remove PersistSecurityInfo from our SQL Azure Entity database connection string after the database was created. Can't see why on Earth it could have caused the problem, but still worth mentioning...
Never mind, found the cause of our problem. In case anybody wonders: we hadn't made any Azure deployment since the addition of the pre-processor directives. MS must have restarted the machine our VM resided on, and the new VM recreated the database using seed data.
Lesson learned: always do frequent Azure deployments.
