Script Activity in ADF does not take new parameter values for a dynamic linked service (Azure)

I have a dynamic linked service created in ADF with parameters so that I can change the server name in the linked service whenever I want. I am using this linked service in a Script activity in a pipeline.
The first time, the Script activity works fine. After I change the server name, it does not pick up the new server name.
For example, I created the linked service with the Dev SQL Server and used it in a Script activity in a pipeline. It works fine for Dev. If I change the SQL Server name in the dynamic linked service to the QA server, the Script activity still points to the Dev server; it does not take the new parameter value.
I tried changing the parameter value. The same scenario works fine for a dataset that I used in a Copy Data activity in the pipeline.

I have reproduced the above and was able to change the server name in the Script activity successfully.
First, I created parameters on the linked service (Azure SQL Database).
Don't give them any default values.
The parameters are defined like below.
Then, in the Script activity, I passed pipeline parameters to them. You can also supply the values directly using dynamic content. As a sample, I used a select query.
I have two SQL servers, rakeshserver and rakeshserver2. While debugging, you will be asked for the pipeline parameter values. If you supplied the values using dynamic content in the Script activity, it executes directly.
First server and a table in the first server:
Result:
Second server and a table in the second server:
Result:
If your Script activity keeps returning results from the same server, the linked service falling back to a default value is the likely reason. Supply the parameter value either during debug or through dynamic content, and it should work as above.
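For comparison only, here is a minimal Python sketch (not ADF) of the same idea: the server name is supplied at call time, the way a pipeline parameter reaches the linked service through dynamic content, rather than being baked in as a constant default. The server names, driver and query below are hypothetical placeholders.

import pyodbc  # assumes an ODBC driver for SQL Server is installed

def run_script(server_name: str, query: str = "SELECT 1 AS probe"):
    # Connect to whichever server is passed in for this run,
    # mirroring a linked-service parameter supplied via dynamic content.
    conn_str = (
        "DRIVER={ODBC Driver 18 for SQL Server};"
        f"SERVER={server_name};DATABASE=master;UID=user;PWD=password;"
    )
    with pyodbc.connect(conn_str) as conn:
        return conn.cursor().execute(query).fetchall()

# The caller decides the target for each run, like a pipeline parameter:
# run_script("devserver.database.windows.net")
# run_script("qaserver.database.windows.net")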

Related

I can't connect to the input container, but the container is accessible and the file is there

I am learning Azure, specifically Data Factory, through a basic exercise.
1 - I should create an input container and an output container (using Azure Storage v2).
2 - After that, I created the datasets for input and output.
3 - Finally, I should connect the data flow to my input dataset.
But:
I can test the connections on the datasets to prove that I created them without problems, but I can't test the connection from my data flow to the input dataset.
I tried:
recreating it with different names;
keeping only the needed file in the storage;
using a different input file (I am using a sample similar to the "movies.csv" expected by the exercise).
I created an Azure blob container and uploaded a file.
I created a linked service with the Azure storage account.
I created a dataset with the above linked service, following the procedure below:
I tested the connection, and it connected successfully.
I didn't get any error. The error you mentioned above is related to dynamic content. If you assign any parameters in the dataset, provide the values of those parameters correctly. I added parameters in the dataset as below.
When I tried to test the dataset, I got an error:
I added values for the parameters in the debug settings.
I tested the connection again, and it connected successfully.
Otherwise, add the sink to the data flow and try to debug it; it may work.
I think I found the solution.
When I am working with debug on and for some reason I create another data flow, I can't connect to the new datasets.
But if I restart the debug session (turn it off and on again), the connections start working again.

Iterate through files in Data Factory

I have a Data Lake Gen1 with folder structure /Test/{currentyear}/{Files}
{Files} example format:
2020-07-29.csv
2020-07-30.csv
2020-07-31.csv
Every day one new file gets added to the folder.
I need to create an ADF pipeline to load the files into SQL Server.
Conditions:
When my ADF runs for the first time, it needs to iterate over all the files and load them into SQL Server.
When ADF executes from the second time onward (daily, once), it needs to pick up only today's file and load it into SQL Server.
Can anyone tell me how to design the ADF pipeline with the above conditions?
This should be designed in two parts.
When my ADF runs for the first time it needs to iterate all files and load them into SQL Server:
You should create a temporary pipeline to achieve this. (I think you know how to do this, so I will not cover that part.)
When ADF executes from the second time onward (daily, once) it needs to pick up only today's file and load it into SQL Server:
This needs another pipeline that runs continuously.
Two points to achieve this:
First, trigger this pipeline with an event trigger (when a file is uploaded, the pipeline is triggered).
Second, filter the file by a specific name format.
For your requirement, the expression should be @{formatDateTime(utcnow(),'yyyy-MM-dd')}, as sketched below.
On my side, this works successfully. Please have a try on your side.
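As an illustration only (this is plain Python rather than ADF, and the folder layout /Test/{currentyear}/{Files} is taken from the question), the daily filter boils down to building today's file name in the same yyyy-MM-dd format and loading just that file; pandas and the staging table name are assumptions here.

from datetime import datetime, timezone
from pathlib import Path

import pandas as pd  # any CSV-to-SQL loader would do

def todays_file(root: str = "/Test") -> Path:
    # e.g. /Test/2020/2020-07-30.csv, matching @{formatDateTime(utcnow(),'yyyy-MM-dd')}
    now = datetime.now(timezone.utc)
    return Path(root) / str(now.year) / f"{now:%Y-%m-%d}.csv"

def load_today(sql_engine) -> None:
    # sql_engine is a SQLAlchemy engine pointing at the target SQL Server (hypothetical)
    path = todays_file()
    if path.exists():  # skip quietly if today's file has not arrived yet
        pd.read_csv(path).to_sql("Staging", sql_engine, if_exists="append", index=False)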

How to retrieve the endpoint of a service

I want to add my project's endpoint in the project TearDown Script. What is the syntax to get the endpoint for all requests and test requests, given that the user will assign their endpoint via "all requests and test requests" before running the project?
I have seen an example using a test step, but I don't want to retrieve it via the test step route:
testRunner.testCase.getTestStepByName("dd").getHttpRequest().getEndpoint();
The TearDown Script can use the log, context, runner and project variables.
Thanks
Based on the information updated in the question, it looks like you have to access the endpoint in the TearDown Script of the project.
It also appears that you need to execute the same set of tests against different base URLs and domains, and you might even need to use different credentials accordingly.
Considering the above, it would be easiest to use project-level properties.
Here is how:
Create a project-level custom property for the base URL, say BASE_URL as the property name and http://10.0.0.1:8008 as its value. Of course, change it to the actual value needed for the tests to be executed.
Similarly, create another project-level property for the domain, say DOMAIN_NAME, and provide its value according to the test.
Double-click on the service / interface and click on the Service Endpoints tab.
Remove all the existing values.
Add a new endpoint by clicking the + icon.
Add ${#Project#BASE_URL} as the endpoint and ${#Project#DOMAIN_NAME} as the domain value.
If required, use the same approach for the credentials.
Now click the Assign button there and choose the "All requests and tests" option from the dropdown.
Do the same if you have multiple services / interfaces.
How to access the above values in the TearDown Script:
log.info "Endpoint : ${project.getPropertyValue('BASE_URL')}"
log.info "Domain : ${project.getPropertyValue('DOMAIN_NAME')}"
When you want to change the domain or base URL, just change the values of the respective project properties before you execute the tests against different servers / environments.
EDIT:
The values for the endpoint or domain can be passed dynamically (without even changing the values saved in the project) from the command line using the SOAPUI_HOME/bin/testrunner utility while executing the tests. For more details, refer to the documentation.
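For example (the project file name and the values shown are placeholders; -P sets a project-level property for that run):

SOAPUI_HOME/bin/testrunner.sh -PBASE_URL=http://10.0.0.1:8008 -PDOMAIN_NAME=example.com MyProject-soapui-project.xml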

Switching production Azure tables powering a cloud service

I would like to know the best way to handle the following scenario.
I have an Azure cloud service that uses an Azure Storage table to look up data for incoming requests. The data in the table is generated offline periodically (once a week).
When new data is generated offline, I currently upload it into a separate table and make a config change (the table name) so the service picks up data from the new table, then re-deploy the service. (Every time the data changes, I change the table name, which is stored as a constant in my code, and re-deploy.)
The other way would be to keep a configuration parameter for my Azure web role that specifies the name of the table holding the current production data. Then, within the service, I would read the config variable for every request, get a reference to the table, and fetch the data from there.
Is the second approach OK, or would it take a performance hit because I read the config and create a table client on every request that comes to the service? (The SLA for my service is less than 2 seconds.)
To answer your question, the 2nd approach is definitely better than the 1st one. I don't think you will take a performance hit: the config settings are cached on first read (I read that in one of the threads here), and creating a table client does not add network overhead, because unless you execute some method on the table client the object just sits in memory. One possibility would be to read the value from the config file and put it in a static variable. When you change the config setting, capture the role environment changing event and update the static variable with the new value from the config file.
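As a rough Python sketch of that idea (the original service is a .NET web role, so treat this purely as an illustration; the environment-variable and table names are made up): cache the configured table name once, refresh it when the configuration changes, and build the table client per request, which stays cheap because no network call happens until a query actually executes.

import os
from azure.data.tables import TableClient  # pip install azure-data-tables

# Cached "static variable": read the configured table name once at startup.
_current_table = os.environ.get("LOOKUP_TABLE_NAME", "LookupData_v1")

def on_config_changed() -> None:
    # Analogue of handling the role-environment-changing event:
    # re-read the setting and update the cached value.
    global _current_table
    _current_table = os.environ.get("LOOKUP_TABLE_NAME", _current_table)

def lookup(conn_str: str, partition_key: str, row_key: str):
    # A client per request is fine; it only talks to storage when get_entity runs.
    client = TableClient.from_connection_string(conn_str, table_name=_current_table)
    return client.get_entity(partition_key=partition_key, row_key=row_key)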
A third alternative could be to soft-code the table name in another table and have your application read the table name from there. You could update that as part of your upload process: first upload the data, then update this control table with the name of the table into which the data was loaded.

Can I send data to a RemoteApp using Remote Desktop Services?

When I launch a RemoteApp via Remote Desktop Web Access, is there a way to send data to the remote app?
Desired scenario:
A user logs into a website with their credentials. They also provide demographic information such as first name, last name, address, etc.
The website connects to the RemoteApp via SSO and makes the demographic information available to the RemoteApp.
For example, if the RemoteApp is a Windows Forms app, can I get this information and display it in a message box?
Edit1: TomTom's response in this question mentions using named pipes to send data. Is that applicable to this problem?
It turns out you can pass command-line parameters to the RemoteApp using the remoteapplicationcmdline property, like so:
remoteapplicationcmdline:s:/Parameter1: 5234 /Parameter2: true
(The names "/Parameter1" and "/Parameter2" are just examples. Your remote app will have to define and handle these as appropriate.)
This setting is part of the RdpFileContents property of the MsRdpClientShell object.
Here is a resource for other RdpFileContents properties.
Your code might end up looking something like this:
MsRdpClientShell.PublicMode = true;
MsRdpClientShell.RdpFileContents = 'redirectclipboard:i:1 redirectposdevices:i:0 remoteapplicationcmdline:s:/Parameter1: 5234 /Parameter2: true [Other properties here...]';
MsRdpClientShell.Launch();
For larger amounts of information, we might send preliminary data to a web service, retrieve an identifier back, pass this identifier to the RemoteApp via the command line, then have the RemoteApp query the web service to get all the information.
Of course, for the parameters to be of use, the program must be looking for them. Setting up a database to query introduces a small security concern if the data is sensitive.
If the program (RemoteApp) expects data in the form of a CSV or a table or something similar, then you might be able to send a lot of data to be processed. It just depends on what parameters (and in what form) the program is going to use.
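On the receiving side, the RemoteApp simply parses its command line. The app in the question is a Windows Forms application, but as a language-neutral sketch of the parsing (the parameter names follow the example above):

import sys

def parse_remoteapp_args(argv):
    # Turns ['/Parameter1:', '5234', '/Parameter2:', 'true']
    # into {'Parameter1': '5234', 'Parameter2': 'true'}.
    params, key = {}, None
    for token in argv:
        if token.startswith("/") and token.endswith(":"):
            key = token.strip("/:")
            params[key] = ""          # switch seen, value may follow
        elif key is not None:
            params[key] = token       # value for the preceding switch
            key = None
    return params

if __name__ == "__main__":
    print(parse_remoteapp_args(sys.argv[1:]))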
