Workflow error on Power Automate step of renaming a file - SharePoint

I got this error on the Power Automate workflow step:
The expression
"web/GetFileByServerRelativeUrl('/sites/ESMManila/Shared Documents/General/WorkloadPlanning/CurrentWorkPlan.xlsx')/moveto(newurl=concat('/sites/ESMManila/Shared Documents/General/WorkloadPlanning/',outputs('Compose'),'WorkPlan','.xlsx'))"
is not valid.
This happens when I use the following URI:
_api/web/GetFileByServerRelativeUrl('/sites/ESMManila/Shared%20Documents/General/WorkloadPlanning/CurrentWorkPlan.xlsx')/moveto(newurl=concat('/sites/ESMManila/Shared%20Documents/General/WorkloadPlanning/',outputs('Compose'),'WorkPlan','.xlsx'))
in the "Send an HTTP request to SharePoint" flow step, to rename a file with the result of a Compose action appended to its name.

If you want to use an expression within the value, you could use the @{ } string-interpolation formatting around the concat function.
Try this instead:
_api/web/GetFileByServerRelativeUrl('/sites/ESMManila/Shared%20Documents/General/WorkloadPlanning/CurrentWorkPlan.xlsx')/moveto(newurl=@{concat('/sites/ESMManila/Shared%20Documents/General/WorkloadPlanning/',outputs('Compose'),'WorkPlan','.xlsx')})
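
For reference, here is a minimal sketch of how the "Send an HTTP request to SharePoint" action might be configured with this URI (the site address is a placeholder and the header is optional; neither is taken from the original post):

Site Address: https://<your-tenant>.sharepoint.com/sites/ESMManila
Method: POST
Uri: _api/web/GetFileByServerRelativeUrl('/sites/ESMManila/Shared%20Documents/General/WorkloadPlanning/CurrentWorkPlan.xlsx')/moveto(newurl=@{concat('/sites/ESMManila/Shared%20Documents/General/WorkloadPlanning/',outputs('Compose'),'WorkPlan','.xlsx')})
Headers: Accept: application/json;odata=verbose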

Related

Azure Data Factory removing spaces from column names of CSV file

I'm a bit new to Azure Data Factory, so apologies if I'm missing anything obvious. I've done several searches and I can't find anything that quite fits.
So the situation is that we have an existing pipeline that takes the path to a CSV file and passes this in as a delimited dataset. As a sink it uses a Parquet dataset. This is a generic process that we can pass any delimited file into and it will output it as Parquet.
This has been working well, but now we have started receiving files with spaces and special characters in the header, which causes the output to Parquet to fail. Unfortunately we don't have control over the format of the files we receive, so I can't handle this at the source.
What I would like to do is, on ingestion of the file, replace any spaces and other special characters in the header with an underscore. If I were doing this on premise I could quickly create a PowerShell script to do it. I had thought about creating a custom task in ADF to call a PowerShell script to do this in Blob Storage, but that seems more complicated than it should be. Is there something else I can do to get this process working while keeping it generic?
As @Joel Cochran mentioned, you can use the expression below in a Select transformation to replace spaces and special characters in the header.
regexReplace($$,'[^a-zA-Z]','_')
In the Select transformation, remove the auto mappings and add a new rule-based mapping that uses this expression.
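As a quick illustration (these header names are hypothetical, not from the original question), the rule replaces every character that is not a letter with an underscore, so digits are replaced as well:

'Order Date' becomes 'Order_Date'
'Qty (2020)' becomes 'Qty_______'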
You can change the output filename, though not directly in the Copy activity (assuming you are using that activity).
The workaround is to use a parameter for the output filename that you can clean up.
You can use the Get Metadata activity to get all filenames from the source CSV files.
Then loop over these files with a ForEach activity.
Within the ForEach activity you can set the output filename to the cleaned value.
The function could look like this:
@replace(item().name, ' ', '_')
More information on the replace function
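
A rough sketch of how these pieces could fit together (the activity names and the sink dataset parameter fileName are assumptions for illustration, not from the original answer):

Get Metadata1: Field list = Child items
ForEach1: Items = @activity('Get Metadata1').output.childItems
  Copy data1 (inside ForEach1):
    source dataset file name = @item().name
    sink dataset parameter fileName = @replace(item().name, ' ', '_')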

Passing dynamic content inside a SQL Query in lookup stage of Azure Data Factory

I'm using a Lookup stage as a source to fetch some data and I want to pass that output as the input to the next Lookup stage. I tried adding @activity('Step1').output.firstRow.Col and it failed with an error that the scalar variable was not declared. I also tried @{activity('Step1').output.firstRow.col}, which failed as well; the log shows only default expressions are supported. Please help if it is possible.
I have accomplished this using a data flow, but considering the performance, I would like to know if it can be done in a pipeline.
Please try this:
Query: select * from dbo.test5 where col = '@{activity('Lookup1').output.firstRow.col}'
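To see why the @{ } form works: string interpolation is expanded before the query is sent, so if Lookup1 returned, say, col = 'ABC' (a hypothetical value), the query actually executed would be:

select * from dbo.test5 where col = 'ABC'

A bare @activity(...) reference inside literal query text is not expanded, which is likely why the database reported an undeclared scalar variable.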

Logic App: ActionFailed. An action failed. No dependent actions succeeded

I am facing an issue with a for each loop execution in a Logic App in Azure. Apparently the complete playbook executes successfully and functionally it is working well. However, I am getting this error because the loop takes the "body" parameter from the previous step as input and nothing else. The body is a long JSON and therefore should not be the right input for the for each loop. I tried adding the account or IP address as input, but that fails as well.
Please help here.
As you mentioned, there is just one item in your JSON data array that contains "MachineId", so I assume the first item contains "MachineId". Please refer to the solution below; it will help you use only that "MachineId" in the 24 cycles of your loop.
We can input an expression to use the "MachineId" in the first item:
body('Parse_JSON')[0].MachineId
(In my test I just used a "Set variable" action in place of your two actions in the "For each" loop, but I think there is no difference between them.)
Please have a try with this solution.
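As a sketch of where this goes (the action name Parse_JSON comes from the expression above; everything else is assumed for illustration), open the expression editor in the action inside the loop and enter:

body('Parse_JSON')[0]?['MachineId']

The ?['MachineId'] form behaves like the expression above, but returns null instead of failing the run if the first item happens to lack a "MachineId" property.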

VB syntax for SSRS expression - cannot figure out proper construct

Looking for help on what should be a very basic function. I am trying to get a SUM of a specific value; however, I do not seem to get the syntax correct.
Here is what I have:
=Sum(Fields!PriorYearSalesDollars.Value - Sum(Fields!PriorYearCost.Value
+Sum(Fields!PriorYearFrtCost.Value)))
However, I get an error when trying to sum. Is there another way to test this? Each time I modify the expression I have to save the report, upload it to the report server, and test again. If I do it through the preview function in Visual Studio, it throws a generic error on the whole report. When running from the report server, just this specific column shows #Error.
This is the syntax that works, after first changing the column format to numbers (I had accidentally set it as currency). Not sure why currency didn't work, but this is correct.
=Sum(Fields!PriorYearSalesDollars.Value) - (Sum(Fields!PriorYearCost.Value) + Sum(Fields!PriorYearFrtCost.Value))
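
A variant that should be equivalent (a sketch, assuming all three fields come from the same dataset scope) does the arithmetic per row and aggregates once:

=Sum(Fields!PriorYearSalesDollars.Value - (Fields!PriorYearCost.Value + Fields!PriorYearFrtCost.Value))

Either way, the key difference from the failing attempt is that the aggregates are not nested inside each other and the parentheses are balanced.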

What am I missing in trying to pass Variables in an SSIS Execute SQL Task?

I am creating an SSIS Execute SQL Task that will use variables, but it gives me an error when I try to use it. When I try to run the below, it gives me an error, and when I try to build the query, it gives me the error "SQL Syntax Errors encountered and unable to parse query". I am using an OLE DB connection. Am I not able to use variables to specify the tables?
You can't parameterize a table name.
Use the Expressions editor in your Execute SQL Task to select the SqlStatementSource property.
Try "SELECT * FROM " + #[User::TableName]
After clicking OK twice (to exit the Task editor), you should be able to reopen the editor and find your table name in the SQL statement.
Add a string cast in case the variable might be a simple Object: (DT_WSTR,100)
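
Putting those two points together, the SqlStatementSource expression might look like this (a sketch, assuming a variable named User::TableName as in the answer above):

"SELECT * FROM " + (DT_WSTR,100)@[User::TableName]

If the variable is already a String, the (DT_WSTR,100) cast can be dropped.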
You are using only a single parameter (?) in the query but assigning three inputs to that parameter, which will not work. Put only a single input, assign a variable to it, and change the value of the variable accordingly.
The parameter names should start with 0 and increment by 1, because they are the indexes representing each "?" in the query written in the query window.
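
For example (a hypothetical query and variable names, not from the original post), with a statement like

SELECT * FROM MyTable WHERE Col1 = ? AND Col2 = ?

the Parameter Mapping page would map User::Var1 to parameter name 0 and User::Var2 to parameter name 1, matching the first and second ? in order.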
