Copy PDF files from a SharePoint folder to FTP using Power Automate

I am new to Power Automate. I have a requirement to copy files from a SharePoint folder to FTP using Power Automate.
Step 1: Created "When a file is created in a folder" (SharePoint) -> provided the SharePoint address and folder
Step 2: Create file (FTP) -> provided the connection information
I have provided the FTP details, but it fails to create the connection and says "the remote name could not be resolved. ClientRequestId:0d0b38ab-af93-4d9e-91ec-d97dd33c068a".

Please do not use the whole FTP endpoint, such as ftps://waws-xxxxxxx.ftp.azurewebsites.windows.net/site/wwwroot, as the "Server Address" in your FTP connector.
Remove the ftps:// prefix from the endpoint and use waws-xxxxxxx.ftp.azurewebsites.windows.net/site/wwwroot instead.


How to create a local file link in the description section of a work item within Azure DevOps?

I am trying to create a link to a local file within the description section of a task in Azure DevOps.
I have tried using file:///H:\Documents\test.xlsx. The link appears as a hyperlink; however, it is not clickable.
I am afraid this is not achievable in Azure DevOps. The description section of a work item cannot hold a link to a local file.
You can, however, attach local files to the work item as attachments.
Choose the Attachments tab icon to attach a file with supplemental information.

Copy files from Azure Blob Storage to a SharePoint folder using Microsoft Flow

I want to copy files from Azure Blob Storage to a SharePoint folder using Microsoft Flow. I have tried several times, and the flow always fails while running.
I have attached the flow that I'm currently trying to execute.
Can someone help me with this?
For your problem, please refer to the logic I've posted below (I uploaded a testcsv.csv file to the blob storage):
After the trigger "When a blob is added or modified", use the "Get blob content" action to get the content of the csv file. Then add the SharePoint "Create file" action and put the file content we got from the blob into the "File Content" box.
By the way, since you mentioned it is a csv file, my blob storage container only contains one file by default. If there is more than one file in your blob storage, you can use the "List blobs" action, loop over the result with "For each", and create each of the files in SharePoint; a rough sketch of how those actions nest is shown below.
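For reference, here is a rough sketch of how those actions nest in the flow's code view (the underlying Logic Apps workflow definition). The action names and the connection-specific inputs are placeholders assumed for illustration, not the exact JSON the designer generates:

"actions": {
    "List_blobs": {
        "type": "ApiConnection",
        "runAfter": {},
        "inputs": { "host": { "connection": { "name": "<azureblob connection reference>" } }, "path": "<container path>" }
    },
    "For_each": {
        "type": "Foreach",
        "foreach": "@body('List_blobs')?['value']",
        "runAfter": { "List_blobs": [ "Succeeded" ] },
        "actions": {
            "Get_blob_content": {
                "type": "ApiConnection",
                "inputs": { "host": { "connection": { "name": "<azureblob connection reference>" } }, "path": "<blob content path>" }
            },
            "Create_file": {
                "type": "ApiConnection",
                "runAfter": { "Get_blob_content": [ "Succeeded" ] },
                "inputs": { "host": { "connection": { "name": "<sharepointonline connection reference>" } }, "path": "<create file path>", "body": "@body('Get_blob_content')" }
            }
        }
    }
}

The point to note is the "For each" wrapper: it iterates over the array returned by "List blobs", and the "Get blob content" / "Create file" pair runs once per blob.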
I tried that; however, it failed because it does not handle the case where the blob container has a folder structure that you want to mirror in SharePoint, copying the individual files into the corresponding folders.

Copy On-Prem Flat file to Azure Blob using Azure Data Factory V2

I am trying to load a flat file to Blob Storage using ADF V2. I have installed the self-hosted integration runtime for this. The integration runtime on the local machine shows that it is successfully connected to the cloud service, as in the screenshot below. However, while creating the linked service to the on-prem file, credentials are required. I am not sure what username or password should be provided; I have tried both my on-prem and Azure passwords. Please see the screenshot.
Could you please guide me on how the connection can be made to a local flat file in my case?
Thanks
- Akshay
Note: You can choose a file while creating the File System source in ADF.
You may follow these steps to select the text file while creating the File System source (a JSON sketch of the resulting definitions follows after the steps):
First, create a linked service as follows:
Host: C:\AzureLearn\
Then create a copy activity and configure the source as follows:
Click on Source => New
Select New Dataset => File => File System and continue
Select Format => DelimitedText and continue
Select the previously created File System linked service and click Browse.
Choose a file or folder.
Here you will find the files located under the folder you selected when creating the File System linked service.
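If you prefer to see it as JSON, the result looks roughly like the following sketch. The names, the sample path, and the credentials are placeholders: the account is typically a Windows account on the on-premises machine (or domain) that can read the folder, and connectVia must reference your self-hosted integration runtime.

{
    "name": "FileSystemLinkedService",
    "properties": {
        "type": "FileServer",
        "typeProperties": {
            "host": "C:\\AzureLearn",
            "userId": "<machine-or-domain>\\<user>",
            "password": { "type": "SecureString", "value": "<password>" }
        },
        "connectVia": { "referenceName": "<your self-hosted IR>", "type": "IntegrationRuntimeReference" }
    }
}

{
    "name": "DelimitedTextDataset",
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": { "referenceName": "FileSystemLinkedService", "type": "LinkedServiceReference" },
        "typeProperties": {
            "location": { "type": "FileServerLocation", "folderPath": "<subfolder>", "fileName": "<file>.csv" },
            "columnDelimiter": ",",
            "firstRowAsHeader": true
        }
    }
}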
Hope this helps.
The linked service connection to Blob or Azure SQL Server is being blocked by my organisation's firewall; it won't let the self-hosted integration runtime connect my resources to the public cloud.
I followed the same steps on my personal machine and everything went smoothly. I will get the firewall restrictions sorted and update here with more information.

Copying files from a file share with Azure Data Factory: configuration problem

I am trying to learn to use Azure Data Factory to copy data (a collection of csv files in a folder structure) from an Azure file share to a Cosmos DB instance.
In Azure Data Factory I'm creating a "Copy data" activity and trying to set my file share as the source using the following host:
mystorageaccount.file.core.windows.net\\mystoragefilesharename
When trying to test the connection, I get the following error:
[{"code":9059,"message":"File path 'E:\\approot\\mscissstorage.file.core.windows.net\\mystoragefilesharename' is not supported. Check the configuration to make sure the path is valid."}]
Should I move the data to another storage type, like a blob, or am I not entering the correct host URL?
You'll need to specify the host with escaped backslashes, like "\\\\myserver\\share", if you create the pipeline with JSON directly, or set the host URL as \\myserver\share if you're using the UI to set up the pipeline.
Here is more info:
https://learn.microsoft.com/en-us/azure/data-factory/connector-file-system#sample-linked-service-and-dataset-definitions
I believe when you created the file linked service, you might have chosen the public IR. If you choose the public IR, local paths (e.g. C:\xxx, D:\xxx) are not allowed, because the machine that runs your job is managed by Microsoft and does not contain any customer data. Please use a self-hosted IR to copy your local files.
Based on the link posted by Nicolas Zhang: https://learn.microsoft.com/en-us/azure/data-factory/connector-file-system#sample-linked-service-and-dataset-definitions and the examples provided therein, I was able to solve it and successfully create the copy activity. I had two errors (I'm configuring via the Data Factory UI, not the JSON directly):
In the host path, the correct value should be: \\mystorageaccount.file.core.windows.net\mystoragefilesharename\myfolderpath
The username and password must be those corresponding to the storage account, not the actual user's account, which I was erroneously using. A JSON sketch of the resulting linked service is shown below.
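For reference, here is a rough sketch of the corresponding linked service JSON. The names are illustrative, the backslashes must be doubled in JSON, and the credentials follow the point above (storage account name as the user, a storage account key as the password); treat the exact values as assumptions to adapt:

{
    "name": "AzureFileShareLinkedService",
    "properties": {
        "type": "FileServer",
        "typeProperties": {
            "host": "\\\\mystorageaccount.file.core.windows.net\\mystoragefilesharename\\myfolderpath",
            "userId": "mystorageaccount",
            "password": { "type": "SecureString", "value": "<storage account key>" }
        },
        "connectVia": { "referenceName": "<integration runtime>", "type": "IntegrationRuntimeReference" }
    }
}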

How do I upload a VM to Azure

I see a lot of confusion about how to connect to Azure and upload a VM. It involves creating a management certificate with makecert and uploading with csupload, and there are a lot of flags to get wrong. So I thought I'd ask the question and answer it to save someone some trouble.
(cut from initial question and pasted as an answer)
Basic Principles
You must have Visual Studio and the Azure SDK installed.
To connect to Azure, you create a security certificate on your local machine that identifies you. Then you go to Azure and import the certificate. Now your local machine and Azure are able to talk to each other securely. For this reason you can't start the work on one machine and finish it on another. Work on one machine.
You must have the certificate in your Current User certificate store and also exported to your hard drive. You need a copy on the hard drive to upload to Azure, and you need it in the certificate store because when you connect to Azure that's where it will look for it. You can create it on your hard drive and import it, or you can create it in the certificate store and export it. The following instructions show you how to do the latter.
Create the Certificate
Open a Visual Studio command prompt as an Administrator (right-click the menu item and click "Run as administrator").
Copy/paste the following:
makecert -sky exchange -r -n "CN=MyCertificateName" -pe -a sha256 -len 2048 -ss My "MyCertificateName.cer"
This will create the certificate and install it in your Current User certificate store. It will not create a copy on your hard drive. It's the "My" keyword that causes the certificate to be stored in the certificate store for your current account.
Open the certificate manager by typing certmgr.msc into the Start menu. You should see Certificates - Current User at the top. Open Personal/Certificates and you should see the certificate you just created.
Right-click the certificate and click All Tasks, Export. Click Next. Select "No, do not export the private key". Click Next. Select the DER encoded format. Click Next. Save the certificate somewhere on your hard drive with the same name you used to create it (it doesn't have to be the same, but it avoids confusion).
Import the Certificate into Azure
Log into Azure.
Click Settings then Management Certificates then Upload.
Browse to the management certificate that you just exported and saved, and upload it.
Copy the Subscription Identifier and Thumbprint from the uploaded certificate and paste them into a text file. Save the file on your local hard drive. You need these numbers handy for the next step.
If you want to be safe, delete the certificate that you exported to your hard drive. You don't need it there any more. Azure will look for the certificate in your certificate store when it authorizes you, not on your hard drive.
At this point you are able to make a secure connection between your computer/account and Azure. You will now use this secure connection to upload your Virtual Machine.
Upload your Virtual Machine
First establish a secure connection to Azure. Open an Azure command prompt as an Administrator and enter the following:
csupload Set-Connection "SubscriptionId=YourSubscriptionIdGoesHere;CertificateThumbprint=YourCertificateThumbPrintGoesHere;ServiceManagementEndpoint=https://management.core.windows.net"
Finally it's time to upload the file. Open the Azure portal, select your storage account and copy the blobs service endpoint URL. Enter the following at the same Azure command prompt as above:
csupload Add-PersistentVMImage -Destination "YourBlobServiceEndPointUrlGoesHere/vhds/YourVhdNameGoesHere" -Label YourVhdNameGoesHere -LiteralPath "ThePathToYourVhdOnTheLocalComputerGoesHere" -OS Windows
The VHD should begin to upload.
Here's an easier way. You will need:
Windows Azure PowerShell
1. Open "Windows Azure PowerShell", or open a PowerShell prompt and run:
Set-ExecutionPolicy RemoteSigned
Import-Module "C:\Program Files (x86)\Microsoft SDKs\Windows Azure\PowerShell\Azure\Azure.psd1"
2. Run:
Get-AzurePublishSettingsFile
(This will prompt you to save a .publishsettings file, which is required in the next step.)
3. Import the settings file:
Import-AzurePublishSettingsFile "C:\Temp\Windows Azure...credentials.publishsettings"
4. Upload the VHD:
Add-AzureVhd -Destination "https://<yourstorageaccount>.blob.core.windows.net/vhds/File.vhd" -LocalFilePath "C:\Users\Public\Documents\Hyper-V\Virtual hard disks\File.vhd"
For more info see:
Get Started with Windows Azure Cmdlets
