Unable to upload a PDF to DocuSign via CLM API - docusignapi

I have been tasked with integrating with your CLM API. I am not an expert, and I have only been able to get as far as the API creating a "dummy" file with the correct name, but it is not getting replaced with my file.
I set up my code using the configuration from a Postman collection received from one of your colleagues, but it only covers uploading a file from a local path, while my requirement is to move the file from Azure Blob Storage.
I am attaching the POST call payload and the file size before/after for your reference.
Do you have any existing code samples (JavaScript or C#) that could facilitate this?
Many thanks,
Piotr
(Attached screenshots: file size before/after, POST call payload.)

This is not supported. You have to first make an API call to download the file from Azure Blob Storage to a local file on your server, and then make another API call to upload that local file to CLM. You cannot go directly from Azure to CLM.
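There is no official sample for this exact flow, but a minimal C# sketch of the two hops might look like the following. It assumes the Azure.Storage.Blobs SDK; the CLM endpoint, access token, and blob names are placeholders — reuse the upload request from your Postman collection for the second hop.

```csharp
using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using Azure.Storage.Blobs;

// Hop 1: download the blob from Azure to a local temp file.
var blob = new BlobClient("<storage-connection-string>", "documents", "contract.pdf");
string localPath = Path.Combine(Path.GetTempPath(), "contract.pdf");
await blob.DownloadToAsync(localPath);

// Hop 2: upload the local file to CLM. The URL and token below are
// placeholders -- use the same request your Postman collection makes
// for a local-path upload.
using var http = new HttpClient();
http.DefaultRequestHeaders.Authorization =
    new AuthenticationHeaderValue("Bearer", "<clm-access-token>");
using var content = new StreamContent(File.OpenRead(localPath));
content.Headers.ContentType = new MediaTypeHeaderValue("application/pdf");
HttpResponseMessage response =
    await http.PostAsync("<clm-upload-url-from-postman-collection>", content);
response.EnsureSuccessStatusCode();
```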

Related

Security concerns when uploading data to Azure Blob Storage directly from frontend

I am working on a project where we have to store some audio/video files in Azure Blob Storage, and after a file is uploaded we need to calculate a price based on the length of the file in minutes. We have an Angular frontend, and the idea was to upload the file directly from the frontend, get the response from Azure with the file stats, then call a backend API to put that data in the database.
What I am wondering is: what are the chances of the data being manipulated between getting the file stats back from Azure and calling our backend API? Is there any chance the length could be modified before it is sent to our API?
One possible solution would be to make use of Azure Event Grid with its Blob Storage integration. Whenever a blob is uploaded, an event is raised automatically, which you can consume in an Azure Function to save the data in your database.
There is a possibility that a user re-uploads the same file with a different size. If that happens, you will get another event (apart from the original event when the blob was first created). How you handle such updates is entirely up to you.
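A minimal sketch of such a function, assuming the in-process Azure Functions model with the Event Grid trigger (property names follow the Microsoft.Storage.BlobCreated event schema; the database write is left as a stub):

```csharp
using System.Text.Json;
using Azure.Messaging.EventGrid;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.EventGrid;
using Microsoft.Extensions.Logging;

public static class OnBlobCreated
{
    // Subscribed to Microsoft.Storage.BlobCreated events for the container.
    [FunctionName("OnBlobCreated")]
    public static void Run([EventGridTrigger] EventGridEvent e, ILogger log)
    {
        // The BlobCreated payload carries the blob URL and its size in bytes.
        JsonElement data = e.Data.ToObjectFromJson<JsonElement>();
        string url = data.GetProperty("url").GetString();
        long bytes = data.GetProperty("contentLength").GetInt64();

        log.LogInformation("Blob created: {Url} ({Bytes} bytes)", url, bytes);

        // Compute the duration/price server-side (e.g. by reading the media
        // header from the blob itself) and persist it here -- never trust
        // values posted by the frontend.
    }
}
```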

Azure Storage: How to upload a .pdf or .docx file using the REST API

Recently I have been working on adding documents to Azure Storage using Blob Storage and File Share. I realized that with the File Share REST API I can only upload in two steps:
Creating a file
Adding content
I am able to do that, but my requirement here is to upload the .pdf or .docx document at once,
and there should be a way to download the documents as well.
Could someone please help?
Thanks
Unfortunately, there's no batch download capability available in Azure Blob Storage. You will need to download each blob individually. What you could do is download blobs in parallel to speed things up.
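A sketch of that parallel download using the current Azure.Storage.Blobs SDK (.NET 6+, so Parallel.ForEachAsync is available; connection string, container, and paths are illustrative):

```csharp
using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

var container = new BlobContainerClient("<connection-string>", "documents");

// List the blob names first, then download up to 8 of them at a time.
var names = new List<string>();
await foreach (var item in container.GetBlobsAsync())
    names.Add(item.Name);

Directory.CreateDirectory("downloads");
await Parallel.ForEachAsync(
    names,
    new ParallelOptions { MaxDegreeOfParallelism = 8 },
    async (name, ct) =>
    {
        string localPath = Path.Combine("downloads", name.Replace('/', '_'));
        await container.GetBlobClient(name).DownloadToAsync(localPath, ct);
    });
```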
There is an alternative approach you can take using C# or PowerShell.
I would recommend going through this Microsoft document:
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blob-scalable-app-download-files?tabs=dotnet
and this one as well:
https://azurelessons.com/upload-and-download-file-in-azure-blob-storage/
Reference: How to download multiple files in a single request from Azure Blob Storage using c#?
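For the upload-at-once part of the question: block blobs (unlike File Share, with its create-then-write sequence) accept the whole document in a single Put Blob request, which the SDK wraps for you. A hedged sketch with Azure.Storage.Blobs (names and paths are illustrative):

```csharp
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

var container = new BlobContainerClient("<connection-string>", "documents");
await container.CreateIfNotExistsAsync();

// Upload the whole .docx in one call (a single Put Blob request for
// files under the one-shot size limit; the SDK chunks larger ones).
BlobClient blob = container.GetBlobClient("report.docx");
await blob.UploadAsync("local/report.docx", new BlobUploadOptions
{
    HttpHeaders = new BlobHttpHeaders
    {
        ContentType = "application/vnd.openxmlformats-officedocument.wordprocessingml.document"
    }
});

// And download it back in one call.
await blob.DownloadToAsync("downloaded-report.docx");
```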

With Azure Logic Apps, when a blob is added or modified, how do I get the filename of the uploaded file?

I need to get the file name of a file when it is uploaded to blob storage using Logic Apps. I'm new to Logic Apps and this seems like it should be easy, but I'm not having any luck.
To try to find the filename, I'm sending everything that's available to me in an email. I will eventually use the filename as part of an HTTP POST to another service.
The Logic App is triggered as it should be when I upload, but I do not get any data in my email for the items I chose. I am not uploading to a subfolder. I've looked at the code view and searched other posts but am not finding a solution. Any help is most appreciated.
Thanks
Instead of using the built-in When a blob is added or modified Azure Storage trigger, try using When a blob is added or modified (properties only) (V2), and use the List of Files Display Name dynamic content to get the file name.
(Screenshots: the overall Logic App flow, and the resulting email in Outlook.)

Logic Apps Azure Data Lake Upload File action: large files fail to upload with 413 status code

I am trying to upload a file to Azure Data Lake using the Azure Data Lake Upload File action in Logic Apps. It works fine for small files of about 20 MB, but files of 28 MB or greater fail with status code 413 (Request Entity Too Large).
I have also enabled chunking in the Upload File action. Is there any solution for this?
Thanks for the response, George.
I have found a workaround. My scenario involves getting a file from SharePoint Online and uploading it to Azure Data Lake. In the earlier setup, which had the above issue, I used the SharePoint trigger When a file is created or modified in a folder, which returns the file content, to get the file from SharePoint, and the Data Lake Upload File action to upload it to Azure Data Lake. This setup failed for files larger than 27 MB (413, Request Entity Too Large) in the Upload File action even when chunking was enabled on it.
After some troubleshooting, I arrived at a workaround that uses another SharePoint trigger, When a file is created or modified in a folder (properties only), which returns metadata instead of file content. After getting the metadata, I used the SharePoint Get file content action to retrieve the file content and upload it to Azure Data Lake, which worked fine.
Logic Apps has limits on message size; for the specifics, see Logic Apps limits and configuration.
However, actions that support chunking can access message content beyond those limits, so you just need to turn Allow chunking on.
I tested with a 40 MB blob file and it succeeded. For more information, refer to this doc: Handle large messages with chunking in Azure Logic Apps. Hope this helps.

Windows Azure requests

I have an application deployed on Windows Azure. The application has a report feature that works as shown below.
The application generates the report as a PDF file and saves it in a certain folder within the application.
I have a PDF viewer in the application that takes the URL of the file and displays it.
As you know, on Windows Azure I will have several VMs handled through a load balancer, so I cannot ensure that the request in step 2 goes to the same VM as in step 1, and this causes a problem for me.
Any help is very appreciated.
I know that I can use blob storage, but that is not the problem.
The problem is that after creating the file on a certain VM, I give the PDF viewer the URL of the PDF file as "http://..../file.pdf". This generates a new request that I cannot control, and I cannot know which VM will serve it, so even if I saved the file in blob storage it would not solve my problem.
As in any farm environment, you have to save files in storage that is common to all machines in the farm. In Windows Azure, that common storage is Windows Azure Blob Storage.
You have to make some changes to your application so that it saves the files to blob storage. If these are public files, you just mark the blob container as public and provide the full URL of the file in blob storage to the PDF viewer.
If your PDF files are private, you have to mark your container as private. The second step is then to generate a Shared Access Signature (SAS) URL for the PDF and provide that URL to the PDF viewer.
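This answer predates the current SDKs, but with today's Azure.Storage.Blobs the SAS step reduces to a couple of lines, sketched below (connection string, container, and blob names are placeholders; GenerateSasUri requires the client to have been created with account-key credentials, e.g. from a connection string):

```csharp
using System;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

var blob = new BlobClient("<connection-string>", "reports", "file.pdf");

// Read-only link that expires in 15 minutes; hand this to the PDF viewer.
Uri readUrl = blob.GenerateSasUri(
    BlobSasPermissions.Read,
    DateTimeOffset.UtcNow.AddMinutes(15));

Console.WriteLine(readUrl);
```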
Furthermore, while developing you can explore your Azure storage using any of the (freely and not so freely) available tools for Windows Azure Storage. Here are some:
Azure Storage Explorer
Azure Cloud Storage Studio
There are a lot of samples showing how to upload files to Azure Storage; just search with your favorite search engine. Check out these resources:
http://msdn.microsoft.com/en-us/library/windowsazure/ee772820.aspx
http://blogs.msdn.com/b/windowsazurestorage/archive/2010/04/11/using-windows-azure-page-blobs-and-how-to-efficiently-upload-and-download-page-blobs.aspx
http://wely-lau.net/2012/01/10/uploading-file-securely-to-windows-azure-blob-storage-with-shared-access-signature-via-rest-api/
The Windows Azure Training Kit has a great lab named "Exploring Windows Azure Storage".
Hope this helps!
UPDATE (following question update)
The problem is that after creating the file on a certain VM, I give the PDF viewer the URL of the PDF file as "http://..../file.pdf". This generates a new request that I cannot control, and I cannot know which VM will serve it, so even if I saved the file in blob storage it would not solve my problem.
Try changing your logic a bit and follow my instructions. When your VM creates the PDF, upload the file to a blob, then give the full blob URL of your PDF file to the PDF viewer. That way the request will not go to any VM, but straight to the blob, and the full blob URL will be something like http://youraccount.blob.core.windows.net/public_files/file.pdf
Or am I missing something? As I understand it, your process flow is as follows:
User makes a special request which would cause PDF file generation
File is generated on the server
The full URL to the file is sent back to the client so that a client-side PDF viewer can render it
If this is the flow, then with the suggested changes it will look like the following:
User makes a special request which causes PDF file generation
File is generated on the server
File is uploaded to a BLOB storage
The full URL of the file in blob storage is returned to the client, so that it can be rendered on the client.
What is not clear? Or what is different in your process flow? I do exactly the same for on-the-fly report generation and it works quite well. The only difference is that my app is Silverlight based and I force a file download instead of displaying inline.
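To make the suggested flow concrete, a minimal C# sketch with the current Azure.Storage.Blobs SDK (the original answer's era used the older storage client; container and file names here are illustrative, and the container is public as in the answer above):

```csharp
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

// Step 3 of the flow: the VM that generated the PDF pushes it to blob storage.
var container = new BlobContainerClient("<connection-string>", "public-files");
await container.CreateIfNotExistsAsync(PublicAccessType.Blob);

BlobClient blob = container.GetBlobClient("file.pdf");
await blob.UploadAsync("generated/file.pdf", overwrite: true);

// Step 4: return this URL to the client. It points at blob storage,
// not at any particular VM, so the load balancer is out of the picture.
string urlForViewer = blob.Uri.ToString();
```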
An alternative approach is not to persist the file at all.
Rather, generate it in memory, set the content type of the response to "application/pdf", and return the binary content of the report. This is particularly easy if you're using ASP.NET MVC, but you can use an HttpHandler instead. It is a technique I regularly use in similar circumstances (though lately with Excel reports rather than PDF).
The usefulness of this approach does depend on how you're generating the PDF, how big it is and what the load is on your application.
But if the report is to be served only once, persisting it just so that the browser can make another request to retrieve it is wasteful (and you have to provide the persistence mechanism).
If the same file is to be served multiple times and is resource-intensive to create, then it makes sense to persist it.
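A sketch of that technique in classic ASP.NET MVC, matching the answer's vintage (BuildReportPdf is a hypothetical stand-in for whatever library renders the report in memory):

```csharp
using System.Web.Mvc;

public class ReportsController : Controller
{
    // Streams the PDF straight from memory; nothing is persisted server-side.
    public ActionResult MonthlyReport()
    {
        byte[] pdfBytes = BuildReportPdf(); // hypothetical in-memory generator

        // File() sets the Content-Type header; most browsers render
        // application/pdf inline. Pass a third argument (a file name)
        // to force a download instead.
        return File(pdfBytes, "application/pdf");
    }

    private static byte[] BuildReportPdf()
    {
        // Stand-in: replace with your PDF library writing to a MemoryStream.
        return new byte[0];
    }
}
```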
You want to save your PDF to centralized, persistent storage; a VM's hard drive is neither. Azure Blob Storage is likely the simplest and best solution: it is dirt cheap to store and access, and the API for storing and retrieving files is very simple.
There are two things you could consider.
Windows Azure Blob + Queue Storage
Blob Storage is a cost-effective way of storing binary data and sharing it between instances. You would most likely use a worker role to create the report, which would store the report in Blob Storage and drop a "completed" message on the queue.
Your web role instance could monitor the queue looking for reports that are ready to be displayed.
It would be similar to the concept used in the Windows Azure Guest Book app.
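A sketch of that hand-off with the current Azure.Storage.Blobs/Queues SDKs rather than the old Guest Book-era client (connection string, container, and queue names are illustrative):

```csharp
using Azure.Storage.Blobs;
using Azure.Storage.Queues;

// Worker role: store the finished report, then signal completion.
var blob = new BlobClient("<connection-string>", "reports", "report-42.pdf");
await blob.UploadAsync("generated/report-42.pdf", overwrite: true);

var queue = new QueueClient("<connection-string>", "completed-reports");
await queue.CreateIfNotExistsAsync();
await queue.SendMessageAsync(blob.Uri.ToString()); // "report 42 is ready"

// Web role: poll the queue for completed reports.
var messages = await queue.ReceiveMessagesAsync(maxMessages: 1);
foreach (var msg in messages.Value)
{
    string reportUrl = msg.Body.ToString(); // hand this to the UI
    await queue.DeleteMessageAsync(msg.MessageId, msg.PopReceipt);
}
```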
Windows Azure Caching Service
Similarly (and much more expensively), you could share the binary using the Caching Service. This gives you a common layer between your VMs in which to store things; however, you won't be able to provide a URL to the PDF, so you'd have to download the binary and either use an HttpHandler or change the content type of the response.
This would be much harder to implement, very expensive to run, and is not guaranteed to work in your scenario. I'd still suggest blobs over any other means.
Another option would be to implement a sticky session handler of your own. Take a look at:
http://dunnry.com/blog/2010/10/14/StickyHTTPSessionRoutingInWindowsAzure.aspx
