Provide a videoUrl with query parameters in it to the Video Indexer - Azure

We're in the process of implementing the Video Indexer.
To upload videos, we'd like to use the videoUrl method instead of uploading the video file. For this we're using URLs of videos on our blob storage. These require a SAS token to be served, so the URL contains query parameters.
However, I'm unable to provide a videoUrl with query parameters to the Video Indexer's upload endpoint.
Example of a test request:
https://api.videoindexer.ai/trial/Accounts/MY_ACCOUNT_ID/Videos?accessToken=MY_ACCESS_TOKEN&name=interview&description=interview&privacy=private&partition=some_partition&indexingPreset=AudioOnly&streamingPreset=NoStreaming&videoUrl=https://manualtovideos.blob.core.windows.net/asset-xxxxx/interview.mp4?sp=rl&st=2020-12-03T16:48:42Z&se=2020-12-04T16:48:42Z&sv=2019-12-12&sr=b&sig=l57dDjKYr...8%25253D
When I shorten the blob URL using a URL shortener service, it works.
The docs say I need to URL-encode the videoUrl, so I'm doing that with JavaScript's encodeURI().
But this doesn't change the URL much, since encodeURI() leaves ?'s and &'s intact.
Do I need to encode the URL in a different way somehow?
Or is there another way to authenticate, so I can use the blob URL without the SAS token, since it's also on Azure?

You need to encode the URL.
You can see how the upload request is constructed by using the upload method in the Azure Video Analyzer for Media Developer Portal.

So it turned out I needed to use encodeURIComponent() to encode the videoUrl parameter, instead of just encodeURI() or escape().

Related

Can I upload the file to blob storage and give that path to DocuSign instead of base64 or bytes?

I have larger files, and it takes too much time when I send base64 or even bytes. I am looking for an option where I can give the blob URL directly to the DocuSign envelope. If there is any option, please help.
Yes, you can upload documents in binary format and from cloud providers.
Cloud providers
DocuSign can load directly from
Google Drive
Dropbox
Box
OneDrive (Personal and Business)
Documentation. To use this feature as a developer, first set it up using the DocuSign web app. Then use DocuSign's API logging to see how the API calls reference the files on the cloud servers.
Also see the docs for the Document.remoteUrl attribute.
Use multi-part MIME format with the regular Envelopes:create call
Multi-part MIME enables you to include one or more documents in binary format. Docs. Example using Node.js. Examples for other languages are also available as part of the QuickStart code examples.
Use Chunked Uploads
The Chunked Uploads API resource enables you to upload a document in binary format, in multiple parts if needed, then use the resulting reference URL when creating an envelope.
The API Request Builder uses chunked uploads. So you can monitor the API via API logging while using the API Request Builder to see what it is doing.
In any case, remember that there's a 34 MB maximum request size for any API call to DocuSign. There are other limits too. Discuss with your DocuSign contacts if you have any issues with limits.
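As a rough sketch, an envelope definition referencing a cloud-hosted file via the Document.remoteUrl attribute might look something like this. All values here (URL, names, IDs) are placeholders, and remoteUrl only works once the cloud-provider setup described above is in place — use API logging to confirm the exact shape your account produces:

```javascript
// Hypothetical envelope definition using remoteUrl instead of documentBase64.
// Field names follow the eSignature REST API's Document object; everything
// else is a placeholder for illustration.
const envelopeDefinition = {
  emailSubject: 'Please sign this document',
  status: 'sent',
  documents: [
    {
      documentId: '1',
      name: 'contract.pdf',
      fileExtension: 'pdf',
      remoteUrl: 'https://example.com/path/to/contract.pdf', // placeholder
    },
  ],
  recipients: {
    signers: [
      { email: 'signer@example.com', name: 'A. Signer', recipientId: '1' },
    ],
  },
};
```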

AWS S3 bucket presigned URL issue

I have written an API to read user posts after a token is passed in the header. The API returns many posts at a time, with nested attachment names for images, videos, etc. Currently, all attachments are stored in a folder on the server. Attachments are served by another API, which accepts an attachment name as a parameter and returns the attachment URL.
Now I want to move to an AWS S3 bucket with the same concept, using presigned URLs.
The API is used by a Flutter app.
I created an API which accepts the user auth token and returns an upload presigned URL for the S3 bucket.
For displaying attachments, I am considering two options:
Create another API which accepts the attachment name (object key) and returns a presigned URL for that attachment.
Have the post API return JSON data after replacing all attachment names with presigned URLs. But this will take too long for nested JSON data, because of the looping.
I am new to AWS S3. Please guide me on the best way to handle this.
How do Facebook, Twitter, and Instagram handle private files?
The best option I see is returning the post data with already-generated presigned URLs.
Having a separate API for generating the presigned URL would mean:
the system will need to authorize the input anyway (i.e. check whether the user is authorized to access the particular object)
the user has to wait until the signed links are generated anyway, just with an additional call
this will take too long for nested json data by looping
Not really: generating a presigned URL makes no backend / S3 calls; it's just a bunch of HMAC computations. And, as already mentioned, these need to be done regardless of which option is chosen.

Get Records from Salesforce using Azure Logic Apps

I am trying to use Azure Logic Apps to read data from one of the Salesforce objects, which has a huge number of records. The Salesforce connector fetches the records and returns the pagination link as #odata.nextLink. But when I try to use a JSON parser to read the value, it comes back as null.
I tried to access the nextLink in a browser, but it requires authentication. What authentication do we pass here?
I would like to use an Until action to iterate until there is no next link. So how do I check the condition for the Until loop?
nextLink doesn't look like a core Salesforce thing; it might be the OData connector preprocessing the results for you. You'd have to consult the documentation for the connector (if any).
Salesforce's REST API will return a field with nextRecordsUrl if there's a next page of results; you'd call that in a loop until the field disappears. You'd call it like any other REST API resource available after login, by passing Authorization: Bearer <sessionId, also known as accessToken here>. Again - probably the connector abstracts this away from you. Don't think you can send headers like that in a browser; you'd need curl, Postman, SoapUI or a similar HTTP client.
If you don't get a better answer and documentation is scarce, consider using the raw REST API. Or Azure Data Factory, which has an almost-decent Salesforce connector?
https://developer.salesforce.com/docs/atlas.en-us.api_rest.meta/api_rest/dome_query.htm
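The nextRecordsUrl loop described above can be sketched as follows. The instance URL, token, and API version are placeholders, and `fetchImpl` is injectable so the pagination logic can be exercised without a live Salesforce org:

```javascript
// Query all records, following nextRecordsUrl until it disappears.
// `fetchImpl` defaults to the global fetch (Node 18+); pass a stub for testing.
async function queryAll(instanceUrl, accessToken, soql, fetchImpl = fetch) {
  const headers = { Authorization: `Bearer ${accessToken}` };
  // v52.0 is an example API version.
  let url = `${instanceUrl}/services/data/v52.0/query?q=${encodeURIComponent(soql)}`;
  const records = [];
  while (url) {
    const res = await fetchImpl(url, { headers });
    const page = await res.json();
    records.push(...page.records);
    // Salesforce includes nextRecordsUrl only when more pages remain.
    url = page.nextRecordsUrl ? instanceUrl + page.nextRecordsUrl : null;
  }
  return records;
}
```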

Can we hit a URL and check for the response as a job in Azure Data Factory?

Say I have a URL like (https://hello.world.com). Is it possible to hit this URL in Azure Data Factory and observe the response?
Yes, you can, using the Web activity.
Specify the URL you want and choose GET as the method.
This will output the HTML content in the response element of the activity's output JSON.
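For reference, a minimal sketch of what the Web activity might look like in pipeline JSON (the activity name is a placeholder; check the Web activity docs for optional fields such as authentication and headers):

```json
{
  "name": "CheckUrl",
  "type": "WebActivity",
  "typeProperties": {
    "url": "https://hello.world.com",
    "method": "GET"
  }
}
```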

NodeJS - Protect image url to only authorized user

So I'm currently building an app. Users have the possibility to upload images. The images they upload should be visible only to them. So I'm wondering how I can achieve this, considering each image is available through a URL.
What I was thinking was that, in order to get the picture, the user would make a REST API request, and the API would return the image data if the user has the correct permission.
The frontend of my app is in React and the backend (REST API) in Node.js.
Edit: the images are actually stored on AWS S3 (this can change if needed).
Thanks!
The best option to allow only authorized users to fetch an image is an S3 presigned URL. You can refer to the article; it thoroughly describes how to implement S3 presigned URLs. There is also a code example with Node.js. If you code in another language, just google "AWS S3 Presigned URL" and you will find it.
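A sketch of the flow: the REST API authorizes the user, then hands back a short-lived presigned URL instead of proxying the image bytes itself. Here `getPresignedUrl` is a hypothetical stand-in for the AWS SDK's getSignedUrl helper (e.g. from @aws-sdk/s3-request-presigner); the ownership check is the part your backend must supply:

```javascript
// Only return a presigned URL if the requesting user owns the image.
// `getPresignedUrl` is a placeholder for the real AWS SDK signer.
function imageUrlForUser(user, image, getPresignedUrl) {
  if (image.ownerId !== user.id) {
    throw new Error('forbidden'); // only the uploader may view the image
  }
  return getPresignedUrl(image.key); // short-lived link the client can use directly
}
```

The React frontend then requests this endpoint and uses the returned URL as the `src` of the image, so the image bytes never pass through the Node.js server.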
