AWS S3 MultiPart upload to a named directory using C# and the .NET SDK

The following fails with this error message:
"The specified upload does not exist. The upload ID may be invalid, or the upload may have been aborted or completed."
UploadPartRequest uploadRequest = new UploadPartRequest()
.WithBucketName(IniValues.Instance.TargetBucketName)
.WithKey("junk/20070125.log")
.WithUploadId(initResponse.UploadId)
.WithPartNumber(i)
.WithPartSize(partSize)
.WithFilePosition(filePosition)
.WithFilePath("C:\\InetTemp\\Logs\\20070125.log");
The problem is with the .WithKey("junk/20070125.log") call. If I strip out the "junk/" it works perfectly.
So the question is: how do I upload a file to a specific AWS directory? All the documentation I found shows the correct way to be to prepend the directory name and a forward slash. What am I missing?

It turns out I was adding the folder name to the key string after calling InitiateMultipartUploadRequest. Once I changed the key value to be consistent across all the upload calls, it began to work.
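For readers working in JavaScript rather than C#, the same principle can be sketched with the AWS SDK for JavaScript v3 (@aws-sdk/client-s3); the region, bucket name, and chunking here are placeholder assumptions, but note how the key is defined once and reused verbatim in every call of the multipart sequence:

import {
  S3Client,
  CreateMultipartUploadCommand,
  UploadPartCommand,
  CompleteMultipartUploadCommand,
} from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "us-east-1" }); // region is a placeholder
const bucket = "my-target-bucket";                // placeholder
const key = "junk/20070125.log";                  // defined once, reused everywhere

async function uploadInParts(chunks: Buffer[]): Promise<void> {
  const init = await s3.send(
    new CreateMultipartUploadCommand({ Bucket: bucket, Key: key })
  );
  if (!init.UploadId) throw new Error("No UploadId returned");

  const parts: { ETag: string | undefined; PartNumber: number }[] = [];
  for (let i = 0; i < chunks.length; i++) {
    const part = await s3.send(
      new UploadPartCommand({
        Bucket: bucket,
        Key: key, // must match the key passed to CreateMultipartUploadCommand
        UploadId: init.UploadId,
        PartNumber: i + 1, // part numbers start at 1
        Body: chunks[i],
      })
    );
    parts.push({ ETag: part.ETag, PartNumber: i + 1 });
  }

  await s3.send(
    new CompleteMultipartUploadCommand({
      Bucket: bucket,
      Key: key, // and the same key once more
      UploadId: init.UploadId,
      MultipartUpload: { Parts: parts },
    })
  );
}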

Related

When I go to update the picture in Cloudinary it gives the error "no such file or directory"

When I go to update the picture in Cloudinary, it gives the error "no such file or directory, open 'D:/web development/mern-stack-project-course-commerce/backend/undefine'".
Please help and check the images of my backend Cloudinary code, which throws this error. I hope my problem can be solved.
The error message you're seeing is thrown by Node.js code that is part of the Cloudinary Node.js SDK. It indicates one of two things: either there is no file at the path you provided ('D:/web development/mern-stack-project-course-commerce/backend/undefine'), which is likely the issue here since the path ends in 'undefine' rather than an image filename and suggests an undefined variable was interpolated into it, or there is a file but its permissions do not allow your code to open it.
Could you double-check the path, whether there is in fact a file there, and what permissions it has? Perhaps you actually wanted a relative path?
You could also try passing a path to a file that you know definitely exists and has the correct permissions, and see if you can upload that.
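As a rough illustration, assuming the Cloudinary Node.js SDK (v2 API) and a hypothetical uploadImage helper, you could guard against an undefined path like this:

import * as fs from "fs";
import { v2 as cloudinary } from "cloudinary";

// Assumes credentials are configured, e.g. via the CLOUDINARY_URL env var.
// Hypothetical helper: fails fast if the path was built from an undefined
// variable (producing ".../undefined") or points to a file that doesn't exist.
async function uploadImage(filePath: string | undefined) {
  if (!filePath || !fs.existsSync(filePath)) {
    throw new Error(`No file found at path: ${filePath}`);
  }
  return cloudinary.uploader.upload(filePath);
}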

How to Download a File (from URL) in TypeScript

Update: This question used to ask about Google Cloud Storage, but I have since realized the issue is actually reproducible merely by trying to save the download to local disk. Thus, I am rephrasing the question to be entirely about file downloads in TypeScript and to no longer mention Google Cloud Storage.
When attempting to download and save a file in TypeScript with WebRequest (though I experienced the same issue with request and request-promise), all the code seems to execute correctly, but the resultant file is corrupted and cannot be viewed. For example, if I download an image, the file is not viewable in any application.
// Seems to work correctly
const download = await WebRequest.get(imageUrl);
// `Buffer.from()` also takes an `encoding` parameter, but it's unclear how to determine the encoding of a download
const imageBuffer = Buffer.from(download.content);
// I *think* this line is straightforward
const imageByteArray = new Uint8Array(imageBuffer);
// Saves a corrupted file
fs.writeFileSync("/path/to/file.png", imageByteArray);
I suspect the issue lies within the Buffer.from call not correctly interpreting the downloaded content, but I'm not sure how to do it right. Any help would be greatly appreciated.
Thanks so much!
From what I saw in the examples for web-request, download.content is just a string. If you want to upload a string to Cloud Storage using the Node SDK, you can use File.save, passing that string directly.
Alternatively, you could use one of the solutions seen here.
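For the local-disk case, one way to avoid the string-decoding problem entirely is to collect the raw response bytes yourself. A minimal sketch with Node's built-in https module (no redirect handling, so only a starting point):

import * as fs from "fs";
import * as https from "https";

// Accumulate raw Buffer chunks instead of letting a client library
// decode the body into a string, so binary content survives intact.
function downloadBinary(url: string, destPath: string): Promise<void> {
  return new Promise((resolve, reject) => {
    https
      .get(url, (res) => {
        const chunks: Buffer[] = [];
        res.on("data", (chunk: Buffer) => chunks.push(chunk));
        res.on("end", () => {
          fs.writeFileSync(destPath, Buffer.concat(chunks));
          resolve();
        });
        res.on("error", reject);
      })
      .on("error", reject);
  });
}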

Can't create .zip file with Azure Logic Apps' SharePoint Create File action

I'm trying to create a .zip file by passing the returned body of an HTTP GET request to SharePoint's Create File.
Body is:
{
"$content-type": "application/zip",
"$content": "UEsDBBQACA...="
}
Shouldn't this just work? The docs define the Create File content field only as "Content of the file.", which isn't very informative...
I believe I've done this before with a file that was application/pdf and it worked. Unfortunately, I can't find that Logic App (I think it may have been an experiment I've since deleted).
I should note that the Create File action does create a valid .zip file, in that it's not corrupt, but the archive is empty. It's supposed to contain a single .csv file.
I tried decoding the Base64 content and it's definitely binary data.
Any idea where I'm going wrong?
I tested with Postman, and when I POSTed the request as form-data, I found the .zip file couldn't be opened. Then I checked the Logic App run history and found the problem: if you just use triggerBody() as the file content, it will fail.
This is because triggerBody() contains more than just the $content, so I changed the expression to triggerBody()['$multipart'][0]['body']; then it works and the .zip file is complete.
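To illustrate why the extra indexing is needed, here is an approximate sketch (in TypeScript, shape only; the exact structure depends on your trigger) of what a multipart trigger body can look like:

// Approximate shape of a multipart trigger body. Passing the whole
// object to Create File embeds the multipart envelope in the file;
// the expression triggerBody()['$multipart'][0]['body'] selects just
// the file part itself.
const triggerBody = {
  $multipart: [
    {
      headers: { "Content-Disposition": 'form-data; name="file"' },
      body: {
        "$content-type": "application/zip",
        "$content": "UEsDBBQACA...=", // Base64 payload from the question
      },
    },
  ],
};

const fileContent = triggerBody.$multipart[0].body;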

Can we read files from a server path using any fs method in Node.js

In my case I need to read file/icon.png from a cloud storage bucket via a token-based URL/path; the token resides in the request header.
I tried fs.readFile('serverpath') but it returned the error 'ENOENT', i.e. 'No such file or directory', even though the file exists at that path. So can these fs methods make calls and read files from a server, or do they only work with static local paths? If the latter, how do I read a file from the cloud bucket/server in my case?
I need to pass that file path to the UI to show the icon.
Use this lib to handle GCS operations:
https://www.npmjs.com/package/@google-cloud/storage
If you do need to use fs, install gcsfuse (https://cloud.google.com/storage/docs/gcs-fuse), mount the bucket to your local filesystem, then use fs as you normally would.
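For example, a minimal sketch of reading the object with that client (the bucket name is a placeholder; credentials are assumed to come from the environment):

import { Storage } from "@google-cloud/storage";

const storage = new Storage(); // uses application default credentials

// Download the object's bytes so they can be served to the UI.
async function fetchIcon(): Promise<Buffer> {
  const [contents] = await storage
    .bucket("my-bucket")   // placeholder
    .file("file/icon.png") // path from the question
    .download();
  return contents;
}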
I would like to complement Cloud Ace's answer by saying that if you have the Storage Object Admin permission, you can make the image public and use its URL like any other public URL.
If you don't want to make the URL public you can get temporary access to the file by creating a signed URL.
Otherwise, you'll have to download the file using the GCS Node.js Client.
I posted this as an answer as it is quite long to be a comment.
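For reference, here is a sketch of the signed-URL approach with the same Node.js client (the bucket name is a placeholder and the expiry is arbitrary):

import { Storage } from "@google-cloud/storage";

const storage = new Storage();

// Generate a URL that grants temporary read access to the object.
async function getTemporaryUrl(): Promise<string> {
  const [url] = await storage
    .bucket("my-bucket")   // placeholder
    .file("file/icon.png") // path from the question
    .getSignedUrl({
      action: "read",
      expires: Date.now() + 15 * 60 * 1000, // valid for 15 minutes
    });
  return url;
}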

Setting Metadata in Google Cloud Storage (Export from BigQuery)

I am trying to update the metadata (programmatically, from Python) of several CSV/JSON files that are exported from BigQuery. The application that exports the data is the same as the one modifying the files (thus using the same server certificate). The export goes all well, that is until I try to use the objects.patch() method to set the metadata I want. The problem is that I keep getting the following error:
apiclient.errors.HttpError: <HttpError 403 when requesting https://www.googleapis.com/storage/v1/b/<bucket>/<file>?alt=json returned "Forbidden">
Obviously, this has something to do with bucket or file permissions, but I can't manage to get around it. How come, if the same certificate is used for writing files and updating file metadata, I'm unable to update it? The bucket was created with the same certificate.
If that's the exact URL you're using, it's a URL problem: you're missing the /o/ between the bucket name and the object name.
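With the /o/ segment in place (keeping the placeholders from the question), the request URL looks like:

https://www.googleapis.com/storage/v1/b/<bucket>/o/<file>?alt=json

Note that if the object name itself contains slashes, they need to be URL-encoded in that path segment.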
