Might just be an edge case, but I extracted a zip file to a directory using the zipfile module. When extracting, zipfile names the directory it extracts into based on the folder stored in the archive.
Is there a way to specify the name of the folder zipfile creates when extracting the files? I am hitting an error because I am testing with a zip of the same folder, so extraction keeps producing the old folder name, which already exists, and that raises an error. Here is my code:
import os
import zipfile

# jobFolder, name, directory (path to the zip) and newFoldername are defined earlier
orginalFolderName = jobFolder + name
with zipfile.ZipFile(directory, "r") as zip_ref:
    zip_ref.extractall(jobFolder)
# rename the extracted folder to the name I actually want
os.rename(orginalFolderName, newFoldername)
directory = newFoldername
with zipfile.ZipFile(filepath) as z:
    z.extractall(dest_folder)

filepath - complete path of the zip file
dest_folder - destination folder
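Note that extractall only controls the destination directory; the name of the top-level folder still comes from the archive itself. One way around the "folder already exists" problem is to extract into a fresh temporary directory and then move the result to whatever name you want. A minimal sketch, assuming the archive contains a single top-level folder (the function and variable names here are illustrative):

import os
import shutil
import tempfile
import zipfile

def extract_as(zip_path, target_folder):
    # Extract zip_path so its contents end up under target_folder,
    # regardless of what the folder inside the archive is called.
    with tempfile.TemporaryDirectory() as tmp:
        with zipfile.ZipFile(zip_path, "r") as zf:
            zf.extractall(tmp)
        extracted = os.path.join(tmp, os.listdir(tmp)[0])  # the single top-level folder
        if os.path.exists(target_folder):
            shutil.rmtree(target_folder)  # or raise, if overwriting is not wanted
        shutil.move(extracted, target_folder)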
I am trying to delete the folder named "repro" and its contents in my build drop location. I have configured my Delete Files step as below:
Source Folder: $(BuildDropLocation)\$(BuildNumber)\CTrest\lime
Contents:
**/repro/*
The repro folder resides here:
$(BuildDropLocation)\$(BuildNumber)\CTrest\lime\version\package\code\repro\...
Is there something that I am missing here?
Here is the doc for the command: Delete Files task. Examples of contents:
**/temp/* deletes all files in any sub-folder named temp.
**/temp* deletes any file or folder with a name that begins with temp.
I think **/repro* will be more suitable in your case, since you want to delete the repro folder itself rather than only the files inside it.
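If it helps to see the difference concretely, here is a small Python illustration using glob patterns that behave similarly to the task's minimatch patterns for this case (the directory layout is hypothetical):

from pathlib import Path

root = Path(r"C:\drop\1.0.0\CTrest\lime")  # hypothetical drop location

# **/repro/* matches only the files *inside* any folder named repro
inside_repro = list(root.glob("**/repro/*"))

# **/repro* matches any file or folder whose name starts with "repro",
# including the repro folder itself - which is what should be deleted
repro_itself = list(root.glob("**/repro*"))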
I am trying to split an image dataset into train and test sets, and for this I am copying the images from one folder to another in Python. I have given the addresses of both the source and the destination, but the copy fails with the error above: it cannot find the image files to copy, even though the image path I am giving, "C:\Users\DELL\coil-20-unproc\imagename", is correct.
import os
import shutil

original_dataset_dir = r"C:\Users\DELL\coil-20-unproc"
# train_obj1_dir (the destination folder) is defined earlier

# Copy object1 images to train_obj1_dir
fnames = ['obj1_{}.png'.format(i) for i in range(0, 72)]
for fname in fnames:
    src = os.path.join(original_dataset_dir, fname)
    dst = os.path.join(train_obj1_dir, fname)
    shutil.copyfile(src, dst)
Jupyter Notebook and JupyterLab resolve relative paths against the directory they were started from. You can try these:
Copy the file to your startup directory.
(You could enter !pwd in a cell and execute it to find out your startup directory.)
Create a link from a file in your startup directory to that file.
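Alternatively, you can sidestep the startup-directory question entirely by checking the working directory from Python and using absolute paths. A small sketch, reusing the dataset path from the question:

import os

# relative paths in a notebook resolve against this directory
print(os.getcwd())

# building the path explicitly avoids depending on where Jupyter was started
original_dataset_dir = r"C:\Users\DELL\coil-20-unproc"
print(os.path.exists(os.path.join(original_dataset_dir, "obj1_0.png")))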
I have this Node.js Lambda function where some files are in a subfolder, like this:
- index.js
- connectors/
  - affil.js
I get a "Cannot find module" error when trying to require the affil.js file, and trying to read it with fs.readFile returns an access-denied error.
When I move the file to the root folder, it is accessible. Is there a requirement that Lambda function files must all be in the root directory? How can I fix that?
Mostly the problem is in how the files are zipped. Instead of zipping the root folder, you have to select all the files and subfolders and zip those, so that index.js sits at the top level of the archive.
Upload all files and subfolders that way, and include the node_modules folder in the zip as well.
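For reference, the difference is only in how the archive is laid out: index.js must sit at the root of the zip, not inside a wrapping folder. A minimal Python sketch of building such an archive (index.js and connectors/ are from the question; the my-function directory and function.zip names are made up, and cd my-function && zip -r ../function.zip . achieves the same from a shell):

import os
import zipfile

def zip_function_dir(src_dir, zip_path):
    # Zip the *contents* of src_dir so index.js ends up at the archive root.
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(src_dir):
            for name in files:
                full = os.path.join(root, name)
                # arcnames are relative to src_dir: "index.js", "connectors/affil.js", ...
                zf.write(full, arcname=os.path.relpath(full, src_dir))

zip_function_dir("my-function", "function.zip")  # hypothetical paths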
As pointed out by @Vijayanath Viswanathan, the issue is with how the zip file is created rather than with Lambda.
I used to feed gulp-zip with this:
var src = gulp.src('src/**/*')
The correct way is to prevent folders from being included:
var src = gulp.src('src/**/*.js')
or (if you need to include files with other extensions)
var src = gulp.src('src/**/*', {nodir: true})
I need to zip some files in Amazon S3 without writing them to a local file first. My code worked in development, but I don't have many write privileges in production.
import os
import zipfile
from io import BytesIO

# fs is the S3 filesystem object (e.g. s3fs); output_dir is the S3 prefix to zip
folder = output_dir
files = fs.glob(folder)

f = BytesIO()
zip = zipfile.ZipFile(f, 'a', zipfile.ZIP_DEFLATED)
for file in files:
    filename = os.path.basename(file)
    image = fs.get(file, filename)  # downloads the object to a local file
    zip.write(filename)
zip.close()
The problem in production is this line:
image = fs.get(file, filename)
because I don't have write privileges.
My last resort is to write to the /tmp/ directory, which I do have privileges for.
Is there a way to zip files from a URL path or directly in the cloud?
I ended up using Python's tempfile module, which turned out to be a perfect solution.
Using NamedTemporaryFile gave me the guarantee of named, system-visible temporary files that are deleted automatically. No manual work.
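A minimal sketch of that approach, keeping the same fs/glob calls from the question and assuming a Linux-style environment where a named temporary file can be reopened by path (the upload of the finished buffer back to S3 is left out):

import os
import tempfile
import zipfile
from io import BytesIO

buffer = BytesIO()
with zipfile.ZipFile(buffer, 'a', zipfile.ZIP_DEFLATED) as zf:
    for file in fs.glob(output_dir):
        # NamedTemporaryFile provides a real named path (under /tmp by default)
        # that fs.get can write to, and it is removed automatically on exit
        with tempfile.NamedTemporaryFile() as tmp:
            fs.get(file, tmp.name)
            zf.write(tmp.name, arcname=os.path.basename(file))
buffer.seek(0)  # buffer now holds the finished zip, ready to be sent back to S3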
I am trying to unzip a zip file and save it into a target path. I tried these things:
var zip = new AdmZip(x);
zip.extractAllTo(targetPath, false); // second argument controls overwriting existing files
and then
fs.createReadStream("path/to/arch.zip").pipe(unzip.Extract({ path: "targetpath" }));
Both extract the zip file and save the unzipped contents into the target path, which is fine.
But if I upload two zip files that contain a folder with the same name, the second one overwrites that folder.
For example:
If I first upload image.zip, it is extracted and stored into images (the target folder),
so the images folder now contains an image folder.
If I upload image.zip again, it is extracted and it overwrites the image folder,
so the images folder again contains just one image folder.
But if I upload image1.zip, the images folder contains both an image and an image1 folder.
So how can I save the extracted folders even when they have the same name?