I have a variable address = '/data/train/1.jpg', and I'm trying to read the file with
im = Image.open(address)
but I get:
FileNotFoundError: [Errno 2] No such file or directory: '/data/train/1.jpg'
For some reason I can't use the full (absolute) path of the file.
I started Jupyter Notebook from a folder that actually contains the file 1.jpg under data/train/.
How can I fix this?
Relative addressing means from the perspective of the current working directory. So if your script is in the same directory as the data folder, your path to the file would be ./data/train/1.jpg. Note the ./, which means the current directory.
Use a relative path; the one you have is absolute:
address = './data/train/1.jpg'
im = Image.open(address)
In this case . means the current location, while a leading slash means the filesystem root; exactly how this is resolved depends on your OS.
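If you're unsure what the current working directory is, a minimal sketch along these lines can confirm it and build the path from there (it assumes, as stated above, that the notebook was started from the folder containing data/train/):
import os
from PIL import Image

print(os.getcwd())  # the directory relative paths are resolved against
# drop the leading slash so the path is resolved relative to that directory
im = Image.open(os.path.join('data', 'train', '1.jpg'))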
As far as I'm aware I'm using best practices for defining paths (using raw strings) and for joining them (using os.path.join()), e.g.
import os
fdir = r'C:\Code\...\samples'
fpath = os.path.join(fdir, 'fname.ext')
and doing so has not caused me any problems when running my code within a Python or command shell. If I print fpath to the console I get consistent use of backslashes in the path:
C:\Code\...\samples\fname.ext
But when I run a Docker containerized version of the code and run the image I get the error:
FileNotFoundError: [Errno 2] No such file or directory:
'C:\Code\...\samples/fname.ext'
I don't understand why os.path.join() has used a / to join fdir and fname.ext when the rest of the path uses backslashes. It doesn't do this when I run the code outside of the container.
I have tried using os.path.normpath():
fpath = os.path.join(fdir, 'fname.ext')
fpath = os.path.normpath(fpath)
as discussed here, and os.sep.join():
fpath = os.sep.join([fdir, 'fname.ext'])
as covered here, and Path().joinpath():
from pathlib import Path
fpath = Path(fdir).joinpath('fname.ext')
as well as Path() / 'path_to_add':
fpath = Path(fdir) / 'fname.ext'
as discussed here, but in every case I end up with the same result as with os.path.join().
Can someone please help me to understand what is going on and how to create consistent paths that will work whether I run the code in Python in a Windows environment, or in a Docker container?
Update Nov. 16:
In trying to keep my question brief I think I've left out details that are crucial. Apologies to those who have kindly taken the time to offer suggestions based on my incomplete description of the problem.
My code needs to import/export files from/to directories that are defined within a user-specified configuration file.
So the configuration file has a section of code where the user defines variables and paths, e.g.
samplesDir = r"path-to-samples-directory"
The variables are stored in a dictionary of dictionaries, which is saved as a .json file.
At the start of the code the user defines the key that selects the dictionary of interest, so that wherever my code needs to import or export a file, the paths are at hand.
So back to my example, samplesDir is stored in the configuration dictionary, cfgDict, so all I need to do is append the file name:
sampleFpath = os.path.join(sampleDir, sampleFname)
where sampleFname is determined based on other variables.
Because of the dynamic nature of the variables (including directory paths and file paths), I think this rules out the use of static paths defined in a .yml with Docker Compose.
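To make that concrete, here is a minimal sketch of how the pieces fit together; the JSON file name, the dictionary key and the sample file name are placeholders, while cfgDict, samplesDir and sampleFname are the names used above:
import json
import os

# load the user-specified configuration (file name is a placeholder)
with open(os.path.join('configs', 'example_config.json')) as f:
    cfgDicts = json.load(f)  # the dictionary of dictionaries

cfgDict = cfgDicts['someKey']  # key chosen by the user at start-up
sampleFname = 'fname.ext'      # placeholder; derived from other variables
sampleFpath = os.path.join(cfgDict['samplesDir'], sampleFname)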
Update Nov. 18:
It may help to include a few more details and some screenshots.
The above screenshot shows the file and folder structure of the src directory containing the source code, the main app.py script for command-line use, the Dockerfile, etc.
The configs folder contains JSON files that include variables and paths to directories and files. The user can create configuration files either by copying an existing one and modifying the entries, or by calling config.py to generate them.
Within config.py I have pre-set variables and paths, so that the directory paths to the configuration files (configs), sample files (sample_DROs) and others (e.g. fiducials) are all within src.
I don't anticipate any reason why the user would want to store the config files anywhere else, nor do I expect them to want to use different sample files (or move them elsewhere). However, they will undoubtedly create their own fiducials and may decide not to store them in the fiducials directory (i.e. somewhere not within the src directory).
Likewise I have pre-set the download directory (files are fetched from a server and downloaded based on the parameters stored within the configuration files) to be the default Downloads directory:
rootDownloadDir = os.path.join(Path.home(), "Downloads", "xnat_downloads")
Those files are later imported, processed, and the outputs are (by default) exported into sub-directories within rootDownloadDir.
Within Dockerfile I set the working directory of the container to be that of the source code and copy all of the contents of src (with the exception of some directories defined in .dockerignore):
WORKDIR C:/Code/WP1.3_multiple_modalities/src
...
COPY . .
so that the structure of the container mimics that of WORKDIR:
Hence I have allowed for flexibility in import/export directories, and they are by default a combination of paths within and outside of the src directory. And so, the code executed within the container will need to access files both within and outside of src.
That said, I don't know what rootDownloadDir will look like when os.path.join(Path.home(), "Downloads", "xnat_downloads") is run within the container.
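Assuming a Linux-based image running as the default root user, Path.home() typically resolves to /root, so a quick check inside the container would print something like /root/Downloads/xnat_downloads (this is an assumption about the base image, not something I have confirmed):
import os
from pathlib import Path

# on the Windows host this prints C:\Users\<user>\Downloads\xnat_downloads;
# in a typical Linux container running as root it prints /root/Downloads/xnat_downloads
print(os.path.join(Path.home(), "Downloads", "xnat_downloads"))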
This has got me thinking - Is it bad practice to set the download directory outside of src?
Returning to the original error:
the sample file is in the container:
From the actual behavior I can suppose that the container is based on a Unix-like image. The path separator is / on such systems.
To build an environment-independent path that works inside and outside of the container you need the following:
Mount the host folder to a container directory.
Set an environment variable inside and outside the container.
I can show an example of how this is achievable via the docker-compose tool and its configuration file docker-compose.yml:
# docker-compose.yml file
version: '3'
services:
  <service_name>:  # your service name here
    image: <image_name>  # name of image your container is built on
    environment:
      - SAMPLES_PATH=/samples
    volumes:
      - C:\Code\somepath\samples:/samples
In your Python code you can use the following structure:
import os
fdir = os.getenv('SAMPLES_PATH', r'C:\Code\...\samples')
fpath = os.path.join(fdir, 'fname.ext')
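For what it's worth, the mixed separators in the original error come from which flavour of Python performs the join: on Windows os.path.join uses \, while inside a Linux-based container it uses /. A small sketch with pathlib's pure-path classes (which mimic both behaviours on any OS; the sample directory is made up) shows the effect:
from pathlib import PureWindowsPath, PurePosixPath

fdir = r'C:\Code\samples'  # made-up directory for illustration
# Windows-style join: backslash separators throughout
print(PureWindowsPath(fdir, 'fname.ext'))  # C:\Code\samples\fname.ext
# POSIX-style join (what a Linux container does): the Windows string is
# treated as a single opaque component, so a forward slash is appended
print(PurePosixPath(fdir, 'fname.ext'))    # C:\Code\samples/fname.ext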
I wish to access the file system, but the file:// prefix does not seem to correspond to the absolute path.
When trying to fetch('file://<absolute path to file>'),
e.g. Users/name/dir/file, I seem to be unable to locate the required file.
If I omit the file:// part, fetch appends the relative path to the end of localhost. What does file:// reference if it's not the absolute path?
I am trying to divide an image dataset into train and test sets. For this I am copying the images from one folder to another in Python, and I have given the addresses of both the source and the destination. But then the above error arises: it cannot find the image files to copy, even though I have given the correct image address, which is "C:\Users\DELL\coil-20-unproc\imagename". I still can't copy the images.
import os
import shutil

original_dataset_dir = r"C:\Users\DELL\coil-20-unproc"
# train_obj1_dir (the destination directory) is assumed to be defined earlier
# Copy object1 images to train_obj1_dir
fnames = ['obj1_{}.png'.format(i) for i in range(0, 72)]
for fname in fnames:
    src = os.path.join(original_dataset_dir, fname)
    dst = os.path.join(train_obj1_dir, fname)
    shutil.copyfile(src, dst)
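A quick sanity check before the loop can narrow down which side of the copy is failing (it only uses the variables defined above):
import os
# does the source directory exist, and does the first expected file exist?
print(os.path.isdir(original_dataset_dir))
print(os.path.exists(os.path.join(original_dataset_dir, 'obj1_0.png')))
# the destination directory must also exist before shutil.copyfile can write to it
print(os.path.isdir(train_obj1_dir))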
Jupyter Notebook and JupyterLab resolve relative paths from the location they were started from. You can try these:
Copy the file to your startup directory.
(You could enter !pwd in a cell and execute it to find out your startup directory.)
Create a link from a file in your startup directory to that file.
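A plain-Python alternative to !pwd, which also shows what is visible from the startup directory:
import os
print(os.getcwd())      # the startup directory, i.e. the base for relative paths
print(os.listdir('.'))  # what is visible from there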
I already have the relative path /home/Folder1/Folder2, whose original absolute path is /home/user1/Folder1/Folder2, and I have several scripts that are using /home/Folder1/Folder2. Now I need to delete user1, so I created user2 with the same structure as user1; I now have a new path, /home/user2/Folder1/Folder2. If I delete user1, my scripts will fail because they are using the relative path /home/Folder1/Folder2, whose original absolute path is /home/user1/Folder1/Folder2. So I want my new path /home/user2/Folder1/Folder2 to point to /home/Folder1/Folder2, so that my scripts won't fail and I don't have to go through the trouble of opening each script and changing the relative path to my newly created path. Any idea how I can do this?
I guess you got confused between soft links and absolute/relative paths.
I assume you have a soft link created from "/home/Folder1/Folder2" pointing to "/home/user1/Folder1/Folder2", and you want to delete the user1 directory and create a user2 directory with the same structure. If my assumption is right, recreate the soft link "/home/Folder1/Folder2" to point to "/home/user2/Folder1/Folder2". Your existing scripts will work seamlessly.
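For completeness, a minimal Python sketch of recreating the link (the shell equivalent would be ln -sfn; the paths are the ones from the question, and it assumes you have permission to modify /home):
import os

link = '/home/Folder1/Folder2'
target = '/home/user2/Folder1/Folder2'
# remove the old link if it exists, then point it at the new user2 tree
if os.path.islink(link):
    os.remove(link)
os.symlink(target, link)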
My Python version is 3.5, through Anaconda, in a Windows 10 environment. I'm using Pyminizip because I need password protection for my zip files, and zipfile doesn't support it yet.
I am able to zip a single file through the function pyminizip.compress, and the encryption worked as expected. However, when trying to use pyminizip.compress_multiple I always encounter a Python crash (as pictured), and I believe it's due to a bad input format on my part.
What I would like to know is: what's the acceptable format for the input argument src file LIST path? From Pyminizip's documentation:
pyminizip.compress_multiple([u'pyminizip.so', 'file2.txt'], "file.zip", "1233", 4, progress)
Args:
1. src file LIST path (list)
2. dst file path (string)
3. password (string) or None (to create no-password zip)
4. compress_level(int) between 1 to 9, 1 (more fast) <---> 9 (more compress)
It seems the first argument src file LIST path should be a list containing all files required to be zipped. Accordingly, I tried to use compress_multiple to compress a single file with the command:
pyminizip.compress_multiple( ['Filename.txt'], 'output.zip', 'password', 4, optional)
and it led to a Python crash. So I tried to add a full path into the args:
pyminizip.compress_multiple( [os.getcwd(), 'Filename.txt'], ... )
and still, it crashed again. So I thought maybe I had to split the path like this:
path = os.getcwd().split( os.sep )
pyminizip.compress_multiple( [path, 'Filename.txt'], ...)
Still no luck. Any ideas?
Pyminizip requires the path name (or the path relative to where the script is running) for each of the files.
Your example:
pyminizip.compress_multiple( [os.getcwd(), 'Filename.txt'], ... )
gives a list containing one entry that is just os.getcwd(), and then another entry, 'Filename.txt'. You need to combine them into a single path using os.path.join().
In your filename example, you will need:
pyminizip.compress_multiple( [os.path.join(os.getcwd(), 'Filename.txt')], ...)
Conversely, for multiple files:
pyminizip.compress_multiple( [os.path.join(os.getcwd(), 'Filename1.txt'), os.path.join(os.getcwd(), 'Filename2.txt')], ...)
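A small sketch of building that list for several files (the file names are placeholders; the call itself follows the 5-argument form quoted earlier and is left commented out):
import os

# each entry must be a complete path (or one relative to the cwd), so join the
# directory and file name rather than listing them as separate items
fnames = ['Filename1.txt', 'Filename2.txt']  # placeholder file names
src_paths = [os.path.join(os.getcwd(), f) for f in fnames]
print(src_paths)
# then, following the 5-argument form quoted above:
# pyminizip.compress_multiple(src_paths, 'output.zip', 'password', 4, progress)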
From https://pypi.org/project/pyminizip/, the usage of compress_multiple is:
pyminizip.compress_multiple([u'pyminizip.so', 'file2.txt'], [u'/path_for_file1', u'/path_for_file2'], "file.zip", "1233", 4, progress)
The second parameter is a bit confusing, but if used, it will create a zip file which, when uncompressed, will create a directory structure like: