I have an application running on Linux which reads an image file path from a database and displays the image. There is a similar Windows application which reads the same database for the image file path and displays it. I have configured the SMB service to share the folder with Windows.
The problem is that when the Linux app stores the image path, it stores it as /data/images/file.tip, and the Windows app cannot read the file because that path is not valid on Windows. The same happens in reverse: if the Windows app modifies the db, it stores a path such as \\server\images etc.
How do I translate a Windows path to a Linux path and vice versa?
In your application, could you store only the image file name, plus an OS flag/variable if needed?
Then, based on the OS, look in a configured folder for that file.
So store "Image.jpg" in the DB, and have the Windows app open "C:\Image.jpg" and the Linux app open "/home/user/Image.jpg".
The / and \ separators are OS-specific, so when you retrieve a path from the database, replace / with \ if you are on Windows, or \ with / if you are on Linux.
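If you go the translation route, the mapping between the two roots mentioned in the question (\\server\images on Windows, /data/images on Linux) can be captured in a small pair of helpers. A sketch in Python; the root names are taken from the question and should be adapted to your actual share:

```python
import ntpath
import posixpath

# Roots under which the same share is visible on each OS (from the question;
# adjust these to your environment).
WINDOWS_ROOT = r"\\server\images"
LINUX_ROOT = "/data/images"

def to_linux(win_path):
    """Translate a Windows path under the share into its Linux equivalent."""
    rel = win_path[len(WINDOWS_ROOT):].lstrip("\\")
    return posixpath.join(LINUX_ROOT, rel.replace("\\", "/"))

def to_windows(linux_path):
    """Translate a Linux path under the share into its Windows equivalent."""
    rel = linux_path[len(LINUX_ROOT):].lstrip("/")
    return ntpath.join(WINDOWS_ROOT, rel.replace("/", "\\"))
```

Using ntpath/posixpath (rather than os.path) keeps the conversion correct regardless of which OS the code runs on.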
I have one Linux machine and one Windows machine for development. For data sharing, we have set up a shared Windows directory on another Windows machine, which both my Linux and Windows machines can access.
I am now using DVC for version control of the shared data. To make this easy, I mount the shared Windows folder on both the Windows and the Linux development machine. In Windows, the DVC configuration looks like:
[core]
analytics = false
remote = remote_storage
['remote "remote_storage"']
url = \\my_shared_storage\project_dir
In Linux, it looks like:
[core]
analytics = false
remote = remote_storage
['remote "remote_storage"']
url = /mnt/mount_point/project_dir
As you can see, Windows and Linux have different mount points. So my question is: is there a way to make both Windows and Linux use the same url in the DVC configuration file?
If this is impossible, is there an alternative way for DVC to keep data in a remote shared Windows folder? Thanks.
If you are using a local remote this way, you won't be able to have the same url on both platforms, since the mount points are different (as you already realized).
The simplest way to configure this would be to pick one (Linux or Windows) url to use as your default case that gets git-committed into .dvc/config. On the other platform you (or your users) can override that url in the local configuration file: .dvc/config.local.
(Note that .dvc/config.local is a git-ignored file and will not be included in any commits)
So if you wanted Windows to be the default case, in .dvc/config you would have:
[core]
analytics = false
remote = remote_storage
['remote "remote_storage"']
url = \\my_shared_storage\project_dir
and on your Linux machine you would add the file .dvc/config.local containing:
['remote "remote_storage"']
url = /mnt/mount_point/project_dir
See the DVC docs for dvc config --local and dvc remote modify --local for more details:
https://dvc.org/doc/command-reference/config#description
https://dvc.org/doc/command-reference/remote/modify#command-options-flags
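For example, on the Linux machine the override above can be written without editing the file by hand, assuming the remote is named remote_storage as in the question:

```shell
# Writes the url override into the git-ignored .dvc/config.local
dvc remote modify --local remote_storage url /mnt/mount_point/project_dir

# Inspect the result
cat .dvc/config.local
```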
I am working on a web application developed on .NET Core 2.0, hosted on a Linux machine (Fedora) using a Docker image.
The application has a document manager, which stores and reads files from an external source (a network share / physical path on the base machine). This was working fine in the normal dev environment on Windows machines.
Please advise how to resolve the file system / file path on a Linux system with Docker.
var pathFolder = "E:\\DocLib\\";
// pathFolder already ends in a separator, so appending another "\\" doubled it;
// Path.Combine avoids that, but the drive letter is still Windows-only
var filePath = Path.Combine(pathFolder, newFileName);
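One common approach is to avoid the hard-coded Windows drive entirely: mount the document folder into the container and hand the application its base path through configuration. A sketch; the host path /srv/doclib, the container path, and the variable name are all hypothetical:

```shell
# Map a host directory into the container and expose its location to the app
docker run -e DOCLIB_PATH=/app/doclib -v /srv/doclib:/app/doclib myapp
```

In the code, read DOCLIB_PATH (e.g. via IConfiguration or Environment.GetEnvironmentVariable) and build file paths with Path.Combine rather than string concatenation, so the same binary works on both Windows and Linux.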
Hi, we're using SAS EG v7.13 HF1 (7.100.3.5419) (64-bit).
I'm running a script from a local Windows directory. I have a file called sas_pwd.sas I'm referencing in the command:
%include 'sas_pwd.sas';
However it's not being picked up by my SAS script.
I'm getting the following error message.
24 %include 'sas_pwd.sas';
WARNING: Physical file does not exist, /sasdata/work/sasuser/rolap/sas_pwd.sas.
ERROR: Cannot open %INCLUDE file sas_pwd.sas.
That path is my home directory for SAS on the server.
So how can I either have the script point to the file on my local machine, where it's in a directory such as c:\mydir\me,
or have the script find the file on the Linux server, where it's in, say, /home/rolap/?
If you can answer to both that'd be a bonus.
If the file exists in your Linux home directory then reference it there. Usually you can use ~ as an alias for your home directory.
%include '~/sas_pwd.sas';
Or you could hard code your home directory path.
%include '/home/rolap/sas_pwd.sas';
Linux will not have access to your PC unless you are running some type of file server on your PC and have mounted your disk onto the Linux machine. But there are tasks available in Enterprise Guide that you can use to upload a file from your PC to your SAS server.
I'm currently working on a requirement: download a file from the database, then write it to a shared folder. Temporarily, I'm working with a path on my local machine:
File.WriteAllBytes(path, content);
My problem is the shared folder is on a windows machine and only a specific account will be allowed to write to this folder.
Now I know the basics of Impersonation but I don't know if it is possible to impersonate on a Docker container on a Linux machine.
In short, I want to deploy my application on a Linux container then write a file to a windows shared folder with limited access.
Is the folder on the host, or mounted on the host? If so, you can map the host folder into the container, e.g.:
C:\> echo "Hello" > c:\temp\testfile.txt
C:\> docker run -v c:/temp:/tmp busybox cat /tmp/testfile.txt
c:/temp is the local path on the host, and /tmp is the path inside the container.
More details here: volume-shared-filesystems
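If the share lives on a separate Windows server rather than on the Docker host, one option is to mount it on the Linux host with the permitted account's credentials, then bind-mount it into the container as above. A sketch; the server, share, account, and mount-point names are all hypothetical:

```shell
# Mount the Windows share on the Linux host using the allowed account
sudo mount -t cifs //winserver/docs /mnt/docs \
    -o username=svc_account,password=secret,domain=MYDOMAIN

# Bind-mount the mounted share into the container
docker run -v /mnt/docs:/app/docs myapp
```

With this arrangement the container just writes to a local directory, so no impersonation is needed inside it; the CIFS mount carries the credentials.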
I'm trying to move a Db2 database from Windows to a Linux server. When I load the data into the Linux database with this command:
db2move DBNAME load -lo REPLACE -u userID -p password > load_remote.txt
I got this error:
SQLCODE: -3126 - SQLSTATE:
SQL3126N Remote client requires absolute path for files and directories.
Thanks.
Do you mean to use the 'load client' syntax (instead of just load)?
See the details in the documentation.
The LOAD command requires that the files to be loaded are already on the Db2-target-server.
The LOAD CLIENT alternative allows the files to be on a remotely connected Db2-client (or on your Windows Db2-server if that is the source machine).
You can also just copy the IXF files to the Linux Db2-server, and open an SSH session to that Linux environment and run the LOAD command there. Your choice.
As with the LOAD command, LOAD CLIENT operates on one file at a time (in your case, one file per table) unless using lobsinsepfiles option, or other special cases.
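A sketch of the LOAD CLIENT form, run from the Windows client while connected to the Linux database; the schema, table, and file names are hypothetical, and note the absolute client-side path that the SQL3126N message asks for:

```shell
db2 connect to DBNAME user userID using password
# CLIENT tells Db2 that the input file resides on this client machine
db2 "LOAD CLIENT FROM C:\export\tab1.ixf OF IXF REPLACE INTO MYSCHEMA.TAB1"
```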