Sublime Text 3 with SFTP package not mapping (syncing)

Background:
I'm currently using Sublime Text 3 (free) and have installed the (free) Sublime SFTP package, and I'd like to work on my server files remotely. I saw a YouTube video showing that this is possible by mapping remote -> local.
Steps taken
I created a folder (a subfolder on my desktop)
dragged the folder into the Sublime sidebar
set up the SFTP config file (in the folder above) and, for simplicity, set the remote_path to just "/" (see the config sketch below)
clicked on "map remote -> local"
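For reference, a minimal sketch of what the package's sftp-config.json might contain in this setup; the host and user values below are placeholders, and only remote_path is taken from the question:

{
    // sftp-config.json (sketch) - host and user are placeholders
    "type": "ftp",
    "host": "zzz.de",
    "user": "yyyy@zzz.de",
    //"password": "...",
    "port": "21",
    "remote_path": "/"
}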
Status
After it outputs that the connection is working:
Connecting to FTP server "xxx" as "yyyy@zzz.de" ... success
Validating remote folder "/" .... success
It then sits in an almost never-ending loop with the status:
"Determining operations to sync remote path "/" down to local path "/Users/DS/Desktop/project_folder/3. Website/FTP/" ........."
I let it run yesterday to see what would happen, and after quite a while it did seem to be downloading, but when I checked my folder it was still empty.
I have already checked the access rights of the folder (which is in a subfolder of my desktop) and they are fine. I have also clicked on "download folder", but this doesn't work either (it gets stuck in the same loop).
At this point, I'm not sure what to do anymore. The connection is working, but it's just not mapping.
Could anyone help me?

Related

ddev launch command returns "File not found."

My organization is setting up a ddev-pantheon workflow on Windows 10. "ddev pull pantheon" commands have executed successfully. 403 errors appeared after running ddev start and clicking the project links; this error disappeared after setting the folder containing "index.php" as the docroot in my "config.yaml" file. However, "File not found." is now displayed after clicking both project links (those returned after a "ddev start" or "ddev restart" command). Could some file in the repo linked to the index.php file be having trouble locating another file? How do I get rid of this message and view the site?
If anyone is willing to help me establish a functional connection, I would appreciate it. It would also be helpful to know where ddev users usually clone their git repositories, and how I can locate the files downloaded by a "ddev pull pantheon" command. Could the presence of lando .yml files cause issues? Any help is appreciated. Thank you.
Moving the "index.php" file (and those referenced by it) into the initial docroot folder did not get rid of the "File not found." message. Neither did deleting the repo, re-downloading it, and setting the new repo folder as the docroot in the "config.yaml" file.
As you discovered, the most common reason to get a 403 is that your docroot is set wrong (or that there is no index.php or index.html in your docroot). This happens often enough that there's an FAQ for it: https://ddev.readthedocs.io/en/latest/users/basics/faq/#why-do-i-get-a-403-or-404-on-my-project-after-ddev-launch
Please look at your .ddev/config.yaml and see what's there for docroot when you're having this problem, and use ddev ssh to inspect what's inside the container.
ddev logs may help you understand why the 403 is happening.
You don't say whether you're using mutagen or not.
ddev pull pantheon should have nothing to do with your 403 problem; it's not clear why you mention it. I suppose you could have a really messed-up pantheon.yaml that could do it, or you could be downloading a broken database. Are you saying that your project is only broken after you do a pull? If so, ddev delete -Oy will get you back to where you started, so you can demonstrate that. You can also do ddev pull pantheon --skip-files or ddev pull pantheon --skip-db as part of your debugging process.
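A quick sketch of that debugging sequence, using only the commands mentioned above:

ddev logs                          # web server logs; often shows why a 403 happens
ddev ssh                           # inspect what's actually inside the container
ddev delete -Oy                    # remove the project's database, keep your code
ddev start
ddev pull pantheon --skip-files    # pull only the database
ddev pull pantheon --skip-db       # pull only the files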
The way things should be working (a combined sketch follows this list):
You should have checked out your git repo that has the code for your project.
On most project types you would have done a ddev composer install after that.
Then a ddev pull pantheon would load your database with the upstream database. You can see the contents of the database with ddev mysql or by using the PhpMyAdmin UI (ddev launch -p).
The files from Pantheon will be put into your upload_dir. For example, this would be web/sites/default/files on a standard Drupal project.
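Put together, the expected flow looks roughly like this; the repository URL and project name are placeholders:

git clone <your-project-repo> my-project
cd my-project
ddev start
ddev composer install    # on most project types
ddev pull pantheon       # load the upstream database and files
ddev mysql               # inspect the database, or: ddev launch -p for PhpMyAdmin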
BTW, the recommended environment on Windows is WSL2, you'll like it a lot in the long run.
This sort of problem would be easier to sort out in a more interactive environment, so you're invited to the DDEV Discord at https://discord.gg/hCZFfAMc5k.

build.artifactstagingdirectory doesn't exist - Azure DevOps

I'm trying to publish a ClickOnce application. To do so, I'm building the project with the msbuild publish target argument, copying the files to the artifact staging directory, then attempting to upload that directory via FTP.
However, the FTP upload task is failing with the following error:
550 The system cannot find the file specified.
For whatever reason, removing the / at the beginning of the Remote Directory field fixed it. Most say the / at the beginning is needed, however. It might vary from one FTP server to another.
550 The system cannot find the file specified.
This error doesn't indicate that the specified files are missing from the build.artifactstagingdirectory folder.
Actually, the task will always succeed with the warning Could not find any files to upload, even when the build.artifactstagingdirectory folder is empty.
To make the FTP upload task work, you may need to check the Remote directory input of that task and make sure your remote FTP server is configured correctly (a pipeline sketch follows these two points):
1. Assuming your remote home directory is C:\FTPfolder, you should use /TestFolder/ as the input if you want to upload the files to C:\FTPfolder\TestFolder. (Be careful with the slashes.)
2. If the upload task needs to create a new folder in the remote directory during this process, make sure the user account has the related directory permissions, such as creating sub-folders.
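As a hedged sketch, the YAML version of such a step might look like this; the service connection name and file pattern are placeholders, and only the remote directory value follows the advice above:

# azure-pipelines.yml (sketch) - service connection and file pattern are placeholders
- task: FtpUpload@2
  inputs:
    credentialsOption: 'serviceEndpoint'
    serverEndpoint: 'MyFtpServiceConnection'
    rootDirectory: '$(Build.ArtifactStagingDirectory)'
    filePatterns: '**'
    remoteDirectory: '/TestFolder/'   # uploads into C:\FTPfolder\TestFolder
    clean: false
    preservePaths: true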
PS: I run this pipeline on my self-hosted agent to upload files to a remote directory on a remote machine running a Serv-U FTP server. The task works well on my side.

How to handle files in case-sensitive way in Vagrant on Windows host

On my Windows 8 machine I've installed VirtualBox + Vagrant. I used Laravel Homestead (with Ubuntu) as the box. When running a site on this VM, or using its command line, I would expect everything to run on Linux and not on Windows. But I found a strange issue:
First my folder mappings:
folders:
    - map: D:\DaneAplikacji\easyphp\data\localweb\projects\testprovag\strony
      to: /home/vagrant/code

sites:
    - map: learn.app
      to: /home/vagrant/code/my-first-app/public
When I open http://learn.app:8000 in my browser I get the correct output - the page from /home/vagrant/code/my-first-app/public which, to be clear, is the same code as D:\DaneAplikacji\easyphp\data\localweb\projects\testprovag\strony\my-first-app\public.
Now the problem:
In my public folder I've created 2 simple files:
an empty file named test, and a file index.php with this content:
<?php
if (file_exists('TEST')) {
    echo "file exists";
} else {
    echo "file NOT exists";
}
So now I open http://learn.app:8000 in the browser and I get the output file exists. This is not the result I would expect. As far as I know, Linux (my box is Ubuntu) treats file names case-sensitively (in contrast to Windows), so I would expect to get file NOT exists.
I've tested it inside the VM by running php index.php and I get the exact same result, file exists, which is again unexpected.
Then I copied those 2 files to another directory on the VM, /home/vagrant/TESTS - a directory that is not mapped through Vagrant. Now when I run php index.php I get file NOT exists, which is the expected result.
To be honest, I completely don't understand it. The question: when using a Vagrant mapping, does PHP operate on the VM filesystem (in this case Ubuntu) or on the VirtualBox host filesystem (in my case Windows)? Is there any way to make this work so I get the desired result? I know this question might seem a bit software-related, but it's really connected to PHP and Laravel, and maybe I'm missing something here.
I think this issue can be solved without using Samba or too much work.
In Windows cmd I run:
vagrant plugin install vagrant-winnfsd
It installs an NFS plugin for Windows, even though http://docs.vagrantup.com/v2/synced-folders/nfs.html clearly states that NFS doesn't work on Windows hosts:
Windows users: NFS folders do not work on Windows hosts. Vagrant will ignore your request for NFS synced folders on Windows.
I modified my Homestead.yaml file mapping from:
folders:
    - map: D:\DaneAplikacji\easyphp\data\localweb\projects\testprovag\strony
      to: /home/vagrant/code

to:

folders:
    - map: D:\DaneAplikacji\easyphp\data\localweb\projects\testprovag\strony
      to: /home/vagrant/code
      type: "nfs"
(If you're not using Homestead.yaml, you can probably add type: "nfs" in your Vagrantfile, something like config.vm.synced_folder ".", "/vagrant", type: "nfs" - see the sketch below.)
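In a plain Vagrantfile that change would look roughly like this minimal sketch; the box name is a placeholder:

# Vagrantfile (sketch) - the box name is a placeholder
Vagrant.configure("2") do |config|
  config.vm.box = "laravel/homestead"
  config.vm.synced_folder ".", "/vagrant", type: "nfs"
end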
Now when I run
vagrant up
I get 2 or 3 prompts for the admin password (probably some Windows NFS configuration - they appear only the first time you run vagrant up after adding the NFS type), but now, both when using the URL http://learn.app:8000 and when running php index.php on the box command line, I get file NOT exists for the case from the question.
Note: this solution doesn't mean you can create test and TEST as separate files in the same directory and have both in your file system. It does, however, handle file names case-sensitively, so if you create a file with the wrong case in your app (and later try to load/require it in your code), you will notice that it doesn't work on your Vagrant Linux box (whereas it would work on Windows WAMP, and you would be surprised when moving to production).
With the default Vagrant share that you're using, the mounted folder is still backed by the underlying (host) file system. It won't handle Linux ACLs properly either.
One solution we found for this: instead of sharing the host's folder with the guest, we set up Samba on the guest and shared it back to the host. It's more cumbersome and requires more configuration, but at least your app runs in the environment it should.
Interesting. I use Windows 7 with Homestead - and I also assumed that the file case-sensitivity issue would be handled by Vagrant. But I ran some similar tests - and you are correct - it is actually case-insensitive.
I can confirm the issue has nothing to do with PHP - it actually occurs inside the Vagrant box on the command line itself:
touch EXAMPLE
rm example
That passes on the Vagrant box when I'm SSHed into it. But on a real Ubuntu box (I tested it on my server) that command fails.
The reason is that Vagrant calls out to Windows to check whether the file exists for the mapped folders. But if you make a non-mapped folder inside the Vagrant box, the 'call' to see if the file exists stays inside the VM and is handled only by Ubuntu - and that is why your other test passed.
I tried turning on case sensitivity for Windows 7 as documented here, but it didn't solve the problem.

Pulling remote 'client uploaded assets' into a local website development environment

This is an issue we come up against again and again: how to get live website assets uploaded by a client into the local development environment. The options are:
Download all the things (can be a lengthy process and has to be repeated often)
Write some insane script that automates #1
Some uber clever thing which maps a local folder to a remote URL
I would really love to know how to achieve #3: have some kind of alias/folder in my local environment which is ignored by Git but means that when testing changes locally I see client-uploaded assets where they should be, rather than broken images (and/or other things).
I do wonder if this might be possible using Panic Transmit and the 'Transmit disk' feature.
Update
OK, thus far I have managed to get a remote server folder mapped to my local machine as a drive (of sorts) using the 'Web Disk' option in cPanel and then the 'Connect to server' option in OS X.
However, although I can then browse the folder contents in a safe, read-only fashion on my local machine, when I alias that drive to a folder in /sites, Apache just prompts me to download the alias file rather than following it as it would a symlink... :/
KISS: I'd go with #2.
I usually put a small script like update_assets.sh in the project's folder which uses rsync to download the files:
rsync --recursive --stats --progress -aze ssh user@site.net:~/path/to/remote/files local/path
I wouldn't call that insane :) I prefer to have all the files locally so that I can work with them when I'm offline or on a slow connection.
rsync is quite fast, and maybe you also want to check out the --delete flag to delete local files when they have been removed from the remote.
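A minimal sketch of such an update_assets.sh; the host and paths are placeholders:

#!/bin/sh
# update_assets.sh (sketch) - mirror client-uploaded assets into the local tree.
# Add --delete to also remove local files that were deleted on the remote.
rsync --recursive --stats --progress -aze ssh \
    user@site.net:~/path/to/remote/files local/path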

Perforce - Client Unknown error (in WebStorm)

I am getting a "client unknown" error when trying to commit any files from WebStorm to Perforce. My P4V is configured correctly and works outside of WebStorm, and my p4 command line is also correctly configured, yet when I use the exact same setup in WebStorm I get the "client unknown" error. My client is set up and correctly copied into Perforce. Any idea what might be going on here? Are there logs that will show me a more complete error?
For those who are facing this issue:
Check the "p4 info" output and see whether "User name" and "Client name" contain the correct data.
If they are 'none' / 'unknown', then check or set the environment variables below (a shell sketch follows):
P4CLIENT : should match your workspace name
P4USER : the username used to log in to Perforce
P4PORT : the "HOSTNAME:PORT" of the Perforce server, e.g. 192.128.10.130:6666
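For example, a hedged sketch of checking and setting these on macOS/Linux (all values are placeholders; on Windows you could use p4 set instead of export):

p4 info                             # check "User name" and "Client name"
export P4PORT=192.128.10.130:6666   # HOSTNAME:PORT of the Perforce server
export P4USER=your_username
export P4CLIENT=your_workspace_name
p4 info                             # verify the fields are no longer none/unknown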
I figured this out finally, though I ran into a new issue.
Resolution: the Perforce command-line client p4 (not P4V) has to be installed - and it was, but not in the correct directory. On a Mac, p4 needs to be installed in your Applications folder, and that file needs to be made executable. You will also likely need to change the file's permissions to allow system read/write access.
To make the file executable once it is in the Applications directory, open a terminal, navigate to your Applications directory, and type: chmod +x p4 (http://www.perforce.com/perforce/doc.current/manuals/p4guide/01_install.html)
Then you can find the file in Finder, right-click it, and click Get Info in the context menu. At the very bottom are the file permissions; I set them all to read/write. You could also do this from the command line by typing chmod 755 p4, I think, but I am not great on the command line, so use at your own risk.
As far as your workspace is concerned, it should be whatever your workspace is set to in P4V.
At that point, if you hit Test Connection inside WebStorm -> Preferences -> Perforce, it should work, or at least give you a new error with more information.
Mine was able to connect successfully, but now when I try to update a file I get an error saying "path '/users/my-path.....' is not under the client's root '/users/my-path'", even though the first path is clearly a child of the second. Still working on this error.
In my case I resolved this problem as follows: the physical path to my files was /Users/..., but in Perforce the client root was /users/... (in lowercase). I changed the root to match the case, and that fixed it for me on Mac.
