build.artifactstagingdirectory doesn't exist in Azure DevOps

I'm trying to publish a ClickOnce application. To do so, I'm building the project with the msbuild publish target argument, copying the files to the artifact staging directory, then attempting to upload that directory via FTP.
However, the FTP Upload task is failing with the following error:
550 The system cannot find the file specified.

For whatever reason, removing the / at the beginning of the Remote Directory field fixed it. Most sources say the leading / is needed, however, so this may vary from one FTP server to another.

550 The system cannot find the file specified.
This error doesn't indicate that the specified files are missing from the build.artifactstagingdirectory folder.
Actually, the task always succeeds with the warning Could not find any files to upload even when the build.artifactstagingdirectory folder is empty.
To make the FTP upload task work successfully, check the Remote directory input of the task and make sure your remote FTP server is configured correctly:
1. Assuming your remote home directory is C:\FTPfolder, you should use /TestFolder/ as the input if you want to upload the files to C:\FTPfolder\TestFolder. (Be careful with the slashes.)
2. If the upload task needs to create a new folder in the remote directory during this process, make sure the user account has the necessary directory permissions, such as the right to create sub-folders.
PS: I ran this pipeline on my self-hosted agent to upload files to a remote directory on a machine running a Serv-U FTP server, and the task worked well on my side.
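For reference, here is a minimal YAML sketch of the FtpUpload task using the leading-slash convention from point 1; the server URL and the credential variables are placeholders, not values from the question:

- task: FtpUpload@2
  inputs:
    credentialsOption: 'inputs'
    serverUrl: 'ftp://example.com'                      # placeholder
    username: '$(ftpUser)'                              # placeholder variables
    password: '$(ftpPassword)'
    rootDirectory: '$(Build.ArtifactStagingDirectory)'
    filePatterns: '**'
    remoteDirectory: '/TestFolder/'                     # note the slashes
    preservePaths: true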

Related

Jenkins pipeline advice required

I want to create a very simple pipeline that will clone a repo from GitHub and then copy a JSON file to a directory. But there is a catch here, and I need your guidance on whether it is possible or not. It is on a Linux machine, and the folder structure is like this:
/var/www/myjenkins.com/
/var/www/myapi.com/html/json/
On my Linux machine, Jenkins is deployed at /var/www/myjenkins.com/, and the new pipeline I need to create in Jenkins has to copy a JSON file (after the git clone) to another location, /var/www/myapi.com/html/json/, which is outside the Jenkins root. Is it really possible to traverse out of the Jenkins root and then copy the file into another location?
I would be glad if you could advise on this. Thanks.
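For what it's worth, a pipeline's shell step runs ordinary commands, so copying outside the Jenkins home is possible as long as the user Jenkins runs as has write permission on the destination. A minimal sketch of the commands such a step would run; the repository URL and JSON file name are placeholders:

# Clone the repo, then copy the JSON file outside the Jenkins root.
# This works as long as the user running Jenkins can write to the target.
git clone https://github.com/example/repo.git
cp repo/data.json /var/www/myapi.com/html/json/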

Azure web job read file from directory

I have a web job published to an empty Azure website. In the root directory (D:\home) I have added a new folder called 'company'. When I run my web job I get the error:
Unhandled Exception: System.IO.FileNotFoundException: Could not find file 'D:\home\company\B20150324.txt'.
The file is definitely there. I have confirmed with ftp and the sites Kudu CMD directory explorer.
My fault: the file was named using the local DateTime, but the Azure server runs on UTC.
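For illustration, the mismatch is easy to see from a shell: around midnight local time, the local date and the UTC date differ, so a name built from the local date won't match a file generated on a UTC server (the B-prefixed pattern follows the file name in the question):

date +B%Y%m%d.txt      # local date, e.g. B20150325.txt
date -u +B%Y%m%d.txt   # UTC date,   e.g. B20150324.txt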

How to make a file executable on an OpenShift server after pushing it via git

The original post is found here.
I want to ensure my index.cgi is set to 755, even after I push files via git.
This is not happening; based on the umask, I understand the file permission is getting set to 700.
I am unable to create the post-update script on the server, which is to be kept in the openshift/hooks location, due to the permissions that are set.
So I tried using action hooks to do the job.
I created a file named stop in my local action hooks folder.
Following this, I pushed my index file to the server.
My index file still shows its permission as 700.
How can I resolve this?
Try updating the permissions in git. Note that git tracks only the executable bit, not a full numeric mode, so the flag takes +x or -x:
git update-index --chmod=+x <your_file>
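For example, to mark the question's index.cgi as executable and push the change (the branch name is an assumption):

# Flip the executable bit in git's index, then commit and push.
git update-index --chmod=+x index.cgi
git commit -m "Track index.cgi as executable"
git push origin master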

Pulling remote 'client uploaded assets' into a local website development environment

This is an issue we come up against again and again: how to get live website assets uploaded by a client into the local development environment. The options are:
Download all the things (can be a lengthy process and has to be repeated often)
Write some insane script that automates #1
Some uber clever thing which maps a local folder to a remote URL
I would really love to know how to achieve #3: have some kind of alias/folder in my local environment which is ignored by Git, but means that when testing changes locally I will see client-uploaded assets where they should be, rather than broken images (and/or other things).
I do wonder if this might be possible using Panic Transmit and the 'Transmit disk' feature.
Update
Ok thus far I have managed to get a remote server folder mapped to my local machine as a drive (of sorts) using the 'Web Disk' option in cPanel and then the 'Connect to server' option in OS X.
However, although I can then browse the folder contents in a safe, read-only fashion on my local machine, when I alias that drive to a folder in /sites, Apache just prompts me to download the alias file rather than following it as it might a symlink... :/
KISS: I'd go with #2.
I usually put a small script like update_assets.sh in the project's folder, which uses rsync to download the files:
rsync --stats --progress -az -e ssh user@site.net:~/path/to/remote/files local/path
I wouldn't call that insane :) I prefer to have all the files locally so that I can work with them when I'm offline or on a slow connection.
rsync is quite fast, and you may also want to check out the --delete flag to delete local files when they have been removed from the remote.
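A minimal sketch of what such an update_assets.sh could look like, using the same placeholder host and paths as the command above:

#!/usr/bin/env bash
# update_assets.sh -- pull client-uploaded assets from the live server.
set -euo pipefail

REMOTE="user@site.net:~/path/to/remote/files"   # placeholder host and path
LOCAL="local/path"

# --delete mirrors remote deletions locally, per the note above.
rsync --stats --progress -az -e ssh --delete "$REMOTE" "$LOCAL"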

db.* files in /home from Perforce?

I see several db.* files in my /home directory, and it seems they come from Perforce. For example, some of the files are db.archmap, db.bodtext, db.change, and db.changex.
Are these files useful? Can I delete them? They are making my /home directory messy.
You have started a server using your home directory as the Perforce server's P4ROOT folder. Those files are generated when the server starts and cannot be deleted unless you want to hose your server installation. It's not clear to me how you've started the server instance, so I'll try to cover multiple bases with my answer.
If you want to start up the server under your own account, you should set the P4ROOT environment variable and point it to where you want the server to store its files. Alternatively, when you start the server, you can specify the root folder on the command line using the -r option:
p4d -r /home/mark/p4server
which would put the server's files into the directory called 'p4server' off of my home directory.
Typically it is best to run the Perforce server using a user that is dedicated to running Perforce. I use a user called 'perforce', and I set P4ROOT (and other variables) in that user's environment. If you cannot use a separate user, it might be easier to use the -r command-line option that I mentioned above.
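A minimal sketch of that dedicated-user setup, with a placeholder root directory; p4d's -d flag runs the server in the background:

# As the dedicated 'perforce' user, point the server at its own root
# directory instead of $HOME (the path is an example):
export P4ROOT=/home/perforce/p4root
mkdir -p "$P4ROOT"
p4d -d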
Those files are only server files, not client files. So it is safe to delete them, but if you start the server back up it will recreate them. So you might want to uninstall the server.
Unless you are running a beta version, they have p4sandbox coming soon (maybe in the beta, I forget) which MAY create those files. I don't have a beta version, so I can't verify what new files the client may or may not create.
You can check the documentation here to see what these files do/are for.
