IIS Web Deploy: copy files to client

I am trying to copy some files to a destination folder on the server (which is no problem with Web Deploy), and after that process I would like to pull those files back to the client (the computer that starts the msdeploy process).
There seem to be plenty of examples of copying files to a server, but I could not find any about getting files from the server.

What you are describing is not currently supported by MSDeploy. Workarounds that I can think of include:
1. Defining a postSync argument that copies the files to a directory accessible via FTP (see the sketch below)
2. Installing MSDeploy on the client and having the server trigger a separate deployment back to the client (this would require the client to be publicly accessible)
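A rough sketch of the first workaround; the source path, destination path, server name, and FTP outbox directory are all placeholders, so adjust them to your setup:

rem hypothetical paths and server name; the outbox must be a directory your FTP site serves
msdeploy -verb:sync ^
  -source:contentPath="C:\LocalFiles" ^
  -dest:contentPath="D:\Destination",computerName=RemoteServer ^
  -postSync:runCommand="xcopy /E /Y D:\Destination D:\ftproot\outbox"

The client can then fetch the files from the FTP outbox in a follow-up step.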

Related

Azure Windows App Service files not available across nodes

Our application can generate a file on request, which is then downloaded by the client. We are seeing issues where, when our App Service runs on more than one node, files generated on one node are not available on the other nodes.
E.g.:
POST request to generate a file and save it to d:\home\site\wwwroot\app_data is sent by the user and handled by machine-1; it succeeds.
GET request from the user to download this file is received by machine-2; this fails because the file cannot be found.
My reading of the Microsoft docs is that anything in d:\home is backed by Azure Storage and is not local to the machine: https://learn.microsoft.com/en-us/azure/app-service/operating-system-functionality#file-access
File access across multiple instances: The home directory contains an app's content, and application code can write to it. If an app runs on multiple instances, the home directory is shared among all instances so that all instances see the same directory. So, for example, if an app saves uploaded files to the home directory, those files are immediately available to all instances.
But this doesn't seem to be happening. Is there something else that needs configuring?
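One way to sanity-check what each instance actually sees, assuming you can open the Kudu debug console against each instance, is to compare the instance ID with a listing of the shared folder:

rem run from the Kudu console (Debug console -> CMD) on each instance
echo %WEBSITE_INSTANCE_ID%
dir D:\home\site\wwwroot\App_Data

If the listings differ between instance IDs, the app is not writing to the shared home directory it thinks it is.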

Jenkins Slave Node: Can I use it to take over a build done on a different domain?

I have successfully set up Jenkins on a local domain as a test. It builds from SCM, zips the build, extracts it to a unique timestamp folder, and then copies the files over to the IIS folder.
I now have to set it up to deploy to an Azure VM. Now things are getting hairy.
I can get the file to copy across, but it takes a long time, and unzipping literally takes an hour.
Cross-domain user rights are also making things difficult, as the user running the Jenkins service does not exist on the production boxes, which are on Azure domains.
What are my options?
Should I install a slave node on the production box, "activate" the slave from the master, and then let the slave:
1. Perhaps copy the file over from Azure Storage to the production box?
2. Extract the files
3. Copy the files to the IIS folder
Well, there's no single right answer to this; try what works best for you. The main options I see are:
1. Use a slave node in Azure, upload the zip to some place (an Azure Storage account or whatever), and let the slave node handle the download/unpacking/etc.
2. Use remote PowerShell to connect directly to the servers in Azure, download the zip from the web (or Azure Storage or whatever), and extract it (see the sketch below).
3. Use a tool like Octopus, which does literally the same but is built with deployments in mind.
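A minimal sketch of option 2, assuming PowerShell Remoting is enabled on the Azure VM; the computer name, download URL, and paths are placeholders:

# hypothetical names: azure-vm-01, https://example.com/build.zip, C:\inetpub\wwwroot
Invoke-Command -ComputerName azure-vm-01 -Credential (Get-Credential) -ScriptBlock {
    # make sure the staging folder exists, then download and unpack the build
    New-Item -ItemType Directory -Force -Path "C:\deploy" | Out-Null
    Invoke-WebRequest -Uri "https://example.com/build.zip" -OutFile "C:\deploy\build.zip"
    Expand-Archive -Path "C:\deploy\build.zip" -DestinationPath "C:\inetpub\wwwroot" -Force
}

Unzipping on the target box also avoids pushing thousands of small files over the WAN, which is usually what makes the copy so slow.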

How to deploy a data file on a Tomcat web server on Debian Linux

I have a web application developed using JSP and servlets. It is deployed on a server running Debian Linux with Tomcat 5.5.31. The application requires some data files, which are generated automatically when settings are made through a standalone Java application deployed on another machine. That setup is done. As I don't know much about Debian Linux or where my application ends up on it, I have some doubts about deploying these automatically generated data files:
1. I made a .war file of my web application and deployed it using Tomcat Manager, so I don't know where exactly my application ends up. I don't know the exact path. How do I find it?
2. Is it possible to set up FTP for this web application deployed on the Debian Linux server? If creating FTP access is possible, I can connect over FTP from my standalone Java program and easily create the file and do other file and directory manipulation.
If you've deployed a war, the application isn't anywhere on the filesystem as such. Most servers will unpack the war somewhere (Tomcat typically expands it under its webapps directory, e.g. /var/lib/tomcat5.5/webapps on Debian), but you shouldn't rely on where that is.
I can think of several options:
Use getServletContext().getAttribute("javax.servlet.context.tempdir") to get the application's temp directory (the attribute comes back as a java.io.File), then inform your external program of this location and have it place the file in there at a known location.
Arrange for a "known location" outside of the application, such as /tmp/somewhere or /var/cache/your-app/somewhere, to place such files. (Note: /tmp is usually cleaned on startup of a Linux machine.)
As for getting the file onto the server from a remote machine: you could have your client upload the file directly to your webapp (something like Apache HttpClient will help you there), which means you could do without the "known location" above. If you want to do this outside of the application, though, I'd avoid FTP (due to security). Instead, I'd go with scp (secure copy), as sketched below.
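A one-line sketch of the scp approach; the file name, user, host, and target directory are placeholders, and it assumes an SSH account on the Debian server:

# hypothetical user, host, and path; the target directory must be writable by appuser
scp settings-data.dat appuser@debian-server:/var/cache/your-app/incoming/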
Edit: Reading between the lines a little, you mention "settings" in the data file. If this is a configuration file that is not changed once the app is running, you may find it more convenient to have a "deploy" step on your server that simply takes the settings file and adds it to the war before deploying it. This is easy enough with "ant war", for example. You could then access the file using getClass().getResourceAsStream(..) or such.

Can SharePoint be installed by simply copying files?

I have a SharePoint website I need to move to another completely different server.
Can I do this by simply copying files from IIS to the other server's IIS folder?
I assume I need to copy the database as well as change the config file's database connection.
I assume I don't need to install anything on the server other than FTPing the files across, i.e. I don't need to install files via an installer or exe.
The short answer is no.
SharePoint includes Windows services (e.g. OWSTimer, the SharePoint Timer service) that have to be registered and installed.
There are also COM components that must be registered properly.

How to do version control via FTP?

I have a web development client using a shared host that doesn't allow shell access, and thus no access to SVN, Git, etc. I've tried to convince him to move to one of the many cheap options that allow it, but he won't do it. If I use version control on my staging server, are there any tools that will allow me to replicate the changes to production via FTP? Locally I have both Mac and Windows, and the staging server is Linux, so something that works on any of those platforms would be ideal.
Using your Linux staging server you could keep a separate checked out copy that you use specifically for that host and then use a utility to mirror that directory with the host server.
LFTP is useful for this kind of thing. It's available for most Linux distributions and includes a 'mirror' function:
Mirror specified source directory to local target directory. If target directory ends with a slash, the source base name is appended to target directory name. Source and/or target can be URLs pointing to directories.
Some kind of FTP mirror software is what you need. I haven't tested it, but a quick search gave me this Java application. You could run that over your up-to-date checked-out repository.
A good tool for keeping an SVN repo and an FTP copy in sync is svn2web. May I suggest creating a separate branch for the production copy and doing merges to that branch for uploading to the production server, as sketched below.
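A minimal sketch of that branching scheme, with a hypothetical repository URL:

# create the production branch once (hypothetical repo URL)
svn copy http://svn.example.com/repo/trunk http://svn.example.com/repo/branches/production -m "Create production branch"
# when a release is ready, merge trunk into a working copy of the branch
svn checkout http://svn.example.com/repo/branches/production wc-production
cd wc-production
svn merge ^/trunk
svn commit -m "Merge trunk into production"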
You probably need to write a batch file that is able to (see the sketch after this list):
Export the SVN repository
Upload the exported files to your Linux server via FTP
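A minimal sketch of those two steps; the repository URL, credentials, and remote directory are placeholders:

# export a clean copy of the repository (no .svn directories)
svn export http://svn.example.com/repo/trunk /tmp/export
# mirror the exported tree up to the host over FTP
lftp -u ftpuser,ftppass -e "mirror -R /tmp/export /public_html; quit" ftp.example.com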
Short of finding or implementing some FUSE-based copy-on-write file system that supports immutable versions, I'd just find another (more developer-friendly) host. As far as I know, no FTP server supports this natively, nor can I think of any elegant means of putting it in place with script hackery.
I could be wrong.
This question (and answer) really helped me just now as I implemented version control via gitolite on a separate server and lftp.
Here's what I did:
1. Set up gitolite on my Ubuntu staging server
2. Created the base repo (i.e. foo.git) on the staging server
3. Cloned foo.git into a working directory on the staging server
4. Cloned foo.git into a working directory on my local development machine
5. Developed locally
6. Pushed changes to the foo.git repo on the staging server
7. On the staging server, logged into the working directory and pulled in the changes from foo.git
8. lftp-ed into the shared host (like you mention above)
9. Once in the shared host, ran:
mirror -R --only-newer --delete --parallel=10 /source/directory/ /target/directory
Notes on the mirror command options:
-R: pushes the source/directory to the target/directory. (Without it, mirror pulls from target to source; think reverse.)
--only-newer: without this option, even if you only changed one file, the mirror command will send all the files in the source directory over to the target directory. With this option, only the changed (newer) files are transferred over the wire.
--delete: deletes files that are no longer in the source directory but still in the target directory. One of my pushes involved deleting expired assets; without this option, the same files would have stayed put on my shared host after executing the mirror command.
--parallel=10: transfers 10 files at once (instead of 1 by default). This made the process much faster.
While this is what worked for me, I'm sure there are ways to improve on it. I was grateful for this question and thought I'd share my experience.
Rsync will also do this kind of mirroring, though it needs SSH access or an rsync daemon on the remote end rather than a plain FTP connection. You probably already have it installed if you're on a Unix-like system.
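A minimal sketch, assuming the host does allow SSH (the paths and host are placeholders):

# -a archive mode, -v verbose, -z compress; --delete removes files gone from the source
rsync -avz --delete /path/to/checkout/ user@host:/path/to/site/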
