I'm trying to automate the deployment of my website to a remote server using GitLab and its CI/CD facilities. I'm using a static site generator called Middleman that generates all the files into a build folder. This works locally, and if I were to upload the files manually, the result would be precisely what I want. The trouble is with the following command, even though it generates all the files correctly:
$ lftp -c "set ftp:ssl-allow no; open -u $USERNAME,$PASSWORD $HOST; mirror -Rv build/ ./public_html --delete-first --ignore-time --parallel=10 --exclude-glob .git* --exclude .git/"
This is what ends up being spat out:
Removing old file `stylesheets/styles-bb5541bd.css'
Removing old file `stylesheets/styles-bb5541bd.css.gz'
Transferring file `stylesheets/styles-4deda93b.css'
Transferring file `stylesheets/styles-4deda93b.css.gz'
As you can see, I'm using asset hashes. The upload picks up the newly hashed stylesheet, but the change isn't reflected in the individual HTML files that refer to the new hash... what gives? Those HTML files were updated in the build, but they aren't being uploaded.
Any help on this is greatly appreciated.
It's probably too late for you, but the issue is "--ignore-time". With that flag, a file whose size remains the same won't be re-uploaded. So changing 'stylesheets/styles-bb5541bd.css' to 'stylesheets/styles-4deda93b.css' inside your HTML file doesn't change the file's size, hence the file won't be mirrored.
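So dropping --ignore-time from the mirror command should fix it: mirror then also compares timestamps, and a fresh CI checkout always has newer mtimes, so changed HTML files are re-uploaded even when their size is identical. A sketch based on the command in the question, with only that flag removed:
lftp -c "set ftp:ssl-allow no; open -u $USERNAME,$PASSWORD $HOST; mirror -Rv build/ ./public_html --delete-first --parallel=10 --exclude-glob .git* --exclude .git/"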
Update:
Because of these issues with direct LFTP usage I now use git-ftp.
Here's my .gitlab-ci.yml
https://gist.github.com/westhouseit/5310a21ca6e6218ebc20ba94530bb0a6
and .git-ftp-ignore
https://gist.github.com/westhouseit/d3e84f3c26d733b286c0481f957052ef
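For reference, a git-ftp deploy job in .gitlab-ci.yml looks roughly like this (this is only a sketch, not the contents of the gists above; the variable names are assumptions):
deploy:
  stage: deploy
  script:
    - apt-get update -qq && apt-get install -y -qq git-ftp
    # the very first deploy needs `git ftp init` instead of `push`
    - git ftp push --user "$FTP_USERNAME" --passwd "$FTP_PASSWORD" "ftp://$FTP_HOST/public_html/"
  only:
    - master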
I am building a web application for an online "build your own" card game. In the application, I have a cards.json file that holds custom card data. This file is changed with fs whenever a user creates a card. Whenever I push local changes, the cards.json file gets overwritten on deploy, which means all the remote data gets lost on every deploy. How can I keep a cards.json file remotely without overwriting it whenever I push changes using git push heroku master?
EDIT: For clarification, I have tried using a .gitignore as well as removing the file from the staging area. I'm not entirely sure, but I think the issue is that the file is overwritten on the server when the application is deployed.
So I just found out that data created during runtime will always be deleted/reset whenever the dyno restarts or the app is redeployed.
https://devcenter.heroku.com/articles/dynos#ephemeral-filesystem
I guess the best fixes for anyone else who has this same issue are:
a) Look into Databases and Heroku Add-ons, or
b) This is very much a workaround, and there might be better ways to do it, but:
# Go into a new directory and run:
$ heroku ps:copy <FILENAME> --app <APPNAME>
# Then copy and paste the data from this file into your main repo.
# Each time you do this, make sure you delete the copied file from the extra
# directory you created, as ps:copy only works when the file doesn't exist locally.
I think git fetch doesn't work in this instance, as it only pulls that unchanged file, rather than the changed one from the dyno.
Look up the .gitignore file in Git; it seems to me that's exactly what you're looking for.
If it doesn't recognize .gitignore properly at first:
git add [uncommitted changes you want to keep] && git commit
git rm -r --cached .
git add .
git commit -m "fixed untracked files"
In .gitignore, add cards.json along with its path, e.g.:
src/test/resources/testdata/cards.json
We are currently moving away from our deployment provider, Beanstalk, which is great, but we are on the top tier and keep running out of space or hitting our repository limits. Since we are moving away anyway, please do not suggest another SaaS provider.
I personally use GitLab for my own projects and a few company projects, and it's amazing; we use a self-hosted version on a local server in our company building.
We have CI set up and are currently using the following deployment code (I have trimmed it down to just the development deployment). It uses the shell executor for deploying, as we deploy to an existing Linux server.
variables:
  HOSTNAME: '<hostname>'
  USERNAME: '<username>'
  PASSWORD: '<password>'
  PATH_DEV: '/path/to/www'
# Define the stages (we can add as many as we want)
stages:
  # - build
  - deploy
# The job for development deployment
deploy_dev:
  stage: deploy
  script:
    - echo "Deploying to development environment..."
    - rm .gitlab-ci.yml
    - rsync -urltvz --filter=':- .gitignore' --exclude=".git" -e "sshpass -p $PASSWORD ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null" * $USERNAME@$HOSTNAME:$PATH_DEV
    - echo "Finished deploying."
  environment:
    name: Development
    url: http://dev.domain.com
  only:
    - envdev
The Problem:
When we use the above code to deploy, it works really well and deploys all the code after optimisation etc., but we have found a little bug.
When you delete a file, the rsync command will not delete it on the server. I did some searching and found the --delete flag you can add, and it worked (roughly the command sketched below), but it deleted all the user-uploaded content as well. I then added the .gitignore into the filtering, so rsync would ignore the files listed there (which are usually user-generated, configuration files and/or libraries such as npm packages). This was fine until a user started uploading files through the media manager in our framework, which stores them in a folder that is not in the .gitignore file. It can't be, because that folder also contains our own files, which we deploy so they're editable by the user. So now I am unsure how to manage this.
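For context, the variant with deletion enabled looked roughly like this (reconstructed from the job above, so treat the exact flags as an assumption):
# Mirror the checkout and delete files on the server that no longer exist locally.
# The .gitignore filter keeps ignored paths out of the sync (and out of deletion),
# but it cannot protect server-only folders, such as the media manager uploads,
# that are not listed in .gitignore.
rsync -urltvz --delete --filter=':- .gitignore' --exclude=".git" -e "sshpass -p $PASSWORD ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null" * $USERNAME@$HOSTNAME:$PATH_DEV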
What we are looking for is a CI setup that will upload only the changed files to the server: it would look through the latest commits, find which files have changed, and push only those files up. Of course I would like to keep doing this with GitLab CI, so any ideas, examples or tutorials would be amazing.
Thanks in advance.
~ Danny
Maybe this helps: https://github.com/banago/PHPloy
This tool looks designed for PHP projects, but I think it can be used for other web deployments too.
How it works:
PHPloy stores a file called .revision on your server. This file contains the hash of the commit that you have deployed to that server. When you run phploy, it downloads that file and compares the commit reference in it with the commit you are trying to deploy to find out which files to upload. PHPloy also stores a .revision file for each submodule in your repository.
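In other words, it deploys by diffing commits rather than comparing files on disk. A rough sketch of the underlying idea (this is not PHPloy's actual code; the file handling is simplified and the paths are assumptions):
# .revision on the server holds the hash of the last deployed commit.
LAST_DEPLOYED=$(cat .revision)               # assume this was just downloaded from the server
git diff --name-only "$LAST_DEPLOYED" HEAD   # files changed since the last deploy -> upload these
git rev-parse HEAD > .revision               # new hash, uploaded again after the files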
I run my pages job and it passes, but with the following message at the end:
Uploading artifacts...
WARNING: public: no matching files
Uploading artifacts to coordinator... ok
Job succeeded
The website appears not to be served. All the build steps succeeded without error. I tried the build locally on my machine and verified it is correct. The website's entry point is index.html (I guess that's correct?).
How can I troubleshoot this problem? It would be nice if I could run the job "manually" so I could check a few things after the files are built on the CI machine. That way I wouldn't have to commit and push a new .gitlab-ci.yml every time I want to check or try something.
Any suggestions are highly appreciated! Thanks!
P.S.: I build the website using Sphinx if that is of importance.
Edit - Some details
I build the documentation via Sphinx's Makefile (which is part of my documentation's source). Sphinx reports that the files are placed in build/html (I confirmed this on my local machine), and I copy them to the public folder. Here's the corresponding excerpt of my ci.yaml:
- make html
- mkdir ~/.public
- cp -r build/html/* ~/.public/
- cd
- mv .public public
I don't know what information from Sphinx's conf.py could be relevant here; I've scanned through it and it doesn't seem to be corrupted (also, the local build works).
As output I obtain an index.html plus several other HTML files which are linked from index.html. This all gets placed in ~/public.
I would really appreciate being able to run those build steps manually on the build server, as then I could take a look at the built files and maybe figure out what's wrong. I didn't find any documentation saying this is possible; however, I also don't think that's really the idea behind CI. Right now I'm not sure how to tackle this problem, since it builds fine on my machine but I can't access the build server directly.
Edit 2
I verified the generated files with
ls -al ~/public
in my ci.yaml, and they are all in the correct place. In particular:
$ ls -al ~/public
[...]
-rw-r--r--. 1 root root 5621 Apr 13 23:31 index.html
[...]
So it seems GitLab Pages is expecting something other than, or in addition to, index.html? I've run the Jekyll example from their examples repository and it worked fine with just an index.html. But maybe Jekyll produces some more files during the build process.
According to this documentation and this tutorial, GitLab Pages will only consider a folder named public that resides inside the project's directory. That is, the HTML content should go to ~/projectname/public instead of ~/public.
I think I got bitten by this problem. ~/public in a Docker image, where you are connected as root, is actually /root/public :) and not what GitLab Pages expects.
You should try
mv build/html public
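For example, a pages job along these lines should work (a sketch built from the steps in the question; the Makefile target and output path come from there):
pages:
  script:
    - make html
    - mv build/html public      # public/ must sit inside the project directory
  artifacts:
    paths:
      - public
  only:
    - master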
I've been trying to make my user-uploaded data on OpenShift publicly accessible. However, I can't seem to make it work in any way.
I'm using NodeJS to upload the files to process.env.OPENSHIFT_DATA_DIR via express4 and fs.
The files upload just fine. However, I've read plenty of messages saying that I should link the folders together using "ln -sf ../route/to/app-root/data/folder linked_folder", which I've done, but I still cannot access them publicly.
I honestly don't know what else I should do. Do the files sync automatically? That doesn't seem to be the case. Or should I be uploading to my repo folder, with OpenShift automatically linking it to the data dir folder?
My current exact setup when doing "ln" is:
cd app-root/repo/public/
ln -sf ../../data/user-files user-files
I do this to link the user-files folder in repo/public to the OpenShift data/user-files folder.
The thing is that I can't access the files from the front end via "ln" at all. No clue where to go from here.
All you need is:
1. Store all your files in the OPENSHIFT_DATA_DIR directory.
2. Write a script that runs before server.js or app.js; its job is to copy all data from OPENSHIFT_DATA_DIR to your desired directory inside your repo, such as the public directory or wherever you want.
SAMPLE: initDataBeforeRun.js
var fs = require('fs');
// Copy each file from the persistent data dir into ./public (assumes a flat directory)
fs.readdirSync(process.env.OPENSHIFT_DATA_DIR).forEach(function (name) {
    fs.writeFileSync('./public/' + name, fs.readFileSync(process.env.OPENSHIFT_DATA_DIR + '/' + name));
});
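To make it run before the server, one option is an npm prestart script, which npm executes automatically before start (a sketch; adjust the file names to your project):
{
  "scripts": {
    "prestart": "node initDataBeforeRun.js",
    "start": "node server.js"
  }
}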
I'm having a weird issue when pushing my app to Heroku.
It's an AngularJS front-end app with a basic Node.js server so it can run on Heroku.
I'm pushing a deployment branch with the whole app already "compiled" by Grunt into a /dist folder.
My problem: in the /dist/public directory I have 4 folders (js, css, img and fonts), but after a push, checking on the dyno with heroku run bash, only the img one is in /dist/public; the 3 others aren't there.
I tried a new push, renaming the public folder to another name (i.e. shared), and this time all 4 folders are there. So it seems Heroku is doing something with folders named public, but I can't figure out why, or how to avoid this suppression/ignoring.
Has any of you encountered the same issue, and how can I resolve it without having to rename my public folder?
EDIT: Adding my .gitignore file for those of you wondering about it:
/.vagrant/machines
/node_modules
/app/bower_components
/.sass-cache
/test
/app/src/lib/config.js
/dist
Do a git add -f dist/public/js dist/public/css dist/public/fonts from within your repo.
You have a .gitignore rule for /dist, which will ignore any files within /dist and its subdirectories unless they are already being tracked. My guess is that the newly generated files were not being tracked earlier, and hence they were silently ignored.
The -f flag in the git add above adds them forcefully (overriding the ignore rule), so you will be able to commit them.
If there are only a few files, and you want to avoid adding the whole folders, I would suggest adding each of the individual files forcefully (i.e., with the -f flag).
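Putting it together, the sequence looks something like this (the name of the deployment branch is an assumption):
# Force-add the generated assets despite the /dist ignore rule, then deploy
git add -f dist/public/js dist/public/css dist/public/fonts
git commit -m "Add compiled assets"
git push heroku deployment:master   # push your deployment branch to Heroku's master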