.gitlab-ci.yml - how to sync to ftp on repo root - gitlab

I'm using this .gitlab-ci.yml file to sync my data to an external FTP server.
image: ubuntu:18.04

before_script:
  - apt-get update -qy
  - apt-get install -y lftp

build:
  script:
    # Sync to FTP
    - lftp -c "set ssl:verify-certificate no; open $FTP_LOCATION; user $FTP_USERNAME $FTP_PASSWORD; mirror -Rev / Apps/ --ignore-time --parallel=10; "
Since I'm not syncing a specific folder but the repo root, it appears to upload the entire Ubuntu image filesystem to the FTP server.
How can I sync just the files in the repo root without syncing the whole Ubuntu image?
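The likely culprit is the local source path: with the reverse (-R) mirror, `mirror -Rev / Apps/` reads from `/`, the container's filesystem root, not the checked-out repo. A minimal sketch of a fix, assuming the job should upload only the project checkout (GitLab's predefined $CI_PROJECT_DIR points there):

```yaml
build:
  script:
    # mirror the checked-out repository, not the container's filesystem root
    - lftp -c "set ssl:verify-certificate no; open $FTP_LOCATION; user $FTP_USERNAME $FTP_PASSWORD; mirror -Rev $CI_PROJECT_DIR Apps/ --ignore-time --parallel=10 --exclude-glob .git*"
```

Since the job's working directory is already $CI_PROJECT_DIR, `mirror -Rev ./ Apps/` should work equally well.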

Related

FTP Send package publish DotNet Framework GitLab-ci-yml

I'm trying to publish an ASP.NET MVC 5 project via FTP using GitLab CI/CD.
I configured the runner as described at https://medium.com/@gabriel.faraday.barros/gitlab-ci-cd-with-net-framework-39220808b18f
I'm having difficulty with the last step: taking the generated publish output and sending it to another server via FTP. Since the runner executes scripts with PowerShell, the lftp call generates an error in the build.
Can you help me?
Here is my YAML code:
variables:
  NUGET_PATH: 'C:\Tools\Nuget\nuget.exe'
  MSBUILD_PATH: 'C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\MSBuild\15.0\Bin\msbuild.exe'

build_job:
  stage: build
  cache:
    key: build-package
    policy: push
  script:
    - echo "*****Nuget Restore*****"
    - '& "$env:NUGET_PATH" restore'
    - echo "*****Build Solution*****"
    - '& "$env:MSBUILD_PATH" /p:Configuration=Release /clp:ErrorsOnly'
    - '& "$env:MSBUILD_PATH" FisioSystem.MVC\FisioSystem.MVC.csproj /p:DeployOnBuild=True /p:Configuration=Release /P:PublishProfile=Publish_FisioSystems.pubxml'
    - echo "*****Install lftp*****"
    - apt-get update -qq && apt-get install -y -qq lftp
    - echo "*****Upload file to ftp*****"
    - lftp -c "set ftp:ssl-allow no; open -u $FTP_USERNAME,$FTP_PASSWORD $FTP_HOST; mirror -R C:/Deploy/ ./../manager --ignore-time --parallel=10 --exclude-glob .git* --exclude .git/; quit"
  artifacts:
    name: "$CI_JOB_NAME-$CI_COMMIT_REF_NAME"
    when: always
    paths:
      - ./FisioSystem.MVC/bin/release
    expire_in: 1 week
  only:
    - master
Thanks!
If your GitLab runner is a custom Windows machine, install lftp on it manually. The command will then be available in your pipeline (apt-get is a Linux package manager and won't work under PowerShell).
After you have installed lftp on the runner, just remove these lines from the pipeline:
- echo "*****Install lftp*****"
- apt-get update -qq && apt-get install -y -qq lftp

Change directory in pipeline - Bitbucket

My folder structure:
-backend
-frontend
My reactapp is placed in frontend directory.
image: node:10.15.3

pipelines:
  default:
    - step:
        caches:
          - node
        script: # Modify the commands below to build your repository.
          - yarn install
          - yarn test
          - yarn build
This one fails. How do I change into the frontend directory before running these commands?
Bitbucket Pipelines runs your step in a single cloud container.
So, just as in a local command-line session, you can navigate using commands like cd and mkdir.
image: node:10.15.3

pipelines:
  default:
    - step:
        caches:
          - node
        script: # Modify the commands below to build your repository.
          - cd frontend
          - yarn install
          - yarn test
          - yarn build
          - cd ../ # if you need to go back
          # Then you will probably need to deploy your app, so you can use:
          - apt-get update
          - apt-get -qq install git-ftp
          - git ftp push --user $FTP_USERNAME --passwd $FTP_PASSWORD $FTP_HOST
If you need to check the syntax of your yml file, try Bitbucket's online pipelines validator.

GitLab FTP deploy - Job failed: execution took longer than 1h0m0s seconds

I'm new to GitLab and CI, but I want to deploy from a GitLab repo to FTP via lftp.
The job reaches the lftp step, keeps running for an hour, and then returns:
ERROR: Job failed: execution took longer than 1h0m0s seconds
.gitlab-ci.yml
...
deploy:
  stage: deploy
  image: mwienk/docker-lftp:latest
  only:
    - dev
  script:
    - lftp -c "set ftp:ssl-allow no; open -u $FTP_USERNAME,$FTP_PASSWORD -p $FTP_PORT $FTP_HOST; mirror -Rev ./ gitlab --ignore-time --parallel=10 --exclude-glob .git* --exclude .git/"
...
I also tried:
script:
  - apt-get update -qq && apt-get install -y -qq lftp
It's the SFTP protocol; maybe lftp is asking for something in the background and never continuing? Nothing gets uploaded to the FTP server. Any advice?
Using SFTP, you should try port 22 and prefix your host like so: sftp://example.com
Very useful tools are also the lftp debug command and the --verbose flag for the mirror command. Just include them in your script like so:
lftp -c "set ftp:ssl-allow no; debug; open -u $FTP_USERNAME,$FTP_PASSWORD -p $FTP_PORT $FTP_HOST; mirror -Rev ./ gitlab --verbose --ignore-time --parallel=10 --exclude-glob .git* --exclude .git/"
Also you should try to install lftp with:
apt-get update -qq && apt-get install -y -qq lftp
since this version includes the library libgnutls for supporting secure connections.
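A job that hangs at the lftp step over SFTP is often lftp's underlying ssh process waiting for an interactive host-key confirmation that never arrives in CI. A minimal sketch of one workaround, assuming you accept the risk of skipping host-key verification on a disposable runner:

```yaml
script:
  # tell lftp's ssh subprocess not to block on the host-key prompt
  - lftp -c "set sftp:connect-program 'ssh -a -x -o StrictHostKeyChecking=no'; open -u $FTP_USERNAME,$FTP_PASSWORD -p 22 sftp://$FTP_HOST; mirror -Rev ./ gitlab --verbose --parallel=10 --exclude-glob .git*"
```

The debug output mentioned above should confirm whether the connection stalls during the SSH handshake.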
This is the configuration which worked on my setup deploying with FTP:
.gitlab-ci.yml
...
deploy:
  stage: deploy
  script:
    - apt-get update -qq && apt-get install -y -qq lftp
    - lftp -u $FTP_USER,$FTP_PASS $HOST -e "mirror -e -R -p ./dist/ new/ ; quit"
    - echo "deployment complete"
  # specify environment this job is using
  environment:
    name: staging
    url: http://example.com/new/
  # needs artifacts from previous build
  dependencies:
    - build
lftp documentation: https://lftp.yar.ru/lftp-man.html

Use Gitlab Pipeline to push data to ftpserver

I want to deploy to an FTP server using a GitLab pipeline.
I tried this code:
deploy: # You can name your job however you like
  stage: deploy
  only:
    - master
deploy:
  script:
    - apt-get update -qq && apt-get install -y -qq lftp
But I get an error message. What is the best way to do this? :)
Then add the following code to your .gitlab-ci.yml file.
variables:
  HOST: "example.com"
  USERNAME: "yourUserNameHere"
  PASSWORD: "yourPasswordHere"

deploy:
  script:
    - apt-get update -qq && apt-get install -y -qq lftp
    - lftp -c "set ftp:ssl-allow no; open -u $USERNAME,$PASSWORD $HOST; mirror -Rnev ./public_html ./ --ignore-time --parallel=10 --exclude-glob .git* --exclude .git/"
  only:
    - master
The above code pushes the contents of the public_html folder in your GitLab repository to the root of your FTP server, uploading only newer files.
Just update the HOST, USERNAME and PASSWORD variables with your FTP credentials, commit this file to your GitLab repository, and you are good to go.
From then on, whenever you push changes to the master branch, GitLab will automatically deploy them to your remote FTP server.
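Note that hardcoding credentials in .gitlab-ci.yml exposes them to anyone who can read the repository. A safer sketch, assuming you define FTP_HOST, FTP_USERNAME and FTP_PASSWORD as masked variables under Settings > CI/CD > Variables in your GitLab project:

```yaml
deploy:
  script:
    - apt-get update -qq && apt-get install -y -qq lftp
    # $FTP_* values come from masked CI/CD variables, not from the repo
    - lftp -c "set ftp:ssl-allow no; open -u $FTP_USERNAME,$FTP_PASSWORD $FTP_HOST; mirror -Rnev ./public_html ./ --ignore-time --parallel=10 --exclude-glob .git* --exclude .git/"
  only:
    - master
```

Masked variables are hidden in job logs, and protected variables are only exposed to pipelines on protected branches.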
Got it :)
image: mwienk/docker-git-ftp

deploy_all:
  stage: deploy
  script:
    - git config git-ftp.url "ftp://xx.nl:21/web/new.xxx.nl/public_html"
    - git config git-ftp.password "xxx"
    - git config git-ftp.user "xxxx"
    - git ftp init
    #- git ftp push -m "Add new content"
  only:
    - master
Try this. GitLab has a CI Lint tool that helps with formatting errors; in this case the linter flagged the duplicate deploy key as an error.
deploy:
  stage: deploy
  only:
    - master
  script:
    - apt-get update -qq && apt-get install -y -qq lftp
I use this:
deploy:
  script:
    - apt-get update -qq && apt-get install -y -qq lftp
    - lftp -c "set ftp:ssl-allow no; open -u $FTP_USERNAME,$FTP_PASSWORD $FTP_HOST; mirror -v ./ $FTP_DESTINATION --reverse --ignore-time --parallel=10 --exclude-glob .git* --exclude .git/"
  environment:
    name: production
  only:
    - master

Upload submodule to FTP server using Bitbucket pipeline

Is there any way to upload git submodules to an FTP server using a Bitbucket pipeline?
I'm able to upload the main repo to the FTP server, but not its submodules.
The code I used is as follows:
# This is a sample build configuration for Other.
# Check our guides at https://confluence.atlassian.com/x/5Q4SMw for more examples.
# Only use spaces to indent your .yml configuration.
# -----
# You can specify a custom docker image from Docker Hub as your build environment.
# image: atlassian/default-image:latest
pipelines:
  default:
    - step:
        script:
          - apt-get update
          - apt-get -qq install git-ftp
          - git submodule update --init --recursive
          - git ftp push --user $FTP_USERNAME --passwd $FTP_PASSWORD $FTP_SERVER
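A likely explanation: git ftp push only uploads files tracked by the top-level repository, and submodule contents are recorded there as gitlinks rather than files, so they get skipped. One workaround is to run git-ftp inside each submodule against the matching subdirectory on the server. A sketch, with my-submodule as a hypothetical submodule path:

```yaml
        script:
          - apt-get update
          - apt-get -qq install git-ftp
          - git submodule update --init --recursive
          - git ftp push --user $FTP_USERNAME --passwd $FTP_PASSWORD $FTP_SERVER
          # push the submodule's own tracked files into its subfolder on the server
          - cd my-submodule
          - git ftp init --user $FTP_USERNAME --passwd $FTP_PASSWORD $FTP_SERVER/my-submodule
```

Use git ftp init for the first upload of the submodule; subsequent pipeline runs would use git ftp push there instead.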