git-ftp fatal: Remote host not set - gitlab

I'm getting this error after the pipeline runs:
Preparing to unpack .../git-ftp_1.3.1-1_all.deb ...
Unpacking git-ftp (1.3.1-1) ...
Setting up libcurl3:amd64 (7.52.1-5+deb9u6) ...
Processing triggers for libc-bin (2.24-11+deb9u3) ...
Setting up curl (7.52.1-5+deb9u6) ...
Setting up git-ftp (1.3.1-1) ...
$ git ftp push --user $FTP_USERNAME --passwd $FTP_PASSWORD sftp://$FTP_HOST
fatal: Remote host not set.
ERROR: Job failed: exit code 1
This is my .yml config:
image: samueldebruyn/debian-git

stage_deploy:
  only:
    - develop
  script:
    - apt-get update
    - apt-get -qq install git-ftp
    - git ftp push --user $FTP_USERNAME --passwd $FTP_PASSWORD sftp://$FTP_HOST
A month ago it worked fine. The values of the variables are correct.
Any ideas?

Before you run git ftp push you have to initialize git-ftp, so add this line:
- git ftp init --user $FTP_USERNAME --passwd $FTP_PASSWORD ftp://$FTP_HOST
Note: this only works the first time.
Alternatively, if you want to run it on every commit, add this line before git ftp push:
- git reset --hard
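For context, a minimal sketch of the whole first-run job (assuming nothing has been deployed to the server yet; if the files already exist there, git ftp catchup marks everything as up to date instead of re-uploading it):

stage_deploy:
  only:
    - develop
  script:
    - apt-get update
    - apt-get -qq install git-ftp
    # First deployment only: uploads all tracked files and records the
    # deployed commit on the server; use `git ftp push` afterwards
    - git ftp init --user $FTP_USERNAME --passwd $FTP_PASSWORD sftp://$FTP_HOST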

I finally solved it by pressing the "Clear Runner Caches" button and running the pipeline again.

Related

How can I stop Jenkins from resetting user permissions after each build?

I have a 'hello world' NodeJS project I'm trying to build on a fresh install of Jenkins (running on Ubuntu 18.04.3 LTS on a Digital Ocean server). I have limited experience with Linux, so please let me know if there's anything I'm missing here.
Jenkins build "execute shell":
whoami
npm install
./script/test
Jenkins console output:
Started by user twulz
Running as SYSTEM
Building in workspace /var/lib/jenkins/workspace/node-app
No credentials specified
> git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
> git config remote.origin.url https://github.com/Twulz/node-app.git # timeout=10
Fetching upstream changes from https://github.com/Twulz/node-app.git
> git --version # timeout=10
> git fetch --tags --progress -- https://github.com/Twulz/node-app.git +refs/heads/*:refs/remotes/origin/*
> git rev-parse refs/remotes/origin/master^{commit} # timeout=10
> git rev-parse refs/remotes/origin/origin/master^{commit} # timeout=10
Checking out Revision 81d9f909cfd34cd5eb65a123dd9f2a1e67686512 (refs/remotes/origin/master)
> git config core.sparsecheckout # timeout=10
> git checkout -f 81d9f909cfd34cd5eb65a123dd9f2a1e67686512
Commit message: "Solve vulnerabilities"
> git rev-list --no-walk 81d9f909cfd34cd5eb65a123dd9f2a1e67686512 # timeout=10
[node-app] $ /bin/sh -xe /tmp/jenkins4626689890347957662.sh
+ whoami
jenkins
+ npm install
audited 12356 packages in 4.203s
found 0 vulnerabilities
+ ./script/test
/tmp/jenkins4626689890347957662.sh: 4: /tmp/jenkins4626689890347957662.sh: ./script/test: Permission denied
Build step 'Execute shell' marked build as failure
Finished: FAILURE
So I think the problem is that the jenkins user does not have permission to execute the test script /var/lib/jenkins/workspace/node-app/script/test.
On the command line I ran these commands to try to change the permissions:
$ cd /var/lib/jenkins/workspace/node-app/script
$ ls -l test
-rw-r--r-- 1 jenkins jenkins 51 Sep 22 07:30 test
$ sudo chmod -R 757 /var/lib/jenkins/workspace/node-app
$ sudo systemctl restart jenkins
$ ls -l test
-rwxr-xrwx 1 jenkins jenkins 51 Sep 22 07:35 test
Then I selected "Build Now" on my Jenkins project, and immediately afterwards ran again:
$ ls -l test
-rw-r--r-- 1 jenkins jenkins 51 Sep 22 07:50 test
I also tried giving recursive permissions to the whole jenkins folder, or just to the test file, but both still failed: sudo chmod -R 757 /var/lib/jenkins or sudo chmod -R 757 /var/lib/jenkins/workspace/node-app/script/test
I've re-applied and re-saved my configuration in Jenkins, as suggested in another thread, but there was no change.
So something in the build process is resetting the permissions - how can I ensure the jenkins user retains the right permissions to run this script?
So I found the answer; I hope this helps others. Git tracks the executable permission on files too, so each time Jenkins pulled the latest code, it reset the permissions to whatever was committed in the repo. I was initially working on Windows, so I didn't realise this would be an issue.
To fix the problem I simply had to clone the repo, change the permissions and commit again:
$ git clone https://github.com/Twulz/node-app.git
$ sudo chmod -R 757 node-app
$ cd node-app/
$ git add .
$ git status
$ git commit -m "Update execute permissions"
$ git push
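As a side note (not from the original answer): when the working copy is on Windows, where chmod has no effect, the executable bit can be set directly in git's index instead. A minimal sketch, using the script path from the question:

# Mark the script as executable in the index; no filesystem chmod required
$ git update-index --chmod=+x script/test
$ git commit -m "Make test script executable"
$ git push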

How to pull submodules with --remote within Gitlab CI?

I need my Gitlab CI to update submodules with the --remote flag so that each submodule's HEAD is set to the remote's HEAD. After a bit of Googling I found that I need to set GIT_SUBMODULE_STRATEGY to none and run git submodule update --recursive --remote --init manually:
variables:
  GIT_STRATEGY: clone
  GIT_SUBMODULE_STRATEGY: none

before_script:
  - apk add git || ( apt-get update && apt-get -y install git )
  - git submodule update --recursive --remote --init

test:build:
  services:
    - docker:dind
  image: ubuntu
  variables:
    DOCKER_HOST: tcp://docker:2375
    DOCKER_DRIVER: overlay2
  script:
    - echo "done"
Unfortunately I'm getting a CI failure (names edited):
$ git submodule update --recursive --remote --init
Submodule 'current_project_name/submodule_project_name' (ssh://git@gitlab.someserver.net:9931/someorg/submodule_project_name.git) registered for path 'current_project_name/submodule_project_name'
Cloning into '/builds/someorg/current_project_name/current_project_name/submodule_project_name'...
Host key verification failed.
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
fatal: clone of 'ssh://git@gitlab.someserver.net:9931/someorg/submodule_project_name.git' into submodule path '/builds/someorg/current_project_name/current_project_name/submodule_project_name' failed
Failed to clone 'current_project_name/submodule_project_name'. Retry scheduled
Cloning into '/builds/someorg/current_project_name/current_project_name/submodule_project_name'...
Host key verification failed.
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
fatal: clone of 'ssh://git@gitlab.someserver.net:9931/someorg/submodule_project_name.git' into submodule path '/builds/someorg/current_project_name/current_project_name/submodule_project_name' failed
Failed to clone 'current_project_name/submodule_project_name' a second time, aborting
I can see that the CI does have permission to clone submodule_project_name, because if I set GIT_SUBMODULE_STRATEGY to e.g. recursive, the CI manages to pull it (but not with --remote, so it doesn't work the way I want). Unfortunately, when my before_script tries to do it, I get the error above. How can I get past it?
I mentioned before updating the ~/.ssh/known_hosts file, as in here.
That is not needed when GitLab fetches the submodules itself, which is not what you are doing here with GIT_SUBMODULE_STRATEGY set to none: your before_script clones over SSH on its own, so host key verification applies.
With dind (Docker In Docker), consider also this thread, regarding ssh-add for private keys, and .dockerini / .dockerenv SSH directives.
The OP d33tah confirms in the comments:
I actually didn't add any key, assuming that since Gitlab CI's defaults can pull the submodule, I should be able to as well.
Then I found that the docs say I needed a deploy key, and I added one.
Yes: adding the public key on Gitlab side is mandatory.
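For completeness, a sketch of how the before_script could authenticate, assuming the deploy key's private half is stored in a CI variable (SSH_PRIVATE_KEY is an assumed name, not from the thread):

before_script:
  - apk add git openssh-client || ( apt-get update && apt-get -y install git openssh-client )
  # Trust the GitLab host so the ssh:// clone passes host key verification
  - mkdir -p ~/.ssh
  - ssh-keyscan -p 9931 gitlab.someserver.net >> ~/.ssh/known_hosts
  # Load the deploy key from the CI variable
  - eval $(ssh-agent -s)
  - echo "$SSH_PRIVATE_KEY" | ssh-add -
  - git submodule update --recursive --remote --init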

Use Gitlab Pipeline to push data to an FTP server

I want to deploy to an FTP server using a Gitlab pipeline.
I tried this code:
deploy: # You can name your task however you like
  stage: deploy
  only:
    - master
deploy:
  script:
    - apt-get update -qq && apt-get install -y -qq lftp
But I get an error message. What is the best way to do this? :)
Add the following code to your .gitlab-ci.yml file.
variables:
  HOST: "example.com"
  USERNAME: "yourUserNameHere"
  PASSWORD: "yourPasswordHere"

deploy:
  script:
    - apt-get update -qq && apt-get install -y -qq lftp
    - lftp -c "set ftp:ssl-allow no; open -u $USERNAME,$PASSWORD $HOST; mirror -Rnev ./public_html ./ --ignore-time --parallel=10 --exclude-glob .git* --exclude .git/"
  only:
    - master
The above code pushes the recently modified files from your Gitlab repository's public_html folder to the root of your FTP server.
Just update the variables HOST, USERNAME and PASSWORD with your FTP credentials, commit this file to your Gitlab repository, and you are good to go.
Now whenever you make changes on your master branch, Gitlab will automatically push them to your remote FTP server.
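As a side note, credentials committed in .gitlab-ci.yml are visible to anyone with read access to the repository; GitLab can instead supply them through masked CI/CD variables (Settings > CI/CD > Variables), so the variables: block disappears from the file, along the lines of:

deploy:
  script:
    - apt-get update -qq && apt-get install -y -qq lftp
    # $FTP_USERNAME, $FTP_PASSWORD and $FTP_HOST are masked CI/CD variables
    # defined in the project settings, not committed to the repo
    - lftp -c "set ftp:ssl-allow no; open -u $FTP_USERNAME,$FTP_PASSWORD $FTP_HOST; mirror -Rnev ./public_html ./ --ignore-time --parallel=10 --exclude-glob .git* --exclude .git/"
  only:
    - master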
Got it :)
image: mwienk/docker-git-ftp

deploy_all:
  stage: deploy
  script:
    - git config git-ftp.url "ftp://xx.nl:21/web/new.xxx.nl/public_html"
    - git config git-ftp.password "xxx"
    - git config git-ftp.user "xxxx"
    - git ftp init
    #- git ftp push -m "Add new content"
  only:
    - master
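Note that git ftp init is meant for the first deployment only: it uploads all tracked files, records the deployed commit on the server, and refuses to run against a remote that was already initialized. Once it has run, later pipelines should use git ftp push (the commented-out line above), which uploads only what changed since the last push, e.g.:

deploy_all:
  stage: deploy
  script:
    - git config git-ftp.url "ftp://xx.nl:21/web/new.xxx.nl/public_html"
    - git config git-ftp.password "xxx"
    - git config git-ftp.user "xxxx"
    # After the one-time `git ftp init`, upload only the changed files
    - git ftp push
  only:
    - master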
Try this. There's a CI Lint tool in Gitlab that helps with formatting errors; the linter was flagging the extra deploy: statement.
deploy:
  stage: deploy
  only:
    - master
  script:
    - apt-get update -qq && apt-get install -y -qq lftp
I use this:
deploy:
  script:
    - apt-get update -qq && apt-get install -y -qq lftp
    - lftp -c "set ftp:ssl-allow no; open -u $FTP_USERNAME,$FTP_PASSWORD $FTP_HOST; mirror -v ./ $FTP_DESTINATION --reverse --ignore-time --parallel=10 --exclude-glob .git* --exclude .git/"
  environment:
    name: production
  only:
    - master

Upload submodule to FTP server using Bitbucket pipeline

Is there any way to upload git submodules to an FTP server using a Bitbucket pipeline?
I'm able to upload the main repo to the FTP server, but not its submodule.
The code I have used is as follows:
# This is a sample build configuration for Other.
# Check our guides at https://confluence.atlassian.com/x/5Q4SMw for more examples.
# Only use spaces to indent your .yml configuration.
# -----
# You can specify a custom docker image from Docker Hub as your build environment.
# image: atlassian/default-image:latest

pipelines:
  default:
    - step:
        script:
          - apt-get update
          - apt-get -qq install git-ftp
          - git submodule update --init --recursive
          - git ftp push --user $FTP_USERNAME --passwd $FTP_PASSWORD $FTP_SERVER
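Worth noting (background, not from the thread): git-ftp only uploads files tracked by the repository it runs in, and a submodule is recorded in the parent repository as a commit pointer rather than as files, so the parent's git ftp push will never include the submodule's contents. One possible workaround is a second push from inside the submodule, targeting a matching directory on the server ($FTP_SERVER/submodule_dir is an assumed layout):

          - git submodule update --init --recursive
          - git ftp push --user $FTP_USERNAME --passwd $FTP_PASSWORD $FTP_SERVER
          # Push the submodule's own tracked files separately; the first run
          # against an empty target directory needs `git ftp init` instead
          - cd submodule_dir
          - git ftp push --user $FTP_USERNAME --passwd $FTP_PASSWORD $FTP_SERVER/submodule_dir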

Bitbucket Pipelines Push full files to ftp

We are currently testing Bitbucket Pipelines Beta (love it so far).
However, I have a question.
We are uploading our source/ files fine using git-ftp, but on top of that we need to push the full subdomain patch files (Index.php & .htaccess). They don't change very often, but we have to push them in full each time, not just the changes.
So far we can't get it working using git-ftp.
Are we doing something wrong?
The error message we keep getting is:
git ftp push --user $Username --passwd $Pwd ftp://dev.iwantaspeaker.com/public_html/
No changed files for dev.iwantaspeaker.com/public_html/. Everything up-to-date.
I have included some of the code below and hope you can help. Thanks.
image: samueldebruyn/debian-git

pipelines:
  branches:
    develop:
      - step:
          script:
            - echo "Pipeline Init for dev."
            - apt-get update
            - apt-get -qq install git-ftp
            - echo "'_$(git status -uno --porcelain | wc -l)_'"
            - git status -uno --porcelain
            - echo "Initiating Push site:dev Source."
            - git config git-ftp.syncroot Source/
            - git ftp push --user $Username --passwd $Pwd ftp://dev.iwantaspeaker.com/public_html/
            - echo "Initiating Push site:dev subDomianPatch."
            - git config git-ftp.syncroot SubDomainPatches/dev/dev_subdomain_patch/public_html/
            - git ftp push --user $Username --passwd $Pwd ftp://dev.iwantaspeaker.com/public_html/
You can use the file .git-ftp-include:
https://github.com/git-ftp/git-ftp/blob/master/man/git-ftp.1.md#syncing-untracked-files
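For reference, a .git-ftp-include for the files from the question might look like the following; per the linked man page, a line starting with an exclamation mark uploads that file on every push, whether or not git-ftp sees a change (paths assumed from the question's syncroot layout):

# Always upload these, even when git-ftp detects no change
!SubDomainPatches/dev/dev_subdomain_patch/public_html/Index.php
!SubDomainPatches/dev/dev_subdomain_patch/public_html/.htaccess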
