The vinyl-ftp package has a clean() function, but I'm not sure how to use it correctly. I need to:
1. get all files from my build folder
2. put them into the target folder on my FTP server
3. clean files on the server if they're not available locally
I have the following gulp task:
gulp.task('deploy', () => {
  let conn = ftp.create({ host: host, user: user, password: password });

  return gulp.src('build/**', { base: './build/', buffer: false })
    .pipe(conn.newer('/path/on/my/server/')) // only upload newer files
    .pipe(conn.dest('/path/on/my/server/'))
    .pipe(conn.clean('build/**', './build/'));
});
1) and 2) are OK, but the clean() function does nothing.
The vinyl-ftp docs have this to say:
conn.clean( globs, local[, options] )
Globs remote files, tests if they are locally available at <local>/<remote.relative> and removes them if not.
Note that globs expects a path for the remote files on your FTP server. Since your remote files are located in /path/on/my/server/ you have to specify that path as your glob:
.pipe(conn.clean('/path/on/my/server/**', './build/'));
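Putting it together, here is a minimal sketch of the corrected deploy task (same placeholder paths as in the question; only the clean() glob changes):

gulp.task('deploy', () => {
  let conn = ftp.create({ host: host, user: user, password: password });

  return gulp.src('build/**', { base: './build/', buffer: false })
    .pipe(conn.newer('/path/on/my/server/')) // only upload newer files
    .pipe(conn.dest('/path/on/my/server/'))
    // glob the *remote* files and delete the ones that are missing from ./build/
    .pipe(conn.clean('/path/on/my/server/**', './build/'));
});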
Since I struggled a lot with this, here is a working piece of code. It removes all files from the server that don't exist locally, except the usage folder:
var connection = ftp.create({ ... });

connection.clean([
  '/*.*',
  '/!(usage)*',
  '/de/**',
  '/en/**',
  '/images/**',
  '/fonts/**',
  '/json/**',
  '/sounds/**'
], './dist', { base: '/' });
My files are located locally in the ./dist folder and remotely directly in the root directory (/) of the FTP user.
I'm using Windows 10 x64.
I'm using Dropbox to hold all my projects, and having a 'node_modules' folder inside a Dropbox folder is a disaster.
First of all, not all npm packages work in a way that would allow this method, because, as it says in line one, the path "does not begin with '/', '../', or './'", so I don't think that will work.
So I tried to make a symlink, but as I found out, Windows doesn't allow creating true hard links for folders, only for files.
My method of creating a folder junction was to have this .bat file inside each project folder and run it as administrator:
SET dest=%~dp0node_modules
SET src=F:\work\node_modules
MKLINK /J %dest% %src%
The .bat file works and creates the junction, but three problems occur:
1. Dropbox sees it as a real folder and starts syncing it.
2. That folder doesn't show up in selective sync, so I can't prevent it from syncing.
3. npm can't use it for some reason: when I run 'npm install', it creates a new real 'node_modules' folder that replaces the junction, and I get this error output:
I would not try to use junctions; they are pretty hard to work with and can lead to data loss if you don't know what you're doing.
Instead, you could go with the following approach.
Create a json config file containing the path to your modules:
{"modules_root": "F:/work/node_modules/"}
In your code, load this json and require your modules by prepending the path from your config file:
var fs = require("fs");
var modules_root = JSON.parse(fs.readFileSync("modules_root.json")).modules_root;
var express = require(modules_root + "./express")
If you ever decide to use local modules, you could just change the config file to:
{"modules_root": "./node_modules/"}
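If you do this in many places, a small helper keeps the call sites clean. A minimal sketch, assuming the same modules_root.json as above (requireFromRoot is a hypothetical helper name, not part of any package):

var fs = require("fs");
var modules_root = JSON.parse(fs.readFileSync("modules_root.json")).modules_root;

// hypothetical helper: resolve a module name against the configured root
function requireFromRoot(name) {
  return require(modules_root + name);
}

var express = requireFromRoot("express");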
I'm trying to use gulp-elm with a monolithic architecture. I have set up my project dir with client and server directories, and I've put my gulpfile in the main directory. The directory structure is pretty simple.
project/
  gulpfile.js
  package.json
  client/
    elm-package.json
  server/
    ...
When I run, for example,
gulp elm-init
with the following task:
// File paths
var paths = {
  dest: 'client/dist',
  elm: 'client/src/*.elm',
  static: 'client/src/*.{html,css}'
};

// Init Elm
gulp.task('elm-init', function(){
  return elm.init({ cwd: 'client' });
});

// Compile Elm to HTML
/*
gulp.task('elm', ['elm-init'], function(){
  return gulp.src(paths.elm)
    .pipe(plumber())
    .pipe(elm({ cwd: 'client' }))
    .pipe(gulp.dest(paths.dest));
});
*/
the elm-stuff folder and elm-package.json get moved to the main project directory. Is this expected? If not, is there a correct way to use a gulpfile in the parent directory to build an Elm package in a nested directory? I think my attempt matches this example:
gulp.task('init-nested', function(){
  return elm.init({cwd: 'elm/nested-elm/'});
});

gulp.task('nested', ['init-nested'], function(){
  return gulp.src('elm/nested-elm/*.elm')
    .pipe(elm.make({filetype: 'html', cwd: 'elm/nested-elm/'}))
    .pipe(gulp.dest('dest/'));
});
I've tried looking at the source, as well as following dependencies, to see if I could figure it out myself, but I'm relatively unfamiliar with Node, so it's hard for me to work out exactly what's going on in the gulp-elm source (or in one of the dependencies I checked out).
I was using a tutorial by Auth0 that had an old version of gulp-elm in package.json. Ugh!
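For reference, with an up-to-date gulp-elm, a setup along the lines of the commented-out task above keeps elm-stuff and elm-package.json inside client/. This is only a sketch built from the paths in the question:

var gulp = require('gulp');
var elm = require('gulp-elm');
var plumber = require('gulp-plumber');

// run elm package installation inside the nested client directory
gulp.task('elm-init', function(){
  return elm.init({ cwd: 'client' });
});

gulp.task('elm', ['elm-init'], function(){
  return gulp.src('client/src/*.elm')
    .pipe(plumber())               // keep the task alive on compile errors
    .pipe(elm({ cwd: 'client' }))  // compile relative to client/
    .pipe(gulp.dest('client/dist'));
});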
I am working on a web app that uses Node.js. In this app, I have a Gulp file. I am using Gulp 4. During my build process, I am attempting to copy multiple files to directories at once. My directory structure looks like this:
./
  dest/
  source/
    child/
      index.js
      index.bak
    file.js
    README.md
My real directory structure is more involved. However, I am trying to copy ./source/file.js to ./dest/file.js and ./source/child/index.js to ./dest/child/index.js. Notice that I do not want to copy README.md or index.bak over to the ./dest directory. In an attempt to do this, I have the following function:
function copy() {
  let files = [
    'source/file.js',
    'source/child/**/*.*'
  ];

  return gulp
    .src(files)
    .pipe(gulp.dest('dest'));
}
My problem is that everything just gets copied into the dest directory; the directory structure does not get preserved. That would be fine if I could figure out how to copy files to different directories in a single task. I tried the following:
function copy() {
  return gulp
    .src('source/child/index.js')
    .pipe(gulp.dest('dest/child'))
    .src('source/file.js')
    .pipe(gulp.dest('dest'));
}
However, that approach just generates an error that says:
TypeError: gulp.src(...).pipe(...).src is not a function
So, I'm stuck. I'm not sure how to copy multiple files to multiple directories from a single gulp task.
You need to use the base option, as described in the gulp.src documentation. It will make sure your directory structure is preserved when copying.
function copy() {
  let files = [
    'source/file.js',
    'source/child/**/*.*'
  ];

  return gulp
    .src(files, { base: 'source/' })
    .pipe(gulp.dest('dest'));
}
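Since this is Gulp 4, the function can be exposed as a task by exporting it. A minimal sketch (paths are the ones from the question):

const gulp = require('gulp');

function copy() {
  // base: 'source/' keeps the path segments after source/ intact,
  // so source/child/index.js ends up at dest/child/index.js
  return gulp
    .src(['source/file.js', 'source/child/**/*.*'], { base: 'source/' })
    .pipe(gulp.dest('dest'));
}

exports.copy = copy; // run with: gulp copy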
I am using Node.js with Express to serve some static files like SVG and JSON to the client, so I used sendFile() to send the files directly.
Here is my server file structure:
/root            // the root of the server
  /maps          // put some static files
    /routes/api  // put the web API
In the web API:
app.get('/buildings/map', function(req, res){
  var mappath = 'maps/ARM-MAP_Base.svg';
  res.sendfile(mappath);
});
It works perfectly on my local server, so the server can locate the file and send it to the client. But when the server is deployed to AWS, this method runs into an error (Error: ENOENT, stat ...); it looks like it can't open the file at that path.
I read about solutions like combining __dirname with mappath, but that didn't work, since it resolves to a path under /routes/api/maps/...
So far I have no idea why it works on my local computer but fails on AWS.
Relative fs paths like mappath will be resolved from the current working directory, which isn't guaranteed to be consistent. It works locally because you're executing your application with /root as your working directory.
This is why you're finding recommendations to use __dirname, which can be used to resolve paths relative to the current script.
Though, along with it, you'll want to use ../ to resolve parent directories.
var mappath = 'maps/ARM-MAP_Base.svg';
res.sendfile(__dirname + '/../../../' + mappath);
This assumes the current script is located in, and __dirname would be, /root/maps/routes/api, as the indentation in your directory tree suggests.
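A slightly more robust variant of the same idea uses path.join instead of string concatenation, and res.sendFile (the non-deprecated spelling in recent Express versions). A minimal sketch under the same directory assumption:

var path = require('path');

app.get('/buildings/map', function(req, res){
  // resolve the file relative to this script, independent of the working directory
  var mappath = path.join(__dirname, '..', '..', '..', 'maps', 'ARM-MAP_Base.svg');
  res.sendFile(mappath);
});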
How can I read the public directory in a Meteor application from my /server code?
I tried using the native 'fs' package, but I keep getting a file/directory not found error.
var fs = Npm.require('fs');
var files = fs.readdirSync('/public/soundfiles/');
Has anyone used the filesystem package to read static files inside a Meteor application?
I learned that it is best to put uploaded files in your private folder if you are not displaying them publicly.
In my case I need to store XML uploads and process them.
At first I wrote the XML into the public folder but that would trigger a reload.
Then I renamed the upload folder to /public/.#uploads, which stopped Meteor from reloading, but then it completely ignored that folder during the build, so the upload folder did not exist in the build (throwing an ENOENT error on read).
So I figured out it is best to put the files in /private/files and then reading goes as follows:
result = fs.readdirSync('assets/app/files')
Everything in the private folder is moved to the assets folder, where an app folder is available at runtime (you do not see that in your build folder structure).
It helps to simply dump result = fs.readdirSync('.') to see which folder you are in and look through the structure.
*** UPDATE ***
Locally, putting files in the private folder still triggered a Meteor rebuild/update (perhaps not in production), so I found another solution using the UploadServer, just to define the upload directory:
https://github.com/tomitrescak/meteor-uploads
This works for me in Meteor 1.0:
var fs = Npm.require('fs');
var xsd = fs.readFileSync(process.cwd().split('.meteor')[0] + 'server/company.xsd', 'utf8');
Access files without the "/public" part. In a running Meteor app, the public directory becomes your root, and everything that is located at /public/whatever can be accessed at /whatever.
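For example, a file stored at public/soundfiles/click.wav (a hypothetical name, based on the soundfiles folder in the question) could be fetched from client code like this sketch:

// client-side: note the /public prefix is dropped from the URL
fetch('/soundfiles/click.wav')
  .then(function (res) { return res.arrayBuffer(); })
  .then(function (buf) { console.log('got', buf.byteLength, 'bytes'); });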
Additionally, if you're playing around with files, you might find these useful:
FileSaver.js
CollectionFS
This is no longer true. For Meteor 0.8, the folder "../client/app" is public. Thus, use fs.readdirSync('../client/app') to get files and folders in public.
Source: personal experience and https://stackoverflow.com/a/18405793
For Meteor 1.0.2, public is /web.browser/app/.
Checked by entering the .meteor dir.
The full path on Linux is /home/user/your_app_name/.meteor/local/build/programs/web.browser/app/.
And to get to the root, use process.env.PWD or process.cwd().
I'm not sure if it works when deployed.
_meteor_bootstrap_.serverDir + '/assets/app'
This is the path to the private folder.
For Meteor 1.4, use server Assets.
See the official docs on Assets
http://docs.meteor.com/api/assets.html
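A minimal sketch: Assets resolves names relative to the private/ directory (the file name here is hypothetical):

// server-side only: reads private/data/example.json
const text = Assets.getText('data/example.json');
const data = JSON.parse(text);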
On the server you can use fs to access any part of the Meteor directory tree, not just /public. For example,
import fs from 'fs';
const rd = process.env.PWD;
const obj = JSON.parse(fs.readFileSync(`${rd}/private/file.json`));
would read and parse a JSON file located at private/file.json under your Meteor app's root directory.