I'm building an application which requires a few front-end libraries/frameworks, such as:
jQuery
jQuery UI
AngularJS
Foundation
I'm using Bower to download the components. At the moment my HTML looks like:
<script src="components/jquery/jquery.js"></script>
<script src="components/angular/angular.js"></script>
<script src="components/etc/etc.js"></script>
My goal is to make a Grunt script which automatically takes the installed components, concatenates and minifies them, and outputs them as lib.js.
Questions:
Through my research I have figured out how to concatenate all the files in a directory. My goal here is to get the Bower components and concatenate them without listing them one by one in the Gruntfile. How can I achieve this?
Also, is it possible to make a custom jQuery UI build with just the modules I want, instead of the entire UI?
Thanks.
usemin is your friend here.
Install usemin, copy, concat and uglify:
npm install --save-dev grunt-usemin
npm install --save-dev grunt-contrib-copy
npm install --save-dev grunt-contrib-concat
npm install --save-dev grunt-contrib-uglify
Set up a build block in your HTML file:
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8" />
<title>usemin</title>
<!-- build:js lib.js -->
<script src="components/jquery/jquery.js"></script>
<script src="components/angular/angular.js"></script>
<script src="components/etc/etc.js"></script>
<!-- endbuild -->
</head>
<body>
<h1>usemin</h1>
</body>
</html>
Set up your Gruntfile:
module.exports = function(grunt) {
grunt.initConfig({
pkg: grunt.file.readJSON('package.json'),
copy: {
dist: {
files: [ {src: 'index.html', dest: 'dist/index.html'} ]
}
},
'useminPrepare': {
options: {
dest: 'dist'
},
html: 'index.html'
},
usemin: {
html: ['dist/index.html']
}
});
grunt.loadNpmTasks('grunt-contrib-uglify');
grunt.loadNpmTasks('grunt-contrib-copy');
grunt.loadNpmTasks('grunt-contrib-concat');
grunt.loadNpmTasks('grunt-usemin');
grunt.registerTask('default', ['useminPrepare', 'copy', 'concat', 'uglify', 'usemin']);
};
Run grunt
grunt
Results:
├── Gruntfile.js
├── components
│ ├── angular
│ │ └── angular.js
│ ├── etc
│ │ └── etc.js
│ └── jquery
│ └── jquery.js
├── dist
│ ├── index.html
│ └── lib.js
├── index.html

The resulting dist/index.html:
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8" />
<title>usemin</title>
<script src="lib.js"></script>
</head>
<body>
<h1>usemin</h1>
</body>
</html>
"My goal here is to get the bower components and concat them without listing them one by one in the gruntfile"
You can take all the JavaScript files from your dependencies directory and its sub-directories and have them concatenated this way:
grunt.config('concat.mydeps', {
files: [{
src: ['components/**/*.js'],
dest: 'dist/lib.js'
}]
})
... but if the order of script execution is important, this is a recipe for disaster :-).
Also, it's quite likely that these folders will contain both minified and non-minified versions, leading you to include some scripts twice...
A way to avoid that side effect would be in the line of:
grunt.config('concat.mydeps', {
files: [{
src: ['components/**/*.js', '!components/**/*min.js'],
dest: 'dist/lib.js'
}]
})
... but again, this is certainly not bulletproof - a given component may very well ship a built version alongside its split source files.
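If you do want to go the globbing route anyway, the duplicate problem can also be tackled in plain JavaScript before handing the file list to concat. This is a minimal sketch (not part of the original answer): it drops a *.min.js file whenever an unminified sibling is also in the list, but keeps it when it is the only copy available.

```javascript
// Sketch: given a list of component files, drop minified duplicates
// when an unminified sibling exists. This mirrors the
// '!components/**/*min.js' pattern, but is less aggressive: a .min.js
// with no plain .js counterpart is kept.
function dropMinifiedDuplicates(paths) {
  // Set of every file that is not minified
  const unminified = new Set(paths.filter(p => !p.endsWith('.min.js')));
  return paths.filter(p => {
    if (!p.endsWith('.min.js')) return true;
    // keep foo.min.js only if foo.js is not also in the list
    return !unminified.has(p.replace(/\.min\.js$/, '.js'));
  });
}
```

The resulting array can then be passed as the `src` of a concat target. This still does not solve the ordering problem, of course.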
IMHO, the only sane way out is to list explicitly the files you want aggregated, in the order you need (just like you do in your html for now).
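For instance, an explicit concat config in the same style as the snippets above might look like this (the exact paths are assumptions and depend on how Bower laid out your components/ directory):

```javascript
grunt.config('concat.mydeps', {
  files: [{
    // Explicit, ordered list -- jQuery first, since other libs depend on it.
    // Paths are illustrative; adjust them to your actual components/ layout.
    src: [
      'components/jquery/jquery.js',
      'components/jquery-ui/ui/jquery-ui.js',
      'components/angular/angular.js',
      'components/foundation/js/foundation.js'
    ],
    dest: 'dist/lib.js'
  }]
});
```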
Related
I'm trying to use environment variables in "nested" dependencies of a project bundled with Parcel, using .env files, but I'm getting undefined instead of the desired value.
According to the docs, NODE_ENV is automatically set to "production" when building and to "development" otherwise, so Parcel will load either .env.production or .env.local depending on that value.
Consider my file structure:
./
├── .env.local
├── .env.production
├── src
│ ├── index.html
│ ├── scripts
│ │ ├── main.js
│ │ └── APIService.js
└── package.json
My script for launching the app in package.json is this:
"scripts": {
"start": "parcel src/index.html"
}
… and the src/index.html file loads the main.js file with a simple tag:
<!DOCTYPE html>
<!-- ... -->
<script src="main.js" defer></script>
If I try to log an environment variable set in .env.local in main.js, it will work perfectly because it is the JS entry point, but if I try to get the same exact variable into an imported module like APIService.js, I get undefined:
// main.js
import APIService from "./APIService";
console.log(process.env.API_ENDPOINT); // ✅ http://localhost:5001/functions/app
// APIService.js
console.log(process.env.API_ENDPOINT); // ❌ undefined
Am I missing something?
How to get access to the environment variables inside imported files?
PS: I've already tried to use the dotenv package in addition, but without success.
I tried to solve it with the help of the following questions:
Gitlab does not load assets
Gitlab CI - Publish Failed Test Results to Pages
How to get pelican site generate on GitLab Pages when continuous integration passes and artifacts are being built?
Cannot pass artifacts between jobs in GitLab CI/CD
But no luck.
I built a GitLab CI YAML file. I am confused by the absolute and relative paths, and the build does not detect the files and images from the assets folder.
Here is:
pages:
stage: deploy
script:
- echo "Hello Felipe and Daniel! :-)"
artifacts:
paths:
- public
only:
- gusbemacbe
I want to change public to ., but I am not sure it will work. If it does not, I will keep public as it is; either way, I want to fix the YAML file.
Here is the repository tree:
repository
├── public
│ ├── 404.html
│ ├── assets
│ │ ├── css
│ │ ├── fonts
│ │ ├── images
│ │ └── js
│ ├── index.html
│ └── nbproject
│ ├── private
│ │ ├── private.properties
│ │ └── private.xml
│ ├── project.properties
│ └── project.xml
Here is small CSS snippet:
.about
{
background-image: url('/assets/images/quem-somos#1.jpg');
background-size: cover;
background-position: center;
text-align: center;
padding: 25px 0px 20px 0px;
}
And small HTML snippet:
<!DOCTYPE html>
<html lang="pt-BR">
<head>
<link rel="icon" href="./assets/images/favicons/favicon.ico" />
<link rel="stylesheet" media='all' type="text/css" href="./assets/css/font-awesome.css">
<link rel="stylesheet" media='all' type="text/css" href="./assets/css/style.css">
<link rel="stylesheet" media='all' type="text/css" href="./assets/css/media-queries.css">
<body>
<a class="navbar-brand" href="index.html">
<img src="./assets/images/logotipo.svg" width="30" height="30" alt="">
</a>
<script src="./assets/js/jquery.js"></script>
<script src="./assets/js/popper.js"></script>
<script src="./assets/js/bootstrap.bundle.js"></script>
<script src="./assets/js/firebase/firebase-app.js"></script>
<script src="./assets/js/firebase/firebase-analytics.js"></script>
<script src="./assets/js/firebase/firebase-firestore.js"></script>
<script src="./assets/js/firebase/firebase-storage.js"></script>
</body>
</html>
You need to copy the assets into the public directory. For example:
pages:
stage: deploy
script:
- mkdir public
- cp -r assets public/
artifacts:
paths:
- public
only:
- gusbemacbe
My question is about architecture: how do I add installed libraries to index.html?
I'm working on a SPA (AngularJS, Node.js). Everything worked fine, but when I migrated from Bower to Yarn I got stuck.
I don’t understand how to add libraries to index.html.
Earlier I had the following file structure:
.
├── bower_components
| ├── …
| └── …
├── node_modules
| ├── …
| └── …
├── app
| ├── …
| └── …
├── bower.json
├── package.json
├── index.html
Earlier, wiredep automatically added the js files into the script section of index.html:
<!-- bower:js -->
<script src="../../bower_components/jquery/dist/jquery.js"></script>
<script src="../../bower_components/tether/dist/js/tether.js"></script>
<script src="../../bower_components/bootstrap/dist/js/bootstrap.js"></script>
<script src="../../bower_components/angular/angular.js"></script>
<script src="../../bower_components/angular-xeditable/dist/js/xeditable.js"></script>
<script src="../../bower_components/angular-bootstrap/ui-bootstrap-tpls.js"></script>
<script src="../../bower_components/angular-route/angular-route.js"></script>
<script src="../../bower_components/angular-file-upload/dist/angular-file-upload.min.js"></script>
<script src="../../bower_components/angular-animate/angular-animate.js"></script>
<script src="../../bower_components/angular-sanitize/angular-sanitize.js"></script>
<script src="../../bower_components/angular-read-more/dist/readmore.min.js"></script>
<script src="../../bower_components/js-xlsx/dist/xlsx.core.min.js"></script>
<script src="../../bower_components/angular-js-xlsx/angular-js-xlsx.js"></script>
<script src="../../bower_components/angular-ymaps/angular-ymaps.js"></script>
<script src="../../bower_components/angular-cookies/angular-cookies.js"></script>
<script src="../../bower_components/query-string/query-string.js"></script>
<script src="../../bower_components/angular-oauth2/dist/angular-oauth2.js"></script>
But after migrating from Bower to Yarn I have only one node_modules folder:
.
├── node_modules
| ├── … modules for the backend (Node.js)
| └── … modules for the frontend (AngularJS and so on)
├── app
| ├── …
| └── …
├── package.json
├── index.html
When I use gulp-inject to automatically add js libraries to my index.html, gulp adds all js files (more than 1,000) to index.html, most of them related to the backend.
How to solve my problem?
Try using bundles: require all libraries in a main bundle file, like this:
'use strict';
var angular = require('angular');
var moment = require('moment');
Browserify can help with that.
Hope this puts you on the right track.
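For example, a minimal Browserify workflow might look like this (the entry and output file names are illustrative, not taken from your project):

```shell
# Install browserify as a dev dependency with yarn
yarn add --dev browserify

# Bundle the app's entry file (app/main.js here is an assumed path);
# index.html then loads the single bundle.js instead of many script tags
npx browserify app/main.js -o app/bundle.js
```

Because Browserify only follows the require() calls reachable from the entry file, the backend-only modules in node_modules never end up in the bundle.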
I'm setting up my first project using serverless and while I'm finding a lot of great "getting started" tutorials, I'm having a hard time finding anything about actual project structure.
My thoughts is to use the below structure for my functions, shared libs and core configuration/dependencies:
.
├── functions/
│ │
│ ├── users/
│ │ ├── handler.js
│ │ └── serverless.yml
│ │
│ └── roles/
│ ├── handler.js
│ └── serverless.yml
│
├── shared/
│ └── common.js
│
├── node_modules/
└── package.json
My main curiosity is around deployment and how that pertains to dependencies and shared files. Additionally, automating deployment of this structure seems strange: I gather I would need to deploy each function separately, which I can script, but I wonder whether that's needed or advisable.
I have dealt with this a bit and found it quite frustrating. If you deploy from your setup, what does your API look like? With individual serverless.yml files you end up with independent API endpoints (assuming you are triggering with API calls and not something like S3).
I ended up with a structure like this:
|- serverless/
|--- serverless.yml
|--- webpack.config.js
|--- dist/
|--- node_modules/  (dev and common code)
|--- src/
|----- function1/
|-------- node_modules
|-------- package.json
|-------- index.js
|----- function2/
|-------- node_modules
|-------- package.json
|-------- index.js
I use the serverless-webpack plugin to output the individual functions into the dist/ directory. The serverless.yml then points to these.
The webpack.config.js looks like this:
const nodeExternals = require('webpack-node-externals');
const path = require('path');
module.exports = {
entry: {
'function1': './src/function1/index.js',
'function2': './src/function2/index.js',
},
target: 'node',
output:{
libraryTarget: 'commonjs2',
path: path.join(__dirname, './dist'),
filename: '[name].js'
},
externals: [nodeExternals()],
module: {
loaders: [
/* Babel is nice, but it adds some bulk and I don't need it
{
test: /\.js$/,
loaders: ['babel'],
include: __dirname,
exclude: /node_modules/,
}, */
{
  test: /\.json$/,
  loaders: ['json']
}
],
},
};
// externals: [nodeExternals()],
// nodeExternals seems to break aws-sdk when serving locally
// aws-sdk should be in dev-dependencies of root folder
// that way it's available in dev but doesn't get packaged.
// It's available to the lambdas when deployed.
After that, just make sure you set the individual flag in serverless.yml:
package:
individually: true
The webpack plugin is quite nice and does most of the heavy lifting. With this I can do a single deploy, and all the functions end up as individual Lambda functions under one API endpoint. You also get the webpack dev server, so you can run serverless webpack serve to test your functions.
It was a bit of a pain to setup, but it's been working pretty well.
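For reference, the matching serverless.yml might look roughly like this (the service name, HTTP events, and handler export names are assumptions, not taken from the original setup):

```yaml
service: my-service            # hypothetical service name

plugins:
  - serverless-webpack         # builds the entries from webpack.config.js

package:
  individually: true           # one artifact per function

functions:
  function1:
    handler: dist/function1.handler   # webpack output in dist/
    events:
      - http:
          path: function1
          method: get
  function2:
    handler: dist/function2.handler
    events:
      - http:
          path: function2
          method: post
```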
I have a project that contains modules for both the server and client of my application, which are each built using webpack. I'm using Karma with Jasmine to test my client (which uses Angular) and I would like to use Jasmine to test the server, which is written in typescript, too.
Unfortunately, the only guides I could find online used jasmine-node (which appears to have been untouched for the past few years) instead of jasmine-npm. Can anyone suggest a way to use Jasmine, or an alternative, for testing within my project?
I've tried writing a jasmine.json file, or editing the one generated by jasmine with the init cli command, however this doesn't seem to work with typescript files.
At the moment, my project's structure is like so:
├── client
│ ├── karma.conf.js
│ ├── protractor.conf.js
│ ├── src
│ ├── tsconfig.json
│ └── webpack.config.js
├── server
│ ├── src
│ ├── tsconfig.json
│ └── webpack.config.js
└── node_modules
It's definitely possible to use Jasmine for your server-side tests. Follow these steps and you'll be fine:
1) In your package.json add the following dev dependencies:
"jasmine": "latest",
"jasmine-core": "latest",
2) Set up your jasmine.json so that it includes all the files you want to run tests on, etc.
{
"spec_dir": "dist/dev/server/spec",
"spec_files": [
"unit/server/**/*[sS]pec.js"
],
"helpers": []
}
3) Add a unit.ts that will bootstrap your testing process:
const Jasmine = require("jasmine");
let j = new Jasmine();
j.loadConfigFile("./jasmine.json");
j.configureDefaultReporter({
showColors: true
});
j.execute();
Now all you have to do is compile it and run the resulting unit.js in Node.js.
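Putting it together, the compile-and-run step could look like this (the tsconfig path and output directory are assumptions chosen to match the jasmine.json above):

```shell
# Compile the TypeScript sources; outDir is assumed to be dist/dev/server
npx tsc -p server/tsconfig.json

# Run the compiled bootstrap; it loads jasmine.json and executes
# every matching *Spec.js file under the configured spec_dir
node dist/dev/server/unit.js
```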