Create .env file from a .json file - node.js

.env files are used to store environment variables for a project. I am looking for a way to easily modify an .env file programmatically, and for that purpose JSON (a .json file) would be much easier to work with as far as I can tell.
Say I had a file env.json like so:
{
  "use_shell_version": true,
  "verbosityLevel": 3,
  "color": "yellow"
}
is there a good way to export those? Is there some .env file format that is easily modified by machines instead of by hand?

You can convert a JSON file to .env with a simple for loop.
function convertToEnv (object) {
  let envFile = ''
  for (const key of Object.keys(object)) {
    envFile += `${key}=${object[key]}\n`
  }
  return envFile
}
So with your example object,
const object = {
  "use_shell_version": true,
  "verbosityLevel": 3,
  "color": "yellow"
}
const env = convertToEnv(object)
console.log(env)
output would be
use_shell_version=true
verbosityLevel=3
color=yellow

Related

How can I replace text from a config.json file automatically by a variable in node js

Hi and thank you for your help
I have a config.json file that contains this:
{
  "test": {
    "hi": {
      "text": "Hi ${user.name}"
    }
  }
}
and I have index.js file that contains:
var config = require('./config.json')
var user = {name: "Test", tag: "#1234"}
console.log(`${config.test.hi.text}`) // Output: "Hi ${user.name}"
// Expected output: Hi Test
I want it so that when I change user.name in config.json to something like user.tag, the output updates automatically, without using the .replace() function.
thank you for your help :D
When using Template literals, the expressions in the placeholders and the text between the backticks (` `) get passed to a function that concatenates the strings into a single string, replacing the values inside ${variable}.
This process happens at the time you define the template and cannot be resolved later as you do in your code. Refer to the documentation: Template literals
It would also be bad coding practice: if the user variable didn't exist in the index.js file, you wouldn't get a compile error, but a nasty runtime error.
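A minimal illustration of that timing: the placeholder is resolved at the moment the template literal is defined, not when the string is later read.

```javascript
const user = { name: "Test" };
const greeting = `Hi ${user.name}`; // evaluated right here

user.name = "Other"; // later changes do not affect the string
console.log(greeting); // → Hi Test
```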
The only way to do it is to have your template literal within reach of your variable's scope, meaning the template literal can read the variable at the moment it's executed. If you want to have the user instance and the template in different files, you can use a callback function like this:
config.js
const callback = (user) => {
  return `Hi ${user.name}`
}
const config = {
  callback,
  anotherConfig: {
    hi: {
      example: "This is another config"
    }
  }
}
export default config;
index.js
import config from './config.js';
const user = {name: "Test", tag: "#1234"};
console.log(config.callback(user))
Output
Hi Test

nodejs read .ini config file

I have a config file. It has variables stored in the following manner.
[general]
webapp=/var/www
data=/home/data
[env]
WEBAPP_DEPLOY=${general:webapp}/storage/deploy
SYSTEM_DEPLOY=${general:data}/deploy
As you can see it has 2 sections general and env. Section env uses the variables from section general.
So I want to read this file into a variable, say config. Here's what I want the config object to look like:
{
  general: {
    webapp: '/var/www',
    data: '/home/data'
  },
  env: {
    WEBAPP_DEPLOY: '/var/www/storage/deploy',
    SYSTEM_DEPLOY: '/home/data/deploy'
  }
}
In general, I am looking for a config parser for Node.js that supports string interpolation.
I would assume most ini libraries don't include the variable expansion functionality, but with lodash primitives a generic "deep object replacer" isn't too complex.
I've switched the : delimiter to . so has and get can look up values directly.
const { get, has, isPlainObject, reduce } = require('lodash')

// Match all tokens like `${a.b}` and capture the variable path inside the braces
const re_token = /\${([\w$][\w\.$]*?)}/g

// If a string includes a token and the token exists in the object, replace it
function tokenReplace(value, key, object){
  if (!value || !value.replace) return value
  return value.replace(re_token, (match_string, token_path) => {
    if (has(object, token_path)) return get(object, token_path)
    return match_string
  })
}

// Deep clone any plain objects and strings, replacing tokens
function plainObjectReplacer(node, object = node){
  return reduce(node, (result, value, key) => {
    result[key] = (isPlainObject(value))
      ? plainObjectReplacer(value, object)
      : tokenReplace(value, key, object)
    return result
  }, {})
}
> plainObjectReplacer({ a: { b: { c: 1 }}, d: 'wat', e: '${d}${a.b.c}' })
{ a: { b: { c: 1 } }, d: 'wat', e: 'wat1' }
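If pulling in lodash isn't desirable, the same deep replacement can be sketched with built-ins only; here a hand-rolled lookup stands in for lodash's has/get (this is a rewrite of the idea above, not the answer's exact code):

```javascript
// Token syntax matches the answer above: ${a.b} with a dot path
const TOKEN = /\$\{([\w$][\w.$]*?)\}/g;

// Hand-rolled stand-in for lodash's get: walk a dot path, undefined if missing
function lookup(obj, path) {
  return path.split('.').reduce((o, k) => (o == null ? undefined : o[k]), obj);
}

// Deep-walk strings and plain objects, expanding tokens against the root object
function expand(node, root = node) {
  if (typeof node === 'string') {
    return node.replace(TOKEN, (match, path) => {
      const v = lookup(root, path);
      return v === undefined ? match : v;
    });
  }
  if (node !== null && typeof node === 'object' && !Array.isArray(node)) {
    const out = {};
    for (const [k, v] of Object.entries(node)) out[k] = expand(v, root);
    return out;
  }
  return node;
}

const config = expand({
  general: { webapp: '/var/www', data: '/home/data' },
  env: {
    WEBAPP_DEPLOY: '${general.webapp}/storage/deploy',
    SYSTEM_DEPLOY: '${general.data}/deploy'
  }
});
console.log(config.env.WEBAPP_DEPLOY); // → /var/www/storage/deploy
```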
You'll find most config management tools (like ansible) can do this sort of variable expansion for you before app runtime, at deployment.

Conditional settings for Gulp plugins dependent on source file

The plugin gulp-pug allows passing global variables to pug files via the data property.
What if we don't need the full data set in each .pug file? To implement conditional data injection, we need access to the current vinyl file instance inside pipe(this.gulpPlugins.pug({})), or at least to know the source file's absolute path. Is that possible?
const dataSetForTopPage = {
  foo: "alpha",
  bar: "bravo"
};
const dataSetForAboutPage = {
  baz: "charlie",
  hoge: "delta"
};
gulp.src(sourceFileGlobsOrAbsolutePath)
  .pipe(gulpPlugins.pug({
    data: /*
      if path is 'top.pug' -> 'dataSetForTopPage',
      else if path is 'about.pug' -> 'dataSetForAboutPage'
      else -> empty object */
  }))
  .pipe(gulp.dest("output"));
I am using the gulp-intercept plugin, but how do I synchronize it with gulpPlugins.pug?
gulp.src(sourceFileGlobsOrAbsolutePath)
  .pipe(this.gulpPlugins.intercept(vinylFile => {
    // I can compute conditional data set here
    // but how to execute gulpPlugins.pug() here?
  }))
// ...
That was just one example: we run into the same problem whenever we need conditional plugin options for other gulp plugins, too. E.g.:
.pipe(gulpPlugins.htmlPrettify({
  indent_char: " ",
  indent_size: // if source file in 'admin/**' -> 2, else if in 'auth/**' -> 3 else 4
}))
You'll need to modify the stream manually - through2 is probably the most used package for this purpose. Once in the through2 callback, you can pass the stream to your gulp plugins (as long as their transform functions are exposed) and conditionally pass them options. For example, here is a task:
// Assumes through2, html, gulp and gulpPlugins (e.g. gulp-load-plugins) are already required
const pugtest = () => {
  const dataSet = {
    'top.pug': {
      foo: "alpha",
      bar: "bravo"
    },
    'about.pug': {
      foo: "charlie",
      bar: "delta"
    }
  };
  return gulp.src('src/**/*.pug')
    .pipe(through2.obj((file, enc, next) =>
      gulpPlugins.pug({
        // Grab the filename, and set pug data to the value found in dataSet by that name
        data: dataSet[file.basename] || {}
      })._transform(file, enc, next)
    ))
    .pipe(through2.obj((file, enc, next) => {
      const options = {
        indent_char: ' ',
        indent_size: 4
      };
      if (file.relative.match(/admin\//)) {
        options.indent_size = 2;
      } else if (file.relative.match(/auth\//)) {
        options.indent_size = 3;
      }
      file.contents = Buffer.from(html.prettyPrint(String(file.contents), options), enc);
      next(null, file);
    }))
    .pipe(gulp.dest('output'));
}
For the pug step, we call through2.obj and create the pug plugin, passing it data grabbed from our object literal, indexed by filename in this example. So now the data passed into the compiler comes from that object literal.
For the html step you mention, gulp-html-prettify doesn't expose its transform function, so we can't reach into it and pass the transform back to the stream. But in this case that's OK: if you look at the source, it's just a wrapper around prettyPrint in the html package. That's quite literally all it is doing. So we can rig up our own step using through2 to do the same thing, changing our options based on the vinyl file's relative path.
That's it! For a working example see this repo: https://github.com/joshdavenport/stack-overflow-61314141-gulp-pug-conditional

Convert a json file from server body to a json object in node

This should be so simple but I've been stuck on it for more than an hour and it's driving me crazy.
I'm working with an API that's returning data as zipped .json files. I've managed to unzip the files, but now need to parse these files to json objects.
The data is in a buffer, and looks like this:
{ "name": "foo1", "job": "bar1" }
{ "name": "foo2", "job": "bar2" }
{ "name": "foo3", "job": "bar3" }
{ "name": "foo4", "job": "bar4" }
Of course, parsing this with JSON.parse() fails because the data is a .json file, not an array of jsons.
How can I parse this data correctly? fs expects a filepath to read the file, which wouldn't work in my case (as far as I'm aware) because the data is from a buffer, not from a local file.
tl;dr: How do I parse a .json file that doesn't have a filepath?
First of all, the example you provided (where each line is a string representing a JSON object) is not a JSON file.
It’s a file containing multiple JSON formatted strings, one per line.
Without a surrounding array, it is no wonder you cannot parse it.
I’m also unsure what you mean by the data being kept in a buffer.
Do you mean that you've read the contents of the file using the standard fs.readFile() or a variant?
If that is the case, you need to convert the Buffer returned from readFile to a String, as in:
var contents = fs.readFileSync(FILEPATH).toString()
Once you have done so, you can construct an Array from the lines of your file and convert the result to a JSON-formatted string:
fs.readFile(FILEPATH, (err, buf) => {
  if (err) {
    throw err
  }
  let objAry = []
  buf.toString().split(/\r?\n/).forEach(line => {
    // skip blank lines (e.g. a trailing newline) so JSON.parse doesn't throw
    if (line.trim() !== '') objAry.push(JSON.parse(line))
  })
  let jsonStr = JSON.stringify(objAry)
  // do something with the result string
  // possibly sending it as a response to an API
  // request as 'Content-Type: application/json'
})
Obviously, you’ll want to add error handling (try/catch) to this.
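For completeness, the same line-by-line parse works directly on an in-memory Buffer, with no file path involved; this newline-delimited layout is commonly called NDJSON:

```javascript
// The data is newline-delimited JSON (NDJSON): one JSON object per line
const buf = Buffer.from(
  '{ "name": "foo1", "job": "bar1" }\n' +
  '{ "name": "foo2", "job": "bar2" }\n'
);

const objects = buf
  .toString('utf-8')
  .split(/\r?\n/)
  .filter(line => line.trim() !== '') // skip blank/trailing lines
  .map(line => JSON.parse(line));

console.log(objects[1].name); // → foo2
```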
You can convert the buffer to a UTF-8 string, then parse it:
JSON.parse(buffer.toString('utf-8'))

Concatenation of the contents of the files with the particle names in the array

I have many files (on the master) in a directory, with names like file1.domain, file2.domain, file3.domain, someanothername.domain.
The domain is always the same.
I need a defined type that I can use like this:
mydefinedtype { "title":
  filenames => ["file1", "file2", "file3"],
}
It should create a file on the node with the contents of the files
file1.domain, file2.domain and file3.domain.
I guess you mean that you want to deploy some files to your nodes.
If so, you can do as follows. This will copy file1, file2 and file3 from your_module/files to the target directory (note the double quotes, so $name is interpolated):
$filenames = ["file1", "file2", "file3"]
define copy_file {
  file { "/targetdir/$name":
    source => "puppet:///modules/your_module/$name",
  }
}
copy_file { $filenames }
see: http://docs.puppetlabs.com/guides/file_serving.html