I have a system with user accounts distributed between projects. Each project has a folder structure with uploaded files. The documents are stored on AWS S3. Through the portal the users are able to manage (CRUD) the folders and documents.
But I also want to implement a client application that syncs a local folder with the different project folders. Does AWS have such an API? I know about the CLI tool s3cmd, is that the way to go?
Or does AWS have an API (preferably for Node.js) that works with this kind of functionality, syncing a local folder with an S3 folder?
What would be the 'correct way' (if any) to go?
You can use the s3-sync npm module to sync an S3 folder.
Install the module:
npm install s3-sync
Then use it in your code:
var s3sync = require('s3-sync')

var stream = s3sync({
    key: process.env.AWS_ACCESS_KEY
  , secret: process.env.AWS_SECRET_KEY
  , bucket: 'sync-testing'
})

// Queue individual files for upload by writing { src, dest } objects to the stream.
stream.write({
    src: __filename
  , dest: '/uploader.js'
})

// end() uploads a final file and closes the stream.
stream.end({
    src: __dirname + '/README.md'
  , dest: '/README.md'
})
For more details, refer to the s3-sync documentation on npm.
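If you want to sync a whole local folder rather than individual files, one rough approach (a sketch only: the localRoot path and bucket name are examples, and it just reuses the write({ src, dest }) calls shown above) is to walk the directory and write every file to the stream:

var fs = require('fs');
var path = require('path');
var s3sync = require('s3-sync');

var localRoot = path.join(__dirname, 'project-files'); // example local folder to sync

var stream = s3sync({
    key: process.env.AWS_ACCESS_KEY
  , secret: process.env.AWS_SECRET_KEY
  , bucket: 'sync-testing'
});

// Recursively walk the local folder and queue every file for upload,
// using its path relative to localRoot as the S3 key.
function walk(dir) {
  fs.readdirSync(dir).forEach(function (name) {
    var full = path.join(dir, name);
    if (fs.statSync(full).isDirectory()) {
      walk(full);
    } else {
      stream.write({
          src: full
        , dest: '/' + path.relative(localRoot, full).split(path.sep).join('/')
      });
    }
  });
}

walk(localRoot);
stream.end();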
In my NestJS application I am uploading files to a custom folder "Uploads". But I am getting an error when I try to access a file
when I call this URL:
http://localhost:3000/Uploads/file.png
{"statusCode":404,"message":"Cannot GET
/Uploads/file.png","error":"Not Found"}
But when I upload files to the public folder I am able to download them, e.g.:
http://localhost:3000/file.png
How do I download the files from the Uploads folder?
Use this code in your main.ts:
app.useStaticAssets(join(__dirname, '..', 'Uploads'), {
  prefix: '/Uploads/',
});
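For context, here is a minimal main.ts sketch. It assumes a standard Nest bootstrap with your existing AppModule, that the app is created as a NestExpressApplication (which is what exposes useStaticAssets), and that the Uploads folder sits in the project root one level above the compiled output:

import { NestFactory } from '@nestjs/core';
import { NestExpressApplication } from '@nestjs/platform-express';
import { join } from 'path';
import { AppModule } from './app.module';

async function bootstrap() {
  // Typing the app as NestExpressApplication exposes useStaticAssets().
  const app = await NestFactory.create<NestExpressApplication>(AppModule);

  // Serve files from the "Uploads" folder under the /Uploads/ URL prefix.
  app.useStaticAssets(join(__dirname, '..', 'Uploads'), {
    prefix: '/Uploads/',
  });

  await app.listen(3000);
}
bootstrap();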
I am generating a doc file using the docx-template npm package in Node.js, and the file is getting successfully saved in my backend/controller folder on my local machine. Now I want to do a prod deployment on Heroku but I don't know what path has to be set to save the file in production.
I have used the 'fs' module to read and write the file, as shown below.
fs.writeFileSync(
path.resolve(__dirname, `${contractName} ${element.frequency}.docx`),
buffer
);
You probably have to use the "send(Buffer)" feature of Express.
See the following page:
https://expressjs.com/en/api.html#res.send
const contentTypes = {
docx: "application/vnd.openxmlformats-officedocument.wordprocessingml.document",
pptx: "application/vnd.openxmlformats-officedocument.presentationml.presentation",
xlsx: "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
};
res.set('Content-Type', contentTypes.docx);
res.send(buffer);
This is because the "fs" module writes the data to disk, but in production what you most likely want is to let the user "download" the file, and that is done with res.send(), which takes a Buffer as its argument.
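As a rough illustration, here is a minimal Express route along those lines. The route path and the generateContractBuffer() helper are hypothetical placeholders for however you build the Buffer with docx-template, and on Heroku the port comes from process.env.PORT:

const express = require('express');
const app = express();

app.get('/contracts/:name/download', (req, res) => {
  // Hypothetical helper: returns the generated .docx as a Buffer in memory,
  // instead of writing it to disk with fs.writeFileSync.
  const buffer = generateContractBuffer(req.params.name);

  res.set('Content-Type', 'application/vnd.openxmlformats-officedocument.wordprocessingml.document');
  // Suggest a file name so the browser downloads the file instead of rendering it.
  res.set('Content-Disposition', `attachment; filename="${req.params.name}.docx"`);
  res.send(buffer);
});

app.listen(process.env.PORT || 3000); // Heroku assigns the port via PORT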
I have a site running Gatsby and Gatsby-Source-Drupal7, a plugin that makes an axios GET request to https://stagingsupply.htm-mbs.com/restws_resource.json and makes the JSON data available to GraphQL queries. I am able to run it just fine on my computer by going to localhost:8000 and it creates over 200k nodes, but when I try to deploy on any cloud service provider like Gatsby Cloud or Netlify it doesn't fetch any nodes or data at all from the site.
Warning from console
Starting to fetch data from Drupal
warn The gatsby-source-drupal7 plugin has generated no Gatsby nodes. Do you need
it?
Code from my gatsby-config.js:
module.exports = {
siteMetadata: {
title: `new`,
siteUrl: `https://www.yourdomain.tld`,
},
plugins: [
{
resolve: `gatsby-source-drupal7`,
options: {
baseUrl: `https://stagingsupply.htm-mbs.com/`,
apiBase: `restws_resource.json`, // optional, defaults to `restws_resource.json`
},
},
]
}
gatsby-node.js from node_modules/gatsby-source-drupal7:
const createNode = actions.createNode; // Default apiBase to `jsonapi`
apiBase = apiBase || `restws_resource.json`; // Fetch articles.
// console.time(`fetch Drupal data`)
console.log(`Starting to fetch data from Drupal`);
const data = yield axios.get(`${baseUrl}/${apiBase}`, {
auth: basicAuth
});
const allData = yield Promise.all(_.map(data.data.list,
Link to the repo that works on my local computer: https://github.com/nicholastorr/gatsby-d7
Any and all help will be appreciated.
As you pointed out, you've played around with the Node versions using NODE_ENV and engines workarounds. My guess is also a mismatching Node version between environments, but as the Netlify docs suggest, there are only two ways of customizing the Node version used to manage dependencies:
Set a NODE_VERSION environment variable.
Add a .node-version or .nvmrc file to the site's base directory in your repository. This will also tell any other developer using the repository which version of Node.js it depends on.
Without seeing your Netlify build command (to check the NODE_VERSION), and since there's no .node-version or .nvmrc in your repository, I'd try creating one at the root of the project with v14.17.1 in it and doing a fresh install.
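For example, a one-line .nvmrc at the repository root, assuming you want to pin the version you run locally:

14.17.1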
In addition, double-check other server-related conflicts like IP-blocking, etc.
The error was nothing Gatsby or Node related; my site was blocking the IP of the server :>
I created the following Express API:
const express = require("express");
const app = express();
const bodyParser = require("body-parser");
const cookieParser = require("cookie-parser");
require("dotenv/config");
//routes
const authRoute = require("./routes/auth.js");
const adminRoute = require("./routes/admin.js");
// middlewares
// convert the request body into JSON using body-parser
app.use(cookieParser());
app.use(bodyParser.json());
app.use("/", express.static("public"));
app.use("/api/auth", authRoute);
app.use("/api/admin", adminRoute);
// starting express server
// app.listen(5000, () => {
// console.log("listning on port 5000");
// });
module.exports = {
app,
};
In the public folder I have an HTML file, with the CSS and JS inside public/static folders.
The HTML, CSS and JS in the public folder are generated with a React build.
I am trying to deploy this API as a Google Cloud Function with the following command:
gcloud functions deploy outreach-dashboard --entry-point app --runtime nodejs10 --trigger-http --allow-unauthenticated
The function gets deployed, but the problem is that when I look at the function on the GCP dashboard its source does not contain the public folder, and if I download the source as a zip I can see the public folder there but it's empty.
I need the public folder to get deployed so I can serve it using
express.static("public")
You are trying to serve several endpoints from the same Cloud Function. I've seen hacks on Stack Overflow where folks bend the framework to achieve this; it's not my recommendation.
Cloud Run is a very similar platform: the same underlying infrastructure, and very close in features (I wrote an article on this). But with it you serve a containerized web server, which is more suitable for your use case.
You can easily give it a try:
Uncomment your "starting express server" part.
Test locally that it works.
Then run this command:
gcloud beta run deploy outreach-dashboard --source=. --platform=managed --region=us-central1 --allow-unauthenticated
Change the region if needed.
The command:
Uploads the sources (take care with your .gitignore and .gcloudignore files to be sure all the files are uploaded).
Creates a container from your sources. To achieve this, Buildpacks.io is used, exactly the same process as with Cloud Functions and App Engine.
Deploys the container on Cloud Run.
If the problem persists, there may be an issue with the automatic container build process (maybe some files are automatically discarded). In this case, you can write a very simple Dockerfile similar to the one in the getting started documentation; a sketch follows.
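Here is a minimal sketch of such a Dockerfile. It assumes your entry file is index.js and that, once you uncomment app.listen(), the server listens on process.env.PORT (Cloud Run sets PORT for you):

FROM node:12-slim
WORKDIR /usr/src/app
# Install production dependencies first to take advantage of layer caching.
COPY package*.json ./
RUN npm install --only=production
# Copy the rest of the sources, including the public folder.
COPY . ./
CMD [ "node", "index.js" ]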
This time, you create and deploy in 2 steps (and 2 commands):
# Create the container with Cloud Build based on the Dockerfile
gcloud builds submit --tag gcr.io/<PROJECT_ID>/<containerName>
# Deploy on Cloud Run
gcloud beta run deploy outreach-dashboard --image=gcr.io/<PROJECT_ID>/<containerName> --platform=managed --region=us-central1 --allow-unauthenticated
As it turns out, I had to remove the public folder from .gitignore,
and after that I also needed to tell GCF to treat the public folder as a static folder by creating an app.yaml file in the root folder.
Content of app.yaml:
runtime: nodejs12
handlers:
- url: /static
  static_dir: public
- url: /.*
  script: auto
I have a node package, bookiza (installed globally), that POSTs/PATCHes values to a receiving substrate using credentials taken from an .rc file.
I'm currently saving the .rc file inside the module itself, at usr/lib/node_modules/bookiza, but I can do so anywhere I like. The problem with storing it inside the package is that the settings are overwritten whenever the user runs npm i -g again to update the package.
function updateBookizaConfig(res) {
var bookizaConfig = JSON.parse(fs.readFileSync(path.join(__dirname, '..', '.bookizarc')).toString());
bookizaConfig.token = res.body.key;
bookizaConfig.username = res.body.username;
bookizaConfig.email = res.body.email;
fs.writeFileSync(path.join(__dirname, '..', '.bookizarc'), JSON.stringify(bookizaConfig, null, 2));
// Move or copy the config file outside of package to retain credentials upon package update.
// cp('-R', path.join(__dirname, '..', '.bookizarc'), path.join(__dirname, '..', '..'));
console.log(chalk.bold.cyan('Registration successful'));
}
This works, but note that the dotfile is saved inside the usr/lib/node_modules/ directory, as a sibling to the other global packages installed on the machine. Now I could put the settings file anywhere else on the machine too, but what is the good practice/standard way of doing this?
Would it be better to put the settings file inside a usr/lib/node_modules/dots folder where, in the future, other package writers could also put their .rc files?
Users in the comments have already hit on this solution, but here it is for the record anyway:
npm recommends that you save user-specific config data in the user's home directory rather than in npm's modules directory, both because it is inconvenient to persist those settings there across updates and because it is a problem in multi-user environments.
There are a number of modules that will find the user's home directory in a cross-platform way for you to put your files in; we like https://www.npmjs.com/package/osenv
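As a rough sketch of that approach, using Node's built-in os.homedir() (osenv's home() would do the same job) and the field names from the snippet above; the .bookizarc file name is kept from the question:

var fs = require('fs');
var os = require('os');
var path = require('path');

// Store the config in the user's home directory so it survives `npm i -g` updates.
var configPath = path.join(os.homedir(), '.bookizarc');

function updateBookizaConfig(res) {
  // Start from the existing config if there is one, otherwise from an empty object.
  var bookizaConfig = fs.existsSync(configPath)
    ? JSON.parse(fs.readFileSync(configPath).toString())
    : {};
  bookizaConfig.token = res.body.key;
  bookizaConfig.username = res.body.username;
  bookizaConfig.email = res.body.email;
  fs.writeFileSync(configPath, JSON.stringify(bookizaConfig, null, 2));
  console.log('Registration successful');
}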