How to pin (store) files in IPFS using Node.js?

I want to store image files in IPFS with Node.js without using any gateway like Pinata. I have tried installing "ipfs-http-client" and "ipfs-core", but I always get a Module not found error for both libraries. I've installed many versions of the libraries and always get the same error. I have also installed and run a local IPFS node, and still get the same error. I have been really blocked for days now.
Before this, I used Pinata, but now when using the same endpoints I used before, I get the error {"error":{"reason":"INVALID_ROUTE","details":"The provided route does not match a valid Pinata endpoint"}}, even though the same routes worked for me a month ago.
This is the code I tried with ipfs-http-client:
const IPFSClient = require("ipfs-http-client");
const ipfs = IPFSClient({ host: "localhost", port: 5001, protocol: "http" });

router.post("/add-to-ipfs", async (req, res) => {
  const file = await ipfs.add({
    path: "hello.txt",
    content: Buffer.from("Hello, IPFS!"),
  });
  console.log(file.path, file.hash);
  res.send(file.hash);
});
but I always get the error Unable to resolve path to module 'ipfs-http-client' or Module not found, even though the package exists in package.json and node_modules. I'd really appreciate it if someone could help me.
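For reference, recent releases of ipfs-http-client export a create function instead of being directly callable, add() returns a CID object rather than a hash string, and the newest releases are ESM-only (which makes require() fail outright). A minimal sketch against a local node, assuming a CJS-compatible version of the library:

// A sketch, assuming a CJS-compatible ipfs-http-client release; the
// newest versions are ESM-only and cannot be require()d.
const { create } = require('ipfs-http-client');
const ipfs = create({ host: 'localhost', port: 5001, protocol: 'http' });

router.post('/add-to-ipfs', async (req, res) => {
  const file = await ipfs.add({
    path: 'hello.txt',
    content: Buffer.from('Hello, IPFS!'),
  });
  // add() pins to your own node by default; pin.add() makes it explicit.
  await ipfs.pin.add(file.cid);
  // Modern versions return a CID object, not a `hash` string.
  res.send(file.cid.toString());
});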

Dockerfile setup for Deploying a Puppeteer Nodejs App on Droplet

I have exhausted all the probable solutions on Stack Overflow and beyond, all to no success.
My use case is a very simple Node.js app that uses puppeteer ^19.7.1.
My directory structure has the file .puppeteerrc.cjs with this content:
const { join } = require('path');

/**
 * @type {import("puppeteer").Configuration}
 */
module.exports = {
  // Changes the cache location for Puppeteer.
  cacheDirectory: join(__dirname, '.cache', 'puppeteer'),
};
However, when the server starts I am constantly greeted with the error message:
/workspace/.cache/puppeteer/chrome/linux-1069273/chrome-linux/chrome: error while loading shared libraries: libnss3.so: cannot open shared object file: No such file or directory
Everything works well on my localhost; the issue only started when I hosted it on DigitalOcean Droplets.
I tried copying the Dockerfile setup from https://pptr.dev/troubleshooting#running-puppeteer-in-the-cloud as-is to the root of my project to see if the issue would be resolved, all to no avail.
So please, I would really appreciate it if anyone could help me with a working Dockerfile configuration to address this, as I've spent all day on it with no success.
I intend to host the app on DigitalOcean's Droplets.
Thanks in anticipation for your time.
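The error itself means Chrome's shared-library dependencies are missing from the image rather than anything being wrong with Puppeteer. A minimal Dockerfile sketch that installs them, assuming a Debian-based Node image (the exact package list can vary with the Chrome version, and the CMD entry point is illustrative; libnss3 is the library named in the error):

FROM node:18-slim

# Shared libraries Chrome needs at runtime (Debian package names);
# libnss3 is the one the error complains about.
RUN apt-get update && apt-get install -y \
    libnss3 libatk1.0-0 libatk-bridge2.0-0 libcups2 libdrm2 \
    libxkbcommon0 libxcomposite1 libxdamage1 libxfixes3 \
    libxrandr2 libgbm1 libasound2 \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .

CMD ["node", "index.js"]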

GraphQL Tools loadFilesSync works for runtime but not jest in NodeJS - why?

I am working on setting up an Apollo GraphQL Server (Express based) and use the pattern where .graphql files are loaded and merged as demonstrated in this guide. This leads me to use loadFilesSync, path, and the Node-provided __dirname.
My typedefs module looks like this:
const path = require('path');
const { loadFilesSync } = require('@graphql-tools/load-files');
const { mergeTypeDefs } = require('@graphql-tools/merge');

const schemaDefsArray = loadFilesSync(path.join(__dirname, './types'), {
  recursive: true,
  extensions: ['graphql'],
});

const typeDefs = mergeTypeDefs(schemaDefsArray);
module.exports = typeDefs;
My setup is a Docker container built from a repo checked out to the WSL file system on Windows. Both the Apollo/Express server and the Jest tests are run on Node from inside that container.
My problem is that during regular runtime (the Express-based Apollo GraphQL Server), __dirname is /srv/<project name>/src/schema and all .graphql files are resolved for the schema as intended. But during test runs (Jest running directly on Node), __dirname is a fully qualified \\wsl$\Ubuntu-20.04\home\<user name>\projects\<project name>\src\schema. The Apollo server loads, reads, and resolves the schema fine, while Jest tests attempting to load the same files through the same module cannot find the directory.
Why is the directory resolved differently, what does that mean for the loadFilesSync call, and how do I fix it?
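One hedged observation: \\wsl$\Ubuntu-20.04\... is a Windows UNC path, which suggests the Jest process is being launched from the Windows side rather than from inside the container where the server runs. Logging both values in the typedefs module itself is a quick way to confirm which runtime is resolving them:

// Temporary diagnostics at the top of the typedefs module: if Jest
// prints a \\wsl$\... path here, it is running under Windows, not
// inside the container.
console.log('cwd:       ', process.cwd());
console.log('__dirname: ', __dirname);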

Basic express setup: not sending anything to local port

I created a frontend app and am now trying to incorporate a backend into it.
On the same frontend app I added an index.js file in the root directory, installed Express, and required it in the index.js file.
Very basic setup, as below:
const express = require('express');
const cors = require('cors');

const port = process.env.PORT || 3001;
const app = express();

app.get('/', (req, res) => {
  res.send({
    greetings: 'hi',
  });
});

app.listen(port, () => { console.log(`Server on port ${port}`); });
The server is successfully on port 3001 as per my terminal; however, on localhost:3001 I'm not seeing the JSON response I set up in app.get.
It says Cannot GET / instead, and when I inspected it in the devtools Network tab it shows a 404.
This seems like a very straightforward setup, so what could have gone wrong here?
I just figured out why: I installed nodemon, but my "start" script was "node index.js". I should have used "nodemon index.js".
Working now with nodemon index.js.
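For reference, a common way to keep both options available is separate scripts in package.json (a sketch; the script names are just conventions):

{
  "scripts": {
    "start": "node index.js",
    "dev": "nodemon index.js"
  }
}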
Your code is fine; there are no errors. I tested it and it works as expected.
However, a few things to note: keep the backend in a separate folder/directory unless required.
Coming back to your question, there are many possibilities, such as some modules not being installed properly.
Try running the following command:
// this reinstalls any library that is corrupt or not installed properly
npm i
If that doesn't work, try clearing the npm cache.
Also keep in mind that in Node.js the dev server does not automatically pick up changes; you need to restart the server to see them, or you can use a dev dependency called nodemon (this will auto-restart the server on saving changes).

Deploy a module to a remote server that is running Node.js

I'm working on my app.js in Node.js, trying to deploy a server-side script.
Many fine Node.js modules need a require('something');
I use npm locally, which works for require, as modules are nicely visible in the local node_modules folder structure. But now I'm ready to upload or bundle to a host, and I can't run npm on this hosted server.
const Hapi = require('hapi');
will result in
Error: Cannot find module 'hapi'
because I don't know how to copy/install/bundle/ftp files to my host.
Hapi is just an example; most anything that has a require will need something on the host.
I used webpack to create a server-side bundle.js, but just sticking bundle.js under /node_modules doesn't do anything.
Most modules have a complex folder structure underneath, and I'm trying to avoid copying a ton of folders and files under /node_modules. Ideally, I want to combine the modules into a bundle.js and have those modules visible to app.js,
but I am open to other ideas.
I have not yet tried using webpack to bundle app.js TOGETHER with the various modules. Have you had luck with that approach?
Thanks.
I've tried uploading Hapi files a folderful at a time, reaching a new require('something') error at every step.
'use strict';

const Hapi = require('hapi'); // <-- how can I deploy hapi on my node.js server?

// Create a server with a host and port
const server = Hapi.server({
  host: 'localhost',
  port: 8000,
});

// Add the route
server.route({
  method: 'GET',
  path: '/hello',
  handler: function (request, h) {
    return 'hello world';
  },
});

// Start the server
async function start() {
  try {
    await server.start();
  } catch (err) {
    console.log(err);
    process.exit(1);
  }
  console.log('Server running at:', server.info.uri);
}

start();
One approach that worked: using webpack to bundle the back-end JS.
Thanks to https://medium.com/code-oil/webpack-javascript-bundling-for-both-front-end-and-back-end-b95f1b429810
The aha moment: run webpack to create bundle-back.js, then tie bundle-back.js to my Node server.
You start your backend server with bundle-back.js using:
node bundle-back.js
In other words, include app.js in the bundle with the modules.
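A minimal webpack config along those lines might look like this (a sketch: the entry and output names are illustrative; target: 'node' keeps Node built-ins like fs and http out of the bundle, though native addons still cannot be bundled this way):

// webpack.config.js -- a sketch for bundling a Node backend.
const path = require('path');

module.exports = {
  target: 'node',      // keep Node built-ins (fs, http, ...) external
  mode: 'production',
  entry: './app.js',   // illustrative entry point
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'bundle-back.js',
  },
};

Then node dist/bundle-back.js starts the server with the modules baked in.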

How to upload file using easy-ftp in node?

I am trying to upload a file to my hosting server using Node and easy-ftp.
I tried with the following code:
var EasyFtp = require("easy-ftp");
var ftp = new EasyFtp();

var config = {
  host: 'homexxxxx.1and1-data.host',
  type: 'SFTP',
  port: '22',
  username: 'u90xxxx',
  password: "mypass"
};

ftp.connect(config);
ftp.upload("/test/test.txt", "/test.txt", function (err) {
  if (err) throw err;
  ftp.close();
});
No error message, but no file uploaded.
I tried the same using promises:
const EasyFTP = require('easy-ftp-extra');
const ftp = new EasyFTP();

const config = {
  host: 'homexxxxx.1and1-data.host',
  type: 'SFTP',
  port: '22',
  username: 'u90xxxx',
  password: "mypass"
};

ftp.connect(config);
ftp.upload('/test.txt', '/test.txt')
  .then(console.log)
  .catch(console.error);
The same issue: no file is uploaded and there is no error in the Node console.
The config is the same one used in FileZilla to transfer files, with the SFTP protocol, and everything works well with FileZilla.
What am I doing wrong?
Looks like you may have a path problem here.
"/test/test.txt"
The path specified will try to take the file from the root folder, like "C:\test\test.txt".
Assuming you want the file to be taken from your project folder, try this path:
"./test/test.txt"
Other things in your code are precisely the same as in mine, and mine works.
For me, it was just silently failing, and IntelliSense was not available.
npm remove easy-ftp
npm install easy-ftp
npm audit fix --force (until no more vulnerabilities)
Afterwards, IntelliSense was available and it started working.
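Since both attempts fail silently, it may also be worth attaching the event handlers that easy-ftp's README documents; a sketch, assuming those events fire as described there:

// A sketch using the event hooks listed in easy-ftp's README to
// surface failures that otherwise stay silent (config as above).
var EasyFtp = require("easy-ftp");
var ftp = new EasyFtp();

ftp.on("error", function (err) {
  console.error("FTP error:", err);
});
ftp.on("upload", function (remotePath) {
  console.log("Uploaded:", remotePath);
});

ftp.connect(config);
ftp.upload("./test/test.txt", "/test.txt");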
