Node.js REST API to rewrite URLs or work as middleware for accessing remote photos - node.js

I am making a REST API with Node/Express that exposes data with asset URLs from other servers, like this:
Client ------> REST API with Node.js/Express (API Y) --------> API for Images (API X)
The API for Images (API X) provides JSON or links for images with URLs like "http://APIX.com/link1/X.jpg".
The assets/images are located on the (API X) server.
What I am trying to do: when the client calls (API Y) and requests data with image URLs like "http://APIY.com/api/X.jpg", (API Y) fetches the images from (API X) and returns them to the client under (API Y) links, so the client will not know the real source of the images and will think they are hosted on the (API Y) server.
Any idea how I can implement this in Node.js/Express? Thanks.

Thanks @jfriend00, I managed to find a solution with your "proxy" suggestion. I used the http-proxy-middleware npm package, as follows:
I am using Express and http-proxy-middleware with TypeScript.
import express from "express";
import proxy from "http-proxy-middleware";

export default class APIYServer {
  constructor(private port: number) {}

  public start(): void {
    const app = express();
    app.use(
      // the URL exposed by API Y
      "/APIY/assets",
      proxy({
        // the URL of the API X server
        target: "https://www.APIX.com/",
        changeOrigin: true,
        pathRewrite: {
          // "/APIX-assets-path/" is the path where the images are located
          "^/APIY/assets": "/APIX-assets-path/"
        },
        onProxyRes
      })
    );

    // function for handling the proxy response
    function onProxyRes(proxyResponse: any, request: any, response: any) {
      if (proxyResponse.statusCode !== 200) {
        console.log("---FAIL ---");
      }
      // DELETE the response headers (including Set-Cookie) so the client
      // can't identify the source of the images
      Object.keys(proxyResponse.headers).forEach(function(key) {
        delete proxyResponse.headers[key];
      });
    }

    // run the API Y server on the configured port
    app.listen(this.port, () => {
      console.log(`Server started on port ${this.port}`);
    });
  }
}
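For completeness, a minimal entry point to start the server might look like this (the file name and port here are assumptions for illustration, not part of the original answer):

// index.ts (hypothetical entry file)
import APIYServer from "./APIYServer";

// start API Y on port 8888
new APIYServer(8888).start();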
When calling "localhost:8888/APIY/assets/01.png" it gives me the image located at "https://www.APIX.com/APIX-assets-path/01.png"... and that's what I'm looking for :D

Related

How to determine http vs https in nodejs / nextjs api handler

In order to properly build the URLs in my XML sitemaps and RSS feeds, I want to determine whether the page is currently served over http or https, in a way that also works locally in development.
export default function handler(req, res) {
  const host = req.headers.host;
  const proto = req.connection.encrypted ? "https" : "http";
  // construct url for xml sitemaps
}
With the above code, however, it still shows as being served over http even on Vercel, where I would expect it to run as https. Is there a better way to figure out http vs https?
Since Next.js API routes run behind a proxy that offloads TLS, the protocol the handler sees is http.
By changing the code to the following I was able to first check at what protocol the proxy runs.
const proto = req.headers["x-forwarded-proto"];
However, this breaks in development, where you are not running behind a proxy, and in any other deployment that might not involve a proxy. To support both use cases I eventually ended up with the following code.
const proto =
  req.headers["x-forwarded-proto"] ||
  (req.connection.encrypted ? "https" : "http");
Whenever the x-forwarded-proto header is not present (undefined), we fall back to req.connection.encrypted to determine whether we are serving over http or https. Note the parentheses around the ternary: without them, the header's value itself would be used as the ternary's condition.
Now it works on localhost as well as on a Vercel deployment.
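Put together, a complete handler along these lines might look as follows (a sketch; building the actual sitemap body is omitted):

export default function handler(req, res) {
  const host = req.headers.host;
  // prefer the proxy's reported protocol, fall back to the socket
  const proto =
    req.headers["x-forwarded-proto"] ||
    (req.connection.encrypted ? "https" : "http");
  const baseUrl = `${proto}://${host}`;
  // construct urls for xml sitemaps / rss feeds from baseUrl here
  res.status(200).send(baseUrl);
}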
My solution:

import { GetServerSideProps } from "next";

export const getServerSideProps: GetServerSideProps = async (context: any) => {
  // Fetch data from external API
  // (note: the referer header may be absent on direct navigation)
  const reqUrl = context.req.headers["referer"];
  const url = new URL(reqUrl);
  console.log('====================================');
  console.log(url.protocol); // "http:" or "https:"
  console.log('====================================');
  // const res = await fetch(`${url.origin}/api/projets`)
  // const data = await res.json()
  // Pass data to the page via props
  return { props: { data: null } };
}

URL forwarding does not work when the target has a base URL (Node.js express-http-proxy)

I am calling a back-end API service with this line in my Node.js (Angular) application:
this.http.get<Car[]>('/server/api/v1/cars')
But I get the following error: GET http://127.0.0.1:4200/server/api/v1/cars 404 (Not Found)
I expected the URL to be translated to http://127.0.0.1:8080/api-baseurl/api/v1/cars with the following server.js (run with node server.js):
const proxy = require('express-http-proxy');
// ...
app.use('/server', proxy('http://localhost:8080/api-baseurl'));
But it looks like the proxy does not handle forwarding when the target has a base URL: this same app.use line worked when the API did not have a base URL, but that is not the case anymore.
I got the answer. What I did not understand from https://www.npmjs.com/package/express-http-proxy is that the first proxy() parameter matches the host, not the specific resource path. For the resource path, proxyReqPathResolver must be used, a function in which I can edit the path:
const proxy = require('express-http-proxy');
// ...
// Set our api routes proxy to point to the backend server
app.use('/server', proxy('http://localhost:8080', {
  proxyReqPathResolver: function (req) {
    const updatedPathComplete = '/api-baseurl' + req.url;
    console.log('proxy on /server: ' + req.url + ' => ' + updatedPathComplete);
    return updatedPathComplete;
  }
}));
In the console log:
> node server-prod.js
API running on 4200
proxy on /server: /api/v1/cars => /api-baseurl/api/v1/cars
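For comparison, http-proxy-middleware (the package used in the first answer above) expresses the same rewrite with its pathRewrite option; a sketch, assuming the same /server mount and back-end:

import express from "express";
import proxy from "http-proxy-middleware";

const app = express();
// forwards /server/api/v1/cars -> http://localhost:8080/api-baseurl/api/v1/cars
app.use(
  "/server",
  proxy({
    target: "http://localhost:8080",
    pathRewrite: { "^/server": "/api-baseurl" }
  })
);
app.listen(4200);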

How to Connect Reactivesearch to an external Elasticsearch cluster?

I am trying to connect my Reactivesearch application to an external Elasticsearch provider (not AWS). They don't allow making changes to the Elasticsearch cluster, and they also use nginx in front of the cluster.
As per the Reactivesearch documentation I have cloned the proxy code and only made changes to the target and the authentication settings (as per the code below):
https://github.com/appbaseio-apps/reactivesearch-proxy-server/blob/master/index.js
The proxy starts successfully and is able to connect to the remote cluster. However, when I connect the Reactivesearch app through the proxy I get the following error:
Access to XMLHttpRequest at 'http://localhost:7777/testing/_msearch?' from origin 'http://localhost:3000' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource
I repeated the same steps with my local Elasticsearch cluster using the same proxy code and got the same error.
I was just wondering: do we need to make any extra changes to make sure the proxy sends the right request to the Elasticsearch cluster? I am using the code below for the proxy.
const express = require('express');
const proxy = require('http-proxy-middleware');
const btoa = require('btoa');
const app = express();
const bodyParser = require('body-parser');

/* This is where we specify options for the http-proxy-middleware.
 * We set the target to the appbase.io backend here. You can also
 * add your own backend url here. */
const options = {
  target: 'http://my_elasticsearch_cluster_adddress:9200/',
  changeOrigin: true,
  onProxyReq: (proxyReq, req) => {
    proxyReq.setHeader(
      'Authorization',
      `Basic ${btoa('username:password')}`
    );
    /* transform the req body back from text */
    const { body } = req;
    if (body) {
      if (typeof body === 'object') {
        proxyReq.write(JSON.stringify(body));
      } else {
        proxyReq.write(body);
      }
    }
  }
};

/* Parse the ndjson as text */
app.use(bodyParser.text({ type: 'application/x-ndjson' }));

/* This is how we can extend this logic to do extra stuff before
 * sending requests to our backend, for example doing verification
 * of access tokens or performing some other task */
app.use((req, res, next) => {
  const { body } = req;
  console.log('Verifying requests ✔', body);
  /* After this we call next to tell express to proceed
   * to the next middleware function, which happens to be our
   * proxy middleware */
  next();
});

/* Here we proxy all the requests from reactivesearch to our backend */
app.use('*', proxy(options));

app.listen(7777, () => console.log('Server running at http://localhost:7777 🚀'));
Regards
Yep, you need to apply CORS settings to your local elasticsearch.yml as well as to your ES service provider's cluster.
Are you using Elastic Cloud by any chance? They do allow you to modify Elasticsearch settings.
If so:
Log in to your Elastic Cloud control panel.
Navigate to the Deployment Edit page for your cluster.
Scroll to your '[Elasticsearch] Data' deployment configuration.
Click the User setting overrides text at the bottom of the box to expand the settings editor.
There are some example ES CORS settings about halfway down the Reactivebase page that provide a great starting point:
https://opensource.appbase.io/reactive-manual/getting-started/reactivebase.html
You'll need to update the provided http.cors.allow-origin: setting based on your needs.
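The settings in question are the standard Elasticsearch http.cors.* options; roughly, something like the following in elasticsearch.yml (a sketch; set http.cors.allow-origin to wherever your Reactivesearch app is actually served from, e.g. http://localhost:3000 here):

http.port: 9200
http.cors.enabled: true
http.cors.allow-origin: "http://localhost:3000"
http.cors.allow-headers: X-Requested-With,X-Auth-Token,Content-Type,Content-Length,Authorization
http.cors.allow-credentials: true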

angular 4 / res.download from nodejs back-end

I'm calling an Angular component which calls an API service I created to call a Node.js back-end. The back-end downloads a zip file using res.download. I believe that the response is not correctly handled, because when I call the back-end directly from the URL (localhost:3000/api/download/file), it works perfectly. Here's the code below:
1) Angular component
downloadZipFile(index) {
  this._apiService.downloadZip(index).subscribe(data => {
  });
}
2) Angular apiService
downloadZip(index) {
  return this._http.get('http://localhost:3000' + appConfig.__apiUrl + 'download/' + index);
}
3) NodeJS API
router.get('/download/:index', (req, res) => {
  res.download(path.join(__dirname, 'downloads/' + req.params.index + '.zip'));
});
You are attaching the API URL multiple times in your Angular service's HTTP request. Please remove one of them and check again.
downloadZip(index) {
  return this._http.get('http://localhost:3000/download/' + index);
}
Add the above code in your service.
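Note also that Angular's HttpClient parses response bodies as JSON by default, so for a binary zip the request typically needs responseType: 'blob' as well; a minimal sketch of the same service method with that option:

downloadZip(index) {
  // ask HttpClient for a Blob instead of the default JSON parsing
  return this._http.get('http://localhost:3000/download/' + index, {
    responseType: 'blob'
  });
}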

Wildcard subdomain info sharing between node server and Nuxt/Vue client

We are building a multi-tenant solution with Node.js/Express for the back end and Vue.js/Nuxt for the front end. Each tenant will get their own subdomain, like x.mysite.com, y.mysite.com, etc.
How can we make both our back end and front end read the subdomain name and share it with each other?
I have some understanding that in the Vue client we can read the subdomain using window.location. But I think that's too late. Is there a better way? And what about the Node/Express setup? How do we get the subdomain info there?
Note that the Node/Express server is primarily an API to interface with the database and for authentication.
Any help or insight to put us on the right path is appreciated.
I'm doing something similar in my app. My solution looks something like this...
Front End: In router.vue, I check the subdomain to see what routes to return, using window.location.host. There are 3 options:
no subdomain loads the original routes (mysite.com)
portal subdomain loads the portal routes (portal.mysite.com)
any other subdomain loads the routes for the custom client subdomain, which can be anything and is dynamic
My routes for situation #3 look like this:
import HostedSiteHomePage from 'pages/hostedsite/hosted-site-home'

export const hostedSiteRoutes = [
  { path: '*', component: HostedSiteHomePage }
]
The asterisk means that any unmatched route will fall back to it.
In your fallback page (or any page), you will want this (beforeMount is the important part here):
beforeMount: function () {
  var host = window.location.host
  this.subdomain = host.split('.')[0]
  if (this.subdomain === 'www') this.subdomain = host.split('.')[1]
  this.fetchSiteContent()
},
methods: {
  fetchSiteContent() {
    if (!this.subdomain || this.subdomain === 'www') {
      this.siteContentLoaded = true
      this.errorLoadingSite = true
      return
    }
    // send subdomain to the server and get back a configuration object
    http.get('/Site/LoadSite', { params: { site: this.subdomain } }).then((result) => {
      if (result && result.data && result.data.success === true) {
        this.siteContent = result.data.content
      } else {
        this.errorLoadingSite = true
      }
      this.siteContentLoaded = true
    }).catch((err) => {
      console.log("Error loading " + this.subdomain + "'s site", err)
      this.errorLoadingSite = true
      this.siteContentLoaded = false
    })
  },
}
I store a configuration object as JSON in the database for each subdomain, return it to the client side for a matching subdomain, and then update the site to match the information/options in the config object.
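On the Node/Express side, the endpoint behind that http.get call can read the subdomain either from the query parameter the client sends or directly from the Host header; a minimal sketch (the route name mirrors the call above, and loadTenantConfig is a hypothetical database lookup):

import express from "express";

const app = express();

// hypothetical lookup of a tenant's stored config object (stubbed here)
async function loadTenantConfig(site: string) {
  return { site, theme: "default" };
}

app.get("/Site/LoadSite", async (req, res) => {
  // subdomain as sent by the client...
  const fromQuery = req.query.site as string | undefined;
  // ...or derived server-side: req.subdomains is ["x"] for x.mysite.com
  const site = fromQuery || req.subdomains[0];
  if (!site) return res.json({ success: false });
  const content = await loadTenantConfig(site);
  res.json({ success: true, content });
});

app.listen(5000);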
Here is my router.vue
These domain names are supported:
mysite.com (loads main/home routes)
portal.mysite.com (loads routes specific to the portal)
x.mysite.com (loads routes that support dynamic subdomain, fetches config from server)
y.mysite.com (loads routes that support dynamic subdomain, fetches config from server)
localhost:5000 (loads main/home routes)
portal.localhost:5000 (loads routes specific to the portal)
x.localhost:5000 (loads routes that support dynamic subdomain, fetches config from server)
y.localhost:5000 (loads routes that support dynamic subdomain, fetches config from server)
import Vue from 'vue'
import VueRouter from 'vue-router'
// 3 different routes objects in routes.vue
import { portalRoutes, homeRoutes, hostedSiteRoutes } from './routes'

Vue.use(VueRouter);

function getRoutes() {
  let routes;
  var host = window.location.host
  var subdomain = host.split('.')[0]
  if (subdomain === 'www') subdomain = host.split('.')[1]
  console.log("Subdomain: ", subdomain)
  // check for localhost to work in the dev environment
  // another viable alternative is to override /etc/hosts
  if (subdomain === 'mysite' || subdomain.includes('localhost')) {
    routes = homeRoutes
  } else if (subdomain === 'portal') {
    routes = portalRoutes
  } else {
    routes = hostedSiteRoutes
  }
  return routes;
}

let router = new VueRouter({
  mode: 'history',
  routes: getRoutes()
})

export default router
As you can see, I have 3 different sets of routes, one of which supports dynamic subdomains. Once I load the dynamic subdomain page, I send a GET request to the server and fetch a configuration object that tells the front end what that site should look like.
