✨ Hello everyone! ✨
General Problem:
I have a web app with about 50 images that should not be accessible before the user logs into the site. I suspect this has a simple answer, since plenty of sites need this same basic protection. Maybe I just don't know the right words to google here, but I'm having a bit of trouble. Any help is appreciated.
App details:
My web app is built in TypeScript React, with a Node.js/Express/MongoDB backend. Fairly typical stuff.
What I have tried:
My best idea so far was to upload them into the public folder on the backend server hosted on Heroku, then protect any URL containing "/images/" with authentication middleware. This works, partially: I can see the images when I call the API from Postman with the authentication header, but I cannot figure out a way to display them in my React web app. Here is the basic call I used:
fetch(url, {
  headers: {
    Authorization: token,
  },
});
and then the actual response is just an empty object when I try to copy it:
{}
but I also get this when I console.log the raw response, some kind of ReadableStream:
Following a related question, I came up with the following (which is normally wrapped in an async function):
const image = await fetch(url,{headers:{ Authorization:token}});
const theBlob = await image.blob();
console.log(URL.createObjectURL(theBlob));
which gives me the link http://localhost:3000/b299feb8-6ee2-433d-bf05-05bce01516b3, which only displays a blank page.
Any help is very much appreciated! Thanks! 😄
After lots of work trying to understand what's going on, here is my own answer:
const image = await axios(url, { responseType: "blob", headers: {Authorization: token }});
const srcForImage = URL.createObjectURL(image.data)
Why it makes sense now
So I did not understand the inner workings of what was going on. Please correct me, but the following is my understanding:
So the image was being sent as binary. What I had to do to fix that was set the responseType in axios to "blob", which makes axios hand back the data as a Blob object instead of leaving me to decode the raw binary myself. Then URL.createObjectURL does some magic: it registers the Blob with the browser and returns a URL that points at it. We can then just use that as the image URL. When you visit it yourself, you must include the 'blob:' part of the URL it gives you too, otherwise the page is blank; or stick it in <img src={srcForImage}/> and it works great. I bet it would've worked in the original fetch example in the question, I just never put the URL in an img tag or included 'blob:' as part of the URL.
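For completeness, here is a minimal sketch of the same flow using the original fetch approach (untested on my side; assuming url and token are defined as in the question, and the img selector is just for illustration):
async function loadProtectedImage(url, token) {
  const response = await fetch(url, { headers: { Authorization: token } });
  const blob = await response.blob();       // the binary image data as a Blob
  const src = URL.createObjectURL(blob);    // yields "blob:http://localhost:3000/..."
  document.querySelector('img').src = src;  // or <img src={src} /> in React
}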
That's correct: you send the auth token, and the backend uses it to authenticate the user (check that they exist in the DB, that they have the correct role, and verify the JWT too).
The server only responds with the images if the above checks pass.
If your server is responding with an empty object, then the problem is the backend, not the frontend; console.log what you're sending to the frontend.
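For illustration, a minimal sketch of what that backend check could look like, assuming an Express app and the jsonwebtoken library (the secret env var and image folder are placeholders, not from the question):
const express = require('express');
const jwt = require('jsonwebtoken');

const app = express();

// reject any request without a valid token before static files are served
function requireAuth(req, res, next) {
  try {
    jwt.verify(req.headers.authorization, process.env.JWT_SECRET);
    next();
  } catch (err) {
    res.status(401).json({ error: 'Not authenticated' });
  }
}

// everything under /images requires a valid token
app.use('/images', requireAuth, express.static('public/images'));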
Related
I've developed an API on Firebase Cloud Functions and I want to add a docs path to it.
I'm using Swagger, and I can test it successfully locally (localhost:PORT/docs), but when I deploy the function to Firebase it doesn't work: it redirects me to an authorization page.
I think I figured out why this is:
Let's say the name of my Cloud function is cfunc. Then the base url for it is something like https://region-name-project-name.cloudfunctions.net/cfunc. Based on how I included the swagger documentation:
const swaggerDoc = require('./docs/swagger.config.json')
app.use(
  '/docs',
  allowCors,
  swaggerUi.serve,
  swaggerUi.setup(swaggerDoc, {
    customCssUrl: '/assets/swagger.css',
    customSiteTitle: 'My Function Title',
    customfavIcon: '/assets/logo.ico',
    swaggerOptions: {
      supportedSubmitMethods: [] // to disable the "Try it out" button
    }
  })
)
the docs should be located at https://region-name-project-name.cloudfunctions.net/cfunc/docs. When I try to access that URL, watching "Network" in my browser DevTools, it attempts a GET at that URL with a 304 response, then redirects to https://region-name-project-name.cloudfunctions.net/docs. That's what brings up the Google authentication page: since there's no Cloud Function named "docs", Google thinks I'm trying to access something else in Firebase Cloud Functions (the same thing happens with something like https://region-name-project-name.cloudfunctions.net/tomato).
But I still don't know how to fix this redirect or why it's happening. I tried adding the Cloud Function URL to the host parameter of swagger.config.json, plus some CORS modifications, like allowing more request methods, adding JSON as a content type, and allowing authentication headers, but nothing seems to work.
Hope I was clear enough; if not, tell me any other info you need (it's one of my first posts here :B)
Found the SOLUTION
After testing a BUNCH of different things, I found out that the redirect was in fact always removing one segment of the path. For example, I changed the docs endpoint to '/something/docs'; accessing https://region-name-project-name.cloudfunctions.net/cfunc/something/docs then redirected to https://region-name-project-name.cloudfunctions.net/cfunc/docs, which did not bring up the Google authentication page but was no longer a valid path for my docs, so it returned 'Cannot GET /cfunc/docs'.
For some reason this redirect DOES NOT happen if you add an extra forward slash ('/') at the end of the documentation URL. So in the first case, where the docs endpoint is just '/docs', accessing https://region-name-project-name.cloudfunctions.net/cfunc/docs/ does it. I do not know why that is (I'll probably open an issue on the Swagger repo), but if someone has extra insight into why, or how to make it work otherwise, it would be awesome to hear.
Hope this helps someone else!
EDIT:
Oh, and another thing I forgot: it's apparently better to set up swagger-ui as if you were using an Express Router, even if you are not (maybe Firebase loads the Cloud Function with something like a router). So instead of app.use('/docs', swaggerUi.serve, swaggerUi.setup(swaggerDoc)), do app.use('/docs', swaggerUi.serve) and then app.get('/docs', swaggerUi.setup(swaggerDoc)).
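Spelled out with the identifiers from my snippet above, the router-style setup looks roughly like this (options trimmed for brevity):
app.use('/docs', swaggerUi.serve);
app.get('/docs', swaggerUi.setup(swaggerDoc, {
  customSiteTitle: 'My Function Title',
  swaggerOptions: {
    supportedSubmitMethods: [] // to disable the "Try it out" button
  }
}));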
I'm building a desktop app using Electron with Vue as the framework.
I also need to authenticate the user with Azure AD, and I'm using msal-node as the library for that.
I'm able to authenticate with the server in Azure and get the user info, but I cannot figure out how to set the redirect URL.
First I have to say that the behaviour changes drastically between dev and prod. I'm going to explain both scenarios and, in each of them, both with and without history mode.
DEV - using createWebHistory
Return URL in Azure and in the .env file: http://localhost:8080/
This is what I get from the DevTools during normal navigation (not authenticated):
And this is what I get after the authentication (the call to the API is successful):
Blank page in the app.
DEV - using createWebHashHistory
Return URL in Azure and in the .env file: http://localhost:8080/#/
After the authentication (failed):
Blank page in the app.
PROD
In prod I must use createWebHashHistory, otherwise I get a blank page from the beginning.
The first problem I have in production is the URL itself.
When I create the window I call the following url:
await win.loadURL('app://./index.html')
In Azure I cannot use the same URL, because it's not a valid URL.
If I use just:
await win.loadURL('app://index.html')
I get a blank page.
Any idea?
Thank you
The solution I've found is pretty simple. It's probably not the most "elegant", but it works, at least for prod. In dev I still had the same weird problem described above.
Basically I'm starting a node server (localhost:3031, for example) within the app itself, catching the redirect URL with it (localhost:3031/redirect), and serving the internal URL from there:
expressApp.get('/redirect', async (req, res) => {
  await win.loadURL('app://./index.html#about')
})
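For context, here's a minimal sketch of the server around that route. The port comes from above; the rest (the express setup, the response body) is just an illustration, and win is the Electron BrowserWindow created in the main process:
const express = require('express');
const expressApp = express();

expressApp.get('/redirect', async (req, res) => {
  // load the internal app URL into the existing Electron window
  await win.loadURL('app://./index.html#about');
  res.send('Authentication complete, you can close this window.'); // placeholder body
});

expressApp.listen(3031);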
As I said, it works and I don't see any security issue with that, but if you have any other ideas or suggestions, please let me know.
Thank you
UPDATE
I've found the issue with dev as well. To authenticate, I'm using what Microsoft suggests in its documentation.
If you look at the file AuthProvider.js, there is this portion of code at the beginning:
const CUSTOM_FILE_PROTOCOL_NAME = process.env.REDIRECT_URI.split(':')[0];
Further down, in the method getTokenInteractive, there is this other piece of code that applies the new protocol:
protocol.registerFileProtocol(CUSTOM_FILE_PROTOCOL_NAME, (req, callback) => {
  const requestUrl = new URL(req.url)
  // note: the WHATWG URL class has no .path property; .pathname is what's needed here
  callback(path.normalize(`${__dirname}/${requestUrl.pathname}`))
})
In dev my REDIRECT_URI is "http://localhost:3031/redirect", but the app protocol must be "app" (or whatever you've chosen) in order to work with Vue. So I just wrapped this last method in a condition based on the environment, and now everything works as expected everywhere.
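Roughly, the guard looks like this (a paraphrase of my fix, not the exact code; the env check is one way to express "based on the environment"):
// only register the custom protocol outside dev, where REDIRECT_URI is http://localhost:3031/redirect
if (process.env.NODE_ENV !== 'development') {
  protocol.registerFileProtocol(CUSTOM_FILE_PROTOCOL_NAME, (req, callback) => {
    const requestUrl = new URL(req.url);
    callback(path.normalize(`${__dirname}/${requestUrl.pathname}`));
  });
}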
I hope all this can be useful to someone.
I ran into a similar issue and your solution helped me out, thank you! Can I ask how you handled the logout redirect?
Also, have you tried onBeforeRequest to handle the redirects instead of a node server?
It was used as an example in an Auth0 blog post: https://auth0.com/blog/securing-electron-applications-with-openid-connect-and-oauth-2/
Trying to upload an image via XMLHttpRequest(); I'm having trouble understanding what the correct URL is to pass to xhr.open(...).
I'm following this example for the server-side code and everything.
var xhr = new XMLHttpRequest();
xhr.open('post', '../../../../server/routes/saveImage.js', true);
xhr.onload = function() {
  if (this.status == 200) {
    resolve(this.response);
  } else {
    reject(this.statusText);
  }
};
The project directory looks something like this:
Project
├── client
│   └── app
│       └── component
│           └── product
│               └── addproduct.js   <-- xhr.open is called from here
└── server
    └── routes
        └── saveImage.js            <-- file being called
Also, regarding paths: let me know if there is a more convenient way to work out the correct access path or absolute path to use in the URL.
I think there is a conceptual problem on several levels.
First, when you XHR (AJAX) a URL, you are accessing that URL from the CLIENT SIDE. So let's say you have an app that is HTTP-posting or -getting a URL: do you have that file on the client side? The answer is obviously NO.
Let's say you are hosting the app at:
http://localhost/myapps/app
So when you request ./someFile.txt, ../someFile.txt, or ../../someFile.txt, you are actually requesting:
./someFile.txt     -> http://localhost/myapps/app/someFile.txt
../someFile.txt    -> http://localhost/myapps/someFile.txt
../../someFile.txt -> http://localhost/someFile.txt
Now, for your problem: you need to host the server-side upload code somewhere. The example assumes the server-side code is hosted at, for example, http://localhost/upload, and uses xhr.open('post', '/upload', true);
You need to understand that require-ing, importing, or fs.readFile-ing a file accesses the path internally on the server. But once you host the app, any client-side code like AJAX (XHR) accesses a URL from the outside.
The problem was indeed with setting up a server-side route. This was mostly clarified thanks to the answer from #Arkita and the comment from #Kasper, but I went ahead and dug for a solution. This may not be very useful to others, since it was a dumb question in the first place, but here it goes.
On the client side:
xhr.open('post','/saveImage/save',true);
On the server side, if you are using Express.js or another connect-based framework:
app.post('/saveImage/save',(req,res,next)=> {....})
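The handler body is elided above; a minimal sketch of what it could look like, assuming multer for the multipart upload (multer and the 'image' field name are my choices, not required by anything above). On the client, the file would go into a FormData object passed to xhr.send(formData).
const express = require('express');
const multer = require('multer');

const app = express();
const upload = multer({ dest: 'uploads/' }); // uploaded files land in ./uploads

app.post('/saveImage/save', upload.single('image'), (req, res) => {
  // req.file holds the uploaded image's metadata and temporary path
  res.json({ ok: true, filename: req.file.filename });
});

app.listen(3000);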
Also, the example I linked above may be outdated; this seems more helpful.
I want to send a request to this Amazon Alexa API.
That page contains the last 50 activities I performed with my Amazon Echo, returned as JSON. Before you can request that page, you need to authorize your account so the proper cookies are set in your browser.
If I do something as simple as:
const rp = require("request-promise");
const options = {
  method: "GET",
  uri: "https://alexa.amazon.com/api/activities?startTime=&size=50&offset=-1",
  json: true
};
rp(options).then(function(data) {
  console.log(data);
}).catch(function(err) {
  console.log(err);
});
I can send a GET request to that URL. This works fine, except Amazon has no idea it's me sending the request, because I haven't authorized my Node.js application.
I've successfully copied ~10 cookies from my regular browser into an incognito tab and authorized that way, so I know copying the cookies will work. But after adding them all using tough-cookie, it didn't work; I still got redirected to the signin page (according to the error response).
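For reference, this is roughly the kind of cookie-jar setup I mean, assuming request-promise's jar API; the cookie name, value, and domain below are placeholders:
const rp = require("request-promise");

const jar = rp.jar(); // a tough-cookie backed cookie jar
// repeat for each cookie copied from the browser session
jar.setCookie("session-id=PLACEHOLDER; Domain=.amazon.com; Path=/", "https://alexa.amazon.com");

const options = {
  method: "GET",
  uri: "https://alexa.amazon.com/api/activities?startTime=&size=50&offset=-1",
  json: true,
  jar: jar // attach the cookies to the request
};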
How do I authorize for this API, so I can send my requests?
I have been looking for a solution to this too. The best idea I have is to use account linking, but I haven't tried it yet. It looks like the ASK CLI has an interface for this as well, but I can't figure out how to use it (what is that URL?). Linking an account to a 3rd-party server is not easy, but linking it back to Amazon for the JSON API should not be that complicated.
There is a website that works with virtual items for an online game. I made a Chrome extension that automates some actions on that website. Since I'd like to run this on my Raspberry Pi (and Chromium with the extension seems too slow and inefficient), I am trying to move it into Node.js.
The login for the website works with Steam OpenID. The site lets you select items from a list and click a few buttons, then it sends you a trade offer on Steam.
My extension works with the website while I'm logged in there. It fetches their database with jQuery's getJSON, loops through the array, pushes some values into another array, and then sends a POST request telling the website which items I want and which items I am offering.
Here is how I am sending the request from Chrome:
function withdrawXHR(botId, playerItems, botItems) {
$.ajax({
url: websiteURL,
type: 'post',
data: {
"steamid": botId,
"peopleItems": playerItems,
"botItems": botItems
},
success: function (data) {
console.error('>> Done: ' + data)
console.log("")
},
error: function(XMLHttpRequest, textStatus, errorThrown) {
console.error('>> Error: ' + errorThrown)
console.log("")
}
});
}
I can do everything in Node so far, like fetching their database, working through it, and filtering out the values I need, but I can't manage to send a working request. The problem is probably the login / how the website knows who I am.
I used wrapAPI (a Chrome extension) to capture the request that is sent when working with the website manually. Here is what it looks like:
So these are the things I am wondering about:
How would I send this request from node?
How does the website know who I am? They obviously do, because they send me an offer, but I can't see any "personal" data in that request.
Would I need to log into Steam OpenID from Node in some way? Is that possible?
What is CF-RAY? (See the end of the captured request.)
I am quite new to JS and requests in general, and even "newer" to Node.js. I don't fully understand how sending requests works behind the scenes. I just need some tips and ideas on how to achieve my goal here.
Any help is greatly appreciated! Thank you! :)
You cannot use XMLHttpRequest for resources across domains (unless, incidentally, you are using an extension).
I would look into grabbing Express.js and something called CORS. CORS permits cross-domain requests.
Here: http://enable-cors.org/server_expressjs.html
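From that page, the Express middleware boils down to something like this (assuming app is your Express app; the wildcard origin is an example you'd tighten for production):
app.use(function(req, res, next) {
  res.header("Access-Control-Allow-Origin", "*"); // allow any origin (example only)
  res.header("Access-Control-Allow-Headers", "Origin, X-Requested-With, Content-Type, Accept");
  next();
});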
And here is some information on XHR requests in browser extensions: https://developer.chrome.com/extensions