I'm sending a request from Node.js to a server, and the URL query string contains Cyrillic text, like this: https://somesite.com/wf/server/postSomeStuff?id=13&name=Имя.pdf&other=true
But the server sees Имя as something like A;>1>60=I8=0. I want this query to be encoded properly. I tried setting headers like Accept-Charset and Accept-Encoding, but that didn't help. How can I change the encoding only in the URL, not in the content?
You can use the JavaScript encodeURI function before sending the request, for example:
var url = 'https://somesite.com/wf/server/postSomeStuff?id=13&name='+encodeURI('Имя.pdf')+'&other=true';
and decode it on the server side
var name = decodeURI(req.query.name); // query-string values arrive in req.query in Express
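For query-string values specifically, encodeURIComponent is the safer choice, since it also escapes characters such as & and = that encodeURI leaves alone. A minimal sketch of both sides, assuming the client uses Node's built-in https module and the receiving end is an Express handler (the endpoint path is the one from the question):

// Client (Node.js): percent-encode the Cyrillic value before building the URL.
var https = require('https');

var name = 'Имя.pdf';
var url = 'https://somesite.com/wf/server/postSomeStuff' +
  '?id=13&name=' + encodeURIComponent(name) + '&other=true';
// "Имя" goes over the wire as %D0%98%D0%BC%D1%8F

https.get(url, function (res) { /* handle response */ });

// Server (Express): the query parser percent-decodes values for you,
// so req.query.name is already the original UTF-8 string.
app.get('/wf/server/postSomeStuff', function (req, res) {
  console.log(req.query.name); // Имя.pdf
  res.end();
});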
I want to do some operations with the response from the Python requests library. After I make the call below:
response = requests.get(f'{AUTHORIZE_URL}?client_id={CLIENT_ID}&response_type=code&state={STATE}&redirect_uri={REDIRECT_URI}')
I need to get a URL like this in return:
http://127.0.0.1:8000/products/auth/?state=2b33fdd45jbevd6nam&code=MGY1MTMyNWY0YjQ0MzEwNmMxMjY2ZjcwMWE2MWY5ZDE5MzJlMjA1YjdkNWExNGRhYjIzOGI5NzQ5OWZkNTA5NA
It would be easier to use JSON to get the state and code values from the URL, but I can't, because I think the content type does not allow it.
See this for an explanation of the Content-Type header: Content-Type
In short, the Content-Type in the headers of the response returned by requests.get tells you what kind of content the server sent. In your case you got a response in the form of HTML (like an .html document), which you can read with response.text. If the Content-Type is application/json, then you can read it as JSON with response.json().
I see that you are using a local server. Your server should send the header "Content-Type": "application/json", and then you should be able to read JSON from the response like this (the server needs to send JSON, not HTML or plain text):
targetURL = 'http://127.0.0.1:8000/products/auth/?state=2b33fdd45jbevd6nam&code=MGY1MTMyNWY0YjQ0MzEwNmMxMjY2ZjcwMWE2MWY5ZDE5MzJlMjA1YjdkNWExNGRhYjIzOGI5NzQ5OWZkNTA5NA'
requests.get(targetURL).json()
I'm having an issue with a Node.js REST API created using Express.
I have two routes, a GET and a POST, set up like this:
router.get('/:id', (request, response) => {
  console.log(request.params.id);
});

router.post('/:id', (request, response) => {
  console.log(request.params.id);
});
Now, I want the ID to be able to contain special characters (UTF-8).
The problem is, when I use postman to test the requests, it looks like they are encoded very differently:
GET http://localhost:3000/api/â outputs â
POST http://localhost:3000/api/â outputs â
Does anyone have any idea what I am missing here?
I must mention that the POST call also contains a file upload, so the content type will be multipart/form-data.
You should encode your URL on the client and decode it on the server. See the following articles:
What is the proper way to URL encode Unicode characters?
Can urls have UTF-8 characters?
Which characters make a URL invalid?
For JavaScript, encodeURI may come in handy.
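As a concrete sketch (assuming an Express app like the one in the question): percent-encode the path segment on the client, and Express decodes the route param for you on the server.

// Client: encode the path segment before putting it in the URL.
var id = 'â';
var url = 'http://localhost:3000/api/' + encodeURIComponent(id);
// -> http://localhost:3000/api/%C3%A2

// Server (Express): route params are percent-decoded automatically.
router.get('/:id', (request, response) => {
  console.log(request.params.id); // â
  response.send(request.params.id);
});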
It looks like Postman does UTF-8 encoding but NOT proper URL encoding. Consequently, what you type in the request URL box translates to something different from what would happen if you typed that URL in a browser.
I'm requesting: GET localhost/ä but it encodes it on the wire as localhost/ä
(This is now an invalid URL because it contains non-ASCII characters.)
But when I type localhost/ä in to google chrome, it correctly encodes the request as localhost/%C3%A4
So you could try manually url encoding your request to http://localhost:3000/api/%C3%A2
In my opinion this is a bug (perhaps a regression). I am using the latest version of Postman, v7.11.0, on macOS.
Does anyone have any idea what I am missing here?
Yeah, it doesn't output â, it outputs â, but whatever you're checking the result with thinks it's reading something else (ISO-8859-1 maybe?), not UTF-8, and so renders it as â.
Most likely you're viewing the result in a web browser, and the web server is sending the wrong Content-Type header. Try sending the response with an explicit charset, e.g. Content-Type: text/plain;charset=utf-8 or Content-Type: text/html;charset=utf-8; then your browser should render your â correctly.
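In the Express routes from the question, that would look something like the sketch below (response.set is the standard Express helper for setting response headers):

router.get('/:id', (request, response) => {
  // Tell the client explicitly that the body is UTF-8 text.
  response.set('Content-Type', 'text/plain; charset=utf-8');
  response.send(request.params.id); // the browser now renders â instead of â
});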
Can routes in Express not take a full URL as a parameter?
For example,
router.get("/new/:url", <some function>);
gives me the Cannot GET error when the :url is https://www.google.com
You can't pass a full URL in this format. This type of format is used to take parameters sent by the client:
router.get("/new/:url", <some function>);
// you can get the url as a param
req.params.url // use your URL
You should encode the URL parameter before sending it. Your example, encoded, would be https%3A%2F%2Fwww.google.com. On the server side you can decode the parameter to get the original value back.
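A minimal sketch of that approach, assuming the route from the question (fetch stands in here for whatever HTTP client you actually use):

// Client: percent-encode the whole URL so its "//" is not treated as path separators.
var target = encodeURIComponent('https://www.google.com');
// -> "https%3A%2F%2Fwww.google.com"
fetch('http://localhost:3000/new/' + target);

// Server: decode the parameter to get the original URL back.
router.get('/new/:url', (req, res) => {
  var original = decodeURIComponent(req.params.url);
  console.log(original); // https://www.google.com
  res.send(original);
});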
I think you are not fully aware of how Express routing works: your URL https://www.google.com contains //, and / is what Express uses for route separation.
In your case, we know that Express supports regex routes. I think the following route will work for you:
app.get("/new/:protocol(http:|https:|ftp:)?/?/:url", <some function>);
In the above case, you are limited to the protocols http, https, and ftp. You can add more protocols using the | separator (an OR condition), and if you don't know what the protocol will be, you can use the following:
app.get("/new/:protocol?/?/:url", <some function>);
In both routes above, ? means optional, so the routes work fine for:
/new/www.google.com
/new/https://www.google.com
And in your function, you can prepend the protocol to the URL like this:
function newUrl(req, res) {
  if (req.params.protocol)
    req.params.url = req.params.protocol + '//' + req.params.url;
  console.log(req.params.url);
}
I have a server that does URL redirection using Node.js. I use this to make the redirection:
response.writeHead(302, {Location: url});
response.end();
This works well with a normal URL like google.com, but when the URL contains other characters, like Cyrillic, it breaks. For example, if I set url = 'ru.wikipedia.org/wiki/Путин,_Владимир_Владимирович' (with https:// in front), the redirection fails. Do I have to somehow re-encode the string before passing it to the redirection? When I console.log(url), it displays the correct URL with the Cyrillic letters.
After some more tests, using node-icu-charset-detector, I managed to see that the data is encoded as follows:
----[NOTICE] charset: ISO-8859-2
----[NOTICE] redirect: https://ru.wikipedia.org/wiki/Путин,_Владимир_Владимирович
And the link I'm getting on my browser is like 'https://ru.wikipedia.org/wiki/%1FCB8=,%12;048%3C8#%12;048%3C8#%3E28G'
You can encode the URL, since HTTP header values don't support UTF-8 encoded values:
response.writeHead(302, {Location: encodeURI(url)});
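Put together, a minimal sketch of the redirect handler (the Wikipedia URL is the one from the question; header values are effectively limited to ASCII/Latin-1, which is why the raw Cyrillic gets mangled):

const http = require('http');

http.createServer((request, response) => {
  const url = 'https://ru.wikipedia.org/wiki/Путин,_Владимир_Владимирович';
  // encodeURI percent-encodes the Cyrillic characters but leaves reserved
  // characters like "/", ":" and "," alone, e.g. 'Путин' -> '%D0%9F%D1%83%D1%82%D0%B8%D0%BD'.
  response.writeHead(302, { Location: encodeURI(url) });
  response.end();
}).listen(3000);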
In Power Query, I can download data from the web using the Web.Contents function, but there's an API that requires the request to contain multipart/form-data in the following format:
"__rdxml"=<*Some data*>
So how do you do this using the Web.Contents function?
I tried doing:
...
PostContent = "__rdxml=<*Some data*>",
Source = Web.Contents(url, [Content = Text.ToBinary(PostContent)])
...
But the server responds with 400 Bad Request.
I checked the raw request with Fiddler; it seems the request is not being sent with a Content-Type: multipart/form-data header.
I tried manually adding the Content-Type: multipart/form-data header, but that doesn't work either. Same 400 Bad Request in the response.
Any idea?
multipart/form-data is a fairly complicated encoding, requiring a bunch of MIME-specific headers. I would first try to see if you can use application/x-www-form-urlencoded instead:
let
    actualUrl = "http://some.url",
    record = [__rdxml = "some data"],
    body = Text.ToBinary(Uri.BuildQueryString(record)),
    options = [Headers = [#"Content-type" = "application/x-www-form-urlencoded"], Content = body],
    result = Web.Contents(actualUrl, options)
in
    result
EDIT: I've come up with an example of using multipart/form-data with Power Query. It's at https://gist.github.com/CurtHagenlocher/b21ce9cddf54e3807317