Using Retrofit to make a @POST request but sending XML body content in Kotlin? - retrofit2

I am very new to Kotlin, and I am wondering if this is possible to do in Kotlin.
I have a request that is basically a POST to an endpoint; it sends information as part of the body, something like the following:
<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE...">
<some more tags...>
<request>
    <clientAuth>
        <client>
            {client-id}
        </client>
    </clientAuth>
....
So I basically receive some parameters (like client-id) and I need to build this XML (which is what I currently do in Postman) and send the request.
Checking some libraries, I noticed this seems to be possible with Retrofit, but I have no idea how. Is it really possible?
I know form-URL-encoded requests are simpler thanks to @FormUrlEncoded, but I am not sure if there is an equivalent for XML...
So far I have this interface:
@POST("user-endpoint")
fun clientRequest(
    @Field("clientId") clientId: String
): Deferred<Response<Void>>
and I invoke clientRequest from another function.
Testing it, the logic works fine for endpoints without parameters, but I have no idea how to send that XML. Is it even possible?
I found out I can use Tickaroo's TikXml to represent my body as an object and then use Retrofit without problems.
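To expand on that, below is a minimal sketch of how this can look with TikXml and its Retrofit converter. The endpoint name, element names, and dependency versions are assumptions based on the snippets above, so treat it as a starting point rather than a drop-in solution (it also uses a suspend function instead of Deferred for brevity):

// Assumed Gradle dependencies (versions are a guess, check the TikXml repo):
// implementation("com.tickaroo.tikxml:annotation:0.8.13")
// kapt("com.tickaroo.tikxml:processor:0.8.13")
// implementation("com.tickaroo.tikxml:retrofit-converter:0.8.13")

import com.tickaroo.tikxml.TikXml
import com.tickaroo.tikxml.annotation.Element
import com.tickaroo.tikxml.annotation.PropertyElement
import com.tickaroo.tikxml.annotation.Xml
import com.tickaroo.tikxml.retrofit.TikXmlConverterFactory
import retrofit2.Response
import retrofit2.Retrofit
import retrofit2.http.Body
import retrofit2.http.POST

// Mirrors the <request><clientAuth><client>...</client></clientAuth></request>
// fragment from the question; adjust names to the real schema.
@Xml(name = "clientAuth")
class ClientAuth(@PropertyElement(name = "client") val clientId: String)

@Xml(name = "request")
class ClientRequestBody(@Element val clientAuth: ClientAuth)

interface ClientApi {
    // The converter serializes the @Body object to XML for the request body.
    @POST("user-endpoint")
    suspend fun clientRequest(@Body body: ClientRequestBody): Response<Void>
}

val retrofit: Retrofit = Retrofit.Builder()
    .baseUrl("https://example.com/api/") // hypothetical base URL
    .addConverterFactory(TikXmlConverterFactory.create(TikXml.Builder().build()))
    .build()

val api: ClientApi = retrofit.create(ClientApi::class.java)
// Later, inside a coroutine:
// api.clientRequest(ClientRequestBody(ClientAuth("some-client-id")))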

Related

Cyrillic input retrieved as Mojibake [duplicate]

When I was still using PrimeFaces v2.2.1, I was able to type unicode input such as Chinese with a PrimeFaces input component such as <p:inputText> and <p:editor>, and retrieve the input in good shape in managed bean method.
However, after I upgraded to PrimeFaces v3.1.1, all those characters become Mojibake or question marks. Only Latin input comes through fine; it is the Chinese, Arabic, Hebrew, Cyrillic, etc. characters which become malformed.
How is this caused and how can I solve it?
Introduction
Normally, JSF/Facelets will set the request parameter character encoding to UTF-8 by default when the view is created/restored. But if any request parameter is requested before the view is created/restored, then it's too late to set the proper character encoding, because the request parameters are parsed only once.
PrimeFaces encoding fail
That this fails in PrimeFaces 3.x after upgrading from 2.x is caused by the new isAjaxRequest() override in PrimeFaces' PrimePartialViewContext, which checks a request parameter:
@Override
public boolean isAjaxRequest() {
    return getWrapped().isAjaxRequest()
        || FacesContext.getCurrentInstance().getExternalContext().getRequestParameterMap().containsKey("javax.faces.partial.ajax");
}
By default, isAjaxRequest() (the one of Mojarra/MyFaces, which the above PrimeFaces code obtains by getWrapped()) checks a request header as follows. This does not affect the request parameter encoding, as request parameters won't be parsed when a request header is obtained:
if (ajaxRequest == null) {
    ajaxRequest = "partial/ajax".equals(ctx.
        getExternalContext().getRequestHeaderMap().get("Faces-Request"));
}
However, isAjaxRequest() may be called by any phase listener, system event listener, or application factory before the view is created/restored. So when you're using PrimeFaces 3.x, the request parameters will be parsed before the proper character encoding is set, and hence use the server's default encoding, which is usually ISO-8859-1. This messes up everything.
Solutions
There are several ways to fix it:
Use a servlet filter which calls ServletRequest#setCharacterEncoding() with UTF-8. Setting the response encoding via ServletResponse#setCharacterEncoding() is, by the way, unnecessary, as it is not affected by this issue.
import java.io.IOException;
import javax.servlet.*;
import javax.servlet.annotation.WebFilter;

@WebFilter("/*")
public class CharacterEncodingFilter implements Filter {

    @Override
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain) throws ServletException, IOException {
        request.setCharacterEncoding("UTF-8");
        chain.doFilter(request, response);
    }

    // ...
}
You only need to take into account that HttpServletRequest#setCharacterEncoding() only sets the encoding for POST request parameters, not for GET request parameters. For GET request parameters you'd still need to configure it at server level.
If you happen to use the JSF utility library OmniFaces, such a filter is already provided out of the box: the CharacterEncodingFilter. Just install it as below in web.xml as the first filter entry:
<filter>
    <filter-name>characterEncodingFilter</filter-name>
    <filter-class>org.omnifaces.filter.CharacterEncodingFilter</filter-class>
</filter>
<filter-mapping>
    <filter-name>characterEncodingFilter</filter-name>
    <url-pattern>/*</url-pattern>
</filter-mapping>
Reconfigure the server to use UTF-8 instead of ISO-8859-1 as default encoding. In case of Glassfish, that would be a matter of adding the following entry to <glassfish-web-app> of the /WEB-INF/glassfish-web.xml file:
<parameter-encoding default-charset="UTF-8" />
Tomcat doesn't support it. It has the URIEncoding attribute on the <Connector> entry in server.xml, but this applies to GET requests only, not to POST requests.
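For reference, a typical (assumed) server.xml entry with that attribute looks like the following; it only affects how the GET query string is decoded:

<!-- server.xml: decode GET request URIs as UTF-8 (port/timeout values are placeholders) -->
<Connector port="8080" protocol="HTTP/1.1"
           connectionTimeout="20000" redirectPort="8443"
           URIEncoding="UTF-8" />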
Report it as a bug to PrimeFaces. Is there really any legitimate reason to check whether the HTTP request is an ajax request by checking a request parameter instead of a request header, as you would do for standard JSF and, for example, jQuery? The PrimeFaces core.js JavaScript is doing exactly that. It would be better if it set this as a request header on the XMLHttpRequest.
Solutions which do NOT work
Perhaps you'll stumble upon the below "solutions" somewhere on the Internet while investigating this problem. Those solutions won't ever work in this specific case. An explanation follows.
Setting XML prolog:
<?xml version='1.0' encoding='UTF-8' ?>
This only tells the XML parser to use UTF-8 to decode the XML source before building the XML tree around it. The XML parser actually being used by Facelets is SAX during JSF view build time. This part has nothing at all to do with HTTP request/response encoding.
Setting HTML meta tag:
<meta http-equiv="Content-Type" content="text/html; charset=utf-8"/>
The HTML meta tag is ignored when the page is served over HTTP via a http(s):// URI. It's only used when the page is saved by the client as an HTML file on the local disk and then reopened via a file:// URI in the browser.
Setting HTML form accept charset attribute:
<h:form accept-charset="UTF-8">
Modern browsers ignore this. It only has an effect in the Microsoft Internet Explorer browser, and even then it does it wrongly. Never use it. All real web browsers will instead use the charset attribute specified in the Content-Type header of the response. Even MSIE will do it the right way as long as you do not specify the accept-charset attribute.
Setting JVM argument:
-Dfile.encoding=UTF-8
This is only used by the Oracle(!) JVM to read and parse the Java source files.

Sending an HTML email with base64 image as part of the HTML

I'm trying to send an email, with HTML content that includes an image tag,
for example:
<img ng-src="data:image/gif;base64,iVBORw0KGgoAAAANSUhEU...gAAASwAAAAmCC" />
Unfortunately, none of the mail clients I'm using support this kind of "src" on an image tag.
I tried to Google it; it seems to be a known issue, but none of the answers worked for me.
By the way, I'm using AngularJS to bind the model to the HTML content, then pass it as an HTML string
to the WebApi controller, and then send it with an SMTP client.
Hope someone can help me solve this somehow.
Thanks,
Nadav S.
Yes, that is correct. Most clients do not support the "data:" URL, and even if they do, the size of the binary you can embed is very limited: barely enough for a thumbnail, not enough for a real picture.
The correct way to do this is with MIME multipart/related and the "cid:" URL. One part contains the HTML and the other part contains the base64-encoded picture. The image part carries a header field called "Content-ID", whose value is any unique string surrounded by <>. For example:
Content-ID: <xxxyyy>
In your HTML you use the following code:
<img src="cid:xxxyyy"/>
See RFC 2392 for the full specification.
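For illustration, here is a rough sketch of assembling such a multipart/related message with the JavaMail API in Kotlin. The asker is on .NET's SmtpClient, where AlternateView and LinkedResource (with its ContentId property) play the same roles; the host, addresses, and file name below are placeholders:

import java.util.Properties
import javax.activation.DataHandler
import javax.activation.FileDataSource
import javax.mail.Message
import javax.mail.Part
import javax.mail.Session
import javax.mail.Transport
import javax.mail.internet.InternetAddress
import javax.mail.internet.MimeBodyPart
import javax.mail.internet.MimeMessage
import javax.mail.internet.MimeMultipart

fun sendHtmlWithInlineImage() {
    val session = Session.getInstance(Properties().apply {
        put("mail.smtp.host", "smtp.example.com") // placeholder SMTP host
    })

    // The HTML part references the image through the cid: URL.
    val htmlPart = MimeBodyPart().apply {
        setContent("""<html><body><img src="cid:xxxyyy"/></body></html>""",
                   "text/html; charset=UTF-8")
    }

    // The image part carries the matching Content-ID header.
    val imagePart = MimeBodyPart().apply {
        dataHandler = DataHandler(FileDataSource("picture.gif")) // placeholder file
        setHeader("Content-ID", "<xxxyyy>")
        disposition = Part.INLINE
    }

    val message = MimeMessage(session).apply {
        setFrom(InternetAddress("sender@example.com"))
        setRecipients(Message.RecipientType.TO, "receiver@example.com")
        subject = "Inline image demo"
        // multipart/related ties the HTML and the image into one document.
        setContent(MimeMultipart("related").apply {
            addBodyPart(htmlPart)
            addBodyPart(imagePart)
        })
    }
    Transport.send(message)
}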

Reusing Yesod widgets in AJAX results

I'm writing a very simple Yesod message list that uses AJAX to add new list items without reloading the page (both in the case of other users modifying the database, and of the client themselves adding an item). This means I have to encode the HTML structure of the message items in both the Hamlet template (when the page loads initially) and the Julius template (for when the dynamic addition happens). They look something like this:
In homepage.hamlet:
$if not $ null messages
    <ul id=#{listId}>
        $forall Entity mid message <- messages
            <li id=#{toPathPiece mid}>
                <p>#{showMarkdown $ messageText message}
                <abbr .timeago title=#{showUTCTime $ messagePosted message}>
And in homepage.julius:
function(message) {
    $('##{rawJS listId}').prepend(
        $('<li>')
            .attr('id', message.id)
            .append('<p>' + message.text + '</p>')
            .append($('<abbr class=timeago />')
                .attr('title', message.posted).timeago())
            .slideDown('slow')
    );
}
I'd love to be able to unify these two representations somehow. Am I out of luck, or could I somehow abuse widgets into both generating an HTML response, and filling in code in a JavaScript file?
Note: Of course, I understand that the templates would have to work very differently, since the AJAX call is getting its values from a JS object, not from the server. It's a long shot, but I thought I'd see if anyone's thought about this before.
I think it's something of an AJAX best practice to pick one place to do your template rendering, either on the server or the client. Yesod is (currently) oriented toward doing the rendering on the server.
This can still work with AJAX replacement of contents, though. Instead of getting a JSON response from the POST, you should get a text/html response that contains the result of rendering the template on the server with the values that would otherwise have been returned via JSON, and then replace the innerHTML of the DOM node that's being updated.
If you want to support both JSON and HTML responses (to support third-party applications via an API or something), you would have to make the format of the response a function of the request: either by appending ".json" or ".html" to the URL, or by including an HTTP header that lists the specific document type required by the client.
It would be nice if Yesod provided a 'jwhamlet' template or something that would render the HTML via JavaScript in order to support client-side rendering, but I'm not aware of one. That's not to say there isn't one, though, so keep an eye open for other answers.
If you wanted to make such a thing, you might try tweaking the hamlet quasi-quote code so that instead of expanding the quasi-quotes to an HTML-generating function, it expanded them to a JSON-generating function plus a pre-rendered chunk of text that's a mustache-style template, such that the JSON returned by the function would provide the correct context for the template to be rendered the way you want.
