I have a REST service that requires passing an encrypted key as part of the path. I URL-encode the key and it works great when placed straight into the browser. However, in my code I use WebRequest.Create, and that appears to unescape the encoded slashes produced by the encryption key. This results in the service treating them as part of the route, and it fails with a 404. Is this a known defect in the .NET Framework or am I missing something? It seems like a pretty big deal.
Edit: (Simplified sample code)
string key = System.Web.HttpUtility.UrlEncode(TripleDESEncode("sharedkey"));
string uri = string.Format("http://mydomail.com/deposit/{0}.{1}", key, "json");
//uri looks like this here http://mydomail.com/deposit/FHnapfF5yBCEKt3%2f3YOQ5g%3d%3d.json
HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create(uri);
//Now the address in the HttpWebRequest is this...
//http://mydomail.com/deposit/FHnapfF5yBCEKt3/3YOQ5g%3d%3d.json
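For anyone who wants to reproduce this, a minimal sketch (the domain and key value are just the placeholders from the sample above):
string encodedUri = "http://mydomail.com/deposit/FHnapfF5yBCEKt3%2f3YOQ5g%3d%3d.json";
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(encodedUri);
// Inspect the URI the framework will actually request; on affected framework
// versions the %2f may come back as a literal '/'.
Console.WriteLine(request.RequestUri.AbsoluteUri);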
Hopefully this helps.
OK, I ended up making a compromise with my client and skipped the encryption in favour of straight serialization to Base64. This was only acceptable due to the nature of what I am passing. Encryption will be required in the future, and I see this as a major problem that needs to be fixed, or at least to have a workaround proposed. If I come across one I will post it.
Thanks everyone!
Final code:
System.Web.HttpUtility.UrlEncode(Convert.ToBase64String(Encoding.UTF8.GetBytes("sharedKey")));
Use the UrlPathEncode method when the value is part of the path, not part of the query string:
string key = System.Web.HttpUtility.UrlPathEncode(TripleDESEncode("sharedkey"));
I am using Facebook OWIN authentication, more or less following the Microsoft sample. The first time a user logs in, everything is OK. But if they sign out and try again, it seems like the previous .AspNet.Correlation.Facebook cookie is not removed but is set to an empty string. So my next call to api/getexternallogin looks like this in Fiddler:
This is when we are generating a CorrelationId, and having multiple cookies at this point is not a show-stopper. In the response, I set it to the new CorrelationId:
Later, when Facebook calls back to "/signin-facebook", we try to validate the CorrelationId in the ValidateCorrelationId method. The request looks like this:
So the new CorrelationId has been set, but the extra cookie with no value means that when ValidateCorrelationId reads the correlation cookie from Request.Cookies, it gets back an empty string.
I have checked the code, and it seems like the only methods that modify this cookie are GenerateCorrelationId and ValidateCorrelationId. The implementation of these methods can be found here:
http://katanaproject.codeplex.com/SourceControl/latest#src/Microsoft.Owin.Security/Infrastructure/AuthenticationHandler.cs
Curiously enough, my browser does not seem to see the issue:
Any ideas will be much appreciated.
OK, this took a fair bit of frustration, but when Response.Cookies.Delete(".AspNet.Correlation.Facebook") is called in the ValidateCorrelationId method, it sends the following in the response:
So the value of "expires" has been split at the comma and treated as two separate "Set-Cookie" headers. Hence the cookie is not expired; its value is merely set to an empty string. It seems like the comma after "Thu" is causing it.
The fix I came up with was to comment out Response.Cookies.Delete(".AspNet.Correlation.Facebook") and do the following instead:
Response.Headers.Add("Set-Cookie", new[] { CorrelationKey + "=; path=/; expires=Fri 02-Jan-1970 00:00:00 GMT" });
No commas there and it is working now.
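For completeness, the same workaround wrapped as a small helper (a sketch only: CorrelationKey is assumed to hold ".AspNet.Correlation.Facebook", and this remains my workaround rather than a fix in Katana itself):
// Requires Microsoft.Owin (IOwinResponse).
private static void ExpireCorrelationCookie(IOwinResponse response, string correlationKey)
{
    // Write the Set-Cookie header directly, using an expiry date format with no
    // commas so the header is not split into two cookies.
    response.Headers.Add("Set-Cookie",
        new[] { correlationKey + "=; path=/; expires=Fri 02-Jan-1970 00:00:00 GMT" });
}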
This does seem like a genuine bug in OWIN.
I am writing a client side code in Visual C++ 2012 using C++ Rest SDK (codename "Casablanca").
I have created a client and wish to POST a text string to the server. However, with the following code, it compiles but the request is never sent.
When I remove everything after methods::POST and send a blank POST request, it is sent and received by the server.
Can you please guide me to where the problem is? The documentation for this function is available in the Casablanca documentation.
pplx::task<http_response> resp = client.request(methods::POST, L"", L"This is the random text that I wish to send", L"text/plain");
I think the usage you give here looks correct.
Is your Casablanca the latest version? Please check here: http://casablanca.codeplex.com/
If you are sure your observation is accurate, you may want to create a minimal repro and file a bug here: http://casablanca.codeplex.com/workitem/list/basic
I was having a similar problem; all my POSTs were arriving blank on the server. After a few hours working on it, I found a possible solution.
I changed the default content type to application/x-www-form-urlencoded and started passing the values like this, for example: data=text1&data2=text2
client.request(methods::POST, L"", L"data=text1&data2=text2", L"application/x-www-form-urlencoded");
The body parameter must be a json::value.
I cannot comment yet, so I have to put my thoughts in an answer. I solved this problem like this: there is an overload of the request method that takes the content type as a parameter, so you do not have to change the code.
m_client->request(methods::POST, L"/statuses/update.json?" + url_encode(data),L"",L"application/x-www-form-urlencoded");
Obviously you would have to implement the url_encode method, but that is not difficult. There is a pretty good implementation in Casablanca, and a search on this site will also turn up some good examples.
My Scenario:
I am using MonoTouch for iOS to create an iPhone app. I am calling ASP.NET MVC 4 Web API based HTTP services to log in and log off. For login, I use the POST web method and all is well. For logoff, I am calling the DELETE web method. I want to pass JSON data (serialized complex data) to the DELETE call. If I pass simple data, such as a single string parameter, as part of the URL itself, then all is well, i.e. DELETE does work. In order to pass the complex JSON data, here is my call (I have simplified the code by showing just one parameter, UserName, being sent via JSON):
HttpWebRequest req = (HttpWebRequest)HttpWebRequest.Create("http://localhost/module/api/session/");
req.ContentType = "application/json";
req.CookieContainer = jar;
req.Method = "Delete";
using (var streamWrite = new StreamWriter(req.GetRequestStream()))
{
    string jSON = "{\"UserName\":\"" + "someone" + "\"}";
    streamWrite.Write(jSON);
    streamWrite.Close();
}
HttpWebResponse res = (HttpWebResponse)req.GetResponse();
On the server, the Delete method has this definition:
public void Delete(Credentials user)
where Credentials is a complex type.
Now, here's the issue!
The above code gets into the Delete method on the server as soon as it hits:
req.GetRequestStream()
And hence the parameter sent to the Delete method ends up being null
And here's the weird part:
If I use the exact same code in a test VS 2010 Windows application, it works as expected, i.e. it does not call Delete until req.GetResponse() is called! And in that scenario, the parameter to the Delete method is a valid object!
QUESTION
Any ideas? Is this a bug in MonoTouch and, if so, is there a workaround?
NOTE:
If I change the Delete definition to public void Delete(string userName)
and pass the parameter as part of the URL itself instead of as JSON, all is well (a sketch of that workaround follows below). But as I said, this is just a simplified example to illustrate my issue. Any help is appreciated!
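For reference, a hedged sketch of that URL-based workaround (the route and value are the placeholders from above; jar is the same CookieContainer):
// Pass the value on the URL instead of in a DELETE body.
var req = (HttpWebRequest)WebRequest.Create(
    "http://localhost/module/api/session/" + Uri.EscapeDataString("someone"));
req.Method = "DELETE";
req.CookieContainer = jar;
using (var res = (HttpWebResponse)req.GetResponse())
{
    // Pairs with a server-side signature such as: public void Delete(string userName)
}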
This seems to be ill-defined. See this question for more details: Is an entity body allowed for an HTTP DELETE request?
In general MonoTouch (based on Mono) will try to be feature/bug compatible with the Microsoft .NET Framework to ease code portability between platforms.
In other words, if MS.NET ignores the body of a DELETE request then so will MonoTouch. If the behaviour differs, a bug report should be filed at http://bugzilla.xamarin.com
I have very little to go on here. I can't reproduce this locally, but when users get the error I get an automatic email exception notification:
Invalid length for a Base-64 char array.
at System.Convert.FromBase64String(String s)
at System.Web.UI.ObjectStateFormatter.Deserialize(String inputString)
at System.Web.UI.ObjectStateFormatter.System.Web.UI.IStateFormatter.Deserialize(String serializedState)
at System.Web.UI.Util.DeserializeWithAssert(IStateFormatter formatter, String serializedState)
at System.Web.UI.HiddenFieldPageStatePersister.Load()
I'm inclined to think there is a problem with data that is being assigned to viewstate.
For example:
List<int> SelectedActionIDList = GetSelectedActionIDList();
ViewState["_SelectedActionIDList"] = SelectedActionIDList;
It's difficult to guess the source of the error without being able to reproduce the error locally.
If anyone has had any experience with this error, I would really like to know what you found out.
After UrlDecode processes the text, it replaces all '+' chars with ' ', hence the error. You should simply call this statement to make the string Base64-compatible again:
sEncryptedString = sEncryptedString.Replace(' ', '+');
I've seen this error caused by the combination of a good-sized viewstate and over-aggressive content-filtering devices/firewalls (especially when dealing with K-12 educational institutions).
We worked around it by storing viewstate in SQL Server. Before going that route, I would recommend trying to limit your use of viewstate by not storing anything large in it and turning it off for all controls which do not need it. (A rough sketch of a server-side persister follows the references below.)
References for storing ViewState in SQL Server:
MSDN - Overview of PageStatePersister
ASP Alliance - Simple method to store viewstate in SQL Server
Code Project - ViewState Provider Model
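To make the idea concrete, here is a rough, assumption-laden sketch of a server-side persister, not the implementation we used; the StateStore data-access calls and the hidden-field name are hypothetical:
public class SqlPageStatePersister : PageStatePersister
{
    public SqlPageStatePersister(Page page) : base(page) { }

    public override void Save()
    {
        // Serialize view state and control state, store them server-side and
        // emit only a small lookup key to the client.
        var formatter = new ObjectStateFormatter();
        string serialized = formatter.Serialize(new Pair(ViewState, ControlState));
        string key = Guid.NewGuid().ToString("N");
        StateStore.Save(key, serialized); // hypothetical data-access call
        Page.ClientScript.RegisterHiddenField("__VIEWSTATE_KEY", key);
    }

    public override void Load()
    {
        string key = Page.Request.Form["__VIEWSTATE_KEY"];
        if (string.IsNullOrEmpty(key)) return;
        var formatter = new ObjectStateFormatter();
        var pair = (Pair)formatter.Deserialize(StateStore.Load(key)); // hypothetical data-access call
        ViewState = pair.First;
        ControlState = pair.Second;
    }
}
// Wire it up from a base page class:
// protected override PageStatePersister PageStatePersister
// {
//     get { return new SqlPageStatePersister(this); }
// }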
My guess is that something is either encoding or decoding too often - or that you've got text with multiple lines in.
Base64 strings have to be a multiple of 4 characters in length - every 4 characters represents 3 bytes of input data. Somehow, the view state data being passed back by ASP.NET is corrupted - the length isn't a multiple of 4.
Do you log the user agent when this occurs? I wonder whether it's a badly-behaved browser somewhere... another possibility is that there's a proxy doing naughty things. Likewise try to log the content length of the request, so you can see whether it only happens for large requests.
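If it helps, here is a hedged sketch of capturing that extra information in Global.asax (the Logger call stands in for whatever logging you use):
protected void Application_Error(object sender, EventArgs e)
{
    Exception ex = Server.GetLastError();
    HttpRequest req = HttpContext.Current.Request;
    // Record the user agent and request size alongside the exception.
    Logger.Error(string.Format("Error: {0}; User-Agent: {1}; Content-Length: {2}; Url: {3}",
        ex.Message, req.UserAgent, req.ContentLength, req.RawUrl));
}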
Try this:
public string DecodeBase64(string data)
{
    // Restore the '+' characters that UrlDecode turned into spaces.
    string s = data.Trim().Replace(" ", "+");

    // Re-pad to a multiple of 4 characters before decoding.
    if (s.Length % 4 > 0)
        s = s.PadRight(s.Length + 4 - s.Length % 4, '=');

    return Encoding.UTF8.GetString(Convert.FromBase64String(s));
}
int len = qs.Length % 4;
if (len > 0) qs = qs.PadRight(qs.Length + (4 - len), '=');
where qs is any base64 encoded string
As others have mentioned this can be caused when some firewalls and proxies prevent access to pages containing a large amount of ViewState data.
ASP.NET 2.0 introduced the ViewState Chunking mechanism which breaks the ViewState up into manageable chunks, allowing the ViewState to pass through the proxy / firewall without issue.
To enable this feature simply add the following line to your web.config file.
<pages maxPageStateFieldLength="4000" />
This should not be used as an alternative to reducing your ViewState size but it can be an effective backstop against the "Invalid length for a Base-64 char array" error resulting from aggressive proxies and the like.
This isn't an answer, sadly. After running into the intermittent error for some time and finally being annoyed enough to try to fix it, I have yet to find a fix. I have, however, determined a recipe for reproducing my problem, which might help others.
In my case it is SOLELY a localhost problem, on my dev machine that also has the app's DB. It's a .NET 2.0 app I'm editing with VS2005. The Win7 64 bit machine also has VS2008 and .NET 3.5 installed.
Here's what will generate the error, from a variety of forms:
Load a fresh copy of the form.
Enter some data, and/or postback with any of the form's controls. As long as there is no significant delay, repeat all you like, and no errors occur.
Wait a little while (1 or 2 minutes maybe, not more than 5), and try another postback.
A minute or two delay "waiting for localhost" and then "Connection was reset" by the browser, and global.asax's application error trap logs:
Application_Error event: Invalid length for a Base-64 char array.
Stack Trace:
at System.Convert.FromBase64String(String s)
at System.Web.UI.ObjectStateFormatter.Deserialize(String inputString)
at System.Web.UI.Util.DeserializeWithAssert(IStateFormatter formatter, String serializedState)
at System.Web.UI.HiddenFieldPageStatePersister.Load()
In this case, it is not the SIZE of the viewstate, but something to do with page and/or viewstate caching that seems to be biting me. Setting the <pages> attributes enableEventValidation="false" and viewStateEncryption="Never" in Web.config did not change the behavior. Neither did setting maxPageStateFieldLength to something modest.
Take a look at your HttpHandlers. I've been noticing some weird and completely random errors over the past few months after I implemented a compression tool (RadCompression from Telerik). I was noticing errors like:
System.Web.HttpException: Unable to validate data.
System.Web.HttpException: The client disconnected.---> System.Web.UI.ViewStateException: Invalid viewstate.
and
System.FormatException: Invalid length for a Base-64 char array.
System.Web.HttpException: The client disconnected. ---> System.Web.UI.ViewStateException: Invalid viewstate.
I wrote about this on my blog.
This is because of a huge view state. In my case I got lucky, since I was not actually using the viewstate: I just added enableviewstate="false" on the form tag and the view state went from 35k to about 100 characters.
During initial testing of Membership.ValidateUser with a SqlMembershipProvider, I used a hash (SHA1) algorithm combined with a salt, and if I changed the salt length to a length not divisible by four, I received this error.
I have not tried any of the fixes above, but if the salt is being altered, this may help someone pinpoint that as the source of this particular error.
As Jon Skeet said, the string must be a multiple of 4 characters in length. But I was still getting the error.
At least it went away in debug mode: put a breakpoint on Convert.FromBase64String() and step through the code. Miraculously, the error disappeared for me :) It is probably related to view state and similar issues that others have reported.
In addition to @jalchr's solution, which helped me, I found that when calling ATL::Base64Encode from a C++ application to encode content you pass to an ASP.NET web service, you need something else too. In addition to
sEncryptedString = sEncryptedString.Replace(' ', '+');
from @jalchr's solution, you also need to ensure that you do not use the ATL_BASE64_FLAG_NOPAD flag on ATL::Base64Encode:
BOOL bEncoded = Base64Encode(lpBuffer,
                             nBufferSizeInBytes,
                             strBase64Encoded.GetBufferSetLength(base64Length),
                             &base64Length,
                             ATL_BASE64_FLAG_NOCRLF /*|ATL_BASE64_FLAG_NOPAD*/);
We have a high security application and we want to allow users to enter URLs that other users will see.
This introduces a high risk of XSS hacks - a user could potentially enter javascript that another user ends up executing. Since we hold sensitive data it's essential that this never happens.
What are the best practices in dealing with this? Is any security whitelist or escape pattern alone good enough?
Any advice on dealing with redirections (a "this link goes outside our site" message on a warning page before following the link, for instance)?
Is there an argument for not supporting user entered links at all?
Clarification:
Basically our users want to input:
stackoverflow.com
And have it output to another user:
stackoverflow.com
What I really worry about is them using this in an XSS hack. I.e. they input:
alert('hacked!');
So other users get this link:
stackoverflow.com
My example is just to explain the risk - I'm well aware that javascript and URLs are different things, but by letting them input the latter they may be able to execute the former.
You'd be amazed how many sites you can break with this trick - HTML is even worse. If they know to deal with links, do they also know to sanitise <iframe>, <img> and clever CSS references?
I'm working in a high security environment - a single XSS hack could result in very high losses for us. I'm happy that I could produce a Regex (or use one of the excellent suggestions so far) that could exclude everything that I could think of, but would that be enough?
If you think URLs can't contain code, think again!
https://owasp.org/www-community/xss-filter-evasion-cheatsheet
Read that, and weep.
Here's how we do it on Stack Overflow:
/// <summary>
/// returns "safe" URL, stripping anything outside normal charsets for URL
/// </summary>
public static string SanitizeUrl(string url)
{
    return Regex.Replace(url, @"[^-A-Za-z0-9+&@#/%?=~_|!:,.;\(\)]", "");
}
The process of rendering a link "safe" should go through three or four steps:
Unescape/re-encode the string you've been given (RSnake has documented a number of tricks at http://ha.ckers.org/xss.html that use escaping and UTF encodings).
Clean the link up: regexes are a good start; make sure to truncate the string or throw it away if it contains a " (or whatever you use to close the attributes in your output). If you're doing the links only as references to other information, you can also force the protocol at the end of this process: if the portion before the first colon is not 'http' or 'https', then append 'http://' to the start. This allows you to create usable links from incomplete input, as a user would type into a browser, and gives you a last shot at tripping up whatever mischief someone has tried to sneak in (a small sketch of this follows the list).
Check that the result is a well formed URL (protocol://host.domain[:port][/path][/[file]][?queryField=queryValue][#anchor]).
Possibly check the result against a site blacklist or try to fetch it through some sort of malware checker.
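A small sketch of the protocol-forcing idea from the clean-up step above (an illustration only, assuming anything without an http/https scheme should be treated as an incomplete address):
public static string ForceHttpProtocol(string url)
{
    int colon = url.IndexOf(':');
    string scheme = colon > 0 ? url.Substring(0, colon).ToLowerInvariant() : null;
    // If the portion before the first colon is not http or https, prefix the
    // value; something like "javascript:alert(1)" then no longer parses as a scheme.
    if (scheme != "http" && scheme != "https")
        return "http://" + url;
    return url;
}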
If security is a priority I would hope that the users would forgive a bit of paranoia in this process, even if it does end up throwing away some safe links.
Use a library, such as OWASP-ESAPI API:
PHP - http://code.google.com/p/owasp-esapi-php/
Java - http://code.google.com/p/owasp-esapi-java/
.NET - http://code.google.com/p/owasp-esapi-dotnet/
Python - http://code.google.com/p/owasp-esapi-python/
Read the following:
https://www.golemtechnologies.com/articles/prevent-xss#how-to-prevent-cross-site-scripting
https://www.owasp.org/
http://www.secbytes.com/blog/?p=253
For example:
$url = "http://stackoverflow.com"; // e.g., $_GET["user-homepage"];
$esapi = new ESAPI( "/etc/php5/esapi/ESAPI.xml" ); // Modified copy of ESAPI.xml
$sanitizer = ESAPI::getSanitizer();
$sanitized_url = $sanitizer->getSanitizedURL( "user-homepage", $url );
Another example is to use a built-in function. PHP's filter_var function is an example:
$url = "http://stackoverflow.com"; // e.g., $_GET["user-homepage"];
$sanitized_url = filter_var($url, FILTER_SANITIZE_URL);
Using filter_var allows javascript calls, and filters out schemes that are neither http nor https. Using the OWASP ESAPI Sanitizer is probably the best option.
Still another example is the code from WordPress:
http://core.trac.wordpress.org/browser/tags/3.5.1/wp-includes/formatting.php#L2561
Additionally, since there is no way of knowing where the URL links (i.e., it might be a valid URL, but the contents of the URL could be mischievous), Google has a safe browsing API you can call:
https://developers.google.com/safe-browsing/lookup_guide
Rolling your own regex for sanitation is problematic for several reasons:
Unless you are Jon Skeet, the code will have errors.
Existing APIs have many hours of review and testing behind them.
Existing URL-validation APIs consider internationalization.
Existing APIs will be kept up-to-date with emerging standards.
Other issues to consider:
What schemes do you permit (are file:/// and telnet:// acceptable)?
What restrictions do you want to place on the content of the URL (are malware URLs acceptable)?
Just HTML-encode the links when you output them. Make sure you don't allow javascript: links. (It's best to have a whitelist of protocols that are accepted, e.g. http, https, and mailto.)
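A minimal sketch of that advice (an illustration only; the helper name and scheme list are assumptions):
private static readonly string[] AllowedSchemes = { "http", "https", "mailto" };

public static string RenderLink(string url)
{
    Uri parsed;
    // Reject anything that is not an absolute URL with a whitelisted scheme.
    if (!Uri.TryCreate(url, UriKind.Absolute, out parsed) ||
        Array.IndexOf(AllowedSchemes, parsed.Scheme.ToLowerInvariant()) < 0)
    {
        // Fall back to rendering the raw text, HTML-encoded, with no link.
        return HttpUtility.HtmlEncode(url);
    }
    string encoded = HttpUtility.HtmlEncode(parsed.AbsoluteUri);
    return string.Format("<a href=\"{0}\">{0}</a>", encoded);
}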
You don't specify the language of your application, so I will presume ASP.NET, for which you can use the Microsoft Anti-Cross Site Scripting Library.
It is very easy to use; all you need is an include and that is it :)
While you're on the topic, why not give Design Guidelines for Secure Web Applications a read?
If you're using any other language: where there is a library for ASP.NET, an equivalent is usually available for other languages as well (PHP, Python, RoR, etc.).
For Pythonistas, try Scrapy's w3lib.
OWASP ESAPI pre-dates Python 2.7 and is archived on the now-defunct Google Code.
How about not displaying them as a link? Just use the text.
Combined with a warning to proceed at their own risk, that may be enough.
Addition: see also Should I sanitize HTML markup for a hosted CMS? for a discussion on sanitizing user input.
There is a library for JavaScript that solves this problem:
https://github.com/braintree/sanitize-url
Try it =)
In my project, written in JavaScript, I use this regex as a whitelist:
url.match(/^((https?|ftp):\/\/|\.{0,2}\/)/)
The only limitation is that you need to put ./ in front of files in the same directory, but I think I can live with that.
Using regular expressions to prevent XSS vulnerabilities gets complicated, and thus hard to maintain over time, and it can still leave some vulnerabilities behind. URL validation with a regular expression is helpful in some scenarios, but it is better not to mix it with the vulnerability checks.
A solution is probably to use a combination of an encoder like AntiXssEncoder.UrlEncode for encoding the query portion of the URL and UriBuilder for the rest:
public sealed class AntiXssUrlEncoder
{
    public string EncodeUri(Uri uri, bool isEncoded = false)
    {
        // Encode the query portion of the URL to prevent an XSS attack if it is
        // not already encoded. Otherwise let UriBuilder take care of it.
        var encodedQuery = isEncoded ? uri.Query.TrimStart('?') : AntiXssEncoder.UrlEncode(uri.Query.TrimStart('?'));
        var encodedUri = new UriBuilder
        {
            Scheme = uri.Scheme,
            Host = uri.Host,
            Path = uri.AbsolutePath,
            Query = encodedQuery.Trim(),
            Fragment = uri.Fragment
        };
        if (uri.Port != 80 && uri.Port != 443)
        {
            encodedUri.Port = uri.Port;
        }
        return encodedUri.ToString();
    }

    public static string Encode(string uri)
    {
        var baseUri = new Uri(uri);
        var antiXssUrlEncoder = new AntiXssUrlEncoder();
        return antiXssUrlEncoder.EncodeUri(baseUri);
    }
}
You may need to include whitelisting to exclude some characters from encoding. That could be helpful for particular sites.
HTML-encoding the page that renders the URL is another thing you may need to consider.
By the way, please note that encoding the URL may break web parameter tampering, so the encoded link may appear not to work as expected.
Also, you need to be careful about double encoding
P.S. AntiXssEncoder.UrlEncode would have been better named AntiXssEncoder.EncodeForUrl, to be more descriptive. Basically, it encodes a string for use in a URL; it does not encode a given URL and return a usable URL.
You could use hex encoding to convert the entire URL and send it to your server. That way the client would not understand the content at first glance. After reading the content, you could decode the content (URL = ?) and send it to the browser.
Allowing a URL and allowing JavaScript are 2 different things.