generateSharedAccessSignature not adding sv parameter? - azure

I'm trying to generate a Shared Access Signature and am using the code here (http://blogs.msdn.com/b/brunoterkaly/archive/2014/06/13/how-to-provision-a-shared-access-signatures-that-allows-clients-to-upload-files-to-to-azure-storage-using-node-js-inside-of-azure-mobile-services.aspx) for a custom API to generate the SAS.
The SAS generated by generateSharedAccessSignature() seems to be missing the sv=2014-02-14 parameter.
The SAS URL doesn't work when I try it (I get a 400 "XML not valid" error), but a SAS generated from Azure Management Studio does contain the sv parameter and works when I attempt to upload with it.
Any ideas?

Based on the Storage Service REST API documentation, the sv parameter in a Shared Access Signature was introduced in storage service version 2014-02-14. My guess is that Azure Mobile Services is using an older version of the storage service API, and this is why you don't see the sv parameter in your SAS token.
You could be getting the 400 (invalid XML) error because of this. In earlier versions of the storage service API, the XML syntax for committing a block list was different from what is used currently. I have had another user come to my blog post with the same error. Please try the following XML syntax when performing a Commit Block List operation and see if the error goes away:
<?xml version="1.0" encoding="utf-8"?>
<BlockList>
  <Block>[base64-encoded-block-id]</Block>
  <Block>[base64-encoded-block-id]</Block>
  ...
  <Block>[base64-encoded-block-id]</Block>
</BlockList>
Please notice that we're not using the Latest node; instead we're using the Block node.
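For illustration only, here is a minimal C# sketch of sending such a Commit Block List request over a SAS URL, assuming a hypothetical SAS URL and that the blocks were already uploaded with Put Block (the base64 block IDs are placeholders, not real values):

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class CommitBlockListSketch
{
    static async Task Main()
    {
        // Hypothetical SAS URL for the target blob (signature shortened).
        var sasUrl = "https://myaccount.blob.core.windows.net/uploads/bigfile.bin?sr=b&sp=w&sig=...";

        // Older-style block list body: <Block> nodes instead of <Latest>.
        var body = "<?xml version=\"1.0\" encoding=\"utf-8\"?>" +
                   "<BlockList><Block>AAAAAA==</Block><Block>AQAAAA==</Block></BlockList>";

        using var client = new HttpClient();
        using var request = new HttpRequestMessage(HttpMethod.Put, sasUrl + "&comp=blocklist")
        {
            Content = new StringContent(body, Encoding.UTF8, "text/plain")
        };

        var response = await client.SendAsync(request);
        Console.WriteLine(response.StatusCode); // 201 Created on success
    }
}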

Leaving the sv parameter out and instead setting the version as part of the PUT request headers worked, using:
xhr.setRequestHeader('x-ms-version','2014-02-14');
You can check out this example of an Azure file upload script: http://gauravmantri.com/2013/02/16/uploading-large-files-in-windows-azure-blob-storage-using-shared-access-signature-html-and-javascript/
...which will work with the generated SAS from the question's original blog link - http://blogs.msdn.com/b/brunoterkaly/archive/2014/06/13/how-to-provision-a-shared-access-signatures-that-allows-clients-to-upload-files-to-to-azure-storage-using-node-js-inside-of-azure-mobile-services.aspx
Add the request header in the beforeSend like so:
beforeSend: function(xhr) {
xhr.setRequestHeader('x-ms-version','2014-02-14');
},
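If you are testing the upload outside the browser, the same idea looks roughly like this in C# — a minimal sketch, assuming a hypothetical SAS URL returned by the custom API; the key part is the x-ms-version header standing in for the missing sv parameter:

using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

class SasUploadSketch
{
    static async Task Main()
    {
        // Hypothetical SAS URL returned by the custom API (signature shortened).
        var sasUrl = "https://myaccount.blob.core.windows.net/uploads/photo.png?sr=b&sp=w&sig=...";

        using var client = new HttpClient();
        using var request = new HttpRequestMessage(HttpMethod.Put, sasUrl)
        {
            Content = new ByteArrayContent(await File.ReadAllBytesAsync("photo.png"))
        };
        request.Headers.Add("x-ms-version", "2014-02-14");  // version sent as a header instead of sv in the SAS
        request.Headers.Add("x-ms-blob-type", "BlockBlob"); // required by Put Blob

        var response = await client.SendAsync(request);
        Console.WriteLine(response.StatusCode); // 201 Created on success
    }
}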

Related

Precompiled Azure Function and SOAP endpoints

I'm writing a precompiled Azure Function that will perform a SOAP call to ServiceNow. The code works as a standalone exe, but I can't seem to get it converted to a precompiled function. I know it's because my DLL can't find the app.config file, but what's the best way to get around that? Error message below. ServiceNow requires me to set certain bindings and endpoint configuration. The other constructors for their ServiceNowSoapClient class allow me to specify a URL directly but don't seem to let me get at the binding settings.
Exception while executing function: Functions.TimerTriggerCSharp.
System.ServiceModel: Could not find endpoint element with name
'ServiceNowSoapDev' and contract 'ServiceNowReference.ServiceNowSoap'
in the ServiceModel client configuration section. This might be
because no configuration file was found for your application, or
because no endpoint element matching this name could be found in the
client element.
In WCF you can define your client binding and endpoint programmatically instead of using app.config. Use the constructor of the generated client with two parameters:
new ServiceNowSoapClient(binding, remoteAddress);
See more code here.
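As a minimal sketch of that approach, assuming the ServiceNowSoapClient proxy generated from your service reference and a hypothetical ServiceNow SOAP endpoint URL (adjust the binding settings to whatever your instance actually requires):

using System.ServiceModel;

class ServiceNowClientFactory
{
    static ServiceNowSoapClient Create()
    {
        // Binding configured in code instead of app.config. HTTPS with basic
        // credentials is a common ServiceNow setup, but this is an assumption.
        var binding = new BasicHttpsBinding();
        binding.Security.Transport.ClientCredentialType = HttpClientCredentialType.Basic;
        binding.MaxReceivedMessageSize = 4 * 1024 * 1024; // bump if responses are large

        // Hypothetical endpoint address; use your instance's SOAP URL here.
        var endpoint = new EndpointAddress("https://yourinstance.service-now.com/ServiceNowSoap.do?SOAP");

        var client = new ServiceNowSoapClient(binding, endpoint);
        client.ClientCredentials.UserName.UserName = "user";
        client.ClientCredentials.UserName.Password = "password";
        return client;
    }
}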

What is the Azure API version

I'm trying to access the result of a GET request provided by Azure, as shown in this example: https://msdn.microsoft.com/sv-se/library/azure/dn820159.aspx
My problem is that api-version is a mandatory argument, but I have no idea what to put in it. I'm a bit lost in the Azure Batch documentation; it doesn't seem to be complete.
I found something on an Azure webpage (https://azure.microsoft.com/en-us/documentation/articles/search-api-versions/) where the api-version was api-version=2015-02-28. However, if I try it in my browser, I get this answer: "key":"Reason","value":"The specified api version string is invalid".
Any idea what I can put in the api-version parameter?
Have a look here.
At the time of this writing:
The version of the Batch API described here is '2016-07-01.3.1', and
using that version is recommended where possible.
Earlier versions include '2016-02-01.3.0', '2015-12-01.2.1',
'2015-11-01.2.1', '2015-06-01.2.0', '2015-03-01.1.1', and
'2014-10-01.1.0'.
So try specifying '2016-07-01.3.1' as the api-version.
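For example, a minimal C# sketch of a Batch REST call with that version on the query string — the account URL and the authorization value are hypothetical placeholders (Batch requires shared-key or Azure AD authentication, which is out of scope here):

using System;
using System.Net.Http;
using System.Threading.Tasks;

class BatchApiVersionSketch
{
    static async Task Main()
    {
        // Hypothetical Batch account endpoint; api-version goes on every request's query string.
        var url = "https://myaccount.myregion.batch.azure.com/jobs?api-version=2016-07-01.3.1";

        using var client = new HttpClient();
        client.DefaultRequestHeaders.TryAddWithoutValidation("Authorization", "<shared-key-or-bearer-token>");

        var response = await client.GetAsync(url);
        Console.WriteLine(response.StatusCode);
    }
}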

FIWARE object storage GE: Different type of responses obtained when downloading an image object

I can successfully use all the operations available for Object Storage with my FIWARE account.
Nonetheless, I have identified some strange behaviour when downloading objects from a container.
Please find below the procedure to reproduce it:
1. I upload two objects ("gonzo.png" and "elmo.png") to the container "photos".
1.1. First, via the Cloud UI (https://cloud.lab.fiware.org/#objectstorage/containers/) I manually upload the object "gonzo.png".
1.2. Later, following the instructions in the Object Storage GE programmer's guide, I programmatically (or with the help of a standalone REST client) upload the object "elmo.png".
2. I download the objects from the container "photos".
2.1. First, following the instructions in the Object Storage GE programmer's guide, I successfully download the object "gonzo.png". The web service response body is the binary content of that object.
2.2. Later, following the same instructions as in step 2.1, I try to download the object "elmo.png". This time the web service response body is a JSON document with metadata and the binary content of the object.
What can I do to receive a consistent response body for both objects, either binary or JSON?
Why do I get a different response depending on whether the object was originally uploaded via the Cloud UI or via an external tool (a program or REST client)?
As in "Download blob from fiware object-storage", I have already tried setting the header response_type: text, and the behaviour is the same.
There are many object stores out there, having different APIs.
The Object Storage GE was initially based on the CDMI API [1].
Currently, it is based on Openstack Swift [2].
The Cloud Portal still uses some of the CDMI features; specifically, it may base64-encode some types of objects, in which case the object content is a JSON document containing the metadata and a base64 encoding of the data. I suspect that this is what happened to the object you created using the Cloud UI.
Thus, please use the Swift native API for all operations (a download sketch follows the references below).
The API is well documented here: http://developer.openstack.org/api-ref-objectstorage-v1.html
and the python examples in the programmer guide (https://forge.fiware.org/plugins/mediawiki/wiki/fiware/index.php/Object_Storage_-_User_and_Programmers_Guide) also use the Native API.
[1] Google for SNIA CDMI. Having less than 10 reputation, I cannot post too many links.
[2] Google for OpenStack Swift.
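To illustrate, here is a minimal C# sketch of downloading an object with the Swift native API, assuming hypothetical endpoint, tenant and token values obtained from Keystone beforehand; the response body is the raw object content, with no JSON wrapper:

using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

class SwiftDownloadSketch
{
    static async Task Main()
    {
        // Hypothetical Swift endpoint and tenant id; both come from your Keystone service catalog.
        var objectUrl = "https://<swift-endpoint>/v1/AUTH_<tenant-id>/photos/elmo.png";

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Add("X-Auth-Token", "<keystone-token>");

        var bytes = await client.GetByteArrayAsync(objectUrl); // raw binary content
        await File.WriteAllBytesAsync("elmo.png", bytes);
        Console.WriteLine($"Downloaded {bytes.Length} bytes");
    }
}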

Upload File with brackets ([ & ]) in the name

I'm moving a ClickOnce install from a regular web server to Azure Blob storage and have a problem with some of the files. The filenames contain [ ], and CloudBlob.UploadFile fails with an exception:
Microsoft.WindowsAzure.Storageclient.StorageException:
Error accessing blob storage: Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
The code has been used for a while and only fails on files with [ ] in the name, so I don't believe it is really an "authentication failure". In this particular case, this is the seventh file being uploaded in a loop. I found this link on MSDN about valid file names and this one on Stack Overflow, which both show problems with square brackets in URLs and reference UrlEncode. I added a call to UrlEncode and that did not help. The container is created with public access, since we use it to support customer downloads of our software. We have been hosting a "test" install in another container and have not had permission problems accessing that either.
I can upload the file with no name changes and then rename the file to add the "path" using Neudesic's Azure Storage Explorer tool, so what is that doing that I am not doing?
I see you're using the 1.7 SDK. This is a small encoding issue with the SDK which is also present in v2.0. Let's see what happens.
No encoding
account.CreateCloudBlobClient()
.GetContainerReference("temp")
.GetBlobReference("abc[]def.txt")
.UploadFile("myfile.txt");
If you don't encode the blob name, you'll end up with a request to the following URL which is causing the authentication exception:
http://account.blob.core.windows.net/temp/abc[]def.txt
This is because the SDK internally uses Uri.EscapeUriString to encode your string, but that method doesn't take square brackets into account.
Encoding
Then you would expect the following to do the trick:
account.CreateCloudBlobClient()
.GetContainerReference("temp")
.GetBlobReference(HttpUtility.UrlEncode("abc[]def.txt"))
.UploadFile("myfile.txt");
The issue here is that you'll end up with this URL:
http://account.blob.core.windows.net/temp/abc%255b%255ddef.txt
So what's happening here? Calling HttpUtility.UrlEncode turns abc[]def.txt to abc%5B%5Ddef.txt, which is correct. But internally, the SDK will encode this string again which results in abc%255b%255ddef.txt, which isn't what you want.
Workaround
The only way to apply encoding that takes square brackets into account is a small workaround. If you pass the full URL to the GetBlobReference method, the SDK assumes you did all the encoding yourself:
var container = account.CreateCloudBlobClient().GetContainerReference("temp");
var blob = container.GetBlobReference(String.Format("{0}/{1}",
container.Uri, System.Web.HttpUtility.UrlEncode("abc[]def.txt")));
blob.UploadFile("myfile.txt");
This results in a correctly encoded URL:
http://account.blob.core.windows.net/temp/abc%5b%5ddef.txt
And if you use a tool like CloudXplorer, you'll see the blob with the correct filename.
There are two known breaks in the Uri class in .Net 4.5
• The '[' and ']' characters are no longer escaped
• The '\' character is now escaped as %5C
This causes an authentication failure when the server attempts to validate the signature of the request, as the canonicalized string is now different on the client and the server.
There are a few workarounds clients can use while this issue is present. The correct solution will depend on your specific application and requirements.
Avoid the '[', ']', or '\' characters in resource names
By simply avoiding these characters altogether, you will avoid the issue described above.
Target .Net 4.0
Currently the recommendation is for clients to simply continue to target their applications to .Net 4.0 while a full solution is being investigated. Note that since .Net 4.5 is an in-place upgrade, clients can still take advantage of some performance improvements (in the GC, etc.) without specifically targeting the .Net 4.5 profile. For Windows RT developers this is not an option, so you will require the workarounds detailed below.
Pre-Escape Data
If possible, a client can pre-escape the data or replace the given characters with non-affected ones, as in the sketch below.
This is why the workaround above works.
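A minimal sketch of the pre-escape idea with the 1.7/2.0 storage client, assuming a hypothetical connection string — only the affected characters are replaced, and the full URI is passed so the SDK skips its own encoding:

using System;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

class PreEscapeSketch
{
    static void Main()
    {
        var account = CloudStorageAccount.Parse("<your-connection-string>");
        var container = account.CreateCloudBlobClient().GetContainerReference("temp");

        // Replace only the characters affected by the .Net 4.5 Uri changes.
        var safeName = "abc[]def.txt".Replace("[", "%5B").Replace("]", "%5D").Replace("\\", "%5C");

        // Passing the full URI tells the SDK the name is already encoded.
        var blob = container.GetBlobReference(string.Format("{0}/{1}", container.Uri, safeName));
        blob.UploadFile("myfile.txt");
    }
}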

How can I access the parameters of a service on a Carbon server in plain txt

What I've done is break the default 'Version' service on my WSO2 DSS: I tried to set the Scopes variable for WS-Discovery and didn't include a closing tag/element when creating the parameter.
Now when I try to access the parameters screen, I get an XML parse error:
TID: [0] [WSO2 Data Services Server] [2012-08-22 12:38:04,404] ERROR {org.wso2.carbon.service.mgt.ServiceAdmin} - Error occured while getting parameters of service : Version
{org.wso2.carbon.service.mgt.ServiceAdmin}org.apache.axiom.om.OMException: com.ctc.wstx.exc.WstxUnexpectedCharException: Unexpected character '<' (code 60) in end tag Expected '>'. at [row,col {unknown-source}]: [2,58] at org.apache.axiom.om.impl.builder.StAXOMBuilder.next(StAXOMBuilder.java:296) at
I'm assuming this is stored in the H2 database. I've tried looking for the parameter in the .db file using Notepad, but I can't find it.
Is there another way to connect/browse the H2 db?
I've scanned through the repository, database and conf directories for clues without success.
UPDATE:
Yes, you can connect to the H2 DB using the included database explorer under the Tools menu.
Use the connection details found in the repository/conf/registry.xml file.
Then you can run SQL queries on it (I haven't found the answer yet, though).
UPDATE 2:
I don't think the parameters are held in the H2 DB, but I managed to fix my problem by:
1. Downloading the Version.aar file using the link on the list services page
2. Deleting the Version service
3. Copying the Version.aar file into the repository/deployment/server/axis2services dir
I guess deleting the service removed any records/references to my broken parameter.
I believe you've tried setting service parameters via the UI? Usually, the service parameters you specify via the UI do not get saved in the services.xml of the original Axis2 service archive. Instead, they get saved in the registry that ships with DSS and are applied to the service at runtime. But if you specify a malformed parameter, it wouldn't be saved in the registry; instead, an exception is thrown while trying to engage that parameter. So there will be no record saved corresponding to that kind of malformed parameter.
Hope this helps!
Cheers,
Prabath
