I am trying to move the uploading to a remote server. After I choose a file with the code below and click upload, the file IS uploaded, but an error is returned saying code: "-200", message: "HTTP Error".
var uploader = new plupload.Uploader(
{
runtimes : 'html4, html5, flash, silverlight',
browse_button : 'bt_browse',
container: document.getElementById('container'),
url : 'http://remote.com/upload.php',
silverlight_xap_url : 'js/Moxie.xap',
chunk_size: '20mb',
max_retries: 3,
filters : {
max_file_size : '100mb'
},
multi_selection : true,
init: {
PostInit: function() {
document.getElementById('filelist').innerHTML = '';
document.getElementById('bt_uploadfiles').onclick = function() {
uploader.start();
return false;
};
},
FilesAdded: function(up, files) {
plupload.each(files, function(file) {
//build list
});
},
UploadProgress: function(up, file) {
$("#progressBar"+file.id).val(Math.round(file.percent));
if(Math.round(file.percent)==100){
$("#progressBar"+file.id).hide();
$("#deleteFile" + file.id).hide();
}
},
FileUploaded: function(up, file, info) {
if(file!=undefined) {
var json = $.parseJSON(info.response);
if(json.error == undefined)
moveFile(json.result, file.name, file.id);
}
},
UploadComplete: function(){
},
Error: function(up, err) {
}
}
});
What can I do to avoid this error and continue? In my case FileUploaded and UploadProgress are not hit at all - after I hit upload I go straight to the Error function. This is really strange to me, because when I then check the folder where the file is supposed to end up, the file is there.
Any help will be much appreciated.
I came across the same error when I was using Plupload in an MVC5 application. The problem was that the REST method could not be found. Plupload posts the file as multipart form data. The code below shows how this could be implemented in a Web API controller:
public class UploadFilesController : ApiController
{
[HttpPost]
[Route("UploadFiles")]
public async Task<HttpResponseMessage> PostFormData()
{
// Check if the request contains multipart/form-data.
if (!Request.Content.IsMimeMultipartContent())
{
throw new HttpResponseException(HttpStatusCode.UnsupportedMediaType);
}
string root = HttpContext.Current.Server.MapPath("~/App_Data");
var provider = new MultipartFormDataStreamProvider(root);
try
{
// Read the form data.
await Request.Content.ReadAsMultipartAsync(provider);
var TestId = provider.FormData.Get("TestId");
var chunk = provider.FormData.Get("chunk");
var chunks = provider.FormData.Get("chunks");
var fileName = provider.FormData.Get("name");
int chunkId = Convert.ToInt32(chunk);
int totalChunks = Convert.ToInt32(chunks);
Boolean isLastChunk = chunkId == totalChunks - 1;
foreach (MultipartFileData file in provider.FileData)
{
//Console.WriteLine(file.Headers.ContentDisposition.FileName);
//Console.WriteLine("Server file path: " + file.LocalFileName);
string FileDestination = Path.GetDirectoryName(file.LocalFileName) + @"\" + fileName;
using (FileStream fileUpload = new FileStream(file.LocalFileName, FileMode.Open))
{
using (var fs = new FileStream(FileDestination, chunkId == 0 ? FileMode.Create : FileMode.Append))
{
var buffer = new byte[fileUpload.Length];
fileUpload.Read(buffer, 0, buffer.Length);
fs.Write(buffer, 0, buffer.Length);
}
}
File.Delete(file.LocalFileName);
}
if (isLastChunk) {
// Do something with the completed file
}
return Request.CreateResponse(HttpStatusCode.OK);
}
catch (System.Exception e)
{
return Request.CreateErrorResponse(HttpStatusCode.InternalServerError, e);
}
}
}
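On the client side, the uploader just needs to point at this route and send chunks whose form fields (chunk, chunks, name) match what the controller reads above. Here is a minimal sketch of such a configuration - the /UploadFiles URL, the TestId value and the chunk size are assumptions, adjust them to your setup:
// Sketch: Plupload config matching the Web API controller above (URL and sizes are assumptions)
var uploader = new plupload.Uploader({
    runtimes: 'html5,flash,silverlight,html4',
    browse_button: 'bt_browse',
    url: '/UploadFiles',                  // the [Route("UploadFiles")] endpoint
    chunk_size: '10mb',                   // each chunk arrives as its own multipart request
    multipart_params: { TestId: '123' },  // extra form field, read via provider.FormData.Get("TestId")
    filters: { max_file_size: '100mb' },
    init: {
        FileUploaded: function (up, file, info) {
            console.log('Server replied with status', info.status);
        },
        Error: function (up, err) {
            // Logging the details here makes the "-200 HTTP Error" easier to diagnose
            console.log('Error', err.code, err.message, err.status, err.response);
        }
    }
});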
When I try to upload blobs to my Azure storage account, I get the following error response:
<?xml version="1.0" encoding="utf-8"?>
<Error>
<Code>OutOfRangeInput</Code>
<Message>One of the request inputs is out of range.
RequestId:--------------------------
Time:2017-10-29T07:13:37.4218874Z
</Message>
</Error>
I am uploading multiple blobs; some are uploaded successfully while others are not. The ones that throw the error have long blob names (about 100 characters), so I assume it may be due to the blob name length. But according to https://blogs.msdn.microsoft.com/jmstall/2014/06/12/azure-storage-naming-rules/ blob names can be up to 1024 characters, and my blob names are well below that limit.
An example blob-name would be "65/36/aluminium_03_group67_product_02pCube1_product_02group2_product_02Flow000_Albedo.png"
Edit: Code to upload the blob.
The upload code is in JavaScript. I am breaking the file into multiple chunks and uploading them. Here is the function responsible for uploading files:
function AzureFileUpload(file, uploadUrl, successCallback, progressCallback, errorCallback){
this.file = file;
this.uploadUrl = uploadUrl;
this.successCallback = successCallback;
this.progressCallback = progressCallback;
this.errorCallback = errorCallback;
this.reader = new FileReader();
this.maxBlockSize = 256 * 1024;
this.blockIds = [];
this.totalBytesRemaining = this.file.size;
this.currentFilePointer = 0;
this.bytesUploaded = 0;
this.uploadFlag = true;
var self = this;
this.reader.onloadend = function(evt) {
if (evt.target.readyState == FileReader.DONE) { // DONE == 2
var uri = self.uploadUrl + '&comp=block&blockid=' + self.blockIds[self.blockIds.length - 1];
var requestData = new Uint8Array(evt.target.result);
self.ReadBlock();
if(self.uploadFlag){
self.UploadBlock(requestData, uri);
}
}
};
this.ReadBlock();
}
AzureFileUpload.prototype.UploadBlock = function(requestData, blockUrl){
var self = this;
$.ajax({
url: blockUrl,
type: "PUT",
data: requestData,
processData: false,
beforeSend: function(xhr) {
xhr.setRequestHeader('x-ms-blob-type', 'BlockBlob');
xhr.setRequestHeader('x-ms-blob-cache-control', "public, max-age=864000");
},
success: function(data, status) {
self.UpdateProgress(requestData.length);
self.bytesUploaded += requestData.length;
if (parseFloat(self.bytesUploaded) == parseFloat(self.file.size)) {
self.CommitBlocks();
}
},
error: function(xhr, desc, err) {
// console.log(desc);
// console.log(err);
self.Error("Unexpected error occured while uploading model. Plaese try after some time");
}
});
};
AzureFileUpload.prototype.pad = function(number, length){
var str = '' + number;
while (str.length < length) {
str = '0' + str;
}
return str;
};
AzureFileUpload.prototype.ReadBlock = function(){
if (this.totalBytesRemaining > 0) {
var fileContent = this.file.slice(this.currentFilePointer, this.currentFilePointer + this.maxBlockSize);
var blockId = "block-" + this.file.name + "-" + this.pad(this.blockIds.length, 6);
this.blockIds.push(btoa(blockId));
this.reader.readAsArrayBuffer(fileContent);
this.currentFilePointer += this.maxBlockSize;
this.totalBytesRemaining -= this.maxBlockSize;
if (this.totalBytesRemaining < this.maxBlockSize) {
this.maxBlockSize = this.totalBytesRemaining;
}
}
};
AzureFileUpload.prototype.UpdateProgress = function(bytesUploaded){
console.log("Progress",bytesUploaded);
if(this.progressCallback){
this.progressCallback(bytesUploaded);
}
};
AzureFileUpload.prototype.CommitBlocks = function(){
var self = this;
var uri = this.uploadUrl + '&comp=blocklist';
var request = '<?xml version="1.0" encoding="utf-8"?><BlockList>';
for (var i = 0; i < this.blockIds.length; i++) {
request += '<Latest>' + this.blockIds[i] + '</Latest>';
}
request += '</BlockList>';
$.ajax({
url: uri,
type: "PUT",
data: request,
beforeSend: function(xhr) {
xhr.setRequestHeader('x-ms-blob-content-type', self.file.type);
xhr.setRequestHeader('x-ms-blob-cache-control', "public, max-age=864000");
},
success: function(data, status) {
console.log("Block Commited", data);
if(self.successCallback){
self.successCallback();
}
},
error: function(xhr, desc, err) {
self.Error("Unexpected error occured while uploading model. Plaese try after some time");
}
});
};
AzureFileUpload.prototype.Error = function(msg){
this.CancelUpload();
if(this.errorCallback){
this.errorCallback(msg);
}
};
AzureFileUpload.prototype.CancelUpload = function(){
this.uploadFlag = false;
};
The problem is with the following line of code:
var blockId = "block-" + this.file.name + "-" + this.pad(this.blockIds.length, 6);
Essentially, the maximum length of a block id is 64 bytes (Ref: https://learn.microsoft.com/en-us/rest/api/storageservices/put-block - see the URI parameters section). Because you're including the file name in the block id computation and your file name is long, you're exceeding this limitation.
Please try with the following line of code and you should not get this error:
var blockId = "block-" + this.pad(this.blockIds.length, 6);
Please note that block ids are scoped to a blob so it is not really necessary for you to include the blob name to make the block ids unique to a blob.
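For illustration, here is a small sketch of the shortened scheme; it keeps every block id the same length (which the service requires within a single blob) and checks the 64-byte limit, which applies to the string before base64 encoding. The makeBlockId helper name is just for illustration:
// Sketch: fixed-length block ids without the file name
function makeBlockId(index) {
    var raw = "block-" + ("000000" + index).slice(-6); // e.g. "block-000042"
    // Put Block requires the pre-encoding string to be at most 64 bytes,
    // and all ids committed for one blob to have the same length.
    console.assert(raw.length <= 64, "block id too long");
    return btoa(raw); // base64-encode, as the REST API expects
}
// Usage inside ReadBlock():
// this.blockIds.push(makeBlockId(this.blockIds.length));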
If you're using a connection string this could also be an issue; double-check it (and the casing), as container names etc. are case-sensitive. You can read more on naming rules here: https://learn.microsoft.com/en-us/rest/api/storageservices/Naming-and-Referencing-Containers--Blobs--and-Metadata?redirectedfrom=MSDN
I am working with Azure Blob Storage. I have done this with PHP; now I want to upload files to Azure Blob Storage with jQuery, so I used a plugin for that. When I try to upload a file with it, I get this error in the console:
Cross-Origin Request Blocked: The Same Origin Policy disallows reading
the remote resource at (Reason: CORS header
‘Access-Control-Allow-Origin’ missing).
I read about the CORS configuration and enabled it for Azure Blob Storage; here is my screenshot of that.
Here is my jQuery code:
$(function () {
var ju = {
sasUrl: $('#txtSAS'),
list: $('#tbFileList tbody'),
btnUpload: $('#btnUpload'),
btnClear: $('#btnClear'),
btnAddFile: $('#btnAddFile'),
sasSearchName: 'sas',
btnLink: $('#btnLink'),
linkPopover: $('#linkPopover'),
init: function () {
this.root = location.href;
if (this.root.indexOf('?') > 0) {
this.root = this.root.substr(0, this.root.indexOf('?'));
this.root = this.root.replace(/#/g, '');
}
this.btnClear.hide();
var sas = this.queryString(this.sasSearchName);
if (sas) {
this.sasUrl.val(sas);
}
this.list.blobuploader(
{
url: ju.sasUrl.val(),
beforeSend: function (blob) {
},
progress: function (blob) {
ju.progress(blob.element.closest('tr'), blob.loaded, blob.size);
},
success: function (blob, data, status) {
var st = blob.speed(true);
var msg = 'total time: ' + ((st.end - st.start) / 1000).toFixed(2) + 'S<br/>'
+ 'max speed: ' + st.max + '<br/>'
+ 'min speed: ' + st.min + '<br/>'
+ 'average speed: ' + st.average;
ju.status(blob.element, msg);
var download = '<a target="_blank" role="button" class="btn btn-link" href="'
+ blob.blobUrl
+ '" >' + blob.name + '</a>';
ju.log(blob.element.closest('tr').find('td:first'), download);
},
error: function (blob, block, xhr, desc, err) {
var msg = $('<span></span>');
msg.append('upload ' + blob.name + ' error.');
var btn = $('<button type="button" id="btnUpload" class="btn btn-sm btn-primary pull-right" role="button">Retry</button>');
btn.click(function () {
ju.retry($(this).closest('tr'));
});
msg.append(btn)
ju.status(blob.element, msg, 'danger');
}
});
this.btnClear.click(function () {
ju.clear();
});
this.btnAddFile.find('input').on('change', function () {
ju.add();
});
this.btnUpload.click(function () {
ju.upload();
});
this.btnLink.popover({
html: true,
content: this.linkPopover,
container: 'body'
});
this.btnLink.on('shown.bs.popover', function () {
var panel = $('#linkPopover');
panel.find('#txtShareUrl').val(ju.getLongUrl());
panel.find('#ckShortUrl').click(function () {
if ($(this).is(':checked')) {
ju.generateShortUrl();
} else {
panel.find('#txtShareUrl').val(ju.getLongUrl());
}
})
panel.find('.close').click(function () {
ju.btnLink.popover('toggle');
});
panel.find('#ckShortUrl').attr('checked', false);
panel.find('.loading').hide();
});
this.sasUrl.on('change', function () {
ju.linkPopover.find('#ckShortUrl').attr('checked', false);
ju.linkPopover.find('.loading').hide();
});
var code = $('.prettyprint');
code.text(code.text().replace('site-domain', location.origin));
},
progress: function (tr, loaded, total) {
var percent = (loaded / total * 100).toFixed(2);
var span = tr.find('td:last .percent');
if (span.length == 0) {
span = $('<span class="percent"/>').appendTo(tr.find('td:last').empty());
}
span.text(percent + '%');
},
log: function (td, message, type) {
var div = td.empty();
if (type) {
div = $('<div class="alert alert-' + type + '"/>').appendTo(td);
}
if (message instanceof jQuery) {
div.append(message);
} else {
div.html(message);
}
},
information: function (element, info, type) {
var td = element.closest('tr').find('td:eq(1)');
if (info) {
ju.log(td, info, type);
} else {
return td.html();
}
},
status: function (element, message, type) {
var td = element.closest('tr').find('td:last');
if (message) {
ju.log(td, message, type);
} else {
return td.html();
}
},
add: function () {
var tr = $('<tr/>'), td = $('<td/>');
var file = this.btnAddFile.find('input');
this.btnAddFile.append(file.clone(true));
var f = file.get(0).files[0];
td.append(file)
.append(f.name)
.appendTo(tr);
td = $('<td/>')
.append(f.type, f.type ? '<br/>' : '', (f.size / 1000).toFixed(2) + 'KB')
.appendTo(tr);
$('<td><span class="percent"></span></td>').appendTo(tr);
tr.appendTo(this.list);
this.btnClear.show();
},
setProperties: function () {
if (!this.sasUrl.val()) {
alert('Please type in the Container SAS');
return;
}
var blockSize = parseInt($('#txtBlockSize').val());
var maxThread = parseInt($('#txtMaxThread').val());
if (isNaN(blockSize) || isNaN(maxThread)) {
alert("Block Size and Max Thread can only be number.");
return;
}
if (blockSize > 4096) {
alert('The block size should be less than 4096kb');
return;
}
if (blockSize < 1) {
alert('The block size should be greater than 1kb');
return;
}
if (maxThread < 0) {
maxThread = 0;
}
this.list.blobuploader('option', { maxThread: maxThread, blockSizeKB: blockSize, url: this.sasUrl.val() });
return true;
},
upload: function () {
if (this.setProperties()) {
this.list.blobuploader('upload');
}
},
retry: function (tr) {
if (this.setProperties()) {
if (tr) {
var element = tr.find('input[type="file"]');
var blob = this.list.blobuploader('blob', element);
this.list.blobuploader('retry', blob);
} else {
this.list.blobuploader('retry');
}
}
},
clear: function () {
this.list.empty();
this.btnClear.hide();
},
queryString: function (name, value) {
if (!value) {
name = name.replace(/[\[]/, "\\\[").replace(/[\]]/, "\\\]");
var regex = new RegExp("[\\?&]" + name + "=([^&#]*)"),
results = regex.exec(location.search);
return results == null ? "" : atob(decodeURIComponent(results[1].replace(/\+/g, " ")));
} else {
return name + '=' + encodeURIComponent(btoa(value));
}
},
getLongUrl: function () {
return this.root + '?' + this.queryString('sas', this.sasUrl.val());
},
generateShortUrl: function () {
var request = gapi.client.urlshortener.url.insert({
'resource': {
'longUrl': this.getLongUrl()
}
});
request.execute(function (response) {
if (response.id != null) {
ju.linkPopover.find('.loading').hide();
ju.linkPopover.find('#txtShareUrl').val(response.id);
}
else {
ju.linkPopover.find('.loading').text('error.');
}
});
}
}
ju.init();
prettyPrint();
})
function gapiload() {
gapi.client.setApiKey('AIzaSyDzeVB4WDi6azVvIu6uc8hIhWxf99dB6c8');
gapi.client.load('urlshortener', 'v1', function () { });
}
In the input where we need to add "Input Your Container SAS Here" I am adding this SAS URL:
https://*****.blob.core.windows.net/?sv=2017-04-17&ss=bfqt&srt=sco&sp=rwdlacup&se=2017-09-10T01:51:20Z&st=2017-09-09T17:51:20Z&spr=https&sig=******
It picks up this SAS URL, and after that we select a file and upload it.
Can anyone please tell me what the exact issue is?
Thanks
I also downloaded and tested the library; it worked fine with the following settings on my Blob storage service. The MaxAgeInSeconds setting caches the preflight OPTIONS request, so I suggest you reset it to 0 and run your code again (please test it in different browsers).
In addition, there are multiple CORS settings under the Azure Storage panel, one per service. Please make sure that you set the one for Azure Blob Storage.
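For reference, the same rule can also be set programmatically instead of through the portal; this is a minimal sketch assuming the azure-storage npm package and a connection string in the AZURE_STORAGE_CONNECTION_STRING environment variable (the wide-open origins/headers are assumptions, tighten them for production):
// Sketch: apply a CORS rule to the Blob service (assumes the azure-storage npm package)
var azure = require('azure-storage');
var blobService = azure.createBlobService(); // reads AZURE_STORAGE_CONNECTION_STRING

var serviceProperties = {
    Cors: {
        CorsRule: [{
            AllowedOrigins: ['*'],                 // or your site's origin only
            AllowedMethods: ['GET', 'HEAD', 'PUT', 'OPTIONS'],
            AllowedHeaders: ['*'],
            ExposedHeaders: ['*'],
            MaxAgeInSeconds: 0                     // do not cache the preflight while debugging
        }]
    }
};

blobService.setServiceProperties(serviceProperties, function (error) {
    if (error) {
        console.error('Failed to set CORS rule:', error);
    } else {
        console.log('CORS rule applied to the Blob service.');
    }
});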
I am working on a project where I create an Excel file using the XLSX node.js library and send it to a client via Restify, where I then use the FileSaver.js library to save it on the local computer. When I write the xlsx workbook to a file on the backend, it opens fine; however, when I open it on the client, it is corrupted. I get the error: "Excel cannot open this file. The file format or file extension is not valid. Verify that the file has not been corrupted and that the file extension matches the format of the file".
Here is my code for writing and sending the file on the backend:
var wopts = { bookType:'xlsx', bookSST:false, type:'binary' };
var workbook = xlsx.write(wb, wopts);
res.send(200, workbook);
On the front end, I am using code from the XLSX documentation:
function s2ab(s) {
var buf = new ArrayBuffer(s.length);
var view = new Uint8Array(buf);
for (var i=0; i!=s.length; ++i)
view[i] = s.charCodeAt(i) & 0xFF;
return buf;
}
saveAs(new Blob([s2ab(response.data)],{type:""}), "test.xlsx");
Any thoughts on why this would not work? Any help would be much appreciated. Thanks.
As Luke mentioned in the comments, you have to do a base64 encoding before sending the buffer. Here's a snippet that uses the npm module node-xlsx.
var xlsx = require('node-xlsx');
router.get('/history', function (req, res) {
var user = new User();
user.getHistory(req.user.userId, req.query.offset, req.query.limit)
.then(function (history) {
if (req.headers.contenttype && req.headers.contenttype.indexOf('excel') > -1) {
var data = [['Data', 'amount'], ['19/12/2016', '10']];
var xlsxBuffer = xlsx.build([{ name: 'History', data: data }]);
res.end(xlsxBuffer.toString('base64'));
} else {
res.send(history);
}
})
.catch(function (err) {
res.status(500).send(err);
});
});
And this is the frontend code using Angular:
$scope.getXlsFile = function() {
var config = {
params: {
offset: $scope.offset,
limit: $scope.limit
},
headers: {
'contentType': 'application/vnd.ms-excel',
'responseType': 'arraybuffer'
}
};
$http.get('/api/history', config)
.then(function(res) {
var blob = new Blob([convert.base64ToArrayBuffer(res.data)]);
FileSaver.saveAs(blob, 'historial.xlsx');
})
}
where convert is the following factory:
.factory('convert', function () {
return {
base64ToArrayBuffer: function (base64) {
var binary_string = window.atob(base64);
var len = binary_string.length;
var bytes = new Uint8Array(len);
for (var i = 0; i < len; i++) {
bytes[i] = binary_string.charCodeAt(i);
}
return bytes.buffer;
}
}
})
I'm attempting to use the ng-file-upload directive to provide file upload functionality in my angular app.
I've got it working for the most part - I can select multiple files and loop through to grab the file name and file types. I just can't seem to figure out where the actual binary data of each file is stored in the file object.
I tried using the approach outlined in this post - AngularJS Upload a file and send it to a DB, but that results in an error that "$q is not defined".
function create_blob(file) {
var deferred = $q.defer();
var reader = new FileReader();
reader.onload = function () {
deferred.resolve(reader.result);
};
reader.readAsDataURL(file);
return deferred.promise;
}
So then I tried the approach outlined in this post - Send an uploaded image to the server and save it in the server, but again I'm running into an error reading "dataURI.split is not a function".
function dataURItoBlob(dataURI) {
var binary = atob(dataURI.split(',')[1]);
var mimeString = dataURI.split(',')[0].split(':')[1].split(';')[0];
var array = [];
for (var i = 0; i < binary.length; i++) {
array.push(binary.charCodeAt(i));
}
return new Blob([new Uint8Array(array)], {
type: mimeString
});
}
The code I'm using is as follows:
function create_blob(file) {
var deferred = $q.defer();
var reader = new FileReader();
reader.onload = function () {
deferred.resolve(reader.result);
};
reader.readAsDataURL(file);
return deferred.promise;
}
function dataURItoBlob(dataURI) {
var binary = atob(dataURI.split(',')[1]);
var mimeString = dataURI.split(',')[0].split(':')[1].split(';')[0];
var array = [];
for (var i = 0; i < binary.length; i++) {
array.push(binary.charCodeAt(i));
}
return new Blob([new Uint8Array(array)], {
type: mimeString
});
}
$scope.uploadFiles = function (files) {
$scope.files = files;
angular.forEach(files, function (file) {
if (file && !file.$error) {
//var reader = new FileReader();
//console.log(reader.readAsDataURL(file));
//var binary = create_blob(file);
var fileBinary = dataURItoBlob(file);
$http({
url: root + '/DesktopModules/ServiceProxy/API/NetSuite/InsertCaseFile',
method: "POST",
//headers: { 'caseId': id, 'fileName': file.name, fileContent: $.base64.encode(file) }
headers: { 'caseId': id, 'fileName': file.name, fileContent: fileBinary }
}).
success(function (data, status, headers, config) {
//if (data == true) {
// getCase();
// $scope.newMessage = "";
// //toaster.pop('success', "", "Message succesfully submitted.",0);
//}
}).
error(function (data, status, headers, config) {
});
file.upload.progress(function (evt) {
file.progress = Math.min(100, parseInt(100.0 * evt.loaded / evt.total));
});
}
});
}
What am I overlooking?
It depends on what format your DB accepts for file upload. If it supports multipart form data, then you can just use
Upload.upload({file: file, url: my/db/url}).then(...);
if it accepts POST requests with the file's binary data as the content of the request (like CouchDB, imgur, ...), then you can do
Upload.http({data: file, url: my/db/url, headers: {'Content-Type': file.type}})...;
if your DB only accepts JSON objects and you want to store the file as a base64 data URL in the database, like this question, then you can do
Upload.dataUrl(file, true).then(function(dataUrl) {
$http.post(url, {
fileBase64DataUrl: dataUrl,
fileName: file.name,
id: uniqueId
});
})
I have to pin a secondary tile in my Windows Phone 8.1 application.
I followed the MSDN tutorial: http://code.msdn.microsoft.com/windowsapps/secondary-tiles-sample-edf2a178/
It works with an internal image (ms-appx://...) but not with a web URL (http://).
working sample:
var logo = new Windows.Foundation.Uri("ms-appx:///Images/square30x30Tile-sdk.png");
var currentTime = new Date();
var TileActivationArguments = data.ad_id + " WasPinnedAt=" + currentTime;
var tile = new Windows.UI.StartScreen.SecondaryTile(data.ad_id,
data.subject,
TileActivationArguments,
logo,
Windows.UI.StartScreen.TileSize.square150x150);
tile.visualElements.foregroundText = Windows.UI.StartScreen.ForegroundText.light;
tile.visualElements.square30x30Logo = logo;
tile.visualElements.showNameOnSquare150x150Logo = true;
var selectionRect = this.element.getBoundingClientRect();
// Now let's try to pin the tile.
// We'll make the same fundamental call as we did in pinByElement, but this time we'll return a promise.
return new WinJS.Promise(function (complete, error, progress) {
tile.requestCreateForSelectionAsync({ x: selectionRect.left, y: selectionRect.top, width: selectionRect.width, height: selectionRect.height }, Windows.UI.Popups.Placement.above).done(function (isCreated) {
if (isCreated) {
complete(true);
} else {
complete(false);
}
});
});
And if I use
var logo = new Windows.Foundation.Uri(data.images[0]);
I got an invalid parameter exception.
You can take a look at the documentation for the SecondaryTile.Logo property. In it you'll see this:
The location of the image. This can be expressed as one of these schemes:
ms-appx:///
ms-appdata:///local/
You can download the image first and then set it using the ms-appdata:///local/ scheme. I'm not sure that changing the logo to something from the Internet is a good idea, though. This should be the app's logo, so it should be in the package.
I found the solution
fileExists: function (fileName) {
var applicationData = Windows.Storage.ApplicationData.current;
var folder = applicationData.localFolder;
return folder.getFileAsync(fileName).then(function (file) {
return file;
}, function (err) {
return null;
});
},
download: function (imgUrl, imgName) {
return WinJS.xhr({ url: imgUrl, responseType: "blob" }).then(function (result) {
var blob = result.response;
var applicationData = Windows.Storage.ApplicationData.current;
var folder = applicationData.localFolder;
return folder.createFileAsync(imgName, Windows.Storage.
CreationCollisionOption.replaceExisting).then(function (file) {
// Open the returned file in order to copy the data
return file.openAsync(Windows.Storage.FileAccessMode.readWrite).
then(function (stream) {
return Windows.Storage.Streams.RandomAccessStream.copyAsync
(blob.msDetachStream(), stream).then(function () {
// Copy the stream from the blob to the File stream
return stream.flushAsync().then(function () {
stream.close();
});
});
});
});
}, function (e) {
//var msg = new Windows.UI.Popups.MessageDialog(e.message);
//msg.showAsync();
});
},
var self = this;
this.download(data.images[0], data.ad_id).then(function () {
self.fileExists(data.ad_id).then(function (file) {
var logo = new Windows.Foundation.Uri("ms-appdata:///Local/" + data.ad_id);
....
I need to download the image, store it locally, and then I can use ms-appdata:///Local.