Error Loading Preview on Firebase Storage when clicking on the image - android-studio

We are making an app for a school project and we use Firebase as our database. When I try to upload image files directly to Firebase Storage, I can't view them; there's this message: "Error loading preview". My other team member can view them on his device. The same thing happens when uploading and retrieving images in our app: I can't upload or view images (using Picasso and Glide) on my device, but it works fine for my other group member.
I'm using my phone to run our app. Storing and retrieving other data works fine, except for images.
Please help.

Related

Google Maps JavaScript API warning: InvalidKey and Google Maps JavaScript API error: InvalidKeyMapError

I am trying to deploy an open source project (https://github.com/LiteFarmOrg/LiteFarm) on localhost through Docker Compose, and I have followed the instructions provided in the link. The app is a Node.js app with a React.js frontend. I have provided the Google API key for the map (google_api_key) in the ".env" file, but I am getting an "invalidkey" error; I tried another newly generated key and got the same error. The screenshot shows the invalid key error.
Also, in the location text field, a "something went wrong" icon appears.
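For context, here is a minimal sketch of how a React frontend built with Create React App typically reads a Maps key from .env; whether LiteFarm uses this exact mechanism, and the variable name, are assumptions. Create React App only exposes variables prefixed with REACT_APP_ and inlines them at build time, so the key must be present when the Docker image is built, not just when the container runs.
// Minimal sketch: load the Maps JavaScript API with a key from .env.
// Assumes a Create React App build; the variable name is hypothetical.
const apiKey = process.env.REACT_APP_GOOGLE_MAPS_API_KEY;

const script = document.createElement('script');
script.src = 'https://maps.googleapis.com/maps/api/js?key=' + apiKey;
script.async = true;
document.head.appendChild(script);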

Updating a deployment - uploaded images get deleted after redeployment to Google Cloud

So I have a Node.js web app. This web app has a folder that stores images uploaded by users from a mobile app. I upload an image to the folder by taking the image's base64 string and using fs.writeFile to save it, like this:
fs.writeFile(__dirname + '/../images/complaintImg/complaintcase_' + data.cID + '.jpg', Buffer.from(data.complaintImage, 'base64'), function (err) {
    if (err) {
        console.log(err);
    } else {
        console.log("success");
    }
});
The problem is, whenever the application is redeployed to Google Cloud, the images get deleted. This is because the images folder of the local version of the application is empty: when a user uploads an image, I don't get a local copy of it.
How do I prevent the images from being deleted with every deployment? Because the app is constantly updated (changes to JS or HTML files), I can't have the images deleted each time. How do I update a deployment to deploy only certain files? The gcloud app deploy command seems to deploy the entire project. Or should I upload the images directly to Google Cloud Storage?
Please help. The mobile app isn't released to the public yet, so having the images deleted with every deployment isn't a big problem now, but it will be once it's released, because the images users upload are very important. Thank you in advance!
It appears that the __dirname directory you chose may be under /tmp or, if you use the flexible environment, some other directory local to your instance. If so, the images will disappear whenever new instances are started (which always happens at a new deployment, but can happen between deployments as well). This is expected; instances are always started "from scratch".
You need to store the files that your app creates and that you want to survive instance (re)starts on a persistent storage product, like Cloud Storage; see Using Cloud Storage (or Using Cloud Storage for the flexible environment). Note that you can't use regular filesystem calls with Cloud Storage; you need to use the documented client library.
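For illustration, here is a minimal sketch of the same write done with the @google-cloud/storage Node.js client instead of fs; the bucket name is an assumption, and data.cID / data.complaintImage are the fields from the question.
const { Storage } = require('@google-cloud/storage');

const storage = new Storage();
const bucket = storage.bucket('my-app-uploads'); // bucket name is an assumption

// Same payload as the fs.writeFile version, but persisted in Cloud Storage,
// so it survives instance restarts and redeployments.
const file = bucket.file('complaintImg/complaintcase_' + data.cID + '.jpg');
file.save(Buffer.from(data.complaintImage, 'base64'), { contentType: 'image/jpeg' })
    .then(function () { console.log('success'); })
    .catch(function (err) { console.log(err); });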
As stated in Dan Cornilescu's answer, user-uploaded files should be stored in Cloud Storage, for GAE Standard or for GAE Flexible.
For reference, there is an alternative for those using Python 2.7, Java 8, or PHP 5: the Blobstore API.

Kiip web SDK integration showing Bad Request error in console

I just integrated the Kiip SDK for web in my Node.js application and it works correctly. The problem is that it shows an error in the browser console on each page refresh. The error does not affect the SDK, which works perfectly.
The error message is:
'POST https://api.kiip.me/2.0/web/moment/?r=1426508956613 400 (Bad Request)'.
My Kiip integration is as follows:
1) Included the script file in the head tag.
2) Declared the app key as a global variable:
kiip_app_key = 'app-key from kiip site';
3) Initialized the Kiip instance and invoked the method:
var kiipInstance = new Kiip(kiip_app_key);
kiipInstance.setTestMode();
kiipInstance.postMoment('received offer');
Andrew from Kiip here. A few things could be causing this:
The page is running from a local file or host. Solution: run the test page on a server.
The app is not submitted for live rewards. Solution: submit it for live rewards in the Kiip dashboard.
You're passing an incorrect app key. Solution: copy the app_key for the corresponding app from the Kiip dashboard and paste it as your global variable.
Hope this helps,
Andrew

Image Uploading in Meteor, uploading empty files

I am working on the Meteor framework and trying to upload images in one of my apps using the code from this link:
https://gist.github.com/3922137
Everything works well, except that it uploads an empty file to my app's public folder.
I checked the console and it shows a 503 error after I select a file to upload.
Here is the console screenshot:
http://img40.imageshack.us/img40/2956/consolewl.jpg
It keeps looping, and the number of errors keeps growing in the console.
I am using Meteor on Windows.
Has anyone managed to get file uploads working in Meteor on Windows? If so, can you please share the code that worked for you?
Thanks
Aman
The reason you are getting a 503 is that when anything in the public folder changes, Meteor reloads. Because your uploads are going there, the server keeps restarting. Change your code to save uploads somewhere other than public and the error should go away.
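As a minimal sketch of that change, the server-side method below writes uploads to a directory outside the project tree so Meteor's file watcher never sees them; the method name, payload format, and upload directory are all hypothetical.
// Server-side sketch: save uploads outside the app so changes to them
// do not trigger Meteor's reload. Names and paths are hypothetical.
var fs = Npm.require('fs');
var path = Npm.require('path');

var UPLOAD_DIR = 'C:\\meteor-uploads'; // any directory outside the project

Meteor.methods({
  saveUpload: function (fileName, base64Data) {
    fs.writeFileSync(
      path.join(UPLOAD_DIR, fileName),
      new Buffer(base64Data, 'base64')
    );
  }
});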

Azure: Downloading from Blob Storage results in permissions error?

I’ve uploaded some files to Blob storage, and now I’m using the OnStart method to retrieve those files and run them. Right now I’m working locally.
Using the following code:
using (var fileStream = System.IO.File.OpenWrite(@"C:\testfolder"))
{
    blob.DownloadToStream(fileStream);
}
Results in an “Access to the path 'C:\testfolder' is denied.” error.
What do you think is causing this? And - will this be an issue once the project is actually pushed up to Azure? I can change permissions locally, but I'm hoping that once it's actually in a live worker role, it won't be an issue.
Any help would be awesome :)
Scratch that - it looks like the path passed to OpenWrite should specify the file name, not just the folder. I've changed it to C:\testfolder\test.txt and it works just fine :).
