I have a Chrome extension that persists data with the chrome.storage.sync API.
I set it using:
chrome.storage.sync.set({key: 'mymenu', listarray: listarray}, function() {});
and get it using:
chrome.storage.sync.get(['key', 'listarray'], function(result) {
Overall it works, even after restarts. But once in a while (approximately every few days to a week) the stored data is gone. If I set it again, it works for another few days to a week.
What could be the reason the extension keeps losing its stored data?
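For reference, here is the same set/get pair written out in full, with chrome.runtime.lastError checks added; the error reporting is an illustrative addition (not something the extension currently does), but it would at least surface a failed write or read in the console:

chrome.storage.sync.set({key: 'mymenu', listarray: listarray}, function() {
    // Illustrative addition: chrome.runtime.lastError is set if the write failed
    // (for example, if a storage quota was exceeded).
    if (chrome.runtime.lastError) {
        console.error('set failed:', chrome.runtime.lastError.message);
    }
});

chrome.storage.sync.get(['key', 'listarray'], function(result) {
    if (chrome.runtime.lastError) {
        console.error('get failed:', chrome.runtime.lastError.message);
        return;
    }
    console.log(result.key, result.listarray);
});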
While I patiently wait for Firebase Storage to be added to the emulators, is there a way to avoid modifying live Storage files and folders when running Hosting / Functions in the emulator?
For example, I use the following code to delete all the files in a folder. Last night someone accidentally deleted all the documents in our emulator as part of a test, and because we import real documents into the emulator, it deleted all the LIVE Storage folders as well 🤦
async function deleteStorageFolder(path: string) {
    // Deletes every object whose name starts with `path` (admin is the initialized firebase-admin SDK).
    const bucket = admin.storage().bucket();
    return bucket.deleteFiles({
        prefix: path
    });
}
Is there any way I can tell Firebase to avoid using the production Storage APIs when the emulators are running?
I have used the following condition in my function to avoid calling the Firebase Storage API when running in the emulator:
if (process.env.FUNCTIONS_EMULATOR == "true") {
    console.log(`Running in emulator, won't call firebase storage`)
} else {
    // Code goes here to run storage APIs
}
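For example, a minimal sketch of how that guard can wrap the destructive call from the question (safeDeleteStorageFolder is just an illustrative name, not part of the original code):

// Sketch: only touch real Cloud Storage when NOT running under the Functions emulator.
// Assumes deleteStorageFolder() from the question and an initialized firebase-admin SDK.
async function safeDeleteStorageFolder(path) {
    if (process.env.FUNCTIONS_EMULATOR === "true") {
        console.log(`Running in emulator, skipping deletion of ${path}`);
        return;
    }
    return deleteStorageFolder(path);
}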
I have a Node.js web app with a folder that stores images uploaded by users from a mobile app. I save an image to that folder by taking its base64 string and writing it with fs.writeFile, like this:
fs.writeFile(__dirname + '/../images/complaintImg/complaintcase_' + data.cID + '.jpg', Buffer.from(data.complaintImage, 'base64'), function (err) {
    if (err) {
        console.log(err);
    } else {
        console.log("success");
    }
});
The problem is that whenever the application is redeployed to Google Cloud, the images get deleted. This is because the image folder in the local copy of the application is empty: when a user uploads an image, I don't get a local copy of it.
How do I prevent the images from being deleted with every deployment? Because the app is constantly updated (changes to JS or HTML files), I can't have the images wiped each time. How do I update a deployment so that it only deploys certain files? The gcloud app deploy command seems to deploy the entire project. Or should I upload the images directly to Google Cloud Storage?
Please help. The mobile app isn't released to the public yet, so losing the images on every deployment isn't a big problem right now, but it will be once it's released, because the images users upload are very important. Thank you in advance!
It appears that the __dirname-based directory you chose may be under /tmp or, if you use the flexible environment, some other directory local to your instance. If so, the images will disappear whenever new instances are started (which always happens at a new deployment, but can happen between deployments as well). This is expected: instances are always started "from scratch".
You need to store the files that your app creates, and that you want to survive instance (re)starts, in a persistent storage product such as Cloud Storage; see Using Cloud Storage (or Using Cloud Storage for the flexible environment). Note that you can't use regular filesystem calls with Cloud Storage; you need to use the documented client library.
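As an illustration only (not part of the original answer), here is a minimal sketch of the question's upload rewritten against the Cloud Storage Node.js client library; the bucket name 'my-app-uploads' is a placeholder:

// Sketch: write the uploaded image to a Cloud Storage bucket instead of the local filesystem.
// Assumes `npm install @google-cloud/storage`; 'my-app-uploads' is a placeholder bucket name.
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();
const bucket = storage.bucket('my-app-uploads');

async function saveComplaintImage(data) {
    const file = bucket.file('complaintImg/complaintcase_' + data.cID + '.jpg');
    // save() uploads the buffer, overwriting any existing object at that path.
    await file.save(Buffer.from(data.complaintImage, 'base64'), { contentType: 'image/jpeg' });
    console.log('success');
}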
As stated in Dan Cornilescu's answer, user-uploaded files should be stored in Cloud Storage, for GAE Standard or for GAE Flexible.
Just as a reference, there is an alternative for those using Python 2.7, Java 8, or PHP 5: the Blobstore API.
I believe I set up my service worker incorrectly from the get-go, and now I'm having trouble resetting everything. My problem is that the only way for users to fix the issue is to manually unregister the service worker and clear the cached files, which isn't very helpful. I've added a script in webpack to unregister the service worker and delete the cached storage files, but since the bundle file is itself cached by the user's browser, users never see the changes I make to that file, so they are stuck with old code.
I've tried everything I can think of and am starting to exhaust my resources, but it seems like I can't do much for the user agent, since it keeps receiving old files and never even sees the new changes.
Also, I added cache busting, which I thought would fix the issue, but it did not.
Any suggestions?
Unregistering your service worker should be unnecessary. You should be able to just deploy a new SW, and your users will start getting it.
See this answer about SW updates. Basically, any time a user visits your site, the SW is fetched fresh if the version the browser has is more than 24 hours old.
In the new version of your SW, change the cache-name prefix or suffix, and Workbox will switch to a fresh cache and ignore all of the existing cached assets.
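For example, a minimal sketch assuming your SW is built with the Workbox modules (v4 or later); the prefix and suffix values here are placeholders, not taken from your config:

// Sketch: bump the cache-name suffix so Workbox writes to brand-new caches
// and ignores the old ones. 'my-app' and 'v2' are placeholder values.
import { setCacheNameDetails } from 'workbox-core';

setCacheNameDetails({
    prefix: 'my-app',
    suffix: 'v2'
});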
You'll probably also want to clean up the old caches with something like this:
const suffix = 'v1';
self.addEventListener('activate', function(event) {
    // Delete any caches whose name does not end with the current suffix.
    event.waitUntil(
        caches.keys()
            .then(keys => keys.filter(key => !key.endsWith(suffix)))
            .then(keys => Promise.all(keys.map(key => caches.delete(key))))
    );
});
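If you also want the new worker to take control of already-open tabs immediately, instead of waiting for them to be closed, a common companion to the cleanup above (an assumption about your goals, not something your current SW necessarily needs) is:

// Sketch: activate the new SW right away and take over existing pages.
self.addEventListener('install', function(event) {
    self.skipWaiting();
});

self.addEventListener('activate', function(event) {
    event.waitUntil(self.clients.claim());
});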
I'm working on an app that, instead of a database, uses the file system in the server's root directory. It's basically a note application that allows me to save notes. Each note is a serialized object of the Note class stored under the following structure: \Data\Notes\MyUsername\Title.txt
When I'm testing this on localhost through IIS Express everything works fine and I can easily go step by step there.
However, once I publish the app to Azure, the folder structure is still there (I made a test controller that uses Directory.GetFiles() and Directory.GetDirectories() to simulate folder browsing, so I'm sure the files are there), but the file simply doesn't get loaded.
The loading method that's being called:
public T Load<T>(string filePath) where T : new()
{
    StreamReader reader = null;
    try
    {
        reader = new StreamReader(filePath);
        var RawDB = reader.ReadToEnd();
        return JsonConvert.DeserializeObject<T>(RawDB);
    }
    catch
    {
        // Any failure (missing file, access denied, bad JSON) is swallowed
        // and default(T) is returned.
        return default(T);
    }
    finally
    {
        if (reader != null)
            reader.Dispose();
    }
}
Since I can't normally debug the app on Azure, I tried to dump as much info as I could through ViewData, and even there everything looks okay and the paths match, but the deserialized object is still null. This only happens when trying to open an existing note WITHOUT creating a new one first (more on that later).
Additionally, like I said, new notes get saved in the folder structure, and there's a note sidebar on the left that allows users to switch between notes. The note browser is nothing more than a list collected with a .GetFiles() call on that folder.
On Azure, this works normally and if I were to delete one manually it'd be removed from the sidebar as well.
Now here's the kicker. On localhost, adding a note adds it to the sidebar and I can switch between them normally.
Adding a note on Azure makes all views display only that new note, regardless of which note I open, and the new note does NOT get stored in the structure (I don't know where it ended up at all!), even though the path is defined normally at that point and it should save just like it does on localhost.
var model = new ViewNoteModel()
{
    Note = Load<Note>($@"{NotePath}\{Title}.txt"), //Works on localhost, fails on Azure on many levels. Title is a URL param.
    MyNotes = GetMyNotes() //works fine, reads right directory on local and Azure
};
To summarize:
Everything works fine on localhost; the important part doesn't work on Azure.
If a new note is not created but an existing note is opened, the correct note gets loaded (based on the URL param) on localhost; on Azure it breaks and loads a default Note object (not null, just the default-constructor data, since that's required by JsonConvert).
If a new note is created, on localhost you'll see it and can still open all the other notes; on Azure you will see only the new note, regardless of which note is picked.
It's really strange and I have no idea what could cause this. I thought it had something to do with Azure handling requests differently, so maybe the controller pushes the view before the model is completely initialized, but that doesn't make sense since there's nothing async here.
However, the fact that it loads a note that doesn't exist on the server is even more absurd, and I have no explanation for that.
Additionally, this issue is not linked to a session: I logged in through my phone and it showed the fake note there as well, right away.
P.S. Before you say anything about storage, please note this. Our university grants us a very limited Azure subscription: a simple lowest-tier App Service and a 5 DTU SQL server, with 99% of the rest locked out of our subscription. This is why I'm storing stuff on the server, not because I believe it's the smart thing to do.
I’ve uploaded some files to Blob storage, and now I’m using the OnStart method to retrieve those files and run them. Right now I’m working locally.
Using the following code:
using (var fileStream = System.IO.File.OpenWrite(@"C:\testfolder"))
{
    blob.DownloadToStream(fileStream);
}
Results in an “Access to the path 'C:\testfolder' is denied.” error.
What do you think is causing this? And - will this be an issue once the project is actually pushed up to Azure? I can change permissions locally, but I'm hoping that once it's actually in a live worker role, it won't be an issue.
Any help would be awesome :)
Scratch that. It looks like the path passed to OpenWrite should specify the file name, not just the folder. I've changed it to C:\testfolder\test.txt and it works just fine :).