I have built a desktop app using node-webkit and need to cache PDF files that are viewed via the app while online so that they are also available offline. I haven't found a solution yet, but during testing I noticed that files I had previously viewed online were available offline even though I hadn't written any code for this. Therefore these must already be cached automatically. I searched for where exactly the files are being saved but couldn't find anything.
Can anyone explain this, or point me in the direction of information on it, so that I understand how it works and can ensure my app utilises the browser's default caching behaviour?
UPDATE
I have found a solution to store the PDFs locally, however this isn't my query. I am looking for an explanation as to HOW the PDFs are available when offline without this code I have written. The files must automatically be stored somewhere, otherwise how would they display?
The default caching behavior of node-webkit is controlled by the page-cache property in package.json:
"webkit": {
"page-cache": true
},
Only typical web resources can be cached this way (scripts, style sheets, etc.). To be able to view PDF files offline, you can store them manually.
There are several ways to do that:
Save a file directly to disk (the simple solution: just store the files in App.dataPath; see the sketch below the list)
Use a database
Use Web Storage
Use the application cache
All of these are documented here: Save persistent data in app
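As a rough sketch of the first option (purely illustrative; the URL and file name below are placeholders), the PDF can be streamed straight into App.dataPath so it can be opened offline later:
var fs = require('fs');
var path = require('path');
var https = require('https');
var gui = require('nw.gui'); // node-webkit API; current NW.js exposes the same App object as the global nw.App

// Download a PDF into the app's data path so it can be shown again offline.
function cachePdf(url, fileName, callback) {
  var target = path.join(gui.App.dataPath, fileName);
  var file = fs.createWriteStream(target);
  https.get(url, function (res) {
    res.pipe(file);
    file.on('finish', function () {
      file.close(function () { callback(null, target); });
    });
  }).on('error', function (err) {
    fs.unlink(target, function () {}); // remove any partial download
    callback(err);
  });
}

cachePdf('https://example.com/doc.pdf', 'doc.pdf', function (err, localPath) {
  if (!err) console.log('PDF cached at ' + localPath);
});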
The default location for your app's cached files is determined by your package.json manifest file. When the app is initialized, the settings in your manifest file are loaded by default. Since the cached files cannot be accessed programmatically, you would have to overwrite the default files manually.
The application's data path in the user's directory on Windows (configured via your package.json manifest, which is written in JSON format) is:
Windows: %LOCALAPPDATA%/
You can read about other cache methods in node-webkit's documentation:
http://docs.nwjs.io/en/latest/References/App/#appclearcache
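As a small illustration (using the older require('nw.gui') style of node-webkit; current NW.js exposes the same App object as the global nw.App), the data path can be inspected and the HTTP cache cleared like this:
var gui = require('nw.gui');

// Where node-webkit keeps this app's data; on Windows this resolves to a
// folder under %LOCALAPPDATA% named after the manifest.
console.log('Data path: ' + gui.App.dataPath);

// Clears the HTTP cache, both in memory and on disk.
gui.App.clearCache();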
Ok, I have never seen anything like this before and I'm hoping someone else has. I just finished patching our Dev and Test servers to the Nov 2017 CU (SharePoint 2013). Since then, any solutions that use JS injection from Site Assets are not updating. I'll make a change to the file, and the library reflects that I made the change, but when I load the page that accesses the JS file, the changes are not reflected. Hard refreshes and full cache cleans are not affecting it. If I close and reopen my editor (VS Code), my changes are gone. When I look at the version history, the current version doesn't have my changes, but the previous version does. If I try to revert to that version, it doesn't take (it still shows the previous version of the file).
Here's where it becomes extra weird. I have deleted the entire file from the library. Reset IIS (heck, I even rebooted the server at one time). It somehow still loads the file. The file is no longer in the library, but the server is still serving it up to the browser. I have confirmed it is not getting it from another location as the Dev tools are showing the file is located in the Asset Library the file was deleted from. Even users who have never accessed the site before are still getting that file in their browser.
This isn't limited to a single site either. I have other developers in different sub sites (same site collection) that are having the same issues.
Anyone seen this before?
It looks like your web application has the BLOB cache enabled, which is causing files to be served from the cache.
There are two ways to fix this:
1) The heavy-handed way would be to flush the BLOB cache using the PowerShell commands below:
$webApp = Get-SPWebApplication "<WebApplicationURL>"
[Microsoft.SharePoint.Publishing.PublishingCache]::FlushBlobCache($webApp)
This will flush all the files in the BLOB cache. Usually, files are cached based on the max-age attribute value; that is why your file is still being served even though you deleted it from the source.
2) The surgical-knife approach would be to append a query string, like https://sitecollurl/siteassets/app.js?v=1.1, to the file references (usually in the master page, page layouts, web part references, script links, etc., wherever it is referenced). When you append a query string to the file, it forces the browser to download the newer version of the file. I would prefer this approach as it will not unnecessarily clear other files from the BLOB cache.
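If the reference cannot simply be edited in place, the same idea can be sketched in script (purely illustrative; ASSET_VERSION and the loader function below are hypothetical): build the URL with a version parameter so each revision is treated as a new resource.
// Load the Site Assets script with a cache-busting version query string.
// Bump ASSET_VERSION whenever app.js changes.
var ASSET_VERSION = '1.1';

function loadVersionedScript(src) {
  var script = document.createElement('script');
  script.src = src + '?v=' + ASSET_VERSION;
  document.head.appendChild(script);
}

loadVersionedScript('https://sitecollurl/siteassets/app.js');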
Hey Guys
At the moment I have a NodeJS webapp in the making which scrapes a website for data. Specifically, this webapp scrapes images for the purpose of downloading them. For example, all the image permalinks are scraped from the reddit front page. They are then sent to the client to download individually. My issue is that, with the website I am scraping, there can be thousands of images.
This provides a horrible user experience if 1000+ images are downloaded to the download folder.
As a result, I have two options:
A) Download to a temporary folder on the server. Zip. Send to the client for download. Delete from the server.
B) Download the files to the browser cache. Zip. Download to the specified download directory.
My question to you is this: is option B even possible?
I am relatively new to this entire process and I can't find anything that actually zips files in the browser cache. I can implement option A relatively easily; however, this requires a large amount of bandwidth, something I can find for around $5/mo on DigitalOcean. However, this entire project is a learning experience, and as a result I would love to be able to manage files in the browser cache instead.
I am using NodeJS with the following NPM modules:
Express
Cheerio
Request
Further Update
I had come across an NPM package called JSZip: https://stuk.github.io/jszip/
However, I was unaware it could be used on the client side as well. This was purely an error on my part. This brings up an interesting issue with Web Storage: https://www.w3schools.com/html/html5_webstorage.asp
the maximum storage size for the session is 5MB
From here I will attempt to apply the answer from How do you cache an image in Javascript to my current code, and will update this answer with the result for anyone else facing this issue.
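For anyone else attempting option B, here is a rough sketch of what client-side JSZip usage could look like (the URLs are placeholders, and it assumes JSZip is loaded on the page and the images are served with CORS headers that allow fetching): each image is pulled into memory as a Blob, added to the archive, and the finished zip is handed to the browser as a single download.
// Zip a list of image URLs in the browser with JSZip and trigger one download.
function downloadImagesAsZip(imageUrls, zipName) {
  var zip = new JSZip();

  // Fetch every image as a Blob and add it to the archive.
  var additions = imageUrls.map(function (url, index) {
    return fetch(url)
      .then(function (res) { return res.blob(); })
      .then(function (blob) { zip.file('image-' + index + '.jpg', blob); });
  });

  return Promise.all(additions)
    .then(function () { return zip.generateAsync({ type: 'blob' }); })
    .then(function (zipBlob) {
      // Hand the finished archive to the browser as a normal download.
      var link = document.createElement('a');
      link.href = URL.createObjectURL(zipBlob);
      link.download = zipName;
      link.click();
    });
}

downloadImagesAsZip(['https://example.com/a.jpg', 'https://example.com/b.jpg'], 'images.zip');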
I want to find the correct place to save my user settings for my UWP app. I know the following exist:
local: Data that exists on the current device and is backed up in the cloud
roaming: Data that exists on all devices on which the user has installed the app
temporary: Data that could be removed by the system any time the app isn't running
localcache: Persistent data that exists only on the current device
I can access the above places with ApplicationData.Current; they are located somewhere in C:\Users\bla\AppData\Local\Packages\1e7e-94a6-4235-a0c5-9b143f8b_8webbwe
The project also contains an Assets folder, and I can't find a good source which tells me where that folder is located when the app is installed (not in developer mode).
Some developers place their settings in the Assets folder. Why? What's the advantage? Is there also a file size limit like for ApplicationData.Current? If I deploy a settings folder into the Assets folder, will it be available for all users who install my app? Any background information regarding the Assets folder is appreciated.
Settings files are most appropriate in the ApplicationData folders, or in ApplicationData.LocalSettings or .RoamingSettings. See Store and retrieve settings and other app data.
The assets folder is purely a convention. The "assets" name is not special other than to suggest what types of files go in the folder. It is just a useful way to organize the application package to have a place for assets (images, etc.) that are used in the app.
When the app is installed the assets will be in the Package.InstalledLocation directory and can be addressed with an ms-appx:///assets/ URI. Typically this will end up somewhere in \Program Files\WindowsApps\.
Putting a settings file in assets would be a bad idea, as the InstalledLocation is read-only and settings are user data.
There is no hard size limit for files in ApplicationData folders, although if too much data is stored in RoamingFolder then it won't roam. The files will still be available locally.
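As a minimal sketch of the settings APIs (shown here via the JavaScript projection of WinRT, since UWP apps can also be written in JS; a C# app uses the equivalent ApplicationData.Current.LocalSettings, and "exampleSetting" is just an illustrative key):
var applicationData = Windows.Storage.ApplicationData.current;
var localSettings = applicationData.localSettings;

// Write a simple setting.
localSettings.values["exampleSetting"] = "Hello";

// Read it back (undefined if it has never been written).
var value = localSettings.values["exampleSetting"];

// roamingSettings works the same way but syncs across the user's devices.
var roamingSettings = applicationData.roamingSettings;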
I set up my working Core Data sqlite file for versioning. The versioning setup process created 3 files:
foo.sqlite
foo.sqlite-shm
foo.sqlite-wal
Since then, I can access the Core Data store programmatically (using MagicalRecord), but I can't read any data using either the Firefox add-on (SQLite Manager) or the SQLiteManager app. I'm concerned that when I send the updated app to the App Store, the additional files are not going to be included and the app is going to crash.
What do I need to do to make sure new versioning-enabled sqlite files go with the app?
Those are not version-related files, they're SQLite log files. These files get created automatically when write-ahead logging is enabled. That's not the default in iOS 6, but it's possible if you use PRAGMA journal_mode=WAL;. It might or might not be the default in iOS 7 (I have no comment at this time).
I don't know why Firefox and SQLiteManager can't open the file. I speculate that they're both using an old version of SQLite (since WAL is only available as of SQLite 3.7.0). Regardless, they have nothing to do with whether the necessary files are available in your app. You can find out what's included in the app by just looking. The .app is just a directory, really, so take a look inside and see what's there.
If you are using lightweight migration (which is enabled by passing the right options when you open the store), Core Data takes care of upgrading the schema in-place.
The additional WAL and SHM files are not a result of lightweight migration, but are instead simply produced by SQLite in the “write ahead logging” mode that Core Data puts it into. (An oversimplification is that new data goes into the .wal file until enough accumulates and then it is moved to the .sqlite file.)
Yes, you definitely want to test using Ad Hoc builds for lightweight migration; testing from Xcode is insufficient.
Mike Fikes
I have an app that synchronises content with a web server so that the app ends up with an offline, cut-down version of the server-based web pages. All text and HTML is stored in a SQLite database, but what is the best approach for handling file assets? In my case this is a mix of image and audio files.
The synchronisation is all set up in the core project, and my Touch project has a Content directory set up for storing the assets; my intention had been to have a similar setup for Droid. I could pass the list of files needed to the UI projects and download them from there, but that seems wrong.
Thanks.
For that I would create a Service in Mvx which the ViewModels you create use for getting the external assets. Take, for instance, the Daily Dilbert tutorial. You could consider the daily comics as being very similar to your external assets: the DilbertService is used to get all the comics and present them in a list. However, your list could be a list of files located on the SD card or wherever you decide to store your files.