I'm trying to build some Firefox OS apps, but I haven't seen any easy way to store local data. I've heard about IndexedDB, but it seems too complex. Is there any alternative? If not, is there an easy tutorial for it?
I have also considered storing and retrieving remote data (doing a cross-domain request), but I'm having some issues with the permissions. Is there any tutorial about XHR for Firefox OS?
Thanks.
The best IndexedDB doc I have found is Using IndexedDB on MDN.
And plenty of the default Firefox OS (Gaia) apps, such as Gallery and Browser, use IndexedDB. You can see how it works in real life.
Or you can use the more lightweight window.localStorage API, which works like a dictionary.
localStorage.setItem(key, value);
localStorage.getItem(key);
EDIT: Note that localStorage is not recommended because it blocks the main thread. You should use gaia/shared/asyncStorage instead.
For XHR you can check the Firefox-OS-Boilerplate-App for a working XHR demo.
I recommend using asyncStorage over localStorage; it is an asynchronous version of localStorage, with the same API and the benefits of IndexedDB.
You can see the code and learn how to use it reading the comments of the file:
https://github.com/mozilla-b2g/gaia/blob/master/shared/js/async_storage.js
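For reference, asyncStorage mirrors the localStorage API but takes a completion callback on each call (it is backed by IndexedDB, so nothing blocks the main thread). A minimal sketch of how a caller uses it — the storage object is passed in here just to make the shape easy to see, and the helper names are mine; in a real app you would call the global asyncStorage from the shared file directly:

```javascript
// Sketch of the callback-style asyncStorage API: setItem(key, value, cb)
// and getItem(key, cb). saveDraft/loadDraft are hypothetical helper names.
function saveDraft(storage, text, done) {
  storage.setItem('draft', text, function () {
    done();
  });
}

function loadDraft(storage, done) {
  storage.getItem('draft', function (value) {
    done(value); // value is null if the key was never set
  });
}
```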
The podcasts reference app talks about both IndexedDB and SystemXHR, which is the privileged API for doing cross-domain requests:
https://marketplace.firefox.com/developers/docs/apps/podcasts
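For completeness, SystemXHR is what lets a privileged app bypass the same-origin restriction: the app's manifest declares the "systemXHR" permission, and the request is created with the mozSystem flag. A rough sketch — the XHR constructor is injected here only so the helper is easy to exercise outside Firefox OS; in an app you would call new XMLHttpRequest({ mozSystem: true }) directly:

```javascript
// Cross-domain GET using a systemXHR-style request object.
// XHRCtor is injected so the helper can be tested with a stub.
function fetchCrossDomain(XHRCtor, url, onLoad, onError) {
  var xhr = new XHRCtor({ mozSystem: true }); // needs "systemXHR" in the manifest
  xhr.open('GET', url, true);
  xhr.onload = function () {
    onLoad(xhr.responseText);
  };
  xhr.onerror = onError;
  xhr.send();
}
```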
You can use the Data Store API in Firefox OS. Using a data store, you can share data with other apps, and you can control whether other apps are allowed to write to the store.
You can follow this link:
https://developer.mozilla.org/en-US/docs/Archive/Firefox_OS/API/Data_Store_API/Using_the_Data_Store_API
Just note that to use a data store, your app needs to be certified.
navigator.getDataStores('mystore').then((stores) => {
  stores[0].getLength().then((length) => console.log(length));
});
Bots are amazing, unless you're Google Analytics
After many months of learning to host my own Discord bot, I finally figured it out! I now have a node server running on my localhost that sends and receives data from my Discord server; it works great. I can do all kinds of things I want to with my Discord bot.
Given that I work with analytics everyday, one project I want to figure out is how to send data to Google Analytics (specifically GA4) from this node server.
NOTE: I have had success sending data to my Universal Analytics property. However, as awesome as it was to finally see pageviews coming in, it was equally heartbreaking to recall that Google will be retiring Universal Analytics in July of this year.
I have tried the following options:
GET/POST requests to the collect endpoint
This option presented itself as impossible from the get-go. In order to send a request to the collect endpoint, a client_id must be sent along with the request itself, and this client_id is something that must be generated using Google's client-id algorithm. So I can't just make one up.
If you consider this option possible, please let me know why.
Install googleapis npm package
At first, I thought I could just install the googleapis package and be ready to go, but that idea fell on its face immediately too. With this package, I can't send data to GA; I can only read with it.
Find and install a GTM npm package
There are GTM npm packages out there, but I quickly found out that they all require a window object, which my node server does not have because it isn't a browser.
How I did this for Universal Analytics
My biggest goal is to do this without using Python, Java, C++, or any other language, because that route would require me to learn new languages. Surely it's possible with NodeJS alone... no?
I eventually stumbled upon the idea of actually hosting a webpage as some sort of pseudo-proxy that would send data from the page to GA when accessed by something like a page scraper. It was simple. I created an HTML file that has Google Tag Manager installed on it, and all I had to do was use the puppeteer npm package.
It isn't perfect, but it works and I can use Google Tag Manager to handle and manipulate input, which is wonderful.
Unfortunately, this same method will not work for GA4, because GA4 automatically excludes all identified bot traffic, and there is no way to turn that setting off. It is a very useful feature, giving GA4 quite a bit more integrity than UA, and I'm not trying to get around that fact, but it is now the bane of my entire goal.
https://support.google.com/analytics/answer/9888366?hl=en
Where to go from here?
I'm nearly at my wits' end figuring this one out. So either an npm package exists out there that I haven't found yet, or this is a futile project.
Does anyone have any experience in sending data from NodeJS to GA4? (or even GTM?) How did you do it?
...and this client_id is something that must be generated using Google's client id algorithm. So, I can't just make one up...
Why, of course you can. GA4 generates it pretty much the same way UA does. You don't need anything from Google to do it.
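For illustration, analytics.js-style client ids are just "&lt;random&gt;.&lt;timestamp&gt;" strings, so a sketch like this is enough (the helper itself is mine, not Google's):

```javascript
// Build a GA-style client_id: a random 31-bit integer and a unix timestamp,
// joined with a dot. GA only needs the value to be stable per "user".
function makeClientId() {
  const random = Math.floor(Math.random() * 2147483647);
  const timestamp = Math.floor(Date.now() / 1000);
  return random + '.' + timestamp;
}
```

Persist the generated id per user (a file, a DB row) so repeat hits count as the same visitor.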
Besides, instead of mimicking raw requests to the collect endpoint, you may just want to go the Measurement Protocol route right away: https://developers.google.com/analytics/devguides/collection/protocol/ga4 The links #dockeryZ gave work perfectly fine. Maybe try opening them in incognito, or in a different browser? Maybe you have a plugin blocking analytics URLs.
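A sketch of what the MP call looks like from Node — the endpoint and payload shape are from the GA4 Measurement Protocol docs; the measurement id and api secret are placeholders you get from the GA4 admin UI:

```javascript
// Build a GA4 Measurement Protocol request: events are POSTed as JSON to
// /mp/collect, with measurement_id and api_secret in the query string.
function buildMpRequest(measurementId, apiSecret, clientId, eventName, params) {
  return {
    url: 'https://www.google-analytics.com/mp/collect' +
      '?measurement_id=' + measurementId + '&api_secret=' + apiSecret,
    body: JSON.stringify({
      client_id: clientId,
      events: [{ name: eventName, params: params }],
    }),
  };
}

// Sending it is then just (Node 18+ has fetch built in):
// const req = buildMpRequest('G-XXXXXXX', 'secret', '123.456', 'page_view', {});
// fetch(req.url, { method: 'POST', body: req.body });
```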
Moreover, you don't really need to reinvent the wheel. Node already has a few packages for sending events to GA4; here's one that looks good: https://www.npmjs.com/package/ga4-mp?activeTab=readme
Or you can just use gtag directly to send events; I see a lot of people doing it even on the front end: https://www.npmjs.com/package/ga-gtag Gtag has a whole API not described there. Here's more on gtag: https://developers.google.com/tag-platform/gtagjs/reference Note how the library allows you to set the client id there.
The only caveat is that you'll have to track client ids and session ids manually. It shouldn't be too bad, though. Oh, and you will have to redefine the concept of a pageview, I guess. The obvious choice is whenever someone posts in a channel different from the previous post in that session. Still, this will have to be defined in the code.
Don't worry about Google's bot-traffic detection. It's really primitive. Just make sure your user agent doesn't scream "bot" in it. Make up something better.
I have one project which is divided into multiple SPAs: five in total, two written in Angular, two in React, and one in Vue.js. I have an integrated server which serves the different files according to the routing. I need to share data from one app to another with the least possible interaction with the database. This is a micro-frontends scenario. Hope this clarifies my problem.
Any help will be appreciated.
There are three ways with which you can share data:
URL: Query Params/Path Params (Only for small data like ID, filters, etc.)
Session Storage: Use this only if you are not navigating to another tab/window
Local Storage: Most convenient and preferred way
Of course, if you are persisting state to Local Storage, then you have to handle flushing of the state by yourself when the user logs out.
This is a somewhat painful process to handle. You will have to write code to manage serialization and deserialization of JSON to Local Storage. To ease this, it is better to have the same state-management solution across all micro-apps; I recommend Redux or MobX for this. But if you are using Redux for React, NgRx for Angular, and Vuex for Vue, then you will not have any ready-made solution.
Also, when you are saving state to Local Storage, either debounce it or do it lazily with a little delay, for performance reasons.
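The serialize/deserialize part above can be sketched framework-agnostically; the storage object is passed in (window.localStorage in the browser), and the key name is arbitrary:

```javascript
// Persist a state object as JSON under one key, and read it back defensively:
// a corrupt or missing entry falls back to a default instead of throwing.
function saveState(storage, key, state) {
  storage.setItem(key, JSON.stringify(state));
}

function loadState(storage, key, fallback) {
  try {
    const raw = storage.getItem(key);
    return raw === null ? fallback : JSON.parse(raw);
  } catch (e) {
    return fallback; // corrupt JSON: start fresh
  }
}
```

Wrap saveState in a debounce (or call it on a timer) so rapid state changes don't hammer Local Storage.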
We have been using micro-frontends for the last two years, and we use a mix of Local and Session Storage to do our thing. Luckily, we use Redux for all the apps, even with Vue, and that allows us to use redux-localstorage.
You can also use Cookies but it is generally better to avoid them.
1st, Custom element creation
I have worked on a micro-front-end, elements-based architecture with the @angular/elements module. As I worked, I used the flow below.
For the code, I followed build-a-micro-frontend-application-using-angular-elements.
This will give you elements that behave like native HTML elements.
2nd, another good approach is to use the library feature in Angular. You can write your components, directives, or pipes, then publish them or use them in other projects directly. With this approach, again, you can reuse the same code.
3rd, we could use an iframe, but nowadays it causes a lot of security issues in browsers.
Another option is to use a frontend event bus like EEV. Your application shell would be responsible for creating a shared event listener. Then each micro-frontend could emit events on that shared channel.
You could also use an RxJS Subject as a message bus within your app shell and subscribe to it in the micro-frontend applications. Here's an example.
I hope that gives you a couple additional ideas.
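The shared-channel idea is small enough to sketch without any library (the names here are mine; EEV or an RxJS Subject would replace this in practice):

```javascript
// Minimal pub/sub bus the app shell could own; each micro-frontend calls
// on()/emit() on the shared instance to exchange data.
function createBus() {
  const handlers = {};
  return {
    on(event, fn) {
      (handlers[event] = handlers[event] || []).push(fn);
    },
    emit(event, payload) {
      (handlers[event] || []).forEach((fn) => fn(payload));
    },
  };
}
```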
I came across the same scenario, where I had to pass data from one micro-app to another.
After a lot of R&D, I found that event-based communication is best; I transfer the data in the form of event objects.
Here is an example.
For sending data:
var event = new CustomEvent('userData', { detail: { id: id, ...rec } });
window.dispatchEvent(event);
And for receiving the data in the other app:
window.addEventListener('userData', (e) => {
  this.id = e.detail.id;
  this.country = e.detail.country;
  this.contact = e.detail.contact;
  this.company = e.detail.company;
  this.changeDetectorRef.detectChanges();
});
This approach does not require any DB communication.
Hope this resolves your query!
Happy coding!
Every reverse engineering of the Google Speech API requires an API key but Chrome is able to call the server, seemingly without one. How does this work internally?
Is it possible to use the API for any sort of large scale speech transcription?
Looking into the Chromium source code, http://src.chromium.org/viewvc/chrome/trunk/src/content/browser/speech/google_one_shot_remote_engine.cc indicates that the server is passed an API key with the request.
It also seems that Google Chrome ships with an API key, and Chromium may, depending on the distribution. https://code.google.com/p/chromium/wiki/ChromiumBrowserVsGoogleChrome
It's still unclear to me why the browser's calls to the server are not affected by the 50-calls-per-day limit.
I am trying to create a simple text editor with multi-tenant Operational Transform support. While it was reasonably easy to get the editor working and syncing across clients using ShareJS, my problem is that I would like to sync the ShareJS docs with a folder structure on the server side (this will eventually be a git repo).
I am completely new to ShareJS and Operational Transforms, and I found the ShareJS documentation a little tough to follow for more complex examples.
Any suggestions on how I might approach this problem?
What I have tried to do is implement a client on the server side that could get the entire doc text on update, but (and this is my lack of experience, I'm sure) the only way I can think to accomplish this is to use the client API to cycle through all documents and write each to a file. To me, though, this sounds horribly inefficient. Can anyone point me to any resources that might help, or offer some advice as to how I could approach this?
This is a bit late, but you can still call the getSnapshot method on the server side and dump the result into a file on your file system. If it is not running locally, you can create a tiny router with Express on your local machine that listens for POST requests; your ShareJS server posts the dumped document in the request body, and your machine then dumps the request body to a file. That should work.
Beware of security considerations if you use an auth system on your server.
I'm building my first node.js app, and trying to integrate it with the Instapaper API. Instapaper only uses XAuth, and I'm a little confused how it works.
Should I use the node-oauth module and try to customize it for xauth? Or should I roll my own xauth? Or something else, maybe?
I'd check out something like EveryAuth, see how they handle the various options out there, fork it, and then contribute back with a new implementation.
Good luck man.
Here is how to get it working with the oauth module.
https://stackoverflow.com/a/9645033/186101
You'd still need to handle storing the tokens for your users, but that answer should get you up and running for testing the API in node.
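For background on what XAuth actually adds: it is ordinary OAuth 1.0a signing, except you hit the access-token endpoint directly with three extra body parameters instead of doing the full request-token/authorize dance. A sketch of just those parameters (the helper name is mine; the signing itself is what the oauth module handles for you):

```javascript
// The three extra body parameters XAuth adds to a normal OAuth 1.0a
// access-token request; everything else (signature, nonce, etc.) is standard.
function xauthParams(username, password) {
  return {
    x_auth_username: username,
    x_auth_password: password,
    x_auth_mode: 'client_auth',
  };
}
```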