I have a function that requires a Google token, obtained via Socialite, for external API authentication.
My goal is that, after successfully authenticating with Socialite, the function should run at an interval, so I decided to use the Laravel Scheduler.
However, after reading the whole documentation, I gather that it works without human intervention: you set up your logic, schedule it in Console\Kernel.php, and that's it.
But in my case I need to authenticate with Google via Socialite, which of course requires the user to click first.
My question is: how do I run the cron job after authentication?
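The usual way around this is to do the interactive OAuth step once, persist the token, and let the scheduled job use the stored token with no user present. Below is a minimal sketch of that pattern, written in Node.js for illustration only (the question is about Laravel, where the same idea applies to a command scheduled in Console\Kernel.php; every name here is hypothetical):

```javascript
// Sketch only: an in-memory Map stands in for a database table of
// stored OAuth tokens. All names are hypothetical.
const tokenStore = new Map();

// Step 1: the OAuth callback runs once, when the user clicks through
// the Google/Socialite consent screen, and persists the refresh token.
function handleOAuthCallback(userId, refreshToken) {
  tokenStore.set(userId, { refreshToken, savedAt: Date.now() });
}

// Step 2: the scheduled job runs unattended. It never asks the user to
// click anything; it only checks for a stored token, which it would
// exchange for a fresh access token before calling the external API.
function canRunScheduledJob(userId) {
  return tokenStore.has(userId);
}
```

In Laravel terms, step 1 would live in the Socialite callback controller, and step 2 would be the scheduled command: it reads the persisted token (refreshing it when expired), so no human intervention is needed after the first login.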
How can I check if the user is currently active from the backend (Express)?
I am using the Auth0 React SDK for authentication on the frontend.
Use case:
The Express server has a cron job that checks whether a user's session is active and, if so, runs some method.
Options:
The simplest option is probably to push a logged-in timestamp to the database at intervals via an API call from the React app; the Express cron then checks the database timestamp, and if it is in the ballpark of the current time, executes the method.
However, I believe there could be a better way.
For example, when the Express cron executes, it could somehow get the users with currently active sessions and perform the desired function on them.
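A minimal sketch of the timestamp option, using an in-memory Map as a stand-in for the database (the function names and the heartbeat window are assumptions, not an established API):

```javascript
// Sketch only: lastSeen maps userId -> epoch ms and stands in for the
// database column the React app would update through the API.
const lastSeen = new Map();

// Called by the endpoint the React app hits on an interval.
function recordHeartbeat(userId, now = Date.now()) {
  lastSeen.set(userId, now);
}

// Called by the Express cron for each user. windowMs should be a bit
// larger than the heartbeat interval, to tolerate network jitter.
function isActive(userId, windowMs, now = Date.now()) {
  const ts = lastSeen.get(userId);
  return ts !== undefined && now - ts <= windowMs;
}
```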
Any tips or recommendations?
I need to find the best possible way to do the following task.
I have many users, let's say over 500, and every user has a scheduled function that needs to run twice every day. But if a user's phone is off, that function won't run, since its code is written on the client side.
What I want to do now is run the scheduled function in the backend using Node.js, but I don't know how to run it for every user (note: every user has a different schedule). That's why I wrote it on the client side in the first place, but since a phone might be switched off, that approach is unreliable.
What should I do in this scenario? Any leads?
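One hedged sketch of what server-side per-user scheduling might look like: keep each user's times in the database and arm one timer per user on the backend. The 'HH:MM' format and the helper names here are assumptions for illustration:

```javascript
// Compute how long to wait until a user's next scheduled slot.
// times: e.g. ['08:00', '20:00'] in server-local time (an assumption).
function msUntilNext(times, now = new Date()) {
  const candidates = times.map((t) => {
    const [h, m] = t.split(':').map(Number);
    const d = new Date(now);
    d.setHours(h, m, 0, 0);
    if (d <= now) d.setDate(d.getDate() + 1); // already passed today
    return d - now;
  });
  return Math.min(...candidates);
}

// Arm one timer per user; re-arm after each run. With ~500 users this
// is cheap; at larger scale a job queue (e.g. a cron-backed worker
// scanning the database every minute) would be the sturdier choice.
function scheduleUser(user, task) {
  setTimeout(() => {
    task(user);
    scheduleUser(user, task); // re-arm for the next slot
  }, msUntilNext(user.times));
}
```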
I'm building a REST API with Node.js, with MongoDB as the database.
This API only has GET routes to retrieve data. The data is aggregated from other sources via a cron job.
I'm not sure how to implement this correctly and what the best practices are.
Do I need to create POST/PUT routes to put data into the database?
Or do I just put the data directly into the database?
Edit (more information):
Only the cron jobs would use the POST route.
The cron jobs get data from other REST APIs and some web scraping.
Is it a good idea to have my cron jobs in the same application as the API, or should I make another application to manage the cron jobs and populate my database?
I suggest creating an API endpoint that can be called with an access key for updating the data, because you would not want to write your MongoDB username and password in a shell file.
But if the cron job is a Kubernetes CronJob, or a program that can access the database in a secure way and is hosted on the same network, then you can go ahead and write to the database directly from the cron job.
If you'd like to control the data flow entirely through the API, creating a POST route is the way to go. I am not sure how your GET routes are secured; if they are not secured at all, consider implementing, or at least hard-coding, some sort of security for routes that modify your data (OAuth2 or similar).
If you can access the database directly and that's acceptable, you can simply insert/update the data there.
The second option is probably quicker, but the first one leaves more room for expansion in the future and could be more useful overall.
So in the end, both options are valid, it's up to preference and your use case.
I want to build a Node.js application that scrapes data from a website every 20 minutes and stores it in Firebase. Can you please tell me which Google product (Compute Engine, App Engine, or Cloud Functions) is the most effective for this requirement? Here is what I expect to do:
1. Run Node.js with cheerio to scrape data from the website and store it in Firebase.
2. Schedule it to run every 20 minutes initially; later this may change to 30 minutes or 1 hour.
After reading the docs, I know that there are many ways to implement this, but I am looking for a cost- and resource-effective way.
Pointers and ideas would be good.
Host the Node.js application on App Engine[1], since Cloud Functions are event-driven[2]. You can use the App Engine standard[3] or flexible[4] environment. For the scheduling part, Google Cloud Platform has a Cron Service[5], and you can create a cron job for your task that hits App Engine[6]. You can find a sample design here[7].
It depends on how much time your script spends waiting on requests. During that time the script is idle but you're getting charged at a super-high rate.
If you're doing a lot of concurrency, then I would say do it with Cloud Functions. Another pro of doing it that way is that your IP won't get blocked, because it will be different every time.
Regarding scheduling, I'm not sure if Google lets you do that, but I know AWS does.
A cost-effective, simple way would be to use cronjob.org and have it send an HTTP request to your Cloud Function's URL to trigger it. If you're worried about other people triggering it, have the cron job send an HTTP header with an API key, and check this API key in your Cloud Function code to verify that cronjob.org sent the request. I don't think it gets any easier or cheaper than this.
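The cronjob.org-plus-API-key setup above might look roughly like this as an HTTP function handler (the header name, key value, and `scrapeJob` name are hypothetical, and the actual scraping is elided):

```javascript
// Sketch only: in practice, load the key from an environment variable.
const API_KEY = 'my-cron-key';

// An HTTP Cloud Function receives Express-style (req, res) arguments.
function scrapeJob(req, res) {
  // Reject any caller that doesn't present the shared key.
  if (req.headers['x-api-key'] !== API_KEY) {
    return res.status(403).send('forbidden');
  }
  // ...run the cheerio scrape and write the results to Firebase here...
  return res.status(200).send('ok');
}
```

cronjob.org would then be configured to call the function's URL every 20 minutes with the `x-api-key` header set.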
I have written this module in Node.js: it is an Express middleware that equips your Node app with an API for accessing cloud storage services such as Dropbox.
For example, this will list the available services:
wget http://localhost:6805/api/v1.0/services/list/
And this will list a directory of the user's Dropbox:
wget http://localhost:6805/api/v1.0/dropbox/exec/ls/path/to/folder/
Of course, the user must have connected their Dropbox account to the app first. To do so, your app must call this:
wget http://localhost:6805/api/v1.0/dropbox/connect/
This returns a URL, which you open to let the user authorize access to the service (an OAuth2 authorization mechanism). Then call this to finish the auth process:
wget http://localhost:6805/api/v1.0/dropbox/login/
My question is: how do I test the API? I mean functional tests. I could mock each service (Dropbox, for example), but that could be a lot of work, don't you think?
No answer yet, so I can report that, for now, the only way I have found is to use Selenium to simulate a browser.
I open a test page and type my test login/password, just like a human would.
Then I run the tests normally.