Automatic Server Inquiry Classic ASP - IIS

I have an admin system where I can send emails with my lead info in them.
What I'm trying to achieve is an automatic follow-up: 10 minutes after I send the email, another email should be sent. The page with the action that sends the second email is ready, but how do I "activate" that action without logging into the system and doing it manually?
I'm using IIS with Classic ASP.

You'll need a batch file and a Windows scheduled task for this.
Run a batch file from ASP: http://bytes.com/topic/asp-classic/answers/442265-asp-run-command-line
The batch command for scheduled tasks: "Make server automatic run asp-script every day"
How to add 5 (or 10) minutes to the current time: "Adding to %TIME% variable in windows cmd script"
I'll let you read and combine all that.

Related

Schedule mail based on condition

I need to schedule a mail every 6 hours whenever a user's balance falls below a threshold. I'm trying to use the node-cron package, but every time my server restarts it will start rescheduling, and I don't want that to happen.
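One hedged sketch of a way around this (the findLowBalanceUsers, sendMail and markMailSent helpers, the 15-minute check interval and the threshold value are all made up for illustration): keep a single recurring node-cron job and make the send idempotent against persisted state, so a restart merely re-creates the same job instead of duplicating mails.

    // Hypothetical sketch: one recurring job plus a persisted "last sent"
    // timestamp, so restarts never cause duplicate scheduling.
    const cron = require('node-cron');

    const SIX_HOURS_MS = 6 * 60 * 60 * 1000;
    const THRESHOLD = 100; // assumed balance threshold

    // findLowBalanceUsers, sendMail and markMailSent are placeholders for
    // your own DB and mailer code.
    cron.schedule('*/15 * * * *', async () => {          // check every 15 minutes
      const users = await findLowBalanceUsers(THRESHOLD);
      const now = Date.now();
      for (const user of users) {
        const lastSent = user.lastLowBalanceMailAt ? user.lastLowBalanceMailAt.getTime() : 0;
        if (now - lastSent >= SIX_HOURS_MS) {            // at most one mail per 6 hours
          await sendMail(user.email, 'Your balance is low');
          await markMailSent(user.id, new Date(now));    // persist so a restart is harmless
        }
      }
    });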

HTTP Web Server: Agent did not complete within configured time limit

I have a web application that builds web pages using an agent (it's written in LotusScript and we use [print html] to output HTML), and from time to time I see an error like the one below.
02-11-2020 10:00:18 HTTP Web Server: Agent did not complete within configured time limit [/path-to-database.nsf/web?openagent] Anonymous
02-11-2020 10:00:18 HTTP Server: Execution time limit exceeded by Agent '(Web)|Web' in database '/path-to-database.nsf'. Agent signer 'signer name'.
As a result the HTTP task gets stuck, so I have to restart it, but that means I have to monitor it all the time.
It does not seem to be related to agent execution time; otherwise I would have this issue constantly.
Activity does not seem to be the issue either; according to Google Analytics it's around ~50 active users.
I doubt [Server Tasks\Agent manager] will help, because the agent runs under the HTTP task.
Does anybody know how to figure out the reason for this issue and where I should dig to fix it?
Update
Domino version 11.0
The agent is triggered by anonymous visitors and does some relatively heavy computation to construct the HTML response (loops and lookups are present, but I'm sure all loops end properly, with no infinite runs).
I guess the settings for HTTP agents are under this section (so 2 minutes):
Web Agents and Web Services
Run web agents and web services concurrently? Enabled
Web agent and web services timeout: 120 seconds
In general a request takes between 300 ms and 1 second; however, there are some heavy pages taking 1-5 seconds (but nothing like 10 seconds or more).
I notice the error only when we get more than 50 active users (who actively open new pages and thus trigger the agent).
I guess Richard is right and there must be some condition where the agent gets stuck (maybe related to view updates or some background process).
For now I simply restart the HTTP task to fix the issue (for some time).
So my question could be rephrased as:
What can cause a delay in the agent that builds the web page (taking into account that it happens with 50-100 active users)?
Thanks a lot :-)

Google Cloud Platform: Running a several-hour scraping script

I have a NodeJS script that scrapes URLs every day.
The requests are throttled to be kind to the server. This results in my script running for a fairly long time (several hours).
I have been looking for a way to deploy it on GCP. Because it was previously run from cron, I naturally had a look at how to run a cron job on Google Cloud. However, according to the docs, the script has to be exposed as an API, and HTTP calls to that API can only run for up to 60 minutes, which doesn't fit my needs.
I had a look at this S.O. question, which recommends using a Cloud Function. However, I am unsure this approach would be suitable in my case, as my script requires a lot more processing than the simple server monitoring job described there.
Does anyone have experience doing this on GCP?
N.B.: To clarify, I want to avoid deploying it on a VPS.
Edit:
I reached out to Google; here is their reply:
Thank you for your patience. Currently, it is not possible to run a cron script for 6 to 7 hours in a row, since the current limitation for cron in App Engine is 60 minutes per HTTP request.
If it is possible for your use case, you can spread the 7 hours across recurring tasks, for example every 10 minutes or 1 hour. A cron job request is subject to the same limits as those for push task queues. Free applications can have up to 20 scheduled tasks. You may refer to the documentation for the cron schedule format.
Also, it is possible to still use Postgres and Redis with this. However, kindly take note that Postgres is still in beta.
As I can't spread the task, I had to keep on managing a Dokku VPS for this.
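For workloads that can be split the way Google suggests, here is one hedged sketch of the idea (the loadCursor, saveCursor, loadUrlList and scrapeOne helpers, and the 8-minute budget, are hypothetical): each cron-triggered request processes URLs until it nears the deadline, persists a cursor, and the next hit resumes from there.

    // Hypothetical sketch of a resumable scrape: each cron-triggered request
    // works for ~8 minutes, saves its position, and exits cleanly.
    const express = require('express');
    const app = express();

    const RUN_BUDGET_MS = 8 * 60 * 1000; // stay well under the per-request limit

    app.get('/tasks/scrape', async (req, res) => {
      const started = Date.now();
      let cursor = await loadCursor();      // e.g. an index stored in Postgres/Redis
      const urls = await loadUrlList();     // the full list scraped every day

      while (cursor < urls.length && Date.now() - started < RUN_BUDGET_MS) {
        await scrapeOne(urls[cursor]);      // your existing throttled scraping code
        cursor += 1;
        await saveCursor(cursor);           // persist progress after each URL
      }

      res.status(200).send(cursor >= urls.length ? 'done' : `paused at ${cursor}`);
    });

    app.listen(process.env.PORT || 8080);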
I would suggest combining two services, GAE Cron Jobs and Cloud Tasks.
Use GAE cron jobs to publish a list of sites and ranges to scrape to Cloud Tasks. This initialization process doesn't need to be 'kind' to the server yet; it can simply publish all chunks of work to the Cloud Tasks queue and consider itself finished when that's done.
Follow that up with a task queue, and use the queue's rate-limiting configuration as the way to limit the overall request rate to the endpoint you're scraping. If you need less than 1 QPS, add a sleep statement directly in your code. If you're really queueing millions or billions of jobs, follow their advice of having one queue feed another:
Large-scale/batch task enqueues
When a large number of tasks, for example millions or billions, need to be added, a double-injection pattern can be useful. Instead of creating tasks from a single job, use an injector queue. Each task added to the injector queue fans out and adds 100 tasks to the desired queue or queue group. The injector queue can be sped up over time, for example start at 5 TPS, then increase by 50% every 5 minutes.
That should be pretty hands-off, and only requires you to think through how the cron job pulls the next desired sites and pages, and how small the chunks of work should be.
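A hedged sketch of the enqueue side using the @google-cloud/tasks Node.js client (the project, location and queue names and the worker URL are placeholders, not anything from the question): the cron-triggered handler just pushes one task per chunk of pages and exits, and the queue's own rate limits then pace the scraper.

    // Hypothetical sketch: a cron-triggered job enqueues one Cloud Task per chunk.
    const {CloudTasksClient} = require('@google-cloud/tasks');
    const client = new CloudTasksClient();

    async function enqueueChunks(chunks) {
      // 'my-project', 'us-central1' and 'scrape-queue' are placeholder names.
      const parent = client.queuePath('my-project', 'us-central1', 'scrape-queue');

      for (const chunk of chunks) {
        await client.createTask({
          parent,
          task: {
            httpRequest: {
              httpMethod: 'POST',
              url: 'https://scraper-worker.example.com/scrape',   // your worker endpoint
              headers: {'Content-Type': 'application/json'},
              body: Buffer.from(JSON.stringify(chunk)).toString('base64'),
            },
          },
        });
      }
    }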
I'm also working on this kind of task. I need to crawl websites and have the same problem.
Instead of running the main crawler task on the VM, I moved the task to Google Cloud Functions. The task consists of getting the target URL, scraping the page, saving the result to Datastore, and then returning the result to the caller.
This is how it works: I have a long-running application that can be called the master. The master knows which URLs we are going to access, but instead of accessing the target website itself, it sends the URL to a crawler function in GCF. The crawling is done there and the result is sent back to the master. In this case, the master only requests and receives a small amount of data and never touches the target website; the rest is left to GCF. You can offload your master and crawl the website in parallel via GCF, or you can use another method to trigger GCF instead of an HTTP request.
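A hedged sketch of the GCF side of that pattern (the function name crawl, the Page Datastore kind and the use of axios are assumptions for illustration, not the poster's actual code): the master POSTs a URL, the function fetches and stores the page in Datastore, and returns only a small summary.

    // Hypothetical sketch of the crawler Cloud Function described above.
    const axios = require('axios');
    const {Datastore} = require('@google-cloud/datastore');
    const datastore = new Datastore();

    // HTTP-triggered function: the master sends { "url": "https://..." }.
    exports.crawl = async (req, res) => {
      const url = req.body.url;
      const response = await axios.get(url);          // fetch the target page

      const entity = {
        key: datastore.key(['Page']),                 // 'Page' is a placeholder kind
        excludeFromIndexes: ['html'],                 // large strings can't be indexed
        data: {url, html: response.data, fetchedAt: new Date()},
      };
      await datastore.save(entity);                   // full result stays in Datastore

      // The master only gets a small summary back, never the whole page.
      res.status(200).json({url, bytes: response.data.length});
    };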

Using Windows Task Scheduler, how can I execute my exe file after an IIS reset

I have a scenario where I need to execute an .exe after an IIS reset. In Windows Task Scheduler there is an option to run an executable file on a condition, where we can select the IIS configuration/log and provide an event ID to match.
I searched through a couple of options but did not find any.
Can anybody suggest how I can run my executable file after an IIS reset happens, using Task Scheduler?
As per this link,
3201 is start
3202 is stop
3201 IIS start command received from user %1. The logged data is the status code.
3202 IIS stop command received from user %1. The logged data is the status code.
Since IISReset is a stop-and-start operation, you can have the trigger based on the start event, i.e., 3201 (per the event descriptions above).
Set your task trigger on that event, and you can cross-check Event Viewer for the events logged by IISReset.

Schedule task for Node.js web application

I have made a real-time web application that connects to a Node.js server through a WebSocket. On my website I can turn an LED connected to an Arduino Uno on and off.
What I want is for my website to be able to turn the LED on/off at a given date and time dynamically. By 'dynamically' I mean that I can add new scheduled tasks or remove existing ones.
I have been trying node-schedule and cron, but those only give me static scheduled tasks; I can't change them or add new ones.
Use a DB or a file. You can store the dates in a JSON file and then edit it at your convenience. Use node-cron to create the events you want from that data. Create a function that removes an entry from the JSON when you want to, and also removes it from the upcoming tasks via the task.destroy() method of node-cron.
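A hedged sketch of that idea (schedules.json and the toggleLed stub are made-up names; note that newer node-cron releases use task.stop() where older ones had task.destroy()):

    // Hypothetical sketch: schedules live in a JSON file; node-cron tasks are
    // created from it and can be removed at runtime.
    const fs = require('fs');
    const cron = require('node-cron');

    const FILE = './schedules.json';   // e.g. [{ "id": "1", "expr": "0 18 * * *", "state": "on" }]
    const tasks = new Map();           // id -> running node-cron task

    function toggleLed(state) {        // stand-in for your WebSocket/Arduino code
      console.log(`LED ${state}`);
    }

    function loadSchedules() {
      return JSON.parse(fs.readFileSync(FILE, 'utf8'));
    }

    function startAll() {
      for (const s of loadSchedules()) {
        const task = cron.schedule(s.expr, () => toggleLed(s.state));
        tasks.set(s.id, task);
      }
    }

    function removeSchedule(id) {
      const task = tasks.get(id);
      if (task) {
        task.stop();                   // task.destroy() in older node-cron releases
        tasks.delete(id);
      }
      const remaining = loadSchedules().filter(s => s.id !== id);
      fs.writeFileSync(FILE, JSON.stringify(remaining, null, 2));  // keep the JSON in sync
    }

    startAll();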
https://github.com/kdichev/Green-Systems/blob/development/PumpController.js
Check what I have done with my pump: on line 19 I have an array of times that will run the pump according to the entries given.
