How to perform bulk actions in CQRS? - cron

I need to perform a certain action on multiple entities (via a cron job). Do I run a command in a loop, each in its own transaction, or is that bad practice?

Imagine yourself not automating this task. Would you sit behind your computer executing this loop yourself by clicking buttons from a GUI and sending these commands separately?
If so, I think an automated loop executing these commands does nothing different from that.
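In code, the cron job is then nothing more than that same loop. A minimal sketch (the bus, command, and IDs here are hypothetical illustrations, not any specific framework):

```python
from dataclasses import dataclass

@dataclass
class DeactivateEntity:
    entity_id: int

class CommandBus:
    """Minimal in-memory command bus, for illustration only."""
    def __init__(self):
        self.handlers = {}
        self.log = []

    def register(self, command_type, handler):
        self.handlers[command_type] = handler

    def dispatch(self, command):
        # Each dispatch is handled independently; in a real system each
        # command handler would open and commit its own transaction.
        self.handlers[type(command)](command)

bus = CommandBus()
bus.register(DeactivateEntity, lambda cmd: bus.log.append(cmd.entity_id))

# The cron job is just a loop sending the same commands a user would:
stale_ids = [101, 102, 103]  # hypothetical result of a query
for entity_id in stale_ids:
    bus.dispatch(DeactivateEntity(entity_id))
```

Because each command runs in its own transaction, a failure partway through the loop leaves the already-processed entities committed, exactly as if a user had clicked through them one by one.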

Related

Testing Mass Update Scripts

I'm going to start writing my first mass update script and am currently using a sandbox account. I am wondering what the best way is to test the script, considering it will make changes to a lot of items. Is it possible to test the script on a small sample?
When you execute a Mass Update, you have to build a search in the UI first, then you manually select which search results the script should process. Thus, the best way to test your script is to only select one result at a time, or just a few results. Perhaps build a few test records that match your search criteria and process those.
At any rate, with Mass Updates, you are in full control over which results are processed each time you execute the script.
What I often do (this is dead simple in SuiteScript 1.0) is create a companion Suitelet interface that just calls the mass update function.
That way you pass a record ID, via the Suitelet parameters, to the mass update function and it runs immediately. In another tab you can make changes, etc., and then just refresh the Suitelet when you want to run it again.
I find this more convenient when developing than going through the mass update interface for each development iteration.
I always test it by simply feeding in an internal ID to the script, in the debugger. That is pretty much how things will run with a mass update.

How to write a DB2 function for a Linux shell script

I have a trigger on one of my DB2 tables. What I need is: every time that trigger fires, it should invoke a Linux shell script.
How do I do this -- the same as any other process? If my trigger were to invoke a Java process instead of a shell script, putting the bytecode (.class file) of that process into ..SQLLIB/function and defining a function for it would do the job.
Is it much different for a Linux script? Any subtleties?
I don't have Linux access for another few days, but deployment is right around the corner and I'm nervous about it just the same.
TIA.
You cannot invoke a shell script from SQL, including from a trigger. What you can do is create a Java UDF the way you described, then use Runtime.exec() to invoke the script.
Note that your approach, apart from introducing security risks, can affect data consistency (a transaction can be rolled back, but you cannot "unrun" the script) and concurrency (a transaction with all its acquired locks will have to wait until your script returns).
A better approach would be to use an asynchronous process to decouple the external action from the database transaction. As an example, your trigger might insert a record into a log table, and the external process would read that log table on schedule, perform the required action when a new record is found, then delete the processed record.
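The decoupled approach above can be sketched with a log table and a polling worker. This is an illustration only (Python with SQLite as a stand-in for DB2; table and column names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE action_log (id INTEGER PRIMARY KEY, payload TEXT)")

# The trigger's only job: record that work is needed (fast, transactional).
conn.execute("INSERT INTO action_log (payload) VALUES (?)", ("refresh cache",))
conn.commit()

def process_pending(conn, run_action):
    """Poll the log table, run the external action, then delete the row."""
    processed = []
    rows = conn.execute("SELECT id, payload FROM action_log ORDER BY id").fetchall()
    for row_id, payload in rows:
        run_action(payload)  # e.g. subprocess.run(["myscript.sh", payload])
        conn.execute("DELETE FROM action_log WHERE id = ?", (row_id,))
        processed.append(payload)
    conn.commit()
    return processed
```

The key property: the trigger's insert commits or rolls back together with the rest of the transaction, and the external script only ever runs for rows that actually committed.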
There is a good article, Making Operating System Calls from SQL, which includes sample code.

Loop until data set is not in use with JCL

I am working on a mainframe and I need to wait until a dataset is released before automatically executing a job. Do you know any simple way in JCL to loop until a dataset is no longer in use? I looked on the web and found some solutions with REXX, but they seemed too complicated for something as simple as what I need. Also, I have never used REXX.
Regards!
P.S.: the data set might also not exist yet.
Edit: I need this because I run an XCOM job which transfers a file from another system to a mainframe dataset. The problem is that when this job finishes, the file may still be being transferred, and I would like to wait for the transfer to complete before starting the next job. Perhaps by editing a statement in the next job associated with the dataset.
The easy way to do this is to ensure that your file transfer package allocates the dataset with an OLD disposition; that will create a system-level enqueue on the dataset and prevent your job from running until the enqueue is released.
Many file transfer packages offer some sort of 'file complete' exit that can also trigger a job once a dataset transmission is fully complete.
But you can't loop in pure JCL. You can in REXX, but that brings a host of issues of its own to deal with; it is not at all simple.
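If the transfer package allocates the dataset with DISP=OLD, the waiting job only needs to reference the same dataset with DISP=OLD itself; the initiator's exclusive enqueue makes it wait. A minimal sketch (the job name and dataset name are hypothetical):

```
//WAITJOB  JOB (ACCT),'WAIT ON XFER',CLASS=A,MSGCLASS=X
//*
//* IEFBR14 does nothing; the exclusive enqueue requested by DISP=OLD
//* on the DD below makes this job wait until the transfer releases
//* the dataset.
//WAIT     EXEC PGM=IEFBR14
//HOLD     DD   DSN=PROD.XFER.INPUT,DISP=OLD
```

One caveat relevant to the poster's P.S.: if the dataset does not exist yet, DISP=OLD fails with a JCL error at allocation rather than waiting, so this only works once the transfer has allocated the dataset.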

How to pause a php script launched with crontab?

I have a PHP scraper running every night against a very large site. A crontab entry launches the script at 2am, and another pkills it at 7am. Now I am concerned that brutally killing the script might result in data loss. Say crontab kills the script while it is busy writing scraped data into the database; the next day the database will then reject that last/first record because it is already present (even if incompletely).
Is there any way I can freeze the script with crontab? (That is, without adding a sleep() to my script)
Let's say that crontab calls the script off while the script is busy writing my scraped data into the database
That would be a problem: if you stop the process externally, you can end up with an aborted transaction or partially written data. A better way is to let the script halt or pause on its own. You could, for example, define a marker file that the script checks periodically, so that it can halt or pause in a controlled way.
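A minimal sketch of the marker-file idea (Python for illustration; the marker path and the batch processing are hypothetical stand-ins):

```python
import os

STOP_MARKER = "/tmp/scraper.stop"  # hypothetical path; cron can `touch` it at 7am

def scrape_batches(batches, marker=STOP_MARKER):
    """Process work in small units, checking for the stop marker between them."""
    done = []
    for batch in batches:
        if os.path.exists(marker):
            break                  # halt cleanly between batches, never mid-write
        done.append(batch.upper()) # stand-in for "scrape and commit one batch"
    return done
```

The 7am crontab entry then becomes `touch /tmp/scraper.stop` instead of a pkill, and the script stops at the next batch boundary with the database in a consistent state.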
Having one large cronjob that can't be interrupted is usually a sign of bad design for a number of reasons.
Most notably, you can't interrupt the run at all, for any reason, without ending up with corrupted data. This can become a big problem if you have an unexpected power loss or server crash.
Also, it doesn't scale. If you need to process more data, you can't spread the work across multiple servers. If you have run times of a few hours now, you may soon end up saturating an entire server.
I would recommend seriously rethinking the functionality of this cronjob and restructuring it into a number of smaller tasks that are queued up somewhere (it can even be the database). You could then mask the SIGINT and SIGTERM signals while processing a single task and check for received signals between tasks. This allows you to notify the process with either of those signals and have it shut down gracefully.
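The signal-between-tasks idea can be sketched like this (Python for illustration; the task queue is a stand-in, and here the handler simply defers the shutdown rather than masking the signal outright):

```python
import signal

shutdown_requested = False

def request_shutdown(signum, frame):
    # Only record the request; the loop finishes its current task first.
    global shutdown_requested
    shutdown_requested = True

signal.signal(signal.SIGTERM, request_shutdown)
signal.signal(signal.SIGINT, request_shutdown)

def run_queue(tasks):
    """Process queued tasks one at a time, checking for signals in between."""
    results = []
    for task in tasks:
        if shutdown_requested:
            break                 # stop between tasks, never in the middle of one
        results.append(task())    # one small, independently committed unit of work
    return results
```

Each task commits its own work, so a SIGTERM from cron at 7am stops the run at a task boundary and nothing is left half-written.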
That being said, things do break and servers do crash. I also urge you to work out plans for data recovery in case the cronjob breaks down while working on something.

How to complete SharePoint 2010 state machine workflow when all tasks completed?

I am new to SharePoint. Sorry if answer to my questions is obvious.
First question: I have a state machine workflow which creates about 30 tasks (some of them are created after the previous one completes, using an OnTaskChanged activity). I have to log task changes and complete the workflow when all tasks are completed. I see two ways to do it:
1) I can create an EventDrivenActivity for each of the 30 tasks:
OnTaskChanged
--Code (log changes)
--If (allTasksCompleted()) // not code, but an activity whose
----then SetState(Completed); // condition calls allTasksCompleted() in code
But I don't think this is a good way, because I can't reuse the code and have to repeat the same steps 30 times.
2) I can do the logging in code and then, if needed, complete the workflow from code, but I don't know how to do that. I can cancel a workflow from code:
SPWorkflowManager.CancelWorkflow(itemWorkflow);
but I can't find any information on how to complete it (or set its state to "Completed"). Maybe I am doing something wrong and the workflow should complete itself once all tasks are completed, but that does not happen (it stays "In Progress").
Second question: is there any way to run some code after every change to the workflow's tasks (as far as I understand, OnWorkflowChanged and OnWorkflowModified are not suitable for my needs), or to programmatically add a handler to my 30 tasks (not to the Tasks list as a whole, but only to my tasks)?
Thank you in advance.
Best regards
Mikhail.
PS: sorry for my writing. English is not my native language.
