First day learning Viewflow: I managed to get the tutorial to work, but I have a use case that I don't know how to implement.
What I want is that when a workflow is started, the next task is automatically assigned to the user who started it. How do I go about referencing the current request object inside the workflow?
e.g.
start = flow.Start(CreateProcessView).Permission(auto_create=True).Next(this.fill_request)
fill_request = flow.View(UpdateProcessView).Assign(...)  # assign to the current user somehow
An .Assign(...) can be specified with a callable that takes a process activation and returns a user, e.g. .Assign(lambda act: User.objects.get(...)).
There are several callable shortcuts provided by Viewflow: this.[task_name].owner points to the user who completed that task, and activation.process.created_by points to the user who performed the .Start task.
fill_request = (
    flow.View(UpdateProcessView)
    .Assign(lambda act: act.process.created_by)
    # .Assign(this.start.owner)
)
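To show where the pieces fit, here is a minimal sketch of a complete flow class, loosely following the Viewflow 1.x tutorial layout. The names MyFlow and MyProcess and the trailing end node are assumptions, not part of the original question.

from viewflow import flow
from viewflow.base import this, Flow
from viewflow.flow.views import CreateProcessView, UpdateProcessView

from .models import MyProcess  # hypothetical Process subclass


class MyFlow(Flow):
    process_class = MyProcess

    # The user who starts the flow completes this task
    start = (
        flow.Start(CreateProcessView)
        .Permission(auto_create=True)
        .Next(this.fill_request)
    )

    # Assigned to whoever completed the start task, i.e. the workflow starter
    fill_request = (
        flow.View(UpdateProcessView)
        .Assign(this.start.owner)
        .Next(this.end)
    )

    end = flow.End()

Equivalently, .Assign(lambda act: act.process.created_by) from the snippet above could be used in place of this.start.owner.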
I am creating a bidding system where students can log in, request a tutor, and sign a contract via a Tkinter UI. I have managed to do this successfully. But now I am required to do the following:
"when a student or a tutor logs in to the system, the system should notify the user if there is a contract that is going to expire within a month."
I am having trouble implementing this feature: I feel it needs an observer, and I for the life of me can't wrap my head around the Observer design pattern. I have no problem writing the methods that fetch the contracts and determine which ones are expiring, but I can't figure out how to actually display the result using an observer. Can I get some guidance on this?
For reference, this is the method that would get all the contracts that are expiring within a month.
# Requires: from datetime import datetime  and  import dateutil.relativedelta
def expiring_contracts(self, username):
    user_contracts = self.all_contracts_by_user(username)
    expiring_contracts = []
    # Anything expiring before this cut-off counts as "expiring within a month"
    one_month = datetime.now() + dateutil.relativedelta.relativedelta(months=1)
    for i in user_contracts:
        time_string = i["expiryDate"]
        time_datetime = datetime.strptime(time_string, '%Y-%m-%dT%H:%M:%S.%f')
        if time_datetime < one_month:
            expiring_contracts.append(i)
    return expiring_contracts
the self.all_contracts_by_user is another method within this class that gets all the contracts by this user. I must implement a design pattern for this.
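A minimal sketch of the Observer pattern applied here might look like the following. All names (ContractExpiryNotifier, PopupObserver, etc.) are hypothetical; the only assumption is an object exposing the expiring_contracts(username) method shown above.

from tkinter import messagebox


class ExpiryObserver(object):
    """Interface for anything that wants to be told about expiring contracts."""

    def update(self, expiring_contracts):
        raise NotImplementedError


class PopupObserver(ExpiryObserver):
    """Concrete observer: shows the warning in a Tkinter message box."""

    def update(self, expiring_contracts):
        messagebox.showwarning(
            "Contracts expiring soon",
            "%d contract(s) expire within a month." % len(expiring_contracts),
        )


class ContractExpiryNotifier(object):
    """Subject: checks contracts when a user logs in and notifies all observers."""

    def __init__(self, contract_service):
        # contract_service is any object exposing expiring_contracts(username)
        self._contract_service = contract_service
        self._observers = []

    def attach(self, observer):
        self._observers.append(observer)

    def user_logged_in(self, username):
        expiring = self._contract_service.expiring_contracts(username)
        if expiring:
            for observer in self._observers:
                observer.update(expiring)

At login you would attach a PopupObserver (or any other observer, e.g. one that writes to a status bar) to the notifier and call user_logged_in(username); the notifier never needs to know how the warning is displayed, which is the point of the pattern.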
I started a complex project management application and I have the challenge of building resource permissions management for different types of user profiles.
My challenge is:
User story
John is a user with a common user profile.
John creates a project in the application.
John creates several tasks and adds them to the project.
John adds a user responsible for each task.
Added users must have access to the project and the tasks to which they have been added.
John creates a specific task and adds it as a subtask to one of the project's tasks.
In this subtask John adds a user as responsible; that user must automatically have access to the subtask, the task, and the project.
And at any time, John can restrict access to a project resource, such as specifying that a particular user can only view tasks.
The way I started:
I implemented the specification pattern for each use case: I pass in the relevant variables and it returns true or false.
However, I have to run this check for each resource, which in my opinion does not perform well.
What I described is one of the simplest cases; there are others that are more complex.
canEditTaskOnProject(): boolean {
  if (!this.project) {
    console.error(
      `Project not provided on ${TaskPermission.name}.${this.canEditTaskOnProject.name}`
    );
    return false;
  }

  return new ProjectLeader(this.project, this.userId)
    .or(new Creator(this.task, this.userId))
    .or(new FullAccessTaskPermission(this.project, this.userId))
    .or(new TaskResponsible(this.task, this.userId))
    .or(
      new RestrictTaskPermission(this.project, this.userId).and(
        new Creator(this.task, this.userId).or(
          new TaskResponsible(this.task, this.userId)
        )
      )
    )
    .or(
      new ReadAndWriteTaskPermission(this.project, this.userId).and(
        new TaskResponsible(this.task, this.userId)
      )
    )
    .isSatisfiedBy(this.userId);
}
I would very much like suggestions from experienced people who have already done something similar. I am a beginner in the area and in the company where I work, there are no seniors.
Thank you in advance!
I found that the best way was to use CASL:
import { defineAbility } from '@casl/ability';

export default defineAbility((can, cannot) => {
  can('read', 'Article');
  cannot('read', 'Article', { published: false }); // inverted rule
});
I need to:
1. Create a single page location application
2. Display all the assets present in the selected location in a table
3. Provide a button from which the user can navigate to WOTRACK to view all the work order(s) created on the selected location and its asset(s).
I am facing difficulty with the 3rd one. I have tried Launch in Context and it works fine, except that I am not able to pass an SQL query like 'location={location} and assetnum in ({asset.assetnum})'. I need to filter work orders by a particular location and all of its assets.
I tried saving all the assets in the location to a non-persistent attribute and passing the attribute's values in the Launch in Context URL. It works as expected, but to do so I wrote a script on 'Initialize value', which is causing performance issues.
The script goes like this:
from psdi.server import MXServer
from psdi.mbo import MboConstants

if app == "LOCATION1":
    if mbo.getString("LOCATION") is not None:
        Locsite = mbo.getString("SITEID")
        desc = mbo.getString("DESCRIPTION")
        MaxuserSet = MXServer.getMXServer().getMboSet("MAXUSER", mbo.getUserInfo())
        MaxuserSet.setWhere(" userid='" + user + "' ")
        MaxuserSet.reset()
        UserSite = MaxuserSet.getMbo(0).getString("DEFSITE")
        if Locsite == UserSite:
            AssetSet = mbo.getMboSet("ASSET")
            AssetSet.setFlag(MboConstants.DISCARDABLE, True)
            if not AssetSet.isEmpty():
                AssetList = ""
                AssetMbo = AssetSet.moveFirst()
                while AssetMbo is not None:
                    # build a URL-encoded, comma-separated list of asset numbers
                    AssetList = AssetList + str(AssetMbo.getString("ASSETNUM")) + "%2C"
                    AssetMbo = AssetSet.moveNext()
                # store in the non-persistent attribute referenced by the LIC URL
                mbo.setValue("non-persistant", str(AssetList), 11L)
In the LIC URL I have given: 'http://xx.x.x.xx/maximo/ui/?event=loadapp&value=wotrack&tabid=List&additionalevent=useqbe&additionaleventvalue=location={LOCATION}|assetnum={non-persistant}'
Is there any other feasible solution to the requirement?
Thanks in Advance
Launch In Context is better used for sending the user to an outside-of-Maximo application and passing along some data from inside-Maximo to provide context in that external app.
What you are doing sounds like a good place to use a workflow process with an Interaction node. The developer tells the Interaction node which app to take the user to and which Relationship to use to find the data the user should work with there.
Why don't you add a table control inside the table details (expanded table row) and show a list of the work orders there? From the WONUM in that table, you could have an app link to take the user to WOTRACK if they want more details about a particular work order. No customization (automation scripting) needed. No workflow needed. Nice and simple.
I have what seemed like a fairly simple requirement for a process, but I'm beginning to question whether it is even possible.
The image below shows my current process. I am trying to achieve two things:
A user creates an initial user task for adding a note; they should be able to add as many notes as they wish, with one user task per note.
A new sub-process is spawned for each new note (user task) that the user has created.
The process above presents the following problems:
A sub-process should be spawned for each task; however, they seem to overwrite each other.
I'm not sure whether each new sub-process spawned requires a unique id.
So it turns out that the solution to this question requires a bit of scripting using Groovy.
Below is the updated process model diagram. In it I start a new instance of the Complete Task process using a script task; then, if the user wishes to add more tasks, the exclusive gateway can either return the user to the Create Task (user task) OR finish the process.
Within the script task I clear down any values in the fields held by the user task before I pass the scope back to the user task.
The image below shows my Complete Task process, which gets called by the main process using a script.
Here I avoid using parallel gateways, preferring instead to create a new instance of the Create Task (user task) and a new instance of the Complete Task process (not a subprocess) by means of the script.
To start a new instance of the Complete Task process we have to start it using the function startProcessInstanceByKeyAndTenantId() on a RuntimeService instance, although I could also use startProcessInstanceByIdAndTenantId():
//Import required libraries
import org.activiti.engine.RuntimeService;
import org.activiti.engine.runtime.ProcessInstance;
//instantiate RunTimeService instance
RuntimeService runtimeService = execution.getEngineServices().getRuntimeService();
//get tenant id
String tenantId = execution.getTenantId();
//variables Map
Map<String, Object> variables = runtimeService.getVariablesLocal(execution.getProcessInstanceId());
//start process (processId, variables, tenantId)
ProcessInstance completeTask = runtimeService.startProcessInstanceByKeyAndTenantId("CompleteTask", variables, tenantId);
//Clear variables to create a fresh task
execution.setVariable("title", "");
execution.setVariable("details", "");
Using this approach I avoid creating multiple subprocesses from the parent process and instead create multiple processes that run separately from the parent process. This benefits me because, if the parent process completes, the others continue to run.
It seems like you are updating only one variable (or a single set of variables) as a result of each task. This will override the previous value. Use distinct variables, or prepend something to each variable name to mark it as unique for the completed task/sub-process. See collapsed sub-process.
Yes, each sub-process gets its own unique execution id, but the main execution ID or process instance ID remains the same.
I am working on Oracle 10gR2.
And here is my problem -
I have a procedure, let's call it *proc_parent* (inside a package), which is supposed to call another procedure, let's call it *user_creation*. I have to call *user_creation* inside a loop which reads some columns from a table; these column values are passed as parameters to the *user_creation* procedure.
The code is like this:
FOR i IN (SELECT community_id,
                 password,
                 username
            FROM customer
           WHERE community_id IS NOT NULL
             AND created_by = 'SRC_GLOB')
LOOP
    user_creation(i.community_id, i.password, i.username);
END LOOP;
COMMIT;
The user_creation procedure invokes a web service for some business logic and then, based on the response, updates a table.
I need to find a way to use multi-threading here, so that I can run multiple instances of this procedure to speed things up. I know I can use *DBMS_SCHEDULER* and probably *DBMS_ALERT*, but I am not able to figure out how to use them inside a loop.
Can someone guide me in the right direction?
Thanks,
Ankur
What you can do is submit lots of jobs at the same time. See Example 28-2, Creating a Set of Lightweight Jobs in a Single Transaction.
This fills a PL/SQL table with all the jobs you want to submit in one transaction, all at the same time. As soon as they are submitted (enabled) they will start running, as many as the system can handle, or as many as are allowed by a resource manager plan.
The overhead that lightweight jobs have is very... minimal/light.
I would like to close this question. DBMS_SCHEDULER as well as DBMS_JOB (though DBMS_SCHEDULER is preferred) can be used inside the loop to submit and execute the job.
For instance, here's a sample code, using DBMS_JOB which can be invoked inside a loop:
...
FOR i IN (SELECT community_id,
                 password,
                 username
            FROM customer
           WHERE community_id IS NOT NULL
             AND created_by = 'SRC_GLOB')
LOOP
    -- Embed the actual column values in the job's PL/SQL block; the loop
    -- variable i will not exist when the job later runs in its own session.
    DBMS_JOB.SUBMIT(
        JOB  => jobnum,
        WHAT => 'BEGIN user_creation(''' || i.community_id || ''',''' ||
                i.password || ''',''' || i.username || '''); END;');
    COMMIT;
END LOOP;
Using a commit after SUBMIT will kick off the job (and hence the procedure) in parallel.