I have multiple cucumber features that I want to run.
The step definitions are spread across multiple classes, and they share common data between them. For this purpose we use dependency injection (PicoContainer) to share the driver object and other data.
I understand that because of the PicoContainer we cannot run tests in parallel, as it would mess up the execution.
I want to know if there is any other way to run tests in parallel with this approach.
Project architecture:
Package Feature file: contains all the feature files.
Package StepDefination: contains multiple classes; the step definitions are split across those classes based on the microservices they interact with. This package also includes the ScenarioContext class that we use to share data.
Package Runner class: contains the runner class, with the features parameter pointing to the feature package.
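For reference, a ScenarioContext used this way is often little more than a wrapper around a map. The sketch below is a hypothetical, minimal version (the class and method names are mine, not taken from the project above); PicoContainer would create one instance per scenario and inject it into each step definition class through the constructor.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical shared-state holder: the DI container instantiates one per
// scenario and injects it into every step definition class that declares it
// as a constructor parameter, so all steps in a scenario see the same data.
public class ScenarioContext {
    private final Map<String, Object> data = new HashMap<>();

    public void put(String key, Object value) {
        data.put(key, value);
    }

    @SuppressWarnings("unchecked")
    public <T> T get(String key, Class<T> type) {
        return (T) data.get(key);
    }

    public boolean contains(String key) {
        return data.containsKey(key);
    }
}
```

Note that cucumber-picocontainer builds a fresh container, and therefore a fresh ScenarioContext, for each scenario, so scenario-scoped state itself is usually not what blocks parallel runs; shared static state, such as a single driver instance, more often is.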
Please share your ideas on my problem.
Related
I am trying to run two step definition files (Cucumber) simultaneously, and I applied the @Before annotation in one step definition file. When I run, the annotation is applied to both and affects the results. How can I restrict the annotation so it applies to only one step definition file?
All step definition files are parsed when Cucumber starts executing. Cucumber then uses the scenarios in your feature files to determine which step definitions to call. There is no concept of "running a step definition file".
A Before hook will run before each Scenario that Cucumber executes. It makes no difference which feature file the scenario is written in or which step definition file the hook is defined in.
You may want to consider using a Background instead, especially if a non-technical reader of the feature file would find the behaviours it describes important. A Background runs before each scenario written in the same feature file.
Alternatively, you might accomplish what you want by using conditional (or tagged) hooks. A conditional Before hook will only be run if the scenario has tags that satisfy the hook's tag expression.
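As a sketch of the conditional-hook idea (assuming the io.cucumber.java hook API; the tag names and method bodies here are hypothetical):

```java
import io.cucumber.java.Before;

// Hypothetical hooks class: each tag expression restricts its hook to the
// scenarios carrying matching tags, instead of running before every scenario.
public class Hooks {

    // Runs only before scenarios tagged @smoke
    @Before("@smoke")
    public void setUpSmokeEnvironment() {
        // e.g. start a lightweight in-memory service
    }

    // Runs only before scenarios tagged @chat but not @offline
    @Before("@chat and not @offline")
    public void connectChatBackend() {
        // e.g. open the connection shared by chat scenarios
    }
}
```

Tag the relevant scenarios (or whole features) in Gherkin with @smoke or @chat, and the corresponding hook will fire only for those scenarios.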
We have created a test interface that we use with all our Origen flows, with test flow methods defined on it. I believe there's possibly an issue with our test methods, and to debug I'd like to use the default func method rather than the one defined in our interface.
Where would I find this method and what is the correct way to integrate it?
Thanks
func is not a flow method provided by Origen; rather, it is a method commonly used in examples to indicate the notion of "a functional test".
The underlying API provided by Origen is flow.test(test_object, options), and within most test program source code the flow will be inferred, so often you will see code that simply calls test.
The test object can be just a name, or it can be an object representing the test like a TestInstance or TestSuite object provided by the Teradyne and Advantest APIs respectively.
You can create a test program flow (and documentation for it) using only the test method; you can see examples of that in the source code for this video on how to create test program flows: http://origen-sdk.org/origen/videos/5-create-program-flow/
A test program, however, comprises more than just a flow, and you would normally also want the other files that define the test to be generated in addition to the flow.
In time, we hope the Origen community will produce libraries of off-the-shelf methods like func, which would generate the complete test as well as insert it into the flow; currently, though, it is the application's responsibility to create such methods within its test program interface.
See this example source code for how to create a func method that can target multiple tester types - http://origen-sdk.org/origen/videos/6-create-program-tests/
To start with, you shouldn't worry about the multiple-tester aspect; just refer to this guide for the APIs available to generate the test program artefacts for creating tests on the V93K - http://origen-sdk.org/origen/guides/program/v93k/
How does Cucumber initialize a step definition Java class that corresponds to a feature file with multiple scenarios, where all scenarios share a Background (Given condition) that initializes an instance variable?
My question is: does Cucumber create one object of the step definition Java class per scenario in the feature file?
If my understanding is correct, does that impact execution performance? Can it be improved while keeping thread safety intact?
I'd like to have a module that can be developed in an isolated environment but still remains a module that can be plugged into another project.
The idea: Currently I have a state-machine-driven modular project where every module is defined by DSL, so the main project has its context, command mappings, and state machine. Now one of the modules will become essentially the same thing: it'll have its own context, its own child modules, and its own DSL definition, separated from the main context.
Is this possible?
Is there some best practice for automatically forwarding events from the main context through the module to the module context?
Is there a way to map the module's private dispatcher as the dispatcher for the isolated context?
It seems to be completely possible.
Since I didn't find any documentation or examples for this use case, I think there is no established best practice.
As far as I understand, it's possible to create your own application context class that exposes the ability to override the context dispatcher. But that won't solve much, because command triggers can only listen to modules and not to the whole context.
So I assume the best way to solve this is to create a separate communication module that is mapped inside the "child" DSL; the "parent" module then locates it through the core factory of the "child" assembler and triggers events through it. This also makes the communication more testable, because it channels all communication through a single point where it can easily be tested, mocked, or observed, and it abstracts away the implementation and events of the "parent" application.
HexMachina only supports one context per application (parent-child contexts should be supported in the future). I'm not certain I understand exactly what you want, but let's start with a few things.
Communication between modules
Modules have two dispatchers: one internal, for all internal communication with the FrontController, and one public, for all external communication.
To communicate between modules, one module has to subscribe to the other. In the DSL, it is defined like this:
<chat id="chat" type="hex.ioc.parser.xml.mock.MockChatModule">
<listen ref="translation"/>
</chat>
<translation id="translation" type="hex.ioc.parser.xml.mock.MockTranslationModule">
<listen ref="chat">
<event static-ref="hex.ioc.parser.xml.mock.MockChatModule.TEXT_INPUT" method="onSomethingToTranslate"/>
</listen>
</translation>
In this example, when the chat module calls dispatchPublicMessage(MockChatModule.TEXT_INPUT, ["data"]), the onSomethingToTranslate(textToTranslate : String) method of the translation module is executed.
Split DSL in many files
You can use context inclusion and conditional attributes to organize your DSL files by "component" and define what you want to use at compile time.
<root name="applicationContext">
<include if="useModuleA" file="context/ModuleA.xml"/>
</root>
The conditional attribute's value is defined by compiler flags (-D useModuleA=1) or directly in code; check this link.
Driving many modules with the state machine
If you want to drive many modules on one state change, you can use a command to manage that.
<state id="assemblingEnd" ref="applicationContext.state.ASSEMBLING_END">
<enter command-class="hex.ioc.parser.xml.assembler.mock.MockStateCommand" fire-once="true"/>
</state>
I hope this can help you. Let me know if you want more detail.
Suppose you have a Parser class that reads a file and does something with the data it contains. On a diagram, how do you show that it gets data from some entity that is not represented by a class but exists separately, in this example as a file?
Assuming you want to show the structure: use a class or interface, as UML does not have to mean a Java class. You can also use an artifact, which is more part of the deployment notation but is fine to use elsewhere. If you think about it, a file is a fairly concrete concept, especially if it has a name.
From the OMG UML spec:
10.3.1 Artifact (from Artifacts, Nodes)
An artifact is the specification of a physical piece of information
that is used or produced by a software development process, or by
deployment and operation of a system. Examples of artifacts include
model files, source files, scripts, and binary executable files, a
table in a database system, a development deliverable, or a
word-processing document, a mail message.
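As an illustration, in PlantUML-style notation (the file name and member names here are made up), the Parser's dependency on a file artifact could be drawn roughly like this:

```
@startuml
artifact "records.csv" as dataFile
class Parser {
  +parse()
}
Parser ..> dataFile : reads
@enduml
```

The dashed arrow is an ordinary UML dependency: the Parser uses the artifact, and the artifact keeps its concrete, named-file identity on the diagram without pretending to be a class.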