Random string in cucumber scenarios - cucumber

I am testing a GUI using cucumber. I need to test CRUD operations of the GUI.
When I write a scenario that creates a new entity in the GUI, I cannot run it more than once: the second run fails because the ID I specified for the entity already exists in the system (it was created during the first run).
The system I am testing doesn't allow deleting entities; it has to be started in a special mode for deletions, so deleting the created entity after the test is not an option.
It would be great if I could use a random string for the entity ID. For example:
When user creates a new Branch with following values:
|Branch ID|<random_string_1>|
|Address|1, abc, def.|
|Telephone|01111111111|
And user searches for a branch by "Branch ID" = "<random_string_1>"
Then branch details should be as follows
|Branch ID|<random_string_1>|
|Address|1, abc, def.|
|Telephone|01111111111|
Is there an option in cucumber to do something like this? Or, is there any other way I can achieve this?

In the end, I've added a RandomStringTransformer class to the test suite:
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

import cucumber.api.DataTable;
import cucumber.api.Transformer;

public class RandomStringTransformer extends Transformer<String> {

    // Key -> random string, shared so the same token always resolves to the same value
    private static final Map<String, String> RANDOM_STRINGS = new HashMap<>();

    public static final RandomStringTransformer INSTANCE = new RandomStringTransformer();

    @Override
    public String transform(String input) {
        return transformString(input);
    }

    public DataTable transform(DataTable dataTable) {
        dataTable.getGherkinRows().forEach(dataTableRow -> dataTableRow.getCells().replaceAll(this::transformString));
        return dataTable;
    }

    private String transformString(String input) {
        final String[] inputCopy = {input};
        Map<String, String> replacements = new HashMap<>();
        Matcher matcher = Pattern.compile("(<random_string_[^>]*>)").matcher(input);
        while (matcher.find()) {
            String group = matcher.group(0);
            replacements.put(group, RANDOM_STRINGS.computeIfAbsent(group, key -> Utilities.getNextUniqueString()));
        }
        replacements.forEach((key, value) -> inputCopy[0] = inputCopy[0].replace(key, value));
        return inputCopy[0];
    }
}
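The Utilities.getNextUniqueString() helper referenced above isn't shown in the original post; a minimal sketch of one possible implementation (the class name and the "AUTO-" prefix are assumptions, the idea is just a per-run counter plus a UUID so values never repeat across runs):
import java.util.UUID;
import java.util.concurrent.atomic.AtomicLong;

public final class Utilities {

    private static final AtomicLong COUNTER = new AtomicLong();

    private Utilities() {
    }

    // Hypothetical implementation: combines a per-run counter with a UUID so
    // generated IDs stay unique across test runs, e.g. "AUTO-1-550e8400...".
    public static String getNextUniqueString() {
        return "AUTO-" + COUNTER.incrementAndGet() + "-"
                + UUID.randomUUID().toString().replace("-", "");
    }
}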
And used the transformer in the step definition:
#When("^user creates a branch of name "([^"]*)" with following values$")
public void branchIsCreatedWithDetails(#Transform(RandomStringTransformer.class) String branchName, DataTable fieldValues) {
fieldValues = RandomStringTransformer.INSTANCE.transform(fieldValues);
//Now, fieldValues table values and branchName are replaced with random values if they were in format <random_string_SOMETHING>
}

The @Transform annotation is not supported in Cucumber 3 anymore.
You have to transform the data manually in the method body.
@When("^user creates a branch of name \"([^\"]*)\" with following values$")
public void branchIsCreatedWithDetails(String branchName, DataTable fieldValues) {
    branchName = RandomStringTransformer.INSTANCE.transform(branchName);
    fieldValues = RandomStringTransformer.INSTANCE.transform(fieldValues);
    // Now, fieldValues table values and branchName are replaced with random values if they were in format <random_string_SOMETHING>
}
Read this for more information on migrating: http://grasshopper.tech/98/
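For later Cucumber versions (5+), a sketch of the same idea using a Cucumber expression and @ParameterType from io.cucumber.java; this is not from the original answer, and resolveRandomTokens is a hypothetical helper holding the regex-replacement logic shown in the transformer above:
@ParameterType(".*")
public String randomised(String value) {
    // resolveRandomTokens is assumed to contain the <random_string_...> replacement logic
    return resolveRandomTokens(value);
}

@When("user creates a branch of name {randomised} with following values")
public void branchIsCreatedWithDetails(String branchName, DataTable fieldValues) {
    // the DataTable still has to be walked cell by cell, as in the snippet above
}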

Related

Cucumber V5-V6 - passing complex object in feature file step

So I have recently migrated to v6 and I will try to simplify my question
I have the following class
@AllArgsConstructor
public class Songs {
    String title;
    List<String> genres;
}
In my scenario I want to have something like:
Then The results are as follows:
|title |genre |
|happy song |romance, happy|
And the implementation should be something like:
#Then("Then The results are as follows:")
public void theResultsAreAsFollows(Songs song) {
//Some code here
}
I have the default transformer
@DefaultParameterTransformer
@DefaultDataTableEntryTransformer(replaceWithEmptyString = "[blank]")
@DefaultDataTableCellTransformer
public Object transformer(Object fromValue, Type toValueType) {
    ObjectMapper objectMapper = new ObjectMapper();
    return objectMapper.convertValue(fromValue, objectMapper.constructType(toValueType));
}
My current issue is that I get the following error: Cannot construct instance of java.util.ArrayList (although at least one Creator exists)
How can I tell Cucumber to interpret specific cells as lists, while keeping everything in the same step rather than splitting it apart? Or better, how can I send an object into a step that contains variables of different types such as List, HashSet, etc.?
If I replace the list with a String, everything works as expected.
@M.P.Korstanje thank you for your idea. If anyone is trying to find a solution for this, here is the way I did it as per the suggestions received. I inspected the type fromValue has and updated the transform method into something like:
if (fromValue instanceof LinkedHashMap) {
    Map<String, Object> map = (LinkedHashMap<String, Object>) fromValue;
    for (String key : map.keySet()) {
        if (key.equals("genres")) {
            List<String> genres = Arrays.asList(map.get(key).toString().split(",", -1));
            map.put("genres", genres);
        }
    }
    // convert once all keys have been inspected
    return objectMapper.convertValue(map, objectMapper.constructType(toValueType));
}
It is somewhat specific, but I could not find a better solution :)
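For reference, another way to handle this (not the approach above, just a sketch using Cucumber's io.cucumber.java.DataTableType and the Lombok @AllArgsConstructor on Songs) is to register a dedicated entry transformer that splits the genre cell, and let the step take a List<Songs>:
@DataTableType
public Songs songsEntry(Map<String, String> entry) {
    // split the comma-separated "genre" column into the genres list
    return new Songs(entry.get("title"),
            Arrays.asList(entry.get("genre").split(",\\s*")));
}

@Then("The results are as follows:")
public void theResultsAreAsFollows(List<Songs> songs) {
    // assertions against the converted rows go here
}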

eliminating embedded actions from antlr4 grammar

I have an antlr grammar in which embedded actions are used to collect data bottom up and build aggregated data structures. A short version is given below, where the aggregated data structures are only printed (ie no classes are created for them in this short sample code).
grammar Sample;
top returns [ArrayList l]
    @init { $l = new ArrayList<String>(); }
    : (mid { $l.add($mid.s); } )* ;
mid returns [String s]
    : i1=identifier 'hello' i2=identifier
      { $s = $i1.s + " bye " + $i2.s; }
    ;
identifier returns [String s]
    : ID { $s = $ID.getText(); } ;
ID : [a-z]+ ;
WS : [ \t\r\n]+ -> skip ;
Its corresponding Main program is:
public class Main {
    public static void main(String[] args) throws Exception {
        SampleLexer lexer = new SampleLexer(new ANTLRFileStream(args[0]));
        CommonTokenStream tokens = new CommonTokenStream(lexer);
        SampleParser parser = new SampleParser(tokens);
        ArrayList<String> top = parser.top().l;
        System.out.println(top);
    }
}
And a sample test is:
aaa hello bbb
xyz hello pqr
Since one of the objectives of ANTLR is to keep the grammar file reusable and action-independent, I am trying to delete the actions from this file and move them to a tree walker. I took a first stab at it with the following code:
public class Main {
    public static void main(String[] args) throws Exception {
        SampleLexer lexer = new SampleLexer(new ANTLRFileStream(args[0]));
        CommonTokenStream tokens = new CommonTokenStream(lexer);
        SampleParser parser = new SampleParser(tokens);
        ParseTree tree = parser.top();
        ParseTreeWalker walker = new ParseTreeWalker();
        walker.walk(new Walker(), tree);
    }
}
public class Walker extends SampleBaseListener {
    public void exitTop(SampleParser.TopContext ctx) {
        System.out.println("Exit Top : " + ctx.mid());
    }
    public String exitMid(SampleParser.MidContext ctx) {
        return ctx.identifier() + " bye "; // ignoring the 2nd instance here
    }
    public String exitIdentifier(SampleParser.IdentifierContext ctx) {
        return ctx.ID().getText();
    }
}
But obviously this is wrong, because at the least, the return types of the Walker methods should be void, so they don't have a way to return aggregated values upstream. Secondly, I don't see a way to access "i1" and "i2" from the walker code, so I am not able to differentiate between the two instances of "identifier" in that rule.
Any suggestions on how to separate the actions from the grammar for this purpose?
Should I use a visitor instead of a listener here, since the visitor has the capability of returning values? If I use a visitor, how do I solve the problem of differentiating between "i1" and "i2" (as mentioned above)?
Does a visitor perform its action only at the exit of a rule (unlike the listeners, which exist for both entry and exit)? For example, if I have to initialize the list at the entry of rule "top", how can I do it with a visitor, which executes only at the conclusion of a rule? Do I need an enterTop listener for that purpose?
EDIT: After the initial post, I have modified the rule "top" to create and return a list, and pass this list back to the main program for printing. This is to illustrate why I need an initialization mechanism for the code.
Based on what you are trying to do I think you may benefit from using ANTLR's BaseVisitor Class rather than the BaseListener Class.
Assuming your grammar is this (I generalized it and I'll explain the changes below):
grammar Sample;
top : mid* ;
mid : i1=identifier 'hello' i2=identifier ;
identifier : ID ;
ID : [a-z]+ ;
WS : [ \t\r\n]+ -> skip ;
Then your Walker would look like this:
public class Walker extends SampleBaseVisitor<Object> {

    @Override
    public ArrayList<String> visitTop(SampleParser.TopContext ctx) {
        ArrayList<String> arrayList = new ArrayList<>();
        for (SampleParser.MidContext midCtx : ctx.mid()) {
            arrayList.add(visitMid(midCtx));
        }
        return arrayList;
    }

    @Override
    public String visitMid(SampleParser.MidContext ctx) {
        return visitIdentifier(ctx.i1) + " bye " + visitIdentifier(ctx.i2);
    }

    @Override
    public String visitIdentifier(SampleParser.IdentifierContext ctx) {
        return ctx.getText();
    }
}
This allows you to visit and get the result of any rule you want.
You are able to access i1 and i2, as you labeled them through the visitor methods. Note that you don't really need the identifier rule since it contains only one token and you can access a token's text directly in the visitMid, but really it's personal preference.
You should also note that SampleBaseVisitor is a generic class, where the generic parameter determines the return type of the visit methods. For your example I set the generic parameter Object, but you could even make your own class which contains the information you want to preserve and use that for your generic parameter.
BaseVisitor also inherits some more useful methods which may help you out.
Lastly, your main method would end up looking something like this:
public static void main(String[] args) throws IOException {
    FileInputStream fileInputStream = new FileInputStream(args[0]);
    SampleLexer lexer = new SampleLexer(CharStreams.fromStream(fileInputStream));
    CommonTokenStream tokens = new CommonTokenStream(lexer);
    SampleParser parser = new SampleParser(tokens);
    for (String string : new Walker().visitTop(parser.top())) {
        System.out.println(string);
    }
}
As a side note, the ANTLRFileStream class is deprecated in ANTLR4.
It is recommended to use CharStreams instead.
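For example, the lexer could also be built straight from a file name (CharStreams.fromFileName has been available since ANTLR 4.7):
SampleLexer lexer = new SampleLexer(CharStreams.fromFileName(args[0]));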
As Terence Parr points out in the Definitive Reference, one main difference between Visitor and Listener is that the Visitor can return values. And that can be convenient. But Listener has a place too! What I do for listener is exemplified in this answer. Granted, there are simpler ways of parsing a list of numbers, but I made that answer to show a complete and working example of how to aggregate return values from a listener into a public data structure that can be consumed later.
public class ValuesListener : ValuesBaseListener
{
    public List<double> doubles = new List<double>(); // <<=== SEE HERE

    public override void ExitNumber(ValuesParser.NumberContext context)
    {
        doubles.Add(Convert.ToDouble(context.GetChild(0).GetText()));
    }
}
Looking closely at the Listener class, I include a public data collection -- a List<double> in this case -- to collect values parsed or calculated in the listener events. You can use any data structure you like: another custom class, a list, a queue, a stack (great for calculations and expression evaluation), whatever you like.
So while the Visitor is arguably more flexible, the Listener is a strong contender too, depending on how you want to aggregate your results.
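Applied to the Sample grammar above, the same listener pattern in Java might look like this sketch (assuming the action-free grammar from the visitor answer, where the i1/i2 labels become fields on MidContext; the class name is mine):
import java.util.ArrayList;
import java.util.List;

public class CollectingListener extends SampleBaseListener {

    // results are aggregated into a public collection, as in the C# example
    public final List<String> results = new ArrayList<>();

    @Override
    public void exitMid(SampleParser.MidContext ctx) {
        // the i1/i2 labels from the mid rule let us tell the two identifiers apart
        results.add(ctx.i1.getText() + " bye " + ctx.i2.getText());
    }
}
After walking the tree with ParseTreeWalker.DEFAULT.walk(listener, tree), the aggregated strings are available in listener.results.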

How to mock the DataStax Row object [com.datastax.driver.core.Row] - Unit Test

Please find below the code for the DAO, entity object, and accessor.
#Table(name = "Employee")
public class Employee {
#PartitionKey
#Column(name = "empname")
private String empname;
#ClusteringColumn(0)
#Column(name = "country")
private String country;
#Column(name = "status")
private String status;
}
Accessor:
@Accessor
public interface EmployeeAccessor {

    @Query(value = "SELECT DISTINCT empname FROM EMPLOYEE ")
    ResultSet getAllEmployeeName();
}
The DAO's getAllEmployeeNames returns a list of employee names, sorted in ascending order.
DAO
public class EmployeeDAOImpl implements EmployeeDAO {

    private EmployeeAccessor employeeAccessor;

    @PostConstruct
    public void init() {
        employeeAccessor = datastaxCassandraTemplate.getAccessor(EmployeeAccessor.class);
    }

    @Override
    public List<String> getAllEmployeeNames() {
        List<Row> names = employeeAccessor.getAllEmployeeName().all();
        List<String> empnames = names.stream()
                .map(name -> name.getString("empname")).collect(Collectors.toList());
        empnames.sort(naturalOrder()); // sorted
        return empnames;
    }
}
JUnit Test (Mockito):
I am not able to mock the List<Row> (DataStax Row). How do I mock it and return a list of rows with the values "foo" and "bar"? Please help me unit test this.
@Category(UnitTest.class)
@RunWith(MockitoJUnitRunner.class)
public class EmployeeDAOImplUnitTest {

    @Mock
    private ResultSet resultSet;

    @Mock
    private EmployeeAccessor empAccessor;

    // here is the problem....how to mock the List<Row> Object --> com.datastax.driver.core.Row (interface)
    // this code will result in compilation error as we are mapping a List<Row> to the ArrayList<String>
    // how to mock the List<Row> with a list of String row object
    private List<Row> unSortedTemplateNames = new ArrayList() {
        {
            add("foo");
            add("bar");
        }
    };

    // this is a test case to check if the results are sorted or not
    // mock the accessor and send rows as "foo" & "bar"
    // after calling the dao, the first element must be "bar" and not "foo"
    @Test
    public void shouldReturnSorted_getAllTemplateNames() {
        when(empAccessor.getAllEmployeeName()).thenReturn(resultSet);
        when(resultSet.all()).thenReturn(unSortedTemplateNames); // how to mock the List<Row> object ???
        // i am testing if the results are sorted, first element should not be foo
        assertThat(countryTemplates.get(0), is("bar"));
    }
}
Wow! This is overly complex, hard to follow, and not an ideal way to write unit tests.
Using PowerMock(ito) along with "static" references in your own code is not recommended and is a sure sign of a code smell.
First, I am not sure why you decided to use a static reference (e.g. EmployeeAccessor.getAllEmployeeName().all(); inside the EmployeeDAOImpl class, getAllEmployeeNames() method) instead of using the instance variable (i.e. empAccessor), which is more conducive to actual "unit testing"?
The EmployeeAccessor, getAllEmployeeName() "interface" method is not static (clearly). However, seemingly, whatever this (datastaxCassandraTemplate.getAccessor(EmployeeAccessor.class);) generates makes it so (really?), which then requires the use of PowerMock(ito), o.O
Frameworks like PowerMock, and extensions of (i.e. "PowerMockito"), were meant to test and mock code used by your application (unfortunately, but necessarily so) where this "other" code makes use of statics, Singletons, private methods and so on. This anti-pattern really ought not be followed in your own application design.
Second, it is not really apparent what the "Subject Under Test" (SUT) is in your test case. You implemented a test class (i.e. EmployeeDAOImplTest) for, supposedly, your EmployeeDAOImpl class (the actual "SUT"), but inside your test case (i.e. shouldReturnSorted_getAllTemplateNames()), you are calling... countryLocalizationDAOImpl.getAllTemplateNames(); thus testing the CountryLocalizationDAOImpl class (??), which is not the "SUT" of the EmployeeDAOImplTest class.
Additionally, it is not apparent that the EmployeeDAOImpl even uses a CountryLocalizationDAO instance (assuming an interface here as well), and if it does, then it is certainly something that should be "mocked" when the EmployeeDAOImpl "interacts" with instances of CountryLocalizationDAO, particularly in the context of a unit test. The only correlation between the EmployeeDAO and CountryLocalizationDAO is that the Employee has a country field.
There are a few other problems with your design/setup as well, but anyway.
Here are a few suggestions...
First, let's test what your EmployeeDAOImplTest is meant to test... EmployeeDAO.getAllEmployeeNames() in a sorted fashion. This in turn may give you ideas of how to test your CountryLocalizationDAO.getAllTemplateNames() method, perhaps (if it even makes sense, i.e. if getAllTemplateNames() is in fact dependent on an Employee's country when Employees are ordered by name, i.e. "empname", and accessed via EmployeeAccessor).
public class EmployeeDAOImpl implements EmployeeDAO {

    private final EmployeeAccessor employeeAccessor;

    // where does the DataStaxCassandraTemplate reference come from?!
    private DataStaxCassandraTemplate datastaxCassandraTemplate = ...;

    public EmployeeDAOImpl() {
        this(datastaxCassandraTemplate.getAccessor(EmployeeAccessor.class));
    }

    public EmployeeDAOImpl(EmployeeAccessor employeeAccessor) {
        this.employeeAccessor = employeeAccessor;
    }

    protected EmployeeAccessor getEmployeeAccessor() {
        return this.employeeAccessor;
    }

    public List<String> getAllEmployeeNames() {
        List<Row> nameRows = getEmployeeAccessor().getAllEmployeeName().all();
        ...
    }
}
Then in your test class...
public class EmployeeDAOImplUnitTest {

    @Mock
    private EmployeeAccessor mockEmployeeAccessor;

    // SUT
    private EmployeeDAO employeeDao;

    @Before
    public void setup() {
        employeeDao = new EmployeeDAOImpl(mockEmployeeAccessor);
    }

    protected ResultSet mockResultSet(Row... rows) {
        ResultSet mockResultSet = mock(ResultSet.class);
        when(mockResultSet.all()).thenReturn(Arrays.asList(rows));
        return mockResultSet;
    }

    protected Row mockRow(String employeeName) {
        Row mockRow = mock(Row.class, employeeName);
        when(mockRow.getString(eq("empname"))).thenReturn(employeeName);
        return mockRow;
    }

    @Test
    public void getAllEmployeeNamesReturnsSortListOfNames() {
        when(mockEmployeeAccessor.getAllEmployeeName())
            .thenReturn(mockResultSet(mockRow("jonDoe"), mockRow("janeDoe")));

        assertThat(employeeDao.getAllEmployeeNames())
            .contains("janeDoe", "jonDoe");

        verify(mockEmployeeAccessor, times(1)).getAllEmployeeName();
    }
}
Now, you can apply similar techniques if in fact there is an actual correlation between Employees and CountryLocalizationDAO via the EmployeeAccessor.
Hope this helps get you on a better track!
-j

How to retrieve data using a strong typed model in LinqToSql

This code works fine.
using (ContextDB db = new ContextDB())
{
    var custAcct = (from c in db.CustAccts
                    select new
                    {
                        c.AcctNo,
                        c.Company,
                        c.UserName
                    }).ToList();
}
But this one doesn't
public class CustAcct
{
    public int AcctNo { get; set; }
    public string Company { get; set; }
    public string UserName { get; set; }
}
....
....
....
using (ContextDB db = new ContextDB())
{
    CustAcct custAcct = (from c in db.CustAccts
                         select new
                         {
                             c.AcctNo,
                             c.Company,
                             c.UserName
                         }).ToList();
}
It returns this error:
Cannot implicitly convert type 'System.Collections.Generic.IEnumerable' to 'EMailReader.Models.CustAcct'. An explicit conversion exists (are you missing a cast?)
I used Google and found many related topics, but still couldn't get it to work using the available solutions.
I just need to return data to a strong typed model.
EDITED:
After more research I found the solution below, but I wonder why I cannot retrieve the list directly from LinqToSql.
List<CustAcct> temp = new List<CustAcct>();
IEnumerable<dynamic> items = custAcct;

foreach (var item in items)
{
    temp.Add(new CustAcct()
    {
        AcctNo = item.AcctNo,
        Company = item.Company,
        UserName = item.UserName,
    });
}
You are redefining those properties by creating a new class, and this will override the LINQ2SQL-generated class.
Just change "public class CustAcct" to "public partial class CustAcct".
This will solve your problem, and you do not need to define those properties again. Remove them from your class; they will be created automatically for you.
If you can just post your class, I will change it for you.
//Shyam

Store sessionScope java.util.TreeMap variable in a document in xPage

I am working on an application where I am creating a java.util.TreeMap containing data fetched from various other documents of the application and then assigning that TreeMap to a sessionScope variable. This is working fine.
Now I want to provide a functionality wherein I need to store this map inside a NotesDocument.
But when I try doing this, I am getting an error.
var doc:NotesDocument = database.createDocument();
doc.replaceItemValue("Form","testForm");
print("json = "+sessionScope.get("Chart_Map"));
doc.replaceItemValue("Calender_Map",sessionScope.get("Chart_Map"));
doc.save();
Exception:
Error while executing JavaScript action expression
Script interpreter error, line=4, col=13: [TypeError] Exception occurred calling method NotesDocument.replaceItemValue(string, java.util.TreeMap) null
Is it possible to store a java.util.TreeMap in a notesdocument field?
If yes, then how do I implement that?
If no, then why not? Does that have something to do with serializability?
You can't store Java objects inside Document fields unless you use the MimeDomino Document data source
http://www.openntf.org/main.nsf/blog.xsp?permaLink=NHEF-8XLA83
Or even better the new openntf Domino API that has this functionality built in
http://www.openntf.org/main.nsf/project.xsp?r=project/OpenNTF%20Domino%20API
using MimeStorage
Fredrik is right, the MimeDomino makes most sense. If you are not ready and your field isn't too big for a normal Notes item, you could use CustomDataBytes as Sven suggested - or you use JSON by subclassing TreeMap. It could look like this:
import java.util.TreeMap;
import java.util.Vector;

import com.google.gson.Gson;
import com.google.gson.JsonSyntaxException;

import lotus.domino.Item;
import lotus.domino.NotesException;

public class TreeMapItem extends TreeMap<String, String> {

    private static final long serialVersionUID = 1L;

    public static TreeMapItem load(Item source) throws JsonSyntaxException, NotesException {
        Gson g = new Gson();
        TreeMapItem result = g.fromJson(source.getText(), TreeMapItem.class);
        return result;
    }

    public void save(Item target) throws NotesException {
        Gson g = new Gson();
        target.setValueString(g.toJson(this));
    }
}
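As a usage sketch (not from the original answer), assuming doc is an already-opened lotus.domino.Document and reusing the Calender_Map item name from the question's SSJS snippet:
// store the map as JSON in the Calender_Map item, then read it back later
public void storeChartMap(lotus.domino.Document doc, TreeMapItem chartMap)
        throws lotus.domino.NotesException {
    chartMap.save(doc.replaceItemValue("Calender_Map", ""));
    doc.save(true, false);
}

public TreeMapItem readChartMap(lotus.domino.Document doc)
        throws lotus.domino.NotesException {
    return TreeMapItem.load(doc.getFirstItem("Calender_Map"));
}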
I used Google's Gson, it is quite easy, but you might need to deploy it as a plug-in for the Java security to work. There is built-in JSON in XPages too - a little more work. An alternate approach would be to use 2 fields in Domino, one to load the keys from and one for the values - it would be in line with Domino practices from classic.
A third approach would be to store the values separated using a pipe character:
#SuppressWarnings({ "unchecked", "rawtypes" })
public void saveCompact(Item target) throws NotesException {
Vector v = new Vector();
for (Map.Entry<String, String> me : this.entrySet()) {
v.add(me.getKey()+"|"+me.getValue());
}
target.setValues(v);
}
#SuppressWarnings("rawtypes")
public static TreeMapItem loadCompact(Item source) throws NotesException {
TreeMapItem result = new TreeMapItem();
Vector v = source.getValues();
for (Object o : v) {
String[] candidate = o.toString().split("|");
if (candidate.length > 1) {
result.put(candidate[0], candidate[1]);
}
}
return result;
}
Let us know how it works for you
