I use cucumber 6.
My feature file contains:
When dialog ws1000 Log In Handheld ( user: test / workstation Id: stp50)
How do I write step definition functions for this? How do I escape the special characters?
#When("dialog ws1000 Log In Handheld \( user: {} \/ workstation Id:
{})") - nor working((
You are using a Cucumber Expression, so you can escape the ( with a \. Because you are using Java, you also have to escape the \ in the string literal, which means that you end up writing \\(.
#When("dialog ws1000 Log In Handheld \\( user: {} \\/ workstation Id: {})")
public void dialog_ws1000_log_in_handheld_user_test_workstation_id_stp50(String user, String workStation) {
// Write code here that turns the phrase above into concrete actions
}
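If you prefer to avoid Cucumber Expression escaping altogether, one alternative (a sketch, not part of the original answer; the class and method names are illustrative) is to anchor the annotation text with ^ and $, which makes Cucumber-JVM treat it as a regular expression, so only regex and Java string escaping apply:

import io.cucumber.java.en.When;

public class LoginSteps {

    // Anchored with ^...$ so Cucumber treats this as a regular expression rather
    // than a Cucumber Expression; the literal parentheses become \\( and \\) in
    // the Java string, and the / needs no escaping at all.
    @When("^dialog ws1000 Log In Handheld \\( user: (.+) / workstation Id: (.+)\\)$")
    public void dialog_ws1000_log_in_handheld(String user, String workStationId) {
        // For the step in the question: user = "test", workStationId = "stp50"
    }
}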
However if your steps can not be found, there may be another problem. You should go through How to Ask a Good Question, especially How to create a Minimal, Complete, and Verifiable example. There is a good chance that if you do this by starting with a working project (for example the Cucumber Java Skeleton) you'll figure out the problem on your own.
I am writing a test automation step-definition using Cucumber and WebdriverJS. I have the following feature file:
Scenario Outline: Validate whatever in this scenario works
Given I have a <Service Method> for test store
Then validate that something exists
Examples:
|Service Method|
|Carryout |
|Delivery |
|Dine-in |
I'm mapping this cucumber step to the following auto-generated step-definition file
this.Given(/^I have a (.*) order$/, function (serviceMethod) {});
In this situation, I want this function to process only if the cucumber parameter is one of (carryout|delivery|dine-in).
I tried to limit the capture group with the following regex, but the cucumber step no longer maps to the step-definition:
this.Given(/^I have a (.*) (carryout|delivery|dine-in)? order$/, function (serviceMethod) {});
How can this be resolved? Any help is highly appreciated.
If you require one of those order types, you can assert that one of them is present with a lookahead, then match the rest of the step loosely:
/^(?=.* (carryout|delivery|dine-in) )I have a.*order$/
Group 1 contains the TYPE of the order.
this.Given(/^I have a (carryout|delivery|dine-in) order$/, function (serviceMethod) {});
The above step-definition finally worked for me.
I am unfamiliar with perl, and I have a need to modify a Nagios check. I'd appreciate any advice on how to proceed. The check I'm using is check_smart, found here:
https://www.claudiokuenzler.com/nagios-plugins/check_smart.php
This script lets you check SMART values from hard drives and present the results in a simple form for monitoring. As it stands, the script can take a regex in the form /dev/sd[a-c] for one of the options; I believe that this is the section which allows this:
# list of devices for a loop
my(@dev);
if ( $opt_d ){
    # normal mode - push opt_d on the list of devices
    push(@dev,$opt_d);
} else {
    # glob all devices - try '?' first
    @dev = glob($opt_g);
}
foreach my $opt_dl (@dev){
    warn "Found $opt_dl\n" if $opt_debug;
    if (-b $opt_dl || -c $opt_dl){
        $device .= $opt_dl.":";
    } else {
        warn "$opt_dl is not a valid block/character special device!\n\n" if $opt_debug;
    }
}
I don't quite understand why the variable is $opt_dl when earlier it seems to be $opt_d. The result, however, is that the script returns something like:
OK: [/dev/sda] - Device is clean --- [/dev/sdb] - Device is clean --- [/dev/sdc] - Device is clean
EDIT: Here's the code where $opt_d is set; on further thought it seems like $opt_dl is just $opt_d while it's in a loop or something?
use vars qw($opt_b $opt_d $opt_g $opt_debug $opt_h $opt_i $opt_v);
Getopt::Long::Configure('bundling');
GetOptions(
"debug" => \$opt_debug,
"b=i" => \$opt_b, "bad=i" => \$opt_b,
"d=s" => \$opt_d, "device=s" => \$opt_d,
"g=s" => \$opt_g, "global=s" => \$opt_g,
"h" => \$opt_h, "help" => \$opt_h,
"i=s" => \$opt_i, "interface=s" => \$opt_i,
"v" => \$opt_v, "version" => \$opt_v,
);
The part of the code I'd like to change in a similar fashion is:
# Allow all device types currently supported by smartctl
# See http://www.smartmontools.org/wiki/Supported_RAID-Controllers
if ($opt_i =~ m/(ata|scsi|3ware|areca|hpt|cciss|megaraid|sat)/) {
$interface = $opt_i;
} else {
print "invalid interface $opt_i for $opt_d!\n\n";
print_help();
exit $ERRORS{'UNKNOWN'};
}
Specifically, I'd like to be able to pass the script something like "megaraid,[5-8]" and let it run for each. In this case, I would not be passing the regex for the device, it would just be /dev/sda.
If anyone could give me advice on this I'd appreciate it!
$opt_dl is probably poorly named and has nothing to do with your $opt_d; they are two separate variables.
From the if statement: if $opt_d is not set (that is, the script was not given any device name to act upon), then glob is called with the value of $opt_g, and it is glob that finds all matching filenames based on the glob pattern given in $opt_g.
After this if statement, the @dev array is filled with the names of the devices to handle.
Then you have a foreach statement, which loops over each item inside the @dev array. During the loop, each item is available in the $opt_dl variable, because that is the loop variable named in the foreach statement.
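To make the glob and foreach behaviour above concrete, here is a small standalone Perl sketch (not taken from check_smart; the device pattern is only an example):

#!/usr/bin/perl
use strict;
use warnings;

# glob() expands a shell-style wildcard pattern (not a regular expression)
# into the list of matching filenames, just like the shell would.
my @dev = glob('/dev/sd[a-c]');   # e.g. ('/dev/sda', '/dev/sdb', '/dev/sdc')

# foreach puts each element of @dev into the loop variable in turn; the name
# ($opt_dl here) is arbitrary and has no connection to $opt_d.
foreach my $opt_dl (@dev) {
    print "Found $opt_dl\n" if -b $opt_dl || -c $opt_dl;
}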
However I was not able to understand what you wanted to do in your last paragraph.
I'm the maintainer of check_smart and it's funny I accidentally stumbled on that question now.
I don't quite understand why the variable is $opt_dl when earlier it seems to be $opt_d. The result, however, is that the script returns something like: OK: [/dev/sda] - Device is clean --- [/dev/sdb] - Device is clean --- [/dev/sdc] - Device is clean
So basically when you use the -g parameter, you tell the check_smart plugin to use glob (https://perldoc.perl.org/functions/glob.html) - this is not the same as a regular expression. The drives matching the glob expression (e.g. -g '/dev/sd[a-z]') are collected into a list, and the plugin runs through each drive ($opt_dl) in a for loop.
Specifically, I'd like to be able to pass the script something like "megaraid,[5-8]" and let it run for each. In this case, I would not be passing the regex for the device, it would just be /dev/sda.
This has been possible since release 5.0 (which was released in April 2014, way before your question ;-) ). You just need to change the syntax. Instead of using the glob expression on -d, you use it on the interface parameter (-i). Practical example: -i 'megaraid,[5-8]'.
Since the newest release (6.6, released a couple of days ago), the output for multiple drive checks (using -g) and hardware storage/raid controllers has slightly changed and now indicates the interface's device id rather than the logical drive path:
# ./check_smart.pl -g /dev/sda -i 'megaraid,[1-3]'
OK: [megaraid,1] - Device is clean --- [megaraid,2] - Device is clean --- [megaraid,3] - Device is clean|
This is all described in the official documentation, too.
More info:
https://www.claudiokuenzler.com/monitoring-plugins/check_smart.php
https://www.claudiokuenzler.com/blog/914/check_smart-6.6-multiple-drives-check-megaraid-3ware-cciss-controllers
I hope this answers your question, although I am probably 2 years late.
I have tried using BotBuilder-Location to collect the user's location via the Bing Maps API. I have followed the instructions in BotBuilder-Location's GitHub repository and have managed to display a map from Bing Maps using the code from the example:
var options = {
prompt: "Where should I ship your order?",
useNativeControl: true,
reverseGeocode: true,
requiredFields:
locationDialog.LocationRequiredFields.streetAddress |
locationDialog.LocationRequiredFields.locality |
locationDialog.LocationRequiredFields.region |
locationDialog.LocationRequiredFields.postalCode |
locationDialog.LocationRequiredFields.country
};
locationDialog.getLocation(session, options)
However, in the prompt for the location the string "botbuilder-location:TitleSuffix" keeps showing, and the dialog does not continue after showing the map but instead displays the string "botbuilder-location:MultipleResultsFound" (Screenshot of unexpected strings). I have tried this in the Emulator as well as on Skype and Facebook Messenger with the same results.
Does anybody know how to fix this?
Thanks and best regards!
This is a known issue and it's reported here. There you will also find a workaround. The team behind the control is testing a potential fix.
I'm concerned about security of database passwords in sqoop batches (no interactive input).
In the old days, for a sqoop batch, the only thing you could do was to pass it on the command line using --password, but then the password was easy to read with a simple ps command.
Now we have that --password-file option, but it requires to store the password unencrypted on the disk and that's not really a "secure" practice, nor is it very convenient to have individual files for individual parameters.
I was thinking of storing the encrypted password in a configuration file, then dynamically decrypting it, storing it in a temporary file, setting the rights (using a chmod command), calling sqoop, and then deleting the file... But I may be missing a less cumbersome way? How do you deal with it?
@abeaamase has the best answer at this time in his stackoverflow 23916985 response from 3/2015 found here. Essentially, we should upgrade to sqoop > 1.4.5 and use a java keystore (JKES), org.apache.sqoop.util.password.CryptoFileLoader, or a loader defined in our own class.
The provided CryptoFileLoader has the disadvantage of presuming the crypto passphrase and salt will be provided as -D parameters to drive system properties (which are open to snooping via ps) or in plain text in the configuration XML.
I initially discovered these parameters in this blog from 3/2015 having missed it earlier (it lacks a heading but you'll find it if you look at step 3 part 2).
Surprisingly, it is not the recommended practice, and does not appear in the sqoop 1.4.5 docs.
Before the availability of the --password-file option, I made a patch for sqoop to read the password from an input stream in a non-interactive way when using the -P option.
That way, I could decrypt the password from a configuration file and call sqoop with that password through a stdin pipe, without using a file or a command line where the plain password could be seen.
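For illustration only, an invocation of such a patched build might look like the line below (my_decrypt and the connection details are placeholders, not part of the actual patch); because stdin is a pipe rather than a terminal, System.console() returns null and the fallback reader in the patch picks the password up:

my_decrypt /etc/sqoop/db_password.enc | sqoop import --connect "$JDBC_URL" --username etl --table orders -P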
Edit file src/java/org/apache/sqoop/SqoopOptions.java
Replace the securePasswordEntry function code with:
private String securePasswordEntry() {
  try {
    return new String(System.console().readPassword("Enter password: "));
  }
  // PATCH Bouygues Telecom - read password from pipe if launched in non-interactive mode
  catch (NullPointerException e) {
    try {
      final java.io.BufferedReader reader = new java.io.BufferedReader(
          new java.io.InputStreamReader(System.in));
      return reader.readLine();
    }
    catch (java.io.IOException excep) {
      LOG.error("It seems that you have launched a Sqoop metastore job via");
      LOG.error("Oozie with sqoop.metastore.client.record.password disabled.");
      LOG.error("But this configuration is not supported because Sqoop can't");
      LOG.error("prompt the user to enter the password while being executed");
      LOG.error("as Oozie tasks. Please enable sqoop.metastore.client.record");
      LOG.error(".password in sqoop-site.xml, or provide the password");
      LOG.error("explicitly using --password in the command tag of the Oozie");
      LOG.error("workflow file.");
    }
    return null;
  }
}
What is cumbersome is having to re-patch every new release of Sqoop... I should maybe submit a JIRA (with low confidence that my patch will be taken into account), or move to the --password-file option the way you wanted to.
I'm new to cucumber, but enjoying it.
I'm currently writing some Frank tests, and would like to reuse blocks of cucumber script across multiple features - I'd like to do this at the cucumber level if possible (not inside the Ruby).
For example, I might have 4 scripts that all start by doing the same login steps:
given my app has started
then enter "guest" in "user-field"
and enter "1234" in "password-field"
and press "login"
then I will see "welcome"
then *** here's the work specific to each script ***
Is there any way to share these first 5 lines across multiple scripts? Some kind of "include" syntax?
Generally there are 2 approaches:
Backgrounds
If you want a set of steps to run before each of the scenarios in a feature file:
Background:
given my app has started
then enter "guest" in "user-field"
and enter "1234" in "password-field"
and press "login"
then I will see "welcome"
Scenario: Some scenario
then *** here's the work specific to this scenario ***
Scenario: Some other scenario
then *** here's the work specific to this scenario ***
Calling steps from step definitions
If you need the 'block' of steps to be used in different feature files, or a Background section is not suitable because some scenarios don't need it, then create a high-level step definition which calls the other ones:
Given /^I have logged in$/ do
  steps %Q{
    given my app has started
    then enter "guest" in "user-field"
    and enter "1234" in "password-field"
    and press "login"
    then I will see "welcome"
  }
end
Also, in this case I'd be tempted not to implement your common steps as separate steps at all, but to create a single step definition: (assuming Capybara)
Given /^I have logged in$/ do
fill_in 'user-field', :with => 'guest'
fill_in 'password-field', :with => '1234'
click_button 'login'
end
This lends a little bit more meaning to your step definitions, rather than creating a sequence of page interactions which need to be mentally parsed before you realise 'oh, this section is logging me in'.
From a code maintenance and debugging perspective, a better approach is to use Ruby-level methods for code reuse instead of nested steps.
Here is the link to more detail:
Reuse Cucumber steps
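For example, the login block from the question could become a plain helper method mixed into the World (a sketch assuming Capybara, as in the answer above; the file paths and the method name are illustrative):

# features/support/login_helpers.rb
module LoginHelpers
  # An ordinary Ruby method: callable from any step definition,
  # easy to navigate to and to debug.
  def log_in(user: 'guest', password: '1234')
    fill_in 'user-field', :with => user
    fill_in 'password-field', :with => password
    click_button 'login'
  end
end
World(LoginHelpers)

# features/step_definitions/login_steps.rb
Given /^I have logged in$/ do
  log_in
end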
Description
The following method proposes an alternative approach to one of the solutions described in Jon M's answer.
Namely, instead of calling nested steps inside step definitions, such common blocks of steps can be extracted into external .feature files which can be included into your feature file (in a manner of speaking).
How-to
1. Expose utility / helper methods to be able to run steps parsed from a .feature file
# features/support/env.rb

# expose Cucumber runtime
InstallPlugin do |_, registry|
  runtime = registry.instance_variable_get('@registry').instance_variable_get('@runtime')
  Cucumber.define_singleton_method(:runtime) { runtime }
end

# extend current World with methods to run dynamic (already parsed) steps
Before do
  step_invoker = Cucumber::Runtime::SupportCode::StepInvoker.new(Cucumber.runtime.support_code)

  define_singleton_method(:dynamic_steps) do |steps|
    steps.each do |step|
      dynamic_step(step)
    end
  end

  define_singleton_method(:dynamic_step) do |step|
    LOGGER.info("Running template step: #{step[:text]}")
    step_invoker.step(step)
  end
end
2. Create a template file which will contain the steps to be shared
# features/templates/my_profile.template.feature
@template
Feature: Steps to navigate to my_profile_page
Scenario: login_page
Given my app has started on "login_page"
And I enter "guest" in "user-field" on "login_page"
And I enter "1234" in "password-field" on "login_page"
And I press "login" on "login_page" and go to "welcome_page"
Scenario: welcome_page
Given that I am on "welcome_page"
And I click "my_profile_button" on "welcome_page" and go to "my_profile_page"
Scenario: my_profile_page
...
3. Create an utility module which will parse steps from a .feature file
# features/support/template_parser.rb
require 'gherkin/parser'
require 'gherkin/pickles/compiler'

module TemplateParser
  class << self
    def read_from_template(template_path, from: nil, till: nil)
      pickles = load_template(template_path)
      flow = construct_flow(pickles)
      slice_flow(flow, from, till)
    end

    private

    def load_template(template_path)
      source = {
        uri: template_path,
        data: File.read(template_path),
        mediaType: 'text/x.cucumber.gherkin+plain'
      }
      def source.uri
        self[:uri]
      end
      gherkin_document = Gherkin::Parser.new.parse(source[:data])
      id_generator = Cucumber::Messages::IdGenerator::UUID.new
      Gherkin::Pickles::Compiler.new(id_generator).compile(gherkin_document, source)
    end

    def construct_flow(pickles)
      pickles.to_h do |pickle|
        [
          pickle.name,
          pickle.steps.map(&:to_h).map { |step| step[:argument] ? step.merge(step[:argument]) : step }
        ]
      end
    end

    def slice_flow(flow, from, till)
      raise NameError, "From step '#{from}' does not exist!" unless from.nil? || flow.keys.include?(from)
      raise NameError, "Till step '#{till}' does not exist!" unless till.nil? || flow.keys.include?(till)
      from_idx = from.nil? ? 0 : flow.keys.index(from)
      till_idx = till.nil? ? -1 : flow.keys.index(till)
      flow.slice(*flow.keys[from_idx...till_idx])
    end
  end
end
4. Create a step definition that will load this template and inject the specified steps dynamically at runtime
And('I complete the {string} template from the {string} until the {string}') do |template, from, till|
  template_path = "features/templates/#{template}.template.feature"
  flow = TemplateParser.read_from_template(
    template_path,
    from: from.empty? ? nil : from,
    till: till.empty? ? nil : till
  )
  flow.each_value { |steps| dynamic_steps(steps) }
end
5. Use this step inside your main feature file, by declaring which blocks of steps to use
# features/tests/welcome.feature
Feature: User is welcomed
Scenario: Verify that user sees welcome text
Given I complete the 'my_profile' template from the 'login_page' until the 'my_profile_page'
Then I see 'welcome' on 'welcome_page'
6. Make sure you omit the @template .feature files from being run in your tests
$ bundle exec cucumber --tags ~@template
Limitations
Con:
This method exposes some internals of the private API of cucumber-ruby, which may change in future.
Con:
This is a non-standard way of sharing steps between feature files.
Helper methods are the preferred way to achieve this, as per FAQ.
Pro:
The common blocks of steps are syntax-highlighted, and have proper IntelliSense support in your editor of choice.
Pro:
You can encode entire "workflows" easily this way, allowing you to encode your workflow expectations in a DRY way.
Namely, you can reuse those workflow steps by completing the first part of a workflow, change a few things on a single page as per your test requirements, resume those workflow steps from the follow-up page, and add an appropriate verification at the end of the workflow that covers those test requirements.