In the test program docs, I see a code snippet showing that some default options will be passed to a sub-flow:
# These are the option defaults that will be used unless specified by the caller
options = {
  include_loaded_output_tests: true,
  index: 0,
}.merge(options)
But when I set a breakpoint in the sub-flow, all I see is an empty hash. If I pass options to the sub-flow, they do show up:
# In top-level flow
import 'scan_coverage_flow', test_modes: test_modes
# In the sub-flow
From: /users/user/origen/prod/scan/origen/_scan_coverage_flow.rb # line 3 :
1: Flow.create do |options|
2: binding.pry
[1] pry(#<AmdTest::Interface>)> options
=> {:test_modes=>
{"mode1"=><Model: Origen::SubBlock:23972990241220>,
Is the documentation up-to-date or am I misreading it?
thx
You are misreading the docs; that snippet is intended to show how to set sub-flow-specific default options.
Say we have an empty sub-flow that just prints out the options:
# _my_sub_flow.rb
Flow.create do |options|
  puts options
end
Calling it like this should print out an empty hash:
import "my_sub_flow"
# => {}
If you add some default options to it:
Flow.create do |options|
  options = {
    a: 10,
    b: 20
  }.merge(options)
  puts options
end
Then it will do what you would expect:
import "my_sub_flow"
# => {a: 10, b: 20}
And of course the reason for having options is so that you can override them at call time:
import "my_sub_flow", b: 50
# => {a: 10, b: 50}
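The defaults-then-merge idiom is plain Ruby and can be checked outside Origen entirely; a minimal sketch (the hash contents here are made up):

```ruby
# Hash#merge gives precedence to the keys of its argument, so writing
# defaults.merge(caller_options) lets the caller's values win.
defaults       = { a: 10, b: 20 }
caller_options = { b: 50 }

merged = defaults.merge(caller_options)
puts merged.inspect  # the caller's b: 50 overrides the default b: 20

# Reversing receiver and argument would make the defaults win instead:
puts caller_options.merge(defaults).inspect  # b: 20 again
```

The order of receiver and argument is the whole trick: the docs snippet deliberately puts the defaults on the left so that anything the caller passes survives the merge.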
I'm trying to use vcxproj-stream-editor to edit my .vcxproj files. What I want to do is:
Call check_file to see if a section exists in the file and set a flag.
If the flag isn't set, call filter_file to add it.
But how do I set a flag inside the coroutine called by check_file? I've tried declaring a global flag:
#! python3
import vcxproj

Configuration_debug_x64 = None

@vcxproj.coroutine
def print_project_guid():
    while True:
        action, params = yield
        if action == "start_elem" and params["name"] == "PropertyGroup":
            if "attrs" in params:
                if "Label" in params["attrs"]:
                    if params["attrs"]["Label"] == "Configuration":
                        if "Condition" in params["attrs"]:
                            if params["attrs"]["Condition"] == "'$(Configuration)|$(Platform)'=='Debug|x64'":
                                Configuration_debug_x64 = True

vcxproj.check_file("My.vcxproj", print_project_guid)

if Configuration_debug_x64:
    print("Configuration_debug_x64")
else:
    print("No Configuration_debug_x64")
but that doesn't work, and I can't add a parameter to the function I pass to check_file. Is there some obvious method I'm missing?
I am looking to apply a callback after each test execution that will check for an alarm flag. I don't see any listed here, so I then checked the test interface and only see what looks like a flow-level callback:
# This will be called at the end of every flow or sub-flow (at the end of every
# Flow.create block).
# Any options passed to Flow.create will be passed in here.
# The options will contain top_level: true, whenever this is called at the end of a
# top-level flow file.
def shutdown(options = {})
end
We need the ability to check the alarm flags after every test but still apply a common group ID to a list of tests like this:
group "func tests", id: :func do
  [:minvdd, :maxvdd].each do |cond|
    func :bin1_1200, ip: :cpu, testmode: :speed, cond: cond
  end
end
Here is an example of the V93K alarm flow flag:
thx!
It is common when writing interfaces to funnel all test generation methods through a common single method to add them to the flow:
def func(name, options = {})
  t = test_suites.add(name)
  t.test_method = test_methods.origen.functional_test(options)
  add_to_flow(t, options)
end

def para(name, options = {})
  t = test_suites.add(name)
  t.test_method = test_methods.origen.parametric_test(options)
  add_to_flow(t, options)
end

def add_to_flow(test_obj, options = {})
  # Here you can do anything you want before adding each test to the flow
  flow.test(test_obj, options)
  # Here you can do anything you want after adding each test to the flow
end
So while there is no per-test callback, you can generally achieve whatever you wanted to do with one via the above interface architecture.
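Outside of Origen, the funnel is nothing more than a single method that every generator calls; a stripped-down plain-Ruby sketch of the idea (class and method names are hypothetical, no Origen APIs involved):

```ruby
# Hypothetical sketch of the funnel pattern: func and para both route
# through add_to_flow, which is the one place to hang per-test logic.
class MyInterface
  attr_reader :flow

  def initialize
    @flow = []
  end

  def func(name, options = {})
    add_to_flow({ type: :functional, name: name }, options)
  end

  def para(name, options = {})
    add_to_flow({ type: :parametric, name: name }, options)
  end

  def add_to_flow(test_obj, options = {})
    # Anything here runs before every test is added to the flow
    @flow << test_obj.merge(options)
    # Anything here runs after every test, e.g. appending an alarm check
    @flow << { type: :alarm_check, flag: "Alarm" }
  end
end

i = MyInterface.new
i.func :bin1_1200, cond: :minvdd
i.para :vdd_meas
# i.flow now holds each test entry followed by an alarm-check entry
```

The design point is that nothing outside add_to_flow ever touches the flow directly, so a per-test check added there is guaranteed to follow every test, regardless of which generator method produced it.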
EDIT:
With reference to the alarm flag flow structure you want to create, you would code it like this:
func :some_func_test, id: :sft1

if_failed :sft1 do
  bin 10, if_flag: "Alarm"
  bin 11, unless_flag: "Alarm"
end
Or, if you prefer, this is equivalent:
func :some_func_test, id: :sft1
bin 10, if_flag: "Alarm", if_failed: :sft1
bin 11, unless_flag: "Alarm", if_failed: :sft1
At the time of writing, that will generate something logically correct but with a sub-optimal branch structure.
In the next release that will be fixed, see the test case that has been added here and the output it generates here.
You can call all of the flow control methods from the interface the same way you can from within the flow, so you can inject such conditions in the add_to_flow method if you want.
Note also that in the test case both if_flag and if_enable are used. if_enable should generally be used if the flag is something that would be set at the start of the flow (e.g. by the operator) and would not change. if_flag should be used if it is a flag that is subject to modification by the flow at runtime.
I know there's no built-in "line count" functionality when processing files through Logstash (for various, understandable, documented reasons). But there should be a mechanism, within any given Logstash instance, to have a monotonically increasing variable/count for every parsed line.
I don't want to go the metrics route since it's a continuous polling mechanism (every n seconds). Alternatives include pre-processing the log files, which, given my particular use case, is unacceptable.
Again, let me reiterate: I need the ability to generate/read a monotonically increasing value that I can store from within a Logstash filter.
Thoughts?
There's nothing built into Logstash to do it, but you can build a filter to do it pretty easily. Just drop something like this into lib/logstash/filters/seq.rb:
# encoding: utf-8
require "logstash/filters/base"
require "logstash/namespace"
require "set"

#
# This filter adds a sequence number to a log entry
#
# The config looks like this:
#
#     filter {
#       seq {
#         field => "seq"
#       }
#     }
#
# The `field` is the field you want added to the event.
class LogStash::Filters::Seq < LogStash::Filters::Base
  config_name "seq"
  milestone 1

  config :field, :validate => :string, :required => false, :default => "seq"

  public
  def register
    # Nothing
  end # def register

  public
  def initialize(config = {})
    super
    @threadsafe = false
    # This filter needs to keep state.
    @seq = 1
  end # def initialize

  public
  def filter(event)
    return unless filter?(event)
    event[@field] = @seq
    @seq = @seq + 1
    filter_matched(event)
  end # def filter
end # class LogStash::Filters::Seq
This will start at 1 every time Logstash is restarted, but for most situations, this would be ok. If you need something that is persistent across restarts, you need to do a bit more work to persist it somewhere
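One simple way to do that persistence is to write the counter to a file on every increment and read it back on startup; a plain-Ruby sketch (the file path and write-every-update policy are assumptions, not anything Logstash provides):

```ruby
require "tmpdir"

# File-backed counter: resumes from the persisted value after a restart.
class PersistentSeq
  def initialize(path)
    @path = path
    # Resume from the persisted value, or start at 1 on first run
    @seq = File.exist?(path) ? File.read(path).to_i : 1
  end

  def next
    current = @seq
    @seq += 1
    File.write(@path, @seq.to_s)  # persist so a restart resumes here
    current
  end
end

path = File.join(Dir.tmpdir, "seq.state")
File.delete(path) if File.exist?(path)

counter = PersistentSeq.new(path)
puts counter.next  # => 1
puts counter.next  # => 2

counter = PersistentSeq.new(path)  # simulate a restart
puts counter.next  # => 3
```

Writing the file on every event has an obvious I/O cost; batching the writes (say, every N events) trades a small gap after a crash for much less disk traffic.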
For anyone finding this in 2018+: logstash now has a ruby filter that makes this much simpler. Put the following in a file somewhere:
# encoding: utf-8

def register(params)
  @seq = 1
end

def filter(event)
  event.set("seq", @seq)
  @seq += 1
  return [event]
end
And then configure it like this in your logstash.conf (substitute in the filename you used):
ruby {
  path => "/usr/local/lib/logstash/seq.rb"
}
It would be pretty easy to make the field name configurable from logstash.conf, but I'll leave that as an exercise for the reader.
I suspect this isn't thread-safe, so I'm running only a single logstash worker.
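The underlying issue is that the increment is a read-modify-write, which is not atomic across workers. In plain Ruby (this is a sketch of the race, not a drop-in Logstash filter), a Mutex around the update is what makes it safe:

```ruby
# Four threads each bump a shared counter 1000 times.
# The lock.synchronize block makes the read-modify-write atomic.
seq = 0
lock = Mutex.new

threads = 4.times.map do
  Thread.new do
    1000.times do
      lock.synchronize { seq += 1 }  # without the lock, updates can be lost
    end
  end
end
threads.each(&:join)

puts seq  # => 4000
```

With the lock the final count is always 4000; without it, concurrent workers can interleave the read and the write and silently drop increments, which is why running a single worker also sidesteps the problem.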
Here is another way to solve the problem; it works for me (thanks to the previous answer for the note about thread safety). I use the seq field to sort in descending order.
This is my configuration:
logstash.conf
filter {
  ruby {
    code => 'event.set("seq", Time.now.strftime("%N").to_i)'
  }
}
logstash.yml
pipeline.batch.size: 200
pipeline.batch.delay: 60
pipeline.workers: 1
pipeline.output.workers: 1
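One caveat, since the original question asked for a monotonically increasing value: %N is only the nanosecond fraction of the current second, so it wraps back toward zero every second and can go backwards across second boundaries. A full reading from the monotonic clock avoids that; a small sketch of the difference:

```ruby
# %N alone is the sub-second fraction, 9 digits that reset every second:
puts Time.now.strftime("%N")

# A full nanosecond reading from the monotonic clock never goes backwards
# within a single process:
a = Process.clock_gettime(Process::CLOCK_MONOTONIC, :nanosecond)
b = Process.clock_gettime(Process::CLOCK_MONOTONIC, :nanosecond)
puts b >= a  # => true
```

In the conf above that would mean something like `event.set("seq", Process.clock_gettime(Process::CLOCK_MONOTONIC, :nanosecond))` (an assumption, not tested against Logstash), with the caveat that monotonic-clock values are not comparable across process restarts.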
I have two feature steps:
@vcr
Given A

@vcr
Given B
and their definitions:
Given /^A$/ do
  a_method_that_makes_a_request
end

Given /^B$/ do
  a_method_that_makes_a_request
end
This fails with:
Unknown alias: 70305756847740 (Psych::BadAlias)
The number changes. But when I did this:
# Feature step
Given B

# Step definition
Given /^B$/ do
  VCR.use_cassette 'a_cassette' do
    a_method_that_makes_a_request
  end
end
It works. Can I avoid this workaround and use the @vcr tag instead?
This is my config:
# features/support/vcr_setup.rb
require 'vcr'

VCR.configure do |c|
  # c.allow_http_connections_when_no_cassette = true
  c.cassette_library_dir = 'spec/fixtures/cassettes'
  c.hook_into :webmock
  c.ignore_localhost = true
  log_path = File.expand_path('../../../log/vcr.log', __FILE__)
  c.debug_logger = File.open(log_path, 'w')
end

VCR.cucumber_tags do |t|
  t.tag '@localhost_request' # uses default record mode since no options are given
  t.tags '@disallowed_1', '@disallowed_2', :record => :none
  t.tag '@vcr', :use_scenario_name => true, record: :new_episodes
end
I'm trying to use the Groovy CliBuilder to parse command-line options, using multiple long options without a short option.
I have the following processor:
def cli = new CliBuilder(usage: 'Generate.groovy [options]')
cli.with {
    h longOpt: "help", "Usage information"
    r longOpt: "root", args: 1, type: GString, "Root directory for code generation"
    x args: 1, type: GString, "Type of processor (all, schema, beans, docs)"
    _ longOpt: "dir-beans", args: 1, argName: "directory", type: GString, "Custom location for grails bean classes"
    _ longOpt: "dir-orm", args: 1, argName: "directory", type: GString, "Custom location for grails domain classes"
}
options = cli.parse(args)

println "BEANS=${options.'dir-beans'}"
println "ORM=${options.'dir-orm'}"

if (options.h || options == null) {
    cli.usage()
    System.exit(0)
}
According to the Groovy documentation, I should be able to use multiple "_" values when I want an option to have only a long name and no short name:
Another example showing long options (partial emulation of arg
processing for 'curl' command line):
def cli = new CliBuilder(usage:'curl [options] <url>')
cli._(longOpt:'basic', 'Use HTTP Basic Authentication')
cli.d(longOpt:'data', args:1, argName:'data', 'HTTP POST data')
cli.G(longOpt:'get', 'Send the -d data with a HTTP GET')
cli.q('If used as the first parameter disables .curlrc')
cli._(longOpt:'url', args:1, argName:'URL', 'Set URL to work with')
Which has the following usage message:
usage: curl [options] <url>
    --basic           Use HTTP Basic Authentication
 -d,--data <data>     HTTP POST data
 -G,--get             Send the -d data with a HTTP GET
 -q                   If used as the first parameter disables .curlrc
    --url <URL>       Set URL to work with
This example shows a common convention. When mixing short and long names, the short names are often one character in size. One-character options with arguments don't require a space between the option and the argument, e.g. -Ddebug=true. The example also shows the use of '_' when no short option is applicable.
Also note that '_' was used multiple times. This is supported, but if any other shortOpt or any longOpt is repeated, then the behavior is undefined.
http://groovy.codehaus.org/gapi/groovy/util/CliBuilder.html
When I use the "_" it only accepts the last one in the list (last one encountered). Am I doing something wrong or is there a way around this issue?
Thanks.
Not sure what you mean by "it only accepts the last one", but this should work:
def cli = new CliBuilder().with {
    x 'something', args: 1
    _ 'something', args: 1, longOpt: 'dir-beans'
    _ 'something', args: 1, longOpt: 'dir-orm'
    parse "-x param --dir-beans beans --dir-orm orm".split(' ')
}

assert cli.x == 'param'
assert cli.'dir-beans' == 'beans'
assert cli.'dir-orm' == 'orm'
I learned that my original code works correctly. What is not working is the method that takes all of the options built in the with closure and prints a detailed usage message. The call built into CliBuilder that prints the usage is:
cli.usage()
The original code above prints the following usage line:
usage: Generate.groovy [options]
    --dir-orm <directory>   Custom location for grails domain classes
 -h,--help                  Usage information
 -r,--root                  Root directory for code generation
 -x                         Type of processor (all, schema, beans, docs)
This usage output makes it look like I'm missing options. I made the mistake of not printing each individual item separately from this usage call; that's what made it look like only the last _ item in the with closure was used. I added this code to prove that values were being passed:
println "BEANS=${options.'dir-beans'}"
println "ORM=${options.'dir-orm'}"
I also discovered that you must use = between a long option and its value or it will not parse the command-line options correctly (--long-option=some_value).