Can callbacks be used to configure remotes? - origen-sdk

I was reading the Origen documentation on remotes and had a question. When do the remote files get retrieved relative to the Origen callbacks? The reason I ask is that the files we want to retrieve would be used to construct our DUT model, and there are some order dependencies.
I have tried to configure the remotes from all of the existing callbacks, with no success:
def pre_initialize
  Origen.app.config.remotes = [
    {
      dir: 'product',
      vault: 'ssh://git@stash.com/myproduct/origen.git',
      version: '0.1.0'
    }
  ]
end
If I add the remotes config to the application file it works:
config.remotes = [
  {
    dir: 'product',
    vault: 'ssh://git@stash.com/myproduct/origen.git',
    version: '0.1.0'
  }
]
The problem with using the config/application.rb file is that we can't keep product-specific information anywhere in our application space. We use symbolic links to map to source files that are stored in test program stash repositories. I think we may need a new callback; please advise.
thx
** EDIT **
So I defined the remotes in another file and call the method that actually does it from the boot.rb file. Then I placed the remote manager require! call in the on_create callback, but got no remotes fetched:
284: def on_create
285: binding.pry
=> 286: Origen.remote_manager.require!
287: end
[1] pry(#<PPEKit::Product>)> Origen.config.remotes
=> [{:dir=>"remote_checkout", :rc_url=>"ssh://git@stash.us.com:7999/osdk/us-calendar.git", :version=>"v0.1.0"}]
[2] pry(#<PPEKit::Product>)>
origen(main):001:0>
It seems like Origen.remote_manager.require! is not working. So I checked the remote manager file, and I don't see how the require! method could ever work with a callback: it checks whether the remotes are dirty, which can never happen for a remote definition that was set after application.rb was loaded. So I created a resolve_remotes! method that seems to work:
def resolve_remotes!
  resolve_remotes
  remotes.each do |_name, remote|
    dir = workspace_of(remote)
    rc_url = remote[:rc_url] || remote[:vault]
    tag = Origen::VersionString.new(remote[:version])
    version_file = dir.to_s + '/.current_version'
    begin
      if File.exist?("#{dir}/.initial_populate_successful")
        FileUtils.rm_f(version_file) if File.exist?(version_file)
        rc = RevisionControl.new remote: rc_url, local: dir
        rc.checkout version: prefix_tag(tag), force: true
        File.open(version_file, 'w') do |f|
          f.write tag
        end
      else
        rc = RevisionControl.new remote: rc_url, local: dir
        rc.checkout version: prefix_tag(tag), force: true
        FileUtils.touch "#{dir}/.initial_populate_successful"
        File.open(version_file, 'w') do |f|
          f.write tag
        end
      end
    rescue Origen::GitError, Origen::DesignSyncError => e
      # If Git failed in the remote, it's usually easy to see what the problem is, but not *where* it is.
      # This will prepend the failing remote along with the error from the revision control system,
      # then re-raise the error
      e.message.prepend "When updating remotes for #{remote[:importer].name}: "
      raise e
    end
  end
end
The resolve_remotes! method just forces all known remotes to be fetched. Would a PR be accepted for this solution?
thx

Remotes are currently required at application load time, which means it occurs before any of the application callback points.
The content of config.remotes can still be made dynamic by assigning it in a block:
config.remotes do
  r = [{
    dir: 'product',
    vault: 'ssh://git@stash.com/myproduct/origen.git',
    version: '0.1.0'
  }]
  if some_condition
    r << { dir: ... }
  end
  r
end
The config.remotes attribute will be evaluated before the target is loaded, however, so you won't be able to reference dut, for example, though maybe that is good enough.
Alternatively, you could implement a post-target require of the remotes within your application pretty easily.
Make the remotes return an empty array if the dut is not available yet; that will make it work OK when it is called by Origen during application load:
config.remotes do
  if dut
    # As above example
  else
    []
  end
end
Then within the callback handler of your choice:
Origen.remote_manager.require!
That should cause it to re-evaluate config.remotes and fetch any remotes that are missing.
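For illustration, here is a minimal sketch of that second approach, using the PPEKit::Product model and on_create callback that appear in the question above. The include Origen::Model line is an assumption about how the model is set up; Origen.remote_manager.require! is the only call taken from the answer itself:
module PPEKit
  class Product
    include Origen::Model   # assumed; use whatever your model already includes

    def on_create
      # By this point the target has been loaded, so the dut guard in
      # config.remotes above returns the real remotes list, and this call
      # fetches any remotes that are not already populated in the workspace
      Origen.remote_manager.require!
    end
  end
end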

Related

Cannot create web documentation from on_flow_end callback

I'm trying to set up Origen to automatically generate web documentation after generating a flow with the origen p command, so I added the following callback:
def on_flow_end(options)
  OrigenDocHelpers.generate_flow_docs layout: "#{Origen.root}/templates/web/layouts/_basic.html.erb", tab: :flows do |d|
    d.page flow: "#{options[:test_module]}_flow".to_sym,
           name: "#{options[:test_module].to_s.upcase} Flow",
           target: "#{Origen.target.name}.rb"
  end
end
This causes an error when building the flow page:
[INFO] 14.641[0.005] || Building flow page: /users/chorton/osdk/ppekit/web/content/flows/pcie_flow.md
COMPLETE CALL STACK
-------------------
Can't find: partials/_navbar.html
/home/chorton/.origen/gems/ruby/2.3.0/bundler/gems/origen-7bf48a874995/lib/origen/file_handler.rb:137:in `clean_path_to'
/home/chorton/.origen/gems/ruby/2.3.0/bundler/gems/origen-7bf48a874995/lib/origen/file_handler.rb:226:in `rescue in clean_path_to_sub_template'
/home/chorton/.origen/gems/ruby/2.3.0/bundler/gems/origen-7bf48a874995/lib/origen/file_handler.rb:213:in `clean_path_to_sub_template'
/home/chorton/.origen/gems/ruby/2.3.0/bundler/gems/origen-7bf48a874995/lib/origen/generator/renderer.rb:8:in `render'
(erb):4:in `_get_binding'
There is no error if I separately call origen p (without the added callback) and then run:
origen web compile --remote --api
Is it possible to combine the two into one command with a callback like I'm trying to do, or is it necessary for origen web compile to be called after origen p?
Thanks.
This does seem like a bug; please open a ticket for it here if you cannot get it resolved: https://github.com/Origen-SDK/origen/issues
I would say that the convention normally used to do this is to hook into the after_web_site_compile callback in your config/application.rb.
Here is an example:
# config/application.rb
def after_web_site_compile(options)
  # Build the test flow docs
  Origen.environment.temporary = 'v93k.rb'
  # Generate the program for the target(s) you want to document
  %w(device_a device_b).each do |target|
    Origen.target.temporary = "#{target}.rb"
    Origen.app.runner.launch action: :program,
                             files: 'program/full.list' # Or whatever file arg you pass to 'origen p'
    OrigenDocHelpers.generate_flow_docs layout: "#{Origen.root}/templates/web/layouts/_basic.html.erb", tab: :flows do |d|
      d.page flow: "#{options[:test_module]}_flow".to_sym,
             name: "#{options[:test_module].to_s.upcase} Flow",
             target: "#{target}.rb"
    end
  end
end

Access type params from puppet provider

I'm trying to create a Perforce custom type, devtrack, but am stuck at the prefetch stage. There I am trying to use my instances class method to find the correct provider:
def self.prefetch(resources)
  instances.each do |prov|
    if resource = resources[prov.name]
      resource.provider = prov
    end
  end
end
and in the instances class method I try to find all clients on the current host by running the command
p4 workspaces -u
using the code below:
def self.get_list_of_workspaces_on_host(host)
  ws_strs = p4(['workspaces', '-u', <USERNAME>]).split("\n")
  ws_strs.select { |str| str.include?(host) }.map { |ws| ws.split[1] }
end

def self.get_workspace_properties(ws)
  md = /^(\w*)_.*_(main|\d{2})_managed$/.match(ws)
  ws_props = {}
  ws_props[:ensure] = :present
  ...
  ws_props
end

def self.instances
  host = `hostname`.strip
  get_list_of_workspaces_on_host(host).collect do |ws|
    ws_props = get_workspace_properties(ws)
    new(ws_props)
  end
end
and the p4 command is defined like this:
has_command(:p4, "/usr/bin/p4") do
  environment :P4PORT => <PERFORCE SERVER>, :P4USER => <USERNAME>
end
The problem I have is that for any p4 command to work I need to access the server, which is specified in the type:
devtrack { '36': source => '<PERFORCE SERVER>' }
But how can I access this value from prefetch? The problem being that prefetch is a class method and thus cannot access the #properties_hash or the resource hash. Is there a way to get around this? Am I designing this completely wrong?
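One common workaround, sketched below and not tested here, is to read the parameter straight out of the resources hash that Puppet passes to prefetch, since each value in that hash is a catalog resource and exposes its parameters via []. The instances_for helper is hypothetical, not part of the Puppet API:
def self.prefetch(resources)
  # resources maps resource titles to catalog resources, and a catalog
  # resource exposes its parameters through [], even from a class method
  server = resources.values.map { |r| r[:source] }.compact.first

  # Hypothetical: an instances variant that takes the server explicitly
  # instead of relying on the hard-coded environment in has_command
  instances_for(server).each do |prov|
    if resource = resources[prov.name]
      resource.provider = prov
    end
  end
end
The trade-off is that the zero-argument self.instances is what puppet resource devtrack relies on, so command-line listing would still need to get the server from somewhere else (e.g. an environment variable).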

Custom type/provider not ensurable

I'm trying to create a new custom type/provider that is not ensurable.
I've already checked the exec and augeas types, but I couldn't figure out clearly how the integration between type and provider works when we don't define the ensurable mode.
Type:
Puppet::Type.newtype(:ptemplates) do
  newparam(:name) do
    desc ""
    isnamevar
  end

  newproperty(:run) do
    defaultto 'now'
    # Actually execute the command.
    def sync
      provider.run
    end
  end
end
Provider:
require 'logger'

Puppet::Type.type(:ptemplates).provide(:ptemplates) do
  desc ""

  def run
    log = Logger.new(STDOUT)
    log.level = Logger::INFO
    log.info("x.....................................")
  end
end
But I don't know why the provider is being executed twice:
root@puppet:/# puppet apply -e "ptemplates { '/tmp': }" --environment=production
Notice: Compiled catalog for puppet.localhost in environment production in 0.12 seconds
I, [2017-07-30T11:00:15.827103 #800] INFO -- : x.....................................
I, [2017-07-30T11:00:15.827492 #800] INFO -- : x.....................................
Notice: /Stage[main]/Main/Ptemplates[/tmp]/run: run changed 'true' to 'now'
Notice: Applied catalog in 4.84 seconds
Also, I had to define the defaultto to force the execution of the provider.run method.
What am I missing?
Best Regards.
First you should spend some time reading this blog post, http://garylarizza.com/blog/2013/11/25/fun-with-providers/, and the two that follow it by Gary Larizza. They give a very good introduction to Puppet types and providers.
Your log statement is executed twice: once when the def sync in your type calls the provider's run method, and again when Puppet tries to determine the current value of your run property.
In order to write a type/provider that is not ensurable you need to do something like:
Type:
Puppet::Type.newtype(:ptemplates) do
#doc = ""
newparam(:name, :namevar => true) do
desc ""
end
newproperty(:run) do
desc ""
newvalues(:now, :notnow)
defaultto :now
end
end
Provider:
Puppet::Type.type(:ptemplates).provide(:ruby) do
  desc ""

  def run
    # Determine the current value of run (:now or :notnow) and return it
  end

  def run=(value)
    # Do something to set the value of run
  end
end
Note that all type providers must be able to determine the value of the property and be able to set it. The difference between an ensurable and a non-ensurable type/provider is that the ensurable type/provider is able to create and destroy the resource, e.g. add or remove a user. A type/provider that is not ensurable cannot create or destroy what it manages, e.g. SELinux: you can set its value, but you cannot remove SELinux.
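As an illustration of that getter/setter split, here is a minimal sketch of what the provider above could look like if the run property were backed by a marker file. The file path and the marker-file idea are made up for the example, not something the ptemplates type requires:
require 'fileutils'

Puppet::Type.type(:ptemplates).provide(:ruby) do
  desc "Sketch: backs the run property with a marker file"

  # Hypothetical location used to remember whether run is :now or :notnow
  def marker_file
    '/tmp/ptemplates_last_run'
  end

  # Getter: Puppet calls this to read the current value of the property
  def run
    File.exist?(marker_file) ? :now : :notnow
  end

  # Setter: only called when the value returned above differs from the catalog value
  def run=(value)
    if value.to_s == 'now'
      FileUtils.touch(marker_file)
    else
      FileUtils.rm_f(marker_file)
    end
  end
end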

Using Pickle with spork?

Pickle doesn't seem to be loading for me when I'm using spork...
If I run my cucumber normally, the step works as expected:
➜ bundle exec cucumber
And a product exists with name: "Windex", category: "Household Cleaners", description: "nasty bluish stuff" # features/step_definitions/pickle_steps.rb:4
But if I run it through spork, I get an undefined step:
You can implement step definitions for undefined steps with these snippets:
Given /^a product exists with name: "([^"]*)", category: "([^"]*)", description: "([^"]*)"$/ do |arg1, arg2, arg3|
  pending # express the regexp above with the code you wish you had
end
What gives?
So it turns out there is an extra config line necessary in features/support/env.rb when using Spork in order for Pickle to be able to pick up on AR models, a la this gist:
In features/support/env.rb
Spork.prefork do
  ENV["RAILS_ENV"] ||= "test"
  require File.expand_path(File.dirname(__FILE__) + '/../../config/environment')

  # So that Pickle's AR adapter properly picks up the application's models.
  Dir["#{Rails.root}/app/models/*.rb"].each { |f| load f }

  # ...
end
Adding in this line fixes my problem. This is more of a Spork issue than a Guard issue, per se. I'll update my question...
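For completeness, a features/support/env.rb skeleton with Spork usually ends up shaped roughly like this (a sketch; the prefork requires and the each_run contents depend on your app and cucumber-rails version):
require 'rubygems'
require 'spork'

Spork.prefork do
  # Everything loaded here stays loaded in the Spork DRb server between runs
  ENV["RAILS_ENV"] ||= "test"
  require File.expand_path(File.dirname(__FILE__) + '/../../config/environment')

  # ... the rest of the requires from the generated cucumber env.rb ...

  # So that Pickle's AR adapter properly picks up the application's models.
  Dir["#{Rails.root}/app/models/*.rb"].each { |f| load f }
end

Spork.each_run do
  # Anything that must be re-evaluated on every feature run goes here
end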

How can I add the build version to a scons build

At the moment I'm using some magic to get the current git revision into my scons builds. I just grab the version and stick it into CPPDEFINES.
It works quite nicely... until the version changes and scons wants to rebuild everything, rather than just the files that have changed, because the define that all files use has changed.
Ideally I'd use a custom builder to generate a file called git_version.cpp and just have a function in there that returns the right tag. That way only that one file would be rebuilt.
Now I'm sure I've seen a tutorial showing exactly how to do this .. but I can't seem to track it down. And I find the custom builder stuff a little odd in scons...
So any pointers would be appreciated...
Anyway just for reference this is what I'm currently doing:
import subprocess

# Let's get the version from git
# first get the base version
git_sha = subprocess.Popen(["git", "rev-parse", "--short=10", "HEAD"], stdout=subprocess.PIPE).communicate()[0].strip()
p1 = subprocess.Popen(["git", "status"], stdout=subprocess.PIPE)
p2 = subprocess.Popen(["grep", "Changed but not updated\\|Changes to be committed"], stdin=p1.stdout, stdout=subprocess.PIPE)
result = p2.communicate()[0].strip()
if result != "":
    git_sha += "[MOD]"
print "Building version %s" % git_sha
env = Environment()
env.Append(CPPDEFINES={'GITSHAMOD': '"\\"%s\\""' % git_sha})
You don't need a custom Builder since this is just one file. You can use a function (attached to the target version file as an Action) to generate your version file. In the example code below, I've already computed the version and put it into an environment variable. You could do the same, or you could put your code that makes git calls in the version_action function.
version_build_template = """/*
 * This file is automatically generated by the build process
 * DO NOT EDIT!
 */
const char VERSION_STRING[] = "%s";
const char* getVersionString() { return VERSION_STRING; }
"""

def version_action(target, source, env):
    """
    Generate the version file with the current version in it
    """
    contents = version_build_template % (env['VERSION'].toString())
    fd = open(target[0].path, 'w')
    fd.write(contents)
    fd.close()
    return 0

build_version = env.Command('version.build.cpp', [], Action(version_action))
env.AlwaysBuild(build_version)
