I am trying to create a Bazel project which includes cucumber-cpp, but I could not figure out what its BUILD file would look like.
As Google Test now includes its own BUILD file, it is as easy as it gets. Something similar would be nice.
My WORKSPACE file looks like this
load("#bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")
http_archive(
name = "googletest",
sha256 = "927827c183d01734cc5cfef85e0ff3f5a92ffe6188e0d18e909c5efebf28a0c7",
strip_prefix = "googletest-release-1.8.1",
url = "https://github.com/google/googletest/archive/release-1.8.1.zip",
)
http_archive(
name = "cucumber-cpp",
sha256 = "73fddda099e39cc51ebee99051047067f6dcd437fbde60601ac48cb82a903dac",
url = "https://github.com/cucumber/cucumber-cpp/archive/v0.5.zip",
)
My specification BUILD file
cc_test(
    name = "app-spec",
    srcs = glob(["**/*.cpp"]),
    deps = [
        "//src:app-lib",
        "@cucumber-cpp//:main",  # do not know if this is correct
    ],
)
Test BUILD file
cc_test(
    name = "app-test",
    srcs = glob(["**/*.cpp"]),
    deps = [
        "//src:app-lib",
        "@googletest//:gtest_main",
    ],
)
But obviously cucumber-cpp is not built, so I wonder: what would its Bazel BUILD file look like?
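I imagine the http_archive above would at least need a build_file or build_file_content so Bazel knows what to build. A rough, untested guess (the strip_prefix value and the glob patterns are assumptions about the archive layout, not something I have verified):
http_archive(
    name = "cucumber-cpp",
    sha256 = "73fddda099e39cc51ebee99051047067f6dcd437fbde60601ac48cb82a903dac",
    url = "https://github.com/cucumber/cucumber-cpp/archive/v0.5.zip",
    strip_prefix = "cucumber-cpp-0.5",  # guess at the top-level directory in the zip
    build_file_content = """
cc_library(
    name = "main",
    srcs = glob(["src/**/*.cpp"]),      # guess at the source layout
    hdrs = glob(["include/**/*.hpp"]),  # guess at the header layout
    includes = ["include"],
    visibility = ["//visibility:public"],
)
""",
)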
I also wanted to do this, but couldn't find anything where anyone had attempted it. In the end I wrote a dedicated Bazel extension for using Cucumber and Gherkin feature specs. Currently this only supports (linux|osx)+cpp+cucumber, but I may add support for Windows and other languages further down the line. To use it, add this to your WORKSPACE file:
load("#bazel_tools//tools/build_defs/repo:git.bzl", "git_repository")
git_repository(
name = "rules_gherkin",
commit = "ef361f40f9716ad8a3c6a8a21111bb80d4cbd927", # Update this to match latest commit
remote = "https://github.com/silvergasp/rules_gherkin.git"
)
load("#rules_gherkin//:gherkin_deps.bzl","gherkin_deps")
gherkin_deps()
load("#rules_gherkin//:gherkin_workspace.bzl","gherkin_workspace")
gherkin_workspace()
An example BUILD file would look like this:
load("//gherkin:defs.bzl", "gherkin_library", "gherkin_test")
gherkin_library(
name = "feature_specs",
srcs = glob(["**/*.feature"]),
)
gherkin_test(
name = "calc_test",
steps = ":calculator_steps",
deps = [":feature_specs"],
)
load("//gherkin:defs.bzl", "cc_gherkin_steps")
cc_gherkin_steps(
name = "calculator_steps",
srcs = [
"CalculatorSteps.cpp",
],
visibility = ["//visibility:public"],
deps = [
"//examples/Calc/src:calculator",
"#cucumber_cpp//src:cucumber_main",
"#gtest",
],
)
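Assuming the BUILD file above sits in the package with the feature files, the specs then run like any other test target:
bazel test :calc_test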
A complete example can be found here.
I am new to Bazel and have some questions.
I have defined a library xxx like this:
cc_library(
    name = "xxx",
    srcs = glob(["lib/*.c"]),
    hdrs = glob(["include/*.h"]),
    copts = ["-Iinclude -Werror"],
)
Using pkg_tar, I saw that the :xxx target produces files that include both the .so and the .a:
load("#bazel_tools//tools/build_defs/pkg:pkg.bzl", "pkg_tar")
pkg_tar(
name = "libxxx",
package_dir = "/usr/lib/",
srcs = [":xxx"],
mode = "0644",
)
I want to get only the static library (.a). How do I do that? For the moment, I have found only this solution:
pkg_tar(
    name = "libxxx-static",
    package_dir = "/usr/lib/",
    srcs = [":xxx"],
    # FIXME
    strip_prefix = "libxxx.so",
    mode = "0644",
)
How do I get only one file into the target's files?
You can use the linkstatic attribute of cc_library():
The linkstatic attribute has a different meaning if used on a cc_library() rule. For a C++ library, linkstatic=True indicates that only static linking is allowed, so no .so will be produced. linkstatic=False does not prevent static libraries from being created. The attribute is meant to control the creation of dynamic libraries.
Like so:
cc_library(
    name = "xxx",
    srcs = glob(["lib/*.c"]),
    hdrs = glob(["include/*.h"]),
    copts = ["-Iinclude -Werror"],
    linkstatic = True,
)
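With linkstatic = True only the .a is produced, so the pkg_tar from the question should then package just the static library, without the strip_prefix workaround (a sketch reusing the names above):
pkg_tar(
    name = "libxxx-static",
    package_dir = "/usr/lib/",
    srcs = [":xxx"],  # with linkstatic = True this should now only contain libxxx.a
    mode = "0644",
)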
I'm trying to create a Bazel rule that will update the version number in package.json before packing with npm_package.
In short I want to take packages/server/package.tpl.json and create an output package.json that I can depend on in npm_package.
I've tried a bunch of different ways, which produced errors such as read-only file system, no such attribute 'out' in 'stamp_package_json' rule, and rule 'package_json' has file 'package.json' as both an input and an output, and currently the error The following files have no generating action: packages/server/package.json.
My project structure looks like:
/
  /packages
    /server
      /src
        BUILD.blaze
      BUILD.blaze
      package.tpl.json
  /tools
    /npm
      BUILD.blaze
      stamp_package_json.bzl
This is a monorepo, so it has more packages than just server.
In packages/server/BUILD.blaze I use two rules:
package(default_visibility = ["//visibility:public"])

load("@build_bazel_rules_nodejs//:defs.bzl", "npm_package")
load("//tools/npm:stamp_package_json.bzl", "stamp_package_json")

stamp_package_json(
    name = "package_json",
    package_json = "package.tpl.json",
    out = "package.json",
)

npm_package(
    name = "red-server_package",
    deps = [
        ":package_json",
        "//packages/server/src:shared-red-server-library",
    ],
    replacements = {"//packages/": "//"},
)
If I rename package.tpl.json to package.json and just include that file in npm_package it works as expected, except that the version is incorrect.
The stamp_package_json rule is defined in tools/npm/stamp_package_json.bzl:
def _impl(ctx):
    package_json = ctx.file.package_json

    # The command may only access files declared in inputs.
    ctx.actions.run_shell(
        inputs = [package_json],
        outputs = [ctx.outputs.executable],
        arguments = [package_json.path],
        progress_message = "Stamping package.json file %s" % package_json.short_path,
        command = "jq '.version=\"123\"' $1 > $@",
    )

stamp_package_json = rule(
    implementation = _impl,
    executable = True,
    attrs = {
        "package_json": attr.label(allow_single_file = True),
        "out": attr.output(mandatory = True),
    },
)
As mentioned above it currently throws an error:
The following files have no generating action: packages/server/package.json
I can't seem to figure out how to deal with this. Or if my approach is any good. Or if this can be achieved in any other way.
edit: Wrote a blog post about the solution I ended up with: https://medium.com/red-flag/developer-diary-day-1-bazel-build-system-with-monorepo-and-typescript-6f7a5a0a2b00
Looking into the blog post mentioned in the question, the work of transforming package.tpl.json to package.json can be done with a genrule
genrule(
    name = "package_json",
    srcs = ["package.tpl.json"],
    outs = ["package.json"],
    cmd = "jq --arg version $$(cat $(GENDIR)/../../volatile-status.txt | sed -nE 's/^BUILD_SCM_VERSION v([0-9.]+).*$$/\\1/p') '.version=$$version' <$< > $@",
    stamp = True,
)

npm_package(
    name = "shared-red-server-library_package",
    deps = [
        ":package_json",
        ":shared-red-server-library",
    ],
    replacements = {"//shared_red_node_library/packages/server/": "//"},
)
This looks like a good solution, except that it brings an external dependency on the Unix tools jq and sed, so the build fails if either is missing or if the environment has an incompatible version of them.
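Note also that BUILD_SCM_VERSION is not something Bazel writes into volatile-status.txt by default; it has to come from a custom workspace status command. A minimal sketch of such a script (the path tools/bazel_status.py and the v-prefixed tag format are assumptions, not part of the original setup):
#!/usr/bin/env python
# tools/bazel_status.py (hypothetical) -- wire it up with:
#   bazel build --workspace_status_command=tools/bazel_status.py //...
# Keys that do not start with STABLE_ end up in volatile-status.txt.
import subprocess

version = subprocess.check_output(
    ["git", "describe", "--tags", "--always"]).decode().strip()
print("BUILD_SCM_VERSION %s" % version)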
I would like to run my py_test with python 3 in Bazel.
py_library(
    name = "foo",
    srcs = ["foo.py"],
)

py_test(
    name = "foo_test",
    srcs = glob(["foo_test.py"]),
    deps = [":foo"],
)

py_runtime(
    name = "python-3.6.3",
    files = [],
    interpreter_path = "/usr/local/bin/python3",
)
I was able to achieve this using the command:
bazel test --python_top=//path/to/foo:python-3.6.3 foo_test
However, I would like to pull Python 3 into the Bazel sandbox with new_http_archive and have the py_runtime's interpreter_path point to that archive within the sandbox. So far I am not able to figure out what the interpreter_path should be... Do I have to reference the http_archive label from the py_runtime, or somewhere else?
new_http_archive(
    name = "python_version",
    urls = ["https://www.python.org/ftp/python/3.6.3/Python-3.6.3.tgz"],
    strip_prefix = "Python-3.6.3",
    build_file_content = """
py_library(
    name = "python_srcs",
    srcs = glob(["Lib/*.py"]),
    visibility = ["//visibility:public"],
)""",
)
The tgz that you're downloading doesn't contain an interpreter. It contains the source code for the interpreter. If you want to build the interpreter as part of your build, you could do something like this
new_http_archive(
    name = "python_version",
    urls = ["https://www.python.org/ftp/python/3.6.3/Python-3.6.3.tgz"],
    strip_prefix = "Python-3.6.3",
    build_file_content = """
genrule(
    name = "build_python",
    srcs = glob(["**"]),
    outs = ["python"],
    cmd = "./external/python_version/configure && make && cp python $@",
    visibility = ["//visibility:public"],
)""",
)
And then your py_runtime rule would set the interpreter attribute (not interpreter_path):
py_runtime(
    name = "python-3.6.3",
    files = [],
    interpreter = "@python_version//:python",
)
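With that in place, the same flag from the question should select the freshly built interpreter (assuming the py_runtime above still lives in //path/to/foo):
bazel test --python_top=//path/to/foo:python-3.6.3 foo_test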
We updated from scons 2.4.1 to 2.5.1 and suddenly get several errors like so:
scons: *** Found dependency cycle(s):
Internal Error: no cycle found for node ...
The issue, I believe, pertains to a version file that we attempt to automatically update if our SCM detects an edit to source files. The gist of the process is that we maintain a file 'version.cfg' that has #defines. This file is checked into our SCM. If the file has already been updated once, it will not be updated a second time until the file is committed to the SCM. This file is then used to autogenerate a C++ header file named 'kb_version.hh'.
What is the cyclic dependency and how can I eliminate it? (Note: whatever the issue is, it did not cause problems in scons 2.4.1; only the new version 2.5.1 detects the cyclic dependency.)
The relevant scons snippet is below:
SRCDIR = '../../src'

SRCS = [
    'kb.cc',
]

SOURCE = [ os.path.join(SRCDIR, s) for s in SRCS ]

SCRIPT_VERSION_GEN = os.path.join(env['_ROOT'], 'kb/build/scripts/versionGen.sh')
SCRIPT_VERSION_UPD = os.path.join(env['_ROOT'], 'kb/build/scripts/versionUpdate.sh')

FILE_VERSION_CFG = 'version.cfg'
FILE_VERSION_HH = 'kb_version.hh'

scriptVerGen = env.File(SCRIPT_VERSION_GEN)
scriptVerUpd = env.File(SCRIPT_VERSION_UPD)

verCfg = env.File(os.path.join(SRCDIR, FILE_VERSION_CFG))
verHH = env.File(os.path.join(SRCDIR, FILE_VERSION_HH))

## this command detects for change in source files, then updates, when necessary, the source version.cfg
env.Command(
    target = verCfg,
    source = [ SOURCE, scriptVerUpd ],
    action = [ scriptVerUpd.path + ' ' + env['BS_DIR_SRCROOT'] + '/kb/foo' + verCfg.srcnode().path,
               Copy(verCfg.path, verCfg.srcnode().path) ]
)

env.Command(
    target = verHH,
    source = [ verCfg, scriptVerGen ],
    action = scriptVerGen.path + ' ' + verHH.path + ' ' + verCfg.path
)
A few inline questions (not easy to do in the comments section above).
(See ## comments below)
SRCDIR = '../../src'

SRCS = [
    'kb.cc',
]

SOURCE = [ os.path.join(SRCDIR, s) for s in SRCS ]

## Is _ROOT the top of your tree where SConstruct lives?
SCRIPT_VERSION_GEN = os.path.join(env['_ROOT'], 'kb/build/scripts/versionGen.sh')
SCRIPT_VERSION_UPD = os.path.join(env['_ROOT'], 'kb/build/scripts/versionUpdate.sh')

FILE_VERSION_CFG = 'version.cfg'
FILE_VERSION_HH = 'kb_version.hh'

scriptVerGen = env.File(SCRIPT_VERSION_GEN)
scriptVerUpd = env.File(SCRIPT_VERSION_UPD)

verCfg = env.File(os.path.join(SRCDIR, FILE_VERSION_CFG))
verHH = env.File(os.path.join(SRCDIR, FILE_VERSION_HH))

## this command detects for change in source files, then updates, when necessary, the source version.cfg
env.Command(
    target = verCfg,
    source = [ SOURCE, scriptVerUpd ],
    ## Why not do this?
    action = [ '$SCRIPT_VERSION_UPD $BS_DIR_SRCROOT /kb/foo ' + verCfg.srcnode().path,  ## Why srcnode()?
               ## Why do this?
               Copy(verCfg.path, verCfg.srcnode().path) ]
)

env.Command(
    target = verHH,
    source = [ verCfg, scriptVerGen ],
    ## How about this change
    action = '$SCRIPT_VERSION_GEN $TARGET ' + verCfg.path
)
I was able to track down the problem, though I am not sure if this is desirable behavior from scons.
The issue is this.
(1) C++ source file depends on version.h (by way of #include)
(2) version.h is autogenerated from version.cfg
(3) version.cfg is auto-incremented if any source file has been updated, which includes source files that depend on version.h (thus the cyclic dependency). However, no actual change may have happened to the C++ source file; the file merely #includes version.h. So our intent is that version.cfg should be updated if and only if the source file itself has changed, not when the object file that results from compiling the source file changes.
As noted in the comments, the cyclic dependency can be eliminated by removing the SOURCE variable from the "source =" line in the first env.Command() call (see the sketch below).
Is there a way in scons to say I depend on the source file, but not the object file? Or is this a bug/nuance in the cycle detection?
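For reference, that workaround looks roughly like this (a sketch only; it reuses the verCfg, scriptVerUpd, and env objects from the snippet above, and whether version.cfg still gets refreshed on source edits then depends entirely on versionUpdate.sh):
## Workaround sketch: the C++ sources are no longer listed as sources of
## version.cfg, which removes the cycle reported by scons 2.5.1.
env.Command(
    target = verCfg,
    source = [ scriptVerUpd ],
    action = [ scriptVerUpd.path + ' ' + env['BS_DIR_SRCROOT'] + '/kb/foo' + verCfg.srcnode().path,
               Copy(verCfg.path, verCfg.srcnode().path) ]
)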
It seems -u doesn't work for me (I am using scons-2.3.6).
To simplify the context, you can imagine my project structure like this:
+root
    +project
        - bar.vcxproj (generated vs project)
    - SConstruct
    - bar.c
Inside SConstruct, I have put code like:
env_base = Environment()
...
env_base.StaticLibrary(target = 'bar', source = ['bar.c'])
...
If I execute the command "scons" in the root folder, everything works perfectly.
But if I execute the command "scons -u" in the project folder, scons can find my SConstruct up in the root folder, but no files get compiled.
BTW: the reason I execute "scons -u" in the project folder is that I want to put the generated vsproj in the project folder and use BuildCommandLine to compile the project.
I guess I didn't use "-u" correctly; what would be the elegant solution for my situation?
1st edit:
As bdbaddog asked, I have put the SConstruct here:
def BuildConfig(env, config):
    env.Append(CCFLAGS = '/W 4')
    env.Append(CCFLAGS = '/WX')
    if config == "debug":
        env.Append(CCFLAGS = '/DEBUG')
        #env.Append(CCFLAGS = '-Zi /Fd${TARGET}.pdb')
        env.Append(CCFLAGS = '/Z7')
    elif config == "release":
        pass

env_base = Environment()

lib = env_base.StaticLibrary(target = 'bar', source = ['bar.c'])

opts = Variables()
opts.Add('target', 'Compile Target (debug/release).', "debug")
# there is more in my project....
opts.Update(env_base)  # update environment

# here I want to use my own command to build the project, so it can support different build options that are defined by me.
env_base['MSVSBUILDCOM'] = "scons -u target=$(Configuration)"

target = env_base["target"]
BuildConfig(env_base, env_base['target'])

env_base.MSVSProject(target = "project\\bar" + env_base['MSVSPROJECTSUFFIX'],
                     srcs = ["..\\bar.c"],
                     incs = [],
                     localincs = "",
                     resources = "",
                     misc = "",
                     buildtarget = lib,
                     variant = ['debug'],
                     auto_build_solution = 0)
SCons only builds files under the current directory by default.
If you wanted to build only the files in a certain directory (for which there are rules that build the targets there), you can invoke SCons as follows:
scons the_target_directory_I_want_to_build
Though this may cause sources for targets in that directory to also be built.
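For the MSVSBUILDCOM case in the question, one variation worth trying (a sketch, not something verified here) is -D, which searches upward for the SConstruct like -u but builds all default targets regardless of the current directory:
# Assumption: Visual Studio invokes this from the project/ folder.
# -D walks up to the SConstruct and builds all Default() targets,
# whereas -u restricts building to targets at or below the current directory.
env_base['MSVSBUILDCOM'] = "scons -D target=$(Configuration)"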