Suppose I have a top-level file that I pass to my compiler, containing:
`include "my_defines.sv"
`include "my_component.sv"
Inside "my_component.sv" file, I am using some defines from "my_defines.sv", like this:
my_variable = `CONSTANT_FROM_MY_DEFINES;
The question is the following: do I need to have `include "my_defines.sv" inside "my_component.sv"? Perhaps this requirement is compiler-specific?
If your "my_defines.sv" has an include guard, then it is safe, and better, to include "my_defines.sv" in every file that uses its macros. The include guard at the top of "my_defines.sv" will look like this:
`ifndef MY_DEFINES_SV
`define MY_DEFINES_SV
// put your own defines here ...
`endif
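For example, "my_component.sv" can then include the guarded file itself. This is only a minimal sketch, and it assumes `CONSTANT_FROM_MY_DEFINES is defined inside the guarded region of "my_defines.sv":

// my_component.sv
`include "my_defines.sv"   // safe even if the top-level file already included it, thanks to the guard

module my_component;
  logic [7:0] my_variable;
  initial my_variable = `CONSTANT_FROM_MY_DEFINES;
endmodule

This way "my_component.sv" compiles on its own and no longer depends on the top-level file getting the include order right.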
`include directives like that are equivalent to copying and pasting the included file at the point of the `include. The compiler:
Reads the file you give it.
When it encounters an `include, it reads the included file.
When it has finished that file, it continues with the original file.
The result is that the compiler sees one big flat file.
In your example, you can use macros from "my_defines.sv" in "my_component.sv" because the definitions appear earlier in that flattened file.
The problem with doing a lot of this is that eventually you'll end up with conflicts. Maybe two things reference each other (which `include comes first?), two things use the same name (clashing definitions), or multiple files contain the same `include statement (multiple definitions of the same thing).
Packages solve those problems. Once things start getting a little more complex, look into them.
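As a rough sketch (the package name and constants below are invented for illustration, not taken from the question), a package gives each constant a real name and scope instead of relying on the preprocessor:

// my_defines_pkg.sv -- hypothetical package replacing a file full of `defines
package my_defines_pkg;
  localparam int DATA_W         = 8;    // example constants, made up for this sketch
  localparam int TIMEOUT_CYCLES = 100;
endpackage

// any file that needs the constants imports the package explicitly
module my_component;
  import my_defines_pkg::*;
  logic [DATA_W-1:0] my_variable;
endmodule

The package typically has to be compiled before the modules that import it, but because imports are explicit and scoped, the include-order and name-clash problems above largely disappear.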
It depends on the order in which your source files are compiled. Because you are referring specifically to `define macros, which are global, the macro definitions must be compiled before the macros are used. In your case, you do not need to include "my_defines.sv" inside "my_component.sv", since "my_defines.sv" was already compiled in your top file.
Macro definitions persist across files, but only to the end of the compilation unit. Simulators must support two different methods of assigning source files to compilation units, and it is hard to get `include files full of `defines to compile correctly under both methods.
It is better to use parameters or const variables for constants. Since parameters and constants follow normal scoping rules, you can safely include them in every file/scope that needs them (see the sketch below). Then it does not matter how the code is broken into compilation units; it always compiles. I also think it is easier to find the definitions when you're browsing the code, because the `include is probably in the same file rather than off in some other, unrelated file.
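A minimal sketch of that pattern, with made-up file and parameter names; the header contains declarations rather than `define macros, and each module pulls it into its own scope:

// my_params.svh -- hypothetical header of shared constants
localparam int unsigned FIFO_DEPTH = 16;
localparam int unsigned ADDR_W     = $clog2(FIFO_DEPTH);

// my_fifo.sv -- the localparams become local to this module's scope
module my_fifo;
  `include "my_params.svh"
  logic [ADDR_W-1:0] wr_ptr;
endmodule

Because the constants are scoped to each module that includes them, nothing leaks across compilation-unit boundaries.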
You have to add `include "my_defines.sv" in "my_component.sv".
Best practice is to gather all such definitions in one package and use that package in each file that needs it.
I'm trying to build a Verilog file that imports global definitions from a defines file so I can keep track of all my FPGA endpoints in one place. In the my_defines file I have a list of variables like so:
`define PipeA 8'hA1
I import this file into the main file top_module using `include "my_defines.v"
When I instantiate the variable inside my top_module file, I noticed that you have to use `PipeA as the variable name instead of PipeA. If I've already imported this, why do I need the `?
`include is a Verilog compiler directive which tells the compiler to include the contents of another file in the compilation. It is very similar to #include in C.
`define defines a named text replacement (macro), similar to #define in C.
So, `define PipeA 8'hA1 defines a macro named PipeA with 8'hA1 as its content. To use it in a program you need to follow the Verilog rules and use the back-tick syntax, as here: `PipeA.
An example
assign myVar[7:0] = `PipeA;
The preprocessor will replace `PipeA with the text from its definition:
assign myVar[7:0] = 8'hA1;
The result is then parsed as ordinary Verilog.
Macro definitions are considered global. Macro expansion happens before any Verilog analysis and is orthogonal to the scoping rules. So, no matter where you define the macro, inside a scope or outside one, it is defined from that point on everywhere within the compilation unit.
Also, standard Verilog does not have any concept of import. SystemVerilog does, but that has nothing to do with the above.
There's a big difference between `include and import: import is something only SystemVerilog allows. `define performs text substitution in a pre-processing step, without understanding any Verilog syntax. `PipeA invokes a text-substitution macro; it is not a variable name. There is no global namespace as far as Verilog is concerned.
SystemVerilog has packages, which are namespaces that can be imported into a module (or another package).
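For example, here is a minimal sketch reusing the PipeA name from the question (the package name endpoints_pkg is invented): the macro becomes a package constant, which is then used as an ordinary identifier after import, with no back-tick.

// endpoints_pkg.sv -- hypothetical package holding the endpoint codes
package endpoints_pkg;
  localparam logic [7:0] PipeA = 8'hA1;
endpackage

module top_module;
  import endpoints_pkg::*;
  wire [7:0] myVar;
  assign myVar = PipeA;   // plain identifier: PipeA is a parameter, not a macro
endmodule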
For the TypeScript ANTLR target that Sam and I have been working on, I would like the code generation tool to create a single TypeScript file holding all the classes generated from a named grammar input. Is this output file structure going to be hard to support?
So, for example, I'd like Expr.g4 -> Expr.g4.ts. That one TypeScript file could contain named exports for the ExprLexer, ExprParser, and ExprListener classes, visitor code if requested, and maybe even some loose factory functions, etc.
I've been looking into the source code under tool/src/org/antlr/v4/codegen to find out how the number and names of the output files are determined, in particular CodeGenPipeline.java. This class works in conjunction with the language-specific target class, but the pipeline has a lot (perhaps too much) knowledge of possible output files built into it. None of what I see in CodeGenPipeline.java seems well matched to my 1:1 input-to-output file model.
It seems like the knowledge of which files should be generated for a given language target should come from the language's .stg file if possible, but I can't find any evidence that this approach has been implemented. Can anyone fill me in on any reasons why it hasn't been tried or wouldn't work?
In CMake, we can use find_dependency() in a package's -config.cmake file, which "forwards the correct parameters for QUIET and REQUIRED which were passed to the original find_package() call." So, naturally, we'll want to do that instead of calling find_package() in such files.
Also, for a dependency on a threads library, CMake offers us the FindThreads module, so that we write include(FindThreads), preceded by some preference-setting variables, and get a bunch of interesting variables set. So, that seems preferable to find_package(Threads).
And thus we have a dilemma: What to put in -config.cmake files, for a threads library dependency? The former, or the latter?
Following a discussion in the comments with @Tsyarev, it seems that:
find_package(Threads) includes the FindThreads module internally,
... which means it "respects" the preference variables affecting FindThreads' behavior.
So it makes sense, functionally and aesthetically, to just use find_package() in your main CMakeLists.txt and find_dependency() in the -config.cmake file.
Why are there Verilog verification files not in the form of a module?
The files I see start with just initial begin, and some file names use the .inc extension.
It is common to include files of arbitrary content into Verilog modules. This is done using the `include compiler directive, as described in IEEE Std 1800-2012, section "22.4 `include":
The file inclusion (`include) compiler directive is used to insert the entire contents of a source file in another file during compilation. The result is as though the contents of the included source file appear in place of the `include compiler directive.
It can be useful for sharing common code between different modules: parameters, `define macros, tasks, functions, etc.
In general, the .inc file extension is not special. It may be a convention used by certain simulation tools.
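For example, here is a sketch with made-up file names matching the description in the question: the fragment starts with an initial block and only becomes legal Verilog once it is textually inserted inside a module.

// stimulus.inc -- hypothetical include fragment; not a complete module on its own
initial begin
  rst = 1'b1;
  #100 rst = 1'b0;
end

// tb_top.sv -- the testbench module that pulls the fragment in
module tb_top;
  reg rst;
  `include "stimulus.inc"
endmodule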
If I have multiple .rs files in the src directory of a Cargo package, what are the rules for visibility, importing, etc.?
Currently, any extra files (i.e. files other than the one explicitly identified in Cargo.toml as the source for the executable) are ignored.
What do I need to do to fix this?
There is nothing special about Cargo at all in this regard. It's all the perfectly normal Rust module system. If Cargo will be compiling src/lib.rs, that's more or less equivalent to having executed rustc --crate-type lib src/lib.rs (there are more command-line arguments in practice, but that's the basics of it).
Other files are then used with mod, use and so forth. Files are not automatically imported or anything like that. This part is not documented very clearly yet; a couple of things that show briefly how to achieve it are http://rustbyexample.com/mod/split.html and http://doc.rust-lang.org/reference.html#modules, but any non-trivial code base will use the module system, so you can pick just about any code base to look at for examples.
It's hard to say what you're getting tripped up on from the info you shared. Here are three seemingly trivial things that I still had to refer to the documentation to figure out:
First of all,
mod foo;
looks like a declaration, but without a body it is actually something like an include. So you use the same keyword both for declaring and including modules, i.e. there is no separate using:: keyword.
Second, modules themselves can be public or private. If you didn't add a pub keyword both on the function in question and on the containing module, that may be tripping you up.
pub mod foo { pub fn bar() { /* ... */ } }
Third, there seems to be an implicit module added at the top of every file. This is confusing: the reference manual talks about a strict separation between file paths and names on one hand and the module paths in your code on the other, but that abstraction seems to be leaky here.
Note: Rust is still pre-1.0 (0.12) at the time of writing, and the module system and file-path handling are still settling, so don't be surprised if some of what I said is already wrong by the time you read this.
Files are included implicitly by your Rust code.
For instance, if the file src/foo.rs, pointed to by the path key in a [lib] or [[bin]] section of your Cargo.toml, contains:
mod bar;
it tells Cargo to build src/bar.rs too and include it.