How do I use reStructuredText (rst) directives like math in Gollum?
My Bitbucket wiki has rst files with math content, such as
:math:`1+1`
But any rst directive in Gollum is rendered as literal code. Gollum instead expects $1+1$ for inline math. I can change the characters that delimit math code in Gollum, but that syntax won't render in Bitbucket.
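For reference, these are the standard reStructuredText forms that Bitbucket renders, using the inline role and the block directive; the display-math line is just an illustrative example:

:math:`1+1`

.. math::

   e^{i\pi} + 1 = 0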
Related
In a Python project I generate autoapi documentation. Special comments appear in the generated HTML files.
For instance, this works and is displayed on the final HTML page:
def do_action(self, params):
    """
    This is a function to do some cool stuff.
    Actually it should.
    """
    pass
Or
...
applicationConfig = None
"""This variable hold some important data"""
However, I would like autoapi to generate some custom comments into the HTML page.
For example, I've got a comment in the code like this:
"""These are public variable:"""
p_var1 = "segg"
p_var2 = "fos"
But this last comment is not shown in the generated documentation. Maybe because it isn't connected to any definition in the source code? (I mean neither a variable declaration nor a function or class declaration.)
Anyway, how can I force Sphinx to generate an HTML entry from any comment surrounded by triple quotes?
There are two options for having Sphinx parse variable comments. The first is via attribute docstrings, which are specified in PEP 224 to belong below the attribute they describe, as in your first example. While the PEP was rejected, this is the format Sphinx requires in order to work correctly:
p_var1 = "segg"
"""Docstring for p_var1"""
In the generated documentation, this renders as p_var1 with "Docstring for p_var1" as its description.
Alternatively, Sphinx will also pick up comments above the attribute that start with a colon (#:) and treat them like a docstring, which in some cases looks a bit better in the source code:
#: Description for p_var1
p_var1 = "segg"
This renders the same way in the generated documentation.
There is no option to pick up a comment without a module, exception, class, method, function, or attribute attached to it, because autodoc explicitly only considers information from docstrings (and call signatures, but that's the only exception).
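As a concrete illustration, once a docstring or #: comment is attached as above, a minimal sketch for pulling everything into a page is a plain automodule directive (the module name mymodule is assumed):

.. automodule:: mymodule
   :members:
   :undoc-members:

The :undoc-members: flag additionally lists members that have no docstring at all, but even it cannot surface free-standing comments.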
I would like to define a parameter MYTYPE using a text macro, whose value is passed in by another text macro, e.g.
`define MY_FEATURE(nam,def) parameter nam=def;
and then
`MY_FEATURE(MYTYPE, 1)
But sometimes the value passed in is itself defined by another text macro, e.g.
`MY_FEATURE(NEWTYPE, 2)
`MY_FEATURE(MYTYPE, NEWTYPE)
The latter case will not work unless def inside `define MY_FEATURE is prefixed with the directive tick (`).
I need to distinguish these two cases and automatically expand the macro only if it is defined, so I came up with this code, but I got an error.
`define yea 1
`define nop 0
`define MY_FEATURE(nam,def) `ifdef def parameter nam=`def; `else parameter nam=2; `endif

module test;
  `MY_FEATURE(MYTYPE,yea)
  initial begin
    $display("%d",MYTYPE);
  end
endmodule
The above code works and gives 1 as output. However, if I write
`MY_FEATURE(MYTYPE,10)
since in other cases I need to assign an actual number to the parameter, I will get
`ifdef without a macro name - ignored.
My desired result is that MYTYPE is assigned 10.
Is there any way to achieve this? Thanks.
Code can be found here
http://www.edaplayground.com/x/6Jha
I think you are overthinking it. `define creates a compiler directive. When you pass a directive as a parameter to another directive, you can pass it as `yea.
Here is an example:
`define yea 1
`define nop 0
`define MY_FEATURE(nam,def) parameter nam=def;

module test;
  `MY_FEATURE(MYTYPE,`yea)
  `MY_FEATURE(MYTYPE2,10)
  `MY_FEATURE(MYTYPE3,MYTYPE+MYTYPE2)
  initial begin
    $display("%d %d %d",MYTYPE, MYTYPE2, MYTYPE3); // displays: 1 10 11
  end
endmodule
http://www.edaplayground.com/x/5Pgf
Verilog-AMS (a superset of Verilog-A) is a language of its own, derived from Verilog (IEEE Std 1364), according to the manual. This means your MY_FEATURE never creates new directives; it creates parameters. Directives and parameters are both treated as constants in simulation but behave differently at compile time. The `define/parameter relation in Verilog (and Verilog-derived languages) is equivalent to C's #define/const relation. Unlike C, accessing the value of a `define requires a ` prefix.
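A rough C analogy, purely for illustration:

#include <stdio.h>

#define YEA 1                /* preprocessor text substitution, like `define yea 1 */

int main(void) {
    const int my_type = YEA; /* named constant, like: parameter MYTYPE = `yea;     */
    printf("%d\n", my_type); /* prints 1                                           */
    return 0;
}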
Neither directives nor parameters can start with a numeric character. The first character must be a letter or an underscore (i.e. [a-zA-Z_]). Therefore 10 can never be a directive name, and even trying to use it as one is illegal syntax. There is no way for the compiler to recover from an illegal directive name. This is why I suggested passing `yea instead of yea.
If someone builds you a nice model, it should come with equally nice documentation or some way of getting support.
Is there any way to have an Alex macro defined in one source file and used in other source files? In my case, I have definitions for $LowerCaseLetter and $UpperCaseLetter (these are all letters except e and O, since they have special roles in my code). How can I refer to these macros from other .x files?
Disproving that something exists is always harder than finding something that does exist, but I think the info below shows that Alex can only get macro definitions from the .x file it is reading (other than predefined stuff like $white), and not via includes from other files.
You can get the source code for Alex by doing the following:
> cabal unpack alex
> cd alex-3.1.3
In src/Main.hs, predefined macros are first set in variables called initSetEnv (charset macros $white, $printable, and "."), and initREEnv (regexp macros; there are none). This gets passed into runP in src/ParseMonad.hs, which holds the current parsing state, including all defined macros. The initial state is set using the values passed in, but macros can be added using a function called newSMac (or newRMac for regular-expression macros).
Since this seems to be the only way that macros can be set, it is then only a matter of some grep bookkeeping to verify that the only way macros can be added is through an actual macro definition in the source .x file. Unsurprisingly, Alex uses its own .x/.y files for parsing .x source files (src/parser.y, src/Scan.x). It is a couple of levels of indirection away, but you can verify that the only way newSMac can be called is through the src/Scan.x macro
#smac = \$ #id | \$ \{ #id \}
<0> #smac #ws? \= { smacdef }
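That rule is what matches ordinary charset-macro definitions in a user's .x file. For comparison, a sketch of how the macros from the question could be written (Alex's # operator is set difference):

$LowerCaseLetter = [a-z] # [e]   -- all lowercase letters except e
$UpperCaseLetter = [A-Z] # [O]   -- all uppercase letters except O

Since there is no include mechanism, these definitions have to be repeated in every .x file that uses them.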
Other than some obvious predefined stuff, I don't believe reuse in lexers is all that typical anyway, because at the token level things are usually pretty simple (often simple tokens like SPACE, WORD, NUMBER, and a few operators, symbols, and parens are all that are needed). The complexity comes at the parsing stage, although for technical reasons parser includes aren't that common either (see scannerless parsing for a newer technology that does allow reuse through nesting, like JavaScript embedded in HTML; the tools for scannerless parsing are still pretty primitive, though).
I was wondering about some best practices regarding extraction of selectors to constants. As a general rule, it is usually recommended to extract magic numbers and string literals to constants so they can be reused, but I am not sure if this is really a good approach when dealing with selectors in Capybara.
At the moment, I have a file called "selectors.rb" which contains the selectors that I use. Here is part of it:
SELECTORS = {
  checkout: {
    checkbox_agreement: 'input#agreement-1',
    input_billing_city: 'input#billing\:city',
    input_billing_company: 'input#billing\:company',
    input_billing_country: 'input#billing\:country_id',
    input_billing_firstname: 'input#billing\:firstname',
    input_billing_lastname: 'input#billing\:lastname',
    input_billing_postcode: 'input#billing\:postcode',
    input_billing_region: 'input#billing\:region_id',
    input_billing_street1: 'input#billing\:street1',
    ....
  }
}
In theory, I put my selectors in this file, and then I could do something like this:
find(SELECTORS[:checkout][:input_billing_city]).click
There are several problems with this:
- If I want to know the selector that is used, I have to look it up.
- If I change a name in selectors.rb, I could forget to change it somewhere else in the file, which will result in find(nil).click.
- With the example above, I can't use this selector with fill_in(SELECTORS[:checkout][:input_billing_city]), because fill_in requires an ID, name, or label.
There are probably a few more problems with that, so I am considering getting rid of the constants. Has anyone been in a similar spot? What is a good way to deal with this situation?
Someone mentioned the SitePrism gem to me: https://github.com/natritmeyer/site_prism
A Page Object Model DSL for Capybara
SitePrism gives you a simple, clean and semantic DSL for describing
your site using the Page Object Model pattern, for use with Capybara
in automated acceptance testing.
It is very helpful in that regard and I have adjusted my code accordingly.
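For illustration, the checkout selectors above could be wrapped in a page object roughly like this (the class and element names are my own, not from the gem):

class CheckoutPage < SitePrism::Page
  set_url '/checkout'

  element :billing_city, 'input#billing\:city'
  element :billing_firstname, 'input#billing\:firstname'
end

checkout = CheckoutPage.new
checkout.load
checkout.billing_city.set('Munich') # element returns a Capybara node, so set/click work

Each selector string lives in exactly one place, and a misspelled element name fails loudly with a NoMethodError instead of a silent find(nil).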
I'm using Delphi 2007 and I wonder how the following problem can be solved:
I have to translate AComp.Caption, for example, but the string that I want to assign to the caption often depends on some data (for example a date or a number that gets formatted). Therefore I have to save the data and the string in a new variable for every translation, which is really annoying.
What I want to do is something like that:
// will add the string and data to an internal list of Translator
// and will then return a DynamicString, which represents the translated value
AComp.Caption := T.NewTranslatedString('Hello %s, do you like cheese?', User);
(Note that AComp.Caption ('Hello %s..') can be changed in different methods.)
When switching to another language, you would call T.TranslateAgain() and all the strings would be translated again and, where data is attached, reformatted.
Is this possible or do you know another way for solving the given problem?
Thanks in advance
Additional question:
Are strings normal objects that I can subclass, so I can add dynamic behaviour that changes the string itself in special cases?
Delphi strings are not objects; you can't add behaviours to them. You would need to develop your own class.
The Windows way to localize applications is to take advantage of resources, which can be changed (and their loading redirected) without changes to the code (no need to call special functions or add new components) and without run-time calls other than loading the resource. The only disadvantage of resources is that they cannot be changed easily by the end user. The Delphi 2007 standard localization tools use this approach.
Anyway, there are libraries such as dxGetText (a port of the GNU gettext library) or TsiLang that use a more "intrusive" approach, requiring changes to your code or added components. In exchange, they can simplify end-user localization.
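For a feel of the dxGetText approach, a sketch along these lines (I'm assuming the gnugettext unit's TranslateComponent and _() routines; the German text is just an example):

uses
  gnugettext;

procedure TMyForm.FormCreate(Sender: TObject);
begin
  // Translate all component captions on this form in one call
  TranslateComponent(Self);
  // _() looks up the translation of the literal at run time
  AComp.Caption := Format(_('Hello %s, do you like cheese?'), [User]);
end;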
Before developing your own localization library, I would check whether one of the existing ones fits your needs.
Note: Be aware that the Delphi localization tool has significant issues that weren't fixed until XE (which I haven't tested yet). See for example QC #79449. Unfortunately, the fix was never backported to earlier releases.
You can use Delphi's own translator tool. It is able to extract strings and resourcestrings from your source code and from DFM form files, and it gives you a graphical user interface to translate them into any language. It then creates a resource DLL for each language, containing the translated strings and DFM data. You should deploy this translation DLL with your project to the destination machine.
In your case, your strings are divided into two groups: fixed strings, which do not need any further processing, and parameterized strings, which need some additional data to be formatted properly. For the fixed strings, you can just type the translation into the translator tool. For parameterized strings, save each one as a resourcestring and use that resourcestring when formatting. For example:
resourcestring
  strDoYouLikeCheese = 'Hello %s, do you like cheese?';
...
AComp.Caption := Format(strDoYouLikeCheese, [User]);
Now you can use the translator tool or any resource editor to translate the resourcestring into your desired language without the need to change your source code or recompile it.
What you want to do is localize your application. Delphi has support for this, based around the resourcestring keyword. However, I've never done any localization myself, so I recommend that you do a web search on this topic, or perhaps wait for the other experts here to supply more detailed help!
You could use a dictionary to keep track of the string mappings, something like this:
TTranslator = class
private
  FMappings : TDictionary<String, String>;
public
  constructor Create;
  destructor Destroy; override;
  function Translate (const SrcStr : String) : String;
  procedure SetMapping (const SrcStr, DestStr : String);
end;

constructor TTranslator.Create;
begin
  inherited Create;
  FMappings := TDictionary<String, String>.Create; // must exist before mappings are added
end;

destructor TTranslator.Destroy;
begin
  FMappings.Free;
  inherited;
end;

function TTranslator.Translate (const SrcStr : String) : String;
begin
  if not FMappings.TryGetValue (SrcStr, Result) then
    Result := SrcStr; // fall back to the source string when no mapping is registered
end;

procedure TTranslator.SetMapping (const SrcStr, DestStr : String);
begin
  FMappings.AddOrSetValue (SrcStr, DestStr);
end;
Translating would then simply be several calls to SetMapping. This gives you a lot of flexibility. Anyway, you might also consider using the built-in localization support or even third-party solutions.
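A minimal usage sketch (the German translation is just an example):

var
  T : TTranslator;
begin
  T := TTranslator.Create;
  try
    T.SetMapping('Hello %s, do you like cheese?', 'Hallo %s, magst du Käse?');
    // Look up the current translation, then format the dynamic data into it
    AComp.Caption := Format(T.Translate('Hello %s, do you like cheese?'), [User]);
  finally
    T.Free;
  end;
end;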
EDIT: I just saw that you are using Delphi 2007, so you don't have TDictionary available. The idea remains valid; just use any dictionary implementation or a list-based approach.
And to answer the other part of your question: no, strings are not normal objects (actually, they are not objects at all). They are special in various ways (memory management, copy-on-write behaviour) and it is not possible to subclass them. But that's not what you want anyway, if I understood the question correctly.