Restarting the numbering of reference labels in the appendix body in Overleaf

I have created a supplementary section in my journal manuscript using the following script:
\appendix
%%%
\renewcommand{\appendixname}{S}
\renewcommand{\thesection}{S}
\renewcommand\thefigure{\thesection.\arabic{figure}}
\setcounter{figure}{0}
\renewcommand*{\thepage}{S\arabic{page}}
\setcounter{page}{1}
%%%
\begin{center}
\section*{Supplementary Material}
\end{center}
%%%
\subsection{Sub-heading1}
A separate bibliography has also been generated for the appendix using the multibib package as follows:
\usepackage[resetlabels]{multibib}
\newcites{supp}{Supplementary References}
and declaring
%% Loading supplementary bibliography style file
\bibliographystylesupp{unsrt}
% Loading supplementary bibliography database
\bibliographysupp{cas-sc-template-refs.bib}
\end{document}
resulting in a reference section that looks like this:
Supplementary References
However, the reference labels in the body text do not change:
S.2. Discussion
S.2.1. Subheading2
The role of the structure of the squares and the circles is clearly seen in the interdependence of the property on the values of energy and density, as shown in Figures S.4a and S.4b. There is a clear clustering of data points based on the primary property as viewed against its dependence on the secondary property in Figure S.4c. The high-value compositions are observed to all be apples and the medium-value ones are observed to be oranges. The values thus predicted placed most of them in the low– and medium–value range [66].
The reference numbers are still from the main document's bibliography.
I have tried the resetlabels option (\DeclareOption{resetlabels}{\continuouslabelsfalse}) described in the multibib package documentation at http://tug.ctan.org/tex-archive/macros/latex/contrib/multibib/multibib.pdf, but to no avail.
Is there any way to renumber these reference labels as well?
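For what it's worth, multibib restarts numbering only for citations made with the commands that \newcites defines (here \citesupp); citations made with plain \cite keep the main document's numbers. A minimal sketch of the intended usage, with placeholder citation keys and bibliography file names:

```latex
\documentclass{article}
\usepackage[resetlabels]{multibib}
\newcites{supp}{Supplementary References}

\begin{document}
Main-text citation \cite{mainkey}. % numbered [1], [2], ... in the main list

\bibliographystyle{unsrt}
\bibliography{main-refs}

\appendix
Supplementary citation \citesupp{suppkey}. % restarts at [1] with resetlabels

\bibliographystylesupp{unsrt}
\bibliographysupp{cas-sc-template-refs}
\end{document}
```

If the appendix text still shows main-document numbers, check whether the appendix citations were written with \cite rather than \citesupp.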


How to remove a feature from a product?

So I am still learning this amazingly complex system folks call Hybris, or SAP Commerce, along with lots of other names :) I ran into a problem and am looking to learn how to get out of it. I have added four new classification attributes (Utilization, Fit, Material, Function). When I went to add them to the products, I added a space between each attribute name and the numeric code that comes after it:
$feature1=#Utilization, 445 [$clAttrModifiers]; # Style
$feature2=#Fit, 446 [$clAttrModifiers]; # Colour
$feature3=#Material, 447 [$clAttrModifiers]; # Connections
$feature4=#Function, 448 [$clAttrModifiers]; # Function
INSERT_UPDATE Product;code[unique=true];$feature1;$feature2;$feature3;$feature4;$catalogVersion;
;300413166;my;feature;has;a space
The problem is I want to take the space out, as seen in the following code:
$feature1=#Utilization,445 [$clAttrModifiers];# Style
$feature2=#Fit,446 [$clAttrModifiers];# Colour
$feature3=#Material,447 [$clAttrModifiers];# Connections
$feature4=#Function,448 [$clAttrModifiers];# Function
INSERT_UPDATE Product;code[unique=true];$feature1;$feature2;$feature3;$feature4;$catalogVersion;
;300413166;Bottom;Loose;Yam type;Sportswear
When I run both of these scripts together, I get 8 features instead of 4.
So how do I go about actually removing the four features that have spaces in them?
The ImpEx below removes the ClassAttributeAssignment records, which also removes the ProductFeature entries assigned to all products. However, you need to find the correct classification category code (e.g. clasificationCategory1) that the attribute (e.g. Utilization, 445) belongs to. The classification category is the grouping/header you will find under Attributes.
$classificationCatalog=ElectronicsClassification
$classificationSystemVersion=systemVersion(catalog(id[default=$classificationCatalog]),version[default='1.0'])[unique=true,default=$classificationCatalog:1.0]
$classificationCatalogVersion=catalogversion(catalog(id[default=$classificationCatalog]),version[default='1.0'])[unique=true]
$class=classificationClass(code,$classificationCatalogVersion)[unique=true]
$attribute=classificationAttribute(code,$classificationSystemVersion)[unique=true]
REMOVE ClassAttributeAssignment[batchmode=true];$class;$attribute
;clasificationCategory1;Utilization, 445
;clasificationCategory1;Fit, 446
You can also remove the ProductFeature entries (for all products) like this, but it doesn't remove the ClassAttributeAssignment:
REMOVE ProductFeature[batchmode=true];qualifier[unique=true]
;ElectronicsClassification/1.0/clasificationCategory1.Utilization, 445
Other Reference:
Classification System API: https://help.sap.com/viewer/d0224eca81e249cb821f2cdf45a82ace/1905/en-US/8b7ad17c86691014aa0ee2d228c56dd1.html

Where is the algorithm or table that "resolves" the OLC short code?

Many geocodes, such as Geohash and OLC (Open Location Code), can be reduced by a context reference, as described here and here.
For example:
Being able to say WF8Q+WF, Cape Verde, Praia is significantly easier than remembering and using 796RWF8Q+WF
The resolver software takes "Cape Verde, Praia" (or the ISO abbreviation CV instead of Cape Verde) and transforms it into a code prefix. The resolver makes use of something like a lookup table:
Prefix | Country | Name (replaces prefix) | Reference code?
-------+---------+------------------------+------------------
796R | CV | Praia | 796RWFMP ?
796R | CV | Joao Varela | 796RXC4C ?
797R | CV | Cruz do Gato | 797R3F38 ?
... | ... | ... | ...
I am supposing that the hidden (black-box) algorithm does something simple, based on an official lookup table like the one illustrated above. It uses the prefixes of the lookup table to translate a short code into a complete code, or the inverse:
Translating short code to complete code. To recover the location from the OLC short code, you just need to know the prefix. Example: "WF8Q+WF, CV, Praia" will use the CV | Praia row of the lookup table, which gives the prefix 796R to resolve the code, concatenating the prefix with the suffix: "796R" with "WF8Q+WF". It is like a function recoverNearest('WF8Q+WF', getReferencePoint_byText(lookup, "CV", "Praia")), but Google/PlusCodes has not published the lookup dataset of Cape Verde.
Translating complete code to short code. To show the short code for a location (e.g. from 796RWF8Q+WF), it is necessary to find the "nearest reference" by a spatial query — the Joao Varela and Praia rows have the same prefix, but Praia's reference, 796RWF, matches better. It is like a function shorten('796RWF8Q+WF', getReferencePoint_byNearFrom(lookup, '796RWF8Q+WF')), but Google/PlusCodes has not published the lookup dataset of Cape Verde.
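The hypothesized resolver can be sketched in a few lines, assuming the lookup table is just a name-to-prefix map. The prefixes are copied from the illustrative table above; the function names are mine, not an official API:

```python
# Hypothetical name-to-prefix lookup, using the rows from the table above.
LOOKUP = {
    ("CV", "Praia"): "796R",
    ("CV", "Joao Varela"): "796R",
    ("CV", "Cruz do Gato"): "797R",
}

def recover_full_code(short_code, country, locality):
    """Short -> complete: prepend the prefix of the named reference."""
    return LOOKUP[(country, locality)] + short_code

def shorten_code(full_code, country, locality):
    """Complete -> short: strip the prefix if the named reference matches."""
    prefix = LOOKUP[(country, locality)]
    if not full_code.startswith(prefix):
        raise ValueError("reference does not match the code's prefix")
    return full_code[len(prefix):]

print(recover_full_code("WF8Q+WF", "CV", "Praia"))  # 796RWF8Q+WF
print(shorten_code("796RWF8Q+WF", "CV", "Praia"))   # WF8Q+WF
```

The open question remains where the authoritative contents of LOOKUP come from.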
Question: where is the official lookup table of Cape Verde?
NOTES
We can split this into more questions to generalize:
Is plus.codes really a black box? (Perhaps I am using some wrong hypothesis in my explanation.)
Does the lookup table of a country like Cape Verde exist, and can we download it? Where is Google publishing it?
If the official lookup table of Cape Verde exists and Google is respecting it, where is the Cape Verde government publishing it?
More illustrations for readers who do not understand the central problem:
Translation from complete code to short code. Suppose the prefix 796R: when is a complete code 796Rxxxx+yy translated to "Praia xxxx+yy", and when is it translated to "Joao Varela xxxx+yy"? It is an arbitrary choice if you do not have a table with the official PlusCodes references.
Translation from short code to complete code. Suppose I am developing JavaScript software. The inputs are the short code xxxx+yy and a name (country and city, or country/state/city/district). Considering only Cape Verde place names, how do I convert names into prefixes exactly as PlusCodes does?
(Edit after discussions.) A preliminary conclusion: there are only two possible answers to the question:
show a link where PlusCodes published its name-to-prefix table; or
show the source code of an algorithm, developed by reverse engineering, that reproduces PlusCodes exactly. I suppose the simplest algorithm uses the ordinary OLC encode/decode plus a parser that translates names into prefixes (or vice versa), based on an "official lookup table".
Open Location Code is just another representation of standard geographic coordinates: latitude and longitude. So, to get the OLC for any place you need only the geo coordinates for that place (see the encoding section), and vice versa.
With a database of Cape Verde towns and their coordinates, you can build your own lookup table for quick OLC transformation with any required precision (starting from Wikipedia's List of cities and towns in Cape Verde or any of the free world-cities databases), or you can simply convert the OLC to latitude and longitude and then work with those coordinates.
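As a sketch of that do-it-yourself approach, the lookup reduces to town coordinates plus a nearest-neighbour search; the coordinates below are approximate, and a real implementation would then run the open-source OLC encoder/shortener against the chosen reference point:

```python
import math

# Approximate town coordinates (latitude, longitude); real data would come
# from Wikipedia's list of Cape Verde towns or a free world-cities database.
TOWNS = {
    "Praia": (14.93, -23.51),
    "Mindelo": (16.89, -24.99),
}

def nearest_town(lat, lng):
    """Pick the closest reference town (planar distance is fine at this scale)."""
    return min(TOWNS, key=lambda t: math.hypot(TOWNS[t][0] - lat, TOWNS[t][1] - lng))

print(nearest_town(14.91, -23.52))  # Praia
```

Whether this reproduces plus.codes exactly depends on which reference dataset they use, which is the unanswered part of the question.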

How do I dynamically add elements to the Roassal RTGrapher instance?

Object subclass: #MultiData
	instanceVariableNames: 'b'
	classVariableNames: ''
	package: 'CFR-Extensions'

initialize
	b := RTGrapher new.
	b add: (self makeD: #('hello' 1 2 1)).
	b add: (self makeD: #('test' 1 2 11)).
	b

makeD: first
	| d points |
	d := RTVerticalMultipleData new.
	d barShape color: Color blue.
	points := OrderedCollection new.
	points add: first.
	d points: points.
	d addMetric: #second.
	d addMetric: #third.
	d addMetric: #fourth.
	"Rotated text"
	d barChartWithBarTitle: #first rotation: -30.
	^ d
The above is essentially the "Several metrics per data point" example from the Roassal book, factored into two methods. Rather than just visualizing a static dataset, I've been looking into ways of adding data as the program runs. I want to visualize the trace of the parameters of a tabular RL agent.
What happens when I display the graph in the inspector is that only the latest element shows up as a chart. There is also some overlaying in the labels that should not be there.
Originally I wanted to pass an OrderedCollection of points, but the way RTVerticalMultipleData compiles them into Trachel elements makes such a scheme invalid, so I've thought to batch the data instead before adding it as an element.
The fact that the above does not work strikes me as a bug. Apart from fixing this, I am wondering if there is a better way to visualize dynamic data?
I don't know Roassal well enough to answer your question, but for dynamic visualizations Pharo also has the Telescope project (https://github.com/TelescopeSt/Telescope).
Currently, Telescope only works with Seaside via web visualization (with the Cytoscape connector: https://github.com/TelescopeSt/TelescopeCytoscape). See a demo at https://demos.ferlicot.fr/TelescopeDemo
I don't know if web visualizations work for you, but I'm sharing just in case.

2 Sequential Transactions, setting Detail Number (Revit API / Python)

Currently, I have made a tool to rename view numbers ("Detail Number") on a sheet based on their location on the sheet. Where this is breaking is the transactions: I'm trying to do two transactions sequentially in Revit Python Shell. I also did this originally in Dynamo, and that failed in a similar way, so I know it's something to do with transactions.
Transaction #1: Add a suffix (“-x”) to each detail number to ensure the new numbers won’t conflict (1 will be 1-x, 4 will be 4-x, etc)
Transaction #2: Change detail numbers with calculated new number based on viewport location (1-x will be 3, 4-x will be 2, etc)
Better visual explanation here: https://www.docdroid.net/EP1K9Di/161115-viewport-diagram-.pdf.html
Py File here: http://pastebin.com/7PyWA0gV
Attached is the Python file, but essentially what I'm trying to do is:
# <---- Make unique numbers
t = Transaction(doc, 'Rename Detail Numbers')
t.Start()
for i, viewport in enumerate(viewports):
    setParam(viewport, "Detail Number", getParam(viewport, "Detail Number") + "x")
t.Commit()

# <---- Do the thang
t2 = Transaction(doc, 'Rename Detail Numbers')
t2.Start()
for i, viewport in enumerate(viewports):
    setParam(viewport, "Detail Number", detailViewNumberData[i])
t2.Commit()
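The two-pass idea itself can be exercised outside Revit; in this sketch plain dicts stand in for viewports and the parameter API, so it only demonstrates the renumbering logic, not Revit code:

```python
# Simulated viewports; in Revit these would be Viewport elements.
viewports = [{"Detail Number": "1"}, {"Detail Number": "4"}]
new_numbers = ["3", "2"]  # calculated from viewport locations

# Pass 1: suffix every current number so no intermediate value collides.
for vp in viewports:
    vp["Detail Number"] += "-x"

# Pass 2: assign the final numbers.
for vp, number in zip(viewports, new_numbers):
    vp["Detail Number"] = number

print([vp["Detail Number"] for vp in viewports])  # ['3', '2']
```

The suffix pass matters because Revit rejects duplicate detail numbers on a sheet, so swapping 1 and 4 directly to 3 and 2 could collide mid-way with an existing number.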
As I explained in my answer to your comment in the Revit API discussion forum, the behaviour you describe may well be caused by a need to regenerate between the transactions. The first modification does something, and the model needs to be regenerated before the modifications take full effect and are reflected in the parameter values that you query in the second transaction. You are accessing stale data. The Building Coder provides all the nitty gritty details and numerous examples on the need to regenerate.
Summary of this entire thread including both problems addressed:
http://thebuildingcoder.typepad.com/blog/2016/12/need-for-regen-and-parameter-display-name-confusion.html
So this issue actually had nothing to do with transactions or document regeneration. I discovered (with some help :) ) that the problem lay in how I was setting/getting the parameter. "Detail Number", like a lot of parameters, has duplicate versions that share the same descriptive parameter name on a viewport element.
Apparently the reason for this might be legacy issues, though I'm not sure. Thus, when I was trying to get/set the detail number, it was occasionally grabbing the incorrect read-only parameter, the one whose built-in enumeration is "VIEWER_DETAIL_NUMBER". The correct one is "VIEWPORT_DETAIL_NUMBER". This was happening because I was trying to get the parameter just by passing the descriptive name "Detail Number". Revising how I get/set parameters via the built-in enum resolved the issue.
Please see pdf for visual explanation: https://www.docdroid.net/WbAHBGj/161206-detail-number.pdf.html

How to connect specific attributes over polar coordinates in R?

I have highlighted specific activities (feeding, resting and sleeping) from the dataset in my plot. Now I want to connect these highlighted points in sequence over my polar coordinates.
Here's my dataset:
Activity Latitude Longitude
Feeding 21.09542 71.06014
Resting 21.09564 71.06064
Sleeping 21.09619 71.06128
Walking 21.09636 71.06242
Walking 21.09667 71.06564
Resting 21.09483 71.06619
Can you help me out in this?
# Example dataframe
set.seed(1)
mydf=data.frame(Activity=sample(c("Walking","Feeding","Resting","Sleeping"),20,T),Latitude=rnorm(20,21,0.5),Longitude=rnorm(20,71,0.5))
mydf$Order=1:nrow(mydf)
If you want to connect the points in order regardless of the activity, do the following (for clarity, I added the variable mydf$Order to label the points).
# Plot
library(ggplot2)
ggplot(data=mydf)+
geom_point(aes(x=Latitude,y=Longitude,colour=Activity))+
geom_path(aes(x=Latitude,y=Longitude))+
geom_text(aes(x=Latitude,y=Longitude,label=Order))+
coord_polar(theta="y")
If you want to connect points according to activities, consider CMichael's answer.
OK, I am starting from scratch: my original answer was much too bulky and inflexible.
Just add the following to get paths for each Activity without filtering:
+ geom_path(aes(colour=ACTIVITY,x=Latitude,y=Longitude))
If you want to plot only selected Activities:
+ geom_path(data=Data[Data$ACTIVITY %in% c("Sleeping","Resting"),],aes(colour=ACTIVITY,x=Latitude,y=Longitude))
The selected Activities are to be listed in the c(...) vector with each name quoted.
UPDATE: OP clarified that he wants to connect any stationary point; this is achieved by running the following:
+ geom_path(data=Data[Data$ACTIVITY!="Walking",],colour="red",aes(x=Latitude,y=Longitude))
Note that the colour=ACTIVITY is removed from the aesthetics and we consider all stationary points (!="Walking") to draw the path.
Code combining the two answers:
set.seed(1)
mydf=data.frame(Activity=sample(c("Walking","Walking","Walking","Walking","Walking","Resting","Feeding","Sleeping"),20,T),Latitude=rnorm(20,21,0.5),Longitude=rnorm(20,71,0.5))
mydf$Order=1:nrow(mydf)
# Plot
library(ggplot2)
ggplot(data=mydf)+
geom_point(aes(x=Latitude,y=Longitude,colour=Activity),size=5)+
geom_path(aes(x=Latitude,y=Longitude),size=1.2)+
geom_text(aes(x=Latitude,y=Longitude,label=Order))+
geom_path(data=mydf[mydf$Activity!="Walking",],colour="red",aes(x=Latitude,y=Longitude)) +
coord_polar(theta="y")
