Javacard scriptgen - javacard

I'm trying to use the Oracle Java Card Development Kit 3.0.5u2 command-line tools.
I use the CAP file generated by NetBeans.
I run verifycap.bat on my CAP file together with the export files in the api_export_files directory of the SDK:
verifycap.bat -nobanner -nowarn ..\api_export_files\javacard\framework\javacard\framework.exp ..\api_export_files\javacard\security\javacard\security.exp ..\api_export_files\java\lang\javacard\lang.exp kaylat.cap > kaylat.hash
I get a hash file with this content:
[ INFO: ] [v3.0.5] Off-Card Verifier, Version {1}.
[ INFO: ] Copyright (c) 1998, 2017, Oracle and/or its affiliates. All rights reserved.
[ INFO: ] Vérification du fichier CAP kaylat.cap
[ INFO: ] Hash for kaylat/javacard/ConstantPool.cap [SHA-256: ad9ece95c64174d87b92488213081d1f977c975ba116fd7dead60246a1a94099]
[ INFO: ] Hash for kaylat/javacard/StaticField.cap [SHA-256: 5863e9740af5fb905922380b2aa88309a16a285dd3412417ae8af941327901ee]
[ INFO: ] Hash for kaylat/javacard/Descriptor.cap [SHA-256: 957c5fd5ebee857a06b38129d5b94b3a0bf155d989c6db4080bc8f0ec2c26606]
[ INFO: ] Hash for kaylat/javacard/Header.cap [SHA-256: 67802c08d73cee2e77947b76dd5f5f728055fec8c9ab4820369b18f845bb4eab]
[ INFO: ] Hash for kaylat/javacard/Directory.cap [SHA-256: 803ba29574f2013b4dc895253afaf0bc8376deb1110eb35ee90c6e6807b70a59]
[ INFO: ] Hash for kaylat/javacard/Applet.cap [SHA-256: 17b671b4e2371e00eea2717f84cb016baa818a92c46687e25b5392023a071229]
[ INFO: ] Hash for kaylat/javacard/Method.cap [SHA-256: 9e7dd02202e95de04a33fd812f1ab38cf5554ee1ee3c0d982d31140c1313f97f]
[ INFO: ] Hash for kaylat/javacard/Class.cap [SHA-256: f788cc84d355e9a2cda8432c6af815aad6ed5574b5246641122158167850c6dc]
[ INFO: ] Hash for kaylat/javacard/RefLocation.cap [SHA-256: 16e7d9445917130b643bee62728b460c37511ce12b5c65c7653f9f75b8fa5df6]
[ INFO: ] Hash for kaylat/javacard/Import.cap [SHA-256: ce4ee9399ef89f122c68620d0330239a0f40a178992aae6c60fdf02caa492817]
[ INFO: ] 0 warnings and 0 errors.
I run scriptgen.bat:
scriptgen.bat kaylat.cap -hashfile kaylat.hash
This command fails with the message:
Missing hash for required component: header.cap

Instead of redirecting the output of verifycap.bat to a file ("> kaylat.hash"), you should probably use the program option "-outfile <file-name>", which writes the digest values to the file in the correct format. Then use this file as input to scriptgen.bat.
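For example, the original invocation would become the following (only the redirection is replaced by the documented -outfile option; scriptgen.bat is then run exactly as before):
verifycap.bat -nobanner -nowarn -outfile kaylat.hash ..\api_export_files\javacard\framework\javacard\framework.exp ..\api_export_files\javacard\security\javacard\security.exp ..\api_export_files\java\lang\javacard\lang.exp kaylat.cap
scriptgen.bat kaylat.cap -hashfile kaylat.hash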
If I use the redirected console output of the program as input to scriptgen.bat, I get the same error. If I use the -outfile option, everything works fine.
C:\verifycap.bat
[ INFO: ] Usage: verifycap [options] <export files> <CAP file>
(or)
verifycap <-C | --commandoptionsfile> <command options file>
Where options include:
-digest <alg-name> specify the digest to use (default: SHA-256)
-help Print this message and exit.
-nobanner Suppress banner message.
-nowarn Suppress warning messages.
-outfile <file-name> Specify the name of the output file to store digest (default: no output file created)
-package <pkg> Set the name of the package to be verified
-verbose Turn on verbose mode.
-version Print version number and exit.

Related

PdhAddCounterW() failed: unknown error when starting a demo

I followed the instructions at https://docs.openvino.ai/latest/omz_demos.html#doxid-omz-demos.
I finished the setup/installation, downloaded the needed models with the omz_downloader tool, and tried to start the interactive demo:
interactive_face_detection_demo ^
    --loop ^
    -m "C:\Intel\face-detection-adas-0001\FP16\face-detection-adas-0001.xml"
[ INFO ] version: 2022.1.0
[ INFO ] build: 2022.1.0-7019-cdb9bec7210-releases/2022/1
[ INFO ] Reading model: C:\Intel\face-detection-adas-0001\FP16\face-detection-adas-0001.xml
[ INFO ] Model name: mobilenet_ssd_672x384
[ INFO ] Inputs:
[ INFO ] data, f32, {1,3,384,672}, [N,C,H,W]
[ INFO ] Outputs:
[ INFO ] detection_out, f32, {1,1,200,7}, [...]
[ INFO ] The model C:\Intel\face-detection-adas-0001\FP16\face-detection-adas-0001.xml is loaded to CPU
[ INFO ] Device: CPU
[ INFO ] Number of streams: 1
[ INFO ] Number of threads: AUTO
[ ERROR ] PdhAddCounterW() failed: unknown error
How can I get more information about that error / is this a problem with my setup?
(I was able to get everything running on another machine)
System information:
System model: Surface Pro 4
OS name: Microsoft Windows 10 Pro
Version: 10.0.19043 Build 19043
Processor: Intel(R) Core(TM) i5-6300U CPU @ 2.40GHz, 2496 MHz, 2 core(s), 4 logical processor(s)
RAM: 4GB
openvino_2022.1.0.643
4.5.5-90-gc3d60a6ca (OpenVINO/2022.1)
Visual Studio 16 2019
Your error appears to be related to the Windows performance counter (PDH) API; it is not related to OpenVINO. For more information about PdhAddCounterW(), you can refer to the PdhAddCounterW function (pdh.h) documentation.
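As a generic Windows-side check (these are standard Windows tools, not OpenVINO commands; treat this as a diagnostic sketch), you can probe whether the performance counter subsystem responds at all, and rebuild the counter registry if it turns out to be corrupted:
:: Query a standard counter three times; a failure here points at PDH itself
typeperf "\Processor(_Total)\% Processor Time" -sc 3
:: Rebuild the performance counter registry (run from an elevated prompt)
lodctr /R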
On another note, try specifying an input to the demo, as it requires an input to process. Here is an example command:
interactive_face_detection_demo -m "C:\Intel\face-detection-adas-0001\FP16\face-detection-adas-0001.xml" -i <path_to_video>\<input_video>.mp4

How to Diagnose Weatherreport from CouchDB

After building CouchDB from GitHub, I ran weatherreport as recommended in the documentation and got the following error. How do you diagnose exactly what's going wrong? This looks like a bunch of random numbers.
17:38:27 WARN: 'escriptize' command does not apply to directory /home/test/workspace/CouchDB-ant_rhel/couchdb
17:38:27 [ * ] Setup environment ... ok
17:38:27 [ * ] Ensure CouchDB is built ... ok
17:38:27 [ * ] Ensure Erlang boot script exists ... ok
17:38:27 [ * ] Prepare configuration files ... ok
17:38:27 [ * ] Start node node1 ... ok
17:38:28 [ * ] Check node at http://127.0.0.1:15984/ ... ok
17:38:28 [ * ] Running cluster setup ... ok
17:38:30 [ * ] Exec command bin/weatherreport --etc dev/lib/node1/etc --level error ... ['node1_diag35200@127.0.0.1'] [crit] Bad rpc call executing check weatherreport_check_memory_use: {'EXIT',{badarg,[{erlang,list_to_float,[[101]],[{error_info,#{module => erl_erts_errors}}]},{weatherreport_util,binary_to_float,1,[{file,[115,114,99,47,119,101,97,116,104,101,114,114,101,112,111,114,116,95,117,116,105,108,46,101,114,108]},{line,80}]},{weatherreport_check_memory_use,check,1,[{file,[115,114,99,47,119,101,97,116,104,101,114,114,101,112,111,114,116,95,99,104,101,99,107,95,109,101,109,111,114,121,95,117,115,101,46,101,114,108]},{line,56}]},{weatherreport_check,check,2,[{file,[115,114,99,47,119,101,97,116,104,101,114,114,101,112,111,114,116,95,99,104,101,99,107,46,101,114,108]},{line,81}]},{weatherreport_runner,'-run/2-fun-0-',2,[{file,[115,114,99,47,119,101,97,116,104,101,114,114,101,112,111,114,116,95,114,117,110,110,101,114,46,101,114,108]},{line,54}]},{erlang,apply,2,[]}]}}
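Those "random numbers" are Erlang strings printed as lists of character codes, so the trace is more readable than it looks. A one-off decode from a shell (assuming erl is on the PATH) recovers the file names; for example:
# Decode one of the integer lists from the crash report
erl -noshell -eval 'io:format("~s~n", [[115,114,99,47,119,101,97,116,104,101,114,114,101,112,111,114,116,95,117,116,105,108,46,101,114,108]]), halt().'
# prints: src/weatherreport_util.erl
Read this way, the report says erlang:list_to_float/1 was called with the string "e" (the list [101]) in weatherreport_util:binary_to_float/1 at src/weatherreport_util.erl line 80 and raised badarg, i.e. the memory-use check received a value it could not parse as a float.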

Mongod is not a service | Ubuntu | WSL. Error: mongod: unrecognized service

I'm using WSL2 with Ubuntu 20.04.2 LTS. I tried setting up MongoDB. There are two issues:
Running the command sudo apt-get install -y mongodb-org results in the error below.
Errors were encountered while processing:
/tmp/apt-dpkg-install-NtCqHi/1-mongodb-org-server_4.4.4_amd64.deb
/tmp/apt-dpkg-install-NtCqHi/2-mongodb-org-mongos_4.4.4_amd64.deb
E: Sub-process /usr/bin/dpkg returned an error code (1)
Yet the command mongod --version reports:
db version v3.6.8
git version: 8e540c0b6db93ce994cc548f000900bdc740f80a
OpenSSL version: OpenSSL 1.1.1f 31 Mar 2020
allocator: tcmalloc
modules: none
build environment:
distarch: x86_64
target_arch: x86_64
which is not the latest, even though I installed the latest community version.
Running sudo service mongod start shows the following error:
mongod: unrecognized service
The list of services currently available on the system is:
sudo service --status-all
[ - ] apparmor
[ ? ] apport
[ - ] atd
[ - ] console-setup.sh
[ - ] cron
[ ? ] cryptdisks
[ ? ] cryptdisks-early
[ - ] dbus
[ ? ] hwclock.sh
[ + ] irqbalance
[ - ] iscsid
[ - ] keyboard-setup.sh
[ ? ] kmod
[ - ] lvm2
[ - ] lvm2-lvmpolld
[ - ] multipath-tools
[ + ] open-iscsi
[ - ] open-vm-tools
[ ? ] plymouth
[ ? ] plymouth-log
[ - ] postgresql
[ - ] procps
[ + ] redis-server
[ - ] rsync
[ - ] rsyslog
[ - ] screen-cleanup
[ - ] ssh
[ - ] sysstat
[ - ] udev
[ - ] ufw
[ - ] unattended-upgrades
[ - ] uuidd
[ - ] x11-common
This works for me:
sudo apt-get update
sudo apt-get install mongodb
sudo service mongodb start
My setup: Windows 10, WSL2.
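Note that this installs Ubuntu's own mongodb package rather than mongodb-org, which is why mongod --version reports 3.6.8, and the distro package registers its service as mongodb rather than mongod (hence "mongod: unrecognized service"). A quick way to confirm which package provides the binary (assuming a dpkg-based system):
# Show which package owns the mongod binary
dpkg -S "$(command -v mongod)"
# The distro package's service is named "mongodb", not "mongod"
sudo service mongodb status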

How do you automatically download external (C++) libraries when using native Node addons?

I'd like to include libpng in my native Node addon. How can I include it, so that when my library is installed, it will automatically download a specified version of libpng? Is it possible to use npm's package.json for this? If this is not possible, what is the accepted way of including an external library's source code in your repository?
I recommend that you create a gyp file to build the dependency library and add a script to your package.json to download it for you.
My own native addon module, node-dvbtee, demonstrates this approach.
You will notice the following inside package.json:
"scripts": {
"preinstall": "npm install mkdirp && scripts/prepare-build.sh && node scripts/configure-build.js",
"install": "node-gyp rebuild -j 8",
"test": "mocha"
},
What matters here is the preinstall entry in the scripts section. It calls scripts/prepare-build.sh, which contains the following:
#!/bin/sh
cd "$(dirname "$0")"/..
if [ -e libdvbtee ]; then
  echo libdvbtee sources present
else
  git clone git://github.com/mkrufky/libdvbtee.git
fi
cd libdvbtee
if [ -e libdvbpsi/bootstrap ]; then
  echo libdvbpsi sources present
else
  rm -rf libdvbpsi
  git clone git://github.com/mkrufky/libdvbpsi.git
  cd libdvbpsi
  touch .dont_del
  cd ..
fi
As you can see, the script checks whether the libdvbtee directory is present and clones it from GitHub if not. After that, it checks whether the full libdvbpsi sources are present and clones them from GitHub if not.
Now, for the gyp files:
My project has the gyp files stored in the deps directory.
libdvbpsi.gyp looks like this:
{
  'target_defaults': {
    'default_configuration': 'Debug',
    'configurations': {
      'Debug': {
        'defines': [ 'DEBUG', '_DEBUG' ],
        'msvs_settings': {
          'VCCLCompilerTool': {
            'RuntimeLibrary': 1, # static debug
          },
        },
      },
      'Release': {
        'defines': [ 'NDEBUG' ],
        'msvs_settings': {
          'VCCLCompilerTool': {
            'RuntimeLibrary': 0, # static release
          },
        },
      }
    },
    'msvs_settings': {
      'VCLinkerTool': {
        'GenerateDebugInformation': 'true',
      },
    },
    'include_dirs': [
      '../libdvbtee/libdvbpsi/src',
      '../libdvbtee/libdvbpsi/src/tables',
      '../libdvbtee/libdvbpsi/src/descriptors',
      '../libdvbtee/libdvbpsi'
    ],
    'defines': [
      'PIC',
      'HAVE_CONFIG_H'
    ],
  },
  'targets': [
    # libdvbpsi
    {
      'target_name': 'dvbpsi',
      'product_prefix': 'lib',
      'type': 'static_library',
      'sources': [
        '../libdvbtee/libdvbpsi/src/dvbpsi.c',
        '../libdvbtee/libdvbpsi/src/psi.c',
        '../libdvbtee/libdvbpsi/src/demux.c',
        '../libdvbtee/libdvbpsi/src/descriptor.c',
        '../libdvbtee/libdvbpsi/src/tables/pat.c',
        '../libdvbtee/libdvbpsi/src/tables/pmt.c',
        '../libdvbtee/libdvbpsi/src/tables/sdt.c',
        '../libdvbtee/libdvbpsi/src/tables/eit.c',
        # '../libdvbtee/libdvbpsi/src/tables/cat.c',
        '../libdvbtee/libdvbpsi/src/tables/nit.c',
        '../libdvbtee/libdvbpsi/src/tables/tot.c',
        # '../libdvbtee/libdvbpsi/src/tables/sis.c',
        # '../libdvbtee/libdvbpsi/src/tables/bat.c',
        # '../libdvbtee/libdvbpsi/src/tables/rst.c',
        '../libdvbtee/libdvbpsi/src/tables/atsc_vct.c',
        '../libdvbtee/libdvbpsi/src/tables/atsc_stt.c',
        '../libdvbtee/libdvbpsi/src/tables/atsc_eit.c',
        '../libdvbtee/libdvbpsi/src/tables/atsc_ett.c',
        '../libdvbtee/libdvbpsi/src/tables/atsc_mgt.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_02.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_03.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_04.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_05.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_06.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_07.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_08.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_09.c',
        '../libdvbtee/libdvbpsi/src/descriptors/dr_0a.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_0b.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_0c.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_0d.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_0e.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_0f.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_10.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_11.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_12.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_13.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_14.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_1b.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_1c.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_24.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_40.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_41.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_42.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_43.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_44.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_45.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_47.c',
        '../libdvbtee/libdvbpsi/src/descriptors/dr_48.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_49.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_4a.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_4b.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_4c.c',
        '../libdvbtee/libdvbpsi/src/descriptors/dr_4d.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_4e.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_4f.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_50.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_52.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_53.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_54.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_55.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_56.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_58.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_59.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_5a.c',
        '../libdvbtee/libdvbpsi/src/descriptors/dr_62.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_66.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_69.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_73.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_76.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_7c.c',
        '../libdvbtee/libdvbpsi/src/descriptors/dr_81.c',
        '../libdvbtee/libdvbpsi/src/descriptors/dr_83.c',
        '../libdvbtee/libdvbpsi/src/descriptors/dr_86.c',
        # '../libdvbtee/libdvbpsi/src/descriptors/dr_8a.c',
        '../libdvbtee/libdvbpsi/src/descriptors/dr_a0.c',
        '../libdvbtee/libdvbpsi/src/descriptors/dr_a1.c',
      ],
      'conditions': [
        ['OS=="mac"',
          {
            'xcode_settings': {
              'WARNING_CFLAGS': [
                '-Wno-deprecated-declarations'
              ]
            }
          }
        ]
      ],
      'cflags!': ['-Wdeprecated-declarations','-Wimplicit-function-declaration'],
      'cflags+': ['-Wno-deprecated-declarations','-Wno-implicit-function-declaration','-std=c99'],
    },
  ]
}
Of course, much of this gyp file is specific to libdvbpsi and my use case. You will notice that quite a few of the library's source files are not needed for the build used by my node.js addon module; those are commented out with a leading hash (#) character.
We link this library into the node module we're building by listing it in the dependencies section of the node.js addon module's bindings.gyp. Here is the one used in my addon module:
{
  "targets": [
    {
      "target_name": "dvbtee",
      "sources": [
        "src/node-dvbtee.cc",
        "src/dvbtee-parser.cc"
      ],
      "dependencies": [
        'deps/libdvbtee.gyp:dvbtee_parser'
      ],
      "include_dirs": [
        "libdvbtee/usr/include",
        "libdvbtee/libdvbtee",
        "libdvbtee/libdvbtee/decode",
        "libdvbtee/libdvbtee/decode/table",
        "libdvbtee/libdvbtee/decode/descriptor",
        "<!(node -e \"require('nan')\")"
      ],
      'cflags': [ '-DDEBUG_CONSOLE=1' ],
      'cflags_cc': [ '-DDEBUG_CONSOLE=1', '-Wno-deprecated-declarations' ],
      'cflags!': [ '-fno-exceptions' ],
      'cflags_cc!': [ '-fno-exceptions', '-Wdeprecated-declarations' ],
      'conditions': [
        ['OS=="mac"', {
          'xcode_settings': {
            'WARNING_CFLAGS': [
              '-Wno-deprecated-declarations'
            ],
            'GCC_ENABLE_CPP_EXCEPTIONS': 'YES'
          }
        }]
      ]
    }
  ]
}
As you can see, deps/libdvbtee.gyp:dvbtee_parser is listed in the dependencies section above. That target in turn has its own dependencies section:
"dependencies": [
'libdvbpsi.gyp:dvbpsi'
],
So, when npm install is executed, npm runs the preinstall script to fetch the sources, builds the custom libdvbpsi library from libdvbpsi.gyp, then builds the custom libdvbtee from libdvbtee.gyp (which depends on that libdvbpsi build), and finally builds and links the node.js addon module against the libdvbtee build.
In my specific case, the libraries need to be configured before we attempt to build them; this step writes the config.h header file that these libraries depend on. I handle it in the scripts/configure-build.js script, which runs after the sources are downloaded. In most cases you will simply want to run ./configure for each library, but that depends on the libraries you're including.
This is a cross-platform solution, provided that the libraries you're building are themselves cross-platform.
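Applied to the libpng case from the question, a minimal prepare-build sketch could look like the following (the version number, download URL, and directory names are illustrative assumptions, not something libpng or npm prescribes):
#!/bin/sh
# Hypothetical scripts/prepare-build.sh that pins a libpng source release
set -e
cd "$(dirname "$0")"/..
LIBPNG_VERSION=1.6.37   # assumption: pin whichever release you have tested
if [ -e libpng ]; then
  echo libpng sources present
else
  curl -L "https://download.sourceforge.net/libpng/libpng-${LIBPNG_VERSION}.tar.gz" | tar xz
  mv "libpng-${LIBPNG_VERSION}" libpng
fi
You would then describe libpng's sources in a deps/libpng.gyp the same way libdvbpsi.gyp does above, and list it in the dependencies section of your bindings.gyp.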
You can add this to the scripts section in package.json, but you have to be careful about which devices your application will be executed on, such as ARM, Intel 32-bit, or Intel 64-bit. You have options; I am just adding some hints here, and you can adapt your code accordingly. These scripts get executed during the npm install command.
1. In the script, check the machine type and download the matching library accordingly.
// package.json
{
  "scripts": {
    "preinstall": "",
    "install": "",
    "test": ""
  }
}
In the script, check the machine type and download the library accordingly, for example with something like wget .../abc.so in the install entry. You will need some scripting to pick the right library for the machine and put it in the right place, as sketched below.
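For instance, a hypothetical install-time script (the file names and URL are placeholders, continuing the abc.so example above) could switch on uname -m:
#!/bin/sh
# Hypothetical: pick the prebuilt binary that matches the machine type
case "$(uname -m)" in
  x86_64)              LIB=abc-x64.so ;;
  i386|i486|i586|i686) LIB=abc-ia32.so ;;
  armv7l|aarch64)      LIB=abc-arm.so ;;
  *) echo "unsupported machine type: $(uname -m)" >&2; exit 1 ;;
esac
wget -O abc.so "https://example.com/prebuilt/$LIB"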
2. Alternatively, you can add a build script that downloads the source code and builds it on the system on the fly:
git clone git://xyz/abc.git
cd abc
./configure
make
make install
You can also look at the Babel CLI for source compilation: https://babeljs.io/docs/usage/cli/
All of this goes in the scripts section, under preinstall, install, or test.
In your case, you would probably prefer the first approach.

rabbitmq - custom config file - disk_free_limit not set properly

I've properly installed (RPM-based) a RabbitMQ cluster (with the clusterer plugin) on RHEL7 and created the "custom" configuration files:
/etc/rabbitmq/rabbitmq-env.config => environment variables
/etc/rabbitmq/rabbitmq.config => RabbitMQ properties
The RabbitMQ cluster works fine except that my parameters are ignored. Any idea why?
Thanks in advance for your help.
kr,
O.
NB: if I set the parameters myself with a command like:
rabbitmqctl set_disk_free_limit "1g"
(for the disk limit, for example), it works, but I want them to survive a reboot :/
Here are my configurations files:
# /etc/rabbitmq/rabbitmq-env.config
(..)
NODE_PORT=5672
NODENAME=rabbit@node1
RABBITMQ_CONFIG_FILE=/etc/rabbitmq/rabbitmq.config
(..)
cat << EOF > /etc/rabbitmq/rabbitmq.config
[
  {kernel, [
  ]},
  {rabbit, [
    {cluster_nodes, ["rabbit@node1", "rabbit@node2", "rabbit@node3"], disc}
    {tcp_listeners, [5672]},
    {disk_free_limit, "1GB"},
    {collect_statistics_interval, 10000},
    {heartbeat, 30},
    {cluster_partition_handling, autoheal},
    {default_user, <<"guest">>},
    {default_pass, <<"guest">>}
  ]},
  {rabbitmq_clusterer, [
    {config, [ {version,1}, {nodes,["rabbit@node1", "rabbit@node2", "rabbit@node3"]} ]}
  ]}
]
EOF
A little update on this topic: I had misconfigured my RabbitMQ files. To get a working configuration, make the following modifications.
kr,
O.
For the environment file: we can get rid of the '.config' part in the RABBITMQ_CONFIG_FILE value, as RabbitMQ adds it anyway.
In my log file, I had an error mentioning "... /etc/rabbitmq/rabbitmq.config.config ...".
So keep the file with the .config extension (/etc/rabbitmq/rabbitmq.config) but set the env variable without the .config:
(..)
RABBITMQ_CONFIG_FILE=/etc/rabbitmq/rabbitmq
(..)
For the rabbitmq.config file: as I use the clusterer plugin, we can get rid of the cluster_nodes line.
Your file will look like this one:
cat << EOF > /etc/rabbitmq/rabbitmq.config
[
  {kernel, [
  ]},
  {rabbit, [
    {tcp_listeners, [5672]},
    {disk_free_limit, "1GB"},
    {collect_statistics_interval, 10000},
    {heartbeat, 30},
    {cluster_partition_handling, autoheal}
  ]},
  {rabbitmq_management, [
    {http_log_dir,"/myapps/myproject/rabbitmq/logs"},
    {listener, [{port, 15672 }]}
  ]},
  {rabbitmq_clusterer, [
    {config, [ {version,1}, {nodes,["rabbit@node01", "rabbit@node02", "rabbit@node03"]} ]}
  ]}
].
EOF
To verify your current config for the clusterer plugin you can use:
rabbitmqctl eval 'rabbit_clusterer:status().'
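Similarly, to check that disk_free_limit itself was picked up after a restart, one option (a sketch; rabbitmqctl eval runs an arbitrary Erlang expression on the node, and the return shape can vary by version) is:
rabbitmqctl eval 'application:get_env(rabbit, disk_free_limit).'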
