I am getting the following error:
> XXX Created runtime cucumber.runtime.RuntimeOptions#477b4cdf
> cucumber.runtime.CucumberException: java.lang.NoSuchMethodException:
> cucumber.runtime.SerenityBackend.<init>(cucumber.runtime.io.ResourceLoader, io.cucumber.stepexpression.TypeRegistry)
> at cucumber.runtime.Reflections.newInstance(Reflections.java:53)
> at cucumber.runtime.Reflections.instantiateSubclasses(Reflections.java:35)
> at cucumber.runtime.Runtime.loadBackends(Runtime.java:89)
> at cucumber.runtime.Runtime.<init>(Runtime.java:42)
> at net.serenitybdd.cucumber.CucumberWithSerenityRuntime.createSerenityEnabledRuntime(CucumberWithSerenityRuntime.java:38)
> at net.serenitybdd.cucumber.CucumberWithSerenityRuntime.using(CucumberWithSerenityRuntime.java:28)
> at net.serenitybdd.cucumber.CucumberWithSerenity.createRuntime(CucumberWithSerenity.java:58)
> at cucumber.api.junit.Cucumber.<init>(Cucumber.java:63)
> at net.serenitybdd.cucumber.CucumberWithSerenity.<init>(CucumberWithSerenity.java:39)
> . . .
>
> at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)
> Caused by: java.lang.NoSuchMethodException: cucumber.runtime.SerenityBackend.<init>(cucumber.runtime.io.ResourceLoader, io.cucumber.stepexpression.TypeRegistry)
> at java.lang.Class.getConstructor0(Class.java:3082)
> at java.lang.Class.getConstructor(Class.java:1825)
> at cucumber.runtime.Reflections.newInstance(Reflections.java:45)
> ... 22 more
>
> Process finished with exit code -1
It's a Maven project using Serenity.
I hope you can help me with this error; I would be very grateful.
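For reference, here is how I plan to check which versions Maven actually resolves (a diagnostic sketch; the group ids below are assumptions covering both the old and new Cucumber coordinates, since a missing SerenityBackend constructor usually means serenity-cucumber and cucumber-core come from incompatible release lines):

# Diagnostic sketch: list every Cucumber/Serenity artifact on the resolved classpath.
# info.cukes is the pre-3.x Cucumber group id, io.cucumber the newer one; seeing a
# mix of release lines here would explain the NoSuchMethodException on <init>.
mvn dependency:tree -Dincludes=io.cucumber,info.cukes,net.serenity-bdd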
I have hbase-2.2.6 and spark-3.3.1 installed on my cluster, and I want to perform operations on HBase using Spark, but I am unable to do so: the write fails with a ClassNotFoundException for TableDescriptor.
I have tried to set up the hbase-spark connector following https://github.com/LucaCanali/Miscellaneous/blob/master/Spark_Notes/Spark_HBase_Connector.md, but the connector is not working:
/opt/spark/spark-3.3.1-bin-hadoop3$ bin/pyspark --master yarn --num-executors 1 --executor-cores 2 --jars /opt/spark/spark-3.3.1-bin-hadoop3/jars/hbase-spark-1.0.1-SNAPSHOT.jar --packages org.apache.hbase:hbase-shaded-mapreduce:2.2.6
Python 3.8.10 (default, Nov 14 2022, 12:59:47)
[GCC 9.4.0] on linux
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /__ / .__/\_,_/_/ /_/\_\   version 3.3.1
      /_/
Using Python version 3.8.10 (default, Nov 14 2022 12:59:47)
Spark context Web UI available at http://node2.ellicium.com:4040
Spark context available as 'sc' (master = yarn, app id = application_1675083498730_0062).
SparkSession available as 'spark'.
>>> df = spark.sql("select id, 'myline_'||id name from range(20)")
>>> df.show()
+---+---------+
| id|     name|
+---+---------+
|  0| myline_0|
|  1| myline_1|
|  2| myline_2|
|  3| myline_3|
|  4| myline_4|
|  5| myline_5|
|  6| myline_6|
|  7| myline_7|
|  8| myline_8|
|  9| myline_9|
| 10|myline_10|
| 11|myline_11|
| 12|myline_12|
| 13|myline_13|
| 14|myline_14|
| 15|myline_15|
| 16|myline_16|
| 17|myline_17|
| 18|myline_18|
| 19|myline_19|
+---+---------+

>>> df.write.format("org.apache.hadoop.hbase.spark").option("hbase.columns.mapping","id INT :key, name STRING cf:name").option("hbase.namespace", "default").option("hbase.table", "testspark").option("hbase.spark.use.hbasecontext", False).save()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/opt/spark/spark-3.3.1-bin-hadoop3/python/pyspark/sql/readwriter.py", line 966, in save
    self._jwrite.save()
  File "/opt/spark/spark-3.3.1-bin-hadoop3/python/lib/py4j-0.10.9.5-src.zip/py4j/java_gateway.py", line 1321, in __call__
  File "/opt/spark/spark-3.3.1-bin-hadoop3/python/pyspark/sql/utils.py", line 190, in deco
    return f(*a, **kw)
  File "/opt/spark/spark-3.3.1-bin-hadoop3/python/lib/py4j-0.10.9.5-src.zip/py4j/protocol.py", line 326, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o56.save.
: java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/client/TableDescriptor
    at org.apache.hadoop.hbase.spark.DefaultSource.createRelation(DefaultSource.scala:78)
    at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:47)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:75)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:73)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:84)
    at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.$anonfun$applyOrElse$1(QueryExecution.scala:98)
    at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$6(SQLExecution.scala:109)
    at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:169)
    at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:95)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:779)
    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
    at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:98)
    at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:94)
    at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:584)
    at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:176)
    at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:584)
    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:30)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263)
    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
    at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:560)
    at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:94)
    at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:81)
    at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:79)
    at org.apache.spark.sql.execution.QueryExecution.assertCommandExecuted(QueryExecution.scala:116)
    at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:860)
    at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:390)
    at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:363)
    at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:247)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:282)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
    at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.client.TableDescriptor
    at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
    ... 42 more
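One variant of the launch command I intend to try (a sketch, not a confirmed fix): TableDescriptor belongs to the HBase client API, so also pulling org.apache.hbase:hbase-shaded-client may put the missing class on the driver and executor classpaths. The 2.2.6 version below is my assumption, chosen to match the installed HBase:

# Hedged variant of the original launch: additionally fetch the shaded HBase client
# jar, which contains org.apache.hadoop.hbase.client.TableDescriptor.
bin/pyspark --master yarn --num-executors 1 --executor-cores 2 \
  --jars /opt/spark/spark-3.3.1-bin-hadoop3/jars/hbase-spark-1.0.1-SNAPSHOT.jar \
  --packages org.apache.hbase:hbase-shaded-mapreduce:2.2.6,org.apache.hbase:hbase-shaded-client:2.2.6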
I've been getting the following error:
> react-scripts build
> Creating an optimized production build...
> Failed to compile.
> ./node_modules/react-native-stars/index.js
> SyntaxError:
> /mnt/c/linux_share/ayelho-mockup/node_modules/react-native-stars/index.js:
> Unexpected token (35:8)
> 33 | return this.props.opacity ?
> 34 | this.isReactElement(this.props.fullStar) ?
> > 35 | <View style={{opacity: partial}}>
> > | ^
> 36 | {this.props.fullStar}
> 37 | </View>
> 38 | :
I've tried using react-native-star-rating and it gives a similar error (it fails on a JSX tag in the same way). So it is probably something in my setup. Is there something I need to do to tell it that `<View>` is a tag?
The main app is written in TypeScript.
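What I understand so far (a sketch of the mechanism, not a confirmed diagnosis): `<View ...>` is JSX, which has to be transpiled to plain JavaScript before the bundler sees it, and react-scripts does not transpile JSX inside node_modules. A quick check that the package really ships raw JSX:

# If this prints JSX tags, the package is published untranspiled, and the
# "Unexpected token" comes from the bundler parsing raw JSX in node_modules.
grep -n "<View" node_modules/react-native-stars/index.js | head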
On Debian running headless on an ARM-based single-board computer (C.H.I.P.), I have this in my /etc/rc.local:
#!/bin/sh -e
#
# rc.local
#
# This script is executed at the end of each multiuser runlevel.
# Make sure that the script will "exit 0" on success or any other
# value on error.
#
# In order to enable or disable this script just change the execution
# bits.
#
# By default this script does nothing.
source /usr/local/bin/gpio.sh
gpio_export CSID5
gpio_direction CSID5 out
gpio_output CSID5 1
gpio_export XIO_P1
gpio_direction XIO_P1 in
export DBUS_SESSION_BUS_ADDRESS=unix:path=/run/dbus/system_bus_socket
export DISPLAY=:0
cd /home/chip/Inhibition
now=$(date +"%m_%d_%Y")
date >> "log_$now.log"
jackd -P99 -dalsa -Phw:1 -p8192 -n3 -s -r44100 &
sleep 30
sclang main.scd >> "log_$now.log"
exit 0
When running the script as root with
/etc/rc.local
everything runs as expected.
However, when booting, the last command (sclang main.scd >> "log_$now.log") is never run. (sclang is the command-line interpreter of SuperCollider.) I did try running some test commands to see whether execution stops when launching jackd, but those commands are executed as expected; it's just SuperCollider that fails to start at boot time.
Any ideas?
PS: I remember setting this up in a similar way on a Raspberry Pi running Raspbian. I can't confirm it now, but I am almost sure the code was like this.
EDIT
I tried comparing the shell environments from the command prompt (once booted) and from within rc.local (env1 is the command shell, env2 is the rc.local shell):
diff env1 env2
1,2c1,4
< BASH=/bin/bash
< BASHOPTS=checkwinsize:cmdhist:complete_fullquote:expand_aliases:extquote:force_fignore:hostcomplete:interactive_comments:progcomp:promptvars:sourcepath
---
> AP_EINT1=193
> AP_EINT3=35
> BASH=/bin/sh
> BASHOPTS=cmdhist:complete_fullquote:extquote:force_fignore:hostcomplete:interactive_comments:progcomp:promptvars:sourcepath
4,5c6,7
< BASH_ARGC=()
< BASH_ARGV=()
---
> BASH_ARGC=([0]="1")
> BASH_ARGV=([0]="start")
7,8c9,10
< BASH_LINENO=()
< BASH_SOURCE=()
---
> BASH_LINENO=([0]="0")
> BASH_SOURCE=([0]="/etc/rc.local")
11c13,25
< COLUMNS=120
---
> CSICK=129
> CSID0=132
> CSID1=133
> CSID2=134
> CSID3=135
> CSID4=136
> CSID5=137
> CSID6=138
> CSID7=139
> CSIHSYNC=130
> CSIPCK=128
> CSIVSYNC=131
> DBUS_SESSION_BUS_ADDRESS=unix:path=/run/dbus/system_bus_socket
12a27
> DISPLAY=:0
13a29,30
> GPIO=1017
> GPIO_HASH=([TWI1_SCK]="47" [UART1_TX]="195" [LCD_D22]="118" [LCD_DE]="121" [LCD_D23]="119" [LCD_D20]="116" [LCD_D21]="117" [PWM0]="34" [CSICK]="129" [CSID3]="135" [LCD_D13]="109" [CSID2]="134" [LCD_D12]="108" [CSID1]="133" [LCD_D11]="107" [CSID0]="132" [LCD_D10]="106" [CSID7]="139" [CSID6]="138" [CSID5]="137" [LCD_D15]="111" [CSID4]="136" [LCD_D14]="110" [LCD_D19]="115" [LCD_D18]="114" [LCD_HSYNC]="122" [AP_EINT3]="35" [AP_EINT1]="193" [XIO_P3]="1019" [CSIHSYNC]="130" [LCD_CLK]="120" [XIO_P2]="1018" [XIO_P1]="1017" [TWI2_SCK]="49" [XIO_P0]="1016" [XIO_P7]="1023" [XIO_P6]="1022" [XIO_P5]="1021" [XIO_P4]="1020" [TWI1_SDA]="48" [UART1_RX]="196" [CSIVSYNC]="131" [LCD_D2]="98" [LCD_D3]="99" [LCD_D4]="100" [LCD_D5]="101" [LCD_D6]="102" [LCD_VSYNC]="123" [LCD_D7]="103" [TWI2_SDA]="50" [CSIPCK]="128" )
15,18d31
< HISTFILE=/root/.bash_history
< HISTFILESIZE=500
< HISTSIZE=500
< HOME=/root
21c34,35
< IFS=$' \t\n'
---
> IFS='
> '
23,27c37,58
< LC_ALL=en_US.UTF-8
< LC_CTYPE=UTF-8
< LINES=32
< LOGNAME=root
< LS_COLORS='rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arc=01;31:*.arj=01;31:*.taz=01;31:*.lha=01;31:*.lz4=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.tzo=01;31:*.t7z=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.dz=01;31:*.gz=01;31:*.lrz=01;31:*.lz=01;31:*.lzo=01;31:*.xz=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.alz=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.cab=01;31:*.jpg=01;35:*.jpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.ogm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.axv=01;35:*.anx=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=00;36:*.au=00;36:*.flac=00;36:*.m4a=00;36:*.mid=00;36:*.midi=00;36:*.mka=00;36:*.mp3=00;36:*.mpc=00;36:*.ogg=00;36:*.ra=00;36:*.wav=00;36:*.axa=00;36:*.oga=00;36:*.spx=00;36:*.xspf=00;36:'
---
> LCD_CLK=120
> LCD_D10=106
> LCD_D11=107
> LCD_D12=108
> LCD_D13=109
> LCD_D14=110
> LCD_D15=111
> LCD_D18=114
> LCD_D19=115
> LCD_D2=98
> LCD_D20=116
> LCD_D21=117
> LCD_D22=118
> LCD_D23=119
> LCD_D3=99
> LCD_D4=100
> LCD_D5=101
> LCD_D6=102
> LCD_D7=103
> LCD_DE=121
> LCD_HSYNC=122
> LCD_VSYNC=123
29,30c60
< MAIL=/var/mail/root
< MAILCHECK=60
---
> OLDPWD=/
36,38c66,67
< PPID=489
< PS1='${debian_chroot:+($debian_chroot)}\u#\h:\w\$ '
< PS2='> '
---
> POSIXLY_CORRECT=y
> PPID=1
40a70
> PWM0=34
42c72
< SHELLOPTS=braceexpand:emacs:hashall:histexpand:history:interactive-comments:monitor
---
> SHELLOPTS=braceexpand:errexit:hashall:interactive-comments:posix
44,48c74,80
< SUDO_COMMAND=/bin/su
< SUDO_GID=1000
< SUDO_UID=1000
< SUDO_USER=chip
< TERM=xterm-256color
---
> TERM=dumb
> TWI1_SCK=47
> TWI1_SDA=48
> TWI2_SCK=49
> TWI2_SDA=50
> UART1_RX=196
> UART1_TX=195
50,53c82,94
< USER=root
< USERNAME=root
< XDG_SESSION_ID=c1
< _=clear
---
> XIO_BASE=1016
> XIO_BASE_FILE=/sys/class/gpio/gpiochip1016/base
> XIO_LABEL_FILE=/sys/class/gpio/gpiochip1016/label
> XIO_P0=1016
> XIO_P1=1017
> XIO_P2=1018
> XIO_P3=1019
> XIO_P4=1020
> XIO_P5=1021
> XIO_P6=1022
> XIO_P7=1023
> _=
> now=10_27_2016
The only weird thing I notice is that /etc/rc.local has BASH set to /bin/sh instead of /bin/bash. I thought this could be a problem, since I do change some environment variables, so I moved my script (from the first export and below) into a run.sh file (which again works fine on its own) and added this to my /etc/rc.local:
/bin/bash /home/chip/Inhibition/run.sh
Unfortunately the results are the same: jackd does run, but no sclang after that.
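The next thing I plan to try is capturing stderr as well (a diagnostic sketch, assuming sclang reports its failure reason there): the current redirection only logs stdout, so whatever kills sclang at boot leaves no trace. I would replace the last command in run.sh with:

# Capture sclang's stderr and exit status too; the original line only logs stdout,
# so a boot-time failure leaves nothing in log_$now.log.
sclang main.scd >> "log_$now.log" 2>> "err_$now.log"
echo "sclang exit status: $?" >> "err_$now.log"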
For auto-completion, I have installed YouCompleteMe and vim-lua-ftplugin, but when I try to trigger auto-completion, I get this error:
> Error detected while processing function
> xolox#lua#omnifunc..xolox#lua#getomnivariables..xolox#lua#dofile..xolox#misc#os#exec:
> line 163:
> E605: Exception not caught: vim-misc 1.17.6: External command failed with exit code
> 139!^#Command line: sh -c '('\''lua'\''
> '\''/home/meijieru/.vim/bundle/vim-lua-ftplugin/misc/lua-ftplugin/omnicomplete.lua'\''
> '\''argcheck'\'' '\''argcheck.doc'\'' '\''argcheck.dump'\''
> '\''argcheck.env'\'' '\''argcheck.graph'\'' '\''argcheck.init'\''
> '\''argcheck.usage'\'' '\''argcheck.utils'\'' '\''base64'\''
> '\''bit32'\'' '\''cjson'\'' '\''cjson.util'\'' '\''coroutine'\''
> '\''crypto'\'' '\''cwrap'\'' '\''cwrap.cinterface'\''
> '\''cwrap.init'\'' '\''cwrap.types'\'' '\''debug'\'' '\''dok'\''
> '\''dok.init'\'' '\''dok.inline'\'' '\''env'\'' '\''env.init'\''
> '\''fftw3'\'' '\''fftw3.cdefs'\'' '\''fftw3.defines'\''
> '\''fftw3.init'\'' '\''gnuplot'\'' '\''gnuplot.gnuplot'\''
> '\''gnuplot.hist'\'' '\''gnuplot.init'\'' '\''graph'\''
> '\''graph.Edge'\'' '\''graph.graphviz'\'' '\''graph.init'\''
> '\''graph.Node'\'' '\''graphicsmagick'\'' '\''graphicsmagick.convert'\''
> '\''graphicsmagick.exif'\'' '\''graphicsmagick.Image'\''
> '\''graphicsmag
> Press ENTER or type command to continue
> Error detected while processing function
> xolox#lua#omnifunc..xolox#lua#getomnivariables..xolox#lua#dofile:
> line 41:
> E171: Missing :endif
> Press ENTER or type command to continue
> Error detected while processing function xolox#lua#omnifunc:
> line 10:
> E171: Missing :endif
I'm mostly working in torch7. I'd really appreciate a way of getting library methods suggested while typing.
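One diagnostic I can run myself (a sketch, using the paths from the error message): exit code 139 is 128+11, i.e. the external lua process dies with SIGSEGV, so reproducing the command outside Vim should show whether the Lua interpreter itself crashes:

# Re-run the external command that vim-misc reports as failing; the real invocation
# passes the full module list, shortened here to a single argument for the test.
lua ~/.vim/bundle/vim-lua-ftplugin/misc/lua-ftplugin/omnicomplete.lua argcheck
echo "exit status: $?"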
Noob here. I've been trying to run a Node.js "twit" app on Heroku, and it just seems to keep timing out and crashing. I've set it to make a call every 45 seconds, which is way too often, but still within the 60-second timeout that Heroku seems to have, and it still happens:
> 2014-05-24T13:39:36.591605+00:00 heroku[web.1]: Starting process with command `node rtd2.js`
> 2014-05-24T13:40:22.733065+00:00 app[web.1]:
> 2014-05-24T13:40:22.733072+00:00 app[web.1]: # followers:404
> 2014-05-24T13:40:22.972796+00:00 app[web.1]: Favorite: favorited response: 470194277931552800
> 2014-05-24T13:40:22.972789+00:00 app[web.1]:
> 2014-05-24T13:40:25.771628+00:00 heroku[run.2775]: State changed from starting to up
> 2014-05-24T13:40:25.727245+00:00 heroku[run.2775]: Starting process with command `node`
> 2014-05-24T13:40:25.692309+00:00 heroku[run.2775]: Awaiting client
> 2014-05-24T13:40:23.184143+00:00 heroku[api]: Starting process with command `node` by username
> 2014-05-24T13:40:38.426516+00:00 heroku[web.1]: State changed from starting to crashed
> 2014-05-24T13:40:36.888774+00:00 heroku[web.1]: Error R10 (Boot timeout) -> Web process failed to bind to $PORT within 60 seconds of launch
> 2014-05-24T13:40:36.888952+00:00 heroku[web.1]: Stopping process with SIGKILL
> 2014-05-24T13:40:38.417287+00:00 heroku[web.1]: Process exited with status 137
> 2014-05-24T13:41:54.765029+00:00 heroku[run.2775]: Client connection closed. Sending SIGHUP to all processes
> 2014-05-24T13:41:56.062749+00:00 heroku[run.2775]: State changed from up to complete
> 2014-05-24T13:41:56.053479+00:00 heroku[run.2775]: Process exited with status 129
> 2014-05-24T13:43:44.649471+00:00 heroku[web.1]: State changed from crashed to starting
> 2014-05-24T13:43:47.654203+00:00 app[web.1]: RTD2: Running.
> 2014-05-24T13:43:46.357298+00:00 heroku[web.1]: Starting process with command `node rtd2.js`
> 2014-05-24T13:44:32.992131+00:00 app[web.1]:
> 2014-05-24T13:44:32.992136+00:00 app[web.1]: # followers:404
> 2014-05-24T13:44:33.365530+00:00 app[web.1]:
> 2014-05-24T13:44:33.365535+00:00 app[web.1]: SearchFollow: followed #ChloixxHDM
> 2014-05-24T13:44:46.459802+00:00 heroku[web.1]: Error R10 (Boot timeout) -> Web process failed to bind to $PORT within 60 seconds of launch
> 2014-05-24T13:44:46.459802+00:00 heroku[web.1]: Stopping process with SIGKILL
> 2014-05-24T13:44:47.883661+00:00 heroku[web.1]: State changed from starting to crashed
> 2014-05-24T13:44:47.862070+00:00 heroku[web.1]: Process exited with status 137
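My current reading of the log (hedged, since I'm new to this): Error R10 means Heroku expected a web process to bind to $PORT within 60 seconds, but a bot that only polls Twitter never listens on a port, so it should probably run as a worker dyno instead. A sketch of that change, with rtd2.js taken from the log above:

# Procfile: declare the bot as a worker so Heroku stops expecting it to bind $PORT.
#   worker: node rtd2.js
heroku ps:scale web=0 worker=1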