TypeError: jsdom.createVirtualConsole is not a function - node.js

I am trying to build upon the basic Scala.js tutorial and am running into this weird error.
The project set-up isn't much different from the one shown in the tutorial, but just in case, here's my build.sbt:
enablePlugins(ScalaJSPlugin)

scalaVersion := "2.12.1"

name := "algorithms1_4_34"

version := "1.0"

libraryDependencies ++= Seq(
  "org.scalatest" % "scalatest_2.12" % "3.0.1" % "test",
  "org.scalacheck" %% "scalacheck" % "1.13.4" % "test",
  "org.scala-js" % "scalajs-dom_sjs0.6_2.12" % "0.9.1",
  "be.doeraene" %%% "scalajs-jquery" % "0.9.1"
)

// This is an application with a main method
scalaJSUseMainModuleInitializer := true

skip in packageJSDependencies := false

jsDependencies += "org.webjars" % "jquery" % "2.1.4" / "2.1.4/jquery.js"

jsDependencies += RuntimeDOM
...and the JSApp file:
package ca.vgorcinschi.algorithms1_4_34

import scala.scalajs.js.JSApp
import org.scalajs.jquery.jQuery

object HotAndColdJS extends JSApp {

  def main(): Unit = {
    jQuery(() => setupUI())
  }

  def addClickedMessage(): Unit = {
    jQuery("body").append("<p>You clicked the button!</p>")
  }

  def setupUI(): Unit = {
    // click invokes an event handler
    jQuery("#click-me-button").click(() => addClickedMessage())
    jQuery("body").append("<p>Hello World!</p>")
  }
}
I can run the compile, fastOptJS, reload and eclipse (I am using the eclipse plugin) commands without problems. The only issue is the run command. To be fair, I did add something to the flow of the tutorial, but only because running just npm install jsdom from the root of the application led to a failure in run as well (npm WARN enoent ENOENT). So, as advised here, I ran:
npm init
npm install
npm install jsdom
And this is where I am now. This is the error I get when running the app with run:
> run
[info] Running ca.vgorcinschi.algorithms1_4_34.HotAndColdJS
[error] [stdin]:40
[error] virtualConsole: jsdom.createVirtualConsole().sendTo(console),
[error] ^
[error]
[error] TypeError: jsdom.createVirtualConsole is not a function
[error] at [stdin]:40:27
[error] at [stdin]:61:3
[error] at ContextifyScript.Script.runInThisContext (vm.js:23:33)
[error] at Object.runInThisContext (vm.js:95:38)
[error] at Object.<anonymous> ([stdin]-wrapper:6:22)
[error] at Module._compile (module.js:571:32)
[error] at evalScript (bootstrap_node.js:391:27)
[error] at Socket.<anonymous> (bootstrap_node.js:188:13)
[error] at emitNone (events.js:91:20)
[error] at Socket.emit (events.js:188:7)
org.scalajs.jsenv.ExternalJSEnv$NonZeroExitException: Node.js with JSDOM exited with code 1
at org.scalajs.jsenv.ExternalJSEnv$AbstractExtRunner.waitForVM(ExternalJSEnv.scala:107)
at org.scalajs.jsenv.ExternalJSEnv$ExtRunner.run(ExternalJSEnv.scala:156)
at org.scalajs.sbtplugin.ScalaJSPluginInternal$.org$scalajs$sbtplugin$ScalaJSPluginInternal$$jsRun(ScalaJSPluginInternal.scala:697)
at org.scalajs.sbtplugin.ScalaJSPluginInternal$$anonfun$73$$anonfun$apply$48$$anonfun$apply$49.apply(ScalaJSPluginInternal.scala:814)
at org.scalajs.sbtplugin.ScalaJSPluginInternal$$anonfun$73$$anonfun$apply$48$$anonfun$apply$49.apply(ScalaJSPluginInternal.scala:808)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
[trace] Stack trace suppressed: run last compile:run for the full output.
[error] (compile:run) org.scalajs.jsenv.ExternalJSEnv$NonZeroExitException: Node.js with JSDOM exited with code 1
[error] Total time: 4 s, completed 23-May-2017 9:24:20 PM
I would appreciate it if anyone could give me a hand with this.

jsdom v10 introduced some breaking changes with respect to v9, and Scala.js <= 0.6.15 was not prepared for them; that is what's causing the error you're hitting.
Upgrading to Scala.js 0.6.16 will fix your issue: it supports both jsdom v9 and v10.
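For reference, the upgrade is a one-line change. A minimal sketch, assuming the standard project/plugins.sbt layout from the tutorial (only the version number changes):

addSbtPlugin("org.scala-js" % "sbt-scalajs" % "0.6.16")

After editing it, reload (or restart sbt) and run again; build.sbt and the npm-installed jsdom should not need any changes, since 0.6.16 works with both jsdom versions.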

Related

Error: Can't resolve 'fs' and Error: Can't resolve 'path' in '.\node_modules\bindings' after libxmljs is added in angular 13 project

My Angular project has started giving an error during build after the addition of the libxmljs package. The strange thing is that the error points to the folder \node_modules\bindings instead of \node_modules\libxmljs.
The only code changes were:
"libxmljs": "^0.19.7" in package.json
and
let xsdDoc = libxmljs.parseXml(xsdFile); in myproject.component.ts
My environment:
Node v18.0.0
NPM 8.6.0
No Angular CLI; it's a Maven project.
Error log -
...
[INFO] ./node_modules/bindings/bindings.js:4:9-22 - Error: Module not found: Error: Can't resolve 'fs' in 'D:\my-frontend\node_modules\bindings'
[INFO]
[INFO] ./node_modules/bindings/bindings.js:5:11-26 - Error: Module not found: Error: Can't resolve 'path' in 'D:\my-frontend\node_modules\bindings'
[INFO]
[INFO] BREAKING CHANGE: webpack < 5 used to include polyfills for node.js core modules by default.
[INFO] This is no longer the case. Verify if you need this module and configure a polyfill for it.
[INFO]
[INFO] If you want to include a polyfill, you need to:
[INFO] - add a fallback 'resolve.fallback: { "path": require.resolve("path-browserify") }'
[INFO] - install 'path-browserify'
[INFO] If you don't want to include a polyfill, you can use an empty module like this:
[INFO] resolve.fallback: { "path": false }
[INFO]
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 07:51 min
...
Troubleshooting:
I tried "browser": { "fs": false, "path": false } with no luck.
I tried webpack.config.js, but had the same issue.
Is this a bug in libxmljs, and should I find another way to validate XMLs? It seems to be a node_modules/bindings-specific bug.
I resolved the issue by switching to Java, using javax.xml. Sample code for a WAR-based app:
@PostMapping("/xml/validate")
public ResourceEntity<Boolean> validateXmlWithXSD(HttpServletRequest request, @RequestHeader(value = "Authorization", required = true) String token, int xsdVersion, @RequestBody ProjectXml processIn) {
    //step- initializations
    ...
    try {
        //step- session validation
        ...
        //step- xsd access from webapp
        String xsdPath = "/assets/xsds/ProjectXsd." + xsdVersion + ".xsd";
        InputStream xsdStream = request.getSession().getServletContext().getResourceAsStream(xsdPath);
        if (xsdStream == null)
            throw new NullPointerException("Null xsd stream.");
        //step- xml validation against xsd
        Source source = new StreamSource(xsdStream);
        Schema schema = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI).newSchema(source);
        Validator validator = schema.newValidator();
        validator.validate(new StreamSource(new ByteArrayInputStream(processIn.getProcessXML())));
        isValidated = true;
    } catch (SAXException | IOException | NullPointerException e) {
        ...
    }
    //step- return result
}
In the case of a Spring Boot JAR app:
//step- xsd access from target/classes
String xsdPath = "/assets/xsds/ProjectXsd." + xsdVersion + ".xsd";
InputStream xsdStream = new ClassPathResource(xsdPath).getInputStream();
and in Maven I configured:
<resources>
  <resource>
    <directory>${project.build.outputDirectory}/assets/runtimeXsds</directory>
  </resource>
</resources>

SlimScroll Error: Module build failed at ng2-slimscroll/index.ts

I have followed the steps below, sourced from https://github.com/jkuri/ng2-slimscroll.
Installation:
npm install ng2-slimscroll
Imported the module like this:
import { SlimScrollModule } from 'ng2-slimscroll';

imports: [
  ....................,
  SlimScrollModule,
  ......................
]

I am facing an error in node_modules/ng2-slimscroll/index.ts:
ERROR in ./~/ng2-slimscroll/index.ts
Module build failed: Error
at new FatalError (E:\ReconWorkspace\AgreeGateway\node_modules\tslint\lib\error.js:40:23)
at Function.findConfiguration (E:\ReconWorkspace\AgreeGateway\node_modules\tslint\lib\configuration.js:47:15)
at resolveOptions (E:\ReconWorkspace\AgreeGateway\node_modules\tslint-loader\index.js:35:64)
at Object.module.exports (E:\ReconWorkspace\AgreeGateway\node_modules\tslint-loader\index.js:124:17)
# ./src/main/webapp/app/test/test.module.ts 21:23-48
# ./src/main/webapp/app/app.module.ts
# ./src/main/webapp/app/app.main.ts
# multi (webpack)-dev-server/client?http://localhost:9060 webpack/hot/dev-server ./src/main/webapp/app/app.main

Getting SparkFlumeProtocol and EventBatch not found errors when building Spark 1.6.2 on CentOS7

I was trying to build Spark 1.6.2 on CentOS7 and ran into the error below:
[error] /home/pateln16/spark-1.6.2/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkAvroCallbackHandler.scala:45: not found: type SparkFlumeProtocol
[error] val transactionTimeout: Int, val backOffInterval: Int) extends SparkFlumeProtocol with Logging {
[error] ^
[error] /home/pateln16/spark-1.6.2/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkAvroCallbackHandler.scala:70: not found: type EventBatch
[error] override def getEventBatch(n: Int): EventBatch = {
[error] ^
[error] /home/pateln16/spark-1.6.2/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/TransactionProcessor.scala:80: not found: type EventBatch
[error] def getEventBatch: EventBatch = {
[error] ^
[error] /home/pateln16/spark-1.6.2/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkSinkUtils.scala:25: not found: type EventBatch
[error] def isErrorBatch(batch: EventBatch): Boolean = {
[error] ^
[error] /home/pateln16/spark-1.6.2/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkAvroCallbackHandler.scala:85: not found: type EventBatch
[error] new EventBatch("Spark sink has been stopped!", "", java.util.Collections.emptyList())
[error] ^
[warn] Class org.jboss.netty.channel.ChannelFactory not found - continuing with a stub.
[warn] Class org.jboss.netty.channel.ChannelFactory not found - continuing with a stub.
[warn] Class org.jboss.netty.channel.ChannelPipelineFactory not found - continuing with a stub.
[warn] Class org.jboss.netty.handler.execution.ExecutionHandler not found - continuing with a stub.
[warn] Class org.jboss.netty.channel.ChannelFactory not found - continuing with a stub.
[warn] Class org.jboss.netty.handler.execution.ExecutionHandler not found - continuing with a stub.
[warn] Class org.jboss.netty.channel.group.ChannelGroup not found - continuing with a stub.
[error] /home/pateln16/spark-1.6.2/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkSink.scala:86: not found: type SparkFlumeProtocol
[error] val responder = new SpecificResponder(classOf[SparkFlumeProtocol], handler.get)
I met the same problem on Spark 2.0.0. I think the reason is that the file 'external\flume-sink\src\main\avro\sparkflume.avdl' is not compiled properly.
The problem can be resolved by:
Download Apache Avro: http://avro.apache.org/docs/current/gettingstartedjava.html (I downloaded all the jar files into the folder 'C:\Downloads\avro').
Go to the folder 'external\flume-sink\src\main\avro'.
Compile sparkflume.avdl to Java files:
java -jar C:\Downloads\avro\avro-tools-1.8.1.jar idl sparkflume.avdl > sparkflume.avpr
java -jar C:\Downloads\avro\avro-tools-1.8.1.jar compile -string protocol sparkflume.avpr ..\scala
Recompile your projects.
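As an aside, if you generate the Java classes somewhere other than the module's existing source tree, a minimal sbt sketch (sbt 0.13 syntax; the "generated" directory name is hypothetical) for making them visible to the flume-sink module would be:

unmanagedSourceDirectories in Compile += baseDirectory.value / "src" / "main" / "generated"

Generating them directly into ..\scala, as the commands above do, makes this unnecessary because that directory is already on the module's source path.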

Using Sparksql and SparkCSV with SparkJob Server

I am trying to JAR a simple Scala application which makes use of spark-csv and Spark SQL to create a DataFrame from a CSV file stored in HDFS, and then run a simple query that returns the max and min of a specific column in the CSV file.
I get an error when I use the sbt command to create the JAR, which I will later curl to the job server's /jars folder and execute from a remote machine.
Code:
import com.typesafe.config.{Config, ConfigFactory}
import org.apache.spark.SparkContext._
import org.apache.spark._
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

object sparkSqlCSV extends SparkJob {

  def main(args: Array[String]) {
    val conf = new SparkConf().setMaster("local[4]").setAppName("sparkSqlCSV")
    val sc = new SparkContext(conf)
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    val config = ConfigFactory.parseString("")
    val results = runJob(sc, config)
    println("Result is " + results)
  }

  override def validate(sc: sqlContext, config: Config): SparkJobValidation = {
    SparkJobValid
  }

  override def runJob(sc: sqlContext, config: Config): Any = {
    val value = "com.databricks.spark.csv"
    val ControlDF = sqlContext.load(value, Map("path" -> "hdfs://mycluster/user/Test.csv", "header" -> "true"))
    ControlDF.registerTempTable("Control")
    val aggDF = sqlContext.sql("select max(DieX) from Control")
    aggDF.collectAsList()
  }
}
Error:
[hduser#ptfhadoop01v spark-jobserver]$ sbt ashesh-jobs/package
[info] Loading project definition from /usr/local/hadoop/spark-jobserver/project
Missing bintray credentials /home/hduser/.bintray/.credentials. Some bintray features depend on this.
Missing bintray credentials /home/hduser/.bintray/.credentials. Some bintray features depend on this.
Missing bintray credentials /home/hduser/.bintray/.credentials. Some bintray features depend on this.
Missing bintray credentials /home/hduser/.bintray/.credentials. Some bintray features depend on this.
[info] Set current project to root (in build file:/usr/local/hadoop/spark-jobserver/)
[info] scalastyle using config /usr/local/hadoop/spark-jobserver/scalastyle-config.xml
[info] Processed 2 file(s)
[info] Found 0 errors
[info] Found 0 warnings
[info] Found 0 infos
[info] Finished in 9 ms
[success] created output: /usr/local/hadoop/spark-jobserver/ashesh-jobs/target
[warn] Credentials file /home/hduser/.bintray/.credentials does not exist
[info] Updating {file:/usr/local/hadoop/spark-jobserver/}ashesh-jobs...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] scalastyle using config /usr/local/hadoop/spark-jobserver/scalastyle-config.xml
[info] Processed 5 file(s)
[info] Found 0 errors
[info] Found 0 warnings
[info] Found 0 infos
[info] Finished in 1 ms
[success] created output: /usr/local/hadoop/spark-jobserver/job-server-api/target
[info] Compiling 2 Scala sources and 1 Java source to /usr/local/hadoop/spark-jobserver/ashesh-jobs/target/scala-2.10/classes...
[error] /usr/local/hadoop/spark-jobserver/ashesh-jobs/src/spark.jobserver/sparkSqlCSV.scala:8: object sql is not a member of package org.apache.spark
[error] import org.apache.spark.sql.SQLContext
[error] ^
[error] /usr/local/hadoop/spark-jobserver/ashesh-jobs/src/spark.jobserver/sparkSqlCSV.scala:14: object sql is not a member of package org.apache.spark
[error] val sqlContext = new org.apache.spark.sql.SQLContext(sc)
[error] ^
[error] /usr/local/hadoop/spark-jobserver/ashesh-jobs/src/spark.jobserver/sparkSqlCSV.scala:25: not found: type sqlContext
[error] override def runJob(sc: sqlContext, config: Config): Any = {
[error] ^
[error] /usr/local/hadoop/spark-jobserver/ashesh-jobs/src/spark.jobserver/sparkSqlCSV.scala:21: not found: type sqlContext
[error] override def validate(sc: sqlContext, config: Config): SparkJobValidation = {
[error] ^
[error] /usr/local/hadoop/spark-jobserver/ashesh-jobs/src/spark.jobserver/sparkSqlCSV.scala:27: not found: value sqlContext
[error] val ControlDF = sqlContext.load(value,Map("path"->"hdfs://mycluster/user/Test.csv","header"->"true"))
[error] ^
[error] /usr/local/hadoop/spark-jobserver/ashesh-jobs/src/spark.jobserver/sparkSqlCSV.scala:29: not found: value sqlContext
[error] val aggDF = sqlContext.sql("select max(DieX) from Control")
[error] ^
[error] 6 errors found
[error] (ashesh-jobs/compile:compileIncremental) Compilation failed
[error] Total time: 10 s, completed May 26, 2016 4:42:52 PM
[hduser#ptfhadoop01v spark-jobserver]$
I guess the main issue is that it's missing the dependencies for spark-csv and Spark SQL, but I have no idea where to place the dependencies before compiling the code with sbt.
I am issuing the following command to package the application; the source code is placed under the "ashesh_jobs" directory:
[hduser#ptfhadoop01v spark-jobserver]$ sbt ashesh-jobs/package
I hope someone can help me resolve this issue. Can you tell me in which file I should specify the dependency, and in what format?
The following link has more information on creating other contexts: https://github.com/spark-jobserver/spark-jobserver/blob/master/doc/contexts.md
You also need job-server-extras.
Add the library dependency in build.sbt:
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.6.2"
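To make that concrete, here is a minimal sketch of the job once spark-sql and job-server-extras are on the classpath. It assumes the job-server-extras SparkSqlJob trait from the 0.6.x API (which hands an SQLContext to runJob and validate) and reuses the question's HDFS path and column name; treat the trait and imports as assumptions to verify against your job server version rather than a guaranteed interface.

import com.typesafe.config.Config
import org.apache.spark.sql.SQLContext
import spark.jobserver.{SparkJobValid, SparkJobValidation, SparkSqlJob}

object SparkSqlCsvJob extends SparkSqlJob {

  // The job server supplies the SQLContext, so the job no longer builds its own SparkContext.
  override def validate(sql: SQLContext, config: Config): SparkJobValidation = SparkJobValid

  override def runJob(sql: SQLContext, config: Config): Any = {
    // Load the CSV through spark-csv (Spark 1.6.x DataFrame API).
    val controlDF = sql.read
      .format("com.databricks.spark.csv")
      .option("header", "true")
      .load("hdfs://mycluster/user/Test.csv")
    controlDF.registerTempTable("Control")
    // Return both the max and the min of the column the question asks about.
    sql.sql("select max(DieX), min(DieX) from Control").collect()
  }
}

You will also need the spark-csv artifact on the compile classpath if the job server build does not already provide it, for example libraryDependencies += "com.databricks" %% "spark-csv" % "1.5.0".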

getting error 'assetmanager.process is not a function' while trying to start node server

I am trying to run the MEAN stack. When I try to start the Node server by running the node server command, I get:
TypeError: assetmanager.process is not a function
at configureApp (C:\Users\Harsh\first\node_modules\meanio\lib\core_modules\module\aggregation.js:401:29)
at Consumer.Dependable.runAction (C:\Users\Harsh\first\node_modules\lazy-dependable\index.js:73:22)
at Consumer.Dependable.fire (C:\Users\Harsh\first\node_modules\lazy-dependable\index.js:70:53)
at Consumer.onResolved (C:\Users\Harsh\first\node_modules\lazy-dependable\index.js:120:8)
at Consumer.Dependable.resolve (C:\Users\Harsh\first\node_modules\lazy-dependable\index.js:56:10)
at Meanio.Container.notifyResolved (C:\Users\Harsh\first\node_modules\lazy-dependable\index.js:209:7)
at Dependency.onResolved (C:\Users\Harsh\first\node_modules\lazy-dependable\index.js:105:18)
at Dependency.Dependable.resolve (C:\Users\Harsh\first\node_modules\lazy-dependable\index.js:56:10)
at Meanio.Container.register (C:\Users\Harsh\first\node_modules\lazy-dependable\index.js:167:5)
at C:\Users\Harsh\first\node_modules\meanio\lib\core_modules\db\index.js:101:20
at open (C:\Users\Harsh\first\node_modules\mongoose\lib\connection.js:488:17)
at NativeConnection.Connection.onOpen (C:\Users\Harsh\first\node_modules\mongoose\lib\connection.js:498:5)
at C:\Users\Harsh\first\node_modules\mongoose\lib\connection.js:457:10
at C:\Users\Harsh\first\node_modules\mongoose\lib\drivers\node-mongodb-native\connection.js:60:5
at C:\Users\Harsh\first\node_modules\mongodb\lib\db.js:229:5
at connectHandler (C:\Users\Harsh\first\node_modules\mongodb\lib\server.js:279:7)
at g (events.js:260:16)
at emitOne (events.js:77:13)
at emit (events.js:169:7)
at C:\Users\Harsh\first\node_modules\mongodb-core\lib\topologies\server.js:409:23
at C:\Users\Harsh\first\node_modules\mongodb-core\lib\topologies\server.js:778:13
at Callbacks.emit (C:\Users\Harsh\first\node_modules\mongodb-core\lib\topologies\server.js:95:3)
C:\Users\Harsh\first\node_modules\mongodb\lib\server.js:282
process.nextTick(function() { throw err; })
^
TypeError: assetmanager.process is not a function
at configureApp (C:\Users\Harsh\first\node_modules\meanio\lib\core_modules\module\aggregation.js:401:29)
at Consumer.Dependable.runAction (C:\Users\Harsh\first\node_modules\lazy-dependable\index.js:73:22)
at Consumer.Dependable.fire (C:\Users\Harsh\first\node_modules\lazy-dependable\index.js:70:53)
at Consumer.onResolved (C:\Users\Harsh\first\node_modules\lazy-dependable\index.js:120:8)
at Consumer.Dependable.resolve (C:\Users\Harsh\first\node_modules\lazy-dependable\index.js:56:10)
at Meanio.Container.notifyResolved (C:\Users\Harsh\first\node_modules\lazy-dependable\index.js:209:7)
at Dependency.onResolved (C:\Users\Harsh\first\node_modules\lazy-dependable\index.js:105:18)
at Dependency.Dependable.resolve (C:\Users\Harsh\first\node_modules\lazy-dependable\index.js:56:10)
at Meanio.Container.register (C:\Users\Harsh\first\node_modules\lazy-dependable\index.js:167:5)
at C:\Users\Harsh\first\node_modules\meanio\lib\core_modules\db\index.js:101:20
at open (C:\Users\Harsh\first\node_modules\mongoose\lib\connection.js:488:17)
at NativeConnection.Connection.onOpen (C:\Users\Harsh\first\node_modules\mongoose\lib\connection.js:498:5)
at C:\Users\Harsh\first\node_modules\mongoose\lib\connection.js:457:10
at C:\Users\Harsh\first\node_modules\mongoose\lib\drivers\node-mongodb-native\connection.js:60:5
at C:\Users\Harsh\first\node_modules\mongodb\lib\db.js:229:5
at connectHandler (C:\Users\Harsh\first\node_modules\mongodb\lib\server.js:279:7)
at g (events.js:260:16)
at emitOne (events.js:77:13)
at emit (events.js:169:7)
at C:\Users\Harsh\first\node_modules\mongodb-core\lib\topologies\server.js:409:23
at C:\Users\Harsh\first\node_modules\mongodb-core\lib\topologies\server.js:778:13
at Callbacks.emit (C:\Users\Harsh\first\node_modules\mongodb-core\lib\topologies\server.js:95:3)
I also used the gulp command, but I still get the same error. Where am I going wrong?
I had a similar error when there was an issue with the git clone. Re-clone the repository and then run the following in your console:
git clone https://github.com/meanjs/mean.git myprojectname
npm install
bower install
The install processes will take a little while, but running gulp after this should work.
