Getting error while creating workflow in KNIME for text analytics

I have a set of URLs from which I have to read data and execute a particular workflow in KNIME to determine word frequency. However, I am getting the error "No column with DocumentCells found!". I have attached a reference image. Can someone please help me with this?
I am also getting the following error from the HttpRetriever node:
WARN HttpRetriever (deprecated) 0:2 Error retrieving https://www.bosch-do-it.com/gb/en/diy/knowledge/project-guides/valentine-s-day-601921.jsp: Exception java.net.UnknownHostException: www.bosch-do-it.com for URL "https://www.bosch-do-it.com/gb/en/diy/knowledge/project-guides/valentine-s-day-601921.jsp": www.bosch-do-it.com

You need the "Strings to Document" node to use the "POS tagger" node.
The "POS tagger" node needs a DocumentCell to work, and the "Strings to Document" node does that job.
Updated Workflow
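The HttpRetriever warning is a separate problem: java.net.UnknownHostException means the host name could not be resolved via DNS, so www.bosch-do-it.com is unreachable from your machine (or no longer exists). You can verify this outside KNIME with a small standalone Java check (a hypothetical snippet, not part of the workflow):

import java.net.InetAddress;
import java.net.UnknownHostException;

public class DnsCheck {
    public static void main(String[] args) {
        try {
            // The same lookup HttpRetriever must perform before fetching the page
            InetAddress address = InetAddress.getByName("www.bosch-do-it.com");
            System.out.println("Resolved: " + address.getHostAddress());
        } catch (UnknownHostException e) {
            // This is the failure reported in the KNIME console
            System.out.println("DNS lookup failed: " + e.getMessage());
        }
    }
}

If the lookup fails here too, the URL simply cannot be fetched and should be removed from the input set.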

Related

Can't Drop Azure SQL Table due to severe error, what is the cause?

I am trying to drop a table before reading in a new set of values for testing purposes. When I run the command
DROP TABLE [dbo].[Table1]
I get the following error after about 3-5 minutes. It is a large table (~50 million rows).
Failed to execute query. Error: A severe error occurred on the current command. The results, if any, should be discarded.
Operation cancelled by user.
This error can occur for many different reasons, and the message itself does not show the exact cause. To narrow it down, you can check the following things:
It might be an indexing issue; to rule that out, check the consistency of the database first:
DBCC CHECKDB('database_name');
If you have it nailed down to a table, check the consistency of that table:
DBCC CHECKTABLE('table_name');
Search the LOG folder (the one that contains ERRORLOG) for any files named SQLDump* created around the time the problem was reported. You can also view the logs in SSMS:
Object Explorer >> Management >> SQL Server Logs >> View the current log
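If you prefer to run the consistency checks from code instead of SSMS, here is a minimal JDBC sketch (assuming the Microsoft JDBC driver; the connection string and table name are placeholders):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLWarning;
import java.sql.Statement;

public class ConsistencyCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details; adjust server, database and credentials
        String url = "jdbc:sqlserver://yourserver.database.windows.net;databaseName=yourdb;user=you;password=secret";
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement()) {
            stmt.execute("DBCC CHECKTABLE('Table1')");
            // DBCC reports its findings as informational messages (SQLWarnings)
            for (SQLWarning w = stmt.getWarnings(); w != null; w = w.getNextWarning()) {
                System.out.println(w.getMessage());
            }
        }
    }
}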

Error in submitting the es-injector.flux topology

I have set up the StormCrawler project using this Medium story https://medium.com/analytics-vidhya/web-scraping-and-indexing-with-stormcrawler-and-elasticsearch-a105cb9c02ca, but when I tried to submit es-injector.flux, I received this error:
Exception in thread "main" java.lang.IllegalArgumentException: Couldn't find a suitable constructor for class 'com.digitalpebble.stormcrawler.util.StringTabScheme' with arguments '[DISCOVERED]'.
at org.apache.storm.flux.FluxBuilder.buildObject(FluxBuilder.java:358)
at org.apache.storm.flux.FluxBuilder.buildComponents(FluxBuilder.java:421)
at org.apache.storm.flux.FluxBuilder.buildTopology(FluxBuilder.java:101)
at org.apache.storm.flux.Flux.runCli(Flux.java:158)
at org.apache.storm.flux.Flux.main(Flux.java:103)
The command that I run is:
storm jar target/project-1.0-SNAPSHOT.jar org.apache.storm.flux.Flux --local es-injector.flux
Can someone please tell me what it means and how I can get rid of this error?
The latest ES tutorial is probably a better starting point; I'd recommend that you use it instead.
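The error itself means that Flux could not find a constructor of StringTabScheme that accepts the arguments declared in es-injector.flux; this typically happens when the flux file comes from a different StormCrawler version than the jar on your classpath. Flux instantiates the classes named in the YAML reflectively, along these lines (an illustration only, not the actual FluxBuilder source):

import java.lang.reflect.Constructor;

public class FluxConstructorCheck {
    public static void main(String[] args) throws Exception {
        // List the constructors the class actually offers; if none matches the
        // argument types declared in the .flux file, Flux fails with the
        // IllegalArgumentException shown above.
        Class<?> clazz = Class.forName("com.digitalpebble.stormcrawler.util.StringTabScheme");
        for (Constructor<?> c : clazz.getConstructors()) {
            System.out.println(c);
        }
    }
}

Run this with your project jar on the classpath to see which constructor signatures your StormCrawler version actually provides.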

Push into existing local table failure (Windows): InvalidRegionNumberException then IllegalArgumentException

I want to push data into an already existing table with a single column family and no records.
I am using shc-core:1.1.1-2.1-s_2.11 on a Windows machine. I have HBase 1.2.6 installed and use Scala 2.11.8.
When I try to push data, I first get the following error: org.apache.spark.sql.execution.datasources.hbase.InvalidRegionNumberException: Number of regions specified for new table must be greater than 3.
After following the advice in this link https://github.com/hortonworks-spark/shc/issues/249#issue-318285217, I added HBaseTableCatalog.newTable -> "5" to my options.
It still failed, but with: java.lang.IllegalArgumentException: Can not create a Path from a null string.
Following this link https://github.com/hortonworks-spark/shc/issues/151#issuecomment-313800739, I added "tableCoder":"PrimitiveType" to my catalog.
Still facing the same error.
I saw that people are asking for clarification about this issue (https://github.com/hortonworks-spark/shc/issues/249#issuecomment-463528032).
It is a known issue and apparently it has been fixed (https://github.com/hortonworks-spark/shc/issues/155#issuecomment-315236736).
I do not know what to do next.
Is there a solution for this?
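For reference, the write described above ends up looking roughly like this (a sketch in Java rather than Scala; df and catalogJson are placeholders, and the option keys "catalog" and "newtable" are the string values behind HBaseTableCatalog.tableCatalog and HBaseTableCatalog.newTable):

import java.util.HashMap;
import java.util.Map;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;

public class HBaseWriteSketch {
    static void write(Dataset<Row> df, String catalogJson) {
        Map<String, String> options = new HashMap<>();
        options.put("catalog", catalogJson); // catalog JSON, including "tableCoder":"PrimitiveType"
        options.put("newtable", "5");        // HBaseTableCatalog.newTable -> "5" (must be greater than 3)
        df.write()
          .options(options)
          .format("org.apache.spark.sql.execution.datasources.hbase")
          .save();
    }
}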

Unable to use uploaded .csv data in JMeter, getting error message

I am learning JMeter, using id=2172797 and appid=b6907d289e10d714a6e88b30761fae22, and now I am unable to read data from the uploaded .csv file. I am getting an error message like: Response code: Non HTTP response code: java.net.URISyntaxException.
I have added a screenshot for your reference.
Please help me.
Thanks
Double-check your ${id} and ${appid} variable values using a Debug Sampler and View Results Tree listener combination. The error you're getting most probably indicates that the variables are not resolved into their values, and the leftover ${...} references contain characters that are illegal in the HTTP Request URL path.
Check out the How to Debug your Apache JMeter Script article to learn more about troubleshooting JMeter tests.
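To see why an unresolved variable breaks the request, here is a small standalone illustration (the URL is made up): java.net.URI rejects the ${...} left in the string, which is the URISyntaxException that JMeter wraps into the "Non HTTP response code" message.

import java.net.URI;
import java.net.URISyntaxException;

public class UnresolvedVariableDemo {
    public static void main(String[] args) {
        // Hypothetical URL with JMeter variables left unresolved
        String url = "http://example.com/data?id=${id}&appid=${appid}";
        try {
            new URI(url);
        } catch (URISyntaxException e) {
            System.out.println(e.getMessage()); // e.g. "Illegal character in query ..."
        }
    }
}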

Error running Cassandra word count example

I am trying to run the Cassandra word count example in Eclipse. I have loaded all the requisite jar files, but I am still getting some errors in file CassandraDemonThread.java:
TNonblockingServer.Args serverArgs = new TNonblockingServer.Args(serverTransport).inputTransportFactory(inTransportFactory)
.outputTransportFactory(outTransportFactory)
.inputProtocolFactory(tProtocolFactory)
.outputProtocolFactory(tProtocolFactory)
.processor(processor);
It throws the compilation error: TNonblockingServer.Args cannot be resolved to a type
Can somebody tell me if I am missing any file that needs to be linked?
Thanks for the help.
Sounds like you don't have lib/*.jar on your classpath or, less likely, you have an old Thrift jar somewhere else that's getting used instead of the right one.
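If you want to verify which Thrift jar is actually being picked up, a small check like this can help (assuming libthrift is on the classpath; the Args inner class only exists in the newer Thrift versions that use Args-based server configuration, so an old jar will make this fail to compile just like your example):

import org.apache.thrift.server.TNonblockingServer;

public class ThriftCheck {
    public static void main(String[] args) {
        // Compiles only if the libthrift on the classpath has TNonblockingServer.Args
        System.out.println(TNonblockingServer.Args.class.getName());
        // Prints the jar the class was loaded from
        System.out.println(TNonblockingServer.class
                .getProtectionDomain().getCodeSource().getLocation());
    }
}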
