I am trying to crawl web data using Nutch 2.3 on Linux Mint 17.2, but I get the following error message:
“Failed with the following error: java.net.UnknownHostException:”
I'd like to know what causes this error and how to resolve it. My observation is that it sometimes comes with another error message, "Couldn't get robots.txt." How are these two errors related, and how can I solve this? Thanks.
Usually this happens when Nutch can't resolve a given URL. Does it happen with specific URLs, or when you run specific Nutch commands?
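A quick way to narrow this down is to check, outside of Nutch, whether the hosts in your seed URLs resolve at all. A minimal sketch (the `check_host` helper is an illustration, not part of Nutch):

```shell
# Check that a host name resolves on this machine; this is the same
# lookup that fails inside Nutch with java.net.UnknownHostException.
check_host() {
  if getent hosts "$1" > /dev/null; then
    echo "OK: $1"
  else
    echo "FAIL: $1 does not resolve" >&2
    return 1
  fi
}

check_host localhost
```

If a host fails here, the problem is DNS on the crawling machine (e.g. /etc/resolv.conf or a required proxy), not Nutch itself; if every host resolves, look at which Nutch command triggers the error.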
I started using Termux on my cellphone, and while looking for information I found that I can install Hydra on it.
However, when I use the command
pkg install hydra
all I get is the error message you can see in the title. I've tried updating and upgrading the system several times, but I'm still getting the same error.
So, I wonder... is there a chance this library has another name? (I mention this because I've seen other comments with the same error in other posts.)
So please, can you help me solve this?
Additionally, I've uploaded two pictures: the first shows the error message I get; the second shows some information that I think might help (when I search for the 'hydra' command, it seems not to be found).
Thanks in advance.
As you can see here, the Hydra package has been removed from the Termux repositories and is no longer installable.
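If you still want Hydra on Termux, one commonly suggested workaround is building it from the upstream source. The following is only a sketch; the repository URL and the dependency package names are assumptions you should verify before relying on them:

```shell
# Build Hydra from source on Termux (sketch; package names and the
# repository URL are assumptions to check, not Termux documentation).
pkg install git clang make openssl
git clone https://github.com/vanhauser-thc/thc-hydra
cd thc-hydra
./configure
make
make install
```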
Before the question, I want to point out that the only thing I could find on this issue was this Stack Overflow question. It suggests the problem was with Wappalyzer and was fixed in version 4.0.1; however, I am using Wappalyzer version 5.1.4, which is up to date.
I am building a web app based on the MEAN stack. Everything worked as intended for a long time, until this error kept popping up in my Google Chrome console:
Every time I click in my app header and use my front-end routing to load different components/modules, this error appears. However, I don't see any issue with what the web app presents to me (it's not like I am missing data).
More details on the error:
I have no idea what's going on or where this issue comes from.
This was due to a failing plugin.
Disable all extensions, then enable them one at a time to find the failing Chrome extension.
In this case it was the Wappalyzer extension.
I am running an XPages application on Domino Server 8.5.3 FP1, Windows 64-bit.
We are sending bills (documents) to around 2,500 users. During testing, the application worked fine without any errors. When we rolled out the bills, all users tried to open the application at once.
It then throws the following error, but the application still works. At some point the application gets slow and HTTP hangs; I have to restart HTTP, and then it works fine again.
I am not able to tell whether the cause is an error in the code or the increased number of users accessing the application.
06/04/2014 10:58:53 AM HTTP JVM: CLFAD0211E: Exception thrown.
For more detailed information, please consult error-log-0.xml located in D:/Lotus/Domino/data/domino/workspace/logs
And sometimes it also throws:
HTTP JVM: CLFAD0141E: Error processing XPage request. For more detailed information,
please consult error-log-0.xml located in D:/IBM/Lotus/Domino/data/domino/workspace/logs
Please help with this issue.
You should start with what Paul suggests. If you haven't already downloaded and installed LogReader (by Jakob Majkilde), then you should. It is a database on your server that reads the various error file types and presents them in an easy way. You can find it here: http://www.openntf.org/internal/home.nsf/project.xsp?action=openDocument&name=XPages%20Log%20File%20Reader
From these log files you will then have to look at where the system complains about problems. Did you write the code in Java or server-side JavaScript (SSJS)? You have options to debug both (although I cannot remember whether you can debug SSJS in version 8.5.3; it may not be available until version 9.0). But you could always add a println in the code near where you think it breaks ;-)
/John
At some point the application gets slow and HTTP hangs. I have to restart HTTP, and then it works fine.
As mentioned by Paul and John, no one will be able to give you an exact answer from the posted message.
Since you mention the application gets slow and hangs, I would also recommend watching the XPages Masterclass.
It is approximately 4 hours of video and goes into detail on how to debug your application for performance issues using the XPages Toolbox.
Again, this isn't an exact solution. You will need to use the XPages Toolbox to drill down into your code/JVM to see where it is slowing down or hanging.
Posting the stack traces from the error-log-0.xml log file might give some hints, but with performance issues and hangs it's rarely that simple.
The application is now working without issues. I just added exception handling in all SSJS.
It is no longer throwing errors.
Thanks for all your help and time.
When I installed a new instance of Tomcat 7 with a default JSF program, the error above wasn't shown. When I made basic changes to the program through NetBeans 7.4, I got this error:
java.lang.NoClassDefFoundError: com/sun/enterprise/InjectionException.
My tests showed me that this error is related to the code, but I expected a better error message. I'm running Tomcat 7.
I tried to work around the situation using GlassFish 4, but I got stuck on another error: SEC5054: Certificate has expired.
I'm running on 32-bit Windows and thinking of switching to 64-bit Debian; maybe that is related to these issues.
What do you think?
You might have different Java versions (1.6/1.7) in NetBeans and GlassFish.
I got it. For the com/sun/enterprise/InjectionException issue, I changed the encoding from UTF-8 to ISO-8859-1 in just one file (index.xhtml), but there were two others, web.xml and projectname.xml, that I had to change as well.
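For reference, the same substitution can be applied with sed. A small demo of the fix on a sample file (in the real project you would run the substitution over index.xhtml, web.xml, and projectname.xml, which are assumed to declare encoding="UTF-8" in their XML prolog):

```shell
# Demo: switch the declared encoding from UTF-8 to ISO-8859-1
# in an XML prolog, as described in the answer above.
printf '<?xml version="1.0" encoding="UTF-8"?>\n' > sample.xml
sed -i 's/encoding="UTF-8"/encoding="ISO-8859-1"/' sample.xml
cat sample.xml
```

Note that the declared encoding must match the bytes actually saved on disk, so re-save the files in the new encoding as well.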
Now I'll try to solve the other issue: SEC5054: Certificate has expired.
I am quite new to Nutch. The thing is, I have crawled a site successfully using Nutch 1.2. Now, using Cygwin, I am working on the crawldb and segments. The problem is that when I use the webgraphdb command, it shows "Error: Could not find or load main class WebGraph". Please suggest what I need to do to use this command properly.
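That error usually means the job was launched without the Nutch class path, e.g. by passing "WebGraph" straight to java. The job should go through the bin/nutch wrapper instead. A sketch of the usual invocations (the crawl paths are placeholders, and option names can vary between 1.x releases, so check the usage output your build prints):

```shell
# Run the WebGraph job via the Nutch launcher; the wrapper script
# sets up the class path that direct java invocation is missing.
bin/nutch webgraph -segmentDir crawl/segments -webgraphdb crawl/webgraphdb

# If your build lacks the short command name, try the fully qualified
# class (assumption: the class lives in org.apache.nutch.scoring.webgraph):
bin/nutch org.apache.nutch.scoring.webgraph.WebGraph \
  -segmentDir crawl/segments -webgraphdb crawl/webgraphdb
```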