I am doing this assignment and got stuck at the very first step: I can't even read the given text file. Help me, guys. Please check the attachment and see where I am going wrong.
Exception in thread "main" java.io.FileNotFoundException: popData.txt (The system cannot find the file specified)
at java.io.FileInputStream.open0(Native Method)
at java.io.FileInputStream.open(Unknown Source)
at java.io.FileInputStream.<init>(Unknown Source)
at java.io.FileInputStream.<init>(Unknown Source)
at dd.main(dd.java:13)
Try using the standard Java classes (java.io.BufferedReader, java.io.InputStreamReader, java.io.FileInputStream):
// fileName is the path to popData.txt; a relative path is resolved against the JVM's working directory
BufferedReader br = new BufferedReader(
        new InputStreamReader(new FileInputStream(fileName)));
try {
    String line;
    while ((line = br.readLine()) != null) {
        // process line
    }
} finally {
    br.close();
}
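The FileNotFoundException itself usually just means that the relative name popData.txt does not resolve to a file in the directory the JVM was started from. As a quick check (a hedged sketch; the only assumption is the file name taken from your stack trace), you can print where Java is actually looking:

import java.io.File;

public class WhereIsMyFile {
    public static void main(String[] args) {
        File f = new File("popData.txt");
        // The absolute path a relative name resolves to, i.e. the JVM's working directory.
        System.out.println(f.getAbsolutePath());
        System.out.println("exists? " + f.exists());
    }
}

If "exists?" prints false, either put popData.txt next to the printed path or pass an absolute path to the FileInputStream.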
I have set up a single Cassandra node on a VM. I have to create a table with 70000 columns; for this I have written Java code that reads a JSON file and creates the table.
Here is my Java code snippet. When I run it, it throws an exception after creating some columns; the exception stack trace is shown after the code.
public void createTable(String keyspaceName, String tableName) throws FileNotFoundException {
    JSONParser jsonParser = new JSONParser();
    FileReader fileReader;
    String filePath = "";
    String columnHeader = "";
    //String completeColumnHeader = "";
    try {
        System.out.println("Inside Create Table");
        session.executeAsync("DROP TABLE IF EXISTS " + keyspaceName + "." + tableName + ";");
        String createQuery = "CREATE TABLE " + keyspaceName + "." + tableName + "(\"P:LanguageID\" text, "
                + "\"P:PdmarticleID\" text, PRIMARY KEY(\"P:PdmarticleID\",\"P:LanguageID\"));";
        session.execute(createQuery);
        System.out.println("Table created");
        filePath = "CassandraTableColumnHeader/FixColumnHeader.json";
        fileReader = new FileReader(filePath);
        JSONObject jsonObject = (JSONObject) jsonParser.parse(fileReader);
        JSONArray jsonArray = (JSONArray) jsonObject.get("columnHeaderName");
        int columnHeaderSize = jsonArray.size();
        int columnHeaderBatchSize = 1000;
        int fromIndex = 0;
        int toIndex = columnHeaderBatchSize;
        // Add the column headers in batches of 1000 ALTER TABLE statements.
        while (columnHeaderSize > 0) {
            columnHeaderSize -= columnHeaderBatchSize;
            for (int i = fromIndex; i < toIndex; i++) {
                columnHeader = (String) jsonArray.get(i);
                // Skip the two columns that are already part of the primary key.
                if (columnHeader.equals("P:PdmarticleID") || columnHeader.equals("P:LanguageID")) {
                    continue;
                }
                session.execute("ALTER TABLE " + keyspaceName + "." + tableName + " ADD " + "\"" + columnHeader + "\"" + " text;");
            }
            fromIndex = toIndex;
            if (columnHeaderSize < columnHeaderBatchSize) {
                toIndex += columnHeaderSize;
            } else {
                toIndex = toIndex + columnHeaderBatchSize;
            }
        }
    } catch (FileNotFoundException fnfe) {
        throw fnfe;
    } catch (ParseException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Exception in thread "main" com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (tried: /127.0.0.1:9042 (com.datastax.driver.core.exceptions.DriverException: Host replied with server error: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.io.FileNotFoundException: C:\apache-cassandra-new\data\data\system\schema_columnfamilies-45f5b36024bc3f83a3631034ea4fa697\system-schema_columnfamilies-tmplink-ka-4839-Data.db (The process cannot access the file because it is being used by another process)))
at com.datastax.driver.core.exceptions.NoHostAvailableException.copy(NoHostAvailableException.java:84)
at com.datastax.driver.core.DefaultResultSetFuture.extractCauseFromExecutionException(DefaultResultSetFuture.java:265)
at com.datastax.driver.core.DefaultResultSetFuture.getUninterruptibly(DefaultResultSetFuture.java:179)
at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:52)
at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:36)
at com.exportstagging.SparkTest.DataLoaderInCassandra.createTable(DataLoaderInCassandra.java:89)
at com.exportstagging.SparkTest.DataLoaderInCassandra.main(DataLoaderInCassandra.java:216)
Caused by: com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (tried: /127.0.0.1:9042 (com.datastax.driver.core.exceptions.DriverException: Host replied with server error: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.io.FileNotFoundException: C:\apache-cassandra-new\data\data\system\schema_columnfamilies-45f5b36024bc3f83a3631034ea4fa697\system-schema_columnfamilies-tmplink-ka-4839-Data.db (The process cannot access the file because it is being used by another process)))
at com.datastax.driver.core.RequestHandler.reportNoMoreHosts(RequestHandler.java:216)
at com.datastax.driver.core.RequestHandler.access$900(RequestHandler.java:45)
at com.datastax.driver.core.RequestHandler$SpeculativeExecution.sendRequest(RequestHandler.java:276)
at com.datastax.driver.core.RequestHandler$SpeculativeExecution$1.run(RequestHandler.java:374)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
I am stuck here. Please help me. Thanks in advance.
If I were you I might re-evaluate creating a table with 70k column headers. Your partition key P:PdmarticleID and full primary key (P:PdmarticleID, P:LanguageID) are the only two pieces of information you will be able to use to get results anyway, so having the other pieces of information explicitly stored in separate columns is not buying you anything.
A collection (e.g. a map) can hold up to 64K items, with certain other limitations (see http://wiki.apache.org/cassandra/CassandraLimitations). Is there a way you can split the columns so that you can create multiple tables, with some pieces of information stored in one table and some in another?
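As a rough sketch of the collection idea (the keyspace, table and column names here are made up, not taken from your schema), the extra attributes could live in a single map<text, text> column instead of tens of thousands of physical columns:

// Sketch only: 'session' is an open com.datastax.driver.core.Session;
// keyspace/table/column names are examples.
session.execute(
    "CREATE TABLE IF NOT EXISTS mykeyspace.article ("
  + "\"P:PdmarticleID\" text, "
  + "\"P:LanguageID\" text, "
  + "attributes map<text, text>, "
  + "PRIMARY KEY (\"P:PdmarticleID\", \"P:LanguageID\"));");

// Individual attributes can then be written without any ALTER TABLE statements:
session.execute(
    "UPDATE mykeyspace.article SET attributes['P:SomeHeader'] = 'some value' "
  + "WHERE \"P:PdmarticleID\" = '123' AND \"P:LanguageID\" = 'en';");

Keep in mind the collection size limits mentioned above, so a very large attribute set may still need to be split across more than one table.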
In Linux a socket is also a file, so if too many sockets exist at the same time and the count exceeds the maximum number of open files, it will throw the exception below:
java.net.SocketException: Too many open files
at sun.nio.ch.Net.socket0(Native Method)
at sun.nio.ch.Net.socket(Net.java:423)
at sun.nio.ch.Net.socket(Net.java:416)
at sun.nio.ch.SocketChannelImpl.<init>(SocketChannelImpl.java:104)
at sun.nio.ch.SelectorProviderImpl.openSocketChannel(SelectorProviderImpl.java:60)
at java.nio.channels.SocketChannel.open(SocketChannel.java:142)
Up to this point I can follow, but I'm a little confused by the following phenomenon.
I executed the command below in a terminal to find the maximum number of open files:
$ ulimit -n
1024
But I actually created about 4091 sockets (SocketChannels) with the code below:
while (true) {
    new Thread(new Runnable() {
        public void run() {
            //...
            try {
                SocketChannel scChannel = SocketChannel.open();
                scChannel.connect(new InetSocketAddress(hostname, port));
                ByteBuffer buffer = ByteBuffer.allocate(1024);
                scChannel.read(buffer);
            } catch (IOException e) {
                //...
            }
        }
    }).start();
}
From the console output I could see that it created 4091 SocketChannels and then threw the above exception:
Start client 4091
java.net.SocketException: Too many open files
4091 is more than 1024, so why is that? Is the result from ulimit not the real maximum number of open files?
Following n.m.'s advice, I checked whether the JVM changes the ulimit, and found that the JVM indeed does something. I executed the ulimit command from Java code as follows:
ProcessBuilder pBuilder = new ProcessBuilder("sh", "-c", "ulimit -n");
Process p = pBuilder.start();
BufferedReader reader = new BufferedReader(new InputStreamReader(p.getInputStream()));
String line = reader.readLine();
while (line != null) {
    System.out.println(line);
    line = reader.readLine();
}
The output is 4096, which is different from the terminal output.
But I don't know how, when, or where the JVM changed the ulimit value.
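One plausible explanation (an assumption on my part, not something verified against your JVM) is that HotSpot raises the soft file-descriptor limit towards the hard limit at startup (the -XX:+MaxFDLimit behaviour), which would explain seeing 4096 inside the JVM while the shell reports 1024. Here is a small sketch that compares the soft and hard limits from inside a Java process, assuming a Unix shell is available:

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class FdLimitCheck {
    // Helper for this sketch: run a shell command and print its first line of output.
    static void printFirstLine(String command) throws Exception {
        Process p = new ProcessBuilder("sh", "-c", command).start();
        try (BufferedReader reader =
                 new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            System.out.println(command + " -> " + reader.readLine());
        }
    }

    public static void main(String[] args) throws Exception {
        printFirstLine("ulimit -Sn"); // soft limit, as seen from inside this JVM
        printFirstLine("ulimit -Hn"); // hard limit
    }
}

If the soft limit printed here matches the hard limit, that would line up with the 4096 you observed.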
I have a method in Java for unmarshalling XML files from a given URL.
For URLs like "http:// ..." everything works fine, but for URLs like "file://localhost/C:/Users/.../filename.xml" I receive the following exception.
I have no idea why it won't accept my "file://localhost/" URLs.
javax.xml.bind.UnmarshalException
- with linked exception:
[org.xml.sax.SAXParseException; lineNumber: 1; columnNumber: 1; Content is not allowed in prolog.]
at javax.xml.bind.helpers.AbstractUnmarshallerImpl.createUnmarshalException(AbstractUnmarshallerImpl.java:335)
at com.sun.xml.internal.bind.v2.runtime.unmarshaller.UnmarshallerImpl.createUnmarshalException(UnmarshallerImpl.java:563)
at com.sun.xml.internal.bind.v2.runtime.unmarshaller.UnmarshallerImpl.unmarshal0(UnmarshallerImpl.java:249)
at com.sun.xml.internal.bind.v2.runtime.unmarshaller.UnmarshallerImpl.unmarshal(UnmarshallerImpl.java:214)
at javax.xml.bind.helpers.AbstractUnmarshallerImpl.unmarshal(AbstractUnmarshallerImpl.java:157)
at javax.xml.bind.helpers.AbstractUnmarshallerImpl.unmarshal(AbstractUnmarshallerImpl.java:204)
at preferee.data.access.IO_transfer.jaxb.XMLconverter.getItemFromStream(XMLconverter.java:40)
at preferee.data.access.IO_transfer.jaxb.XMLconverter.getItemFromURL(XMLconverter.java:57)
at preferee.data.access.testServer.LocalTestServer.<init>(LocalTestServer.java:42)
at preferee.data.access.testServer.TestProvider.<init>(TestProvider.java:16)
at preferee.data.access.Providers.createTestProvider(Providers.java:29)
at preferee.tests.FakeServerTests.MovieDao_TEST.run(MovieDao_TEST.java:22)
at preferee.tests.FakeServerTests.MovieDao_TEST.main(MovieDao_TEST.java:16)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
Caused by: org.xml.sax.SAXParseException; lineNumber: 1; columnNumber: 1; Content is not allowed in prolog.
at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.createSAXParseException(ErrorHandlerWrapper.java:203)
at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.fatalError(ErrorHandlerWrapper.java:177)
at com.sun.org.apache.xerces.internal.impl.XMLErrorReporter.reportError(XMLErrorReporter.java:441)
at com.sun.org.apache.xerces.internal.impl.XMLErrorReporter.reportError(XMLErrorReporter.java:368)
at com.sun.org.apache.xerces.internal.impl.XMLScanner.reportFatalError(XMLScanner.java:1436)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl$PrologDriver.next(XMLDocumentScannerImpl.java:999)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl.next(XMLDocumentScannerImpl.java:606)
at com.sun.org.apache.xerces.internal.impl.XMLNSDocumentScannerImpl.next(XMLNSDocumentScannerImpl.java:117)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanDocument(XMLDocumentFragmentScannerImpl.java:510)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:848)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:777)
at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(XMLParser.java:141)
at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.parse(AbstractSAXParser.java:1213)
at com.sun.org.apache.xerces.internal.jaxp.SAXParserImpl$JAXPSAXParser.parse(SAXParserImpl.java:649)
at com.sun.xml.internal.bind.v2.runtime.unmarshaller.UnmarshallerImpl.unmarshal0(UnmarshallerImpl.java:243)
By the way, this is my method implementation:
Class classObject = ... ;

public T getItemFromURL(String url) throws DataAccessException {
    JAXBContext jc = null;
    T item = null;
    try (InputStream XML_Stream = new URL(url).openStream()) {
        jc = JAXBContext.newInstance(classObject);
        item = (T) jc.createUnmarshaller().unmarshal(XML_Stream);
    } catch (IOException e) {
        throw new DataAccessException("(original error: " + e.getClass() + ") " + e.getMessage() + ": could not retrieve or read the file.");
    } catch (JAXBException e) {
        throw new DataAccessException(e.getMessage());
    }
    return item;
}
Your URL for hitting your file system is not correct. It should look like:
file:///c|/path/to/file
Update
Will this "file:///" work on other systems like mac, linux?
You can use a file URL on any OS. Of course the URL needs to match the file layout there (i.e. no C drive in Linux).
And is there any way to convert c:\ ... to c|/ ... / easily?
File file = new File("C:/Users/.../filename.xml");
String url = file.toURI().toURL().toString();
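As a small, self-contained sketch of that conversion (the path and class name below are made-up examples, not the asker's actual files):

import java.io.File;
import java.net.MalformedURLException;

public class FileUrlExample {
    // Hypothetical helper: turn a local path into a file: URL string.
    static String toFileUrl(String path) throws MalformedURLException {
        return new File(path).toURI().toURL().toString();
    }

    public static void main(String[] args) throws Exception {
        // Example path only; substitute your own.
        System.out.println(toFileUrl("C:/Users/example/filename.xml"));
        // On Windows this prints something like: file:/C:/Users/example/filename.xml
    }
}

A URL of that form can be passed straight to new URL(url).openStream() in the getItemFromURL method above.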
I'm trying to write the contents of an InputStream to a file and I'm getting the following stack trace:
org.springframework.amqp.rabbit.listener.ListenerExecutionFailedException: Listener method 'handleMessage' threw exception
at java.lang.Thread.run(Thread.java:722)
Caused by: java.lang.reflect.UndeclaredThrowableException
... 1 more
Caused by: java.io.EOFException: Unexpected end of ZLIB input stream
at java.util.zip.InflaterInputStream.fill(InflaterInputStream.java:240)
at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:158)
Here is the snippet that is causing it:
HSLFSlideShow ppt = new HSLFSlideShow(fs)
ObjectData[] embeddes = ppt.getEmbeddedObjects()
embeddes.eachWithIndex { ObjectData entry, int i ->
    File f = new File(fileToSave)
    f.withOutputStream { w ->
        w << entry.getData() // This causes the error
    }
}
Here is the link to the ObjectData class getData() method https://poi.apache.org/apidocs/org/apache/poi/hslf/usermodel/ObjectData.html#getData()
I'm using Apache POI 3.10-FINAL
I am trying to delete a single row from an Excel file, but I can't get it to work. See the code here:
try {
    String val = request.getParameter("rdel");
    int va = 5;
    System.out.println("int val" + va);
    FileInputStream file = new FileInputStream(new File(fileName));
    HSSFWorkbook wb = new HSSFWorkbook(file); // here exception occurs
    HSSFSheet sheet = wb.getSheetAt(0);
    int lastRowNum = sheet.getLastRowNum();
    if (va >= 0 && va < lastRowNum) {
        sheet.shiftRows(va + 1, lastRowNum, -1);
    }
    if (va == lastRowNum) {
        HSSFRow removingRow = sheet.getRow(va);
        if (removingRow != null) {
            sheet.removeRow(removingRow);
        }
    }
    FileOutputStream out = new FileOutputStream("D:/task.xls");
    wb.write(out);
} catch (Exception e) {
    e.printStackTrace();
}
return SUCCESS;
}
See the exception:
org.apache.poi.hssf.record.RecordFormatException: Unable to construct record instance
at org.apache.poi.hssf.record.RecordFactory.createRecord(RecordFactory.java:186)
at org.apache.poi.hssf.record.RecordFactory.createRecords(RecordFactory.java:328)
at org.apache.poi.hssf.usermodel.HSSFWorkbook.<init>(HSSFWorkbook.java:271)
at org.apache.poi.hssf.usermodel.HSSFWorkbook.<init>(HSSFWorkbook.java:196)
at org.apache.poi.hssf.usermodel.HSSFWorkbook.<init>(HSSFWorkbook.java:312)
at org.apache.poi.hssf.usermodel.HSSFWorkbook.<init>(HSSFWorkbook.java:293)
at com.struts.curd.Delete.execute(Delete.java:47)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
This issue was fixed after 3.2. Update your POI version to a recent one (right now it is 3.9) and hopefully you will not face this issue again.
For details, please check here.