I am getting an error writing a file that is driving me crazy.
I have a C# .NET 5 application running on Red Hat Linux.
I mounted a shared Windows folder using: sudo mount -t cifs -o username=MyDomainUsername,password=MyDomainUsernamePassword,domain=MyDomain,dir_mode=0777,file_mode=0777 //ipv4_from_destination/Reports /fileshare/Reports
Then I run the app using just ./WebApi --urls=http://+:8060
The read/write test executes the following steps (sketched in code below):
Create a text file.
Write to the text file.
Delete the text file.
Create a directory.
Create a text file inside that directory.
Write to that text file.
Delete the text file.
Delete the directory.
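A minimal C# sketch of that sequence (the file name here is hypothetical; this is not the exact code from ApplicationController.TestFileSystem):

using System.IO;

var path = "/fileshare/Reports/test.tmp";  // hypothetical name; the real test uses a GUID
using (var writer = File.CreateText(path)) // create: succeeds
{
    writer.Write("test");                  // the write is buffered...
    writer.Flush();                        // ...and the flush throws UnauthorizedAccessException
}
File.Delete(path);                         // delete: succeeds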
Now the problem:
The text file is created, but the write operation fails.
Here is part of the log:
Creating file: /fileshare/Reports/test.616db7d1-07fb-4599-a0cf-749e6a8b34ec.tmp...Ok
Writing file: /fileshare/Reports/test.616db7d1-07fb-4599-a0cf-749e6a8b34ec.tmp...[16:22:20 ERR] ID:87988856-a765-4474-9ed9-2f04aef35771 PATH:/api/about ERROR:System.UnauthorizedAccessException:Access to the path '/fileshare/Reports/test.616db7d1-07fb-4599-a0cf-749e6a8b34ec.tmp' is denied. TRACE: at System.IO.FileStream.WriteNative(ReadOnlySpan`1 source)
at System.IO.FileStream.FlushWriteBuffer()
at System.IO.FileStream.FlushInternalBuffer()
at System.IO.FileStream.Flush(Boolean flushToDisk)
at System.IO.FileStream.Flush()
at System.IO.StreamWriter.Flush(Boolean flushStream, Boolean flushEncoder)
at System.IO.StreamWriter.Flush()
at WebApi.Controllers.ApplicationController.TestFileSystem(String folder) in xxxxxxx\WebApi\Controllers\ApplicationController.cs:line 116
What I discovered so far:
I can create and delete the files and directories.
I cannot write to files.
Can someone give me a hint on this?
Solved using the cifs mount option nobrl.
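With the command from the question, that means appending nobrl to the mount options (nobrl tells the CIFS client not to send byte-range lock requests, which this server/share combination apparently rejects):

sudo mount -t cifs -o username=MyDomainUsername,password=MyDomainUsernamePassword,domain=MyDomain,dir_mode=0777,file_mode=0777,nobrl //ipv4_from_destination/Reports /fileshare/Reports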
I want to get details about a selected harbour by taking values from a list extracted with readLines from a .txt file; each harbour has its own .txt file in the assets directory. I generate the file name, but when the app runs in the emulator I get a file-not-found error.
In this case I am trying to read a file called Brehatharm.txt:
var portChosen = "Brehat"
//"tide2a/app/src/main/assets/" + // various paths to try
val fileName = "assets/" + portChosen + "harm.txt"
val harmConsList: List<String> = File(fileName).readLines()
val portDisplayName = harmConsList[0]
val longTude = harmConsList[1]
val MTL = harmConsList[2]
etc.
The logcat reads:
2021-01-10 15:40:34.044 7108-7108/com.example.tide2a E/AndroidRuntime: FATAL EXCEPTION: main
Process: com.example.tide2a, PID: 7108
java.lang.RuntimeException: Unable to start activity ComponentInfo{com.example.tide2a/com.example.tide2a.MainActivity}: java.io.FileNotFoundException: assets/Brehatharm.txt: open failed: ENOENT (No such file or directory)
The full Windows path to the file is:
C:\Users.......\OneDrive\Coding projects\tide2a\app\src\main\assets\Brehatharm.txt
I am sure the file is there and spelled correctly, so I suspect I am specifying the path incorrectly. Please advise me.
Files in assets/ cannot be accessed using the File class. Use context.assets to get the AssetManager, and you can open InputStreams to the files.
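For example, a minimal Kotlin sketch, assuming this code runs inside an Activity (so the assets property is available) and that Brehatharm.txt sits directly in app/src/main/assets/:

val fileName = portChosen + "harm.txt" // no "assets/" prefix: AssetManager paths are relative to the assets root
val harmConsList: List<String> = assets.open(fileName)
    .bufferedReader()
    .readLines()
val portDisplayName = harmConsList[0]
val longTude = harmConsList[1]
val MTL = harmConsList[2]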
I am trying to copy a file from one folder to another on a mounted folder, and I see the following error. Note that this is on a mounted NFS folder, not on HDFS. The error comes from the line of code that calls create() on the destination file; the "No such file" error is not about the source.
java.io.IOException: Cannot run program "chmod": error=2, No such file or directory
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1059)
at org.apache.hadoop.util.Shell.runCommand(Shell.java:938)
at org.apache.hadoop.util.Shell.run(Shell.java:901)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:1213)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:1307)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:1289)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:840)
at org.apache.hadoop.fs.RawLocalFileSystem.mkOneDirWithMode(RawLocalFileSystem.java:522)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:562)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:534)
at org.apache.hadoop.fs.ChecksumFileSystem.mkdirs(ChecksumFileSystem.java:705)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:456)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:443)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1118)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1098)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:987)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:975)
The issue was resolved after adding the directory containing chmod to the PATH variable.
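For example, if chmod lives in /bin (as it usually does on Linux), exporting that directory onto the PATH before launching the job would look like:

export PATH=$PATH:/bin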
I am trying to copy a file from one location to another, so I'm using this:
const fs = require('fs');

fs.copyFile('C:\\Users\\Me\\Documents\\myfile.zip', 'C:\\myfiles\\myfile.zip', (err) => {
  if (err) throw err;
  console.log('file was copied successfully!');
});
I can see that the destination folder is read-only, so that's why I'm getting this.
How can I change its status on my Windows PC?
I've tried this, but nothing happens and I still get the error:
fs.chmodSync('c:\\myfiles', 0o755);
How can I fix this issue?
You are using Windows, and I guess C:\ is your system disk (where Windows is installed).
Writing a file to C:\myfiles requires Administrator permission (you can check this by hand: try copying and pasting a file into the folder).
Solution:
Option 1: Use a different folder, e.g. D:\myfiles.
Option 2: Use Windows File Explorer to change the folder's permissions (give Everyone read/write).
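For example, with Option 1 (a sketch; it assumes D:\myfiles already exists and is writable):

const fs = require('fs');

fs.copyFile('C:\\Users\\Me\\Documents\\myfile.zip', 'D:\\myfiles\\myfile.zip', (err) => {
  if (err) throw err; // no longer fails once the destination is writable
  console.log('file was copied successfully!');
});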
I'm using tika-app.jar version 1.12 to try to find the list of corrupted files that can't be opened in a specified folder.
The problem: when I tested on Windows, the log folder gives me exceptions that let me know which files can't be opened, like this:
Caused by: org.apache.poi.openxml4j.exceptions.InvalidOperationException: Can't open the specified file: 'folder\mi-am-CV.docx'
But on Linux I only get a broad error in the log folder, like this:
WARN org.apache.tika.batch.FileResourceConsumer - <parse_ex resourceId="test-corrupted-2.doc">org.apache.tika.exception.TikaException: Unexpected RuntimeException from org.apache.tika.parser.microsoft.OfficeParser#f6e9bd4
So I can't tell specifically which files are really corrupted and can't be opened.
Here's the shell command I use for that on Linux:
java -Dlog4j.debug -Dlog4j.configuration=file:log4j_driver.xml -cp "bin/*" org.apache.tika.cli.TikaCLI -JXX:-OmitStackTraceInFastThrow -JXmx5g -JDlog4j.configuration=file:log4j.xml -bc tika-batch-config-basic-test.xml -i /folder -o outxml -numConsumers 10
Thanks.
I'm trying to set up data import from files using Apache Tika and Solr. The shared docs folder is on an NFS-mounted share. Unfortunately, I can't perform the data import: one file is processed and then an exception is thrown:
[http-8080-3] ERROR org.apache.solr.handler.dataimport.DocBuilder - Exception while processing: files document : null:org.apache.solr.handler.dataimport.DataImportHandlerException: Unable to read content Processing Document # 2
at org.apache.solr.handler.dataimport.DataImportHandlerException.wrapAndThrow(DataImportHandlerException.java:71)
....
at java.lang.Thread.run(Thread.java:744)
Caused by: java.io.IOException: Access denied
at java.io.UnixFileSystem.createFileExclusively(Native Method)
at java.io.File.createNewFile(File.java:1006)
at java.io.File.createTempFile(File.java:1989)
at org.apache.tika.io.TemporaryResources.createTemporaryFile(TemporaryResources.java:66)
at org.apache.tika.io.TikaInputStream.getFile(TikaInputStream.java:533)
at org.apache.tika.io.TikaInputStream.getFileChannel(TikaInputStream.java:564)
at org.apache.tika.parser.microsoft.POIFSContainerDetector.getTopLevelNames(POIFSContainerDetector.java:373)
at org.apache.tika.parser.microsoft.POIFSContainerDetector.detect(POIFSContainerDetector.java:165)
at org.apache.tika.detect.CompositeDetector.detect(CompositeDetector.java:61)
at org.apache.tika.parser.AutoDetectParser.parse(AutoDetectParser.java:113)
at org.apache.solr.handler.dataimport.TikaEntityProcessor.nextRow(TikaEntityProcessor.java:140)
... 26 more
So it seems to be some problem with permissions while writing temporary files. Unfortunately, I have no idea where exactly Tika tries to write those temporary files, so I can't check the permissions on NFS. I checked the permissions for the Tika home folder (core configuration) and for the docs folder and its subfolders - all OK, including the problematic document.
I also tried changing the docs directory in my core config to a different one (on the same NFS share) and everything was OK. So, do you have any idea how to track down my issue?
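(One way to track it: the stack trace shows File.createTempFile, which writes to the JVM's java.io.tmpdir - /tmp by default, though servlet containers often override it; the http-8080 thread name suggests Tomcat, which points java.io.tmpdir at its own temp directory. Checking that directory's permissions, or redirecting it explicitly with a JVM option such as -Djava.io.tmpdir=/path/with/known/permissions, would show where the writes land.)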
[EDIT]
I just noticed that it's not really a permission problem. Everything works for .docx and .pdf files, but it fails on .doc files. Do you have any ideas?