Lotus Notes: properties file could not be found and loaded - lotus-notes

I am new to Lotus and I am trying to read properties data from the server in a Java agent, but I am getting an error that the file is not found; also, when I use the ff.load method it gives an error saying it cannot access the properties file.
Can anyone tell me what I have to add to this code so that it works correctly? I have also corrected the restriction rights and selected the second option.
Java code:
InputStream con = session.getClass().getResourceAsStream("C:/pqrs.properties");
if(con== null)
{
System.out.println("FLAG FILE NOT FOUNDFIND NOT FOUND!!!!!!!!!!!");
}else
{
System.out.println("FLAG FILE FOUND !!!!!!!!FIND FOUND");
}
Properties ff = new Properties();
ff.load(con);

The OS's filesystem security settings are probably not allowing the Domino server to read files from the root directory of the C: drive.
Try moving the file into a location that you know for sure the Domino server can read, e.g. the C:\IBM\Lotus\Domino\Data folder - or whatever the equivalent location is in your Domino server's configuration.
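For what it's worth, here is a minimal sketch of how the agent might read the file once it has been moved. Note that getResourceAsStream looks up classpath resources rather than filesystem paths, so the sketch uses a plain FileInputStream; pqrs.properties is the file name from the question, and C:\IBM\Lotus\Domino\Data is just the example folder mentioned above.
// Inside the agent's NotesMain() method; needs java.io.* and java.util.Properties.
String path = "C:\\IBM\\Lotus\\Domino\\Data\\pqrs.properties"; // adjust to your data directory
Properties ff = new Properties();
InputStream con = null;
try {
    con = new FileInputStream(path);
    ff.load(con);
    System.out.println("FLAG FILE FOUND, " + ff.size() + " properties loaded");
} catch (IOException e) {
    // Thrown when the file is missing or the Domino server account cannot read it
    System.out.println("FLAG FILE NOT FOUND or not readable: " + e.getMessage());
} finally {
    if (con != null) {
        try { con.close(); } catch (IOException ignore) { }
    }
}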

Related

File.Exists from UNC using Azure Storage/Fileshare via IIS results in false.

Problem:
I am trying to get an image out of an Azure file share for manipulation. I need to read the file as a Drawing.Image, but I cannot create a valid FileInfo object or Image using the UNC path (which I need to do in order to use it over IIS).
Current Setup:
Attach a virtual directory called Photos in the IIS website, pointing to the UNC path of the Azure file share (e.g. \\myshare.file.core.windows.net\sharename\pathtoimages)
This works as http://example.com/photos/img.jpg so I know it is not a permissions or authentication issue.
For some reason, though, I cannot get a reference to the file.
var imgpath = Path.Combine(Server.MapPath("~/Photos"), "img.jpg");
// resolves as \\myshare.file.core.windows.net\sharename\pathtoimages\img.jpg
var fi = new FileInfo(imgpath);
if (fi.Exists) // this returns false 100% of the time
{
    var img = System.Drawing.Image.FromFile(fi.FullName);
}
The problem is that the file is never found to exist, even though I can take that path, put it in an Explorer window, and open img.jpg 100% of the time.
Does anyone have any idea why this would not be working?
Do I need to be using CloudFileShare object to just get a read of a file I know is there?
It turns out the issue is that I needed to wrap my code in an impersonation of the Azure file share user ID, since the virtual directory is not really in play at all at this point.
using (new impersonation("UserName","azure","azure pass"))
{
//my IO.File code
}
I used this guy's impersonation script, found here: Can you explain why DirectoryInfo.GetFiles produces this IOException?

files in server root not loading properly in Azure

I'm working on an app that uses the file system in the server's root directory instead of a database. It's basically a note application that allows me to save notes. Each note is a serialized object of the Note class, stored using the following structure: \Data\Notes\MyUsername\Title.txt
When I'm testing this on localhost through IIS Express everything works fine and I can easily go step by step there.
However, once I publish the app to Azure, the folder structure is still there (I made a test controller that uses Directory.GetFiles() and .GetDirectories() to simulate folder browsing, so I'm sure the files are there) but the file simply doesn't get loaded.
Loading script that's being called:
public T Load<T>(string filePath) where T : new()
{
StreamReader reader = null;
try
{
reader = new StreamReader(filePath);
var RawDB = reader.ReadToEnd();
return JsonConvert.DeserializeObject<T>(RawDB);
}
catch
{
return default(T);
}
finally
{
if (reader != null)
reader.Dispose();
}
}
Since I can't normally debug the app on Azure, I tried to dump as much info as I could through ViewData. Even there everything looks okay and the paths match, but the deserialized object is still null, and this happens only when trying to open an existing note WITHOUT creating a new one first (more on that later).
Additionally, like I said, those new notes get saved in the folder structure, and there's a note sidebar on the left that allows users to switch between notes. The note browser is nothing more than a list that's collected with a .GetFiles() of that folder.
On Azure, this works normally and if I were to delete one manually it'd be removed from the sidebar as well.
Now here's the kicker. On localhost, adding a note adds it to the sidebar and I can switch between them normally.
Adding a note on Azure makes all views display only that new note, regardless of which note I open, and the new note does NOT get stored in the structure (I don't know where it ended up at all!) even though the path is defined normally at that point and it should save just like it does on localhost.
var model = new ViewNoteModel()
{
Note = Load<Note>($#"{NotePath}\{Title}.txt"), //Works on localhost, fails on Azure on many levels. Title is a URL param.
MyNotes = GetMyNotes() //works fine, reads right directory on local and Azure
};
To summarize:
Everything works fine on localhost; the important part doesn't work on Azure.
If a new note is not created but an existing note is opened, the correct note gets loaded (based on the URL param) on localhost; on Azure it breaks and loads a default Note object (not null, just the default-constructor data, since JsonConvert requires it).
If a new note is created, you'll see it on localhost and you'll still be able to open all other notes; on Azure you will see only the new note regardless of which note is picked.
It's really strange and I have no idea what could cause this. I thought it had something to do with Azure requests being handled differently, so maybe the controller pushes the View before the model is initialized completely, but that doesn't make sense since there's nothing async here.
However, the fact that it loads a note that doesn't exist on the server is even more absurd, and I have no explanation for that.
Additionally, this issue is not linked to a session. I logged in through my phone and it showed the fake note there as well, right away.
P.S. Before you say anything about storage, please note this: our university grants us a very limited Azure subscription - a simple lowest-tier App Service and a 5 DTU SQL server, with 99% of the rest locked out of our subscription. This is why I'm storing stuff on the server, not because I believe it's the smart thing to do.

Server MapPath not working on remote server

I am using Server.MapPath to find the path of a document uploaded to a remote server, so that I can then open it. However, when I use it, it returns a local path, so rather than searching the remote server it searches the local machine instead.
What I am using to open the document is:
System.Diagnostics.Process.Start(Server.MapPath(Path.Combine("~/", document)));
Where "document" is the part of the path relative to the document itself, in this case "Files\2016\11\doc_name". So I want to take the path of this document, go to the top level of the site, and then find the document from there.
However, I would hope that this would return a path similar to "\\server\inetpub\site\Files\2016\11\doc_name", but instead it is returning a path like "d:\inetpub\site\Files\2016\11\doc_name".
Can someone help me with what is the correct function to use to get the path I need?
EDIT
I have managed to fudge together the correct path using the following code:
string server = Environment.MachineName;
string path = Server.MapPath(Path.Combine("~/", documentpath));
System.Diagnostics.Process.Start(@"\\" + server + path.Substring(path.IndexOf(@"\")));
However, while I can get this to access the file when I'm running the project locally, it errors when I try to do it on the published site. As I can access it one way, I'm assuming that it could be permissions (just to note, the site is using Windows authentication). Is this the most likely cause?

How to set the file upload path in a Liferay portlet so that it can be stored on either a Unix or Windows system (machine independent)?

The suggestions were to give the path as catalina_home or user/home.
Do I need to set any parameters in server.xml?
And using actionRequest.getPortletSession().getPortletContext().getRealPath("/"), I am unable to store the file path in the database.
Properties properties = PortalUtil.getPortalProperties();
String serverHome = properties.getProperty("liferay.home"); // to get the Liferay server home folder
Create any folder for uploaded files under it.
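For illustration, a rough sketch of how the upload handler could build a machine-independent path under liferay.home and copy the uploaded stream there; the uploaded_files folder name and the is / fileName variables are assumptions for the example, not part of the question.
Properties props = PortalUtil.getPortalProperties();
String serverHome = props.getProperty("liferay.home");

// File.separator keeps the path valid on both Unix and Windows.
File uploadDir = new File(serverHome + File.separator + "uploaded_files");
if (!uploadDir.exists()) {
    uploadDir.mkdirs();
}

// "is" is the uploaded file's InputStream and "fileName" its name,
// both obtained from the portlet's upload request.
File target = new File(uploadDir, fileName);
java.nio.file.Files.copy(is, target.toPath(),
    java.nio.file.StandardCopyOption.REPLACE_EXISTING);

// Store target.getAbsolutePath() (or just the part under liferay.home) in the database.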

Uploaded image only available after refreshing the page

When I upload a picture, the file is successfully saved and the path is successfully set, but the uploaded image is not displayed immediately after the form submit. Only when I reload the page is the uploaded image displayed.
I'm saving the uploaded file as below:
InputStream is;
try {
File file = new File("C:\\****\\*****\\Documents\\NetBeansProjects\\EventsCalendary\\web\\resources\\images\\uploadPhoto.png");
is = event.getFile().getInputstream();
OutputStream os = new FileOutputStream(file);
setUserPhoto("\\EventsCalendary\\resources\\images\\"+file.getName());
byte buf[] = new byte[1024];
int len;
while ((len = is.read(buf)) > 0) {
os.write(buf, 0, len);
}
os.close();
is.close();
} catch (IOException ex) {
ex.printStackTrace();
}
Why is the uploaded image only displayed after reloading the page and how can I solve this?
You're writing the file straight into the IDE's project folder, and your intent seems to be to save the file in the webapp's deploy folder. This is a bad idea, for the following three main reasons:
Changes in the IDE's project folder do not immediately get reflected in the server's work folder. There is a kind of background job in the IDE which takes care that the server's work folder gets synced with the latest updates (in IDE terms this is called "publishing"). This is the main cause of the problem you're seeing.
In real-world code there are circumstances where storing uploaded files in the webapp's deploy folder will not work at all. Some servers do not (either by default or by configuration) expand the deployed WAR file into the local disk file system, but instead keep it fully in memory. You can't create new files in memory without basically editing the deployed WAR file and redeploying it.
Even when the server expands the deployed WAR file into the local disk file system, all newly created files will get lost on a redeploy or even a simple restart, simply because those new files are not part of the original WAR file.
You need to write it to a fixed path outside the project/deploy folder instead. For example, /var/webapp/uploads. Then, to get it to be served by your webapp, just add it as a new web application context to the server.
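A rough sketch of what the save step from the question could then look like, with /var/webapp/uploads as the example external path from above and uploadPhoto.png carried over from the question's code:
// Write the upload to a fixed path outside the IDE project / deploy folder.
// The server process must have write permission on /var/webapp/uploads.
File file = new File("/var/webapp/uploads", "uploadPhoto.png");
try (InputStream is = event.getFile().getInputstream()) {
    java.nio.file.Files.copy(is, file.toPath(),
        java.nio.file.StandardCopyOption.REPLACE_EXISTING);
} catch (IOException ex) {
    ex.printStackTrace();
}
// With the alternate docroot configured as shown below, the image is then
// reachable as /uploads/uploadPhoto.png relative to the webapp's context.
setUserPhoto("/uploads/" + file.getName());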
Based on your previous question, I know that you're using GlassFish 3.1. In this server, such a context is called a "virtual host". You can configure it at server level in the admin console at http://localhost:4848 > Configuration > HTTP Service > Virtual Servers, or at webapp level by adding the following line to /WEB-INF/glassfish-web.xml (your IDE should have autogenerated one; note that before GlassFish 3.1 this file was called sun-web.xml, so if you see manuals/blogs/tutorials referencing that name, yes, it's exactly the same file):
<property name="alternatedocroot_1" value="from=/uploads/* dir=/var/webapp" />
Either way, you should then be able to use http://localhost:8080/contextname/uploads/* to reference those uploaded images in <img> the usual way.
See also:
How to upload files to server using JSP/Servlet?
Recommended way to save uploaded files in a servlet application (contains a Tomcat configuration example)
Reading/writing a text file in a servlet, where should this file be stored in JBoss? (contains JBoss configuration example)
Simplest way to serve static data from outside the application server in a Java web application
