While importing an unmanaged cluster, the job fails with a "Permission denied" error. What could be causing this?
OpsCenter developer here. It's hard to give a meaningful answer with so little information.
If you click through on the error, you'll get a more in-depth description.
If you look at the opscenterd.log file, you'll find additional context. You can log the LCM job-events to that file by setting the 'lcm' logger to debug in opscenterd's logback.xml.
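For example, the logger element in logback.xml would look something like this (a minimal sketch - the exact location of opscenterd's logback.xml depends on your install, and the rest of the existing configuration should be left as-is):

    <!-- Raise the 'lcm' logger to DEBUG so LCM job-events are written
         to opscenterd.log. -->
    <logger name="lcm" level="DEBUG"/>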
But if I'm guessing, you don't have filesystem permissions to write in the home directory of the user you're logging in as. Log in as the user you specified in your LCM ssh-credentials and try to touch ./test-file and see if you get a permissions error. If you do, you'll need to resolve that outside of LCM before you can proceed. LCM needs to write a temporary file to your home-directory.
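For example (lcm-user and node1 are just placeholders for the user from your LCM ssh-credentials and one of your nodes):

    # Log in as the LCM ssh-credential user and check that the home directory is writable.
    ssh lcm-user@node1
    touch ./test-file     # "Permission denied" here is the problem LCM is hitting
    ls -ld ~              # inspect the owner and mode of the home directory
    rm ./test-file        # clean up if the touch succeeded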
There is a (linux) directory like below:
/a/b/c/d
which is accessible from multiple machines (NFS)
Now, the owner of everything in that directory is dir_owner.
However, someone or something that can sudo as dir_owner is changing the permissions of directory d to 777 and then writing to that location as a different user. This different user is always the same one, say, unauthorised_user.
I was hoping to track what is causing this. Could someone please help me out on this? Let me know if any more information is required.
You can use the stat command, which shows file or file system status.
More information and parameters for stat can be found on the following webpage:
https://ss64.com/bash/stat.html
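For example, against the directory from your question:

    # Current mode, owner and timestamps of the directory.
    stat /a/b/c/d
    # Compact form: mode, owner:group, last status change (ctime - updated by chmod), name.
    stat -c '%A %U:%G %z %n' /a/b/c/d

Note that stat only shows the current state; it does not record who made the change.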
Another option is auditd, which can be configured to write audit records to disk. More information is on the following webpage:
https://man7.org/linux/man-pages/man8/auditd.8.html
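A minimal sketch of an audit watch on that directory (the key name d-perm-watch is arbitrary; auditd only sees local syscalls, so the rule has to run on the machine where the chmod is actually executed):

    # Watch the directory for attribute changes (permissions/ownership) and writes.
    auditctl -w /a/b/c/d -p wa -k d-perm-watch

    # Search the audit log for matching events; -i resolves uids to user names,
    # so you can see which account (and which command/sudo) changed the mode.
    ausearch -k d-perm-watch -i

    # To make the rule persistent, add the -w line to a file under
    # /etc/audit/rules.d/ and reload with: augenrules --load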
I have a path where some log files are generated dynamically every day with a timestamp and 400 (-r--------) permission, so only the owner of these files can view the logs.
Logs path : /dir_01/abc_01/logpath
Log files :
-r-------- LogFile_20141001
-r-------- LogFile_20141002
-r-------- LogFile_20141003
I want others to be able to view the logs, but I can't give read permission on the logs to others, and copying the logs every time to another location (e.g. /dir_02/logs) and granting permission there so that others can see them is really difficult, as the logs are created dynamically. Is there any way that whenever a log is created in the actual log path, i.e. /dir_01/abc_01/logpath, it is also made available at some other path like /dir_02/logs with read permission for others? Would mounting be helpful in this scenario, and if so, how?
It is possible to use the umask option (for some filesystems, e.g. vfat) during mounting, and then all files created in this directory will have the required permissions, but a definitely better option is to use extended ACLs; then all files created in the directory (or directories) will have permissions set according to your requirements.
The umask syscall (as opposed to the umask mount option) sets permissions only for the calling process. That means if another process with a different umask creates a file or directory, the permissions will not match your requirements.
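As an illustration of the extended-ACL approach (the group name logreaders is only an example, and the commands need to be run as root or as the owner of the files):

    # Give the group read access to the directory and the existing logs.
    setfacl -R -m g:logreaders:rX /dir_01/abc_01/logpath

    # Default ACL: inherited by files created in the directory from now on.
    setfacl -d -m g:logreaders:rX /dir_01/abc_01/logpath

    # Verify. Caveat: if the application creates its files with mode 0400, the
    # ACL mask of each new file is derived from that mode, so getfacl may show
    # the inherited group entry as "effective:---" and you would still have to
    # relax the mask (setfacl -m m::r <file>) or change how the files are created.
    getfacl /dir_01/abc_01/logpath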
I can't tell whether these are the same files or not:
/dir_01/abc_01/logpath
/dir_02/logs
But if you want to do something at the exact moment a file is created, then you need inotify to monitor the directory (to catch the create event) and execute another action when the file is created.
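A rough sketch with inotifywait from the inotify-tools package, using the two paths from the question (run it as the file owner or root so the copy is allowed to read the 0400 files):

    # Watch for new log files and publish a world-readable copy.
    inotifywait -m -e create --format '%w%f' /dir_01/abc_01/logpath |
    while read -r newfile; do
        cp "$newfile" /dir_02/logs/ &&
        chmod o+r "/dir_02/logs/$(basename "$newfile")"
    done

For files that keep growing during the day you may prefer the close_write event or a periodic rsync instead of reacting to create.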
There was an error when trying to connect. Do you want to retype your credentials and try again?
Details:
Filename: \\?\C:\Windows\system32\inetsrv\config\redirection.config
Error: Cannot read configuration file due to insufficient permissions
Suddenly I have this error when trying to access my application on IIS 8! Does anyone know how to fix it?
I had a similar issue and this article helped me resolve it: https://blogs.msdn.microsoft.com/prashant_upadhyay/2011/09/20/unable-to-expand-server-node-in-the-iis-7-ui-with-shared-configuration/
In my case, the account associated with configurationRedirection was locked out; once it was re-enabled, the issue was resolved.
I guess a similar issue would arise if the account did not have access to the Shared configuration folder, or if the account's password/credentials had changed.
I had the exact same error message for one of my users following a permissions change to his user account. Turned out he had an old "disconnected" RDP session on the server. Once that was signed off (via task manager) the error went away.
I've got an Azure app up and running, but various requests generate a 500 error. There are no other details that come back from the server to let me know exactly what the problem is. No stack trace, no error message. The only thing I get back from the server are the http headers indicating I've got an error.
I've done a little looking around but can't seem to find a way to retrieve the error details that I'm looking for. I've seen some articles that suggest that I enable logging, but I'm not sure 1) how to do that, 2) where those log files would go and 3) how to access said log files. I've seen posts that say to add a whole bunch of code to my application to enable logging, but all I'm looking for is an error message and a stack trace from a 500 error. Do I really have to add a bunch of code to my app to see that information? If not, how can I get at it?
Thanks!
Chris
The best long-term solution is to enable Azure Diagnostics, which I think is what you're referring to. If you want a quick-and-dirty solution, you can log errors out to a file and then RDP into the role instances to view them. This is very similar to what you would do on a server in your own datacenter.
You can create the logs however you like. I've used log4net and RollingFileAppenders with some success. Setting the logfile path to something like "\logs\mylog.txt" will place the logs in the E: drive of the VM. Note you'll still need code somewhere in your app to capture the error and write it to the log - typically the global error handler in Global.asax is a good place for that.
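A minimal sketch of that handler wired to log4net (it assumes log4net is already configured - e.g. via XmlConfigurator in Application_Start or an assembly attribute - with your RollingFileAppender):

    // Global.asax.cs
    using System;
    using System.Web;
    using log4net;

    public class Global : HttpApplication
    {
        private static readonly ILog Log = LogManager.GetLogger(typeof(Global));

        protected void Application_Error(object sender, EventArgs e)
        {
            // The unhandled exception that is about to surface as a 500.
            Exception ex = Server.GetLastError();
            Log.Error("Unhandled exception", ex);
        }
    }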
You'll also have to enable RDP access to your role instances. There are many articles detailing how to do that. Here's one.
This is not a generally recommended approach because the logs may disappear when the role recycles or is recreated. It's also a pain in the butt to log in to keep an eye on all those different servers.
One other warning - it's possible that the 500 error is due to some failure in your web.config. If that is the case, all the application-level error logging in the world isn't going to help you. So be sure that your web.config is valid, and also check the Windows Event Logs while you're RDP'd into the server.
A 500 internal server error is most often caused by some problem on the server: it was not able to process the incoming request, or there was some problem in the configuration. So try to run the app locally and see if there is a problem. You can record errors to a database in your catch blocks / Application_Error, and you can also use tracing. Believe me, they are very helpful and worth a few extra lines of code.
For tracing, have a look here: http://msdn.microsoft.com/en-us/magazine/ff714589.aspx
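If you go the tracing route, a bare-bones System.Diagnostics sketch looks like this (the listener file name is arbitrary; the article above covers the richer ASP.NET-specific options):

    using System.Diagnostics;

    class TracingDemo
    {
        static void Main()
        {
            // Also send trace output to a file, in addition to any listeners
            // configured in web.config / app.config.
            Trace.Listeners.Add(new TextWriterTraceListener("trace.log"));
            Trace.AutoFlush = true;

            Trace.TraceInformation("Handling request ...");
            Trace.TraceError("Something went wrong: {0}", "details here");
        }
    }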
Please help me with this question about using log4net.
I am using log4net in my web application. I am facing issues configuring log4net to log errors at the user level.
That is, if user X logs in, I would like to create a file named X, and all errors for user X should be written to X.log. Similarly, if user Y logs in, the log file should be named Y.log. The most important point to note is that they could be logged in concurrently.
I tried my luck by creating log files whose names are built dynamically as soon as the user logs in. The issue here is: if the users are not using the application at the same time, the log files are created with the correct names and written to as expected, but if both users have active sessions, a log file is created only for the user who logged in FIRST, and the second user's errors are recorded in the log file created for the FIRST user.
Please help me with this.
There has to be a better solution than this one, but you can change the log4net configuration from code and even decide which config file to load - so you can do it in code, which is not as nice as editing an XML file.
So what you would need to do, which is highly not recommended, is to create the log4net configuration each time you call the logger static class, and do what is needed based on the calling user.
Again... it doesn't feel right!
(and it will probably perform poorly).
Another BETTER solution is to log everything to a database (log4net supports it), with a user column, and then produce the per-user logs from the DB...
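A sketch of that idea: stamp every event with the current user through a log4net context property, and map that property to the extra column (the property and helper names below are just examples):

    // Set the property once per request (e.g. in Global.asax or an IHttpModule);
    // the appender then records %property{user} alongside each event.
    using System.Web;
    using log4net;

    public static class UserLog
    {
        private static readonly ILog Log = LogManager.GetLogger(typeof(UserLog));

        public static void Error(string message, System.Exception ex)
        {
            var ctx = HttpContext.Current;
            string user = (ctx != null && ctx.User != null && ctx.User.Identity.IsAuthenticated)
                ? ctx.User.Identity.Name
                : "anonymous";

            // Attached to every event logged on this thread from here on.
            ThreadContext.Properties["user"] = user;
            Log.Error(message, ex);
        }
    }

On the appender side, add an AdoNetAppender parameter whose layout is a PatternLayout with conversionPattern %property{user} and point it at your user column; producing per-user logs is then just a WHERE clause.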