I am trying to connect every user account to our shared storage at login.
This works fine with this command in CMD:
net use \\NAS /USER:email@domain.com passwordhere
As we are using Azure Active Directory, I need an email address to connect to our shared storage. How can I read the current user's email address and use it in place of the example address? All of this should happen in a batch file, as plenty of other lines are already written. (Also, without admin permissions if possible.)
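Something like this is what I have in mind: a sketch assuming whoami /upn actually returns the signed-in user's UPN (i.e. the email address) on our Azure AD joined machines, which I have not verified everywhere:

@echo off
rem Capture the current user's UPN (assumed here to be the Azure AD email).
for /f "delims=" %%u in ('whoami /upn') do set "UPN=%%u"
rem Map the share with that address; password handling unchanged.
net use \\NAS /USER:%UPN% passwordhere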
Thanks,
Max
I would like to create a script, scheduled via a .bat file, that automatically connects to Google Contacts and creates contacts read from a MySQL database.
I would like a system that does not require any user action.
I know that service accounts exist, but I have no idea how to create the program. Do you know how to do it?
I hope you can give me a hand.
For the moment, I wish you a good day.
This can be done in three steps if the user is not part of G Suite.
Authenticate the user using OAuth with access_type = offline.
Save the generated Refresh Token.
Use the Refresh Token to generate a new Access Token, then update the account's contacts. The Access Token is valid for 3,600 seconds (one hour) by default.
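For that last step, a minimal sketch of the refresh call; the endpoint and parameter names are Google's standard OAuth 2.0 ones, and the three shell variables are placeholders for your OAuth client credentials and the saved refresh token:

# Exchange the stored refresh token for a fresh access token.
curl -s https://oauth2.googleapis.com/token \
  -d client_id="$CLIENT_ID" \
  -d client_secret="$CLIENT_SECRET" \
  -d refresh_token="$REFRESH_TOKEN" \
  -d grant_type=refresh_token
# The JSON response contains "access_token" and "expires_in": 3600.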
If the user is part of G Suite, then enable Domain Wide Delegation on a service account and impersonate the user.
I am trying to write a query that pulls from federated tables in BigQuery. In BQ I can run the query and get results. However, when I run the same query in Domo, I get the error: "Domo is ready, but received an Access Denied from BigQuery: No OAuth token with Google Drive scope was found. Please contact the data provider for support."
I have read all over the place that I need to change the scope to do this. I am not a developer though, so I am not sure exactly how to go about this in BQ.
Does anyone have step by step instructions for how to do this?
Thanks!
When you create a federated table, you grant the account that runs the query access to the underlying file, e.g. in Google Drive.
So when you run the query in the BQ console, it uses your credentials.
When you run it from Domo, it may use a different account (probably some service account), so to make everything work you should grant that account proper access to your Drive file (essentially, share the document with it).
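A sketch of granting that access through the Drive API, equivalent to clicking "Share" in the Drive UI; the file ID and the service-account address are placeholders (the real address usually appears in the connector settings or the error details), and the access token must carry Drive scope:

# Share the Drive file backing the federated table with the service account.
curl -s -X POST "https://www.googleapis.com/drive/v3/files/FILE_ID/permissions" \
  -H "Authorization: Bearer $ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"type": "user", "role": "reader", "emailAddress": "domo-connector@your-project.iam.gserviceaccount.com"}'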
I've downloaded the starter pack and performed all the steps mentioned in the tutorial. I can create accounts, but if I log out and try to sign in again, I always get the error "invalid username or password". Can anyone share any pointers, as debugging this is nearly impossible?
I've created several accounts already, just to be sure the password was OK.
I've created the key containers.
I've created the two applications that are needed in the custom policies (web app and native).
I've updated the extension file with the correct IDs.
Yeah, if you don't get your client IDs correct, you can't log in, and it constantly reports invalid credentials.
Double-check that you copied the IDs correctly: the application (client) IDs of the two apps, not their object IDs.
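For reference, this is the section of TrustFrameworkExtensions.xml those IDs normally go into (a sketch based on the standard starter-pack setup; the GUIDs are placeholders):

<TechnicalProfile Id="login-NonInteractive">
  <Metadata>
    <!-- Application (client) ID of the ProxyIdentityExperienceFramework (native) app -->
    <Item Key="client_id">00000000-0000-0000-0000-000000000000</Item>
    <!-- Application (client) ID of the IdentityExperienceFramework (web) app -->
    <Item Key="IdTokenAudience">11111111-1111-1111-1111-111111111111</Item>
  </Metadata>
  <InputClaims>
    <InputClaim ClaimTypeReferenceId="client_id" DefaultValue="00000000-0000-0000-0000-000000000000" />
    <InputClaim ClaimTypeReferenceId="resource_id" PartnerClaimType="resource" DefaultValue="11111111-1111-1111-1111-111111111111" />
  </InputClaims>
</TechnicalProfile>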
I created a user account on an Amazon Linux instance as the root user. I found that if I create a user account (for example: ec2-user), that account does not have execute and write permissions on the Hadoop File System (HDFS), Hive, Pig, and the other tools installed on Amazon EMR. To give those accounts explicit permissions, I have to create a group with permissions equivalent to the superuser (root) account and add the users to that group. Is there any other way I can set up access to HDFS, Hive, Pig, etc. for those accounts?
Also, when logging in as the user, the Linux prompt does not ask for a password, even though I set one while creating the account. Are there any configuration changes I need to make in the /etc/ssh/sshd_config file?
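For reference, this is the change I suspect is needed, since Amazon Linux images ship with password logins over SSH disabled by default (a sketch, not yet verified on my instance):

# In /etc/ssh/sshd_config, change this line (the default is "no"):
PasswordAuthentication yes
# Then restart the SSH daemon so the change takes effect:
sudo service sshd restart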
Your question is not entirely clear to me, but let me attempt an answer based on what I understood.
When security is enabled, Hadoop enforces permissions per user. It seems your user needs its own space for writes and executions, i.e. a home directory.
First, log in as the 'hdfs' user in a terminal and create a home directory for your user in HDFS. Check whether a directory called /user/{yourUser} exists; if not, create it. Then make {yourUser} the owner of /user/{yourUser}, as in the sketch below.
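A sketch of those commands, assuming the HDFS superuser account is named 'hdfs' and your new account is 'youruser':

# Create the user's home directory in HDFS and hand ownership to the user.
sudo -u hdfs hdfs dfs -mkdir -p /user/youruser
sudo -u hdfs hdfs dfs -chown youruser:youruser /user/youruser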
I have a dedicated Linux web server with many user accounts on it, all located in /home/[userid] directories. I can create Perl scripts that run within each user's account and access files only inside that account, but now I need a script that runs “above” the user accounts and can access a file within any specified user's account.
Currently, I have a script that uses Net::FTP to retrieve the needed file from each account so I can extract the necessary data from it, but of course, it’s slow to FTP into every account. Since the accounts are merely directories on the server, I’m looking for a way to run a Perl script in a way that it can access each account directory and simply open the required file and return the requested data for the specified account.
How can I accomplish this?
You should log in as a user that has access to all the user directories (e.g. root). For security reasons, it might be safer to use sftp or some other encrypted connection.
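A minimal sketch of the local approach, run via sudo so it can read inside any home directory; the data file path is a hypothetical placeholder for whatever your accounts actually store:

#!/usr/bin/perl
# Usage: sudo perl getdata.pl <userid>
use strict;
use warnings;

my $userid = shift @ARGV or die "usage: $0 userid\n";
my $file   = "/home/$userid/data/report.txt";   # hypothetical file location

open my $fh, '<', $file or die "Cannot open $file: $!\n";
while (my $line = <$fh>) {
    print $line;   # replace with whatever extraction logic you need
}
close $fh;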