Get file directory using username and password

I have a series of virtual machines deployed to an Azure virtual network that are connected to our on-premises server, so we can access the application via the UNC path \\10.101.1.5, which is mapped to the config variable "InputFolder".
Due to the change in architecture, the following code can no longer access the share using the network credentials of the signed-in user, and I want to pass a username and password into the directory search instead. Is this possible?
'Get all the files in the input directory
Dim directoryInfo As New IO.DirectoryInfo(ConfigurationManager.AppSettings("InputFolder"))
Dim files As IO.FileInfo() = directoryInfo.GetFiles()
Is there any way I can pass network credentials into the DirectoryInfo constructor?

Do you mean this DirectoryInfo class: https://msdn.microsoft.com/en-us/library/System.IO.DirectoryInfo(v=vs.110).aspx? If so, then there isn't a way to pass network credentials in.
Your options depend on how much freedom you have to change the app and the architecture: for example, setting up a file copy, or storing files as Azure blobs and accessing them via a URI and SAS key.
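One workaround commonly suggested for this situation (not part of the answer above; the share name and credentials below are placeholders) is to establish an authenticated SMB session before the .NET code runs. Once Windows holds a connection to the server for that logon session, plain IO.DirectoryInfo calls against the UNC path succeed. A minimal sketch:
# Sketch only; replace the share path and credentials with your own.
# Establishes an authenticated SMB session for the current logon session;
# subsequent IO.DirectoryInfo(...).GetFiles() calls against the UNC path then work.
net use \\10.101.1.5\InputShare /user:MYDOMAIN\svc-input "P@ssw0rd!"
# Verify the share is now readable
Get-ChildItem \\10.101.1.5\InputShare
The same effect can be achieved in-process via the Win32 WNetAddConnection2 API, P/Invoked from VB.NET.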

Related

Unable to mount file on windows from Azure

I created a file share on Azure using the File service and then tried to mount it using "Connect". It gave me the username localhost\xyz.
Two questions:
Why does the username start with "localhost" and not with "Azure"?
Why am I unable to mount the share? Windows Security gives no error; it just keeps returning to the credentials page.
P.S. TCP port 445 is working properly.
Here are a few workarounds that worked for us.
WAY-1
Open PowerShell on your machine and paste the connection script that the portal provides for your file share (Storage account > File shares > Connect).
WAY-2
In the Windows Security prompt, click "More choices" > "Use a different account", then use the storage account name prefixed with AZURE\ as the username and a storage account key as the password.
WAY-3
Map the file share directly, leaving "Connect using different credentials" unchecked.
OUTPUT:
With all of the above approaches, the file shares mounted successfully.
REFERENCES:
Mount SMB Azure file share on Windows: https://learn.microsoft.com/en-us/azure/storage/files/storage-how-to-use-files-windows
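For reference, WAY-2 can also be done from a command prompt instead of the Map Network Drive dialog; a minimal sketch with placeholder storage account name, share name, and key:
# Placeholders: replace the account, share, key, and drive letter with your own values
net use Z: \\mystorageaccount.file.core.windows.net\myshare /user:AZURE\mystorageaccount your-storage-account-key /persistent:yes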

Folder level access control in ADLS Gen2 for upcoming users

I have a Gen2 storage account and created a container.
The folder structure looks something like this:
StorageAccount
  -> Container1
       -> normal-data
            -> Files 1....n
       -> sensitive-data
            -> Files 1....m
I want to give the user read-only access to normal-data only, and NOT to sensitive-data.
This can be achieved by setting ACLs at the folder level and granting access to the service principal.
But the limitation of this approach is that the user can only access files loaded into the directory after the ACL is set up, and cannot access files that were already present in the directory.
Because of this limitation, new users cannot be given full read access (unless they use the same service principal, which is not the ideal scenario in my use case).
Please suggest a read-only access method in ADLS Gen2 where:
If files are already present under a folder and a new user is onboarded, he should be able to read all the files under the folder.
The new user should get access only to the normal-data folder and NOT to sensitive-data.
PS: There is a script for assigning ACLs recursively, but as I will get close to a million records each day under the normal-data folder, running the recursive ACL script would not be feasible for me.
You could create an Azure AD security group and give that group read-only access to the normal-data folder.
Then you can add new users to the security group.
See: https://learn.microsoft.com/en-us/azure/active-directory/fundamentals/active-directory-groups-create-azure-portal
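To deal with the inheritance limitation described in the question: if you also set a default ACL on normal-data, files created afterwards inherit the group's permission automatically, so a recursive pass is only needed once, for the files that already exist. A sketch using the Az.Storage cmdlets; the account name, container, and group object ID are placeholders:
# Sketch only; assumes the Az.Storage module and placeholder names
$ctx = New-AzStorageContext -StorageAccountName "mystorageaccount" -UseConnectedAccount
$groupId = "<object-id-of-the-security-group>"
# Build an access ACL entry plus a default (inherited-by-new-children) ACL entry
$acl = Set-AzDataLakeGen2ItemAclObject -AccessControlType group -EntityId $groupId -Permission "r-x"
$acl = Set-AzDataLakeGen2ItemAclObject -AccessControlType group -EntityId $groupId -Permission "r-x" -DefaultScope -InputObject $acl
# One-time recursive pass so files already under normal-data pick up the entries;
# new files then inherit via the default ACL without any further scripting
Update-AzDataLakeGen2AclRecursive -Context $ctx -FileSystem "container1" -Path "normal-data" -Acl $acl
Note that the group also needs execute (--x) on the container root so its members can traverse down to normal-data.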

Copying files in fileshare with Azure Data Factory configuration problem

I am trying to learn Azure Data Factory by using it to copy data (a collection of CSV files in a folder structure) from an Azure file share to a Cosmos DB instance.
In Azure Data Factory I'm creating a "copy data" activity and trying to set my file share as the source using the following host:
mystorageaccount.file.core.windows.net\\mystoragefilesharename
When trying to test the connection, I get the following error:
[{"code":9059,"message":"File path 'E:\\approot\\mscissstorage.file.core.windows.net\\mystoragefilesharename' is not supported. Check the configuration to make sure the path is valid."}]
Should I move the data to another storage type, like a blob, or am I not entering the correct host URL?
If you create the pipeline in JSON directly, you'll need to escape the backslashes in the host, e.g. "host": "\\\\myserver\\share"; if you're using the UI to set up the pipeline, enter the host as \\myserver\share.
Here is more info:
https://learn.microsoft.com/en-us/azure/data-factory/connector-file-system#sample-linked-service-and-dataset-definitions
I believe that when you created the file linked service, you may have chosen the public (Azure) integration runtime. With the public IR, local paths (e.g. C:\xxx, D:\xxx) are not allowed, because the machine that runs your job is managed by Microsoft and does not contain any customer data. Please use a self-hosted IR to copy your local files.
Based on the link posted by Nicolas Zhang (https://learn.microsoft.com/en-us/azure/data-factory/connector-file-system#sample-linked-service-and-dataset-definitions) and the examples provided therein, I was able to solve it and successfully create the copy activity. I had made two errors (I'm configuring via the Data Factory UI, not directly in JSON):
In the host path, the correct one should be: \\mystorageaccount.file.core.windows.net\mystoragefilesharename\myfolderpath
The username and password must be the ones corresponding to the storage account, not the actual user's account, which I was erroneously using.
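For anyone scripting this rather than using the UI, here is a sketch of the linked service definition, deployed with the Az.DataFactory cmdlets; the factory, resource group, account, share, and key names are placeholders, and the JSON shape follows the connector documentation linked above (note the doubled backslashes required in JSON):
# Sketch only; placeholder names throughout
$definition = @'
{
  "name": "FileShareLinkedService",
  "properties": {
    "type": "FileServer",
    "typeProperties": {
      "host": "\\\\mystorageaccount.file.core.windows.net\\mystoragefilesharename",
      "userId": "mystorageaccount",
      "password": { "type": "SecureString", "value": "<storage account key>" }
    }
  }
}
'@
Set-Content -Path linkedservice.json -Value $definition
Set-AzDataFactoryV2LinkedService -ResourceGroupName "myResourceGroup" -DataFactoryName "myDataFactory" -Name "FileShareLinkedService" -DefinitionFile "linkedservice.json"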

IIS 7.5 - Virtual directory mapped to Azure File Share - Cannot read configuration file

As the title suggests, I am receiving the error "Cannot read configuration file" when attempting to read a JPG file from an Azure file share mounted on a VM (from within the Azure network) through a virtual directory in IIS 7.5.
This points to a permission problem. I created a local user on the web server matching the username and set the password to the access key of the storage service on Azure (the same credentials used to access/mount the file storage share).
I set this on the app pool the vdir runs under, as well as on the virtual directory's "Physical Path Credentials"; both still return the following error:
HTTP Error 500.19 - Internal Server Error
The requested page cannot be accessed because the related configuration data for the page is invalid.
Detailed Error Information:
Module: IIS Web Core
Notification: BeginRequest
Handler: Not yet determined
Error Code: 0x80070003
Config Error: Cannot read configuration file
Config File: \\?\X:\web.config
Requested URL: http://localhost:80/myvdir/1.jpg
Physical Path: X:\1.jpg
Logon Method: Not yet determined
Logon User: Not yet determined
Config Source:
-1:
0:
So either the credentials I'm using do not match the UNC credentials, or IIS does not support this.
Anyone have any ideas?
UPDATE - 2016-15-18
Solved
Thanks to Simon W and Forester123
The issue was due to the following missing steps:
When adding the local user I failed to add this user to the IIS_IUSRS group
Using the drive letter at the start of the physical path for the application. You must use the UNC path \\myaccount.file.core.windows.net\sharename; you cannot use a drive letter, e.g. X:\.
This URL (provided by Simon W) was invaluable http://blogs.iis.net/davidso/azurefile
Trying to use a mapped drive is likely your issue. Take a look at how this is achieved using a UNC instead: http://blogs.iis.net/davidso/azurefile
You need to specify the UNC path of your file share as the Physical Path property:
\\<your_storage_account>.file.core.windows.net\<your_share>
Using a mapped drive will just give the error you encountered. A mapped drive is only valid for the logon session that created it.
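If you want to script the virtual directory setup described above, appcmd can set the UNC physical path and the physical path credentials in one step; a sketch with placeholder site, vdir, account, and key (run from an elevated PowerShell prompt):
# Sketch only; placeholder names, and the account must also be in IIS_IUSRS as noted above
& "$env:windir\System32\inetsrv\appcmd.exe" set vdir "Default Web Site/myvdir" /physicalPath:"\\myaccount.file.core.windows.net\sharename" /userName:"myaccount" /password:"<storage-account-key>"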

Azure Files preview - access shared folder in IIS and FileZilla

I'm interested in load balancing 2+ Windows VMs in Azure. My primary requirement, though, is that an 'uploads' folder would need to be consistent between each VM. Files in this folder are FTPed by our admin users, and they would then need to select these files in a C# MVC Web app. As you may connect through FTP to one VM, but a Web connection might be to another, the uploads have to be centralised.
It looked as if the new Azure Files, currently in Preview, would help, in that they let me set up a shared drive that each of the VMs could access. My thought was that FileZilla Server would allow FTPing up to this shared 'drive', and the Web app would access it to show the contents.
I've signed up to the Azure Files Preview, and set up the share, persistently mapping it to Drive Z for the sake of experimentation. I've also created a new user and made sure they too have persistent mapping to this same drive as Z.
But I can't seem to do anything with this outside of the Remote Desktop. FileZilla, despite having its Service set to log on using this new account, won't show the contents of this drive, or write anything to it. Likewise my Web App isn't able to access the file contents, despite switching Passthrough Authentication to this new account for the virtual folder.
Does anyone know any way of accessing this drive either through the network path or drive letter? Is this just not possible with Azure Files as they are? Are there any other solutions to sharing some blobs across VMs, but treating it as a local drive or network share?
[UPDATE]
This might help. Having set up the share, and having used cmdkey and net use in a cmd prompt run as a specially created user (as suggested in http://blogs.msdn.com/b/windowsazurestorage/archive/2014/05/27/persisting-connections-to-microsoft-azure-files.aspx), if I point a virtual folder in IIS at this share using that specific account and click Test Connection, I get:
Test: Authentication (green tick; "The specified user credentials are valid")
Test: Authorization (red cross; "The path does not exist or environment variables in the path could not be expanded to verify whether it exists.")
While still in a runas cmd prompt, I can access the share, so it's not a specific permissions issue. It just seems to be that IIS cannot use that user to access the share, for some reason. The limitation of Azure Files is that I cannot specifically grant any kinds of permissions on the folder within that share.
What worked for me is the following:
1. Create a new account
2. Set the IIS app pool identity to this specific user
3. Set the IIS app pool "Load User Profile" property to true
4. Start a cmd prompt as this user (runas)
5. Do cmdkey and net use (with the /persistent:yes switch), as you described
6. Create an IIS virtual directory with the physical path set to the UNC share path (not the mapped drive)
A little PowerShell snippet for point 5:
$share = "your-storage-account.file.core.windows.net\yoursharename"
$usr = "your-storage-account"
$key = "your-storage-key"
#Store credentials for the network share - must be done as the user that will run the app pool
cmdkey /add:$share /user:$usr /pass:$key
net use z: "\\$share" /user:$usr $key /persistent:yes
The answers here proved helpful.
Setup
1. Create a new user {appuser}
2. Open a command window as that user:
runas /user:{appuser} cmd.exe
3. In the new {appuser} cmd window, add the credentials:
cmdkey /add:{storage-account}.file.core.windows.net /user:{storage-account} /pass:{account-key}
4. Set the IIS application pool to use {appuser}
4b. Set LoadUserProfile to true
Notice there is no need for net use; you don't need the mapped drive.
Code
Now here's the key piece. From your app you must write to the UNC path:
\\{storage-account}.file.core.windows.net\{share}
ex.
File.WriteAllText("\\\\{storage-account}.file.core.windows.net\\share\\test.txt", "contents goes here");
