I am using the Azure Batch service and I am able to add applications, pools, and tasks to it. I have attached a resource file to each task. I am confused about how to access the resource file (the inputs provided to the task) from the application that I have uploaded to Azure Batch. Please help me with this.
If you have attached the ResourceFile(s) to individual tasks, then the file(s) are downloaded to the filePath you specified (if any), appended to the directory given by the task's AZ_BATCH_TASK_WORKING_DIR environment variable (https://learn.microsoft.com/en-us/azure/batch/batch-compute-node-environment-variables).
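For example, here is a minimal C# sketch of how your application could locate a resource file at runtime; the file name data.csv is an assumption, standing in for whatever blob name/filePath you actually attached:

```csharp
using System;
using System.IO;

class Program
{
    static void Main()
    {
        // AZ_BATCH_TASK_WORKING_DIR is set by the Batch agent for every task.
        string workingDir = Environment.GetEnvironmentVariable("AZ_BATCH_TASK_WORKING_DIR");

        // "data.csv" is a hypothetical resource file name; if you set a filePath
        // on the ResourceFile, use that relative path here instead.
        string inputPath = Path.Combine(workingDir, "data.csv");

        Console.WriteLine(File.Exists(inputPath)
            ? "Found resource file: " + inputPath
            : "Resource file not found at: " + inputPath);
    }
}
```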
For an ad-hoc look, you can download Batch Explorer (https://azure.github.io/BatchExplorer/) and navigate to an existing node and/or task to see, in its file explorer, how the directories on the node are laid out.
1. I have an Azure storage account.
2. Every morning, some files get pushed into a container that has a client-specific folder structure.
3. I have a function app which processes and converts these files and calls an external service to work on the processed files.
4. I also have a file share, which is mounted on a VM.
5. The external service, after processing the files (#3), generates the resultant success/failure files inside this file share (#4).
Now the ask is:
Create a simple dashboard which will monitor the storage account (and in effect the container and the file share). It should capture and show basic information, and should look like the table structure below (with three simple variations of data):
FileName | ReceivedDateTime | NumberOfRecords
Original_file.csv | 20221011 5:21 AM | 10
Original_file_Success.csv | 20221011 5:31 AM | 9
Original_file_Failure.csv | 20221011 5:32 AM | 1
Here, the first record is captured from the container, while the second and third are both generated in the file share.
Also, whenever a new failure file (e.g., Original_file_Failure.csv) is generated, it should send an email based on a predefined template, with the file name added, to a predefined recipient list.
Any guidance on which Azure service to use?
I have looked at Azure Monitor, workbooks, and other options, but I feel those would be overkill for such a simple requirement.
Thanks in advance.
I wonder where the files saved in the Azure Cloud Shell text editor are located in the Azure Portal. In my understanding, when I launched Cloud Shell the very first time, a storage account and a cloud-shell-storage-(region) resource group were automatically created for me in Azure Resources. But when I open the cloud-shell-storage-(region) resource group in the Azure Portal and go to Storage Explorer (preview) / FILE SHARES, I don't see any files that I previously saved in the Cloud Shell text editor. At the same time, when I run the ls command in Cloud Shell, I can see them all. Is it possible to see those files from the Azure Portal at all?
Only the files located in the /home/username/clouddrive directory can be seen in the storage file share. The files you store directly in /home/username are kept inside an .img file (e.g., acc_joy.img) in the .cloudconsole directory of your file share, so you cannot see them individually.
So if you want to access a file from the file share, you need to store it under /home/username/clouddrive. Follow the steps below.
Sample:
1. Run cd ./clouddrive/ to go to the clouddrive directory.
2. Run code testfile.txt (where testfile.txt is the file you want to create and store); Cloud Shell will open the editor for you. Enter your content, then save it.
3. Then go to your storage file share; you will find the file there.
It is also visible in Storage Explorer (preview).
I am trying to load a flat file to blob storage using ADF V2, and I have installed the self-hosted integration runtime for this. The integration runtime on my local machine shows that it has successfully connected to the cloud service, as in the screenshot below. However, while creating the linked service to the on-premises file, some credentials are required. I am not sure what username or password should be entered; I have tried both my on-premises and Azure passwords (just to try). Please see the screenshot.
Could you please guide me on how the connection can be made to a local flat file in my case?
Thanks
- Akshay
Note: you can choose a file while creating the File System source in ADF.
You may follow these steps to select the text file when creating a File System source:
First create a linked service as follows:
Host: **C:\AzureLearn\**
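For reference, here is a hedged sketch of what the resulting File System linked-service definition might look like in JSON. The userId is typically a Windows account on the on-premises machine that can read the folder; the runtime name MySelfHostedIR and the placeholder credentials are assumptions:

```json
{
  "name": "FileSystemLinkedService",
  "properties": {
    "type": "FileServer",
    "typeProperties": {
      "host": "C:\\AzureLearn",
      "userId": "<machine-or-domain>\\<windows-user>",
      "password": {
        "type": "SecureString",
        "value": "<windows-password>"
      }
    },
    "connectVia": {
      "referenceName": "MySelfHostedIR",
      "type": "IntegrationRuntimeReference"
    }
  }
}
```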
Create a copy activity and select Source as follows:
Click on Source => New
Select New DataSet => Select File => File System and continue
Select Format => Choose DelimitedText and continue
Select the previously created File System linked service and click on Browse.
Choose a file or folder.
Here you can find the files located under the folder you selected while creating the File System linked service.
Hope this helps.
The linked service connections to blob storage and Azure SQL Server are being blocked by my organisation's firewall. It won't let the self-hosted integration runtime on my system connect my resources to the public cloud.
I followed the same steps on my personal machine and everything went smoothly. I will get the firewall restrictions sorted out and update here with more information.
In my release pipeline to my integration environment, I want to restore the production database prior to running the migration script.
Both of my databases are hosted in Azure, so I thought I could use the Azure SQL Database Deployment task that is already integrated into Azure DevOps. I created two separate tasks: first export to a .bacpac file, and then import that .bacpac file again. Currently I am running into the following issue:
...
2018-11-22T09:02:28.9173416Z Processing Table '[dbo].[EmergencyContacts]'.
2018-11-22T09:02:31.6364073Z Successfully exported database and saved it to file 'D:\a\r1\a\GeneratedOutputFiles\DatabaseName.bacpac'.
2018-11-22T09:02:31.6726957Z Generated file D:\a\r1\a\GeneratedOutputFiles\DatabaseName.bacpac. Uploading file to the logs.
2018-11-22T09:02:31.6798180Z ##[error]Unable to process command '##vso[task.uploadfile] D:\a\r1\a\GeneratedOutputFiles\DatabaseName.bacpac' successfully. Please reference documentation (http://go.microsoft.com/fwlink/?LinkId=817296)
2018-11-22T09:02:31.6812314Z ##[error]Cannot upload task attachment file, attachment file location is not specified or attachment file not exist on disk
2018-11-22T09:02:31.7016975Z Setting output variable 'SqlDeploymentOutputFile' to 'D:\a\r1\a\GeneratedOutputFiles\DatabaseName.bacpac'
2018-11-22T09:02:31.7479327Z ##[section]Finishing: Azure SQL Export
Any ideas how I could solve this?
I currently have a web role which displays a webpage. The webpage allows the user to select a file from their computer, and the web role then uploads the file to Azure Blob storage.
However, the file the user uploads is usually a ZIP file, so I would like to unzip it, extract the contents, and then upload the contents to Azure Blob storage.
I have attempted to do this by using the SharpZipLib example I found here
http://blog.logiclabz.com/c/unzip-files-in-net-c-using-sharpziplib-open-source-library.aspx
I have added references in my web role to ICSharpCode.SharpZipLib.dll and ZipOperations.dll; however, I am still receiving the following errors:
Another thing I am confused about: when I call UnZipFile(...), what should the directory of the file I am uploading be? Would it be the ID of the form control the file is selected in?
Thanks in advance, Sami.
The ZIP file should be saved to your web role's local disk first; let's say you save it in a local resource. Then you can invoke SharpZipLib to extract the contents to another local resource, and finally upload the extracted files to blob storage.
Regarding Windows Azure local resources, please have a look at http://msdn.microsoft.com/en-us/library/windowsazure/ee758708.aspx
Regarding your errors, it looks like you didn't add the necessary "using" statements at the beginning of your code. For example, you need "using System.IO" before you can use File, Directory, etc. in your code.
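To tie it together, here is a rough C# sketch of the whole flow under stated assumptions: a local resource named "ZipStorage" declared in your service definition, a container named "uploads", and the WindowsAzure.Storage client library; all of these names are hypothetical:

```csharp
using System.IO;
using ICSharpCode.SharpZipLib.Zip;            // SharpZipLib's FastZip
using Microsoft.WindowsAzure.ServiceRuntime;  // RoleEnvironment
using Microsoft.WindowsAzure.Storage;         // CloudStorageAccount
using Microsoft.WindowsAzure.Storage.Blob;

public static class ZipUploader
{
    public static void ExtractAndUpload(string zipPath, string connectionString)
    {
        // "ZipStorage" is a hypothetical local resource declared in ServiceDefinition.csdef.
        string extractDir = Path.Combine(
            RoleEnvironment.GetLocalResource("ZipStorage").RootPath, "extracted");
        Directory.CreateDirectory(extractDir);

        // Extract the uploaded ZIP into the local resource directory
        // (null means no file filter: extract everything).
        new FastZip().ExtractZip(zipPath, extractDir, null);

        // Upload each extracted file to the "uploads" container (name assumed).
        CloudBlobContainer container = CloudStorageAccount.Parse(connectionString)
            .CreateCloudBlobClient()
            .GetContainerReference("uploads");
        container.CreateIfNotExists();

        foreach (string file in Directory.GetFiles(extractDir, "*", SearchOption.AllDirectories))
        {
            CloudBlockBlob blob = container.GetBlockBlobReference(Path.GetFileName(file));
            using (FileStream stream = File.OpenRead(file))
            {
                blob.UploadFromStream(stream);  // upload one extracted file
            }
        }
    }
}
```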