I am looking to do the following in Azure; however, I should point out that on my local machine I have no Visual Studio, no admin rights, no IT support and no tools (except SSMS), but I have a VERY strong drive to complete this work if it's possible.
I have created an Azure blob which receives a file each day (zipped) from a 3rd party. I am looking to do the following:
1) Unzip the data in an automated fashion
2) Get the data into an Azure SQL database (already created) in an automated fashion
What I want to know is whether this is possible using Azure alone, or am I going to need admin rights / Visual Studio? If it is possible, any directions you could point me in would be gratefully received!
Thanks
Dave
Based on your description, one approach would be to create a Blob Triggered Azure Function through the Azure Portal (Visual Studio is not required), unzip/process the file and save the desired data into Azure SQL. Moreover, considering that there is only one new file per day, prefer the Consumption Plan to optimize cost.
Find more details about Azure Function Blob Binding at https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob.
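To make the Function route more concrete, here is a rough sketch of what the Python version could look like (the table name, column names and connection string below are placeholders, and the zip is assumed to contain a single CSV file; the blob trigger itself is configured in function.json or through the portal):

    import csv
    import io
    import zipfile

    import azure.functions as func
    import pyodbc  # requires the ODBC Driver for SQL Server on the Functions host

    def main(inputblob: func.InputStream):
        # Read the zipped blob into memory and open the first file inside the archive
        archive = zipfile.ZipFile(io.BytesIO(inputblob.read()))
        first_file = archive.namelist()[0]
        with archive.open(first_file) as f:
            rows = list(csv.reader(io.TextIOWrapper(f, encoding="utf-8")))

        # Load the rows into an illustrative two-column staging table in Azure SQL
        conn = pyodbc.connect(
            "Driver={ODBC Driver 17 for SQL Server};"
            "Server=tcp:<your-server>.database.windows.net,1433;"
            "Database=<your-db>;Uid=<user>;Pwd=<password>;Encrypt=yes;"
        )
        cursor = conn.cursor()
        cursor.executemany(
            "INSERT INTO dbo.DailyImport (Col1, Col2) VALUES (?, ?)",
            rows[1:],  # skip the CSV header row
        )
        conn.commit()
        conn.close()

From there, a stored procedure in the database can reshape the staged rows however you need.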
Alternatively, spin up an Azure Data Factory; its copy activity can unzip the file for you (set the source dataset's compression type to ZipDeflate) and load the data into Azure SQL.
Is there a way to use Azure Search against Azure File Shares? I only see blob storage as an option. We have on-prem servers that sync files to Azure File Shares and would like to search inside those files in a web application.
At this moment, there's no way unless you manually query and push file content to your Azure Cognitive Search index. In the future, there is hope that you'll be able to trigger an Azure Function using this type of binding, which will make your life easier. You can follow / vote for this feature at the following link:
https://github.com/Azure/azure-webjobs-sdk-extensions/issues/14
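To illustrate the manual query-and-push approach above, a rough Python sketch (the share name, file path, index name and field names are all made up; it assumes an index with 'id' and 'content' fields already exists) might look like this:

    from azure.core.credentials import AzureKeyCredential
    from azure.search.documents import SearchClient
    from azure.storage.fileshare import ShareFileClient

    # Download the file's content from the Azure File Share (placeholder names)
    file_client = ShareFileClient.from_connection_string(
        conn_str="<storage-connection-string>",
        share_name="synced-files",
        file_path="docs/report.txt",
    )
    text = file_client.download_file().readall().decode("utf-8")

    # Push the text into a search index that has 'id' and 'content' fields
    search_client = SearchClient(
        endpoint="https://<your-search-service>.search.windows.net",
        index_name="files-index",
        credential=AzureKeyCredential("<admin-api-key>"),
    )
    search_client.upload_documents(documents=[{"id": "report-txt", "content": text}])

You would run something like this on a schedule (or from a timer-triggered Azure Function) to keep the index in sync with the share.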
Per the UserVoice page for Azure Search (https://feedback.azure.com/forums/263029-azure-search/suggestions/14274261-indexer-for-azure-file-shares), an Azure Files indexer is available in private preview (in fact it has been in this stage for almost two years now :)). The Search team would like you to reach out to them in case you're interested.
I'm new to the Azure ecosystem. I'm doing some research on copying data from on-prem to Azure. I found the following options:
AzCopy
Azure Data Factory (Copy Data Tool)
Data Management Gateway
Ours is a Microsoft shop, so I'm looking for tools that gel with the MS platform. Also, down the line, we want to automate the entire thing as much as we can, so I think Azure Storage Explorer is out of the question. Is there a preference among the above three, or are there any better tools?
I think you are mixing things up: the Copy Data Tool is just an Azure Data Factory wizard for setting up simple data movement between resources. Azure Data Factory uses the Data Management Gateway to reach on-premises resources such as files and databases.
What you want to do can be done with Azure Data Factory. I recommend using version 2 (even in its preview version) because its authoring experience is easier to understand if you are new to the tool. You can graphically configure linked services, datasets and pipelines from there.
I hope this helped, if you need further help just ask away!
If you're already familiar with SSIS, there's also the option to run SSIS packages in ADF (the Azure-SSIS integration runtime), which enables on-prem data access via a VNet.
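If part of the flow does end up being scripted on your side (for example, pushing extracts into a staging blob container that Data Factory then picks up), a minimal sketch with the azure-storage-blob Python package could look like this; the container name and local path are placeholders:

    from azure.storage.blob import BlobServiceClient

    # Connect using the storage account's connection string (placeholder)
    service = BlobServiceClient.from_connection_string("<storage-connection-string>")
    container = service.get_container_client("staging")

    # Upload a local file, overwriting any existing blob with the same name
    with open(r"C:\exports\daily_extract.csv", "rb") as data:
        container.upload_blob(name="daily_extract.csv", data=data, overwrite=True)

For large or recurring transfers, AzCopy or an ADF copy activity via the gateway is usually the better fit; a script like this is mainly handy for quick automation tests.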
I created a model in Azure ML Studio.
I deployed the web service.
Now, I know how to check one record at a time, but how can I load a csv file and make the algorithm go through all its records?
If I click on Batch Execution, it asks me to create an Azure storage account.
Is there any way to execute multiple records from a csv file without creating any other accounts?
Yes, there is a way and it is simple. What you need is the Excel add-in for Azure Machine Learning; you need not create any other account.
You can either read the Excel Add-in for Azure Machine Learning web services documentation or watch the Azure ML Excel Add-in video.
If you search for videos on the Excel add-in for Azure ML, you will find other useful videos too.
I hope this is the solution you are looking for.
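If you would rather script the scoring than use Excel, the request-response endpoint of a deployed Studio web service can also be called with many rows at once; a rough Python sketch (the URL, API key and input name "input1" are placeholders taken from the sample code Studio generates for your own service, and very large files may exceed the request-response payload limits) could look like this:

    import csv
    import json
    import urllib.request

    URL = "https://<region>.services.azureml.net/workspaces/<ws-id>/services/<svc-id>/execute?api-version=2.0&details=true"
    API_KEY = "<your-api-key>"

    # Read the records to score from a local CSV file (header row first)
    with open("records.csv", newline="") as f:
        reader = csv.reader(f)
        columns = next(reader)
        values = [row for row in reader]

    # Studio (classic) request-response payload: column names plus a list of value rows
    body = json.dumps({
        "Inputs": {"input1": {"ColumnNames": columns, "Values": values}},
        "GlobalParameters": {},
    }).encode("utf-8")

    request = urllib.request.Request(
        URL,
        data=body,
        headers={"Content-Type": "application/json",
                 "Authorization": "Bearer " + API_KEY},
    )
    with urllib.request.urlopen(request) as response:
        print(json.loads(response.read()))

This avoids the storage account that the Batch Execution option asks for, at the cost of keeping each request reasonably small.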
Is it not possible to rename an Azure Storage Table?
I cannot seem to find anything online (not even cmdlets). There are no options for this in Visual Studio Server Explorer, Cloud Storage Studio or TableXplorer.
You're correct. It is not possible to rename an Azure Storage Table (or Blob Container or Queue for that matter).
A possible solution would be to download all entities from the table and upload them again to another table. Once all entities are uploaded, you can then delete the old table. When downloading entities, please do keep continuation tokens in mind, as querying a table returns at most 1,000 entities per request.
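A rough sketch of that copy-then-delete approach with the azure-data-tables Python package (the table names are placeholders; list_entities pages through results for you, which takes care of the continuation tokens):

    from azure.data.tables import TableServiceClient

    service = TableServiceClient.from_connection_string("<storage-connection-string>")
    old_table = service.get_table_client("OldTableName")
    new_table = service.create_table_if_not_exists("NewTableName")

    # Copy every entity; list_entities transparently follows continuation tokens
    for entity in old_table.list_entities():
        new_table.create_entity(entity)

    # Only delete the source table once the copy has been verified
    service.delete_table("OldTableName")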
You can download all entities using either Cloud Storage Studio (or Azure Management Studio) from Cerebrata or TableXplorer. If you want, you can use Azure Management Cmdlets from Cerebrata as well. It has cmdlets to export a table (Export-Table) and restore a table (Restore-Table).
Now you can rename Azure Tables with Microsoft's "Microsoft Azure Storage Explorer" (version 0.8.3 and later). You can also rename containers and file shares with this tool; see the tool's release notes for details.
Note that this feature has the following disclaimer during usage.
Renaming works by copying to the new name, then deleting the source item. Renaming a table currently loses the table's properties and metadata, and may take a while if there are lots of entities.
Therefore this is not an actual renaming behind the scenes and incurs read/write/transaction costs.
You can also use AzCopy, which is a Microsoft command line tool for downloading/moving table data.
I've just discovered that I have hundreds of GB of log files/failed request logs on Azure Blob storage that have been accumulating over the years. Is there a tool or technique for managing them? The directory structure is convoluted, so it's not as easy as just sorting by date. (I use Cloud Storage Studio as an Azure management tool.)
[With apologies in advance if it feels like a product plug] You could look into Azure Diagnostics Manager (http://www.cerebrata.com/Products/AzureDiagnosticsManager). This tool is built specifically for viewing/managing Windows Azure Diagnostics data. You could also look into Azure Management Studio (http://www.cerebrata.com), which combines Cloud Storage Studio and Azure Diagnostics Manager into one product and is currently in public beta.
Both tools allow you to purge old data and search for log data based on date ranges.
(Disclosure: I'm part of Cerebrata team)
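If you'd rather script the purge than do it through a GUI, a small sketch with the azure-storage-blob Python package can delete blobs older than a cutoff; the container name (wad-iis-logfiles and wad-iis-failedreqlogfiles are the usual containers for IIS and failed-request logs) and the 90-day retention window are placeholders:

    from datetime import datetime, timedelta, timezone
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string("<storage-connection-string>")
    container = service.get_container_client("wad-iis-logfiles")

    cutoff = datetime.now(timezone.utc) - timedelta(days=90)

    # Walk every blob in the container and delete anything older than the cutoff
    for blob in container.list_blobs():
        if blob.last_modified < cutoff:
            container.delete_blob(blob.name)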