I have a lot of App Services across a lot of different storage accounts. I would like to consolidate some. Can you move Function Apps and App Services to new storage accounts? I have not found anything in the admin UI.
An App Service doesn't run on a storage account; it connects to one. That means you can simply switch connection strings. You should, however, think about migrating the data as well.
Azure Functions do have a storage account associated with them (although it, too, is connected to via a connection string) because the runtime uses it for managing triggers and for dashboarding functionality.
More information here: Storage considerations for Azure Functions.
When creating a function app, you must create or link to a general-purpose Azure Storage account that supports Blob, Queue, and Table storage. This is because Functions relies on Azure Storage for operations such as managing triggers and logging function executions.
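For a function app specifically, switching accounts means repointing the AzureWebJobsStorage app setting (and AzureWebJobsDashboard on older runtimes) at the new account. Below is a minimal sketch using the @azure/arm-appservice management SDK; the subscription ID, resource group, app name, and connection string are placeholders, not values from the question:

import { DefaultAzureCredential } from "@azure/identity";
import { WebSiteManagementClient } from "@azure/arm-appservice";

// Placeholder subscription; authentication uses whatever credential
// DefaultAzureCredential resolves (CLI login, managed identity, etc.).
const client = new WebSiteManagementClient(new DefaultAzureCredential(), "<subscription-id>");

async function repointStorage(): Promise<void> {
    // Read the current app settings so we only overwrite the storage entry.
    const settings = await client.webApps.listApplicationSettings("my-rg", "my-function-app");
    settings.properties = {
        ...settings.properties,
        // The account the Functions runtime uses for triggers, logs, etc.
        AzureWebJobsStorage: "DefaultEndpointsProtocol=https;AccountName=newaccount;AccountKey=...",
    };
    await client.webApps.updateApplicationSettings("my-rg", "my-function-app", settings);
}

repointStorage().catch(console.error);

The same pattern covers an App Service's own storage connection strings kept in app settings; just remember to copy over any blobs, queues, or tables the app already uses before retiring the old account.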
If the only job is to detect new files in a blob container, what is the best way to accomplish this: an Azure Function, an ADF pipeline, or a Logic App? (At the end we call a stored procedure which inserts into a table if the record doesn't exist.) In case of a failure we need some built-in logging so that no file is missed at the end of the day, e.g. if the SQL connection fails; the audit needs to be granular, queryable, etc.
I would suggest the combination of a blob container + Service Bus + Azure Function + SQL.
You can:
For every new blob created, generate a new message in a Service Bus queue
Create the Azure Function with a Service Bus queue trigger
Let the Azure Function insert the entry into the SQL table (a sketch follows below)
The main advantage here is that if the Azure Function fails to insert the SQL table entry due to a network issue, an outage, or a code bug, the original Service Bus message will not be removed until the configured retries have been attempted; you also have the dead-letter queue option to handle delivery failures.
You can configure the Service Bus queue event subscription in the Events tab of the storage account.
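Here is a minimal sketch of the trigger-and-insert steps using the Node.js v4 programming model and the mssql package. The queue name, connection setting names, and stored procedure are hypothetical:

import { app, InvocationContext } from "@azure/functions";
import * as sql from "mssql";

// Triggered for each Event Grid message that lands on the queue
// (one per blob created). Names below are placeholders.
app.serviceBusQueue("blobCreatedToSql", {
    connection: "ServiceBusConnection", // app setting with the namespace connection string
    queueName: "blob-created-events",
    handler: async (message: unknown, context: InvocationContext): Promise<void> => {
        // Event Grid's BlobCreated payload carries the blob URL in data.url.
        const blobUrl = (message as { data?: { url?: string } }).data?.url ?? "unknown";

        // If this throws (SQL down, network issue), the message is retried
        // and eventually dead-lettered, so no file is silently lost.
        const pool = await sql.connect(process.env.SQL_CONNECTION_STRING!);
        await pool
            .request()
            .input("BlobUrl", sql.NVarChar, blobUrl)
            .execute("dbo.usp_InsertBlobIfNotExists"); // hypothetical proc: insert if not exists
        context.log(`Recorded blob: ${blobUrl}`);
    },
});

Because the message is only completed when the handler returns without throwing, a failed SQL insert leaves it on the queue for retry, and the dead-letter queue plus the blob container itself give you a granular, queryable audit trail.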
We are building an enterprise application using UWP. We would like to monitor the application's performance using Microsoft Application Insights. App Insights telemetry data is logged directly into the Azure portal. For security reasons, we do not want to log the data outside our boundary. Is there any way to implement the APM without using Azure? What I mean is: we still want to use the App Insights services, but the data should be logged to an on-prem server and then visualized with some tool.
Thanks in advance.
You cannot directly send App Insights data out of Azure; it is stored and retained within App Insights only. But you can use options like Continuous Export to move the telemetry data to other Azure storage options like blobs or Data Lake Store.
https://learn.microsoft.com/en-us/azure/azure-monitor/app/export-telemetry
App Insights is various performance counters and other telemetry that we collect from the applications. If you are sure which metrics you want to collect, then turn off Application Insights and have those perf counter data custom-logged to blobs, or put them into some queue from where they will be sent to your on-premises storage (by some process). Or you can set up a log ingestion engine on-prem to which the apps in the cloud can send the data.
Having said that, App Insights is a cloud-native approach to application monitoring on Azure, which I believe would work better than other custom approaches. So you can explore the security concerns you have with App Insights and see how to mitigate them.
You could, if you want to invent your own whole ingestion/storage system. In the App Insights configuration you can change the endpoint the data created by the SDK is sent to.
In the JavaScript SDK it is something like this:
let config: ApplicationInsights_Types.IConfiguration = {
    // the endpoint is, by default, something like this:
    endpointUrl: "https://dc.services.visualstudio.com/v2/track",
    instrumentationKey: this._instrumentationKey
};
You'd have to find the corresponding setting in whatever SDK you are using, then invent the whole backend, storage, and query system to keep all that data.
The point of APM services like Application Insights and others is to not have to do all that work yourself.
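For a sense of scale, a toy version of that backend could look like the sketch below, assuming Express and a local file as the "storage system". This is purely illustrative: the real SDK batches envelopes and may gzip them, and the endpoint path and response shape here merely mimic the public service.

import express from "express";
import { appendFileSync } from "fs";

// Toy stand-in for the ingestion service behind /v2/track.
const server = express();
server.use(express.json({ type: "*/*", limit: "1mb" }));

server.post("/v2/track", (req, res) => {
    // "Storage layer": one telemetry envelope per line in a local file.
    appendFileSync("telemetry.ndjson", JSON.stringify(req.body) + "\n");
    // Acknowledge so the SDK doesn't retry.
    res.status(200).json({ itemsReceived: 1, itemsAccepted: 1, errors: [] });
});

server.listen(8080, () => console.log("Toy ingestion endpoint on :8080"));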
We need to ingest messages from an Event Hub and Topic into a third-party system. However, this system doesn't support Event Hubs and Topics yet, but it does support Blob storage. We'd like to temporarily drop these Event Hub and Topic messages into a Blob.
I thought I could use Data Factory for this, but it doesn't seem like it can connect to an Event Hub or Topic.
Using the Azure platform, what are my options for placing messages from an Event Hub and Topic into Blob storage?
Is the solution going to be hosted on Azure (I can only assume) or on-premises?
What's the load on the system? I'm assuming that if you are going with Event Hubs, there's a substantial number of messages. Do you need to perform any processing aside from storing to Storage blobs? Do you want to go with PaaS/serverless/VMs?
Bottom line, there's no best way. Period.
There are scenarios and options you can choose from.
Without knowing specific details of your scenario, it's easy to point to a solution which is not necessarily a correct one.
You could try Azure Functions as a starting point. They offer incoming bindings for both Event Hubs and Service Bus, and a Storage blob output binding.
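A minimal sketch of the Event Hubs half using the Node.js v4 programming model; the hub, container, and connection setting names are hypothetical, and the Service Bus topic case just swaps the trigger:

import { app, output, InvocationContext } from "@azure/functions";

// Blob output binding; {rand-guid} makes each invocation write a new blob.
const archiveBlob = output.storageBlob({
    path: "archive/{rand-guid}.json",
    connection: "ArchiveStorageConnection",
});

app.eventHub("archiveEvents", {
    connection: "EventHubConnection",
    eventHubName: "incoming-events",
    cardinality: "many", // receive batches rather than single events
    extraOutputs: [archiveBlob],
    handler: async (messages: unknown[], context: InvocationContext): Promise<void> => {
        // Persist the whole batch as one JSON blob for the downstream system.
        context.extraOutputs.set(archiveBlob, JSON.stringify(messages));
        context.log(`Archived ${messages.length} events`);
    },
});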
We mistakenly set up an Azure storage account in the wrong location, West Europe.
However, we need it to be in North Europe.
Is there a way to transfer a whole storage account?
We don't really need the old data in the container.
But if we just delete the account and recreate it in the new location, it will generate new access keys, which we don't want.
Is there any way to either manually set the access keys on a new storage account or move the storage account between regions?
Either solution works for us, moving it or deleting and recreating it with the same access keys, but we can't have the new storage account with different access keys. We don't care whether or not the data comes across.
I can't see a way of setting access keys in the web portal; maybe this is possible programmatically, but I've searched and can't find anyone else with samples of this.
This might get closed due to it being an Azure infrastructure, vs. programming, question (and would fit better on ServerFault), though it could be argued that, since you need keys to access storage from your code (or via Azure SDKs), it's "close" to programming-related.
That said: You can't just move a storage account. You'll need to delete and re-create, which will give you new keys. You cannot provide your own keys, so you cannot copy keys from your old storage account to your new storage account.
Regarding the API (and the portal, SDKs, and PowerShell cmdlets, all built upon the API): the API only allows you to trigger a regeneration of either the primary or secondary key. There's no way to pass in your own key.
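To illustrate, the only key mutation the management SDK exposes is a regenerate call; a sketch with @azure/arm-storage, assuming placeholder resource names:

import { DefaultAzureCredential } from "@azure/identity";
import { StorageManagementClient } from "@azure/arm-storage";

const client = new StorageManagementClient(new DefaultAzureCredential(), "<subscription-id>");

async function rotateKey(): Promise<void> {
    // Regeneration is the only write operation on keys; nothing in the
    // API accepts a caller-supplied key value.
    const result = await client.storageAccounts.regenerateKey("my-rg", "mystorageacct", {
        keyName: "key1", // or "key2"
    });
    console.log(result.keys?.map((k) => k.keyName));
}

rotateKey().catch(console.error);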
Does Azure have a low-cost cloud storage service like Amazon Glacier?
No, Microsoft Azure does not offer a service equivalent to Amazon Glacier. Glacier is built on top of Amazon S3; the equivalent of Amazon S3 is Microsoft Azure Blob Storage.
UPDATE - 06-November-2017
Recently, Microsoft Azure introduced a new access tier called the Archive tier, which is similar to Amazon Glacier (and other cloud providers' long-term storage solutions for archival purposes). You can read more about it here: https://azure.microsoft.com/en-in/blog/announcing-the-public-preview-of-azure-archive-blob-storage-and-blob-level-tiering/. I also wrote a blog post about the same topic that you can read here: https://gauravmantri.com/2017/10/15/understanding-azure-storage-blob-access-tiers/.
As an update to the other answer, Azure does have a semi-equivalent service in its Blob storage in that you can set up your storage with a "Cool" access tier. You pay less per GB of storage, but more for access against that data. In contrast to Amazon Glacier, you don't have the retrieval delay that Glacier comes with, but you do pay the same price or more (depending on your Glacier retrieval timing).
On the flip side, you can set up storage with a "Hot" access tier and pay ~80% more per GB stored, but half the price for access operations.
You can find the current pricing for Azure Blob storage at https://azure.microsoft.com/en-us/pricing/details/storage/blobs/ and the current pricing for the various Glacier retrieval tiers at https://aws.amazon.com/glacier/pricing/.
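With the blob-level tiering mentioned in the update above, the tier can also be set per blob. A sketch with the @azure/storage-blob SDK; the connection string, container, and blob name are placeholders:

import { BlobServiceClient } from "@azure/storage-blob";

// Placeholder connection string; blob-level tiering requires a
// general-purpose v2 or Blob storage account.
const service = BlobServiceClient.fromConnectionString(process.env.STORAGE_CONNECTION!);

async function archiveBlob(): Promise<void> {
    const blob = service.getContainerClient("backups").getBlobClient("2017-10.tar.gz");
    // "Hot", "Cool", or "Archive"; moving out of Archive triggers rehydration.
    await blob.setAccessTier("Archive");
}

archiveBlob().catch(console.error);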