On Google Compute Engine, how can I limit the network egress or total bill for each month? I am concerned about possible security issues and want to limit my financial exposure.
There's no way to cap your usage in Compute Engine, unlike App Engine. However, you can use the Detailed Usage Export feature to generate reports that may be of help, and write a script to review that information.
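For instance, here is a minimal sketch (Python) of such a review script, assuming the export lands as a CSV file. The file name, column names, and the NETWORK_EGRESS label are all assumptions to verify against the header row of your actual export:

    import csv

    # Assumed layout of the Detailed Usage Export CSV; verify the column
    # names against the header row of your own export file.
    USAGE_FILE = "usage_export.csv"   # hypothetical file name
    EGRESS_BUDGET_GIB = 100.0         # your own monthly egress cap

    total_egress_gib = 0.0
    with open(USAGE_FILE, newline="") as f:
        for row in csv.DictReader(f):
            # Sum only network egress line items (assumed label).
            if "NETWORK_EGRESS" in row.get("MeasurementId", ""):
                total_egress_gib += float(row["Quantity"])

    if total_egress_gib > EGRESS_BUDGET_GIB:
        print(f"Egress {total_egress_gib:.1f} GiB exceeds the "
              f"{EGRESS_BUDGET_GIB} GiB budget -- investigate!")

Run something like this daily (e.g. from cron) and you at least get an early warning, even though nothing in Compute Engine will hard-stop the spend for you.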
I have 100,000+ addresses that need to be geocoded, but most geocoding services start to cost money after the first couple of thousand entries. Is there any service that can geocode what I need, either in one go or in a relatively short amount of time (a couple of weeks at most), while being free or at least relatively cheap?
Also, I may have access to ArcGIS, and I'm wondering whether ArcGIS's geocoding service has a limit if you have the premium version.
The answer is no: there are no free services that will handle 100,000+ entries.
One could argue that you can write a script that runs geocoding requests in sequence or in parallel, e.g. Google's geocoding API with 100 instances each running at 2,500 requests/day, but that is unrealistic and violates their Terms of Service.
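A quick back-of-the-envelope calculation shows why a single legitimate instance doesn't fit your timeframe either (the 2,500/day figure is the free quota mentioned above):

    # How long sequential geocoding takes under a 2,500 requests/day quota.
    addresses = 100_000
    free_quota_per_day = 2_500
    print(addresses / free_quota_per_day)  # 40.0 days, i.e. nearly 6 weeks

So within one key's free quota you are looking at roughly 40 days, well past the "couple of weeks at most" constraint.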
To my knowledge, ArcGIS has a limit as well.
Your best bet: if this is company or business related, have them foot the bill.
I am building a Google application using Google Nearby, and there is a possibility that I might go live in production with it. Since I couldn't find a support email for Google Nearby, does anyone know if it is free to use? It looks like the daily limit is 8,640,000 requests. What happens if you exceed that limit? Is there a premium package where I could pay for a higher daily limit?
Secondly, does anyone know a better ultrasound-based communication API than Google Nearby?
Yes, you are right: the daily limit for the Nearby API is 8,640,000 requests. If you require more, you can apply for a quota increase in your Google project console, under the Quotas section of the Nearby API. An edit icon next to the daily quota gives you the option to apply for a higher limit.
I am trying to understand how Azure Mobile Services scaling work.
The screenshot below was taken from the Mobile Services scale tab in the Azure portal. I am using the BASIC tier.
When setting SCALE-BY METRIC to NONE, we pay a fixed minimum of UNIT COUNT * $14.99.
For example, if I set UNIT COUNT to 6, then I'll pay 6 * $14.99 = $89.94 every month no matter how many API calls are made. Is my understanding correct?
When setting SCALE-BY METRIC to API CALLS, we can set a minimum and a maximum UNIT COUNT. This is suitable when we have few API calls on some days but more on others. Is this correct?
Before moving from development to production, how do we anticipate which scaling option to use? Do we continuously monitor the API call count and change the scaling option when we think we will go over the limit?
thanks!
I'm not affiliated with MS or Azure, so these answers are from my personal understanding.
Yes, this is my understanding as well. You can also check this behaviour by extrapolating the predicted costs in your bill.
Yes.
I have a few apps running on Windows Azure Mobile Services in the Store. Usually I start by estimating the typical API calls per session, which can be done quite easily during development. This gives you an idea of how many users one instance can support. Example: in one of my simpler apps, one instance could serve about 800-1000 daily active users. Even with this information, I usually set the scaling to the maximum for launch day, just to be safe. If it does scale to the maximum, it will most likely be just for a day, and if it isn't, well, in that case congrats!
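As a sketch of that estimation, assuming a Basic unit includes roughly 1.5 million API calls per month (check your plan's current quota; the per-session numbers below are purely illustrative):

    # Rough capacity estimate for one Azure Mobile Services Basic unit.
    CALLS_PER_UNIT_PER_MONTH = 1_500_000   # assumed Basic-tier quota per unit
    calls_per_session = 60                 # measured during development (example)
    sessions_per_user_per_day = 1
    days_per_month = 30

    calls_per_user_per_month = (calls_per_session
                                * sessions_per_user_per_day
                                * days_per_month)
    users_per_unit = CALLS_PER_UNIT_PER_MONTH // calls_per_user_per_month
    print(f"One unit supports roughly {users_per_unit} daily active users")
    # With these numbers: 1_500_000 // 1_800 = 833 users per unit, in line
    # with the 800-1000 figure above.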
With the new Azure SQL Database tier structure, it seems important to monitor your database "DTU" usage to know whether to upgrade or downgrade to another tier.
The Azure SQL Database Service Tiers and Performance Levels documentation only talks about monitoring CPU, Data, and Log percentage usage.
But when I add new metrics, I also have a DTU percentage option:
I can't find anything about this online. Is it essentially a summary of the other DTU-related metrics?
A DTU is a unit of measure for the performance of a service tier and is a summary of several database characteristics. Each service tier has a certain number of DTUs assigned to it as an easy way to compare the performance level of one tier versus another.
Database Throughput Unit (DTU): DTUs provide a way to describe the relative capacity of a performance level of Basic, Standard, and Premium databases. DTUs are based on a blended measure of CPU, memory, reads, and writes. As DTUs increase, the power offered by the performance level increases. For example, a performance level with 5 DTUs has five times more power than a performance level with 1 DTU. A maximum DTU quota applies to each server.
The DTU quota applies to the server, not the individual databases; each server has a maximum of 1600 DTUs. The DTU% is the percentage of units your particular database is using, and it seems this number can go above 100% of the DTU rating of the service tier (I assume up to the limit of the server). This percentage is designed to help you choose the appropriate service tier.
From near the bottom of this announcement:
For example, if your DTU consumption shows a value of 80%, it indicates it is consuming DTU at the rate of 80% of the limit an S2 database would have. If you see values greater than 100% in this view, it means that you need a performance tier larger than S2.
As an example, let's say you see a percentage value of 300%. This tells you that you are using three times more resources than would be available in an S2. To determine a reasonable starting size, compare the DTUs available in an S2 (50 DTUs) with the next higher sizes (P1 = 100 DTUs, or 200% of S2; P2 = 200 DTUs, or 400% of S2). Because you are at 300% of S2, you would want to start with a P2 and re-test.
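That rule of thumb is easy to turn into code. A small sketch, using only the tier sizes quoted above (S2 = 50 DTUs, P1 = 100, P2 = 200; current tier sizes may differ, so verify against the docs):

    # Pick a starting tier from an observed DTU% measured against S2.
    observed_pct_of_s2 = 300               # e.g. the 300% example above
    s2_dtus = 50
    required_dtus = s2_dtus * observed_pct_of_s2 / 100   # -> 150 DTUs

    # Tier sizes as quoted in the announcement above.
    tiers = [("S2", 50), ("P1", 100), ("P2", 200)]
    choice = next((name for name, dtus in tiers if dtus >= required_dtus),
                  "something larger than P2")
    print(f"Start with {choice} and re-test")            # -> P2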
Still not cool enough to comment, but regarding @vladislav's comment: the original article was fairly old. Here is an updated document regarding DTUs, which helps answer the OP's question.
https://learn.microsoft.com/en-us/azure/sql-database/sql-database-what-is-a-dtu
According to this document, the DTU percentage is determined by the following query:
-- The DTU percentage is the highest of the three per-resource utilisation
-- percentages recorded in sys.dm_db_resource_stats.
SELECT end_time,
       (SELECT MAX(v)
        FROM (VALUES (avg_cpu_percent),
                     (avg_data_io_percent),
                     (avg_log_write_percent)) AS value(v)) AS [avg_DTU_percent]
FROM sys.dm_db_resource_stats;
So it is simply the maximum of avg_cpu_percent, avg_data_io_percent, and avg_log_write_percent.
Reference:
https://learn.microsoft.com/en-us/sql/relational-databases/system-dynamic-management-views/sys-dm-db-resource-stats-azure-sql-database
DTU is nothing but a blend of CPU, memory, and IO. Why do we need a blend when these three are pretty clear on their own? Because we want a single unit for "power". But it is still confusing in many ways.
E.g.: if I simply increase memory, will it increase power (DTU)? And if yes, how can DTU be a blend? The answer is yes: in this memory-increase case, per the query in jyong's answer, the DTU will effectively track the memory-related metric (since we increased it). MS even built a pricing model on top of DTUs, and it raised many questions.
Because of these confusions and questions, MS wanted to bring in another option.
We already had familiar specs on-premises, so why not use them? As a result, the "vCore pricing model" was born. In this model we have visibility into RAM and CPU, which we do not in the DTU model.
The counter-argument from the DTU side would be that DTU measures are calibrated using a benchmark that simulates a real-world database workload, and that we are not on-premises anymore ;). Yes, it is designed with cloud computing in mind (but is also used for OLTP workloads).
But that is not all. Once pricing enters the picture, the equation changes: the question now is about money and the bundle (which features are included). Here DTU has some advantages (the way I see it), but enterprises with many existing licenses would disagree.
DTU has one price (compute + storage + backup bundled). It is simpler, and you can start at a lower price point.
vCore prices compute and storage separately. Software Assurance is available here: enterprises with existing on-premises licenses can port them (so they get big machines for less than in the DTU model), and they can commit for multiple years for additional discounts.
We can switch between the two when needed, so if you're not sure, start with DTU (Basic/Standard/Premium).
How can we know which pricing tier to use? Go to the database's Configure menu as shown below; there you can switch between the two models.
Even though vCore means a bigger "machine" for bigger things, it can sometimes be cheaper for enterprise organizations. Here is a proof: the DTU option costs $147, but the vCore option costs $111. That is because you can commit for 3 years (but still pay monthly) and because of the license re-use option (enterprises will have on-premises licenses).
This goes a bit beyond the direct question, but I am going to make this answer complete by also covering "how to choose between the different options within DTU, let alone between DTU and vCore". This is answered in this blog, and its flowchart explains it all.
Whether your services are free (under the always-free or 12-months-free offers) or Pay-As-You-Go, it is important to monitor usage so that you know the cost incurred up front, and when to upgrade your service tier.
To check your free service usage and its limits, go to the search box in the portal, search for "Subscription", and click on it. You will see the details of each service that you have used.
With a free Azure account from Microsoft, you can see the cost incurred for each one.
Visit "Check usage of free services included with your Azure free account".
Hope this helps someone!
I am writing a Java application that needs to get images from Google image search (or similar).
It works via a simple web call:
http://ajax.googleapis.com/ajax/services/search/images?v=1.0&q=japan
However, the response is limited to 8 results per page, and Google search is free for only 100 requests per day, so at most 800 results per day. Is that correct?
The replacement for that deprecated API is here: https://developers.google.com/custom-search/v1/overview and for billing they note:
Any usage beyond the free usage quota will fail if you are not signed up for billing. Once you have enabled billing, you will continue to receive 100 free queries per day. However, you will be billed for all additional requests at the rate of $5 per 1000 queries, for up to 10,000 queries per day.
So the alternative is, simply, to pay for what you need.
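For reference, a minimal sketch (in Python for brevity) of an image query against the replacement Custom Search JSON API; the key and cx values are placeholders you obtain from the API console:

    import json
    import urllib.parse
    import urllib.request

    API_KEY = "YOUR_API_KEY"               # placeholder credentials
    CX = "YOUR_SEARCH_ENGINE_ID"

    params = urllib.parse.urlencode({
        "key": API_KEY,
        "cx": CX,
        "q": "japan",
        "searchType": "image",             # restrict results to images
        "num": 10,                         # max results per request
    })
    url = "https://www.googleapis.com/customsearch/v1?" + params

    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)

    for item in data.get("items", []):
        print(item["link"])                # direct URL of each image result

Each request returns at most 10 items; you page through more with the start parameter, and every request counts against the 100/day free quota.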
There is lots of information from a similar question here: What's the best web image search API?
Somebody on that thread notes:
As of August 2012, Bing API will be part of Azure Marketplace: free access still available, but limited to a number of queries (5000/month). Google limits the free access to 100 queries/day.
So perhaps Bing has sufficient free quota for your needs? There are other options on that page as well; check it out.