What are the Map Sessions limitations for the free plan offered by Skobbler (Telenav)? - skmaps

I plan to use the Skobbler maps iOS SDK for one of my projects, but before using it I need to know the map-session limitations of the free plan. To that end I browsed the company's website and found two different answers to my question:
1 - The first one that I found is located in the plans' description:
(Source of this screenshot)
First 1 Million FREE.
2 - The second one that I found is located in the terms of API usage:
(Source of this screenshot)
50,000 monthly map views.
Billable Event >250,000 map sessions (including online routing): $0.50/1,000 additional map sessions.
So that page mentions 50,000 monthly map views, and billing apparently starts once the number of requests exceeds 250,000 map sessions.
I do not know whether all of this information is up to date, but for the moment it really confuses me, and it would be really helpful if Skobbler (Telenav) or someone on this forum could clarify the pricing that applies to an iOS developer using the Skobbler FREE plan.

With the 2.4 update, due to an error, the T&C were reverted to the pre-October 2014 version.
It will be fixed right away - the T&C will properly reflect what is on the plans page:
the first 1 million map sessions are free
the first 10k small & large maps are free
$0.50 per 1,000 map sessions
$0.02 for a small offline map
$0.08 for a large offline map
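
As a rough illustration of how those figures combine, here is an unofficial back-of-envelope estimate based only on the numbers listed above (how the free 10k map allowance is split between small and large maps is not specified, so the split below is an assumption):

# Rough monthly cost estimate based only on the figures quoted above.
# This is not an official Skobbler calculator.
FREE_SESSIONS = 1_000_000     # first 1 million map sessions free
FREE_MAPS = 10_000            # first 10k offline map downloads free
PRICE_PER_1K_SESSIONS = 0.50  # USD per 1,000 additional map sessions
PRICE_SMALL_MAP = 0.02        # USD per additional small offline map
PRICE_LARGE_MAP = 0.08        # USD per additional large offline map

def monthly_cost(sessions: int, small_maps: int, large_maps: int) -> float:
    billable_sessions = max(0, sessions - FREE_SESSIONS)
    billable_maps = max(0, small_maps + large_maps - FREE_MAPS)
    # Assumption: once the free allowance is used up, bill the cheaper
    # (small) maps first; the real billing split is not documented here.
    billable_small = min(billable_maps, small_maps)
    billable_large = billable_maps - billable_small
    return (billable_sessions / 1_000 * PRICE_PER_1K_SESSIONS
            + billable_small * PRICE_SMALL_MAP
            + billable_large * PRICE_LARGE_MAP)

# Example: 1.2M sessions and 12k small maps in one month
# -> 200k billable sessions ($100) + 2k billable small maps ($40) = $140
print(monthly_cost(1_200_000, 12_000, 0))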

Related

Azure live video streaming: how do I calculate the cost by audience size?

I want to run a live event with 100k viewers or more for 3 hours. How can I calculate the cost?
I read the documentation, but the Azure docs are quite ambiguous regarding pricing.
There is really no shortcut where you can just enter an audience size and get a result - different factors will affect your approach, including bitrates, support level, number of events, redundancy, and the geographic reach required.
A typical approach is to build a model, perhaps in Excel, and test different combinations and scenarios in the Azure online cost calculator.
You can then experiment with changing details like regions, support level, etc. to see the impact.
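
To get a first-order number to feed into such a model, a back-of-envelope delivery calculation helps. The sketch below is only illustrative: the $0.05/GB delivery price is a placeholder, it assumes every viewer pulls a single average bitrate for the whole event, and encoding, packaging/streaming units and CDN charges are separate line items you would take from the Azure cost calculator:

# Back-of-envelope delivery-cost model for a live stream.
# All prices are placeholders; plug in the rates from the Azure cost calculator.
# Encoding, packaging/streaming units and CDN are priced separately.

def delivery_gb(viewers: int, bitrate_mbps: float, hours: float) -> float:
    # Total data delivered, assuming every viewer watches the full event.
    seconds = hours * 3600
    megabits = viewers * bitrate_mbps * seconds
    return megabits / 8 / 1000  # megabits -> megabytes -> gigabytes

def delivery_cost(viewers: int, bitrate_mbps: float, hours: float,
                  price_per_gb: float) -> float:
    return delivery_gb(viewers, bitrate_mbps, hours) * price_per_gb

# Example: 100,000 viewers, 3 Mbps average rendition, 3 hours, at an assumed
# $0.05/GB -> roughly 405,000 GB delivered and about $20,000 for delivery alone.
print(delivery_gb(100_000, 3.0, 3.0))
print(delivery_cost(100_000, 3.0, 3.0, 0.05))

Varying the bitrate mix, audience size and per-GB rate in a spreadsheet built around this kind of formula is exactly the modelling exercise described above.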

How high will the resolution be for OpenMapTiles Aerial/Satellite Imagery?

So this is more of an inquiry.
I visited https://openmaptiles.com/satellite/ and it is my understanding that "Highres aerial imagery" is still to be added. So, my first question: will this imagery be comparable to the imagery we see in Google Maps and Bing Maps? I attached some of OpenMapTiles' lower-resolution imagery for comparison. Also, is there a scheduled release date for coverage of the US? Maybe I'm getting ahead of things here, but will the imagery be available for download for a one-time fee, and will it be as affordable as the OpenStreetMap tiles?
I guess this is related to the cost of acquiring high-resolution data. At the moment, free data sources such as Sentinel-2, Landsat-8, and CBERS-4 offer 5 m to 30 m resolution. Higher-resolution data like what one sees in Google Maps and Bing Maps is very expensive, as it requires commercial satellite imagery or aerial photographs. Governments and local municipalities often provide high-resolution geospatial data for free on their websites (as open data), if you only need data for a few urban areas.

How to export specific price and volume data from the LMAX level 2 widget to Excel

Background -
I am not a programmer.
I do trade spot forex on an intraday basis.
I am willing to learn programming.
Specific Query -
I would like to know how to export into Excel in real time 'top of book' price and volume data as displayed on the LMAX level 2 widget/frame on -
https://s3-eu-west-1.amazonaws.com/lmax-widget/website-widget-quote-prof-flex.html?a=rTWcS34L5WRQkHtC
In essence I am looking to export:
1) price and volume data where the coloured flashes occur;
2) price and volume data where the coloured flashes do not occur.
I understand that 1) and 2) will encompass all the top-of-book prices and volume. However, I would like to keep 1) and 2) separate/distinguished as far as data collection is concerned.
Time period for which the collected data is intended to be stored: 2-3 hours.
What kind of languages do I need to know to do the above?
I understand that I need to be an advanced Excel user too.
Long term goals -
I intend to use the above information to make discretionary intraday trading decisions.
In the long run I will get more involved with creating an algo or indicator to help with the decision making process, which would include the information above.
I have understood that one needs to know how to code to get involved in activities such as the above, hence I have started learning C++, mostly to get a feel for coding.
I have been searching all over the web for where to start in this endeavor, but I am quite confused and overwhelmed by all the information.
Hence apart from the specific data export query, any additional guidelines would also be helpful.
As of now I use MT4 to trade, so I believe that to do the above I will need more than just MT4.
Any help would be highly appreciated.
Yes, MetaTrader 4 is still not able (in spite of all the white-labeled terminals' OrderBook add-on marketing and PR efforts) to provide OrderBook-L2/DoM data to your MQL4 / New-MQL4 algorithm for any decision making. Third-party software integration is needed to make MQL4 code aware of the real-time L2/DoM data.
The LMAX widget has an impressive look & feel; however, re-using it for your Excel export as an automated scanner producing data for 1) and 2) would take a lot of programming effort, and there may be further, non-technical trouble from legal / operational restrictions on operating an automated scanner against such a data source. As an example, one data publisher's policy restricts automated options-pricing scanners for options on { FTSE | CAC | AMS | DAX } to re-visiting the online published data no more than once per quarter of an hour, and they get blocked / black-listed otherwise. So care and proper data-source engineering are needed here.
The size of the data collection is another issue. Excel has limits on the number of rows/columns that can be imported (about 1,048,576 rows per worksheet in current versions), and the larger the data files, the more likely CSV imports are to hit those limits. L2/DoM data collected over 2-3 hours for just one single FX major may go beyond such a limit, as there are many records per second (tens, if not hundreds, with just a few milliseconds between them). Static files of collected data records typically take several minutes just to be written to disk, so a properly distributed processing data-flow design and non-blocking file-I/O engineering are a must.
Real-time system design is the right angle from which to view the problem, rather than treating it as just a programming-language exercise. Mastering a programming language is a great move; nevertheless, robust real-time system design - and trading software is such a domain - requires, with all respect, a lot more insight and hands-on experience than making MQL4 code run multi-threaded & multi-processed with a few DLL services in a Cloud/Grid-based distributed processing system.
How much real-time traffic is expected to be there?
For just a raw idea of what the market can produce per second, per millisecond, per microsecond, consider a NYNEX traffic analysis for a single instrument: a single second can contain a wild relief of activity, and the detail only grows once you look at 5-msec sampling.
How to export
Check whether the data-source owner legally permits your automated processing.
Create your own real-time DataPump software, independent of the HTML-wrapped widget (a minimal sketch of this and the next step follows below).
Create your own 'DB-store' to efficiently off-load the scanned data records from the real-time DataPump.
Test the live data-source >> DataPump >> DB-store chain for performance & robustness, so it can serve error-free on a 24/6 duty for several FX majors in parallel.
Integrate your DataPump-fed DB-store as a local data source for on-line/off-line interaction with your preferred { MT4 | Excel | quantitative-analytics } package.
Add monitoring for any production-environment irregularity in your real-time processing pipeline, which may range from network issues, VPN / hosting issues and data-source availability issues to an unexpected change in the scanned data source's format or access conditions.
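
As a minimal sketch of steps 2 and 3 above (the DataPump and the DB-store), the Python snippet below polls a data source and appends every top-of-book snapshot into a local SQLite file; fetch_top_of_book() is a placeholder you would have to wire to a feed you are legally permitted to scan, since the LMAX widget itself exposes no documented export API here:

# Minimal DataPump -> DB-store sketch for steps 2 and 3 above.
# fetch_top_of_book() is a placeholder: wire it to a feed whose owner permits
# automated processing; the LMAX widget exposes no documented export API here.
import sqlite3
import time
from datetime import datetime, timezone


def fetch_top_of_book() -> dict:
    # Placeholder: return one top-of-book snapshot from your permitted feed,
    # e.g. {"bid": 1.0852, "bid_qty": 1.2, "ask": 1.0853, "ask_qty": 0.8,
    #       "flashed": True}
    raise NotImplementedError("wire this to your real-time data source")


def main(db_path: str = "l2_top_of_book.sqlite", poll_seconds: float = 0.5) -> None:
    db = sqlite3.connect(db_path)
    db.execute("""CREATE TABLE IF NOT EXISTS top_of_book (
                      ts TEXT, bid REAL, bid_qty REAL,
                      ask REAL, ask_qty REAL, flashed INTEGER)""")
    while True:
        snap = fetch_top_of_book()
        db.execute("INSERT INTO top_of_book VALUES (?, ?, ?, ?, ?, ?)",
                   (datetime.now(timezone.utc).isoformat(),
                    snap["bid"], snap["bid_qty"],
                    snap["ask"], snap["ask_qty"],
                    int(snap.get("flashed", False))))  # keeps 1) and 2) separable
        db.commit()
        time.sleep(poll_seconds)


if __name__ == "__main__":
    main()

Keeping the records in SQLite (or any proper store) instead of a growing CSV sidesteps Excel's roughly 1,048,576-row worksheet limit: Excel, MT4 or an analytics package can later query only the slice it needs.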

How to send a server update to the browser?

I have a simple problem which I need to solve. The solution might not be that simple, though...
I have a web application at example.com and multiple users are using it at the same time. At some moment, 10 of those users are all looking at the Countries page, which lists about 200+ countries. This page divides the list into sections of 20 countries, and some users might have filters, e.g. to show only the countries starting with the letter N.
Then one user decides to rename Netherlands to Holland, and I want all other users to see this change (almost) immediately.
Of all those users, some might have been looking at a different set of countries. The Netherlands would not be visible, so they should not be updated. One user might be looking at all countries starting with N so he needs to see that Netherlands disappears. One might be looking at the first 20 countries and Netherlands is at place 50 and Holland at place 32, so he doesn't see either of them, so no update. And one user looks at all countries starting with H and he needs to get an update since Holland is added. Finally, one user looks at all countries that use Euro's as currency and he should see an update where Netherlands changes into Holland.
Basically, just a lot of updates based on the actions of a single user. Considering that I might have up to 2,000 users and that the Countries table is very popular, I have to be careful about performance.
So, what would be the best approach for this?
Since I use the DevExpress ASP.NET components, there might be something very useful for this in these components. Unfortunately, I'm still a new user of those components.
The solution you are looking for lies with server-push technologies.
The .NET Framework provides a solution for two-way communication via WCF callbacks. More information here.
The SignalR framework provides a way to push data to the client side, so it is a better fit for MVC-like applications and not so much for DevExpress tools, which do most of the data binding on the server side. Info on SignalR can be found here.
DevExpress suggests a solution with timers (postbacks after a certain amount of time has passed).
On a general note, HTML5 provides the WebSocket functionality, which is essentially what all the other options build on, at least in part. Frameworks like the above build on these technologies but also provide workarounds using timers and the like.
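
To make the push mechanism concrete independently of the ASP.NET stack, here is a minimal sketch of the idea in Python using the third-party websockets package (version 11 or later assumed); the countryRenamed event name and payload are made up for illustration, and in your case SignalR or WCF callbacks would play this role rather than a hand-rolled WebSocket server:

# Minimal server-push sketch: every connected browser is notified when a
# country is renamed. Requires the third-party "websockets" package (11+).
# The event name and payload are illustrative only.
import asyncio
import json
import websockets

CONNECTED = set()  # all currently connected browser sockets


async def handler(websocket):
    CONNECTED.add(websocket)
    try:
        async for _ in websocket:  # we expect no client messages; this just
            pass                   # keeps the connection open until it closes
    finally:
        CONNECTED.discard(websocket)


def notify_country_renamed(old_name: str, new_name: str) -> None:
    # Called by whatever code performs the rename; each client decides locally
    # whether its current filter/page is affected by the change.
    message = json.dumps({"event": "countryRenamed",
                          "old": old_name, "new": new_name})
    websockets.broadcast(CONNECTED, message)


async def main():
    async with websockets.serve(handler, "localhost", 8765):
        await asyncio.sleep(5)                      # simulate some activity
        notify_country_renamed("Netherlands", "Holland")
        await asyncio.Future()                      # serve forever


if __name__ == "__main__":
    asyncio.run(main())

Each browser keeps one open socket and applies the update only if its current filter or page is affected, so the server does not need to track what each of the 2,000 users is currently looking at.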

I can't figure out where to start with GIS application development, or which technology to select

I am very new to GIS development, and to be frank I have no background in it at all. I searched the web, but the tutorials I found seemed to assume the reader has some background knowledge.
The thing is that I am confused about what to read or learn; there seem to be lots of technologies, and I feel lost since some speak about OpenLayers, GeoServer, MapServer, Google Maps, and OpenStreetMap.
So here is what I am supposed to develop, and I hope you can advise me on which technology to use and where I should start reading, given that I know almost nothing.
Case 1: a closed system for about 20 users only, who can specify locations on the map; the web application will store the latitude and longitude of the locations and show the markers. I wanted to use the Google Maps API, but I dropped that idea since their license requires you to purchase the service if the system is a closed one. So what technology should I use in such a case? I need a free option. Also, I will only be using a web server, so if the solution involves running my own GeoServer or something like that, I won't be able to do it.
Case 2: I am supposed to display the roads and routes between two given points, and probably add some notes on the map. For this case I can use my own map server / GeoServer, but again I want your suggestions.
Of course, the solution needs to be open source.
Finally, I hope you can tell me what to start reading first.
Start by looking over at https://gis.stackexchange.com/, starting with the tags [web-mapping] and
Some topics in particular you may want to look at are:
https://gis.stackexchange.com/questions/8113/steps-to-start-web-mapping
https://gis.stackexchange.com/questions/8238/where-how-to-learn-about-getting-started-with-web-gis
https://gis.stackexchange.com/questions/13868/looking-for-a-developer-friendly-web-gis
As for skills and tutorials, look at:
https://gis.stackexchange.com/questions/17227/free-gis-workshops-tutorials-and-applied-learning-material
https://gis.stackexchange.com/questions/913/web-gis-development-skill-sets
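
For Case 1 specifically, a free stack that needs nothing beyond a plain web server is Leaflet (or OpenLayers) on top of OpenStreetMap tiles: you store the latitude/longitude pairs yourself and render markers client-side. As a quick way to see what that looks like, here is a minimal sketch using the Python folium library, which simply generates a Leaflet page; the coordinates are made-up examples standing in for your stored locations:

# Minimal sketch: OpenStreetMap tiles + markers, generated with folium (Leaflet).
# Requires: pip install folium. The coordinates are made-up examples.
import folium

# Markers as they might come back from your database: (name, lat, lon)
locations = [
    ("Office", 52.3702, 4.8952),
    ("Warehouse", 51.9244, 4.4777),
]

m = folium.Map(location=[52.1, 4.7], zoom_start=8, tiles="OpenStreetMap")
for name, lat, lon in locations:
    folium.Marker([lat, lon], popup=name).add_to(m)

m.save("locations_map.html")  # a static HTML file any web server can serve

The same page could just as well be written directly in JavaScript with Leaflet or OpenLayers; the point is that an OSM tile-based stack stays free and does not require running your own GeoServer.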
