Good day
Is there a way to see what is pulling data out of the system and how much?
I have looked at the Access History (OData refresh), but I am thinking the API could also be an issue.
We are currently experiencing massive data pulls via IIS on our server and I can't see what is pulling the data.
Any ideas or suggestions would be helpful.
You can monitor lots of things such as SQL and Memory through the Request Profiler.
Search for Request Profiler in the search box.
Click Log Requests and Log SQL to enable full logging.
Remember to turn it off when you are done as it will have a small performance hit.
An alternative is to use the License Monitoring Console within Acumatica. You can view historical transactions whether they are commercial or ERP related.
From the help file, commercial transactions are:
Commercial transactions are a subset of ERP transactions. The system
regards a transaction as commercial when the transaction creates or
updates any of the following entities: sales orders, shipments,
Accounts Receivable invoices, payments, purchase orders, purchase
receipts, and Accounts Payable invoices. All requests generated by
using the web services API that create or update data in these
documents are also considered commercial transactions.
Also, you can review the number of web service API requests, requests per minute, and the maximum number of users. This can help determine whether your client needs to be on a higher Acumatica license tier.
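If you export request timestamps yourself (from IIS logs or the License Monitoring Console), a few lines can turn them into a requests-per-minute figure to compare against your license tier. This is a minimal sketch; the sample timestamps are made up for illustration:

```python
from collections import Counter
from datetime import datetime

# Hypothetical sample: ISO timestamps of API requests, as you might
# extract them from IIS logs or a License Monitoring Console export.
request_times = [
    "2023-05-01T10:00:05",
    "2023-05-01T10:00:40",
    "2023-05-01T10:00:59",
    "2023-05-01T10:01:10",
]

# Bucket requests by minute and find the busiest one.
per_minute = Counter(
    datetime.fromisoformat(t).strftime("%Y-%m-%d %H:%M") for t in request_times
)
peak_minute, peak_count = per_minute.most_common(1)[0]
print(f"Peak: {peak_count} requests at {peak_minute}")
```

The peak minute is what matters when deciding whether you are bumping against a per-minute API limit.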
You can also follow the troubleshooting recommendations listed on Acumatica's help site.
I currently have a web app that allows users to create their own website through the platform, like shopify/wordpress/squarespace/wix. I also want to provide an analytics dashboard for these users to keep track of events and page views.
Are there any APIs for this? I took a look at various analytics services (google analytics, mixpanel, Adobe Analytics) but they all seem to be targeted at individual websites.
Or is the best way to create a custom solution, storing page views and events in our own databases?
Any solution or ideas would be appreciated, thanks! :)
Localytics is a cloud-based analytics and marketing platform. This mobile analytics tool targets every customer from media, retail, travel, business and lifestyle. The retention feature is particularly interesting for users. This feature generates an analysis that pinpoints reasons for business problems, such as drop-offs, low conversion rates and reduced client retention.
Dynamics CRM 365 Online has telemetry to monitor application performance and usage. My understanding is that it stores data in Azure by sending it via script, and alerts can be set on this data in Azure.
There is also a CRM solution, Organization Insights, which is used for counters like API faults, etc. My understanding is that this solution stores data in CRM and not in Azure.
Please help: can the data generated by this solution also be used for setting alerts in Azure, and if yes, how? Thanks.
Over a period of time, different products, SaaS offerings, platform tools, and O365 capabilities have been introduced and have evolved in and around the telemetry space. Some are strong in broader horizontal features versus deeper vertical strengths. It's up to you to decide based on a functionality matrix.
Dynamics CRM using Application Insights - this is just leveraging Azure Application Insights to track availability, performance, and usage of your CRM Online instance using AppInsights features. It gives you the power to utilize client-side telemetry like user activities/metrics/exceptions/browser data, a server-side agent for on-premise logs similar to Event Viewer/IIS logs, data export, an alerting mechanism and web hooks, custom queries, visualizations inside the Azure portal resource blade (slice/dice), predictive analysis, etc.
CRM in-house Organization Insights - this is the inbuilt telemetry solution from the product team, not yet fully delivered (still with bugs and limited functionality for data consumption itself), but we can expect more features, options, and utilities to explore in the future. Those are what's missing today, and that's what you are asking about. You can download data, or call MS to give you the exported data, but alerts and monitoring still won't fit using only the available OData support for charts. It covers a lot of useful admin stuff, natively giving you all plugins, workflows, API calls, user interactions, mailbox and storage data, etc. within CRM.
Activity logging - this is in preview, and MS says it's more than what we have in Audit today: it gives you more than CRM audit, but less than what Org Insights provides, and even less than the AppInsights feature set. But I expect more functionality here, like alerts, monitoring, and web hooks, as this spans O365 platforms and is available to Global admins.
The API statistics report in my Google Console project doesn't show data, even though we are making hundreds of requests every day. Refer to the attached screenshot; it always shows blank.
I have billing enabled in project.
API statistics started showing once I upgraded the plan.
I'm not sure if it's Google's policy not to collect stats for the free plan in order to improve the performance of paid plans.
We have our own application that stores contacts in an SQL database. What all is involved in getting up and running in the cloud so that each user of the application can have his own, private list of contacts, which will be synced with both his computer and his phone?
I am trying to get a feeling for what Azure might cost in this regard, but I am finding more abstract talk than I am concrete scenarios.
Let's say there are 1,000 users, and each user has 1,000 contacts that he keeps in his contacts book. No user can see the contacts set up by any other user. Syncing should occur any time the user changes his contact information.
Thanks.
While the Windows Azure Cloud Platform is not intended to compete directly with consumer-oriented services such as Dropbox, it is certainly intended as a platform for building applications that do that. So your particular use case is a good one for Windows Azure: creating a service for keeping contacts in sync, scalable across many users, scalable in the amount of data it holds, and so forth.
Making your solution multi-tenant friendly (per the comment from #BrentDaCodeMonkey) is key to cost-efficiency. Your data needs are 1K users x 1K contacts/user = 1M contacts. If each contact is approximately 1 KB, then we are talking about roughly 1 GB of storage.
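The sizing arithmetic above can be sanity-checked in a few lines (the 1 KB-per-contact figure is an assumption, not a measured value):

```python
# Back-of-envelope storage estimate for the multi-tenant contacts store.
users = 1_000
contacts_per_user = 1_000
bytes_per_contact = 1_024          # ~1 KB per contact (assumption)

total_contacts = users * contacts_per_user
total_bytes = total_contacts * bytes_per_contact
total_gb = total_bytes / 1024**3   # binary gigabytes

print(f"{total_contacts:,} contacts ~ {total_gb:.2f} GB")
```

That lands just under 1 GB, so the smallest database tier is the right starting point; rerun the numbers if contacts carry photos or attachments.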
Checking out the pricing calculator, the at-rest storage cost is $9.99/month for a Windows Azure SQL Database instance for 1GB (then $13.99 if you go up to 2GB, etc. - refer to calculator for add'l projections and current pricing).
Then you have data transmission (Bandwidth) charges. Though since the pricing calculator says "The first 5 GB of outbound data transfers per billing month are also free" you probably won't have any costs with current users, assuming moderate smarts in the sync.
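As a rough sanity check on the bandwidth side, here is a sketch under assumed sync behavior; the 10-changed-contacts-per-user-per-day figure is a guess for illustration, not a measurement:

```python
# Rough monthly egress estimate for contact sync (all rates are assumptions).
users = 1_000
changed_contacts_per_user_per_day = 10   # assumed churn rate
bytes_per_contact = 1_024                # ~1 KB per contact, as above
days_per_month = 30

monthly_egress_bytes = (
    users * changed_contacts_per_user_per_day * bytes_per_contact * days_per_month
)
monthly_egress_gb = monthly_egress_bytes / 1024**3
free_tier_gb = 5  # "first 5 GB of outbound data transfers per billing month are free"

print(f"~{monthly_egress_gb:.2f} GB/month egress; "
      f"within free tier: {monthly_egress_gb < free_tier_gb}")
```

Even with an order-of-magnitude more churn, a delta-based sync stays well inside the free 5 GB; a naive full-list sync on every change would not.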
This does not include the costs of your application. What is your application, and how does it run? Assuming there is a client-side component, that component (typically) cannot be trusted to hold the database connection, so you need a server-side component that serves as a gatekeeper for the database. (You also usually don't expose the database to all IP addresses - another motivation for channeling data through a server-side component.) This component will also cost money to operate; those costs are in the pricing calculator too, but if you choose to use a Windows Azure Web Site it could be free.

An excellent approach might be the ASP.NET Web API stack that was recently released. Using the Web API, you can implement a nice REST API that your client application can access securely, and Windows Azure Web Sites can host Web API endpoints. Check out the "reserved instance" capability too.
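To illustrate the gatekeeper idea, here is a minimal sketch of tenant-scoped data access: the client never queries the database directly, it calls server-side code that scopes every query to the authenticated user. The table layout and user IDs are invented for the example; a real service would sit behind an authenticated REST endpoint:

```python
# Minimal sketch of a server-side "gatekeeper" with tenant isolation.
# All names (table, users, contacts) are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE contacts (user_id TEXT, name TEXT, phone TEXT)")
conn.executemany(
    "INSERT INTO contacts VALUES (?, ?, ?)",
    [("alice", "Bob", "555-0100"), ("carol", "Dave", "555-0199")],
)

def get_contacts(authenticated_user_id: str) -> list:
    """Return only the rows owned by the calling user."""
    return conn.execute(
        "SELECT name, phone FROM contacts WHERE user_id = ?",
        (authenticated_user_id,),
    ).fetchall()

print(get_contacts("alice"))   # alice sees only her own contacts
```

The essential point is that the user ID comes from server-side authentication, never from a client-supplied query, so no user can read another user's contacts.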
I would start out with Windows Azure Web Sites and, as the service grew in complexity and sophistication, look at Windows Azure Cloud Services (a more advanced approach to building server-side components).
I am building a service on Azure and wanted to know if there is any way to know how many resources (data downloaded or uploaded, time required for processing) a customer has used in a given session, and what level of services they have used, in order to bill them accordingly. We expose the whole framework as a service. It consists of various small services, such as reading data from an external FTP server, downloading it to blob storage, reading the downloaded file and storing its rows in tables, performing operations on the data in the tables, emailing results from the service to the user, etc.
So, depending on what all services the customer has used, we would like to bill them accordingly.
Thanks!
The only Azure-specific feature I can think of that will help with what you want to track is Azure Storage Logging, which records each and every request to Azure Storage. I'm not sure how much that will help you, though.
I think you will have to decide exactly what you want to bill your customers for and then start tracking that yourself. This might be items similar to what MS charges you for (the size of incoming requests, the number of transactions, and the amount of data stored in Azure Storage), or maybe just some arbitrary values based on this information.
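To make the "track it yourself" idea concrete, here is a minimal metering sketch: each small service records its own usage per customer, and billing is a rate table applied to the totals. The service names and per-unit rates are entirely made up:

```python
# Sketch of per-customer usage metering; rates and metric names are
# hypothetical and would come from your own price list.
from collections import defaultdict

RATES = {                       # price per unit (assumed)
    "ftp_download_mb": 0.01,
    "blob_store_mb": 0.005,
    "table_rows": 0.0001,
    "emails_sent": 0.002,
}

usage = defaultdict(lambda: defaultdict(float))

def record(customer: str, metric: str, amount: float) -> None:
    """Call this from each small service when it does work for a customer."""
    usage[customer][metric] += amount

def bill(customer: str) -> float:
    """Total charge = sum of (quantity * rate) over recorded metrics."""
    return sum(qty * RATES[metric] for metric, qty in usage[customer].items())

record("acme", "ftp_download_mb", 500)   # 500 MB pulled from FTP
record("acme", "blob_store_mb", 500)     # same data stored to blob
record("acme", "emails_sent", 10)

print(f"acme owes ${bill('acme'):.2f}")
```

In production the `record` calls would write to durable storage (a table per billing period) rather than an in-memory dict, so a restart doesn't lose billable events.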