NetSuite API for Application Performance Monitor SuiteApp

Is there any SuiteScript API to pull the Application Performance Monitor metrics from the NetSuite database, or any other way to get this data? We need to store the data for future reference so we can optimize transaction records. Can anyone help? Thanks!

It's not public, but yes, you can make requests to the underlying Suitelets that power the NetSuite APM. To see this, go to one of the APM modules, open your browser's developer tools, and select the Network tab. Then refresh the data and inspect the request that was made to fetch the data that populates the metrics.
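For example, once you have captured the Suitelet request URL from the Network tab, a scheduled script could re-fetch it periodically and persist the result. This is only a sketch: the APM Suitelets are undocumented and unsupported, the URL below is a placeholder for whatever you capture in your own account, and the TypeScript typings assume the community @hitc/netsuite-types definitions.

```typescript
/**
 * @NApiVersion 2.1
 * @NScriptType ScheduledScript
 */
import { EntryPoints } from 'N/types';
import * as https from 'N/https';
import * as log from 'N/log';

export const execute: EntryPoints.Scheduled.execute = () => {
    // Placeholder: paste the request URL (including its query parameters)
    // captured from the Network tab. This endpoint is undocumented and
    // may change without notice.
    const apmUrl = '/app/site/hosting/scriptlet.nl?script=<apm_script>&deploy=<apm_deploy>';

    const response = https.get({ url: apmUrl });
    if (response.code === 200) {
        // The APM Suitelets generally return JSON; persist the body to a
        // custom record or file here for your historical analysis.
        log.audit({ title: 'APM metrics', details: response.body });
    } else {
        log.error({ title: 'APM fetch failed', details: response.code });
    }
};
```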
Good luck!

Related

Cognos REST API and scheduling schema loading

I am trying to find out more information about using the REST API to create a schedule for schema loading. Right now I have to reload the particular schemas via my data server connections manually (click on every schema and Load Metadata) and would like to automate this process.
Any pointers will be much appreciated.
Thank you
If your data warehouse's metadata is in such flux that you need to reload it frequently enough to want to automate the process, then you need to understand that your data warehouse is in no way ready for use.
So the question becomes: why would you want to frequently reload the metadata of a data source schema? I'm guessing that you are refreshing the data in your database and, because your query cache has not expired, you are not seeing the new data.
So the answer is: you probably don't want to do what you think you need to do, unless you can convince me otherwise.
Also, if you enter some obvious search terms you will find the Cognos Analytics REST API documentation without too much difficulty.
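If you do go down the REST route, the first step is opening a session. A minimal sketch, assuming Cognos Analytics 11.1+ where the REST API is available; the host, namespace, and credentials below are placeholders, and any schema-load endpoint would need to be confirmed against your version's API reference:

```typescript
// Sketch only: host, namespace, and credentials are placeholders.
const base = 'https://cognos.example.com/api/v1';

async function openSession(): Promise<Response> {
  // PUT /session authenticates and establishes the session cookie
  // that subsequent REST calls reuse.
  return fetch(`${base}/session`, {
    method: 'PUT',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      parameters: [
        { name: 'CAMNamespace', value: 'LDAP' },
        { name: 'CAMUsername', value: 'svc_reports' },
        { name: 'CAMPassword', value: process.env.COGNOS_PW ?? '' },
      ],
    }),
  });
}
```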

Bringing an MS Graph Search Custom Connector into working mode

Recently, Microsoft published the Microsoft Search API (beta), which makes it possible to index external systems by creating an MS Graph search custom connector.
I created such a connector, and it has been successful so far. I also pushed a few items to the index, and in the Microsoft 365 admin center I created a result type and a vertical. Now I'm able to find the external items in question in the SharePoint Online modern search center, in a dedicated tab belonging to the search vertical created before. So far so good.
But now I wonder:
How can I achieve that the external data is continuously pushed to the MS Search Index? (How can this be implemented? Is there any tutorial or a sample project? What is the underlying architecture?)
Is there a concept of Full / Incremental / Continuous Crawls for a Search Custom Connector at all? If so, how can I "hook" into a crawl in order to update changed data to the index?
Or do I have to implement it all on my own? And if so, what would be a suitable approach?
Thank you for trying out the connector APIs. I am glad to hear that you are able to get items into the index and see the results.
Regarding your questions, the logic for determining when to push items, and your crawl strategy, are things that you need to implement on your own. There is no one best strategy per se; it will depend on your data source and the type of access you have to that data. For example, do you get notifications every time the data changes? If not, how do you determine what data has changed? If none of that is possible, you might need to do a periodic full recrawl, but then you will need to consider the size of your data set for ingestion.
We will look into ways to reduce the amount of code you have to write in the future, but right now, this is something you have to implement on your own.
-James
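Whichever crawl strategy you choose, the actual indexing call is the same. Below is a minimal sketch of pushing one item with the Microsoft Graph JavaScript client; the connection id ('sampleconn'), the item id, and the property names are placeholders and must match the connection schema you registered:

```typescript
import { Client } from '@microsoft/microsoft-graph-client';

async function pushItem(graph: Client): Promise<void> {
  const externalItem = {
    acl: [{ type: 'everyone', value: 'everyone', accessType: 'grant' }],
    // Property names must match the schema registered on the connection.
    properties: { title: 'Example item', url: 'https://example.com/items/1' },
    content: { value: 'Full text to be indexed', type: 'text' },
  };

  // PUT creates the item or overwrites an existing one, so the same call
  // serves both full and incremental crawls.
  await graph
    .api('/external/connections/sampleconn/items/item-1')
    .version('beta') // the connector APIs were in beta when this was written
    .put(externalItem);
}
```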
I recently implemented incremental crawling for Graph connectors using Azure Functions. I created a timer-triggered function that fetches the items updated in the data source since the time of the last function run and then updates the search index with those items.
I also wrote a blog post about this approach, using a SharePoint list as the data source. The entire source code can be found at https://github.com/aakashbhardwaj619/function-search-connector-crawler. Hope it's useful.
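As a rough illustration of that pattern, here is a skeleton of a timer-triggered Azure Function in TypeScript. The stubs getItemsChangedSince and pushToIndex are placeholders for your data-source query and the Graph PUT shown earlier, and the last-run timestamp would normally be persisted (e.g., in table storage) rather than kept in memory:

```typescript
import { AzureFunction, Context } from '@azure/functions';

interface SourceItem { id: string; title: string; }

// Placeholder: query your data source for items changed since `since`.
async function getItemsChangedSince(since: Date): Promise<SourceItem[]> {
  return [];
}

// Placeholder: PUT the item to /external/connections/{id}/items/{itemId}.
async function pushToIndex(item: SourceItem): Promise<void> {}

// In production, persist this between runs (e.g., table storage or a blob).
let lastRun = new Date(0);

const timerTrigger: AzureFunction = async (context: Context): Promise<void> => {
  const changed = await getItemsChangedSince(lastRun);
  for (const item of changed) {
    await pushToIndex(item);
  }
  lastRun = new Date();
  context.log(`Re-indexed ${changed.length} items`);
};

export default timerTrigger;
```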

Dashboard F5 data download

In F5>Statistics>Dashboard it is possible to download raw data with the 'history' icon.
I need to download this on a regular basis, so automation comes into play.
I can't find such a report in the F5 standard report repository. I tried linking F5 to Zabbix to analyze the data there, but I don't have access to the F5 backend. I set up a UI macro, but I would like to have something more reliable in place.
Any tips most welcomed.
[F5 dashboard screenshot]
Thanks!
Since you're on version 13, you have the option to load the Telemetry Streaming iControl LX RPM onto the BIG-IP and either push the data to any number of preconfigured systems or set up a generic JSON pull request through the API for the same data.
https://clouddocs.f5.com/products/extensions/f5-telemetry-streaming/latest/
This is the same set of data that they use to populate the BIG-IQ Centralized Management performance and health analytics.
The caveat is that, depending on what and how much you request, it can start to tax the system, so enable what you need in chunks and watch how it affects system performance. I've seen the mightiest systems grind to a halt when asked to provide ALL telemetry data to Splunk or Sumo Logic.
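To give a concrete feel for the API, here is a sketch of POSTing a minimal Telemetry Streaming declaration with a single system poller. The host, credentials, and 300-second interval are placeholders; see the clouddocs link above for the full declaration schema:

```typescript
// Sketch: host, credentials, and interval below are placeholders.
const declaration = {
  class: 'Telemetry',
  My_Poller: {
    class: 'Telemetry_System_Poller',
    interval: 300, // start with a long interval, per the caveat above
  },
};

const response = await fetch('https://bigip.example.com/mgmt/shared/telemetry/declare', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: 'Basic ' + Buffer.from('admin:password').toString('base64'),
  },
  body: JSON.stringify(declaration),
});
console.log(await response.json()); // echoes the accepted declaration
```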
Hope this helps.

Cognos monitoring

Does Cognos 8 have an API which can be queried to find out:
the last time a scheduled report has run
was it successful?
if not, what caused it to fail?
It would be preferable for the API to be web-based. I have read about a JMX interface, but the documentation was lacking.
Cognos 8 has an option to enable logging to an audit database. The database can then be queried for details such as when a report ran, what parameters were used, and what errors, if any, there were.
Link to IBM site for setting up the logging database.
Link to IBM site for setting up the connection to the database.
Basically, you create a compatible database with the proper settings and tell Cognos how to connect to it; the next time the Cognos services start up, they will automatically create the necessary logging tables and begin populating them.
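Once the audit database is populated, the run history is a plain SQL query away. A sketch, assuming a SQL Server audit database; the table and column names (COGIPF_RUNREPORT and friends) are assumptions based on the standard audit schema and should be verified against your version's documentation:

```typescript
import * as sql from 'mssql';

async function recentRuns(): Promise<void> {
  // Placeholder connection string for the audit database.
  const pool = await sql.connect('Server=dbhost;Database=COGNOS_AUDIT;User Id=audit;Password=...;');
  // Table/column names below are assumptions from the standard audit schema.
  const result = await pool.request().query(`
    SELECT TOP 20 COGIPF_REPORTNAME, COGIPF_STATUS, COGIPF_LOCALTIMESTAMP
    FROM COGIPF_RUNREPORT
    ORDER BY COGIPF_LOCALTIMESTAMP DESC
  `);
  console.table(result.recordset);
}
```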
Typically, to accomplish this in API fashion, you want to use the Cognos SDK. The SDK allows you to query the scheduling and history associated with report runs and see whether a request completed or failed. If it failed, you will see the history associated with the failure, much like what you see in the Cognos Administration section when you look at failed runs.
This is a good place to start to look at a sample:
http://www-01.ibm.com/support/docview.wss?uid=swg21343791

Combining data from Project Server and SharePoint into a single report

I need to combine data from the Project Server reporting database with data from custom lists in SharePoint workspaces. The results need to be displayed within a single report. How should this be done? Options I've thought of:
Extend the reporting database with the custom list data (if this is possible). Use Reporting Services to display the output.
Query the reporting database and the SharePoint workspaces and combine results in memory. Write custom code to display the output.
Any other ideas? I have the skills to develop this but am very open to purchasing a product if it solves the problem.
I've had this sort of problem as well. My approach:
Create a custom reporting DB.
Run regular jobs from SQL Server to query SharePoint (via web services) and store the results in the DB.
I use GetListItemChangesSinceToken in Lists.asmx to improve efficiency (a sketch of the SOAP call appears after this answer). I also utilise the SiteData query toolset; I wrote a really simple interface onto it so I can call a SiteData query remotely and get back a DataTable.
Use Reporting Services / any tool to extract and report on the data.
The reasons I opted for a staging DB were:
Performance - the web service calls are pretty slow.
Service continuity - if SharePoint is down or slow for any reason, queries will fail.
Hope this helps.
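For reference, here is a sketch of the GetListItemChangesSinceToken call mentioned above, as a raw SOAP request. The site URL and list name are placeholders, and authentication is omitted (classic Lists.asmx deployments typically expect NTLM/Windows auth, which plain fetch does not handle):

```typescript
const envelope = `<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetListItemChangesSinceToken xmlns="http://schemas.microsoft.com/sharepoint/soap/">
      <listName>Tasks</listName>
      <changeToken></changeToken>
    </GetListItemChangesSinceToken>
  </soap:Body>
</soap:Envelope>`;

// An empty changeToken returns everything; reuse the token from each
// response to fetch only the changes on subsequent runs.
const res = await fetch('https://sp.example.com/sites/pwa/_vti_bin/Lists.asmx', {
  method: 'POST',
  headers: {
    'Content-Type': 'text/xml; charset=utf-8',
    SOAPAction: 'http://schemas.microsoft.com/sharepoint/soap/GetListItemChangesSinceToken',
  },
  body: envelope,
});
const xml = await res.text(); // parse the rows and the new change token from here
```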
I also found the tool SharePoint Data Miner which appears to do the same as DJ's answer.
