How do you create a custom function in IBM Maximo Asset Monitor using Python, and is there an SDK?

I would like to create my own custom function and add it to the Maximo Asset Monitor Catalog. How do I do that?

You can create custom functions using Python in IBM Maximo Asset Monitor. Use the following resources (a minimal code sketch follows this list):
The Knowledge Center documentation includes a tutorial: https://www.ibm.com/support/knowledgecenter/SSQP8H/iot/analytics/tutorials/as_adding_complex_function_tutorial.html
Use the Watson IoT function samples to build more advanced custom functions: https://github.com/ibm-watson-iot/functions/blob/production/iotfunctions/sample.py
Fork this GitHub repo: https://github.com/fe01134/functions
Watch this step-by-step video tutorial: https://youtu.be/i5srMxmIOHM
Watson IoT Monitor SDKs and REST APIs for querying data:
Database helper (db): https://github.com/ibm-watson-iot/functions/blob/development/iotfunctions/db.py
REST API for Monitor: https://github.com/ibm-watson-iot/functions/blob/1cf584f30ba259ad98b1ca77684c016df9b21748/iotfunctions/db.py#L773
Queries: https://github.com/ibm-watson-iot/functions/blob/development/scripts/test_queries.py
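As an illustration, here is a minimal sketch of a catalog-style custom function modeled on the patterns in the sample.py file linked above. The class name MultiplyByFactor and its parameter names are hypothetical, and the exact base-class API should be confirmed against the iotfunctions version you install.

```python
from iotfunctions.base import BaseTransformer


class MultiplyByFactor(BaseTransformer):
    """Hypothetical example: multiply one input data item by a constant factor."""

    def __init__(self, input_item, factor, output_item):
        self.input_item = input_item    # name of the incoming data item (DataFrame column)
        self.factor = float(factor)     # constant supplied when the function is configured
        self.output_item = output_item  # name of the derived data item to create
        super().__init__()

    def execute(self, df):
        # Monitor passes a pandas DataFrame of entity data; return it with the
        # derived column added so downstream functions and dashboards can use it.
        df = df.copy()
        df[self.output_item] = df[self.input_item] * self.factor
        return df
```

As in sample.py, a build_ui classmethod describing the inputs and outputs is normally added as well; the class is then pushed to a Git repository and registered with the catalog through the Database helper in db.py (for example, db.register_functions([MultiplyByFactor])).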

Related

How to use MQL with node.js?

I have created a monitoring metrics dashboard in my Google Cloud Console. The dashboard is working as expected, but since my app is highly dependent on those metrics, I was thinking about creating a scheduled job that reads these metrics and updates the server accordingly.
After investigating the dashboards, I have noticed that there is an MQL query. Is there any way to execute this query in my node.js function so I can fetch the data and update the server?
You can try the MetaApi cloud service (https://metaapi.cloud), which provides REST API and WebSocket API access to both MetaTrader 4 and MetaTrader 5 accounts.
Official REST API documentation: https://metaapi.cloud/docs/client
SDKs: https://metaapi.cloud/sdks (JavaScript, Python, and Java SDKs are provided as of April 2021)
It supports reading account information, positions, orders, trade history, receiving quotes, and accessing market data.
The service also provides a copy trading API (https://metaapi.cloud/docs/copyfactory) and an API to calculate forex trading metrics on a MetaTrader account (https://metaapi.cloud/docs/metastats).
There is a similar case on Stack Overflow (answered by user3666197).
You can also easily connect your Node.js server to MySQL. MySQL is one of the most popular open-source databases in the world and is efficient as well.
Please follow a Node.js MySQL tutorial for more details about the steps for connecting a Node.js server to MySQL.
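Coming back to the original question, which is about Cloud Monitoring's Monitoring Query Language (MQL): the Cloud Monitoring API exposes a timeSeries:query endpoint that accepts the same MQL text shown in the dashboard, and it can be called from any HTTP client, including a Node.js function. A rough sketch in Python using google-auth is below; the project ID and query string are placeholders you would replace with your own.

```python
import google.auth
from google.auth.transport.requests import AuthorizedSession

# Placeholder values - replace with your own project and the MQL copied from the dashboard.
PROJECT_ID = "my-gcp-project"
MQL_QUERY = """
fetch gce_instance
| metric 'compute.googleapis.com/instance/cpu/utilization'
| within 5m
"""


def run_mql_query(project_id: str, query: str) -> dict:
    """Execute an MQL query against the Cloud Monitoring timeSeries:query endpoint."""
    credentials, _ = google.auth.default(
        scopes=["https://www.googleapis.com/auth/monitoring.read"]
    )
    session = AuthorizedSession(credentials)
    url = f"https://monitoring.googleapis.com/v3/projects/{project_id}/timeSeries:query"
    response = session.post(url, json={"query": query})
    response.raise_for_status()
    return response.json()  # time series descriptors and data points


if __name__ == "__main__":
    print(run_mql_query(PROJECT_ID, MQL_QUERY))
```

The same POST request works from Node.js with google-auth-library and any HTTP client, so the query text copied from the dashboard can drive a scheduled job that updates your server.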

Azure Media Services - Blob not appearing in Asset using Rest API

I want to use Azure Rest API to encode a video to multiple bitrates.
Here are the steps I am following:
Uploaded single MP4 video as an asset using Azure Portal.
Received access_token from Azure AD Token for Service Principal Authentication using Postman.
Used this token to create a new asset using Postman.
Created a new Transform request using Postman.
Created a new job using the Transform request, with input as the file uploaded at step 1 and output as the asset created at step 3, using Postman.
Now this job appears under the "Jobs" section of Azure Media Services. After the job completes successfully, the blobs are present in the blob container.
But those files are not appearing in the Asset (created at step 3 above). An empty asset is present even after the job is complete.
Do I need to call another API after the job is complete to map the blobs to the asset?
Please help.
Thank you.
Azure Media Services currently offers two API versions, v3 and v2. The Azure Portal, as called out here, is using the older/legacy v2 APIs. When you use Transforms and submit Jobs using that Transform, you are using the v3 API - the resultant output Asset is meant to be a v3 entity. Hence, contents of that Asset are not going to show up in the Azure Portal. If you are using a PC, there is a tool here that is being built on top of the v3 APIs, which may be of help in browsing v3 Assets.
Note for folks returning to this post: we are now removing the Postman collection for the v3 API and directing customers to use the client SDKs directly instead. The reason is that too many customers had issues "rolling their own" retry-policy code for Azure Resource Management and also had trouble implementing long-running-operation support for async operations. We prefer that most customers of Media Services now use the client SDKs available for various languages. See this page for samples by client SDK language.
https://learn.microsoft.com/en-us/azure/media-services/latest/samples-overview
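As a rough illustration of the SDK route, here is a hedged Python sketch using the azure-mgmt-media package to create the output asset, the transform, and the job; the subscription, resource group, account, and entity names are placeholders, and the model classes should be checked against the SDK version you install.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.media import AzureMediaServices
from azure.mgmt.media.models import (
    Asset, BuiltInStandardEncoderPreset, Job, JobInputAsset,
    JobOutputAsset, Transform, TransformOutput,
)

# Placeholder identifiers - substitute your own values.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
ACCOUNT_NAME = "<media-services-account>"

client = AzureMediaServices(DefaultAzureCredential(), SUBSCRIPTION_ID)

# 1. Output asset that the encoded renditions will be written into.
client.assets.create_or_update(RESOURCE_GROUP, ACCOUNT_NAME, "encoded-output", Asset())

# 2. Transform describing the encoding work (adaptive bitrate ladder).
transform = Transform(outputs=[
    TransformOutput(preset=BuiltInStandardEncoderPreset(preset_name="AdaptiveStreaming"))
])
client.transforms.create_or_update(RESOURCE_GROUP, ACCOUNT_NAME, "abr-transform", transform)

# 3. Job that reads the uploaded input asset and writes into the output asset.
job = Job(
    input=JobInputAsset(asset_name="uploaded-input"),
    outputs=[JobOutputAsset(asset_name="encoded-output")],
)
client.jobs.create(RESOURCE_GROUP, ACCOUNT_NAME, "abr-transform", "encode-job-001", job)
```

When the job finishes, the encoded files belong to the v3 output asset created in step 1, which is the asset that the v3 APIs (and the tool mentioned above) will show.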

Is there any guide how to train a Microsoft custom MT engine and deploy it on Azure?

I have a big parallel corpus in TMX format which I'd like to use for training a custom Microsoft MT engine in the Microsoft Translator Hub. Then, I'd like to deploy this trained MT engine on Azure and use it in a cloud-based CAT tool.
Is there any step-by-step guide how to do that?
You upload your TMX to the Hub, and you train, optimize, and deploy your MT engine.
The Hub overview page lists a category ID with each of your trained systems.
You copy the category ID from the Hub overview page into the "category" field that your CAT tool's MT connector exposes.
If your CAT tool does not expose a "category" field in the connector, please ask the CAT tool vendor to add it. In the Translator API call it is simply an additional parameter that allows the service to find your custom-trained system when translating.
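For reference, here is a rough Python sketch of the kind of Translator Text API v3 call a CAT tool connector makes, with the category parameter carrying the ID copied from the Hub overview page; the subscription key, region, and category value are placeholders.

```python
import uuid
import requests

# Placeholder values - use your own key, region, and the category ID from the Hub overview page.
SUBSCRIPTION_KEY = "<translator-subscription-key>"
REGION = "<azure-region>"
CATEGORY_ID = "<category-id-from-hub>"


def translate(text: str, to_lang: str = "de") -> str:
    """Translate text through the custom-trained system identified by the category ID."""
    url = "https://api.cognitive.microsofttranslator.com/translate"
    params = {"api-version": "3.0", "to": to_lang, "category": CATEGORY_ID}
    headers = {
        "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
        "Ocp-Apim-Subscription-Region": REGION,
        "Content-Type": "application/json",
        "X-ClientTraceId": str(uuid.uuid4()),
    }
    response = requests.post(url, params=params, headers=headers, json=[{"Text": text}])
    response.raise_for_status()
    return response.json()[0]["translations"][0]["text"]


if __name__ == "__main__":
    print(translate("The pump pressure exceeded the threshold."))
```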

Pulling data from Stream Analytics to Azure Machine Learning

I am working on an IoT telemetry project that receives humidity and weather pollution data from different sites in the field. I will then apply machine learning to the collected data. I'm using Event Hubs and Stream Analytics. Is there a way of pulling the data into Azure Machine Learning without the hassle of writing an application to get it from Stream Analytics and push it to the AML web service?
Stream Analytics has a feature called Functions. You can call any web service you have published with AML from within Stream Analytics and apply it in your Stream Analytics query. Check this link for a tutorial.
An example workflow in your case would be the following:
Telemetry arrives and reaches Stream Analytics.
Stream Analytics (SA) calls the Machine Learning function and applies it to the data.
SA redirects the result to the output accordingly; here you can use Power BI to create a predictions dashboard.
Another way would be to use R; here is a good tutorial showing that: https://blogs.technet.microsoft.com/machinelearning/2015/12/10/azure-ml-now-available-as-a-function-in-azure-stream-analytics/
It is more work, of course, but it can give you more control since you control the code.
Yes, this is actually quite easy, as it is well supported by ASA.
You can call a custom Azure ML function from your ASA query once you create the function from the portal.
See the following tutorial on how to achieve something like this.
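If you ever do need to call the published AML web service directly from your own code (the same request-response endpoint the Stream Analytics function wraps), the classic Azure ML Studio scoring call looks roughly like the Python sketch below; the endpoint URL, API key, and column names are placeholders based on this question's humidity and pollution scenario.

```python
import requests

# Placeholder values - copy the URL and API key from the web service dashboard in ML Studio.
SCORING_URL = (
    "https://<region>.services.azureml.net/workspaces/<workspace-id>"
    "/services/<service-id>/execute?api-version=2.0"
)
API_KEY = "<web-service-api-key>"


def score(humidity: float, pollution: float) -> dict:
    """Send one row of telemetry to the request-response scoring endpoint."""
    body = {
        "Inputs": {
            "input1": {
                "ColumnNames": ["humidity", "pollution"],
                "Values": [[humidity, pollution]],
            }
        },
        "GlobalParameters": {},
    }
    headers = {
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    }
    response = requests.post(SCORING_URL, json=body, headers=headers)
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    print(score(humidity=0.62, pollution=41.0))
```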

Azure Api Management Join Payloads and to provide customer

I have a question about the use of Azure API Management. The architecture uses single-responsibility APIs, each covering a domain that performs the functions of its business area. See the image of the structure.
1. I wonder whether Azure API Management operates only as a management layer, or whether I can use it as a gateway that aggregates the results of many APIs into one (orchestrating) and makes the result available to the client that made the request?
2. Or is it the web application's responsibility to gather this data?
3. Is there a pattern?
Azure API Management can do both: it started as a management tool, but it has received some updates so it can act as a gateway as well.
Read about the different API management policies you can create here: https://azure.microsoft.com/en-us/documentation/articles/api-management-policy-reference/
Or take a look at the advanced policies, such as control flow and send-request:
https://msdn.microsoft.com/library/azure/dn894085.aspx
For an example of sending requests to gather information from multiple sources, see this:
https://azure.microsoft.com/nl-nl/documentation/articles/api-management-sample-send-request/
