Home Assistant Sensor API Specification, how to set a unique_id via REST?

Years ago I built a temperature sensor that would push (HTTP POST) readings to a server / dashboard system I had written. In light of expanding requirements, I've decided to switch to Home Assistant as my backend. Though it is possible to send the data to HA, the documentation is spotty. Namely, I'm looking for the full JSON for the POST body (an OpenAPI spec would be nice) and more details around how sensors function, things like "can I set a unique_id so that they are editable in HA?", etc.
So far I've been working off the little bits of information around the API, some examples, and inferences from other documents for the Python API (internal server code); the sketch after the links below is roughly what I've pieced together.
REST API
https://developers.home-assistant.io/docs/api/rest/
Sensor post info from HTTP integration
https://www.home-assistant.io/integrations/http/#sensor
General Sensor Docs
https://www.home-assistant.io/integrations/sensor#device-class
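Here is the kind of POST I've pieced together so far, a rough sketch in JavaScript (the URL, entity ID, token, and attribute values are just placeholders); it isn't clear to me whether adding a unique_id attribute here actually does anything:

const HA_URL = 'http://homeassistant.local:8123'; // placeholder

fetch(HA_URL + '/api/states/sensor.garage_temperature', {
  method: 'POST',
  headers: {
    'Authorization': 'Bearer YOUR_LONG_LIVED_ACCESS_TOKEN', // placeholder token
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    state: '21.4',
    attributes: {
      unit_of_measurement: '°C',
      device_class: 'temperature',
      friendly_name: 'Garage Temperature'
    }
  })
}).then(function (response) {
  console.log('HA responded with ' + response.status);
});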

Perhaps what you're looking for is value_json?
Here's how I'm fetching a value out of a simple web server which returns a basic JSON object { key: value }, although one difference is that I'm polling and using GET.
rest:
  - resource: http://192.168.0.122/status
    sensor:
      - name: "Data from my server"
        value_template: "{{ value_json.key }}"
Tip: Go to Developer Tools -> Template for a sandbox-like environment where you can rapidly prototype your value_template

Related

Is there a way to intercept the query in nlp.js WebChat API server?

Just getting started with nlp.js, and I'd like to be able to test out some ideas with their Express API server package.
As far as I can tell, there's no way to "intervene" in the QnA bot exchange, for instance to format the output to contain the user's name, a time, or whatever.
Say my corpus was a tsv file with:
some question \t welcome, #name
How would I swap out that #name tag? Right now, I just get the string back exactly as is.
In the conf.json:
"api-server": {
"port": 3000,
"serveBot": true
}
Maybe there's pipeline logic to do that?
I can't seem to find much reference material out there on the available pipeline events or how to intercede in the WebChat flow.

500 error when using the shopping.flightDates.get endpoint

I'm using your API for a fun app I am developing and I just started using your endpoints. This particular endpoint gives me this responseError:
body: '{"errors":[{"status":500,"code":141,"title":"SYSTEM ERROR HAS OCCURRED","detail":"ORIGIN AND DESTINATION NOT ALLOWED FOR AMA4DEV EXTREME SEARCH REQUESTS ON ENVIRONMENT"}]}',
The endpoint I am hitting is:
amadeus.shopping.flightDates.get({
  origin: 'PHX',
  destination: 'MEX'
}).then(function(response) {
  console.log(response.data);
}).catch(function(responseError) {
  console.log(responseError.response);
});
To make sure it was not something with the auth token/secret, I made a test call using your example on GitHub, which works:
amadeus.shopping.flightDates.get({
  origin: 'MUC',
  destination: 'MAD'
}).then(function(response) {
  console.log(response.data);
}).catch(function(responseError) {
  console.log(responseError.response);
});
No problem in hitting that endpoint. Thank you again for looking into this
If you use the test environment: it is free of charge but limited (a limited number of API calls per month and a limited set of data, a subset of production data). For each API you can find the available data collection here.
For Flight Cheapest Date Search API, the test environment doesn't have data for PHX as origin.
I tried in production and it does return data. Please note that Flight Inspiration Search and Flight Cheapest Date Search are built on top of a pre-computed cache (in production). As they are inspirational APIs, we do not offer all origin-destination pairs, only the most searched ones worldwide. If you want the full list of origin-destination pairs (including smaller cities), you need to use the Flight Offers Search API.
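If I recall the Node SDK options correctly, switching to production is done when the client is constructed; a rough sketch (the key and secret are placeholders for your production credentials):

var Amadeus = require('amadeus');

// Point the client at the production environment instead of the default test one.
// The key and secret below are placeholders for your production credentials.
var amadeus = new Amadeus({
  clientId: 'YOUR_PRODUCTION_API_KEY',
  clientSecret: 'YOUR_PRODUCTION_API_SECRET',
  hostname: 'production'
});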

Wallet Pass auto update web service using aws api gateway

I am working on a web service to update Apple Wallet passes using AWS Lambda / API Gateway / Node.js. Apple Wallet hits the API to get the updated pass, but each time I get the following error:
encountered error: Received invalid pass data (The pass cannot be read because it isn’t valid.)
I have tried the same URL in the browser to get the pass. The pass downloads every time and opens as a valid pass every time, but it does not work when Apple Wallet hits the URL. I have also tried the same URL in Postman; it gives me base64 instead of binary data.
I have implemented the same functionality with Node.js deployed on Heroku, and it works properly with Wallet (and returns binary in Postman). But I need to use AWS Lambda / API Gateway / Node.js.
I am not sure if AWS is changing something while delivering the binary data.
Any help on this is appreciated.
I just experienced this and spent hours trying to diagnose what was happening.
For anyone using AWS API Gateway & Lambda for their PassKit web service endpoints, there's a major "gotcha" (at least as of the date of my response) with how API Gateway's logic determines whether it needs to convert a response from base64 ==> binary.
If you inspect the request headers from Apple Wallet / PassKit, you'll see that the Accept header is */*.
API Gateway apparently iterates through the items in the request Accept header and determines if there is a match with any of the Binary Media Types you've defined under Your API Name > Settings. It will use the first match it finds and then, as you'd hope, convert the base64 string (from Lambda) to binary.
Here's the crazy part -- if you define application/vnd.apple.pkpass as one of your "please convert to binary" media types, requests from Apple Wallet / PassKit will not work. Why? Well, AWS (for whatever reason...) hasn't programmed */* to match any type ... it will literally only match */*.
As a result, the Accept header's */* will not match with application/vnd.apple.pkpass and your base64-encoded .pkpass response (from Lambda) will not be converted to binary, causing PassKit to choke + report errors.
TL;DR -- there is some goofiness with AWS API Gateway. To return PassKit pass data successfully, you need to add */* (not application/vnd.apple.pkpass) under Your API Name > Settings > Binary Media Types.
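For completeness, here is roughly the shape of a Lambda proxy response returning a pass once */* is in place; a sketch, with buildPass() standing in for whatever generates your signed .pkpass buffer:

// Sketch of a Lambda proxy response for a .pkpass, assuming "*/*" is a configured binary media type.
// buildPass() is a hypothetical helper that returns the signed pass as a Buffer.
exports.handler = async (event) => {
  const passBuffer = await buildPass(event);

  return {
    statusCode: 200,
    headers: {
      'Content-Type': 'application/vnd.apple.pkpass'
    },
    // Lambda hands API Gateway a base64 string; API Gateway only converts it to binary
    // when the request's Accept header matches one of the configured binary media types.
    body: passBuffer.toString('base64'),
    isBase64Encoded: true
  };
};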

Can A Mobile Application use TrueVault to store JSON data without a "middleman" server?

I have been reading the documentation at https://docs.truevault.com/ but I am a little confused. I read this on the TrueVault site:
If you plan on using any of the server-side libraries, please ensure any hosting environment you use is HIPAA compliant.
I took this to mean that TrueVault could support a standalone (client-side only) mobile application architecture, where the TrueVault API was the only server-side interaction.
However my understanding of the documentation is that:
An API_KEY is required to register a new user.
Any API_KEY provides full access to all data vaults and JSON documents stored in TrueVault.
If both of these assumptions are correct, it would be impossible to register new users directly from the client-side app, forcing me to use a costly and resource-intensive HIPAA-compliant web server. The only way to get around this would be to hard-code the API_KEY into the app, an obvious no-go if that API_KEY can access all of my TrueVault data.
For my use case I have the following requirements for TrueVault for me to be able to consider using it (I would imagine these requirements are the same for anyone looking to develop a client side only healthcare application):
A user can sign up via the API directly from my client side app without requiring any sensitive keys or root auth data.
A user can authenticate using only the data they provided to sign up (username/email/password). My app is multi-platform; I can't ask them to remember their API keys to log in.
A user can Read/Write/Update/Delete data linked to their profile. They can not access any data from another user using their credentials.
Is TrueVault able to deliver these three basic requirements?
If the answer to this is "No", I would recommend you update the text on your website, as there are not going to be any viable HIPAA-compliant applications that can be supported by TrueVault without an independent server-side interface.
I'm currently using AWS Lambda as a solution. Lambda is HIPAA compliant; more info here. Lambda is also a low-cost solution.
Here is an example of the code I'm running on Lambda using Node.js.
var request = require('request-promise');
var _ = require('lodash');

function encodeBase64(str) {
  return Buffer.from(str).toString('base64');
}

var baseUrl = 'https://api.truevault.com/v1/';
var headers = {
  'Content-Type': 'application/x-www-form-urlencoded;charset=utf-8'
};

// Pre-configured client authenticated with the TrueVault API key (note the trailing colon).
var req = request.defaults({
  baseUrl: baseUrl,
  headers: _.extend({
    Authorization: 'Basic ' + encodeBase64('your api key:')
  }, headers),
  transform: function(body) {
    return JSON.parse(body);
  }
});

exports.handler = function(event, context) {
  req.post('users', {
    form: {
      username: event.email,
      password: event.password,
      attributes: encodeBase64(JSON.stringify({
        name: event.name
      }))
    }
  }).then(function(user) {
    context.succeed({user: user});
  }).catch(context.fail);
};
In general, you are correct - if you include zero server-side processing between user and TrueVault, then the API keys will be public. At least, I don't know of any way to avoid this.
That being said, it is incorrect to jump to "any API_KEY provides full access to all data vaults and JSON documents stored in TrueVault" - that's not the case if set up properly.
TrueVault API keys can be narrowed in scope quite a lot: one key limited to only Write permission on {Vault#1}, a second key to only Read permission on {Vault#2}, a third key allowed to upload Blobs to {Vault#1&#3}, a fourth for deleting information from {Vault#2}, and so on as needed, with quite a few variations. You can also limit permissions specifically to content "owned" by the API key (e.g. user-specific keys). Full documentation here.
There are also limited scope keys (set expiry time, usage count, limit to any of the prior permission scopes). Docs here.
TrueVault also offers user logins separate from API keys, which may be better suited if your users are logging in with credentials. Docs here.
I'm still rather figuring out TrueVault myself (at the time of writing, at least), so be sure to research and review more for your needs. I'm still torn on whether the limited scoping is "good enough" for my needs - I'm leaning towards using AWS Lambda (or similar) as a HIPAA-compliant middleman, if only to better hide my access-token generation, hide that my data is going to TrueVault, and add some "serverless-server-side" data validation of sorts.
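If the user-login route fits, my rough understanding of the flow (worth double-checking against their docs, as I may have the endpoint or field names slightly off) is that the client trades a username/password for a user access token and then authenticates subsequent requests with that token instead of an API key:

// Rough sketch only - verify the endpoint and field names against TrueVault's docs.
// ACCOUNT_ID is a placeholder for your TrueVault account id.
async function login(username, password) {
  const response = await fetch('https://api.truevault.com/v1/auth/login', {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: new URLSearchParams({
      account_id: 'ACCOUNT_ID',
      username: username,
      password: password
    })
  });

  const json = await response.json();
  // The returned access token is then used (in place of an API key) as the
  // Basic auth username on requests scoped to that user.
  return json.user.access_token;
}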

Updating a wiki page with the REST API

How do you update a SharePoint 2013 wiki page using the REST API?
Three permutations:
Reading an existing page (content only)
Updating an existing page
Creating a new page
For reading an existing page, of course I can just do a "GET" of the correct URL, but this also brings down all the various decorations around the actual data on the wiki page -- rather than fish that out myself, it would be better if there were a way to get just the content, if that is possible.
Are there special endpoints in the REST API that allow for any of these three operations on wiki pages?
As stated in GMasucci's post, there does not appear to be a clean or obvious way of instantiating pages through the REST API.
You can call the AddWikiPage method from the SOAP service at http://[site]/_vti_bin/Lists.asmx. This is an out of the box service that will be accessible unless it has been specifically locked down for whatever reason.
To read the content of a wiki page through the REST API, you can use the following endpoint:
https://[siteurl]/_vti_bin/client.svc/Web/GetFileByServerRelativeUrl('/page/to/wikipage.aspx')/ListItemAllFields
The content is contained within the wiki content field (internal name WikiField). You may want to add a select to that URL and return it as JSON to reduce the amount of data getting passed over, if that is a concern.
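For example, something along these lines, together with an Accept: application/json;odata=verbose header if you want JSON back:
https://[siteurl]/_vti_bin/client.svc/Web/GetFileByServerRelativeUrl('/page/to/wikipage.aspx')/ListItemAllFields?$select=WikiField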
As for updating the content of an existing wiki page, it is not something I have tried but I would imagine it's just like populating another field through the REST API. This is how I would expect to do it:
Do an HTTP POST to the same endpoint as above
Use the following HTTP headers:
Cookie = "yourauthcookie"
Content-Type = "application/json;odata=verbose"
X-RequestDigest = "yourformdigest"
X-HTTP-Method = "MERGE"
If-Match = "etag value from the entry node, returned from a GET to the above endpoint"
Post the following JSON body:
{
  "__metadata": { "type": "SP.Data.SitePagesItem" },
  "WikiField": "HTML entity coded wiki content goes here"
}
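Pulled together, a sketch of that update request in JavaScript might look like the following (I have not run this; the site URL, digest, and etag values are placeholders you would fetch first):

// Sketch of the MERGE update described above; the digest and etag values are placeholders.
var endpoint = "https://[siteurl]/_vti_bin/client.svc/Web/" +
  "GetFileByServerRelativeUrl('/page/to/wikipage.aspx')/ListItemAllFields";

fetch(endpoint, {
  method: 'POST',
  credentials: 'include', // reuse the existing auth cookie
  headers: {
    'Accept': 'application/json;odata=verbose',
    'Content-Type': 'application/json;odata=verbose',
    'X-RequestDigest': 'yourformdigest', // from a POST to /_api/contextinfo
    'X-HTTP-Method': 'MERGE',
    'If-Match': 'etag value from the earlier GET'
  },
  body: JSON.stringify({
    '__metadata': { 'type': 'SP.Data.SitePagesItem' },
    'WikiField': 'HTML entity coded wiki content goes here'
  })
}).then(function (response) {
  console.log('Update returned HTTP ' + response.status);
});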
The interim answer I have found is to not utilise REST, as it appears to not be:
fully documented
fully featured
supported across SharePoint 2013 and SharePoint Online in the same way
So my current recommendation would be to utilise the SOAP services to achieve the same, as these are more documented and easily accessible.
