Using my own API in Alloy

Just wondering how I can use my own API in Alloy. I've developed an API in Alloy, but I don't know how to use it.
regards
Moody

What do you mean by using "your own API"?
I assume you've developed some modules and predicates in Alloy that you wish to call from the Java API. In that case, I don't think you can do that directly through the API. Your best bet is to write Alloy expressions as plain strings in Java, parse them (e.g., using CompUtil.parseOneExpression_fromString), and then evaluate them (e.g., using A4Solution.eval, provided that you've already obtained an A4Solution instance).
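A minimal sketch of that workflow, assuming the Alloy 4 jar is on the classpath, a file `model.als` exists with at least one run command, and the model is satisfiable (the file name is a placeholder, and package paths can differ between Alloy releases):

```java
import edu.mit.csail.sdg.alloy4.A4Reporter;
import edu.mit.csail.sdg.alloy4compiler.ast.Command;
import edu.mit.csail.sdg.alloy4compiler.ast.Expr;
import edu.mit.csail.sdg.alloy4compiler.parser.CompModule;
import edu.mit.csail.sdg.alloy4compiler.parser.CompUtil;
import edu.mit.csail.sdg.alloy4compiler.translator.A4Options;
import edu.mit.csail.sdg.alloy4compiler.translator.A4Solution;
import edu.mit.csail.sdg.alloy4compiler.translator.TranslateAlloyToKodkod;

public class EvalExample {
    public static void main(String[] args) throws Exception {
        A4Reporter rep = new A4Reporter();

        // Parse the model that defines your predicates/functions
        CompModule world = CompUtil.parseEverything_fromFile(rep, null, "model.als");

        // Solve the first command to obtain an instance
        Command cmd = world.getAllCommands().get(0);
        A4Solution sol = TranslateAlloyToKodkod.execute_command(
                rep, world.getAllReachableSigs(), cmd, new A4Options());

        if (sol.satisfiable()) {
            // Your "API call" is a plain string: parse it, then evaluate it
            // against the instance
            Expr expr = CompUtil.parseOneExpression_fromString(world, "univ");
            System.out.println(sol.eval(expr));
        }
    }
}
```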

Related

Library function that implements TPM2_MakeCredential

For remote attestation using a TPM, on the server side I need the TPM2_MakeCredential function. Note that this function is implemented in the TPM, but it is a bit of an outlier there: it doesn't depend on any TPM state and is completely stateless (unlike the TPM2_ActivateCredential function, to be run on the client side, which critically depends on TPM keys). According to the documentation, it's provided as a convenience. That's fine, but the problem is that the server doesn't have (nor requires) a TPM, and I still want to use the TPM2_MakeCredential function.
Unfortunately, I haven't been able to find a library implementation of this function. There's a full-blown TPM 2.0 emulator provided by Microsoft that the TPM.MSR libraries can interface to. This works, but it requires starting and managing the simulator process, which sets up sockets etc., and I would rather avoid that. I am wondering if there's a pure C/C++/C# implementation provided as a library? I have been working with various solutions, but the function is not trivial to re-implement, and it's also not trivial to extract from the simulator.
It turns out the TPM.MSR library itself exposes this functionality (implemented purely in the library, not relying on a TPM) via the CreateActivationCredentials() function on TpmPublic.

MPS Typesystem Querying Databases/Issuing API calls

I'm using MPS to implement a really interesting DSL. What I'm curious to know, and haven't found anything on their site about, is whether I can, within the typesystem, issue an API call or query a database on the fly. So we would see that an equivalence test occurs and I want to be able to issue an API call or database query to see the feasibility of that equivalence test under further constraints and analysis.
You can call any Java code, but it will probably make the editor (and possibly other parts of MPS) unresponsive, since asynchronous calls are not supported.

AWS lambda like execution of Haskell functions

In AWS Lambda, people can create a node.js function and trigger it through events, for example a message.
I wonder how this works 'under the hood' and how to put something like this together in Haskell. The uploaded functions are basically single-function libraries without any main function.
This means that on the CLI, or via an API, you can call any of your functions by name (and with the required input), and you get the output defined by the function's signature, or alternatively an error, of course.
Would it be possible to do this in Haskell?
To clarify: what I want to do is, for example, load a number of different single-function Haskell libraries onto a Haskell platform (or any other execution context in my data center) and execute/call them by name via the CLI or an API, just the way AWS Lambda works with node.js functions.
If you want to reproduce the same functionality (functions as services), there are a lot of technologies you can use (SOAP, RPC, REST, ...). If you're unfamiliar with them, I suggest you read up on them.
My personal favorite is SOAP, but it is quite unpopular, and Haskell support for it is limited to the soap package (see the related question). SOAP (like the others) provides exactly what you want.
But you'll need to pin down your actual problem to select the best technology.
If you are looking to distribute your own Haskell code, Cloud Haskell may be a good starting point.
If you need something more like a "web server", then take a look at sodium, elm, ... (in a JavaScript style) or servant (which can generate client code for several languages).
Anyway, even AWS Lambda requires support for each language, and you should not expect to find one ubiquitous technology (like HTTP) for RPC.
You may want to check out the "serverless for haskell" framework: http://qmu.li
Not only can you run individual Haskell functions as Lambda functions with it, but you can also describe your whole AWS infrastructure in Haskell (vs. doing it in a CloudFormation JSON/YAML template), build it all locally, and easily deploy to AWS.
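Whatever the host language, the core mechanism behind "call any of your functions by name" is a registry that maps names to implementations, with the CLI/API layer dispatching on the name. A minimal sketch of that idea, shown in Java for concreteness (all names here are illustrative):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class FunctionRegistry {
    // Registry mapping function names to implementations
    private final Map<String, Function<String, String>> fns = new HashMap<>();

    public void register(String name, Function<String, String> f) {
        fns.put(name, f);
    }

    // Invoke a function by name, as a Lambda-style runtime would on an API call
    public String invoke(String name, String input) {
        Function<String, String> f = fns.get(name);
        if (f == null) {
            throw new IllegalArgumentException("unknown function: " + name);
        }
        return f.apply(input);
    }

    public static void main(String[] args) {
        FunctionRegistry r = new FunctionRegistry();
        r.register("toUpper", s -> s.toUpperCase());
        r.register("reverse", s -> new StringBuilder(s).reverse().toString());
        System.out.println(r.invoke("toUpper", "hello")); // prints HELLO
    }
}
```

In a real deployment the registry would be populated by loading uploaded libraries (e.g., via the GHC API or dynamic linking on the Haskell side), and `invoke` would sit behind the CLI or HTTP endpoint.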

Providing documentation with Node/JS REST APIs

I'm looking to build a REST API using Node and Express and I'd like to provide documentation with it. I don't want to craft this by hand and it appears that there are solutions available in the forms of Swagger, RAML and Api Blueprint/Apiary.
What I'd really like is to have the documentation auto-generate from the API code as is possible in .NET land with Swashbuckle or the Microsoft provided solution but they're made possible by strong typing and reflection.
For the JS world, it seems like the correct option is to use the Swagger/RAML/Api Blueprint markup to define the API and then generate the documentation and scaffold the server from that. The former seems straightforward, but I'm less sure about the latter. What I've seen of the server code generation for all of these options seems very limited. There needs to be some way to separate the auto-generated code from the manual code so that the definition can be updated easily, and I've seen no sign of or discussion about that. It doesn't seem like an insurmountable problem (I'm much more familiar with .NET than JS, so I could easily be missing something), and there is mention of this issue, and of solutions being worked on, in a previous Stack Overflow question from over a year ago.
Can anyone tell me if I'm missing/misunderstanding anything and if any solution for the above problem exists?
The initial version of swagger-node-express did just this: you would define some metadata for the routes, models, etc., and the documentation would be auto-generated from it. Given how dynamic JavaScript is, this became a bit cumbersome for many to use, as it required you to keep the metadata up to date against the models in a somewhat decoupled manner.
Fast forward, and the latest swagger-node project takes an alternative approach, which can be considered in line with "generating documentation from code" in a sense. This project (along with swagger-inflector for Java and connexion for Python) takes the approach that the swagger specification is the DSL for the API, and the routing logic is handled by what is defined in the swagger document. From there, you simply implement the controllers.
If you treat the swagger specification "like code", then this is a very efficient way to go: the documentation can literally never be out of date, since it is used to construct all routes, validate all input variables, and connect the API to your business layer.
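As an illustration of that spec-first style, here is a minimal fragment of the kind of swagger document swagger-node routes from (the controller and operation names are made up; `x-swagger-router-controller` is the vendor extension the router uses to find the controller module):

```yaml
swagger: "2.0"
info:
  title: Example API
  version: "1.0"
paths:
  /hello:
    get:
      # swagger-node routes this operation to controllers/hello.js
      x-swagger-router-controller: hello
      # ...and calls the exported function named by operationId
      operationId: sayHello
      parameters:
        - name: name
          in: query
          type: string
      responses:
        "200":
          description: A greeting
```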
While true code generation, such as what is available from the swagger-codegen project, can be extremely effective, it does require some clever integration with your code after you initially construct the server. That consideration is completely removed from the workflow with the three projects above.
I hope this is helpful!
My experience with APIs and dynamic languages is that the emphasis is on verification rather than code generation.
For example, when using a compiled language, I generate artifacts from the API spec and use them to enforce correctness. Round-tripping is supported via the generation of interfaces instead of concrete classes.
With a dynamic language, the spec is used at test time to guarantee both that the entire defined API is covered by tests and that the responses conform to the spec (I tend not to validate requests, because of Postel's law, but that is possible too).

What is the best way to expose Cassandra REST API to web?

I would like to work with Cassandra from a JavaScript web app using a REST API.
The REST API should support the basic commands for working with the DB: create table, and select/add/update/remove items. It would be perfect to have something similar to the OData protocol.
P.S. I'm looking for a library or component. Java is most preferred.
Staash solution looks perfect for the task - https://github.com/Netflix/staash
You can use the DataStax drivers. I used them via Scala, but you can use Java. A Session object is a long-lived object and should not be used in a short-lived request/response fashion, but it's up to you.
Ref: rules when using the DataStax drivers.
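A sketch of that long-lived Session pattern with the DataStax Java driver (3.x-era API; the contact point and keyspace names are placeholders):

```java
import com.datastax.driver.core.Cluster;
import com.datastax.driver.core.ResultSet;
import com.datastax.driver.core.Session;

public class CassandraDao {
    private final Cluster cluster;
    // Session is thread-safe and long-lived: create it once at startup
    // and reuse it across requests, never per request/response.
    private final Session session;

    public CassandraDao(String contactPoint, String keyspace) {
        this.cluster = Cluster.builder().addContactPoint(contactPoint).build();
        this.session = cluster.connect(keyspace);
    }

    public ResultSet selectAll(String table) {
        return session.execute("SELECT * FROM " + table);
    }

    // Close only on application shutdown
    public void close() {
        session.close();
        cluster.close();
    }
}
```

A REST layer (e.g., a Spring controller) would hold one CassandraDao instance and delegate each request to it.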
There is no "best" language for REST APIs, it depends on what you're comfortable using. Virtually all languages will be able to do this reasonable well, depending on your skill level.
The obvious choice is probably java, because cassandra's written in java, the java driver from Datastax is well supported, and because it's probably pretty easy to find some spring REST frameworks to do what you want. Second beyond that would be python - again, good driver support and REST frameworks with things like django or flask+potion. Ruby driver isn't bad, lots of ruby REST APIs out there, too.
