How to identify whether a document follows the CDA or CCD format?

I'm working on retrieving data from clinical documents (CDA or CCD), and I want to identify whether a given document is a CDA or a CCD. I checked the MDHT Java API but didn't find anything related.

The new standard per Meaningful Use Stage 2 is C-CDA, and all EHR vendors must comply to remain "certified", so you really shouldn't be encountering many CCRs.
Also, I'm not sure why MDHT is being used when resources such as BlueButton+ (originally a VA project, now an initiative through the ONC) are far simpler; the JavaScript in their library will parse most forms of C-CDA. Check their resources: https://github.com/blue-button/bluebutton.js/
Check this resource as well: http://ccda-scorecard.smartplatforms.org/static/ccdaScorecard/#/

You will need to clarify your use of CDA and CCD (did you mean CCR?). CCD stands for Continuity of Care Document, of which there are two versions, the latest being part of the consolidated standard (C-CDA) from HL7. A CCD is a CDA.
MDHT has both implemented, and you can use CDAUtil.load on the document resource; MDHT returns a Java object of the document type, and you can then use instanceof to check which kind of document you have:
// Register the document model first so MDHT can instantiate the right subtype
// (assumption: CCDPackage for the original CCD; use ConsolPackage for C-CDA documents).
CCDPackage.eINSTANCE.eClass();
ValidationResult result = new ValidationResult(); // org.openhealthtools.mdht.uml.cda.util.CDAUtil.ValidationResult
ClinicalDocument clinicalDocument = CDAUtil.load(new FileInputStream("somedocument.xml"), result);
if (clinicalDocument instanceof ContinuityOfCareDocument) {
    // The document is a CCD; otherwise it is some other kind of CDA
}
Here is a simple document processing framework using MDHT https://www.projects.openhealthtools.org/integration/viewvc/viewvc.cgi/trunk/examples/org.openhealthtools.mdht.cda.processor/?root=mdht-models&system=exsy1002

Related

How to add collections in transformations when writing (creating) a document in MarkLogic

I wrote a transformation in XQuery which unquotes an XML string and inserts an element with its content. This works fine.
I also need to create a collection dependent on the root element of that element. I can't do this on new documents, as xdmp:document-add-collections() is not working. How do I add the collection to new documents in transformations?
Here is my server-side XQuery code:
xquery version "1.0-ml";
module namespace transform = "http://marklogic.com/rest-api/transform/smtextdocuments";
import module namespace mem = "http://xqdev.com/in-mem-update" at '/MarkLogic/appservices/utils/in-mem-update.xqy';

declare function transform:transform(
  $context as map:map,
  $params as map:map,
  $content as document-node()
) as document-node()
{
  let $uri := base-uri($content)
  let $doccont := $content/smtextdocuments/documentcontent
  let $newcont := xdmp:unquote($doccont)
  let $contname := node-name($newcont/*)
  let $result :=
    if (exists($content/smtextdocuments/content))
    then mem:node-replace($content/smtextdocuments/content, <content>{$newcont}</content>)
    else mem:node-insert-after($doccont, <content>{$newcont}</content>)
  let $log := xdmp:log($content)
  return (
    $result,
    xdmp:document-add-collections($uri, fn:string($contname)),
    xdmp:document-remove-collections($uri, "raw")
  )
};
The script is run by the Java API (4.0.4) create method via the ServerTransform transform parameter. Per the documentation, the transformation script runs before the document is stored in the database.
It's a new document; I need to transform the content and then create the collection.
I can see the document after the create, and the content is available; just the collection is missing. I could try the xdmp:document-insert method, but is it correct to write the document while the create is running?
The transform mechanism of the Java API / REST API takes responsibility for the document write. At present, there's no way for the transform to supply collections to the writer. That would be a reasonable request for enhancement.
The transform shouldn't attempt to write the document, because the writer would also attempt to write the same document.
One alternative would be to transform the document in Java before writing it and specify the collection as part of the write request (see the sketch just below).
Another alternative would be to rewrite the transform as a resource service extension, implement the write within the resource service extension, and modify the Java client to send the document to the resource service extension.
Depending on the model, a final alternative might be to use a range index on an element within the document to collect documents into sets, instead of using a collection on the document.
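As a rough illustration of that first alternative, here is a minimal sketch using the Java Client API (it assumes a DatabaseClient named client is already configured, and that transformedXml holds the already-transformed document; the URI and collection name are placeholders):
XMLDocumentManager docMgr = client.newXMLDocumentManager();
DocumentMetadataHandle metadata = new DocumentMetadataHandle()
    .withCollections("someCollection");               // collection supplied as part of the write
StringHandle content = new StringHandle(transformedXml); // document transformed in Java beforehand
docMgr.write("/smtextdocuments/doc1.xml", metadata, content);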
Hoping that helps,
What do you mean by "new documents"? Is the document already inserted into the MarkLogic database at the time you are adjusting its collections? If not, you may want to modify your return to ($result, xdmp:document-insert($uri, $result, xdmp:default-permissions(), fn:string($contname))) for that case.
Otherwise, can you edit your question to describe the error or problem you are facing more specifically?
It is a pity that REST transforms do not allow this, as MLCP transforms do. Until that changes, you have the options outlined by ehennum, or you can consider delaying the adding of collections to a pre- or post-commit trigger (sketched below). That takes some overhead, but it sometimes makes perfect sense to do something like this in a trigger, since it ensures the rule is always enforced, and a trigger is also a good place for content validation, audit logging, and the like.
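For instance, a minimal post-commit trigger module along these lines could add the collection after the write (a sketch: it assumes a document-create trigger has already been configured to invoke this module, and it derives the collection name from the root element, as the transform above does):
xquery version "1.0-ml";
import module namespace trgr = "http://marklogic.com/xdmp/triggers" at "/MarkLogic/triggers.xqy";
(: the trigger framework binds the URI of the document that fired the trigger :)
declare variable $trgr:uri as xs:string external;
let $rootname := fn:string(fn:node-name(fn:doc($trgr:uri)/*))
return xdmp:document-add-collections($trgr:uri, $rootname)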
HTH!

How to achieve security level 3 in FIWARE?

I am deploying the FIWARE security GEs (i.e., Wilma, AuthzForce, Keyrock) on my computer. Security level 2 (basic authorization) is working well, but now I need security level 3 (advanced authorization) using XACML.
Long story short, I want a tutorial on implementing security level 3. However, as far as I know, no tutorial or document about security level 3 exists.
For now, I create my policy with the PAP's API and change the 'custom_policy' option in config.js from 'undefined' to 'policy.js'. Then I create a 'policy.js' file in 'PEP/policies', but I haven't changed anything compared with its template file, because I don't know exactly what this code does. I think I should build the XACML request using the 'xml' variable, but the PEP gives me an error when I build the request that way and return that variable. Here is the error from the PEP:
Error: Root - Error in AZF communication <?xml version="1.0" encoding="UTF-8" standalone="yes"?><error xmlns="http://authzforce.github.io/rest-api-model/xmlns/authz/S" xmlns:ns2="http://www.w3.org/2005/Atom" xmlns:ns3="http://authzforce.github.io/core/xmlns/pdp/5.0" xmlns:ns4="http://authzforce.github.io/pap-dao-flat-file/xmlns/properties/3.6"><message>Invalid parameters: cvc-elt.1: Cannot find the declaration of element 'Request'.</message></error>
And here is my 'getPolicy' code (the XACML request) in policy.js. I just made a very simple request to check whether the response is Permit or not, because I wasn't sure what I was doing at that point:
exports.getPolicy = function (roles, req, app_id) {
var xml = xmlBuilder.create('Request', {
'xmlns': 'urn:oasis:names:tc:xacml:3.0:core:schema:wd-17',
'CombinedDecision': 'false',
'ReturnPolicyIdList': 'false'})
.ele('Attributes', {
'Category': 'urn:oasis:names:tc:xacml:1.0:subject-category:access-subject'});
So, can anyone give me any information about implementing security level 3?
Upgrade to Wilma 6.2 (bug fixes).
Reuse the code from lib/azf.js, which is known to work, and adapt the Request content to your needs. The variable there is misleadingly named XACMLPolicy, but don't be mistaken: it is an actual XACML request. That code uses the xml2json package to convert the JSON to XML, whereas you seem to use a different one, xmlbuilder maybe? You didn't paste the full code - where does this xmlBuilder variable come from? - so I'm just guessing.
If you are indeed using the xmlbuilder package and want to stick with it, note that in its example using namespaces, the xmlns attribute is set in a different way:
var xmlBuilder = require('xmlbuilder');
var xml = xmlBuilder.create('Request', { encoding: 'utf-8' })
.att('xmlns', 'urn:oasis:names:tc:xacml:3.0:core:schema:wd-17')
.att('CombinedDecision', 'false')
.att('ReturnPolicyIdList', 'false')
.ele('Attributes', {'Category': 'urn:oasis:names:tc:xacml:1.0:subject-category:access-subject'});
Maybe this makes a difference; I didn't check.
Also feel free to create an issue with your question on Wilma's GitHub to get help from the dev team. (I am not one of them, but we've worked together on AuthzForce integration.)
The error you are getting is really:
Invalid parameters: cvc-elt.1: Cannot find the declaration of element 'Request'.
This is a simple XML validation issue: you need to make sure that the XACML request you send declares the right namespace.
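For reference, a minimal XACML 3.0 request with the namespace declared would look roughly like this (the empty Attributes element mirrors the category used above; a real request would also carry subject, resource, and action attributes):
<?xml version="1.0" encoding="UTF-8"?>
<Request xmlns="urn:oasis:names:tc:xacml:3.0:core:schema:wd-17"
         CombinedDecision="false" ReturnPolicyIdList="false">
  <Attributes Category="urn:oasis:names:tc:xacml:1.0:subject-category:access-subject"/>
</Request>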
You'll see there is another question on this topic here.
Can you paste your XACML request so we can tell whether it is valid?

Passing sets of properties and nodes as a POST statement with KOA-NEO4J or BOLT

I am building a REST API which connects to a Neo4j instance, using the koa-neo4j library as the basis (https://github.com/assister-ai/koa-neo4j-starter-kit). I am a beginner with all these technologies, but thanks to some help from this forum I have the basic functionality working. For example, the code below allows me to create a new node with the label "metric" and set the name and dateAdded properties.
URL:
/metric?metricName=Test&dateAdded=2/21/2017
index.js
app.defineAPI({
method: 'POST',
route: '/api/v1/imm/metric',
cypherQueryFile: './src/api/v1/imm/metric/createMetric.cyp'
});
createMetric.cyp"
CREATE (n:metric {
name: $metricName,
dateAdded: $dateAdded
})
RETURN ID(n) AS id
However, I am struggling to see how to approach more complicated examples. How can I handle situations where I don't know beforehand how many properties will be added when creating a new node, or where I want to create multiple nodes in a single POST? Ideally I would like to pass something like JSON as part of the POST, containing all of the nodes, labels, and properties that I want to create. Is something like this possible? I tried using the Cypher query below and passing a JSON string in the POST body, but it didn't work.
UNWIND $props AS properties
CREATE (n:metric)
SET n = properties
RETURN n
Would I be better off switching to the Neo4j REST API instead of the Bolt protocol and the koa-neo4j framework? From my research I thought it was better to use Bolt, but I want a REST API as the middle layer between my front end and back end, so I am willing to change if this will be easier in the long term.
Thanks for the help!
Your Cypher syntax is bad in a couple of ways.
UNWIND only accepts a collection as its argument, not a string.
SET n = properties is only legal if properties is a map, not a string.
This query should work for creating a single node (assuming that $props is a map containing all the properties you want to store with the newly created node):
CREATE (n:metric $props)
RETURN n
If you want to create multiple nodes, then this query (essentially the same as yours) should work, but only if $prop_collection is a collection of maps (see the example payload after the query):
UNWIND $prop_collection AS props
CREATE (n:metric)
SET n = props
RETURN n
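For instance, a POST body along these lines would satisfy both queries above (a hypothetical payload; the property names are purely illustrative):
{
  "props": { "name": "Test", "dateAdded": "2/21/2017" },
  "prop_collection": [
    { "name": "Test1", "dateAdded": "2/21/2017" },
    { "name": "Test2", "dateAdded": "2/22/2017" }
  ]
}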
I too have faced difficulties when trying to pass complex types as arguments to Neo4j. This has to do with type conversions between JavaScript and Cypher over Bolt, and there is not much one can do except file an issue in the official Neo4j JavaScript driver repo; koa-neo4j uses the official driver under the hood.
One way to go about such scenarios in koa-neo4j is to use JavaScript to manipulate the arguments before they are sent to Cypher, via the preProcess lifecycle hook (see the sketch after these links):
https://github.com/assister-ai/koa-neo4j#preprocess-lifecycle
It is also possible to further manipulate the results of a Cypher query using the postProcess lifecycle hook:
https://github.com/assister-ai/koa-neo4j#postprocess-lifecycle
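For example, a preProcess hook could parse a JSON string into a real map before the query runs. This is only a sketch, under the assumption (per the README linked above) that the hook receives the request parameters and returns the possibly modified parameters:
app.defineAPI({
    method: 'POST',
    route: '/api/v1/imm/metric',
    cypherQueryFile: './src/api/v1/imm/metric/createMetric.cyp',
    preProcess: function (params) {
        // If the client sent props as a JSON string, turn it into a map
        // so that SET n = $props receives a map rather than a string.
        if (typeof params.props === 'string') {
            params.props = JSON.parse(params.props);
        }
        return params;
    }
});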

Incremental loading in Azure Mobile Services

Given the following code:
listView.ItemsSource =
App.azureClient.GetTable<SomeTable>().ToIncrementalLoadingCollection();
We get incremental loading without further changes.
But what if we modify the read.js server-side script, e.g. to use mssql to query another table instead? What happens to the incremental loading? I'm assuming it breaks; if so, what's needed to support it again?
And what if the query used the untyped version instead, e.g.
App.azureClient.GetTable("SomeTable").ReadAsync(...)
Could incremental loading be somehow supported in this case, or must it be done "by hand" somehow?
Bonus points for insights on how Azure Mobile Services implements incremental loading between the server and the client.
The incremental loading collection works by sending the $top and $skip query parameters (those are also sent when you do a query using the .Take and .Skip methods on the table). So if you want to modify the read script to do something other than the default behavior, while still keeping the table usable with an incremental loading collection, you need to take those values into account.
To do that, you can ask for the query components, which contain those values, as shown below:
function read(query, user, request) {
    var queryComponents = query.getComponents();
    console.log('query components: ', queryComponents); // useful to see all information
    var top = queryComponents.take;
    var skip = queryComponents.skip;
    // do whatever you want with those values, then call request.respond(...)
}
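For instance, honoring those values while querying a different table with mssql might look roughly like this (a sketch: 'OtherTable' and its 'id' ordering column are placeholders, and OFFSET/FETCH requires an ORDER BY clause):
function read(query, user, request) {
    var components = query.getComponents();
    var top = components.take || 50;  // page size requested by the client
    var skip = components.skip || 0;  // rows the client has already loaded
    var sql = 'SELECT * FROM OtherTable ORDER BY id ' +
              'OFFSET ? ROWS FETCH NEXT ? ROWS ONLY';
    request.service.mssql.query(sql, [skip, top], {
        success: function (results) {
            request.respond(statusCodes.OK, results);
        }
    });
}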
On the client side, this is implemented by a class which implements the ISupportIncrementalLoading interface. You can see it (and the full source code for the client SDKs) in the GitHub repository, or more specifically the MobileServiceIncrementalLoadingCollection class (the method is added as an extension in the MobileServiceIncrementalLoadingCollectionExtensions class).
The untyped table does not have that method; as you can see in the extension class, it's only added to the typed version of the table.

Critical Problem with SharePoint Timer Job Properties

A few minutes ago I tried to create a timer job.
I added some properties like:
this.Properties.Add("fileName", fileName);
this.Properties.Add("username", new NetworkCredential("username", "password"));
After updating the job, a critical error appeared in the Timer Job list of the Central Administration:
The platform does not know how to deserialize an object of type System.Net.NetworkCredential. The platform can deserialize primitive types such as strings, integers, and GUIDs; other SPPersistedObjects or SPAutoserializingObjects; or collections of any of the above. Consider redesigning your objects to store values in one of these supported formats, or contact your software vendor for support.
Now I'm unable to delete or retract the job with SPJobDefinition's Delete() method or other classes within the SP object model.
OK, I got it.
I deleted the corresponding object in the SharePoint config database's dbo.Objects table.
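For what it's worth, editing the config database directly is unsupported, so it is better to avoid getting into this state at all. Per the error text, the property bag only deserializes primitives, so one hedged workaround is to persist strings and rebuild the credential inside the job (a sketch: GetPasswordFromSecureStore is a hypothetical helper, since persisting a raw password in the property bag would be insecure):
// When scheduling the job: store only serializable primitives
this.Properties.Add("fileName", fileName);
this.Properties.Add("username", "username");

// Inside the SPJobDefinition subclass: rebuild the credential from the primitives
public override void Execute(Guid targetInstanceId)
{
    string fileName = (string)this.Properties["fileName"];
    string userName = (string)this.Properties["username"];
    NetworkCredential credential =
        new NetworkCredential(userName, GetPasswordFromSecureStore(userName)); // hypothetical helper
    // ... use fileName and credential to do the job's work ...
}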
