Use Salesforce API to extract data into Alteryx

I have an Alteryx workflow and want to hook it up to import data from Salesforce, specifically Veeva (which sits on Salesforce). I want to use the Salesforce API, but I'm not sure how I can do this simply with Alteryx.
Is it possible to use Alteryx with some other software/framework to import data and run it through my ETL process?
I've heard I can possibly use Apache Spark, but I'm not familiar with it. I've also heard I can possibly use Alteryx with Apache Camel, but I'm not sure about this either. Thanks!

You can find out how to connect to an API in Alteryx at this link:
https://community.alteryx.com/t5/Engine-Works-Blog/REST-API-In-5-Minutes-No-Coding/ba-p/8137
With the Salesforce API, it can sometimes be easiest to use the SOAP API for authentication and the REST API for the download itself. I'm not entirely sure why, but both Alteryx & Tableau do that behind the scenes for their connections.
Basically, you call out to the SOAP API for authentication, get the auth token, and use it on subsequent calls to the REST API. The above link should tell you how to implement that in Alteryx.
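As a sketch of that SOAP-then-REST flow outside Alteryx (e.g., for prototyping before wiring it into the Download tool), the following Python uses only the standard library. The API version in the URLs is an assumption, so check the version your org supports:

```python
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

LOGIN_ENVELOPE = """<?xml version="1.0" encoding="utf-8"?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:urn="urn:partner.soap.sforce.com">
  <soapenv:Body>
    <urn:login>
      <urn:username>{username}</urn:username>
      <urn:password>{password}</urn:password>
    </urn:login>
  </soapenv:Body>
</soapenv:Envelope>"""

def extract_session_id(soap_xml: str) -> str:
    """Pull the sessionId out of a SOAP login response, ignoring namespaces."""
    root = ET.fromstring(soap_xml)
    for elem in root.iter():
        if elem.tag.rsplit("}", 1)[-1] == "sessionId":
            return elem.text
    raise ValueError("no sessionId in response")

def login(username: str, password: str) -> str:
    # SOAP login endpoint; the API version (52.0) is an assumption.
    req = urllib.request.Request(
        "https://login.salesforce.com/services/Soap/u/52.0",
        data=LOGIN_ENVELOPE.format(username=username, password=password).encode(),
        headers={"Content-Type": "text/xml; charset=utf-8", "SOAPAction": "login"},
    )
    with urllib.request.urlopen(req) as resp:
        return extract_session_id(resp.read().decode())

def query(instance_url: str, session_id: str, soql: str) -> bytes:
    # Subsequent REST calls carry the session id as a Bearer token.
    req = urllib.request.Request(
        f"{instance_url}/services/data/v52.0/query/?q={urllib.parse.quote(soql)}",
        headers={"Authorization": f"Bearer {session_id}"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()
```

In Alteryx itself, the same two calls map onto a pair of Download tools: one POSTing the SOAP envelope, one issuing the REST query with the extracted token.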
As for other software/frameworks for import, the simple answer is yes. The tools to look at are the R Tool and the Run Command Tool. They let you import data using an R script or from the command line (allowing Python, JavaScript, batch scripts, etc.).
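For example, a script launched from the Run Command Tool can fetch records from an external source and write a CSV that a downstream Input Data tool picks up. A minimal Python sketch (the field names and file name are made up):

```python
import csv

def write_for_alteryx(rows, path):
    """Write fetched records to a CSV file that an Alteryx Input tool can read."""
    if not rows:
        raise ValueError("nothing to write")
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0]))
        writer.writeheader()
        writer.writerows(rows)

# Example: records as they might come back from any external API.
records = [
    {"Id": "001", "Name": "Acme"},
    {"Id": "002", "Name": "Globex"},
]
write_for_alteryx(records, "salesforce_extract.csv")
```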
Spark is supported in Alteryx both natively and through the In-DB tools. Theoretically you could use Alteryx with Apache Camel, but I don't know enough about the specifics of Camel's endpoints to say that with certainty.

Related

Fetching Data from power BI XMLA Endpoints from Linux

Can anyone help me fetch data from a Power BI XMLA endpoint without using PowerShell? I want to know a way of fetching it directly on Linux.
I know PowerShell can be installed on Linux, but is there any way I can skip that and fetch the data directly?
reference - https://learn.microsoft.com/en-us/power-bi/admin/service-premium-connect-tools
Your Power BI XMLA endpoint is accessible through your Azure Analysis Services (AAS) instance tied to the given datasource/workspace, which means that you should be able to connect to that AAS instance and work with the data there via the web. I am not aware of any currently available Linux compatible tools that allow this. I did a bit of research and was surprised to find that there was not a VS Code extension that allowed this (might have to get to work on that ;)).
That being said, Microsoft has several different client libraries (for both AMO and ADOMD.NET) built on their .NET Core framework that could theoretically be used by a client application built for a supported Linux OS (Microsoft doc here). In other words, (again, theoretically) it should be relatively painless to build a simple tool for a supported Linux OS that takes in XMLA commands and executes them over a provided connection.
EDIT: Another good option to consider might be Microsoft's Power BI REST API (documentation here). If the functionality you are looking for is available within the REST API, you should be able to write a client tool targeting Linux (using one of many different stacks; .NET Core is still an option) that uses the API for your Power BI instance in place of the XMLA endpoint directly. I would consider this the better alternative. It is a less 'Microsoft-y' way of doing this, and it is going to be much easier to maintain and develop over time. I would start by confirming that the functionality you want is available in this API.
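As a sketch of that REST-API route, here is a minimal Python client using only the standard library. The token scope and endpoint shapes follow Microsoft's published conventions, but verify them against the linked documentation for your tenant before relying on them:

```python
import json
import urllib.parse
import urllib.request

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def powerbi_url(resource: str, group_id: str = None) -> str:
    """Build a Power BI REST API URL, scoped to a workspace (group) if given."""
    if group_id:
        return f"{API_ROOT}/groups/{group_id}/{resource}"
    return f"{API_ROOT}/{resource}"

def get_token(tenant_id, client_id, client_secret):
    # Client-credentials flow against Azure AD; the scope below is the
    # standard Power BI resource scope, but confirm it for your tenant.
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "https://analysis.windows.net/powerbi/api/.default",
    }).encode()
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    with urllib.request.urlopen(urllib.request.Request(url, data=body)) as resp:
        return json.load(resp)["access_token"]

def list_datasets(token, group_id=None):
    """List datasets the service principal can see, optionally in one workspace."""
    req = urllib.request.Request(
        powerbi_url("datasets", group_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["value"]
```

The same three-step shape (build URL, get AAD token, call with Bearer header) works from any language with an HTTP client, which is what makes this route Linux-friendly.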
EDIT: After reading further in above linked document regarding AMO and ADOMD.NET client libraries:
TCP based connectivity is supported for Windows computers only.
Interactive login with Azure Active Directory is supported for Windows computers only. The .NET Core Desktop runtime is required.
So it looks like there are currently some limitations to these libraries regarding a Linux runtime. I am not positive that you could use something other than TCP based connectivity to accomplish this, but if I find a way (or someone is able to suggest something), then I will update.

In DataFactory, is there a solution to load data into Dynamics365 using the web API?

I need to upload records into Dynamics 365 using an Azure Data Factory pipeline.
Because of some requirements, I need to load the data through the web API.
I thought of using Functions/Durable Functions. They can use the API with no problem, but I'm concerned about execution duration, since I understand these activities are not meant for long runs such as downloading/loading data.
Is it really wrong to use Functions like this?
Is there an alternative?

Test the behavior of a java web service for multiple concurrent requests

How do I test the behavior of a java restful web service in case of multiple concurrent requests? Is there any 3rd party tool that can be leveraged?
The service accepts the POST method. It expects a couple of parameters in its request body and produces a JSON response.
The functionality of the service is to perform database read operations using the request body parameters and populate the fetched data in the JSON.
I would recommend one of the following:
SoapUI - a superior tool for web service testing, though with limited load testing capabilities: it does not scale (no clustered mode is available) and reporting is quite poor (all you get are average, min, and max response times).
Apache JMeter - a multiprotocol load testing tool that supports web service load testing as well. It has better load capabilities, more ways to define load patterns, and can present load test results via the HTML Reporting Dashboard. Check out the Testing SOAP/REST Web Services Using JMeter article to learn how to conduct a web service load test using JMeter.
You can try Gatling to generate some load.
It has nice documentation and an easy quickstart.
For advanced usage it requires some knowledge of Scala, but it also features a GUI tool for recording simple scenarios, so you can run some requests with Postman or whatever tool you use for debugging, record them, and have the scenario automated.
After running scenarios it generates nice reports using Graphite, so you can see response times and general stats.
Later you can also use Gatling for load and performance tests of your web service; it's convenient and fast once you start playing with it. It can easily generate up to 5k requests per second from my old Mac, or hold up to 1k connections.
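If you just want a quick scripted smoke test of concurrent behavior before committing to one of these tools, a minimal Python sketch (the URL and payload in the usage line are placeholders):

```python
import json
import urllib.request
from concurrent.futures import ThreadPoolExecutor

def post_json(url, payload):
    """Send one POST with a JSON body and return (status, parsed response)."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status, json.load(resp)

def run_concurrent(fn, n_requests, n_workers=10):
    """Fire fn() n_requests times across n_workers threads; collect results in order."""
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        futures = [pool.submit(fn) for _ in range(n_requests)]
        return [f.result() for f in futures]

# Usage against a real service (placeholder URL/payload):
# results = run_concurrent(lambda: post_json("http://localhost:8080/api", {"id": 1}), 100)
```

This only checks behavior (correct responses under overlap), not throughput; for real load numbers and reports, use JMeter or Gatling as described above.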
One of the best tools to test web services is SoapUI.
You can use it for what you want.
Link to SoapUI
You can check this link to see how to use SoapUI for concurrent tests.

will restAssured work with rest input parameters coming from excel sheet? Has anyone tried it?

My boss asked me to find or design a tool for testing REST services hosted in a cloud environment. He also asked whether I can read the input data from an Excel sheet so that other junior members can write tests in Excel. I have created a keyword-driven framework using Apache POI, so I know how to read data from Excel in a Java program. I have also worked with HttpClient, so I can tie these together. But I am hearing a lot about REST Assured and want to know whether I can use REST Assured with input read from Excel. Is it worth spending time on? Also, with cloud infrastructure, which approach is best? Thanks.
You can safely proceed with your boss's request: there is no coupling between the Excel input handling and the REST Assured framework, as they are separate jars.
You can read test inputs/outputs from Excel (e.g., with Apache POI) and use them in REST Assured. REST Assured provides an easy-to-implement REST testing DSL with a BDD flavor.
The following link gives you a quick getting started tutorial.
http://www.hascode.com/2011/10/testing-restful-web-services-made-easy-using-the-rest-assured-framework/
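To illustrate the data-driven pattern itself (rows of inputs driving requests and expected results), here is a language-neutral Python sketch; in your Java framework the same shape applies, with Apache POI supplying the rows and REST Assured's given/when/then making the calls. The column names and the stub transport below are made up, and the sheet is shown as CSV for self-containment (reading a real .xlsx needs a library like POI or openpyxl):

```python
import csv
import io

def load_cases(csv_text):
    """Each row supplies request parameters plus the expected response code."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def run_case(case, send):
    """send(params) -> status code; returns True when the case passes."""
    params = {k: v for k, v in case.items() if k != "expected_status"}
    return send(params) == int(case["expected_status"])

# Sheet exported as CSV (hypothetical columns):
sheet = """endpoint,user_id,expected_status
/users,42,200
/users,9999,404
"""

# Stub transport standing in for the real HTTP call:
def fake_send(params):
    return 200 if params["user_id"] == "42" else 404

results = [run_case(c, fake_send) for c in load_cases(sheet)]
```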

Fetching data from cerner's EMR / EHR

I don't have much experience in the medical domain.
We are evaluating a requirement from our client, who uses the Cerner EMR system.
As per the requirement, we need to expose the Cerner EMR or fetch some EMR/EHR data and display it in a SharePoint 2013 portal.
What kind of integration options does Cerner propose to meet this requirement? Are there any APIs or web services exposed that can be used to build custom solutions?
As far as I know, Cerner exposes EMR/EHR information in HL7 format, but I don't have any idea how to access it.
I have also asked Cerner about this and am awaiting a reply.
If anybody who has worked on a similar kind of job can throw some light on this and provide me with some insights, that would help.
You will need to request an interface between your organization and the facility with the EMR. An interface in the health care IT world is not the same as a GUI. It is the mechanism (program/tool) that transfers HL7 data between one entity and the other. There will probably be a cost to have an interface set up. However, that is the traditional way Cerner communicates with third parties. HIPAA laws will require that this connection be very secure.
You might also see if the facility with the EMR has an existing interface that produces the info you are after. You may be able to share that data or have a flat file generated from that interface that you could get access to. Because of HIPAA regulations, your client may be reluctant to share information in that manner.
I would suggest you start with your client's interface/integration team. They would be the ones that manage the information into and out of Cerner. They could also shed some light on how they prefer to see things done.
Good Luck
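For context, the HL7 data such an interface carries is (in HL7 v2) pipe-delimited text, one segment per line. A minimal, non-production parsing sketch with a made-up, abbreviated message (real messages vary, and production code should use a proper HL7 library):

```python
def parse_hl7(message: str) -> dict:
    """Split an HL7 v2 message into {segment_id: [field_lists]}.

    Segments are separated by carriage returns and fields by '|'. This
    ignores component/repetition separators -- enough to show the
    structure, not a production parser.
    """
    segments = {}
    for line in message.strip().replace("\r", "\n").splitlines():
        if not line:
            continue
        fields = line.split("|")
        segments.setdefault(fields[0], []).append(fields[1:])
    return segments

# Abbreviated, made-up example message:
msg = ("MSH|^~\\&|SENDER|FAC|RECEIVER|FAC|202401010830||ADT^A01|123|P|2.3\r"
      "PID|1||555-44-3333||SMITH^JOHN")
parsed = parse_hl7(msg)
```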
There are two ways of achieving this, as far as I know. One is direct connectivity to Cerner's Oracle database. This seems unlikely to be possible, as Cerner doesn't allow other vendors direct access to their database.
The other way is to use Cerner's mPage web services. We have achieved this using mPage web services. The client needs to host the web services on IBM WAS or some other container; we used WAS as it was readily available to us. Once the hosting is done, you get a URL, and using that you can execute any CCL program, which will return the data in JSON/XML format. The mPage web service uses basic HTTP authentication.
Now, CCL has to be written in a way which can return you the data you require.
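Consuming the hosted service from your side then reduces to an HTTP request with basic authentication. A minimal Python sketch (the URL shape is hypothetical; it depends on where the client hosts the mPage web services):

```python
import base64
import json
import urllib.request

def basic_auth_header(user: str, password: str) -> str:
    """RFC 7617 Basic credentials: base64 of "user:password"."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return f"Basic {token}"

def call_mpage(url: str, user: str, password: str):
    # URL is a placeholder -- it comes from wherever the web services
    # were hosted (e.g., on WebSphere) and which CCL program it runs.
    req = urllib.request.Request(url, headers={
        "Authorization": basic_auth_header(user, password),
        "Accept": "application/json",
    })
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```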
We have a successful setup and have been working with this since 2014. For more details you can also try the uCern portal.
Thanks,
Navin