How to run example SystemML code on, e.g., IBM Watson Studio

I'm trying to follow the guide
https://towardsdatascience.com/how-to-train-your-neural-networks-in-parallel-with-keras-and-apache-spark-ea8a3f48cae6#
to try out Keras with SystemML and Spark.
However, I could not find the free Spark plan mentioned there on IBM Watson. Can anyone help me find it, or is it just not available anymore?
Thanks!

In IBM Watson Studio (a free account is enough), go to your projects (or create one). Inside the project, open the "Environments" tab and either pick an existing environment or create a new one; there you can choose the Spark environment described at the URL you mentioned.

Thanks!
Actually, that is exactly what I was trying. I was just confused because it says it will consume capacity units per hour, while the blog said it would be free.
I have since found out that you get 50 capacity units free each month, so I can try the code.

Related

SAP Cloud SDK CI/CD Pipeline: Usage with non-S/4 Services

I am using SAP Cloud SDK (Java flavour) to create an extension application of SuccessFactors.
I sadly discovered that the Jenkins pipeline does not allow me to use any service other than the ones listed here: SCN Blog (scroll to the Appendix).
This does not make much sense to me, since the SDK can now be used - and SAP actively promotes this - also with the SaaS products in its ecosystem, SuccessFactors being one of them.
Any hint? Can this check be somehow "bypassed"?
Thanks,
Roberto.
Please note that the blog post is quite old. Have you verified your assumption that it does not work with the SuccessFactors API?
Nonetheless, we recently introduced a configuration option which allows you to disable certain checks, cf https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/configuration.md#s4sdkqualitychecks
checkServices is what you would want to disable in your scenario.
As stated by Florian in the comment, and following the Project Piper documentation, the parameters "checkServices" and "customODataServices" can be used to customize the behavior of the pipeline when running against a non-Business Hub API.
"checkServices: false" will deactivate the check completely, whereas "customODataServices: [ yourApiName ]" will skip the check only for the specified services.

Post Service call with Multiple records

I would like to know how to post multiple records to SAP using "BatchRequestBuilder" along with a ChangeSet. I am using a custom OData service call (ODataCreateRequestBuilder), not the VDM model. I didn't find any blog or documentation to start with.
Can you please help me in this regard.
Updated:
Below is what I am trying to post to SAP
[{"purchaseSchAgrmntNo":"","customerMaterialNumber":"","plant":"","vendorNo":""},{"purchaseSchAgrmntNo":"","customerMaterialNumber":"","plant":"","vendorNo":""}]
SAP SDK version : 3.9.0
I have added the code below with only one create request.
ChangeSet changeSet = new ChangeSetBuilder()
        .addCreateRequest(ODataCreateRequestBuilder
                .withEntity(sapConfig.getServiceUrlRepriceList(), sapConfig.getEntityRepriceList())
                .withBodyAsMap(responseBody)
                .build())
        .build();
BatchResult batchResult = BatchRequestBuilder.withService("URL?")
        .addChangeSet(changeSet)
        .build()
        .execute(httpClient);
Can you let me know if this is correct? Also, let me know what I have to pass to withService - is it the service URL?
Thanks,
Arun Pai
The BatchRequestBuilder is actually not directly part of the SAP Cloud SDK but a dependency that the SDK internally uses to execute batch requests. That is why on the SDK level there is no documentation on how to use it.
Roughly, a batch request comprises multiple change sets, which in turn group together multiple operations. The ChangeSetBuilder allows you to build up change sets which you can then pass to a BatchRequestBuilder.
So if you want to run create requests in batch mode you would want to leverage public ChangeSetBuilder addCreateRequest(ODataCreateRequest oDataCreateRequest).
You can take a look at how the SAP Cloud SDK uses these classes to build up batch requests to get an idea of how it works in detail. As a starting point, look at BatchFluentHelperBasic. However, unless the service you want to query is unknown at compile time, I recommend that you use the generator to generate this code, so that you can work with the VDM instead, which simplifies this.
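To make this more concrete for the question above, here is a minimal sketch that groups two create requests into one change set and sends them as a single batch, reusing the classes from the posted snippet. The service path, entity set name, and the two record maps are placeholders, the exact package names depend on the SDK version, and withService most likely expects the OData service root path rather than a full host URL:

  // Sketch only: placeholder service root, entity set, and record maps.
  String servicePath = "/sap/opu/odata/sap/ZREPRICE_SRV";  // hypothetical service root
  String entitySet   = "RepriceListSet";                   // hypothetical entity set

  Map<String, Object> firstRecord  = new HashMap<>();  // fill with purchaseSchAgrmntNo, plant, ...
  Map<String, Object> secondRecord = new HashMap<>();  // fill with the second record's fields

  // One change set holding both create operations.
  ChangeSet changeSet = new ChangeSetBuilder()
          .addCreateRequest(ODataCreateRequestBuilder
                  .withEntity(servicePath, entitySet)
                  .withBodyAsMap(firstRecord)
                  .build())
          .addCreateRequest(ODataCreateRequestBuilder
                  .withEntity(servicePath, entitySet)
                  .withBodyAsMap(secondRecord)
                  .build())
          .build();

  // Execute the batch against the same service root.
  BatchResult batchResult = BatchRequestBuilder
          .withService(servicePath)
          .addChangeSet(changeSet)
          .build()
          .execute(httpClient);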
If you extend your question to hold more specific information on what you actually want to achieve I can expand my answer to give a more concrete example. Also please include the SDK version you are using.

IBM Natural Language Processing Projects (Beginner getting started question)

I've been digging into the IBM Cloud services, Watson and NLP. I just installed the CLI and tried the Node SDKs and a starter kit, but unfortunately I did not manage to get a default sample running to understand how it works.
After that, I did some research to get a broader view of how I could actually use some of their free services to get started, but the information is rather vague; even though the IBM docs are pretty extensive and well written, it can get very confusing.
I would appreciate any open source repo, or working/live project, that you are willing to share to give me a better picture of the IBM Cloud services.
A few days ago I wrote a sample application using the Natural Language Understanding service. Check the source code here: https://github.com/watson-developer-cloud/natural-language-understanding-code-pattern
The README has instructions on how to get the apikey, which is what you will use to authenticate your API calls.
Since you are using Node.js you can start with the sample above and also look at this page: https://cloud.ibm.com/apidocs/natural-language-understanding/natural-language-understanding?code=node which includes examples for all the features in Node.js using the node-sdk: https://github.com/watson-developer-cloud/node-sdk/

Is there any tool replacement for SONAR for .net code quality and generate report from it?

I have a Visual Studio solution, which is designed using C# 4.0.
I want to check the code quality of my solution and generate a report from it.
I tried FxCop and I also got a report, but I need a report something like the one in the image.
There the rules compliance is 85%, but FxCop only showed me the critical, error, etc. categories.
I was not even able to deploy my project into Sonar because I had a timeout issue coming from one of the projects in the solution.
Please, someone help me.
Thanks in advance.
Regards,
Roopini
I don't know if there's an equivalent of SonarQube for .NET projects, but if you really want such reporting (which I can understand, obviously!), you should rather ask questions on how to resolve your installation issue for SonarQube instead of searching for something else. There are plenty of organizations where big .NET solutions are successfully analyzed with SonarQube and the C# plugins, so there's no reason why it can't work for you!
You can find useful material on the net to help you on this. For instance, a blog post written by John M Wright about "setting up SonarQube for C# projects". John periodically updates his post, so the information should still be very relevant.
Have you tried the tool NDepend? It generates interactive reports about .NET code quality and code rules compliance. Here are some sample reports.
NDepend is also a tool integrated in Visual Studio (2017, 2015, 2013, 2012, 2010) that proposes a range of interactive features (graph, dependency matrix, code metrics visualization, code diff...). Another point about NDepend is that code rules are actually C# LINQ queries, so it is pretty easy to customize a default code rule or create your own code rules.
NDepend also integrates in VS Team Services and you'll get all code quality data from your VSTS UI instead of being redirected to a server.
I read that you have time-out problems analyzing your code base; maybe that is because your code base is pretty large. NDepend is optimized and can analyze a very large code base and create a report in a few dozen seconds (it takes around a minute to analyze the whole .NET Framework).
A 14-day full-featured trial is available.
Disclaimer: I work in the NDepend team
If you haven't already, I would suggest taking a look at my blog post on setting up SonarQube for C# projects: http://www.wrightfully.com/setting-up-sonar-analysis-for-c-projects/
The key to fixing your issue will be determining what the system is doing when the timeout occurs. Take a look at your log files and see what the last lines were before it timed out. It could be that your code is complex and just needs more time, in which case you can adjust the timeout values for whichever tool is running at the time.
Otherwise, I would suggest running whichever analysis tool (FxCop, Gendarme, StyleCop, etc.) was running when the timeout occurred outside of SonarQube. That is, run the tool directly from the command line to see if it still times out or provides any additional information on the console.
Also, assuming you're using the sonar-runner tool to execute the SonarQube analysis, you can add the -X argument to the commandline, which will run it with debug-level logging enabled. This will create a LOT more log messages which may shed some additional light on the issue.
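For example, a minimal command-line sketch (assuming the analysis is started with sonar-runner from the project root; the redirection is only there so the debug output can be inspected afterwards):

  sonar-runner -X > sonar-debug.log 2>&1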

Windows azure cloud development

I am new to cloud computing. I created a new cloud solution using Visual Studio 2010.
I need to deploy my solution somewhere in order to test it.
From my research, it seems I should have an account on http://www.microsoft.com/windowsazure/account/
Currently I do not have an account there, so where should I deploy my application, and how can I test it?
If you go here: http://msdn.microsoft.com/en-us/windowsazure/cc974146.aspx you can download the whole SDK and other tools, which essentially allow you to run Azure on your local machine. It requires you to have SQL Server installed.
Apologies for the lack of details; it's quite a while since I did it myself. But poke around on that page and you'll find all the tools and documentation you need. It's a big, hairy thing to get your head around, so you'll need to set some time aside to just read, sadly.
Best resource to get started is the Azure training kit, get it here:
http://www.microsoft.com/downloads/en/details.aspx?FamilyID=413e88f8-5966-4a83-b309-53b7b77edf78&displaylang=en
Watch some videos then dive into the labs, the best teacher is hands on experience.
