Hybris Database Connection - sap-commerce-cloud

For some reason we need to run a native database query instead of a FlexibleSearch query. To run those queries we need a DB connection, so how can we get a JdbcTemplate or DataSource object from Hybris?

This is an example of a Groovy script that can achieve this:
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import de.hybris.platform.util.Utilities;
import de.hybris.platform.core.Registry;
Connection conn = null;
PreparedStatement pstmt = null;
try
{
    // obtain a JDBC connection from the current tenant's DataSource
    conn = Registry.getCurrentTenant().getDataSource().getConnection();
    pstmt = conn.prepareStatement("your sql query here...");
    pstmt.execute();
}
catch (final SQLException e)
{
    // no LOG is defined in the scripting console, so just print the failure
    println "Error executing native query: " + e.getMessage()
}
finally
{
    // closes the statement and connection, ignoring close-time exceptions
    Utilities.tryToCloseJDBC(conn, pstmt, null);
}
return "Groovy Rocks!"
Edit: you can find more details in this article: https://www.stackextend.com/hybris/run-native-sql-query-hybris/

Related

package org.apache.http.util does not exist

I am trying to implement a useful trip planner application on my own using Android Studio, with the help of Google and some open sources.
I have an Android application with no build.gradle. When I try to run it I get the error:
Error:(20, 28) java: package org.apache.http.util does not exist
Here is the part of the code that has the error.
How do I solve the error?
// android.util.Log and the java.io stream classes used below were missing from the snippet
import android.util.Log;
import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import org.apache.http.HttpEntity;
import org.apache.http.HttpResponse;
import org.apache.http.client.ClientProtocolException;
import org.apache.http.client.HttpClient;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.DefaultHttpClient;
import org.apache.http.util.ByteArrayBuffer;
import org.json.JSONArray;
import org.json.JSONException;
import org.json.JSONObject;
public static String queryUrl(String url) {
HttpClient httpclient = new DefaultHttpClient();
HttpGet httpget = new HttpGet(url);
HttpResponse response;
try {
response = httpclient.execute(httpget);
Log.i(TAG, "Url:" + url + "");
Log.i(TAG, "Status:[" + response.getStatusLine().toString() + "]");
HttpEntity entity = response.getEntity();
if (entity != null) {
InputStream instream = entity.getContent();
BufferedReader r = new BufferedReader(new InputStreamReader(instream));
StringBuilder total = new StringBuilder();
String line;
while ((line = r.readLine()) != null) {
total.append(line);
}
instream.close();
String result = total.toString();
return result;
}
} catch (ClientProtocolException e) {
Log.e(TAG, "There was a protocol based error", e);
} catch (Exception e) {
Log.e(TAG, "There was some error", e);
}
return null;
}
See the following answer: HttpClient won't import in Android Studio
I don't know what your module's build.gradle looks like, but I assume you are including the Apache libraries in some wrong way.
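If the project targets API level 23 or newer, the Apache HTTP classes were removed from the Android SDK and have to be opted back in explicitly; a minimal module-level build.gradle sketch (assuming the standard Android Gradle plugin):

```groovy
android {
    // Opts back in to the org.apache.http.* classes removed from the SDK in API 23
    useLibrary 'org.apache.http.legacy'
}
```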
Also, I would not recommend using the Apache HTTP libraries in new Android applications. Have a look at OkHttp or Volley instead.
I would post this in a comment, but I don't have enough reputation yet.

Cassandra cluster is not scaling. 3 Nodes are even a little faster then 6 nodes (Code and data provided)

I am using DataStax Enterprise 4.8 for testing purposes in my bachelor thesis. I am loading weather data into the cluster (about 33 million rows).
The data looks something like the following:
//id;unix timestamp; validity; station info; temp in °C; humidity in %
3;1950040101;5;24; 5.7000;83.0000
3;1950040102;5;24; 5.6000;83.0000
3;1950040103;5;24; 5.5000;83.0000
I know my data model is not very clean (I use decimal for the timestamp, but I just wanted to try it this way).
CREATE TABLE temp (
    id int,
    timestamp decimal,
    validity decimal,
    structure decimal,
    temperature float,
    humidity float,
    PRIMARY KEY ((id), timestamp)
);
I roughly based it on an article on the datastax website:
https://academy.datastax.com/resources/getting-started-time-series-data-modeling
The insertion is done based on the often-mentioned article on lostechies:
https://lostechies.com/ryansvihla/2016/04/29/cassandra-batch-loading-without-the-batch%E2%80%8A-%E2%80%8Athe-nuanced-edition/
This is my insertion code:
import java.io.BufferedReader;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.IOException;
import java.math.BigDecimal;
import java.util.Iterator;
import java.util.LinkedList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;
import com.datastax.driver.core.BoundStatement;
import com.datastax.driver.core.Cluster;
import com.datastax.driver.core.ConsistencyLevel;
import com.datastax.driver.core.PreparedStatement;
import com.datastax.driver.core.ResultSet;
import com.datastax.driver.core.ResultSetFuture;
import com.datastax.driver.core.Session;
import com.datastax.driver.extras.codecs.jdk8.InstantCodec;
import com.google.common.base.Stopwatch;
import com.google.common.util.concurrent.FutureCallback;
import com.google.common.util.concurrent.Futures;
import com.google.common.util.concurrent.MoreExecutors;
public class BulkLoader {
private final int threads;
private final String[] contactHosts;
private final Cluster cluster;
private final Session session;
private final ExecutorService executor;
public BulkLoader(int threads, String... contactHosts) {
this.threads = threads;
this.contactHosts = contactHosts;
this.cluster = Cluster.builder().addContactPoints(contactHosts).build();
cluster.getConfiguration().getCodecRegistry()
.register(InstantCodec.instance);
session = cluster.newSession();
// fixed thread pool that closes on app exit
executor = MoreExecutors
.getExitingExecutorService((ThreadPoolExecutor) Executors
.newFixedThreadPool(threads));
}
public static class IngestCallback implements FutureCallback<ResultSet> {
public void onSuccess(ResultSet result) {
}
public void onFailure(Throwable t) {
throw new RuntimeException(t);
}
}
public void ingest(Iterator<Object[]> boundItemsIterator, String insertCQL)
throws InterruptedException {
final PreparedStatement statement = session.prepare(insertCQL);
while (boundItemsIterator.hasNext()) {
BoundStatement boundStatement = statement.bind(boundItemsIterator
.next());
boundStatement.setConsistencyLevel(ConsistencyLevel.QUORUM);
ResultSetFuture future = session.executeAsync(boundStatement);
Futures.addCallback(future, new IngestCallback(), executor);
}
}
public void stop() {
session.close();
cluster.close();
}
public static List<Object[]> readCSV(File csv) {
BufferedReader fileReader = null;
List<Object[]> result = new LinkedList<Object[]>();
try {
fileReader = new BufferedReader(new FileReader(csv));
String line = "";
while ((line = fileReader.readLine()) != null) {
String[] tokens = line.split(";");
if (tokens.length < 6) {
    System.out.println("Incomplete line, skipping");
    continue;
}
Object[] tmp = new Object[6];
tmp[0] = Integer.parseInt(tokens[0]);
tmp[1] = new BigDecimal(tokens[1]);
tmp[2] = new BigDecimal(tokens[2]);
tmp[3] = new BigDecimal(tokens[3]); // was tokens[2] twice: structure read the wrong column
tmp[4] = Float.parseFloat(tokens[4]);
tmp[5] = Float.parseFloat(tokens[5]);
result.add(tmp);
}
} catch (FileNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} finally {
    try {
        if (fileReader != null) { // guard against a failed open
            fileReader.close();
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}
return result;
}
public static void main(String[] args) {
Stopwatch watch = Stopwatch.createStarted();
File folder = new File(
"C:/VirtualMachines/Kiosk/BachelorarbeitStraubinger/workspace/bulk/src/main/resources");
List<Object[]> data = new LinkedList<Object[]>();
BulkLoader loader = new BulkLoader(16, "10.2.57.38", "10.2.57.37",
"10.2.57.36", "10.2.57.35", "10.2.57.34", "10.2.57.33");
int cnt = 0;
File[] listOfFiles = folder.listFiles();
for (File file : listOfFiles) {
if (file.isFile() && file.getName().contains(".th")) {
data = readCSV(file);
cnt += data.size();
try {
loader.ingest(
data.iterator(),
"INSERT INTO wheather.temp (id, timestamp, validity, structure, temperature, humidity) VALUES(?,?,?,?,?,?)");
} catch (InterruptedException e) {
e.printStackTrace();
} finally {
System.out.println(file.getName()
        + " -> Datasets imported: " + cnt);
}
}
}
System.out.println("total time seconds = "
+ watch.elapsed(TimeUnit.SECONDS));
watch.stop();
loader.stop();
}
}
The replication factor is 3 and I run the tests on 6 or 3 nodes, with vnodes enabled and num_tokens = 256.
I get roughly the same insert times when running it on either cluster. Any ideas why that is?
It is likely that you're maxing out the client application / client server. If you're reading from a static file, you may benefit from breaking it up into a few pieces and running them in parallel, or even looking at Brian Hess' loader ( https://github.com/brianmhess/cassandra-loader ) or the real cassandra bulk loader ( http://www.datastax.com/dev/blog/using-the-cassandra-bulk-loader-updated ) , which turns the data into a series of sstables and streams those in directly. Both are likely faster than your existing code.
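The "maxing out the client" failure mode often comes from issuing unbounded executeAsync calls faster than they complete. One driver-agnostic remedy is to bound in-flight work with a semaphore; a minimal sketch (a plain Runnable stands in for the actual driver call, and all names are illustrative):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;

public class ThrottledIngest {
    // Permits cap how many requests may be in flight at once.
    private final Semaphore inFlight;
    private final ExecutorService pool;

    public ThrottledIngest(int maxInFlight, int threads) {
        this.inFlight = new Semaphore(maxInFlight);
        this.pool = Executors.newFixedThreadPool(threads);
    }

    // Blocks once maxInFlight tasks are pending, applying
    // back-pressure to the producer loop instead of queueing forever.
    public Future<?> submit(Runnable insert) throws InterruptedException {
        inFlight.acquire();
        return pool.submit(() -> {
            try {
                insert.run();
            } finally {
                inFlight.release();
            }
        });
    }

    public void shutdown() throws InterruptedException {
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
    }
}
```

In the loader above, the semaphore would be acquired before each executeAsync and released in the future's callback, keeping the client from drowning itself.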
Physics.
You're probably maxing out the throughput your app is capable of. Normally the answer would be to have multiple clients/app servers, but it looks like you are reading from a CSV. I suggest either cutting up the CSV into pieces and running multiple instances of your app, or generating fake data and running multiple instances against that.
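Cutting the CSV up so several loader instances can run in parallel can be sketched like this (chunk naming is illustrative):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class CsvSplitter {
    // Splits a CSV file into `parts` roughly equal chunks, line by line,
    // so each chunk can be fed to a separate loader instance.
    public static List<Path> split(Path csv, int parts, Path outDir) throws IOException {
        List<String> lines = Files.readAllLines(csv);
        List<Path> chunks = new ArrayList<>();
        int per = (lines.size() + parts - 1) / parts; // ceiling division
        for (int i = 0; i < parts; i++) {
            int from = i * per;
            if (from >= lines.size()) break;
            int to = Math.min(from + per, lines.size());
            Path chunk = outDir.resolve("chunk-" + i + ".csv");
            Files.write(chunk, lines.subList(from, to));
            chunks.add(chunk);
        }
        return chunks;
    }
}
```

Each resulting chunk can then be passed to its own BulkLoader process, ideally from different client machines.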
Edit: I also think it's worth noting that with a data model like that, a payload size that small, and proper hardware, I'd imagine each node could be capable of 15-20K inserts/second (Not accounting for node density/compaction).

TestNG Close Browsers after Parallel Test Execution

I want to close the browsers after completion of all tests. The problem is that I am not able to close the browser, since the object created by the ThreadLocal driver does not recognize the driver after the test completes and the value returned is null.
Below is my working code
package demo;
import java.lang.reflect.Method;
import org.openqa.selenium.By;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;
public class ParallelMethodTest {
private static ThreadLocal<dummy> driver;
private int input;
private int length;
@BeforeMethod
public void beforeMethod() {
System.err.println("Before ID" + Thread.currentThread().getId());
System.setProperty("webdriver.chrome.driver", "chromedriver.exe");
if (driver == null) {
driver = new ThreadLocal<dummy>();
}
if (driver.get()== null) {
driver.set(new dummy());
}
}
@DataProvider(name = "sessionDataProvider", parallel = true)
public static Object[][] sessionDataProvider(Method method) {
int len = 12;
Object[][] parameters = new Object[len][2];
for (int i = 0; i < len; i++) {
parameters[i][0] = i;
parameters[i][1]=len;
}
return parameters;
}
@Test(dataProvider = "sessionDataProvider")
public void executSessionOne(int input,int length) {
System.err.println("Test ID---" + Thread.currentThread().getId());
this.input=input;
this.length=length;
// First session of WebDriver
// find user name text box and fill it
System.out.println("Parameter size is:"+length);
driver.get().getDriver().findElement(By.name("q")).sendKeys(input + "");
System.out.println("Input is:"+input);
try {
Thread.sleep(5000);
} catch (InterruptedException e) {
e.printStackTrace();
}
}
@AfterMethod
public void afterMethod() {
System.err.println("After ID" + Thread.currentThread().getId());
driver.get().close();
}
}
package demo;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.annotations.AfterClass;
public class dummy {
public WebDriver getDriver() {
return newDriver;
}
public void setNewDriver(WebDriver newDriver) {
this.newDriver = newDriver;
}
private WebDriver newDriver;
public dummy() {
newDriver = new ChromeDriver();
newDriver.get("https://www.google.co.in/");
}
@AfterClass
public void close(){
if(newDriver!=null){
System.out.println("In After Class");
newDriver.quit();
}
}
}
Thanks in Advance.
private static ThreadLocal<dummy> driver is declared at the class level. What is happening is that you have already declared the variable at class level, i.e. memory is already allocated to it. Multiple threads are just setting and resetting the values of the same variable.
What you need to do is create a factory that will return an instance of the driver based on a parameter you pass to it. The logic can be anything, but taking a general use case as an example, the factory will create a new object and return it only if an existing object doesn't exist. Declare and initialise the driver (from the factory) in your @Test methods.
Sample code for the factory would be something like
import java.net.MalformedURLException;
import java.net.URL;
import org.openqa.selenium.remote.DesiredCapabilities;
import org.openqa.selenium.remote.RemoteWebDriver;
static RemoteWebDriver firefoxDriver;
static RemoteWebDriver someOtherDriver;
static synchronized RemoteWebDriver getDriver(String browser, String browserVersion, String platform, String platformVersion) throws MalformedURLException
{
    if ("firefox".equals(browser))
    {
        if (firefoxDriver == null)
        {
            DesiredCapabilities cloudCaps = new DesiredCapabilities();
            cloudCaps.setCapability("browser", browser);
            cloudCaps.setCapability("browser_version", browserVersion);
            cloudCaps.setCapability("os", platform);
            cloudCaps.setCapability("os_version", platformVersion);
            cloudCaps.setCapability("browserstack.debug", "true");
            cloudCaps.setCapability("browserstack.local", "true");
            // URL here is your hub endpoint constant
            firefoxDriver = new RemoteWebDriver(new URL(URL), cloudCaps);
        }
        return firefoxDriver;
    }
    else
    {
        if (someOtherDriver == null)
        {
            DesiredCapabilities cloudCaps = new DesiredCapabilities();
            cloudCaps.setCapability("browser", browser);
            cloudCaps.setCapability("browser_version", browserVersion);
            cloudCaps.setCapability("os", platform);
            cloudCaps.setCapability("os_version", platformVersion);
            cloudCaps.setCapability("browserstack.debug", "true");
            cloudCaps.setCapability("browserstack.local", "true");
            someOtherDriver = new RemoteWebDriver(new URL(URL), cloudCaps);
        }
        return someOtherDriver;
    }
}
You have a concurrency issue: multiple threads can create a ThreadLocal instance because dummy == null can evaluate to true on more than one thread when run in parallel. As such, some threads can execute driver.set(new dummy()); but then another thread replaces driver with a new ThreadLocal instance.
In my experience it is simpler and less error prone to always use ThreadLocal as a static final to ensure that multiple objects can access it (static) and that it is only defined once (final).
You can see my answers to the following Stack Overflow questions for related details and code samples:
How to avoid empty extra browser opens when running parallel tests with TestNG
Session not found exception with Selenium Web driver parallel execution of Data Provider test case
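The static final ThreadLocal pattern described above can be sketched like this (a stand-in FakeDriver class is used instead of WebDriver so the example stays self-contained):

```java
public class DriverManagerExample {
    // Stand-in for a real WebDriver; illustrative only.
    static class FakeDriver {
        void quit() { /* release browser resources here */ }
    }

    // static final: one ThreadLocal shared by all test classes, defined exactly
    // once, so no thread can ever replace it with a fresh instance.
    private static final ThreadLocal<FakeDriver> DRIVER =
            ThreadLocal.withInitial(FakeDriver::new);

    // Same thread always gets the same driver instance.
    public static FakeDriver getDriver() {
        return DRIVER.get();
    }

    // Called from @AfterMethod: quit and drop this thread's instance.
    public static void quitDriver() {
        DRIVER.get().quit();
        DRIVER.remove();
    }
}
```

Each TestNG worker thread lazily gets its own driver on first access, and afterMethod can reliably reach the same instance to quit it.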
This is happening because you are creating the driver instance in the beforeMethod function, so its scope ends after the function ends.
So when your afterMethod starts it gets null, because the WebDriver instance was already destroyed once the beforeMethod function completed.
Refer to the links below:
http://www.java-made-easy.com/variable-scope.html
What is the default scope of a method in Java?

Can I access Azure SQL Database in the driver method of a Hadoop job running in HDInsight?

I'd like to work on a Hadoop application which runs on HDInsight. In the driver method of my application, I need to get some information from Azure SQL Database. I'd like to know whether it's possible to query Azure SQL Database in the driver method of my Hadoop job.
You can access Azure SQL Database using java.sql classes, but you may need to add your headnode's IP to your database's firewall rules.
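The firewall rule can be added in the portal, or as a one-off with the Azure CLI; a sketch with placeholder resource names and the documentation IP range (a single-IP rule uses the same start and end address):

```shell
# Allow the headnode's public IP through the Azure SQL server firewall
# (resource group, server name, and IP below are placeholders)
az sql server firewall-rule create \
  --resource-group myResourceGroup \
  --server myserver \
  --name AllowHeadnode \
  --start-ip-address 203.0.113.5 \
  --end-ip-address 203.0.113.5
```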
package org.microsoft.andrewmoll.SqlExample;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
/**
* Hello world!
*
*/
public class SQLExample
{
public static class TokenizerMapper
extends Mapper<Object, Text, Text, IntWritable>{
//You should put some awesome map logic here
}
public static class IntSumReducer
extends Reducer<Text,IntWritable,Text,IntWritable> {
//You should put some awesome reducer logic here
}
public static void main(String[] args) throws Exception {
Configuration conf = new Configuration();
String jobName = getData();
System.out.println(jobName);
Job job = Job.getInstance(conf, jobName);
job.setJarByClass(SQLExample.class);
job.setMapperClass(TokenizerMapper.class);
job.setCombinerClass(IntSumReducer.class);
job.setReducerClass(IntSumReducer.class);
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(IntWritable.class);
System.exit(job.waitForCompletion(true) ? 0 : 1);
}
public static String getData()
{
String driver = "com.microsoft.sqlserver.jdbc.SQLServerDriver";
String url = "jdbc:sqlserver://<servername>.database.windows.net;DatabaseName=<dbname>"; // note the "//" after the scheme
String username = "DarthMoll";
String password = "Luke,Iamnotyourfather";
try {
    /* Load database driver */
    Class.forName(driver);
    /* Establish database connection */
    Connection con = DriverManager.getConnection(url, username, password);
    /* Run query */
    PreparedStatement stmt = con.prepareStatement("select top 1 * from dbo.SithWarriors");
    /* Get return result */
    ResultSet resultset = stmt.executeQuery();
    /* The cursor starts before the first row; advance it before reading */
    String result = null;
    if (resultset.next()) {
        /* get user's first name */
        result = resultset.getString("FirstName");
    }
    /* Close result set, statement and database connection */
    resultset.close();
    stmt.close();
    con.close();
    return result;
} catch (Exception e) {
e.printStackTrace();
}
return "Implement Some Throwable Here";
}
}
If possible, I suggest storing the data in a blob and using the Java SDK to access the data. Saves you from having to worry about the headnode IP address.

groovy.lang.MissingPropertyException while Downloading Files from FTP Server

I want to create a job in Grails which downloads files from an FTP server after a certain interval of time, say 2-3 days, and stores them on a specified local path. The same code, with minor changes, is written in Java and was working fine, but when I write similar code in Grails I get the error below and am not able to resolve it. Can anybody tell me where I'm making a mistake?
Following is the error I'm facing when the job starts.
JOB STARTED::************************************************************************************
2015-08-24 18:20:35,285 INFO org.quartz.core.JobRunShell:207 Job GRAILS_JOBS.com.hoteligence.connector.job.DownloadIpgGzipFilesJob threw a JobExecutionException:
org.quartz.JobExecutionException: groovy.lang.MissingPropertyException: No such property: ftpClient for class: com.hoteligence.connector.job.DownloadIpgGzipFilesJob [See nested exception: groovy.lang.MissingPropertyException: No such property: ftpClient for class: com.hoteligence.connector.job.DownloadIpgGzipFilesJob]
at grails.plugins.quartz.GrailsJobFactory$GrailsJob.execute(GrailsJobFactory.java:111)
at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
Caused by: groovy.lang.MissingPropertyException: No such property: ftpClient for class: com.hoteligence.connector.job.DownloadIpgGzipFilesJob
at com.hoteligence.connector.job.DownloadIpgGzipFilesJob.execute(DownloadIpgGzipFilesJob.groovy:93)
at grails.plugins.quartz.GrailsJobFactory$GrailsJob.execute(GrailsJobFactory.java:104)
... 2 more
/* I've added all the related dependencies in grails Build Config.
*/
package com.hoteligence.connector.job
import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.text.DateFormat;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Date;
import org.codehaus.groovy.grails.commons.ConfigurationHolder as ConfigHolder;
import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;
import org.apache.commons.net.ftp.FTPFile;
import org.apache.commons.net.ftp.FTPReply;
/**
* @author Gajanan
* this is back-end job which download Files from ftp server and store it on locally
*/
class DownloadIpgGzipFilesJob {
static triggers = {
simple repeatInterval: Long.parseLong(ConfigHolder.config.DEVICE_PING_ALERT_JOB_REPEAT_INTERVAL),
startDelay : 60000
}
def execute() {
try{
println "JOB STARTED::************************************************************************************";
/* following is the details which are required for server connectivity
*/
String server = ConfigHolder.config.IPG_SERVER_NAME;
int port = ConfigHolder.config.IPG_SERVER_PORT_NO;
String user = ConfigHolder.config.IPG_SERVER_USER_NAME;
String pass = ConfigHolder.config.IPG_SERVER_USER_PASSWORD;
String [] fileNames = ConfigHolder.config.IPG_DOWNLOADABLE_GZIP_FILE_LIST.split(",");
String downloadFilePath = ConfigHolder.config.IPG_GZIP_DOWNLOAD_LOCATION;
Date todaysDate = new Date(); // todaysDate is used below but was never defined
String fileDate = (todaysDate.getYear()+1900)+""+((todaysDate.getMonth()+1)<=9?("0"+(todaysDate.getMonth()+1)):(todaysDate.getMonth()+1))+""+todaysDate.getDate();
FTPClient ftpClient = new FTPClient();
/* Here we are making connection to the server and the reply
from server is printed on console
*/
ftpClient.connect(server, port);
showServerReply(ftpClient);
int replyCode = ftpClient.getReplyCode();
if (!FTPReply.isPositiveCompletion(replyCode)) {
System.out.println("Connect failed");
return;
}
boolean success = ftpClient.login(user, pass);
showServerReply(ftpClient);
if (!success) {
System.out.println("Could not login to the server");
return;
}
/* Here we are iterate the FileList and download them to specified directory
*/
for(int i =0; i<fileNames.length;i++) {
String fileName = "on_"+ConfigHolder.config.IPG_DATA_COUNTRY_CODE+fileNames[i]+fileDate+".xml.gz";
System.out.println(fileName);
downloadFtpFileByName(ftpClient,fileName,downloadFilePath+fileName);
}
}
catch (IOException ex) {
System.out.println("Oops! Something wrong happened");
ex.printStackTrace();
}
catch(Exception e) {
e.printStackTrace();
}
finally {
// logs out and disconnects from server
/* In finally block we forcefully close the connection and close the file node also
*/
try {
if (ftpClient.isConnected()) {
ftpClient.logout();
ftpClient.disconnect();
}
} catch (IOException ex) {
ex.printStackTrace();
}
}
}
/* this function is nothing but to print the ftp server reply after connection to ftp server
*/
private static void showServerReply(FTPClient ftpClient) {
String[] replies = ftpClient.getReplyStrings();
if (replies != null && replies.length > 0) {
for (String aReply : replies) {
System.out.println("SERVER: " + aReply);
}
}
}
/* This is the actual logic where we copy the file from ftp
and store on local directory
this method accept three parameter FtpClient object, Name of the file which has to be downloaded from server and the path where downloaded file has to be stored
*/
private static void downloadFtpFileByName(FTPClient ftpClient,String fileName,String downloadfileName){
System.out.println("Strat Time::"+System.currentTimeMillis());
try {
String remoteFile1 = "/"+fileName; // file on server
File downloadFile1 = new File(downloadfileName); // new file which is going to be copied on local directory
OutputStream outputStream1 = new BufferedOutputStream(new FileOutputStream(downloadFile1));
Boolean success = ftpClient.retrieveFile(remoteFile1, outputStream1);
if (success) {
System.out.println("File"+fileName+" has been downloaded successfully.");
}
else
{
System.out.println("File"+fileName+" has been DOWNLOAD FAILURE....");
}
outputStream1.close();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
System.out.println("END Time::"+System.currentTimeMillis());
}
}
Move this line:
FTPClient ftpClient = new FTPClient();
outside of the try { ... } catch() block (i.e., move it up to before the try).
You are declaring the variable locally inside the try, then trying to use it in the finally block, where it is out of scope.
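The corrected shape looks like this, shown with a stand-in client class instead of FTPClient so the sketch is self-contained (illustrative only):

```java
public class ScopeFixExample {
    // Stand-in for FTPClient; tracks connection state only.
    static class FakeClient {
        private boolean connected;
        void connect() { connected = true; }
        void disconnect() { connected = false; }
        boolean isConnected() { return connected; }
    }

    public static boolean run() {
        // Declared BEFORE the try, so the finally block can still see it.
        FakeClient client = new FakeClient();
        try {
            client.connect();
            // ... download files here ...
            return true;
        } finally {
            // Guard in case connect() failed before establishing a session.
            if (client.isConnected()) {
                client.disconnect();
            }
        }
    }
}
```

In the Groovy job above, the same change means declaring `FTPClient ftpClient = new FTPClient()` as the first line of execute(), before the try block.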
