I am migrating code that uses Apache POI 2.5 to POI 5.
I am unable to find the import org.apache.poi.hssf.util.Region. Could you please tell me where I can find the jar, or what the replacement is?
I need to update the Region usage below with the latest implementation from POI 5.
sheet.addMergedRegion(new Region(0,(short)13,0,(short)16));
Thanks!
Region was replaced by org.apache.poi.ss.util.CellRangeAddress. Note that the argument order changes: Region took (rowFrom, colFrom, rowTo, colTo), while CellRangeAddress takes (firstRow, lastRow, firstCol, lastCol), so the equivalent call is:
sheet.addMergedRegion(new CellRangeAddress(0, 0, 13, 16));
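For reference, a minimal self-contained sketch of the migrated code (the workbook type, sheet name, and output file name are made up for illustration):

import java.io.FileOutputStream;

import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.ss.util.CellRangeAddress;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;

public class MergeExample {
    public static void main(String[] args) throws Exception {
        try (Workbook workbook = new XSSFWorkbook()) {
            Sheet sheet = workbook.createSheet("demo");
            // Equivalent of the old new Region(0, (short) 13, 0, (short) 16):
            // merge row 0, columns 13 through 16. The short casts are no longer
            // needed because CellRangeAddress takes ints.
            sheet.addMergedRegion(new CellRangeAddress(0, 0, 13, 16));
            try (FileOutputStream out = new FileOutputStream("merged.xlsx")) {
                workbook.write(out);
            }
        }
    }
}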
While doing a system update (or ant updatesystem), we are getting the error below (hmchistoryentries doesn't exist). Has anyone faced this before?
Per the documentation, this seems to be a deprecated item type. Though we are not using hMC, we are not sure which extension is using this itemtype. We'd appreciate your help.
[java] Caused by: org.apache.ddlutils.DatabaseOperationException: java.sql.SQLSyntaxErrorException: Table 'hybrisD2C.hmchistoryentries' doesn't exist
[java] at org.apache.ddlutils.platform.PlatformImplBase.readModelFromDatabase(PlatformImplBase.java:1891)
[java] at org.apache.ddlutils.platform.PlatformImplBase.readModelFromDatabase(PlatformImplBase.java:1869)
[java] at de.hybris.bootstrap.ddl.HybrisSchemaGenerator.update(HybrisSchemaGenerator.java:225)
[java] at de.hybris.platform.core.Initialization.initializeSchemaAndTypeSystemFullyNewStyle(Initialization.java:1245)
[java] at de.hybris.platform.core.Initialization.initialize(Initialization.java:1121)
[java] at de.hybris.platform.core.Initialization.createEmptySystemOrUpdate(Initialization.java:776)
[java] at de.hybris.platform.core.Initialization.access$4(Initialization.java:756)
[java] at de.hybris.platform.core.Initialization$4.call(Initialization.java:563)
[java] at de.hybris.platform.core.Initialization$4.call(Initialization.java:1)
[java] at de.hybris.platform.core.Initialization$SessionRecoveryAfterRegistryStartupAwareExecutor.execute(Initialization.java:698)
[java] at de.hybris.platform.core.Initialization.doInitializeImpl(Initialization.java:566)
[java] at de.hybris.platform.core.Initialization.access$5(Initialization.java:488)
[java] at de.hybris.platform.core.Initialization$5.call(Initialization.java:812)
[java] at de.hybris.platform.core.Initialization$5.call(Initialization.java:1)
[java] at de.hybris.platform.core.system.InitializationLockHandler.performLocked(InitializationLockHandler.java:80)
[java] at de.hybris.platform.core.Initialization.doInitialize(Initialization.java:844)
[java] at de.hybris.ant.taskdefs.InitPlatformAntPerformableImpl.performImpl(InitPlatformAntPerformableImpl.java:106)
[java] at de.hybris.ant.taskdefs.AbstractAntPerformable.doPerform(AbstractAntPerformable.java:92)
[java] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[java] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[java] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[java] at java.lang.reflect.Method.invoke(Method.java:498)
[java] at bsh.Reflect.invokeMethod(Reflect.java:131)
[java] at bsh.Reflect.invokeObjectMethod(Reflect.java:77)
[java] at bsh.Name.invokeMethod(Name.java:852)
[java] at bsh.BSHMethodInvocation.eval(BSHMethodInvocation.java:69)
[java] ... 16 more
Can you search all *-items.xml files to find out which extension it is coming from? (A quick sketch of such a search is below.) You might also want to check your localextensions.xml to see whether that extension, or any extension that depends on it, is listed there. If it is, you can remove it.
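A hedged sketch of such a search in plain Java (the hybris root path and the search string are assumptions — adjust them to your installation; the lowercase needle "hmchistoryentr" is meant to catch both the itemtype code and the deployment table name):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Locale;
import java.util.stream.Stream;

public class FindItemTypeDefinition {
    public static void main(String[] args) throws IOException {
        // Assumed platform root; point this at your hybris/bin directory.
        Path root = Paths.get("/opt/hybris/bin");
        try (Stream<Path> files = Files.walk(root)) {
            files.filter(p -> p.getFileName().toString().endsWith("-items.xml"))
                 .filter(FindItemTypeDefinition::mentionsHmcHistoryEntry)
                 .forEach(p -> System.out.println("Mentioned in: " + p));
        }
    }

    private static boolean mentionsHmcHistoryEntry(Path p) {
        try {
            String content = new String(Files.readAllBytes(p));
            // Case-insensitive match on the itemtype/table name prefix.
            return content.toLowerCase(Locale.ROOT).contains("hmchistoryentr");
        } catch (IOException e) {
            return false;
        }
    }
}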
This error happened due to an incorrect version of the DB connector. We were using MySQL 8.0 with the mysql-connector-java-8.0.19.jar connector, but SAP doesn't officially support MySQL 8.0 with version 1808. After downgrading to MySQL 5.x and mysql-connector-java-5.1.x-bin.jar, the error no longer occurs.
Thank you.
With bdutil, the latest Spark tarball I can find is for Spark 1.3.1:
gs://spark-dist/spark-1.3.1-bin-hadoop2.6.tgz
There are a few new DataFrame features in Spark 1.4 that I want to use. Is there any chance the Spark 1.4 image will be made available for bdutil, or is there a workaround?
UPDATE:
Following the suggestion from Angus Davis, I downloaded spark-1.4.1-bin-hadoop2.6.tgz and pointed the deployment at it. The deployment went well; however, I ran into an error when calling SqlContext.parquetFile() (a sketch of the call follows the stack trace below). I cannot explain why this exception is possible, since GoogleHadoopFileSystem should be a subclass of org.apache.hadoop.fs.FileSystem. I will continue investigating this.
Caused by: java.lang.ClassCastException: com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem cannot be cast to org.apache.hadoop.fs.FileSystem
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2595)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2630)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2612)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:169)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:354)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
at org.apache.hadoop.hive.metastore.Warehouse.getFs(Warehouse.java:112)
at org.apache.hadoop.hive.metastore.Warehouse.getDnsPath(Warehouse.java:144)
at org.apache.hadoop.hive.metastore.Warehouse.getWhRoot(Warehouse.java:159)
at org.apache.hadoop.hive.metastore.Warehouse.getDefaultDatabasePath(Warehouse.java:177)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB_core(HiveMetaStore.java:504)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:523)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:397)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:356)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4944)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:171)
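For context, a minimal sketch of the kind of call that triggers this (a hedged example against the Spark 1.4 Java API; the bucket and path are made up, and this is not a guaranteed reproduction of the metastore code path):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;

public class ParquetReadExample {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("parquet-read");
        JavaSparkContext sc = new JavaSparkContext(conf);
        SQLContext sqlContext = new SQLContext(sc);
        // Reading Parquet from a gs:// path goes through
        // GoogleHadoopFileSystem, where the ClassCastException surfaced.
        DataFrame df = sqlContext.parquetFile("gs://my-bucket/my-data.parquet");
        df.show();
        sc.stop();
    }
}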
I asked a separate question about the exception here.
UPDATE:
The error turned out to be a Spark defect; a resolution/workaround is provided in the question linked above.
Thanks!
Haiying
If a local workaround is acceptable, you can copy spark-1.4.1-bin-hadoop2.6.tgz from an Apache mirror into a bucket that you control. You can then edit extensions/spark/spark-env.sh and change SPARK_HADOOP2_TARBALL_URI='<your copy of spark 1.4.1>' (make certain that the service account running your VMs has permission to read the tarball).
Note that I haven't done any testing to see if Spark 1.4.1 works out of the box right now, but I'd be interested in hearing your experience if you decide to give it a go.
The Symfony documentation for Twig mentions using form_start, but when I try this in Silex I get this error:
Twig_Error_Syntax: The function "form_start" does not exist
Is it possible to use this in Silex?
Other form functions like form_row() and form_widget() work.
Edit: I am using symfony/form dev-master (945f91ee8729a8f16e5d5c87c4920694e6b10475)
and symfony/twig-bridge 2.2.x-dev (6ddcb37ae4b7275c14baf365c7513b9ffdd6e31c)
You're using a version of twig-bridge in which form_start is not yet present.
form_start and form_end were introduced by commit d0b896, and GitHub tells us that 2.3.0 is the first release that includes them.
Also note that while you're using 2.2, you're browsing the docs for 2.3.
Thank you for taking the time to look at my problem. I'm working on an Android application, and I keep getting an error in Eclipse every time I use parent="android.Theme.Holo.Light". I have created a values-v11 folder to indicate when the correct theme should be used for the correct version, but I just get this error:
No resource found that matches the given name 'android.Theme.Holo.Light' in my styles.xml file.
Any idea why this is happening? Thanks in advance.
Try this:
parent="@android:style/Theme.Holo.Light"
The parent has to be a resource reference (@android:style/Theme.Holo.Light, or the shorthand android:Theme.Holo.Light); android.Theme.Holo.Light with dots is not a valid resource name, which is why no matching resource is found.
I am trying to run the TaskTracker on Cygwin, but the following error occurs:
mapred.TaskTracker: Process Tree implementation is missing on this system. TaskMemoryManager is disabled.
Everything else (i.e., the NameNode, SecondaryNameNode, JobTracker, and DataNode) works properly through Cygwin; the issue is only with the TaskTracker. I am using Hadoop version hadoop-0.19.1.
How do I get rid of this? If anybody knows, please help!
Your help will be appreciated!
I haven't encountered this specific problem, but:
Make sure that you are using the same Hadoop version that is in use on the cluster.
Update Hadoop to a more recent version if possible.
The following patches may (or may not) address your problem:
https://issues.apache.org/jira/browse/HADOOP-6230
https://issues.apache.org/jira/browse/MAPREDUCE-834