We are experiencing an issue where, after an IIS restart, our Umbraco site reverts to old content/branding. The log shows that umbraco.content is loading XML from file.
Only 1 of the 7 sites in IIS is affected by this behaviour. We are running the full version of MS SQL Server (not SQL CE).
Where is Umbraco getting this content from, and how can we prevent it happening when IIS restarts (a nightly task)? Is there an Umbraco setting that is only present on this specific database/config?
The logs read:
2016-05-11 12:50:37,715 [35] INFO Umbraco.Core.CoreBootManager - [T29/D5] Umbraco application starting
2016-05-11 12:50:37,762 [35] INFO Umbraco.Core.PluginManager - [T29/D5] Determining hash of code files on disk
2016-05-11 12:50:37,793 [35] INFO Umbraco.Core.PluginManager - [T29/D5] Hash determined (took 18ms)
2016-05-11 12:50:37,793 [35] INFO Umbraco.Core.PluginManager - [T29/D5] Starting resolution types of umbraco.interfaces.IApplicationStartupHandler
2016-05-11 12:50:37,809 [35] INFO Umbraco.Core.PluginManager - [T29/D5] Completed resolution of types of umbraco.interfaces.IApplicationStartupHandler, found 38 (took 10ms)
2016-05-11 12:50:37,949 [35] INFO Umbraco.Core.PluginManager - [T29/D5] Starting resolution types of Umbraco.Core.PropertyEditors.IPropertyEditorValueConverter
2016-05-11 12:50:37,949 [35] INFO Umbraco.Core.PluginManager - [T29/D5] Completed resolution of types of Umbraco.Core.PropertyEditors.IPropertyEditorValueConverter, found 0 (took 0ms)
2016-05-11 12:50:37,949 [35] INFO Umbraco.Core.PluginManager - [T29/D5] Starting resolution types of Umbraco.Core.PropertyEditors.IPropertyValueConverter
2016-05-11 12:50:37,949 [35] INFO Umbraco.Core.PluginManager - [T29/D5] Completed resolution of types of Umbraco.Core.PropertyEditors.IPropertyValueConverter, found 16 (took 1ms)
2016-05-11 12:50:37,965 [35] INFO Umbraco.Core.PluginManager - [T29/D5] Starting resolution types of Umbraco.Web.Mvc.SurfaceController
2016-05-11 12:50:37,965 [35] INFO Umbraco.Core.PluginManager - [T29/D5] Completed resolution of types of Umbraco.Web.Mvc.SurfaceController, found 14 (took 1ms)
2016-05-11 12:50:37,965 [35] INFO Umbraco.Core.PluginManager - [T29/D5] Starting resolution types of Umbraco.Web.WebApi.UmbracoApiController
2016-05-11 12:50:37,980 [35] INFO Umbraco.Core.PluginManager - [T29/D5] Completed resolution of types of Umbraco.Web.WebApi.UmbracoApiController, found 61 (took 10ms)
2016-05-11 12:50:38,043 [35] INFO Umbraco.Core.PluginManager - [T29/D5] Starting resolution types of Umbraco.Core.Media.IThumbnailProvider
2016-05-11 12:50:38,043 [35] INFO Umbraco.Core.PluginManager - [T29/D5] Completed resolution of types of Umbraco.Core.Media.IThumbnailProvider, found 3 (took 0ms)
2016-05-11 12:50:38,043 [35] INFO Umbraco.Core.PluginManager - [T29/D5] Starting resolution types of Umbraco.Core.Media.IImageUrlProvider
2016-05-11 12:50:38,059 [35] INFO Umbraco.Core.PluginManager - [T29/D5] Completed resolution of types of Umbraco.Core.Media.IImageUrlProvider, found 1 (took 6ms)
2016-05-11 12:50:39,809 [35] INFO Umbraco.Web.Search.ExamineEvents - [T29/D5] Initializing Examine and binding to business logic events
2016-05-11 12:50:39,996 [35] INFO Umbraco.Web.Search.ExamineEvents - [T29/D5] Adding examine event handlers for index providers: 3
2016-05-11 12:50:39,996 [35] INFO Umbraco.Core.CoreBootManager - [T29/D5] Umbraco application startup complete (took 2282ms)
2016-05-11 12:50:41,293 [35] INFO Umbraco.Web.UmbracoModule - [T34/D5] Setting OriginalRequestUrl: xxx.xxx.com/umbraco
2016-05-11 12:50:41,465 [35] INFO umbraco.content - [T34/D5] Load Xml from file...
2016-05-11 12:50:41,465 [35] INFO umbraco.content - [T34/D5] Loaded Xml from file.
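For context, the "Load Xml from file" lines above mean Umbraco is rebuilding its in-memory content cache from the XML cache file on disk (by default ~/App_Data/umbraco.config) rather than from the database, so a stale copy of that file will resurface old content after every restart. The behaviour is governed by the <content> section of config/umbracoSettings.config; a minimal sketch, assuming Umbraco 7.x-style element names (the values shown are illustrative, not taken from the affected site):
<!-- config/umbracoSettings.config (sketch, Umbraco 7.x element names) -->
<content>
  <!-- Serve published content from the XML cache file (App_Data/umbraco.config) -->
  <XmlCacheEnabled>True</XmlCacheEnabled>
  <!-- Keep the on-disk cache file continuously in sync with published content -->
  <ContinouslyUpdateXmlDiskCache>True</ContinouslyUpdateXmlDiskCache>
  <!-- Reload the in-memory cache when the file on disk changes -->
  <XmlContentCheckForDiskChanges>False</XmlContentCheckForDiskChanges>
</content>
If the affected site cannot write to that cache file (for example, because of file permissions or a read-only deployment), the on-disk copy never gets updated and the old branding comes back on restart; comparing this section and the file's timestamp across the seven sites is a reasonable first check.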
I am trying to execute a Python file, which resides in an ADLS Gen2 location, from Airflow's LivyOperator.
livy_python_task = LivyOperator(task_id='pi_python_task', livy_conn_id='livy_default', file='abfss://container@storage/file.py', conf={'fs.azure.account.key.adlsint3.dfs.core.windows.net': 'key'}, polling_interval=60)
It is ignoring the property fs.azure.account.key.adlsint3.dfs.core.windows.net and throwing this error:
22/11/17 00:00:02 WARN FileSystem: Failed to initialize fileystem abfss://container@storage/file.py: Failure to initialize configuration
[2022-11-17, 00:01:00 UTC] {livy.py:301} INFO - Exception in thread "main" Failure to initialize configuration
[2022-11-17, 00:01:00 UTC] {livy.py:301} INFO - at org.apache.hadoop.fs.azurebfs.services.SimpleKeyProvider.getStorageAccountKey(SimpleKeyProvider.java:51)
[2022-11-17, 00:01:00 UTC] {livy.py:301} INFO - at org.apache.hadoop.fs.azurebfs.AbfsConfiguration.getStorageAccountKey(AbfsConfiguration.java:548)
[2022-11-17, 00:01:00 UTC] {livy.py:301} INFO - at org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore.initializeClient(AzureBlobFileSystemStore.java:1449)
[2022-11-17, 00:01:00 UTC] {livy.py:301} INFO - at org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore.<init>(AzureBlobFileSystemStore.java:215)
I have to pass this storage key property as an argument, not embed it in the code.
Is there a way to access this file, and how can I pass a non-Spark property through Livy?
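Livy typically only passes through Spark configuration keys, so one commonly suggested workaround (an assumption about your setup, not something confirmed in the question) is to prefix the Hadoop property with spark.hadoop. so that Spark copies it into the Hadoop configuration, and to pull the key from an Airflow Variable instead of hard-coding it. A hedged sketch; the Variable name adls_account_key and the full abfss URI are hypothetical:
from airflow.models import Variable
from airflow.providers.apache.livy.operators.livy import LivyOperator

# Hypothetical Airflow Variable holding the storage account key (kept out of the DAG code).
storage_key = Variable.get("adls_account_key")

livy_python_task = LivyOperator(
    task_id="pi_python_task",
    livy_conn_id="livy_default",
    # Assumed full ABFSS URI: abfss://<container>@<account>.dfs.core.windows.net/<path>
    file="abfss://container@storage.dfs.core.windows.net/file.py",
    conf={
        # The spark.hadoop. prefix makes Spark forward the property into the Hadoop
        # configuration, which is where the abfss filesystem looks for the account key.
        "spark.hadoop.fs.azure.account.key.adlsint3.dfs.core.windows.net": storage_key,
    },
    polling_interval=60,
)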
I am trying to deploy a Flink job in a Kubernetes cluster (Azure AKS). The job cluster aborts just after starting, but the task manager runs fine.
The Docker image is created successfully without any exception. I am able to run the Docker image as well as SSH into it.
I have followed the steps mentioned in the link below:
https://github.com/apache/flink/tree/release-1.9/flink-container/kubernetes
While creating the image I provided the job JAR, and it was copied to "/opt/artifacts" inside the image. I still don't understand why I get the exception below in the job cluster pod log:
Caused by: org.apache.flink.util.FlinkException: Failed to find job JAR on class path. Please provide the job class name explicitly.
I am new to Kubernetes; could you please give me some clue to debug this issue?
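For reference, the classpath scanner behind that exception only looks at JARs on the JVM classpath (in the log below that is just the three JARs under /opt/flink-1.8.0/lib), so a JAR copied to /opt/artifacts is never seen. A minimal sketch of the kind of Dockerfile change that addresses this, with an assumed JAR name (the answer's Dockerfile further down takes the same approach of copying the job JAR into $FLINK_HOME/lib):
# Hypothetical fragment of the image build; word-count-simple.jar is an assumed name.
FROM flink:1.8.0
# Put the job JAR where the job-cluster entrypoint's classpath scan can find it.
COPY target/word-count-simple.jar $FLINK_HOME/lib/word-count-simple.jar
# Alternatively, keep the JAR on the classpath some other way and pass
# --job-classname com.flink.wordCountSimple to the job-cluster arguments.
The later ClassNotFoundException for com.flink.wordCountSimple likely has the same root cause: the class name is passed explicitly, but the JAR containing it is still not on the classpath.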
Please find the complete logs below:
A. flink-job-cluster Pod Log
develk@ACIDLAELKV01:~/cntx_eng$ kubectl logs flink-job-cluster-kszwf
Starting the job-cluster
Starting standalonejob as a console application on host flink-job-cluster-kszwf.
2019-12-12 10:37:17,170 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - --------------------------------------------------------------------------------
2019-12-12 10:37:17,172 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - Starting StandaloneJobClusterEntryPoint (Version: 1.8.0, Rev:4caec0d, Date:03.04.2019 @ 13:25:54 PDT)
2019-12-12 10:37:17,172 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - OS current user: flink
2019-12-12 10:37:17,173 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - Current Hadoop/Kerberos user: <no hadoop dependency found>
2019-12-12 10:37:17,173 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - JVM: OpenJDK 64-Bit Server VM - IcedTea - 1.8/25.212-b04
2019-12-12 10:37:17,173 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - Maximum heap size: 989 MiBytes
2019-12-12 10:37:17,173 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - JAVA_HOME: /usr/lib/jvm/java-1.8-openjdk/jre
2019-12-12 10:37:17,174 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - No Hadoop Dependency available
2019-12-12 10:37:17,174 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - JVM Options:
2019-12-12 10:37:17,174 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - -Xms1024m
2019-12-12 10:37:17,174 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - -Xmx1024m
2019-12-12 10:37:17,174 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - -Dlog4j.configuration=file:/opt/flink-1.8.0/conf/log4j-console.properties
2019-12-12 10:37:17,175 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - -Dlogback.configurationFile=file:/opt/flink-1.8.0/conf/logback-console.xml
2019-12-12 10:37:17,175 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - Program Arguments:
2019-12-12 10:37:17,175 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - --configDir
2019-12-12 10:37:17,175 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - /opt/flink-1.8.0/conf
2019-12-12 10:37:17,175 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - -Djobmanager.rpc.address=flink-job-cluster
2019-12-12 10:37:17,175 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - -Dparallelism.default=1
2019-12-12 10:37:17,176 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - -Dblob.server.port=6124
2019-12-12 10:37:17,176 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - -Dqueryable-state.server.ports=6125
2019-12-12 10:37:17,176 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - Classpath: /opt/flink-1.8.0/lib/log4j-1.2.17.jar:/opt/flink-1.8.0/lib/slf4j-log4j12-1.7.15.jar:/opt/flink-1.8.0/lib/flink-dist_2.11-1.8.0.jar:::
2019-12-12 10:37:17,176 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - --------------------------------------------------------------------------------
2019-12-12 10:37:17,178 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - Registered UNIX signal handlers for [TERM, HUP, INT]
2019-12-12 10:37:17,306 INFO org.apache.flink.configuration.GlobalConfiguration - Loading configuration property: jobmanager.rpc.address, localhost
2019-12-12 10:37:17,306 INFO org.apache.flink.configuration.GlobalConfiguration - Loading configuration property: jobmanager.rpc.port, 6123
2019-12-12 10:37:17,307 INFO org.apache.flink.configuration.GlobalConfiguration - Loading configuration property: jobmanager.heap.size, 1024m
2019-12-12 10:37:17,307 INFO org.apache.flink.configuration.GlobalConfiguration - Loading configuration property: taskmanager.heap.size, 1024m
2019-12-12 10:37:17,307 INFO org.apache.flink.configuration.GlobalConfiguration - Loading configuration property: taskmanager.numberOfTaskSlots, 1
2019-12-12 10:37:17,307 INFO org.apache.flink.configuration.GlobalConfiguration - Loading configuration property: parallelism.default, 1
2019-12-12 10:37:17,336 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - Starting StandaloneJobClusterEntryPoint.
2019-12-12 10:37:17,336 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - Install default filesystem.
2019-12-12 10:37:17,343 INFO org.apache.flink.core.fs.FileSystem - Hadoop is not in the classpath/dependencies. The extended set of supported File Systems via Hadoop is not available.
2019-12-12 10:37:17,352 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - Install security context.
2019-12-12 10:37:17,362 INFO org.apache.flink.runtime.security.modules.HadoopModuleFactory - Cannot create Hadoop Security Module because Hadoop cannot be found in the Classpath.
2019-12-12 10:37:17,381 INFO org.apache.flink.runtime.security.SecurityUtils - Cannot install HadoopSecurityContext because Hadoop cannot be found in the Classpath.
2019-12-12 10:37:17,382 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - Initializing cluster services.
2019-12-12 10:37:17,638 INFO org.apache.flink.runtime.rpc.akka.AkkaRpcServiceUtils - Trying to start actor system at flink-job-cluster:6123
2019-12-12 10:37:18,163 INFO akka.event.slf4j.Slf4jLogger - Slf4jLogger started
2019-12-12 10:37:18,237 INFO akka.remote.Remoting - Starting remoting
2019-12-12 10:37:18,366 INFO akka.remote.Remoting - Remoting started; listening on addresses :[akka.tcp://flink@flink-job-cluster:6123]
2019-12-12 10:37:18,375 INFO org.apache.flink.runtime.rpc.akka.AkkaRpcServiceUtils - Actor system started at akka.tcp://flink@flink-job-cluster:6123
2019-12-12 10:37:18,398 INFO org.apache.flink.configuration.Configuration - Config uses fallback configuration key 'jobmanager.rpc.address' instead of key 'rest.address'
2019-12-12 10:37:18,407 INFO org.apache.flink.runtime.blob.BlobServer - Created BLOB server storage directory /tmp/blobStore-63338044-67c1-4872-a3d9-c94563b3a7c3
2019-12-12 10:37:18,412 INFO org.apache.flink.runtime.blob.BlobServer - Started BLOB server at 0.0.0.0:6124 - max concurrent requests: 50 - max backlog: 1000
2019-12-12 10:37:18,428 INFO org.apache.flink.runtime.metrics.MetricRegistryImpl - No metrics reporter configured, no metrics will be exposed/reported.
2019-12-12 10:37:18,430 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - Trying to start actor system at flink-job-cluster:0
2019-12-12 10:37:18,464 INFO akka.event.slf4j.Slf4jLogger - Slf4jLogger started
2019-12-12 10:37:18,472 INFO akka.remote.Remoting - Starting remoting
2019-12-12 10:37:18,480 INFO akka.remote.Remoting - Remoting started; listening on addresses :[akka.tcp://flink-metrics@flink-job-cluster:33529]
2019-12-12 10:37:18,482 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - Actor system started at akka.tcp://flink-metrics@flink-job-cluster:33529
2019-12-12 10:37:18,490 INFO org.apache.flink.runtime.blob.TransientBlobCache - Created BLOB cache storage directory /tmp/blobStore-ba64dcdb-5095-41fc-9c98-0f1528d95c40
2019-12-12 10:37:18,514 INFO org.apache.flink.configuration.Configuration - Config uses fallback configuration key 'jobmanager.rpc.address' instead of key 'rest.address'
2019-12-12 10:37:18,515 WARN org.apache.flink.runtime.jobmaster.MiniDispatcherRestEndpoint - Upload directory /tmp/flink-web-f6be0c2d-5099-4bd6-bc72-a0ae1fc6448e/flink-web-upload does not exist, or has been deleted externally. Previously uploaded files are no longer available.
2019-12-12 10:37:18,516 INFO org.apache.flink.runtime.jobmaster.MiniDispatcherRestEndpoint - Created directory /tmp/flink-web-f6be0c2d-5099-4bd6-bc72-a0ae1fc6448e/flink-web-upload for file uploads.
2019-12-12 10:37:18,603 INFO org.apache.flink.runtime.jobmaster.MiniDispatcherRestEndpoint - Starting rest endpoint.
2019-12-12 10:37:18,872 WARN org.apache.flink.runtime.webmonitor.WebMonitorUtils - Log file environment variable 'log.file' is not set.
2019-12-12 10:37:18,872 WARN org.apache.flink.runtime.webmonitor.WebMonitorUtils - JobManager log files are unavailable in the web dashboard. Log file location not found in environment variable 'log.file' or configuration key 'Key: 'web.log.path' , default: null (fallback keys: [{key=jobmanager.web.log.path, isDeprecated=true}])'.
2019-12-12 10:37:19,115 INFO org.apache.flink.runtime.jobmaster.MiniDispatcherRestEndpoint - Rest endpoint listening at flink-job-cluster:8081
2019-12-12 10:37:19,116 INFO org.apache.flink.runtime.jobmaster.MiniDispatcherRestEndpoint - http://flink-job-cluster:8081 was granted leadership with leaderSessionID=00000000-0000-0000-0000-000000000000
2019-12-12 10:37:19,116 INFO org.apache.flink.runtime.jobmaster.MiniDispatcherRestEndpoint - Web frontend listening at http://flink-job-cluster:8081.
2019-12-12 10:37:19,239 INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for org.apache.flink.runtime.resourcemanager.StandaloneResourceManager at akka://flink/user/resourcemanager .
2019-12-12 10:37:19,262 INFO org.apache.flink.container.entrypoint.ClassPathJobGraphRetriever - Scanning class path for job JAR
2019-12-12 10:37:19,270 INFO org.apache.flink.runtime.jobmaster.MiniDispatcherRestEndpoint - Shutting down rest endpoint.
2019-12-12 10:37:19,295 INFO org.apache.flink.runtime.jobmaster.MiniDispatcherRestEndpoint - Removing cache directory /tmp/flink-web-f6be0c2d-5099-4bd6-bc72-a0ae1fc6448e/flink-web-ui
2019-12-12 10:37:19,299 INFO org.apache.flink.runtime.jobmaster.MiniDispatcherRestEndpoint - http://flink-job-cluster:8081 lost leadership
2019-12-12 10:37:19,299 INFO org.apache.flink.runtime.jobmaster.MiniDispatcherRestEndpoint - Shut down complete.
2019-12-12 10:37:19,302 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - Shutting StandaloneJobClusterEntryPoint down with application status FAILED. Diagnostics org.apache.flink.util.FlinkException: Could not create the DispatcherResourceManagerComponent.
at org.apache.flink.runtime.entrypoint.component.AbstractDispatcherResourceManagerComponentFactory.create(AbstractDispatcherResourceManagerComponentFactory.java:257)
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runCluster(ClusterEntrypoint.java:224)
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.lambda$startCluster$0(ClusterEntrypoint.java:172)
at org.apache.flink.runtime.security.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:30)
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startCluster(ClusterEntrypoint.java:171)
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runClusterEntrypoint(ClusterEntrypoint.java:535)
at org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:105)
Caused by: org.apache.flink.util.FlinkException: Failed to find job JAR on class path. Please provide the job class name explicitly.
at org.apache.flink.container.entrypoint.ClassPathJobGraphRetriever.getJobClassNameOrScanClassPath(ClassPathJobGraphRetriever.java:131)
at org.apache.flink.container.entrypoint.ClassPathJobGraphRetriever.createPackagedProgram(ClassPathJobGraphRetriever.java:114)
at org.apache.flink.container.entrypoint.ClassPathJobGraphRetriever.retrieveJobGraph(ClassPathJobGraphRetriever.java:96)
at org.apache.flink.runtime.dispatcher.JobDispatcherFactory.createDispatcher(JobDispatcherFactory.java:62)
at org.apache.flink.runtime.dispatcher.JobDispatcherFactory.createDispatcher(JobDispatcherFactory.java:41)
at org.apache.flink.runtime.entrypoint.component.AbstractDispatcherResourceManagerComponentFactory.create(AbstractDispatcherResourceManagerComponentFactory.java:184)
... 6 more
Caused by: java.util.NoSuchElementException: No JAR with manifest attribute for entry class
at org.apache.flink.container.entrypoint.JarManifestParser.findOnlyEntryClass(JarManifestParser.java:80)
at org.apache.flink.container.entrypoint.ClassPathJobGraphRetriever.scanClassPathForJobJar(ClassPathJobGraphRetriever.java:137)
at org.apache.flink.container.entrypoint.ClassPathJobGraphRetriever.getJobClassNameOrScanClassPath(ClassPathJobGraphRetriever.java:129)
... 11 more
.
2019-12-12 10:37:19,305 INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:6124
2019-12-12 10:37:19,305 INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
2019-12-12 10:37:19,315 INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
2019-12-12 10:37:19,320 INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
2019-12-12 10:37:19,321 INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
2019-12-12 10:37:19,323 INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
2019-12-12 10:37:19,325 INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
2019-12-12 10:37:19,354 INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
2019-12-12 10:37:19,356 INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
2019-12-12 10:37:19,378 INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
2019-12-12 10:37:19,382 ERROR org.apache.flink.runtime.entrypoint.ClusterEntrypoint - Could not start cluster entrypoint StandaloneJobClusterEntryPoint.
org.apache.flink.runtime.entrypoint.ClusterEntrypointException: Failed to initialize the cluster entrypoint StandaloneJobClusterEntryPoint.
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startCluster(ClusterEntrypoint.java:190)
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runClusterEntrypoint(ClusterEntrypoint.java:535)
at org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:105)
Caused by: org.apache.flink.util.FlinkException: Could not create the DispatcherResourceManagerComponent.
at org.apache.flink.runtime.entrypoint.component.AbstractDispatcherResourceManagerComponentFactory.create(AbstractDispatcherResourceManagerComponentFactory.java:257)
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runCluster(ClusterEntrypoint.java:224)
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.lambda$startCluster$0(ClusterEntrypoint.java:172)
at org.apache.flink.runtime.security.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:30)
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startCluster(ClusterEntrypoint.java:171)
... 2 more
Caused by: org.apache.flink.util.FlinkException: Failed to find job JAR on class path. Please provide the job class name explicitly.
at org.apache.flink.container.entrypoint.ClassPathJobGraphRetriever.getJobClassNameOrScanClassPath(ClassPathJobGraphRetriever.java:131)
at org.apache.flink.container.entrypoint.ClassPathJobGraphRetriever.createPackagedProgram(ClassPathJobGraphRetriever.java:114)
at org.apache.flink.container.entrypoint.ClassPathJobGraphRetriever.retrieveJobGraph(ClassPathJobGraphRetriever.java:96)
at org.apache.flink.runtime.dispatcher.JobDispatcherFactory.createDispatcher(JobDispatcherFactory.java:62)
at org.apache.flink.runtime.dispatcher.JobDispatcherFactory.createDispatcher(JobDispatcherFactory.java:41)
at org.apache.flink.runtime.entrypoint.component.AbstractDispatcherResourceManagerComponentFactory.create(AbstractDispatcherResourceManagerComponentFactory.java:184)
... 6 more
Caused by: java.util.NoSuchElementException: No JAR with manifest attribute for entry class
at org.apache.flink.container.entrypoint.JarManifestParser.findOnlyEntryClass(JarManifestParser.java:80)
at org.apache.flink.container.entrypoint.ClassPathJobGraphRetriever.scanClassPathForJobJar(ClassPathJobGraphRetriever.java:137)
at org.apache.flink.container.entrypoint.ClassPathJobGraphRetriever.getJobClassNameOrScanClassPath(ClassPathJobGraphRetriever.java:129)
... 11 more
develk@ACIDLAELKV01:~/cntx_eng$
Now, I have added the job class name to the args section of the "job-cluster-job.yaml.template" file, like below:
args: ["job-cluster",
"--job-classname", "com.flink.wordCountSimple",
"-Djobmanager.rpc.address=flink-job-cluster",
But after that I am getting the exception below:
Caused by: org.apache.flink.util.FlinkException: Could not load the provided entrypoint class.
Please see the detailed log below.
2019-12-13 19:08:34,323 INFO org.apache.flink.runtime.jobmaster.MiniDispatcherRestEndpoint - Shut down complete.
2019-12-13 19:08:34,329 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint - Shutting StandaloneJobClusterEntryPoint down with application status FAILED. Diagnostics org.apache.flink.util.FlinkException: Could not create the DispatcherResourceManagerComponent.
at org.apache.flink.runtime.entrypoint.component.AbstractDispatcherResourceManagerComponentFactory.create(AbstractDispatcherResourceManagerComponentFactory.java:257)
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runCluster(ClusterEntrypoint.java:224)
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.lambda$startCluster$0(ClusterEntrypoint.java:172)
at org.apache.flink.runtime.security.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:30)
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startCluster(ClusterEntrypoint.java:171)
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runClusterEntrypoint(ClusterEntrypoint.java:535)
at org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:105)
Caused by: org.apache.flink.util.FlinkException: Could not load the provided entrypoint class.
at org.apache.flink.container.entrypoint.ClassPathJobGraphRetriever.createPackagedProgram(ClassPathJobGraphRetriever.java:119)
at org.apache.flink.container.entrypoint.ClassPathJobGraphRetriever.retrieveJobGraph(ClassPathJobGraphRetriever.java:96)
at org.apache.flink.runtime.dispatcher.JobDispatcherFactory.createDispatcher(JobDispatcherFactory.java:62)
at org.apache.flink.runtime.dispatcher.JobDispatcherFactory.createDispatcher(JobDispatcherFactory.java:41)
at org.apache.flink.runtime.entrypoint.component.AbstractDispatcherResourceManagerComponentFactory.create(AbstractDispatcherResourceManagerComponentFactory.java:184)
... 6 more
Caused by: java.lang.ClassNotFoundException: com.flink.wordCountSimple
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.flink.container.entrypoint.ClassPathJobGraphRetriever.createPackagedProgram(ClassPathJobGraphRetriever.java:116)
... 10 more
.
2019-12-13 19:08:34,337 INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:6124
2019-12-13 19:08:34,338 INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
2019-12-13 19:08:34,364 INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
2019-12-13 19:08:34,368 INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
2019-12-13 19:08:34,372 INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
2019-12-13 19:08:34,392 INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
2019-12-13 19:08:34,392 INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
2019-12-13 19:08:34,406 INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
2019-12-13 19:08:34,410 INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
2019-12-13 19:08:34,434 INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
2019-12-13 19:08:34,443 ERROR org.apache.flink.runtime.entrypoint.ClusterEntrypoint - Could not start cluster entrypoint StandaloneJobClusterEntryPoint.
org.apache.flink.runtime.entrypoint.ClusterEntrypointException: Failed to initialize the cluster entrypoint StandaloneJobClusterEntryPoint.
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startCluster(ClusterEntrypoint.java:190)
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runClusterEntrypoint(ClusterEntrypoint.java:535)
at org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:105)
Caused by: org.apache.flink.util.FlinkException: Could not create the DispatcherResourceManagerComponent.
at org.apache.flink.runtime.entrypoint.component.AbstractDispatcherResourceManagerComponentFactory.create(AbstractDispatcherResourceManagerComponentFactory.java:257)
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runCluster(ClusterEntrypoint.java:224)
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.lambda$startCluster$0(ClusterEntrypoint.java:172)
at org.apache.flink.runtime.security.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:30)
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startCluster(ClusterEntrypoint.java:171)
... 2 more
Caused by: org.apache.flink.util.FlinkException: Could not load the provided entrypoint class.
at org.apache.flink.container.entrypoint.ClassPathJobGraphRetriever.createPackagedProgram(ClassPathJobGraphRetriever.java:119)
at org.apache.flink.container.entrypoint.ClassPathJobGraphRetriever.retrieveJobGraph(ClassPathJobGraphRetriever.java:96)
at org.apache.flink.runtime.dispatcher.JobDispatcherFactory.createDispatcher(JobDispatcherFactory.java:62)
at org.apache.flink.runtime.dispatcher.JobDispatcherFactory.createDispatcher(JobDispatcherFactory.java:41)
at org.apache.flink.runtime.entrypoint.component.AbstractDispatcherResourceManagerComponentFactory.create(AbstractDispatcherResourceManagerComponentFactory.java:184)
... 6 more
Caused by: java.lang.ClassNotFoundException: com.flink.wordCountSimple
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.flink.container.entrypoint.ClassPathJobGraphRetriever.createPackagedProgram(ClassPathJobGraphRetriever.java:116)
... 10 more
There's a complete, working example of creating and running a Flink job cluster on Kubernetes at https://github.com/alpinegizmo/flink-containers-example. Maybe that will help. See also https://www.youtube.com/watch?v=ceZtUDgh2TE.
version: "2.1"
services:
jobmanager:
build:
context: ./
args:
JAR_FILE: flink-event-tracker-bundled-1.6.0.jar
image: test/flink-event-tracker
expose:
- "6123"
ports:
- "8081:8081"
- "6123:6123"
command: job-cluster --job-classname com.company.test.flink.pipelines.KafkaPipelineConsumer -Djobmanager.rpc.address=jobmanager --runner=FlinkRunner --streaming=true --checkpointingInterval=30000
environment:
- JOB_MANAGER_RPC_ADDRESS=jobmanager
- JOB_MANAGER=jobmanager
volumes:
- data-volume:/docker/volumes
taskmanager:
image: test/flink-event-tracker
expose:
- "6121"
- "6122"
depends_on:
- jobmanager
command: task-manager -Djobmanager.rpc.address=jobmanager
links:
- "jobmanager:jobmanager"
environment:
- JOB_MANAGER_RPC_ADDRESS=jobmanager
- JOB_MANAGER=jobmanager
volumes:
- data-volume:/docker/volumes
volumes:
data-volume:
driver: local
driver_opts:
o: bind
type: none
device: /Users/home/Development/docker/volumes/flink
Dockerfile:
FROM flink:1.9
ARG JAR_FILE=""
ENV APP_OPTS ""
ENV JAVA_OPTS ""
ENV JOB_MANAGER=""
# Build arg allows passing the version at runtime
ARG VERSION=unset-version
COPY flink-conf.yml $FLINK_HOME/conf/flink-conf.yaml
COPY target/$JAR_FILE $FLINK_HOME/lib/event-tracker.jar
COPY docker-cluster-entrypoint.sh /docker-cluster-entrypoint.sh
RUN apt-get update && apt-get install procps -y && apt-get install curl -y
RUN echo "root:root" | chpasswd
RUN chmod 777 /docker-cluster-entrypoint.sh
RUN chmod 777 $FLINK_HOME/lib/event-tracker.jar
ENTRYPOINT [ "bash","/docker-cluster-entrypoint.sh" ]
docker-cluster-entrypoint.sh:
FLINK_HOME=${FLINK_HOME:-"/opt/flink/bin"}
JOB_CLUSTER="job-cluster"
TASK_MANAGER="task-manager"
CMD="$1"
shift;
if [ "${CMD}" = "--help" -o "${CMD}" = "-h" ]; then
echo "Usage: $(basename $0) (${JOB_CLUSTER}|${TASK_MANAGER})"
exit 0
elif [ "${CMD}" = "${JOB_CLUSTER}" -o "${CMD}" = "${TASK_MANAGER}" ]; then
echo "Starting the ${CMD}"
if [ "${CMD}" = "${TASK_MANAGER}" ]; then
exec $FLINK_HOME/bin/taskmanager.sh start-foreground "$@"
else
exec $FLINK_HOME/bin/standalone-job.sh start-foreground "$@"
fi
fi
How to run:
mvn clean install
docker-compose -f docker-compose.local.yml up --scale taskmanager=2 > exceptionlog.log
docker-compose -f docker-compose.local.yml build
This is the entire configuration that runs it in Docker. If you want to run it in Kubernetes, just convert the docker-compose file into the corresponding Kubernetes manifests (see the sketch below); the rest can stay the same. Consider packaging it as a Helm chart, which makes Kubernetes maintenance easier.
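As a sketch of that conversion step (assuming you use kompose; the compose file name matches the one referenced above, and the k8s/ output directory is an arbitrary choice):
# Generate Kubernetes manifests from the compose file, then apply them.
kompose convert -f docker-compose.local.yml -o k8s/
kubectl apply -f k8s/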
Note: we are using Apache Beam to code the job.
I have a working Hazelcast cluster configured with tcp-ip. I need it to work with Eureka discovery, so I am trying to implement the hazelcast-eureka-one plugin.
The (Spring Boot) app already registers itself with Eureka successfully, using the @EnableEurekaClient annotation. I am not concerned with whether the Hazelcast Eureka client is the same client or a different one; I am fine with Hazelcast registering itself separately from the app, as long as it works.
When I remove eureka-client.properties, the app will not start, showing an error that eureka-client.properties cannot be found. When the file is in place, the app starts, but apparently none of the properties from eureka-client.properties are loaded, which leaves Hazelcast not knowing where the Eureka server is. The logs indicate that the properties file is found, but none of the properties seem to be imported.
Upgrading hazelcast-eureka-one to 1.1 makes no change.
Setting use-metadata-for-host-and-port to true makes no change.
Gradle:
compile group: 'com.hazelcast', name: 'hazelcast-spring', version: '3.9.4'
compile group: 'com.hazelcast', name: 'hazelcast-hibernate52', version: '1.2.3'
compile group: 'com.hazelcast', name: 'hazelcast-eureka-one', version: '1.0.1'
hazelcast.xml:
<?xml version="1.0" encoding="UTF-8"?>
<hazelcast xsi:schemaLocation="http://www.hazelcast.com/schema/config http://www.hazelcast.com/schema/config/hazelcast-config-3.9.xsd"
xmlns="http://www.hazelcast.com/schema/config"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<instance-name>app.name.hazelcast.sessions-instance</instance-name>
<group>
<name>app.name.hazelcast.sessions.local-group</name>
</group>
<network>
<join>
<multicast enabled="false"/>
<tcp-ip enabled="false"/>
<aws enabled="false"/>
<discovery-strategies>
<discovery-strategy class="com.hazelcast.eureka.one.EurekaOneDiscoveryStrategy" enabled="true">
<properties>
<property name="self-registration">true</property>
<property name="namespace">hazelcast-app-name</property>
<property name="use-metadata-for-host-and-port">false</property>
</properties>
</discovery-strategy>
</discovery-strategies>
</join>
</network>
<map name="spring:session:sessions">
<attributes>
<attribute extractor="org.springframework.session.hazelcast.PrincipalNameExtractor">principalName</attribute>
</attributes>
<indexes>
<index>principalName</index>
</indexes>
</map>
eureka-client.properties:
hazelcast.shouldUseDns=false
hazelcast.datacenter=primary
hazelcast.name=hazelcast-app-name-sessions
hazelcast.serviceUrl.default=http://username:password@svcregistry1-dev.company.com:8580/eureka/,http://username:password@svcregistry2-dev.company.com:8590/eureka/
Log file:
Loading 'hazelcast.xml' from classpath.
2019-02-15 11:19:13,935 - INFO - [localhost-startStop-1] - [,,] - com.hazelcast.instance.AddressPicker : [LOCAL] [app.name.hazelcast.sessions.local-group] [3.9.4] Prefer IPv4 stack is true.
2019-02-15 11:19:14,166 - INFO - [localhost-startStop-1] - [,,] - com.hazelcast.instance.AddressPicker : [LOCAL] [app.name.hazelcast.sessions.local-group] [3.9.4] Picked [172.28.208.1]:5701, using socket ServerSocket[addr=/0:0:0:0:0:0:0:0,localport=5701], bind any local is true
2019-02-15 11:19:14,179 - INFO - [localhost-startStop-1] - [,,] - com.hazelcast.system : [172.28.208.1]:5701 [app.name.hazelcast.sessions.local-group] [3.9.4] Hazelcast 3.9.4 (20180420 - b8001d5) starting at [172.28.208.1]:5701
2019-02-15 11:19:14,179 - INFO - [localhost-startStop-1] - [,,] - com.hazelcast.system : [172.28.208.1]:5701 [app.name.hazelcast.sessions.local-group] [3.9.4] Copyright (c) 2008-2018, Hazelcast, Inc. All Rights Reserved.
2019-02-15 11:19:14,179 - INFO - [localhost-startStop-1] - [,,] - com.hazelcast.system : [172.28.208.1]:5701 [app.name.hazelcast.sessions.local-group] [3.9.4] Configured Hazelcast Serialization version: 1
2019-02-15 11:19:14,616 - INFO - [localhost-startStop-1] - [,,] - c.h.s.i.o.impl.BackpressureRegulator : [172.28.208.1]:5701 [app.name.hazelcast.sessions.local-group] [3.9.4] Backpressure is disabled
2019-02-15 11:19:15,309 - DEBUG - [localhost-startStop-1] - [,,] - .n.c.u.OverridingPropertiesConfiguration : Base path set to file:///C:/Users/my.name/IdeaProjects/AppName/build/classes/main/
2019-02-15 11:19:15,310 - DEBUG - [localhost-startStop-1] - [,,] - .n.c.u.OverridingPropertiesConfiguration : FileName set to eureka-client.properties
2019-02-15 11:19:15,310 - DEBUG - [localhost-startStop-1] - [,,] - .n.c.u.OverridingPropertiesConfiguration : URL set to file:/C:/Users/my.name/IdeaProjects/AppName/build/classes/main/eureka-client.properties
2019-02-15 11:19:15,316 - INFO - [localhost-startStop-1] - [,,] - c.n.config.util.ConfigurationUtils : Loaded properties file file:/C:/Users/my.name/IdeaProjects/AppName/build/classes/main/eureka-client.properties
2019-02-15 11:19:15,326 - INFO - [localhost-startStop-1] - [,,] - .p.EurekaConfigBasedInstanceInfoProvider : Setting initial instance status as: STARTING
2019-02-15 11:19:15,334 - WARN - [localhost-startStop-1] - [,,] - c.n.config.util.ConfigurationUtils : file:/C:/Users/my.name/IdeaProjects/AppName/build/classes/main/eureka-client.properties is already loaded
2019-02-15 11:19:15,385 - INFO - [localhost-startStop-1] - [,,] - com.netflix.discovery.DiscoveryClient : Initializing Eureka in region us-east-1
2019-02-15 11:19:15,951 - INFO - [localhost-startStop-1] - [,,] - c.n.d.provider.DiscoveryJerseyProvider : Using JSON encoding codec LegacyJacksonJson
2019-02-15 11:19:15,951 - INFO - [localhost-startStop-1] - [,,] - c.n.d.provider.DiscoveryJerseyProvider : Using JSON decoding codec LegacyJacksonJson
2019-02-15 11:19:16,143 - INFO - [localhost-startStop-1] - [,,] - c.n.d.provider.DiscoveryJerseyProvider : Using XML encoding codec XStreamXml
2019-02-15 11:19:16,143 - INFO - [localhost-startStop-1] - [,,] - c.n.d.provider.DiscoveryJerseyProvider : Using XML decoding codec XStreamXml
2019-02-15 11:19:16,392 - INFO - [localhost-startStop-1] - [,,] - c.n.d.s.r.aws.ConfigClusterResolver : Resolving eureka endpoints via configuration
2019-02-15 11:19:16,394 - DEBUG - [localhost-startStop-1] - [,,] - c.n.discovery.endpoint.EndpointUtils : The availability zone for the given region us-east-1 are [defaultZone]
2019-02-15 11:19:16,394 - DEBUG - [localhost-startStop-1] - [,,] - c.n.d.s.r.aws.ConfigClusterResolver : Config resolved to []
2019-02-15 11:19:16,394 - ERROR - [localhost-startStop-1] - [,,] - c.n.d.s.r.aws.ConfigClusterResolver : Cannot resolve to any endpoints from provided configuration: {defaultZone=[]}
2019-02-15 11:19:16,612 - DEBUG - [localhost-startStop-1] - [,,] - c.n.d.s.r.a.ZoneAffinityClusterResolver : Local zone=defaultZone; resolved to: []
2019-02-15 11:19:16,612 - ERROR - [localhost-startStop-1] - [,,] - c.n.d.s.transport.EurekaHttpClients : Initial resolution of Eureka server endpoints failed. Check ConfigClusterResolver logs for more info
2019-02-15 11:19:16,647 - INFO - [localhost-startStop-1] - [,,] - com.netflix.discovery.DiscoveryClient : Disable delta property : false
2019-02-15 11:19:16,647 - INFO - [localhost-startStop-1] - [,,] - com.netflix.discovery.DiscoveryClient : Single vip registry refresh property : null
2019-02-15 11:19:16,647 - INFO - [localhost-startStop-1] - [,,] - com.netflix.discovery.DiscoveryClient : Force full registry fetch : false
2019-02-15 11:19:16,647 - INFO - [localhost-startStop-1] - [,,] - com.netflix.discovery.DiscoveryClient : Application is null : false
2019-02-15 11:19:16,647 - INFO - [localhost-startStop-1] - [,,] - com.netflix.discovery.DiscoveryClient : Registered Applications size is zero : true
2019-02-15 11:19:16,647 - INFO - [localhost-startStop-1] - [,,] - com.netflix.discovery.DiscoveryClient : Application version is -1: true
2019-02-15 11:19:16,647 - INFO - [localhost-startStop-1] - [,,] - com.netflix.discovery.DiscoveryClient : Getting all instance registry info from the eureka server
2019-02-15 11:19:16,648 - DEBUG - [localhost-startStop-1] - [,,] - c.n.d.s.t.d.SessionedEurekaHttpClient : Ending a session and starting anew
2019-02-15 11:19:16,655 - ERROR - [localhost-startStop-1] - [,,] - com.netflix.discovery.DiscoveryClient : DiscoveryClient_UNKNOWN/0c99d08b-8072-4fe4-a20f-c8653e10e374 - was unable to refresh its cache! status = There is no known eureka server; cluster server list is empty
com.netflix.discovery.shared.transport.TransportException: There is no known eureka server; cluster server list is empty
at com.netflix.discovery.shared.transport.decorator.RetryableEurekaHttpClient.execute(RetryableEurekaHttpClient.java:108)
at com.netflix.discovery.shared.transport.decorator.EurekaHttpClientDecorator.getApplications(EurekaHttpClientDecorator.java:134)
at com.netflix.discovery.shared.transport.decorator.EurekaHttpClientDecorator$6.execute(EurekaHttpClientDecorator.java:137)
at com.netflix.discovery.shared.transport.decorator.SessionedEurekaHttpClient.execute(SessionedEurekaHttpClient.java:77)
at com.netflix.discovery.shared.transport.decorator.EurekaHttpClientDecorator.getApplications(EurekaHttpClientDecorator.java:134)
at com.netflix.discovery.DiscoveryClient.getAndStoreFullRegistry(DiscoveryClient.java:1051)
at com.netflix.discovery.DiscoveryClient.fetchRegistry(DiscoveryClient.java:965)
at com.netflix.discovery.DiscoveryClient.<init>(DiscoveryClient.java:414)
at com.netflix.discovery.DiscoveryClient.<init>(DiscoveryClient.java:269)
at com.netflix.discovery.DiscoveryClient.<init>(DiscoveryClient.java:265)
at com.netflix.discovery.DiscoveryClient.<init>(DiscoveryClient.java:257)
at com.hazelcast.eureka.one.EurekaOneDiscoveryStrategy.<init>(EurekaOneDiscoveryStrategy.java:147)
at com.hazelcast.eureka.one.EurekaOneDiscoveryStrategy.<init>(EurekaOneDiscoveryStrategy.java:55)
at com.hazelcast.eureka.one.EurekaOneDiscoveryStrategy$EurekaOneDiscoveryStrategyBuilder.build(EurekaOneDiscoveryStrategy.java:111)
at com.hazelcast.eureka.one.EurekaOneDiscoveryStrategyFactory.newDiscoveryStrategy(EurekaOneDiscoveryStrategyFactory.java:53)
at com.hazelcast.spi.discovery.impl.DefaultDiscoveryService.buildDiscoveryStrategy(DefaultDiscoveryService.java:185)
at com.hazelcast.spi.discovery.impl.DefaultDiscoveryService.loadDiscoveryStrategies(DefaultDiscoveryService.java:145)
at com.hazelcast.spi.discovery.impl.DefaultDiscoveryService.<init>(DefaultDiscoveryService.java:60)
at com.hazelcast.spi.discovery.impl.DefaultDiscoveryServiceProvider.newDiscoveryService(DefaultDiscoveryServiceProvider.java:29)
at com.hazelcast.instance.Node.createDiscoveryService(Node.java:265)
at com.hazelcast.instance.Node.<init>(Node.java:216)
at com.hazelcast.instance.HazelcastInstanceImpl.createNode(HazelcastInstanceImpl.java:160)
at com.hazelcast.instance.HazelcastInstanceImpl.<init>(HazelcastInstanceImpl.java:128)
at com.hazelcast.instance.HazelcastInstanceFactory.constructHazelcastInstance(HazelcastInstanceFactory.java:195)
at com.hazelcast.instance.HazelcastInstanceFactory.newHazelcastInstance(HazelcastInstanceFactory.java:174)
at com.hazelcast.instance.HazelcastInstanceFactory.newHazelcastInstance(HazelcastInstanceFactory.java:124)
at com.hazelcast.core.Hazelcast.newHazelcastInstance(Hazelcast.java:92)
at org.springframework.boot.autoconfigure.hazelcast.HazelcastServerConfiguration$HazelcastServerConfigFileConfiguration.hazelcastInstance(HazelcastServerConfiguration.java:56)
at org.springframework.boot.autoconfigure.hazelcast.HazelcastServerConfiguration$HazelcastServerConfigFileConfiguration$$EnhancerBySpringCGLIB$$d6cfebe6.CGLIB$hazelcastInstance$0(<generated>)
at org.springframework.boot.autoconfigure.hazelcast.HazelcastServerConfiguration$HazelcastServerConfigFileConfiguration$$EnhancerBySpringCGLIB$$d6cfebe6$$FastClassBySpringCGLIB$$3a3e2869.invoke(<generated>)
at org.springframework.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:228)
at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:365)
at org.springframework.boot.autoconfigure.hazelcast.HazelcastServerConfiguration$HazelcastServerConfigFileConfiguration$$EnhancerBySpringCGLIB$$d6cfebe6.hazelcastInstance(<generated>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:154)
at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:583)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1246)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1096)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:535)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:495)
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199)
at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:251)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1135)
at org.springframework.beans.factory.support.DefaultListableBeanFactory$DependencyObjectProvider.getObject(DefaultListableBeanFactory.java:1665)
at org.springframework.session.hazelcast.config.annotation.web.http.HazelcastHttpSessionConfiguration.setHazelcastInstance(HazelcastHttpSessionConfiguration.java:96)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredMethodElement.inject(AutowiredAnnotationBeanPostProcessor.java:696)
at org.springframework.beans.factory.annotation.InjectionMetadata.inject(InjectionMetadata.java:90)
at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.postProcessPropertyValues(AutowiredAnnotationBeanPostProcessor.java:370)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1336)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:572)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:495)
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199)
at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:373)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1246)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1096)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:535)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:495)
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199)
at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:251)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1135)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1062)
at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:819)
at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:725)
at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:475)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1246)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1096)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:535)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:495)
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:204)
at org.springframework.boot.web.servlet.ServletContextInitializerBeans.getOrderedBeansOfType(ServletContextInitializerBeans.java:226)
at org.springframework.boot.web.servlet.ServletContextInitializerBeans.getOrderedBeansOfType(ServletContextInitializerBeans.java:214)
at org.springframework.boot.web.servlet.ServletContextInitializerBeans.addServletContextInitializerBeans(ServletContextInitializerBeans.java:91)
at org.springframework.boot.web.servlet.ServletContextInitializerBeans.<init>(ServletContextInitializerBeans.java:80)
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.getServletContextInitializerBeans(ServletWebServerApplicationContext.java:250)
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.selfInitialize(ServletWebServerApplicationContext.java:237)
at org.springframework.boot.web.embedded.tomcat.TomcatStarter.onStartup(TomcatStarter.java:54)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5245)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1420)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1410)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Your namespace in hazelcast.xml must be the same as the prefix of the properties in eureka-client.properties.
In other words, you need to either change the namespace to:
<property name="namespace">hazelcast</property>
Or change your eureka-client.properties to:
hazelcast-app-name.shouldUseDns=false
hazelcast-app-name.datacenter=primary
hazelcast-app-name.name=hazelcast-app-name-sessions
hazelcast-app-name.serviceUrl.default=http://username:password@svcregistry1-dev.company.com:8580/eureka/,http://username:password@svcregistry2-dev.company.com:8590/eureka/
Please read more at:
Hazelcast Eureka Plugin GH repository
Hazelcast Eureka Plugin Code Sample
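As an alternative to keeping the two files in sync, newer plugin versions let you skip eureka-client.properties entirely and define the Eureka client settings directly on the discovery strategy. A hedged sketch, assuming hazelcast-eureka-one 1.1 or later supports the use-classpath-eureka-client-props switch; the service URLs are the ones from the question:
<discovery-strategy class="com.hazelcast.eureka.one.EurekaOneDiscoveryStrategy" enabled="true">
    <properties>
        <property name="self-registration">true</property>
        <property name="namespace">hazelcast</property>
        <!-- Read the Eureka client settings from these properties instead of the classpath file -->
        <property name="use-classpath-eureka-client-props">false</property>
        <property name="shouldUseDns">false</property>
        <property name="name">hazelcast-app-name-sessions</property>
        <property name="serviceUrl.default">http://username:password@svcregistry1-dev.company.com:8580/eureka/,http://username:password@svcregistry2-dev.company.com:8590/eureka/</property>
    </properties>
</discovery-strategy>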
I've recently installed Android Studio 3.1, and whenever I create a new project it gets stuck at "Building xxxx Gradle project info" for hours.
I've already followed the answers here, here, here and here, and none of them solved the problem.
OS: Windows 10
Gradle version: 4.4
Gradle directory:
C:\Users\Ahmed\.gradle\wrapper\dists\gradle-4.4-all\9br9xq1tocpiv8o6njlyu5op1
Gradle directory components:
- gradle-4.4/
- gradle-4.4-all.zip
- gradle-4.4-all.zip.lck
- gradle-4.4-all.zip.ok
Here's a screenshot:
Update:
After waiting for 4 hours, the program started with the following error message:
and here's the last part of the idea.log file:
2018-04-10 19:11:54,478 [e-1024-b02] INFO - j.ide.ui.OptionsTopHitProvider - 10386 ms spent to cache options in application
2018-04-10 19:11:54,723 [e-1024-b02] INFO - rd.FirstRunWizardFrameProvider - Overriding welcome frame to be resizable
2018-04-10 19:12:26,145 [d thread 2] INFO - .openapi.application.Preloader - Finished preloading com.intellij.ide.ui.search.SearchableOptionPreloader#201f6ae4
2018-04-10 19:12:35,171 [d thread 2] INFO - .openapi.application.Preloader - Finished preloading com.intellij.codeInsight.completion.CompletionPreloader#67e41d09
2018-04-10 19:21:08,465 [e-1024-b02] INFO - idea.project.IndexingSuspender - Subscribing project 'Project 'F:\Programming\Mobile\Opensource Android Apps\LeafPic-dev' LeafPic-dev' to indexing suspender events (com.android.tools.idea.project.IndexingSuspender#777708be)
2018-04-10 19:21:08,666 [e-1024-b02] INFO - ellij.project.impl.ProjectImpl - 147 project components initialized in 20284 ms
2018-04-10 19:21:08,668 [e-1024-b02] INFO - le.impl.ModuleManagerComponent - 0 module(s) loaded in 0 ms
2018-04-10 19:21:15,186 [e-1024-b02] INFO - e.project.sync.GradleSyncState - Started sync with Gradle for project 'LeafPic-dev'.
2018-04-10 19:21:15,490 [e-1024-b02] INFO - idea.project.IndexingSuspender - Consuming IndexingSuspender activation event: SYNC_STARTED
2018-04-10 19:21:21,837 [d thread 2] INFO - s.plugins.gradle.GradleManager - Instructing gradle to use java from C:/Program Files/Android/Android Studio/jre
2018-04-10 19:21:22,251 [d thread 2] INFO - s.plugins.gradle.GradleManager - Instructing gradle to use java from C:/Program Files/Android/Android Studio/jre
2018-04-10 19:21:25,535 [e-1024-b02] INFO - rojectCodeStyleSettingsManager - Initialized from default code style settings.
2018-04-10 19:24:23,889 [d thread 2] INFO - xecution.GradleExecutionHelper - Passing command-line args to Gradle Tooling API: -Didea.version=3.1 -Djava.awt.headless=true -Pandroid.injected.build.model.only=true -Pandroid.injected.build.model.only.advanced=true -Pandroid.injected.invoked.from.ide=true -Pandroid.injected.build.model.only.versioned=3 -Pandroid.injected.studio.version=3.1.1.0 -Pandroid.builder.sdkDownload=false --init-script C:\Users\Ahmed\AppData\Local\Temp\ijinit25.gradle --offline
2018-04-10 23:11:08,898 [d thread 2] INFO - .project.GradleProjectResolver - Gradle project resolve error
org.gradle.tooling.GradleConnectionException: Could not run build action using Gradle distribution 'https://services.gradle.org/distributions/gradle-4.4-all.zip'.
at org.gradle.tooling.internal.consumer.ExceptionTransformer.transform(ExceptionTransformer.java:55)
at org.gradle.tooling.internal.consumer.ExceptionTransformer.transform(ExceptionTransformer.java:29)
at org.gradle.tooling.internal.consumer.ResultHandlerAdapter.onFailure(ResultHandlerAdapter.java:41)
at org.gradle.tooling.internal.consumer.async.DefaultAsyncConsumerActionExecutor$1$1.run(DefaultAsyncConsumerActionExecutor.java:57)
at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
at org.gradle.internal.concurrent.StoppableExecutorImpl$1.run(StoppableExecutorImpl.java:46)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
at java.lang.Thread.run(Thread.java:745)
at org.gradle.tooling.internal.consumer.BlockingResultHandler.getResult(BlockingResultHandler.java:46)
at org.gradle.tooling.internal.consumer.DefaultBuildActionExecuter.run(DefaultBuildActionExecuter.java:60)
at org.jetbrains.plugins.gradle.service.project.GradleProjectResolver.doResolveProjectInfo(GradleProjectResolver.java:283)
at org.jetbrains.plugins.gradle.service.project.GradleProjectResolver.access$200(GradleProjectResolver.java:79)
at org.jetbrains.plugins.gradle.service.project.GradleProjectResolver$ProjectConnectionDataNodeFunction.fun(GradleProjectResolver.java:939)
at org.jetbrains.plugins.gradle.service.project.GradleProjectResolver$ProjectConnectionDataNodeFunction.fun(GradleProjectResolver.java:923)
at org.jetbrains.plugins.gradle.service.execution.GradleExecutionHelper.execute(GradleExecutionHelper.java:210)
at org.jetbrains.plugins.gradle.service.project.GradleProjectResolver.resolveProjectInfo(GradleProjectResolver.java:140)
at org.jetbrains.plugins.gradle.service.project.GradleProjectResolver.resolveProjectInfo(GradleProjectResolver.java:79)
at com.intellij.openapi.externalSystem.service.remote.RemoteExternalSystemProjectResolverImpl.lambda$resolveProjectInfo$0(RemoteExternalSystemProjectResolverImpl.java:37)
at com.intellij.openapi.externalSystem.service.remote.AbstractRemoteExternalSystemService.execute(AbstractRemoteExternalSystemService.java:59)
at com.intellij.openapi.externalSystem.service.remote.RemoteExternalSystemProjectResolverImpl.resolveProjectInfo(RemoteExternalSystemProjectResolverImpl.java:37)
at com.intellij.openapi.externalSystem.service.remote.wrapper.ExternalSystemProjectResolverWrapper.resolveProjectInfo(ExternalSystemProjectResolverWrapper.java:45)
at com.intellij.openapi.externalSystem.service.internal.ExternalSystemResolveProjectTask.doExecute(ExternalSystemResolveProjectTask.java:87)
at com.intellij.openapi.externalSystem.service.internal.AbstractExternalSystemTask.execute(AbstractExternalSystemTask.java:163)
at com.intellij.openapi.externalSystem.service.internal.AbstractExternalSystemTask.execute(AbstractExternalSystemTask.java:149)
at com.intellij.openapi.externalSystem.util.ExternalSystemUtil$3.execute(ExternalSystemUtil.java:557)
at com.intellij.openapi.externalSystem.util.ExternalSystemUtil$4.run(ExternalSystemUtil.java:619)
at com.intellij.openapi.progress.impl.CoreProgressManager$TaskRunnable.run(CoreProgressManager.java:713)
at com.intellij.openapi.progress.impl.CoreProgressManager$5.run(CoreProgressManager.java:397)
at com.intellij.openapi.progress.impl.CoreProgressManager.lambda$runProcess$1(CoreProgressManager.java:157)
at com.intellij.openapi.progress.impl.CoreProgressManager.registerIndicatorAndRun(CoreProgressManager.java:543)
at com.intellij.openapi.progress.impl.CoreProgressManager.executeProcessUnderProgress(CoreProgressManager.java:488)
at com.intellij.openapi.progress.impl.ProgressManagerImpl.executeProcessUnderProgress(ProgressManagerImpl.java:94)
at com.intellij.openapi.progress.impl.CoreProgressManager.runProcess(CoreProgressManager.java:144)
at com.intellij.openapi.application.impl.ApplicationImpl.lambda$null$10(ApplicationImpl.java:575)
at com.intellij.openapi.application.impl.ApplicationImpl$1.run(ApplicationImpl.java:315)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.gradle.launcher.daemon.client.NoUsableDaemonFoundException: Unable to find a usable idle daemon. I have connected to 100 different daemons but I could not use any of them to run the build. BuildActionParameters were DefaultBuildActionParameters{, currentDir=F:\Programming\Mobile\Opensource Android Apps\LeafPic-dev, systemProperties size=94, envVariables size=40, logLevel=LIFECYCLE, useDaemon=true, continuous=false, interactive=false, injectedPluginClasspath=[]}.
at org.gradle.launcher.daemon.client.DaemonClient.execute(DaemonClient.java:151)
at org.gradle.launcher.daemon.client.DaemonClient.execute(DaemonClient.java:92)
at org.gradle.tooling.internal.provider.DaemonBuildActionExecuter.execute(DaemonBuildActionExecuter.java:60)
at org.gradle.tooling.internal.provider.DaemonBuildActionExecuter.execute(DaemonBuildActionExecuter.java:41)
at org.gradle.tooling.internal.provider.LoggingBridgingBuildActionExecuter.execute(LoggingBridgingBuildActionExecuter.java:60)
at org.gradle.tooling.internal.provider.LoggingBridgingBuildActionExecuter.execute(LoggingBridgingBuildActionExecuter.java:34)
at org.gradle.tooling.internal.provider.ProviderConnection.run(ProviderConnection.java:156)
at org.gradle.tooling.internal.provider.ProviderConnection.runClientAction(ProviderConnection.java:140)
at org.gradle.tooling.internal.provider.ProviderConnection.run(ProviderConnection.java:126)
at org.gradle.tooling.internal.provider.DefaultConnection.run(DefaultConnection.java:224)
at org.gradle.tooling.internal.consumer.connection.CancellableConsumerConnection$CancellableActionRunner.run(CancellableConsumerConnection.java:99)
at org.gradle.tooling.internal.consumer.connection.AbstractConsumerConnection.run(AbstractConsumerConnection.java:62)
at org.gradle.tooling.internal.consumer.connection.ParameterValidatingConsumerConnection.run(ParameterValidatingConsumerConnection.java:53)
at org.gradle.tooling.internal.consumer.DefaultBuildActionExecuter$1.run(DefaultBuildActionExecuter.java:71)
at org.gradle.tooling.internal.consumer.connection.LazyConsumerActionExecutor.run(LazyConsumerActionExecutor.java:84)
at org.gradle.tooling.internal.consumer.connection.CancellableConsumerActionExecutor.run(CancellableConsumerActionExecutor.java:45)
at org.gradle.tooling.internal.consumer.connection.ProgressLoggingConsumerActionExecutor.run(ProgressLoggingConsumerActionExecutor.java:58)
at org.gradle.tooling.internal.consumer.connection.RethrowingErrorsConsumerActionExecutor.run(RethrowingErrorsConsumerActionExecutor.java:38)
at org.gradle.tooling.internal.consumer.async.DefaultAsyncConsumerActionExecutor$1$1.run(DefaultAsyncConsumerActionExecutor.java:55)
at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
at org.gradle.internal.concurrent.StoppableExecutorImpl$1.run(StoppableExecutorImpl.java:46)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
... 1 more
Caused by: org.gradle.launcher.daemon.client.DaemonInitialConnectException: The first result from the daemon was empty. Most likely the process died immediately after connection.
at org.gradle.launcher.daemon.client.DaemonClient.executeBuild(DaemonClient.java:170)
at org.gradle.launcher.daemon.client.DaemonClient.execute(DaemonClient.java:141)
... 24 more
2018-04-10 23:11:11,606 [d thread 2] WARN - nal.AbstractExternalSystemTask - The first result from the daemon was empty. Most likely the process died immediately after connection.
com.intellij.openapi.externalSystem.model.ExternalSystemException: The first result from the daemon was empty. Most likely the process died immediately after connection.
at com.android.tools.idea.gradle.project.sync.idea.ProjectImportErrorHandler.getUserFriendlyError(ProjectImportErrorHandler.java:72)
at com.android.tools.idea.gradle.project.sync.idea.AndroidGradleProjectResolver.getUserFriendlyError(AndroidGradleProjectResolver.java:436)
at org.jetbrains.plugins.gradle.service.project.AbstractProjectResolverExtension.getUserFriendlyError(AbstractProjectResolverExtension.java:158)
at org.jetbrains.plugins.gradle.service.project.GradleProjectResolver$ProjectConnectionDataNodeFunction.fun(GradleProjectResolver.java:943)
at org.jetbrains.plugins.gradle.service.project.GradleProjectResolver$ProjectConnectionDataNodeFunction.fun(GradleProjectResolver.java:923)
at org.jetbrains.plugins.gradle.service.execution.GradleExecutionHelper.execute(GradleExecutionHelper.java:210)
at org.jetbrains.plugins.gradle.service.project.GradleProjectResolver.resolveProjectInfo(GradleProjectResolver.java:140)
at org.jetbrains.plugins.gradle.service.project.GradleProjectResolver.resolveProjectInfo(GradleProjectResolver.java:79)
at com.intellij.openapi.externalSystem.service.remote.RemoteExternalSystemProjectResolverImpl.lambda$resolveProjectInfo$0(RemoteExternalSystemProjectResolverImpl.java:37)
at com.intellij.openapi.externalSystem.service.remote.AbstractRemoteExternalSystemService.execute(AbstractRemoteExternalSystemService.java:59)
at com.intellij.openapi.externalSystem.service.remote.RemoteExternalSystemProjectResolverImpl.resolveProjectInfo(RemoteExternalSystemProjectResolverImpl.java:37)
at com.intellij.openapi.externalSystem.service.remote.wrapper.ExternalSystemProjectResolverWrapper.resolveProjectInfo(ExternalSystemProjectResolverWrapper.java:45)
at com.intellij.openapi.externalSystem.service.internal.ExternalSystemResolveProjectTask.doExecute(ExternalSystemResolveProjectTask.java:87)
at com.intellij.openapi.externalSystem.service.internal.AbstractExternalSystemTask.execute(AbstractExternalSystemTask.java:163)
at com.intellij.openapi.externalSystem.service.internal.AbstractExternalSystemTask.execute(AbstractExternalSystemTask.java:149)
at com.intellij.openapi.externalSystem.util.ExternalSystemUtil$3.execute(ExternalSystemUtil.java:557)
at com.intellij.openapi.externalSystem.util.ExternalSystemUtil$4.run(ExternalSystemUtil.java:619)
at com.intellij.openapi.progress.impl.CoreProgressManager$TaskRunnable.run(CoreProgressManager.java:713)
at com.intellij.openapi.progress.impl.CoreProgressManager$5.run(CoreProgressManager.java:397)
at com.intellij.openapi.progress.impl.CoreProgressManager.lambda$runProcess$1(CoreProgressManager.java:157)
at com.intellij.openapi.progress.impl.CoreProgressManager.registerIndicatorAndRun(CoreProgressManager.java:543)
at com.intellij.openapi.progress.impl.CoreProgressManager.executeProcessUnderProgress(CoreProgressManager.java:488)
at com.intellij.openapi.progress.impl.ProgressManagerImpl.executeProcessUnderProgress(ProgressManagerImpl.java:94)
at com.intellij.openapi.progress.impl.CoreProgressManager.runProcess(CoreProgressManager.java:144)
at com.intellij.openapi.application.impl.ApplicationImpl.lambda$null$10(ApplicationImpl.java:575)
at com.intellij.openapi.application.impl.ApplicationImpl$1.run(ApplicationImpl.java:315)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.gradle.launcher.daemon.client.DaemonInitialConnectException: The first result from the daemon was empty. Most likely the process died immediately after connection.
at org.gradle.launcher.daemon.client.DaemonClient.executeBuild(DaemonClient.java:170)
at org.gradle.launcher.daemon.client.DaemonClient.execute(DaemonClient.java:141)
at org.gradle.launcher.daemon.client.DaemonClient.execute(DaemonClient.java:92)
at org.gradle.tooling.internal.provider.DaemonBuildActionExecuter.execute(DaemonBuildActionExecuter.java:60)
at org.gradle.tooling.internal.provider.DaemonBuildActionExecuter.execute(DaemonBuildActionExecuter.java:41)
at org.gradle.tooling.internal.provider.LoggingBridgingBuildActionExecuter.execute(LoggingBridgingBuildActionExecuter.java:60)
at org.gradle.tooling.internal.provider.LoggingBridgingBuildActionExecuter.execute(LoggingBridgingBuildActionExecuter.java:34)
at org.gradle.tooling.internal.provider.ProviderConnection.run(ProviderConnection.java:156)
at org.gradle.tooling.internal.provider.ProviderConnection.runClientAction(ProviderConnection.java:140)
at org.gradle.tooling.internal.provider.ProviderConnection.run(ProviderConnection.java:126)
at org.gradle.tooling.internal.provider.DefaultConnection.run(DefaultConnection.java:224)
at org.gradle.tooling.internal.consumer.connection.CancellableConsumerConnection$CancellableActionRunner.run(CancellableConsumerConnection.java:99)
at org.gradle.tooling.internal.consumer.connection.AbstractConsumerConnection.run(AbstractConsumerConnection.java:62)
at org.gradle.tooling.internal.consumer.connection.ParameterValidatingConsumerConnection.run(ParameterValidatingConsumerConnection.java:53)
at org.gradle.tooling.internal.consumer.DefaultBuildActionExecuter$1.run(DefaultBuildActionExecuter.java:71)
at org.gradle.tooling.internal.consumer.connection.LazyConsumerActionExecutor.run(LazyConsumerActionExecutor.java:84)
at org.gradle.tooling.internal.consumer.connection.CancellableConsumerActionExecutor.run(CancellableConsumerActionExecutor.java:45)
at org.gradle.tooling.internal.consumer.connection.ProgressLoggingConsumerActionExecutor.run(ProgressLoggingConsumerActionExecutor.java:58)
at org.gradle.tooling.internal.consumer.connection.RethrowingErrorsConsumerActionExecutor.run(RethrowingErrorsConsumerActionExecutor.java:38)
at org.gradle.tooling.internal.consumer.async.DefaultAsyncConsumerActionExecutor$1$1.run(DefaultAsyncConsumerActionExecutor.java:55)
at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
at org.gradle.internal.concurrent.StoppableExecutorImpl$1.run(StoppableExecutorImpl.java:46)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
... 1 more
2018-04-10 23:11:14,353 [d thread 2] WARN - ect.sync.idea.ProjectSetUpTask - The first result from the daemon was empty. Most likely the process died immediately after connection.
2018-04-10 23:11:14,530 [d thread 2] INFO - e.project.sync.GradleSyncState - Gradle sync failed: The first result from the daemon was empty. Most likely the process died immediately after connection.
Consult IDE log for more details (Help | Show Log) (3h 49m 59s 342ms)
2018-04-10 23:11:24,533 [d thread 2] INFO - j.ide.script.IdeStartupScripts - 0 startup script(s) found
2018-04-10 23:11:40,980 [ thread 67] INFO - tartup.impl.StartupManagerImpl - ExternalSystemStartupActivity run in 292ms under project opening modal progress
2018-04-10 23:11:41,137 [ thread 67] INFO - tartup.impl.StartupManagerImpl - ConfigProjectComponent run in 104ms under project opening modal progress
2018-04-10 23:11:41,471 [ thread 67] INFO - tartup.impl.StartupManagerImpl - OCInitialTablesBuildingActivity run in 259ms under project opening modal progress
2018-04-10 23:11:41,915 [ thread 67] INFO - tartup.impl.StartupManagerImpl - InitToolWindowsActivity run in 390ms under project opening modal progress
2018-04-10 23:11:41,915 [ thread 67] INFO - .diagnostic.PerformanceWatcher - Post-startup activities under progress took 2185ms; general responsiveness: ok; EDT responsiveness: ok
2018-04-10 23:11:44,556 [e-1024-b02] INFO - tartup.impl.StartupManagerImpl - F:/Programming/Mobile/Opensource Android Apps/LeafPic-dev/.idea case-sensitivity: expected=false actual=false
2018-04-10 23:11:45,139 [ thread 70] INFO - pl.projectlevelman.NewMappings - VCS Root: [] - [<Project>]
2018-04-10 23:11:50,476 [ thread 69] INFO - .diagnostic.PerformanceWatcher - Pushing properties took 6027ms; general responsiveness: ok; EDT responsiveness: 1/2 sluggish, 1/2 very slow
2018-04-10 23:11:53,440 [e-1024-b02] INFO - tor.impl.FileEditorManagerImpl - Project opening took 13874787 ms
2018-04-10 23:12:10,202 [ thread 69] INFO - .diagnostic.PerformanceWatcher - Indexable file iteration took 19598ms; general responsiveness: 1/18 sluggish, 7/18 very slow; EDT responsiveness: 0/16 sluggish, 10/16 very slow
2018-04-10 23:12:10,208 [ thread 69] INFO - indexing.UnindexedFilesUpdater - Unindexed files update started: 286 files to update
2018-04-10 23:13:05,540 [ thread 69] INFO - .diagnostic.PerformanceWatcher - Unindexed files update took 55332ms; general responsiveness: 1/54 sluggish, 1/54 very slow; EDT responsiveness: 0/54 sluggish, 8/54 very slow
2018-04-10 23:13:05,680 [ thread 69] INFO - #com.jetbrains.cidr.lang - Clearing symbols finished in 0 s.
2018-04-10 23:13:05,870 [ thread 69] INFO - #com.jetbrains.cidr.lang - Building symbols in FAST mode, 0 source files from total 0 project files
2018-04-10 23:13:06,317 [ thread 69] INFO - #com.jetbrains.cidr.lang - Loading Module Maps finished in 0 s.
2018-04-10 23:13:06,337 [ thread 69] INFO - #com.jetbrains.cidr.lang - Saving Module Maps finished in 0 s.
2018-04-10 23:13:06,337 [ thread 69] INFO - #com.jetbrains.cidr.lang - Saving Module Maps finished in 0 s.
2018-04-10 23:13:06,338 [ thread 69] INFO - #com.jetbrains.cidr.lang - Loaded 0 tables for 0 files (0 project files)
2018-04-10 23:13:06,346 [ thread 69] INFO - #com.jetbrains.cidr.lang - Building symbols for 0 source files
2018-04-10 23:13:06,482 [ thread 69] INFO - #com.jetbrains.cidr.lang - Building symbols for 0 unused headers
2018-04-10 23:13:06,485 [ thread 69] INFO - #com.jetbrains.cidr.lang - Building symbols finished in 0 s.
2018-04-10 23:13:06,490 [ thread 69] INFO - #com.jetbrains.cidr.lang - Saving modified symbols for 0 files (0 tables of total 0)
2018-04-10 23:13:06,564 [ thread 69] INFO - #com.jetbrains.cidr.lang - Saving symbols finished in 0 s.
2018-04-10 23:13:08,141 [e-1024-b02] INFO - tartup.impl.StartupManagerImpl - Some post-startup activities freeze UI for noticeable time. Please consider making them DumbAware to do them in background under modal progress, or just making them faster to speed up project opening.
2018-04-10 23:13:08,142 [e-1024-b02] INFO - tartup.impl.StartupManagerImpl - ProjectInspectionProfileStartUpActivity run in 1516ms on UI thread
2018-04-10 23:13:12,220 [e-1024-b02] INFO - j.ide.ui.OptionsTopHitProvider - 3038 ms spent to cache options in project
2018-04-10 23:13:15,808 [e-1024-b02] INFO - idea.project.IndexingSuspender - Starting batch update for project: Project 'F:\Programming\Mobile\Opensource Android Apps\LeafPic-dev' LeafPic-dev
2018-04-10 23:13:22,604 [d thread 2] INFO - g.FileBasedIndexProjectHandler - Reindexing refreshed files: 1 to update, calculated in 96ms
2018-04-10 23:13:22,694 [d thread 2] INFO - .diagnostic.PerformanceWatcher - Reindexing refreshed files took 89ms; general responsiveness: ok; EDT responsiveness: 1/1 sluggish
2018-04-10 23:13:25,617 [d thread 2] INFO - g.FileBasedIndexProjectHandler - Reindexing refreshed files: 0 to update, calculated in 12ms
2018-04-10 23:13:29,982 [d thread 2] INFO - CompilerWorkspaceConfiguration - Available processors: 4
2018-04-10 23:13:30,753 [d thread 2] INFO - g.FileBasedIndexProjectHandler - Reindexing refreshed files: 212 to update, calculated in 321ms
2018-04-10 23:13:39,277 [d thread 2] INFO - .diagnostic.PerformanceWatcher - Reindexing refreshed files took 8524ms; general responsiveness: 2/8 sluggish, 1/8 very slow; EDT responsiveness: 0/8 sluggish, 3/8 very slow
2018-04-10 23:24:17,350 [e-1024-b02] INFO - ide.actions.ShowFilePathAction -
Exit code 1
I've figured out the cause of the problem.
It was COMODO Firewall blocking some of the files.
I marked those files as trusted and the problem was solved.
If anyone runs into the same issue, consider your antivirus or firewall software as a possible cause, since it may be blocking some of the files Gradle needs.
Thanks to everyone who tried to help!
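For anyone seeing the same "Unable to find a usable idle daemon" / "The first result from the daemon was empty" errors, it can also help to confirm that the Gradle daemon can start at all outside the IDE, both before and after whitelisting files in the firewall. A minimal sketch of the commands I mean, assuming the project's Gradle wrapper is in the project root (use gradlew.bat instead of ./gradlew on Windows):

# Stop any stale daemons left over from the failed connection attempts
./gradlew --stop

# Run a trivial task with verbose output to see whether a fresh daemon starts and survives
./gradlew help --info

# List the daemons Gradle currently knows about and their state
./gradlew --status

If the daemon dies immediately even from the command line, the problem is outside Android Studio (firewall, antivirus, or JVM startup), which is what happened in my case.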
After running the "nodetool repair" command, the Cassandra node went down and did not start again.
INFO [main] 2016-10-19 12:44:50,244 ColumnFamilyStore.java:405 - Initializing system_schema.aggregates
INFO [main] 2016-10-19 12:44:50,247 ColumnFamilyStore.java:405 - Initializing system_schema.indexes
INFO [main] 2016-10-19 12:44:50,248 ViewManager.java:139 - Not submitting build tasks for views in keyspace system_schema as storage service is not initialized
Cassandra version 3.7
I started the node again and it came up fine, but it took far too long to start (more than 30 minutes).
INFO [main] 2016-10-19 15:32:48,348 ColumnFamilyStore.java:405 - Initializing system_schema.indexes
INFO [main] 2016-10-19 15:32:48,354 ViewManager.java:139 - Not submitting build tasks for views in keyspace system_schema as storage service is not initialized
INFO [main] 2016-10-19 16:07:36,529 ColumnFamilyStore.java:405 - Initializing system_distributed.parent_repair_history
INFO [main] 2016-10-19 16:07:36,546 ColumnFamilyStore.java:405 - Initializing system_distributed.repair_history
Now I'm trying to figure out why it is so slow.
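Here is a rough sketch of the checks I plan to run. These are standard nodetool and log inspections rather than anything specific to this cluster, and the log path is an assumption based on a default package install:

# Was the node replaying a large commit log during startup? (path assumes a default package install)
grep -i "commitlog" /var/log/cassandra/system.log | tail -n 20

# Is the node still streaming or compacting data left over from the repair?
nodetool netstats
nodetool compactionstats

# General health once the node is up
nodetool status
nodetool tpstats

The 35-minute gap in the log between the system_schema and system_distributed keyspaces suggests the node was busy replaying the commit log or rebuilding data from the interrupted repair, which is what I want to confirm with the checks above.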