I did a lot of searching, saw many people having a similar issue, and tried various suggested solutions. None worked.
Can someone help me?
resolvers += Resolver.url("bintray-sbt-plugins", url("http://dl.bintray.com/sbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")
The file is inside the project folder.
Instead of version 0.13.0, I also tried version 0.14.0.
I fixed this by adding the plugin's descriptor file, which I downloaded from
https://dl.bintray.com/sbt/sbt-plugin-releases/com.eed3si9n/sbt-assembly/scala_2.10/sbt_0.13/0.14.4/ivys/
to my local Ivy folder under .ivy2/local (if not present, create the local folder).
Once it was there, I ran the build and it downloaded the jar.
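For reference, the local layout has to mirror the Ivy-style pattern of the repository above (a sketch, assuming the default ~/.ivy2 home and the 0.14.4 descriptor from that URL):
# create the matching directory structure and drop the downloaded ivy.xml into it
mkdir -p ~/.ivy2/local/com.eed3si9n/sbt-assembly/scala_2.10/sbt_0.13/0.14.4/ivys
cp ivy.xml ~/.ivy2/local/com.eed3si9n/sbt-assembly/scala_2.10/sbt_0.13/0.14.4/ivys/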
You need to add a [root_dir]/project/plugins.sbt file with the following content:
// sbt-assembly plugin
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")
Even better: don't use sbt-assembly at all! Fat jars cause conflicts during merging, which need to be resolved with assemblyMergeStrategy.
Use the binary distribution packaging plugin that sbt offers instead, which lets you distribute as a launch script, dmg, msi, or tar.gz.
Check out sbt-native-packager
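A minimal sketch of the switch (the plugin version and the JavaAppPackaging archetype are my assumptions, not from the original answer):
// project/plugins.sbt
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.3.4")
// build.sbt
enablePlugins(JavaAppPackaging)
After that, sbt universal:packageBin builds a zip with launch scripts instead of a fat jar, so no assemblyMergeStrategy is needed.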
I have an offline pyspark cluster (no internet access) where I need to install the graphframes library.
I manually downloaded the jar from here, added it to $SPARK_HOME/jars/, and then when I try to use it I get the following error:
error: missing or invalid dependency detected while loading class file 'Logging.class'.
Could not access term typesafe in package com,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'Logging.class' was compiled against an incompatible version of com.
error: missing or invalid dependency detected while loading class file 'Logging.class'.
Could not access term scalalogging in value com.typesafe,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'Logging.class' was compiled against an incompatible version of com.typesafe.
error: missing or invalid dependency detected while loading class file 'Logging.class'.
Could not access type LazyLogging in value com.slf4j,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'Logging.class' was compiled against an incompatible version of com.slf4j.
What is the correct way to install it offline together with all its dependencies?
I managed to install the graphframes library. First of all, I found the graphframes dependencies, which were:
scala-logging-api_xx-xx.jar
scala-logging-slf4j_xx-xx.jar
where xx stands for the appropriate Scala and jar versions. I then installed them in the proper path. Because I work on a Cloudera machine, the proper path is:
/opt/cloudera/parcels/SPARK2/lib/spark2/jars/
If you cannot place them in this directory on your cluster (because you have no root rights and your admin is super lazy), you can simply add them to your spark-submit/spark-shell invocation:
spark-submit ..... --driver-class-path /path-for-jar/ \
--jars /../graphframes-0.5.0-spark2.1-s_2.11.jar,/../scala-logging-slf4j_2.10-2.1.2.jar,/../scala-logging-api_2.10-2.1.2.jar
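Once the shell starts with those jars, a quick smoke test helps confirm that everything resolves (a hedged sketch; the toy graph is mine, not from the original answer):
// inside spark-shell, where `spark` is the predefined SparkSession
import org.graphframes.GraphFrame
val v = spark.createDataFrame(Seq(("a", "Alice"), ("b", "Bob"))).toDF("id", "name")
val e = spark.createDataFrame(Seq(("a", "b", "knows"))).toDF("src", "dst", "relationship")
val g = GraphFrame(v, e)
g.degrees.show() // fails fast if graphframes or its logging dependencies are missing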
This works for Scala. In order to use graphframes from Python, you need to download the graphframes jar and then, through a shell:
# Extract the JAR contents
jar xf graphframes_graphframes-0.3.0-spark2.0-s_2.11.jar
# Enter the extracted folder
cd graphframes
# Zip the contents
zip graphframes.zip -r *
Then add the zipped file to your Python path in spark-env.sh or your .bash_profile, with
export PYTHONPATH=$PYTHONPATH:/..proper path/graphframes.zip:.
Then, opening the shell or submitting (again with the same arguments as for Scala), importing graphframes works normally.
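For example, a quick check in the pyspark shell (a hedged sketch; the toy graph is mine, not from the original answer):
# `spark` is predefined in the pyspark shell
from graphframes import GraphFrame
v = spark.createDataFrame([("a", "Alice"), ("b", "Bob")], ["id", "name"])
e = spark.createDataFrame([("a", "b", "knows")], ["src", "dst", "relationship"])
g = GraphFrame(v, e)
g.degrees.show()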
This link was extremely useful for this solution.
I'm working in IntelliJ and specified this parameter for my JVM:
-Dcom.github.fommil.netlib.BLAS=mkl_rt.dll (my MKL folder is on the PATH)
However, I still get the following warning:
WARN BLAS: Failed to load implementation from: mkl_rt.dll
Any help?
I finally solved this issue; here are the complete steps to make it work in IntelliJ IDEA on Windows:
First, create an SBT project and make sure to put the following line in build.sbt:
libraryDependencies ++= Seq("com.github.fommil.netlib" % "all" % "1.1.1" pomOnly())
Refresh the project; after that, you should have the libraries available. If that doesn't work for some reason, you can go to http://repo1.maven.org/maven2/com/github/fommil/netlib/ and download the necessary resources for your system directly.
Copy your mkl_rt.dll twice and rename the copies libblas3.dll and liblapack3.dll. Make sure the folders containing all the DLLs are in the PATH environment variable.
Finally, go to Run -> Edit Configurations and in the VM options put:
-Dcom.github.fommil.netlib.BLAS=mkl_rt.dll
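To verify which implementation actually loaded, a one-liner helps (a hedged sketch using netlib-java's BLAS.getInstance; this check is mine, not part of the original answer):
import com.github.fommil.netlib.BLAS
// prints a native class name (e.g. NativeSystemBLAS) on success,
// or F2jBLAS if it silently fell back to the pure-Java implementation
println(BLAS.getInstance().getClass.getName)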
I'm trying to work on an sbt project offline (again). Things almost seem to be OK, but there are strange behaviors that baffle me. Here's what I'm noticing:
I've created an empty sbt project and am considering the following dependencies in build.sbt:
name := "sbtSand"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies ++= Seq(
"joda-time" % "joda-time" % "2.9.1",
"org.apache.spark" %% "spark-core" % "1.5.2"
)
I built the project while online, and can see all the packages in [userhome]/.ivy2/cache. The project builds fine. I then turn off wifi, run sbt clean, and attempt to build. The build fails. I comment out the spark dependency (keeping the joda-time one). Still offline, I run sbt compile. The project builds fine. I put the spark dependency back in and run sbt clean. It again fails to build. I get back online, and I can build again.
The sbt output for a failed build looks like this: https://gist.github.com/ashic/9e5ebc39ff4eb8c41ffb
The key part of it is:
[info] Resolving org.apache.hadoop#hadoop-mapreduce-client-app;2.2.0 ...
[warn] Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-mapreduce-client-app/2.2.0/hadoop-mapreduce-client-app-2.2.0.pom
[info] You probably access the destination server through a proxy server that is not well configured.
It's interesting that sbt manages to use joda-time from the Ivy cache, but for the spark-core package (or rather, its dependencies) it wants to reach out to the internet, and the build fails. Could anybody please help me understand this, and tell me what I can do so that this works while fully offline?
It seems the issue is resolved in 0.13.9. I was using 0.13.8. [The 0.13.9 MSI for Windows seemed to give me 0.13.8, while the 0.13.9.2 MSI installed the right version. Existing projects need to be updated manually to 0.13.9 in build.properties.]
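Separately, sbt also exposes an offline setting that tells dependency resolution not to contact remote repositories (a hedged aside; this is my suggestion, not part of the original answer):
// in build.sbt
offline := true
// or ad hoc from the command line:
// sbt "set offline := true" compile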
I need to run F# with MonoDevelop on Arch Linux. (Please don't advise me NOT to use it; I have to use it, as it is required by a university course.) I tried to add the Language Binding for F# as described at this link (downloaded the zip file, extracted it, and tried to add it from the Add-in Manager):
https://code.google.com/p/wildart/wiki/FSharpBinding
It gives me the following errors:
The selected add-ins can not be installed because there are dependency conflicts.
The package 'Components v2.2' could not be found in any repository
The package 'Core v2.2' could not be found in any repository
The package 'Core.Gui v2.2' could not be found in any repository
The package 'Ide v2.2' could not be found in any repository
The package 'Projects v2.2' could not be found in any repository
The package 'Projects.Gui v2.2' could not be found in any repository
How can I solve this problem? I searched on the internet and couldn't find a solution yet.
As a second approach, I tried to build and install from scratch as described on GitHub:
https://github.com/fsharp/fsharpbinding/tree/master/monodevelop
I downloaded fsharpbinding.zip, installed nuget (I don't know what else I should download, as it says "required nuget packages"), unzipped the file, and ran ./configure.sh in its folder. It fails because it can't find fsc in the given paths:
which: no fsc in (/usr/local/sbin:/usr/local/bin:/usr/bin:/usr/bin/site_perl:/usr/bin/vendor_perl:/usr/bin/core_perl
It successfully finds all the other directories it looks for. F# is already installed, but I don't know how to find fsc to pass its path to configure.sh. What should I do?
I just ran into this, and it seems to be this issue. I fixed it by uninstalling monodevelop and installing monodevelop-latest from the AUR.
I am struggling with an incorrect usage of Composer, for sure.
I set up this repository: https://github.com/alle/assets-merger
I forked the project and was just trying to make it a Kohana module, including all the dependencies.
Since it needs the YUI Compressor JAR, I tried to make just that JAR file a dependency, and ended up declaring it in the composer.json file (please look at this).
Once I need to add my new package to a project, I add it to the require section as follows:
...
"alle/assets-merger": "dev-master",
...
But the (latest) composer update command says:
Loading composer repositories with package information
Updating dependencies (including require-dev)
Your requirements could not be resolved to an installable set of packages.
Problem 1
- Installation request for alle/assets-merger dev-develop -> satisfiable by alle/assets-merger[dev-develop].
- alle/assets-merger dev-develop requires yui/yuicompressor 2.4.8 -> no matching package found.
Potential causes:
- A typo in the package name
- The package is not available in a stable-enough version according to your minimum-stability setting see <https://groups.google.com/d/topic/composer-dev/_g3ASeIFlrc/discussion> for more details.
And my story ends here.
How should I configure my composer.json in the https://github.com/alle/assets-merger repository in order to include it as a fully satisfied Kohana module in other projects?
Some things I notice in your composer.json.
There is a version of that CSS minifier available on Packagist which says it is just a copy of the original Google Code-hosted files, but with Composer support: natxet/cssmin. It is version 3.0.2, but I think that shouldn't make a difference.
mrclay/minify is included twice in the packages, with the same version. It is also available on Packagist. You will probably use that already (version 2.2.0 is registered, and because you didn't turn off Packagist access, it will be generally available for install unless a version requirement or conflict prevents it).
You are trying to download a JAR file (which is a Java executable with no PHP in it), but you try to get PHP classmaps out of it. That will fail for sure.
You missed the big note in the Composer documentation saying that Composer cannot resolve repositories mentioned in sub-packages, only in the root package. That means that whatever repositories you mention in your alle/assets-merger package will not be used when that package is required anywhere else. You'd have to duplicate those repositories in every consuming project, in addition to requiring the package name itself.
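For example, the root composer.json of any consuming project would need to repeat the repository declaration (a sketch; the VCS repository type is my assumption about how your fork is published):
{
    "repositories": [
        { "type": "vcs", "url": "https://github.com/alle/assets-merger" }
    ],
    "require": {
        "alle/assets-merger": "dev-master"
    }
}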
What this means is that you probably avoided missing mrclay/minify because it is available on Packagist, and you may well have added cssmin by accident, but you definitely did not add YUICompressor.
But you shouldn't add it in the first place, because it is not PHP software. You can, however, add post-install commands to your projects. All your Composer integration does is download the JAR file, and you can do that with a post-install or post-update command instead. See the documentation here.
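A minimal sketch of that approach (the exact download URL is an assumption; point it at wherever you host the YUI Compressor JAR):
{
    "scripts": {
        "post-install-cmd": [
            "php -r \"copy('https://github.com/yui/yuicompressor/releases/download/v2.4.8/yuicompressor-2.4.8.jar', 'yuicompressor.jar');\""
        ]
    }
}
The same command can be repeated under post-update-cmd if the JAR should also be refreshed on composer update.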