I was just pointing out that you used "groupId" instead of "group", and "artifactId" instead of "module".

On Tue, Nov 17, 2015 at 4:04 PM, tog <guillaume.alleon@gmail.com> wrote:
Hi Keegan

Not sure I understand. When I run my script directly from the command line:

@Grab('org.apache.spark:spark-core_2.10:1.5.2')                                               -> works

@Grab(group='org.apache.spark', module='spark-core_2.10', version='1.5.2')                    -> works

groovy.grape.Grape.grab(group:'org.apache.spark', module:'spark-core_2.10', version:'1.5.2')  -> does not work

The last one was suggested to me in place of @Grab.
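(For what it's worth, a minimal sketch of how I would expect the programmatic form to be written; the classLoader argument is an assumption on my part, the idea being that the grabbed jars need to land on the loader that is actually running the script, and I have not verified it against this setup:)

groovy.grape.Grape.grab(
    group: 'org.apache.spark',           // Ivy-style names: group/module, not groupId/artifactId
    module: 'spark-core_2.10',
    version: '1.5.2',
    classLoader: this.class.classLoader  // assumption: Grape wants a GroovyClassLoader to attach the jars to
)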


When I try to run the same script through GroovyShell (which is what I really want to do), I get the following exception:

java.lang.ClassNotFoundException: org.apache.spark.rpc.akka.AkkaRpcEnvFactory
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at groovy.lang.GroovyClassLoader.loadClass(GroovyClassLoader.java:677)
    at groovy.lang.GroovyClassLoader.loadClass(GroovyClassLoader.java:787)
    at groovy.lang.GroovyClassLoader.loadClass(GroovyClassLoader.java:775)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
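
(One hedged guess at a workaround, since org.apache.spark.rpc.akka.AkkaRpcEnvFactory is resolved by Spark via Class.forName: the loader that grabbed the jars may need to be both the GroovyShell's parent and the thread's context classloader. A minimal, untested sketch of that idea, with the script name taken from the run further down the thread:)

import org.codehaus.groovy.control.CompilerConfiguration

// untested sketch: share one GroovyClassLoader between the shell, Grape and the context
// classloader, so classes pulled in by @Grab are visible to Spark's Class.forName calls
def loader = new GroovyClassLoader(this.class.classLoader)
Thread.currentThread().contextClassLoader = loader

def shell = new GroovyShell(loader, new Binding(), new CompilerConfiguration())
shell.evaluate(new File('GroovySparkWordcount.groovy'))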


Cheers

Guillaume


On 17 November 2015 at 20:47, Keegan Witt <keeganwitt@gmail.com> wrote:
Guillaume,
You just have the wrong syntax.  This
@Grab(group='org.apache.spark', module='spark-core_2.10', version='1.5.2')

is equivalent to this
@Grab('org.apache.spark:spark-core_2.10:1.5.2')

Grab uses the Ivy names for the coordinates rather than Maven's naming convention. Let me know if that doesn't work or doesn't answer your question.
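
For comparison, this is how the Maven POM coordinates line up with Grab's (Ivy-style) attribute names; just an illustration of the naming, nothing beyond the two forms above:

// <groupId>org.apache.spark</groupId>        ->  group
// <artifactId>spark-core_2.10</artifactId>   ->  module
// <version>1.5.2</version>                   ->  version
@Grab(group='org.apache.spark', module='spark-core_2.10', version='1.5.2')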

-Keegan

On Tue, Nov 17, 2015 at 3:24 PM, tog <guillaume.alleon@gmail.com> wrote:
Hello

Any more ideas regarding my issue?

Thanks
Guillaume

On 15 November 2015 at 20:19, tog <guillaume.alleon@gmail.com> wrote:
Sorry, my previous email is wrong.

The block:
groovy.grape.Grape.grab(
    groupId: 'org.apache.spark',
    artifactId: 'spark-core_2.10',
    version: '1.5.2'
)

does not seem equivalent to:

@Grab('org.apache.spark:spark-core_2.10:1.5.2')

since the imports cannot be found.

------------------------------------------------------------------------

tog GroovySpark $ groovy GroovySparkWordcount.groovy

org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
/Users/tog/Work/GroovySpark/GroovySparkWordcount.groovy: 9: unable to resolve class org.apache.spark.api.java.JavaSparkContext
 @ line 9, column 1.
   import org.apache.spark.api.java.JavaSparkContext
   ^

/Users/tog/Work/GroovySpark/GroovySparkWordcount.groovy: 8: unable to resolve class org.apache.spark.SparkConf
 @ line 8, column 1.
   import org.apache.spark.SparkConf
   ^

2 errors
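
(If I understand Grape right, this is expected: @Grab is applied at compile time, so the imports can resolve, while a plain Grape.grab call only runs after the script has already been compiled. A hedged, untested sketch of one way around that, loading the classes reflectively instead of importing them; the appName/master values are just placeholders:)

// untested sketch: no compile-time imports, so the script compiles before the grab happens
groovy.grape.Grape.grab(
    group: 'org.apache.spark',
    module: 'spark-core_2.10',
    version: '1.5.2',
    classLoader: this.class.classLoader
)

// load the Spark classes reflectively once the jars have been resolved
def confClass = this.class.classLoader.loadClass('org.apache.spark.SparkConf')
def conf = confClass.newInstance().setAppName('wordcount').setMaster('local[*]')  // placeholder values
def jscClass = this.class.classLoader.loadClass('org.apache.spark.api.java.JavaSparkContext')
def sc = jscClass.newInstance(conf)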


On 15 November 2015 at 18:55, tog <guillaume.alleon@gmail.com> wrote:
Thanks, yes, I just realized the typo ... I fixed it and get the very same error.
I am getting lost ;-)



org.apache.spark.SparkConf@2158ddec
java.lang.ClassNotFoundException: org.apache.spark.rpc.akka.AkkaRpcEnvFactory
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at org.apache.spark.rpc.RpcEnv$.getRpcEnvFactory(RpcEnv.scala:40)
    at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:52)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:247)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:188)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
    at org.codehaus.groovy.reflection.CachedConstructor.invoke(CachedConstructor.java:80)
    at org.codehaus.groovy.runtime.callsite.ConstructorSite$ConstructorSiteNoUnwrapNoCoerce.callConstructor(ConstructorSite.java:105)
    at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallConstructor(CallSiteArray.java:60)
    at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:235)
    at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:247)
    at Script6.run(Script6.groovy:16)



On 15 November 2015 at 18:41, Bahman Movaqar <Bahman@bahmanm.com> wrote:
On 11/15/2015 10:03 PM, tog wrote:

> @Grap seems to have default repo to look into ... with the change you
> are suggesting I got
> java.lang.RuntimeException: Error grabbing Grapes -- [unresolved
> dependency: org.apache.spark#spark core_2.10;1.5.2: not found] at
> sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>
> How do I define them?

It was a typo on my side. `artifactId` should be "spark-core_2.10" (note
the `-` character).



--
PGP KeyID: 2048R/EA31CFC9  subkeys.pgp.net