groovy-users mailing list archives

From Keegan Witt <keeganw...@gmail.com>
Subject Re: GroovyShell
Date Wed, 18 Nov 2015 05:40:01 GMT
Ah, sorry, I misunderstood.  I think the issue might be that you don't have
Ivy on your classpath.  This executes without exceptions for me:

Main.java

import groovy.lang.GroovyShell;

public class Main {
    public static void main(String[] args) {
        new GroovyShell().evaluate("@Grab('org.apache.spark:spark-core_2.10:1.5.2')\n"
                + "import org.apache.spark.api.java.JavaSparkContext");
    }
}


build.gradle

apply plugin: 'java'
repositories {
    mavenCentral()
}
dependencies {
    compile 'org.codehaus.groovy:groovy-all:2.4.5'
    runtime 'org.apache.ivy:ivy:2.4.0'
}


It also works in the GroovyConsole for me.  I know it seems obvious, but
maybe check your Groovy lib directory for the Ivy jar?  I'm having trouble
thinking what else the difference might be.  Unless maybe I'm running a
line that's different from the one you saw fail?
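
For what it's worth, a quick way to check from a script is a sketch like the
one below (check_ivy.groovy is just a made-up name; org.apache.ivy.Ivy is
Ivy's entry-point class, and Grape needs Ivy available at runtime):

check_ivy.groovy

// Rough check: prints the Ivy class if the jar is visible to the Groovy
// runtime, otherwise reports that Grape/@Grab won't be able to resolve anything.
try {
    println Class.forName('org.apache.ivy.Ivy')
} catch (ClassNotFoundException e) {
    println 'Ivy is not on the classpath'
}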

-Keegan

On Tue, Nov 17, 2015 at 4:48 PM, tog <guillaume.alleon@gmail.com> wrote:

> Hi Keegan
>
> Thanks for testing
>
> Well, it works with groovysh indeed, but not in a script that uses
> GroovyShell. I'm using the following scripts:
>
> https://gist.github.com/galleon/231dbfcff36f8d4ce6c2
> https://gist.github.com/galleon/e0807499a1b8b78924ca
>
> Any idea what I am doing wrong?
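>
> (Roughly, the pattern is the one sketched below; this is only an
> illustration, the actual scripts are in the gists above.)
>
> // Illustrative sketch, not the gist itself: a host script that asks
> // GroovyShell to evaluate a @Grab together with an import of a Spark class.
> import groovy.lang.GroovyShell
>
> new GroovyShell().evaluate('''
>     @Grab('org.apache.spark:spark-core_2.10:1.5.2')
>     import org.apache.spark.api.java.JavaSparkContext
>     println JavaSparkContext
> ''')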
>
> --------
>
> tog GroovySpark $ groovy -version
>
> Groovy Version: 2.4.5 JVM: 1.8.0_60 Vendor: Oracle Corporation OS: Mac OS X
>
>
> tog GroovySpark $ groovysh
>
> Groovy Shell (2.4.5, JVM: 1.8.0_60)
>
> Type ':help' or ':h' for help.
>
> --------------------------------------------------------------------------------
>
> groovy:000> groovy.grape.Grape.grab(group:'org.apache.spark',
> module:'spark-core_2.10', version:'1.5.2')
>
> ===> null
>
> groovy:000> import org.apache.spark.api.java.JavaSparkContext
>
> ===> org.apache.spark.api.java.JavaSparkContext
>
> groovy:000> println JavaSparkContext
>
> class org.apache.spark.api.java.JavaSparkContext
>
> ===> null
>
> On 17 November 2015 at 21:26, Keegan Witt <keeganwitt@gmail.com> wrote:
>
>> What version of Groovy are you using?  It seems to work for me with 2.4.5
>> and Java 1.8.0_65.
>>
>> groovy:000> groovy.grape.Grape.grab(group:'org.apache.spark',
>> module:'spark-core_2.10', version:'1.5.2')
>> ===> null
>> groovy:000> import org.apache.spark.api.java.JavaSparkContext
>> ===> org.apache.spark.api.java.JavaSparkContext
>> groovy:000> println JavaSparkContext
>> ERROR org.apache.spark.SparkException:
>> A master URL must be set in your configuration
>>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:394)
>>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:112)
>>         at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:56)
>>
>>
>> On Tue, Nov 17, 2015 at 4:24 PM, Thibault Kruse <tibokruse@googlemail.com> wrote:
>>
>>> you'd have to recompile, my PR is growing old
>>>
>>> On Tue, Nov 17, 2015 at 10:08 PM, tog <guillaume.alleon@gmail.com>
>>> wrote:
>>> > Thibault
>>> >
>>> > Has your change been pushed into Groovy recently, or should I recompile
>>> > my own version to test whether that solves my issue?
>>> > Is there any other way to test it without having to build my own version?
>>> >
>>> > Cheers
>>> > Guillaume
>>> >
>>> > On 17 November 2015 at 20:57, Thibault Kruse <tibokruse@googlemail.com> wrote:
>>> >>
>>> >> Not sure if this is related at all, but I had an issue getting Grape
>>> >> imports available in Groovysh (which is related to GroovyShell),
>>> >> which caused me to try and tamper with the Grape classloading:
>>> >>
>>> >> http://mail-archives.apache.org/mod_mbox/groovy-dev/201508.mbox/%3CCAByu6UVw1KNVqPnQrjKRCANj6e8od9sGczinz7iDWA1P+=45PA@mail.gmail.com%3E
>>> >>
>>> >> This might be unrelated to your problems, though.
>>> >>
>>> >>
>>> >> On Tue, Nov 17, 2015 at 9:24 PM, tog <guillaume.alleon@gmail.com> wrote:
>>> >> > Hello
>>> >> >
>>> >> > Any more ideas regarding my issue?
>>> >> >
>>> >> > Thanks
>>> >> > Guillaume
>>> >> >
>>> >> > On 15 November 2015 at 20:19, tog <guillaume.alleon@gmail.com> wrote:
>>> >> >>
>>> >> >> Sorry, my previous email was wrong.
>>> >> >>
>>> >> >> The block:
>>> >> >> groovy.grape.Grape.grab(
>>> >> >>     groupId: 'org.apache.spark',
>>> >> >>     artifactId: 'spark-core_2.10',
>>> >> >>     version: '1.5.2'
>>> >> >> )
>>> >> >>
>>> >> >> does not seem equivalent to:
>>> >> >>
>>> >> >> @Grab('org.apache.spark:spark-core_2.10:1.5.2')
>>> >> >>
>>> >> >> since the imports cannot be found.
>>> >> >>
>>> >> >>
>>> >> >>
>>> >> >>
>>> >> >> --------------------------------------------------------------------------------
>>> >> >>
>>> >> >> tog GroovySpark $ groovy GroovySparkWordcount.groovy
>>> >> >>
>>> >> >> org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
>>> >> >>
>>> >> >> /Users/tog/Work/GroovySpark/GroovySparkWordcount.groovy: 9: unable to
>>> >> >> resolve class org.apache.spark.api.java.JavaSparkContext
>>> >> >>
>>> >> >>  @ line 9, column 1.
>>> >> >>
>>> >> >>    import org.apache.spark.api.java.JavaSparkContext
>>> >> >>
>>> >> >>    ^
>>> >> >>
>>> >> >> /Users/tog/Work/GroovySpark/GroovySparkWordcount.groovy: 8: unable to
>>> >> >> resolve class org.apache.spark.SparkConf
>>> >> >>
>>> >> >>  @ line 8, column 1.
>>> >> >>
>>> >> >>    import org.apache.spark.SparkConf
>>> >> >>
>>> >> >>    ^
>>> >> >>
>>> >> >> 2 errors
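>>> >> >>
>>> >> >> (A rough sketch of a runtime-only workaround, using Grape.grab's
>>> >> >> classLoader argument: a top-level import is resolved when the script
>>> >> >> is compiled, before a runtime Grape.grab() ever runs, so the class
>>> >> >> has to be loaded reflectively instead of imported. Sketch only, not
>>> >> >> the original script.)
>>> >> >>
>>> >> >> // Sketch: grab into this script's own classloader at runtime and
>>> >> >> // load the classes by name rather than through compile-time imports.
>>> >> >> groovy.grape.Grape.grab(
>>> >> >>     group: 'org.apache.spark',
>>> >> >>     module: 'spark-core_2.10',
>>> >> >>     version: '1.5.2',
>>> >> >>     classLoader: this.class.classLoader
>>> >> >> )
>>> >> >> def sparkConf = this.class.classLoader.loadClass('org.apache.spark.SparkConf')
>>> >> >> println sparkConf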
>>> >> >>
>>> >> >>
>>> >> >> On 15 November 2015 at 18:55, tog <guillaume.alleon@gmail.com> wrote:
>>> >> >>>
>>> >> >>> Thanks, yes, I just realized the typo ... I fixed it and get the
>>> >> >>> very same error.
>>> >> >>> I am getting lost ;-)
>>> >> >>>
>>> >> >>>
>>> >> >>>
>>> >> >>> org.apache.spark.SparkConf@2158ddec
>>> >> >>> java.lang.ClassNotFoundException: org.apache.spark.rpc.akka.AkkaRpcEnvFactory
>>> >> >>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>>> >> >>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>> >> >>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>>> >> >>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>> >> >>>     at java.lang.Class.forName0(Native Method)
>>> >> >>>     at java.lang.Class.forName(Class.java:348)
>>> >> >>>     at org.apache.spark.rpc.RpcEnv$.getRpcEnvFactory(RpcEnv.scala:40)
>>> >> >>>     at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:52)
>>> >> >>>     at org.apache.spark.SparkEnv$.create(SparkEnv.scala:247)
>>> >> >>>     at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:188)
>>> >> >>>     at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267)
>>> >> >>>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
>>> >> >>>     at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
>>> >> >>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>> >> >>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>>> >> >>>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>> >> >>>     at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>>> >> >>>     at org.codehaus.groovy.reflection.CachedConstructor.invoke(CachedConstructor.java:80)
>>> >> >>>     at org.codehaus.groovy.runtime.callsite.ConstructorSite$ConstructorSiteNoUnwrapNoCoerce.callConstructor(ConstructorSite.java:105)
>>> >> >>>     at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallConstructor(CallSiteArray.java:60)
>>> >> >>>     at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:235)
>>> >> >>>     at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:247)
>>> >> >>>     at Script6.run(Script6.groovy:16)
>>> >> >>>
>>> >> >>>
>>> >> >>>
>>> >> >>> On 15 November 2015 at 18:41, Bahman Movaqar <Bahman@bahmanm.com>
>>> >> >>> wrote:
>>> >> >>>>
>>> >> >>>> On 11/15/2015 10:03 PM, tog wrote:
>>> >> >>>>
>>> >> >>>> > @Grab seems to have a default repo to look into ... with the change
>>> >> >>>> > you are suggesting I got
>>> >> >>>> > java.lang.RuntimeException: Error grabbing Grapes -- [unresolved
>>> >> >>>> > dependency: org.apache.spark#spark core_2.10;1.5.2: not found] at
>>> >> >>>> > sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>> >> >>>> >
>>> >> >>>> > How do I define them?
>>> >> >>>>
>>> >> >>>> It was a typo on my side. `artifactId` should be "spark-core_2.10" (note
>>> >> >>>> the `-` character).
>>> >> >>>>
>>> >> >>>> --
>>> >> >>>> Bahman Movaqar
>>> >> >>>>
>>> >> >>>> http://BahmanM.com - https://twitter.com/bahman__m
>>> >> >>>> https://github.com/bahmanm - https://gist.github.com/bahmanm
>>> >> >>>> PGP Key ID: 0x6AB5BD68 (keyserver2.pgp.com)
>>> >> >>>>
>>> >> >>>
>>> >> >>>
>>> >> >>>
>>> >> >>> --
>>> >> >>> PGP KeyID: 2048R/EA31CFC9  subkeys.pgp.net
>>> >> >>
>>> >> >>
>>> >> >>
>>> >> >>
>>> >> >> --
>>> >> >> PGP KeyID: 2048R/EA31CFC9  subkeys.pgp.net
>>> >> >
>>> >> >
>>> >> >
>>> >> >
>>> >> > --
>>> >> > PGP KeyID: 2048R/EA31CFC9  subkeys.pgp.net
>>> >
>>> >
>>> >
>>> >
>>> > --
>>> > PGP KeyID: 2048R/EA31CFC9  subkeys.pgp.net
>>>
>>
>>
>
>
> --
> PGP KeyID: 2048R/EA31CFC9  subkeys.pgp.net
>
