livy-user mailing list archives

From Marcelo Vanzin <van...@cloudera.com>
Subject Re: How to deploy generic Spark applications via Livy using the Java client
Date Mon, 10 Sep 2018 18:21:22 GMT
If your "orchestrator engine" is receiving pre-built apps from others
and needs to execute them in the cluster, you could just use Livy's
batch API. I don't think there are Java bindings for that, you'd need
to talk to the REST endpoints directly.
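For concreteness, the batch API is a plain HTTP call: POST a JSON body to `/batches` with at least a `file` field pointing at a jar the cluster can read, plus an optional `className`. A minimal sketch using Java 11's built-in HttpClient follows; the host, jar path, and class name are placeholders, and the actual send is left commented out since it needs a running Livy server:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class LivyBatchSubmit {

    // Build the JSON payload for POST /batches. "file" must be a path
    // visible to the cluster (e.g. an HDFS path), not a local upload.
    static String batchPayload(String jarPath, String mainClass) {
        return "{\"file\": \"" + jarPath + "\", "
             + "\"className\": \"" + mainClass + "\"}";
    }

    public static void main(String[] args) throws Exception {
        String payload = batchPayload(
                "hdfs:///jars/generic-spark-app.jar",  // placeholder path
                "com.example.Main");                   // placeholder class
        System.out.println(payload);

        // Uncomment to actually submit (IP:8998 is a placeholder):
        // HttpRequest req = HttpRequest.newBuilder()
        //         .uri(new URI("http://IP:8998/batches"))
        //         .header("Content-Type", "application/json")
        //         .POST(HttpRequest.BodyPublishers.ofString(payload))
        //         .build();
        // HttpResponse<String> resp = HttpClient.newHttpClient()
        //         .send(req, HttpResponse.BodyHandlers.ofString());
        // System.out.println(resp.statusCode() + " " + resp.body());
    }
}
```

The response to a successful POST includes a batch id, which you can poll via GET /batches/{id} to track the application's state.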

The code you're referring to is for "interactive" sessions, where you
can send closures or even code snippets to be executed in Spark.

On Tue, Sep 4, 2018 at 8:37 AM Daniel Seybold <daniel.seybold@uni-ulm.de> wrote:
>
> Hi guys,
>
> I'd like to use Livy Server and its Java client to deploy generic Spark applications
by integrating the Java client into a custom orchestration engine.
>
> After going through the docs and experimenting with the code, I am not sure whether this is already possible with Livy. See the following example:
>
> The orchestration engine can receive generic Spark binaries and additional input parameters, which should be executed programmatically on a Spark cluster (with the Livy Server).
>
> Yet, according to the PiJob example, it seems that I need to wrap the code of any Spark application in order to submit it via the Java client?
>
> Hence, the following code snippet would not work:
>
> LivyClient client = new LivyClientBuilder()
>     .setURI(new URI("http://IP:8998")).build();
>
> client.uploadJar(new File("generic-spark-app.jar"));
>
> Yet, it seems that this kind of execution would be possible by using the REST API directly (and not the Java client)?
>
> Thanks for any advice!
>
> Cheers,
> Daniel



-- 
Marcelo
