Spark 2.3.1
Spark JobServer 0.8.0
Spark Standalone mode

Dear,

We are using the Spark Hidden REST API for managing (launching, killing, monitoring, ...) our jobs.
We recently learnt that Spark JobServer provides many nice features that could really make our lives easier.
After reading the Spark Jobserver documentation, I noticed that it does not seem to offer the same way of configuring a Spark job. With the Spark Hidden REST API, one can configure a Spark job with a JSON file such as the following:
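(A representative sketch only; the server name, jar path, class name, and property values are placeholders, and the field names follow the standalone master's submission request format as we understand it.)

    {
      "action": "CreateSubmissionRequest",
      "appResource": "hdfs://path/to/our-job.jar",
      "mainClass": "com.example.OurSparkJob",
      "appArgs": ["arg1", "arg2"],
      "clientSparkVersion": "2.3.1",
      "environmentVariables": { "SPARK_ENV_LOADED": "1" },
      "sparkProperties": {
        "spark.app.name": "our-job",
        "spark.master": "spark://my_spark_server:7077",
        "spark.submit.deployMode": "cluster",
        "spark.cores.max": "4",
        "spark.driver.memory": "2g",
        "spark.jars": "hdfs://path/to/our-job.jar"
      }
    }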
Posting this file over HTTP, e.g. "curl -X POST --data @config.json http://my_spark_server", starts a job with the above configuration.
Our questions are:
Does Spark JobServer allow passing job/context configuration as a JSON file like the one above? As far as we understand, all configuration seems to be passed via the HTTP query string, for instance "http://my_spark_server/jobs?configName1=value1&configName2=value2". This does not seem practical if we have tens of different configuration fields to set.
If it DOES, is the JSON format the same or not?
Best
Tien Dat
As far as I know, jobserver currently only allows query-string parameters. Some of the configuration can be set directly in the Spark configuration, some in jobserver's default configuration, and the remaining ones via the query string. Generally, it works out well.
As far as passing JSON in the body is concerned, you will either have to write a wrapper on top of jobserver or contribute to jobserver by enhancing its REST API (shouldn't be too difficult).
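For example (the host, context name, and app name below are only illustrative; 8090 is the jobserver default port), a long-running context can be created with its settings in the query string:

    curl -X POST "http://my_spark_server:8090/contexts/my-context?num-cpu-cores=4&memory-per-node=2g"

and a job is then submitted against that context with another query-string call:

    curl -X POST "http://my_spark_server:8090/jobs?appName=my-app&classPath=com.example.OurSparkJob&context=my-context"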
One extra point to my first comment:
Since Job Server allows a context to run permanently, the aforementioned configuration actually targets two different objects:
1- Spark context: such as spark.cores.max, spark.driver.memory, ...
2- The running jobs: such as appResource, appArgs, ...
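In the Hidden REST API's JSON these two groups already sit in different places, roughly like this (paths and values are placeholders):

    {
      "appResource": "hdfs://path/to/our-job.jar",
      "appArgs": ["arg1", "arg2"],
      "sparkProperties": {
        "spark.cores.max": "4",
        "spark.driver.memory": "2g"
      }
    }

so ideally Job Server would let us set the sparkProperties part when creating a context and the appResource/appArgs part when submitting a job against it.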