[SPARK-13889][YARN] Fix integer overflow when calculating the max number of executor failures

## What changes were proposed in this pull request?
The max number of executor failures before failing the application defaults to twice the maximum number of executors when dynamic allocation is enabled. The default value of "spark.dynamicAllocation.maxExecutors" is Int.MaxValue, so doubling it overflows: 2 * Int.MaxValue wraps to -2, and the computed default becomes math.max(3, -2) = 3, which is wrong. This PR adds a check to avoid the overflow.

## How was this patch tested?
It checks whether the value is greater than Int.MaxValue / 2, to avoid the overflow when multiplying by 2.
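As an illustrative sketch (not part of the patch itself), the overflow and the guard can be reproduced in plain Scala; `naiveDefault` and `safeDefault` are hypothetical helper names standing in for the before and after versions of the expression:

```scala
object OverflowDemo {
  // Before the fix: 2 * Int.MaxValue wraps around to -2,
  // so math.max(3, -2) silently yields 3.
  def naiveDefault(effectiveNumExecutors: Int): Int =
    math.max(3, 2 * effectiveNumExecutors)

  // After the fix: clamp to Int.MaxValue before doubling,
  // so large inputs cannot overflow.
  def safeDefault(effectiveNumExecutors: Int): Int =
    math.max(3,
      if (effectiveNumExecutors > Int.MaxValue / 2) Int.MaxValue
      else 2 * effectiveNumExecutors)

  def main(args: Array[String]): Unit = {
    println(naiveDefault(Int.MaxValue)) // 3 (overflowed)
    println(safeDefault(Int.MaxValue))  // 2147483647
    println(safeDefault(10))            // 20
  }
}
```

The key observation is that the overflow must be prevented before the multiplication happens, since `math.max` cannot repair an already-wrapped negative value.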

Author: Carson Wang <carson.wang@intel.com>

Closes apache#11713 from carsonwang/IntOverflow.
carsonwang authored and srowen committed Mar 16, 2016
1 parent 1d95fb6 commit 496d2a2
Showing 1 changed file with 4 additions and 1 deletion.
```diff
@@ -73,7 +73,10 @@ private[spark] class ApplicationMaster(
     } else {
       sparkConf.get(EXECUTOR_INSTANCES).getOrElse(0)
     }
-    val defaultMaxNumExecutorFailures = math.max(3, 2 * effectiveNumExecutors)
+    // By default, effectiveNumExecutors is Int.MaxValue if dynamic allocation is enabled. We
+    // need to avoid the integer overflow here.
+    val defaultMaxNumExecutorFailures = math.max(3,
+      if (effectiveNumExecutors > Int.MaxValue / 2) Int.MaxValue else (2 * effectiveNumExecutors))

     sparkConf.get(MAX_EXECUTOR_FAILURES).getOrElse(defaultMaxNumExecutorFailures)
   }
```
