log public dns of master node #2074

Merged
coyotemarin merged 7 commits into Yelp:master from log-address-of-master on Jun 3, 2019

Conversation

coyotemarin
Collaborator

This builds on pull request #2007, which requests that we log the public DNS of the master node as soon as we have it.
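For context, a minimal sketch of how a master node's public DNS can be fetched and logged via the EMR API. This assumes boto3 with configured AWS credentials and is illustrative only; it is not necessarily the exact call mrjob makes.

    import logging

    import boto3  # assumption: boto3 is installed and AWS credentials are configured

    log = logging.getLogger(__name__)


    def log_master_public_dns(cluster_id):
        """Log the cluster's master node public DNS once EMR reports it."""
        emr = boto3.client('emr')
        cluster = emr.describe_cluster(ClusterId=cluster_id)['Cluster']
        # MasterPublicDnsName is absent until the master node is provisioned
        dns = cluster.get('MasterPublicDnsName')
        if dns:
            log.info('master node is %s', dns)
        return dns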

@coyotemarin coyotemarin merged commit e25e8b7 into Yelp:master Jun 3, 2019
@coyotemarin coyotemarin added this to the v0.6.10 milestone Jun 3, 2019
@coyotemarin coyotemarin deleted the log-address-of-master branch June 3, 2019 21:06
coyotemarin pushed a commit to coyotemarin/mrjob that referenced this pull request on Mar 29, 2020
official PyPy support
 * officially support PyPy (Yelp#1011)
   * when launched in PyPy, defaults python_bin to pypy or pypy3 (see the sketch below)
 * Spark runner
   * turn off internal protocol with --skip-internal-protocol (Yelp#1952)
   * Spark harness can run inside EMR (Yelp#2070)
 * EMR runner
   * default instance type is now m5.xlarge (Yelp#2071)
   * log DNS of master node as soon as we know it (Yelp#2074)
 * better error when reading YAML conf file without YAML library (Yelp#2047)
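Below is a hedged sketch of the python_bin default described above. default_python_bin() is a hypothetical helper, not mrjob's actual implementation; it only illustrates the documented behavior.

    import platform
    import sys


    def default_python_bin():
        """Return a default python_bin, preferring PyPy when running in PyPy.

        Hypothetical helper illustrating the v0.6.10 behavior described above.
        """
        if platform.python_implementation() == 'PyPy':
            # match the interpreter family: pypy for Python 2, pypy3 for Python 3
            return ['pypy'] if sys.version_info[0] == 2 else ['pypy3']
        # otherwise fall back to the current interpreter
        return [sys.executable or 'python']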

v0.6.9, 2019-05-29 -- Better emulation
 * formally dropped support for Python 3.4
   * (still seems to work except for Google libraries)
 * jobs:
   * deprecated add_*_option() methods can take types as their type arg (Yelp#2058)
 * all runners
   * archives no longer go into working dir mirror (Yelp#2059)
     * fixes bug in v0.6.8 that could break archives on Hadoop
 * sim runners (local, inline)
   * simulated mapreduce.map.input.file is now a file:// URL (Yelp#2066)
 * Spark runner
   * added emulate_map_input_file option (Yelp#2061)
     * can optionally emulate mapreduce.map.input.file in first step's mapper (see the example job below)
   * increment counter() emulation now uses correct arg names (Yelp#2060)
   * warns if spark_tmp_dir and master aren't both local/remote (Yelp#2062)
 * mrjob spark-submit can take switches to script without using "--" (Yelp#2055)
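To illustrate the mapreduce.map.input.file items above, here is a small hedged example job. MRCountLinesPerFile is a made-up job for illustration, but jobconf_from_env() is mrjob's documented way to read jobconf variables inside a task.

    from mrjob.compat import jobconf_from_env
    from mrjob.job import MRJob


    class MRCountLinesPerFile(MRJob):
        """Toy job that counts input lines per file.

        Reads mapreduce.map.input.file, which the sim runners now expose as a
        file:// URL and which the Spark runner can emulate in the first step's
        mapper via the emulate_map_input_file option.
        """

        def mapper(self, _, line):
            input_file = jobconf_from_env('mapreduce.map.input.file', default='')
            yield input_file, 1

        def reducer(self, input_file, counts):
            yield input_file, sum(counts)


    if __name__ == '__main__':
        MRCountLinesPerFile.run()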

(This tag is one revision ahead of the released version on PyPI. The only
difference is in docs/requirements.txt, so that readthedocs.org can build the docs.)