
[MRG+1] Issue#7998 : Consistent parameters between QDA and LDA #8130

Merged
merged 34 commits into scikit-learn:master on Aug 1, 2017

Conversation

mrbeann
Contributor

@mrbeann mrbeann commented Dec 28, 2016

Reference Issue

Fix #7998

see also the previous one

@mrbeann
Contributor Author

mrbeann commented Dec 29, 2016

Thanks @johannah. I have made a new PR according to your guidance:
#8134

PS: You mentioned:

You've still got a number of other peoples' commits showing in your diff :\

But I'm not sure about that; in this PR there is just one commit.

@jnothman
Member

I think this one is looking okay. Please just be sure to close what is not current. Then if you can avoid opening new PRs, that would be great.

@mrbeann
Contributor Author

mrbeann commented Dec 29, 2016

Thank you, I will close them. Sorry for the inconvenience.

@jnothman
Member

jnothman commented Dec 29, 2016 via email

@mrbeann
Contributor Author

mrbeann commented Dec 29, 2016

Okay, learned a lot from this process, thank you for your help.
To my understanding, #8130 and #8134 are both okay, so should I close one of them and make future changes in the remaining one?

@jnothman
Member

jnothman commented Dec 29, 2016 via email

@mrbeann
Contributor Author

mrbeann commented Dec 29, 2016

Okay, I will just close #8134.

@@ -130,6 +130,7 @@ def _class_cov(X, y, priors=None, shrinkage=None):

class LinearDiscriminantAnalysis(BaseEstimator, LinearClassifierMixin,
TransformerMixin):

Member

please don't insert new lines into untouched parts of the code

@@ -289,6 +291,9 @@ def _solve_lsqr(self, X, y, shrinkage):
0-471-05669-3.
"""
self.means_ = _class_means(X, y)
if not self.store_covariance:
warnings.warn("'covariance_' will be stored only if "
"'store_covariance' is True", DeprecationWarning)
Member

add "from version 0.21"

@@ -327,6 +332,9 @@ class scatter). This solver supports both classification and
0-471-05669-3.
"""
self.means_ = _class_means(X, y)
if not self.store_covariance:
warnings.warn("'covariance_' will be stored only if "
"'store_covariance' is True", DeprecationWarning)
Member

add "from version 0.21"

@@ -534,6 +542,7 @@ def predict_log_proba(self, X):


class QuadraticDiscriminantAnalysis(BaseEstimator, ClassifierMixin):

Member

please remove blank line

if self.store_covariances:
if self.store_covariances is not None:
warnings.warn("'store_covariances' was renamed to store_covariance"
" in version 0.19 and will be removed in 0.20.",
Member

in 0.21

warnings.warn("'store_covariances' was renamed to store_covariance"
" in version 0.19 and will be removed in 0.20.",
DeprecationWarning)
self.store_covariance = self.store_covariances
Member

Better to use a local variable so that we don't modify the already set parameter.

def test_lda_store_covariance():
# The default is to not set the covariances_ attribute
clf = LinearDiscriminantAnalysis().fit(X, y)
assert_true(not hasattr(clf, 'covariance_'))
Member

perhaps assert_false!

@@ -223,6 +223,26 @@ def test_lda_scaling():
'using covariance: %s' % solver)


def test_lda_store_covariance():
# The default is to not set the covariances_ attribute
clf = LinearDiscriminantAnalysis().fit(X, y)
Member

Please test with each solver available.

np.array([[0.7, 0.45], [0.45, 0.7]])
)

assert_array_almost_equal(
clf.covariances_[1],
clf.covariance_[1],
np.array([[0.33333333, -0.33333333], [-0.33333333, 0.66666667]])
)
Member

Please also test the deprecation using assert_warns

@mrbeann
Contributor Author

mrbeann commented Jan 3, 2017

@jnothman I have just fixed these, please check them

Member

@jnothman jnothman left a comment

I don't mind whether or not you do it here, but it can be useful to test deprecated functionality separately from testing continuing behaviour. This makes it easier to remove the relevant tests when the deprecation concludes.

Thanks!

# Test for solver 'lsqr' and 'eigen'
for solver in ('lsqr', 'eigen'):
clf = LinearDiscriminantAnalysis(solver=solver)
assert_false(not hasattr(clf.fit(X, y), 'covariance_'))
Member

this is assert_true(hasattr...), no?

*store_covariance* has been moved to main constructor.

*store_covariances* has been moved to the main constructor as
*store_covaraiance*
Member

spelling

Member

please also remove the blank line before.

@@ -556,9 +563,15 @@ class QuadraticDiscriminantAnalysis(BaseEstimator, ClassifierMixin):
Regularizes the covariance estimate as
``(1-reg_param)*Sigma + reg_param*np.eye(n_features)``

store_covariance : boolean
If True the covariance matrices are computed and stored in the
`self.covariance_` attribute.
Member

use double-backticks for code formatting and you can drop self

@@ -289,6 +290,9 @@ def _solve_lsqr(self, X, y, shrinkage):
0-471-05669-3.
"""
self.means_ = _class_means(X, y)
if not self.store_covariance:
warnings.warn("from version 0.21 'covariance_' will be stored only"
Member

I think this should only be raised when covariance_ is accessed and present by error. I.e. do it with a property.

Contributor Author

Hi @jnothman, thanks for your review, but I'm not sure what this means here. Can you give me more detail or an example?
Thank you!

Member

I mean you will need something like:

@property
def covariance_(self):
    if hasattr(self, '_covariance_') and not self.store_covariance:
        warnings.warn ...
    return self._covariance_

It's a bit ugly, but still.

assert_false(not hasattr(clf.fit(X, y), 'covariance_'))

# Test the deprecation
assert_warns(DeprecationWarning, clf.fit, X, y)
Member

Would be better to use assert_warns_message to clarify which deprecation this is.
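The suggestion above can be sketched with a stdlib-only stand-in (the helper below mimics the spirit of sklearn's `assert_warns_message` test utility, and `deprecated_fit` is a made-up stand-in for `clf.fit`, not sklearn code):

```python
import warnings

def assert_warns_message(warning_cls, message, func, *args, **kwargs):
    # Minimal stand-in for sklearn.utils.testing.assert_warns_message:
    # call func, record warnings, and check both category and message text.
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        result = func(*args, **kwargs)
    assert any(issubclass(w.category, warning_cls) and message in str(w.message)
               for w in caught), "expected warning was not raised"
    return result

def deprecated_fit():
    # hypothetical stand-in for a fit() that triggers the deprecation
    warnings.warn("'store_covariances' was renamed to store_covariance "
                  "in version 0.19 and will be removed in 0.21.",
                  DeprecationWarning)
    return "fitted"

assert_warns_message(DeprecationWarning,
                     "'store_covariances' was renamed", deprecated_fit)
```

Matching on the message text is what distinguishes this deprecation from any other warning the call might emit.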


# Test the actual attribute:
clf = QuadraticDiscriminantAnalysis(store_covariances=True).fit(X6, y6)
Member

Ideally, we should check that store_covariances still works but raises a deprecation warning.

@mrbeann
Contributor Author

mrbeann commented Jan 7, 2017

Hi @jnothman, please check whether I'm going in the right direction.
Thanks

Member

@jnothman jnothman left a comment

Thanks!

@@ -188,7 +188,7 @@ class LinearDiscriminantAnalysis(BaseEstimator, LinearClassifierMixin,
intercept_ : array, shape (n_features,)
Intercept term.

covariance_ : array-like, shape (n_features, n_features)
_covariance_ : array-like, shape (n_features, n_features)
Member

the public attribute is still covariance_

@property
def covariance_(self):
if hasattr(self, '_covariance_') and not self.store_covariance:
warnings.warn("from version 0.21 '_covariance_' will be stored "
Member

_covariance_ -> covariance_

@covariance_.setter
def covariance_(self, value):
if hasattr(self, '_covariance_') and not self.store_covariance:
warnings.warn("from version 0.21 '_covariance_' will be stored "
Member

_covariance_ -> covariance_

@@ -599,7 +616,7 @@ class QuadraticDiscriminantAnalysis(BaseEstimator, ClassifierMixin):
>>> clf.fit(X, y)
... # doctest: +ELLIPSIS, +NORMALIZE_WHITESPACE
QuadraticDiscriminantAnalysis(priors=None, reg_param=0.0,
store_covariances=False, tol=0.0001)
store_covariance=False, store_covariances=None, tol=0.0001)
Member

PEP8 visual indentation would have this line aligned with the opening parenthesis

@property
@deprecated("Attribute covariances_ was deprecated in version"
" 0.19 and will be removed in 0.21. Use "
"covariance_' instead")
Member

remove '

)

assert_array_almost_equal(
clf._covariance_[1],
Member

_covariance_ -> covariance_


# Test for solver svd, the default is to not set the covariances_ attribute
clf = LinearDiscriminantAnalysis(solver='svd').fit(X, y)
assert_true(not hasattr(clf, '_covariance_'))
Member

_covariance_ -> covariance_

assert_false(hasattr

# Test the actual attribute:
clf = LinearDiscriminantAnalysis(solver=solver,
store_covariance=True).fit(X6, y6)
assert_true(hasattr(clf, '_covariance_'))
Member

_covariance_ -> covariance_

assert_warns_message(DeprecationWarning, "from version 0.21 "
"'_covariance_' will be stored only if "
"'store_covariance' is True",
clf.fit, X, y)
Member

I think this warning should only trigger on getattr(clf.fit(X, y), 'covariance_')

Contributor Author

Thanks for your review! I'm not sure whether this means I need the deprecation in __getattr__ instead of the current property.
To my understanding we need covariance_ as a "private value", but I'm not sure how to achieve this?

Member

No, the covariance_ attribute is public. It may present a public face to a __dict__ entry, _covariance_, and do so by being implemented as a descriptor with a getter and setter. The getter is called whenever hasattr(obj, 'covariance_') or getattr(obj, 'covariance_') is called on an instance of LinearDiscriminantAnalysis. If the getter raises AttributeError, hasattr returns False; otherwise True. getattr returns the result of the getter and propagates the exception.
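The getter/setter behaviour described here can be sketched with a toy class (names and placeholder values are assumptions; this is not the actual LinearDiscriminantAnalysis implementation):

```python
import warnings

class Model:
    # toy sketch of the property-backed public attribute described above
    def __init__(self, store_covariance=False):
        self.store_covariance = store_covariance

    def fit(self):
        self._covariance_ = [[1.0, 0.0], [0.0, 1.0]]  # placeholder estimate
        return self

    @property
    def covariance_(self):
        # the getter runs on every hasattr/getattr for 'covariance_'
        if hasattr(self, '_covariance_') and not self.store_covariance:
            warnings.warn("from version 0.21 'covariance_' will be stored "
                          "only if 'store_covariance' is True",
                          DeprecationWarning)
        return self._covariance_  # AttributeError before fit -> hasattr False

# unfitted: the getter raises AttributeError, so hasattr returns False
assert not hasattr(Model(), 'covariance_')
# fitted without store_covariance: attribute exists, deprecation fires on access
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    assert hasattr(Model().fit(), 'covariance_')
assert caught and issubclass(caught[0].category, DeprecationWarning)
```

This is exactly why the warning fires only when covariance_ is accessed, rather than unconditionally inside fit.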

# assert_warns(DeprecationWarning, clf.fit, X, y)
assert_warns_message(DeprecationWarning, "'store_covariances' was renamed"
" to store_covariance in version 0.19 and will be "
"removed in 0.21.", clf.fit, X, y)
Member

Also check that covariance_ (and covariances_ with warning) is stored.

@@ -255,6 +256,36 @@ def __init__(self, solver='svd', shrinkage=None, priors=None,
self.store_covariance = store_covariance # used only in svd solver
self.tol = tol # used only in svd solver

def __get__(self, obj, objtype):
Member

I don't think you mean this. defining __get__ turns the class into a descriptor...

@mrbeann
Contributor Author

mrbeann commented Jan 10, 2017

Hi @jnothman, I misunderstood this and have just fixed it; please check whether I am going in the right direction. Thanks!

@mrbeann
Contributor Author

mrbeann commented Mar 3, 2017

Hi @jnothman, I think I'm stuck here; can you offer some more guidance? Thank you

Member

@jnothman jnothman left a comment

Otherwise LGTM

# Test for solver 'lsqr' and 'eigen'
for solver in ('lsqr', 'eigen'):
clf = LinearDiscriminantAnalysis(solver=solver)
assert_true(hasattr(clf.fit(X, y), 'covariance_'))
Member

Please use assert_warns on the hasattr to check that this behaviour is deprecated appropriately.

Either do that here and remove the test below, or make this test only about non-deprecated behaviour and avoid this assertion with a comment saying that the opposite should be asserted once the deprecation is complete.

# check that covariance_ (and covariances_ with warning) is stored
clf = QuadraticDiscriminantAnalysis(store_covariances=True).fit(X6, y6)
assert_true(hasattr(clf, 'covariance_'))
assert_true(hasattr(clf, 'covariances_'))
Member

We do not want the tests issuing warnings. Drop this line and replace hasattr with getattr in the next statement to test the same thing but also catch the warning.

@jnothman jnothman changed the title [MRG] Issue#7998 : Consistent parameters between QDA and LDA [MRG+1] Issue#7998 : Consistent parameters between QDA and LDA Mar 5, 2017
np.array([0.088889, 0.533333])
)

# Test for solver svd, the default is to not set the covariances_ attribute
Member

test with the SVD solver

)

# Test for solver svd, the default is to not set the covariances_ attribute
clf = LinearDiscriminantAnalysis(solver='svd').fit(X, y)
Member

Why use X, y instead of X6, y6 as in the other checks?

Contributor Author

Thanks! Actually, there's no special purpose. I have just changed it.

:issue:`8174` by :user:`Tahar Zanouda <tzano>`, `Alexandre Gramfort`_
and `Raghav RV`_.

- SciPy >= 0.13.3 and NumPy >= 1.8.2 are now the minimum supported versions
Member

It looks like there are some unwanted changes in this diff. The only thing that should be added is your paragraph about store_covariances_ and covariances_

Contributor Author

Sorry, these seem to be caused by merge conflicts. I have just fixed it. Thanks!

@@ -245,6 +245,7 @@ class LinearDiscriminantAnalysis(BaseEstimator, LinearClassifierMixin,
>>> print(clf.predict([[-0.8, -1]]))
[1]
"""

Member

Looks like the docstring doesn't explain that store_covariance only makes a difference if solver='svd'

Member

I think it did last I looked

Member

huh either it changed or I'm blind...

Member

@amueller amueller left a comment

Looks good apart from minor nitpicks.

def test_lda_store_covariance():
# Test for solver 'lsqr' and 'eigen'
for solver in ('lsqr', 'eigen'):
clf = LinearDiscriminantAnalysis(solver=solver).fit(X6, y6)
Member

Maybe add a comment that store_covariance has no effect on these solvers?


# check that covariance_ (and covariances_ with warning) is stored
clf = QuadraticDiscriminantAnalysis(store_covariances=True).fit(X6, y6)
assert_true(hasattr(clf, 'covariance_'))
Member

I would remove this (because it creates a DeprecationWarning) and store the result of assert_warns_message and ensure that it's True.

Member

This line should not raise a warning. The fit on the line before must. It should be changed to assert or ignore the warning

Member

@lesteve lesteve left a comment

Some comments

warnings.warn("'store_covariances' was renamed to store_covariance"
" in version 0.19 and will be removed in 0.21.",
DeprecationWarning)
store_covariance = self.store_covariances
Member

To simplify the code, I think we should have:

store_covariance = self.store_covariance or self.store_covariances
if store_covariance:
    # some code

Contributor Author

But then we can't tell whether the value of store_covariance comes from self.store_covariance or from self.store_covariances, so we can't decide whether to raise a warning?

Member

The only thing you need for the warning is self.store_covariances.
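Putting the local-variable suggestion and the warning together might look like this sketch (a toy class with hypothetical fit logic, not the real QuadraticDiscriminantAnalysis):

```python
import warnings

class QDA:
    # toy sketch: deprecated alias handled via a local variable in fit,
    # so the user-set public parameters are never overwritten
    def __init__(self, store_covariance=False, store_covariances=None):
        self.store_covariance = store_covariance
        self.store_covariances = store_covariances  # deprecated alias

    def fit(self):
        # the warning only needs self.store_covariances, as noted above
        if self.store_covariances is not None:
            warnings.warn("'store_covariances' was renamed to "
                          "store_covariance in version 0.19 and will be "
                          "removed in 0.21.", DeprecationWarning)
        # local variable, as suggested: honour either spelling
        store_covariance = self.store_covariance or self.store_covariances
        if store_covariance:
            self.covariance_ = "computed"  # placeholder for the real estimate
        return self

clf = QDA(store_covariances=True).fit()
assert clf.covariance_ == "computed"
assert clf.store_covariance is False  # public parameter left untouched
```

Keeping the merged value local is what lets `get_params`/`set_params`-style round-trips see exactly what the user passed in.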


assert_array_almost_equal(
clf.covariance_[0],
np.array([0.422222, 0.088889])
Member

Where do the expected covariances come from? Is there a way to compute them differently in order to make the test less magical?

Contributor Author

I'm not familiar with this part. Do you mean we should compute them with another program, or do we have another choice within sklearn?

Contributor Author

Any idea about this? @jnothman Thanks!

Member

As long as the covariance computed by each solver is asserted equal, I think that's more than adequate. They are, after all, used within these techniques to produce the overall model.
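One way to make the constants less magical is to compute the expected pooled covariance directly with NumPy. In this sketch, X6/y6 are made-up stand-ins for the test fixtures, and equal class sizes are assumed so a plain mean of the per-class ML covariances suffices:

```python
import numpy as np

# hypothetical data with two equal-sized classes (not the real fixture)
X6 = np.array([[0., 0.], [1., 1.], [2., 0.],
               [3., 1.], [4., 0.], [5., 1.]])
y6 = np.array([0, 0, 0, 1, 1, 1])

def class_cov(X, y):
    # mean of the per-class maximum-likelihood covariances (bias=True);
    # with equal class sizes this matches a prior-weighted average
    return np.mean([np.cov(X[y == c].T, bias=True) for c in np.unique(y)],
                   axis=0)

expected = class_cov(X6, y6)
# a test could then compare the estimator's covariance_ against `expected`
# instead of against hard-coded constants
```

Cross-checking each solver's covariance_ against such an independently computed value tests the same thing without magic numbers.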

Member

Why is this split into two asserts? It's a single covariance matrix, right?

"removed in 0.21.", clf.fit, X, y)

# check that covariance_ (and covariances_ with warning) is stored
clf = QuadraticDiscriminantAnalysis(store_covariances=True).fit(X6, y6)
Member

You don't need to redefine clf here I think and can just reuse the existing clf

def fit(self, X, y):
"""Fit the model according to the given training data and parameters.

.. versionchanged:: 0.19
*store_covariance* has been moved to main constructor.
*store_covariances* has been moved to main constructor as
Member

Improvement: store_covariances is not a parameter of fit anymore. Use the store_covariance parameter in the constructor instead.

Also I would use double-backticks (monospace in reST) for the parameter names rather than asterisk (italic in reST).

Contributor Author

Not sure what this means; we already have store_covariances in the constructor. Should we add store_covariance?

Member

It means your formulation is not clear to me and I would use the one I proposed instead.

store_covariance=True).fit(X6, y6)
assert_true(hasattr(clf, 'covariance_'))

assert_array_almost_equal(
Member

why is this split into two asserts?

self.tol = tol

@property
@deprecated("Attribute covariances_ was deprecated in version"
" 0.19 and will be removed in 0.21. Use "
Member

@amueller amueller Jul 31, 2017

Deprecated in 0.20 and removed in 0.22?

Contributor Author

@jnothman What do you think about this?

@jnothman
Member

jnothman commented Aug 1, 2017 via email

@amueller amueller merged commit 72d0529 into scikit-learn:master Aug 1, 2017
@amueller
Member

amueller commented Aug 1, 2017

ok cool.

@mrbeann
Contributor Author

mrbeann commented Aug 1, 2017

Thank you for your guidance! @jnothman @amueller @agramfort @ogrisel

@mrbeann mrbeann deleted the pr7998 branch August 1, 2017 23:45
@mrbeann mrbeann restored the pr7998 branch August 1, 2017 23:45
@jnothman
Member

jnothman commented Aug 1, 2017 via email

jnothman pushed a commit to jnothman/scikit-learn that referenced this pull request Aug 6, 2017
…t-learn#8130)

* for scikit-learn#7998

* Fix some style error and add test

* Add local variable store_covariance

* better deprecation

* fix bug

* Style check

* fix covariance_

* style check

* Update

* modify test

* Formating

* update

* Update

* Add whats_new.rst

* Revert "Add whats_new.rst"

This reverts commit 4e5977d.

* whats_new

* Update for FutureWarning

* Remove warning from the setter

* add fit in test

* drop back

* Quick fix

* Small fix

* Fix

* update new

* Fix space

* Fix docstring

* fix style

* Fix

* fix assert
dmohns pushed a commit to dmohns/scikit-learn that referenced this pull request Aug 7, 2017
paulha pushed a commit to paulha/scikit-learn that referenced this pull request Aug 19, 2017
AishwaryaRK pushed a commit to AishwaryaRK/scikit-learn that referenced this pull request Aug 29, 2017
maskani-moh pushed a commit to maskani-moh/scikit-learn that referenced this pull request Nov 15, 2017
jwjohnson314 pushed a commit to jwjohnson314/scikit-learn that referenced this pull request Dec 18, 2017
7 participants