
testing: add missing deployments/statefulsets to kf_is_ready_test.py #4294

Closed
yanniszark opened this issue Oct 14, 2019 · 4 comments

Comments

@yanniszark
Contributor

/kind feature

Why you need this feature:
The kf_is_ready_test checks whether certain Kubeflow deployments/statefulsets are ready.
The current list is missing many of the installed deployments/statefulsets (e.g. katib, metadata, some pipelines components, the admission webhook, ...).

Describe the solution you'd like:
Add the missing deployments/statefulsets to kf_is_ready_test in order to improve coverage.

Anything else you would like to add:
/cc @jlewi who has some suggestions on how to implement this in pytest.
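For context, the readiness condition such a test evaluates can be sketched as a small predicate over a deployment's replica counts. This is a minimal sketch, not code from kf_is_ready_test.py; `deployment_is_ready` is a hypothetical helper, and in practice the counts would come from the Kubernetes API (e.g. the deployment's `spec.replicas` and `status.ready_replicas`):

```python
def deployment_is_ready(spec_replicas, ready_replicas):
    """A deployment is ready when all desired replicas report ready.

    spec_replicas:  desired replica count from the deployment spec
                    (None means the Kubernetes default of 1).
    ready_replicas: ready replica count from the deployment status
                    (None means no replicas are ready yet).
    """
    desired = spec_replicas if spec_replicas is not None else 1
    return (ready_replicas or 0) >= desired
```

The actual test would poll this predicate with a timeout for each deployment/statefulset in the list, which is why coverage gaps in the list translate directly into untested components.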

@issue-label-bot

Issue-Label Bot is automatically applying the label kind/feature to this issue, with a confidence of 0.95. Please mark this comment with 👍 or 👎 to give our bot feedback!


@yanniszark
Contributor Author

/priority p1

@jlewi
Contributor

jlewi commented Oct 14, 2019

Repasting my comment from #4154

I don't think we want one monolithic test. A single monolithic test:

  • Prevents fine-grained visibility into which services are causing problems
  • Increases the likelihood of flakes
  • Decreases our ability to skip tests, or to mark known-flaky services as expected failures

With pytest we can easily break this up into multiple test functions e.g.

def test_kfserving():
    _wait_for_deployments(...)
    _wait_for_statefulsets(...)

def test_gcp():
    _wait_for_deployments(...)
    _wait_for_statefulsets(...)

We can then use the pytest annotations to mark flaky tests to prevent them from blocking PRs.
We can also use pytest annotations to conditionally enable some tests
http://doc.pytest.org/en/latest/skipping.html
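As a rough sketch of that marker approach (assuming core pytest only; `_wait_for_deployments`, the environment variable, and the component names are illustrative placeholders, not the real kf_is_ready_test code):

```python
import os

import pytest


def _wait_for_deployments(names):
    # Placeholder for the real readiness polling in kf_is_ready_test.py.
    pass


# xfail reports a known-flaky component without letting it block PRs;
# strict=False means an unexpected pass is also tolerated.
@pytest.mark.xfail(reason="example: component is known to be flaky", strict=False)
def test_flaky_component():
    _wait_for_deployments(["some-flaky-deployment"])


# skipif conditionally enables platform-specific tests.
@pytest.mark.skipif(os.environ.get("PLATFORM") != "gcp",
                    reason="GCP-only deployments")
def test_gcp_only():
    _wait_for_deployments(["gcp-specific-deployment"])
```

Run under pytest, each component then shows up as its own pass/fail/skip line, which gives exactly the per-service visibility described above.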

@stale

stale bot commented Apr 2, 2020

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the lifecycle/stale label Apr 2, 2020
@jlewi jlewi closed this as completed Apr 5, 2020
4 participants