
kubernetes-e2e-gce-serial: broken test run #34798

Closed
k8s-github-robot opened this issue Oct 14, 2016 · 1 comment
Labels
area/test-infra kind/flake Categorizes issue or PR as related to a flaky test. priority/backlog Higher priority than priority/awaiting-more-evidence.


@k8s-github-robot

https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gce-serial/2334/

Multiple broken tests:

Failed: [k8s.io] Kubelet [Serial] [Slow] [k8s.io] regular resource usage tracking resource tracking for 35 pods per node {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/kubelet_perf.go:278
Expected error:
    <errors.aggregate | len:4, cap:4>: [
        {
            s: "Resource usage on node \"jenkins-e2e-master\" is not ready yet",
        },
        {
            s: "Resource usage on node \"jenkins-e2e-minion-group-lxlm\" is not ready yet",
        },
        {
            s: "Resource usage on node \"jenkins-e2e-minion-group-o3by\" is not ready yet",
        },
        {
            s: "Resource usage on node \"jenkins-e2e-minion-group-xx3u\" is not ready yet",
        },
    ]
    [Resource usage on node "jenkins-e2e-master" is not ready yet, Resource usage on node "jenkins-e2e-minion-group-lxlm" is not ready yet, Resource usage on node "jenkins-e2e-minion-group-o3by" is not ready yet, Resource usage on node "jenkins-e2e-minion-group-xx3u" is not ready yet]
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/kubelet_perf.go:104

Issues about this test specifically: #28220

Failed: [k8s.io] Kubelet [Serial] [Slow] [k8s.io] regular resource usage tracking resource tracking for 0 pods per node {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/kubelet_perf.go:278
Expected error:
    <errors.aggregate | len:4, cap:4>: [
        {
            s: "Resource usage on node \"jenkins-e2e-minion-group-o3by\" is not ready yet",
        },
        {
            s: "Resource usage on node \"jenkins-e2e-minion-group-xx3u\" is not ready yet",
        },
        {
            s: "Resource usage on node \"jenkins-e2e-master\" is not ready yet",
        },
        {
            s: "Resource usage on node \"jenkins-e2e-minion-group-lxlm\" is not ready yet",
        },
    ]
    [Resource usage on node "jenkins-e2e-minion-group-o3by" is not ready yet, Resource usage on node "jenkins-e2e-minion-group-xx3u" is not ready yet, Resource usage on node "jenkins-e2e-master" is not ready yet, Resource usage on node "jenkins-e2e-minion-group-lxlm" is not ready yet]
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/kubelet_perf.go:104

Issues about this test specifically: #26784 #28384 #33023

Failed: [k8s.io] Kubelet [Serial] [Slow] [k8s.io] regular resource usage tracking resource tracking for 100 pods per node {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/kubelet_perf.go:278
Expected error:
    <errors.aggregate | len:4, cap:4>: [
        {
            s: "Resource usage on node \"jenkins-e2e-master\" is not ready yet",
        },
        {
            s: "Resource usage on node \"jenkins-e2e-minion-group-lxlm\" is not ready yet",
        },
        {
            s: "Resource usage on node \"jenkins-e2e-minion-group-o3by\" is not ready yet",
        },
        {
            s: "Resource usage on node \"jenkins-e2e-minion-group-xx3u\" is not ready yet",
        },
    ]
    [Resource usage on node "jenkins-e2e-master" is not ready yet, Resource usage on node "jenkins-e2e-minion-group-lxlm" is not ready yet, Resource usage on node "jenkins-e2e-minion-group-o3by" is not ready yet, Resource usage on node "jenkins-e2e-minion-group-xx3u" is not ready yet]
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/kubelet_perf.go:104

Issues about this test specifically: #26982 #33994 #34035

Failed: [k8s.io] SchedulerPredicates [Serial] validates MaxPods limit number of pods that are allowed to run [Slow] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/scheduler_predicates.go:142
Not scheduled Pods: []api.Pod(nil)
Expected
    <int>: 0
to equal
    <int>: 1
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/scheduler_predicates.go:932

Issues about this test specifically: #27662 #29820 #31971 #32505

Failed: [k8s.io] SchedulerPredicates [Serial] validates resource limits of pods that are allowed to run [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/scheduler_predicates.go:216
Not scheduled Pods: []api.Pod(nil)
Expected
    <int>: 0
to equal
    <int>: 1
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/scheduler_predicates.go:932

Issues about this test specifically: #27115 #28070 #30747 #31341

Previous issues for this suite: #26743 #27118 #27320 #31771 #34183

@k8s-github-robot k8s-github-robot added kind/flake Categorizes issue or PR as related to a flaky test. priority/backlog Higher priority than priority/awaiting-more-evidence. area/test-infra labels Oct 14, 2016
@k8s-github-robot

This is a duplicate of #34679; closing
