
kubernetes-e2e-gke-test: broken test run #30570

Closed
k8s-github-robot opened this issue Aug 13, 2016 · 26 comments
Labels: area/test-infra, kind/flake, priority/critical-urgent

Comments

@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gke-test/13538/

Run so broken it didn't make JUnit output!

k8s-github-robot added the priority/backlog, area/test-infra, and kind/flake labels on Aug 13, 2016
@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gke-test/13539/

Run so broken it didn't make JUnit output!

@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gke-test/13540/

Run so broken it didn't make JUnit output!

k8s-github-robot added the priority/important-soon label and removed the priority/backlog label on Aug 14, 2016
@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gke-test/13542/

Run so broken it didn't make JUnit output!

k8s-github-robot added the priority/critical-urgent label and removed the priority/important-soon label on Aug 14, 2016
@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gke-test/13543/

Run so broken it didn't make JUnit output!

@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gke-test/13549/

Run so broken it didn't make JUnit output!

@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gke-test/13550/

Run so broken it didn't make JUnit output!

@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gke-test/13551/

Run so broken it didn't make JUnit output!

@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gke-test/13552/

Run so broken it didn't make JUnit output!

@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gke-test/13553/

Run so broken it didn't make JUnit output!

@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gke-test/13554/

Run so broken it didn't make JUnit output!

@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gke-test/13555/

Run so broken it didn't make JUnit output!

@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gke-test/13556/

Run so broken it didn't make JUnit output!

@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gke-test/13557/

Run so broken it didn't make JUnit output!

@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gke-test/13558/

Run so broken it didn't make JUnit output!

@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gke-test/13559/

Run so broken it didn't make JUnit output!

@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gke-test/13560/

Run so broken it didn't make JUnit output!

@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gke-test/13561/

Run so broken it didn't make JUnit output!

@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gke-test/13562/

Run so broken it didn't make JUnit output!

@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gke-test/13563/

Run so broken it didn't make JUnit output!

@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gke-test/13571/

Run so broken it didn't make JUnit output!

@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gke-test/13572/

Run so broken it didn't make JUnit output!

@k8s-github-robot

https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gke-test/13578/

Multiple broken tests:

Failed: [k8s.io] Pods should cap back-off at MaxContainerBackOff [Slow] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/pods.go:1428
getting pod back-off-cap
Expected error:
    <*errors.StatusError | 0xc821594380>: {
        ErrStatus: {
            TypeMeta: {Kind: "", APIVersion: ""},
            ListMeta: {SelfLink: "", ResourceVersion: ""},
            Status: "Failure",
            Message: "the server has asked for the client to provide credentials (get pods back-off-cap)",
            Reason: "Unauthorized",
            Details: {
                Name: "back-off-cap",
                Group: "",
                Kind: "pods",
                Causes: [
                    {
                        Type: "UnexpectedServerResponse",
                        Message: "Unauthorized",
                        Field: "",
                    },
                ],
                RetryAfterSeconds: 0,
            },
            Code: 401,
        },
    }
    the server has asked for the client to provide credentials (get pods back-off-cap)
not to have occurred

Issues about this test specifically: #27703

Failed: [k8s.io] Proxy version v1 should proxy logs on node using proxy subresource [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:133
Aug 18 22:47:14.937: Couldn't delete ns "e2e-tests-proxy-d7rzk": the server does not allow access to the requested resource (delete namespaces e2e-tests-proxy-d7rzk)

Failed: [k8s.io] Namespaces [Serial] should delete fast enough (90 percent of 100 namespaces in 150 seconds) {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/namespace.go:222
Expected error:
    <*errors.StatusError | 0xc821914980>: {
        ErrStatus: {
            TypeMeta: {Kind: "", APIVersion: ""},
            ListMeta: {SelfLink: "", ResourceVersion: ""},
            Status: "Failure",
            Message: "the server does not allow access to the requested resource (get serviceAccounts)",
            Reason: "Forbidden",
            Details: {
                Name: "",
                Group: "",
                Kind: "serviceAccounts",
                Causes: [
                    {
                        Type: "UnexpectedServerResponse",
                        Message: "Forbidden: \"/api/v1/watch/namespaces/e2e-tests-nslifetest-3-fkeav/serviceaccounts?fieldSelector=metadata.name%3Ddefault\"",
                        Field: "",
                    },
                ],
                RetryAfterSeconds: 0,
            },
            Code: 403,
        },
    }
    the server does not allow access to the requested resource (get serviceAccounts)
not to have occurred

Issues about this test specifically: #27957

Failed: [k8s.io] SchedulerPredicates [Serial] validates that NodeSelector is respected if matching [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/scheduler_predicates.go:233
Expected error:
    <*errors.errorString | 0xc822338000>: {
        s: "Namespace e2e-tests-proxy-d7rzk is active",
    }
    Namespace e2e-tests-proxy-d7rzk is active
not to have occurred

Issues about this test specifically: #29516

Failed: [k8s.io] SchedulerPredicates [Serial] validates that NodeSelector is respected if not matching [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/scheduler_predicates.go:233
Expected error:
    <*errors.errorString | 0xc82321e8d0>: {
        s: "Namespace e2e-tests-proxy-d7rzk is active",
    }
    Namespace e2e-tests-proxy-d7rzk is active
not to have occurred

Issues about this test specifically: #28091

Failed: [k8s.io] SchedulerPredicates [Serial] validates that taints-tolerations is respected if matching {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/scheduler_predicates.go:233
Expected error:
    <*errors.errorString | 0xc822eedc80>: {
        s: "Namespace e2e-tests-proxy-d7rzk is active",
    }
    Namespace e2e-tests-proxy-d7rzk is active
not to have occurred

Issues about this test specifically: #28853

Failed: [k8s.io] Services should be able to change the type and ports of a service [Slow] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/service.go:717
Aug 18 21:10:38.001: Timeout waiting for service "mutability-test" to have a load balancer

Issues about this test specifically: #26134

@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gke-test/13577/

Run so broken it didn't make JUnit output!

@k8s-github-robot

https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gke-test/13588/

Multiple broken tests:

Failed: [k8s.io] SchedulerPredicates [Serial] validates resource limits of pods that are allowed to run [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/scheduler_predicates.go:233
Expected error:
    <*errors.errorString | 0xc820b72470>: {
        s: "Namespace e2e-tests-job-cqhq1 is active",
    }
    Namespace e2e-tests-job-cqhq1 is active
not to have occurred

Issues about this test specifically: #27115 #28070 #30747

Failed: [k8s.io] Services should be able to change the type and ports of a service [Slow] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/service.go:717
Aug 21 14:06:08.939: Timeout waiting for service "mutability-test" to have a load balancer

Issues about this test specifically: #26134

Failed: [k8s.io] Job should scale a job down {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/job.go:162
Expected error:
    <kubectl.ScaleError>: {
        FailureType: 0,
        ResourceVersion: "Unknown",
        ActualError: {
            Op: "Get",
            URL: "https://104.198.218.12/apis/batch/v1/namespaces/e2e-tests-job-cqhq1/jobs/scale-down",
            Err: {
                Op: "dial",
                Net: "tcp",
                Source: nil,
                Addr: {
                    IP: "\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xff\xffh\xc6\xda\f",
                    Port: 443,
                    Zone: "",
                },
                Err: {
                    Syscall: "getsockopt",
                    Err: 0x6f,
                },
            },
        },
    }
    Scaling the resource failed with: Get https://104.198.218.12/apis/batch/v1/namespaces/e2e-tests-job-cqhq1/jobs/scale-down: dial tcp 104.198.218.12:443: getsockopt: connection refused; Current resource version Unknown
not to have occurred

Issues about this test specifically: #29066 #30592 #31065

Failed: [k8s.io] SchedulerPredicates [Serial] validates that a pod with an invalid NodeAffinity is rejected {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/scheduler_predicates.go:233
Expected error:
    <*errors.errorString | 0xc821668570>: {
        s: "Namespace e2e-tests-job-cqhq1 is active",
    }
    Namespace e2e-tests-job-cqhq1 is active
not to have occurred

@k8s-github-robot

https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gke-test/13591/

Multiple broken tests:

Failed: [k8s.io] Networking should provide Internet connection for containers [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:132
Expected error:
    <*errors.StatusError | 0xc82202c200>: {
        ErrStatus: {
            TypeMeta: {Kind: "", APIVersion: ""},
            ListMeta: {SelfLink: "", ResourceVersion: ""},
            Status: "Failure",
            Message: "the server does not allow access to the requested resource (get serviceAccounts)",
            Reason: "Forbidden",
            Details: {
                Name: "",
                Group: "",
                Kind: "serviceAccounts",
                Causes: [
                    {
                        Type: "UnexpectedServerResponse",
                        Message: "Forbidden: \"/api/v1/watch/namespaces/e2e-tests-nettest-0va72/serviceaccounts?fieldSelector=metadata.name%3Ddefault\"",
                        Field: "",
                    },
                ],
                RetryAfterSeconds: 0,
            },
            Code: 403,
        },
    }
    the server does not allow access to the requested resource (get serviceAccounts)
not to have occurred

Issues about this test specifically: #26171 #28188

Failed: [k8s.io] Generated release_1_2 clientset should create pods, delete pods, watch pods {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:132
Expected error:
    <*errors.errorString | 0xc8200b1060>: {
        s: "timed out waiting for the condition",
    }
    timed out waiting for the condition
not to have occurred

Failed: [k8s.io] SchedulerPredicates [Serial] validates that NodeSelector is respected if not matching [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/scheduler_predicates.go:233
Expected error:
    <*errors.errorString | 0xc821c449e0>: {
        s: "Namespace e2e-tests-job-7hzpy is active",
    }
    Namespace e2e-tests-job-7hzpy is active
not to have occurred

Issues about this test specifically: #28091

Failed: [k8s.io] SchedulerPredicates [Serial] validates MaxPods limit number of pods that are allowed to run [Slow] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/scheduler_predicates.go:233
Expected error:
    <*errors.errorString | 0xc822be5b00>: {
        s: "Namespace e2e-tests-job-7hzpy is active",
    }
    Namespace e2e-tests-job-7hzpy is active
not to have occurred

Issues about this test specifically: #27662 #29820

Failed: [k8s.io] EmptyDir volumes should support (root,0777,default) [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:132
Expected error:
    <*errors.errorString | 0xc8200b1060>: {
        s: "timed out waiting for the condition",
    }
    timed out waiting for the condition
not to have occurred

Issues about this test specifically: #26780

Failed: [k8s.io] Services should be able to change the type and ports of a service [Slow] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/service.go:717
Aug 22 07:06:52.634: Timeout waiting for service "mutability-test" to have a load balancer

Issues about this test specifically: #26134

Failed: [k8s.io] Pods should be restarted with a docker exec "cat /tmp/health" liveness probe [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:132
Expected error:
    <*errors.errorString | 0xc8200b1060>: {
        s: "timed out waiting for the condition",
    }
    timed out waiting for the condition
not to have occurred

Issues about this test specifically: #28332

Failed: [k8s.io] Kubectl client [k8s.io] Kubectl api-versions should check if v1 is in available api versions [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/kubectl.go:428
Expected error:
    <*errors.errorString | 0xc821fb2100>: {
        s: "Error running &{/workspace/kubernetes/platforms/linux/amd64/kubectl [kubectl --server=https://104.198.247.15 --kubeconfig=/workspace/.kube/config api-versions] []  <nil>  error: failed to negotiate an api version; server supports: map[], client supports: map[federation/v1beta1:{} apps/v1alpha1:{} authorization.k8s.io/v1beta1:{} v1:{} autoscaling/v1:{} batch/v1:{} policy/v1alpha1:{} componentconfig/v1alpha1:{} rbac.authorization.k8s.io/v1alpha1:{} authentication.k8s.io/v1beta1:{} batch/v2alpha1:{} extensions/v1beta1:{}]\n [] <nil> 0xc822bea5e0 exit status 1 <nil> true [0xc820996018 0xc820996038 0xc820996058] [0xc820996018 0xc820996038 0xc820996058] [0xc820996028 0xc820996050] [0xa96280 0xa96280] 0xc8208c66c0}:\nCommand stdout:\n\nstderr:\nerror: failed to negotiate an api version; server supports: map[], client supports: map[federation/v1beta1:{} apps/v1alpha1:{} authorization.k8s.io/v1beta1:{} v1:{} autoscaling/v1:{} batch/v1:{} policy/v1alpha1:{} componentconfig/v1alpha1:{} rbac.authorization.k8s.io/v1alpha1:{} authentication.k8s.io/v1beta1:{} batch/v2alpha1:{} extensions/v1beta1:{}]\n\nerror:\nexit status 1\n",
    }
    Error running &{/workspace/kubernetes/platforms/linux/amd64/kubectl [kubectl --server=https://104.198.247.15 --kubeconfig=/workspace/.kube/config api-versions] []  <nil>  error: failed to negotiate an api version; server supports: map[], client supports: map[federation/v1beta1:{} apps/v1alpha1:{} authorization.k8s.io/v1beta1:{} v1:{} autoscaling/v1:{} batch/v1:{} policy/v1alpha1:{} componentconfig/v1alpha1:{} rbac.authorization.k8s.io/v1alpha1:{} authentication.k8s.io/v1beta1:{} batch/v2alpha1:{} extensions/v1beta1:{}]
     [] <nil> 0xc822bea5e0 exit status 1 <nil> true [0xc820996018 0xc820996038 0xc820996058] [0xc820996018 0xc820996038 0xc820996058] [0xc820996028 0xc820996050] [0xa96280 0xa96280] 0xc8208c66c0}:
    Command stdout:

    stderr:
    error: failed to negotiate an api version; server supports: map[], client supports: map[federation/v1beta1:{} apps/v1alpha1:{} authorization.k8s.io/v1beta1:{} v1:{} autoscaling/v1:{} batch/v1:{} policy/v1alpha1:{} componentconfig/v1alpha1:{} rbac.authorization.k8s.io/v1alpha1:{} authentication.k8s.io/v1beta1:{} batch/v2alpha1:{} extensions/v1beta1:{}]

    error:
    exit status 1

not to have occurred

Issues about this test specifically: #29710

Failed: [k8s.io] SchedulerPredicates [Serial] validates that NodeSelector is respected if matching [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/scheduler_predicates.go:233
Expected error:
    <*errors.errorString | 0xc82059ec10>: {
        s: "Namespace e2e-tests-job-7hzpy is active",
    }
    Namespace e2e-tests-job-7hzpy is active
not to have occurred

Issues about this test specifically: #29516

Failed: [k8s.io] SchedulerPredicates [Serial] validates that required NodeAffinity setting is respected if matching {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/scheduler_predicates.go:233
Expected error:
    <*errors.errorString | 0xc821c02340>: {
        s: "Namespace e2e-tests-job-7hzpy is active",
    }
    Namespace e2e-tests-job-7hzpy is active
not to have occurred

Issues about this test specifically: #28071

Failed: [k8s.io] ConfigMap should be consumable via environment variable [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:132
Expected error:
    <*errors.errorString | 0xc8200b1060>: {
        s: "timed out waiting for the condition",
    }
    timed out waiting for the condition
not to have occurred

Issues about this test specifically: #27079

Failed: [k8s.io] SchedulerPredicates [Serial] validates that taints-tolerations is respected if matching {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/scheduler_predicates.go:233
Expected error:
    <*errors.errorString | 0xc822a5b8c0>: {
        s: "Namespace e2e-tests-job-7hzpy is active",
    }
    Namespace e2e-tests-job-7hzpy is active
not to have occurred

Issues about this test specifically: #28853

Failed: [k8s.io] SchedulerPredicates [Serial] validates that embedding the JSON NodeAffinity setting as a string in the annotation value work {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/scheduler_predicates.go:233
Expected error:
    <*errors.errorString | 0xc8221f8290>: {
        s: "Namespace e2e-tests-job-7hzpy is active",
    }
    Namespace e2e-tests-job-7hzpy is active
not to have occurred

Issues about this test specifically: #29816 #30018

Failed: [k8s.io] Job should run a job to completion when tasks sometimes fail and are locally restarted {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:132
Expected error:
    <*errors.StatusError | 0xc822621180>: {
        ErrStatus: {
            TypeMeta: {Kind: "", APIVersion: ""},
            ListMeta: {SelfLink: "", ResourceVersion: ""},
            Status: "Failure",
            Message: "the server does not allow access to the requested resource (get serviceAccounts)",
            Reason: "Forbidden",
            Details: {
                Name: "",
                Group: "",
                Kind: "serviceAccounts",
                Causes: [
                    {
                        Type: "UnexpectedServerResponse",
                        Message: "Forbidden: \"/api/v1/watch/namespaces/e2e-tests-job-7hzpy/serviceaccounts?fieldSelector=metadata.name%3Ddefault\"",
                        Field: "",
                    },
                ],
                RetryAfterSeconds: 0,
            },
            Code: 403,
        },
    }
    the server does not allow access to the requested resource (get serviceAccounts)
not to have occurred

@fejta

fejta commented Aug 22, 2016
