Missing error logs in flow run timeline from KubernetesWorker #15722

Closed
DGolubets opened this issue Oct 16, 2024 · 6 comments · Fixed by #16647
Labels
enhancement: An improvement of an existing feature
integrations: Related to integrations with other services

Comments

@DGolubets

Bug summary

My deployment running in K8s fails with an error (found by inspecting the pod with kubectl logs):

ModuleNotFoundError: No module named 'polars'

But the flow run logs in the UI contain nothing except:

Process for flow run 'banana-bug' exited with status code: 1
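
For context, a minimal flow file that reproduces this looks roughly like the sketch below (file name and flow are placeholders; the point is that the import fails while the module is loaded, before a flow run logger exists):

```python
# flows/report.py -- hypothetical reproduction; any dependency missing from the image works
import polars as pl  # raises ModuleNotFoundError while the file is being imported

from prefect import flow

@flow
def report():
    # never reached: the import above fails before the flow run starts
    print(pl.__version__)
```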

Version info (prefect version output)

Version:             3.0.4
API version:         0.8.4
Python version:      3.12.3
Git commit:          c068d7e2
Built:               Tue, Oct 1, 2024 11:54 AM
OS/Arch:             linux/x86_64
Profile:             ephemeral
Server type:         server
Pydantic version:    2.9.2
Integrations:
  prefect-kubernetes: 0.5.0
  prefect-aws:       0.5.0
  prefect-dask:      0.3.1

Additional context

No response

@DGolubets added the bug (Something isn't working) label on Oct 16, 2024
@zzstoatzz
Collaborator

zzstoatzz commented Oct 16, 2024

hi @DGolubets - on some level this is expected to me, i.e. your deployment's runtime is likely missing polars as a dependency, so your code failed at import time before a flow run logger could be set up.

is your expectation that this line

ModuleNotFoundError: No module named 'polars'

is surfaced by the worker in the flow run logs? I think that's a very reasonable enhancement request
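
For illustration only, a rough sketch of what surfacing that line could look like, using the official kubernetes Python client; the function name and the pod/namespace discovery are hypothetical, and this is not necessarily how the eventual fix in #16647 works:

```python
# Hypothetical sketch: after the job pod exits non-zero, read the tail of its
# logs so the worker could forward them into the flow run's log stream.
from kubernetes import client, config

def tail_pod_logs(pod_name: str, namespace: str, lines: int = 50) -> str:
    """Return the last `lines` lines of the pod's combined stdout/stderr."""
    config.load_incluster_config()  # use load_kube_config() when running outside the cluster
    core = client.CoreV1Api()
    return core.read_namespaced_pod_log(
        name=pod_name,
        namespace=namespace,
        tail_lines=lines,
    )
```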

@DGolubets
Author

DGolubets commented Oct 16, 2024

Hi @zzstoatzz

Thanks for the quick reply.
I wrongly assumed that Prefect collects the logs from the pod's stdout.
If it sends them from Python, then this behavior is understandable indeed.

Though the UX would definitely be better if Prefect could handle this case.

@zzstoatzz
Collaborator

Though the UX would definitely be better if Prefect could handle this case.

Agreed! I will mark this as an enhancement

@zzstoatzz added the enhancement and integrations labels and removed the bug label on Oct 16, 2024
@zzstoatzz changed the title from "Missing error logs" to "Missing error logs in flow run timeline from KubernetesWorker" on Oct 16, 2024
@EmilRex
Contributor

EmilRex commented Nov 25, 2024

In cases where the flow author does not have direct access to the infrastructure, this becomes a point of friction. For example, the flow author might be a data scientist who does not have direct access to their organization's Kubernetes cluster. In this case, the data scientist would have to rely on their cluster owner to find and share the stack trace.

@EmilRex
Contributor

EmilRex commented Nov 25, 2024

My understanding is that these are errors that happen in the Runner. Errors that happen while provisioning infrastructure are captured in worker logs, and errors that happen in the context of a flow are captured in flow run logs, but between the infra being provisioned and the flow starting, logs are not captured.

@EmilRex
Contributor

EmilRex commented Dec 13, 2024

In addition to import errors, I believe this would also happen with syntax errors. Suppose you create a deployment using git storage. After creating the deployment, you commit and push a syntax error. Starting a flow run will then pull the commit containing the syntax error and crash without reporting any logs (see the sketch below).
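
A sketch of that scenario using Prefect 3's from_source/deploy pattern (the repository URL, entrypoint, work pool, and image are placeholders):

```python
from prefect import flow

# The deployment pulls the flow code from git at run time, so a syntax error
# pushed after deployment only shows up when the pod tries to import the file.
flow.from_source(
    source="https://github.com/example/flows",     # placeholder repository
    entrypoint="flows/report.py:report",           # placeholder entrypoint
).deploy(
    name="report",
    work_pool_name="k8s-pool",                     # placeholder Kubernetes work pool
    image="registry.example.com/flows:latest",     # placeholder image
)
```

If the pushed commit contains a syntax error, the pod crashes while importing the entrypoint and, as described above, only the exit code reaches the UI.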
