feat(auto-edit): improve error logging #6609
Conversation
vscode/src/services/sentry/sentry.ts
Outdated
}
// if (isError(error) && !insideAgent && !error.stack?.includes('sourcegraph.cody-ai')) {
//     return false
// }
Is this a leftover from local testing?
I'll remove it since it's easy to get it back if we need to.
vscode/src/services/sentry/sentry.ts
Outdated
// Reason: we don't have _any_ errors in Sentry which is suspicious.
// Original PR: https://github.com/sourcegraph/cody/pull/3540
// Follow-up issue: https://linear.app/sourcegraph/issue/CODY-4673/telemetry-verify-error-filtering-conditions-after-the-change
//
Can we verify this before merging? Without it, we won't see errors even when they occur at runtime, which would defeat the purpose of the PR.
Or, I see we already created an issue, so maybe we can follow up there as well.
I tried debugging it locally, but I couldn't verify it by running the extension in the extension host. The stack traces must be different in the production environment, and I'm not sure how to replicate them locally. cc @umpox for ideas on getting production error stack traces locally
With the current implementation, we don't get any errors. My PR loosens the filters we use, so the opposite might happen — we could overwhelm Sentry with errors. However, we've dealt with this situation before, and if it happens again, we can re-enable this filter with a backport or introduce a new one based on the stack traces we see.
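For reference, a minimal sketch of how a stack-based filter could be re-introduced later via Sentry's beforeSend hook. The condition mirrors the commented-out check above; the init shape and DSN handling are illustrative, not the actual sentry.ts setup.

import * as Sentry from '@sentry/node'

Sentry.init({
    dsn: process.env.SENTRY_DSN,
    beforeSend(event, hint) {
        const error = hint?.originalException
        // Drop events whose stack trace doesn't mention the extension ID.
        if (error instanceof Error && !error.stack?.includes('sourcegraph.cody-ai')) {
            return null // returning null discards the event
        }
        return event
    },
})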
makes sense !!
    return result
} catch (error) {
    const errorToReport =
        error instanceof Error
Does it also make sense to send the error to BQ for easy querying and filtering?
We already do! The analytics logger logic sends data to both BigQuery and Sentry. The only difference is that we manually prevent bursts of errors from being sent to BigQuery to limit the event load.
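To illustrate the burst-limiting idea, here is a rough sketch with a fixed per-minute budget for BigQuery events. All names here (reportError, sentryService, telemetryRecorder, the event name, and the budget) are hypothetical stand-ins, not the actual analytics-logger API.

const MAX_BQ_ERRORS_PER_MINUTE = 10
let windowStart = Date.now()
let errorsInWindow = 0

function reportError(error: Error): void {
    // Sentry always receives the error.
    sentryService.captureException(error)

    // Reset the counting window every 60 seconds.
    const now = Date.now()
    if (now - windowStart > 60_000) {
        windowStart = now
        errorsInWindow = 0
    }

    // BigQuery only receives errors while under the per-minute budget,
    // so a burst of repeated failures doesn't flood the event pipeline.
    if (errorsInWindow < MAX_BQ_ERRORS_PER_MINUTE) {
        errorsInWindow++
        telemetryRecorder.recordEvent('cody.error', 'logged', {
            privateMetadata: { message: error.message },
        })
    }
}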
That's awesome!! Thanks for confirming :)
Minor comments; most changes look good to me!!
Wraps the provideInlineCompletionItems method in a try-catch block and logs errors to our telemetry backend and Sentry. Currently, we filter out errors that don't include the sourcegraph.cody-ai substring in the error stack. We haven't seen any errors in Sentry for the last 90 days, which is suspicious. Let's verify if this condition is still valid and ensure we aren't missing any false positives due to this logic.

Test plan

CI and updated integration tests for the autoedit provider.
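For readers skimming the thread, a minimal sketch of the wrapping described in the PR summary; doProvideInlineCompletionItems and reportError are hypothetical stand-ins for the provider's internals, and the error normalization matches the diff quoted above.

import * as vscode from 'vscode'

export class AutoeditProvider implements vscode.InlineCompletionItemProvider {
    public async provideInlineCompletionItems(
        document: vscode.TextDocument,
        position: vscode.Position,
        context: vscode.InlineCompletionContext,
        token: vscode.CancellationToken
    ): Promise<vscode.InlineCompletionItem[] | null> {
        try {
            return await this.doProvideInlineCompletionItems(document, position, context, token)
        } catch (error) {
            // Normalize non-Error throwables before reporting.
            const errorToReport = error instanceof Error ? error : new Error(String(error))
            this.reportError(errorToReport) // forwards to telemetry (BQ) and Sentry
            return null // fail gracefully instead of surfacing the error to the editor
        }
    }

    private async doProvideInlineCompletionItems(
        document: vscode.TextDocument,
        position: vscode.Position,
        context: vscode.InlineCompletionContext,
        token: vscode.CancellationToken
    ): Promise<vscode.InlineCompletionItem[] | null> {
        // The actual auto-edit logic lives here in the real provider.
        return null
    }

    private reportError(error: Error): void {
        // The real implementation reports to BigQuery and Sentry.
        console.error(error)
    }
}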