
[Feature Request]: Colorize the errors on CI for acceptance tests #21448

Open
HardikGoyal2003 opened this issue Dec 16, 2024 · 35 comments

Labels
enhancement (Label to indicate an issue is a feature/improvement), good first issue, Impact: Medium (Will improve quality-of-life for at least 30% of users), Work: Low (Solution is known and broken into good-first-issue-sized chunks)

Comments

@HardikGoyal2003
Member

HardikGoyal2003 commented Dec 16, 2024

Is your feature request related to a problem? Please describe.

It's challenging to identify the exact error when acceptance tests fail in a CI run; we need to look through the logs closely, which is time-consuming.

Describe the solution (or solutions) you'd like

Highlight the errors on CI, similar to the E2E tests.

Describe alternatives you've considered and rejected

No response

Additional context

No response

@HardikGoyal2003 added the triage needed and enhancement labels Dec 16, 2024
@HardikGoyal2003 changed the title from "[Feature Request]: Colorize the errors on CI" to "[Feature Request]: Colorize the errors on CI for acceptance tests" Dec 16, 2024
@seanlip added the Work: Low, Impact: Medium, Work: Medium, and good first issue labels and removed the Work: Low and Work: Medium labels Dec 16, 2024
@mansi-sharma22

Hi @HardikGoyal2003,
I would love to take up this issue and implement the requested feature to colorize errors on CI for acceptance tests. I have experience with CI enhancements and can deliver a clean solution. Please assign this issue to me so I can get started.
Thank you!

@HardikGoyal2003
Member Author

Hey @mansi-sharma22 Per the guidance at https://github.com/oppia/oppia/wiki/Contributing-code-to-Oppia#choosing-a-good-first-issue, please provide an explanation of what your PR will do (with names of files you're changing, what you plan to change in each file, etc.). If it looks good, we can assign you to this issue.

Please also follow the other instructions on that wiki page if you have not yet done so. Thanks!

@mansi-sharma22

Hi @HardikGoyal2003, thanks a lot for your response. Here is my approach; I will also go through the wiki page. Apologies for not having done so already.

  1. Files to be modified:
    a. scripts/check_ci_test_suites_to_run.py

    • Add a new function colorize_error_output(error_message) that will wrap error messages with appropriate color codes
    • Modify the existing error reporting to use this new function

    b. scripts/check_tests_are_captured_in_ci.py

    • Similar changes as above, focusing on the parts that output acceptance test errors

    c. scripts/check_ci_test_suites_to_run_test.py

    • Add new test cases to verify the colorization function works correctly

    d. scripts/check_tests_are_captured_in_ci_test.py

    • Add tests to ensure the colorized output is generated correctly
  2. New file to be created:
    scripts/utils/color_utils.py

    • This will contain color code constants and utility functions for applying colors to strings
  3. Implementation details:

    • I plan to use ANSI color codes for terminal output (a sketch follows this list)
    • Ensure that colorization can be easily toggled on/off for different environments
  4. Testing:

    • Will add unit tests for the new colorization functions
    • Will manually test the output in CI environment to ensure compatibility
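
For illustration, here is a minimal sketch of what the new scripts/utils/color_utils.py could contain, assuming plain ANSI escape codes and an environment-variable toggle; the constants, function names, and environment variables here are illustrative rather than a final API:

```python
"""Hypothetical sketch of scripts/utils/color_utils.py (names are illustrative)."""

import os
import sys

# Standard ANSI escape sequences for terminal colors.
RED = '\033[91m'
GREEN = '\033[92m'
GRAY = '\033[90m'
RESET = '\033[0m'


def colors_enabled() -> bool:
    """Returns whether colorized output should be emitted.

    Colors are disabled when NO_COLOR is set or stdout is not a terminal,
    unless FORCE_COLOR is set (useful on CI, where stdout is not a TTY).
    """
    if os.environ.get('NO_COLOR'):
        return False
    if os.environ.get('FORCE_COLOR'):
        return True
    return sys.stdout.isatty()


def colorize_error_output(error_message: str) -> str:
    """Wraps an error message in red ANSI codes when colors are enabled."""
    if not colors_enabled():
        return error_message
    return '%s%s%s' % (RED, error_message, RESET)
```

Note that GitHub Actions renders ANSI escape sequences in its log viewer, but the CI log stream is not a TTY, so colorization would need an explicit opt-in on CI (like the FORCE_COLOR check above, or keying off the CI environment variable) for the colors to actually show up there.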

@HardikGoyal2003
Member Author

HardikGoyal2003 commented Dec 18, 2024

@mansi-sharma22, could you provide a video demo of your approach? If it looks good, we can assign you. Thanks!

@MadhavDhatrak

@HardikGoyal2003 @seanlip
The issue is that acceptance test errors in CI are not colorized, making it difficult to quickly identify failures. The problem occurs because the acceptance tests don't have dedicated error formatting like the E2E tests do.

This is due to the following missing components:
- No custom Jasmine reporter for colorized output
- No color formatting in test error messages
- Lack of error highlighting in CI logs

Solution:
- Create a new reporter file
- Update the Jasmine config
- Modify the test runner (a sketch of this part follows below)

Files to update:
- core/tests/puppeteer-acceptance-tests/utilities/jasmine-colored-reporter.ts (new file)
- jasmine.json
- run_acceptance_tests.py
- package.json

This solution will:
- Add color highlighting for test failures (red) and passes (green)
- Show stack traces in gray
- Make errors visually distinct in CI output
- Maintain consistency with the E2E tests
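
The new reporter file itself would be TypeScript, so as a hedged illustration of just the "modify the test runner" item, here is a sketch of how run_acceptance_tests.py could stream the test process output and highlight failure-looking lines with ANSI codes. The failure pattern, helper name, and integration point are assumptions, not the existing script's code:

```python
# Hypothetical addition to scripts/run_acceptance_tests.py (illustrative only):
# stream the test subprocess output and wrap failure-looking lines in red.
import re
import subprocess
import sys
from typing import List

RED = '\033[91m'
RESET = '\033[0m'

# Heuristic patterns for lines worth highlighting; the exact strings emitted by
# the acceptance test suite would need to be confirmed.
FAILURE_PATTERN = re.compile(r'(FAILED|Failures:|Error:)')


def run_and_colorize(command: List[str]) -> int:
    """Runs the given test command and colorizes failure lines in its output."""
    process = subprocess.Popen(
        command, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=True)
    assert process.stdout is not None
    for line in process.stdout:
        if FAILURE_PATTERN.search(line):
            sys.stdout.write('%s%s%s\n' % (RED, line.rstrip('\n'), RESET))
        else:
            sys.stdout.write(line)
    return process.wait()
```

Keeping the runner-side highlighting separate from the Jasmine reporter change would let failures stay visible in CI logs even if the reporter change lands independently.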

@MadhavDhatrak

Hi @HardikGoyal2003 @seanlip

I understand this issue and why this problem is happening. I have made some code changes, specifically in the ConsoleReporter class, where the main error reporting happens. I have added some visual formatting to the error reporting, but I’m confused about how to test these changes locally.

Can you guide me on how I can test it locally?

@HardikGoyal2003
Member Author

Hey @MadhavDhatrak, you can create a test PR and check it there.

@MadhavDhatrak

Hi @HardikGoyal2003 @seanlip

I am running into an issue. When I run Mypy checks locally using the following commands, everything works perfectly and I get the "Success: no issues found" message:

1. make run_tests.mypy
2. make run_tests.mypy PYTHON_ARGS="--files path/file1.py path/file2.py"

However, when I try to push to my forked repository, the linter tests pass, but when Mypy runs it shows the following errors:

Errors from Mypy when pushing:
core/jobs/io/gcs_io.py:76: Bad number of arguments for type alias, expected: 2, given: 1.
core/jobs/batch_jobs/translation_migration_jobs.py:105: Argument 1 to "Ok" has incompatible type "List[EntityTranslation]"; expected "Tuple[str, EntityTranslation]".

and many more.

That's why I am unable to create the test PR: the Mypy errors block the push.
Could you provide a solution or suggest how I can resolve these issues to successfully create a test PR?

@HardikGoyal2003
Member Author

@MadhavDhatrak Can you please try adding these changes to your PR and pushing it?

@souja450

@seanlip I would like to work on this issue to improve the developer experience by addressing the challenge of identifying acceptance test failures in CI runs. Currently, it is difficult to pinpoint the exact errors without going through the logs manually, which is time-consuming and inefficient. My goal is to enhance the CI logs by implementing a feature that highlights specific errors, similar to how E2E test errors are displayed.

This improvement will streamline the debugging process, reduce manual effort, and ensure quicker identification of issues during CI runs.

@seanlip
Member

seanlip commented Dec 22, 2024

@souja450 Per the guidance at https://github.com/oppia/oppia/wiki/Contributing-code-to-Oppia#choosing-a-good-first-issue, please provide an explanation of what your PR will do (with names of files you're changing, what you plan to change in each file, etc.), as well as a video showing that the changes work correctly on your local machine.

We cannot assign you to this (or any issue) without seeing the proof that what you are planning to build actually works. Thanks.

@souja450

souja450 commented Dec 22, 2024

Hi @seanlip,

As per the issue requirements, I plan to enhance error message readability in CI for acceptance tests by adding colorization.
The files to be modified are:
- scripts/check_ci_test_suites_to_run.py: add a colorize_error_output() function using ANSI codes to format error messages, and integrate it into the error reporting mechanisms.
- scripts/check_tests_are_captured_in_ci_test.py: write unit tests to validate the colorize_error_output() function and ensure the integration does not disrupt functionality.
- scripts/utils/color_utils.py (new): create a utility for ANSI color constants and reusable colorization functions.
Expected outcome:
Error messages in CI will display in color, improving readability.
I will share a video demonstrating these changes working correctly in my local environment once implemented.
Screenshot 2024-12-22 at 2 26 17 PM
I uploaded a screenshot of one of my code changes working.

Thanks!

@MadhavDhatrak

Hello @HardikGoyal2003 @seanlip

I have made some changes in the files and I am confident that these changes will resolve the issue. However, I am facing some difficulties while testing them. I need to create a test PR to verify the fixes, but I've been trying for the last two days and am encountering multiple issues. Some of them I've already solved, like the Mypy test issue, but the problem with the google-chrome-stable_current_amd64.deb file keeps recurring.

Here is a screenshot of the issue I’m facing:

Screenshot 2024-12-22 145746

If you have any solutions or suggestions on how to proceed, please let me know.

@seanlip
Member

seanlip commented Dec 24, 2024

@MadhavDhatrak This is a question for GitHub Discussions, not this issue thread.

@souja450 Thanks for your update. We'll wait until you have everything set up and working correctly in your local environment before assigning you.

@Harish-1511

Harish-1511 commented Dec 26, 2024

Hi @seanlip,

Regarding the requested feature for colorizing errors in the CI for acceptance tests:

I wrote a Python script that uses ANSI codes to add colorization to terminal output. This enhancement improves readability and makes it easier to identify errors in acceptance tests, significantly reducing the time needed to pinpoint issues.

New File: scripts/utils/color_utils.py

  • This file includes ANSI color codes for terminal output and reusable functions for string colorization.

  • A function named colorize_error(error_message) has been implemented to apply color formatting to error messages, making them clear and visually distinct in the CI logs.

Implementation:

  • The colorize_error function uses ANSI color codes to format error messages dynamically.

  • The implementation ensures that the functionality is reusable and compatible across different CI outputs.

Testing:

  • New File: scripts/utils/test_color_utils.py

  • I implemented test cases to validate the color_utils functions for proper string colorization (a sketch follows at the end of this comment).

  • I also manually tested the output within the Oppia environment to ensure compatibility.

Screenshot 2024-12-26 131630

I am currently working on integrating the colorized output functionality into the CI logs for acceptance tests.

Files and Updates:

  • scripts/utils/color_utils.py: Created this file to include ANSI codes and reusable functions for string colorization.
  • This implementation aims to enhance the readability of CI logs by making errors visually distinct.
  • scripts/run_acceptance_tests.py: The main script for executing acceptance tests.
  • scripts/check_ci_test_suites_to_run.py: Added a function to handle error colorization.
  • scripts/check_tests_are_captured_in_ci.py: Applied colorization to the existing outputs for better readability.

Screenshot 2024-12-24 210500

I’ll share a video showing these changes working in my local setup once completed.

Please share any suggestions or solutions for moving forward.

Thanks!
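
For reference, here is a sketch of what the new scripts/utils/test_color_utils.py could verify, assuming a colorize_error(error_message) helper that respects a NO_COLOR/FORCE_COLOR toggle like the one sketched earlier in this thread; all module paths and names here are illustrative:

```python
"""Hypothetical sketch of scripts/utils/test_color_utils.py (illustrative only)."""

import os
import unittest
from unittest import mock

from scripts.utils import color_utils  # Assumed module path from the plan above.


class ColorUtilsTests(unittest.TestCase):

    def test_error_is_wrapped_in_red_when_colors_are_forced(self) -> None:
        # FORCE_COLOR is an assumed opt-in toggle, mirroring the earlier sketch.
        with mock.patch.dict(os.environ, {'FORCE_COLOR': '1'}, clear=True):
            colored = color_utils.colorize_error('Test suite failed')
        self.assertTrue(colored.startswith('\033[91m'))
        self.assertTrue(colored.endswith('\033[0m'))
        self.assertIn('Test suite failed', colored)

    def test_error_is_unchanged_when_colors_are_disabled(self) -> None:
        # NO_COLOR is the conventional opt-out respected by the assumed helper.
        with mock.patch.dict(os.environ, {'NO_COLOR': '1'}, clear=True):
            self.assertEqual(
                color_utils.colorize_error('Test suite failed'),
                'Test suite failed')


if __name__ == '__main__':
    unittest.main()
```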

@seanlip
Member

seanlip commented Dec 27, 2024

Thanks @Harish-1511. We'll wait for your update that shows that this works properly within the relevant contexts.

@souja450

Hello @seanlip, sorry for the late reply. This is the video demo of my work.

Uploading Testingvideo1.mov…

@souja450

> @MadhavDhatrak This is a question for GitHub Discussions, not this issue thread.
>
> @souja450 Thanks for your update. We'll wait until you have everything set up and working correctly in your local environment before assigning you.

Hello @seanlip, sorry for the late reply. This is the video demo of my work.

Uploading Testingvideo1.mov…

@souja450

souja450 commented Dec 27, 2024 via email

@HardikGoyal2003
Member Author

@souja450 The video demo in your previous comments is not properly uploaded, making it invisible. Additionally, your last email reply is unclear. Could you please clarify? Thanks!

@souja450

souja450 commented Dec 27, 2024 via email

@souja450

Hi @seanlip,

As per the issue requirements, I made these changes:
- scripts/check_ci_test_suites_to_run.py: added a colorize_error_output() function using ANSI codes to format error messages and integrated it into the error reporting mechanisms.
- scripts/check_tests_are_captured_in_ci_test.py: wrote unit tests to validate the colorize_error_output() function and ensure the integration does not disrupt functionality.
- scripts/utils/color_utils.py (new): created a utility for ANSI color constants and reusable colorization functions.
I got the error messages in color as before, and I tested check_tests_are_captured_in_ci_test.py, but the demo video had to be under 10 MB, so its resolution is low; that's why it is hard to see.
Regards,
Soulaimene

Testingvideo1.mov

@seanlip
Member

seanlip commented Dec 29, 2024

@souja450 Thanks, I can't see your video clearly enough and I don't think it is actually running the acceptance tests. I suggest doing the following:

  • Record a video of just your terminal, not your entire screen.
  • Record the video in two or more parts, if you need to.
  • Show that the error logs for acceptance tests are properly coloured. In your script there seems to be some installation issue and you haven't actually gotten to the acceptance tests part yet.

Hope this helps.

@souja450

souja450 commented Dec 29, 2024

Hello @seanlip, please find my video demo attached in parts.
Regards,
Soulaimene Jaoua

videopart1.mov
videopart2.mov
videopart3--.mov
videopart4--.mov

@seanlip
Member

seanlip commented Dec 29, 2024

@souja450 This doesn't follow the guidance I provided above. You are still recording the entire screen and you haven't shown that you have gotten to the "acceptance tests" section yet.

@souja450

@seanlip please find attached the terminal video here.

full_video.mov

@seanlip
Member

seanlip commented Dec 30, 2024

@souja450 That's better, yup. You still haven't successfully got to the point of running an acceptance test, though. Please don't post more videos in this thread until you've sorted that out.

@souja450

@seanlip Hello, I just got the acceptance tests to pass.
Acceptance_test

@seanlip
Member

seanlip commented Dec 30, 2024

No, they have not passed. You have run 0 tests.

Your logs should look something like the ones here: https://github.com/oppia/oppia/actions/runs/12543790205/job/34979432783

@souja450

souja450 commented Jan 2, 2025

Screenshot 2025-01-02 at 3 47 53 PM
Hello @seanlip, happy new year! When I run python -m scripts.run_e2e_tests it generates these errors. Can you help me fix them? Every time, I need to install requirement.txt; I followed the instructions on the wiki page but the issue couldn't be resolved.

@souja450

souja450 commented Jan 3, 2025

Uploading Screenshot 2025-01-03 at 3.36.45 AM.png…

@souja450

souja450 commented Jan 3, 2025

I need help. I tried to compile, but there is always a requirements dependency check that creates conflicts. How do I fix it?

@souja450

souja450 commented Jan 3, 2025

@seanlip Hello, the environment is set up, but when it reaches the acceptance test stage it reports that Python dependencies are missing, even though I installed requirement.txt and requirement_dev.txt. What could be the possible solution?

@souja450

souja450 commented Jan 4, 2025

Hi @seanlip, please find attached the test suites running and the environment setup running too.
Screenshot 2025-01-04 at 2 13 15 PM
Screenshot 2025-01-04 at 2 14 39 PM

@seanlip
Member

seanlip commented Jan 5, 2025

Hi @souja450, that's great, but these are the e2e tests, not the acceptance tests. You'll want to try running the acceptance tests. See our wiki for more details.

Also I would suggest that you refrain from posting more images on this thread until you have the colourization actually working, i.e. you have actually solved the issue. You can use GitHub Discussions instead if you have general dev-workflow-related questions. Thanks.
