
Make it easier to analyze reports #257

Open
@mgifford

Description

I created another script to help analyze the reports.

Right now, if you scan a few sites with Purple A11y's CLI tool, you're left with a bunch of directories containing HTML and CSV files. That's great, but it doesn't let you easily gather the metadata behind the scan. I wanted a snapshot, not the detailed reports.

So I created a little script that crawls the CSV files and extracts some basic information to serve as a simple status report.

I wanted a consistent benchmark for our sites that lets us demonstrate improvement over time. Seeing the individual errors is useful, but if I've scanned 1000 pages, it stands to reason there will be more errors than if I'd scanned just 100.

Knowing how many URLs were scanned, along with counts of WCAG errors (by type) and axe impact levels (by type), gives a better snapshot.
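
For reference, here is a minimal sketch of that kind of CSV roll-up (not the actual script; the column names `url`, `wcagConformance`, and `axeImpact` are assumptions about the report layout and may need adjusting to match the real exports):

```python
# Sketch: walk the Purple A11y results directories, read each report CSV,
# and tally URLs, WCAG criteria, and axe impact levels into one summary.
import csv
import glob
from collections import Counter

def summarize(results_root: str) -> dict:
    urls = set()
    wcag_counts = Counter()    # errors keyed by WCAG criterion / issue type
    impact_counts = Counter()  # errors keyed by axe impact level

    # Recursively pick up every CSV report under the results root.
    for path in glob.glob(f"{results_root}/**/*.csv", recursive=True):
        with open(path, newline="", encoding="utf-8") as fh:
            for row in csv.DictReader(fh):
                # Column names here are assumptions about the CSV layout.
                if row.get("url"):
                    urls.add(row["url"])
                if row.get("wcagConformance"):
                    wcag_counts[row["wcagConformance"]] += 1
                if row.get("axeImpact"):
                    impact_counts[row["axeImpact"]] += 1

    total_errors = sum(wcag_counts.values())
    return {
        "urls_scanned": len(urls),
        "total_errors": total_errors,
        # Normalizing by page count keeps a 1000-page scan comparable to a 100-page one.
        "errors_per_url": total_errors / len(urls) if urls else 0,
        "wcag_errors": dict(wcag_counts),
        "axe_impact": dict(impact_counts),
    }

if __name__ == "__main__":
    import json, sys
    print(json.dumps(summarize(sys.argv[1]), indent=2))
```

Pointing it at the root of a results directory prints one JSON summary that can be archived per scan and diffed over time.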

Perfection is nice, but for most sites it may be unattainable; it's better to be able to demonstrate progress.
