
![GitHub release (latest by date)](https://img.shields.io/github/v/release/gdcc/dataverse_tests) ![Python version](https://img.shields.io/static/v1?label=Python&message=3.6|3.7|3.8&color=blue) [![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.6802981.svg)](https://doi.org/10.5281/zenodo.6802981) [![GitHub](https://img.shields.io/github/license/gdcc/pydataverse.svg)](https://opensource.org/licenses/MIT) [![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)

Dataverse Tests helps you test the operational requirements of your [Dataverse](https://dataverse.org/) installation to maintain stability and lower risk. It covers integration, system and risk-based testing. In addition to the tests, `utils` offers a CLI of common workflow actions to support your testing activities.

Tests are written in Python with pytest, requests and Selenium. They are easy to adapt and extend. They are open source and well documented. The tests do not contain common frontend or backend unit tests, which are part of Dataverse development itself.

Funded by:

General

* Settings management
* Flexible and easy to use for your own Dataverse instance
* Integration of [dataverse_testdata](https://github.com/gdcc/dataverse_testdata/) and [dataverse-sample-data](https://github.com/IQSS/dataverse-sample-data/)
* Open source (MIT)

Tests (`testing/`)

```shell
pip install .
```

### 1. Set up .env-file

Before you can start with either `testing` or `utils`, you have to configure the settings management. Create a `.env` file for each Dataverse installation and set the needed variables in it, starting from the `env-config/example.env` template. The filename of the `.env` file is the central identifier for work done with your Dataverse installation and is reused later for other naming purposes, so use a descriptive one (e.g. `ORGANISATION_INSTALLATION.env` => `aussda_production.env`). Once set up, set the `ENV_FILE` environment variable in your terminal to the absolute path of your `.env` file.

```shell
export ENV_FILE="/ABSOLUTE/FILE/PATH/TO/ENV/FILE.env"
```

Note: To keep testing activities out of your web-analytics service (e.g. Matomo or Google Analytics), set a dedicated user agent and exclude calls from it in your web-analytics service.

Note: Environment variables set on the command line override the ones defined in the `.env` file.
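
For example, any variable from the `.env` file can be overridden for a single session (the variable name below is purely illustrative):

```shell
# Hypothetical variable name; takes precedence over the value defined in the .env file
export SOME_VARIABLE="another value"
```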

Environment variables are documented in `src/dvtests/settings.py`.

### 2. Create user JSON

For some tests and utils functions, you need at least one user with sufficient rights to make API requests, create Datasets, log in, or perform other actions. These user credentials are stored in a JSON file under `user/`.

The user JSON file consists of user-specific information used both for testing and utils functionality. We recommend copying `user/example.json`, renaming it after your instance (e.g. `aussda_production`) and adding all your users with their credentials.
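
A quick way to start from the template (the target filename is just an example and should match your installation name):

```shell
cp user/example.json user/aussda_production.json
```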

Beware: This file contains secret, critical data and should not be versioned or shared with anybody.

### 3. Testing

**3.a. Find tests**

The tests can be found inside `src/dvtests/testing/`. They are separated into:

* `default/`: basic tests applicable to a normal Dataverse installation with default configuration
* `custom/`: tests to verify installation-specific customizations of a Dataverse installation

**3.b. Set up browser engines (only if Selenium tests are used)**

To run Selenium tests, you have to have at least one browser engine available and callable by [pytest-selenium](https://pytest-selenium.readthedocs.io/).

For this, set the `PATH` environment variable in your terminal: add the directories of all the browser engines you want to use (e.g. geckodriver, [chromedriver](https://chromedriver.chromium.org/)) to your path. Check out [pytest-selenium](https://pytest-selenium.readthedocs.io/en/latest/) for supported browser engines.
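
For example (the directory is a placeholder; use wherever your driver binaries actually live):

```shell
export PATH="$PATH:/path/to/browser-engine-drivers"
```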

Note: The browser engine file must be executable.
A test normally works like this: You define the test input and the expected result and the test resolves to true if the actual result equals the expected one.

As the test input and expected results differ from installation to installation, you need to define them before you can execute the tests. These test-configs can be found in the `config/installations/` directory, inside a sub-directory named after your `.env` file (e.g. `aussda_production/`). For each installation, you have to create a `settings.json` inside it, using the `config/installations/TEMPLATE_testing-settings.json` template.
The Dataverse installation folder must contain a directory named `testing/`, in which the configs for all the core tests are placed inside the `default/` folder. Example path: `configs/installations/aussda_production/testing/default/`.

In there, you then have to place a config file for each test. They all have the `test_` prefix and are written in JSON. To find out how the configs work, first check out the tests and/or other test-configs.

Best Practice: Start by copying the `config/installations/aussda_production` folder, renaming it to your `.env` filename and adapting the test-configs to your own setup.
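
A minimal sketch of that, assuming the target folder is named after your `.env` file:

```shell
cp -r config/installations/aussda_production config/installations/YOUR_INSTALLATION
```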

If you want to use a Selenium frontend test, you have to pass the browser engine:

```shell
pytest -v --driver Firefox src/dvtests/testing/default/test_create-frontend_dataverse.py
```

We have defined several markers for the tests, which you can find out about in `setup.cfg`. Most markers tell you whether a test has already been used with a specific Dataverse version, or whether it requires utils or Selenium to run properly.

```shell
pytest -v -m "v5_6" src/dvtests/testing/default/test_shibboleth.py
```

**3.e. Optional: Adapt Shibboleth login function**

As every Shibboleth login works differently, you have to adapt/overwrite `custom_shibboleth_institution_login` inside `src/dvtests/testing/conftest.py` to match your own Shibboleth login procedure before you can use it.

**3.f. Optional: Collect data with utils for data completeness**

If you want to test the data completeness of your installation (e.g. after an upgrade or migration), you first need to collect the data from the existing/old Dataverse installation. Find out more in the utils `create-testdata` section below.

### 4. Utils

Utils intends to offer helpful functions for your testing workflow - like collecting all data before a migration, uploading testdata for automated and/or manual testing, or cleaning up after testing. These functions can be called from the command line.

Note: Execute steps 1 and 2 before you start using utils.

The general call pattern is:

```shell
python src/dvtests FUNCTION_NAME [FUNCTION_VARIABLES]
```

Create testdata uses a JSON file to define which data should be created, how, in which order, and by whom.

Note: The `:root` Dataverse is not published on a fresh installation and often the superuser (`dataverseAdmin`) account is not verified. This can lead to problems related to adding new data.
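
If needed, the `:root` Dataverse can be published through the standard Dataverse native API (sketch; `SERVER_URL` and `API_TOKEN` are placeholders for your installation URL and a superuser API token):

```shell
curl -X POST -H "X-Dataverse-key: $API_TOKEN" "$SERVER_URL/api/dataverses/:root/actions/:publish"
```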

**Call**
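
A call might look like this (the config filename is illustrative; adapt it to your own config under `configs/utils/`):

```shell
python src/dvtests create-testdata configs/utils/create_testdata_01.json
```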

Actions are executed in sequential order.

* `id`: id of data related to the action
* `id-type`: type of the id (`dvtests` if defined by the user in the JSON file, `alias` for a Dataverse alias, `pid` for a Dataset PID)
* `action`: defines which kind of action should be done (`create` Dataverse or Dataset, `publish` Dataverse or Dataset, `upload` Datafile)
* `user-handle`: defines by which user the action should be executed (user must be defined in the users JSON)
* `parent-id`: id of the parent to which the data should be attached.
* `parent-id-type`: same as `id-type`
* `metadata`: data related to the metadata
* `update`: list of metadata attributes from the metadata file that should be updated before further steps.

The `create-user` function works the same way, driven by a user config file:

```shell
python src/dvtests create-user configs/utils/create_user_01.json
```

The next steps for the project are:

1. Get it used by the broader community
2. Add compatibility with newer Dataverse versions
3. Extend existing tests
4. Add new tests

**Sustainability**

As of now, there is no ongoing, steady funding available, so no further development is planned. If you have feature requests or other ideas or concerns regarding the future of dataverse_tests, please contact the GDCC.

## Contributor Guide

Please see [CONTRIBUTING.rst](CONTRIBUTING.rst).

## Resources


To everyone who has contributed to this project - with an idea, an issue, a pull request, developing an application, sharing it with others or by any other means: **Thank you for your support!**

Open source projects thrive on the cooperation of many, and Dataverse Tests is no exception, so saying thank you is the least that can be done.

Special thanks to Slava Tykhonov from DANS and all the people who do an amazing job developing Dataverse at IQSS.
