diff --git a/CHANGELOG.md b/CHANGELOG.md index d2ecc38dc9..9980e231a5 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -6,6 +6,10 @@ New `zentral.core.stores.backends.snowflake` store backend. ### Backward incompatibilities +#### 🧨 updated monolith configuration + +The Monolith repository is not configured in `base.json` anymore. Multiple Monolith repositories can be managed using the API or the GUI. + #### 🧨 updated `/api/inventory/machines/tags/` API endpoint To add more flexibility, the payload for this API endpoint has changed. Please refer to [the documentation](https://docs.zentral.io/en/latest/apps/inventory/#apiinventorymachinestags). diff --git a/docs/apps/monolith.md b/docs/apps/monolith.md index 06acb84371..b1389ab6aa 100644 --- a/docs/apps/monolith.md +++ b/docs/apps/monolith.md @@ -4,74 +4,42 @@ Monolith is a Munki server that adds dynamic manifests, catalogs, with progressi ## Zentral configuration -To activate monolith, you need to add a `zentral.contrib.monolith` section to the `apps` section in `base.json`. - -### Local repository with osquery optional enrollment - -The Munki repository is on the same server as Zentral. Only osquery is proposed for enrollment. The Munki enrollment is always enabled. `munkitools_core` and `osquery` must be present in your repository. +To activate monolith, you need to add a `zentral.contrib.monolith` section to the `apps` section in `base.json`: ```json { - "zentral.contrib.monolith": { - "enrollment_package_builders": { - "zentral.contrib.munki.osx_package.builder.MunkiZentralEnrollPkgBuilder": { - "requires": ["munkitools_core"], - "optional": false - }, - "zentral.contrib.osquery.osx_package.builder.OsqueryZentralEnrollPkgBuilder": { - "requires": ["osquery"], - "optional": true - } - }, - "munki_repository": { - "backend": "zentral.contrib.monolith.repository_backends.local", - "root": "/var/lib/munki/repo" - } - } + "zentral.contrib.monolith": {} } ``` -### S3 repository with osquery optional enrollment - -The Munki repository is in a S3 bucket. Only osquery is proposed for enrollment. The Munki enrollment is always enabled. `munkitools_core` and `osquery` must be present in your repository. +You can also configure enrollment packages. In the following example, two enrollment packages are configured: one for the Zentral Munki module, with `munkitools_core` as required PkgInfo, and one for the Zentral Osquery module, with `osquery` as required PkgInfo. ```json { "zentral.contrib.monolith": { "enrollment_package_builders": { "zentral.contrib.munki.osx_package.builder.MunkiZentralEnrollPkgBuilder": { - "requires": ["munkitools_core"], - "optional": false + "requires": ["munkitools_core"] }, "zentral.contrib.osquery.osx_package.builder.OsqueryZentralEnrollPkgBuilder": { - "requires": ["osquery"], - "optional": true + "requires": ["osquery"] } - }, - "munki_repository": { - "backend": "zentral.contrib.monolith.repository_backends.s3", - "aws_access_key_id": "AAAAAAAAAAAAAAAAAAAA", - "aws_secret_access_key": "SECRET", - "bucket": "monolith-acme", - "signature_version": "s3v4", - "region_name": "eu-central-1", - "prefix": "path_to_repo_root_in_bucket" } } } ``` -**IMPORTANT** When running in AWS, it is recommended to use [AWS instance profiles](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_use_switch-role-ec2_instance-profiles.html), [task IAM roles](https://docs.aws.amazon.com/AmazonECS/latest/userguide/task-iam-roles.html), or any other integrated authentication mechanism to authenticate with the bucket. 
If this is not possible, the AWS credentials can be passed as environment variables, using the `{{ env:NAME_OF_THE_VARIABLE }}` substitution in the Zentral configuration. +### Repositories -### Catalogs +Multiple repositories can be used. There are two kinds of repositories: `S3` and `Virtual`. Use an `S3` repository when you have a Munki repository published in an AWS S3 bucket. Use a `Virtual` repository to upload packages directly to Zentral. Repositories are managed in the GUI or via the API (an example API call is shown further below). -Monolith works better – and is easier to reason about – when all the needed base versions of all pkginfo files are present in at least one catalog, and when more recent pkginfo files are made available in extra catalogs that can be activated for some machines. You could have for example a `production` catalog with the base versions of all the softwares you want to distribute across your fleet, and a `testing` catalog for the more recent versions. +**IMPORTANT** When using AWS S3 buckets, it is recommended to use [AWS instance profiles](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_use_switch-role-ec2_instance-profiles.html), [task IAM roles](https://docs.aws.amazon.com/AmazonECS/latest/userguide/task-iam-roles.html), or any other integrated authentication mechanism to authenticate with the bucket. -Monolith can run in one of two modes. By default, the catalogs from the pkginfo files are **automatically imported and used** in Monolith. If you want to promote a pkginfo file from `testing` to `production`, you would do it in the repository, and trigger a sync (it could be from `bleeding-edge` to `standard`, names are not important as long as they are used consistently). This mode would be the one to pick if you already have a pkginfo file auto-promotion setup. +### Catalogs -Monolith can also run in **manual mode**. To use this mode, set the `manual_catalog_management` to `true` in the `munki_repository` repository configuration of the `zentral.contrib.monolith` app configuration. In this mode, you can also choose the default name of the catalog that the new pkginfo files will be attached to in Zentral, by setting the `default_catalog` key (default to `Not assigned`). To promote a pkginfo file from one catalog to the other one, you would then have to do it in Zentral. +Monolith works better – and is easier to reason about – when all the needed base versions of all pkginfo files are present in at least one catalog, and when more recent pkginfo files are made available in extra catalogs that can be activated for some machines. You could, for example, have a `production` catalog with the base versions of all the software you want to distribute across your fleet, and a `testing` catalog for the more recent versions. -In either mode, you need to set the catalogs priorities in Zentral. Munki cannot understand that `bleeding-edge` has more recent versions than `standard` (or `testing` > `production`). That's why you need to give the catalogs where the most recent versions of the pkginfo files are, higher priorities (bigger numbers). This way we can make sure that if for example there is firefox 123 in `bleeding-edge`, and 122 in `production`, and that munki gets those two catalogs, that firefox 123 will be installed. +By default, the catalogs from the pkginfo files are **automatically imported and used** in Monolith. 
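+As mentioned in the Repositories section above, a repository can be created via the API. The following is a hedged sketch: the `/api/monolith/repositories/` endpoint path, the payload keys (`name`, `backend`, `backend_kwargs`) and the example values are inferred from the API tests in this changeset, not from a published reference.
+
+```
+curl -X POST \
+  -H "Authorization: Token $TOKEN" \
+  -H "Content-Type: application/json" \
+  -d '{"name": "acme-repo", "backend": "S3", "backend_kwargs": {"bucket": "monolith-acme"}}' \
+  https://$FQDN/api/monolith/repositories/
+```
+
+A `Virtual` repository would only need a `name` and `"backend": "VIRTUAL"`, with `backend_kwargs` omitted or left as an empty dict.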
If you want to promote a pkginfo file from `testing` to `production`, you would do it in the repository, and trigger a sync (it could be from `bleeding-edge` to `standard`, names are not important as long as they are used consistently). ## Build a manifest @@ -126,32 +94,23 @@ Zentral will parse the body of the request based on the `Content-Type` HTTP head * `Content-Type: application/json` -### /api/monolith/repository/sync/ +### /api/monolith/repository/``/sync/ -#### Fetch the package infos from the repository +#### Fetch the package infos, the icons, the client resources from the repository -During a sync, monolith will import all the available [pkginfo files](https://github.com/munki/munki/wiki/Glossary#info-file-or-pkginfo-file), their [catalogs](https://github.com/munki/munki/wiki/Glossary#catalog), categories, and make them available to the app. +During a sync, monolith will import all the available [pkginfo files](https://github.com/munki/munki/wiki/Glossary#info-file-or-pkginfo-file), their [catalogs](https://github.com/munki/munki/wiki/Glossary#catalog), categories, and make them available to the app. It will also import the icon hashes, and get a list of the client resources. * method: POST * Content-Type: application/json -* Required permissions: - * `monolith.view_catalog` - * `monolith.add_catalog` - * `monolith.change_catalog`, - * `monolith.view_pkginfoname` - * `monolith.add_pkginfoname` - * `monolith.change_pkginfoname`, - * `monolith.view_pkginfo` - * `monolith.add_pkginfo` - * `monolith.change_pkginfo`, - * `monolith.change_manifest` +* Required permission: + * `monolith.sync_repository` Example: ``` curl -X POST \ -H "Authorization: Token $TOKEN" \ - https://$FQDN/api/monolith/repository/sync/ + https://$FQDN/api/monolith/repository/1/sync/ ``` Response: diff --git a/tests/conf/base.json b/tests/conf/base.json index ac649e1abe..59cf6c6825 100644 --- a/tests/conf/base.json +++ b/tests/conf/base.json @@ -168,11 +168,6 @@ ], "optional": true } - }, - "munki_repository": { - "manual_catalog_management": false, - "backend": "zentral.contrib.monolith.repository_backends.local", - "root": "/tmp" } }, "zentral.contrib.okta": {}, diff --git a/tests/monolith/test_api_views.py b/tests/monolith/test_api_views.py index be1d5c61c7..fd617f5931 100644 --- a/tests/monolith/test_api_views.py +++ b/tests/monolith/test_api_views.py @@ -11,14 +11,23 @@ from django.utils.crypto import get_random_string from accounts.models import APIToken, User from zentral.conf import settings -from zentral.contrib.inventory.models import EnrollmentSecret, MetaBusinessUnit, Tag +from zentral.contrib.inventory.models import MetaBusinessUnit, Tag from zentral.contrib.inventory.serializers import EnrollmentSecretSerializer from zentral.contrib.monolith.events import MonolithSyncCatalogsRequestEvent from zentral.contrib.monolith.models import (CacheServer, Catalog, Condition, Enrollment, Manifest, ManifestCatalog, ManifestSubManifest, PkgInfo, PkgInfoName, + Repository, SubManifest, SubManifestPkgInfo) +from zentral.contrib.monolith.repository_backends import load_repository_backend from zentral.core.events.base import AuditEvent +from .utils import (CLOUDFRONT_PRIVKEY_PEM, + force_catalog, force_condition, + force_enrollment, + force_manifest, + force_name, force_pkg_info, + force_repository, + force_sub_manifest, force_sub_manifest_pkg_info) class MonolithAPIViewsTestCase(TestCase): @@ -44,8 +53,6 @@ def setUpTestData(cls): # mbu cls.mbu = MetaBusinessUnit.objects.create(name=get_random_string(64)) 
cls.mbu.create_enrollment_business_unit() - # manifest - cls.manifest = Manifest.objects.create(meta_business_unit=cls.mbu, name=get_random_string(12)) # utility methods @@ -100,106 +107,545 @@ def delete(self, url, include_token=True): kwargs["HTTP_AUTHORIZATION"] = f"Token {self.api_key}" return self.client.delete(url, **kwargs) - def force_catalog(self, name=None, archived=False): - if name is None: - name = get_random_string(12) - archived_at = None - if archived: - archived_at = datetime.utcnow() - return Catalog.objects.create(name=name, priority=1, archived_at=archived_at) - - def force_condition(self): - return Condition.objects.create( - name=get_random_string(12), - predicate=get_random_string(12) + # list repositories + + def test_get_repositories_unauthorized(self): + response = self.get(reverse("monolith_api:repositories"), include_token=False) + self.assertEqual(response.status_code, 401) + + def test_get_repositories_permission_denied(self): + response = self.get(reverse("monolith_api:repositories")) + self.assertEqual(response.status_code, 403) + + def test_get_repositories_filter_by_name_not_found(self): + force_repository() + self._set_permissions("monolith.view_repository") + response = self.get(reverse("monolith_api:repositories"), {"name": "foo"}) + self.assertEqual(response.status_code, 200) + self.assertEqual(response.json(), []) + + def test_get_repositories_filter_by_name(self): + force_repository() + repository = force_repository(virtual=True) + self._set_permissions("monolith.view_repository") + response = self.get(reverse("monolith_api:repositories"), {"name": repository.name}) + self.assertEqual(response.status_code, 200) + self.assertEqual(response.json(), [{ + 'id': repository.pk, + 'backend': 'VIRTUAL', + 'backend_kwargs': {}, + 'name': repository.name, + 'created_at': repository.created_at.isoformat(), + 'updated_at': repository.updated_at.isoformat(), + 'meta_business_unit': None, + 'client_resources': [], + 'icon_hashes': {}, + 'last_synced_at': None, + }]) + + def test_get_repositories(self): + self._set_permissions("monolith.view_repository") + repository = force_repository(mbu=self.mbu) + response = self.get(reverse("monolith_api:repositories")) + self.assertEqual(response.status_code, 200) + self.assertEqual(response.json(), [{ + 'id': repository.pk, + 'backend': 'S3', + 'backend_kwargs': repository.get_backend_kwargs(), + 'name': repository.name, + 'created_at': repository.created_at.isoformat(), + 'updated_at': repository.updated_at.isoformat(), + 'meta_business_unit': self.mbu.pk, + 'client_resources': [], + 'icon_hashes': {}, + 'last_synced_at': None, + }]) + + # create repository + + def test_create_repository_unauthorized(self): + response = self._post_json_data(reverse("monolith_api:repositories"), {}, include_token=False) + self.assertEqual(response.status_code, 401) + + def test_create_repository_permission_denied(self): + response = self._post_json_data(reverse("monolith_api:repositories"), {}) + self.assertEqual(response.status_code, 403) + + def test_create_s3_repository_missing_bucket(self): + self._set_permissions("monolith.add_repository") + response = self._post_json_data( + reverse("monolith_api:repositories"), + {"name": get_random_string(12), + "meta_business_unit": self.mbu.pk, + "backend": "S3", + "backend_kwargs": {}}, + ) + self.assertEqual(response.status_code, 400) + self.assertEqual(response.json(), {'backend_kwargs': {'bucket': ['This field is required.']}}) + + def test_create_s3_repository_invalid_privkey(self): + 
self._set_permissions("monolith.add_repository") + response = self._post_json_data( + reverse("monolith_api:repositories"), + {"name": get_random_string(12), + "meta_business_unit": self.mbu.pk, + "backend": "S3", + "backend_kwargs": {"bucket": get_random_string(12), + "cloudfront_privkey_pem": "YADA"}}, + ) + self.assertEqual(response.status_code, 400) + self.assertEqual(response.json(), {'backend_kwargs': {'cloudfront_privkey_pem': ['Invalid private key.']}}) + + def test_create_s3_repository_missing_cloudfront_domain_key_id(self): + self._set_permissions("monolith.add_repository") + response = self._post_json_data( + reverse("monolith_api:repositories"), + {"name": get_random_string(12), + "meta_business_unit": self.mbu.pk, + "backend": "S3", + "backend_kwargs": {"bucket": get_random_string(12), + "cloudfront_privkey_pem": CLOUDFRONT_PRIVKEY_PEM}}, + ) + self.assertEqual(response.status_code, 400) + self.assertEqual( + response.json(), + {'backend_kwargs': { + 'cloudfront_domain': ['This field is required when configuring Cloudfront.'], + 'cloudfront_key_id': ['This field is required when configuring Cloudfront.'] + }} ) - def force_enrollment(self, tag_count=0): - enrollment_secret = EnrollmentSecret.objects.create(meta_business_unit=self.mbu) - tags = [Tag.objects.create(name=get_random_string(12)) for _ in range(tag_count)] - if tags: - enrollment_secret.tags.set(tags) - return ( - Enrollment.objects.create(manifest=self.force_manifest(), secret=enrollment_secret), - tags + def test_create_s3_repository_missing_cloudfront_key_id_privkey_pem(self): + self._set_permissions("monolith.add_repository") + response = self._post_json_data( + reverse("monolith_api:repositories"), + {"name": get_random_string(12), + "meta_business_unit": self.mbu.pk, + "backend": "S3", + "backend_kwargs": {"bucket": get_random_string(12), + "cloudfront_domain": "yolo.cloudfront.net"}}, + ) + self.assertEqual(response.status_code, 400) + self.assertEqual( + response.json(), + {'backend_kwargs': { + 'cloudfront_key_id': ['This field is required when configuring Cloudfront.'], + 'cloudfront_privkey_pem': ['This field is required when configuring Cloudfront.'], + }} ) - def force_manifest(self, mbu=None, name=None): - if mbu is None: - mbu = self.mbu - if name is None: - name = get_random_string(12) - return Manifest.objects.create(meta_business_unit=mbu, name=name) - - def force_manifest_catalog(self, tag=None): - manifest = self.force_manifest() - catalog = self.force_catalog() - mc = ManifestCatalog.objects.create(manifest=manifest, catalog=catalog) - if tag: - mc.tags.add(tag) - return mc - - def force_manifest_sub_manifest(self, tag=None): - manifest = self.force_manifest() - sub_manifest = self.force_sub_manifest() - msm = ManifestSubManifest.objects.create(manifest=manifest, sub_manifest=sub_manifest) - if tag: - msm.tags.add(tag) - return msm - - def force_pkg_info_name(self): - return PkgInfoName.objects.create(name=get_random_string(12)) - - def force_sub_manifest(self, meta_business_unit=None): - return SubManifest.objects.create( - name=get_random_string(12), - description=get_random_string(12), - meta_business_unit=meta_business_unit + def test_create_virtual_repository_bad_backend(self): + self._set_permissions("monolith.add_repository") + response = self._post_json_data( + reverse("monolith_api:repositories"), + {"name": get_random_string(12), + "meta_business_unit": self.mbu.pk, + "backend": "YOLO", + "backend_kwargs": {"un": 1}}, + ) + self.assertEqual(response.status_code, 400) + 
self.assertEqual( + response.json(), + {'backend': ['"YOLO" is not a valid choice.']} ) - def force_sub_manifest_pkg_info(self, sub_manifest=None, options=None): - if sub_manifest is None: - sub_manifest = self.force_sub_manifest() - if options is None: - options = {} - return SubManifestPkgInfo.objects.create( - sub_manifest=sub_manifest, - key="managed_installs", - pkg_info_name=self.force_pkg_info_name(), - options=options + def test_create_virtual_repository_bad_backend_kwargs(self): + self._set_permissions("monolith.add_repository") + response = self._post_json_data( + reverse("monolith_api:repositories"), + {"name": get_random_string(12), + "meta_business_unit": self.mbu.pk, + "backend": "VIRTUAL", + "backend_kwargs": {"un": 1}}, + ) + self.assertEqual(response.status_code, 400) + self.assertEqual( + response.json(), + {'backend_kwargs': {'non_field_errors': ['Must be an empty dict for a virtual repository.']}} ) + @patch("base.notifier.Notifier.send_notification") + @patch("zentral.core.queues.backends.kombu.EventQueues.post_event") + def test_create_s3_repository(self, post_event, send_notification): + self._set_permissions("monolith.add_repository") + name = get_random_string(12) + bucket = get_random_string(12) + with self.captureOnCommitCallbacks(execute=True) as callbacks: + response = self._post_json_data( + reverse("monolith_api:repositories"), + {"name": name, + "meta_business_unit": self.mbu.pk, + "backend": "S3", + "backend_kwargs": {"bucket": bucket}}, + ) + self.assertEqual(response.status_code, 201) + self.assertEqual(len(callbacks), 1) + repository = Repository.objects.get(name=name) + self.assertEqual(response.json(), { + 'id': repository.pk, + 'backend': 'S3', + 'backend_kwargs': {"bucket": bucket}, + 'name': repository.name, + 'created_at': repository.created_at.isoformat(), + 'updated_at': repository.updated_at.isoformat(), + 'meta_business_unit': self.mbu.pk, + 'client_resources': [], + 'icon_hashes': {}, + 'last_synced_at': None, + }) + event = post_event.call_args_list[0].args[0] + self.assertIsInstance(event, AuditEvent) + self.assertEqual( + event.payload, + {"action": "created", + "object": { + "model": "monolith.repository", + "pk": str(repository.pk), + "new_value": { + "pk": repository.pk, + "name": name, + "meta_business_unit": {"pk": self.mbu.pk, "name": self.mbu.name}, + "backend": "S3", + "backend_kwargs": {"bucket": bucket}, + "created_at": repository.created_at, + "updated_at": repository.updated_at, + } + }} + ) + metadata = event.metadata.serialize() + self.assertEqual(metadata["objects"], {"monolith_repository": [str(repository.pk)]}) + self.assertEqual(sorted(metadata["tags"]), ["monolith", "zentral"]) + send_notification.assert_called_once_with("monolith.repository", str(repository.pk)) + + @patch("base.notifier.Notifier.send_notification") + @patch("zentral.core.queues.backends.kombu.EventQueues.post_event") + def test_create_virtual_repository(self, post_event, send_notification): + self._set_permissions("monolith.add_repository") + name = get_random_string(12) + with self.captureOnCommitCallbacks(execute=True) as callbacks: + response = self._post_json_data( + reverse("monolith_api:repositories"), + {"name": name, + "backend": "VIRTUAL"}, + ) + self.assertEqual(response.status_code, 201) + self.assertEqual(len(callbacks), 1) + repository = Repository.objects.get(name=name) + self.assertEqual(response.json(), { + 'id': repository.pk, + 'backend': 'VIRTUAL', + 'backend_kwargs': {}, + 'name': repository.name, + 'created_at': 
repository.created_at.isoformat(), + 'updated_at': repository.updated_at.isoformat(), + 'meta_business_unit': None, + 'client_resources': [], + 'icon_hashes': {}, + 'last_synced_at': None, + }) + event = post_event.call_args_list[0].args[0] + self.assertIsInstance(event, AuditEvent) + self.assertEqual( + event.payload, + {"action": "created", + "object": { + "model": "monolith.repository", + "pk": str(repository.pk), + "new_value": { + "pk": repository.pk, + "name": name, + "backend": "VIRTUAL", + "backend_kwargs": {}, + "created_at": repository.created_at, + "updated_at": repository.updated_at, + } + }} + ) + metadata = event.metadata.serialize() + self.assertEqual(metadata["objects"], {"monolith_repository": [str(repository.pk)]}) + self.assertEqual(sorted(metadata["tags"]), ["monolith", "zentral"]) + send_notification.assert_called_once_with("monolith.repository", str(repository.pk)) + + # get repository + + def test_get_repository_unauthorized(self): + repository = force_repository() + response = self.get(reverse("monolith_api:repository", args=(repository.pk,)), include_token=False) + self.assertEqual(response.status_code, 401) + + def test_get_repository_permission_denied(self): + repository = force_repository() + response = self.get(reverse("monolith_api:repository", args=(repository.pk,))) + self.assertEqual(response.status_code, 403) + + def test_get_repository(self): + self._set_permissions("monolith.view_repository") + repository = force_repository(mbu=self.mbu) + response = self.get(reverse("monolith_api:repository", args=(repository.pk,))) + self.assertEqual(response.status_code, 200) + self.assertEqual(response.json(), { + 'id': repository.pk, + 'backend': 'S3', + 'backend_kwargs': repository.get_backend_kwargs(), + 'name': repository.name, + 'created_at': repository.created_at.isoformat(), + 'updated_at': repository.updated_at.isoformat(), + 'meta_business_unit': self.mbu.pk, + 'client_resources': [], + 'icon_hashes': {}, + 'last_synced_at': None, + }) + + # update repository + + def test_update_repository_unauthorized(self): + repository = force_repository() + response = self._post_json_data(reverse("monolith_api:repository", args=(repository.pk,)), + {}, include_token=False) + self.assertEqual(response.status_code, 401) + + def test_update_repository_permission_denied(self): + repository = force_repository() + response = self._post_json_data(reverse("monolith_api:repository", args=(repository.pk,)), {}) + self.assertEqual(response.status_code, 403) + + def test_update_s3_repository_bad_mbu(self): + repository = force_repository() + manifest = force_manifest() + self.assertIsNone(repository.meta_business_unit) + self.assertNotEqual(manifest.meta_business_unit, self.mbu) + force_catalog(repository=repository, manifest=manifest) + self._set_permissions("monolith.change_repository") + response = self._put_json_data( + reverse("monolith_api:repository", args=(repository.pk,)), + {"name": get_random_string(12), + "meta_business_unit": self.mbu.pk, + "backend": "S3", + "backend_kwargs": {"bucket": get_random_string(12)}} + ) + self.assertEqual(response.status_code, 400) + self.assertEqual( + response.json(), + {'meta_business_unit': [ + f"Repository linked to manifest '{manifest}' which has a different business unit." 
+ ]} + ) + + @patch("base.notifier.Notifier.send_notification") + @patch("zentral.core.queues.backends.kombu.EventQueues.post_event") + def test_update_s3_repository(self, post_event, send_notification): + repository = force_repository() + manifest = force_manifest(mbu=self.mbu) + self.assertEqual(manifest.version, 1) + # two catalogs, only one manifest version bump! + force_catalog(repository=repository, manifest=manifest) + force_catalog(repository=repository, manifest=manifest) + prev_value = repository.serialize_for_event() + self._set_permissions("monolith.change_repository") + new_name = get_random_string(12) + new_bucket = get_random_string(12) + with self.captureOnCommitCallbacks(execute=True) as callbacks: + response = self._put_json_data( + reverse("monolith_api:repository", args=(repository.pk,)), + {"name": new_name, + "meta_business_unit": self.mbu.pk, + "backend": "S3", + "backend_kwargs": {"bucket": new_bucket, + "region_name": "us-east2", + "prefix": "prefix", + "access_key_id": "11111111111111111111", + "secret_access_key": "22222222222222222222", + "assume_role_arn": "arn:aws:iam::123456789012:role/S3Access", + "signature_version": "s3v2", + "endpoint_url": "https://endpoint.example.com", + "cloudfront_domain": "yada.cloudfront.net", + "cloudfront_key_id": "YADA", + "cloudfront_privkey_pem": CLOUDFRONT_PRIVKEY_PEM}}, + ) + self.assertEqual(response.status_code, 200) + self.assertEqual(len(callbacks), 1) + repository2 = Repository.objects.get(name=new_name) + self.assertEqual(repository, repository2) + repository.refresh_from_db() + self.assertEqual(response.json(), { + 'id': repository.pk, + 'backend': 'S3', + 'backend_kwargs': {"bucket": new_bucket, + "region_name": "us-east2", + "prefix": "prefix", + "access_key_id": "11111111111111111111", + "secret_access_key": "22222222222222222222", + "assume_role_arn": "arn:aws:iam::123456789012:role/S3Access", + "signature_version": "s3v2", + "endpoint_url": "https://endpoint.example.com", + "cloudfront_domain": "yada.cloudfront.net", + "cloudfront_key_id": "YADA", + "cloudfront_privkey_pem": CLOUDFRONT_PRIVKEY_PEM}, + 'name': new_name, + 'created_at': repository.created_at.isoformat(), + 'updated_at': repository.updated_at.isoformat(), + 'meta_business_unit': self.mbu.pk, + 'client_resources': [], + 'icon_hashes': {}, + 'last_synced_at': None, + }) + event = post_event.call_args_list[0].args[0] + self.assertIsInstance(event, AuditEvent) + self.assertEqual( + event.payload, + {"action": "updated", + "object": { + "model": "monolith.repository", + "pk": str(repository.pk), + "prev_value": prev_value, + "new_value": { + "pk": repository.pk, + "name": new_name, + "meta_business_unit": {"pk": self.mbu.pk, "name": self.mbu.name}, + "backend": "S3", + "backend_kwargs": { + "access_key_id": "11111111111111111111", + "assume_role_arn": "arn:aws:iam::123456789012:role/S3Access", + "bucket": new_bucket, + "cloudfront_domain": "yada.cloudfront.net", + "cloudfront_key_id": "YADA", + "cloudfront_privkey_pem_hash": "f42f0756e0d05ae8e6e63581e615d2d8" + "04c0f79b9f6bfb3cb7cfc5e9b6fc6a8f", + "endpoint_url": "https://endpoint.example.com", + "prefix": "prefix", + "region_name": "us-east2", + "secret_access_key_hash": "d70d4cbd04b6a3140c2ee642a40820abeacef01117ea9ce209de7c72452abe21", + "signature_version": "s3v2", + }, + "created_at": repository.created_at, + "updated_at": repository.updated_at, + } + }} + ) + metadata = event.metadata.serialize() + self.assertEqual(metadata["objects"], {"monolith_repository": [str(repository.pk)]}) + 
self.assertEqual(sorted(metadata["tags"]), ["monolith", "zentral"]) + send_notification.assert_called_once_with("monolith.repository", str(repository.pk)) + repository_backend = load_repository_backend(repository) + self.assertEqual(repository_backend.name, new_name) + self.assertEqual(repository_backend.bucket, new_bucket) + self.assertEqual(repository_backend.region_name, "us-east2") + self.assertEqual(repository_backend.prefix, "prefix") + self.assertEqual( + repository_backend.credentials, + {'aws_access_key_id': '11111111111111111111', + 'aws_secret_access_key': '22222222222222222222'} + ) + self.assertEqual( + repository_backend.assume_role_arn, + "arn:aws:iam::123456789012:role/S3Access", + ) + self.assertEqual(repository_backend.signature_version, "s3v2") + self.assertEqual(repository_backend.endpoint_url, "https://endpoint.example.com") + self.assertEqual(repository_backend.cloudfront_domain, "yada.cloudfront.net") + self.assertEqual(repository_backend.cloudfront_key_id, "YADA") + self.assertEqual(repository_backend.cloudfront_privkey_pem, CLOUDFRONT_PRIVKEY_PEM) + manifest.refresh_from_db() + self.assertEqual(manifest.version, 2) # only one bump + + # delete repository + + def test_delete_repository_unauthorized(self): + repository = force_repository() + response = self.delete(reverse("monolith_api:repository", args=(repository.pk,)), include_token=False) + self.assertEqual(response.status_code, 401) + + def test_delete_repository_permission_denied(self): + repository = force_repository() + response = self.delete(reverse("monolith_api:repository", args=(repository.pk,))) + self.assertEqual(response.status_code, 403) + + def test_delete_repository_cannot_be_deleted(self): + repository = force_repository() + manifest = force_manifest() + force_catalog(repository=repository, manifest=manifest) + self._set_permissions("monolith.delete_repository") + response = self.delete(reverse("monolith_api:repository", args=(repository.pk,))) + self.assertEqual(response.status_code, 400) + self.assertEqual(response.json(), ['This repository cannot be deleted']) + + @patch("base.notifier.Notifier.send_notification") + @patch("zentral.core.queues.backends.kombu.EventQueues.post_event") + def test_delete_s3_repository(self, post_event, send_notification): + repository = force_repository() + prev_value = repository.serialize_for_event() + self._set_permissions("monolith.delete_repository") + with self.captureOnCommitCallbacks(execute=True) as callbacks: + response = self.delete(reverse("monolith_api:repository", args=(repository.pk,))) + self.assertEqual(response.status_code, 204) + self.assertEqual(len(callbacks), 1) + event = post_event.call_args_list[0].args[0] + self.assertIsInstance(event, AuditEvent) + self.assertEqual( + event.payload, + {"action": "deleted", + "object": { + "model": "monolith.repository", + "pk": str(repository.pk), + "prev_value": prev_value, + }} + ) + metadata = event.metadata.serialize() + self.assertEqual(metadata["objects"], {"monolith_repository": [str(repository.pk)]}) + self.assertEqual(sorted(metadata["tags"]), ["monolith", "zentral"]) + send_notification.assert_called_once_with("monolith.repository", str(repository.pk)) + # sync repository def test_sync_repository_unauthorized(self): - response = self._post_json_data(reverse("monolith_api:sync_repository"), {}, include_token=False) + repository = force_repository() + response = self._post_json_data(reverse("monolith_api:sync_repository", args=(repository.pk,)), + {}, include_token=False) 
self.assertEqual(response.status_code, 401) def test_sync_repository_permission_denied(self): - response = self._post_json_data(reverse("monolith_api:sync_repository"), {}) + repository = force_repository() + response = self._post_json_data(reverse("monolith_api:sync_repository", args=(repository.pk,)), {}) self.assertEqual(response.status_code, 403) @patch("zentral.core.queues.backends.kombu.EventQueues.post_event") - @patch("zentral.contrib.monolith.repository_backends.local.Repository.get_all_catalog_content") - def test_sync_repository(self, get_all_catalog_content, post_event): + @patch("zentral.contrib.monolith.repository_backends.s3.S3Repository.get_all_catalog_content") + @patch("zentral.contrib.monolith.repository_backends.s3.S3Repository.get_icon_hashes_content") + @patch("zentral.contrib.monolith.repository_backends.s3.S3Repository.iter_client_resources") + def test_sync_repository( + self, + iter_client_resources, + get_icon_hashes_content, + get_all_catalog_content, + post_event + ): + repository = force_repository() catalog_name = get_random_string(12) pkg_info_name = get_random_string(12) + iter_client_resources.return_value = ["site_default.zip",] + get_icon_hashes_content.return_value = plistlib.dumps({ + f"{pkg_info_name}.png": "a" * 64 + }) get_all_catalog_content.return_value = plistlib.dumps([ {"catalogs": [catalog_name], "name": pkg_info_name, "version": "1.0"} ]) - self._set_permissions( - "monolith.view_catalog", "monolith.add_catalog", "monolith.change_catalog", - "monolith.view_pkginfoname", "monolith.add_pkginfoname", "monolith.change_pkginfoname", - "monolith.view_pkginfo", "monolith.add_pkginfo", "monolith.change_pkginfo", - "monolith.change_manifest" - ) + self._set_permissions("monolith.sync_repository") with self.captureOnCommitCallbacks(execute=True) as callbacks: - response = self._post_json_data(reverse("monolith_api:sync_repository"), {}) + response = self._post_json_data(reverse("monolith_api:sync_repository", args=(repository.pk,)), {}) self.assertEqual(response.status_code, 200) json_response = response.json() self.assertEqual(json_response, {"status": 0}) + pkg_infos = PkgInfo.objects.filter(name__name=pkg_info_name) + self.assertEqual(pkg_infos.count(), 1) + pkg_info = pkg_infos.first() + self.assertEqual(pkg_info.repository, repository) + self.assertEqual(list(c.name for c in pkg_info.catalogs.filter(repository=repository)), + [catalog_name]) + repository.refresh_from_db() + self.assertEqual(repository.client_resources, ["site_default.zip"]) + self.assertEqual(repository.icon_hashes, {f"icon.{pkg_info.pk}.{pkg_info_name}.png": "a" * 64}) self.assertEqual(len(callbacks), 1) self.assertEqual(len(post_event.call_args_list), 4) mscr_evt = post_event.call_args_list[0].args[0] @@ -227,25 +673,28 @@ def test_sync_repository(self, get_all_catalog_content, post_event): # update cache server def test_update_cache_server_unauthorized(self): - response = self._post_json_data(reverse("monolith_api:update_cache_server", args=(self.manifest.pk,)), + manifest = force_manifest() + response = self._post_json_data(reverse("monolith_api:update_cache_server", args=(manifest.pk,)), {}, include_token=False) self.assertEqual(response.status_code, 401) def test_update_cache_server_permission_denied(self): - response = self._post_json_data(reverse("monolith_api:update_cache_server", args=(self.manifest.pk,)), {}) + manifest = force_manifest() + response = self._post_json_data(reverse("monolith_api:update_cache_server", args=(manifest.pk,)), {}) 
self.assertEqual(response.status_code, 403) def test_update_cache_server(self): self._set_permissions("monolith.change_manifest", "monolith.add_cacheserver", "monolith.change_cacheserver") name = get_random_string(12) ip_address = "129.2.1.1" - response = self._post_json_data(reverse("monolith_api:update_cache_server", args=(self.manifest.pk,)), + manifest = force_manifest() + response = self._post_json_data(reverse("monolith_api:update_cache_server", args=(manifest.pk,)), {"name": name, "base_url": "https://example.com"}, ip=ip_address) self.assertEqual(response.status_code, 200) self.assertEqual(response.json(), {"status": 0}) - cache_server = CacheServer.objects.get(manifest=self.manifest, name=name) + cache_server = CacheServer.objects.get(manifest=manifest, name=name) self.assertEqual(cache_server.public_ip_address, ip_address) # list manifests @@ -274,44 +723,47 @@ def test_get_manifests_filter_by_meta_business_unit_id_not_found(self): def test_get_manifests_filter_by_name(self): for _ in range(3): - self.force_manifest() + force_manifest() self._set_permissions("monolith.view_manifest") - response = self.get(reverse("monolith_api:manifests"), {"name": self.manifest.name}) + manifest = force_manifest() + response = self.get(reverse("monolith_api:manifests"), {"name": manifest.name}) self.assertEqual(response.status_code, 200) self.assertEqual(response.json(), [{ - 'id': self.manifest.pk, - 'name': self.manifest.name, + 'id': manifest.pk, + 'name': manifest.name, 'version': 1, - 'created_at': self.manifest.created_at.isoformat(), - 'updated_at': self.manifest.updated_at.isoformat(), - 'meta_business_unit': self.manifest.meta_business_unit.pk + 'created_at': manifest.created_at.isoformat(), + 'updated_at': manifest.updated_at.isoformat(), + 'meta_business_unit': manifest.meta_business_unit.pk }]) def test_get_manifests_filter_by_meta_business_unit_id(self): self._set_permissions("monolith.view_manifest") + manifest = force_manifest(mbu=self.mbu) response = self.get(reverse("monolith_api:manifests"), - {"meta_business_unit_id": self.manifest.meta_business_unit.pk}) + {"meta_business_unit_id": self.mbu.pk}) self.assertEqual(response.status_code, 200) self.assertEqual(response.json(), [{ - 'id': self.manifest.pk, - 'name': self.manifest.name, + 'id': manifest.pk, + 'name': manifest.name, 'version': 1, - 'created_at': self.manifest.created_at.isoformat(), - 'updated_at': self.manifest.updated_at.isoformat(), - 'meta_business_unit': self.manifest.meta_business_unit.pk + 'created_at': manifest.created_at.isoformat(), + 'updated_at': manifest.updated_at.isoformat(), + 'meta_business_unit': manifest.meta_business_unit.pk }]) def test_get_manifests(self): self._set_permissions("monolith.view_manifest") + manifest = force_manifest() response = self.get(reverse("monolith_api:manifests")) self.assertEqual(response.status_code, 200) self.assertEqual(response.json(), [{ - 'id': self.manifest.pk, - 'name': self.manifest.name, + 'id': manifest.pk, + 'name': manifest.name, 'version': 1, - 'created_at': self.manifest.created_at.isoformat(), - 'updated_at': self.manifest.updated_at.isoformat(), - 'meta_business_unit': self.manifest.meta_business_unit.pk + 'created_at': manifest.created_at.isoformat(), + 'updated_at': manifest.updated_at.isoformat(), + 'meta_business_unit': manifest.meta_business_unit.pk }]) # get manifest @@ -331,15 +783,16 @@ def test_get_manifest_not_found(self): def test_get_manifest(self): self._set_permissions("monolith.view_manifest") - response = 
self.get(reverse("monolith_api:manifest", args=(self.manifest.pk,))) + manifest = force_manifest() + response = self.get(reverse("monolith_api:manifest", args=(manifest.pk,))) self.assertEqual(response.status_code, 200) self.assertEqual(response.json(), { - 'id': self.manifest.pk, - 'name': self.manifest.name, + 'id': manifest.pk, + 'name': manifest.name, 'version': 1, - 'created_at': self.manifest.created_at.isoformat(), - 'updated_at': self.manifest.updated_at.isoformat(), - 'meta_business_unit': self.manifest.meta_business_unit.pk + 'created_at': manifest.created_at.isoformat(), + 'updated_at': manifest.updated_at.isoformat(), + 'meta_business_unit': manifest.meta_business_unit.pk }) # create manifest @@ -363,21 +816,22 @@ def test_create_manifest_fields_empty(self): def test_create_manifest(self): self._set_permissions("monolith.add_manifest") + name = get_random_string(12) response = self._post_json_data(reverse("monolith_api:manifests"), data={ - 'name': 'foo', - 'meta_business_unit': self.manifest.meta_business_unit.pk + 'name': name, + 'meta_business_unit': self.mbu.pk }) self.assertEqual(response.status_code, 201) - manifest = Manifest.objects.get(name='foo') + manifest = Manifest.objects.get(name=name) self.assertEqual(response.json(), { 'id': manifest.pk, - 'name': 'foo', + 'name': name, 'version': 1, 'created_at': manifest.created_at.isoformat(), 'updated_at': manifest.updated_at.isoformat(), 'meta_business_unit': self.mbu.pk }) - self.assertEqual(manifest.name, 'foo') + self.assertEqual(manifest.name, name) # update manifest @@ -396,7 +850,8 @@ def test_update_manifest_not_found(self): def test_update_manifest_fields_invalid(self): self._set_permissions("monolith.change_manifest") - response = self._put_json_data(reverse("monolith_api:manifest", args=(self.manifest.pk,)), data={ + manifest = force_manifest() + response = self._put_json_data(reverse("monolith_api:manifest", args=(manifest.pk,)), data={ 'name': '', 'meta_business_unit': '' }) @@ -407,7 +862,7 @@ def test_update_manifest_fields_invalid(self): }) def test_update_manifest_invalid_meta_business_unit(self): - manifest = self.force_manifest() + manifest = force_manifest() self._set_permissions("monolith.change_manifest") response = self._put_json_data(reverse("monolith_api:manifest", args=(manifest.pk,)), data={ 'name': 'foo', @@ -419,7 +874,7 @@ def test_update_manifest_invalid_meta_business_unit(self): }) def test_update_manifest(self): - manifest = self.force_manifest() + manifest = force_manifest() self._set_permissions("monolith.change_manifest") response = self._put_json_data(reverse("monolith_api:manifest", args=(manifest.pk,)), data={ 'name': 'spam', @@ -453,7 +908,7 @@ def test_delete_manifest_not_found(self): self.assertEqual(response.status_code, 404) def test_delete_manifest(self): - manifest = self.force_manifest() + manifest = force_manifest() self._set_permissions("monolith.delete_manifest") response = self.delete(reverse("monolith_api:manifest", args=(manifest.pk,))) self.assertEqual(response.status_code, 204) @@ -475,29 +930,29 @@ def test_get_catalogs_filter_by_name_not_found(self): self.assertEqual(response.json(), []) def test_get_catalogs_filter_by_name(self): - self.force_catalog() - catalog = self.force_catalog() + force_catalog() + catalog = force_catalog() self._set_permissions("monolith.view_catalog") response = self.get(reverse("monolith_api:catalogs"), {"name": catalog.name}) self.assertEqual(response.status_code, 200) self.assertEqual(response.json(), [{ 'id': catalog.pk, + 'repository': 
catalog.repository.pk, 'name': catalog.name, - 'priority': 1, 'created_at': catalog.created_at.isoformat(), 'updated_at': catalog.updated_at.isoformat(), 'archived_at': None, }]) def test_get_catalogs(self): - catalog = self.force_catalog() + catalog = force_catalog() self._set_permissions("monolith.view_catalog") response = self.get(reverse("monolith_api:catalogs")) self.assertEqual(response.status_code, 200) self.assertEqual(response.json(), [{ 'id': catalog.pk, + 'repository': catalog.repository.pk, 'name': catalog.name, - 'priority': 1, 'created_at': catalog.created_at.isoformat(), 'updated_at': catalog.updated_at.isoformat(), 'archived_at': None, @@ -519,14 +974,14 @@ def test_get_catalog_not_found(self): self.assertEqual(response.status_code, 404) def test_get_catalog(self): - catalog = self.force_catalog(archived=True) + catalog = force_catalog(archived=True) self._set_permissions("monolith.view_catalog") response = self.get(reverse("monolith_api:catalog", args=(catalog.pk,))) self.assertEqual(response.status_code, 200) self.assertEqual(response.json(), { 'id': catalog.pk, + 'repository': catalog.repository.pk, 'name': catalog.name, - 'priority': 1, 'created_at': catalog.created_at.isoformat(), 'updated_at': catalog.updated_at.isoformat(), 'archived_at': catalog.archived_at.isoformat(), @@ -547,28 +1002,45 @@ def test_create_catalog_fields_empty(self): response = self._post_json_data(reverse("monolith_api:catalogs"), data={}) self.assertEqual(response.status_code, 400) self.assertEqual(response.json(), { + 'repository': ['This field is required.'], 'name': ['This field is required.'], }) + def test_create_catalog_not_virtual_repository(self): + self._set_permissions("monolith.add_catalog") + name = get_random_string(12) + repository = force_repository(virtual=False) + response = self._post_json_data(reverse("monolith_api:catalogs"), data={ + 'repository': repository.pk, + 'name': name, + 'archived_at': datetime.utcnow().isoformat(), + }) + self.assertEqual(response.status_code, 400) + self.assertEqual(response.json(), { + 'repository': ['Not a virtual repository.'], + }) + def test_create_catalog(self): self._set_permissions("monolith.add_catalog") name = get_random_string(12) + repository = force_repository(virtual=True) response = self._post_json_data(reverse("monolith_api:catalogs"), data={ + 'repository': repository.pk, 'name': name, - 'priority': 17, 'archived_at': datetime.utcnow().isoformat(), }) self.assertEqual(response.status_code, 201) catalog = Catalog.objects.get(name=name) self.assertEqual(response.json(), { 'id': catalog.pk, + 'repository': repository.pk, 'name': name, - 'priority': 17, 'created_at': catalog.created_at.isoformat(), 'updated_at': catalog.updated_at.isoformat(), 'archived_at': None # read only }) - self.assertEqual(catalog.priority, 17) + self.assertEqual(catalog.repository, repository) + self.assertEqual(catalog.name, name) # update catalog @@ -586,36 +1058,71 @@ def test_update_catalog_not_found(self): self.assertEqual(response.status_code, 404) def test_update_catalog_fields_invalid(self): - catalog = self.force_catalog() + repository = force_repository(virtual=True) + catalog = force_catalog(repository=repository) self._set_permissions("monolith.change_catalog") response = self._put_json_data(reverse("monolith_api:catalog", args=(catalog.pk,)), data={ 'name': '', }) self.assertEqual(response.status_code, 400) self.assertEqual(response.json(), { + 'repository': ['This field is required.'], 'name': ['This field may not be blank.'], }) + def 
test_update_catalog_not_virtual_repository(self): + repository = force_repository(virtual=False) + catalog = force_catalog(repository=repository) + self._set_permissions("monolith.change_catalog") + response = self._put_json_data(reverse("monolith_api:catalog", args=(catalog.pk,)), data={ + 'repository': catalog.repository.pk, + 'name': get_random_string(12), + }) + self.assertEqual(response.status_code, 400) + self.assertEqual( + response.json(), + {"repository": ["Not a virtual repository."]} + ) + + def test_update_catalog_bad_mbu(self): + manifest = force_manifest() + repository = force_repository(mbu=manifest.meta_business_unit, virtual=True) + catalog = force_catalog(repository=repository, manifest=manifest) + new_repository = force_repository(mbu=MetaBusinessUnit.objects.create(name=get_random_string(12)), + virtual=True) + self._set_permissions("monolith.change_catalog") + response = self._put_json_data(reverse("monolith_api:catalog", args=(catalog.pk,)), data={ + 'repository': new_repository.pk, + 'name': get_random_string(12), + }) + self.assertEqual(response.status_code, 400) + self.assertEqual( + response.json(), + {"repository": [ + "This catalog is included in manifests linked to different business units than this repository." + ]} + ) + def test_update_catalog(self): - catalog = self.force_catalog() + repository = force_repository(virtual=True) + catalog = force_catalog(repository=repository) self._set_permissions("monolith.change_catalog") new_name = get_random_string(12) response = self._put_json_data(reverse("monolith_api:catalog", args=(catalog.pk,)), data={ + 'repository': catalog.repository.pk, 'name': new_name, - 'priority': 42, }) self.assertEqual(response.status_code, 200) catalog.refresh_from_db() self.assertEqual(response.json(), { 'id': catalog.pk, + 'repository': catalog.repository.pk, 'name': new_name, - 'priority': 42, 'created_at': catalog.created_at.isoformat(), 'updated_at': catalog.updated_at.isoformat(), 'archived_at': None }) self.assertEqual(catalog.name, new_name) - self.assertEqual(catalog.priority, 42) # delete catalog @@ -633,14 +1140,17 @@ def test_delete_catalog_not_found(self): self.assertEqual(response.status_code, 404) def test_delete_catalog_not_ok(self): - manifest_catalog = self.force_manifest_catalog() + repository = force_repository(virtual=True) + manifest = force_manifest() + catalog = force_catalog(repository=repository, manifest=manifest) self._set_permissions("monolith.delete_catalog") - response = self.delete(reverse("monolith_api:catalog", args=(manifest_catalog.catalog.pk,))) + response = self.delete(reverse("monolith_api:catalog", args=(catalog.pk,))) self.assertEqual(response.status_code, 400) self.assertEqual(response.json(), ['This catalog cannot be deleted']) def test_delete_catalog(self): - catalog = self.force_catalog() + repository = force_repository(virtual=True) + catalog = force_catalog(repository=repository) self._set_permissions("monolith.delete_catalog") response = self.delete(reverse("monolith_api:catalog", args=(catalog.pk,))) self.assertEqual(response.status_code, 204) @@ -662,8 +1172,8 @@ def test_get_conditions_filter_by_name_not_found(self): self.assertEqual(response.json(), []) def test_get_conditions_filter_by_name(self): - self.force_condition() - condition = self.force_condition() + force_condition() + condition = force_condition() self._set_permissions("monolith.view_condition") response = self.get(reverse("monolith_api:conditions"), {"name": condition.name}) self.assertEqual(response.status_code, 200) @@ 
-676,7 +1186,7 @@ def test_get_conditions_filter_by_name(self): }]) def test_get_conditions(self): - condition = self.force_condition() + condition = force_condition() self._set_permissions("monolith.view_condition") response = self.get(reverse("monolith_api:conditions")) self.assertEqual(response.status_code, 200) @@ -704,7 +1214,7 @@ def test_get_condition_not_found(self): self.assertEqual(response.status_code, 404) def test_get_condition(self): - condition = self.force_condition() + condition = force_condition() self._set_permissions("monolith.view_condition") response = self.get(reverse("monolith_api:condition", args=(condition.pk,))) self.assertEqual(response.status_code, 200) @@ -755,7 +1265,7 @@ def test_create_condition(self): self.assertEqual(condition.predicate, predicate) def test_create_condition_name_conflict(self): - condition = self.force_condition() + condition = force_condition() self._set_permissions("monolith.add_condition") response = self._post_json_data(reverse("monolith_api:conditions"), data={ 'name': condition.name, @@ -780,7 +1290,7 @@ def test_update_condition_not_found(self): self.assertEqual(response.status_code, 404) def test_update_condition_fields_invalid(self): - condition = self.force_condition() + condition = force_condition() self._set_permissions("monolith.change_condition") response = self._put_json_data(reverse("monolith_api:condition", args=(condition.pk,)), data={ 'name': '', @@ -793,13 +1303,13 @@ def test_update_condition_fields_invalid(self): }) def test_update_condition(self): - condition = self.force_condition() - msm = self.force_manifest_sub_manifest() - manifest = msm.manifest + condition = force_condition() + manifest = force_manifest() + sub_manifest = force_sub_manifest(manifest=manifest) self.assertEqual(manifest.version, 1) SubManifestPkgInfo.objects.create( - sub_manifest=msm.sub_manifest, - pkg_info_name=self.force_pkg_info_name(), + sub_manifest=sub_manifest, + pkg_info_name=force_name(), condition=condition ) self._set_permissions("monolith.change_condition") @@ -824,8 +1334,8 @@ def test_update_condition(self): self.assertEqual(manifest.version, 2) def test_update_condition_name_conflict(self): - condition1 = self.force_condition() - condition2 = self.force_condition() + condition1 = force_condition() + condition2 = force_condition() self._set_permissions("monolith.change_condition") response = self._put_json_data(reverse("monolith_api:condition", args=(condition2.pk,)), data={ 'name': condition1.name, @@ -850,10 +1360,10 @@ def test_delete_condition_not_found(self): self.assertEqual(response.status_code, 404) def test_delete_condition_not_ok(self): - condition = self.force_condition() + condition = force_condition() SubManifestPkgInfo.objects.create( - sub_manifest=self.force_sub_manifest(), - pkg_info_name=self.force_pkg_info_name(), + sub_manifest=force_sub_manifest(), + pkg_info_name=force_name(), condition=condition ) self._set_permissions("monolith.delete_condition") @@ -862,7 +1372,7 @@ def test_delete_condition_not_ok(self): self.assertEqual(response.json(), ['This condition cannot be deleted']) def test_delete_condition(self): - condition = self.force_condition() + condition = force_condition() self._set_permissions("monolith.delete_condition") response = self.delete(reverse("monolith_api:condition", args=(condition.pk,))) self.assertEqual(response.status_code, 204) @@ -888,13 +1398,13 @@ def test_get_enrollments_filter_by_manifest_id_invalid_choice(self): def test_get_enrollments_filter_by_manifest_id_no_results(self): 
self._set_permissions("monolith.view_enrollment") - manifest = self.force_manifest() + manifest = force_manifest() response = self.get(reverse("monolith_api:enrollments"), {"manifest_id": manifest.pk}) self.assertEqual(response.status_code, 200) self.assertEqual(response.json(), []) def test_get_enrollments_filter_by_manifest_id(self): - enrollment, tags = self.force_enrollment(tag_count=1) + enrollment, tags = force_enrollment(mbu=self.mbu, tag_count=1) self._set_permissions("monolith.view_enrollment") response = self.get(reverse("monolith_api:enrollments"), {"manifest_id": enrollment.manifest.pk}) self.assertEqual(response.status_code, 200) @@ -926,7 +1436,7 @@ def test_get_enrollments_filter_by_manifest_id(self): }]) def test_get_enrollments(self): - enrollment, _ = self.force_enrollment() + enrollment, _ = force_enrollment(mbu=self.mbu) self._set_permissions("monolith.view_enrollment") response = self.get(reverse("monolith_api:enrollments")) self.assertEqual(response.status_code, 200) @@ -973,7 +1483,7 @@ def test_get_enrollment_not_found(self): self.assertEqual(response.status_code, 404) def test_get_enrollment(self): - enrollment, _ = self.force_enrollment() + enrollment, _ = force_enrollment(mbu=self.mbu) self._set_permissions("monolith.view_enrollment") response = self.get(reverse("monolith_api:enrollment", args=(enrollment.pk,))) self.assertEqual(response.status_code, 200) @@ -1025,7 +1535,7 @@ def test_create_enrollment_fields_empty(self): def test_create_enrollment(self): self._set_permissions("monolith.add_enrollment") - manifest = self.force_manifest() + manifest = force_manifest(mbu=self.mbu) self.assertEqual(manifest.version, 1) tags = [Tag.objects.create(name=get_random_string(12)) for _ in range(1)] response = self._post_json_data(reverse("monolith_api:enrollments"), data={ @@ -1068,7 +1578,7 @@ def test_create_enrollment(self): def test_create_enrollment_mbu_conflict(self): self._set_permissions("monolith.add_enrollment") - manifest = self.force_manifest() + manifest = force_manifest() mbu = MetaBusinessUnit.objects.create(name=get_random_string(12)) response = self._post_json_data(reverse("monolith_api:enrollments"), data={ 'manifest': manifest.pk, @@ -1096,7 +1606,7 @@ def test_update_enrollment_not_found(self): self.assertEqual(response.status_code, 404) def test_update_enrollment(self): - enrollment, _ = self.force_enrollment(tag_count=2) + enrollment, _ = force_enrollment(mbu=self.mbu, tag_count=2) enrollment_secret = enrollment.secret self.assertEqual(enrollment.secret.quota, None) self.assertEqual(enrollment.secret.serial_numbers, None) @@ -1145,7 +1655,7 @@ def test_delete_enrollment_not_found(self): self.assertEqual(response.status_code, 404) def test_delete_enrollment(self): - enrollment, _ = self.force_enrollment() + enrollment, _ = force_enrollment(mbu=self.mbu) manifest = enrollment.manifest self.assertEqual(manifest.version, 1) self._set_permissions("monolith.delete_enrollment") @@ -1157,23 +1667,23 @@ def test_delete_enrollment(self): # enrollment plist def test_get_enrollment_plist_unauthorized(self): - enrollment, _ = self.force_enrollment() + enrollment, _ = force_enrollment(mbu=self.mbu) response = self.get(reverse("monolith_api:enrollment_plist", args=(enrollment.pk,)), include_token=False) self.assertEqual(response.status_code, 401) def test_get_enrollment_plist_permission_denied(self): - enrollment, _ = self.force_enrollment() + enrollment, _ = force_enrollment(mbu=self.mbu) response = self.get(reverse("monolith_api:enrollment_plist", 
args=(enrollment.pk,))) self.assertEqual(response.status_code, 403) def test_get_enrollment_plist_permission_denied_user(self): - enrollment, _ = self.force_enrollment() + enrollment, _ = force_enrollment(mbu=self.mbu) self.client.force_login(self.user) response = self.client.get(reverse("monolith_api:enrollment_plist", args=(enrollment.pk,))) self.assertEqual(response.status_code, 403) def test_get_enrollment_plist(self): - enrollment, _ = self.force_enrollment() + enrollment, _ = force_enrollment(mbu=self.mbu) self._set_permissions("monolith.view_enrollment") response = self.get(reverse("monolith_api:enrollment_plist", args=(enrollment.pk,))) self.assertEqual(response.status_code, 200) @@ -1195,7 +1705,7 @@ def test_get_enrollment_plist(self): ) def test_get_enrollment_plist_user(self): - enrollment, _ = self.force_enrollment() + enrollment, _ = force_enrollment(mbu=self.mbu) self._set_permissions("monolith.view_enrollment") self.client.force_login(self.user) response = self.client.get(reverse("monolith_api:enrollment_plist", args=(enrollment.pk,))) @@ -1209,24 +1719,24 @@ def test_get_enrollment_plist_user(self): # enrollment configuration profile def test_get_enrollment_configuration_profile_unauthorized(self): - enrollment, _ = self.force_enrollment() + enrollment, _ = force_enrollment(mbu=self.mbu) response = self.get( reverse("monolith_api:enrollment_configuration_profile", args=(enrollment.pk,)), include_token=False) self.assertEqual(response.status_code, 401) def test_get_enrollment_configuration_profile_permission_denied(self): - enrollment, _ = self.force_enrollment() + enrollment, _ = force_enrollment(mbu=self.mbu) response = self.get(reverse("monolith_api:enrollment_configuration_profile", args=(enrollment.pk,))) self.assertEqual(response.status_code, 403) def test_get_enrollment_configuration_profile_permission_denied_user(self): - enrollment, _ = self.force_enrollment() + enrollment, _ = force_enrollment(mbu=self.mbu) self.client.force_login(self.user) response = self.client.get(reverse("monolith_api:enrollment_configuration_profile", args=(enrollment.pk,))) self.assertEqual(response.status_code, 403) def test_get_enrollment_configuration_profile(self): - enrollment, _ = self.force_enrollment() + enrollment, _ = force_enrollment(mbu=self.mbu) self._set_permissions("monolith.view_enrollment") response = self.get(reverse("monolith_api:enrollment_configuration_profile", args=(enrollment.pk,))) self.assertEqual(response.status_code, 200) @@ -1250,7 +1760,7 @@ def test_get_enrollment_configuration_profile(self): ) def test_get_enrollment_configuration_profile_user(self): - enrollment, _ = self.force_enrollment() + enrollment, _ = force_enrollment(mbu=self.mbu) self._set_permissions("monolith.view_enrollment") self.client.force_login(self.user) response = self.client.get(reverse("monolith_api:enrollment_configuration_profile", args=(enrollment.pk,))) @@ -1274,47 +1784,52 @@ def test_get_manifest_catalogs_permission_denied(self): def test_get_manifest_catalogs_filter_by_manifest_id_not_found(self): self._set_permissions("monolith.view_manifestcatalog") - response = self.get(reverse("monolith_api:manifest_catalogs"), {"manifest_id": self.manifest.pk}) + manifest = force_manifest() + response = self.get(reverse("monolith_api:manifest_catalogs"), {"manifest_id": manifest.pk}) self.assertEqual(response.status_code, 200) self.assertEqual(response.json(), []) def test_get_manifest_catalogs_filter_by_manifest_id(self): - self.force_manifest_catalog() - manifest_catalog = 
+        manifest1 = force_manifest()
+        force_catalog(manifest=manifest1)
+        manifest2 = force_manifest()
+        catalog = force_catalog(manifest=manifest2)
         self._set_permissions("monolith.view_manifestcatalog")
         response = self.get(reverse("monolith_api:manifest_catalogs"),
-                            {"manifest_id": manifest_catalog.manifest.id})
+                            {"manifest_id": manifest2.id})
         self.assertEqual(response.status_code, 200)
         self.assertEqual(response.json(), [{
-            'id': manifest_catalog.pk,
-            'manifest': manifest_catalog.manifest.id,
-            'catalog': manifest_catalog.catalog.id,
+            'id': manifest2.manifestcatalog_set.first().pk,
+            'manifest': manifest2.id,
+            'catalog': catalog.id,
             'tags': []
         }])

     def test_get_manifest_catalogs_filter_by_catalog_id(self):
-        self.force_manifest_catalog()
-        manifest_catalog = self.force_manifest_catalog()
+        manifest = force_manifest()
+        force_catalog(manifest=manifest)
+        catalog = force_catalog(manifest=manifest)
         self._set_permissions("monolith.view_manifestcatalog")
         response = self.get(reverse("monolith_api:manifest_catalogs"),
-                            {"catalog_id": manifest_catalog.catalog.id})
+                            {"catalog_id": catalog.id})
         self.assertEqual(response.status_code, 200)
         self.assertEqual(response.json(), [{
-            'id': manifest_catalog.pk,
-            'manifest': manifest_catalog.manifest.id,
-            'catalog': manifest_catalog.catalog.id,
+            'id': manifest.manifestcatalog_set.get(catalog=catalog).pk,
+            'manifest': manifest.id,
+            'catalog': catalog.id,
             'tags': []
         }])

     def test_get_manifest_catalogs(self):
-        manifest_catalog = self.force_manifest_catalog()
+        manifest = force_manifest()
+        catalog = force_catalog(manifest=manifest)
         self._set_permissions("monolith.view_manifestcatalog")
         response = self.get(reverse("monolith_api:manifest_catalogs"))
         self.assertEqual(response.status_code, 200)
         self.assertEqual(response.json(), [{
-            'id': manifest_catalog.pk,
-            'manifest': manifest_catalog.manifest.id,
-            'catalog': manifest_catalog.catalog.id,
+            'id': manifest.manifestcatalog_set.first().pk,
+            'manifest': manifest.id,
+            'catalog': catalog.id,
             'tags': []
         }])
@@ -1334,16 +1849,18 @@ def test_get_manifest_catalog_not_found(self):
         self.assertEqual(response.status_code, 404)

     def test_get_manifest_catalog(self):
-        tag = Tag.objects.create(name=get_random_string(12))
-        manifest_catalog = self.force_manifest_catalog(tag=tag)
+        tags = [Tag.objects.create(name=get_random_string(12))]
+        manifest = force_manifest()
+        catalog = force_catalog(manifest=manifest, tags=tags)
+        manifest_catalog = manifest.manifestcatalog_set.first()
         self._set_permissions("monolith.view_manifestcatalog")
         response = self.get(reverse("monolith_api:manifest_catalog", args=(manifest_catalog.pk,)))
         self.assertEqual(response.status_code, 200)
         self.assertEqual(response.json(), {
             'id': manifest_catalog.pk,
-            'manifest': manifest_catalog.manifest.id,
-            'catalog': manifest_catalog.catalog.id,
-            'tags': [tag.pk]
+            'manifest': manifest.id,
+            'catalog': catalog.id,
+            'tags': [tags[0].pk]
         })

     # create manifest catalog
@@ -1368,9 +1885,9 @@ def test_create_manifest_catalog_fields_empty(self):

     def test_create_manifest_catalog(self):
         self._set_permissions("monolith.add_manifestcatalog")
-        manifest = self.force_manifest()
+        manifest = force_manifest()
         self.assertEqual(manifest.version, 1)
-        catalog = self.force_catalog()
+        catalog = force_catalog()
         tag = Tag.objects.create(name=get_random_string(12))
         response = self._post_json_data(reverse("monolith_api:manifest_catalogs"), data={
             'manifest': manifest.pk,
@@ -1406,13 +1923,14 @@ def test_update_manifest_catalog_not_found(self):
         self.assertEqual(response.status_code, 404)

     def test_update_manifest_catalog(self):
-        manifest_catalog = self.force_manifest_catalog(
-            tag=Tag.objects.create(name=get_random_string(12))
-        )
+        manifest = force_manifest()
+        tags = [Tag.objects.create(name=get_random_string(12))]
+        catalog = force_catalog(manifest=manifest, tags=tags)
+        manifest_catalog = manifest.manifestcatalog_set.first()
         self.assertEqual(manifest_catalog.tags.count(), 1)
-        manifest = self.force_manifest()
+        manifest = force_manifest()
         self.assertEqual(manifest.version, 1)
-        catalog = self.force_catalog()
+        catalog = force_catalog()
         self._set_permissions("monolith.change_manifestcatalog")
         response = self._put_json_data(reverse("monolith_api:manifest_catalog", args=(manifest_catalog.pk,)), data={
             'manifest': manifest.pk,
@@ -1448,8 +1966,9 @@ def test_delete_manifest_catalog_not_found(self):
         self.assertEqual(response.status_code, 404)

     def test_delete_manifest_catalog(self):
-        manifest_catalog = self.force_manifest_catalog()
-        manifest = manifest_catalog.manifest
+        manifest = force_manifest()
+        force_catalog(manifest=manifest)
+        manifest_catalog = manifest.manifestcatalog_set.first()
         self.assertEqual(manifest.version, 1)
         self._set_permissions("monolith.delete_manifestcatalog")
         response = self.delete(reverse("monolith_api:manifest_catalog", args=(manifest_catalog.pk,)))
@@ -1469,47 +1988,53 @@ def test_get_manifest_sub_manifests_permission_denied(self):

     def test_get_manifest_sub_manifests_filter_by_manifest_id_not_found(self):
         self._set_permissions("monolith.view_manifestsubmanifest")
-        response = self.get(reverse("monolith_api:manifest_sub_manifests"), {"manifest_id": self.manifest.pk})
+        manifest = force_manifest()
+        response = self.get(reverse("monolith_api:manifest_sub_manifests"), {"manifest_id": manifest.pk})
         self.assertEqual(response.status_code, 200)
         self.assertEqual(response.json(), [])

     def test_get_manifest_sub_manifests_filter_by_manifest_id(self):
-        self.force_manifest_sub_manifest()
-        manifest_sub_manifest = self.force_manifest_sub_manifest()
+        manifest = force_manifest()
+        force_sub_manifest()
+        sub_manifest = force_sub_manifest(manifest=manifest)
         self._set_permissions("monolith.view_manifestsubmanifest")
         response = self.get(reverse("monolith_api:manifest_sub_manifests"),
-                            {"manifest_id": manifest_sub_manifest.manifest.id})
+                            {"manifest_id": manifest.id})
         self.assertEqual(response.status_code, 200)
         self.assertEqual(response.json(), [{
-            'id': manifest_sub_manifest.pk,
-            'manifest': manifest_sub_manifest.manifest.id,
-            'sub_manifest': manifest_sub_manifest.sub_manifest.id,
+            'id': manifest.manifestsubmanifest_set.filter(sub_manifest=sub_manifest).first().pk,
+            'manifest': manifest.id,
+            'sub_manifest': sub_manifest.id,
             'tags': []
         }])

     def test_get_manifest_sub_manifests_filter_by_sub_manifest_id(self):
-        self.force_manifest_sub_manifest()
-        manifest_sub_manifest = self.force_manifest_sub_manifest()
+        manifest = force_manifest()
+        force_sub_manifest(manifest=manifest)
+        sub_manifest = force_sub_manifest(manifest=manifest)
+        manifest_sub_manifest = sub_manifest.manifestsubmanifest_set.first()
         self._set_permissions("monolith.view_manifestsubmanifest")
         response = self.get(reverse("monolith_api:manifest_sub_manifests"),
-                            {"sub_manifest_id": manifest_sub_manifest.sub_manifest.id})
+                            {"sub_manifest_id": sub_manifest.id})
         self.assertEqual(response.status_code, 200)
         self.assertEqual(response.json(), [{
             'id': manifest_sub_manifest.pk,
-            'manifest': manifest_sub_manifest.manifest.id,
-            'sub_manifest': manifest_sub_manifest.sub_manifest.id,
+            'manifest': manifest.id,
+            'sub_manifest': sub_manifest.id,
             'tags': []
         }])

     def test_get_manifest_sub_manifests(self):
-        manifest_sub_manifest = self.force_manifest_sub_manifest()
+        manifest = force_manifest()
+        sub_manifest = force_sub_manifest(manifest=manifest)
+        manifest_sub_manifest = manifest.manifestsubmanifest_set.first()
         self._set_permissions("monolith.view_manifestsubmanifest")
         response = self.get(reverse("monolith_api:manifest_sub_manifests"))
         self.assertEqual(response.status_code, 200)
         self.assertEqual(response.json(), [{
             'id': manifest_sub_manifest.pk,
-            'manifest': manifest_sub_manifest.manifest.id,
-            'sub_manifest': manifest_sub_manifest.sub_manifest.id,
+            'manifest': manifest.id,
+            'sub_manifest': sub_manifest.id,
             'tags': []
         }])
@@ -1529,16 +2054,18 @@ def test_get_manifest_sub_manifest_not_found(self):
         self.assertEqual(response.status_code, 404)

     def test_get_manifest_sub_manifest(self):
-        tag = Tag.objects.create(name=get_random_string(12))
-        manifest_sub_manifest = self.force_manifest_sub_manifest(tag=tag)
+        manifest = force_manifest()
+        tags = [Tag.objects.create(name=get_random_string(12))]
+        sub_manifest = force_sub_manifest(manifest=manifest, tags=tags)
+        manifest_sub_manifest = sub_manifest.manifestsubmanifest_set.first()
         self._set_permissions("monolith.view_manifestsubmanifest")
         response = self.get(reverse("monolith_api:manifest_sub_manifest", args=(manifest_sub_manifest.pk,)))
         self.assertEqual(response.status_code, 200)
         self.assertEqual(response.json(), {
             'id': manifest_sub_manifest.pk,
-            'manifest': manifest_sub_manifest.manifest.id,
-            'sub_manifest': manifest_sub_manifest.sub_manifest.id,
-            'tags': [tag.pk]
+            'manifest': manifest.id,
+            'sub_manifest': sub_manifest.id,
+            'tags': [t.pk for t in tags]
         })

     # create manifest sub manifest
@@ -1563,9 +2090,9 @@ def test_create_manifest_sub_manifest_fields_empty(self):

     def test_create_manifest_sub_manifest(self):
         self._set_permissions("monolith.add_manifestsubmanifest")
-        manifest = self.force_manifest()
+        manifest = force_manifest()
         self.assertEqual(manifest.version, 1)
-        sub_manifest = self.force_sub_manifest()
+        sub_manifest = force_sub_manifest()
         tag = Tag.objects.create(name=get_random_string(12))
         response = self._post_json_data(reverse("monolith_api:manifest_sub_manifests"), data={
             'manifest': manifest.pk,
@@ -1601,14 +2128,14 @@ def test_update_manifest_sub_manifest_not_found(self):
         self.assertEqual(response.status_code, 404)

     def test_update_manifest_sub_manifest(self):
-        manifest_sub_manifest = self.force_manifest_sub_manifest(
-            tag=Tag.objects.create(name=get_random_string(12))
-        )
-        manifest = manifest_sub_manifest.manifest
+        manifest = force_manifest()
+        tags = [Tag.objects.create(name=get_random_string(12))]
+        force_sub_manifest(manifest=manifest, tags=tags)
+        manifest_sub_manifest = manifest.manifestsubmanifest_set.first()
         self.assertEqual(manifest.version, 1)
-        self.assertEqual(manifest_sub_manifest.tags.count(), 1)
-        manifest = self.force_manifest()
-        sub_manifest = self.force_sub_manifest()
+        self.assertEqual(list(manifest_sub_manifest.tags.all()), tags)
+        manifest = force_manifest()
+        sub_manifest = force_sub_manifest()
         self._set_permissions("monolith.change_manifestsubmanifest")
         response = self._put_json_data(
             reverse("monolith_api:manifest_sub_manifest", args=(manifest_sub_manifest.pk,)),
@@ -1647,8 +2174,9 @@ def test_delete_manifest_sub_manifest_not_found(self):
         self.assertEqual(response.status_code, 404)

     def test_delete_manifest_sub_manifest(self):
-        manifest_sub_manifest = self.force_manifest_sub_manifest()
-        manifest = manifest_sub_manifest.manifest
+        manifest = force_manifest()
+        force_sub_manifest(manifest=manifest)
+        manifest_sub_manifest = manifest.manifestsubmanifest_set.first()
         self.assertEqual(manifest.version, 1)
         self._set_permissions("monolith.delete_manifestsubmanifest")
         response = self.delete(reverse("monolith_api:manifest_sub_manifest", args=(manifest_sub_manifest.pk,)))
@@ -1673,8 +2201,8 @@ def test_get_sub_manifests_filter_by_name_not_found(self):
         self.assertEqual(response.json(), [])

     def test_get_sub_manifests_filter_by_name(self):
-        self.force_sub_manifest()
-        sub_manifest = self.force_sub_manifest()
+        force_sub_manifest()
+        sub_manifest = force_sub_manifest()
         self._set_permissions("monolith.view_submanifest")
         response = self.get(reverse("monolith_api:sub_manifests"),
                             {"name": sub_manifest.name})
@@ -1689,7 +2217,7 @@ def test_get_sub_manifests_filter_by_name(self):
         }])

     def test_get_sub_manifests(self):
-        sub_manifest = self.force_sub_manifest(meta_business_unit=self.mbu)
+        sub_manifest = force_sub_manifest(mbu=self.mbu)
         self._set_permissions("monolith.view_submanifest")
         response = self.get(reverse("monolith_api:sub_manifests"))
         self.assertEqual(response.status_code, 200)
@@ -1718,7 +2246,7 @@ def test_get_sub_manifest_not_found(self):
         self.assertEqual(response.status_code, 404)

     def test_get_sub_manifest(self):
-        sub_manifest = self.force_sub_manifest()
+        sub_manifest = force_sub_manifest()
         self._set_permissions("monolith.view_submanifest")
         response = self.get(reverse("monolith_api:sub_manifest", args=(sub_manifest.pk,)))
         self.assertEqual(response.status_code, 200)
@@ -1785,7 +2313,7 @@ def test_update_sub_manifest_not_found(self):
         self.assertEqual(response.status_code, 404)

     def test_update_sub_manifest(self):
-        sub_manifest = self.force_sub_manifest()
+        sub_manifest = force_sub_manifest()
         self._set_permissions("monolith.change_submanifest")
         new_name = get_random_string(12)
         new_description = get_random_string(12)
@@ -1824,7 +2352,7 @@ def test_delete_sub_manifest_not_found(self):
         self.assertEqual(response.status_code, 404)

     def test_delete_sub_manifest(self):
-        sub_manifest = self.force_sub_manifest()
+        sub_manifest = force_sub_manifest()
         self._set_permissions("monolith.delete_submanifest")
         response = self.delete(reverse("monolith_api:sub_manifest", args=(sub_manifest.pk,)))
         self.assertEqual(response.status_code, 204)
@@ -1841,14 +2369,14 @@ def test_get_sub_manifest_pkg_infos_permission_denied(self):

     def test_get_sub_manifest_pkg_infos_filter_by_sub_manifest_id_not_found(self):
         self._set_permissions("monolith.view_submanifestpkginfo")
-        sub_manifest = self.force_sub_manifest()
+        sub_manifest = force_sub_manifest()
         response = self.get(reverse("monolith_api:sub_manifest_pkg_infos"), {"sub_manifest_id": sub_manifest.pk})
         self.assertEqual(response.status_code, 200)
         self.assertEqual(response.json(), [])

     def test_get_sub_manifest_pkg_infos_filter_by_sub_manifest_id(self):
-        self.force_sub_manifest_pkg_info()
-        sub_manifest_pkg_info = self.force_sub_manifest_pkg_info()
+        force_sub_manifest_pkg_info()
+        sub_manifest_pkg_info = force_sub_manifest_pkg_info()
         self._set_permissions("monolith.view_submanifestpkginfo")
         response = self.get(reverse("monolith_api:sub_manifest_pkg_infos"),
                             {"sub_manifest_id": sub_manifest_pkg_info.sub_manifest.id})
@@ -1869,7 +2397,7 @@ def test_get_sub_manifest_pkg_infos_filter_by_sub_manifest_id(self):
         }])

     def test_get_sub_manifest_pkg_infos(self):
-        sub_manifest_pkg_info = self.force_sub_manifest_pkg_info()
+        sub_manifest_pkg_info = force_sub_manifest_pkg_info()
         self._set_permissions("monolith.view_submanifestpkginfo")
         response = self.get(reverse("monolith_api:sub_manifest_pkg_infos"))
         self.assertEqual(response.status_code, 200)
@@ -1904,7 +2432,7 @@ def test_get_sub_manifest_pkg_info_not_found(self):
         self.assertEqual(response.status_code, 404)

     def test_get_sub_manifest_pkg_info(self):
-        sub_manifest_pkg_info = self.force_sub_manifest_pkg_info()
+        sub_manifest_pkg_info = force_sub_manifest_pkg_info()
         self._set_permissions("monolith.view_submanifestpkginfo")
         response = self.get(reverse("monolith_api:sub_manifest_pkg_info", args=(sub_manifest_pkg_info.pk,)))
         self.assertEqual(response.status_code, 200)
@@ -1947,7 +2475,7 @@ def test_create_sub_manifest_pkg_info_fields_empty(self):

     def test_create_sub_manifest_pkg_info_unknown_pkg_info_name(self):
         self._set_permissions("monolith.add_submanifestpkginfo")
-        sub_manifest = self.force_sub_manifest()
+        sub_manifest = force_sub_manifest()
         response = self._post_json_data(reverse("monolith_api:sub_manifest_pkg_infos"), data={
             'sub_manifest': sub_manifest.pk,
             'pkg_info_name': get_random_string(12),
@@ -1961,8 +2489,8 @@ def test_create_sub_manifest_pkg_info_unknown_pkg_info_name(self):

     def test_create_sub_manifest_pkg_info_scoping_errors(self):
         self._set_permissions("monolith.add_submanifestpkginfo")
-        sub_manifest = self.force_sub_manifest()
-        pkg_info_name = self.force_pkg_info_name()
+        sub_manifest = force_sub_manifest()
+        pkg_info_name = force_name()
         tag1 = Tag.objects.create(name=get_random_string(12))
         tag2 = Tag.objects.create(name=get_random_string(12))
         response = self._post_json_data(reverse("monolith_api:sub_manifest_pkg_infos"), data={
@@ -1989,11 +2517,10 @@ def test_create_sub_manifest_pkg_info_scoping_errors(self):

     def test_create_sub_manifest_pkg_info(self):
         self._set_permissions("monolith.add_submanifestpkginfo")
-        msm = self.force_manifest_sub_manifest()
-        manifest = msm.manifest
-        sub_manifest = msm.sub_manifest
+        manifest = force_manifest()
+        sub_manifest = force_sub_manifest(manifest=manifest)
         self.assertEqual(manifest.version, 1)
-        pkg_info_name = self.force_pkg_info_name()
+        pkg_info_name = force_name()
         response = self._post_json_data(reverse("monolith_api:sub_manifest_pkg_infos"), data={
             'sub_manifest': sub_manifest.pk,
             'pkg_info_name': pkg_info_name.name,
@@ -2043,14 +2570,13 @@ def test_update_sub_manifest_pkg_info_not_found(self):
         self.assertEqual(response.status_code, 404)

     def test_update_sub_manifest_pkg_info(self):
-        sub_manifest_pkg_info = self.force_sub_manifest_pkg_info()
+        sub_manifest_pkg_info = force_sub_manifest_pkg_info()
         self._set_permissions("monolith.change_submanifestpkginfo")
-        msm = self.force_manifest_sub_manifest()
-        new_manifest = msm.manifest
-        new_sub_manifest = msm.sub_manifest
+        new_manifest = force_manifest()
+        new_sub_manifest = force_sub_manifest(manifest=new_manifest)
         self.assertEqual(new_manifest.version, 1)
-        new_pkg_info_name = self.force_pkg_info_name()
-        new_condition = self.force_condition()
+        new_pkg_info_name = force_name()
+        new_condition = force_condition()
         excluded_tag = Tag.objects.create(name=get_random_string(12))
         shard_tag = Tag.objects.create(name=get_random_string(12))
         response = self._put_json_data(
@@ -2113,11 +2639,11 @@ def test_delete_sub_manifest_pkg_info_not_found(self):
         self.assertEqual(response.status_code, 404)

     def test_delete_sub_manifest_pkg_info(self):
-        msm = self.force_manifest_sub_manifest()
-        manifest = msm.manifest
-        sub_manifest = msm.sub_manifest
+        sub_manifest_pkg_info = force_sub_manifest_pkg_info()
+        sub_manifest = sub_manifest_pkg_info.sub_manifest
+        manifest = sub_manifest.manifestsubmanifest_set.first().manifest
         self.assertEqual(manifest.version, 1)
-        sub_manifest_pkg_info = self.force_sub_manifest_pkg_info(sub_manifest=sub_manifest)
+        force_pkg_info(sub_manifest=sub_manifest)
         self._set_permissions("monolith.delete_submanifestpkginfo")
         response = self.delete(reverse("monolith_api:sub_manifest_pkg_info", args=(sub_manifest_pkg_info.pk,)))
         self.assertEqual(response.status_code, 204)
diff --git a/tests/monolith/test_models.py b/tests/monolith/test_models.py
index f7abe3289d..58372c50c9 100644
--- a/tests/monolith/test_models.py
+++ b/tests/monolith/test_models.py
@@ -4,9 +4,10 @@
 from django.utils.crypto import get_random_string
 from zentral.contrib.inventory.models import MetaBusinessUnit, Tag
 from zentral.contrib.monolith.conf import monolith_conf
-from zentral.contrib.monolith.models import (Catalog, Manifest, ManifestCatalog, ManifestEnrollmentPackage,
+from zentral.contrib.monolith.models import (Manifest, ManifestCatalog, ManifestEnrollmentPackage,
                                              ManifestSubManifest, PkgInfo, PkgInfoName, SubManifest)
 from zentral.contrib.munki.models import ManagedInstall
+from .utils import force_catalog


 def sorted_objects(object_list):
@@ -18,8 +19,8 @@ class MonolithModelsTestCase(TestCase):
     def setUpTestData(cls):
         cls.meta_business_unit = MetaBusinessUnit.objects.create(name=get_random_string(13))
         cls.manifest = Manifest.objects.create(meta_business_unit=cls.meta_business_unit, name=get_random_string(13))
-        cls.catalog_1 = Catalog.objects.create(name=get_random_string(13), priority=10)
-        cls.catalog_2 = Catalog.objects.create(name=get_random_string(13), priority=20)
+        cls.catalog_1 = force_catalog()
+        cls.catalog_2 = force_catalog(repository=cls.catalog_1.repository)
         cls.sub_manifest_1 = SubManifest.objects.create(
             meta_business_unit=cls.meta_business_unit, name=get_random_string(13))
         cls.sub_manifest_2 = SubManifest.objects.create(
@@ -38,14 +39,16 @@ def setUpTestData(cls):
         cls.mep_2 = ManifestEnrollmentPackage.objects.create(manifest=cls.manifest, builder=cls.builder)
         cls.mep_2.tags.set([cls.tag_1, cls.tag_2])
         cls.pkginfo_name_1 = PkgInfoName.objects.create(name="aaaa first name")
-        cls.pkginfo_1_1 = PkgInfo.objects.create(name=cls.pkginfo_name_1, version="1.0",
+        cls.pkginfo_1_1 = PkgInfo.objects.create(repository=cls.catalog_1.repository,
+                                                 name=cls.pkginfo_name_1, version="1.0",
                                                  data={"name": cls.pkginfo_name_1.name,
                                                        "version": "1.0",
                                                        "zentral_monolith": {
                                                            "shards": {"modulo": 17}
                                                        }})
         cls.pkginfo_1_1.catalogs.set([cls.catalog_1, cls.catalog_2])
-        cls.pkginfo_1_2 = PkgInfo.objects.create(name=cls.pkginfo_name_1, version="2.0",
+        cls.pkginfo_1_2 = PkgInfo.objects.create(repository=cls.catalog_2.repository,
+                                                 name=cls.pkginfo_name_1, version="2.0",
                                                  data={"name": cls.pkginfo_name_1.name,
                                                        "version": "2.0",
                                                        "zentral_monolith": {
@@ -58,7 +61,8 @@ def setUpTestData(cls):
                                                        }})
         cls.pkginfo_1_2.catalogs.set([cls.catalog_2])
         cls.pkginfo_name_2 = PkgInfoName.objects.create(name="bbbb second name")
-        cls.pkginfo_2_1 = PkgInfo.objects.create(name=cls.pkginfo_name_2, version="1.0",
+        cls.pkginfo_2_1 = PkgInfo.objects.create(repository=cls.catalog_1.repository,
+                                                 name=cls.pkginfo_name_2, version="1.0",
                                                  data={"name": cls.pkginfo_name_2.name, "version": "1.0"})
         cls.pkginfo_2_1.catalogs.set([cls.catalog_1, cls.catalog_2])

diff --git a/tests/monolith/test_munki_api_views.py b/tests/monolith/test_munki_api_views.py
index 1a836f45a4..026f8fb295 100644
--- a/tests/monolith/test_munki_api_views.py
+++ b/tests/monolith/test_munki_api_views.py
@@ -1,7 +1,7 @@
 import copy
 import os.path
 import plistlib
-import shutil
+from urllib.parse import urlparse
 import uuid
 from django.core.files.uploadedfile import SimpleUploadedFile
 from django.test import TestCase, override_settings
@@ -10,10 +10,11 @@
 from server.urls import build_urlpatterns_for_zentral_apps
 from zentral.conf import settings
 from zentral.contrib.inventory.models import EnrollmentSecret, MachineTag, MetaBusinessUnit, Tag
-from zentral.contrib.monolith.models import (Catalog, Enrollment,
-                                             Manifest, ManifestCatalog, ManifestSubManifest,
+from zentral.contrib.monolith.models import (Enrollment,
+                                             ManifestCatalog, ManifestSubManifest,
                                              PkgInfo, PkgInfoName, SubManifest,
                                              SubManifestPkgInfo)
+from .utils import force_catalog, force_manifest, force_repository


 pkginfo_src = """
@@ -72,8 +73,11 @@ def setUpTestData(cls):
         # mbu
         cls.mbu = MetaBusinessUnit.objects.create(name=get_random_string(64))
         cls.mbu.create_enrollment_business_unit()
+        # repository
+        cls.virtual_repository = force_repository(virtual=True)
+        cls.s3_repository = force_repository(virtual=False)
         # manifest
-        cls.manifest = Manifest.objects.create(meta_business_unit=cls.mbu, name=get_random_string(12))
+        cls.manifest = force_manifest(mbu=cls.mbu)
         # pkginfos
         cls.pkginfo_data = plistlib.loads(pkginfo_src.encode("utf-8"))
         # enrollment
@@ -113,7 +117,7 @@ def _force_smpi(
         local_pkg_content=None
     ):
         if catalog is None:
-            catalog = Catalog.objects.create(name=get_random_string(12))
+            catalog = force_catalog(repository=self.virtual_repository if local_pkg_content else self.s3_repository)
         ManifestCatalog.objects.get_or_create(manifest=self.manifest, catalog=catalog)
         if sub_manifest is None:
             sub_manifest, _ = SubManifest.objects.get_or_create(name=get_random_string(12))
@@ -137,6 +141,7 @@ def _force_smpi(
             data["zentral_monolith"] = zentral_monolith
         pkg_info_name, _ = PkgInfoName.objects.get_or_create(name=name)
         pkg_info = PkgInfo.objects.create(
+            repository=catalog.repository,
             name=pkg_info_name,
             version=version,
             data=data
@@ -451,22 +456,25 @@ def test_sub_manifest_one_tag_shard_included(self):

     # repository package

-    def test_repository_package_not_found(self):
-        pkg_info, _, _ = self._force_smpi()
-        api_path = pkg_info.get_pkg_info()["installer_item_location"]
-        response = self._make_munki_request(reverse("monolith_public:repository_package", args=(api_path,)))
-        self.assertEqual(response.status_code, 404)
-
     def test_repository_package(self):
         pkg_info, _, _ = self._force_smpi()
-        local_path = os.path.join("/tmp/pkgs", pkg_info.data["installer_item_location"])
-        os.makedirs(os.path.dirname(local_path), exist_ok=True)
-        with open(local_path, "wb") as f:
-            f.write(b"yolo")
         api_path = pkg_info.get_pkg_info()["installer_item_location"]
         response = self._make_munki_request(reverse("monolith_public:repository_package", args=(api_path,)))
-        self.assertEqual(b"".join(response.streaming_content), b"yolo")
-        shutil.rmtree("/tmp/pkgs")
+        self.assertEqual(response.status_code, 302)
+        p = urlparse(response.url)
+        self.assertEqual(p.scheme, "https")
+        self.assertEqual(p.netloc, "s3.us-east1.amazonaws.com")
+        s3_repo_kwargs = self.s3_repository.get_backend_kwargs()
+        self.assertEqual(
+            p.path,
+            os.path.join(
+                "/",
+                s3_repo_kwargs["bucket"],
+                s3_repo_kwargs["prefix"],
+                "pkgs",
+                pkg_info.data["installer_item_location"]
+            )
+        )

     def test_unknown_repository_package(self):
         pkg_info, _, _ = self._force_smpi()
@@ -474,7 +482,14 @@ def test_unknown_repository_package(self):
         api_path = api_path.replace("." + str(pkg_info.pk) + ".", ".0.")  # no pkg info with pk == 0
         response = self._make_munki_request(reverse("monolith_public:repository_package", args=(api_path,)))
         self.assertEqual(response.status_code, 404)
-        self.assertEqual(response.content, b"PkgInfo not found!")
+        self.assertEqual(response.content, b"Not found!")
+
+    def test_local_repository_package_not_found(self):
+        pkg_info, _, _ = self._force_smpi(local_pkg_content=b"fomo")
+        api_path = pkg_info.get_pkg_info()["installer_item_location"]
+        pkg_info.file.delete()
+        response = self._make_munki_request(reverse("monolith_public:repository_package", args=(api_path,)))
+        self.assertEqual(response.status_code, 404)

     def test_local_repository_package(self):
         pkg_info, _, _ = self._force_smpi(local_pkg_content=b"fomo")
@@ -483,7 +498,58 @@ def test_local_repository_package(self):
         self.assertEqual(b"".join(response.streaming_content), b"fomo")
         pkg_info.file.delete(save=False)

-    # legacy public endpoints
+    # icon hashes
+
+    def test_icon_hashes(self):
+        repository1 = force_repository()
+        repository1.icon_hashes = {"un": 64 * "a"}
+        repository1.save()
+        force_catalog(repository=repository1, manifest=self.manifest)
+        repository2 = force_repository()
+        repository2.icon_hashes = {"deux": 64 * "b"}
+        repository2.save()
+        force_catalog(repository=repository2, manifest=self.manifest)
+        response = self._make_munki_request(reverse("monolith_public:repository_icon_hashes"))
+        self.assertEqual(
+            plistlib.loads(response.content),
+            {"un": 64 * "a", "deux": 64 * "b"}
+        )
+
+    # client resources
+
+    def test_client_resource_redirect(self):
+        repository = force_repository()
+        repository.client_resources = ["yolo.zip"]
+        repository.save()
+        force_catalog(repository=repository, manifest=self.manifest)
+        response = self._make_munki_request(reverse("monolith_public:repository_client_resource",
+                                                    args=("yolo.zip",)))
+        self.assertEqual(response.status_code, 302)
+        p = urlparse(response.url)
+        self.assertEqual(p.scheme, "https")
+        self.assertEqual(p.netloc, "s3.us-east1.amazonaws.com")
+        s3_repo_kwargs = repository.get_backend_kwargs()
+        self.assertEqual(
+            p.path,
+            os.path.join(
+                "/",
+                s3_repo_kwargs["bucket"],
+                s3_repo_kwargs.get("prefix", ""),
+                "client_resources",
+                "yolo.zip"
+            )
+        )
+
+    def test_client_resource_not_found(self):
+        repository = force_repository()
+        repository.client_resources = ["yolo.zip"]
+        repository.save()
+        force_catalog(repository=repository, manifest=self.manifest)
+        response = self._make_munki_request(reverse("monolith_public:repository_client_resource",
+                                                    args=("fomo.zip",)))
+        self.assertEqual(response.status_code, 404)
+
+    # legacy URLs

     def test_legacy_public_urls_are_disabled_on_tests(self):
         routes = [
diff --git a/tests/monolith/test_repositories.py b/tests/monolith/test_repositories.py
index 46733d3a8a..5a8211bba5 100644
--- a/tests/monolith/test_repositories.py
+++ b/tests/monolith/test_repositories.py
@@ -1,16 +1,13 @@
 from datetime import datetime
 import plistlib
-from unittest.mock import call, patch, Mock
+from unittest.mock import call, Mock
 from django.db.models.expressions import CombinedExpression
-from django.http import HttpResponseRedirect
 from django.test import TestCase
 from django.utils.crypto import get_random_string
-from zentral.contrib.inventory.models import MetaBusinessUnit
-from zentral.contrib.monolith.conf import monolith_conf
-from zentral.contrib.monolith.exceptions import RepositoryError
-from zentral.contrib.monolith.models import Catalog, PkgInfoCategory, Manifest, ManifestCatalog, PkgInfo, PkgInfoName
-from zentral.contrib.monolith.repository_backends.http import Repository as HttpRepository
+from zentral.contrib.monolith.models import PkgInfo, PkgInfoCategory, PkgInfoName
+from zentral.contrib.monolith.repository_backends import load_repository_backend
 from zentral.core.events.base import AuditEvent
+from .utils import force_catalog, force_manifest, force_pkg_info, force_repository


 class MonolithRepositoriesTestCase(TestCase):
@@ -18,75 +15,31 @@ class MonolithRepositoriesTestCase(TestCase):

     # utility methods

-    def _build_all_catalog(self, data):
+    def _build_plist(self, data):
         return plistlib.dumps(data)

-    def _force_catalog(self, archived=False):
-        catalog = Catalog.objects.create(
-            name=get_random_string(12),
-            archived_at=datetime.utcnow() if archived else None,
+    def _load_repository(self, db_repository, return_value):
+        repository = load_repository_backend(db_repository)
+        repository.get_icon_hashes_content = Mock(
+            name="get_icon_hashes_content",
+            return_value=self._build_plist({})
         )
-        ManifestCatalog.objects.create(
-            manifest=Manifest.objects.create(
-                meta_business_unit=MetaBusinessUnit.objects.create(name=get_random_string(12)),
-                name=get_random_string(12)
-            ),
-            catalog=catalog
+        repository.iter_client_resources = Mock(
+            name="iter_client_resources",
+            return_value=[]
         )
-        return catalog
-
-    def _force_category(self):
-        return PkgInfoCategory.objects.create(name=get_random_string(12))
-
-    def _force_name(self):
-        return PkgInfoName.objects.create(name=get_random_string(12))
-
-    def _force_pkg_info(self, local=True, version="1.0", archived=False, alles=False):
-        pkg_info_name = self._force_name()
-        data = {"name": pkg_info_name.name,
-                "version": version}
-        pi = PkgInfo.objects.create(
-            name=pkg_info_name, version=version, local=local,
-            archived_at=datetime.utcnow() if archived else None,
-            data=data
+        repository.get_all_catalog_content = Mock(
+            name="get_all_catalog_content",
+            return_value=self._build_plist(return_value)
         )
-        pi.catalogs.add(self._force_catalog())
-        return pi
-
-    # http repository
-
-    @patch("zentral.contrib.monolith.repository_backends.http.requests.get")
-    def test_http_repository_get_all_catalog_content(self, requests_get):
-        mocked_r = Mock()
-        mocked_r.status_code = 200
-        mocked_r.content = b"yolo"
-        requests_get.return_value = mocked_r
-        r = HttpRepository({"root": "https://example.com/root"})
-        self.assertEqual(r.get_all_catalog_content(), b"yolo")
-        requests_get.assert_called_once_with("https://example.com/root/catalogs/all")
-
-    @patch("zentral.contrib.monolith.repository_backends.http.requests.get")
-    def test_http_repository_get_all_catalog_content_error(self, requests_get):
-        mocked_r = Mock()
-        mocked_r.status_code = 400
-        requests_get.return_value = mocked_r
-        r = HttpRepository({"root": "https://example.com/root"})
-        with self.assertRaises(RepositoryError):
-            r.get_all_catalog_content()
-        requests_get.assert_called_once_with("https://example.com/root/catalogs/all")
-
-    def test_http_repository_make_munki_response(self):
-        r = HttpRepository({"root": "https://example.com/root"})
-        resp = r.make_munki_repository_response("yadi", "yada")
-        self.assertIsInstance(resp, HttpResponseRedirect)
-        self.assertEqual(resp.headers["Location"], "https://example.com/root/yadi/yada")
+        return repository

     # sync catalogs

-    @patch("zentral.contrib.monolith.repository_backends.local.Repository.get_all_catalog_content")
-    def test_sync_catalogs(self, get_all_catalog_content):
-        catalog = self._force_catalog()
-        manifest = catalog.manifestcatalog_set.first().manifest
+    def test_sync_catalogs(self):
+        db_repository = force_repository()
+        manifest = force_manifest()
+        catalog = force_catalog(repository=db_repository, manifest=manifest)
         m_prev_value = manifest.serialize_for_event()
         self.assertEqual(manifest.version, 1)
         category_name = get_random_string(12)
@@ -94,20 +47,21 @@ def test_sync_catalogs(self, get_all_catalog_content):
         requires_pin_name = get_random_string(12)
         update_for_pin_name = get_random_string(12)
         now = datetime.utcnow()
-        get_all_catalog_content.return_value = self._build_all_catalog([
-            {"catalogs": [catalog.name],
-             "name": name,
-             "category": category_name,
-             "requires": [requires_pin_name],
-             "update_for": [update_for_pin_name],
-             "version": "3.0",
-             "yolo": now}
-        ])
         audit_callback = Mock()
-        monolith_conf.repository.sync_catalogs(audit_callback)
+        repository = self._load_repository(
+            db_repository,
+            [{"catalogs": [catalog.name],
+              "name": name,
+              "category": category_name,
+              "requires": [requires_pin_name],
+              "update_for": [update_for_pin_name],
+              "version": "3.0",
+              "yolo": now}]
+        )
+        repository.sync_catalogs(audit_callback)
         pkg_info = PkgInfo.objects.get(name__name=name, version="3.0")
         pin = PkgInfoName.objects.get(name=name)
-        category = PkgInfoCategory.objects.get(name=category_name)
+        category = PkgInfoCategory.objects.get(repository=db_repository, name=category_name)
         requires_pin = PkgInfoName.objects.get(name=requires_pin_name)
         update_for_pin = PkgInfoName.objects.get(name=update_for_pin_name)
         self.assertEqual(
@@ -133,61 +87,35 @@ def test_sync_catalogs(self, get_all_catalog_content):
         manifest.refresh_from_db()
         self.assertEqual(manifest.version, 2)

-    @patch("zentral.contrib.monolith.repository_backends.local.Repository.get_all_catalog_content")
-    def test_missing_catalog(self, get_all_catalog_content):
-        self.assertIsNone(monolith_conf.repository.default_catalog_name)
-        get_all_catalog_content.return_value = self._build_all_catalog([{
-            "name": get_random_string(12),
-            "version": "1.0"
-        }])
+    def test_missing_catalog(self):
+        db_repository = force_repository()
         audit_callback = Mock()
-        monolith_conf.repository.sync_catalogs(audit_callback)
+        repository = self._load_repository(
+            db_repository,
+            [{"name": get_random_string(12),
+              "version": "1.0"}]
+        )
+        repository.sync_catalogs(audit_callback)
         self.assertEqual(len(audit_callback.call_args_list), 0)

-    @patch("zentral.contrib.monolith.repository_backends.local.Repository.get_all_catalog_content")
-    def test_sync_catalogs_existing_local_pkg_info(self, get_all_catalog_content):
-        pkg_info = self._force_pkg_info(local=True)
+    def test_sync_catalogs_catalog_updates(self):
+        db_repository = force_repository()
+        old_catalog = force_catalog(repository=db_repository)
+        oc_prev_value = old_catalog.serialize_for_event()
+        pkg_info = force_pkg_info(catalog=old_catalog)
         pi_prev_value = pkg_info.serialize_for_event()
-        catalog = pkg_info.catalogs.first()
-        manifest = catalog.manifestcatalog_set.first().manifest
+        manifest = force_manifest()
         m_prev_value = manifest.serialize_for_event()
-        self.assertEqual(manifest.version, 1)
-        category = self._force_category()
-        get_all_catalog_content.return_value = self._build_all_catalog([
-            {"catalogs": [catalog.name],
-             "name": pkg_info.name.name,
-             "category": category.name,
-             "version": "1.0"}
-        ])
-        audit_callback = Mock()
-        monolith_conf.repository.sync_catalogs(audit_callback)
-        self.assertEqual(
-            audit_callback.call_args_list,
-            [call(pkg_info, AuditEvent.Action.UPDATED, pi_prev_value),
-             call(manifest, AuditEvent.Action.UPDATED, m_prev_value)]
-        )
-        pkg_info.refresh_from_db()
-        self.assertTrue(pkg_info.local is False)
-        manifest.refresh_from_db()
-        self.assertEqual(manifest.version, 2)
-
-    @patch("zentral.contrib.monolith.repository_backends.local.Repository.get_all_catalog_content")
-    def test_sync_catalogs_catalog_updates(self, get_all_catalog_content):
-        pkg_info = self._force_pkg_info(local=False)
-        pi_prev_value = pkg_info.serialize_for_event()
-        old_catalog = pkg_info.catalogs.first()
-        oc_prev_value = old_catalog.serialize_for_event()
-        new_catalog = self._force_catalog(archived=True)
+        new_catalog = force_catalog(repository=db_repository, manifest=manifest, archived=True)
         nc_prev_value = new_catalog.serialize_for_event()
-        manifest = new_catalog.manifestcatalog_set.first().manifest
-        m_prev_value = manifest.serialize_for_event()
-        get_all_catalog_content.return_value = self._build_all_catalog([
-            {"catalogs": [new_catalog.name],
-             "name": pkg_info.name.name,
-             "version": "1.0"}
-        ])
         audit_callback = Mock()
-        monolith_conf.repository.sync_catalogs(audit_callback)
+        repository = self._load_repository(
+            db_repository,
+            [{"catalogs": [new_catalog.name],
+              "name": pkg_info.name.name,
+              "version": "1.0"}]
+        )
+        repository.sync_catalogs(audit_callback)
         self.assertEqual(
             audit_callback.call_args_list,
             [call(new_catalog, AuditEvent.Action.UPDATED, nc_prev_value),
@@ -200,24 +128,25 @@ def test_sync_catalogs_catalog_updates(self, get_all_catalog_content):
         new_catalog.refresh_from_db()
         self.assertIsNone(new_catalog.archived_at)

-    @patch("zentral.contrib.monolith.repository_backends.local.Repository.get_all_catalog_content")
-    def test_sync_catalogs_pkg_info_archived(self, get_all_catalog_content):
-        remote_pkg_info_to_archive = self._force_pkg_info(local=False)
-        rpita_prev_value = remote_pkg_info_to_archive.serialize_for_event()
-        rpita_catalog = remote_pkg_info_to_archive.catalogs.first()
+    def test_sync_catalogs_pkg_info_archived(self):
+        db_repository = force_repository()
+        rpita_catalog = force_catalog(repository=db_repository)
         rpitac_prev_value = rpita_catalog.serialize_for_event()
-        local_pkg_info = self._force_pkg_info(local=True)  # local, no event
+        remote_pkg_info_to_archive = force_pkg_info(catalog=rpita_catalog, local=False)
+        rpita_prev_value = remote_pkg_info_to_archive.serialize_for_event()
         new_name = get_random_string(12)
-        new_catalog = self._force_catalog()
-        manifest = new_catalog.manifestcatalog_set.first().manifest
+        manifest = force_manifest()
         m_prev_value = manifest.serialize_for_event()
-        get_all_catalog_content.return_value = self._build_all_catalog([
-            {"catalogs": [new_catalog.name],
-             "name": new_name,
-             "version": "3.0"}
-        ])
+        new_catalog = force_catalog(repository=db_repository, manifest=manifest)
         audit_callback = Mock()
-        monolith_conf.repository.sync_catalogs(audit_callback)
+        repository = load_repository_backend(db_repository)
+        repository = self._load_repository(
+            db_repository,
+            [{"catalogs": [new_catalog.name],
+              "name": new_name,
+              "version": "3.0"}]
+        )
+        repository.sync_catalogs(audit_callback)
         pkg_info = PkgInfo.objects.get(name__name=new_name, version="3.0")
         self.assertEqual(
             audit_callback.call_args_list,
@@ -234,26 +163,24 @@ def test_sync_catalogs_pkg_info_archived(self, get_all_catalog_content):
         # and was not present anymore
         rpita_catalog.refresh_from_db()
         self.assertIsNotNone(rpita_catalog.archived_at)
-        # local_pkg_info not archived
-        local_pkg_info.refresh_from_db()
-        self.assertIsNone(local_pkg_info.archived_at)

-    @patch("zentral.contrib.monolith.repository_backends.local.Repository.get_all_catalog_content")
-    def test_sync_catalogs_pkg_info_unarchived(self, get_all_catalog_content):
-        pkg_info_to_unarchive = self._force_pkg_info(local=False)
-        pkg_info_to_unarchive.archived_at = datetime.utcnow()
-        pkg_info_to_unarchive.save()
-        prev_value = pkg_info_to_unarchive.serialize_for_event()
-        catalog = pkg_info_to_unarchive.catalogs.first()
-        manifest = catalog.manifestcatalog_set.first().manifest
+    def test_sync_catalogs_pkg_info_unarchived(self):
+        db_repository = force_repository()
+        manifest = force_manifest()
         manifest_prev_value = manifest.serialize_for_event()
-        get_all_catalog_content.return_value = self._build_all_catalog([
-            {"catalogs": [pkg_info_to_unarchive.catalogs.first().name],
-             "name": pkg_info_to_unarchive.name.name,
-             "version": pkg_info_to_unarchive.version}
-        ])
+        catalog = force_catalog(repository=db_repository, manifest=manifest)
+        pkg_info_to_unarchive = force_pkg_info(local=False, catalog=catalog, archived=True)
+        self.assertIsNotNone(pkg_info_to_unarchive.archived_at)
+        prev_value = pkg_info_to_unarchive.serialize_for_event()
         audit_callback = Mock()
-        monolith_conf.repository.sync_catalogs(audit_callback)
+        repository = load_repository_backend(db_repository)
+        repository = self._load_repository(
+            db_repository,
+            [{"catalogs": [catalog.name],
+              "name": pkg_info_to_unarchive.name.name,
+              "version": pkg_info_to_unarchive.version}]
+        )
+        repository.sync_catalogs(audit_callback)
         # pkg_info_to_unarchive archived at is None, because present in the catalog
         pkg_info_to_unarchive.refresh_from_db()
         self.assertIsNone(pkg_info_to_unarchive.archived_at)
diff --git a/tests/monolith/test_setup_views.py b/tests/monolith/test_setup_views.py
index 64b86159fd..15d64d36d6 100644
--- a/tests/monolith/test_setup_views.py
+++ b/tests/monolith/test_setup_views.py
@@ -13,12 +13,19 @@
 from accounts.models import User
 from zentral.contrib.inventory.models import EnrollmentSecret, MetaBusinessUnit, Tag
 from zentral.contrib.monolith.models import (Catalog, Condition, Enrollment, EnrolledMachine,
-                                             Manifest, ManifestCatalog, ManifestSubManifest,
-                                             PkgInfo, PkgInfoCategory, PkgInfoName,
-                                             SubManifest, SubManifestPkgInfo)
+                                             PkgInfo, PkgInfoName)
+from zentral.contrib.monolith.repository_backends import load_repository_backend
+from zentral.contrib.monolith.repository_backends.s3 import S3Repository
+from zentral.contrib.monolith.repository_backends.virtual import VirtualRepository
 from zentral.contrib.munki.models import ManagedInstall
 from zentral.core.events.base import AuditEvent
 from utils.packages import build_dummy_package
+from .utils import (CLOUDFRONT_PRIVKEY_PEM,
+                    force_catalog, force_category, force_condition,
+                    force_manifest, force_name,
+                    force_pkg_info,
+                    force_sub_manifest, force_sub_manifest_pkg_info,
+                    force_repository)


 @override_settings(STATICFILES_STORAGE='django.contrib.staticfiles.storage.StaticFilesStorage')
@@ -34,16 +41,17 @@ def setUpTestData(cls):
         # mbu
         cls.mbu = MetaBusinessUnit.objects.create(name=get_random_string(64))
         cls.mbu.create_enrollment_business_unit()
+        # repository
+        cls.repository = force_repository()
         # manifest
-        cls.manifest = Manifest.objects.create(meta_business_unit=cls.mbu, name=get_random_string(12))
+        cls.manifest = force_manifest(mbu=cls.mbu)
         # catalog
-        cls.catalog_1 = Catalog.objects.create(name=get_random_string(13), priority=10)
-        # manifest catalog
-        ManifestCatalog.objects.create(manifest=cls.manifest, catalog=cls.catalog_1)
+        cls.catalog_1 = force_catalog(repository=cls.repository, manifest=cls.manifest)
         # pkginfo name
         cls.pkginfo_name_1 = PkgInfoName.objects.create(name="aaaa first name")
         # pkginfo
-        cls.pkginfo_1_1 = PkgInfo.objects.create(name=cls.pkginfo_name_1, version="1.0",
+        cls.pkginfo_1_1 = PkgInfo.objects.create(repository=cls.repository,
+                                                 name=cls.pkginfo_name_1, version="1.0",
                                                  data={"name": cls.pkginfo_name_1.name, "version": "1.0"})
         cls.pkginfo_1_1.catalogs.set([cls.catalog_1])
@@ -82,78 +90,417 @@ def _login(self, *permissions):
             self.group.permissions.clear()
         self.client.force_login(self.user)

-    def _force_catalog(self):
-        return Catalog.objects.create(name=get_random_string(12))
+    # index

-    def _force_condition(self, submanifest=False):
-        return Condition.objects.create(name=get_random_string(12), predicate='machine_type == "laptop"')
+    def test_index_redirect(self):
+        self._login_redirect(reverse("monolith:index"))

-    def _force_manifest(self):
-        return Manifest.objects.create(name=get_random_string(12), meta_business_unit=self.mbu)
+    def test_index_permission_denied(self):
+        self._login()
+        response = self.client.get(reverse("monolith:index"))
+        self.assertEqual(response.status_code, 403)
+
+    def test_index(self):
+        self._login("monolith.view_manifest")
+        response = self.client.get(reverse("monolith:index"))
+        self.assertEqual(response.status_code, 200)
+        self.assertTemplateUsed(response, "monolith/index.html")
+
+    # repositories
+
+    def test_repositories_redirect(self):
+        self._login_redirect(reverse("monolith:repositories"))
+
+    def test_repositories_permission_denied(self):
+        self._login()
+        response = self.client.get(reverse("monolith:repositories"))
+        self.assertEqual(response.status_code, 403)
+
+    def test_repositories(self):
+        repository = force_repository()
+        self._login("monolith.view_repository")
+        response = self.client.get(reverse("monolith:repositories"))
+        self.assertEqual(response.status_code, 200)
+        self.assertTemplateUsed(response, "monolith/repository_list.html")
+        self.assertContains(response, repository.get_absolute_url())
+        self.assertContains(response, repository.name)

-    def _force_sub_manifest(self, condition=None):
-        submanifest = SubManifest.objects.create(name=get_random_string(12))
-        submanifest_pkginfo = SubManifestPkgInfo.objects.create(
-            sub_manifest=submanifest,
-            key="managed_installs",
-            pkg_info_name=self.pkginfo_name_1,
-            condition=condition
+    # create repository
+
+    def test_create_repository_redirect(self):
+        self._login_redirect(reverse("monolith:create_repository"))
+
+    def test_create_repository_permission_denied(self):
+        self._login()
+        response = self.client.get(reverse("monolith:create_repository"))
+        self.assertEqual(response.status_code, 403)
+
+    def test_create_repository_get(self):
+        self._login("monolith.add_repository")
+        response = self.client.get(reverse("monolith:create_repository"))
+        self.assertEqual(response.status_code, 200)
+        self.assertTemplateUsed(response, "monolith/repository_form.html")
+
+    def test_create_s3_repository_invalid_private_key(self):
+        self._login("monolith.add_repository", "monolith.view_repository")
+        name = get_random_string(12)
+        bucket = get_random_string(12)
+        response = self.client.post(reverse("monolith:create_repository"),
+                                    {"r-name": name,
+                                     "r-backend": "S3",
+                                     "s3-bucket": bucket,
+                                     "s3-cloudfront_domain": "yada.cloudfront.net",
+                                     "s3-cloudfront_key_id": "YADA",
+                                     "s3-cloudfront_privkey_pem": "YADA"},
+                                    follow=True)
+        self.assertEqual(response.status_code, 200)
+        self.assertTemplateUsed(response, "monolith/repository_form.html")
+        self.assertFormError(
+            response.context["s3_form"], "cloudfront_privkey_pem",
+            "Invalid private key."
+        )
+
+    def test_create_s3_repository_missing_cf_domain_key_id(self):
+        self._login("monolith.add_repository", "monolith.view_repository")
+        name = get_random_string(12)
+        bucket = get_random_string(12)
+        response = self.client.post(reverse("monolith:create_repository"),
+                                    {"r-name": name,
+                                     "r-backend": "S3",
+                                     "s3-bucket": bucket,
+                                     "s3-cloudfront_privkey_pem": CLOUDFRONT_PRIVKEY_PEM},
+                                    follow=True)
+        self.assertEqual(response.status_code, 200)
+        self.assertTemplateUsed(response, "monolith/repository_form.html")
+        self.assertFormError(
+            response.context["s3_form"], "cloudfront_domain",
+            "This field is required when configuring Cloudfront."
+        )
+        self.assertFormError(
+            response.context["s3_form"], "cloudfront_key_id",
+            "This field is required when configuring Cloudfront."
+        )
+
+    def test_create_s3_repository_missing_cf_privkey(self):
+        self._login("monolith.add_repository", "monolith.view_repository")
+        name = get_random_string(12)
+        bucket = get_random_string(12)
+        response = self.client.post(reverse("monolith:create_repository"),
+                                    {"r-name": name,
+                                     "r-backend": "S3",
+                                     "s3-bucket": bucket,
+                                     "s3-cloudfront_domain": "yada.cloudfront.net",
+                                     "s3-cloudfront_key_id": "YADA"},
+                                    follow=True)
+        self.assertEqual(response.status_code, 200)
+        self.assertTemplateUsed(response, "monolith/repository_form.html")
+        self.assertFormError(
+            response.context["s3_form"], "cloudfront_privkey_pem",
+            "This field is required when configuring Cloudfront."
+        )
+
+    @patch("base.notifier.Notifier.send_notification")
+    @patch("zentral.core.queues.backends.kombu.EventQueues.post_event")
+    def test_create_s3_repository(self, post_event, send_notification):
+        self._login("monolith.add_repository", "monolith.view_repository")
+        name = get_random_string(12)
+        bucket = get_random_string(12)
+        with self.captureOnCommitCallbacks(execute=True) as callbacks:
+            response = self.client.post(reverse("monolith:create_repository"),
+                                        {"r-name": name,
+                                         "r-backend": "S3",
+                                         "s3-bucket": bucket},
+                                        follow=True)
+        self.assertEqual(response.status_code, 200)
+        self.assertEqual(len(callbacks), 1)
+        self.assertTemplateUsed(response, "monolith/repository_detail.html")
+        self.assertContains(response, name)
+        repository = response.context["object"]
+        self.assertEqual(repository.name, name)
+        event = post_event.call_args_list[0].args[0]
+        self.assertIsInstance(event, AuditEvent)
+        self.assertEqual(
+            event.payload,
+            {"action": "created",
+             "object": {
+                 "model": "monolith.repository",
+                 "pk": str(repository.pk),
+                 "new_value": {
+                     "pk": repository.pk,
+                     "name": name,
+                     "backend": "S3",
+                     "backend_kwargs": {"bucket": bucket},
+                     "created_at": repository.created_at,
+                     "updated_at": repository.updated_at,
+                 }
+             }}
+        )
+        metadata = event.metadata.serialize()
+        self.assertEqual(metadata["objects"], {"monolith_repository": [str(repository.pk)]})
+        self.assertEqual(sorted(metadata["tags"]), ["monolith", "zentral"])
+        send_notification.assert_called_once_with("monolith.repository", str(repository.pk))
+        repository_backend = load_repository_backend(repository)
+        self.assertIsInstance(repository_backend, S3Repository)
+        self.assertEqual(repository_backend.signature_version, "s3v4")
+
+    @patch("base.notifier.Notifier.send_notification")
+    @patch("zentral.core.queues.backends.kombu.EventQueues.post_event")
+    def test_create_virtual_repository(self, post_event, send_notification):
+        self._login("monolith.add_repository", "monolith.view_repository")
+        name = get_random_string(12)
+        with self.captureOnCommitCallbacks(execute=True) as callbacks:
+            response = self.client.post(reverse("monolith:create_repository"),
+                                        {"r-name": name,
+                                         "r-backend": "VIRTUAL"},
+                                        follow=True)
+        self.assertEqual(response.status_code, 200)
+        self.assertEqual(len(callbacks), 1)
+        self.assertTemplateUsed(response, "monolith/repository_detail.html")
+        self.assertContains(response, name)
+        repository = response.context["object"]
+        self.assertEqual(repository.name, name)
+        event = post_event.call_args_list[0].args[0]
+        self.assertIsInstance(event, AuditEvent)
+        self.assertEqual(
+            event.payload,
+            {"action": "created",
+             "object": {
+                 "model": "monolith.repository",
+                 "pk": str(repository.pk),
+                 "new_value": {
+                     "pk": repository.pk,
+                     "name": name,
+                     "backend": "VIRTUAL",
+                     "backend_kwargs": {},
+                     "created_at": repository.created_at,
+                     "updated_at": repository.updated_at,
+                 }
+             }}
+        )
+        metadata = event.metadata.serialize()
+        self.assertEqual(metadata["objects"], {"monolith_repository": [str(repository.pk)]})
+        self.assertEqual(sorted(metadata["tags"]), ["monolith", "zentral"])
+        send_notification.assert_called_once_with("monolith.repository", str(repository.pk))
+        repository_backend = load_repository_backend(repository)
+        self.assertIsInstance(repository_backend, VirtualRepository)
+
+    # repository
+
+    def test_repository_redirect(self):
+        repository = force_repository()
+        self._login_redirect(reverse("monolith:repository", args=(repository.pk,)))
+
+    def test_repository_permission_denied(self):
+        repository = force_repository()
+        self._login()
+        response = self.client.get(reverse("monolith:repository", args=(repository.pk,)))
+        self.assertEqual(response.status_code, 403)
+
+    def test_repository_get_no_edit_link(self):
+        repository = force_repository()
+        self._login("monolith.view_repository")
+        response = self.client.get(reverse("monolith:repository", args=(repository.pk,)))
+        self.assertEqual(response.status_code, 200)
+        self.assertTemplateUsed(response, "monolith/repository_detail.html")
+        self.assertContains(response, repository.name)
+        self.assertNotContains(response, reverse("monolith:update_repository", args=(repository.pk,)))
+
+    def test_repository_get_edit_link(self):
+        repository = force_repository()
+        self._login("monolith.view_repository", "monolith.change_repository")
+        response = self.client.get(reverse("monolith:repository", args=(repository.pk,)))
+        self.assertEqual(response.status_code, 200)
+        self.assertTemplateUsed(response, "monolith/repository_detail.html")
+        self.assertContains(response, repository.name)
+        self.assertContains(response, reverse("monolith:update_repository", args=(repository.pk,)))
+
+    # update repository
+
+    def test_update_repository_redirect(self):
+        repository = force_repository()
+        self._login_redirect(reverse("monolith:update_repository", args=(repository.pk,)))
+
+    def test_update_repository_permission_denied(self):
+        repository = force_repository()
+        self._login()
+        response = self.client.get(reverse("monolith:update_repository", args=(repository.pk,)))
+        self.assertEqual(response.status_code, 403)
+
+    def test_update_repository_get(self):
+        repository = force_repository()
+        self._login("monolith.change_repository")
+        response = self.client.get(reverse("monolith:update_repository", args=(repository.pk,)))
+        self.assertEqual(response.status_code, 200)
+        self.assertTemplateUsed(response, "monolith/repository_form.html")
+
+    def test_update_s3_repository_bad_mbu(self):
+        repository = force_repository()
+        manifest = force_manifest()
+        self.assertIsNone(repository.meta_business_unit)
+        self.assertNotEqual(manifest.meta_business_unit, self.mbu)
+        force_catalog(repository=repository, manifest=manifest)
+        self._login("monolith.change_repository")
+        response = self.client.post(reverse("monolith:update_repository", args=(repository.pk,)),
+                                    {"r-name": get_random_string(12),
+                                     "r-meta_business_unit": self.mbu.pk,
+                                     "r-backend": "S3",
+                                     "s3-bucket": get_random_string(12)})
+        self.assertEqual(response.status_code, 200)
+        self.assertTemplateUsed(response, "monolith/repository_form.html")
+        self.assertFormError(
+            response.context["form"], "meta_business_unit",
+            f"Repository linked to manifest '{manifest}' which has a different business unit."
+        )
+
+    @patch("base.notifier.Notifier.send_notification")
+    @patch("zentral.core.queues.backends.kombu.EventQueues.post_event")
+    def test_update_s3_repository(self, post_event, send_notification):
+        repository = force_repository()
+        manifest = force_manifest(mbu=self.mbu)
+        self.assertEqual(manifest.version, 1)
+        # two catalogs, only one manifest version bump!
+        force_catalog(repository=repository, manifest=manifest)
+        force_catalog(repository=repository, manifest=manifest)
+        prev_value = repository.serialize_for_event()
+        new_name = get_random_string(12)
+        new_bucket = get_random_string(12)
+        self._login("monolith.change_repository", "monolith.view_repository")
+        with self.captureOnCommitCallbacks(execute=True) as callbacks:
+            response = self.client.post(reverse("monolith:update_repository", args=(repository.pk,)),
+                                        {"r-name": new_name,
+                                         "r-meta_business_unit": self.mbu.pk,
+                                         "r-backend": "S3",
+                                         "s3-bucket": new_bucket,
+                                         "s3-region_name": "us-east2",
+                                         "s3-prefix": "prefix",
+                                         "s3-access_key_id": "11111111111111111111",
+                                         "s3-secret_access_key": "22222222222222222222",
+                                         "s3-assume_role_arn": "arn:aws:iam::123456789012:role/S3Access",
+                                         "s3-signature_version": "s3v2",
+                                         "s3-endpoint_url": "https://endpoint.example.com",
+                                         "s3-cloudfront_domain": "yada.cloudfront.net",
+                                         "s3-cloudfront_key_id": "YADA",
+                                         "s3-cloudfront_privkey_pem": CLOUDFRONT_PRIVKEY_PEM},
+                                        follow=True)
+        self.assertEqual(response.status_code, 200)
+        self.assertEqual(len(callbacks), 1)
+        self.assertTemplateUsed(response, "monolith/repository_detail.html")
+        self.assertContains(response, new_name)
+        repository = response.context["object"]
+        self.assertEqual(repository.name, new_name)
+        event = post_event.call_args_list[0].args[0]
+        self.assertIsInstance(event, AuditEvent)
+        self.assertEqual(
+            event.payload,
+            {"action": "updated",
+             "object": {
+                 "model": "monolith.repository",
+                 "pk": str(repository.pk),
+                 "prev_value": prev_value,
+                 "new_value": {
+                     "pk": repository.pk,
+                     "name": new_name,
+                     "meta_business_unit": {"pk": self.mbu.pk, "name": self.mbu.name},
+                     "backend": "S3",
+                     "backend_kwargs": {
+                         "access_key_id": "11111111111111111111",
+                         "assume_role_arn": "arn:aws:iam::123456789012:role/S3Access",
+                         "bucket": new_bucket,
+                         "cloudfront_domain": "yada.cloudfront.net",
+                         "cloudfront_key_id": "YADA",
+                         "cloudfront_privkey_pem_hash": "f42f0756e0d05ae8e6e63581e615d2d8"
+                                                        "04c0f79b9f6bfb3cb7cfc5e9b6fc6a8f",
+                         "endpoint_url": "https://endpoint.example.com",
+                         "prefix": "prefix",
+                         "region_name": "us-east2",
+                         "secret_access_key_hash": "d70d4cbd04b6a3140c2ee642a40820abeacef01117ea9ce209de7c72452abe21",
+                         "signature_version": "s3v2",
+                     },
+                     "created_at": repository.created_at,
"created_at": repository.created_at, + "updated_at": repository.updated_at, + } + }} + ) + metadata = event.metadata.serialize() + self.assertEqual(metadata["objects"], {"monolith_repository": [str(repository.pk)]}) + self.assertEqual(sorted(metadata["tags"]), ["monolith", "zentral"]) + send_notification.assert_called_once_with("monolith.repository", str(repository.pk)) + repository_backend = load_repository_backend(repository) + self.assertEqual(repository_backend.name, new_name) + self.assertEqual(repository_backend.bucket, new_bucket) + self.assertEqual(repository_backend.region_name, "us-east2") + self.assertEqual(repository_backend.prefix, "prefix") + self.assertEqual( + repository_backend.credentials, + {'aws_access_key_id': '11111111111111111111', + 'aws_secret_access_key': '22222222222222222222'} ) - return submanifest, submanifest_pkginfo - - def _force_pkg_info_name(self): - return PkgInfoName.objects.create(name=get_random_string(12)) - - def _force_pkg_info(self, local=True, version="1.0", archived=False, alles=False): - pkg_info_name = self._force_pkg_info_name() - data = {"name": pkg_info_name.name, - "version": version} - pi = PkgInfo.objects.create( - name=pkg_info_name, version=version, local=local, - archived_at=datetime.utcnow() if archived else None, - data=data + self.assertEqual( + repository_backend.assume_role_arn, + "arn:aws:iam::123456789012:role/S3Access", ) - if alles: - pkg_info_category = PkgInfoCategory.objects.create(name=get_random_string(12)) - pi.catalogs.add(Catalog.objects.create(name=get_random_string(12))) - pi.category = pkg_info_category - data["category"] = pkg_info_category.name - pkg_info_name_required = self._force_pkg_info_name() - pi.requires.add(pkg_info_name_required) - data["requires"] = [pkg_info_name_required.name] - pkg_info_name_update_for = self._force_pkg_info_name() - pi.update_for.add(pkg_info_name_update_for) - data["update_for"] = [pkg_info_name_update_for.name] - data.update({ - 'display_name': get_random_string(12), - 'description': get_random_string(12), - 'autoremove': False, - 'installed_size': 111, - 'installer_item_hash': get_random_string(64, allowed_chars="0123456789abcdef"), - 'installer_item_location': get_random_string(12), - 'installer_item_size': 55, - 'minimum_os_version': '10.11.0', - 'receipts': [{ - 'installed_size': 111, - 'packageid': 'io.zentral.{}'.format(slugify(pkg_info_name.name)), - 'version': version - }], - 'unattended_install': True, - 'unattended_uninstall': True, - 'uninstall_method': 'removepackages', - 'uninstallable': True, - 'version': version, - 'zentral_monolith': { - "excluded_tag": [Tag.objects.create(name=get_random_string(12)).name], - "shards": { - "modulo": 5, - "default": 2, - "tags": {Tag.objects.create(name=get_random_string(12)).name: 5} - } - } - }) - pi.save() - return pi + self.assertEqual(repository_backend.signature_version, "s3v2") + self.assertEqual(repository_backend.endpoint_url, "https://endpoint.example.com") + self.assertEqual(repository_backend.cloudfront_domain, "yada.cloudfront.net") + self.assertEqual(repository_backend.cloudfront_key_id, "YADA") + self.assertEqual(repository_backend.cloudfront_privkey_pem, CLOUDFRONT_PRIVKEY_PEM) + manifest.refresh_from_db() + self.assertEqual(manifest.version, 2) # only one bump + + # delete repository + + def test_delete_repository_redirect(self): + repository = force_repository() + self._login_redirect(reverse("monolith:delete_repository", args=(repository.pk,))) + + def test_delete_repository_permission_denied(self): + repository 
= force_repository() + self._login() + response = self.client.get(reverse("monolith:delete_repository", args=(repository.pk,))) + self.assertEqual(response.status_code, 403) + + def test_delete_repository_get_not_found(self): + repository = force_repository() + manifest = force_manifest() + force_catalog(repository=repository, manifest=manifest) + self._login("monolith.delete_repository") + response = self.client.get(reverse("monolith:delete_repository", args=(repository.pk,))) + self.assertEqual(response.status_code, 404) + + def test_delete_repository_get(self): + repository = force_repository() + self._login("monolith.delete_repository") + response = self.client.get(reverse("monolith:delete_repository", args=(repository.pk,))) + self.assertEqual(response.status_code, 200) + self.assertTemplateUsed(response, "monolith/repository_confirm_delete.html") + + @patch("base.notifier.Notifier.send_notification") + @patch("zentral.core.queues.backends.kombu.EventQueues.post_event") + def test_delete_repository(self, post_event, send_notification): + repository = force_repository() + prev_value = repository.serialize_for_event() + self._login("monolith.delete_repository", "monolith.view_repository") + with self.captureOnCommitCallbacks(execute=True) as callbacks: + response = self.client.post(reverse("monolith:delete_repository", args=(repository.pk,)), follow=True) + self.assertEqual(response.status_code, 200) + self.assertEqual(len(callbacks), 1) + self.assertTemplateUsed(response, "monolith/repository_list.html") + self.assertNotContains(response, repository.name) + event = post_event.call_args_list[0].args[0] + self.assertIsInstance(event, AuditEvent) + self.assertEqual( + event.payload, + {"action": "deleted", + "object": { + "model": "monolith.repository", + "pk": str(repository.pk), + "prev_value": prev_value, + }} + ) + metadata = event.metadata.serialize() + self.assertEqual(metadata["objects"], {"monolith_repository": [str(repository.pk)]}) + self.assertEqual(sorted(metadata["tags"]), ["monolith", "zentral"]) + send_notification.assert_called_once_with("monolith.repository", str(repository.pk)) # pkg infos @@ -169,18 +516,18 @@ def test_pkg_infos(self): self._login("monolith.view_pkginfo") response = self.client.get(reverse("monolith:pkg_infos")) self.assertEqual(response.status_code, 200) - self.assertTemplateUsed(response, "monolith/pkg_info_list.html") + self.assertTemplateUsed(response, "monolith/pkginfo_list.html") self.assertContains(response, self.pkginfo_name_1.name) def test_pkg_infos_search(self): self._login("monolith.view_pkginfo") response = self.client.get(reverse("monolith:pkg_infos")) self.assertEqual(response.status_code, 200) - self.assertTemplateUsed(response, "monolith/pkg_info_list.html") + self.assertTemplateUsed(response, "monolith/pkginfo_list.html") self.assertContains(response, self.pkginfo_name_1.name) response = self.client.get(reverse("monolith:pkg_infos"), {"name": "does not exists"}) self.assertEqual(response.status_code, 200) - self.assertTemplateUsed(response, "monolith/pkg_info_list.html") + self.assertTemplateUsed(response, "monolith/pkginfo_list.html") self.assertContains(response, "We didn't find any item related to your search") self.assertContains(response, reverse("monolith:pkg_infos") + '">all the items') @@ -312,14 +659,14 @@ def test_delete_pkg_info_name_404(self): @patch("zentral.core.queues.backends.kombu.EventQueues.post_event") def test_delete_pkg_info_name(self, post_event): self._login("monolith.delete_pkginfoname", 
"monolith.view_pkginfo") - pkg_info_name = self._force_pkg_info_name() + pkg_info_name = force_name() prev_pk = pkg_info_name.pk with self.captureOnCommitCallbacks(execute=True) as callbacks: response = self.client.post(reverse("monolith:delete_pkg_info_name", args=(pkg_info_name.pk,)), follow=True) self.assertEqual(response.status_code, 200) self.assertEqual(len(callbacks), 1) - self.assertTemplateUsed(response, "monolith/pkg_info_list.html") + self.assertTemplateUsed(response, "monolith/pkginfo_list.html") self.assertNotContains(response, pkg_info_name.name) event = post_event.call_args_list[0].args[0] self.assertIsInstance(event, AuditEvent) @@ -352,7 +699,7 @@ def test_upload_package_permission_denied(self): def test_upload_package_get_no_name(self): self._login("monolith.add_pkginfo") - pkg_info_name = self._force_pkg_info_name() + pkg_info_name = force_name() response = self.client.get(reverse("monolith:upload_package")) self.assertEqual(response.status_code, 200) self.assertTemplateUsed(response, "monolith/package_form.html") @@ -362,25 +709,99 @@ def test_upload_package_get_no_name(self): def test_upload_package_get_name(self): self._login("monolith.add_pkginfo") - pkg_info_name = self._force_pkg_info_name() + pkg_info_name = force_name() response = self.client.get(reverse("monolith:upload_package"), {"pin_id": pkg_info_name.pk}) self.assertEqual(response.status_code, 200) self.assertTemplateUsed(response, "monolith/package_form.html") self.assertContains(response, "Upload package") self.assertNotIn("name", response.context["form"].fields) + def test_upload_package_catalog_different_repository(self): + self._login("monolith.add_pkginfo") + pkg_info_name = force_name() + file = BytesIO(build_dummy_package()) + file.name = "test123.pkg" + catalog = force_catalog(repository=force_repository(virtual=True)) + catalog2 = force_catalog(repository=force_repository(virtual=True)) + response = self.client.post( + reverse("monolith:upload_package"), + {"file": file, + "name": pkg_info_name.pk, + "catalogs": [catalog.pk, catalog2.pk]}, + follow=True + ) + self.assertEqual(response.status_code, 200) + self.assertTemplateUsed(response, "monolith/package_form.html") + self.assertFormError( + response.context["form"], "catalogs", + "The catalogs must be from the same repository." + ) + + def test_upload_package_catalog_category_different_repository(self): + self._login("monolith.add_pkginfo") + pkg_info_name = force_name() + file = BytesIO(build_dummy_package()) + file.name = "test123.pkg" + catalog = force_catalog(repository=force_repository(virtual=True)) + category = force_category(repository=force_repository(virtual=True)) + response = self.client.post( + reverse("monolith:upload_package"), + {"file": file, + "name": pkg_info_name.pk, + "category": category.pk, + "catalogs": [catalog.pk]}, + follow=True + ) + self.assertEqual(response.status_code, 200) + self.assertTemplateUsed(response, "monolith/package_form.html") + self.assertFormError( + response.context["form"], "category", + "The category must be from the same repository as the catalogs." 
+ ) + + def test_upload_package_catalog_category_wrong_choices(self): + self._login("monolith.add_pkginfo") + pkg_info_name = force_name() + file = BytesIO(build_dummy_package()) + file.name = "test123.pkg" + repository = force_repository(virtual=False) + catalog = force_catalog(repository=repository) + category = force_category(repository=repository) + response = self.client.post( + reverse("monolith:upload_package"), + {"file": file, + "name": pkg_info_name.pk, + "category": category.pk, + "catalogs": [catalog.pk]}, + follow=True + ) + self.assertEqual(response.status_code, 200) + self.assertTemplateUsed(response, "monolith/package_form.html") + self.assertFormError( + response.context["form"], "catalogs", + f"Select a valid choice. {catalog.pk} is not one of the available choices." + ) + self.assertFormError( + response.context["form"], "category", + "Select a valid choice. That choice is not one of the available choices." + ) + @patch("zentral.core.queues.backends.kombu.EventQueues.post_event") def test_upload_package(self, post_event): self._login("monolith.add_pkginfo", "monolith.view_pkginfoname", "monolith.view_pkginfo") - pkg_info_name = self._force_pkg_info_name() + pkg_info_name = force_name() file = BytesIO(build_dummy_package()) file.name = "test123.pkg" + repository = force_repository(virtual=True) + catalog = force_catalog(repository=repository) + category = force_category(repository=repository) with self.captureOnCommitCallbacks(execute=True) as callbacks: response = self.client.post( reverse("monolith:upload_package"), {"file": file, "name": pkg_info_name.pk, - "catalogs": [self.catalog_1.pk]}, + "category": category.pk, + "catalogs": [catalog.pk]}, follow=True ) self.assertEqual(response.status_code, 200) @@ -412,7 +833,10 @@ def test_upload_package(self, post_event): "pk": pkg_info.pk, "local": True, "name": pkg_info_name.name, - "catalogs": [{"pk": self.catalog_1.pk, "name": self.catalog_1.name}], + "category": {"pk": category.pk, "name": category.name, + "repository": {"pk": repository.pk, "name": repository.name}}, + "catalogs": [{"pk": catalog.pk, "name": catalog.name, + "repository": {"pk": repository.pk, "name": repository.name}}], "requires": [], "update_for": [], "version": "1.0", @@ -434,10 +858,12 @@ def test_upload_package(self, post_event): @patch("zentral.core.queues.backends.kombu.EventQueues.post_event") def test_upload_package_from_pkg_info_name(self, post_event): self._login("monolith.add_pkginfo", "monolith.view_pkginfoname", "monolith.view_pkginfo") - pkg_info_category = PkgInfoCategory.objects.create(name=get_random_string(12)) - pkg_info_name = self._force_pkg_info_name() - pkg_info_name_required = self._force_pkg_info_name() - pkg_info_name_update_for = self._force_pkg_info_name() + repository = force_repository(virtual=True) + catalog = force_catalog(repository=repository) + pkg_info_category = force_category(repository=repository) + pkg_info_name = force_name() + pkg_info_name_required = force_name() + pkg_info_name_update_for = force_name() excluded_tag = Tag.objects.create(name=get_random_string(12)) shard_tag = Tag.objects.create(name=get_random_string(12)) file = BytesIO(build_dummy_package()) @@ -448,7 +874,7 @@ def test_upload_package_from_pkg_info_name(self, post_event): {"file": file, "display_name": "Yolo", "description": "Fomo", - "catalogs": [self.catalog_1.pk], + "catalogs": [catalog.pk], "category": pkg_info_category.pk, "requires": [pkg_info_name_required.pk], "update_for": [pkg_info_name_update_for.pk], @@ -510,8 +936,12 @@ def 
test_upload_package_from_pkg_info_name(self, post_event): "pk": pkg_info.pk, "local": True, "name": pkg_info_name.name, - "category": {"name": pkg_info_category.name, "pk": pkg_info_category.pk}, - "catalogs": [{"pk": self.catalog_1.pk, "name": self.catalog_1.name}], + "category": {"name": pkg_info_category.name, "pk": pkg_info_category.pk, + "repository": {"pk": repository.pk, + "name": repository.name}}, + "catalogs": [{"pk": catalog.pk, "name": catalog.name, + "repository": {"pk": repository.pk, + "name": repository.name}}], "data": data, "requires": [pkg_info_name_required.name], "update_for": [pkg_info_name_update_for.name], @@ -532,7 +962,7 @@ def test_upload_package_from_pkg_info_name(self, post_event): def test_upload_package_conflict(self): self._login("monolith.add_pkginfo") - pkg_info = self._force_pkg_info() + pkg_info = force_pkg_info() file = BytesIO(build_dummy_package(name=pkg_info.name.name, version=pkg_info.version)) file.name = "{}.pkg".format(get_random_string(12)) response = self.client.post( @@ -552,16 +982,18 @@ def test_upload_package_conflict(self): @patch("zentral.core.queues.backends.kombu.EventQueues.post_event") def test_upload_package_existing_archived_package(self, post_event): self._login("monolith.add_pkginfo", "monolith.view_pkginfoname", "monolith.view_pkginfo") - pkg_info = self._force_pkg_info(archived=True) + pkg_info = force_pkg_info(archived=True) pkg_info_name = pkg_info.name file = BytesIO(build_dummy_package(pkg_info_name.name, pkg_info.version)) file.name = "{}.pkg".format(get_random_string(12)) + repository = force_repository(virtual=True) + catalog = force_catalog(repository=repository) with self.captureOnCommitCallbacks(execute=True) as callbacks: response = self.client.post( reverse("monolith:upload_package"), {"file": file, "name": pkg_info_name.pk, - "catalogs": [self.catalog_1.pk]}, + "catalogs": [catalog.pk]}, follow=True ) self.assertEqual(response.status_code, 200) @@ -594,7 +1026,9 @@ def test_upload_package_existing_archived_package(self, post_event): "pk": pkg_info.pk, "local": True, "name": pkg_info_name.name, - "catalogs": [{"pk": self.catalog_1.pk, "name": self.catalog_1.name}], + "catalogs": [{"pk": catalog.pk, "name": catalog.name, + "repository": {"pk": catalog.repository.pk, + "name": catalog.repository.name}}], "requires": [], "update_for": [], "version": "1.0", @@ -616,17 +1050,17 @@ def test_upload_package_existing_archived_package(self, post_event): # update package def test_update_package_login_redirect(self): - pkg_info = self._force_pkg_info() + pkg_info = force_pkg_info() self._login_redirect(reverse("monolith:update_package", args=(pkg_info.pk,))) def test_update_package_permission_denied(self): - pkg_info = self._force_pkg_info() + pkg_info = force_pkg_info() self._login() response = self.client.get(reverse("monolith:update_package", args=(pkg_info.pk,))) self.assertEqual(response.status_code, 403) def test_update_package_get_no_name(self): - pkg_info = self._force_pkg_info() + pkg_info = force_pkg_info() self._login("monolith.change_pkginfo") response = self.client.get(reverse("monolith:update_package", args=(pkg_info.pk,))) self.assertEqual(response.status_code, 200) @@ -636,13 +1070,15 @@ def test_update_package_get_no_name(self): @patch("zentral.core.queues.backends.kombu.EventQueues.post_event") def test_update_package(self, post_event): - pkg_info = self._force_pkg_info(alles=True) + pkg_info = force_pkg_info() prev_value = pkg_info.serialize_for_event() self._login("monolith.change_pkginfo", 
"monolith.view_pkginfo", "monolith.view_pkginfoname") + repository = force_repository(virtual=True) + catalog = force_catalog(repository=repository) with self.captureOnCommitCallbacks(execute=True) as callbacks: response = self.client.post( reverse("monolith:update_package", args=(pkg_info.pk,)), - {"catalogs": [self.catalog_1.pk]}, + {"catalogs": [catalog.pk]}, follow=True ) self.assertEqual(response.status_code, 200) @@ -653,13 +1089,12 @@ def test_update_package(self, post_event): event = post_event.call_args_list[0].args[0] self.assertIsInstance(event, AuditEvent) new_value = copy.deepcopy(prev_value) - del new_value["category"] new_value["updated_at"] = pkg_info.updated_at - new_value["catalogs"] = [{"pk": self.catalog_1.pk, "name": self.catalog_1.name}] + new_value["catalogs"] = [{"pk": catalog.pk, "name": catalog.name, + "repository": {"pk": repository.pk, + "name": repository.name}}] new_value["requires"] = [] new_value["update_for"] = [] - for key in ("zentral_monolith", "category", "display_name", "description", "requires", "update_for"): - del new_value["data"][key] self.assertEqual( event.payload, {"action": "updated", @@ -681,17 +1116,17 @@ def test_update_package(self, post_event): # delete pkg info def test_delete_pkg_info_login_redirect(self): - pkg_info = self._force_pkg_info() + pkg_info = force_pkg_info() self._login_redirect(reverse("monolith:delete_pkg_info", args=(pkg_info.pk,))) def test_delete_pkg_info_permission_denied(self): - pkg_info = self._force_pkg_info() + pkg_info = force_pkg_info() self._login() response = self.client.get(reverse("monolith:delete_pkg_info", args=(pkg_info.pk,))) self.assertEqual(response.status_code, 403) def test_delete_pkg_info_404(self): - pkg_info = self._force_pkg_info(local=False) + pkg_info = force_pkg_info(local=False) self._login("monolith.delete_pkginfo") response = self.client.post(reverse("monolith:delete_pkg_info", args=(pkg_info.pk,))) self.assertEqual(response.status_code, 404) @@ -699,7 +1134,7 @@ def test_delete_pkg_info_404(self): @patch("zentral.core.queues.backends.kombu.EventQueues.post_event") def test_delete_pkg_info(self, post_event): self._login("monolith.delete_pkginfo", "monolith.view_pkginfo", "monolith.view_pkginfoname") - pkg_info = self._force_pkg_info() + pkg_info = force_pkg_info() prev_value = pkg_info.serialize_for_event() with self.captureOnCommitCallbacks(execute=True) as callbacks: response = self.client.post(reverse("monolith:delete_pkg_info", args=(pkg_info.pk,)), @@ -746,17 +1181,17 @@ def test_catalogs(self): # catalog def test_catalog_login_redirect(self): - catalog = self._force_catalog() + catalog = force_catalog() self._login_redirect(reverse("monolith:catalog", args=(catalog.pk,))) def test_catalog_permission_denied(self): - catalog = self._force_catalog() + catalog = force_catalog() self._login() response = self.client.get(reverse("monolith:catalog", args=(catalog.pk,))) self.assertEqual(response.status_code, 403) def test_catalog(self): - catalog = self._force_catalog() + catalog = force_catalog() self._login("monolith.view_catalog") response = self.client.get(reverse("monolith:catalog", args=(catalog.pk,))) self.assertEqual(response.status_code, 200) @@ -765,38 +1200,43 @@ def test_catalog(self): # create catalog - def test_create_catalog_auto_permission_denied(self): - response = self.client.get(reverse("monolith:create_catalog")) - self.assertContains(response, "Automatic catalog management", status_code=403) - - @patch("zentral.contrib.monolith.views.monolith_conf.repository") - def 
test_create_catalog_login_redirect(self, repository): - repository.manual_catalog_management = True + def test_create_catalog_login_redirect(self): self._login_redirect(reverse("monolith:create_catalog")) - @patch("zentral.contrib.monolith.views.monolith_conf.repository") - def test_create_catalog_permission_denied(self, repository): - repository.manual_catalog_management = True + def test_create_catalog_permission_denied(self): self._login() response = self.client.get(reverse("monolith:create_catalog")) self.assertContains(response, "Forbidden", status_code=403) - @patch("zentral.contrib.monolith.views.monolith_conf.repository") + def test_create_catalog_not_virtual_repository(self): + repository = force_repository(virtual=False) + self._login("monolith.add_catalog") + response = self.client.post(reverse("monolith:create_catalog"), + {"repository": repository.pk, + "name": get_random_string(12)}, + follow=True) + self.assertEqual(response.status_code, 200) + self.assertTemplateUsed(response, "monolith/catalog_form.html") + self.assertFormError( + response.context["form"], "repository", + "Select a valid choice. That choice is not one of the available choices." + ) + @patch("zentral.core.queues.backends.kombu.EventQueues.post_event") - def test_create_catalog(self, post_event, repository): - repository.manual_catalog_management = True + def test_create_catalog(self, post_event): + repository = force_repository(virtual=True) self._login("monolith.add_catalog", "monolith.view_catalog") name = get_random_string(12) with self.captureOnCommitCallbacks(execute=True) as callbacks: response = self.client.post(reverse("monolith:create_catalog"), - {"name": name, "priority": 17}, + {"repository": repository.pk, + "name": name}, follow=True) self.assertEqual(response.status_code, 200) self.assertEqual(len(callbacks), 1) self.assertTemplateUsed(response, "monolith/catalog_detail.html") catalog = response.context["object"] self.assertEqual(catalog.name, name) - self.assertEqual(catalog.priority, 17) event = post_event.call_args_list[0].args[0] self.assertIsInstance(event, AuditEvent) self.assertEqual( @@ -814,89 +1254,60 @@ def test_create_catalog(self, post_event, repository): # update catalog - def test_update_catalog_auto_permission_denied(self): - catalog = self._force_catalog() - response = self.client.get(reverse("monolith:update_catalog", args=(catalog.pk,))) - self.assertContains(response, "Automatic catalog management", status_code=403) - - @patch("zentral.contrib.monolith.views.monolith_conf.repository") - def test_update_catalog_login_redirect(self, repository): - repository.manual_catalog_management = True - catalog = self._force_catalog() + def test_update_catalog_login_redirect(self): + repository = force_repository(virtual=True) + catalog = force_catalog(repository=repository) self._login_redirect(reverse("monolith:update_catalog", args=(catalog.pk,))) - @patch("zentral.contrib.monolith.views.monolith_conf.repository") - def test_update_catalog_permission_denied(self, repository): - repository.manual_catalog_management = True - catalog = self._force_catalog() + def test_update_catalog_permission_denied(self): + repository = force_repository(virtual=True) + catalog = force_catalog(repository=repository) self._login() response = self.client.get(reverse("monolith:update_catalog", args=(catalog.pk,))) self.assertContains(response, "Forbidden", status_code=403) - @patch("zentral.contrib.monolith.views.monolith_conf.repository") - 
@patch("zentral.core.queues.backends.kombu.EventQueues.post_event") - def test_update_catalog(self, post_event, repository): - repository.manual_catalog_management = True - catalog = self._force_catalog() - prev_value = catalog.serialize_for_event() - self._login("monolith.change_catalog", "monolith.view_catalog") - new_name = get_random_string(12) - with self.captureOnCommitCallbacks(execute=True) as callbacks: - response = self.client.post(reverse("monolith:update_catalog", args=(catalog.pk,)), - {"name": new_name, "priority": 42}, - follow=True) + def test_update_catalog_not_virtual(self): + repository = force_repository(virtual=False) + catalog = force_catalog(repository=repository) + self._login("monolith.change_catalog") + response = self.client.get(reverse("monolith:update_catalog", args=(catalog.pk,))) + self.assertEqual(response.status_code, 404) + + def test_update_catalog_bad_mbu(self): + manifest = force_manifest() + repository = force_repository(mbu=manifest.meta_business_unit, virtual=True) + catalog = force_catalog(repository=repository, manifest=manifest) + new_repository = force_repository(mbu=MetaBusinessUnit.objects.create(name=get_random_string(12)), + virtual=True) + self._login("monolith.change_catalog") + response = self.client.post(reverse("monolith:update_catalog", args=(catalog.pk,)), + {"repository": new_repository.pk, + "name": catalog.name}) self.assertEqual(response.status_code, 200) - self.assertEqual(len(callbacks), 1) - self.assertTemplateUsed(response, "monolith/catalog_detail.html") - self.assertEqual(catalog, response.context["object"]) - catalog.refresh_from_db() - self.assertEqual(catalog.name, new_name) - self.assertEqual(catalog.priority, 42) - event = post_event.call_args_list[0].args[0] - self.assertIsInstance(event, AuditEvent) - self.assertEqual( - event.payload, - {"action": "updated", - "object": { - "model": "monolith.catalog", - "pk": str(catalog.pk), - "prev_value": prev_value, - "new_value": catalog.serialize_for_event(), - }} + self.assertTemplateUsed(response, "monolith/catalog_form.html") + self.assertFormError( + response.context["form"], "repository", + "This catalog is included in manifests linked to different business units than this repository." 
) - metadata = event.metadata.serialize() - self.assertEqual(metadata["objects"], {"monolith_catalog": [str(catalog.pk)]}) - self.assertEqual(sorted(metadata["tags"]), ["monolith", "zentral"]) - - # update catalog priority - - def test_update_catalog_priority_login_redirect(self): - catalog = self._force_catalog() - self._login_redirect(reverse("monolith:update_catalog_priority", args=(catalog.pk,))) - - def test_update_catalog_priority_permission_denied(self): - catalog = self._force_catalog() - self._login() - response = self.client.get(reverse("monolith:update_catalog_priority", args=(catalog.pk,))) - self.assertContains(response, "Forbidden", status_code=403) @patch("zentral.core.queues.backends.kombu.EventQueues.post_event") - def test_update_catalog_priority(self, post_event): - catalog = self._force_catalog() + def test_update_catalog(self, post_event): + repository = force_repository(virtual=True) + catalog = force_catalog(repository=repository) prev_value = catalog.serialize_for_event() self._login("monolith.change_catalog", "monolith.view_catalog") new_name = get_random_string(12) with self.captureOnCommitCallbacks(execute=True) as callbacks: - response = self.client.post(reverse("monolith:update_catalog_priority", args=(catalog.pk,)), - {"name": new_name, "priority": 43}, + response = self.client.post(reverse("monolith:update_catalog", args=(catalog.pk,)), + {"repository": repository.pk, + "name": new_name}, follow=True) self.assertEqual(response.status_code, 200) self.assertEqual(len(callbacks), 1) self.assertTemplateUsed(response, "monolith/catalog_detail.html") self.assertEqual(catalog, response.context["object"]) catalog.refresh_from_db() - self.assertEqual(catalog.name, prev_value["name"]) # not updated - self.assertEqual(catalog.priority, 43) + self.assertEqual(catalog.name, new_name) event = post_event.call_args_list[0].args[0] self.assertIsInstance(event, AuditEvent) self.assertEqual( @@ -915,37 +1326,27 @@ def test_update_catalog_priority(self, post_event): # delete catalog - def test_delete_catalog_auto_permission_denied(self): - catalog = self._force_catalog() - response = self.client.get(reverse("monolith:delete_catalog", args=(catalog.pk,))) - self.assertContains(response, "Automatic catalog management", status_code=403) - - @patch("zentral.contrib.monolith.views.monolith_conf.repository") - def test_delete_catalog_login_redirect(self, repository): - repository.manual_catalog_management = True - catalog = self._force_catalog() + def test_delete_catalog_login_redirect(self): + repository = force_repository(virtual=True) + catalog = force_catalog(repository=repository) self._login_redirect(reverse("monolith:delete_catalog", args=(catalog.pk,))) - @patch("zentral.contrib.monolith.views.monolith_conf.repository") - def test_delete_catalog_permission_denied(self, repository): - repository.manual_catalog_management = True - catalog = self._force_catalog() + def test_delete_catalog_permission_denied(self): + repository = force_repository(virtual=True) + catalog = force_catalog(repository=repository) self._login() response = self.client.get(reverse("monolith:delete_catalog", args=(catalog.pk,))) self.assertContains(response, "Forbidden", status_code=403) - @patch("zentral.contrib.monolith.views.monolith_conf.repository") - def test_delete_catalog_cannot_be_deleted(self, repository): - repository.manual_catalog_management = True + def test_delete_catalog_cannot_be_deleted(self): self._login("monolith.delete_catalog") response = 
self.client.get(reverse("monolith:delete_catalog", args=(self.catalog_1.pk,))) self.assertEqual(response.status_code, 404) - @patch("zentral.contrib.monolith.views.monolith_conf.repository") @patch("zentral.core.queues.backends.kombu.EventQueues.post_event") - def test_delete_catalog(self, post_event, repository): - repository.manual_catalog_management = True - catalog = self._force_catalog() + def test_delete_catalog(self, post_event): + repository = force_repository(virtual=True) + catalog = force_catalog(repository=repository) prev_pk = catalog.pk prev_value = catalog.serialize_for_event() self._login("monolith.delete_catalog", "monolith.view_catalog") @@ -1018,8 +1419,8 @@ def test_condition_with_delete(self): self.assertContains(response, reverse("monolith:delete_condition", args=(condition.pk,))) def test_condition_cannot_delete(self): - condition = self._force_condition() - submanifest, _ = self._force_sub_manifest(condition=condition) + condition = force_condition() + force_sub_manifest_pkg_info(condition=condition) self._login("monolith.view_condition", "monolith.delete_condition") response = self.client.get(reverse("monolith:condition", args=(condition.pk,))) self.assertEqual(response.status_code, 200) @@ -1070,11 +1471,11 @@ def test_create_condition(self, post_event): # update condition def test_update_condition_login_redirect(self): - condition = self._force_condition() + condition = force_condition() self._login_redirect(reverse("monolith:update_condition", args=(condition.pk,))) def test_update_condition_permission_denied(self): - condition = self._force_condition() + condition = force_condition() self._login() response = self.client.get(reverse("monolith:update_condition", args=(condition.pk,))) self.assertEqual(response.status_code, 403) @@ -1083,10 +1484,10 @@ def test_update_condition_permission_denied(self): def test_update_condition(self, post_event): condition = Condition.objects.create(name=get_random_string(12), predicate='machine_type == "laptop"') prev_value = condition.serialize_for_event() - sub_manifest, _ = self._force_sub_manifest(condition=condition) - manifest = self._force_manifest() + manifest = force_manifest() self.assertEqual(manifest.version, 1) - ManifestSubManifest.objects.create(manifest=manifest, sub_manifest=sub_manifest) + sub_manifest = force_sub_manifest(manifest=manifest) + force_sub_manifest_pkg_info(sub_manifest=sub_manifest, condition=condition) self._login("monolith.change_condition", "monolith.view_condition") new_name = get_random_string(12) new_predicate = 'machine_type == "desktop"' @@ -1139,8 +1540,8 @@ def test_delete_condition_get(self): self.assertTemplateUsed(response, "monolith/condition_confirm_delete.html") def test_delete_condition_cannot_delete(self): - condition = self._force_condition() - submanifest, _ = self._force_sub_manifest(condition=condition) + condition = force_condition() + force_sub_manifest_pkg_info(condition=condition) self._login("monolith.delete_condition") response = self.client.get(reverse("monolith:delete_condition", args=(condition.pk,))) self.assertEqual(response.status_code, 404) @@ -1174,14 +1575,14 @@ def test_delete_condition_post(self, post_event): def test_delete_condition_get_blocked(self): condition = Condition.objects.create(name=get_random_string(12), predicate='machine_type == "laptop"') - submanifest, _ = self._force_sub_manifest(condition=condition) + force_sub_manifest_pkg_info(condition=condition) self._login("monolith.view_condition", "monolith.delete_condition") response = 
self.client.get(reverse("monolith:delete_condition", args=(condition.pk,)), follow=True) self.assertEqual(response.status_code, 404) def test_delete_condition_post_blocked(self): condition = Condition.objects.create(name=get_random_string(12), predicate='machine_type == "laptop"') - submanifest, _ = self._force_sub_manifest(condition=condition) + force_sub_manifest_pkg_info(condition=condition) self._login("monolith.view_condition", "monolith.delete_condition") response = self.client.post(reverse("monolith:delete_condition", args=(condition.pk,)), follow=True) self.assertEqual(response.status_code, 404) @@ -1198,10 +1599,10 @@ def test_sub_manifests_permission_denied(self): def test_sub_manifests(self): self._login("monolith.view_submanifest") - submanifest, _ = self._force_sub_manifest() + sub_manifest = force_sub_manifest() response = self.client.get(reverse("monolith:sub_manifests")) self.assertEqual(response.status_code, 200) - self.assertContains(response, submanifest.name) + self.assertContains(response, sub_manifest.name) def test_sub_manifests_search(self): self._login("monolith.view_submanifest") @@ -1209,13 +1610,13 @@ def test_sub_manifests_search(self): self.assertEqual(response.status_code, 200) self.assertTemplateUsed(response, "monolith/sub_manifest_list.html") self.assertNotContains(response, "We didn't find any item related to your search") - submanifest, _ = self._force_sub_manifest() + sub_manifest = force_sub_manifest() response = self.client.get(reverse("monolith:sub_manifests"), {"keywords": "does not exists"}) self.assertEqual(response.status_code, 200) self.assertTemplateUsed(response, "monolith/sub_manifest_list.html") self.assertContains(response, "We didn't find any item related to your search") self.assertContains(response, reverse("monolith:sub_manifests") + '">all the items') - response = self.client.get(reverse("monolith:sub_manifests"), {"keywords": submanifest.name}) + response = self.client.get(reverse("monolith:sub_manifests"), {"keywords": sub_manifest.name}) self.assertEqual(response.status_code, 200) self.assertTemplateUsed(response, "monolith/sub_manifest_list.html") self.assertNotContains(response, "We didn't find any item related to your search") @@ -1223,51 +1624,49 @@ def test_sub_manifests_search(self): # sub manifest def test_sub_manifest_login_redirect(self): - submanifest, _ = self._force_sub_manifest() - self._login_redirect(reverse("monolith:sub_manifest", args=(submanifest.pk,))) + sub_manifest = force_sub_manifest() + self._login_redirect(reverse("monolith:sub_manifest", args=(sub_manifest.pk,))) def test_sub_manifest_permission_denied(self): - submanifest, _ = self._force_sub_manifest() + sub_manifest = force_sub_manifest() self._login() - response = self.client.get(reverse("monolith:sub_manifest", args=(submanifest.pk,))) + response = self.client.get(reverse("monolith:sub_manifest", args=(sub_manifest.pk,))) self.assertEqual(response.status_code, 403) def test_sub_manifest_no_pkginfo_edit_link(self): - submanifest, submanifest_pkginfo = self._force_sub_manifest() + smpi = force_sub_manifest_pkg_info() self._login("monolith.view_submanifest") - response = self.client.get(reverse("monolith:sub_manifest", args=(submanifest.pk,))) + response = self.client.get(reverse("monolith:sub_manifest", args=(smpi.sub_manifest.pk,))) self.assertTemplateUsed(response, "monolith/sub_manifest.html") self.assertNotContains(response, 'class="danger"') self.assertNotContains( response, reverse("monolith:update_sub_manifest_pkg_info", - args=(submanifest.pk, 
submanifest_pkginfo.pk)) + args=(smpi.sub_manifest.pk, smpi.pk)) ) def test_sub_manifest_pkginfo_edit_link(self): - submanifest, submanifest_pkginfo = self._force_sub_manifest() + smpi = force_sub_manifest_pkg_info() self._login("monolith.view_submanifest", "monolith.change_submanifestpkginfo") - response = self.client.get(reverse("monolith:sub_manifest", args=(submanifest.pk,))) + response = self.client.get(reverse("monolith:sub_manifest", args=(smpi.sub_manifest.pk,))) self.assertTemplateUsed(response, "monolith/sub_manifest.html") self.assertNotContains(response, 'class="danger"') self.assertContains( response, reverse("monolith:update_sub_manifest_pkg_info", - args=(submanifest.pk, submanifest_pkginfo.pk)) + args=(smpi.sub_manifest.pk, smpi.pk)) ) def test_sub_manifest_pkginfo_archived_no_edit_link(self): - submanifest, submanifest_pkginfo = self._force_sub_manifest() - self.pkginfo_1_1.archived_at = datetime.utcnow() - self.pkginfo_1_1.save() + smpi = force_sub_manifest_pkg_info(archived=True) self._login("monolith.view_submanifest", "monolith.change_submanifestpkginfo") - response = self.client.get(reverse("monolith:sub_manifest", args=(submanifest.pk,))) + response = self.client.get(reverse("monolith:sub_manifest", args=(smpi.sub_manifest.pk,))) self.assertTemplateUsed(response, "monolith/sub_manifest.html") self.assertContains(response, 'class="data-row danger"') self.assertNotContains( response, reverse("monolith:update_sub_manifest_pkg_info", - args=(submanifest.pk, submanifest_pkginfo.pk)) + args=(smpi.sub_manifest.pk, smpi.pk)) ) # create submanifest @@ -1299,28 +1698,28 @@ def test_create_submanifest_post(self): # add submanifest pkginfo def test_add_sub_manifest_pkg_info_redirect(self): - submanifest, _ = self._force_sub_manifest() - self._login_redirect(reverse("monolith:sub_manifest_add_pkg_info", args=(submanifest.pk,))) + sub_manifest = force_sub_manifest() + self._login_redirect(reverse("monolith:sub_manifest_add_pkg_info", args=(sub_manifest.pk,))) def test_add_sub_manifest_pkg_info_permission_denied(self): - submanifest, _ = self._force_sub_manifest() + sub_manifest = force_sub_manifest() self._login() - response = self.client.get(reverse("monolith:sub_manifest_add_pkg_info", args=(submanifest.pk,))) + response = self.client.get(reverse("monolith:sub_manifest_add_pkg_info", args=(sub_manifest.pk,))) self.assertEqual(response.status_code, 403) def test_add_sub_manifest_pkg_info_get(self): - submanifest, _ = self._force_sub_manifest() + sub_manifest = force_sub_manifest() self._login("monolith.add_submanifestpkginfo") - response = self.client.get(reverse("monolith:sub_manifest_add_pkg_info", args=(submanifest.pk,))) + response = self.client.get(reverse("monolith:sub_manifest_add_pkg_info", args=(sub_manifest.pk,))) self.assertEqual(response.status_code, 200) self.assertTemplateUsed(response, "monolith/edit_sub_manifest_pkg_info.html") def test_add_sub_manifest_pkg_info_post_pkg_info_name_already_included(self): - submanifest, _ = self._force_sub_manifest() + smpi = force_sub_manifest_pkg_info() self._login("monolith.add_submanifestpkginfo") response = self.client.post( - reverse("monolith:sub_manifest_add_pkg_info", args=(submanifest.pk,)), - {"pkg_info_name": self.pkginfo_name_1.pk, + reverse("monolith:sub_manifest_add_pkg_info", args=(smpi.sub_manifest.pk,)), + {"pkg_info_name": smpi.pkg_info_name, "key": "managed_installs"}, follow=True, ) @@ -1332,14 +1731,15 @@ def test_add_sub_manifest_pkg_info_post_pkg_info_name_already_included(self): ) def 
test_add_sub_manifest_pkg_info_post_featured_item_error(self): - submanifest, _ = self._force_sub_manifest() + sub_manifest = force_sub_manifest() self._login("monolith.add_submanifestpkginfo") pkginfo_name = PkgInfoName.objects.create(name=get_random_string(12)) - PkgInfo.objects.create(name=pkginfo_name, version="1.0", + PkgInfo.objects.create(repository=force_repository(), + name=pkginfo_name, version="1.0", data={"name": pkginfo_name.name, "version": "1.0"}) response = self.client.post( - reverse("monolith:sub_manifest_add_pkg_info", args=(submanifest.pk,)), + reverse("monolith:sub_manifest_add_pkg_info", args=(sub_manifest.pk,)), {"pkg_info_name": pkginfo_name.pk, "key": "managed_installs", "featured_item": "on"}, @@ -1350,14 +1750,15 @@ def test_add_sub_manifest_pkg_info_post_featured_item_error(self): self.assertFormError(response.context["form"], "featured_item", "Only optional install items can be featured") def test_add_sub_manifest_pkg_info_post(self): - submanifest, _ = self._force_sub_manifest() + sub_manifest = force_sub_manifest() self._login("monolith.add_submanifestpkginfo", "monolith.view_submanifest") - pkginfo_name = PkgInfoName.objects.create(name=get_random_string(12)) - PkgInfo.objects.create(name=pkginfo_name, version="1.0", + pkginfo_name = force_name() + PkgInfo.objects.create(repository=force_repository(), + name=pkginfo_name, version="1.0", data={"name": pkginfo_name.name, "version": "1.0"}) response = self.client.post( - reverse("monolith:sub_manifest_add_pkg_info", args=(submanifest.pk,)), + reverse("monolith:sub_manifest_add_pkg_info", args=(sub_manifest.pk,)), {"pkg_info_name": pkginfo_name.pk, "key": "optional_installs", "featured_item": "on"}, @@ -1368,14 +1769,15 @@ def test_add_sub_manifest_pkg_info_post(self): self.assertContains(response, pkginfo_name.name) def test_add_default_install_sub_manifest_pkg_info_shard(self): - submanifest, _ = self._force_sub_manifest() + sub_manifest = force_sub_manifest() self._login("monolith.add_submanifestpkginfo", "monolith.view_submanifest") - pkginfo_name = PkgInfoName.objects.create(name=get_random_string(12)) - PkgInfo.objects.create(name=pkginfo_name, version="1.0", + pkginfo_name = force_name() + PkgInfo.objects.create(repository=force_repository(), + name=pkginfo_name, version="1.0", data={"name": pkginfo_name.name, "version": "1.0"}) response = self.client.post( - reverse("monolith:sub_manifest_add_pkg_info", args=(submanifest.pk,)), + reverse("monolith:sub_manifest_add_pkg_info", args=(sub_manifest.pk,)), {"pkg_info_name": pkginfo_name.pk, "key": "default_installs", "featured_item": "on", @@ -1386,43 +1788,44 @@ def test_add_default_install_sub_manifest_pkg_info_shard(self): self.assertEqual(response.status_code, 200) self.assertTemplateUsed(response, "monolith/sub_manifest.html") self.assertContains(response, pkginfo_name.name) - smpi = submanifest.submanifestpkginfo_set.get(pkg_info_name=pkginfo_name) + smpi = sub_manifest.submanifestpkginfo_set.get(pkg_info_name=pkginfo_name) self.assertEqual(smpi.options, {"shards": {"default": 90, "modulo": 100}}) # delete submanifest pkginfo def test_delete_sub_manifest_pkg_info_redirect(self): - submanifest, pkginfo = self._force_sub_manifest() - self._login_redirect(reverse("monolith:delete_sub_manifest_pkg_info", args=(submanifest.pk, pkginfo.pk))) + smpi = force_sub_manifest_pkg_info() + self._login_redirect(reverse("monolith:delete_sub_manifest_pkg_info", + args=(smpi.sub_manifest.pk, smpi.pk))) def test_delete_sub_manifest_pkg_info_permission_denied(self): - 
submanifest, pkginfo = self._force_sub_manifest() + smpi = force_sub_manifest_pkg_info() self._login() response = self.client.get(reverse("monolith:delete_sub_manifest_pkg_info", - args=(submanifest.pk, pkginfo.pk))) + args=(smpi.sub_manifest.pk, smpi.pk))) self.assertEqual(response.status_code, 403) def test_delete_sub_manifest_pkg_info_get(self): - submanifest, pkginfo = self._force_sub_manifest() + smpi = force_sub_manifest_pkg_info() self._login("monolith.delete_submanifestpkginfo", "monolith.view_submanifest") response = self.client.get(reverse("monolith:delete_sub_manifest_pkg_info", - args=(submanifest.pk, pkginfo.pk))) + args=(smpi.sub_manifest.pk, smpi.pk))) self.assertEqual(response.status_code, 200) self.assertTemplateUsed(response, "monolith/delete_sub_manifest_pkg_info.html") def test_delete_sub_manifest_pkg_info_post(self): - submanifest, pkginfo = self._force_sub_manifest() - manifest = self._force_manifest() + manifest = force_manifest() self.assertEqual(manifest.version, 1) - ManifestSubManifest.objects.create(manifest=manifest, sub_manifest=submanifest) + sub_manifest = force_sub_manifest(manifest=manifest) + smpi = force_sub_manifest_pkg_info(sub_manifest=sub_manifest) self._login("monolith.delete_submanifestpkginfo", "monolith.view_submanifest") response = self.client.post(reverse("monolith:delete_sub_manifest_pkg_info", - args=(submanifest.pk, pkginfo.pk)), + args=(sub_manifest.pk, smpi.pk)), follow=True) self.assertEqual(response.status_code, 200) self.assertTemplateUsed(response, "monolith/sub_manifest.html") - self.assertEqual(response.context["object"], submanifest) - self.assertEqual(submanifest.submanifestpkginfo_set.count(), 0) + self.assertEqual(response.context["object"], sub_manifest) + self.assertEqual(sub_manifest.submanifestpkginfo_set.count(), 0) manifest.refresh_from_db() self.assertEqual(manifest.version, 2) @@ -1481,7 +1884,7 @@ def test_manifest_search(self): self.assertEqual(response.status_code, 200) self.assertTemplateUsed(response, "monolith/manifest_list.html") self.assertNotContains(response, "We didn't find any item related to your search") - manifest = self._force_manifest() + manifest = force_manifest() response = self.client.get(reverse("monolith:manifests"), {"name": manifest.name}, follow=True) self.assertEqual(response.status_code, 200) self.assertTemplateUsed(response, "monolith/manifest.html") @@ -1543,18 +1946,18 @@ def test_create_manifest(self, post_event): # update manifest def test_update_manifest_login_redirect(self): - manifest = self._force_manifest() + manifest = force_manifest() self._login_redirect(reverse("monolith:update_manifest", args=(manifest.pk,))) def test_update_manifest_permission_denied(self): - manifest = self._force_manifest() + manifest = force_manifest() self._login() response = self.client.get(reverse("monolith:update_manifest", args=(manifest.pk,))) self.assertEqual(response.status_code, 403) @patch("zentral.core.queues.backends.kombu.EventQueues.post_event") def test_update_manifest(self, post_event): - manifest = self._force_manifest() + manifest = force_manifest() prev_value = manifest.serialize_for_event() self._login("monolith.change_manifest", "monolith.view_manifest") new_name = get_random_string(12) @@ -1619,7 +2022,9 @@ def test_manifest_machine_info(self): self.assertEqual( response.context["packages"], [('aaaa first name', - {'pkgsinfo': [({'name': 'aaaa first name', 'version': '1.0'}, + {'pkgsinfo': [({'name': 'aaaa first name', + 'version': '1.0', + 'icon_name': 
f'icon.{self.pkginfo_1_1.pk}.aaaa-first-name.png'}, 'installed', [], None, @@ -1646,9 +2051,9 @@ def test_terraform_export(self): "monolith.view_manifest", "monolith.view_submanifest", ) - self._force_catalog() - condition = self._force_condition() - self._force_manifest() - self._force_sub_manifest(condition) + force_catalog() + condition = force_condition() + force_manifest() + force_sub_manifest_pkg_info(condition=condition) response = self.client.get(reverse("monolith:terraform_export")) self.assertEqual(response.status_code, 200) diff --git a/tests/monolith/utils.py b/tests/monolith/utils.py new file mode 100644 index 0000000000..64c42833c4 --- /dev/null +++ b/tests/monolith/utils.py @@ -0,0 +1,155 @@ +from datetime import datetime +from django.utils.crypto import get_random_string +from zentral.contrib.inventory.models import EnrollmentSecret, MetaBusinessUnit, Tag +from zentral.contrib.monolith.models import (Catalog, Condition, Enrollment, + Manifest, ManifestCatalog, ManifestSubManifest, + PkgInfo, PkgInfoCategory, PkgInfoName, + SubManifest, SubManifestPkgInfo, + Repository, RepositoryBackend) + + +CLOUDFRONT_PRIVKEY_PEM = """-----BEGIN RSA PRIVATE KEY----- +MIIBOwIBAAJBAKRhksp6Bvp6Iph7vxcAT1FO3p78ek34i3Zjv5p65Yve8SC5ZCef +d3ZfYpTLsq8Bagmv2McYu1BLQcP6808qf5cCAwEAAQJBAJGPOX4EOoO4fUQLaDYE +9zenoGimZ+L9cPl/8J3pr7R/ZcJkXMIj9t7cI1rY/Tk5N2ARBZ/H3NE4Unm7xZJU +lKECIQDXoiGSvGMSB3rLKZYqyAj75O/lsh9TtZRZgF/bUBBScQIhAMMnREkKtr9d +5W7eziXRABOnVdQjPPle1KiHlSaAFmaHAiB70nUW7qixFKx1dzbs8BsAknETZBpL +FkzOrEHfDPWicQIhAKN8I7Jk7U9HY8sLj/sSKVRNnJNIqe3mSZSdcI9+QkXFAiBg +Y5iiw7n52shShyNTBggl3Xp8BILhfrIgGJ6o8jOQwA== +-----END RSA PRIVATE KEY-----""" + + +def force_repository(mbu=None, virtual=False, secret_access_key=None, cloudfront_privkey_pem=None): + r = Repository.objects.create( + name=get_random_string(12), + meta_business_unit=mbu, + backend=RepositoryBackend.VIRTUAL if virtual else RepositoryBackend.S3, + backend_kwargs={}, + ) + if not virtual: + kwargs = { + "bucket": get_random_string(12), + "region_name": "us-east1", + "prefix": "munki_repo/", + "access_key_id": get_random_string(20), + "secret_access_key": secret_access_key or get_random_string(20), + "signature_version": "s3v4", + "endpoint_url": None, + } + if cloudfront_privkey_pem: + kwargs["cloudfront_domain"] = get_random_string(8) + ".cloudfront.net" + kwargs["cloudfront_key_id"] = get_random_string(8) + kwargs["cloudfront_privkey_pem"] = cloudfront_privkey_pem + r.set_backend_kwargs(kwargs) + r.save() + return r + + +def force_manifest(mbu=None, name=None): + if not mbu: + mbu = MetaBusinessUnit.objects.create(name=get_random_string(12)) + return Manifest.objects.create(meta_business_unit=mbu, name=name or get_random_string(12)) + + +def force_sub_manifest(mbu=None, name=None, description=None, manifest=None, tags=None): + sm = SubManifest.objects.create( + name=name or get_random_string(12), + description=description or get_random_string(12), + meta_business_unit=mbu, + ) + if manifest: + msm = ManifestSubManifest.objects.create(manifest=manifest, sub_manifest=sm) + if tags: + msm.tags.set(tags) + return sm + + +def force_catalog(name=None, repository=None, manifest=None, tags=None, archived=False): + if not repository: + repository = force_repository() + catalog = Catalog.objects.create( + repository=repository, + name=get_random_string(12) if name is None else name, + archived_at=datetime.utcnow() if archived else None, + ) + if manifest: + mc = ManifestCatalog.objects.create(manifest=manifest, catalog=catalog) + if tags: + 
mc.tags.set(tags) + return catalog + + +def force_condition(name=None): + return Condition.objects.create(name=name or get_random_string(12), predicate=get_random_string(12)) + + +def force_category(repository=None, name=None): + if not repository: + repository = force_repository() + return PkgInfoCategory.objects.create(repository=repository, name=name or get_random_string(12)) + + +def force_name(name=None): + return PkgInfoName.objects.create(name=name or get_random_string(12)) + + +def _force_pkg_info( + local=True, + version="1.0", + archived=False, + catalog=None, + sub_manifest=None, + options=None, + condition=None, +): + pkg_info_name = force_name() + data = {"name": pkg_info_name.name, + "version": version} + if catalog is None: + repository = force_repository(virtual=local) + catalog = force_catalog(repository=repository) + pi = PkgInfo.objects.create( + repository=catalog.repository, + name=pkg_info_name, version=version, local=local, + archived_at=datetime.utcnow() if archived else None, + data=data + ) + pi.catalogs.add(catalog) + if sub_manifest: + options = options or {} + smpi = SubManifestPkgInfo.objects.create( + sub_manifest=sub_manifest, + key="managed_installs", + pkg_info_name=pkg_info_name, + condition=condition, + options=options, + ) + else: + smpi = None + return pkg_info_name, catalog, pi, smpi + + +def force_pkg_info(local=True, version="1.0", archived=False, catalog=None, sub_manifest=None, options=None): + _, _, pi, _ = _force_pkg_info(local, version, archived, catalog, sub_manifest, options) + return pi + + +def force_sub_manifest_pkg_info(sub_manifest=None, archived=False, condition=None): + if not sub_manifest: + manifest = force_manifest() + sub_manifest = force_sub_manifest(manifest=manifest) + _, _, _, smpi = _force_pkg_info(sub_manifest=sub_manifest, archived=archived, condition=condition) + return smpi + + +def force_enrollment(mbu=None, tag_count=0): + if not mbu: + mbu = MetaBusinessUnit.objects.create(name=get_random_string(12)) + enrollment_secret = EnrollmentSecret.objects.create(meta_business_unit=mbu) + tags = [Tag.objects.create(name=get_random_string(12)) for _ in range(tag_count)] + if tags: + enrollment_secret.tags.set(tags) + return ( + Enrollment.objects.create(manifest=force_manifest(mbu=mbu), secret=enrollment_secret), + tags + ) diff --git a/zentral/contrib/monolith/api_urls.py b/zentral/contrib/monolith/api_urls.py index 032a4768e8..43c21df4ef 100644 --- a/zentral/contrib/monolith/api_urls.py +++ b/zentral/contrib/monolith/api_urls.py @@ -7,12 +7,15 @@ ManifestList, ManifestDetail, ManifestCatalogList, ManifestCatalogDetail, ManifestSubManifestList, ManifestSubManifestDetail, + RepositoryList, RepositoryDetail, SubManifestList, SubManifestDetail, SubManifestPkgInfoList, SubManifestPkgInfoDetail) app_name = "monolith_api" urlpatterns = [ - path('repository/sync/', SyncRepository.as_view(), name="sync_repository"), + path('repositories/', RepositoryList.as_view(), name="repositories"), + path('repositories/<int:pk>/', RepositoryDetail.as_view(), name="repository"), + path('repositories/<int:pk>/sync/', SyncRepository.as_view(), name="sync_repository"), path('catalogs/', CatalogList.as_view(), name="catalogs"), path('catalogs/<int:pk>/', CatalogDetail.as_view(), name="catalog"), path('conditions/', ConditionList.as_view(), name="conditions"), diff --git a/zentral/contrib/monolith/api_views.py b/zentral/contrib/monolith/api_views.py index d2656ed6ef..0803a0f26c 100644 --- a/zentral/contrib/monolith/api_views.py +++ b/zentral/contrib/monolith/api_views.py @@
-9,56 +9,25 @@ from rest_framework.response import Response from rest_framework.serializers import ModelSerializer from rest_framework.views import APIView +from base.notifier import notifier from zentral.core.events.base import AuditEvent, EventRequest -from zentral.utils.drf import DjangoPermissionRequired, DefaultDjangoModelPermissions +from zentral.utils.drf import (DjangoPermissionRequired, DefaultDjangoModelPermissions, + ListCreateAPIViewWithAudit, RetrieveUpdateDestroyAPIViewWithAudit) from zentral.utils.http import user_agent_and_ip_address_from_request -from .conf import monolith_conf from .events import post_monolith_cache_server_update_request, post_monolith_sync_catalogs_request from .models import (CacheServer, Catalog, Condition, Enrollment, - Manifest, ManifestCatalog, ManifestSubManifest, SubManifest, SubManifestPkgInfo) -from .serializers import (CatalogSerializer, ConditionSerializer, EnrollmentSerializer, + Manifest, ManifestCatalog, ManifestSubManifest, + Repository, + SubManifest, SubManifestPkgInfo) +from .repository_backends import load_repository_backend +from .serializers import (CatalogSerializer, ConditionSerializer, + EnrollmentSerializer, ManifestSerializer, ManifestCatalogSerializer, ManifestSubManifestSerializer, + RepositorySerializer, SubManifestSerializer, SubManifestPkgInfoSerializer) from .utils import build_configuration_plist, build_configuration_profile -class SyncRepository(APIView): - permission_required = ( - "monolith.view_catalog", "monolith.add_catalog", "monolith.change_catalog", - "monolith.view_pkginfoname", "monolith.add_pkginfoname", "monolith.change_pkginfoname", - "monolith.view_pkginfo", "monolith.add_pkginfo", "monolith.change_pkginfo", - "monolith.change_manifest" - ) - permission_classes = [DjangoPermissionRequired] - - def initialize_events(self, request): - self.events = [] - self.event_uuid = uuid.uuid4() - self.event_index = 0 - self.event_request = EventRequest.build_from_request(request) - - def audit_callback(self, instance, action, prev_value=None): - self.events.append( - AuditEvent.build( - instance, action, prev_value=prev_value, - event_uuid=self.event_uuid, event_index=self.event_index, - event_request=self.event_request - ) - ) - self.event_index += 1 - - def post_events(self): - for event in self.events: - event.post() - - def post(self, request, *args, **kwargs): - post_monolith_sync_catalogs_request(request) - self.initialize_events(request) - monolith_conf.repository.sync_catalogs(self.audit_callback) - transaction.on_commit(lambda: self.post_events()) - return Response({"status": 0}) - - class CacheServerSerializer(ModelSerializer): class Meta: model = CacheServer @@ -91,6 +60,65 @@ def post(self, request, *args, **kwargs): return Response({"status": 0}) +# repositories + + +class SyncRepository(APIView): + permission_required = "monolith.sync_repository" + permission_classes = [DjangoPermissionRequired] + + def initialize_events(self, request): + self.events = [] + self.event_uuid = uuid.uuid4() + self.event_index = 0 + self.event_request = EventRequest.build_from_request(request) + + def audit_callback(self, instance, action, prev_value=None): + event = AuditEvent.build( + instance, action, prev_value=prev_value, + event_uuid=self.event_uuid, event_index=self.event_index, + event_request=self.event_request + ) + event.metadata.add_objects({"monolith_repository": ((self.db_repository.pk,),)}) + self.events.append(event) + self.event_index += 1 + + def post_events(self): + for event in self.events: + event.post() + 
+ def post(self, request, *args, **kwargs): + self.db_repository = get_object_or_404(Repository, pk=kwargs["pk"]) + post_monolith_sync_catalogs_request(request, self.db_repository) + repository = load_repository_backend(self.db_repository) + self.initialize_events(request) + repository.sync_catalogs(self.audit_callback) + transaction.on_commit(lambda: self.post_events()) + return Response({"status": 0}) + + +class RepositoryList(ListCreateAPIViewWithAudit): + queryset = Repository.objects.all() + serializer_class = RepositorySerializer + filterset_fields = ('name',) + + def on_commit_callback_extra(self, instance): + notifier.send_notification("monolith.repository", str(instance.pk)) + + +class RepositoryDetail(RetrieveUpdateDestroyAPIViewWithAudit): + queryset = Repository.objects.all() + serializer_class = RepositorySerializer + + def on_commit_callback_extra(self, instance): + notifier.send_notification("monolith.repository", str(instance.pk)) + + def perform_destroy(self, instance): + if not instance.can_be_deleted(): + raise ValidationError('This repository cannot be deleted') + return super().perform_destroy(instance) + + # catalogs @@ -108,7 +136,7 @@ class CatalogDetail(generics.RetrieveUpdateDestroyAPIView): permission_classes = (DefaultDjangoModelPermissions,) def perform_destroy(self, instance): - if not instance.can_be_deleted(override_manual_management=True): + if not instance.can_be_deleted(): raise ValidationError('This catalog cannot be deleted') return super().perform_destroy(instance) diff --git a/zentral/contrib/monolith/apps.py b/zentral/contrib/monolith/apps.py index f1bd1307da..48b3565380 100644 --- a/zentral/contrib/monolith/apps.py +++ b/zentral/contrib/monolith/apps.py @@ -16,6 +16,7 @@ class ZentralMonolithAppConfig(ZentralAppConfig): "manifestsubmanifest", "pkginfo", "pkginfoname", + "repository", "submanifest", "submanifestpkginfo", ) diff --git a/zentral/contrib/monolith/conf.py b/zentral/contrib/monolith/conf.py index ee18294071..4f2d58ea9d 100644 --- a/zentral/contrib/monolith/conf.py +++ b/zentral/contrib/monolith/conf.py @@ -1,26 +1,51 @@ -from importlib import import_module -from django.utils.functional import cached_property +from datetime import datetime, timedelta +import logging +import threading +from django.utils.functional import cached_property, SimpleLazyObject from zentral.conf import settings from zentral.utils.osx_package import get_package_builders +from base.notifier import notifier +from .repository_backends import load_repository_backend + + +logger = logging.getLogger("zentral.contrib.monolith.conf") class MonolithConf: - def app_config(self): - return settings['apps']['zentral.contrib.monolith'].copy() + reload_interval = timedelta(hours=1) - @cached_property - def repository(self): - repository_cfg = self.app_config()['munki_repository'] - repository_class_name = "Repository" - module = import_module(repository_cfg.pop('backend')) - repository_class = getattr(module, repository_class_name) - return repository_class(repository_cfg) + def __init__(self): + self._lock = threading.Lock() + self._repositories_last_loaded_at = None + + def _reload_repositories(self, *args, **kwargs): + logger.info("Reload repositories") + # avoid circular dependencies + from .models import Repository + with self._lock: + self._repositories = {} + for repository in Repository.objects.select_related("meta_business_unit").all(): + self._repositories[repository.pk] = load_repository_backend(repository) + logger.info("Repository %s loaded", repository) + if 
self._repositories_last_loaded_at is None:
+                # first time
+                notifier.add_callback("monolith.repository", self._reload_repositories)
+            self._repositories_last_loaded_at = datetime.utcnow()
+
+    def get_repository(self, pk):
+        if (
+            self._repositories_last_loaded_at is None
+            or datetime.utcnow() - self._repositories_last_loaded_at > self.reload_interval
+        ):
+            self._reload_repositories()
+        with self._lock:
+            return self._repositories[pk]
 
     @cached_property
     def enrollment_package_builders(self):
         package_builders = get_package_builders()
         enrollment_package_builders = {}
-        epb_cfg = self.app_config().get('enrollment_package_builders')
+        epb_cfg = settings['apps']['zentral.contrib.monolith'].get('enrollment_package_builders')
         if epb_cfg:
             for builder, builder_cfg in epb_cfg.serialize().items():
                 requires = builder_cfg.get("requires")
@@ -36,4 +61,4 @@ def enrollment_package_builders(self):
         return enrollment_package_builders
 
 
-monolith_conf = MonolithConf()
+monolith_conf = SimpleLazyObject(lambda: MonolithConf())
diff --git a/zentral/contrib/monolith/events/__init__.py b/zentral/contrib/monolith/events/__init__.py
index dc95515a04..e6ad11d321 100644
--- a/zentral/contrib/monolith/events/__init__.py
+++ b/zentral/contrib/monolith/events/__init__.py
@@ -28,6 +28,13 @@ class MonolithSyncCatalogsRequestEvent(BaseEvent):
     event_type = "monolith_sync_catalogs_request"
     tags = ["monolith"]
 
+    def get_linked_objects_keys(self):
+        keys = {}
+        repository_pk = self.payload.get("repository", {}).get("pk")
+        if repository_pk:
+            keys["monolith_repository"] = [(repository_pk,)]
+        return keys
+
 
 register_event_type(MonolithSyncCatalogsRequestEvent)
 
@@ -47,11 +54,14 @@ def post_monolith_munki_request(msn, user_agent, ip, **payload):
     MonolithMunkiRequestEvent.post_machine_request_payloads(msn, user_agent, ip, [payload])
 
 
-def post_monolith_sync_catalogs_request(request):
+def post_monolith_sync_catalogs_request(request, repository):
     event_class = MonolithSyncCatalogsRequestEvent
     event_request = EventRequest.build_from_request(request)
     metadata = EventMetadata(request=event_request)
-    event = event_class(metadata, {})
+    event = event_class(
+        metadata,
+        {"repository": repository.serialize_for_event(keys_only=True)}
+    )
     event.post()
 
 
diff --git a/zentral/contrib/monolith/forms.py b/zentral/contrib/monolith/forms.py
index 20560f2cf4..c0a5c18b73 100644
--- a/zentral/contrib/monolith/forms.py
+++ b/zentral/contrib/monolith/forms.py
@@ -5,7 +5,48 @@ from .exceptions import AttachmentError
 from .models import (Catalog, Enrollment, Manifest, ManifestCatalog, ManifestSubManifest,
-                     PkgInfo, PkgInfoName, SubManifest, SubManifestPkgInfo)
+                     PkgInfo, PkgInfoCategory, PkgInfoName,
+                     Repository,
+                     SubManifest, SubManifestPkgInfo)
+
+
+class RepositoryForm(forms.ModelForm):
+    class Meta:
+        model = Repository
+        fields = "__all__"
+
+    def clean_meta_business_unit(self):
+        mbu = self.cleaned_data.get("meta_business_unit")
+        if mbu and self.instance.pk:
+            for manifest in self.instance.manifests():
+                if manifest.meta_business_unit != mbu:
+                    raise forms.ValidationError(
+                        f"Repository linked to manifest '{manifest}' which has a different business unit."
+ ) + return mbu + + +class CatalogForm(forms.ModelForm): + class Meta: + model = Catalog + fields = "__all__" + + def __init__(self, *args, **kwargs): + super().__init__(*args, **kwargs) + self.fields["repository"].queryset = Repository.objects.for_manual_catalogs() + + def clean_repository(self): + repository = self.cleaned_data.get("repository") + if repository and repository.meta_business_unit and self.instance.pk: + if ( + Manifest.objects.filter(manifestcatalog__catalog=self.instance) + .exclude(meta_business_unit=repository.meta_business_unit) + .count() + ): + raise forms.ValidationError( + "This catalog is included in manifests linked to different business units than this repository." + ) + return repository class PkgInfoSearchForm(forms.Form): @@ -222,6 +263,10 @@ def __init__(self, *args, **kwargs): self.fields["excluded_tags"].initial = [tag.pk for tag in self.instance.excluded_tags] self.fields["default_shard"].initial = self.instance.default_shard self.fields["shard_modulo"].initial = self.instance.shard_modulo + # catalogs + self.fields["catalogs"].queryset = Catalog.objects.for_upload() + # categories + self.fields["category"].queryset = PkgInfoCategory.objects.for_upload() # hide name field if necessary if self.pkg_info_name: del self.fields["name"] @@ -255,12 +300,25 @@ def clean_file(self): raise forms.ValidationError(e.message) return None + def clean_catalogs(self): + catalogs = self.cleaned_data.get("catalogs") + if catalogs and len(set(c.repository for c in catalogs)) > 1: + raise forms.ValidationError("The catalogs must be from the same repository.") + return catalogs + def clean(self): self.instance.local = True if self.instance.data is None: self.instance.data = {} data = self.instance.data pin = self.cleaned_data.get("name", self.pkg_info_name) + # repository + catalogs = self.cleaned_data.get("catalogs") + if catalogs: + self.instance.repository = catalogs[0].repository + category = self.cleaned_data.get("category") + if category and category.repository != self.instance.repository: + self.add_error("category", "The category must be from the same repository as the catalogs.") # file pf = self.cleaned_data.get("package_file") if pf: @@ -329,7 +387,7 @@ def clean(self): elif "description" in data: del data["description"] # category → data - category = self.cleaned_data["category"] + category = self.cleaned_data.get("category") if category: data["category"] = category.name elif "category" in data: @@ -365,11 +423,8 @@ class AddManifestCatalogForm(forms.Form): def __init__(self, *args, **kwargs): self.manifest = kwargs.pop('manifest') super().__init__(*args, **kwargs) - field = self.fields['catalog'] - field.queryset = field.queryset.exclude(id__in=[mc.catalog_id - for mc in self.manifest.manifestcatalog_set.all()]) - field = self.fields['tags'] - field.queryset = Tag.objects.available_for_meta_business_unit(self.manifest.meta_business_unit) + self.fields['catalog'].queryset = Catalog.objects.available_for_manifest(self.manifest, add_only=True) + self.fields['tags'].queryset = Tag.objects.available_for_meta_business_unit(self.manifest.meta_business_unit) def save(self): mc = ManifestCatalog(manifest=self.manifest, diff --git a/zentral/contrib/monolith/migrations/0049_auto_20230119_1027.py b/zentral/contrib/monolith/migrations/0049_auto_20230119_1027.py old mode 100755 new mode 100644 diff --git a/zentral/contrib/monolith/migrations/0055_alter_catalog_options_alter_manifestcatalog_options_and_more.py 
b/zentral/contrib/monolith/migrations/0055_alter_catalog_options_alter_manifestcatalog_options_and_more.py new file mode 100644 index 0000000000..c862797e3e --- /dev/null +++ b/zentral/contrib/monolith/migrations/0055_alter_catalog_options_alter_manifestcatalog_options_and_more.py @@ -0,0 +1,150 @@ +# Generated by Django 4.2.8 on 2024-01-17 13:59 + +from django.db import migrations, models +import django.db.models.deletion + + +def create_repository(apps, schema_editor): + try: + from zentral.conf import settings + config_repo = settings["apps"]["zentral.contrib.monolith"]["munki_repository"] + except Exception: + return + try: + from zentral.contrib.monolith.repository_backends import RepositoryBackend + config_repo = config_repo or {} + if config_repo.get("backend") == "zentral.contrib.monolith.repository_backends.s3": + backend = RepositoryBackend.S3 + backend_kwargs = {} + for db_a, cfg_a in (("bucket", "bucket"), + ("region_name", "region_name"), + ("prefix", "prefix"), + ("access_key_id", "aws_access_key_id"), + ("secret_access_key", "aws_secret_access_key"), + ("assume_role_arn", "assume_role_arn"), + ("signature_version", "signature_version"), + ("endpoint_url", "endpoint_url")): + val = config_repo.get(cfg_a) + if val is not None: + backend_kwargs[db_a] = val + cloudfront_cfg = config_repo.get("cloudfront") + if cloudfront_cfg: + for db_a, cfg_a in (("cloudfront_domain", "domain"), + ("cloudfront_key_id", "key_id"), + ("cloudfront_privkey_pem", "privkey_pem")): + val = cloudfront_cfg.get(cfg_a) + if val is not None: + backend_kwargs[db_a] = val + else: + backend = RepositoryBackend.VIRTUAL + backend_kwargs = {} + from zentral.contrib.monolith.models import Repository + repository = Repository.objects.create( + name="Default", + meta_business_unit=None, + backend=backend, + backend_kwargs={}, + ) + repository.set_backend_kwargs(backend_kwargs) + repository.save() + Catalog = apps.get_model("monolith", "Catalog") + Catalog.objects.update(repository=repository) + PkgInfoCategory = apps.get_model("monolith", "PkgInfoCategory") + PkgInfoCategory.objects.update(repository=repository) + PkgInfo = apps.get_model("monolith", "PkgInfo") + PkgInfo.objects.update(repository=repository) + except Exception: + return + + +class Migration(migrations.Migration): + + dependencies = [ + ('inventory', '0077_file_signing_id'), + ('monolith', '0054_auto_20230317_0921'), + ] + + operations = [ + migrations.AlterModelOptions( + name='manifestcatalog', + options={'ordering': ('catalog__name',)}, + ), + migrations.AlterField( + model_name='catalog', + name='name', + field=models.CharField(max_length=256), + ), + migrations.CreateModel( + name='Repository', + fields=[ + ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')), + ('name', models.CharField(max_length=256, unique=True)), + ('backend', models.CharField(choices=[('S3', 'Amazon S3'), ('VIRTUAL', 'Virtual')], max_length=32)), + ('backend_kwargs', models.JSONField(editable=False)), + ('icon_hashes', models.JSONField(default=dict, editable=False)), + ('client_resources', models.JSONField(default=list, editable=False)), + ('last_synced_at', models.DateTimeField(editable=False, null=True)), + ('created_at', models.DateTimeField(auto_now_add=True)), + ('updated_at', models.DateTimeField(auto_now=True)), + ('meta_business_unit', models.ForeignKey(blank=True, null=True, + on_delete=django.db.models.deletion.SET_NULL, + to='inventory.metabusinessunit')), + ], + ), + migrations.AlterModelOptions( + 
name='repository', + options={'ordering': ('name',), 'permissions': [('sync_repository', 'Can sync repository')]}, + ), + migrations.AddField( + model_name='catalog', + name='repository', + field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, to='monolith.repository'), + ), + migrations.AlterField( + model_name='catalog', + name='archived_at', + field=models.DateTimeField(editable=False, null=True), + ), + migrations.AlterModelOptions( + name='catalog', + options={'ordering': ('-archived_at', 'repository__name', 'name', 'pk')}, + ), + migrations.AddField( + model_name='pkginfo', + name='repository', + field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, to='monolith.repository'), + ), + migrations.AddField( + model_name='pkginfocategory', + name='repository', + field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, to='monolith.repository'), + ), + migrations.AlterUniqueTogether( + name='catalog', + unique_together={('repository', 'name')}, + ), + migrations.AlterUniqueTogether( + name='pkginfocategory', + unique_together={('repository', 'name')}, + ), + migrations.RemoveField( + model_name='catalog', + name='priority', + ), + migrations.RunPython(create_repository), + migrations.AlterField( + model_name='catalog', + name='repository', + field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='monolith.repository'), + ), + migrations.AlterField( + model_name='pkginfo', + name='repository', + field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='monolith.repository'), + ), + migrations.AlterField( + model_name='pkginfocategory', + name='repository', + field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='monolith.repository'), + ), + ] diff --git a/zentral/contrib/monolith/models.py b/zentral/contrib/monolith/models.py index 4831a7f459..98749af863 100644 --- a/zentral/contrib/monolith/models.py +++ b/zentral/contrib/monolith/models.py @@ -1,3 +1,4 @@ +from dataclasses import dataclass from datetime import timedelta from itertools import chain import json @@ -17,6 +18,7 @@ from zentral.contrib.inventory.models import BaseEnrollment, MetaBusinessUnit, Tag from zentral.utils.text import get_version_sort_key from .conf import monolith_conf +from .repository_backends import RepositoryBackend, get_repository_backend, load_repository_backend from .utils import build_manifest_enrollment_package @@ -70,6 +72,79 @@ def parse_munki_name(name): raise MunkiNameError +class RepositoryManager(models.Manager): + def for_deletion(self): + return self.annotate( + # not linked to a manifest + manifest_link_count=Count("catalog__manifestcatalog"), + ).filter(manifest_link_count=0) + + def for_manual_catalogs(self): + return self.filter(backend=RepositoryBackend.VIRTUAL) + + +class Repository(models.Model): + name = models.CharField(max_length=256, unique=True) + meta_business_unit = models.ForeignKey(MetaBusinessUnit, on_delete=models.SET_NULL, blank=True, null=True) + backend = models.CharField(max_length=32, choices=RepositoryBackend.choices) + backend_kwargs = models.JSONField(editable=False) + icon_hashes = models.JSONField(editable=False, default=dict) + client_resources = models.JSONField(editable=False, default=list) + last_synced_at = models.DateTimeField(editable=False, null=True) + created_at = models.DateTimeField(auto_now_add=True) + updated_at = models.DateTimeField(auto_now=True) + + objects = RepositoryManager() + + class Meta: + ordering = ("name",) + permissions 
= [ + ("sync_repository", "Can sync repository"), + ] + + def __str__(self): + return self.name + + def can_be_deleted(self): + return Repository.objects.for_deletion().filter(pk=self.pk).exists() + + def manifests(self): + return Manifest.objects.distinct().filter(manifestcatalog__catalog__repository=self) + + def get_absolute_url(self): + return reverse("monolith:repository", args=(self.pk,)) + + def get_backend_kwargs(self): + backend = load_repository_backend(self) + return backend.get_kwargs() + + def get_backend_kwargs_for_event(self): + backend = load_repository_backend(self) + return backend.get_kwargs_for_event() + + def set_backend_kwargs(self, kwargs): + backend = get_repository_backend(self) + backend.set_kwargs(kwargs) + + def rewrap_secrets(self): + backend = load_repository_backend(self) + backend.rewrap_kwargs() + + def serialize_for_event(self, keys_only=False): + d = {"pk": self.pk, + "name": self.name} + if not keys_only: + if self.meta_business_unit: + d["meta_business_unit"] = self.meta_business_unit.serialize_for_event(keys_only=True) + d.update({ + "backend": str(self.backend), + "backend_kwargs": self.get_backend_kwargs_for_event(), + "created_at": self.created_at, + "updated_at": self.updated_at + }) + return d + + class CatalogManager(models.Manager): def for_deletion(self): return self.annotate( @@ -77,23 +152,45 @@ def for_deletion(self): pkginfo_count=Count("pkginfo", filter=Q(pkginfo__archived_at__isnull=True)), # not included in a manifest manifestcatalog_count=Count("manifestcatalog") - ).filter(pkginfo_count=0, manifestcatalog_count=0) + ).filter( + repository__backend=RepositoryBackend.VIRTUAL, + pkginfo_count=0, + manifestcatalog_count=0 + ) + + def for_update(self): + return self.filter(repository__backend=RepositoryBackend.VIRTUAL) + + def for_upload(self): + return self.filter(repository__backend=RepositoryBackend.VIRTUAL) + + def available_for_manifest(self, manifest, add_only=False): + qs = self.filter( + Q(repository__meta_business_unit__isnull=True) + | Q(repository__meta_business_unit=manifest.meta_business_unit) + ) + if add_only: + qs = qs.exclude( + pk__in=[mc.catalog_id for mc in manifest.manifestcatalog_set.all()] + ) + return qs class Catalog(models.Model): - name = models.CharField(max_length=256, unique=True) - priority = models.PositiveIntegerField(default=0) + repository = models.ForeignKey(Repository, on_delete=models.CASCADE) + name = models.CharField(max_length=256) created_at = models.DateTimeField(auto_now_add=True) updated_at = models.DateTimeField(auto_now=True) - archived_at = models.DateTimeField(blank=True, null=True) + archived_at = models.DateTimeField(null=True, editable=False) objects = CatalogManager() class Meta: - ordering = ('-archived_at', '-priority', 'name') + unique_together = (('repository', 'name'),) + ordering = ('-archived_at', 'repository__name', 'name', 'pk') def __str__(self): - return self.name + return f"{self.repository} - {self.name}" def get_absolute_url(self): return reverse("monolith:catalog", args=(self.pk,)) @@ -102,34 +199,48 @@ def get_pkg_info_url(self): return "{}?{}".format(reverse("monolith:pkg_infos"), urllib.parse.urlencode({"catalog": self.pk})) - def can_be_deleted(self, override_manual_management=False): - return ((override_manual_management or monolith_conf.repository.manual_catalog_management) - and self.pkginfo_set.filter(archived_at__isnull=True).count() == 0 - and self.manifestcatalog_set.count() == 0) + def can_be_deleted(self): + return 
Catalog.objects.for_deletion().filter(pk=self.pk).exists() + + def can_be_updated(self): + return Catalog.objects.for_update().filter(pk=self.pk).exists() def serialize_for_event(self, keys_only=False): - d = {"pk": self.pk, "name": self.name} - if keys_only: - return d - d.update({"created_at": self.created_at, - "updated_at": self.updated_at}) - if self.archived_at: - d["archived_at"] = self.archived_at + d = {"pk": self.pk, + "repository": self.repository.serialize_for_event(keys_only=True), + "name": self.name} + if not keys_only: + d.update({"created_at": self.created_at, + "updated_at": self.updated_at}) + if self.archived_at: + d["archived_at"] = self.archived_at return d +class PkgInfoCategoryManager(models.Manager): + def for_upload(self): + return self.filter(repository__backend=RepositoryBackend.VIRTUAL) + + class PkgInfoCategory(models.Model): + repository = models.ForeignKey(Repository, on_delete=models.CASCADE) name = models.CharField(max_length=256, unique=True) created_at = models.DateTimeField(auto_now_add=True) + objects = PkgInfoCategoryManager() + + class Meta: + unique_together = (('repository', 'name'),) + def __str__(self): return self.name def serialize_for_event(self, keys_only=False): - d = {"pk": self.pk, "name": self.name} - if keys_only: - return d - d["created_at"] = self.created_at + d = {"pk": self.pk, + "repository": self.repository.serialize_for_event(keys_only=True), + "name": self.name} + if not keys_only: + d["created_at"] = self.created_at return d @@ -212,7 +323,7 @@ def alles(self, **kwargs): f"with aggregated_pi as ({aggregated_pi_query}) " "select api.*," "case when pn_total=0 then null else 100.0 * count / pn_total end as percent," - "json_agg(distinct jsonb_build_object('pk', c.id, 'name', c.name, 'priority', c.priority)) as catalogs " + "json_agg(distinct jsonb_build_object('pk', c.id, 'name', c.name)) as catalogs " "from aggregated_pi as api " "{left}join monolith_pkginfo_catalogs as pc on (pc.pkginfo_id = api.pi_pk) " "{left}join monolith_catalog as c on (c.id = pc.catalog_id) " @@ -252,7 +363,7 @@ def alles(self, **kwargs): 'version': version, 'version_sort': get_version_sort_key(version), 'local': pi_local, - 'catalogs': sorted(catalogs, key=lambda c: (c["priority"], c["name"])), + 'catalogs': sorted(catalogs, key=lambda c: c["name"]), 'count': int(count), 'percent': percent} if pi_opts: @@ -306,6 +417,7 @@ def pkg_info_path(instance, filename): class PkgInfo(models.Model): + repository = models.ForeignKey(Repository, on_delete=models.CASCADE) name = models.ForeignKey(PkgInfoName, on_delete=models.CASCADE) version = models.CharField(max_length=256) catalogs = models.ManyToManyField(Catalog) @@ -334,15 +446,32 @@ def __str__(self): def active_catalogs(self): return self.catalogs.filter(archived_at__isnull=True) + def get_original_icon_name(self): + return self.data.get("icon_name") or f"{self.name.name}.png" + + def get_monolith_icon_name(self): + icon_name = self.data.get("icon_name") + if icon_name: + root, ext = os.path.splitext(icon_name) + name = os.path.basename(root) + else: + ext = ".png" + name = self.name.name + return build_munki_name("icon", self.id, name, ext) + def get_pkg_info(self): pkg_info = self.data.copy() pkg_info.pop("catalogs", None) - for attr in ("installer_item_location", "uninstaller_item_loc"): + # replace package locations + for attr in ("installer_item_location", "uninstaller_item_location"): loc = pkg_info.pop(attr, None) if loc: root, ext = os.path.splitext(loc) name = os.path.basename(root) - pkg_info[attr] = 
build_munki_name("repository_package", self.id, name, ext) + model = attr.removesuffix("_location") + pkg_info[attr] = build_munki_name(model, self.id, name, ext) + # replace icon name + pkg_info["icon_name"] = self.get_monolith_icon_name() return pkg_info def get_absolute_url(self): @@ -413,6 +542,37 @@ def linked_objects_keys_for_event(self): "munki_pkginfo": ((self.name.name, self.version),)} +@dataclass +class CachedPkgInfo: + pk: int + repository_pk: int + version: str + file_name: str + installer_item_location: str + uninstaller_item_location: str + icon_name: str + name: str + + def get_repository_section_and_name(self, model): + section = name = None + if model == "installer_item": + section = "pkgs" + if self.file_name: + name = self.file_name + elif self.installer_item_location: + name = self.installer_item_location + elif model == "uninstaller_item" and self.uninstaller_item_location: + section = "pkgs" + name = self.uninstaller_item_location + elif model == "icon": + section = "icons" + if self.icon_name: + name = self.icon_name + else: + name = f"{self.name}.png" + return section, name + + SUB_MANIFEST_PKG_INFO_KEY_CHOICES = ( ('managed_installs', 'Managed Installs'), ('managed_uninstalls', 'Managed Uninstalls'), @@ -641,7 +801,7 @@ def catalogs(self, tags=None): return [mc.catalog for mc in (self.manifestcatalog_set .distinct() - .select_related("catalog") + .select_related("catalog__repository") .filter(Q(tags__isnull=True) | Q(tags__in=tags)))] def sub_manifests(self, tags=None): @@ -692,76 +852,150 @@ def enrollment_packages(self, tags=None): d[ep.builder] = ep return d - def pkginfos_with_deps_and_updates(self, tags=None): + def _pkginfos_with_deps_and_updates(self, tags): """PkgInfos linked to a manifest for a given set of tags""" + kwargs = {"manifest_pk": self.pk} if tags: - m2mt_filter = "OR m2mt.tag_id in ({})".format(",".join(str(int(t.id)) for t in tags)) + m2mt_filter = "OR m2mt.tag_id in %(tag_pks)s" + kwargs["tag_pks"] = tuple(t.pk for t in tags) else: m2mt_filter = "" query = ( "WITH RECURSIVE pkginfos_with_deps_and_updates AS ( " - "SELECT pi.id as pi_id, pi.version as pi_version, pn.id AS pn_id, pn.name as pn_name " + + "SELECT pi.id pk," + "pi.repository_id repository_pk," + "pi.version version," + "pi.file file_name," + "pi.data->>'installer_item_location' installer_item_location," + "pi.data->>'uninstaller_item_location' uninstaller_item_location," + "pi.data->>'icon_name' icon_name," + "pn.id name_pk," + "pn.name name " "FROM monolith_pkginfo pi " "JOIN monolith_pkginfoname pn ON (pi.name_id=pn.id) " - "JOIN monolith_submanifestpkginfo sm ON (pn.id=pkg_info_name_id) " + "JOIN monolith_submanifestpkginfo sm ON (pn.id=sm.pkg_info_name_id) " "JOIN monolith_manifestsubmanifest ms ON (sm.sub_manifest_id=ms.sub_manifest_id) " "LEFT JOIN monolith_manifestsubmanifest_tags m2mt ON (ms.id=m2mt.manifestsubmanifest_id) " - "WHERE ms.manifest_id = {manifest_id} " - "AND (m2mt.tag_id IS NULL {m2mt_filter}) " + "WHERE ms.manifest_id = %(manifest_pk)s " + f"AND (m2mt.tag_id IS NULL {m2mt_filter}) " + "UNION " - "SELECT pi.id, pi.version, pn.id, pn.name " + + "SELECT pi.id," + "pi.repository_id," + "pi.version," + "pi.file," + "pi.data->>'installer_item_location'," + "pi.data->>'uninstaller_item_location'," + "pi.data->>'icon_name'," + "pn.id," + "pn.name " "FROM monolith_pkginfo pi " "JOIN monolith_pkginfoname pn ON (pi.name_id=pn.id) " "LEFT JOIN monolith_pkginfo_requires pr ON (pr.pkginfoname_id=pn.id) " "LEFT JOIN monolith_pkginfo_update_for pu ON 
(pu.pkginfo_id=pi.id) " - "JOIN pkginfos_with_deps_and_updates rec ON (pr.pkginfo_id=rec.pi_id OR pu.pkginfoname_id=rec.pn_id) " + "JOIN pkginfos_with_deps_and_updates rec ON (pr.pkginfo_id=rec.pk OR pu.pkginfoname_id=rec.name_pk) " + ") " - "SELECT pi_id as id, pi_version as version from pkginfos_with_deps_and_updates " - "JOIN monolith_pkginfo_catalogs pc ON (pi_id=pc.pkginfo_id) " + + "SELECT pk," + "repository_pk," + "version," + "file_name," + "installer_item_location," + "uninstaller_item_location," + "icon_name," + "name " + "from pkginfos_with_deps_and_updates " + "JOIN monolith_pkginfo_catalogs pc ON (pk=pc.pkginfo_id) " "JOIN monolith_manifestcatalog mc ON (pc.catalog_id=mc.catalog_id) " "LEFT JOIN monolith_manifestcatalog_tags m2mt ON (mc.id=m2mt.manifestcatalog_id) " - "WHERE mc.manifest_id = {manifest_id} " - "AND (m2mt.tag_id IS NULL {m2mt_filter});" - ).format(manifest_id=int(self.id), m2mt_filter=m2mt_filter) - return PkgInfo.objects.raw(query) - - def _pkginfo_deps_and_updates(self, package_names, tags): - package_names = ",".join("'{}'".format(package_name) - for package_name in set(package_names)) + "WHERE mc.manifest_id = %(manifest_pk)s " + f"AND (m2mt.tag_id IS NULL {m2mt_filter});" + ) + cursor = connection.cursor() + cursor.execute(query, kwargs) + for row in cursor.fetchall(): + yield CachedPkgInfo(*row) + + def _enrollment_packages_pkginfo_deps(self, tags): + """PkgInfos that enrollment packages require, with their dependencies""" + package_names = tuple(chain.from_iterable( + ep.get_requires() + for ep in self.enrollment_packages(tags).values() + )) if not package_names: - return PkgInfo.objects.none() + return + kwargs = { + "manifest_pk": self.pk, + "package_names": package_names, + } if tags: - m2mt_filter = "OR m2mt.tag_id in ({})".format(",".join(str(int(t.id)) for t in tags)) + m2mt_filter = "OR m2mt.tag_id in %(tag_pks)s" + kwargs["tag_pks"] = tuple(t.pk for t in tags) else: m2mt_filter = "" query = ( "WITH RECURSIVE pkginfos_with_deps_and_updates AS ( " - "SELECT pi.id as pi_id, pi.version as pi_version, pn.id AS pn_id, pn.name as pn_name " + + "SELECT pi.id pk," + "pi.repository_id repository_pk," + "pi.version version," + "pi.file file_name," + "pi.data->>'installer_item_location' installer_item_location," + "pi.data->>'uninstaller_item_location' uninstaller_item_location," + "pi.data->>'icon_name' icon_name," + "pn.id name_pk," + "pn.name name " "FROM monolith_pkginfo pi " "JOIN monolith_pkginfoname pn ON (pi.name_id=pn.id) " - "WHERE pn.name in ({package_names}) " + "WHERE pn.name in %(package_names)s " + "UNION " - "SELECT pi.id, pi.version, pn.id, pn.name " + + "SELECT pi.id," + "pi.repository_id," + "pi.version," + "pi.file," + "pi.data->>'installer_item_location'," + "pi.data->>'uninstaller_item_location'," + "pi.data->>'icon_name'," + "pn.id," + "pn.name " "FROM monolith_pkginfo pi " "JOIN monolith_pkginfoname pn ON (pi.name_id=pn.id) " "LEFT JOIN monolith_pkginfo_requires pr ON (pr.pkginfoname_id=pn.id) " "LEFT JOIN monolith_pkginfo_update_for pu ON (pu.pkginfo_id=pi.id) " - "JOIN pkginfos_with_deps_and_updates rec ON (pr.pkginfo_id=rec.pi_id OR pu.pkginfoname_id=rec.pn_id) " + "JOIN pkginfos_with_deps_and_updates rec ON (pr.pkginfo_id=rec.pk OR pu.pkginfoname_id=rec.name_pk) " + ") " - "SELECT pi_id as id, pi_version as version from pkginfos_with_deps_and_updates " - "JOIN monolith_pkginfo_catalogs pc ON (pi_id=pc.pkginfo_id) " + + "SELECT pk," + "repository_pk," + "version," + "file_name," + "installer_item_location," + 
"uninstaller_item_location," + "icon_name," + "name " + "from pkginfos_with_deps_and_updates " + "JOIN monolith_pkginfo_catalogs pc ON (pk=pc.pkginfo_id) " "JOIN monolith_manifestcatalog mc ON (pc.catalog_id=mc.catalog_id) " "LEFT JOIN monolith_manifestcatalog_tags m2mt ON (mc.id=m2mt.manifestcatalog_id) " - "WHERE mc.manifest_id = {manifest_id} " - "AND (m2mt.tag_id IS NULL {m2mt_filter});" - ).format(package_names=package_names, manifest_id=int(self.id), m2mt_filter=m2mt_filter) - return PkgInfo.objects.raw(query) + "WHERE mc.manifest_id = &(manifest_pk)s " + f"AND (m2mt.tag_id IS NULL {m2mt_filter});" + ) + cursor = connection.cursor() + cursor.execute(query, kwargs) + for row in cursor.fetchall(): + yield CachedPkgInfo(*row) - def enrollment_packages_pkginfo_deps(self, tags=None): - """PkgInfos that enrollment packages require, with their dependencies""" - required_packages_iter = chain.from_iterable(ep.get_requires() - for ep in self.enrollment_packages(tags).values()) - return self._pkginfo_deps_and_updates(required_packages_iter, tags) + def get_pkginfo_for_cache(self, tags, pk): + for cached_pkginfo in chain(self._pkginfos_with_deps_and_updates(tags), + self._enrollment_packages_pkginfo_deps(tags)): + if cached_pkginfo.pk == pk: + return cached_pkginfo # the manifest catalog - for a given set of tags @@ -795,6 +1029,20 @@ def build_catalog(self, tags=None): return pkginfo_list + def serialize_icon_hashes(self, tags): + icon_hashes = {} + for catalog in self.catalogs(tags): + icon_hashes.update(catalog.repository.icon_hashes) + return plistlib.dumps(icon_hashes) + + def serialize_client_resources(self, tags): + client_resources = {} + for catalog in self.catalogs(tags): + repository_pk = catalog.repository.pk + for name in catalog.repository.client_resources: + client_resources[name] = repository_pk + return client_resources + # the manifest def build(self, tags): @@ -849,7 +1097,7 @@ class ManifestCatalog(models.Model): class Meta: unique_together = (("manifest", "catalog"),) - ordering = ('-catalog__priority', '-catalog__name') + ordering = ('catalog__name',) class ManifestSubManifest(models.Model): @@ -904,7 +1152,9 @@ def get_requires(self): def get_pkg_info(self): pkg_info = self.pkg_info.copy() - pkg_info["installer_item_location"] = build_munki_name("enrollment_pkg", self.id, self.get_name(), "pkg") + name = self.get_name() + pkg_info["installer_item_location"] = build_munki_name("enrollment_pkg", self.id, name, "pkg") + pkg_info["icon_name"] = build_munki_name("enrollment_pkg_icon", self.id, name, "png") return pkg_info @cached_property diff --git a/zentral/contrib/monolith/public_urls.py b/zentral/contrib/monolith/public_urls.py index b66cae143c..7fdbff3260 100644 --- a/zentral/contrib/monolith/public_urls.py +++ b/zentral/contrib/monolith/public_urls.py @@ -9,8 +9,10 @@ public_views.MRManifestView.as_view(), name='repository_manifest'), path('munki_repo/pkgs/', public_views.MRPackageView.as_view(), name='repository_package'), + path('munki_repo/icons/_icon_hashes.plist', + public_views.MRIconHashesView.as_view(), name='repository_icon_hashes'), path('munki_repo/icons/', - public_views.MRRedirectView.as_view(section="icons"), name='repository_icon'), + public_views.MRPackageView.as_view(), name='repository_icon'), path('munki_repo/client_resources/', - public_views.MRRedirectView.as_view(section="client_resources"), name='repository_client_resource'), + public_views.MRClientResourceView.as_view(), name='repository_client_resource'), ] diff --git 
a/zentral/contrib/monolith/public_views.py b/zentral/contrib/monolith/public_views.py index b8f2a61468..f86f4c6488 100644 --- a/zentral/contrib/monolith/public_views.py +++ b/zentral/contrib/monolith/public_views.py @@ -1,4 +1,3 @@ -from itertools import chain import logging import plistlib import random @@ -97,9 +96,12 @@ def dispatch(self, request, *args, **kwargs): class MRNameView(MRBaseView): - def get_request_args(self, name): + def get_name(self, kwargs): + return kwargs["name"] + + def get_request_args(self): try: - model, key = parse_munki_name(name) + model, key = parse_munki_name(self.name) except MunkiNameError: model = key = None return model, key @@ -116,10 +118,10 @@ def get_cache_key(self, model, key): return ".".join(str(i) for i in items) def get(self, request, *args, **kwargs): - name = kwargs["name"] + self.name = self.get_name(kwargs) event_payload = {"type": self.event_payload_type, - "name": name} - model, key = self.get_request_args(name) + "name": self.name} + model, key = self.get_request_args() if model is None or key is None: error = True response = HttpResponseForbidden("No no no!") @@ -169,13 +171,13 @@ def do_get(self, model, key, cache_key, event_payload): class MRManifestView(MRNameView): event_payload_type = "manifest" - def get_request_args(self, name): - model, key = super().get_request_args(name) + def get_request_args(self): + model, key = super().get_request_args() if model is None or key is None: # Not a valid munki name. # It is the first request for the main manifest. model = "manifest" - key = self.manifest.id + key = self.manifest.pk return model, key def do_get(self, model, key, cache_key, event_payload): @@ -263,49 +265,67 @@ def do_get(self, model, key, cache_key, event_payload): return HttpResponseRedirect(default_storage.url(filename)) else: return FileResponse(default_storage.open(filename)) - elif model == "repository_package": - pk = key - event_payload["repository_package"] = {"id": pk} - pkginfo_name = pkginfo_version = pkginfo_iil = pkginfo_fn = None - try: - pkginfo_name, pkginfo_version, pkginfo_iil, pkginfo_fn = cache.get(cache_key) - except TypeError: - for pkginfo in chain(self.manifest.pkginfos_with_deps_and_updates(self.tags), - self.manifest.enrollment_packages_pkginfo_deps(self.tags)): - if pkginfo.pk == pk: - pkginfo_name = pkginfo.name.name - pkginfo_version = pkginfo.version - if pkginfo.file: - pkginfo_fn = pkginfo.file.name - else: - pkginfo_iil = pkginfo.data.get("installer_item_location") - break - # set the cache value, even if pkginfo_name, pkginfo_version and pkginfo_iil are None - cache.set(cache_key, (pkginfo_name, pkginfo_version, pkginfo_iil, pkginfo_fn), timeout=None) - else: - event_payload["cache"]["hit"] = True - if pkginfo_name is not None: - event_payload["repository_package"]["name"] = pkginfo_name - if pkginfo_version is not None: - event_payload["repository_package"]["version"] = pkginfo_version - if pkginfo_iil: - return monolith_conf.repository.make_munki_repository_response( - "pkgs", pkginfo_iil, cache_server=self._get_cache_server() - ) - elif pkginfo_fn: - if self._redirect_to_files: - return HttpResponseRedirect(default_storage.url(pkginfo_fn)) - else: - return FileResponse(default_storage.open(pkginfo_fn)) - else: - # should never happen - return HttpResponseNotFound("PkgInfo not found!") + elif model == "enrollment_pkg_icon": + return HttpResponseNotFound("No icon available for this package!") + elif model in ("icon", "installer_item", "uninstaller_item"): + event_payload["package_info"] = 
{"id": key} + sentinel = object() + cached_pkginfo = cache.get(cache_key, sentinel) + if cached_pkginfo is sentinel: + cached_pkginfo = self.manifest.get_pkginfo_for_cache(self.tags, key) + # set the cache value, even if None + cache.set(cache_key, cached_pkginfo, timeout=604800) # 7 days + if cached_pkginfo: + event_payload["package_info"].update({ + "name": cached_pkginfo.name, + "version": cached_pkginfo.version + }) + event_payload["repository"] = {"pk": cached_pkginfo.repository_pk} + repository = monolith_conf.get_repository(cached_pkginfo.repository_pk) + if repository: + event_payload["repository"]["name"] = repository.name + section, name = cached_pkginfo.get_repository_section_and_name(model) + if section and name: + return repository.make_munki_repository_response( + section, name, self._get_cache_server() + ) + return HttpResponseNotFound("Not found!") -class MRRedirectView(MRBaseView): - section = None +class MRIconHashesView(MRNameView): + event_payload_type = "icons" - def get(self, request, *args, **kwargs): - name = kwargs["name"] - self.post_monolith_munki_request(type=self.section, name=name) - return monolith_conf.repository.make_munki_repository_response(self.section, name) + def get_name(self, kwargs): + return "_icon_hashes.plist" + + def get_request_args(self): + return "icon_hashes", self.manifest.pk + + def do_get(self, model, key, cache_key, event_payload): + icon_hashes = cache.get(cache_key) + if not icon_hashes: + icon_hashes = self.manifest.serialize_icon_hashes(self.tags) + cache.set(cache_key, icon_hashes, timeout=604800) # 7 days + else: + event_payload["cache"]["hit"] = True + return HttpResponse(icon_hashes, content_type="application/xml") + + +class MRClientResourceView(MRNameView): + event_payload_type = "client_resources" + + def get_request_args(self): + return "client_resources", self.manifest.pk + + def do_get(self, model, key, cache_key, event_payload): + client_resources = cache.get(cache_key) + if client_resources is None: + client_resources = self.manifest.serialize_client_resources(self.tags) + cache.set(cache_key, client_resources, timeout=604800) # 7 days + else: + event_payload["cache"]["hit"] = True + repository_pk = client_resources.get(self.name) + if repository_pk: + return monolith_conf.get_repository(repository_pk).make_munki_repository_response("client_resources", + self.name) + return HttpResponseNotFound("Not found!") diff --git a/zentral/contrib/monolith/repository_backends/README.md b/zentral/contrib/monolith/repository_backends/README.md index 373331628e..3cff090d5e 100644 --- a/zentral/contrib/monolith/repository_backends/README.md +++ b/zentral/contrib/monolith/repository_backends/README.md @@ -22,22 +22,23 @@ IAM policy example: ```json { "Version": "2012-10-17", - "Statement": [ - { - "Sid": "GetBucketInfo", - "Effect": "Allow", - "Action": [ - "s3:ListBucket", - "s3:GetBucketLocation" - ], - "Resource": "arn:aws:s3:::BUCKET_NAME" + "Statement": [ + { + "Sid": "ZentralMonolithRO", + "Effect": "Allow", + "Principal": { + "AWS": "arn:aws:iam::123456789012:user/Dave" }, - { - "Sid": "GetAllBucketObjects", - "Effect": "Allow", - "Action": "s3:GetObject", - "Resource": "arn:aws:s3:::BUCKET_NAME/*" - } - ] + "Action": [ + "s3:GetObject", + "s3:GetBucketLocation", + "s3:ListBucket" + ], + "Resource": [ + "arn:aws:s3:::BUCKET_NAME/*", + "arn:aws:s3:::BUCKET_NAME" + ] + } + ] } -``` \ No newline at end of file +``` diff --git a/zentral/contrib/monolith/repository_backends/__init__.py 
b/zentral/contrib/monolith/repository_backends/__init__.py
index e69de29bb2..5b5e780418 100644
--- a/zentral/contrib/monolith/repository_backends/__init__.py
+++ b/zentral/contrib/monolith/repository_backends/__init__.py
@@ -0,0 +1,22 @@
+from django.db import models
+
+
+class RepositoryBackend(models.TextChoices):
+    S3 = "S3", "Amazon S3"
+    VIRTUAL = "VIRTUAL", "Virtual"
+
+
+def get_repository_backend(repository, load=False):
+    backend = RepositoryBackend(repository.backend)
+    if backend == RepositoryBackend.S3:
+        from .s3 import S3Repository
+        return S3Repository(repository, load)
+    elif backend == RepositoryBackend.VIRTUAL:
+        from .virtual import VirtualRepository
+        return VirtualRepository(repository, load)
+    else:
+        raise ValueError(f"Unknown repository backend: {backend}")
+
+
+def load_repository_backend(repository):
+    return get_repository_backend(repository, load=True)
diff --git a/zentral/contrib/monolith/repository_backends/base.py b/zentral/contrib/monolith/repository_backends/base.py
index 23664f534e..e642f85c11 100644
--- a/zentral/contrib/monolith/repository_backends/base.py
+++ b/zentral/contrib/monolith/repository_backends/base.py
@@ -1,24 +1,77 @@
 from datetime import datetime
+import hashlib
 import logging
 import plistlib
 from django.db.models import Count, Q
 from zentral.contrib.monolith.models import Catalog, Manifest, PkgInfo, PkgInfoCategory, PkgInfoName
 from zentral.core.events.base import AuditEvent
+from zentral.core.secret_engines import decrypt, decrypt_str, encrypt_str, rewrap
 
 
 logger = logging.getLogger('zentral.contrib.monolith.repository_backends.base')
 
 
 class BaseRepository:
-    def __init__(self, config):
-        self.manual_catalog_management = config.get("manual_catalog_management", False)
-        if self.manual_catalog_management:
-            self.default_catalog_name = config.get("default_catalog", "Not assigned").strip()
-        else:
-            self.default_catalog_name = None
+    kwargs_keys = ()
+    encrypted_kwargs_keys = ()
+    form_class = None
+
+    def __init__(self, repository, load=True):
+        self.repository = repository
+        self.name = repository.name
+        if load:
+            self.load()
+
+    def load(self):
+        backend_kwargs = self.get_kwargs()
+        for key in self.kwargs_keys:
+            setattr(self, key, backend_kwargs.get(key))
+
+    # secrets
+
+    def _get_secret_engine_kwargs(self, subfield):
+        if not self.name:
+            raise ValueError("Repository must have a name")
+        return {"field": f"backend_kwargs.{subfield}",
+                "model": "monolith.repository",
+                "name": self.name}
+
+    def get_kwargs(self):
+        if not isinstance(self.repository.backend_kwargs, dict):
+            raise ValueError("Repository hasn't been initialized")
+        return {
+            k: decrypt_str(v, **self._get_secret_engine_kwargs(k)) if k in self.encrypted_kwargs_keys else v
+            for k, v in self.repository.backend_kwargs.items()
+        }
+
+    def get_kwargs_for_event(self):
+        if not isinstance(self.repository.backend_kwargs, dict):
+            raise ValueError("Repository hasn't been initialized")
+        return {
+            k if k not in self.encrypted_kwargs_keys else f"{k}_hash":
+            hashlib.sha256(decrypt(v, **self._get_secret_engine_kwargs(k))).hexdigest()
+            if k in self.encrypted_kwargs_keys else v
+            for k, v in self.repository.backend_kwargs.items()
+            if v is not None
+        }
+
+    def set_kwargs(self, kwargs):
+        self.repository.backend_kwargs = {
+            k: encrypt_str(v, **self._get_secret_engine_kwargs(k)) if k in self.encrypted_kwargs_keys else v
+            for k, v in kwargs.items()
+            if v
+        }
+
+    def rewrap_kwargs(self):
+        self.repository.backend_kwargs = {
+            k: rewrap(v, **self._get_secret_engine_kwargs(k)) if k in
self.encrypted_kwargs_keys else v + for k, v in self.repository.backend_kwargs.items() + } + + # sync def _import_category(self, name, audit_callback): - pic, created = PkgInfoCategory.objects.get_or_create(name=name) + pic, created = PkgInfoCategory.objects.get_or_create(repository=self.repository, name=name) if created and audit_callback: audit_callback(pic, AuditEvent.Action.CREATED) return pic @@ -31,18 +84,13 @@ def _import_name(self, name, audit_callback): def _import_catalogs(self, pkg_info_data, audit_callback): catalogs = [] - if self.default_catalog_name: - # force the catalog to the default catalog - pkg_info_catalogs = [self.default_catalog_name] - else: - # take the catalogs from the pkg info data - pkg_info_catalogs = pkg_info_data.get("catalogs", []) + pkg_info_catalogs = pkg_info_data.get("catalogs", []) for catalog_name in pkg_info_catalogs: catalog_name = catalog_name.strip() try: - catalog = Catalog.objects.get(name=catalog_name) + catalog = Catalog.objects.get(repository=self.repository, name=catalog_name) except Catalog.DoesNotExist: - catalog = Catalog.objects.create(name=catalog_name) + catalog = Catalog.objects.create(repository=self.repository, name=catalog_name) if audit_callback: audit_callback(catalog, AuditEvent.Action.CREATED) else: @@ -84,10 +132,13 @@ def _import_pkg_info(self, pkg_info_data, audit_callback): # save PkgInfo in db try: pkg_info = (PkgInfo.objects.prefetch_related("catalogs", "requires", "update_for") - .select_related("category", "name") - .get(name=pkg_info_name, version=version)) + .select_related("repository", "name", "category") + .get(repository=self.repository, + name=pkg_info_name, + version=version)) except PkgInfo.DoesNotExist: - pkg_info = PkgInfo.objects.create(name=pkg_info_name, + pkg_info = PkgInfo.objects.create(repository=self.repository, + name=pkg_info_name, version=version, category=pkg_info_category, data=pkg_info_data) @@ -119,12 +170,11 @@ def _import_pkg_info(self, pkg_info_data, audit_callback): pkg_info.data = pkg_info_data updated = True # update m2m attributes - pkg_info_m2m_updates = [("requires", requires), - ("update_for", update_for)] - if not self.manual_catalog_management: - # need to update the pkg info catalogs too - pkg_info_m2m_updates.append(("catalogs", catalogs)) - for pkg_info_attr, pkg_info_values in pkg_info_m2m_updates: + for pkg_info_attr, pkg_info_values in ( + ("requires", requires), + ("update_for", update_for), + ("catalogs", catalogs) + ): pkg_info_old_values = set(getattr(pkg_info, pkg_info_attr).all()) pkg_info_values = set(pkg_info_values) if pkg_info_old_values != pkg_info_values: @@ -164,25 +214,50 @@ def _bump_manifest(self, manifest, audit_callback): def sync_catalogs(self, audit_callback=None): found_pkg_info_pks = set([]) found_catalog_pks = set([]) + # initialize repository icon hashes + repo_icon_hashes = {} + icon_hashes_content = self.get_icon_hashes_content() + if icon_hashes_content: + icon_hashes = plistlib.loads(icon_hashes_content) + else: + icon_hashes = {} # update or create current pkg_infos for pkg_info_data in plistlib.loads(self.get_all_catalog_content()): catalogs, pkg_info = self._import_pkg_info(pkg_info_data, audit_callback) found_catalog_pks.update(c.pk for c in catalogs) if pkg_info: found_pkg_info_pks.add(pkg_info.pk) + icon_hash = icon_hashes.get(pkg_info.get_original_icon_name()) + if icon_hash: + repo_icon_hashes[pkg_info.get_monolith_icon_name()] = icon_hash # archive unknown non-local pkg_infos for pkg_info in (PkgInfo.objects.prefetch_related("catalogs", 
"requires", "update_for") .select_related("category", "name") .filter(archived_at__isnull=True) .exclude(Q(local=True) | Q(pk__in=found_pkg_info_pks))): self._archive_pkg_info(pkg_info, audit_callback) - # archive old catalogs if auto catalog management - if not self.manual_catalog_management: - for c in (Catalog.objects.annotate(pkginfo_count=Count("pkginfo", - filter=Q(pkginfo__archived_at__isnull=True))) - .filter(archived_at__isnull=True, pkginfo_count=0) - .exclude(pk__in=found_catalog_pks)): - self._archive_catalog(c, audit_callback) + # archive old catalogs + for c in (Catalog.objects.annotate(pkginfo_count=Count("pkginfo", + filter=Q(pkginfo__archived_at__isnull=True))) + .filter(archived_at__isnull=True, pkginfo_count=0) + .exclude(pk__in=found_catalog_pks)): + self._archive_catalog(c, audit_callback) + # update repository + self.repository.icon_hashes = repo_icon_hashes + self.repository.client_resources = list(self.iter_client_resources()) + self.repository.last_synced_at = datetime.utcnow() + self.repository.save() # bump versions of manifests connected to found catalogs for manifest in Manifest.objects.distinct().filter(manifestcatalog__catalog__pk__in=found_catalog_pks): self._bump_manifest(manifest, audit_callback) + + # to implement in the subclasses + + def get_all_catalog_content(self): + raise NotImplementedError + + def get_icon_hashes_content(self): + raise NotImplementedError + + def iter_client_resources(self): + raise NotImplementedError diff --git a/zentral/contrib/monolith/repository_backends/http.py b/zentral/contrib/monolith/repository_backends/http.py deleted file mode 100644 index 969fb9a321..0000000000 --- a/zentral/contrib/monolith/repository_backends/http.py +++ /dev/null @@ -1,23 +0,0 @@ -import os.path -from django.http import HttpResponseRedirect -import requests -from zentral.contrib.monolith.exceptions import RepositoryError -from .base import BaseRepository - - -class Repository(BaseRepository): - def __init__(self, config): - super().__init__(config) - self.root = config["root"] - - def get_all_catalog_content(self): - r = requests.get(os.path.join(self.root, "catalogs/all")) - if not r.status_code == 200: - raise RepositoryError - return r.content - - def make_munki_repository_response(self, section, name, cache_server=None): - url = os.path.join(self.root, section, name) - if cache_server: - url = cache_server.get_cache_url(url) - return HttpResponseRedirect(url) diff --git a/zentral/contrib/monolith/repository_backends/local.py b/zentral/contrib/monolith/repository_backends/local.py deleted file mode 100644 index 9101770f91..0000000000 --- a/zentral/contrib/monolith/repository_backends/local.py +++ /dev/null @@ -1,20 +0,0 @@ -import os.path -from django.http import FileResponse, HttpResponseNotFound -from .base import BaseRepository - - -class Repository(BaseRepository): - def __init__(self, config): - super().__init__(config) - self.root = config["root"] - - def get_all_catalog_content(self): - with open(os.path.join(self.root, "catalogs", "all"), "rb") as f: - return f.read() - - def make_munki_repository_response(self, section, name, cache_server=None): - filepath = os.path.join(self.root, section, name) - if not os.path.isfile(filepath): - return HttpResponseNotFound("not found") - else: - return FileResponse(open(filepath, 'rb')) diff --git a/zentral/contrib/monolith/repository_backends/s3.py b/zentral/contrib/monolith/repository_backends/s3.py index d12e78ac9d..48dd2903bc 100644 --- a/zentral/contrib/monolith/repository_backends/s3.py +++ 
b/zentral/contrib/monolith/repository_backends/s3.py @@ -4,14 +4,15 @@ import boto3 from botocore.client import Config from botocore.signers import CloudFrontSigner -from cryptography.hazmat.backends import default_backend from cryptography.hazmat.primitives import hashes from cryptography.hazmat.primitives import serialization from cryptography.hazmat.primitives.asymmetric import padding +from django import forms from django.http import HttpResponseRedirect from django.utils.functional import cached_property import requests from requests.utils import requote_uri +from rest_framework import serializers from zentral.contrib.monolith.exceptions import RepositoryError from zentral.utils.boto3 import make_refreshable_assume_role_session from .base import BaseRepository @@ -20,58 +21,139 @@ logger = logging.getLogger("zentral.contrib.monolith.repository_backends.s3") -class Repository(BaseRepository): - def __init__(self, config): - super().__init__(config) - - # bucket (required) - self.bucket = config["bucket"] - - # bucket region (optional, can be fetched) - self.region_name = config.get("region_name") - - # relative path to the repository in the bucket (optional, default = the root of the bucket) - self.prefix = config.get("prefix", "") +def load_cloudfront_private_key(privkey_pem): + return serialization.load_pem_private_key( + privkey_pem.encode("utf-8"), + password=None, + ) + + +class S3RepositoryForm(forms.Form): + bucket = forms.CharField() + region_name = forms.CharField(required=False) + prefix = forms.CharField(required=False) + access_key_id = forms.CharField(required=False) + secret_access_key = forms.CharField(required=False) + assume_role_arn = forms.CharField(label="Assume role ARN", required=False) + signature_version = forms.CharField(required=False) + endpoint_url = forms.URLField(label="Endpoint URL", required=False) + cloudfront_domain = forms.CharField(required=False) + cloudfront_key_id = forms.CharField(required=False) + cloudfront_privkey_pem = forms.CharField(widget=forms.Textarea, required=False) + + def clean_cloudfront_privkey_pem(self): + data = self.cleaned_data.get("cloudfront_privkey_pem") + if data: + try: + load_cloudfront_private_key(data) + except Exception: + raise forms.ValidationError("Invalid private key.") + return data + + def clean(self): + cleaned_data = super().clean() + # cloudfront + cf_domain = cleaned_data.get("cloudfront_domain") + cf_key_id = cleaned_data.get("cloudfront_key_id") + cf_privkey_pem = cleaned_data.get("cloudfront_privkey_pem") + if cf_domain or cf_key_id or cf_privkey_pem: + err_msg = "This field is required when configuring Cloudfront." 
+ if not cf_domain: + self.add_error("cloudfront_domain", err_msg) + if not cf_key_id: + self.add_error("cloudfront_key_id", err_msg) + if not cf_privkey_pem and "cloudfront_privkey_pem" not in self.errors: + self.add_error("cloudfront_privkey_pem", err_msg) + + def get_backend_kwargs(self): + return {k: v for k, v in self.cleaned_data.items() if v} + + +class S3RepositorySerializer(serializers.Serializer): + bucket = serializers.CharField() + region_name = serializers.CharField(required=False) + prefix = serializers.CharField(required=False) + access_key_id = serializers.CharField(required=False) + secret_access_key = serializers.CharField(required=False) + assume_role_arn = serializers.CharField(required=False) + signature_version = serializers.CharField(required=False) + endpoint_url = serializers.URLField(required=False) + cloudfront_domain = serializers.CharField(required=False) + cloudfront_key_id = serializers.CharField(required=False) + cloudfront_privkey_pem = serializers.CharField(required=False) + + def validate_cloudfront_privkey_pem(self, value): + try: + load_cloudfront_private_key(value) + except Exception: + raise serializers.ValidationError("Invalid private key.") + return value + + def validate(self, data): + data = super().validate(data) + # cloudfront + cf_domain = data.get("cloudfront_domain") + cf_key_id = data.get("cloudfront_key_id") + cf_privkey_pem = data.get("cloudfront_privkey_pem") + cf_errors = {} + if cf_domain or cf_key_id or cf_privkey_pem: + err_msg = "This field is required when configuring Cloudfront." + if not cf_domain: + cf_errors.update({"cloudfront_domain": err_msg}) + if not cf_key_id: + cf_errors.update({"cloudfront_key_id": err_msg}) + if not cf_privkey_pem: + cf_errors.update({"cloudfront_privkey_pem": err_msg}) + if cf_errors: + raise serializers.ValidationError(cf_errors) + return data + + +class S3Repository(BaseRepository): + kwargs_keys = ( + "bucket", + "region_name", + "prefix", + "access_key_id", + "secret_access_key", + "assume_role_arn", + "signature_version", + "endpoint_url", + "cloudfront_domain", + "cloudfront_key_id", + "cloudfront_privkey_pem" + ) + encrypted_kwargs_keys = ( + "cloudfront_privkey_pem", + "secret_access_key" + ) + form_class = S3RepositoryForm + + def load(self): + super().load() + + # default prefix + if not self.prefix: + self.prefix = "" # fixed credentials (optional) self.credentials = {} - for k in ("aws_access_key_id", "aws_secret_access_key"): - v = config.get(k) + for ck, k in (("aws_access_key_id", "access_key_id"), + ("aws_secret_access_key", "secret_access_key")): + v = getattr(self, k) if v: - self.credentials[k] = v + self.credentials[ck] = v - # ARN of the role to assume (optional) - self.assume_role_arn = config.get("assume_role_arn") + # signature version + if not self.signature_version: + self.signature_version = "s3v4" - # signature version (optional, default = s3v4) - self.signature_version = config.get("signature_version", "s3v4") - - # endpoint URL (optional, use it for special S3 like services) - self.endpoint_url = config.get("endpoint_url") - - # cloudfront (optional) + # cloudfront signer (optional) self.cloudfront_signer = None - cloudfront_cfg = config.get("cloudfront") - if not cloudfront_cfg: - return - self.cloudfront_domain = cloudfront_cfg.get("domain") if not self.cloudfront_domain: - logger.error("Missing cloudfront domain") - return - key_id = cloudfront_cfg.get("key_id") - if not key_id: - logger.error("Missing cloudfront Key ID") - return - privkey_pem = 
cloudfront_cfg.get("privkey_pem") - if not privkey_pem: - logger.error("Missing cloudfront privkey PEM") return try: - privkey = serialization.load_pem_private_key( - privkey_pem.encode("utf-8"), - password=None, - backend=default_backend() - ) + privkey = load_cloudfront_private_key(self.cloudfront_privkey_pem) except Exception: logger.exception("Cloud not load cloudfront privkey") return @@ -79,7 +161,7 @@ def __init__(self, config): def rsa_signer(message): return privkey.sign(message, padding.PKCS1v15(), hashes.SHA1()) - self.cloudfront_signer = CloudFrontSigner(key_id, rsa_signer) + self.cloudfront_signer = CloudFrontSigner(self.cloudfront_key_id, rsa_signer) @cached_property def _session(self): @@ -103,18 +185,42 @@ def _client(self): return self._session.client("s3", region_name=self.region_name, endpoint_url=self.endpoint_url, config=Config(signature_version=self.signature_version)) - def get_all_catalog_content(self): + def _get_resource(self, key, missing_ok=False): try: return self._client.get_object( Bucket=self.bucket, - Key=os.path.join(self.prefix, "catalogs/all") + Key=os.path.join(self.prefix, key) )['Body'].read() + except self._client.exceptions.NoSuchKey: + logging_args = ("Could not find key %s in repository %s", key, self.repository) + if missing_ok: + logger.info(*logging_args) + return None + logger.exception(*logging_args) + raise RepositoryError + except Exception: + logger.exception("Could not download all catalog from repository %s", self.repository) + raise RepositoryError + + def get_all_catalog_content(self): + return self._get_resource("catalogs/all") + + def get_icon_hashes_content(self): + return self._get_resource("icons/_icon_hashes.plist", missing_ok=True) + + def iter_client_resources(self): + prefix = os.path.join(self.prefix, "client_resources/") + try: + paginator = self._client.get_paginator('list_objects_v2') + for page in paginator.paginate(Bucket=self.bucket, Prefix=prefix): + for obj in page.get("Contents", []): + yield obj["Key"].removeprefix(prefix) except Exception: - logger.exception("Could not download all catalog") + logger.exception("Could not list client resources keys in repository %s", self.repository) raise RepositoryError def make_munki_repository_response(self, section, name, cache_server=None): - expires_in = 180 # 3 minutes + expires_in = 180 # 3 minutes TODO: hardcoded key = os.path.join(self.prefix, section, name) if self.cloudfront_signer: url = self.cloudfront_signer.generate_presigned_url( diff --git a/zentral/contrib/monolith/repository_backends/virtual.py b/zentral/contrib/monolith/repository_backends/virtual.py new file mode 100644 index 0000000000..b426235a4b --- /dev/null +++ b/zentral/contrib/monolith/repository_backends/virtual.py @@ -0,0 +1,27 @@ +import logging +from django.core.files.storage import default_storage +from django.http import FileResponse, HttpResponseNotFound, HttpResponseRedirect +from django.utils.functional import cached_property +from zentral.utils.storage import file_storage_has_signed_urls +from .base import BaseRepository + + +logger = logging.getLogger("zentral.contrib.monolith.repository_backends.virtual") + + +class VirtualRepository(BaseRepository): + def sync_catalogs(self, audit_callback=None): + # NOOP + return + + @cached_property + def _redirect_to_files(self): + return file_storage_has_signed_urls() + + def make_munki_repository_response(self, section, name, cache_server=None): + if section == "pkgs": + if self._redirect_to_files: + return 
diff --git a/zentral/contrib/monolith/repository_backends/virtual.py b/zentral/contrib/monolith/repository_backends/virtual.py
new file mode 100644
index 0000000000..b426235a4b
--- /dev/null
+++ b/zentral/contrib/monolith/repository_backends/virtual.py
@@ -0,0 +1,27 @@
+import logging
+from django.core.files.storage import default_storage
+from django.http import FileResponse, HttpResponseNotFound, HttpResponseRedirect
+from django.utils.functional import cached_property
+from zentral.utils.storage import file_storage_has_signed_urls
+from .base import BaseRepository
+
+
+logger = logging.getLogger("zentral.contrib.monolith.repository_backends.virtual")
+
+
+class VirtualRepository(BaseRepository):
+    def sync_catalogs(self, audit_callback=None):
+        # NOOP
+        return
+
+    @cached_property
+    def _redirect_to_files(self):
+        return file_storage_has_signed_urls()
+
+    def make_munki_repository_response(self, section, name, cache_server=None):
+        if section == "pkgs":
+            if self._redirect_to_files:
+                return HttpResponseRedirect(default_storage.url(name))
+            elif default_storage.exists(name):
+                return FileResponse(default_storage.open(name))
+        return HttpResponseNotFound("Munki asset not found!")
diff --git a/zentral/contrib/monolith/serializers.py b/zentral/contrib/monolith/serializers.py
index 5d9e6ae267..2a4ca20a4a 100644
--- a/zentral/contrib/monolith/serializers.py
+++ b/zentral/contrib/monolith/serializers.py
@@ -4,7 +4,73 @@
 from zentral.contrib.inventory.models import EnrollmentSecret, Tag
 from zentral.contrib.inventory.serializers import EnrollmentSecretSerializer
 from .models import (Catalog, Condition, Enrollment, Manifest, ManifestCatalog, ManifestSubManifest,
-                     PkgInfoName, SubManifest, SubManifestPkgInfo)
+                     PkgInfoName, Repository, RepositoryBackend, SubManifest, SubManifestPkgInfo)
+from .repository_backends.s3 import S3RepositorySerializer
+
+
+class RepositorySerializer(serializers.ModelSerializer):
+    backend_kwargs = serializers.JSONField(source="get_backend_kwargs", required=False)
+
+    class Meta:
+        model = Repository
+        fields = (
+            "id",
+            "backend",
+            "backend_kwargs",
+            "name",
+            "meta_business_unit",
+            "icon_hashes",
+            "client_resources",
+            "created_at",
+            "updated_at",
+            "last_synced_at",
+        )
+
+    def validate_meta_business_unit(self, value):
+        if self.instance:
+            for manifest in self.instance.manifests():
+                if manifest.meta_business_unit != value:
+                    raise serializers.ValidationError(
+                        f"Repository linked to manifest '{manifest}' which has a different business unit."
+                    )
+        return value
+
+    def validate(self, data):
+        backend_kwargs = data.pop("get_backend_kwargs", {})
+        data = super().validate(data)
+        backend = data.get("backend")
+        if backend:
+            if backend == RepositoryBackend.S3:
+                backend_serializer = S3RepositorySerializer(data=backend_kwargs)
+                if backend_serializer.is_valid():
+                    data["backend_kwargs"] = backend_serializer.data
+                else:
+                    raise serializers.ValidationError({"backend_kwargs": backend_serializer.errors})
+            elif backend == RepositoryBackend.VIRTUAL:
+                if backend_kwargs and backend_kwargs != {}:
+                    raise serializers.ValidationError({
+                        "backend_kwargs": {
+                            "non_field_errors": ["Must be an empty dict for a virtual repository."]
+                        }
+                    })
+        return data
+
+    def create(self, validated_data):
+        backend_kwargs = validated_data.pop("backend_kwargs", {})
+        validated_data["backend_kwargs"] = {}
+        repository = super().create(validated_data)
+        repository.set_backend_kwargs(backend_kwargs)
+        repository.save()
+        return repository
+
+    def update(self, instance, validated_data):
+        backend_kwargs = validated_data.pop("backend_kwargs", {})
+        repository = super().update(instance, validated_data)
+        repository.set_backend_kwargs(backend_kwargs)
+        repository.save()
+        for manifest in repository.manifests():
+            manifest.bump_version()
+        return repository
 
 
 class CatalogSerializer(serializers.ModelSerializer):
@@ -13,6 +79,20 @@ class Meta:
         fields = '__all__'
         read_only_fields = ['archived_at']
+    def validate_repository(self, value):
+        if value.backend != RepositoryBackend.VIRTUAL:
+            raise serializers.ValidationError("Not a virtual repository.")
+        if value.meta_business_unit and self.instance:
+            if (
+                Manifest.objects.filter(manifestcatalog__catalog=self.instance)
+                .exclude(meta_business_unit=value.meta_business_unit)
+                .count()
+            ):
+                raise serializers.ValidationError(
+                    "This catalog is included in manifests linked to different business units than this repository."
+                )
+        return value
+
 
 
 class ConditionSerializer(serializers.ModelSerializer):
     class Meta:
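The `RepositorySerializer` above backs the repositories REST API. Below is a minimal sketch of creating an S3 repository through it, assuming a `/api/monolith/repositories/` route, token authentication and the backend value `"S3"` (none of which appear in this hunk); the payload keys are the serializer fields, everything else is a placeholder.

```python
# Sketch only: endpoint path, backend value and token are assumptions;
# the payload keys come from RepositorySerializer / S3RepositorySerializer.
import requests

BASE_URL = "https://zentral.example.com"               # placeholder host
HEADERS = {"Authorization": "Token 0123456789abcdef"}  # placeholder API token

payload = {
    "name": "Munki S3 repository",
    "backend": "S3",                                   # presumably RepositoryBackend.S3
    # "meta_business_unit": 1,                         # optional business unit pk
    "backend_kwargs": {
        "bucket": "acme-munki-repo",
        "region_name": "us-east-1",
        "prefix": "munki_repo",
        # cloudfront_domain, cloudfront_key_id and cloudfront_privkey_pem are
        # optional, but must be provided together if used.
    },
}

resp = requests.post(f"{BASE_URL}/api/monolith/repositories/", headers=HEADERS, json=payload)
resp.raise_for_status()
print(resp.json()["id"], resp.json()["backend"])
```

A `Virtual` repository would presumably be created the same way, with `backend` set to the virtual backend value and `backend_kwargs` left empty or omitted, which is what the serializer's `validate()` enforces.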
diff --git a/zentral/contrib/monolith/templates/monolith/catalog_confirm_delete.html b/zentral/contrib/monolith/templates/monolith/catalog_confirm_delete.html
index 489d67023c..ebe1014ef3 100644
--- a/zentral/contrib/monolith/templates/monolith/catalog_confirm_delete.html
+++ b/zentral/contrib/monolith/templates/monolith/catalog_confirm_delete.html
@@ -3,7 +3,8 @@ {% block content %}
[hunk body lost in extraction: the HTML markup was stripped]
diff --git a/zentral/contrib/monolith/templates/monolith/catalog_detail.html b/zentral/contrib/monolith/templates/monolith/catalog_detail.html
index f66ce21bdf..9d3bdd3a97 100644
--- a/zentral/contrib/monolith/templates/monolith/catalog_detail.html
+++ b/zentral/contrib/monolith/templates/monolith/catalog_detail.html
@@ -4,8 +4,9 @@ {% block content %}
@@ -15,77 +16,70 @@
[hunk bodies lost in extraction; the surviving template tags show the UPDATE button now gated on object.can_be_updated and pointing to 'monolith:update_catalog', the buttons relabeled "Edit Monolith catalog" / "Delete Monolith catalog", a Repository row (guarded by perms.monolith.view_repository, with the backend display), an inline Manifests list with tags and a Packages row with a "Browse all packages" link added to the attributes table, and the separate Manifest and Package sections removed]
diff --git a/zentral/contrib/monolith/templates/monolith/catalog_form.html b/zentral/contrib/monolith/templates/monolith/catalog_form.html
index 0a8b11eba3..48146e5dec 100644
--- a/zentral/contrib/monolith/templates/monolith/catalog_form.html
+++ b/zentral/contrib/monolith/templates/monolith/catalog_form.html
@@ -3,7 +3,8 @@ {% block content %}
[hunk body lost in extraction; the surviving template tags show the page title switching from {{ title }} to "Update {{ object.name }} catalog" / "Create catalog", with {% csrf_token %} and {{ form }} unchanged]
diff --git a/zentral/contrib/monolith/templates/monolith/catalog_list.html b/zentral/contrib/monolith/templates/monolith/catalog_list.html
index e8f433e0b1..faaba0db6c 100644
--- a/zentral/contrib/monolith/templates/monolith/catalog_list.html
+++ b/zentral/contrib/monolith/templates/monolith/catalog_list.html
@@ -1,36 +1,58 @@
 {% extends 'base.html' %}
+{% load ui_extras %}
 {% block content %}
[rest of the hunk lost in extraction; the surviving template tags show the heading becoming "Catalogs ({{ object_list|length }})", a CREATE button shown for perms.monolith.add_catalog (replacing the can_create_catalog link), the manual_catalog_management checks replaced by perms.monolith.change_catalog / perms.monolith.delete_catalog, table columns Repository, Name, Priority, Created at and Archived at (the repository cell guarded by perms.monolith.view_repository), and per-row UPDATE/DELETE buttons gated on catalog.can_be_updated / catalog.can_be_deleted]
diff --git a/zentral/contrib/monolith/templates/monolith/condition_confirm_delete.html b/zentral/contrib/monolith/templates/monolith/condition_confirm_delete.html
index d8e6463842..91b620bbb1 100644
--- a/zentral/contrib/monolith/templates/monolith/condition_confirm_delete.html
+++ b/zentral/contrib/monolith/templates/monolith/condition_confirm_delete.html
@@ -3,7 +3,8 @@ {% block content %}
[hunk body lost in extraction]
diff --git a/zentral/contrib/monolith/templates/monolith/condition_detail.html b/zentral/contrib/monolith/templates/monolith/condition_detail.html
index 923c0741cc..d5dd3c3065 100644
--- a/zentral/contrib/monolith/templates/monolith/condition_detail.html
+++ b/zentral/contrib/monolith/templates/monolith/condition_detail.html
@@ -4,8 +4,9 @@ {% block content %}
[hunk body lost in extraction]
diff --git a/zentral/contrib/monolith/templates/monolith/condition_form.html b/zentral/contrib/monolith/templates/monolith/condition_form.html
index 89692c1103..8fe4223074 100644
--- a/zentral/contrib/monolith/templates/monolith/condition_form.html
+++ b/zentral/contrib/monolith/templates/monolith/condition_form.html
@@ -3,7 +3,8 @@ {% block content %}
[hunk body lost in extraction]
[hunks from a manifest detail template, whose diff --git header was lost in extraction (hunks @@ -134,17 +135,28 @@ and @@ -158,6 +170,7 @@); the surviving template tags show the manifest catalogs table gaining a repository column, each row wrapped in {% with manifest_catalog.catalog as catalog %}, and the repository and catalog links guarded by perms.monolith.view_repository / perms.monolith.view_catalog]
diff --git a/zentral/contrib/monolith/templates/monolith/manifest_cache_server_setup.html b/zentral/contrib/monolith/templates/monolith/manifest_cache_server_setup.html
index 45382274d8..aef15a2021 100644
--- a/zentral/contrib/monolith/templates/monolith/manifest_cache_server_setup.html
+++ b/zentral/contrib/monolith/templates/monolith/manifest_cache_server_setup.html
@@ -3,7 +3,8 @@ {% block content %}
[hunk body lost in extraction]
diff --git a/zentral/contrib/monolith/templates/monolith/manifest_catalog_form.html b/zentral/contrib/monolith/templates/monolith/manifest_catalog_form.html
index d24e2ed108..db4bb57ec0 100644
--- a/zentral/contrib/monolith/templates/monolith/manifest_catalog_form.html
+++ b/zentral/contrib/monolith/templates/monolith/manifest_catalog_form.html
@@ -3,7 +3,8 @@ {% block content %}
[hunk body lost in extraction]
diff --git a/zentral/contrib/monolith/templates/monolith/manifest_enrollment_package_forms.html b/zentral/contrib/monolith/templates/monolith/manifest_enrollment_package_forms.html
index 7e39c51aa1..568f610285 100644
--- a/zentral/contrib/monolith/templates/monolith/manifest_enrollment_package_forms.html
+++ b/zentral/contrib/monolith/templates/monolith/manifest_enrollment_package_forms.html
@@ -3,7 +3,8 @@ {% block content %}