Description
Local attempts to use the AMQ class while working on #11041 have been failing with the following error:
self = <tests.functional.workloads.app.amq.test_rgw_kafka_notifications.TestRGWAndKafkaNotifications object at 0x30f7507c0>, rgw_bucket_factory = <function bucket_factory_fixture.<locals>._create_buckets at 0x324b46200>
def test_rgw_kafka_notifications(self, rgw_bucket_factory):
"""
Test to verify rgw kafka notifications
"""
# Get sc
sc = default_storage_class(interface_type=constants.CEPHBLOCKPOOL)
# Deploy amq cluster
> self.amq.setup_amq_cluster(sc.name)
tests/functional/workloads/app/amq/test_rgw_kafka_notifications.py:98:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
ocs_ci/ocs/amq.py:925: in setup_amq_cluster
self.setup_amq_kafka_persistent(sc_name, size, replicas)
ocs_ci/ocs/amq.py:240: in setup_amq_kafka_persistent
kafka_persistent = templating.load_yaml(
ocs_ci/utility/templating.py:159: in load_yaml
return loader(fs.read())
venv/lib/python3.10/site-packages/yaml/__init__.py:125: in safe_load
return load(stream, SafeLoader)
venv/lib/python3.10/site-packages/yaml/__init__.py:81: in load
return loader.get_single_data()
venv/lib/python3.10/site-packages/yaml/constructor.py:49: in get_single_data
node = self.get_single_node()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <yaml.loader.SafeLoader object at 0x30f79c2b0>
def get_single_node(self):
# Drop the STREAM-START event.
self.get_event()
# Compose a document if the stream is not empty.
document = None
if not self.check_event(StreamEndEvent):
document = self.compose_document()
# Ensure that the stream contains no more documents.
if not self.check_event(StreamEndEvent):
event = self.get_event()
> raise ComposerError("expected a single document in the stream",
document.start_mark, "but found another document",
event.start_mark)
E yaml.composer.ComposerError: expected a single document in the stream
E in "<unicode string>", line 1, column 1:
E apiVersion: kafka.strimzi.io/v1beta2
E ^
E but found another document
E in "<unicode string>", line 19, column 1:
E ---
E ^
venv/lib/python3.10/site-packages/yaml/composer.py:41: ComposerError
Apparently, the strimzi-kafka-operator/packaging/examples/kafka/kafka-persistent.yaml
template we've been using has been changed to contain multiple resource definitions instead of a single one, and ZooKeeper has been removed: strimzi/strimzi-kafka-operator#10982. Since `load_yaml` in ocs_ci/utility/templating.py parses the template with `yaml.safe_load()`, which accepts only a single YAML document per stream, parsing the updated template now fails with the `ComposerError` above.
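The failure mode can be reproduced in isolation, and `yaml.safe_load_all()` is one way to handle the new multi-document template. A minimal sketch (the YAML content below is a hypothetical stand-in, not the real kafka-persistent.yaml):

```python
import yaml

# A multi-document YAML stream separated by "---", resembling the
# updated template's structure (hypothetical minimal content).
multi_doc = """\
apiVersion: kafka.strimzi.io/v1beta2
kind: Kafka
---
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaNodePool
"""

# yaml.safe_load() expects exactly one document and raises
# ComposerError on the second "---", as seen in the traceback above.
try:
    yaml.safe_load(multi_doc)
except yaml.composer.ComposerError as err:
    print("safe_load failed:", err.problem)

# yaml.safe_load_all() returns a generator over all documents
# in the stream, so multi-document templates load cleanly.
docs = list(yaml.safe_load_all(multi_doc))
print([d["kind"] for d in docs])
```

If `load_yaml` were switched to `safe_load_all`, its callers would need to handle a list of documents rather than a single dict, so the fix likely touches `setup_amq_kafka_persistent` as well.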