Observed behavior:
The user contributions model entity for oppiamigrationbot has tens of thousands of entries. Because of this, the exploration migration job sometimes throws an error (BadRequestError: Too many indexed properties), though the errors seem to resolve themselves after a while.
Fix:
To fix this, the exploration migration job should not update the user contributions model when the committer is the Oppia migration bot.
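A minimal sketch of the proposed guard, as standalone Python. The constant name and helper are hypothetical, not the actual Oppia identifiers; the idea is simply to skip the user-contributions write when the committer is the migration bot, so its oversized entity is never re-indexed during a migration commit.

```python
# Assumed constant name for illustration; the real bot id lives in the
# Oppia codebase (e.g. in feconf) and may differ.
MIGRATION_BOT_USER_ID = 'OppiaMigrationBot'


def should_update_user_contributions(committer_id):
    """Hypothetical helper: return False for the migration bot so that
    update_exploration() skips add_edited_exploration_id() for it,
    avoiding the 'Too many indexed properties' put() failure."""
    return committer_id != MIGRATION_BOT_USER_ID
```

The call site in update_exploration() would then wrap the existing user_services.add_edited_exploration_id(committer_id, exploration.id) call in this check.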
Error log:
Traceback (most recent call last): (/<path>/third_party/gae-mapreduce-1.9.22.0/mapreduce/handlers.py:517)
File "/<path>/third_party/gae-mapreduce-1.9.22.0/mapreduce/handlers.py", line 503, in handle
tstate.input_reader, shard_state, tstate, ctx)
File "/<path>/third_party/gae-mapreduce-1.9.22.0/mapreduce/handlers.py", line 593, in _process_inputs
entity, input_reader, ctx, tstate):
File "/<path>/third_party/gae-mapreduce-1.9.22.0/mapreduce/handlers.py", line 635, in _process_datum
for output in result:
File "/<path>/core/domain/exp_jobs_one_off.py", line 262, in map
feconf.CURRENT_STATE_SCHEMA_VERSION))
File "/<path>/core/domain/exp_services.py", line 1201, in update_exploration
user_services.add_edited_exploration_id(committer_id, exploration.id)
File "/<path>/core/domain/user_services.py", line 1401, in add_edited_exploration_id
_save_user_contributions(user_contributions)
File "/<path>/core/domain/user_services.py", line 1415, in _save_user_contributions
edited_exploration_ids=user_contributions.edited_exploration_ids,
File "/<path>/python27/python27_lib/versions/1/google/appengine/ext/ndb/model.py", line 3458, in _put
return self._put_async(**ctx_options).get_result()
File "/<path>/python27/python27_lib/versions/1/google/appengine/ext/ndb/tasklets.py", line 383, in get_result
self.check_success()
File "/<path>/python27/python27_lib/versions/1/google/appengine/ext/ndb/tasklets.py", line 427, in _help_tasklet_along
value = gen.throw(exc.__class__, exc, tb)
File "/<path>/python27/python27_lib/versions/1/google/appengine/ext/ndb/context.py", line 824, in put
key = yield self._put_batcher.add(entity, options)
File "/<path>/python27/python27_lib/versions/1/google/appengine/ext/ndb/tasklets.py", line 427, in _help_tasklet_along
value = gen.throw(exc.__class__, exc, tb)
File "/<path>/python27/python27_lib/versions/1/google/appengine/ext/ndb/context.py", line 358, in _put_tasklet
keys = yield self._conn.async_put(options, datastore_entities)
File "/<path>/python27/python27_lib/versions/1/google/appengine/ext/ndb/tasklets.py", line 513, in _on_rpc_completion
result = rpc.get_result()
File "/<path>/python27/python27_lib/versions/1/google/appengine/api/apiproxy_stub_map.py", line 615, in get_result
return self.__get_result_hook(self)
File "/<path>/python27/python27_lib/versions/1/google/appengine/datastore/datastore_rpc.py", line 1887, in __put_hook
self.check_rpc_success(rpc)
File "/<path>/python27/python27_lib/versions/1/google/appengine/datastore/datastore_rpc.py", line 1379, in check_rpc_success
raise _ToDatastoreError(err)
BadRequestError: Too many indexed properties
Additional thoughts:
For cleanliness, we should also clear the existing edited_exploration_ids entries for the bot, and add a check to the prod validation job so that this entity is treated specially.
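The cleanup step above could look roughly like this. The class and function names here are illustrative stand-ins, not the real Oppia models; the point is just to empty the bot's edited-ids list while leaving every other user's contributions untouched.

```python
# Assumed bot id for illustration only.
MIGRATION_BOT_USER_ID = 'OppiaMigrationBot'


class UserContributions(object):
    """Stand-in for the real user contributions domain object."""

    def __init__(self, user_id, edited_exploration_ids):
        self.user_id = user_id
        self.edited_exploration_ids = edited_exploration_ids


def clear_bot_contributions(user_contributions):
    """Hypothetical one-off cleanup: drop the migration bot's
    accumulated edited_exploration_ids so the entity stays small."""
    if user_contributions.user_id == MIGRATION_BOT_USER_ID:
        user_contributions.edited_exploration_ids = []
    return user_contributions
```

The prod validation job would correspondingly special-case this entity, e.g. by asserting the bot's edited_exploration_ids list stays empty instead of cross-checking it against commit logs.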