[ML] Add defaults for model ID for Kibana connector #120834
Open
Description
The inference Kibana connector currently generates its UI inputs dynamically by calling the `GET _inference/_services` API. When possible, the API returns default values to populate the input fields. Currently the model ID input field is not populated with a default for any service. For some services (e.g. OpenAI), where the "default" model changes frequently, this is a reasonable experience. For other services (e.g. EIS / ElasticsearchInternalService) we can improve the user experience by pre-populating the field with a reasonable default model ID based on the task type. We'll need to align on two decisions before making this change (see the comments for some ideas on solutions):
- What should the structure of the data returned by the inference API be? (One possibility is sketched after this list.)
- How can we make these changes without breaking the feature in serverless?
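For discussion of the first question, here is a minimal sketch of one possible shape for a service entry returned by `GET _inference/_services`, with a per-task-type default model ID the connector UI could use to pre-fill the model ID field. The field names (`default_model_ids`, `default_value`, etc.) and the helper are assumptions for illustration, not the current API contract:

```ts
// Hypothetical shape of a single service entry from GET _inference/_services.
// Field names are assumptions for discussion, not the current API contract.
interface InferenceServiceConfigField {
  label?: string;
  description?: string;
  required?: boolean;
  sensitive?: boolean;
  default_value?: string | number | boolean;
}

interface InferenceServiceEntry {
  service: string; // e.g. 'elasticsearch', 'openai'
  task_types: string[]; // e.g. ['text_embedding', 'sparse_embedding', 'rerank']
  configurations: Record<string, InferenceServiceConfigField>;
  // One possible extension: per-task-type defaults so the UI can pre-fill model_id.
  default_model_ids?: Record<string, string>; // task_type -> model_id
}

// Sketch of how the connector form could resolve a default model ID for the
// selected task type, falling back to a generic field-level default if present.
function getDefaultModelId(
  entry: InferenceServiceEntry,
  taskType: string
): string | undefined {
  return (
    entry.default_model_ids?.[taskType] ??
    (entry.configurations['model_id']?.default_value as string | undefined)
  );
}
```

A per-task-type map like this would keep the response change additive, which might also make it easier to roll out without breaking the feature in serverless, though that is exactly what the second question needs to confirm.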