Feature Request: Model Filtering and Provider Display in Assistant Chat #25859
pejas
started this conversation in
LLMs and Zed Assistant
Hello Zed Team,
I'd like to propose two feature enhancements for the assistant chat model selection:
Issue: Currently, the model dropdown in the assistant chat lists all available models from GitHub Copilot, including those in preview. Many organizations disable preview features due to data-training concerns, so the result is an unnecessarily long list of models, most of which are inaccessible.
Request:
Is it possible to filter the model list to only display models that are actively enabled within the user's GitHub Copilot settings?
Alternatively, could we have a configuration option (e.g., in a config file) to specify a whitelist or blacklist of models we wish to see in the dropdown?
Rationale: This would streamline the model selection process, making it more efficient and relevant to individual user configurations.
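To illustrate the second option, a filter could live in Zed's `settings.json`. The sketch below is purely hypothetical: the `model_filter` key, its `mode` values, and the model names shown are not existing Zed settings, just one possible shape for such a configuration:

```json
{
  "assistant": {
    // Hypothetical setting: restrict which models appear in the
    // assistant's model dropdown. "mode" could be "whitelist"
    // (show only the listed models) or "blacklist" (hide them).
    "model_filter": {
      "mode": "whitelist",
      "models": ["gpt-4o", "claude-3.5-sonnet"]
    }
  }
}
```

A whitelist keyed by model ID would also cover the first request indirectly, since users whose organizations disable preview models could simply list the models they can actually use.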
Issue: The model dropdown at the bottom of the assistant window currently displays only the model name, which makes it difficult to quickly determine the model's provider (e.g., GitHub Copilot, OpenAI, local Ollama). For instance, distinguishing GPT-4o from Copilot versus GPT-4o from OpenAI requires opening the model details.
Request: Could the model dropdown be enhanced to include the provider name alongside the model name? For example, "GPT-4o (GitHub Copilot)" or "GPT-4o (OpenAI)".
Rationale: This would improve clarity and efficiency, enabling users to quickly identify and select the desired model based on its provider.
Thank you for considering these suggestions. I believe they would significantly enhance the user experience with Zed's assistant chat feature.