Weird behaviors from using OpenRouter's models #186

Closed
Koihin88 opened this issue Aug 24, 2024 · 5 comments

@Koihin88

Thank you for this amazing plugin!

I've encountered some strange behaviors while using OpenRouter's AI models. Here are the details:

This is my config file:

return {
  {
    "yetone/avante.nvim",
    event = "VeryLazy",
    build = "make",
    opts = {
      -- add any opts here
      provider = "openai",
      openai = {
        endpoint = "https://openrouter.ai/api",
        model = "openai/gpt-4o-2024-08-06",
        temperature = 0,
        max_tokens = 4096,
      },
    },
    dependencies = {
      "nvim-tree/nvim-web-devicons",
      "stevearc/dressing.nvim",
      "nvim-lua/plenary.nvim",
      "MunifTanjim/nui.nvim",
      --- The below is optional, make sure to set it up properly if you have lazy=true
      {
        "MeanderingProgrammer/render-markdown.nvim",
        opts = {
          file_types = { "markdown", "Avante" },
        },
        ft = { "markdown", "Avante" },
      },
    },
  },
}
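
(For reference, OpenRouter documents its OpenAI-compatible base URL with a trailing /v1, i.e. https://openrouter.ai/api/v1, and the openai provider normally expects the API key in the OPENAI_API_KEY environment variable. Below is a minimal sketch of that variant; it is only an untested variation on the config above, not a confirmed fix.)

-- Sketch only (untested, not a confirmed fix): the same spec, but pointing
-- at OpenRouter's documented OpenAI-compatible base URL (note the /v1).
-- Assumes the OpenRouter key is exported as OPENAI_API_KEY before starting Neovim.
return {
  {
    "yetone/avante.nvim",
    event = "VeryLazy",
    build = "make",
    opts = {
      provider = "openai",
      openai = {
        endpoint = "https://openrouter.ai/api/v1",
        model = "openai/gpt-4o-2024-08-06",
        temperature = 0,
        max_tokens = 4096,
      },
    },
  },
}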

  1. Issue with openai/gpt-4o-2024-08-06 and meta-llama/llama-3.1-405b:
    • The sidebar toggles off automatically after generating a few lines.
    • When toggling the sidebar back on, a duplicate of the conversation appears.
    • This occurs despite the model API being called only once.
[
   {
      "provider":"openai",
      "timestamp":"2024-08-24 13:39:33",
      "model":"openai/gpt-4o-2024-08-06",
      "response":"Replace lines: 23-37\n```python\n            # ImageMagick command to convert APNG to GIF\n            cmd = [\n                \"convert\",\n                input_path,\n                \"-coalesce\",\n                \"-layers\",\n                \"optimize\",\n                output_path,\n            ]\n\n            # Run the ImageMagick command\n            subprocess.run(\n                cmd, check=True, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL\n            )\n```",
      "request":"use imagemagick instead"
   },
   {
      "provider":"openai",
      "timestamp":"2024-08-24 13:39:33",
      "model":"openai/gpt-4o-2024-08-06",
      "response":"Replace lines: 23-37\n```python\n            # ImageMagick command to convert APNG to GIF\n            cmd = [\n                \"convert\",\n                input_path,\n                \"-coalesce\",\n                \"-layers\",\n                \"optimize\",\n                output_path,\n            ]\n\n            # Run the ImageMagick command\n            subprocess.run(\n                cmd, check=True, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL\n            )\n```",
      "request":"use imagemagick instead"
   }
]
(Screenshot: 2024-08-24 14:01:10)
  2. Issue with anthropic/claude-3.5-sonnet:
    • Unable to use <Tab> to continue the conversation after generation is complete.
    • Toggling the sidebar off and back on makes the chat messages disappear.
    • The .avante_chat_history folder is not created.
(Screenshot: 2024-08-24 14:16:24)
@yetone (Owner) commented Aug 24, 2024

(Screenshot of the completion line referenced below)

When this line appears, it indicates that content generation is complete. If it does not appear, the content is still being generated; while that is the case, the input will disappear and no history record will be created.
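
(As a rough illustration of the behavior described above, not the plugin's actual code: the sidebar effectively waits for a completion marker in the model output before treating the turn as finished and persisting history. The marker string below is taken from the response text quoted later in this thread.)

-- Illustrative sketch only, not avante.nvim's real implementation:
-- streamed output is scanned for a completion marker, and the chat
-- history is only persisted once that marker has been seen.
local COMPLETE_MARKER = "**Generation complete!**"

local function is_generation_complete(response_text)
  -- plain find (fourth argument = true), so the asterisks are literal
  return response_text:find(COMPLETE_MARKER, 1, true) ~= nil
end

-- A response that never emits the marker keeps the UI in the
-- "still generating" state, so no history record gets written.
assert(is_generation_complete("🎉🎉🎉 **Generation complete!** Please review the code suggestions above."))
assert(not is_generation_complete("Replace lines: 23-37 ..."))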

@yetone (Owner) commented Aug 24, 2024

The sidebar toggles off automatically after generating a few lines.

Pressing the q key (or another close mapping) will close the sidebar. Did you press one of these keys?

@Koihin88 (Author)

When this line appears, it indicates that the content generation is complete. If this line does not appear, it means the content is still being generated.

I suspect the models from OpenRouter don't work very well with this plugin's system prompt, because their responses don't end with 🎉🎉🎉 **Generation complete!** Please review the code suggestions above. And when I use anthropic/claude-3.5-sonnet, it finishes generating a full response but then just freezes. (The freezing problem only happens with anthropic/claude-3.5-sonnet, though.)

Pressing the q key (or another close mapping) will close the sidebar. Did you press one of these keys?

No, I did not touch anything.

@yetone (Owner) commented Aug 24, 2024

This issue has been fixed in the latest version, you can give it a try.

@Koihin88 (Author)

🧎‍♂️You are shipping so fast, what a legend! Everything is on point except the anthropic/claude-3.5-sonnet model: it does not follow your prompt very strictly, and even though it finishes generating its response, the .avante_chat_history folder still is not created. I think I will just stick with other models for now. Thank you!
