Issue with sending message #80

Closed
opened 2026-02-13 17:27:47 -06:00 by mirrors · 12 comments
Owner

Originally created by @1ost on GitHub (Aug 25, 2025).

Hello.
For the past four days, I keep getting responses along the lines of "I can't help you with this, because I'm just a language model and I don't have the ability to understand and respond." when sending messages via this reversed API.

I have tried the same prompt many times in the web app and never encountered this issue, while with the reversed API it happens very frequently.
I have also tried with and without proxies, with the same result.

Sometimes it works and sometimes it doesn't, so it's really strange.

If anybody has any clue what could be wrong, please let me know. Thanks!

mirrors 2026-02-13 17:27:47 -06:00
  • closed this issue
  • added the
    invalid
    label

@Ginoooo commented on GitHub (Aug 26, 2025):

I also encountered this problem. It appeared when using the reverse API, but when I opened the same chat on the web page and typed "try it", it was able to output the answer.


@1ost commented on GitHub (Aug 27, 2025):

I have dug through it and I think it's one of these:

  1. The payload being sent might need some changes. I stripped it down to just support for Gems and normal chat sessions, and it seems to work.
  2. Gemini 2.5 Pro has a different model ID compared to the ones defined in the const Python file: it changed from "2525e3954d185b3c" to "4af6c7f5da75d65d".

I'm more inclined to think the second option is causing the issues, given that it's trying to use a different version of Gemini 2.5 Pro.

If someone could check exactly what could be wrong, that would be great. Thanks!
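To illustrate the second point, here is a minimal Python sketch of swapping in the new model ID. The dict name and helper are illustrative assumptions, not the project's actual const definitions; the two IDs are the ones quoted in this comment:

```python
# Sketch only: the const file's real structure may differ.
MODEL_IDS = {
    "gemini-2.5-pro": "2525e3954d185b3c",  # old ID shipped in the const file
}

def patch_pro_model_id(model_ids: dict) -> dict:
    """Return a copy of the mapping with the 2.5 Pro ID replaced by
    the one extracted from the current web app (per this issue)."""
    patched = dict(model_ids)
    patched["gemini-2.5-pro"] = "4af6c7f5da75d65d"  # new ID from the web app
    return patched
```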


@1ost commented on GitHub (Aug 27, 2025):

After more examination, I think the issue is the model ID assigned to Gemini 2.5 Pro. After changing it to the new ID extracted from the Gemini web app, it has been much more reliable. I ran a test using a Gem that previously produced very frequent errors, and it processed 9 messages without any issue whatsoever.


@1ost commented on GitHub (Aug 27, 2025):

Update: it stopped working yet again. Only 2.5 Flash is still working fine.
It looks like they've made some changes, especially to 2.5 Pro.


@HanaokaYuzu commented on GitHub (Aug 27, 2025):

Could you provide the prompt or code snippet that was not working? I tried several questions with both 2.5 Flash and 2.5 Pro, and they both worked. If you are asking about explicit content, the more likely reason is that Google tightened their moderation policy recently.


@1ost commented on GitHub (Aug 28, 2025):

Hello.
Thanks for answering.

I am not sending anything explicit; it's just a Gem that's instructed to generate Schema.org product schemas for given products (specified in a JSON list).

Here's the link to a conversation where it failed:
https://g.co/gemini/share/5b054fe95dfd

To add: if I retry the same prompt in the same Gem session using the web app, it works just fine.


@1ost commented on GitHub (Aug 28, 2025):

Here's another example with a simpler prompt from another script.

https://g.co/gemini/share/2d15e8598dab

This one is relatively simpler. Maybe it just doesn't accept having the response format enforced, given that I force it to respond with a JSON list?

Again, while running it, sometimes it works and sometimes it doesn't. When I try the same prompt in the web app manually, it works fine.
It had been working perfectly last week and the week before. It just stopped working around last Friday.
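Since the failure is intermittent and the web app accepts the same prompt, a common workaround is to detect the spurious refusal and retry. A minimal Python sketch (the `send` callable stands in for whatever client call you use; names here are illustrative, not the library's API):

```python
import time

REFUSAL_MARKER = "I can't help you"

def send_with_retry(send, prompt, max_attempts=3, delay=2.0):
    """Call `send(prompt)` and retry when the reply looks like the
    spurious refusal described in this issue. Returns the last reply
    if every attempt is refused."""
    reply = None
    for _ in range(max_attempts):
        reply = send(prompt)
        if REFUSAL_MARKER not in reply:
            return reply
        time.sleep(delay)  # brief pause before retrying
    return reply
```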


@lesongbang commented on GitHub (Aug 28, 2025):

I also ran into a similar problem when asking Gemini to generate images.


@HanaokaYuzu commented on GitHub (Sep 2, 2025):

Did you get this response only when using a Gem? By the way, I just updated the model headers in v1.15.1; please update to the latest version and see if the problem persists.


@infstellar commented on GitHub (Sep 4, 2025):

I encountered a similar problem, but it got better the next day. I feel that network factors might be one of the causes.


@HanaokaYuzu commented on GitHub (Sep 4, 2025):

> Here's another example with a simpler prompt from another script.
>
> https://g.co/gemini/share/2d15e8598dab

You are right. It looks like 2.5 Pro randomly throws the "I can't help you" response on this prompt. Unfortunately, there doesn't seem to be a specific pattern to when it happens. I'll leave this issue open to see if there are any updates.


@1ost commented on GitHub (Sep 5, 2025):

Hey, sorry for the late reply, but it looks like it's working now.
I still get this occasionally, but for the most part it just works.

It's most likely something on their end, because some days it works perfectly with no errors whatsoever, and some days it throws that error in some chats.

Reference
mirrors/Gemini-API#80