Telegram cli bot with pyrogram limits - python-3.x

I'm using the Pyrogram library to develop a client bot. I set it up so that the bot connects to 15 phone numbers and sends messages in order, each through its own proxy.
These 15 accounts are all members of the same 300 groups. The 15 accounts are activated in turn, and the text they have to send goes out to these 300 groups, following this process:
the first account is responsible for sending messages to the first 20 groups, the second account for the next 20 groups, and so on.
My question is: does Telegram restrict libraries like Pyrogram or Telethon, and can the accounts get banned?
Is there a way to prevent this from happening?
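For reference, the assignment described above (each of the 15 accounts handles a consecutive block of 20 groups) is just list slicing. This is an illustrative sketch, not Pyrogram API; the account labels and group IDs are placeholders:

```python
# Illustrative only: split 300 group IDs across 15 accounts, 20 groups each.
def assign_groups(group_ids, accounts):
    per_account = len(group_ids) // len(accounts)
    return {
        account: group_ids[i * per_account:(i + 1) * per_account]
        for i, account in enumerate(accounts)
    }

accounts = [f"account_{n}" for n in range(15)]  # hypothetical account labels
groups = list(range(300))                       # hypothetical group IDs
plan = assign_groups(groups, accounts)          # account_0 -> groups 0..19, etc.
```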

Yes, you will get a FloodWait error if you try to send too many messages from a single account.
The Pyrogram FAQ explains how to avoid flood waits: https://docs.pyrogram.org/faq/how-to-avoid-flood-waits#how-to-avoid-flood-waits
To reduce the chance of a flood wait, add a delay between sends:
await asyncio.sleep(3)
Don't forget to import the asyncio library.
Don't use time.sleep(), because it blocks the event loop; use asyncio.sleep() instead.
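A minimal sketch of the delay-and-retry pattern. `FloodWait` here is a local stand-in for `pyrogram.errors.FloodWait` (whose requested wait time is `e.value` in Pyrogram v2, `e.x` in v1), and `send` stands for whatever coroutine actually calls `app.send_message`:

```python
import asyncio

class FloodWait(Exception):
    """Stand-in for pyrogram.errors.FloodWait; `value` is the wait in seconds."""
    def __init__(self, value):
        self.value = value

async def send_all(send, chat_ids, text, delay=3):
    """Send `text` to each chat, sleeping `delay` seconds between sends
    and honouring any FloodWait the server raises."""
    for chat_id in chat_ids:
        while True:
            try:
                await send(chat_id, text)
                break
            except FloodWait as e:
                await asyncio.sleep(e.value)  # wait as long as Telegram asks
        await asyncio.sleep(delay)            # pacing between normal sends
```

With real Pyrogram you would import FloodWait from pyrogram.errors and pass your client's send_message in place of the stand-ins.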

Related

Process long arrays in Cloud Functions

I'm developing an application that lets users broadcast videos. As many social networks do, users need to receive a notification when someone goes live. To do so I'm using Cloud Functions: I pass the function the array of users that must receive the notification; for every user in the array I need to fetch the FCM token from the server and then send the notification.
For arrays of 10-20 users the function doesn't take long, but for 150-300 users I sometimes get a timeout or very slow execution.
So my question is: is it possible to divide the array into groups of 20-30 users and process the groups at the same time?
Thanks
There are two ways to answer this.
From a development point of view, some languages make concurrent processing easier (Go is very handy for this). Because most of the time is spent in API calls (FCM), a first solution is to perform several calls concurrently.
From an architecture point of view, Pub/Sub and/or Cloud Tasks are well designed for this:
Your first function only creates chunks of messages to send and posts them to Cloud Tasks or Pub/Sub.
Your second function receives the chunks and sends the messages. The chunks are processed in parallel across several function instances.
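A sketch of the fan-out side of that architecture in Python (the answer mentions Go, but the idea is language-agnostic). `publish_chunk` is a placeholder for whatever posts to Pub/Sub or Cloud Tasks:

```python
def chunk(items, size):
    """Split a list into consecutive chunks of at most `size` elements."""
    return [items[i:i + size] for i in range(0, len(items), size)]

def fan_out(user_ids, publish_chunk, size=25):
    """First function: create chunks and hand each one to the queue.
    `publish_chunk` stands in for a Pub/Sub or Cloud Tasks publish call."""
    for batch in chunk(user_ids, size):
        publish_chunk(batch)
```

The second function then only ever sees one small batch at a time, so each invocation stays well under the timeout.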

How to get message info by ID [Telegram API]

I'm writing a bot for Telegram to gather some stats from a group chat. I need to get info about every message (from the beginning of the chat). I know one way to do it, but it's a quite bad idea: I can use the forwardMessage method, but I need a second account for it, and I get timed out (for one hour) when I send messages too fast, so it's a very long way to collect stats for a conversation that has over 2 million messages. I tried setting a limit of 10 messages per second but I still get timed out, so I don't know how it works.
There must be another way to get JUST the message info by ID, without forwarding it. I can't find it in the API.
There is no API for this at this time; you can suggest the idea to #BotSupport. Until they add this feature, I'm doing the same thing you are.
According to the Bot FAQ, the Telegram API rate limit is 1 message per second per chat, and the global limit is 30 messages per second.
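Those limits (roughly 1 message/second per chat, 30/second globally) can be enforced client-side with a small limiter. This is an illustrative sketch, not part of any Telegram library; the clock is injectable so the logic is easy to test:

```python
import time
from collections import deque

class TelegramRateLimiter:
    """Naive limiter for the Bot API limits quoted above:
    1 message/second per chat and 30 messages/second overall."""
    def __init__(self, per_chat=1.0, global_per_sec=30, now=time.monotonic):
        self.per_chat = per_chat
        self.global_per_sec = global_per_sec
        self.now = now
        self.last_sent = {}   # chat_id -> timestamp of last send
        self.recent = deque() # timestamps of sends in the last second

    def wait_time(self, chat_id):
        """Seconds to wait before it is safe to send to chat_id."""
        t = self.now()
        while self.recent and t - self.recent[0] >= 1.0:
            self.recent.popleft()
        delay = 0.0
        if chat_id in self.last_sent:
            delay = max(delay, self.per_chat - (t - self.last_sent[chat_id]))
        if len(self.recent) >= self.global_per_sec:
            delay = max(delay, 1.0 - (t - self.recent[0]))
        return max(delay, 0.0)

    def record_send(self, chat_id):
        """Call after each successful send."""
        t = self.now()
        self.last_sent[chat_id] = t
        self.recent.append(t)
```

In a real bot you would sleep for `wait_time(chat_id)` before each send and call `record_send` afterwards.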
There is no way to do this with the Telegram Bot API, but you can use the ReadHistory method of MadelineProto, without needing the forwardMessage method.

Kik bot is triggering "TooManyRequests" error in BotFramework

I recently deployed a bot using Azure and BotFramework to Skype, Slack, Telegram and some other platforms.
They all seem to work fine, except in Kik, where the bot will suddenly stop responding. The error message in BotFramework reads:
{"message":"Too many requests for user: 'redacted_user_name'","error":"TooManyRequests"}
The Kik tester is triggering this error through regular use, though when I test it on my (Android) phone, it works just fine.
Any idea what might be causing this?
EDIT:
After contacting Kik, I was told that my bot was sending more messages than it was receiving, and they only allow a surplus of 20 before a bot is banned.
They say the solution is to implement batching, which BotBuilder says is built in. (My bot uses session.send("text") followed by a prompt.) However, Kik does not see my messages as a batch, and every couplet counts as 2 messages.
I tried adjusting autoBatchDelay to see whether 0 would work better than the default, and it made no difference. Changing it to 2000 also made no difference, and did not delay 2000 ms between messages.
var bot = new builder.UniversalBot(connector, {autoBatchDelay: 0});
Is it possible my bot is not batching properly? What steps could I take to address this?
Batching for Kik is currently on our backlog. In the meantime, is there any reason you can't send your text and prompt in the same message (with carriage returns in between if needed)? That should resolve your issue (as I understand it).
Also worth noting that the Kik rules for recovering from a throttling deficit are somewhat complex.
• In any given send message request, a bot can send up to 25 messages in a single POST request. Within the 25 messages, a bot is allowed to have up to 5 messages directed to a single user.
• Whether you send 1 message or 5 messages, that collection of requests is considered a “batch” of messages to a user.
• A bot is allowed 20 unsolicited batches to a user a day.
• This means you could be sending between 20 and 100 unsolicited messages to a user a day, depending on how many messages you have in each batch. The bot platform determines what counts as unsolicited with a debit/credit system that resets at the end of each day. E.g. Julie sends the bot a message, the balance becomes +1. The bot responds with 3 messages in one batch, the balance becomes 0. Julie sends the bot 1 message, the balance becomes +1. The bot responds with 5 messages in separate batches, the balance becomes -4. Julie sends the bot a message, the balance becomes -3. The bot responds with 5 messages in separate batches, the balance becomes -8.
• If this deficit reaches -20, the daily user rate limit has been reached, and the bot will NOT be able to send any more messages to that user. There are different ways to work within this rate limit, e.g. using batches more efficiently or building a UX that encourages more user interactivity.
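The debit/credit bookkeeping described above can be sketched as a tiny simulation. This is illustrative only, not Kik's actual implementation:

```python
def run_ledger(events):
    """Replay the debit/credit system: an incoming user message credits +1,
    and each outgoing batch debits 1, regardless of how many messages it
    holds. Per the rules above, the bot is cut off for the day once the
    balance reaches -20."""
    balance = 0
    for kind, *args in events:
        if kind == "user":
            balance += 1
        elif kind == "bot":
            balance -= args[0]  # number of separate batches sent
    return balance
```

So a bot that packs its replies into one batch per incoming message stays at balance 0, while one that sends each message as its own batch drains the allowance quickly.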

How to make multiple API calls with rate limits per user using RabbitMQ?

In my app I am getting data on behalf of different users via one API which has a rate limit of 1 API call every 2 seconds per user.
Currently I am storing all the calls I need to make in a single message queue. I am using RabbitMQ for this.
There is currently one consumer taking one message at a time, making the call, processing the result, and then starting on the next message.
The queue is filling up faster than this single consumer can make the API calls (1 call every 2 seconds, since I don't know which user comes next and I don't want to hit the API limits).
My problem is that I don't know how to add more consumers. In theory this should be possible, since the queue holds jobs for different users and the API rate limit is per user, so e.g. I could make 2 API calls every 2 seconds if they are for different users.
However, I have no information about the messages in the queue: they could be from a single user or from many different users.
The only solution I see right now is to create a separate queue for each user. But I have many users (say 1,000) and would rather stay with one queue.
If possible I would stick with RabbitMQ as I use this for other similar tasks as well. But if I need to change my stack I would be willing to do so.
App is using the MEAN stack.
You will need to maintain state somewhere. I had a similar application, and what I did was keep the state in Redis: before every call, check whether the user has made a request in the last 2 seconds, e.g. with a Redis key
user:<user_id> // value is the epoch timestamp
and update Redis once the request is made.
Reference:
Redis
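A sketch of that check. With a real Redis client you would typically use `SET key value NX EX 2` (redis-py: `r.set(key, ts, nx=True, ex=2)`, which returns True only if the key was absent); here a plain dict of epoch timestamps stands in for Redis so the logic is self-contained:

```python
import time

WINDOW = 2  # seconds required between calls per user

def may_call(store, user_id, now=None):
    """Return True and record the timestamp if this user hasn't made a
    call in the last WINDOW seconds; otherwise return False.
    `store` is a dict standing in for Redis keys like "user:<user_id>"."""
    now = time.time() if now is None else now
    key = f"user:{user_id}"
    last = store.get(key)
    if last is not None and now - last < WINDOW:
        return False  # too soon; leave the message in the queue / requeue it
    store[key] = now
    return True
```

Any number of consumers can share this check, so they can work the single queue in parallel while still respecting the per-user limit.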

GCM message to all users (without topics)

I have the following dilemma:
I need to send a heartbeat message every 5 minutes (or less) to all users of my app.
I thought about topic messaging, but the 1 million subscriber limit is not acceptable for my application.
So the only possibility left is sending out the message in batches of 1,000, which is really resource-intensive.
Now my question:
How can I make this process of batching and sending really efficient? Is there a good ready-made solution, preferably in node.js?
Thank you,
Sebastian
You may use XMPP instead of HTTP.
As Google says, it is less resource-intensive than HTTP:
"The asynchronous nature of XMPP allows you to send more messages with fewer resources."
Also, you can have 1000 simultaneous connections per app (sender ID):
"For each sender ID, GCM allows 1000 connections in parallel."
There is also a node-xmpp solution available for that.