RabbitMQ amqplib error "No channels left to allocate" - Node.js

After running RabbitMQ workers in a pub/sub pattern for some time, I am getting an error when creating a channel:
Error: No channels left to allocate

If you are using https://www.npmjs.com/package/amqplib, you can use a promise to share a single channel while publishing multiple messages.
In message-queue.js:
const q = 'tasks';
const open = require('amqplib').connect('amqp://localhost');
const channelPromise = open.then((conn) => conn.createChannel());

// Publisher
function publishMessage(message) {
  channelPromise
    .then((ch) => ch.assertQueue(q)
      .then((ok) => ch.sendToQueue(q, Buffer.from(message))))
    .catch(console.warn);
}

// Consumer
open.then((conn) => conn.createChannel())
  .then((ch) => ch.assertQueue(q).then((ok) => ch.consume(q, (msg) => {
    if (msg !== null) {
      console.log(msg.content.toString());
      ch.ack(msg);
    }
  })))
  .catch(console.warn);

module.exports = {
  publishMessage,
};
In some-where.js:
const messageQueue = require('./message-queue');

messageQueue.publishMessage('hello world');

The maximum number of channels you can allocate is negotiated with the RabbitMQ server; on my server this value is 2047. You can debug your amqplib/lib/connection.js to inspect the negotiated value.
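Conversely, if your code does create a channel per operation, make sure each channel is closed once it is done; otherwise channel numbers are never returned to the pool and the error above eventually appears. A minimal sketch of that alternative, reusing the open connection promise and q from the snippet above:
// Creates a short-lived channel per publish and closes it afterwards,
// so its channel number can be reallocated by the connection.
function publishOnce(message) {
  return open
    .then((conn) => conn.createChannel())
    .then((ch) =>
      ch.assertQueue(q)
        .then(() => ch.sendToQueue(q, Buffer.from(message)))
        .then(() => ch.close())
    )
    .catch(console.warn);
}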

Related

Implementing this Node.js blockchain on a network

I am looking at this blockchain from Fireship: https://github.com/fireship-io/node-blockchain/blob/main/index.ts
The blockchain itself is simple enough. There are many similar examples of blockchain implementations, but I don't really see any that are actually used in a network. I am trying to get some footing for implementing this, at least with two users to start.
The thing I'm confused about at the moment is what actually gets shared between users? Is it the chain itself? Just new transactions? When a user does Wallet.sendMoney(5, satoshi.publicKey), it would update the local wallet, but then what? I'm guessing the transaction is sent to others on the network, but then each copy of the blockchain adds/verifies independently. This seems problematic because some of the transactions could get lost (internet outage or whatever), which makes me wonder if the whole blockchain gets sent, yet this seems unwieldy.
The thing I'm confused about at the moment is what actually gets
shared between users?
You mean between nodes, not "users". All nodes should hold the same chain and transactions, so you have to implement a pub/sub system that listens for certain events and publishes both transactions and the chain. Using Redis, you can create a class for this; the code below is annotated with comments:
// Note that Redis stores strings.
const redis = require("redis");

// Two channels are used to transfer data.
const CHANNELS = {
  BLOCKCHAIN: "BLOCKCHAIN",
  TRANSACTION: "TRANSACTION",
};

// Unlike raw sockets, we do not need to know the addresses of other nodes.
// --------------------------- HERE IS THE BROADCASTING STATION ---------------------
// It allows multiple processes to communicate over channels.
class PubSub {
  constructor({ blockchain, transactionPool }) {
    // It is able to broadcast its chain and replace it with a valid chain.
    this.blockchain = blockchain;
    this.transactionPool = transactionPool;
    this.publisher = redis.createClient();
    this.subscriber = redis.createClient();
    this.subscribeToChannels();
    this.subscriber.on("message", (channel, message) =>
      this.handleMessage(channel, message)
    );
  }

  // We listen on all channels.
  subscribeToChannels() {
    Object.values(CHANNELS).forEach((channel) => {
      this.subscriber.subscribe(channel);
    });
  }

  publish({ channel, message }) {
    // We unsubscribe first so we don't send the message to ourselves,
    // then subscribe again to keep receiving messages.
    this.subscriber.unsubscribe(channel, () => {
      this.publisher.publish(channel, message, () => {
        this.subscriber.subscribe(channel);
      });
    });
  }

  // ------ THIS IS WHERE BROADCASTING IS DONE -------------
  handleMessage(channel, message) {
    const parsedMessage = JSON.parse(message);
    switch (channel) {
      case CHANNELS.BLOCKCHAIN:
        this.blockchain.replaceChain(parsedMessage, true, () => {
          // We need to clear the local transaction pool because we got a new chain.
          this.transactionPool.clearBlockchainTransactions({
            chain: parsedMessage,
          });
        });
        break;
      case CHANNELS.TRANSACTION:
        this.transactionPool.setTransaction(parsedMessage);
        break;
      default:
        return;
    }
  }

  broadcastChain() {
    this.publish({
      channel: CHANNELS.BLOCKCHAIN,
      message: JSON.stringify(this.blockchain.chain),
    });
  }

  broadcastTransaction(transaction) {
    this.publish({
      channel: CHANNELS.TRANSACTION,
      message: JSON.stringify(transaction),
    });
  }
}

module.exports = PubSub;
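A minimal usage sketch (the Blockchain and TransactionPool classes and their file paths are assumptions based on the methods the PubSub class calls above):
// Hypothetical modules exposing replaceChain/setTransaction/clearBlockchainTransactions.
const Blockchain = require("./blockchain");
const TransactionPool = require("./transaction-pool");
const PubSub = require("./pubsub");

const blockchain = new Blockchain();
const transactionPool = new TransactionPool();
const pubsub = new PubSub({ blockchain, transactionPool });

// After mining a block locally, broadcast the updated chain to all nodes.
pubsub.broadcastChain();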

Implement a PubSub debounce mechanism

I have different publishers publishing to a PubSub topic. Each message has a specific key. I would like to create subscribers that only pick up the latest message for each specific key within a defined interval. In other words, I would like to have some kind of debounce implemented for my subscribers.
Example (with a debounce of 2 seconds):
-(x)-(y)-(x)-------(z)-(z)---(x)----------------->          [Topic with messages]
  |-------|---------------| execute for x                   [Subscriber]
               2 seconds
      |---------------| execute for y                       [Subscriber]
           2 seconds
                    |---|---------------| execute for z     [Subscriber]
                             2 seconds
                              |---------------| execute for x [Subscriber]
                                   2 seconds
Ordered Execution Summary:
execute for message with key: y
execute for message with key: x
execute for message with key: z
execute for message with key: x
Implementation
// index.ts
import * as pubsub from '@google-cloud/pubsub';
import * as functions from 'firebase-functions';
import AbortController from 'node-abort-controller';

exports.Debouncer = functions
  .runWith({
    // runtimeOptions
  })
  .region('REGION')
  .pubsub.topic('TOPIC_NAME')
  .onPublish(async (message, context) => {
    const key = message.json.key;

    // When an equivalent message is received, cancel this calculation:
    const aborter = await abortHelper<any>(
      'TOPIC_NAME',
      (message) => message?.key === key
    ).catch((error) => {
      console.error('Failed to init abort helper', error);
      throw new Error('Failed to init abort helper');
    });

    await new Promise((resolve) => setTimeout(resolve, 2000));

    // Here, run the EXECUTION for the key, unless an abort signal
    // from the abortHelper was received:
    // if (!aborter.abortController.signal.aborted) ...
    aborter.teardown();

    /**
     * Subscribe to the first subscription found for the specified topic. Once a
     * message gets received that matches `messageMatcher`, the returned
     * AbortController reflects the aborted state. Calling the returned teardown
     * will cancel the subscription.
     */
    async function abortHelper<TMessage>(
      topicName: string,
      messageMatcher: (message: TMessage) => boolean = () => true
    ) {
      const abortController = new AbortController();
      const pubSubClient = new pubsub.PubSub();
      const topic = pubSubClient.topic(topicName);

      const subscription = await topic
        .getSubscriptions()
        .then((subscriptionsResponse) => {
          // TODO use better approach to find or provide subscription
          const subscription = subscriptionsResponse?.[0]?.[0];
          if (!subscription) {
            throw new Error('no found subscription');
          }
          return subscription;
        });

      const listener = (message: TMessage) => {
        const matching = messageMatcher(message);
        if (matching) {
          abortController.abort();
          unsubscribeFromPubSubTopicSubscription();
        }
      };
      subscription.addListener('message', listener);

      return {
        teardown: () => {
          unsubscribeFromPubSubTopicSubscription();
        },
        abortController,
      };

      function unsubscribeFromPubSubTopicSubscription() {
        subscription.removeListener('message', listener);
      }
    }
  });
The initial idea was to register a cloud function on the topic. This cloud function itself then subscribes to the topic as well and waits for the defined interval. If it picks up a message with the same key during the interval, it exits; otherwise, it runs the execution.
Running inside the firebase-emulator, this worked fine. In production, however, random and hard-to-debug issues occurred, most likely due to parallel execution of the functions.
What would be the best approach to implement such a system in a scalable way? (It does not necessarily have to be with PubSub.)

How can we load all messages from a single discord channel?

I'm currently working on a self-bot that fetches all images from a channel and then downloads them. When I use my self-bot, it doesn't fetch messages that aren't loaded by the client, and we can't load all of the messages simultaneously. Is there a way to do that? Something like a command to load all messages from a channel and then do multiple .fetchMessages() calls to get them all?
Self-Bots might be against the ToS, but iterating through messages in a channel is not, as far as I know. So...
Here's a snippet that will fetch all messages, using the new JS async generators functionality for efficiency.
The snippet:
async function * messagesIterator (channel) {
  let before = null
  let done = false
  while (!done) {
    const messages = await channel.messages.fetch({ limit: 100, before })
    if (messages.size > 0) {
      before = messages.lastKey()
      yield messages
    } else done = true
  }
}

async function * loadAllMessages (channel) {
  for await (const messages of messagesIterator(channel)) {
    for (const message of messages.values()) yield message
  }
}
How it's used:
client.on('ready', async () => {
  const targetChannel = client.guilds.cache.first().channels.cache.find(x => x.name === 'test')
  // Iterate through all the messages as they're pulled
  for await (const message of loadAllMessages(targetChannel)) {
    console.log(message.content)
  }
})
We can't, since it's against the ToS. :/ (even if it's a bot, I think)

websocket interrupted while angular2 project is loading on firefox

I've just started Angular 2. I've built the Angular 2 sample as given in https://angular.io/guide/quickstart.
When I run the project in Firefox using the
npm start
command in the terminal, the connection gets disconnected after the output shows once, with an error like:
The connection to ws://localhost:3000/browser-sync/socket.io/?EIO=3&transport=websocket&sid=6YFGHWy7oD7T7qioAAAA was interrupted while the page was loading
Any idea how to fix this issue?
I don't know how you manage your web socket, but you could consider using the following code. The idea is to wrap the web socket into an observable.
For this you could use a service like the one below. The initializeWebSocket method creates a shared (hot) observable wrapping a WebSocket object.
export class WebSocketService {
  initializeWebSocket(url) {
    this.wsObservable = Observable.create((observer) => {
      this.ws = new WebSocket(url);

      this.ws.onopen = (e) => {
        (...)
      };

      this.ws.onclose = (e) => {
        if (e.wasClean) {
          observer.complete();
        } else {
          observer.error(e);
        }
      };

      this.ws.onerror = (e) => {
        observer.error(e);
      };

      this.ws.onmessage = (e) => {
        observer.next(JSON.parse(e.data));
      };

      return () => {
        this.ws.close();
      };
    }).share();
  }
}
You could add a sendData method to send data over the web socket:
export class WebSocketService {
  (...)

  sendData(message) {
    this.ws.send(JSON.stringify(message));
  }
}
The last point is to make things a bit more robust, i.e. filter received messages based on a criterion and retry when there is a disconnection. For this, you need to wrap the initial websocket observable into another one. This way you can support retries when the connection is lost and integrate filtering on criteria like the client identifier (in the sample, the received data is JSON and contains a sender attribute).
export class WebSocketService {
  (...)

  createClientObservable(clientId) {
    return Observable.create((observer) => {
      let subscription = this.wsObservable
        .filter((data) => data.sender !== clientId)
        .subscribe(observer);

      return () => {
        subscription.unsubscribe();
      };
    }).retryWhen((errors) => {
      return Observable.timer(3000);
    });
  }
}
You can see that disconnections are handled in this code using the retryWhen operator on the observable.
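For completeness, a minimal usage sketch (the URL, the clientId value, and where you wire the service in are assumptions):
const wsService = new WebSocketService();
wsService.initializeWebSocket('ws://localhost:3000');

// Receive everything not sent by this client.
wsService.createClientObservable('client-42').subscribe(
  (data) => console.log('received', data),
  (err) => console.error('web socket error', err)
);

// Send a message; the sender attribute lets other clients filter it out.
wsService.sendData({ sender: 'client-42', payload: 'hello' });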

How to work around amqplib's Channel#consume odd signature?

I am writing a worker that uses amqplib's Channel#consume method. I want this worker to wait for jobs and process them as soon as they appear in the queue.
I wrote my own module to abstract away amqplib; here are the relevant functions for getting a connection, setting up the queue, and consuming a message:
// Assumed imports for this module (Promise.method and .tap are Bluebird).
const amqp = require('amqplib');
const Promise = require('bluebird');

let connection;

const getConnection = function(host) {
  return amqp.connect(host);
};

const createChannel = function(conn) {
  connection = conn;
  return conn.createConfirmChannel();
};

const assertQueue = function(channel, queue) {
  return channel.assertQueue(queue);
};

const consume = Promise.method(function(channel, queue, processor) {
  processor = processor || function(msg) { if (msg) Promise.resolve(msg); };
  return channel.consume(queue, processor);
});

const setupQueue = Promise.method(function setupQueue(queue, host) {
  const amqp_host = 'amqp://' + ((host || process.env.AMQP_HOST) || 'localhost');
  return getConnection(amqp_host)
    .then(conn => createChannel(conn)) // -> returns a `Channel` object
    .tap(channel => assertQueue(channel, queue));
});

const consumeJob = Promise.method(function consumeJob(queue) {
  return setupQueue(queue)
    .then(channel => consume(channel, queue));
});
My problem is with Channel#consume's odd signature. From http://www.squaremobius.net/amqp.node/channel_api.html#channel_consume:
#consume(queue, function(msg) {...}, [options, [function(err, ok) {...}]])
The callback is not where the magic happens; the message's processing should actually go in the second argument, and that breaks the flow of promises.
This is how I planned on using it:
return queueManager.consumeJob(queue)
  .then(msg => {
    // do some processing
  });
But it doesn't work. If there are no messages in the queue, the promise is rejected, and then nothing happens when a message is dropped into the queue. If there is a message, only that one message is processed and then the worker stalls, because it has exited the "processor" function from the Channel#consume call.
How should I go about it? I want to keep the queueManager abstraction so my code is easier to reason about but I don't know how to do it... Any pointers?
As @idbehold said, promises can only be resolved once. If you want to process messages as they come in, there is no other way than to use this function. Channel#get will only check the queue once and then return; it wouldn't work for a scenario where you need a worker.
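In other words, keep the long-lived consumer callback-based and create a promise per message inside it, rather than one promise per consumer. A minimal sketch of that pattern (the queue name and handler are placeholders, not part of the original abstraction):
const amqp = require('amqplib');

async function startWorker(queue, handler) {
  const conn = await amqp.connect('amqp://localhost');
  const ch = await conn.createChannel();
  await ch.assertQueue(queue);
  // The callback fires once per message for as long as the worker lives;
  // each message gets its own promise chain.
  ch.consume(queue, (msg) => {
    if (msg === null) return; // consumer was cancelled by the server
    Promise.resolve(handler(msg.content))
      .then(() => ch.ack(msg))
      .catch(() => ch.nack(msg));
  });
}

startWorker('tasks', (content) => console.log(content.toString()));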
Just as an option: you can present your application as a stream of messages (or events). There is a library for this: http://highlandjs.org/#examples
Your code could look like this (it isn't a finished sample, but I hope it illustrates the idea):
const _ = require('highland')

let messageStream = _((push, next) => {
  consume(queue, (msg) => {
    push(null, msg)
  })
})

// now you can operate on your stream in functional style
messageStream
  .map((msg) => msg + 'some value')
  .each((msg) => { /* do something with msg */ })
This approach provides you with a lot of primitives for synchronization and transformation.
