Web3 how to keep connection to web socket - node.js

I'm trying to listen to Transfer events, but it only works for a couple of minutes and then the process terminates. I believe that's because of the blockchain node I use, but I'm not sure, and I can't find anything else that explains it.
How can I keep the connection alive and listen to Transfer events 24/7?
const web3 = new Web3(new Web3.providers.WebsocketProvider('wss://bsc-ws-node.nariox.org:443'))

const contract = new web3.eth.Contract(
  ABI,
  contracts[0]
)

contract.events
  .Transfer({
    fromBlock: 'latest',
    filter: { from: contracts[1] }
  })
  .on('data', async (event: EventData) => {
    const {
      transactionHash,
      returnValues: { value }
    } = event
    // ...
  })
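One thing that may help: web3.js 1.x lets you pass keepalive and automatic-reconnect options to the WebsocketProvider (available since roughly v1.2.7). A sketch follows; the specific values are assumptions you should tune for your node:

```javascript
// Sketch: web3 1.x WebsocketProvider options; values are illustrative, not tuned.
const providerOptions = {
  clientConfig: {
    keepalive: true,
    keepaliveInterval: 60000, // ping every 60s so idle connections stay open
  },
  reconnect: {
    auto: true,      // reconnect automatically when the socket closes
    delay: 5000,     // wait 5s between attempts
    maxAttempts: 10,
    onTimeout: false,
  },
};

// The provider would then be created like this (commented out here so the
// sketch stands alone):
// const web3 = new Web3(new Web3.providers.WebsocketProvider(
//   'wss://bsc-ws-node.nariox.org:443', providerOptions));
```

Even with these options, some public nodes drop idle sockets server-side, so you may also want to listen for the provider's `error` and `end` events and re-create the subscription there.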


Socket IO Client not receiving events on reconnection

I have a file called socket_io.js where I created a single instance of a socket io client in my react app as shown below:
socket_io.js
import EndPoints from './http/endpoints';
import io from "socket.io-client";

const socketUrl = EndPoints.SOCKET_BASE;
let socketOptions = { transports: ["websocket"] }

let socket;

if (!socket) {
  socket = io(socketUrl, socketOptions);
  socket.on('connect', () => {
    console.log(`Connected to Server`);
  })
  socket.on('disconnect', () => {
    console.log(`Disconnected from Server`);
  })
}

export default socket;
Then I imported the above singleton in many react components as shown below.
MessagePage.js
import socket from '../socket_io.js';

let messageHandler = (data) => {
}

useEffect(() => {
  socket.on('message', messageHandler); // This event no longer fires when the singleton socket io instance is reconnected
  return () => {
    socket.off('message');
  }
}, []);
This works well, but the issue I'm facing now is that when the singleton instance reconnects, the components referencing it no longer receive events from their respective handlers.
A possible cause of reconnection is when I manually restart the server.
How can this be resolved?
I just solved this after working on it for a project of my own. My method involves two parts: creating the socket in a useEffect hook and then managing it using useRef for reconnection situations.
In Summary:
I think there are two issues. One is that the socket is being initialized as a singleton and not using a hook/context. I've read other reports of strangeness in this case, so I suggest switching to using context and creating your socket in a hook. Secondly, we have to manually store reconnection logic (although by generating the socket properly, it seems as though the actual event listeners are kept through reconnect).
export const SocketContext = createContext();

export const SocketContextProvider = ({ children }) => {
  const [socket, setSocket] = useState();
  const reconnectEmits = useRef([]);

  // Here's your basic socket init. The reconnect listener lives inside the
  // same effect so it can reference the socket it belongs to; it replays the
  // stored emits (rejoining rooms, etc.) after a reconnect.
  useEffect(() => {
    const newSocket = io(url);
    setSocket(newSocket);

    newSocket.io.on('reconnect', (attempt) => {
      console.log("it took " + attempt + " tries to reconnect.");
      for (let action of reconnectEmits.current) {
        newSocket.emit(action.event, action.data);
      }
    });

    return () => {
      newSocket.close();
    };
  }, []);

  // Here I also define setListener and removeListener functions, which
  // determine which listeners the socket has. I don't have the code in front
  // of me now, but it's pretty simple:
  const setListener = (event, handler) => {
    // I use socket.off(event) right here to make sure I only have one
    // listener per event, but you may not want this. If you don't use it,
    // make sure your components use hooks to remove the event listeners they
    // add when they are removed from the DOM.
    socket.off(event);
    socket.on(event, handler);
  };

  // I implement an emit function here that's a wrapper, but I'm not sure if
  // it's necessary. You could just expose the socket itself in the context.
  return (
    <SocketContext.Provider value={{ emit, setListener, removeListener, addReconnectEmit, removeReconnectEmit }}>
      {children}
    </SocketContext.Provider>
  );
};
And then, alongside the emits to join rooms or conduct actions, I define the add and remove ReconnectEmit functions in the provider:
const addReconnectEmit = (event, data) => {
  reconnectEmits.current = [...reconnectEmits.current, { event, data }];
  console.log(reconnectEmits.current);
}

const removeReconnectEmit = (event, data) => {
  console.log('removing reconnect event');
  // Keep every entry except the one matching both the event and its data.
  reconnectEmits.current = reconnectEmits.current.filter(
    (e) => e.event !== event || e.data !== data
  );
  console.log(reconnectEmits.current);
};
With these, I can set it so that, after a reconnect, my socket knows to reconnect to a certain room, etc. See here:
const Chatroom = ({ convoId }) => {
  console.log("RENDERED: Chatroom");
  const { emit, addReconnectEmit, removeReconnectEmit } = useContext(SocketContext);

  useEffect(() => {
    emit('joinConvo', convoId);
    console.log("Emitting joinConvo message.");
    addReconnectEmit('joinConvo', convoId);
    return () => {
      emit('leaveConvo', convoId);
      // Remove the entry added above so we no longer rejoin this room on reconnect.
      removeReconnectEmit('joinConvo', convoId);
    }
  }, [convoId, emit, addReconnectEmit, removeReconnectEmit]);

  return (
    <div id="chatroom">
      <ChatroomOutput />
      <ChatroomStatus />
      <ChatroomControls convoId={convoId} />
    </div>
  );
}
I hope that helps! Between useEffect and manual reconnection logic, I just fixed similar issues to the ones you were having, where I was losing data on reconnection.
I saw you just answered your own question, but my approach might still be valuable for others, or if you continue to build a socket client.
You need to abstract the listening components away from the socket object. On message, the socket object needs to retrieve the subscribers and publish the new message to them. You can of course add filtering based on id, type, or other properties. Also, each component can drop its subscription when unmounting, or based on another need.
To showcase this I used timers, but it would be easy to convert to real messages.
socket_io.js
let socket;
const subscribers = []

if (!socket) {
  // socket initial connect
  socket = true

  setInterval(() => {
    console.log('interval runs', { socket })
    if (socket) {
      subscribers.forEach((sub) => {
        sub.onMessage()
      })
    }
  }, 1000)

  setTimeout(() => {
    // socket disconnects
    socket = false
    setTimeout(() => {
      // socket reconnects
      socket = true
    }, 4000)
  }, 4000)
}

export default subscribers;
MessagePage.js
import React, { useEffect, useState } from 'react'
import subscribers from './socket_io.js'

const MessagePage = () => {
  const [messageCount, setMessageCount] = useState(0)

  let messageHandler = (data) => {
    setMessageCount((current) => current + 1)
  }

  useEffect(() => {
    subscribers.push({
      id: '1',
      onMessage: (data) => messageHandler(data)
    })
    return () => {
      const subToRemove = subscribers.findIndex((sub) => sub.id === '1')
      subscribers.splice(subToRemove, 1)
    }
  }, []);

  return (
    <div>
      Messages received: {messageCount}
    </div>
  )
}

export default MessagePage
Hope I could help.
export default expects a hoistable declaration, i.e. a function or an expression.
socket_io.js
import EndPoints from './http/endpoints';
import io from "socket.io-client";

const socketUrl = EndPoints.SOCKET_BASE;
let socketOptions = { transports: ["websocket"] }

let socket;

class Socket {
  constructor () {
    if (!socket) {
      socket = io(socketUrl, socketOptions);
      socket.on('connect', () => {
        console.log(`Connected to Server`);
      })
      socket.on('disconnect', () => {
        console.log(`Disconnected from Server`);
      })
    }
    // Returning the shared io socket from the constructor makes
    // `new Socket` hand back the same instance every time.
    return socket;
  }
}

// Freeze the object, to avoid modification by other functions/modules
let newSocketInstance = Object.freeze(new Socket)
module.exports = newSocketInstance;
MessagePage.js
import socket from '../socket_io.js';

const MessagePage = (props) => {
  const messageHandler = (data) => {
  }

  useEffect(() => {
    socket.on('message', messageHandler); // This event no longer fires when the singleton socket io instance is reconnected
    return () => {
      socket.off('message');
    }
  }, []);
}

RxJS non-blocking polling

I'm working with the Amazon Transcribe service, and the SDK doesn't offer any way to find out when a transcription job has finished.
So we need to poll for that information. Nowadays we have the following working code...
const jobId = nanoid();

await amazonTrascribeClient
  .startTranscriptionJob({
    IdentifyMultipleLanguages: true,
    TranscriptionJobName: jobId,
    Media: {
      MediaFileUri: "s3://file-location",
    },
    Subtitles: {
      OutputStartIndex: 1,
      Formats: ["vtt", "srt"],
    },
    OutputBucketName: `file-location`,
    OutputKey: `transcriptions/${jobId}/`,
  })
  .promise();

// HELP HERE:
const callIntervalFunc = () => {
  const callInverval = setInterval(async () => {
    const { TranscriptionJob } = await amazonTrascribeClient
      .getTranscriptionJob({ TranscriptionJobName: jobId })
      .promise();
    if (
      ["COMPLETED", "FAILED"].includes(TranscriptionJob.TranscriptionJobStatus)
    ) {
      clearInterval(callInverval);
      // Persist in database etc...
    }
  }, 2000);
};

callIntervalFunc();
But as you can see it's extremely expensive and doesn't play well with concurrency. The objective is to poll for the information without blocking the event loop. Some people said to use fire-and-forget, but I have no idea where I should start.
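One non-blocking shape for this (a sketch, not AWS-official code): wrap the poll in an async loop that awaits a sleep between requests. Awaiting yields to the event loop, so other work keeps running; `client` and `jobId` stand in for the variables in the code above:

```javascript
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Poll until the job reaches a terminal state. Awaiting inside the loop
// yields to the event loop between requests, so nothing else is blocked.
async function waitForTranscriptionJob(client, jobId, intervalMs = 2000) {
  for (;;) {
    const { TranscriptionJob } = await client
      .getTranscriptionJob({ TranscriptionJobName: jobId })
      .promise();
    if (["COMPLETED", "FAILED"].includes(TranscriptionJob.TranscriptionJobStatus)) {
      return TranscriptionJob;
    }
    await sleep(intervalMs);
  }
}
```

With RxJS the same shape can be expressed as `timer(0, 2000)` piped through `switchMap` and `takeWhile`, but the plain loop needs no extra dependency and returns a promise you can simply `await` or fire and forget.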

Bi-directional Websocket with RTK Query

I'm building a web-based remote control application for the music program Ableton Live. The idea is to be able to use a tablet on the same local network as a custom controller.
Ableton Live runs Python scripts, and I use this library that exposes the Ableton Python API to Node. In Node, I'm building an HTTP/Websocket server to serve my React frontend and to handle communication between the Ableton Python API and the frontend running Redux/RTK Query.
Since I both want to send commands from the frontend to Ableton Live, and be able to change something in Ableton Live on my laptop and have the frontend reflect it, I need to keep a bi-directional Websocket communication going. The frontend recreates parts of the Ableton Live UI, so different components will care about/subscribe to different small parts of the whole Ableton Live "state", and will need to be able to update just those parts.
I tried to follow the official RTK Query documentation, but there are a few things I really don't know how best to solve.
RTK Query code:
import { createApi, fetchBaseQuery } from '@reduxjs/toolkit/query/react';
import { LiveProject } from '../models/liveModels';

export const remoteScriptsApi = createApi({
  baseQuery: fetchBaseQuery({ baseUrl: 'http://localhost:9001' }),
  endpoints: (builder) => ({
    getLiveState: builder.query<LiveProject, void>({
      query: () => '/completeLiveState',
      async onCacheEntryAdded(arg, { updateCachedData, cacheDataLoaded, cacheEntryRemoved }) {
        const ws = new WebSocket('ws://localhost:9001/ws');
        try {
          await cacheDataLoaded;
          const listener = (event: MessageEvent) => {
            const message = JSON.parse(event.data)
            switch (message.type) {
              case 'trackName':
                updateCachedData(draft => {
                  const track = draft.tracks.find(t => t.trackIndex === message.id);
                  if (track) {
                    track.trackName = message.value;
                    // Components then use selectFromResult to only
                    // rerender on exactly their data being updated
                  }
                })
                break;
              default:
                break;
            }
          }
          ws.addEventListener('message', listener);
        } catch (error) { }
        await cacheEntryRemoved;
        ws.close();
      }
    }),
  })
})
Server code:
import { Ableton } from 'ableton-js';
import { Track } from 'ableton-js/ns/track';
import path from 'path';
import { serveDir } from 'uwebsocket-serve';
import { App, WebSocket } from 'uWebSockets.js';

const ableton = new Ableton();
const decoder = new TextDecoder();
const initialTracks: Track[] = [];

async function buildTrackList(trackArray: Track[]) {
  const tracks = await Promise.all(trackArray.map(async (track) => {
    initialTracks.push(track);
    // A lot more async Ableton data fetching will be going on here
    return {
      trackIndex: track.raw.id,
      trackName: track.raw.name,
    }
  }));
  return tracks;
}

const app = App()
  .get('/completeLiveState', async (res, req) => {
    res.onAborted(() => console.log('TODO: Handle onAborted error.'));
    const trackArray = await ableton.song.get('tracks');
    const tracks = await buildTrackList(trackArray);
    const liveProject = {
      tracks // Will send a lot more than tracks eventually
    }
    res.writeHeader('Content-Type', 'application/json').end(JSON.stringify(liveProject));
  })
  .ws('/ws', {
    open: (ws) => {
      initialTracks.forEach(track => {
        track.addListener('name', (result) => {
          ws.send(JSON.stringify({
            type: 'trackName',
            id: track.raw.id,
            value: result
          }));
        })
      });
    },
    message: async (ws, msg) => {
      const payload = JSON.parse(decoder.decode(msg));
      if (payload.type === 'trackName') {
        // Update track name in Ableton Live and respond
      }
    }
  })
  .get('/*', serveDir(path.resolve(__dirname, '../myCoolProject/build')))
  .listen(9001, (listenSocket) => {
    if (listenSocket) {
      console.log('Listening to port 9001');
    }
  });
I have a timing issue where the server's ws open handler runs before the buildTrackList function has finished fetching all the tracks from Ableton Live. The "listeners" I'm adding in the open handler are callbacks you can attach to things in Ableton Live; the one in this example fires whenever the name of a track changes. The first question is whether it's best to solve this timing issue on the server side or on the RTK Query side?
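For the timing issue, one server-side option (a sketch; `makeTrackStore` and `fetchTracks` are hypothetical names standing in for the `ableton.song.get('tracks')` plus `buildTrackList` calls above) is to memoize the track-list fetch behind a promise, so the ws open handler can await the same fetch the HTTP route triggers:

```javascript
// Sketch: memoize the track-list fetch so any consumer (the HTTP route or
// the ws open handler) awaits the same in-flight promise.
function makeTrackStore(fetchTracks) {
  let tracksPromise = null;
  return {
    getTracks() {
      if (!tracksPromise) tracksPromise = fetchTracks(); // fetch at most once
      return tracksPromise; // every caller awaits the same promise
    },
  };
}
```

In the `open` handler you would then `await store.getTracks()` before attaching the name listeners, so they are only registered once the Ableton data actually exists.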
All the examples I've seen of working with Websockets in RTK Query are about "streaming updates". But from the beginning I've thought of my scenario as needing bi-directional communication over the same Websocket connection throughout the whole application. Is this possible with RTK Query, and if so, how do I implement it? Or should I use regular query endpoints for all commands from the frontend to the server?
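For the bi-directional part, one common pattern (a sketch; `makeSharedSocket` and the endpoint wiring below are assumptions, not documented RTK Query API) is to keep a single module-level socket that both `onCacheEntryAdded` (receiving) and mutation `queryFn`s (sending) use:

```javascript
// Sketch: one lazily created socket shared by readers and writers.
// createSocket is injected so the same code works with the browser WebSocket,
// e.g. () => new WebSocket('ws://localhost:9001/ws').
function makeSharedSocket(createSocket) {
  let socket = null;
  return {
    get() {
      if (!socket) socket = createSocket(); // create once, reuse everywhere
      return socket;
    },
    send(message) {
      this.get().send(JSON.stringify(message));
    },
  };
}

// A mutation endpoint could then send a command over the same socket
// (hypothetical endpoint name, shown here only as a comment):
// renameTrack: builder.mutation({
//   queryFn: ({ id, name }) => {
//     shared.send({ type: 'trackName', id, value: name });
//     return { data: null };
//   },
// }),
```

The `onCacheEntryAdded` handler would call `shared.get()` instead of constructing its own `new WebSocket(...)`, so commands and streaming updates travel over the one connection.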

How to send and close websocket with a function in React

I am trying to use websockets to connect to the Kraken websocket API. I create a new instance and set up listeners on initial render. I am trying to make buttons to subscribe and close, but the buttons are not working.
I am using this websocket library.
const WebSocket = require('isomorphic-ws')

function App() {
  const ws = new WebSocket('wss://ws.kraken.com')

  useEffect(() => {
    ws.onopen = () => {
      console.log('connected')
    }
    ws.onmessage = (msg) => {
      const message = JSON.parse(msg.data)
      const sub = message[2]
      console.log(message)
    }
    ws.onclose = () => {
      console.log('closing connection')
      // ws.close()
    }
    return () => {
      ws.close()
    }
  }, [])

  const wsClose = () => {
    ws.close()
  }

  const wsSub = () => {
    ws.send(JSON.stringify(
      {
        "event": "subscribe",
        "pair": [
          "XBT/USD"
        ],
        "subscription": {
          "name": "ticker"
        }
      }
    ))
    console.log('send subscribe')
  }

  return (
    <>
      <button onClick={wsSub}>subscribe</button>
      <button onClick={wsClose}>close ws</button>
    </>
  )
}

export default App
If I put the code from the wsSub function under the ws.onopen listener, the subscription works, but I want control over the subscriptions rather than subscribing as soon as the websocket opens. I am using buttons like this for testing; I actually want to subscribe and unsubscribe based on user form submission, but I feel like I should get this working first.
Right now, you're creating a new socket every time the component re-renders. The effect callback references the first socket created (on mount), but the wsClose and wsSub do not (they reference the socket created in the immediately previous render).
Put the socket into a state or a ref:
const wsRef = useRef();

if (!wsRef.current) {
  wsRef.current = new WebSocket('wss://ws.kraken.com');
}
Then proceed to replace all uses of ws with wsRef.current.
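To see why the ref matters, here is a framework-free sketch: `render` simulates a React re-render, and because the ref object persists across calls, the socket factory runs only once (`createSocket` stands in for `new WebSocket(...)`):

```javascript
// Sketch: a ref-like object that survives "re-renders", so the lazily
// initialized value is created once and shared by every render's handlers.
function makeComponent(createSocket) {
  const ref = { current: null }; // persists across render calls, like useRef
  return function render() {
    if (!ref.current) ref.current = createSocket(); // lazy init, exactly once
    return ref.current;
  };
}
```

In the component, this is exactly what `useRef` plus the `if (!wsRef.current)` guard gives you: `wsClose` and `wsSub` from any render all act on the same socket that the effect's cleanup closes.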

Pubsub between two nodes in IPFS

I'm trying to send messages between two IPFS nodes.
The daemon that I'm running is based on go-ipfs, and is running with the flag:
ipfs daemon --enable-pubsub-experiment
I've coded two .js files, one is for the subscriber:
const IPFS = require('ipfs')

const topic = 'topic';
const Buffer = require('buffer').Buffer;
const msg_buffer = Buffer.from('message');

const ipfs = new IPFS({
  repo: repo(),
  EXPERIMENTAL: {
    pubsub: true
  },
  config: {
    Addresses: {
      Swarm: [
        '/dns4/ws-star.discovery.libp2p.io/tcp/443/wss/p2p-websocket-star'
      ]
    }
  }
})

ipfs.once('ready', () => ipfs.id((err, info) => {
  if (err) { throw err }
  console.log('IPFS node ready with address ' + info.id)
  subscribeToTopic()
}))

function repo () {
  return 'ipfs-' + Math.random()
}

const receiveMsg = (msg) => {
  console.log(msg.data.toString())
}

const subscribeToTopic = () => {
  ipfs.pubsub.subscribe(topic, receiveMsg, (err) => {
    if (err) {
      return console.error(`failed to subscribe to ${topic}`, err)
    }
    console.log(`subscribed to ${topic}`)
  })
}
And one is for the publisher:
const IPFS = require('ipfs');

const topic = 'topic';
const Buffer = require('buffer').Buffer;
const msg_buffer = Buffer.from('message');

const ipfs = new IPFS({
  repo: repo(),
  EXPERIMENTAL: {
    pubsub: true
  },
  config: {
    Addresses: {
      Swarm: [
        '/dns4/ws-star.discovery.libp2p.io/tcp/443/wss/p2p-websocket-star'
      ]
    }
  }
})

ipfs.once('ready', () => ipfs.id((err, info) => {
  if (err) { throw err }
  console.log('IPFS node ready with address ' + info.id)
  publishToTopic()
}))

function repo () {
  return 'ipfs-' + Math.random()
}

const publishToTopic = () => {
  ipfs.pubsub.publish(topic, msg_buffer, (err) => {
    if (err) {
      return console.error(`failed to publish to ${topic}`, err)
    }
    // msg was broadcasted
    console.log(`published to ${topic}`)
    console.log(msg_buffer.toString())
  })
}
I've run the .js scripts with:
node file.js
But the subscriber didn't receive any message from the publisher and I don't know why.
What is the correct way to connect two nodes in this case?
Maybe I'm wrong, but the npm package ipfs is an entire implementation of the IPFS protocol, and it creates a node when the constructor is called; that's why ipfs daemon ... is not necessary. If you instead want to use the running ipfs daemon through its API, you can use the ipfs-http-client package.
You can use ipfs-pubsub-room and it has a working example based on this package ipfs-pubsub-room-demo.
I hope it helps, I'm still learning this tech too.
Currently (2019-09-17) most nodes in the IPFS network don't have pubsub enabled, so the chances that your pubsub messages will get through are slim.
You can try to establish a direct connection between your nodes, as explained here: managing swarm connections in ipfs
Essentially:
Run "ipfs id" on the internet-accessible node.
Inspect the output and get the address (it should look like this: /ip4/207.210.95.74/tcp/4001/ipfs/QmesRgiWSBeMh4xbUEHUKTzAfNqihr3fFhmBk4NbLZxXDP).
On the other node, establish a direct connection:
ipfs swarm connect /ip4/207.210.95.74/tcp/4001/ipfs/QmesRgiWSBeMh4xbUEHUKTzAfNqihr3fFhmBk4NbLZxXDP
Please see the ipfs Github example, as it shows how to connect two js-ipfs browser nodes together via WebRTC.
