I have run into an unforeseen problem with my socket.io setup.
I use socket.io to live-load data from my database (MongoDB, Node.js, React).
To accomplish this, I use MongoDB's change streams to detect changes and then push them to the front end via socket.io.
Now this works perfectly as long as the user is connected. Right now, when the user reconnects, it just reloads all data. While this is fine for most users, there is a small group with a very bad network connection, so for them the front end is reloading data all the time, which makes it unresponsive for a while.
So, I am looking for a way to only send events that occurred while the front end was offline. The front end can handle this quite easily: https://socket.io/docs/v4/client-offline-behavior/
It doesn't seem possible to do this on the server side, since socket.io (server side) immediately forgets sockets that have disconnected and thus can't buffer events.
So, I was wondering if there is a good way to do this? Or would this need a full "wrapper" around socket.io that caches disconnected sockets?
Any help or advice would be appreciated!
I find this a really interesting and painful problem! ^^'
If you can give more details, it may help people give you a better answer.
For instance:
How much data is stored in the database, how much does a typical user receive, and how many events are triggered in a given time frame?
How long may an event take to become visible? I mean, if users receive an event with a 10 s, 30 s, ... delay, is it harmful for the service you provide?
How is your data structured? Is it a simple JSON array with the same fields, custom fields, dynamic JSON objects, etc.?
How is your React app structured? Do you run heavy logic when your data is updated, etc.?
I think you should put more checks in your front-end code and update only when there is new data.
Some paths to explore
1. Put more checks in your front end
As you stated, for users with a bad connection, the React client seems to update its state too often: they reload data every time the websocket reconnects, again and again. The UI may indeed freeze in this case.
For this, I can think of two approaches:
Before updating the state, check whether the current React state is the same as the data you receive from the websocket connection. If the reconnection is quick enough and no new data has arrived, it should be the same; in that case, do not update the React state.
If too many events are triggered and new data arrives after each reconnection, you can buffer the data from the websocket and flush it to the UI only once per time frame. By time frame, I mean you can use functions like setInterval or requestAnimationFrame to trigger the React update. Here is some React pseudo-code to illustrate this:
import { useState, useEffect, useRef } from "react";

function App() {
  // `websocket` is assumed to be your connected socket.io client instance.
  const [events, setEvents] = useState({ datas: [] });
  const bufferedEvents = useRef([]);

  useEffect(() => {
    websocket.on("connected", (newEvents) => {
      bufferedEvents.current = bufferedEvents.current.concat(newEvents);
    });
    websocket.on("data", (newEvent) => {
      bufferedEvents.current = bufferedEvents.current.concat(newEvent);
    });

    // In the setInterval callback you take all the events received at connection time
    // plus the new ones to update the React state, and clear the buffer at the same time.
    const intervalId = setInterval(() => {
      const newEvents = bufferedEvents.current;
      bufferedEvents.current = [];
      // Update only if there is new data.
      if (newEvents.length > 0) {
        setEvents((prevState) => ({ datas: prevState.datas.concat(newEvents) }));
      }
    }, 1000); // Trigger a data update every second; adapt the refresh time as needed,
    // or replace this approach with requestAnimationFrame.

    // Do not forget to clear the interval when the component unmounts.
    return () => {
      clearInterval(intervalId);
    };
  }, []);

  return (
    <div>
      <span>Total events: {events.datas.length}</span>
      <br />
      {events.datas.map((event, index) => (
        <div key={index}>{event.data}</div>
      ))}
    </div>
  );
}
You can look at this article for details on using requestAnimationFrame.
I think that modifying the front end is needed in any case, but on its own it is still not great for performance.
2. Fetch only new data in your back end
For this approach, it really depends on how your data is structured in the database.
If the data has some timestamp in it, I can think of a naive but simple approach: a cookie with a timestamp in it.
When the user connects for the first time, this cookie is null.
When they fetch the data on the websocket connection, they receive all the data. When the data arrives, you update the cookie timestamp with the most recent date in the data.
When the websocket is disconnected and a new one is opened, send the cookie timestamp with it. With this information you can query only the data more recent than the timestamp in the cookie.
This way, you don't have to download the entire data set, only the fresh entries.
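On the server side, that could look roughly like the sketch below. It is only a minimal sketch: the io and db objects, the "events" collection, the updatedAt field, and the since value sent in the socket.io handshake auth are all assumptions to adapt to your schema.

// Client side: pass the last known timestamp when (re)connecting, e.g.
// io("https://your-server", { auth: { since: lastTimestamp } });
io.on("connection", async (socket) => {
  const since = socket.handshake.auth.since; // undefined on the very first connection

  // Query only the documents that changed while the client was offline.
  const query = since ? { updatedAt: { $gt: new Date(since) } } : {};
  const missedEvents = await db.collection("events").find(query).toArray();

  // Send only what the client missed (or everything on the first connection).
  socket.emit("connected", missedEvents);
});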
Other approaches may be more helpful, but without more information on your data and more precise requirements, it is hard to say.
If you have a lot of data, I would personally look into some pagination mechanism, and maybe combine classic HTTP requests for fetching the data with websockets, SSE, or long polling for live events.
You can leave a comment if needed and I will update my answer!
Cheers
When I click on clear search (the cross), it takes 30 seconds to clear the search text. Why does it work so slowly?
Please help me resolve this problem.
<SearchField id="idSoldSearch" search="onSoldSearch" width="100%" />
Here I bind data from a web service in the onSoldSearch function.
When you click on the cross button of the SearchField, it also triggers the onSoldSearch event handler, which then makes a web service call with an empty search value. That round trip is what takes the 30 seconds. You can open the Chrome F12 developer tools and monitor the network call time.
If you really want to make the web service call when the search value is empty, you can show a busy dialog.
onSoldSearch: function(oEvent) {
    var sQuery = oEvent.getParameters().query;
    // Adjust the following code based on your real code.
    var onSuccess = function(oResponse) {
        // some logic
        this.hideBusyDialog();
    }.bind(this);
    var onError = function(oResponse) {
        // some logic
        this.hideBusyDialog();
    }.bind(this);
    webService.call("YourWebServiceurl?" + sQuery, onSuccess, onError);
    this.showBusyDialog();
}
Please see the example about BusyDialog.
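The showBusyDialog and hideBusyDialog helpers used above are not standard controller methods; a minimal sketch based on sap.m.BusyDialog (the dialog text is just a placeholder) could look like this, in the same controller as onSoldSearch:

showBusyDialog: function() {
    // Create the dialog lazily and reuse it.
    if (!this._oBusyDialog) {
        this._oBusyDialog = new sap.m.BusyDialog({ text: "Searching..." });
    }
    this._oBusyDialog.open();
},

hideBusyDialog: function() {
    if (this._oBusyDialog) {
        this._oBusyDialog.close();
    }
}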
I have implemented a function on the client side that calls the following function in the bigbluebutton-apps participant service:
public void modEndMeeting(String roomName) {
    roomsManager.removeRoom(roomName);
}
which calls a function that does the following:
Gson gson = new Gson();
messagingService.send(MessagingConstants.SYSTEM_CHANNEL, gson.toJson(map));
And on the bigbluebutton-web side the following code is run:
listener.userLeft(meetingId, internalUserId);
And all that code does is place the meeting on the garbage collection list and does not end it right away.
The correct way to do it is to call the function end defined in ApiController.groovy.
I can do that by generating an HTTP request and sending the correct parameters, but I do not want to follow that approach.
Is there any way I can connect to, or obtain an instance of, something from which I can call the function end defined in ApiController.groovy, which has an instance of a class called meetingService that actually owns the meeting?
I have tried to make meetingService a singleton but that did not work.
I have also made the list that holds the meetings in meetingService static. That also did not work.
Here is a brief structure of meetingService
private final ConcurrentMap<String, Meeting> meetings;
This is the map that houses the meetings.
Any ideas or suggestions?
An HTTP request would be the only way to close a meeting from the ApiController.
Another way would be too expensive: decreasing the timer from 60,000 to something like 1,000 in the ExpiredMeetingCleanupTimer.java class.
Or you could place a listener in ApiController.groovy and send a Redis message.
The thing to do would be to mark the meeting as closed on the Red5 side and kick anyone trying to join it, until the web part does its job and ends the meeting.
This is a project architecture issue: it's my first Backbone project and I probably did something wrong.
Throughout my project, in route callbacks, I have:
myroute: function() {
    this.currentView = new MyCustomView();
},
mysecondroute: function() {
    this.currentView = new MySecondView();
},
//...
So in all route callbacks I instantiate some view. This view has an initialize method which calls the render method. It works, except that all view events (declared in events: {}) are bound every time the same view is instantiated, so when I visit the same route twice, the events for the view corresponding to this route are fired twice...
Probably I shouldn't instantiate a new view on every route call, but how can I avoid this? I mean, what is the standard approach? Maybe I should just unload the current view somehow; is there any method to do this?
I think you have to add a method that unbinds all the events when it is time to close the view, like this:
close: function () {
    // your code to clean everything up before closing the view
    this.remove();
    this.unbind();
}
The next time the view is instantiated, the events will be bound again during the initialization of the view; that's why you had events being fired twice. The initialize method binds the events to the .el element, so you need to make sure you unbind them at some point, as in the router sketch below.
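In the router, this could look roughly like the following sketch (based on the route callbacks from the question): close the previous view before creating the next one.

myroute: function() {
    // Clean up the previous view so its events are unbound.
    if (this.currentView) {
        this.currentView.close();
    }
    this.currentView = new MyCustomView();
},
mysecondroute: function() {
    if (this.currentView) {
        this.currentView.close();
    }
    this.currentView = new MySecondView();
}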
I have an ASP.NET MVC 3 (.NET 4) web application.
This app fetches data from an Oracle database and mixes some information from another SQL database.
Many tables are joined together and a lot of database reading is involved.
I have already optimized the fetching side as best I could, and I don't have problems with that.
I've used caching to save information I don't need to fetch over and over.
Now I would like to build a responsive interface: my goal is to present the users the filtered order headers and load the order lines in the background.
I want to do that because I need to manage all the order lines as a whole, because of some calculations.
What I have done so far is use jQuery to make an Ajax call to my action, where I fetch the order headers and save them in a cache (System.Web.Caching.Cache).
When the Ajax call has succeeded I fire off another Ajax call to fetch the lines (and, once again, save the result in a cache).
It works quite well.
Now I was trying to figure out if I can move some of this logic from the client to the server.
When my action is called, I want to fetch the order headers, start a new thread responsible for fetching the order lines, and return the result to the client.
In a test app I tried both ThreadPool.QueueUserWorkItem and Task.Factory, but I need the spawned thread to access my cache.
I've put together a test app and done something like this:
TEST 1
[HttpPost]
public JsonResult RunTasks01()
{
    var myCache = System.Web.HttpContext.Current.Cache;
    myCache.Remove("KEY1");
    ThreadPool.QueueUserWorkItem(o => MyFunc(1, 5000000, myCache));
    return (Json(true, JsonRequestBehavior.DenyGet));
}
TEST 2
[HttpPost]
public JsonResult RunTasks02()
{
    var myCache = System.Web.HttpContext.Current.Cache;
    myCache.Remove("KEY1");
    Task.Factory.StartNew(() =>
    {
        MyFunc(1, 5000000, myCache);
    });
    return (Json(true, JsonRequestBehavior.DenyGet));
}
MyFunc creates a list of items and saves the result in a cache; pretty silly, but it's just a test.
I would like to know if someone has a better solution, or knows of any implications of accessing the cache from a separate thread.
Is there anything I need to be aware of, should avoid, or could improve?
Thanks for your help.
One possible issue I can see with your approach is that System.Web.HttpContext.Current might not be available in the separate thread, as that thread could run later, once the request has finished. I would recommend using the classes in the System.Runtime.Caching namespace, introduced in .NET 4.0, instead of the old HttpContext.Cache.
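For example, here is a minimal sketch using MemoryCache from System.Runtime.Caching; the cache key, the 10-minute expiration, and the LoadOrderLines placeholder are just assumptions to adapt to your code.

// MemoryCache.Default lives for the whole process, so it is safe to use
// from a background thread, unlike HttpContext.Current.Cache.
using System;
using System.Runtime.Caching;
using System.Threading.Tasks;

public class OrderLinesLoader
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    public void StartLoading()
    {
        Cache.Remove("KEY1");

        // Fire and forget: load the order lines in the background and cache them.
        Task.Factory.StartNew(() =>
        {
            var lines = LoadOrderLines(); // your expensive fetching logic
            Cache.Set("KEY1", lines, DateTimeOffset.Now.AddMinutes(10));
        });
    }

    private object LoadOrderLines()
    {
        // Placeholder for the real order-line query.
        return new object();
    }
}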