GUI and Threads in a chat program - multithreading

Hello guys, I've been searching for an answer to this question and have been unable to find a suitable solution to my problem.
I have a chat program with a somewhat advanced GUI. The chat program consists of two parts, a server and a client. I've created a protocol that my client listens to and reacts to depending on what type of information it receives.
I have created a class called ClientReceiver which extends Thread, but I am now confused about how to take the information that the thread receives and use it in my GUI.
An example of this: how do I get the text that one of my clients sends and add it to my GUI?
It may be worth mentioning that I am using JavaFX Scene Builder to build my GUI.
Hope someone is able to help me.
Best regards, Marc Rasmussen

It's hard to advise without details on your custom protocol. See the zenjava blog for some inspiration.
Use a Task to invoke your server from your client. If the result of the client-server call is synchronous, get the value returned by the call when the Task completes. If the call is asynchronous, or the server pushes data to the client, set up a listener on the client running in its own thread; when it gets a result, invoke Platform.runLater to feed the result to the JavaFX Application Thread for UI processing.
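For the push case, a minimal sketch of such a receiver thread might look like the following (the ClientReceiver name, the line-based protocol, and the TextArea control are illustrative assumptions, not part of the original question):

```java
import javafx.application.Platform;
import javafx.scene.control.TextArea;

import java.io.BufferedReader;

// Hypothetical receiver thread: reads messages from the server and hands
// each one to the JavaFX Application Thread for display.
public class ClientReceiver extends Thread {

    private final BufferedReader in;   // stream wrapped around the client socket
    private final TextArea chatArea;   // control injected from the FXML controller

    public ClientReceiver(BufferedReader in, TextArea chatArea) {
        this.in = in;
        this.chatArea = chatArea;
        setDaemon(true); // don't keep the JVM alive after the UI closes
    }

    @Override
    public void run() {
        try {
            String line;
            while ((line = in.readLine()) != null) {
                final String message = line;
                // Never touch JavaFX nodes from this thread directly;
                // hand the UI update over to the JavaFX Application Thread.
                Platform.runLater(() -> chatArea.appendText(message + "\n"));
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
```

The same Platform.runLater call is all you need for any other message type in your protocol: parse it on the receiver thread, then apply the resulting UI change inside the runnable.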

Related

Building websites only on Node.js and Express: blocking requests over HTTP

I have a question regarding the examples out there when using Nodejs, Express and Jade for templates.
All the examples show how to build some sort of a user administrative interface where you can add user profiles, delete them and manage them.
Those are considered beginner's guides to Node.js. My question is around the fact that if I have 10 users concurrently accessing the same interface and doing the same operations, surely Node.js will block the requests of the other users, as they are running on the same port.
So let's say I am pulling out a list of users, which may be something like 10,000. Yes, I can do paging, but that is not the point. While I am getting the list from the server, another 4 users want to access the application. They have to wait for my process to end. That is my question: how can one avoid that using Node.js & Express?
I have been stuck on this issue for a couple of months! I currently have something in place that does the following:
Run the main processing of stuff on a port
Run a Socket.io process on a different port
Use a sticky session
The idea is that I do a request (like getting a list of items), and immediately respond with some request reference but without the requested items, thus releasing the port.
In the background, "asynchronously", I then do the work of getting the items. When that has completed, I make an HTTP request from the main node process to the socket.io node process (on its other port), SENDING the items through.
When that is done I then perform a socket.io emit WITH the data and the initial request reference so that the correct user gets the message.
On the client side I have an event listening for the socket which then completes the ajax request by populating the list.
I have had SOME success doing this! It actually works to a degree! I have an issue when running online that complicates matters, due to IP addresses and socket.io playing up.
I also have multiple workers using clustering. I use it in the following manner:
I create a master worker
I spawn workers
I take any connection request and pass it to the relevant worker.
I do that for the main node request as well as for the socket requests. Like I said I use 2 ports!
As you can see, I have done a lot of work on this and I still do not have a proper solution!
My question is this: have I gone all around the world 10 times only to have missed something simple? This sounds way too complicated a way to achieve a non-blocking Node.js-only website.
I asked myself: surely all these tutorials would not have missed something as important as this! But they did!
I have researched, read, and tested a lot of code. This is the very first time I have asked anything on Stack Overflow!
Thank you for any assistance.
P.S. One example of the same approach: I request a report using Jasper, I pass parameters, and with the "delayed AJAX response" approach described above I simply release the port, while in the background a very intensive report is being generated (and this can be a very intensive process, as a lot of calculations are performed). I really don't see a better approach, so any help will be super appreciated!
Thank you for taking the time to read!
I'm sorry to say it, but yes, you have been going around the world 10 times only to have been missing something simple.
It's obvious that your previous knowledge/experience with web servers is from a blocking point of view; if that were the case with Node as well, your concerns would be valid.
Node.js is a framework built around using a single thread to execute code, which means that if it performs any blocking operation, nobody else can get anything done.
There are some operations that can block in Node, like reading/writing to disk. However, most Node operations are asynchronous.
I believe you are familiar with the term, so I won't go into details. What asynchronous operations allow Node to do is keep this single thread idle as much as possible. By idle I mean open for other work. If your code is fully asynchronous, then handling 4 concurrent users (or even 400) shouldn't be a problem, even for a single thread.
Now, in regards to your initial problem of ports: once a request is received on a given port, Node.js executes whatever code you have written for it until it encounters an asynchronous operation; as soon as that happens, it is available to pick up more requests on the same port.
The second problem you ask about is the database operation. In this case, Node.js sends the query to the database (which takes no time at all) and the database does the actual execution of the query. In the meantime, Node is free to do whatever it wants until the database is finished and lets Node know there is a result to fetch.
You can recognize async operations by their structure: my_function(..., ..., callback). A function that takes a callback is, in most cases, asynchronous.
So, bottom line: don't worry about the problems around blocking I/O, as you will hardly encounter any in Node. Use a single port if you want (by creating multiple child processes, you can even have multiple Node instances on the same port).
Hope this explains it well enough. If you have any further questions, let me know :)

Can I mq_send to reply after I mq_receive?

I have one or more daemon apps running, and to communicate with them I have a client app. The client app is something simple executed on the command line. Chances are only one will be up at a given moment. When I issue a command such as daemon update-config, the client does mq_open and sends the command. For some commands, like list, I'd want results back. It appears that if I call mq_send in my daemon after I receive, I may end up receiving that reply message within the daemon app itself.
What's the best way to send the reply to the client without accidentally processing it in the daemon? After a quick look there didn't appear to be an obvious solution, so I do sleep(1), which seems to solve my problem completely even though it's a hack. What's the best solution? Is sleep the most understandable and straightforward solution? I don't feel like generating random/unique values, passing them in and opening another mq to send the reply on. Sleeping for a second feels like the best solution, but I wonder what your solutions may be.
When using messaging systems, you can do RPC calls, even though RPC is not the paradigm messaging is best suited for in general. The general approach to RPC with messaging is:
have distinct queues for requests and for replies (the latter can be ephemeral queues created for each request, or persistent queues);
give each message a unique ID that will be used in the replies to identify which message it is replying to (it's called correlation_id in AMQP, for example).
I would guess that you can use the same approach with POSIX queues as well.

Creating a simple Linux API

I have a simple application on an OpenWrt-style router. It's currently written in C++. The router (embedded Linux) has very limited disk space and RAM; for example, there is not enough space to install Python.
So, I want to control this daemon app over the network. I have read some tutorials on creating sockets and listening on a port for activity, but I haven't been able to integrate the flow into a C++ class, and I haven't been able to figure out how to decode the information received or how to send a response.
All the tutorials I've read are dead ends: they show you how to make a server that basically just blocks until it receives something and then returns a message when it gets something.
Is there something a little higher-level that can be used for this sort of thing?
Sounds like what you are asking is "how do I build a simple network service that will accept requests from clients and do something in response?" There are a bunch of parts to this -- how do you build a service framework, how do you encode and decode the requests, how do you process the requests and how do you tie it all together?
It sounds like you're having problems with the first and last parts. There are two basic ways of organizing a simple service like this -- the thread approach and the event approach.
In the thread approach, you create a thread for each incoming connection. That thread reads the messages (requests) from that connection (file descriptor), processes them, and writes back responses. When a connection goes away, the thread exits. You have a main 'listening' thread that accepts incoming connections and creates new threads to handle each one.
In the event approach, each incoming request becomes an event. You then have event handlers that process these events, sending back responses. It's important that the event handlers do NOT block and that they complete promptly, otherwise the service may appear to lock up. Your program has a main event loop that waits for incoming events (generally blocking on a single poll or select call) and reads and dispatches each event as appropriate.
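Both patterns are language-agnostic. Purely as an illustration, here is a rough sketch of the thread approach (written in Java; the same shape carries over directly to C++ with BSD sockets and std::thread, and the port and line-based protocol are placeholders):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

// Rough sketch of the "thread approach": one listening thread accepts
// connections, and each connection gets its own handler thread.
public class SimpleService {
    public static void main(String[] args) throws Exception {
        try (ServerSocket listener = new ServerSocket(9000)) {
            while (true) {
                Socket conn = listener.accept();          // main 'listening' thread
                new Thread(() -> handle(conn)).start();   // one thread per connection
            }
        }
    }

    private static void handle(Socket conn) {
        try (BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()));
             PrintWriter out = new PrintWriter(conn.getOutputStream(), true)) {
            String request;
            while ((request = in.readLine()) != null) {
                // Decode the request and build a real response here; echoing is a stand-in.
                out.println("OK: " + request);
            }
        } catch (Exception e) {
            // connection dropped; the handler thread simply exits
        }
    }
}
```

The event approach replaces the per-connection threads with a single loop blocking on poll/select (or epoll on Linux) and dispatching each ready socket to a handler that must not block.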
I installed the python-mini package with opkg, which has socket and thread support.
Works like a charm on a WRT160NL with backfire/10.03.1.

How to code in GWT server side?

In GWT, I will use GChart to present data in the browser. On the GWT server side I will need one thread that generates random data, and other threads that push the data to the client (browser) in a timely manner, say every 2 seconds (synchronously). How can I code this on the server side?
Any kind of help is appreciated.
Thanks in advance
Writing code on the server side of GWT is really exactly the same as writing Java code without GWT. In other words, once you get hold of the data that the client sent inside your implementation of RemoteServiceServlet, you are free to use whatever Java code, libraries, and/or frameworks you like to process that data.
From your description, it sounds like you need to kick off another thread to generate random data and then respond immediately to the client. You might want to read about creating new threads in Java: http://download.oracle.com/javase/tutorial/essential/concurrency/.
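As a minimal sketch of that idea (the DataService/DataServiceImpl names and the 2-second interval are made up for illustration; the client-side Async interface is the usual GWT counterpart and is omitted here):

```java
// --- DataService.java (hypothetical RPC interface; shared with the client) ---
import com.google.gwt.user.client.rpc.RemoteService;
import com.google.gwt.user.client.rpc.RemoteServiceRelativePath;

@RemoteServiceRelativePath("data")
public interface DataService extends RemoteService {
    double getLatestValue();
}
```

```java
// --- DataServiceImpl.java (server side) ---
import com.google.gwt.user.server.rpc.RemoteServiceServlet;

import java.util.Random;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicReference;

// A background task refreshes the data every 2 seconds; the RPC method
// just returns whatever is current, so it responds immediately.
public class DataServiceImpl extends RemoteServiceServlet implements DataService {

    private static final AtomicReference<Double> latest = new AtomicReference<>(0.0);
    private static final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();

    static {
        Random random = new Random();
        scheduler.scheduleAtFixedRate(
                () -> latest.set(random.nextDouble()),  // generate new random data
                0, 2, TimeUnit.SECONDS);
    }

    @Override
    public double getLatestValue() {
        return latest.get();
    }
}
```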
There are several libraries that make it easier to run jobs; I'm familiar with Quartz. You could use a scheduler like Quartz to schedule a job that generates random data when the client requests it, or maybe it could just generate random data every so often.
From the client side, you'll probably want to poll every 2 seconds to check whether there is new data to display (see the sketch after the link below). Here's another thread that gives some options for polling from GWT:
Client side Callback in GWT
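For the polling itself, a minimal client-side sketch with GWT's Timer might look like this (it assumes the hypothetical DataService from the server-side sketch above, plus its standard DataServiceAsync counterpart):

```java
import com.google.gwt.core.client.GWT;
import com.google.gwt.user.client.Timer;
import com.google.gwt.user.client.rpc.AsyncCallback;

// Hypothetical client-side poller: asks the server for fresh data every 2 seconds.
public class DataPoller {

    private final DataServiceAsync service = GWT.create(DataService.class);

    public void start() {
        new Timer() {
            @Override
            public void run() {
                service.getLatestValue(new AsyncCallback<Double>() {
                    @Override
                    public void onSuccess(Double value) {
                        // update the chart/widget with the new value here
                    }

                    @Override
                    public void onFailure(Throwable caught) {
                        // log or show an error; keep polling
                    }
                });
            }
        }.scheduleRepeating(2000); // poll every 2 seconds
    }
}
```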
Dave

Notifying a browser about events on server

I have a Java-based web application (Struts 1.2). I have a requirement to display a status on the front end (JSP). Now the status might change; my server gets notified of the change by another server. I want this status change to be pushed to the browser.
I don't want to refresh at intervals. Rather, I want to implement something like what Gmail chat does, i.e. the browser gets notified of changing events on the server.
Any ideas on how to go about this?
I was thinking along the lines of opening a request to the server for the status; at the server end I would hold the request and not respond until there is a status change. Any pointers or examples on this?
The best possible solution would be to make use of the XMPP protocol. It's standardized, and a lot of open-source solutions will get you started within minutes. You can use a combination of Smack, Strophe.js and Openfire to get your Java-based app working as desired.
There's a technique called long polling (Comet). The client sends a request to the server; the request thread created on the server simply waits for new data for that user, with a time limit of maybe a minute or more. When new data is available, it is returned.
The main problem to tackle is on the server side: you don't want to have one thread for every user just sitting there waiting for new data. Depending on your back-end, you can use asynchronous request processing to avoid that; see the sketch after the reference link below.
Ref: http://en.wikipedia.org/wiki/Push_technology
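If your servlet container supports Servlet 3.0, a minimal long-polling sketch might look like this (URL, timeout, and message format are placeholders; a production version would also register an AsyncListener to evict timed-out requests):

```java
import javax.servlet.AsyncContext;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;

// Sketch of long polling with Servlet 3.0 async support: requests are parked
// without tying up a container thread, and completed when the status changes.
@WebServlet(urlPatterns = "/status", asyncSupported = true)
public class StatusServlet extends HttpServlet {

    private static final Queue<AsyncContext> waiting = new ConcurrentLinkedQueue<>();

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) {
        AsyncContext ctx = req.startAsync();
        ctx.setTimeout(60_000);   // give up after a minute; the client just re-polls
        waiting.add(ctx);
    }

    // Called by whatever part of the application learns about the status change.
    public static void statusChanged(String newStatus) {
        AsyncContext ctx;
        while ((ctx = waiting.poll()) != null) {
            try {
                ctx.getResponse().getWriter().write(newStatus);
                ctx.complete();
            } catch (Exception e) {
                // client went away or the request already timed out; just drop it
            }
        }
    }
}
```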
An alternative would be to use WebSockets. The problem is that they are not supported by all browsers today.
