I have a Fedora Directory server that I need to shut down. In order to do so, I need to find a list of all clients currently authenticating to this server. Not being familiar with Fedora/389 Directory, I was wondering if there's an easy way to do that? My best option at this point seems to be to comb through the log files.
An LDAP-compliant server should send an unsolicited notification to clients about events transpiring between the client and server. The notification contains information that the client can use to take action. Therefore, properly coded clients should not care about the server being shut down. Clients that do not support unsolicited notifications should have that support added.
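For what it's worth, here is roughly what a tolerant client looks like in Node.js. This is a minimal sketch assuming the ldapjs package; the host, bind DN, and password are made up, and it reacts to the connection dropping rather than parsing the unsolicited notification itself:

```js
const ldap = require('ldapjs');

const client = ldap.createClient({
  url: 'ldap://ldap.example.com',   // hypothetical host
  reconnect: true,                  // retry when the server closes the connection (e.g. at shutdown)
});

// A server shutdown shows up client-side as the connection going away; log it
// and let the reconnect logic retry instead of crashing the application.
client.on('error', (err) => {
  console.warn('LDAP connection problem, will retry:', err.message);
});

client.bind('cn=app,ou=services,dc=example,dc=com', 'secret', (err) => {
  if (err) {
    console.error('bind failed:', err.message);
  }
});
```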
See also:
LDAP: Programming Practices
Related
I am currently working on a web application using the MEAN stack. It has a social aspect to it, so I want to be able to push notifications to users.
The way I do it now is that when something happens that should be a notification, it gets stored in a Mongo database with an unread flag. Each client sends a GET request to the server every 30 seconds, receives every notification marked as unread, and then marks them as read.
I want to switch to using a message queue and sockets so less network resources will be used, and also provide the user with a real-time experience. I've thought about using redis and its pubsub structure but I can't seem to figure out how to do this securely. If I push out notifications to the affected users, won't it be easy for someone malicious to subscribe to somebody else's channel and receive notifications not meant for them? Am I missing something or is it just the wrong approach for such a system?
Edit: Figured I'd update with the solution I went with, in case anyone else reading this has the same problem.
Instead of using RabbitMQ, as the answer suggested, I figured that a much easier and more elegant solution is to just use socket.io. When a new socket connects to the server, I save a mapping from the userID to the socketID in a Redis in-memory DB (after I've validated the user's token). That way, if I need to push a notification to a user, I just look up the socketID in the Redis DB and send it to the correct socket.
I don't need much security beyond that, as socketIDs are unguessable and each message is only sent across the single socket that belongs to the given user.
The message only gets sent through the connection of that one socket, because socketIDs are used purely server-side to keep track of the connections. This means no one else can "listen" using someone else's socketID.
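For reference, here is a minimal sketch of the approach, assuming socket.io and the node-redis (v4) client; verifyToken() below is a placeholder for whatever token validation you already have:

```js
const { Server } = require('socket.io');
const { createClient } = require('redis');

const io = new Server(3000);
const redis = createClient(); // in-memory store for the userID -> socketID mapping

// Placeholder: replace with your real token validation (e.g. jwt.verify),
// returning the userID encoded in the token, or null if it is invalid.
async function verifyToken(token) {
  return token ? String(token) : null;
}

async function main() {
  await redis.connect();

  io.on('connection', async (socket) => {
    const userId = await verifyToken(socket.handshake.auth.token);
    if (!userId) return socket.disconnect(true);

    // Remember which socket belongs to this user, and clean up on disconnect.
    await redis.set(`socket:${userId}`, socket.id);
    socket.on('disconnect', () => redis.del(`socket:${userId}`));
  });
}

// Called whenever something notification-worthy happens.
async function notifyUser(userId, payload) {
  const socketId = await redis.get(`socket:${userId}`);
  if (socketId) {
    io.to(socketId).emit('notification', payload); // only this user's socket receives it
  }
  // Otherwise the user is offline; fall back to the unread flag in Mongo.
}

main();
module.exports = { notifyUser };
```

On a single server you could also skip the Redis lookup entirely with socket.join(userId) and io.to(userId).emit(...), but the explicit mapping is what I went with.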
You can use RabbitMQ for this; it also has authentication built in. Please go through the following link and give it a try:
https://www.rabbitmq.com/access-control.html
You can also add authentication to your existing structure by using subscription auth tokens, so that only the intended users can subscribe.
Redis has its own security features as well. Please have a look at the link below:
https://redis.io/topics/security
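To make the auth-token idea concrete, here is a rough sketch of one way to wire it (my assumption, not an official pattern), using socket.io, node-redis (v4) and jsonwebtoken: clients never talk to Redis directly; the server verifies the token, subscribes to that user's channel itself, and relays messages only over that user's socket:

```js
const { Server } = require('socket.io');
const { createClient } = require('redis');
const jwt = require('jsonwebtoken');

const JWT_SECRET = 'change-me'; // assumption: shared secret used when tokens are issued
const io = new Server(3000);
const subscriber = createClient(); // dedicated Redis connection for pub/sub

async function main() {
  await subscriber.connect();

  io.on('connection', (socket) => {
    let claims;
    try {
      // The client proves who it is; it never gets to pick a channel by itself.
      claims = jwt.verify(socket.handshake.auth.token, JWT_SECRET);
    } catch {
      return socket.disconnect(true);
    }

    // Assumes the token payload carries a userId claim.
    const channel = `notifications:${claims.userId}`;
    const forward = (message) => socket.emit('notification', JSON.parse(message));

    // The server subscribes on the client's behalf and relays to this socket only.
    subscriber.subscribe(channel, forward);
    socket.on('disconnect', () => subscriber.unsubscribe(channel, forward));
  });
}

main();
```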
So I currently have a chat system running on Node.js that passes messages via RabbitMQ, where each connected user has their own unique queue that is subscribed and listening only to messages meant for them.
Currently the backend has to loop and publish each message one by one per user, even if the payload of the message is the same for, let's say, 1000 users. I would like to get away from that and be able to send the same message to multiple different users, but not EVERY user who's connected.
(Example: notifying certain users that their friend has come online.)
I considered implementing a RabbitMQ setup where all messages are pooled into the same queue and, instead of Rabbit delivering to all the user queues, Node takes these messages and emits them to the appropriate users via socket connections (to whoever is online).
(Proposed infrastructure diagram)
This way the backend does not need to loop over hundreds or thousands of users and can send a single payload listing all the users the message should go to. I do plan to cluster the Node.js servers together.
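Roughly what I have in mind, sketched with amqplib and socket.io (the queue name and payload shape are just illustrative, and I'm assuming the user was already authenticated before the socket handshake):

```js
const amqp = require('amqplib');
const { Server } = require('socket.io');

const io = new Server(3000);

// Each authenticated socket joins a room named after its user ID, so the
// server-side code can address users without juggling raw socket IDs.
io.on('connection', (socket) => {
  const userId = socket.handshake.auth.userId; // assume this was validated already
  socket.join(`user:${userId}`);
});

async function main() {
  const conn = await amqp.connect('amqp://localhost');
  const ch = await conn.createChannel();
  await ch.assertQueue('chat-fanout', { durable: false }); // the single pooled queue

  // One payload lists every recipient; Node does the per-user fan-out.
  ch.consume('chat-fanout', (msg) => {
    if (!msg) return;
    const { recipients, body } = JSON.parse(msg.content.toString());
    for (const userId of recipients) {
      io.to(`user:${userId}`).emit('message', body); // offline users simply receive nothing
    }
    ch.ack(msg);
  });
}

main();
```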
I was also wondering, since I've never done this in a production environment, whether I will need to track each socketID.
Potential pitfalls I've identified so far:
slower, since thousands of messages can pile up in a single queue.
manually storing socket IDs in order to transmit to users.
offloading routing to Node.js instead of RabbitMQ.
Has anyone done anything like this before? If so, what are your recommendations? Is it better to scale with per-user queues, or to pool grouped messages for all users into a smaller number of larger queues?
As a general rule, queue-per-user is an anti-pattern. There are some valid uses for it, but I've never seen it be a good idea for a chat app (in spite of all the demos that use this example).
RabbitMQ can be a great tool for facilitating the delivery of messages between systems, but it shouldn't be used to push messages to users.
I considered implementing a RabbitMQ setup where all messages are pooled into the same queue and, instead of Rabbit delivering to all the user queues, Node takes these messages and emits them to the appropriate users via socket connections (to whoever is online).
This is heading in the right direction, but you have to remember that RabbitMQ is not a database (see the previous link, again).
You can't randomly seek specific messages that are sitting in the queue and then leave them there; they are first in, first out.
In a chat app, I would have RabbitMQ handle the message delivery between your systems, but not be involved in delivery to the user.
Your thoughts on using WebSockets are the direction you want to head for this. Either that, or Server-Sent Events.
If you need persistence of messages (history, search, last-viewed location, etc.), then use a database for that. Keep a timestamp or other marker of where the user left off, and push messages to them starting at that spot.
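For example, a small sketch with Mongoose (the schema and field names are made up):

```js
const mongoose = require('mongoose');

// Hypothetical schema: one document per chat message.
const Message = mongoose.model('Message', new mongoose.Schema({
  roomId: String,
  senderId: String,
  body: String,
  sentAt: { type: Date, default: Date.now, index: true },
}));

// On reconnect, fetch everything the user missed since their stored marker
// and push it to them over the websocket.
async function missedMessages(roomId, lastSeenAt) {
  return Message.find({ roomId, sentAt: { $gt: lastSeenAt } })
    .sort({ sentAt: 1 })
    .lean();
}

module.exports = { Message, missedMessages };
```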
Your concerns about tracking sockets for the users are definitely something to think about.
If you have multiple instances of your node server running sockets with different users connected, you'll need a way to know which users are connected to which node server.
This may be a good use case for RabbitMQ, but not in a queue-per-user manner; rather, binding-per-user. You could have each node server create a queue to receive messages from the exchange where messages are published. The node server would then create a binding between the exchange and the queue based on each user ID that is logged in to that particular node server.
This could lead to an overwhelming number of bindings in RMQ, though.
You may need a more intelligent method of tracking which server has which users connected, or just ignore that entirely and broadcast every message to every node server. In that case, each server would publish an event through the websocket based on who the message should be delivered to.
If you're using a smart enough websocket library, it will only send the message to the people that need it. socket.io did this, I know, and I'm sure other websocket libraries are smart like this as well.
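Here's a rough sketch of the binding-per-user idea with amqplib (the exchange name and message shape are my own assumptions); `io` is whatever socket.io server you end up with:

```js
const amqp = require('amqplib');

// Each node server runs this once at startup.
async function setupDelivery(io) {
  const conn = await amqp.connect('amqp://localhost');
  const ch = await conn.createChannel();
  await ch.assertExchange('chat-messages', 'direct', { durable: false });

  // One exclusive queue per node server (not per user).
  const { queue } = await ch.assertQueue('', { exclusive: true });

  ch.consume(queue, (msg) => {
    if (!msg) return;
    const userId = msg.fields.routingKey; // messages are published with the user ID as routing key
    io.to(`user:${userId}`).emit('message', JSON.parse(msg.content.toString()));
    ch.ack(msg);
  });

  return {
    // Call these as users log in to / out of this particular node server.
    addUser: (userId) => ch.bindQueue(queue, 'chat-messages', userId),
    removeUser: (userId) => ch.unbindQueue(queue, 'chat-messages', userId),
  };
}

module.exports = { setupDelivery };
```

The publisher side sends each message to the 'chat-messages' exchange with the recipient's user ID as the routing key, and RabbitMQ copies it to every node server that currently has that user bound.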
...
I probably haven't given you a concrete answer for your situation, and I'm sure you have a lot more context to consider. Hopefully this will get you down the right path, though.
Let's say I want to write a mobile chat application (just as an example).
How can a client receive only the messages meant for it, and how do I prevent other clients from receiving messages that were not meant for them?
Create a temp queue known only to the client? - Is that secure enough?
Encrypt the message with the client's public key? - Own PKI needed!
Restrict access to queues based on some credentials the client sends with every request? - Every request would need to be authenticated!
...?
If a client sends a message to the outgoing queue, how do I prevent other clients from reading the message directly out of the queue?
Restrict access to write-only? - I don't know if this is possible...
Encrypt the message? - Own PKI needed!
...?
I hope my question/problem is clear, and I'm really looking forward to hearing your ideas and best practices!
Thanks in advance!
//edit: So using a temp queue for every client with encrypted messages might be a good choice. Or do you have any other ideas?
If you use the RabbitMQ AMQP broker, then you can take advantage of the Validated User-ID extension, but you have to create a separate user for each client.
Using a per-client queue may be a good choice, but you have to realize that it is "security through obscurity" and it smells. As you suggested, though, message encryption may fix that.
You can play with Access Control, but you may find it better to have a server application handle the complex user-management concerns and have clients use its API, which gives you better control over user policies.
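A minimal amqplib sketch of the Validated User-ID idea (broker host, credentials and queue name are placeholders). RabbitMQ rejects the publish if the userId property does not match the user the connection authenticated as, so a consumer can trust it:

```js
const amqp = require('amqplib');

async function sendAsAlice() {
  // Each client connects with its own RabbitMQ user...
  const conn = await amqp.connect('amqp://alice:alice-password@broker.example.com');
  const ch = await conn.createChannel();
  await ch.assertQueue('chat-inbox', { durable: true });

  // ...and stamps its messages with that same user name. The broker validates
  // the claim, so consumers can rely on msg.properties.userId as the sender.
  ch.sendToQueue('chat-inbox', Buffer.from('hello'), { userId: 'alice' });

  await ch.close();
  await conn.close();
}

sendAsAlice();
```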
I'd like to implement push notification server using node.js. The basic scenario is:
Some applications send notification messages to the server.
The notification server receives the request and forwards the message to the user's mail or IM client, based on the user's preference.
In step 1, which protocol (e.g. REST, sockets, HTTP/XML, and so on) would you recommend from a performance perspective?
Also, in step 2, I plan to use the node-xmpp module for the IM client, but for mail, which is the best way to implement it? For example:
Just use SMTP. (But I think this might cause performance degradation, because SMTP is an expensive protocol and performance depends on the SMTP server's capacity.)
Use a queue mechanism, in order to avoid the drawbacks above: the node.js app simply puts the message into the queue, and the SMTP side pulls messages from it.
Other solutions...
Thanks in advance.
With regard to what to use as a protocol, I would go for a REST interface, whereby the posting application sends a POST request to a resource associated with the user, something along the lines of http://example.com/rest/v1/{userID}/notifications.
I personally would use JSON as the data/content of the REST request and have node.js write this information to a message queue (as a JSON string).
You can then have XMPP readers for each user, as well as an SMTP handler, reading from this queue as fast as the SMTP server allows.
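Something along these lines, using Express and amqplib (the route and queue name are just what I'd pick):

```js
const express = require('express');
const amqp = require('amqplib');

async function main() {
  const conn = await amqp.connect('amqp://localhost');
  const ch = await conn.createChannel();
  await ch.assertQueue('notifications', { durable: true });

  const app = express();
  app.use(express.json());

  // Applications POST notifications for a user; node.js just enqueues them.
  app.post('/rest/v1/:userId/notifications', (req, res) => {
    const job = { userId: req.params.userId, message: req.body };
    ch.sendToQueue('notifications', Buffer.from(JSON.stringify(job)), { persistent: true });
    res.status(202).end(); // accepted; delivery happens asynchronously
  });

  // The XMPP and SMTP handlers consume from the same queue at their own pace,
  // e.g. ch.consume('notifications', deliverViaXmppOrSmtp).
  app.listen(8080);
}

main();
```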
However, this whole post is what I would do in your situation, rather than a factual statement of what is best. I know JMS fairly well and I've been working a lot with REST interfaces lately, so this is the way I would do it.
I am working on a chat application (it needs to connect to a server) on the iPhone. Sending packets from the iPhone shouldn't be a problem.
But I would like to know whether it is possible for the iPhone to keep an incoming socket connection from the server open continuously, or forever, in a mobile environment.
Or, what do I need to do to keep the connection alive? Do I need to send something over it periodically to keep it alive?
Thanks.
Not sure why you want a chat app to have a persistent connection... I'd rather use an SMS-like model. Anyway, Cocoa's NSStream is built on top of sockets and offers a lot of functionality. Take a look at it.
In response to the question, here is, in a nutshell, what I would do:
Get an authentication token from the server.
This will also take care of user presence if necessary, but now we are talking about state; once presence is known, the server may send out notifications to clients that are active and have the user on their contact list.
Get the user's contact list and each contact's presence state.
When a message is sent, handle it according to the addressee's state, i.e. if they are online, relay it to the other user; if offline, queue it for later delivery or reject it.
Once the token expires, reject communication with an appropriate error and make the client request a new token.
Communication from server to client can be based on a pull or a push model. In the first case, the client periodically makes a request and fetches all new messages. This may not sound great, but in reality, how often do users compose and send messages? Several times a minute? That's not too much, so fetching may happen every 5-10 seconds.
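Just to show the shape of the pull model, here is a browser-style JavaScript sketch (the endpoint, interval, and field names are assumptions; on a phone you would do the equivalent with the platform's HTTP APIs):

```js
let since = 0; // timestamp of the newest message we have seen so far

function renderMessage(m) {
  console.log(m); // placeholder for the real UI code
}

// Poll for new messages every 7 seconds, remembering where we left off.
async function poll() {
  const res = await fetch(`/api/messages?since=${since}`, {
    headers: { Authorization: `Bearer ${localStorage.getItem('token')}` },
  });
  if (!res.ok) return; // e.g. 401 once the token expires: go request a new token
  const messages = await res.json();
  for (const m of messages) {
    since = Math.max(since, m.sentAt);
    renderMessage(m);
  }
}

setInterval(poll, 7000);
```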
For the push model, the client must be able to listen for and accept connections.
Finally, check out SIP, the Session Initiation Protocol. No need to use the full version of it, though; just the basic stuff.
This is very rough and perhaps simplified; I don't know the target complexity of your chat system. For example, the simplest option is for the server to just enable client-to-client communication by distributing their endpoints, and the clients take care of everything themselves.
Good luck!
Super out of date response, but maybe it will help the next person.
I would use XMPPFramework and a Jabber server.