I have used Pusher recently in my PHP Laravel project and it is working fine.
What I know about Pusher is that it is a real-time layer between our server and clients which creates a WebSocket connection to the client's browser.
I set up Pusher in my application using the tutorial below:
pusher integration with laravel
What I have created using Pusher for my web application:
1. A notification feature. When one user adds some data to the database, say when one user starts following another user, an event is triggered and that event sends data to a particular channel, say 'notification-channel'. In my JS code I have subscribed to this channel with the following lines:
//instantiate a Pusher object with our Credential's key
var pusher = new Pusher('68fd8888888888ee72c', {
    encrypted: true
});
//Subscribe to the channel we specified in our Laravel Event
var channel = pusher.subscribe('notification-channel');
//Bind a function to a Event (the full Laravel class)
channel.bind('App\\Events\\HelloPusherEvent', addMessage);
The addMessage() function displays the data. I have also put a check on the client side so that the message is only shown if the logged-in user is the intended recipient, using a simple if condition: since I send the intended user's id within the data from App\Events\HelloPusherEvent, I use that id to display the message to the specific user only.
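Roughly, the handler looks like this (a simplified sketch; the notification markup is illustrative, while the payload fields come from the event class shown further down):

function addMessage(data) {
    // current user's id rendered into the page (same hidden field used when subscribing)
    var loggedInUserId = $('#logged_in_userId').val();
    // only show the notification if it is intended for this user
    if (String(data.for_user_id) !== String(loggedInUserId)) {
        return;
    }
    $('#notifications').append('<li>' + data.message + '</li>');
}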
But I think this is not the right approach to using Pusher for notifications or any other functionality. Further on in my project I want to use Pusher to display new posts on a user's news feed without a page refresh, where obviously only some users should see those posts, depending on who is posting.
But how will I use Pusher in a way that I don't need if conditions on the client side to stop displaying data?
My concern is that if I keep sending data to all active clients and rely on if conditions to filter it, that will ultimately degrade my application.
My concerns:
1. If Pusher sends data to multiple clients (i.e. all the active users), won't that cause overhead?
2. Is there any option to use channels to route data to the intended users only?
3. As I am implementing Pusher for the first time, I have a few doubts about how it actually works, so is there any blog that can help me understand its real-time behaviour?
Let me know if my question is not clear or specific enough and I will elaborate further.
Thanks in advance to all who try to answer.
The question pusher-app-client-events explains that we can create different channels for different users in order to send messages to the intended users only.
I went through this FAQ and learned that we can create an unlimited number of channels for one registered app.
Creating multiple channels won't cause any overhead.
Now if I want to send a notification to user 1, I create a channel 'notification-channel_1' and subscribe user 1 to that same channel in my frontend code.
The event class that I am using in my PHP Laravel project looks like this:
<?php

namespace App\Events;

use App\Events\Event;
use Illuminate\Queue\SerializesModels;
use Illuminate\Contracts\Broadcasting\ShouldBroadcast;

/**
 * Just implement the ShouldBroadcast interface and Laravel will automatically
 * send it to Pusher once we fire it
 **/
class HelloPusherEvent extends Event implements ShouldBroadcast
{
    use SerializesModels;

    /**
     * Only (!) public members will be serialized to JSON and sent to Pusher
     **/
    public $message;
    public $id;
    public $for_user_id;

    /**
     * Create a new event instance.
     *
     * @param  string  $message      (notification description)
     * @param  integer $id           (notification id)
     * @param  integer $for_user_id  (receiver's id)
     * @author hkaur5
     * @return void
     */
    public function __construct($message, $id, $for_user_id)
    {
        $this->message = $message;
        $this->id = $id;
        $this->for_user_id = $for_user_id;
    }

    /**
     * Get the channels the event should be broadcast on.
     *
     * @return array
     */
    public function broadcastOn()
    {
        // We build the channel name from the id of the user who will
        // receive data from this class.
        // See the frontend Pusher code to see how this channel is used
        // for the intended user.
        return ['notification-channel_'.$this->for_user_id];
    }
}
On the frontend I subscribed to 'notification-channel_' + logged_in_user_id:
//Subscribe to the channel we specified in our Laravel Event
//Subscribe user to the channel created for this user.
//For example if user's id is 1 then bind to notification-channel_1
var channel = pusher.subscribe('notification-channel_'+$('#logged_in_userId').val());
//Bind a function to a Event (the full Laravel class)
channel.bind('App\\Events\\HelloPusherEvent', addMessage);
This way we send data to the intended users only, rather than broadcasting to all users and filtering it with conditions in our client-side code.
I think you should add the user ID within the Blade template directly, rather than reading it from a form field:
var channel = pusher.subscribe('notification-channel_{{ Auth::id() }}');
Related
I am working on an e-commerce site. There are times when a product is no longer available, but a user has already added it to their cart or to their saved items. How do I implement a feature such that, when the product has been updated, the user is notified as soon as possible?
I thought about running a cron job that would check whether the product is still available or has been recently updated, but I do not know if that is feasible. I am open to better ideas.
Thanks
What you are trying to achieve falls into the real-time updates category, and technically there is more than one way to achieve it.
The right solution depends on your application architecture and requirements. Meanwhile, I can suggest looking into the Ably SDK for Node.js, which offers a good starting point.
Below is a sample implementation where, on the back end, you publish a message when an item's stock runs out:
// back-end (Node.js): the 'ably' package is assumed to be installed (npm install ably)
var Ably = require('ably');

// create client
var client = new Ably.Realtime('your-api-key');
// get appropriate channel
var channel = client.channels.get('product');
// publish a named message (the name may be the product type in your case)
// with the remaining quantity as the message payload
channel.publish('some-product-type', 0);
On the subscriber side, which would be your web client, you can subscribe to messages and update your UI accordingly:
// create client using same API key
var client = new Ably.Realtime('your-api-key');
// get product channel
var channel = client.channels.get('product');
// subscribe to messages and update your UI
channel.subscribe(function (message) {
    const productName = message.name;
    const updatedQuantity = message.data;
    // update your UI or perform whatever action
});
I did a live betting app once, and of course live updates were the most important part.
I suggest taking a look at websockets. The idea is pretty straightforward: on the backend you emit an event, let's say itemGotDisabled, and on the frontend you just connect to your websocket and listen for events.
You can create a custom component that handles the logic related to websocket events in order to keep the code cleaner and more organized, and reacting to updates is as easy as yourFEWebsocketInstance.onmessage = (event) => {}.
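As a rough illustration (the URL and the message shape here are assumptions, adjust them to your own backend):

// minimal browser-side sketch: connect to the backend socket and react to stock events
const socket = new WebSocket('wss://example.com/updates');

socket.onmessage = (event) => {
    const payload = JSON.parse(event.data);
    if (payload.type === 'itemGotDisabled') {
        // update the UI here, e.g. flag the cart or saved item as unavailable
        console.log('Product no longer available:', payload.productId);
    }
};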
Of course it's not the only way, and I am sure there are packages that implement this in an even easier and more straightforward manner.
We integrated the QuickBlox chat application into our project. We are using Node.js on the backend and Angular 7 on the frontend. We have already implemented both private and group chat.
Now there is a problem when we try to show the opponent user's status, i.e. whether they are Online or Offline.
It is not clearly mentioned in the QuickBlox documentation. Please help.
Managing presence status is covered in this section of QuickBlox documentation.
To receive user status (online / offline), use the following callback:
/*
Returns:
* (Integer) userId - The sender ID
* (String) type - If user leave the chat, type will be 'unavailable'
*/
QB.chat.onContactListListener = function(userId, type) {
// callback function
};
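For example, a minimal usage sketch (the status map and the way you push the change into your Angular UI are illustrative, not part of the QuickBlox SDK):

// keep a simple in-memory map of user presence and update the UI when it changes
var onlineStatuses = {};

QB.chat.onContactListListener = function (userId, type) {
    // per the docs, 'unavailable' means the user left the chat (went offline)
    var status = (type === 'unavailable') ? 'offline' : 'online';
    onlineStatuses[userId] = status;
    // update your Angular component here, e.g. emit the change through a shared service
    console.log('User ' + userId + ' is now ' + status);
};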
I am developing a web app bot on Azure (v3) and I am using async methods, but I can't seem to solve this issue: SyntaxError: Unexpected token function.
I've tried updating my Node.js from 6.9.4 to 8.9, but that didn't work. I also ran npm i -g azure-functions-core-tools#core but still nothing.
class OAuthHelpers {
    /**
     * Enable the user to schedule meeting and send an email attachment via the bot.
     * @param {TurnContext} turnContext
     * @param {TokenResponse} tokenResponse
     * @param {*} emailAddress The email address of the recipient
     */
    async function createevent(turnContext, tokenResponse, emailAddress) {
        if (!turnContext) {
            throw new Error('OAuthHelpers.createevent(): `turnContext` cannot be undefined.');
        }
        if (!tokenResponse) {
            throw new Error('OAuthHelpers.createevent(): `tokenResponse` cannot be undefined.');
        }
        var client = new SimpleGraphClient(tokenResponse.token);
        // Calls the Graph API with the subject and content message...
        await client.createevent(
            emailAddress,
            `Lunch`,
            `I will be taking everyone to lunch as a reward for your hardwork.`
        );
        // Success message...
        await turnContext.sendActivity(`Success! I have scheduled a meeting with you and ${ emailAddress } have created an event on each of their calendars.`);
    }
I want the bot to run normally, but it can't because Azure can't detect the async function for some reason. Any help is appreciated.
The OAuthHelpers class requires 'simple-graph-client', which houses all of the methods you are looking to utilize. In the original sample your code draws from, BotBuilder-Samples 24.bot-authentication-msgraph, if you navigate to the simple-graph-client.js file, you will see the methods (i.e. sendMail, getRecentMail, getMe, and getManager) that are called from the OAuthHelpers.js file.
If you haven't already, you will need to include a method for creating an event. This, in turn, is called from the OAuthHelpers.js file as part of the bot dialog.
It's hard to know what is what without more code, but my guess is that the token is being passed into your createevent method but, as the method (likely) doesn't exist as a Graph API call, it doesn't know what to do with it.
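For illustration only, here is a rough sketch of what such a method might look like inside simple-graph-client.js, assuming the @microsoft/microsoft-graph-client package (the method name, fields, and payload shape below are illustrative, not the sample's actual code):

const { Client } = require('@microsoft/microsoft-graph-client');

class SimpleGraphClient {
    constructor(token) {
        this.token = token;
        // Graph client that injects the bot's token on every request
        this.graphClient = Client.init({
            authProvider: (done) => done(null, this.token)
        });
    }

    // Illustrative helper: create a calendar event and invite the given address
    async createEvent(emailAddress, subject, content) {
        return this.graphClient.api('/me/events').post({
            subject: subject,
            body: { contentType: 'Text', content: content },
            attendees: [
                { emailAddress: { address: emailAddress }, type: 'required' }
            ]
        });
    }
}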
Check out the following links for guidance:
MS Graph sample showing a GET call for top 3 calendar events
MS Graph unit test example, but demonstrates an event POST
API reference for creating an event
Add'l info on creating recurring events...might prove useful
Hope this helps!
Looking at the docs for the SignalR bindings, to send a message to a specified user you include the UserId property on the message, like so:
[FunctionName("SendMessage")]
public static Task SendMessage(
    [HttpTrigger(AuthorizationLevel.Anonymous, "post")] object message,
    [SignalR(HubName = "chat")] IAsyncCollector<SignalRMessage> signalRMessages)
{
    return signalRMessages.AddAsync(
        new SignalRMessage
        {
            // the message will only be sent to these user IDs
            UserId = "userId1",
            Target = "newMessage",
            Arguments = new[] { message }
        });
}
This example is taken straight from the documentation, but the comment implies you can message multiple user IDs, even though the property is a string and not an array.
How would you specify multiple users (if, for example, they are in a private chat channel together)? Or is this a mistake in the wording of the comment, and you would need to send a message per user?
With other versions of SignalR I would put them in a group, but bindings for that do not exist for Functions.
Group operations were introduced in the latest release.
Now you can:
- send a message to a group using GroupName in SignalRMessage
- add/remove a user in a group using an IAsyncCollector<SignalRGroupAction> output binding
Unfortunately, just as the doc says, right now with the Azure Functions binding we can only send a message to one user or to all clients.
See the code of the current extension SDK, Microsoft.Azure.WebJobs.Extensions.SignalRService v1.0.0-preview1-10002.
It shows the extension has only two methods, SendToAll and SendToUser:
Task SendToAll(string hubName, SignalRData data);
Task SendToUser(string hubName, string userId, SignalRData data);
The comment that confused you is actually from an old sample; the author forgot to update it.
The good news is that support for group operations is in progress.
I'm looking to develop a chat application with PubNub where I want to make sure all the chat messages that are sent are stored in a database, and I also want to be able to send messages in the chat.
I found out that I can use Parse with PubNub to provide storage options, but I'm not sure how to set the two up so that the messages and images sent in the chat are stored in the database.
Has anyone done this before with PubNub and Parse? Are there any other easy options available to use with PubNub instead of Parse?
Sutha,
What you are seeking is not a trivial solution unless you are talking about a limited number of end users. So I wouldn't say there are "easy" solutions, but there are solutions.
The reason is that your server would need to listen (subscribe) to every chat channel that is active and store the messages being sent into your database. Imagine your app scaling to 1 million users (it doesn't even need to get that big, but that number should help you realize how tricky this can get to scale, with several server instances listening to channels in a non-overlapping manner, or with overlap but using a server-side queue and de-duplicating messages).
That said, yes, there are PubNub customers that have implemented such a solution - Parse not being the key to making this happen, by the way.
You have three basic options for implementing this:
1. Implement a solution that allows many instances of your server to subscribe to all of the channels as they become active and store the messages as they come in. There are a lot of details to making this happen, so if you are not up for that, this is likely not where you want to go.
2. Monitor all channels that become active or inactive with PubNub Presence webhooks (enable Presence on your keys). You would use this to keep a list of all channels from which your server pulls history (enable Storage & Playback on your keys) in an on-demand (not completely realtime) fashion.
For every channel that goes active or inactive, your server will receive these events via a REST call (an endpoint that you implement on your server - your Parse server in this case):
   - channel active: record a "start chat" timetoken in your Parse db
   - channel inactive: record an "end chat" timetoken in your Parse db
   - the inactive event is the kickoff for a process that uses the start/end timetokens you recorded for that channel to pull its history from PubNub: pubnub.history({channel: channelName, start: startTT, end: endTT})
   - you will need to iterate on this history call until you receive fewer than 100 messages (100 is the max number of messages you can retrieve at a time)
   - as you retrieve these messages, you save them to your Parse db
(Update: new Presence webhooks have been added - we now have webhooks for all presence events: join, leave, timeout and state-change.)
3. Finally, you could just save each message to the Parse db on success of every pubnub.publish call. I am not a Parse expert and barely know all of its capabilities, but I believe they have some sort of store-locally-then-sync-to-cloud-db option (like StackMob when that was a product); but even if not, you can save the message to the Parse cloud db directly.
The code would look something like this (not complete, likely with errors; figure it out or ask PubNub support for details) in your JavaScript client (in the browser):
var pubnub = PUBNUB({
    publish_key   : your_pub_key,
    subscribe_key : your_sub_key
});

var msg = ... // get the message from your UI text box or whatever

pubnub.publish({
    // this is some variable you set up when you enter a chat room
    channel: chat_channel,
    message: msg,
    callback: function(event) {
        // DISCLAIMER: code pulled from a Parse example,
        // but some object creation details are left out here
        // and the msg object is not fully fleshed out
        // in this sample code
        var ChatMessage = Parse.Object.extend("ChatMessage");
        var chatMsg = new ChatMessage();
        chatMsg.set("message", msg);
        chatMsg.set("user", uuid);
        chatMsg.set("channel", chat_channel);
        chatMsg.set("timetoken", event[2]);
        // this ChatMessage object can be
        // whatever you want it to be
        chatMsg.save();
    },
    error: function (error) {
        // Handle error here, like retry until success, for example
        console.log(JSON.stringify(error));
    }
});
You might even just store the entire set of publishes (on both ends of the conversation) based on a time interval, number of publishes, or total data size, but be careful: either user could exit the chat and the browser without notice, and you would fail to save. So the per-publish save is probably best practice, even if a bit noisy.
I hope you find one of these techniques a means to get started in the right direction. There are details left out, so I expect you will have follow-up questions.
Just some other links that might be helpful:
http://blog.parse.com/learn/building-a-killer-webrtc-video-chat-app-using-pubnub-parse/
http://www.pubnub.com/blog/realtime-collaboration-sync-parse-api-pubnub/
https://www.pubnub.com/knowledge-base/discussion/293/how-do-i-publish-a-message-from-parse
And we have a PubNub Parse SDK, too. :)