I am working on a QR-code-based application that uses a ClearDB MySQL database hosted on Heroku. The frontend communicates with the database through a REST API built with Node.js and Express.
Whenever a user scans a QR code, the value of the code changes in the database, but I don't know how to reflect that change instantly in the frontend. Basically, what I need help with is finding a way to automatically refresh the page whenever that value changes in the database and display the new QR code based on the new value, so that when a user scans the code, it instantly updates on their page, not only in the database.
I looked into sockets but didn't quite understand how to integrate it into my application, especially when it comes to the frontend.
You can use a state value to detect the change.
Declare a state value of boolean type (or any type you want). Then call setState (in a class component) or the useState setter (in a functional component) in the module that updates the data in the database.
For example:
const [isChanged, setChanged] = useState(false);

const updateQRData = (data) => {
  // ... update the data in the database
  setChanged(isChanged => !isChanged);
};
Then you can use the useEffect hook:
useEffect(() => {
  // ... fetch the new QR value and update the UI here
}, [isChanged]);
How are you storing the information for the QR codes? In Redux? Or local state? Are you using class-based components or hooks? If you want the least amount of headaches, I'd heavily suggest hooks. Let me know and I can give more specifics for the code.
Your final product would look something like this:
const [qrData, setQrData] = useState(null)

const updateDatabase = () => {
  axios.post(`your/database/url/with/your/data/attached`).then(res => {
    setQrData(res.data.your.response.object.from.database)
  })
}
I have run into an unforeseen problem with my socket.io setup.
I use socket.io to live-load data from my database (MongoDB, Node.js, React).
To accomplish this, I use MongoDB's change streams to detect changes and then push them to the front-end via socket.io.
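For context, a rough sketch of that kind of setup on the server (the collection name, port, and event name here are just placeholders, not the real ones):

const { MongoClient } = require("mongodb");
const { Server } = require("socket.io");

const io = new Server(3001, { cors: { origin: "*" } });

async function watchAndPush() {
  const client = await MongoClient.connect("mongodb://localhost:27017");
  const collection = client.db("app").collection("events"); // placeholder names

  // Change streams require a replica set; every change is pushed to all connected clients.
  const changeStream = collection.watch();
  changeStream.on("change", (change) => {
    io.emit("data", change.fullDocument ?? change);
  });
}

watchAndPush();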
Now this works perfectly as long as the user is connected. Right now, when the user reconnects, it just reloads all the data. While this is fine for most users, there is a small group with a very bad network connection, and so their front-end is reloading data all the time, which makes it unresponsive for a while.
So, I am looking for a way to only send events that occurred while the front-end was offline. The front-end can handle this quite easily: https://socket.io/docs/v4/client-offline-behavior/
It doesn't seem possible to do this on the server side, since socket.io (server side) immediately forgets sockets that have disconnected and thus can't buffer events.
So, I was wondering if there is a good way to do this? Or would this need a full "wrapper" around socket.io that caches disconnected sockets?
Any help or advice would be appreciated!
I find this a really interesting and painful problem! ^^'
If you can give more details, it may help people give you a better answer.
For instance:
How much data is stored in the database, how much does a typical user receive, and how many events are triggered in a given time frame?
How long can an event take to become visible? I mean, if users receive an event with a 10s, 30s, ... delay, is it harmful for the service you provide?
How is your data structured? Is it a simple JSON array with the same fields, custom fields, dynamic JSON objects, etc.?
How is your React app structured? Do you run heavy logic when your data updates, etc.?
I think you should put more checks in your front-end code and update only when there is new data.
Some paths to explore:
1. Put more checks in your front end
As you stated, for users with a bad connection, the React client seems to update its state too often, reloading the data every time the websocket reconnects, again and again. Yes, the UI may freeze in this case.
For this, I can think of two approaches:
Before updating the state, check whether React's current state is the same as the data you receive from the websocket connection. If the reconnection is quick enough and no new data arrived, it should be the same, so in this case do not update the React state (a small sketch of this check appears at the end of this section).
If too many events are triggered and new data arrives after each reconnection, you can buffer the data from the websocket and apply it only once per time frame. By time frame, I mean you can use functions like setInterval or requestAnimationFrame to trigger the React update. Some pseudo React code to illustrate this:
function App() {
  const [events, setEvents] = useState({ datas: [] });
  const bufferedEvents = useRef([]);

  useEffect(() => {
    websocket.on("connected", (newEvents) => {
      bufferedEvents.current = bufferedEvents.current.concat(newEvents);
    });
    websocket.on("data", (newEvent) => {
      bufferedEvents.current = bufferedEvents.current.concat(newEvent);
    });

    // In the setInterval callback you take all the events received at connection time
    // plus the new events, and use them to update the React state.
    // You clear bufferedEvents at the same time.
    const intervalId = setInterval(() => {
      const events = bufferedEvents.current;
      bufferedEvents.current = [];
      // Update only if there is new data
      if (events.length > 0) {
        setEvents((prevState) => ({ datas: prevState.datas.concat(events) }));
      }
    }, 1000); // Trigger a data update every second. You could replace this with requestAnimationFrame and adapt the refresh time as needed.

    // Do not forget to clear the interval when the component unmounts
    return () => {
      clearInterval(intervalId);
    };
  }, []);

  return (
    <div>
      <span>Total events : {events.datas.length}</span>
      <br />
      {events.datas.map(event => {
        return <div>{event.data}</div>;
      })}
    </div>
  );
}
You can look at this article for details on using requestAnimationFrame.
I think modifying the front end is needed in any case, but on its own it is still not great for performance.
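For the first approach, here is a minimal sketch of the equality check, assuming lodash's isEqual is available and the "connected" event carries the full data set (the event names are the same placeholders as above):

import { useEffect, useState } from "react";
import isEqual from "lodash/isEqual";

function useGuardedEvents(websocket) {
  const [events, setEvents] = useState({ datas: [] });

  useEffect(() => {
    const onConnected = (newEvents) => {
      // On reconnection the server may resend data we already have:
      // only touch the React state when the payload actually differs.
      setEvents((prevState) =>
        isEqual(prevState.datas, newEvents) ? prevState : { datas: newEvents }
      );
    };
    websocket.on("connected", onConnected);
    return () => websocket.off("connected", onConnected);
  }, [websocket]);

  return events;
}

Returning the previous state object unchanged lets React skip the re-render entirely.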
2. Fetch only new data in your back end
For this approach, it really depends on how your data is structured in the database.
If the data has a timestamp on it, I can think of a naive but simple approach: a cookie holding a timestamp.
When the user connects for the first time, this cookie is null.
When they fetch the data on the websocket connection, they receive all the data. When the data arrives, you update the cookie timestamp with the most recent date found in the data.
When the websocket is disconnected and you open a new one, send the cookie timestamp along with it. With this information you can query only the data more recent than the timestamp in the cookie.
This way, you don't have to download the entire dataset, only the fresh records.
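A rough sketch of the server side of this idea, assuming each document has an updatedAt field and the client sends its last-seen timestamp in the socket.io handshake auth (the field names and the collection variable are placeholders):

const { Server } = require("socket.io");
const io = new Server(httpServer);

io.on("connection", async (socket) => {
  // The client passes its last-seen timestamp (from the cookie) in the handshake auth.
  const since = socket.handshake.auth.lastSeen
    ? new Date(socket.handshake.auth.lastSeen)
    : new Date(0);

  // Send only the documents that changed while this client was away.
  const missed = await collection
    .find({ updatedAt: { $gt: since } })
    .sort({ updatedAt: 1 })
    .toArray();

  socket.emit("connected", missed);
});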
Other approaches may be more helpful, but without more information on your data and more precise requirements, it is hard to say.
If you have a lot of data, I would personally look into a pagination mechanism, and maybe combine classic HTTP requests for fetching the data with websockets, SSE, or long polling for live events.
You can leave a comment if needed and I will update my response!
Cheers
Tech stack: Node.js, MongoDB for the database, Strapi CMS for editing and the API, React for my application.
I have a database with a long list of entries and a ready-to-use application that allows users to read data from the database. I need to be able to generate a simple website with a single entity from my database as a source to fill the template.
Mockup
Here is a mock-up. Hopefully, it will make things a bit clearer.
Clarification
After a day of thinking about the task, I believe I need something like the simplest possible static website generator: an application that will allow me to select a single entry from the list and generate a small website filled with it. The end goal is to get a website in some subfolder of my application, from where I can grab it and use it however I need.
A bit more about specifics:
It will be used locally
Security can be neglected
Always running in development mode is not a problem (just in case, thinking about additional question #2)
A few additional questions:
Is it possible to run NPM scripts from the application (like npm build)?
Is there any way to show one component in development mode, but replace it with another when building for production?
App.js
//...
function App() {
  if (isDevelopment) { // how to detect this is exactly question #2
    return <AdminUI /> // This one is to be shown in development mode
  } else {
    return <Website /> // This one is to be used instead of AdminUI in the build
  }
}
UPDATE
Well, I'm digging into the path of creating a site generator, and so far I have come up with the following basic plan:
Get my template ready
Create a new directory for my website
Copy the template to the new folder
Read an HTML file and parse it into a string to modify
Swap some bits with my data
Save the modified string back to a file
Repeat if needed for other files.
If that works as expected, the whole process could probably be improved by moving from a fixed template to a component that is prepared with a JavaScript bundler and started with the help of something like node-cmd (to run shell commands from my application)...
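A minimal sketch of steps 2-6 of that plan, using only Node's built-in fs module (the {{placeholder}} syntax and all paths are made up for illustration; fs.cpSync needs Node 16.7+):

const fs = require("fs");
const path = require("path");

function generateSite(entry, templateDir, outputDir) {
  // Steps 2-3: create the output folder and copy the template into it
  fs.mkdirSync(outputDir, { recursive: true });
  fs.cpSync(templateDir, outputDir, { recursive: true });

  // Step 4: read the HTML file into a string
  const htmlPath = path.join(outputDir, "index.html");
  let html = fs.readFileSync(htmlPath, "utf8");

  // Step 5: swap placeholders like {{title}} with fields from the selected entry
  html = html.replace(/{{(\w+)}}/g, (_, key) => entry[key] ?? "");

  // Step 6: save the modified string back to the file
  fs.writeFileSync(htmlPath, html);
}

generateSite({ title: "My entry" }, "./template", "./sites/my-entry");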
What you want could be achievable, but if it's just a string and little else, I'd say it's much simpler to fetch the data at startup from a given file and populate from there. You can put a JSON file under the public folder (together with other static data, like images) and have that file be your configuration.
In the App.js file, write an async componentDidMount() where you can await axios.get() on your configuration file.
So App.js would look like (code written on the fly, didn't check in an IDE):
export class App extends React.Component {
  constructor(props) {
    super(props);
    this.state = { loading: true };
  }

  async componentDidMount() {
    const response = await axios.get("your/data.json");
    this.setState({ loading: false, /* ...whatever you need from response.data */ });
  }

  render = () => (
    <>
      {this.state.loading && <div>Still loading...</div>}
      {this.state.adminData && <AdminUI data={this.state.adminData} />}
      {this.state.devData && <Website data={this.state.devData} />}
    </>
  );
}
If you don't care about security, wouldn't it be much simpler like this? And if you use TypeScript you'll have a much, much easier time handling the data too.
Maybe it's worth building an AdminUI to generate the JSON, and then another UI which reads the JSON, so you end up with two UIs. The template-generated UI could even ask the user for a JSON file to bootstrap from directly, if that simplifies things... In general, an approach based on simple JSON sounds a lot simpler than going for a CI/CD pipeline.
I am trying to build a logging mechanism to log changes made to a record. I am currently logging the previous and the new record. However, as the site is very busy, I expect the log file to grow seriously large. To avoid this, I plan to capture only the modified fields.
Is there a way to capture only the modifications made to a record (in React), so my {request.body} will have fewer fields?
My server side is built with Node.js and the client side is React.
One approach you might want to consider is to add an onChange (universal) or onTextChanged (native) listener to each text field and store the form updates in local state/variables.
Finally, when the user takes an action (submit, etc.) you can send only the updated data to the logging module.
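A small sketch of that idea in React, collecting only the fields the user actually edited in a separate changes object (the field names and the onSave prop are just illustrative):

import { useState } from "react";

function RecordForm({ record, onSave }) {
  const [changes, setChanges] = useState({});

  // Only fields that were edited end up in `changes`.
  const handleChange = (e) => {
    const { name, value } = e.target;
    setChanges((prev) => ({ ...prev, [name]: value }));
  };

  return (
    <form onSubmit={(e) => { e.preventDefault(); onSave(changes); }}>
      <input name="title" defaultValue={record.title} onChange={handleChange} />
      <input name="status" defaultValue={record.status} onChange={handleChange} />
      <button type="submit">Save</button>
    </form>
  );
}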
The best way I found, and the one that works for me, is this:
On the API server side, where I handle the update request, before hitting the database I compute the difference between the previous record and {request.body} using lodash, and send the result to my database update function.
var _ = require('lodash');

// Returns an object containing only the keys of `object` whose values differ from `base`,
// recursing into nested objects.
const difference = (object, base) => {
  function changes(object, base) {
    return _.transform(object, function (result, value, key) {
      if (!_.isEqual(value, base[key])) {
        result[key] = (_.isObject(value) && _.isObject(base[key])) ? changes(value, base[key]) : value;
      }
    });
  }
  return changes(object, base);
};

module.exports = difference;
I saved the above code in a file named diff.js and included it in my server-side file.
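For illustration, roughly how it plugs into an update handler (the Express route, the Mongoose-style Record model, and the logger are placeholders):

const difference = require('./diff');

app.put('/records/:id', async (req, res) => {
  const previous = await Record.findById(req.params.id).lean();

  // Log only the fields that actually changed.
  const changedFields = difference(req.body, previous);
  logger.info({ recordId: req.params.id, changedFields });

  await Record.updateOne({ _id: req.params.id }, { $set: changedFields });
  res.sendStatus(204);
});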
It worked well.
Thanks for the idea...
I am currently using the server-side rendering logic below (using React + Node.js + Redux) to fetch the data the first time and set it as the initial state in the store.
fetchInitialData.js
export function fetchInitialData(q, callback) {
  const url = 'http://....'
  axios.get(url)
    .then((response) => {
      callback(response.data);
    })
    .catch((error) => {
      console.log(error);
    });
}
I fetch the data asynchronously and load the output into the store the first time the page loads, using a callback.
handleRender(req, res) {
  fetchInitialData(q, apiResult => {
    const data = apiResult;
    const results = { data, fetched: true, fetching: false, queryValue: q };
    const store = configureStore(results, reduxRouterMiddleware);
    // ....
    const html = renderToString(component);
    res.status(200);
    res.send(html);
  });
}
I have a requirement to make 4 to 5 API calls on initial page load, so I wanted to check whether there is an easy way to make multiple calls on page load.
Do I need to chain the API calls, manually merge the responses from the different calls, and send the result back to load the initial state?
Update 1:
I am thinking of using the axios.all approach; can someone let me know whether that is an ideal approach?
You want to make sure that requests happen in parallel, and not in sequence.
I have solved this previously by creating a Promise for each API call and waiting for all of them with axios.all(). The code below is basically pseudo code; I could dig into a better implementation at a later time.
handleRender(req, res) {
  fetchInitialData()
    .then(initialResponse => {
      return axios.all([
        buildFirstCallResponse(),
        buildSecondCallResponse(),
        buildThirdCallResponse()
      ])
    })
    .then(allResponses => res.send(renderToString(component)))
}

buildFirstCallResponse() {
  return axios.get('https://jsonplaceholder.typicode.com/posts/1')
    .then(response => response.data)
    .catch(err => console.error('Something went wrong in the first call', err))
}
Notice how all responses are bundled up into an array.
The Redux documentation goes into server-side rendering a bit, but you might already have seen that: redux.js.org/docs/recipes/ServerRendering
Also check out the Promise docs to see exactly what .all() does: developer.mozilla.org/en/docs/Web/JavaScript/Reference/Global_Objects/Promise/all
Let me know if anything is unclear.
You could try express-batch; GraphQL is another option.
You could also use redux-saga to trigger multiple API calls with plain Redux actions and handle all of those calls in sagas. Introduction to Sagas
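A minimal sketch of what that could look like with redux-saga (the action types and the fetch functions are placeholders):

import { all, call, put, takeEvery } from 'redux-saga/effects';

function* loadInitialData() {
  try {
    // The three calls run in parallel, much like Promise.all / axios.all.
    const [posts, users, comments] = yield all([
      call(fetchPosts),
      call(fetchUsers),
      call(fetchComments),
    ]);
    yield put({ type: 'INITIAL_DATA_LOADED', payload: { posts, users, comments } });
  } catch (err) {
    yield put({ type: 'INITIAL_DATA_FAILED', error: err });
  }
}

export function* rootSaga() {
  yield takeEvery('LOAD_INITIAL_DATA', loadInitialData);
}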
Let's say I configure Redstone as follows:
#app.Route("/raw/user/:id", methods: const [app.GET])
getRawUser(int id) => json_about_user_id;
When I run the server and go to /raw/user/10, I get raw JSON data in the form of a string.
Now I would like to be able to go to, say, /user/10 and get a nice representation of the JSON I get from /raw/user/10.
Solutions that come to my mind are as follows:
First
create web/user/user.html and web/user/user.dart, configure the latter to run when index.html is accessed
in user.dart, monitor the query parameters (user.dart?id=10), make the appropriate requests, and present everything in user.html, i.e.
var uri = Uri.parse(window.location.href);
String id = uri.queryParameters['id'];
HttpRequest.getString(new Uri.http(SERVER, '/raw/user/${id}').toString()).then(presentation);
A downside of this solution is that I do not get /user/10-style URLs at all.
Another way is to additionally configure redstone as follows:
#app.Route("/user/:id", methods: const [app.GET])
getUser(int id) => app.redirect('/person/index.html?id=${id}');
In this case at least URLs like "/user/10" are allowed, but this simply does not work.
How would I do this correctly? The example web app in Redstone's git repository is, to my mind, cryptic and involved.
I am not sure whether this has to be explained in connection with Redstone or with Dart alone, but I cannot find anything related.
I guess you are trying to generate HTML files on the server with a template engine. Redstone was designed mainly for building services, so it doesn't have a built-in template engine, but you can use any engine available on pub, such as mustache. However, if you use Polymer, AngularDart or another framework which implements a client-side template system, you don't need to generate HTML files on the server.
Moreover, if you want to reuse other services, you can just call them directly, for example:
#app.Route("/raw/user/:id")
getRawUser(int id) => json_about_user_id;
#app.Route("/user/:id")
getUser(int id) {
var json = getRawUser();
...
}
Redstone v0.6 (still in alpha) also includes a new forward() function, which you can use to dispatch a request internally. However, the response is received as a shelf.Response object, so you have to read it:
#app.Route("/user/:id")
getUser(int id) async {
var resp = await chain.forward("/raw/user/$id");
var json = await resp.readAsString();
...
}
Edit:
To serve static files, like HTML files and Dart scripts which are executed in the browser, you can use the shelf_static middleware. See here for a complete Redstone + Polymer example (shelf_static is configured in the bin/server.dart file).