What is `response.send` (from the README.md of redux-thunk)?

It appears in the Composition section of the README.
Where does it come from?
// This is very useful for server side rendering, because I can wait
// until data is available, then synchronously render the app.
store.dispatch(
  makeSandwichesForEverybody()
).then(() =>
  response.send(ReactDOMServer.renderToString(<MyApp store={store} />))
);

I believe it comes from JavaScript's XMLHttpRequest:
https://developer.mozilla.org/en-US/docs/Web/API/XMLHttpRequest/send
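The snippet above runs on the server during server-side rendering, so `response` is whatever HTTP response object the surrounding server code provides; the README doesn't show where it is defined. A minimal sketch of one plausible context, assuming an Express route handler and the store, makeSandwichesForEverybody, ReactDOMServer and MyApp from the README example:
// Hypothetical Express handler mirroring the README snippet. Here `response` is
// the res object Express passes to the handler, and send() writes the rendered
// HTML back to the client.
const express = require('express');
const app = express();

app.get('/', (request, response) => {
  store.dispatch(
    makeSandwichesForEverybody()
  ).then(() =>
    response.send(ReactDOMServer.renderToString(<MyApp store={store} />))
  );
});

app.listen(3000);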

Related

How do you save the results of a mongoose query to a variable, so that its value can be used *later* in the code?

I know this question has been asked a few times, but none seem to answer the particular part of using the query results for later.
I also know the problem resides on the fact that queries are asynchronous, and perhaps this is the reason I cannot seem to find a satisfactory answer.
Here's what I'm trying to do:
I have a node project with several sections, each section with different content. These sections have individual properties, which I decided to store in a Model for later use.
So far (and for simplicity's sake) I have the following schema:
const mongoose = require('mongoose')
const Schema = mongoose.Schema

const SectionSchema = new Schema({
  name: String,
  description: String
})

const Section = mongoose.model('Sections', SectionSchema)
I'd like to retrieve this data to be used in one of my layouts (a navigation header), so I tried something like this:
const express = require('express')
const app = express()

Section.find().then(function(docs){
  app.locals.sections = docs
})

console.log(app.locals.sections) // undefined
This obviously doesn't quite work because find() is asynchronous, or rather, it does work but the values are populated at a different time. I know that if I do the console.log check inside the callback I'd get a result, but that's not the point: I want to store the data in app.locals so that I can use it later in one of my layouts.
Ideally I'd like to load this data once, before the server begins to listen to requests.
Feel free to correct me if I've made any wrong assumptions; I'm very new to Node, so I don't quite know how to approach things yet.
Thanks in advance.
EDIT: I should've mentioned I'm using express.
Your node app will likely be composed of route handlers for HTTP requests. app.locals.sections will be undefined if you access it outside of the callback, but it will exist inside a route handler.
Since you mentioned you're using Express:
const express = require('express')
const app = express()

app.get('/', (req, res) => {
  return res.json(app.locals.sections)
})

Section.find().then(function(docs){
  app.locals.sections = docs
})

console.log(app.locals.sections) // undefined here

app.listen(8080, () => {
  console.log('Server started 🌎', 8080)
})
Actually, it might still be undefined if the database call took a long time or a user hit the app very soon after startup. Starting the server in the callback ensures app.locals.sections exists under every scenario:
Section.find().then(function(docs){
  app.locals.sections = docs
  app.listen(8080, () => {
    console.log('Server started 🌎', 8080)
  })
})
You can use async/await within a function to make it seem like you aren't using promises. But you can't use it at the top level of your module. See here: How can I use async/await at the top level?
It really would be fairly idiomatic to do all your app startup in a promise chain. It's a style of coding you are going to see a lot of.
Section.find()
  .then((docs) => { app.locals.sections = docs })
  .then(() => { /* do something with app.locals.sections */ })
  .then(startServer)

function startServer() {
  app.listen(8080, () => {
    console.log('Server started 🌎', 8080)
  })
}
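For what it's worth, the same startup sequence can also be written with async/await inside a wrapper function, since await isn't allowed at the top level of a CommonJS module. A small sketch, assuming the same Section model and Express app as above:
// Hypothetical async startup wrapper; await keeps the steps sequential
// without nesting callbacks.
async function start() {
  app.locals.sections = await Section.find()
  // ...do something with app.locals.sections here...
  app.listen(8080, () => {
    console.log('Server started 🌎', 8080)
  })
}

start().catch(console.error)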

Alternative of document object in node

I am working on a React component to copy text to the clipboard. I am using document.execCommand('copy') for it, which works fine in browsers. However, the "document" object won't be found in other environments, e.g. Node, and the component will break there.
Is there any alternative I can use to make it work cross-platform?
jsdom is widely used in Node.js applications to provide support for some client-side features, primarily DOM. document.execCommand is not among them.
In order for document.execCommand('copy') to not cause an error during SSR, client-side features in use can be stubbed in Node:
global.document = {
  execCommand() {}
};
An alternative approach is to detect Node.js environment, e.g. with detect-node. Either with in-place conditions:
if (!isNode)
  document.execCommand('copy');
Or by using loosely coupled components and an IoC/DI container. A Redux store or React context can act as a container from which platform-dependent components are read.
For instance, with React 16.3 context API:
const ClipboardComponent = (props) => /* default implementation */;
export const PlatformContainer = React.createContext({
  ClipboardComponent,
});
The component is retrieved from the context where it's used:
<PlatformContainer.Consumer>{({ ClipboardComponent }) =>
  <ClipboardComponent/>
}</PlatformContainer.Consumer>
It's rendered with default implementation on client side:
render(<App />, rootElement);
And no-op implementation can be provided in entry point on server side:
renderToString(
  <PlatformContainer.Provider value={{ ClipboardComponent: () => null }}>
    <App />
  </PlatformContainer.Provider>
);
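Going back to the in-place condition shown earlier, here is a slightly fuller sketch of that guard, assuming the detect-node package (whose default export is a boolean):
// Hypothetical guard inside the copy handler; it skips the DOM call when
// running under Node so server-side rendering doesn't throw.
const isNode = require('detect-node');

function copyToClipboard() {
  if (!isNode) {
    document.execCommand('copy');
  }
}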

How to structure multiple HTTP requests in a server-side rendered React app?

I am currently using the server-side rendering logic below (using React + Node.js + Redux) to fetch the data up front on the first request and set it as the initial state in the store.
fetchInitialData.js
export function fetchInitialData(q, callback){
  const url = 'http://....'
  axios.get(url)
    .then((response) => {
      callback(response.data);
    })
    .catch((error) => {
      console.log(error)
    })
}
I fetch the data asynchronously and load the output in to store the first time the page loads using callback.
handleRender(req, res){
  fetchInitialData(q, apiResult => {
    const data = apiResult;
    const results = { data, fetched: true, fetching: false, queryValue: q }
    const store = configureStore(results, reduxRouterMiddleware);
    ....
    const html = renderToString(component);
    res.status(200);
    res.send(html);
  })
}
I have a requirement to make 4 to 5 API calls on initial page load hence thought of checking to see if there is an easy way to achieve making multiple calls on page load.
Do I need to chain the api calls and manually merge the response from different API calls and send it back to load the initial state?
Update 1:
I am thinking of using the axios.all approach; can someone let me know if that is an ideal approach?
You want to make sure that requests happen in parallel, and not in sequence.
I have solved this previously by creating a Promise for each API call and waiting for all of them with axios.all(). The code below is basically pseudocode; I could dig into a better implementation at a later time.
handleRender(req, res){
  fetchInitialData()
    .then(initialResponse => {
      return axios.all([
        buildFirstCallResponse(),
        buildSecondCallResponse(),
        buildThirdCallResponse()
      ])
    })
    .then(allResponses => res.send(renderToString(component)))
}

function buildFirstCallResponse() {
  return axios.get('https://jsonplaceholder.typicode.com/posts/1')
    .then(response => response.data)
    .catch(err => console.error('Something went wrong in the first call', err))
}
Notice how all responses are bundled up into an array.
The Redux documentation goes into server-side rendering a bit, but you might already have seen that: redux.js.org/docs/recipes/ServerRendering
Also check out the Promise docs to see exactly what .all() does: developer.mozilla.org/en/docs/Web/JavaScript/Reference/Global_Objects/Promise/all
Let me know if anything is unclear.
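To directly answer the merging question: once the calls run in parallel you can combine their results into a single initial-state object. A hedged sketch (buildFirstCallResponse and friends follow the snippets above; the state shape is only an example):
// Hypothetical merge step: wait for all calls, then assemble the initial state.
function loadInitialState(q) {
  return Promise.all([
    buildFirstCallResponse(),
    buildSecondCallResponse(),
    buildThirdCallResponse()
  ]).then(([first, second, third]) => ({
    first,
    second,
    third,
    fetched: true,
    fetching: false,
    queryValue: q
  }))
}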
You could try express-batch; using GraphQL is another option.
You could also use redux-saga to trigger multiple API calls with plain Redux actions and handle all of those calls there. See Introduction to Sagas.

Sending nested request to node.js web server

I am about to teach creating a simple web server in node.js to my students. I am doing it initially using the http module and returning a static page. The server code looks like this:
var http = require('http');
var fs = require('fs');

http.createServer(function(request, response) {
  getFile(response);
}).listen(8080);

function getFile(response) {
  var fileName = __dirname + "/public/index.html";
  fs.readFile(fileName, function(err, contents) {
    if (!err) {
      response.end(contents);
    } else {
      response.end();
      console.log("ERROR ERROR ERROR");
    }
  });
}
index.html looks like this:
<!DOCTYPE html>
<html>
  <head>
    <title>Static Page</title>
  </head>
  <body>
    <h1>Returned static page</h1>
    <p>This is the content returned from node as the default file</p>
    <img src="./images/portablePhone.png" />
  </body>
</html>
As I would expect, I am getting the index.html page display without the image (because I am not handling the mime-type). This is fine; what is confusing me is, when I look at the network traffic, I would expect to have the index.html returned three times (the initial request, the image request and one for favicon.ico request). This should happen, because the only thing the web server should ever return is the index.html page in the current folder. I logged the __dirname and fileName var and they came out correctly on each request and there were indeed three requests.
So my question is, what am I missing? Why am I not seeing three index.html response objects in the network monitor on Chrome? I know one of the students will ask and I'd like to have the right answer for him.
what is confusing me is, when I look at the network traffic, I would expect to have the index.html returned three times (the initial request, the image request and one for favicon.ico request)
When I run your app, I see exactly three network requests in the network tab in the Chrome debugger, exactly as you proposed and exactly as the HTML page and the web server are coded to do. One for the initial page request, one for the image and one for favicon.ico.
The image doesn't work because you don't actually serve an image (you are serving index.html for all requests) - but perhaps you already know that.
So my question is, what am I missing? Why am I not seeing three index.html response objects in the network monitor on Chrome?
Here's my screenshot from the network tab of the Chrome debugger when I run your app:
The code that you actually wrote (originally, can't be sure you won't edit the question) just serves an index.html. There is nothing in there that could read any other file (like an image).
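For completeness (this is not part of the original answer), a minimal sketch of how the same bare http server could branch on request.url so the image request actually returns image bytes with a sensible Content-Type; the path and MIME type just mirror the index.html above:
// Hypothetical extension of the getFile() server: serve the PNG when the
// browser asks for it, otherwise fall back to index.html.
var http = require('http');
var fs = require('fs');

http.createServer(function(request, response) {
  if (request.url === '/images/portablePhone.png') {
    fs.readFile(__dirname + '/public/images/portablePhone.png', function(err, contents) {
      response.writeHead(err ? 404 : 200, { 'Content-Type': 'image/png' });
      response.end(err ? undefined : contents);
    });
  } else {
    fs.readFile(__dirname + '/public/index.html', function(err, contents) {
      response.writeHead(err ? 404 : 200, { 'Content-Type': 'text/html' });
      response.end(err ? undefined : contents);
    });
  }
}).listen(8080);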
I don't think you should teach students that syntax/mechanism because it is outdated. For starters, do not teach them to indent with tabs or four spaces. Indent with 2 spaces for JavaScript. Also, it just doesn't make sense to teach ES5 at this point. They should learn ES2015 or later (ES6/ECMAScript 2016/whatever they call it). For the current version of Node out of the box (6.6 as of writing), this would be the equivalent of what you wrote:
const http = require('http');
const fs = require('fs-promise');

http.createServer((request, response) => {
  fs.readFile(`${__dirname}/public/index.html`)
    .then(data => { response.end(data) })
    .catch(console.error);
}).listen(8080);
But what you seem to be trying to do is create a gallery script.
Another thing about Node is, there are more than 300,000 modules available. So it just absolutely does not make sense to start from 0 and ignore all 300,000+ modules.
Also, within about three months, six at the most, async/await will land in Node 7 without requiring Babel. And people will argue that kids will be confused if they don't have enough time toiling with promises, but I don't think I buy that. I think you should just teach them how to set up Babel and use async/await. Overall it's going to make more sense and they will learn a much clearer way to do things. And then the next time you teach the class you probably won't need Babel.
So this is one way I would make a simple gallery script that doesn't ignore all of the modules on npm and uses up-to-date syntax:
import {readFile} from 'fs-promise';
import listFilepaths from 'list-filepaths';
import Koa from 'koa';

const app = new Koa();

app.use(async (ctx) => {
  if (ctx.request.querystring.indexOf('.jpg') > 0) {
    const fname = ctx.request.querystring.split('=')[1];
    ctx.body = await readFile(`images/${fname}`);
  } else {
    let images = await listFilepaths('./images', {relative: true});
    images = images.map(i => i.replace('images/', ''));
    ctx.body = `${images.map(i => `<img src="/?i=${i}" />`)}`;
  }
});

app.listen(3000);

Stateful interaction for testing Express Apps

I wrote a simple JSON API with Express and I'm trying to use Mocha to do some black-box testing. Thoroughly testing the API requires authenticating as different users, so each test for a specific feature is made of at least two requests: a login operation and one or more authenticated requests that test the actual feature.
I haven't found any library similar to django.test.client to simulate stateful interaction between an HTTP client and a server. Supertest seems to be popular, but it is very low-level compared to the Django test client. This is how I would write a simple authenticated test with it (pardon my CoffeeScript):
it 'should return a 200 OK', (done) ->
  supertest(server.app)
    .post('/login')
    .send("username=xxx&password=pass")
    .end (err, res) ->
      res.should.have.status(200)
      supertest(server.app)
        .get('/api/users')
        .set('cookie', res.headers['set-cookie'][0])
        .expect(200, done)
Is this really the cleanest way to execute the interaction? Is there any library that would help me with asynchronicity (it's not like I am going to need anything but plain serialization of the tests in 99% of cases, callbacks are just confusing) and statefulness? Something that would go like this:
it 'should rock', (done) -> myCoolLibrary [
  ->
    @post '/login', {username: "xxx", password: "pass"}, (err, res) =>
      res.should.have.status 200
      @done()
  ,
  ->
    @get '/api/users', (err, res) =>
      res.should.have.status 200
      @done()
]
If nothing similar exists, I should write it myself :-)
The reliance on the context is because I am using too much ZappaJS these days, and thanks to CoffeeScript's fat arrow, it's not a bad idiom at all.
It sounds like you could benefit from zombiejs. It simulates a browser and keeps cookies and session data between requests.
It also gives you more powerful features such as allowing you to fill out forms and submit them, for example.
A typical test would look something like this:
var Browser = require('zombie')
  , browser = new Browser({site: 'http://yoursite.com'});

describe('page', function(){
  before(function(done){
    browser.visit('/loginpage', done);
  });

  it('should return a 200 page', function(done){
    browser.fill('username', 'xxx');
    browser.fill('password', 'pass');
    //assuming your form points to /login
    browser.pressButton('button[type="submit"]').then(function(){
      assert(browser.success); //status code is 2xx
    }).then(done, done); //call the done handler after the promise is fulfilled
  });

  it('should rock', function(done){
    browser.visit('/api/users').then(function(){
      assert(browser.success);
    }).then(done, done);
  });
});
As a more general solution for making async code cleaner to read, check out async: https://github.com/caolan/async
async.series would do just what you need, but I would particularly recommend async.auto, which allows you to link the various steps together with their dependencies in a clear way.
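For illustration, a rough sketch (not from the original answer) of the login-then-request flow serialized with async.series, reusing supertest as in the question:
// Hypothetical serialized test steps; each step calls its callback when done,
// and the captured cookie carries the session into the second request.
var async = require('async');

it('should return users for a logged-in session', function(done) {
  var cookie;
  async.series([
    function(next) {
      supertest(server.app)
        .post('/login')
        .send('username=xxx&password=pass')
        .end(function(err, res) {
          cookie = res.headers['set-cookie'][0];
          next(err);
        });
    },
    function(next) {
      supertest(server.app)
        .get('/api/users')
        .set('cookie', cookie)
        .expect(200, next);
    }
  ], done);
});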
I ended up writing myself a small library that is pretty close to my "ideal" example in the question. It doesn't deserve its own package for now, so I just put it in a gist:
https://gist.github.com/BruceBerry/5485917
I could not get superagent and supertest to perform stateful interaction, so I just ditched them in favor of request. The main difference seems to be that you can't chain expects and you have to do all tests in the callback, but those look odd anyway if you are already using another testing library such as should.js
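For reference, a hedged sketch of what that stateful interaction can look like with request and its cookie jar (jar: true shares cookies between calls; the URLs and assertions are just placeholders):
// Hypothetical stateful flow with request; the shared cookie jar carries the
// session from the login call into the authenticated call.
var request = require('request').defaults({ jar: true, baseUrl: 'http://localhost:3000' });

it('should list users after logging in', function(done) {
  request.post({ url: '/login', form: { username: 'xxx', password: 'pass' } }, function(err, res) {
    res.statusCode.should.equal(200);
    request.get('/api/users', function(err, res) {
      res.statusCode.should.equal(200);
      done();
    });
  });
});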
