I've built a Next.js blog and I want to write a backend for it with Node.js/Express, first running locally and later on my own server.
I'm using MongoDB/Mongoose to fetch the blog post data locally.
I've created a [slug].js file in my /posts/ directory to show a single blog post.
In my index.js page I can fetch all the post data from my local database and show all the blog titles without an error, so getStaticProps works correctly with Mongoose there.
The problem is that when I click a single-post link on the index page, I can't get to the single blog post. When I click the link, nothing happens.
Also, if I make a GET request to a single-post page from Postman, no response comes back.
If, instead of using my own database, I put some data at the top of the [slug].js page, I can use that data as a stand-in database successfully.
Likewise, if I copy the getStaticProps or getStaticPaths function from the [slug].js file into another file to test it, I can still receive the data from the DB.
Only in the [slug].js file, and only when I use my local DB, does the single-post page not work.
Here is my [slug].js file:
import PostModel from "../../backend/models/PostModel";

export async function getStaticPaths() {
  // Return a list of possible values for slug
  const allPostData2 = await PostModel.find();
  const paths = allPostData2.map((post) => {
    return { params: { slug: post.slug } };
  });
  return {
    paths,
    fallback: false,
  };
}

export async function getStaticProps(context) {
  // Fetch necessary data for the blog post using params.slug
  const slug = context.params.slug;
  const myPost2 = await PostModel.find({ slug: slug });
  return {
    props: {
      post: myPost2,
    },
  };
}

export default function PostPage({ post }) {
  return (
    <div>
      <div>{myPost.title}</div>
      <div>This is a single post page</div>
    </div>
  );
}
Here is the GitHub repo for the project; any help would be greatly appreciated:
https://github.com/hakanolgun/try-next-blog
I'm new to Next.js and I'm trying to understand the suggested structure and how to deal with data between pages or components.
For instance, inside my page home.js, I fetch an internal API called /api/user.js which returns some user data from MongoDB. I do this by using fetch() to call the API route from within getServerSideProps(), which passes various props to the page after some calculations.
From my understanding, this is good for SEO, since props get fetched/modified server-side and the page gets them ready to render. But then I read in the Next.js documentation that you should not use fetch() to call an API route in getServerSideProps(). So what am I supposed to do to comply with good practice and good SEO?
The reason I'm not doing the required calculations for home.js in the API route itself is that I need more generic data from this API route, as I will use it in other pages as well.
I also have to consider caching: client-side it's very straightforward to use SWR to fetch an internal API, but server-side I'm not yet sure how to achieve it.
home.js:
export default function Page({ prop1, prop2, prop3 }) {
  // render etc.
}

export async function getServerSideProps(context) {
  const session = await getSession(context)
  let data = null
  var aArray = [], bArray = [], cArray = []
  const { db } = await connectToDatabase()

  // Fisher-Yates shuffle
  function shuffle(array) {
    var currentIndex = array.length, temporaryValue, randomIndex;
    while (0 !== currentIndex) {
      randomIndex = Math.floor(Math.random() * currentIndex);
      currentIndex -= 1;
      temporaryValue = array[currentIndex];
      array[currentIndex] = array[randomIndex];
      array[randomIndex] = temporaryValue;
    }
    return array;
  }

  if (session) {
    const hostname = process.env.NEXT_PUBLIC_SITE_URL
    const options = { headers: { cookie: context.req.headers.cookie } }
    const res = await fetch(`${hostname}/api/user`, options)
    const json = await res.json()
    if (json.data) { data = json.data }
    // do some math with data ...
    // connect to MongoDB and do some comparisons, etc.
  }
  // ... eventually return { props: { prop1, prop2, prop3 } }
}
But then I read in the Next.js documentation that you should not use fetch() to call an API route in getServerSideProps().
You want to use the logic that's in your API route directly in getServerSideProps, rather than calling your internal API. That's because getServerSideProps runs on the server just like the API routes (making a request from the server to the server itself would be pointless). You can read from the filesystem or access a database directly from getServerSideProps. Note that this only applies to calls to internal API routes - it's perfectly fine to call external APIs from getServerSideProps.
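For instance, here is a minimal sketch of querying the database directly inside getServerSideProps, reusing the connectToDatabase helper from the question (the users collection name and the query itself are assumptions for illustration):

export async function getServerSideProps(context) {
  const { db } = await connectToDatabase()
  // hypothetical: query MongoDB directly instead of fetch()-ing /api/user
  const user = await db.collection('users').findOne({ email: 'someone@example.com' })
  // serialize to strip non-JSON values (e.g. ObjectId, Date) before passing as props
  return { props: { data: JSON.parse(JSON.stringify(user)) } }
}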
From Next.js getServerSideProps documentation:
It can be tempting to reach for an API Route when you want to fetch data from the server, then call that API route from getServerSideProps. This is an unnecessary and inefficient approach, as it will cause an extra request to be made due to both getServerSideProps and API Routes running on the server.
(...) Instead, directly import the logic used inside your API Route into getServerSideProps. This could mean calling a CMS, database, or other API directly from inside getServerSideProps.
(Note that the same applies when using the getStaticProps/getStaticPaths methods.)
Here's a small refactor example that allows you to have logic from an API route reused in getServerSideProps.
Let's assume you have this simple API route.
// pages/api/user
export default async function handler(req, res) {
  // Using a fetch here but could be any async operation to an external source
  const response = await fetch(/* external API endpoint */)
  const jsonData = await response.json()
  res.status(200).json(jsonData)
}
You can extract the fetching logic into a separate function (you can still keep it in api/user if you want), which remains usable in the API route.
// pages/api/user
export async function getData() {
  const response = await fetch(/* external API endpoint */)
  const jsonData = await response.json()
  return jsonData
}

export default async function handler(req, res) {
  const jsonData = await getData()
  res.status(200).json(jsonData)
}
But it also allows you to re-use the getData function in getServerSideProps.
// pages/home
import { getData } from './api/user'

//...

export async function getServerSideProps(context) {
  const jsonData = await getData()
  //...
}
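And as the note above says, the same extraction works for static generation; a minimal sketch, assuming the same getData export:

// pages/home — same idea, but at build time
import { getData } from './api/user'

export async function getStaticProps() {
  const jsonData = await getData()
  return { props: { jsonData } }
}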
You want to use the logic that's in your API route directly in getServerSideProps, rather than calling your internal API. That's because getServerSideProps runs on the server just like the API routes (making a request from the server to the server itself would be pointless). You can read from the filesystem or access a database directly from getServerSideProps.
I admit what you say is correct, but the problem still exists. Assume your backend is already written and your APIs are secured; extracting the logic out of a secured, finished backend seems annoying and a waste of time and energy. Another disadvantage is that once you extract the logic from the backend, you must rewrite the code that handles errors, authenticates users, and validates requests, all of which already exists in your backend. So is it possible to call APIs within Next.js without extracting the logic from the middlewares? The answer is yes; here is my solution:
npm i node-mocks-http
import httpMocks from "node-mocks-http";
import newsController from "./api/news/newsController";
import logger from "../middlewares/logger";
import dbConnectMid from "../middlewares/dbconnect";
import NewsCard from "../components/newsCard";

export default function Home({ news }) {
  return (
    <section>
      <h2>Latest News</h2>
      <NewsCard news={news} />
    </section>
  );
}

export async function getServerSideProps() {
  // mock req/res objects so Express-style middlewares can run outside a real HTTP request
  let req = httpMocks.createRequest();
  let res = httpMocks.createResponse();

  // run the middlewares in order, each calling the following one through its next() callback
  async function callMids(req, res, index, ...mids) {
    index = index || 0;
    if (index <= mids.length - 1)
      await mids[index](req, res, () => callMids(req, res, ++index, ...mids));
  }

  await callMids(
    req,
    res,
    null,
    dbConnectMid,
    logger,
    newsController.sendAllNews
  );

  return {
    props: { news: res._getJSONData() },
  };
}
Important note: if you use my code, don't forget to use await next() instead of next() in all of your middlewares, or you will get an error.
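For illustration, a middleware written that way might look like this (a hypothetical body for the logger import above; the await next() is the point):

// middlewares/logger — hypothetical example
export default async function logger(req, res, next) {
  console.log(`${req.method} ${req.url}`);
  await next(); // must be awaited so callMids finishes before getServerSideProps returns
}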
Another solution: next-connect has a run method that does something like my code, but I personally had some problems with it; here is its link:
next-connect run method to call Next.js API logic in getServerSideProps
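For reference, a minimal sketch of that next-connect approach (v0.x API, reusing the middlewares assumed above):

import nc from "next-connect";
import dbConnectMid from "../middlewares/dbconnect";
import logger from "../middlewares/logger";

const handler = nc().use(dbConnectMid).use(logger);

export async function getServerSideProps({ req, res }) {
  await handler.run(req, res); // runs the middleware chain without an HTTP round-trip
  return { props: {} };
}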
Just try to use useSWR; example below.
import useSWR from 'swr'
import React from 'react';

// important to return only the result, not the Promise
const fetcher = (url) => fetch(url).then((res) => res.json());

const Categories = () => {
  // getting data and error
  const { data, error } = useSWR('/api/category/getCategories', fetcher)

  if (error) return <div>Failed to load</div>
  if (!data) return <div>Loading...</div>
  if (data) {
    // {data} is complete, it's ok!
    // your code here to do something with {data}
    return (
      <div>
        {/* something here, for example {data.name} */}
      </div>
    )
  }
}

export default Categories
Please note that fetch in getServerSideProps only supports absolute URLs, which is why I don't like to use it.
P.S. According to the docs, you can even use useSWR with SSR.
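A minimal sketch of that SSR combination (getCategories here stands for a hypothetical direct data-layer call; fallbackData is SWR's option for seeding data, called initialData in pre-1.0 versions):

export async function getServerSideProps() {
  const categories = await getCategories() // hypothetical direct DB call
  return { props: { categories } }
}

const Categories = ({ categories }) => {
  // seeded with the server-rendered data, then revalidated on the client
  const { data } = useSWR('/api/category/getCategories', fetcher, {
    fallbackData: categories,
  })
  return <div>{data?.name}</div>
}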
I am learning by building: a blog management system using React, Node.js, and MongoDB.
I would like to store some frontend values in the database so that anyone I give admin permission to can create, edit, and delete them. These values are the website name, sub-name, sidebar bio description, header image, and bio image.
This is the code to create the value:
// create new frontend parameters in the database
router.post("/", async (req, res) => {
  const newFrontendValues = new Frontend(req.body); // we call the Frontend model we created and use req.body
  try {
    const savedFrontendValues = await newFrontendValues.save(); // try to save the frontend values created
    res.status(200).json(savedFrontendValues);
  } catch (err) {
    res.status(500).json(err);
  }
});
After creating them, I wrote this Node code to get the values:
// get frontend parameters
router.get("/:id", async (req, res) => {
  try {
    const frontend = await Frontend.findById(req.params.id);
    res.status(200).json(frontend);
  } catch (err) {
    res.status(500).json(err);
  }
});
My server API code:
app.use("/api/frontend", frontend)
In React, I want to call the _id of the values, but I am lost. I really don't know how to go about it.
It works fine as desired in Postman, because there I can supply the value's _id directly.
But in React, I wanted that to be dynamic.
Here is my React code:
useEffect(() => {
  const fetchFrontendValue = async () => {
    const res = await axios.get("/frontend")
    console.log(res.data)
  }
  fetchFrontendValue()
}, [])
How do I add the _id to this
axios.get("/frontend")
You'd want to look at GET request parameters. As a convention, people usually pass them in the URL, so it would be something like http://localhost:5000/api/frontend?id=617944dc7e00022337a483be78, and on the API side you'd use req.query.id to look that up in the database. There are other ways to do it too, but passing parameters in the URL is the most common, since a body attached to a GET request may be dropped. Here's a link: https://www.w3schools.com/tags/ref_httpmethods.asp
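In code, the two sides of that might look like this (a sketch; the query-string route would live alongside the existing /:id route):

// React — pass the id as a query parameter
const res = await axios.get("/api/frontend", {
  params: { id: "617944dc7e00022337a483be78" },
});

// Express — read it from req.query instead of req.params
router.get("/", async (req, res) => {
  try {
    const frontend = await Frontend.findById(req.query.id);
    res.status(200).json(frontend);
  } catch (err) {
    res.status(500).json(err);
  }
});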
You should consider going for a complete solution. On a basic level, you should follow these steps (a sketch of the route from step 1 follows the list):
1. Implement a backend route /getall that fetches all the items in the DB, in this manner:
await Frontend.find({})
2. Render the fetched list on the frontend so that each item becomes a React UI item, and each item includes buttons for deleting and updating its data:
{backendData?.map((item, index) => <SingleItem key={item?._id} data={item} />)}
3. Since each SingleItem has update and delete buttons, and the MongoDB ID as part of its data, clicking the update or delete button lets you take the id from the data and call the relevant backend URL.
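The /getall route from step 1 could follow the same pattern as the question's existing routes:

// get all frontend parameters
router.get("/getall", async (req, res) => {
  try {
    const items = await Frontend.find({});
    res.status(200).json(items);
  } catch (err) {
    res.status(500).json(err);
  }
});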
I am working with Gatsby and WordPress. I am trying to redirect some URLs using the Gatsby redirect API. I write the query to get an object, then use the map method to create an array of the items we need from that object. I then run a forEach over that array to get the individual data, but it fails when running the development server.
What is the right way to do this?
const { createRedirect } = actions;

const yoastRedirects = graphql(`
  {
    wp {
      seo {
        redirects {
          format
          origin
          target
          type
        }
      }
    }
  }
`)

const redirectOriginUrls = yoastRedirects.wp.seo.redirects.map((redirect) => redirect.origin)
const redirectTargetUrls = yoastRedirects.wp.seo.redirects.map((redirect) => redirect.target)

redirectOriginUrls.forEach((redirectOriginUrl) =>
  redirectTargetUrls.forEach((redirectTargetUrl) =>
    createRedirect({
      fromPath: `/${redirectOriginUrl}`,
      toPath: `/${redirectTargetUrl}`,
      isPermanent: true,
    })
  )
)
The createRedirect API needs to receive a structure like:
exports.createPages = ({ graphql, actions }) => {
  const { createRedirect } = actions
  createRedirect({ fromPath: '/old-url', toPath: '/new-url', isPermanent: true })
  createRedirect({ fromPath: '/url', toPath: '/zn-CH/url', Language: 'zn' })
  createRedirect({ fromPath: '/not_so-pretty_url', toPath: '/pretty/url', statusCode: 200 })
  // Create pages
}
In your case, you are not accessing the fetched data correctly. Assuming the loops are otherwise properly done, you must do:
let redirectOriginUrls = [];
let redirectTargetUrls = [];

yoastRedirects.data.wp.seo.redirects.map((redirect) => {
  return redirectOriginUrls.push(redirect.origin);
});

yoastRedirects.data.wp.seo.redirects.map((redirect) => {
  return redirectTargetUrls.push(redirect.target);
});
Instead of:
const redirectOriginUrls = yoastRedirects.wp.seo.redirects.map((redirect) => redirect.origin)
const redirectTargetUrls = yoastRedirects.wp.seo.redirects.map((redirect) => redirect.target)
Notice the .data addition in the nested object.
In addition, keep in mind that the createRedirect API will only work when there is a hosting infrastructure behind it, like AWS or Netlify, both of which have plugin integrations with Gatsby. These generate meta-redirect HTML files for redirecting on any static file host.
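Putting it together, a minimal sketch of the whole createPages hook: note that it awaits the query (createPages must be async for that) and, assuming each redirect's origin belongs with its own target, iterates the redirects directly instead of nesting the two forEach loops:

exports.createPages = async ({ graphql, actions }) => {
  const { createRedirect } = actions

  const yoastRedirects = await graphql(`
    {
      wp {
        seo {
          redirects {
            origin
            target
          }
        }
      }
    }
  `)

  yoastRedirects.data.wp.seo.redirects.forEach((redirect) => {
    createRedirect({
      fromPath: `/${redirect.origin}`,
      toPath: `/${redirect.target}`,
      isPermanent: true,
    })
  })
}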
I'm using TinyMCE in a custom field for the KeystoneJS AdminUI, which is a React app. I'd like to upload images from the React frontend to the KeystoneJS GraphQL backend. I can upload the images using a REST endpoint I added to the Keystone server, passing TinyMCE an images_upload_handler callback, but I'd like to take advantage of Keystone's already-built GraphQL endpoint for an Image list/type I've created.
I first tried the approach detailed in this article, using axios to upload the image:
const getGQL = (theFile) => {
  const query = gql`
    mutation upload($file: Upload!) {
      createImage(file: $file) {
        id
        file {
          path
          filename
        }
      }
    }
  `;

  // The operation contains the mutation itself as "query"
  // and the variables that are associated with the arguments
  // The file variable is null because we can only pass text
  // in operation variables
  const operation = {
    query,
    variables: {
      file: null
    }
  };

  // This map is used to associate the file saved in the body
  // of the request under "0" with the operation variable "variables.file"
  const map = {
    '0': ['variables.file']
  };

  // This is the body of the request
  // the FormData constructor builds a multipart/form-data request body
  // Here we add the operation, map, and file to upload
  const body = new FormData();
  body.append('operations', JSON.stringify(operation));
  body.append('map', JSON.stringify(map));
  body.append('0', theFile);

  // Create the options of our POST request
  const opts = {
    method: 'post',
    url: 'http://localhost:4545/admin/api',
    body
  };

  // @ts-ignore
  return axios(opts);
};
but I'm not sure what to pass as theFile. TinyMCE's images_upload_handler, from which I need to call the image upload, receives a blobInfo object that contains functions giving me things like the file name and the blob itself.
The file name doesn't work, and neither does the blob; both give me a 500 server error, and the error message isn't any more specific.
I would prefer to use a GraphQL client to upload the image; another SO article suggests using apollo-upload-client. However, I'm operating within the KeystoneJS environment, and apollo-upload-client says:
Apollo Client can only have 1 “terminating” Apollo Link that sends the GraphQL requests; if one such as apollo-link-http is already setup, remove it.
I believe Keystone has already set up apollo-link-http (it comes up multiple times on search), so I don't think I can use apollo-upload-client.
The UploadLink is just a drop-in replacement for HttpLink. There's no reason you shouldn't be able to use it. There's a demo KeystoneJS app here that shows the Apollo Client configuration, including using createUploadLink.
Actual usage of the mutation with the Upload scalar is shown here.
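The client setup itself would be along these lines (a sketch: the uri reuses the admin API endpoint from the question, and the exact Apollo packages depend on your Keystone/Apollo version):

import { ApolloClient, InMemoryCache } from '@apollo/client'
import { createUploadLink } from 'apollo-upload-client'

const apolloClient = new ApolloClient({
  // UploadLink terminates the link chain just like HttpLink would
  link: createUploadLink({ uri: 'http://localhost:4545/admin/api' }),
  cache: new InMemoryCache(),
})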
Looking at the source code, you should be able to use a custom image handler and call blob on the provided blobInfo object. Something like this:
tinymce.init({
  images_upload_handler: async function (blobInfo, success, failure) {
    const image = blobInfo.blob()
    try {
      // mutate takes a single options object with the mutation and its variables
      await apolloClient.mutate({
        mutation: gql` mutation($image: Upload!) { ... } `,
        variables: { image }
      })
      success()
    } catch (e) {
      failure(e)
    }
  }
})
I used to have the same problem and solved it with apollo-upload-client. When the app got into the production phase, I realized Apollo Client took up a third of the gzipped build, so I created a minimal GraphQL client just for Keystone use, with automatic image upload. The package is available on npm: https://www.npmjs.com/package/@sylchi/keystone-graphql-client
A usage example that will upload the GitHub logo to a user's profile, assuming there is a user whose avatar field is set as a file:
import { mutate } from '@sylchi/keystone-graphql-client'

const getFile = () => fetch('https://github.githubassets.com/images/modules/logos_page/GitHub-Mark.png',
  {
    mode: "cors",
    cache: "no-cache"
  })
  .then(response => response.blob())
  .then(blob => {
    return new File([blob], "file.png", { type: "image/png" });
  });

getFile().then(file => {
  const options = {
    mutation: `
      mutation($id: ID!, $data: UserUpdateInput!){
        updateUser(id: $id, data: $data){
          id
        }
      }
    `,
    variables: {
      id: "5f5a7f712a64d9db72b30602", // replace with a real user id
      data: {
        avatar: file
      }
    }
  };

  mutate(options).then(result => console.log(result));
});
The whole package is just 50 LOC with one dependency :)
The easiest way for me was to use graphql-request. The advantage is that you don't have to set any header props manually, and it uses exactly the variables you get from images_upload_handler, as the docs describe.
I did it this way:
const { request, gql } = require('graphql-request')

const query = gql`
  mutation IMAGE($file: Upload!) {
    createImage(data: {
      file: $file,
    }) {
      id
      file {
        publicUrl
      }
    }
  }
`

images_upload_handler = (blobInfo, success) => {
  // blobInfo and success are the arguments TinyMCE passes in
  const variables = {
    file: blobInfo.blob()
  }
  request(GRAPHQL_API_URL, query, variables)
    .then(data => {
      console.log(data)
      success(data.createImage.file.publicUrl)
    })
}
For Keystone 5, editorConfig would strip out functions, so I cloned the field and set the function in the views/Field.js file.
Good luck ( ^_^)/*
I have tried to find a solution for the following, but it does not seem obvious to a beginner. I am integrating a form upload using Backbone and Express. The form consists of user information and a file upload. The idea is to upload the file and persist the user model to a database, with a field referencing the file.
Uploading a file with Express is easy to implement by sending the form through POST with the proper HTML attributes (<form method='post' action='/upload' enctype='multipart/form-data'>) and a simple handler, as done in http://shivalibari.com/blog/2014/02/file-upload-using-node/.
In my case, however, I use a Backbone view to handle the submit event and call save() to persist the model, which makes a POST to Express:
submit: function (e) {
  e.preventDefault();
  var formData = {};
  console.log("submit form");

  // read the form inputs...
  $('.form-group').children('input').each(function (i, el) {
    if ($(el).val() != '') {
      formData[el.id] = $(el).val();
    }
  });

  this.user = new User();
  this.user.save(formData);
}
A simplified version of the POST handler:
var app = express();

app.post('/api/users', function (request, response) {
  var user = new UserModel({
    name: request.body.name,
    email: request.body.email,
    file: request.body.file
  });
  user.save(function (err) {
    if (!err) {
      console.log('created');
    } else {
      console.log(err);
    }
    return response.send(user);
  });
});
Any ideas on how to handle the file upload, whether before, after, or during model persistence?
I'd like to minimize the use of plugins unless strictly necessary, using Node.js.
I found my own answer in the following blog post: http://estebanpastorino.com/2013/09/27/simple-file-uploads-with-backbone-dot-js/. The solution consists of using the iframe trick to pass the file objects to the server, and it can be easily implemented with a jQuery plugin called jquery.iframe-transport.
This jQuery plugin makes the file objects available on the server in the request.files attribute, which can be used to read/write files as explained in http://shivalibari.com/blog/2014/02/file-upload-using-node/.
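Server-side, reading and persisting the uploaded file then looks roughly like this (a sketch in the Express 3 style of that era, where req.files is populated by the body parser; the 'file' field name is an assumption):

var fs = require('fs');

app.post('/api/files', function (request, response) {
  // jquery.iframe-transport posts the form into a hidden iframe,
  // so the file shows up in request.files like a normal multipart upload
  var file = request.files.file;
  fs.readFile(file.path, function (err, data) {
    if (err) return response.send(500, err);
    fs.writeFile('./uploads/' + file.name, data, function (err) {
      if (err) return response.send(500, err);
      response.send({ file: file.name });
    });
  });
});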
In my implementation I ended up using two different models for the user profile and the files: I first synced the file model, then used a success callback to sync the user profile model.
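Client-side, that chaining looks roughly like this (names are illustrative; the iframe and files options are what jquery.iframe-transport reads from the ajax settings that Backbone passes through):

var fileUpload = new FileUpload(); // hypothetical Backbone model for the file
fileUpload.save(null, {
  iframe: true,                        // route the request through the hidden iframe
  files: this.$('input[type="file"]'), // file inputs to ship along
  data: formData,
  success: function (model, resp) {
    formData.file = resp.file;         // reference the stored file
    new User().save(formData);         // then persist the user profile
  }
});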