According to the Swagger website, there are two approaches: bottom-up and top-down.
I have an existing Node.js server that I'd like to deploy in the Azure environment, which requires a Swagger document (API App).
Does anyone know a tool for generating the Swagger document from the code? Even better if you could point me to a tutorial. I couldn't find one.
The question is a bit old, but still: it is possible to generate a Swagger (OpenAPI) specification completely automatically just by embedding an analysis middleware such as this one: https://github.com/mpashkovskiy/express-oas-generator
const express = require('express');
const expressOasGenerator = require('express-oas-generator');

let app = express();
// initialise the generator right after creating the app, before registering your own routes
expressOasGenerator.init(app, {});
Run some client or REST API tests against your service and open http://host:port/api-docs.
It's not difficult to integrate Swagger into an existing Express application by following this tutorial.
Generally, we can follow these steps:
Add the dependencies to our package.json and run npm install to install them. The dependencies should be:
"dependencies": {
"swagger-node-express": "~2.0",
"minimist": "*",
"body-parser": "1.9.x",
...
}
Download the zip of the Swagger-UI project and copy its dist folder into the root directory of our project.
Introduce the dependencies at the beginning of app.js:
var argv = require('minimist')(process.argv.slice(2));
var swagger = require("swagger-node-express");
var bodyParser = require( 'body-parser' );
Set up a subpath for swagger doc:
var subpath = express();
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));
app.use("/v1", subpath);
swagger.setAppHandler(subpath);
Make sure that Express serves static files from /dist:
app.use(express.static('dist'));
Set the info for API:
swagger.setApiInfo({
title: "example API",
description: "API to do something, manage something...",
termsOfServiceUrl: "",
contact: "yourname#something.com",
license: "",
licenseUrl: ""
});
Introduce /dist/index.html for swagger UI:
subpath.get('/', function (req, res) {
res.sendFile(__dirname + '/dist/index.html');
});
Complete the swagger configurations:
swagger.configureSwaggerPaths('', 'api-docs', '');
var domain = 'localhost';
if (argv.domain !== undefined) {
  domain = argv.domain;
} else {
  console.log('No --domain=xxx specified, taking default hostname "localhost".');
}
var applicationUrl = 'http://' + domain;
swagger.configure(applicationUrl, '1.0.0');
Configure the doc file dependency in /dist/index.html:
if (url && url.length > 1) {
  url = decodeURIComponent(url[1]);
} else {
  // replace the default "http://petstore.swagger.io/v2/swagger.json" with our own spec
  url = "/api-docs.json";
}
Create an api-docs.json file with the info of your APIs and put it in the dist folder.
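For illustration, here is a minimal sketch of what such a file could contain, assuming a Swagger 2.0 document with a single hypothetical /something endpoint (the exact shape depends on the spec version your setup expects):
{
  "swagger": "2.0",
  "info": { "title": "example API", "version": "1.0.0" },
  "basePath": "/v1",
  "paths": {
    "/something": {
      "get": {
        "summary": "Returns something",
        "responses": { "200": { "description": "OK" } }
      }
    }
  }
}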
Run the Express app locally and visit http://localhost:3000/v1 to check the Swagger doc.
Here is my test sample repo for your reference.
To my knowledge, your options are:
Using swagger-node-express, which is very cumbersome in my opinion.
Writing the Swagger document manually yourself with the help of the Swagger Editor, as suggested in this SO answer.
If you go for option 2, you could use swagger-ui-express to serve the Swagger UI, for example:
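A minimal sketch, assuming the handwritten spec is saved as swagger.json next to app.js:
const express = require('express');
const swaggerUi = require('swagger-ui-express');
const swaggerDocument = require('./swagger.json'); // the document written with the Swagger Editor

const app = express();

// serve the interactive Swagger UI at /api-docs
app.use('/api-docs', swaggerUi.serve, swaggerUi.setup(swaggerDocument));

app.listen(3000);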
A lot of developers are still having this problem so I built an open-source tool to help -- the tool is kind of like Git for APIs. It works by running a proxy while you're developing the API, analyzing the traffic, and updating the spec for you as the API's behavior changes. Hopefully, the workflow saves you a lot of time: https://github.com/opticdev/optic
Most alternatives require some sort of API specification written in JSON, YAML, or even embedded in JSDoc comments. On the other hand, some runtime analyzers intercept HTTP requests and build that specification on demand.
express-sitemap-html follows a different approach, inspecting the Express object and its routes at setup time. Thus it always provides an up-to-date Swagger UI for the routes installed on an existing Express instance.
const sitemap = require('express-sitemap-html')
...
sitemap.swagger('Title', app) // app is an express instance
Then get swagger UI from your domain /api-docs.
Related
We have many APIs written in Node.js, using the NestJS framework.
We can generate openapi.yaml using the SwaggerModule from NestJS; that works perfectly. But the problem is that it needs the API to be up, and therefore the database to be up and running. That's a problem for us in our CI/CD, because we need to generate the OpenAPI specification before running the API.
Is it possible to generate the OpenAPI specification from our code, without needing to run the application?
Or maybe there is a simple way to mock our database?
Thanks for your help
The short answer is no, there isn't a way to generate the docs without running the NestJS application. However, you can generate a JSON file representing the OpenAPI documentation and then generate a static website from there. This issue gets you half-way there:
import { NestFactory } from '@nestjs/core';
import { DocumentBuilder, SwaggerModule } from '@nestjs/swagger';
import { writeFileSync } from 'fs';
import * as path from 'path';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  const options = new DocumentBuilder()
    .setTitle('Cats example')
    .setDescription('The cats API description')
    .setVersion('1.0')
    .addTag('cats')
    .build();
  const document = SwaggerModule.createDocument(app, options);

  // write the generated OpenAPI document to disk instead of serving it
  const outputPath = path.resolve(process.cwd(), 'swagger.json');
  writeFileSync(outputPath, JSON.stringify(document), { encoding: 'utf8' });

  await app.close();
}
bootstrap();
This will generate a file swagger.json containing the OpenAPI specification. From there, you can use a tool like Spectacle to generate the actual HTML:
npx spectacle-docs -t public/docs swagger.json
An even less documented feature is the ability to retrieve a JSON representation of the OpenAPI specification from the regular endpoint using only curl.
Let's say you have a standard @nestjs/swagger integration that publishes the OpenAPI docs to /docs/:
const options = new DocumentBuilder()
.setTitle('core-api')
.setDescription('The core API description')
.setVersion('3.0')
.addTag('core-api')
.setBasePath(version)
.build();
const document = SwaggerModule.createDocument(app, options);
SwaggerModule.setup('docs', app, document);
If you browse to http://localhost:3000/docs/, you can access the HTML version of the docs. However, if you browse to http://localhost:3000/docs-json you will receive a JSON representation. Simply append -json to whatever your spec path is.
Tying this all together, you can integrate this into a CI pipeline with a little hackery. I have integrated this into a Gitlab CI pipeline like so:
script:
- until nc -vz $API_IP 3000; do sleep 1; done
- curl http://$API_IP:3000/docs-json -o swagger.json
- npx spectacle-docs -t public/docs swagger.json
In your CI pipeline you'll still have to run your NestJS application, as well as Mongo and any other dependent services required for it to start, but once you generate the JSON you can stop your application, build the static HTML site, and publish it elsewhere.
I managed to generate the Swagger spec from my e2e tests without starting the server; see:
generating swagger json file without running nest js server
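For reference, here is a rough sketch of how that can look, assuming @nestjs/testing is available and AppModule is your root module (adjust the import path): the app is initialised inside a test but listen() is never called, and database providers can be swapped out with overrideProvider if they must not connect.
import { Test } from '@nestjs/testing';
import { DocumentBuilder, SwaggerModule } from '@nestjs/swagger';
import { writeFileSync } from 'fs';
import { AppModule } from '../src/app.module';

it('writes the OpenAPI spec without starting the server', async () => {
  // compile the application module (providers could be overridden here if needed)
  const moduleRef = await Test.createTestingModule({ imports: [AppModule] }).compile();
  const app = moduleRef.createNestApplication();
  await app.init(); // initialises the app, but never listens on a port

  const options = new DocumentBuilder()
    .setTitle('core-api')
    .setVersion('1.0')
    .build();
  const document = SwaggerModule.createDocument(app, options);
  writeFileSync('swagger.json', JSON.stringify(document, null, 2));

  await app.close();
});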
I implemented a project for test purposes with the restify framework in Node and implemented a GET API.
But I don't know how to integrate Swagger with the restify framework.
There are many blog posts about integrating Swagger with Express.
I followed links like:
https://www.npmjs.com/package/restify-swagger-jsdoc
https://github.com/bvanderlaan/swagger-ui-restify
Please help me integrate it.
For everyone who's come this far, like me: I assume you're using restify instead of Express and haven't found an easy answer yet. I have an API server using restify, write in TypeScript and compile the files to JavaScript before running my server, all in a Docker container. After a lot of searching I managed to solve my problem as follows:
Install the "swagger-autogen" package.
Create a file called "swagger.ts" with the following commands:
const swaggerAutogen = require('swagger-autogen')()
const outputFile = '../swagger_output.json'
const endpointsFiles = ['../routes/users.router.ts'] // root file where the route starts.
swaggerAutogen(outputFile, endpointsFiles)
.then(() => {
require('./dist/src/app.js') // Your project's root file
})
Go to the files that hold your API routes (where the GET, POST, PATCH, ... handlers live) -- in my case '../routes/users.router.ts' -- and add comments like:
// #swagger.path = "/users"
// #swagger.tags = ['User']
// #swagger.description = 'Endpoint to create a user.'
/*
#swagger.responses[201] = {
schema: { "$ref": "#/definitions/User" },
description: "User was successfully created." }
*/
NOTE: To learn more about these swagger-autogen comment tags see:
https://github.com/davibaltar/swagger-autogen#swagger-20
Run the following command: (In my case, this command is in a script in package.json that is called inside the Dockerfile):
NOTE: Remember that if you don't automatically generate the JavaScript files from your TypeScript sources, you need to create this structure entirely in JavaScript rather than TypeScript. Pay attention to file names and extensions.
node ./dist/src/server/swagger.js // change to your path
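For reference, the corresponding package.json script entry could look roughly like this (the build script is hypothetical; adjust the paths to your own output directory):
"scripts": {
  "build": "tsc",
  "swagger": "node ./dist/src/server/swagger.js"
}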
When you do this, a swagger_output.json file will be created containing your application's details.
After the file has been generated, you need to install the "swagger-ui-restify" package.
Put these commands where the middlewares are (usually at application startup):
const swaggerUi = require("swagger-ui-restify")
const swaggerDocument = require("../swagger_output.json")
const swaggerOptions = {
explorer: true,
baseURL: 'api-docs',
}
app.get("/api-docs"+'/*', ...swaggerUi.serve)
app.get("/api-docs", swaggerUi.setup(swaggerDocument, swaggerOptions))
Now your application has a middleware to consult the automatically generated documentation.
I have an Aurelia app that I host in an Azure App Service. I would like to configure the API endpoint that Aurelia connects to by defining it in an Application Setting. How can I read that setting inside Aurelia?
As the other answers and comments also mention, Aurelia runs as a client-side application and has no knowledge of the backend. So the concept of something like web.config or appsettings.json is not available here without some serious hacks. You don't want to go there.
That being said, of course you can! :) You can pretty much define any settings file you like, similar to the appsettings.json concept of ASP.NET (Core) apps, but in Aurelia.
A great example for this is the Aurelia open source plugin Aurelia-Configuration.
Simple instructions: first add any .json file you like (such as config.json) to your project; a minimal sketch of that file follows the startup code below. Next, register the plugin in your Aurelia startup:
export function configure(aurelia) {
aurelia.use
.standardConfiguration()
.developmentLogging()
.plugin('aurelia-configuration'); // <-- there you go
aurelia.start().then(a => a.setRoot());
}
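The config file itself can be as simple as the flat sketch below; the endpoint value is just a placeholder, and the plugin's README also describes per-environment sections if you need them:
{
  "endpoint": "https://api.example.com/"
}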
Finally, just read out the values using AureliaConfiguration. The sample below illustrates it with dependency injection:
import {inject} from 'aurelia-framework';
import {AureliaConfiguration} from 'aurelia-configuration';
@inject(AureliaConfiguration)
export class MyComponent {
constructor(config) {
this.config = config;
this.config.get('endpoint');
}
}
The README explains it all.
Note: I'm not affiliated with the aurelia-configuration plugin, but just a fan of it.
Isn't Aurelia a JavaScript client framework, i.e. all in the browser, no backend? Application Settings are a server-side thing (a key-value store) in App Service. No backend, no app settings.
Consider this minimal restify backend that returns Application Settings when you call /settings/{app-setting-name}:
var restify = require('restify');
function respond(req, res, next) {
// Returns app setting value.
// Provides zero input validation,
// DO NOT COPY PASTE INTO PROD,
// ALL YOUR BASE WILL BELONG TO US.
res.send(process.env[req.params.setting]);
next();
}
var server = restify.createServer();
server.get('/settings/:setting', respond);
server.head('/settings/:setting', respond);
server.listen(process.env.PORT || 3000, function() {
console.log('restify listening...');
});
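On the Aurelia side you could then read a setting at startup; here is a minimal sketch using the browser's fetch API (the setting name API_ENDPOINT is hypothetical):
fetch('/settings/API_ENDPOINT')
  .then(response => response.json())
  .then(endpoint => {
    // endpoint now holds the value of the App Service application setting
    console.log('API endpoint:', endpoint);
  });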
Hope this all makes more sense now.
My Node.js folder hierarchy looks like the image below:
My folder hierarchy
app.js is the Node.js main file, routes contains the Node.js routes, and src contains the client's public HTML files.
This is the code in app.js:
var express = require('express');
var app = express();
var server = require('http').createServer(app);
global.io = require('socket.io').listen(server);
var compression = require('compression');
var helmet = require('helmet');
var session = require('express-session');
var bodyParser = require('body-parser');
app.use(bodyParser.json()); // support json encoded bodies
app.use(bodyParser.urlencoded({ extended: true })); // support encoded bodies
app.use(express.static(__dirname + '/src'));
app.use(helmet());
app.use(compression());
app.use('/rides', require('./routes/ridesServer'));
app.use('/user', require('./routes/userServer'));
app.use('/offers', require('./routes/offersServer'));
app.use('/notifications', require('./routes/notificationsServer'));
server.listen("8080", function() {
console.log("Connected to db and listening on port 8080");
});
This is one of the API routes, in the routes/userServer.js file:
router.post('/verifytoken', function(req, res, next) {
// some functions here
});
And this is an HTTP request I make from the client side, in the page ride.js:
$.ajax({
method: "POST",
headers: {
"Content-Type": "application/json"
},
url: "user/verifytoken",
data: JSON.stringify(something),
success: function(response) {
// some code
},
error: function(error) {
// some code
}
});
As you can see, client files and Node.js server files are on the same server, and Node.js serves those static files via this command:
app.use(express.static(__dirname + '/src'));
I think that this should be avoided and that there is a better way!
If you are a Node.js expert and familiar with best practices, please tell me whether the following way of working is correct; if it is not, please correct me:
I thought about putting the static files in a public_html directory
and the Node.js files in a server directory under the public_html directory.
Then run pm2 start app.js --watch or node app.js on the app.js located in the server directory, not in public_html.
As a result, the index.html file will be served just like any other static file without any relation to the Node.js server, and Node.js will live in its own folder, not dealing with the client side at all.
In other words, separate Node.js and the static files, and put the Node.js files in a subdirectory rather than the main directory.
Then the HTTP REQUEST will be looking like this:
$.ajax({
method: "POST",
headers: {
"Content-Type": "application/json"
},
url: "server/user/verifytoken",
data: JSON.stringify(something),
success: function(response) {
// some code
},
error: function(error) {
// some code
}
});
Please note that I have added the server directory to the URL.
Furthermore, I can change the
url: "server/user/verifytoken",
to an IP from a remote app (like Ionic):
url: "123.123.123.123:443/server/user/verifytoken",
And then my HTTP requests will be served via HTTPS (because I am sending to port 443), I can create multiple apps on the same server, and I have no struggles with any Node.js/Express static folders.
What do you think?
Thanks!
First, let me say I'm not an expert, but I have 3 years of continuous development of Node.js-based solutions behind me.
In the past I have created solutions mixing client-side code and server-side code in the same project, and it has worked, at least for a while. But in the long run it is a bad idea for many possible reasons. Some of them are:
Client-side code and server-side code may require different processes to produce working code. For example, client-side code may require transpiling from ES6 to the more compatible ES5 using something like gulp or webpack. This is normally not the case for server-side code because the runtime is more targeted.
Mixing client-side code and an API server may prevent you from horizontally scaling one of them without the other.
This is like a monorepo, and having a monorepo without a CI process tailored to this scenario may produce very long development times.
What we currently do at my work is as follow:
Create a separate API server project. This way you can concentrate on developing a good API while working on this specific project. Leave cross-cutting concerns (like authentication) outside the API server.
Create a separate project for your client-side code (an SPA, maybe). Set up your dev environment to proxy API requests to a running API server (which may be running locally); see the proxy sketch after this list.
Create a separate project for the deployment of the entire solution. This project puts together serving the client code, proxying requests to the API, and implementing cross-cutting concerns like authentication.
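For the dev proxy mentioned in the second point, here is a minimal sketch, assuming the http-proxy-middleware package and an API server already running locally on port 3001 (ports and paths are only examples):
// dev-server.js
const express = require('express');
const { createProxyMiddleware } = require('http-proxy-middleware');

const app = express();

// serve the built client bundle
app.use(express.static('dist'));

// forward anything under /api to the separately running API server
app.use('/api', createProxyMiddleware({ target: 'http://localhost:3001', changeOrigin: true }));

app.listen(8080, function() {
  console.log('Client dev server on http://localhost:8080');
});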
Having your code separated in this way makes developing each piece easy and allows fast evolution. But it may introduce some complexities:
This multi-project structure requires you to be able to trigger tests of the whole product each time one of the projects changes.
It surfaces the need for integration testing.
Some other considerations are:
The API server and the website server may run on the same machine but on different ports.
You may secure your API server using SSL (on Node, using the standard https module), but notice that in all cases you need another actor in front of the API server (a website proxying requests to the actual API server, or an API gateway that implements cross-cutting concerns like authentication, rate limiting, etc.). In the past I posed the same question you are asking yourself regarding the appropriateness of using SSL in this scenario, and the answer is here. My answer is: it depends on the deployment conditions.
Is it possible to store templates for an express application in a separate package?
In my use case I'd like to have a shared package containing global templates to give all apps the same look and feel, even though they run as independent entities on other ports or even other servers. Local content templates could live within the app, so all I'm looking for is a way to share that kind of code between multiple apps.
Going a step further, I was thinking about skinning packages which can override the default templates. Once installed on top of the "template package", such a skin could change the look and feel of all applications using the core templates.
Is there a way of doing that without having to give up the comfort of Express?
cu
Roman
This is possible using Express. You can basically mount a whole app object on a specific route (with all of its routes and middleware).
var express = require('express');
var coreApp = express();
var blogApp = express();
var wikiApp = express();
// init blogApp and wikiApp with middleware and routes
coreApp.use('/blog', blogApp);
coreApp.use('/wiki', wikiApp);
Now you can wire your templates into these modular apps and then mount the apps into your core app.
Here's a screencast from the Express creator himself, called Modular Web Applications.
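As for sharing the templates themselves, one possible approach, sketched here under the assumption that the shared package (hypothetically called our-shared-templates) simply exports the path to its views directory, is to point each app's views setting at both a local and the shared directory; Express accepts an array of view directories and searches them in order, so local templates override the shared ones:
// our-shared-templates/index.js -- the shared npm package (name is hypothetical)
const path = require('path');
module.exports = { viewsDir: path.join(__dirname, 'views') };

// in each app, e.g. blogApp:
const shared = require('our-shared-templates');

// local templates first, shared templates as the fallback
blogApp.set('views', [__dirname + '/views', shared.viewsDir]);
blogApp.set('view engine', 'pug'); // whichever engine the shared templates use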