This CLI tool streamlines setting up a Node.js backend project. It scaffolds a project structure, wires up database configurations, and adds commonly needed middleware so you can start developing right away.
✅ Quick Initialization: Generate a project structure with MongoDB, MySQL, or no database.
✅ Modular Add-Ons: Use simple commands to add Docker, Multer, Redis, Nodemailer, and more.
✅ Redis Support: Pre-configured Redis caching for faster APIs.
✅ Developer-Friendly Defaults: Comes with a clean folder structure, middleware, and basic routes.
✅ Extendable: Easy-to-use commands to include advanced features as needed.
- init: initialize a backend project with a MongoDB database, a MySQL database, or none.
- add: add boilerplate code for optional features such as Multer, Docker, MongoDB, MySQL, Nodemailer, and more.
To install the backend-maker-cli package globally, use npm:
npm install -g backend-maker-cli
Once installed, you can run the CLI commands from your terminal.
To initialize your project structure, run:
backend-maker init
This will set up a basic backend project structure with default configurations.
If you want to set up a specific database (either MongoDB or MySQL) right at the start, you can use the following commands:
- For MongoDB:
backend-maker init --mongodb
- For MySQL:
backend-maker init --mysql
These commands will add the necessary database configurations to your project.
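The generated database configs read their connection strings from environment variables. MONGO_URI and MYSQL_URI are the names used in the generated code; the values below are placeholders:

```env
MONGO_URI=mongodb://localhost:27017/myapp
MYSQL_URI=mysql://user:password@localhost:3306/myapp
```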
After you initialize your project, you can add optional features as you require. Use the add command to add features like Multer, Docker, etc.
For example, to add Multer (for file uploading), run:
backend-maker add multer
To generate a Dockerfile for your Node.js project, run:
backend-maker add docker
To learn more about available features, you can use the --help option:
backend-maker add --help
This will display a list of all the available features you can add to your project, including options for Docker, MongoDB, and more.
This package now supports Redis for caching and improving API performance.
Configuration is located in:
- Connection setup: ./config/redis.js
- Caching APIs: ./utils/redis-cache.js
Add the following variables to your .env file:
REDIS_HOST=127.0.0.1
REDIS_PORT=6379
REDIS_PASSWORD=your_redis_password

Redis allows you to cache API responses, significantly boosting performance by reducing redundant database queries and computations.
You can choose between the official redis package and the ioredis package for Redis integration. When setting up Redis, an interactive CLI guides you through selecting your preferred package.
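The caching utilities enable the classic cache-aside pattern: check the cache first, and only fall through to the database on a miss. The sketch below mirrors the setCache/getCache API from ./utils/redis-cache.js, but backs it with an in-memory Map so it runs without a Redis server; swap in the real utilities in a generated project.

```javascript
// In-memory stand-ins for the Redis-backed utilities (illustrative only;
// the real setCache also applies an EX ttl on the key).
const store = new Map();
const setCache = async (key, value, ttl = 3600) => {
  store.set(key, JSON.stringify(value));
};
const getCache = async (key) => {
  const data = store.get(key);
  return data ? JSON.parse(data) : null;
};

let dbHits = 0; // counts how often we fall through to the "database"
const fetchUserFromDb = async (id) => {
  dbHits += 1;
  return { id, name: 'Alice' }; // stand-in for a slow query
};

// Cache-aside: try the cache first, populate it on a miss
const getUser = async (id) => {
  const cached = await getCache(`user:${id}`);
  if (cached) return cached;
  const user = await fetchUserFromDb(id);
  await setCache(`user:${id}`, user);
  return user;
};
```

Calling `getUser(1)` twice results in a single database hit; the second call is served from the cache.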
This feature provides an easy and secure way to implement Google OAuth 2.0 for user authentication in your application. With this integration, users can sign in using their Google accounts, reducing the need for creating separate credentials for your application.
Key Features:
• Secure Authentication: Allows users to log in via their Google accounts, leveraging OAuth 2.0 for enhanced security.
• Session Management: Automatically manages user sessions with serialization and deserialization logic.
• Customizable Routes: Predefined routes for login, callback, profile, and logout, making it easy to plug and play.
USAGE
backend-maker add passport-google
Integrate with Your App
const { configureGooglePassport, initializeGoogleAuth } = require('./middlewares/passport-google');
const googleRoutes = require('./routes/Google-0Auth-SampleRoutes');
const app = express();
configureGooglePassport();
initializeGoogleAuth(app);
app.use('/auth', googleRoutes);
Passport-Local, Passport-GitHub, and Passport-Facebook strategies are also available ✅
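Once the strategy is wired up, routes that require a signed-in user can be protected with a small guard middleware. This is a hedged sketch: `req.isAuthenticated()` is provided by Passport's session support, while the `/auth/google` login path assumes the sample routes are mounted under `/auth` as shown above.

```javascript
// Guard for routes that require an authenticated user.
// req.isAuthenticated() is added by Passport's session middleware;
// '/auth/google' is the assumed login path from the mounted sample routes.
const ensureAuthenticated = (req, res, next) => {
  if (req.isAuthenticated && req.isAuthenticated()) return next();
  return res.redirect('/auth/google'); // not signed in: start the OAuth flow
};

module.exports = ensureAuthenticated;
```

Use it like `app.get('/profile', ensureAuthenticated, handler);`.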
To initialize Docker configurations for your project, run:
docker init
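For reference, a minimal Dockerfile for a Node.js backend of this shape might look like the following. This is an illustrative sketch, not necessarily the exact file that gets generated; the base image version is an assumption:

```dockerfile
# Base image version is an assumption
FROM node:18-alpine
WORKDIR /app
# Copy manifests first so dependency install is cached between builds
COPY package*.json ./
RUN npm install --production
COPY . .
# app.js listens on port 8000 by default
EXPOSE 8000
CMD ["node", "app.js"]
```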
Here is the folder structure for the generated project:
my-backend-project/
│
├── config/                 # Database, middleware, and general config files
│   ├── dbMongo.js          # Database connection logic
│   ├── multer.js           # Multer configuration for file uploads
│   ├── redis.js            # Redis configuration for caching
│   ├── cloudinary.js       # Cloudinary configuration for image and video uploads
│
├── constants/              # Constants for your application
│
├── controllers/            # Controllers for your routes
│   ├── authController.js   # Example controller
│
├── models/                 # Database models (e.g., User, Product)
│   ├── authModel.js        # Example model
│
├── middlewares/
│   ├── authMiddleware.js
│   ├── passport-google.js  # Passport Google strategy
│   ├── passport-github.js  # Passport GitHub strategy
│
├── routes/                 # Routes for your application
│   ├── sampleRoutes.js     # Example routes for user-related endpoints
│
├── public/                 # Public folder (for uploaded files)
│   └── uploads/            # Folder for storing file uploads
│
├── utils/                  # Utility functions and helpers
│   ├── emailSender.js      # Email sender utility
│   ├── redis-cache.js      # Redis caching utility
│
├── .env                    # Environment variables (Email, DB credentials, etc.)
├── .gitignore              # Git ignore file to prevent sensitive files from being tracked
├── app.js                  # Main server file to start your application
├── package.json            # NPM package configuration
├── Dockerfile              # Dockerfile for containerizing your Node app
Here are some of the core files of the generated structure:

app.js (main server file):
const express = require('express');
const app = express();
const dotenv = require('dotenv');
dotenv.config();
app.use(express.json());
app.use(express.urlencoded({ extended: true }));
const cors = require('cors');
app.use(cors());
const connectDB = require('./config/dbMongo');
connectDB();
const authMiddleware = require('./middlewares/authMiddleware');
const sampleRoutes = require('./routes/sampleRoutes.js');
const PORT = process.env.PORT || 8000;
app.get('/', (req, res) => res.send('HELLO WORLD'));
app.use('/api', sampleRoutes);
app.listen(PORT, () => {
console.log(`Server is running at http://localhost:${PORT}`);
});

config/dbMongo.js (MongoDB connection logic):
const mongoose = require('mongoose');
const dotenv = require('dotenv');
dotenv.config()
async function connectToDb() {
  try {
    await mongoose.connect(process.env.MONGO_URI);
    console.log('Connected to MongoDB');
  } catch (err) {
    console.error(`Error: ${err.message}`);
    process.exit(1);
  }
}
module.exports = connectToDb;

Ensure you define the following in your .env file:
MONGO_URI=mongodb://your-mongo-uri

config/multer.js (file upload configuration):
const multer = require('multer');
const storage = multer.diskStorage({
destination: function (req, file, cb) {
cb(null, 'public/uploads/');
},
filename: function (req, file, cb) {
cb(null, Date.now() + '-' + file.originalname);
}
});
const fileSizeLimit = 1024 * 1024 * 100; // 100MB file size limit
const upload = multer({
storage: storage,
limits: {
fileSize: fileSizeLimit,
},
fileFilter: function (req, file, cb) {
if (!file.mimetype.startsWith('image/')) {
return cb(new Error('Only image files are allowed.'));
}
cb(null, true);
}
});
module.exports = upload;

middlewares/authMiddleware.js (JWT authentication):
const jwt = require('jsonwebtoken');
const authMiddleware = (req, res, next) => {
const authHeader = req.headers['authorization'];
const token = authHeader && authHeader.split(' ')[1]; // Expect "Bearer <token>"
if (!token) return res.status(401).send({ error: 'Access Denied' });
try {
const verified = jwt.verify(token, process.env.JWT_SECRET);
req.user = verified;
next();
} catch (err) {
res.status(400).send({ error: 'Invalid Token' });
}
};
module.exports = authMiddleware;

utils/redis-cache.js (Redis caching utility):
const redis = require('../config/redis');
const messages = require('../constants/redisMessages');
// Function to set a key in Redis with expiration
const setCache = async (key, value, ttl = 3600) => {
try {
await redis.set(key, JSON.stringify(value), 'EX', ttl);
console.log(messages.CACHE_SET_SUCCESS + " " + key);
} catch (err) {
console.error(messages.SET_CACHE_ERROR + " " + key, err);
}
};
// Function to get a key from Redis
const getCache = async (key) => {
try {
const data = await redis.get(key);
return data ? JSON.parse(data) : null;
} catch (err) {
console.error(messages.GET_CACHE_ERROR + " " + key, err);
return null;
}
};
// Function to delete a key from Redis
const deleteCache = async (key) => {
try {
await redis.del(key);
console.log(messages.CACHE_DELETED_SUCCESS + " " + key);
} catch (err) {
console.error(messages.DELETE_CACHE_ERROR + " " + key, err);
}
};
module.exports = { setCache, getCache, deleteCache };

utils/emailSender.js (Nodemailer email utility):
const nodemailer = require('nodemailer');
require('dotenv').config();
const transporter = nodemailer.createTransport({
service: 'gmail',
auth: {
user: process.env.EMAIL, // Make sure to define EMAIL and PASS in your .env file
pass: process.env.EMAIL_PASSWORD,
},
});
const sendEmail = (to, subject, text, html) => {
const mailOptions = {
from: process.env.EMAIL,
to: to,
subject: subject,
text: text,
html: html,
};
return new Promise((resolve, reject) => {
transporter.sendMail(mailOptions, (error, info) => {
if (error) {
return reject(error);
}
resolve(info);
});
});
};
module.exports = sendEmail;

MySQL connection with Sequelize:
const { Sequelize } = require("sequelize");
require("dotenv").config(); // MYSQL_URI is read from .env
// Configure Sequelize instance with options
const sequelize = new Sequelize(process.env.MYSQL_URI, {
dialect: "mysql",
logging: process.env.SEQUELIZE_LOGGING === "true" ? console.log : false, // Toggle query logging
pool: {
max: 5, // Maximum number of connections in pool
min: 0, // Minimum number of connections in pool
acquire: 30000, // Maximum time (ms) to get a connection
idle: 10000, // Maximum time (ms) a connection can be idle
},
retry: {
max: 3, // Retry connection attempts
},
});
// Async function to connect to the database
const connectDB = async () => {
try {
await sequelize.authenticate();
console.log("✅ MySQL Connected successfully.");
} catch (err) {
console.error("❌ Unable to connect to the database. Please check the connection details:", err.message);
// Retry mechanism for better fault tolerance
console.error("⚠️ Retrying database connection...");
setTimeout(async () => {
try {
await sequelize.authenticate();
console.log("✅ MySQL Reconnected successfully after retry.");
} catch (retryErr) {
console.error("❌ Retry failed. Exiting application:", retryErr.message);
process.exit(1); // Exit the application in case of persistent failure
}
}, 5000);
}
};
// Export both the Sequelize instance and the connection function
module.exports = { sequelize, connectDB };

Feel free to open issues or pull requests for improvements. Suggestions are always welcome!
License: MIT