Entity inheritance (Concrete Table Inheritance)
You can reduce duplication in your code by using entity inheritance. For example, suppose you have Photo, Question, and Post entities that all share common columns: id, title, and description. To reduce duplication and produce a better abstraction, you can create a base class called Content that holds those columns and have each entity extend it. All columns (as well as relations, embeds, etc.) from parent entities (a parent can itself extend another entity) are inherited and created in the final entities.
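A sketch of this pattern, following the example in the TypeORM documentation (the child-specific columns size, answersCount, and viewCount are illustrative):

```typescript
import { Entity, PrimaryGeneratedColumn, Column } from 'typeorm';

// Abstract base class: not an entity itself, but its columns are
// inherited by every entity that extends it.
export abstract class Content {
  @PrimaryGeneratedColumn()
  id: number;

  @Column()
  title: string;

  @Column()
  description: string;
}

@Entity()
export class Photo extends Content {
  @Column()
  size: string;
}

@Entity()
export class Question extends Content {
  @Column()
  answersCount: number;
}

@Entity()
export class Post extends Content {
  @Column()
  viewCount: number;
}
```

Here separate photo, question, and post tables are created, and each one gets its own id, title, and description columns in addition to its specific ones.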
TypeORM: simple-json column type
There is a special column type called simple-json which can store any value that can be serialized with JSON.stringify. It is very useful when your database has no native json type but you still want to store and load objects without any hassle. For example, an object such as { name: "John", nickname: "Malkovich" } will be stored in a single database column as the string {"name":"John","nickname":"Malkovich"}. When you load the data from the database, you get your object/array/primitive back via JSON.parse.
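Declaring such a column looks like @Column("simple-json") profile: { name: string; nickname: string }. The round trip TypeORM performs for it can be sketched without a database at all (plain TypeScript, no TypeORM required):

```typescript
// Sketch of the round trip a simple-json column performs.
// No database involved: we just mimic store (stringify) and load (parse).
const profile = { name: 'John', nickname: 'Malkovich' };

const storedValue = JSON.stringify(profile);  // what lands in the column
const loadedValue = JSON.parse(storedValue);  // what the entity gets back

console.log(storedValue);           // {"name":"John","nickname":"Malkovich"}
console.log(loadedValue.nickname);  // Malkovich
```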
Entities & Decorators
Creating entities and tables using decorators is a fundamental part of defining your data model and database schema in TypeORM. TypeORM is an Object-Relational Mapping (ORM) library that lets you define your database structure using TypeScript classes and decorators, making it easier to work with databases in an object-oriented manner. Here's how you can create an entity using decorators:

import { Entity, PrimaryGeneratedColumn, Column } from 'typeorm';

@Entity()
export class User {
  @PrimaryGeneratedColumn()
  id: number;

  @Column()
  name: string;
}

By default the table name is derived from the class name; you can override it by passing a name to the decorator:

@Entity({ name: 'custom_users' })
export class User {
  // ...
}

Commands to generate and run migrations with the TypeORM CLI:

npx typeorm migration:generate -n InitialMigration
npx typeorm migration:run

Defining entities using decorators simplifies the process of setting up your database schema: you don't need to write raw SQL queries. Instead, you define your data model using TypeScript classes and decorators, and TypeORM translates this into the SQL statements that create tables and columns.
Security Considerations
Security is a critical aspect of web applications, and Express.js provides various tools and practices to enhance the security of your applications. Here are some key security considerations and practices, including password hashing and JSON Web Tokens (JWT):

1. Password Hashing: Storing passwords in plain text is a significant security risk. Instead, hash passwords with a strong, slow cryptographic algorithm before storing them in the database. The bcrypt library is commonly used for password hashing in Node.js applications.

const bcrypt = require('bcrypt');

const plaintextPassword = 'myPassword';
const saltRounds = 10;

bcrypt.hash(plaintextPassword, saltRounds, (err, hash) => {
  if (err) throw err;
  // Store 'hash' in the database
});

2. Salting: Salting adds random data to the password before hashing to defeat precomputed (rainbow-table) attacks. bcrypt handles salt generation and storage for you automatically.

3. User Authentication and Sessions: Implement user authentication so that only authorized users can access certain parts of your application. Libraries such as Passport.js simplify this process in Express.

4. JWT (JSON Web Tokens): JWTs are a compact, signed way to transmit information between parties as a JSON object. They can be used for authentication, authorization, and more. The jsonwebtoken library is commonly used for working with JWTs in Node.js.

const jwt = require('jsonwebtoken');

const secretKey = 'mySecretKey';
const payload = { userId: 123 };

const token = jwt.sign(payload, secretKey, { expiresIn: '1h' });

// Later, verify and decode the token
jwt.verify(token, secretKey, (err, decoded) => {
  if (err) throw err;
  console.log(decoded.userId); // Access the payload
});

5. Input Validation: Always validate and sanitize user input to prevent vulnerabilities such as SQL injection and cross-site scripting (XSS). Libraries like express-validator can help with this.

6. CSRF (Cross-Site Request Forgery) Protection: Use strategies to prevent CSRF attacks, where unauthorized commands are transmitted from a user the web application trusts. Middleware like csurf can mitigate CSRF risks.

7. HTTP Security Headers: Set appropriate HTTP security headers to mitigate attacks like clickjacking, content sniffing, and XSS. Libraries like helmet make it easy to set these headers.

8. CORS (Cross-Origin Resource Sharing): Properly configure CORS to control which origins can access your resources. The cors middleware manages these settings.

9. SQL Injection Prevention: Use parameterized queries or an ORM (Object-Relational Mapping) library like Sequelize to prevent SQL injection attacks.

10. Error Handling: Avoid exposing sensitive information in error responses. Implement centralized error handling and logging.

11. HTTPS: Always use HTTPS to encrypt data transmitted between the client and the server, especially for sensitive information like passwords and tokens.

12. Dependencies and Security Updates: Regularly update your application's dependencies to pick up security patches.

13. Third-Party Libraries: Be cautious when using third-party libraries. Only use well-maintained, reputable packages from trusted sources.

Implementing these security practices in your Express.js application will significantly enhance its security posture and protect it from common vulnerabilities and attacks.
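For a dependency-free illustration of the salted-hashing idea from points 1 and 2, Node's built-in crypto module offers scrypt. This is a sketch of the concept, not a replacement for a vetted library like bcrypt, and the hashPassword/verifyPassword names are my own:

```typescript
import { scryptSync, randomBytes, timingSafeEqual } from 'crypto';

// Hash: generate a random salt, derive a 64-byte key, store both together.
function hashPassword(password: string): string {
  const salt = randomBytes(16).toString('hex');
  const derived = scryptSync(password, salt, 64).toString('hex');
  return `${salt}:${derived}`;
}

// Verify: re-derive with the stored salt and compare in constant time.
function verifyPassword(password: string, stored: string): boolean {
  const [salt, derived] = stored.split(':');
  const candidate = scryptSync(password, salt, 64);
  return timingSafeEqual(candidate, Buffer.from(derived, 'hex'));
}

const stored = hashPassword('myPassword');
console.log(verifyPassword('myPassword', stored));    // true
console.log(verifyPassword('wrongPassword', stored)); // false
```

Because the salt is random, two users with the same password still get different stored hashes.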
Promises and async/await
Working with Promises and async/await in Node.js provides a more organized and readable way to handle asynchronous operations than traditional callback-based approaches. Promises and async/await simplify the control flow and error handling of asynchronous code, making it easier to understand and maintain. Let's explore both concepts.

Promises: A Promise represents a value that may be available now, in the future, or never (if the operation fails). It has three states: pending, fulfilled, or rejected. Promises provide a clear structure for handling asynchronous operations.

Creating a Promise:

function asyncOperation() {
  return new Promise((resolve, reject) => {
    // Perform asynchronous operation
    if (errorOccurred) {
      reject(new Error('An error occurred'));
    } else {
      resolve(result);
    }
  });
}

Using Promises:

asyncOperation()
  .then(data => {
    console.log('Data:', data);
  })
  .catch(err => {
    console.error('Error:', err.message);
  });
Promises enable chaining multiple asynchronous operations together with .then() and handling errors with .catch().

async/await: async/await is a modern syntax that lets you write asynchronous code that looks and behaves like synchronous code. It makes asynchronous operations appear linear, which improves readability.

Using async/await:

async function run() {
  try {
    const data = await asyncOperation();
    console.log('Data:', data);
  } catch (err) {
    console.error('Error:', err.message);
  }
}

run();

With async/await, the await keyword pauses the execution of the function until the asynchronous operation completes. It provides a more intuitive way to handle promises without excessive nesting.

Error Handling: Both approaches provide clear ways to handle errors: with Promises you attach a .catch() handler, and with async/await you wrap the awaited calls in try/catch, as shown above.

Handling Multiple Promises Concurrently: Promise.all() runs multiple asynchronous operations concurrently and resolves once all of them have resolved (or rejects as soon as any one of them rejects).

const promises = [asyncOperation1(), asyncOperation2(), asyncOperation3()];

Promise.all(promises)
  .then(results => {
    console.log('Results:', results);
  })
  .catch(err => {
    console.error('Error:', err.message);
  });

Using async/await with Multiple Promises: async/await can also express this, but note that awaiting the operations one after another runs them sequentially rather than concurrently.
async function run() {
  try {
    const result1 = await asyncOperation1();
    const result2 = await asyncOperation2();
    const result3 = await asyncOperation3();
    console.log('Results:', result1, result2, result3);
  } catch (err) {
    console.error('Error:', err.message);
  }
}

run();

(To run the three operations concurrently with async/await, start them first and then await them together, e.g. const results = await Promise.all([asyncOperation1(), asyncOperation2(), asyncOperation3()]).)

Promise.race: Promise.race takes an array of promises as its argument and returns a new promise that settles as soon as the first promise in the array settles. The value or reason of that first settled promise is used to resolve or reject the resulting promise.

const promise1 = asyncOperation1();
const promise2 = asyncOperation2();
const promise3 = asyncOperation3();

Promise.race([promise1, promise2, promise3])
  .then(firstResolvedValue => {
    console.log('First promise resolved:', firstResolvedValue);
  })
  .catch(firstRejectedReason => {
    console.error('First promise rejected:', firstRejectedReason);
  });

In this example, Promise.race settles based on the first promise that settles (either resolves or rejects).

Use Cases: Both Promise.all and Promise.race provide ways to efficiently manage multiple asynchronous operations and handle their results or errors. Using Promises and async/await improves code readability, error handling, and control flow when dealing with asynchronous operations in Node.js; choose the approach that best suits your application's requirements and complexity.
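As a runnable sketch of both combinators (delay is a hypothetical stand-in for the asyncOperation helpers above):

```typescript
// delay resolves with `value` after `ms` milliseconds.
const delay = (ms: number, value: string) =>
  new Promise<string>(resolve => setTimeout(() => resolve(value), ms));

async function main() {
  // Promise.all: waits for every promise; results keep input order.
  const all = await Promise.all([delay(30, 'a'), delay(10, 'b'), delay(20, 'c')]);
  console.log('Results:', all); // Results: [ 'a', 'b', 'c' ]

  // Promise.race: settles with the first promise to finish.
  const first = await Promise.race([delay(30, 'slow'), delay(10, 'fast')]);
  console.log('First promise resolved:', first); // First promise resolved: fast

  return { all, first };
}

main();
```

Note that Promise.all preserves the order of the input array even though 'b' finishes before 'a'.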
Asynchronous Control Flow: Using callbacks effectively in Node.js
Asynchronous control flow in Node.js involves managing the execution order of asynchronous operations so that they occur in the desired sequence. Callbacks are a core concept here: they let you specify what should happen after an asynchronous operation completes. Using callbacks effectively is crucial to avoid callback hell and to keep code maintainable and readable.

Basic Callback Pattern: The basic pattern is to pass a callback function as an argument to an asynchronous function. The callback is executed once the operation completes, conventionally with an error as the first argument and the result as the second.

function asyncOperation(arg, callback) {
  // Perform asynchronous operation
  if (errorOccurred) {
    callback(new Error('An error occurred'));
  } else {
    callback(null, result);
  }
}

asyncOperation('input', (err, data) => {
  if (err) {
    console.error('Error:', err.message);
    return;
  }
  console.log('Data:', data);
});

Avoiding Callback Hell: Callback hell occurs when you have multiple nested callbacks, producing code that is hard to read and maintain. To avoid it, use named functions or modularize your code by breaking callbacks into separate functions.

function step1(callback) {
  // ...
  callback(null, result1);
}

function step2(arg, callback) {
  // ...
  callback(null, result2);
}

function step3(arg, callback) {
  // ...
  callback(null, result3);
}

step1((err1, result1) => {
  if (err1) {
    console.error('Error:', err1);
    return;
  }
  step2(result1, (err2, result2) => {
    if (err2) {
      console.error('Error:', err2);
      return;
    }
    step3(result2, (err3, result3) => {
      if (err3) {
        console.error('Error:', err3);
        return;
      }
      console.log('Done:', result3);
    });
  });
});

Modularization and Control Flow Libraries: To simplify managing asynchronous control flow and avoid callback hell, you can use libraries like async, or native features like Promises and async/await.
// Using Promises
function asyncOperation(arg) {
  return new Promise((resolve, reject) => {
    // ...
    if (errorOccurred) {
      reject(new Error('An error occurred'));
    } else {
      resolve(result);
    }
  });
}

async function run() {
  try {
    const data = await asyncOperation('input');
    console.log('Data:', data);
  } catch (err) {
    console.error('Error:', err.message);
  }
}

run();

Effectively using callbacks, and understanding the various control flow options in Node.js, is key to writing maintainable and readable asynchronous code.
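The standard library can also bridge the two styles: util.promisify wraps any error-first callback function so that it returns a Promise instead (asyncOperation below is a hypothetical function following that convention):

```typescript
import { promisify } from 'util';

// An error-first callback API, as in the examples above.
function asyncOperation(
  arg: string,
  callback: (err: Error | null, result: string) => void
) {
  setImmediate(() => callback(null, arg.toUpperCase()));
}

// promisify turns it into a Promise-returning function.
const asyncOperationP = promisify(asyncOperation);

async function run() {
  const data = await asyncOperationP('input');
  console.log('Data:', data); // Data: INPUT
  return data;
}

run();
```

This works for any function whose last argument is an (err, result) callback, which includes most of the callback-style Node core APIs.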
events module in Node.js
The events module in Node.js provides an event-driven architecture that allows you to work with and manage custom events and event listeners within your applications. This module is essential for building applications that respond to asynchronous events such as user interactions, data updates, and more. You use it to create your own event emitters and listeners, allowing different parts of your application to communicate and react to specific events.

Creating and Using Event Emitters: To create an event emitter, create an instance of the EventEmitter class:

const EventEmitter = require('events');

class MyEmitter extends EventEmitter {}

const myEmitter = new MyEmitter();

You can then emit events using the emit() method and attach listeners to those events using the on() method:

myEmitter.on('customEvent', (arg) => {
  console.log('Custom event occurred with argument:', arg);
});

myEmitter.emit('customEvent', 'Hello, world!');

In this example, a custom event named 'customEvent' is emitted with the string 'Hello, world!' as an argument. The attached listener responds to the event and logs the provided argument.

The 'error' Event: One event name is treated specially: if an 'error' event is emitted and no listener is attached for it, the error is thrown. It is therefore good practice to attach an 'error' listener:

myEmitter.on('error', (err) => {
  console.error('Error occurred:', err);
});

myEmitter.emit('error', new Error('Something went wrong'));

Removing Event Listeners: To remove an event listener, use the removeListener() method (or its alias off()). This is particularly useful when you want to manage the lifecycle of event listeners or ensure that listeners are no longer active once they are no longer needed.
const listener = () => {
  console.log('Listener triggered.');
};

myEmitter.on('customEvent', listener);

myEmitter.emit('customEvent'); // Triggers the listener
myEmitter.removeListener('customEvent', listener);
myEmitter.emit('customEvent'); // Listener is not triggered

The events module is a fundamental part of building event-driven and reactive applications in Node.js. It allows you to create well-structured and modular code that responds to asynchronous events in a clear and organized manner.
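A related convenience worth knowing is once(), which registers a listener that is removed automatically after its first invocation. A small runnable sketch:

```typescript
import { EventEmitter } from 'events';

const emitter = new EventEmitter();
let calls = 0;

// once() attaches a listener that fires at most one time.
emitter.once('ready', () => {
  calls += 1;
});

emitter.emit('ready'); // triggers the listener, which is then removed
emitter.emit('ready'); // no listener left, nothing happens

console.log('Listener ran', calls, 'time(s)');     // Listener ran 1 time(s)
console.log(emitter.listenerCount('ready'));       // 0
```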
fs module for file system operations in Node.js
The fs module in Node.js is a built-in module that provides functionality for interacting with the file system. It allows you to perform various operations on files and directories, such as reading and writing files, creating directories, deleting files, and more. The module provides both synchronous and asynchronous versions of most operations.

Writing Files, Asynchronous Version:

const fs = require('fs');

fs.writeFile('file.txt', 'Hello, world!', 'utf8', (err) => {
  if (err) {
    console.error('Error writing file:', err);
    return;
  }
  console.log('File written successfully.');
});

Writing Files, Synchronous Version:

const fs = require('fs');

try {
  fs.writeFileSync('file.txt', 'Hello, world!', 'utf8');
  console.log('File written successfully.');
} catch (err) {
  console.error('Error writing file:', err);
}

The other operations follow the same pattern: reading files (fs.readFile / fs.readFileSync), reading and writing streams (fs.createReadStream / fs.createWriteStream), creating directories (fs.mkdir), deleting files or directories (fs.unlink / fs.rmdir), and checking for existence (fs.existsSync). When using the asynchronous methods, you provide a callback function that is executed once the operation completes. Synchronous methods block execution until the operation finishes, so they are best avoided when you want to keep your application responsive. The fs module is a powerful tool for handling file-related tasks in Node.js applications and is widely used for file I/O, data persistence, and other scenarios that require interaction with the file system.
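Modern Node also ships a Promise-based variant, fs.promises, which pairs naturally with async/await. A small self-contained sketch that writes and reads back a file inside a temporary directory:

```typescript
import { promises as fsp } from 'fs';
import * as os from 'os';
import * as path from 'path';

async function demo(): Promise<string> {
  // Create a unique temporary directory so the example is self-contained.
  const dir = await fsp.mkdtemp(path.join(os.tmpdir(), 'fs-demo-'));
  const file = path.join(dir, 'file.txt');

  await fsp.writeFile(file, 'Hello, world!', 'utf8');
  const contents = await fsp.readFile(file, 'utf8');

  // Clean up the temporary directory and its contents.
  await fsp.rm(dir, { recursive: true, force: true });
  return contents;
}

demo().then(contents => console.log('File contents:', contents));
```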
http & https modules in Node
In Node.js, the http and https modules are core modules that provide functionality for building web servers. These modules allow you to create, configure, and manage HTTP and HTTPS servers, making it possible to serve web content, handle requests, and interact with clients. Here's an overview of both modules.

http Module: The http module is used to create and manage HTTP servers. It allows you to listen for incoming HTTP requests, handle them, and send appropriate responses back to the clients.

Creating an HTTP Server:

const http = require('http');

const server = http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Hello, world!\n');
});

server.listen(3000, () => {
  console.log('Server is listening on port 3000');
});

In this example, we create an HTTP server that listens on port 3000. When a request is received, it responds with a plain text message.

https Module: The https module is similar to the http module but is used for creating and managing HTTPS servers, which use SSL/TLS encryption to secure data transmission.

Creating an HTTPS Server:

const https = require('https');
const fs = require('fs');

const options = {
  key: fs.readFileSync('private-key.pem'),
  cert: fs.readFileSync('public-cert.pem')
};

const server = https.createServer(options, (req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Secure Hello, world!\n');
});

server.listen(443, () => {
  console.log('Server is listening on port 443');
});

In this example, we create an HTTPS server using SSL/TLS certificates. It listens on port 443 and responds with a secure plain text message. Both the http and https modules follow a similar pattern for creating servers and handling requests. They provide events and methods to customize how requests are handled, enabling you to build more complex web applications.
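To see the request/response cycle end to end, the same kind of HTTP server can be exercised with http.get in one process. Listening on port 0 asks the OS for any free port, and fetchOnce is a name introduced just for this sketch:

```typescript
import * as http from 'http';
import { AddressInfo } from 'net';

// Start a server on a free port, make one GET request, then shut down.
function fetchOnce(): Promise<string> {
  return new Promise((resolve, reject) => {
    const server = http.createServer((req, res) => {
      res.writeHead(200, { 'Content-Type': 'text/plain' });
      res.end('Hello, world!\n');
    });
    server.listen(0, () => {
      const { port } = server.address() as AddressInfo;
      http.get({ port, path: '/' }, res => {
        let body = '';
        res.on('data', chunk => (body += chunk));
        res.on('end', () => {
          server.close();
          resolve(body);
        });
      }).on('error', reject);
    });
  });
}

fetchOnce().then(body => console.log('Response body:', body));
```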
These modules are foundational when building web servers in Node.js, and they can be used to create APIs, serve static files, handle authentication, and more. Keep in mind that in production environments, you might want to consider using additional libraries or frameworks (like Express.js) to simplify and enhance your web server setup.
File System: fs (read, write, create dir, R/W streams)
Asynchronous file operations in Node.js involve reading from or writing to files without blocking the execution of other code. These operations are crucial when working with potentially large files, or when you want your application to remain responsive while handling file I/O. Node.js's built-in fs (file system) module provides functions for all of these operations.

Reading Files: The primary method for asynchronously reading files is fs.readFile(). It takes the file path and an optional encoding, and provides the contents via a callback function. If an encoding is provided, the contents are returned as a string; otherwise, a Buffer is returned.

const fs = require('fs');

fs.readFile('example.txt', 'utf8', (err, data) => {
  if (err) {
    console.error('Error reading file:', err);
    return;
  }
  console.log('File contents:', data);
});

Writing Files: Asynchronous writing is done with fs.writeFile(). It takes the file path, the data to write, an optional encoding, and a callback that is triggered once the write operation finishes.
const fs = require('fs');

const content = 'This is the content to write to the file.';

fs.writeFile('output.txt', content, 'utf8', err => {
  if (err) {
    console.error('Error writing file:', err);
    return;
  }
  console.log('File written successfully.');
});

Creating Directories: You can create directories with fs.mkdir(), which takes the directory path and a callback function.

const fs = require('fs');

fs.mkdir('new-directory', err => {
  if (err) {
    console.error('Error creating directory:', err);
    return;
  }
  console.log('Directory created successfully.');
});

Checking if a File or Directory Exists: To check whether a file or directory exists, you can use fs.existsSync().

const fs = require('fs');

if (fs.existsSync('example.txt')) {
  console.log('File exists.');
} else {
  console.log('File does not exist.');
}

Reading and Writing Streams: Streams are a more memory-efficient way to read and write files asynchronously, especially large files, because they process the data in smaller chunks. The fs.createReadStream() and fs.createWriteStream() functions are used for this purpose.

const fs = require('fs');

const readStream = fs.createReadStream('largeFile.txt', 'utf8');
const writeStream = fs.createWriteStream('output.txt', 'utf8');

readStream.pipe(writeStream); // Pipe data from read stream to write stream

Error Handling: As with any asynchronous operation, proper error handling is important. All of the callback-based methods above pass an error object as the first parameter of the callback, allowing you to handle any errors that occur during the file operation.
When you use asynchronous file operations, your application remains responsive and efficient, especially for tasks that involve reading from or writing to files that could take time. Working with the file system in Node.js is powerful, but it's important to handle errors properly and to close open resources (such as streams) when you're done with them. The examples above use callback functions, but you can also use Promises or async/await for more readable and maintainable code.
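As a runnable sketch of the stream-copy pattern, the following uses stream.pipeline (which, unlike .pipe(), delivers errors from any stream in the chain to a single callback) and a temporary directory so the example cleans up after itself:

```typescript
import * as fs from 'fs';
import * as os from 'os';
import * as path from 'path';
import { pipeline } from 'stream';

// Set up a temporary source file so the sketch is self-contained.
const dir = fs.mkdtempSync(path.join(os.tmpdir(), 'stream-demo-'));
const source = path.join(dir, 'largeFile.txt');
const target = path.join(dir, 'output.txt');
fs.writeFileSync(source, 'This is the content to write to the file.', 'utf8');

// Copy source to target chunk by chunk; any error from either stream
// or from the transfer itself arrives in the single final callback.
const done = new Promise<string>((resolve, reject) => {
  pipeline(
    fs.createReadStream(source, 'utf8'),
    fs.createWriteStream(target),
    err => (err ? reject(err) : resolve(fs.readFileSync(target, 'utf8')))
  );
});

done.then(copied => console.log('Copied:', copied));
```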