[Part-2] : Comprehensive Technical Guide for Integrating Elasticsearch with Node.js

In the previous part, we dived into what Elasticsearch is and why to use it with Node.js. In this part of the blog, we will go through integrating it with Node.js.

Elasticsearch index

An Elasticsearch index serves as a logical namespace that maps to one or more physical shards, which are the basic unit of scalability in Elasticsearch. Think of an index as a collection of documents that share similar characteristics or belong to the same data category within your Elasticsearch cluster. Each document within an index is a JSON object, and Elasticsearch indexes are highly flexible, allowing you to store, search, and analyze diverse types of data efficiently.

When creating an index in Elasticsearch, you typically define its settings, mappings, and optionally, aliases. Settings include parameters such as the number of shards and replicas, which impact the index’s performance and resilience. Mappings define the schema for documents within the index, specifying the data types and properties for each field.
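For illustration, here is roughly what such a definition could look like for the posts index we build later in this guide. The shard counts and field types below are assumptions for a small demo, not requirements:

// Hypothetical definition for a "posts" index: settings control sharding
// and replication, mappings declare a type for each document field.
// This is the shape of the body you could pass when creating the index.
const postsIndexDefinition = {
  settings: {
    number_of_shards: 1,   // one primary shard is plenty for a demo
    number_of_replicas: 1, // one replica copy for resilience
  },
  mappings: {
    properties: {
      title: { type: "text" },     // analyzed for full-text search
      author: { type: "keyword" }, // exact values, useful for filtering
      content: { type: "text" },
    },
  },
};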

Elastic Cloud setup

When setting up Elasticsearch, you’re presented with two main avenues: self-managed deployment or harnessing the power of Elastic Cloud. This tutorial focuses on demystifying the setup journey specifically for the Elastic Cloud trial. Whether you’re a seasoned developer seeking simplicity or a newcomer eager to explore Elasticsearch’s capabilities, this step-by-step guide aims to streamline your experience with Elastic Cloud, offering insights and instructions to kickstart your exploration of Elasticsearch’s potential.

To get started, create a new Elastic Cloud account or log in to an existing one.

After a successful login, create and configure a new deployment. Copy the resulting deployment credentials, as they will be used later in this guide.

Prerequisite

Before starting, ensure you have Node.js installed along with npm.

Project Setup

Create a new directory for our project and run the following commands in the terminal:

npm init -y
npm install express body-parser dotenv @elastic/elasticsearch        

The installed dependencies are as follows: express provides the web server and routing, body-parser parses incoming JSON request bodies, dotenv loads environment variables from a file, and @elastic/elasticsearch is the official Elasticsearch client for Node.js.

Create a .elastic.env file and paste the deployment credentials along with the Cloud ID, which can be found by opening Deployments in the Elastic Cloud console and selecting your deployment.

ELASTIC_CLOUD_ID="your-cloud-id"
ELASTIC_USERNAME="your-username"
ELASTIC_PASSWORD="your-password"        

Creating an Elasticsearch index

Create an elastic-client.js file where we’ll initialize our Elasticsearch client. Add the following code to it.

const { Client } = require("@elastic/elasticsearch");

require("dotenv").config({ path: ".elastic.env" });

const elasticClient = new Client({
  cloud: {
    id: process.env.ELASTIC_CLOUD_ID,
  },
  auth: {
    username: process.env.ELASTIC_USERNAME,
    password: process.env.ELASTIC_PASSWORD,
  },
});

module.exports = elasticClient;        

The initialized client will be used to connect to our Elastic Cloud deployment and operate on it.
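Optionally, you can verify that the client can actually reach your deployment before moving on. A minimal sketch using the client’s info() method (the file name ping.js is just for illustration):

// ping.js (hypothetical): prints the cluster name if the connection and
// credentials from .elastic.env are valid.
const elasticClient = require("./elastic-client");

elasticClient
  .info()
  .then((info) => console.log("Connected to cluster:", info.cluster_name))
  .catch((err) => console.error("Could not connect to Elasticsearch:", err));

Run it with node ping.js; if it fails, re-check the credentials in .elastic.env.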

Now, create a postIndex.js file that creates a new posts index for storing post data:

const elasticClient = require("./elastic-client");

const createIndex = async (indexName) => {
  await elasticClient.indices.create({ index: indexName });
};

createIndex("posts");        

Run node postIndex.js, which will create the index.
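If you want to double-check the result, a small sketch along these lines (the file name checkIndex.js is made up) asks the cluster whether the index now exists:

// checkIndex.js (hypothetical): prints true once the "posts" index exists.
const elasticClient = require("./elastic-client");

elasticClient.indices
  .exists({ index: "posts" })
  .then((exists) => console.log("posts index exists:", exists))
  .catch(console.error);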

Creating Express routes

We will start by creating app.js, where we’ll initialize our Express application:

const express = require("express");
const bodyParser = require("body-parser");
const elasticClient = require("./elastic-client");

const app = express();

app.use(bodyParser.json());

// Routes for our application

app.listen(8080);        

We’ll add the following routes to app.js. These routes will act as a proxy to our Elasticsearch REST API.

// Route to our homepage

app.get("/", (req, res) => {
  res.redirect("http://localhost:3000/");
});

// Route to create a post

app.post("/create-post", async (req, res) => {
  const result = await elasticClient.index({
    index: "posts",
    document: {
      title: req.body.title,
      author: req.body.author,
      content: req.body.content,
    },
  });

  res.send(result);
});

// Route to delete a post

app.delete("/remove-post", async (req, res) => {
  const result = await elasticClient.delete({
    index: "posts",
    id: req.query.id,
  });

  res.json(result);
});

// Route to search posts

app.get("/search", async (req, res) => {
  const result = await elasticClient.search({
    index: "posts",
    query: { fuzzy: { title: req.query.query } },
  });

  res.json(result);
});

// Route to get all posts

app.get("/posts", async (req, res) => {
  const result = await elasticClient.search({
    index: "posts",
    query: { match_all: {} },
  });

  res.send(result);
});        

We have used the index(), delete(), and search() methods provided by the Elasticsearch client. Note that the /search route uses a fuzzy query, so titles that are close to the search term (for example, off by a typo) will still match.

Run the following command to start your Node application:

node app.js        

Our Node application listens on port 8080. You can create your frontend using React, Angular, Vue, or any other front-end framework and make calls to localhost:8080 to work with the application.
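If you would rather try the API before wiring up a frontend, a small script along these lines works as a smoke test. The file name smoke-test.js and the sample post are made up; it assumes Node 18+ (for the built-in fetch) and that the server is already running:

// smoke-test.js (hypothetical): exercises the create and search routes.
const BASE_URL = "http://localhost:8080";

const run = async () => {
  // Index a new post through our Express proxy
  const created = await fetch(`${BASE_URL}/create-post`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      title: "Hello Elasticsearch",
      author: "Jane Doe",
      content: "Our first indexed post",
    }),
  }).then((res) => res.json());
  console.log("Created document with id:", created._id);

  // Newly indexed documents can take about a second to become searchable,
  // so wait briefly before running the fuzzy search on the title.
  await new Promise((resolve) => setTimeout(resolve, 1500));

  const results = await fetch(`${BASE_URL}/search?query=hello`).then((res) =>
    res.json()
  );
  console.log("Matching posts:", results.hits.hits.length);
};

run().catch(console.error);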

Conclusion

In conclusion, Techlusion stands as a trusted partner in implementing Elasticsearch with Node.js, leveraging our expertise to guide companies through every stage of the implementation process. With a focus on tailored solutions and deep understanding of Elasticsearch and Node.js, we collaborate closely with clients to align technology with business objectives, whether it’s building real-time search engines, optimizing log management, or enhancing content discovery. Our commitment to continuous learning and innovation ensures that we deliver cutting-edge solutions that are scalable, efficient, and future-proof.




