Leveraging PostgreSQL as a Message Broker in Microservices


Chapter 1: Introduction to PostgreSQL in Microservices

In a previous discussion, we explored how to use PostgreSQL's Listen/Notify feature for implementing a Publish/Subscribe communication pattern, specifically using Go-based microservices for both producing and consuming data. In this article, we will expand upon that by showcasing PostgreSQL's role as a message broker within a multi-language microservices framework, particularly concentrating on the data consumer aspect implemented in Python.

Multi-Language Microservices

Microservices architecture allows applications to be structured as a set of independently deployable and loosely coupled services. Each service is centered around a specific business function, termed a bounded context in Domain-Driven Design, and is managed by a small, dedicated team. Different teams may choose different technologies based on their expertise, which facilitates the selection of the most suitable technology for each service. This adaptability is one of the core advantages of microservices. For example, if a machine learning component is involved, developers might opt for Python, while other services may be more efficiently built with Golang. This approach is made possible through the use of decoupled services that communicate via a shared message broker.

In our last article, we established PostgreSQL as a competent message broker for Go services. Now, we will demonstrate its effectiveness when one of our teams opts for Python.

Data Processing Application Overview

The complete setup was described in the previous article, but here’s a quick recap:

  1. Data Producer: A Go-based service that exposes an HTTP API with a POST endpoint. When the endpoint is called, it generates random data and stores it in a table, which triggers a notification.
  2. Data Consumer: Another Go-based service that listens for new data notifications and processes them (i.e., logs them).
  3. PostgreSQL Database: This serves both as a data repository and as a message broker facilitating communication between the data producer and consumer.
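
As a quick reminder of the mechanism, new rows can be fanned out to listeners with a trigger that calls pg_notify. The sketch below is illustrative only; the table, channel, and function names are assumptions, and the previous article's producer could equally issue NOTIFY directly:

-- Hypothetical trigger function: publish each inserted row as JSON on a channel.
CREATE OR REPLACE FUNCTION notify_new_signal() RETURNS trigger AS $$
BEGIN
  PERFORM pg_notify('data_channel', row_to_json(NEW)::text);
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER notify_on_insert
AFTER INSERT ON signals
FOR EACH ROW EXECUTE FUNCTION notify_new_signal();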

Adding a Python Data Consumer

Next, we'll introduce a Python data consumer. Like the existing Go consumer, it will subscribe to a designated channel on the database and wait for notifications to process.

For this, we'll utilize the Psycopg 3 library, the successor to the popular Psycopg 2. Psycopg 3 complies with the Python Database API Specification v2.0, incorporating support for modern PostgreSQL and Python features. If you're familiar with Psycopg 2's Listen/Notify mechanism, keep in mind that the notification access interface has evolved in version 3 to align more closely with contemporary Python coding standards.
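
For example, whereas Psycopg 2 required polling the connection and draining the conn.notifies list, Psycopg 3 exposes notifications as a generator. A minimal sketch, with a placeholder connection string and channel name:

import psycopg

# Psycopg 3 delivers notifications through a generator on the connection,
# replacing Psycopg 2's poll-and-drain loop over the conn.notifies list.
conn = psycopg.connect("dbname=example", autocommit=True)
conn.execute("LISTEN data_channel")
for notify in conn.notifies():  # blocks until a notification arrives
    print(notify.channel, notify.pid, notify.payload)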

We’ll start by defining a model that encapsulates the notifications we expect from PostgreSQL. In our data processing application, each notification signifies a new record in a table, containing a timestamp, signal name, and signal value. To ensure schema validation upon receipt, we'll use the Pydantic library, which is well-regarded for Python data validation.

The model comprises three fields:

  • timestamp of type datetime
  • signal_name of type str
  • signal_value of type float

Notably, the field names align with the database column names, streamlining the parsing of JSON notifications into the Pydantic model.
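
A minimal sketch of the model, assuming Pydantic's BaseModel (the class name Signal is our own choice):

from datetime import datetime

from pydantic import BaseModel


class Signal(BaseModel):
    # Field names mirror the database columns, so a JSON payload
    # parses directly into the model.
    timestamp: datetime
    signal_name: str
    signal_value: float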

Notification Handling Process

To consume notifications reliably, we will follow five essential steps:

  1. Establish a database connection.
  2. Issue the LISTEN command to the database.
  3. Wait for notifications to arrive.
  4. Parse the notification payload.
  5. Pass the payload for consumption.

We’ll encapsulate this functionality within a class named Listener, which will accept the database connection as a dependency. Now let's delve into the code.

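Here is a minimal sketch of the class, assuming Psycopg 3's generator-based notification interface and the Signal model defined above (module names and the channel wiring are our own choices):

import json
import logging
from typing import Callable

import psycopg
from psycopg import sql

from models import Signal  # the Pydantic model above; module name is an assumption

logger = logging.getLogger(__name__)


class Listener:
    """Subscribes to a PostgreSQL channel and hands parsed payloads to a callback."""

    def __init__(
        self,
        connection: psycopg.Connection,
        channel: str,
        callback: Callable[[Signal], None],
    ) -> None:
        # Step 1: the established database connection is injected as a dependency.
        self._connection = connection
        self._channel = channel
        self._callback = callback

    def listen(self) -> None:
        # Step 2: issue the LISTEN command; the connection is assumed to be in
        # autocommit mode so the command takes effect immediately.
        self._connection.execute(
            sql.SQL("LISTEN {}").format(sql.Identifier(self._channel))
        )
        logger.info("Listening on channel %r", self._channel)
        # Step 3: block on the notification generator until messages arrive.
        for notification in self._connection.notifies():
            try:
                # Step 4: parse and validate the JSON payload.
                signal = Signal(**json.loads(notification.payload))
            except ValueError:
                # Covers both JSON decoding and Pydantic validation errors.
                logger.exception("Skipping malformed payload: %r", notification.payload)
                continue
            # Step 5: pass the validated payload on for consumption.
            self._callback(signal)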

Next, we set up the main components in main.py. We configure the logger and define the channel name to listen to. The get_connection function reads the connection string from an environment variable and attempts to connect to the database.

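A hedged sketch of main.py, assuming the connection string lives in a DATABASE_URL environment variable (the variable and module names are assumptions):

import logging
import os

import psycopg

from listener import Listener  # module names are assumptions
from models import Signal

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

# Must match the channel the producer notifies on.
CHANNEL = "data_channel"


def get_connection() -> psycopg.Connection:
    # The environment variable name is an assumption; adjust it to your .env file.
    conninfo = os.environ["DATABASE_URL"]
    # autocommit=True lets the LISTEN command take effect without an explicit commit.
    return psycopg.connect(conninfo, autocommit=True)


def consume(signal: Signal) -> None:
    # "Processing" in this example is simply logging the received record.
    logger.info("Received: %s", signal)


if __name__ == "__main__":
    Listener(get_connection(), CHANNEL, consume).listen()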

Containerization with Docker

Now that we have our Python data consumer ready, we will encapsulate it in a Docker container and include it in a Docker Compose configuration. The Python data consumer relies on just two external libraries. We will use a two-stage build process to minimize the final image size.

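A sketch of such a Dockerfile, assuming a requirements.txt that pins the two dependencies (base image tags and paths are assumptions):

# Stage 1: build wheels for the dependencies, e.g. psycopg[binary] and pydantic.
FROM python:3.12-slim AS builder
WORKDIR /build
COPY requirements.txt .
RUN pip wheel --no-cache-dir --wheel-dir /build/wheels -r requirements.txt

# Stage 2: install the prebuilt wheels into a clean runtime image.
FROM python:3.12-slim
WORKDIR /app
COPY --from=builder /build/wheels /wheels
RUN pip install --no-cache-dir /wheels/* && rm -rf /wheels
COPY . .
CMD ["python", "main.py"]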

Once the Dockerfile is prepared, we can add the Python data consumer to the docker-compose.yml file, ensuring it mirrors the configuration of the Go consumer service. This includes utilizing the .env file for environment variables and waiting for the database to become operational.
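
A hedged sketch of the added service block, assuming the postgres service defines a healthcheck as in the previous article's setup (service names and the build path are assumptions):

python_consumer:
  build: ./python-consumer   # path to the Dockerfile above
  env_file:
    - .env                   # same variables the Go services read
  depends_on:
    postgres:
      condition: service_healthy   # wait for the database to become operational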

Testing the Setup

Let’s see everything in action! From a terminal in the directory containing the docker-compose.yml file, run:

docker-compose up -d --build

This command builds and runs the application in detached mode. If successful, you should see output confirming the creation of services.

To trigger the notification, use the following command:

curl -X POST http://localhost:8080/ingest

If everything is functioning correctly, you should observe the callback output in the logs of the Python consumer service:

docker-compose logs python_consumer --follow

Congratulations! By employing PostgreSQL as a message broker, we have established communication between services written in Go and Python, utilizing a Publish/Subscribe messaging pattern.

Conclusion

The PostgreSQL Listen/Notify mechanism is a robust solution for enabling communication between loosely coupled microservices, even when developed in different programming languages. By harnessing this mechanism, teams can effectively integrate diverse components, fostering collaboration and leveraging the combined expertise within the organization.

Resources

  • PostgreSQL as a message broker for Go applications
  • Building minimal Go Docker images
  • Bounded context in domain-driven design
  • Psycopg 3 documentation
  • Python Database API Specification v2.0
  • Pydantic documentation

The complete code for this example can be found in my GitHub repository.

Chapter 2: Video Demonstrations

Explore how PostgreSQL can potentially replace your messaging queue by watching this insightful video.

Learn about microservices architecture, including message brokers and transport methods, in this informative video.
