Message queues are a software architecture pattern that solves a lot of the issues distributed systems face. When I first started my career, I had to learn what a message broker (RabbitMQ, specifically) was and how I’d go about using it for the system I was building.
At the time, it was a complicated topic for me, so my goal with today’s newsletter is to break it down into an easy-to-understand read.
» If you’re interviewing for a software role, you’re going to want to know what message queues are - they may come up in your system design interview!
What are messages?
If you’ve ever looked up an analogy for message queuing, you’ll often see the analogy of the mail system:
You write a letter to your friend - often referred to as the “publisher” or “producer”.
The post office is responsible for delivering it - often referred to as the “message queue”
Your friend receives the letter - generally referred to as the “consumer”.
Think of messages as the letter you’re sending to your friend - it contains something your friend can use, whether that’s just a nice little note or a gift card.
In software, these messages are typically structured payloads - dictionaries, or JSON objects like this one:
{
    "task": "send_email",
    "to": "[email protected]"
}

What are message queues?
Message queues are like a “holding area” for your messages - publishers will send a message to this queue, and consumers will read the message:

» Related reading: what is a queue
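To make the “holding area” behavior concrete, here’s a minimal sketch using Python’s built-in queue.Queue (no broker required) - the messages themselves are just example values, but the first-in, first-out ordering is the real point:

```python
from queue import Queue

# A queue holds messages until a consumer reads them, in the
# order they were published (first in, first out).
q = Queue()

# The "publisher" side: put messages onto the queue.
q.put({"task": "send_email", "to": "[email protected]"})
q.put({"task": "send_email", "to": "[email protected]"})

# The "consumer" side: read messages back in the same order.
first = q.get()
second = q.get()
print(first["to"])   # [email protected]
print(second["to"])  # [email protected]
```

A real broker like RabbitMQ or SQS works the same way conceptually - the queue just lives on a server instead of in your process.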
Say you’re working on building a relatively straightforward data pipeline: you’re pulling raw data into your system, transforming it and writing it to a database, then sending the data downstream to customers:

On the surface, this works and is a valid solution if we were to stuff it into one script and call it a day. However, there are a few major issues we can pick up right off the bat:
If your code throws an error writing to the database, you won’t be able to send the data downstream to customers since that code won’t run.
If you have multiple raw data ingest sources, you may want multiple threads running to ingest, process, and send - especially if it’s a real-time system.
This is where message queues come in - they allow you to send data to another part of your system so that your services do one thing and one thing only:

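As a rough sketch of that split, each stage only talks to the queue on either side of it. This toy version uses Python’s in-process queue.Queue in place of a real broker, and the function and queue names are mine, not a fixed API:

```python
from queue import Queue

raw_queue = Queue()        # ingest -> transform
processed_queue = Queue()  # transform -> deliver

def ingest(record: dict) -> None:
    # Ingest only knows about raw_queue; it has no idea who consumes it.
    raw_queue.put(record)

def transform() -> None:
    # Transform reads a raw record, enriches it, and publishes downstream.
    record = raw_queue.get()
    record["transformed"] = True
    processed_queue.put(record)

def deliver() -> list:
    # Delivery drains whatever is queued; a database outage upstream
    # wouldn't stop it from working through its own backlog.
    delivered = []
    while not processed_queue.empty():
        delivered.append(processed_queue.get())
    return delivered

ingest({"id": 1})
transform()
delivered = deliver()
print(delivered)  # [{'id': 1, 'transformed': True}]
```

Because each stage reads from and writes to queues, any one of them can fail, restart, or scale without the others noticing - which is exactly the list of advantages below.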
The biggest advantages you get from this message-queue (or “pub/sub”) architecture are as follows:
Decoupling - the services (green boxes above) have no knowledge of each other. You can swap out, scale, or rewrite either side without touching the other.
Resilience - if a consumer goes down, messages continue to queue up and get processed when the service recovers. Nothing is lost, and the producer never knows there was a problem.
Scalability - you can spin up multiple consumers (multi-threaded or multi-processing) to chew through a backlog in parallel, or add multiple producers without any coordination logic. The queue absorbs the load.
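The scalability point can be demonstrated with a few consumer threads draining a single queue - a sketch using Python’s stdlib, not production code (the sentinel-shutdown approach is just one common convention):

```python
import threading
from queue import Queue

q = Queue()
results = []
lock = threading.Lock()

def consumer() -> None:
    # Each consumer pulls whatever work is available; the queue itself
    # coordinates which thread gets which message.
    while True:
        msg = q.get()
        if msg is None:  # sentinel: no more work
            break
        with lock:
            results.append(msg * 2)

# Publish a backlog of 100 messages...
for i in range(100):
    q.put(i)

# ...then spin up 4 consumers to chew through it in parallel.
workers = [threading.Thread(target=consumer) for _ in range(4)]
for w in workers:
    w.start()
for _ in workers:
    q.put(None)  # one sentinel per worker so each one shuts down
for w in workers:
    w.join()

print(len(results))  # 100
```

Note that no producer code changed to add the extra consumers - the queue absorbed the coordination.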
Python Examples for publishing and consuming
There are several message queuing systems out there, such as Kafka and RabbitMQ. These examples use Amazon SQS, but the same concepts apply to other brokers.
Before we can do anything, we need to set up a boto3 client to communicate with our AWS services (boto3 is the AWS SDK for Python):
import boto3
import json

AWS_REGION = "us-east-1"
# The queue we'll be publishing and consuming to/from.
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789/my-queue"

sqs = boto3.client("sqs", region_name=AWS_REGION)

Publishing
To publish, or send, a message, our code may look something like this:
def publish(message: dict) -> str:
    response = sqs.send_message(
        QueueUrl=QUEUE_URL,
        MessageBody=json.dumps(message),
    )
    return response["MessageId"]

# Example of publishing a message
message = {"text": "Hello, world!", "action": "print"}
msg_id = publish(message)

Consuming
Once a message is published, we can consume it with the following code:
def consume(max_messages: int = 10, wait_seconds: int = 5):
    response = sqs.receive_message(
        QueueUrl=QUEUE_URL,
        MaxNumberOfMessages=max_messages,
        WaitTimeSeconds=wait_seconds,  # long polling
    )
    messages = response.get("Messages", [])
    for msg in messages:
        body = json.loads(msg["Body"])
        print("Received:", body)
        # Do all processing here

        # Delete after processing (important - otherwise the message
        # becomes visible again after the visibility timeout and
        # gets redelivered)
        sqs.delete_message(
            QueueUrl=QUEUE_URL,
            ReceiptHandle=msg["ReceiptHandle"],
        )

Happy coding!
📧 Join the Python Snacks Newsletter! 🐍
Want even more useful Python-related content? Here are 3 reasons why you should subscribe to the Python Snacks newsletter:
Get Ahead in Python with bite-sized Python tips and tricks delivered straight to your inbox, like the one above.
Exclusive Subscriber Perks: Receive a curated selection of up to 6 high-impact Python resources, tips, and exclusive insights with each email.
Get Smarter with Python in under 5 minutes. Your next Python breakthrough could be just an email away.
You can unsubscribe at any time.
Interested in starting a newsletter or a blog?
Do you have a wealth of knowledge and insights to share with the world? Starting your own newsletter or blog is an excellent way to establish yourself as an authority in your field, connect with a like-minded community, and open up new opportunities.
If TikTok, Twitter, Facebook, or other social media platforms were to get banned, you’d lose all your followers. This is why you should start a newsletter: you own your audience.
This article may contain affiliate links. Should you purchase a product/service through an affiliate link, it will come at no additional cost to you and will help support the costs of this blog.

