Deploy MQ Client, Pub and Sub
Message queue (MQ) clients, publishers, and subscribers together form one of the trigger features that can be deployed and managed from the CLI.
An MQ client can send messages to a data process via an MQ subscriber. Similarly, a data process can return results to the MQ client via an MQ publisher.
Both the MQ subscriber and the MQ publisher can be linked to a data process independently of each other.
MQ Client
An MQ client specifies the host and port of the MQ broker service.
Generate a MQ client template file:
./loc mq client init
name: message-queue-client
description: message-queue-client
client:
  type: Kafka
  brokers:
    - host: 0.0.0.0
      port: 9092
For now, only Apache Kafka is supported.
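Since the template only records the broker address, it may be worth confirming that the host and port are actually reachable before deploying. Below is a minimal, optional sketch using the third-party kafkajs package (not part of the LOC CLI); the clientId string is purely illustrative and the broker address should match your template:

// check-broker.ts - optional sanity check, assuming the "kafkajs" npm package
import { Kafka } from "kafkajs";

async function main() {
  // Use the same host/port as the brokers entry in the MQ client template.
  const kafka = new Kafka({
    clientId: "loc-broker-check", // illustrative name, not required by LOC
    brokers: ["0.0.0.0:9092"],
  });

  const admin = kafka.admin();
  await admin.connect();

  // If this resolves, the broker address is reachable from this machine.
  console.log("Existing topics:", await admin.listTopics());

  await admin.disconnect();
}

main().catch((err) => {
  console.error("Could not reach the Kafka broker:", err);
  process.exit(1);
});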
Deploy the MQ client:
./loc mq client deploy -f message-queue-client.yaml
MQ Subscriber and Publisher
Once the MQ client is deployed, look up its permanent ID (PID) with
./loc mq client list
Then generate template files for the MQ subscriber and publisher:
Subscriber
./loc mq sub init
name: mq-subscriber
description: mq-subscriber
clientId: 00000000-0000-0000-0000-000000000000
dataProcessPids:
  - pid: 00000000-0000-0000-0000-000000000000
    revision: latest
consumer:
  type: Kafka
  topic: example-topic
For now, any incoming message will trigger the linked data processes. You can read the message topic and other information via the MQ payload in your logic, as sketched below.
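The following sketch shows roughly what that looks like inside a logic. It is non-authoritative: the payload field names used here (messageQueue, subscriber, data) are assumptions, so consult the LOC logic SDK documentation for the exact schema in your release:

// A rough sketch of a logic reading an MQ-triggered payload.
// Field names below are assumptions, not the confirmed SDK schema.
export async function run(ctx: any) {
  const payload = await ctx.payload();

  // Assumed: when the task is triggered by an MQ subscriber, the payload
  // carries the message under a message-queue field instead of an HTTP request.
  const mq = payload.messageQueue;
  if (!mq) return;

  // Assumed: the subscriber block identifies the topic, and data holds
  // the raw message bytes.
  const topic = mq.subscriber?.topic;
  const body = new TextDecoder().decode(new Uint8Array(mq.data));
  console.log(`received message on topic ${topic}: ${body}`);
}

export async function handleError(ctx: any, error: any) {
  console.error("logic error:", error);
}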
Publisher
./loc mq pub init
name: mq-publisher
description: mq-publisher
clientId: 00000000-0000-0000-0000-000000000000
dataProcessPids:
  - pid: 00000000-0000-0000-0000-000000000000
    revision: latest
producer:
  type: Kafka
  topic: example-topic
You have to modify both templates with the correct clientId (the MQ client PID), data process PIDs, MQ type, and message topic.
Deploy
Then deploy the MQ subscriber and publisher:
./loc mq sub deploy -f subscriber.yaml
./loc mq pub deploy -f publisher.yaml
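Once everything is deployed, sending any message to the subscribed topic triggers the linked data processes, and results come back on the publisher's topic. Below is a minimal end-to-end sketch of an external MQ client using the third-party kafkajs package; the broker address and topics are the placeholder values from the templates above, and the clientId and group ID are purely illustrative:

// send-and-receive.ts - external MQ client sketch, assuming the "kafkajs" npm package
import { Kafka } from "kafkajs";

const kafka = new Kafka({
  clientId: "loc-mq-demo",    // illustrative
  brokers: ["0.0.0.0:9092"],  // broker from the MQ client template
});

async function main() {
  // Listen on the topic the MQ publisher writes results to. The templates
  // above use the same placeholder topic for both; in practice you would
  // normally configure two different topics.
  const consumer = kafka.consumer({ groupId: "loc-mq-demo-group" });
  await consumer.connect();
  await consumer.subscribe({ topic: "example-topic", fromBeginning: false });
  await consumer.run({
    eachMessage: async ({ topic, message }) => {
      console.log(`message on ${topic}:`, message.value?.toString());
    },
  });

  // Any message sent to the subscribed topic triggers the linked data processes.
  const producer = kafka.producer();
  await producer.connect();
  await producer.send({
    topic: "example-topic",
    messages: [{ value: JSON.stringify({ hello: "LOC" }) }],
  });
  await producer.disconnect();
}

main().catch(console.error);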