🛰️ Zafer Satılmış - Aviora

Protocol Layer Overview

Multi-protocol gateway · one JSON profile · TCP and MQTT

The gateway firmware can load several protocol stacks at once—TCP-style sessions, MQTT toward a broker, TLV and other framing—without a separate image for every mix. The customer file protocol_config.json is where that profile is declared: which blocks are enabled (use: true), how each one connects, and the rest of the fields the generator consumes. Run generate_protocol_config.py and it emits the C headers (for example INIT_PROTOCOLS and related init expansion). Product-specific wiring stays in JSON and generated code, not in ad-hoc branches scattered through the sources.

TCP and MQTT are both supported: addresses, ports, push/pull roles where applicable, broker host, topics, credentials, and other transport parameters are supplied through the same JSON configuration—so connection behaviour is chosen per customer file, not baked in as a one-off C edit. This page is the entry to the protocol layer: it describes the overall layout; the links in the sidebar lead to each transport and protocol for detail.


Customer Builds and JSON-Driven Selection

Like the file system, logger, and time service, the protocol plane uses per-customer JSON + generated headers. The difference from a naive “pick A or B” model: protocol_config.json can enable several protocol entries at once (use: true on RigelMq, Orion TLV, ZD, Metallix…). Each brings its own protocolFunc hooks and connection defaults; the script composes INIT_PROTOCOLS so init/start run in order. Multiple sensors are not an afterthought—each active protocol names a linkedSensorName; the union drives INIT_SENSORS. One customer folder, one generator run, one firmware personality—spelled out in JSON.

Why this matters. A gateway can speak MQTT to the cloud and TCP TLV to a legacy head-end in the same binary—if your product needs both, you flip booleans and regenerate; you do not fork the application. Metering, directives, and readouts still converge on AppMeterOperations.

Customer protocol_config.json and generate_protocol_config.py

Under Customers/<CustomerName>/Protocol/, each customer keeps a single protocol_config.json that declares which protocol blocks are on (one or many), their connection defaults (MQTT or TCP), and the sensors referenced by linkedSensorName. The Python tool Customers/generate_protocol_config.py reads that JSON and emits C headers so the gateway gets composed INIT_PROTOCOLS / INIT_SENSORS macros—no manual stitching.
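A minimal sketch of the shape such a file might take (field names follow the tables later on this page; all values here, including addresses and the sensor entry, are illustrative, not a real customer profile):

```json
{
  "customer": "ZD_0101",
  "version": "1.0.0",
  "protocol": {
    "use": true,
    "activeProtocol": "protocolRigelMq",
    "protocols": {
      "protocolRigelMq": {
        "use": true,
        "linkedSensorName": "electricityMeter",
        "connection": {
          "type": "mqtt",
          "mqttBroker": { "ipAddr": "127.0.0.1", "port": 1883 }
        }
      },
      "protocolOrionTlv": { "use": false }
    },
    "sensors": [
      { "name": "electricityMeter", "type": "meter" }
    ]
  }
}
```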

End-to-end flow

From JSON to firmware includes
flowchart LR
  subgraph IN["Input"]
    J[protocol_config.json]
  end
  subgraph PY["Python"]
    G[generate_protocol_config.py --customer NAME]
  end
  subgraph OUT["Generated C headers"]
    CUS[Customers/NAME/Protocol/cus_protocol_config.h]
    SHIM[Customers/Protocol_Config.h]
  end
  subgraph FW["Firmware"]
    APP[AppZMeterGw.c includes Protocol_Config.h]
    INIT[INIT_PROTOCOLS / INIT_SENSORS macros]
  end
  J --> G
  G --> CUS
  G --> SHIM
  SHIM -->|shim includes| CUS
  CUS --> APP
  APP --> INIT

Selection logic (active protocol and sensors)

What the generator keeps
flowchart TB
  P["protocol.use"]
  P --> AP["activeProtocol doc label"]
  P --> PL["protocols map use true or false each"]
  PL --> EMIT["emit INIT_PROTOCOLS for each use-true"]
  EMIT --> LINK["linkedSensorName per protocol"]
  LINK --> SENS["sensor union from linked names"]
  SENS --> MAC["SENSOR init macros plus INIT_SENSORS"]
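The selection logic above can be sketched in Python. This is a simplified model of what the generator presumably does, not the script's actual API; the function name is illustrative:

```python
def select_active(config):
    """Given a parsed protocol_config.json, return (active protocol keys,
    unique linked sensor names in protocol order): a simplified model of
    the generator's selection pass."""
    proto = config.get("protocol", {})
    if not proto.get("use", False):
        return [], []  # use:false -> no protocol blocks emitted
    active = [k for k, p in proto.get("protocols", {}).items() if p.get("use")]
    sensors, seen = [], set()
    for key in active:  # union of linkedSensorName, order-preserving
        name = proto["protocols"][key].get("linkedSensorName")
        if name and name not in seen:
            seen.add(name)
            sensors.append(name)
    return active, sensors
```

Two protocols linking the same sensor yield one sensor entry, which is why INIT_SENSORS is built from the union rather than one block per protocol.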

Run from the repository root (paths below are relative to Aviora root):

PowerShell — repo root
PS Aviora> python Customers/generate_protocol_config.py --customer ZD_0101
Wrote C:\...\Customers\ZD_0101\Protocol\cus_protocol_config.h
Wrote C:\...\Customers\Protocol_Config.h

The shim Customers/Protocol_Config.h only includes the selected customer file, for example:

#include "ZD_0101/Protocol/cus_protocol_config.h"

Root fields (protocol_config.json)

| Field | Role |
|---|---|
| customer | Customer folder name (e.g. ZD_0101); echoed in the generated header comment. |
| version / releaseDate | Metadata stored in the banner of cus_protocol_config.h. |
| protocol | Container for the enable flag, active protocol name, per-protocol definitions, and the sensors array. |

protocol object

| Field | Role |
|---|---|
| use | If false, the generator emits no protocol blocks (empty active list). |
| activeProtocol | Human-readable label for the chosen stack (e.g. protocolRigelMq); documentation / consistency only. |
| protocols | Map of stack keys: protocolRigelMq, protocolOrionTlv, protocolZD, protocolMetallix. Each has use, connection, and protocolFunc. |
| sensors | Array of sensor definitions referenced by linkedSensorName from active protocols. |

Per-protocol entry (example keys)

| Field | Role |
|---|---|
| use | Only entries with true are generated (macros, includes, INIT_PROTOCOLS). |
| name / description | Documentation inside JSON. |
| linkedSensorName | Must match a sensors[].name. Drives which sensor init macros are emitted and in which order (unique list following active protocols). |
| connection | See below. Merged with top-level keys for default #defines (Rigel / Orion / ZD). |
| protocolFunc | sourcePath (header to include), initFunc, startFunc, stopFunc, putIncomingFunc — wired to APP_*_PROTOCOL_* macros. |
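For illustration, a per-protocol entry might look like this. The sourcePath follows the repository layout shown later on this page; initFunc matches the generated excerpt below, while the start/stop/putIncoming names are placeholder guesses in the same naming pattern:

```json
"protocolRigelMq": {
  "use": true,
  "name": "RigelMq",
  "description": "JSON over MQTT toward the broker",
  "linkedSensorName": "electricityMeter",
  "connection": { "type": "mqtt" },
  "protocolFunc": {
    "sourcePath": "Application/AppZMeterGw/Services/Protocol/inc/AppProtocolRigelMq.h",
    "initFunc": "appProtocolRigelMqInit",
    "startFunc": "appProtocolRigelMqStart",
    "stopFunc": "appProtocolRigelMqStop",
    "putIncomingFunc": "appProtocolRigelMqPutIncoming"
  }
}
```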

connection (MQTT vs TCP)

The script flattens connection into settings for emit_*_defaults:

| type | Typical keys | Generated defaults (examples) |
|---|---|---|
| mqtt | mqttBroker.ipAddr, port, username, password; deviceIpAddr; pullPort; optional mqttRequestTopic / mqttResponseTopic | RG_RIGEL_DEFAULT_* in cus_protocol_config.h for RigelMq (overrides or supplements app headers). |
| tcp | serverIpAddr, pushPort, deviceIpAddr or deviceIP, pullPort | ORION_TLV_DEFAULT_* or ZD_DEFAULT_* depending on the protocol key. |
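The two connection shapes side by side, with illustrative values only (addresses, ports, and topic names are placeholders):

```json
"connection": {
  "type": "mqtt",
  "mqttBroker": { "ipAddr": "10.0.0.5", "port": 1883, "username": "gw", "password": "secret" },
  "deviceIpAddr": "10.0.0.20",
  "pullPort": 5000,
  "mqttRequestTopic": "gw/req",
  "mqttResponseTopic": "gw/resp"
}
```

versus a TCP block:

```json
"connection": {
  "type": "tcp",
  "serverIpAddr": "10.0.0.6",
  "pushPort": 6000,
  "deviceIpAddr": "10.0.0.20",
  "pullPort": 6001
}
```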

sensors[] (ZD_0101 example)

| Field | Role |
|---|---|
| name | Identifier; referenced by linkedSensorName (e.g. electricityMeter, sensor1). |
| type | meter → MeterCommInterface_t block; temperature / humidity → sensor struct with init/read/write/erase/sync. |
| drvSrcPath | Header path emitted as #include before the sensor macro. |
| meter functions | initFunc, writeFunc, readFunc, setBaudrateFunc (required for meters). |
| temperature / humidity | readFunc, writeFunc, eraseFunc, syncFunc; initFunc optional (or derived from readFunc). |
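Putting the two sensor shapes next to each other, with placeholder driver paths and function names (only the field names come from the table above):

```json
"sensors": [
  {
    "name": "electricityMeter",
    "type": "meter",
    "drvSrcPath": "Drivers/SomeMeterDriver.h",
    "initFunc": "meterInit",
    "writeFunc": "meterWrite",
    "readFunc": "meterRead",
    "setBaudrateFunc": "meterSetBaudrate"
  },
  {
    "name": "sensor1",
    "type": "temperature",
    "drvSrcPath": "Drivers/SomeTempDriver.h",
    "readFunc": "tempRead",
    "writeFunc": "tempWrite",
    "eraseFunc": "tempErase",
    "syncFunc": "tempSync"
  }
]
```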

Generated output (excerpt)

For ZD_0101 with only protocolRigelMq.use: true and linkedSensorName: electricityMeter, the customer header defines protocol macros, Rigel default IPs/ports/topics, INIT_PROTOCOLS, and INIT_SENSORS:

#define ACTIVE_PROTOCOL_NUMBER (1)
#include "Application/.../AppProtocolRigelMq.h"
#define APP_RIGEL_MQ_PROTOCOL_INIT_FUNC(serialNumber)   appProtocolRigelMqInit(serialNumber)
#define RG_RIGEL_DEFAULT_MQTT_BROKER_IP       "127.0.0.1"
/* ... */
#define INIT_PROTOCOLS(setErrorFlag)   do{ ... } while(0)
#define ACTIVE_SENSOR_NUMBER (1)
#define SENSOR_ELECTRICITYMETER_INITIALIZE_FUNC(setErrorFlag)   do{ ... } while(0)
#define INIT_SENSORS(setErrorFlag)   do{ ... } while(0)
Rebuild after edits. Whenever you change protocol_config.json, run generate_protocol_config.py again so cus_protocol_config.h and Customers/Protocol_Config.h stay in sync before compiling the firmware.
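A small helper a wrapper build script could use to warn when regeneration was forgotten. This is a hypothetical addition, not part of the repo; the official step is simply to rerun generate_protocol_config.py after every JSON edit:

```python
from pathlib import Path

def header_stale(json_path, header_path):
    """True if the generated header is missing or older than the JSON source.
    Hypothetical pre-build staleness check based on file modification times."""
    j, h = Path(json_path), Path(header_path)
    return (not h.exists()) or h.stat().st_mtime < j.stat().st_mtime
```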

Isolated Protocol Services

Each stack (ZD, Metallix, Orion, RigelMq) remains a self-contained module: framing, sessions, transport callbacks. When several are enabled, they still talk to the rest of the gateway through the same narrow seams—especially AppMeterOperations for meters and jobs. The UI or unrelated drivers do not care how many wire formats are live—only that requests resolve consistently.

What You Configure

protocol_config.json is the inventory: which protocols are on, broker/server endpoints, topics, device/pull addresses, and which sensor drivers back each link. Regenerate after edits so cus_protocol_config.h stays the single source of truth for that customer’s INIT_* story.

Implementations at a Glance

| Module | Wire | Transport | Documentation |
|---|---|---|---|
| AppProtocolZD | JSON packets (text) | TCP via AppTcpConnManager | ProtocolZD |
| AppProtocolMetallixTLV | Binary TLV ($#) | TCP via AppTcpConnManager | ProtocolMetallixTLV |
| AppProtocolOrionTLV | Binary TLV + TRANS_NUMBER | TCP via AppTcpConnManager | ProtocolOrionTLV |
| AppProtocolRigelMq | JSON + TRANS_NUMBER | MQTT via AppMqttConnService | ProtocolRigelMq |
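To make “binary TLV” concrete, here is a generic illustrative codec in Python. The layout (2-byte "$#" start marker, 1-byte type, 2-byte big-endian length, value) is a toy assumption for explanation only; the real Metallix and Orion framings are specified on their own pages and differ in detail:

```python
import struct

START = b"$#"  # toy start marker, echoing the "$#" noted for Metallix frames

def tlv_encode(msg_type: int, value: bytes) -> bytes:
    """Frame one TLV record: START | type(1) | length(2, big-endian) | value."""
    return START + struct.pack(">BH", msg_type, len(value)) + value

def tlv_decode(frame: bytes):
    """Parse one record produced by tlv_encode; raise ValueError on bad input."""
    if not frame.startswith(START) or len(frame) < 5:
        raise ValueError("bad frame")
    msg_type, length = struct.unpack(">BH", frame[2:5])
    value = frame[5:5 + length]
    if len(value) != length:
        raise ValueError("truncated value")
    return msg_type, value
```

The point of TLV framing is that the receiver can delimit records on a byte stream without parsing the payload, which is why the TCP-based stacks here can share one connection manager.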

Architecture — multi-protocol, one meter core

A build can load several protocol modules when JSON says so; each uses its transport (TCP and/or MQTT). All operational traffic still meets at AppMeterOperations for metering, directives, and jobs—parallel front doors, one house.

Gateway data path (conceptual)
flowchart LR
  subgraph CFG["protocol_config.json"]
    ON[Multiple use:true protocols possible]
  end
  subgraph ST["Protocol stacks"]
    P1[Stack A e.g. RigelMq]
    P2[Stack B e.g. Orion TLV]
  end
  MO[AppMeterOperations]
  NET[Head-ends / brokers]
  ON -.-> ST
  P1 --> MO
  P2 --> MO
  P1 --> NET
  P2 --> NET

Repository Layout

Application/AppZMeterGw/Services/Protocol/
├── inc/
│   ├── AppTcpConnManager.h
│   ├── AppMeterOperations.h
│   ├── AppProtocolZD.h
│   ├── AppProtocolMetallixTLV.h
│   ├── AppProtocolOrionTLV.h
│   └── AppProtocolRigelMq.h
├── src/
│   ├── AppTcpConnManager.c
│   ├── AppMeterOperations.c
│   ├── AppProtocolZD.c
│   ├── AppProtocolMetallixTLV.c
│   ├── AppProtocolOrionTLV.c
│   └── AppProtocolRigelMq.c
└── Test_Server/
    ├── ProtocolZD_TestServer.py
    ├── ProtocolMetallixTLV_TestServer.py
    ├── ProtocolOrionTLV_TestServer.py
    └── ProtocolRigelMq_TestServer.py

Push vs Pull (TCP-based Stacks)

| Channel | Who initiates | Typical use |
|---|---|---|
| Push | Device connects to the server's push socket via TCP | ident, alive, readout / load-profile data toward the backend |
| Pull | Device listens locally; the server connects in | log, setting, fwUpdate, readout requests, directive list/add/delete |

Details: TCP Connection Service — non-blocking select, push and pull sockets, request-based connect/disconnect.
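A toy model of the pull channel (device listens, server connects in), using non-blocking select as the TCP service page describes. This is an illustration in Python, not the firmware's C implementation; the function name and single-request shape are invented for the sketch:

```python
import select
import socket

def serve_one_pull_request(port: int, handler, timeout: float = 5.0) -> bool:
    """Pull-channel sketch: listen locally, accept one inbound server
    connection, answer one request with handler(request). Returns True
    if a request was served before the timeout expired."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("127.0.0.1", port))
        srv.listen(1)
        srv.setblocking(False)
        ready, _, _ = select.select([srv], [], [], timeout)  # non-blocking accept
        if not ready:
            return False
        conn, _addr = srv.accept()
        conn.setblocking(True)  # handle the accepted request in blocking mode
        with conn:
            request = conn.recv(4096)
            conn.sendall(handler(request))
    return True
```

The push channel is the mirror image: the device calls connect() toward the server's push socket and writes ident/alive/readout traffic outbound.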

MQTT (broker-based stacks)

TCP push and pull are not the only transports. When protocol_config.json sets a protocol block’s connection to MQTT, that stack uses AppMqttConnService: one session to the configured broker, subscribe/publish on the topics you declare (including optional request/response pairs), with reconnect and link callbacks wired the same way across builds. The same binary can run MQTT and TCP protocols together—each enabled block picks its own transport from JSON; nothing extra is hand-coded in the app to mix them.
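The optional request/response topic pair can be modeled as a tiny resolver: use the overrides from the connection block when present, otherwise fall back to a per-device default. The fallback pattern below is hypothetical, purely to show the precedence:

```python
def resolve_topics(connection: dict, serial: str):
    """Pick the MQTT request/response topics for a device: honor the optional
    mqttRequestTopic / mqttResponseTopic overrides from the connection block,
    else derive a per-device default (the default pattern here is invented)."""
    req = connection.get("mqttRequestTopic") or f"gw/{serial}/request"
    resp = connection.get("mqttResponseTopic") or f"gw/{serial}/response"
    return req, resp
```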

| Aspect | MQTT path |
|---|---|
| Who is in the middle | A broker (not a raw TCP server socket on the device). The gateway is an MQTT client. |
| Configuration | mqttBroker.ipAddr, port, credentials, optional mqttRequestTopic / mqttResponseTopic; see the connection (MQTT vs TCP) table on this page. |
| Typical protocol module | ProtocolRigelMq: JSON payloads over MQTT topics, session handling aligned with AppMqttConnService. |

Details: MQTT Connection Service — broker attach, IncomingCb / LinkCb, reconnect. Protocol specifics: ProtocolRigelMq.

Documentation Map

Transports

TCP Connection Service — AppTcpConnManager, callbacks, idle timeouts.

MQTT Connection Service — broker session, IncomingCb / LinkCb, reconnect.

Meter Operations

Meter Operations — registry, directives, readout and profile jobs shared by all protocols.

Protocols

ProtocolZD — JSON/TCP messages and state.

ProtocolMetallixTLV — Metallix binary TLV.

ProtocolOrionTLV — Orion TLV, TRANS_NUMBER, sessions.

ProtocolRigelMq — JSON over MQTT, topics, sessions.

Tools

Test Server — Python harnesses and ports.

README — extended Markdown bundle.