πŸ›°οΈZafer SatΔ±lmış - Aviora

Time Service β€” Architecture And Code Generation

The application's Time Service exposes a single API for epoch time, struct tm, and formatted strings. Which backends run (NTP, internal RTC, external RTC, Linux host clock, software tick) is not chosen with #ifdefs scattered through core files: it is driven by Customers/<Customer>/Time/time_config.json and the generator generate_time_service_code.py, which writes the customer headers and AppTimeService_Autogen.c. The core logic in AppTimeService.c includes that autogen file and always calls the same static functions; behavior follows whatever implementation was generated.

End-To-End Configuration Pipeline

A single JSON file per customer defines the time domain. The Python script emits three artifacts: customer-specific macros and optional driver includes (Cus_TimeService_Config.h), a thin wrapper that points at the customer (Customers/TimeService_Config.h), and the static wiring inside AppTimeService_Autogen.c.

Figure β€” From JSON To Running Code
flowchart LR
  subgraph Input
    J[time_config.json]
  end
  subgraph Generator
    G[generate_time_service_code.py]
  end
  subgraph Outputs
    CUS[Cus_TimeService_Config.h]
    TSW[TimeService_Config.h]
    AUTO[AppTimeService_Autogen.c]
  end
  subgraph Core
    APP[AppTimeService.c]
  end
  J --> G
  G --> CUS
  G --> TSW
  G --> AUTO
  TSW --> APP
  AUTO --> APP
  CUS -.->|included via| TSW

Example β€” LinuxGcc Customer time_config.json

Below is a real configuration shape (values may change). When timeService.use is true, the child features are evaluated. When timeService.use is false, the generator forces NTP and all RTC / OS time flags off (the same rule the application code expects).

{
  "customer": "LinuxGcc",
  "timeService": {
    "use": true,
    "timeZone": "UTC",
    "ntp": {
      "use": false,
      "ipAddr": "pool.ntp.org",
      "port": 123,
      "updatePeriodMin": 60
    },
    "extRTC": { "use": false, "deviceConfig": { ... } },
    "intRTC": { "use": false, "deviceConfig": { ... } },
    "linuxLocalTime": {
      "use": true,
      "deviceConfig": {
        "driverPath": "Middleware/MiddComm/Midd_OS/inc/Linux_DateTime.h"
      }
    }
  }
}

The preferred epoch source (the one appTimeServiceGetEpoch reads) is resolved in this order: internal RTC β†’ external RTC β†’ Linux local time (host time()) β†’ software tick; the first enabled backend wins. NTP is separate: when enabled, it periodically fetches network time and pushes the epoch into the RTCs / OS clock / soft tick via appTimeServiceAutogenUpdateRtcsFromEpoch.
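The first-enabled-backend-wins rule can be sketched in C. This is a minimal illustration, not the real generated code: the macro settings mirror the LinuxGcc example, and the backend reads are stubbed constants rather than driver calls.

```c
/* Sketch of the preferred-source resolution; backend reads are stubs. */
#include <stdint.h>

#define APP_TIME_SERVICE_USE_INT_RTC          0
#define APP_TIME_SERVICE_USE_EXT_RTC          0
#define APP_TIME_SERVICE_USE_LINUX_LOCAL_TIME 1

static uint32_t softTickEpoch = 1700000000u;                  /* software tick fallback */
static uint32_t linuxLocalEpoch(void) { return 1700000100u; } /* stands in for time() */

/* Resolution order: int RTC -> ext RTC -> Linux local time -> software tick. */
static uint32_t getEpochFromPreferredSource(void)
{
    (void)softTickEpoch; /* referenced only when every other backend is off */
#if APP_TIME_SERVICE_USE_INT_RTC
    return intRtcEpoch();
#elif APP_TIME_SERVICE_USE_EXT_RTC
    return extRtcEpoch();
#elif APP_TIME_SERVICE_USE_LINUX_LOCAL_TIME
    return linuxLocalEpoch();
#else
    return softTickEpoch;
#endif
}
```

Because the selection happens in the preprocessor, disabled branches (and their driver symbols) never reach the compiler, which is why disabled backends need no link-time stubs here.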

JSON Parameters To Generated Macros

| JSON Path | Generated Symbols (Cus Header) | Role |
| --- | --- | --- |
| timeService.use | APP_TIME_SERVICE_USE | Master switch; when 0, all feature flags in the Cus header are 0. |
| timeZone | APP_TIME_SERVICE_TZ_OFFSET_MINUTES | UTC offset in minutes (e.g. UTC+3). |
| ntp.use, ipAddr, port, updatePeriodMin | APP_TIME_SERVICE_USE_NTP, APP_TIME_SERVICE_DEFAULT_NTP_*, APP_TIME_SERVICE_NTP_UPDATE_PERIOD_MIN | NTP enable and defaults; the period drives the periodic timer in AppTimeService.c when NTP is on. |
| intRTC.use + deviceConfig.driverPath | APP_TIME_SERVICE_USE_INT_RTC, include + INT_RTC_* macros | Internal RTC driver; stubs returning FAILURE when off. |
| extRTC.use + I2C + driverPath | APP_TIME_SERVICE_USE_EXT_RTC, EXT_RTC_I2C_*, include + EXT_RTC_* | External RTC (e.g. M41T11) via I2C. |
| linuxLocalTime.use + driverPath | APP_TIME_SERVICE_USE_LINUX_LOCAL_TIME, include Linux_DateTime.h | Host OS clock; used only when both RTCs are off and this flag is on; otherwise the software tick fills the gap. |
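To make the table concrete, here is an illustrative shape of Cus_TimeService_Config.h for the LinuxGcc example above. The macro names follow the table; the values, the FAILURE definition, and the exact stub form are assumptions, not verbatim generator output.

```c
/* Illustrative Cus_TimeService_Config.h for the LinuxGcc example.
 * Values and stub shapes are assumptions, not real generator output. */
#define FAILURE (-1) /* assumed project-wide status code; defined elsewhere in the real tree */

#define APP_TIME_SERVICE_USE                    1
#define APP_TIME_SERVICE_TZ_OFFSET_MINUTES      0   /* "timeZone": "UTC" */
#define APP_TIME_SERVICE_USE_NTP                0
#define APP_TIME_SERVICE_NTP_UPDATE_PERIOD_MIN  60
#define APP_TIME_SERVICE_USE_INT_RTC            0
#define APP_TIME_SERVICE_USE_EXT_RTC            0
#define APP_TIME_SERVICE_USE_LINUX_LOCAL_TIME   1

#if APP_TIME_SERVICE_USE_LINUX_LOCAL_TIME
/* #include "Middleware/MiddComm/Midd_OS/inc/Linux_DateTime.h" */
#endif

/* Disabled backends collapse to FAILURE stubs so core code compiles unchanged. */
#if !APP_TIME_SERVICE_USE_INT_RTC
#define middRtcIntGetTime(t) (FAILURE)
#endif
#if !APP_TIME_SERVICE_USE_EXT_RTC
#define middRtcExtGetTime(t) (FAILURE)
#endif
```

The stub pattern lets AppTimeService.c call middRtcIntGetTime / middRtcExtGetTime unconditionally; on configurations without those drivers the call sites simply evaluate to FAILURE.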

Runtime Flow β€” Read Time

Figure β€” Get Epoch From Preferred Source
flowchart TD
  A[appTimeServiceGetEpoch] --> B[appTimeServiceAutogenGetEpochFromPreferredSource]
  B --> C{Int RTC enabled?}
  C -->|Yes| D[middRtcIntGetTime β†’ epoch]
  C -->|No| E{Ext RTC enabled?}
  E -->|Yes| F[middRtcExtGetTime β†’ epoch]
  E -->|No| G{Linux local enabled?}
  G -->|Yes| H[getCurrentUnixTime]
  G -->|No| I[appTimeSoftTickGetEpoch]

Runtime Flow β€” NTP Sync (When Enabled)

When APP_TIME_SERVICE_USE_NTP == 1, appTimeServiceInit registers a repeating event timer with a period of APP_TIME_SERVICE_NTP_UPDATE_PERIOD_MIN Γ— WAIT_1_MIN. Each tick calls appTimeNtpGetEpoch and then updates the backends via appTimeServiceAutogenUpdateRtcsFromEpoch.
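The callback path can be sketched as follows. This is a self-contained illustration: appTimeNtpGetEpoch is stubbed (the real one queries the NTP stack), and updateRtcsFromEpoch stands in for appTimeServiceAutogenUpdateRtcsFromEpoch.

```c
/* Sketch of the NTP timer callback path: fetch, check, push. */
#include <stdbool.h>
#include <stdint.h>

static uint32_t lastPushedEpoch = 0u;

/* Stub: pretend the NTP stack answered with this epoch. */
static bool appTimeNtpGetEpoch(uint32_t *epoch)
{
    *epoch = 1700003600u;
    return true;
}

/* Stand-in for appTimeServiceAutogenUpdateRtcsFromEpoch. */
static void updateRtcsFromEpoch(uint32_t epoch)
{
    lastPushedEpoch = epoch;
}

/* Periodic timer callback: push the epoch to backends only on success. */
static void ntpTimerCb(void)
{
    uint32_t epoch;
    if (appTimeNtpGetEpoch(&epoch)) {
        updateRtcsFromEpoch(epoch);
    }
    /* On failure the backends keep their current time until the next tick. */
}
```

Note that a failed fetch is deliberately silent: the preferred-source chain keeps serving the last known time, and the next timer tick retries.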

Figure β€” NTP Timer Callback
sequenceDiagram
  participant T as Event Timer
  participant C as ntpTimerCb
  participant N as appTimeNtpGetEpoch
  participant U as appTimeServiceAutogenUpdateRtcsFromEpoch
  T->>C: periodic
  C->>N: get epoch from NTP stack
  N-->>C: epoch or failure
  C->>U: push epoch to RTCs / OS / soft tick

See Network Time Protocol for parameter binding and activation rules.

Where Core Meets Generated Code

AppTimeService.c

Public API, timezone conversion, and string formatting; it #includes "AppTimeService_Autogen.c" so the static helpers live in one translation unit.
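A minimal sketch of this single-translation-unit pattern, with the autogen helper written inline instead of included and its body reduced to a stub:

```c
/* Sketch: in the real tree AppTimeService.c does
 * #include "AppTimeService_Autogen.c"; here the generated helper is
 * inlined so the delegation pattern is visible in one file. */
#include <stdint.h>

/* --- would come from AppTimeService_Autogen.c (static: file-local) --- */
static uint32_t appTimeServiceAutogenGetEpochFromPreferredSource(void)
{
    return 1700000000u; /* stub: the first enabled backend would be read here */
}

/* --- AppTimeService.c: stable public API delegating to the generated helper --- */
uint32_t appTimeServiceGetEpoch(void)
{
    return appTimeServiceAutogenGetEpochFromPreferredSource();
}
```

Keeping the generated functions static means regenerating for a different customer changes only this translation unit's internals; the public symbol set never moves.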

AppTimeService_Autogen.c

Implements appTimeServiceAutogenInit, GetEpochFromPreferredSource, UpdateRtcsFromEpoch, GetNtpEpoch, SetNtpServer β€” only the backends selected in JSON appear here.

TimeService_Config.h

Includes <Customer>/Time/Cus_TimeService_Config.h so the rest of the tree includes one stable name.
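Assuming the LinuxGcc customer is active, the wrapper would look roughly like this; the path shape is inferred from the layout described above, not copied from real output.

```c
/* Customers/TimeService_Config.h -- assumed wrapper shape. The generator
 * points this single stable include at the active customer's header. */
#include "LinuxGcc/Time/Cus_TimeService_Config.h"
```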

Cus_TimeService_Config.h

All APP_TIME_SERVICE_* macros, NTP defaults, RTC I2C settings, and either driver includes or FAILURE stubs.

Deep Dives (Sidebar)

Use the left navigation for focused topics.

Regenerate After Editing JSON

python Customers/generate_time_service_code.py --customer LinuxGcc

Replace LinuxGcc with your customer folder name under Customers/.