Azure manages policy and lifecycle; GENCITY executes AI workloads locally. No raw data traverses the boundary between local infrastructure and cloud services.
The architecture separates the control plane (Azure) from the execution plane (edge nodes). This separation ensures that sensitive data is never transmitted to cloud compute resources.
RESTful APIs provide authenticated access to inference endpoints without cloud routing. A local API gateway handles request validation and rate limiting.
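The gateway behavior described above can be sketched as follows. This is a hypothetical illustration, not the actual GENCITY implementation: the class name, token check, and the token-bucket rate limiter are all assumptions chosen to show the validate-then-throttle flow.

```python
import time

class LocalGateway:
    """Hypothetical sketch of a local API gateway: checks a bearer token,
    validates the request body, and applies per-client token-bucket rate
    limiting before a request would reach a local inference endpoint."""

    def __init__(self, api_keys, rate=5.0, burst=10):
        self.api_keys = set(api_keys)  # accepted bearer tokens
        self.rate = rate               # tokens refilled per second
        self.burst = burst             # bucket capacity
        self.buckets = {}              # client_id -> (tokens, last_refill)

    def _allow(self, client_id, now=None):
        # Token bucket: refill proportionally to elapsed time, spend one
        # token per admitted request.
        now = time.monotonic() if now is None else now
        tokens, last = self.buckets.get(client_id, (float(self.burst), now))
        tokens = min(self.burst, tokens + (now - last) * self.rate)
        if tokens < 1:
            self.buckets[client_id] = (tokens, now)
            return False
        self.buckets[client_id] = (tokens - 1, now)
        return True

    def handle(self, client_id, token, payload, now=None):
        if token not in self.api_keys:
            return 401, "unauthorized"
        if not isinstance(payload, dict) or "prompt" not in payload:
            return 400, "invalid request"
        if not self._allow(client_id, now):
            return 429, "rate limit exceeded"
        # The request stays on the local network; no cloud routing occurs.
        return 200, "accepted"
```

Because validation and throttling run at the edge, a rejected request never generates any cloud traffic at all.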
Operational metrics are hardware-anonymized before being reported to Azure. Performance data flows; sensitive content does not.
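One common way to achieve hardware anonymization is to replace device identifiers with salted HMAC digests, yielding labels that are stable per node but not reversible in the cloud. A minimal sketch, assuming a site-local salt and illustrative metric fields (the source does not specify the actual scheme):

```python
import hashlib
import hmac

# Illustrative only: in practice the salt would be generated and stored
# on-site and never reported upstream.
SITE_SALT = b"local-only-secret"

def anonymize_metrics(raw):
    """Replace the raw hardware serial with a pseudonymous node label
    before a metrics record leaves the local site."""
    node_id = hmac.new(SITE_SALT, raw["serial"].encode(),
                       hashlib.sha256).hexdigest()[:16]
    return {
        "node": node_id,                 # stable per device, non-reversible
        "gpu_util": raw["gpu_util"],     # performance data flows...
        "latency_ms": raw["latency_ms"],
        # ...but serial numbers, MACs, and payload content do not.
    }
```

Azure can still correlate telemetry from the same node over time, because the digest is deterministic, without ever learning which physical device it is.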
The deployment uses a multi-node topology with centralized fleet orchestration. Nodes span multiple physical sites under unified policy control.
Workloads fail over automatically across available nodes, and nodes continue operating autonomously during connectivity interruptions.
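The failover rule can be sketched as heartbeat-driven rebalancing. Everything here is an assumption for illustration: the heartbeat timeout, the least-loaded placement policy, and the behavior of freezing assignments when no healthy node is visible (i.e., nodes continue autonomously).

```python
HEARTBEAT_TIMEOUT = 15.0  # seconds; illustrative value

def rebalance(assignments, last_heartbeat, now):
    """Reassign workloads away from nodes that missed their heartbeat.

    assignments: dict of workload -> node
    last_heartbeat: dict of node -> last heartbeat timestamp
    """
    healthy = [n for n, t in last_heartbeat.items()
               if now - t < HEARTBEAT_TIMEOUT]
    if not healthy:
        # Connectivity loss: keep current assignments; each node runs
        # autonomously until the orchestrator is reachable again.
        return dict(assignments)

    # Current load per healthy node.
    load = {n: 0 for n in healthy}
    for node in assignments.values():
        if node in load:
            load[node] += 1

    moved = {}
    for workload, node in assignments.items():
        if node in load:
            moved[workload] = node            # node is healthy; keep it
        else:
            target = min(load, key=load.get)  # least-loaded healthy node
            moved[workload] = target
            load[target] += 1
    return moved
```

Workloads on healthy nodes never move, which keeps failover churn limited to what the failed node was actually running.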
| Boundary | Data Type | Direction | Raw Data Exposure |
|---|---|---|---|
| Edge → Orchestration | Inference results, node health | Upstream | Local only |
| Orchestration → Azure | Anonymized telemetry, policy ACKs | Upstream | None (hardware-anonymized) |
| Azure → Orchestration | Policy updates, model manifests | Downstream | N/A (control metadata) |
| Orchestration → Edge | Model deployments, inference tasks | Downstream | Local only |
| Edge → Cloud (direct) | — | Blocked | No path exists |