Architecturally, the Steward consists of four layers:
Message Generators
A set of custom extensions developed by 2021.AI that enable the APEX Enterprise components to compose and emit event messages.
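As a minimal sketch of what composing and emitting such an event message could look like, the envelope below is illustrative only; the field names and serialization are assumptions, not the actual APEX Enterprise message schema.

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
import json
import uuid

# Hypothetical event envelope; field names are illustrative, not the
# actual APEX Enterprise message schema.
@dataclass
class StewardEvent:
    event_type: str        # e.g. "data_ingest", "deployment"
    source_component: str  # which APEX component emitted the event
    payload: dict = field(default_factory=dict)
    event_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    emitted_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

    def to_message(self) -> str:
        """Serialize the event for transport to the Steward's listener."""
        return json.dumps(asdict(self))

# A component extension would compose and emit an event like this:
event = StewardEvent(
    event_type="data_ingest",
    source_component="ingestion-service",
    payload={"source": "s3", "rows": 120_000},
)
message = event.to_message()
```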
Message Listener and Store
A non-graphical central component that contains a data model for all registered event types (data ingest, dataset transformation, experiment development, model runs, deployments, retraining, platform scaling, etc.). This component is also responsible for receiving the messages from the generators.
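A sketch of this listener-and-store behavior, assuming the data model is represented as a registry of event types against which incoming messages are validated; the class and field names are illustrative, not the Steward's actual data model.

```python
import json

class MessageStore:
    """Minimal sketch of a central listener and store: incoming
    generator messages are validated against registered event types
    and persisted if they match."""

    REGISTERED_EVENT_TYPES = {
        "data_ingest", "dataset_transformation", "experiment_development",
        "model_run", "deployment", "retraining", "platform_scaling",
    }

    def __init__(self):
        self.events = []

    def receive(self, raw_message: str) -> bool:
        """Persist a message if its event type is registered."""
        event = json.loads(raw_message)
        if event.get("event_type") not in self.REGISTERED_EVENT_TYPES:
            return False  # unknown event type: rejected
        self.events.append(event)
        return True

store = MessageStore()
accepted = store.receive(json.dumps({"event_type": "deployment",
                                     "model": "churn-v2"}))
rejected = store.receive(json.dumps({"event_type": "coffee_break"}))
```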
Dashboard
Working from the message store, the dashboard provides an overview of the performance of all Data Science functions. For example, this includes the ratio of experiments to deployments, or the types of ingestion operations segmented by data source and utilized in experiments.
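One such dashboard metric, the ratio of experiments to deployments, can be computed from stored events roughly as follows; the event shape is an assumption for illustration, not the Steward's store schema.

```python
from collections import Counter

# Illustrative sample of stored events (assumed shape: dicts with an
# "event_type" key).
events = [
    {"event_type": "experiment_development"},
    {"event_type": "experiment_development"},
    {"event_type": "experiment_development"},
    {"event_type": "experiment_development"},
    {"event_type": "deployment"},
]

counts = Counter(e["event_type"] for e in events)
experiments = counts["experiment_development"]
deployments = counts["deployment"]
# Guard against division by zero when nothing has been deployed yet.
ratio = experiments / deployments if deployments else float("inf")
```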
Upstreamer
The data stored in the Steward acquires a whole new dimension when it is benchmarked against other APEX Enterprise users, with appropriate anonymization and opt-in. This is the role of the upstreamer, which, with the client's explicit approval, transmits appropriately sanitized performance data to a centralized 2021.AI meta-Steward and can receive benchmarked data for display on the dashboard.
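A hedged sketch of the upstreamer's sanitization step, dropping client-identifying fields and pseudonymizing the instance identifier before transmission; the field names and hashing scheme are assumptions for illustration, not 2021.AI's actual anonymization procedure.

```python
import hashlib

# Illustrative set of fields considered client-identifying.
SENSITIVE_FIELDS = {"client_name", "user_email", "dataset_path"}

def sanitize(event: dict, salt: str) -> dict:
    """Return a copy of the event safe for upstreaming: sensitive
    fields removed, the instance identifier replaced by a truncated
    salted hash so cross-event benchmarking still works."""
    clean = {k: v for k, v in event.items() if k not in SENSITIVE_FIELDS}
    if "instance_id" in clean:
        clean["instance_id"] = hashlib.sha256(
            (salt + clean["instance_id"]).encode()
        ).hexdigest()[:16]
    return clean

raw = {"event_type": "model_run", "instance_id": "acme-prod-01",
       "client_name": "ACME", "duration_s": 42}
safe = sanitize(raw, salt="per-client-secret")
```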
Telemetry and Backstopping
The telemetry and backstopping functional area is one of the key features of APEX Enterprise.
It is designed to encourage and facilitate growth of the in-house Data Science function, while also allowing this to be left to 2021.AI under an AI as a Service agreement. In short, the APEX platform establishes an excellent technological foundation for collaboration between the internal Data Science team and 2021.AI's Data Scientists and infrastructure experts. AI as a Service empowers your organization to develop its own in-house expertise while supporting your organization's growth with the APEX platform.
Data Science Backstopping
Inside each instance of APEX Enterprise, the Steward accumulates information concerning the performance of the Data Science processes.
If the client elects to upstream this data to 2021.AI's central facilities, the 2021.AI Data Science team can provide support with proactive monitoring of process performance. Furthermore, your in-house Data Scientists will be able to reach out freely to members of the 2021.AI Data Science team for guidance and consultation. Should you wish to outsource an entire model development task to the 2021.AI upstream resources, we also offer this under our AI as a Service agreement. Both APEX Now and APEX Enterprise support the manual aspects of this backstopping; however, automated upstreaming is a feature of APEX Enterprise.
Infrastructure and Technical Backstopping
APEX Go and APEX Enterprise support the delegation of monitoring to a central resource at 2021.AI.
This is accomplished using the Icinga2 monitoring system operated by 2021.AI. When an APEX Go instance is deployed, it is placed under our Icinga2 monitoring, with corresponding obligations in the Service Level Agreement. Note that this Icinga2 monitoring system is only accessible to 2021.AI's infrastructure specialists. Should the client wish to operate their own monitoring system, we can implement this as a separate service.
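As a sketch, registering an APEX Go instance for delegated monitoring might use standard Icinga2 host and service objects along these lines; the hostname, address, and check parameters are illustrative, not 2021.AI's actual monitoring configuration.

```icinga2
// Illustrative Icinga2 objects for a monitored APEX Go instance.
object Host "apex-go-instance-01" {
  import "generic-host"
  address = "203.0.113.10"        // example address (documentation range)
  check_command = "hostalive"     // basic reachability check
}

object Service "https" {
  import "generic-service"
  host_name = "apex-go-instance-01"
  check_command = "http"          // check the instance's HTTPS endpoint
  vars.http_ssl = true
}
```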