Basics

Engine

Engine is the core of any application and the organizer of its pipeline. It does all the work of controlling lower-level instances while providing a simple and user-friendly interface. Engine manages data (video) streams, processing, and rendering. It is created for a particular Processor: the Processor constructor is provided to the Engine, and the latter is responsible for initializing, setting up, and controlling the processor during the life cycle of the application. Results of processing are passed to a Renderer attached to the Engine. Renderers use the provided results to define the application's logic and visualization.

All core components of @geenee/armature (Engine, Processor, and Renderer) are generic classes parameterized by the type of processing results emitted by the Processor. If an Engine is created for a Processor emitting ResultT data, only Renderers accepting ResultT can be attached to it. An optional type parameter for instance settings can also be defined; it controls the object's behavior. For Processors it is usually a set of flags that enable or disable evaluation of a particular result. If a result is not required by the Renderer, its computation can be skipped, increasing the performance and speed of the application.
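
The sketch below illustrates this generic relationship. It is a simplified illustration under assumed shapes, not the SDK's actual type definitions: the Processor, Renderer, and Engine declarations here are placeholders that only demonstrate how the ResultT parameter ties the three components together.

// Simplified illustration of the generic relationship, not the SDK's actual definitions
interface Processor<ResultT, SettingsT = unknown> {
    // SettingsT is typically a set of flags enabling or disabling particular results
    init(settings: SettingsT): Promise<void>;
    process(frame: ImageBitmap): Promise<ResultT>;
}
interface Renderer<ResultT> {
    load(): Promise<void>;
    update(result: ResultT): Promise<void>;
}
class Engine<ResultT> {
    private renderers: Renderer<ResultT>[] = [];
    private processor: Processor<ResultT>;
    // The Processor constructor (class) is provided to the Engine
    constructor(processorClass: new () => Processor<ResultT>) {
        this.processor = new processorClass();
    }
    // Type-checked: only Renderers accepting the same ResultT can be attached
    async addRenderer(renderer: Renderer<ResultT>) {
        this.renderers.push(renderer);
        await renderer.load();
    }
}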

To build an AR experience you only need to implement a Renderer, where all application logic happens. The SDK provides a set of ready-made Processors as well as a set of predefined Renderer classes that can be used as starting points. The SDK is framework-agnostic, so you can use any rendering engine, such as three.js or babylon.js, for visualization.

Via the Engine instance you can set up, start, and pause the pipeline. Before starting the pipeline, call both initialization methods, init() and setup(); these set up processing and video capture respectively. To initialize an Engine instance you need an SDK access token associated with the current URL where the web app is deployed. If you call setup() while the pipeline is in the playing state, the Engine will be reset() automatically, so you will need to call start() again to resume playing. Do not call the Engine's state control methods concurrently.
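
For instance, the state control flow described above could look like the following sketch. It assumes an engine instance that has already been initialized and set up as in the example below, and is meant to run inside an async function.

// Sketch of the state control flow; assumes `engine` was initialized
// and set up as in the example below (inside an async function)
await engine.start();    // enter the playing state
await engine.pause();    // pause processing, e.g. when the tab is hidden
await engine.start();    // resume playing
// Calling setup() while playing resets the pipeline,
// so start() must be called again afterwards
await engine.setup({ size: { width: 1280, height: 720 } });
await engine.start();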

AsyncEngine is an experimental extension of the basic Engine that does processing in the background. Its pipeline provides better performance and a more stable frame rate. By using the async type of engine you can, in some cases, achieve a smoother and faster experience in the app. AsyncEngine and Engine are compatible: you can use either of them without additional code adjustments.
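
Because the two engines are interchangeable, switching to the asynchronous pipeline is a one-line change, as sketched below. This assumes AsyncEngine is exported from @geenee/armature alongside Engine; check the module documentation for the exact entry point.

// Assumption: AsyncEngine is exported from @geenee/armature alongside Engine
import { AsyncEngine } from "@geenee/armature";
import { FaceProcessor } from "@geenee/bodyprocessors";
// Drop-in replacement: the rest of the application code stays the same
const engine = new AsyncEngine(FaceProcessor);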

Example

To build an app you simply need to create an Engine instance for a Processor and attach a Renderer:

import { Engine } from "@geenee/armature";
import { FaceProcessor } from "@geenee/bodyprocessors";
import { YourRenderer } from "./yourrenderer";
import "./index.css";

const engine = new Engine(FaceProcessor);
// Equivalently
// const engine = new FaceEngine();
// SDK token is bound to the URL where the app is deployed
const token = location.hostname === "localhost" ?
    "localhost_sdk_token" : "prod.url_sdk_token";

async function main() {
    const container = document.getElementById("root");
    if (!container)
        return;
    const renderer = new YourRenderer(container);
    // Attach the renderer and initialize processing in parallel
    await Promise.all([
        engine.addRenderer(renderer),
        engine.init({ token: token, transform: true })
    ]);
    // Set up video capture and start the pipeline
    await engine.setup({ size: { width: 1920, height: 1080 } });
    await engine.start();
}
main();

The SDK provides the following ready-made engine specializations:

Processors

Processor is the core computation part of an Engeenee app. The Engine is created for a particular Processor: its constructor is provided to the Engine, and the latter is responsible for initializing, setting up, and controlling the processor during the life cycle of the application. Results of processing are passed to a Renderer attached to the Engine. Renderers use the provided results to define the application's logic and visualization.

The SDK provides the following ready-made processors:

For more details refer to the documentation of the @geenee/bodyprocessors module.

Renderer

Renderer is the core visualization and logic part of any application. It is attached to the Engine. Basically, renderers define two methods: load() and update(). The first is used to initialize all assets and prepare the scene, for example to set up lighting and the environment map. The Engine calls the load() method during pipeline initialization or when the renderer is attached. The second is used to update the scene according to the results of video processing; this is where all the logic happens. Renderer itself is a generic abstract class defining the common API.
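
As an illustration, a minimal renderer could follow the shape sketched below. This is a simplified sketch of the contract described above, not the SDK's actual class definition; the result type and the scene handling are placeholders to be replaced with your Processor's result type and your rendering engine of choice.

// Simplified sketch of the Renderer contract, not the SDK's actual class definition
interface ResultT { detected: boolean; }    // placeholder result type

class SketchRenderer {
    // Called by the Engine during pipeline initialization or when the renderer
    // is attached: load assets and prepare the scene (lighting, environment map)
    async load(): Promise<void> {
        // e.g. load 3D models and build the scene graph here
    }
    // Called with the results of video processing: update the scene
    // and apply the application's logic
    async update(result: ResultT): Promise<void> {
        if (result.detected) {
            // position and show 3D content according to the result
        }
    }
}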