You.i Engine One: Platform Specific Video Players

John Cassidy
7 min read · Oct 14, 2020

You.i Engine One takes advantage of platform-specific video player implementations on each supported platform. These players can be moved and manipulated through the Engine with basic interaction at the JavaScript level. In addition to providing information to the players, positioning the player and integrating it into your app design can be achieved easily using either the After Effects (AE) workflow or standard animation libraries with React Native.

Sample Squeeze Credits

This article explains how that abstraction layer works, and how the platform players are rendered exactly where you want them for the best possible user experience.

Platform Players

Each platform supported by You.i Engine One comes with a native player implementation appropriate to the platform being built. Typically this is the de-facto standard player you would use if you were building a native app in that platform's own framework. Two common examples that demonstrate this integration are the Apple platforms (AVPlayer) and the Android platforms (ExoPlayer).

You.i Engine One provides an abstraction layer, CYIAbstractVideoPlayer, that exposes the functionality you would expect cross-platform when interacting with a player. When using the default players shipped with the engine, you do not need to know which underlying player is being used or how.

React Native Video Component

A JSX Component, Video (or VideoRef when using the AE workflow), is provided to represent the CYIAbstractVideoPlayer interface. It contains properties, imperative calls, and callbacks that match up with the abstraction.

<Video
  style={{ width: videoWidth, height: videoHeight }}
  source={{
    uri: bigBuckBunnyUrl,
    type: 'HLS'
  }}
  onReady={onReadyToPlay}
/>

Recap on Native Components

A JSX Component is the declarative piece that lets the developer describe how they want something rendered on screen. The Counterpart is a You.i Engine One SDK Element represented in the application Scene Tree; this element is ultimately what is rendered on screen. The ShadowView is the glue that holds them together: it receives properties from the JSX Component and maps them to the Counterpart SDK Element, and it listens for signals from the SDK Element and propagates them to the JSX Component for consumption in the JS layer.

Video Native Component

Video is no different from other native components. The JSX Component is represented by Video or VideoRef. While you might assume that the Counterpart SDK Element would be the concrete implementation of the abstract player itself, it is actually a CYIVideoSurfaceView. This element is what is rendered on screen, and represents the area where the video will be.

That being said, if the Counterpart SDK Element is not an instance of the player, where do the properties from <Video /> go? Where do we listen for callbacks to pass back to the JS layer?

The ShadowView in this case is also responsible for creating a concrete video player, passing it the properties from the JSX Component, and listening for its events.
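Conceptually, that brokering role can be sketched in C++ as below. This is a simplified sketch under assumed names (PlatformPlayerStub, SetSourceProp, onReadyJs); the real VideoShadowView API is richer than this.

```cpp
#include <functional>
#include <memory>
#include <string>

// Stand-in for a concrete platform player; names are assumptions,
// not the actual You.i Engine One API.
struct PlatformPlayerStub
{
    std::function<void()> onReady; // fired by the platform SDK when prepared
    std::string lastUrl;

    void Prepare(const std::string &url)
    {
        lastUrl = url;
        if (onReady) { onReady(); } // pretend the stream prepared instantly
    }
};

class VideoShadowView
{
public:
    VideoShadowView()
    {
        // The ShadowView creates the concrete player itself; the Counterpart
        // scene-tree element is only the on-screen surface.
        m_player = std::make_unique<PlatformPlayerStub>();
        m_player->onReady = [this] {
            if (onReadyJs) { onReadyJs(); } // propagate up to the JS layer
        };
    }

    // Called when the JSX <Video source={{ uri }} /> prop arrives.
    void SetSourceProp(const std::string &uri) { m_player->Prepare(uri); }

    std::function<void()> onReadyJs; // stands in for the onReady JSX callback

private:
    std::unique_ptr<PlatformPlayerStub> m_player;
};
```

When the source prop arrives from JSX, the shadow view forwards it to the player it created, and player events flow back up the same path.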

Video JSX Component
A basic component that provides the properties and callbacks necessary for a video player experience. All player logic and callbacks can be handled in JS without any knowledge of which platform you are running on.

<Video
  source={{
    uri: 'https://example.com/stream.m3u8',
    type: 'HLS'
  }}
  onPreparing={() => {}}
  onReady={() => {}}
  onPlaying={() => {}}
  onPaused={() => {}}
  onPlaybackComplete={() => {}}
/>

When using the AE workflow, the equivalent is a VideoRef component inside a Composition.

<Composition source="Composition_source">
  <VideoRef
    name="layer_name"
    source={{
      uri: 'https://example.com/stream.m3u8',
      type: 'HLS'
    }}
    onPreparing={() => {}}
    onReady={() => {}}
    onPlaying={() => {}}
    onPaused={() => {}}
    onPlaybackComplete={() => {}}
  />
</Composition>

CYIVideoSurfaceView

The CYIVideoSurfaceView is the renderable item that can be manipulated in the Scene Tree and represents where your video player will be. If the correct capabilities are set, the surface view can be scaled, translated, and even have its opacity manipulated. This allows you to move the player around the screen for a more immersive design experience.

AE Workflow of creating a Squeeze Credits screen

The AE workflow gives the designer the most flexibility and freedom in motion design, as it lets them decide how your video surface responds while the user interacts with elements on the screen.

Preview of Squeeze Credits screen made using the AE Workflow

Of course, animations can also be applied programmatically with React Native's Animated library.

Note: if transforming the surface on Android, you might observe a lag in the video view due to the way Android updates the platform video view's position. In the past I have faded the video out momentarily until the transform animation completes, then brought it back, for the best user experience. Other platforms, such as Tizen, may also limit what you can do with their player surface at the platform level.

VideoShadowView

The manager responsible for receiving props from the JSX Video Component, creating (or finding) the CYIVideoSurfaceView counterpart, creating the PlatformPlayer, and brokering all communication between them.

PlatformPlayer

A concrete implementation of CYIAbstractVideoPlayer.

A concrete player implements each of the abstract player's methods for a specific platform

The core implementation is in C++, but each platform's implementation is done in accordance with that platform's SDK integration layer.

Typical structure for working with multiple platforms with a common interface
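In code, that structure amounts to a common C++ interface with one concrete subclass per platform, chosen by a factory. This is an illustrative sketch; the class and method names below are my assumptions, not the actual engine headers.

```cpp
#include <memory>
#include <string>

// Illustrative common interface; names are assumptions, not the real
// CYIAbstractVideoPlayer header.
class AbstractVideoPlayer
{
public:
    virtual ~AbstractVideoPlayer() = default;
    virtual void Prepare(const std::string &url) = 0;
    virtual std::string BackingSdkName() const = 0;
};

// On Apple platforms the concrete player wraps AVPlayer...
class AppleVideoPlayer : public AbstractVideoPlayer
{
public:
    void Prepare(const std::string & /*url*/) override { /* bridge to AVPlayer */ }
    std::string BackingSdkName() const override { return "AVPlayer"; }
};

// ...and on Android it wraps ExoPlayer via the JNI integration layer.
class AndroidVideoPlayer : public AbstractVideoPlayer
{
public:
    void Prepare(const std::string & /*url*/) override { /* bridge to ExoPlayer */ }
    std::string BackingSdkName() const override { return "ExoPlayer"; }
};

// A factory selects the implementation per platform (normally at build time;
// a runtime flag is used here only so the sketch is self-contained).
std::unique_ptr<AbstractVideoPlayer> CreatePlatformPlayer(bool isApple)
{
    if (isApple) { return std::make_unique<AppleVideoPlayer>(); }
    return std::make_unique<AndroidVideoPlayer>();
}
```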

The PlatformPlayer is responsible for integrating with platform specific SDKs for players (AVPlayer, ExoPlayer) and mapping calls from the abstraction layer to the platform player directly. If you are familiar with native development on Apple or Android, this is where you will see familiar SDK integrations in Java and Kotlin for Android as well as Objective-C and Swift for Apple.

PlatformSurface

Created by the PlatformPlayer, the PlatformSurface is an implementation of CYIVideoSurfacePlatform. It is provided to the CYIVideoSurfaceView upon creation to represent the surface that the platform player is rendered on. It is generally responsible for three items:

  1. Interface for receiving updated positioning coordinates of the player
  2. Signal handler for when it is attached to a CYIVideoSurfaceView
  3. Signal handler for when it is detached from a CYIVideoSurfaceView

Items (2) and (3) matter more when a player's lifespan is longer and not tightly coupled to the lifecycle of the JSX Component, as it is with RN. In more traditional C++ apps, it was common for a PlatformPlayer and PlatformSurface to live for the duration of the application lifecycle, where the only thing that changed was which CYIVideoSurfaceView they were associated with.
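A minimal sketch of those three responsibilities, under assumed names (SetVideoRectangle, OnAttached, OnDetached); the real CYIVideoSurfacePlatform interface differs.

```cpp
#include <functional>

// Illustrative sketch of a platform surface; all names are assumptions.
struct Rect { float x, y, width, height; };

class PlatformSurface
{
public:
    // (1) Interface for receiving updated positioning coordinates.
    void SetVideoRectangle(const Rect &rect)
    {
        // Only forward positions while a surface view owns this surface.
        if (attached && onRectChanged) { onRectChanged(rect); }
    }

    // (2) Signal handler: a CYIVideoSurfaceView adopted this surface.
    void OnAttached() { attached = true; }

    // (3) Signal handler: the surface view released this surface.
    void OnDetached() { attached = false; }

    bool attached = false;
    // Would forward to the PlatformPlayer's SDK view in a real implementation.
    std::function<void(const Rect &)> onRectChanged;
};
```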

How is a Platform Player Rendered on Screen

With all the components above in place, the manner in which the platform player knows where to render can be simplified as follows:

On each frame, the CYIVideoSurfaceView requests its position on screen and then notifies the Platform Surface, which has been attached as a listener. The Platform Surface then notifies the Platform Player, which updates the platform-specific SDK view's frame to match the size and position. This means that on every frame update, the SDK view associated with the player is updated to match.

The video is not rendered directly on the engine's surface; instead, the SDK view is inserted below the Engine's rendering surface, and the CYIVideoSurfaceView acts as a window onto the SDK view that is being updated to match. This is so that any custom controls built with the engine render on top of the player surface.

Updating a Platform Player SDK View

With each frame update, the Platform Player is provided coordinates that can be translated into coordinates mapping onto the full screen. The following snippet is from a Platform Player implementation that receives coordinates from the Platform Surface.

Sample of receiving coordinates from the CYIVideoSurfaceView
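As a hedged stand-in for that sample, the handler below sketches receiving a per-frame rectangle from the surface and translating it from engine render-surface space into screen space. The names and the simple scale-factor math are my assumptions, not the engine's actual code.

```cpp
// Illustrative sketch only; names are assumptions.
struct Rect { float x, y, width, height; };

class PlatformPlayerListener
{
public:
    // Scale factors: screen size divided by engine render-surface size.
    PlatformPlayerListener(float scaleX, float scaleY)
        : m_scaleX(scaleX), m_scaleY(scaleY) {}

    // Called by the Platform Surface each frame with the surface view's rect;
    // returns the equivalent rect in device screen coordinates.
    Rect OnVideoRectangleChanged(const Rect &engineRect) const
    {
        return { engineRect.x * m_scaleX, engineRect.y * m_scaleY,
                 engineRect.width * m_scaleX, engineRect.height * m_scaleY };
    }

private:
    float m_scaleX, m_scaleY;
};
```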

These coordinates can then be provided to an SDK View to match.

Sample implementation of updating position of platform player view on iOS

If the SDK player instance has not been created, there is nothing to update. If an SDK view representing the SDK player doesn't exist, it is created at the appropriate size and position. If both exist and the provided coordinates differ from the SDK view's current coordinates, the view is updated to reflect where it should be on screen.
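The logic just described can be sketched platform-neutrally. A real iOS implementation would manipulate an AVPlayerLayer or UIView frame instead of the plain struct used here, and all names are assumptions.

```cpp
#include <memory>

// Illustrative sketch of the create-or-update logic; names are assumptions.
struct Rect
{
    float x = 0, y = 0, width = 0, height = 0;
    bool operator==(const Rect &o) const
    { return x == o.x && y == o.y && width == o.width && height == o.height; }
};

struct SdkView { Rect frame; }; // stands in for a UIView/AVPlayerLayer

class PlatformPlayer
{
public:
    bool playerCreated = false;       // stands in for the SDK player handle
    std::unique_ptr<SdkView> sdkView; // created lazily alongside the player
    int updates = 0;                  // counts frame changes after creation

    void SetVideoRectangle(const Rect &rect)
    {
        if (!playerCreated) { return; }      // no player: nothing to update
        if (!sdkView)
        {
            sdkView = std::make_unique<SdkView>(); // create at size/position
            sdkView->frame = rect;
            return;
        }
        if (!(sdkView->frame == rect))       // skip redundant per-frame work
        {
            sdkView->frame = rect;
            ++updates;
        }
    }
};
```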

The end result is the Platform Player constantly being updated to sit where the CYIVideoSurfaceView is in the application Scene Tree, providing a window onto the platform player surface.
