How It Works

Video Frame Processing Flow

  1. Frame Reception: The Vonage SDK calls renderVideoFrame(_:) on your CustomVideoRender instance for each video frame
  2. Frame Processing: The frame is passed to CustomRenderView, which processes the frame data and creates an image
  3. Display: The draw(_:) method is called on the main thread to render the processed image to the screen
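The three steps above can be sketched as follows. This is a minimal, hedged outline assuming the Vonage (OpenTok) iOS SDK's OTVideoRender protocol and the CustomRenderView type from this example; the exact wiring in your project may differ.

```swift
import UIKit
import OpenTok  // Vonage Video iOS SDK

// Sketch of the frame-processing flow, assuming OTVideoRender and the
// example's CustomRenderView (which exposes its own renderVideoFrame(_:)).
class CustomVideoRender: NSObject, OTVideoRender {
    let renderView = CustomRenderView()

    // 1. Frame Reception: the SDK calls this for each decoded video frame.
    func renderVideoFrame(_ frame: OTVideoFrame) {
        // 2. Frame Processing: hand the frame to the view, which converts
        //    the raw frame data into a displayable image.
        renderView.renderVideoFrame(frame)
        // 3. Display: request a redraw so draw(_:) runs on the main thread.
        DispatchQueue.main.async {
            self.renderView.setNeedsDisplay()
        }
    }
}
```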

The example implementation converts frames to grayscale, but you can modify the processing logic in renderVideoFrame(_:) to apply any transformation you need.

Customization Options

You can modify the frame processing in CustomRenderView.renderVideoFrame(_:) to apply different effects or transformations. The OTVideoFrame object provides access to the raw frame data through its planes property, which you can process however you need.
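As one example of working with the planes property, the grayscale effect can be produced by drawing only the luma plane of an I420 frame. This is a hedged sketch: it assumes planes[0] is the Y plane and that rows are packed with no padding (in production, read the stride from the frame's format).

```swift
import UIKit
import OpenTok  // Vonage Video iOS SDK

extension CustomRenderView {
    // Build a grayscale UIImage from the Y (luma) plane of an I420 frame.
    // Assumption: planes[0] is the Y plane and bytesPerRow == width.
    func grayscaleImage(from frame: OTVideoFrame) -> UIImage? {
        guard let format = frame.format,
              let yPlane = frame.planes?.pointer(at: 0) else { return nil }
        let width = Int(format.imageWidth)
        let height = Int(format.imageHeight)

        // Wrap the luma bytes in a single-channel grayscale bitmap context.
        guard let context = CGContext(
            data: yPlane,
            width: width,
            height: height,
            bitsPerComponent: 8,
            bytesPerRow: width,  // assumes no row padding
            space: CGColorSpaceCreateDeviceGray(),
            bitmapInfo: CGImageAlphaInfo.none.rawValue
        ), let cgImage = context.makeImage() else { return nil }

        return UIImage(cgImage: cgImage)
    }
}
```

Because I420 stores luma at full resolution, dropping the chroma planes yields a correct grayscale image with no per-pixel arithmetic.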

For more advanced processing, you can also use Metal or Core Image frameworks to apply filters and effects to the video frames.
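As an illustration of the Core Image route, the sketch below applies a built-in sepia filter to a frame that has already been converted to a CGImage. The function name and parameters are hypothetical; in a real renderer you would create the CIContext once and reuse it, since constructing one per frame is expensive.

```swift
import UIKit
import CoreImage

// Hedged sketch: apply a Core Image filter to an already-converted frame.
// applySepia and its parameters are illustrative, not part of the SDK.
func applySepia(to cgImage: CGImage,
                using ciContext: CIContext = CIContext()) -> UIImage? {
    let input = CIImage(cgImage: cgImage)

    // CISepiaTone is a built-in filter; any CIFilter could be used here.
    guard let filter = CIFilter(name: "CISepiaTone") else { return nil }
    filter.setValue(input, forKey: kCIInputImageKey)
    filter.setValue(0.8, forKey: kCIInputIntensityKey)

    guard let output = filter.outputImage,
          let rendered = ciContext.createCGImage(output, from: output.extent)
    else { return nil }
    return UIImage(cgImage: rendered)
}
```

For sustained per-frame work, a Metal-backed CIContext (or a custom Metal pipeline) keeps the processing on the GPU and avoids round-trips through UIImage.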

Testing

Test on iOS Simulator

  1. Run the app in the iOS Simulator
  2. The simulator will use a demo video (no camera access)
  3. You should see the grayscale video feed

Test on Physical Device

  1. Connect an iOS device
  2. Select it as the run destination
  3. Grant camera permissions when prompted
  4. You should see your camera feed in grayscale

Test with Multiple Participants

  1. Run the app on a device or simulator
  2. Use the Vonage Video Playground to join the same session
  3. You should see both your custom-rendered stream and the standard subscriber stream