How It Works
Video Frame Processing Flow
- Frame Reception: The Vonage SDK calls renderVideoFrame(_:) on your CustomVideoRender instance for each video frame.
- Frame Processing: The frame is passed to CustomRenderView, which processes the frame data and creates an image.
- Display: The draw(_:) method is called on the main thread to render the processed image to the screen.
The example implementation converts frames to grayscale, but you can modify the processing logic in renderVideoFrame(_:) to apply any transformation you need.
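The flow above can be sketched as a single view that both receives and draws frames. This is a minimal sketch, not the tutorial's exact implementation: it assumes an I420 frame layout in which plane 0 of OTVideoFrame.planes is the luma (Y) plane, so drawing only that plane through a grayscale color space produces the black-and-white effect. Property and index details (bytesPerRow, plane order) may differ in your SDK version.

```swift
import UIKit
import OpenTok  // Vonage Video iOS SDK

// Hedged sketch: a UIView that renders incoming frames in grayscale by
// drawing only the Y (luma) plane of an assumed I420 frame layout.
class CustomRenderView: UIView, OTVideoRender {
    private var currentImage: CGImage?

    func renderVideoFrame(_ frame: OTVideoFrame) {
        guard let format = frame.format,
              let yPlane = frame.planes?.pointer(at: 0) else { return }
        let width = Int(format.imageWidth)
        let height = Int(format.imageHeight)
        // Assumption: bytesPerRow[0] holds the stride of the Y plane.
        let stride = (format.bytesPerRow[0] as? NSNumber).map(\.intValue) ?? width

        // Wrap the Y plane in a grayscale CGImage -- discarding the chroma
        // planes is what converts the frame to black and white.
        guard let context = CGContext(
            data: yPlane,
            width: width, height: height,
            bitsPerComponent: 8, bytesPerRow: stride,
            space: CGColorSpaceCreateDeviceGray(),
            bitmapInfo: CGImageAlphaInfo.none.rawValue
        ) else { return }
        currentImage = context.makeImage()

        // Frames arrive on a background queue; drawing must happen on main.
        DispatchQueue.main.async { self.setNeedsDisplay() }
    }

    override func draw(_ rect: CGRect) {
        guard let image = currentImage,
              let ctx = UIGraphicsGetCurrentContext() else { return }
        // Core Graphics draws with a flipped origin; mirror vertically
        // so the video appears upright.
        ctx.translateBy(x: 0, y: rect.height)
        ctx.scaleBy(x: 1, y: -1)
        ctx.draw(image, in: rect)
    }
}
```

Copying the image out of the CGContext before the SDK reuses the frame buffer is important; do any heavier per-pixel work off the main thread.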
Customization Options
You can modify the frame processing in CustomRenderView.renderVideoFrame(_:) to apply different effects or transformations. The OTVideoFrame object provides access to the raw frame data through its planes property, which you can process however you need.
For more advanced processing, you can also use Metal or Core Image frameworks to apply filters and effects to the video frames.
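As one possible shape for that, the sketch below runs a frame image through a built-in Core Image filter (CIPhotoEffectNoir, a stock monochrome effect); any CIFilter chain could be substituted. The helper name and the idea of converting via CGImage are assumptions for illustration, not part of the tutorial's code.

```swift
import CoreImage
import CoreGraphics

// Reuse one CIContext across frames -- creating it per frame is expensive.
let ciContext = CIContext()

// Hypothetical helper: apply a Core Image filter to a single frame image.
func applyNoirFilter(to image: CGImage) -> CGImage? {
    let input = CIImage(cgImage: image)
    guard let filter = CIFilter(name: "CIPhotoEffectNoir") else { return nil }
    filter.setValue(input, forKey: kCIInputImageKey)
    guard let output = filter.outputImage else { return nil }
    // Render the filtered result back to a CGImage for drawing.
    return ciContext.createCGImage(output, from: output.extent)
}
```

A Metal-backed CIContext (CIContext(mtlDevice:)) keeps this pipeline on the GPU if per-frame CPU cost becomes a bottleneck.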
Testing
Test on iOS Simulator
- Run the app in the iOS Simulator
- The simulator will use a demo video (no camera access)
- You should see the grayscale video feed
Test on Physical Device
- Connect an iOS device
- Select it as the run destination
- Grant camera permissions when prompted
- You should see your camera feed in grayscale
Test with Multiple Participants
- Run the app on a device or simulator
- Use the Vonage Video Playground to join the same session
- You should see both your custom-rendered stream and the standard subscriber stream
Basic video rendering
Learn how to use a custom video renderer in Swift to display a black-and-white version of a video stream using the Vonage Video iOS SDK.
Steps
1. Introduction
2. Getting Started
3. Creating a New Project
4. Adding the Vonage Video SDK
5. Setting Up Authentication
6. Understanding the Architecture
7. Create the Custom Render View
8. Create the Custom Video Renderer
9. Integrating with Vonage Video Manager
10. Create UIView to SwiftUI Wrapper
11. Display in SwiftUI
12. How It Works
13. Conclusion