
MaximKalinin

Screen share on iOS gets captured in the Broadcast Extension (which is implemented by the library consumer) and then sent to the main app via a Unix socket. However, this was previously not very efficient, mainly for two reasons:

  1. The extension was sending the frames as JPEG images. Encoding to JPEG is relatively expensive and cannot sustain 60fps (which, as of now, is the maximum ReplayKit provides).
  2. Those frames were wrapped in HTTP messages. Receiving an HTTP message chunk by chunk requires reallocating memory for each chunk, which is inefficient.

To address these bottlenecks, the H264 video codec is now used to send the frames from the extension to the main app. H264 is hardware-accelerated on all iOS devices and is designed for video, so it is very efficient. Decoding is done with the help of the [Transcoding](https://github.com/finnvoor/Transcoding/) library. Since that library is only available as an SPM package, its source code was vendored into the project. The library is also written in Swift, which means the part of the project that interacts with it (the ScreenCapturer class) had to be written in Swift as well.
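
For illustration, here is a minimal sketch of the hardware-accelerated encode path using VideoToolbox directly; the PR itself goes through the vendored Transcoding library, and the `H264Encoder` name and `sendToMainApp` hook are hypothetical placeholders rather than APIs from this project:

```swift
import CoreMedia
import VideoToolbox

/// Sketch of a hardware H264 encoder (illustrative, not this project's API).
final class H264Encoder {
    private var session: VTCompressionSession?

    init?(width: Int32, height: Int32) {
        let status = VTCompressionSessionCreate(
            allocator: kCFAllocatorDefault,
            width: width,
            height: height,
            codecType: kCMVideoCodecType_H264,
            encoderSpecification: nil,
            imageBufferAttributes: nil,
            compressedDataAllocator: nil,
            outputCallback: nil,
            refcon: nil,
            compressionSessionOut: &session
        )
        guard status == noErr, let session = session else { return nil }
        // Real-time mode keeps encoder latency low, which matters for screen share.
        VTSessionSetProperty(session, key: kVTCompressionPropertyKey_RealTime, value: kCFBooleanTrue)
    }

    /// Encode one ReplayKit frame; the handler receives the compressed sample.
    func encode(_ pixelBuffer: CVPixelBuffer, presentationTimeStamp: CMTime) {
        guard let session = session else { return }
        VTCompressionSessionEncodeFrame(
            session,
            imageBuffer: pixelBuffer,
            presentationTimeStamp: presentationTimeStamp,
            duration: .invalid,
            frameProperties: nil,
            infoFlagsOut: nil
        ) { status, _, sampleBuffer in
            guard status == noErr, let sampleBuffer = sampleBuffer else { return }
            // Hypothetical hook: hand the H264 sample to the socket writer.
            _ = sampleBuffer // sendToMainApp(sampleBuffer)
        }
    }
}
```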

Messages are now sent in a binary format with a length prefix, so there is no need to reallocate memory for each message chunk: the total length is known up front.
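
Here is a minimal sketch of such length-prefixed framing, with illustrative names (the actual message layout in the PR may differ):

```swift
import Foundation

/// Prefix a payload with its 4-byte big-endian length (illustrative layout).
func frame(_ payload: Data) -> Data {
    var length = UInt32(payload.count).bigEndian
    var message = Data(bytes: &length, count: MemoryLayout<UInt32>.size)
    message.append(payload)
    return message
}

/// Read exactly `buffer.count` bytes from a blocking socket descriptor.
private func readFully(_ fd: Int32, into buffer: inout [UInt8]) -> Bool {
    var offset = 0
    while offset < buffer.count {
        let n = buffer.withUnsafeMutableBytes { raw in
            read(fd, raw.baseAddress! + offset, raw.count - offset)
        }
        if n <= 0 { return false } // EOF or error
        offset += n
    }
    return true
}

/// Read one framed message: the header announces the payload size, so the
/// receiver allocates the payload buffer exactly once per message.
func readFrame(from fd: Int32) -> Data? {
    var header = [UInt8](repeating: 0, count: 4)
    guard readFully(fd, into: &header) else { return nil }
    let length = (Int(header[0]) << 24) | (Int(header[1]) << 16) |
        (Int(header[2]) << 8) | Int(header[3])
    var payload = [UInt8](repeating: 0, count: length)
    guard readFully(fd, into: &payload) else { return nil }
    return Data(payload)
}
```

That single up-front allocation is exactly what the chunk-by-chunk HTTP parsing could not do.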

@8BallBomBom added the `enhancement`, `ios`, and `needs-reviewing` labels on Aug 16, 2025
Since the Transcoding library requires iOS 15, we need to bump the target iOS version to 15; otherwise the project won't compile.
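
For reference, in a CocoaPods setup (the project uses CocoaPods, as the Xcode note below mentions), the bump would look roughly like this; the exact Podfile in the project may differ:

```ruby
# Podfile (sketch): raise the deployment target so the vendored
# Transcoding sources compile.
platform :ios, '15.0'
```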
@MaximKalinin force-pushed the `refactor/use-h264-to-send-frames-from-extension` branch from ad5510e to 64f48fe on August 17, 2025
The newer version of Xcode (16) is not compatible with CocoaPods. The build broke when a broadcast extension target was added to the project, with the following error:

> RuntimeError - PBXGroup attempted to initialize an object with unknown ISA
> PBXFileSystemSynchronizedRootGroup from attributes:

The fix was found here: CocoaPods/CocoaPods#12583 (comment)