Sorry for the long development cycle before this official release. We are on our way to making iterations much shorter than that, but there's still some work to do before this new development process becomes routine.
Here are the most important updates we've been working on.
New WebRTC use cases
We now have the following WebRTC use cases available in MPlatform SDK with sample applications and documentation:
remote preview and control for a playlist application;
remote preview and control for a video recording application;
transmitting WebRTC streams and messages between native applications.
WebRTC features are available to anyone with a basic MPlatform SDK license.
New screen capture engine
The new screen capture engine is a native implementation (not based on DirectShow), providing better stability, higher performance and improved capture quality. The engine is included with the basic MPlatform SDK license.
Bluefish444 hardware support
Cards by Australian manufacturer Bluefish444 are now officially supported through the hardware integration part of MFormats SDK. Click here for information about supported models.
Multithreaded decoding logic for M-JPEG and MPEG-DASH playback support
It is now possible to play back MPEG-DASH streams: try it with the Network Playback Sample application, just as you would a common network stream. Also, M-JPEG decoding performance has been improved with multithreaded logic: streams now decode up to 4 times faster (depending on the CPU).
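Because every M-JPEG frame is a self-contained JPEG image, frames can be decoded independently of each other, which is what makes multithreading effective here. A minimal sketch of the idea (with a stand-in decode function, not the SDK's internals):

```python
from concurrent.futures import ThreadPoolExecutor

def decode_frame(jpeg_bytes: bytes) -> int:
    # Stand-in for a real JPEG decode; here it just returns the size.
    return len(jpeg_bytes)

def decode_stream(frames, workers=4):
    # Each M-JPEG frame is independent, so decoding parallelises cleanly;
    # pool.map preserves the original frame order.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(decode_frame, frames))

frames = [b"\xff\xd8" * n for n in (1, 2, 3)]
print(decode_stream(frames))  # [2, 4, 6]
```

This is also why the speed-up depends on the CPU: with four worker threads, four cores can each decode a different frame at the same time.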
Improved transition logic for clips with different frame rates
In order to avoid possible audio issues when doing a transition from, for instance, a 25 FPS file to a 30 FPS file, we now convert the stream to the target frame rate before performing the transition. This has made transitions smoother and free from audio defects.
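To see why mixed frame rates complicate audio, consider how many audio samples accompany each video frame at a typical 48 kHz audio rate. A sketch of the arithmetic (not SDK code):

```python
AUDIO_RATE = 48000  # audio samples per second, typical for broadcast

def samples_per_frame(fps: float) -> float:
    """Audio samples that accompany one video frame at the given rate."""
    return AUDIO_RATE / fps

# A 25 FPS clip carries 1920 samples per frame, a 30 FPS clip only 1600.
# Mixing the two mid-transition without converting first leaves a
# 320-sample mismatch at every frame boundary, audible as clicks.
print(samples_per_frame(25))  # 1920.0
print(samples_per_frame(30))  # 1600.0
```

Converting the whole stream to the target frame rate before the transition removes this mismatch.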
Extended statistics for application monitoring
To fine-tune our customers' applications, we often need to know exactly what is happening at each stage of the pipeline. If there are drops on the output, do they actually occur in the renderer or at the source? See this post on advanced statistics for more information.
Shared runtime library: Medialooks.Runtime.dll
To make sure that the two frameworks (MPlatform and MFormats) can safely co-exist on one PC, we've moved shared classes and interfaces to "Medialooks.Runtime.dll". See this post for more information.
New "output.time_sync" option and stability improvements in Blackmagic renderer
The new "output.time_sync" parameter enables special logic that corrects the stream time according to the original clock in case of frame drops (situations when the renderer has to wait for a frame that didn't arrive on time). This is useful in scenarios such as two playlists being rendered in sync.
Also, we have noticed that sometimes Blackmagic's native clock, which is used for stream rendering, has occasional gaps in its "ticking". This leads to visible drops in the card's output. MPlatform now tracks the card's clock and corrects it if gaps occur.
For monitoring these situations, see the post about advanced statistics.
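A rough illustration of the kind of correction "output.time_sync" describes, with hypothetical names and numbers (the SDK's actual logic is internal): without correction, the timeline only advances with frames that were actually delivered, so every drop makes the output lag the original clock.

```python
def stream_times_ms(frames_seen: int, frame_ms: int, dropped: int):
    """Stream time in milliseconds with and without clock correction."""
    # Uncorrected: time advances only with delivered frames,
    # so each drop makes the stream fall behind the original clock.
    uncorrected = (frames_seen - dropped) * frame_ms
    # time_sync-style correction: dropped frames still advance the
    # timeline, keeping parallel outputs (e.g. two playlists) aligned.
    corrected = frames_seen * frame_ms
    return uncorrected, corrected

# After 100 frames of 40 ms each (25 FPS) with 3 drops:
print(stream_times_ms(100, 40, 3))  # (3880, 4000)
```

The 120 ms gap between the two values is exactly the drift that would accumulate between two playlists if only one of them dropped frames.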
New "no_signal.thread" parameter for MLive object (enabled by default)
Before this update, the MLive object sent "no signal" frames once per second in case of signal loss. With "no_signal.thread=true" it sends "no signal" frames at the same frame rate as the original stream. This makes the behaviour of downstream objects (such as MMixer or MPlaylist) much more stable.
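The difference in cadence can be sketched as follows (hypothetical numbers; the actual substitute frames are generated inside MLive):

```python
def no_signal_frames(outage_seconds: int, stream_fps: int,
                     threaded: bool) -> int:
    """How many "no signal" frames downstream objects receive
    during an outage of the given length."""
    if threaded:
        # no_signal.thread=true: substitute frames arrive at the
        # stream's own frame rate, so the cadence never changes.
        return outage_seconds * stream_fps
    # Old behaviour: one substitute frame per second.
    return outage_seconds

# A 2-second outage on a 25 FPS stream:
print(no_signal_frames(2, 25, threaded=False))  # 2
print(no_signal_frames(2, 25, threaded=True))   # 50
```

Downstream objects such as MMixer or MPlaylist expect frames at the stream's frame rate, which is why the steady cadence of the new mode is more stable than a one-frame-per-second trickle.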
Receiving line 21 closed captions
MPlatform SDK can now decode line 21 closed captions. You can use the MCCDisplay plugin to display these captions on screen, capture them to .SCC files, or output them to an external device as line 21 (VBI), CEA-608, or CEA-608 wrapped into CEA-708. The Closed Captions library is required.
You will find more information on the Release notes page.