I’m excited to announce the availability of Media SDK v1.1. Version 1.0 introduced Microsoft’s Media Foundation Transform (MFT) framework for multimedia processing on AMD platforms, and version 1.1 introduces the AMD Media Framework (AMF) library for video encoding and decoding. There are more details in the Beta Release announcement.
Media SDK 1.1 provides basic samples that show how to assemble a complete video pipeline. It also includes several advanced samples that illustrate how to employ these elements in real-world applications.
Basic Media SDK Samples
The release offers basic implementations for encoding, decoding, video conversion (color conversion and scaling), transcoding and identifying platform capabilities. They allow you to construct real-world applications that harness the hardware acceleration of AMD’s unified video decoder (UVD), video coding engine (VCE) and graphics processing unit (GPU).
Advanced Media SDK Samples
Media SDK 1.1 includes the following advanced samples:
- D3D multi-encoder: Demonstrates how to build and execute multiple encoding sessions using Direct3D (D3D) surfaces. It’s useful for applications such as cloud gaming, also called gaming on demand, where remote servers execute and render a game before encoding the video results and streaming them to the player’s computer over the Internet.
- Multi-batch transcoder: Shows you how to build and execute multiple batch transcoding sessions for H.264 elementary streams. This capability benefits applications such as YouTube that must resize the same content for multiple resolutions.
- OpenCL™ video filter: Illustrates how to build and execute a video-transcoding application using OpenCL video filtering. It also demonstrates interoperability between DirectX® (DX) and OpenCL.
- Video surveillance: Shows how to build and execute a video-surveillance application. The example decodes and resizes outputs and then creates a 1080p video frame using OpenCL.
The release also includes an update:
- Pipeline playback: Adds support for OpenGL’s presenter memory type.
Multiple Devices and Multithreading
For samples that include multiple processing sessions (such as D3D multi-encoder), the sessions can be distributed to separate devices. For example, if an APU and a discrete GPU (dGPU) are available, two sessions can work in parallel, one on the APU and one on the dGPU. Similarly, if the hardware configuration is CPU + dGPU + dGPU, two sessions can work in parallel (one on each dGPU). This approach delivers better performance as well as optimal device utilization.
In addition, all samples include multithreaded implementations to improve performance.
Both the basic and advanced samples output a set of useful performance parameters, such as:
- Latency: Time to process the first input to yield the first output.
- Average performance time: Average time to process a single input.
- Average file write time: Time to write a single output to a file when the file-write option is enabled.
The AMD AMF Reference Manual features updated definitions of APIs and structures in the AMF library.
All AMD Radeon™ HD 7xxx Graphics and higher platforms support Media SDK 1.1, as do AMD Radeon R9 285 Graphics. To use the new release, download and install the latest AMD Catalyst™ Omega driver. Then download and install the Media SDK. Dive into the examples in the SDK and have fun. Happy coding!
We eagerly await your feedback on the Media SDK forum.
Amit Agarwal is a technical staff member of the Heterogeneous Application Solutions team at AMD. His postings are his own opinions and may not represent AMD’s positions, strategies or opinions. Links to third party sites, and references to third party trademarks, are provided for convenience and illustrative purposes only. Unless explicitly stated, AMD is not responsible for the contents of such links, and no third party endorsement of AMD or any of its products is implied.
Editor’s note: Comments are closed on this blog post; however, we do want to hear from you. Please leave your comments, questions and discussions in our Media SDK forum.