From the On-Site Screen to the Audience's Screen: A Complete Breakdown of the Live Streaming Technical Workflow
Deep Dive into Live Streaming: Technical Workflow Uncovered

2022-08-03

A lot of the livestreaming process happens behind the scenes, seamlessly and automatically. But if you want to ensure your audience has the best viewing experience possible, it helps to know a little more about what each element of the livestream process does. Today, let’s do a deep dive into how livestreaming works, going through the workflow step by step, so you’ll learn everything you need to know to create high-quality live streams for your audiences.

We won’t go into much detail about the production process here, since we’re focusing on the backend and technical aspects of the workflow. The livestream process obviously starts with production, or footage capture; it then moves through encoding, transcoding, ingest, and delivery.

Let’s start with encoding.

1. Livestream encoding

Encoding and transcoding (and also ingest, which we’ll discuss later) generally happen at the same time or overlap each other, but we’ll discuss them separately to make things easier to follow.

1-1 The encoder

So, what is a livestream encoder? Have you ever shot digital video or photos, then had problems viewing the files because they were in the wrong format and/or file size? The encoder helps with that, converting the original raw video footage into an optimal, i.e., more universal, file format that’s viewable on all systems.

EXAMPLE: A company produces a livestream using the newest 4K or 6K equipment. This is great on the production end; however, if they tried to send these files in their original 4K format, they’d find the files are too large to transmit through most providers.

The encoding process compresses the video, using presets that make the footage compatible across all systems and platforms.

How the encoding process works

The encoder transforms the original footage format into something else for viewers’ playback. Encoders can be software- or hardware-based and are powerful enough to do all of this in real time.

  • Dedicated hardware encoders: Some are standalone devices, such as a PCI capture card or another dedicated piece of hardware. Others are built into devices like IP cameras.
  • Software encoders: These can be purchased as standalone products, but they’re also often included in web or mobile-based platforms, like those found on livestreaming or social platforms.

Codecs compress the footage

The encoder uses a codec, a digital or algorithmic compression tool, to convert the files. It’s called a codec because it’s a coder and decoder combined. The encoder also stores metadata such as titles, closed captions, and so on.

  • The encoder compresses the files, and then the transcoder does its job.
  • After that, the encoder comes back into play and re-codes those same files for viewing: when the footage reaches viewers’ screens, the file is expanded again, but in a different format.

The file that viewers see is in a different format than the one it was shot in: one large enough, with enough resolution, for viewers to enjoy. There are different codecs for video and audio files. H.264 (video) and AAC (audio) are the most common codecs for digital video and audio files. However, some other common names you may see include HEVC, QuickTime, and WMV.
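
To make this concrete, here is a minimal sketch of a software encode using the ffmpeg command line driven from Python. It assumes ffmpeg is installed and on the PATH, and the file names and bitrates are illustrative placeholders rather than recommendations.

```python
import subprocess

# A minimal sketch: compress a large source file into an H.264/AAC MP4
# using the ffmpeg CLI. File names and bitrates are placeholders.
def encode_for_streaming(src: str = "master.mov", dst: str = "stream.mp4") -> None:
    subprocess.run(
        [
            "ffmpeg",
            "-i", src,              # original, high-resolution source footage
            "-c:v", "libx264",      # H.264 video codec (the most common choice)
            "-preset", "veryfast",  # speed/efficiency trade-off suited to live work
            "-b:v", "5000k",        # target video bitrate
            "-c:a", "aac",          # AAC audio codec
            "-b:a", "128k",         # target audio bitrate
            dst,                    # compressed output in an MP4 container
        ],
        check=True,
    )

if __name__ == "__main__":
    encode_for_streaming()
```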

Video containers or wrappers

As the codec compresses the original source file to a smaller size, it also outputs the now-compressed file to a different file format, often referred to as a “video container” or “wrapper.” Container and wrapper are the technical names for what we commonly call the “digital format,” “file type,” or “file extension” (MP3 and MP4 are two you should already be familiar with).

1-2 Codecs output footage into different types of wrappers/containers

Think of a container as a template or preset, a standard used by many systems. Some wrappers or containers, like MP4, have solid quality but lower definition than higher-resolution outputs like the .MOV wrapper. However, outputting to a .MOV file, while it offers better resolution, may hog bandwidth on viewers’ systems, causing poor quality and interrupted streams (depending on the length and size of your livestream file, latency issues, and other production elements).

Picking the right container is usually a matter of choosing the best compromise: the highest definition possible without interrupting playback.

Some common audio/video containers include:

  • AVI
  • MP3
  • MP4
  • MOV
  • WMV (a codec and also container type)
  • QuickTime
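
If you’re unsure which container and codecs a given file actually uses, a quick check can help before you decide how to output it. Below is a small sketch using ffprobe (which ships with ffmpeg and is assumed to be installed); the file name is a placeholder.

```python
import json
import subprocess

# A small sketch: inspect a file's container and per-stream codecs with ffprobe.
def inspect_media(path: str = "stream.mp4") -> None:
    result = subprocess.run(
        [
            "ffprobe",
            "-v", "error",
            "-show_format",   # container-level info (format name, duration, size)
            "-show_streams",  # per-stream info (codec, resolution, bitrate)
            "-of", "json",    # machine-readable JSON output
            path,
        ],
        capture_output=True,
        text=True,
        check=True,
    )
    info = json.loads(result.stdout)
    print("Container:", info["format"]["format_name"])
    for stream in info["streams"]:
        print(stream["codec_type"], "codec:", stream["codec_name"])

if __name__ == "__main__":
    inspect_media()
```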

2. Livestream transcoding

Encoding and transcoding are two more terms that are often confused and used interchangeably, probably because the two often happen almost simultaneously, as part of a three-step process:

  • Encoding is the first and last part of the interrelated process: first comes the compression of the original raw video; then, after transcoding, the encoder re-codes the footage into its container or wrapper (MP4, AAC, etc.) and it is delivered to viewers
  • Transcoding happens in the middle; it’s the part of the livestream process where footage files are copied into different sizes or variations for delivery to playback systems

Transcoding process, part 1

Encoding is a process that can occur on its own. For example, say you have footage you need to send to one or two platforms, and you just need to compress the file size, nothing else. In that case, you’re using the encoder like any file converter, going from raw footage to an MP4, etc.

But transcoding always requires encoding; it isn’t a standalone step. And usually, if you’re livestreaming, you’ll need to encode and transcode because of the file sizes involved. Transcoding can’t start until the file has been compressed (encoded) and put into a wrapper/container (like MP4). Transcoding always refers to two combined processes: first, encoding or compression…

And then the rest of the transcoding process, which is creating file variations.

Transcoding process, part 2

So the second part of the transcoding process, after encoding, is to create different versions or variations of the same footage (audio/video files), so that when each viewer hits “play,” their system, whether it’s a simple video player on an old laptop or a newer OTT streaming platform, can offer them the optimal viewing experience.

Transcoding alters files as needed to account for each individual viewer’s system, creating additional outputs or variations of the footage using preset alterations: sometimes making files a little larger or smaller, sometimes adding other minor tweaks and changes. One or more variations must be created to ensure that individuals on different devices, with different signal strengths, speeds, and bitrates, can watch the live stream.
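
As a rough illustration of this step, the sketch below transcodes one source file into three renditions at different resolutions and bitrates, so that players on slower connections can pick a lighter version. The specific ladder (1080p/720p/480p), bitrates, and file names are assumptions, not fixed recommendations.

```python
import subprocess

# A minimal sketch of the "variations" step: one source file, three renditions.
RENDITIONS = [
    ("1080p", "1920x1080", "5000k"),
    ("720p",  "1280x720",  "2800k"),
    ("480p",  "854x480",   "1400k"),
]

def build_renditions(src: str = "stream.mp4") -> None:
    for name, resolution, bitrate in RENDITIONS:
        subprocess.run(
            [
                "ffmpeg",
                "-i", src,
                "-c:v", "libx264",
                "-s", resolution,   # scale to the target resolution
                "-b:v", bitrate,    # target video bitrate for this rendition
                "-c:a", "aac",
                "-b:a", "128k",
                f"{name}.mp4",      # one output file per rendition
            ],
            check=True,
        )

if __name__ == "__main__":
    build_renditions()
```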

3. Streaming protocols and ingest

During the ingest process, the content is sent to an on-site server or to the cloud, where encoding/transcoding occurs. Like several other steps, ingest actually happens before, or around the same time as, encoding/transcoding. But to make it easier to understand, we’ve put it in a standalone section.

Ingest (and other parts of this process) uses a streaming protocol, the set of rules dictating how information is transmitted from its source/origin to an endpoint: exactly how, and with which tools, to get the content from Point A to Point B.

Some well-known HTTP and dedicated streaming protocols include:

  • HLS (HTTP Live Streaming) is one of the most commonly used video protocols; it was developed by Apple (web-based protocol)
  • MPEG-DASH (Dynamic Adaptive Streaming over HTTP) (web-based protocol)
  • RTMP (Real-Time Messaging Protocol): older and not used as often, but it offers some encoding advantages; when used today, it is typically paired with another protocol like HLS to provide playback
  • RTSP (Real-Time Streaming Protocol): another older protocol; it isn’t widely supported for playback, but it remains common for moving video from IP cameras and encoders to servers
  • SRT (Secure Reliable Transport): a newer, open-source technology
  • WebRTC (Web Real-Time Communication) is commonly used by popular online web-conferencing and video platforms, including social media
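
As a concrete example of an HTTP-based protocol in action, here is a minimal sketch that packages a file for HLS delivery with ffmpeg, producing a playlist (.m3u8) plus short media segments that an HLS-compatible player can fetch over plain HTTP. The file names and segment length are placeholders, and the settings shown suit a recorded file rather than a true live event.

```python
import subprocess

# A minimal sketch: package a file as HLS (playlist + segments) with ffmpeg.
def package_for_hls(src: str = "stream.mp4") -> None:
    subprocess.run(
        [
            "ffmpeg",
            "-i", src,
            "-c:v", "libx264",
            "-c:a", "aac",
            "-f", "hls",                    # use the HLS muxer
            "-hls_time", "6",               # target segment duration in seconds
            "-hls_playlist_type", "vod",    # complete, on-demand style playlist
            "playlist.m3u8",                # playlist; segments are written alongside it
        ],
        check=True,
    )

if __name__ == "__main__":
    package_for_hls()
```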

Choosing the right protocol matters

The type of livestreaming protocol used can affect both the quality of the live stream and its latency. Some protocols, like RTMP, offer better quality but are less than ideal for your overall streaming situation, while others, like WebRTC, may be faster but not provide the same quality. Some protocols also offer other features, such as better captioning, digital rights management (DRM), synchronized playback across multiple screens, and more.

FYI: RTSP and RTMP are used by many, primarily because they support low-latency streaming. However, due to the age of these protocols, the two may not work on mobile devices, browsers, computers, or TVs. Instead, they’re primarily used today as one part of the process: moving the video from something like an encoder or an IP camera to its dedicated media server.
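
To illustrate that role, the sketch below pushes an encoded feed to a media server’s RTMP ingest endpoint, which would then repackage it (for example, as HLS) for viewers. The ingest URL and stream key are hypothetical placeholders.

```python
import subprocess

# A minimal sketch: send an encoded feed to an RTMP ingest endpoint with ffmpeg.
# The ingest URL and stream key are placeholders, not a real endpoint.
def push_to_ingest(src: str = "stream.mp4",
                   ingest_url: str = "rtmp://media.example.com/live/STREAM_KEY") -> None:
    subprocess.run(
        [
            "ffmpeg",
            "-re",              # read input at its native frame rate, like a live source
            "-i", src,
            "-c:v", "libx264",
            "-c:a", "aac",
            "-f", "flv",        # RTMP expects an FLV-wrapped stream
            ingest_url,
        ],
        check=True,
    )

if __name__ == "__main__":
    push_to_ingest()
```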

4. Packetizing

Also called rewrapping or transmuxing, packetizing is the part of the process where the compressed file (audio/video) is repackaged into a new format for final delivery. It’s similar to transcoding, but it only occurs when the content itself doesn’t need to change: no compression, metadata, or other elements are added. It’s a straight conversion from one container to another (like QuickTime to MP4).
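
Here is a minimal sketch of what that looks like in practice with ffmpeg: the existing audio and video streams are copied into a new container with no re-encoding, so the content itself is untouched. The file names are placeholders.

```python
import subprocess

# A minimal sketch of packetizing/transmuxing: change the container only,
# copying the streams without re-encoding (no compression or quality change).
def transmux(src: str = "master.mov", dst: str = "repackaged.mp4") -> None:
    subprocess.run(
        [
            "ffmpeg",
            "-i", src,
            "-c", "copy",   # copy all streams as-is into the new container
            dst,
        ],
        check=True,
    )

if __name__ == "__main__":
    transmux()
```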

5. Content delivery or CDN process

The CDN (content delivery network) is how livestream files are sent to viewers’ playback systems. CDNs are large networks of servers distributed worldwide to transport media files. By using a CDN instead of one or two local servers, producers don’t have to worry about a single server failing, bottlenecked signals, or latency problems. Using a CDN also helps with scalability and other potential issues.

6. Playback

The video is now, finally, received by your viewers through different video players on different devices.

Conclusion: Need help with your live stream?

That’s our guide to livestreaming. Hopefully, it helps you better understand the technical process of delivering high-quality content. Have more questions about livestreaming, encoding, or our OTT solutions? Reach out to our BlendVision media professionals today.

