About Video Streaming

Hello good people! In this blog we will look at the different aspects of video streaming and how it can be achieved. With the growing influence of social media and the increasing number of mobile phone users, any media platform will eventually feel the need for media streaming. However, there is no single definitive way to achieve it. In this tutorial I will not be talking about Flash or other third-party plugins; we will discuss how you can achieve video streaming on your own. This blog is not specific to any platform, and we will only discuss the different ways to create your own video streaming platform.

Video codec

The video codec is a very important topic in video streaming. Some popular video codecs are VP8, VP9 and H264. So what does a video codec do? Let's say you have a video with 15 frames per second. Now suppose each pixel carries 3 bytes of color information, one byte each for red, green and blue. Then a single 640x480 frame becomes 640 * 480 * 3 = 921,600 bytes, i.e. about 900 KB. For 15 frames the size becomes roughly 13.2 MB. If you look carefully, that means one second of raw video is about 13.2 MB. This is insanely large.
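To make the arithmetic concrete, here is a tiny TypeScript sketch that reproduces the numbers above (the frame size and frame rate are just the example values from this paragraph):

```typescript
// Back-of-the-envelope size of raw (uncompressed) RGB video.
const width = 640;
const height = 480;
const bytesPerPixel = 3; // one byte each for red, green and blue
const fps = 15;

const frameBytes = width * height * bytesPerPixel; // 921,600 bytes
const oneSecondBytes = frameBytes * fps;           // 13,824,000 bytes

console.log(`${(frameBytes / 1024).toFixed(0)} KB per frame`);               // ~900 KB
console.log(`${(oneSecondBytes / (1024 * 1024)).toFixed(1)} MB per second`); // ~13.2 MB
```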

In order to solve this problem we need a mechanism to reduce the video size, and this is done with the help of codecs. VP8 and VP9 are free, open source codecs. H264, on the other hand, is a proprietary codec. H264 is widely used and is generally found to be more effective than VP8 and VP9. To my knowledge H264 is free to use for open source projects. For proprietary applications I think the charge is based on the number of users, but I am not sure about it.

So when you are preparing for video streaming, you need to decide which codec to use. VP8 and VP9 are also widely used. If your application needs very effective compression, VP8, VP9 and H264 are all reasonable choices. I have tried all of them and have found H264 more efficient than VP8 and VP9, but for general cases any of them will fit your needs.

Type of media

There are different media options in video streaming. If you are planning to stream with the help of image files, then you may need to compress each of these images and stream them frame by frame. If you are planning to stream video content, then the video might be split into different fragments and streamed. One example of segment-based streaming is playing HTTP video. When playing HTTP video, the media player (or browser) sends a request with a Range header asking for a range of bytes. The server reads that range of bytes from the video file and answers with a 206 Partial Content response containing the requested data. This data is then decoded by the media player and displayed to the user.
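As a rough illustration, here is a minimal sketch of how a server might answer such range requests with 206 Partial Content, written in TypeScript for Node.js. The file name video.mp4 and the port are placeholders, and error handling is left out for brevity:

```typescript
import * as fs from "fs";
import * as http from "http";

const FILE = "video.mp4"; // placeholder file name

http.createServer((req, res) => {
  const { size } = fs.statSync(FILE);
  const range = req.headers.range; // e.g. "bytes=0-" or "bytes=1000-2000"

  if (!range) {
    // No Range header: just send the whole file.
    res.writeHead(200, { "Content-Type": "video/mp4", "Content-Length": size });
    fs.createReadStream(FILE).pipe(res);
    return;
  }

  // Parse "bytes=start-end"; the end may be omitted by the player.
  const [startStr, endStr] = range.replace("bytes=", "").split("-");
  const start = parseInt(startStr, 10);
  const end = endStr ? parseInt(endStr, 10) : size - 1;

  res.writeHead(206, {
    "Content-Type": "video/mp4",
    "Content-Range": `bytes ${start}-${end}/${size}`,
    "Accept-Ranges": "bytes",
    "Content-Length": end - start + 1,
  });
  // createReadStream treats start/end as inclusive byte offsets.
  fs.createReadStream(FILE, { start, end }).pipe(res);
}).listen(8080);
```

Pointing a video element's src at a server like this is usually enough for the browser to start issuing the range requests on its own.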

We have also seen a single video being split into separate files and streamed by sending these files one by one. Many adaptive streaming setups use this technique to stream the video content to the user based on the available network bandwidth.

Protocol 

If you are streaming video, you need something to display the media, like the video element in the DOM or a media player such as VLC. These players need to communicate with the video server to display the video, and this communication is done using predefined protocols. Some popular protocols are HTTP, RTSP, HLS, WebRTC and so on.

Which protocol to choose?

The choice of protocol may vary. If you are streaming static content, then HTTP video streaming can be a good choice. If you need realtime streaming, then RTSP, WebRTC or HLS can be a good choice. RTSP and HLS are not natively supported in all browsers, so you may need WebRTC for that purpose. If you are programming for Android, iOS or another native platform, then you are free to choose any of these protocols.

WebRTC can be good for many reasons, as it is supported on almost all platforms and it addresses many of the hard problems of video streaming. However, for things like live broadcasts to a large audience or group video chats with many participants, plain WebRTC may not be a good choice for you.

Video streaming is not protocol dependent. The only thing you need to do is feed the media player or media renderer content it understands. Let's say I want to broadcast my webcam to 10,000 users in a browser. What I can do here is grab the media stream in the browser, convert the recorded stream data to an ArrayBuffer and send it over a WebSocket. The same stream can be received on the other end and played back, for example as a video blob.
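Below is a rough TypeScript sketch of that idea, assuming the browser can record WebM/VP8: MediaRecorder produces chunks on the sender, and Media Source Extensions play them back on the receiver. The WebSocket URLs are placeholders, and in a real broadcast a server would still have to fan the chunks out to all viewers:

```typescript
// Sender: capture the webcam and push encoded chunks over a WebSocket.
async function startBroadcast(): Promise<void> {
  const ws = new WebSocket("wss://example.com/broadcast"); // placeholder URL
  ws.binaryType = "arraybuffer";

  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: false });
  const recorder = new MediaRecorder(stream, { mimeType: 'video/webm; codecs="vp8"' });

  recorder.ondataavailable = async (event) => {
    if (event.data.size > 0 && ws.readyState === WebSocket.OPEN) {
      ws.send(await event.data.arrayBuffer()); // Blob -> ArrayBuffer -> socket
    }
  };
  recorder.start(250); // emit a chunk roughly every 250 ms
}

// Receiver: append the incoming chunks to a MediaSource attached to a <video> element.
function startPlayback(video: HTMLVideoElement): void {
  const mediaSource = new MediaSource();
  video.src = URL.createObjectURL(mediaSource);

  mediaSource.addEventListener("sourceopen", () => {
    const buffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8"');
    const ws = new WebSocket("wss://example.com/watch"); // placeholder URL
    ws.binaryType = "arraybuffer";
    ws.onmessage = (msg) => {
      // A real player would queue chunks while the buffer is still updating.
      if (!buffer.updating) buffer.appendBuffer(msg.data as ArrayBuffer);
    };
  });
}
```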

Another example of this is streaming the camera feed from Android. Android's MediaRecorder API does not fill in the NAL unit headers until the recording is finished. For this reason some developers fill in these headers themselves by analyzing the headers of a previously recorded demo video. After you do so, you can stream the data using an appropriate protocol, since the video stream now has a format the player understands.

Means of communication

You may need to stream data via TCP, UDP or maybe WebSocket. All of them have their own use cases. WebSocket is necessary if you stream data to the browser, since browsers cannot open raw sockets. WebRTC, on the other hand, can use both TCP and UDP for streaming. TCP is a connection-oriented protocol and guarantees that the data will reach its destination. UDP, on the other hand, does not guarantee that the data will reach the receiving end. Because of this, UDP has less overhead and is somewhat faster than TCP, and it is used in many video streaming services, since we can often ignore a few pieces of data that never reach the client.
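For a feel of the difference, here is a tiny Node.js TypeScript sketch that sends one chunk of data both ways; the host, ports and chunk are made up for illustration:

```typescript
import * as dgram from "dgram";
import * as net from "net";

const chunk = Buffer.alloc(1200); // pretend this is one packet's worth of video data

// UDP: fire-and-forget, no connection setup and no delivery guarantee.
const udp = dgram.createSocket("udp4");
udp.send(chunk, 5000, "127.0.0.1", () => udp.close());

// TCP: connection-oriented, ordered and reliable, at the cost of extra latency.
const tcp = net.createConnection({ host: "127.0.0.1", port: 5001 }, () => {
  tcp.write(chunk);
  tcp.end();
});
tcp.on("error", (err) => console.error("TCP send failed:", err.message));
```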

Adaptive Streaming

We may need to change the video quality based on the bandwidth conditions. Adaptive streaming helps send the user video content that matches the user's bandwidth. One example of adaptive streaming is YouTube. When you watch a YouTube video, you may have noticed the video quality change when your internet becomes slow. This type of streaming can be achieved in different ways; it depends on the type of application, and the implementation of adaptive streaming varies from application to application.

Let's say you are making a live video chat application. Then you may notify the recorder to reduce the recording bitrate and video quality by whatever means are available, or you may restrict the bandwidth at the source and send data to the slow device at its own pace. On the other hand, if you are adding adaptive streaming for static content, then what you can do is create several versions of the same video file, each with a different quality, and then stream the appropriate file based on the user's bandwidth.
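As a sketch of that second approach, the TypeScript snippet below (for the browser) probes the connection with a small download and then points the video element at one of several pre-encoded renditions; the file names, the probe file and the bitrate thresholds are all made-up placeholders:

```typescript
// Pre-encoded versions of the same video, from highest to lowest quality.
const renditions = [
  { minKbps: 4000, url: "video_1080p.mp4" },
  { minKbps: 1500, url: "video_720p.mp4" },
  { minKbps: 0,    url: "video_360p.mp4" },
];

// Estimate throughput by timing the download of a small probe file.
async function measureKbps(probeUrl: string): Promise<number> {
  const started = performance.now();
  const response = await fetch(probeUrl, { cache: "no-store" });
  const bytes = (await response.arrayBuffer()).byteLength;
  const seconds = (performance.now() - started) / 1000;
  return (bytes * 8) / 1000 / seconds;
}

// Switch the <video> element to the rendition that matches the measured bandwidth.
async function pickSource(video: HTMLVideoElement): Promise<void> {
  const kbps = await measureKbps("probe.bin"); // placeholder probe file
  const choice = renditions.find((r) => kbps >= r.minKbps)!;
  if (!video.currentSrc.endsWith(choice.url)) {
    const resumeAt = video.currentTime;
    video.src = choice.url;
    video.addEventListener("loadedmetadata", () => {
      video.currentTime = resumeAt; // resume roughly where we left off
      video.play();
    }, { once: true });
  }
}
```

A real adaptive player would repeat this measurement while the video plays and switch at segment boundaries, which is essentially what HLS and MPEG-DASH clients do for you.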

Final Thoughts

The different points I have mentioned might not always be the best options, so let me know if there is any room for improvement. I have always been interested in media streaming, and working with the cool people who make these things happen is amazing. I will also come up with a new blog post with an example of live webcam streaming in the browser with an adaptive streaming feature, without WebRTC. Cheers!!

Feature Image Credit:

Photo by Sam McGhee on Unsplash