I've never used Twitch, but I love the engineering challenges with streaming video and have worked with it quite a bit in the past. I briefly glanced through the Twitch app to decipher what is going on.

The backend has a video streaming server where the content gets pushed via client-side streaming software. If the content will be made available later, it is most likely using TCP. Now, the server-side software can be configured to serve content directly (most likely Twitch isn't doing that). I think there's a proxy server that the client connects to. On the client, a web-socket connection is initiated and upgraded to TCP or downgraded to HTTP depending on what the network and browser support. I'm not sure if they intended to use it for streaming or other notifications. I noticed that there's a pub-sub module set up, which means they could also push updates from the server; that would be efficient with thousands of clients.

Twitch uses a service worker in the background. This service-worker thread fetches aggregated video frames sequentially over HTTP and caches them locally. If you pause a live feed and resume it, the client will drop expired packets and continue requesting data from the current time. When I tried this through Insomnia, I noticed that they still serve older packets, which means the feed is being stored - although the player didn't allow me to rewind a live feed. I didn't get a chance to save these packets, but it should be possible to decipher the content and play it through another video player. This is a common practice in most web applications, as it is not only easy to implement but also helps you get through firewalls and works via SSL.

Most of what we use is HTTP (which is on TCP). The client downloads the video in chunks (maybe 5-10 seconds? not sure) over TCP, and the client-side (JavaScript) player tries to keep a couple of chunks ahead of where the user is. If the player can't download fast enough to keep ahead of the user, it switches to a lower-resolution (smaller) chunk on the next request. Using HTTP, each chunk has a different URL (usually through query parameters), so the chunks can be cached. I can host the videos in a central location and then have a local caching proxy that will serve the video chunks it has seen recently without using internet bandwidth, for example - or do the same in the cloud to scale internet video load. UDP is seemingly more appropriate for live / multicast video. This SO question on the same topic has a pretty good accepted answer.

Edit: I don't specifically know what strategy Twitch is using, but I could see a use case for HTTP (TCP) live streaming. It is much easier to support with existing tools and through browsers. With UDP you would need to develop your own protocol on top of it (and maybe a client) if one of the existing ones (mostly for teleconferencing?) did not quite fit your use case.

Edit2: Currently you cannot make UDP connections through the browser, period. So using UDP, the user would have to download and install a custom app to watch streams, and Twitch probably didn't want to require that of users. The closest available browser-based capability is WebRTC, but it is not widely supported as yet.

Mostly HTTP (and thus TCP), due to the limitations of browsers etc., but some could use WebRTC and other technologies to improve performance now.

First, video. Video works by sending a "key frame", followed by a bunch of delta frames. If you know the basic parameters of the video (size, colour depth, etc.) you can start playing at any key frame - but if you lose the key frame, you have to wait until you get the next one.

TCP operates by creating a "virtual circuit": a continuous data stream containing a sequence of bytes, where each byte won't arrive until the previous ones have done so. So if I send a sequence of 20 packets and you miss the first one, the other 19 will be buffered at your end while your computer asks for the missing one. Great for file transfer, less good for video - but on a lightly loaded network you're unlikely to lose any packets at all these days, so the downside isn't too bad. But the longer the stream of video, and the larger your latency (or the smaller the bandwidth, or both), the more a lost packet will mess things up for you.

HTTP packetizes data anyway, though, so you can frame your video segments into HTTP responses, and then if one is lost or slow you can discard it if you're falling behind. Conveniently, web browsers speak HTTP, so it's highly interoperable too. There's a minor downside: the increased latency of HTTP means you need to get a few segments in the pipe before you can smoothly play through them all.
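The resolution-switching behaviour described above can be sketched as a buffer-driven decision: step down the bitrate ladder when the buffer runs low, step back up when it is comfortably full. This is only an illustration of the idea - the function name, the bitrate ladder, and the buffer targets are all made up here, and real players also factor in measured throughput.

```python
def next_request(buffered_seconds, bitrates_kbps, current_kbps, target=15):
    """Pick the bitrate for the next chunk request (toy heuristic).

    Step down when the buffer is running low, step back up when it is
    comfortably above the target. Illustrative only; real adaptive players
    combine buffer level with throughput estimates.
    """
    idx = bitrates_kbps.index(current_kbps)
    if buffered_seconds < target / 2 and idx > 0:
        return bitrates_kbps[idx - 1]   # falling behind: fetch smaller chunks
    if buffered_seconds > target and idx < len(bitrates_kbps) - 1:
        return bitrates_kbps[idx + 1]   # healthy buffer: try higher quality
    return current_kbps                 # otherwise hold steady

ladder = [400, 1200, 3000, 6000]        # hypothetical bitrate ladder (kbps)
print(next_request(4, ladder, 3000))    # low buffer -> 1200
print(next_request(20, ladder, 1200))   # full buffer -> 3000
```

The key property is that the decision is made per chunk, so each HTTP request is independent and the server needs no per-client state.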
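The point about per-chunk URLs making caching trivial can be shown with a toy sketch: give every chunk a distinct URL via query parameters, and a proxy only has to key its cache on the URL. The URL scheme and class names below are invented for illustration.

```python
from urllib.parse import urlencode

def chunk_url(base, stream_id, seq, kbps):
    """Build a distinct, cacheable URL per chunk; the sequence number and
    bitrate ride along as query parameters (hypothetical scheme)."""
    return f"{base}?{urlencode({'stream': stream_id, 'seq': seq, 'rate': kbps})}"

class CachingProxy:
    """Toy caching proxy: serve a chunk from the local cache when its URL
    has been seen before, otherwise fetch it from the origin exactly once."""
    def __init__(self, origin_fetch):
        self.origin_fetch = origin_fetch
        self.cache = {}
        self.origin_hits = 0

    def get(self, url):
        if url not in self.cache:
            self.origin_hits += 1
            self.cache[url] = self.origin_fetch(url)
        return self.cache[url]

proxy = CachingProxy(lambda url: f"<bytes for {url}>")
u = chunk_url("https://cdn.example/v.ts", "abc", 42, 3000)
proxy.get(u)
proxy.get(u)
print(proxy.origin_hits)  # 1 - the second request never touches the origin
```

Because the chunks are plain HTTP responses, any off-the-shelf cache or CDN gets this behaviour for free - which is exactly the scaling argument made above.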
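The key-frame/delta-frame point can be made concrete with a small sketch: playback can only (re)start at a key frame, so losing one means waiting for the next. The frame tags and function name here are invented for illustration.

```python
def first_playable(frames, start):
    """Return the index of the first key frame at or after `start` - the
    earliest point playback can begin. Delta frames before it are useless
    on their own. Frames are tagged 'K' (key) or 'D' (delta)."""
    for i in range(start, len(frames)):
        if frames[i] == 'K':
            return i
    return None

gop = ['K', 'D', 'D', 'D', 'K', 'D', 'D']   # toy group-of-pictures pattern
print(first_playable(gop, 0))  # 0: we land on a key frame and can start at once
print(first_playable(gop, 1))  # 4: missed the key frame, must wait for the next
```

This is why segment boundaries in HTTP streaming are normally aligned to key frames: each segment is then independently decodable.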
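The 20-packet example above is head-of-line blocking, and it is easy to simulate: model TCP's in-order delivery and watch 19 on-time packets sit in the buffer until the one lost packet is retransmitted. A minimal sketch (function name is mine):

```python
def deliver_in_order(arrivals):
    """Simulate TCP-style in-order delivery: a packet reaches the
    application only after every lower-numbered packet has arrived."""
    received, delivered = set(), []
    next_seq = 0
    for seq in arrivals:
        received.add(seq)
        # Release everything that has now become contiguous.
        while next_seq in received:
            delivered.append(next_seq)
            next_seq += 1
    return delivered

# Packets 1-19 arrive on time; packet 0 was lost and retransmitted last.
# Nothing is delivered until 0 finally shows up, then everything at once.
print(deliver_in_order(list(range(1, 20)) + [0]))  # [0, 1, ..., 19]
print(deliver_in_order(list(range(1, 20))))        # [] - still blocked
```

For a live stream this burst arrives all at once and late, which is exactly why the answer above prefers discardable HTTP segments over waiting on a retransmission.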