Webinar

How providers of live dealer casino games can achieve real-time streaming without sacrificing video quality and stability for their players

Watch our Insightful Webinar on Live Dealer Casino Games

  • The requirements for video streaming and live dealer casino games.
  • Current approaches used for low-latency video streaming: WebRTC, LL-HLS, LL-DASH, HESP, and WebSockets.
  • Attention points for each approach in terms of: scalability, video quality, content protection, start-up time, and stability across different network conditions.
  • How to choose the right approach for your use case.
 

Webinar transcript

Explore the discussion on real-time streaming of live dealer casino games between Steven Tielemans, Pieter-Jan Speelmans, and Bart Snoeks.

Introduction

Steven: Hello, hello, welcome everyone! My name is Steven. I'm the CEO of THEO Technologies and I will be your host today for this webinar.

So, in today's webinar, we will cover how providers of live dealer casino games can achieve real-time streaming. And this, and that's an important one, without sacrificing video quality or stability for their viewers. And for this webinar, I have invited two colleagues to talk about their experience in this field. First of all, we have Pieter-Jan, or PJ. Pieter-Jan is CTO at THEO Technologies, a deep technical expert on low latency video streaming technologies, and I can say one of the drivers of some of the latest innovations in this field in the industry over the last years. Welcome, PJ. As always, I'm sure that a lot of our viewers today will be curious to hear your technical insights and best practices.

Pieter-Jan: Happy to be here. So hello, everyone.

Steven: And then we also have Bart Snoeks. Bart is an account director at THEO. He has worked closely with providers of live dealer casino games, both large and small, and he will share his experience and learnings from working with them. So welcome, Bart, also looking forward to hearing your insights.

Bart: Welcome everyone to this webinar, looking forward to it.

Steven: So, before we dive into the actual topic, let me start with a little bit of background about THEO Technologies and also why we are talking today about streaming for iGaming, or live dealer casino games. So, our team at THEO, we have been supporting media and entertainment companies all over the world for more than 10 years now. We come from that part of the industry where some of our customers are really spending hundreds of millions of dollars on the rights to stream content to their viewers, such as the largest sports events in the world. And when you pay so much money, you want, of course, to do that with the best possible experience for your viewers. So user experience is key for these companies. And there is an evolution ongoing where these providers want to stream their content at lower latency. So no longer at 8 to 15 seconds, which was for a very long time the industry standard, but going down now to the one to five seconds kind of range. So video playback quality and low latency, that is what we help them with.

And then there is the iGaming industry, which is a little bit different, and that makes it very interesting as well. So real-time streaming is not new for the iGaming industry. In fact, it has been done already for multiple years. So you can wonder then, why should I care? Well, what you see is that to achieve these really low latencies, and I'm talking then about sub-second latencies, trade-offs were in a lot of cases made in the approaches used for video delivery: trade-offs in terms of video quality, trade-offs in terms of stability. But viewers, players in the case of iGaming, are today becoming more demanding than ever. And today everyone, including you, including me, everyone expects the same high quality online video experience as they are used to getting from the Netflix-like companies of this world. So on one side, you have the media and entertainment industry, such as broadcast streaming, traditionally focusing on high quality viewer experience, where they are now focusing on bringing the latency down while still maintaining that premium video experience. And on the other side, you have the iGaming industry, traditionally focusing on low latency video streaming, because it's often a requirement for their games, where they are now focusing on bringing the online viewer experience of their players to the same level as the well-known OTT streaming services. When high quality viewer experience and low latency video delivery come together, that's our sweet spot of expertise.

Before we dive into the topic, some practical things. Via the controls at the bottom, you can ask questions. Please do so during the webinar; we will try to cover your questions at the end. And yes, in case you are wondering, we will send you the slides and recording of this webinar via email after the session. So the agenda for today: three main sections. We will talk about the requirements and challenges that we see, then about the different technical approaches used today for real-time streaming, and you will see that each has its pros and cons, and then about how to select the best approach for your use case. Note that during this webinar, we will specifically focus on the use cases of providers of live dealer casino games. But no worries if you're active in sports streaming, betting, or any other form of interactive entertainment. A lot of what we discuss here is actually also applicable in the same way to other use cases.

Before we really dive into it, it's poll time. For us to get a good idea of the audience today, we'd like to understand what your experience with low latency live streaming is already. So you should normally see the poll on your screen now. We will give you a couple of seconds to enter your answer. Are you working with a third-party vendor for WebRTC or WebSocket solutions, or have you built your own solution in-house? Do you use low latency HLS or DASH with latencies of two to five seconds? Do you use traditional HLS or DASH with higher latencies? Or are you not streaming video at all but planning to do it, or are you just here to learn about the latest technical evolutions? Let's wait a couple of seconds. Interesting. So I think we have a mix of people here: some people with experience with WebRTC and WebSocket, some with HLS and MPEG-DASH, and some are just here to learn as well, which is also fine.

So, okay, enough introduction. I would say let's dive into it.

Navigating Latency and Quality Challenges in iGaming Live Streaming

Steven: Bart, you have worked together over the last year with multiple providers in the iGaming industry. So, what are your learnings, what are your insights on their requirements and challenges today?

Bart: Yeah, indeed, Steven. There are actually quite a lot of requirements when streaming video for live casino, but one of the most important, of course, is latency. So, when we talk about latency, just to be aligned, we always measure end-to-end latency, meaning from the camera ingest in the studio, over the encoding and packaging, and using worldwide delivery mechanisms, up to the devices where the video is shown, usually a wide variety of devices these days. If we look at it from a mathematical perspective, the math of live casino environments is actually very simple. The lower the video streaming latency, the more rounds you can play. Your operational cost for the studio, the host, and technical people is incurred anyway. So more rounds played per game simply means more revenue. The live dealer casino catalog consists of games with low, medium, and high latency requirements. If we look a bit more in detail, card games, for example, are really player decision based. They can clearly benefit from lower latency. Roulette is easy, it can go with a higher latency, but by having a lower latency, you can keep the betting window open longer, and hence accept more bets and make more money. The vertical wheel, often known as Wheel of Fortune, is typically one of the most popular games in terms of concurrency. It doesn't really require ultra-low latency, but instead it needs to scale very well, due to the quite high number of simultaneous viewers. But we'll come back to that later, how to do that.
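Bart's point that lower latency means more rounds, and hence more revenue, can be sketched with a back-of-the-envelope model. All numbers below (round phases, revenue per round) are illustrative assumptions, not figures from the webinar.

```typescript
// A round consists of a fixed game phase (dealing, spinning, payout)
// plus a betting window; the stream latency is dead time added to every
// round, because players only see the result after the video arrives.
function estimateRoundsPerHour(
  latencySeconds: number,
  gamePhaseSeconds: number = 30,
  bettingWindowSeconds: number = 15
): number {
  const roundSeconds = gamePhaseSeconds + bettingWindowSeconds + latencySeconds;
  return Math.floor(3600 / roundSeconds);
}

// Extra revenue per hour from shaving latency, given an assumed
// (hypothetical) revenue per round.
function extraRevenuePerHour(
  latencyBefore: number,
  latencyAfter: number,
  revenuePerRound: number
): number {
  const gained =
    estimateRoundsPerHour(latencyAfter) - estimateRoundsPerHour(latencyBefore);
  return gained * revenuePerRound;
}
```

Under these assumptions, moving from an 8-second to a sub-second stream adds a handful of extra rounds every hour, with the studio cost staying flat.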

Typically, most of the real-time technologies target sub-second latency as the most optimal solution today for live casino video streaming. And a last important point to consider when we talk about latency is where your viewers are actually located. If you are in Brazil, in India, or the Philippines, network infrastructure is typically not as stable as in Europe, and to compensate for this with certain technologies, we see that some of the casino providers use a higher-latency approach to still offer an acceptable viewer experience.

That actually brings us to the next requirement, which is quality. You all invest, people here in this webinar, in nice studio infrastructure, so you also want the good quality video that you produce to be distributed in the best way possible. And that's not really the case today. Also, on top of that, your viewers become more and more demanding, as their expectations are set by other streaming platforms, as Steven mentioned. For example, Twitch streams in full HD at 6 megabits, YouTube already at 12 megabits today. At THEO, we are involved with a lot of high-quality sports streaming, like the Olympics, like the World Cup football today. So, we know that everyone at home is looking at better video quality today compared to, for example, a year ago.

Enhancing User Experience and Ensuring Quality Delivery in Live Streaming

Bart: For us, if we look at this, there are a few important factors that actually determine the user's quality of experience. First, there's, of course, the resolution. You want to deliver in the most optimized resolution for the viewer's screen. That's clear. There's no need to show full HD on a smartphone, but it makes a lot of sense on an iPad Pro or even a Smart TV. The bitrate goes hand-in-hand with the resolution, and together they form the essentials for delivering adaptive bitrate: multiple profiles of the same video, each of those profiles optimized for the device capabilities at any time. Then the frame rate is often underestimated in live casino. You can produce in your studio today at 50 or 60 frames per second, but you don't distribute it that way. Why not, I would say.
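The interplay of resolution, bitrate, and frame rate that Bart describes is what an ABR ladder encodes. A minimal sketch, with an illustrative ladder (the rungs and the safety factor are assumptions, not a recommended encoding profile):

```typescript
interface Rendition {
  width: number;
  height: number;
  bitrateKbps: number;
  fps: number;
}

// Hypothetical ladder: multiple profiles of the same video,
// ordered from highest to lowest quality.
const ladder: Rendition[] = [
  { width: 1920, height: 1080, bitrateKbps: 3500, fps: 50 },
  { width: 1280, height: 720, bitrateKbps: 1800, fps: 50 },
  { width: 854, height: 480, bitrateKbps: 900, fps: 25 },
  { width: 640, height: 360, bitrateKbps: 500, fps: 25 },
];

// Pick the highest-quality rendition that fits both the viewport and the
// measured bandwidth, keeping a safety margin for network jitter.
function pickRendition(
  viewportWidth: number,
  bandwidthKbps: number,
  safetyFactor: number = 0.8
): Rendition {
  const usable = bandwidthKbps * safetyFactor;
  const candidates = ladder.filter(
    (r) => r.width <= viewportWidth * 1.5 && r.bitrateKbps <= usable
  );
  // Fall back to the lowest rung if nothing fits.
  return candidates[0] ?? ladder[ladder.length - 1];
}
```

A real player re-evaluates this choice continuously as bandwidth changes; the point is that without multiple profiles there is simply nothing to switch to.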

And then, startup time is the last important one. Why would you wait seconds before the video starts? Users could be gone. So fast startup times can also give you additional benefits, for example when switching between landscape and portrait mode video, which is especially important on mobile devices. When you can handle this at the same moment the user rotates their device, it becomes a really nice experience.

ABR has always been very important to deliver optimal viewing experiences in any network condition. It's actually made for that. Still, in live casino it's not very common today, as some of the low latency technologies don't support it properly.

For us, the conclusion here on quality is that you want crisp and smooth pictures. And even more important: streaming video will continue to improve in general, so as a live casino provider you need to follow as well. You should not just be happy with what you achieve today.

The next important one is stability. The last mile delivery of your video to the end user is something you cannot control. You are relying on existing networks worldwide, and you need to hope that delivery with your technology will work fine in that region. In Norway or Sweden, it's probably not an issue at all to deliver anything anywhere in good condition, but in Southern Europe, Latin America, Africa, and Asia, you really need to consider heavily how you want to deliver. Edge compute processing is often not available at all there, so regular CDN delivery is often much better, as these countries have seen a huge increase in regular video streaming over the past three years as well. If you look, for example, at Vietnam, where live streaming has tripled in a short time frame over the past year, companies like Akamai and Fastly, all the CDN providers, have invested heavily in good infrastructure for the last mile video delivery in this region. So why not use that investment as well?

How many times have I seen frame drops during a roulette game? It's really crazy, and unfortunately, it is related to some of the technologies used in the market today. You don't want frame drops in your delivery, but some protocols do cause them, as they do not use ABR to compensate for the network going up and down, so they drop frames instead. Also, it is well known that mobile operators often block UDP traffic on their networks when congestion starts. Do you want your live casino game to stop streaming because everyone on the last mile network starts watching World Cup football? We don't think so.

A last key factor is, of course, protection, and more specifically, content protection. A lot of streaming happens in the clear. Anyone can take your content, steal it, restream it, use it for cheating. They can basically do whatever they want with it. Basic tokenization is a first layer of protection, but it's not always fully integrated with your backend to grant access, and it can actually be hacked in a very easy way if it's not. So, make sure you always work with a video streaming service that integrates the tokenization flow with your user management.
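The tokenization flow Bart describes can be sketched with a signed, short-lived token: the backend that knows the user issues it, and the streaming edge validates it before serving anything. This is a minimal HMAC sketch; the field layout, secret handling, and expiry policy are illustrative assumptions, not a specific vendor's scheme.

```typescript
import { createHmac, timingSafeEqual } from "crypto";

// Illustrative only: in practice the secret is provisioned securely,
// never hard-coded.
const SECRET = "shared-secret-between-backend-and-edge";

// The user-management backend issues a token tied to a user, a stream,
// and an expiry time.
function issueToken(userId: string, streamId: string, expiresAt: number): string {
  const payload = `${userId}.${streamId}.${expiresAt}`;
  const signature = createHmac("sha256", SECRET).update(payload).digest("hex");
  return `${payload}.${signature}`;
}

// The streaming edge checks signature and expiry; a leaked URL stops
// working once the token expires, and tampering breaks the signature.
function validateToken(token: string, now: number): boolean {
  const parts = token.split(".");
  if (parts.length !== 4) return false;
  const [userId, streamId, expiresAt, signature] = parts;
  if (Number(expiresAt) < now) return false;
  const expected = createHmac("sha256", SECRET)
    .update(`${userId}.${streamId}.${expiresAt}`)
    .digest("hex");
  const a = Buffer.from(signature, "hex");
  const b = Buffer.from(expected, "hex");
  return a.length === b.length && timingSafeEqual(a, b);
}
```

The key point from the webinar stands regardless of scheme: if the token is not tied into user management on the backend, it protects nothing.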

Your stream startup is now protected, but your content is still not encrypted. For this, you need a real DRM solution. By adding DRM, you can also encrypt the content itself and deliver it in the same Hollywood-grade way as the big streaming providers worldwide, which Steven already mentioned. Legal and compliance teams within live casino will love this capability, especially when they need to onboard new countries, as this requires a lot of work for their teams. DRM will provide more certainty from a compliance perspective as well.

As a last point, on top of protection layers, you can, of course, also have a need for geo-blocking to restrict streaming geographically. Or you could even personalize the same video streaming content from the studio with other graphics and restrict each of the personalized video player outputs to a region, for example. So that's also an interesting use case.

That's a bit of a short wrap-up, Steven, of what I have experienced over the past year.

Comparing Low Latency Video Streaming Protocols

Steven: Thank you, Bart. So, Pieter-Jan, I can say you're a deep technical expert on low latency video streaming technologies. There are today multiple different approaches used by providers to achieve low latency video streaming. So, I'm curious to hear your experience with all these different approaches and how you see them compare in terms of pros and cons.

Pieter-Jan: I probably shouldn't start with the low latency part. I should probably start with the other protocols, like Apple's HLS and MPEG-DASH. I mean, these are the streaming protocols that were built when the internet was actually growing and when video streaming became more and more common. And as a result, there was a need for streaming protocols that were designed to really stream to massive audiences. They had to be able to scale over normal CDNs, and these are still the protocols used today by Netflix, by YouTube, by Twitch, and all of these other big guys. These protocols were actually primed for premium experiences. That means high-quality content, good user experiences, but also the studio-grade content protection that Bart mentioned.

There is a big disadvantage to these protocols. As I said, they were developed when the internet was still growing, and the trade-off that was made to reach these massive audiences was latency. What these protocols basically do is take the long video stream and chop it up into different blocks of video data, usually a couple of seconds each; in the past it was even 10 seconds or more. And the problem with this approach is that a video client has to download one of these blocks, well, first it even has to be published, and then it has to be downloaded completely before it can really be played. This mechanic really introduces a lot of latency. In the past, it was very common to have latencies between 10 and 45 seconds, or even more. I mean, BBC iPlayer during the World Cup, if you read the news about it, they have 90 seconds of latency. They are using these protocols. So that's a lot. And yes, today you can already bring this down to something like five seconds. If you start tuning it, you can bring it even lower, though that might impact stability a bit. But latency will always remain the biggest issue with these protocols.
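The latency mechanics PJ describes can be put into a rough model: a segment must be fully produced before it is published, and players typically buffer several segments on top. The constants below are illustrative defaults, not values from any spec.

```typescript
// Rough end-to-end latency model for segmented protocols (HLS/DASH).
function estimateSegmentedLatency(
  segmentDurationSeconds: number,
  segmentsBuffered: number,
  encodeAndPackageSeconds: number = 1,
  deliverySeconds: number = 0.5
): number {
  // Worst case, a viewer tunes in just as a segment starts being
  // produced, so up to one full segment passes before it is published.
  const publishDelay = segmentDurationSeconds;
  // The player then holds several segments of buffer before playing.
  const bufferDelay = segmentDurationSeconds * segmentsBuffered;
  return publishDelay + bufferDelay + encodeAndPackageSeconds + deliverySeconds;
}
```

With classic 10-second segments and three buffered, this lands in the tens of seconds, which matches the latencies PJ mentions; shrinking segments and buffers is exactly what the low-latency variants do, at some cost in stability.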

But there is a big contrast with some of the other protocols, for example WebRTC. WebRTC, the real-time communication protocol for the web, was designed to do exactly that: to really make sure that you can communicate in real time. It was developed for things like a Google Meet call or a Zoom call. And of course, when you're on a call, latency becomes critical. It doesn't just have to be sub-second; ideally it's actually real time. But again, just as latency was the trade-off with HLS, with WebRTC other trade-offs had to be made, and the trade-off there was usually quality. WebRTC is built on UDP, so it can send packets across the line very quickly. But this is the kind of traffic that is the first to be blocked whenever there is network congestion. So, there might be frames which do not arrive, which leads to the frame drops that Bart was so annoyed by. And this is not a protocol built for premium content. This is a protocol built for real-time communication.

So, there are a lot of other things which don't work the way they do with HLS and DASH. There is no DRM. And yes, there are ways to start adding this on top, but this is not, well, not ready to be used in production. You cannot use it on, for example, an iPhone, which is a critical device for live dealer casinos. So, yes, this is a very interesting protocol, but the real problem here is the scalability that comes with this focus on latency. And this really boils down, if you look at it, to the architecture. Because normal HTTP-based streaming protocols work like this: you have one streaming server, and then you use these CDNs, these commonly available systems, which have grown over the past years as the internet has grown, and they give you an easy way to scale. WebRTC does not have this. With WebRTC, you're actually using point-to-point connections between the streaming server and the clients directly. That means that you need a whole lot more streaming servers if you really want to start scaling out these kinds of solutions.

That's actually kind of similar to the next protocol I want to talk about, which is WebSocket-based streaming. And yes, there might be people here saying, well, WebSockets are not a video streaming protocol. And that's true. WebSockets actually use a similar architecture to WebRTC, but where WebRTC was built for video distribution, WebSockets were built for generic distribution. You can send anything across them. They are based on HTTP and on TCP, so delivery is reliable: if you send a video frame, it will actually arrive. But the delivery of video over WebSocket is a proprietary thing. There is no standardization on how video has to be distributed, which means, well, there is a risk of vendor lock-in. On the other hand, there is more flexibility. With WebSocket-based approaches, you do see that there are often no frame drops; quality can be a bit more flexible, a bit higher, and it's usually perceived a bit better compared to WebRTC because the number of stalls goes down. But there are still other disadvantages: problems with startup times, and DRM remains an issue. So, well, it's already better, it's a step in the right direction, but there are other needs. I mean, as you mentioned, Steven, the industry is evolving. People are expecting better and better quality.
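Because WebSocket video delivery has no standard, as PJ notes, every vendor defines its own framing. A purely hypothetical example of what such a proprietary frame layout could look like, to make the "no standardization" point concrete (the header fields here are invented for illustration):

```typescript
// Hypothetical frame layout a vendor might prepend to each video chunk:
//   byte 0     : frame type (1 = keyframe, 0 = delta frame)
//   bytes 1..4 : presentation timestamp in ms (uint32, big-endian)
//   bytes 5..8 : payload length in bytes (uint32, big-endian)
//   bytes 9..  : encoded video payload

interface VideoFrame {
  isKeyframe: boolean;
  timestampMs: number;
  payload: Uint8Array;
}

function encodeFrame(frame: VideoFrame): Uint8Array {
  const out = new Uint8Array(9 + frame.payload.length);
  const view = new DataView(out.buffer);
  view.setUint8(0, frame.isKeyframe ? 1 : 0);
  view.setUint32(1, frame.timestampMs, false); // big-endian
  view.setUint32(5, frame.payload.length, false);
  out.set(frame.payload, 9);
  return out;
}

function decodeFrame(data: Uint8Array): VideoFrame {
  const view = new DataView(data.buffer, data.byteOffset, data.byteLength);
  const payloadLength = view.getUint32(5, false);
  return {
    isKeyframe: view.getUint8(0) === 1,
    timestampMs: view.getUint32(1, false),
    payload: data.subarray(9, 9 + payloadLength),
  };
}
```

Nothing stops a vendor from choosing a completely different layout, which is exactly the lock-in risk: a player that understands one vendor's framing understands no one else's.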

And that actually brings us to the final protocol, which is a lot like HLS and DASH in the sense that it's HTTP-based, but it's not just built to scale and to deliver premium content; it's also built to deliver sub-second latency, and that's neat. HESP, that's the High Efficiency Streaming Protocol. This is really the latest generation of streaming protocols out there on the market. And I must admit, I'm a bit biased here because, of course, this is a streaming protocol that we started developing at THEO more than five years ago at this point in time, and which has, in the meantime, evolved; it's been published at the IETF. And while we have, for example, our THEOlive product, which allows you to do live video streaming at sub-second latency, there are other products, and there's a blooming ecosystem of HESP-based products starting to pop up.

But as I mentioned, HESP is actually HTTP-based, so it scales over standard CDNs, an easy way to start distributing across the globe, to even millions of viewers. And it's a protocol really focused not just on the problems that we see in the live dealer casino space, but also on the premium content space. So it's focused on high quality delivery and super fast channel startup, allowing people to get into a game fast, but also on the ability to do content protection with DRM. And yes, HESP's latency is not as low as WebRTC's. With WebRTC, you can go to something like 500 milliseconds. HESP will probably be more around 700 to 800 milliseconds, but the latency is configurable. So you can actually tune the latency to the game and the use case that you have. But if I really look at it, it's these four categories of streaming protocols that are probably the best suited for the live dealer casino space.

Choosing the Right Video Streaming Approach

Steven: Thanks, PJ. A lot of different approaches, I think each with their pros and cons. I see the first questions are coming in, so that's very good. I want to remind everyone as well that you can ask questions via the question button in the control bar. We will try to cover as many questions as possible at the end of the webinar. As the next section now, Pieter-Jan and Bart: there are multiple different approaches possible, but as a provider, how do you choose the right approach for your use case, and what are the criteria that you take into account for that decision?

Bart: I think, Steven, it's of course important to look at the decision criteria first. So, having a lower latency can always be beneficial, although it's probably not always needed. Check also the regions where you are delivering your games and be aware of the server infrastructure capabilities there for video delivery, especially when you deliver to mobile users. Scalability becomes very important when your games are successful. For five players around a table, you probably do not necessarily need a CDN to scale; there are other solutions out there. In-game channel changes and fast startup are now possible as well with newer protocols like HESP. So probably some games can also be rethought, or switching, as I mentioned, between landscape and portrait mode can now be done with a good viewer experience. And if cheating in general is common in the regions where you stream video, also consider highly secure delivery by using full DRM.

When deciding which technology is the best fit, then your use case is important, as I mentioned. So therefore, we created this nice matrix for you so you can use it yourself, to review your needs for your specific use cases.

But let's dive into a few examples here. We mentioned on this slide the capabilities of the different protocols, so it should be fairly easy for you guys:

Suppose you want to deliver, for example, a roulette game to Eastern European customers. You definitely don't want high latency, and to keep seeing the ball on the spinning wheel, you don't want dropped frames, so video quality should be high. Stability is always important for most games, and you attract quite some viewers, so you also need to be able to scale with this game.

In order to retain your customers for the future, you want to have a fast start-up time. And you have a lot of viewers on mobile networks, so you have a risk of dropped UDP packets with WebRTC, so that's really not preferred. DRM is probably less of a requirement here, for example, so you can choose what you would prefer to use there.

To look at another example: you want to deliver, for example, a card game to an Asian customer region. Again, low latency, high quality video, and stability are preferred. Scalability for a card game is probably less relevant, unless it's, of course, very popular. Network conditions are for sure also important for this region. And as a last point, you don't want to mess with security, so DRM is the best option for this use case.

Then as a last example, delivery of a Wheel of Fortune game to European and USA geographical regions. Latency is far less important for this game, so actually all protocols could work fine. Quality and stability are as always important for your good viewer experience. And as this game is very popular, you for sure want to scale as well in the best possible way, and also in a cost-efficient way. So scaling over a regular CDN is a preferred path here. Network conditions for these regions are usually quite good, and the risk for cheating might be less as well.

If you look at some of the key takeaways on this next slide: latency, quality, stability, and content protection are definitely key requirements. WebRTC, WebSocket, and HESP can provide sub-second latency, where HESP can scale with less powerful server infrastructure. If you need to scale, you probably want to use HLS, DASH, or HESP, as they can scale over regular existing CDN infrastructure, so you can reuse what other people have built there. Then WebSocket, HLS, DASH, and HESP usually provide the best quality. That's important, of course, to retain your customers. And if you need full protection with DRM, use HLS, DASH, or HESP.
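These takeaways can be sketched as a simple capability filter. The matrix below mirrors what Bart summarizes in the webinar (sub-second latency, CDN scaling, DRM); treat it as a simplification for illustration, not a definitive feature comparison of these protocols.

```typescript
type Protocol = "HLS" | "DASH" | "WebRTC" | "WebSocket" | "HESP";

interface Capabilities {
  subSecondLatency: boolean;
  scalesOverCdn: boolean;
  drm: boolean;
}

// Capability matrix as summarized in the webinar's takeaways.
const matrix: Record<Protocol, Capabilities> = {
  HLS: { subSecondLatency: false, scalesOverCdn: true, drm: true },
  DASH: { subSecondLatency: false, scalesOverCdn: true, drm: true },
  WebRTC: { subSecondLatency: true, scalesOverCdn: false, drm: false },
  WebSocket: { subSecondLatency: true, scalesOverCdn: false, drm: false },
  HESP: { subSecondLatency: true, scalesOverCdn: true, drm: true },
};

// Return the protocols that satisfy every capability the use case needs.
function suitableProtocols(needs: Partial<Capabilities>): Protocol[] {
  return (Object.keys(matrix) as Protocol[]).filter((p) =>
    (Object.keys(needs) as (keyof Capabilities)[]).every(
      (k) => !needs[k] || matrix[p][k]
    )
  );
}
```

For example, requiring all three capabilities at once narrows the field to HESP, while a DRM-only requirement still leaves HLS and DASH on the table, which matches the roulette and card game examples above.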

These are my takeaways, Steven, preparing this webinar.

Q&A

Steven: Thanks, Bart. Thanks, PJ. And let's look into some of the questions that came in. So first question that I see here is regarding quality. Bart, maybe that's one for you. Which bit rates, resolutions, FPS, so frames per second, do you commonly see used today? And maybe an extra question on top from my side on quality of experience for viewers: you talked earlier on ABR, is streaming in a single quality today still happening?

Bart: Yes, actually, a lot of the live casino providers are still producing in a single bitrate, typically because they use protocols which are not optimized for ABR. If you look at bitrates, I see a variety, going from 800 kilobits in a single bitrate up to three or three and a half megabits in ABR at 1080p, but often even 800 kilobits at 720p. So that's the variation I see. Frames per second: typically they target 25 frames per second, but as I mentioned, with a lot of games and the technologies being used for those games, I often see that frame rate drop down to 15 frames per second because the network cannot cope with the delivery. There's no ABR, so they try to keep going with that single bitrate, but then drop the frame rate, for example. And live casino licensees often combine games from different providers. So quality for me is also a bit of a hygiene factor. You don't always win with quality, but you also cannot really stay behind. But that's my opinion on this one.

Steven: Okay, thanks. Another question on HLS and DASH. Pieter-Jan, maybe one for you. Do you see adoption of HLS and DASH in the market today for streaming of live dealer casino games? Or is the latency typically too high for their use case?

Pieter-Jan: Oh, I mean, there's definitely adoption of HLS and DASH. We even saw it in the poll here; there are quite a lot of people using it. And not every game requires sub-second latency, it's the same thing that Bart mentioned. And HLS and DASH do allow you to provide a higher quality of experience. On the other hand, for some games, this really is not an option. You will need to go towards sub-second latency there. And even for those games where you don't need the low latency per se, for your business case, this can still be a very interesting thing to do. So, is there usage of HLS and DASH? Well, yes, absolutely. But it really depends on the use case whether you should use it or not.

Steven: Okay, thanks! A question on WebRTC: what are the possibilities to support DRM with WebRTC?

Pieter-Jan: That's probably a good one for me to take as well. Well, I mentioned it briefly. I mean, there are possibilities to start doing WebRTC with DRM. There are, let's say, prototype-like implementations here. The problem that I have with these implementations is that they're not available on all of the platforms where I would expect them to be usable. So if I look, for example, at iPhones, which for me are a critical device: they don't support the APIs that are needed for WebRTC with DRM today. And that, for me, is something that proves that this is not ready for production use today.

Steven: Okay, thank you. And then a question on HESP. So PJ, maybe for you as well: for HESP, under the quality of experience section, you have fast startup time. How fast are we talking?

Pieter-Jan: Well, there is a level of tuning here as well. And I can go very deep into the technical details to explain how this actually works; very happy to do that, but there's probably no time for that now. But with HESP, what you can actually do is go down to a few round trip times. And what does that mean? Usually that means you can go to like 200, 300, 400 milliseconds, and that's the startup time, which is a very big difference compared to how it works, for example, with WebSocket or WebRTC. Because there, you usually need to wait for a GOP on average. And a GOP is basically the time until a full image is sent to the user. There, you're usually looking at a few seconds. So HESP's fast startup time, well, it's an order of magnitude, like 10x, faster compared to most of the other streaming protocols.
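PJ's comparison of the two startup models can be sketched numerically: waiting for a GOP means, on average, half a GOP of delay before a decodable keyframe even exists, whereas a few-round-trips startup costs a handful of HTTP round trip times. The GOP lengths, RTT, and round-trip count below are illustrative assumptions.

```typescript
// Average wait for the next keyframe in a GOP-waiting protocol: the
// viewer joins at a random point, so on average half a GOP passes
// before a decodable frame arrives.
function gopWaitStartupMs(gopSeconds: number): number {
  return (gopSeconds / 2) * 1000;
}

// HESP-style startup measured in HTTP round trips.
function fewRttStartupMs(rttMs: number, roundTrips: number = 3): number {
  return rttMs * roundTrips;
}
```

With a hypothetical 100 ms RTT and a 2-second GOP, that is roughly 300 ms versus 1000 ms; with the multi-second GOPs common in live setups, the gap widens to the order of magnitude PJ mentions.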

Steven: Okay, thank you. Bart, maybe one for you: which of the streaming technologies has the widest adoption at the moment?

Bart: If we look at that, I think the most widely used is for sure WebSocket, and second WebRTC today. Of course, for less latency-dependent games, HLS as well, but WebSocket by far today. The reason for that is, of course, that there was no real alternative to achieve this real-time latency with other protocols. As PJ mentioned as well, if you really want to drop down to two or three seconds with HLS or DASH, you need to stress the whole ecosystem: ingest, encoding, packaging, CDN delivery, player buffer, everything is stressed, so there's a high risk of failure. So that's why these technologies are used today. But with HESP, of course, there's new technology now available in the market, heavily worked on over the past years by a really expert team, which brings all of this together: high quality with sub-second latency on any screen, actually. So yeah, today other protocols for sure have the widest adoption, mainly due to the latency capabilities. But yeah, there are other things out there today.

Steven: Okay, great. And then maybe a last question, Bart, also for you: based on what you see currently in the market, is there a cost difference between the different streaming technologies or the different streaming approaches?

Bart: Yeah, for sure. Especially, of course, on the ingest side: as I mentioned, some operators are still using a single bit rate, so they do not need to transcode for adaptive bit rate. There, the ingest cost can be a bit lower compared to re-encoding for adaptive bit rate. But then on the delivery side of things, if you need to use edge compute, it's much more expensive than the existing CDN delivery that's known by everyone. So if you can use a regular CDN on the delivery side, it actually makes a huge cost difference compared to other delivery approaches like WebRTC or WebSocket-based streaming.

Steven: Okay, and maybe one more question on HESP, a pretty technical one. So PJ, I'm looking at you there: do you use the Media Source Extensions API for playback with the HESP protocol? And if HESP is based on HTTP and can be used with existing CDNs, what Cache-Control max-age value do you use to achieve sub-second latency? I know it's pretty technical, but give it a go.

Pieter-Jan: That's fine. The person asking this question is probably relatively technical as well, so I will assume that. Media Source Extensions, is it used for playback? Well, it depends on the platform. Not every platform has Media Source Extensions as an API. For example, with THEOplayer, what we do is: when the Media Source Extensions API is available, yes, then we will use it. But on platforms like Safari on iOS, it's not available, so we have other ways to make sure that we can play HESP there as well. So it's a yes-and-no kind of answer, which probably is not a real answer.
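The platform check PJ describes boils down to feature detection. The sketch below shows the general idea: prefer Media Source Extensions when the platform exposes them, otherwise fall back to another pipeline. The `globalScope` parameter stands in for the browser's `window`, and all names here are illustrative, not THEOplayer's actual API.

```typescript
type PlaybackPath = "mse" | "fallback";

function selectPlaybackPath(globalScope: Record<string, unknown>): PlaybackPath {
  const mse = globalScope["MediaSource"] as
    | { isTypeSupported?: (mime: string) => boolean }
    | undefined;
  // Require both the MediaSource object and codec support for the kind of
  // stream we intend to play (fMP4 with H.264, as an example).
  if (mse && typeof mse.isTypeSupported === "function" &&
      mse.isTypeSupported('video/mp4; codecs="avc1.42E01E"')) {
    return "mse";
  }
  // e.g. Safari on iOS, where MSE is not available
  return "fallback";
}

// A desktop-browser-like scope advertises MSE; an iOS-Safari-like one does not.
console.log(selectPlaybackPath({ MediaSource: { isTypeSupported: () => true } })); // "mse"
console.log(selectPlaybackPath({})); // "fallback"
```

In a real browser you would pass `window` rather than a stub object, and the fallback branch would select whatever native or custom playback path the player supports on that platform.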

But if I remember correctly, the second part of the question was: if it's based on HTTP, how do you configure the CDN? Well, HESP is built in, well, actually a very simple way once you know how it works, and it uses very long cache times. There are no cache times in HESP that have to be low. The nice thing here is that the manifest file allows you to predict what the entire stream will look like in the future. Yes, at times you will need to update it, but that's very rare, so you don't have to cache it for a short time; you can cache this manifest for a long time. There are two other types of files in HESP that you need to cache. There are the files we call initialization files or initialization packets, which are numbered depending on the time at which they need to be requested. And there are segments, which are also numbered and, well, can go on and on. And that's the nice thing: you don't need short cache times. But it's probably something to dig into deeper with the person who asked this question, because as I said, I can go pretty deep technical on this.
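A minimal sketch of what such a CDN cache policy could look like for the three HESP resource types PJ lists. The specific max-age values are assumptions for illustration; the point is simply that none of them needs to be short, because the manifest changes rarely and the numbered initialization packets and segments are effectively immutable once published.

```typescript
type HespResource = "manifest" | "initialization" | "segment";

// Illustrative Cache-Control response headers per resource type.
const policies: Record<HespResource, string> = {
  // The manifest predicts the future of the stream and is rarely updated,
  // so even an hour (or more) is reasonable here.
  manifest: "public, max-age=3600",
  // Numbered, append-only files: once written they never change,
  // so they can be cached for up to a year and marked immutable.
  initialization: "public, max-age=31536000, immutable",
  segment: "public, max-age=31536000, immutable",
};

function cacheControlFor(resource: HespResource): string {
  return policies[resource];
}

console.log(cacheControlFor("manifest")); // "public, max-age=3600"
console.log(cacheControlFor("segment"));  // "public, max-age=31536000, immutable"
```

This is the opposite of tuning a CDN for low-latency HLS, where short manifest TTLs are typical; with HESP the low latency comes from the protocol design, not from aggressive cache expiry.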

But a good starting point would be to read the spec or to look at the HESP Alliance website, because it has a lot of information on how this all works from a technical perspective.

Steven: Okay, PJ and Bart, thank you! We have to wrap up now. There are some questions that we have not covered, but I promise we will get back to you via email for sure. Thank you all for joining, and don't hesitate to contact myself, Bart or PJ via LinkedIn or email. You can also catch up with us in person during the ICE event in London in early February. And as promised, you will receive the slides in your mailbox. So enjoy the rest of your day, bye!


The team behind the webinar

Bart Snoeks
Account & Partnership Director @ THEO Technologies
Steven Tielemans
CEO @ THEO Technologies
Pieter-Jan Speelmans
CTO @ THEO Technologies

Want to deliver high-quality online video experiences to your viewers, efficiently?

We’d love to talk about how we can help you with your video player, low latency live delivery and advertisement needs.