Conference Paper

Enhanced IPTV Services through Time Synchronisation
Lourdes Beloqui Yuste
Discipline of Information Technology
NUI Galway
Galway, Ireland
Hugh Melvin
Discipline of Information Technology
NUI Galway
Galway, Ireland
Abstract—User interactivity and customisation are key features that must be promoted to help distinguish IPTV from satellite/aerial/cable TV alternatives. Furthermore, IPTV providers also have to compete with Internet TV by providing extra features to justify the cost. In this paper, we present a number of scenarios whereby IPTV providers can achieve this through the use of synchronised time. Integrating and tightly synchronising multiple user-selected media streams in Real-Time that are logically and temporally related potentially adds significant extra value to any IPTV platform. It also raises significant research challenges. We outline the prototype developed for a TV/Text Feed scenario and present some details of ongoing and future work.
Index Terms—IPTV, RSS, Media Synchronisation.
The difficulty in synchronising multiple media streams over an IP network stems, firstly, from the potentially separate locations of the sources; secondly, from the issue of where to perform the synchronisation; and thirdly, from the technical challenges of displaying multiple synchronised flows to the final user. In an IPTV platform the operator often has control over all media streams on the network, so the first synchronisation challenge can be managed through specification of media content. In Fig. 1, different multimedia sources connected to the same Network Time Protocol (NTP) server are synchronised at source, ready to be sent to the client through an IP network.
We envisage that IPTV benefits such as superior image quality, a safe, well-managed network and known viewers [1] can be combined with the advantages of Internet TV, such as worldwide access and unlimited media choice [1].
We envisage a number of scenarios where Real-Time integration and synchronisation of user-selected media streams that are both logically and temporally related could generate significant end-user interest, especially in sports-related events.
To enable IPTV to present multiple media streams that are
logically and temporally related, precision time stamping at
source and synchronisation at delivery are key requirements.
Examples from the sporting world include the Soccer World Cup, where groups of games are played simultaneously and users can select to be informed about one or more games in Real-Time. Media streams might include related video streams, a video stream with an Internet Radio stream, or a combined Really Simple Syndication (RSS)/Atom Feed with a TV channel.

Fig. 1. Different multimedia sources (TV, Radio and Feed) to be synchronised at source via NTP Server or equivalent
All of these scenarios, merging multiple video streams,
TV/Feed and TV/Radio, present a range of common research
challenges; we identify and partially examine these through
implementation of a TV/RSS prototype and indicate our plans
to expand the work.
The remainder of this paper is structured as follows: Section II describes the different scenarios, Section III outlines the TV/Feed design issues, Section IV details timing issues for both video and text Feeds, Section V describes the implementation, Section VI outlines our research plans, and Section VII concludes the paper.
A. Multiple Video Streams
This scenario presents a mosaic of video/media streams of a live World Cup football match originating from different content providers: for example, a video stream of the match plus another two video streams from the two different sets of fans in their home countries. This mosaic of three different media streams, comprising the live match feed and the video feeds of the home-based fans in their respective countries, allows the end-user to see the simultaneous reaction to a match event. Typically the audio channel should be the one related to the TV channel transmitting the sports event.

TABLE I. Server-side versus client-side synchronisation

Server-side synchronisation:
- Maximises the client's bandwidth: only one multimedia stream is delivered to the client
- The server cannot provide every different client option in large systems
- System not scalable
- Increases server complexity/cost; risk of server overload

Client-side synchronisation:
- Requires more client bandwidth: two multimedia streams are delivered to the client
- Unlimited options can be chosen at the STB
- System scalable
- Increases STB complexity/cost; risk of client overload
B. TV/Feed
There are different types of multimedia synchronisation described in [2]. This case of video and text Feed can be considered inter-media synchronisation, because audio, video and Feed textual information are synchronised.
The user selects a sporting event as well as a related
Feed which has the information that the user is interested
in. Presenting betting odds/prices via a Feed in Real-Time on
a live streamed horse racing event is one example. Another
would be Formula 1 racing whereby a Feed gives Real-Time
information on lap times in one Feed and inter-car distances
in another.
This could be used to facilitate Real-Time betting where a
Feed can show all the odds related to the sport event displayed
on TV. The alignment of the video and the Real-Time odds, as well as the exact time at which a client executes a bet, is critical.
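As an illustrative sketch (not the paper's implementation), the lookup of the odds in force at the instant a bet is executed reduces to a search over the Feed's publication timestamps; the names and values below are hypothetical, and both timestamps are assumed to derive from the same NTP-disciplined UTC reference:

```python
import bisect

# Hypothetical odds feed: (UTC timestamp in seconds, odds) pairs,
# ordered by publication time, as built from the Feed's time stamps.
odds_feed = [
    (1000.000, 3.50),
    (1000.250, 3.25),   # odds shorten mid-race
    (1000.900, 2.80),
]

def odds_in_force(feed, bet_time):
    """Return the last odds published at or before bet_time, or None."""
    times = [t for t, _ in feed]
    i = bisect.bisect_right(times, bet_time)
    return feed[i - 1][1] if i else None

print(odds_in_force(odds_feed, 1000.500))   # the 3.25 entry is in force
```

With millisecond-granularity time stamps at both ends, the odds displayed with the video frame and the odds applied to the bet stay consistent.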
C. TV/Radio
In this scenario a soccer fan watches his/her team in the World Cup from a foreign country via IPTV, but wishes to listen to his/her national audio commentary on the match. The radio could be delivered either via IPTV or Internet TV. In this case, the IPTV provider must remove the embedded audio from the video and replace it with a precisely aligned, user-chosen audio stream.
IPTV provides a fixed number of Radio channels. Internet Radio widens the client's choice, although it adds more technical challenges due to the use of the public IP network.
In the second scenario presented above, logically related TV and RSS Feeds, though separately sourced, can be selected by the user to create his/her own personalised TV: the user chooses which TV channel to watch and the type of information to receive via a Feed.
Regardless of scenario, a key design issue in terms of scalability is the location of synchronisation: server-side or client-side. This, in turn, will impact on the potential for multicasting, which has implications for bandwidth as well as for server/client complexity and Set-Top Box (STB) cost. Both deployment options were implemented, with the integration and synchronisation facility running on the server and running on the client. Table I establishes the main differences between the two options.

TABLE II. Time-related elements/tags and date formats in RSS 2.0 and Atom

RSS 2.0: the <channel> element carries <lastBuildDate> and <pubDate>; each <item> carries <pubDate>. The date format follows the obsolete RFC822 (second-level granularity).
Atom: the <feed> element carries <updated>; each <entry> carries <published> and <updated>. The date format follows RFC3339, compatible with ISO 8601 (better than second-level granularity).
A key requirement is for media content to be time stamped at source against a universal time source. NTP is the most widely used mechanism, and when properly implemented it allows synchronisation of different sources to better than 1 msec on LANs and to less than 10 msec on WANs [3].
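The accuracy figures above rest on NTP's exchange of four timestamps per round trip; the per-exchange offset and delay estimates follow directly from them. A minimal sketch with illustrative values (a server 25 ms ahead of the client over a symmetric path):

```python
def ntp_offset_delay(t1, t2, t3, t4):
    """Compute clock offset and round-trip delay (in seconds) from the
    four NTP timestamps: t1 client send, t2 server receive,
    t3 server send, t4 client receive."""
    offset = ((t2 - t1) + (t3 - t4)) / 2.0   # estimated server-client offset
    delay = (t4 - t1) - (t3 - t2)            # round-trip path delay
    return offset, delay

# Illustrative exchange: 8 ms each way, 1 ms server turnaround,
# server clock running 25 ms ahead of the client clock.
offset, delay = ntp_offset_delay(100.000, 100.033, 100.034, 100.017)
```

The offset estimate is exact only when the path delay is symmetric, which is why well-managed private IPTV networks achieve tighter bounds than WAN paths.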
An IPTV provider often has full control of media sources
on its own well-managed private network and can therefore
ensure that media content is adequately time stamped. In our
scenario, this refers to Feed and Video time stamps.
A. RSS/Atom Timing
There are two main Feed formats: RSS [4] and Atom [5]. Although 80% of feeds use RSS [6], it is a frozen standard and no changes can be made to it [4]. Atom has been adopted as an IETF Proposed Standard; it enhances the RSS format and could provide future improvements, including making millisecond accuracy in the time formats compulsory. In Table II we present the main differences between the elements/tags in the two standards and the date format standards they use.
RSS 2.0 is based on a tagged Extensible Mark-up Language (XML) format. In the RSS channel, following the RSS 2.0 specification [4], the tags related to time are optional. The <pubDate> tag indicates the time when the channel was published; <ttl> indicates the period in minutes that the client waits before reloading the channel from the server; and <lastBuildDate> indicates when the channel's content was last modified. The individual items may also have a <pubDate>, distinct from the above, which details the time and date at which the item is to be displayed. All of these follow the RFC822 [7] format, which includes the date, the day of the week and the time to second-level granularity, in Greenwich Mean Time (GMT). Second-level granularity is insufficient for our application scenario, where synchronisation of the order of milliseconds is required.

Fig. 2. Example time standards used by RSS (RFC822) and Atom (RFC3339)

Fig. 3. RSS 2.0 and Atom time-related tags used as time stamps
Like RSS, Atom is based on XML but adds new features. Atom adds a namespace structure, so all elements belong to one defined namespace [6]. Its tags and time format differ from those of RSS. In Atom, the <published> tag in an entry is optional and indicates the time the entry was created, while the <updated> tag is compulsory for both the feed and its entries and indicates the time at which the information was last updated. The main difference is that Atom uses RFC3339 [8], which is compatible with ISO 8601 [9] and specifies granularity better than seconds, though it does not specify how many fractional digits. Again, for multimedia synchronisation purposes, particularly scenario three, we would recommend a minimum of 3 digits (milliseconds) in order to achieve acceptable inter-media synchronisation. Fig. 2 gives an example of both formats, RFC822 and RFC3339.
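The granularity difference between the two date formats can be seen by parsing an example of each; a sketch using Python's standard library, with illustrative date strings:

```python
from email.utils import parsedate_to_datetime   # parses RFC822-style dates
from datetime import datetime

# RFC822 (RSS 2.0): the format has no sub-second field at all.
rss_date = parsedate_to_datetime("Sat, 07 Sep 2002 09:42:31 GMT")

# RFC3339 (Atom): fractional seconds are allowed, e.g. milliseconds.
atom_date = datetime.fromisoformat("2002-09-07T09:42:31.125+00:00")

print(rss_date.microsecond)    # 0: nothing finer than seconds survives
print(atom_date.microsecond)   # 125000: the 125 ms fraction is preserved
```

Any alignment budget below one second is therefore impossible to express in a plain RSS 2.0 <pubDate>, whereas an Atom <updated> value can carry it directly.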
For our purposes, all time-related tags would be compulsory in the RSS/Atom Feed, and the time granularity required would depend on the application. This could be of the order of <100 milliseconds to meet the synchronisation requirements in the scenarios presented, such as Real-Time race betting or Real-Time race-track information in Formula 1 racing. Fig. 3 shows a general example of the distribution of the time-related tags inside an RSS Channel/Atom Feed.
B. MPEG2 Timing
In the three possible scenarios, the TV channel sent via the network from the TV streamer to the STB will typically be in MPEG2 Transport Stream (TS) format [10]. In MPEG2 TS there are a number of different timestamps embedded in the packets: the Presentation Time Stamp (PTS), the Decoding Time Stamp (DTS) and the Program Clock Reference (PCR). DTS and PTS perform the inter-media synchronisation between audio and video, while PCR is used by the decoder to perform clock recovery functions [10]. The PCR (27 MHz) is located in the Transport Stream Adaptation Field, while DTS and PTS (90 kHz) are located in the Transport Packet payload, inside the Packetised Elementary Stream (PES) packet header. Fig. 4 shows where the time stamps are located in a TS packet. Not all TS packets have the Adaptation Field, and PTS and DTS are not always present: the PTS_DTS_flags field indicates their presence in the PES packet header. If DTS is absent then DTS=PTS; DTS can never be found on its own.

Fig. 4. MPEG2 TS header with all time stamp fields and related flags

Fig. 5. MPEG2 PLL high-level diagram where PCR and STC are compared to ensure that the encoder's and decoder's frequency is 27 MHz [10]
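As a sketch of the PES header layout described above, the 33-bit PTS/DTS value is spread over five bytes with interleaved marker bits, following the ISO/IEC 13818-1 bit layout; the packing helper is our own illustrative inverse, not something the standard defines:

```python
def decode_pts(b):
    """Decode a 33-bit PTS/DTS from the 5-byte field of an MPEG2 PES
    header (marker bits interleaved as per ISO/IEC 13818-1)."""
    return (((b[0] >> 1) & 0x07) << 30) | (b[1] << 22) | \
           (((b[2] >> 1) & 0x7F) << 15) | (b[3] << 7) | (b[4] >> 1)

def encode_pts(pts, prefix=0b0010):
    """Illustrative inverse: pack a 33-bit PTS into the 5-byte field."""
    return bytes([
        (prefix << 4) | (((pts >> 30) & 0x07) << 1) | 1,   # top 3 bits
        (pts >> 22) & 0xFF,                                # next 8 bits
        (((pts >> 15) & 0x7F) << 1) | 1,                   # next 7 bits
        (pts >> 7) & 0xFF,                                 # next 8 bits
        ((pts & 0x7F) << 1) | 1,                           # low 7 bits
    ])

pts = 900_000                      # 10 s at the 90 kHz PTS clock
assert decode_pts(encode_pts(pts)) == pts
print(pts / 90_000.0)              # 10.0 seconds of presentation time
```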
It is important that the encoder and decoder clocks run at the same frequency, 27 MHz in MPEG2. A Phase-Locked Loop (PLL) is responsible for the clock recovery functions at the decoder that guarantee this: the PLL compares the encoder's PCR with the decoder's System Time Clock (STC) and applies clock recovery to ensure that both run at 27 MHz. Fig. 5 shows a high-level diagram of the MPEG2 PLL, in which the decoder compares PCR with STC and applies the clock recovery functions [10].
The STB decoder takes care of the synchronisation between DTS, PTS and PCR. Our system of video/text synchronisation must, in addition, synchronise the PTS from the video with the <pubDate>/<updated> time stamp from the RSS Channel/Atom Feed. This is done either client-side or server-side, as discussed in Section V.
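A sketch of how this alignment could work: given one anchor pair relating a reference PTS to a UTC instant (obtainable because the source is NTP-synchronised), any later PTS maps to wall-clock time at the 90 kHz rate, and Feed time stamps can then be compared against it. The anchor values below are hypothetical:

```python
PTS_CLOCK_HZ = 90_000
PTS_WRAP = 1 << 33          # PTS is a 33-bit counter

def pts_to_utc(pts, anchor_pts, anchor_utc):
    """Map a 90 kHz PTS value to UTC seconds, given one anchor pair
    (anchor_pts was presented at anchor_utc). Handles one wrap-around."""
    delta = (pts - anchor_pts) % PTS_WRAP     # ticks elapsed since anchor
    return anchor_utc + delta / PTS_CLOCK_HZ

# Hypothetical anchor: PTS 450_000 was presented at a known UTC instant.
anchor_pts, anchor_utc = 450_000, 1_275_300_000.0
utc = pts_to_utc(anchor_pts + 90_000, anchor_pts, anchor_utc)
# utc is exactly one second after the anchor; a Feed item whose
# <pubDate>/<updated> matches this instant is displayed with this frame.
```

The design choice is where this mapping runs: on the server before embedding, or in the STB, exactly the trade-off of Table I.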
There are three pieces of hardware involved: the TV streamer, the Feed server and the STB. Both the video stream and the RSS data are stored on the TV/RSS server, and both must be time stamped from Coordinated Universal Time (UTC) sources, such as an NTP server. Synchronising the STB to an NTP server would also provide a mechanism to indicate how old the information is, as well as facilitating clock skew removal via an alternative mechanism of clock recovery [11].
Fig. 6. Relationship between the time stamps of the RTP and RTCP protocols and the video PES header [11]
C. Supported Protocols
The Real-time Transport Protocol (RTP) is used to transport the MPEG2 TS over the User Datagram Protocol (UDP), and the RTP Control Protocol (RTCP) is used to report back on the Quality of Service (QoS) of the communications.
UDP [12] is a transaction-oriented transport protocol which guarantees neither delivery of streams nor duplicate protection. It therefore assumes no retransmission of lost packets, which suits the Real-Time characteristics of video. Sending MPEG2 TS in RTP packets adds extra features, broadly explained in [13], with the only trade-off being a small increase in packet overhead.
There is a specific RTP payload format for carrying MPEG1/MPEG2 video that provides extra information about the video content. Its functions are to increase compatibility between MPEG2 systems and to ensure that MPEG2 media streams are compatible with any media streams encapsulated in RTP [14]. This RTP format for sending MPEG1/MPEG2 video streams relates the PTS time stamps of the video/audio to the RTP timestamp, and RTCP associates RTP time stamps with NTP-based clock time [14].
When properly implemented, NTP can deliver millisecond
level synchronisation. In Fig. 6 we can see the relationship
between all time stamps in the protocols involved in the RTP,
RTCP and PES headers.
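The association RTCP provides is a pairing, in each Sender Report, of a 64-bit NTP timestamp (32-bit seconds since 1900 plus a 32-bit fraction) with an RTP timestamp. A sketch of how a receiver could use that pairing to place any RTP timestamp on the wall clock (helper names are ours; 90 kHz is the MPEG2 TS payload clock rate):

```python
NTP_UNIX_OFFSET = 2_208_988_800   # seconds between 1900 (NTP) and 1970 (Unix)

def ntp64_to_unix(ntp_ts):
    """Convert the 64-bit NTP timestamp carried in an RTCP Sender Report
    (32-bit seconds, 32-bit fraction) to Unix time in seconds."""
    seconds = ntp_ts >> 32
    fraction = ntp_ts & 0xFFFFFFFF
    return (seconds - NTP_UNIX_OFFSET) + fraction / 2**32

def rtp_to_unix(rtp_ts, sr_rtp_ts, sr_ntp_ts, clock_rate=90_000):
    """Wall-clock time of an RTP timestamp, interpolated from the most
    recent Sender Report pair (sr_ntp_ts, sr_rtp_ts)."""
    return ntp64_to_unix(sr_ntp_ts) + (rtp_ts - sr_rtp_ts) / clock_rate
```

This is the mechanism that ultimately lets separately sourced streams, each reporting against the same NTP timescale, be placed on one common timeline at the receiver.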
In the case of the Feeds, they are typically delivered using the Hypertext Transfer Protocol (HTTP) over the Transmission Control Protocol (TCP). TCP is a connection-oriented transport protocol which provides retransmission of lost packets and congestion control [15]; it is therefore the appropriate protocol for the normal delivery of Feeds, where the content is essential and its delivery to the client must be guaranteed. In the example of sending Real-Time betting information via a Feed, we must guarantee that the client gets all the information available. If retransmission of a packet is required, this impacts on Real-Time delivery and must be compensated for by the system.
If the Feed information is not critical it could then be
delivered by UDP and RTP using the same protocols as
MPEG2 TS. Fig. 7 shows the protocol stack with the two
possible groups of protocols to deliver Feeds in the transport
and application layer.
Fig. 7. Differences between different transport protocols (UDP/TCP) and
application protocols (HTTP/RTP/RTCP)
A. STB User Interface
A web interface allows the user to select different multimedia options, making the system fully personalised. These could be, for example, TV channels, Radio channels or Feeds. In many cases these multimedia choices will not be logically or temporally related, and therefore synchronisation is not required. For example, a film and a Feed about a football match do not require synchronising, because the only purpose is to keep the viewer informed of a sports event whilst watching a film; a delay of a few seconds will make no difference to the viewer.
In our prototype the user selects video and RSS Feeds to be
aligned and can select where to perform the synchronisation,
server-side or client-side (STB) alignment.
B. STB Alignment Option
In this scenario TV and RSS are streamed independently to the STB, so every client requires the capability to synchronise the TV channel with the selected RSS Feed. Server scalability is therefore less of an issue, as multicast support is more feasible and bandwidth requirements at the server are lessened. However, the STB must meet the Real-Time integration processing requirements: it buffers and time-aligns (synchronises) the channels for display. A convenient mechanism to implement this could be an extension of the existing subtitles facility, as described briefly below.
C. Server-side Option
The other method, whereby the media streams are integrated/aligned on the server, greatly reduces STB complexity, which is a key advantage, but presents scalability challenges. These relate both to the server bandwidth required to service individual client needs, due to the reduced capacity for multicasting, and to the complexity of processing requests on the server. The client side also requires less bandwidth which, though of relevance here, is more pertinent when other synchronisation scenarios such as multiple video streams and/or audio substitution are implemented.

Fig. 8. High-level diagram of the prototype. The external elements, the database server and the Sky RSS Feed, are not connected to the NTP server, but the system's Web, TV and RSS servers are connected, along with the client
D. RSS Feed
The prototype had two possible sources for the RSS Feed: first, a Feed from the Sky server, with no time stamps, and secondly, an RSS Feed stored on the same server as the TV streamer, with time stamps. The idea was to develop a system that could use both internal and external RSS Feeds, to show the full potential of the approach.
As previously mentioned, time stamps in RSS are not compulsory; in the internal RSS Feed we included them in the file items with millisecond-level granularity. Using a real Sky RSS Feed has the drawback that the <pubDate> tags in the items are not implemented. The system works around this inconvenience by adding time stamps to the items in order to synchronise their display with the video.
The database server is external to the system and stores information about the client's TV requirements. The Sky RSS Feed server is also external, with no possibility of modifying the content of the RSS Feed. Fig. 8 shows the full high-level diagram with all the system's components and their connection to the NTP server.
The two approaches, server-side and client-side synchronisation, were developed with the VideoLAN (VLC) media player [16]: it is used to stream the MPEG2 TS at the server and as a decoder at the client side.
The embedded subtitle mechanism within VLC provides a suitable means to integrate the RSS Feed within the TV stream. The level of alignment required for a good subtitle implementation is much less than 1 sec and therefore fitted our requirements. Essentially, our implementation converts RSS Feeds into subtitle format and embeds this within a TV stream.

Fig. 9. Transformation of RSS Feed time stamps into a subtitle SRT file before displaying video/audio synchronised with the subtitles in the client's decoder
In the client-side implementation the RSS Feed was transformed into a subtitle file in SubRip format, which allows HTML text formatting with millisecond timing accuracy. In the server-side alignment the subtitle file was embedded into the MPEG2 video and streamed to the client; in the client-side alignment the RSS file is read from the server and transformed into subtitles at the client. Fig. 9 shows how the RSS Feed is transformed into a subtitle in SubRip format before the video is displayed at the client.
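A minimal sketch of this transformation (our own simplification, not the prototype's code): each Feed item's publication time, taken relative to the UTC instant at which the video stream started, becomes the start time of a SubRip entry, and the hold duration is an assumed display period:

```python
from datetime import datetime, timedelta

def srt_timestamp(td):
    """Format a timedelta as the SubRip HH:MM:SS,mmm timestamp."""
    total_ms = int(td.total_seconds() * 1000)
    h, rem = divmod(total_ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

def items_to_srt(items, stream_start, hold=timedelta(seconds=4)):
    """Turn (publication datetime, text) Feed items into SubRip entries,
    offset from the stream start; `hold` is an assumed on-screen time."""
    out = []
    for n, (pub, text) in enumerate(items, start=1):
        begin = pub - stream_start
        out.append(f"{n}\n{srt_timestamp(begin)} --> "
                   f"{srt_timestamp(begin + hold)}\n{text}\n")
    return "\n".join(out)

# Hypothetical stream start and one Feed item 12.5 s into the stream.
start = datetime(2010, 6, 11, 15, 0, 0)
items = [(datetime(2010, 6, 11, 15, 0, 12, 500_000), "GOAL! 1-0")]
print(items_to_srt(items, start))
```

Because SubRip carries millisecond timing, the subtitle renderer's own clock then handles the final alignment against the decoded video.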
Both the STB and the server options have been implemented and are currently being tested. The system has been developed in Java for the server side and JavaScript for the client side; the user interface has been developed with Java Server Pages (JSP). VLC is the media player chosen as the TV streamer at the server side and as the decoder at the client side.
The prototype only accepts the RSS standard as a Feed; a new model that also uses Atom has to be implemented. In the prototype we will evaluate UDP- and TCP-based RSS/Atom delivery mechanisms and the effects each has on the multimedia synchronisation.
To date our research has focused on MPEG2, and we plan to extend this to MPEG4 in developing a mosaic of multiple video streams.
The different formats companies use to deliver Internet Radio, in the context of the TV/Radio scenario, will be another consideration. Internet Radio uses different formats such as MPEG Audio Layer 3 (MP3), Ogg Vorbis, Windows Media Audio (WMA), RealAudio and High-Efficiency Advanced Audio Coding (HE-AAC, also known as aacPlus), among others. It is also important to analyse the different protocols used to deliver these formats: some use RTP/TCP and others use different protocol combinations.
In this paper, we have presented an overview of a research plan to synchronise multiple media streams within an IPTV environment. We outlined a number of scenarios and detailed one in particular, whereby a TV stream and an RSS Feed are synchronised. We summarised the implementation and the key technical issues involved, including the scalability design trade-off. In our ongoing work, we are examining these trade-offs in detail and extending the implementation scope to the other scenarios outlined.
Thanks to SolanoTech, our industrial partner, for their support. Thanks also to the Irish Research Council for Science, Engineering and Technology (IRCSET) for the Enterprise Partnership Scheme funding.
[1] J. Maisonneuve, M. Deschanel, J. Heiles, H. Liu, R. Sharpe, Y. Wu. An Overview of IPTV Standards Development. IEEE Transactions on Broadcasting, vol. 55, no. 3, June 2009, pp. 315-328.
[2] F. Boronat, J. Lloret, M. Garcia. Multimedia Group and Inter-stream Synchronization Techniques: A Comparative Study. Information Systems 34, 2009, pp. 108-131.
[3] Internet Engineering Task Force. RFC1129. Internet Time Synchronization: The Network Time Protocol. October 1989.
[4] Berkman Center. RSS 2.0 at Harvard Law (2003). Available at: [Accessed: 02 December 2009].
[5] Internet Engineering Task Force. RFC4287. The Atom Syndication Format. December 2005.
[6] H. Wittenbrink. RSS and Atom. Packt Publishing Ltd, Birmingham, 2005.
[7] Internet Engineering Task Force. RFC0822. Standard for the Format of ARPA Internet Text Messages. August 1982.
[8] Internet Engineering Task Force. RFC3339. Date and Time on the Internet: Timestamps. July 2002.
[9] ISO 8601. Data Elements and Interchange Formats - Information Interchange - Representation of Dates and Times. 2000.
[10] ISO/IEC 13818-1. Information Technology - Generic Coding of Moving Pictures and Associated Audio: Systems. Recommendation H.222 (2000E).
[11] H. Melvin, L. Murphy. Synchronisation of Internet Multimedia Streams: Some Issues and Solutions. Work-in-Progress Proceedings of the Real-Time Systems Symposium (RTSS 2005), Miami, Dec. 2005.
[12] Internet Engineering Task Force. RFC768. User Datagram Protocol. August 1980.
[13] J. Goldberg. RTP/UDP/MPEG2 TS as a Means of Transmission for IPTV Streams. Telecommunication Standardization Sector, Focus Group on IPTV, IPTV-ID-0087. July 2006.
[14] Internet Engineering Task Force. RFC2250. RTP Payload Format for MPEG1/MPEG2 Video. January 1998.
[15] Internet Engineering Task Force. RFC793. Transmission Control Protocol. September 1981.
[16] VideoLAN VLC Media Player. Client Software, Open Source Code. [Accessed: 06 April 2010].