Monday, 28 December 2015

Wireless Video Service in CDMA Systems

Abstract

Video services are becoming an integral part of future communication systems. Especially for the upcoming 3G CDMA-based wireless networks such as UMTS, video may very well turn out to be the key value addition that achieves the required return on investment. While previous generations of wireless communication systems were primarily designed and used for voice services, next-generation systems have to support a broad range of applications in a wide variety of settings. The early market stages were characterized by the needs of early adopters, mostly for professional use. As the market matures from early adopters to mainstream users, new services will be demanded, and these demands will likely converge toward those that already exist for wired telecommunications services.
Market research finds that mobile commerce for 3G wireless systems and beyond will be dominated by basic human communication such as messaging, voice, and video communication [1]. Because of its typically large bandwidth requirements, video communication (as opposed to lower-rate voice and elastic e-mail) is expected to emerge as the dominant type of service in 3G/4G wireless systems. Video services, both real-time and streaming, are gaining importance and finding applications in CDMA systems.
There are three ways to spread the bandwidth of the signal:
• Frequency hopping. The signal is rapidly switched between different frequencies within the hopping bandwidth pseudo-randomly, and the receiver knows beforehand where to find the signal at any given time.
• Time hopping. The signal is transmitted in short bursts pseudo-randomly, and the receiver knows beforehand when to expect the burst.
• Direct sequence. The digital data is directly coded at a much higher chip rate. The code is generated pseudo-randomly; the receiver knows how to generate the same code and correlates the received signal with that code to extract the data.
These direct-sequence spread spectrum signals can carry digitized voice, ISDN channels, modem data, etc.
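As a rough illustration of the direct-sequence case, the short Python sketch below spreads a few data bits with a pseudo-random ±1 chip sequence and recovers them at the receiver by correlating with the same code. The spreading factor, seed, and noise level are arbitrary assumptions chosen for the example, not parameters of any particular CDMA standard.

import numpy as np

# Illustrative parameters (assumptions, not taken from any CDMA standard).
SPREADING_FACTOR = 16                                  # chips per data bit
rng = np.random.default_rng(0)
pn_code = rng.choice([-1, 1], size=SPREADING_FACTOR)   # pseudo-random +/-1 chip sequence

def spread(bits):
    # Map bits {0,1} to +/-1 and multiply each symbol by the PN chip sequence.
    symbols = 2 * np.asarray(bits) - 1
    return np.concatenate([b * pn_code for b in symbols])

def despread(chips):
    # Correlate each block of chips with the same PN code; the sign of the
    # correlation recovers the transmitted bit.
    blocks = np.asarray(chips).reshape(-1, SPREADING_FACTOR)
    correlations = blocks @ pn_code
    return (correlations > 0).astype(int)

data = rng.integers(0, 2, size=8)
tx = spread(data)                                      # wideband chip stream
rx = tx + rng.normal(0.0, 0.5, size=tx.shape)          # additive channel noise
assert np.array_equal(despread(rx), data)              # bits recovered despite noise

Because the receiver multiplies by the same ±1 code and sums over each bit period, the desired signal adds coherently while noise (or another user's code) does not, which is the processing gain that direct-sequence CDMA relies on.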
Why Real-Time Video Traffic Is Difficult to Handle
• Video requires an extremely large bit rate:
– Large bandwidth requirement of video sources vs. the limited resources of wireless networks.
– Highly compressed bitstreams are extremely sensitive to channel errors and network impairments.
– Stringent latency requirements.
• Compressed video requires error protection:
– Most existing video compression standards were not originally designed for lossy channels.
– Appropriate error protection schemes are an important research topic.

Video Streaming over CDMA-Based Wireless Networks

In video streaming services, playout begins when the queue length of the receiver buffer rises above a threshold. This threshold must be large enough to reduce the buffer underflow probability and to absorb the bit-rate variations caused by the wireless channel. On the other hand, it is important to keep this threshold small in order to reduce the initial playout delay and the size of the receiver buffer. Here a video streaming service is considered in which the last link is a wireless CDMA-based link, and it is shown how a truncated power control allows the pre-roll delay to be reduced without increasing the average transmission power and without degrading the video quality. The proposed truncated power control is presented together with an analytical model to evaluate the achievable pre-roll delay reduction.
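The details of the proposed truncated power control and its analytical model are not reproduced here; as a rough illustration of the general idea only, the Python sketch below simulates truncated channel inversion over a Rayleigh-faded link, where the transmitter inverts the fading gain while it stays above a cutoff and remains silent otherwise. The fading model, target SNR, cutoff, and slot count are assumptions made for this example.

import numpy as np

rng = np.random.default_rng(1)

TARGET_SNR = 1.0        # required receive SNR, normalized (assumed)
GAIN_CUTOFF = 0.2       # transmit only when the channel power gain exceeds this (assumed)
SLOTS = 100_000         # number of simulated transmission slots

# Rayleigh fading gives an exponentially distributed channel power gain.
channel_gain = rng.exponential(scale=1.0, size=SLOTS)

# Truncated channel inversion: invert the channel above the cutoff, stay silent below it.
tx_power = np.where(channel_gain > GAIN_CUTOFF, TARGET_SNR / channel_gain, 0.0)

print(f"fraction of slots transmitting: {np.mean(tx_power > 0):.3f}")
print(f"average transmit power:         {np.mean(tx_power):.3f}")

Staying silent during deep fades is what keeps the average transmit power bounded even though the channel is inverted; a plain (non-truncated) inversion over Rayleigh fading would require unbounded average power.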

System Model

The system model consists of a source, a server, and a client. As shown in Figure 2, the video stream passes through a wired network without losses and then a lossy wireless link with CDMA-based transmission. The source can be a live program or a prestored program; in the first case the source passes a video frame to the server every t seconds, while in the second case the source passes all of the video frames to the server at the beginning of the session. The server is responsible for delivering video frames from the source to the client through a heterogeneous wired/wireless network; here a UDP-based transport platform is considered. UDP does not provide recovery from data losses, so this functionality is left to the link layer.
The server encapsulates each video frame within a UDP packet, and each packet is enqueued into the UDP transmission buffer. Both at the server side and at the client side, a link-layer ARQ buffer and a playout/UDP buffer are needed. At the client side, playout begins when the queue length n of the playout buffer rises above a specified threshold Npr. This phase is called the pre-roll process, and it is needed in order to reduce the buffer underflow probability at the expense of an initial delay (the pre-roll delay).
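The encapsulation step just described can be sketched in a few lines of Python; the client address, port, and dummy frame payloads below are placeholders, and no reliability is added on top because UDP itself provides none.

import socket

CLIENT_ADDR = ("192.0.2.10", 5004)   # placeholder client address and port (assumed)

def send_frames(frames):
    # Encapsulate each encoded video frame in one UDP datagram and hand it to
    # the socket's transmission buffer; delivery is not guaranteed.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        for frame in frames:         # frame: bytes of one compressed video frame
            sock.sendto(frame, CLIENT_ADDR)
    finally:
        sock.close()

# Example with dummy payloads; in practice the frames come from the video encoder.
send_frames([b"frame-0", b"frame-1", b"frame-2"])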
At the client side, underflow occurs when n = 0; after a buffer underflow the receiver temporarily suspends the playout of the video and a new pre-roll process starts. Both the pre-roll delay and the buffer underflow probability depend on the pre-roll threshold Npr and on the channel reliability: a large Npr results in a small underflow probability but increases the pre-roll delay. It is common for streaming media clients to have 5 to 15 seconds of buffering delay before playback starts.
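To make the pre-roll and underflow behaviour concrete, the Python sketch below simulates the client playout buffer under a simple Bernoulli arrival model: playout starts once the queue length n reaches Npr, one frame is consumed per slot, and a new pre-roll phase begins whenever the buffer empties. The threshold, arrival probability, and number of slots are illustrative assumptions, not values from the model above.

import random

random.seed(42)

NPR = 10            # pre-roll threshold in frames (assumed)
P_ARRIVAL = 0.9     # per-slot probability that a frame survives the wireless link (assumed)
SLOTS = 5000        # one slot = one frame interval t

n = 0               # queue length of the playout buffer
playing = False
underflows = 0
preroll_slots = 0   # total slots spent re-buffering (pre-roll delay)

for _ in range(SLOTS):
    if random.random() < P_ARRIVAL:   # a frame reaches the client in this slot
        n += 1
    if not playing:
        preroll_slots += 1
        if n >= NPR:                  # pre-roll complete: start playout
            playing = True
    elif n == 0:                      # buffer underflow: suspend and re-buffer
        underflows += 1
        playing = False
    else:
        n -= 1                        # play out one frame per slot

print(f"underflow events: {underflows}, slots spent pre-rolling: {preroll_slots}")

Raising NPR in this toy model makes underflows rarer but lengthens each pre-roll phase, which is exactly the trade-off that the truncated power control described above tries to relax.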

Video Conferencing

Videoconferencing is a medium through which two or more people at different locations can meet face to face in real time. It offers new possibilities to connect with guest speakers and experts, and can make relevant learning opportunities more accessible and engaging.
Videoconferencing is the conduct of a video conference (also known as a videoteleconference) by a set of telecommunication technologies which allow two or more locations to communicate by simultaneous two-way video and audio transmissions. It has also been called 'visual collaboration' and is a type of groupware.
Videoconferencing uses audio and video telecommunications to bring people at different sites together. This can be as simple as a conversation between people in private offices (point-to-point) or involve several sites (multipoint) with large rooms at multiple locations. Besides the audio and visual transmission of meeting activities, allied videoconferencing technologies can be used to share documents and display information on whiteboards.
