Semantic Analysis of a Cricket Broadcast Video

1 Abstract
Most approaches to the semantic analysis of sports videos involve the use of auxiliary cues to detect events [4]. Detecting events of semantic importance from the video alone is a difficult task. In this project, we attempt to semantically characterize a cricket broadcast video based on the video alone. Initially, we perform shot boundary detection and shot classification based on a multi-scale spatio-temporal analysis of colour and optical flow features. This allows every frame to be represented by a feature vector over which a classifier can then be built. We can then pose shallow queries to the video, such as "How many fours were hit by Hayden?", by aligning it with textual commentary [3].
Finally, we examine an optical flow based feature set construction to identify the semantic characteristics of the events (viz. balls) detected using the above technique. This may enable us to answer queries involving finer semantic features, e.g. "How many balls were hit by Hayden on the on side?", with greater accuracy and without the aid of any external cues such as commentary.
2 Introduction
Semantic analysis of a video involves the development of a model to identify, extract and represent semantically relevant events. Events for sports video analysis are extracted from three primary channels: video, audio and text [5]. Each channel furnishes certain cues that correlate with the occurrence of events. In this project, we restrict ourselves to the extraction of cues based on the video alone.
For the purpose of semantic analysis, some amount of video processing needs to be done to generate a framework within which the semantics of the video may be examined. The steps involved are listed in Figure 2. Moreover, in episodic games like cricket, where certain semantically relevant events (viz. balls) are repeated, it is beneficial to be able to identify switches between cameras and to classify shots once their boundaries have been detected, since the shots themselves tend to recur in a pattern due to the inherent episodic nature of the game.
A common approach to scene change detection exploits changes in a colour histogram at shot boundaries [1]. Such an approach assumes that the content of the video changes across shot boundaries and tries to characterize this change in terms of the colour histogram. It fails in cases where the camera is switched but the content of the scene remains the same.
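For concreteness, the histogram baseline can be sketched as follows. This is a minimal sketch assuming OpenCV; the HSV bin counts and the Bhattacharyya-distance threshold are illustrative choices, not values from the report.

```python
import cv2

def histogram_shot_boundaries(video_path, threshold=0.4):
    """Flag frames whose HSV colour histogram differs sharply from the
    previous frame's; large jumps are treated as candidate shot boundaries.
    Bin counts and threshold are illustrative, not from the report."""
    cap = cv2.VideoCapture(video_path)
    prev_hist = None
    boundaries = []
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        # 2-D histogram over hue and saturation, normalized per frame.
        hist = cv2.calcHist([hsv], [0, 1], None, [32, 32], [0, 180, 0, 256])
        cv2.normalize(hist, hist)
        if prev_hist is not None:
            # Bhattacharyya distance: ~0 for identical histograms, ~1 for disjoint.
            dist = cv2.compareHist(prev_hist, hist, cv2.HISTCMP_BHATTACHARYYA)
            if dist > threshold:
                boundaries.append(frame_idx)
        prev_hist = hist
        frame_idx += 1
    cap.release()
    return boundaries
```

A cut between two cameras showing near-identical content (for example, two wide views of the pitch) yields a small histogram distance, which is exactly the failure case noted above and the motivation for the spatio-temporal flow features examined next.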
We examine a multi-scale spatio-temporal analysis of colour and optical flow features for shot boundary detection and shot classification. Once every shot has been classified, a Bayesian probability analysis is carried out to detect the shots that represent the start of a ball. Finally, for every ball, we look at its semantic characterization through the generation of features based on optical flow. This feature vector is then used to classify the balls on the basis of the runs scored, the area in which the ball is hit, and the type of batting stroke played. The results of these experiments are presented.
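To make the flow-based characterization concrete, the sketch below summarizes a detected ball segment by a magnitude-weighted histogram of dense optical flow directions. This is one plausible reading of a flow-based feature vector rather than the exact descriptor used in the report; the Farnebäck parameters and bin count are illustrative assumptions.

```python
import cv2
import numpy as np

def flow_direction_histogram(frames, bins=8):
    """Summarize a ball segment (a list of greyscale frames) by a single
    feature vector: a magnitude-weighted histogram of optical-flow directions.
    The bin count and weighting scheme are illustrative assumptions."""
    hist = np.zeros(bins, dtype=np.float64)
    for prev, curr in zip(frames, frames[1:]):
        flow = cv2.calcOpticalFlowFarneback(
            prev, curr, None,
            pyr_scale=0.5, levels=3, winsize=15,
            iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
        mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
        # Quantize flow directions into `bins` sectors, weighted by magnitude,
        # so dominant leg-side vs off-side motion falls into different sectors.
        sector = (ang * bins / (2 * np.pi)).astype(int) % bins
        for b in range(bins):
            hist[b] += mag[sector == b].sum()
    total = hist.sum()
    return hist / total if total > 0 else hist
```

Per-ball vectors of this form could then be fed to the classifier for runs scored, region of the field (on side versus off side), and stroke type.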

Download full report
http://googleurl?sa=t&source=web&cd=1&ve...report.pdf&ei=K8BETu_hLJGvrAfcmoX4Aw&usg=AFQjCNFLFBFMmbRq3CihM6WuR98yFPzObA&sig2=7xbeJ0eZ7MXhm966-2EIGg