(12) United States Patent
Brown et al.

(10) Patent No.: US 7,447,331 B2
(45) Date of Patent: Nov. 4, 2008

(54) SYSTEM AND METHOD FOR GENERATING A VIEWABLE VIDEO INDEX FOR LOW BANDWIDTH APPLICATIONS

(75) Inventors: Lisa Marie Brown, Pleasantville, NY (US); Jonathan H. Connell, Cortlandt Manor, NY (US); Raymond A. Cooke, Bloomington, MN (US); Arun Hampapur (US); Sharathchandra Umapathirao Pankanti, Mount Kisco, NY (US); Andrew William Senior, New York, NY (US); Ying-Li Tian, Yorktown Heights, NY (US)

(73) Assignee: International Business Machines Corporation, Armonk, NY (US)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 1003 days.

(21) Appl. No.: 10/785,890

(22) Filed: Feb. 24, 2004

(65) Prior Publication Data
US 2005/0185823 A1    Aug. 25, 2005

(51) Int. Cl.
G06K 9/00  (2006.01)
H04N 5/225 (2006.01)

(52) U.S. Cl. ........ 382/103; 348/169

(58) Field of Classification Search ........ 382/103, 382/107, 154, 194, 224, 243, 142; 348/143, 348/169, 700; 375/240.1, E7.004, E7.011, E7.02, E7.076, E7.078; 380/212
See application file for complete search history.

(56) References Cited

U.S. PATENT DOCUMENTS
5,521,841 A    5/1996  Arman et al. ........ 364/514 A
5,923,365 A *  7/1999  Tamir et al. ........ 348/169
5,933,535 A    8/1999  Lee et al. ........ 382/243
5,969,755 A * 10/1999  Courtney ........ 348/143
6,026,183 A    2/2000  Talluri et al. ........ 382/194
6,169,573 B1   1/2001  Sampath-Kumar et al. ........ 348/169
6,173,317 B1 * 1/2001  Chaddha et al. ........ 709/219
(Continued)

OTHER PUBLICATIONS
Kompatsiaris et al., "Spatiotemporal Segmentation and Tracking of Objects for Visualization of Videoconference Image Sequences", IEEE Transactions on Circuits and Systems for Video Technology, IEEE Inc., New York, vol. 10, No. 8, Dec. 2000, pp. 1388-1402.
(Continued)

Primary Examiner: Abolfazl Tabatabai
(74) Attorney, Agent, or Firm: Duke W. Yee; Anne Dougherty; Brandon G. Williams

(57) ABSTRACT
A system and method for generating a viewable video index for low bandwidth applications are provided. The exemplary aspects of the present invention solve the problems with the prior art systems by incorporating information for generating a viewable representation of the video data into the index, thus generating a viewable video index. The viewable video index contains information for generating a visual representation of moving objects in the video data, a visual representation of the background of the video capture area, i.e. the scene, a representation of the object trajectory, a representation of the object attributes, and a representation of detected events. The result is that the viewable video index may be transmitted to a low bandwidth application on a client device and may be used along with associated object and background models to generate a representation of the actual video data without requiring that the original video data itself be streamed to the client device.

26 Claims, 4 Drawing Sheets
[Front-page figure: excerpt of the viewable video index structure (cf. FIG. 5), showing a viewable video index 520 with a log file recording IndexStartTime and IndexEndTime, object model files 550 (object patch, object mask 560), and background image files.]
U.S. PATENT DOCUMENTS
6,185,314 B1 *  2/2001  Crabtree et al. ........ 382/103
6,271,892 B1 *  8/2001  Gibbon et al. ........ 348/700
6,366,296 B1    4/2002  Boreczky et al. ........ 345/719
6,385,772 B1    5/2002  Courtney ........ 725/105
6,389,168 B2    5/2002  Altunbasak et al. ........ 382/224
6,400,831 B2    6/2002  Lee et al. ........ 382/103
6,424,370 B1    7/2002  Courtney ........ 348/143
6,560,281 B1    5/2003  Black et al. ........ 375/240
6,614,847 B1    9/2003  Das et al. ........ 375/240.16
2001/0035907 A1 11/2001  Broemmelsiek ........ 348/169
2002/0008758 A1  1/2002  Broemmelsiek et al. ........ 348/143
2003/0044045 A1  3/2003  Schoepflin et al. ........ 382/103
2003/0081564 A1  5/2003  Chan ........ 370/328
2003/0123850 A1  7/2003  Jun et al. ........ 386/68
2003/0185434 A1 10/2003  Lee et al. ........ 382/154

OTHER PUBLICATIONS
Wren et al., "Pfinder: Real-Time Tracking of the Human Body", IEEE Transactions on Pattern Analysis and Machine Intelligence, IEEE Inc., New York, vol. 19, No. 7, Jul. 1997, pp. 780-785.
Collins et al., "Algorithms for cooperative multisensor surveillance", Proceedings of the IEEE, vol. 89, No. 10, Oct. 2001, pp. 1456-1477.
Regazzoni et al., "3D pose estimation and shape coding of moving objects based on statistical morphological skeleton", Proceedings of the International Conference on Image Processing (ICIP), Washington, Oct. 23-26, 1995, IEEE Comp. Soc. Press, vol. 3, Oct. 1995, pp. 612-615.
Foresti et al., "Statistical Morphological Skeleton for Representing and Coding Noisy Shapes", IEEE Proceedings: Vision, Image and Signal Processing, Institution of Electrical Engineers, GB, vol. 146, No. 2, Apr. 1999, pp. 85-92.
Senior et al., "Appearance Models for Occlusion Handling", Proceedings of the Second International Workshop on Performance Evaluation of Tracking and Surveillance Systems in Conjunction with CVPR'01, Dec. 2001, 8 pgs.
"Real-Time Articulated Human Body Tracking using Silhouette Information", IEEE Workshop on Performance Evaluation of Tracking and Surveillance, Nice, France, Oct. 2003, pp. 1-8.
* cited by examiner
[Sheet 1 of 4 (U.S. Patent, Nov. 4, 2008, US 7,447,331 B2): FIG. 2, a block diagram of server data processing system 200, showing processors 202 and 204 on system bus 206; memory controller/cache 208 with local memory 209; I/O bridge 210; I/O bus 212; PCI bus bridge 214 and PCI bus 216 hosting modem 218 and network adapter 220; additional PCI bus bridges 222 and 224 with PCI buses 226 and 228; graphics adapter 230; and hard disk 232.]
[Sheet 2 of 4: FIG. 3, a block diagram of client data processing system 300, showing processor 302 and main memory 304 connected to PCI local bus 306 via host/PCI bridge 308; LAN adapter 310; SCSI host bus adapter 312 serving disk 326, tape 328 and CD-ROM 330; expansion bus interface 314 serving keyboard and mouse adapter 320, modem 322 and memory 324; audio adapter 316; graphics adapter 318; and audio/video adapter 319. FIG. 4, a block diagram of a video analysis engine (elements 410-455): controller, video input device interface, object detection module, object classification module, multi-object tracking module, event detection module, viewable video index generator, user-specified parameter set storage device, VVI/video data storage device, and network interface.]
[Sheet 3 of 4: FIG. 5, the structure of a viewable video index 520 for a sampled video stream, with a main index directory 510, a log file recording IndexStartTime and IndexEndTime, object model files (object patch, object mask), and background image files. FIG. 6, details of an index entry at a given time instant, with fields StartFrame 610, History, TimeStamp (e.g. 00000000073-2003-09-08-21-16-41-819 73), Centroid, Area (e.g. 328.0), Bounding Box 650 (e.g. 62 85 23 22), Occlusion Fraction, Class (e.g. -1 Unknown), Model Follows 690 (e.g. 0), and End Frame 665. FIG. 7 (elements 710, 730, 740), the visual output generated from the index.]
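The FIG. 6 index entry is a set of labeled text fields. The sketch below shows how a client might parse one such entry; the field names are taken from the figure, but the line format and the Python types are assumptions for illustration, not the patent's actual file format.

```python
from dataclasses import dataclass

@dataclass
class IndexEntry:
    """One per-object entry of a viewable video index (fields from FIG. 6)."""
    start_frame: int = 0
    timestamp: str = ""
    centroid: tuple = (0.0, 0.0)
    area: float = 0.0
    bounding_box: tuple = (0, 0, 0, 0)  # x, y, width, height (assumed order)
    occlusion_fraction: float = 0.0
    obj_class: str = "Unknown"
    model_follows: int = 0

def parse_entry(lines):
    """Parse the 'Key value...' lines of a single index entry."""
    e = IndexEntry()
    for line in lines:
        line = line.strip()
        if line.startswith("StartFrame"):
            e.start_frame = int(line.split()[1])
        elif line.startswith("TimeStamp"):
            e.timestamp = line.split(None, 1)[1]
        elif line.startswith("Centroid"):
            x, y = line[len("Centroid"):].split(",")
            e.centroid = (float(x), float(y))
        elif line.startswith("Area"):
            e.area = float(line.split()[1])
        elif line.startswith("Bounding Box"):
            e.bounding_box = tuple(int(v) for v in line.split()[2:6])
        elif line.startswith("Occlusion Fraction"):
            e.occlusion_fraction = float(line.split()[2])
        elif line.startswith("Class"):
            e.obj_class = line.split(None, 1)[1]
        elif line.startswith("Model Follows"):
            e.model_follows = int(line.split()[2])
    return e
```

A client receiving the index could run one `parse_entry` per object per sampled instant and keep the results keyed by timestamp.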
[Sheet 4 of 4: FIG. 8, a flowchart for generating a viewable video index: receive video data (810); identify foreground and background objects (820); identify models for representing foreground and background objects (830); track movement of objects and changes in object parameters in received video data (840); analyze object movement to determine occurrence of events (850); generate viewable video index based on tracking information, object models, event information and object parameter information (860); store viewable video index (870); transmit viewable video index to client device (880); end. FIG. 9, a flowchart for generating a visual output: start; receive viewable video index (VVI) (910); parse viewable video index to identify tracking information files, object and background files, etc. (920); determine whether the necessary files are present locally and, if not, request the files that are not in local storage from the server; generate a representation of the video data using the viewable video index, tracking information, and object and background files; generate a timeline representation with event markers (960); update position, size, and orientation of foreground objects on the background image in accordance with the tracking information (970).]
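The FIG. 8 steps can be sketched end to end. The toy pipeline below compresses steps 810-860: the first frame serves as the background model, simple pixel differencing marks foreground, and each frame yields an entry carrying the FIG. 6 style measurements (centroid, bounding box, area). Real object detection, classification, tracking and event analysis are far more involved; this only illustrates the shape of the index that steps 860-880 would store and transmit. All names and the grayscale 2-D-list image format are assumptions.

```python
def generate_vvi(frames, threshold=30):
    """Toy sketch of FIG. 8 steps 810-860: build a viewable video index
    from grayscale frames (2-D lists of 0-255 values)."""
    background = frames[0]          # step 820/830: trivial background model
    entries = []
    for t, frame in enumerate(frames[1:], start=1):
        # foreground pixels: those that differ appreciably from the background
        fg = [(x, y)
              for y, row in enumerate(frame)
              for x, v in enumerate(row)
              if abs(v - background[y][x]) > threshold]
        if fg:                      # step 840/860: one index entry per frame
            xs = [x for x, _ in fg]
            ys = [y for _, y in fg]
            entries.append({
                "frame": t,
                "centroid": (sum(xs) / len(xs), sum(ys) / len(ys)),
                "bounding_box": (min(xs), min(ys),
                                 max(xs) - min(xs) + 1,
                                 max(ys) - min(ys) + 1),
                "area": len(fg),
            })
    return {"background": background, "entries": entries}
```

The returned dictionary plays the role of the index plus background model that steps 870-880 would store and send to the client.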
SYSTEM AND METHOD FOR GENERATING A VIEWABLE VIDEO INDEX FOR LOW BANDWIDTH APPLICATIONS

BACKGROUND OF THE INVENTION

1. Technical Field

The present invention is generally directed to the fields of automatic video analysis and video compression. More specifically, the present invention is directed to a mechanism for performing automatic video analysis and video compression on video data provided by video input devices in order to generate representations of the video data using a low bandwidth data stream.

2. Description of Related Art

Video compression and automatic video analysis for tracking of moving objects are both very active areas of research. However, these have been disconnected areas of research. Video compression deals with minimizing the size of the video data in the video stream while video analysis is concerned with determining the content of video data.

In the context of video monitoring systems, such as video surveillance or security systems, the index data will alert the monitoring user to the presence of an interesting activity in the scene. However, in order to take an action, the user needs to view the corresponding video to gain a complete understanding of the activity. This feature is essential since most automatic video analysis systems have errors in the event detection and will often indicate activity that is of little or no interest to the human being monitoring the video.

Current automatic video analysis systems analyze the video and generate an index. A typical video index may consist of a temporal reference into the video stream and a descriptor, where the descriptor may be a semantic token (e.g., the presence of a human face and cardinality) or a feature descriptor of the video (e.g., color histogram of the dominant objects). The implicit assumption of video indexing systems is that the actual video data will be available to the monitoring user when they choose to use the index to review the actual video. More information about such video analysis systems is available in the Handbook of Video Databases: Design and Applications by Furht and Marques, CRC Press, 2003.

Many different types of video analysis systems have been devised for use in determining the content of video data. For example, U.S. Patent Application Publication No. 2003/0123850 to Jun et al. discloses a system that analyzes news video and automatically detects anchorperson and other types of segments to generate temporal indices into the news video. The Jun system uses the index information to provide content-based access to the indexed segments and also allows for different reproduction speeds for different types of segments. This system requires both the index and the original video to allow a user to browse the news video.

U.S. Pat. No. 6,366,296, issued to Boreczky et al., describes a media file browser where the file is accessed based on a user-selected feature. For example, a user may choose to jump to a point in the media file where there is an audio transition from music to speech or a visual transition from one scene to the other. This system also requires both the index and the original video to allow a user to browse the video content based on the index.

U.S. Pat. No. 6,560,281, issued to Black et al., is directed to a system which can analyze video data from a presentation, cluster frames into segments corresponding to each overhead slide used in the presentation, recognize gestures by the speaker in the video, and use this information to generate a condensed version of the presentation. In this system, the condensed version of the video data can be used independently, i.e. without using the original video. However, the condensed version of the video data is not a complete representation of the original video.

U.S. Pat. No. 6,271,892, issued to Gibbon et al., describes a system that extracts key frames from video data and associates them with corresponding closed captioning text. This information may be rendered in a variety of ways, e.g., a page with printed key frames with associated closed captioning, to give a summary of the video. This system is in principle similar to the Black system discussed above and suffers the same drawback that the summary of the video is not a complete representation of the video data.

Current video surveillance and tracking systems analyze video to detect and track objects. They use the object tracking information to infer the occurrence of certain events in the video to thereby generate event markers. These systems then use these event markers as indices for viewing the original video.

For example, U.S. Pat. No. 5,969,755, issued to Courtney, describes a video surveillance system which incorporates object detection and tracking. The Courtney system generates a symbolic representation of the video based on the object tracking information. The Courtney system also uses the object tracking information to infer events in the video such as appearance/disappearance, deposit/removal, entrance/exit, etc. The Courtney system uses these event markers to retrieve relevant bits of the video for the user. The key drawback of the Courtney system, and systems like it, is that it requires both the index information, i.e. the event marker information, and the original video in order for the user to be able to make an independent assessment of the event.

U.S. Pat. No. 6,385,772, also issued to Courtney, describes a video surveillance system that uses a wireless link to transmit video to a portable unit. The video surveillance system uses motion detection as a trigger to transmit a video frame to the portable unit so that the user can make an assessment of the event. This system, while linking up a viewable representation of a detected event, does not provide a complete representation of the video corresponding to the event. Thus, the Courtney system limits the ability of the user to make assessments of the situation without accessing the original video footage.

U.S. Patent Application Publication No. 2003/0044045 to Schoepflin discloses a system for tracking a user-selected object in a video sequence. In the Schoepflin reference an initial selection is used as a basis for updating both the foreground and background appearance models. This system, while discussing object tracking, does not address both the event detection problem and the problem of generating a complete representation of the video data.

U.S. Patent Application Publication No. 2001/0035907 to Broemmelsiek describes a video surveillance system which uses object detection and tracking to reduce the information in a video signal. The detected objects are used as a basis for generating events which are used to index the original video data. This system again has the drawback of requiring the original video data for the user to make an independent assessment of the detected event.

Current video compression systems are completely focused on reducing the number of bits required to store the video data. However, these video compression systems do not concern themselves with indexing the video in any form. For example, U.S. Patent Application Publication No. 2003/0081564 to Chan discloses a wireless video surveillance system where the data from a video camera is transmitted
over a wireless link to a computer display. Such a system provides access to video data from the camera without any regard to event detection. Thus, this system requires that the user view the video in order to detect events himself.

U.S. Pat. No. 5,933,535, issued to Lee et al., teaches a method of using objects or object features as the basis for compression, as opposed to rectangular blocks. This results in higher compression efficiency and lower errors. This method, while using the object properties to reduce the bandwidth required to transmit the video data, does not look at the event behavior of the objects.

U.S. Pat. No. 6,614,847, issued to Das et al., discloses an object-oriented video compression system which decomposes the video data into regions corresponding to objects and uses these regions as the basis for compression. However, this system, like most other compression systems, does not incorporate any video event information.

SUMMARY OF THE INVENTION
The critical drawback of current video analysis, video compression and video surveillance systems is their disjoint nature. That is, compression technology does not consider how the user gets the relevant portions of the video, while video analysis and video surveillance technology generates relevance markers but assumes the presence of the original video data for the user to view the material.

The present invention addresses these problems in the prior art by providing a system and method for generating a viewable video index for low bandwidth applications. The exemplary aspects of the present invention solve the problems with the prior art systems by incorporating a viewable representation of the video into the index, thus generating a viewable video index. The viewable video index contains a visual representation of moving objects in the video data, a visual representation of the background of the video capture area, i.e. the scene, a representation of the object trajectory, a representation of the object attributes, and a representation of detected events.

The visual representation of the video capture area background includes a color bit map of the scene background or stationary parts of the video capture area or scene. This color bit map is updated whenever the background changes appreciably as determined by a pre-established threshold. The background provides a static image of the environment in which moving objects, or foreground objects, move.
The visual representation of the moving objects includes a color bit map of all moving objects in the scene. The color bit map is updated at multiple time intervals during the lifetime of the object in the video capture area and may be superimposed on a background image in order to provide a representation of the moving object moving within the environment depicted by the background image.

The representation of the object trajectory includes a time-synchronized representation of the position of the object and its subparts over time. The representation of object attributes includes, but is not limited to, the type of object, the object size, object color, etc. The representation of the detected events includes a time-synchronized representation of a variety of events that are detected in the video data. These may include the occurrence of movement, directional movement, etc.
Using the object trajectory information, the background images and the moving object bitmap, a representation of the video data may be generated in which the moving object bitmap images are superimposed over the background image and move across the background image in accordance with the trajectory information. In addition, a timeline representation with event markers corresponding to the sequence of events is provided, with the event markers being selectable for jumping the representation of the video data to corresponding time points.
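Playback from the index amounts to pasting each object's patch onto the background at the trajectory position, using the object mask to select object pixels (the patch/mask pairing follows FIG. 5; the 2-D-list image format and function names are assumed for illustration):

```python
def composite(background, patch, mask, top_left):
    """Superimpose an object patch on a copy of the background image at
    the trajectory position top_left = (x, y); nonzero mask pixels mark
    which patch pixels belong to the object."""
    out = [row[:] for row in background]
    ox, oy = top_left
    for y, row in enumerate(patch):
        for x, value in enumerate(row):
            if mask[y][x]:
                out[oy + y][ox + x] = value
    return out

def render(background, patch, mask, trajectory):
    """One output frame per trajectory point: the object bitmap moving
    across the static background, as described in the text."""
    return [composite(background, patch, mask, pos) for pos in trajectory]
```

A timeline widget would then map each event marker's timestamp to an index into the rendered frame list.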
Thus, the viewable video index of the present invention provides a complete representation of the input video stream which can be used for distributed processing and after-the-fact event detection. The viewable video index makes use of models to represent foreground objects. These models may be generated by capturing an area around the points of movement within a series of video frames or may be pre-established models. Since the model data is provided along with the viewable video index, classification of the models into different types of objects, even those that were not envisioned at the time that the viewable video index was generated, may be made through an analysis of the model data.

In addition, since the viewable video index is a timestamped representation of a video capture area, the timestamps may be used to correlate the viewable video index with other types of timestamped information, whether generated by the same or a different system. Thus, for example, the viewable video index may be correlated with a security access card swiping device, badge reader, or keypad log file to determine an identity of a particular person within a video capture area represented by the viewable video index.
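Correlating the timestamped index with an external badge-reader log reduces to matching nearby timestamps. A sketch, assuming epoch-second timestamps and an illustrative 5-second matching window (neither is specified by the text):

```python
def correlate_events(index_events, badge_log, window_s=5.0):
    """Pair each indexed video event with badge swipes whose timestamps
    fall within window_s seconds of it.
    index_events: list of (time, event_label) tuples;
    badge_log: list of (time, person) tuples."""
    matches = []
    for t_event, label in index_events:
        for t_swipe, person in badge_log:
            if abs(t_event - t_swipe) <= window_s:
                matches.append((label, person))
    return matches
```

Because both sources carry absolute timestamps, no clock other than a shared time base is needed to join them after the fact.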
The viewable video index may be associated with a particular video capture device and may be marked with an identifier of the particular video capture device. In this way, by associating the identifier of the particular video capture device with information maintained regarding the layout of video capture devices at a particular location, "camera hand-off" for tracking an object as it crosses multiple video capture areas is made possible. These and other features and advantages of the present invention will be described in, or will become apparent to those of ordinary skill in the art in view of, the following detailed description of the preferred embodiments.
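Camera hand-off can be sketched as a lookup in a site-layout table keyed by camera identifier. The layout structure here, exit edges mapped to adjacent camera IDs, is an assumed representation, not one the patent prescribes.

```python
def handoff_targets(layout, camera_id, exit_edge):
    """Which cameras should pick up tracking of an object that leaves
    the capture area of camera_id via the given edge of the frame.
    layout: {camera_id: {exit_edge: [adjacent camera ids]}} (assumed)."""
    return layout.get(camera_id, {}).get(exit_edge, [])

# Example site layout: cam1's east edge borders cam2, its north edge
# borders cam3 and cam4 (hypothetical camera names).
site = {"cam1": {"east": ["cam2"], "north": ["cam3", "cam4"]}}
```

When a tracked object's bounding box reaches a frame edge, the tracker would query `handoff_targets` and notify the returned cameras to expect the object's model.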
BRIEF DESCRIPTION OF THE DRAWINGS

The exemplary aspects of the present invention will best be understood by reference to the following detailed description when read in conjunction with the accompanying drawings, wherein:

FIG. 1 is an exemplary diagram of a distributed data processing system in which the exemplary aspects of the present invention may be implemented;

FIG. 2 is an exemplary diagram of a server computing device in which exemplary aspects of the present invention may be implemented;

FIG. 3 is an exemplary diagram of a client computing device in which exemplary aspects of the present invention may be implemented;

FIG. 4 is an exemplary block diagram illustrating a video analysis engine in accordance with one exemplary embodiment of the present invention;

FIG. 5 is an exemplary diagram illustrating an example of the structure of a viewable video index for a sampled video stream generated by a video analysis engine in accordance with one exemplary embodiment of the present invention;

FIG. 6 is an exemplary diagram illustrating details of an index entry at a given time instant for a sampled video stream in accordance with one exemplary embodiment of the present invention;
FIG. 7 is an exemplary diagram of a visual output generated based on a viewable video index in accordance with one exemplary embodiment of the present invention;

FIG. 8 is a flowchart outlining an exemplary operation of one exemplary embodiment of the present invention when generating a viewable video index; and

FIG. 9 is a flowchart outlining an exemplary operation of one exemplary embodiment of the present invention when generating a visual output using a viewable video index.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT

The present invention provides a system and method for generating a viewable video index for low bandwidth applications. As such, the present invention is particularly well suited for use in distributed data processing systems in which data is transmitted, via wired and/or wireless connections, over a network between a plurality of computing devices. Therefore, the following FIGS. 1-3 are intended to provide a brief description of one exemplary distributed data processing system and the computing devices within this distributed data processing system as a context for the further description of the mechanisms of the present invention. The example systems and devices shown in FIGS. 1-3 are intended only as examples and no limitation on the systems or devices that may be used with the present invention is intended or implied by the depiction or description of FIGS. 1-3.

With reference now to the figures, FIG. 1 depicts a pictorial representation of a network of data processing systems in which the present invention may be implemented. Network data processing system 100 is a network of computers in which the present invention may be implemented. Network data processing system 100 contains a network 102, which is the medium used to provide communications links between various devices and computers connected together within network data processing system 100. Network 102 may include connections, such as wire, wireless communication links, or fiber optic cables.

In the depicted example, server 104 is connected to network 102 along with wireless server 106. In addition, clients 108, 110, and 112 are connected to network 102. Clients 108 and 110 represent clients that communicate via the network 102 using wired connections to the network 102. Client 112 represents a client device, such as a personal digital assistant (PDA) or wireless telephone, that communicates with the network 102 using a wireless connection via the wireless server 106, which may be coupled to a base station or other type of wireless transceiver (not shown). These clients 108, 110, and 112 may be, for example, personal computers or network computers. In the depicted example, server 104 provides data, such as boot files, operating system images, and applications to clients 108-112. Clients 108, 110, and 112 are clients to server 104. Network data processing system 100 may include additional servers, clients, and other devices not shown. In the depicted example, network data processing system 100 is the Internet with network 102 representing a worldwide collection of networks and gateways that use the Transmission Control Protocol/Internet Protocol (TCP/IP) suite of protocols to communicate with one another. At the heart of the Internet is a backbone of high-speed data communication lines between major nodes or host computers, consisting of thousands of commercial, government, educational and other computer systems that route data and messages. Of course, network data processing system 100 also may be implemented as a number of different types of networks, such as for example, an intranet, a local area network (LAN), or a wide area network (WAN). FIG. 1 is intended as an example, and not as an architectural limitation for the present invention.

In the depicted example, server 104 may incorporate a viewable video index video analysis system in accordance with the exemplary aspects of the present invention. Server 104 may be coupled to one or more video input devices 150-154 which are used to provide video data streams to the server 104. The video input devices 150-154 may be, for example, digital video cameras or the like. Alternatively, the video input devices 150-154 may provide video data streams from stored video data, such as in the case of a video tape player, DVD player, or other video data source having a storage medium upon which the video data may be recorded. The video data streams received from the video input devices 150-154 are analyzed to identify events occurring in the various video capture areas as well as to generate viewable video indices in the manner described hereafter.

Referring to FIG. 2, a block diagram of a data processing system that may be implemented as a server, such as server 104 in FIG. 1, is depicted in accordance with a preferred embodiment of the present invention. Data processing system 200 may be a symmetric multiprocessor (SMP) system including a plurality of processors 202 and 204 connected to system bus 206. Alternatively, a single processor system may be employed. Also connected to system bus 206 is memory controller/cache 208, which provides an interface to local memory 209. I/O bus bridge 210 is connected to system bus 206 and provides an interface to I/O bus 212. Memory controller/cache 208 and I/O bus bridge 210 may be integrated as depicted.

Peripheral component interconnect (PCI) bus bridge 214 connected to I/O bus 212 provides an interface to PCI local bus 216. A number of modems may be connected to PCI local bus 216. Typical PCI bus implementations will support four PCI expansion slots or add-in connectors. Communications links to clients 108-112 in FIG. 1 may be provided through modem 218 and network adapter 220 connected to PCI local bus 216 through add-in connectors.

Additional PCI bus bridges 222 and 224 provide interfaces for additional PCI local buses 226 and 228, from which additional modems or network adapters may be supported. In this manner, data processing system 200 allows connections to multiple network computers. A memory-mapped graphics adapter 230 and hard disk 232 may also be connected to I/O bus 212 as depicted, either directly or indirectly.

Those of ordinary skill in the art will appreciate that the hardware depicted in FIG. 2 may vary. For example, other peripheral devices, such as optical disk drives and the like, also may be used in addition to or in place of the hardware depicted. The depicted example is not meant to imply architectural limitations with respect to the present invention.

The data processing system depicted in FIG. 2 may be, for example, an IBM eServer pSeries system, a product of International Business Machines Corporation in Armonk, N.Y., running the Advanced Interactive Executive (AIX) operating system or LINUX operating system.

With reference now to FIG. 3, a block diagram illustrating a data processing system is depicted in which the present invention may be implemented. Data processing system 300 is an example of a client computer, such as client device 108, 110 or 112 in FIG. 1. Data processing system 300 employs a peripheral component interconnect (PCI) local bus architecture. Although the depicted example employs a PCI bus, other bus architectures such as Accelerated Graphics Port (AGP) and Industry Standard Architecture (ISA) may be used. Processor 302 and main memory 304 are connected to PCI local
bus 306 through PCI bridge 308. PCI bridge 308 also may include an integrated memory controller and cache memory for processor 302. Additional connections to PCI local bus 306 may be made through direct component interconnection or through add-in boards. In the depicted example, local area network (LAN) adapter 310, SCSI host bus adapter 312, and expansion bus interface 314 are connected to PCI local bus 306 by direct component connection. In contrast, audio adapter 316, graphics adapter 318, and audio/video adapter 319 are connected to PCI local bus 306 by add-in boards inserted into expansion slots. Expansion bus interface 314 provides a connection for a keyboard and mouse adapter 320, modem 322, and additional memory 324. Small computer system interface (SCSI) host bus adapter 312 provides a connection for hard disk drive 326, tape drive 328, and CD-ROM drive 330. Typical PCI local bus implementations will support three or four PCI expansion slots or add-in connectors.

An operating system runs on processor 302 and is used to coordinate and provide control of various components within data processing system 300 in FIG. 3. The operating system may be a commercially available operating system, such as Windows XP, which is available from Microsoft Corporation. An object-oriented programming system such as Java may run in conjunction with the operating system and provide calls to the operating system from Java programs or applications executing on data processing system 300. "Java" is a trademark of Sun Microsystems, Inc. Instructions for the operating system, the object-oriented programming system, and applications or programs are located on storage devices, such as hard disk drive 326, and may be loaded into main memory 304 for execution by pro-
