Adaptive gate multifeature Bayesian statistical tracker

W. B. Schaming
RCA Advanced Technology Laboratories
Camden, New Jersey 08102

Abstract

A statistically based tracking algorithm is described which utilizes a powerful segmentation algorithm. Multiple features such as intensity, edge magnitude, and spatial frequency are combined to form a joint probability distribution to characterize a region containing a target and its immediate surround. These distributions are integrated over time to provide a stable estimate of the target region and background statistics. A Bayesian decision rule is implemented using these distributions to classify individual pixels as target or nontarget. An adaptive gate process is used to estimate desired changes in the tracking window size.

Introduction

This paper documents progress during the past year toward the development and demonstration of a statistical tracking algorithm. Papers1,2 presented in 1981 described some of the initial concepts in this development. Since that time, the statistical tracking algorithm has been expanded to incorporate (a) the simultaneous use of multiple features, (b) an adaptive gate process for control of the window size, and (c) positional dependence of the misclassification cost factor.

The tracking algorithm is based on the use of multifeature joint probability density functions for the statistical separation of targets from their background. The features currently being used are intensity, edge magnitude, and a pseudo spatial frequency feature. These features are combined to form the joint distributions which characterize a target region and its immediate surround. The distributions are integrated over time to provide a stable estimate of the target and background statistics. A Bayesian decision rule is implemented using these distributions to classify individual pixels as target or nontarget within a tracking window. An adaptive gate process is used to estimate desired changes in the tracking window size. The algorithm at present assumes manual target designation.

RCA believes this tracking process is capable of operation in all environments; insensitive to target type, signature, and orientation; applicable to a variety of sensors; and extendable to multisensor processing and readily implementable.

Preprocessing and A/D conversion

The video preprocessing function is an important part of any imaging sensor system, but is more critical when the sensor is an IR device which may exhibit very high dynamic range capability. In this case it is insufficient to perform a simple AGC based upon global statistics, because the subsequent rescaling to reduce the dynamic range will destroy the low contrast local detail. Instead, some form of local adaptive contrast enhancement should be applied in which the gain varies with the local contrast. Lo simulated and compared several such techniques.

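The paper leaves the particular enhancement formula open (it cites Lo's comparison of several techniques), so the following is a minimal sketch of one common variant, in which each pixel is stretched about its local mean by a gain that grows where local contrast is low. The function names, window size, target contrast, and gain cap are illustrative assumptions, not values from the paper.

```python
import numpy as np

def box_mean(img, win):
    # Same-size box-filter mean via 2-D cumulative sums (edge-replicated pad).
    pad = win // 2
    p = np.pad(img.astype(np.float64), pad, mode='edge')
    c = np.pad(p.cumsum(0).cumsum(1), ((1, 0), (1, 0)))
    h, w = img.shape
    return (c[win:win+h, win:win+w] - c[:h, win:win+w]
            - c[win:win+h, :w] + c[:h, :w]) / (win * win)

def local_adaptive_enhance(img, win=15, target_sigma=40.0, max_gain=4.0):
    # Gain is inversely proportional to the local standard deviation and is
    # capped, so low-contrast local detail is amplified rather than lost.
    img = img.astype(np.float64)
    mu = box_mean(img, win)
    sigma = np.sqrt(np.maximum(box_mean(img * img, win) - mu * mu, 0.0))
    gain = np.minimum(target_sigma / (sigma + 1e-6), max_gain)
    return np.clip(mu + gain * (img - mu), 0, 255)
```

A global AGC would apply one gain everywhere; here a faint 2-count blip on a flat background is stretched by the full gain cap while already-busy regions are left nearly unchanged.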
Although necessary in a hardware implementation, this function has not been included in the simulations reported here. Ten-second image sequences were digitized from video tape via an analog video disc and an image processing system. The input to the image processing system was passed through a video processing amplifier so that the levels could be properly matched to the A/D converter.

Statistical tracking algorithm

Targets are often separated from their background by a simple thresholding scheme. Sometimes the computation of the threshold is quite sophisticated and involves looking at the statistics of the video signal. However, thresholding is inherently limited in ability, as can be seen by the diagrams in Fig. 1. A simple black and white target can be readily thresholded to isolate it from its background. On the other hand, a gray target cannot be thresholded without using a pair of thresholds properly placed to contain the intensity levels on the target. This dual threshold in itself is not prohibitive; rather, the problem lies in the ability to place the thresholds at the appropriate levels.

68 / SPIE Vol. 359 Applications of Digital Image Processing IV (1982)


[Figure 1: two intensity-versus-scan-distance plots. (a) This target can be easily thresholded. (b) This target cannot be easily thresholded but requires a pair of thresholds properly placed.]

Fig. 1. Example showing two postulated targets. One is easily segmented from the background using a single threshold. The other, however, requires two thresholds which are not easily determined. The statistical process provides a separate threshold for each intensity level.

The statistical segmentation process is a technique which provides an improved method for extracting the target from its background. Figure 2 depicts this process. Shown are two histograms, one taken from a window area of the image containing the target and the other taken from the immediate surround which represents the background. A single feature, intensity, is shown in these histograms for illustrative purposes. The shape of the distributions shown is arbitrary; there are no assumptions made about their actual shape. The segmentation process makes a separate assessment of each bin in the histogram to determine if pixels whose intensity falls in the bin are more likely to be target or background. In addition to solving the threshold selection problem, the statistical tracking algorithm provides a method to both simplify the multimode tracking concept and provide added capability.

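A minimal single-feature sketch of this per-bin assessment follows. The function names and the equal-prior, equal-cost comparison are illustrative assumptions; the paper's actual rule is the Bayesian minimal-cost rule applied to joint multifeature bins.

```python
import numpy as np

def bin_decision_rule(window_pixels, frame_pixels, bins=16):
    # Normalize each histogram by its region's pixel count, then mark a bin
    # as "target" when it is relatively more populated in the target window
    # than in the surrounding background frame.
    hw, edges = np.histogram(window_pixels, bins=bins, range=(0, 256))
    hf, _ = np.histogram(frame_pixels, bins=bins, range=(0, 256))
    p_win = hw / max(hw.sum(), 1)
    p_frame = hf / max(hf.sum(), 1)
    return p_win > p_frame, edges

def classify(pixels, target_bins, edges):
    # Look each pixel up in its histogram bin; True means "target".
    idx = np.clip(np.digitize(pixels, edges) - 1, 0, len(target_bins) - 1)
    return target_bins[idx]
```

Because each bin gets its own decision, a mid-gray target effectively receives a pair of thresholds "for free," which is exactly the limitation of single-threshold schemes noted above.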
The simplification comes about in the following way. State-of-the-art multimode trackers typically operate a contrast, edge, and correlation tracker in parallel. An executive process may be defined to determine at any given time which tracking mode is providing the most reliable estimate of target position. The statistical process, as currently defined, eliminates this mode polling process by combining the available features into multidimensional statistics representing target and background. Consider the use of intensity and edge magnitude as the two candidate features. In this case the statistical approach encompasses three tracking modes in an integrated single mode without the need to poll the performance of the individual processes. When intensity is the best target/background separator, the algorithm operates like a contrast tracker. When edge magnitude is predominant, it operates similar to an edge centroid tracker. Because the process is searching for pixels in the current frame that are statistically similar to those pixels selected as target in previous frames, the algorithm is in a sense a correlation type process as well.

The added capability comes from the fact that there are target/background conditions which are inseparable using two features independently but are readily separable using the same two features jointly. This is illustrated quite simply in Fig. 3. In this example, neither edge magnitude nor intensity can be used independently to separate the target from background because both flat distributions cover the entire variable range for both features. On the other hand, the joint distribution clearly delineates the two areas.

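A small numeric illustration of this situation (constructed data, not from the paper): the target is built with its two feature values correlated, the background with them anti-correlated, so both marginal histograms are flat and useless while the joint histograms never share a cell.

```python
import numpy as np

rng = np.random.default_rng(1)
i_t = rng.integers(0, 8, 1000)
target = np.stack([i_t, i_t], axis=1)           # intensity == edge magnitude
i_b = rng.integers(0, 8, 1000)
background = np.stack([i_b, 7 - i_b], axis=1)   # intensity anti-correlated

# Each single-feature histogram is (near-)uniform over the full range for
# both classes, so neither feature alone separates them.
ht = np.histogram(target[:, 0], bins=8, range=(0, 8))[0]
hb = np.histogram(background[:, 0], bins=8, range=(0, 8))[0]

# The joint histograms, however, occupy disjoint cells: (i, i) and
# (i, 7 - i) never coincide for integer i.
jt = np.histogram2d(target[:, 0], target[:, 1], bins=8,
                    range=[[0, 8], [0, 8]])[0]
jb = np.histogram2d(background[:, 0], background[:, 1], bins=8,
                    range=[[0, 8], [0, 8]])[0]
overlap = (jt > 0) & (jb > 0)
print(overlap.any())    # False: no joint bin is shared
```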

[Figure 2: histograms of the target window and the background window (number of pixels in each intensity group).]

Fig. 2. Example of how histograms are used to separate a target from its background. Each bin in the histogram is examined to determine if the intensity values falling within that bin are more likely to be target or background. Although this is a single feature (intensity) example, the same process is used with multiple features in an N-dimensional histogram representing a joint probability density.

[Figure 3: (A) joint distribution of intensity and edge magnitude for a postulated target/background scene; (B) independent distributions of intensity and edge magnitude from the same postulated target/background scene (target and background distributions look alike).]

Fig. 3. Simple example showing how the use of joint statistics aids in the separation of target from background in situations where the use of the features singly fails.

Figure 4 is a flow diagram of the statistical tracking mode. The preprocessed video is used to generate multiple feature images to be used in the decision process. The features are combined into two joint probability density functions for (a) a target tracking window and (b) a background window frame. These distributions are the basis of a statistical decision process which is used to classify the image pixels inside the tracking window to separate the target from the background. In actuality the statistics from previous frames are used in the classification process for the current frame. At the same time, histograms are generated from the current image frame so that the statistics can be updated for processing subsequent frames. At the end of the classification process the segmented image is analyzed to determine the appropriate error signals as well as the window size and position for the next frame. In parallel with the pixel rate computations for the Nth frame, the statistics from the (N-1)st frame are integrated with past history and a decision rule is generated for the (N+1)st frame.

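The per-frame loop just described, classifying with statistics from earlier frames, deriving error signals, then updating the statistics, can be sketched schematically. The state layout, bin count, update weight, and centroid-style error signal below are illustrative assumptions rather than the paper's exact design.

```python
import numpy as np

def track_step(frame, window, state, bins=16):
    """One schematic frame of the statistical tracking loop."""
    y0, x0, h, w = window
    win = frame[y0:y0+h, x0:x0+w]
    # (1) Classify window pixels using statistics from *previous* frames.
    idx = np.clip(np.digitize(win, state['edges']) - 1, 0, bins - 1)
    seg = state['p_target'][idx] > state['p_background'][idx]
    # (2) Error signal: centroid of target-labelled pixels vs. window centre.
    if seg.any():
        ys, xs = np.nonzero(seg)
        err = (ys.mean() - h / 2, xs.mean() - w / 2)
    else:
        err = (0.0, 0.0)
    # (3) Update current-frame statistics for use on subsequent frames.
    hw, _ = np.histogram(win[seg], bins=bins, range=(0, 256))
    state['p_target'] = 0.8 * state['p_target'] + 0.2 * hw / max(hw.sum(), 1)
    return seg, err, state
```

The error signal would drive the window position for the next frame; the statistics update corresponds to the fading memory filtering described later.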
A sample output from the process is shown in Fig. 5. Only two features were used for this example, namely, intensity and edge magnitude. The total number of bits utilized for the features is seven: four for intensity and three for edge magnitude. The edge magnitude used is the absolute value approximation to the Sobel operator.

The next few paragraphs describe some of the steps in this process in more detail.

Computation of features

The first step in the statistical process is the generation of the features to be used. There are many potential candidates, some of which are computationally too burdensome for real-time implementation at this time. We therefore have limited our selection of features


[Figure 4 block diagram: preprocessed video feeds parallel feature computations (feature 1 ... feature N); the features feed "compute statistics" and "classify pixels"; filtered statistics generate the decision rule; analysis of the segmented target area produces error signals and updates the window position, size, and costs for the next frame.]

Fig. 4. Block diagram of the Bayesian statistical tracking mode. The feature computation, statistics generation, and pixel classification are performed at the pixel rate. The computation of error signals is performed during vertical sync.

[Figure 5 processing chain: input video, preprocessing (median filter), feature extraction (4 bits intensity, 3 bits edge), target separation (Bayesian statistical segmentation), tracking (adaptive gate, projections).]

Fig. 5. Sample output from the Bayesian statistical tracker simulation using a 64-by-64 pixel image of an aircraft at a mountain boundary. Two features were used in the statistical segmentation with a total of seven bits.

to those which are readily implemented. These features are intensity, edge magnitude, and spatial frequency.

The intensity feature is simply a requantized version of the digitized video signal to obtain the desired number of bits of intensity resolution. The edge magnitude feature is the sum of absolute values approximation to the Sobel operator. The absolute sum is an acceptable and computationally more appealing approximation than the true edge magnitude.

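A sketch of this sum-of-absolute-values Sobel approximation is shown below; the border handling (edge replication) is an implementation choice not specified by the paper.

```python
import numpy as np

def sobel_abs_magnitude(img):
    # Edge magnitude as |Gx| + |Gy| from the 3x3 Sobel kernels, i.e. the
    # absolute-value approximation to the true magnitude sqrt(Gx^2 + Gy^2).
    p = np.pad(img.astype(np.float64), 1, mode='edge')
    # Gx kernel [[-1,0,1],[-2,0,2],[-1,0,1]]: right column minus left column.
    gx = (p[:-2, 2:] + 2 * p[1:-1, 2:] + p[2:, 2:]
          - p[:-2, :-2] - 2 * p[1:-1, :-2] - p[2:, :-2])
    # Gy kernel [[-1,-2,-1],[0,0,0],[1,2,1]]: bottom row minus top row.
    gy = (p[2:, :-2] + 2 * p[2:, 1:-1] + p[2:, 2:]
          - p[:-2, :-2] - 2 * p[:-2, 1:-1] - p[:-2, 2:])
    return np.abs(gx) + np.abs(gy)
```

The result would then be requantized to the three bits allocated to the edge feature. The absolute sum avoids the square root and the squaring multiplies, which is what makes it attractive for pixel-rate hardware.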
The third feature is an approximation to spatial frequency in the horizontal direction. Because it is a measure of object size, it could also be considered a simple texture measure in a broad sense. The spatial frequency is defined as a function of the run length, where a run is the number of consecutive pixels between which the pixel-to-pixel difference does not exceed a predefined threshold. The threshold used is the mean value of the absolute difference between pixels in the previous frame. The feature value is then defined as:

SF = MAXIMUM [0, (2^N - RUN LENGTH)]                (1)

where 2^N is the number of levels into which the spatial frequency feature will be quantized.


An example of the spatial frequency feature is shown in Fig. 6. An arbitrary function is plotted to represent the image intensity I at successive pixels in the x direction. Beneath the plotted data are shown the actual pixel intensities, absolute differences, run lengths, and feature values. The threshold used to compute run lengths in the example is 1.3 and the number of quantization levels is 8 (N = 3 bits). The first sample-to-sample difference which exceeds the threshold 1.3 is at the sixth sample. Samples 1 to 5 represent a run of length 5 in which the differences do not exceed threshold. The corresponding feature value is 3, which is assigned to all pixel locations in the run. The higher feature values indicate smaller distances between gradient values exceeding threshold. Note that the low amplitude variations between pixels 6 and 14 do not exceed the threshold and therefore do not define the boundary of a run. The feature is intended to provide information about the size (in the x direction) of areas or patches which have uniform or slowly varying intensity.

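Equation (1) and the Fig. 6 walkthrough can be reproduced with a short routine. The exact run bookkeeping below is an interpretation chosen to match the worked example (a run of length 5 with N = 3 yields a feature value of 3); the function name is illustrative.

```python
import numpy as np

def spatial_frequency_feature(row, threshold, n_bits=3):
    # Pseudo spatial frequency along one scan line, per Eq. (1):
    # SF = max(0, 2^N - run_length), where a run is a maximal stretch of
    # consecutive pixels whose successive absolute differences stay at or
    # below `threshold`. Every pixel in a run receives that run's value.
    row = np.asarray(row, dtype=np.float64)
    levels = 2 ** n_bits
    sf = np.zeros(len(row), dtype=int)
    start = 0
    for i in range(1, len(row) + 1):
        # A run ends where the pixel-to-pixel difference exceeds the
        # threshold, or at the end of the line.
        if i == len(row) or abs(row[i] - row[i - 1]) > threshold:
            sf[start:i] = max(0, levels - (i - start))
            start = i
    return sf
```

Long, smooth patches produce low values and busy, fine-grained patches produce high values, which is why the feature doubles as a crude texture/size measure.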
Generation and integration of statistics

Histograms from two separate regions in the image must be computed to provide the probability density functions required by the decision rule. The regions from which the histograms are generated are shown in Fig. 7. The assumption in the segmentation algorithm is that the target is absent from the frame region. For both the frame and window regions a multifeature histogram is defined as

H_FR^N (f1, f2, f3)     Frame Region Histogram

H_WR^N (f1, f2, f3)     Window Region Histogram

for the Nth image in the sequence.

After normalization by the respective areas of the frame and window regions the histograms become the discrete joint probability densities

P_FR^N (f1, f2, f3),     P_WR^N (f1, f2, f3).

[Figure 7: sensor field-of-view showing the frame region (FR) as a border around the window region (WR). H_FR(f1, f2, f3) is the multifeature histogram from the frame region; H_WR(f1, f2, f3) is the multifeature histogram from the window region.]

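Under the bit allocations used in the simulations (4-bit intensity plus 3-bit edge magnitude, extended here with a 3-bit spatial frequency feature as an assumption), a region's multifeature histogram and its normalized density might be computed as follows; the function name and bin layout are illustrative.

```python
import numpy as np

def joint_density(f1, f2, f3, bins=(16, 8, 8)):
    # Stack the per-pixel feature values, histogram them jointly, then
    # divide by the region area so the result is a discrete joint
    # probability density that sums to 1.
    feats = np.stack([np.ravel(f1), np.ravel(f2), np.ravel(f3)], axis=1)
    hist, _ = np.histogramdd(feats, bins=bins,
                             range=[(0, 16), (0, 8), (0, 8)])
    return hist / feats.shape[0]
```

With 4 + 3 + 3 = 10 feature bits the joint density has 1024 cells, which is why the paper's hardware-oriented design keeps the per-feature bit counts small.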
Fig. 6. Sample which shows the procedure for calculating the pseudo spatial frequency feature. The absolute difference threshold used to compute run lengths in the example is 1.3, which is the average difference. The number of quantization levels for the feature is 8.

Fig. 7. Areas of the image over which the multifeature histograms are computed. It is assumed that the target is absent from the frame region, which is defined as a border around the window region containing the target.


To minimize short-term statistical variations these probability densities are combined in a weighted sum with the past history of the statistics. This fading memory filtering is performed once each frame so that the statistical updating keeps up with the frame rate of the video. The filtering is defined by

FP_FR^N = a P_FR^N + (1 - a) FP_FR^(N-1)                (2)

FP_WR^N = b P_WR^N + (1 - b) FP_WR^(N-1)                (3)

where

FP_FR^N, FP_WR^N   are the filtered probability density functions at the Nth frame time,

P_FR^N, P_WR^N     are the unfiltered density functions computed from the current frame N,

a, b               are the weighting factors which control the amount of smoothing performed.

In the simulations performed to date, the filtered statistics up to and including frame N-1 are used to generate the decision rule to be used on frame N.

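Equations (2) and (3) are the same element-wise update applied to the frame and window densities; a sketch (hypothetical function name):

```python
import numpy as np

def fading_memory_update(fp_prev, p_current, weight):
    # One step of the fading memory filter of Eqs. (2)-(3):
    # FP^N = w * P^N + (1 - w) * FP^(N-1), applied element-wise to the
    # (possibly multidimensional) density arrays once per frame.
    return weight * p_current + (1 - weight) * fp_prev

# With a constant input density the estimate converges geometrically, so
# `weight` sets the effective memory length (roughly 1/weight frames).
fp = np.array([1.0, 0.0])      # stale estimate
p = np.array([0.25, 0.75])     # density observed in every new frame
for _ in range(50):
    fp = fading_memory_update(fp, p, weight=0.2)
```

Because the update is a convex combination, normalized densities stay normalized, and a single bad frame perturbs the statistics by at most the weight factor.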
Minimal cost decision rule

The decision rule used in the classification of pixel

