Apple Inc.
v.
Zentian Limited
Patent 10,062,377

Patent Owner’s Demonstratives

Presented March 12, 2024

1

Argument Roadmap

• Jiang alone cannot meet “feature vector” requirements
• Petitioner has failed to prove a motivation for modifying Jiang to use feature vector-based recognition
• Petition’s claim 2 theory lacks a reasonable expectation of success
• Petitioner’s new claim 2 theory is improper and refuted

2

Argument Roadmap

• Jiang alone cannot meet “feature vector” requirements
• Petitioner has failed to prove a motivation for modifying Jiang to use feature vector-based recognition
• Petition’s claim 2 theory lacks a reasonable expectation of success
• Petitioner’s new claim 2 theory is improper and refuted

3

Jiang alone cannot meet “feature vector” requirements

A codeword is not a plurality of extracted and/or derived quantities

1(b): “feature vector comprises a plurality of extracted and/or derived quantities. . .”

• “the n amplitudes for the given time interval represent n-component vector in the n-dimensional space” Nadas, Ex. 1019, 1:15-37

• Feature vectors had, e.g., thirty-nine dimensions. Ex. 1003 ¶¶ 41, 66, 124

Vector quantized codeword is a single quantity

• VQ “code[s] a spectral vector into one of a fixed number of discrete symbols” Rabiner, Ex. 1015, 28, 35

• VQ converts a “multidimensional” input value to a “single dimension identifying nearest vector-quantized value,” i.e., a “single value associated with the codebook entry” Schmandt, Ex. 1010, 60

• VQ converts a feature vector to a “single lookup number or index or code word index or code word” Anderson, Ex. 1073, 54:1-3 (see the illustrative sketch following this slide)

Sur-reply at 2-3

4
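Illustrative technical note (counsel’s sketch, not record evidence): a minimal Python example of the distinction drawn above, assuming 39-component feature vectors per the cited testimony and a hypothetical 256-entry codebook. It shows that a feature vector is a plurality of quantities for a time frame, while the vector-quantized codeword that replaces it is a single index.

```python
import numpy as np

rng = np.random.default_rng(0)

# A feature vector for one audio time frame: a plurality of extracted/derived
# quantities (the 39-component figure tracks the testimony cited above).
feature_vector = rng.normal(size=39)

# A VQ codebook: a fixed set of template vectors prepared ahead of time
# (256 entries is a hypothetical size).
codebook = rng.normal(size=(256, 39))

# Vector quantization replaces the whole 39-component vector with ONE quantity:
# the index of the nearest codebook entry (the "codeword").
codeword_index = int(np.argmin(np.linalg.norm(codebook - feature_vector, axis=1)))

print(feature_vector.shape)  # (39,) -> many quantities per frame
print(codeword_index)        # a single number, e.g. 137
```
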
Jiang alone cannot meet “feature vector” requirements

A codeword is not a plurality of extracted and/or derived quantities

Sur-reply at 2-3; Ex. 1073 at 53:13-54:11

5

Jiang alone cannot meet “feature vector” requirements

A codeword is not a plurality of extracted and/or derived quantities

Sur-reply at 2-3; POR at 7-8, 16-17; Ex. 1012 at 29 (pdf); Ex. 2005 at 17:34-36

6

Jiang alone cannot meet “feature vector” requirements

A codeword is not extracted or derived from the recited digital audio stream

1(b): “feature vector comprises . . . extracted and/or derived quantities from said digital audio stream during a defined audio time frame”

• “When an unknown speech input is uttered, for each time interval, a value is measured or computed for each of the n components, where each component is referred to as a feature. The values of all of the features are consolidated to form an n-component feature vector for a time interval.” Nadas, Ex. 1019, 1:15-37

Vector quantized codeword is created from training data, not the audio stream

• VQ codebook (and codewords) developed from a “training set” of vectors; Rabiner, Ex. 1015, 155-156

• Codebook values are created from “specimen data” or “sample data”; Schmandt, Ex. 1010, 60

• “the codeword is representative of the template vector,” which is “based on the centroid of other data, data that has come and gone and not used in the recognition process. It’s, rather, training data.” Anderson, Ex. 1073, 54:12-15, 56:4-23 (see the illustrative sketch following this slide)

Sur-reply at 4-5

7
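Illustrative technical note (counsel’s sketch, not record evidence): a short Python example, using scikit-learn’s KMeans as a stand-in for a generic codebook-training procedure, of the point above that codebook entries (and thus the values a codeword names) are computed offline from a training set, not from the incoming digital audio stream. The data sizes are hypothetical.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Offline: "specimen"/"sample" training data, gathered before recognition ever runs.
training_vectors = rng.normal(size=(5000, 39))
codebook = KMeans(n_clusters=64, n_init=3, random_state=0).fit(training_vectors)

# Online: a feature vector extracted from the live digital audio stream for one frame.
live_frame_vector = rng.normal(size=(1, 39))

# The codeword is only an index into the pre-trained codebook; the template values it
# names (cluster_centers_[index]) were derived from the training set, not this frame.
codeword_index = int(codebook.predict(live_frame_vector)[0])
template_values = codebook.cluster_centers_[codeword_index]
print(codeword_index, template_values.shape)
```
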
Jiang alone cannot meet “feature vector” requirements

Claims require using distances calculated from feature vectors to identify words

Limitation 1(g): “wherein said identification of spoken words uses one or more distances calculated from a first feature vector”

Limitation 1(h): “a search stage for using the calculated distances to identify words. . .”

Sur-reply at 6-7

8

Jiang alone cannot meet “feature vector” requirements

Petition relies only on Jiang’s codeword-based recognition to meet 1(g) and 1(h)

Sur-reply at 6-7 (Pet. 56, 59-60)

9

Jiang alone cannot meet “feature vector” requirements

Jiang teaches only codeword-based word recognition

Sur-reply at 6-7 (Jiang, 8:16-51); Ex. 2020 ¶ 50

10

Jiang alone cannot meet “feature vector” requirements

“Feature vector” as recited in the challenged claims cannot encompass codewords

• Claims define “feature vector”

• Codeword is not a “feature vector”

• “Calculating distances from a feature vector” does not include computing probabilities from codewords. Ex. 2020 ¶ 43; Ex. 1015, 152-154.

• Calculations from codewords will be different from calculations from feature vectors. Schmandt, Ex. 1010, 61-62 (light bulb example); Ex. 2020 ¶¶ 43-44 (see the illustrative sketch following this slide)

• Known tradeoffs in speech recognition, including with respect to using VQ, cannot expand claim scope beyond the express terms

• Specification’s general discussion of lossy compression does not outweigh the claim language (limitations 1(b), 1(g), 1(h))

• Specification teaches calculating distances only from feature vectors, not codewords. Ex. 2020 ¶¶ 45-47; Ex. 1001, 12:42-44, 13:5-7, 24:62-64, 25:18-20, 25:57-60, 34:48-50, Figs. 18-23.

Sur-reply at 8-13

11
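Illustrative technical note (counsel’s sketch, not record evidence): a short Python contrast, with hypothetical model sizes, between (1) calculating distances from the frame’s feature vector itself, as limitations 1(g) and 1(h) recite, and (2) reducing the frame to a codeword and reading a probability out of a pre-trained table. The two computations take different inputs and yield different quantities.

```python
import numpy as np

rng = np.random.default_rng(2)

feature_vector = rng.normal(size=39)           # extracted from the current audio frame
state_templates = rng.normal(size=(500, 39))   # hypothetical acoustic-model states

# (1) Distance calculation from the feature vector itself: one distance per state,
# computed from all 39 components of the current frame's vector.
distances = np.linalg.norm(state_templates - feature_vector, axis=1)

# (2) Codeword-based scoring: the frame is first reduced to a single codeword index,
# and the score is simply read out of a table prepared in advance from training data.
codebook = rng.normal(size=(256, 39))
codeword = int(np.argmin(np.linalg.norm(codebook - feature_vector, axis=1)))
codeword_prob_table = rng.random(size=(500, 256))   # hypothetical P(codeword | state)
probabilities = codeword_prob_table[:, codeword]

print(distances.shape, probabilities.shape)  # both (500,), but computed from different inputs
```
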
Argument Roadmap

• Jiang alone cannot meet “feature vector” requirements
• Petitioner has failed to prove a motivation for modifying Jiang to use feature vector-based recognition
• Petition’s claim 2 theory lacks a reasonable expectation of success
• Petitioner’s new claim 2 theory is improper and refuted

12

No motivation to modify Jiang to use feature vector-based word recognition

Schmandt dec. ¶¶ 170-182 are directed to a different modification, i.e., adding a second processor/DSP to Jiang in view of Smyth

Schmandt dec. ¶ 183 provides no motivation for modifying Jiang to use feature vector-based word recognition

Schmandt dec. ¶ 172 states that calculating distances was “computationally-intensive,” which weighs against modifying Jiang to use more computationally intensive feature vector-based word recognition. Ex. 2020 ¶¶ 54-58

Sur-reply at 13-16

13

No motivation to modify Jiang to use feature vector-based word recognition

Petitioner’s assertion that a POSA may have determined the modification provides “a worthy tradeoff,” or that the increased load “can be offset by a second processor,” proves no motivation to combine

Sur-reply at 16-17

14

Argument Roadmap

• Jiang alone cannot meet “feature vector” requirements
• Petitioner has failed to prove a motivation for modifying Jiang to use feature vector-based recognition
• Petition’s claim 2 theory lacks a reasonable expectation of success
• Petitioner’s new claim 2 theory is improper and refuted

15

Combination of Jiang and Nguyen lacks reasonable expectation of success

Undisputed that Nguyen cannot be combined with feedback-based pruning

• Petitioner’s claim 2 theory relies on Nguyen’s approach of “performing a similarity measure (i.e. distance calculation) while contemporaneously performing the search step.” Pet. 62; POR, 20.

• Nguyen’s relied-upon teaching cannot be combined with feedback-based pruning. POR 30-32; Ex. 2020 ¶¶ 83-88; Ex. 2018 at 69; Ex. 1050 at 4; Ex. 1012 at 91. (see the illustrative sketch following this slide)

• Petitioner’s Reply does not mention Nguyen and concedes that Nguyen’s teaching could not have been combined with feedback-based pruning

• Petitioner’s “one frame delay” theory is not supported by any prior art and is therefore irrelevant. Ex. 2021, 55:4-8

Sur-reply at 17-19

16
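Illustrative technical note (counsel’s sketch, not record evidence): a simplified Python sketch, with hypothetical data and a hypothetical beam width, of why feedback-based pruning is in tension with performing the distance calculation contemporaneously with the search step: the distances needed for a frame are not known until the search has pruned the hypotheses from the prior frame.

```python
import numpy as np

rng = np.random.default_rng(3)
frames = rng.normal(size=(50, 39))    # hypothetical feature vectors, one per time frame
states = rng.normal(size=(200, 39))   # hypothetical acoustic-model state templates
BEAM = 40                             # hypothetical beam width (hypotheses kept per frame)

def feedback_pruned_decode(frames, states):
    """Serial: the search's pruning result feeds back into the next frame's
    distance stage, so the distance work cannot run ahead of the search."""
    active = np.arange(len(states))                    # surviving hypotheses
    for frame in frames:
        # Distance stage: score only the states the previous search left active.
        dists = np.linalg.norm(states[active] - frame, axis=1)
        # Search stage: prune; the survivors define the NEXT frame's distance work.
        active = active[np.argsort(dists)[:BEAM]]
    return active

def contemporaneous_decode(frames, states):
    """Overlapped style (as the Petition characterizes Nguyen): distances for a frame
    are computed without waiting for pruning feedback, so every state is scored and
    the search stage can consume the scores in parallel."""
    last_scores = None
    for frame in frames:
        last_scores = np.linalg.norm(states - frame, axis=1)   # no feedback dependency
    return np.argsort(last_scores)[:BEAM]

feedback_pruned_decode(frames, states)
contemporaneous_decode(frames, states)
```
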
Combination of Jiang and Nguyen lacks reasonable expectation of success

Jiang’s pruning is part of the combination, and not optional therein

• Ravishankar and Mathew do not obviate the need for Jiang’s pruning in the combination; Mr. Schmandt excluded both from his combination. Ex. 2021, 41:2-6, 38:11-16

• Patent Owner’s arguments are not based on bodily incorporation; they are based on the incompatibility between Jiang’s pruning teachings and Nguyen’s parallel processing teachings

• Petitioner’s combination includes Jiang’s pruning technique. POR, 20; Ex. 2016, 13:19-14:8; Ex. 1003 ¶ 178

• Mr. Schmandt’s supplemental declaration did not dispute that the combination includes Jiang’s pruning

• Jiang’s pruning is not “optional” in the combination because Mr. Schmandt’s theory included it

• Mr. Schmandt admitted that reducing Jiang’s pruning, much less eliminating it, risks the system falling “increasingly behind real time.” Ex. 1003 ¶ 177; Ex. 1004, 2:19-28.

Sur-reply at 19-22

17

Combination of Jiang and Nguyen lacks reasonable expectation of success

Jiang’s pruning is feedback-based pruning

Sur-reply at 19-22

18

Combination of Jiang and Nguyen lacks reasonable expectation of success

Jiang’s pruning is feedback-based pruning

Sur-reply at 19-22; Ex. 1005 at 8:16-20, 8:44-51, 8:52-64, 8:25-43

19

Combination of Jiang and Nguyen lacks reasonable expectation of success

Jiang’s pruning is feedback-based pruning

Sur-reply at 19-22

20

Combination of Jiang and Nguyen lacks reasonable expectation of success

Jiang’s pruning is feedback-based pruning

Krishna, Ex. 2018, at 42

Mathew I, Ex. 1050, at 11

Sur-reply at 19-22; Ex. 1050 at 11; Ex. 2018 at 42

21

Argument Roadmap

• Jiang alone cannot meet “feature vector” requirements
• Petitioner has failed to prove a motivation for modifying Jiang to use feature vector-based recognition
• Petition’s claim 2 theory lacks a reasonable expectation of success
• Petitioner’s new claim 2 theory is improper and refuted

22

Petitioner’s new claim 2 theory is untimely and refuted

“Rather than explaining how its original petition was correct, Continental’s subsequent arguments amount to an entirely new theory of prima facie obviousness absent from the petition. Shifting arguments in this fashion is foreclosed by statute, our precedent, and Board guidelines.”

Wasica v. Cont’l Auto. Sys., 853 F.3d 1272, 1286-87 (Fed. Cir. 2017)

Sur-reply at 26-28

23

Petitioner’s new claim 2 theory is untimely and refuted

Sur-reply at 26-28; Ex. 2021, at 43:3-7, 41:2-6, 38:11-16

24

Petitioner’s new claim 2 theory is untimely and refuted

Sur-reply at 26-28; Ex. 2018 at 42, 69, 108, 142-145

25