Filed on behalf of Securus Technologies, Inc.

By: Justin B. Kimble (jkimble@bcpc-law.com)
    Jeffrey R. Bragalone (jbragalone@bcpc-law.com)
    Daniel F. Olejko (dolejko@bcpc-law.com)

    Bragalone Conroy P.C.
    2200 Ross Ave.
    Suite 4500 – West
    Dallas, TX 75201
    Tel: 214.785.6670
    Fax: 214.786.6680
UNITED STATES PATENT AND TRADEMARK OFFICE

BEFORE THE PATENT TRIAL AND APPEAL BOARD

GLOBAL TEL*LINK CORPORATION,
Petitioner,

v.

SECURUS TECHNOLOGIES, INC.,
Patent Owner.

Case IPR2016-01220
U.S. Patent No. 9,007,420 B1

PATENT OWNER’S PRELIMINARY RESPONSE

Mail Stop PATENT BOARD
Patent Trial and Appeal Board
U.S. Patent & Trademark Office
P.O. Box 1450
Alexandria, Virginia 22313-1450
Table of Contents

I.   INTRODUCTION
     A. Grounds in the Petition
     B. The ’420 Patent – The Challenged Patent
II.  CLAIM CONSTRUCTION
     A. Petitioner’s Construction of “Electronic Visitation Session” is Unreasonably Broad.
     B. Petitioner’s Construction of “Feature Detection Process” Is Overbroad and Ignores the Teachings of the Specification.
     C. Performing a “three-dimensional (3D) facial recognition process . . . to identify the user” means “performing a process utilizing algorithms for detecting 3D facial features in a frame, determining whether the facial features in the frame match an authorized user, and identifying changes or differences in measurements of those features from measurements of features in another frame to verify that an actual face (e.g., not a photograph) is present in the image.”
III. ARGUMENT
     A. The Combination of Torgersrud and Kenoyer Fails to Disclose the Limitations Requiring Determining or Verifying the Presence of an “Actual Face” in Independent Claims 1, 11, and 21.
     B. Torgersrud Fails to Disclose Capturing an Image, with an Image Capture Device, of a User “in Response to” the “Request to Initiate an Electronic Visitation Session” in Independent Claims 1, 11, and 21.
     C. Petitioner Fails to Consider Claim 21 as a Whole.
IV.  CONCLUSION
I. INTRODUCTION

Patent Owner Securus Technologies, Inc. (“Securus” or “Patent Owner”) hereby files this preliminary response (“Preliminary Response”) to the Petition (Paper 2) (the “Petition”) for Inter Partes Review of U.S. Patent No. 9,007,420 (Ex. 1001) (the “’420 Patent”) in IPR2016-01220, filed by Global Tel*Link Corporation (“GTL” or “Petitioner”).

The Petitioner’s challenge to the ’420 Patent claims should be rejected because (1) the prior art lacks material claim limitations; (2) Petitioner has failed to consider the claims as a whole; and (3) Petitioner’s grounds of unpatentability rely on erroneous claim constructions.

This Response is timely under 35 U.S.C. § 313 and 37 C.F.R. § 42.107(b), as it is filed within three months of the June 22, 2016 mailing date of the Notice of Filing Date Accorded to Petition and Time for Filing Patent Owner Preliminary Response. (Paper 3). For purposes of this Preliminary Response, Patent Owner has limited its identification of deficiencies in the Petition and does not intend to waive any arguments not addressed in this Preliminary Response.
A. Grounds in the Petition

The Petition includes two grounds of alleged invalidity; both grounds rely on the combination of Torgersrud (U.S. Patent App. Pub. No. 2012/0262271 A1) and Kenoyer (U.S. Patent No. 8,218,829) for allegedly rendering obvious independent claims 1 and 11 of the ’420 Patent under 35 U.S.C. § 103. Ground 2 additionally relies upon Zhang (U.S. Patent No. 7,436,988).
Ground   References Combined              Independent Claims   Dependent Claims
1        Torgersrud and Kenoyer           1, 11                2-9, 12-19
2        Torgersrud, Kenoyer, and Zhang   21                   10, 20

Pet. at 12.
As discussed in more detail below, none of the references above, either separately or in combination, discloses all limitations of the independent claims, including, for example, the detection of the presence of an “actual face.” Additionally, none of the cited references discloses the capture of a user’s image in response to a request to initiate an electronic visitation session. Thus, the Petition does not demonstrate a reasonable likelihood that any of the proposed grounds of unpatentability will succeed for any claim of the ’420 Patent.
B. The ’420 Patent – The Challenged Patent

The ’420 Patent, titled “Verifying Presence of Authorized Persons During an Electronic Visitation,” was filed on January 10, 2014 and is directed “to methods and systems for verifying presence of authorized persons during an electronic visitation.” ’420 Patent at 1:5-10. One important goal of the ’420 Patent is to prevent circumvention of the authentication process by users who may use a photograph to fool prior art facial detection and recognition techniques. Id. at 1:27-35; 8:66-9:12.

To this end, the ’420 Patent provides a system and method that uses a “feature detection process on the captured image,” such as “a 3D feature detection process,” to verify the user’s presence by, among other things, determining if the user’s “actual face” (i.e., a person’s corporeal face) is shown in the captured image, as opposed to a photograph or other facsimile. Id.
A controlled-environment facility such as a prison may provide various options for inmates to communicate with visitors and other parties outside the prison. One of these options is video visitation. These outside parties (e.g., family members) are sometimes located long distances from the inmate, making travel to the prison onerous or impractical. In some cases, outside parties may visit with an inmate remotely using a personal computer with an image capture device such as a web-cam. This is sometimes referred to as “at-home visitation,” and occurs via “electronic visitation sessions.” However, inmates are typically restricted to receiving visitation only from approved persons. Id. at 1:14-34.
When an individual visits an inmate in person, the individual’s identity may easily be determined by providing identification documents to staff of the controlled-environment facility for verification. Id. The identification documents may include a photo ID such as a driver’s license or the like, and the staff members may cross-reference the individual’s name with a list of individuals on the inmate’s approved visitor list.

But identification of visitors is more difficult with telephone or video visitation. Id. at 1:27-35. This is especially true in the case of at-home visitation because there are no staff to physically verify the physical presence and identity of the authorized visitor. The individual may provide a personal identification number, phrase, or pass code, but it is often difficult to ascertain whether the person providing the identifying information is in fact the authorized visitor without visual confirmation of the person’s identity. Id. Unauthorized parties may attempt to defeat or circumvent verification of a user’s identity. Id. For example, an authorized visitor may pass identification information to unauthorized individuals so that they may pose as an authorized visitor for the electronic visitation. Id.
Moreover, the inventor recognized that it is difficult to confirm that an actual person is taking part in a video visitation and that typical facial recognition verification techniques may be susceptible to circumvention. See id. at 1:27-35; 8:66-9:12. For example, inmates or unauthorized outside parties may present a photograph of an authorized individual to the camera when prompted to verify their identity. The ’420 Patent addresses and solves these problems, in part, by providing embodiments for verifying the genuine presence of an actual person. For this purpose, exemplary embodiments perform specialized methods of face recognition, including, for example, three-dimensional facial recognition. See id. Each embodiment also includes certain triggers and event timing in order to perform the user verification. Id. at 11:4-12:53. For example, each claimed embodiment includes receiving a request to initiate an electronic visitation session and capturing a user image in response to the request. Id. Furthermore, the claimed embodiments involve verification of the actual face at particular times during the process, including continued confirmation of the actual face based on a second image captured during the electronic visitation session. Id.
In some embodiments, the ’420 Patent teaches that “actual,” i.e., corporeal, face detection may include examining differences across a plurality of captured frames. See id. at 8:66-9:12. Additionally, a facial feature detection process may be performed to verify that an actual corporeal human face is present. Id. This feature detection process may include identifying three-dimensional characteristics of an image, including measurements of features of a face at a plurality of points on the image. Id.
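For illustration only, the following is a minimal sketch of what “measurements of features of a face at a plurality of points” could look like in code. It is not code from the ’420 Patent or the record; the landmark names and coordinates are hypothetical, and it assumes a detector has already located the landmarks in a frame.

```python
# Illustrative sketch only; not code from the '420 Patent or the record.
# Assumes a detector has already located named facial landmarks in a frame;
# the landmark names and 3D coordinates below are hypothetical.
from itertools import combinations
from math import dist  # Euclidean distance (Python 3.8+)


def facial_measurements(landmarks):
    """Measure a face at a plurality of points: the pairwise distances
    between every pair of named landmark positions in one frame."""
    return {
        (a, b): dist(landmarks[a], landmarks[b])
        for a, b in combinations(sorted(landmarks), 2)
    }


# One captured frame's (hypothetical) landmarks as (x, y, z) coordinates;
# the z values reflect the depth a 3D feature detector could report.
frame = {
    "left_eye": (30.0, 40.0, 12.0),
    "right_eye": (70.0, 40.0, 12.0),
    "nose_tip": (50.0, 55.0, 25.0),
    "mouth_left": (38.0, 75.0, 10.0),
    "mouth_right": (62.0, 75.0, 10.0),
}

print(facial_measurements(frame))
```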
Furthermore, the claimed system of the ’420 Patent prevents connection of the electronic visitation session until after a determination that the actual face is present in the captured image. Id. at 11:4-12:53. Various embodiments may perform other actions in response to a determination that a corporeal human face is not present in the captured image. Id. Further embodiments may include performing a facial recognition process on the captured image, following performance of the feature detection process and/or in response to a determination that a corporeal human face is present in the captured image, to identify the user and/or to confirm an identity of the user. Id. The processing device may be further configured to connect the electronic visitation session in response to a verification that the actual corporeal human face was present in the captured image. Id.
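A schematic sketch of this claimed ordering follows (request, capture, verify actual face, connect, re-verify during the session). It is an illustration only: the session object and the capture/verification callables are hypothetical stand-ins, not APIs disclosed in the ’420 Patent.

```python
# Schematic sketch of the claimed ordering only; `session`, `capture_image`,
# and `actual_face_present` are hypothetical stand-ins, not the patent's API.

def run_visitation(session, capture_image, actual_face_present):
    # An image of the user is captured in response to the request.
    first_image = capture_image()

    # The session is not connected unless an actual face is verified.
    if not actual_face_present(first_image):
        return False  # e.g., deny the request or take other responsive action

    session.connect()

    # During the session, a second image is captured and re-verified,
    # providing continued confirmation that the actual face is present.
    second_image = capture_image()
    if not actual_face_present(second_image):
        session.disconnect()
        return False
    return True
```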
The Petition attempts to characterize the ’420 Patent as merely describing “using face recognition during a video conference.” Pet. at 9 (“Securus claims to have invented using face recognition during a video conference with an inmate.”). But, as discussed above and further analyzed below, the Petition mischaracterizes the claimed embodiments of the ’420 Patent. While “face recognition” is an element of some of the claimed embodiments, none of the prior art references discloses, teaches, or suggests, separately or in combination, all of the limitations of any of the claims. Thus, no prima facie case of obviousness has been made: Petitioner has not shown that one of ordinary skill in the art would have combined the cited art, and the proposed combinations of prior art do not result in disclosure of the claimed invention.
II. CLAIM CONSTRUCTION

Petitioner proposes constructions of two terms: “electronic visitation session” and “feature detection process.” Pet. at 21-24. The term “electronic visitation session” is used in each of the independent claims of the ’420 Patent. The term “feature detection process” is used in independent claims 1 and 11 and also in dependent claims 5, 10, and 20 of the ’420 Patent. Petitioner argues that these terms need to be construed, but offers no explanation as to why these terms cannot be given their “plain and ordinary meaning.” See In re Translogic Tech., Inc., 504 F.3d 1249, 1257 (Fed. Cir. 2007) (holding that claim terms are presumed to be given their ordinary and customary meaning as would be understood by one of ordinary skill in the art in the context of the entire disclosure). Indeed, Petitioner’s proposed constructions are offered not because these terms require construction, but so that Petitioner can attempt to match the prior art to the challenged claims. This is an improper reason for construing these terms. Further, Petitioner’s improper constructions undermine its arguments that the prior art renders the challenged claims obvious.
Even under the broadest reasonable interpretation standard applicable here, claim constructions must be consistent with the patent specification. Servicenow, Inc. v. Hewlett-Packard Co., No. IPR2015-00702, Paper No. 12, Decision Denying Institution of Inter Partes Review at 6 (PTAB Aug. 21, 2015) (citing Microsoft Corp. v. Proxyconn, Inc., 789 F.3d 1291, 1298 (Fed. Cir. 2015)). As the Board has acknowledged, a claim’s construction “should always be read in light of the specification and teaching in the underlying patent,” and “cannot be divorced from the specification.” Id. Indeed, “the specification ‘is the single best guide to the meaning of a disputed term.’” In re Translogic Tech., 504 F.3d at 1257 (quoting Phillips v. AWH Corp., 415 F.3d 1303, 1315 (Fed. Cir. 2005)). Because the Petitioner’s constructions are unreasonable in light of the specification, they cannot be correct. Servicenow, Inc., No. IPR2015-00702, Paper No. 12 at 6 (quoting Microsoft Corp., 789 F.3d at 1298) (“The Board . . . may not ‘construe claims during IPR so broadly that its constructions are unreasonable under general claim construction principles.’”).
A. Petitioner’s Construction of “Electronic Visitation Session” is Unreasonably Broad.

Petitioner suggests that the term “electronic visitation session” should be interpreted to mean “an electronic communication, such as by text, video, or voice, with a resident of a controlled environment facility.” Pet. at 23. But this construction is unreasonably broad. It covers essentially any “electronic communication” with an inmate. Petitioner’s construction completely ignores the word “session” in the term “electronic visitation session,” and thus improperly removes the temporal limitations inherent in the term.
The term “session” connotes certain temporal limitations under its plain meaning,¹ and in view of the specification. For example, a visitation session at a correctional facility would begin when the visitor and the inmate are joined in the same room, and would end when either the visitor or the inmate leaves the room. If the visitor came back the next week, that visitation would constitute a separate visitation session.

Further, Petitioner’s construction does not apply the proper meaning to the term “visitation.” A “visitation” is not just a communication with a resident of a controlled environment facility. Such communications could occur with other residents of the controlled environment facility, and would not be consistent with the well-understood meaning of visitation within a controlled environment. A visitation is, in fact, well understood to be a controlled visit of a non-resident with a resident of a controlled environment facility. ’420 Patent at 3:51-55. Further, in the context of the ’420 Patent, the visitation occurs through a communication processing system and visitation system that establishes a controlled and monitored connection between the resident and the non-resident. Id. at Fig. 1, 3:8-23, 8:45-9:16, 10:12-20. Thus, Petitioner’s proposed construction expands the broadest reasonable interpretation of this claim limitation and should be rejected.

¹ See, e.g., Xerxes Mazda & Fraidoon Mazda, The Focal Illustrated Dictionary of Telecommunications 555 (1999) (defining “Session” as “The period of time during which two terminals on a network are connected together by a transmission path, to allow communications to occur between them.”), available at https://books.google.com/books?id=i5ZNAGQOX18C&pg=PA555 (last visited Sept. 21, 2016); Mrinal Talukdar, Dictionary of Computer & Information Technology 272 (2013) (“In telecommunications, a session is a series of interactions between two communication end points that occur during the span of a single connection.”), available at https://books.google.com/books?id=NsswBQAAQBAJ&pg=PA272 (last visited Sept. 21, 2016) (Ex. 2001).
An “electronic visitation session” has a beginning and an end. An electronic visitation session begins when the communication processing system and visitation system connect the inmate and remote visitor to allow electronic communication (e.g., a voice call, videoconference, or online chat), which is analogous to the two participants being placed into the same room. See id. at 9:39-44. And, similarly, an electronic visitation session ends when the communication processing system and visitation system disconnect the communication between the inmate and the remote visitor. See id. at 9:44-49.
Accordingly, the term “electronic visitation session” should be construed as “a period of time during which a resident of a controlled environment facility and a non-resident visitor are connected together by a communication processing system and a visitation system to allow the controlled and monitored exchange of electronic communications, wherein the session begins when the electronic communication is connected and ends when it is disconnected.” This construction is consistent with the plain meaning of the term and the specification.
B. Petitioner’s Construction of “Feature Detection Process” Is Overbroad and Ignores the Teachings of the Specification.

Petitioner suggests the term “feature detection process” be construed as “a process for detecting characteristics of an image, such as measurements of features of a face at a plurality of points on the image.” Pet. at 23. However, Petitioner ignores the purpose of the “feature detection,” the teachings of the specification, and the plain meaning of the word “feature” in view of the specification. In each instance where the term “feature detection process” (or “the detection process”) is recited in the claims, it refers to the phrase “to verify that an actual face is present.” See ’420 Patent at 11:9-11, 11:49-50, 12:3-5. Petitioner’s proposed construction of “feature detection process” ignores this important aspect of the claims and should be rejected. An appropriate claim construction must include construction of the entire phrase “performing a [feature] detection process . . . on the [second] image to verify that [an/the] actual face [was/is] present in the [second] image.” ’420 Patent at 10:21-24.
The ’420 Patent’s “feature detection process” is included in independent claims 1 and 11. Each of these claims provides that the feature detection process operates on a captured image and is used “to verify that an actual face was present in the image.” Id. at 1:51-59. The ’420 Patent performs this detection to determine “that the person is an actual person and not simply a photograph presented to trick the system.” Id. at 9:5-8. Furthermore, the specification provides that a “captured image may include either a still frame photograph or a video image.” Id. at 8:55-65. That is, the captured image may be a “single video frame” or a “plurality of video frames.” Id. In the disclosed embodiments, the process includes “measurements of features of a face at a plurality of points on the image.” Id. at 9:2-5.
In order to detect whether the captured image shows an “actual face” as opposed to a photograph, embodiments of the process evaluate “changes in the measurements from frame to frame of a video image,” or may use “a plurality of still frame photographs” that “may be captured and differences in measurements may be calculated.” Id. at 9:5-12. Thus, changes in the captured image from frame to frame, such as movement of the lips, eyes, etc., allow the feature detection process “to determine if the presented person is an actual person or a photograph.” See id. at 8:66-9:12.
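As an illustration of that frame-to-frame evaluation, the following is a minimal sketch, assuming per-frame feature measurements like those described above; it is not code from the ’420 Patent or the record, and the tolerance and measurement values are invented.

```python
# Hypothetical sketch of frame-to-frame comparison; the tolerance and all
# measurement values are invented for illustration.

def appears_to_be_actual_face(frames, tolerance=0.5):
    """Given per-frame measurement dicts, report True if any measurement
    changes between consecutive frames (features moved, as with a
    corporeal face) and False if every measurement is static (as with a
    photograph held up to the camera)."""
    for earlier, later in zip(frames, frames[1:]):
        for key in earlier.keys() & later.keys():
            if abs(earlier[key] - later[key]) > tolerance:
                return True
    return False


photo = [{"mouth_width": 24.0}, {"mouth_width": 24.0}, {"mouth_width": 24.0}]
person = [{"mouth_width": 24.0}, {"mouth_width": 27.5}, {"mouth_width": 25.1}]
print(appears_to_be_actual_face(photo))   # False: no movement detected
print(appears_to_be_actual_face(person))  # True: measurements change
```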
The Board should reject GTL’s overbroad proposed construction because it ignores this critical aspect of the claimed “feature detection process.” It is inappropriate to construe the term “feature detection process” in isolation. Instead, the more complete phrase “performing a [feature] detection process . . . on the [second] image to verify that [an/the] actual face [was/is] present in the [second] image” should be construed. Consistent with the plain meaning and in view of the specification, the claim limitation should be construed as “performing a process utilizing algorithms for detecting features in a frame and identifying changes or differences in those features from features in another frame to verify that an actual face (e.g., not a photograph) is present in the image.”
Claim differentiation does not require otherwise. “Claim differentiation is a guide, not a rigid rule.” Marine Polymer Techs., Inc. v. HemCon, Inc., 672 F.3d 1350, 1359 (Fed. Cir. 2012). While dependent claims 10 and 19 specifically “utilize[] three-dimensional (3D) feature detection,” they do not render superfluous the requirement in independent claim 1 that the “feature detection process . . . verify that an actual face was present in the image.” Indeed, throughout the specification, the “feature detection process” is described as a process “to verify that an actual face was present in the image,” even outside of the context of 3D feature detection. Id. at Abstract, Fig. 4, 1:45-47, 1:54-57, 8:67-9:2. Because the specification does not limit the process of detecting changes or differences in features from frame to frame to the 3D feature detection process, GTL’s claim-differentiation argument is unpersuasive. See, e.g., Wi-LAN USA, Inc. v. Apple Inc., --- F.3d ----, 2016 WL 4073324, at *11-12 (Fed. Cir. Aug. 1, 2016).
C. Performing a “three-dimensional (3D) facial recognition process . . . to identify the user” means “performing a process utilizing algorithms for detecting 3D facial features in a frame, determining whether the facial features in the frame match an authorized user, and identifying changes or differences in measurements of those features from measurements of features in another frame to verify that an actual face (e.g., not a photograph) is present in the image.”

Independent claim 21 recites “performing a three-dimensional (3D) facial recognition process . . . to identify the user.” In the context of method 500 depicted in Figure 5, the specification states that the “3D facial recognition process” is used “to affirmatively identify the user as an authorized user as shown in block 504.” Id. at Fig. 5, 9:34-36. “For example, the visitation system 130 may verify that the inmate present matches a PIN entered by the inmate and that the remote user is a member of the inmate’s PAC list.” Id. at 9:37-39. It further states that “[t]he second image may be processed according to the 3D facial recognition process, and it may be further determined whether the face in the second image matches an authorized user as shown at block 508.” Id. at 9:55-58.
In the context of Figure 6, the specification states that “[t]his embodiment may also further demonstrate the method 500 described in FIG. 5.” Id. at 10:4-8. “In this embodiment, the method 500 is carried, at least in part, by a smart terminal 103.” Id. at 10:8-9. “During the use of the electronic visitation session, the visitation system 130 may monitor webcam 801 to ensure that the actual authorized person’s face 601 is still present . . . .” Id. at 10:13-16. “If the monitoring application no longer detects an actual authorized face 601 within camera field of view 603, the existing login session and display information are closed.” Id. at 10:16-20.
The specification further notes that the inmate smart terminal 103 “utilizes algorithms for detecting 3D facial features to verify that a human face is presented to the camera.” Id. at 10:21-24. The “3D feature detection process may identify three-dimensional characteristics of an image, including measurements of features of a face at a plurality of points on the image.” Id. at 9:2-5. Further, the specification explains that the “facial identification process” requires not only “facial detection,” “but the face must be matched against a known likeness of the inmate.” Id. at 10:33-36. “A match to the known face 601 of the inmate then acts not only as a means of maintaining the electronic visitation session but also as an additional level of confirmation of the inmate’s identity so as to preclude unauthorized use.” Id. at 10:36-40.
Thus, according to the specification, the “three-dimensional (3D) facial recognition process” has multiple aspects: (1) it utilizes 3D feature detection to analyze each image; (2) it determines whether the face present matches the likeness of a known person that is authorized to make the call; and (3) it determines whether the face present is the actual authorized face (e.g., not a photograph used to trick the system) by comparing differences in images captured by the system (e.g., changes in the measurements from frame to frame in the captured image). In light of this disclosure, the phrase “performing a three-dimensional (3D) facial recognition process . . . to identify the user” should be construed as “performing a process utilizing algorithms for detecting 3D facial features in a frame, determining whether the facial features in the frame match an authorized user, and identifying changes or differences in measurements of those features from measurements of features in another frame to verify that an actual face (e.g., not a photograph) is present in the image.”
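Read together, the three aspects compose into a single pipeline. The sketch below is schematic only; each helper passed in is a hypothetical stand-in, not an algorithm disclosed by the ’420 Patent or the record.

```python
# Schematic composition of the three aspects; every helper passed in is a
# hypothetical stand-in, not an algorithm from the '420 Patent or the record.

def three_d_facial_recognition(frames, detect_3d_features,
                               matches_authorized_user, measurements_change):
    # (1) 3D feature detection analyzes each captured image.
    feature_sets = [detect_3d_features(frame) for frame in frames]

    # (2) The detected face must match the likeness of an authorized user.
    if not all(matches_authorized_user(fs) for fs in feature_sets):
        return False

    # (3) Frame-to-frame measurement differences must show an actual face;
    #     a static photograph exhibits no such changes.
    return measurements_change(feature_sets)
```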
III. ARGUMENT

A. The Combination of Torgersrud and Kenoyer Fails to Disclose the Limitations Requiring Determining or Verifying the Presence of an “Actual Face” in Independent Claims 1, 11, and 21.

Independent claims 1, 11, and 21 of the ’420 Patent require, among other things, determining or verifying an “actual face” in the captured image so as to prevent users from circumventing the verification system by presenting a photograph to the image capture device. The ’420 Patent differentiates between simply detecting a “face” and detecting an “actual face” in order to allow the system to distinguish whether a user is an “actual person and not simply a photograph presented to trick the system.” ’420 Patent at 9:5-8. Hence, the embodiments in claims 1, 11, and 21 include, among other verification techniques, preventing users from circumventing the authentication process by displaying a photograph to the visitation system’s camera.
Specifically, claim 1 recites a method comprising:

    receiving a request to initiate an electronic visitation session;
    capturing an image, with an image capture device, of a user in response to the request;
    performing a feature detection process, with a processor, on the image to verify that an actual face was present in the image;
    connecting the electronic visitation session in response to a determination that the actual face was present in the image;
    capturing a second image of the user with the image capture device during the electronic visitation session;
    performing the detection process on the second image, with the processor, to verify that the actual face is present in the second image.
’420 Patent at 11:4-19. System claim 11 contains similar limitations. Id. at 11:43-12:5. Claim 21 requires “performing a three-dimensional (3D) facial recognition process . . . to identify the user” (id. at 11:33-52), which, as discussed, also involves detecting whether an actual face is present.

The combination of Torgersrud and Kenoyer, however, fails to teach or suggest these limitations. Petitioner ignores the claim language “actual face” and instead analyzes the claim term as if it were simply substituted with “face.” See Pet. at 18 (“[T]he feature detection process verifies that an actual face was present in the image, in other words detects a face.” (emphasis added)). Such a substitution is legal error. See In re Greene, 22 F.3d 1104, 1994 WL 89035, at *2 (Fed. Cir. 1994) (unpublished disposition) (reversing the Board’s finding of obviousness based on an interpretation of the claim term “deformable hinge” that ignored the meaning of “deformable” in light of the specification); see also In re Wilson, 424 F.2d 1382, 1385 (C.C.P.A. 1970) (“All words in a claim must be considered in judging the patentability of that claim against the prior art.”). Furthermore, Petitioner’s interpretation is inconsistent with the specification, which distinguishes the modifying term “actual” as “not simply a photograph presented to trick the system.” ’420 Patent at 9:5-12.
Torgersrud is drawn to an interactive communications kiosk for use in a secure facility that “provides access to services including internet services, text-based messaging, tele-medical services, religious and educational materials, commissary ordering, and entertainment.” Torgersrud at Abstract. Torgersrud’s kiosk provides for verification of users by facial recognition. Id. (“The kiosk is configured to authenticate the identity of a user by verifying a personal identification number entered by the user and also performing one or more of a facial recognition via the camera or a biometric voice recognition via the microphone.”). In fact, Torgersrud focuses primarily on facial recognition, i.e., recognizing the identity of a person, not feature detection for determining the presence of an actual face. See Torgersrud ¶ [0040] (“The platform 225 may also include voice and/or facial recognition features, described in more detail below.”); id. ¶ [0054] (“The kiosk 102 includes an integrated camera 303 that can be used for video communications or for user authentication via facial recognition.”).
Torgersrud also claims to perform “facial detection,” but this “detection” is precisely the type of verification technique that is susceptible to circumvention when users present a photograph to the camera. And this is exactly the problem the ’420 Patent solves over the prior art. ’420 Patent at 8:66-9:12. Torgersrud’s detection technique looks at individual frames of video to verify whether a face is present in the frame. Torgersrud ¶ [0085] (“The facial detection software uses video analysis of individual frames of video to detect that a human face is present inside the video frame.”). If Torgersrud does not detect a “face” in the frame (such as when the camera is covered up), it prevents system access or perturbs the video image. Id. ¶¶ [0064], [0085]. But detecting a face in a video frame image, as disclosed in Torgersrud, is not the same as detecting the presence of an “actual face” as claimed in the ’420 Patent.
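The distinction can be illustrated with a hypothetical sketch (neither reference contains code, and `face_in_frame` is an invented stand-in): a per-frame check of the kind Torgersrud describes asks only whether each individual frame contains a face, so a photograph satisfies it.

```python
# Hypothetical illustration of the distinction argued above; not code from
# Torgersrud. A per-frame check asks only whether each individual frame
# contains a face, with no comparison across frames.

def per_frame_face_detected(frames, face_in_frame):
    return all(face_in_frame(frame) for frame in frames)


# A photograph held to the camera shows a detectable face in every frame,
# so the per-frame check passes even though no corporeal face is present.
photo_frames = ["face", "face", "face"]  # stand-in frame data
print(per_frame_face_detected(photo_frames, lambda f: f == "face"))  # True
```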
`
Petitioner appears to recognize the weakness in its arg
