
Trials@uspto.gov
571-272-7822

PUBLIC VERSION

Paper 67
Entered: April 5, 2023

UNITED STATES PATENT AND TRADEMARK OFFICE

BEFORE THE PATENT TRIAL AND APPEAL BOARD

UNIFIED PATENTS, LLC,
Petitioner,
v.
MEMORYWEB, LLC,
Patent Owner.

IPR2021-01413
Patent 10,621,228 B2

Before LYNNE H. BROWNE, NORMAN H. BEAMER, and
KEVIN C. TROCK, Administrative Patent Judges.

TROCK, Administrative Patent Judge.

JUDGMENT
Final Written Decision
Determining All Challenged Claims Unpatentable
35 U.S.C. § 318(a)
I. INTRODUCTION
We have authority to hear this inter partes review under 35 U.S.C. § 6. This Final Written Decision is issued pursuant to 35 U.S.C. § 318(a) and 37 C.F.R. § 42.73. For the reasons discussed below, we determine that Petitioner, Unified Patents, LLC (“Unified”), has shown by a preponderance of the evidence that claims 1–7 (the “challenged claims”) of U.S. Patent No. 10,621,228 B2 (Ex. 1001, “the ’228 Patent”) are unpatentable. See 35 U.S.C. § 316(e) (2018); 37 C.F.R. § 42.1(d) (2019).

A. Procedural History

The Petition (Paper 2, “Pet.” or “Petition”) requested inter partes review of the challenged claims of the ’228 Patent. Patent Owner, MemoryWeb, LLC, filed a Preliminary Response. Paper 8 (“Prelim. Resp.”). With our authorization, Petitioner filed a Preliminary Reply (Paper 11), and Patent Owner filed a Preliminary Sur-reply (Paper 12). Based upon the record at that time, we instituted inter partes review on all challenged claims on the grounds presented in the Petition. Paper 15 (“Institution Decision” or “Dec.”).

After institution, Patent Owner filed a Response (Paper 23, “PO Resp.”), Petitioner filed a Reply (Paper 29, “Pet. Reply”), Patent Owner filed a Sur-reply (Paper 35, “PO Sur-reply”), and, with our authorization, Petitioner filed a Sur-sur-reply (Paper 42, “Pet. Sur-sur-reply”).

Petitioner filed a Motion to Exclude certain evidence (Paper 44). Patent Owner opposed the motion (Paper 45).

On December 16, 2022, an oral hearing was held. The hearing comprised a confidential session and a public session. A transcript of the hearing was made a part of this record. Paper 52 (confidential session), Paper 53 (public session).
B. Real Party-in-Interest

In the Petition, Petitioner stated that “[p]ursuant to 37 C.F.R. § 42.8(b)(1), Unified Patents, LLC . . . certifies that Unified is the real party-in-interest and certifies that no other party exercised control or could exercise control over Unified’s participation in this proceeding, filing this petition, or conduct in any ensuing trial.” Pet. 1.

In its Preliminary Response, Patent Owner argued that “Apple and Samsung1 should have been [named] as RPIs [(real parties in interest)] in this proceeding, and the failure to identify Apple and Samsung is a basis for the Board to deny institution.” Prelim. Resp. 28; see also id. at 22–28.

As noted above, we authorized additional preliminary briefing to allow the parties to address the RPI issue, as well as other issues. Ex. 1020. In its Preliminary Reply, Petitioner argued that “Patent Owner’s (PO’s) RPI arguments should be rejected as inappropriate or, at best, premature. As is the case here, the Board need not address whether a party is an unnamed RPI where no time bar or estoppel provisions under 35 U.S.C. § 315 are implicated.” Paper 11, 1 (citing SharkNinja Operating LLC v. iRobot Corp., IPR2020-00734, Paper 11 at 18 (PTAB, Oct. 6, 2020) (precedential) (“SharkNinja”); Unified Patents, LLC v. Fat Statz, LLC, IPR2020-01665, Paper 19 at 2–3 (PTAB, Apr. 16, 2021)).

1 We infer from the record that Patent Owner is referring to Samsung Electronics Co., Ltd. (“Samsung”) and Apple, Inc. (“Apple”) based on the petitions filed by these companies challenging the ’228 patent. See Sec. C, below.
Based upon the preliminary record at that time, we instituted inter partes review on all the challenged claims on the grounds presented in the Petition, but declined to determine whether Apple and Samsung were real parties in interest. Dec. 15. We declined to decide the real party in interest question at that time partly because determining whether a non-party is an RPI is a highly fact-dependent question and the case was still in its preliminary stage without a fully developed factual record. Moreover, we determined that we need not address the RPI issue at that time because there was no allegation that Apple or Samsung were subject to a time bar or estoppel that would preclude this proceeding. Accordingly, under the Board’s precedential decision in SharkNinja, IPR2020-00734, Paper 11 at 18, we declined to decide the RPI issue. See Paper 15, 11–14.

After institution, Patent Owner raised the RPI issue again, arguing in its Response that

the Board should terminate this proceeding because Petitioner has failed to name all real parties-in-interest (“RPIs”), including at least Samsung and Apple. Alternatively, the Board should find that Apple and Samsung are estopped from challenging the validity of claims 1–7 of the ’228 patent in related proceedings: Apple Inc. v. MemoryWeb, LLC, IPR2022-00031 (the “Apple IPR”) and Samsung Electronics Co., Ltd., v. MemoryWeb, LLC, IPR2022-00222 (the “Samsung IPR”) (collectively, the “Related IPRs”).

PO Resp. 14–15.
Given that we now had a fully developed factual record before us, including probative evidence on the RPI issue that was not available to us at the institution phase of this case,2 and that the parties had been able to argue this issue before the Board during a confidential session of the hearing in this proceeding (see Paper 52), we were able to fully address the real party in interest issue raised by Patent Owner in its Response. Accordingly, based upon the complete evidentiary record and the parties’ arguments, we issued an Order on March 8, 2023 (Paper 56) identifying Apple and Samsung as RPIs in this proceeding and instructing Petitioner to “update its Mandatory Notices by March 10, 2023, identifying all Real Parties in Interest consistent with this Order pursuant to its obligations under 37 C.F.R. § 42.8(b)(1).” See Paper 56, 34.

C. Related Matters

According to the parties, the ’228 patent was asserted in the following district court proceedings: MemoryWeb, LLC v. Samsung Electronics Co., Ltd. et al., Case No. 6:21-cv-00411 (W.D. Tex.); MemoryWeb, LLC v. Apple Inc., Case No. 6:21-cv-00531 (W.D. Tex.); and MyHeritage (USA), Inc. et al. v. MemoryWeb, LLC, Case No. 1:21-cv-02666 (N.D. Ill.). Pet. 1–2; Paper 4, 2; Paper 7, 2; Paper 9, 2–3.

Patent Owner also identifies U.S. Patent No. 9,098,531 (“the ’531 patent”), U.S. Patent No. 10,423,658 (“the ’658 patent”), U.S. Patent No. 9,552,376 (“the ’376 patent”), U.S. Patent No. 11,017,020 (“the ’020 patent”), U.S. Patent No. 11,163,823 (“the ’823 patent”), pending U.S. Patent Application 17/381,047, and pending U.S. Patent Application 17/459,933 as related to the ’228 patent. Paper 7, 2; Paper 9, 2–3.

Patent Owner additionally indicates the following inter partes proceedings as related matters: Samsung Electronics Co., Ltd., v. MemoryWeb, LLC, IPR2022-00222 (PTAB), challenging the ’228 patent; Apple Inc. v. MemoryWeb, LLC, IPR2022-00031 (PTAB), challenging the ’228 patent; Apple Inc. v. MemoryWeb, LLC, IPR2022-00111 (PTAB), challenging the ’020 patent; Apple Inc. v. MemoryWeb, LLC, PGR2022-00006 (PTAB), challenging the ’020 patent; Apple Inc. v. MemoryWeb, LLC, IPR2022-00033 (PTAB), challenging the ’658 patent; and Apple Inc. v. MemoryWeb, LLC, IPR2022-00032 (PTAB), challenging the ’376 patent. Paper 7, 2; Paper 9, 2–3.

2 Since institution, the parties supplemented the record with Exhibits 1030–1043 and 2027–2047, which included the deposition transcript of the CEO of Unified (Ex. 2036), as well as other relevant evidence on this issue.
D. The ’228 Patent (Ex. 1001)

The ’228 patent is titled “Method and Apparatus for Managing Digital Files” and “relates generally to the management of digital files and, more particularly, to a computer-implemented system and method for managing and using digital files such as digital photographs.” Ex. 1001, code (54), 1:21–24. The ’228 patent describes a need for “a medium that allows people to organize, view, preserve and share [digital] files with all the memory details captured, connected and vivified via an interactive interface” and “allow digital files, including documents, photos, videos and audio, to tell a full story now, and for generations to come.” Id. at 1:61–67. The ’228 patent provides a solution in the form of “a computer-implemented method of associating digital tags with digital files” and “a web-based digital file storage system [that] comprises a digital file repository for storing and retrieving digital files.” Id. at 2:3–6, 2:21–25, 2:40–45.
The ’228 patent describes details of an “Application” (also called the “MemoryWeb Application”), which is an online program that can (i) import, associate and embed digital tags to digital files, (ii) view, sort, annotate, and share digital files from various Application Views, and (iii) store the digital files through an interactive storage system through a user relationship table. Id. at 8:63–9:16. The ’228 patent explains that the Application may be accessible “over various user interfaces” including those of “smart phones (e.g., iPhones), Personal Digital Assistants (PDAs) and Tablets (e.g., iPads).” Id. at 9:18–22. The Application provides views (i.e., “Application Views”) that utilize the Application’s ability to associate digital tags to digital files and display them in customized views such as Uploads, Collections, Slideshow, Location, Timeline, Family Tree, People Profile, and Recipes. Id. at 9:23–28. The views enable a user to display the user’s digital media files and their tagged attributes. Id. at 5:57–60. The views include, inter alia: a location view that “identifies within an interactive map ([e.g.,] Google map . . .), where digital files were taken or originated . . . [and] can also provide additional outputs such as a journey route that identifies the specific locations for an event or trip that can be customized by users”; a people view that “shows thumbnail photos of all the people in the system that can be clicked in for a people profile view”; and a people profile view that “shows a profile picture of an individual, their birth/death information, family relationships, overview (comments) on the person, as well as links to other views that contain that individual in the system.” Id. at 6:13–30. Some views provided by the ’228 patent’s Application are shown in Figures 32 and 34, reproduced below. Id. at 3:62–66, 28:22–24.

Figure 32 illustrates a People Application View (at indicator 1400) and a People Profile Application View (at indicator 1430). Id. at 18:37–40, 22:59–61.

[Figure 32 of the ’228 patent reproduced here]
In Figure 32, above, People Application View 1400 is used to display all the people that were created within a user’s Application. Id. at 22:60–23:11. This view can be seen by selecting “People” (illustrated at menu item 1401) from any of the Application Views within the Application, which then provides a list of people in various sort orders. Id. For each person, a thumbnail of their face along with their name is depicted, as shown in Figure 32, where Jon Smith (item 1403) and JC Jon Smith (item 1404) along with some other people are illustrated. Id. Also, at the top of every Application View within the Application, the user can select to apply filters (Apply Filters at item 1451). Id. In the People Profile Application View in Figure 32, a single profile (item 1430) is illustrated. Id. at 23:12–49. The profile shows: the individual’s name (displayed at the top of the page, at 1431) along with their nicknames (at 1433); when they were born (at 1434); their family members (at 1435, 1436, 1437); their biography (at 1438); and a profile photo (at 1439). Id. For each person, the system can allow the user to quickly see all the tags that are associated to a person. Id.

In Figure 32, the system illustrates that there are four photos (1452) associated with that person and illustrates thumbnails of each of the four photos (1446). Id. These thumbnails can be selected, and then the user will be taken to the slideshow view for that digital file. Id. If the user selects Locations (1443), all of the locations that the specific person has been tagged within will be displayed. Id. If the user selects Family Relationships (1444), the people that the user is associated with will be displayed in a family chart or tree. Id. If the user selects any of the Application Dot-Tags, such as the individual’s mother Jane Smith (Doe) (1449), the application will take the user to an individual people profile view of Jane Smith (Doe). Id. An Application Dot-Tag is a structure that enables navigation of the data in the Application, helps the user organize their digital files with key components of related information such as people, date of file, location, and collection, and indicates the manner in which a Digital Tag is displayed within the Application using pill-shaped indicators that can reside near a file’s image or overlaid on the file’s image. Id. at 9:40–67. The ’228 patent explains that the “Application Dot-Tag is more than just text” because “Memory-Web Application Dot-Tags act as mini search engines that allow the user to see how many matching files there are to that MemoryWeb Tag and if selected will take the user to the corresponding Application View to illustrate the linked search results of that Application Dot-Tag.” Id.
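For illustration only, the tag-as-filter behavior described above (a tag that reports how many files match it and, when selected, returns the linked files) can be sketched in the following way. This sketch is not drawn from the ’228 patent or the record; the class name, method names, and file names are hypothetical.

```python
# Hypothetical sketch of a tag index in which each tag value acts as a
# "mini search engine": it reports a match count and, when selected,
# returns the set of files linked to that tag. Not from the record.

from collections import defaultdict


class TagIndex:
    """Maps tag values (e.g., a person's name or a location) to file IDs."""

    def __init__(self):
        self._index = defaultdict(set)

    def tag(self, file_id, tag_value):
        # Associate a digital file with a tag value.
        self._index[tag_value].add(file_id)

    def match_count(self, tag_value):
        # The count a pill-shaped indicator could display.
        return len(self._index[tag_value])

    def select(self, tag_value):
        # Selecting the tag yields the linked search results.
        return sorted(self._index[tag_value])


# Hypothetical usage with made-up file names:
idx = TagIndex()
idx.tag("photo1.jpg", "Jane Smith (Doe)")
idx.tag("photo2.jpg", "Jane Smith (Doe)")
idx.tag("photo2.jpg", "Wrigley Field")
```

Under this sketch, selecting the “Jane Smith (Doe)” tag would report two matching files and return both of them.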
Figure 34 of the ’228 patent, reproduced below, illustrates Location Views. Id. at 21:36–38, 24:16–17.

[Figure 34 of the ’228 patent reproduced here]

Figure 34, above, shows Location Application View 1600, which displays all the locations that were created within the user’s Application; for each location, a thumbnail of a digital file from that location (e.g., Wrigley Field 1601); a view of a single location (1630), with the individual location name displayed at the top of the page (1632); and thumbnails of each digital file within the specific collection, such as a photo (1633) taken at Wrigley Field (1634) that is associated with the location Wrigley Field. Id. at 24:16–54. The ’228 patent provides that “the Application can interact with a Third Party Geographical Mapping System to pull maps that correspond to the exact location of Digital Files that have a location tag.” Id. at 32:10–13.
Figure 41 of the ’228 patent, reproduced below, is a screenshot of an Application Dot-Tag Filter in a Location Application View. Id. at 4:7–8.

[Figure 41 of the ’228 patent reproduced here]

Figure 41, above, illustrates filtering results for an Application Dot-Tag filter in a Location Application View (at item 0870), providing a world map view that illustrates all the locations that are associated with one or more digital files for a user. Id. at 29:41–64, 32:15–18. As shown in Figure 41, digital files are displayed within an interactive map (e.g., a Google map). Id. at 29:40–64. Individual or groups of digital files are illustrated as photo thumbnails (at indicators 0874 and 0875) on the map, and the user can select a thumbnail to see all the digital files with the same location, or the user can use the interactive map and narrow the map view by using a zoom in/zoom out bar (0876) or by selecting the map. Id. If an advanced filter is applied in the Locations Application View, a filter (e.g., of “JC Smith” at item 0872) is illustrated, and only the digital files that contain the person JC Smith are illustrated with their geographic location on the map. Id.
E. Challenged Claims

Petitioner challenges claims 1–7 of the ’228 patent. Pet. 2, 4. Claim 1 is independent and illustrative; it is set out below.

    1. [1a-preamble] A method comprising:
    [1b] responsive to a first input, causing a map view to be displayed on an interface, [1c] the map view including:
        (i) an interactive map;
        [1d] (ii) a first location selectable thumbnail image at a first location on the interactive map; and
        [1e] (iii) a second location selectable thumbnail image at a second location on the interactive map;
    [1f] responsive to an input that is indicative of a selection of the first location selectable thumbnail image, causing a first location view to be displayed on the interface, [1g] the first location view including (i) a first location name associated with the first location and (ii) a representation of at least a portion of one digital file in a first set of digital files, [1h] each of the digital files in the first set of digital files being produced from outputs of one or more digital imaging devices, the first set of digital files including digital files associated with the first location;
    [1i] responsive to an input that is indicative of a selection of the second location selectable thumbnail image, causing a second location view to be displayed on the interface, [1j] the second location view including (i) a second location name associated with the second location and (ii) a representation of at least a portion of one digital file in a second set of digital files, [1k] each of the digital files in the second set of digital files being produced from outputs of the one or more digital imaging devices, the second set of digital files including digital files associated with the second location; and
    [1l] responsive to a second input that is subsequent to the first input, causing a people view to be displayed on the interface, [1m] the people view including:
        (i) a first person selectable thumbnail image including a representation of a face of a first person, the first person being associated with a third set of digital files including digital photographs and videos;
        [1n] (ii) a first name associated with the first person, the first name being displayed adjacent to the first person selectable thumbnail image;
        [1o] (iii) a second person selectable thumbnail image including a representation of a face of a second person, the second person being associated with a fourth set of digital files including digital photographs and videos; and
        [1p] (iv) a second name associated with the second person, the second name being displayed adjacent to the second person selectable thumbnail image.

Ex. 1001, 35:32–36:11 (with brackets noting Petitioner’s labels, see Pet. 13–60).
F. Evidence

Reference or Declaration | Date | Exhibit No.
U.S. Patent Application Publication No. 2011/0122153 A1 (“Okamura”) | May 26, 2011 | Ex. 1004
U.S. Patent No. 6,714,215 B1 (“Flora”) | March 30, 2004 | Ex. 1005
U.S. Patent Application Publication No. 2011/0163971 A1 (“Wagner”) | July 7, 2011 | Ex. 1006
U.S. Patent Application Publication No. 2010/0172551 A1 (“Gilley”) | July 8, 2010 | Ex. 1007
Declaration of Benjamin Bederson, Ph.D. | Sept. 3, 2021 | Ex. 1002
Reply Declaration of Benjamin Bederson, Ph.D. | Aug. 29, 2022 | Ex. 1038
First Declaration of Professor Glenn Reinman | Dec. 17, 2021 | Ex. 2001
Second Declaration of Professor Glenn Reinman | June 6, 2022 | Ex. 2038
G. Asserted Grounds of Unpatentability

Claim(s) Challenged | 35 U.S.C. §3 | Reference(s)
1–7 | 103(a) | Okamura, Flora
1–7 | 103(a) | Okamura, Flora, Wagner
1–7 | 103(a) | Okamura, Flora, Gilley
1–7 | 103(a) | Okamura, Flora, Wagner, Gilley

Pet. 4.
II. ANALYSIS

A. Principles of Law: Obviousness

A claim is unpatentable as obvious under 35 U.S.C. § 103 if the differences between the subject matter sought to be patented and the prior art are such that the subject matter as a whole would have been obvious to a person having ordinary skill in the art to which said subject matter pertains. KSR Int’l Co. v. Teleflex Inc., 550 U.S. 398, 406 (2007). The question of obviousness is resolved on the basis of underlying factual determinations, including: (1) the scope and content of the prior art; (2) any differences between the claimed subject matter and the prior art; (3) the level of skill in the art; and (4) objective evidence of nonobviousness, i.e., secondary considerations.4 See Graham v. John Deere Co. of Kansas City, 383 U.S. 1, 17–18 (1966).

3 The Leahy-Smith America Invents Act, Pub. L. No. 112–29, 125 Stat. 284 (2011) (“AIA”), amended 35 U.S.C. § 103. The ’228 patent claims priority to Patent Application No. 13/157,214, providing an effective filing date of June 9, 2011. See Ex. 1001, code (63). Because this priority date is before the effective date of the applicable AIA amendments (March 16, 2013), we use the pre-AIA version of 35 U.S.C. § 103 in this proceeding.
The Supreme Court has made clear that we apply “an expansive and flexible approach” to the question of obviousness. KSR, 550 U.S. at 415. Whether a patent claiming the combination of prior art elements would have been obvious is determined by whether the improvement is more than the predictable use of prior art elements according to their established functions. Id. at 417. Reaching this conclusion, however, requires more than a mere showing that the prior art includes separate references covering each separate limitation in a claim under examination. Unigene Labs., Inc. v. Apotex, Inc., 655 F.3d 1352, 1360 (Fed. Cir. 2011). Rather, obviousness requires the additional showing that a person of ordinary skill would have selected and combined those prior art elements in the normal course of research and development to yield the claimed invention. Id.

B. Level of Ordinary Skill

In determining whether an invention would have been obvious at the time it was made, we consider the level of ordinary skill in the pertinent art at the time of the invention. Graham v. John Deere Co., 383 U.S. 1, 17 (1966). “The importance of resolving the level of ordinary skill in the art lies in the necessity of maintaining objectivity in the obviousness inquiry.” Ryko Mfg. Co. v. Nu-Star, Inc., 950 F.2d 714, 718 (Fed. Cir. 1991).

4 The current record does not present or address any evidence of nonobviousness.
Petitioner contends that a person of ordinary skill in the art “would have had at least a bachelor’s degree in computer science, electrical engineering, or a related field, and at least two years of academic or industry experience in software development related to content management systems and user interfaces,” and that “[m]ore education can supplement practical experience and vice-versa.” Pet. 8 (citing Ex. 1002 ¶ 23).

Patent Owner does not provide a description of a person of ordinary skill in the art but “does not dispute Petitioner’s proposed level of skill.” PO Resp. 26.

Petitioner’s description of the level of ordinary skill is generally consistent with the subject matter of the ’228 Patent, with the exception of the qualifier “at least,” which creates a vagueness that may extend the level to that reflecting an expert. Based on the record presented, including our review of the ’228 patent and the types of problems and solutions described in the ’228 patent and the cited prior art, we determine that a person of ordinary skill in the art is a person with a bachelor’s degree in computer science, electrical engineering, or a related field, with two years of academic or industry experience in software development related to content management systems and user interfaces.

C. Claim Construction

Pursuant to 37 C.F.R. § 42.100(b), we apply the claim construction standard set forth in Phillips v. AWH Corp., 415 F.3d 1303 (Fed. Cir. 2005) (en banc). Under Phillips, claim terms are generally given their ordinary and customary meaning as would be understood by one with ordinary skill in the art in the context of the specification, the prosecution history, other claims, and even extrinsic evidence including expert and inventor testimony, dictionaries, and learned treatises, although extrinsic evidence is less significant than the intrinsic record. Phillips, 415 F.3d at 1312–17. Usually, the specification is dispositive, and it is the single best guide to the meaning of a disputed term. Id. at 1315.
Only terms that are in controversy need to be construed, and then only to the extent necessary to resolve the controversy. Nidec Motor Corp. v. Zhongshan Broad Ocean Motor Co. Matal, 868 F.3d 1013, 1017 (Fed. Cir. 2017) (in the context of an inter partes review, applying Vivid Techs., Inc. v. Am. Sci. & Eng’g, Inc., 200 F.3d 795, 803 (Fed. Cir. 1999)).

Petitioner asserts that “no terms of the ’228 patent warrant construction beyond their ordinary and customary meaning.” Pet. 8.

Patent Owner “does not believe claim construction is required because the plain and ordinary meaning of the claims is clear.” PO Resp. 26.

For purposes of this Decision, we agree with the parties that no claim terms require express construction. See Vivid Techs., 200 F.3d at 803 (holding that only terms that are in controversy need to be construed, and “only to the extent necessary to resolve the controversy”). To the extent that the meaning of any claim term is addressed, we use its ordinary and customary meaning as discussed in our analysis below.

D. Relevant Prior Art

1. Okamura (Ex. 1004)

Okamura is titled “Information Processing Apparatus, Information Processing Method, and Program” and “relates to . . . an information processing apparatus which displays contents such as image files, an information processing method, and a program for causing a computer to execute the information processing method.” Ex. 1004, code (54), ¶ 2. The image files may be digital files, such as “image files recorded by an image capturing apparatus such as a digital still camera.” Okamura’s information processing apparatus (i) calculates transformed coordinates for each of a plurality of superimposed images associated with coordinates in a background image, by transforming coordinates of the other superimposed images with respect to one superimposed image as a reference image in such a way that coordinate intervals within a predetermined area with respect to the reference image become denser with increasing distance from the reference image toward the boundary; (ii) sets coordinates of the reference image on the basis of a mean value obtained by calculating a mean of the calculated coordinates of the other superimposed images with respect to the reference image; and (iii) displays the background image and the plurality of superimposed images on a display section in such a way that the reference image is placed at the set coordinates in the background image. Id. at code (57) (Abstract), ¶ 91.
In Okamura, in accordance with an operational input for activating a content playback application, an index screen is displayed on a display. Id. ¶ 233. An index screen is a display screen that displays a listing of clusters (including image files, such as still image files) from which to select a desired cluster. Id. ¶¶ 125, 233, 139 (“Clustering refers to grouping (classifying) together a plurality of pieces of data within a short distance from each other in a data set,” where “[t]he distance between contents refers to the distance between the positions (such as geographical positions, positions along the temporal axis, or positions along the axis representing the similarity between faces) of two points corresponding to contents. A cluster is a unit in which contents are grouped together by clustering.”). Examples of displays of index screens are shown in Figures 18 to 21. Id. ¶ 233. When a desired cluster is determined by a user operation on the index screen shown in each of Figures 18 to 21, a content playback screen is thereafter displayed. Id. ¶ 248.
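The kind of grouping Okamura describes in ¶ 139, collecting contents whose pairwise distance (geographic, temporal, or face-similarity) is short, can be sketched as a simple single-link grouping over geographic coordinates. This sketch is illustrative only and is not Okamura's algorithm; the function names, the Euclidean distance measure, and the threshold value are all assumptions.

```python
# Illustrative sketch (not Okamura's actual method) of distance-based
# clustering: contents within a threshold distance of an existing cluster
# member join that cluster; otherwise they start a new cluster.

import math


def _dist(a, b):
    # Plain Euclidean distance over (x, y) coordinate pairs. Okamura's
    # "distance" may instead be temporal or a face-similarity measure.
    return math.hypot(a[0] - b[0], a[1] - b[1])


def cluster(points, threshold):
    """Greedy single-link grouping of coordinate pairs.

    A point joins the first cluster containing any member within
    `threshold` of it; otherwise it seeds a new cluster.
    """
    clusters = []
    for p in points:
        for c in clusters:
            if any(_dist(p, q) <= threshold for q in c):
                c.append(p)
                break
        else:
            clusters.append([p])
    return clusters
```

For example, with a threshold of 1.0, two nearby points and one distant point would fall into two clusters, analogous to two nearby photos sharing one cluster map while a far-away photo gets its own.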
Figures 18 and 19 of Okamura, reproduced below, are examples of displays of index screens that display cluster maps as index images. Id. ¶¶ 38–39, 234.

[Figure 18 of Okamura reproduced here]

Figure 18 is an example of a display of an index screen. Id. ¶ 38.

[Figure 19 of Okamura reproduced here]

Figure 19 is another example of a display of an index screen. Id. ¶ 39.

As shown above in Figures 18 and 19, a cursor (mouse pointer) 419 that moves with the movement of a mouse is displayed on the screen shown on the display. Id. ¶ 234. The cursor 419 is used to point to an object of instruction or operation on the screen displayed on the display section 181. Id. As shown in Figure 18, an “EVENT” tab 411, a “FACE” tab 412, a “PLACE” tab 413, a cluster map display area 414, and left and right buttons 415 and 416 are provided on an index screen 410. Id. ¶ 235. The “EVENT” tab 411, “FACE” tab 412, and “PLACE” tab 413 are tabs for displaying another index screen. Id. ¶ 236. In the cluster map display area 414, a listing of marks (cluster maps) representing clusters is displayed. Id. ¶ 237. For example, as shown in Figure 18, cluster maps of the same size are displayed in a 3×5 matrix fashion. Id. ¶ 237.
When the mouse is placed over a cluster map 417 by a user operation on index screen 410 shown in Figure 18, as shown in Figure 19, the color of the cluster map 417 is changed, and pieces of information 418 related to the cluster map 417 are displayed. Id. ¶ 240. For example, the entire cluster map 417 is changed to a conspicuous color (for example, grey) and displayed. Id. As the pieces of information 418 related to the cluster map 417, for example, the number of contents “28” belonging to a cluster corresponding to the cluster map 417 and the cluster title “Mt. Fuji” of the cluster are displayed. Id. Also displayed as pieces of information 418, for example, is information on the latitude and longitude of the center position of the cluster corresponding to the cluster map 417, “Lat. 35°21’N, Long. 138°43’E.” Id. As pieces of information 418 related to cluster map 417, information indicating the size of the cluster may also be displayed. Id. ¶ 241. For example, the diameter of a circle corresponding to the cluster can be displayed in kilometers. Id. In order to allow the user to intuitively grasp whether the size of a circle corresponding to a cluster is large or small, the display of icons or color can be made to differ depending on whether the size is large or not. Id. More particularly, Okamura explains:

when comparing an urban area and a rural area with each other, it is supposed that while buildings, roads, and the like are densely packed in the urban area, in the rural area, there are relatively many mountains, farms, and the like, and there are relatively few buildings, roads, and the like. For this reason, the amount of information in a map often differs between the urban area and the rural area. Due to this difference in the amount of information in a map, it is supposed that when cluster maps of the urban area and rural area are displayed simultaneously, the user feels a difference in the perceived sense of scale between the urban area and the rural area. Accordingly, for example, by displaying these cluster maps in different manners depending on whether the size of a circle corresponding to a cluster is large or small, it is possible to prevent a difference i
