US010664518B2

(12) United States Patent
McKinnon et al.

(10) Patent No.: US 10,664,518 B2
(45) Date of Patent: *May 26, 2020
(54) WIDE AREA AUGMENTED REALITY LOCATION-BASED SERVICES

(71) Applicant: Nant Holdings IP, LLC, Culver City, CA (US)

(72) Inventors: David McKinnon, San Francisco, CA (US); Kamil Wnuk, Playa del Rey, CA (US); Jeremi Sudol, New York, NY (US); Matheen Siddiqui, Culver City, CA (US); John Wiacek, Los Angeles, CA (US); Bing Song, La Canada, CA (US); Nicholas J. Witchey, Laguna Hills, CA (US)

(73) Assignee: Nant Holdings IP, LLC, Culver City, CA (US)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days. This patent is subject to a terminal disclaimer.

(21) Appl. No.: 16/168,419
(22) Filed: Oct. 23, 2018

(65) Prior Publication Data

     US 2019/0057113 A1    Feb. 21, 2019

Related U.S. Application Data

(63) Continuation of application No. 15/794,993, filed on Oct. 26, 2017, now Pat. No. 10,140,317, which is a continuation of application No. 15/406,146, filed on Jan. 13, 2017, now Pat. No. 9,817,848, which is a continuation of application No. 14/517,728, filed on Oct. 17, 2014, now Pat. No. 9,582,516.

(60) Provisional application No. 61/892,238, filed on Oct. 17, 2013.
(51) Int. Cl.
     G09G 5/00       (2006.01)
     G06F 16/58      (2019.01)
     G06T 19/00      (2011.01)
     G06F 16/29      (2019.01)
     G06F 16/50      (2019.01)
     G06F 16/583     (2019.01)
     G06F 16/9535    (2019.01)
     G06T 15/20      (2011.01)

(52) U.S. Cl.
     CPC ... G06F 16/5866 (2019.01); G06F 16/29 (2019.01); G06F 16/50 (2019.01); G06F 16/58 (2019.01); G06F 16/5854 (2019.01); G06F 16/9535 (2019.01); G06T 15/20 (2013.01); G06T 19/003 (2013.01); G06T 19/006 (2013.01); H05K 999/99 (2013.01); G06T 2219/024 (2013.01)

(58) Field of Classification Search
     CPC ... G06T 19/006
     See application file for complete search history.
(56) References Cited

U.S. PATENT DOCUMENTS

5,625,765 A    4/1997    Ellenby et al.
5,682,332 A    10/1997   Ellenby et al.
(Continued)

FOREIGN PATENT DOCUMENTS

EP    1 012 725       6/2000
EP    1 246 080 A2    10/2002
(Continued)
OTHER PUBLICATIONS

International Search Report and Written Opinion issued in International Application No. PCT/US2012/032204 dated Oct. 29, 2012.
(Continued)

Primary Examiner - Charles Tseng
(74) Attorney, Agent, or Firm - Mauriel Kapouytian Woods LLP; Andrew A. Noble

(57) ABSTRACT

Apparatus, methods and systems of providing AR content are disclosed. Embodiments of the inventive subject matter can obtain an initial map of an area, derive views of interest, obtain AR content objects associated with the views of interest, establish experience clusters and generate a tile map tessellated based on the experience clusters. A user device could be configured to obtain and instantiate at least some of the AR content objects based on at least one of a location and a recognition.

39 Claims, 6 Drawing Sheets
[Representative drawing (from FIG. 5): view(s) of interest; descriptors associated with view(s) of interest; AR content object(s); initial map; field of interest; point of view origin; area tile map.]
(56) References Cited

U.S. PATENT DOCUMENTS
5,742,521 A     4/1998    Ellenby et al.
5,815,411 A     9/1998    Ellenby et al.
5,991,827 A     11/1999   Ellenby et al.
6,031,545 A     2/2000    Ellenby et al.
6,037,936 A     3/2000    Ellenby et al.
6,064,398 A     5/2000    Ellenby et al.
6,064,749 A     5/2000    Hirota et al.
6,098,118 A     8/2000    Ellenby et al.
6,130,673 A     10/2000   Pulli et al.
6,173,239 B1    1/2001    Ellenby
6,278,461 B1    8/2001    Ellenby et al.
6,307,556 B1    10/2001   Ellenby et al.
6,396,475 B1    5/2002    Ellenby et al.
6,414,696 B1    7/2002    Ellenby et al.
6,522,292 B1    2/2003    Ellenby et al.
6,535,210 B1    3/2003    Ellenby et al.
6,690,370 B2    2/2004    Ellenby et al.
6,804,726 B1    10/2004   Ellenby et al.
7,016,532 B2    3/2006    Boncyk et al.
7,031,875 B2    4/2006    Ellenby et al.
7,245,273 B2    7/2007    Eberl et al.
7,301,536 B2    11/2007   Ellenby et al.
7,477,780 B2    1/2009    Boncyk et al.
7,529,639 B2    5/2009    Rasanen et al.
7,564,469 B2    7/2009    Cohen
7,565,008 B2    7/2009    Boncyk et al.
7,641,342 B2    1/2010    Eberl et al.
7,680,324 B2    3/2010    Boncyk et al.
7,696,905 B2    4/2010    Ellenby et al.
7,710,395 B2    5/2010    Rodgers et al.
7,768,534 B2    8/2010    Pentenrieder et al.
7,774,180 B2    8/2010    Joussemet et al.
7,847,699 B2    12/2010   Lee et al.
7,889,193 B2    2/2011    Platonov et al.
7,899,915 B2    3/2011    Reisman
7,904,577 B2    3/2011    Taylor
7,907,128 B2    3/2011    Bathiche et al.
7,908,462 B2    3/2011    Sung
7,916,138 B2    3/2011    John et al.
8,218,873 B2    7/2012    Boncyk et al.
8,224,077 B2    7/2012    Boncyk et al.
8,224,078 B2    7/2012    Boncyk et al.
8,291,346 B2    10/2012   Kerr et al.
8,315,432 B2    11/2012   Lefevre et al.
8,321,527 B2    11/2012   Martin et al.
8,427,508 B2    4/2013    Mattila et al.
8,472,972 B2    6/2013    Nadler et al.
8,502,835 B1     8/2013    Meehan
8,519,844 B2     8/2013    Richey et al.
8,527,340 B2     9/2013    Fisher et al.
8,537,113 B2     9/2013    Weising et al.
8,576,756 B2     11/2013   Ko et al.
8,605,141 B2     12/2013   Dialameh et al.
8,606,657 B2     12/2013   Chesnut et al.
8,633,946 B2     1/2014    Cohen
8,700,060 B2     4/2014    Huang
8,711,176 B2     4/2014    Douris et al.
8,810,598 B2     8/2014    Soon-Shiong
8,872,851 B2     10/2014   Choubassi et al.
8,933,841 B2     1/2015    Valaee et al.
8,965,741 B2     2/2015    McCulloch et al.
9,128,520 B2     9/2015    Geisner et al.
9,129,644 B2     9/2015    Gay et al.
9,131,208 B2     9/2015    Jin
9,167,386 B2     10/2015   Valaee et al.
9,177,381 B2     11/2015   McKinnon
9,182,815 B2     11/2015   Small et al.
9,183,560 B2     11/2015   Abelow
9,230,367 B2     1/2016    Stroila
9,311,397 B2     4/2016    Meadow et al.
9,396,589 B2     7/2016    Soon-Shiong
9,482,528 B2     11/2016   Baker et al.
9,495,591 B2     11/2016   Visser et al.
9,536,251 B2     1/2017    Huang et al.
9,582,516 B2     2/2017    McKinnon et al.
9,817,848 B2     11/2017   McKinnon et al.
9,824,501 B2     11/2017   Soon-Shiong
10,127,733 B2    11/2018   Soon-Shiong
10,140,317 B2    11/2018   McKinnon et al.
2002/0163521 A1    11/2002   Ellenby et al.
2004/0203380 A1    10/2004   Hamdi et al.
2005/0024501 A1    2/2005    Ellenby et al.
2005/0208457 A1    9/2005    Fink et al.
2005/0285878 A1    12/2005   Singh et al.
2005/0289590 A1    12/2005   Check et al.
2006/0025229 A1    2/2006    Mahajan et al.
2006/0038833 A1    2/2006    Mallinson et al.
2006/0047704 A1    3/2006    Gopalakrishnan
2006/0161379 A1    7/2006    Ellenby et al.
2006/0190812 A1    8/2006    Ellenby et al.
2007/0109619 A1    5/2007    Eberl et al.
2007/0146391 A1    6/2007    Pentenrieder et al.
2007/0182739 A1    8/2007    Platonov et al.
2008/0024594 A1    1/2008    Ritchey
2008/0154538 A1    6/2008    Stathis
2008/0157946 A1    7/2008    Eberl et al.
2008/0198159 A1    8/2008    Liu et al.
2008/0198222 A1    8/2008    Gowda
2009/0003662 A1    1/2009    Joseph et al.
2009/0081959 A1    3/2009    Gyorfi et al.
2009/0102859 A1    4/2009    Athsani et al.
2009/0167787 A1    7/2009    Bathiche et al.
2009/0193055 A1    7/2009    Kuberka et al.
2009/0210486 A1    8/2009    Lim
2009/0237546 A1    9/2009    Bloebaum et al.
2009/0271160 A1    10/2009   Copenhagen et al.
2009/0271715 A1    10/2009   Tumuluri
2010/0017722 A1    1/2010    Cohen
2010/0023878 A1    1/2010    Douris et al.
2010/0045933 A1    2/2010    Eberl et al.
2010/0113157 A1    5/2010    Chin et al.
2010/0188638 A1    7/2010    Eberl et al.
2010/0189309 A1    7/2010    Rouzes et al.
2010/0208033 A1    8/2010    Edge et al.
2010/0217855 A1    8/2010    Przybysz et al.
2010/0246969 A1    9/2010    Winder et al.
2010/0257252 A1    10/2010   Dougherty et al.
2010/0287485 A1    11/2010   Bertolami et al.
2010/0315418 A1    12/2010   Woo
2010/0321540 A1    12/2010   Woo et al.
2010/0325154 A1    12/2010   Schloter et al.
2011/0038634 A1    2/2011    DeCusatis et al.
2011/0221771 A1    9/2011    Cramer et al.
2011/0279445 A1    11/2011   Murphy et al.
2011/0316880 A1    12/2011   Ojala et al.
2012/0105474 A1    5/2012    Cudalbu et al.
(56) References Cited

U.S. PATENT DOCUMENTS
2012/0105475 A1    5/2012    Tseng et al.
2012/0113141 A1    5/2012    Zimmerman et al.
2012/0122570 A1    5/2012    Baronoff
2012/0127201 A1    5/2012    Kim et al.
2012/0219181 A1    8/2012    Tseng et al.
2012/0231891 A1    9/2012    Watkins, Jr. et al.
2012/0244950 A1    9/2012    Braun
2012/0276997 A1    11/2012   Chowdhary et al.
2012/0293506 A1    11/2012   Vertucci et al.
2012/0302129 A1    11/2012   Persaud et al.
2013/0050496 A1    2/2013    Jeong
2013/0064426 A1    3/2013    Watkins, Jr. et al.
2013/0073988 A1    3/2013    Groten et al.
2013/0128060 A1    5/2013    Rhoads et al.
2013/0159096 A1    6/2013    Santhanagopal et al.
2013/0176202 A1    7/2013    Gervautz
2014/0161323 A1    6/2014    Livyatan et al.
2014/0184749 A1    7/2014    Hilliges et al.
2015/0172626 A1    6/2015    Martini
FOREIGN PATENT DOCUMENTS

EP    1 354 260          10/2003
EP    1 119 798 B1       3/2005
EP    2 207 113 A1       7/2010
JP    2010-118019 A      5/2010
JP    2011-153324 A      8/2011
JP    2011-253324 A      12/2011
KR    2010-0124947 A     11/2010
KR    10-1171264 B1      8/2012
WO    97/44737 A1        11/1997
WO    99/42946 A2        8/1999
WO    99/42947 A2        8/1999
WO    00/20929 A1        4/2000
WO    01/63487 A1        8/2001
WO    01/71282 A1        9/2001
WO    02/03091 A2        1/2002
WO    02/059716 A2       8/2002
WO    02/073818 A1       9/2002
WO    2007/140155 A2     12/2007
WO    2010/079876 A1     7/2010
WO    2010/138344 A2     12/2010
WO    2011/028720 A1     3/2011
WO    2013/023705 A1     2/2013
OTHER PUBLICATIONS

Wauters, "Stanford Graduates Release Pulse, A Must-Have News App for the iPad," Techcrunch.com, techcrunch.com/2010/05/31/pulse-ipad/, 2010.
Hickins, "A License to Pry," The Wall Street Journal, http://blogs.wsj.com/digits/2011/03/10/a-license-to-pry/tab/print/, 2011.
Notice of Reasons for Rejection issued in Japanese Patent Application No. 2014-503962 dated Sep. 22, 2014.
Notice of Reasons for Rejection issued in Japanese Patent Application No. 2014-503962 dated Jun. 30, 2015.
European Search Report issued in European Patent Application No. 12767566.8 dated Mar. 20, 2015.
"3D Laser Mapping Launches Mobile Indoor Mapping System," 3D Laser Mapping, Dec. 3, 2012, 1 page.
Banwell et al., "Combining Absolute Positioning and Vision for Wide Area Augmented Reality," Proceedings of the International Conference on Computer Graphics Theory and Applications, 2010, 4 pages.
Li et al., "3-D Motion Estimation and Online Temporal Calibration for Camera-IMU Systems," Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 2013, 8 pages.
Li et al., "High-fidelity Sensor Modeling and Self-Calibration in Vision-aided Inertial Navigation," Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 2014, 8 pages.
Li et al., "Online Temporal Calibration for Camera-IMU Systems: Theory and Algorithms," International Journal of Robotics Research, vol. 33, Issue 7, 2014, 16 pages.
Li et al., "Real-time Motion Tracking on a Cellphone using Inertial Sensing and a Rolling-Shutter Camera," Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 2013, 8 pages.
Mourikis, "Method for Processing Feature Measurements in Vision-Aided Inertial Navigation," 3 pages, 2013.
Mourikis et al., "Methods for Motion Estimation With a Rolling-Shutter Camera," Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Karlsruhe, Germany, May 6-10, 2013, 9 pages.
Panzarino, "What Exactly WiFiSlam Is, and Why Apple Acquired It," http://thenextweb.com/apple/2013/03/26/what-exactly-wifislam-is-and-why-apple-acquired-it, Mar. 26, 2013, 10 pages.
Vondrick et al., "HOGgles: Visualizing Object Detection Features," IEEE International Conference on Computer Vision (ICCV), 2013, 9 pages.
Vu et al., "High Accuracy and Visibility-Consistent Dense Multiview Stereo," IEEE Transactions on Pattern Analysis and Machine Intelligence, 2012, vol. 34, No. 5, 13 pages.
International Search Report and Written Opinion issued in International Application No. PCT/US2014/061283 dated Aug. 5, 2015, 11 pages.
Pang et al., "Development of a Process-Based Model for Dynamic Interaction in Spatio-Temporal GIS," GeoInformatica, 2002, vol. 6, No. 4, pp. 323-344.
Zhu et al., "The Geometrical Properties of Irregular 2D Voronoi Tessellations," Philosophical Magazine A, 2001, vol. 81, No. 12, pp. 2765-2783.
U.S. Appl. No. 16/186,405, filed Nov. 9, 2018.
"S2 Cells," S2Geometry, https://s2geometry.io/devguide/s2cell_hierarchy, 27 pages, Oct. 10, 2019.
U.S. Appl. No. 16/557,963, filed Aug. 30, 2019.
U.S. Patent        May 26, 2020        Sheet 1 of 6        US 10,664,518 B2

[FIG. 1: schematic of system 100 — a map generation engine (102) and an object generation engine (104), coupled over networks with an area database holding image data (112), video data (114), signal data (116) and initial map data (118), and with an AR management engine (130) that relates an initial map, view(s) of interest, descriptors, AR content object(s), experience cluster(s) and an area tile map.]

Figure 1
U.S. Patent        May 26, 2020        Sheet 2 of 6        US 10,664,518 B2

[FIG. 2: generation of an initial map — a user interface feeds a map generation engine (202), which stores signal data (202A), video data (202B) and image data (202C) in an area database.]

Figure 2
U.S. Patent        May 26, 2020        Sheet 3 of 6        US 10,664,518 B2

[FIG. 3: derivation of views of interest — obtain initial map and area data; recognize area characteristics and objects within the area; obtain descriptors for recognized objects; associate recognized objects; generate view(s) of interest.]

Figure 3
U.S. Patent        May 26, 2020        Sheet 4 of 6        US 10,664,518 B2

[FIG. 4: generation of an AR content database — user interfaces connected over networks to an object generation engine and a descriptor database that associate descriptors (A, B, C) with content objects; the AR content database holds video AR object(s), image AR object(s) and audio AR object(s).]

Figure 4
U.S. Patent        May 26, 2020        Sheet 5 of 6        US 10,664,518 B2

[FIG. 5: generation of a tessellated area map — an AR management engine relates view(s) of interest, descriptors associated with the view(s) of interest, AR content object(s) and an initial map (518A); a field of interest and point of view origin are marked on the map, which is tessellated into an area tile map.]

Figure 5
U.S. Patent        May 26, 2020        Sheet 6 of 6        US 10,664,518 B2

[FIG. 6: an area tile map based on view of interest clusters, with labeled views of interest, fields of view, and points of view origin.]

Figure 6
WIDE AREA AUGMENTED REALITY LOCATION-BASED SERVICES
This application is a continuation of U.S. application Ser. No. 15/794,993, filed Oct. 26, 2017, which is a continuation of U.S. application Ser. No. 15/406,146, filed Jan. 13, 2017, which is a continuation of U.S. application Ser. No. 14/517,728, filed Oct. 17, 2014, which claims priority to U.S. Provisional Application No. 61/892,238, filed Oct. 17, 2013. These and all other extrinsic references cited herein are incorporated by reference in their entirety.
FIELD OF THE INVENTION

The field of the invention is augmented reality service technologies.
BACKGROUND

The following description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.

As advances in technology continue to be developed, the utilization of Augmented Reality (AR) to enhance experiences is becoming increasingly popular. Various entities have attempted to capitalize on this increasing popularity by providing AR content to users based on specific types of object recognition or location tracking.

For example, U.S. Pat. No. 8,519,844 to Richey et al., filed on Jun. 30, 2010, contemplates accessing first and second location data, wherein the second location data has increased accuracy regarding the location of a device, and communicating augmented data to the device based on the location data.

The '844 patent and all other publications identified herein are incorporated by reference to the same extent as if each individual publication or patent application were specifically and individually indicated to be incorporated by reference. Where a definition or use of a term in an incorporated reference is inconsistent or contrary to the definition of that term provided herein, the definition of that term provided herein applies and the definition of that term in the reference does not apply.

Another example of location-based content services, while not directed to AR content, can be found in U.S. Pat. No. 8,321,527 to Martin, et al., filed on Sep. 10, 2009, which describes a system for scheduling content distribution to a mobile device by storing different locations, collecting user location data over a period of time, collecting wireless signal strength data, and scheduling pre-caching of content to the device if the user is predicted to be at a location with poor signal strength.

Still further, various other examples of systems and methods for providing content to a user based on a location or other parameters can be found in International Patent Application Publication Number WO 2013/023705 to Hoffman, et al., filed on Aug. 18, 2011, International Patent Application Publication Number WO 2007/140155 to Leonard, et al., filed on May 21, 2007, U.S. Patent Application Publication Number 2013/0003708 to Ko, et al., filed on Jun. 28, 2011, U.S. Patent Application Publication Number 2013/0073988 to Groten, et al., filed on Jun. 1, 2011, and U.S. Patent Application Publication Number 2013/0124326 to Huang, et al., filed on Nov. 15, 2011.
While some of the known references contemplate refining location identification or pre-caching content based on location information, they fail to consider that areas have various views of interest, and fail to differentiate between sub-areas based on AR content densities. Viewed from another perspective, known location-based systems fail to contemplate segmenting an area into clusters based on what is viewable or what AR content is available.

Thus, there is still a need for improved AR service technologies, and especially location-based AR service technologies.
SUMMARY OF THE INVENTION

The inventive subject matter provides apparatuses, systems and methods in which AR content is provided to one or more user devices based on at least one of location identification and object recognition. In some contemplated aspects, the user device could be auto-populated with AR content objects based on a location, and the AR content objects could be instantiated based on object recognition within the location.
One aspect of the inventive subject matter includes a content management system comprising a content management engine coupled with an area database and a content database. The content management engine can be configured to communicate with the databases and perform various steps in order to provide content objects to a device for modification or instantiation.
The area database could be configured to store area data related to an area of interest. This area data could comprise image data, video image data, real-time image data, still image data, signal data (e.g., Compressive Sensing of Signals (CSS) data, Received Signal Strength (RSS), WiFi signal data, beacon signal data, etc.), audio data, an initial map (e.g., CAD drawing, 3-dimensional model, blueprint, etc.), or any other suitable data related to a layout of an area.

The content database could be configured to store augmented reality or other digital content objects of various modalities, including for example, image content objects, video content objects, or audio content objects. It is contemplated that the content objects could be associated with one or more real world objects viewable from an area of interest.
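[Editorial illustration] By way of illustration only, the following Python sketch models the kinds of records such databases might hold; the classes and field names are illustrative assumptions, not definitions from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class AreaRecord:
    """Hypothetical area-data record (names are illustrative, not from the patent)."""
    area_id: str
    image_data: list = field(default_factory=list)   # still or real-time images
    video_data: list = field(default_factory=list)
    signal_data: dict = field(default_factory=dict)  # e.g., {"wifi": [...], "beacon": [...]}
    initial_map: object = None                       # CAD drawing, 3-D model, blueprint, etc.

@dataclass
class ContentObject:
    """Hypothetical AR content object of some modality (image, video, or audio)."""
    object_id: str
    modality: str          # "image" | "video" | "audio"
    payload_uri: str
    associated_real_world_objects: list = field(default_factory=list)
```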
Viewed from another perspective, a content management engine of the inventive subject matter could comprise an AR management engine that is configured to obtain an initial map of an area of interest from the area data within the area database. The step of obtaining the initial map could comprise obtaining a CAD, blueprint, 3-D model, a robot- or drone-created map, or other representation from the area database itself, or could comprise obtaining area data such as image data, signal data, video data, audio data, views data, viewable object data, points of interest data, field of view data, etc. to generate the initial map.
The AR management engine could then derive a set of views of interest from at least one of the initial map and other area data. The views of interest are preferably representative of where people would, should, or could be looking while navigating through various portions of the area of interest. The views of interest could be derived by the map generation engine, or via recommendations, requests or other inputs of one or more users (e.g., potential viewer, advertiser, manager, developer, etc.), could be created manually by a systems manager or other user, or could be modeled based on some or all of the area data.
The views of interest could comprise, among other things, a view-point origin, a field of interest, an owner, metadata, a direction (e.g., a vector, an angle, etc.), an orientation (e.g., pitch, yaw, roll, etc.), a cost, a search attribute, a descriptor set, an object of interest, or any combination or multiples thereof. For example, a view of interest could comprise a view-point origin (i.e., point of view origin), at least one field of interest, and a viewable object of interest. Another view of interest could comprise a view-point origin, at least two fields of interest, and a viewable object of interest.
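[Editorial illustration] A minimal data-model sketch of a view of interest as just described; all names are illustrative assumptions rather than the patent's definitions.

```python
from dataclasses import dataclass, field

@dataclass
class FieldOfInterest:
    """Hypothetical field of interest: a viewing direction plus orientation."""
    direction: tuple          # e.g., a unit vector (x, y, z)
    pitch: float = 0.0        # orientation components, in degrees
    yaw: float = 0.0
    roll: float = 0.0

@dataclass
class ViewOfInterest:
    """A view-point origin plus one or more fields of interest and a viewable object."""
    origin: tuple                                    # point of view origin (x, y, z)
    fields: list = field(default_factory=list)       # one or more FieldOfInterest
    object_of_interest: str = ""                     # identifier of the viewable object
    descriptors: list = field(default_factory=list)  # descriptor set used for recognition
    owner: str = ""
    cost: float = 0.0
```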
Once the views of interest have been derived, the AR management engine could obtain a set of AR content objects (e.g., a virtual object, chroma key content, digital image, digital video, audio data, application, script, promotion, advertisement, game, workflow, kinesthetic, tactile, lesson plan, etc.) from the AR content database. Each of the AR content objects will preferably be related to one or more of the derived views of interest. The AR content objects could be selected for obtaining based on one or more of the following: a search query, an assignment of content objects to a view of interest or object of interest within the view, one or more characteristics of the initial map, a context of an intended user of a user device (e.g., a potential viewer, advertiser, manager, developer, etc.), or a recommendation, selection or request of a user.
The AR management engine could then establish AR experience clusters within the initial map as a function of the AR content objects obtained and views of interest derived. These clusters will preferably represent a combination of the views of interest and related information, and a density or other characteristic of AR content objects related to the views of interest. Viewed from another perspective, each cluster could represent a subset of the derived views of interest and associated AR content objects.
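[Editorial illustration] The disclosure names K-means clustering among the options below; as one plausible reading only, view-point origins could be grouped with off-the-shelf k-means, weighting each view by its AR content density. A minimal sketch under those assumptions (scikit-learn usage and names are assumptions, not the patented method):

```python
import numpy as np
from sklearn.cluster import KMeans  # assumes scikit-learn is available

def build_experience_clusters(views, contents_per_view, k=5):
    """Group views of interest into k experience clusters.

    views: list of (x, y) view-point origins.
    contents_per_view: number of AR content objects bound to each view, used
    as a sample weight so content-dense views dominate cluster placement.
    """
    origins = np.asarray(views, dtype=float)
    weights = np.asarray(contents_per_view, dtype=float)
    km = KMeans(n_clusters=k, n_init=10, random_state=0)
    labels = km.fit_predict(origins, sample_weight=weights)
    return labels, km.cluster_centers_  # cluster id per view, and cluster centroids
```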
Based on the AR experience clusters or information related thereto, the AR management engine could generate a tile map comprising tessellated tiles (e.g., regular or non-regular (e.g., semi-regular, aperiodic, etc.), Voronoi tessellation, Penrose tessellation, K-means cluster, etc.) that cover at least a portion of the area of interest. Some or all of the tiles could advantageously be individually bound to a subset of the obtained AR content objects, which can comprise overlapping or completely distinct subsets. Additionally or alternatively, the tiles could be associated with one or more of an identification, an owner, an object of interest, a set of descriptors, an advertiser, a cost, or a time. Still further, it is contemplated that the tiles could be dynamic in nature such that the tessellation of the area could change based on an event or a time. Contemplated events include, among other things, a sale, a news event, a publication, a change in inventory, a disaster, a change in advertiser, or any other suitable event. It is also contemplated that a view-point origin, a field of interest, a view or an object of interest could be dynamic in nature.
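[Editorial illustration] For instance, a Voronoi tessellation seeded at the cluster centroids partitions the area so that every point falls in the tile of its nearest cluster. A minimal sketch, assuming scipy is available, with a helper that binds each tile to the content of its views (all names hypothetical):

```python
import numpy as np
from scipy.spatial import Voronoi  # one of several tessellation choices named above

def tessellate_area(cluster_centers):
    """Generate a Voronoi tile map whose seeds are the experience-cluster centroids."""
    return Voronoi(np.asarray(cluster_centers, dtype=float))

def bind_content_to_tiles(labels, content_ids_per_view):
    """Bind each tile (indexed by cluster id) to the AR content of its views."""
    tiles = {}
    for cluster_id, content_ids in zip(labels, content_ids_per_view):
        tiles.setdefault(int(cluster_id), set()).update(content_ids)
    return tiles  # tile id -> subset of AR content object ids
```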
The AR management engine could further configure a device (e.g., a mobile device, a kiosk, a tablet, a cell phone, a laptop, a watch, a vehicle, a server, a computer, etc.) to obtain at least a portion of the subset based on the tile map (e.g., based on the device's location in relation to the tiles of a tile map, etc.), and present at least a portion of the AR content objects on a display of the device (e.g., instantiate the object, etc.). It is contemplated that the device could compose a data center and be coupled with a cloud server.
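[Editorial illustration] For a Voronoi tiling, determining which tile contains the device reduces to a nearest-seed query; a minimal sketch of that lookup and of auto-populating the device with its tile's content subset (illustrative only):

```python
import numpy as np

def tile_for_location(device_xy, cluster_centers):
    """Return the tile (cluster) id whose Voronoi seed is nearest the device."""
    dists = np.linalg.norm(np.asarray(cluster_centers) - np.asarray(device_xy), axis=1)
    return int(np.argmin(dists))

def prefetch_content(device_xy, cluster_centers, tiles):
    """Auto-populate the device with the AR content subset bound to its tile."""
    return tiles.get(tile_for_location(device_xy, cluster_centers), set())
```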
Various objects, features, aspects and advantages of the inventive subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.
BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a schematic of a system of the inventive subject matter.
FIG. 2 is a schematic showing a generation of an initial map of an area of interest.
FIG. 3 provides an example overview of the derivation of views of interest.
FIG. 4 is a schematic showing a generation of an AR content database.
FIG. 5 is a schematic showing a generation of a tessellated area map.
FIG. 6 is a schematic showing an area tile map based on view of interest clusters.
DETAILED DESCRIPTION

It should be noted that while the following description is drawn to a computer/server based device interaction system, various alternative configurations are also deemed suitable and may employ various computing devices including servers, workstations, clients, peers, interfaces, systems, databases, agents, engines, controllers, modules, or other types of computing devices operating individually or collectively. One should appreciate that the use of such terms is deemed to represent computing devices comprising at least one processor configured or programmed to execute software instructions stored on a tangible, non-transitory computer readable storage medium (e.g., hard drive, FPGA, solid state drive, RAM, flash, ROM, memory, distributed memory, etc.). The software instructions preferably configure the computing device to provide the roles, responsibilities, or other functionality as discussed below with respect to the disclosed apparatus. Further, the disclosed technologies can be embodied as a computer program product that includes a non-transitory computer readable medium storing the software instructions that cause a processor to execute the disclosed steps. In especially preferred embodiments, the various servers, systems, databases, or interfaces exchange data using standardized protocols or algorithms, possibly based on HTTP, HTTPS, AES, public-private key exchanges, web service APIs, known financial transaction protocols, or other electronic information exchanging methods. Data exchanges among devices can be conducted over a packet-switched network, the Internet, LAN, WAN, VPN, or other type of packet switched network; a circuit switched network; cell switched network; or other type of network.

One should appreciate that the disclosed techniques provide many advantageous technical effects including providing augmented reality content to a user device based on a precise location of the user device relative to one or more tiles of a tessellated area associated with view(s) of interest.

The following discussion provides many example embodiments of the inventive subject matter. Although each embodiment represents a single combination of inventive elements, the inventive subject matter is considered to include all possible combinations of the disclosed elements. Thus if one embodiment comprises elements A, B, and C, and a second embodiment comprises elements B and D, then the inventive subject matter is also considered to include other remaining combinations of A, B, C, or D, even if not explicitly disclosed.
A system of the inventive subject matter could advantageously identify a location of a device at or near a tile of a tessellated area of interest and auto-populate the device with pre-selected content objects based upon the identified location.
Exemplary systems and methods for identifying a location of a user or device within or near a tile can be found in U.S. pre-grant publication number 2014/0011518 to Valaee, et al., entitled "System, Method And Computer Program For Dynamic Generation Of A Radio Map," and U.S. pre-grant publication 2012/0149415, to Valaee, et al., entitled "System, Method and Computer Program for Anonymous Localization."
Where the device is configured or programmed to capture image or other sensor data (e.g., orientation data, position data, etc.) that indicates that an object is viewable by a user of the device, the system can cause the device to instantiate some or all of the content objects based on an association between the viewable object(s) and the content object(s) (e.g., based on at least one of object recognition, orientation, location, etc.). The instantiated AR content object could be presented in any suitable manner, including for example, as an occlusion mask, behind one or more objects, behind an object and in front of a different object, or as a moving object across an object of interest.
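[Editorial illustration] A hypothetical sketch of that trigger logic, with descriptor matching left abstract; the match_score callable and threshold are assumptions, not the patent's method:

```python
def instantiate_for_views(frame_descriptors, views_in_tile, match_score, threshold=0.8):
    """Instantiate AR content when a view's object of interest is recognized.

    frame_descriptors: features extracted from the device's camera frame.
    views_in_tile: ViewOfInterest records bound to the device's current tile.
    match_score: callable scoring frame descriptors against a view's descriptor set.
    """
    instantiated = []
    for view in views_in_tile:
        if match_score(frame_descriptors, view.descriptors) >= threshold:
            # Presentation could be an occlusion mask, behind/in front of other
            # objects, or a moving overlay, per the passage above.
            instantiated.append(view.object_of_interest)
    return instantiated
```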
FIG. 1 is a schematic of an exemplary system 100 of the inventive subject matter. System 100 comprises a map generation engine 102, which can be leveraged by one or more users to capture, generate, or otherwise obtain area data related to an area of interest. Among other suitable data, area data could comprise image data 112 (e.g., still image data, real-time image data, etc.), video data 114, signal data 116 (e.g., CSS data, RSS data, WiFi signal data, beacon signal data, etc.), and/or initial maps 118 that could be transmitted to and stored in area database 110 via network 105. AR management engine 130, coupled with area database 110 via network 125, can be configured to obtain an initial map 118A related to an area of interest from area database 110, or could be configured to obtain other area data and generate initial map 118A based on the obtained data.
An area of interest can be considered generally to be a real-world space, area or setting selected within which the processes and functions of the inventive subject matter will be carried out. The area of interest can be an a priori, user-defined area or an ad-hoc area generated by the system. For a priori defined areas, an area of interest can correspond to existing, predefined boundaries that can be physical (e.g., the physical boundaries of a road or a beachfront up to the water, the structural boundaries of a building, etc.), non-physical (e.g., a geographical boundary, geo-political boundary (e.g., a country border, an embassy's territory, etc.), geofence, territorial bound
