Displaying 99-112 of 112 results

1010 Exhibit: US20150116316A1 Fitzgerald

Document PGR2021-00067, No. 1010 Exhibit - US20150116316A1 Fitzgerald (P.T.A.B. Mar. 11, 2021)
... read as disclosing examples of ways such systems may be implemented and describing some possible features that may be implemented, specific components that may be utilized and certain benefits that may be achieved, though none ...

1007 Exhibit: Declaration of Gordon MacPherson

Document PGR2021-00067, No. 1007 Exhibit - Declaration of Gordon MacPherson (P.T.A.B. Mar. 11, 2021)
Abstract: We propose an Augmented Reality (AR) system that helps users take a picture from a designated pose, such as the position and camera angle of an earlier photo.
When the user succeeds in moving the camera inside the cone, the color of the ball turns green and the message “Position OK” appears in the camera’s viewer window (Figure 2(c)).

1019 Exhibit: Milgram 1995

Document PGR2021-00067, No. 1019 Exhibit - Milgram 1995 (P.T.A.B. Mar. 11, 2021)
These include the need for accurate and precise, low latency body and head tracking, accurate and precise calibration and viewpoint matching, adequate field of view, and the requirement for a snug (no-slip) but comfortable and preferably untethered head-mount.5,9 Other issues which present themselves are more perceptual in nature, including the conflicting effects of occlusion of apparently overlapping objects and other ambiguities introduced by a variety of factors which define the interactions between computer generated images and real object images.9 Perceptual issues become even more challenging when ST-AR systems are constructed to permit computer augmentation to be presented stereoscopically.10 Some of these technological difficulties can be partially alleviated by replacing the optical ST with a conformal video-based HMD, thereby creating what is known as "video see-through".
(Of course, as computer graphic and imaging technologies continue to advance, the day will certainly arrive in which it will not be immediately obvious whether the primary world is real or simulated, a situation corresponding to the centre of the RV continuum in Fig. 1).
In our prototype virtual control system24,25, we also create partial world models, by interactively teaching a telemanipulator important three dimensional information about volumetrically defined regions into which it must not stray, objects with which it must not collide, bounds which it is prohibited to exceed, etc. To illustrate how the EWK dimension might relate to the other classes of MR displays listed above, these have been indicated across the top of Fig. 2.
It is important to point out that this figure is actually a gross simplification of a complex topic, and in fact lumps together several classes of factors, such as display hardware, signal processing and graphic rendering techniques, etc., each of which could in turn be broken down into its own taxonomic elements.
... exocentric vs egocentric differences between MR classes ... The importance of the EPM dimension in our MR taxonomy is principally as a means of classifying metaphors depicted below the axis.
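The volumetric no-go regions Milgram describes for the prototype virtual control system amount to containment tests against interactively taught volumes. A minimal sketch, assuming the regions are modelled as axis-aligned bounding boxes (one possible volumetric definition; the function name and box representation are illustrative, not from the exhibit):

```python
def violates_no_go(point, no_go_boxes):
    """Return True if a telemanipulator position falls inside any taught
    no-go region. Each region is an assumed axis-aligned bounding box,
    given as a (min_corner, max_corner) pair of 3D coordinates."""
    for lo, hi in no_go_boxes:
        # The point is inside the box only if every coordinate lies
        # between the corresponding min and max bounds.
        if all(l <= p <= h for p, l, h in zip(point, lo, hi)):
            return True
    return False

# One forbidden region: a unit cube centred on the origin.
boxes = [((-0.5, -0.5, -0.5), (0.5, 0.5, 0.5))]
```

A controller would run such a check against every taught region before committing a motion command.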

1008 Exhibit: File History for US Serial No 15518378

Document PGR2021-00067, No. 1008 Exhibit - File History for US Serial No 15518378 (P.T.A.B. Mar. 11, 2021)

1002 Exhibit: Declaration of Dr Gregory F Welch, PhD

Document PGR2021-00067, No. 1002 Exhibit - Declaration of Dr Gregory F Welch, PhD (P.T.A.B. Mar. 11, 2021)
Thus, a POSITA reading the specification could conclude that “the virtual reality headset” of U.S. Patent No. 10,783,284 may be the virtual reality “module,” the virtual reality “component,” or the virtual reality “system,” or none of these items ...

1013 Exhibit: US20150097719A1 Balachandreswaran

Document PGR2021-00067, No. 1013 Exhibit - US20150097719A1 Balachandreswaran (P.T.A.B. Mar. 11, 2021)
The system uses dynamic scanning, active reference marker positioning, inertial measurement, imaging, mapping and rendering to generate an AR for a physical environment.
[0094] It will be appreciated that many physical environments, such as, for example, a building with a plurality of rooms, contain obstacles, such as walls, that are prone to break the path travelled by an emitted beam of an LRF.
[0103] As shown in FIG. 5, the HMD 12 may comprise a processing unit 130 to perform various processing functions, including mapping, imaging and rendering, and, in aspects, mediation of game play parameters and interactions with other users and their respective HMDs and peripherals; alternatively, the central console 11 shown in FIG. 3 may mediate the game play parameters and interactions between all the users and their respective HMDs and peripherals in the system.
[0104] The processor may collect data from the other components described herein, as shown in FIG. 2, including, for example, the camera system, the LPS and the scanning system to generate and apply AR renderings to captured image streams of the physical environment.
In an additional exemplary scenario, the processor causes a generated dragon to fly along a trajectory calculated to avoid physical and virtual obstacles in the rendered environment.
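As a rough illustration of a trajectory calculated to avoid obstacles, one common approach is graph search over an occupancy map. The grid layout, cell encoding, and breadth-first search below are assumptions for the sketch, not the method disclosed in the exhibit:

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search over a 2D occupancy grid. Cells marked 1 are
    obstacles (physical or virtual); returns a list of (row, col) cells
    from start to goal, or None when the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            # Walk the parent links back to start, then reverse.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from:
                came_from[(nr, nc)] = (r, c)
                frontier.append((nr, nc))
    return None

# A wall of obstacle cells forces the trajectory to route around it.
grid = [
    [0, 0, 0],
    [1, 1, 0],
    [0, 0, 0],
]
path = plan_path(grid, (0, 0), (2, 0))
```

BFS yields a shortest obstacle-free path on the grid; a real system would work in 3D and smooth the result into a flight trajectory.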

1021 Exhibit: Bolte 2011

Document PGR2021-00067, No. 1021 Exhibit - Bolte 2011 (P.T.A.B. Mar. 11, 2021)
Several novel devices and user interfaces have been developed over the past years, which allow to capture user’s body movements in front of a display and map ... (Laval Virtual VRIC 2011 Proceedings, RICHIR Simon and SHIRAI Akihiko, Editors)
The jump animation seemed to play an important role not only in disorientation but also in user acceptance: subjects rated the teleportation metaphor as significantly less satisfying, and found the traveling task more difficult to fulfill.
[20] M. Usoh, K. Arthur, M. C. Whitton, R. Bastos, A. Steed, M. Slater, and F. P. Brooks Jr., “Walking > walking-in-place > flying, in virtual environments,” in Proceedings of the 26th Annual Conference on Computer Graphics and Interactive Techniques (SIGGRAPH ’99), pp.359–364, ACM, 1999.
[25] B. Williams, G. Narasimham, T. P. McNamara, T. H. Carr, J. J. Rieser, and B. Bodenheimer, “Updating orientation in large virtual environments using scaled translational gain,” in Proceedings of the 3rd Symposium on Applied Perception in Graphics and Visualization (APGV ’06), vol.
[26] B. Williams, G. Narasimham, B. Rump, T. P. McNamara, T. H. Carr, J. J. Rieser, and B. Bodenheimer, “Exploring large virtual environments with an HMD on foot,” in Proceedings of the 3rd Symposium on Applied Perception in Graphics and Visualization (APGV ’06), vol.

1006 Exhibit: Provisional App No 62064156

Document PGR2021-00067, No. 1006 Exhibit - Provisional App No 62064156 (P.T.A.B. Mar. 11, 2021)
[0003] One such industry that has employed specific types of software and other computational technology increasingly over the past few years is that related to building and/or architectural design.
For example, a user's view of a conventional three-dimensional rendering on a computer screen may fall short of conveying a full appreciation for the scale of a particular feature or design.
[0007] Implementations of the present invention comprise systems, methods, and apparatus configured to allow a user to navigate within a three-dimensional rendering of an architectural design.
For example, Figure 4 and the corresponding text illustrate flowcharts of a sequence of acts in a method for displaying a three-dimensional view of an architectural design to a user through the one or more virtual reality components.
[0043] For example, Figure 4 illustrates that an implementation of a method for displaying a three-dimensional view of an architectural design to a user through the one or more virtual reality components can comprise an act 400 of receiving a communication from position tracking sensors.

1003 Exhibit: Shingu 2010

Document PGR2021-00067, No. 1003 Exhibit - Shingu 2010 (P.T.A.B. Mar. 11, 2021)
Camera Pose Navigation using Augmented Reality Jun Shingu*, Eleanor Rieffel**, Don Kimber**, Jim Vaughan**, Pernilla Qvarfordt**, Kathleen Tuite*** *FujiXerox Co. Ltd., **FX Palo Alto Laboratory Inc., ***University of Washington
We propose an Augmented Reality (AR) system that helps users take a picture from a designated pose, such as the position and camera angle of an earlier photo.
Our system ... (IEEE International Symposium on Mixed and Augmented Reality 2010, Science and Technology Proceedings, 13-16 October, Seoul, Korea)
When the camera is inside the cone, the difference in direction between the current view of object point T and that of the bookmarked pose is less than a predetermined angle.
When the user succeeds in moving the camera inside the cone, the color of the ball turns green and the message “Position OK” appears in the camera’s viewer window (Figure 2(c)).
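The cone test quoted above reduces to an angular comparison between the current view direction toward object point T and the bookmarked pose's direction. A minimal sketch; the function name, vector representation, and 10-degree threshold are illustrative assumptions, not Shingu's implementation:

```python
import math

def inside_cone(camera_pos, target_point, bookmarked_dir, max_angle_deg=10.0):
    """Return True when the direction from the camera to object point T
    deviates from the bookmarked pose's view direction by less than a
    predetermined angle (the threshold here is an assumed value)."""
    # Direction from the current camera position to the object point T.
    cur = [t - c for t, c in zip(target_point, camera_pos)]
    norm_c = math.sqrt(sum(x * x for x in cur))
    norm_b = math.sqrt(sum(x * x for x in bookmarked_dir))
    if norm_c == 0 or norm_b == 0:
        return False
    # Angle between the two directions via the normalized dot product,
    # clamped to [-1, 1] to guard against floating-point drift.
    dot = sum(a * b for a, b in zip(cur, bookmarked_dir)) / (norm_c * norm_b)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    return angle < max_angle_deg

# When the camera moves inside the cone, the UI would turn the ball green
# and display "Position OK" in the viewer window.
status = "Position OK" if inside_cone((0.1, 0.0, 0.0), (0.0, 0.0, 5.0),
                                      (0.0, 0.0, 1.0)) else "Move camera"
```

The same check runs each frame as the user moves, so the feedback updates continuously until the pose matches.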

1014 Exhibit: US20130179841A1 Mutton

Document PGR2021-00067, No. 1014 Exhibit - US20130179841A1 Mutton (P.T.A.B. Mar. 11, 2021)


1015 Exhibit: US20140114845A1 Rogers

Document PGR2021-00067, No. 1015 Exhibit - US20140114845A1 Rogers (P.T.A.B. Mar. 11, 2021)


1001 Exhibit: US Patent No 10783284

Document PGR2021-00067, No. 1001 Exhibit - US Patent No 10783284 (P.T.A.B. Mar. 11, 2021)


1023 Exhibit: Sutherland 1963

Document PGR2021-00067, No. 1023 Exhibit - Sutherland 1963 (P.T.A.B. Mar. 11, 2021)


1025 Exhibit: Brooks 1992

Document PGR2021-00067, No. 1025 Exhibit - Brooks 1992 (P.T.A.B. Mar. 11, 2021)
