Peurach et al.

US006131097A

[11] Patent Number: 6,131,097
[45] Date of Patent: Oct. 10, 2000
`
[54] HAPTIC AUTHORING

[75] Inventors: Thomas M. Peurach, Novi; Todd Yocum, Ann Arbor; Douglas Haanpaa, Ann Arbor; Charles J. Jacobus, Ann Arbor, all of Mich.

[73] Assignee: Immersion Corporation, San Jose, Calif.

[21] Appl. No.: 08/859,877

[22] Filed: May 21, 1997
`
Related U.S. Application Data

[63] Continuation-in-part of application No. 08/543,606, Oct. 16, 1995, Pat. No. 5,629,594, which is a continuation-in-part of application No. 08/257,070, Jun. 9, 1994, Pat. No. 5,459,382, which is a division of application No. 07/984,324, Dec. 2, 1992, Pat. No. 5,389,865, and a continuation of application No. 08/854,375, May 12, 1997.
[60] Provisional application No. 60/018,037, May 21, 1996.
[51] Int. Cl.7 .................................. G06F 17/30
[52] U.S. Cl. ............................ 707/102; 707/104
[58] Field of Search ..................... 707/102, 104
`
[56] References Cited

U.S. PATENT DOCUMENTS
`
3,919,691  11/1975  Noll ........................ 340/172.5
4,604,016   8/1986  Joyce .......................... 414/7
4,795,296   1/1989  Jau ............................ 414/5
4,982,918   1/1991  Kaye ......................... 244/223
5,004,391   4/1991  Burdea ......................... 414/6
5,007,300   4/1991  Siva ...................... 74/471 XY
5,018,922   5/1991  Yoshinada et al. ............... 414/5
5,044,956   9/1991  Behensky ...................... 434/45
5,062,594  11/1991  Repperger .................... 244/175
5,103,404   4/1992  McIntosh ................. 318/568.22
5,116,180   5/1992  Fung et al. .................... 414/5
5,142,931   9/1992  Menahem ................... 74/471 XY
5,143,505   9/1992  Burdea et al. .................. 414/5
5,146,566   9/1992  Hollis, Jr. et al. ........... 395/275
5,180,351   1/1993  Ehrenfried .................... 482/52
5,185,561   2/1993  Good et al. .................. 318/432
5,186,629   2/1993  Rohen ........................ 434/114
5,220,260   6/1993  Schuler ...................... 318/561
`
5,223,776   6/1993  Radke et al. ............... 318/568.1
5,264,776  11/1993  Gregory et al. ............... 318/561
5,382,885   1/1995  Salcudean et al. .......... 318/568.11
5,389,865   2/1995  Jacobus et al. ............ 318/568.11
5,405,152   4/1995  Katanics et al. .............. 273/438
5,451,924   9/1995  Massimino et al. ........... 340/407.1
5,459,382  10/1995  Jacobus et al. ............ 318/568.11
5,482,051   1/1996  Reddy et al. ................. 128/733
5,506,605   4/1996  Paley ........................ 345/163
5,513,100   4/1996  Parker et al. ............. 364/167.01
5,515,919   4/1996  Araki ........................ 345/156
5,562,572  10/1996  Carmein ........................ 482/4
5,576,727  11/1996  Rosenberg et al. ............. 345/179
5,583,478  12/1996  Renzi ...................... 340/407.1
5,587,937  12/1996  Massie et al. ................ 364/578
`
`
FOREIGN PATENT DOCUMENTS

WO 95/20788   8/1995   WIPO
`
OTHER PUBLICATIONS

Fletcher, L.A., and Kasturi, R.; A Robust Algorithm for Text String Separation from Mixed Text/Graphic Images. IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 10, No. 6, Nov. 1988; pp. 910-918.
`
`
Primary Examiner-Wayne Amsbury
Attorney, Agent, or Firm-Gifford, Krass, Groh, Sprinkle, Anderson & Citkowski, PC

[57] ABSTRACT
`
Methods are presented for authoring geometrical databases which incorporate touch or haptic feedback. In particular, a database of geometrical elements incorporates attributes necessary to support haptic interactions such as stiffness, hardness, friction, and so forth. Users may instantiate objects designed through CAD/CAM environments or attach haptic or touch attributes to subcomponents such as surfaces or solid sub-objects. The resulting haptic/visual databases or world-describing models can then be viewed and touched using a haptic browser or other appropriate user interface.
`
58 Claims, 9 Drawing Sheets
`
`
`
`
`
U.S. PATENT DOCUMENTS
`
5,588,139  12/1996  Lanier et al. ................ 395/500
5,589,828  12/1996  Armstrong ..................... 341/20
5,589,854  12/1996  Tsai ......................... 345/161
5,619,180   4/1997  Massimino et al. ........... 340/407.1
5,623,582   4/1997  Rosenberg ..................... 395/99
5,623,642   4/1997  Katz et al. .................. 395/500
5,625,576   4/1997  Massie et al. ................ 364/578
5,666,473   9/1997  Wallace ...................... 345/420
5,691,898  11/1997  Rosenberg .................... 364/190
5,709,219   1/1998  Chen et al. .................. 128/782
5,714,978   2/1998  Yamanaka et al. .............. 345/157
5,721,566   2/1998  Rosenberg et al. ............. 345/161
5,729,249   3/1998  Yasutake ..................... 345/173
5,734,373   3/1998  Rosenberg et al. ............. 345/161
5,742,278   4/1998  Chen et al. .................. 345/156
5,755,577   5/1998  Gillio ....................... 434/262
5,767,836   6/1998  Rosenberg .................... 345/161
5,769,640   6/1998  Jacobus et al. ............... 434/262
5,790,108   8/1998  Salcudean et al. ............. 345/184
5,802,353   9/1998  Avila et al. ................. 395/500
5,805,140   9/1998  Rosenberg et al. ............. 345/161
5,825,308  10/1998  Rosenberg ..................... 341/20
5,844,392  12/1998  Peurach et al. ............ 318/568.17
5,889,670   3/1999  Schuler et al. ............... 364/186
`
OTHER PUBLICATIONS
`
Cavnar, W.B., Vayda, A.J.; Using Superimposed Coding of N-Gram Lists for Efficient Inexact Matching. Proceedings of the Fifth Advanced Technology Conference, Washington D.C., Nov. 1992; pp. 253-267.
Iwaki, O., Kida, H., and Arakawa, H.; A Segmentation Method Based on Office Document Hierarchical Structure. Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics; Oct. 1987; pp. 759-763.
Fisher, J.L., Hinds, S.C., D'Amato, D.P.; A Rule-Based System for Document Image Segmentation. Proceedings of the IEEE 10th International Conference on Pattern Recognition, Jun. 1990; pp. 567-572.
Wong, K.Y., Casey, R.G., and Wahl, F.M.; Document Analysis System. IBM Journal of Research and Development, vol. 26, No. 6, 1982; pp. 647-656.
Lu, C.; Publish It Electronically. Byte Magazine, Sep. 1993; pp. 95-109.
Zlatopolsky, A.A.; Automated Document Segmentation. Pattern Recognition Letters, vol. 15, Jul. 1994; pp. 699-704.
Mantelman, L.; Voting OCR: Three (or More) Engines Are Better Than One. Imaging Magazine, vol. 3, No. 12, Dec. 1994; pp. 28-32.
Wayner, P.; Optimal Character Recognition. Byte Magazine, Dec. 1993; pp. 203-210.
UNLV/Information Science Research Institute, 1994 Annual Report, University of Nevada, Las Vegas, Apr. 1994.
van der Merwe, N.; The Integration of Document Image Processing and Text Retrieval Principles. The Electronic Library, vol. 11, No. 4/5, Aug./Oct. 1993; pp. 273-278.
Nartker, T.A., Rice, S.V., and Kanai, J.; OCR Accuracy: UNLV's Second Annual Test. Inform Magazine, Jan. 1994; pp. 40-45.
Nartker, T.A. and Rice, S.V.; OCR Accuracy: UNLV's Third Annual Test. Inform Magazine, Sep. 1994; pp. 30-36.
Wahl, F.M., Wong, K.Y. and Casey, R.G.; Block Segmentation and Text Extraction in Mixed Text/Image Documents. Computer Vision, Graphics and Image Processing, vol. 20, 1982; pp. 375-390.
`
Wang, D. and Srihari, S.N.; Classification of Newspaper Image Blocks Using Texture Analysis. Computer Vision, Graphics and Image Processing, vol. 47, 1989; pp. 327-352.
O'Gorman, L.; The Document Spectrum for Page Layout Analysis. IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 15, No. 11, Nov. 1993; pp. 1162-1173.
Pavlidis, T., and Zhou, J.; Page Segmentation and Classification. CVGIP: Graphical Models and Image Processing, vol. 54, No. 6, Nov. 1992; pp. 484-496.
Saitoh, T., Pavlidis, T.; Page Segmentation Without Rectangle Assumption. Proceedings IEEE 11th International Conference on Pattern Recognition, Sep. 1992; pp. 277-280.
Perry, A. and Lowe, D.G.; Segmentation of Textured Images. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 1989; pp. 326-332.
Shih, F.Y., Chen, S., Hung, D.C.D., Ng, P.A.; A Document Segmentation, Classification and Recognition System. Proceedings IEEE of the Second International Conference on Systems Integration-ICSI, 1992; pp. 258-267.
Saitoh, T., Yamaai, T., and Tachikawa, M.; Document Image Segmentation and Layout Analysis. IEICE Transactions on Information and Systems, vol. E77-D, No. 7, Jul. 1994; pp. 778-784.
Smith, Geoffrey; Call It Palpable Progress. Business Week, Oct. 9, 1995; pp. 93-96.
Ouh-young, Ming et al.; Using a Manipulator for Force Display in Molecular Docking. IEEE 1988; pp. 1824-1829.
Iwata, Hiroo et al.; Volume Haptization. IEEE 1993; pp. 16-18.
Arps, R.B., and Truong, T.K.; Comparison of International Standards for Lossless Still Image Compression. Proceedings of the IEEE, vol. 82, No. 6, Jun. 1994; pp. 889-899.
Baird, H.S., Jones, S.E., Fortune, S.J.; Image Segmentation by Shape-Directed Covers. Proceedings of the IEEE 10th International Conference on Pattern Recognition, Jun. 1990; pp. 567-572.
Heid, J.; Page to Pixel. MacWorld, Jul. 1992; pp. 174-181.
Clark, M. and Bovik, A.C.; Experiments in Segmenting Text on Patterns Using Localized Spatial Filters. Pattern Recognition, vol. 22, No. 6, 1989; pp. 707-717.
Jain, Anil K. and Bhattacharjee, Sushil; Text Segmentation Using Gabor Filters for Automatic Document Processing. Machine Vision and Applications, vol. 5, 1992; pp. 169-184.
Diehl, S. and Eglowstein, H.; Tame the Paper Tiger. Byte Magazine, Apr. 1991; pp. 220-238.
Hampel, H., Arps, R.B., et al.; Technical features of the JBIG standard for progressive bi-level image compression. Signal Processing: Image Communication, vol. 4, No. 2, Apr. 1992; pp. 103-111.
Farrokhnia, F. and Jain, A.K.; A Multi-Channel Filtering Approach to Texture Segmentation. Proceedings of the IEEE Computer Vision and Pattern Recognition Conference, Jun. 1991; pp. 364-370.
Adachi, Y.; Touch & Trace on the Free-Form Surface of Virtual Object. Proceedings of IEEE Virtual Reality Annual International Symposium (Sep. 18-22, 1993, Seattle, WA); pp. 162-168.
Adelstein, Bernard D. et al.; Design & Implementation of a Force Reflecting Manipulandum for Manual Control Research. 1992; pp. 1-24.
`
`
`
`
Bejczy, Antal K.; The Phantom Robot: Predictive Displays for Teleoperation with Time Delay. IEEE 1990; pp. 546-550.
Iwata, Hiroo; Pen-Based Haptic Virtual Environment. Proceedings of IEEE Virtual Reality Annual International Symposium (Sep. 18-22, 1993, Seattle, Washington).
Kotoku, Tetsuo et al.; Environment Modeling for the Interactive Display (EMID) Used in Telerobotic Systems. IEEE, Nov. 3-5, 1991; pp. 999-1004.
Burdea, Grigore et al.; Dextrous Telerobotics with Force Feedback: An Overview. Robotica 1991, vol. 9.
Hannaford, Blake et al.; Performance Evaluation of a Six-Axis Generalized Force-Reflecting Teleoperator. IEEE, May/Jun. 1991, vol. 21, No. 3; pp. 620-633.
Fischer, Patrick et al.; Specification and Design of Input Devices for Teleoperation. 1990.
Colgate, J. Edward et al.; Implementation of Stiff Virtual Walls in Force-Reflecting Interfaces. Sep. 22, 1993; pp. 1-9.
Tan, Hong Z., Srinivasan, Eberman, and Chang; Human Factors for the Design of Force-Reflecting Haptic Interfaces. ASME WAM 1994; pp. 1-11.
Buttolo, Pietro et al.; Pen-Based Force Display for Precision Manipulation in Virtual Environments. IEEE, Mar. 1995; pp. 1-8.
Kim, Won S. et al.; Graphics Displays for Operator Aid in Telemanipulation. IEEE 1991; pp. 1059-1067.
Rosenberg, Louis B.; Perceptual Design of a Virtual Rigid Surface Contact. Center for Design Research, Stanford University; Armstrong Laboratory, AL/CF-TR-1995-0029, Apr. 1993.
Ouh-young, Ming et al.; Force Display Performs Better than Visual Display in a Simple 6-D Docking Task. IEEE 1989; pp. 1462-1466.
Rosenberg, Louis B. et al.; Perceptual Decomposition of Virtual Haptic Surfaces. IEEE, Oct. 1993.
A. Kelly, S. Salcudean; MagicMouse: Tactile and Kinesthetic Feedback in the Human-Computer Interface Using an Electromagnetically Actuated Input/Output Device. University of British Columbia, Oct. 1993.
A. Kelly, S. Salcudean; On the Development of a Force-Feedback Mouse and its Integration into a Graphical User Interface. Nov. 1994.
B. Schult, R. Jebens; Application Areas for a Force-Feedback Joystick. ASME 1993; pp. 47-54.
M. Minsky, M. Ouh-young, L. Steele, F. Brooks, Jr., M. Behensky; Feeling and Seeing: Issues in Force Display. pp. 235-242 (no date).
M. Russo; The Design and Implementation of a Three Degree-of-Freedom Force Output Joystick. May 1990.
L. Rosenberg; A Force Feedback Programming Primer. 1997.
J. Payette; Evaluation of a Force Feedback (Haptic) Computer Pointing Device in Zero Gravity. ASME 1996; pp. 547-553.
B. Hannaford, Z. Szakaly; Force-Feedback Cursor Control. NASA Tech Brief, vol. 13, No. 11, Item No. 21, Nov. 1989.
C. Ramstein, V. Hayward; The Pantograph: A Large Workspace Haptic Device for a Multimodal Human-Computer Interaction. Computer-Human Interaction, CHI 1994.
Y. Yokokohji, R. Hollis, T. Kanade; What You Can See is What You Can Feel: Development of a Visual/Haptic Interface to Virtual Environment. Proceedings of VRAIS; pp. 46-54.
L. Rosenberg, S. Brave; The Use of Force Feedback to Enhance Graphical User Interfaces. Proc. SPIE 2653, vol. 19; pp. 243-248.
H. Iwata; Artificial Reality with Force-Feedback: Development of Desktop Virtual Space with Compact Master Manipulator. Computer Graphics, vol. 24, No. 4, Aug. 1990; pp. 165-170.
S. Su, R. Furuta; The Virtual Panel Architecture: A 3D Gesture Framework. pp. 387-393 (no date).
G. Burdea, E. Roskos, D. Gomez, N. Langrana, P. Richard; Distributed Virtual Force Feedback. May 1993; pp. 25-44.
T. Kotoku; A Predictive Display with Force Feedback and its Application to Remote Manipulation System with Transmission Time Delay. Proc. IEEE 1992; pp. 239-246.
R. Ellis, O. Ismaeil, G. Lipsett; Design and Evaluation of a High-Performance Prototype Planar Haptic Interface. ASME Dec. 1993; pp. 55-64.
L. Rosenberg, T. Lacey, D. Stredney, M. VanDerLoos; Commercially Viable Force Feedback Controller for Individuals with Neuromotor Disabilities. U.S. Air Force Final Report, Oct. 1995-May 1996.
P. Kilpatrick; The Use of Kinesthetic Supplement in an Interactive Graphics System. Univ. of North Carolina at Chapel Hill, 1976.
G. Winey III; Computer Simulated Visual and Tactile Feedback as an Aid to Manipulator and Vehicle Control. Mass. Institute of Technology, Jun. 1981.
K. Hirota, M. Hirose; Development of Surface Display. IEEE 1993; pp. 256-262.
`
`
`
[FIG. 1 (Sheet 1 of 9): Basic elements of a haptic/visual authoring tool. A world description (from over the network or from a local file) feeds haptic and visual parsers (INPUT/OUTPUT). On the haptic side, a haptic rendering system drives the haptic device hardware, which receives user touch and point input; on the visual side, a visual rendering system drives the display device hardware for the user's sight (DISPLAY). Visual and haptic editing systems (EDITING) and haptic navigation (NAVIGATION) connect the BROWSER functions to AUTHORING/EDITING.]
`
`
`
[FIG. 2 (Sheet 2 of 9): Flow-chart of a method of the invention. World description(s) (from over the network or from a local file) pass through INPUT, DISPLAY or BROWSE, EDIT and ATTRIBUTE, and OUTPUT stages, with NAVIGATION available throughout.]

[FIG. 3: An avatar interacting with a virtual object. A penetration or touch location on the touched surface generates a reaction force from touch or penetration.]
`
`
`
[FIG. 4 (Sheet 3 of 9): Primitive force generating objects and their descriptive parameters.]

Sphere. Parameters: r, radius. Bounding box: +/- r. Contact: avatar point less than the radius from the center.

Cone. Parameters: r, radius; h, height. Bounding box: same as cylinder. Contact: perpendicular drop point between the endpoints of the centerline, and perpendicular drop distance less than the cone radius at the centerline drop point.

Voxel. Parameters: array values. Bounding box: array size.

Cylinder. Parameters: r, radius; l, length. Bounding box: +/- r, +/- r, +/- l/2. Contact: perpendicular drop to the centerline less than r, and drop point between the endpoints of the centerline.

Box. Parameters: l, length; w, width; h, height. Bounding box: +/- l/2, h/2, w/2. Contact: avatar within the bounding limits.

Polygonal models. Parameters: vertex list, connectivity list. Bounding box: maximum x, minimum x, maximum y, minimum y, maximum z, minimum z (plus sub-boxes and per-polygon boxes).

B-spline patch models. Parameters: vertex list, connectivity list. Bounding box: maximum x, minimum x, maximum y, minimum y, maximum z, minimum z (plus sub-boxes and per-polygon boxes).
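The sphere and cylinder contact rules of FIG. 4 can be sketched in code. This is an illustration only; the patent supplies no implementation, and the function names and use of NumPy are our assumptions.

```python
import numpy as np

def sphere_contact(avatar, center, r):
    # FIG. 4 sphere rule: contact when the avatar point is less than
    # the radius from the center.
    return np.linalg.norm(np.asarray(avatar, float) - np.asarray(center, float)) <= r

def cylinder_contact(avatar, p0, p1, r):
    # FIG. 4 cylinder rule: contact when the perpendicular drop to the
    # centerline is less than r and the drop point lies between the
    # endpoints of the centerline.
    a, p0, p1 = (np.asarray(v, float) for v in (avatar, p0, p1))
    axis = p1 - p0
    t = np.dot(a - p0, axis) / np.dot(axis, axis)  # parametric drop point
    if not 0.0 <= t <= 1.0:
        return False  # drop point falls outside the centerline endpoints
    return np.linalg.norm(a - (p0 + t * axis)) <= r
```

For example, `sphere_contact((0, 0, 0.5), (0, 0, 0), 1.0)` reports contact, while a point two radii away does not.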
`
`
`
[FIG. 5 (Sheet 4 of 9): Haptic/visual browser object hierarchical description file.]

(World Poly_21
  (Object 1
    (Poly 1 (Edges 1 1 2 3 7 7 9 9) RED (Hardness 21))
    (Poly 1 (Edges 15 20 33 33 56 8) RED (Hardness 21))
    (Poly 1 (Edges 3 3 5 6 3 3 2 1) RED (Hardness 21))
  )
  (Object 2
    (Sphere 2 (Center 21 33 40) (Radius 20) GREEN (Stiffness 2))
    (Box 3 ... (Stiffness 21) (Hardness 3))
  )
  (Object 3
    http://www.cybernet.com/cyberobject.vrml
    (Cyberobject
      additional object description data
    )
  )
)

[FIG. 6: Where files for browsing may be located. Browser files stored on a local disk (Directory 1 holding Files 1-3; Directory 2 holding Files 4-6) versus browser files accessed over a network from multiple host computers (Host 1 serving Document 1 with linked subdocuments; Host 2 serving Document 4).]
`
`
`
[FIG. 7 (Sheet 5 of 9): Data flow from an input file to the rendering engines. The file Poly_21 (contents as in FIG. 5) is read record by record. A record such as (Poly 1 (Edges 1 1 2 3 7 7 9 9) RED (Hardness 21)) is split: the haptic rendering routine for polygons is called with (Edges 1 1 2 3 7 7 9 9) and (Hardness 21), while the visual rendering routine for polygons is called with (Edges 1 1 2 3 7 7 9 9) and RED.]

[FIG. 8: Static versus dynamic entity processing. Records are read and objects created. For static objects, creation is complete; for dynamic objects, a task fork creates an object dynamic behavior simulation, which updates the object as it simulates the object's dynamic behavior.]
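The static/dynamic split of FIG. 8 can be sketched with a task fork. This is a hypothetical illustration: the record layout, function names, and the use of one thread per dynamic object are our assumptions, not the patent's implementation.

```python
import threading
import time

def simulate(obj, steps=3, dt=0.01):
    # Toy dynamic-behavior simulation: integrate a constant velocity.
    for _ in range(steps):
        obj["state"]["x"] = obj["state"].get("x", 0.0) + obj["state"].get("vx", 0.0) * dt
        time.sleep(dt)

def create_object(record):
    # Read record -> create object. Dynamic objects additionally fork a
    # behavior-simulation task that keeps updating the object (FIG. 8).
    obj = {"name": record["name"], "state": dict(record.get("state", {}))}
    if record.get("dynamic"):
        threading.Thread(target=simulate, args=(obj,), daemon=True).start()
    return obj
```

A static object is created once and left alone; a dynamic one continues to evolve in its forked task while the browser renders it.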
`
`
`
[FIG. 9 (Sheet 6 of 9): Fixed and movable objects, transform chains and coordinate point definitions. A coordinate frame contains an object god point and a device control point; the object location is linked to the haptic device frame used to probe (the object center is the god point).]

[FIG. 10: Force generation from object surface penetration. A touching surface A penetrates surface B at a penetration location x, y (in surface coordinates); the reaction force depends on the surface penetration of A into B, p, and the velocity difference from A to B, v, where K1, K2, and K3 are force constants and t[x,y] is a force texture map (a two-dimensional array of force offsets associated with the surface).]
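Read as a spring-damper law plus a texture offset, the FIG. 10 quantities suggest a reaction-force computation along the surface normal. The OCR does not preserve the figure's exact formula, so the combination below, and the example constant values, are a hedged reading rather than the patent's equation.

```python
import numpy as np

def reaction_force(p, v, texture, x, y, K1=200.0, K2=5.0, K3=1.0):
    # p:       surface penetration of avatar A into surface B (along the normal)
    # v:       velocity difference from A to B
    # texture: 2-D array of force offsets t[x, y] tied to the surface
    # K1..K3:  force constants (the default values are arbitrary examples)
    return K1 * p + K2 * v + K3 * texture[x, y]
```

With a zero texture map, the force reduces to a pure spring-damper response to penetration and relative velocity.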
`
`
`
[FIG. 11 (Sheet 7 of 9, entire sheet): Transform and object hierarchy for a typical haptic device. Chained records, each holding an Object XForm or a Device XForm with an axis and a Parent link, connect the device axes through the Device World frame to the object transforms and the Object World frame.]
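The parent-linked transform records of FIG. 11 compose by walking each chain toward the world frame. A minimal sketch using 4x4 homogeneous transforms follows; the Node class and the composition order are our assumptions.

```python
import numpy as np

class Node:
    # One FIG. 11 record: a local transform plus a link to its parent frame.
    def __init__(self, xform, parent=None):
        self.xform = np.asarray(xform, dtype=float)
        self.parent = parent

def translation(tx, ty, tz):
    # Helper: a 4x4 homogeneous translation matrix.
    m = np.eye(4)
    m[:3, 3] = (tx, ty, tz)
    return m

def to_world(node):
    # Compose local transforms up the parent chain to the world frame.
    m = np.eye(4)
    while node is not None:
        m = node.xform @ m  # the parent's transform is applied after the child's
        node = node.parent
    return m
```

For instance, a stylus node translated (0, 2, 0) whose parent device frame is translated (1, 0, 0) ends up at (1, 2, 0) in world coordinates.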
`
`
`
[FIG. 13 (Sheet 8 of 9): Pushing, rolling, and opening as examples of force-based tasks (paired force and motion arrows acting on objects).]

[FIG. 12: Flying in six degrees of freedom. An applied force on the virtual object produces a reaction force from the virtual ground plane and a net force which accelerates the virtual object, with pitch, roll, and yaw.]

[FIG. 14: An avatar moved so as to snap to a grid point (SNAP TO GRID) or feature point (SNAP TO FEATURE), with a force attracting the avatar to the nearest grid point.]

[FIG. 15: Alignment maintained during motion (rows of aligned positions).]
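The grid snapping of FIG. 14 is naturally modeled as a spring pulling the avatar toward the nearest grid point. The pure-spring model and the gain k below are assumptions; the figure only indicates a force attracting the avatar to the nearest grid or feature point.

```python
import numpy as np

def snap_to_grid_force(avatar, spacing=1.0, k=10.0):
    # Spring force pulling the avatar toward the nearest grid point (FIG. 14).
    avatar = np.asarray(avatar, dtype=float)
    nearest = np.round(avatar / spacing) * spacing  # nearest grid point
    return k * (nearest - avatar)
```

A snap-to-feature variant would replace `nearest` with the closest geometric control point (line end point, circle center, and so on).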
`
`
`
[FIG. 16 (Sheet 9 of 9): A force-texture map. Force values map from a force texture map array to the surface component of an object.]

[FIG. 17: A door-hinge motion constraint set. Motion is constrained to be around the hinge center line in response to a pushing force.]

[FIG. 18: Operating system and computer system independence. A machine and operating system independence foundation comprises configuration files defining parameters and geometry, an operating system dependent library, a machine independent coding language, and industry standard browser file formats: VRML, DXF, IGES, PDES.]
`
`
`
HAPTIC AUTHORING

REFERENCE TO RELATED APPLICATION

This application claims priority of U.S. Provisional Application Ser. No. 60/018,037, filed May 21, 1996, and U.S. patent application Ser. No. 08/854,375, filed May 12, 1997, which is a continuation of U.S. patent application Ser. No. 08/543,606, filed Oct. 16, 1995, now U.S. Pat. No. 5,629,594, which is a continuation-in-part of U.S. patent application Ser. No. 08/257,070, filed Jun. 9, 1994, now U.S. Pat. No. 5,459,382, which is a divisional of U.S. patent application Ser. No. 07/984,324, filed Dec. 2, 1992, now U.S. Pat. No. 5,389,865, the entire contents of all of which are incorporated herein by reference.

FIELD OF THE INVENTION

This invention relates generally to force-feedback and haptic devices and, in particular, to the authoring of world models which incorporate haptic and visual integration.

BACKGROUND OF THE INVENTION

Specialized force-feedback devices originated in the 1960's with the introduction of teleoperations, wherein, typically, a smaller controller or master robot was moved by an operator to control the movements of a larger slave robot. Forces detected at the slave were then fed back to the operator through actuators at the location of the master. Such prior art is discussed in U.S. Pat. Nos. 5,389,865, 5,459,382 and 5,629,594 to Jacobus, et al., and also described elsewhere in the literature.

In the late 1980's, NASA funded several programs using force feedback devices which were not identically configured as miniature versions of a slave device. This advance enabled an operator such as an astronaut to control a number of different space-based robots and cranes from a "universal" controller. To realize this concept, the master controller was logically connected to the slave through a network of computers which were capable of translating the master kinematics, typically into Cartesian coordinates, and from Cartesian to slave kinematics (and back again).

With such computer translation in place on the master side of the system, it becomes possible to send inputs from the master, be it a joystick, wheel, yoke, or other type of manipulator, to a simulated slave rather than to a real one, and to accept forces from the simulation for application to the master as well. The simulation need not represent a real device, like a crane or robot, but may be a simulated vehicle, weapon or other implement. The simulation may also reside in a person performing a task in a virtual world such as walking, handling objects, and touching surfaces. Such innovations are among those disclosed in the patents referenced above.

As force-feedback technology proliferates, haptic interfaces will need to accommodate numerous different controllers and environments. The issued patents referenced above disclose multi-degree-of-freedom controllers for use in various representative configurations, including totally self-contained configurations. At the same time, tools and techniques will need to be created to provide consistency in developing and improving haptic applications.

Toward these ends, co-pending U.S. application Ser. No. 08/859,157 provides means for adjusting behavioral attributes associated with haptic device control, whether during development or execution, and co-pending U.S. application Ser. No. 08/861,080 discloses architectures and features relating to "browsers," wherein common geometrical descriptions are shared among visual and haptic rendering functions. Both of these applications are incorporated herein in their entirety by reference. The need remains, however, for methods, data structures, and control strategies to organize the development of world models driven by these integrated haptic/visual environments.

SUMMARY OF THE INVENTION

The present invention resides in authoring tools which allow a user to create or import existing geometry files, attach haptic or other attributes to the object components of the files, and browse file contents in final or intermediate states of composition. The final edited world files may be written to disk or exported over networks, preferably in standardized formats, including hyperlinks over the world-wide web, to applications programs which incorporate visual, haptic, and/or sound capabilities, enabling other users to view, touch, attach to, and manipulate the objects.

In a preferred embodiment, the invention incorporates a distinct set of facilities for reading, writing, browsing, navigating, and/or editing databases which encode hierarchical geometric data, so as to combine surface attribution and touch or haptic attribution. To enhance these functions, a visual/haptic avatar may be provided to assist in designating a user's position in the virtual world being edited.

Applied forces may be used to aid a user in a number of ways, including the following:

moving a point to a particular discrete grid position (snap to grid);

moving to a geometrical object control point or feature (end points of a line, center of a circle, radius of a circle, control points of a b-spline, etc.);

to resist stretching or compression of a feature (i.e., programmable stiffness or elasticity);

to resist user actions (through viscosity, friction, repulsive force);

to help in aligning a new object with respect to an existing one (with or without knowledge of coordinate values needed for most equivalent operations); or

to support material removal, as in carving or scraping operations.

Forces may also be used to demark the locations of menu items, dialog response locations, and icons (similar to the use of forces to aid in locating control points or grid locations), or to maintain orientation or positional constraints while performing another operation.

The invention further supports the use of transform and object hierarchy for coding haptic world and object databases, as well as machine-independent program description languages for haptic authoring system communications interfacing and control algorithms, independent of computer/operating system, control device type and communications systems.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 presents basic elements associated with a haptic/visual authoring tool;
FIG. 2 is a flow-chart representation of a method of the invention;
FIG. 3 is a diagram which shows an avatar interacting with a virtual object, generating responsive forces;
FIG. 4 illustrates primitive force generating objects and their descriptive parameters;
FIG. 5 illustrates haptic/visual browser object hierarchical description files;
`
`
FIG. 6 shows where files for browsing may be located;
FIG. 7 is a diagram which depicts data flow from an input file to an API to a rendering engine;
FIG. 8 illustrates static versus dynamic entity processing;
FIG. 9 is an oblique drawing used to illustrate fixed and movable objects, transform chains and coordinate point definitions;
FIG. 10 illustrates force generation from object surface penetration;
FIG. 11 depicts a transform and object hierarchy for a typical haptic device;
FIG. 12 illustrates flying in six degrees of freedom;
FIG. 13 shows pushing, rolling, and opening as examples of force-based tasks;
FIG. 14 shows how an avatar may be moved so as to snap to a grid or feature point;
FIG. 15 illustrates how alignment may be maintained during motion according to the invention;
FIG. 16 is a force-texture map;
FIG. 17 illustrates a door-hinge motion constraint set; and
FIG. 18 is a diagram used to convey operating system and computer system independence.
`
DETAILED DESCRIPTION OF THE INVENTION
`
into/out of a field, such as magnetic or gravimetric, or entry into a new medium, such as from air to water. In addition, since avatar (and haptic device or controller) position, velocity, and acceleration states are made available to the virtual reality simulation, the avatar position and other simulated state changes can be stimulated through user motion and collision events.

Concurrently with maintaining avatar and static geometry data, the geometry data is preferably also used to generate three-dimensional, viewable imagery. Although conventional visual rendering is well known, unique to this invention are processes associated with haptic rendering, including the way in which such rendering is synchronized with visual rendering so as to effect a multi-media (i.e., touch and sight) immersive virtual reality.

The concept of geometrical database browsing arises in part from the recognition that the geometric data which is loaded into the virtual world, thus initializing it, described as a hierarchy of objects, may be described as statements or records in files (FIG. 5). As shown in FIG. 4, such data may represent simple objects, polygon arrays, and/or b-spline patches. As files, which may take the form of a collection of records or a single record, an object description can be read into memory for instantiation into the virtual world (by sending parsed forms of these records to the haptic rendering processing routines or the visual rendering processing routines), can be moved to different spots or named locations within the file system, or can be sent over a network (FIG. 6). Haptic/visual browsing is the function of reading the geometry description files from any source and causing them, under user direction, to be rendered visually and in a haptic sense, and optionally, with sound generation or other characteristics as well.
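The browsing step described above, parsing records and handing parsed forms to the haptic and visual rendering routines, can be sketched as follows. This is a toy parser for the FIG. 5/FIG. 7 record style; the regular expressions, dictionary layout, and function name are illustrative assumptions, not an API from the patent.

```python
import re

def parse_record(record):
    # Split one polygon record into the parameter sets that FIG. 7 routes to
    # the haptic rendering routine (edges + hardness) and the visual
    # rendering routine (edges + color).
    edges = [int(n) for n in re.search(r"\(Edges ([\d ]+)\)", record).group(1).split()]
    hardness = re.search(r"\(Hardness (\d+)\)", record)
    color = re.search(r"\b(RED|GREEN|BLUE)\b", record)
    return {
        "haptic": {"edges": edges,
                   "hardness": int(hardness.group(1)) if hardness else None},
        "visual": {"edges": edges,
                   "color": color.group(0) if color else None},
    }
```

Applied to `(Poly 1 (Edges 1 1 2 3 7 7 9 9) RED (Hardness 21))`, the haptic side receives the edge list with hardness 21 and the visual side receives the same edge list with the color RED.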
The actual form of the geometrical database can be application-specific, as in the case of man