`Volkswagen Group of America, Inc., Petitioner
`
`1
`
`
`
`US 8,682,673 B2
`Page 2
`
`(56)
`
`References Cited
`
`U.S. PATENT DOCUMENTS
`
4,577,177    3/1986    Marubashi
4,708,224    11/1987   Schrooder
4,749,062    6/1988    Tsuji et al.
4,979,593    12/1990   Watanabe et al.
4,995,479    2/1991    Fujiwara et al.
5,027,104    6/1991    Reid
5,042,620    8/1991    Yoneda et al.
5,056,629    10/1991   Tsuji et al.
5,086,450    2/1992    Kitagawa et al.
5,159,163    10/1992   Bahjat et al.
5,200,583    4/1993    Kupersmith et al.
5,255,341    10/1993   Nakajima
5,287,266    2/1994    Malec et al.
5,295,064    3/1994    Malec et al.
5,463,209    10/1995   Figh et al.
5,485,897    1/1996    Matsumoto et al.
5,551,532    9/1996    Kupersmith
5,606,154    2/1997    Doigan et al.
5,638,425    6/1997    Meador et al.
5,689,094    11/1997   Friedli et al.
5,749,443    5/1998    Romao
5,819,201    10/1998   DeGraaf
5,819,284    10/1998   Farber et al.
5,844,181    12/1998   Amo et al.
5,852,775    12/1998   Hidary
5,887,139    3/1999    Madison et al.
5,918,222    6/1999    Fukui et al.
5,932,853    8/1999    Friedli et al.
5,955,710    9/1999    DiFranza
5,979,757    11/1999   Tracy et al.
5,984,051    11/1999   Morgan et al.
5,987,381    11/1999   Oshizawa
6,011,839    1/2000    Friedli et al.
6,067,297    5/2000    Beach
6,073,727    6/2000    DiFranza et al.
6,078,928    6/2000    Schnase et al.
6,082,500    7/2000    Amo et al.
6,157,705    12/2000   Perrone .................. 704/270
6,163,749    12/2000   McDonough et al.
6,202,008    3/2001    Beckert et al.
6,202,799    3/2001    Drop
6,206,142    3/2001    Meacham
6,223,160    4/2001    Kostka et al.
6,230,132    5/2001    Class et al.
6,236,968    5/2001    Kanevsky et al.
6,332,127    12/2001   Bandera et al.
6,341,668    1/2002    Fayette et al.
6,349,797    2/2002    Newville et al.
`
6,360,167 B1     3/2002    Millington et al.
6,397,976 B1     6/2002    Hale et al.
6,421,305 B1     7/2002    Gioscia et al.
6,460,036 B1     10/2002   Herz
6,466,232 B1     10/2002   Newell et al.
6,504,571 B1     1/2003    Narayanaswami et al.
6,526,506 B1     2/2003    Lewis
6,571,279 B1     5/2003    Herz et al.
6,587,835 B1     7/2003    Treyz et al.
6,594,580 B1     7/2003    Tada et al.
6,606,644 B1     8/2003    Ford et al.
6,615,175 B1     9/2003    Gazdzinski
6,651,045 B1     11/2003   Macaulay
6,799,327 B1     9/2004    Reynolds et al.
6,801,792 B1     10/2004   Schuster et al.
6,944,533 B2     9/2005    Kozak et al.
6,990,312 B1     1/2006    Gioscia et al.
7,136,853 B1     11/2006   Kohda et al.
7,305,345 B2     12/2007   Bares et al.
7,577,244 B2     8/2009    Taschereau
7,702,798 B2     4/2010    Apreutesei et al.
7,765,588 B2     7/2010    Sahota et al.
7,783,978 B1     8/2010    Andrews et al.
8,234,119 B2 *   7/2012    Dhawan et al. .............. 379/88.01
2003/0195833 A1  10/2003   Baranowski
2004/0104842 A1  6/2004    Drury et al.
2005/0239402 A1  10/2005   Gioscia et al.
2006/0069749 A1  3/2006    Herz et al.
2007/0255838 A1  11/2007   Hassan et al.
2009/0077100 A1  3/2009    Hancock et al.
2010/0023392 A1  1/2010    Merriman et al.
`
FOREIGN PATENT DOCUMENTS

JP    01226681 A    9/1989
JP    03272977 A    12/1991
JP    05017083 A    1/1993
JP    05058564 A    3/1993
JP    05201624 A    8/1993
`
`OTHER PUBLICATIONS
`
`Karen Jacobs (Dec. 7, 1999) “Elevator Maker to Add Commercial
`Touch,” The Wall Street Journal, pp. 1-2.
`Lewis Perdue (Jul. 20, 1999) “Forget Elevator Music, Here Comes
`Elevator Internet,” Internet VC Watch, pp. 1-2.
`Stevens Institute of Technology, Spring 1999 Final Report, pp. 1-12.
Kenji Yoneda, et al. (Dec. 1997) “Multi-Objective Elevator Supervi-
`sory-Control System with Individual Floor-Situation Control,”
`Hitachi Review, p. 1.
`
`* cited by examiner
`
`2
`
`
`
`US 8,682,673 B2
`
`U.S. Patent
`
`M
`
`M
`
`S
`
`42pl0
`
`ME
`
`5,uzmmos2.acme;
`
`mmo85.28
`
`1Esauagcfizoommo,E>m.a
`
`E
`
`N:
`
`IIIIIEIIIlIiIIIIII|II....I
`
`1-—.
`(‘I1-—¢
`
`V __.__l
`.§
`
`riIIIIIIIIVII
`
`9:W.a§mEWm:N
`
`:7WfxlkkWrim:me
`
`M:
`
`3
`
`
`
`
`
`
U.S. Patent          Mar. 25, 2014          US 8,682,673 B2

[Drawing sheet — figure content not legible in this reproduction]
`
`
`
U.S. Patent          Mar. 25, 2014          Sheet 3 of 24          US 8,682,673 B2

[Drawing sheet — figure content not legible in this reproduction]
`
`
`
`
`
U.S. Patent          Mar. 25, 2014          Sheet 4 of 24          US 8,682,673 B2

FIG. 4 — logic diagram (legible flowchart box labels, cleaned): USER SELECT "BUILDING DIRECTORY" FUNCTION KEY; SYSTEM PROGRAMMED FOR AUDIO OR VISUAL PROMPT?; [audio] RETRIEVE CELP FILE; SYNTHESIZE VOICE PROMPT ("NAME"); SAMPLE VOICE AND DIGITIZE; RETRIEVE BLDG. DIRECTORY FILES; COMPARE SAMPLED VOICE WITH DIRECTORY FILES; GENERATE SEQUENCED LIST OF MATCHES; SYNTHESIZE NEXT PROMPT BASED ON MATCHES; USER INPUT; SYNTHESIZE PROMPT; CALCULATE CONT. RATING; [visual] RETRIEVE BUILDING DIRECTORY FILE; DISPLAY DIRECTORY FILE ON DISPLAY DEVICE; LOOK UP/RETRIEVE GRAPHIC FILE FOR SELECTED ENTRY; DISPLAY GRAPHIC FILE; FLOOR SELECTED?; SYNTHESIZE PROMPT.
`
`
`
U.S. Patent          Mar. 25, 2014          Sheet 5 of 24          US 8,682,673 B2

[FIG. 5 — drawing; reference numeral 113 legible; figure content otherwise not legible in this reproduction]
`
`
`
U.S. Patent          Mar. 25, 2014          Sheet 6 of 24          US 8,682,673 B2

[Drawing sheet — figure content not legible in this reproduction]
`
`
`
U.S. Patent          Mar. 25, 2014          Sheet 7 of 24          US 8,682,673 B2

[Drawing sheet — figure content not legible in this reproduction]
`
`
`
U.S. Patent          Mar. 25, 2014          Sheet 8 of 24          US 8,682,673 B2

[Drawing sheet — figure content not legible in this reproduction]
`
`
`
U.S. Patent          Mar. 25, 2014          Sheet 9 of 24          US 8,682,673 B2

[Drawing sheet — figure content not legible in this reproduction; "DOWNLOAD"/"UPLOAD" labels partially legible]
`
`
`
`
`
U.S. Patent          Mar. 25, 2014          Sheet 10 of 24          US 8,682,673 B2

[Drawing sheet — figure content not legible in this reproduction]
`
`
`
`
U.S. Patent          Mar. 25, 2014          Sheet 11 of 24          US 8,682,673 B2

[Drawing sheet — figure content not legible in this reproduction]
`
`
`
U.S. Patent          Mar. 25, 2014          Sheet 12 of 24          US 8,682,673 B2

FIG. 10 — logic diagram (legible flowchart box labels, cleaned): COUNT # OF NON-ZERO OUTPUTS; SUM ALL NON-ZERO OUTPUTS; CORRECTION TO OMAXC; OBTAIN # OF FLOORS SELECTED IN CAR; COMPARE OE, OMINP, OMINS, & OMAXC AND CHOOSE LARGEST; GENERATE BYPASS CONTROL SIGNAL.
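The decision in FIG. 10 — compare several occupancy-related outputs, keep the largest, and generate a bypass control signal — can be sketched as follows. The signal names (OE, OMINP, OMINS, OMAXC) come from the figure, but the threshold rule and function names here are illustrative assumptions, not the patent's implementation.

```python
def choose_occupancy_estimate(oe: float, ominp: float, omins: float, omaxc: float) -> float:
    """Per FIG. 10: compare the candidate outputs and choose the largest."""
    return max(oe, ominp, omins, omaxc)

def bypass_signal(oe: float, ominp: float, omins: float, omaxc: float, capacity: float) -> bool:
    """Assumed rule (illustrative only): assert the bypass control signal
    when the chosen occupancy estimate reaches the car's capacity, so the
    car skips further hall calls."""
    return choose_occupancy_estimate(oe, ominp, omins, omaxc) >= capacity
```

A car reporting estimates of 10, 8, 9, and 11 against a capacity of 12 would keep answering calls; once any estimate reaches 12, the bypass signal would be generated.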
`
`14
`
`14
`
`
`
U.S. Patent          Mar. 25, 2014          Sheet 13 of 24          US 8,682,673 B2

[Drawing sheet — figure content not legible in this reproduction]
`
`
`
`
U.S. Patent          Mar. 25, 2014          Sheet 14 of 24          US 8,682,673 B2

[Drawing sheet — figure content not legible in this reproduction]
`
`
`
U.S. Patent          Mar. 25, 2014          Sheet 15 of 24          US 8,682,673 B2

[FIG. 14a — drawing; legible labels: FLOOR, NUMBER; figure content otherwise not legible in this reproduction]
`
`
`
U.S. Patent          Mar. 25, 2014          Sheet 16 of 24          US 8,682,673 B2

[Drawing sheet — figure content largely not legible in this reproduction; fragments legible: "SELECTED", "NONE"]
`
`
`
`
U.S. Patent          Mar. 25, 2014          Sheet 17 of 24          US 8,682,673 B2

[Drawing sheet — figure content not legible in this reproduction]
`
`
`
`
U.S. Patent          Mar. 25, 2014          Sheet 18 of 24          US 8,682,673 B2

FIG. 16 — logic diagram (legible flowchart box labels, cleaned): READER ACCESS PASSWORD FILE OF DATABASE; USER ENTER ELEVATOR CAR; READER ACTIVATE; READER INTERROGATE RFID TAG; RFID TAG DECRYPT READER EMISSION; TAG XMIT "NOT RECOGNIZED" WARNING; READER DECRYPT RFID EMISSION; READER ACCESS DATABASE; GENERATE CONTROL SIGNAL REMOVING BLOCK FROM AUTHORIZED FLOORS; ENABLE UTILITY FUNCTION KEYS; USER SELECT DESIRED FUNCTIONS; DEACTIVATE READER; LOG DATA IN DATABASE; DISPLAY WARNING; READER ENABLE INPUT DEVICE; USER ENTER PASSWORD.
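The FIG. 16 decision — a recognized tag produces a control signal removing the block from that user's authorized floors, while an unrecognized tag produces a warning — can be sketched as below. The data shapes, tag IDs, and function name are illustrative assumptions; the patent itself describes the RFID interrogation and decryption steps, not this code.

```python
# Illustrative sketch of the FIG. 16 outcome. The database mapping and the
# return-value shape are assumptions, not taken from the patent text.
AUTHORIZED_FLOORS = {"tag-0042": [3, 7, 12]}  # decrypted tag ID -> floor list

def access_request(tag_id: str) -> dict:
    """One pass through the FIG. 16 decision: a recognized tag yields a
    control signal unblocking its authorized floors; an unrecognized tag
    yields a 'not recognized' warning (after which, per the figure, the
    reader may enable the input device for a password entry)."""
    if tag_id in AUTHORIZED_FLOORS:
        return {"warning": None, "unblocked": AUTHORIZED_FLOORS[tag_id]}
    return {"warning": "NOT RECOGNIZED", "unblocked": []}
```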
`
`20
`
`20
`
`
`
U.S. Patent          Mar. 25, 2014          Sheet 19 of 24          US 8,682,673 B2

[FIG. 17 — drawing; reference numeral 102 legible; figure content otherwise not legible in this reproduction]
`
`
`
U.S. Patent          Mar. 25, 2014          Sheet 20 of 24          US 8,682,673 B2

[Flowchart; figure label not legible. Partially legible box labels, cleaned: PARSE IMAGE; SAMPLE SPEECH FROM ALL OF OCCUPANTS; CONVERT SPEECH TO DIGITAL FORMAT; MATCH ANY LIBRARY WORD?; SET AMBIGUITY FLAG; IDENTIFY SUB-FILE PREVIOUS MATCH; RETRIEVE ALL IMAGE DATA; ALLOCATE ONE IMAGE DATA FILE TO EACH DISPL.; PARSE IMAGE/AUDIO FILE ADDRESS(ES); RETRIEVE IMAGE/AUDIO FILES; DISPLAY/PLAY RETRIEVED DATA ON ALL DISPLAYS FOR PREDETERMINED PERIOD. Remaining boxes not legible.]
`
`
`
U.S. Patent          Mar. 25, 2014          Sheet 21 of 24          US 8,682,673 B2

[Drawing sheet — figure content not legible in this reproduction]
`
`
`
`
`
`
U.S. Patent          Mar. 25, 2014          Sheet 22 of 24          US 8,682,673 B2

FIG. 18c — logic diagram (legible flowchart box labels, cleaned): NEW CALL SIGNAL REC'D?; OTHER CALL SIGNALS SELECTED BY OCCUPANTS PRESENT?; SELECTED BY CAR OCCUP.?; CALL SIGNAL PRESENT?; DETERMINE DIRECTION OF TRAVEL; ORDER CALL SIGNALS BASED ON DIRECTION; SELECT NEXT FLOOR TO BE ENCOUNTERED; RETRIEVE IMAGE/AUDIO DATA FOR ID; DISPLAY/PLAY RETRIEVED DATA; TIME ≥ PRESET VALUE?; ESTIMATE TIME TO NEXT FLOOR ENCOUNTERED; ACCESS BLDG. DIRECTORY FOR NEXT FLOOR; MULTIPLE TENANTS ON FLOOR?; RUN TENANT SELECTION ROUTINE; FLOOR REACHED?
`
`
`
U.S. Patent          Mar. 25, 2014          Sheet 23 of 24          US 8,682,673 B2

FIG. 18d — logic diagram (partially legible flowchart box labels, cleaned): RETRIEVE STATISTIC FILES FOR ALL K SUB-FILES; RETRIEVE HISTORICAL DISPLAY FILES FOR ALL K SUB-FILES; CONVERT TO DIGITAL; RETRIEVE SUB-FILE STATISTIC FILE; INCREMENT STATISTIC FILE BY ONE; STORE STATISTIC FILE; WAIT PRESET [PERIOD]; COMPARE HISTORICAL DISPLAY FILE; [...] DISTRIB.; SELECT SUB-FILE WITH LARGEST D; RETRIEVE DATA FILE FOR SUB-FILE WITH LARGEST D; DISPLAY FILE FOR [PRESET] PERIOD. Remaining boxes not legible.
`
`
`
U.S. Patent          Mar. 25, 2014          Sheet 24 of 24          US 8,682,673 B2

[Flowchart; figure label not legible. Legible box labels, cleaned: USER SELECT DEDICATED FUNCTION KEY OR TOUCH SCREEN FUNCTION KEY; PROCESSOR GENERATE ID CODE FOR SELECTED FUNCTION; PROCESSOR RETRIEVE STORED "TAG" FILE FOR ADVERTISING SUB-FILES; COMPARE GENERATED ID CODE TO TAGS ASSOCIATED WITH ONE (OR MORE) SUB-FILES; PROCESSOR SEQUENCE TAGS; PROCESSOR RETRIEVE DATA FILE FOR nTH SUB-FILE IN SEQUENCE; DISPLAY nTH SUB-FILE DATA FILE; WAIT PREDETERMINED [PERIOD]; INCREMENT n (n = n + 1).]
`
`
`
`COMPUTERIZED INFORMATION AND
`DISPLAY APPARATUS
`
`This application is a continuation of and claims priority to
`co-owned U.S. patent application Ser. No. 13/369,850 filed
`Feb. 9, 2012 and entitled “COMPUTERIZED INFORMA-
TION PRESENTATION APPARATUS”, now U.S. Pat. No.
`8,447,612, which is a continuation of and claims priority to
`co-owned U.S. patent application Ser. No. 12/711,692 filed
`Feb. 24, 2010 and entitled “ADAPTIVE INFORMATION
`PRESENTATION APPARATUS AND METHODS”, now
U.S. Pat. No. 8,117,037, which is a continuation of and claims
priority to co-owned U.S. patent application Ser. No. 11/506,
975 filed Aug. 17, 2006 and entitled “SMART ELEVATOR
SYSTEM AND METHOD”, now U.S. Pat. No. 7,711,565, which
`is a divisional of and claims priority to co-owned U.S. patent
`application Ser. No. 10/935,957 filed Sep. 7, 2004 and
`entitled “ELEVATOR ACCESS CONTROL SYSTEM AND
`METHOD”, now U.S. Pat. No. 7,093,693, which is a divi-
`sional of co-owned U.S. patent application Ser. No. 10/651,
`451 filed Aug. 29, 2003 and entitled “SMART ELEVATOR
`SYSTEM AND METHOD”, now U.S. Pat. No. 6,988,071,
`which is a continuation of co-owned U.S. patent application
Ser. No. 09/330,101 filed Jun. 10, 1999 and entitled “SMART
`ELEVATOR SYSTEM AND METHOD”, now U.S. Pat. No.
`6,615,175, each ofthe foregoing incorporated into the present
`application by reference in its entirety. This application is also
`related to U.S. patent application Ser. No. 12/703,666 filed
`Feb. 10, 2010 entitled “Adaptive Advertising Apparatus and
`Methods”, now U.S. Pat. No. 8,065,155, U.S. patent applica-
`tion Ser. No. 12/704,431 filed Feb. 11, 2010 entitled “Adap-
`tive Advertising Apparatus and Methods”, now U.S. Pat. No.
`8,078,473, Ser. No. 12/711,692 filed on Feb. 24, 2010 entitled
`“ADAPTIVE INFORMATION PRESENTATION APPA-
`RATUS AND METHODS”, now U.S. Pat. No. 8,117,037,
`Ser. No. 12/711,857 filed Feb. 24, 2010 and entitled “ADAP-
`TIVE INFORMATION PRESENTATION APPARATUS
`AND METHODS”, now U.S. Pat. No. 8,065,156, Ser. No.
`13/364,194 filed Feb. 1, 2012 and entitled “COMPUTER-
`IZED INFORMATION PRESENTATION APPARATUS”,
`now U.S. Pat. No. 8,285,553, Ser. No. 13/362,902 filed Jan.
`31, 2012 and entitled “ADAPTIVE INFORMATION PRE-
`SENTATION APPARATUS”, now U.S. Pat. No. 8,370,158,
`and Ser. No. 13/357,487 filed Jan. 24, 2012 and entitled
`“ELECTRONIC INFORMATION ACCESS SYSTEM AND
`METHODS”, now U.S. Pat. No. 8,301,456, Ser. No. 13/404,
`606 entitled “COMPUTERIZED INFORMATION PRE-
`SENTATION APPARATUS”, now U.S. Pat. No. 8,290,781,
`Ser. No. 13/404,980 entitled “COMPUTERIZED INFOR-
`MATION PRESENTATION APPARATUS”, now U.S. Pat.
`No. 8,296,146, Ser. No. 13/404,853 entitled “COMPUTER-
`IZED INFORMATION PRESENTATION APPARATUS”,
`now U.S. Pat. No. 8,290,778, and Ser. No. 13/405,046
`entitled “COMPUTERIZED INFORMATION PRESENTA-
`TION METHODS” now U.S. Pat. No. 8,296,153, each filed
`on Feb. 24, 2012, Ser. No. 13/406,408 entitled “COMPUT-
`ERIZED INFORMATION SELECTION AND DOWN-
`LOAD APPARATUS AND METHODS” filed on Feb. 27,
`2012, now U.S. Pat. No. 8,311,834, and Ser. No. 13/410,080
`entitled “NETWORK APPARATUS AND METHODS FOR
`USER INFORMATION DELIVERY” filed Mar. 1, 2012,
`now U.S. Pat. No. 8,285,551, each of which is incorporated
`herein by reference in its entirety. This application is also
`related to co-owned and co-pending U.S. patent application
`Ser. No. 13/728,512 filed contemporaneously herewith on
`Dec. 27, 2012 and entitled “SMART INFORMATION AND
`DISPLAY APPARATUS”, Ser. No. 13/733,098 filed Jan. 2,
`
`5
`
`10
`
`15
`
`20
`
`25
`
`30
`
`35
`
`40
`
`45
`
`50
`
`55
`
`60
`
`65
`
`2
`2013 and entitled “COMPUTERIZED INFORMATION
`AND DISPLAY APPARATUS”, Ser. No. 13/737,833 filed
`Jan. 9, 2013 and entitled “COMPUTERIZED INFORMA-
`TION AND DISPLAY APPARATUS”, Ser. No. 13/737,853
`filed Jan. 9, 2013 and entitled “TRANSPORT APPARATUS
`WITH COMPUTERIZED INFORMATION AND DIS-
`PLAY APPARATUS”, Ser. No. 13/746,266 filed Jan. 21,
`2013 and entitled “COMPUTERIZED INFORMATION
`AND DISPLAY APPARATUS”, Ser. No. 13/750,583 filed
`Jan. 25, 2013 and entitled “COMPUTERIZED INFORMA-
`TION AND DISPLAY APPARATUS”, Ser. No. 13/752,222
`filed Jan. 28, 2013 and entitled “COMPUTERIZED INFOR-
`MATION AND DISPLAY APPARATUS”, Ser. No. 13/753,
`407 filed Jan. 29, 2013 and entitled “COMPUTERIZED
`INFORMATION AND DISPLAY APPARATUS AND
`METHODS”, Ser. No. 13/755,682 filed Jan. 31, 2013 and
`entitled “INTELLIGENT ADVERTISING METHODS”,
`and Ser. No. 13/758,898 filed Feb. 4, 2013 and entitled
`“INTELLIGENT ADVERTISING APPARATUS”, each of
`which is incorporated herein by reference in its entirety.
`
`BACKGROUND OF THE INVENTION
`
`1. Field of the Invention
`
The present invention relates to the field of personnel
transport apparatus, and specifically to elevators and similar
`devices for transporting people from one location to another
`which incorporate various information technologies.
`2. Description of Related Technology
`Elevators and similar personnel transport devices (such as
moving walkways or shuttles) are important aspects of
modern urban life. Commonly used in office buildings, airports,
`shopping malls, and other large structures, these devices
`transport large numbers of people and equipment between
`two locations on a routine basis. Elevators in particular are
`widely used throughout the world.
`Depending on loading, a person may spend up to several
`minutes on an elevator during travel between floors. Signifi-
`cant amounts of time may also be spent waiting for the eleva-
`tor to arrive when called. This time is usually “dead” from the
`standpoint that very little can be accomplished or very few
`tasks undertaken during these few minutes. However, often
`times an individual may require information which will be of
`use after leaving the elevator. For example, the person may
`wish to obtain travel information such as directions to the
`
`nearest airport or public transportation node, or the location
of a nearby restaurant. Weather-related information or traffic
`reports may also be useful. A plethora of different types of
`information, including financial data, breaking news head-
`lines, sports scores and the like may also be of interest to one
`waiting for or riding on an elevator or other transport device.
`An associated problem relates to determining the location
`of a person, firm, or store within a building when unfamiliar.
Building directories are often posted in the lobby of the
building, yet these require the user to manually or visually locate
`the name of the person, firm, or store which they are looking
`for, and remember the location information associated there-
`with. Additionally, such directories often do not provide pre-
`cise location information, but rather merely a floor number
`and/or suite number. The user often times does not have a
`
`graphical representation of the desired location in relation to
`the elevators, thereby resulting in additional wasted time in
`finding the location once off of the elevator. Even if a graphi-
`cal display is provided, it often requires the user to spatially
`orient themselves to determine relative location.
`
`Security is also a concern when riding elevators late at
`night or to remote locations. Many elevator systems are used
`
`27
`
`27
`
`
`
`US 8,682,673 B2
`
`3
`partly or entirely within parking garages, which often may be
`sparsely populated at off hours. People are all too frequently
`assaulted or robbed when departing from elevators under
`such conditions. Unfortunately, existing elevator systems do
`not have the facility to provide the occupant(s) with the ability
`to selectively observe the area immediately surrounding the
`elevator doors on one or more destination floors, or otherwise
`take precautions to enhance their security.
`Another problem associated with existing elevator systems
`relates to their loading capacity. Often, especially at peak use
`hours such as during the noon hour, the call buttons for several
`different floors within a building will be activated, and eleva-
`tor cars which are at or near their loading capacity will
`respond. With no additional room available in the elevator, the
`person depressing the call button on a given floor is left to wait
`for the elevator doors to close, depress the call button again,
`and wait for another (hopefully partially vacant) car to arrive.
`This process not only delays the person waiting for the car, but
`also those on the elevator car(s), and those waiting on other
`floors.
`
`In addition to the foregoing, many elevators must have a
`means of restricting access to certain floors during certain
`time periods while not interfering with other operations.
`These elevators generally also include means by which cer-
`tain users may gain access to the restricted floors, such as a
`magnetic striped card which is inserted into a card reader on
`the elevator. However, such card readers are prone to wear and
`having to re-swipe the card several times in order to obtain
`access. Furthermore, as the card wears due to repeated swip-
`ing or bending (such as when left in the pocket of the indi-
`vidual carrying the card), the card will be more prone to
`failure and will eventually require replacement. Also, such
`cards are prone to unauthorized use. Someone stealing or
`finding the lost card can simply insert it into the card reader of
`the elevator and gain access to the restricted floor(s). It is also
`noted that since access is restricted to certain floors typically
`during late-night or weekend hours, HVAC and lighting sys-
`tems are typically turned off or dormant in order to conserve
energy. Hence, when the user arrives at one of these restricted
`access floors, several minutes are typically spent turning on
`the HVAC, lights, and any other number of electrical devices.
`Some systems require the user to insert their magnetic strip
`card in a separate reader, such as in the control room for the
`HVAC (which is typically located on a different floor), in
`order to initiate equipment operation. This is obviously time
`consuming and cumbersome.
`Lastly, there is often an element of discomfort associated
`with riding an elevator car, especially when several individu-
`als are present in the car. Due in part to minimal space within
the car and nothing to occupy the occupants' attention visu-
ally, there is a natural tendency for one to stare up, down, or
`forward at the door of the elevator, or at the visual floor
`indicators so as to avoid prolonged eye contact with the other
`occupants.
`Heretofore, many of the technologies necessary to address
`the aforementioned issues have not been available or, alter-
`natively, have been cost or space prohibitive to implement.
`However, recent advances in data networking, thin or flat
`panel display technology, personal electronics, and speech
`recognition and compression algorithms and processing have
`enhanced the viability of such features from both technologi-
`cal and commercial perspectives.
`Based on the foregoing, there is a need for an improved
`elevator system and method of operation which will reduce
`the time spent waiting for and travelling on the elevator car,
`reduce the frustration associated with repeated stops at dif-
`ferent floors, and allow the occupants of the elevator (as well
`
`10
`
`15
`
`20
`
`25
`
`30
`
`35
`
`40
`
`45
`
`50
`
`55
`
`60
`
`65
`
`4
`
`as those waiting for the car) to use their time more efficiently
`and obtain needed information. Additionally, such an elevator
`system would enhance the security of the occupants upon
egress, and allow for automatic recognition of an individual in
`order to provide access to certain restricted locations and
`initiation of certain functions such as lighting and HVAC.
SUMMARY OF THE INVENTION
`In one aspect, a computerized information apparatus is
`disclosed. In one embodiment, the apparatus includes: a net-
`work interface; processing apparatus in data communication
`with the network interface; a display device; a speech digiti-
`zation apparatus in data communication with the processing
`apparatus; and a storage apparatus comprising at least one
`computer program. In one variant, the at least one program is
`configured to, when executed on a processing apparatus:
`receive a digitized speech input from the speech digitization
`apparatus, the input relating to a desired information which a
user wishes to locate; cause evaluation of the digitized speech
`input to identify one or more words or word strings within the
`digitized input; and cause, based at least in part on the one or
more identified words or word strings, access of
`a remote network entity to obtain the desired information.
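The claimed processing chain — receive digitized speech, identify words within it, then access a remote entity for the desired information — can be sketched as below. The vocabulary, function names, and return values are illustrative assumptions; a real system would run actual speech recognition and issue a network request.

```python
# Hypothetical sketch of the claimed pipeline: digitized speech in,
# identified words out, then a lookup against a remote entity.
# All names below are illustrative, not taken from the patent.
VOCABULARY = {"directory", "weather", "traffic", "restaurant"}

def identify_words(digitized_speech: str) -> list:
    """Stand-in for the word-identification step: keep known words only."""
    return [w for w in digitized_speech.lower().split() if w in VOCABULARY]

def query_remote_entity(words: list) -> str:
    """Stand-in for the network-access step (a real system would issue a
    request through the network interface)."""
    return "results for: " + ", ".join(words)

def handle_speech_input(digitized_speech: str) -> str:
    words = identify_words(digitized_speech)
    if not words:
        return "no recognizable request"
    return query_remote_entity(words)
```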
`In another embodiment, the apparatus includes: a wireless
`network interface; first processing apparatus in data commu-
`nication with the network interface; a display device; a speech
`recognition apparatus in data communication with at least a
`second processing apparatus; and a storage apparatus in data
`communication with at least the second processing apparatus.
`In one variant, the storage apparatus includes at least one
`computer program, the at least one program being configured
`to, when executed by the second processing apparatus:
`receive a first digitized speech input from the speech recog-
`nition apparatus, the input relating to a desired function which
`a user wishes to perform; cause evaluation of the digitized
`speech input and identify a plurality of possible matches
`relating to the desired function; prompt the user to select one
`of the plurality of possible matches; and based at least in part
`on the user’s selection, cause access of a remote network
`entity to obtain information relating to the desired function.
`In another aspect, a computerized information and display
`apparatus is disclosed. In one embodiment, the apparatus
`includes: a wireless network interface, the interface compli-
ant with an IEEE 802.11 Standard; first processing apparatus
`in data communication with the network interface; a substan-
`tially flat panel display device; a speech digitization apparatus
`in data communication with at least the first processing appa-
`ratus; a microphone in communication with the speech digi-
`tization apparatus; and a storage apparatus in data communi-
`cation with the at least the second processing apparatus. In
`one variant, the storage apparatus includes at least one com-
`puter program, the at least one program being configured to,
`when executed: receive a first digitized speech input from the
`speech digitization apparatus, the input comprising input for
`performance of a desired function which a user wishes to
`perform, the desired function being one of a plurality of
`predetermined different functions displayable to the user on
`the display device and from which the user must select; evalu-
`ate the digitized speech input and identify a plurality of pos-
`sible matches thereto; prompt the user to select one of the
`plurality of possible matches via a subsequent user speech
`input; and based at least in part on the user’s subsequent
`speech input, cause access of a remote network entity to
`obtain information relating to a substance of the subsequent
`speech input.
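The disambiguation step described above — several possible matches are identified and the user is prompted to select one — can be sketched as follows. The prefix-matching rule and the injected `choose` callback are assumptions for illustration; in the patent the prompt is a synthesized voice or on-screen list.

```python
def possible_matches(spoken: str, directory: list) -> list:
    """Assumed matcher (illustrative): case-insensitive prefix match."""
    s = spoken.strip().lower()
    return [name for name in directory if name.lower().startswith(s)]

def disambiguate(spoken: str, directory: list, choose):
    """Identify candidate matches; when several remain, prompt the user
    (here modeled by the injected `choose` callable) to select one."""
    matches = possible_matches(spoken, directory)
    if not matches:
        return None          # nothing matched; re-prompt in a real system
    if len(matches) == 1:
        return matches[0]    # unambiguous; no prompt needed
    return choose(matches)   # e.g. a synthesized voice prompt in the patent
```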
`In another embodiment, the computerized information and
`display apparatus includes a first apparatus comprising: a
`network interface configured for communication with an
`
`28
`
`28
`
`
`
`US 8,682,673 B2
`
`5
`internetwork; a first wireless interface; first processing appa-
`ratus in data communication with the network interface and
`
`the wireless interface; and a substantially flat panel display
`device. The computerized information and display apparatus
`of this embodiment further includes a second apparatus com-
`prising: a second wireless interface configured for wireless
`data communication with the first wireless interface; a second
`processing apparatus in data communication with the second
`wireless interface; a speech digitization apparatus in data
`communication with the at least the second processing appa-
`ratus; and a microphone in communication with the speech
`digitization apparatus. The computerized information and
`display apparatus is configured to: receive a user’s speech
`input via the microphone, the input providing a search term
`for a desired function; digitize the speech input using at least
`the speech digitization apparatus; cause evaluation of the
`digitized speech to identify one or more words or word
strings; cause access of a remote information or content entity
`based at least in part on the identified one or more words or
`word strings; receive, via the network interface, information
`or content relating to the desired function, the information or
content obtained from the access of the remote information or
content entity; and display the received information or con-
tent on the substantially flat panel display device.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a block diagram of one embodiment of the infor-
mation and control system of the invention, showing those
components local to each elevator car.
FIG. 2 is a plan view of a first embodiment of the interface
panel of the information and control system of FIG. 1, includ-
ing the touch keypad and the display device.
FIG. 3 is a block diagram of one embodiment of the infor-
mation and control system network architecture.
FIG. 4 is a logic diagram illustrating the operation of one
embodiment of the building directory sub-system of the
invention.
FIG. 5 is a plan view of one embodiment of a building
directory sub-system graphic location file, as shown on the
display device of the information and control system.
FIG. 6a is a plan view of one embodiment of a network
input device having dedicated function keys thereon.
FIGS. 6b and 6c illustrate one embodiment of an exem-
plary coordinated graphic weather display according to the
present invention.
FIG. 7 is a plan view of one embodiment of the PED data
download terminal of the invention.
FIG. 8 is a block diagram of one embodiment of the capac-
ity sensing sub-system according to the present invention.
FIG. 9 is a plan view of one embodiment of the elevator
floor sensor array used in conjunction with the capacity sens-
ing sub-system of FIG. 8.
FIG. 10 is a logic diagram illustrating the method of opera-
tion of the capacity sensing sub-system of FIG. 8.
FIG. 11 is a block diagram illustrating one embodiment of
the monitoring and security sub-system of the present inven-
tion.
FIG. 12 illustrates one embodiment of the elevator car
touch panel used in conjunction with the monitoring and
security sub-system of FIG. 11.
FIG. 13 is a block diagram of a second embodiment of the
monitoring and security sub-system of the present invention.
FIGS. 14a and 14b are plan views of one embodiment of
the parking and video monitoring displays, respectively, of
the monitoring and security sub-system of FIG. 11.
FIG. 15 is a block diagram illustrating one embodiment of
the identification and access sub-system of the present inven-
tion.
FIG. 16 is a logic diagram illustrating the operation of the
identification and access sub-system of FIG. 15.
FIG. 17 is a plan view of one embodiment of a utility
services selection display associated with the identification
and access sub-system of FIG. 15.
FIG. 18a is a logic diagram illustrating the operation of a
first embodiment of the prompt mode of the adaptive adver-
tising sub-system of the invention.
FIG. 18b illustrates the library data file structure used in
conjunction with the advertising sub-system of the invention.
FIG. 18c is a logic diagram illustrating the operation of a
second embodiment of the advertising sub-system of the
invention.
FIG. 18d is a logic diagram illustrating the operation of a
third embodiment of the adaptive advertising sub-system of
the invention.
FIG. 19 is a logic diagram illustrating the operation of a
fourth embodiment of the adaptive advertising sub-system of
the invention.
`
`DESCRIPTION OF THE INVENTION
`
`Reference is now made to the drawings listed above,
`wherein like numerals refer to like parts throughout.
`It is noted that while the system and methods of the inven-
`tion disclosed herein are described primarily with respect to
`an elevator car, certain aspects of the invention may be useful
`in other applications,
`including, without limitation, other
`types of personnel transport devices such as trams or shuttles
`or moving walkways, or stationary devices such as kiosks
`within the lobby or elevator waiting areas of a building. As
`used herein, the term “building” is meant to encompass any
`structure, whether above ground or underground, permanent
`or temporary, used for any function.
`General Description
`Referring now to FIGS. 1 and 2, one embodiment of an
`improved elevator information system is generally described.
`As shown in FIG. 1, the system 100 includes an input device
`102, speech recognition (SR) module 104, central processor
`106 with associated motherboard 121, video RAM 107, non-
`volatile storage device 108 containing a database (not
`shown), graphics co-processor 109, volatile or dynamic stor-
`age device 110 with associated DMA module 139, audio
`amplifier and speaker module 111, speech synthesis module
`112, micro-controller 123, PCI slots 147, and display device
`113. The system also includes a serial bus with universal
asynchronous receiver transmitter (UART) 117 or alterna-
tively universal serial bus (USB), as described in greater
`detail below with respect to FIG. 7. As shown in FIG. 2, the
`input device 102 of the present embodiment is a touch-sen-
`sitive keypad and/or display screen of the type well known in
`the electrical arts. The input device 102 includes a variety of
`different functional keys 114 on a keypad 116 (and/or on a
`touch-sensitive display screen 113, as described below)
`which allow the user to initiate a query of the database either
`manually via the keypad 116, display device 113, or audibly
`through the speech recognition module 104.
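The point of the paragraph above is that one database query can be initiated through any of three input paths: the keypad 116, the display device 113, or the speech recognition module 104. A minimal sketch of that dispatch is below; the class and method names are assumptions for illustration, not identifiers from the patent.

```python
# Illustrative sketch: the same database lookup reached from either a
# keypad entry or a recognized speech string. Names are assumptions.
class Database:
    def __init__(self, entries: dict):
        self.entries = entries

    def lookup(self, key: str) -> str:
        return self.entries.get(key, "not found")

class InformationSystem:
    def __init__(self, database: Database):
        self.database = database

    def query_from_keypad(self, keyed_text: str) -> str:
        return self.database.lookup(keyed_text)

    def query_from_speech(self, recognized_text: str) -> str:
        # In the patent, speech first passes through module 104;
        # here we just normalize the recognized text.
        return self.database.lookup(recognized_text.strip().lower())
```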
`As shown in FIG. 1, the speech recognition module 104 of
`the present invention includes a high quality, high SNR audio
`microphone 118, analog-to-digital converter (ADC) 141, and
`linear predictive coding (LPC)-based spectral analysis algo-
`rithm run on a digital signal processor 125 having associated
`SR module RAM 127. It will be recognized that other forms
`of spectral analysis, such as MFCC (Mel Frequency Cepstral
`
`29
`
`29
`
`
`
`US 8,682,673 B2
`
`7
`Coefficients) or cochlea modeling, may be used. Phoneme/
`word recognition in the present embodiment is based on
`HMM (hidden Markov modeling), although other processes
`such as, without limitation, DTW (Dynamic Time Warping)
`or NNs (Neural Networks) may be used. Myriad speech rec-
`ognition systems and algorithms are available, all considered
`within the scope of the invention disclosed herein.
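LPC-based spectral analysis of the kind named above derives a set of predictor coefficients from the signal's autocorrelation. A minimal sketch of the standard Levinson-Durbin recursion follows; this is a textbook formulation, not the patent's implementation, and the function names are illustrative.

```python
def autocorrelation(signal, order):
    """Biased autocorrelation r[0..order] of a sample sequence."""
    n = len(signal)
    return [sum(signal[i] * signal[i + k] for i in range(n - k))
            for k in range(order + 1)]

def lpc_coefficients(signal, order):
    """Levinson-Durbin recursion: solve the normal equations for the
    LPC predictor coefficients a1..a_order (A(z) = 1 + sum a_j z^-j)."""
    r = autocorrelation(signal, order)
    a = [1.0] + [0.0] * order
    error = r[0]
    for i in range(1, order + 1):
        acc = r[i] + sum(a[j] * r[i - j] for j in range(1, i))
        k = -acc / error                 # reflection coefficient
        new_a = a[:]
        for j in range(1, i):
            new_a[j] = a[j] + k * a[i - j]
        new_a[i] = k
        a = new_a
        error *= (1.0 - k * k)           # prediction-error update
    return a[1:]
```

For a first-order signal x[n] = 0.9·x[n-1], the order-1 predictor coefficient recovered this way is approximately -0.9, as expected for that recursion's sign convention.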
`In the present embodiment, CELP-based voice data com-
`pression is also utilized for transmission and storage of voice
`data. CELP algorithms in general are useful for converting
`analog speech to a compressed digital format which is more
`rapidly and easily manipulated and stored within a digital
`system using less bandwidth and memory. CELP algorithms
`and low bit rate vocoder technology are well known in the
`signal processing art, and accordingly will not be described
`further herein. No