EP 0 613 109 A1
pattern all the way down the runway. This lighting pattern could be turned on as a plane is cleared for landing and then turned off after the aircraft has touched down. A pilot approaching the runway along an intersecting taxiway would be alerted in a clear and unambiguous way that the runway was active and should not be crossed.

If an incursion was detected, the main computers 26, 28 could switch the runway strobe lights 48 from the "rabbit" pattern to a pattern that alternately flashes either side of the runway in a wig-wag fashion. A switch to this pattern would be interpreted by the pilot of an arriving aircraft as a wave-off and a signal to go around. The abrupt switch in the pattern of the strobes would be instantaneously picked up by the air crew in time for them to initiate an aborted landing procedure.
During Category III weather conditions both runway and taxiway visibility are very low. Currently, radio-based landing systems are used to get the aircraft from final approach to the runway. Once on the runway it is not always obvious which taxiways are to be used to reach the airport terminal. In system 10 the main computers 26, 28 can control the taxiway lamps 40 as the means for guiding aircraft on the ground during CAT III conditions. Since the intensity of the taxiway lamps 40 can be controlled remotely, the lamps just in front of an aircraft could be intensified or flashed as a means of guiding it to the terminal.

Alternatively, a short sequence of the "rabbit" pattern may be programmed into the taxiway strobes just in front of the aircraft. At intersections, either the unwanted paths may have their lamps turned off or the entrance to the proper section of taxiway may flash, directing the pilot to head in that direction. Of course, in a smart system only those lights directly in front of a plane would be controlled; all other lamps on the field would remain in their normal mode.
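For illustration only, the lamp-guidance logic described above might look like the following sketch. The class and function names (TaxiwayLamp, lamps_ahead, guide_aircraft), the lamp spacing, the guidance window and the intensity levels are all assumptions made for the example; the patent itself only requires that the lamps just in front of the aircraft be intensified or flashed while all others remain in their normal mode.

```python
from dataclasses import dataclass

@dataclass
class TaxiwayLamp:
    lamp_id: int
    distance_m: float      # position along the taxiway centerline
    intensity: int = 30    # percent of full brightness (normal mode)

def lamps_ahead(lamps, aircraft_pos_m, window_m=120.0):
    """Return the lamps within window_m metres in front of the aircraft."""
    return [l for l in lamps
            if aircraft_pos_m < l.distance_m <= aircraft_pos_m + window_m]

def guide_aircraft(lamps, aircraft_pos_m):
    """Intensify only the lamps just in front of the aircraft; all other
    lamps on the field remain in their normal (dimmed) mode."""
    ahead = {l.lamp_id for l in lamps_ahead(lamps, aircraft_pos_m)}
    for lamp in lamps:
        lamp.intensity = 100 if lamp.lamp_id in ahead else 30
    return lamps

if __name__ == "__main__":
    taxiway = [TaxiwayLamp(i, i * 60.0) for i in range(20)]  # a lamp every 60 m
    guide_aircraft(taxiway, aircraft_pos_m=300.0)
    print([l.lamp_id for l in taxiway if l.intensity == 100])  # lamps just ahead
```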
Referring now to FIG. 9, a block diagram is shown of the data flow within the system 10 (as shown in FIG. 1 and FIG. 5). The software modules are shown that are used to process the data within the computers 26, 28 of the central computer system 12. The tracking of aircraft and other vehicles on the airport operates under the control of a sensor fusion software module 101 which resides in the computers 26, 28. The sensor fusion software module 101 receives data from the plurality of sensors 50, a sensor 50 being located in each edge light assembly 20_1-n which reports the heat level detected, and this software module 101 combines this information through the use of rule-based artificial intelligence to create a complete picture of all ground traffic at the airport on a display 30 of the central computer system 12.
The tracking algorithm starts a track upon the first report of a sensor 50 detecting a heat level that is above the ambient background level of radiation. This detection is then verified by checking the heat level reported by the sensor directly across the pavement from the first reporting sensor. This secondary reading is used to confirm the vehicle detected and to eliminate false alarms. After a vehicle has been confirmed, the sensors adjacent to the first reporting sensor are queried for changes in their detected heat level. As soon as one of the adjacent sensors detects a rise in heat level, a direction vector for the vehicle can be established. This process continues as the vehicle is handed off from sensor to sensor in a bucket brigade fashion as shown in FIG. 7. Vehicle speed can be roughly determined by calculating the time between vehicle detection by adjacent sensors. This information is combined with information from a system data base on the location of each sensor to calculate the velocity of the target. Due to hot exhaust or jet blast, the sensors behind the vehicle may not return to a background level immediately. Because of these conditions, the algorithm only uses the first four sensors (two on either side of the taxiway) to calculate the vehicle's position. The vehicle is always assumed to be on the centerline of the pavement and between the first four reporting sensors.
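A minimal sketch of the confirm-then-hand-off logic described above is given below. The confirmation across the pavement, the bucket-brigade hand-off and the speed estimate from adjacent detections follow the text; the data layout (sensor pairs indexed along the pavement), the ambient threshold, the sensor spacing and all names are illustrative assumptions.

```python
import time

AMBIENT_LEVEL = 20.0      # assumed ambient background radiation level (arbitrary units)
SENSOR_SPACING_M = 60.0   # assumed spacing between adjacent edge-light sensors

class Track:
    """One confirmed vehicle track, indexed by sensor pair along the pavement."""
    def __init__(self, index, now):
        self.index = index        # which sensor pair last reported the vehicle
        self.direction = 0        # +1 or -1 once an adjacent pair fires
        self.speed_mps = None
        self.last_time = now

def above_ambient(level):
    return level > AMBIENT_LEVEL

def start_track(readings, i, now):
    """Start a track on the first above-ambient report and confirm it with the
    sensor directly across the pavement (readings[i] is a (left, right) pair)."""
    left, right = readings[i]
    if above_ambient(left) and above_ambient(right):
        return Track(i, now)
    return None

def hand_off(track, readings, now):
    """Bucket-brigade hand-off: when an adjacent sensor pair rises above the
    ambient level, update direction, position and a rough speed estimate."""
    for step in (+1, -1):
        j = track.index + step
        if 0 <= j < len(readings) and all(above_ambient(v) for v in readings[j]):
            dt = max(now - track.last_time, 1e-6)
            track.speed_mps = SENSOR_SPACING_M / dt   # time between adjacent detections
            track.direction = step
            track.index = j
            track.last_time = now
            return True
    return False

# Example: a vehicle confirmed at pair 3, then picked up by pair 4 two seconds later.
readings = [(20, 20)] * 8
readings[3] = (35, 33)
track = start_track(readings, 3, now=time.monotonic())
readings[4] = (34, 36)
hand_off(track, readings, now=track.last_time + 2.0)
print(track.index, track.direction, track.speed_mps)   # 4, +1, 30.0 m/s
```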
Vehicle identification can be added to the track either manually or automatically by an automated source that can identify a vehicle by its position. An example would be prior knowledge of the next aircraft to land on a particular runway. Tracks are ended when a vehicle leaves the detection system. This can occur in one of two ways. The first way is that the vehicle leaves the area covered by the sensors 50. This is determined by a vehicle track moving in the direction of a gateway sensor and then a lack of detection after the gateway sensor has lost contact. A second way to leave the detection system is for a track to be lost in the middle of a sensor array. This can occur when an aircraft departs or a vehicle runs onto the grass. Takeoff scenarios can be determined by calculating the speed of the vehicle just before detection was lost. If the vehicle speed was increasing and above rotation speed, then the aircraft is assumed to have taken off. If not, then the vehicle is assumed to have gone onto the grass and an alarm is sounded.
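The takeoff-versus-grass decision at the end of a track can be expressed as a small classifier, sketched below. The speed-increasing-and-above-rotation-speed test comes from the text; the numeric rotation speed and the function name are placeholders.

```python
def classify_lost_track(last_speeds_mps, rotation_speed_mps=70.0):
    """Decide why a track disappeared in the middle of a sensor array.

    last_speeds_mps: the last few speed estimates before detection was lost.
    Returns "takeoff" if the speed was increasing and above rotation speed,
    otherwise "off-pavement" (and the caller should sound an alarm).
    """
    if len(last_speeds_mps) >= 2:
        increasing = last_speeds_mps[-1] > last_speeds_mps[0]
        if increasing and last_speeds_mps[-1] > rotation_speed_mps:
            return "takeoff"
    return "off-pavement"

# Example: 60, 70, 82 m/s just before loss of detection -> "takeoff"
print(classify_lost_track([60.0, 70.0, 82.0]))
```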
Referring to FIG. 5 and FIG. 9, the ground clearance routing function is performed by the speech recognition unit 33 along with the ground clearance compliance verifier software module 103 running on the computers 26, 28. This software module 103 comprises a vehicle identification routine, a clearance path routine, a clearance checking routine and a path checking routine.

The vehicle identification routine is used to receive the airline name and flight number (i.e. "Delta 374") from the speech recognition unit 33 and it highlights the icon of that aircraft on the graphic display of the airport on display 30.
The clearance path routine takes the remainder of the controller's phrase (i.e. "outer taxiway to echo, hold short of runway 15 Left") and provides a graphical display of the clearance on the display 30 showing the airport.

The clearance checking routine checks the clearance path for possible conflict with other clearances and vehicles. If a conflict is found, the portion of the path that would cause an incursion is highlighted in a blinking red and an audible indication is given to the controller via speaker 32.
The path checking routine checks the actual path of the vehicle as detected by the sensors 50 after the clearance path has been entered into the computers 26, 28 and it monitors the actual path for any deviation. If this routine detects that a vehicle has strayed from the assigned course, the vehicle icon on the graphic display of the airport flashes and an audible indicator is given to the controller via speaker 32 and optionally the vehicle operator via radio 37.
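The path checking routine can be sketched as a comparison between the cleared route and the route actually observed by the sensors 50, as below. The segment names and the set-membership test are illustrative; the patent specifies only that a detected deviation flashes the vehicle icon and raises an audible alert.

```python
def check_compliance(cleared_route, observed_segments):
    """Return the first observed taxiway/runway segment that is not part of
    the cleared route, or None if the vehicle is still in compliance.

    cleared_route:     ordered segment names from the spoken clearance
                       (e.g. ["outer", "echo", "hold-short-15L"])
    observed_segments: segments reported so far by the tracker
    """
    allowed = set(cleared_route)
    for seg in observed_segments:
        if seg not in allowed:
            return seg          # deviation -> flash icon, sound the alert
    return None

deviation = check_compliance(["outer", "echo", "hold-short-15L"],
                             ["outer", "echo", "runway-15L"])
if deviation:
    print(f"ALERT: vehicle strayed onto {deviation}")
```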
The airport vehicle incursion avoidance system 10 operates under the control of safety logic routines which reside in the collision detection software module 104 running on computers 26, 28. The safety logic routines receive data from the sensor fusion software module 101 location program via the tracker software module 102 and interpret this information through the use of rule-based artificial intelligence to predict possible collisions or runway incursions. This information is then used by the central computer system 12 to alert tower controllers, aircraft pilots and truck operators to the possibility of a runway incursion. The tower controllers are alerted by the display 30 along with a computer-synthesized voice message via speaker 32. Ground traffic is alerted by a combination of traffic lights, flashing lights, stop bars and other alert lights 34, lamps 40 and 48, and computer-generated voice commands broadcast via radio 36.
Knowledge-based problems are also called fuzzy problems and their solutions depend upon both program logic and an inference engine that can dynamically create a decision tree, selecting which heuristics are most appropriate for the specific case being considered. Rule-based systems broaden the scope of possible applications. They allow designers to incorporate judgement and experience, and to take a consistent solution approach across an entire problem set.
The programming of the rule-based incursion detection software is very straightforward. The rules are written in English, allowing the experts, in this case the tower personnel and the pilots, to review the system at an understandable level. Another feature of the rule-based system is that the rules stand alone. They can be added, deleted or modified without affecting the rest of the code. This is almost impossible to do with code that is created from scratch. An example of a rule we might use is:

If (Runway_Status = ACTIVE)
then (Stop_Bar_Lights = RED).

This is a very simple and straightforward rule. It stands alone, requiring no extra knowledge except how Runway_Status is created. So let's make some rules affecting Runway_Status.
If (Departure = APPROVED) or (Landing = IMMINENT),
then (Runway_Status = ACTIVE).

For incursion detection, another rule is:

If (Runway_Status = ACTIVE) and (Intersection = OCCUPIED),
then (Runway_Incursion = TRUE).

Next, detect that an intersection of a runway and a taxiway is occupied by the rule:

If (Intersection_Sensors = DETECT),
then (Intersection = OCCUPIED).

To predict that an aircraft will run a Hold Position stop, the following rule is created:

If (Aircraft_Stopping_Distance > Distance_to_Hold_Position),
then (Intersection = OCCUPIED).
In order to show that rules can be added without affecting the rest of the program, assume that after a demonstration of the system 10 to tower controllers, they decided that they wanted a "Panic Button" in the tower to override the rule-based software in case they spot a safety violation on the ground. Besides installing the button, the only other change would be to add this extra rule:

If (Panic_Button = PRESSED),
then (Runway_Incursion = TRUE).

It is readily seen that the central rule-based computer program is very straightforward to create, understand and modify. As types of incursions are defined, the system 10 can be upgraded by adding more rules.
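The rules quoted above translate almost directly into code. The following sketch renders them in Python as a simple forward-chaining loop rather than in the rule language of the actual system; the fact names mirror the rules in the text, while the evaluation loop and the sample fact values are illustrative.

```python
# Working-memory facts; the values below are an illustrative scenario.
facts = {
    "Departure": "APPROVED",
    "Landing": "NOT_IMMINENT",
    "Intersection_Sensors": "DETECT",
    "Aircraft_Stopping_Distance": 450.0,
    "Distance_to_Hold_Position": 300.0,
    "Panic_Button": "NOT_PRESSED",
}

# Each rule stands alone: a condition over the facts and the fact it asserts.
RULES = [
    (lambda f: f.get("Departure") == "APPROVED" or f.get("Landing") == "IMMINENT",
     ("Runway_Status", "ACTIVE")),
    (lambda f: f.get("Intersection_Sensors") == "DETECT",
     ("Intersection", "OCCUPIED")),
    (lambda f: f.get("Aircraft_Stopping_Distance", 0) > f.get("Distance_to_Hold_Position", 0),
     ("Intersection", "OCCUPIED")),
    (lambda f: f.get("Runway_Status") == "ACTIVE" and f.get("Intersection") == "OCCUPIED",
     ("Runway_Incursion", True)),
    (lambda f: f.get("Runway_Status") == "ACTIVE",
     ("Stop_Bar_Lights", "RED")),
    # The "Panic Button" rule can be added without touching anything else.
    (lambda f: f.get("Panic_Button") == "PRESSED",
     ("Runway_Incursion", True)),
]

def run_rules(facts):
    """Forward-chain until no rule asserts a new fact."""
    changed = True
    while changed:
        changed = False
        for condition, (name, value) in RULES:
            if condition(facts) and facts.get(name) != value:
                facts[name] = value
                changed = True
    return facts

print(run_rules(facts).get("Runway_Incursion"))  # True for this scenario
```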
Referring again to FIG. 9, the block diagram shows the data flow between the functional elements within the system 10 (FIG. 1). Vehicles are detected by the sensor 50 in each of the edge light assemblies 20_1-n. This information is passed over the local operating network (LON) via edge light wiring 21_1-n to the LON bridges 22_1-n. The individual message packets are then passed to the redundant computers 26 and 28 over the wide area network (WAN) 14 to the WAN interface 108. After arriving at the redundant computers 26 and 28, the message packet is checked and verified by a message parser software module 100.
The contents of the message are then sent to the sensor fusion software module 101. The sensor fusion software module 101 is used to keep track of the status of all the sensors 50 on the airport; it filters and verifies the data from the airport and stores a representative picture of the sensor array in a memory. This information is used directly by the display 30 to show which sensors 50 are responding and used by the tracker software module 102. The tracker software module 102 uses the sensor status information to determine which sensor 50 reports correspond to actual vehicles. In addition, as the sensor reports and status change, the tracker software module 102 identifies movement of the vehicles and produces a target location and direction output. This information is used by the display 30 in order to display the appropriate vehicle icon on the screen.
The location and direction of the vehicle is also used by the collision detection software module 104. This module checks all of the vehicles on the ground and plots their expected course. If any two targets are on intersecting paths, this software module generates operator alerts by using the display 30, the alert lights 34, the speech synthesis unit 29 coupled to the associated speaker 32, and the speech synthesis unit 31 coupled to radio 37 which is coupled to antenna 39.
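The course-plotting step of the collision detection software module 104 can be illustrated by projecting each tracked vehicle forward along its current heading and testing whether two projections come within a protection distance. The constant-velocity projection, the time horizon and the separation threshold below are assumptions; the text states only that expected courses are plotted and intersecting paths generate alerts.

```python
import math

def project(pos, heading_deg, speed_mps, t):
    """Constant-velocity projection of an (x, y) position t seconds ahead."""
    h = math.radians(heading_deg)
    return (pos[0] + speed_mps * math.cos(h) * t,
            pos[1] + speed_mps * math.sin(h) * t)

def conflict(a, b, horizon_s=30.0, step_s=1.0, min_sep_m=100.0):
    """Return the earliest time the two projected courses come closer than
    min_sep_m, or None if they stay separated over the horizon."""
    t = 0.0
    while t <= horizon_s:
        ax, ay = project(a["pos"], a["hdg"], a["spd"], t)
        bx, by = project(b["pos"], b["hdg"], b["spd"], t)
        if math.hypot(ax - bx, ay - by) < min_sep_m:
            return t
        t += step_s
    return None

landing = {"pos": (0.0, 0.0), "hdg": 0.0, "spd": 70.0}        # rolling out on the runway
crossing = {"pos": (900.0, -200.0), "hdg": 90.0, "spd": 10.0}  # taxiing toward the runway
t = conflict(landing, crossing)
if t is not None:
    print(f"possible incursion in about {t:.0f} s -> alert controller and crews")
```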
Still referring to FIG. 9, another user of target location and position data is the ground clearance compliance verifier software module 103. This software module 103 receives the ground clearance commands from the controller's microphone 35 via the speech recognition unit 33. Once the cleared route has been determined, it is stored in the ground clearance compliance verifier software module 103 and used for comparison to the actual route taken by the vehicle. If the information received from the tracker software module 102 shows that the vehicle has deviated from its assigned course, this software module 103 generates operator alerts by using the display 30, the alert lights 34, the speech synthesis unit 29 coupled to speaker 32, and the speech synthesis unit 31 coupled to radio 37 which is coupled to antenna 39.
The keyboard 27 is connected to a keyboard parser software module 109. When a command has been verified by the keyboard parser software module 109, it is used to change display 30 options and to reconfigure the sensors and network parameters. A network configuration data base 106 is updated with these reconfiguration commands. This information is then turned into LON message packets by the command message generator 107 and sent to the edge light assemblies 20_1-n via the WAN interface 108 and the LON bridges 22_1-n.
Referring now to FIG. 1 and FIG. 10, FIG. 10 shows a pictorial diagram of an infrared vehicle identification system 109 comprising an infrared (IR) transmitter 112 mounted on an airplane 110 wheel strut 111 and an IR receiver 128 which comprises a plurality of edge light assemblies 20_1-n of an airport lighting system also shown in FIG. 1. The combination of the IR transmitter 112 mounted on aircraft and/or other vehicles and a plurality of IR receivers 128 located along runways and taxiways forms the infrared vehicle identification system 109 for use at airports for the safety, guidance and control of surface vehicles in order to provide positive detection and identification of all aircraft and other vehicles and to prevent runway incursions. Runway incursions generally occur when aircraft or other vehicles get onto a runway and conflict with aircraft cleared to land or take off on that same runway. All such incursions are caused by human error.
Referring now to FIG. 11, a block diagram of the IR transmitter 112 is shown comprising an embedded microprocessor 118 having DC power 114 inputs from the host aircraft or vehicle on which the IR transmitter 112 is mounted and an ID switch 116 within the aircraft for entering vehicle identification data which is received by the IR transmitter 112 on a serial line. Vehicle position information is provided to the IR transmitter 112 from a vehicle position receiver 117 which may be embodied by a global positioning system (GPS) receiver readily known in the art. The output of embedded microprocessor 118 feeds an IR emitter comprising a light emitting diode (LED) array 120. When power is applied to the IR transmitter 112, the microprocessor continuously outputs a coded data stream 121 (FIG. 13) which is transmitted by the IR LED array 120. The embedded microprocessor 118 may be embodied by microprocessor Model MC 6803 or equivalent manufactured by Motorola Microprocessor Products of Austin, Texas. The IR LED array 120 may be embodied by IR LED devices manufactured by Harris Semiconductor of Melbourne, Florida.
Referring now to FIG. 12, a top view of the IR transmitter 112 comprising the IR LED array 120 mounted on an airplane wheel strut 111 is shown. The IR LED array 120 comprises a plurality of high power LEDs, each having a beam width of 15°. By placing thirteen LEDs in an array, a 195° area can be covered. The IR LED array 120 illuminates edge light assemblies 20_1-n along the edges of the runway 64. Each of the edge light assemblies 20_1-n comprises an IR receiver 128.
Referring now to FIG. 13, the coded data stream emitted from the IR transmitter 112 comprises six separate fields. The first field is called timing pattern 122 and comprises a set of equally spaced pulses. The second field is called unique word 123 which marks the beginning of a message. The third field is called character count 124 which specifies the number of characters in a message. The fourth field is called vehicle identification number 125. The fifth field is called vehicle position 126 and provides latitude and longitude information.
The sixth field is called message checksum 127. The equally spaced pulses of the timing pattern 122 allow the IR receiver 128 to calculate the baud rate of a transmitted message and automatically adjust its internal timing to compensate for either a doppler shift or an offset in clock frequency. The unique word 123 field allows the IR receiver 128 to find the byte boundary. The character count 124 field is used to alert the IR receiver 128 in the edge light assemblies 20_1-n as to the length of the message being received. The IR receiver 128 uses this field to determine when the message has ended and if the message was truncated.

The vehicle identification number 125 field comprises an airline flight number or a tail number of an aircraft or a license number of other vehicles. The actual number can be alpha-numeric since each character will be allocated eight (8) bits. An ASCII code, which is known to those of ordinary skill in the art, is an example of a code type that may be used. The only constraints on the vehicle ID number are that it be unique to the vehicle and that it be entered in the airport's central computer data base to facilitate automatic identification. The checksum 127 guarantees that a complete and correct message is received. If the message is interrupted for any reason, such as a blocked beam or a change in vehicle direction, it is instantly detected and the reception voided. This procedure reduces the number of false detects and guarantees that only perfect vehicle identification messages are passed on to the central computer system 12 at the airport tower.
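The six-field layout of FIG. 13 can be illustrated with a small encoder. The field order (timing pattern, unique word, character count, vehicle ID, position, checksum) follows the text; the particular byte values chosen for the timing pattern and unique word, the fixed-point position encoding and the 8-bit summing checksum are assumptions made for the sketch.

```python
import struct

TIMING_PATTERN = bytes([0xAA] * 4)   # equally spaced pulses (assumed byte value)
UNIQUE_WORD = bytes([0x7E, 0x81])    # marks the start of a message (assumed value)

def encode_message(vehicle_id: str, lat_deg: float, lon_deg: float) -> bytes:
    """Build one coded data stream: timing pattern, unique word, character
    count, vehicle ID (8 bits per ASCII character), position, checksum."""
    ident = vehicle_id.encode("ascii")            # e.g. a flight number or tail number
    position = struct.pack(">ii",                 # lat/lon as 1e-6 degree integers
                           int(lat_deg * 1_000_000),
                           int(lon_deg * 1_000_000))
    body = bytes([len(ident)]) + ident + position
    checksum = sum(body) & 0xFF                   # simple 8-bit sum (assumed)
    return TIMING_PATTERN + UNIQUE_WORD + body + bytes([checksum])

msg = encode_message("DAL374", 42.212, -83.353)
print(msg.hex())
```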
Referring now to FIG. 1, FIG. 2, FIG. 10 and FIG. 14, a block diagram of the IR receiver 128 is shown in FIG. 14 which comprises an IR sensor 130 connected to an edge light assembly 20_1-n, shown in FIG. 1, FIG. 2 and FIG. 10, on an airport. In FIG. 14, only the relevant portions of FIG. 2 are shown, but it should be understood that all of the elements of the edge light assembly 20_1-n shown in FIG. 2 are considered present in FIG. 14. The IR receiver 128 comprises the IR sensor 130 which receives the coded data stream 121 (FIG. 13) from the transmitter 112. The output of the IR sensor 130 is fed to the microprocessor 44 for processing by an IR message routine 136 for detecting the data message. A vehicle sensor routine 138 in microprocessor 44 processes signals from the vehicle sensor 50 for detecting an aircraft or other vehicles. The IR message routine 136 is implemented with software within the microprocessor 44 as shown in the flow chart of FIG. 15. The vehicle sensor routine 138 is also implemented with software within the microprocessor 44 as shown in the flow chart of FIG. 16. The outputs of the IR message routine 136 and vehicle sensor routine 138 are processed by the microprocessor 44 which sends, via the power line modem 54, the identified aircraft or vehicle and their position data over the edge light wiring 21_1-n communication lines to the central computer system 12 shown in FIG. 1 at the airport terminal or control tower. The IR sensor 130 may be embodied with Model RY5B001 IR sensor manufactured by Sharp Electronics of Paramus, New Jersey. The microprocessor 44 may be embodied by the VLSI Neuron® Chip manufactured by Echelon Corporation of Palo Alto, California.
Referring to FIG. 15, a flow chart of the IR message routine 136 is shown which is a communication protocol continuously performed in the microprocessor 44 of the IR receiver 128. After an IR signal is detected 150, the next action is transmitter acquisition or to acquire timing 152. The microprocessor 44 looks for the proper timing relationship between the received IR pulses. If the correct on/off ratio exists, the microprocessor 44 calculates the baud rate from the received timing and waits to acquire the unique word 156 signifying the byte boundary and then checks for the capture of the character count 160 field byte. After the character count is known, the microprocessor 44 then captures each character in the vehicle ID 162 field and stores them away in a buffer. It then captures vehicle position 163 including latitude and longitude data. If the IR coded data stream is disrupted before all the vehicle ID characters are received, the microprocessor 44 aborts this reception try and returns to the acquisition or IR detected 150 state. After all characters have been received, the checksum 164 is calculated. If the checksum matches 166, then the message is validated and the vehicle ID relayed 168 to the central computer system 12. With this scheme the microprocessor 44 is implementing both the physical and a link layer of the OSI protocol by providing an error-free channel.
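The receive-side protocol of FIG. 15 can be sketched as the mirror image of the encoder above, under the same assumed field encoding. The real routine also measures the pulse timing to derive the baud rate and compensate for doppler shift; that step is reduced here to a check on the timing pattern and unique word.

```python
import struct

def decode_message(stream: bytes):
    """Validate and decode one message built by encode_message() above.
    Returns (vehicle_id, lat_deg, lon_deg), or None if the message is
    truncated or the checksum does not match (the reception is voided)."""
    TIMING_PATTERN = bytes([0xAA] * 4)    # same assumed preamble as the encoder
    UNIQUE_WORD = bytes([0x7E, 0x81])

    if not stream.startswith(TIMING_PATTERN + UNIQUE_WORD):
        return None                       # failed to acquire timing / unique word
    body = stream[len(TIMING_PATTERN) + len(UNIQUE_WORD):]
    if not body:
        return None
    count = body[0]                       # character count field
    expected = 1 + count + 8 + 1          # count + vehicle ID + position + checksum
    if len(body) < expected:
        return None                       # message truncated -> abort this reception
    payload, checksum = body[:expected - 1], body[expected - 1]
    if (sum(payload) & 0xFF) != checksum:
        return None                       # bad checksum -> void the reception
    vehicle_id = payload[1:1 + count].decode("ascii")
    lat, lon = struct.unpack(">ii", payload[1 + count:1 + count + 8])
    return vehicle_id, lat / 1_000_000, lon / 1_000_000

# decode_message(msg) recovers the vehicle ID and (to about 1e-6 degree) the position.
```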
Referring now to FIG. 16, a flow chart is shown of the vehicle sensor routine 138 software running on microprocessor 44. This software routine 138 runs as a continuous loop. An internal timer is continuously checked for a time-out condition (timer = zero 170). As soon as the timer expires, the analog value from sensor 50 is read (Read Sensor Value 171) by the microprocessor 44 and the timer is reset to the poll_time 172 variable downloaded by the central computer system 12. This sensor value is compared against a predetermined detection threshold 173 downloaded by the central computer system 12. If the sensor value is less than the detection threshold, the microprocessor 44 sets the network variable prelim_detect to the FALSE state 174. If the sensor value is greater than the detection threshold, the microprocessor 44 sets the network variable prelim_detect to the TRUE state 175. If a preliminary detection is declared, the program then checks to see what reporting mode 176 is in use. If all detections are required to be sent to the central computer system 12, then this sensor value 180 is sent. If only those readings that differ from the previous reading by a predetermined delta, downloaded by the central computer system 12, are to be sent, then this check is made 177. If the change is greater than the delta 177, the program checks to see if it should confirm the detection 178 to eliminate any false alarms.
If a confirmation is not required, then this sensor value 181 is sent. If in the confirmation mode, then the adjacent sensor's 179 preliminary network variable is checked. If the adjacent sensor has also detected the object, then the current sensor value 182 is sent.
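The polling loop of FIG. 16 can be sketched as follows. The names prelim_detect and poll_time, the detection threshold and the delta/confirmation reporting modes follow the text; the timer handling, the neighbour lookup and the transport back to the central computer system 12 are simplified placeholders.

```python
import time

class SensorNode:
    """One edge-light sensor node running the vehicle sensor routine."""

    def __init__(self, read_sensor, neighbour, send_to_central,
                 poll_time=0.5, threshold=25.0, delta=2.0,
                 report_all=False, confirm=True):
        self.read_sensor = read_sensor    # callable returning the analog sensor value
        self.neighbour = neighbour        # adjacent node (for confirmation) or None
        self.send = send_to_central       # callable(value) -> report to the computers
        self.poll_time = poll_time        # downloaded by the central computer system
        self.threshold = threshold        # detection threshold, also downloaded
        self.delta = delta                # minimum change worth reporting
        self.report_all = report_all      # reporting mode: send every detection
        self.confirm = confirm            # reporting mode: require neighbour agreement
        self.prelim_detect = False        # network variable
        self.last_value = None

    def poll_once(self):
        value = self.read_sensor()
        self.prelim_detect = value >= self.threshold
        if not self.prelim_detect:
            return
        if self.report_all:
            self.send(value)
        elif self.last_value is None or abs(value - self.last_value) > self.delta:
            if not self.confirm or (self.neighbour and self.neighbour.prelim_detect):
                self.send(value)
        self.last_value = value

    def run(self):
        while True:                       # continuous loop, timer reset to poll_time
            self.poll_once()
            time.sleep(self.poll_time)
```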
This concludes the description of the preferred embodiment. However, many modifications and alterations will be obvious to one of ordinary skill in the art without departing from the spirit and scope of the inventive concept. Therefore, it is intended that the scope of this invention be limited only by the appended claims.

Claims

1. A vehicle identification system for identifying aircraft and other vehicles on surface pathways including runways and other areas of an airport comprising:
means disposed on said aircraft and other vehicles for transmitting identification message data;
means disposed in each of a plurality of light assembly means on said airport for receiving and decoding said message data from said transmitting means;
means for providing power to each of said plurality of light assembly means;
means for processing said decoded identification message data generated by said receiving and decoding means in each of said plurality of light assembly means;
means for providing data communication between each of said light assembly means and said processing means; and
said processing means comprises means for providing a graphic display of said airport comprising symbols representing said aircraft and other vehicles, each of said symbols having said identification message data displayed.

2. The vehicle identification system as recited in Claim 1 wherein said transmitting means comprises:
means for creating a unique message data which includes aircraft and flight identification; and
infrared means coupled to said message creating means for transmitting a coded stream of said message data.
3. The vehicle identification system as recited in Claim 2 wherein:
said message data further includes position information.

4. The vehicle identification system as recited in Claim 1 wherein:
said receiving and decoding means comprises an infrared sensor.
5. The vehicle identification system as recited in Claim 4 wherein:
said receiving and decoding means comprises microprocessor means coupled to said infrared sensor for decoding said message data.

6. The vehicle identification system as recited in Claim 1 wherein:
said plurality of light assembly means being arranged in two parallel rows along runways and taxiways of said airport.

7. The vehicle identification system as recited in Claim 1 wherein said light assembly means comprises:
light means coupled to said lines of said power providing means for lighting said airport;
vehicle sensing means for detecting aircraft or other vehicles on said airport;
microprocessor means coupled to said receiving and decoding means, said light means, said vehicle sensing means and said data communication means for decoding said identification message data; and
said data communication means being coupled to said microprocessor means and said lines of said power providing means.

8. The vehicle identification system as recited in Claim 1 wherein:
said symbols representing aircraft and other vehicles comprise icons having a shape indicating type of aircraft or vehicle.

9. The vehicle identification system as recited in Claim 1 wherein:
said processing means determines a location of said symbols on said graphic display of said airport in accordance with data received from said light assembly means.
10. A vehicle identification system for identifying aircraft and other vehicles on surface pathways including runways and other areas of an airport comprising:
means disposed on said aircraft and other vehicles for creating a unique message including aircraft and flight identification;
infrared means coupled to said message creating means for transmitting a coded stream of said message data;
infrared means disposed in each of a plurality of light assembly means on said airport for receiving said message data from said transmitting means;
microprocessor means coupled to said receiving means for decoding said message data;
means for providing power to each of said plurality of light assembly means;
means for processing said decoded message data generated by said decoding means in each of said plurality of light assembly means;
means for providing data communication between each of said light assembly means and said processing means; and
said processing means comprises means for providing a graphic display of said airport comprising symbols representing said aircraft and other vehicles, each of said symbols having said identification message data displayed.
11. The vehicle identification system as recited in Claim 10 wherein:
said message data further includes position information.
12. The vehicle identification system as recited in Claim 10 wherein:
said plurality of light assembly means being arranged in two parallel rows along runways and taxiways of said airport.

13. The vehicle identification system as recited in Claim 10 wherein said light assembly means comprises:
light means coupled to said lines of said power providing means for lighting said airport;
vehicle sensing means for detecting aircraft or other vehicles on said airport;
said microprocessor means coupled to said decoding means, said light means, said vehicle sensing means and said data communication means further processes a detection signal from said vehicle sensing means; and
said data communication means being coupled to said microprocessor means and said lines of said power providing means.

14. The vehicle identification system as recited in Claim 10 wherein:
said symbols representing aircraft and other vehicles comprise icons having a shape indicating type of aircraft or vehicle.

15. The vehicle identification system as recited in Claim 10 wherein:
said processing means determines a location of said symbols on said graphic display of said airport in accordance with data received from said light assembly means.

16. A vehicle identification system for surveillance and identification of aircraft and other vehicles on an airport comprising:
a plurality of light circuits on said airport, each of said light circuits comprises a plurality of light assembly means;
means for providing power to each of said plurality of light circuits and to each of said light assembly means;
means in each of said light assembly means for sensing ground traffic on said airport;
means disposed on said aircraft and other vehicles for transmitting identification message data;
means disposed in each of said light assembly means for receiving and decoding said message data from said transmitting means;
means for processing ground traffic data from said sensing means and decoded message data from each of said light assembly means for presentation on a graphic display of said airport;
means for providing data communication between each of said light assembly means and said processing means; and
said processing means comprises means for providing such graphic display of said airport comprising symbols representing said ground traffic, each of said symbols having direction, velocity and said identification message data displayed.
17. The vehicle identification system as recited in Claim 16 wherein:
each of said light circuits being located along the edges of taxiways or runways on said airport.

18. The vehicle identification system as recited in Claim 16 wherein:
said sensing means comprises infrared detectors.

19. The vehicle identification system as recited in Claim 16 wherein said transmitting means comprises:
means for creating unique message data which includes aircraft and flight identification; and
infrared means coupled to said message creating means for transmitting a coded stream of said message data.
20. The vehicle identification system as recited in Claim 19 wherein:
said message data further comprises position information.

21. The vehicle identification system as recited in Claim 16 wherein:
said receiving and decoding means comprises an infrared sensor.
22. The vehicle identification system as recited in Claim 21 wherein:
said receiving and decoding means comprises microprocessor means coupled to said infrared sensor for decoding said message data.

23. The vehicle identification system as recited in Claim 16 wherein:
said plurality of light assembly means of said light circuits being arranged in two parallel rows along runways and taxiways of said airport.

24. The vehicle identification system as recited in Claim 16 wherein said light assembly means comp
