EXHIBIT C

IN THE UNITED STATES DISTRICT COURT
FOR THE WESTERN DISTRICT OF WASHINGTON
SEATTLE DIVISION

CYWEE GROUP LTD.,

                    Plaintiff,

        v.

HTC CORPORATION and
HTC AMERICA, INC.,

                    Defendants.

CASE NO. 2:17-cv-00932

JURY TRIAL DEMANDED


DECLARATION OF NICHOLAS GANS, PH.D.

I, Nicholas Gans, Ph.D., hereby declare as follows:

1. I have been asked by counsel for Plaintiff CyWee Group Ltd. (“CyWee”) to offer information and my opinions as to the technologies disclosed in U.S. Patent No. 8,441,438 (the “’438 patent”) and U.S. Patent No. 8,552,978 (the “’978 patent”).

2. In connection with the preparation of this Declaration, I have reviewed the materials listed below:

• The ’438 patent;
• The file wrapper for the ’438 patent;
• The ’978 patent; and
• The file wrapper for the ’978 patent.

3. All of the opinions stated in this declaration are based on my personal knowledge and professional judgment. If called as a witness, I am prepared to testify competently about them.

I. EXPERIENCE AND QUALIFICATIONS

4. I am a Clinical Associate Professor with the Department of Electrical Engineering at the University of Texas at Dallas.

5. I received my doctorate in Systems and Entrepreneurial Engineering from the University of Illinois at Urbana-Champaign, with dissertation research in the fields of robotics, controls, and estimation. I continue to research and teach these topics in my capacity as a Professor, with over 100 peer-reviewed publications and three patents. I have authored multiple papers on the topic of Inertial Measurement Units and related sensors and fusion algorithms.

6. A more complete list of my qualifications is set forth in my curriculum vitae, a copy of which is attached hereto as Exhibit A.

7. I am being compensated for work in this matter. My compensation in no way depends on the outcome of this litigation, nor do I have a personal interest in the outcome of this litigation.

II. NATURE OF THE DISCLOSED TECHNOLOGIES

8. The ’438 patent and ’978 patent disclose devices and methods for tracking the motion of a device in 3D space and compensating for accumulated errors. That is, at a high level, the patented inventions teach how to determine a device’s current orientation based on motion data detected by its motion sensors, such as an accelerometer, gyroscope, and magnetometer.

9. There are different types of motion sensors, including accelerometers, gyroscopes, and magnetometers. Accelerometers measure accelerations. For example, airbags use accelerometers, such that the airbag is triggered based on sudden deceleration. Accelerometers can also measure forces due to gravity. Gyroscopes measure rotation rates or angular velocities. Magnetometers measure magnetism, including the strength of a magnetic field along a particular
direction. Each type of motion sensor is subject to inaccuracies. For example, a gyroscope sensor has a small, added offset or bias. This bias will accumulate over time and lead to a large drift error. Similarly, magnetometers are subject to interference from natural and manmade sources (e.g., power electronics). Additionally, errors can accumulate over time. These sensors typically take measurements along a single direction. To accurately measure motions along an arbitrary axis, three like sensors are grouped together and aligned at right angles. Such a sensor set is generally referred to as a 3-axis sensor.

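To illustrate the drift problem described above, the following short numerical sketch (an illustration of the general principle only, with hypothetical numbers; it is not drawn from the patents) integrates the output of a stationary gyroscope whose reading carries a small constant bias. The integrated angle grows steadily even though the device never moves.

```python
import numpy as np

# Illustrative only: a stationary gyroscope whose true angular rate is zero,
# but whose output contains a small constant bias plus noise.
dt = 0.01                               # 100 Hz sample period
t = np.arange(0.0, 60.0, dt)            # one minute of samples
bias_deg_per_s = 0.5                    # hypothetical bias, deg/s
gyro = bias_deg_per_s + 0.1 * np.random.randn(t.size)  # measured rate; true rate is 0

angle = np.cumsum(gyro) * dt            # naive integration of the rate into an angle
print(f"Accumulated drift after 60 s: {angle[-1]:.1f} deg")
# A 0.5 deg/s bias drifts roughly 30 degrees per minute,
# even though the device never moved.
```
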
10. To incorporate the data from multiple sensors and compensate for the errors described above, the ’438 patent and ’978 patent each disclose a sensor fusion technology. Specifically, the ’438 patent discloses an enhanced sensor fusion technology and application for calculating orientation (including tilting angles along all three spatial axes) by using measurements from both a 3-axis accelerometer and a 3-axis gyroscope; furthermore, it can correct or eliminate errors associated with the motion sensors. This technology is especially suited for accurately representing a mobile device’s orientation in 3D space on a 2D display screen by mapping the yaw, pitch, and roll angles relating to movement along the three spatial axes to a 2D display reference frame. Simply put, the ’438 patent discloses an improved system and method to capture motion of the device and to eliminate or correct errors based on movements and rotations of the device.

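For background, one textbook way to make two such sensor streams complement each other is a complementary filter, which trusts the gyroscope over short intervals and the accelerometer's gravity reference over long intervals. The sketch below is a generic illustration of that idea only; it is not the specific algorithm disclosed or claimed in the ’438 patent.

```python
import numpy as np

def complementary_tilt(gyro_rate, accel_xz, dt, alpha=0.98):
    """Generic complementary filter for a single tilt angle (illustration only).

    gyro_rate : iterable of angular rates about one axis (rad/s)
    accel_xz  : iterable of (a_x, a_z) accelerometer pairs in the tilt plane
    alpha     : blending factor; values near 1 trust the gyroscope more
    Returns the fused tilt angle (rad) at each time step.
    """
    angle = 0.0
    fused = []
    for w, (ax, az) in zip(gyro_rate, accel_xz):
        accel_angle = np.arctan2(ax, az)   # angle implied by the gravity vector
        gyro_angle = angle + w * dt        # angle propagated by the gyroscope
        # Short term: follow the gyroscope. Long term: drift is pulled back
        # toward the accelerometer's gravity-based estimate.
        angle = alpha * gyro_angle + (1.0 - alpha) * accel_angle
        fused.append(angle)
    return np.array(fused)
```
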
11. Likewise, the ’978 patent discloses a similar enhanced sensor fusion technology for calculating orientation. Unlike the ’438 patent, which discloses and claims using two motion sensors—an accelerometer and gyroscope—the ’978 patent discloses and claims using a third sensor—a magnetometer.

12. Orientation information returned by the claimed inventions of the ’438 and ’978 patents has many uses, particularly for mobile cellular devices, such as navigation, gaming, and augmented/virtual reality applications. Navigation applications can use orientation information to determine the heading of the phone, indicate what direction the user is facing, and automatically
orient the map to align with the cardinal directions. Increasing numbers of games and other applications use the motion of the phone to input commands, such as tilting the mobile device like a steering wheel. Augmented and virtual reality applications rely on accurate estimation of the device orientation in order to render graphics and images at the proper locations on the screen.

III. OPINION REGARDING ADVANTAGES OVER PRIOR ART

13. In the past, motion sensors had limited applicability to handheld pointing devices due to a variety of technological hurdles. For example, different types of acceleration (e.g., linear, centrifugal, gravitational) could not be readily distinguished from one another, and rapid, dynamic, and unexpected movements caused significant errors and inaccuracies. These difficulties were compounded by the miniaturization of the sensors necessary to incorporate them in handheld devices. With the development of micro-electromechanical systems, or “MEMS,” miniaturized motion sensors could be manufactured and incorporated on a semiconductor chip, but such MEMS sensors had significant limitations.

14. For example, it is impossible for MEMS accelerometers to distinguish different types of acceleration (e.g., linear, centrifugal, gravitational). When a MEMS accelerometer is used to estimate orientation, it must measure force along the direction of gravity (i.e., down), but that gravitational measurement can be “interfused” with other accelerations and forces (e.g., vibration or movement by the person holding the device). Thus, non-gravitational accelerations and forces must be estimated and subtracted from the MEMS accelerometer measurement to yield an accurate result. A MEMS gyroscope is prone to drift, which will accumulate increasing errors over time if not corrected by another sensor or recalibrated. A MEMS magnetometer is highly sensitive not only to the earth’s magnetic field but also to other sources of magnetism (e.g., power lines and transformers), and can thereby suffer inaccuracies from environmental sources of interference that vary both in existence and intensity from location to location.

15. Additionally, orientation cannot be accurately calculated using only one type of
MEMS sensor. For example, if only a 3-axis MEMS accelerometer is used to measure orientation, pitch and roll can be measured, but not yaw. If only a MEMS gyroscope is used to measure angular velocity, only relative changes in orientation can be measured, not absolute orientation.

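This limitation follows directly from the geometry of the gravity vector, as the short sketch below shows (an illustration using one common axis convention, not language from the patents): pitch and roll change the measured gravity components, but rotation about the vertical axis does not, so yaw cannot be recovered from an accelerometer alone.

```python
import numpy as np

def pitch_roll_from_accel(ax, ay, az):
    """Tilt angles implied by a static 3-axis accelerometer reading (illustration).

    Valid only when the device is otherwise unaccelerated, so the reading is
    dominated by gravity. Yaw never appears: rotating the device about the
    vertical axis leaves (ax, ay, az) unchanged.
    """
    pitch = np.arctan2(-ax, np.hypot(ay, az))
    roll = np.arctan2(ay, az)
    return pitch, roll

# A device lying flat reads roughly (0, 0, 9.81) m/s^2, giving zero pitch and roll.
print(pitch_roll_from_accel(0.0, 0.0, 9.81))
```
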
16. The ’438 patent and ’978 patent technologies overcome technological hurdles such as those discussed above in a unique and novel way by incorporating measurements from multiple sensors for increased accuracy and algorithms to compensate for accumulated errors. The ’438 patent discloses a system and method for interactively and iteratively fusing angular velocity measurements in the x, y, and z directions and axial acceleration measurements in the x, y, and z directions such that the measurements complement each other according to a specific algorithm. The limitations of the specific measurements and accumulated errors therein are overcome and eliminated. The ’978 patent builds upon the ’438 patent and further incorporates magnetism measurements in the x, y, and z directions to further improve accuracy.

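As a generic illustration of what the third sensor contributes (again, not the algorithm claimed in the ’978 patent), a magnetometer reading can be tilt-compensated using the pitch and roll estimated from the other sensors to produce an absolute heading. Axis conventions vary by device; the sketch assumes one common convention.

```python
import numpy as np

def heading_from_magnetometer(mx, my, mz, pitch, roll):
    """Tilt-compensated compass heading from a 3-axis magnetometer (illustration).

    pitch, roll : tilt angles (rad) estimated from the accelerometer/gyroscope
    Returns yaw (rad) relative to magnetic north under one common axis
    convention. Nearby interference (e.g., power electronics) corrupts this
    estimate, which is why it is fused with, not substituted for, the others.
    """
    # Rotate the measured field back into the horizontal plane.
    mag_x = mx * np.cos(pitch) + mz * np.sin(pitch)
    mag_y = (mx * np.sin(roll) * np.sin(pitch)
             + my * np.cos(roll)
             - mz * np.sin(roll) * np.cos(pitch))
    return np.arctan2(-mag_y, mag_x)
```
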
17. Without orientation information, mobile device apps would be limited to very static operation. This was the scenario with initial smart phones and other mobile devices. Navigation aids could render a map and indicate the location of the device using GPS. However, these maps would orient with North on the map pointing to the top of the screen. The user could rotate the map using touch commands, but the map would not rotate automatically as the user turned. Nor could the device indicate what direction the device was facing.

18. Many games use motion of the device to control the game. A common control scheme, especially for driving and piloting games, is to have the user rotate the phone or device like a steering wheel to indicate the direction the vehicle should move. Some puzzle games also use motions to cause elements of the game to move. As discussed previously, accelerometers measure acceleration, which is a very noisy signal. Acceleration is the derivative of velocity, which is the derivative of position. Small magnitude noise can have large derivatives, which means that small levels of noise from vibration or electrical fluctuations will be magnified at the
acceleration level. Even a stationary device will have notable noise measured by an accelerometer. A moving device will only amplify this noise. Since accelerometers measure linear and centripetal accelerations as well as the acceleration of gravity, orientation estimates on a moving device will not be accurate. Figure 1¹ shows an experiment we conducted in which an accelerometer was placed on a moving pendulum alongside a very accurate angle measurement from an optical encoder. When the pendulum is rotated slowly, the accelerometer is fairly accurate, but during moderate or fast motion it shows significant errors.

[Figure 1: pendulum angle (radians) versus time (s), comparing the accelerometer-based estimate (“Acceleration”) with the optical encoder measurement (“Encoder”).]

¹ T. R. Bennett, R. Jafari and N. Gans, “Motion Based Acceleration Correction for Improved Sensor Orientation Estimates,” 2014 11th International Conference on Wearable and Implantable Body Sensor Networks, Zurich, 2014, pp. 109-114.

19. If only an accelerometer is used, a coarse estimate of the device orientation can be obtained by averaging or numerically filtering the results. Essentially, the device can determine if it is tilted left or right, up or down, but the exact angle cannot be estimated accurately while in motion. This is suitable for games that move a character or steer a vehicle in a particular direction, but such games generally cannot use the magnitude of tilt to move at correspondingly faster or slower speeds.

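The “averaging or numerically filtering” referred to above can be as simple as an exponential moving average. The sketch below is illustrative only; it smooths a noisy accelerometer-derived tilt angle enough to tell which way the device is leaning, but the lag it introduces is exactly why the magnitude of tilt is not reliable while the device is moving.

```python
import numpy as np

def smooth_tilt(raw_tilt, beta=0.95):
    """Exponential moving average of a noisy tilt signal (illustration only).

    A beta close to 1 suppresses accelerometer noise but makes the estimate
    lag behind real motion, so only the coarse direction of tilt is trustworthy.
    """
    estimate = raw_tilt[0]
    smoothed = []
    for sample in raw_tilt:
        estimate = beta * estimate + (1.0 - beta) * sample
        smoothed.append(estimate)
    return np.array(smoothed)
```
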
20. Without orientation sensors, games could be controlled using traditional “joystick” type inputs. For smart phones with touch screens, commands are given by having the user touch specific parts of the screen.

21. Augmented reality (AR) and virtual reality (VR) are new and growing classes of applications for smart phones and mobile devices. In AR, the device camera provides a live video feed to the screen, and the application overlays generated graphics onto the screen at specific locations. AR navigation apps can draw signs or labels to indicate what specific places or objects are, or can render arrows or other indicators. AR games and teaching applications can label objects or draw characters or items such that they appear as if they are in the real world seen in the video. Virtual reality is similar but does not use the camera; rather, it completely renders an artificial 3D environment on the screen. VR most often requires a headset such that the user only sees the screen. Mobile devices and smart phones used for VR generally split the screen and display two side-by-side images of the rendered environment that are slightly offset to simulate a left and right eye. The device then sits in a headset with lenses such that each eye sees only one of the split-screen images, giving the user a sense of stereo (3D) vision.

22. Without orientation sensing, AR and VR applications cannot work. The system will have no ability to understand the orientation of the device and know where to draw objects and/or the scene. The rough orientation estimate provided by an accelerometer (ideally with a magnetometer) will not be sufficient to track during typical head motions. It has been demonstrated that VR applications that use only an accelerometer often cause motion sickness, as the rendered images do not track with the head motions. An AR application without the use of a gyroscope and fusion algorithm will not render objects at the correct locations, and may obscure the view rather than provide helpful information.

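The dependence of AR rendering on orientation can be seen from the standard pinhole projection model: a world point is drawn at the correct pixel only if the device's rotation is known. The sketch below is an illustration with a made-up camera matrix; it is not code from the patents or from any particular product.

```python
import numpy as np

def project_point(point_world, R, t, K):
    """Project a 3D world point to pixel coordinates (pinhole camera model).

    R : 3x3 rotation matrix giving the camera/device orientation
    t : 3-vector translation of the world origin in the camera frame
    K : 3x3 intrinsic matrix (focal lengths and principal point)
    Any error in R shifts the rendered overlay away from the real object.
    """
    point_cam = R @ point_world + t           # world frame -> camera frame
    homogeneous = K @ point_cam               # camera frame -> image plane
    return homogeneous[:2] / homogeneous[2]   # perspective divide -> pixels

# Hypothetical intrinsics and a point 5 m straight ahead of the camera.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
print(project_point(np.array([0.0, 0.0, 5.0]), np.eye(3), np.zeros(3), K))
```
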
23. There are ways to estimate orientation other than the approaches presented in the ’438 patent or ’978 patent, which involve algorithms that filter and fuse measurements from inertial and magnetic sensors. Most such methods are based on cameras and computer vision algorithms. However, the limitations of these methods render them unusable for mobile devices. For example, there are a variety of motion capture systems that use cameras arrayed around an
environment. Markers (e.g., reflective balls) can be placed on objects, and the cameras can locate the markers, often to sub-mm accuracy. If an object has three or more markers on it, the orientation of the object can be determined with sub-degree accuracy. This method is very accurate, but quite expensive (often about $100,000). The cameras are fixed in place, and the estimation can only work within a small space (a box with dimensions on the order of tens of meters). Clearly, this is not suitable for the vast majority of mobile device users or applications. Some high-end VR systems use such an approach, but this is not feasible for general consumers.

24. A camera on a mobile device, such as a smart phone, can be used to estimate the orientation of the phone. One class of approaches to this problem uses special patterns or markers in the environment. These often have the appearance of a QR code or 2D UPC. From a picture of the pattern, computer vision algorithms can determine the position and orientation of the camera with respect to the marker. AR applications have placed the patterns on specific objects or consumer products so the device can render images and graphics with respect to the pattern. AR games have included patterned mats that are placed on a table or other flat surface, and the device renders characters and objects as if they were on the surface. An example of a game using different patterned mats is seen in Figure 2.²

² https://pokemondb.net/spinoff/pokedex-3d

[Figure 2: example of a game using different patterned mats placed on a flat surface.]

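For reference, the marker-based approach shown in Figure 2 is commonly implemented with a perspective-n-point solver. The sketch below assumes OpenCV is available and uses made-up marker corner coordinates; it shows only the general shape of such a computation, not any particular product's implementation.

```python
import numpy as np
import cv2  # OpenCV, assumed available for this illustration

# Known 3D corner positions of a square printed marker, in meters (marker frame).
object_points = np.array([[-0.05, -0.05, 0.0],
                          [ 0.05, -0.05, 0.0],
                          [ 0.05,  0.05, 0.0],
                          [-0.05,  0.05, 0.0]], dtype=np.float32)

# Pixel coordinates of those corners detected in one image (hypothetical values).
image_points = np.array([[310.0, 250.0],
                         [402.0, 248.0],
                         [405.0, 341.0],
                         [308.0, 344.0]], dtype=np.float32)

K = np.array([[800.0, 0.0, 320.0],    # hypothetical camera intrinsics
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)             # assume negligible lens distortion

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist_coeffs)
R, _ = cv2.Rodrigues(rvec)            # rotation of the marker in the camera frame
print(ok, tvec.ravel())               # pose of the marker relative to the camera
```
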
25. Multiple unique patterns can be placed around an environment; so long as one is always in view, the camera can maintain an estimate of the orientation and position. In this way, it can be used for navigation. An example is seen in Figure 3.³ The necessity of placing patterns would make this approach useless for a majority of applications, particularly outdoors. The camera would also need to remain on at all times, which would cause severe battery drain.

³ This is an unpublished image from Dr. Gans’ research laboratory.

[Figure 3: multiple unique patterns placed around an environment for camera-based estimation of orientation and position.]

26. The orientation of the camera can also be estimated over an indefinite amount of time using vision algorithms known as visual odometry. In visual odometry, changes in the image over time are used to estimate the camera velocity. This velocity can be integrated over time to estimate the change in orientation. While these methods are well understood, they can only track changes in relative orientation, not give absolute orientation. They also require the camera to be on at all times, which will greatly reduce battery life.

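To make the integration step concrete, the sketch below (an illustration only) composes per-frame rotation estimates the way a visual odometry pipeline would. Only the change relative to an unknown starting orientation is recovered, and small per-frame errors accumulate, which is the limitation noted above.

```python
import numpy as np

def integrate_visual_odometry(frame_rotations, R_start=np.eye(3)):
    """Compose per-frame rotation estimates from visual odometry (illustration).

    frame_rotations : sequence of 3x3 rotation matrices, each the estimated
                      camera rotation between consecutive frames
    Only the change relative to R_start (unknown in practice) is recovered,
    and small per-frame errors accumulate over time.
    """
    R = R_start.copy()
    trajectory = []
    for dR in frame_rotations:
        R = R @ dR                 # accumulate the relative rotation
        trajectory.append(R.copy())
    return trajectory
```
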
Dated: July 6, 2017

______________________________
Nicholas Gans, Ph.D.

EXHIBIT A

Nicholas R. Gans
Curriculum Vitae

Department of Electrical Engineering, MS: EC33
The University of Texas at Dallas
800 W. Campbell Rd.
Richardson, TX 75080, USA
Phone: (972) 883-6755
Fax: (972) 883-2710
ngans@utdallas.edu, nrgans@gmail.com
http://www.utdallas.edu/~ngans

CURRENT POSITION
• Assistant Professor – Department of Electrical Engineering,
University of Texas at Dallas, Aug. 2009 – present

EDUCATION
• Ph.D. Systems and Entrepreneurial Engineering
University of Illinois Urbana Champaign - December 2005
Dissertation: Hybrid Switched System Visual Servo Control
Advisor: Seth Hutchinson
• M.S. Electrical and Computer Engineering
University of Illinois Urbana Champaign - May 2002
Thesis: Performance Tests of Partitioned Approaches to Visual Servo Control
Advisor: Seth Hutchinson
• B.S., Cum Laude, Electrical Engineering and Applied Physics, Minor in Philosophy
Case Western Reserve University - May 1999

RESEARCH INTERESTS
• Vision-Based Control and State Estimation
• Mobile Robot Control
• Multi-Agent/Distributed Control and Estimation
• Self-Optimizing Systems
• Nonlinear Control
• Hybrid Switched-System Control
• Computer Vision
• Human/Machine Interaction
• Engineering Ethics Education

RESEARCH EXPERIENCE
• Postdoctoral Associate – National Research Council/Air Force Research Laboratory
Eglin Air Force Base, Sep. 2008 – July 2009
Directed by David Jeffcoat
• Postdoctoral Researcher – Nonlinear Controls and Robotics Lab
University of Florida, Jan. 2006 – Aug. 2008
Directed by Dr. Warren Dixon
• Graduate Research Assistant – Robot Motion Planning and Control Lab
University of Illinois Urbana Champaign, Jan. 2002 - Dec. 2005
Directed by Dr. Seth Hutchinson

CURRENT GRANTS
• PI, “Distributed Control and Vision-Based Estimation for UAS Autonomy,” Air Force Research Laboratory Munitions Directorate, 5/01/2017 – 4/31/2019, $75,000
• PI, “Engineering Projects in Community Service,” UT Dallas Center for Teaching and Learning Instructional Improvement Grant, 9/1/2016 – 8/31/2017, $5,000
• PI, “Vision-Based Control of Quadrotor UAVs,” Kasling Aircraft, 9/1/2016 – 12/1/2016, $8,098
• PI, “GOALI: Adaptive Control of Inkjet Printing on 3D Curved Surfaces,” National Science Foundation, 5/1/2016 – 4/30/2019, $294,452
• PI, “Vision-Based Surface Shape & Condition Estimation,” Texas Instruments, 12/1/15 – 11/30/17, $70,000
• PI, “ICC for Olalekan Ogunmolu” (Soft for RadioSurgery), UT Southwestern Medical Center, 9/1/2014 – 5/31/2016, $13,320
• Co-PI, “Engineering Ethics as an Expert Guided and Socially Situated Activity,” NSF, 9/1/2013 – 8/31/2017, $299,967

PREVIOUS GRANTS
• PI, “Smart City Transportation System: Real-Time Multi-Objective Optimization of Autonomous Multi-Agent Controllers in an Orchestrated Resource Environment for Adaptive and Responsive Traffic Management,” Texas Research Alliance, 10/1/2015 – 6/1/2016, $40,000
• PI, “Human Machine Interface for Including Out-of-Sequence Information in UAV Target Search,” Air Force Research Laboratory, 9/23/2013 – 5/23/2016, $159,998
• Co-PI, “Cooperative Robot Manipulation,” Daegu Gyeongbuk Institute of Science and Technology, 3/1/2013 – 12/31/2015, $272,961
• PI, “3D Interactive Camera/Projector Display Unit for Human/Robot Interaction,” Texas Instruments, 1/1/13 – 12/28/15, $105,000
• PI, “Development of Algorithms and an Instructional Lab for Actin Robot Control Software and Cyton Robot Arm,” Energid Technologies Corporation, 1/15/13 – 6/15/13, $23,556
• Co-PI, “Development of an Adaptive Radar Laboratory,” Mustang Technology Group, $35,000
• PI, “System Characterizations of the Scanning Tunneling Microscope,” Zyvex Corporation, 5/30/2011 – 8/30/2011, $3,084
• PI, “Super-resolution of Target Details for Improved Target State Estimation and Classification,” University of Texas Catalyst Grant, June 2010 – August 2011, $40,000
• PI, “Hardware in the Loop Simulation for Vision-Based Control of Autonomous Vehicles,” Air Force Research Lab grant FA8651-10-1-003, January 2010 – September 2010, $10,000

TEACHING EXPERIENCE
• Course Instructor – University of Texas at Dallas
o Engineering Projects in Community Service – Spring 2016, Fall 2016
o Linear Systems and Signals (EECS 6331) – Spring 2010, Spring 2012, Spring 2013, Spring 2014, Spring 2015, Spring 2016
o Introduction to Robotics (EEGR 5V80, ENGR 5375) – Spring 2011, Fall 2013, Fall 2014, Fall 2015
o Senior Design 2 (EE 4389) – Spring 2012, Spring 2013, Spring 2014, Spring 2015, Spring 2016
o Vision-Based Estimation and Control (EESC 7V85) – Fall 2010, Fall 2012
o Electronic Circuits Laboratory (EE 3111) – Summer 2011, Summer 2012
o Systems and Controls (EE 4310) – Fall 2011

• One Day Workshop – Recent Advances in Extremum Seeking Control and its Applications
19th IFAC World Congress, Cape Town, South Africa, Sep. 2014
• One Day Workshop – Vision-Based Estimation and Control
University of Johannesburg, Johannesburg, South Africa, Nov. 2009
• Short Course – Vision-Based Control for Autonomous Vehicles
NASA Johnson Space Center, Houston, TX, Aug. 2012
AIAA Guidance, Navigation and Control Conference, Chicago, IL, Aug. 2009
AIAA Guidance, Navigation and Control Conference, Hilton Head, SC, Aug. 2007
A two-day short course offered through AIAA. I taught sections on cameras and imaging, pose estimation through epipolar geometry, visual servoing, chained pose estimation, and hardware-in-the-loop simulation.

STUDENT SUPERVISION
Doctoral advisement/direction:
• Jinglin Shen, PhD awarded August 2013, “Multi-View Systems for Robot-Human Interaction and Object Grasping”
• Yinghua Zhang, PhD awarded May 2014, “Improving Global Properties of Real-Time Optimization With Applications in Robotic Visual Search”
• Jingfu Jin, PhD awarded December 2015, “Unified Formation Control, Heading Consensus and Obstacle Avoidance for Heterogeneous Mobile Robots with Nonholonomic Constraints”
• J.-Pablo Ramirez, PhD awarded May 2016, “Mobile Sensor Guidance for Optimal Information Acquisition Under Out-Of-Sequence and Soft Measurements”
• Terrell Bennett, PhD awarded August 2016, “Algorithms for Enabling Wearable Sensors in the Internet of Things”

Master’s advisement/direction:
• Kaveh Fathian, MS awarded December 2012, “Virtual Thermal Sensing and Control of Heat Distribution Using State Estimation”
• David Tick, MS awarded May 2011, “Fusion of Discrete and Continuous Epipolar Geometry With Wheel and IMU Odometry for Localization of Mobile Robots”
• Wen Yu, MS awarded August 2011, “Interactive Camera/Projector Display Unit Using Image Homography”

AWARDS
• Selected as one of 2014’s IEEE Transactions on Robotics Outstanding Reviewers
• Nominated for the Provost’s Award for Faculty Excellence in Undergraduate Research Mentoring, The University of Texas at Dallas, 2014
• Best Paper of the Session, K. Fathian, J. Jin and N. Gans, “A New Approach for Solving the Five-Point Relative Pose Problem for Vision-Based Estimation and Control,” Proc. American Control Conference 2014
• Best doctoral colloquium award (work in progress): Quality Enhancement Framework for Wearable Computers - Terrell R. Bennett (my PhD student), 2014 Proc. Body Sensor Network Conference
• Best Paper of the Session, T. Bennett, R. Jafari, and N. Gans, “An Extended Kalman Filter to Estimate Human Gait Parameters and Walking Distance,” 2013 Proc. American Controls Conference
• Best Student Paper, D. Q. Tick, J. Shen, and N. R. Gans, “Fusion of Discrete and Continuous Epipolar Geometry for Visual Odometry and Localization,” 2010 IEEE International Workshop on Robotic and Sensors Environments
• National Research Council/US Air Force Office of Scientific Research Associateship
• Best Paper of the Session, K. Kaiser, N. Gans and W. E. Dixon, “Localization and Control of an Aerial Vehicle through Chained, Vision-Based Pose Reconstruction,” 2007 American Control Conference

PROFESSIONAL ACTIVITIES
Memberships
• IEEE Senior Member
• IEEE Robotics and Automation Society
• IEEE Control Systems Society
• AIAA
• ASME

Conference Organizing Committees
• Local Arrangements Chair - IEEE Conference on Automation Science and Engineering (CASE) 2016
• Exhibitions Chair - IEEE Conference on Automation Science and Engineering (CASE) 2016
• Exhibitions Co-Chair - IEEE Int’l Conf. on Intelligent Robots and Systems (IROS) 2014

International Program Committee Member
• IEEE Int’l Conf. on Intelligent Robots and Systems (IROS): 2006
• IEEE Int’l Conf. on Intelligent Robots and Systems (IROS), Associate Editor: 2007-2014
• IEEE/IFAC Int’l Conf. on Inform. in Control, Autom. & Robotics (ICINCO): 2008-2014
• American Controls Conference (ACC): 2009-2014

Workshop/Invited Session Organizer/Lecturer
• 6º Taller de Robótica y Planificación de Movimientos, Plenary Speaker, Decentralized Formation Control and Obstacle Avoidance for Mobile Robots, Centro de Investigación en Matemáticas, Guanajuato, Mexico, April 2016
• 19th IFAC World Congress, Organizer and Lecturer at One Day Workshop, Recent Advances in Extremum Seeking Control and its Applications, Cape Town, South Africa, Sep. 2014
• University of Johannesburg, Organizer and Lecturer at One Day Workshop, Vision-Based Estimation and Control, Johannesburg, South Africa, Nov. 2009
• AIAA Guidance, Navigation and Control Conference (GNC), Organizer and Lecturer at Two Day Workshop, Vision-Based Control for Autonomous Vehicles, Chicago, IL, Aug. 2009
• International Conference on Pattern Recognition (ICPR), Presenter at One Day Workshop, Visual Observation and Analysis of Animal and Insect Behavior, Tampa, FL, Dec. 2008
• IEEE International Symposium on Intelligent Control (ISIC), Invited Session Organizer and Lecturer: Current Topics in Vision-Based Control, San Antonio, TX, Sep. 2008
• AIAA Guidance, Navigation and Control Conference (GNC), Organizer and Lecturer at Two Day Workshop, Vision-Based Control for Autonomous Vehicles, Hilton Head, SC, Aug. 2007

Review Panels
• National Science Foundation
• NASA

Publication Reviewer
• IEEE Transactions on Automatic Control (TAC)
• IEEE Transactions on Robotics (TRO)
• IEEE Transactions on Systems, Man and Cybernetics (TSMC)
• IEEE Transactions on Control Systems Technology (TCST)
• ASME Journal of Dynamic Systems, Measurement and Control
• International Journal of Computer Vision (IJCV)
• International Journal of Robotics Research (IJRR)
• Journal of Intelligent and Robotic Systems
• European Journal of Control (EJC)
• IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
• IEEE Conference on Robotics and Automation (ICRA)
• IEEE Conference on Decision and Control (CDC)
• American Controls Conference (ACC)
• ASME Dynamic Systems and Control Conference (DSCC)
• IEEE Conference on Control, Automation, Robotics and Vision (ICARV)
• IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)
• International Conference on Intelligent Autonomous Systems (IAS)
• IEEE Potentials

INVITED LECTURES
• “Novel Algorithms for Estimating Camera Pose and Target Structure” – Lecture at Robotics Graduate Student Seminars, University of Illinois at Urbana Champaign, October 21, 2016
• “Novel Algorithms for Estimating Camera Pose and Target Structure” – Lecture at Computer Science and Engineering Graduate Student Seminars, Texas A&M University, September 12, 2016
• “Decentralized Formation Control and Obstacle Avoidance for Mobile Robots” – Plenary Talk, The 6th Robotics Workshop and Planning Movements, El Centro de Investigación en
