US008850517B2

(12) United States Patent
Kumar

(10) Patent No.: US 8,850,517 B2
(45) Date of Patent: Sep. 30, 2014

(54) RUNTIME RISK DETECTION BASED ON USER, APPLICATION, AND SYSTEM ACTION SEQUENCE CORRELATION

(71) Applicant: Taasera, Inc., Erie, PA (US)

(72) Inventor: Srinivas Kumar, Cupertino, CA (US)

(73) Assignee: Taasera, Inc., Erie, PA (US)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days.

(21) Appl. No.: 13/741,878

(22) Filed: Jan. 15, 2013

(65) Prior Publication Data

US 2014/0201806 A1    Jul. 17, 2014

(51) Int. Cl.
H04J 1/16 (2006.01)
G06F 17/30 (2006.01)
G06F 21/57 (2013.01)

(52) U.S. Cl.
CPC .................................... G06F 21/577 (2013.01)
USPC ........................ 726/1; 726/3; 726/6; 709/223

(58) Field of Classification Search
CPC ... H04L 47/10; H04L 63/0227; H04L 63/102; H04L 47/2458
USPC ......................................... 726/1-10; 709/223
See application file for complete search history.

(56) References Cited

U.S. PATENT DOCUMENTS

8,091,065 B2* 1/2012 Mir et al. ..................... 717/104
8,327,441 B2* 12/2012 Kumar et al. .................. 726/22
2008/0288330 A1 11/2008 Hildebrand et al.
2009/0292743 A1 11/2009 Bigus et al.
2010/0125911 A1 5/2010 Bhaskaran
2011/0131658 A1 6/2011 Bahl
2011/0179477 A1* 7/2011 Starnes et al. ................... 726/9
2011/0321175 A1 12/2011 Slater
2012/0216244 A1* 8/2012 Kumar et al. ................... 726/1
2013/0096980 A1* 4/2013 Basavapatna et al. ....... 705/7.28

* cited by examiner

OTHER PUBLICATIONS

May 14, 2014 International Search Report issued in PCT/US2014/011427.
May 14, 2014 Written Opinion issued in PCT/US2014/011427.

Primary Examiner - Haresh N Patel
(74) Attorney, Agent, or Firm - Buchanan Ingersoll & Rooney PC

(57) ABSTRACT

A method for assessing runtime risk for an application or device includes: storing, in a rules database, a plurality of rules, wherein each rule identifies an action sequence; storing, in a policy database, a plurality of assessment policies, wherein each assessment policy includes at least one rule of the plurality of rules; identifying, using at least one assessment policy, a runtime risk for an application or device, wherein the identified runtime risk identifies and predicts a specific type of threat; and identifying, by a processing device, a behavior score for the application or device based on the identified runtime risk, wherein the action sequence is a sequence of at least two performed actions, and each performed action is at least one of: a user action, an application action, and a system action.

24 Claims, 4 Drawing Sheets
`
[Representative drawing (excerpt of FIG. 2): identify system action (216); identify runtime risk of action sequence (218), based on matching rules and probabilistic weights, or on time proximity and natural affinity; identify behavior score of application or device (220).]
`
`
`
`
[FIG. 1 (Sheet 1 of 4): High-level block diagram of the computing system 100, showing the processing unit 102, display unit 104, display device 106, memory unit 106 (containing the application runtime monitor 108, the action sequence rules database 110, and the assessment policy database 112), communications infrastructure 110, communications interface 108, and communications path 114.]
`
`
`
`
[FIG. 2 (Sheet 2 of 4), flow diagram: monitor for performed actions (202); identify a preceding action (206); identify subsequent system action (208); identify subsequent performed action (210); identify system action (216); identify runtime risk of action sequence (218), based either on matching rules and probabilistic weights or on time proximity and natural affinity; identify behavior score of application or device (220).]
`
`
`
`
[FIGS. 3A and 3B (Sheet 3 of 4): diagrams of the performed actions (user action 302, application action 304, system action 306) and exemplary action sequences 308a through 308d.]
`
`
`
`
[FIG. 4 (Sheet 4 of 4), flow diagram of method 400: store, in a rules database, a plurality of rules, wherein each rule identifies an action sequence (402); store, in a policy database, a plurality of assessment policies, wherein each assessment policy includes at least one rule of the plurality of rules (404); identify, using at least one assessment policy, a runtime risk for an application or device, wherein the identified runtime risk identifies and predicts a specific type of threat (406); identify, by a processing device, a behavior score for the application or device based on the identified runtime risk, wherein the action sequence is a sequence of at least two performed actions and each performed action is at least one of: a user action, an application action, and a system action (408).]
`
`
`
`
RUNTIME RISK DETECTION BASED ON USER, APPLICATION, AND SYSTEM ACTION SEQUENCE CORRELATION

FIELD

The present disclosure relates to cognitive behavior recognition, specifically assessing the runtime risk of an application or device in a computer system using cognitive behavior recognition.

BACKGROUND

With the prevalence of computers and other computing systems in the daily lives of both companies and individuals, computer and cyber security has become increasingly important. A variety of programs and approaches have been developed to provide security and protect end users from harmful threats, including antivirus programs, firewalls, and intrusion detection/prevention systems (IDSs/IPSs). These programs can be beneficial in protecting a computing system and its end user from a number of threats.

However, as technology in the computing devices themselves is developing, so is the technology behind the threats against those same computing devices. Emerging cyber threats, commonly referred to as advanced persistent threats (APTs), often remain undetected by traditional security programs and approaches. As a result, many harmful threats and infections can attack a system that includes these security programs unbeknownst to the user and system operator, which could have devastating results. For example, it can place companies at risk for the theft of proprietary information, such as confidential information, trade secrets, etc., and individuals at risk for identity theft.

Thus, there is a need for a technical solution to properly detect and prevent attacks by advanced persistent threats that go undetected by traditional security programs and approaches.

SUMMARY

The present disclosure provides a description of a system and method for assessing the runtime risk of an application or device.

A method for assessing runtime risk for an application or device includes: storing, in a rules database, a plurality of rules, wherein each rule identifies an action sequence; storing, in a policy database, a plurality of assessment policies, wherein each assessment policy includes at least one rule of the plurality of rules; identifying, using at least one assessment policy, a runtime risk for an application or device, wherein the identified runtime risk identifies and predicts a specific type of threat; and identifying, by a processing device, a behavior score for the application or device based on the identified runtime risk, wherein the action sequence is a sequence of at least two performed actions, and each performed action is at least one of: a user action, an application action, and a system action.

A system for assessing runtime risk for an application or device includes a rules database, a policy database, and a processing device. The rules database is configured to store a plurality of rules, wherein each rule identifies an action sequence. The policy database is configured to store a plurality of assessment policies, wherein each assessment policy includes at least one rule of the plurality of rules. The processing device is configured to identify, using at least one assessment policy, a runtime risk for an application or device, wherein the identified runtime risk identifies and predicts a specific type of threat, and identify a behavior score for the application or device based on the identified runtime risk, wherein the action sequence is a sequence of at least two performed actions, and each performed action is at least one of: a user action, an application action, and a system action.

BRIEF DESCRIPTION OF THE DRAWING FIGURES

The scope of the present disclosure is best understood from the following detailed description of exemplary embodiments when read in conjunction with the accompanying drawings. Included in the drawings are the following figures:

FIG. 1 is a high level architecture illustrating a system for assessing the runtime risk for an application or device in accordance with exemplary embodiments.

FIG. 2 is a flow diagram illustrating a method for identifying an action sequence and assessing a runtime risk and subsequent behavior score based on the identified action sequence in accordance with exemplary embodiments.

FIGS. 3A and 3B are diagrams illustrating performed actions and action sequences identified using the method of FIG. 2 in accordance with exemplary embodiments.

FIG. 4 is a flow diagram illustrating a method for assessing runtime risk for an application or device in accordance with exemplary embodiments.

Further areas of applicability of the present disclosure will become apparent from the detailed description provided hereinafter. It should be understood that the detailed description of exemplary embodiments is intended for illustration purposes only and is, therefore, not intended to necessarily limit the scope of the disclosure.

DETAILED DESCRIPTION

System for Assessing Runtime Risk for an Application or Device

FIG. 1 is a high level architecture illustrating a computing system 100 configured to assess the runtime risk of an application or device and identify a behavior score based on the assessed runtime risk. It will be apparent to persons having skill in the relevant art that the methods and processes disclosed herein may be implemented in the computing system 100 using hardware, software, firmware, non-transitory computer readable media having instructions stored therein, or a combination thereof, and may be implemented in more than one computing system or other processing system. It will be further apparent to persons having skill in the relevant art that the configuration of the computing system 100 as illustrated in FIG. 1 is provided as an illustration, and other configurations and systems for performing the functions disclosed may be suitable.

The computing system 100 may include a processing unit 102. The processing unit 102 may include a single processor or a plurality of processors, each of which may include one or more processor cores. The processing unit 102 may be a general purpose processing unit or a special purpose processing unit, such as a general purpose processing unit programmed for performing a specific purpose, such as the identification of the runtime risk of an application or program. The processing unit 102 may be configured to connect to a communications infrastructure 110 for communication with additional components of the computing system 100.

The communications infrastructure 110 may be a bus, message queue, network, multi-core message-passing scheme, a
combination thereof, or any other suitable type or configuration of communications infrastructure as will be apparent to persons having skill in the relevant art. The computing system 100 may further include a display unit 104. The display unit 104 may be configured to control a display device 106, which may be connected to the computing system 100 physically (e.g., via a cable, such as a VGA, DVI, or HDMI cable) or wirelessly (e.g., via Bluetooth, etc.). The display unit 104 may be a video card, video adapter, graphics card, display card, graphics board, display adapter, graphics adapter, video controller, graphics controller, etc. and may be integrated into the computing system 100 or may be removable.

The display device 106 may be configured to display information (e.g., data, graphics, output from an application program, etc.) transmitted to the display device 106 via the display unit 104. Suitable types of display devices for use as the display device 106 will be apparent to persons having skill in the relevant art and may include a liquid crystal display (LCD), light-emitting diode (LED) display, thin film transistor (TFT) LCD, capacitive touch display, etc.
The computing system 100 may further include a memory unit 106. The memory unit 106 may be any type of memory suitable for the storage of data and the performance of the functions disclosed herein, such as a hard disk drive, floppy disk drive, magnetic tape drive, optical disk drive, solid state drive, or other non-transitory computer readable medium. In some embodiments, the memory unit 106 may be removable storage (e.g., flash memory, a compact disc, digital versatile disc, Blu-ray disc, etc.) or a combination of non-removable and removable storage. In one embodiment, the memory unit 106 may be external to the computing system 100 and accessed via a network by a communications interface 108, discussed in more detail below, such as cloud storage. The memory unit 106 may include random access memory (RAM), read-only memory (ROM), or a combination thereof. Suitable types and configurations of the memory unit 106 will be apparent to persons having skill in the relevant art.
The memory unit 106 may include at least a runtime monitor 108, a rules database 110, and a policy database 112. The memory unit 106 may include additional data, applications, programs, etc. as will be apparent to persons having skill in the relevant art, such as an operating system, drivers for components of the system 100, etc. The runtime monitor 108 may be an application program stored on the memory unit 106 including program code executable by the processing unit 102, configured to assess the runtime risk of an application (e.g., stored in the memory unit 106 and executed by the processing unit 102) or device (e.g., connected to the computing system 100 externally (e.g., via the communications interface 108) or internally (e.g., via the communications infrastructure 110)), as discussed in more detail below. In some embodiments, the runtime monitor 108 may be further configured to identify a behavior score for an action sequence of at least two performed actions executed by (e.g., by components or programs included in) the computing system 100.

The runtime monitor 108 may be configured to perform additional functions in the computing system 100, such as those described in U.S. Pat. No. 8,327,441, filed as U.S. application Ser. No. 13/399,065, entitled "System and Method for Application Attestation," filed on Feb. 17, 2012; U.S. application Ser. No. 13/559,707, entitled "Systems and Methods for Orchestrating Runtime Operational Integrity" and filed on Jul. 27, 2012; U.S. application Ser. No. 13/559,766, entitled "Systems and Methods for Threat Identification and Remediation" and filed on Jul. 27, 2012; U.S. application Ser. No. 13/559,665, entitled "Systems and Methods for Providing Mobile Security Based on Dynamic Attestation" and filed on Jul. 27, 2012; U.S. application Ser. No. 13/559,692, entitled "Systems and Methods for Using Reputation Scores in Network Services and Transactions to Calculate Security Risks to Computer Systems and Platforms" and filed on Jul. 27, 2012; and U.S. application Ser. No. 13/559,732, entitled "Systems and Methods for Network Flow Remediation Based on Risk Correlation" and filed on Jul. 27, 2012, all of which are herein incorporated by reference in their entirety. Additional functions that may be performed by the runtime monitor 108 will be apparent to persons having skill in the relevant art.
The rules database 110 may be configured to store a plurality of rules, wherein each rule identifies an action sequence. An action sequence, discussed in more detail below, may include a sequence of at least two performed actions, wherein each performed action may be one of a user action, an application action, and a system action. The policy database 112 may be configured to store a plurality of assessment policies, wherein each assessment policy includes at least one rule of the plurality of rules stored in the rules database 110. The runtime monitor 108 may be configured to identify the runtime risk of the application or device using at least one of the assessment policies stored in the policy database 112 based on, for example, matching rules and probabilistic weights and scores. Methods for identifying runtime risk using an assessment policy comprised of one or more rules are discussed in more detail below.
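For illustration only (this sketch is not part of the patent text), the rule and policy model just described might be represented as follows; the Python class and field names, and the presence of a per-rule probabilistic weight and predicted threat type, are assumptions made for the example:

    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass(frozen=True)
    class Rule:
        # A rule identifies an action sequence, e.g., ("user", "system").
        name: str
        action_sequence: Tuple[str, ...]
        weight: float  # assumed probabilistic weight in [0, 1]
        threat: str    # specific type of threat the rule predicts

    @dataclass
    class AssessmentPolicy:
        # An assessment policy includes at least one rule.
        name: str
        rules: List[Rule] = field(default_factory=list)

        def matching_rules(self, observed_sequence):
            # Return every rule whose action sequence matches the observed one.
            return [r for r in self.rules
                    if r.action_sequence == tuple(observed_sequence)]

This sketch is reused in the illustrative step 218 and step 220 examples below.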
The rules database 110 and the policy database 112 may be configured in any type of suitable database configuration, such as a relational database, a structured query language (SQL) database, a distributed database, an object database, etc. Suitable configurations and database storage types will be apparent to persons having skill in the relevant art. The databases may each be a single database, or may comprise multiple databases which may be interfaced together, such as physically (e.g., internally via the communications infrastructure 110 or externally via the communications interface 108) or wirelessly (e.g., via Bluetooth), and may include one or more databases included within the computing system 100 and one or more databases external to the computing system 100, such as an external hard disk drive or storage accessed via a network (e.g., in the cloud).
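As a purely illustrative example of one such configuration (a relational SQL database), the two databases might be laid out as below; the table and column names are hypothetical and not taken from the patent:

    import sqlite3

    conn = sqlite3.connect(":memory:")  # stand-in for databases 110 and 112
    conn.executescript("""
        CREATE TABLE rules (
            rule_id INTEGER PRIMARY KEY,
            action_sequence TEXT NOT NULL,  -- e.g., 'user,system'
            weight REAL NOT NULL            -- probabilistic weight
        );
        CREATE TABLE assessment_policies (
            policy_id INTEGER PRIMARY KEY,
            name TEXT NOT NULL
        );
        -- Each assessment policy includes at least one rule.
        CREATE TABLE policy_rules (
            policy_id INTEGER REFERENCES assessment_policies (policy_id),
            rule_id INTEGER REFERENCES rules (rule_id)
        );
    """)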
The communications interface 108 may be configured to allow software and data to be transmitted between the computing system 100 and external networks and devices. The communications interface 108 may be a modem, network interface card (e.g., an Ethernet card), a communications port, a Personal Computer Memory Card International Association (PCMCIA) card, or other type of communications interface suitable for performing the functions disclosed herein as will be apparent to persons having skill in the relevant art.

Software and data transmitted to or from the computing system 100 may be in the form of signals, which may be electronic, electromagnetic, optical, etc. The signals may travel via a communications path 114, which may be configured to carry the signals physically or wirelessly via a network. For example, the communications path 114 may carry signals from the communications interface 108 to a network such as a local area network (LAN), a wide area network (WAN), a wireless network (e.g., WiFi), a mobile communication network, a satellite network, the Internet, fiber optic, coaxial cable, infrared, radio frequency (RF), or any combination thereof. Other suitable network types and configurations will be apparent to persons having skill in the relevant art.
The communications interface 108 may be further configured to connect the computing system 100 with a plurality of input devices, which may enable a user of the computing
system 100 to control the system. In some instances, the communications interface 108 may include multiple interfaces or connections, for connecting to a variety of external devices or networks. For example, the communications interface 108 may include a plurality of universal serial bus (USB) connectors, an Ethernet connector, audio connectors, video connectors, etc. Suitable types of input devices that may be used with the computing system 100 for providing input will be apparent to persons having skill in the relevant art and may include a keyboard, mouse, touch screen, tablet, click wheel, trackball, microphone, camera, etc.
Method for Identifying Runtime Risk and Behavior Score for an Application or Device

FIG. 2 is a flow diagram illustrating a method for identifying an action sequence of actions performed in the computing system 100 and identifying the corresponding runtime risk and a behavior score for an application or device involved in the action sequence. It will be apparent to persons having skill in the relevant art that the action sequences identified using the method illustrated in FIG. 2 are provided as a means of illustration only and are not exhaustive as to the action sequences that may be suitable for use in identifying the runtime risk of an application or device by the runtime monitor 108 of the computing system 100.
In step 202, the runtime monitor 108 may monitor the computing system 100 for performed actions, which may include user actions, application actions, and system actions, each of which is discussed in more detail below with respect to FIG. 3. Once an action is detected by the runtime monitor 108, in step 204 the runtime monitor may (e.g., via the processing unit 102) identify the type of action performed. If the type of action performed is a system action, then, in step 206, the runtime monitor 108 may attempt to identify an action preceding the system action. The process may then return to step 204 to identify the type of the preceding action, such that an action sequence including only system actions may not be used to identify a runtime risk of an application or device, as no action involving the application or device may be performed.

If the identified action is an application action, then, in step 208, the runtime monitor 108 may monitor for a subsequent system action, which may be identified by the processing unit 102. The application action followed by the system action may be an action sequence, as discussed in more detail below, which may be used to identify the runtime risk of an application or device (e.g., whose application action caused the system action to occur) at step 218, also discussed in more detail below. If the action identified at step 204 is a user action, then the runtime monitor 108 may monitor for any type of subsequent action, which may be identified by the processing unit 102 in step 210. At step 212, the processing unit 102 may identify whether the subsequent performed action is a system action or an application action.

If the action is identified in step 212 as being a system action, then the user action followed by the system action may be identified as an action sequence suitable for identifying the runtime risk of an application or device (e.g., which, when triggered by the user action, caused the system action to occur). If the action identified in step 212 is an application action, then, at step 213, the runtime monitor 108 may monitor for, and the processing unit 102 may identify, a subsequent system action. If no subsequent system action is identified, then the user action followed by the application action may be identified as a suitable action sequence. If, on the other hand, a subsequent system action occurs, then, in step 216, the processing unit 102 may identify the system action that occurred, which may identify a user action followed by an application action and a subsequent system action as a suitable action sequence.
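The decision flow of steps 202 through 216 can be restated compactly. The following Python sketch is one illustrative reading of FIG. 2, not an authoritative implementation; it assumes each monitored action has already been classified (step 204) as "user", "application", or "system", and it scans a recorded stream of actions rather than monitoring live events:

    def identify_action_sequences(actions):
        # actions: list of "user"/"application"/"system" labels in the
        # order they were performed (steps 202 and 204).
        sequences = []
        i = 0
        while i < len(actions):
            kind = actions[i]
            if kind == "application":
                # Step 208: an application action followed by a system action.
                if i + 1 < len(actions) and actions[i + 1] == "system":
                    sequences.append(("application", "system"))
                    i += 2
                    continue
            elif kind == "user":
                # Steps 210 to 216: a user action followed by a system action,
                # an application action, or an application then system action.
                nxt = actions[i + 1] if i + 1 < len(actions) else None
                if nxt == "system":
                    sequences.append(("user", "system"))
                    i += 2
                    continue
                if nxt == "application":
                    if i + 2 < len(actions) and actions[i + 2] == "system":
                        sequences.append(("user", "application", "system"))
                        i += 3
                    else:
                        sequences.append(("user", "application"))
                        i += 2
                    continue
            # Step 206: a system action without a qualifying preceding
            # action (or an unpaired action) forms no sequence here.
            i += 1
        return sequences

For example, identify_action_sequences(["user", "application", "system"]) returns [("user", "application", "system")].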
In step 218, the processing unit 102 (e.g., executing the application program of the runtime monitor 108) may identify the runtime risk of the application or device based on the identified action sequence. The runtime risk may identify and/or predict a specific type of threat caused by the application or device. For example, a runtime risk may identify a physical damage threat, an accidental threat, a deliberate threat, a threat of technical failure, a compromise of function or information, a combination thereof, etc. In some instances, multiple runtime risks may be identified corresponding to multiple types of threats. The runtime risk may be identified using at least one assessment policy stored in the policy database 112. Each assessment policy may comprise one or more rules stored in the rules database 110 and may identify the runtime risk based on matching rules and probabilistic weights and scores. Types and values of probabilistic weights and scores that may be used by an assessment policy for identifying the runtime risk of an application or device will be apparent to persons having skill in the relevant art.
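One hypothetical way to combine matching rules and probabilistic weights into a runtime risk, reusing the Rule and AssessmentPolicy sketch above, is a noisy-OR combination; the formula and the returned structure are assumptions for illustration, not the patent's prescribed method:

    def identify_runtime_risk(policy, observed_sequence):
        # Step 218: match the observed action sequence against the
        # policy's rules and combine the matched rules' weights.
        matched = policy.matching_rules(observed_sequence)
        if not matched:
            return None
        no_threat = 1.0
        for rule in matched:
            # Probability that no matched rule's threat applies,
            # assuming independence between rules.
            no_threat *= 1.0 - rule.weight
        return {
            "score": 1.0 - no_threat,  # identified runtime risk score
            "threats": sorted({rule.threat for rule in matched}),
        }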
The rules stored in the rules database 110 may each identify an action sequence. For example, one or more rules in the rules database 110 may identify an action sequence including a user action followed by a system action. Additional types of action sequences that may be identified by a corresponding rule in the rules database 110 will be apparent to persons having skill in the relevant art and are discussed in more detail below with respect to FIG. 3. The identification of the runtime risk of an action sequence is also discussed in more detail below with respect to FIG. 3.
In step 220, the runtime monitor 108 may identify a behavior score of the application or device, which may be based on the identified runtime risk of the application or device. The behavior score may be a number or other value (e.g., a rating) that may indicate the risk or threat of the application or device to the computing system 100 or other device (e.g., a network connected to the computing system 100 via the communications interface 108, etc.). In some embodiments, the behavior score may indicate the risk or threat of the specific identified action sequence involving the application or device. In other embodiments, the behavior score may indicate the risk or threat of the application or device as a whole to the computing system 100 or other device.
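Continuing the same illustrative sketch, a behavior score might be derived from the identified runtime risk as follows; the thresholds and rating labels are assumptions chosen for the example:

    def identify_behavior_score(runtime_risk):
        # Step 220: reduce the identified runtime risk to a behavior
        # score; a rating is used here, but a numeric score would also
        # satisfy the description above.
        if runtime_risk is None:
            return "benign"
        if runtime_risk["score"] >= 0.8:
            return "high risk"
        if runtime_risk["score"] >= 0.4:
            return "moderate risk"
        return "low risk"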
Action Sequences

FIG. 3 is an illustration of the performed actions and exemplary action sequences that may be used by the computing system 100 and in the method illustrated in FIG. 2 for assessing the runtime risk of an application or device.

The runtime monitor 108 of the computing system 100 may monitor for actions performed in/by the computing system 100. The performed actions monitored by the runtime monitor 108 and identified by the processing unit 102 may include a user action 302, an application action 304, and a system action 306. The user action 302 may be any form of explicit input from a user of the computing system 100. For example, the user action may include a program launch, keyboard input, click of a mouse, touch input on a touch screen, moving of a mouse, scrolling or clicking on a click wheel, voice command into a microphone, gestures into a device configured to read human gestures, etc.
The application action 304 may be any activity performed by the application (e.g., stored in the memory unit 106 and executed by the processing unit 102) initiated programmatically by a task in execution of the computing system 100. In an exemplary embodiment, the application performing the
application action 304 may be the application for which the runtime risk is identified or an application controlled by or configured to control the device for which the runtime risk is identified. The application action 304 may include, for example, a start type (e.g., console, service, parent process, script), first time execution, fork process (e.g., child process creation), code injection methods, keyboard capture methods, screen capture methods, dynamic code blocks, open data documents, automatic or self-termination, abnormal terminations (e.g., exception conditions), multiple restarts, opening of device(s) (e.g., the device for which the runtime risk is identified), process enumeration, cryptography, communication protocols, performance metrics, etc. as will be apparent to persons having skill in the relevant art.
The system action 306 may be any operation performed by the computing system 100 (e.g., by one or more components included in the computing system 100) on behalf of, or as a consequence of, an application or user action that changes the state of the computing system. The system action 306 may include, for example, system file creation, system folder modification, registry modifications or additions, application setting modifications, driver install or uninstall (e.g., system files), network activity, execution from temporary system folders, alternate data streams (ADS), environmental aspects (e.g., debugger presence, virtual environments, system boot times/sequences), etc.
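For illustration, the three kinds of performed actions might be represented by a single record type that also captures when each action occurred, which supports the timing discussion below; the names here are assumptions, not taken from the patent:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class PerformedAction:
        kind: str         # "user" (302), "application" (304), or "system" (306)
        detail: str       # e.g., "program launch" or "registry modification"
        timestamp: float  # when the action occurred, e.g., from time.time()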
The runtime monitor 108 may be configured to identify (e.g., using the method illustrated in FIG. 2 and discussed above) an action sequence 308 used to identify the runtime risk of an application or device. The action sequence 308 may include at least two performed actions, and may indicate a set of actions performed in the computing system 100. For example, the action sequence 308a may include a user action 302 followed by a system action 306. In some embodiments, the runtime monitor 108 may identify the time at which each performed action occurs. For example, for the action sequence 308a, the runtime monitor 108 may identify the user action 302 occurring at a time Tu and the system action 306 occurring at a time Ts, where Ts > Tu, such that the user action 302 was performed first and the system action 306 second.
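The ordering constraint just described (Ts > Tu) can be checked directly. In the sketch below, which reuses the hypothetical PerformedAction record, the five-second time-proximity window echoes the time proximity factor shown in FIG. 2 but is an assumed value, not one given in the patent:

    def is_sequence_308a(user_action, system_action, max_gap_seconds=5.0):
        # A user action at time Tu followed by a system action at time Ts,
        # with Ts > Tu and the two actions in close time proximity.
        tu = user_action.timestamp
        ts = system_action.timestamp
        return ts > tu and (ts - tu) <= max_gap_seconds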
Additional action sequences 308 that may be identified by the runtime monitor 108 and used to identify the runtime risk of an application or device may include action sequences 308b, 308c, and 308d. Action sequence 308b may include a user action 302 followed by an application action 304. Action sequence 308c may include an application action 304 followed by a system action 306.