PRIVACY PROTECTION DURING INSIDER THREAT MONITORING

Richard A. Ford
Christopher B. Shirey
Jonathan B. Knepher
Lidror Troyansky

BACKGROUND OF THE INVENTION

Field of the Invention

[0001] The present invention relates in general to the field of computers and similar technologies, and in particular to software utilized in this field. Still more particularly, it relates to a method, system and computer-usable medium for privacy protection during insider threat monitoring.

Description of the Related Art

[0002] Users interact with physical, system, data, and services resources of all kinds, as well as each other, on a daily basis. Each of these interactions, whether accidental or intended, poses some degree of security risk. Many physical and cyber security efforts have traditionally been oriented towards preventing or circumventing the intent of external threats. A growing area of physical and cybersecurity efforts now focuses on identifying and addressing insider threats. It is known to perform a user input/output (I/O) event collection operation when identifying and addressing insider threats. With known I/O collection operations, an I/O event collector collects all keystrokes, user gestures, and physical security interactions (e.g., use of an access card) performed by a user within an organization.

Attorney Docket No.: FP0005

SUMMARY OF THE INVENTION

[0003] A method, system and computer-usable medium are disclosed for performing a privacy operation, comprising: monitoring user behavior via an I/O collector, the I/O collector capturing data streams, events and metadata resulting from user/device interactions between a user and a device; determining whether the user/device interactions include sensitive personal information; obfuscating the sensitive personal information, the obfuscating preventing viewing of the sensitive personal information; and, presenting the sensitive personal information as a sensitive personal information indication, the sensitive personal information indication indicating the data streams include sensitive personal information.

BRIEF DESCRIPTION OF THE DRAWINGS

[0004] The present invention may be better understood, and its numerous objects, features and advantages made apparent to those skilled in the art, by referencing the accompanying drawings. The use of the same reference number throughout the several figures designates a like or similar element.

[0005] Figure 1 depicts an exemplary client computer in which the present invention may be implemented;

[0006] Figure 2 is a simplified block diagram of an edge device;

[0007] Figure 3 is a simplified block diagram of a unified agent;

[0008] Figure 4 is a simplified block diagram of a security analytics system;

[0009] Figure 5 is a simplified block diagram of a risk-adaptive behavior system;

[0010] Figure 6 is a simplified block diagram of risk-adaptive behavior elements and their interrelationship;

[0011] Figures 7a through 7c are a generalized flowchart of the performance of risk-adaptive behavior policy generation operations;

[0012] Figure 8 is a generalized flowchart of the performance of risk-adaptive behavior system operations to adaptively manage user behavior risk;

[0013] Figure 9 is a simplified block diagram of the operation of a risk-adaptive behavior system for adaptively assessing risk associated with a user behavior;

[0014] Figure 10 is a simplified block diagram of the operation of a risk-adaptive behavior system for adaptively responding to a user request;

[0015] Figure 11 is a graphical depiction of the operation of a risk-adaptive behavior system for adaptively assessing risk associated with a user behavior;

[0016] Figure 12 is a graphical depiction of the operation of a risk-adaptive behavior system for optimizing system efficiency by adaptively assessing user behavior risk;

[0017] Figure 13 is a simplified block diagram of the operation of a risk-adaptive behavior system for obfuscating and conditionally accessing a user’s sensitive personal information (SPI);

[0018] Figures 14a through 14c are a generalized flowchart of the performance of risk-adaptive behavior system operations to generate an SPI policy;

[0019] Figure 15 is a generalized flowchart of the performance of risk-adaptive behavior system operations to obfuscate a user’s SPI; and

[0020] Figures 16a through 16b are a generalized flowchart of the performance of risk-adaptive behavior system operations to gain conditional access to a user’s SPI.

DETAILED DESCRIPTION

[0021] Certain aspects of the present disclosure include an appreciation that an input/output (I/O) event collection operation can inadvertently capture and disclose a user’s password or other personally sensitive information. Certain aspects of the present disclosure include an appreciation that it would be desirable to detect password or security credential reuse across multiple sites, where and how such passwords or security credentials are entered, and associated location information, to provide proactive detection of credential loss, such as via phishing attacks. Certain aspects of the present disclosure include an appreciation that it would be desirable to avoid storing any personally sensitive information obtained during an I/O event collection operation as human-interpretable information.

[0022] A method, system and computer-usable medium are disclosed for performing a privacy protection operation during insider threat monitoring. In certain embodiments, the privacy protection operation stores a one-way function (e.g., a hash) rendition of the user’s password at login within an endpoint agent. In certain embodiments, the one-way function is an internally complex one-way function, such as a multidimensional array of hashes with their state. Such an internally complex one-way function enables a use case with unknown starting and ending points, backspaces, and trivial password variants. As the endpoint receives individual I/O events, they are sequentially added to the one-way function such that when the user re-enters sensitive personal information (SPI), such as a user’s password, the system recognizes that the sequence of collected I/O events corresponds to the SPI and identifies the sequence of collected I/O events as a potential match of the user’s credential. When the sequence of collected I/O events has been identified as a potential match to the user’s credential, this information can be used in a plurality of use cases.

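By way of illustration only, the endpoint-side matching described above may be sketched as follows. This simplified sketch is not the disclosed multidimensional array of hashes with state: it keeps only a salted SHA-256 digest of the enrolled credential, buffers a bounded window of recent keystrokes, and hashes every suffix of that window after each event, so that a re-entry with an unknown starting point, possibly containing backspaces, is still recognized. The class name, salt, and 64-character bound are assumptions.

```python
import hashlib

MAX_SPI_LEN = 64  # assumed upper bound on credential length


class SpiMatcher:
    """Sketch of an endpoint-side matcher: only a salted hash of the
    credential is stored; raw keystrokes live in a short, bounded buffer."""

    def __init__(self, credential: str, salt: bytes = b"endpoint-salt"):
        self.salt = salt
        self.target = self._digest(credential)
        self.buffer: list[str] = []

    def _digest(self, text: str) -> bytes:
        return hashlib.sha256(self.salt + text.encode()).digest()

    def feed(self, key: str) -> bool:
        """Add one I/O event; return True when some recent suffix of the
        buffer hashes to the enrolled credential (unknown starting point is
        handled by trying every suffix of the bounded buffer)."""
        if key == "\b":  # trivial variant: honor backspaces
            if self.buffer:
                self.buffer.pop()
            return False
        self.buffer.append(key)
        self.buffer = self.buffer[-MAX_SPI_LEN:]
        for start in range(len(self.buffer)):
            if self._digest("".join(self.buffer[start:])) == self.target:
                return True
        return False
```

In use, feeding the keystrokes of an enrolled password preceded by unrelated typing still triggers a match on the final character, which is the "unknown starting point" property the paragraph describes.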
[0023] In certain embodiments, heuristics are implemented to determine whether a sequence of collected I/O events may represent SPI. As an example, a user may enter a series of keystrokes for their password. However, the individual keystrokes may be displayed as asterisks on the user interface (UI) of the user’s device. Further, the keystrokes may have been entered within a particular window of the UI associated with user passwords. In this example, the implemented heuristics may indicate a high likelihood that the keystrokes are the user’s password, and therefore likely represent SPI.

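The heuristic in this example can be expressed as a simple additive score. The event field names and the weights below are illustrative assumptions, not taken from the disclosure:

```python
def spi_likelihood(event: dict) -> float:
    """Toy heuristic score in [0, 1] that a collected keystroke sequence
    represents SPI. Field names and weights are hypothetical."""
    score = 0.0
    if event.get("echo") == "masked":  # UI displayed asterisks
        score += 0.5
    if "password" in event.get("field_name", "").lower():
        score += 0.3
    if event.get("window_class") == "credential_prompt":
        score += 0.2
    return min(score, 1.0)
```

A masked field whose name mentions "password" would score 0.8 under these weights, which a deployment could compare against a configurable threshold before obfuscating the sequence.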
[0024] In various embodiments, the privacy protection operation captures the context in which the entered sequence of collected I/O events occurred and obfuscates the sequence of collected I/O events corresponding to the SPI, such that the sequence of collected I/O events is not displayed to a security administrator, such as an Insider Threat Investigator. Instead, the sequence of collected I/O events is rendered, displayed, or saved as an SPI indication, such as a token. Thus, the security administrator can observe that SPI was entered by a user, but not the actual SPI itself.

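One minimal way to render such an SPI indication is to splice an opaque token into the collected event stream in place of the raw keystrokes. The token format below (a tag containing a truncated hash) is a hypothetical choice; any opaque indication would serve:

```python
import hashlib


def obfuscate_events(events: list, spi_spans: list) -> list:
    """Replace each (start, end) span of SPI keystrokes with an opaque
    indication token, so an investigator can see *that* SPI was entered
    but not the SPI itself. Spans are assumed non-overlapping."""
    out = []
    i = 0
    for start, end in sorted(spi_spans):
        out.extend(events[i:start])
        raw = "".join(events[start:end])
        # truncated hash lets repeated entries of the same SPI correlate
        # without revealing the underlying keystrokes
        token = "[SPI:" + hashlib.sha256(raw.encode()).hexdigest()[:8] + "]"
        out.append(token)
        i = end
    out.extend(events[i:])
    return out
```

The hashed token also preserves a useful property: the same SPI re-entered later maps to the same indication, so an investigator can correlate entries without ever viewing them.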
[0025] In certain embodiments, the privacy protection operation includes a conditional SPI access scenario that allows an investigator to access the associated raw events if needed. In certain embodiments, the raw events may include the obfuscated sequence of collected I/O events and the context in which the sequence of collected I/O events was entered. Such a conditional access scenario prevents casual exposure of users’ SPI within the insider threat operation. In certain embodiments, such a conditional access scenario likewise adds accountability related to how the users’ SPI is accessed, and by whom, within an organization. In certain embodiments, the privacy protection operation captures where the SPI was entered and/or re-used. If the SPI was entered via an information processing system maintained by the information technology environment, the privacy protection system might determine there is no, or relatively low, risk to the organization from the SPI entry. However, if the SPI was entered and/or reused on a third-party server, then the privacy protection system might determine that there is a strong risk of loss of credentials. Such a privacy protection operation allows users to be alerted, and potentially proactively blocked, when entering their SPI into a non-approved location. Such a privacy protection operation also provides strong protection from phishing, as the phishing endpoint is not approved.

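The approved-location check described above can be sketched as a host comparison. The host names and the two-level risk labels below are assumptions for illustration:

```python
def credential_entry_risk(
    host: str,
    approved=frozenset({"sso.example.com", "mail.example.com"}),
) -> str:
    """Classify where an enrolled credential was (re)entered. A credential
    typed into a non-approved host suggests reuse or phishing and merits
    an alert, and potentially a proactive block."""
    host = host.lower().rstrip(".")
    # exact match, or a subdomain of an approved host, counts as approved
    if host in approved or any(host.endswith("." + a) for a in approved):
        return "low"
    return "high"
```

Note that a lookalike phishing domain such as `sso-example.com.evil.io` fails both the exact and the subdomain test, which is the property the paragraph relies on when it says the phishing endpoint is not approved.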
[0026] In certain embodiments, the privacy protection operation offers the user an opportunity to enroll their SPI in the privacy protection system. Thus, enrolling the SPI provides the user with an assurance that inadvertent display of the SPI associated with their personal accounts (e.g., Gmail, Yahoo!) in the company security system (such as an Insider Threat System) would be prevented, but does not weaken the protection provided by the company security system. In certain embodiments, the privacy protection operation uses a software guard (SG) enclave (such as the Software Guard Extensions available from Intel Corporation) to protect the privacy protection system, and/or develops or leverages a streaming hash algorithm. In certain embodiments, the one-way function does not need to be cryptographically secure, and collisions of data produced by the one-way function result in a positive, rather than a negative, effect. In various embodiments, the privacy protection operation prevents an attacker from capturing data generated by the use of the one-way function and thereby being able to derive the SPI from that data.

[0027] For purposes of this disclosure, an information handling system may include any instrumentality or aggregate of instrumentalities operable to compute, classify, process, transmit, receive, retrieve, originate, switch, store, display, manifest, detect, record, reproduce, handle, or utilize any form of information, intelligence, or data for business, scientific, control, or other purposes. For example, an information handling system may be a personal computer, a mobile device such as a tablet or smartphone, a connected “smart device,” a network appliance, a network storage device, or any other suitable device, and may vary in size, shape, performance, functionality, and price. The information handling system may include random access memory (RAM), one or more processing resources such as a central processing unit (CPU) or hardware or software control logic, ROM, and/or other types of nonvolatile memory. Additional components of the information handling system may include one or more storage systems, one or more network ports for communicating externally, as well as various input and output (I/O) devices, such as a keyboard, a mouse, and a graphics display.

[0028] Figure 1 is a generalized illustration of an information handling system 100 that can be used to implement the system and method of the present invention. The information handling system 100 includes a processor (e.g., central processor unit or “CPU”) 102, input/output (I/O) devices 104, such as a display, a keyboard, a mouse, and associated controllers, a storage system 106, and various other subsystems 108. In various embodiments, the information handling system 100 also includes network port 110 operable to connect to a network 140, which is likewise accessible by a service provider server 142. The information handling system 100 likewise includes system memory 112, which is interconnected to the foregoing via one or more buses 114. System memory 112 further includes operating system (OS) 116 and in various embodiments may also include either or both a risk-adaptive behavior system 118 and a privacy protection system 119. In one embodiment, the information handling system 100 is able to download the risk-adaptive behavior system 118 and/or the privacy protection system 119 from the service provider server 142. In another embodiment, the risk-adaptive behavior system 118 and/or the privacy protection system 119 is provided as a service from the service provider server 142.

[0029] In various embodiments, the risk-adaptive behavior system 118 performs a risk-adaptive behavior operation to assess the risk corresponding to a user behavior and adaptively respond with an associated response. In certain embodiments, the risk-adaptive behavior operation improves processor efficiency, and thus the efficiency of the information handling system 100, by automatically performing a risk-adaptive behavior operation. As will be appreciated, once the information handling system 100 is configured to perform the risk-adaptive behavior operation, the information handling system 100 becomes a specialized computing device specifically configured to perform the risk-adaptive behavior operation and is not a general purpose computing device. Moreover, the implementation of the risk-adaptive behavior system 118 on the information handling system 100 improves the functionality of the information handling system 100 and provides a useful and concrete result of automatically assessing the risk corresponding to a user behavior and adaptively responding with an associated response.

[0030] In various embodiments, the privacy protection system 119 performs a privacy protection operation. In certain embodiments, the privacy protection operation improves processor efficiency, and thus the efficiency of the information handling system 100, by automatically performing a privacy protection operation. As will be appreciated, once the information handling system 100 is configured to perform the privacy protection operation, the information handling system 100 becomes a specialized computing device specifically configured to perform the privacy protection operation and is not a general purpose computing device. Moreover, the implementation of the privacy protection system 119 on the information handling system 100 improves the functionality of the information handling system 100 and provides a useful and concrete result of automatically protecting sensitive information obtained via a keystroke monitoring operation.

[0031] Figure 2 is a simplified block diagram of an edge device implemented in accordance with an embodiment of the invention. As used herein, an edge device, such as the edge device 202 shown in Figure 2, broadly refers to a device providing an entry point into a network 140. Examples of such edge devices 202 may include routers, routing switches, integrated access devices (IADs), multiplexers, wide-area network (WAN) access devices, and network security appliances. In various embodiments, the network 140 may be a private network (e.g., an enterprise network), a semi-public network (e.g., a service provider core network), or a public network (e.g., the Internet).

[0032] Skilled practitioners of the art will be aware that edge devices 202 are often implemented as routers that provide authenticated access to faster, more efficient backbone and core networks. Furthermore, current industry trends include making edge devices 202 more intelligent, which allows core devices to operate at higher speed, as they are not burdened with additional administrative overhead. Accordingly, such edge devices 202 often include Quality of Service (QoS) and multi-service functions to manage different types of traffic. Consequently, it is common to design core networks with switches that use routing protocols such as Open Shortest Path First (OSPF) or Multiprotocol Label Switching (MPLS) for reliability and scalability. Such approaches allow edge devices 202 to have redundant links to the core network, which not only provides improved reliability, but enables enhanced, flexible, and scalable security capabilities as well.

[0033] In various embodiments, the edge device 202 is implemented to include a communications/services architecture 202, various pluggable capabilities 212, a traffic router 210, and a pluggable hosting framework 208. In certain of these embodiments, the communications/services architecture 202 may be implemented to provide access to and from various networks 140, cloud services 206, or a combination thereof. In various embodiments, the cloud services 206 may be provided by a cloud infrastructure familiar to those of skill in the art. In various embodiments, the edge device 202 may be implemented to provide support for a variety of generic services, such as directory integration, logging interfaces, update services, and bidirectional risk/context flows associated with various analytics.

[0034] In certain embodiments, the edge device 202 is implemented as a generic device configured to host various network communications, data processing, and security management capabilities. In various embodiments, the pluggable hosting framework 208 is implemented to host such capabilities in the form of pluggable capabilities 212. In certain embodiments, the pluggable capabilities 212 may include capability ‘1’ 214 (e.g., basic firewall), capability ‘2’ 216 (e.g., general web protection), capability ‘3’ 218 (e.g., data sanitization), and so forth through capability ‘n’ 220, which may include capabilities needed for a particular operation, process, or requirement on an as-needed basis.

[0035] In various embodiments, the pluggable capabilities 212 are sourced from various cloud services 206. In certain embodiments, the pluggable hosting framework 208 may be implemented to provide certain computing and communication infrastructure components, and foundation capabilities, required by one or more of the pluggable capabilities 212. In various embodiments, the pluggable hosting framework 208 may be implemented to allow the pluggable capabilities 212 to be dynamically invoked. Skilled practitioners of the art will recognize that many such embodiments are possible. Accordingly, the foregoing is not intended to limit the spirit, scope or intent of the invention.

[0036] Figure 3 is a simplified block diagram of a unified agent implemented in accordance with an embodiment of the invention. As used herein, a unified agent, such as the unified agent 306 shown in Figure 3, broadly refers to a software agent used in combination with an endpoint device 304 to establish a protected endpoint 302. Skilled practitioners of the art will be familiar with software agents, which are computer programs that perform actions on behalf of a user or another program. In various approaches, a software agent may be autonomous or work together with another agent or a user. In certain of these approaches, the software agent is implemented to autonomously decide if a particular action is appropriate for a given event, such as an observed user behavior.

[0037] An endpoint device 304, as likewise used herein, refers to an information processing system such as a personal computer, a laptop computer, a tablet computer, a personal digital assistant (PDA), a smart phone, a mobile telephone, a digital camera, a video camera, or other device that is capable of storing, processing and communicating data. In various embodiments, the communication of the data may take place in real-time or near-real-time. As an example, a cellular phone conversation may be used to communicate information in real-time, while an instant message (IM) exchange may be used to communicate information in near-real-time. In certain embodiments, the communication of the information may take place asynchronously. For example, an email message may be stored on an endpoint device 304 when it is offline. In this example, the information may be communicated to its intended recipient once the endpoint device 304 gains access to a network 140.

`[0038]
`
`A protected endpoint 302, as likewise used herein, broadly refers to a policy-
`
`-9-
`
`
`
`Attorney Docket No.: FP0005
`
`based approach to network security that typically requires endpoint devices 304 to
`
`comply with particular criteria before they are granted access to network resources. As
`
`an example, a given endpoint device 304 may be required to have a particular operating
`
`system (OS), or version thereof, a Virtual Private Network (VPN)client, anti-virus
`
`software with current updates, and so forth. In various embodiments, the protected
`
`endpoint 302 is implemented to perform risk-adaptive behavior operations.
`
[0039] Risk-adaptive behavior, as used herein, broadly refers to adaptively responding to a risk associated with an electronically-observable user behavior. As used herein, electronically-observable user behavior broadly refers to any behavior exhibited or enacted by a user that can be electronically observed. In various embodiments, user behavior may include a user’s physical behavior, cyber behavior, or a combination thereof. As likewise used herein, physical behavior broadly refers to any user behavior occurring within a physical realm. More particularly, physical behavior may include any action enacted by a user that can be objectively observed, or indirectly inferred, within a physical realm.

[0040] As an example, a user may attempt to use an electronic access card to enter a secured building. In this example, the use of the access card to enter the building is the action, and the reading of the access card makes the user’s physical behavior electronically-observable. As another example, a first user may physically transfer a document to a second user, which is captured by a video surveillance system. In this example, the physical transferal of the document from the first user to the second user is the action. Likewise, the video record of the transferal makes the first and second user’s physical behavior electronically-observable.

[0041] Cyber behavior, as used herein, broadly refers to any behavior occurring in cyberspace, whether enacted by an individual user, a group of users, or a system acting at the behest of an individual user, a group of users, or an entity. More particularly, cyber behavior may include physical, social, or mental actions that can be objectively observed, or indirectly inferred, within cyberspace. As an example, a user may use an endpoint device 304 to access and browse a particular website on the Internet. In this example, the individual actions performed by the user to access and browse the website constitute a cyber behavior. As another example, a user may use an endpoint device 304 to download a data file from a particular system. In this example, the individual actions performed by the user to download the data file constitute a cyber behavior. In these examples, the actions are enacted within cyberspace, which makes them electronically-observable.

[0042] As likewise used herein, cyberspace broadly refers to a network 140 environment capable of supporting communication between two or more entities. In various embodiments, the entity may be a user, an endpoint device 304, or various resources, described in greater detail herein. In certain embodiments, the entities may include various endpoint devices 304 or resources operating at the behest of an entity, such as a user. In various embodiments, the communication between the entities may include audio, image, video, text, or binary data.

[0043] By extension, a risk-adaptive behavior system, as used herein, broadly refers to a system implemented to monitor various user behaviors, assess the corresponding risk they may represent, individually or in combination, and respond with an associated response. In certain embodiments, such responses may be based upon contextual information associated with a given user behavior. As used herein, contextual information broadly refers to any information, directly or indirectly, individually or in combination, related to a particular user behavior. As described in greater detail herein, the contextual information may include a user’s identification factors, their authentication factors, their role in an organization, and their associated access rights. Other contextual information may likewise include various user interactions, whether the interactions are with an endpoint device 304, a network 140, a resource, or another user. Contextual information may likewise include the date/time/frequency of various user behaviors, the user’s location, and certain user gestures employed by the user in the enactment of a user behavior. In various embodiments, user behaviors, and their related contextual information, may be collected at particular points of observation, described in greater detail herein.

[0044] In various embodiments, the unified agent 306 is implemented to universally support a variety of operating systems, such as Apple macOS®, Microsoft Windows®, Linux®, and so forth. In certain embodiments, the unified agent 306 interacts with the endpoint device 304 through the use of low-level hooks 312 at the OS level. It will be appreciated that the use of low-level hooks 312 allows the unified agent 306 to subscribe to multiple events through a single hook. Accordingly, multiple functionalities provided by the unified agent 306 can share a single data stream, using only those portions of the data stream they may individually need. Accordingly, system efficiency can be improved and operational overhead reduced.

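The single-hook fan-out described in this paragraph can be sketched as a small dispatcher: one low-level hook delivers each event once, and every subscribed feature filters the shared stream for the portions it needs. The event shape and subscriber API below are hypothetical:

```python
from typing import Callable, Dict, List


class HookDispatcher:
    """One OS-level hook calls on_event() once per event; each subscribed
    feature (e.g., DLP, insider-threat analytics) consumes the same stream."""

    def __init__(self) -> None:
        self._subscribers: List[Callable[[Dict], None]] = []

    def subscribe(self, handler: Callable[[Dict], None]) -> None:
        self._subscribers.append(handler)

    def on_event(self, event: Dict) -> None:
        # single data stream, many consumers
        for handler in self._subscribers:
            handler(event)


# usage: two features share the stream, each keeping only what it needs
dispatcher = HookDispatcher()
keystrokes: List[Dict] = []
file_ops: List[Dict] = []
dispatcher.subscribe(
    lambda e: keystrokes.append(e) if e.get("type") == "keystroke" else None)
dispatcher.subscribe(
    lambda e: file_ops.append(e) if e.get("type") == "file_write" else None)
dispatcher.on_event({"type": "keystroke", "key": "a"})
dispatcher.on_event({"type": "file_write", "path": "/tmp/x"})
```

This is the efficiency argument of the paragraph in miniature: one capture path feeds every functionality, rather than each functionality installing its own hook.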
[0045] In various embodiments, the unified agent 306 provides a common infrastructure for pluggable feature packs 308. In certain of these embodiments, the pluggable feature packs 308 may provide certain security management functionalities. Examples of such functionalities may include various anti-virus and malware detection, data loss protection (DLP), insider threat detection, and so forth. In various embodiments, the security management functionalities may include one or more risk-adaptive behavior functionalities, described in greater detail herein.

[0046] In certain embodiments, a particular pluggable feature pack 308 is invoked as needed by the unified agent 306 to provide a given risk-adaptive behavior functionality. In one embodiment, individual features of a particular pluggable feature pack 308 are invoked as needed. It will be appreciated that the ability to invoke individual features of a pluggable feature pack 308, without necessarily invoking all such features, will likely improve the operational efficiency of the unified agent 306 while simultaneously reducing operational overhead. Accordingly, the unified agent 306 can self-optimize in various embodiments by using the common infrastructure and invoking only those pluggable components that are applicable or needed for a given user behavior.

[0047] In certain embodiments, the individual features of a pluggable feature pack 308 are invoked by the unified agent 306 according to the occurrence of a particular user behavior. In various embodiments, the individual features of a pluggable feature pack 308 are invoked by the unified agent 306 according to the context of a particular user behavior. As an example, the context may be the user enacting the user behavior, their associated risk classification, which resource they may be requesting, and so forth. In certain embodiments, the pluggable feature packs 308 are sourced from various cloud services 206. In one embodiment, the pluggable feature packs 308 are dynamically sourced from various cloud services 206 by the unified agent 306 on an as-needed basis.

[0048] In various embodiments, the unified agent 306 is implemented with additional functionalities, such as event analytics 310. In certain embodiments, the event analytics 310 functionality includes analysis of various user behaviors, described in greater detail herein. In various embodiments, the unified agent 306 is implemented with a thin hypervisor 314, which can be run at Ring -1, thereby providing protection for the unified agent 306 in the event of a breach. As used herein, a thin hypervisor broadly refers to a simplified hypervisor implemented to increase security. As likewise used herein, Ring -1 broadly refers to approaches allowing guest operating systems to run Ring 0 (i.e., kernel) operations without affecting other guests or the host OS. Those of skill in the art will recognize that many such embodiments are possible. Accordingly, the foregoing is not intended to limit the spirit, scope or intent of the invention.

[0049] Figure 4 is a simplified block diagram of a security analytics system implemented in accordance with an embodiment of the invention. In various embodiments, the security analytics system shown in Figure 4 is implemented to provide log storage, reporting, and analytics capable of performing streaming 406 and on-demand 408 analytics operations. In certain embodiments, the security analytics system is implemented to provide a uniform platform for storing events and contextual information associated with various user behaviors and performing longitudinal analytics.

[0050] As used herein, longitudinal analytics broadly refers to performing analytics of user behaviors occurring over a particular period of time. As an example, a user may iteratively attempt to access certain proprietary information stored in various locations. In addition, the attempts may occur over a brief period of time. To continue the example, the fact that the information the user is attempting to access is proprietary, that it is stored in various locations, and that the attempts are occurring in a brief period of time may, in combination, indicate that the user behavior enacted by the user is suspicious.

[0051] In various embodiments, the security analytics system is implemented to be scalable. In one embodiment, the security analytics system may be implemented in a centralized location, such as a corporate data center. In this embodiment, additional resources may be added to the security analytics system as needs grow. In another embodiment, the security analytics system may be implemented as a distributed system. In this embodiment, the security analytics system may span multiple information processing systems. In yet another embodiment, the security analytics system may be implemented in a cloud environment. In yet still another embodiment, the security analytics system may be implemented in a virtual machine (VM) environment. In such an embodiment, the VM environment may be configured to dynamically and seamlessly scale the security analytics system as needed. Skilled practitioners of the art will recognize that many such embodiments are possible. Accordingly, the foregoing is not intended to limit the spirit, scope or intent of the invention.

[0052] In certain embodiments, an event collector 402 is implemented to collect event and contextual information, described in greater detail herein, associated with various user behaviors. In these embodiments, the event and contextual information collected by the event collector 402, as described in greater detail herein, is a matter of design choice. In various embodiments, the event and contextual information collected by the event collector 402 may be processed by an enrichment module 404 to generate enriched user behavior information. In certain embodiments, the enrichment may include certain contextual information related to a particular user behavior.

[0053] In certain embodiments, enriched user behavior information is provided by the enrichment module 404 to a streaming 406 analytics module. In turn, the streaming 406 analy