Case 4:18-cv-07229-YGR   Document 125-5   Filed 10/22/20   Page 1 of 25

EXHIBIT E
(12) United States Patent                         (10) Patent No.:     US 8,140,660 B1
     Wells et al.                                  (45) Date of Patent: Mar. 20, 2012

(54) CONTENT PATTERN RECOGNITION LANGUAGE PROCESSOR AND METHODS OF USING THE SAME

(75) Inventors: Joseph Wells, Pahrump, NV (US); Michael Xie, Palo Alto, CA (US)

(73) Assignee: Fortinet, Inc., Sunnyvale, CA (US)

(*)  Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 914 days.

(21) Appl. No.: 10/624,452

(22) Filed: Jul. 21, 2003

     Related U.S. Application Data

(60) Provisional application No. 60/397,147, filed on Jul. 19, 2002, provisional application No. 60/397,304, filed on Jul. 19, 2002, provisional application No. 60/397,302, filed on Jul. 19, 2002, provisional application No. 60/397,034, filed on Jul. 19, 2002, provisional application No. 60/397,033, filed on Jul. 19, 2002.

(51) Int. Cl.
     G06F 15/173 (2006.01)
(52) U.S. Cl. ................. 709/224; 713/188; 726/13; 726/22
(58) Field of Classification Search ................. 709/223-225; 713/188; 726/13, 22-25
     See application file for complete search history.

(56) References Cited
U.S. PATENT DOCUMENTS

5,491,691 A       2/1996   Shtayer et al.
5,530,939 A *     6/1996   Mansfield et al. .............. 1/1
5,539,659 A *     7/1996   McKee et al. .................. 709/224
5,557,742 A *     9/1996   Smaha et al. .................. 726/22
5,790,799 A       8/1998   Mogul
5,892,348 A *     4/1999   Norman et al. ................. 318/701
5,896,499 A       4/1999   McKelvey
5,946,487 A *     8/1999   Dangelo ....................... 717/148
5,991,881 A *    11/1999   Conklin et al. ................ 726/22
6,009,467 A      12/1999   Ratcliff et al.
6,067,575 A *     5/2000   McManis et al. ................ 719/313
6,119,175 A *     9/2000   Hakkarainen et al. ............ 710/22
6,279,113 B1 *    8/2001   Vaidya ........................ 726/23
6,338,141 B1 *    1/2002   Wells ......................... 726/24
6,453,345 B2      9/2002   Trcka et al.
6,513,122 B1 *    1/2003   Magdych et al. ................ 726/23
6,519,703 B1      2/2003   Joyce
6,637,026 B1 *   10/2003   Chen .......................... 717/151
6,654,373 B1     11/2003   Maher et al.
6,654,882 B1 *   11/2003   Froutan et al. ................ 713/153
6,823,697 B2     11/2004   Fukuyama et al.

(Continued)

OTHER PUBLICATIONS

"U.S. Appl. No. 10/624,914 Final Office Action mailed Jul. 10, 2008", FOAR, 17 pgs.

(Continued)
Primary Examiner — Patrice Winder
Assistant Examiner — Tauqir Hussain
(74) Attorney, Agent, or Firm — Schwegman, Lundberg & Woessner, P.A.

(57) ABSTRACT

A device for detecting network traffic content is provided. The device includes a processor configured to receive a signature associated with content desired to be detected, and execute one or more functions based on the signature to determine whether network traffic content matches the content desired to be detected. The signature is defined by one or more predicates. A computer readable medium for use to detect network traffic content is also provided. The computer readable medium includes a memory storing one or more signatures, each of the one or more signatures associated with content desired to be detected. Each of the one or more signatures is defined by one or more predicates, and each of the one or more predicates can be compiled into a byte code stream that controls a logic of a network traffic screening device.

29 Claims, 9 Drawing Sheets
[Representative drawing: flow chart of FIG. 4, steps 402 to 414 (reproduced on Sheet 4 of 9).]
U.S. PATENT DOCUMENTS

   6,826,697 B1 *   11/2004   Moran ......................... 726/23
   7,047,288 B2 *    5/2006   Cooper et al. ................. 709/223
   7,080,408 B1      7/2006   Pak et al.
   7,134,012 B2     11/2006   Doyle et al.
   7,181,765 B2 *    2/2007   Patel et al. .................. 726/12
   7,181,769 B1 *    2/2007   Keanini et al. ................ 726/23
   7,185,368 B2 *    2/2007   Copeland, III ................. 726/25
   7,424,744 B1 *    9/2008   Wu et al. ..................... 726/23
   7,519,990 B1      4/2009   Xie et al.
2001/0042214 A1     11/2001   Radatti et al.
2002/0038339 A1      3/2002   Xu
2002/0094090 A1 *    7/2002   Iino .......................... 380/282
2002/0129271 A1      9/2002   Stanaway, Jr. et al.
2002/0162026 A1     10/2002   Neuman
2002/0174350 A1     11/2002   Franczek
2003/0004689 A1 *    1/2003   Gupta et al. .................. 702/188
2003/0014662 A1 *    1/2003   Gupta et al. .................. 713/200
2003/0061496 A1 *    3/2003   Ananda ........................ 713/189
2003/0145228 A1      7/2003   Suuronen et al.
2004/0003284 A1      1/2004   Campbell et al.
2005/0021613 A1 *    1/2005   Schmeidler et al. ............. 709/203
2005/0251570 A1     11/2005   Heasman et al.

OTHER PUBLICATIONS
`"U.S. Appl. No. 10/624,941 Response filed Jul. 16, 2008 to Non-
`Final Office Action mailed Mar. 17, 2008". 9 pgs.
`"U.S. Appl. No. 10/624,941, Non-Final Office Action mailed Dec.
`11, 2008", 6 pgs.
`"U.S. Appl. No. 10/624,941, Response filed Jan. 14, 2009 to Non-
`Final Office Action mailed Dec. 11, 2008". 11 pgs.
`"U.S. Appl. No. 10/624,914, Advisory Action mailed Oct. 19, 2007",
`3 pgs.
`"U.S. Appl. No. 10/624,914, Final Office Action mailed Aug. 10,
`2007", 17 pgs.
`"U.S. Appl. No. 10/624,914, Non-Final Office Action mailed Jan. 2,
`2008", 17 pgs.
`"I T.S. Appl. No. 10/624,914, Non-Final Office Action mailed Mar.
`14, 2007", 10 pgs.
`"U.S. Appl. No. 10/624,914, Preliminary Amendment mailed Apr.
`26, 2005", 5 pgs.
`"U.S. Appl. No. 10/624,914, Response filed Apr. 2, 2008 to Non-
`Final Office Action mailed Jan. 2, 2008", 15 pgs.
`"U.S. Appl. No. 10/624,914, Response filed Jun. 14, 2007 to Non-
`Final Office Action mailed Mar. 14, 2007". 11 pgs.
`"U.S. Appl. No. 10/624,914, Response filed Sep. 24, 2007 to Final
`Office Action mailed Aug. 10, 2007", 12 pgs.
`"U.S. Appl. No. 10/624,941, Advisory Action mailed Aug. 29, 2007",
`3 pgs.
`"U.S. Appl. No. 10/624,941, Final Office Action mailed Jun. 11,
`2007", 12 pgs.
`"U.S. Appl. No. 10/624,941, Non-Final Office Action mailed Mar.
`17, 2006", 15 pgs.
`"U.S. Appl. No. 10/624,941, Non-Final Office Action mailed Oct. 3,
`2007", 16 pgs.
`"U.S. Appl. No. 10/624,941, Non-Final Office Action mailed Oct. 17,
`2006", 11 pgs.
`"U.S. Appl. No. 10/624,941, Notice of Allowance mailed Feb. 25,
`2009", 6 pgs.
`"U.S. Appl. No. 10/624,941, Preliminary Amendment mailed Apr.
`26, 2005", 3 pgs.
`
`"U.S. Appl. No. 10/624,941, Preliminary Amendment mailed Aug.
`27, 2003", 3 pgs.
`"U.S. Appl. No. 10/624,941, Response filed Feb. 20, 2007 to Non-
`Final Office Action mailed Oct. 17, 2006", 11 pgs.
`"U.S. Appl. No. 10/624,941, Response filed Aug. 13, 2007 to Final
`Office Action mailed Jun. 11, 2007", 10 pgs.
`"U.S. Appl. No. 10/624,941, Response filed Dec. 14, 2007 to Non-
`Final Office Action mailed Oct. 3, 2007", 11 pgs.
`"U.S. Appl. No. 10/624,948, Non-Final Office Action mailed Mar.
`30, 2007", 5 pgs.
`"U.S. Appl. No. 10/624,948, Notice of Allowance Aug. 8, 2007", 4
`pgs.
`"U.S. Appl. No. 10/624,948, Preliminary Amendment mailed Apr.
`26, 2005", 3 pgs.
`"U.S. Appl. No. 10/624,941, Non-Final Office Action mailed Dec.
`11, 2008", 6 pgs.
`"U.S. Appl. No. 10/624,941, Response filed Jan. 14, 2009 to Non-
`Final Office Action mailed Dec. 11, 2008", 11 pgs.
`"U.S. Appl. No. 10/624,914, Final Office Action mailed Mar. 31,
`2010", 12 Pgs.
`"U.S. Appl. No. 10/624,948, Final Office Action mailed Jun. 1,
`2010", 14 pages.
`"U.S. Appl. No. 10/624,948, Response filed May 10, 2010 to Non
`Final Office Action mailed Feb. 8, 2010", 25 pgs.
`"U.S. Appl. No. 10/624,914, Final Office Action mailed Aug. 18,
`2011", 17 pgs.
`"U.S. Appl. No. 10/624,948, Response Filed Oct. 3, 2011 to Final
`Office ACtion Received May 2, 2011", 24 pgs.
`"U.S. Appl. No. 12/403,839, Response Filed Aug. 31, 2011 to Final
`Office Action Received May 31, 2011", 9 pgs.
`"U.S. Appl. No. 10/624,914, Non Final Office Action mailed Mar. 3,
`2011", 13 pgs.
`"U.S. Appl. No. 10/624,914, Response filed Jun. 3, 2011 to Non Final
`Office Action mailed Mar. 3, 2011", 8 pgs.
`"U.S. Appl. No. 10/624,914, Response Filed Jun. 3, 2011 to Non-
`Final Office Action Received Mar. 3, 2011", 8 pgs.
`"U.S. Appl. No. 10/624,914, Response filed Aug. 30, 2010 to Final
`Office Action mailed Mar. 31, 2010", 8 pgs.
`"U.S. Appl. No. 10/624,948, Final Office Action mailed May 2,
`2011", 18 pgs.
`"U.S. Appl. No. 10/624,948, Non-Final Office Action mailed Oct. 14,
`2010", 3 pgs.
`"U.S. Appl. No. 10/624,948, Response filed Apr. 8, 2011 to Non-
`Final Office Action Received Oct. 14, 2010", 27 pgs.
`"U.S. Appl. No. 10/624,948, Response filed Sep. 1, 2010 to Final
`Office Action mailed Jun. 1, 2010", 26 pgs.
`"U.S. Appl. No. 12/403,839, Final Office Action mailed May 31,
`2011", 40 pgs.
`"U.S. Appl. No. 12/403,839, Non Final Office Action mailed Nov. 18,
`2010", 59 pgs.
`"U.S. Appl. No. 12/403,839, Response filed Mar. 18, 2011 to Non
`Final Office Action mailed Nov. 18, 2010", 8 pgs.
`"Application Serial No. 90/010,939, Ex-Parte Re-examination Office
`Action Mailed Nov. 5, 2010", 16 pgs.
`"U.S. Appl. No. 10/624,948, Preliminary Amendment mailed Aug.
`27, 2003", 3 pgs.
`"U.S. Appl. No. 10/624,948, Response filed May 18, 2007 to Non-
`Final Office Action mailed Mar. 30, 2007", 30 pgs.
`
* cited by examiner
[Sheet 1 of 9, FIG. 1: detection device 10 in an example network environment, coupled between the Internet 12 and users 14a-14e, a network server 16, a sender 18, and a server 20.]
[Sheet 2 of 9, FIG. 2: block diagram of detection device 10 of FIG. 1, showing memory 22, processor 24, input ports 202 and 204, and output port 206.]
Sheet 3 of 9, FIG. 3 (architecture of processor 24):
  302  I/O buffer and logic to interface with internal and external storage
  304  External storage, SRAM/CAM or other memory devices; stores signature and pattern information about viruses, attacks, etc.
  306  Internal storage; stores signatures and patterns that are accessed most frequently
  308  I/O buffer and logic to protocol stack incoming content data stream, for example through a PCI bus
  310  RAM; network traffic content is decrypted and stored here
  312  Register
  314  Scanning logic
  24   Processor
Sheet 4 of 9, FIG. 4 (flow chart of process 400):
  402  Create a signature using one or more predicates
  404  Compile signature(s)
  406  Receive network traffic packets
  408  Process compiled signature(s) and network traffic packets to determine whether network packets match desired content to be detected
  412  Block content to user
  414  Pass content to user
Sheet 5 of 9, FIG. 5 (table of predicates; columns: 502 ID, 504 predicate format, 506 mnemonic, 508 family, 512 return, 510 description):

  ID  Predicate       Mnemonic           Family     Return  Description
  A   A(z)            Ascii              Test       b       Test literal string
  B   B(m)            Bitmask            Test       b       Test using bitmask
  C   C( )            Case               Decision   V       Branch using multiple cases
  D   D(label)        Do                 Iteration  V       Start loop (ends on label)
  E   E(f, b, b...)   Each               Iteration  V       Repeat function with each byte in list
  F   F(n, f)         For                Iteration  V       Repeat function on n buffer bytes
  G   G(label)        Goto               Decision   V       Goto label in sig
  H   H(d)            Heuristic          Test       B       Test d against heuristic flags
  I   I(f, l)         If                 Decision   V       If test f branch else continue
  J   J(size)         Jump               Pointer    V       Jump using buffer value of size
  K   K(reserved)     Keyword            Function   B       Process keyword
  L   L(b)            Literal            Test       B       Test literal
  M   M(name)         Macro              Function   V       Execute macro NAME
  N   N(logic)        Near               Test       B       Test using relative logic
  O   O(n, method)    Order              Test       B       Order (sort) n buffer bytes using method
  P   P(name)         Process/Procedure  Function   V       Execute process name
  Q   Q(logic)        Query              Test       B       Test using ranged logic
  R   R(p)            Rewind             Pointer    V       Reset buffer stream pointer
  S   S(n, k)         Seek               Pointer    B       Reposition buffer stream pointer
  T   T(logic)        Test               Test       B       Test using positional logic
  U   U(z)            Uppercase          Test       b       Test after uppercasing buffer string
  V   V(logic)        Variable           Test       b       Test using set summation
  W   W(c)            Wildcard           Test       b       Simple (one byte) wildcards
  X   X(b)            Xray/Xor           Test       b       Test using xor mask based on b
[Sheet 6 of 9, FIG. 6: block diagram of a detection device 10 in accordance with alternative embodiments, particularly showing the detection device including a compiler; input port 204 and output port 206 are labeled.]
[Sheet 7 of 9, FIG. 7: block diagram of another detection device, including a protocol differentiator and a processor configured for managing network traffic flow.]
[Sheet 8 of 9, FIG. 8: examples of operations performed by components of the detection device of FIG. 7, showing packets A and B, protocol differentiator 704, packet processing module 706, stack 708 ("Copy Packet B to Stack", "Portion of Packet B", "Rest of Packet B"), processor 24, and a result.]
[Sheet 9 of 9, FIG. 9: diagram of a computer hardware system 1200, showing a processor, main memory 1206, ROM 1208, a storage device 1210, and a bus, coupled to a host, a display, an input device, and cursor control.]
CONTENT PATTERN RECOGNITION LANGUAGE PROCESSOR AND METHODS OF USING THE SAME

RELATED APPLICATION DATA

This application claims priority to U.S. Provisional Application Nos. 60/397,147, 60/397,304, 60/397,033, 60/397,302, and 60/397,034, all filed Jul. 19, 2002, the disclosures of which are expressly incorporated by reference herein.

BACKGROUND OF THE INVENTION

1. Field of the Invention

The field of the invention relates to computer systems and computer networks, and more particularly, to systems and methods for detecting content of computer and network traffic.

2. Background of the Invention

The generation and spreading of computer viruses are major problems in computer systems and computer networks. A computer virus is a program that is capable of attaching to other programs or sets of computer instructions, replicating itself, and/or performing unsolicited or malicious actions on a computer system. Viruses may be embedded in email attachments, files downloaded from the Internet, and macros in MS Office files. The damage that can be done by a computer virus may range from mild interference with a program, such as a display of unsolicited messages or graphics, to complete destruction of data on a user's hard drive or server.

To provide protection from viruses, most organizations have installed virus scanning software on computers in their network. However, these organizations may still be vulnerable to a virus attack until every host in their network has received updated anti-virus software. With new attacks reported almost weekly, organizations are constantly exposed to virus attacks, and spend significant resources ensuring that all hosts are constantly updated with new anti-virus information. Furthermore, anti-virus programs that operate at the application level require enormous computing resources, making such anti-virus programs expensive to deploy and manage.

Besides virus attacks, many organizations also face the challenge of dealing with inappropriate content, such as email spam, misuse of networks in the form of browsing or downloading inappropriate content, and use of the network for non-productive tasks. Many organizations are struggling to control access to appropriate content without unduly restricting access to legitimate material and services. Currently, the most popular solution for blocking unwanted web activity is to block access to a list of banned or blacklisted web sites and pages based on their URLs. However, such an approach may be unnecessarily restrictive, preventing access to valid content in web sites that may contain only a limited amount of undesirable material. As with virus scanning, the list of blocked URLs requires constant updating.

Many email spam elimination systems also use blacklists to eliminate unwanted email messages. These systems match incoming email messages against a list of mail servers that have been pre-identified to be spam hosts, and prevent user access of messages from these servers. However, spammers often launch email spam from different hosts every time, making it difficult to maintain a list of spam servers.

Accordingly, improved systems and methods for detecting content of computer and network traffic would be useful.
SUMMARY OF THE INVENTION

In some embodiments of the invention, a device for detecting network traffic content includes a processor configured to receive a signature associated with content desired to be detected, and execute one or more functions based on the signature to determine whether network traffic content matches the content desired to be detected. The signature is defined by one or more predicates.

In other embodiments of the invention, a device for detecting network traffic content includes a processor. The processor is configured to receive one or more signatures, wherein each of the one or more signatures is defined by one or more predicates, and associated with content desired to be detected. Each of the one or more predicates can be compiled into a byte code stream that controls a logic of the processor.

In some embodiments of the invention, a computer readable medium for use to detect network traffic content includes a memory storing one or more signatures, wherein each of the one or more signatures is associated with content desired to be detected. Each of the one or more signatures is defined by one or more predicates, and each of the one or more predicates can be compiled into a byte code stream that controls a logic of a network traffic screening device.

Other aspects and features of the invention will be evident from reading the following detailed description of the preferred embodiments, which are intended to illustrate, not limit, the invention.

BRIEF DESCRIPTION OF THE DRAWINGS

The drawings illustrate the design and utility of preferred embodiments of the present invention, in which similar elements are referred to by common reference numerals. In order to better appreciate how advantages and objects of the present inventions are obtained, a more particular description of the present inventions briefly described above will be rendered by reference to specific embodiments thereof, which are illustrated in the accompanying drawings. Understanding that these drawings depict only typical embodiments of the invention and are not therefore to be considered limiting its scope, the invention will be described and explained with additional specificity and detail through the use of the accompanying drawings.

FIG. 1 illustrates a detection device in accordance with some embodiments of the invention, and an example of a network environment in which the detection device can be operated;

FIG. 2 illustrates a block diagram of the detection device of FIG. 1;

FIG. 3 illustrates an architecture of the processor of the detection device of FIG. 2;

FIG. 4 is a flow chart showing a process for detecting content of network traffic;

FIG. 5 is a table listing examples of predicates that may be used to control a logic of the processor of FIG. 2;

FIG. 6 illustrates a block diagram of a detection device in accordance with alternative embodiments of the invention, particularly showing the detection device including a compiler;

FIG. 7 illustrates a block diagram of another detection device in accordance with alternative embodiments of the invention, particularly showing the detection device including a processor configured for managing network traffic flow;

FIG. 8 shows examples of operations that may be performed by components of the detection device of FIG. 7; and

FIG. 9 is a diagram of a computer hardware system with which embodiments of the present invention can be implemented.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

Various embodiments of the present invention are described hereinafter with reference to the figures. It should
be noted that the figures are not drawn to scale and that elements of similar structures or functions are represented by like reference numerals throughout the figures. It should also be noted that the figures are only intended to facilitate the description of specific embodiments of the invention. They are not intended as an exhaustive description of the invention or as a limitation on the scope of the invention. In addition, an illustrated embodiment need not have all the aspects or advantages of the invention shown. An aspect or an advantage described in conjunction with a particular embodiment of the present invention is not necessarily limited to that embodiment and can be practiced in any other embodiments of the present invention even if not so illustrated.

FIG. 1 illustrates a detection device 10 in accordance with embodiments of the present invention, and an example of a network environment in which detection device 10 can be operated. Detection device 10 is configured to detect a program content, such as a virus, and/or a non-program content, such as a web content, being transmitted from Internet 12 to users 14a-e. For example, a sender 18 connected to Internet 12 may send files containing viruses, worms, or other malicious programs, to one or more of the users 14a-c and server 16 via Internet 12. Viruses may also be copied from a server 20 and transmitted to users 14a-c and network server 16 through Internet 12. Viruses transmitted to network server 16 may also infect users 14d and 14e connected to network server 16. Detection device 10 scans network traffic content transmitted from Internet 12 and prevents undesirable content, such as a virus, a worm, an email spam, and a web page containing undesirable content, from being transmitted to users 14a-e. Besides detecting content, detection device 10 may also modify or re-direct network traffic content such that, for example, a virus may be removed from a network stream, or an HTTP request may be blocked. In some embodiments, detection device 10 may be implemented as a firewall, a component of a firewall, or a component that is configured to be coupled to a firewall.

FIG. 2 shows content detection device 10 of FIG. 1 in further detail. As shown in FIG. 2, detection device 10 includes a memory 22 and a processor 24 coupled to memory 22. Detection device 10 also includes a first input port 202 for inputting data to memory 22, a second input port 204 for receiving network traffic packets from Internet 12 or a network, and an output port 206 coupled to processor 24. Output port 206 is configured for transmitting filtered network traffic packets to user 14. In alternative embodiments, memory 22 can be implemented as a part of processor 24.

Memory 22 is adapted for storing data to be processed by processor 24. Data may be transmitted to memory 22 via input port 202 from a user or an administrator. For example, a user or an administrator can transmit data to memory 22 via a wire, a telephone line, a T1-line, a cable of a cable modem, or other types of transmitter connected to port 202. Data may also be transmitted to memory 22 via an infrared transmitter, in which case, port 202 would include an infrared receiver. In the illustrated embodiments, memory 22 is adapted for storing one or more signatures, each of which is associated with content desired to be detected by detection device 10. The signatures will be described in detail below.

In the illustrated embodiments, the processor 24 includes an application-specific integrated circuit (ASIC), such as a semi-custom ASIC processor or a programmable ASIC processor. ASICs, such as those described in Application-Specific Integrated Circuits by Michael J. S. Smith, Addison-Wesley Pub Co. (1st Edition, June 1997), are well known in the art of circuit design, and therefore will not be described in further detail herein. Processor 24 is configured to receive packets from Internet 12, process packets based on data stored in memory 22, and generate a result based on the processing of the packets. It should be noted that processor 24 is not limited to those described previously, and that processor 24 can also be any of a variety of circuits or devices that are capable of performing the functions described herein. For example, in alternative embodiments, processor 24 can include a general purpose processor, such as a Pentium processor.

FIG. 3 shows an architecture of processor 24 in accordance with some embodiments of the present invention. Processor 24 includes a first I/O buffer and logic 302, an internal storage 306, a second I/O buffer and logic 308, a register 312, and a scanning logic 314. I/O buffer and logic 302 is configured for processing data (e.g., information associated with content desired to be detected) received from an external memory 304 such that data of desirable format can be stored in internal storage 306. I/O buffer and logic 308 is configured for processing decrypted network traffic content received from an external memory 310 (such as a RAM) such that data of desirable format can be stored in register 312. In some embodiments of the invention, one or both of I/O buffer and logics 302 and 308 can also process data generated by scanning logic 314 such that data of desirable format can be transmitted to external storages 304 and 310, respectively. Scanning logic 314 processes network traffic content stored in register 312 based on data stored in internal memory 306, and determines whether network traffic content contains content desired to be detected. In the illustrated embodiments of the invention, I/O buffer and logics 302, 308, and scanning logic 314 are implemented in processor 24. In alternative embodiments, separate processors or components may be used to implement buffer and logics 302 and 308 and scanning logic 314. In addition, internal storage 306 and register 312 can both be implemented using a single memory, such as memory 22. In alternative embodiments, internal storage 306 and register 312 can each be implemented using a separate memory.
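As a rough illustration of the data path just described, and not part of the patent's own disclosure, the following Python sketch models external storage 304, internal storage 306, register 312, and scanning logic 314 as ordinary objects; the class name, the cache policy, and the substring test standing in for predicate evaluation are all assumptions made only for illustration.

    class ScanningPipeline:
        # Models the FIG. 3 data path: external storage 304, internal storage 306,
        # register 312, and scanning logic 314 (hypothetical software analogue).

        def __init__(self, all_signatures, cache_size=16):
            self.external_storage = list(all_signatures)                 # 304: full signature set
            self.internal_storage = self.external_storage[:cache_size]   # 306: most frequently used
            self.register = b""                                          # 312: decrypted traffic content

        def load_content(self, decrypted_bytes):
            # Plays the role of I/O buffer and logic 308: move decrypted network
            # traffic (e.g. from RAM 310) into the register.
            self.register = decrypted_bytes

        def scan(self):
            # Plays the role of scanning logic 314: test register content against the
            # cached signatures first, then the remaining ones in external storage.
            for sig in self.internal_storage:
                if sig in self.register:
                    return sig
            for sig in self.external_storage:
                if sig not in self.internal_storage and sig in self.register:
                    return sig
            return None

    pipeline = ScanningPipeline([b"EICAR", b"BAD_MACRO"])
    pipeline.load_content(b"...decrypted payload...EICAR...")
    hit = pipeline.scan()        # returns b"EICAR" for this example content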
A method 400 for detecting network traffic content using detection device 10 will now be described with reference to FIG. 4. Initially, content pattern recognition language (CPRL) is used to create a signature, which represents a symbolic detection model for certain prescribed content, such as a virus, a worm, a web content, a Trojan agent, an email spam, a packet transmitted by a hacker, etc., desired to be detected (Step 402). Depending on an implementation of the CPRL, in some embodiments of the invention, the signature may be expressed in a form similar to a set of sentences or phrases in predicate logic. The pattern recognition signature so created for a given content desired to be detected is tested for validity, compiled, and interpreted by a set of functions implemented using processor 24. In some embodiments of the invention, the CPRL used is a programming language that supports testing, branching, looping, and/or recursion.
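The following Python sketch shows, under assumed syntax, how steps 402 through 414 of method 400 could fit together. The predicate notation ("L(...)", "U(...)") borrows the FIG. 5 mnemonics, but the grammar, the compile_signature helper, and the block/deliver stand-ins are hypothetical illustrations rather than the patent's actual CPRL implementation.

    def compile_signature(predicates):
        # Step 404: compile each textual predicate into an (opcode, argument) pair,
        # a simple stand-in for the byte code stream interpreted by processor 24.
        return [(p[0], p[2:-1]) for p in predicates]          # "L(MZ)" -> ("L", "MZ")

    def matches(bytecode, packet):
        # Step 408: every compiled predicate must hold for the packet payload.
        for opcode, arg in bytecode:
            if opcode == "L" and arg.encode() not in packet:            # literal test
                return False
            if opcode == "U" and arg.encode() not in packet.upper():    # uppercased test
                return False
        return True

    def block(packet):                                        # step 412 stand-in
        print("blocked")

    def deliver(packet):                                      # step 414 stand-in
        print("passed to user")

    signature = ["L(MZ)", "U(EVIL)"]                          # step 402: signature from predicates
    bytecode = compile_signature(signature)                   # step 404: compile signature(s)
    for packet in (b"MZ...Evil payload...",):                 # step 406: receive network traffic packets
        if matches(bytecode, packet):                         # step 408: process compiled signature(s)
            block(packet)                                     # step 412: block content to user
        else:
            deliver(packet)                                   # step 414: pass content to user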
FIG. 5 is a table showing examples of predicates that can be used to create a signature of content desired to be detected. Column 502 shows identifications of predicates that are the basic roots or components of a CPRL. Although only identifications "A" through "X" are shown, in alternative embodiments, a predicate identification can also include other letters, a number, a combination of letters, a mathematical operator, a logical operator, punctuation, and/or a combination thereof. Column 506 shows mnemonics represented by respective predicates.

Column 504 shows formats in which predicates A-Z are used. For example, predicate "D" has "label" as its argument, and predicate "M" has "name" as its argument. In some
embodiments, the argument of a predicate may include one or a combination of bytes, with each of the bytes having two characters. In alternative embodiments, the argument can also include a number, a letter, a combination of letters, a sentence, a mathematical operator, a logical operator, a punctuation, and/or combination thereof. In other embodiments, a predicate may not require an argument.

In the illustrated embodiments, each predicate of a signature is compiled into a byte stream that controls a logic of processor 24. Column 510 describes functions that are performed by processor 24 based on respective predicates. Appendix A provides exemplary specifications for the predicates illustrated in FIG. 5. It should be understood by those skilled in the art that the functions prescribed by the predicates should not be limited to the examples shown in FIG. 5, and that other functions may also be prescribed to be performed by processor 24 based on other predicates. Each function prescribed by the respective predicate may return a variable, such as a Boolean value, a number, a pointer, a "void", or other types of return value (Column 512).
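As one illustration of what "compiled into a byte stream" could look like in software, the sketch below encodes a single predicate as one opcode byte, one length byte, and the argument bytes. This layout is an assumption made only for illustration; the patent does not specify the byte-level format.

    def encode_predicate(identification, argument=b""):
        # One opcode byte (the predicate identification), one length byte, then the
        # argument bytes. Purely illustrative; not the patent's defined encoding.
        opcode = identification.encode("ascii")
        return opcode + bytes([len(argument)]) + argument

    stream = encode_predicate("L", b"\x4d\x5a")   # literal test for the two bytes 4D 5A
    # stream == b"L\x02MZ"; evaluating this predicate would return a Boolean,
    # matching the "b" return type shown in column 512 for the Literal predicate.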
The predicates may be categorized by the types of function they perform (Column 508). In the illustrated embodiments, CPRL includes five families of predicates, namely, "Test", "Decision", "Iteration", "Function", and "Pointer". A "test" type predicate provides instruction that causes processor 24 to test one or more variables using a prescribed operation. A "decision" type predicate provides instruction that causes processor 24 to decide which operation to perform based on a prescribed condition. An "iteration" type predicate provides instruction that causes processor 24 to repeat a prescribed function. A "function" type predicate provides instruction that causes the processor 24 to execute a prescribed function. A "pointer" type predicate provides instruction that causes processor 24 to position or reset a buffer stream pointer. Although five types of predicates are shown, in alternative embodiments, CPRL may have other different types of predicates.
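A minimal interpreter loop, sketched below in Python, shows how the five families could drive the scanning logic over a content buffer; the instruction tuples, family names, and example operands are illustrative assumptions rather than the patent's encoding.

    def run(instructions, buffer):
        pc, ptr = 0, 0                          # program counter and buffer stream pointer
        while pc < len(instructions):
            family, op = instructions[pc]
            if family == "test":                # e.g. L(b): literal test at the pointer
                if not buffer.startswith(op, ptr):
                    return False
            elif family == "pointer":           # e.g. S(n, k): reposition the stream pointer
                ptr += op
            elif family == "decision":          # e.g. G(label): continue at another instruction
                pc = op
                continue
            elif family == "iteration":         # e.g. F(n, f): repeat a function on the buffer
                count, func = op
                for _ in range(count):
                    func(buffer)
            elif family == "function":          # e.g. P(name): execute a prescribed function
                op(buffer)
            pc += 1
        return True

    ok = run([("test", b"MZ"), ("pointer", 2), ("test", b"\x90")], b"MZ\x90\x90...")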
Like predicate logic, the signature codified using CPRL is treated as a formula made up of logical elements and is rule-based. Accordingly, each signature must meet these rules in order to form a well-formed formula (wff). Codifying a signature using a predicate-based system is advantageous in that the codified signature is much more readable and intuitive than memorizing and using an extensive collection of pattern recognition directives in a form of hexadecimal code instructions embedded in a signature stream. In some embodiments, the predicates can be formalized such that they are similar to inline macros, thereby allowing a user to easily create signatures without having to learn a completely new programming language.

Unlike traditional virus signatures, which are used to detect a virus using byte-by-byte comparison, a signature created using CPRL represents one or more instructions that control an operation of a processor being used to detect content. For example, a signature created using CPRL may provide instructions for calling functions, pointing to a different signature, calling an interpreter of the signature recursively, responding to returned information, and/or performing other functions. As such, CPRL is a true pattern recognition language, and is far more powerful than traditional antivirus signatures. It should be understood by those skilled in the art that the scope of the invention is not limited to the examples of CPRL described previously, and that other languages or symbolic models may also be used to codify signatures.
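To illustrate one of these capabilities, namely a signature pointing to a different signature and calling the interpreter recursively, the short hypothetical sketch below lets one signature invoke another by name, in the spirit of the M(name) macro predicate of FIG. 5. The dictionary layout and all names are assumptions for illustration only.

    SIGNATURES = {
        "pe_header": [("test", b"MZ")],
        "suspect":   [("macro", "pe_header"), ("test", b"EVIL")],
    }

    def evaluate(name, buffer):
        for kind, arg in SIGNATURES[name]:
            if kind == "macro":                      # point to a different signature
                if not evaluate(arg, buffer):        # call the interpreter recursively
                    return False
            elif kind == "test":
                if arg not in buffer:
                    return False
        return True

    print(evaluate("suspect", b"MZ...EVIL..."))      # True for this example buffer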
The signature(s) may be codified by one or more service providers. For example, when a new virus is discovered, a service provider may codify the corresponding signature and send the signature to the detection device 10 as an update. Alternatively, or additionally, one or more users may also codify the signature if a new virus is discovered. The codifying of the signature(s) may be performed on a computer platform. For example, a suitable editor may be used for writing and/or editing the signature(s). In some embodiments, an integrated development environment (IDE) may be employed for writing and/or editing the signature(s). A graphical interface may also b
