UNITED STATES PATENT AND TRADEMARK OFFICE
____________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
____________

SYMANTEC CORPORATION,
Petitioner,

v.

INTELLECTUAL VENTURES I LLC,
Patent Owner.
____________

Case IPR2016-01433
Patent 7,757,298 B2
____________

Record of Oral Hearing
Held: October 12, 2017
____________

Before THOMAS L. GIANNETTI, HYUN J. JUNG, and GREGG I. ANDERSON, Administrative Patent Judges.
APPEARANCES:

ON BEHALF OF THE PETITIONER:

JOSEPH J. RICHETTI, ESQ.
ALEXANDER WALDEN, ESQ.
Bryan Cave LLP
1290 Avenue of the Americas
New York, New York 10104-3300

ON BEHALF OF PATENT OWNER:

JOHN KING, ESQ.
TED M. CANNON, ESQ.
Knobbe Martens
2040 Main Street, 14th Floor
Irvine, California 92614

The above-entitled matter came on for hearing on Thursday, October 12, 2017, commencing at 1:00 p.m., at the U.S. Patent and Trademark Office, 600 Dulany Street, Alexandria, Virginia.
P R O C E E D I N G S
- - - - -
JUDGE JUNG: Good afternoon. This is the final hearing for case IPR2016-01433 between the Petitioner, Symantec Corporation, and Patent Owner, Intellectual Ventures I LLC.
Starting with counsel for Petitioner, followed by counsel for Patent Owner, please state your names for the record.
MR. RICHETTI: Good afternoon, Your Honor. Joseph Richetti from Bryan Cave for Symantec. With me is my colleague Alexander Walden, also from Bryan Cave.
JUDGE JUNG: Thank you.
MR. KING: Good morning, Your Honor. My name is John King. I am lead counsel for Patent Owner Intellectual Ventures. With me at counsel table is Ted Cannon, back-up counsel, and I would also like to introduce Russ Rigby, a representative of the Patent Owner.
JUDGE JUNG: Thank you, welcome.
As stated in the trial hearing order, each party has 30 minutes of total argument time. The panel has received your joint lists of objections to the demonstratives, and the panel will defer ruling on the objections until after the hearing.
Also, when presenting your arguments, please stay close to the microphone and state the number of the slide you're about to discuss, so that Judge Anderson, who is joining us remotely, can follow along.
With all that said, counsel for Petitioner, you may proceed when you're ready.
JUDGE GIANNETTI: Mr. Richetti, before you get started, I want to ask you something. Did you forget something?
MR. RICHETTI: Oh, my tie?
JUDGE GIANNETTI: Yes.
MR. RICHETTI: Actually, it's a funny story. I actually had a little accident at breakfast. So, I wasn't going to bring that up, but --
JUDGE GIANNETTI: All right. Well, we understand that. I hope you're not suggesting that it's any sign of disrespect.
MR. RICHETTI: No. It's absolutely not meant to be disrespectful.
JUDGE GIANNETTI: Normally we require ties, but under the circumstances, we understand.
MR. RICHETTI: Understood, Your Honor. Appreciate that.
JUDGE JUNG: Mr. Richetti, would you like to reserve time for rebuttal?
MR. RICHETTI: Yes, Your Honor, so the '298 patent -- oh, we would like to reserve 10 minutes of rebuttal time.
JUDGE JUNG: You may proceed.
MR. RICHETTI: Thank you. The '298 patent, Your Honors, is pretty straightforward and simple. The patent is directed to scanning files on a computer in order to look for files that are known to be bad. There's a bunch of examples given about the types of techniques and threshold criteria. The techniques disclosed in the patent involve two steps: The first step is trying to identify a suspect file. So, you know, as the patent explains, this can involve scanning files in a directory or on a storage
device and creating a list of suspect files. And then you further analyze these suspect or selected files by creating an ID value. The patent explains it's a checksum that's used, and then you compare it against a pre-existing list of known files that are known to be bad. And if you get a match, then you can characterize it as either a bad file or, you know, an unauthorized file, as the claim would say.
If we could turn to the claim, slide 3. Sorry, we just had a little technical difficulty, but on slide 3 -- I'll just continue to go -- the independent claim follows the same structure. It has a preamble that talks about a computer-implemented method for identifying and characterizing. And what we see is, you know, the first step is obviously under the control of one or more computers, and then it sets out the different steps that the computer is going to be doing.
And the first one is the selecting. And the selecting step breaks down into looking for three different types of, you know, threshold criteria. And of the three criteria, the most important for the IPR petition are two of them, the second and the third. We're looking for files where there's a mismatch based on content and file extension, or we're looking for files that have data that has been appended beyond the end of the file marker.
So, that's the selecting step, and at the end of that process, you're going to have a list of suspect files. Now you're going into more the characterizing part of the claim, and you're going to generate an ID value, and the patent talks about a checksum as being the way to do that. Then you're going to compute,
comparing the generated identification value for the suspect file, or the selected file, against the known list of unauthorized files. And at that point, if there's a match, then the file is characterized as unauthorized.
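The generate-and-compare step counsel describes -- create an ID value for each suspect file and characterize it as unauthorized on a match against the known list -- can be sketched roughly as follows. This is an illustrative sketch only, not code from the '298 patent or the record; the choice of MD5 as the checksum and the contents of the known list are assumptions made purely for demonstration.

```python
# Illustrative sketch (not from the '298 patent): generate an ID value
# (a checksum) for each suspect file and characterize a file as
# unauthorized if its ID matches a pre-existing list of known-bad IDs.
import hashlib

# Hypothetical known list: ID values of files already deemed unauthorized.
# (This entry is the MD5 of b"hello", used purely for illustration.)
KNOWN_UNAUTHORIZED_IDS = {"5d41402abc4b2a76b9719d911017c592"}

def id_value(data: bytes) -> str:
    """Generate an ID value for file contents (MD5 chosen as an example)."""
    return hashlib.md5(data).hexdigest()

def characterize(suspect_files: dict[str, bytes]) -> list[str]:
    """Return the names of suspect files whose ID value matches the list."""
    return [name for name, data in suspect_files.items()
            if id_value(data) in KNOWN_UNAUTHORIZED_IDS]
```

On a match, the file would be characterized as unauthorized and everything else passes through, which mirrors the claim's final clause as counsel reads it.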
So, when it comes to claim construction, there are really only two terms in dispute: unauthorized file and selecting. So, starting with unauthorized file, you know, we agree with the Board, it really doesn't need construction. The claim language itself makes clear what an unauthorized file is. I think the last clause of the claim says characterizing the file as an unauthorized file if the ID value matches one of the plurality of ID values associated with the unauthorized files.
So, it's pretty simple. Once you compare the selected file against the known list of bad files, or unauthorized files, if at that point you have a match, then it's characterized. Whatever files are on that list, that's what the claim requires to be unauthorized.
Now, you know, the specification uses a whole host of terms as it goes through. I mean, it's not really limited. I think the interesting part is that unauthorized is not used when describing the alleged invention; instead, they talk about acceptable or unacceptable, and the list goes on: errant, undesirable, improper.
I mean, there's a bunch of words that are used either interchangeably or -- to the extent there is an embodiment, I think illicit would be an embodiment of an errant file -- but I think what's most important is that what is consistent is the structure,
that there's going to be a list, a known list -- whether it's of bad files or unauthorized files -- and that's going to be compared to the suspect file or the selected file.
So, what the Patent Owner points to is one sentence in the description of related art that talks about certain types of content: offensive, illegal, unauthorized or otherwise undesirable. We submit they make much too much of this one sentence in saying that these are distinct categories. These are the types of content that you're looking to remove from the computer system.
And there seems to be maybe a confusion of content versus file type. The sentence lists the different types of content -- and there are others that the patent describes -- that you want to be able to get rid of, but as to the file types and how the system works, it's all based off TXT files or image files or, you know, any of these things. This content can be in any and all of these types of files.
So I think what's important is -- I mean, the patent spec definitely does not provide any clear delineation, but what it does do is describe certain content. So what we see in their construction is a negative limitation being added: they say it can't be advertisements.
Of course it could be advertisements. An advertisement is a text or an image file. If those files have data appended to the end, of course that would be something that the program would, you know, identify as a selected file.
Similarly, even if the file is an advertisement, if it's an image or a text file and the content doesn't match the extension, then it's going to get flagged, and it's going to be compared against the list of known bad files. So it's clear they're trying to inject limitations into the claim for, you know, reasons that become clear as you see their papers, to try to get around the prior art.
The other thing they try to add in is a system administrator. Now, what's interesting here is this kind of comes out of the blue. And, in fact, in their argument on selecting, they say everything is controlled by a computer. But in their unauthorized definition, they're injecting a person and adding in the system administrator. Clearly improper and unnecessary.
So, you know, our point is, we submit that no construction is needed because it's readily apparent from the claim, but to the extent the Board would like to give a construction, I think it's very clear that there are many different types of files that would fall under kind of the "known list" of bad files.
So we'll just move to slide 9. So, the other term, Your Honors, is selecting. This is a term that's well understood and plain on its face. The experts agree that it's well known in the art. We believe the claim language makes very clear -- there are three alternatives where it says what you're selecting and what it's based on: the criteria, the threshold criteria.
It appears that the Patent Owner doesn't disagree with this. At one point in their response, they say the Board should construe selecting
a file from a plurality of files to mean selecting a file from a plurality of files. So, to us that means no construction is necessary.
They do go on, of course, to try to add in some additional limitations, and to try to make distinctions about identifying versus selecting. We believe that's really just a red herring. To us it's very clear the files that are selected are the files that meet one of the three selection criteria as set forth in the claim, and therefore we submit no construction is required on that as well.
JUDGE JUNG: Mr. Richetti, just to clarify my understanding of your reply, it seems that you agree with many points that the Patent Owner argues in its response regarding selecting. Is that correct? Is that a fair reading of your reply? For example, you agree that it's more than just identifying a file for a list.
MR. RICHETTI: I think there may be some lack of clarity there. I mean, what we agree is that identifying a directory or a list of files to be analyzed, that's not the selecting. The selecting is when you pull out the ones that are suspect, that you're going to give additional scrutiny to.
So how we understood their argument was they were saying, you know, basically that identifying just a broad group of files that need to be analyzed in the first instance, that's not selecting, and we would agree. It's when you go through it and they meet the three criteria -- and it's an "or," so it's any one of the three criteria -- that's the selection process.
JUDGE JUNG: Okay.
MR. RICHETTI: If there are no other questions, I would like to turn it over to my colleague, Alexander Walden, to go through the actual prior art grounds, if that's okay.
JUDGE JUNG: I have no questions.
MR. WALDEN: Good afternoon. So, just to give a framework to the grounds that were instituted: there are two distinct sets of grounds that were instituted. The first set of grounds is based on the DeSouza and Hoffman references, and the second set of grounds is based on two different references, Hypponen and Johnson.
JUDGE JUNG: Slide 13. Is that correct?
MR. WALDEN: Sorry, yes, slide 13.
JUDGE GIANNETTI: Counsel, I think it's important for you to point out what slides you're on for the remote judge here.
MR. WALDEN: I will, I apologize. So, these are separate and distinct grounds that meet all of the independent claims, as well as some extra grounds in the third reference for a handful of the challenged dependent claims.
Given the limited time frame here, I'm just going to focus on the specific limitations or arguments that were raised by the Patent Owner.
If we could go to slide 14. So the first set of grounds is, again, based on these two references, DeSouza and Hoffman. DeSouza, like the '298 patent, is generally directed to looking through a computer, through computer storage, for files that are illicit or objectionable or offensive.
The examples it gives are things like pornographic files, text files with objectionable language, things like that.
And in particular, what DeSouza teaches is that one technique it uses to identify the suspect files in the beginning is to look for files that are mismatched based on their name. So, in other words, the extension or suffix doesn't match the actual content in the file. And that is one way in which DeSouza picks out files from a storage device. And then it designates them as suspect -- suspicious or questionable, yellow, red, things like that.
If we can go to slide 15 -- I'm sorry, 16. Slide 16, Hoffman. You know, similarly, Hoffman is basically directed to looking at Internet content -- files and content downloaded to a computer from the Internet -- and again, looking for things to block or filter out: images that the user doesn't authorize, or images that are just bad. And specifically, what it describes is an automated process for analyzing images by using a checksum -- creating a checksum for the image, comparing it to a list of checksums that are associated with bad images, and if you get a match, you can block or kill that image.
And so this set of grounds is based on the combination of DeSouza and Hoffman, using Hoffman's checksum-comparison technique to automatically analyze the graphics files in DeSouza. You know, and one big motivation for doing that, obviously, would be to cut down on and avoid the extremely time-consuming process of manually reviewing the images in DeSouza.
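The CRC-comparison technique counsel attributes to Hoffman -- compute a CRC for a downloaded image and compare it against a list of signatures for known bad images -- might look roughly like the sketch below. The signature list and function names are hypothetical; the record does not specify Hoffman's actual CRC parameters.

```python
# Hypothetical sketch of a Hoffman-style automated image filter:
# compute a CRC for the image bytes and compare it against a list of
# signatures of known bad images; on a match, block ("kill") the image.
import zlib

# Assumed signature list, seeded here with an illustrative entry.
BAD_IMAGE_SIGNATURES = {zlib.crc32(b"example-bad-image-bytes")}

def should_block(image_bytes: bytes) -> bool:
    """Block the image if its CRC matches a known bad-image signature."""
    return zlib.crc32(image_bytes) in BAD_IMAGE_SIGNATURES
```

The point of the automation argument is visible in the sketch: the comparison is a constant-time set lookup per image, replacing manual review.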
If we could go to slide 23. So, there are basically three limitations in the independent claims that the Patent Owner argues are missing. And all of these, I would say, miss the mark entirely. The first one is that DeSouza doesn't teach selecting the file based on its mismatch criterion, and that is clearly in DeSouza. I mean, it goes into great detail about how it looks at the suffix and compares it to the content.
The nuance there appears to be that they're arguing it's not a computer that's selecting the file, but a user or someone. Again, this is missing the combination we've proposed, right? The combination we've proposed is that DeSouza is identifying specific files on the hard drives that have this criterion, and then you would be applying Hoffman's technique to those.
The second argument they raise -- and if we go to slide 26 -- is that Hoffman doesn't disclose the comparing step in its checksum analysis. And for this one, to be frank, I believe they may have just missed what we cited and described in the petition, because as you can see on slide 26, Hoffman explicitly states that the calculated CRC value, which is the checksum, is then compared against a list of image signatures or IDs, and so I'm not quite sure, you know, what that argument is.
If we could go to slide 27, the last thing they say is missing from this set of grounds is that Hoffman doesn't teach the characterizing, or, as it's said in one of the other independent
claims, categorizing the file as an unauthorized file, you know, if it matches something in the list. And again, I would say that Hoffman teaches this. It explains that it compares against a list of bad images, and if you get a match against a bad image, you can then target or kill that image, right? So, in other words, the image that's matched is categorized or characterized as a bad image.
Beyond that, they also argue that Hoffman is the only thing we relied upon, and that is not correct. The petition explicitly lays out that it's the combination of the two, and DeSouza has a lot of disclosure on categorizing and characterizing files.
If we could just go to slide 29, the other set of grounds is Hypponen and Johnson. For this one, Hypponen teaches almost everything in the claim except for the specific selection criteria. What Hypponen is focused on in its preferred embodiment is macro viruses -- looking for viruses in macros, such as Microsoft Word macros, right? To do that, it selects files that have macros, and it does this entire checksum process, which, again, is generating a checksum, comparing it against a database of known virus checksums, and then, you know, removing the file or telling the user that you've got a virus-infected file.
What it doesn't have is one of those selection criteria. If we could go to 20 -- I'm sorry, 30. Yeah, slide 30. So, what Johnson has is the third selection criterion, which is that if you find a file that has data beyond the end-of-data or end-of-file marker, that's one of the criteria in the claims as being something that might be indicative of a
suspicious file, right -- that someone has hidden something at the end of the file.
And so this ground is based on that combination: basically using Hypponen's system and its techniques to analyze the files that Johnson identifies and selects as having the hidden data at the end.
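The third selection criterion counsel attributes to Johnson -- flagging a file that carries data past its end-of-data marker -- can be illustrated with a minimal sketch. JPEG's end-of-image marker (FF D9) is used here as an assumed example format; the record does not tie Johnson to any particular file type.

```python
# Minimal sketch of an end-of-data selection criterion: flag a file
# whose bytes continue past its end-of-data marker. JPEG's end-of-image
# marker (FF D9) is an assumed example format for illustration only.
JPEG_EOI = b"\xff\xd9"  # JPEG end-of-image marker

def has_trailing_data(jpeg_bytes: bytes) -> bool:
    """Select the file if any bytes follow the last end-of-image marker."""
    end = jpeg_bytes.rfind(JPEG_EOI)
    return end != -1 and end + len(JPEG_EOI) < len(jpeg_bytes)
```

A file flagged this way would then feed into the checksum comparison Hypponen supplies, which is the combination counsel proposes.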
JUDGE GIANNETTI: Counsel, I have to say I find your argument for combining these two references somewhat less than compelling. Maybe you could help me on that. Why would one look to steganography and virus detection to meet these claims?
MR. WALDEN: Well, I appreciate the question. Excellent question. So, steganography -- I think maybe what you're getting at -- is kind of the opposite side of the coin in a sense, where what it's first teaching you is a technique by which you could put something at the end of a file, right, and hide it -- not necessarily hide data, but hide it from an encryption standpoint, right, when you're sending something and you don't want someone else to see it.
But what Johnson does go on to talk about is the flip side of that coin, which is: how do I look for files that have this hidden data at the end, and how do I extract that data if I want to do something with it? And so the combination is really just, again, that Hypponen has basically everything except the selection criteria, and effectively what we're saying is that this criterion was not something that was new in the patent. I think even the '298 patent itself in the background explains that this was a common
way to hide data -- to hide files and content that you don't want to be found and, you know, that is illicit, that is offensive and unauthorized. This was a common approach to do that.
And so, this is just effectively showing that here is not only the disclosure of that, but here is source code that can even go through and find the files and extract that data, and that that is something that you would have wanted to look at using Hypponen's techniques because, as set forth in the petition, there are a lot of references that make clear that there were known at least to be viruses in these types of files, at the end of files.
JUDGE GIANNETTI: But why would a person of ordinary skill look to a virus protection patent to solve a problem which is a somewhat different problem -- why would one even look at virus detection if one were a person of ordinary skill?
MR. WALDEN: I'm sorry, I don't --
JUDGE GIANNETTI: Well, virus protection is a separate field, right?
MR. WALDEN: Right, so Hypponen is virus protection.
JUDGE GIANNETTI: Hypponen is virus protection, but why would a person of ordinary skill faced with the problem of trying to filter out objectionable content necessarily look at virus control and virus protection as opposed to some other field?
MR. WALDEN: So, I guess starting from the perspective of someone beginning with Hypponen, right, which is all about virus detection, I guess our proposed combination is really
just that, based on Hypponen -- Hypponen gives an explicit reason; it says at the end that this system can be used to detect all kinds of other viruses, right, not just macro viruses. And what we're saying is that this was another way that people would hide viruses in files, a well-known way, even going to the '298 patent: to stick it in the file past the end of this marker.
And so it's just a matter of how do I select the files, right, in Hypponen? Do I select them because I found a macro in the file, or do I select them because of some other kind of criteria?
JUDGE GIANNETTI: Is there any mention in the patent of virus protection?
MR. WALDEN: I'm sorry?
JUDGE GIANNETTI: In the Shuster patent, '298.
MR. WALDEN: Yeah, the '298 patent, in the background. Is that what you're referring to?
JUDGE GIANNETTI: Yes, it didn't seem to relate to virus protection. You were talking about objectionable files like pornography or other things that you might want to filter out.
MR. WALDEN: All right, so I think if I'm understanding your question, you're asking does the '298 patent actually --
JUDGE GIANNETTI: Right, it's in a different field, isn't it?
MR. WALDEN: I wouldn't say so. And actually, in the file history, they initially included a limitation in the claims that said that the files were substantially free of viruses, and then they later in prosecution removed that negative limitation, you know, thereby making
clear that these claims cover viruses as a type of illegal or objectionable file. Or unauthorized file.
JUDGE ANDERSON: Yes, if Judge Giannetti is done, I want to follow up on that. The Patent Owner's expert notes that macro files are particularly susceptible to virus infection, and that other files wouldn't necessarily be scanned, because they're not particularly susceptible to viruses. So, again, there seems to be a disconnect. How do you respond to the evidence of the expert, Mr. Goldschlag?
MR. WALDEN: So -- and maybe this will help a little, maybe we can just go to slide 45 so I can point to that as well. But, you know, respectfully, first and foremost, whether or not a particular type of file is particularly susceptible to viruses, you know, I don't think is quite the right inquiry.
You know, it's clear that this was a known way in which virus writers hid viruses in files, right? And even the background of the '298 patent, part of which is shown on slide 45, explains, you know, a known technique for hiding files where you basically append parts of the file onto the end. Again, it's not talking about viruses, but it's saying that this was a known technique, right?
And then, as we say in the petition, we have two or three other references that explicitly describe looking for viruses after an end-of-file or end-of-data marker. And if you can just bear with me for one second.
`
`So, if we look at slide 32, and this is just one example. So,
`slide 32, if we look at Drake, one of the points of Drake is describing the
`need to repass the end of file of a disk copy of new.exe to ensure that no
`extension that's viral occurred.
`And again, there's a couple of other examples, but there are
`many examples of the fact that this was a known way in which viruses
`would be hidden into different files on a computer system. And so what
`we're saying is that as Hypponen towards the end explicitly states that --
`you know, suggests using its techniques to look for other types of viruses,
`this is just one other type of virus, and instead of looking for files that
`have macros in them, you look for files that have data past the end of the
`data marker.
`JUDGE JUNG: Thank you, Mr. Walden. You have about five
`minutes remaining for your rebuttal time.
`MR. WALDEN: Thanks.
`JUDGE JUNG: You may proceed.
`MR. KING: Good afternoon, Your Honors. I would like to
`turn to slide 2. Slide 2 we see the six instituted grounds of obviousness.
`The first three grounds cite DeSouza and Hoffman, as well as the primary
`references, and then Martins and Farber are added for certain dependent
`claims.
`
`The next three grounds site Hypponen and Johnson as the
`primary references, with Farber and Nachenberg added for certain
`dependent claims. Now, the parties did not separately argue Martins,
`Farber or Nachenberg, and so I'm just going to collectively refer to the
first three grounds simply as DeSouza and Hoffman, and the second three grounds simply as Hypponen and Johnson.
Let's turn to slide 3. On slide 3, I have highlighted the actual claim. We have presented a number of different arguments in our Patent Owner response, but there's limited time here -- only 30 minutes -- and I have to show that both combinations don't invalidate this claim.
So, I'm going to focus really on one argument that's kind of a common theme between both of them. If time permits, I hope to get to a few other arguments; we'll see, and if not, we'll rest on the briefs. I think they're well briefed.
So, in claim 1, we have this selecting step. We're going to start with "under control of one or more computer systems." We then have selecting a file from a plurality of files. And then it goes on to have these three alternatives. The first alternative is selecting a file based on the size of the file. This one is not in dispute; I'm not going to argue it.
The second one is selecting a file based on whether content of the file matches a file type, et cetera. And for this, it's the DeSouza and Hoffman combination. And, finally, we have the third alternative, selecting a file based on whether the file comprises data beyond an end of data marker. For this, it will be the Hypponen and Johnson combination.
So, let's turn to slide 4. Slide 4 shows the amendments made during prosecution that ultimately led to the allowance of the claim. So, why is this selecting step with its alternatives important? Now, of course, one wants to remove the unauthorized files, but there's a second reason.
Unauthorized files consume memory, and they make the system less efficient, but processing every file to find an unauthorized file also consumes a great deal of processing power.
So, the selecting step is like a funnel. It's going to take a plurality of files, a large number of files, and it's going to eventually select those that need to be further processed to see if they're unauthorized. These three alternatives were what the inventors used to select and narrow down the number of files to actually process.
Let's turn to slide 5. On slide 5, the reason I'm focusing on the highlighted selecting step is that it's common to all grounds. All the independent claims have selecting a file, and it's a long limitation, right? It goes on for many lines. I'm just referring to it in shorthand as selecting a file. We'll get into the details of it in a minute.
But it's in every single claim, and all these alternatives are in the claim, so I'm just going to refer to it generally as selecting a file.
Let's move to slide 6. At the beginning, Petitioner argued that there isn't a great deal of dispute over the meaning of selecting. And I agree. I agree. If we look here -- this is from their reply -- there is no dispute that merely creating a list of files by itself does not necessarily select a file. We agree with that. We think Symantec has conceded that a listing of a file is not selecting files, and we think the Board, consistent with Symantec's admission, should find that to be so.
Let's move to slide 7. Same slide, but I just highlight a different part. The very next sentence says,