`
`Exhibit 30
`
`
`
`Case 1:14-cv-02396-PGG-SN Document 239-12 Filed 11/12/20 Page 2 of 37
`
`9/4/2019
`
`Network-1 Technologies, v. Google LLC and Youtube LLC
`Confidential - Attorneys' Eyes Only
`
`Erling Wold 30(b)(6)
`
`Page 1
`
` UNITED STATES DISTRICT COURT
` SOUTHERN DISTRICT OF NEW YORK
`___________________________________
` )
`NETWORK-1 TECHNOLOGIES, INC., )
` )
` Plaintiff, )
` )
`vs. ) No. 14 Civ. 2396
` ) 14 Civ. 9558
` ) (PGG)
`GOOGLE LLC and YOUTUBE, LLC, )
` )
` Defendants. )
`___________________________________)
`
` CONFIDENTIAL ATTORNEYS' EYES ONLY
` VIDEOTAPED DEPOSITION OF AUDIBLE MAGIC 30(b)(6)
` ERLING WOLD
` September 4, 2019 at 10:01 a.m.
` Three Embarcadero Center, 26th Floor
` San Francisco, California
`
`REPORTED BY: LANA L. LOPER,
` RMR, CRR, CCP, CME, CLR, CSR No. 9667
`____________________________________________________
` DIGITAL EVIDENCE GROUP
` 1730 M Street, NW, Suite 812
` Washington, D.C. 20036
` (202) 232-0646
`
`www.DigitalEvidenceGroup.com
`
Digital Evidence Group Copyright 2019
`
`202-232-0646
`
`
`
`
` A Whatever you want, it's fine; casual.
` Q Have you ever been deposed before, Mr. Wold?
` A I have.
` Q So just a brief refresher, then.
` I will be asking you some questions. If
`there's anything that I can ask better, or if my
`question is confusing, just let me know and I'll ask you
`a better question.
` A Okay.
` Q And I'll try to take a break about every hour,
`but if you would like a break sooner or you would like
`to keep going, let me know.
` A Okay.
` Q We'll go -- we'll go with that.
` So let's -- let's get started then.
` Where did you go to school, Mr. Wold?
` A You're not talking about way back, but college,
`Cal Tech in Pasadena; and then UC Berkeley for graduate
`school.
` Q And when did you graduate from college?
` A College, graduated in '78.
` Q Okay.
`
`Page 10
`
     A    And graduate school in '87.
` Q What did you study, then, at college?
` A I was initially a math major. And then I went
`into electrical engineering. That was what my degree
`was in. I also studied music at Occidental at the same
`time, actually.
` And then at UC Berkeley, I was in the EECS
`department. My advisor was a CS person. And then I
`also studied music there as well.
` Q What was the EECS department?
` A Electrical engineering and computer science.
` Q And did you take any classes on content
`recognition, either in undergrad or your graduate
`studies?
` A No.
` Q Do you have any papers related to content
`recognition?
` A Yes.
` Q And how many papers do you have on that topic?
` A That's a good question.
` I'm a co-author on, I think, three at least,
`from around -- from the mid-'90s, on audio similarity
`Page 11
`
`1
`2
`3
`4
`5
`6
`7
`8
`9
`10
`11
`12
`13
`14
`15
`16
`17
`18
`19
`20
`21
`22
`
`1
`2
`3
`4
`5
`6
`7
`8
`9
`10
`11
`12
`13
`14
`15
`16
`17
`18
`19
`20
`21
`22
`
`www.DigitalEvidenceGroup.com
`
`Digital Evidence Group C'rt 2019
`
`and so on.
` Q So you said about three papers on audio
`similarity?
` A Yeah. I'm trying to think if there were any
`others.
` There were at least three; maybe a book chapter
`or something like that as well. I -- I don't recall
`exactly.
` Q What about patents; do you have any patents on
`the field of content recognition?
` A Yes. So -- sorry, I just hit that.
` There was a patent when I was working for this
`consulting group called Muscle Fish. There was a group
`of four of us. That was around that same time. It was
`initially filed, I think, in '96 or something like that,
`on this general audio similarity, so feature extraction
`and -- and comparison of audio.
` Later, after we were acquired by Audible Magic,
`Audible Magic has had many patents, I mean, in this area
`and all the systems issues and so on in that area.
` Q And on those later patents, are you listed as
`an inventor on any of them?
`
`Page 12
`
` A I'm listed as an inventor on a number of them,
`yes.
` Q And they all generally relate to content
`recognition, you say?
` A Yes.
` Q Where do you currently work?
` A I'm at Audible Magic, still.
` Q What is your position at Audible Magic?
` A I think they just changed my title to executive
`scientist --
` Q Sounds fancy.
` A -- whatever that is.
` Q When did you begin working with the Audible
`Magic company in any capacity?
` A Well, so in '99 is when they first -- so we
`were a consulting group. And in '99, mid-'99, they
`became a client of ours.
` Q Okay.
` A And then -- yeah, that's it.
` Q And while consulting for Audible Magic, was
`there ever a time that you came -- that you worked on a
`project called Clango; and for the record that's
`Page 13
`
`C-l-a-n-g-o?
` A Yes.
` Q And in general terms, what was Clango?
` A Clango was intended for people to be listening
`to, say, an Internet radio station. And when they were
`listening to an Internet radio station, they can run
`this other operation called Clango on their desktop.
` And if they heard a tune that they liked, they
`could press a button, I think, and they would identify
`that tune that was playing and then allow you to, you
`know, purchase it, for example, if you were interested.
` It showed you the metadata, and then it gave
`you a URL to an e-commerce site.
` Q Is it fair to say that Clango identified songs
`by their content?
` A Yes.
` Q Why try to identify songs by -- by their
`content?
` A Well, at the time there wasn't -- there wasn't
`always metadata. So if -- if someone was just listening
`to a radio station on the Internet, it might just be a
`stream from a terrestrial radio station, and so it
`Page 14
`
`didn't tell you what the song titles were.
` You couldn't necessarily buy that if you were
`interested, or even find out who did it if you were
`interested.
` Q Why not try to include some sort of mark, like
`a watermark, and search for those?
` MR. LEDAHL: Vague and ambiguous.
`BY MR. DANG:
` Q Okay.
` A That would be a fine thing, if you could
`convince the radio stations to do that.
` Q Did Clango have any competitors at that time,
`when you started working on it, say, in the -- or I
`don't believe you actually told me when you started
`working on it.
` When did you start working on Clango?
` A So around that -- well, that was essentially
`the project.
` I -- I'm not sure when it was exactly called
`Clango. I think they went through some other, you know,
`trade names. But it would have been in the late -- you
`know, the second half of 1999.
`
`Page 15
`
`1
`2
`3
`4
`5
`6
`7
`8
`9
`10
`11
`12
`13
`14
`15
`16
`17
`18
`19
`20
`21
`22
`
`1
`2
`3
`4
`5
`6
`7
`8
`9
`10
`11
`12
`13
`14
`15
`16
`17
`18
`19
`20
`21
`22
`
`www.DigitalEvidenceGroup.com
`
`Digital Evidence Group C'rt 2019
`
` And then Clango itself was very clear in 2000
`as the name of the thing.
` Q And did Clango have any competitors at the time
`it was being developed?
` A You know, we've had -- you know, over the
`years, we've had many, many other -- many other people
`in this area of content recognition. But I don't know
`if there was someone doing exactly that at that time.
` It's possible, because there were even some
`open source projects that were around that time. So I
`just don't remember exactly.
` Q Okay. How active was the field of content
`recognition in the late 1990s?
` MR. RAMSEY: Objection. Vague.
` MR. LEDAHL: Vague and ambiguous.
` THE WITNESS: As I remember, it -- around that
`time, it became very active.
 It was already interesting in the mid-'90s.
`There were definitely -- there was definitely academic
`interest; for example, our paper, we were approached by
`people, for example, at IBM about it; Informex; the LIST
`Company, they were interested -- they were doing content
`Page 16
`
`recognition on other media, like video and so on.
` And, of course, the Internet was exploding.
`And so search, in general, was of interest. And the
`Internet was starting to have lots of media as well as
`text. You know, originally it was just text and maybe
`pictures. So searching that became, really, a big deal.
` When Napster came on the scene -- I don't
`remember the exact -- so that may have been 2000 or
`something. I can't remember the exact year. But around
`that -- that time, then it really exploded. And then
`everybody wanted to be in on it, because they -- they
`saw money involved.
` Q Again, this is around the time that you started
`working on Clango as well?
` A Yes. So, I mean, around that time; I mean,
`plus or minus. I don't remember the exact year when --
`when all these other competitors appeared.
` Q Okay. And you mentioned that you had entered
`into a consulting arrangement with Audible Magic
`originally?
` A Right.
` Q Who was your employer when you entered into
`Page 17
`
`that consulting agreement?
` A Muscle Fish was, I think, an LLC at the time.
`And we were just essentially a partnership.
` Q What was your position at -- at Muscle Fish at
`that time?
` A So we were just all equal partners. It was a
very -- communist, I was going to say -- democratic
`partnership. We were just all in it together.
` Q Was there anyone else at Muscle Fish who was
`also working on Clango?
` A Well, I think we all discussed it, because we
`were all four people in the same office. I was the one
`who was primarily working on the technology that was
`going into Clango.
` Q Okay. And how frequently were you in contact
`with the folks at Audible Magic while you were working
`on Clango?
` A Fairly often. I mean, you know, at least -- I
`don't know exactly. And it would -- and it would go up
`and down, of course, as things heated up. At least once
`a month, I would assume, or more. And -- and then in
`2000, of course, all the time.
`
`Page 18
`
` Q And is there a particular person at Audible
`Magic with whom you were in contact?
` A I think Vance Ikezoye was the first person to
`get in touch with us. But Vance Ikezoye and Jim
`Schrempp are the two founders.
` At the -- at -- it was called Wired Air at the
`time and -- but then Jim Schrempp was the main
`technology person, and so we talked to him the most.
` Q And I believe you mentioned that, at some
`point, Muscle Fish was purchased by Audible Magic. Is
`that right?
` A Yes.
` Q Do you know when -- around when that occurred?
` A My recollection is that that happened in July
`of 2000.
` Q And after Muscle Fish was purchased by Audible
`Magic, did you continue working on the Clango
`application?
` A Uh-huh, yes.
` Q Let's take one quick step back.
` You mentioned that there were several other
`people at Muscle Fish who you had had discussions with
`Page 19
`
`
`about Clango?
` A Sure.
` Q Do you remember any of their names?
` A Yes. So Doug Keislar.
` Do you want me to spell it?
` Q Sure.
` A So Doug, D-o-u-g; Keislar is K-e-i-s-l-a-r.
` He was a -- also a computer music engineer
`type. And he had some psychoacoustic knowledge. And so
`I asked him questions about psychoacoustics when we were
`first doing the feature stuff in '95 or so, and we
`continue those discussions.
` Thom Blum, T-h-o-m, B-l-u-m; and Jim Wheaton,
`W-h-e-a-t-o-n. So all of us discussed it.
` Q And let's start with -- with Doug.
` Did he move to Audible Magic after the Muscle
`Fish purchase?
` A Yes. All four of them -- all four of us did.
` Q Perfect.
` And did they all continue to have discussions
`about Clango with you while at Audible Magic?
` A Yes.
`
`Page 20
`
` Q Based on all --
` THE REPORTER: Can you repeat? Can you slow
`down and repeat?
` MR. DANG: Sure.
`BY MR. DANG:
` Q Based on all of this experience with the Clango
`application, is it fair to say that you have personal
`knowledge about the development of the Clango
`application?
` A Yes.
` MR. DANG: Handing the witness what's been --
`would you mark this as Exhibit 1 for me?
` (Wold Exhibit 1 was marked for
` identification.)
`BY MR. DANG:
` Q Handing the witness what has been marked as
`Exhibit 1. And for the record, the Bates numbers are
`AUDMAG01710721.
` Do you recognize this e-mail?
` Please take a moment to review it, if you'd
`like.
` A Yes, I recognize this e-mail.
`
`Page 21
`
` Q What is this e-mail?
` A So this is an e-mail from Jim Schrempp, who I
`mentioned, who was the main technology person at Audible
`Magic, to me, cc'd with Vance, about -- as I said, the
`company was called Wired Air at the time.
` So it's about Audible Magic/Wired Air, and that
`they -- they have an idea about this application, which
`is Clango, that it would have -- it would have this
`capability that the user would click "ID This" button.
`And whatever they were listening to on their laptop --
`or their computer, sorry -- that it would transmit
`that -- a fingerprint of that -- the last thing they
`were listening to, and then the back end, some back-end
`service somewhere, would identify that and then return
`that back to the client and display that information:
`Title, artist, and a -- and a link, like I said, like a
`URL.
` And they're asking us to do the lookup stuff,
`so the lookup code. So this thing that does the
`fingerprinting looks at -- and I think he's -- he's
`talking about the lookup code, so the comparison and
`search part of the lookup process.
`
`Page 22
`
` I don't know how much detail you want.
` He's -- he's giving us some idea of the size of
`the database, how much time it takes; so he's giving us
`some vague spec for this, because it was early in this
`process.
` And he's saying that, clearly, a brute force
`search won't be enough performance; won't -- won't be
`fast enough. We have to do some kind of clustering,
`some other strategy that optimizes this identification.
` And he's happy with either an offline indexing
`process or not, I assume. He said that's -- I don't
know -- the other is fine.
` Q Let -- let me try to break that answer down
`just a little bit.
` A Okay.
` Q So let's start in the beginning.
` So in the first sentence, Jim mentions a new
`application?
` A Right.
` Q You mention that that's the -- the initial
`stages of the Clango application --
` A Right.
`
`Page 23
`
`
` Q -- right?
` And how long before this e-mail, to -- to your
`memory, were you working on the Clango application with
`Jim?
` A I -- I don't remember exactly. I don't know.
` You know, probably -- since he's referring to
`the new application, I assume he's referring to an
`ongoing discussion, so maybe a week or two or three or
`something like that.
` Q Okay.
` A Like I said, it -- it had only been a few
`months since we had been in contact.
` Q And -- and when was this e-mail sent?
` A November 29, 1999.
` Q And so you mentioned that the application that
`you and Jim envisioned would extract, I believe you
`said, fingerprints from an audio work. Is that right?
` A Yes.
` Q And what is a fingerprint of an audio work?
` A Well, it's a -- a metaphor, like a -- the fact
`that you're identified by your own human fingerprint.
` The idea is -- and, in fact, even with a human
`Page 24
`
`fingerprint -- there's the idea that you -- you pull out
`certain features, and that you find a little crossing
`here or a swirl there.
` So the idea is that if you want to identify a
`piece of audio, you can do the same kind of thing. You
`can extract features from that that describe the audio,
`and that represent it, and that are much smaller than
`the whole audio.
` So like a fingerprint is a little teeny piece
`of you, you don't have to have the whole person; you
`just need this thing. And it's enough to identify
`something.
` Q And how did you and Jim envision extracting
`features from a piece of audio at this time?
` A Well, so I think in the initial discussions,
`even with Vance, in -- in mid-1999, we told them we had
`feature extraction capabilities, because that had been
`described in our paper in the mid-'90s, in the '95-'96
`time frame.
` Q And once you had extracted a feature from a
`piece of audio --
` (Counsel conferring with counsel.)
`
`Page 25
`
`BY MR. DANG:
` Q Sorry, let me start over.
` What feature extraction capabilities did you
`have, in a little bit more detail, at that time?
` A Yeah. So in the -- in the mid-'90s, we -- we
`had -- as I said, we had had this paper; we had several
`papers. And in at least one of them, we had discussed,
`you know, extraction of things like pitch, spectral
`centroid, spectral width, mel-filter Cepstral
`coefficients, which were -- well, a way we would
`describe the spectrum of the sound over time.
` And -- yeah, and all these things were over
`time, so they were tracked over time.
` Yeah, I'm sure there were some other things as
`well, but those are the ones that come to mind.
` Q Well, let's -- let's take one as an example.
` So you mentioned pitch as something --
` A Yes.
` Q -- that you would extract.
` Would you describe a little bit of how you
`would fingerprint a song by extracting a pitch feature?
` A Yes.
`
`Page 26
`
` I mean, are you talking about what I did back
`then, or what I would do if I were to do that -- I mean,
`how one could possibly do that now or something?
` I'm not sure what the question is.
` Q Right. So to clarify, so back then, in
`November of 1999 --
` A Right.
` Q -- how did you extract features from a song
`using pitch?
` A Yes.
` Well, so in 1999, we actually didn't use that
`feature, so that's why I was asking that question.
` Q Oh, okay.
` A We -- by the time that I -- we had talked to
`Audible Magic -- okay. So let me step back just a
`little bit.
` So in the mid-'90s, we were interested in the
`whole similarity problem of -- of sound, and from, you
`know, sound effects to larger musical things. And so we
`were looking at all kinds of features.
` But audio identification is a subset of the
`whole similarity problem. It's -- you know, it's --
`Page 27
`
`1
`2
`3
`4
`5
`6
`7
`8
`9
`10
`11
`12
`13
`14
`15
`16
`17
`18
`19
`20
`21
`22
`
`1
`2
`3
`4
`5
`6
`7
`8
`9
`10
`11
`12
`13
`14
`15
`16
`17
`18
`19
`20
`21
`22
`
`www.DigitalEvidenceGroup.com
`
`Digital Evidence Group C'rt 2019
`
`when things are very, very similar, then we say they're
`the same, typically.
` Q Uh-huh.
` A By that -- by the time of 1999, 1999, we had
`decided that the -- that the mel-filter Cepstral
`coefficients -- I'm going to refer to them as MFCCs,
`because that's what we -- it's easier.
` THE REPORTER: Refer to them as what?
` THE WITNESS: MFCCs, the acronym.
` So by that time, we had decided that that was
`enough to do an identification, that those features were
`enough.
`BY MR. DANG:
` Q So the -- the MFCC, was that a particular
`feature that was extracted?
` A Yes, a set of features.
` Q A set of features?
` A Right.
` Q And do you remember what set of features went
`into the MFCC?
` A Yes. So I don't know how -- how technical you
`want to get. But basically, I can give you a very
`Page 28
`
`simple idea.
` It's basically, at the time we were using
`10 MFCCs per segment of sound. And a segment of sound,
`you know, could be 25 milliseconds, or it could be a
`second, for example, something in that -- but it's
`something short, in that range.
` And the -- the way those -- so they're just
floating-point numbers.
` Let me think.
` The way that they are computed is, you do some
`kind of spectral analysis of the sound. So that's,
`like, what the ear does. You look for low frequencies,
`high frequencies, so on. And then you rectify that.
`And then you take another spectral analysis of that.
` And -- and what you end up with is a set of
`features that really describes the spectral shape of the
`sound. The lower-order features are kind of the gross
`shape of the spectrum, and the higher-order features are
`more the detail, the -- yeah, the higher features on the
`spectrum, and over time.
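
The computation described in this answer -- a spectral analysis, a rectification, then a second spectral analysis whose lower-order terms capture the gross shape of the spectrum -- can be sketched roughly as follows. This is an illustrative approximation only (a real MFCC pipeline adds mel-scale filterbanks and windowing); the frame length, sample rate, and function name are assumptions, not Audible Magic's actual code.

```python
import numpy as np

def mfcc_like_features(frame, n_coeffs=10):
    """Illustrative MFCC-style sketch: spectral analysis, rectification
    (magnitude), log compression, then a second spectral analysis.
    Low-order outputs describe the gross shape of the spectrum."""
    spectrum = np.abs(np.fft.rfft(frame))   # first spectral analysis + rectify
    log_spec = np.log(spectrum + 1e-10)     # compress, roughly as the ear does
    cepstrum = np.fft.rfft(log_spec).real   # second spectral analysis
    return cepstrum[:n_coeffs]              # 10 numbers per segment of sound

# One ~25 ms segment at an assumed 8 kHz sample rate -> 10 features.
segment = np.sin(2 * np.pi * 440 * np.arange(200) / 8000.0)
features = mfcc_like_features(segment)
```

As in the testimony, each short segment of sound reduces to a small fixed-size vector, and a track becomes a sequence of such vectors over time.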
` Q And once the Clango application had extracted
`those features, what would it do next in the process of
`Page 29
`
`identifying a song?
` MR. LEDAHL: Lacks foundation.
` THE WITNESS: So in the actual application, as
`it was released in mid-2000, there would be the Clango
`application, itself, would just extract the features.
`And so it would -- when the person pressed the button,
`there would be, say, a circular buffer of the last -- or
`not necessarily a circular buffer. There would be a
`buffer containing the last so many seconds of the sound
`of what they had been listening to.
` That would all be fingerprinted at the Clango
`application. So this set of MFCCs would be extracted.
` Then that would -- that package would be sent
`over the network to a server at -- under Audible Magic's
`control. I think they actually had it in a third-party
`site at the time. I'm not sure.
` And at that -- and on -- on that end, there
`would be a reference database of fingerprints. And then
`some kind of lookup algorithm would be done to compare
`the fingerprint coming in with that set of references
`and -- to see if any of them were close enough that we
`would report a match.
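
Structurally, the client-side flow just described -- keep a buffer of the last so many seconds of audio, fingerprint it when the user presses the button, and package the result for the lookup server -- might look like the sketch below. The buffer size, sample rate, segment length, and names are invented for illustration, and the feature extractor is a crude stand-in for the real MFCC code.

```python
from collections import deque

SAMPLE_RATE = 8000            # assumed rate, for illustration
BUFFER_SECONDS = 15           # "the last so many seconds" of audio
SEGMENT = SAMPLE_RATE // 40   # roughly 25 ms per segment

# Rolling buffer of the most recent audio the user has been listening to.
recent_audio = deque(maxlen=BUFFER_SECONDS * SAMPLE_RATE)

def extract_features(samples):
    # Stand-in for MFCC extraction: one coarse energy value per segment.
    return [sum(abs(s) for s in samples[i:i + SEGMENT]) / SEGMENT
            for i in range(0, len(samples) - SEGMENT + 1, SEGMENT)]

def on_id_this_pressed():
    """Fingerprint the buffered audio and build the package the client
    would send over the network to the server-side lookup."""
    fingerprint = extract_features(list(recent_audio))
    return {"fingerprint": fingerprint}
```

On receipt, the server side would compare the incoming fingerprint against its reference database and return the matching metadata, per the testimony above.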
`
`Page 30
`
` (Counsel conferring with counsel.)
`BY MR. DANG:
` Q In this 1999 e-mail, I guess, how did you
`envision, at that time, that Clango, or whatever the new
`application you were discussing here, would use those
`MFCC extracted features to identify a song?
` A I think it's described -- I was just looking.
`I mean, I think it's described in the e-mail.
` Q Uh-huh.
` A He's saying that:
` Each fingerprint will be the set of
` feature vectors that describe a
` 15-second piece of a song.
` It would take -- yeah, he even talks about a
`circular buffer. So we would be doing the analysis on
`this stream of recording -- sorry -- on this stream of
`audio that the user is listening to. And when the user
`clicks "ID This," that fingerprint, the last -- in fact,
he's saying 180 seconds is what he was imagining at the
`the time -- worth of these fingerprints would be sent to
`the central Wired Air server.
` Then the server would do the lookup and return
`Page 31
`
`
`the metadata.
` Q And let's -- let's dive a little deeper into
`the lookup routine mentioned in this -- in this e-mail.
` Was Muscle Fish entrusted to build the lookup
`routine at this time?
` A Correct.
` Q And why was Muscle Fish, in particular,
`entrusted to build the lookup routine, to your
`knowledge?
` MR. LEDAHL: Calls for speculation.
` MR. RAMSEY: Calls for speculation.
`BY MR. DANG:
` Q Do you know why Muscle Fish was entitled to --
`or was entrusted to build the lookup routine?
` MR. LEDAHL: Same objection.
` THE WITNESS: I believe I do, because we had
`expertise in this area. We were actually in this little
`niche area. We were actually fairly well known.
` We had published this paper that was, again, in
`our little area, was a seminal paper. And they had
`gotten our names because of our expertise in this area.
`That's how they contacted us in the first place.
`Page 32
`
`BY MR. DANG:
` Q And as you began working on the lookup routine,
`did you have any particular goals when building that
`routine?
` A Yes. I mean, of course, it -- it had to
`function. And as described in this -- and he did give
`us some specifications, in terms of performance and so
`on and accuracy.
` Q And what do you mean by performance
`specifications?
` A So it had to respond to the user in a certain
`amount of time, for example. And it couldn't be too
`expensive of a computation, because that would be too
`much of a cost.
` Yeah, so that's what I meant by performance.
` Q And Jim -- do you see where it says --
` A Yes.
` Q -- Jim envisions that:
` I would like the routines to return
` an incorrect identification of less than
` one percent of the time and less than
` four percent of the time have the
`
`Page 33
`
` routines fail to identify a fingerprint
` that is, in fact, in the sample.
` A Yes.
` Q Do you see that in this third paragraph here?
` A Yes, I do.
` Q Why would a lookup routine fail to return a
`fingerprint that is actually in the sample?
` A So, I mean, there's a fundamental issue that
`nothing is perfect. It's a fuzzy question, to begin
`with.
` Q Uh-huh.
` A It's a somewhat subjective -- as I said, when
`you talk about two things being similar, you know, they
`can be very similar or -- or not very similar.
` And so you have to decide how similar they have
`to be. So fundamentally, just in any kind of a matching
`algorithm that's -- that's not exact, you have to decide
`where those -- that boundary is. That's a subjective
`question.
` So some things -- so when you say -- in fact,
`when you say that is, in fact, in the sample, that is
`actually a little bit stronger than you can even say,
`Page 34
`
`because by its nature, it's a -- it's a fuzzy thing.
` Now, also, when you actually do some kind of
`lookup algorithm, you're going to trade off, typically,
`some error rates for issues like performance. So
`you'll -- you will get errors if -- if you try to do
`typical things to speed up the process, you will allow
`some errors. And maybe you're happy if, you know, you
`miss one in a -- one in a thousand. What he's saying is
`actually fairly high, one in -- oh, yeah, four out of
`100, one in 25.
` You might be willing to miss that to make it
`more cost-effective, for example.
` Q So let's -- let's break down just a couple of
`things that are -- that are in that answer.
` A Okay.
` Q So you mentioned that it's -- it's a fuzzy
`thing.
` Does -- are -- were you -- are you saying that
`there -- the lookup routine that you had envisioned with
`Jim at this time wouldn't look for exact matches of
`fingerprints?
` MR. LEDAHL: Vague and ambiguous.
`Page 35
`
`1
`2
`3
`4
`5
`6
`7
`8
`9
`10
`11
`12
`13
`14
`15
`16
`17
`18
`19
`20
`21
`22
`
`1
`2
`3
`4
`5
`6
`7
`8
`9
`10
`11
`12
`13
`14
`15
`16
`17
`18
`19
`20
`21
`22
`
`www.DigitalEvidenceGroup.com
`
`Digital Evidence Group C'rt 2019
`
` THE WITNESS: Well, again, yes.
` It depends a little bit on what you mean by
`exact. But if you -- for example, the bits will never
`be exactly the same -- well, I shouldn't say never. Of
`course, they could be exactly the same.
` If you sent -- if you had the exact same
`reference and audio, and you fingerprinted them, you
`would get the exact same bits. And if you compared
`those, those would match exactly at the bit level.
` But typically, what -- what you would want in
`an application like this is that the -- you know, the
`average user would say, you know, that is that tune;
`that tune is in your database; why didn't you get it.
`You know, that would be, what would you call, a match.
` And that would typically not be a bit-for-bit
`match. That would be that they would be similar enough,
`so the values would be similar enough by some kind of
`distance measure to -- to be -- said to be a match.
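
The distinction drawn in this answer -- a bit-for-bit match when the audio is identical, versus "similar enough" under a distance measure when it is not -- can be shown with a toy example. The vectors and threshold below are invented purely for illustration.

```python
def euclidean(a, b):
    # Distance between two feature vectors.
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

reference = [0.12, 0.80, 0.33]   # fingerprint values stored in the database
identical = [0.12, 0.80, 0.33]   # exact same audio -> exact same bits
near_copy = [0.13, 0.79, 0.35]   # same tune after, say, lossy re-encoding

# Identical audio matches at the bit level; the near-copy does not,
# but it still falls inside a chosen similarity threshold.
THRESHOLD = 0.05
exact_match = (identical == reference)
fuzzy_match = (near_copy != reference
               and euclidean(near_copy, reference) <= THRESHOLD)
```

Where to set THRESHOLD is exactly the subjective boundary the witness describes.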
`BY MR. DANG:
` Q And to determine whether some song in your
`reference database was, as you mentioned, similar enough
`to the query, did you have to set any sort of threshold
`Page 36
`
`of similarity between those two?
` A Yeah. Yes.
` Q And I see in this fourth paragraph, the
`statement:
` Brute force won't yield the
` performance or accuracy needed for this.
` Do you see that?
` A Yes.
` Q What's a brute -- what's your understanding of
`the term "brute force" in this -- in this paragraph?
` MR. LEDAHL: Calls for speculation.
` THE WITNESS: At the time -- I mean, we would
`have -- we would have said that brute force meant taking
`the fingerprint of the unknown and comparing it to every
`single reference, computing the full -- the -- so
`whatever distance measure you're using, we were using --
`sorry. I'm going fast.
` We were using a Euclidean distance. And so
`every -- to be a brute force search would be to take
`that unknown and compute the Euclidean distance of that
`unknown with every single reference in the database, the
`full Euclidean distance, and then to look and see if any
`Page 37
`
`of them are within that threshold that you mentioned
`earlier.
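
Brute force as defined in this answer -- compute the full Euclidean distance from the unknown to every single reference, then check the threshold -- is straightforward to sketch; the names and values below are illustrative. Its cost grows linearly with the database size, which is the performance problem the e-mail's clustering suggestion was meant to avoid.

```python
import math

def brute_force_lookup(query, references, threshold):
    """Full Euclidean distance from the unknown fingerprint to every
    reference; report the closest one within the threshold, or None.
    O(N) distance computations per query."""
    best_id, best_dist = None, float("inf")
    for track_id, ref in references.items():
        dist = math.sqrt(sum((q - r) ** 2 for q, r in zip(query, ref)))
        if dist < best_dist:
            best_id, best_dist = track_id, dist
    return best_id if best_dist <= threshold else None

db = {"track_a": [0.1, 0.9, 0.4], "track_b": [0.8, 0.2, 0.6]}
hit = brute_force_lookup([0.12, 0.88, 0.41], db, threshold=0.1)  # near track_a
miss = brute_force_lookup([0.5, 0.5, 0.5], db, threshold=0.1)    # near neither
```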
`BY MR. DANG:
` Q And why wouldn't -- sorry, let me scratch that.
` Is it fair to say that that brute force search
`is not the type of search that you and Jim envisioned
`using for this new application in this e-mail