FingerCode: A Filterbank for Fingerprint Representation and Matching*

Anil K. Jain, Salil Prabhakar, Lin Hong
Department of Computer Science and Engineering
Michigan State University, East Lansing, MI 48824
{jain,prabhaka,honglin}@cse.msu.edu

and Sharath Pankanti
IBM T. J. Watson Research Center
Yorktown Heights, NY 10598
sharat@watson.ibm.com

Abstract
With identity fraud in our society reaching unprecedented proportions and with an increasing emphasis on the emerging automatic positive personal identification applications, biometrics-based identification, especially fingerprint-based identification, is receiving a lot of attention. There are two major shortcomings of the traditional approaches to fingerprint representation. For a significant fraction of the population, representations based on explicit detection of complete ridge structures in the fingerprint are difficult to extract automatically. The widely used minutiae-based representation does not utilize a significant component of the rich discriminatory information available in the fingerprints. The proposed filter-based algorithm uses a bank of Gabor filters to capture both the local and the global details in a fingerprint as a compact 640-byte fixed-length FingerCode. The fingerprint matching is based on the Euclidean distance between the two corresponding FingerCodes and hence is extremely fast. Our initial results show identification accuracies comparable to the best results of minutiae-based algorithms published in the open literature [2]. Finally, we show that the matching performance can be improved by combining the decisions of the matchers based on complementary fingerprint information.
1 Introduction
With the advent of electronic banking, e-commerce, and smartcards, and an increased emphasis on the privacy and security of information stored in various databases, automatic personal identification has become a very important topic. Accurate automatic personal identification is now needed in a wide range of civilian applications such as passport control, cellular telephones, automatic teller machines, and driver licenses. Traditional knowledge-based (password or Personal Identification Number (PIN)) and token-based (passport, driver's license, and ID card) identifications are prone to fraud because PINs may be forgotten or guessed by an imposter and the tokens may be lost or stolen. As an example, credit card fraud alone now costs more than 6 billion dollars annually. Biometrics, which refers to identifying an individual based on his or her physiological or behavioral characteristics, is more reliable in differentiating between an authorized person and an imposter.

*This work is partially supported by IBM Contract No. 11 1001069.

Figure 1: Minutiae features: a ridge ending (□) and a bifurcation (○).

Among all the various biometrics (e.g., face, fingerprints, iris, etc.), fingerprint-based identification is the most mature and proven technique. A fingerprint is the pattern of ridges and furrows on the surface of the finger. The uniqueness of a fingerprint can be determined by the overall pattern of ridges and furrows as well as the local ridge anomalies (a ridge bifurcation or a ridge ending, called minutiae points (see Figure 1)). As fingerprint sensors are becoming smaller and cheaper [1], automatic identification based on fingerprints is becoming an attractive alternative/complement to the traditional methods of identification. The critical factor for the widespread use of fingerprints is in meeting the performance (e.g., matching speed and accuracy) standards demanded by

0-7695-0149-4/99 $10.00 © 1999 IEEE

`ASSA ABLOY Ex 1037 - Page 1
`ASSA ABLOY AB, et al. v. CPC Patent Technologies Pty Ltd.
`IPR2022-01093 - U.S. Patent No. 8,620,039
Figure 2: Flow diagram of our fingerprint identification system (input image → normalize each sector → filtering → compute standard deviation → template FingerCode and input FingerCode → calculate Euclidean distance → matching result).

emerging civilian identification applications. Some of these applications (e.g., fingerprint-based smartcards) will also benefit from a compact representation of a fingerprint.

The popular fingerprint representation schemes have evolved from an intuitive system design tailored for fingerprint experts who visually match fingerprints. These schemes are either predominantly local (e.g., minutiae-based fingerprint matching systems [2, 3]) or exclusively global (fingerprint classification based on the Henry system [4, 5]). The minutiae-based automatic identification techniques first locate the minutiae points and then match their relative placements in a given finger and the stored template [2]. A good quality fingerprint contains between 60 and 80 minutiae, but different fingerprints have different numbers of minutiae. The variable-sized minutiae-based representation does not easily lend itself to indexing mechanisms; typical approaches [2] to matching the minutiae from two fingerprints need to align the unregistered minutiae patterns of different sizes, which makes them computationally expensive. The variable length of the fingerprint representation (in terms of position and orientation of minutiae) makes it unsuitable for storing the fingerprint on a smartcard. The global approach to fingerprint representation is typically used for indexing, but does not offer good individual discrimination. Further, the indexing efficacy of existing global representations is poor due to the small number of categories that can be effectively identified and the highly skewed distribution of the population in each category. Both these approaches utilize representations which cannot
be easily extracted from poor quality fingerprints.
It is desirable to explore representation schemes which combine global and local information in a fingerprint. We present a new representation for fingerprints which yields a relatively short, fixed-length code, called FingerCode, suitable for matching as well as for storage on a smartcard. The matching reduces to finding the Euclidean distance between these FingerCodes, and hence the matching is very fast and the representation is amenable to indexing. We make use of both the global flow of the ridge and furrow structure and the local ridge characteristics to generate a short fixed-length code for the fingerprints while maintaining a high recognition accuracy.
The proposed scheme of feature extraction tessellates the region of interest of the given fingerprint image with respect to a frame of reference (Figure 2). A feature vector is composed of an ordered enumeration of the features extracted from the (local) information in each subimage (sector) specified by the tessellation. Thus, the feature elements capture the local information, and the ordered enumeration of the tessellation captures the invariant global relationships among the local patterns. The local discriminatory information in each sector needs to be decomposed into separate components. A Gabor filterbank is a well-known technique for capturing useful information in specific bandpass channels as well as for decomposing this information into orthonormal components in terms of spatial frequencies.
2 Filter-based Feature Extraction
The three main steps in our feature extraction algorithm are: (i) determine a reference frame for the fingerprint image; (ii) filter the image in eight different directions using a bank of Gabor filters; and (iii) compute the standard deviation of gray values in sectors around the reference point in the filtered images to define the feature vector, or FingerCode.

Figure 3: Reference point (×), the region of interest, and 80 sectors superimposed on a fingerprint.

Let I(x, y) denote the gray level at pixel (x, y) in an M × N fingerprint image and let (x_c, y_c) denote the reference point. The region of interest is defined as the collection of all the sectors S_i, where the i-th sector S_i is computed in terms of parameters (r, θ) as follows:

S_i = {(x, y) | b(T_i + 1) ≤ r < b(T_i + 2), θ_i ≤ θ < θ_{i+1}, 1 ≤ x ≤ N, 1 ≤ y ≤ M},  (1)

where T_i = i div k, θ_i = (i mod k)(2π/k), r = √((x − x_c)² + (y − y_c)²), θ = tan⁻¹((y − y_c)/(x − x_c)), b is the width of each band, and k is the number of sectors considered in each band. We consider five concentric bands around the detected reference point for feature extraction. Each band is 20 pixels wide (b = 20) and segmented into sixteen sectors (k = 16) (Figure 3). A 20-pixel-wide band captures about 2 ridges and furrows on average in a 500 dpi fingerprint image. The innermost band is not used for feature extraction because the sectors in the region near the reference point contain very few pixels and, therefore, the standard deviation estimates in this region are not very reliable. Thus, we have a total of 16 × 5 = 80 sectors (S_0 through S_79). Eighty features for each of the eight filtered images give us a total of 640 (80 × 8) features per fingerprint image. Each feature can be quantized into 256 values and requires 1 byte of storage, so the entire feature vector requires only 640 bytes of storage.
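The sector assignment of Eq. (1) can be sketched in a few lines. The function name and formulation below are ours, not the paper's; it maps a pixel to its sector index, or to None when the pixel falls in the skipped innermost disc or outside the five bands:

```python
import math

def sector_index(x, y, xc, yc, b=20, k=16, n_bands=5):
    """Return the index i of the sector S_i containing pixel (x, y),
    or None if the pixel lies outside the region of interest.

    Follows Eq. (1): b(T_i + 1) <= r < b(T_i + 2) selects the band
    (the innermost b-pixel disc is skipped), and theta_i <= theta <
    theta_{i+1} selects one of the k angular sectors per band."""
    r = math.hypot(x - xc, y - yc)
    theta = math.atan2(y - yc, x - xc) % (2 * math.pi)
    band = int(r // b) - 1            # concentric band, T_i = i div k
    if band < 0 or band >= n_bands:   # innermost disc or beyond band 5
        return None
    angular = int(theta // (2 * math.pi / k))  # i mod k
    return band * k + angular         # i in 0..79 for b=20, k=16
```

With b = 20 and k = 16 this yields exactly the 80 sectors S_0 through S_79 described above.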
It is difficult to rely on feature extraction based on explicit detection of structural features in (e.g., poor quality) fingerprints; features based on statistical properties of images are likely to degrade gracefully with image quality deterioration. For this study, we rely on grayscale variance-based (i.e., standard deviation) features. The grayscale standard deviation in an image sector is indicative of the overall ridge activity. As noted in Section 3, our matcher based on this simple statistical feature performs extremely well, and we expect to achieve significantly better accuracies with more discriminative attributes.
It is desirable to obtain representations for fingerprints which are scale, translation, and rotation invariant. Scale invariance is not a significant problem since most fingerprint images can be scaled as per the spatial resolution (dpi) of the sensor. Rotation and translation invariance can be accomplished by establishing a reference frame based on the intrinsic fingerprint characteristics, which are rotation and translation invariant.
2.1 Reference Frame Determination
Fingerprints have many conspicuous landmark structures and any combination of them could be used for establishing a reference frame. We define the reference point of a fingerprint as the point of maximum curvature of the ridges in the fingerprint image, and the reference axis is defined to be the axis of local symmetry at the reference point. The reference point and reference axis together establish an invariant frame of reference for a given fingerprint. Reference frame detection relies on Poincaré index analysis and local symmetry detection, similar to the method used in [5].

Figure 4: Determining the reference frame.

Our representation scheme tolerates imprecision in the estimates of the reference frame. Since fingerprints are smoothly flowing ridge patterns, the characteristics of local neighborhoods change gradually. A small perturbation (within one inter-ridge distance unit) is likely to change the representation only slightly, so the overall variation in the representation of a finger due to inaccurate localization is expected to remain small. The detected reference point could be as much as 12 pixels (approximately 1 inter-ridge distance unit) away from its "true" location. Symmetry axis detection is very difficult in fingerprint images because the upper portion of the fingerprint has circular ridges. We do not rely on the symmetry axis to align the fingerprints. We achieve rotation invariance by rotating the FingerCode itself during the matching stage. Typical outputs of the reference frame determination algorithm are shown in Figure 4.
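Since rotation is handled by rotating the FingerCode itself, the matching stage can be sketched as a minimum Euclidean distance over cyclic shifts of the code. The (orientation, band, sector) array layout and the coupling between the sector shift and the orientation-channel shift are our assumptions; the paper only states that the FingerCode is rotated during matching:

```python
import numpy as np

def match_distance(template, query, k=16):
    """Rotation-tolerant Euclidean distance between two FingerCodes.

    Each code is assumed to be an array of shape (8, 5, k): 8 Gabor
    orientations, 5 bands, k angular sectors per band. A rotation of
    the finger by one angular step (360/k = 22.5 deg) cyclically shifts
    the sectors within each band and advances the orientation channel
    by one, so we take the minimum distance over all k shifts."""
    best = np.inf
    for s in range(k):
        rot = np.roll(query, s, axis=2)                  # shift sectors
        rot = np.roll(rot, s % query.shape[0], axis=0)   # shift orientations
        best = min(best, float(np.linalg.norm(template - rot)))
    return best
```

Taking the minimum over shifts keeps the distance zero for two identical codes while tolerating a rotated impression of the same finger.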
2.2 Filtering
To remove noise and enhance the ridge and furrow structures, we filter the fingerprint image in different directions using a bank of Gabor filters [6]. Fingerprints have local parallel ridges and furrows, and well-defined local frequency and orientation. Properly tuned Gabor filters can remove noise, preserve the true ridge and furrow structures, and provide information contained in a particular direction in the image. A minutia point is an anomaly in locally parallel ridges, and this information is captured by the Gabor filters. An even-symmetric Gabor filter has the following general form in the spatial domain:

G(x, y; f, θ) = exp{−0.5[(x′²/δ_x²) + (y′²/δ_y²)]} cos(2πf x′),  (2)

where x′ = x sin θ + y cos θ, y′ = x cos θ − y sin θ, f is the frequency of the sinusoidal plane wave along the direction θ from the x-axis, and δ_x and δ_y are the space constants of the Gaussian envelope along the x and y axes, respectively.

Figure 5: One of the eight Gabor filters (size = 33 × 33, f = 0.1, δ_x = 4.0, δ_y = 4.0, θ = 0°) in the spatial domain.

In our experiments, we set the filter frequency f to the average ridge frequency (1/K), where K is the inter-ridge distance. The average inter-ridge distance is approximately 10 pixels in a 500 dpi fingerprint image. If f is too large, spurious ridges are created in the filtered image, whereas if f is too small, nearby ridges are merged into one. We used eight different values for θ (0°, 22.5°, 45°, 67.5°, 90°, 112.5°, 135°, and 157.5°) with respect to the x-axis. One of these eight filters is shown in Figure 5. The region of interest in a fingerprint image is convolved with each of these eight filters to produce a set of eight filtered images. A fingerprint convolved with a 0°-oriented filter accentuates those ridges which are parallel to the x-axis and smooths the ridges in the other directions. Filters tuned to other directions work in a similar way. These eight direction-sensitive filters capture most of the global ridge directionality information as well as the local ridge characteristics present in a fingerprint. We illustrate this by reconstructing a fingerprint image by adding together all the eight filtered images. The reconstructed image is similar to the original image but has been enhanced (Figure 6(h)). At least four directions are required to capture the entire global ridge information in a fingerprint (Figure 6(g)), but eight directions are required to capture the local characteristics. By capturing both the global and the local information, the verification accuracy is improved, although there is some redundancy among the eight filtered images. If the δ_x and δ_y (standard deviations of the Gaussian envelope) values are too large, the filter is more robust to noise, but is more likely to smooth the image to the extent that the ridge and furrow details in the fingerprint are lost. If the δ_x and δ_y values are too small, the filter is not effective in removing noise. The values for δ_x and δ_y were empirically determined and each is set to 4.0 (about half the average inter-ridge distance).
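Eq. (2) with the parameter values above (f = 0.1, δ_x = δ_y = 4.0, a 33 × 33 support) can be realized directly. The sketch below uses NumPy and is our formulation, not the authors' implementation:

```python
import numpy as np

def gabor_kernel(size=33, f=0.1, dx=4.0, dy=4.0, theta=0.0):
    """Even-symmetric Gabor filter of Eq. (2) in the spatial domain."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.sin(theta) + y * np.cos(theta)   # x' in the rotated frame
    yr = x * np.cos(theta) - y * np.sin(theta)   # y'
    envelope = np.exp(-0.5 * (xr**2 / dx**2 + yr**2 / dy**2))
    return envelope * np.cos(2 * np.pi * f * xr)

# The eight orientations used in the paper:
bank = [gabor_kernel(theta=np.deg2rad(a))
        for a in (0, 22.5, 45, 67.5, 90, 112.5, 135, 157.5)]
```

Convolving the region of interest with each kernel in `bank` produces the eight filtered images from which the features are computed.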
Before filtering the fingerprint image, we normalize the region of interest in each sector separately to a constant mean and variance. Normalization is performed to remove the effects of sensor noise and finger pressure differences. Let I(x, y) denote the gray value at pixel (x, y), M_i and V_i the estimated mean and variance of sector S_i, respectively, and N_i(x, y) the normalized gray-level value at pixel (x, y). For all the pixels in sector S_i, the normalized image is defined as:

N_i(x, y) = M_0 + √(V_0 (I(x, y) − M_i)² / V_i),  if I(x, y) > M_i,
N_i(x, y) = M_0 − √(V_0 (I(x, y) − M_i)² / V_i),  otherwise,

where M_0 and V_0 are the desired mean and variance values, respectively. Normalization is a pixel-wise operation which does not change the clarity of the ridge and furrow structures. If normalization is performed on the entire image, then it cannot compensate for the intensity variations in different parts of the finger due to finger pressure differences. Normalization of each sector separately alleviates this problem. For our experiments, we set both M_0 and V_0 to a value of 100.
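The per-sector normalization to M_0 = V_0 = 100 can be sketched as follows; the helper name and the flat-sector guard are ours:

```python
import numpy as np

def normalize_sector(pixels, M0=100.0, V0=100.0):
    """Normalize one sector's gray values to mean M0 and variance V0.

    Implements the two-branch formula above: values above the sector
    mean M_i map to M0 + sqrt(V0 (I - M_i)^2 / V_i), the rest to
    M0 - sqrt(V0 (I - M_i)^2 / V_i)."""
    pixels = np.asarray(pixels, dtype=float)
    Mi, Vi = pixels.mean(), pixels.var()
    if Vi == 0.0:                    # flat sector: map everything to M0
        return np.full_like(pixels, M0)
    dev = np.sqrt(V0 * (pixels - Mi) ** 2 / Vi)
    return np.where(pixels > Mi, M0 + dev, M0 - dev)
```

Because the mapping is linear around M_i, the output of each sector has exactly the desired mean M_0 and variance V_0.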
Figure 6: Normalized, filtered (only four orientations shown), and reconstructed fingerprint images. Only the central part of the fingerprint is shown. (a) area of interest; (b) normalized image; (c)-(f) 0°, 45°, 90°, and 135° filtered images; (g), (h) reconstructed images using four and eight orientation filters, respectively.

2.3 Feature Vector
Let F_{iθ}(x, y) be the θ-direction filtered image for sector S_i. For i = 0, 1, ..., 79 and θ ∈ {0°, 22.5°, 45°, 67.5°, 90°, 112.5°, 135°, 157.5°}, the feature value is the standard deviation V_{iθ}, defined as

V_{iθ} = √( (1/n_i) Σ_{(x,y)∈S_i} (F_{iθ}(x, y) − P_{iθ})² ),

where n_i is the number of pixels in S_i and P_{iθ} is the mean of the pixel values of F_{iθ}(x, y) in S_i. The standard deviations of each sector in each of the eight filtered images define the components of our feature vector. The 640-dimensional feature vectors (FingerCodes) for fingerprint images of two different fingers are shown as gray level images with eight disks, each disk corresponding to one filtered image (Figure 7). The gray level in a sector in a disk represents the feature value for that sector in the corresponding filtered image. Note that Figures 7(a) and (b) appear to be visually similar, as are Figures 7(c) and (d), but the corresponding disks for two different fingers look very different.
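Given the eight filtered images and the 80 sector masks, assembling the FingerCode reduces to one standard deviation per (orientation, sector) pair. The boolean-mask representation of the sectors is our choice for this sketch:

```python
import numpy as np

def fingercode(filtered_images, sector_masks):
    """Assemble the 640-dimensional FingerCode.

    filtered_images: 8 arrays, one per Gabor orientation theta;
    sector_masks: 80 boolean arrays, one per sector S_i. Each feature
    is the standard deviation of the filtered gray values in a sector,
    ordered by orientation and then by sector index."""
    return np.array([img[mask].std()
                     for img in filtered_images
                     for mask in sector_masks])
```

With 8 orientations and 80 sectors the result has 8 × 80 = 640 components, one byte each after quantization to 256 levels.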
Figure 7: Examples of 640-dimensional feature vectors. (a) First impression of finger 1; (b) Second impression of finger 1; (c) First impression of finger 2; and (d) Second impression of finger 2.

3 Matching and Experimental Results
Our preliminary experiments involve a database containing 250 images (size = 640 × 480) from 25 different fingers. Ten impressions of each finger are available, for a total of 250 images. The images were captured with an optical scanner manufactured by Digital Biometrics. We have developed a GUI to help the subjects in the proper placement of their fingers. The subjects were asked to place their fingers upright at the center of the glass platen, and they were asked to provide different impressions of their fingers within ±30° rotation. These images are not as noisy as inked fingerprints, but they do contain large nonlinear deformations. We have not used any of the standard databases (e.g., NIST 9) because the inked fingerprint images are not representative of the livescan images used in civilian applications.

Each fingerprint in the database is matched with all the other fingerprints in the database. A matching is labeled correct if the matched pair is from the same finger and incorrect otherwise. A total of 62,250 (250 × 249) matchings were performed. The distribution for genuine (authentic) matches was estimated with 2,250 (250 × 9) matches and the imposter distribution was estimated with 60,000 (250 × 240) matches. None of the genuine matching scores was zero; the images from the same finger did not yield an identical FingerCode because of rotation and inconsistency in the reference location. Given a matching distance threshold, the genuine acceptance rate is the fraction of times the system correctly identifies two fingerprints as representing the same finger. Similarly, the false acceptance rate is the fraction of times the system incorrectly identifies two fingerprints from different fingers as representing the same finger. A Receiver Operating Characteristic (ROC) curve is a plot of genuine acceptance rate against false acceptance rate for all possible system operating points (i.e., matching distance thresholds) and measures the overall performance of the system. An "ideal" ROC curve is a step function at zero false acceptance rate. Figure 8 compares the ROCs of a state-of-the-art minutiae-based matcher [2] with our filter-based matcher. Since the ROC curve of the filter-based matcher is above that of the minutiae-based matcher, we conclude that our matcher performs better than a state-of-the-art minutiae-based matcher on this database.

Figure 8: Receiver Operating Characteristic (ROC) curve.
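Each ROC operating point can be computed directly from the genuine and imposter distance distributions described above. A sketch (since the matcher outputs distances, acceptance means falling below the threshold):

```python
import numpy as np

def acceptance_rates(genuine, imposter, threshold):
    """One ROC operating point from matching-distance distributions.

    genuine: distances for same-finger pairs; imposter: distances for
    different-finger pairs. A pair is accepted when its distance is
    below the threshold. Returns (genuine acceptance rate,
    false acceptance rate)."""
    gar = float(np.mean(np.asarray(genuine) < threshold))
    far = float(np.mean(np.asarray(imposter) < threshold))
    return gar, far
```

Sweeping the threshold over the range of observed distances traces out the full ROC curve.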
An added advantage of an "independent" fingerprint representation such as that proposed here is that it captures discriminatory information that is complementary to the information used by commonly used minutiae-based fingerprint matchers. Consequently, the overall performance of fingerprint matching can be significantly improved by combining the results of matchers based on different representations. Figure 8 shows such an improvement in matching accuracy obtained by using a linear combination of the scores from the proposed filter-based and minutiae-based [2] matchers. The weights in the linear combination were selected corresponding to a line passing through the origin that best separates the genuine distribution and the imposter distribution in a two-dimensional plot of the scores from the two matchers.
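The score-level fusion described above amounts to thresholding a weighted sum of the two matchers' scores. The weight w and threshold t below are hypothetical placeholders, not the values fitted in the paper, and both scores are treated as similarities (higher means a better match):

```python
def fused_accept(filter_score, minutiae_score, w=0.5, t=1.0):
    """Linear score combination for the two matchers.

    The decision boundary w*s1 + (1-w)*s2 = t is a straight line in
    the two-dimensional score plane separating the genuine and
    imposter score clouds."""
    fused = w * filter_score + (1.0 - w) * minutiae_score
    return fused >= t
```

In practice w and t would be fitted on a training set of labeled genuine and imposter score pairs.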
4 Conclusions
We have developed a novel filter-based representation technique for fingerprint identification. The technique exploits both local and global characteristics in a fingerprint to make an identification. Each fingerprint image is filtered in a number of directions and a 640-dimensional feature vector is extracted from the central region of the fingerprint. The feature vector (FingerCode) is compact and requires only 640 bytes. The matching stage computes the Euclidean distance between the template FingerCode and the input FingerCode. On a database of 250 fingerprints from 25 different fingers, 10 impressions per finger, we are able to achieve an identification accuracy which is slightly better than the performance of a state-of-the-art minutiae-based fingerprint matcher. About 99% of the total compute time (~3 seconds on a Sun Ultra 10) is taken by the convolution of the input image with the 8 Gabor filters. The primary advantage of our approach is its computationally attractive matching/indexing capability. For instance, if the normalized (for orientation and size) FingerCodes of all the enrolled fingerprints are stored as templates, the identification effectively involves a "bit" comparison. As a result, the identification time would be relatively insensitive to the database size. Further, our approach to feature extraction and matching is more amenable to hardware implementation than, say, a string-based fingerprint matcher.
There are a number of limitations of the initial implementation described in this paper. The representation and matching schemes assume that the reference frame can be determined with reasonable accuracy. A more realistic approach would consider a combination of frame determination methods and then verify a consistent frame positioning. The current implementation requires that the entire region of interest be available and does not take into account occlusion or obliteration of part of the fingerprint. This situation could be remedied by the incorporation of "don't care" options for the components of the representation which do not correspond to the fingerprint area. While the present approach tolerates small magnitudes of elastic distortion and local scaling (due to finger-pressure variations), it does not take care of significant nonlinear elastic distortion in the fingerprints. The inter-ridge densities in a fingerprint could be used to obtain a canonical representation to compensate for the large distortions due to shear and pressure variations caused by the contact of the finger with the sensing device.
We are currently evaluating the performance of the filter-based matcher on databases containing thousands of livescan fingerprints. On a database of 2,672 images belonging to 167 people, we achieve an FRR of 12% at an FAR of 1%. This performance needs to be improved significantly. The main problem seems to be the failure of the reference frame detection algorithm on poor quality fingerprint images. We are working on (i) a more robust determination of the reference frame, (ii) refinements of the initial strategies for feature extraction and matching, and (iii) indexing techniques based on the proposed representation.
References
[1] L. O'Gorman, "Fingerprint Verification," in Biometrics: Personal Identification in a Networked Society, A. K. Jain, R. Bolle, and S. Pankanti, editors, Kluwer Academic Publishers, 1998.
[2] A. K. Jain, L. Hong, S. Pankanti, and R. Bolle, "An Identity Authentication System Using Fingerprints," Proceedings of the IEEE, Vol. 85, No. 9, pp. 1365-1388, 1997.
[3] D. Maio and D. Maltoni, "Direct Gray-Scale Minutiae Detection in Fingerprints," IEEE Trans. Pattern Anal. Machine Intell., Vol. 19, No. 1, pp. 27-40, 1997.
[4] G. T. Candela, P. J. Grother, C. I. Watson, R. A. Wilkinson, and C. L. Wilson, "PCASYS: A Pattern-Level Classification Automation System for Fingerprints," NIST Tech. Report NISTIR 5647, August 1995.
[5] A. K. Jain, S. Prabhakar, and L. Hong, "A Multichannel Approach to Fingerprint Classification," IEEE Trans. Pattern Anal. and Machine Intell., Vol. 21, No. 4, 1999.
[6] J. G. Daugman, "High Confidence Recognition of Persons by a Test of Statistical Independence," IEEE Trans. Pattern Anal. and Machine Intell., Vol. 15, No. 11, pp. 1148-1161, 1993.