`
Page 1 of 8
`
`Nearest neighbor search
`
From Wikipedia, the free encyclopedia
`
Nearest neighbor search (NNS), also known as proximity search, similarity search or closest point
search, is an optimization problem for finding closest (or most similar) points. Closeness is typically
expressed in terms of a dissimilarity function: the less similar the objects, the larger the function values.
Formally, the nearest-neighbor (NN) search problem is defined as follows: given a set S of points in a
space M and a query point q ∈ M, find the closest point in S to q. Donald Knuth in vol. 3 of The Art of
Computer Programming (1973) called it the post-office problem, referring to an application of
assigning to a residence the nearest post office. A direct generalization of this problem is a k-NN search,
where we need to find the k closest points.
`
Most commonly M is a metric space and dissimilarity is expressed as a distance metric, which is
symmetric and satisfies the triangle inequality. Even more commonly, M is taken to be the d-dimensional
vector space where dissimilarity is measured using the Euclidean distance, Manhattan distance or other
distance metric. However, the dissimilarity function can be arbitrary. One example is the asymmetric
Bregman divergence, for which the triangle inequality does not hold.[1]
`
`Contents
`
• 1 Applications
• 2 Methods
  • 2.1 Linear search
  • 2.2 Space partitioning
  • 2.3 Locality sensitive hashing
  • 2.4 Nearest neighbor search in spaces with small intrinsic dimension
  • 2.5 Projected radial search
  • 2.6 Vector approximation files
  • 2.7 Compression/clustering based search
  • 2.8 Greedy walk search in small-world graphs
• 3 Variants
  • 3.1 k-nearest neighbor
  • 3.2 Approximate nearest neighbor
  • 3.3 Nearest neighbor distance ratio
  • 3.4 Fixed-radius near neighbors
  • 3.5 All nearest neighbors
• 4 See also
• 5 Notes
• 6 References
• 7 Further reading
• 8 External links
`
NETWORK-1 EXHIBIT 2008
Google Inc. v. Network-1 Technologies, Inc.
IPR2015-00345
9/10/2015
`
`
`
`
`Applications
`
The nearest neighbor search problem arises in numerous fields of application, including:
`
• Pattern recognition - in particular for optical character recognition
• Statistical classification - see k-nearest neighbor algorithm
• Computer vision
• Computational geometry - see Closest pair of points problem
• Databases - e.g. content-based image retrieval
• Coding theory - see maximum likelihood decoding
• Data compression - see MPEG-2 standard
• Robotic sensing[2]
• Recommendation systems, e.g. see Collaborative filtering
• Internet marketing - see contextual advertising and behavioral targeting
• DNA sequencing
• Spell checking - suggesting correct spelling
• Plagiarism detection
• Contact searching algorithms in FEA
• Similarity scores for predicting career paths of professional athletes
• Cluster analysis - assignment of a set of observations into subsets (called clusters) so that
observations in the same cluster are similar in some sense, usually based on Euclidean distance
• Chemical similarity
• Sampling-based motion planning
`
`Methods
`
Various solutions to the NNS problem have been proposed. The quality and usefulness of the algorithms
are determined by the time complexity of queries as well as the space complexity of any search data
structures that must be maintained. The informal observation usually referred to as the curse of
dimensionality states that there is no general-purpose exact solution for NNS in high-dimensional
Euclidean space using polynomial preprocessing and polylogarithmic search time.
`
`Linear search
`
The simplest solution to the NNS problem is to compute the distance from the query point to every other
point in the database, keeping track of the "best so far". This algorithm, sometimes referred to as the
naive approach, has a running time of O(dN) where N is the cardinality of S and d is the dimensionality
of M. There are no search data structures to maintain, so linear search has no space complexity beyond
the storage of the database. Naive search can, on average, outperform space partitioning approaches on
higher dimensional spaces.[3]
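As a minimal sketch (assuming points are equal-length numeric tuples), the naive O(dN) scan looks like:

```python
import math

def linear_search(points, query):
    """Naive nearest-neighbor scan: O(d*N) time, no search structure."""
    best_point, best_dist = None, math.inf
    for p in points:
        # Euclidean distance between the query and the candidate point.
        d = math.dist(query, p)
        if d < best_dist:
            best_point, best_dist = p, d
    return best_point, best_dist

print(linear_search([(0, 0), (3, 4), (1, 1)], (2, 2)))  # -> ((1, 1), 1.414...)
```
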
`
Space partitioning
`
`
Since the 1970s, branch and bound methodology has been applied to the problem. In the case of
Euclidean space this approach is known as spatial index or spatial access methods. Several space-
partitioning methods have been developed for solving the NNS problem. Perhaps the simplest is the k-d
tree, which iteratively bisects the search space into two regions containing half of the points of the
parent region. Queries are performed via traversal of the tree from the root to a leaf by evaluating the
query point at each split. Depending on the distance specified in the query, neighboring branches that
might contain hits may also need to be evaluated. For constant dimension query time, average
complexity is O(log N)[4] in the case of randomly distributed points; worst case complexity analyses
have been performed.[5] Alternatively the R-tree data structure was designed to support nearest neighbor
search in dynamic context, as it has efficient algorithms for insertions and deletions such as the R* tree.
[6] R-trees can yield nearest neighbors not only for Euclidean distance, but can also be used with other
distances.
`
In the case of general metric space, the branch and bound approach is known under the name of metric trees.
Particular examples include vp-tree and BK-tree.
`
Using a set of points taken from a 3-dimensional space and put into a BSP tree, and given a query point
taken from the same space, a possible solution to the problem of finding the nearest point-cloud point to
the query point is given in the following description of an algorithm. (Strictly speaking, no such point
may exist, because it may not be unique. But in practice, usually we only care about finding any one of
the subset of all point-cloud points that exist at the shortest distance to a given query point.) The idea is,
for each branching of the tree, guess that the closest point in the cloud resides in the half-space
containing the query point. This may not be the case, but it is a good heuristic. After having recursively
gone through all the trouble of solving the problem for the guessed half-space, now compare the
distance returned by this result with the shortest distance from the query point to the partitioning plane.
This latter distance is that between the query point and the closest possible point that could exist in the
half-space not searched. If this distance is greater than that returned in the earlier result, then clearly
there is no need to search the other half-space. If there is such a need, then you must go through the
trouble of solving the problem for the other half-space, and then compare its result to the former result,
and then return the proper result. The performance of this algorithm is nearer to logarithmic time than
linear time when the query point is near the cloud, because as the distance between the query point and
the closest point-cloud point nears zero, the algorithm needs only perform a look-up using the query
point as a key to get the correct result.
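The half-space recursion described above can be sketched as a k-d tree search. This is an illustrative toy over 2D tuples (no balancing refinements), not a tuned implementation:

```python
import math

def build_kdtree(points, depth=0):
    """Recursively bisect the point set along alternating axes."""
    if not points:
        return None
    axis = depth % len(points[0])
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return {
        "point": points[mid],
        "axis": axis,
        "left": build_kdtree(points[:mid], depth + 1),
        "right": build_kdtree(points[mid + 1:], depth + 1),
    }

def nn_search(node, query, best=None):
    """Descend into the half-space containing the query, then check whether
    the other half-space could still hold a closer point."""
    if node is None:
        return best
    if best is None or math.dist(query, node["point"]) < math.dist(query, best):
        best = node["point"]
    axis = node["axis"]
    diff = query[axis] - node["point"][axis]
    near, far = (node["left"], node["right"]) if diff < 0 else (node["right"], node["left"])
    best = nn_search(near, query, best)
    # Search the far half-space only if the splitting plane is closer
    # than the best distance found so far.
    if abs(diff) < math.dist(query, best):
        best = nn_search(far, query, best)
    return best

tree = build_kdtree([(2, 3), (5, 4), (9, 6), (4, 7), (8, 1), (7, 2)])
print(nn_search(tree, (9, 2)))  # -> (8, 1)
```
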
`
Locality sensitive hashing
`
Locality sensitive hashing (LSH) is a technique for grouping points in space into 'buckets' based on
some distance metric operating on the points. Points that are close to each other under the chosen metric
are mapped to the same bucket with high probability.[7]
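One common LSH family (shown here purely as an illustration) uses random hyperplanes: the bucket key is the sign pattern of a point's dot products with a set of random directions, so angularly close points tend to collide:

```python
import random
from collections import defaultdict

def make_hash(dim, n_planes, seed=0):
    """Random-hyperplane LSH: the bucket key is the sign pattern of the
    point's dot products with a set of random directions."""
    rng = random.Random(seed)
    planes = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(n_planes)]
    def h(point):
        return tuple(int(sum(a * b for a, b in zip(plane, point)) >= 0)
                     for plane in planes)
    return h

h = make_hash(dim=2, n_planes=8)
buckets = defaultdict(list)
for p in [(1.0, 1.1), (1.0, 0.9), (-5.0, -4.0)]:
    buckets[h(p)].append(p)

# A query scans only the points that hashed to its own bucket; nearby
# points collide with high probability (but not with certainty).
print(buckets[h((1.0, 1.0))])
```
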
`
Nearest neighbor search in spaces with small intrinsic dimension
`
The cover tree has a theoretical bound that is based on the dataset's doubling constant. The bound on
search time is O(c^12 log n) where c is the expansion constant of the dataset.
`
`
Projected radial search
`
In the special case where the data is a dense 3D map of geometric points, the projection geometry of the
sensing technique can be used to dramatically simplify the search problem. This approach requires that
the 3D data is organized by a projection to a two dimensional grid and assumes that the data is spatially
smooth across neighboring grid cells with the exception of object boundaries. These assumptions are
valid when dealing with 3D sensor data in applications such as surveying, robotics and stereo vision but
may not hold for unorganized data in general. In practice this technique has an average search time of
O(1) or O(K) for the k-nearest neighbor problem when applied to real world stereo vision data.[2]
`
`Vector approximation fil es
`
In high dimensional spaces, tree indexing structures become useless because an increasing percentage of
the nodes need to be examined anyway. To speed up linear search, a compressed version of the feature
vectors stored in RAM is used to prefilter the datasets in a first run. The final candidates are determined
in a second stage using the uncompressed data from the disk for distance calculation.[8]
`
Compression/clustering based search
`
The VA-file approach is a special case of a compression based search, where each feature component is
compressed uniformly and independently. The optimal compression technique in multidimensional
spaces is Vector Quantization (VQ), implemented through clustering. The database is clustered and the
most "promising" clusters are retrieved. Huge gains over VA-File, tree-based indexes and sequential
scan have been observed.[9][10] Also note the parallels between clustering and LSH.
`
`Greedy walk search in small-world graphs
`
One possible way to solve NNS is to construct a graph G(V, E), where every point x_i ∈ S is
uniquely associated with vertex v_i ∈ V. The search of the point in the set S closest to the query q takes
the form of the search of a vertex in the graph G(V, E). One of the basic vertex search algorithms in
graphs with metric objects is the greedy search algorithm. It starts from a random vertex v_i ∈ V. The
algorithm computes a distance value from the query q to each vertex from the neighborhood
{v_j : (v_i, v_j) ∈ E} of the current vertex v_i, and then selects a vertex with the minimal distance
value. If the distance value between the query and the selected vertex is smaller than the one between
the query and the current element, then the algorithm moves to the selected vertex, and it becomes the new
current vertex. The algorithm stops when it reaches a local minimum: a vertex whose neighborhood does
not contain a vertex that is closer to the query than the vertex itself. This idea was exploited in the VoroNet
system[11] for the plane, in the RayNet system[12] for E^n, and for the general metric space in the Metrized
Small World algorithm.[13] Java and C++ implementations of the algorithm together with sources are
hosted on GitHub: Non-Metric Space Library
(https://github.com/searchivarius/NonMetricSpaceLib), Metrized Small World library
(http://aponom84.github.io/MetrizedSmallWorld/).
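A sketch of the greedy walk, assuming a hypothetical adjacency-list graph and a coordinate table for the vertices:

```python
import math
import random

def greedy_search(neighbors, coords, query, start=None, seed=0):
    """Greedy walk on a proximity graph: repeatedly move to the neighbor
    closest to the query; stop at a local minimum."""
    rng = random.Random(seed)
    current = start if start is not None else rng.choice(list(neighbors))
    while True:
        # Vertex with minimal distance in the current neighborhood.
        candidate = min(neighbors[current], key=lambda v: math.dist(coords[v], query))
        if math.dist(coords[candidate], query) < math.dist(coords[current], query):
            current = candidate  # move closer to the query
        else:
            return current       # local minimum reached

coords = {0: (0, 0), 1: (1, 0), 2: (2, 0), 3: (3, 0)}
neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(greedy_search(neighbors, coords, (2.9, 0), start=0))  # -> 3
```

Note that on a sparse graph the walk can stop at a local minimum that is not the true nearest neighbor; systems like Metrized Small World mitigate this with long-range links and repeated restarts.
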
`
`
`Variants
`
There are numerous variants of the NNS problem and the two most well-known are the k-nearest
neighbor search and the ε-approximate nearest neighbor search.
`
k-nearest neighbor
`
k-nearest neighbor search identifies the top k nearest neighbors to the query. This technique is
commonly used in predictive analytics to estimate or classify a point based on the consensus of its
neighbors. k-nearest neighbor graphs are graphs in which every point is connected to its k nearest
neighbors.
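A minimal top-k query over a plain list (a sketch; real systems would use one of the index structures above rather than a full scan):

```python
import heapq
import math

def knn(points, query, k):
    """Return the k points closest to the query, nearest first.
    heapq.nsmallest keeps only k candidates while scanning: O(N log k)."""
    return heapq.nsmallest(k, points, key=lambda p: math.dist(p, query))

print(knn([(0, 0), (1, 1), (5, 5), (2, 2)], (0, 0), 2))  # -> [(0, 0), (1, 1)]
```
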
`
Approximate nearest neighbor
`
In some applications it may be acceptable to retrieve a "good guess" of the nearest neighbor. In those
cases, we can use an algorithm which doesn't guarantee to return the actual nearest neighbor in every
case, in return for improved speed or memory savings. Often such an algorithm will find the nearest
neighbor in a majority of cases, but this depends strongly on the dataset being queried.
`
Algorithms that support the approximate nearest neighbor search include locality-sensitive hashing, best
bin first and balanced box-decomposition tree based search.[14]
`
`Nearest neighbor d ista nce ratio
`
Nearest neighbor distance ratio does not apply the threshold on the direct distance from the original point
to the challenger neighbor but on a ratio of it depending on the distance to the previous neighbor. It is
used in CBIR to retrieve pictures through a "query by example" using the similarity between local
features. More generally it is involved in several matching problems.
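A well-known instantiation of this idea is the ratio test used in local-feature matching (e.g. in SIFT matching). The sketch below assumes descriptors are plain numeric tuples and uses an illustrative threshold:

```python
import math

def ratio_test_match(query_desc, candidates, threshold=0.8):
    """Accept a match only if the nearest candidate is sufficiently closer
    than the second-nearest (distance ratio below a threshold)."""
    ranked = sorted(candidates, key=lambda c: math.dist(query_desc, c))
    nearest, second = ranked[0], ranked[1]
    ratio = math.dist(query_desc, nearest) / math.dist(query_desc, second)
    return nearest if ratio < threshold else None

# Distinctive match: the nearest candidate is much closer than the runner-up.
print(ratio_test_match((0.0, 0.0), [(0.1, 0.0), (5.0, 5.0)]))  # -> (0.1, 0.0)
# Ambiguous match: two candidates are nearly equidistant, so it is rejected.
print(ratio_test_match((0.0, 0.0), [(1.0, 0.0), (1.05, 0.0)]))  # -> None
```
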
`
`Fixed-radius ncaT neighbors
`
Fixed-radius near neighbors is the problem where one wants to efficiently find all points given in
Euclidean space within a given fixed distance from a specified point. The data structure should work on
a distance which is fixed however the query point is arbitrary.
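Because the radius is fixed in advance, a classic approach is a uniform grid with cells whose side equals the radius: a query then only needs to inspect its own cell and the adjacent ones. A 2D sketch:

```python
import math
from collections import defaultdict

def build_grid(points, radius):
    """Bucket points into grid cells of side `radius`."""
    grid = defaultdict(list)
    for p in points:
        cell = (math.floor(p[0] / radius), math.floor(p[1] / radius))
        grid[cell].append(p)
    return grid

def radius_query(grid, query, radius):
    """Scan only the query's cell and its 8 neighbors."""
    cx, cy = math.floor(query[0] / radius), math.floor(query[1] / radius)
    hits = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for p in grid.get((cx + dx, cy + dy), []):
                if math.dist(p, query) <= radius:
                    hits.append(p)
    return hits

grid = build_grid([(0.5, 0.5), (1.2, 1.2), (9.0, 9.0)], radius=1.0)
print(radius_query(grid, (1.0, 1.0), radius=1.0))
```
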
`
All nearest neighbors
`
For some applications (e.g. entropy estimation), we may have N data-points and wish to know which is
the nearest neighbor for every one of those N points. This could of course be achieved by running a
nearest-neighbor search once for every point, but an improved strategy would be an algorithm that
exploits the information redundancy between these N queries to produce a more efficient search. As a
simple example: when we find the distance from point X to point Y, that also tells us the distance from
point Y to point X, so the same calculation can be reused in two different queries.
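The symmetry reuse described above can be sketched as a brute-force all-NN pass that evaluates each pairwise distance once and credits it to both endpoints:

```python
import math

def all_nearest_neighbors(points):
    """Brute-force all-NN: each pairwise distance is computed once and
    reused for both endpoints (distance is symmetric)."""
    n = len(points)
    best = [(math.inf, None)] * n  # (best distance so far, neighbor index)
    for i in range(n):
        for j in range(i + 1, n):
            d = math.dist(points[i], points[j])
            if d < best[i][0]:
                best[i] = (d, j)   # d serves query i ...
            if d < best[j][0]:
                best[j] = (d, i)   # ... and query j, with no recomputation
    return [j for _, j in best]

print(all_nearest_neighbors([(0, 0), (1, 0), (10, 0)]))  # -> [1, 0, 1]
```
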
`
`
Given a fixed dimension, a semi-definite positive norm (thereby including every Lp norm), and n points
in this space, the nearest neighbour of every point can be found in O(n log n) time and the m nearest
neighbours of every point can be found in O(mn log n) time.[15][16]
`
`See also
`
• Ball tree
• Closest pair of points problem
• Cluster analysis
• Content-based image retrieval
• Curse of dimensionality
• Digital signal processing
• Dimension reduction
• Fixed-radius near neighbors
• Fourier analysis
• Instance-based learning
• k-nearest neighbor algorithm
• Linear least squares
• Locality sensitive hashing
`
• MinHash
• Multidimensional analysis
• Nearest-neighbor interpolation
• Neighbor joining
• Principal component analysis
• Range search
• Set cover problem
• Singular value decomposition
• Sparse distributed memory
• Statistical distance
• Time series
• Voronoi diagram
• Wavelet

Notes
`
1. Cayton, Lawrence (2008). "Fast nearest neighbor retrieval for Bregman divergences". Proceedings of the 25th International Conference on Machine Learning: 112-119.
2. Bewley, A.; Upcroft, B. (2013). "Advantages of Exploiting Projection Structure for Segmenting Dense 3D Point Clouds". Australian Conference on Robotics and Automation.
3. Weber, Schek, Blott. "A quantitative analysis and performance study for similarity search methods in high dimensional spaces" (http://www.vldb.org/conf/1998/p194.pdf) (PDF).
4. Andrew Moore. "An introductory tutorial on KD trees" (PDF).
5. Lee, D. T.; Wong, C. K. (1977). "Worst-case analysis for region and partial region searches in multidimensional binary search trees and balanced quad trees". Acta Informatica 9 (1): 23-29. doi:10.1007/BF00263763.
6. Roussopoulos, N.; Kelley, S.; Vincent, F. D. R. (1995). "Nearest neighbor queries". Proceedings of the 1995 ACM SIGMOD International Conference on Management of Data, SIGMOD '95, p. 71. doi:10.1145/223784.223794. ISBN 0897917316.
7. A. Rajaraman and J. Ullman (2010). "Mining of Massive Datasets, Ch. 3" (http://infolab.stanford.edu/~ullman/mmds.html).
8. Weber, Blott. "An Approximation-Based Data Structure for Similarity Search".
9. Ramaswamy, Rose, ICIP 2007. "Adaptive cluster-distance bounding for similarity search in image databases".
10. Ramaswamy, Rose, TKDE 2010. "Adaptive cluster-distance bounding for high-dimensional indexing".
11. Beaumont, Olivier; Kermarrec, Anne-Marie; Marchal, Loris; Rivière, Etienne (2006). "VoroNet: A scalable object network based on Voronoi tessellations". INRIA RR-5833 (1): 23-29.
`
`
12. Beaumont, Olivier; Kermarrec, Anne-Marie; Rivière, Etienne (2007). "Peer to Peer Multidimensional Overlays: Approximating Complex Structures". Principles of Distributed Systems 4878: 315-328. doi:10.1007/978-3-540-77096-1_23. ISBN 978-3-540-77095-4.
13. Malkov, Yury; Ponomarenko, Alexander; Krylov, Vladimir; Logvinov, Andrey (2014). "Approximate nearest neighbor algorithm based on navigable small world graphs". Information Systems 45: 61-68. doi:10.1016/j.is.2013.10.006.
14. S. Arya, D. M. Mount, N. S. Netanyahu, R. Silverman and A. Wu. "An optimal algorithm for approximate nearest neighbor searching". Journal of the ACM, 45(6):891-923, 1998.
15. Clarkson, Kenneth L. (1983). "Fast algorithms for the all nearest neighbors problem". 24th IEEE Symp. Foundations of Computer Science (FOCS '83), pp. 226-232. doi:10.1109/SFCS.1983.16. ISBN 0-8186-0508-1.
16. Vaidya, P. M. (1989). "An O(n log n) Algorithm for the All-Nearest-Neighbors Problem". Discrete and Computational Geometry 4 (1): 101-115. doi:10.1007/BF02187718.
`
`References
`
• Andrews, L. A template for the nearest neighbor problem. C/C++ Users Journal, vol. 19, no. 11 (November 2001), 40-49, 2001, ISSN: 1075-2838, www.ddj.com/architect/184401449 (http://www.ddj.com/architect/184401449)
• Arya, S., D. M. Mount, N. S. Netanyahu, R. Silverman, and A. Y. Wu. An Optimal Algorithm for Approximate Nearest Neighbor Searching in Fixed Dimensions. Journal of the ACM, vol. 45, no. 6, pp. 891-923
• Beyer, K., Goldstein, J., Ramakrishnan, R., and Shaft, U. 1999. When is nearest neighbor meaningful? In Proceedings of the 7th ICDT, Jerusalem, Israel.
• Chung-Min Chen and Yibei Ling - A Sampling-Based Estimator for Top-k Query. ICDE 2002: 617-627
• Samet, H. 2006. Foundations of Multidimensional and Metric Data Structures. Morgan Kaufmann. ISBN 0-12-369446-9
• Zezula, P., Amato, G., Dohnal, V., and Batko, M. Similarity Search - The Metric Space Approach. Springer, 2006. ISBN 0-387-29146-6
`
`Further reading
`
• Shasha, Dennis (2004). High Performance Discovery in Time Series. Berlin: Springer. ISBN 0-387-00857-8.
`
`External links
`
• Nearest Neighbors and Similarity Search (http://simsearch.yury.name/tutorial.html) - a website dedicated to educational materials, software, literature, researchers, open problems and events related to NN searching. Maintained by Yury Lifshits

Wikimedia Commons has media related to Nearest neighbours search.
`
`
• Similarity Search Wiki (http://sswiki.tierra-aoi.net) - a collection of links, people, ideas, keywords, papers, slides, code and data sets on nearest neighbours
• dD Spatial Searching (http://www.cgal.org/Pkg/SpatialSearchingD) in CGAL - the Computational Geometry Algorithms Library
• Non-Metric Space Library (https://github.com/searchivarius/NonMetricSpaceLib) - An open source similarity search library containing realisations of various nearest neighbor search methods.
`
Retrieved from "https://en.wikipedia.org/w/index.php?title=Nearest_neighbor_search&oldid=679970516"
Categories: Approximation algorithms | Classification algorithms | Data mining | Discrete geometry | Geometric algorithms | Machine learning | Numerical analysis | Mathematical optimization | Search algorithms
`
• This page was last modified on 1 September 2015, at 22:59.
• Text is available under the Creative Commons Attribution-ShareAlike License; additional terms may apply. By using this site, you agree to the Terms of Use and Privacy Policy. Wikipedia® is a registered trademark of the Wikimedia Foundation, Inc., a non-profit organization.
`