UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS
P.O. Box 1450
Alexandria, Virginia 22313-1450

APPLICATION NO.: 17/302,699
FILING DATE: 05/11/2021
FIRST NAMED INVENTOR: Nikolas Anthony Kelly
ATTORNEY DOCKET NO.: 4959.0020000
CONFIRMATION NO.: 7574

STERNE, KESSLER, GOLDSTEIN & FOX P.L.L.C.
1101 K Street, NW
10th Floor
WASHINGTON, DC 20005

EXAMINER: SAINT-VIL, EDDY
ART UNIT: 3715
NOTIFICATION DATE: 04/18/2024
DELIVERY MODE: ELECTRONIC

Please find below and/or attached an Office communication concerning this application or proceeding.

The time period for reply, if any, is set in the attached communication.

Notice of the Office communication was sent electronically on the above-indicated "Notification Date" to the following e-mail address(es):
e-office@sterneKessler.com

PTOL-90A (Rev. 04/07)

Disposition of Claims*
5) [X] Claim(s) 19-38 is/are pending in the application.
5a) Of the above claim(s) ___ is/are withdrawn from consideration.
6) [ ] Claim(s) ___ is/are allowed.
7) [X] Claim(s) 19-38 is/are rejected.
8) [ ] Claim(s) ___ is/are objected to.
9) [ ] Claim(s) ___ is/are subject to restriction and/or election requirement.
* If any claims have been determined allowable, you may be eligible to benefit from the Patent Prosecution Highway program at a participating intellectual property office for the corresponding application. For more information, please see http://www.uspto.gov/patents/init_events/pph/index.jsp or send an inquiry to PPHfeedback@uspto.gov.

Application Papers
10) [ ] The specification is objected to by the Examiner.
11) [X] The drawing(s) filed on 03/16/2022 is/are: a) [X] accepted or b) [ ] objected to by the Examiner.
Applicant may not request that any objection to the drawing(s) be held in abeyance. See 37 CFR 1.85(a).
Replacement drawing sheet(s) including the correction is required if the drawing(s) is objected to. See 37 CFR 1.121(d).

Priority under 35 U.S.C. § 119
12) [ ] Acknowledgment is made of a claim for foreign priority under 35 U.S.C. § 119(a)-(d) or (f).
Certified copies:
a) [ ] All   b) [ ] Some**   c) [ ] None of the:
1. [ ] Certified copies of the priority documents have been received.
2. [ ] Certified copies of the priority documents have been received in Application No. ___.
3. [ ] Copies of the certified copies of the priority documents have been received in this National Stage application from the International Bureau (PCT Rule 17.2(a)).
** See the attached detailed Office action for a list of the certified copies not received.

Attachment(s)
1) [X] Notice of References Cited (PTO-892)
2) [ ] Information Disclosure Statement(s) (PTO/SB/08a and/or PTO/SB/08b), Paper No(s)/Mail Date ___
3) [ ] Interview Summary (PTO-413), Paper No(s)/Mail Date ___
4) [ ] Other: ___

U.S. Patent and Trademark Office
PTOL-326 (Rev. 11-13)
Office Action Summary
Part of Paper No./Mail Date 20240409

Office Action Summary

Application No.: 17/302,699
Applicant(s): Kelly, Nikolas Anthony
Examiner: EDDY SAINT-VIL
Art Unit: 3715
AIA (FITF) Status: Yes

-- The MAILING DATE of this communication appears on the cover sheet with the correspondence address --

Period for Reply

A SHORTENED STATUTORY PERIOD FOR REPLY IS SET TO EXPIRE 3 MONTHS FROM THE MAILING DATE OF THIS COMMUNICATION.
- Extensions of time may be available under the provisions of 37 CFR 1.136(a). In no event, however, may a reply be timely filed after SIX (6) MONTHS from the mailing date of this communication.
- If NO period for reply is specified above, the maximum statutory period will apply and will expire SIX (6) MONTHS from the mailing date of this communication.
- Failure to reply within the set or extended period for reply will, by statute, cause the application to become ABANDONED (35 U.S.C. § 133).
- Any reply received by the Office later than three months after the mailing date of this communication, even if timely filed, may reduce any earned patent term adjustment. See 37 CFR 1.704(b).

Status

1) [X] Responsive to communication(s) filed on 03/16/2022 and 09/20/2022.
   [ ] A declaration(s)/affidavit(s) under 37 CFR 1.130(b) was/were filed on ___.
2a) [ ] This action is FINAL.   2b) [X] This action is non-final.
3) [ ] An election was made by the applicant in response to a restriction requirement set forth during the interview on ___; the restriction requirement and election have been incorporated into this action.
4) [ ] Since this application is in condition for allowance except for formal matters, prosecution as to the merits is closed in accordance with the practice under Ex parte Quayle, 1935 C.D. 11, 453 O.G. 213.

Application/Control Number: 17/302,699
Art Unit: 3715
Page 2

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Application Status

1. The present Office action is in response to the preliminary amendment filed 09/20/2022. Claims 1-18 are cancelled. Claims 19-38 are added and are currently pending in the application.

Double Patenting

2. The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the "right to exclude" granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory obviousness-type double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); and In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on a nonstatutory double patenting ground provided the conflicting application or patent either is shown to be commonly owned with this application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement.

Effective January 1, 1994, a registered attorney or agent of record may sign a terminal disclaimer. A terminal disclaimer signed by the assignee must fully comply with 37 CFR 3.73(b).

3. Claims 19-38 are rejected on the ground of nonstatutory obviousness-type double patenting as being obvious over claims 13 and 19 of copending US Application No. 18/216,917 in view of MAHMOUD (US 20190050637 A1) (MAHMOUD).

Copending US Application No. 18/216,917:

Claim 13: A system, comprising: a camera configured to capture an image; a computing device coupled to the camera, the computing device comprising: a display; a processor; and a memory, wherein the memory contains instructions stored thereon that when executed by the processor cause the processor to: translate sign language in the image to a target language output, comprising: capturing the image; detecting pose information from the image; converting the pose information into a feature vector; converting the feature vector into a sign language string, (Claim 20) wherein to convert the feature vector, the processor: splits the feature vector into individual regions; and processes the individual regions into a sign language string; and translating the sign language string into the target language output; and present a sign language translation of a target language input on the display, comprising: receiving the target language input; translating the target language input to sign language grammar; retrieving phonetic representations that correspond to the sign language grammar; generating coordinates from the phonetic representations using a generative network; rendering an avatar or video that moves between the coordinates; and presenting the avatar or video on the display.

Claim 19: The system of claim 13, wherein to convert the feature vector, the processor applies a Convolutional Neural Network configured to output one or more flag values associated with an intrasign region, an intersign region, or a non-signing region, and wherein the one or more flag values correspond to an individual sign.

US Application No. 17/302,699:

Claim 33: A device, comprising: a camera configured to capture an image; a computing device coupled to the camera, the computing device comprising: a processor; and a memory, wherein the memory contains instructions stored thereon that when executed by the processor cause the processor to: detect hand pose information from the image; convert the hand pose information into a flattened feature vector; normalize the flattened feature vector into a resultant feature vector; apply a convolutional neural network (CNN) to split the resultant feature vector into individual regions; process the individual regions into a sign language string based on language information in one or more databases; and translate the sign language string into a target language output.

Claims 22, 28 and 34: wherein to apply the CNN, the processor is configured to: output one or more flag values associated with an intrasign region, an intersign region, or a non-signing region, wherein the one or more flag values correspond to an individual sign.

Copending US Application No. 18/216,917 appears to be silent on, but MAHMOUD, which pertains to automated sign language recognition (¶ 1), teaches or at least suggests, causing the processor to normalize the flattened feature vector into a resultant feature vector (at least ¶ 67: features were experimented with to describe the sign, namely the BoPs, a concatenation of the BoPs for each half of the sign sequence, and the normalized concatenation of the BoPs for each half of the sign sequence; ¶ 68: To form a consistent feature vector, the BoP was normalized by the number of frames in the sign sample; ¶ 96: ... The split histogram of the postures of the sign language video can be normalized to account for a difference in signing speed; ¶ 127: ... normalize the split histogram of the sequence of postures of the sign language video to account for a difference in signing speed). It would have been obvious to one of ordinary skill in the art, before the effective filing date of the invention, to have incorporated the "normalize" feature of MAHMOUD to predictably account for a difference in signing speed and/or form a consistent feature vector (MAHMOUD: ¶¶ 68, 127).
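Purely as an illustrative sketch of the normalization MAHMOUD is cited for (a Bag of Postures built per half of the sign sequence, concatenated, and divided by the frame count, ¶¶ 67-68), the following editorial example shows the idea; the function names and NumPy implementation are assumptions, not code from the reference:

```python
import numpy as np

def bag_of_postures(posture_ids, vocab_size):
    """Histogram counting how often each posture label occurs in a clip."""
    return np.bincount(posture_ids, minlength=vocab_size).astype(float)

def normalized_split_bop(posture_ids, vocab_size):
    """Build a Bag of Postures (BoP) for each half of the sign sequence,
    concatenate the two halves, and divide by the number of frames so
    that clips signed at different speeds yield comparable vectors."""
    n = len(posture_ids)
    halves = (posture_ids[: n // 2], posture_ids[n // 2 :])
    bop = np.concatenate([bag_of_postures(h, vocab_size) for h in halves])
    return bop / n  # normalize by frame count (cf. MAHMOUD ¶ 68)
```

Dividing by the frame count is what makes two recordings of the same sign at different speeds produce (roughly) the same feature vector.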
`
Independent claim 19 is a method, comprising steps generally similar to those of representative claim 33. As a result, claim 19 is rejected similarly to claim 33.

Independent claim 26 is a method, comprising steps generally similar to those of representative claim 33. As a result, claim 26 is rejected similarly to claim 33.

Dependent claims 20-25, 27-32 and 34-38 stand rejected as being dependent upon rejected base claims.

This is a provisional obviousness-type double patenting rejection because the conflicting claims have not in fact been patented.

Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C. 112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:

The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.

4. Claim 20 is rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claim(s) contains subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention. In particular, claim 20 recites, in part, "wherein the image or the sequences of images is a 1, 2, or 3 channel image". However, the originally filed specification does not disclose such information. To remedy the issue, the Examiner suggests pointing to the particular figure(s) and/or paragraph(s) in the originally filed specification that provide the requisite written description support for "a 1, 2, or 3 channel image" in order for the claims to be compliant under § 112(a) and commensurate with the subject matter supported in the specification.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

5. Claims 19-38 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Independent claims 19, 26 and 33 are each directed to a category of invention. In particular, independent claims 19 and 26 are directed to "a method" (i.e., a process) and independent claim 33 is directed to "a device" (i.e., a machine). Referencing independent claim 33 as representative, the claims generally require the following limitations: [L1] ... capture an image — can be performed alternatively as a mental process to the extent that a person can visually capture hand pose information by observation; [L3] detect hand pose information from the image — can be performed alternatively as a mental process to the extent that a person can visually detect hand pose information by observation; [L4] convert the hand pose information into a flattened feature vector — capable of being performed in the human mind or by a human with pencil and paper; [L5] normalize the flattened feature vector into a resultant feature vector — capable of being performed in the human mind or by a human with pencil and paper; [L6] ... split the resultant feature vector into individual regions — capable of being performed in the human mind or by a human with pencil and paper; [L7] process the individual regions into a sign language string based on language information — capable of being performed in the human mind or by a human with pencil and paper; and [L8] translate the sign language string into a target language output — capable of being performed in the human mind or by a human with pencil and paper. The mere nominal recitation of a camera, computing device comprising: a processor; and a memory and automation of the manual process of translating sign language does not take the claim out of the certain methods of organizing human activity and mental processes groupings. Accordingly, the claim recites an abstract idea under Step 2A: Prong 1. (Step 2A — Prong 1: YES).

The computer components, namely a camera, and a computing device comprising: a processor; and a memory, and one or more databases are recited at a high level of generality (see published Specification, i.e., ¶ 3: ... real time captioning, producing translations as the user is signing; ¶ 5: This Sign Language Translation method ... can be used on any device ... operate on any platform enabled with video capturing (e.g. tablets, smartphones, or computers), allowing for seamless communication; ¶ 8: embodiments do not require any specialized hardware besides a camera and wifi connection (and therefore would be suitable to run on any smartphone or camera-enabled device) ...; ¶ 28: In the processing module 202, the feature train is split into each individual sign via the sign-splitting component 209 via a 1D Convolutional Neural Network ... the sentence base 214 (a database of sentences) via K Nearest-Neighbors (KNN) with a Dynamic Time Warping (DTW) distance metric ... the signbase in 213 (a database of individual signs) ...; ¶ 33: ... a 1D Convolutional Neural Network ...; ¶ 36: ... sentence base 410 (a database of sentences) ...) and are, as such, interpreted to be generic computer components that perform generic functions. Apart from the recitation of the generic computer components that perform generic functions, the claim limitations, under the broadest reasonable interpretation, may be performed in the human mind, including observations, evaluations, and judgments.

As noted above, the claims generally recite a camera, and a computing device comprising: a processor; and a memory, and one or more databases, which are interpreted to be generic computer components (see published Specification, i.e., ¶¶ 3, 5, 8, 28, 33, 36). The claims do not include additional elements that either alone or in combination integrate the judicial exception into a practical application. Because the abstract idea is not integrated into a practical application, the claim is directed to the judicial exception. (Step 2A, Prong 2: NO).

As discussed with respect to Step 2A Prong Two, the additional elements in the claim amount to no more than mere instructions to apply the exception using generic computer components. The same analysis applies here in Step 2B, i.e., mere instructions to apply an exception using generic computer components cannot integrate a judicial exception into a practical application at Step 2A or provide an inventive concept in Step 2B. The published Specification (i.e., ¶¶ 3, 5, 8, 28, 33, 36) discloses the additional elements in a manner that indicates that the additional elements are sufficiently well-known that the specification does not need to describe the particulars of such additional elements to satisfy 35 U.S.C. § 112(a). See MPEP 2106.05(d), as modified by the USPTO Berkheimer Memorandum. Hence, the additional elements are generic, "well-understood, routine, conventional" in the field. The use of the additional elements either alone or in combination amounts to no more than mere instructions to apply the judicial exception using generic computer components. Mere instructions to apply an exception using generic computer components cannot provide an inventive concept, and thus the claims are patent ineligible. (Step 2B: NO).

Dependent claims 20-25, 27-32 and 34-38 include all the limitations of the independent claims from which they depend and, as such, recite the same abstract idea(s) noted above for respective independent claims 19, 26 and 33. The dependent claims do not appear to remedy the issues noted above. Therefore, dependent claims 20-25, 27-32 and 34-38 are not drawn to patent eligible subject matter as they are directed to (an) abstract idea(s) without significantly more.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

6. Claims 19-26 and 28-38 are rejected under 35 U.S.C. 103 as being unpatentable over MAHMOUD (US 20190050637 A1) (MAHMOUD).

Re claims 19, 26 and 33:

[Claim 33] MAHMOUD teaches or at least suggests a device, comprising: a camera configured to capture an image; a computing device coupled to the camera, the computing device comprising: a processor; and a memory, wherein the memory contains instructions stored thereon that when executed by the processor (at least ¶ 5: use cameras to record videos of the signer ... Multi-camera systems are used to capture a 3D image of the signer from different views) cause the processor to: detect hand pose information from the image (at least ¶ 12: neural networks was used to detect hand postures. Each sign was represented by a sequence of postures ...); convert the hand pose information into a flattened feature vector (at least ¶ 12: ... Local features are extracted from each frame of the sign video and converted to a Bag of Features (BoFs)); normalize the flattened feature vector into a resultant feature vector (at least ¶ 67: features were experimented with to describe the sign, namely the BoPs, a concatenation of the BoPs for each half of the sign sequence, and the normalized concatenation of the BoPs for each half of the sign sequence; ¶ 68: To form a consistent feature vector, the BoP was normalized by the number of frames in the sign sample; ¶ 96: ... The split histogram of the postures of the sign language video can be normalized to account for a difference in signing speed; ¶ 127: ... normalize the split histogram of the sequence of postures of the sign language video to account for a difference in signing speed); apply a neural network to split the resultant feature vector into individual regions (at least ¶ 7: Pulse-coupled neural network feature generation model for Arabic sign language recognition; ¶ 12: neural networks was used to detect hand postures. Each sign was represented by a sequence of postures).

MAHMOUD is silent on the neural network being a convolutional neural network (CNN). The convolutional neural network (CNN) is one of several known machine learning techniques. Hence, substituting the neural network with the convolutional neural network (CNN) would have been an obvious matter of choice.

MAHMOUD further teaches or at least suggests the memory contains instructions stored thereon that when executed by the processor: process the individual regions into a sign language string based on language information in one or more databases (at least ¶ 50: constructed Arabic sign database table in the sign table database 180, which provides the sequence of images for each associated sign number in the Arabic sign language; ¶ 52: Each sign language database table in the sign table database 180 has a predetermined order of sign number entries from an associated sequence of images); and translate the sign language string into a target language output (at least ¶ 52: ... the first sign number in an English sign table database 180 is the word "house."; claim 1: ... identifying text words that correspond with the sign gesture and presenting the text on a display to ease communication between deaf people and non-deaf people).
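Purely for illustration, the claimed processing chain mapped above (flatten, normalize, split into regions, match regions against a sign database, emit a target-language string) can be sketched as follows. Every name, prototype value, and the nearest-prototype matching step are hypothetical stand-ins; in particular, the claim assigns the splitting step to a CNN, which is replaced here by a fixed split:

```python
import numpy as np

# Toy "sign database": one prototype feature vector per known sign.
SIGN_DB = {
    "HOUSE": np.array([1.0, 0.0, 0.0]),
    "BOOK": np.array([0.0, 1.0, 0.0]),
}

def nearest_sign(region):
    """Match one region against the sign database by Euclidean distance."""
    return min(SIGN_DB, key=lambda s: float(np.linalg.norm(SIGN_DB[s] - region)))

def translate(pose_features):
    # Convert the hand pose information into a flattened feature vector.
    flattened = np.asarray(pose_features, dtype=float).ravel()
    # Normalize the flattened feature vector into a resultant feature vector.
    norm = np.linalg.norm(flattened)
    resultant = flattened / norm if norm else flattened
    # Split the resultant feature vector into individual regions
    # (the claim performs this step with a CNN; a fixed split stands in here).
    regions = np.array_split(resultant, len(resultant) // 3)
    # Process the regions into a sign language string using the database,
    # which here doubles as the "translation" to the target language output.
    return " ".join(nearest_sign(r) for r in regions)
```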
`
[Claim 19] The claim recites a method counterpart to claim 33, comprising steps substantially similar in scope to those of representative claim 33. As a result, claim 19 is rejected similarly to claim 33.

[Claim 26] The claim recites a method counterpart to claim 33, comprising steps substantially similar in scope to those of representative claim 33. As a result, claim 26 is rejected similarly to claim 33.

Re claim 20:

[Claim 20] MAHMOUD appears to be silent on wherein the image or the sequences of images is a 1, 2, or 3 channel image. The Examiner takes official notice that the concept and advantages of multi-channel images or any suitable type of image were old and well-known to one of ordinary skill in the art before the effective filing date of the invention. Hence, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the invention, to have used a 1, 2, or 3 channel image to modify MAHMOUD as claimed to predictably provide alternatives for suitable images.
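For context on the "1, 2, or 3 channel image" language, an image's channel count is simply the depth of its pixel array. A minimal NumPy illustration (editorial, not from the record):

```python
import numpy as np

# A pixel array's channel count is the size of its last axis
# (a 2-D array has a single implicit channel).
gray = np.zeros((480, 640), dtype=np.uint8)           # 1 channel (grayscale)
gray_alpha = np.zeros((480, 640, 2), dtype=np.uint8)  # 2 channels (gray + alpha)
rgb = np.zeros((480, 640, 3), dtype=np.uint8)         # 3 channels (RGB)

def channel_count(img):
    """Return 1 for a 2-D single-channel array, else the last-axis size."""
    return 1 if img.ndim == 2 else img.shape[-1]
```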
`
Re claims 21 and 23:

[Claims 21 and 23] MAHMOUD teaches or at least suggests wherein converting the feature vector comprises: splitting the feature vector into individual regions; and processing the individual regions into a sign language string (at least ¶¶ 67-69: sign sequence was split into two parts, wherein a BoPs was built for each part ...), wherein processing the individual regions comprises determining whether the individual regions are one of a pre-recorded sentence or an individual sign in one or more databases (at least ¶ 11: A K-Nearest Neighbor (KNN) classifier was used by Tharwat et al. (2015); Shanableh et al. (2007); and El-Bendary et al. (2010). They reported an accuracy of 99% on 30 alphabets with no motion, 87% on 23 signs, and 91% on 30 alphabets with no motion; FIG. 1C: a database system with a conversion database; ¶¶ 50-54: ... Each sign language database table in the sign table database 180 has a predetermined order of sign number entries from an associated sequence of images. For example, let us assume the first sign number in an English sign table database 180 is the word "house." ...; ¶¶ 20, 103, 117, 126: ... classify each sign gesture using a K-Nearest Neighbors (K-NN) classifier ...).

Re claims 22, 28 and 34:

[Claims 22, 28 and 34] MAHMOUD teaches or at least suggests wherein converting the feature vector comprises applying a Convolutional Neural Network (CNN) configured to output one or more flag values associated with an intrasign region, an intersign region, or a non-signing region, and wherein the one or more flag values correspond to an individual sign (at least ¶¶ 97, 118, 127, 128: splitting a sequence of postures and generating a split histogram of postures for each split posture, and concatenating the split postures to preserve an order of postures in the sign gesture).
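The flag output described in this limitation amounts to a three-way labeling of regions from which individual signs can be recovered. A minimal sketch (hypothetical names; the record supplies no implementation, and the CNN that would emit the flags is omitted, so the flags are simply given as input):

```python
from enum import Enum

class RegionFlag(Enum):
    INTRASIGN = 0    # region lies inside a single sign
    INTERSIGN = 1    # transition region between two signs
    NON_SIGNING = 2  # region contains no signing

def group_signs(flags):
    """Group runs of consecutive INTRASIGN regions into individual signs,
    treating INTERSIGN and NON_SIGNING flags as sign boundaries."""
    signs, current = [], []
    for index, flag in enumerate(flags):
        if flag is RegionFlag.INTRASIGN:
            current.append(index)
        elif current:
            signs.append(current)
            current = []
    if current:
        signs.append(current)
    return signs
```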
`
Re claims 24, 30 and 36:

[Claims 24, 30 and 36] MAHMOUD discloses applying a binary classifier to determine whether one or more of the individual regions is sign language (at least ¶¶ 20, 96, 103, 104, 112, 126, 130). The binary classification is one of several known classification techniques. Hence, using the binary classifier would have been an obvious matter of choice.

MAHMOUD appears to be silent on applying the binary classifier to determine whether one or more of the individual regions is fingerspelled. The Examiner takes official notice that the concept and advantages of the American Sign Language (ASL) finger-spelling alphabet were old and well-known to one of ordinary skill in the art before the effective filing date of the invention. Hence, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the invention, to have modified MAHMOUD to determine whether one or more of the individual regions is fingerspelled to predictably include a much richer set of hand poses that covers 26 letters plus the digits zero through nine.

Re claims 25, 31 and 37:

[Claims 25, 31 and 37] MAHMOUD teaches or at least suggests comparing the individual regions to signs in one or more databases to generate comparison results; and choosing a sign based on a K Nearest Neighbor function or a Dynamic Time Warping function applied to the comparison results (at least ¶ 11: A K-Nearest Neighbor (KNN) classifier was used by Tharwat et al. (2015); Shanableh et al. (2007); and El-Bendary et al. (2010). They reported an accuracy of 99% on 30 alphabets with no motion, 87% on 23 signs, and 91% on 30 alphabets with no motion; ¶¶ 20, 103, 117, 126: ... classify each sign gesture using a K-Nearest Neighbors (K-NN) classifier ...).

Re claims 29 and 35:

[Claims 29 and 35] MAHMOUD teaches or at least suggests determining whether the individual regions are one of a pre-recorded sentence or an individual sign in the one or more databases by applying a K Nearest Neighbor function or a Dynamic Time Warping function on the individual regions (at least ¶ 11: A K-Nearest Neighbor (KNN) classifier was used by Tharwat et al. (2015); Shanableh et al. (2007); and El-Bendary et al. (2010). They reported an accuracy of 99% on 30 alphabets with no motion, 87% on 23 signs, and 91% on 30 alphabets with no motion; ¶¶ 20, 103, 117, 126: ... classify each sign gesture using a K-Nearest Neighbors (K-NN) classifier ...).
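The K-Nearest Neighbor classification with a Dynamic Time Warping distance referenced in these limitations (and in ¶ 28 of the published specification) can be sketched as follows; this is an editorial illustration with hypothetical names, not code from MAHMOUD or the application:

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic Time Warping distance between two 1-D sequences; the
    warping tolerates differences in length (e.g. signing speed)."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            step = abs(a[i - 1] - b[j - 1])
            cost[i, j] = step + min(cost[i - 1, j],
                                    cost[i, j - 1],
                                    cost[i - 1, j - 1])
    return float(cost[n, m])

def knn_dtw(query, references, k=1):
    """Label a query sequence by majority vote among its k DTW-nearest
    labeled reference sequences ((label, sequence) pairs)."""
    ranked = sorted(references, key=lambda ref: dtw_distance(query, ref[1]))
    votes = [label for label, _ in ranked[:k]]
    return max(set(votes), key=votes.count)
```

Because DTW aligns sequences of unequal length before measuring distance, a sign performed slowly still matches its faster reference recording.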
`
`
`
Re claims 32 and 38:

[Claims 32 and 38] MAHMOUD appears to be silent on outputting a plurality of selectable translations associated with the target language output. The Examiner takes official notice that the concept and advantages of computer selectable options for outputting information were old and well-known to one of ordinary skill in the art before the effective filing date of the invention. Hence, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the invention, to have modified MAHMOUD to output a plurality of selectable translations associated with the target language output to predictably provide the student with a list of possible translations and allow the user to select one of the options from the list.

Examiner's Note

7. No art rejection is currently provided for claim 27.

Conclusion

8. The prior art made of record and not relied upon is listed in the attached PTO Form 892 and is considered pertinent to applicant's disclosure.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to EDDY SAINT-VIL whose telephone number is (571) 272-9845. The examiner can normally be reached Mon-Fri 6:30 AM - 6:00 PM.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

`
`
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, PETER VASAT, can be reached on (571) 270-7625. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/EDDY SAINT-VIL/
Primary Examiner, Art Unit 3715