Title:
METHOD FOR NIPPLE REPLICATION
Document Type and Number:
WIPO Patent Application WO/2023/130067
Kind Code:
A1
Abstract:
A method includes receiving, from a camera of a mobile device, a plurality of scanned images; applying an object detection model of a machine learning engine to the plurality of scanned images to identify a user's nipple within the plurality of scanned images based on a plurality of object feature vectors; generating a scan image output including a plurality of object feature vectors identifying the user's nipple; applying a genetic algorithm to extract a plurality of geometric features from the scan image output, the plurality of geometric features identifying nipple-related height, nipple-related width, nipple-related texture, and nipple-related color; generating a 3D model of the user's nipple based on the plurality of geometric features; and determining, based on the 3D model of the user's nipple, a baby bottle nipple profile corresponding to the user's nipple to allow 3D printing of a custom baby bottle nipple.

Inventors:
ZEEV SHILO BEN (US)
AMIEL HAGAI (US)
Application Number:
PCT/US2022/082618
Publication Date:
July 06, 2023
Filing Date:
December 30, 2022
Assignee:
PROXAMAMA LLC (US)
International Classes:
A61J11/00; B33Y80/00; G06T1/00; G06T15/20
Domestic Patent References:
WO2020219856A1 (2020-10-29)
WO2020200087A1 (2020-10-08)
Foreign References:
US20180276841A1 (2018-09-27)
US20190042871A1 (2019-02-07)
US20170312185A1 (2017-11-02)
US20210342836A1 (2021-11-04)
US20220062111A1 (2022-03-03)
US20230069584A1 (2023-03-02)
Other References:
MATKOWSKI WOJCIECH MICHAL; MATKOWSKI KRZYSZTOF; KONG ADAMS WAI-KIN; LLOYD HALL CORY: "The Nipple-Areola Complex for Criminal Identification", 2019 INTERNATIONAL CONFERENCE ON BIOMETRICS (ICB), IEEE, 4 June 2019 (2019-06-04), pages 1 - 6, XP033709055, DOI: 10.1109/ICB45273.2019.8987341
XU ET AL.: "3D Joints Estimation of the Human Body in Single-Frame Point Cloud", IEEE ACCESS, 30 September 2020 (2020-09-30), pages 178900 - 178908, XP011813042, Retrieved from the Internet [retrieved on 20230315], DOI: 10.1109/ACCESS.2020.3027892
Attorney, Agent or Firm:
BUETTNER, Laurin T. (US)
Claims:
CLAIMS

What is claimed is:

1. A method comprising: receiving, by a processor, from a camera of a mobile device, a plurality of scanned images; applying, by the processor, an object detection model of a machine learning engine to the plurality of scanned images to identify a user’s nipple within the plurality of scanned images based on a plurality of object feature vectors; generating, by the processor, a scan image output comprising a plurality of object feature vectors identifying the user’s nipple; applying, by the processor, a genetic algorithm to extract a plurality of geometric features from the scan image output; wherein the plurality of geometric features identifies: nipple-related height, nipple-related width, nipple-related texture, and nipple-related color; generating, by the processor, a 3D model of the user’s nipple based on the plurality of geometric features; and determining, by the processor, based on the 3D model of the user’s nipple, a baby bottle nipple profile corresponding to the user’s nipple to allow 3D printing of a custom baby bottle nipple; wherein the custom baby bottle nipple is a 3D replication of the user’s nipple.

2. The method of claim 1, further comprising training, by the processor, the machine learning engine to identify the nipple within the plurality of scanned images based at least in part on the plurality of scanned images comprising at least a portion of a nipple of a human.

3. The method of claim 1, wherein the plurality of scanned images is a video comprising at least two image frames.


4. The method of claim 3, wherein the machine learning engine is trained to identify the nipple within each frame of the plurality of scanned images.

5. The method of claim 3, further comprising prompting, by the processor, the user to rescan the nipple if the machine learning engine does not identify a nipple within each frame of the plurality of scanned images.

6. The method of claim 3, further comprising gathering and creating a point cloud, by the processor, by stitching each image frame of the plurality of scanned images together.

7. The method of claim 6, wherein a first genetic process of the genetic algorithm comprises: orienting, by the processor, the point cloud from the scan image with a teat of the user’s nipple in a predetermined direction.

8. The method of claim 7, wherein the predetermined direction is along a positive z axis.

9. The method of claim 7, wherein a second genetic process of the genetic algorithm comprises setting, by the processor, an average normal at a top portion of the point cloud as close to a positive z axis as possible.

10. The method of claim 9, wherein a third genetic process of the genetic algorithm comprises maximizing, by the processor, a height at which the teat exceeds a predetermined cross-sectional diameter.

11. The method of claim 10, wherein the predetermined cross-sectional diameter is 30 mm.

12. The method of claim 8, wherein the genetic algorithm is configured to filter out a normal of each point in the point cloud further and further away from the positive z axis, until there is a clear separation between the teat and a base of the user’s nipple.

13. The method of claim 1, further comprising rebuilding, by the processor, the user’s nipple without any gaps or holes by extracting key contours and measurements from the image scan.

14. The method of claim 1, wherein the plurality of geometric features comprises a bounding box.

Description:
METHOD FOR NIPPLE REPLICATION

Field of Invention

[0001] The present disclosure relates generally to feeding devices for infants. Specifically, the present disclosure relates to methods of customizing feeding device nipples.

Background

[0002] Feeding devices, such as baby bottles, are often used to feed babies from newborns to toddlers for various reasons. Reasons for using a feeding device include, but are not limited to: latching difficulties by the baby, inability for the mother to produce enough milk, feeding by a caregiver or physician other than the mother, inability for the mother to breastfeed for health reasons, weaning of the baby, etc.

Summary of the Invention

[0003] The summary is a high-level overview of various aspects of the invention and introduces some of the concepts that are further detailed in the Detailed Description section below. This summary is not intended to identify key or essential features of the claimed subject matter, nor is it intended to be used in isolation to determine the scope of the claimed subject matter. The subject matter should be understood by reference to the appropriate portions of the entire specification, any or all drawings, and each claim.

[0004] Embodiments of the present disclosure relate to a method including receiving, by a processor, from a camera of a mobile device, a plurality of scanned images. The method also includes applying, by the processor, an object detection model of a machine learning engine to the plurality of scanned images to identify a user’s nipple within the plurality of scanned images based on a plurality of object feature vectors; generating, by the processor, a scan image output comprising a plurality of object feature vectors identifying the user’s nipple. The method also includes applying, by the processor, a genetic algorithm to extract a plurality of geometric features from the scan image output, where the plurality of geometric features identifies: nipple-related height, nipple-related width, nipple-related texture, and nipple-related color. The method also includes generating, by the processor, a 3D model of the user’s nipple based on the plurality of geometric features. The method also includes determining, by the processor, based on the 3D model of the user’s nipple, a baby bottle nipple profile corresponding to the user’s nipple to allow 3D printing of a custom baby bottle nipple, where the custom baby bottle nipple is a 3D replication of the user’s nipple.

[0005] In some embodiments, the method further includes training, by the processor, the machine learning engine to identify the nipple within the plurality of scanned images based at least in part on the plurality of scanned images comprising at least a portion of a nipple of a human.

[0006] In some embodiments, the plurality of scanned images is a video comprising at least two image frames.

[0007] In some embodiments, the machine learning engine is trained to identify the nipple within each frame of the plurality of scanned images.

[0008] In some embodiments, the method further includes prompting, by the processor, the user to rescan the nipple if the machine learning engine does not identify a nipple within each frame of the plurality of scanned images.

[0009] In some embodiments, the method further includes gathering and creating a point cloud, by the processor, by stitching each image frame of the plurality of scanned images together.

[0010] In some embodiments, a first genetic process of the genetic algorithm includes orienting, by the processor, the point cloud from the scan image with a teat of the user’s nipple in a predetermined direction.

[0011] In some embodiments, the predetermined direction is along a positive z axis.

[0012] In some embodiments, a second genetic process of the genetic algorithm comprises setting, by the processor, an average normal at a top portion of the point cloud as close to a positive z axis as possible.

[0013] In some embodiments, a third genetic process of the genetic algorithm includes maximizing, by the processor, a height at which the teat exceeds a predetermined cross-sectional diameter.

[0014] In some embodiments, the predetermined cross-sectional diameter is 30 mm.

[0015] In some embodiments, the genetic algorithm is configured to filter out a normal of each point in the point cloud further and further away from the positive z axis, until there is a clear separation between the teat and a base of the user’s nipple.

[0016] In some embodiments, the method further includes rebuilding, by the processor, the user’s nipple without any gaps or holes by extracting key contours and measurements from the image scan.

[0017] In some embodiments, the plurality of geometric features comprises a bounding box.

Brief Description of the Drawings

[0018] The accompanying drawings are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments, and together with the description serve to explain the principles of the present disclosure.

[0019] FIG. 1 is a block diagram illustrating an operating computer architecture of a nipple scanning system, according to one or more embodiments of the present disclosure.

[0020] FIG. 2 is a flow diagram illustrating a method for scanning and replicating the shape of a mother’s nipple, according to one or more embodiments of the present disclosure.

[0021] FIG. 3 is a schematic diagram depicting an exemplary path for scanning a nipple of a user, according to one or more embodiments of the present disclosure.

[0022] FIG. 4 is a schematic diagram of a plurality of frames of a scan image, according to one or more embodiments of the present disclosure.

[0023] FIG. 5 is an exemplary output scan image with a bounding box, according to one or more embodiments of the present disclosure.

[0024] FIG. 6 is a schematic diagram of a nipple depicting a base diameter and a teat diameter, according to one or more embodiments of the present disclosure.

[0025] FIG. 7A is an isolated nipple point cloud, according to one or more embodiments of the present disclosure.

[0026] FIG. 7B is a nipple point cloud oriented using point cloud normals, according to one or more embodiments of the present disclosure.

[0027] FIG. 7C is a nipple point cloud with point cloud normals filtered to create a gap in the scan that isolates the teat from the breast, according to one or more embodiments of the present disclosure.

[0028] FIG. 7D is a nipple point cloud with predetermined contours combined to rebuild the user’s nipple without any gaps or holes, according to one or more embodiments of the present disclosure.

[0029] FIG. 7E is a marching cubes watertight mesh, according to one or more embodiments of the present disclosure.

[0030] FIG. 8 is a flow diagram illustrating a genetic algorithm method, according to one or more embodiments of the present disclosure.

[0031] FIG. 9 is a three-dimensional point cloud after a first genetic process of a genetic algorithm, according to one or more embodiments of the present disclosure.

[0032] FIG. 10 is a three-dimensional point cloud after a second genetic process of a genetic algorithm, according to one or more embodiments of the present disclosure.

[0033] FIG. 11 is a three-dimensional point cloud after a third genetic process of a genetic algorithm, according to one or more embodiments of the present disclosure.

[0034] FIG. 12 is a three-dimensional point cloud after the genetic algorithm filters out the normals, according to one or more embodiments of the present disclosure.

[0035] FIG. 13 is a three-dimensional point cloud after the genetic algorithm filters out the normals, according to one or more embodiments of the present disclosure.

[0036] FIG. 14 is a set of graphs depicting a roughness algorithm output, according to one or more embodiments of the present disclosure.

[0037] FIG. 15 is a fine scale analysis and a rough scale analysis, according to one or more embodiments of the present disclosure.

[0038] FIG. 16 is a smooth intensity map and a rough intensity map, according to one or more embodiments of the present disclosure.

[0039] FIG. 17 is a custom displacement map that estimates the skin texture roughness across the user’s nipple, according to one or more embodiments of the present disclosure.

Detailed Description

[0040] The present invention can be further explained with reference to the included drawings, wherein like structures are referred to by like numerals throughout the several views. The drawings shown are not necessarily to scale, with emphasis instead generally being placed upon illustrating the principles of the present invention. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a representative basis for teaching one skilled in the art to variously employ the present invention.

[0041] Among those benefits and improvements that have been disclosed, other objects and advantages of this invention can become apparent from the following description taken in conjunction with the accompanying figures. Detailed embodiments of the present invention are disclosed herein; however, it is to be understood that the disclosed embodiments are merely illustrative of the invention that may be embodied in various forms. In addition, each of the examples given in connection with the various embodiments of the present invention is intended to be illustrative, and not restrictive.

[0042] Nipple confusion is a common syndrome in which newborn babies have trouble latching to their mother’s breast after being fed with a baby bottle. In some embodiments, one potential solution to this problem may be to make the baby bottle’s nozzle and mechanisms resemble those of the mother’s nipple. Described herein are methods for scanning and replicating the shape of a mother’s nipple such that the replicated nipple may be integrated with a baby bottle. In some embodiments, an automated mass customization technique is used to replicate the shape of a mother’s nipple and integrate the replicated nipple with a larger baby bottle. Also described herein are platforms for providing enhanced scans of the mother’s breast. In some embodiments, artificial intelligence (AI) is used to optimize the scans provided by the platform. In some embodiments, the methods described herein use a combination of 3D scanning, AI object detection, a novel mesh post-processing procedure and a novel texturizing procedure to create custom nipples for users.

[0043] In some embodiments, as described above, AI is used to produce a scan image of the mother’s breast using object detection. In some embodiments, the AI includes at least one machine learning model, such as a neural network. In some embodiments, the neural network is a convolutional neural network. In some embodiments, the neural network is a deep learning network, a generative adversarial network, a recurrent neural network, a fully connected network, or combinations thereof.

[0044] In some embodiments, to allow users to gather adequate scans for nipple replication easily, the present disclosure provides a scanning method that users can perform in their homes using a structural light sensor often found in the front-facing camera of modern mobile phones. In some embodiments, the scanning method utilizes a mobile phone’s built-in facial recognition system to take multiple scans of a mother’s breast by stitching individual scans of the mother’s breast, taken by the mobile phone, to form a sparse 3D point cloud. In some embodiments, the scanning method also includes recognizing the mother’s breast in 3D space and detecting a frontal image of the mother’s nipple to use for color matching. In some embodiments, the scanning method includes extracting key measurements and contours of the mother’s nipple scan to direct the mother to the most appropriate nipple product for her unique body.

[0045] Throughout the specification, the following terms take the meanings explicitly associated herein, unless the context clearly dictates otherwise. The phrases “in one embodiment” and “in some embodiments” as used herein do not necessarily refer to the same embodiment(s), though it may. Furthermore, the phrases “in another embodiment” and “in some other embodiments” as used herein do not necessarily refer to a different embodiment, although it may. Thus, as described below, various embodiments of the invention may be readily combined, without departing from the scope or spirit of the invention. Further, when a particular feature, structure, or characteristic is described in connection with an implementation, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other implementations whether or not explicitly described herein.

[0046] In addition, as used herein, the term “or” is an inclusive “or” operator, and is equivalent to the term “and/or,” unless the context clearly dictates otherwise. The term “based on” is not exclusive and allows for being based on additional factors not described, unless the context clearly dictates otherwise. In addition, throughout the specification, the meaning of “a,” “an,” and “the” include plural references. The meaning of “in” includes “in” and “on.”

[0047] It is understood that at least one aspect/functionality of various embodiments described herein can be performed in real-time and/or dynamically. As used herein, the term “real-time” is directed to an event/action that can occur instantaneously or almost instantaneously in time when another event/action has occurred. For example, the “real-time processing,” “real-time computation,” and “real-time execution” all pertain to the performance of a computation during the actual time that the related physical process (e.g., a user interacting with an application on a mobile device) occurs, in order that results of the computation can be used in guiding the physical process.

[0048] As used herein, the term “dynamically” means that events and/or actions can be triggered and/or occur without any human intervention. In some embodiments, events and/or actions in accordance with the present invention can be in real-time and/or based on a predetermined periodicity of at least one of: nanosecond, several nanoseconds, millisecond, several milliseconds, second, several seconds, minute, several minutes, hourly, several hours, daily, several days, weekly, monthly, etc.

[0049] In some embodiments, the inventive specially programmed computing systems with associated devices are configured to operate in the distributed network environment, communicating over a suitable data communication network (e.g., the Internet, etc.) and utilizing at least one suitable data communication protocol (e.g., IPX/SPX, X.25, AX.25, AppleTalk™, TCP/IP (e.g., HTTP), etc.). Of note, the embodiments described herein may, of course, be implemented using any appropriate hardware and/or computing software languages. In this regard, those of ordinary skill in the art are well versed in the type of computer hardware that may be used, the type of computer programming techniques that may be used (e.g., object oriented programming), and the type of computer programming languages that may be used (e.g., C++, Objective-C, Swift, Java, Javascript). The aforementioned examples are, of course, illustrative and not restrictive.

[0050] As used herein, the terms “image(s)” and “image data” are used interchangeably to identify data representative of visual content, which includes, but is not limited to, images encoded in various computer formats (e.g., “.jpg”, “.bmp,” etc.), streaming video based on various protocols (e.g., Real-time Streaming Protocol (RTSP), Real-time Transport Protocol (RTP), Real-time Transport Control Protocol (RTCP), etc.), recorded/generated non-streaming video of various formats (e.g., “.mov,” “.mpg,” “.wmv,” “.avi,” “.flv,” etc.), and real-time visual imagery acquired through a camera application on a mobile device.

[0051] As used herein, the term "server" should be understood to refer to a service point which provides processing, database, and communication facilities. By way of example, and not limitation, the term "server" can refer to a single, physical processor with associated communications and data storage and database facilities, or it can refer to a networked or clustered complex of processors and associated network and storage devices, as well as operating software and one or more database systems and application software that support the services provided by the server. Cloud servers are examples.

[0052] The material disclosed herein may be implemented in software or firmware or a combination of them or as instructions stored on a machine-readable medium, which may be read and executed by one or more processors. A machine-readable medium may include any medium and/or mechanism for storing or transmitting information in a form readable by a machine (e.g., a computing device). For example, a machine-readable medium may include read only memory (ROM); random access memory (RAM); magnetic disk storage media; optical storage media; flash memory devices; electrical, optical, acoustical or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.), and others.

[0053] In another form, a non-transitory article, such as a non-transitory computer readable medium, may be used with any of the examples mentioned above or other examples except that it does not include a transitory signal per se. It does include those elements other than a signal per se that may hold data temporarily in a “transitory” fashion such as RAM and so forth.

[0054] Examples of hardware elements may include processors, microprocessors, circuits, circuit elements (e.g., transistors, resistors, capacitors, inductors, and so forth), integrated circuits, application specific integrated circuits (ASIC), programmable logic devices (PLD), digital signal processors (DSP), field programmable gate arrays (FPGA), logic gates, registers, semiconductor devices, chips, microchips, chip sets, and so forth. In some embodiments, the one or more processors may be implemented as Complex Instruction Set Computer (CISC) or Reduced Instruction Set Computer (RISC) processors, x86 instruction set compatible processors, multi-core processors, or any other microprocessor or central processing unit (CPU). In various implementations, the one or more processors may be dual-core processor(s), dual-core mobile processor(s), and so forth.

[0055] Determining whether an embodiment is implemented using hardware elements and/or software elements may vary in accordance with any number of factors, such as desired computational rate, power levels, heat tolerances, processing cycle budget, input data rates, output data rates, memory resources, data bus speeds and other design or performance constraints.

[0056] One or more aspects of at least one embodiment may be implemented by representative instructions stored on a machine-readable medium which represents various logic within the processor, which when read by a machine causes the machine to fabricate logic to perform the techniques described herein. Such representations, known as “IP cores,” may be stored on a tangible, machine-readable medium and supplied to various customers or manufacturing facilities to load into the fabrication machines that actually make the logic or processor.

[0057] FIGS. 1 through 17 illustrate exemplary systems and methods for scanning and replicating the shape of a mother’s nipple such that the replicated nipple may be integrated with a baby bottle. The following embodiments provide technical solutions and technical improvements that overcome technical problems, drawbacks and/or deficiencies in at least one technical field involving the efficiency and accuracy of computing systems utilized in assisting the formation of baby bottle nipples that accurately mimic the nipple of a breastfeeding mother, as described herein. For example, at least one technical difficulty is the efficiency of a computing system in extracting from images, e.g., pixels, useful visual data that can be utilized to replicate a mother’s nipple. As explained in more detail below, the present disclosure provides a technically advantageous computer architecture that improves scan images of a mother’s breast, based at least in part on scan image data of other users (i.e., other breastfeeding mothers), to create a more realistic and lifelike baby bottle nipple that mimics the nipple of the breastfeeding mother, thereby reducing nipple confusion by the baby. In some embodiments, the systems and methods are technologically improved by being programmed with machine-learning modeling to create a 3D scan image. Some embodiments leverage the widespread use of mobile personal communication devices (e.g., smart phones with integrated cameras) to facilitate the inputting of user-generated data to enhance the 3D scan image.

[0058] FIG. 1 is a block diagram of an exemplary nipple scanning system 100, consistent with some embodiments of the present disclosure. The components and arrangements shown in FIG. 1 are not intended to limit the disclosed embodiments, as the components used to implement the disclosed processes and features may vary. In accordance with the disclosed embodiments, the nipple scanning system 100 may include a server 106 in communication with a first user computing device 104 of a first user 102 and a second user computing device 105 of a second user 103 via a network 108.

[0059] Network 108 may be of any suitable type, including individual connections via the internet such as cellular or Wi-Fi networks. In some embodiments, network 108 may connect participating devices using direct connections such as radio-frequency identification (RFID), near-field communication (NFC), Bluetooth™, low-energy Bluetooth™ (BLE), Wi-Fi™, ZigBee™, ambient backscatter communications (ABC) protocols, USB, WAN or LAN. Because the information transmitted may be personal or confidential, security concerns may dictate that one or more of these types of connections be encrypted or otherwise secured. In some embodiments, however, the information being transmitted may be less personal, and therefore the network connections may be selected for convenience over security.

[0060] Server 106 may be associated with a medical practice or other type of practice or entity. For example, server 106 may manage user information. One of ordinary skill will recognize that server 106 may include one or more logically or physically distinct systems.

[0061] In some embodiments, the server 106 may include hardware components such as a processor (not shown), which may execute instructions that may reside in local memory and/or be transmitted remotely. In some embodiments, the processor may include any type of data processing capacity, such as a hardware logic circuit, for example, an application specific integrated circuit (ASIC) and programmable logic, or such as a computing device, for example, a microcomputer or microcontroller that includes a programmable microprocessor.

[0062] In some embodiments, the first user computing device 104 may be associated with the first user 102 who is a breastfeeding mother. In some embodiments, the second user computing device 105 may be associated with the second user 103 who is an entity, such as a medical practice or medical products company. When the first user 102 wishes to generate a baby bottle nipple, the server 106 may prompt the first user 102 to input user information and a scan image via the first user computing device 104.

[0063] In some embodiments, the first user computing device 104 and/or the second user computing device 105 may be a mobile computing device. The first user computing device 104 and/or the second user computing device 105, or mobile user devices, may generally include at least a computer-readable non-transient medium, a processing component, an Input/Output (I/O) subsystem and wireless circuitry. These components may be coupled by one or more communication buses or signal lines. The first user computing device 104 and/or the second user computing device 105 may be any portable electronic device, including a mobile phone, a handheld computer, a tablet computer, a laptop computer, a tablet device, a multifunction device, a portable gaming device, a vehicle display device, or the like, including a combination of two or more of these items. In some embodiments, the mobile user device may be any appropriate device capable of taking still images or video with an equipped front camera. In some embodiments, the first user computing device 104 and/or the second user computing device 105 may be a desktop computer.

[0064] As shown in FIG. 1, in some embodiments, the first user computing device 104 includes a user camera 110. In some embodiments, at least one user image may be captured by the user camera 110 and transmitted via network 108. In some embodiments, the at least one scan image capture may be performed by a nipple scanning application 130 available to all users of the first user computing device 104. In some embodiments, the at least one scan image capture may be performed by a camera application that comes with a mobile first user computing device 104, and the resulting at least one scan image may be uploaded to the nipple scanning application 130.

[0065] In some embodiments, wireless circuitry is used to send and receive information over a wireless link or network to one or more other devices, and includes suitable circuitry such as an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, memory, etc. The wireless circuitry can use various protocols, e.g., as described herein.

[0066] It should be apparent that the architecture described is only one example of an architecture for the first user computing device 104 and/or the second user computing device 105, and that the first user computing device 104 and/or the second user computing device 105 can have more or fewer components than shown, or a different configuration of components. The various components described above can be implemented in hardware, software, or a combination of both hardware and software, including one or more signal processing and/or application specific integrated circuits.

[0067] In some embodiments, the first user computing device 104 may include an application such as the nipple scanning application 130 (or application software) which may include program code (or a set of instructions) that performs various operations (or methods, functions, processes, etc.), as further described herein.

[0068] Examples of software may include software components, programs, applications, computer programs, application programs, system programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, procedures, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof. Determining whether an embodiment is implemented using hardware elements and/or software elements may vary in accordance with any number of factors, such as desired computational rate, power levels, heat tolerances, processing cycle budget, input data rates, output data rates, memory resources, data bus speeds and other design or performance constraints.

[0069] In some embodiments, the nipple scanning application 130 enables the first user 102 to upload a scan image to the server 106. In some embodiments, the nipple scanning application 130 may be an application provided by a medical entity or other entity. In one implementation, the nipple scanning application 130 may be automatically installed onto the first user computing device 104 after being downloaded. In addition, in some embodiments, the nipple scanning application 130 or a component thereof may reside (at least partially) on a remote system (e.g., server 106) with the various components (e.g., front-end components of the nipple scanning application 130) residing on the first user computing device 104. As further described herein, the nipple scanning application 130 and the server 106 may perform operations (or methods, functions, processes, etc.) that may require access to one or more peripherals and/or modules. In the example of FIG. 1, the server 106 includes a scan optimization module 120 and an algorithm module 140, as will be described in further detail below.

[0070] In some embodiments, the scan image 112 may be processed by the scan optimization module 120, which is specifically programmed in accordance with the principles of the present invention with one or more specialized inventive computer algorithms. Further, in some embodiments, the scan optimization module 120 can be in operational communication (e.g., wireless/wired communication) with the server 106 which can be configured to support one or more functionalities of the scan optimization module 120.

[0071] FIG. 2 illustrates a flow diagram of an exemplary method 200 of creating a 3D nipple from a scan image, according to some embodiments of the present disclosure.

[0072] At step 205, once the first user 102 is ready to scan her breast, the first user 102 places the first user computing device 104 approximately 1 to 2 feet underneath the user’s breast 117 with a front camera 110 of the first user computing device 104 framing the nipple 119 she wants to replicate. When ready, she presses record and gradually moves the first user computing device 104 upwards from below her breast to slightly above the nipple, keeping the nipple approximately centered in frame, as depicted in FIG. 3. At step 205, the first user 102 may also enter user information.

[0073] At step 210, in some embodiments, the nipple scanning application 130 gathers and creates a full point cloud. For example, in some embodiments, the scan image 112 may be a video stream including a plurality of frames 114. As shown in FIG. 4, an exemplary video stream captured by the user camera 110 (e.g., a camera of a mobile phone) can be divided into frames 114. In some embodiments, each frame 114 may contain image data in any known color model, including but not limited to: YCrCb, RGB, LAB, etc. In some embodiments, the nipple scanning application 130 creates an RGB image sequence. In some embodiments, the nipple scanning application 130 takes each frame 114 of the scan image 112 and stitches the frames 114 together to create the point cloud. In some embodiments, the scan image 112 (e.g., input video stream) may include any appropriate type of source for video contents. In some embodiments, the contents from the scan image 112 (e.g., the scanning video of FIG. 4) may include both video data and metadata. In some embodiments, the scan image 112 is a scan of the user’s breast, captured by the front camera of the first user computing device 104.
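
By way of illustration only, the following sketch shows how per-frame depth maps could be back-projected and stitched into a single point cloud, in the spirit of [0073]. It assumes each frame comes with a depth map, camera intrinsics (fx, fy, cx, cy) and a 4x4 camera-to-world pose, e.g., from the phone's depth/AR interface; those inputs, and the helper names, are illustrative assumptions rather than part of the disclosure.

```python
# Minimal sketch: back-project per-frame depth maps and stitch them into one
# point cloud. The depth maps, intrinsics and poses are assumed inputs.
import numpy as np

def backproject(depth, fx, fy, cx, cy):
    """Convert a HxW depth map (metres) into an Nx3 array of camera-space points."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.ravel()
    valid = z > 0                              # drop pixels with no depth return
    x = (u.ravel() - cx) * z / fx
    y = (v.ravel() - cy) * z / fy
    return np.stack([x, y, z], axis=1)[valid]

def stitch_frames(depths, poses, fx, fy, cx, cy):
    """Transform every frame's points into a shared world frame and concatenate."""
    clouds = []
    for depth, pose in zip(depths, poses):
        pts = backproject(depth, fx, fy, cx, cy)
        pts_h = np.hstack([pts, np.ones((len(pts), 1))])   # homogeneous coordinates
        clouds.append((pts_h @ pose.T)[:, :3])             # camera -> world
    return np.vstack(clouds)                               # sparse 3D point cloud
```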

[0074] At step 215, the scan optimization module 120 may recognize the nipple complex in the scan image 112. In some embodiments, the scan optimization module 120 may be implemented as an application (or set of instructions) or software/hardware combination configured to perform operations (or methods, functions, processes, etc.) for receiving and processing image data inputs (e.g., without limitation, image(s), video(s), etc.), via the network 108, from the first user computing device 104. The scan optimization module 120 may receive the scan image 112 from the first user 102 and employ a machine learning engine 144 to identify the user’s nipple within the scan image 112. In some embodiments, the machine learning engine 144 may include, e.g., software, hardware and/or a combination thereof. For example, in some embodiments, the machine learning engine 144 may include a processor and a memory, the memory having instructions stored thereon that cause the processor to generate, without limitation, at least one 3D image.

[0075] In some embodiments, the machine learning engine 144 may be configured to utilize a machine learning technique. In some embodiments, the machine learning engine 144 may include one or more of a neural network, such as a feedforward neural network, a radial basis function network, an image classifier, a recurrent neural network, a convolutional network, a generative adversarial network, a fully connected neural network, or some combination thereof, for example. In some embodiments, the machine learning engine 144 may be composed of a single level of linear or non-linear operations or may include multiple levels of non-linear operations. For example, the machine learning engine 144 may include numerous layers and/or hidden layers that perform calculations (e.g., dot products) using various neurons.

[0076] In some embodiments and, optionally, in combination with any embodiment described above or below, an exemplary implementation of a neural network may be executed as follows: i) define the neural network architecture/model, ii) transfer the input data to the exemplary neural network model, iii) train the exemplary model incrementally, iv) determine the accuracy for a specific number of timesteps, v) apply the exemplary trained model to process the newly-received input data, and vi) optionally and in parallel, continue to train the exemplary trained model with a predetermined periodicity.
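
By way of illustration only, the following sketch mirrors the workflow enumerated in [0076] (define a model, transfer input data, train incrementally, check accuracy). The tiny convolutional classifier, the binary nipple-present label and the data loader are illustrative stand-ins, not the architecture or training regime of the disclosed machine learning engine 144.

```python
# Minimal sketch of the [0076] workflow with an illustrative tiny CNN.
import torch
import torch.nn as nn

class TinyDetector(nn.Module):                      # i) define the model
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 2)                # two classes: nipple / no nipple

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

def train_incrementally(model, loader, epochs=5):
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for epoch in range(epochs):
        correct = total = 0
        for images, labels in loader:               # ii) transfer input data
            logits = model(images)
            loss = loss_fn(logits, labels)
            opt.zero_grad()
            loss.backward()                         # iii) train incrementally
            opt.step()
            correct += (logits.argmax(1) == labels).sum().item()
            total += labels.numel()
        print(f"epoch {epoch}: accuracy {correct / total:.3f}")  # iv) check accuracy
```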

[0077] In some embodiments, the scan optimization module 120 may employ object recognition techniques to identify a nipple within a scan image 112. For example, in some embodiments, the scan optimization module 120 may employ an object detection model. In some embodiments, the object detection model may employ the machine learning engine 144 to recognize a nipple in the scan image 112. In some embodiments, the machine learning engine 144 is a convolutional neural network that performs a convolution operation to recognize objects in images. In some embodiments, a deep convolutional neural network (CNN) may be run to retrieve a feature vector, known as the encoder part. In some embodiments, scan image data may be connected with the feature vector and nonlinear convolutional layers are run to identify an object in a scan image.

[0078] In some embodiments, the object detection model of the present disclosure includes a base architecture or series of “layers” it uses to process scan image data and return information about what is in the scan image 112. In some embodiments, the object detection model is trained on a unique dataset to recognize information in an image. In some embodiments, the dataset is gathered from multiple users taking images of their breasts, mostly along a specific path 115, as depicted in FIG. 3, moving from below the breast to just above the nipple. In some embodiments, the machine learning engine 144 of the object detection model is trained, on a set of scan images of previous users from a wide variety of applications that contain a nipple, to detect the nipple location in a two-dimensional image. In some embodiments, the machine learning engine 144 is trained on hundreds of training scan images. In other embodiments, the machine learning engine 144 is trained on thousands of training scan images. In other embodiments, the machine learning engine 144 is trained on tens of thousands of training scan images. In other embodiments, the machine learning engine 144 is trained on hundreds of thousands of training scan images.

[0079] In some embodiments, the machine learning engine 144 may identify a single key frame of the nipple for color, help ensure that the user’s scan image 112 correctly captures the nipple, and isolate the nipple part of the point cloud for a genetic algorithm of the algorithm module 140. In some embodiments, the machine learning engine 144 may output an output scan image 116 with one or more bounding boxes 118 (e.g., defined by a point, a width, and a height), and a class label for each bounding box 118, as depicted in FIG. 5. In some embodiments, the trained machine learning engine 144 may detect at least one group of classes. In some embodiments, the groups of classes may be: 1) a nipple complex including the areola and the teat; 2) only a teat; or 3) a frontal view of the entire nipple (for collecting information on color).

[0080] At step 215, in some embodiments, if the machine learning engine 144 detects during scanning that too many of the scan image frames 114 do not contain a nipple, the application provides an error message to the user and asks the user to re-scan her nipple, starting the method again at step 210.
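
By way of illustration only, a minimal sketch of the re-scan check in [0080] follows. The per-frame detection structure, the confidence threshold and the 80% coverage cutoff are assumptions; the disclosure only states that “too many” frames without a nipple trigger a re-scan.

```python
# Minimal sketch of the re-scan check in [0080]; thresholds are illustrative.
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    label: str        # e.g. "nipple_complex", "teat", "frontal_view"
    score: float
    bbox: tuple       # (x, y, width, height)

def needs_rescan(frame_detections: List[List[Detection]],
                 min_score: float = 0.5,
                 min_coverage: float = 0.8) -> bool:
    """Return True if too few frames contain a confident nipple detection."""
    frames_with_nipple = sum(
        any(d.label == "nipple_complex" and d.score >= min_score for d in dets)
        for dets in frame_detections
    )
    return frames_with_nipple / max(len(frame_detections), 1) < min_coverage
```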

[0081] At step 220, the nipple scanning application 130 may transmit the output scan image 116 with the 3D bounding box 118 around the nipple, the colored .ply point cloud, and the series of color (RGB) images to the server 106 for a novel post-processing and texturizing procedure for nipple geometries.

[0082] While the process for collecting the output scan image is designed to be easy, the resulting output scan image may contain holes and noise due to the lower resolution of the scanner and the natural fluctuations of the scan image. As a result, in step 225, a series of post-processing techniques may be used to fill in the holes within the output scan image 116, while crucially collecting important dimensional information about the nipple itself. In some embodiments, if the post-processing procedure produces an error message, the user may be given information as to why the error occurred and asked to re-scan her nipple.

[0083] Every nipple has two dimensions that are important as they pertain to nipple confusion: a base diameter and a teat diameter. The base diameter is defined as the diameter at which the nipple meets the breast, and the teat diameter is defined as the diameter at which the nipple starts to curve downward (its inflection point), as depicted in FIG. 6. In some embodiments, the post-processing techniques described below replicate the nipple geometry and extract base and teat diameter information for both tailored and customized baby bottle nipple solutions. FIGS. 7A-7E depict each step of the post-processing method applied to an isolated nipple point cloud, depicted in FIG. 7A, according to embodiments of the present disclosure.
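
By way of illustration only, the base and teat diameters of [0083] can be approximated from an oriented point cloud as the lateral extent of a thin horizontal slice. The sketch below assumes the cloud is already oriented teat-up along +z (see the genetic algorithm described later) and that coordinates are in millimetres; the slice thickness and the helper name are illustrative.

```python
# Minimal sketch: approximate a cross-sectional diameter at a given height.
import numpy as np

def slice_diameter(points: np.ndarray, z: float, thickness: float = 1.0) -> float:
    """Approximate the diameter of the cloud at height z (same units as points)."""
    band = points[np.abs(points[:, 2] - z) < thickness / 2]
    if len(band) < 2:
        return 0.0
    xy = band[:, :2]
    # Use the larger of the x and y extents as a simple diameter estimate.
    return float(np.max(xy.max(axis=0) - xy.min(axis=0)))
```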

[0084] At step 225, the algorithm module 140 may use the genetic algorithm to perform various tasks, as will be described in further detail below. In some embodiments, the genetic algorithm may automatically orient the isolated nipple point cloud upwards along the z-axis, as depicted in FIG. 7B. In some embodiments, the algorithm module may also calculate the normals at each point. In some embodiments, the algorithm module may gradually filter out those normals that are too far from the z-axis to create a gap in the scan that isolates the teat from the breast, as depicted in FIG. 7C. The genetic algorithm, in some embodiments, may work in three steps, described below with reference to FIGS. 7A-C and 9-11, and involves collecting information on: 1) nipple height; 2) teat width; 3) nipple base width; 4) large-scale textural roughness; 5) small-scale textural roughness; and 6) nipple color.

[0085] FIG. 8 is a flow chart depicting the genetic algorithm method 300. In some embodiments, at 310, the genetic algorithm may orient the isolated point cloud from the scan image 112 teat up (+z axis) using a series of genetic processes. In some embodiments, the series of genetic processes are meant to minimize the cross-sectional area of the point cloud at the top, minimize the height of the overall scan and minimize the average normal at the highest part of the scan. In some embodiments, the series of genetic processes may randomly mutate parameters through multiple iterations until a local minimum is achieved. In some embodiments, these genetic processes are different from machine learning in that they do not require training. In some embodiments, three genetic processes may be used, in the order described below.
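
By way of illustration only, the following sketch shows the mutate-and-score loop of [0085]: rotation angles about the x and y axes are randomly perturbed and a perturbation is kept only when it lowers the score, until no further improvement is found (a local minimum). The mutation scale, iteration budget and function names are illustrative assumptions, not the disclosed implementation.

```python
# Minimal sketch of a single genetic process: mutate (rx, ry), keep improvements.
import numpy as np

def rotation_xy(rx: float, ry: float) -> np.ndarray:
    """3x3 rotation: first about the x axis by rx, then about the y axis by ry (radians)."""
    cx, sx, cy, sy = np.cos(rx), np.sin(rx), np.cos(ry), np.sin(ry)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    return Ry @ Rx

def genetic_orient(points: np.ndarray, score_fn, iters: int = 2000, step: float = 0.05):
    """Randomly mutate rotation angles and keep improvements; return the best-oriented cloud."""
    rng = np.random.default_rng(0)
    best = np.zeros(2)
    best_score = score_fn(points @ rotation_xy(*best).T)
    for _ in range(iters):
        cand = best + rng.normal(scale=step, size=2)   # random mutation of the parameters
        s = score_fn(points @ rotation_xy(*cand).T)
        if s < best_score:                             # keep only improvements
            best, best_score = cand, s
    return points @ rotation_xy(*best).T, best_score
```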

[0086] In some embodiments, the first genetic process of the genetic algorithm may rotate the point cloud about the x and y axes to minimize both the overall height of the scan and the ratio of the x and y dimensions (Xo and Yo) at the top quarter of the scan to the x and y dimensions of the scan as a whole, as depicted in FIG. 9.

[0087] In some embodiments, the second genetic process of the genetic algorithm may set the average normal at the top portion of the scan as close to the +Z axis as possible, as depicted in FIG. 10.

[0088] In some embodiments, the third genetic process of the genetic algorithm may maximize a height at which the teat exceeds a predetermined cross-sectional diameter (Zc), as depicted in FIG. 11. In some embodiments, the predetermined cross-sectional diameter is 30 mm. In some embodiments, the predetermined cross-sectional diameter is 25 mm to 35 mm; or 27 mm to 35 mm; or 29 mm to 35 mm; or 31 mm to 35 mm; or 33 mm to 35 mm; or 25 mm to 33 mm; or 25 mm to 31 mm; or 25 mm to 29 mm; or 25 mm to 27 mm; or 26 mm to 30 mm; or 29 mm to 30 mm; or 30 mm to 31 mm. In some embodiments, each generation in the genetic process may score itself based on rotations along the x and y axes. In some embodiments, once the smallest possible score is achieved, the genetic algorithm may move on to the next genetic process until the scan is properly oriented. In some embodiments, if the genetic algorithm sees that there are too many holes around the edge of the teat, the genetic algorithm will ask the user to redo the scan image, making sure that the user moves slowly from underneath the breast to above the nipple.
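
By way of illustration only, the following sketches give score functions matching the three genetic processes of [0086]-[0088], written so that a lower score is better and therefore usable with the genetic_orient() loop sketched above. The weighting of the terms, the reuse of slice_diameter() from the sketch following [0083], and the assumption that per-point normals are available (and would be rotated together with the points, a detail omitted here) are all illustrative choices.

```python
# Illustrative score functions for the three genetic processes (lower is better).
import numpy as np

def score_process_1(pts: np.ndarray) -> float:
    """Process 1: small overall height and a narrow top quarter relative to the whole scan."""
    z = pts[:, 2]
    height = float(z.max() - z.min())
    def xy_extent(p):
        return float(np.max(p[:, :2].max(axis=0) - p[:, :2].min(axis=0))) if len(p) else 0.0
    top = pts[z > z.min() + 0.75 * height]
    return height + xy_extent(top) / max(xy_extent(pts), 1e-9)

def score_process_2(pts: np.ndarray, normals: np.ndarray) -> float:
    """Process 2: average normal of the top portion should point along +z."""
    z = pts[:, 2]
    top = z > z.min() + 0.75 * (z.max() - z.min())
    if not np.any(top):
        return 1.0
    mean_n = normals[top].mean(axis=0)
    mean_n /= np.linalg.norm(mean_n) + 1e-9
    return 1.0 - float(mean_n @ np.array([0.0, 0.0, 1.0]))

def score_process_3(pts: np.ndarray, diameter_mm: float = 30.0) -> float:
    """Process 3: maximize the height at which the cross-section still exceeds diameter_mm."""
    z = pts[:, 2]
    heights = np.linspace(z.min(), z.max(), 50)
    wide = [h for h in heights if slice_diameter(pts, h) > diameter_mm]
    return -(max(wide) - z.min()) if wide else 0.0   # negated so minimization maximizes height
```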

[0089] At 320, the algorithm module 140 may use the genetic algorithm to filter out the normal of each point in the point cloud further and further away from the +Z axis, until there is a clear separation between the teat and the nipple base, as depicted in FIG. 11. In some embodiments, the normal of each point in the point cloud may be estimated based on the positions of the point’s nearest neighbors. In some embodiments, the dot product of each normal is taken with the +Z axis ([0, 0, 1]). In some embodiments, the genetic algorithm may then measure the width at the bottom of the teat and the top of the nipple base to get the teat width and the nipple base width, respectively. In some embodiments, the genetic algorithm may then measure the distance along the +Z axis between the base and the top of the nipple scan to get the nipple height, as depicted in FIG. 12. In some embodiments, if the genetic algorithm is unable to separate the nipple base and the teat, the genetic algorithm may indicate that the user should try to stimulate the nipple more. In some embodiments, if the genetic algorithm sees that the top of the scan is wider than a quarter of the width of the bottom of the scan, the genetic algorithm may provide an error and ask the user if she moved during the scan.
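
By way of illustration only, the following sketch estimates per-point normals from nearest neighbours (the smallest principal component of each local neighbourhood) and keeps only points whose normal stays close to +Z, which is one way to realize the dot-product filtering of [0089]. The neighbour count and the idea of tightening the threshold step by step are illustrative choices.

```python
# Minimal sketch of [0089]: estimate normals, then filter by alignment with +Z.
import numpy as np
from scipy.spatial import cKDTree

def estimate_normals(points: np.ndarray, k: int = 16) -> np.ndarray:
    """Normal of each point = smallest principal component of its k-neighbourhood."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    normals = np.empty_like(points)
    for i, nbrs in enumerate(idx):
        nbr_pts = points[nbrs] - points[nbrs].mean(axis=0)
        # Eigenvector with the smallest eigenvalue of the local covariance is the normal.
        _, vecs = np.linalg.eigh(nbr_pts.T @ nbr_pts)
        n = vecs[:, 0]
        normals[i] = n if n[2] >= 0 else -n          # orient consistently towards +Z
    return normals

def filter_towards_z(points: np.ndarray, normals: np.ndarray, min_dot: float) -> np.ndarray:
    """Keep only points whose normal's dot product with +Z exceeds min_dot."""
    return points[normals @ np.array([0.0, 0.0, 1.0]) > min_dot]

# Raising min_dot step by step discards more obliquely oriented points; once a
# visible gap appears, the widths at the bottom of the remaining teat cluster and
# at the top of the base cluster give the teat and base diameters, respectively.
```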

[0090] In some embodiments, at 330, the genetic algorithm may analyze the keyframe image extracted from the scan image 112 at the center of the nipple’s detected bounding box, and the surrounding area of the teat’s bounding box, to get information on texture and color, as depicted in FIG. 13. Specifically, in some embodiments, once the nipple scanning application 130 returns images of the breast during scanning and a front “key frame” image is identified, the trained machine learning engine 144 may identify the teat. In some embodiments, once the teat’s location is identified in a 2D frame, the average color of the pixels around the teat area may be used as the color value for that user.

[0091] At step 230, once the z heights of the teat and base diameters are found, predetermined contours may be combined to rebuild the user’s nipple without any gaps or holes, as depicted in FIG. 7D. In some embodiments, the nipple scanning application 130 may extract key features and measurements from the scan images that are used to replicate the shape and size of the user’s nipple. For example, in some embodiments, the predetermined contours may include: a predetermined contour along the breast, a predetermined contour at the base of the nipple, a predetermined contour at the teat and a predetermined contour around the nipple apex. Finally, in some embodiments, a marching cubes function may be used to create a solid watertight mesh, as depicted in FIG. 7E.
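
By way of illustration only, the following sketch turns the rebuilt point cloud into a closed surface in the spirit of [0091]: the points are voxelized into an occupancy grid, the grid is smoothed, and an iso-surface is extracted with marching cubes. The grid resolution, smoothing sigma and iso-level are illustrative, and scikit-image's marching_cubes stands in for whichever marching cubes implementation the system actually uses.

```python
# Minimal sketch: voxelize the point cloud, smooth, and run marching cubes.
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.measure import marching_cubes

def watertight_mesh(points: np.ndarray, grid: int = 128, sigma: float = 1.5):
    lo, hi = points.min(axis=0), points.max(axis=0)
    # Occupancy grid: mark voxels that contain at least one point.
    occupancy, _ = np.histogramdd(points, bins=grid, range=list(zip(lo, hi)))
    density = gaussian_filter((occupancy > 0).astype(float), sigma=sigma)
    level = 0.5 * float(density.max())               # iso-level relative to the smoothed peak
    verts, faces, normals, _ = marching_cubes(density, level=level)
    # Map vertex coordinates from voxel indices back to the cloud's units.
    verts = lo + verts / (grid - 1) * (hi - lo)
    return verts, faces, normals
```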

[0092] At step 235, the RGB frames provided by the nipple scanning application 130 through the scanning process may be used to generate a texture of the scanned nipple. In some embodiments, the object detection model may determine which frame 114 of the scan image 112 is aligned with the majority of the nipple and areola (frontal view). In some embodiments, the frame 114 may be isolated and used to measure “roughness” across the areola’s and nipple’s surfaces utilizing at least one grayscale image analysis tool. Sample output graphs of a roughness algorithm are depicted in FIG. 14. In some embodiments, at least one grayscale image analysis tool may be used at two different resolutions to determine large-scale roughness values and fine-scale roughness values. Examples of a fine scale analysis and a rough scale analysis are depicted in FIG. 15, according to embodiments of the present disclosure. In some embodiments, these roughness values may be used to generate intensity maps, indicating where roughness/bumpiness is strongest and where it is weakest, based on white and black values, respectively. FIG. 16 depicts an exemplary smooth intensity map and a rough intensity map, according to embodiments of the present disclosure. In some embodiments, the intensity maps may be plugged into a novel procedural texture based on multiple Voronoi algorithms built in Blender® to create a custom displacement map that estimates the skin texture roughness across the user’s nipple, as depicted in FIG. 17.
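
By way of illustration only, the following sketch computes a two-resolution roughness analysis in the spirit of [0092]: the local standard deviation of the grayscale key frame at a fine and a coarse window size, each normalized into an intensity map in which bright means rough and dark means smooth. The window sizes stand in for the unspecified grayscale analysis tool.

```python
# Minimal sketch: fine- and large-scale roughness via local standard deviation.
import numpy as np
from scipy.ndimage import uniform_filter

def local_std(gray: np.ndarray, size: int) -> np.ndarray:
    """Standard deviation of each pixel's size x size neighbourhood."""
    gray = np.asarray(gray, dtype=float)
    mean = uniform_filter(gray, size)
    mean_sq = uniform_filter(gray * gray, size)
    return np.sqrt(np.clip(mean_sq - mean * mean, 0, None))

def roughness_maps(gray: np.ndarray, fine: int = 5, coarse: int = 31):
    """Return (fine-scale, large-scale) intensity maps normalized to [0, 1]."""
    def normalize(m):
        return (m - m.min()) / (m.max() - m.min() + 1e-9)
    return normalize(local_std(gray, fine)), normalize(local_std(gray, coarse))
```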

[0093] At step 240, once the custom geometry with its personalized texture is created, the nipple’s 3D custom shape may be translated into a 3D baby bottle nipple profile that fits with a standard baby bottle. Specifically, the 3D baby bottle nipple, while having the shape of the user’s nipple, may be sized and formatted to fit a standard commercial baby bottle. In some embodiments, the baby bottle nipple profile may include a base portion that may connect with a standard baby bottle via a baby bottle collar. In some embodiments, the baby bottle nipple profile may include a base portion that connects directly to a standard baby bottle.

[0094] At step 245, the custom 3D baby bottle nipple profile is sent to the server 106. The custom 3D nipple profile is accessible on the server 106 by the second user computing device 105 for 3D printing.

[0095] At step 250, a custom baby bottle nipple, based on the custom 3D baby bottle nipple profile, is 3D printed. In some embodiments, the custom baby bottle nipple may be integrated into a standard manufactured baby bottle part. In some embodiments, the result is a baby bottle capped with a 3D printed nipple designed to mimic the geometry and feel of the user’s nipple.

[0096] In some embodiments, the entire process sits in three different “locations.” The initial process of scanning the nipple and identifying its location in 3D space exists on the user’s phone - the first location. The phone application produces a colored scan, eight points defining the nipple’s location and a series of RGB images that are sent to a virtual machine for post-processing and texturing using the Genysis® API - the second location. The Genysis® API then sends the resulting key dimensions, solid watertight printable mesh .stl file, a single color value and any error information to a server for 3D printing and archiving - the third location.

[0097] At least some aspects of the present disclosure will now be described with reference to the following numbered clauses.

[0098] 1. A method including: receiving, by a processor, from a camera of a mobile device, a plurality of scanned images; applying, by the processor, an object detection model of a machine learning engine to the plurality of scanned images to identify a user’s nipple within the plurality of scanned images based on a plurality of object feature vectors; generating, by the processor, a scan image output comprising a plurality of object feature vectors identifying the user’s nipple; applying, by the processor, a genetic algorithm to extract a plurality of geometric features from the scan image output; where the plurality of geometric features identifies: nipple-related height, nipple-related width, nipple-related texture, and nipple-related color; generating, by the processor, a 3D model of the user’s nipple based on the plurality of geometric features; and determining, by the processor, based on the 3D model of the user’s nipple, a baby bottle nipple profile corresponding to the user’s nipple to allow 3D printing of a custom baby bottle nipple; wherein the custom baby bottle nipple is a 3D replication of the user’s nipple.

[0099] 2. The method of clause 1, further including training, by the processor, the machine learning engine to identify the nipple within the plurality of scanned images based at least in part on the plurality of scanned images including at least a portion of a nipple of a human.

[0100] 3. The method of clause 1, where the plurality of scanned images is a video including at least two image frames.

[0101] 4. The method of clause 3, where the machine learning engine is trained to identify the nipple within each frame of the plurality of scanned images.

[0102] 5. The method of clause 3, further including prompting, by the processor, the user to rescan the nipple if the machine learning engine does not identify a nipple within each frame of the plurality of scanned images.

[0103] 6. The method of clause 3, further including gathering and creating a point cloud, by the processor, by stitching each image frame of the plurality of scanned images together.

[0104] 7. The method of clause 6, where a first genetic process of the genetic algorithm includes: orienting, by the processor, the point cloud from the scan image with a teat of the user’s nipple in a predetermined direction.

[0105] 8. The method of clause 7, where the predetermined direction is along a positive z axis.

[0106] 9. The method of clause 7, where a second genetic process of the genetic algorithm comprises setting, by the processor, an average normal at a top portion of the point cloud as close to a positive z axis as possible.

[0107] 10. The method of clause 9, where a third genetic process of the genetic algorithm comprises maximizing, by the processor, a height at which the teat exceeds a predetermined cross-sectional diameter.

[0108] 11. The method of clause 10, where the predetermined cross-sectional diameter is 30 mm.

[0109] 12. The method of clause 8, where the genetic algorithm is configured to filter out a normal of each point in the point cloud further and further away from the positive z axis, until there is a clear separation between the teat and a base of the user’s nipple.

[0110] 13. The method of clause 1, further comprising rebuilding, by the processor, the user’s nipple without any gaps or holes by extracting key contours and measurements from the image scan.

[0111] 14. The method of clause 1, where the plurality of geometric features includes a bounding box.

[0112] While a number of embodiments of the present invention have been described, it is understood that these embodiments are illustrative only, and not restrictive, and that many modifications may become apparent to those of ordinary skill in the art, including that the inventive methodologies, the inventive systems, and the inventive devices described herein can be utilized in any combination with each other. Further still, the various steps may be carried out in any desired order (and any desired steps may be added and/or any desired steps may be eliminated).