

Title:
ELECTRIC VEHICLE CHARGING STATION CAMERA ARRAY
Document Type and Number:
WIPO Patent Application WO/2024/092215
Kind Code:
A1
Abstract:
Systems and methods are provided herein for providing an electric vehicle charging station (EVCS) camera array for determining user information and electric vehicle characteristics. For example, a first camera may be affixed to an upper portion of a housing of an EVCS and a second camera may be affixed to a lower portion of the housing of the EVCS. The cameras may be configured to obtain a video or capture images of an area corresponding to a parking space associated with the EVCS, a parking space next to the parking space of the EVCS, and/or walking paths (e.g., sidewalks) next to the EVCS. The varying positions of the cameras allow the EVCS to capture different angles of the area proximal to the EVCS to better determine user information and/or vehicle characteristics. The EVCS may then provide services based on the determined user information and/or vehicle characteristics.

Inventors:
PAULIN CATHERINE (US)
CRIST BRADFORD (US)
MEYER RAMSEY (US)
SHAH ANANT (US)
STEGE STEPHEN (US)
KLEIN DAVID (US)
Application Number:
PCT/US2023/078052
Publication Date:
May 02, 2024
Filing Date:
October 27, 2023
Assignee:
VOLTA CHARGING LLC (US)
International Classes:
B60L53/65; B60L53/16; B60L53/31; B60L53/62
Attorney, Agent or Firm:
INGERMAN, Jeffrey H. et al. (US)
Claims:
What is claimed is:

1. An electric vehicle charging station comprising: a housing unit; a display connected to the housing unit; a first camera connected to the housing unit; a second camera connected to the housing unit; a connector, connected to the housing unit; and a processor, located inside the housing unit, coupled to the display, the first camera, and the second camera; the processor configured to: receive a first image of an electric vehicle from the first camera; receive a second image of the electric vehicle from the second camera; determine a characteristic of the electric vehicle using the first image and the second image; determine a media item associated with the characteristic of the electric vehicle; and charge the electric vehicle with the connector while displaying the media item on the display.

2. The electric vehicle charging station of claim 1, wherein the first camera is connected to a top portion of the housing unit.

3. The electric vehicle charging station of claim 2, wherein the second camera is connected to a lower portion of the housing unit.

4. The electric vehicle charging station of claim 1, wherein the first camera is a wide-angle camera.

5. The electric vehicle charging station of claim 1, wherein the first camera is a USB camera.

6. An electric vehicle charging station comprising: a housing unit; a display connected to the housing unit; a first camera with a first orientation connected to the housing unit; a second camera with a second orientation connected to the housing unit; a connector, connected to the housing unit; and a processor, located inside the housing unit, coupled to the display, the first camera, and the second camera; the processor configured to: receive a first image of an electric vehicle from the first camera; receive a second image of the electric vehicle from the second camera; determine a characteristic of the electric vehicle using the first image and the second image; and charge the electric vehicle with the connector while using a charging rate, wherein the charging rate is based on the determined characteristic.

7. The electric vehicle charging station of claim 6, wherein the processor is further configured to: determine the characteristic of the electric vehicle using the first image; determine a confidence value of the characteristic of the electric vehicle based on the first image; and determine that the confidence value is below a threshold, wherein the second image is received in response to the confidence value being below the threshold.

8. The electric vehicle charging station of claim 6, wherein the processor is further configured to: determine a media item associated with the characteristic of the electric vehicle; and charge the electric vehicle with the connector using the charging rate while displaying the media item on the display.

9. The electric vehicle charging station of claim 6, wherein the characteristic corresponds to a model of the electric vehicle.

10. The electric vehicle charging station of claim 6, wherein the characteristic corresponds to a profile associated with the electric vehicle.

11. The electric vehicle charging station of claim 6, wherein the charging rate corresponds to an amount of current during a time period.

Description:
ELECTRIC VEHICLE CHARGING STATION CAMERA ARRAY

Background

[0001] The present disclosure relates to computer-implemented techniques for charging electric vehicles, and in particular to techniques for allocating resources to electric vehicles based on information captured by one or more cameras of an electric vehicle charging station.

Summary

[0002] As more consumers transition to electric vehicles, there is an increasing demand for electric vehicle charging stations (EVCSs). These EVCSs usually supply electric energy, either using cables or wirelessly, to the batteries of electric vehicles. For example, a user can connect their electric vehicle via cables of an EVCS and the EVCS supplies electrical current to the user’s electric vehicle. The cables and control systems of the EVCSs can be housed in kiosks located so that a driver of an electric vehicle can park the electric vehicle close to the EVCS and begin the charging process. These kiosks may be placed in areas of convenience, such as in parking lots at shopping centers, in front of commercial buildings, or in other public places. Traditional EVCSs have few, if any, sensors for determining information about an area (e.g., parking space) proximal to the EVCSs. The lack of sensors limits an EVCS’s ability to customize services to a user based on information about the user and/or user vehicle. Accordingly, traditional EVCSs provide the same services (e.g., user experience, charging rate, charging cost, etc.) to each electric vehicle that is connected to the EVCSs without considering additional factors (e.g., user information/electric vehicle characteristics), resulting in a suboptimal user experience.

[0003] Various systems and methods described herein address these problems by providing an EVCS camera array for determining user information and electric vehicle characteristics. In some embodiments, an EVCS comprises a display which can be used to provide media to a user to enhance the user’s charging experience. Consequently, passers-by, in addition to users of the EVCS, may notice media content displayed by the EVCS. The larger display results in a larger EVCS housing and provides room for the EVCS to house multiple sensors (e.g., cameras). Cameras may be affixed to the housing in different locations. For example, a first camera may be affixed to an upper portion of the housing and a second camera may be affixed to a lower portion of the housing. The cameras may be configured to obtain a video or capture images of an area corresponding to a parking space associated with the EVCS, a parking space next to the parking space of the EVCS, and/or walking paths (e.g., sidewalks) next to the EVCS. The cameras may be wide-angle cameras or 360° cameras that are configured to obtain videos or capture images of a large area proximal to the EVCS.

[0004] The varying positions of the cameras allow the EVCS to capture different angles of the area proximal to the EVCS to better determine user information and/or vehicle characteristics. For example, a first camera may have a first orientation relative to an electric vehicle in a parking space corresponding to an EVCS. The EVCS may receive one or more images from the first camera. The EVCS can process the images (e.g., via image recognition) to determine a vehicle characteristic of the electric vehicle. In some embodiments, the EVCS uses a machine learning algorithm trained with images of vehicles with known characteristics. The EVCS may also determine a confidence score related to the determined vehicle characteristic. Many factors (e.g., quality of an image, commonality of the determined information/characteristic, position of the electric vehicle, lighting, etc.) can impact the confidence score. If the confidence score does not exceed a threshold, the EVCS may receive one or more images from a second camera having a second orientation relative to the electric vehicle. The EVCS can use the one or more images received from the second camera to validate the determined vehicle characteristic. For example, the EVCS may process the one or more images received from the second camera and determine a plurality of vehicle characteristics. If one of the plurality of vehicle characteristics determined from the one or more images from the second camera matches the vehicle characteristic determined using the first camera, the EVCS may provide a service based on the determined vehicle characteristic.
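The cross-camera validation described above can be sketched as follows. The threshold value, function name, and data shapes are illustrative assumptions; the disclosure does not fix any of them, and the image-recognition step itself is abstracted away.

```python
CONFIDENCE_THRESHOLD = 0.8  # illustrative value; the disclosure does not fix one


def validate_characteristic(primary, primary_confidence, second_camera_characteristics):
    """Return the vehicle characteristic if it can be trusted, else None.

    primary: characteristic determined from the first camera's image(s)
    primary_confidence: confidence score for that determination
    second_camera_characteristics: set of characteristics determined from the
        second camera's image(s), consulted only when confidence is low
    """
    if primary_confidence >= CONFIDENCE_THRESHOLD:
        return primary
    # Low confidence: cross-check against the second camera's view.
    if primary in second_camera_characteristics:
        return primary
    return None  # could not validate; the caller may retry or fall back
```

For example, `validate_characteristic("model_x", 0.55, {"model_x", "red"})` would accept `"model_x"` because the second camera's detections include it, despite the low first-camera confidence.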

[0005] In another example, if a first confidence score related to the vehicle characteristic determined from the one or more images of the first camera does not exceed a threshold, the EVCS receives one or more images from the second camera. The EVCS can use the one or more images received from the second camera to calculate a second confidence score associated with the vehicle characteristic. If the second confidence score exceeds a threshold, the EVCS may provide a service based on the vehicle characteristic. If the second confidence score does not exceed a threshold, the process may start over. For example, the EVCS may receive new images from the first camera and calculate a third confidence score. This process may continue until the confidence score related to one or more images received from the first camera and/or the second camera exceed a threshold.
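The retry behavior in the paragraph above amounts to alternating between the two cameras until some score clears the threshold. A minimal sketch, with an assumed attempt cap (the disclosure describes an open-ended loop) and each camera reduced to a callable that stands in for "capture image(s) and score the characteristic":

```python
THRESHOLD = 0.8   # illustrative threshold
MAX_ATTEMPTS = 6  # assumed cap, not from the disclosure


def determine_with_retries(first_camera, second_camera):
    """Alternate between the two cameras until a confidence score exceeds
    the threshold; return that score, or None if the cap is reached."""
    cameras = (first_camera, second_camera)
    for attempt in range(MAX_ATTEMPTS):
        score = cameras[attempt % 2]()  # even attempts: first camera
        if score > THRESHOLD:
            return score
    return None
```

A real implementation would return the characteristic alongside its score; only the alternation and threshold logic are shown here.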

[0006] The EVCS can use the information captured by the cameras to more efficiently provide services to the user of the electric vehicle. For example, the EVCS may use the information captured by the cameras to determine an electric vehicle characteristic (e.g., the model) of an electric vehicle parked in a parking space associated with the EVCS. The EVCS may access a database that comprises a plurality of entries associating electric vehicle characteristics with customized services. The database may be stored in the EVCS, a server, or a combination thereof. The database may have entries associating models of electric vehicles with charging rates. For example, a first model may be associated with a first charging rate and a second model may be associated with a second charging rate that is slower than the first charging rate. The charging rates may be based on optimizing the battery life of the respective model. The EVCS may then charge the electric vehicle at the first charging rate because the EVCS determined that the electric vehicle is associated with the first model.
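The model-to-rate lookup above is a simple table query. A sketch with entirely hypothetical entries; in practice the table would live in a database stored on the EVCS, a server, or both, and the rate values would come from battery-optimization data for each model:

```python
# Hypothetical table; model names and amperages are placeholders.
CHARGING_RATE_AMPS = {
    "model_a": 48,  # faster rate suited to model A's battery
    "model_b": 32,  # slower rate that preserves model B's battery life
}
DEFAULT_RATE_AMPS = 16  # conservative fallback for unrecognized models


def charging_rate_for(model: str) -> int:
    """Return the charging rate associated with the determined model."""
    return CHARGING_RATE_AMPS.get(model, DEFAULT_RATE_AMPS)
```

The fallback rate is an added assumption: the disclosure does not say what happens when a vehicle's model cannot be determined, but a deployed station needs some default behavior.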

[0007] In another example, the EVCS may use the information captured by the cameras to determine the make of an electric vehicle parked in a parking space associated with the EVCS. The EVCS may have access to a database with entries associating makes of electric vehicles with charging costs. For example, a first make may be associated with a first charging cost and a second model may be associated with a second charging cost that is more expensive than the first charging cost. The charging cost may be based on a promotional campaign of the manufacturer of the electric vehicle. The EVCS may charge the electric vehicle at the first charging cost because the EVCS determined that the electric vehicle is associated with the first make.

[0008] In another example, the EVCS may use the information captured by the cameras to determine that an electric vehicle has a paper license plate. The EVCS may customize the media displayed for the user based on the paper license plate. For example, because paper license plates usually correspond to newer vehicles, the EVCS may display an advertisement for home charging stations. In another example, the EVCS may use the information captured by the cameras to determine that an electric vehicle has an out-of-state license plate. The EVCS may display customized media (e.g., nearby tourist destinations, hotel advertisements, etc.) in response to determining that the electric vehicle has an out-of-state license plate.
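The license-plate-based media selection above can be sketched as a small decision function. The observation flags and media identifiers are illustrative placeholders, not the disclosure's schema or catalog:

```python
def select_media(paper_plate: bool, out_of_state: bool) -> str:
    """Map license-plate observations to a media item identifier."""
    if paper_plate:
        # Paper plates usually indicate a newly purchased vehicle.
        return "home_charging_station_ad"
    if out_of_state:
        # Out-of-state plates suggest a visitor to the area.
        return "local_tourism_ad"
    return "default_media"
```

The precedence between the two conditions is an assumption; the disclosure treats them as separate examples without ranking them.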

Brief Description of the Drawings

[0009] Other objects and advantages of the disclosure will be apparent upon consideration of the following detailed description, taken in conjunction with the accompanying drawings, and in which:

[0010] FIG. 1 shows an illustrative diagram of a system for determining user information and electric vehicle characteristics using an EVCS camera array, in accordance with some embodiments of the disclosure;

[0011] FIGS. 2A and 2B illustrate EVCS camera arrays used for determining user information and electric vehicle characteristics, in accordance with some embodiments of the disclosure;

[0012] FIG. 3 shows another illustrative diagram of a system for determining user information and electric vehicle characteristics using an EVCS camera array, in accordance with some embodiments of the disclosure;

[0013] FIGS. 4A and 4B show other illustrative diagrams of a system for determining user information and electric vehicle characteristics using an EVCS camera array, in accordance with some embodiments of the disclosure;

[0014] FIG. 5 shows an illustrative block diagram of an EVCS system, in accordance with some embodiments of the disclosure;

[0015] FIG. 6 shows an illustrative block diagram of a user equipment device system, in accordance with some embodiments of the disclosure;

[0016] FIG. 7 shows an illustrative block diagram of a server system, in accordance with some embodiments of the disclosure;

[0017] FIG. 8 is an illustrative flowchart of a process for determining user information and electric vehicle characteristics using an EVCS camera array, in accordance with some embodiments of the disclosure;

[0018] FIG. 9 is another illustrative flowchart of a process for determining user information and electric vehicle characteristics using an EVCS camera array, in accordance with some embodiments of the disclosure; and

[0019] FIG. 10 is another illustrative flowchart of a process for determining user information and electric vehicle characteristics using an EVCS camera array, in accordance with some embodiments of the disclosure.

Detailed Description

[0020] FIG. 1 shows an illustrative diagram of a system 100 for determining user information and electric vehicle characteristics using an EVCS camera array, in accordance with some embodiments of the disclosure. In some embodiments, the EVCS 102 provides an electric charge to the electric vehicle 104 via a wired connection, such as a charging cable, or a wireless connection (e.g., wireless charging). The EVCS 102 may be in communication with the electric vehicle 104 or a user device 108 belonging to a user 106 (e.g., a driver, passenger, owner, renter, or other operator of the electric vehicle 104) who is associated with the electric vehicle 104. In some embodiments, the EVCS 102 communicates with one or more devices or computer systems, such as user device 108 or server 110, respectively, via a network 112.

[0021] In the system 100, there can be more than one EVCS 102, electric vehicle 104, user 106, user device 108, server 110, and/or network 112, but only one of each is shown in FIG. 1 to avoid overcomplicating the drawing. In addition, a user 106 may utilize more than one type of user device 108 and more than one of each type of user device 108. In some embodiments, there may be paths 114a-d between user devices, EVCSs, and/or electric vehicles, so that the devices may communicate directly with each other via short-range point-to-point communications paths, such as USB cables, IEEE 1394 cables, wireless paths (e.g., Bluetooth, infrared, IEEE 802.11x, etc.), or other short-range communication via wired or wireless paths. In an embodiment, the devices may also communicate with each other through an indirect path via a communications network. The communications network may be one or more networks including the internet, a mobile phone network, a mobile voice or data network (e.g., a 4G, 5G, or LTE network), a cable network, a public switched telephone network, or other types of communications networks or combinations of communications networks. In some embodiments, a communications network path comprises one or more communications paths, such as a satellite path, a fiber-optic path, a cable path, a path that supports internet communications (e.g., IPTV), free-space connections (e.g., for broadcast or other wireless signals), or any other suitable wired or wireless communications path or combination of such paths. In some embodiments, a communications network path can be a wireless path. Communication with the devices may be provided by one or more communication paths but is shown as a single path in FIG. 1 to avoid overcomplicating the drawing.

[0022] In some embodiments, the EVCS 102 comprises a display 120 which can be used to provide media to the user 106 to enhance the user’s charging experience. In some embodiments, the size of the display 120 results in the EVCS 102 being larger than other types of EVCSs. The larger EVCS 102 provides room to house a first camera 116 and a second camera 118. In some embodiments, the first camera 116 and/or the second camera 118 are configured to obtain a video or capture images of an area corresponding to a parking space 122 associated with the EVCS 102, a parking space next to the parking space of the EVCS 102, and/or walking paths (e.g., sidewalks) next to the EVCS 102. In some embodiments, the first camera 116 has a first view of the parking space 122 and the second camera 118 has a second view of the parking space 122. The first view may be different from the second view due to the different positions, orientations, and/or camera types of the first camera 116 and the second camera 118.

[0023] In some embodiments, the EVCS 102 receives one or more images from the first camera 116 and processes the images (e.g., via image recognition) to determine a vehicle characteristic of the electric vehicle 104. In some embodiments, the EVCS 102 uses a machine learning algorithm trained with images of vehicles with known characteristics. For example, the first camera 116 may capture a first image showing the electric vehicle 104 in a first orientation relative to the first camera 116. The EVCS 102 may use the shape of the electric vehicle 104 displayed in the first image to determine the make (vehicle characteristic) of the electric vehicle 104. In some embodiments, the EVCS 102 also determines a confidence score related to the determined vehicle characteristic. For example, if the first image is taken late at night without optimal lighting, the EVCS 102 may determine a low confidence score associated with the determined vehicle characteristic.

[0024] In some embodiments, if the confidence score does not exceed a threshold, the EVCS 102 receives one or more images from the second camera 118. The EVCS 102 can use the one or more images from the second camera 118 to validate the vehicle characteristic determined from the first image of the first camera 116. For example, the EVCS 102 may process the one or more images from the second camera 118 and determine a plurality of vehicle characteristics (make, color, model, etc.) related to the electric vehicle 104. If one of the plurality of vehicle characteristics determined from the one or more images from the second camera 118 matches the vehicle characteristic determined using the first image of the first camera 116, the EVCS 102 may determine that the vehicle characteristic determined using the first image of the first camera 116 is valid.

[0025] In another example, the EVCS 102 may process the one or more images from the second camera 118 and calculate a confidence score relating to the vehicle characteristic determined using the first image of the first camera 116. For example, if the vehicle characteristic determined using the first image of the first camera 116 is a first model of electric vehicle, the EVCS 102 may process the one or more images from the second camera 118 to determine a second confidence score related to the first model of electric vehicle being displayed in the one or more images from the second camera 118. If the second confidence score is greater than a second threshold, the EVCS 102 may determine that the vehicle characteristic (e.g., first model of electric vehicle) determined using the first image of the first camera 116 is valid.

[0026] Although vehicle characteristics are described, the same or similar methodology may be used to determine user information. For example, the EVCS 102 may receive one or more images from the first camera 116 and process the images (e.g., via image recognition) to determine user information (e.g., age, gender, clothing, facial expression, etc.) related to the user. In some embodiments, the EVCS 102 uses a machine learning algorithm trained with images of users with known characteristics. For example, the first camera 116 may capture a first image showing the user 106 in a first orientation relative to the first camera 116. The EVCS 102 may use the first image to determine an approximate age of the user 106. In some embodiments, the EVCS 102 also determines a first confidence score related to the determined user information (user age). For example, if the first image is taken late at night without optimal lighting, the EVCS 102 may determine a low confidence score associated with the determined user information. In some embodiments, if the confidence score does not exceed a threshold, the EVCS 102 receives one or more images from the second camera 118. The EVCS 102 can use the one or more images from the second camera 118 to validate the user information determined from the first image of the first camera 116 as described herein.

[0027] In some embodiments, the EVCS 102 provides services to the electric vehicle 104 based on the vehicle characteristic and/or user information determined using the images captured by the first camera 116 and the second camera 118. In some embodiments, the EVCS 102 accesses a database that comprises a plurality of entries associating vehicle characteristics with customized services. The database may be stored in the EVCS 102, the server 110, or a combination thereof. The database may have entries associating models of electric vehicles with charging rates. For example, a first model may be associated with a first charging rate and a second model may be associated with a second charging rate that is slower than the first charging rate. The charging rates may be based on optimizing the battery life of the respective model. The EVCS 102 may charge the electric vehicle 104 at the first charging rate because the EVCS 102 determines that the electric vehicle 104 is associated with the first model.

[0028] In some embodiments, the EVCS 102 provides services to the user 106 based on the vehicle characteristic and/or user information determined using the images captured by the first camera 116 and the second camera 118. For example, the database may have entries associating models of electric vehicles with certain media. A first model may be associated with a first media item and a second model may be associated with a second media item. In some embodiments, the EVCS 102 displays the first media item on the display 120 because the EVCS 102 determines that the electric vehicle 104 is associated with the first model.

[0029] FIG. 2A shows an illustrative diagram of EVCS camera arrays used for determining user information and electric vehicle characteristics, in accordance with some embodiments of the disclosure. In some embodiments, FIG. 2A illustrates the EVCS displayed in FIG. 1. In some embodiments, the EVCS 202 includes a housing 204 (e.g., a body or a chassis) that holds a display 212. In some embodiments, the EVCS 202 comprises more than one display. For example, the EVCS 202 may have a first display 212 and a second display (on the other side of the EVCS 202). In some embodiments, the display 212 is large compared to the housing 204 (e.g., 60% or more of the height of the frame and 80% or more of the width of the frame), allowing the display 212 to function as a billboard, capable of conveying information to passersby. In some embodiments, the one or more displays 212 display messages (e.g., media items) to users of the EVCS 202 (e.g., operators of the electric vehicle) and/or to passersby that are in proximity to the EVCS 202. In some embodiments, the display 212 has a height that is at least three feet and a width that is at least two feet.

[0030] The EVCS 202 further comprises a computer that includes one or more processors and memory. In some embodiments, the memory stores instructions for displaying content on the display 212. In some embodiments, the computer is disposed inside the housing 204. In some embodiments, the computer is mounted on a panel that connects (e.g., mounts) a first display (e.g., display 212) to the housing 204. In some embodiments, the computer includes a near-field communication (NFC) system that is configured to interact with a user’s device (e.g., user device 108 of a user 106 in FIG. 1).

[0031] The EVCS 202 further comprises a charging cable 214 (e.g., connector) configured to connect and provide a charge to an electric vehicle (e.g., electric vehicle 104 of FIG. 1). In some embodiments, the charging cable 214 is an IEC 62196 type-2 connector. In some embodiments, the charging cable 214 is a “gun-type” connector (e.g., a charge gun) that, when not in use, sits in a holder (e.g., a holster). In some embodiments, the housing 204 houses circuitry for charging an electric vehicle. For example, in some embodiments, the housing 204 includes power supply circuitry as well as circuitry for determining a state of a vehicle being charged (e.g., whether the vehicle is connected via the connector, whether the vehicle is charging, whether the vehicle is done charging, etc.). In some embodiments, the EVCS 202 supports ISO 15118, which allows a user to plug their electric vehicle into the EVCS 202 and begin charging without inputting any additional information.

[0032] In some embodiments, the EVCS 202 further comprises a first camera 206 and a second camera 208. In some embodiments, the first camera 206 is affixed to an upper portion of the EVCS 202 and the second camera 208 is affixed to a lower portion of the EVCS 202. In some embodiments, the first camera 206 and/or the second camera 208 are housed within the EVCS 202. For example, the first camera 206 may be housed within the EVCS 202 and capture information through an opening in the EVCS 202. In some embodiments, the first camera 206 and/or the second camera 208 are affixed to the outside of the EVCS 202. In some embodiments, the first camera 206 and/or the second camera 208 are configured to obtain a video or capture images of an area corresponding to a parking space associated with the EVCS 202, a parking space next to the parking space of the EVCS 202, and/or walking paths (e.g., sidewalks) next to the EVCS 202. In some embodiments, the first camera 206 and/or the second camera 208 are wide-angle cameras or 360° cameras that are configured to obtain videos or capture images of a large area proximal to the EVCS 202. In some embodiments, the first camera 206 and/or the second camera 208 may be mounted directly on the housing 204 of the EVCS 202 and may have a physical (e.g., electrical, wired) connection to the EVCS 202 or a computer system associated with the EVCS 202. In some embodiments, the first camera 206 and/or the second camera 208 may be disposed separately from but proximal to the housing 204 of the EVCS 202.

[0033] In some embodiments, the EVCS 202 further comprises additional sensors (not shown). In some embodiments, the additional sensors detect external objects within a region (area) proximal to the EVCS 202. In some embodiments, the one or more sensors are configured to determine a state of the area proximal to the EVCS 202 (e.g., wherein determining the state includes detecting external objects or the lack thereof). In some embodiments, the external objects can be living or nonliving, such as people, kids, animals, vehicles, shopping carts, toys, etc. In some embodiments, the one or more sensors can detect stationary or moving external objects. In some embodiments, the one or more sensors may be one or more image sensors (e.g., first camera 206 and second camera 208), ultrasound sensors, depth sensors, IR cameras, RGB cameras, PIR cameras, heat IR sensors, proximity sensors, radar, tension sensors, NFC sensors, and/or any combination thereof. The one or more sensors may be connected to the EVCS 202 or a computer system associated with the EVCS 202 via wired or wireless connections, such as a Wi-Fi connection or Bluetooth connection.

[0034] FIG. 2B shows another illustrative diagram of EVCS camera arrays used for determining user information and electric vehicle characteristics, in accordance with some embodiments of the disclosure. FIG. 2B displays a second EVCS 252 with a housing 254, a display 260, a connector 262, a first camera 256, and a second camera 258. In some embodiments, the first camera 256 and the second camera 258 are located on the same side of the second EVCS 252. In some embodiments, the second EVCS 252 uses the same or similar methodologies described in FIG. 2A for determining user information and electric vehicle characteristics. In some embodiments, the second EVCS 252 comprises a third camera (not shown) on another side of the EVCS 252 (e.g., near the connector 262).

[0035] FIG. 3 shows another illustrative diagram of a system for determining user information and electric vehicle characteristics using an EVCS camera array, in accordance with some embodiments of the disclosure. In some embodiments, the system of FIG. 3 uses the same or similar methods described in FIGS. 1-2B. FIG. 3 displays a first EVCS 302 and a second EVCS 312. In some embodiments, the first EVCS 302 comprises a first camera 304, a second camera 306, and a display 308 and the second EVCS 312 comprises a third camera 314 and a fourth camera 316.

[0036] In some embodiments, the first camera 304 captures a first plurality of images of an electric vehicle 322 and the second camera 306 captures a second plurality of images of the electric vehicle 322. In some embodiments, the first camera 304 and/or the second camera 306 begin capturing images in response to a charge request. For example, a user may transmit (e.g., via the user device) a charge request to the first EVCS 302 and the first camera 304 and the second camera 306 begin capturing images of the electric vehicle 322 once the first EVCS 302 receives the charge request. In some embodiments, the first camera 304 and/or the second camera 306 begin capturing images in response to the first EVCS 302 detecting the electric vehicle 322 in the first parking space 310. For example, a sensor in communication with the first EVCS 302 may detect the electric vehicle 322 in the first parking space 310 and alert the first EVCS 302. The first EVCS 302 may then request images from the first camera 304 and the second camera 306. In some embodiments, the first camera 304 and/or the second camera 306 are constantly capturing images.

[0037] In some embodiments, the first camera 304 and/or the second camera 306 do not capture images until an electric vehicle (e.g., electric vehicle 322) is detected. For example, the first camera 304 may be capturing images while the second camera 306 is not capturing images. The first camera 304 may capture an image showing the electric vehicle 322. In response to the first camera 304 capturing an image showing the electric vehicle 322, the second camera 306 may begin capturing images of the electric vehicle 322.
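The staged trigger described above, in which the first camera monitors continuously and the second camera is activated only once a vehicle appears, can be sketched as follows. The class and method names, and the detector callable, are illustrative assumptions rather than anything specified in the disclosure:

```python
class CascadedCapture:
    """Activate the second camera only after the first camera sees a vehicle."""

    def __init__(self, vehicle_detector):
        # vehicle_detector: callable taking a frame and returning True
        # when a vehicle is present (e.g., an image-recognition model).
        self.vehicle_detector = vehicle_detector
        self.second_camera_active = False

    def on_first_camera_frame(self, frame) -> bool:
        """Process a frame from the first camera; return whether the
        second camera should now be capturing."""
        if not self.second_camera_active and self.vehicle_detector(frame):
            self.second_camera_active = True  # wake the second camera
        return self.second_camera_active
```

Once triggered, the second camera stays active in this sketch; deactivating it when the vehicle leaves would be a natural extension the paragraph does not spell out.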

[0038] In some embodiments, each camera has a different perspective for capturing images. For example, the first camera 304 may have a first focal point 320a, the second camera 306 may have a second focal point 320b, the third camera 314 may have a third focal point 320c, and the fourth camera 316 may have a fourth focal point 320d. In some embodiments, each camera captures images of the electric vehicle 322 from a different perspective. In some embodiments, the first EVCS 302 uses the images captured from more than one camera to determine user information and/or electric vehicle characteristics. For example, the first EVCS 302 may use the images taken from the first camera 304 to determine that the electric vehicle 322 has a paper license plate 324 (electric vehicle characteristic). In some embodiments, the first EVCS 302 also associates a confidence score with the determined electric vehicle characteristic. For example, the first EVCS 302 may attribute a low confidence score to the paper license plate determination because the paper license plate 324 is a distance away from the first focal point 320a.
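One simple way to realize a distance-dependent confidence score of the kind described above is a score that decays with the feature's distance from the camera's focal point. The linear decay and the distance scale below are illustrative assumptions, not the disclosed method:

```python
import math

def focal_confidence(feature_xy, focal_xy, max_distance=100.0):
    """Confidence in a feature detected at feature_xy, decaying linearly
    with its distance from the camera's focal point, clamped to [0, 1].
    max_distance is an assumed calibration constant."""
    distance = math.dist(feature_xy, focal_xy)
    return max(0.0, 1.0 - distance / max_distance)
```

A license plate far from focal point 320a would thus receive a low score, while the same plate aligned with focal point 320b would score near 1.0.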

[0039] In some embodiments, the first EVCS 302 receives images from the second camera 306 because of the low confidence score associated with the vehicle characteristic determination of the images from the first camera 304. In some embodiments, the first EVCS 302 uses the images from the second camera 306 to confirm the determined vehicle characteristic. For example, the first EVCS 302 may process the images received from the second camera 306 to determine whether the electric vehicle 322 has a paper license plate 324. In some embodiments, the first EVCS 302 assigns a second confidence score to the vehicle characteristic determined using the images from the second camera 306. For example, the first EVCS 302 may attribute a high confidence score to the vehicle characteristic determined using the images from the second camera 306 because the paper license plate 324 is aligned with the second focal point 320b. In some embodiments, the images captured using the second camera 306 have additional information increasing the confidence score related to the determined vehicle characteristic. For example, the images captured using the second camera 306 may comprise license plate letters that the first EVCS 302 can use to determine a profile related to the electric vehicle 322.

[0040] In some embodiments, the first EVCS 302 receives images from the third camera 314 and/or the fourth camera 316 because of the low confidence score associated with the vehicle characteristic determination of the images from the first camera 304 and/or the second camera 306. In some embodiments, the first EVCS 302 receives the images from the second EVCS 312. In some embodiments, the images captured from the third camera 314 and/or the fourth camera 316 are used to validate the vehicle characteristic determined using the images from the first camera 304 and/or the second camera 306.

[0041] In some embodiments, the first EVCS 302 customizes the media that will be displayed on the display 308 based on the determined user information/electric vehicle characteristic. For example, the first EVCS 302 may customize the media displayed for the user based on determining that the electric vehicle 322 has the paper license plate 324. For example, because paper license plates usually correspond to newer vehicles, the first EVCS 302 may display an advertisement for home charging stations. In another example, the first EVCS 302 may use the information captured by the cameras to determine that the electric vehicle 322 has an out-of-state license plate. The first EVCS 302 may display customized media (e.g., nearby tourist destinations, hotel advertisements, etc.) in response to determining that the electric vehicle 322 has an out-of-state license plate.
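The characteristic-to-media customization described above amounts to a lookup with a fallback. The following sketch uses hypothetical characteristic keys and media-item names chosen to mirror the examples in the paragraph (paper plate, out-of-state plate); none are from the disclosure:

```python
# Hypothetical mapping from a determined vehicle characteristic to a media
# item; keys and item names are illustrative only.
MEDIA_BY_CHARACTERISTIC = {
    "paper_license_plate": "home_charging_station_ad",   # newer vehicles
    "out_of_state_plate": "local_tourism_and_hotel_ads",
}

def select_media(characteristics, default="generic_ev_ad"):
    """Return a media item for the first recognized characteristic,
    falling back to a default item when none matches."""
    for characteristic in characteristics:
        if characteristic in MEDIA_BY_CHARACTERISTIC:
            return MEDIA_BY_CHARACTERISTIC[characteristic]
    return default
```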

[0042] FIGS. 4A and 4B show other illustrative diagrams of a system for determining user information and electric vehicle characteristics using an EVCS camera array, in accordance with some embodiments of the disclosure. In some embodiments, FIGS. 4A and 4B use the same or similar methods and devices described in FIGS. 1-3.

[0043] In some embodiments, FIG. 4A displays a first image 402 captured by a first camera and FIG. 4B displays a second image 410 captured by a second camera. In some embodiments, the first camera used to capture the first image 402 and the second camera used to capture the second image 410 are housed within an EVCS. In some embodiments, the first camera used to capture the first image 402 and the second camera used to capture the second image 410 are housed within different EVCSs. The first image 402 displays an electric vehicle 404 in a parking space 406. In some embodiments, the first image 402 is part of video data. In some embodiments, the first camera captures the first image 402 when an EVCS receives a charging request. In some embodiments, the EVCS captures the first image 402 automatically upon detection of the electric vehicle 404.

[0044] In some embodiments, an EVCS receives the first image 402 from the first camera and processes the image (e.g., via image recognition) to determine a vehicle characteristic of the electric vehicle 404. In some embodiments, the EVCS uses a machine learning algorithm trained with images of vehicles with known characteristics. For example, the EVCS may use the side profile of the electric vehicle 404 displayed in the first image 402 to determine the make (vehicle characteristic) of the electric vehicle 404. In some embodiments, the EVCS also determines a first confidence score related to the determined vehicle characteristic.

[0045] In some embodiments, if the first confidence score does not exceed a threshold, the EVCS receives the second image 410 from the second camera. The EVCS can use the second image 410 to validate a vehicle characteristic determined from the first image 402. For example, the EVCS may process the first image 402 and determine that the license plate 408 is a paper type. In some embodiments, the EVCS associates a low confidence score with the determination that the license plate 408 is a paper type due to the information displayed in the first image 402. For example, due to the perspective of the first camera, the first image 402 only shows a portion of the license plate 408. In response to the first confidence score being below a first threshold, the EVCS may receive the second image 410 from the second camera.

[0046] The EVCS may process the second image 410 and calculate a second confidence score relating to the determination that the license plate 408 is a paper type. In some embodiments, the EVCS associates a higher confidence score with the determination that license plate 408 is a paper type due to the information displayed in the second image 410. For example, due to the perspective of the second camera, the second image 410 shows a larger portion of the license plate 408. In some embodiments, the EVCS processes (e.g., via optical character recognition) the characters on the license plate 408 to determine the second confidence score. For example, the EVCS may look up the characters of the license plate 408 in a database to determine a profile associated with the electric vehicle 404 and the profile may indicate if the license plate 408 is a paper type, the make/model of the electric vehicle 404, the date of purchase of the electric vehicle 404, and/or similar such information. The EVCS may use the profile associated with the electric vehicle 404 to determine a high confidence score for the second confidence score.
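The corroboration step described above (OCR the plate characters, look up a profile, and adjust the confidence score according to whether the profile agrees) could be sketched as follows. The database contents, plate string, and score values are illustrative assumptions:

```python
# Hypothetical vehicle-profile database keyed by OCR'd plate characters.
PLATE_PROFILES = {
    "7ABC123": {"plate_type": "paper", "make": "ExampleMake", "purchased": "2023-09"},
}

def corroborated_confidence(ocr_text, claimed_plate_type):
    """Second confidence score for the claim that the plate is of
    claimed_plate_type, boosted when a looked-up profile agrees and
    lowered when the profile contradicts the claim."""
    profile = PLATE_PROFILES.get(ocr_text)
    if profile is None:
        return 0.5   # no profile found: middling confidence
    return 0.95 if profile["plate_type"] == claimed_plate_type else 0.1
```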

[0047] If the second confidence score is greater than a second threshold, the EVCS may determine that the vehicle characteristic (e.g., that the license plate 408 is a paper type) determined using the first image 402 is valid.

[0048] FIG. 5 shows an illustrative block diagram of an EVCS system 500, in accordance with some embodiments of the disclosure. In particular, the EVCS system 500 of FIG. 5 may be any of the EVCSs depicted in FIGS. 1-3. In practice, and as recognized by those of ordinary skill in the art, items shown separately could be combined, and some items could be separated. In some embodiments, not all shown items must be included in the EVCS 500. In some embodiments, the EVCS 500 may comprise additional items.

[0049] The EVCS system 500 can include processing circuitry 502, which includes one or more processing units (processors or cores), storage 504, one or more network or other communications network interfaces 506, additional peripherals 508, one or more sensors 510, a motor 512 (configured to retract a portion of a charging cable), one or more wireless transmitters and/or receivers 514, and one or more input/output (I/O) paths 516. I/O paths 516 may use communication buses for interconnecting the described components. I/O paths 516 can include circuitry (sometimes called a chipset) that interconnects and controls communications between system components. The EVCS 500 may receive content and data via I/O paths 516. The I/O path 516 may provide data to control circuitry 518, which includes processing circuitry 502 and a storage 504. The control circuitry 518 may be used to send and receive commands, requests, and other suitable data using the I/O path 516. The I/O path 516 may connect the control circuitry 518 (and specifically the processing circuitry 502) to one or more communications paths. I/O functions may be provided by one or more of these communications paths but are shown as a single path in FIG. 5 to avoid overcomplicating the drawing.

[0050] The control circuitry 518 may be based on any suitable processing circuitry such as the processing circuitry 502. As referred to herein, processing circuitry should be understood to mean circuitry based on one or more microprocessors, microcontrollers, digital signal processors, programmable logic devices, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), etc., and may include a multi-core processor (e.g., dual-core, quad-core, hexa-core, or any suitable number of cores) or supercomputer. In some embodiments, processing circuitry may be distributed across multiple separate processors or processing units, for example, multiple of the same type of processing units (e.g., two Intel Core i7 processors) or multiple different processors (e.g., an Intel Core i5 processor and an Intel Core i7 processor). The allocation-of-services functionality can be at least partially implemented using the control circuitry 518. The allocation-of-services functionality described herein may be implemented in or supported by any suitable software, hardware, or combination thereof. The allocation of services can be implemented on user equipment, on remote servers, or across both.

[0051] The control circuitry 518 may include communications circuitry suitable for communicating with one or more servers. The instructions for carrying out the above-mentioned functionality may be stored on the one or more servers. Communications circuitry may include a cable modem, an integrated service digital network (ISDN) modem, a digital subscriber line (DSL) modem, a telephone modem, Ethernet card, or a wireless modem for communications with other equipment, or any other suitable communications circuitry. Such communications may involve the internet or any other suitable communications networks or paths. In addition, communications circuitry may include circuitry that enables peer-to-peer communication of user equipment devices, or communication of user equipment devices in locations remote from each other (described in more detail below).

[0052] Memory may be an electronic storage device provided as the storage 504 that is part of the control circuitry 518. As referred to herein, the phrase “storage device” or “memory device” should be understood to mean any device for storing electronic data, computer software, or firmware, such as random-access memory, read-only memory, high-speed random-access memory (e.g., DRAM, SRAM, DDR RAM, or other random-access solid-state memory devices), non-volatile memory, one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, other non-volatile solid-state storage devices, quantum storage devices, and/or any combination of the same. In some embodiments, the storage 504 includes one or more storage devices remotely located, such as a database of a server system that is in communication with the EVCS 500. In some embodiments, the storage 504, or alternatively the non-volatile memory devices within the storage 504, includes a non-transitory computer-readable storage medium.

[0053] In some embodiments, storage 504 or the computer-readable storage medium of the storage 504 stores an operating system, which includes procedures for handling various basic system services and for performing hardware-dependent tasks. In some embodiments, storage 504 or the computer-readable storage medium of the storage 504 stores a communications module, which is used for connecting the EVCS 500 to other computers and devices via the one or more communication network interfaces 506 (wired or wireless), such as the internet, other wide area networks, local area networks, metropolitan area networks, and so on. In some embodiments, storage 504 or the computer-readable storage medium of the storage 504 stores a media item module for selecting and/or displaying media items on the display(s) 520 to be viewed by passersby and users of the EVCS 500. In some embodiments, storage 504 or the computer-readable storage medium of the storage 504 stores an EVCS module for charging an electric vehicle (e.g., measuring how much charge has been delivered to an electric vehicle, commencing charging, ceasing charging, etc.), including a motor control module that includes one or more instructions for energizing or forgoing energizing the motor. In some embodiments, executable modules, applications, or sets of procedures may be stored in one or more of the previously mentioned memory devices and correspond to a set of instructions for performing a function described above. In some embodiments, modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of modules may be combined or otherwise re-arranged in various implementations. In some embodiments, the storage 504 stores a subset of the modules and data structures identified above. In some embodiments, the storage 504 may store additional modules or data structures not described above.

[0054] In some embodiments, the EVCS 500 comprises additional peripherals 508 such as one or more displays 520 for displaying content, and a charging cable 522. In some embodiments, the displays 520 may be touch-sensitive displays that are configured to detect various swipe gestures (e.g., continuous gestures in vertical and/or horizontal directions) and/or other gestures (e.g., single or double taps) or to detect user input via a soft keyboard that is displayed when keyboard entry is needed.

[0055] In some embodiments, the EVCS 500 comprises one or more sensors 510 such as a first camera 524 and a second camera 526 (e.g., the cameras described above with respect to FIGS. 1-3). In some embodiments, the sensor 510 may be an ultrasound sensor, depth sensor, IR camera, RGB camera, PIR camera, heat IR, proximity sensor, radar, tension sensor, NFC sensor, and/or any combination thereof. In some embodiments, the one or more sensors 510 are for detecting whether external objects are within a region proximal to the EVCS 500, such as living and nonliving objects, and/or the status of the EVCS 500 (e.g., available, occupied, etc.) in order to perform an operation, such as determining a vehicle characteristic, user information, region status, appropriate allocation of services, etc.

[0056] FIG. 6 shows an illustrative block diagram of a user equipment device system, in accordance with some embodiments of the disclosure. In practice, and as recognized by those of ordinary skill in the art, items shown separately could be combined and some items could be separated. In some embodiments, not all shown items must be included in device 600. In some embodiments, device 600 may comprise additional items. In an embodiment, the user equipment device 600 is the same user equipment device 108 of FIG. 1. The user equipment device 600 may receive content and data via I/O path 602. The I/O path 602 may provide audio content (e.g., broadcast programming, on-demand programming, internet content, content available over a local area network (LAN) or wide area network (WAN), and/or other content) and data to control circuitry 604, which includes processing circuitry 606 and a storage 608. The control circuitry 604 may be used to send and receive commands, requests, and other suitable data using the I/O path 602. The I/O path 602 may connect the control circuitry 604 (and specifically the processing circuitry 606) to one or more communications paths. I/O functions may be provided by one or more of these communications paths but are shown as a single path in FIG. 6 to avoid overcomplicating the drawing.

[0057] The control circuitry 604 may be based on any suitable processing circuitry such as the processing circuitry 606. As referred to herein, processing circuitry should be understood to mean circuitry based on one or more microprocessors, microcontrollers, digital signal processors, programmable logic devices, FPGAs, ASICs, etc., and may include a multi-core processor (e.g., dual-core, quad-core, hexa-core, or any suitable number of cores) or supercomputer. In some embodiments, processing circuitry may be distributed across multiple separate processors or processing units, for example, multiple of the same type of processing units (e.g., two Intel Core i7 processors) or multiple different processors (e.g., an Intel Core i5 processor and an Intel Core i7 processor).

[0058] In client/server-based embodiments, the control circuitry 604 may include communications circuitry suitable for communicating with one or more servers that may at least implement the described allocation of services functionality. The instructions for carrying out the above-mentioned functionality may be stored on the one or more servers. Communications circuitry may include a cable modem, an ISDN modem, a DSL modem, a telephone modem, Ethernet card, or a wireless modem for communications with other equipment, or any other suitable communications circuitry. Such communications may involve the internet or any other suitable communications networks or paths. In addition, communications circuitry may include circuitry that enables peer-to-peer communication of user equipment devices, or communication of user equipment devices in locations remote from each other (described in more detail below).

[0059] Memory may be an electronic storage device provided as the storage 608 that is part of the control circuitry 604. Storage 608 may include random-access memory, read-only memory, hard drives, optical drives, digital video disc (DVD) recorders, compact disc (CD) recorders, BLU-RAY disc (BD) recorders, BLU-RAY 3D disc recorders, digital video recorders (DVRs, sometimes called personal video recorders, or PVRs), solid-state devices, quantum storage devices, gaming consoles, gaming media, or any other suitable fixed or removable storage devices, and/or any combination of the same. The storage 608 may be used to store various types of content described herein. Nonvolatile memory may also be used (e.g., to launch a boot-up routine and other instructions). Cloud-based storage may be used to supplement the storage 608 or instead of the storage 608.

[0060] The control circuitry 604 may include audio generating circuitry and tuning circuitry, such as one or more analog tuners, audio generation circuitry, filters, or any other suitable tuning or audio circuits or combinations of such circuits. The control circuitry 604 may also include scaler circuitry for upconverting and down-converting content into the preferred output format of the user equipment device 600. The control circuitry 604 may also include digital-to-analog converter circuitry and analog-to-digital converter circuitry for converting between digital and analog signals. The tuning and encoding circuitry may be used by the user equipment device 600 to receive and to display, play, or record content. The circuitry described herein, including, for example, the tuning, audio generating, encoding, decoding, encrypting, decrypting, scaler, and analog/digital circuitry, may be implemented using software running on one or more general purpose or specialized processors. If the storage 608 is provided as a separate device from the user equipment device 600, the tuning and encoding circuitry (including multiple tuners) may be associated with the storage 608.

[0061] The user may utter instructions to the control circuitry 604, which are received by the microphone 616. The microphone 616 may be any microphone (or microphones) capable of detecting human speech. The microphone 616 is connected to the processing circuitry 606 to transmit detected voice commands and other speech thereto for processing. In some embodiments, voice assistants (e.g., Siri, Alexa, Google Home, and similar such voice assistants) receive and process the voice commands and other speech.

[0062] The user equipment device 600 may optionally include an interface 610. The interface 610 may be any suitable user interface, such as a remote control, mouse, trackball, keypad, keyboard, touch screen, touchpad, stylus input, joystick, or other user input interfaces. A display 612 may be provided as a stand-alone device or integrated with other elements of the user equipment device 600. For example, the display 612 may be a touchscreen or touch-sensitive display. In such circumstances, the interface 610 may be integrated with or combined with the microphone 616. When the interface 610 is configured with a screen, such a screen may be one or more of a monitor, a television, a liquid crystal display (LCD) for a mobile device, active matrix display, cathode ray tube display, light-emitting diode display, organic light-emitting diode display, quantum dot display, or any other suitable equipment for displaying visual images. In some embodiments, the interface 610 may be HDTV-capable. In some embodiments, the display 612 may be a 3D display. The speaker (or speakers) 614 may be provided as integrated with other elements of user equipment device 600 or may be a stand-alone unit. In some embodiments, audio corresponding to content on the display 612 may be outputted through the speakers 614.

[0063] FIG. 7 shows an illustrative block diagram of a server system 700, in accordance with some embodiments of the disclosure. Server system 700 may include one or more computer systems (e.g., computing devices), such as a desktop computer, a laptop computer, and a tablet computer. In some embodiments, the server system 700 is a data server that hosts one or more databases (e.g., databases of images or videos), models, or modules or may provide various executable applications or modules. In practice, and as recognized by those of ordinary skill in the art, items shown separately could be combined and some items could be separated. In some embodiments, not all shown items must be included in server system 700. In some embodiments, server system 700 may comprise additional items.

[0064] The server system 700 can include processing circuitry 702, which includes one or more processing units (processors or cores), storage 704, one or more network or other communications network interfaces 706, and one or more input/output (I/O) paths 708. I/O paths 708 may use communication buses for interconnecting the described components. I/O paths 708 can include circuitry (sometimes called a chipset) that interconnects and controls communications between system components. Server system 700 may receive content and data via I/O paths 708. The I/O path 708 may provide data to control circuitry 710, which includes processing circuitry 702 and a storage 704. The control circuitry 710 may be used to send and receive commands, requests, and other suitable data using the I/O path 708. The I/O path 708 may connect the control circuitry 710 (and specifically the processing circuitry 702) to one or more communications paths. I/O functions may be provided by one or more of these communications paths but are shown as a single path in FIG. 7 to avoid overcomplicating the drawing.

[0065] The control circuitry 710 may be based on any suitable processing circuitry such as the processing circuitry 702. As referred to herein, processing circuitry should be understood to mean circuitry based on one or more microprocessors, microcontrollers, digital signal processors, programmable logic devices, FPGAs, ASICs, etc., and may include a multi-core processor (e.g., dual-core, quad-core, hexa-core, or any suitable number of cores) or supercomputer. In some embodiments, processing circuitry may be distributed across multiple separate processors or processing units, for example, multiple of the same type of processing units (e.g., two Intel Core i7 processors) or multiple different processors (e.g., an Intel Core i5 processor and an Intel Core i7 processor).

[0066] Memory may be an electronic storage device provided as the storage 704 that is part of the control circuitry 710. Storage 704 may include random-access memory, read-only memory, high-speed random-access memory (e.g., DRAM, SRAM, DDR RAM, or other random-access solid-state memory devices), non-volatile memory, one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, other non-volatile solid-state storage devices, quantum storage devices, and/or any combination of the same.

[0067] In some embodiments, storage 704 or the computer-readable storage medium of the storage 704 stores an operating system, which includes procedures for handling various basic system services and for performing hardware-dependent tasks. In some embodiments, storage 704 or the computer-readable storage medium of the storage 704 stores a communications module, which is used for connecting the server system 700 to other computers and devices via the one or more communication network interfaces 706 (wired or wireless), such as the internet, other wide area networks, local area networks, metropolitan area networks, and so on. In some embodiments, storage 704 or the computer-readable storage medium of the storage 704 stores a web browser (or other application capable of displaying web pages), which enables a user to communicate over a network with remote computers or devices. In some embodiments, storage 704 or the computer-readable storage medium of the storage 704 stores a database for storing information on electric vehicle charging stations, their locations, media items displayed at respective electric vehicle charging stations, a number of each type of impression count associated with respective electric vehicle charging stations, and so forth.

[0068] In some embodiments, executable modules, applications, or sets of procedures may be stored in one or more of the previously mentioned memory devices and correspond to a set of instructions for performing a function described above. In some embodiments, modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of modules may be combined or otherwise re-arranged in various implementations. In some embodiments, the storage 704 stores a subset of the modules and data structures identified above. In some embodiments, the storage 704 may store additional modules or data structures not described above.

[0069] FIG. 8 is an illustrative flowchart of a process 800 for determining user information and electric vehicle characteristics using an EVCS camera array, in accordance with some embodiments of the disclosure. Process 800 may be performed by physical or virtual control circuitry, such as control circuitry 518 of an EVCS (FIG. 5). In some embodiments, some steps of process 800 may be performed by one of several devices.

[0070] At step 802, control circuitry receives a first image of an electric vehicle from a first camera. In some embodiments, the first camera continually captures images and sends the images to the control circuitry. In some embodiments, the control circuitry instructs the first camera to capture one or more images in response to detecting an electric vehicle and/or user. In some embodiments, the control circuitry detects the electric vehicle and/or user when the control circuitry receives a charge request from the electric vehicle and/or user.

[0071] At step 804, control circuitry receives a second image of the electric vehicle from a second camera. In some embodiments, the second camera continually captures images and sends the images to the control circuitry. In some embodiments, the control circuitry instructs the second camera to capture one or more images in response to detecting an electric vehicle and/or user. In some embodiments, the control circuitry detects the electric vehicle and/or user when the control circuitry receives a charge request from the electric vehicle and/or user.

[0072] At step 806, control circuitry determines a characteristic of the electric vehicle using the first image and the second image. In some embodiments, the control circuitry uses a machine learning algorithm to process the first image and the second image to determine a characteristic corresponding to the electric vehicle. In some embodiments, the characteristic or characteristics include vehicle model, vehicle make, vehicle specifications, vehicle condition, and/or similar such information. In some embodiments, other information captured from one or more sensors is used in conjunction with the first image and the second image to determine the characteristic of the electric vehicle. The control circuitry may also determine a confidence score related to the determined characteristic. If the confidence score does not exceed a threshold, the control circuitry may request additional images from the first and/or second camera. If the confidence score does exceed the threshold, the process 800 may continue to step 808.

[0073] At step 808, control circuitry determines a charging rate based on the characteristic of the electric vehicle. In some embodiments, the control circuitry accesses a database comprising a plurality of entries associating characteristics with customized services. For example, a first model may be associated with a first charging rate and a second model may be associated with a second charging rate that is slower than the first charging rate.
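The database access described for step 808 reduces to a keyed lookup with a default. The model names, rates, and default below are illustrative assumptions:

```python
# Hypothetical table of entries associating vehicle models with charging
# rates in kW; the second model's rate is slower than the first's.
CHARGING_RATES_KW = {
    "first_model": 11.0,
    "second_model": 7.2,
}

def charging_rate_for(model, default_kw=6.6):
    """Return the charging rate associated with the determined model,
    falling back to a conservative default for unknown models."""
    return CHARGING_RATES_KW.get(model, default_kw)
```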

[0074] At step 810, control circuitry charges the electric vehicle with a connector using the determined charging rate. In some embodiments, the charging rate is based on optimizing the battery life of the electric vehicle.

[0075] FIG. 9 is another illustrative flowchart of a process 900 for determining user information and electric vehicle characteristics using an EVCS camera array, in accordance with some embodiments of the disclosure. Process 900 may be performed by physical or virtual control circuitry, such as control circuitry 518 of an EVCS (FIG. 5). In some embodiments, some steps of process 900 may be performed by one of several devices.

[0076] At step 902, control circuitry receives a first image of an electric vehicle from a first camera. In some embodiments, the control circuitry uses the same or similar methodologies described in step 802 above.

[0077] At step 904, control circuitry determines a characteristic of the electric vehicle using the first image. In some embodiments, the control circuitry processes the first image (e.g., via image recognition) to determine a characteristic of the electric vehicle. In some embodiments, the control circuitry uses a machine learning algorithm trained with images of vehicles with known characteristics.

[0078] At step 906, control circuitry determines a first confidence score of the characteristic of the electric vehicle using the first image. In some embodiments, one or more factors impact the confidence score. For example, the factors may include quality of the image, commonality of the determined information/characteristic, position of the electric vehicle, lighting, etc.

[0079] At step 908, control circuitry determines whether the first confidence score is greater than a first threshold. In some embodiments, the first threshold is dependent on the determined characteristic. For example, a first characteristic (e.g., model of electric vehicle) may have a higher first threshold than a second characteristic (e.g., color of electric vehicle). If the control circuitry determines that the first confidence score is not greater than the first threshold, the process 900 continues to step 910. If the control circuitry determines that the first confidence score is greater than the first threshold, the process 900 continues to step 916.

[0080] At step 910, control circuitry receives a second image of the electric vehicle from a second camera. In some embodiments, the control circuitry uses the same or similar methodologies described in step 804 above. In some embodiments, the control circuitry requests one or more images from the second camera in response to the first confidence score being less than the first threshold. In some embodiments, the control circuitry selects the second camera from a plurality of cameras based on the determined vehicle characteristic. For example, the control circuitry may select a camera facing the front of the electric vehicle if the determined vehicle characteristic relates to a license plate of the electric vehicle.
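The characteristic-dependent threshold and the fallback camera selection can be sketched together. The threshold values, default values, and the characteristic-to-camera mapping are all hypothetical examples introduced for illustration.

```python
# Per-characteristic first thresholds: a more consequential characteristic
# (e.g., model) demands higher confidence than a less consequential one
# (e.g., color). Values are illustrative assumptions.
FIRST_THRESHOLDS = {
    "model": 0.9,
    "color": 0.6,
}

def passes_first_threshold(characteristic: str, confidence: float) -> bool:
    """Compare the first confidence score against its characteristic-specific
    threshold, falling back to an assumed default of 0.8."""
    return confidence > FIRST_THRESHOLDS.get(characteristic, 0.8)

# Fallback camera selection by characteristic: a camera facing the front of
# the vehicle is preferred for license-plate reads. Mapping is hypothetical.
CAMERA_FOR_CHARACTERISTIC = {"license_plate": "front_camera"}

def select_second_camera(characteristic: str) -> str:
    """Pick the camera best positioned to confirm the characteristic."""
    return CAMERA_FOR_CHARACTERISTIC.get(characteristic, "upper_camera")
```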

[0081] At step 912, control circuitry determines a second confidence score of the characteristic of the electric vehicle using the second image. In some embodiments, one or more factors impact the confidence score. For example, the factors may include quality of the image, commonality of the determined information/characteristic, position of the electric vehicle, lighting, etc.

[0082] At step 914, control circuitry determines whether the second confidence score is greater than a second threshold. In some embodiments, the second threshold is the same or similar to the first threshold. In some embodiments, the second threshold is dependent on the first threshold and/or first confidence score. For example, if the first confidence score is very low, the second threshold may be a higher value to reduce false positives. In another example, if the first confidence score is high, the second threshold may be a lower value. If the control circuitry determines that the second confidence score is not greater than the second threshold, the process 900 starts over at step 902 where one or more new images are received from the first camera. If the control circuitry determines that the second confidence score is greater than the second threshold, the process 900 continues to step 916.
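One way to realize a second threshold that depends on the first confidence score is sketched below. The cut-off points and offsets are assumptions chosen to illustrate the stated behavior: a very low first score raises the bar to reduce false positives, while a high first score lowers it.

```python
def second_threshold(first_score: float, base: float = 0.7) -> float:
    """Derive the second camera's acceptance threshold from the first
    confidence score. Cut-offs (0.3, 0.6) and offsets (0.2) are illustrative.
    """
    if first_score < 0.3:
        # Very low first score: tighten the second threshold.
        return min(base + 0.2, 1.0)
    if first_score > 0.6:
        # High first score: the second image only needs to corroborate.
        return max(base - 0.2, 0.0)
    return base
```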

[0083] At step 916, control circuitry determines a service based on the characteristic of the electric vehicle. In some embodiments, the control circuitry accesses a database comprising a plurality of entries associating characteristics with customized services. For example, a first model may be associated with a first charging rate and a second model may be associated with a second charging rate that is slower than the first charging rate.

[0084] At step 918, control circuitry outputs the service determined in step 916. For example, the control circuitry may charge the electric vehicle according to the charging rate associated with the determined characteristic of the electric vehicle.

[0085] FIG. 10 is another illustrative flowchart of a process 1000 for determining user information and electric vehicle characteristics using an EVCS camera array, in accordance with some embodiments of the disclosure. Process 1000 may be performed by physical or virtual control circuitry, such as control circuitry 518 of an EVCS (FIG. 5). In some embodiments, some steps of process 1000 may be performed by one of several devices.

[0086] At step 1002, control circuitry receives a first image of an electric vehicle from a first camera. In some embodiments, the control circuitry uses the same or similar methodologies described in step 902 above.

[0087] At step 1004, control circuitry determines a characteristic of the electric vehicle using the first image. In some embodiments, the control circuitry uses the same or similar methodologies described in step 904 above.

[0088] At step 1006, control circuitry determines a first confidence score of the characteristic of the electric vehicle using the first image. In some embodiments, the control circuitry uses the same or similar methodologies described in step 906 above.

[0089] At step 1008, control circuitry receives a second image of the electric vehicle from a second camera. In some embodiments, the control circuitry uses the same or similar methodologies described in step 910 above.

[0090] At step 1010, control circuitry determines a plurality of characteristics of the electric vehicle using the second image. In some embodiments, the control circuitry uses a machine learning algorithm to process the second image to determine the plurality of characteristics corresponding to the electric vehicle. In some embodiments, the plurality of characteristics is a subset of characteristics determined using the second image. For example, the control circuitry may determine a first set of characteristics using the second image and assign confidence scores to each characteristic in the set. The control circuitry may then determine the plurality of characteristics to comprise only the characteristics that are associated with confidence scores exceeding a minimum threshold.
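The filtering described in step 1010 can be sketched as reducing a scored set of candidate characteristics to those clearing a minimum threshold. The 0.5 default is an illustrative assumption.

```python
def confident_characteristics(scored: dict, min_threshold: float = 0.5) -> set:
    """Reduce a first set of scored characteristics to the subset whose
    confidence scores exceed the minimum threshold.

    `scored` maps each candidate characteristic to its confidence score.
    """
    return {c for c, score in scored.items() if score > min_threshold}
```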

[0091] At step 1012, control circuitry determines whether the plurality of characteristics comprises the characteristic determined in step 1004. If the control circuitry determines that the plurality of characteristics does not comprise the characteristic determined in step 1004, the process 1000 starts over at step 1002 where one or more new images are received from the first camera. If the control circuitry determines that the plurality of characteristics does comprise the characteristic determined in step 1004, the process 1000 continues to step 1014.

[0092] At step 1014, control circuitry determines a second confidence score of the characteristic of the electric vehicle using the second image. In some embodiments, the control circuitry uses the same or similar methodologies described in step 912 above.

[0093] At step 1016, control circuitry determines whether the first confidence score and the second confidence score are greater than a threshold. In some embodiments, the first confidence score and the second confidence score are combined, and the combined confidence score is compared to the threshold. In some embodiments, the first confidence score and the second confidence score are weighted according to the characteristic determined in step 1004 and/or camera characteristics. For example, the first confidence score may be weighted higher than the second confidence score if the first camera provides higher quality images. In another example, the characteristic determined in step 1004 may relate to the license plate. The first confidence score may be weighted higher than the second confidence score if the first camera has a better view of the license plate. If the control circuitry determines that the first confidence score and the second confidence score are not greater than the threshold, the process 1000 starts over at step 1002 where one or more new images are received from the first camera. If the control circuitry determines that the first confidence score and the second confidence score are greater than the threshold, the process 1000 continues to step 1018.
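The weighted combination in step 1016 can be sketched as follows. The equal default weights and the 0.8 threshold are assumptions; as the paragraph notes, a camera with higher image quality, or a better view of the relevant feature (e.g., the license plate), would receive a larger weight.

```python
def combined_confidence(first_score, second_score,
                        w_first=0.5, w_second=0.5):
    """Weighted combination of the two cameras' confidence scores.
    Weights are illustrative; they could reflect camera image quality
    or how well each camera views the relevant feature."""
    return w_first * first_score + w_second * second_score

def scores_accepted(first_score, second_score, threshold=0.8,
                    w_first=0.5, w_second=0.5):
    """Accept the characteristic when the combined score clears the
    threshold; otherwise the process restarts with new first-camera images."""
    return combined_confidence(first_score, second_score,
                               w_first, w_second) > threshold
```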

[0094] At step 1018, control circuitry determines a service based on the characteristic of the electric vehicle. In some embodiments, the control circuitry uses the same or similar methodologies described in step 916 above.

[0095] At step 1020, control circuitry outputs the service. In some embodiments, the control circuitry uses the same or similar methodologies described in step 918 above.

[0096] It is contemplated that some suitable steps or suitable descriptions of FIGS. 8-10 may be used with other suitable embodiments of this disclosure. In addition, some suitable steps and descriptions described in relation to FIGS. 8-10 may be implemented in alternative orders or in parallel to further the purposes of this disclosure. For example, some suitable steps may be performed in any order or in parallel or substantially simultaneously to reduce lag or increase the speed of the system or method. Some suitable steps may also be skipped or omitted from the process. Furthermore, it should be noted that some suitable devices or equipment discussed in relation to FIGS. 1-7 could be used to perform one or more of the steps in FIGS. 8-10.

[0097] The processes discussed above are intended to be illustrative and not limiting. One skilled in the art would appreciate that the steps of the processes discussed herein may be omitted, modified, combined, and/or rearranged, and any additional steps may be performed without departing from the scope of the invention. More generally, the above disclosure is meant to be exemplary and not limiting. Only the claims that follow are meant to set bounds as to what the present invention includes. Furthermore, it should be noted that the features and limitations described in any one embodiment may be applied to any other embodiment herein, and flowcharts or examples relating to one embodiment may be combined with any other embodiment in a suitable manner, done in different orders, or done in parallel. In addition, the systems and methods described herein may be performed in real time. It should also be noted that the systems and/or methods described above may be applied to, or used in accordance with, other systems and/or methods.