

Title:
DYNAMIC TRAILER TURN ASSIST SYSTEM AND METHOD WITH GRAPHIC OVERLAY
Document Type and Number:
WIPO Patent Application WO/2023/092065
Kind Code:
A1
Abstract:
A method and system are disclosed for facilitating a turn operation by a tow vehicle having a connected trailer, including receiving sensor data from one or more sensors mounted to the tow vehicle and trailer; estimating a tire path for each tire of the trailer based upon the sensor data; detecting and classifying an object based upon the sensor data; determining a dimension of the object; and determining whether or not the detected object is in the estimated tire path. Upon a determination that the detected object is in the estimated tire path, creating an image of an environment including the tow vehicle and trailer. The created image highlights the object based upon the object classification and the dimension of the object. The method further includes transmitting the created image and an instruction to a user display for displaying the created image on the user display.

Inventors:
AHAMED NIZAR (US)
BURTCH JOSEPH (US)
DAVANI SINA (US)
PATHALIL NITHIN JOSEPH (US)
MAGANA RAYMUNDO (US)
Application Number:
PCT/US2022/080136
Publication Date:
May 25, 2023
Filing Date:
November 18, 2022
Assignee:
CONTINENTAL AUTONOMOUS MOBILITY US LLC (US)
International Classes:
B62D15/02; B62D13/06; G01B11/26
Foreign References:
US20140160276A12014-06-12
US20140358429A12014-12-04
Attorney, Agent or Firm:
ESSER, William F et al. (US)
Claims:
WHAT IS CLAIMED IS:

1. A method for assisting in executing a turn operation by a vehicle connected to a trailer, the method comprising:
receiving, by data processing hardware, sensor data from one or more sensors mounted to at least one of a tow vehicle or a trailer connected thereto;
estimating, by the data processing hardware, a tire path for each tire of the trailer based upon the sensor data;
detecting, by the data processing hardware, an object based upon the sensor data;
classifying, by the data processing hardware, the object and determining at least one dimension of the object;
determining, by the data processing hardware, whether or not the detected object is in the estimated tire path;
upon an affirmative determination that the detected object is in the estimated tire path, creating, by the data processing hardware based on the received sensor data, an image of an environment in which the tow vehicle and the trailer are located, the created image highlighting the object based upon the object classification and the at least one dimension of the object; and
transmitting, by the data processing hardware, the created image and an instruction to a user display for displaying the created image on the user display.

2. The method of claim 1, further comprising determining, by the data processing hardware based upon the object classification and the at least one dimension of the object, whether or not the object can be driven over by the trailer without damaging the trailer, wherein the highlighting of the object is based upon the determination of whether or not the object can be driven over by the trailer.

3. The method of claim 2, wherein the highlighting of the object comprises an overlay of a first type upon the determination that the object can be driven over by the trailer, and of a second type upon the determination that the object cannot be driven over by the trailer.

4. The method of claim 3, wherein the first type of overlay comprises an overlay having a first color and the second type of overlay comprises an overlay having a second color different from the first color.

5. The method of claim 1, wherein the one or more sensors comprise a plurality of cameras, and the method further comprises: determining a distance between the object and the tow vehicle and a distance between the object and the trailer; and selecting, by the data processing hardware, a camera view from at least one of the plurality of cameras based in part upon the distance between the object and the tow vehicle and the distance between the object and the trailer, wherein the image created is based upon the selected camera view.

6. The method of claim 5, wherein the camera view selected includes a representation of the tow vehicle and a representation of the object when the distance between the object and the tow vehicle is less than the distance between the object and the trailer, and the camera view selected includes a representation of the trailer and the representation of the object when the distance between the object and the trailer is less than the distance between the object and the tow vehicle.

7. The method of claim 6, wherein the camera view selection is dynamically performed such that camera views are selected a plurality of times during a turn operation by the tow vehicle.

8. A trailer turn assist system for a tow vehicle and coupled trailer, the system comprising:
data processing hardware; and
non-transitory memory hardware communicatively coupled to the data processing hardware, the memory storing program instructions which, when executed by the data processing hardware, configure the data processing hardware to:
receive sensor data from one or more sensors mounted to at least one of a tow vehicle or a trailer connected thereto;
estimate a tire path for each tire of the trailer based upon the sensor data;
detect an object based upon the sensor data;
determine whether or not the detected object is in the estimated tire path;
select a camera view from at least one of the one or more cameras based in part upon a distance of the object to the tow vehicle and a distance of the object to the trailer;
upon an affirmative determination that the detected object is in the estimated tire path, create, based upon the received sensor data and upon the selected camera view, an image of an environment in which the tow vehicle and the trailer are located; and
transmit the created image and an instruction to a user display for displaying the created image on the user display.

9. The trailer turn assist system of claim 8, wherein the instructions further configure the data processing hardware to determine at least one dimension of the object, and the instructions to create the image of the environment include instructions to highlight the object based upon the object classification and the at least one dimension of the object.

10. The trailer turn assist system of claim 9, wherein the instructions further configure the data processing hardware to determine, based upon the object classification and the at least one dimension of the object, whether or not the object can be driven over by the trailer without damaging the trailer, wherein the object is highlighted in the created image based upon the determination of whether or not the object can be driven over by the trailer.

11. The trailer turn assist system of claim 10, wherein the highlighting of the object comprises an overlay of a first type upon the determination that the object can be driven over by the trailer, and of a second type upon the determination that the object cannot be driven over by the trailer.


12. The trailer turn assist system of claim 11, wherein the first type of overlay comprises an overlay having a first color and the second type of overlay comprises an overlay having a second color different from the first color.

13. The trailer turn assist system of claim 8, wherein the camera view selected includes a representation of the tow vehicle and a representation of the object when the distance between the object and the tow vehicle is less than the distance between the object and the trailer, and the camera view selected includes a representation of the trailer and the representation of the object when the distance between the object and the trailer is less than the distance between the object and the tow vehicle.

14. The trailer turn assist system of claim 8, wherein the camera view selection is performed a plurality of times during a turn operation by the tow vehicle.


Description:
DYNAMIC TRAILER TURN ASSIST SYSTEM AND METHOD

WITH GRAPHIC OVERLAY

CROSS REFERENCE TO RELATED APPLICATION

The present application claims the benefit of U.S. provisional patent application 63/264,340, filed November 19, 2021, and titled “Dynamic Trailer Turn Assist System and Method with Graphic Overlay,” the content of which is incorporated by reference herein in its entirety.

TECHNICAL FIELD

[0001] This disclosure relates to a trailer turn assist system and particularly to a system for assisting a tow vehicle driver with executing a turn operation with a connected trailer.

BACKGROUND

[0002] Trailers are usually unpowered vehicles that are pulled by a powered tow vehicle. A trailer may be a utility trailer, a popup camper, a travel trailer, a livestock trailer, a flatbed trailer, an enclosed car hauler, or a boat trailer, among others. The tow vehicle may be a car, a crossover, a truck, a van, a sports-utility-vehicle (SUV), a recreational vehicle (RV), or any other vehicle configured to attach to the trailer and pull the trailer. The trailer wheels may follow a different path compared to the tow vehicle's wheels. This is due to the additional dynamics introduced by the pivoting point between the vehicle and the trailer. Because the trailer path differs from the path of the tow vehicle, such as when executing a sharp turn, trailers are susceptible to colliding with other vehicles or static objects like curbs.

DESCRIPTION OF DRAWINGS

[0003] FIG. 1 is a schematic view of an example tow vehicle according to an example embodiment hitched to a trailer.

[0004] FIG. 2 is a schematic view of the example tow vehicle and trailer of FIG. 1 at an oblique angle.

[0005] FIG. 3 is a schematic view of the example tow vehicle of FIG. 1 having a trailer turn assist system according to an example embodiment.

[0006] FIGS. 4A-4C show displayed top views of a tow vehicle and connected trailer with estimated tire paths thereof relative to a curb.

[0007] FIGS. 5A-5C show displayed side camera views of a trailer with an estimated trailer path relative to a curb.

[0008] FIG. 6 shows a flowchart illustrating a trailer turn assist operation according to an example embodiment.

[0009] Like reference symbols in the various drawings indicate like elements.

DETAILED DESCRIPTION

[0010] A tow vehicle, such as, but not limited to, a car, a crossover, a truck, a van, a sports-utility-vehicle (SUV), and a recreational vehicle (RV) may be configured to tow a trailer. The tow vehicle connects to the trailer by way of a vehicle coupler attached to a trailer hitch, e.g., a vehicle tow ball attached to a trailer hitch coupler. The tow vehicle includes a trailer turn assist system that estimates trailer paths of the trailer tires/wheels, detects objects in the estimated trailer paths, identifies whether the object can be safely driven over by the trailer, and selectively changes a displayed camera view of the object based on distances between the object and each of the tow vehicle and the trailer.

[0011] Referring to FIGS. 1-3, in some implementations, a vehicle-trailer system 100 includes a tow vehicle 102 hitched to a trailer 104. The tow vehicle includes a vehicle tow ball attached to a trailer hitch coupler 106 supported by a trailer hitch bar 108 of the trailer 104. The tow vehicle 102 includes a drive system 110 associated with the tow vehicle 102 that maneuvers the tow vehicle 102 and thus moves the vehicle-trailer system 100 across a road surface based on drive maneuvers or commands having x, y, and z components, for example. The drive system 110 includes a front right wheel 112, 112a, a front left wheel 112, 112b, a rear right wheel 112, 112c, and a rear left wheel 112, 112d. In addition, the drive system 110 may include wheels (not shown) associated with the trailer 104. The drive system 110 may include other wheel configurations as well. The drive system 110 may include a motor or an engine that converts one form of energy into mechanical energy allowing the vehicle 102 to move. The drive system 110 includes other components (not shown) that are in communication with and connected to the wheels 112 and engine and that allow the vehicle 102 to move, thus moving the trailer 104 as well. The drive system 110 may also include a brake system (not shown) that includes brakes associated with each wheel 112, 112a-d, where each brake is associated with a wheel 112a-d and is configured to slow down or stop the wheel 112a-d from rotating. In some examples, the brake system is connected to one or more brakes supported by the trailer 104. The drive system 110 may also include an acceleration system (not shown) that is configured to adjust a speed of the tow vehicle 102 and thus the vehicle-trailer system 100, and a steering system (not shown) that is configured to adjust a direction of the tow vehicle 102 and thus the vehicle-trailer system 100. The vehicle-trailer system 100 may include other systems as well.

[0012] The tow vehicle 102 may move across the road surface by various combinations of movements relative to three mutually perpendicular axes defined by the tow vehicle 102: a transverse axis Xv, a fore-aft axis Yv, and a central vertical axis Zv. The transverse axis Xv extends between a right side R and a left side of the tow vehicle 102. A forward drive direction along the fore-aft axis Yv is designated as Fv, also referred to as a forward motion. In addition, an aft or rearward drive direction along the fore-aft direction Yv is designated as Rv, also referred to as rearward motion. In some examples, the tow vehicle 102 includes a suspension system (not shown), which when adjusted causes the tow vehicle 102 to tilt about the Xv axis and/or the Yv axis, or move along the central vertical axis Zv. As the tow vehicle 102 moves, the trailer 104 follows along a path of the tow vehicle 102. Therefore, when the tow vehicle 102 makes a turn as it moves in the forward direction Fv, then the trailer 104 follows along. While turning, the tow vehicle 102 and the trailer 104 form a trailer angle a.

[0013] Moreover, the trailer 104 follows the tow vehicle 102 across the road surface by various combinations of movements relative to three mutually perpendicular axes defined by the trailer 104: a trailer transverse axis XT, a trailer fore-aft axis YT, and a trailer central vertical axis ZT. The trailer transverse axis XT extends between a right side and a left side of the trailer 104 along a trailer turning axle 105. In some examples, the trailer 104 includes a front axle (not shown) and rear axle 105. In this case, the trailer transverse axis XT extends between a right side and a left side of the trailer 104 along a midpoint of the front and rear axle (i.e., a virtual turning axle). A forward drive direction along the trailer fore-aft axis YT is designated as FT, also referred to as a forward motion. In addition, a trailer aft or rearward drive direction along the fore-aft direction YT is designated as RT, also referred to as rearward motion. Therefore, movement of the vehicle-trailer system 100 includes movement of the tow vehicle 102 along its transverse axis Xv, fore-aft axis Yv, and central vertical axis Zv, and movement of the trailer 104 along its trailer transverse axis XT, trailer fore-aft axis YT, and trailer central vertical axis ZT. Therefore, when the tow vehicle 102 makes a turn as it moves in the forward direction Fv, then the trailer 104 follows along. While turning, the tow vehicle 102 and the trailer 104 form the trailer angle a being an angle between the vehicle fore-aft axis Yv and the trailer fore-aft axis YT.
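For illustration only (this sketch is not part of the application itself), the trailer angle a described above is the signed difference between the vehicle heading along Yv and the trailer heading along YT. The function name and the degree convention below are assumptions:

```python
def trailer_angle(vehicle_heading_deg: float, trailer_heading_deg: float) -> float:
    """Signed trailer angle (degrees) between the vehicle fore-aft axis Yv
    and the trailer fore-aft axis YT, wrapped to [-180, 180)."""
    a = vehicle_heading_deg - trailer_heading_deg
    # Wrap so a straight-line tow reads as 0 degrees.
    return (a + 180.0) % 360.0 - 180.0

# A vehicle heading 90 deg while the trailer still heads 60 deg gives a
# 30 degree trailer angle mid-turn.
print(trailer_angle(90.0, 60.0))  # 30.0
```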

[0014] In some implementations, the vehicle 102 includes a sensor system 130 to provide sensor system data 136 that may be used to determine one or more measurements, such as the trailer angle a. In some examples, the vehicle 102 is autonomous or semi-autonomous, and the sensor system 130 supports reliable and robust autonomous driving. The sensor system 130 provides sensor system data 136 and may include different types of sensors 132, 134 that may be used separately or with one another to create a perception of the tow vehicle's environment, or a portion thereof, that is used by the vehicle-trailer system 100 to identify object(s) in its environment and/or, in some examples, to autonomously drive and make intelligent decisions based on objects and obstacles detected by the sensor system 130. In some examples, the sensor system 130 includes one or more sensors 132, 134 supported by the tow vehicle 102 which provide sensor system data 136 associated with object(s) positioned around the tow vehicle 102. In some examples, the tow vehicle 102 alone supports the sensor system 130; in other examples, the sensor system 130 is supported by both the vehicle 102 and the trailer 104, with some sensors 132, 134 being mounted to the trailer 104.

[0015] In an example embodiment, the sensors 132, 134 include one or more cameras 132 that provide camera data 133. The one or more cameras 132 may include monocular cameras where each position on an image shows a different amount of light, but not a different hue. In some examples, the camera(s) 132 may include a fisheye lens, i.e., an ultra wide-angle lens that produces strong visual distortion intended to create a wide panoramic or hemispherical image 133. Fisheye cameras capture images 133 having an extremely wide angle of view. Other types of cameras may also be used to capture images 133 of the vehicle and trailer environment. The camera data 133 may include additional data such as intrinsic parameters (e.g., focal length, image sensor format, and principal point) and extrinsic parameters (e.g., the coordinate system transformations from 3D world coordinates to 3D camera coordinates; in other words, the extrinsic parameters define the position of the camera center and the heading of the camera in world coordinates). In addition, the camera data 133 may include the minimum/maximum/average height of each camera 132 with respect to the ground (e.g., when the vehicle is loaded and unloaded), and a longitudinal distance between the camera 132 and the tow vehicle hitch ball. In some implementations, first and second cameras 132b and 132c are positioned on each side of the vehicle 102. Additionally, a rear facing third camera 132a may be mounted at the rear of the vehicle 102, and a front facing camera 132d may be mounted at the front of the vehicle.
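As a minimal sketch of how the intrinsic and extrinsic parameters mentioned above are conventionally used (all numeric values here are illustrative assumptions, not values from the application): extrinsics map a 3D world point into the camera frame, and the focal length and principal point project it to pixel coordinates.

```python
def world_to_pixel(point_w, rotation, translation, fx, fy, cx, cy):
    """Pinhole projection: extrinsics (rotation, translation) then
    intrinsics (focal lengths fx, fy and principal point cx, cy)."""
    # Camera-frame coordinates: p_c = R @ p_w + t.
    xc = sum(rotation[0][j] * point_w[j] for j in range(3)) + translation[0]
    yc = sum(rotation[1][j] * point_w[j] for j in range(3)) + translation[1]
    zc = sum(rotation[2][j] * point_w[j] for j in range(3)) + translation[2]
    if zc <= 0:
        raise ValueError("point is behind the camera")
    # Perspective divide, then scale/shift by the intrinsic parameters.
    return (fx * xc / zc + cx, fy * yc / zc + cy)

# Identity extrinsics: camera at the world origin looking along +Z.
R = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
t = [0.0, 0.0, 0.0]
u, v = world_to_pixel([1.0, 0.5, 4.0], R, t, fx=800.0, fy=800.0, cx=640.0, cy=360.0)
print(u, v)  # 840.0 460.0
```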

[0016] The sensors 134 may include, but are not limited to, radar, sonar, LIDAR (Light Detection and Ranging, which can entail optical remote sensing that measures properties of scattered light to find the range and/or other information of a distant target), LADAR (Laser Detection and Ranging), ultrasonic sensors, thermal cameras, GPS, etc. The sensor system 130 provides sensor system data 136 that includes one or both of images 133 from the one or more cameras 132 and sensor information 135 from the one or more other sensors 134. Therefore, the sensor system 130 is especially useful for receiving information of the environment, or a portion of the environment, of the tow vehicle 102 and for increasing safety in the vehicle-trailer system 100, which may be operated by the driver or under semi-autonomous or autonomous conditions.

[0017] The tow vehicle 102 may include a user interface 140. In some examples, the user interface includes a display 142. The user interface 140 is configured to display information to the tow vehicle driver and to receive one or more user commands and/or data from the driver via one or more input mechanisms or a touch screen display and/or displays. In some examples, the display 142 is a touch screen display. In other examples, the display 142 is not a touchscreen and the user interface includes a device, such as, but not limited to, a rotary knob, keyboard, keypad and/or mouse (not shown) to provide user input. In some examples, the display 142 is not supported by the tow vehicle 102 and is instead part of a handheld device, such as a cellular telephone or a tablet.

[0018] The user interface 140 is in communication with a vehicle controller 150 that includes a computing device (or data processing hardware) 152 (e.g., a central processing unit having one or more computing processors) in communication with non-transitory memory or memory hardware 155 (e.g., a hard disk, flash memory, random-access memory) capable of storing instructions executable on the computing processor(s). In some examples, the non-transitory memory 155 stores instructions that when executed on the data processing hardware 152 cause the vehicle controller 150 to perform operations supporting a trailer turn assist function. As shown, the vehicle controller 150 is supported by the tow vehicle 102; however, the vehicle controller 150 may be separate from the tow vehicle 102 and in communication with the tow vehicle 102 via a network (not shown). In addition, the vehicle controller 150 is in communication with the sensor system 130 and receives sensor system data 136 from the sensor system 130. In some examples, the vehicle controller 150 is configured to process sensor system data 136 received from the sensor system 130. In some implementations, the vehicle controller 150 includes data processing hardware located in the sensors 132, 134 such that some processing of raw sensor data occurs within the corresponding sensor.

[0019] In example embodiments, the tow vehicle 102 includes a trailer turn assist system which assists a driver when turning the tow vehicle 102. FIG. 3 illustrates the tow vehicle 102 according to an example embodiment. The trailer turn assist system 152 includes a path estimator algorithm, function block or module 154 which estimates the tire paths of the trailer 104 and optionally the tow vehicle 102. The tire path estimate for the trailer 104 uses known information about the trailer, such as the trailer length and width, the trailer angle a, and the wheel angle of the front wheels of the tow vehicle 102 and/or the tow vehicle's steering wheel angle. An object detector algorithm, function block or module 156 detects objects located near the tow vehicle 102 and the trailer 104. In addition, the object detector 156 utilizes object recognition and/or semantic segmentation software to identify or classify the object. The object detector 156 also determines whether the detected objects are in the estimated trailer tire paths and determines whether the object can or cannot be safely driven over by the trailer 104. Based on the object classification, the object detector 156 determines whether or not the object, which is determined to be in an estimated tire path, may be safely run over by the trailer 104 without damaging the trailer 104 and/or the object. This determination may include determining the height of the object, such that the assessment of whether the object may be safely driven over is based on the determined object height.
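The drive-over determination described above can be sketched as classification plus a height check. This is a hedged illustration only; the class names, the clearance threshold, and the rule that certain classes are never driven over are all assumptions, not values from the application:

```python
CLEARANCE_M = 0.15  # assumed trailer underbody/tire clearance, in meters

# Assumed classes the trailer must never drive over, regardless of height.
NEVER_DRIVE_OVER = {"vehicle", "pedestrian", "pole"}

def can_drive_over(object_class: str, object_height_m: float) -> bool:
    """True if the classified object is low enough to drive over safely."""
    if object_class in NEVER_DRIVE_OVER:
        return False
    return object_height_m <= CLEARANCE_M

print(can_drive_over("curb", 0.10))    # True  -> first (e.g., yellow) overlay
print(can_drive_over("curb", 0.25))    # False -> second (e.g., red) overlay
print(can_drive_over("vehicle", 0.0))  # False -> second overlay
```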

[0020] The trailer turn assist system 152 may also include an image highlighter algorithm, function block or module 160 which modifies the images captured by the cameras 132 to highlight the detected objects which are determined to be in a trailer tire path, and to also highlight the corresponding trailer tire path. The modified image(s) is then available for display to the tow vehicle user on the display 142. In one implementation, the highlighting distinguishes objects that are determined to be capable of being safely run over by the trailer 104 from objects that are determined to be incapable of being safely run over. In one implementation, the highlighting of the object and corresponding tire path estimate may be an overlay having one color, such as yellow, when the object is determined to be capable of being safely run over, and having another color, such as red, when the object is determined to be incapable of being safely run over. It is understood that different highlight coloring or different shading (e.g., solid versus partial) may be utilized.

[0021] Further, the trailer turn assist system 152 includes a camera view selector algorithm, function block or module 162 to continuously change the camera view displayed on the user display 142 during execution of a turn operation by the tow vehicle 102. Specifically, when the tow vehicle 102 takes a relatively sharp turn, such as a 90 degree turn, around a curb, an image of the curb is constantly shown on the user display 142 by changing the camera view of the curb. When the tow vehicle 102 is closer to the curb than the trailer 104, the camera view displayed on the user display 142 includes representations of the tow vehicle 102 and the curb. When the trailer 104 is subsequently closer to the curb than the tow vehicle 102, the camera view is adjusted so that representations of the curb and the trailer 104 are depicted on the user display 142. In this case, the different camera views may be from the front camera 132d initially, then adjusted to be from the side camera 132c, and then adjusted again to be from the rear camera 132a or a camera on the trailer 104. It is understood that camera views may include combined side views in which images from multiple cameras 132 are stitched or otherwise combined. For example, one combined camera view may combine images from the front camera 132d and a side camera 132c, and another combined camera view may combine camera images from a side camera 132c and the rear camera 132a and/or a camera 132 mounted to the trailer 104. It is further understood that the camera views may be top views or side views of the tow vehicle 102 and the trailer 104, or both.
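The distance-based view selection above reduces to a simple comparison, sketched below for illustration. The return labels are hypothetical stand-ins for the views framed with the tow vehicle or the trailer; only the comparison itself mirrors the description:

```python
def select_camera_view(dist_object_to_vehicle_m: float,
                       dist_object_to_trailer_m: float) -> str:
    """Frame whichever unit (tow vehicle or trailer) is nearer the object."""
    if dist_object_to_vehicle_m < dist_object_to_trailer_m:
        return "tow_vehicle_view"   # e.g., front camera 132d or side 132c
    return "trailer_view"           # e.g., rear camera 132a or a trailer camera

# Early in the turn the curb is nearer the tow vehicle...
print(select_camera_view(1.0, 4.0))  # tow_vehicle_view
# ...and later it is nearer the trailer, so the displayed view switches.
print(select_camera_view(5.0, 0.8))  # trailer_view
```

Re-evaluating this comparison repeatedly during the turn yields the dynamic, plural camera view selection recited in claims 7 and 14.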

[0022] As the tow vehicle 102 starts making a turn, the sensor system 130 will show the appropriate camera view to the driver or other user of the tow vehicle 102. This may be a single side mirror camera view from a side camera 132b or 132c, or a side view from multiple cameras 132 that are stitched together to form a single image. The stitched side view may be generated from the multiple cameras on the tow vehicle 102, optionally along with any cameras on the trailer 104.

[0023] Using existing information about the length and width of the trailer 104, along with the current trailer angle a formed between the tow vehicle 102 and the trailer 104 and the steering angle of the front wheels 112a, 112b and/or the steering wheel of the tow vehicle 102, the paths that the trailer wheels 112 will follow are determined and projected in the camera view shown to the user on the display 142. Optionally, the path the corners of the trailer 104 will follow may also be projected on the display 142. The determined paths of the trailer wheels 112 are presented as overlays over the corresponding image captured by the camera(s) 132.
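One common way to estimate such trailer tire paths is a kinematic (tractrix-style) model: the hitch point is advanced along the vehicle's turning circle, the trailer axle trails it at the trailer length, and each wheel sits half a track width off the trailer axis. The sketch below is an assumption-laden illustration, not the application's algorithm; the turn radius, trailer length, and track width are invented values:

```python
import math

def estimate_trailer_tire_paths(radius_m, trailer_len_m, track_m, steps=200):
    """Return (left_path, right_path) lists of (x, y) trailer tire positions
    while the hitch traverses a quarter circle of the given radius."""
    axle = (radius_m, -trailer_len_m)  # trailer starts straight behind hitch
    left, right = [], []
    for i in range(1, steps + 1):
        phi = (math.pi / 2) * i / steps
        hitch = (radius_m * math.cos(phi), radius_m * math.sin(phi))
        # Discrete tractrix step: keep the axle midpoint trailer_len_m behind
        # the hitch along the axle-to-hitch line (kinematic trailer constraint).
        dx, dy = hitch[0] - axle[0], hitch[1] - axle[1]
        d = math.hypot(dx, dy)
        axle = (hitch[0] - trailer_len_m * dx / d,
                hitch[1] - trailer_len_m * dy / d)
        # Wheels sit half a track width either side of the trailer axis.
        nx, ny = -dy / d, dx / d
        left.append((axle[0] + nx * track_m / 2, axle[1] + ny * track_m / 2))
        right.append((axle[0] - nx * track_m / 2, axle[1] - ny * track_m / 2))
    return left, right

left, right = estimate_trailer_tire_paths(radius_m=6.0, trailer_len_m=3.0, track_m=2.0)
# The trailer axle cuts inside the hitch circle during the turn (off-tracking),
# which is why the trailer can strike a curb the tow vehicle cleared.
print(math.hypot(*left[-1]), math.hypot(*right[-1]))
```

The off-tracking this model exhibits is exactly the hazard motivating the overlay: the inner trailer wheel path sweeps closer to the turn center than the tow vehicle's path.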

[0024] Simultaneously the object detection or semantic segmentation algorithm identifies the free-space and objects around the vehicle 102 and the trailer 104. Any object identified to be located in the estimated tire or wheel path of the trailer 104 may be highlighted on the user display 142 along with the corresponding estimated tire/wheel path to alert the user of a potential collision.

[0025] In one implementation, radar/lidar/camera sensor data 133, 135 is used to identify the height of the object in the path of the trailer 104. Once the height is determined, visual information is shown to the user via the display 142 which differentiates objects that can be safely driven over from objects that cannot be safely driven over. For example, a curb that is low enough for the trailer wheels 112 to safely drive over without damaging the trailer or the curb is highlighted using a yellow color, and another vehicle or other sizeable object that is in the path of the trailer 104 is highlighted using a red color to show that the trailer 104 cannot safely drive over the vehicle/other object without damaging the trailer and/or the vehicle/other object. It is understood that different colors may be used to differentiate between objects that can be safely driven over and objects which cannot be driven over, and that other highlight markings may be used to distinguish these two categories of objects in addition to or in place of the color highlighting.

[0026] In some implementations, the camera view displayed via the display 142 continuously changes as the vehicle 102 and trailer 104 are making the turn. The trailer turn assist system 152 continuously shows on the display 142 the object closest to the vehicle/trailer in focus. For example, when the vehicle 102 takes a sharp turn around a curb, the curb is constantly shown in the display 142 by changing the camera view. When the vehicle 102 is close to the curb, the camera view will be adjusted such that the vehicle 102 and the curb will be visible on the display 142, and when the trailer 104 is close to the curb, the camera view is adjusted such that the trailer 104 and the curb are visible.

[0027] The trailer turn assist system 152 projects the trailer path in the different camera views when the tow vehicle 102 is travelling in the forward direction and starting to turn. In addition, the system 152 highlights the objects on the display 142 with which the trailer 104 may potentially collide. The highlighting varies to differentiate between objects that are safe to be run over and objects that the trailer 104 should not run over.

[0028] The trailer turn assist system varies the trailer path depiction to differentiate between a safe path and a potential collision path. The system 152 also provides on the display 142 a dynamic side view showing the target object continuously in focus. The system gives the user of the tow vehicle 102 much more information in an easily understandable manner. This gives the tow vehicle user the freedom to choose to drive over a curb while simultaneously being alerted about an imminent collision.

[0029] FIG. 4A is a top view of the tow vehicle 102 and the front portion of the trailer 104 displayed on the display 142, with the estimated tire paths 103 of the vehicle 102 and the estimated tire paths 107 of the trailer 104 appearing as overlays. In this image, no objects are determined to be in the estimated trailer tire paths 107, so no object or trailer tire path is highlighted. FIG. 4B shows the tow vehicle 102 and front portion of the trailer 104 displayed with a curb C highlighted along with the corresponding trailer tire path 107 in which the curb C is determined to be located. In this displayed image, the curb C is determined to be capable of being safely driven over, and so the curb and the corresponding trailer tire path 107 have a first highlighting. FIG. 4C shows the tow vehicle 102 and front portion of the trailer 104 displayed with the curb C highlighted along with the corresponding trailer tire path 107 in which the curb is determined to be located. In this displayed image, the curb C is determined to be incapable of being safely driven over, and so the curb and the corresponding trailer tire path 107 have a second highlighting different from the first highlighting. The different types of highlighting inform the tow vehicle user as to whether or not the tow vehicle 102 should be steered differently so as to avoid having the trailer 104 collide with the curb C.

[0030] FIG. 5A illustrates a displayed camera image of a portion of the trailer 104, the curb C and an overlayed trailer tire path 107. Because the curb C is determined to be located outside of the estimated trailer tire paths 107, the curb is not highlighted. In FIG. 5B, the curb C is determined to be both in the estimated trailer tire path 107 and capable of being safely driven over, so the curb C and the trailer tire path 107 are highlighted in a color (yellow, in this case). In FIG. 5C, the curb C is determined to be both in the estimated trailer tire path 107 and incapable of being safely driven over, so the curb and the tire path are highlighted in another color (red, in this case). With the drawings being in black and white, the curb C is highlighted with solid line hatching in FIG. 5B and with dashed line hatching in FIG. 5C to represent the yellow and red colors, respectively.

[0031] FIG. 6 illustrates a method 900 for performing a trailer turn assist function according to an example embodiment. Image data 133 from the cameras 132 and other sensor data 135 from sensors 134 are received by the vehicle controller 150 at 902, and vehicle dynamics sensor data is received by the controller 150 at 904. The vehicle dynamics sensor data may include the angle of the tow vehicle wheels 112a, 112b and/or the angle of the steering wheel of the tow vehicle 102. Based at least in part upon the received image and sensor data, the vehicle controller 150 estimates at 906 the tire/wheel paths of the trailer 104 and optionally the tow vehicle 102 using the path estimator 154. At 908, the vehicle controller 150 utilizes the object detector 156 to detect objects in the image data received from the cameras 132 and/or the sensor data from sensors 134. The object detector 156 also determines whether any detected object is located in an estimated trailer tire path 107. Objects determined to be in an estimated trailer tire path 107 are classified, and dimensions of the objects are determined, at 910. The method 900 further includes selecting at 912 the camera view for display on the user display 142 so that the curb C or other detected object is prominently displayed along with the tow vehicle 102 or the trailer 104. The camera view is selected based on the distance of the curb C to the tow vehicle 102 and to the trailer 104.
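The view selection at 912 chooses the camera feed in which the detected object appears most prominently, based on its distance to the tow vehicle 102 and to the trailer 104. One way such a selection might be expressed is sketched below; the camera labels and the distance threshold are illustrative assumptions only:

```python
def select_camera_view(dist_to_vehicle_m, dist_to_trailer_m,
                       near_threshold_m=2.0):
    """Pick which camera feed to display so a detected object appears
    prominently (cf. step 912).  Camera labels and the 2 m threshold
    are illustrative assumptions, not taken from the disclosure."""
    if min(dist_to_vehicle_m, dist_to_trailer_m) > near_threshold_m:
        return "top_down_surround"      # object still far: bird's-eye view
    if dist_to_trailer_m <= dist_to_vehicle_m:
        return "trailer_side_camera"    # object closest to the trailer
    return "vehicle_side_camera"        # object closest to the tow vehicle
```

Under this sketch, as the trailer 104 swings toward a curb during the turn, the display would switch from the surround view to the trailer-side view, keeping the curb continuously in focus as described in paragraph [0028].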

[0032] The controller 150 determines at 914 whether or not the object determined to be in the estimated tire path 107 of the trailer 104 can be safely run over by the trailer 104 without damaging the trailer 104 and/or the object. In some aspects, this determination is based on various dimensions of the object, such as its height, the object type/classification, and known dimensions and/or characteristics of the trailer 104. At 916 and 918, images are created from the image data selected at 912 to include a highlight overlay over the objects determined to be in a tire path 107 of the trailer 104 as well as over the corresponding estimated tire paths 107 of the trailer 104. Objects in an estimated tire path 107 of the trailer 104 which can be safely run over by the trailer 104 are highlighted in the image(s) with an overlay in a first color at 916, such as yellow, and objects in the estimated trailer tire path 107 which cannot be safely run over by the trailer 104 are highlighted with an overlay in a second color at 918, such as red. In addition, the estimated tire path 107 leading to the object that can be safely run over may also be highlighted with the first color, and the estimated tire path leading to the object that cannot be safely run over may be highlighted with the second color. At 920, the controller 150 sends one or more instructions to the user display 142 to display the images with the detected object and corresponding tire path overlaid in the first or second color. The displayed images with the overlaid, highlighted object(s) and estimated trailer path(s) 107 serve to inform the driver of the tow vehicle 102 of the nature and/or extent of an upcoming collision with the detected object, in response to which the tow vehicle driver may take action to avoid the collision or, if the object can be safely run over, take no action and allow the trailer 104 to run over the object.
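The determination at 914 combines the object's classification and dimensions with known characteristics of the trailer 104. A minimal sketch of such a decision follows; the class names, clearance value, and height rule are illustrative assumptions and do not come from the disclosure:

```python
def can_safely_run_over(obj_class, obj_height_m,
                        trailer_clearance_m=0.15,
                        runover_classes=frozenset({"curb", "speed_bump"})):
    """Decide whether the trailer can drive over a detected object
    without damage (cf. step 914).  The class whitelist, the 0.15 m
    clearance, and the height comparison are illustrative assumptions."""
    if obj_class not in runover_classes:
        return False          # e.g. a pole or pedestrian: never safe
    # Low, drivable objects must fit under the trailer's clearance.
    return obj_height_m <= trailer_clearance_m
```

The boolean result would then select between the first-color overlay at 916 and the second-color overlay at 918.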

[0033] Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.

[0034] These computer programs (also known as programs, software, software applications or code) include machine instructions for a programmable processor, and can be implemented in a high-level procedural and/or object-oriented programming language, model-based design with auto-code generation, and/or in assembly/machine language. As used herein, the terms “machine-readable medium” and “computer-readable medium” refer to any computer program product, apparatus and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term “machine-readable signal” refers to any signal used to provide machine instructions and/or data to a programmable processor.

[0035] Implementations of the subject matter and the functional operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Moreover, subject matter described in this specification can be implemented as one or more computer program products, i.e., one or more modules of computer program instructions encoded on a computer readable medium for execution by, or to control the operation of, data processing apparatus. The computer readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them. The terms “data processing apparatus”, “computing device” and “computing processor” encompass all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers. The apparatus can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them. A propagated signal is an artificially generated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal that is generated to encode information for transmission to suitable receiver apparatus.

[0036] Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results.
In certain circumstances, multi-tasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.

[0037] A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the disclosure. Accordingly, other implementations are within the scope of the following claims.