US11360375B1 - Stereoscopic camera alignment via laser projection - Google Patents

Stereoscopic camera alignment via laser projection

Info

Publication number
US11360375B1
US11360375B1
Authority
US
United States
Prior art keywords
stereoscopic camera
camera
convergence point
iteratively
roll
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires 2040-03-24
Application number
US16/814,413
Inventor
Daniel J. Henry
Nathan A. Vaughn
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Rockwell Collins Inc
Original Assignee
Rockwell Collins Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Rockwell Collins Inc
Priority to US16/814,413
Assigned to ROCKWELL COLLINS, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HENRY, DANIEL J., VAUGHN, NATHAN A.
Application granted
Publication of US11360375B1
Legal status: Active (current)
Adjusted expiration date: 2040-03-24

Classifications

    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 35/00 Stereoscopic photography
    • G03B 35/08 Stereoscopic photography by simultaneous recording
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/204 Image signal generators using stereoscopic image cameras
    • H04N 13/246 Calibration of cameras
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 15/00 Special procedures for taking photographs; Apparatus therefor
    • G03B 15/006 Apparatus mounted on flying objects

Abstract

A system and method for calibrating stereoscopic cameras in an aircraft utilizes a plurality of lasers aimed at a desired convergence point, such as at a point on a runway or other ground facility or projection surface. Each of the stereoscopic cameras is adjusted in terms of pitch, yaw, and roll to align the convergence point in each camera. Misalignments are iteratively corrected, as required. An alignment crosshair is applied, physically or electronically, and the convergence point is aligned to the alignment crosshair.

Description

BACKGROUND
Stereoscopic 3D camera systems installed on an aircraft must be properly aligned for optimal 3D imaging performance. Misalignment of the cameras in a 3D camera system will cause a degradation of the 3D effect projected to the user. In some applications, such as air-to-air refueling, a degraded 3D effect can have dangerous consequences. A tanker boom operator can mistakenly scrape a refueling boom along the surface of the aircraft being refueled. On a stealth aircraft, such scraping could damage the stealth coating and jeopardize mission safety.
SUMMARY
In one aspect, embodiments of the inventive concepts disclosed herein are directed to a system and method for calibrating stereoscopic cameras in an aircraft. A plurality of lasers are aimed at a desired convergence point, such as at a point on a runway or other ground facility. Each of the stereoscopic cameras is adjusted in terms of pitch, yaw, and roll to align the convergence point in each camera. Misalignments are iteratively corrected.
In a further aspect, an alignment crosshair is applied, physically or electronically, and the convergence point is aligned to the alignment crosshair.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and should not restrict the scope of the claims. The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate exemplary embodiments of the inventive concepts disclosed herein and together with the general description, serve to explain the principles.
BRIEF DESCRIPTION OF THE DRAWINGS
The numerous advantages of the embodiments of the inventive concepts disclosed herein may be better understood by those skilled in the art by reference to the accompanying figures in which:
FIG. 1A shows an environmental side view of an exemplary embodiment;
FIG. 1B shows an environmental top view of an exemplary embodiment;
FIG. 2 shows a block diagram of a system according to an exemplary embodiment;
FIG. 3A shows a view as seen through a center camera;
FIG. 3B shows a view as seen through a calibrated stereoscopic camera system;
FIG. 4 shows a view as seen through a misaligned stereoscopic camera system;
FIG. 5 shows a view during a step of a calibration process according to an exemplary embodiment;
FIG. 6 shows a view during a step of a calibration process according to an exemplary embodiment;
FIG. 7 shows a view during a step of a calibration process according to an exemplary embodiment;
FIG. 8A shows a view during an alignment process according to an exemplary embodiment;
FIG. 8B shows a view during an alignment process according to an exemplary embodiment;
FIG. 8C shows a view during an alignment process according to an exemplary embodiment;
FIG. 9A shows a view as seen through a center camera;
FIG. 9B shows a view as seen through a calibrated stereoscopic camera system;
FIG. 10 shows a view as seen through a calibrated stereoscopic camera system;
FIG. 11 shows a block diagram of an exemplary embodiment of a stereoscopic calibration system;
FIG. 12 shows a flowchart of an exemplary embodiment.
DETAILED DESCRIPTION
Before explaining at least one embodiment of the inventive concepts disclosed herein in detail, it is to be understood that the inventive concepts are not limited in their application to the details of construction and the arrangement of the components or steps or methodologies set forth in the following description or illustrated in the drawings. In the following detailed description of embodiments of the instant inventive concepts, numerous specific details are set forth in order to provide a more thorough understanding of the inventive concepts. However, it will be apparent to one of ordinary skill in the art having the benefit of the instant disclosure that the inventive concepts disclosed herein may be practiced without these specific details. In other instances, well-known features may not be described in detail to avoid unnecessarily complicating the instant disclosure. The inventive concepts disclosed herein are capable of other embodiments or of being practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein is for the purpose of description and should not be regarded as limiting.
As used herein a letter following a reference numeral is intended to reference an embodiment of the feature or element that may be similar, but not necessarily identical, to a previously described element or feature bearing the same reference numeral (e.g., 1, 1 a, 1 b). Such shorthand notations are used for purposes of convenience only, and should not be construed to limit the inventive concepts disclosed herein in any way unless expressly stated to the contrary.
Further, unless expressly stated to the contrary, “or” refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).
In addition, use of the “a” or “an” are employed to describe elements and components of embodiments of the instant inventive concepts. This is done merely for convenience and to give a general sense of the inventive concepts, and “a” and “an” are intended to include one or at least one and the singular also includes the plural unless it is obvious that it is meant otherwise.
Finally, as used herein any reference to “one embodiment,” or “some embodiments” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the inventive concepts disclosed herein. The appearances of the phrase “in some embodiments” in various places in the specification are not necessarily all referring to the same embodiment, and embodiments of the inventive concepts disclosed may include one or more of the features expressly described or inherently present herein, or any combination or sub-combination of two or more such features, along with any other features which may not necessarily be expressly described or inherently present in the instant disclosure.
Broadly, embodiments of the inventive concepts disclosed herein are directed to a system and method for calibrating stereoscopic cameras in an aircraft. A plurality of lasers are aimed at a desired convergence point, such as at a point on a runway or other ground facility. Each of the stereoscopic cameras is adjusted in terms of pitch, yaw, and roll to align the convergence point in each camera. Misalignments are iteratively corrected.
Referring to FIGS. 1A-1B, environmental side and top views of an exemplary embodiment are shown. An aircraft 100 including a stereoscopic camera alignment system 104 according to at least one embodiment projects a plurality of laser beams 102 from different, known locations within the aircraft 100, toward a convergence point. The convergence point is a point at a predefined distance from the stereoscopic cameras, at which it is desirable that the stereoscopic cameras be calibrated.
Referring to FIG. 2, a block diagram of a system according to an exemplary embodiment is shown. Where cameras 200, 202 of a stereoscopic camera system require calibration, a set of lasers 206, 208, 210 is aimed at a convergence point 212 within the field of view of each camera 200, 202.
In at least one embodiment, the set of lasers 206, 208, 210 comprises a center, crosshair laser 210. The crosshair laser 210 is projected along a centerline of the stereoscopic system to constrain calibration to the center of the system and prevent divergence toward either the left camera 200 or right camera 202. The convergence point 212 may be defined to conform to some desirable distance along the centerline according to a mission parameter of the stereoscopic system.
In at least one embodiment, the cameras 200, 202 may be parallel to each other, while the readouts of the corresponding camera focal plane for each camera 200, 202 are shifted or transformed. Such shifting results in the same convergence as physical toe-in, but does not introduce distortion (called keystone) in the images of the two cameras. When keystone is introduced, it must be removed via image warping to prevent eye strain in the corners of the image and depth distortion.
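As a rough numeric illustration of this shifted-readout geometry, the minimal sketch below computes the per-camera horizontal readout shift that converges a parallel rig at a chosen distance. It assumes a simple pinhole model with focal length given in pixels; the parameter names (baseline_m, convergence_m, focal_px) are illustrative, not taken from the patent.

```python
# Minimal sketch (not from the patent): per-camera readout shift for a
# parallel stereo rig, assuming a pinhole camera with focal length in pixels.

def readout_shift_px(baseline_m: float, convergence_m: float, focal_px: float) -> float:
    """Pixels each camera's focal-plane readout is shifted toward the
    centerline so the views converge at convergence_m without physical
    toe-in (and therefore without keystone distortion)."""
    # Each camera sits baseline_m / 2 off the centerline; similar triangles
    # give the image-plane offset of the convergence point.
    return focal_px * (baseline_m / 2.0) / convergence_m

# Example: 1 m baseline, convergence at 25 m, 2000 px focal length.
print(readout_shift_px(1.0, 25.0, 2000.0))  # 40.0 px per camera
```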
One or more intersecting lasers 206, 208 may be used to more precisely define the desirable distance so that the convergence point 212 at the center of the crosshair laser 210 is maintained at that distance.
Referring to FIGS. 3A-3B, views as seen through a center camera 300 and a calibrated stereoscopic camera system 302 are shown. It may be appreciated that the center camera view 300 is hypothetical in nature; the stereoscopic camera system being calibrated may not include a center camera. The calibrated stereoscopic view 302 includes overlapping images of a projected crosshair laser, the views converging on a convergence point for parallax. Horizontal alignment and alignment at the central convergence point are the desired factors. It may be appreciated that FIGS. 3A-3B show only the crosshair laser, not any intersecting lasers.
Referring to FIG. 4, a view as seen through a misaligned stereoscopic camera system is shown. In a misaligned stereoscopic camera system, an image from the right camera 400 and an image from the left camera 402 are misaligned both in terms of horizontal alignment and convergence point.
Referring to FIG. 5, a view during a step of a calibration process according to an exemplary embodiment is shown. In a first step, misalignments in images from the left camera and right camera are identified in terms of pitch and/or yaw (along vertical and/or horizontal axes with respect to the corresponding camera), and an adjustment (either physical or an image transformation) is determined to align or overlap the central convergence points. It may be appreciated that the process requires substantially contemporaneous adjustment of the left camera and right camera. Simultaneous overlapping views 500, 502 from the left camera and right camera may be used to facilitate such contemporaneous alignment. In at least one embodiment, it may be desirable for at least one camera to apply a filter to facilitate unique identification of the left camera image from the right camera image for automatic calibration.
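A minimal sketch of how this first step might be automated is shown below. It assumes the laser crosshair is the brightest feature in the frame, detectable by intensity thresholding and a centroid, and that a pinhole model relates pixel offsets to angles; the function names and detection method are assumptions, not the patent's prescribed approach.

```python
# Minimal sketch of automating the pitch/yaw step (assumptions noted above).
import numpy as np

def crosshair_center(image: np.ndarray, threshold: float = 0.9) -> tuple[float, float]:
    """Centroid (x, y) of pixels at or above threshold * max intensity."""
    ys, xs = np.nonzero(image >= threshold * image.max())
    return float(xs.mean()), float(ys.mean())

def pitch_yaw_correction(image: np.ndarray, target_xy: tuple[float, float],
                         focal_px: float) -> tuple[float, float]:
    """Angular corrections (pitch, yaw), in radians, that move the detected
    convergence point onto target_xy for one camera."""
    cx, cy = crosshair_center(image)
    yaw = np.arctan2(target_xy[0] - cx, focal_px)    # horizontal error -> yaw
    pitch = np.arctan2(target_xy[1] - cy, focal_px)  # vertical error -> pitch
    return float(pitch), float(yaw)
```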
Referring to FIG. 6, a view during a step of a calibration process according to an exemplary embodiment is shown. In a second step, misalignments in images from the left camera and right camera are identified in terms of roll, and an adjustment (either physical or an image transformation) is determined to align or parallelize horizon lines of overlapping views 600, 602 from the left camera and right camera.
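The roll estimate in this second step could, for example, come from a least-squares line fit through the crosshair's horizontal arm, as in the hedged sketch below; the band mask that isolates the horizontal arm and the threshold are heuristics assumed for illustration.

```python
# Minimal sketch of the roll step: fit a line through the bright pixels of
# the crosshair's horizontal arm and take the slope angle.
import numpy as np

def roll_misalignment(image: np.ndarray, center_y: float,
                      band_px: int = 20, threshold: float = 0.9) -> float:
    """Roll angle (radians) of the horizontal laser line; 0.0 when level."""
    ys, xs = np.nonzero(image >= threshold * image.max())
    keep = np.abs(ys - center_y) < band_px        # isolate the horizontal arm
    slope, _ = np.polyfit(xs[keep], ys[keep], 1)  # y = slope * x + intercept
    return float(np.arctan(slope))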
Referring to FIG. 7, a view during a step of a calibration process according to an exemplary embodiment is shown. Because the pitch, yaw, and roll axes of each of the left camera and right camera are related, adjustments to correct misalignments in roll axes are likely to create misalignments in terms of pitch and/or yaw. Therefore, in a third step, new misalignments in the left camera and right camera are iteratively identified and new adjustments in terms of pitch, yaw, and roll are iteratively determined to re-align the central convergence points of the overlapping views 700, 702.
Referring to FIGS. 8A-8C, views during an alignment process according to an exemplary embodiment are shown. In at least one embodiment, a calibration system may project one or more intersecting lines 800, 802 to further align the convergence point 804 of the central crosshair laser to a desired location in the camera image. FIG. 8B shows a properly located plane of convergence: the intersection of the dashed crosshairs is aligned with the center of the crosshair. FIGS. 8A and 8C show an incorrectly located plane of convergence. If the plane of convergence is either too far or too close, the intersection of the dashed crosshairs will not be aligned with the center of the crosshair.
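The range sensitivity behind FIGS. 8A-8C follows from similar triangles: a laser aimed from a lateral offset toward the centerline crossing lands off-center by an amount proportional to the range error. A small illustrative calculation is given below, with assumed numbers and each intersecting laser simplified to a single beam.

```python
# Illustrative calculation (assumed numbers): a laser aimed from lateral
# offset x0 toward the centerline crossing at range z0 lands off-center
# when the projection surface is at a different range z.

def spot_offset_m(x0_m: float, z0_m: float, z_m: float) -> float:
    """Lateral miss distance of the intersecting laser's spot from the
    crosshair center for a surface at range z_m instead of z0_m."""
    return x0_m * (1.0 - z_m / z0_m)

print(spot_offset_m(0.5, 25.0, 25.0))  # 0.0  -> correctly located plane (FIG. 8B)
print(spot_offset_m(0.5, 25.0, 20.0))  # ~0.1 -> plane too close (FIGS. 8A/8C)
```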
Referring to FIGS. 9A-9B, views as seen through a hypothetical center camera and a calibrated stereoscopic camera system are shown. The hypothetical center camera view in FIG. 9A shows the crosshair laser and projected intersecting lines. The calibrated stereoscopic view includes overlapping images of the crosshair laser and intersecting lines converging on a convergence point.
Referring to FIG. 10, a view as seen through a calibrated stereoscopic camera system is shown. A calibration system may utilize a thin crosshair 1002 or other display reference point, either as a film applied to the display or rendered on the display, to align the horizontal features of the crosshair laser and the convergence point to the display.
Referring to FIG. 11, a block diagram of an exemplary embodiment of a stereoscopic calibration system is shown. The system comprises a processor 1100, non-transitory computer readable medium having a memory 1102 connected to the processor 1100 for storing processor executable code, and a left camera 1104 and right camera 1106 connected to the processor 1100. The left camera 1104 and right camera 1106 are calibrated with respect to one or more laser 1108 projections.
In at least one embodiment, the processor 1100 iteratively identifies misalignments in images from each of the left camera 1104 and right camera 1106, and determines adjustments. In one embodiment, the adjustments may be applied via physical manipulation of the left camera 1104 and right camera 1106. Alternatively, or in addition, the adjustments comprise image transformations applied to streaming images from each of the left camera 1104 and right camera 1106.
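Where the corrections are electronic, each streamed frame might be warped by a fixed 2x3 affine matrix per camera, roughly as sketched below. OpenCV is an assumed dependency here, and composing a roll rotation with a pixel translation is one plausible realization, not the patent's specified transform.

```python
# Minimal sketch of the electronic correction path: a 2x3 affine matrix per
# camera (roll rotation about the image center plus a pixel translation for
# residual pitch/yaw) applied to every streamed frame.
import cv2
import numpy as np

def correction_matrix(roll_rad: float, dx_px: float, dy_px: float,
                      center: tuple[float, float]) -> np.ndarray:
    """2x3 affine: rotate by roll_rad about center, then translate."""
    m = cv2.getRotationMatrix2D(center, np.degrees(roll_rad), 1.0)
    m[0, 2] += dx_px
    m[1, 2] += dy_px
    return m

def correct_frame(frame: np.ndarray, m: np.ndarray) -> np.ndarray:
    h, w = frame.shape[:2]
    return cv2.warpAffine(frame, m, (w, h))
```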
In at least one embodiment, the processor 1100 independently analyzes images from the left camera 1104 and right camera 1106 to identify misalignments with respect to at least a crosshair laser in the one or more lasers 1108. The processor 1100 iteratively determines transformations to align a convergence point and horizontal features of the crosshair laser in each camera image.
It should be appreciated that, while embodiments discussed herein include a crosshair laser, other embodiments are envisioned.
Referring to FIG. 12, a flowchart of an exemplary embodiment is shown. A plurality of lasers are projected 1200 at a convergence point within the field of view of a stereoscopic camera system. Images from each of a first camera and second camera are received 1202, 1204; for example, at a calibration display or by a calibration processor. Misalignments are identified via comparison of the received images to each other, and potentially with respect to a display reference point.
In one embodiment, an alignment correction 1206, 1208 for each of the first camera and second camera is determined with respect to pitch and/or yaw. Likewise, an alignment correction 1210, 1212 for each of the first camera and second camera is determined with respect to roll.
In at least one embodiment, an operator or calibration processor determines 1214 if application of the determined 1206, 1208, 1210, 1212 corrections causes a new misalignment. If so, the process repeats iteratively until the images from the first camera and second camera are aligned with respect to the projected 1200 lasers within an acceptable threshold.
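One way the FIG. 12 loop could be organized in software is sketched below; detect_offsets and apply_correction are placeholders standing in for the laser-spot analysis and for the physical or electronic correction, and the tolerance and iteration cap are arbitrary choices, not values from the patent.

```python
# Minimal sketch of the FIG. 12 iterative calibration loop.

def calibrate(cameras, detect_offsets, apply_correction,
              tol_px: float = 1.0, max_iters: int = 20) -> bool:
    for _ in range(max_iters):
        worst = 0.0
        for cam in cameras:                      # first camera, then second
            dx, dy, roll = detect_offsets(cam)   # error vs. projected lasers
            apply_correction(cam, dx, dy, roll)  # pitch/yaw, then roll
            worst = max(worst, abs(dx), abs(dy))
        if worst <= tol_px:                      # aligned within threshold
            return True
    return False                                 # did not converge; inspect rig
```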
In at least one embodiment, where the corrections comprise image transformations, linear transformation matrices for the first camera and second camera may be stored for later use.
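Persisting those matrices could be as simple as the following sketch; the file name and NumPy archive format are illustrative choices, not part of the disclosure.

```python
# Minimal sketch of persisting the per-camera matrices so the electronic
# correction can be restored without repeating the laser procedure.
import numpy as np

def save_corrections(left_m: np.ndarray, right_m: np.ndarray,
                     path: str = "stereo_cal.npz") -> None:
    np.savez(path, left=left_m, right=right_m)

def load_corrections(path: str = "stereo_cal.npz"):
    data = np.load(path)
    return data["left"], data["right"]
```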
By using a geometry of lasers projected onto a flat surface (such as a runway or portable projection surface), a set of two equally spaced stereoscopic cameras can be calibrated to a defined convergence point through an iterative visual process. Such process enables quick calibration of camera alignment and convergence point, without expensive or complicated equipment, to produce a 3D image on a compatible 3D display.
It is believed that the inventive concepts disclosed herein and many of their attendant advantages will be understood by the foregoing description of embodiments of the inventive concepts disclosed, and it will be apparent that various changes may be made in the form, construction, and arrangement of the components thereof without departing from the broad scope of the inventive concepts disclosed herein or without sacrificing all of their material advantages; and individual features from various embodiments may be combined to arrive at other embodiments. The form herein before described being merely an explanatory embodiment thereof, it is the intention of the following claims to encompass and include such changes. Furthermore, any of the features disclosed in relation to any of the individual embodiments may be incorporated into any other embodiment.

Claims (11)

What is claimed is:
1. A method for aligning a stereoscopic camera system comprising:
focusing a plurality of lasers on a convergence point, the plurality of lasers including a crosshair laser defining a centerline of a stereoscopic camera system and two intersecting lasers;
producing an overlapping image from a first stereoscopic camera and a second stereoscopic camera based on the convergence point;
identifying a misalignment in each of the first stereoscopic camera and the second stereoscopic camera along at least one of a pitch or a yaw axis with respect to the convergence point;
contemporaneously applying corrections to the misalignment of the first stereoscopic camera and the second stereoscopic camera along the pitch or yaw axis;
identifying a roll misalignment of one or more of the first stereoscopic camera or the second stereoscopic camera on a roll axis; and
applying a correction to the misalignment of the one or more of the first stereoscopic camera or the second stereoscopic camera along the roll axis.
2. The method of claim 1, further comprising:
iteratively identifying misalignments to pitch or yaw of the first stereoscopic camera and second stereoscopic camera; and
iteratively applying corrections to the iteratively identified misalignments.
3. The method of claim 1, further comprising:
iteratively identifying roll misalignments to the roll of the first stereoscopic camera and second stereoscopic camera; and
iteratively applying corrections to the iteratively identified roll misalignments.
4. The method of claim 1, further comprising:
determining a first transformation to images from the first stereoscopic camera to apply all the corrections to the first stereoscopic camera; and
determining a second transformation to images from the second stereoscopic camera to apply all the corrections to the second stereoscopic camera.
5. The method of claim 1, further comprising:
rendering a display reference on a display device;
aligning the convergence point to the display reference in the first stereoscopic camera; and
aligning the convergence point to the display reference in the second stereoscopic camera.
6. A system comprising:
a plurality of lasers aimed at a desired convergence point, the plurality of lasers including a crosshair laser defining a centerline of a stereoscopic camera system and two intersecting lasers;
a first stereoscopic camera;
a second stereoscopic camera; and
at least one processor in data communication with the first stereoscopic camera, second stereoscopic camera, and a non-transitory computer readable medium having a memory storing processor executable code to configure the at least one processor to:
produce an overlapping image from a first stereoscopic camera and a second stereoscopic camera based on the convergence point;
identify a misalignment in each of the first stereoscopic camera and the second stereoscopic camera along at least one of a pitch or a yaw axis with respect to the convergence point;
contemporaneously apply corrections to the misalignment of the first stereoscopic camera and the second stereoscopic camera along the pitch or yaw axis;
identify a roll misalignment of one or more of the first stereoscopic camera or the second stereoscopic camera on a roll axis; and
apply a correction to the misalignment of the one or more of the first stereoscopic camera or the second stereoscopic camera along the roll axis.
7. The system of claim 6, wherein the at least one processor is further configured to:
iteratively identify misalignments to pitch or yaw of the first stereoscopic camera and second stereoscopic camera; and
iteratively apply corrections to the iteratively identified misalignments.
8. The system of claim 6, wherein the at least one processor is further configured to:
iteratively identify roll misalignments to the roll of the first stereoscopic camera and second stereoscopic camera; and
iteratively apply corrections to the iteratively identified roll misalignments.
9. The system of claim 6, wherein the at least one processor is further configured to:
determine a first transformation to images from the first stereoscopic camera to apply all the corrections to the first stereoscopic camera; and
determine a second transformation to images from the second stereoscopic camera to apply all the corrections to the second stereoscopic camera.
10. The system of claim 6, wherein the at least one processor is further configured to:
render a display reference on a display device;
align the convergence point to the display reference in the first stereoscopic camera; and
align the convergence point to the display reference in the second stereoscopic camera.
11. The system of claim 6, wherein the at least one processor is further configured to apply a filter to images from the first stereoscopic camera to distinguish images from the first stereoscopic camera from images from the second stereoscopic camera.
US16/814,413 2020-03-10 2020-03-10 Stereoscopic camera alignment via laser projection Active 2040-03-24 US11360375B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/814,413 US11360375B1 (en) 2020-03-10 2020-03-10 Stereoscopic camera alignment via laser projection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US16/814,413 US11360375B1 (en) 2020-03-10 2020-03-10 Stereoscopic camera alignment via laser projection

Publications (1)

Publication Number Publication Date
US11360375B1 (en) 2022-06-14

Family

ID=81944217

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/814,413 Active 2040-03-24 US11360375B1 (en) 2020-03-10 2020-03-10 Stereoscopic camera alignment via laser projection

Country Status (1)

Country Link
US (1) US11360375B1 (en)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7209161B2 (en) * 2002-07-15 2007-04-24 The Boeing Company Method and apparatus for aligning a pair of digital cameras forming a three dimensional image to compensate for a physical misalignment of cameras
US8098275B2 (en) 2003-12-30 2012-01-17 The Trustees Of The Stevens Institute Of Technology Three-dimensional imaging system using optical pulses, non-linear optical mixers and holographic calibration
US20080192110A1 (en) * 2005-05-13 2008-08-14 Micoy Corporation Image capture and processing
US20130016186A1 (en) * 2011-07-13 2013-01-17 Qualcomm Incorporated Method and apparatus for calibrating an imaging device
WO2014153429A1 (en) 2013-03-20 2014-09-25 Trimble Navigation Limited Indoor navigation system and method
US8866913B1 (en) * 2013-04-08 2014-10-21 Omnivision Technologies, Inc. Systems and methods for calibration of a 360 degree camera system
US10088296B2 (en) 2014-09-10 2018-10-02 Faro Technologies, Inc. Method for optically measuring three-dimensional coordinates and calibration of a three-dimensional measuring device
US20170350968A1 (en) 2016-06-06 2017-12-07 Goodrich Corporation Single pulse lidar correction to stereo imaging
US20180300901A1 (en) 2017-04-18 2018-10-18 Panasonic Intellectual Property Management Co., Ltd. Camera calibration method, recording medium, and camera calibration apparatus
US20180300900A1 (en) 2017-04-18 2018-10-18 Panasonic Intellectual Property Management Co., Ltd. Camera calibration method, recording medium, and camera calibration apparatus
JP2019020371A (en) 2017-07-11 2019-02-07 宏介 津留 Aircraft indicator for orientation and calibration of aircraft-mounted camera and laser range finder

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Vilaca, Joao et al., "Stereo Vision Calibration Procedure for 3D Surface Measurements", IEEE Xplore, DOI: 10.1109/IECON.2006.347777, Conference Paper, Dec. 2006, 7 pages.

Similar Documents

Publication Publication Date Title
US11269244B2 (en) System and method for calibrating a display system using manual and semi-manual techniques
US8866913B1 (en) Systems and methods for calibration of a 360 degree camera system
US7889234B2 (en) Automatic calibration for camera lens distortion correction
CN105306805B (en) Apparatus and method for correcting image distortion of camera for vehicle
US9210303B2 (en) Lens distortion correction method
US20180040141A1 (en) Camera extrinsic parameters estimation from image lines
CN113781369B (en) Aligning digital images
KR20100064031A (en) Method and apparatus for correcting depth image
US10628967B2 (en) Calibration for multi-camera systems
CN109570750B (en) Laser galvanometer precision online correction system and method
CN105757422A (en) Positioning device and method for correcting camera parallelism and distance
US8055100B2 (en) Method and system for image registration quality confirmation and improvement
US11360375B1 (en) Stereoscopic camera alignment via laser projection
US20190047706A1 (en) Flight direction display method and apparatus, and unmanned aerial vehicle
WO2020114433A1 (en) Depth perception method and apparatus, and depth perception device
CN109945779A (en) Correction system and correlation method with an at least video camera
CN109693371A (en) Three-dimensional printing system and its manufacturing method
WO2021035606A1 (en) Course line adjustment method, ground device, unmanned aerial vehicle, system, and storage medium
KR102477480B1 (en) Apparatus for calibrating for around view camera and method thereof
KR101886840B1 (en) Method and apparatus for geometric correction based on user interface
CN113411547A (en) Position correction method and device of holder
KR102044098B1 (en) Apparatus and method for calibrating blind spot detection
CN113959374B (en) Image-based laser holder optical axis correction method and device
US20150053848A1 (en) Identifying and Correcting Hogel and Hogel Beam Parameters
CN110579339A (en) polarization angle calibration method, device, equipment, optical system and storage medium

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE