Patent 12136276
Prior art
Earlier patents, publications, and products that may anticipate or render the claims unpatentable.
As a senior US patent analyst, I have reviewed the provided documentation and performed a search for US Patent No. 12136276.
Based on my analysis of the patent text and publicly available USPTO information, the list of prior art references cited by the examiner during prosecution of US Patent 12136276 is not contained in the provided patent text. A comprehensive anticipation analysis requires the official "References Cited" section from the patent's file history.
However, the patent text itself references a commonly-owned U.S. application, which can be considered relevant art. I will analyze this reference.
Analysis of Relevant Art
Referenced Application: U.S. Application Serial No. 17/173,950
Full Citation: U.S. Patent Application Publication No. US 2022/0254249 A1 (This is the publication of application Ser. No. 17/173,950).
Filing Date: February 11, 2021.
Brief Description: This application describes a system for detecting lane markers and estimating distances to objects using a monocular camera. It details a geometric algorithm to determine camera parameters, such as camera height and road plane normal, by detecting lane lines in video frames. It also discusses calculating an Inverse Perspective Mapping (IPM) to rectify the view of the road and using this information to fit equidistant parallel lanes. The system can use this initialized data for downstream tasks like lane departure warnings and distance estimation to other vehicles.
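To illustrate the kind of geometric calculation the referenced application describes, the sketch below intersects two detected lane lines to find their vanishing point and derives a camera pitch from it. This is a minimal pinhole-camera illustration, not the application's actual algorithm; all function names, parameterizations, and numeric values are assumptions.

```python
import math

def pitch_from_lane_vanishing_point(lane_a, lane_b, fy, cy):
    """Illustrative only: each lane line is (slope, intercept) in pixel
    coordinates. Parallel lanes on a flat road meet at a vanishing point
    whose vertical offset from the principal point gives the camera pitch
    under a simple pinhole model."""
    m1, b1 = lane_a
    m2, b2 = lane_b
    # Intersection of y = m1*x + b1 and y = m2*x + b2.
    vx = (b2 - b1) / (m1 - m2)
    vy = m1 * vx + b1
    # Pitch: angle between the optical axis and the road direction,
    # from the vanishing point's offset below/above the principal row.
    pitch = math.atan2(cy - vy, fy)
    return (vx, vy), pitch
```

For example, two lane lines with slopes of opposite sign intersect at a single vanishing point; a vanishing point above the principal point yields a positive (downward-tilted) pitch in this convention.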
Potential Anticipation Analysis:
This application is highly relevant because it forms the basis for some of the geometric calculations in patent 12136276. The patent under analysis repeatedly incorporates it by reference, stating, "Further detail on computing camera properties based on detected lane lines is provided in commonly-owned application Ser. No. 17/173,950, filed Feb. 11, 2021, and incorporated by reference in its entirety" (see, e.g., Description, FIG. 2, process 206). While not strictly "prior art" capable of anticipating the claims under 35 U.S.C. § 102 (due to common ownership and continuation status), its disclosure is foundational. The key distinctions and the asserted novelty in patent 12136276 lie in the methods of initialization, specifically:
- Using a deep learning model to automatically identify a horizon line to derive camera parameters (as claimed in various independent claims).
- A network-based system involving human annotators to confirm or manually define the horizon/lane lines for remote initialization (described in FIG. 2).
- A multi-headed neural network that can directly predict camera parameters from an image (described in FIG. 3B).
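The first of these initialization methods, deriving camera parameters from a detected horizon, can be sketched as follows. Given two points on a horizon line (as a DL model might output), camera roll follows from the horizon's tilt and pitch from its vertical offset at the principal point. This is a hedged pinhole-camera sketch; the function name and signature are hypothetical, not taken from the patent.

```python
import math

def camera_params_from_horizon(p_left, p_right, fy, cx, cy):
    """Illustrative only: recover roll and pitch from two points on a
    detected horizon line, assuming a simple pinhole camera."""
    (x1, y1), (x2, y2) = p_left, p_right
    # Roll: tilt of the horizon relative to the image rows.
    roll = math.atan2(y2 - y1, x2 - x1)
    # Interpolate the horizon's height at the principal column cx,
    # then convert its vertical offset from cy into a pitch angle.
    y_h = y1 + (y2 - y1) * (cx - x1) / (x2 - x1)
    pitch = math.atan2(cy - y_h, fy)
    return roll, pitch
```

A level horizon passing through the principal point yields zero roll and zero pitch; a horizon sitting above the principal point produces a positive pitch in this convention.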
Therefore, while application 17/173,950 discloses the underlying geometric principles and the use of camera parameters for lane detection, it does not appear to disclose the specific deep-learning-based and network-assisted initialization methods that are central to the claims of US 12136276. For instance, the independent claims of US 12136276 require steps such as "identifying a horizon in the plurality of images by inputting the plurality of images into a deep learning (DL) model" and "determining one or more camera parameters based on the horizon." These steps appear to be the novel contribution over the referenced application.
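The multi-headed architecture of FIG. 3B, in which a shared representation feeds separate prediction heads, can be sketched schematically. This is a toy plain-numpy forward pass, not the patent's network: layer sizes, head outputs (a two-value horizon line and three camera parameters), and all names are assumptions for illustration.

```python
import numpy as np

def multi_head_forward(image_feat, w_shared, w_horizon, w_params):
    """Toy multi-headed regressor: one shared backbone layer feeds
    two heads, one predicting a horizon line (slope, offset) and one
    predicting camera parameters (height, pitch, roll)."""
    h = np.tanh(image_feat @ w_shared)  # shared representation
    horizon = h @ w_horizon             # head 1: 2 outputs
    cam_params = h @ w_params           # head 2: 3 outputs
    return horizon, cam_params

# Randomly initialized weights, purely to demonstrate the data flow.
rng = np.random.default_rng(0)
feat = rng.normal(size=(1, 8))
horizon, cam_params = multi_head_forward(
    feat,
    rng.normal(size=(8, 16)),
    rng.normal(size=(16, 2)),
    rng.normal(size=(16, 3)),
)
```

The design point is that both heads share the backbone's features, so a single forward pass yields the horizon estimate and the direct camera-parameter estimate together.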
Conclusion: Without the official list of examiner-cited prior art, a complete analysis is not possible. The most relevant piece of art identified through the patent text itself is the co-owned application Ser. No. 17/173,950. This reference describes the foundational geometric methods upon which US 12136276 builds but does not appear to anticipate the core claims related to using a deep learning model or a network-annotator system for the initial camera parameter estimation.