Patent 9792361
Derivative works
Defensive disclosure: derivative variations of each claim designed to render future incremental improvements obvious or non-novel.
Defensive Disclosure and Prior Art Generation for US 9,792,361
Publication Date: May 9, 2026
Disclosing Entity: [Internal Research Division]
Subject: Derivative Works and Obvious Variations of a System for Social Network-Based Roadway Condition Reporting
This document discloses a series of derivative inventions, technical variations, and cross-domain applications of the core concepts described in U.S. Patent 9,792,361 ("the '361 patent"). The purpose of this disclosure is to place these variations into the public domain, thereby establishing prior art against future patent applications claiming these or similar incremental improvements as novel. The following disclosures are described in sufficient detail to enable a Person Having Ordinary Skill in the Art (PHOSITA) to practice the inventions.
Derivative Disclosures Based on Core Claims
The following disclosures are extensions of the system described in Claim 1 of the '361 patent, which outlines a system for presenting location-dependent social network information based on a user's input.
Axis 1: Material & Component Substitution
Derivative 1.1: Vibro-Acoustic Interface for Eyes-Free Operation
Enabling Description: This variation replaces the standard visual user interface and spoken input port with a system optimized for eyes-free and non-verbal communication, suitable for cyclists or motorcyclists. The hardware data input port is a piezoelectric sensor array integrated into the vehicle's handlebars or the user's gloves, configured to detect specific tap sequences or pressure patterns (e.g., double-tap for "pothole," long-press for "gravel"). The user's request is encoded from these haptic inputs. The automated hardware user interface is replaced with a combination of a high-fidelity tactile transducer providing patterned vibrations and a bone-conduction audio transducer. For example, a received "pothole ahead" alert is presented as a sharp, localized vibration on the left handlebar and a low-frequency tone via the bone-conduction headset, indicating a hazard on the left side of the travel path. The geospatial positioning system remains a core component, but the entire interaction loop is non-visual and non-verbal.
Mermaid.js Diagram:
```mermaid
sequenceDiagram
    participant User
    participant Piezo_Input as Piezoelectric Sensor Array
    participant Processor
    participant Comms_Port as Communication Port
    participant Social_DB as Social Network Database
    participant Haptic_UI as Vibro-Acoustic UI
    User->>Piezo_Input: Executes double-tap gesture
    Piezo_Input->>Processor: Transmits encoded "pothole" signal
    Processor->>Processor: Associates GPS coordinates
    Processor->>Comms_Port: Forms and transmits user request (pothole at location X,Y)
    Comms_Port->>Social_DB: Sends new roadway condition record
    Social_DB-->>Comms_Port: Acknowledges record & sends proximal alerts
    Comms_Port-->>Processor: Receives alert for "debris at location A,B"
    Processor->>Haptic_UI: Renders alert as specific vibration pattern & audio tone
    Haptic_UI->>User: Delivers tactile and bone-conduction feedback
```
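The gesture-to-condition encoding and alert-rendering steps above can be sketched as follows. The gesture names, condition codes, vibration patterns, and tone frequencies are illustrative assumptions, not values taken from the '361 patent.

```python
# Hypothetical mapping tables for the vibro-acoustic interface (assumed values).
GESTURE_TO_CONDITION = {
    "double_tap": "pothole",
    "long_press": "gravel",
    "triple_tap": "debris",
}

# Alert rendering: vibration pattern and bone-conduction tone per condition.
CONDITION_TO_FEEDBACK = {
    "pothole": ("sharp_pulse", 120),
    "gravel": ("slow_buzz", 90),
    "debris": ("double_pulse", 150),
}

def encode_report(gesture: str, lat: float, lon: float) -> dict:
    """Turn a detected tap gesture into a geo-tagged roadway condition record."""
    condition = GESTURE_TO_CONDITION[gesture]
    return {"condition": condition, "lat": lat, "lon": lon}

def render_alert(record: dict, bearing_to_hazard_deg: float) -> dict:
    """Map an incoming record to a vibro-acoustic cue.

    Hazards left of the travel path (negative bearing) vibrate the left
    handlebar; right of path, the right handlebar.
    """
    pattern, tone_hz = CONDITION_TO_FEEDBACK[record["condition"]]
    side = "left" if bearing_to_hazard_deg < 0 else "right"
    return {"side": side, "pattern": pattern, "tone_hz": tone_hz}

report = encode_report("double_tap", 40.7128, -74.0060)
alert = render_alert(report, bearing_to_hazard_deg=-15.0)
```

The key design point is that the whole loop stays non-visual: input is haptic, output is a (side, pattern, tone) triple rather than anything rendered to a screen.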
Derivative 1.2: Integrated Vehicle CAN-Bus and Lidar Sensor Suite as Input Port
Enabling Description: This derivative eliminates the need for manual user input by substituting the data input port with a direct interface to a vehicle's Controller Area Network (CAN-Bus) and its forward-facing Lidar/Radar sensors. The automated hardware processor continuously monitors the CAN-Bus for events indicative of a poor road condition, such as an ABS activation, a traction control event, or a sudden suspension compression/rebound signal from accelerometers. Simultaneously, it processes the Lidar point cloud data to identify physical anomalies on the road surface that correlate with the CAN-Bus events. When a correlation is confirmed (e.g., ABS event matches a Lidar-detected pothole), the processor automatically defines and transmits a roadway condition record to the social network database, complete with precise GPS coordinates, time, and a classification of the event (e.g., "Severe Bump," "Loss of Traction").
Mermaid.js Diagram:
```mermaid
flowchart TD
    A[CAN-Bus Monitor] --> C{Processor}
    B[Lidar/Radar Sensor] --> C
    C -- Reads data --> D[Event Detection Module]
    D -- ABS/Traction Event --> E{Event/Anomaly Correlation}
    D -- Suspension Spike --> E
    D -- Lidar Anomaly --> E
    E -- Correlation Confirmed --> F[Request Generation]
    F -- "Pothole @ Lat/Lon" --> G[Communication Port]
    G --> H((Social Network Database))
```
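The correlation step can be sketched as a time-window match between CAN-bus events and Lidar anomalies. The event names, 0.5 s window, and record fields are assumptions for demonstration only.

```python
# Assumed maximum time gap between a CAN event and a Lidar anomaly for them
# to count as the same physical occurrence.
CORRELATION_WINDOW_S = 0.5

def correlate(can_events: list, lidar_anomalies: list) -> list:
    """Pair CAN-bus events (e.g. ABS activation) with Lidar-detected surface
    anomalies occurring within the correlation window, and emit a roadway
    condition record for each confirmed pair."""
    records = []
    for ev in can_events:
        for anom in lidar_anomalies:
            if abs(ev["t"] - anom["t"]) <= CORRELATION_WINDOW_S:
                records.append({
                    "classification": ("Severe Bump"
                                       if ev["type"] == "suspension_spike"
                                       else "Loss of Traction"),
                    "lat": anom["lat"],
                    "lon": anom["lon"],
                    "t": ev["t"],
                })
    return records

can_events = [{"type": "abs_activation", "t": 10.2}]
lidar = [{"t": 10.4, "lat": 40.0, "lon": -75.0},
         {"t": 42.0, "lat": 40.1, "lon": -75.1}]  # uncorrelated; ignored
records = correlate(can_events, lidar)
```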
Axis 2: Operational Parameter Expansion
Derivative 2.1: High-Density Urban Swarm Operation at Millimeter-Wave Frequencies
Enabling Description: This disclosure describes the system operating in a dense urban environment with thousands of nodes (vehicles, delivery drones) per square kilometer. To handle the massive data volume and latency requirements, the communication interface port utilizes the 60 GHz millimeter-wave (mmWave) band for high-bandwidth, short-range vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) communication. Instead of each device querying a central database for every update, a localized, dynamic mesh network is formed. A new roadway condition record is first broadcast to peers within a 300-meter radius. A local "moderator" node (e.g., a V2I-equipped traffic light) aggregates and validates reports from multiple vehicles before forwarding a single, verified record to the larger social network database. The social network ranking factor becomes heavily weighted by the number of independent, localized confirmations.
Mermaid.js Diagram:
```mermaid
graph LR
    subgraph "Local Mesh Network (60 GHz mmWave)"
        V1[Vehicle 1] -- Reports Pothole --> V2
        V1 -- Reports Pothole --> V3
        V2[Vehicle 2] -- Confirms Pothole --> I1
        V3[Vehicle 3] -- Confirms Pothole --> I1
    end
    I1(Infrastructure Node) -- Aggregates & Verifies --> C[Communication Port]
    C -- Sends Verified Record --> DB[(Social Network DB)]
    DB -- Sends Regional Alerts --> C
    C -- Broadcasts to Mesh --> I1
```
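The moderator node's aggregation rule can be sketched as follows. The three-confirmation threshold and the record shape are illustrative assumptions.

```python
# Assumed number of independent vehicles required before the moderator node
# forwards a single verified record to the central database.
MIN_CONFIRMATIONS = 3

def aggregate(reports: list) -> list:
    """Group peer reports by (condition, rounded location) and forward one
    verified record once enough distinct vehicles confirm it. The social
    network ranking factor is weighted by the confirmation count."""
    buckets = {}
    for r in reports:
        key = (r["condition"], round(r["lat"], 4), round(r["lon"], 4))
        buckets.setdefault(key, set()).add(r["vehicle_id"])
    verified = []
    for (condition, lat, lon), vehicles in buckets.items():
        if len(vehicles) >= MIN_CONFIRMATIONS:
            verified.append({"condition": condition, "lat": lat, "lon": lon,
                             "rank_weight": len(vehicles)})
    return verified

reports = [
    {"vehicle_id": v, "condition": "pothole", "lat": 40.71281, "lon": -74.00601}
    for v in ("v1", "v2", "v3")
] + [{"vehicle_id": "v4", "condition": "debris", "lat": 40.72, "lon": -74.01}]
out = aggregate(reports)
```

Deduplicating by vehicle ID is what makes the confirmations "independent": a single vehicle re-reporting the same hazard does not raise the count.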
Derivative 2.2: Nanoscale Road Surface Condition Reporting
Enabling Description: This variation scales the system down to detect and report road surface characteristics at a microscopic level. The "mobile electronic device" is a specialized analysis vehicle equipped with an atomic force microscope (AFM) or a scanning acoustic microscope (SAM) mounted on a gimbaled, road-following arm. The data input port receives high-frequency topographical and material elasticity data from the microscope's cantilever. The processor analyzes this data stream in real-time to detect early signs of road distress, such as micro-cracking, aggregate polishing, or bitumen binder degradation, long before they become visible potholes. A "roadway condition record" in this context is a geo-tagged dataset of the surface's coefficient of friction or Young's modulus, which is transmitted to a database used by civil engineering and road maintenance authorities.
Mermaid.js Diagram:
```mermaid
classDiagram
    class AnalysisVehicle {
        +gps_module: GPS
        +comm_port: CommunicationPort
        +afm_scanner: AtomicForceMicroscope
        +processor: HardwareProcessor
        +scanRoadSurface()
        +analyzeSurfaceData()
        +generateMicroConditionRecord()
        +transmitRecord()
    }
    class AtomicForceMicroscope {
        +cantilever: Cantilever
        +laser_diode: Laser
        +photodetector: Photodetector
        +getTopographyData() array
        +getElasticityData() float
    }
    class MicroConditionRecord {
        +timestamp: datetime
        +location: GPSCoordinate
        +coefficient_of_friction: float
        +surface_hardness: GPa
    }
    AnalysisVehicle --> AtomicForceMicroscope : uses
    AnalysisVehicle --> MicroConditionRecord : creates
```
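The MicroConditionRecord from the class diagram can be sketched as a simple dataclass. Field names mirror the diagram; the friction threshold is an assumed maintenance value, not one specified in the disclosure.

```python
from dataclasses import dataclass

@dataclass
class MicroConditionRecord:
    timestamp: str                 # ISO 8601
    lat: float
    lon: float
    coefficient_of_friction: float
    surface_hardness_gpa: float    # Young's-modulus-derived hardness, in GPa

    def is_distressed(self, friction_floor: float = 0.4) -> bool:
        """Flag early road distress when measured friction drops below an
        assumed maintenance threshold (0.4 is illustrative)."""
        return self.coefficient_of_friction < friction_floor

rec = MicroConditionRecord("2026-05-09T12:00:00Z", 40.7, -74.0, 0.35, 22.5)
```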
Axis 3: Cross-Domain Application
Derivative 3.1: Aerospace - Real-Time In-Flight Turbulence and Systems Anomaly Reporting
Enabling Description: The core mechanism is applied to aviation. The "mobile electronic device" is an aircraft's avionics suite. The data input port is tied to the Flight Data Recorder (FDR) bus, capturing real-time data from accelerometers, pitot tubes, and engine sensors. A pilot's verbal report ("severe chop over waypoint XYZ") or an automated system trigger (e.g., vertical g-force exceeding a threshold) creates a "user request." The processor combines this with the aircraft's precise location (from GPS/INS) and altitude. The communication port transmits this record via ACARS or a satellite link to a "social network database" for atmospheric conditions, managed by air traffic control and shared among aircraft. The system presents received alerts (e.g., "moderate turbulence reported at your flight level in 50 nautical miles") on the navigation display, ranked by proximity, time, and the credibility of the reporting aircraft (e.g., reports from heavy jets are weighted higher).
Mermaid.js Diagram:
```mermaid
flowchart TD
    subgraph Aircraft_A
        A[FDR Bus Monitor] --> B{Event Trigger}
        C[Pilot Voice Input] --> B
        B -- Turbulence Event --> D[Processor]
        D -- "Associates 4D Location (Lat,Lon,Alt,Time)" --> E[Request Formation]
        E --> F[SATCOM/ACARS Port]
    end
    F --> G((ATC Atmospheric Database))
    G --> H[Other Aircraft]
    H --> I[Avionics Display]
    I -- "Turbulence Alert Ahead" --> J(Pilot B)
```
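The credibility-weighted ranking described above can be sketched as follows. The weight-class multipliers and the proximity/freshness scoring formula are assumptions chosen for illustration.

```python
# Assumed credibility multipliers: reports from heavy jets are weighted higher.
CREDIBILITY = {"heavy": 1.5, "medium": 1.0, "light": 0.7}

def rank_reports(reports: list, own_distance_nm: dict) -> list:
    """Sort turbulence reports for the navigation display: closer, fresher,
    and higher-credibility reports first (higher score = higher priority)."""
    def score(r):
        proximity = 1.0 / (1.0 + own_distance_nm[r["id"]])   # nearer is better
        freshness = 1.0 / (1.0 + r["age_min"])               # newer is better
        return proximity * freshness * CREDIBILITY[r["weight_class"]]
    return sorted(reports, key=score, reverse=True)

reports = [
    {"id": "a", "age_min": 5, "weight_class": "light"},
    {"id": "b", "age_min": 5, "weight_class": "heavy"},
]
ranked = rank_reports(reports, {"a": 50, "b": 50})
```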
Derivative 3.2: AgTech - Hyper-Local Soil and Pest Condition Reporting Network
Enabling Description: This system is adapted for precision agriculture. The "mobile electronic device" is a sensor probe mounted on an autonomous tractor or carried by a farmer. The data input port consists of a multi-spectral camera and electrochemical sensors for soil pH, moisture, and nitrogen levels. A user's spoken input ("looks like corn borer") or an automated detection by the camera's image recognition algorithm triggers a request. The processor tags the finding with GPS coordinates to a sub-meter accuracy. The request is sent via a LoRaWAN or cellular connection to a shared agricultural database. Farmers in the same region receive alerts on their farm management software, ranked by proximity and the type of threat. For example, a "fusarium head blight" warning would be ranked higher than a "low nitrogen" report during a critical growth stage.
Mermaid.js Diagram:
```mermaid
sequenceDiagram
    participant FD as Farmer/Drone
    participant Sensor_Probe
    participant Onboard_CPU
    participant Ag_Database as Agricultural Database
    participant Neighboring_Farms
    FD->>Sensor_Probe: Scan field section
    Sensor_Probe->>Onboard_CPU: Send soil/image data
    Onboard_CPU->>Onboard_CPU: Analyze for pests/deficiencies
    Onboard_CPU->>Ag_Database: Transmit record ("Corn Borer @ GPS XYZ")
    Ag_Database->>Neighboring_Farms: Push location-based alert
    Neighboring_Farms->>FD: Display alert on Farm Mgmt Software
```
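The alert-ranking rule from the description, where a disease threat outranks an agronomic advisory during a critical growth stage, can be sketched like this. The threat weights and boost factor are illustrative assumptions.

```python
# Assumed severity weights per threat type.
THREAT_WEIGHT = {"fusarium head blight": 10, "corn borer": 8, "low nitrogen": 3}

def rank_alerts(alerts: list, critical_growth_stage: bool) -> list:
    """Rank farm alerts by threat severity discounted by distance; disease
    threats get an extra boost during critical growth stages."""
    def score(a):
        base = THREAT_WEIGHT.get(a["threat"], 1) / (1.0 + a["distance_km"])
        if critical_growth_stage and a["threat"] == "fusarium head blight":
            base *= 2  # assumed boost during the critical window
        return base
    return sorted(alerts, key=score, reverse=True)

alerts = [{"threat": "low nitrogen", "distance_km": 1.0},
          {"threat": "fusarium head blight", "distance_km": 5.0}]
ranked = rank_alerts(alerts, critical_growth_stage=True)
```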
Derivative 3.3: Consumer Electronics - Public Space Digital Infrastructure Quality Reporting
Enabling Description: This application crowdsources the quality of public digital infrastructure like Wi-Fi hotspots or 5G cellular cells. A user's smartphone is the "mobile electronic device." The system runs as a background service. The "user input" is automatically generated when the device's networking stack detects a poor quality of service (e.g., high packet loss, low throughput, failed authentication on a public Wi-Fi network). The processor creates a record with the device location, the network SSID or Cell ID, and the specific performance metric. This is sent to a public database. Other users approaching that location can query the database to see real-time network quality, with data ranked by recency and the number of corroborating reports. The user interface could be an augmented reality overlay that shows color-coded indicators (green, yellow, red) over nearby cafes or transit stations, representing their Wi-Fi quality.
Mermaid.js Diagram:
```mermaid
stateDiagram-v2
    [*] --> Idle
    Idle --> Monitoring: User enters public space
    Monitoring --> Reporting: WiFi packet loss > 20%
    Reporting --> Monitoring: Report sent to DB
    Reporting: Create record (SSID, GPS, Loss%)
    Reporting: Transmit via cellular backup
    Monitoring --> Idle: User leaves public space
```
Axis 4: Integration with Emerging Tech
Derivative 4.1: AI-Powered Predictive Road Condition Modeling
Enabling Description: The system is enhanced with a server-side AI model (e.g., a spatio-temporal graph neural network). The social network database feeds historical and real-time road condition reports, along with weather data (from NOAA) and traffic flow data (from DOT sensors), into the model. The AI learns to predict the formation of hazardous conditions, such as ice forming on a specific overpass when the temperature drops below a certain point with precipitation, even before a user reports it. When a user queries the system, they receive not only user-reported data but also AI-generated predictive alerts ("High probability of black ice on Exit 23 ramp in the next 30 minutes"). The "social network ranking factor" is augmented by an AI-calculated confidence score for the prediction.
Mermaid.js Diagram:
```mermaid
graph TD
    A[User Reports] --> D{AI Model}
    B[Weather Data] --> D
    C[Traffic Flow Data] --> D
    D -- Generates Predictions --> E[Predictive Alert Database]
    F[Mobile Device] -- Sends Request --> G{Query Handler}
    H[Social Network DB] --> G
    E --> G
    G -- Returns Merged Data --> F
```
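The query handler's merge of user reports with AI predictions can be sketched as follows. The confidence field, the 0.7 threshold, and the merge rule are illustrative assumptions.

```python
def merge_results(user_reports: list, predictions: list,
                  min_confidence: float = 0.7) -> list:
    """Return user-reported records plus AI predictions whose confidence
    clears an assumed threshold, sorted so higher-confidence items come
    first. User reports are treated as ground truth (confidence 1.0)."""
    merged = [dict(r, source="user", confidence=1.0) for r in user_reports]
    merged += [dict(p, source="ai") for p in predictions
               if p["confidence"] >= min_confidence]
    return sorted(merged, key=lambda r: r["confidence"], reverse=True)

reports = [{"condition": "pothole"}]
preds = [{"condition": "black ice", "confidence": 0.85},
         {"condition": "flooding", "confidence": 0.40}]  # below threshold
results = merge_results(reports, preds)
```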
Derivative 4.2: IoT Sensor Fusion for Automated Record Generation
Enabling Description: This derivative moves beyond the single-vehicle context and creates a fully automated reporting system using a distributed network of IoT sensors. The system integrates data from: 1) Piezoelectric strain gauges embedded in roadways and bridges to detect vehicle weight and structural stress. 2) Acoustic sensors alongside roads to detect the sound signature of hydroplaning or tire screeching. 3) Smart city cameras with computer vision algorithms to spot flooding or debris. A central server, acting as the "processor," fuses these disparate data streams. A "roadway condition record" is automatically created when sensor data from multiple sources cross-correlates (e.g., a strain gauge detects a heavy load, followed by a hydroplaning acoustic signature, and visual water detection from a camera), creating a high-confidence, automated "flooding" alert without any human intervention.
Mermaid.js Diagram:
```mermaid
flowchart LR
    subgraph "IoT Data Sources"
        A[Embedded Road Sensors]
        B[Acoustic Sensors]
        C[Smart City Cameras]
    end
    subgraph "Central Server"
        D{Sensor Fusion Engine}
        E[Condition Logic]
        F[Record Generator]
    end
    A & B & C --> D
    D -- Fused Data --> E
    E -- "Flooding" Condition Met --> F
    F --> G((Social Network DB))
```
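The cross-correlation rule can be sketched as requiring the same condition from at least two distinct sensor types inside a time window. The two-source minimum and 60 s window are assumptions.

```python
FUSION_WINDOW_S = 60   # assumed corroboration window
MIN_SOURCE_TYPES = 2   # assumed minimum of distinct sensor types

def fuse(events: list) -> list:
    """Emit a high-confidence condition record when events of the same
    condition arrive from at least MIN_SOURCE_TYPES distinct sensor types
    within FUSION_WINDOW_S seconds, with no human intervention."""
    records = []
    by_condition = {}
    for ev in events:
        by_condition.setdefault(ev["condition"], []).append(ev)
    for condition, evs in by_condition.items():
        evs.sort(key=lambda e: e["t"])
        window = [e for e in evs if e["t"] - evs[0]["t"] <= FUSION_WINDOW_S]
        sources = {e["source"] for e in window}
        if len(sources) >= MIN_SOURCE_TYPES:
            records.append({"condition": condition,
                            "sources": sorted(sources),
                            "t": window[-1]["t"]})
    return records

events = [
    {"source": "acoustic", "condition": "flooding", "t": 100},
    {"source": "camera", "condition": "flooding", "t": 130},
    {"source": "strain_gauge", "condition": "overload", "t": 200},  # lone source
]
records = fuse(events)
```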
Derivative 4.3: Blockchain-Verified Roadway Incident Ledger
Enabling Description: To ensure the integrity and verifiability of reports, this variation uses a permissioned blockchain (e.g., Hyperledger Fabric) as the backend database. Each "roadway condition record" is a transaction on the distributed ledger. A user's mobile device signs the transaction with its private key, creating a non-repudiable record. To add a record, a small gas fee (paid via micropayment or earned through credible reporting) is required, deterring spam. "Social network ranking" is achieved through an on-chain reputation score (similar to a non-transferable NFT or "Soulbound Token") associated with each user's public key. Reports from users with higher reputation scores are weighted more heavily by the smart contracts that govern data retrieval. This creates a trusted, immutable, and auditable history of roadway conditions, useful for insurance claims or municipal liability cases.
Mermaid.js Diagram:
```mermaid
sequenceDiagram
    participant Mobile_Device as Mobile Device
    participant Wallet as Crypto Wallet
    participant Smart_Contract as Validation Smart Contract
    participant Ledger as Blockchain Ledger
    Mobile_Device->>Wallet: Create report transaction
    Wallet->>Mobile_Device: Request signature for Tx
    Mobile_Device->>Wallet: Sign transaction with private key
    Wallet->>Smart_Contract: Submit signed transaction
    Smart_Contract->>Ledger: Validate signature & reputation score
    Smart_Contract->>Ledger: Write new block with report data
    Ledger-->>Smart_Contract: Confirm transaction
    Smart_Contract-->>Mobile_Device: Return confirmation
```
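The sign-then-weight flow can be sketched as follows. For brevity this sketch uses a stdlib HMAC with a shared secret in place of real public-key signatures; a production system would use asymmetric keys as described above. All field names and the reputation scale are assumed.

```python
import hashlib
import hmac
import json

def sign_record(record: dict, key: bytes) -> str:
    """Deterministically serialize the record and sign it (HMAC stand-in
    for an asymmetric signature)."""
    payload = json.dumps(record, sort_keys=True).encode()
    return hmac.new(key, payload, hashlib.sha256).hexdigest()

def verify_and_weight(record: dict, signature: str, key: bytes,
                      reputation: float) -> float:
    """Validate the signature, then return the ranking weight derived from
    the reporter's on-chain reputation score (0 if the signature fails)."""
    if not hmac.compare_digest(sign_record(record, key), signature):
        return 0.0
    return reputation

key = b"device-private-secret"  # hypothetical device key
record = {"condition": "pothole", "lat": 40.7, "lon": -74.0}
sig = sign_record(record, key)
weight = verify_and_weight(record, sig, key, reputation=0.9)
tampered = verify_and_weight(dict(record, lat=41.0), sig, key, reputation=0.9)
```

Because the record is serialized with sorted keys before signing, any change to the reported location or condition invalidates the signature, which is the property the ledger relies on.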
Axis 5: The "Inverse" or Failure Mode
Derivative 5.1: Graceful Degradation via Store-and-Forward Protocol
Enabling Description: This variation is designed for operation in areas with intermittent or non-existent network connectivity (e.g., rural areas, tunnels). The "automated hardware communication interface port" is configured with a "low-power, limited-functionality" mode. When the device detects a loss of connection to the central social network database, it enters a store-and-forward state. All new user-generated reports are stored locally in a time-stamped, geo-tagged queue in persistent memory. The device simultaneously listens for peer devices using a low-power, ad-hoc wireless protocol (e.g., Bluetooth LE, Wi-Fi Direct). When another device is detected, they perform a handshake and sync their queues of pending reports, propagating information through the local ad-hoc network. Once a device in the ad-hoc network re-establishes a connection to the central server, it uploads its entire synchronized queue of reports, which are then integrated into the main database.
Mermaid.js Diagram:
```mermaid
stateDiagram-v2
    state "Connected Mode" as Connected
    state "Disconnected/Ad-Hoc Mode" as Disconnected
    [*] --> Connected
    Connected --> Disconnected: Loss of Cellular/WAN
    Disconnected --> Connected: Regain Cellular/WAN
    Connected: Transmit reports directly to DB
    Disconnected: Store new reports locally
    Disconnected: Listen for peers via BLE/WiFi Direct
    Disconnected: Sync report queues with peers
```
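The peer-to-peer queue sync in Disconnected mode can be sketched as a deduplicating merge. The (device_id, timestamp) dedup key is an illustrative assumption.

```python
def sync_queues(queue_a: list, queue_b: list) -> list:
    """Merge two peers' pending-report queues, dropping duplicates, so both
    devices carry the union of reports until one regains a WAN connection
    and uploads the whole queue."""
    seen = set()
    merged = []
    for rec in queue_a + queue_b:
        key = (rec["device_id"], rec["t"])  # assumed uniqueness key
        if key not in seen:
            seen.add(key)
            merged.append(rec)
    return sorted(merged, key=lambda r: r["t"])

qa = [{"device_id": "a", "t": 1, "condition": "pothole"}]
qb = [{"device_id": "a", "t": 1, "condition": "pothole"},  # already-synced copy
      {"device_id": "b", "t": 2, "condition": "gravel"}]
merged = sync_queues(qa, qb)
```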
Combination Prior Art Scenarios
Combination 1: C-V2X Direct Communication Protocol for Latency-Critical Alerts
- Description: The system described in the '361 patent is combined with the 3GPP Cellular V2X (C-V2X) PC5 direct communication standard. While the core system uploads reports to a cloud-based social network database via a Uu interface (device-to-network), this combination adds a parallel PC5 interface. When a user reports a high-priority, latency-critical event like "vehicle driving wrong way," the processor not only sends the report to the cloud database but simultaneously broadcasts a standardized Basic Safety Message (BSM) or a new "Road Hazard Message" directly to all C-V2X enabled vehicles within a 1-2 km radius. This provides sub-second alerts to nearby vehicles, bypassing cloud latency for immediate threats, while the cloud database serves for non-real-time information and broader dissemination.
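The dual-path dispatch rule can be sketched as follows. Every report takes the Uu cloud path; only an assumed set of latency-critical event types additionally triggers the PC5 direct broadcast.

```python
# Assumed classification of latency-critical events (illustrative).
LATENCY_CRITICAL = {"wrong_way_vehicle", "sudden_obstruction"}

def dispatch(report: dict) -> list:
    """Return the list of transmission paths used for a report."""
    paths = ["uu_cloud_upload"]  # every report goes to the cloud database
    if report["event"] in LATENCY_CRITICAL:
        # Parallel sub-second broadcast to nearby C-V2X vehicles via PC5.
        paths.append("pc5_direct_broadcast")
    return paths

routine = dispatch({"event": "pothole"})
urgent = dispatch({"event": "wrong_way_vehicle"})
```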
Combination 2: GeoJSON and OpenStreetMap for Interoperable Data Representation
- Description: The data format for roadway condition records is standardized using the open GeoJSON format (IETF RFC 7946). A pothole is represented as a `Point` feature, a flooded area as a `Polygon` feature, and a debris field as a `LineString` feature. Each feature's `properties` object contains the metadata (timestamp, report type, user credibility score, etc.). This standardized data is then rendered not on a proprietary map, but as an overlay layer on OpenStreetMap (OSM) tiles. Furthermore, verified, persistent hazards (e.g., a road washout) are contributed back to the core OpenStreetMap dataset, allowing any application using OSM data to benefit from the reports. This combines the '361 patent's reporting mechanism with open data standards for maximum interoperability.
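A pothole record as a GeoJSON Feature could look like the following. The geometry structure follows RFC 7946 (note that coordinates are ordered [longitude, latitude]); the properties keys are illustrative assumptions.

```python
import json

def pothole_feature(lon: float, lat: float, timestamp: str,
                    credibility: float) -> dict:
    """Build an RFC 7946 GeoJSON Feature for a point hazard report."""
    return {
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [lon, lat]},
        "properties": {
            "report_type": "pothole",
            "timestamp": timestamp,
            "user_credibility": credibility,
        },
    }

feature = pothole_feature(-74.0060, 40.7128, "2026-05-09T12:00:00Z", 0.92)
encoded = json.dumps(feature)  # ready for transmission or an OSM overlay layer
```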
Combination 3: Open-Source AI Framework for Transparent Ranking
- Description: The "social network ranking factor" is implemented not as a proprietary, black-box algorithm, but as a transparent, open-source model using the TensorFlow or PyTorch framework. The model's architecture (e.g., a simple logistic regression or a more complex gradient-boosted tree) and its input features (report recency, proximity, user reputation, number of confirmations, correlation with weather) are publicly documented. The model is trained on a public, anonymized dataset of road reports. This allows for public auditing of the ranking algorithm to ensure fairness and prevent manipulation. The system in the '361 patent becomes a data collection and presentation layer for a transparent, community-vetted ranking and filtering engine.