U.S. Patent 10,664,518

Derivative works

Defensive disclosure: derivative variations of each claim designed to render future incremental improvements obvious or non-novel.

The following defensive disclosure for U.S. Patent 10,664,518 publishes derivative variations of its claims in order to establish prior art against future incremental inventions.

Defensive Disclosure and Prior Art Publication
Title: Methods and Systems for Spatially-Partitioned, Context-Aware Data Services
Publication Date: May 4, 2026
Keywords: Augmented Reality, Location-Based Services, Spatial Indexing, Tessellation, View-Dependent Rendering, Sensor Fusion, Digital Twin, WebXR, GeoJSON, ROS.

Abstract: This publication discloses a series of methods, systems, and applications that build upon the concept of delivering data objects to a user device based on its location within a partitioned (tiled) map and its specific view of interest. The disclosed variations are intended to enter the public domain to serve as prior art for future patent applications. Disclosures include substitutions of core components, operation under extreme parameters, application to novel industrial domains, integration with emerging technologies like AI and blockchain, and systems designed for graceful degradation or failure modes.

Combination Prior Art Scenarios

This section details the combination of the core invention of spatially-partitioned data delivery with existing open-source standards, rendering such combinations obvious to a person skilled in the art.

1. Combination with WebXR and GeoJSON for Browser-Based AR

  • Enabling Description: A system is constructed wherein the "tessellated tile map" is implemented as a series of GeoJSON Polygon or MultiPolygon features. Each feature in the GeoJSON file includes a properties object containing a unique tile ID and a URL endpoint pointing to the associated AR content package. A client-side web application utilizes the WebXR Device API to create an immersive AR session in a standard web browser. The device's geolocation, obtained via the browser's Geolocation API, is continuously checked against the GeoJSON tile definitions. Upon entering a new tile, the application fetches the corresponding AR content package (e.g., a GLB 3D model, video, or interactive script) from the URL specified in the tile's properties. The device's orientation, provided by the WebXR XRViewerPose object, determines the "view of interest" and is used to correctly position and render the AR content relative to the user's perspective. This architecture requires no proprietary software on the client device, relying solely on open web standards.
  • Mermaid Diagram:
    sequenceDiagram
        participant Browser as Web Browser (Client)
        participant GeoAPI as Geolocation API
        participant WebXR as WebXR API
        participant Server
        Browser->>GeoAPI: requestPosition()
        GeoAPI-->>Browser: currentCoordinates
        Browser->>Server: fetchTileMap('map.geojson')
        Server-->>Browser: geoJSONData
        Browser->>Browser: identifyCurrentTile(currentCoordinates, geoJSONData)
        Note over Browser: Now in Tile 'A', get content URL
        Browser->>Server: getContent('content_A.glb')
        Server-->>Browser: arContentPackage
        Browser->>WebXR: requestSession('immersive-ar')
        WebXR-->>Browser: sessionStarted
        loop Render Loop
            Browser->>WebXR: getViewerPose()
            WebXR-->>Browser: viewerPose (position, orientation)
            Browser->>Browser: renderContent(arContentPackage, viewerPose)
        end
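The tile-identification step in the flow above can be sketched in plain Python as a ray-casting point-in-polygon test over GeoJSON-style Polygon features. The feature layout and the `tile_id`/`content_url` property names are illustrative assumptions, not part of the patent or of any specific API.

```python
# Minimal sketch of identifyCurrentTile(): ray-casting point-in-polygon
# over GeoJSON-style features. Property names are illustrative.

def point_in_ring(lon, lat, ring):
    """Ray-casting test: is (lon, lat) inside the polygon ring?"""
    inside = False
    for (x1, y1), (x2, y2) in zip(ring, ring[1:] + ring[:1]):
        if (y1 > lat) != (y2 > lat):
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

def identify_current_tile(lon, lat, geojson):
    """Return the properties of the first tile feature containing the point."""
    for feature in geojson["features"]:
        exterior = feature["geometry"]["coordinates"][0]
        if point_in_ring(lon, lat, exterior):
            return feature["properties"]
    return None

tiles = {"features": [
    {"geometry": {"type": "Polygon",
                  "coordinates": [[(0, 0), (1, 0), (1, 1), (0, 1)]]},
     "properties": {"tile_id": "A", "content_url": "/content_A.glb"}},
]}
```

The first matching feature wins, which assumes non-overlapping tiles as the tessellation implies.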
    

2. Combination with Robot Operating System (ROS) and OpenCV for Autonomous Navigation and Interaction

  • Enabling Description: An autonomous mobile robot (AMR) uses the ROS framework for navigation and control. The "initial map" of the area of interest is a standard ROS map format (PGM and YAML file) generated via a SLAM algorithm (e.g., gmapping). This map is programmatically tessellated into polygonal tiles, with each tile's definition stored as a parameter on the ROS Parameter Server. An "AR Management Engine" is implemented as a ROS node that subscribes to the /tf (transform) topic to track the robot's base_link frame relative to the map frame, thereby determining its current tile. The robot's onboard camera stream is processed by a separate ROS node using the OpenCV library for object recognition. When a predefined object of interest is detected within the camera's view, its identity and coordinates are published to a topic. The management node, correlating the current tile and the recognized object ("view of interest"), triggers a specific action. For example, upon recognizing a 'charging station' object while in the 'low_battery' tile, the robot initiates a docking procedure. The "AR content" here is the set of executable behaviors or data overlays for mission planning.
  • Mermaid Diagram:
    flowchart TD
        subgraph AMR
            A[Camera] --> B{OpenCV Node};
            B -- Recognized Object ID --> C{AR Management Node};
            D[Wheel Encoders/IMU] --> E{SLAM/Localization};
            E -- Robot Pose --> C;
        end
        subgraph Server
            F[ROS Parameter Server] -- Tile Definitions --> C;
        end
        C -- Current Tile + Object View --> G[Execute Behavior];
        G -- Navigation Goal --> H{move_base Node};
        H --> I[Motor Controllers];
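The AR Management Node's decision step, correlating the current tile with the recognized object, reduces to a lookup table. A minimal sketch follows, with the tile and object names taken from the example above and the mapping itself assumed:

```python
# Sketch of the (tile, view-of-interest) -> behavior dispatch performed by
# the AR Management Node. The behavior table entries are assumptions.

BEHAVIOR_TABLE = {
    ("low_battery", "charging_station"): "initiate_docking",
    ("assembly_cell", "pallet"): "begin_pickup",
}

def select_behavior(current_tile, recognized_object):
    """Return the behavior triggered by this (tile, object) pair."""
    return BEHAVIOR_TABLE.get((current_tile, recognized_object),
                              "continue_patrol")
```

In a real deployment this function would live in the ROS node and publish a goal to `move_base` rather than return a string.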
    

3. Combination with MQTT and Prometheus for IoT-Driven Dynamic AR Environments

  • Enabling Description: The system is applied to a smart factory floor. Each "tile" corresponds to a specific work cell. IoT sensors (temperature, vibration, pressure) within each cell are configured as MQTT clients that publish their data to specific topics (e.g., factory/cell3/temp). A central "AR Management Engine" subscribes to all relevant MQTT topics. The system also includes a Prometheus time-series database for monitoring and alerting. When a sensor value crosses a predefined threshold (e.g., a machine is overheating), a Prometheus alert fires, triggering the AR engine. The engine immediately modifies the AR content package for the corresponding tile. A technician entering that tile and viewing the affected machine (the "view of interest") with an AR headset will see a dynamically injected maintenance alert, real-time sensor readings, and step-by-step repair instructions overlaid on the machine. This creates a closed-loop system where the physical environment directly alters the digital information overlay.
  • Mermaid Diagram:
    sequenceDiagram
        participant Sensor as IoT Sensor
        participant Broker as MQTT Broker
        participant Engine as AR Management Engine
        participant Prometheus
        participant ARDevice as Technician's AR Headset
        loop Real-time Data
            Sensor->>Broker: PUBLISH factory/cell3/temp: 95C
        end
        Broker->>Engine: PUSH Data
        Broker->>Prometheus: PUSH Data
        Prometheus->>Prometheus: Evaluate Rule (temp > 90C)
        Prometheus->>Engine: ALERT! High Temperature
        Engine->>Engine: Modify AR Content for Tile 'cell3'
        ARDevice->>Engine: Request Content for Tile 'cell3'
        Engine-->>ARDevice: Updated AR Package (with alert)
        ARDevice->>ARDevice: Render maintenance alert on machine view
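The alert-driven content update can be sketched as an engine that ingests sensor readings (as an MQTT subscriber callback would) and injects a maintenance alert into the tile's content package when a threshold rule fires. The topic layout and threshold follow the factory/cell3/temp example; the package structure is an assumption.

```python
# Minimal stand-in for the MQTT/Prometheus loop: threshold rules evaluated
# per message, with alerts injected into the tile's content package.

class ARManagementEngine:
    def __init__(self, thresholds):
        self.thresholds = thresholds   # topic -> maximum allowed value
        self.tile_content = {}         # tile -> content package (dict)

    def on_message(self, topic, value):
        """Handle one published reading, e.g. ('factory/cell3/temp', 95)."""
        tile = topic.split("/")[1]     # 'factory/cell3/temp' -> 'cell3'
        package = self.tile_content.setdefault(tile, {"alerts": []})
        limit = self.thresholds.get(topic)
        if limit is not None and value > limit:
            package["alerts"].append(
                {"topic": topic, "value": value,
                 "action": "show_repair_steps"})

engine = ARManagementEngine({"factory/cell3/temp": 90})
engine.on_message("factory/cell3/temp", 95)
```

A production system would evaluate the rule in Prometheus and deliver only the alert to the engine; folding both into one class keeps the sketch self-contained.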
    

Derivative Variations on Core Claims

1. Material & Component Substitution

  • 1.1. Acoustic Positioning for Tile Identification:

    • Enabling Description: Instead of relying on GPS or WiFi, the system uses an array of ultrasonic transducers installed in the ceiling of an indoor environment. Each transducer emits a unique, high-frequency, coded signal (CDMA). A user's device, equipped with multiple microphones, receives these signals. By calculating the Time Difference of Arrival (TDOA) of the signals from three or more transducers, the device computes its precise XYZ coordinates. This position is then used to identify its containing "tile," which is defined in 3D space (a voxel). This method provides high-precision indoor positioning where RF signals are unreliable.
    • Mermaid Diagram:
      flowchart LR
          subgraph Environment
              T1(Transducer 1); T2(Transducer 2); T3(Transducer 3);
          end
          subgraph UserDevice
              M1(Mic 1); M2(Mic 2); M3(Mic 3);
              CPU(Processing Unit);
          end
          T1 -- CDMA Signal 1 --> M1;
          T2 -- CDMA Signal 2 --> M2;
          T3 -- CDMA Signal 3 --> M3;
          M1 & M2 & M3 --> CPU;
          CPU -- TDOA Calculation --> Pos(X,Y,Z);
          Pos --> TileID[Identify Tile/Voxel];
          TileID --> AR[Fetch AR Content];
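A coarse sketch of the TDOA positioning step follows: a grid search for the (x, y) that best explains the measured time-differences of arrival from three ceiling transducers. Real systems solve this in closed form and in 3D; the transducer positions, speed of sound, and grid resolution here are illustrative assumptions.

```python
# Grid-search TDOA solver (2D, illustrative). Differences are measured
# against transducer 0; the best grid point minimizes squared residuals.
import math

SPEED_OF_SOUND = 343.0  # m/s

def tdoa_locate(transducers, tdoas, extent=10.0, step=0.05):
    """Return the grid point whose predicted TDOAs best match the measured."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    best, best_err = None, float("inf")
    y = 0.0
    while y <= extent:
        x = 0.0
        while x <= extent:
            d0 = dist((x, y), transducers[0])
            err = 0.0
            for t, measured in zip(transducers[1:], tdoas):
                predicted = (dist((x, y), t) - d0) / SPEED_OF_SOUND
                err += (predicted - measured) ** 2
            if err < best_err:
                best, best_err = (x, y), err
            x += step
        y += step
    return best

# Synthetic check: generate TDOAs for a known position, then recover it.
transducers = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
d0 = math.hypot(3.0, 4.0)
tdoas = [(math.hypot(3.0 - tx, 4.0 - ty) - d0) / SPEED_OF_SOUND
         for tx, ty in transducers[1:]]
estimate = tdoa_locate(transducers, tdoas)
```

The recovered position then feeds the same tile (here, voxel) lookup as any other positioning source.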
      
  • 1.2. Neuromorphic Processors for View-of-Interest Recognition:

    • Enabling Description: Object recognition for the "view of interest" is offloaded from a traditional CPU/GPU to a low-power neuromorphic processor (e.g., an Intel Loihi or IBM TrueNorth architecture). This spiking neural network (SNN) processor is trained to recognize key objects within the environment. It processes camera input as a series of events rather than frames, drastically reducing power consumption. When a target object is recognized, the chip sends a simple event trigger containing the object ID to the main application processor, which then instantiates the corresponding AR content. This is ideal for always-on, battery-powered AR glasses.
    • Mermaid Diagram:
      classDiagram
          class ApplicationProcessor {
              +mainLoop()
              +renderARContent(contentID)
          }
          class NeuromorphicProcessor {
              -spikingNeuralNetwork
              +processEventStream()
              +triggerObjectRecognition(objectID)
          }
          class Camera {
              +captureEventStream()
          }
          ApplicationProcessor --> Camera : controls
          Camera --> NeuromorphicProcessor : provides event stream
          NeuromorphicProcessor --> ApplicationProcessor : sends recognition trigger
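The event-driven trigger can be approximated with a toy integrate-and-fire model: per-object "neurons" accumulate incoming events and emit a single trigger when their potential crosses a threshold. The event encoding, threshold, and leak values are assumptions; a real Loihi or TrueNorth deployment would run a trained spiking network.

```python
# Toy stand-in for the SNN trigger: leaky accumulation per object label,
# firing once the threshold is crossed. All parameters are illustrative.

class EventRecognizer:
    def __init__(self, threshold=3, leak=1):
        self.threshold = threshold
        self.leak = leak
        self.potential = {}   # object_id -> accumulated evidence

    def process_event(self, object_id):
        """Accumulate one event; return object_id when its neuron fires."""
        # Decay competing neurons, reinforce the one receiving the event.
        for key in list(self.potential):
            if key != object_id:
                self.potential[key] = max(0, self.potential[key] - self.leak)
        p = self.potential.get(object_id, 0) + 1
        if p >= self.threshold:
            self.potential[object_id] = 0   # reset after firing
            return object_id
        self.potential[object_id] = p
        return None
```

Only the fired trigger (a small event, not frames) crosses to the application processor, which is the power-saving point of the architecture.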
      

2. Operational Parameter Expansion

  • 2.1. Micro-Scale Application for Semiconductor Failure Analysis:

    • Enabling Description: The system is applied to the surface of a silicon wafer. The "map" is a high-resolution image of the die from a scanning electron microscope (SEM). The "tiles" are individual circuit components (transistors, capacitors) defined by the GDSII layout file. An analyst navigates the SEM view. When the SEM is focused on a specific transistor (the "tile"), the system identifies it. If the analyst then uses an electron beam to probe a specific part of that transistor, like the gate (the "view of interest"), the system overlays "AR content" consisting of real-time voltage contrast data, expected logic states from a simulation, and material composition information from an EDX detector.
    • Mermaid Diagram:
      flowchart TD
          A[SEM Image Acquisition] --> B{Position Correlation};
          C[GDSII Layout Data] -- Component Boundaries --> B;
          B -- Current Component 'Tile' --> D{AR Engine};
          E[E-Beam Probe] -- Probe Coordinates --> F{View of Interest ID};
          F -- 'Gate' of Transistor X --> D;
          G[EDX Detector Data] --> H[Live Data Feed];
          I[SPICE Simulation Data] --> H;
          D & H --> J[Overlay AR Content on SEM View];
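Mapping the e-beam probe coordinates to a layout component can be sketched as a bounding-box lookup over rectangles taken from the GDSII layout. The component names and coordinates (in micrometres) below are invented for illustration.

```python
# Sketch of the probe-position -> layout-component lookup. A real flow
# would query the GDSII database; flat bounding boxes suffice here.

COMPONENTS = [
    {"name": "transistor_X_gate",  "bbox": (10.0, 5.0, 10.4, 5.8)},
    {"name": "transistor_X_drain", "bbox": (10.4, 5.0, 11.0, 5.8)},
    {"name": "cap_C12",            "bbox": (20.0, 5.0, 22.0, 7.0)},
]

def view_of_interest(x_um, y_um, components=COMPONENTS):
    """Return the name of the layout component under the probe point."""
    for comp in components:
        x0, y0, x1, y1 = comp["bbox"]
        if x0 <= x_um <= x1 and y0 <= y_um <= y1:
            return comp["name"]
    return None
```

The returned name keys the overlay data (voltage contrast, simulated logic state, EDX composition) shown on the SEM view.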
      
  • 2.2. Planetary-Scale Application for Mars Rover Navigation:

    • Enabling Description: The "map" is a planetary orbital map of Mars. "Tiles" are kilometer-scale sectors of the Martian surface, defined by geological features. As a rover (e.g., Perseverance) enters a new tile, its mission plan autonomously loads a new set of scientific objectives and navigation constraints associated with that tile (e.g., "Analyze soil in Tile G-7," "Avoid steep slopes in Tile H-5"). The rover's forward-facing cameras perform visual odometry and identify specific rock formations or terrain hazards. A recognized hazardous rock (the "view of interest") causes the system to overlay a no-go zone in the rover's local navigation path, an example of "AR content" being an actionable data overlay for an autonomous agent.
    • Mermaid Diagram:
      stateDiagram-v2
          [*] --> InTile_G6
          InTile_G6 --> MovingTo_G7 : Traverse
          MovingTo_G7 --> InTile_G7 : Arrived
          InTile_G7: Load Science Objectives for G-7
          InTile_G7: Activate Hazard Detection
          state InTile_G7 {
              Drive --> HazardScan : Every 10m
              HazardScan --> Drive : Path Clear
              HazardScan --> AvoidanceManeuver : Hazard in View
              AvoidanceManeuver --> Drive : Path Re-routed
          }
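The hazard-overlay step can be sketched as follows: given planned waypoints and a detected hazard (a rock at known local coordinates with a keep-out radius), flag the waypoints that fall inside the no-go zone. All coordinates and the radius are illustrative.

```python
# Sketch of the no-go overlay: classify waypoints against a circular
# keep-out zone around a recognized hazard.
import math

def flag_no_go(waypoints, hazard, radius):
    """Return (safe, blocked) waypoint lists for a circular keep-out zone."""
    safe, blocked = [], []
    for wp in waypoints:
        d = math.hypot(wp[0] - hazard[0], wp[1] - hazard[1])
        (blocked if d < radius else safe).append(wp)
    return safe, blocked

path = [(0, 0), (5, 0), (10, 0), (15, 0)]
safe, blocked = flag_no_go(path, hazard=(10, 1), radius=3.0)
```

The blocked list is the "AR content" in this domain: an actionable overlay consumed by the planner rather than rendered for a human.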
      

3. Cross-Domain Application

  • 3.1. Aerospace: Smart Wiring Harness Assembly:

    • Enabling Description: An aerospace technician wears AR glasses while building a complex wiring harness on a large assembly board. The board is the "map," and each connector port is a "tile." When the technician's device recognizes they are working at a specific port (e.g., J-15), the system populates the view with AR content for that connection. As the technician looks at a specific pin on that connector (the "view of interest"), the system overlays the correct wire color, part number, and required crimping tool. The system can also highlight the target pin on the other end of the wire, ensuring correct end-to-end connectivity.
    • Mermaid Diagram:
      flowchart TD
          Start --> A{Identify Connector Tile: J-15};
          A --> B{Fetch Connection Data for J-15};
          B --> C{Detect View of Interest: Pin 4};
          C --> D[Overlay AR Data: Wire P/N 123-Red, Tool-B];
          D --> E{Highlight Target: Connector P-08, Pin 9};
          E --> F{Verify Connection with Continuity Tester};
          F -- OK --> A;
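The connection-data lookup behind the overlay is a harness table keyed by (connector, pin). The part numbers, tool names, and endpoints below are invented for illustration.

```python
# Sketch of the harness lookup driving the AR overlay at each pin.

HARNESS = {
    ("J-15", 4): {"wire": "123-Red", "tool": "Tool-B",
                  "far_end": ("P-08", 9)},
    ("J-15", 5): {"wire": "124-Blu", "tool": "Tool-B",
                  "far_end": ("P-08", 2)},
}

def overlay_for(connector, pin):
    """Return the AR overlay data for the pin the technician is viewing."""
    return HARNESS.get((connector, pin))
```

The `far_end` entry is what lets the system highlight the matching pin at the other end of the wire for end-to-end verification.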
      
  • 3.2. AgTech: Precision Pest Management with Drones:

    • Enabling Description: A farm is mapped and divided into one-hectare "tiles." A swarm of autonomous drones patrols the farm. When a drone enters a specific tile, it loads a mission package for that area, including known pest hotspots. Using a multispectral camera, the drone scans rows of crops. If it identifies a plant showing signs of aphid infestation (the "view of interest"), it triggers the "AR content," which in this case is a command to a second, specialized drone. This second drone flies to the precise coordinates and performs a targeted micro-spraying of pesticide, minimizing chemical usage and environmental impact.
    • Mermaid Diagram:
      sequenceDiagram
          participant Controller
          participant ScoutDrone
          participant SprayerDrone
          Controller->>ScoutDrone: Patrol Tile H4
          ScoutDrone->>ScoutDrone: Scan Crops
          ScoutDrone->>Controller: Report Pest at GPS(X,Y)
          Controller->>SprayerDrone: Dispatch to GPS(X,Y)
          SprayerDrone->>SprayerDrone: Execute Targeted Spray
          SprayerDrone->>Controller: Report Mission Complete
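The scout-to-sprayer hand-off can be sketched as the controller turning a pest report into a targeted spray mission for the second drone. The message fields, confidence threshold, and drone IDs are assumptions.

```python
# Sketch of the controller's dispatch step: pest report in, mission out.

def handle_pest_report(report):
    """Translate a scout drone's detection into a sprayer mission."""
    if report["confidence"] < 0.8:   # assumed acceptance threshold
        return None
    return {"drone": "sprayer-1",
            "action": "micro_spray",
            "target": report["gps"],
            "tile": report["tile"]}

mission = handle_pest_report(
    {"tile": "H4", "gps": (43.71, -116.09),
     "pest": "aphid", "confidence": 0.93})
```

Gating on detection confidence keeps low-certainty sightings from triggering chemical use, matching the minimal-usage goal stated above.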
      
  • 3.3. Consumer Electronics: Interactive Smart Home Manual:

    • Enabling Description: The system is applied to a home environment. Each smart appliance (thermostat, oven, TV) is a "tile." When a user points their smartphone at the smart thermostat, an app recognizes it and enters the "Thermostat Tile." The app displays basic controls. If the user then aims their camera at the physical wiring panel behind the thermostat's faceplate (the "view of interest"), the app overlays an interactive wiring diagram. Tapping a virtual wire overlay displays its function (e.g., "Common Wire," "Heat Call"). This provides context-sensitive repair and installation guidance without requiring the user to find a paper manual.
    • Mermaid Diagram:
      graph TD
          A[Start App] --> B{Point phone at appliance};
          B -- Recognizes Oven --> C[Enter 'Oven Tile'];
          B -- Recognizes Thermostat --> D[Enter 'Thermostat Tile'];
          D --> E{View of Interest: Front Panel};
          D --> F{View of Interest: Wiring Panel};
          E --> G[Display User Controls];
          F --> H[Overlay Interactive Wiring Diagram];
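The app's two-level routing, where the recognized appliance selects the "tile" and the camera's current view selects the overlay, can be sketched as a nested lookup. The appliance and view names are illustrative.

```python
# Sketch of (appliance tile, view-of-interest) -> overlay routing.

OVERLAYS = {
    ("thermostat", "front_panel"):  "user_controls",
    ("thermostat", "wiring_panel"): "interactive_wiring_diagram",
    ("oven", "front_panel"):        "user_controls",
}

def overlay_for_view(appliance, view):
    """Return the overlay to render for the current tile and view."""
    return OVERLAYS.get((appliance, view), "basic_info")
```

Unrecognized views fall back to a generic info card rather than failing, keeping the manual useful for partial recognitions.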
      

4. Integration with Emerging Tech

  • 4.1. AI-Driven Predictive Tile Caching:

    • Enabling Description: The AR management engine incorporates a recurrent neural network (RNN), specifically an LSTM model, trained on historical movement data from all users in the environment. The model takes a user's recent trajectory (a sequence of tile IDs and timestamps) as input and predicts a probability distribution for the next tile they are likely to enter. The system then preemptively pushes the AR content for the top 1-2 most likely tiles to the user's device. This predictive caching minimizes latency when the user crosses a tile boundary, creating a seamless experience.
    • Mermaid Diagram:
      flowchart LR
          A[User Device] -- "Trajectory: [T1, T5, T9]" --> B;
          subgraph B [AR Management Engine]
              C[LSTM Model] -- "Predicts Next Tile" --> D{Pre-cache Logic};
              E[Historical Path Database] -.-> C;
          end
          D -- "P(T10)=0.8, P(T8)=0.15" --> A;
          A -- "Pre-fetches content for T10" --> F((Content CDN));
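A first-order Markov predictor serves as a lightweight stand-in for the LSTM: transition counts over historical paths yield P(next tile | current tile), and the top-k tiles are selected for pre-caching. The tile IDs below are examples.

```python
# Markov stand-in for the LSTM next-tile predictor used for pre-caching.
from collections import Counter, defaultdict

class NextTilePredictor:
    def __init__(self):
        self.transitions = defaultdict(Counter)

    def train(self, path):
        """Count tile-to-tile transitions along one historical path."""
        for a, b in zip(path, path[1:]):
            self.transitions[a][b] += 1

    def precache_candidates(self, current_tile, k=2):
        """Most likely next tiles with their empirical probabilities."""
        counts = self.transitions[current_tile]
        total = sum(counts.values())
        return [(tile, n / total) for tile, n in counts.most_common(k)]

predictor = NextTilePredictor()
for path in [["T1", "T5", "T9", "T10"],
             ["T2", "T5", "T9", "T10"],
             ["T5", "T9", "T8"]]:
    predictor.train(path)
```

An LSTM earns its keep when transitions depend on longer histories (e.g., T9's successor differing by where the user entered); the caching logic downstream is identical either way.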
      
  • 4.2. Blockchain for Verifiable AR Content and Tile Ownership:

    • Enabling Description: The "tile map" is managed as a decentralized application (dApp) on a public blockchain (e.g., Ethereum). Each tile is a non-fungible token (NFT) whose ownership can be bought, sold, or leased. The metadata for each NFT tile points to an immutable content hash (e.g., on IPFS) that defines the AR content package. When an advertiser wants to place an ad in a specific physical location, they purchase or lease the corresponding tile NFT. Users' devices query the blockchain to get the authentic content hash for their current tile, ensuring the AR content they see is genuine and authorized by the tile owner, preventing content spoofing or unauthorized alterations.
    • Mermaid Diagram:
      erDiagram
          TILE_NFT {
              string TokenID PK
              string OwnerAddress
              string ContentHash
          }
          USER_DEVICE {
              string DeviceID PK
              string CurrentTileID
          }
          IPFS {
              string ContentHash PK
              blob ARContentPackage
          }
          TILE_NFT ||--o{ USER_DEVICE : "is located in"
          TILE_NFT ||--|| IPFS : "points to"
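The client-side authenticity check reduces to comparing the hash of the fetched AR content package against the content hash recorded in the tile NFT's metadata. SHA-256 stands in here for the IPFS CID scheme; the package bytes are illustrative.

```python
# Sketch of content verification against the on-chain record.
import hashlib

def verify_tile_content(package_bytes, onchain_hash_hex):
    """Accept the package only if its digest matches the on-chain record."""
    return hashlib.sha256(package_bytes).hexdigest() == onchain_hash_hex

package = b"glb-model-bytes"
registered = hashlib.sha256(package).hexdigest()
```

Because the hash lives in immutable NFT metadata, a spoofed or altered package fails this check even if served from the legitimate URL.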
      

5. The "Inverse" or Failure Mode

  • 5.1. Graceful Degradation Mode with Progressive Detail:

    • Enabling Description: The system is designed to operate under variable network bandwidth and device processing power. Each tile is associated with multiple AR content packages at different levels of detail (LOD). In optimal conditions, the device downloads a high-fidelity package with complex 3D models and high-resolution textures. If the network is slow or the device CPU is overloaded, the AR engine automatically requests a lower-LOD package, which might consist of simple 2D icons and text. In a complete network failure, the device falls back to a pre-cached "base" layer for the tile, showing only critical information like a location name. This ensures the system always provides some value rather than failing completely.
    • Mermaid Diagram:
      stateDiagram-v2
          state "High Bandwidth" as High {
              Entry: Load LOD-1 (Full 3D)
          }
          state "Low Bandwidth" as Low {
              Entry: Load LOD-2 (2D Icons/Text)
          }
          state "Offline" as Offline {
              Entry: Load LOD-3 (Cached Text)
          }
      
          [*] --> High : Good Connection
          High --> Low : Bandwidth Drops
          Low --> High : Bandwidth Improves
          Low --> Offline : Connection Lost
          High --> Offline : Connection Lost
          Offline --> Low : Connection Restored
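The degradation policy in the state diagram amounts to selecting a level of detail from current connectivity and bandwidth. The thresholds (in Mbit/s) are illustrative assumptions.

```python
# Sketch of the LOD selection policy behind the state machine above.

def select_lod(online, bandwidth_mbps):
    """Return which content package to load for the current conditions."""
    if not online:
        return "LOD-3"        # pre-cached text-only base layer
    if bandwidth_mbps >= 10.0:
        return "LOD-1"        # full 3D models and high-res textures
    return "LOD-2"            # 2D icons and text
```

Re-evaluating this function on each network-quality sample reproduces the transitions shown in the diagram.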
      
  • 5.2. Safety-Critical Override for Emergency Egress:

    • Enabling Description: The AR system is integrated with a building's fire alarm and emergency management system. Upon an alarm trigger, the AR management engine enters a "safety override" mode. It immediately terminates all standard AR content rendering on devices within the building. It then force-pushes a single, mandatory AR content package to every device. This package overlays large, simple, animated arrows onto the user's view of the world, directing them along the safest and most current egress path, dynamically updated based on real-time data about fire or hazard locations. All other functionality is disabled to prevent distraction.
    • Mermaid Diagram:
      flowchart TD
          A{Fire Alarm Triggered};
          A --> B[AR Engine: ENTER OVERRIDE MODE];
          B --> C{Terminate All Standard AR Sessions};
          C --> D{Broadcast Egress Path AR Package};
          subgraph User Device
              E[Receive Override Package];
              F[Render Egress Arrows];
          end
          D --> E;
          G[Real-time Hazard Map] -- Updates --> D;
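The override behavior can be sketched as an engine that, on an alarm, replaces every active session's content with a single mandatory egress package and forces the same package onto any session that starts while the override holds. The session bookkeeping and package name are assumptions.

```python
# Sketch of the safety override: one mandatory package, all devices.

class SafetyOverrideEngine:
    def __init__(self):
        self.sessions = {}     # device_id -> current content
        self.override = False

    def start_session(self, device_id, content):
        """New sessions get normal content unless the override is active."""
        self.sessions[device_id] = ("EGRESS_PACKAGE" if self.override
                                    else content)

    def trigger_alarm(self):
        """Terminate standard content and force-push the egress overlay."""
        self.override = True
        for device_id in self.sessions:
            self.sessions[device_id] = "EGRESS_PACKAGE"

engine = SafetyOverrideEngine()
engine.start_session("headset-1", "museum_tour")
engine.trigger_alarm()
```

Making the override sticky for new sessions is the key property: a device that connects mid-evacuation must never receive standard content.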
      
