Patent 6164534
Derivative works
Defensive disclosure: derivative variations of each claim designed to render future incremental improvements obvious or non-novel.
Defensive Disclosure and Prior Art Generation
Reference Patent: U.S. Patent No. 6,164,534
Publication Title: Method and Apparatus for Accessing Electronic Data via a Familiar Printed Medium: Derivative Implementations and Integrations
Date of Publication: April 29, 2026
Keywords: Augmented Reality, Machine Vision, Printed Electronics, Internet of Things (IoT), AI-Generated Content, Blockchain Provenance, Fail-Safe Systems, Human-Computer Interaction
This document discloses a series of derivative works, alternative embodiments, and cross-domain applications of the system described in U.S. Patent 6,164,534 ("the '534 patent"). The purpose of this disclosure is to place these variations into the public domain, thereby establishing them as prior art for any future patent applications in this field.
Derivative Embodiments of the '534 Patent System
The following disclosures expand upon the core architecture of the '534 patent, which links a physical medium to electronic data. Each derivative provides a technical enabling description and a visual diagram.
1. Material & Component Substitution: Interactive Conductive Textile with Capacitive Sensing
Enabling Description: This embodiment replaces the passive "printed matter" and optical "feature recognition unit" with an active textile system. The printed matter is a multi-layer fabric, such as a "smart textile" jacket or upholstery, woven with a grid of conductive threads (e.g., silver-coated nylon yarn). The "machine recognizable features" are not visually printed codes but are instead specific, unique patterns integrated into the textile's capacitive grid, defined by localized changes in dielectric material or thread density. The "feature recognition unit" is a controller integrated with the textile, which continuously scans the grid for changes in capacitance. A user's touch on a specific pattern alters the capacitance at that location. The controller identifies the touched pattern (the "feature"), and its associated transmitter (e.g., a Bluetooth Low Energy module) sends a coded signal to a paired intelligent controller, such as a smartphone or haptic feedback device. For example, a user could touch a specific symbol on the sleeve of their jacket to trigger an audio player on their phone to play a specific playlist.
Mermaid Diagram: Data Flow for Conductive Textile
sequenceDiagram
    participant User
    participant ConductiveTextile as Smart Textile (Jacket)
    participant TextileController as Integrated BLE Controller
    participant PairedDevice as Smartphone / Haptic Device
    User->>+ConductiveTextile: Touches specific woven pattern
    ConductiveTextile->>-TextileController: Capacitance change detected at pattern coordinates
    TextileController->>TextileController: Match coordinates to pre-defined feature ID
    TextileController->>PairedDevice: Transmit coded signal (Feature ID) via Bluetooth
    PairedDevice->>PairedDevice: Receive signal and execute command (e.g., play music)
    PairedDevice-->>User: Present programming material (audio playback)
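The controller logic described above can be sketched as follows. This is a minimal illustration, not firmware from any actual textile product: the grid coordinates, feature map, capacitance baseline, and signal framing (FEATURE_MAP, BASELINE, THRESHOLD, the "FEAT:" prefix) are all hypothetical.

```python
# Illustrative sketch of the textile controller's scan-and-match step:
# a polled capacitance grid is checked for a touched cell, the cell is
# matched to a pre-defined feature ID, and the ID is packed into the
# coded signal sent over BLE. All constants are assumptions.

# Hypothetical mapping: (row, col) regions of the woven grid -> feature IDs.
FEATURE_MAP = {
    (0, 0): "PLAY_PLAYLIST_A",
    (0, 1): "PAUSE",
    (1, 0): "NEXT_TRACK",
}

BASELINE = 10.0   # nominal capacitance (pF) of an untouched cell
THRESHOLD = 2.0   # deviation (pF) that counts as a touch

def detect_feature(grid_reading):
    """Return the feature ID of the first cell whose capacitance
    deviates from baseline by more than THRESHOLD, else None."""
    for (row, col), value in grid_reading.items():
        if abs(value - BASELINE) > THRESHOLD and (row, col) in FEATURE_MAP:
            return FEATURE_MAP[(row, col)]
    return None

def encode_signal(feature_id):
    """Pack the feature ID into the coded payload for the transmitter."""
    return b"FEAT:" + feature_id.encode("ascii")

if __name__ == "__main__":
    # Simulated scan: the user touches cell (0, 0); its capacitance rises.
    reading = {(0, 0): 14.5, (0, 1): 10.1, (1, 0): 9.8}
    feature = detect_feature(reading)
    print(feature)                  # PLAY_PLAYLIST_A
    print(encode_signal(feature))   # b'FEAT:PLAY_PLAYLIST_A'
```

A real implementation would debounce repeated touches and calibrate the baseline per cell; those refinements are omitted for brevity.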
2. Operational Parameter Expansion: Nanoscale Semiconductor Wafer Analysis
Enabling Description: This embodiment applies the invention at the microscopic scale for semiconductor manufacturing. The "printed matter" is a silicon wafer upon which circuits are fabricated. The "machine recognizable features" are nanoscale fiducial markers (e.g., unique geometric patterns of gold or tungsten) etched directly onto the wafer at specific locations within the die layout. The "feature recognition unit" is a Scanning Electron Microscope (SEM) or an Atomic Force Microscope (AFM) equipped with machine vision software. The microscope scans the wafer, and when the software recognizes a fiducial marker, it transmits its unique identifier. The "intelligent controller" is a high-performance computing workstation connected to a manufacturing execution system (MES). Upon receiving the marker ID, the controller accesses a database containing the specific design parameters, simulation data, or previous inspection results for that exact region of the wafer. The "display unit" is the workstation's monitor, which overlays the retrieved engineering data directly onto the live microscope image.
Mermaid Diagram: Nanoscale Wafer Analysis Workflow
flowchart TD
    A[Place Wafer in SEM] --> B{Scan Wafer Surface};
    B --> C{Machine Vision Recognizes Nanoscale Fiducial Marker};
    C --> D[Transmit Marker ID to Controller];
    D --> E[Controller Queries MES/Design Database];
    E --> F[Retrieve Circuit Simulation & Test Data];
    F --> G[Overlay Data on Live SEM Image];
    G --> H[Display to Engineer];
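The controller side of this workflow (steps D through F above) can be sketched as a lookup-and-overlay step. The record store, field names, and marker IDs below are hypothetical stand-ins for an MES query, not any real manufacturing system's schema.

```python
# Minimal sketch: a recognized fiducial-marker ID keys into an
# MES-style record store, and the retrieved engineering data is bound
# to the marker's on-screen position for overlay on the live image.

# Hypothetical MES records, keyed by fiducial-marker ID.
MES_DATABASE = {
    "FID-0042": {
        "die": (12, 7),
        "layer": "M3",
        "sim_delay_ps": 41.6,
        "last_inspection": "pass",
    },
}

def lookup_marker(marker_id):
    """Query the MES record for a recognized fiducial marker."""
    record = MES_DATABASE.get(marker_id)
    if record is None:
        raise KeyError(f"no MES record for marker {marker_id!r}")
    return record

def build_overlay(marker_id, pixel_xy):
    """Combine the retrieved record with the marker's pixel position
    so the display unit can draw it over the SEM image."""
    record = lookup_marker(marker_id)
    return {
        "anchor_px": pixel_xy,
        "label": f"{marker_id} die={record['die']} layer={record['layer']}",
        "detail": record,
    }

overlay = build_overlay("FID-0042", (880, 412))
print(overlay["label"])  # FID-0042 die=(12, 7) layer=M3
```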
3. Cross-Domain Application: Agricultural Technology (AgTech) Soil and Plant Health System
Enabling Description: This embodiment applies the invention to precision agriculture. The "printed matter" is a physical tag or stake made of a biodegradable polymer, placed in the soil next to a plant or crop row. The "machine recognizable feature" is a durable, weatherproof 2D data matrix code printed with UV-resistant ink, containing a unique identifier for that specific plant or soil zone. The "feature recognition unit" is a ruggedized smartphone or a drone-mounted multispectral camera. Upon scanning the code, the device transmits the identifier to an "intelligent controller," which is a cloud-based agricultural management platform. The controller aggregates and processes data associated with that identifier, including IoT sensor data from the soil (moisture, pH, nutrients via LoRaWAN sensors), satellite imagery (NDVI), and historical yield data. The "display unit" (e.g., the farmer's tablet) then presents a dashboard with AI-driven recommendations, such as variable rate irrigation or targeted fertilizer application for that specific plant or zone.
Mermaid Diagram: AgTech Data Aggregation
erDiagram
    PLANT_TAG {
        string tagID PK "Unique Feature"
        string location
    }
    IOT_SENSOR {
        string sensorID PK
        string location
        float moisture
        float pH
    }
    CLOUD_PLATFORM {
        string dataID PK
        string tagID FK
        string sensorID FK
        string satelliteData
        string aiRecommendation
    }
    FARMER_DEVICE {
        string deviceID PK
    }
    PLANT_TAG ||--o{ CLOUD_PLATFORM : has
    IOT_SENSOR ||--o{ CLOUD_PLATFORM : has
    CLOUD_PLATFORM }o--|| FARMER_DEVICE : displays_on
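The aggregation step performed by the cloud platform can be sketched as a join of sensor readings against the scanned tag ID, reduced to a simple recommendation. The schema, moisture target, and decision rule below are illustrative assumptions, not agronomic guidance or the platform's actual logic.

```python
# Sketch of the cloud platform's per-zone aggregation: readings from
# soil sensors are joined to a scanned plant-tag ID and reduced to a
# toy irrigation recommendation. All values are hypothetical.

# Hypothetical sensor readings already associated with each tag/zone.
SENSORS_BY_ZONE = {
    "TAG-ROW-07": [
        {"sensor": "S1", "moisture": 0.18, "ph": 6.4},
        {"sensor": "S2", "moisture": 0.22, "ph": 6.6},
    ],
}

MOISTURE_TARGET = 0.30  # assumed volumetric water content target

def recommend(tag_id):
    """Aggregate readings for one zone into a recommendation dict."""
    readings = SENSORS_BY_ZONE.get(tag_id, [])
    if not readings:
        return {"tag": tag_id, "action": "no data"}
    avg_moisture = sum(r["moisture"] for r in readings) / len(readings)
    action = "irrigate" if avg_moisture < MOISTURE_TARGET else "hold"
    return {"tag": tag_id, "avg_moisture": round(avg_moisture, 3), "action": action}

print(recommend("TAG-ROW-07"))  # average 0.20 < target, so "irrigate"
```

In the disclosed system this reduction would also fold in satellite NDVI and historical yield data; a single-sensor average is shown only to make the join concrete.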
4. Integration with Emerging Tech: Blockchain-Verified Supply Chain Provenance
Enabling Description: This embodiment integrates the '534 patent's system with a blockchain ledger for supply chain verification of high-value goods (e.g., pharmaceuticals, luxury items). The "printed matter" is the product's packaging or an embedded certificate of authenticity. The "machine recognizable feature" is a physically unclonable function (PUF), such as a unique pattern of randomly dispersed fibers in the paper, which is captured and stored as a digital signature during manufacturing. The "feature recognition unit" is a high-resolution scanner that captures the PUF. The "intelligent controller" (e.g., a dedicated mobile app) computes a hash of the scanned PUF and uses this hash as a key to query a public or private blockchain. The controller retrieves the immutable transaction history for that unique item from the distributed ledger. The "display unit" presents the complete, verified provenance of the item—from creation through every step of the supply chain to the current point of sale—confirming its authenticity.
Mermaid Diagram: Blockchain Verification Sequence
sequenceDiagram
    participant User
    participant ScannerApp as Controller
    participant ProductPUF as Machine Feature
    participant Blockchain
    User->>+ScannerApp: Scans Physical Unclonable Function (PUF) on product
    ScannerApp->>ScannerApp: Compute digital hash of the PUF
    ScannerApp->>+Blockchain: Query ledger with PUF hash
    Blockchain-->>-ScannerApp: Return immutable provenance record
    ScannerApp->>-User: Display verified origin, custody chain, and authenticity
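The hash-and-query step can be sketched as follows. An in-memory dict stands in for the distributed ledger, and the PUF scan is reduced to a byte string; a real system would additionally need a noise-tolerant (fuzzy) PUF matching stage, since raw PUF captures vary between scans, and that stage is omitted here.

```python
# Sketch of the controller's verification step: the captured PUF bytes
# are hashed with SHA-256 and the digest keys a provenance lookup.
# LEDGER is a stand-in for the blockchain query, not a real ledger API.

import hashlib

LEDGER = {}  # hypothetical ledger: PUF digest -> ordered provenance events

def register_item(puf_bytes, event):
    """Manufacturing-time step: append a provenance event for this PUF."""
    digest = hashlib.sha256(puf_bytes).hexdigest()
    LEDGER.setdefault(digest, []).append(event)
    return digest

def verify_item(puf_bytes):
    """Point-of-sale step: recompute the digest and fetch the history."""
    digest = hashlib.sha256(puf_bytes).hexdigest()
    history = LEDGER.get(digest)
    return {"authentic": history is not None, "provenance": history or []}

# Manufacturer registers the item; a retailer later scans the same PUF.
scan = b"fiber-pattern-scan-001"
register_item(scan, "manufactured: plant A, lot 9")
register_item(scan, "shipped: distributor B")
result = verify_item(scan)
print(result["authentic"], len(result["provenance"]))  # True 2
```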
5. "Inverse" or Failure Mode: Fail-Safe Emergency Procedure Display
Enabling Description: This variation is a life-safety system designed to operate reliably in a "disconnected" or emergency state. It is intended for industrial facilities or aircraft. The "machine recognizable features" are large, high-contrast symbols (e.g., compliant with ISO 7010) printed on machinery or inside cockpit panels. The "intelligent controller" and "display unit" are combined into a single, ruggedized, battery-powered device with non-volatile flash memory. In normal operation (connected to the facility network), the device functions like the standard '534 system. However, a heartbeat protocol constantly checks for network connectivity. If the connection is lost, the controller enters a "fail-safe" mode. In this mode, when a user scans a safety symbol, the controller does not attempt to access a remote server. Instead, it retrieves a corresponding emergency procedure (e.g., equipment shutdown, fire suppression protocol) stored directly in its local memory and presents it as a simplified, step-by-step checklist on its monochrome, low-power display.
Mermaid Diagram: State Transitions for Fail-Safe Device
stateDiagram-v2
    [*] --> Online: Device Powers On
    Online: Accessing full data from remote server
    Online --> Offline_Emergency: Network Connection Lost
    Offline_Emergency: Operating on local memory
    Offline_Emergency --> Online: Network Connection Restored
    state Online {
        Scan --> Fetch_Remote: Scans feature
        Fetch_Remote --> Display_Rich: Displays full-featured content
        Display_Rich --> Scan
    }
    state Offline_Emergency {
        Scan_FailSafe --> Fetch_Local: Scans feature
        Fetch_Local --> Display_Basic: Displays cached safety protocol
        Display_Basic --> Scan_FailSafe
    }
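The mode logic in the state diagram can be sketched as a small controller class: a heartbeat result drives the online/fail-safe transition, and scans resolve either to a remote fetch or to the locally cached procedure. The symbol ID and procedure text are illustrative placeholders, not actual ISO 7010 content.

```python
# Sketch of the fail-safe controller: heartbeat drives the mode switch;
# in fail-safe mode, scans are answered from non-volatile local memory
# instead of the remote server. All stored procedures are placeholders.

# Hypothetical emergency procedures cached in non-volatile memory.
LOCAL_CACHE = {
    "ISO7010-E030": ["1. Hit E-stop", "2. Isolate power", "3. Report"],
}

class FailSafeController:
    def __init__(self):
        self.online = True

    def heartbeat(self, network_ok):
        """Called periodically; switches mode on connectivity change."""
        self.online = network_ok

    def scan(self, symbol_id, fetch_remote=None):
        """Resolve a scanned symbol to content for the display unit."""
        if self.online and fetch_remote is not None:
            return {"mode": "online", "content": fetch_remote(symbol_id)}
        steps = LOCAL_CACHE.get(symbol_id, ["No cached procedure"])
        return {"mode": "fail-safe", "content": steps}

ctrl = FailSafeController()
ctrl.heartbeat(network_ok=False)   # connection lost
result = ctrl.scan("ISO7010-E030")
print(result["mode"])              # fail-safe
print(result["content"][0])        # 1. Hit E-stop
```

Note that the fail-safe path takes no network dependency at all, which is the property the disclosure relies on for life-safety operation.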
Combination Prior Art with Open-Source Standards
This section discloses three scenarios where the core invention of the '534 patent is combined with existing open-source standards, rendering such combinations obvious to a person skilled in the art.
Combination with WebXR and QR Codes (ISO/IEC 18004): A system is disclosed wherein the machine-recognizable feature is a standard QR code printed in a museum catalog. The feature recognition unit is a common smartphone camera, and the intelligent controller is the phone's web browser. Recognition of the QR code (via the browser's built-in scanning API) navigates to a URL that loads a 3D model of the artifact using the open-source WebXR Device API. The display unit is the smartphone screen, which presents an augmented reality view of the artifact overlaid on the user's environment, or a VR view if a compatible headset is used.
Combination with Matter IoT Standard and NFC (ISO/IEC 14443): A system for smart home device onboarding is disclosed. The printed matter is the device's quick-start guide, and the feature is a standard NFC tag. The feature recognition unit is the NFC reader in a smartphone. Tapping the phone to the guide triggers an app (the intelligent controller) to read the device's commissioning information from the tag. The controller then uses the open-source Matter protocol to securely and automatically provision the device onto the user's Wi-Fi and Thread networks, with the display unit showing the real-time status.
Combination with RISC-V and Embedded Linux (Yocto Project): An implementation of the intelligent controller is disclosed using an open-source hardware and software stack. The controller is built on a System on a Chip (SoC) using the RISC-V ISA. The SoC runs a custom Embedded Linux distribution built with the Yocto Project. The feature recognition unit is a generic USB webcam utilizing standard UVC drivers included in the Linux kernel. The feature recognition logic is performed by an application running on the OS, using the OpenCV (Open Source Computer Vision) library to detect and decode features. This describes a complete, non-proprietary hardware/software implementation of the patented system.
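The recognition application named in the last combination can be sketched as a decode-and-dispatch loop. `cv2.QRCodeDetector` and its `detectAndDecode` method are the real OpenCV entry points for QR decoding; the fallback stub keeps the sketch runnable where OpenCV is absent, and the `handle_feature` mapping is a hypothetical controller step, not part of any standard.

```python
# Sketch of the OS-level recognition application: webcam frames are
# passed to a QR decoder and decoded payloads are dispatched to the
# controller logic. The stub decoder path is illustrative only.

def make_decoder():
    try:
        import cv2  # OpenCV, as named in the disclosure
        detector = cv2.QRCodeDetector()
        def decode(frame):
            # detectAndDecode returns (payload, corner points, rectified code)
            data, _points, _raw = detector.detectAndDecode(frame)
            return data or None
        return decode
    except ImportError:
        # Stub for environments without OpenCV: treat the "frame" as the
        # payload itself, purely to keep the pipeline demonstrable.
        return lambda frame: frame if isinstance(frame, str) else None

def handle_feature(payload):
    """Controller step: map a decoded feature to a fetch action."""
    return f"fetch:{payload}" if payload else None

decode = make_decoder()
# With OpenCV present, frames would come from cv2.VideoCapture(0).read();
# here the controller step is exercised directly on a decoded payload.
print(handle_feature("https://example.org/artifact/42"))
```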