Patent 5132992
Derivative works
Defensive disclosure: derivative variations of each claim designed to render future incremental improvements obvious or non-novel.
Defensive Disclosure: US Patent 5132992 - Audio and video transmission and receiving system
This defensive disclosure aims to broaden the prior art surrounding US Patent 5132992, "Audio and video transmission and receiving system," by outlining derivative variations of its core claims, so that future incremental improvements by competitors are rendered obvious or lacking novelty. Disclosure date: April 26, 2026.
Derivatives of Independent Claim 1: Transmission and Receiving System
Claim 1: A transmission and receiving system for providing information to remote locations, comprising:
source material library means for storing items containing information prior to identification and compression;
identification encoding means for retrieving the information for said items from said source material library means and for assigning a unique identification code to said retrieved information;
conversion means, coupled to said identification encoding means, for placing said retrieved information into a predetermined format as formatted data;
ordering means, coupled to said conversion means, for placing said formatted data into a sequence of addressable data blocks;
compression means, coupled to said ordering means, for compressing said formatted and sequenced data;
compressed data storing means, coupled to said compression means, for storing as a file said compressed sequenced data received from said compression means with said unique identification code assigned by said identification encoding means; and
transmitter means, coupled to said compressed data storing means, for sending at least a portion of a specific file to a specific one of said remote locations.
Derivative 1.1: Material & Component Substitution - DNA Data Storage System
Enabling Description:
This derivative system replaces traditional magnetic or optical storage with deoxyribonucleic acid (DNA) as the primary storage medium for both the "source material library means" and "compressed data storing means." Information (audio/video) is encoded into synthetic DNA strands using established oligo-nucleotide synthesis techniques. The "identification encoding means" comprises a nanopore sequencing array that reads DNA sequences, deciphers the embedded unique identification code, and extracts the raw information. The "conversion means" involves enzymatic synthesis or CRISPR-based editing tools to convert retrieved DNA sequences into a predetermined digital format (e.g., binary streams) suitable for computational processing. The "ordering means" functions by assembling these digital streams into addressable data blocks. "Compression means" employs bio-inspired algorithms (e.g., adaptive Huffman coding tailored for DNA sequence redundancy) to reduce the data footprint of the digital representation. The "compressed data storing means" then writes these compressed digital sequences back to new synthetic DNA strands for long-term archival. The "transmitter means" utilizes a high-throughput microfluidic system coupled with directed enzyme delivery for sequence-specific information release at remote locations, or bio-molecular communication protocols leveraging synthetic biology constructs.
```mermaid
graph TD
    A["Source Material Library (DNA Storage)"] --> B("Nanopore Sequencing Array");
    B --> C{"Identification Encoding (Unique ID & Info Retrieval)"};
    C --> D["Enzymatic Synthesis / CRISPR Conversion Means (DNA to Digital)"];
    D --> E["Ordering Means (Addressable Data Blocks)"];
    E --> F["Bio-Inspired Compression Means"];
    F --> G["Compressed Data Storing Means (DNA Storage)"];
    G --> H["Microfluidic / Bio-molecular Transmitter Means"];
    H --> I["Remote Location"];
```
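The byte-to-nucleotide step of the "conversion means" can be sketched as a base-4 mapping. The 2-bit-per-base table below is an illustrative assumption, not the system's actual codec; practical DNA storage codecs additionally manage GC balance and avoid homopolymer runs.

```python
# Minimal sketch of binary <-> DNA encoding for the derivative's storage layer.
# The 2-bit -> nucleotide table is an illustrative assumption.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode_to_dna(payload: bytes) -> str:
    """Map each byte to four nucleotides (2 bits per base)."""
    bits = "".join(f"{byte:08b}" for byte in payload)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode_from_dna(strand: str) -> bytes:
    """Invert the mapping: four bases back into one byte."""
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
```

The round trip is lossless, which is what lets the unique identification code survive a write/read cycle through the synthetic strands.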
Derivative 1.2: Material & Component Substitution - Quantum Entanglement Communication
Enabling Description:
This derivative system implements the "transmitter means" using optical quantum entanglement for information delivery. The "compressed data storing means" interfaces with a quantum transduction unit that converts classical compressed data (e.g., audio/video bitstreams) into entangled photon pairs. The "transmitter means" specifically utilizes a network of ground-to-satellite free-space optical links or fiber optic quantum channels. Each data block, now represented by the quantum state of entangled photons, is transmitted to a "specific one of said remote locations." At the remote location, a quantum receiver performs Bell-state measurements on the incoming entangled photons to reconstruct the original classical data. This system necessitates cryogenic photonic integrated circuits for stable entanglement generation and detection, ensuring robust data transfer across vast distances with inherent quantum encryption capabilities.
```mermaid
graph TD
    A["Compressed Data Storing Means"] --> B{"Quantum Transduction Unit"};
    B --> C["Entangled Photon Pair Generation"];
    C --> D["Free-Space Optical / Fiber Quantum Channel (Transmitter Means)"];
    D --> E["Quantum Receiver (Remote Location)"];
    E --> F["Reconstructed Classical Data"];
```
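The Bell-state reconstruction the description relies on can be illustrated with a textbook teleportation state-vector simulation (numpy only; the gates and measurement bookkeeping are standard, not anything specific to this system). Note that the protocol still consumes a classical side channel for Alice's two measurement bits.

```python
import numpy as np

# Single-qubit gates and projectors (standard textbook teleportation sketch).
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
P0 = np.diag([1, 0]).astype(complex)
P1 = np.diag([0, 1]).astype(complex)

def lift(gates, n=3):
    """Kron together per-qubit operators (identity where unspecified)."""
    out = np.array([[1]], dtype=complex)
    for q in range(n):
        out = np.kron(out, gates.get(q, I2))
    return out

def teleport(alpha, beta, seed=0):
    """Teleport alpha|0> + beta|1> from qubit 0 to qubit 2 via a Bell pair."""
    bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)  # qubits 1, 2
    state = np.kron(np.array([alpha, beta], dtype=complex), bell)
    cnot01 = lift({0: P0}) + lift({0: P1, 1: X})  # CNOT, control 0 -> target 1
    state = lift({0: H}) @ (cnot01 @ state)       # rotate into the Bell basis
    probs = np.abs(state) ** 2
    idx = np.random.default_rng(seed).choice(8, p=probs / probs.sum())
    m0, m1 = (idx >> 2) & 1, (idx >> 1) & 1       # Alice's classical bits
    keep = [((i >> 2) & 1) == m0 and ((i >> 1) & 1) == m1 for i in range(8)]
    state = np.where(keep, state, 0)
    state /= np.linalg.norm(state)                # collapse onto the outcome
    if m1:                                        # Bob's Pauli corrections,
        state = lift({2: X}) @ state              # conditioned on the bits sent
    if m0:                                        # over a classical channel
        state = lift({2: Z}) @ state
    base = (m0 << 2) | (m1 << 1)
    return np.array([state[base], state[base | 1]])  # qubit 2's amplitudes
```

After the corrections, qubit 2 carries exactly the input amplitudes, which is the "reconstruction of the original classical data" step once a measurement basis is agreed on.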
Derivative 1.3: Operational Parameter Expansion - Planetary-Scale Distribution System
Enabling Description:
This system is designed for the distribution of scientific data (e.g., high-resolution planetary imaging, seismic activity logs) across disparate celestial bodies within a solar system. The "source material library means" resides on an orbital data hub around Earth, storing petabytes of raw observational data. The "identification encoding means" assigns unique identifiers to datasets, factoring in origin body, instrument, and time-slice. "Conversion means" adapts raw telemetry into a unified, radiation-hardened data format. "Ordering means" segments these massive datasets into dynamically addressable blocks optimized for interplanetary packet transmission. "Compression means" utilizes extreme lossy compression algorithms (e.g., for visual data) coupled with error-correcting codes, designed to maintain critical scientific integrity despite significant data reduction for low-bandwidth, high-latency links. The "compressed data storing means" employs solid-state radiation-hardened memory arrays. The "transmitter means" consists of a network of deep-space optical communication relays, leveraging adaptive optics and phased arrays for precise laser beam pointing, capable of transmitting terabits per second across astronomical units, with inherent latency compensation protocols for time-delayed delivery to remote locations such as Martian orbital platforms, lunar habitats, or asteroid mining outposts.
```mermaid
graph TD
    A["Source Material Library (Earth Orbit)"] --> B("ID Encoding - Planetary Data");
    B --> C["Data Conversion - Radiation Hardened Format"];
    C --> D["Ordering - Interplanetary Data Blocks"];
    D --> E["Extreme Compression + ECC"];
    E --> F["Compressed Data Storage (Radiation Hardened)"];
    F --> G["Deep-Space Optical Transmitter Network"];
    G --> H["Remote Location (Mars Orbiter / Lunar Habitat)"];
```
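The latency the compensation protocols must absorb is dominated by light-time delay, which is easy to bound. A back-of-envelope sketch using the well-known range of Earth-Mars distances (roughly 54.6 to 401 million km):

```python
# One-way light-time for the interplanetary links described above.
# Distances are illustrative orbital ranges, not live ephemeris data.
C_KM_S = 299_792.458  # speed of light, km/s

def one_way_delay_s(distance_km: float) -> float:
    """Minimum one-way propagation delay over a vacuum optical link."""
    return distance_km / C_KM_S

print(f"Mars (closest):  {one_way_delay_s(54.6e6) / 60:.1f} min")
print(f"Mars (farthest): {one_way_delay_s(401e6) / 60:.1f} min")
print(f"Moon:            {one_way_delay_s(384_400):.2f} s")
```

With one-way delays ranging from about three to over twenty minutes to Mars, any "real-time" behavior must come from buffering and scheduling at the receiving end, not from the link itself.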
Derivative 1.4: Operational Parameter Expansion - Micro-Scale Bio-Information Distribution
Enabling Description:
This system functions within a biological organism, distributing targeted therapeutic or diagnostic "information" at a cellular or subcellular scale. The "source material library means" is a synthetic DNA plasmid library encoding various therapeutic mRNA sequences, regulatory RNA, or gene-editing instructions, contained within engineered viral vectors. The "identification encoding means" uses specific promoter sequences or receptor-binding domains on the viral vectors to target particular cell types (remote locations) and ensures unique cellular recognition. "Conversion means" involves the host cell's transcription and translation machinery, converting the viral vector's genetic material into functional proteins or RNA within the cell. The "ordering means" dictates the sequential expression or activation of these biological instructions within the cellular machinery, forming addressable molecular pathways. "Compression means" refers to optimized genetic coding (e.g., codon optimization for minimal sequence length) and efficient vector packaging. The "compressed data storing means" are the stably integrated viral genomes within the target cell's nucleus or cytoplasm. The "transmitter means" (e.g., for intercellular distribution) could involve exosomal secretion or gap junction communication, sending specific gene expression products or signaling molecules to adjacent or distant cells, leveraging endogenous biological communication channels.
```mermaid
graph TD
    A["Source Material Library (Engineered Viral Vectors)"] --> B("Targeting/Promoter ID Encoding");
    B --> C["Host Cell Transcription/Translation (Conversion Means)"];
    C --> D["Intracellular Signaling Pathways (Ordering Means)"];
    D --> E["Genetic Code Optimization (Compression Means)"];
    E --> F["Integrated Viral Genome (Compressed Data Storage)"];
    F --> G["Exosomal/Gap Junction Transmitter"];
    G --> H["Remote Location (Target Cell)"];
```
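The codon-selection idea behind the "compression means" can be sketched with a toy preferred-codon table. The table is a small illustrative subset (real codon-usage tables cover all 20 amino acids and are organism-specific):

```python
# Sketch of "optimized genetic coding": one preferred codon per amino acid,
# using a small illustrative subset of the human codon-usage table.
PREFERRED_CODON = {
    "M": "AUG", "K": "AAG", "L": "CUG", "S": "AGC",
    "G": "GGC", "E": "GAG", "*": "UGA",  # "*" marks the stop codon
}

def encode_mrna(peptide: str) -> str:
    """Translate a one-letter peptide string into an mRNA coding sequence."""
    return "".join(PREFERRED_CODON[aa] for aa in peptide + "*")

print(encode_mrna("MKG"))  # AUGAAGGGCUGA
```

Since codons are fixed at three nucleotides, the "compression" here is really expression efficiency and packaging density rather than shorter sequences, which is worth stating plainly in the disclosure.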
Derivative 1.5: Cross-Domain Application - Industrial Automation Control Distribution
Enabling Description:
In an industrial automation context, this system provides real-time operational parameters, firmware updates, and complex control sequences to a distributed fleet of autonomous robotic assembly arms ("remote locations") within a smart factory. The "source material library means" is a central repository of verified robotic control algorithms, CAD designs for parts, and process flow definitions. The "identification encoding means" assigns unique IDs to robot models, software versions, and specific task scripts. "Conversion means" transforms high-level programming language (e.g., Python, C++) or CAD files into machine-executable G-code or robot-specific bytecode. The "ordering means" structures these instructions into prioritized, addressable command blocks suitable for deterministic execution. "Compression means" uses industry-standard motion planning compression algorithms (e.g., for reducing joint trajectories) and firmware differential updates. "Compressed data storing means" are secured servers within the factory network. The "transmitter means" employs a robust, low-latency, time-synchronized industrial Ethernet network (e.g., EtherCAT, Profinet) or a 5G private cellular network, ensuring reliable delivery of critical commands to individual robots or groups thereof.
```mermaid
graph TD
    A["Source Material Library (Control Algorithms, CAD)"] --> B("Robot/Task ID Encoding");
    B --> C["High-Level to G-code Conversion"];
    C --> D["Prioritized Command Block Ordering"];
    D --> E["Motion Planning Compression"];
    E --> F["Secured Factory Server (Compressed Data Storage)"];
    F --> G["Industrial Ethernet / 5G Transmitter"];
    G --> H["Remote Location (Autonomous Robotic Arm)"];
```
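The joint-trajectory reduction step can be sketched as quantize-then-delta encoding: successive samples of a joint angle differ little, so storing integer deltas is far cheaper than raw floats. The 0.01-degree step and the scheme itself are illustrative assumptions, not an industrial standard.

```python
# Sketch of the trajectory "compression means": delta-encode sampled joint
# angles so small inter-sample motions fit in fewer bits.
def delta_encode(angles_deg, step=0.01):
    """Quantize angles to `step` and store first tick + integer deltas."""
    ticks = [round(a / step) for a in angles_deg]
    return [ticks[0]] + [b - a for a, b in zip(ticks, ticks[1:])]

def delta_decode(encoded, step=0.01):
    """Accumulate deltas back into absolute angles."""
    out, acc = [], 0
    for d in encoded:
        acc += d
        out.append(acc * step)
    return out
```

The round trip is exact up to half a quantization step, which bounds the tracking error the robot controller must tolerate.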
Derivative 1.6: Cross-Domain Application - Smart Agriculture Precision Farming
Enabling Description:
This system supports precision agriculture by distributing analytical data and prescriptive actions to autonomous farm machinery. The "source material library means" is a cloud-based geospatial database containing historical yield maps, soil composition data, and hyperspectral imagery from drones. "Identification encoding means" assigns unique plot IDs, crop cycle timestamps, and sensor data references to individual data layers. "Conversion means" transforms raw sensor data and imagery into standardized, actionable formats, such as nutrient requirement maps (e.g., NPK levels) or pest infestation probabilities. "Ordering means" organizes these into addressable geo-referenced task blocks for specific zones within a field. "Compression means" applies image compression (e.g., JPEG 2000 for hyperspectral data) and lossless compression for prescriptive application rates. "Compressed data storing means" are distributed edge servers on agricultural land or on farm vehicles. The "transmitter means" utilizes a robust, long-range wireless mesh network (e.g., LoRaWAN for telemetry, 5G for high-bandwidth imagery) to send these compressed task blocks to autonomous tractors, drones, or smart irrigation systems ("remote locations") for precise application of water, fertilizer, or pesticides.
```mermaid
graph TD
    A["Source Material Library (Geospatial DB, Imagery)"] --> B("Plot/Crop ID Encoding");
    B --> C["Raw to Prescriptive Map Conversion"];
    C --> D["Geo-referenced Task Block Ordering"];
    D --> E["Image & Data Compression"];
    E --> F["Edge Server (Compressed Data Storage)"];
    F --> G["Wireless Mesh / 5G Transmitter"];
    G --> H["Remote Location (Autonomous Tractor / Irrigation)"];
```
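The "geo-referenced task blocks" of the ordering means can be sketched as addressable records cut from a prescriptive rate grid. The field names (e.g. `n_kg_per_ha`) and the addressing scheme are illustrative assumptions:

```python
# Sketch of the "ordering means": slice a prescriptive field map into
# addressable, geo-referenced task blocks.
from dataclasses import dataclass

@dataclass(frozen=True)
class TaskBlock:
    block_id: str       # addressable identifier, e.g. "plot7-r2c3"
    row: int
    col: int
    n_kg_per_ha: float  # prescriptive nitrogen rate for this zone

def to_task_blocks(plot_id: str, rate_grid):
    """Turn a 2-D grid of application rates into addressed task blocks."""
    return [
        TaskBlock(f"{plot_id}-r{r}c{c}", r, c, rate)
        for r, row in enumerate(rate_grid)
        for c, rate in enumerate(row)
    ]
```

Each block is independently addressable, so a tractor can request exactly the zones along its current pass rather than the whole field map.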
Derivative 1.7: Cross-Domain Application - Environmental Monitoring & Early Warning
Enabling Description:
This system focuses on distributing critical environmental sensor data for real-time monitoring and early warning, such as seismic activity or tsunami threats. The "source material library means" consists of a globally distributed network of deep-ocean acoustic sensors, seismic stations, and atmospheric monitoring probes. "Identification encoding means" assigns unique sensor IDs, geophysical event tags, and temporal markers to raw data streams. "Conversion means" processes raw sensor readings (e.g., pressure waves, ground acceleration) into standardized geophysical data formats (e.g., SAC for seismology, NetCDF for oceanography). "Ordering means" aggregates and sequences these validated data points into addressable event-based or time-series blocks. "Compression means" utilizes specialized geophysical data compression techniques (e.g., for waveforms) to reduce bandwidth requirements while preserving critical event characteristics. "Compressed data storing means" are regional data centers with high-availability storage. The "transmitter means" comprises a multi-layered communication infrastructure including geostationary satellite links, submarine fiber optic cables, and terrestrial broadband networks, capable of sending prioritized warning messages and processed data files to emergency response centers, coastal warning sirens, and public broadcast systems ("remote locations") with ultra-low latency.
```mermaid
graph TD
    A["Source Material Library (Global Sensor Network)"] --> B("Sensor/Event ID Encoding");
    B --> C["Raw to Geophysical Format Conversion"];
    C --> D["Event/Time-Series Block Ordering"];
    D --> E["Geophysical Data Compression"];
    E --> F["Regional Data Center (Compressed Data Storage)"];
    F --> G["Satellite/Submarine/Broadband Transmitter"];
    G --> H["Remote Location (Emergency Response Center)"];
```
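The waveform compression step can be sketched as delta + zigzag + varint coding of integer samples, a generic lossless scheme chosen here for illustration (real geophysical codecs differ, but exploit the same property: adjacent samples are close in value):

```python
# Sketch of the geophysical "compression means": delta + zigzag + varint
# coding of integer waveform samples. The scheme is an illustrative choice.
def zigzag(n: int) -> int:
    return (n << 1) ^ (n >> 63)  # signed -> unsigned; small |n| stays small

def compress(samples) -> bytes:
    out, prev = bytearray(), 0
    for s in samples:
        v = zigzag(s - prev)
        prev = s
        while v >= 0x80:                 # 7-bit varint continuation bytes
            out.append((v & 0x7F) | 0x80)
            v >>= 7
        out.append(v)
    return bytes(out)

def decompress(data: bytes):
    samples, prev, i = [], 0, 0
    while i < len(data):
        v, shift = 0, 0
        while True:
            byte = data[i]; i += 1
            v |= (byte & 0x7F) << shift
            shift += 7
            if byte < 0x80:
                break
        prev += (v >> 1) ^ -(v & 1)      # inverse zigzag, then accumulate
        samples.append(prev)
    return samples
```

Small inter-sample deltas, typical of quiet seismic traces, encode as single bytes, while the scheme remains lossless so event waveforms survive intact.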
Derivative 1.8: Integration with Emerging Tech - AI, IoT, and Blockchain
Enabling Description:
This system integrates AI-driven content optimization, IoT sensors for real-time network monitoring, and blockchain for secure rights management. The "source material library means" stores digital media. "Identification encoding means" not only assigns a unique ID but also embeds content metadata (e.g., creator, license terms) as a cryptographic hash. The "conversion means" adapts content based on real-time feedback from "IoT sensors" monitoring network congestion, receiver capabilities, and user QoE (Quality of Experience) metrics. An "AI-driven optimization module," coupled with the "ordering means" and "compression means," dynamically selects optimal codecs, bitrates, and data block sequencing for each specific "remote location," minimizing latency and maximizing perceived quality. The "compressed data storing means" stores the content alongside a "blockchain ledger" of its rights and transaction history. The "transmitter means" initiates delivery only after a smart contract on the blockchain validates the user's access rights and processes payment, ensuring immutable proof of transaction and usage. The AI continuously refines compression and transmission parameters based on global IoT network performance data and user feedback recorded on the blockchain.
```mermaid
graph TD
    A["Source Material Library"] --> B("ID Encoding + Crypto Hash");
    B -- "IoT Network Feedback" --> C["AI-Driven Conversion Means"];
    C -- "IoT Network Feedback" --> D["AI-Driven Ordering Means"];
    D -- "IoT Network Feedback" --> E["AI-Driven Compression Means"];
    E --> F{"Compressed Data Storing Means + Blockchain Ledger"};
    F -- "Smart Contract Validation" --> G["Intelligent Transmitter Means"];
    G --> H["Remote Location"];
    subgraph "IoT Network"
        IoT["IoT Sensors"] --> C;
        IoT --> D;
        IoT --> E;
    end
    subgraph "Blockchain"
        BC["Blockchain Ledger"] --> F;
        BC -- "Validation" --> G;
    end
```
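The ledger component can be sketched as a minimal hash chain: each block commits to its payload and its predecessor's hash, so any tampering is detectable. This stands in for the blockchain described above (no consensus, signatures, or smart contracts):

```python
# Minimal hash-chain sketch of the rights/transaction ledger.
import hashlib, json

def add_block(chain, payload: dict) -> dict:
    """Append a block whose hash commits to the payload and previous hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps({"prev": prev_hash, "payload": payload}, sort_keys=True)
    block = {"prev": prev_hash, "payload": payload,
             "hash": hashlib.sha256(body.encode()).hexdigest()}
    chain.append(block)
    return block

def verify(chain) -> bool:
    """Recompute every hash; any edited payload breaks the chain."""
    prev = "0" * 64
    for block in chain:
        body = json.dumps({"prev": prev, "payload": block["payload"]},
                          sort_keys=True)
        if block["prev"] != prev or \
           block["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        prev = block["hash"]
    return True
```

Recording each delivery as a block gives the "immutable proof of transaction" property the description relies on, at least against after-the-fact edits.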
Derivative 1.9: Integration with Emerging Tech - AI, IoT, and Blockchain for Data Provenance
Enabling Description:
This system specifically focuses on distributing critical sensor data with verifiable provenance and AI-driven insights. The "source material library means" comprises a distributed network of heterogeneous IoT sensors (e.g., environmental, industrial, biometric) continuously streaming raw data. The "identification encoding means" assigns unique identifiers to each sensor and data stream, also generating a cryptographic hash of the raw data. This hash, along with sensor metadata (location, calibration), is immutably anchored to a "permissioned blockchain ledger." The "conversion means" incorporates an AI-driven data cleansing and harmonization module, standardizing disparate sensor formats. The "ordering means" organizes the clean data into time-series or event-triggered blocks. The "compression means" applies adaptive compression algorithms based on data criticality and real-time bandwidth availability, influenced by AI predictive models for network load. "Compressed data storing means" are geo-distributed edge nodes that periodically batch-upload validated, compressed data to a central cloud repository, with each batch's hash linked to the blockchain. The "transmitter means" is an AI-managed secure data fabric, ensuring only authenticated and authorized "remote locations" (e.g., regulatory agencies, AI analytics platforms, public safety dashboards) receive the data, with every transmission event recorded on the blockchain for auditability and non-repudiation.
```mermaid
graph TD
    A["IoT Sensor Network (Raw Data Streams)"] --> B("ID Encoding + Crypto Hash");
    B -- "Hash & Metadata" --> BL("Permissioned Blockchain Ledger");
    B --> C["AI Data Cleansing & Harmonization (Conversion Means)"];
    C --> D["Time-Series / Event Block Ordering"];
    D --> E["Adaptive Compression (AI-Managed)"];
    E --> F["Edge Node Compressed Data Storage"];
    F -- "Batch Hash" --> BL;
    F --> G["AI-Managed Secure Data Fabric (Transmitter)"];
    G --> H["Remote Location (Regulatory Agency / Analytics Platform)"];
```
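The batch-anchoring step can be sketched with a Merkle root: a single hash commits an entire sensor batch to the ledger, and any altered reading changes the root. The odd-node duplication rule below is one common illustrative choice, not a mandated scheme:

```python
# Sketch of batch anchoring: a Merkle root commits a batch of readings
# to the ledger in one hash. Requires a non-empty batch.
import hashlib

def _h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(readings) -> str:
    """Hash each reading, then pairwise-combine up to a single root."""
    level = [_h(r.encode()) for r in readings]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate last node on odd levels
            level.append(level[-1])
        level = [_h(a + b) for a, b in zip(level[::2], level[1::2])]
    return level[0].hex()
```

Anchoring one root per batch keeps ledger traffic constant regardless of batch size, while still letting an auditor prove any individual reading was present.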
Derivative 1.10: The "Inverse" or Failure Mode - "Dark-Mode" Disaster Communication System
Enabling Description:
This system is an emergency "dark-mode" variant designed for operation during widespread infrastructure failure (e.g., natural disasters, cyberattacks). The "source material library means" is a pre-hardened, offline repository of essential emergency information (e.g., first aid guides, evacuation routes, critical contact numbers, limited pre-recorded PSAs). The "identification encoding means" prioritizes content based on criticality (e.g., "life-saving," "situational awareness," "non-essential"). The "conversion means" performs aggressive downsampling and grayscale conversion for video, and mono, low-bitrate audio conversion, prioritizing intelligibility over fidelity. The "ordering means" enforces a strict priority queue for transmission, with life-saving information always at the top. The "compression means" utilizes extreme, robust, and computationally inexpensive compression (e.g., run-length encoding for monochrome images, simple delta PCM for audio) to ensure data delivery even over severely degraded channels. The "compressed data storing means" are solid-state, battery-backed, and EMP-hardened local caches within emergency broadcast nodes. The "transmitter means" leverages opportunistic, intermittent, and low-power communication methods such as shortwave radio, satellite burst modems, or even long-range acoustic signaling, selectively beaming limited-functionality, text-heavy information packets or highly compressed emergency broadcasts to designated, simplified emergency receivers (e.g., hand-crank radios, basic satellite phones) at "remote locations," automatically omitting non-essential data.
```mermaid
graph TD
    A["Offline Emergency Content Library (Hardened)"] --> B("Criticality-Based ID Encoding");
    B --> C["Aggressive Downsampling / Grayscale Conversion"];
    C --> D["Strict Priority Transmission Ordering"];
    D --> E["Robust, Low-Cost Compression"];
    E --> F["Hardened Local Cache (Compressed Data Storage)"];
    F --> G["Opportunistic Low-Power Transmitter (Shortwave, Burst Sat)"];
    G --> H["Remote Location (Simplified Emergency Receiver)"];
```
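The run-length encoding mentioned for monochrome images can be sketched directly; long uniform runs, common in text-heavy emergency graphics, collapse to a handful of pairs, and the decoder is trivial enough for the most constrained receiver:

```python
# Sketch of the "robust, low-cost compression": run-length encoding of a
# 1-bit image row, as mentioned for monochrome emergency graphics.
def rle_encode(row):
    """Encode a list of 0/1 pixels as (value, run_length) pairs."""
    runs = []
    for px in row:
        if runs and runs[-1][0] == px:
            runs[-1] = (px, runs[-1][1] + 1)
        else:
            runs.append((px, 1))
    return runs

def rle_decode(runs):
    """Expand (value, run_length) pairs back into pixels."""
    return [value for value, count in runs for _ in range(count)]
```

RLE is also failure-tolerant in the sense that matters here: it needs no tables or state beyond the current run, so a corrupted packet damages only the runs it contains.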
Derivatives of Independent Claim 2: Distribution Method
Claim 2: A distribution method responsive to requests identifying information to be sent from a transmission system to a remote location, said method comprising the steps of:
storing audio and video information in a compressed data form;
requesting transmission, by a user, of at least a part of said stored compressed information to said remote location;
sending at least a portion of said stored compressed information to said remote location;
receiving the sent information at said remote location;
buffering the processed information at said remote location; and
playing back said buffered information in real time at a time requested by said user.
Derivative 2.1: Material & Component Substitution - Quantum Memory and Teleportation Method
Enabling Description:
This method stores "information" (e.g., audio/video bitstreams represented as quantum states) within quantum memory crystals (e.g., rare-earth-ion-doped crystals) by encoding bits into electron spin or photon polarization states. A "user" initiates "requesting transmission" via a direct neural interface, triggering a specific quantum entanglement generation sequence. "Sending at least a portion" involves quantum teleportation of the prepared quantum information states across a quantum network to an entangled node at the "remote location." "Receiving the sent information" is the successful reconstruction of the quantum state at the remote node. "Buffering the processed information" occurs in a local quantum memory at the remote location, maintaining coherence for a specified duration. "Playing back said buffered information" involves reading out the quantum states to classical transducers (e.g., photon detectors, electron spin readers) which convert them into real-time audio/video signals for the user at the requested time.
```mermaid
sequenceDiagram
    User->>Quantum Memory: Request Transmission (Neural Interface)
    Quantum Memory->>Quantum Network: Prepare & Entangle Qubits
    Quantum Network-->>Quantum Network: Quantum Teleportation (Sending Info)
    Quantum Network->>Remote Quantum Node: Receive Quantum State
    Remote Quantum Node->>Local Quantum Memory: Buffer Quantum Info
    Local Quantum Memory->>User: Playback (Classical Transducers)
```
Derivative 2.2: Material & Component Substitution - Biochemical Information Distribution
Enabling Description:
This method distributes biochemical "information" within a complex biological system, such as a multi-organism bioreactor or human body. "Storing audio and video information in a compressed data form" is achieved by encoding data into specific sequences of messenger RNA (mRNA) or synthetic proteins, which are "compressed" by optimizing amino acid or nucleotide sequence redundancy. "Requesting transmission, by a user" could involve introducing specific molecular markers or activating optogenetic switches within a donor cell. "Sending at least a portion" is accomplished via active cellular transport mechanisms like exocytosis of vesicles containing the encoded mRNA/proteins, or through specific receptor-ligand interactions across intercellular junctions. "Receiving the sent information" involves target cells internalizing the vesicles or binding the signaling molecules. "Buffering the processed information" occurs as the mRNA translates into proteins or as proteins activate downstream signaling cascades, maintaining the "information" state for a biological duration. "Playing back said buffered information in real time at a time requested by said user" manifests as a physiological response, enzymatic reaction, or gene expression cascade within the target cells, initiated by an internal biological clock or external stimulus from the "user."
```mermaid
graph TD
    A["Stored Info (Encoded mRNA/Proteins)"] --> B{"User Request (Molecular Marker/Optogenetic)"};
    B --> C["Active Cell Transport (Sending)"];
    C --> D["Target Cell Internalization (Receiving)"];
    D --> E["Intracellular Buffering (Translation/Signaling)"];
    E --> F["Physiological Response / Gene Expression (Playback)"];
```
Derivative 2.3: Operational Parameter Expansion - Multi-Sensory Data Distribution for VR/AR
Enabling Description:
This method distributes ultra-high-resolution, multi-sensory data (e.g., 16K video, 3D spatial audio, haptic feedback profiles, olfactory cues) for immersive Virtual Reality (VR) or Augmented Reality (AR) experiences. "Storing audio and video information in a compressed data form" includes storing haptic textures, thermal profiles, and synthetic scent molecular compositions alongside traditional A/V, using specialized codecs for each sensory modality. "Requesting transmission" is initiated by a user within a VR/AR environment. "Sending at least a portion" occurs over dedicated, ultra-low-latency 6G wireless channels with massive MIMO and beamforming. "Receiving the sent information" is performed by a high-bandwidth AR/VR headset with integrated haptic actuators, micro-olfactory emitters, and foveated rendering engines. "Buffering the processed information" at the remote location (the headset) uses a combination of on-device GPU memory and predictive caching algorithms to maintain seamless, high-framerate playback across all sensory streams, anticipating user head movements and interactions. "Playing back said buffered information in real time" is synchronized presentation across all sensory output devices (visual, auditory, haptic, olfactory) at a refresh rate exceeding human perception, at the user's requested interaction time.
```mermaid
sequenceDiagram
    participant User as User (VR/AR Headset)
    User->>Transmission System: Request Multi-Sensory Experience
    Transmission System->>6G Network: Send Compressed 16K/3D Audio/Haptic/Olfactory Data
    6G Network->>VR/AR Headset: Receive Multi-Sensory Data
    VR/AR Headset->>On-Device Cache: Buffer Processed Info (GPU Mem, Predictive Cache)
    VR/AR Headset->>User: Real-time Synchronized Playback (Visual, Auditory, Haptic, Olfactory)
```
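The predictive-caching portion of the buffering step can be sketched as a small playout buffer with fixed lookahead. The lookahead policy is an illustrative assumption; a real engine would predict demand from head pose and interaction state rather than simple frame order:

```python
# Sketch of the headset-side buffering: fetch the requested frame and
# prefetch a fixed lookahead window, evicting the oldest entries.
from collections import OrderedDict

class PlayoutBuffer:
    def __init__(self, fetch, capacity=8, lookahead=3):
        self.fetch = fetch              # callable: frame index -> frame data
        self.cache = OrderedDict()      # insertion order doubles as age order
        self.capacity = capacity
        self.lookahead = lookahead

    def frame(self, idx):
        """Return frame `idx`, prefetching the next `lookahead` frames."""
        for i in range(idx, idx + self.lookahead + 1):
            if i not in self.cache:
                self.cache[i] = self.fetch(i)
            while len(self.cache) > self.capacity:
                self.cache.popitem(last=False)  # evict the oldest frame
        return self.cache[idx]
```

Because the window is prefetched on every access, steady forward playback only ever triggers one network fetch per displayed frame, which is what keeps the sensory streams seamless.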
Derivative 2.4: Operational Parameter Expansion - Sub-Atomic Information Distribution
Enabling Description:
This extreme-scale method deals with the distribution of quantum information, where "audio and video information" is conceptualized as the specific spin states or entanglement properties of individual sub-atomic particles. "Storing" involves maintaining these quantum states in ultra-stable quantum registers or atomic clocks, in a "compressed" form relative to the classical information they could represent. "Requesting transmission" is initiated by a quantum computer operator. "Sending at least a portion" involves manipulating quantum fields or using resonant frequencies to perform a controlled measurement on a sender particle; the outcome is correlated with the state of an entangled receiver particle at a "remote location," while a classical side channel carries the measurement results, since the no-communication theorem forbids extracting usable information from the correlations alone. "Receiving the sent information" is the detection of the correlated state in the remote entangled particle once the classical results arrive. "Buffering the processed information" entails storing the resulting quantum state in another quantum register at the remote location for a fleeting period of quantum coherence. "Playing back said buffered information" involves performing a specific quantum measurement on the buffered particle, collapsing its wavefunction to yield a classical output (e.g., a binary bit) at a precisely requested time, which can then be interpreted.
```mermaid
graph TD
    A["Quantum Register (Stored Particle Spin States)"] --> B{"Quantum Computer Operator Request"};
    B --> C["Manipulate Sender Particle (Sending)"];
    C -- "Quantum Entanglement" --> D["Detect Remote Entangled Particle (Receiving)"];
    D --> E["Remote Quantum Register (Buffering)"];
    E --> F["Quantum Measurement (Playback)"];
```
Derivative 2.5: Cross-Domain Application - Secure Medical Imaging Distribution
Enabling Description:
This method applies to the secure distribution of high-resolution medical imaging data within a healthcare network. "Storing audio and video information in a compressed data form" refers to storing DICOM (Digital Imaging and Communications in Medicine) formatted patient scans (e.g., MRI, CT) in a compressed, encrypted format within a hospital's Picture Archiving and Communication System (PACS). "Requesting transmission, by a user" occurs when a remote specialist (the user) accesses an Electronic Health Record (EHR) and requests a specific patient's scan from their diagnostic workstation (remote location). "Sending at least a portion of said stored compressed information" involves the PACS server securely transmitting the encrypted, compressed DICOM files over a dedicated Virtual Private Network (VPN) or secure healthcare cloud. "Receiving the sent information" is handled by the specialist's workstation. "Buffering the processed information at said remote location" utilizes local storage on the workstation to pre-fetch and cache image slices for smooth, real-time scrolling and manipulation by the diagnostic viewer application, optimizing for display latency. "Playing back said buffered information in real time at a time requested by said user" is the instantaneous rendering and interactive display of the medical images by the specialist, allowing for detailed examination at their convenience.
```mermaid
sequenceDiagram
    participant Specialist as Specialist (Remote)
    Specialist->>EHR/PACS: Request Encrypted DICOM Scan
    EHR/PACS->>VPN/Secure Cloud: Send Compressed, Encrypted DICOM
    VPN/Secure Cloud->>Specialist Workstation: Receive Encrypted DICOM
    Specialist Workstation->>Local Storage: Buffer Processed Info (Image Slices)
    Specialist Workstation->>Specialist: Real-time Diagnostic Display
```
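The integrity side of the secure transfer can be sketched with an HMAC tag over the compressed payload, using Python's standard library. The static shared key shown is an illustrative simplification of real key management (production DICOM networks typically rely on TLS and PKI):

```python
# Sketch of integrity protection for the compressed DICOM payload:
# an HMAC-SHA256 tag lets the workstation detect any in-transit tampering.
import hmac, hashlib

def seal(payload: bytes, key: bytes) -> bytes:
    """Prepend a 32-byte HMAC-SHA256 tag to the payload."""
    return hmac.new(key, payload, hashlib.sha256).digest() + payload

def open_sealed(blob: bytes, key: bytes) -> bytes:
    """Verify the tag in constant time; raise on tampering."""
    tag, payload = blob[:32], blob[32:]
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("integrity check failed")
    return payload
```

`hmac.compare_digest` avoids timing side channels in the comparison, a detail worth preserving in any real implementation of this step.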
Derivative 2.6: Cross-Domain Application - Aerospace Flight Telemetry Distribution
Enabling Description:
This method is used for distributing critical flight telemetry and diagnostic video feeds in an aerospace context. "Storing audio and video information in a compressed data form" means archiving high-rate sensor data (e.g., engine parameters, attitude control, environmental readings) and cockpit/external video feeds from spacecraft or aircraft in highly compressed, resilient formats within a central mission control data archive. "Requesting transmission, by a user" occurs when ground control engineers or autonomous monitoring systems (the user) require specific telemetry bursts or real-time video streams for diagnostics or mission planning at a ground station (remote location). "Sending at least a portion of said stored compressed information" is performed by satellite communication links (e.g., Ka-band, optical links) from the orbiting craft to the ground station. "Receiving the sent information" is handled by the ground station's receiving antennae and data acquisition systems. "Buffering the processed information at said remote location" involves using specialized high-speed memory buffers at the ground station to smooth out intermittent signal losses and compensate for communication delays inherent in space-to-ground links, ensuring a continuous stream for analysis. "Playing back said buffered information in real time at a time requested by said user" refers to the display of dynamic telemetry on engineering consoles and video feeds on monitors, synchronized for real-time analysis by ground control personnel.
graph TD
A["Spacecraft/Aircraft Archive (Compressed Telemetry/Video)"] --> B{Ground Control Request};
B --> C["Satellite Comm Link (Sending)"];
C --> D["Ground Station Receiver (Receiving)"];
D --> E["High-Speed Buffering (Delay Comp)"];
E --> F["Engineering Console / Video Display (Playback)"];
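The ground-station buffering step above can be sketched as a small reorder/playout buffer: telemetry blocks arrive out of order and with gaps over the space-to-ground link, and are released in sequence only while a minimum reserve depth remains to ride out short dropouts. The class name, minimum depth, and sequence-numbered blocks are illustrative assumptions, not part of the patent.

```python
import heapq

class PlayoutBuffer:
    """Sketch of high-speed buffering with delay compensation: reorder
    telemetry blocks by sequence number and hold back a reserve depth
    so short signal dropouts do not interrupt the analysis stream."""

    def __init__(self, min_depth=3):
        self.min_depth = min_depth   # blocks kept in reserve during dropouts
        self.heap = []               # min-heap of (sequence_number, payload)

    def receive(self, seq, payload):
        # Blocks may arrive out of order; the heap restores sequence order.
        heapq.heappush(self.heap, (seq, payload))

    def drain(self):
        """Release in-order blocks while keeping min_depth in reserve."""
        released = []
        while len(self.heap) > self.min_depth:
            released.append(heapq.heappop(self.heap))
        return released
```

A deeper `min_depth` tolerates longer link outages at the cost of added end-to-end latency, which is the usual trade-off in space-to-ground playout.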
Derivative 2.7: Cross-Domain Application - High-Frequency Trading Market Data Distribution
Enabling Description:
This method addresses the distribution of ultra-low-latency market data for high-frequency trading (HFT) applications. "Storing audio and video information in a compressed data form" translates to maintaining real-time order book data, trade execution records, and news feeds in highly optimized, binary-encoded, and compressed formats within an exchange's matching engine or primary data distributor. "Requesting transmission, by a user" occurs when HFT firms' proprietary trading algorithms (the user) subscribe to specific market data feeds from their co-located servers (remote location). "Sending at least a portion of said stored compressed information" is achieved via dedicated, ultra-low-latency fiber optic lines directly from the exchange to the co-location facility, using multicast protocols. "Receiving the sent information" is handled by specialized network interface cards (NICs) on the HFT firm's servers. "Buffering the processed information at said remote location" employs kernel-bypass network stacks and ring buffers in user-space memory on the HFT servers to minimize jitter and ensure deterministic processing of incoming market data packets, maintaining an extremely fresh view of the market. "Playing back said buffered information in real time at a time requested by said user" refers to the HFT algorithms consuming and reacting to the buffered market data with sub-microsecond latency, triggering trade orders in response to market events.
sequenceDiagram
HFT Algorithm(Co-Lo)->>Exchange Data Distributor: Subscribe Market Data Feed
Exchange Data Distributor->>Dedicated Fiber Optics: Send Compressed Binary Market Data
Dedicated Fiber Optics->>HFT Server(Co-Lo): Receive Market Data
HFT Server(Co-Lo)->>User-Space Ring Buffer: Buffer Processed Info (Low Latency)
HFT Server(Co-Lo)->>HFT Algorithm(Co-Lo): Real-time Algorithmic Consumption
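The user-space ring buffer described above can be sketched as a fixed-capacity, preallocated circular queue, so that enqueuing market-data packets never allocates on the hot path: a plain-Python analogue of what a kernel-bypass stack does in pinned memory. The class name and full/empty policy are illustrative assumptions.

```python
class RingBuffer:
    """Sketch of a single-producer/single-consumer ring buffer: all slots
    are preallocated, and a full buffer drops rather than blocks, keeping
    packet handling deterministic."""

    def __init__(self, capacity):
        self.slots = [None] * capacity   # preallocated once, never resized
        self.capacity = capacity
        self.head = 0                    # next slot to write
        self.tail = 0                    # next slot to read
        self.count = 0

    def push(self, packet):
        if self.count == self.capacity:
            return False                 # full: drop/back-pressure, never block
        self.slots[self.head] = packet
        self.head = (self.head + 1) % self.capacity
        self.count += 1
        return True

    def pop(self):
        if self.count == 0:
            return None                  # empty: consumer spins or yields
        packet = self.slots[self.tail]
        self.tail = (self.tail + 1) % self.capacity
        self.count -= 1
        return packet
```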
Derivative 2.8: Integration with Emerging Tech - AI-Personalized Educational Content
Enabling Description:
This method applies to the distribution of AI-personalized educational content over a decentralized IoT network. "Storing audio and video information in a compressed data form" involves storing adaptive learning modules, interactive video lectures, and dynamic quizzes in compressed, tokenized form for efficient delivery. "Requesting transmission, by a user" is an explicit request for a learning module from a student via a smart device (e.g., tablet, AR glasses) acting as a "remote location." An "AI-driven personalization engine" (part of the transmission system) dynamically tailors the content difficulty, presentation style, and sequence based on the student's real-time performance and learning profile, identified through IoT sensors in their environment (e.g., eye-tracking, galvanic skin response). "Sending at least a portion of said stored compressed information" occurs over a peer-to-peer decentralized IoT network where other student devices or local edge nodes contribute bandwidth. Content rights and micro-payments for premium modules are managed by "smart contracts on a blockchain." "Receiving the sent information" is handled by the student's smart device. "Buffering the processed information at said remote location" utilizes on-device storage to pre-load upcoming segments and interactive elements, ensuring a smooth, uninterrupted learning flow. "Playing back said buffered information in real time at a time requested by said user" is the interactive display of the personalized module, adapting in real-time to user input and AI feedback loops.
sequenceDiagram
Student Device(IoT)->>AI Personalization Engine: Request Learning Module (with IoT Data)
AI Personalization Engine->>P2P IoT Network: Send Personalized, Compressed Content
P2P IoT Network->>Smart Contract (Blockchain): Verify Rights/Payment
Smart Contract (Blockchain)-->>P2P IoT Network: Authorization
P2P IoT Network->>Student Device(IoT): Receive Content
Student Device(IoT)->>On-Device Storage: Buffer Processed Info
Student Device(IoT)->>Student Device(IoT): Real-time Adaptive Playback
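The on-device pre-load step above can be sketched as a simple prefetch-window calculation: given the segment the student is currently on, determine which upcoming segments must still be fetched so the next few are always resident before they are needed. The function name and window size are illustrative assumptions.

```python
def segments_to_prefetch(current_index, buffered, total_segments, window=3):
    """Sketch of buffering for uninterrupted playback: return the indices
    of upcoming segments (within `window` of the current one) that are
    not yet in the on-device buffer.

    current_index  -- segment currently being played
    buffered       -- set of segment indices already on the device
    total_segments -- length of the learning module
    """
    wanted = range(current_index + 1,
                   min(current_index + 1 + window, total_segments))
    return [i for i in wanted if i not in buffered]
```

Calling this on every segment boundary keeps the buffer topped up; an adaptive engine could widen `window` when network conditions degrade.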
Derivative 2.9: Integration with Emerging Tech - AI, IoT, and Blockchain for Smart City Data
Enabling Description:
This method distributes real-time sensor data from smart city infrastructure, enhanced by AI and secured by blockchain. "Storing audio and video information in a compressed data form" refers to storing real-time traffic camera feeds, environmental sensor readings (air quality, noise), and utility consumption data from distributed "IoT sensors" in a city's central data lake, using high-efficiency compression. "Requesting transmission, by a user" is triggered by an "AI-powered urban planning model" or a city management dashboard (the user) requiring specific data feeds for predictive analytics or operational oversight at a data analysis center (remote location). The "AI-powered model" dynamically adjusts its data requests based on observed patterns and predicted needs (e.g., requesting more traffic data during rush hour). "Sending at least a portion of said stored compressed information" is done over a secure city-wide optical fiber network or private 5G network. The data stream's integrity and timestamps are continuously validated via "blockchain technology" (e.g., by hashing data blocks and adding them to a distributed ledger), providing an immutable audit trail. "Receiving the sent information" is handled by the data analysis center. "Buffering the processed information at said remote location" utilizes in-memory databases and stream processing engines to ensure low-latency access for the AI model, allowing it to perform real-time analysis. "Playing back said buffered information in real time at a time requested by said user" is the live visualization on dashboards or the direct consumption by the AI model to update its simulations and decision-making processes.
sequenceDiagram
IoT Sensors(City)->>Central Data Lake: Store Compressed Sensor Data
AI Urban Model(Remote)->>Central Data Lake: Request Data (Dynamic, AI-Driven)
Central Data Lake->>Secure Fiber/5G Network: Send Compressed Data + Blockchain Hashes
Secure Fiber/5G Network->>Blockchain Network: Validate Data Integrity
Blockchain Network-->>Secure Fiber/5G Network: Confirmation
Secure Fiber/5G Network->>Data Analysis Center(Remote): Receive Data
Data Analysis Center(Remote)->>In-Memory DB: Buffer Processed Info
Data Analysis Center(Remote)->>AI Urban Model(Remote): Real-time Model Update/Visualization
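The integrity mechanism described above (hashing data blocks and adding them to a distributed ledger) can be sketched as a hash chain: each sensor-data block is hashed together with the previous entry's hash, so tampering with any block invalidates every hash after it. This is a minimal illustration, not the city network's actual protocol; function names are assumptions.

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash for the first entry

def chain_blocks(data_blocks):
    """Build a hash-chained ledger over a list of JSON-serializable blocks."""
    ledger, prev_hash = [], GENESIS
    for block in data_blocks:
        payload = prev_hash + json.dumps(block, sort_keys=True)
        digest = hashlib.sha256(payload.encode()).hexdigest()
        ledger.append({"block": block, "prev": prev_hash, "hash": digest})
        prev_hash = digest
    return ledger

def verify_chain(ledger):
    """Recompute every digest; any tampered block breaks the chain."""
    prev_hash = GENESIS
    for entry in ledger:
        payload = prev_hash + json.dumps(entry["block"], sort_keys=True)
        digest = hashlib.sha256(payload.encode()).hexdigest()
        if entry["prev"] != prev_hash or entry["hash"] != digest:
            return False
        prev_hash = digest
    return True
```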
Derivative 2.10: The "Inverse" or Failure Mode - Data Degradation for Preview/Limited Access
Enabling Description:
This method describes a controlled "data degradation" process for offering previews or limited access to content while preventing full fidelity acquisition. "Storing audio and video information in a compressed data form" remains as in the original patent. "Requesting transmission, by a user, of at least a part of said stored compressed information to said remote location" specifically includes an option for "limited access" or "preview." Upon such a request, before "sending at least a portion," the transmission system dynamically applies a "degradation module." This module actively introduces specific artifacts (e.g., watermarks, pixelation, temporal aliasing by frame skipping, significant bitrate reduction, or even intentional audio gaps/noise) into the compressed A/V stream, or performs selective encryption of only a portion of the data. The "sending" process then transmits this degraded, compressed information. "Receiving the sent information" is performed by the remote location. "Buffering the processed information at said remote location" only stores the intentionally degraded version. "Playing back said buffered information in real time at a time requested by said user" presents the visibly or audibly impaired content, which serves as a preview or limited-functionality experience, but is inherently unsuitable for high-quality copying or unrestricted usage. This method ensures that unauthorized full-fidelity capture is not possible from the transmitted stream.
sequenceDiagram
User->>Transmission System: Request Limited Access / Preview
Transmission System->>Degradation Module: Apply Intentional Artifacts / Partial Encryption
Degradation Module->>Transmission System: Send Degraded, Compressed Info
Transmission System->>Remote Location: Receive Degraded Info
Remote Location->>Local Buffer: Buffer Degraded Info
Local Buffer->>User: Playback Degraded Content
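Two of the degradation techniques named above, temporal aliasing by frame skipping and pixelation, can be sketched as follows, treating frames as 2D lists of luma values. The function name, frame representation, and parameter defaults are illustrative assumptions.

```python
def degrade_stream(frames, keep_every=2, block=4):
    """Sketch of a degradation module: drop frames (temporal aliasing)
    and pixelate the survivors by replacing each block x block tile
    with its average value."""
    out = []
    for idx, frame in enumerate(frames):
        if idx % keep_every:          # frame skipping: keep every Nth frame
            continue
        h, w = len(frame), len(frame[0])
        degraded = [row[:] for row in frame]
        for by in range(0, h, block):
            for bx in range(0, w, block):
                tile = [frame[y][x]
                        for y in range(by, min(by + block, h))
                        for x in range(bx, min(bx + block, w))]
                avg = sum(tile) // len(tile)
                for y in range(by, min(by + block, h)):
                    for x in range(bx, min(bx + block, w)):
                        degraded[y][x] = avg   # pixelation
        out.append(degraded)
    return out
```

In a real system this stage would run before re-encoding, so only the degraded stream is ever compressed and transmitted.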
Derivatives of Independent Claim 3: Receiving System
Claim 3: A receiving system responsive to a user input identifying a choice of an item stored in a source material library to be played back to said subscriber at a location remote from said source material library, said item containing information to be sent from a transmitter to said receiving system, and wherein said receiving system comprises:
transceiver means for automatically receiving the requested information from said transmitter as compressed formatted data blocks;
receiver format conversion means, coupled to said transceiver means, for converting said compressed formatted data blocks into a format suitable for storage and processing resulting in playback in real time;
storage means, coupled to said receiver format conversion means, for holding said compressed formatted data;
decompressing means, coupled to said receiver format conversion means, for decompressing said compressed formatted information; and
output data conversion means, coupled to said decompressing means, for playing back said decompressed information in real time at a time specified by said user.
Derivative 3.1: Material & Component Substitution - Neuromorphic Decompression System
Enabling Description:
This derivative receiving system utilizes neuromorphic computing architectures for its "decompressing means." The "transceiver means" receives compressed, formatted data blocks. The "receiver format conversion means" prepares these blocks for ingestion by a spiking neural network (SNN) based decompression engine. The "decompressing means" is implemented as a dedicated neuromorphic chip (e.g., Intel Loihi, IBM NorthPole) designed to mimic biological neural processes. This chip directly processes the incoming compressed data blocks, performing parallel pattern recognition and reconstruction through its synaptic weights and neuronal spiking activity, effectively decompressing the information with high energy efficiency and ultra-low latency, replacing traditional CPU/GPU-based decompression algorithms. The "storage means" (e.g., resistive RAM or phase-change memory) is integrated directly within or adjacent to the neuromorphic chip to feed data efficiently. The "output data conversion means" interfaces with the SNN's output layer, translating neuronal firing patterns back into a real-time analog or digital signal for playback.
graph TD
A[Transceiver Means] --> B[Receiver Format Conversion Means];
B --> C["Neuromorphic Chip (Decompressing Means)"];
C -- High-Speed Interface --> D[Integrated Storage Means];
C --> E[Output Data Conversion Means];
E --> F[Playback Display/Speaker];
Derivative 3.2: Material & Component Substitution - Molecular Memory and Biochemical Output
Enabling Description:
This receiving system operates at a molecular level. The "transceiver means" consists of a chemically sensitive membrane or synthetic receptor array capable of binding to and internalizing specific "compressed formatted data blocks" (e.g., DNA origami structures, coded peptides) transmitted biochemically. The "receiver format conversion means" involves intracellular enzymatic cascades that unfold or modify these molecular data blocks into a biologically readable form suitable for storage. The "storage means" is implemented as synthetic polymers or stable protein complexes within the cell, holding the "compressed formatted data" as molecular conformations. The "decompressing means" comprises a series of sequence-specific enzymatic reactions or molecular motors that unravel the stored molecular data, releasing constituent "information" molecules (e.g., fluorophores, activators). The "output data conversion means" consists of biochemical sensors or genetically encoded fluorescent reporters that detect these released molecules, converting the molecular events into a macroscopic, real-time observable signal (e.g., light emission, cellular differentiation, neurotransmitter release) at a user-specified biological 'time' (e.g., when a specific gene is expressed or a pathway activated).
graph TD
A["Chemical Transceiver Means (Receptor Array)"] --> B[Intracellular Enzymatic Conversion Means];
B --> C[Synthetic Polymer / Protein Storage Means];
C --> D[Sequence-Specific Enzymatic Decompressing Means];
D --> E["Biochemical Sensor / Fluorescent Reporter (Output Data Conversion Means)"];
E --> F[Macroscopic Biological Playback Signal];
Derivative 3.3: Operational Parameter Expansion - Cryo-Cooled Astronomical Data Receiver
Enabling Description:
This system is a specialized receiving system for astronomical data from deep-space probes or radio telescopes. The "transceiver means" is a large-aperture radio antenna coupled to a highly sensitive, cryo-cooled quantum receiver operating at millikelvin temperatures to detect extremely faint "compressed formatted data blocks" transmitted across interstellar distances. The "receiver format conversion means" downconverts the received radio-frequency signals and performs initial error correction, converting them into a digital stream compatible with terrestrial processing. The "storage means" is a massive, fault-tolerant, high-speed archival system (e.g., exabyte-scale solid-state drives) to cope with sporadic high-volume data bursts, holding the raw and partially processed compressed data. The "decompressing means" comprises a massively parallel, molecular-scale decompressor, potentially based on DNA computation or specialized ASICs, optimized for the unique compression algorithms used in deep-space communication (e.g., highly robust, low-overhead codes). The "output data conversion means" is a scientific visualization workstation capable of rendering complex, multi-dimensional astronomical data sets (e.g., gravitational wave patterns, exoplanetary atmospheric spectra) in real-time, allowing astronomers to "play back" and interact with the decompressed information at a time specified for scientific analysis.
graph TD
A["Cryo-Cooled Quantum Transceiver (Antenna/Receiver)"] --> B["RF Downconversion & Error Correction (Receiver Format Conversion)"];
B --> C[Exabyte-Scale Fault-Tolerant Storage Means];
C --> D[Massively Parallel Molecular Decompressing Means];
D --> E["Scientific Visualization Workstation (Output Data Conversion Means)"];
E --> F[Real-time Astronomical Data Display];
Derivative 3.4: Operational Parameter Expansion - Cellular Epigenetic Receiver
Enabling Description:
This receiving system is integrated within a eukaryotic cell. The "transceiver means" consists of specific transmembrane receptor proteins on the cell surface that selectively bind to external molecular signals ("compressed formatted data blocks"), such as growth factors or hormones. This binding event triggers an intracellular "receiver format conversion means" through a signal transduction cascade (e.g., phosphorylation events), converting the external signal into a chemically readable format within the cytoplasm (e.g., activating transcription factors). The "storage means" is the cell's chromatin, where the "compressed formatted data" (now as activated transcription factors) binds to specific DNA regulatory regions, leading to transient or stable epigenetic modifications (e.g., histone acetylation, DNA methylation). The "decompressing means" is the cell's gene expression machinery (RNA polymerase, ribosomes), which "reads" and "decompresses" the stored epigenetic information by transcribing specific genes into mRNA. The "output data conversion means" is the subsequent translation of mRNA into functional proteins, which then manifest a phenotypic change or physiological response in "real time" at a "time specified by said user" (e.g., a specific stage of cell differentiation or cell division initiated by a research scientist's experimental protocol).
graph TD
A["Transmembrane Receptor Proteins (Transceiver Means)"] --> B["Signal Transduction Cascade (Receiver Format Conversion)"];
B --> C["Chromatin/Epigenetic Modification (Storage Means)"];
C --> D["Gene Expression Machinery (Decompressing Means)"];
D --> E["Protein Synthesis / Phenotypic Change (Output Data Conversion Means)"];
E --> F[Biological Playback Signal];
Derivative 3.5: Cross-Domain Application - Autonomous Vehicle Mapping/Traffic Update Receiver
Enabling Description:
This receiving system is integrated into an autonomous vehicle. The "transceiver means" is a redundant V2X (Vehicle-to-Everything) communication module (e.g., DSRC, C-V2X, 5G) capable of receiving "compressed formatted data blocks" comprising high-definition map updates, real-time traffic incident video, and infrastructure sensor data from a central traffic management server or roadside units (transmitter). The "receiver format conversion means" in the vehicle's central computing unit decodes and authenticates these incoming data blocks, converting them into a format compatible with the vehicle's onboard perception and planning stack. The "storage means" utilizes solid-state drives with automotive-grade durability and high-speed NVMe interfaces to quickly cache and hold the incoming map tiles and video segments. The "decompressing means" is a dedicated hardware accelerator (e.g., an ASIC for H.264/H.265 video, specialized map data decompression) within the vehicle's central computer, efficiently decompressing the received information. The "output data conversion means" feeds the decompressed map data to the vehicle's navigation and localization modules for continuous, seamless path planning, and streams decompressed video to driver-assistance displays or object recognition algorithms, all "in real time" to ensure safe and informed autonomous operation at any "time specified by said user" (e.g., as needed for the current driving task).
graph TD
A[V2X Transceiver Module] --> B["Onboard Data Decoder/Authenticator (Receiver Format Conversion)"];
B --> C["Automotive-Grade SSD (Storage Means)"];
C --> D[Dedicated Hardware Decompressor];
D --> E["Navigation/Perception/Display Systems (Output Data Conversion Means)"];
E --> F[Real-time Autonomous Operation/Driver Display];
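The in-vehicle caching step above can be sketched as a bounded least-recently-used cache: map tiles received over V2X stay resident so the planner can re-read nearby tiles without re-requesting them. The class name, capacity, and tile identifiers are illustrative assumptions.

```python
from collections import OrderedDict

class TileCache:
    """Sketch of a bounded LRU cache for decompressed map tiles."""

    def __init__(self, capacity=4):
        self.capacity = capacity
        self.tiles = OrderedDict()   # tile_id -> decompressed tile data

    def put(self, tile_id, data):
        self.tiles[tile_id] = data
        self.tiles.move_to_end(tile_id)          # mark most recently used
        if len(self.tiles) > self.capacity:
            self.tiles.popitem(last=False)       # evict least recently used

    def get(self, tile_id):
        if tile_id not in self.tiles:
            return None              # miss: fetch from the traffic server
        self.tiles.move_to_end(tile_id)
        return self.tiles[tile_id]
```

A real automotive cache would also pin tiles along the planned route so they are never evicted mid-maneuver.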
Derivative 3.6: Cross-Domain Application - Remote Surveillance Drone Data Receiver
Enabling Description:
This receiving system is deployed within a remote surveillance drone for encrypted, real-time mission updates and target analysis. The "transceiver means" is a military-grade, encrypted directional radio modem capable of automatically receiving "compressed formatted data blocks" containing updated flight paths, target coordinates, and encrypted sensor fusion algorithms from a ground-based command center (transmitter). The "receiver format conversion means" comprises a secure cryptographic module that decrypts and validates the incoming data, converting it into a drone-specific command and control language. The "storage means" is a hardened, self-erasing non-volatile memory module on the drone, holding mission-critical data temporarily. The "decompressing means" is an onboard FPGA or dedicated processor that decompresses the encrypted mission parameters and algorithmic updates. The "output data conversion means" feeds the decompressed information directly to the drone's flight controller for immediate execution of new commands, or to onboard AI-powered target recognition systems for real-time analysis of sensor data, enabling the drone to "play back" updated mission profiles or engage new targets "in real time" at a "time specified by said user" (e.g., mission commander).
graph TD
A["Military-Grade Directional Radio Modem (Transceiver Means)"] --> B["Secure Cryptographic Module (Receiver Format Conversion)"];
B --> C["Hardened Self-Erasing NV Memory (Storage Means)"];
C --> D["Onboard FPGA/Processor (Decompressing Means)"];
D --> E["Flight Controller / AI Target Recognition (Output Data Conversion Means)"];
E --> F[Real-time Mission Execution/Analysis];
Derivative 3.7: Cross-Domain Application - Smart Grid Substation Receiver
Enabling Description:
This receiving system is located within an intelligent substation of a smart electrical grid. The "transceiver means" is an industrial-grade, redundant fiber optic communication module (e.g., leveraging IEC 61850 protocols) that automatically receives "compressed formatted data blocks" of real-time sensor readings (e.g., voltage, current, frequency, temperature) and predictive fault analyses from a central grid operations center (transmitter). The "receiver format conversion means" is a high-reliability industrial embedded controller that decodes the IEC 61850 GOOSE/SV messages and converts them into an internal data format for local decision-making algorithms. The "storage means" is a hardened, uninterruptible industrial flash storage unit, holding recent operational data and predictive models for offline analysis or rapid reinitialization. The "decompressing means" is a real-time processor optimized for decompressing grid-specific data streams, enabling rapid access to critical sensor data. The "output data conversion means" feeds the decompressed information to local protection relays for immediate fault isolation, to SCADA (Supervisory Control and Data Acquisition) systems for operational display, or to local AI agents that perform autonomous grid balancing, enabling the substation to "play back" operational adjustments "in real time" at a "time specified by said user" (e.g., the grid's control logic).
graph TD
A["Industrial Fiber Optic Transceiver (IEC 61850)"] --> B["Embedded Controller (Receiver Format Conversion)"];
B --> C[Hardened Industrial Flash Storage Means];
C --> D[Real-time Grid Data Decompressor];
D --> E["Protection Relays / SCADA / AI Agents (Output Data Conversion Means)"];
E --> F[Real-time Grid Operation/Autonomous Balancing];
Derivative 3.8: Integration with Emerging Tech - AI, IoT, and Blockchain-Enhanced Receiver
Enabling Description:
This receiving system integrates an embedded AI assistant, IoT sensors for environmental feedback, and blockchain verification. The "transceiver means" receives "compressed formatted data blocks." The "receiver format conversion means" is augmented with an "AI assistant module" that not only converts data but also assesses content metadata for authenticity, referencing "blockchain hashes" embedded in the data block. "IoT sensors" (e.g., ambient light, room occupancy, user biometric data) provide contextual information to the AI assistant. The "storage means" (e.g., a smart caching unit) is dynamically managed by the AI assistant, which predicts user viewing patterns and network fluctuations (informed by IoT data) to optimize buffer size and pre-fetch content. The "decompressing means" adjusts its processing power based on the AI's real-time quality-of-experience (QoE) calculations derived from IoT sensor feedback. The "output data conversion means" is also controlled by the AI assistant, dynamically adjusting display settings (e.g., brightness, contrast, audio levels) and even content presentation (e.g., re-framing, captioning) to optimize the user's "playback" experience "in real time" at a "time specified by said user," all while the blockchain provides an immutable record of content consumption and rights verification.
graph TD
subgraph Receiver System
A[Transceiver Means] --> B{"AI-Enhanced Receiver Format Conversion (with Blockchain Verification)"};
B --> C{"Smart Caching Storage Means (AI-Managed)"};
C --> D{"Adaptive Decompressing Means (AI-QoE Driven)"};
D --> E{AI-Controlled Output Data Conversion Means};
E --> F[Playback Display/Speaker];
end
IoT["IoT Sensors (Ambient, Biometric)"] --> B;
IoT --> C;
IoT --> D;
IoT --> E;
BC["Blockchain Network (Content Hashes)"] --> B;
Derivative 3.9: Integration with Emerging Tech - AI, IoT, and Blockchain for Industrial Drone Footage
Enabling Description:
This receiving system is deployed at a central monitoring station for real-time industrial drone footage, integrating AI-powered anomaly detection and blockchain for data provenance. The "transceiver means" is a high-bandwidth, secure wireless link (e.g., private 5G) automatically receiving "compressed formatted data blocks" of live video feeds and sensor telemetry from industrial "IoT drones" performing inspections (transmitter). The "receiver format conversion means" decrypts and authenticates the incoming streams, converting them into a standardized format for a dedicated onboard "AI anomaly detection engine." This AI engine acts as the "decompressing means," performing real-time video and sensor data decompression and analysis, identifying potential defects (e.g., cracks in infrastructure, thermal hot spots) directly from the compressed stream or immediately after decompression. The "storage means" consists of a temporary, high-speed buffer for the AI's processing and a secure, forensically sound archive, where each processed video frame or identified anomaly's metadata, timestamp, and location are hashed and recorded on a "permissioned blockchain ledger" for immutable proof of inspection. The "output data conversion means" displays the decompressed drone footage to a human operator, overlaid with AI-generated annotations and alerts, allowing for "real-time" decision-making or retrospective analysis at a "time specified by said user" (the operator), with the blockchain ensuring the integrity of the inspection data.
graph TD
A["Secure Wireless Transceiver (Drone Link)"] --> B["Decryption/Authentication (Receiver Format Conversion)"];
B --> C{"AI Anomaly Detection Engine (Decompressing Means)"};
C -- "Anomaly Metadata & Hash" --> BL(Permissioned Blockchain Ledger);
C --> D["High-Speed Buffer & Secure Archive (Storage Means)"];
C --> E["Operator Display (Output Data Conversion Means)"];
E --> F[Real-time Footage + AI Overlays];
IoT[Industrial IoT Drones] -- Live Feed --> A;
Derivative 3.10: The "Inverse" or Failure Mode - Privacy-Preserving Output Conversion System
Enabling Description:
This receiving system incorporates a "privacy-preserving" mode by actively transforming output data. The "transceiver means" receives "compressed formatted data blocks." The "receiver format conversion means" prepares this data for processing. The "storage means" holds the compressed data. The "decompressing means" decompresses the information. The "output data conversion means" is augmented with an "on-device redaction and obfuscation module." Upon user input indicating a "privacy-preserving playback" request, this module, acting on user-defined privacy policies (e.g., anonymize faces, blur specific objects, redact sensitive documents within video), applies real-time processing to the decompressed audio/video stream. This involves active pixelation, gaussian blurring, audio distortion, or synthetic voice generation for specified regions or channels, even if the original content is unredacted. The system "plays back" the visibly or audibly modified information "in real time" at a "time specified by said user," with temporary, ephemeral buffering of only the redacted stream. This ensures that sensitive information is not displayed or recorded in its original form, fulfilling privacy requirements while still delivering the core content.
graph TD
A[Transceiver Means] --> B[Receiver Format Conversion Means];
B --> C[Storage Means];
C --> D[Decompressing Means];
D --> E{"On-Device Redaction/Obfuscation Module (Output Data Conversion Means)"};
E --> F[Privacy-Preserving Playback];
User["User Input (Privacy Mode)"] --> E;
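The region-level redaction step above can be sketched as an overwrite of user-policy rectangles in each decompressed frame before display, so the original pixels never reach the output or the buffer. Frames are 2D lists of pixel values and regions are (x, y, w, h) tuples; the function name and fill policy are illustrative assumptions (a real module might blur or pixelate instead of filling).

```python
def redact_frame(frame, regions, fill=0):
    """Sketch of on-device redaction: copy the decompressed frame and
    overwrite each policy-defined rectangle with a fill value.

    frame   -- 2D list of pixel values
    regions -- iterable of (x, y, w, h) rectangles to redact
    """
    out = [row[:] for row in frame]          # never mutate the source frame
    height, width = len(out), len(out[0])
    for x, y, w, h in regions:
        for yy in range(y, min(y + h, height)):
            for xx in range(x, min(x + w, width)):
                out[yy][xx] = fill
    return out
```

Only the returned (redacted) frame would be buffered and displayed, matching the ephemeral-buffering behavior described above.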
Combination Prior Art Scenarios with Open-Source Standards
These scenarios combine the teachings of US Patent 5132992 with existing open-source standards, demonstrating how the patent's concepts could be rendered obvious by a PHOSITA leveraging widely available knowledge at or around the priority date (January 7, 1991).
US5132992 + Early MPEG (e.g., MPEG-1 Standard):
- Description: A PHOSITA, aware of the need for efficient audio/video transmission (as articulated by US5132992's focus on data compression [cite: https://patents.google.com/patent/US5132992/en]) and the emerging work on digital video compression standards, would find it obvious to implement the "compression means" and "decompressing means" of US5132992 (Claims 1 and 3) using the principles laid out in the early MPEG-1 standard. MPEG-1, formally ISO/IEC 11172, was being developed in the late 1980s and early 1990s, with drafts publicly available. Its focus on inter-frame and intra-frame compression for video and psychoacoustic modeling for audio would directly inform how to achieve the "high rates of data compression" mentioned in the patent's abstract [cite: https://patents.google.com/patent/US5132992/en]. The "formatted data blocks" and "receiver format conversion means" would directly correspond to MPEG's elementary streams and system streams, and the parsing and decoding required for playback.
- Open-Source Standard: MPEG-1 (ISO/IEC 11172)
- Prior Art Overlap:
- Claim 1: "compression means... for compressing said formatted and sequenced data."
- Claim 1: "compressed data storing means... for storing as a file said compressed sequenced data."
- Claim 3: "transceiver means for automatically receiving the requested information from said transmitter as compressed formatted data blocks."
- Claim 3: "receiver format conversion means... for converting said compressed formatted data blocks into a format suitable for storage and processing."
- Claim 3: "decompressing means... for decompressing said compressed formatted information."
US5132992 + TCP/IP (Transmission Control Protocol/Internet Protocol):
- Description: Given US5132992's objective to use "multiple existing communications channels" including "standard telephone, ISDN... cable television systems" [cite: https://patents.google.com/patent/US5132992/en], a PHOSITA would recognize the existing and evolving TCP/IP suite as a robust, packet-switched networking standard for transmitting data over such diverse channels. It would be obvious to adapt the "transmitter means" (Claim 1) and "transceiver means" (Claim 3) to encapsulate the "compressed formatted data blocks" within TCP/IP packets for reliable and ordered delivery across heterogeneous networks. The "ordering means" (Claim 1) for addressable data blocks would align with packet sequencing, and the "receiving" and "buffering" steps (Claim 2) would naturally leverage TCP's flow control and congestion avoidance mechanisms, or UDP for real-time streaming with application-level buffering.
- Open-Source Standard: TCP/IP Protocol Suite (RFCs, particularly RFC 791 and RFC 793, widely published by the IETF since 1981)
- Prior Art Overlap:
- Claim 1: "transmitter means... for sending at least a portion of a specific file to a specific one of said remote locations."
- Claim 2: "sending at least a portion of said stored compressed information to said remote location; receiving the sent information at said remote location."
- Claim 3: "transceiver means for automatically receiving the requested information from said transmitter as compressed formatted data blocks."
US5132992 + Early Hypertext (e.g., HTTP/HTML concepts):
- Description: US5132992 describes users "calling a phone number or by typing commands into a computer" to "choose audio and/or video material from a list" [cite: https://patents.google.com/patent/US5132992/en]. At the time of invention, the foundational concepts of the World Wide Web, including HTTP and HTML, were being developed and demonstrated at CERN. A PHOSITA would foresee the application of these burgeoning hypertext technologies to create a user-friendly, graphical interface for the "remote order processing and item database 300" (described in the patent [cite: https://patents.google.com/patent/US5132992/en]) and the "user terminal interface" (FIG. 4 flowchart [cite: https://patents.google.com/patent/US5132992/en]). The "requesting transmission, by a user, of at least a part of said stored compressed information" (Claim 2) would be a straightforward interaction via an HTML-based form submitted over HTTP, enabling users to "identify a choice of an item" (Claim 3) and specify "time and place" of delivery.
- Open-Source Standard: HTTP (e.g., HTTP/0.9, proposed 1991), HTML (e.g., HTML Tags, early 1990s)
- Prior Art Overlap:
- Claim 1: "identification encoding means for... assigning a unique identification code" (used for linking/addressing content).
- Claim 2: "requesting transmission, by a user, of at least a part of said stored compressed information to said remote location."
- Claim 3: "receiving system responsive to a user input identifying a choice of an item stored in a source material library."
Generated 5/10/2026, 10:28:23 PM