Patent 6161149
Derivative works
Defensive disclosure: derivative variations of each claim designed to render future incremental improvements obvious or non-novel.
Defensive Disclosure and Prior Art Derivations of U.S. Patent 6,161,149
Publication Date: 2026-05-11
Subject: Technical disclosure of methods and systems for communication and collaboration, intended to enter the public domain and serve as prior art against future patent applications claiming similar or incrementally different inventions. This document expands upon the core concepts disclosed in U.S. Patent 6,161,149 ("the '149 patent").
Axis 1: Material & Component Substitution
1.1. System with Distributed Hash Table (DHT) Storage and Content-Addressable Channels
Enabling Description: A communication system is disclosed wherein the "central storage medium" is replaced by a peer-to-peer (P2P) Distributed Hash Table (DHT), such as the Kademlia-based DHT used by the InterPlanetary File System (IPFS). When an inputting member sends a message, their peripheral device (node) chunks the message content, computes its cryptographic hash, and adds it to the DHT. This hash serves as a content identifier (CID). The "central agent" is a distributed application or smart contract that associates the CID with the intended recipients. A "notice" is generated and pushed to recipients via a P2P messaging layer. The "channel" within the notice is not a location-based URL but a content-addressable URI (e.g., ipfs://<CID>). A receiving member's device uses this URI to retrieve the message content directly from the P2P network, fetching chunks from whichever nodes hold them. Message threading is achieved by embedding the CID of the parent message within the metadata of the reply message before it is added to the DHT.
Diagram:
```mermaid
sequenceDiagram
    participant P1 as Node P1 (Sender)
    participant DHT as P2P DHT Network
    participant P2 as Node P2 (Recipient)
    P1->>P1: Create Message M1
    P1->>DHT: Store M1, Get CID1
    P1->>P2: Push Notice (Contains CID1)
    P2->>DHT: Request Content for CID1
    DHT-->>P2: Provide M1 Chunks
    P2->>P2: Create Reply M2 (links to CID1)
    P2->>DHT: Store M2, Get CID2
    P2->>P1: Push Notice (Contains CID2)
```
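The storage-and-threading flow above can be sketched with an in-memory stand-in for the DHT. The `ToyDHT` class and `publish` helper are illustrative assumptions; a real node would use an IPFS/Kademlia client and would chunk large content:

```python
import hashlib
import json

class ToyDHT:
    """In-memory stand-in for a P2P DHT: maps content hashes to content."""
    def __init__(self):
        self.store = {}

    def put(self, payload: bytes) -> str:
        cid = hashlib.sha256(payload).hexdigest()  # content identifier
        self.store[cid] = payload
        return cid

    def get(self, cid: str) -> bytes:
        return self.store[cid]

def publish(dht, body, parent_cid=None):
    """Serialize a message, embedding the parent CID for threading."""
    msg = {"body": body, "parent": parent_cid}
    return dht.put(json.dumps(msg, sort_keys=True).encode())

dht = ToyDHT()
cid1 = publish(dht, "Design review at 3pm")           # sender stores M1
cid2 = publish(dht, "Works for me", parent_cid=cid1)  # reply links to M1
reply = json.loads(dht.get(cid2))
assert reply["parent"] == cid1  # thread reconstructed from content addresses
```

Because the identifier is derived from the content, the same reply stored by any node yields the same CID, which is what makes the channel location-independent.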
1.2. System Utilizing Modern Push Services and WebSocket Channels
Enabling Description: This embodiment replaces SMTP/email as the notice sender. The central agent integrates directly with standardized push services like Firebase Cloud Messaging (FCM) for Android/Web or Apple Push Notification Service (APNS) for iOS. When a message is stored, the notice generator constructs a service-specific JSON payload containing a summary and a unique message identifier. This payload is pushed to the FCM/APNS gateway, which delivers it to the recipient's registered device. For real-time applications, the "channel" is a persistent WebSocket connection maintained between the peripheral device and the central agent. The pushed notice triggers the client application to send a message retrieval request over the existing WebSocket, using the message identifier. The server then pushes the full message content back over the same WebSocket, minimizing latency associated with new HTTP connections.
Diagram:
```mermaid
flowchart TD
    subgraph Sender
        A[Peripheral Device] -->|1. POST Message| B(Central Agent)
    end
    subgraph Backend
        B -->|2. Store Message| C[Database]
        B -->|3. Generate Notice| D[Notice Generator]
        D -->|4. Push Payload| E{"Push Gateway<br>(FCM/APNS)"}
    end
    subgraph Recipient
        F[Peripheral Device]
        G(("WebSocket<br>Connection"))
        F <-.-> G
    end
    E -->|5. Deliver Notice| F
    F -->|6. Request Msg via WebSocket| G
    B -.-> G
    G -->|7. Push Msg Data| F
```
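A minimal sketch of the notice payload, assuming an FCM-v1-style JSON shape (the top-level field names follow FCM's documented message format; carrying only a summary plus a message identifier is this disclosure's convention):

```python
import json

def build_push_payload(device_token: str, summary: str, message_id: str) -> str:
    """Construct an FCM-v1-style notice carrying only a summary and the
    identifier the client will use to fetch the full message over its
    existing WebSocket."""
    payload = {
        "message": {
            "token": device_token,
            "notification": {"title": "New message", "body": summary},
            "data": {"message_id": message_id},  # retrieval key, not content
        }
    }
    return json.dumps(payload)

notice = json.loads(build_push_payload("device-abc", "Q3 plan posted", "msg-42"))
assert notice["message"]["data"]["message_id"] == "msg-42"
```

The full content is deliberately absent from the push: the gateway sees only the summary, and the WebSocket round trip fetches the body.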
1.3. System with Graph Database for Thread Storage and GraphQL Channels
Enabling Description: The central storage medium is implemented as a graph database (e.g., Neo4j, ArangoDB). Each user and message is a node. When a user (U1) sends a message (M1), a directed edge (U1)-[:SENT]->(M1) is created. When another user (U2) replies with message (M2), a new node (M2) is created along with edges (U2)-[:SENT]->(M2) and (M2)-[:REPLY_TO]->(M1). This creates a native, traversable conversation graph. The "notice" pushed to a recipient contains a "channel" that is a parameterized GraphQL query URI. When the recipient's client accesses the URI, it executes a query against the graph database, retrieving not only the specific message but also its context, such as the parent message, sibling replies, and author information, all in a single, efficient request.
Diagram:
```mermaid
erDiagram
    USER ||--o{ MESSAGE : SENT
    MESSAGE ||--o{ MESSAGE : REPLY_TO
    USER {
        string userId PK
        string name
    }
    MESSAGE {
        string messageId PK
        string content
        datetime timestamp
    }
```
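The single-request context retrieval can be sketched without a real graph database. The `ConversationGraph` class is a hypothetical in-memory model; a production system would issue a Cypher or GraphQL query instead:

```python
class ConversationGraph:
    """Minimal in-memory model of the USER/MESSAGE graph described above."""
    def __init__(self):
        self.messages = {}  # messageId -> {"author", "content", "parent"}

    def send(self, msg_id, author, content, reply_to=None):
        # Equivalent to creating (author)-[:SENT]->(msg) and, if replying,
        # (msg)-[:REPLY_TO]->(parent).
        self.messages[msg_id] = {"author": author, "content": content,
                                 "parent": reply_to}

    def context(self, msg_id):
        """One request returns the message, its parent, and sibling replies."""
        msg = self.messages[msg_id]
        parent = self.messages.get(msg["parent"])
        siblings = [mid for mid, m in self.messages.items()
                    if m["parent"] == msg["parent"] and mid != msg_id]
        return {"message": msg, "parent": parent, "siblings": siblings}

g = ConversationGraph()
g.send("M1", "U1", "Proposal draft")
g.send("M2", "U2", "Looks good", reply_to="M1")
g.send("M3", "U3", "One concern", reply_to="M1")
ctx = g.context("M2")
assert ctx["parent"]["content"] == "Proposal draft"
```

The point of the graph model is that `context` is one traversal rather than several joins or round trips.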
Axis 2: Operational Parameter Expansion
2.1. System for Asynchronous Collaboration on Terabyte-Scale Digital Twins
Enabling Description: This system manages version control and collaboration for massive-scale industrial digital twins (e.g., a factory floor or jet engine). An engineer's "information input" is a proposed modification to a component within the multi-terabyte model, committed to a central model repository. The "central agent," a model management server, stores this delta and identifies via a dependency graph which other engineering teams (e.g., thermal, stress, materials) are affected. It generates a "notice" for each team, summarizing the change (e.g., "Turbine blade pitch modified by +2 degrees"). The "channel" is a specialized URI that, when opened in their CAD software, does not download the entire model. Instead, it streams only the relevant model geometry, loads the specific delta, and highlights the changes in a diff-viewer, enabling rapid review without massive data transfer.
Diagram:
```mermaid
stateDiagram-v2
    state "Digital Twin v1.0" as v1
    state "Engineer A Proposes Change" as change
    state "Central Repository" as repo
    state "Notification Sent" as notify
    state "Engineer B Reviews Delta" as review
    state "Digital Twin v1.1" as v2
    [*] --> v1
    v1 --> change: Checkout model part
    change --> repo: Commit Delta
    repo --> notify: Generate notice with delta-URI
    notify --> review: Engineer B clicks URI
    review --> repo: Approve/Reject Change
    repo --> v2: Merge Delta
```
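The dependency fan-out step can be sketched as follows. The `DEPENDENCIES` mapping, the `twin://` URI scheme, and the `make_notices` helper are illustrative assumptions, not part of the patent:

```python
# Hypothetical dependency graph: component -> engineering teams affected
# by a change to that component.
DEPENDENCIES = {
    "turbine_blade": ["thermal", "stress", "materials"],
    "fuel_line": ["thermal", "hydraulics"],
}

def make_notices(component: str, delta_summary: str, base_uri: str):
    """Fan a committed delta out to every affected team; each notice
    carries a delta-URI that streams only the changed geometry."""
    return [{"team": team,
             "summary": delta_summary,
             "channel": f"{base_uri}/{component}?delta=latest&team={team}"}
            for team in DEPENDENCIES.get(component, [])]

notices = make_notices("turbine_blade",
                       "Turbine blade pitch modified by +2 degrees",
                       "twin://repo")
assert [n["team"] for n in notices] == ["thermal", "stress", "materials"]
```

Each channel URI is scoped to one component and one team, which is what keeps the review from pulling the multi-terabyte model.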
2.2. System for Low-Latency Coordination of Molecular Assemblers
Enabling Description: A real-time control system for a nanofactory coordinates swarms of molecular assemblers. Each swarm controller is a "member." The "central agent" is a fault-tolerant fabric controller with picosecond-level synchronization. When one swarm completes a sub-component, its controller transmits an "information input" packet containing a completion status and the component's spatial coordinates in volatile memory. The fabric controller stores this packet and triggers a "notice" to the controller of the next swarm in the assembly sequence. The notice is a hardware interrupt signal. The "channel" is a direct memory access (DMA) pointer included in the interrupt service routine, allowing the receiving controller to instantly read the data packet from the fabric controller's memory without OS intervention, thereby minimizing latency to near-zero.
Diagram:
```mermaid
sequenceDiagram
    participant C1 as Assembler Controller 1
    participant FC as Fabric Controller
    participant C2 as Assembler Controller 2
    C1->>FC: Write Data Packet (Input) to Shared Memory
    FC->>C2: Trigger Hardware Interrupt (Notice)
    Note right of C2: ISR contains DMA pointer (Channel)
    C2->>FC: Read Data Packet via DMA
    C2->>C2: Begin next assembly task
```
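The interrupt/DMA handshake can be modeled in software as a toy sketch: the `FabricController` buffer stands in for shared memory, and the (offset, length) pair plays the role of the DMA pointer carried in the interrupt service routine:

```python
class FabricController:
    """Toy model of the fabric controller's shared memory."""
    def __init__(self, size=1024):
        self.memory = bytearray(size)
        self.cursor = 0

    def write_packet(self, payload: bytes):
        """Controller 1 writes its data packet; the returned (offset, length)
        is the content of the interrupt 'notice'."""
        offset = self.cursor
        self.memory[offset:offset + len(payload)] = payload
        self.cursor += len(payload)
        return offset, len(payload)

    def dma_read(self, offset: int, length: int) -> bytes:
        """Controller 2 dereferences the pointer directly, no OS mediation."""
        return bytes(self.memory[offset:offset + length])

fc = FabricController()
notice = fc.write_packet(b"subcomponent-7 done @ (12,4,9)")   # controller 1
packet = fc.dma_read(*notice)                                  # controller 2
assert packet == b"subcomponent-7 done @ (12,4,9)"
```

The essential property being modeled is that the notice carries a pointer, not a copy, so retrieval is a single memory read.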
Axis 3: Cross-Domain Application
3.1. System for Predictive Maintenance in Aviation (Aerospace)
Enabling Description: An onboard avionics system ("central agent") continuously monitors engine sensor data. Upon detecting a pattern that a predictive model flags as a precursor to a fault (the "information input"), the system stores the full high-resolution sensor snapshot. It identifies the required maintenance specialty (e.g., "hydraulics technician") and generates a "notice". This notice is transmitted via satellite to the airline's ground-based operations hub. The notice contains a hyperlink ("channel") that directs the maintenance planner to a secure web portal. The portal displays the specific alert, the sensor data that triggered it, the relevant section of the maintenance manual, and a pre-populated work order to schedule the component replacement at the aircraft's next destination.
Diagram:
```mermaid
flowchart LR
    subgraph Aircraft
        A[Sensors] --> B{Predictive Model}
        B -- Anomaly --> C[Avionics Computer]
    end
    subgraph Ground
        E[Maintenance Hub]
        F[Technician Portal]
    end
    C -- Store Snapshot --> C
    C -- Generate & Push Notice --> D(Satellite Uplink)
    D -- Route to Hub --> E
    E -- Display Alert & Provide Link --> F
```
3.2. System for Automated Irrigation and Pest Control (AgTech)
Enabling Description: A central farm management platform serves as the "central agent". An in-field drone captures multispectral imagery and its analysis algorithm detects an early-stage pest infestation ("information input"). The platform stores the geotagged imagery and the analysis report. The "notice generator" identifies the farm manager and pushes a notice to their mobile device. The notice includes a hyperlink ("channel") that opens a farm map centered on the affected area. The map displays the infestation boundary and provides interactive options: (1) retrieve detailed imagery, (2) dispatch a spot-spraying drone, or (3) automatically generate an exclusion zone in the day's harvesting plan.
Diagram:
```mermaid
graph TD
    A[Drone captures imagery] --> B{Analysis detects pest}
    B --> C[Farm Platform: Stores report]
    C --> D[Push Notice to Farmer's Phone]
    D --> E{Farmer clicks Channel/Link}
    E --> F[Open Map on Infestation]
    F --> G[Option: View Imagery]
    F --> H[Option: Dispatch Sprayer]
    F --> I[Option: Update Harvest Plan]
```
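The notice construction for this embodiment might look like the following sketch; the `farm.example` host and the action identifiers are hypothetical:

```python
from urllib.parse import urlencode

def pest_notice(lat: float, lon: float, report_id: str) -> dict:
    """Build the farmer-facing notice: a map deep link centered on the
    infestation, plus the three interactive options described above."""
    channel = "https://farm.example/map?" + urlencode(
        {"lat": lat, "lon": lon, "report": report_id})
    return {
        "summary": "Early-stage pest infestation detected",
        "channel": channel,
        "actions": ["view_imagery", "dispatch_sprayer", "update_harvest_plan"],
    }

n = pest_notice(46.95, 7.45, "rpt-118")
assert "report=rpt-118" in n["channel"]
```

Bundling the actions into the notice lets the recipient act without first opening the full report, which is the centrifugal pattern applied to field operations.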
Axis 4: Integration with Emerging Tech
4.1. System with AI-Driven Notice Generation and Summarization
Enabling Description: A corporate collaboration platform uses an AI model as its "notice generator." When an employee posts a long document or complex message thread ("information input"), the AI performs several actions: (1) It generates a concise, abstractive summary of the content. (2) It analyzes the content to identify key entities, projects, and sentiment. (3) It accesses an organizational graph to identify other employees who, based on their roles and past work, are implicit stakeholders but were not explicitly included by the sender. The system then pushes a personalized "notice" to both explicit and implicit recipients, containing the AI-generated summary. The "channel" link may be augmented with parameters that cause the user interface to automatically highlight the sections of the document most relevant to that specific recipient's role.
Diagram:
```mermaid
sequenceDiagram
    participant User
    participant CentralAgent
    participant AI_Model
    participant Recipient
    User->>CentralAgent: Post long document
    CentralAgent->>AI_Model: Analyze document
    AI_Model-->>CentralAgent: Return {Summary, Stakeholders}
    CentralAgent->>Recipient: Push Notice (Summary, Personalized Link)
    Recipient->>CentralAgent: Click Link
    CentralAgent-->>Recipient: Serve document with personalized highlights
```
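The augmented channel link can be sketched as below; the `collab.example` host and the `highlight` query parameter are hypothetical conventions for carrying the per-recipient section list:

```python
from urllib.parse import urlencode

def personalized_notice(doc_id, summary, recipient, relevant_sections):
    """Attach role-specific highlight parameters to the channel link so the
    client auto-highlights the sections relevant to this recipient."""
    params = urlencode({"doc": doc_id,
                        "highlight": ",".join(relevant_sections)})
    return {"to": recipient,
            "summary": summary,
            "channel": f"https://collab.example/view?{params}"}

n = personalized_notice("doc-9", "Q3 roadmap: infra budget cut 10%",
                        "alice", ["sec-3", "sec-7"])
assert n["to"] == "alice"
```

The same stored document serves every recipient; only the channel parameters differ, so personalization costs nothing at storage time.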
4.2. Combination Art: System with Blockchain-Based Auditing and Smart Contract Events
Enabling Description: This system is implemented on a permissioned blockchain (e.g., Hyperledger Fabric) for supply chain provenance. A supplier submitting a shipment of goods is an "inputting member." Their submission, including certificates of origin and quality reports, is the "information input." The input is stored in an off-chain database (such as IPFS), and its hash is submitted to a smart contract ("central agent"). The smart contract records the hash, sender, and intended recipient (e.g., the buyer) on the immutable ledger. Upon successful transaction validation, the smart contract emits a ShipmentReceived event. This event is the "notice." An off-chain listener service subscribed to these events pushes a notification to the buyer. The "channel" in the notice is a URL to a blockchain explorer showing the transaction details, which in turn contains the IPFS link to the off-chain documentation, providing a fully auditable and non-repudiable communication trail.
Diagram:
```mermaid
flowchart TD
    A[Supplier submits docs to IPFS] --> B(Get Doc Hash)
    B --> C{Invoke Smart Contract w/ Hash}
    C --> D[Transaction written to Blockchain Ledger]
    D --> E["Smart Contract Emits Event (Notice)"]
    E --> F[Off-Chain Listener Service]
    F --> G[Push Notification to Buyer]
    G --> H{Buyer clicks Channel/Link}
    H --> I[View Transaction on Blockchain Explorer]
    I --> J[Access Docs via IPFS Hash]
```
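A toy sketch of the hash-on-chain, content-off-chain pattern; the `ToyLedger` class is illustrative, and a real deployment would use Hyperledger Fabric chaincode events rather than Python callbacks:

```python
import hashlib

class ToyLedger:
    """Append-only ledger; notifies subscribed listeners on each commit,
    modeling the ShipmentReceived event."""
    def __init__(self):
        self.blocks = []
        self.listeners = []

    def submit(self, doc_hash, sender, recipient):
        tx = {"hash": doc_hash, "from": sender, "to": recipient}
        self.blocks.append(tx)
        for notify in self.listeners:   # event emission (the "notice")
            notify(tx)
        return len(self.blocks) - 1     # transaction index

received = []
ledger = ToyLedger()
ledger.listeners.append(received.append)     # off-chain listener service

doc = b"certificate-of-origin.pdf contents"
doc_hash = hashlib.sha256(doc).hexdigest()   # content off-chain, hash on-chain
ledger.submit(doc_hash, "supplier", "buyer")
assert received[0]["hash"] == doc_hash       # buyer's service got the event
```

Auditability comes from the fact that the hash on the ledger commits to the off-chain documents: any later tampering with the IPFS content fails the hash check.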
Axis 5: The "Inverse" or Failure Mode
5.1. System with Graceful Degradation to P2P Retrieval
Enabling Description: A system designed for tactical edge networks with unreliable connectivity to a central server. In normal operation, it functions as described in the '149 patent. When the central agent detects a loss of connectivity to its primary storage but can still reach peripheral devices via a low-bandwidth channel, it enters a "degraded" mode. When a sender transmits a message, the central agent stores only the message's metadata and a notice. It pushes the notice to the recipient. The "channel" in this notice is modified; it contains the network address of the sender's device. When the recipient's device activates the channel, it attempts to form a direct P2P connection with the sender's device to retrieve the full message content, bypassing the unavailable central store. The central agent acts only as a discovery and signaling service.
Diagram:
```mermaid
sequenceDiagram
    participant P1 as Sender
    participant Agent as Central Agent (Degraded)
    participant P2 as Recipient
    P1->>Agent: Send Msg (Agent stores metadata only)
    Agent->>P2: Push Notice (Channel = P1's Address)
    P2->>P1: Initiate P2P connection
    P1-->>P2: Transfer full message content
```
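The degraded-mode channel rewrite can be sketched as follows; the `p2p://` scheme and `agent.example` host are hypothetical:

```python
class CentralAgent:
    """In degraded mode, store only metadata and point the channel at the
    sender's own device instead of central storage."""
    def __init__(self, degraded: bool):
        self.degraded = degraded
        self.metadata = {}

    def relay(self, msg_id, sender_addr, subject):
        self.metadata[msg_id] = {"from": sender_addr, "subject": subject}
        if self.degraded:
            # Recipient must fetch content directly from the sender.
            return {"msg_id": msg_id,
                    "channel": f"p2p://{sender_addr}/{msg_id}"}
        # Normal '149-style operation: channel points at central storage.
        return {"msg_id": msg_id,
                "channel": f"https://agent.example/msg/{msg_id}"}

notice = CentralAgent(degraded=True).relay("m1", "10.0.0.5:4433", "sitrep")
assert notice["channel"] == "p2p://10.0.0.5:4433/m1"
```

Only the channel contents change between modes; the notice format and push path stay identical, which is what makes the degradation graceful for clients.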
5.2. System with End-to-End Encryption and Zero-Knowledge Agent
Enabling Description: This system is for secure communication where the central agent cannot be trusted with message content. All "information inputs" are end-to-end encrypted on the peripheral device using a protocol such as Signal's. The encrypted ciphertext is sent to the central agent, which stores it as an opaque blob. Notice generation and sending occur as normal, but the agent has zero knowledge of the message content. The "channel" is a link to retrieve the encrypted blob. Decryption keys are managed and exchanged independently by the users' peripheral devices (e.g., via a separate key exchange protocol). The central agent's role is confined to storing encrypted blobs and pushing notices, ensuring the centrifugal workflow without compromising content confidentiality.
Diagram:
```mermaid
graph TD
    subgraph Sender
        A[Create Plaintext Msg] --> B(Encrypt Msg)
    end
    subgraph Recipient
        H[Receive Ciphertext] --> I(Decrypt Msg)
    end
    subgraph OutOfBand["Out-of-Band"]
        J[Key Exchange]
    end
    B <--> J
    I <--> J
    B --> C[Send Ciphertext to Central Agent]
    C --> D[Agent Stores Ciphertext Blob]
    D --> E[Agent Pushes Notice]
    E --> F{Recipient Clicks Channel}
    F --> G[Agent Serves Ciphertext Blob]
    G --> H
```
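The zero-knowledge property can be illustrated with a toy cipher. The XOR keystream below is NOT secure and merely stands in for a real E2E protocol such as Signal's; the point is that the agent stores only opaque bytes it cannot interpret:

```python
import hashlib
from itertools import count

def keystream(key: bytes, n: int) -> bytes:
    """Toy hash-counter keystream (illustrative only, not secure)."""
    out = b""
    for i in count():
        if len(out) >= n:
            return out[:n]
        out += hashlib.sha256(key + i.to_bytes(4, "big")).digest()

def xor(data: bytes, key: bytes) -> bytes:
    """Symmetric: the same call encrypts and decrypts."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

agent_store = {}                     # the central agent's view: opaque blobs
key = b"shared-out-of-band"          # exchanged peer-to-peer, never uploaded
agent_store["m1"] = xor(b"rendezvous at dawn", key)   # sender uploads blob
assert agent_store["m1"] != b"rendezvous at dawn"     # agent sees ciphertext
assert xor(agent_store["m1"], key) == b"rendezvous at dawn"  # recipient decrypts
```

The notice and channel machinery is unchanged; confidentiality comes entirely from the key never transiting the central agent.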
Combination Prior Art with Open-Source Standards
In addition to the Hyperledger Fabric example (4.2), the following systems are disclosed:
Combination with W3C Decentralized Identifiers (DIDs) and Verifiable Credentials (VCs): A system where each group member is identified by a DID. An "information input" is structured as a W3C Verifiable Credential, cryptographically signed by the sender's DID. The VC is stored on a distributed public or private data store. The "central agent" is a service that, upon receiving a VC, sends a "notice" to the recipient's DIDComm endpoint. The "channel" in the notice is a URL that resolves via a Universal Resolver to the stored VC, allowing the recipient to verify its authenticity and integrity.
Combination with the Matrix Protocol: A system where the "central agent" is a standard Matrix homeserver. An "information input" is a message event posted to a Matrix room. A specialized bot or homeserver module acts as the "notice generator." When a message is posted that matches certain criteria (e.g., @mentions a user), this module generates a custom "notice" and pushes it to the user's device via the standard Matrix push gateway mechanism. The "channel" is a matrix.to URI that acts as a permalink, deep-linking a client directly to the specific message event within the room's timeline. This implements the centrifugal notification flow using existing, open-source, federated communication standards.
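Constructing the matrix.to permalink can be sketched as below. The room and event identifiers are made up; the URI shape follows matrix.to's documented permalink format (`https://matrix.to/#/<roomId>/<eventId>?via=...`):

```python
from urllib.parse import quote

def matrix_permalink(room_id: str, event_id: str, via=None) -> str:
    """Build a matrix.to deep link to a specific event (the 'channel')."""
    uri = f"https://matrix.to/#/{quote(room_id)}/{quote(event_id)}"
    if via:
        # 'via' hints tell the client which homeservers can route the room.
        uri += "?" + "&".join(f"via={server}" for server in via)
    return uri

link = matrix_permalink("!abc123:example.org", "$evt456",
                        via=["example.org"])
assert link == ("https://matrix.to/#/%21abc123%3Aexample.org/%24evt456"
                "?via=example.org")
```

Because the permalink encodes both room and event, a client opening the channel lands directly on the message in its timeline context.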