Patent 8494904

Derivative works

Defensive disclosure: derivative variations of each claim designed to render future incremental improvements obvious or non-novel.

Defensive Disclosure Document for Innovations Derived from U.S. Patent 8,494,904

Publication Date: April 28, 2026
Subject Matter: Advanced methods and systems for the aggregation, verification, and monetization of descriptive profile data from distributed, unaffiliated sources.
Purpose: This document discloses a series of technical variations, extensions, and combinations of the core concepts outlined in U.S. Patent 8,494,904. Its intent is to place these derivative concepts into the public domain, thereby establishing them as prior art to preclude future patenting of these incremental and obvious improvements by third parties.


Analysis of Core Claimed Invention

The foundational concept involves a central computer system that receives partial user profiles from a plurality of unaffiliated third-party websites. A key mechanism is the tracking of the source of each profile attribute, which enables a compensation model for the data contributors. The aggregated profile data is then utilized for targeting third-party advertisements. This disclosure builds upon this foundation.


Derivative Disclosures

1. Derivations based on Component & Protocol Substitution

1.1. Synchronous API-Based Profile Enrichment
  • Enabling Description: This variation replaces the asynchronous URL-redirection method with a synchronous, low-latency API (Application Programming Interface) call. A third-party server, upon identifying a user, makes a server-to-server RESTful API call to the central profile system. The request payload contains the partial profile (e.g., a JSON object with attributes) and the cryptographically signed credentials of the third-party source. The central system ingests the data, updates the maintained profile, and returns a confirmation or an enriched data segment within the same synchronous transaction, typically in under 50 milliseconds. This architecture is suitable for real-time applications like ad bidding.
  • Mermaid Diagram:
    sequenceDiagram
        participant UserBrowser as User's Browser
        participant ThirdPartyServer as Third-Party Website
        participant ProfileSystem as Central Profile System
        participant Databank as Profile Databank
    
        UserBrowser->>ThirdPartyServer: Visits website
        ThirdPartyServer->>ProfileSystem: POST /enrich (API Call with Partial Profile)
        activate ProfileSystem
        ProfileSystem->>Databank: Update maintained profile with new attributes
        Databank-->>ProfileSystem: Confirmation
        ProfileSystem-->>ThirdPartyServer: 200 OK (Returns enriched data segments)
        deactivate ProfileSystem
        ThirdPartyServer->>UserBrowser: Renders page with targeted content
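
The synchronous transaction above can be sketched as follows. This is a minimal in-process model, not the patented system itself: the `enrich` handler, the HMAC-based source signature, and the response shape are illustrative assumptions standing in for a production REST endpoint.

```python
import hashlib
import hmac

PROFILE_STORE: dict[str, dict] = {}  # user_id -> maintained profile

def verify_source(payload: bytes, signature: str, shared_key: bytes) -> bool:
    """Check a contributor's HMAC signature over the raw request payload."""
    expected = hmac.new(shared_key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

def enrich(user_id: str, partial: dict, source_id: str) -> dict:
    """Merge a partial profile into the maintained profile, tagging each
    attribute with its source so contributor compensation can be tracked."""
    profile = PROFILE_STORE.setdefault(user_id, {})
    for attr, value in partial.items():
        profile[attr] = {"value": value, "source": source_id}
    # The enriched result is returned within the same synchronous transaction.
    return {"status": "ok", "attributes": sorted(profile)}
```

Because every attribute carries its `source` tag, the compensation model described in the core claims can be computed directly from the maintained profile.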
    
1.2. Profile Aggregation via a Distributed Hash Table (DHT)
  • Enabling Description: Instead of a centralized databank, maintained profiles are stored in a peer-to-peer Distributed Hash Table (DHT), similar to those used in systems like BitTorrent or IPFS. A user's identifier (e.g., a hashed email or a decentralized identifier) serves as the key. Each participating third-party node can write partial profiles (the values) to the DHT. To retrieve a full profile, the central system queries the DHT using the user's key, collecting partial profiles from various nodes. Source attribution is maintained by including the contributing node's public key in the stored value. Compensation is managed by a separate service that tracks DHT PUT requests.
  • Mermaid Diagram:
    flowchart TD
        subgraph Peer-to-Peer Network
            A[Node A - Contributor 1]
            B[Node B - Contributor 2]
            C[Node C - Contributor 3]
            DHT{Distributed Hash Table}
            A -- "PUT(UserID, {attr: 'A1'})" --> DHT
            B -- "PUT(UserID, {attr: 'B1'})" --> DHT
            C -- "PUT(UserID, {attr: 'C1'})" --> DHT
        end
    
        subgraph Central System
            Aggregator
            Compensation
        end
    
        Aggregator -- "GET(UserID)" --> DHT
        DHT -- "Returns [{attr: 'A1'}, {attr: 'B1'}, {attr: 'C1'}]" --> Aggregator
        Aggregator --> MaintainedProfile[Builds Maintained Profile]
        DHT -- "Log 'PUT' events" --> Compensation
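
An in-process stand-in for the DHT variant is sketched below. A real deployment would use a Kademlia-style overlay; here the class and field names are illustrative, but the two disclosed mechanisms are shown: source attribution stored with each value, and a PUT log consumed by the compensation service.

```python
from collections import defaultdict

class ToyDHT:
    """Single-process stand-in for a peer-to-peer distributed hash table."""
    def __init__(self):
        self._table = defaultdict(list)
        self.put_log = []  # (node_pubkey, user_key) pairs for compensation

    def put(self, user_key: str, node_pubkey: str, partial: dict) -> None:
        # Source attribution: the contributing node's public key is stored
        # alongside the partial profile it wrote.
        self._table[user_key].append({"node": node_pubkey, "attrs": partial})
        self.put_log.append((node_pubkey, user_key))

    def get(self, user_key: str) -> list:
        return list(self._table[user_key])

def build_maintained_profile(fragments: list) -> dict:
    """Flatten fragments into one profile; later writes win on conflicts."""
    merged = {}
    for frag in fragments:
        merged.update(frag["attrs"])
    return merged
```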
    

2. Derivations based on Operational Parameter Expansion

2.1. High-Frequency, Low-Latency Edge Computing Architecture
  • Enabling Description: The profile aggregation system is deployed on a global network of edge computing nodes. When a user in a specific geographic region accesses a participating website, the data transaction is handled by the nearest edge node. This node maintains a real-time cache of profile fragments relevant to users in its region, using an in-memory database like Redis or a time-series database for behavioral data. The edge node performs profile enrichment and ad targeting decisions with sub-10-millisecond latency. The nodes periodically synchronize their data with a central cloud-based databank for global consistency and model training.
  • Mermaid Diagram:
    graph TD
        User[User in EU] -->|Request| EdgeNodeEU[Edge Node - EU]
        
        subgraph Global Network
            EdgeNodeEU <--> CentralDB[(Central Databank)]
            EdgeNodeUS[Edge Node - US] <--> CentralDB
            EdgeNodeAPAC[Edge Node - APAC] <--> CentralDB
        end
    
        subgraph EdgeNodeEU
            direction LR
            Cache[(In-Memory Cache)]
            Logic[Targeting Logic]
            Contributor[Local Data Contributor] -- Partial Profile --> Cache
            Cache --> Logic
        end
        
        Logic -->|Targeted Ad| User
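
The edge-node pattern above can be sketched as follows, with a plain dict standing in for the Redis cache and the periodic synchronization reduced to an explicit `sync()` call. All names are illustrative assumptions.

```python
class EdgeNode:
    """Regional edge node: local ingest and targeting, periodic central sync."""
    def __init__(self, region: str, central: dict):
        self.region = region
        self.cache = {}          # user_id -> profile fragment (in-memory)
        self.central = central   # shared central databank
        self.dirty = set()       # user_ids changed since the last sync

    def ingest(self, user_id: str, partial: dict) -> None:
        self.cache.setdefault(user_id, {}).update(partial)
        self.dirty.add(user_id)

    def target(self, user_id: str) -> dict:
        # Low-latency decisioning: reads only from the local cache.
        return self.cache.get(user_id, {})

    def sync(self) -> None:
        # Periodic push to the central databank for global consistency.
        for user_id in self.dirty:
            self.central.setdefault(user_id, {}).update(self.cache[user_id])
        self.dirty.clear()
```

The design choice shown is the disclosed one: targeting never waits on the central databank, so only `sync()` pays the cross-region latency cost.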
    
2.2. Federated Profile System with Privacy-Preserving Computation
  • Enabling Description: For operating across strict regulatory boundaries (e.g., GDPR, CCPA), the system uses a federated architecture. Each region has an independent databank that never shares raw user data. When a cross-region profile is needed for targeting, the system uses privacy-preserving technologies like Homomorphic Encryption or Secure Multi-Party Computation (SMPC). A query is encrypted and sent to a partner databank; the partner computes over the encrypted data to find matching attributes and returns an encrypted result. The result can only be decrypted by the querying system. This allows for profile enrichment without exposing personally identifiable information (PII) across borders.
  • Mermaid Diagram:
    sequenceDiagram
        participant SystemA as Databank A (e.g., US)
        participant SystemB as Databank B (e.g., EU)
        
        SystemA->>SystemA: Encrypt Query for "User X"
        SystemA->>SystemB: Send Encrypted Query
        activate SystemB
        SystemB->>SystemB: Compute on encrypted query against local data
        SystemB-->>SystemA: Return Encrypted Result (e.g., matching attributes)
        deactivate SystemB
        SystemA->>SystemA: Decrypt Result
        SystemA->>SystemA: Augment local profile for "User X"
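
A full implementation of the flow above requires a homomorphic-encryption or SMPC library; the stdlib-only sketch below substitutes salted hashing purely to illustrate the data flow, in which Databank B answers a match query without the raw identifier ever crossing the border. The hashing scheme is a weaker stand-in, not the disclosed cryptographic technique.

```python
import hashlib

def blind(identifier: str, salt: bytes) -> str:
    """Blind a PII identifier with a salt shared by the federation protocol."""
    return hashlib.sha256(salt + identifier.encode()).hexdigest()

class Databank:
    """Regional databank that stores and matches only blinded identifiers."""
    def __init__(self, records: dict, salt: bytes):
        self.records = {blind(pii, salt): attrs for pii, attrs in records.items()}

    def answer(self, blinded_query: str) -> dict:
        # Matching happens over blinded values; raw PII is never exposed.
        return self.records.get(blinded_query, {})
```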
    

3. Derivations based on Cross-Domain Application

3.1. Aerospace: Federated Predictive Maintenance Network
  • Enabling Description: Unaffiliated entities (airlines, engine manufacturers, MRO providers) contribute operational data (a "partial profile") for specific aircraft components, identified by a unique serial number. Data includes sensor telemetry, flight cycles, and maintenance actions. A central, secure system aggregates these partial profiles into a comprehensive "maintained profile" for each component. AI models use this aggregated data to predict failures and optimize maintenance schedules. The entity that contributes data that leads to a successful "catch" of a potential failure receives a significant payment or credit, incentivizing data sharing.
  • Mermaid Diagram:
    erDiagram
        AIRCRAFT_COMPONENT ||--o{ COMPONENT_PROFILE : "has"
        COMPONENT_PROFILE {
            string component_serial_id PK
            json maintained_profile
        }
        PARTIAL_PROFILE {
            string profile_id PK
            string component_serial_id FK
            string source_id
            json attributes
            timestamp created_at
        }
        DATA_CONTRIBUTOR ||--o{ PARTIAL_PROFILE : "provides"
        DATA_CONTRIBUTOR {
            string source_id PK
            string organization_name
        }
        MAINTENANCE_ALERT ||--|{ COMPONENT_PROFILE : "targets"
        MAINTENANCE_ALERT {
            string alert_id PK
            string component_serial_id FK
            string alert_details
        }
        COMPONENT_PROFILE ||--o{ PARTIAL_PROFILE : "is built from"
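
The entity model in the diagram above can be expressed directly in code. Field names follow the ER diagram; the catch-bonus multiplier is an illustrative assumption about how the disclosed compensation incentive could be parameterized.

```python
from dataclasses import dataclass, field

@dataclass
class PartialProfile:
    profile_id: str
    component_serial_id: str
    source_id: str            # contributing airline, OEM, or MRO provider
    attributes: dict          # telemetry, flight cycles, maintenance actions

@dataclass
class ComponentProfile:
    component_serial_id: str
    partials: list = field(default_factory=list)

    def add(self, partial: PartialProfile) -> None:
        assert partial.component_serial_id == self.component_serial_id
        self.partials.append(partial)

    def maintained_profile(self) -> dict:
        """Aggregate all source-attributed partials into one profile."""
        merged = {}
        for p in self.partials:
            merged.update(p.attributes)
        return merged

def catch_bonus(alerting_partial: PartialProfile, base_fee: float) -> tuple:
    """Credit the contributor whose data led to a successful failure catch.
    The 10x multiplier is an assumed incentive parameter."""
    return (alerting_partial.source_id, base_fee * 10)
```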
    
3.2. AgTech: Collaborative Crop Yield Optimization Platform
  • Enabling Description: A platform aggregates "partial profiles" of agricultural land plots from diverse, unaffiliated sources: satellite imagery providers (NDVI data), IoT sensor companies (soil moisture), and hyperlocal weather services (precipitation data). Each partial profile is tagged with the source and the specific coordinates of the land plot. The platform's "maintained profile" provides a holistic view of the plot's condition. This data is used to deliver targeted advice on irrigation, fertilization, and pest control to farmers. Data contributors are compensated via a revenue-sharing model based on the measured increase in crop yield for farmers using the platform.
  • Mermaid Diagram:
    flowchart TD
        A[Satellite Provider] -- NDVI Data --> CentralPlatform
        B[IoT Sensor Vendor] -- Soil Moisture --> CentralPlatform
        C[Weather Service] -- Precipitation Data --> CentralPlatform
        
        subgraph CentralPlatform
            Aggregator((Aggregator))
            PlotProfileDB[(Land Plot Profiles)]
            AnalyticsEngine{Analytics Engine}
        end
    
        Aggregator --> PlotProfileDB
        PlotProfileDB --> AnalyticsEngine
        AnalyticsEngine --> D[Farmer's Dashboard]
        D -- Actionable Insights --> E((Farmer))
        
        subgraph Compensation
            F(Yield Monitor) -- Yield Data --> G{Rev-Share Calculator}
            G -- Payments --> A
            G -- Payments --> B
            G -- Payments --> C
        end
    

4. Derivations based on Integration with Emerging Technology

4.1. AI-Driven Dynamic Attribute Valuation
  • Enabling Description: This system integrates a reinforcement learning (RL) agent to manage the data economy. When a partial profile is received, the RL agent assesses its value based on multiple factors: the predicted uplift in conversion probability if used for ad targeting, the attribute's scarcity in the existing databank, the historical credibility of the source, and the current demand for that attribute from advertisers. The agent generates a real-time price for the data, which is used in the compensation calculation. Over time, the agent learns to prioritize high-value data sources and can even predictively request specific missing attributes from the network to complete valuable user profiles.
  • Mermaid Diagram:
    stateDiagram-v2
        [*] --> Receiving
        Receiving: Partial Profile Received
        Receiving --> Valuating: On Reception
        
        Valuating: RL Agent Analyzes Data
        state Valuating {
            direction LR
            [*] --> Scarcity
            Scarcity --> Credibility
            Credibility --> PredictedUplift
            PredictedUplift --> Demand
            Demand --> [*]
        }
    
        Valuating --> Pricing: Generate Real-time Price
        Pricing --> Compensating: Execute Payment
        Compensating --> [*]
    
4.2. Blockchain-Verified Data Provenance and Compensation
  • Enabling Description: The system uses a permissioned blockchain (e.g., Hyperledger Fabric) to ensure data provenance and automate compensation. When a third party contributes a partial profile, a hash of the data along with the source's identity and a user consent token is recorded as a transaction on the ledger. When this data is used for ad targeting, a smart contract is triggered. The contract automatically verifies the data's origin on the blockchain and executes a micropayment in a stablecoin from the advertiser's account directly to the data contributor's wallet. This creates a transparent, auditable, and immutable record of the entire data lifecycle.
  • Mermaid Diagram:
    sequenceDiagram
        participant Contributor as Data Contributor
        participant User as User
        participant AdSystem as Ad System
        participant Blockchain as Permissioned Blockchain
        participant Advertiser as Advertiser
    
        User->>Contributor: Grants Consent
        Contributor->>AdSystem: Submits Partial Profile + Consent Token
        AdSystem->>Blockchain: Transaction: RecordData(DataHash, Source, Consent)
        Advertiser->>AdSystem: Places Ad Targeting Order
        AdSystem->>Blockchain: Smart Contract Call: UseData(DataHash)
        Blockchain-->>AdSystem: Verified
        AdSystem->>AdSystem: Serves Targeted Ad
        Blockchain->>Contributor: Auto-Execute Micropayment
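
A hash-chained ledger can stand in for the permissioned blockchain in a sketch of the provenance-and-payment flow. The record fields mirror the sequence diagram; the chaining scheme and the micropayment amount are illustrative assumptions in place of Hyperledger Fabric and a real stablecoin transfer.

```python
import hashlib
import json

class ProvenanceLedger:
    """Append-only, hash-chained stand-in for a permissioned blockchain."""
    def __init__(self):
        self.chain = []  # entries: {"record": ..., "prev": ..., "hash": ...}

    def record_data(self, data_hash: str, source_id: str, consent_token: str) -> str:
        prev = self.chain[-1]["hash"] if self.chain else "0" * 64
        record = {"data_hash": data_hash, "source": source_id,
                  "consent": consent_token}
        entry_hash = hashlib.sha256(
            (prev + json.dumps(record, sort_keys=True)).encode()).hexdigest()
        self.chain.append({"record": record, "prev": prev, "hash": entry_hash})
        return entry_hash

    def use_data(self, data_hash: str, payment: float = 0.001) -> dict:
        """Smart-contract stand-in: verify provenance on the ledger, then emit
        a payment instruction to the recorded contributor."""
        for entry in self.chain:
            if entry["record"]["data_hash"] == data_hash:
                return {"verified": True,
                        "pay_to": entry["record"]["source"],
                        "amount": payment}
        return {"verified": False}
```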
    

5. Derivations based on Inverse or Failure Mode Operation

5.1. Graceful Degradation to Anonymized Segment Targeting
  • Enabling Description: The system includes a "privacy-first" operational mode that can be triggered by user preference, regulatory requirements, or as a fail-safe mode. In this mode, the system does not store or process individual-level PII. Instead, upon receiving a partial profile, it immediately converts the specific attributes into generalized, k-anonymized segments (e.g., "male, 30-40, interested in sports" becomes segment "MK30S"). The raw data is discarded. Ad targeting is performed only on these non-personally-identifiable segments. Compensation to data providers is based on a lower, fixed rate for contributing to these anonymized audience pools.
  • Mermaid Diagram:
    flowchart TD
        subgraph Normal Mode
            A["Receive Partial Profile (PII)"] --> B{Store PII in Maintained Profile} --> C[Target Individual User]
        end
    
        subgraph Privacy-First Mode
            D["Receive Partial Profile (PII)"] --> E{Anonymizer Engine}
            E -- K-Anonymization --> F[Generate Anonymized Segment]
            F --> G{Store Segment in Audience Pool} --> H[Target Segment]
            E -- Discard PII --> I(("/dev/null"))
        end
    
        Trigger[User Opt-Out or Regulation] --> Switch{Operational Mode Switch}
        Switch -- Normal --> A
        Switch -- Privacy-First --> D
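
The segment-coding step of the privacy-first mode can be sketched as follows. The code scheme is chosen to reproduce the "MK30S" example in the text, and the k-anonymity threshold is an illustrative assumption; only segments with at least k members are released for targeting.

```python
from collections import Counter
from typing import Optional

SEGMENT_POOL = Counter()  # segment code -> member count

def to_segment(gender: str, age: int, interest: str) -> str:
    """Collapse specific attributes into a coarse segment code, e.g.
    ("male", 34, "sports") -> "MK30S" to match the example above."""
    g = {"male": "M", "female": "F"}.get(gender, "X")
    bucket = (age // 10) * 10          # 34 -> 30 (decade bucket)
    return f"{g}K{bucket}{interest[0].upper()}"

def ingest_privacy_first(gender: str, age: int, interest: str,
                         k: int = 50) -> Optional[str]:
    segment = to_segment(gender, age, interest)
    SEGMENT_POOL[segment] += 1         # raw attributes are discarded here
    # Release the segment only once it is k-anonymous (>= k members).
    return segment if SEGMENT_POOL[segment] >= k else None
```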
    

Combination Prior Art Scenarios with Open-Source Standards

  1. Combination with OAuth 2.0 / OpenID Connect: The '904 patent's data transfer is implemented using the OAuth 2.0 authorization code grant flow. A user, on a third-party website, clicks "Share my shopping interests with AdNetwork." This initiates an OAuth flow where the user authenticates with the AdNetwork (the '904 system) and authorizes the third-party site to share specific "scopes" (e.g., profile.purchase_history). The third-party site receives an authorization code, exchanges it for an access token, and then uses the token to make a secure, server-to-server API call to the AdNetwork's user info endpoint, thereby transferring the partial profile. This standardizes the process of receiving user-consented data from unaffiliated parties.

  2. Combination with W3C Decentralized Identifiers (DIDs) and Verifiable Credentials (VCs): The user is identified by a DID, not a cookie. A third-party retail site (an "Issuer") creates a Verifiable Credential containing attributes like {"product_category": "outdoor_gear"} and cryptographically signs it. The user stores this VC in their digital wallet. When visiting a publisher site, the user's browser agent (the "Holder") presents this VC to the '904 ad system (the "Verifier"). The ad system verifies the signature of the issuer and uses the attribute for ad targeting. This decentralizes profile creation, putting the user in control of what attributes are shared, while still allowing the ad system to track the source (the issuer of the VC) for compensation.

  3. Combination with Prebid.js Header Bidding Framework: The '904 profile enrichment system is configured as a "Real-Time Data" module within the open-source Prebid.js framework. When an ad auction is initiated on a publisher's webpage, Prebid calls the '904 module. This module then contacts multiple, unaffiliated data providers (also integrated into the ecosystem) to fetch partial profile attributes associated with the user's ID. It aggregates these attributes into a set of targeting segments and attaches them to the bid requests sent to demand-side platforms (DSPs). This enriches the auction with valuable user data in real-time. The data provider whose attributes are present in the winning bid receives a share of the revenue, tracked and managed through the Prebid framework.
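
Combination scenario 1 can be sketched as a toy in-process model of the OAuth 2.0 authorization code grant. The scope name `profile.purchase_history` comes from the scenario above; the class, method names, and token format are illustrative assumptions rather than any real provider's API.

```python
import secrets

class AdNetworkAuthServer:
    """Toy model of the '904 system acting as an OAuth 2.0 provider."""
    def __init__(self):
        self._codes = {}   # authorization code -> (user_id, granted scopes)
        self._tokens = {}  # access token -> (user_id, granted scopes)

    def authorize(self, user_id: str, scopes: set) -> str:
        # Issued after the user consents on the AdNetwork's consent screen.
        code = secrets.token_urlsafe(16)
        self._codes[code] = (user_id, scopes)
        return code

    def exchange(self, code: str) -> str:
        # Authorization codes are single-use: pop() invalidates on exchange.
        user_id, scopes = self._codes.pop(code)
        token = secrets.token_urlsafe(16)
        self._tokens[token] = (user_id, scopes)
        return token

    def userinfo(self, token: str, profile: dict) -> dict:
        # Release only the attributes covered by the granted scopes.
        _, scopes = self._tokens[token]
        return {k: v for k, v in profile.items() if k in scopes}
```

The scope filter in `userinfo` is the point of the combination: the partial profile transferred to the third party is exactly the user-consented subset, nothing more.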
