Patent 9235259
Derivative works
Defensive disclosure: derivative variations of each claim designed to render future incremental improvements obvious or non-novel.
This document serves as a defensive disclosure of derivative inventions and improvements related to the art described in US Patent 9,235,259. The purpose of this disclosure is to place these concepts into the public domain, thereby establishing them as prior art for any future patent applications. The disclosures herein are described in sufficient detail to enable a Person Having Ordinary Skill in the Art (PHOSITA) to practice the inventions.
Disclosure Set 1: Derivatives of Core Coarse-to-Fine Tick Detection (Claims 1 & 9)
This set of disclosures expands upon the fundamental method of using a low-power coarse processor to trigger a high-fidelity fine processor that compares an audio event to a device-specific, pre-trained fingerprint.
1.1. Component Substitution: Haptic/Vibrational Tick Detection
Enabling Description: This variation replaces the microphone with a non-acoustic sensor to detect the "tick" as a physical impulse. A piezoelectric transducer is laminated onto the device's chassis, or a multi-axis MEMS accelerometer is used. The coarse detection processor monitors the sensor output for a rapid, high-amplitude change in voltage (from the piezoelectric element) or G-force (from the accelerometer) that exceeds a baseline threshold. Upon this coarse trigger, a buffer of the high-resolution sensor data is passed to the fine detection processor. The fine processor calculates the Fast Fourier Transform (FFT) of the vibrational signal. The "fingerprint" is a pre-trained reference set of the device's characteristic structural resonant frequencies in response to a physical tap. The fine detection step involves correlating the frequency spectrum of the live impulse with this stored vibrational fingerprint.
Mermaid Diagram:
```mermaid
flowchart TD
    A[Physical Tap on Device] --> B{Piezoelectric Transducer / MEMS Accelerometer}
    B --> C[Coarse Processor: Monitor for Voltage/G-force Spike]
    C -->|Spike > Threshold| D[Trigger & Buffer High-Res Vibration Data]
    C -->|Spike <= Threshold| C
    D --> E[Fine Processor: Perform FFT on Vibration Data]
    E --> F{Correlate FFT with Stored Vibrational Fingerprint}
    F -->|Correlation > T_p| G[Tick Confirmed]
    F -->|Correlation <= T_p| H[False Alarm]
```
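The coarse/fine haptic pipeline above can be outlined in a few lines of NumPy. This is an illustrative sketch, not the patented implementation: the threshold values, the Hann window, and the use of cosine similarity as the correlation measure are all assumptions made for the example.

```python
import numpy as np

def coarse_trigger(samples, threshold=0.5):
    """Coarse stage: fire when any sample's absolute amplitude exceeds a baseline threshold."""
    return float(np.max(np.abs(samples))) > threshold

def fine_confirm(buffer, fingerprint, t_p=0.8):
    """Fine stage: correlate the live impulse's magnitude spectrum with a stored
    vibrational fingerprint of the chassis's characteristic resonant response."""
    spectrum = np.abs(np.fft.rfft(buffer * np.hanning(len(buffer))))
    spectrum /= np.linalg.norm(spectrum) + 1e-12
    fp = fingerprint / (np.linalg.norm(fingerprint) + 1e-12)
    return float(np.dot(spectrum, fp)) > t_p
```

A tap excites the chassis's resonant modes, so a matching impulse correlates strongly with the stored spectrum while broadband noise does not.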
1.2. Operational Parameter Expansion: Ultrasonic Fracture Detection in Industrial Assets
Enabling Description: This disclosure applies the method to the field of predictive maintenance and non-destructive testing. An ultrasonic transducer, sensitive in the 50 kHz to 5 MHz range, is acoustically coupled to a critical industrial component (e.g., a pressure vessel wall, a pipeline section, a rotating turbine blade). The coarse detection processor continuously monitors for wideband energy bursts characteristic of Acoustic Emission (AE) events, such as those produced by micro-fracture propagation in metals or composites. When such a burst is detected, the fine detection processor is enabled. It analyzes a high-resolution buffer of the ultrasonic waveform, comparing its spectral and temporal characteristics against a pre-trained library of AE "fingerprints" corresponding to known failure modes (e.g., crack growth, delamination, fiber breakage). The fine analysis uses a Short-Time Fourier Transform (STFT) to create a spectrogram, which is then compared to reference spectrograms using image correlation techniques.
Mermaid Diagram:
```mermaid
sequenceDiagram
    participant Asset as Industrial Asset (e.g., Pipeline)
    participant Sensor as Ultrasonic Transducer
    participant CoarseCPU as Coarse Processor
    participant FineCPU as Fine Processor
    participant Hub as Monitoring Hub
    Asset->>Sensor: Micro-fracture event generates ultrasonic 'tick'
    loop Continuous Monitoring
        Sensor->>CoarseCPU: Ultrasonic Waveform Data
        CoarseCPU->>CoarseCPU: Detect Energy Burst
    end
    CoarseCPU->>FineCPU: Enable! (Coarse Tick Detected at T_0)
    Sensor->>FineCPU: Buffer High-Resolution Waveform around T_0
    FineCPU->>FineCPU: Generate Spectrogram and Compare with Failure Fingerprints
    alt Correlation > Threshold
        FineCPU->>Hub: ALERT: Potential Fracture Detected (Type: Crack Growth)
    else Correlation <= Threshold
        FineCPU->>Hub: LOG: Non-critical Acoustic Event
    end
```
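The fine stage's spectrogram comparison could be sketched as follows. The naive Hann-windowed STFT, the frame sizes, and zero-mean correlation standing in for the "image correlation techniques" are assumptions chosen to keep the example self-contained.

```python
import numpy as np

def stft_mag(x, n_fft=256, hop=128):
    """Magnitude spectrogram via a simple Hann-windowed short-time FFT."""
    win = np.hanning(n_fft)
    frames = [x[i:i + n_fft] * win for i in range(0, len(x) - n_fft + 1, hop)]
    return np.abs(np.fft.rfft(np.array(frames), axis=1)).T  # shape: (freq, time)

def ae_correlation(live_spec, reference_spec):
    """Zero-mean normalized correlation between a live AE spectrogram and a
    stored failure-mode fingerprint spectrogram of the same shape."""
    a = live_spec.ravel() - live_spec.mean()
    b = reference_spec.ravel() - reference_spec.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
```

A library of such reference spectrograms, one per failure mode, would then be scanned for the best-correlating fingerprint.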
1.3. Cross-Domain Application: Subterranean Pest Detection in Agriculture (AgTech)
Enabling Description: This invention is adapted for precision agriculture to non-invasively detect and identify subterranean pests, such as root-boring insects. A geophone or soil-vibration sensor is buried in the root zone of crops. The coarse processor monitors for low-amplitude, intermittent vibrational transients against the background seismic noise. The fine processor is triggered when a transient is detected and uses a pre-trained library of vibrational "fingerprints." Each fingerprint corresponds to the unique substrate-borne vibrations produced by the movement, chewing, or stridulation of a specific target pest (e.g., the root weevil larva). The fine analysis employs the energy differencing technique from claim 20 but applies it to the low-frequency (10-800 Hz) spectrum of the geophone signal. Identification of a pest triggers a targeted micro-dosing of pesticide or biological agent at that specific location.
Mermaid Diagram:
```mermaid
stateDiagram-v2
    [*] --> Listening
    Listening --> CoarseTrigger : Vibration transient detected
    CoarseTrigger --> FineAnalysis : Enable fine processor
    FineAnalysis --> Listening : Correlation < Threshold (False Alarm)
    FineAnalysis --> PestIdentified : Correlation > Threshold for Pest 'X'
    PestIdentified --> Dosing : Trigger targeted micro-dosing
    Dosing --> Listening : Return to monitoring state
```
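The band-limited energy differencing for the geophone signal might look like this sketch. The sample rate, the sub-band edges within the 10-800 Hz range, and the match threshold are all hypothetical values for illustration.

```python
import numpy as np

FS = 4000  # assumed geophone sample rate (Hz)

def band_energies(buf, bands=((10, 100), (100, 300), (300, 800))):
    """Energy of a geophone buffer in each low-frequency sub-band (Hz)."""
    spec = np.abs(np.fft.rfft(buf)) ** 2
    freqs = np.fft.rfftfreq(len(buf), 1 / FS)
    return np.array([spec[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands])

def energy_difference(prev_buf, cur_buf):
    """Inter-buffer energy difference, restricted to the 10-800 Hz spectrum."""
    return band_energies(cur_buf) - band_energies(prev_buf)

def matches_pest(diff, fingerprint, threshold=0.9):
    """Compare the live difference vector against a stored pest fingerprint."""
    d = diff / (np.linalg.norm(diff) + 1e-12)
    f = fingerprint / (np.linalg.norm(fingerprint) + 1e-12)
    return float(np.dot(d, f)) > threshold
```

A chewing or stridulation transient concentrates new energy in a species-characteristic sub-band, which the difference vector isolates against the steady seismic background.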
1.4. Integration with Emerging Tech: AI-Adaptive Acoustic Fingerprinting
Enabling Description: This disclosure enhances the tick detection system with an AI model that dynamically adapts the reference fingerprint to changing environmental conditions. The device is equipped with an array of IoT sensors (e.g., temperature, humidity, barometer, accelerometer for orientation). The coarse detection stage remains the same. When triggered, the fine processor receives the audio buffer and a feature vector of the current environmental state from the IoT sensors. Instead of a static fingerprint, it uses a lightweight generative neural network (e.g., a conditional variational autoencoder) that was trained on tick recordings under a wide range of conditions. The network takes the environmental feature vector as a condition and generates an expected fingerprint for the tick under the current conditions. This generated fingerprint is then used for the correlation, dramatically improving robustness to environmental changes and device aging.
Mermaid Diagram:
```mermaid
classDiagram
    class CoarseDetector {
        +listenForSpike()
    }
    class FineDetector {
        -correlationThreshold
        +confirmTick(audioBuffer, contextVector)
    }
    class AIGenerator {
        <<Model>>
        +generateFingerprint(contextVector)
    }
    class IoTSensorManager {
        +getCurrentContextVector()
    }
    CoarseDetector --> FineDetector : triggers
    FineDetector "1" -- "1" IoTSensorManager : gets
    FineDetector "1" -- "1" AIGenerator : uses
    IoTSensorManager ..> AIGenerator : provides context
```
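The confirmation flow with a condition-dependent fingerprint can be sketched as below. To keep the example self-contained, the trained conditional decoder is stood in for by a plain linear map (`W @ context + b`); in the disclosure proper this would be the decoder of a conditional variational autoencoder whose weights were learned offline from tick recordings across environmental conditions.

```python
import numpy as np

class FingerprintGenerator:
    """Stand-in for a trained conditional decoder: maps an environmental
    context vector (e.g., temperature, humidity, pressure, orientation) to
    the expected tick fingerprint under those conditions."""
    def __init__(self, W, b):
        self.W, self.b = W, b  # would come from offline training

    def generate(self, context):
        return self.W @ context + self.b

def confirm_tick(audio_fp, context, gen, threshold=0.8):
    """Fine stage: correlate the live fingerprint against the fingerprint
    generated for the *current* environmental conditions."""
    expected = gen.generate(context)
    a = audio_fp / (np.linalg.norm(audio_fp) + 1e-12)
    e = expected / (np.linalg.norm(expected) + 1e-12)
    return float(np.dot(a, e)) > threshold
```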
1.5. Inverse/Failure Mode: Failsafe Low-Power Tick Confirmation
Enabling Description: This variant describes a "graceful degradation" mode for ultra-low-power scenarios, such as when a device's battery is critically low. Upon entering this state, the fine detection processor and its associated memory and clock domains are completely powered down. The system relies solely on the coarse detector. To reduce the high rate of false positives from the coarse detector alone, a secondary confirmation logic is implemented. When the coarse detector triggers, it does not wake the fine processor. Instead, it registers a "potential tick event" and opens a short time window (e.g., 500 ms). It then requires a second, distinct coarse tick event to occur within that window to validate the event as a "confirmed tick." This "double-tap" logic provides a rudimentary but extremely low-power method of confirmation, maintaining basic functionality while consuming minimal energy.
Mermaid Diagram:
```mermaid
stateDiagram-v2
    state "Low Power Mode" as LPM {
        [*] --> Idle
        Idle --> Tentative : Coarse tick detected
        Tentative --> Confirmed : Second coarse tick detected within 500ms
        Tentative --> Idle : Timeout (500ms)
        Confirmed --> Idle : Report tick and reset
    }
    state "Full Power Mode" as FPM {
        [*] --> Listening
        Listening --> Fine_Processing : Coarse tick detected
        Fine_Processing --> [*]
    }
```
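The low-power "double-tap" state machine reduces to a few lines. This is a minimal sketch; the 500 ms window comes from the text above, while the timestamp-driven interface is an assumption.

```python
class DoubleTapConfirmer:
    """Failsafe low-power confirmation: a coarse tick opens a 500 ms window;
    a second coarse tick inside the window confirms, otherwise the tentative
    event lapses and the next tick opens a fresh window."""
    WINDOW_S = 0.5

    def __init__(self):
        self.pending_at = None  # timestamp of the tentative first tap, if any

    def on_coarse_tick(self, t):
        """Feed coarse-trigger timestamps (seconds); returns True on a confirmed tick."""
        if self.pending_at is not None and t - self.pending_at <= self.WINDOW_S:
            self.pending_at = None
            return True        # second tap inside the window: confirmed
        self.pending_at = t    # first tap, or stale window: restart the window
        return False
```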
Disclosure Set 2: Derivatives of Noise Suppression via FFT Energy Differencing (Claims 20 & 22)
This set expands on the specific fine-processing technique of using differences between FFTs of successive audio buffers to suppress noise and create a fingerprint.
2.1. Signal Processing Substitution: Wavelet Packet Decomposition for Fingerprinting
Enabling Description: This disclosure replaces the FFT-based fine processing with a Wavelet Packet Decomposition (WPD). WPD provides a richer time-frequency analysis than FFT, particularly for transient signals. Upon a coarse trigger, the buffered audio is decomposed using WPD to a specified level (e.g., level 4), creating a tree of wavelet coefficients. The energy of the coefficients in each terminal node (representing a specific frequency sub-band) is calculated for successive time buffers. The "fingerprint" is a reference vector (or matrix) of the expected inter-buffer energy differences across these specific wavelet sub-bands. This method is more robust to certain types of non-stationary noise, as the wavelet basis functions are better at compactly representing transient "tick" signals than sinusoidal FFT basis functions. A Morlet or Daubechies mother wavelet is selected for this purpose.
Mermaid Diagram:
```mermaid
flowchart TD
    subgraph V1["Fine Processing v1 (Patent)"]
        A[Audio Buffer] --> B[Compute FFT]
        C[Previous Buffer] --> D[Compute FFT]
        B & D --> E{Compute Energy Difference in Frequency Bins}
    end
    subgraph V2["Fine Processing v2 (Wavelet)"]
        F[Audio Buffer] --> G[Compute WPD]
        H[Previous Buffer] --> I[Compute WPD]
        G & I --> J{Compute Energy Difference in Wavelet Sub-bands}
    end
    E --> K[Correlate with FFT Fingerprint]
    J --> L[Correlate with Wavelet Fingerprint]
```
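The wavelet-domain energy differencing can be sketched without a wavelet library by using the Haar basis, whose packet decomposition is a recursive pairwise sum/difference. Note this is a simplification: the disclosure names Daubechies or Morlet mother wavelets; Haar is used here only because it is trivial to implement, and the decomposition level is an assumed parameter.

```python
import numpy as np

def haar_wpd_energies(x, level=4):
    """Energy per terminal node of a Haar wavelet packet decomposition.
    Each stage splits every node into a low-pass (pairwise average) and a
    high-pass (pairwise difference) child, yielding 2**level sub-bands."""
    nodes = [np.asarray(x, dtype=float)]
    for _ in range(level):
        nxt = []
        for n in nodes:
            n = n[: len(n) // 2 * 2]                     # even length
            nxt.append((n[0::2] + n[1::2]) / np.sqrt(2))  # low-pass branch
            nxt.append((n[0::2] - n[1::2]) / np.sqrt(2))  # high-pass branch
        nodes = nxt
    return np.array([np.sum(n ** 2) for n in nodes])

def wpd_fingerprint(prev_buf, cur_buf, level=4):
    """Inter-buffer energy differences across wavelet sub-bands: the wavelet
    analogue of the patent's FFT energy differencing."""
    return haar_wpd_energies(cur_buf, level) - haar_wpd_energies(prev_buf, level)
```

Because the Haar transform is orthonormal, the sub-band energies sum exactly to the buffer's total energy, so the difference vector redistributes, rather than invents, signal energy.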
2.2. Cross-Domain Application: Smart Home Wake-Word Reverb Analysis
Enabling Description: The method is applied to smart speakers to reject false wake-word activations from media playback (e.g., a TV). The standard low-power phonetic model serves as the "coarse detector." Upon a potential wake-word detection, the "fine processor" is enabled. It analyzes the audio signal containing the wake-word and the audio immediately following it. It computes FFTs for two overlapping buffers: one centered on the wake-word and one centered on the subsequent reverberant tail. By taking the difference of the energy spectra (F_c(ω, m′) = E_c(ω, m′) − E_c(ω, m′−1)), the system isolates the spectral decay characteristics of the sound in the specific room. This acoustic signature is compared to a "fingerprint" of the room's reverberation profile learned during device setup. A wake-word originating from a TV will carry the acoustic characteristics of the TV's speakers and the recording environment, which will not match the live room's fingerprint, causing the activation to be rejected.
Mermaid Diagram:
```mermaid
stateDiagram-v2
    [*] --> Listening
    Listening --> CoarseWakeWord : Phonetic match for "Hey Gizmo"
    CoarseWakeWord --> FineReverbAnalysis : Enable fine processor
    state FineReverbAnalysis {
        direction LR
        [*] --> CaptureAudio
        CaptureAudio --> ComputeFFTs : Buffer 1 (word), Buffer 2 (tail)
        ComputeFFTs --> ComputeEnergyDiff : Isolate room reverb signature
        ComputeEnergyDiff --> CorrelateWithRoomFingerprint
    }
    FineReverbAnalysis --> [*] : Correlation > Threshold (Wake-word Accepted)
    FineReverbAnalysis --> Listening : Correlation < Threshold (False Alarm, from TV)
```
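A sketch of the reverb-signature comparison follows. One deliberate departure from the raw energy difference in the text: each spectrum is normalized to unit total energy before differencing, so the signature captures the shape of the per-bin decay rather than absolute playback level; the acceptance threshold is likewise an assumed value.

```python
import numpy as np

def reverb_signature(word_buf, tail_buf):
    """Per-bin change in *relative* spectral energy from the wake-word buffer
    to its reverberant tail: a decay-shape signature of the room."""
    e_word = np.abs(np.fft.rfft(word_buf)) ** 2
    e_tail = np.abs(np.fft.rfft(tail_buf)) ** 2
    # Normalize each spectrum so loudness cancels and only decay shape remains.
    return e_tail / (e_tail.sum() + 1e-12) - e_word / (e_word.sum() + 1e-12)

def is_live_utterance(sig, room_fingerprint, threshold=0.7):
    """Accept the wake-word only if the live decay signature matches the
    room's reverberation fingerprint learned during setup."""
    a = sig / (np.linalg.norm(sig) + 1e-12)
    b = room_fingerprint / (np.linalg.norm(room_fingerprint) + 1e-12)
    return float(np.dot(a, b)) > threshold
```

A live room attenuates high frequencies faster than low ones, giving a distinctive signature; a uniformly scaled playback tail produces a near-zero signature and is rejected.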
2.3. Integration with Blockchain: Immutable Physical Event Auditing
Enabling Description: This disclosure creates a system for a high-security, auditable supply chain. An IoT device attached to a secure container uses the coarse-to-fine tick detection to identify impacts or unauthorized access attempts. Upon a confirmed "fine" detection, the system generates the fingerprint matrix F_c(ω, m). This matrix, along with the GPS location, a high-precision timestamp, and the device ID, is serialized into a JSON object. The SHA-256 hash of this JSON object is computed. This hash is then submitted as a transaction to a permissioned blockchain (e.g., Hyperledger Fabric). Storing only the hash on-chain is efficient, while the full JSON payload is stored off-chain in a distributed file system like IPFS. This creates an immutable, tamper-evident, and verifiable record of a physical event occurring at a specific time and place.
Mermaid Diagram:
```mermaid
sequenceDiagram
    participant Container as Secure Container
    participant Sensor as IoT Device
    participant Blockchain as Private Blockchain
    participant OffChainDB as Distributed Storage (IPFS)
    activate Sensor
    Container->>Sensor: Physical Impact ('tick')
    Sensor->>Sensor: Coarse-to-Fine Detection
    Sensor->>Sensor: Generate Fingerprint Matrix Fc
    Sensor->>Sensor: Create JSON Payload (Timestamp, GPS, Fc)
    Sensor->>OffChainDB: Store JSON Payload, get Content ID (CID)
    Sensor->>Sensor: Compute Hash(CID + Metadata)
    Sensor->>Blockchain: Submit Transaction(Hash)
    deactivate Sensor
```
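The serialize-and-hash step uses only the standard library. The field names and payload shape here are illustrative; canonical JSON encoding (sorted keys, fixed separators) is the one substantive choice, since the on-chain digest is only verifiable if the off-chain payload re-serializes byte-identically.

```python
import hashlib
import json

def make_audit_record(device_id, timestamp_ns, gps, fingerprint_matrix):
    """Serialize the event payload and compute the SHA-256 digest to anchor
    on-chain; the payload itself goes to off-chain storage (e.g., IPFS)."""
    payload = json.dumps(
        {
            "device_id": device_id,
            "timestamp_ns": timestamp_ns,
            "gps": gps,                         # [lat, lon]
            "fingerprint": fingerprint_matrix,  # Fc(w, m) as nested lists
        },
        sort_keys=True,              # canonical ordering: reproducible hash
        separators=(",", ":"),       # no whitespace variance
    ).encode("utf-8")
    return payload, hashlib.sha256(payload).hexdigest()
```

An auditor later fetches the payload from IPFS, re-hashes it, and checks the digest against the on-chain transaction.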
Disclosure Set 3: Combination Prior Art Disclosures
3.1. Combination with WebRTC and WebAssembly (WASM)
- Enabling Description: A method for browser-based, peer-to-peer device pairing is disclosed. Two devices (e.g., laptops) navigate to a web application. The application uses `navigator.mediaDevices.getUserMedia` and the Web Audio API to access the microphone on each device. A Web Worker, running in a background thread, performs the computationally inexpensive "coarse tick detection" on the raw audio stream. When a potential tick is found, the relevant `AudioBuffer` is passed to the main JavaScript thread. The main thread invokes a pre-compiled WebAssembly (WASM) module that executes the high-performance "fine tick detection" algorithm, including the FFT energy differencing and a correlation check against a fingerprint downloaded from the server. When both devices report a confirmed tick with closely matching timestamps and fingerprint correlations, their identities are exchanged over an existing, unauthenticated WebRTC `RTCDataChannel`, thus establishing a secure, authenticated session. This method implements the patented invention entirely within the open standards of the modern web platform.
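The final pairing decision (do the two peers' reports describe the same physical tick?) is sketched below, in Python for brevity; in the disclosed system this logic would run in the web application's JavaScript. The skew tolerance, correlation floor, and report field names are illustrative assumptions.

```python
def pairing_decision(report_a, report_b, max_skew_s=0.05, min_corr=0.9):
    """Accept the pairing only if both fine-detection reports agree in time
    (within max_skew_s seconds) and both fingerprint correlations clear
    the confidence floor min_corr."""
    same_time = abs(report_a["t"] - report_b["t"]) <= max_skew_s
    both_confident = report_a["corr"] >= min_corr and report_b["corr"] >= min_corr
    return same_time and both_confident
```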
3.2. Combination with the Matter IoT Protocol
- Enabling Description: A "tap-to-commission" feature for Matter-compliant IoT devices is disclosed. A new, un-commissioned device (e.g., a smart bulb) listens for a specific tap sequence on its housing using the coarse-to-fine detection method. A commissioning device (e.g., a smartphone) is physically tapped against the bulb. Both devices detect the tick. The bulb, upon confirming the tick via its internal fine processor, broadcasts a special Matter-compliant BLE advertisement packet containing a hash of the tick's fingerprint. The smartphone, also having detected the tick, computes its own hash. It scans for the bulb's advertisement and, upon finding a packet with the matching hash, initiates the standard Matter commissioning flow over BLE. This uses the shared physical event, verified by the patented method, as a secure, out-of-band mechanism to bootstrap the standardized Matter onboarding process.
3.3. Combination with Android Sensor Hardware Abstraction Layer (HAL)
- Enabling Description: A method for system-level integration of the tick detection algorithm into an Android-based mobile device is disclosed. The device manufacturer implements the coarse-to-fine tick detection logic within the device's Sensor HAL. A new, non-standard sensor type, `SENSOR_TYPE_ACOUSTIC_TAP`, is defined and registered with the `SensorManager`. The coarse detection algorithm runs continuously on a low-power Digital Signal Processor (DSP) that has access to the device's microphone data. Upon a coarse trigger, the DSP wakes the main Application Processor (AP) and passes it the relevant audio buffer. The AP executes the fine detection algorithm. If the tick is confirmed, the HAL driver populates a `sensors_event_t` structure and pushes it to the system's sensor event queue. Any user-space application with the appropriate permissions can then register a `SensorEventListener` for this sensor type, receiving tap events without requiring microphone access or implementing the detection logic itself. This embeds the patented method as a native feature of the open-source mobile operating system.