I have a software application that plots waveforms defined by a sequence of floating point numbers (think of the old X Windows program xgraph and you'll get the idea).
I'd like to enhance this program with the ability to detect "significant events". Roughly, I'd define a "significant event" as a part of the waveform whose amplitude differs significantly from that of its neighboring points. For example, a waveform that is all zeros except for one point with an amplitude of 1 has a "significant event" where the 1 occurs.
I'd like to figure out an algorithm that could efficiently find these significant events. A naive approach that works well on "theoretical" waveforms is to subtract each point from its neighbor. This is a simple finite difference approximation to the derivative.
However, real-world signals contain noise, and as most of you know, differentiating a noisy signal amplifies that noise.
Can any of you suggest some algorithms that might be useful to find these significant events? I know very little about signal processing.
The application is written in Python, and the numerical processing uses NumPy, an array-processing library written in C. I would like to stick to NumPy functions, since a Python for loop is far too inefficient: the waveform data can involve millions of points.
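For reference, here is a minimal sketch of the naive differencing approach I described, plus a smoothed variant (moving-average then difference) that I suspect would be less noise-sensitive. The window size and the 5-sigma threshold are just placeholder values I made up:

```python
import numpy as np

# Synthetic test waveform: all zeros except a single spike at index 500
wave = np.zeros(1000)
wave[500] = 1.0

# Naive approach: first finite difference of the raw signal
raw_diff = np.diff(wave)

# Smoothed variant: moving-average filter first, then difference.
# np.convolve with a normalized box kernel does the averaging in C.
window = 5  # placeholder window size
kernel = np.ones(window) / window
smoothed = np.convolve(wave, kernel, mode="same")
smooth_diff = np.diff(smoothed)

# Flag "significant events" where the absolute difference exceeds
# a threshold; 5 standard deviations is an arbitrary placeholder.
threshold = 5 * np.std(smooth_diff)
events = np.nonzero(np.abs(smooth_diff) > threshold)[0]
```

This finds the rising and falling edges around the spike without any Python-level loop, but I don't know whether this kind of smoothing is the right tool, or how to choose the window and threshold in a principled way.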