Nope. Two things are missing: the loss in dB that can be tolerated before the signal looks like noise, and how the signal rolls off with frequency. With only the given data, I'd have to say anything less than 10 ms is approaching dicey detection.
If the transducer has a first-order rolloff, then the rise time for a step change would be about 1.6 ms (reaching 0.632 of the final value after one time constant).
Five time constants (8 ms) would give a response within 1% of the final value.
But as noted, that's only if the transducer has a first-order rolloff.
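For reference, here's a quick sketch of that first-order step response, using the 1.6 ms time constant assumed above (the function name and parameter are just for illustration):

```python
import math

def first_order_step(t, tau=1.6e-3):
    """Fraction of final value reached at time t for a first-order
    (single time constant) system: y(t) = 1 - exp(-t/tau)."""
    return 1.0 - math.exp(-t / tau)

# One time constant -> ~63.2% of final value
print(round(first_order_step(1.6e-3), 3))  # 0.632

# Five time constants -> within 1% of final value
print(round(first_order_step(8.0e-3), 4))  # 0.9933
```

Again, this only applies if the rolloff really is first-order; a higher-order response would behave differently.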