We’ve run into a snag processing camera images. The theory behind the cosmic ray detector is that the CCD in the phone camera is sensitive to radiation other than visible light. If I block the camera lens and then take pictures, I should get totally black images, except when a cosmic ray event has struck the camera sensor and caused “sparkling,” that is, a set of pixels to become illuminated.
On the G1 this works well, in that I can regularly get totally black images. On the Nexus One, however, I can never get a totally black image. Does this mean the CCD is very “noisy,” or that it is very sensitive? Am I being bombarded by cosmic ray events or background radiation (yes!), or is the CCD just flaky?
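One way to tease apart "noisy" from "sensitive" is to stack many lens-covered dark frames and look at each pixel's statistics: a hot or stuck pixel is bright in nearly every frame (high per-pixel mean), while thermal noise and genuine radiation hits vary from frame to frame (high per-pixel spread). This is only a sketch with simulated data, not the app's actual capture path; the frame shapes and the hot-pixel injection are made up for illustration.

```python
import numpy as np

def dark_frame_stats(frames):
    """frames: array of shape (n_frames, height, width), taken with the lens covered.

    Returns (per_pixel_mean, per_pixel_std). A high mean flags a hot/stuck
    pixel; a high std flags a pixel whose value fluctuates frame to frame.
    """
    stack = np.asarray(frames, dtype=np.float64)
    return stack.mean(axis=0), stack.std(axis=0)

# Toy example: 10 simulated 4x4 dark frames with one injected hot pixel.
rng = np.random.default_rng(0)
frames = rng.poisson(2.0, size=(10, 4, 4)).astype(np.float64)
frames[:, 1, 2] += 50.0                      # simulate a hot pixel at (1, 2)

mean, std = dark_frame_stats(frames)
# Median-based threshold so the hot pixel itself can't drag the cutoff up.
hot = np.argwhere(mean > np.median(mean) + 5.0 * np.median(std))
print(hot)  # → [[1 2]]
```

On a real sensor you would persist these statistics as a per-pixel "bad pixel map" and exclude those pixels before looking for events.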
For you math whizzes out there, you know, those of you with perfect 800s on your SATs: How do you write a digital filter that finds real events while rejecting benign electrical noise? Since a cosmic ray event is fairly low level, how do you make the distinction? What is its magnitude?
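One standard approach (offered here as a sketch, not the answer): subtract a master dark frame, estimate the noise floor robustly, and flag pixels more than k sigma above it. Using the median absolute deviation (MAD) rather than a plain standard deviation matters, because the events you are hunting for would otherwise inflate the noise estimate. The function name, the k=5 cutoff, and the toy data below are all my assumptions.

```python
import numpy as np

def detect_events(frame, master_dark, k=5.0):
    """Flag pixels more than k sigma above the sensor's own noise floor.

    frame, master_dark: 2-D arrays of raw pixel values.
    Returns the coordinates of candidate event pixels.
    """
    residual = frame.astype(np.float64) - master_dark
    # Robust noise estimate: the median absolute deviation resists
    # outliers (the events themselves) better than a plain std dev.
    mad = np.median(np.abs(residual - np.median(residual)))
    sigma = 1.4826 * mad                      # MAD -> sigma for Gaussian noise
    threshold = np.median(residual) + k * sigma
    return np.argwhere(residual > threshold)

# Toy example: flat dark frame plus unit Gaussian noise, one injected "hit".
rng = np.random.default_rng(1)
dark = np.full((8, 8), 10.0)
frame = dark + rng.normal(0.0, 1.0, size=(8, 8))
frame[3, 4] += 30.0                           # simulated cosmic-ray hit
events = detect_events(frame, dark)
print(events)  # → [[3 4]]
```

A refinement worth considering: real hits often light up a small cluster of adjacent pixels, so requiring two or more connected above-threshold pixels further suppresses single-pixel electrical noise.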
And then for the app developer: How do you write the filter — or place it in a processing pipeline — so that it is fast enough not to become a bottleneck to obtaining samples? That is, the filter must run in less time than it takes to collect a sample.
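One common pattern for keeping capture unblocked (sketched here with stand-in capture and filter functions, not the app's real ones): the capture loop only enqueues raw frames, and a separate worker thread runs the slower filter. A bounded queue gives you backpressure, so you can detect when the filter is falling behind rather than silently dropping frames.

```python
import queue
import threading

def capture_loop(frame_source, frame_queue, n_frames):
    """Grab frames and hand them off; never does any heavy processing."""
    for _ in range(n_frames):
        frame_queue.put(frame_source())   # enqueue and go grab the next frame
    frame_queue.put(None)                 # sentinel: no more frames

def filter_worker(frame_queue, results, event_filter):
    """Drain the queue, running the (slow) event filter on each frame."""
    while True:
        frame = frame_queue.get()
        if frame is None:
            break
        results.append(event_filter(frame))

# Toy run: "frames" are just the integers 0..4, the "filter" squares them.
frames = iter(range(5))
results = []
q = queue.Queue(maxsize=2)                # bounded: applies backpressure
worker = threading.Thread(target=filter_worker,
                          args=(q, results, lambda f: f * f))
worker.start()
capture_loop(lambda: next(frames), q, 5)
worker.join()
print(results)  # → [0, 1, 4, 9, 16]
```

On Android the same shape applies whether the worker is a Java thread, an AsyncTask, or native code: the key design choice is that the camera callback does nothing but enqueue.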
If you have ideas please feel free to attend or jump in to the discussion. Thanks.