Hello! I'm starting a project that requires a video camera to send simple image data to a microcontroller for processing. The camera doesn't need to do any fancy processing (image tracking, etc.); it just needs to send the images. The first thing the microcontroller needs to do is convert the image data from the camera into something that represents the image as a collection of pixels. (I don't need color information, just greyscale.) For example, it could get an image from the camera and convert it into an array of integers that represent the darkness/brightness of each pixel. I did a little research, and it looks like a CMOS camera module is what I need. I checked out the AVRCam and other cameras that have microcontrollers already on board, but they seem to do more than I need, and it'd be nice to do something cheaper.
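To make that first step concrete, here's a rough sketch of the kind of conversion I have in mind, assuming a module (an OV7670-style sensor, say) that outputs YUV422, where every other byte is already a greyscale luma value. The buffer names and frame size are just placeholders, not from any particular module:

```c
#include <stdint.h>

#define FRAME_W 160   /* placeholder frame size, roughly QQVGA */
#define FRAME_H 120

/* raw_frame holds interleaved Y U Y V bytes as captured from the bus.
 * Note: a full buffer like this is ~38 KB, more RAM than most 8-bit
 * parts have, so a real design might process the frame line by line. */
static uint8_t raw_frame[FRAME_W * FRAME_H * 2];
static uint8_t grey[FRAME_W * FRAME_H];

static void frame_to_greyscale(void)
{
    for (uint32_t i = 0; i < (uint32_t)FRAME_W * FRAME_H; i++) {
        grey[i] = raw_frame[2 * i];   /* keep Y (luma), skip U/V chroma */
    }
}
```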
My question is: where can I find a good introduction or tutorial about connecting a CMOS camera module to a microcontroller and getting the microcontroller to understand the data from the camera? Also, what's the best type of microcontroller for this job? I've been reading robotics books, so I know the basics of how to program a microcontroller, but I've never actually used one. I do have a lot of programming experience, though. Ideally I'd like to program in C, but I'm willing to learn a new language if it makes things easier on the hardware end. I'm also open to a different kind of camera besides CMOS; that just seemed like the best option from what I know. I've spent a lot of time Googling for this stuff, but haven't found a nice, basic introduction.
Edit: I'm putting more info about the requirements here.
--All the processing the microcontroller needs to do is convert frames into 16-shade grayscale and output them as a list of 4-bit pixels, with some sort of signature to mark the beginning and end of a frame (see the sketch after this list).
--I'd like to get 20 fps, but as long as it doesn't drop below 10 it'll be satisfactory.
--The microcontroller can grab frames at an inconsistent rate; if the fps swings up and down while it's running, that's not a problem.
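To illustrate the output format from the first bullet above, here's a rough sketch of the 4-bit packing and frame markers I mean. The marker values and the send_byte() routine are made up for illustration (it'd probably be a blocking UART write in practice), and a single marker byte can collide with pixel data, so a real protocol would want a longer signature or an escaping scheme:

```c
#include <stdint.h>

#define FRAME_START 0xA5   /* hypothetical start-of-frame marker */
#define FRAME_END   0x5A   /* hypothetical end-of-frame marker */

extern void send_byte(uint8_t b);   /* placeholder output, e.g. UART write */

/* Quantize 8-bit greyscale down to 16 shades and pack two pixels
 * per output byte. Assumes npixels is even. */
static void send_frame_4bit(const uint8_t *grey, uint32_t npixels)
{
    send_byte(FRAME_START);
    for (uint32_t i = 0; i + 1 < npixels; i += 2) {
        uint8_t hi = grey[i] >> 4;        /* 256 levels -> 16 shades */
        uint8_t lo = grey[i + 1] >> 4;
        send_byte((uint8_t)((hi << 4) | lo));
    }
    send_byte(FRAME_END);
}
```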
Thanks in advance!