Hello! I'm a programmer by trade looking to get into embedded programming. I have a product idea that I've completely developed and tested in software on my computer, but it still needs to be put onto a chip. The problem is that I have little to no experience with embedded stuff (I bet you haven't heard that one before, huh?). I have a number of general questions about chips, embedded programming, and I guess life in general that I feel need to be answered by another actual human who does this kind of stuff all the time, if you'll indulge me.
For some background, my programming experience is mostly in high performance desktop software. I've made a bunch of little projects with Arduinos, a bunch of small circuitry projects, and done a little bare-metal programming on my old rpi, but that's the extent of my embedded experience. The project that I have in mind belongs in the video processing category, and for now I'm only looking to sell maybe 100 or so units.
My questions:
1. I've been searching around and researching for a couple of days now, and it seems like people really REALLY like the i.MX series of chips. I was wondering if this is sort of like how, if you read Hacker News every day, you'd be under the impression that all new code in the year 2022 is being written in Rust. By which I mean, are comments and recommendations from people online really indicative of the state of microprocessor programming today? I don't have any first-hand experience in this stuff myself, so I'm going off of what people on the internet are saying. I'd like to know how many grains of salt to take with their advice.
2. One of the strategies I've been using in my research has been to think "what devices in my life would have the computational ability to do this task?" and then research those. And what I've found is that chip manufacturers seem to make many chips with a very specific purpose in mind. For example, the Qualcomm chips in my phone *can* be purchased, but they don't seem to bother letting you boot anything other than Android. Similarly, I have a page open on TI's website about some ARM chip of theirs, and their diagram *says* it has a 3D GPU, but I can't find any information on *what* GPU it is, except for a blurb saying the chip features "Gesture Recognition" and "Vision Analytics." Is it fair to say that these chips largely have a specific use case and specific customer in mind when they're made, and aren't made for people like me?
3. All of the microprocessors I've been looking at seem to assume that I want to install Android or Linux on the chip, and that I'd be silly for wanting to program them without an OS. Every chip I've found that's cheap and powerful enough has the same story: it comes with a pre-built copy of Linux (which I can hopefully get the source for) and a proprietary userspace blob that I have to use to interface with the GPU, via OpenCL or OpenGL or something. What I really want is something like an rpi, but more powerful. You can pretty easily write code for the rpi without an OS, and the QPU on the rpi is documented to the point where you can write code directly for its GPU without having to rely on weird, buggy, unoptimizable drivers (there's a small bare-metal sketch after my questions showing the kind of thing I mean). Does something like this exist? I've seen people online asking similar questions, and the answer always seems to be the same: just suck it up and use Linux. Are those people right?
4. More on Linux: I'm used to desktop and server Linux and have zero knowledge of embedded Linux. Based on that experience, my opinion of Linux is that it's very bulky, has weird crazy init processes, has daemons running in the background all the time, takes forever to boot, and every time you want to do something with it you end up with tons and tons of duct tape. Is this a fair characterization of the embedded Linuxes these chip manufacturers want me to use? One of my main qualms about technologies like deep learning is that once you adopt them, you spend the rest of the lifespan of the project fiddling with the network. In other words, when some problems go to Vegas, they stay in Vegas, and will never ever leave Vegas to become solved problems or easier problems. Is embedded Linux like Vegas? Will I spend an inordinate amount of time configuring Linux?
5. Any other general advice? How useful are development/evaluation boards? How much should I be looking at pre-made boards with all of the components I need vs. getting the chips and boards fabbed separately? If I get them made separately, is it likely that I won't even have enough documentation about the chips to install them on the boards correctly? Are there any chips out there with fantastic documentation under the $50-$70 mark? Are there any other communities that would be willing to give straight-up answers to a n00b like me? Any other common gotchas?
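To make question 3 concrete, here's roughly what I mean by writing code for the rpi "without an OS": blinking the ACT LED by poking the GPIO registers directly. This is just a sketch from memory, assuming an original BCM2835-based Pi (register addresses from that chip's peripherals datasheet) built as a bare kernel.img with a cross-compiler, so treat the details as approximate:

```c
/* Sketch: bare-metal ACT LED blink on an original (BCM2835) Raspberry Pi.
 * No Linux, no drivers; just memory-mapped GPIO registers. */

#include <stdint.h>

#define PERIPH_BASE 0x20000000u                  /* BCM2835 (Pi 1 / Zero era) */
#define GPIO_BASE   (PERIPH_BASE + 0x200000u)

#define GPFSEL1 ((volatile uint32_t *)(GPIO_BASE + 0x04)) /* function select, pins 10-19 */
#define GPSET0  ((volatile uint32_t *)(GPIO_BASE + 0x1C)) /* drive pin high */
#define GPCLR0  ((volatile uint32_t *)(GPIO_BASE + 0x28)) /* drive pin low */

#define ACT_LED 16                               /* ACT LED on the original Model B */

static void delay(volatile uint32_t n)
{
    while (n--) { __asm__ volatile("nop"); }
}

void kernel_main(void)
{
    /* Configure GPIO16 as an output: 3 bits per pin in GPFSEL1. */
    uint32_t sel = *GPFSEL1;
    sel &= ~(7u << ((ACT_LED - 10) * 3));
    sel |=  (1u << ((ACT_LED - 10) * 3));
    *GPFSEL1 = sel;

    for (;;) {
        *GPCLR0 = 1u << ACT_LED;                 /* LED on (active-low on this board) */
        delay(500000);
        *GPSET0 = 1u << ACT_LED;                 /* LED off */
        delay(500000);
    }
}
```

That level of control over the whole chip, GPU included, is what I'm hoping to find in something beefier.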
Thanks a ton! I know this is a bit of a wall of text, but it would be a huge help to have someone with real-world experience answer these. I can only read product specifications for "IoT enabled" chips for so long before I start seeing the matrix.