Questions: chip selection and its consequences

Thread Starter

taylor_totter

Joined Jul 1, 2022
1
Hello! I'm a programmer by trade looking to get into embedded programming. I have a product idea that I've completely developed and tested in software on my computer, but it still needs to be put onto a chip. The problem is that I have little to no experience with embedded stuff (I bet you haven't heard that one before, huh?). I have a number of general questions about chips, embedded programming, and I guess life in general that I feel need to be answered by another actual human who does this kind of stuff all the time, if you'll indulge me.

For some background, my programming experience is mostly in high performance desktop software. I've made a bunch of little projects with Arduinos, a bunch of small circuitry projects, and done a little bare-metal programming on my old rpi, but that's the extent of my embedded experience. The project that I have in mind belongs in the video processing category, and for now I'm only looking to sell maybe 100 or so units.

My questions:

1. I've been searching around and researching for a couple of days now, and it seems like people really REALLY like the i.MX series of chips. I was wondering if this is sort of like how, if you were to read Hacker News every day, you would be under the impression that all new code in the year 2022 is being written in Rust. By which I mean, are comments and recommendations from people online really indicative of the state of microprocessor programming today? I don't have any first-hand experience in this stuff myself, so I'm going off of what people on the internet are saying. I'd like to know how many grains of salt to take with their advice.

2. One of the strategies I've been using in my research has been to think "what devices in my life do I think would have the computational ability to do this task?" and researching those. And what I've found is that chip manufacturers seem to make many chips with a very specific purpose in mind. For example, the Qualcomm chips in my phone *can* be purchased, but they don't seem to bother letting you boot anything other than Android. Similarly, I have a page open on TI's website about some ARM chip of theirs and their diagram *says* that it has a 3D GPU in it but I can't find any information on *what* GPU it is, except for the blurb saying the chip features "Gesture Recognition" and "Vision Analytics." Is it fair to say that these chips largely have a specific use case and specific customer in mind when they're made, and aren't made for people like me?

3. All of the microprocessors I've been looking at seem to make the assumption that I want to install Android or Linux on the chip. And that I'd be silly for wanting to program their chips without them. Every single chip that I've found that's cheap and powerful enough has the same story: the chip comes with a pre-built copy of Linux (which I can hopefully get the source for), and a proprietary userspace blob that I have to use to interface with the GPU, using OpenCL or OpenGL or something. I guess what I was really wanting was something like an rpi, but more powerful. You can pretty easily write code for the rpi without an OS, and the QPU on the rpi is documented to the point where you can write code directly for the rpi's GPU without having to rely on weird buggy unoptimizable drivers. Does something like this exist? I've seen people online asking a similar question and the answer seems to always be the same: just suck it up and use Linux. Are those people right?

4. More on Linux: I'm used to desktop and server Linux, and have zero knowledge of embedded Linux. Based on my experience, my opinion of Linux is that it's very bulky, has weird crazy init processes, has daemons running in the background all the time, takes forever to boot, and every time you want to do something with it you end up with tons and tons of duct tape. Is this a fair characterization of the embedded Linuxes that these chips' manufacturers want me to use? One of my main qualms about technologies like deep learning is that once you adopt them, you will spend the rest of the lifespan of that project fiddling with the network. In other words, when some problems go to Vegas, they stay in Vegas, and will never ever leave Vegas to become solved problems or easier problems. Is embedded Linux like Vegas? Will I spend an inordinate amount of time configuring Linux?

5. Any other general advice? How useful are development/evaluation boards? How much should I be looking for pre-made boards with all of the components I need vs. getting the chips and boards fabbed separately? If I get them made separately, is it likely that I won't even have enough documentation about the chips to install them on the boards correctly? Are there any chips out there with fantastic documentation under the $50-$70 mark? Are there any other communities that would be willing to give straight-up answers to a n00b like me? Any other common gotchas?

Thanks a ton! I know this is a bit of a wall of text, but it would be a huge help to have someone with real-world experience answer these. I can only read product specifications for "IoT enabled" chips for so long before I start seeing the matrix :p
 

nsaspook

Joined Aug 27, 2009
13,079
Embedded programming covers a wide range of hardware and software capabilities. Video processing is usually a high-end signal-processing task consisting of several high-performance task blocks running in synchronization with a scheduler (an RTOS, etc.), plus hardware with specialized DSP systems, because that's easier to develop and program. You need to be more specific.
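
To make that concrete, here's a minimal sketch (plain C, every name made up for illustration) of what "task blocks running in synchronization" can look like: a capture stage, a processing stage, and an output stage running in lock-step, one pass per frame. On real hardware these stages would hang off an RTOS scheduler, DMA interrupts, or a dedicated DSP rather than a bare loop.

```c
/* Sketch only: three task blocks run in lock-step, one pass per frame.
 * capture/process/output are placeholders, not a real API. */
#include <stdint.h>
#include <string.h>

#define W 320
#define H 240

typedef struct { uint8_t px[W * H]; } frame_t;

/* Stage 1: pretend capture (real firmware would pull a frame from a sensor via DMA). */
static void capture(frame_t *f) { memset(f->px, 0x80, sizeof f->px); }

/* Stage 2: the DSP-heavy block; a dumb threshold stands in for the real processing. */
static void process(frame_t *f)
{
    for (size_t i = 0; i < sizeof f->px; i++)
        f->px[i] = (f->px[i] > 0x7F) ? 0xFF : 0x00;
}

/* Stage 3: hand the result to an encoder, display, or network link. */
static void output(const frame_t *f) { (void)f; }

int main(void)
{
    static frame_t frame;
    for (int n = 0; n < 100; n++) {  /* one iteration per frame */
        capture(&frame);
        process(&frame);
        output(&frame);
    }
    return 0;
}
```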

Rust is a computer morality language looking for a killer application. It's not there yet.
C: You have a hammer and 5 nails. If you need any other tools, you'll have to make them yourself, and you have all of the raw materials to do so. It's a lot of work, but some of the biggest and nicest mansions were built this way.

C++: You have every tool ever invented. Some are amazing, and some are useless. Sometimes the good ones are stuck to the useless ones. You have a dumpster but you have to call different companies to pick up different types of trash. You occasionally call the wrong company, so some of the trash never gets picked up.

Rust: You have a pretty good selection of tools. Unfortunately, all of the sharp ones are locked up in a chest somewhere. You can use them, but everyone in the neighborhood will scream at you if you do. It's possible to finish the house without them if you're skilled enough, but it won't look quite like you originally intended.
 

BobTPH

Joined Jun 5, 2013
8,807
I have several questions.

1. Why is this a hardware project? Why not run the software on a standard PC?

2. What makes an Rpi insufficient for this project?

3. Have you considered a single-board computer using either an Intel or an ARM chip?

4. I find it hard to believe it would be cost-effective to make your own boards for a quantity of 100. Why do you think it is?
 

MrChips

Joined Oct 2, 2009
30,707
Welcome to AAC!

Your MCU selection will be hugely dependent on your specific application.
We don't know what video processing application you have in mind and the requirements for connectivity, file storage, real-time processing, RTOS, etc. Video processing may also require additional support hardware such as image capture and graphics processing.

In general, if you are new to embedded systems programming, you want to start at a basic level and work your way up. You can start with bare-bones MCU programming, Arduino, and the Raspberry Pi. This will introduce you to the basics before moving to a more advanced platform.

It is obvious that eventually you will need processing power. I am running ARM chips at 200 MHz and higher. My MCUs of choice are the STMicroelectronics STM32 family. The compiler is free and you will be programming bare metal in the C programming language. I have no need for an RTOS because it gets in the way of what I want to accomplish. In essence, I write my own RTOS.
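
For flavour, "writing your own RTOS" on a bare-metal MCU often boils down to a cooperative super-loop like the sketch below. This is only an illustration under assumptions: tick_ms would be incremented by a 1 kHz timer interrupt (SysTick on an STM32), and the task bodies are empty placeholders.

```c
/* Sketch of a bare-metal cooperative "RTOS": each task runs at its own
 * period off a millisecond tick. Assumes tick_ms is bumped by a 1 kHz
 * timer ISR; the task bodies here are placeholders. */
#include <stdint.h>

volatile uint32_t tick_ms;                 /* incremented from the timer ISR */

typedef struct {
    void (*run)(void);                     /* task body, must return quickly */
    uint32_t period_ms;                    /* how often it should run        */
    uint32_t next_ms;                      /* next time it is due            */
} task_t;

static void poll_buttons(void)   { /* read inputs */ }
static void update_display(void) { /* refresh UI  */ }
static void blink_led(void)      { /* heartbeat   */ }

static task_t tasks[] = {
    { poll_buttons,    10, 0 },
    { update_display,  50, 0 },
    { blink_led,      500, 0 },
};

int main(void)
{
    for (;;) {                             /* the super-loop */
        uint32_t now = tick_ms;
        for (unsigned i = 0; i < sizeof tasks / sizeof tasks[0]; i++) {
            if ((int32_t)(now - tasks[i].next_ms) >= 0) {   /* wrap-safe compare */
                tasks[i].run();
                tasks[i].next_ms = now + tasks[i].period_ms;
            }
        }
    }
}
```

Once a design needs preemption or has to block on several event sources at once, that's usually the point where a real RTOS (FreeRTOS, Zephyr, etc.) starts paying for itself.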
 

RayB

Joined Apr 3, 2011
31
I'm a programmer by trade looking to get into embedded programming.
For some background, my programming experience is mostly in high performance desktop software. I've made a bunch of little projects with Arduinos, a bunch of small circuitry projects, and done a little bare-metal programming on my old rpi, but that's the extent of my embedded experience
IMO: You know too much about programming and are fretting about going down a muddy road and finding the bridge is out. You've done Arduino. You've done rPi. You're competent in PC programming. So, you are confused.

Chill.

Do your (written) requirements.
Do your power budget (batteries? AC? Need to sleep the uC?); there's a back-of-envelope sketch after this list.
Concentrate on your core strength in languages (once complete, the owner has no language concerns.)
I would stay away from Python unless this is a core competency. Same for Rust.
I would avoid manufacturers' eval boards, as the extra hardware onboard eats into your flexibility (ports, pinouts).
Pick a uC or SoC platform with "free" tools. Were the chips impacted by the COVID supply mess?
Are the chips inexpensive 'nuff to smoke a few? (You will.)
Is the uC multicore and supported by a "free" RTOS?

Prototype against a written multi-stage expectation plan. Avoid getting hung-up on feature creep.
Stay focused. Design modular. Test often. Design for substitution of external peripherals.
Check/double-check/check-again public software licenses.
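
On the power-budget point above, the arithmetic is simple enough to sanity-check in a few lines. Every number below is a made-up placeholder; swap in your own datasheet and measured values.

```c
/* Back-of-envelope battery life estimate. All numbers are placeholders. */
#include <stdio.h>

int main(void)
{
    double battery_mah = 2000.0;   /* usable cell capacity               */
    double active_ma   = 120.0;    /* current while the SoC is busy      */
    double sleep_ma    = 0.5;      /* deep-sleep current                 */
    double duty_cycle  = 0.25;     /* fraction of time spent active      */

    double avg_ma = active_ma * duty_cycle + sleep_ma * (1.0 - duty_cycle);
    double hours  = battery_mah / avg_ma;

    printf("average draw %.1f mA -> roughly %.1f hours per charge\n",
           avg_ma, hours);
    return 0;
}
```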

Good luck.
 