newbie in programming

MrSalts

Joined Apr 2, 2020
2,767
It's odd how C and Unix caught on though
Yup, very odd that the fastest, most versatile tools get used to build the most useful systems. Some less skilled people may think the language and OS should be more user-friendly, but it's odd how user-friendliness cuts into performance.

Also, if one bothered to read the manual, one would see that many Unix commands do have alternative names, or you can create them yourself. But by the time you get that done, you'll have remembered the original command.
 

nsaspook

Joined Aug 27, 2009
13,265
Try learning Python. By far the most human-friendly language I've tried. If you haven't already, try a Linux operating system as well. The way everything is set up is much different from Windows-based systems and languages: more secure and more user-friendly. Python is deeply integrated into Linux, as are some other languages you may come across, like Java and Perl. The command line of Linux is essentially a language of its own.
Python teaches bad programming practices, so be careful. IMO Python is a bad language for a beginner. It's an OK scripting language if you already understand structured programming as a method, not tied to any particular language. Being incredibly easy for beginners and newcomers to use and learn means there are a hell of a lot of bad programming habits being created today in Python programmers.

Python is a trap for the uninitiated.
https://en.wikipedia.org/wiki/Duck_typing
https://www.codeproject.com/Tips/1095195/The-Dangers-of-Duck-Typed-Languages

https://medium.com/nerd-for-tech/python-is-a-bad-programming-language-2ab73b0bda5

Strongly typed languages are a pain, but they build the foundation needed for good embedded programming practices. You will need this expertise when there is no driver for a new device and you have to write your own driver from spec-sheet register definitions, plus a custom interface to an API so that Python programmers can use the new device.
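To give a flavour of what that looks like, here is a minimal sketch of spec-sheet register definitions in C. The device, its addresses, and its bit positions are all invented for illustration; real values come from the data sheet.

Code:
#include <stdint.h>

/* Hypothetical device: base address, register offsets, and bit masks
   transcribed from an imaginary spec sheet, purely for illustration. */
#define DEV_BASE     0x40001000u
#define DEV_CTRL     (*(volatile uint32_t *)(DEV_BASE + 0x00u))
#define DEV_STATUS   (*(volatile uint32_t *)(DEV_BASE + 0x04u))
#define DEV_DATA     (*(volatile uint32_t *)(DEV_BASE + 0x08u))

#define CTRL_ENABLE  (1u << 0)   /* bit 0: enable the device */
#define STATUS_READY (1u << 1)   /* bit 1: a sample is ready */

/* Enable the device, busy-wait for a sample, then read it. */
static uint32_t dev_read_blocking(void)
{
    DEV_CTRL |= CTRL_ENABLE;
    while (!(DEV_STATUS & STATUS_READY))
        ;                        /* spin until the hardware is ready */
    return DEV_DATA;
}

(It compiles anywhere, but of course it only does something on hardware that actually decodes those addresses.)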

Writing Python code with a C mindset is a good programming approach.
 
Last edited:

k1ng 1337

Joined Sep 11, 2020
960
Python teaches bad programming practices, so be careful. It's an OK scripting language if you already understand structured programming as a method, not tied to any particular language. Being incredibly easy for beginners and newcomers to use and learn means there are a hell of a lot of bad programming habits being created today in Python programmers.

Python is a trap for the uninitiated.
https://en.wikipedia.org/wiki/Duck_typing
https://www.codeproject.com/Tips/1095195/The-Dangers-of-Duck-Typed-Languages

https://medium.com/nerd-for-tech/python-is-a-bad-programming-language-2ab73b0bda5
What do you recommend? I like it because it is powerful, at least for what I'm interested in. The readability makes it more appealing as I don't care for languages with all kinds of special characters and indentations.
 

nsaspook

Joined Aug 27, 2009
13,265
What do you recommend? I like it because it is powerful, at least for what I'm interested in. The readability makes it more appealing as I don't care for languages with all kinds of special characters and indentations.
It depends on the type of programming you want to do. It's not Python in general that's the issue; it's thinking you understand structured programming, data structures, concurrent programming issues, etc., simply because you can write Python programs that only crash 10% of the time. If you plan on hardware-specific embedded programming, then C IMO is a must-have skill, and it should be combined with computer science knowledge of the theory of basic computing, not computer programming specifics.
 
Last edited:

nsaspook

Joined Aug 27, 2009
13,265
Tried. Only use it when I have to. The dependency on whitespace crimps my style.
The syntax rules in Python were obviously designed for small interpreted scripts, not for structured programming using a compiler with proper tokens, such as braces, to delimit scope.
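For contrast, a trivial C fragment: the braces alone delimit scope, so the compiler is indifferent to indentation.

Code:
#include <stdio.h>

int main(void)
{
    for (int i = 0; i < 3; i++) {   /* the braces delimit the loop body */
        printf("%d\n", i);
    }                               /* indentation is purely cosmetic   */
    return 0;
}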
 

xox

Joined Sep 8, 2017
838
I never developed a fascination with Unix myself. It was certainly prominent even back then, but it always appeared sloppy, a glorified hack job. Of course, there were few options for operating systems back in the '70s and '80s, so it was useful to have the choice.


There were (are) umpteen varieties of it too, this variant and that variant, each with its own idiosyncratic bent. Uniformity was never a big feature that I could see, and the GUI was a bolt-on (well, there were several).


Unix grew out of the earlier (and much more impressive) Multics project (in fact, the name Unix is a play on the name Multics). Multics was the first OS written in a high-level language (PL/I) but was never commercially successful, and it was large too, I guess.


Something I hated (to this day, in fact) is the obsession with abbreviated names, to which Unix and C seemed to attach huge importance, like it was a fashion; many systems I saw that used or ran on Unix followed that lead, cryptic short names everywhere.


Consider printf or sprintf or strcat, for example. I am fine with abbreviations, but they could have been optional, with a more meaningful name as the true name.


In Multics, for example, commands were very readable, and a user had an optional abbreviations file that was applied when they logged in; that way they could use (or define their own) abbreviations.


For example, to copy a file in Multics: copy_file; to start a process running: start_process. Many people would use cf and sp for these, but the documentation always used the full names, and the full names were pretty much self-explanatory.


It's odd how C and Unix caught on, though. I think this was because many colleges and universities had such systems, since they came at little or no cost. Back then, IBM, DEC, etc. likely charged a leasing fee for their OS and language compilers!


C was always burdensome in that it has a very limited number of data types; even strings are not first-class types in the language, which I thought was just too much minimalism. The C grammar has (IMHO) plagued languages ever since; to this day there are characteristics in Java and C# that exist only because of the prominence given to the C language grammar.

Consider this: Unix was probably the first "semantically secure" OS. Contrast that with, say, Windows 95, which offered no such guarantees. In fact, at the time you could say that Unix was the most powerful operating system in history; it allowed you to do things that other OSes simply couldn't. Well, in large part it was developed for the purpose of switching telephony packets in real time, so go figure.

As to C's rather crude features, bear in mind that it was first and foremost a systems-software kind of language. What it does do, it does very well. Sure, there are many higher-level abstractions which would have been nice. But it wasn't really until many years later that language designers started incorporating things like mark-and-sweep garbage collectors or, in the case of C++, deterministic constructors/destructors (RAII, "Resource Acquisition Is Initialization"), which made things like native text strings practical.

By the way, if you don't like the name of a standard C function, you could always #define another symbol as an alias, or, say, declare a function pointer with a more suitable name. That said, functions which do not do bounds checking should be avoided like the plague. That includes strcpy, strcat, gets, etc. Always use the bounded versions (i.e. strncpy, strncat, fgets).
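A minimal sketch of both ideas; the long-form names here are made up for illustration.

Code:
#include <stdio.h>
#include <string.h>

/* More readable aliases for standard functions, if that's your taste. */
#define string_concatenate strncat
static int (*print_formatted)(const char *, ...) = printf;

int main(void)
{
    char buf[32] = "Hello";

    /* Always the bounded version (strncat), never strcat. */
    string_concatenate(buf, ", world", sizeof buf - strlen(buf) - 1);
    print_formatted("%s\n", buf);
    return 0;
}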
 

ApacheKid

Joined Jan 12, 2015
1,609
Consider this: Unix was probably the first "semantically secure" OS. Contrast that with, say, Windows 95, which offered no such guarantees. In fact, at the time you could say that Unix was the most powerful operating system in history; it allowed you to do things that other OSes simply couldn't. Well, in large part it was developed for the purpose of switching telephony packets in real time, so go figure.

As to C's rather crude features, bear in mind that it was first and foremost a systems-software kind of language. What it does do, it does very well. Sure, there are many higher-level abstractions which would have been nice. But it wasn't really until many years later that language designers started incorporating things like mark-and-sweep garbage collectors or, in the case of C++, deterministic constructors/destructors (RAII, "Resource Acquisition Is Initialization"), which made things like native text strings practical.

By the way, if you don't like the name of a standard C function, you could always #define another symbol as an alias, or, say, declare a function pointer with a more suitable name. That said, functions which do not do bounds checking should be avoided like the plague. That includes strcpy, strcat, gets, etc. Always use the bounded versions (i.e. strncpy, strncat, fgets).
Well, yes, there is some truth in what you say. But as for "first" for this and that, I don't think I agree. Multics was the ancestor of Unix and was a strong OS. It had security designed in from the outset. Unlike Unix, Multics was from the outset aimed at multi-user applications; in fact, it was designed as a "utility" OS where timesharing and bureau use were target goals, so security was important.

The USAF were important technical contributors to the Multics security system; this resulted in the original B2 security spec and its implementation. Unix (being built using C) was plagued by the infamous buffer-overrun problem, hard to avoid in C code yet hard to encounter in PL/I code.

PL/I was without doubt a better language than C, but its compilers were challenging, especially the optimizers. This was because PL/I was the first language to handle (what we now call) "exceptions", where exception-handling code could be invoked in ways that are not detectable with static analysis. PL/I also had the (perhaps to this day unique) absence of reserved words: keywords could be used as identifiers, and the grammar handled it all.

It might also be true (I'd need to check my notes) that PL/I was the first high-level language to include a pointer data type. Pointer arithmetic is not possible in PL/I (it isn't needed), so the kinds of problems we see routinely in C code do not exist in PL/I; this meant that Multics itself was a very robust OS.

I personally regard C as an amateur language. What I mean by that is that many aspects of it seem to reflect compiler simplicity, to the extent that bugs and code complexity are tolerated in return for being able to write simple, basic compilers. I feel C has been a huge handicap for the industry. Had Digital Research had their act together, things might have been different, because CP/M was a strong OS for 8-bit microprocessors, and although the IBM PC was sold with both CP/M and MS-DOS as options, DOS was cheaper, Microsoft marketed it aggressively, and Microsoft had more IBM support than Digital Research did.

CP/M was written in PL/M (a systems language derived from PL/I), and therein lay its strength. Alas, it was not to be; the kludge MS-DOS dominated PCs for decades, relegating CP/M to specialized and elite applications, and it never really got traction.

I wrote a PL/I compiler in C, so I know both languages very well indeed; the code is here, in fact. That was a serious implementation that produced 32-bit COFF executables which ran on Windows NT.

Given a choice between C and PL/I and access to good-quality compilers, most engineers would likely prefer PL/I and would likely appreciate its advantages over C. Building an entire OS in C is common these days, but that doesn't mean C is well suited to engineering work.

(An example of PL/I having good engineering features is its support for the "bit" data type. One could declare a bit(12), for example, and could specify aligned or unaligned: unaligned meant all the bits were packed as expected, whereas aligned meant each bit began on a word boundary. This would be a huge help in OS code or MCU code.)
 
Last edited:

nsaspook

Joined Aug 27, 2009
13,265
C is a language that makes amateur programmers look bad because IMO they don't really understand computer hardware architecture, digital electronics, complex hardware systems, and fundamental defensive structured programming. The C language is neutral (it doesn't get in your way or promote X way of programming) and natural for hardware hackers (who love the dangerous low-level details some would call tedious work) who also need to code. Compiler simplicity and low levels of software abstraction are angels to those who need to do low-level debugging of new and unique hardware systems. C is here and still strong today because hardware engineers who also write code design computers that are easily compatible with C compilers. Direct, efficient pointers to "bits" require CISC-type hardware support that's out of fashion today, thankfully. Structured bit fields in fundamental data types are a lot less error-prone and usually more efficient for actual hardware interfacing.
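For example, a bit-field sketch of a hypothetical status register. Field names and widths are invented; real layouts come from the spec sheet, and bit-field ordering is compiler/ABI dependent, so this style belongs behind one well-tested header.

Code:
#include <stdio.h>

/* Bit-field view of an invented 16-bit status register. */
struct status_reg {
    unsigned int ready    : 1;   /* device has data      */
    unsigned int error    : 1;   /* last transfer failed */
    unsigned int channel  : 3;   /* active channel, 0..7 */
    unsigned int reserved : 11;
};

int main(void)
{
    struct status_reg s = { .ready = 1, .channel = 5 };
    printf("ready=%u channel=%u\n", (unsigned)s.ready, (unsigned)s.channel);
    return 0;
}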
 
Last edited:

ApacheKid

Joined Jan 12, 2015
1,609
C is a language that makes amateur programmers look bad because IMO they don't really understand computer hardware architecture, digital electronics, complex hardware systems, and fundamental defensive structured programming. The C language is neutral (it doesn't get in your way or promote X way of programming) and natural for hardware hackers (who love the dangerous low-level details some would call tedious work) who also need to code. Compiler simplicity and low levels of software abstraction are angels to those who need to do low-level debugging of new and unique hardware systems. C is here and still strong today because hardware engineers who also write code design computers that are easily compatible with C compilers. Direct, efficient pointers to "bits" require CISC-type hardware support that's out of fashion today, thankfully. Structured bit fields in fundamental data types are a lot less error-prone and usually more efficient for actual hardware interfacing.
Used with good discipline and standards, C is fine, but that level of discipline is high IMHO, higher than in many other languages. One must take steps to avoid certain practices that are easy or tempting in C and that can lead to frustrating bugs.

Strings are inherent to software; they are ever-present in many problems, and that's why many languages support them out of the box, but not C. That alone is a good reason to avoid the language IMHO. When I first learned C and found that I could not declare strings and could not compare strings using comparison operators, I almost dropped my coffee.

I know the language well and have developed some very large projects in C, much of it low-level system software. So with strong discipline it can work, and work well; I just find the required level of discipline too high these days.
 

dl324

Joined Mar 30, 2015
16,916
Strings are inherent to software; they are ever-present in many problems, and that's why many languages support them out of the box, but not C. That alone is a good reason to avoid the language IMHO. When I first learned C and found that I could not declare strings and could not compare strings using comparison operators, I almost dropped my coffee.
What's the big deal? A string is just a null-terminated sequence of bytes that happen to represent ASCII characters, and you just use the string functions to compare them.
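For example, a minimal sketch:

Code:
#include <stdio.h>
#include <string.h>

int main(void)
{
    const char *a = "apple";
    const char *b = "apple";

    /* a == b would compare pointers, not contents */
    if (strcmp(a, b) == 0)
        printf("same contents\n");

    if (strncmp(a, b, 3) == 0)   /* bounded comparison of the first 3 bytes */
        printf("first 3 bytes match\n");

    return 0;
}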
 

nsaspook

Joined Aug 27, 2009
13,265
Used with good discipline and standards, C is fine, but that level of discipline is high IMHO, higher than in many other languages. One must take steps to avoid certain practices that are easy or tempting in C and that can lead to frustrating bugs.

Strings are inherent to software; they are ever-present in many problems, and that's why many languages support them out of the box, but not C. That alone is a good reason to avoid the language IMHO. When I first learned C and found that I could not declare strings and could not compare strings using comparison operators, I almost dropped my coffee.

I know the language well and have developed some very large projects in C, much of it low-level system software. So with strong discipline it can work, and work well; I just find the required level of discipline too high these days.
It's not discipline; it's good structured programming, which should be used in all languages at all times.

Software bugs don't have a domain. You can shoot yourself in the foot with C, blow your leg off with C++, or blow up the Internet with language X.

Strings are a higher-level abstraction, and C is not an object-oriented language. The lack of formal language strings in C is a feature, not a weakness: C won't do anything "behind the scenes".
 
Last edited:

ericgibbs

Joined Jan 29, 2010
18,848
hi Guys,
The TS asked this question.
Do you have any advice to make learning programming easier for a newbie?

Not a discussion about programming semantics; please stay on topic.

Moderation.
 

ApacheKid

Joined Jan 12, 2015
1,609
I am a complete newbie in C programming. I am looking for some advice from the experts here.

I have installed Code::Blocks on my computer. I have also started reading books on the C language. I am still finding it very difficult to learn programming. I don't understand how to make learning programming easier.

Do you have any advice to make learning programming easier for a newbie?
Yes. A practice I adopted many years ago is to set yourself a goal, a real, deliverable, functioning end product, then work toward that using C (in your case, if that's the language you're pursuing).

This helps a lot because you start to use the language as a tool; even though you might use it poorly at first, you are actually using it to solve a real problem.

The problem can be anything that interests you, but it should avoid being too complex and should also avoid the need to use advanced C features too soon.

In addition, use tools that provide good debugging. Being able to step through code and see variables, stacks, and so on is a great thing that helps you learn a language.

Visual Studio Community Edition (free) is highly recommended for this; it's a very solid compiler, debugger, and so on, so for learning C at least, Visual Studio is a superb choice. It runs on Windows or Mac too.
 
Last edited:

xox

Joined Sep 8, 2017
838
Well, yes, there is some truth in what you say. But as for "first" for this and that, I don't think I agree. Multics was the ancestor of Unix and was a strong OS. It had security designed in from the outset. Unlike Unix, Multics was from the outset aimed at multi-user applications; in fact, it was designed as a "utility" OS where timesharing and bureau use were target goals, so security was important.


The USAF were important technical contributors to the Multics security system; this resulted in the original B2 security spec and its implementation. Unix (being built using C) was plagued by the infamous buffer-overrun problem, hard to avoid in C code yet hard to encounter in PL/I code.


PL/I was without doubt a better language than C, but its compilers were challenging, especially the optimizers. This was because PL/I was the first language to handle (what we now call) "exceptions", where exception-handling code could be invoked in ways that are not detectable with static analysis. PL/I also had the (perhaps to this day unique) absence of reserved words: keywords could be used as identifiers, and the grammar handled it all.


It might also be true (I'd need to check my notes) that PL/I was the first high-level language to include a pointer data type. Pointer arithmetic is not possible in PL/I (it isn't needed), so the kinds of problems we see routinely in C code do not exist in PL/I; this meant that Multics itself was a very robust OS.


I personally regard C as an amateur language. What I mean by that is that many aspects of it seem to reflect compiler simplicity, to the extent that bugs and code complexity are tolerated in return for being able to write simple, basic compilers. I feel C has been a huge handicap for the industry. Had Digital Research had their act together, things might have been different, because CP/M was a strong OS for 8-bit microprocessors, and although the IBM PC was sold with both CP/M and MS-DOS as options, DOS was cheaper, Microsoft marketed it aggressively, and Microsoft had more IBM support than Digital Research did.


CP/M was written in PL/M (a systems language derived from PL/I), and therein lay its strength. Alas, it was not to be; the kludge MS-DOS dominated PCs for decades, relegating CP/M to specialized and elite applications, and it never really got traction.


I wrote a PL/I compiler in C, so I know both languages very well indeed; the code is here, in fact. That was a serious implementation that produced 32-bit COFF executables which ran on Windows NT.


Given a choice between C and PL/I and access to good-quality compilers, most engineers would likely prefer PL/I and would likely appreciate its advantages over C. Building an entire OS in C is common these days, but that doesn't mean C is well suited to engineering work.


(An example of PL/I having good engineering features is its support for the "bit" data type. One could declare a bit(12), for example, and could specify aligned or unaligned: unaligned meant all the bits were packed as expected, whereas aligned meant each bit began on a word boundary. This would be a huge help in OS code or MCU code.)

PL/I does indeed sound like a very powerful language. But as you say, it was difficult to implement correctly. Implementing a C compiler, on the other hand, is a comparatively trivial task, which explains why C is so ubiquitous: it can be easily bootstrapped on just about any machine. Its ABI is also effectively standardized on each platform, making it fairly interoperable with other languages too. But yes, it is rather lacking in the "features" department, to say the least.

As to the original question, I would say practice, practice, practice. Challenge yourself to create a program that does something interesting. Hone it to perfection, even if that means deleting the entire program and starting over. But above all, have fun with it!
 

ApacheKid

Joined Jan 12, 2015
1,609
PL/I does indeed sound like a very powerful language. But as you say, it was difficult to implement correctly. Implementing a C compiler, on the other hand, is a comparatively trivial task, which explains why C is so ubiquitous: it can be easily bootstrapped on just about any machine. Its ABI is also effectively standardized on each platform, making it fairly interoperable with other languages too. But yes, it is rather lacking in the "features" department, to say the least.

As to the original question, I would say practice, practice, practice. Challenge yourself to create a program that does something interesting. Hone it to perfection, even if that means deleting the entire program and starting over. But above all, have fun with it!
I was a very poor programmer when I first started. When I began paid programming work, I'd struggle and take ages to do something relatively simple. This was because most of my education in science, math, electronics, and so on used disciplines different from those needed to write imperative code. It was a struggle for the first few years, and I eventually became very competent, but the thinking processes of the sciences are not well suited to imperative coding. This is why I am attracted to functional programming; it is extremely close to math.

If a mathematician sees:

A = A + 1

They will probably scream, and with good reason!
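In C, of course, = is assignment rather than equality, which is exactly the mental shift imperative code demands:

Code:
#include <stdio.h>

int main(void)
{
    int a = 5;

    /* Read as "store a + 1 back into a"; as an equation it has no solution. */
    a = a + 1;

    printf("%d\n", a);   /* prints 6 */
    return 0;
}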
 

djsfantasi

Joined Apr 11, 2010
9,163
I was a very poor programmer when I first started. When I began paid programming work, I'd struggle and take ages to do something relatively simple. This was because most of my education in science, math, electronics, and so on used disciplines different from those needed to write imperative code. It was a struggle for the first few years, and I eventually became very competent, but the thinking processes of the sciences are not well suited to imperative coding. This is why I am attracted to functional programming; it is extremely close to math.

If a mathematician sees:

A = A + 1

They will probably scream, and with good reason!
Interesting post!

I agree with you. I shake my head when someone says a background in math and the sciences helps with learning to program. I don't have the proof to dispute these statements, but instinctively I question them. In general, I feel that a background in the sciences is NOT helpful. Mathematics, maybe? Geometry, yes. Applied mathematics, yes. And my reasoning is that both start with a limited set of axioms upon which a rigorous process to establish theorems is built.

In programming, one might say the axioms are the available structures of the code, be they syntax or control templates, resulting in a hypothesis/program. If one accepts that a complex program is a collection of smaller programs, the analogy extends.
 