zero before decimal point

Thread Starter

kubeek

Joined Sep 20, 2005
5,794
Hi,

Can someone explain to me why the UK/US custom is to write numbers as .5 instead of 0.5?
I would think that the decimal point in such notation is quite easy to miss, so at a quick glance .5 could be mistaken for a 5.
So is it because people were simply too lazy to add a redundant zero in front of the point, or is there some other reason I am not aware of?
 

Brownout

Joined Jan 10, 2012
2,390
Maybe they expect users not to be lazy readers. My B&K Precision 'scope does not include a redundant leading zero on any of the panel figures.
 

Thread Starter

kubeek

Joined Sep 20, 2005
5,794
Well, maybe, but what about written text? I hope you can imagine just how easy it can be to mistake .5 for a 5, especially with bad eyesight.
 

Thread Starter

kubeek

Joined Sep 20, 2005
5,794
Exactly, so why not always write it as 0.5 or 5, so that the probability of error is much lower than 50/50?
 

atferrari

Joined Jan 6, 2004
4,764
Not actually an answer to your question, but locally, people working in electronics name a cap like that "point forty-seven".

In the lists that people bring when buying at the counter, it seems common practice to write the values as you say: .33, .47, .68.
 

djsfantasi

Joined Apr 11, 2010
9,156
I personally teach, and use in my own writing, the leading 0 where appropriate.

However, I don't think its lack of use is caused by laziness. It is more related to the fact that a leading zero is not used in other situations. Would you write 050 for fifty? Would you do so if you were adding five hundred and fifty: is it (500 + 50), or (500 + 050)? In fact, from this perspective a case could be made that 0.5 is as confusing to the eye as .5.

Children in the earlier grades especially get confused enough by decimal points without adding another 'special' case. Having said that, sometimes teaching a particular student to use or 'imagine' the leading zero aids their understanding.

:rolleyes: Disclaimer: I am not a certified teacher, but most of my family are. I have been in the classroom as a sub.

Just my 2 cents.
 

Thread Starter

kubeek

Joined Sep 20, 2005
5,794
I am aware of how it works nowadays, but I still can't get to the bottom of how these two different notations emerged.
My personal guess would be that it started as a difference between British and Continental notation, but I don't even know how I would google that to find out more. Any clues?
 

Thread Starter

kubeek

Joined Sep 20, 2005
5,794
Children in the earlier grades especially get confused enough by decimal points without adding another 'special' case. Having said that, sometimes teaching a particular student to use or 'imagine' the leading zero aids their understanding.
That is the point I saw somewhere when googling, and I don't get it either. Why would you need to imagine a leading zero? It is just there and always has been.

To me it is natural that 0.5 is somewhere between 0 and 1, and 1.3 is somewhere between 1 and 2. Now .5 seems like a special case where the 0 is omitted just "because", while all the other cases are written in full.
Edit: could that be the reason why kids get confused by decimal points, as you say?
 
Last edited:

djsfantasi

Joined Apr 11, 2010
9,156
{omitted}
To me it is natural that 0.5 is somewhere between 0 and 1, and 1.2 is somewhere between 1 and 2. Now .5 seems like a special case where the 0 is omitted just "because", while all the other cases are written in full.
That is my point: what is natural to you may be strange to others. I have seen frustrated people trying to understand why a number should EVER start with a zero. That numbers between 0 and 1 should be treated differently (from THEIR perspective) is confusing. Their perspective is that there are 'gazillions of numbers' and they just don't start with a zero. That is why I asked why someone couldn't write 050.
 

Thread Starter

kubeek

Joined Sep 20, 2005
5,794
I would say that 1.2 is one plus a bit more, and 0.2 is zero plus a bit more.
.2 being just a point plus a bit more doesn't seem right.

Maybe it could be that the way they are taught doesn't show zero as part of the number system? Maybe if it were properly explained that 0.2 is zero ones and two tenths...
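
Spelling that out in place value (just an illustration of the same point):

0.2 = (0 × 1) + (2 × 1/10)
1.2 = (1 × 1) + (2 × 1/10)

The leading 0 means "zero ones" in exactly the same way the 1 in 1.2 means "one one", so there is nothing special to imagine.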
 

djsfantasi

Joined Apr 11, 2010
9,156
And what is "050"?

Rhetorically speaking.

By the way, I DO agree with you. I just think there is another perspective. No judgement on whether it is correct or incorrect, just that it exists.
 

Thread Starter

kubeek

Joined Sep 20, 2005
5,794
To me, 050 can only be the result of subtracting 75 from 125 with the numbers written in rows; otherwise it is missing the decimal point. Here you can see the advantage: it is much harder to lose a zero to bad handwriting or poor eyesight than to lose a decimal point and turn .50 into 50.
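
Written out in columns, that subtraction looks like:

  125
-  75
-----
  050

where the leading 0 is just a leftover from the hundreds column, not something you would keep in the final written answer.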

I of course understand there is another perspective and that neither can be called more correct than the other. My original point was more about trying to find out how there came to be two standards, and what the reasons for the difference were.
 

#12

Joined Nov 30, 2010
18,224
I can testify that, clear back to the 1950s, the leading zero was not taught in American schools. That doesn't say anything about "why", only that it has been accepted practice for nearly 60 years.

I was a young nerd. I would have remembered. :D
 

#12

Joined Nov 30, 2010
18,224
Excellent (decimal) point, Max. :D Customs vary across the planet. Numbering and spelling conventions started centuries ago; the "why" of it may be lost in the mists of time.
 

Wendy

Joined Mar 24, 2008
23,415
Personally I use a leading zero to add clarity, but accepting that there are other standards, I don't obsess.

I don't think I could handle reading $1.000,00 as $1,000.00, though. With a Continental decimal comma, the same digits carry two different connotations.
 