On octidi 8 Prairial, year CCXXV, email@example.com wrote:
A lot of Europe does it, and it is wrong! It goes back quite a while:
it was once fashionable to use a dot (.) as a symbol for multiplication.
So much of Europe stopped using a dot to signal a decimal point, to
avoid ambiguity (they should instead have stopped using a dot as a
symbol for multiplication). In the U.S. and G.B. an × was used as the
multiplication symbol, so they continued using a dot for the decimal
(as it should be).
What glyph is used as a separator does not really matter. What really
matters is that it is convenient and that everybody uses the same one.
We could have settled on a heart-shaped symbol and it would have worked.
In this matter, considerations such as "preserving local cultures" are
irrelevant. It is a matter of communication, and even of (slightly)
technical communication. Convenience and lack of ambiguity are
paramount. Hence the "everybody uses the same" condition.
Convenience sets a few rules. The most important of these is: the
decimal separator, which has a semantic role, must be much more visible
than the thousand separator, which has only an aesthetic role. Thus,
dot for decimal and comma for thousands is stupid.
I suggest applying the following rules, whenever you are free to choose:
- Be liberal in what you accept: understand both dots and commas, and do
not start a pedantic rant if you get a text with the "wrong" one.
- In "casual" computerized text, especially monospace, use dot for
decimal and no thousand separator.
- In typeset text, use dot for decimal and a thin space for thousands
(possibly: only if the range of the numbers exceeds 9999, i.e. no
thousand separator for years, for example).
- In hand-written text, the visibility of the dot is not reliable
enough; use a comma for decimal, and a small space for thousands.
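The "be liberal in what you accept" rule can be sketched as a small
parser. The heuristics below are my own assumptions, not something the
post specifies: when both separators appear, the rightmost one is taken
as the decimal point; a single lone comma is read as a decimal comma;
ordinary and thin spaces are stripped as thousand separators.

```python
def parse_number(text: str) -> float:
    """Liberal numeric parser: accepts '.' or ',' as decimal separator.

    Heuristics (assumptions for illustration):
    - spaces and thin spaces are thousand separators, dropped outright;
    - if both '.' and ',' appear, the rightmost is the decimal separator
      and the other is a thousand separator;
    - a single lone comma is a decimal comma ("3,14" -> 3.14);
    - repeated dots or commas are thousand separators.
    A lone "1.234" or a lone "1,234" with three trailing digits stays
    ambiguous; this sketch simply picks one reading.
    """
    s = text.strip().replace("\u2009", "").replace(" ", "")
    if "." in s and "," in s:
        if s.rfind(",") > s.rfind("."):
            # "1.234,5": dots are thousands, comma is the decimal point
            s = s.replace(".", "").replace(",", ".")
        else:
            # "1,234.5": commas are thousands
            s = s.replace(",", "")
    elif "," in s:
        if s.count(",") == 1:
            s = s.replace(",", ".")   # decimal comma
        else:
            s = s.replace(",", "")    # repeated commas: thousands
    elif s.count(".") > 1:
        s = s.replace(".", "")        # repeated dots: thousands
    return float(s)
```

Accepting both notations on input is cheap; the strictness can then be
reserved for output, where one of the conventions above applies.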