I remember, 50 years ago, my math teacher spending a lesson on arithmetic precision: the importance of knowing how accurate your numbers are.
For example...
1.4 <= I am confident this number lies between 1.35 and 1.45
1.40 <= I am confident this number lies between 1.395 and 1.405
There is a difference, even if it's not always important.
In .NET, decimal variables know their precision. For example, in C# the code
float f = 1.40F;
Console.WriteLine(f);
decimal d = 1.40M;
Console.WriteLine(d);
outputs
1.4
1.40
As you can see, floats (and doubles) do not know their precision, but decimals do.
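The reason a decimal "remembers" its trailing zeros is its internal representation: .NET stores a decimal as a 96-bit integer plus a scale factor (a power of ten from 0 to 28), so 1.40M is stored as 140 with a scale of 2, while 1.4M is 14 with a scale of 1. A minimal sketch using decimal.GetBits, which exposes that representation (the class name ScaleDemo is just for illustration):

```csharp
using System;

class ScaleDemo
{
    static void Main()
    {
        // decimal.GetBits returns four ints: the low, middle, and high
        // 32 bits of the 96-bit integer, followed by a flags word whose
        // bits 16-23 hold the scale (digits after the decimal point).
        int[] bits14  = decimal.GetBits(1.4M);
        int[] bits140 = decimal.GetBits(1.40M);

        Console.WriteLine($"1.4M : value={bits14[0]}, scale={(bits14[3] >> 16) & 0xFF}");
        // prints: 1.4M : value=14, scale=1
        Console.WriteLine($"1.40M: value={bits140[0]}, scale={(bits140[3] >> 16) & 0xFF}");
        // prints: 1.40M: value=140, scale=2
    }
}
```

Two different bit patterns, two different displayed precisions, even though the values compare equal.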
Similarly, in Visual Basic decimals understand their precision, but you cannot enter a decimal literal with trailing zeros: the editor won't let you.
Dim d As Decimal = 1.40D <= The editor removes the trailing zero as soon as you leave the line
However, you can initialize the decimal with Decimal.Parse, and the trailing zero is honored.
Dim d As Decimal = Decimal.Parse("1.40")
Console.WriteLine(d)
outputs
1.40
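A side effect worth knowing: arithmetic follows the scale too. Addition and subtraction keep the larger scale of the two operands, and multiplication adds the scales, so trailing zeros can appear in computed results as well. A small C# sketch (I would expect identical behavior in VB, since the runtime type is the same):

```csharp
using System;

class ScaleArithmetic
{
    static void Main()
    {
        // Addition keeps the larger of the two scales.
        Console.WriteLine(1.40M + 0.1M);  // prints 1.50

        // Multiplication adds the scales (1 + 1 = 2 decimal places).
        Console.WriteLine(1.2M * 2.0M);   // prints 2.40
    }
}
```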
So it's clear that both C# and Visual Basic treat decimals the same internally (as I would expect), but the Visual Basic editor has a bug that makes it remove trailing zeroes even when it should not. Looking through the editor options and Googling does not reveal a way to suppress this behavior. It's just a bug.
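If all you need is a fixed number of decimal places in the output, regardless of how the literal was entered, a format string sidesteps the editor quirk entirely. For example, in Visual Basic:

```vb
Dim d As Decimal = 1.4D
Console.WriteLine(d.ToString("0.00")) ' prints 1.40
```

The same "0.00" custom numeric format string works in C#, since formatting is handled by the runtime, not the language.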