You are doing an unnecessary conversion, and too late. When you divide two integers, the result is also an integer, and in this case that integer result is 0. If you need the decimal part, the division must involve a decimal, either through a variable that is already of type decimal or through a literal of that type. A decimal literal
always carries the suffix M
(for money, since this type is usually used for monetary values). Without the suffix a numeric literal is an int by default, and then there is trouble. So, to get the result you expect:
using static System.Console;

public class Program {
    public static void Main() {
        decimal contagemSubida = 0, contagemDescida = 0;
        int cSubida = 6, cDescida = 4, range = 10;
        contagemSubida += cSubida * range / 100M; // the 100M literal forces decimal division
        contagemDescida += cDescida * range / 100M;
        contagemSubida += contagemSubida * (4 / 10M); // 10M keeps this fraction decimal as well
        contagemDescida += contagemDescida * (4 / 10M);
        WriteLine(contagemSubida); // 0.84
        WriteLine(contagemDescida); // 0.56
    }
}
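For contrast, here is a minimal sketch of the pitfall described at the start, using the same values (the variable name resultado is chosen just for illustration):

using static System.Console;

public class Program {
    public static void Main() {
        int cSubida = 6, range = 10;
        decimal resultado = cSubida * range / 100; // int / int truncates: 60 / 100 == 0
        WriteLine(resultado); // prints 0; the conversion to decimal happens too late
    }
}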
See it running on .NET Fiddle and on Coding Ground. I also put it on GitHub for future reference.
Note that if you change the type of the auxiliary variables to decimal,
the result may come out different even without changing the divisors to decimal,
and that does not seem to be what you want, although I cannot say for sure, since the question does not state what the result should be.
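For instance, a minimal sketch (an assumed variation of the code above, not from the question) of what happens when the variables are decimal but the divisor literals stay int:

using static System.Console;

public class Program {
    public static void Main() {
        decimal cSubida = 6, range = 10, contagemSubida = 0;
        contagemSubida += cSubida * range / 100;     // a decimal operand forces decimal division: 0.6
        contagemSubida += contagemSubida * (4 / 10); // but 4 / 10 is pure int division, so it is 0
        WriteLine(contagemSubida);                   // prints 0.6 instead of the expected 0.84
    }
}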