In computer science, the complexity of an algorithm is traditionally measured with asymptotic analysis, which uses what is called Big O notation (the O(n) your question mentions).
An algorithm with complexity O(n) is simply a linear-time algorithm, which means that the time it takes to execute is a direct reflection of the size of its input, and that this time grows steadily and ... linearly!
In practical terms, what the statement of your problem asks is this: if your algorithm performs one step that takes approximately 1 second to compute the factorial of 1, then computing the factorial of 2 should perform two steps and take approximately 2 seconds, the factorial of 3 should perform three steps and take approximately 3 seconds, and so on; for each value above, its time grows linearly.
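To make that relationship concrete, here is a minimal sketch in Python (the language and the helper name `fatorial_contando` are assumptions of mine, not part of your question) that counts the recursive calls instead of timing them:

```python
# Illustrative sketch (not the code from the question): count how many
# recursive calls a factorial makes for each n, to see the linear growth
# described above.
def fatorial_contando(n):
    """Return (n!, number of calls made to compute it)."""
    if n <= 1:
        return 1, 1                      # one call for the base case
    resultado, chamadas = fatorial_contando(n - 1)
    return n * resultado, chamadas + 1   # one extra call per extra unit of n

for n in range(1, 6):
    _, chamadas = fatorial_contando(n)
    print(f"fatorial({n}): {chamadas} call(s)")
# Prints 1, 2, 3, 4, 5 calls: the step count grows linearly with n.
```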
As for whether your algorithm meets that requirement: yes, it does. This can be confirmed by noting that each factorial beyond the previous one generates only one extra call to fatorial, and fatorial itself only performs constant-time operations (I won't go into more detail here because the subject is extensive; I suggest you follow the links I used on asymptotic analysis and Big O).
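For reference, this is roughly the shape of recursive factorial the reasoning above assumes (your actual code isn't reproduced here, so this Python version is only a sketch): each call does a constant amount of work and makes at most one further call, so fatorial(n) ends up making n calls in total.

```python
def fatorial(n):
    # Base case: a constant-time comparison and return.
    if n <= 1:
        return 1
    # One multiplication plus exactly one recursive call per level,
    # so computing fatorial(n) makes n calls in total -> O(n) time.
    return n * fatorial(n - 1)
```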