Guys, I'm trying to write a small program that takes any number, passes it to a function that splits the number's digits into a vector, and returns the number of decimal places (the number of positions the vector has). Every guide and tutorial I've looked at says the way I'm doing it is right, but for some reason the code isn't working. Here is the code:
#include <stdio.h>
#include <stdlib.h>

int verPal(int num1, int *v);

int main(int argc, const char * argv[]) {
    int *v;
    int numero, y;
    printf("digite o numero");
    scanf("%d", &numero);
    y = verPal(numero, v);
    for (int i = 0; i < y; i = i + 1) {
        printf("%d \n", v[i]);
    }
    return 0;
}

int verPal(int num1, int *v) {
    int x = num1;
    int y = 0;
    while (x != 0) {
        x = x / 10;
        y = y + 1;
    }
    x = num1;
    v = (int *) malloc(y * sizeof(int));
    for (int i = 0; x != 0; i = i + 1) {
        v[i] = x % 10;
        x = x / 10;
    }
    return y;
}
The idea is, for example: if I enter numero = 10, then y should come back as 2 and I should have a vector with 2 positions, with v[0] = 0 and v[1] = 1. When I do malloc(y*sizeof(int)) it should allocate, in this case, 2 * the size of an integer, but it doesn't give me 2 slots, only 1. How can I solve this? Am I doing something wrong?
Thanks