How to transform an array from one dimension to two in Java?

1

I have a one-dimensional array of integers, like this:

int[] a = new int[]{1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12};

And I want to turn it into a two-dimensional array, for example:

[1, 2, 3],
[4, 5, 6],
[7, 8, 9],
[10, 11, 12]

How do I make this transformation in Java?

asked by anonymous 07.03.2017 / 17:07

1 answer

4

Note that there are several ways to transform a one-dimensional array into a two-dimensional array. Here are some alternative representations of the example:

[1, 2],           [1, 2, 3, 4],            [1, 2, 3, 4, 5, 6],
[3, 4],           [5, 6, 7, 8],            [7, 8, 9, 10, 11, 12]
[5, 6],           [9, 10, 11, 12]
[7, 8],
[9, 10],
[11, 12]

These examples represent 6x2, 3x4, and 2x6 matrices, respectively. Therefore, it is necessary to know the desired width of the two-dimensional array; the height can then be calculated from it and from the length of the one-dimensional array.

static int[][] dimensionar_uma_em_duas(int[] matriz, int largura) {
    // The height (altura) is the total length divided by the given width (largura)
    int altura = matriz.length / largura;
    int[][] ret = new int[altura][largura];
    for (int i = 0; i < matriz.length; i++) {
        // Integer division gives the row, the modulo gives the column
        ret[i / largura][i % largura] = matriz[i];
    }
    return ret;
}

The dimensionar_uma_em_duas method obtains the height of the array by dividing its total length by the width, which is an input parameter. It then traverses the one-dimensional array in a single pass, using integer division and the modulo operator to place each element at the correct position in the new two-dimensional array.

For an array with height 4 and width 3, and the input from the question, the calculations (i / largura) and (i % largura) will produce the positions (0,0); (0,1); (0,2); (1,0); (1,1); (1,2); (2,0); (2,1); (2,2); (3,0); (3,1); and (3,2), respectively.
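For completeness, here is a minimal, self-contained usage sketch of the method applied to the array from the question (the wrapper class name Exemplo is an assumption for illustration, not part of the original answer):

public class Exemplo {
    static int[][] dimensionar_uma_em_duas(int[] matriz, int largura) {
        int altura = matriz.length / largura;
        int[][] ret = new int[altura][largura];
        for (int i = 0; i < matriz.length; i++) {
            ret[i / largura][i % largura] = matriz[i];
        }
        return ret;
    }

    public static void main(String[] args) {
        int[] a = new int[]{1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12};
        // Width 3 produces the 4x3 matrix shown in the question
        int[][] b = dimensionar_uma_em_duas(a, 3);
        System.out.println(java.util.Arrays.deepToString(b));
        // Prints: [[1, 2, 3], [4, 5, 6], [7, 8, 9], [10, 11, 12]]
    }
}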

07.03.2017 / 17:11