How far can multidimensional arrays in PHP go?

I know it's possible to have an array within an array like this.

Array01 (

[0] => Array
    (
        [0] => 40292633
        [1] => 412
    )

[1] => Array
    (
        [0] => 41785603
        [1] => 382
    )

[2] => Array
    (
        [0] => 48792980
        [1] => 373
    )

[3] => Array
    (
        [0] => 44741143
        [1] => 329
    )
)

Can you have an array in an array in an array like this?

Array01 (

[0] => Array
    (
        [0] => Array
            (
                [0] => 40292633
                [1] => 412
            )

        [1] => Array
            (
                [0] => 41785603
                [1] => 382
            )
    )

[1] => Array
    (
        [0] => 41785603
        [1] => 382
    )

[2] => Array
    (
        [0] => 48792980
        [1] => 373
    )

[3] => Array
    (
        [0] => 44741143
        [1] => 329
    )
)
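
In PHP's literal syntax I imagine that structure would be written roughly like this (just a sketch built from the values in the dump above):

<?php
// Element 0 is itself an array of arrays; the remaining elements are plain pairs.
$array01 = [
    [
        [40292633, 412],
        [41785603, 382],
    ],
    [41785603, 382],
    [48792980, 373],
    [44741143, 329],
];

print_r($array01);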

What I'm curious about is how far this can go: how many arrays can you nest inside an array?

Ok, let's just calculate the possible maximum. What do we need?

  • We need to know how much memory PHP requires to store/create an array
  • We need to know how much memory we could possibly use (best case scenario)
  • Divide the amount of available memory by the amount of memory required to store a PHP array

How big is a PHP array? That's easy to work out: PHP is open source, and since arrays are actually hashtables, we can look at all of the bits and pieces PHP needs:

typedef struct bucket {
    unsigned long h;
    unsigned int nKeyLength;
    void *pData;
    void *pDataPtr;
    struct bucket *pListNext;
    struct bucket *pListLast;
    struct bucket *pNext;
    struct bucket *pLast;
    const char *arKey;
} Bucket;

typedef struct _hashtable {
    unsigned int nTableSize;
    unsigned int nTableMask;
    unsigned int nNumOfElements;
    unsigned long nNextFreeElement;
    Bucket *pInternalPointer;
    Bucket *pListHead;
    Bucket *pListTail;
    Bucket **arBuckets;
    void * pDestructor;
    short persistent;
    unsigned char nApplyCount;
    short bApplyProtection;
} HashTable;

typedef union _zvalue_value {
    long lval;
    double dval;
    struct {
        char *val;
        int len;
    } str;
    HashTable *ht;  // HashTable, this is used for arrays
    void *obj;      // is actually a zend_object, but I believe that's a typedef to a pointer
} zvalue_value;

typedef struct _zval_struct {
    zvalue_value value;
    unsigned int refcount__gc;
    unsigned char type;
    unsigned char is_ref__gc;
} zval;

Now, let's quickly get an idea of how many bytes this uses:

#include <stdio.h>

/* Bucket, HashTable and zval defined as above */

int main(void)
{
    printf(
        "%zu bytes where pointers are %zu bytes in size\n",
        sizeof(Bucket) + sizeof(zval) + sizeof(HashTable),
        sizeof(void *) /* size of a pointer */
    );
    return 0;
}

Now, on a typical 32-bit system this tells us that the combined size of all the structs is 96 bytes, and a pointer is 4 bytes. To use an array, it stands to reason that we need a pointer to it, so an array takes at least 100 bytes (96 bytes + 4 bytes for a pointer to the array). What does this tell us? Well, if a pointer takes up 4 bytes, then we know how many distinct addresses there are: 2^(4*8) = 2^32, which gives us a maximum of 4294967296 bytes we can address. And since a single pointer itself occupies 4 of those bytes, the total number of pointers we can fit into that memory is 2^32/4, which is 1073741824.
Likewise, to know how many PHP arrays this amount of memory can hold, we divide 2^32 by 100, which gives us 42949672.96 arrays; the remaining 0.96 isn't enough for an array, so we can't use that bit. 42 949 672 is, in theory, the maximum number of arrays we can create.
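
If you want to sanity-check that arithmetic in PHP itself, a quick sketch (assuming a 64-bit PHP build so that 2**32 fits in an integer, and reusing the 96 + 4 byte estimate from above):

<?php
// Back-of-envelope check of the 32-bit numbers above; the 96-byte struct size
// and 4-byte pointer are the values measured earlier, not guarantees for every build.
$addressable = 2 ** 32;   // bytes addressable with a 4-byte pointer
$perArray    = 96 + 4;    // combined struct size plus one pointer to the array

echo intdiv($addressable, 4), " pointers fit into that memory\n";         // 1073741824
echo intdiv($addressable, $perArray), " arrays fit into it, in theory\n"; // 42949672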

Note: The output on a 64-bit machine is not simply twice that of a 32-bit machine; it's "168 bytes where pointers are 8 bytes". If you want to know the maximum number of arrays on 64-bit platforms, do the maths (2^64/176)...

Is this accurate? Nope, absolutely not. I've not taken any other overhead into account, but it's safe to say that you can create, if you feel like going insane, a 100-dimensional array. However, accessing a value then implies an awful lot of indirection and hashtable lookups, and therefore quite a performance overhead, so KISS (i.e. keep it simple and sane).

You can go infinitely deep.

EDIT:

There is a limit on the number of elements per array, as described here, but that is a separate issue from depth.

Of course you need to take your machine's constraints (time and RAM) into consideration, but PHP itself imposes no theoretical limit or restriction on going infinitely deep.

What you have described in your first example is a "simple" two-dimensional array - an "array of arrays" - in which you must use two indexes to access the required data.

In the second example you have described a mixed-dimensional array: some elements need three indexes to be accessed, while others need only two.
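
For instance, a minimal sketch of such a mixed structure (reusing the values from the second example) could look like this:

<?php
$data   = [[48792980, 373], [44741143, 329]];  // plain two-dimensional rows
$data[] = [[40292633, 412], [41785603, 382]];  // appending a deeper element makes the array "mixed"

echo $data[0][1], "\n";    // two indexes are enough here: prints 373
echo $data[2][0][1], "\n"; // this element needs three: prints 412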

AFAIK, an array's size (and that of any other variable, for that matter) is limited only by the memory available to the PHP interpreter (and by the other variables already present in memory).

Update: Clarification

There is no hard limit imposed on the array depth/dimensions by design in PHP. You can have as many dimensions as you want, provided that:

  1. You don't hit the memory limit
  2. You don't go over PHP's element limit for any one of your arrays^

^There is a limit on how many elements an array can have - RiggsFolly has provided a link to this answer on that particular topic, if you're interested.

You can test it yourself:

<?php
$i = 0;
$array = [];
$a = &$array;

while (true) {
    $a = &$a[];              // append a new element and descend into it: one level deeper per iteration
    $i++;
    if ($i % 1000000 == 0) {
        echo $i, ' ';        // report progress every million levels
    }
}
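
On a typical configuration this will keep printing counters until the script exhausts PHP's memory_limit (or you run out of patience), rather than hitting any built-in depth limit.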