This is a bit of a "so what?" story, but I found it interesting that each language behaves differently when you assign to an array index well past the end of the array.
JavaScript
var arr = [];
arr[100] = 'foo';
console.log(arr.length); //=> 101
console.log(arr[0]); //=> undefined
console.log(arr[101]); //=> undefined
Unassigned indexes come back as `undefined`.
Declaring `"use strict";` does not change the result.
Perl
my @arr;
$arr[100] = 'foo';
print $#arr; #=> 100
print defined $arr[0] ? 1 : 0; #=> 0
print defined $arr[101] ? 1 : 0; #=> 0
As in JavaScript, unassigned indexes come back as `undef`.
Declaring `use strict;` does not change the result either.
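One caveat: `$#arr` is the index of the last element, not the length, which is why it prints 100 where JavaScript prints 101. The length itself is `scalar @arr`; a small check (not in the original post):
use strict;
my @arr;
$arr[100] = 'foo';
print scalar @arr; #=> 101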
Ruby
arr = []
arr[100] = 'foo'
p arr.size #=> 101
p arr[0] #=> nil
p arr[101] #=> nil
This is much the same as JavaScript and Perl, except that the value is `nil`,
presumably because Ruby has no equivalent of `undefined`.
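Unlike JavaScript's sparse array, Ruby really does fill the intervening slots with `nil`, which you can confirm like this (not in the original post):
arr = []
arr[100] = 'foo'
p arr[0, 3]      #=> [nil, nil, nil]
p arr.count(nil) #=> 100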
PHP
$arr = array();
$arr[100] = 'foo';
echo count($arr); //=> 1
echo is_null($arr[0]); //=> 1
echo is_null($arr[101]); //=> 1
Unassigned indexes come back as `null`.
The array length is 1, which differs from the other languages. This is because PHP's `array` doubles as both a list and a hash (dictionary), so assigning to index 100 just adds a single key instead of extending a list.
Accessing an undefined index raises an `E_NOTICE`-level error. (See the comments for details.)
error_reporting(E_ALL); // set so that E_NOTICE is output
$arr = array();
$arr[100] = 'foo';
echo $arr[0]; //=> Notice: Undefined offset: 0
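If you only want to know whether an index exists, checking it with `isset` or `array_key_exists` avoids the notice entirely (a sketch, not from the original post):
$arr = array();
$arr[100] = 'foo';
echo array_key_exists(0, $arr) ? 1 : 0;   //=> 0
echo array_key_exists(100, $arr) ? 1 : 0; //=> 1
echo isset($arr[100]) ? 1 : 0;            //=> 1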
Python
arr = []
arr[100] = 'foo' #=> IndexError: list assignment index out of range
Referencing or assigning to an index that does not exist raises an `IndexError`. I feel this is the most straightforward behavior of the bunch.
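If you do want something like the other languages' behavior in Python, you would either pre-extend the list or use a dict for sparse data; a minimal sketch (not from the original post):
arr = [None] * 101   # pre-fill so that index 100 exists
arr[100] = 'foo'
print(len(arr))      #=> 101
print(arr[0])        #=> None

sparse = {}          # or use a dict, which is closer to the PHP result
sparse[100] = 'foo'
print(len(sparse))   #=> 1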