charAt() always returns a character whose value is less than 65536, because the higher code points are represented by a pair of 16-bit surrogate pseudo-characters. Unicode code points range from 0 to 1114111 (0x10FFFF).
Characters in a string are indexed from left to right. The index of the first character is 0, and the index of the last character in a string called str is str.length - 1.