The JavaScript method "charCodeAt(0)" doesn't seem to return the correct Unicode code point when the character is an emoji.
```js
const letter = '😀'
const code0 = letter.charCodeAt(0).toString(16)
const code1 = letter.charCodeAt(1).toString(16)
console.log(code0, code1) // d83d de00
```
Many symbols added to Unicode more recently lie outside the Basic Multilingual Plane. In JavaScript strings, which use UTF-16, such a symbol is stored as two 16-bit code units known as a "surrogate pair", and charCodeAt returns only one code unit at a time.
😀 = d83d + de00
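To see how the two halves map to a single code point, here is a minimal sketch of the standard UTF-16 decoding formula. The constants 0xD800, 0xDC00, and 0x10000 come from the UTF-16 encoding rules, not from the snippet above:

```js
// Decode a surrogate pair into its Unicode code point (UTF-16 rule)
const high = 0xd83d // first charCodeAt result (high surrogate)
const low = 0xde00  // second charCodeAt result (low surrogate)

// code point = (high - 0xD800) * 0x400 + (low - 0xDC00) + 0x10000
const decoded = (high - 0xd800) * 0x400 + (low - 0xdc00) + 0x10000
console.log(decoded.toString(16)) // 1f600
```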
codePointAt
In UTF-16, a surrogate pair represents a single Unicode code point, and codePointAt decodes the pair and returns that code point.
😀 = d83d + de00 (two UTF-16 code units)
😀 = 1f600 (one code point)
```js
const letter = '😀'
const codePoint = letter.codePointAt(0).toString(16)
console.log(codePoint) // 1f600
```
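The reverse direction also works with a standard API: String.fromCodePoint rebuilds the emoji from the single code point. This example is an addition for illustration, not part of the original snippet:

```js
// Rebuild the character from its code point
const emoji = String.fromCodePoint(0x1f600)
console.log(emoji) // 😀

// The string still contains two UTF-16 code units
console.log(emoji.length) // 2
```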
charCodeAt
Returns the UTF-16 code unit at the given index; for a character inside the Basic Multilingual Plane this equals its code point.
For a character stored as a surrogate pair, each call returns only one half of the pair, so two calls are needed to read both code units.
codePointAt
Returns the full code point in either case.
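As a quick check of the summary above, here is a small comparison (my own example) of both methods on a BMP character and on the emoji:

```js
const plain = 'A'       // inside the Basic Multilingual Plane
const smiley = '😀'     // outside the BMP, stored as a surrogate pair

console.log(plain.charCodeAt(0).toString(16))   // 41
console.log(plain.codePointAt(0).toString(16))  // 41 (same value)

console.log(smiley.charCodeAt(0).toString(16))  // d83d (only the high surrogate)
console.log(smiley.codePointAt(0).toString(16)) // 1f600 (the full code point)
```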