Magnus Olsen wrote:
Both UTF-7 and UTF-16 are 16 bits; the UTF-16 encoding is a bit
different from UTF-7. Some links and info about UTF-7:
http://www.faqs.org/rfcs/rfc2152.html
Now I understand what you mean! That is a "different" 16 bits than the one I
was referring to. I meant that UTF-16 uses 16-bit code units, while UTF-7
uses 8-bit code units (of which it only uses 7 bits per code unit, for
compatibility purposes). The 16 bits you are referring to are the code
points of Unicode 2.0. UTF-16 can still encode all code points in Unicode
4.0 (some of which require more than 16 bits), because it is a
variable-width encoding.
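The distinction above can be demonstrated with Python's built-in "utf-16-le" and "utf-7" codecs (a minimal sketch; the example character is just an arbitrary code point outside the Basic Multilingual Plane):

```python
s = "\U0001F600"  # a code point above U+FFFF, so beyond 16 bits

# UTF-16 uses 16-bit code units; a supplementary code point takes
# two of them (a surrogate pair), i.e. 4 bytes.
utf16 = s.encode("utf-16-le")
print(len(utf16) // 2)  # → 2 code units

# UTF-7 uses 8-bit code units but only the low 7 bits of each,
# so every byte stays in the ASCII range.
utf7 = s.encode("utf-7")
print(all(b < 0x80 for b in utf7))  # → True
```

A BMP character such as "A" would need only one 16-bit code unit in UTF-16, which is exactly the variable-width behaviour described above.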