Re: ldap_explode_dn corrupts UTF-8 encoding (ITS#1890)
Many LDAP applications that use ldap_explode_dn() don't
expect characters outside the printable ASCII range.
This is because, historically, LDAP DNs were restricted
to ASCII. Hence, today, we hex-escape such characters;
this is the safest alternative.
Of course, regardless of whether such characters are
escaped or not, some DNs, RDNs, and AVAs will have escaped
characters as required by RFC 2253. Just as in OpenLDAP 2.0
(and 1.2), applications should be prepared for such.
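To illustrate the kind of output applications must be prepared for, here is a rough sketch of RFC 2253-style escaping of an RDN attribute value, with bytes outside printable ASCII hex-escaped. This is an illustrative example only, not the OpenLDAP implementation; the function name and the exact set of escaped control characters are my own choices.

```python
# Sketch (NOT the OpenLDAP code): escape one RDN attribute value
# roughly per RFC 2253, hex-escaping UTF-8 bytes outside the
# printable ASCII range as \XX.

RFC2253_SPECIALS = set(',+"\\<>;')   # characters RFC 2253 requires escaping

def escape_rdn_value(value: str) -> str:
    out = []
    data = value.encode("utf-8")
    for i, b in enumerate(data):
        c = chr(b)
        if c in RFC2253_SPECIALS:
            out.append("\\" + c)         # backslash-escape special characters
        elif b < 0x20 or b >= 0x7F:
            out.append("\\%02X" % b)     # non-printable / non-ASCII: \XX hex
        elif c in "# " and i == 0:
            out.append("\\" + c)         # leading '#' or space
        elif c == " " and i == len(data) - 1:
            out.append("\\" + c)         # trailing space
        else:
            out.append(c)
    return "".join(out)
```

Under this scheme a value such as "Müller" comes back as "M\C3\BCller" (the two UTF-8 bytes of "ü" hex-escaped), which is exactly the sort of form an application reading only the old man pages would not expect.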
This will be my last post on this subject. This horse is dead.
At 10:41 PM 2002-06-17, firstname.lastname@example.org wrote:
>On Mon, 17 Jun 2002, Pierangelo Masarati wrote:
>> > It is mostly a matter of breaking things that used to work.
>> I have no problems in letting ldap_explode_dn/rdn return
>> a "pretty" (UTF-8) form of the dn: it's not a big deal; however
>> I'd like to see some consensus on that, which is difficult
>> to gather on obsoleted function calls: everybody would like
>> to keep them as they were, regardless of incompatibilities
>> or problems that might arise if fancy formats are used.
>One more thing I would like to add: The compatibility problem will
>manifest itself even (and especially) in dynamically linked executables...
>so if somebody upgrades OpenLDAP from 2.0 to 2.1 suddenly many
>applications will show wrong DNs. I agree that due to the RFCs one must
>already expect \<hex> encoded characters, but I guess that most
>application developers just use the OpenLDAP man pages as a reference.
>Thus I think that most current software will not expect such encodings.
>Assume that a popular Linux distribution upgrades OpenLDAP to 2.1.x:
>Suddenly everything looks different... not a nice thought, and it will
>be blamed on OpenLDAP... And it is a matter of taste if one thinks that
>this is true or that all applications are poorly written...