
Re: bulk copy 20 Million entries



That's odd; I imported 1 million entries in about 12 hours on a two-processor Linux box with 2 GB of RAM. I hope OpenLDAP can import 20 million in a day at most; that is our requirement before piloting the directory server. A sketch of the load path I mean is below.
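For reference, the usual offline bulk-load path is slapadd, which writes the database files directly instead of going through the LDAP protocol. A minimal sketch (paths and the LDIF file name are illustrative; stop slapd before running it):

    # Offline bulk load: reads entries from LDIF and builds the ldbm files directly.
    # Assumes slapd.conf already defines the target database and its suffix.
    slapadd -f /etc/openldap/slapd.conf -l users.ldif

If a load goes through ldapadd over the wire instead, switching to slapadd alone may make a large difference.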
----- Original Message -----
Sent: Tuesday, October 02, 2001 8:18 PM
Subject: Re: bulk copy 20 Million entries

I have tried to load 1 million entries into OpenLDAP on Red Hat 6.2 (P-III 800 MHz, 512 MB RAM,
ldbm with Berkeley DB, about 10 attributes indexed); it took about 6 days.
With Netscape Directory Server, it took about a day and a half. Netscape takes special
measures when the .db files grow very large; OpenLDAP does not.
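Part of that gap may simply be index maintenance: updating ten indexes on every add is expensive. A common workaround, sketched below (assuming an OpenLDAP 2.x install that ships slapindex; paths are illustrative), is to load with indexing disabled and rebuild the indexes in one pass afterwards:

    # 1. Comment out the "index" lines in slapd.conf, then load offline:
    slapadd -f /etc/openldap/slapd.conf -l users.ldif
    # 2. Restore the index lines and rebuild all indexes at once:
    slapindex -f /etc/openldap/slapd.conf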
For 20 million entries, the only practical way is to distribute your database across several servers,
e.g. 500K entries per server. If you can set up the distribution well, OpenLDAP is also suitable
for large databases; a hypothetical fragment follows.
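To make the distribution idea concrete, here is a hypothetical slapd.conf fragment for one slice (suffixes and host names are made up): each server holds its own partition and refers requests for anything else to a peer.

    # Global section: queries outside our suffix get a referral to a peer server.
    referral        ldap://ldap2.example.com

    # This server's slice of the directory.
    database        ldbm
    suffix          "ou=part1,dc=example,dc=com"
    directory       /var/lib/ldap/part1
    index           uid,cn eq

The client (or a proxying layer) has to chase the referrals, so the split works best along a boundary the application already knows, such as a range of the user ID.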
 
SCW
 
 
----- Original Message -----
Sent: Wednesday, October 03, 2001 10:45 AM
Subject: bulk copy 20 Million entries

Has anyone done any benchmarking of how long it takes OpenLDAP to load (bulk copy) 20 million entries?
 
I am planning to use OpenLDAP to manage 20 million of our users. Is it a suitable product? Can OpenLDAP handle that kind of anticipated load, or should I look into other LDAP vendors besides iPlanet (too expensive)?
 
Thanks