
Re: performance on Solaris



Thank you very much, Quanah, that's very encouraging.

Does anyone know how OpenLDAP performs on Sun Solaris 10 on Sun hardware (SPARC CPUs)?

Some more data to complete the picture.
- OpenLDAP 2.4.16 + Berkeley BDB 4.7 + back-bdb
- 800,000 entries occupy 880MB (logs excluded), and a typical entry would just be:
     dn: uid=657321@myunit.myorg.org,ou=people,dc=myorg,dc=org
     objectClass: account
     objectClass: simpleSecurityObject
     uid: 657321@myunit.myorg.org
     userPassword: whatever
- The "working set" is around 25% (i.e. most of the bind/updates will be done on the same 200.000 entries)

Given these numbers, I thought 500MB would be enough; do you agree?
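
To make that concrete: 880MB / 800,000 entries is roughly 1.1KB per entry, so the 200,000-entry working set is on the order of 220MB on disk. My tentative DB_CONFIG for the back-bdb environment would be something like this (the values are just a first guess, not tuned or tested yet):

     # DB_CONFIG in the BDB database directory (first guess, to be tuned)
     # 512MB BDB cache: 0 GB + 536870912 bytes, in 1 segment
     set_cachesize 0 536870912 1
     # 2MB in-memory transaction log buffer
     set_lg_bsize 2097152

Does that look like a sane starting point?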

Thanks a lot,
Stefano



On Wed, Jan 27, 2010 at 7:32 PM, Quanah Gibson-Mount <quanah@zimbra.com> wrote:
--On Wednesday, January 27, 2010 7:06 PM +0100 Stefano Zanmarchi <zanmarchi@gmail.com> wrote:

Hi,
I need to set up an LDAP production server with 800,000 entries.
Performance and stability are my main concerns; I expect around 250
binds and 50 updates/inserts per minute.
The machine is a Sun Fire 6900 with 8 SPARC US-IV+ CPUs and I'd like to
dedicate around 500MB of RAM to OpenLDAP (more, if necessary). The OS
is Sun Solaris 10.


I'd like to know if OpenLDAP is honestly a good choice with these
numbers, and if anyone can share their field experience with this many
users.

250 binds/minute and 50 updates/minute is a very low number of each.  You fail to state what version of OpenLDAP you plan on using, or which backend (sql, bdb, hdb, etc.), so it's hard to give you anything concrete.  Assuming you'd be using OpenLDAP 2.4.21 + Berkeley BDB 4.8 with either back-bdb or back-hdb, you should be fine.  I would note, however, that any testing I've ever done using Solaris as the OS has been significantly slower than Linux running on the same hardware.

If you have 800,000 entries, your main concern is definitely going to be the RAM.  500MB is unlikely to cut it, but it depends on the size of your entries.  You may need upwards of 16GB of RAM depending again on how large your DB actually is.
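
If you want to size it from real data rather than guess, look at how big the back-bdb files actually are and how the BDB cache behaves once the server has been running for a while. Something along these lines (the path is just an example, use your actual database directory):

     # on-disk size of the back-bdb files
     du -k /var/openldap-data/*.bdb
     # Berkeley DB memory pool (cache) statistics, including hit rate
     db_stat -m -h /var/openldap-data

Then set the slapd-side caches in slapd.conf to cover your working set, e.g.:

     # back-bdb entry and IDL caches (tune to your working set)
     cachesize    200000
     idlcachesize 200000

and size set_cachesize in DB_CONFIG so the BDB cache can hold at least dn2id and the index files, plus as much of id2entry as you can spare.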

I.e., you fail to provide the data necessary to give you a conclusive answer.

--Quanah


--

Quanah Gibson-Mount
Principal Software Engineer
Zimbra, Inc
--------------------
Zimbra ::  the leader in open source messaging and collaboration