Re: (ITS#4702) out-of-memory on huge DB ?
PS: 4GB of RAM is probably not enough for a 20GB database; you won't be
able to configure caches large enough to get decent search performance.
I think 8GB would be a practical minimum here.
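As a sketch of the kind of cache tuning involved (the sizes below are
illustrative assumptions for this discussion, not recommendations for a
20GB database), with back-hdb the bulk of the caching is done by BDB
via a DB_CONFIG file in the database directory, plus slapd's own entry
and IDL caches in slapd.conf:

    # DB_CONFIG in the openldap-data directory
    # BDB environment cache: 2GB (arguments are gbytes, bytes, ncache)
    set_cachesize 2 0 1

    # slapd.conf, in the hdb database section
    # entry cache, in entries (size chosen for illustration only)
    cachesize 100000
    # IDL cache, in slots (size chosen for illustration only)
    idlcachesize 300000

With only 4GB of physical RAM, caches sized to cover a meaningful
fraction of a 20GB database simply won't fit, which is why more RAM is
the practical fix here.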
> Full_Name: Paolo Rossi
> Version: 2.3.27
> OS: Solaris 8
> URL: ftp://ftp.openldap.org/incoming/
> Submission from: (NULL) (220.127.116.11)
> Hi, during some tests on a very large DB, to see how syncrepl works in this
> scenario, I've found some strange behavior:
> Solaris 8 on 2xUSIII+ 4GB RAM
> openLDAP 2.3.27
> BDB 18.104.22.168
> backend hdb
> 1 provider, 1 consumer, 1 consumer with filter.
> With a 1 million DN LDAP tree with 2 sub-DNs per DN, all the systems work
> fine; when I tried a 10 million DN tree with 3 sub-DNs each (a very big LDAP,
> openldap-data dir about 20GB):
-- Howard Chu
Chief Architect, Symas Corp. http://www.symas.com
Director, Highland Sun http://highlandsun.com/hyc
OpenLDAP Core Team http://www.openldap.org/project/