Re: (ITS#4702) out-of-memory on huge DB ?



PS: 4GB of RAM is probably not enough for a 20GB database; you won't be 
able to configure caches large enough to get decent search performance. 
I think 8GB would be a practical minimum here.
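
For reference, the BDB buffer cache is set in a DB_CONFIG file in the
database directory. A minimal sketch, with purely hypothetical sizes for
a machine with about 8GB of RAM:

   # DB_CONFIG in the back-hdb database directory
   # Sizes below are hypothetical -- tune to the actual data and RAM.
   set_cachesize 2 0 1          # 2GB + 0 bytes, in 1 cache segment
   set_lg_bsize 2097152         # 2MB log buffer
   set_lg_regionmax 262144      # 256KB log region

The slapd-side entry and IDL caches (the cachesize and idlcachesize
directives in slapd.conf) need to be raised along with it.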

Paolo.Rossi.con@h3g.it wrote:
> Full_Name: Paolo Rossi
> Version: 2.3.27
> OS: Solaris 8
> URL: ftp://ftp.openldap.org/incoming/
> Submission from: (NULL) (88.149.168.114)
> 
> 
> Hi, during some tests on a very large DB, meant to see how syncrepl works in
> this scenario, I found some strange behavior:
> 
> Solaris 8 on 2xUSIII+ 4GB RAM
> openLDAP 2.3.27 
> BDB 4.2.52.4
> 
> backend hdb
> 
> 1 provider, 1 consumer, 1 consumer with filter.
> 
> On an LDAP tree of 1 million DNs with 2 sub-DNs each, all the systems work
> fine; then I tried 10 million DNs with 3 sub-DNs each (a very big LDAP, the
> openldap-data dir is about 20GB):
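
For anyone reproducing this setup, a minimal syncrepl consumer stanza for
the filtered consumer in this topology (names, filter, and credentials are
all hypothetical) would look roughly like:

   # slapd.conf on the filtered consumer -- hypothetical example
   syncrepl rid=002
            provider=ldap://provider.example.com
            type=refreshAndPersist
            searchbase="dc=example,dc=com"
            filter="(objectClass=inetOrgPerson)"
            scope=sub
            bindmethod=simple
            binddn="cn=replicator,dc=example,dc=com"
            credentials=secret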


-- 
   -- Howard Chu
   Chief Architect, Symas Corp.  http://www.symas.com
   Director, Highland Sun        http://highlandsun.com/hyc
   OpenLDAP Core Team            http://www.openldap.org/project/