Chapter 13
Estimating Memory Requirements
Designing a SQLFire database also involves estimating the memory requirements for your data based on the size of
the actual table values and indexes, the overhead that SQLFire requires for your data, and the overall usage pattern for
your data. You can estimate the memory requirements for tables using general guidelines for SQLFire overhead. Or,
you can load tables with representative data and then query the SQLFire SYS.MEMORYANALYTICS table to obtain
details about the memory required for individual tables and indexes.
Estimating SQLFire Overhead
SQLFire requires different amounts of overhead per table and index entry depending on whether you persist
table data or configure tables for overflow to disk. Add these overhead figures to the estimated size of each table
or index entry to provide a rough estimate for your data memory requirements. If you already have representative
data, use the SQLFire Java agent to query the SYS.MEMORYANALYTICS table to obtain a more accurate
picture of the memory required to store your data.
Note: All overhead values are approximate. Be sure to validate your estimates in a test environment with
representative data.
Table 1: Approximate Overhead for SQLFire Table Entries
Table is persisted?   Overflow is configured?   Approximate overhead
No                    No                        64 bytes
Yes                   No                        120 bytes
Yes                   Yes                       152 bytes
Table 2: Approximate Overhead for SQLFire Index Entries
Type of index entry                   Approximate overhead
New index entry                       80 bytes
First non-unique index entry          24 bytes
Subsequent non-unique index entry     8 bytes to 24 bytes*
*If there are more than 100 entries for a single index entry, the overhead per entry increases from 8 bytes to
approximately 24 bytes.
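As a back-of-the-envelope check, the per-entry figures in Tables 1 and 2 can be combined with your average row size and row count. The sketch below is plain Python; the helper names, the sample row size, and the sample key counts are illustrative assumptions, not part of SQLFire. It uses the conservative 24-byte upper bound for subsequent non-unique index entries.

```python
# Rough memory estimate built from the approximate per-entry overhead
# figures in Tables 1 and 2 above. All function names and sample inputs
# are illustrative; validate against SYS.MEMORYANALYTICS with real data.

# Per-row table overhead in bytes, keyed by (is_persisted, has_overflow).
TABLE_OVERHEAD = {
    (False, False): 64,
    (True, False): 120,
    (True, True): 152,
}

NEW_INDEX_ENTRY = 80        # first entry for a new index key
SUBSEQUENT_NON_UNIQUE = 24  # conservative upper bound (8 to 24 bytes)

def estimate_table_bytes(row_count, avg_row_bytes, persisted, overflow):
    """Estimated heap for table data: value size plus per-entry overhead."""
    per_row = avg_row_bytes + TABLE_OVERHEAD[(persisted, overflow)]
    return row_count * per_row

def estimate_index_bytes(unique_keys, row_count):
    """Conservative estimate for one non-unique index."""
    duplicates = row_count - unique_keys
    return unique_keys * NEW_INDEX_ENTRY + duplicates * SUBSEQUENT_NON_UNIQUE

# Example: 1 million rows averaging 200 bytes, persisted with overflow,
# plus one non-unique index with 100,000 distinct keys.
table = estimate_table_bytes(1_000_000, 200, persisted=True, overflow=True)
index = estimate_index_bytes(100_000, 1_000_000)
print(f"table: {table / 1024**2:.0f} MB, index: {index / 1024**2:.0f} MB")
```

Because the overhead values are approximate and the non-unique index cost varies with the number of duplicates per key, treat the result as an upper-bound starting point and confirm it against SYS.MEMORYANALYTICS with representative data.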