User's Guide

2 Getting Started with Database Toolbox
Working with Large Data Sets
In this section...
“Connect to a Database with Maximum Performance” on page 2-198
“Import Large Data Sets into MATLAB” on page 2-198
“Export Large Data Sets from MATLAB” on page 2-199
“Access Data Stored in a Database Using a DatabaseDatastore” on page 2-199
Connect to a Database with Maximum Performance
When you are using MATLAB with a database containing large volumes of data, you
might experience out-of-memory issues or slow processing. To achieve the fastest
performance, connect to your database using the native ODBC interface. For details,
see “Connecting to a Database Using the Native ODBC Interface”. If the native ODBC
interface does not work, connect to your database using a JDBC driver. For details, see
“Connecting to a Database” on page 2-191.
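As a sketch, a native ODBC connection with a JDBC fallback might look like the following; the data source name, credentials, driver class, and URL are placeholders you must replace with your own:

```matlab
% Native ODBC interface (fastest); 'mydatasource', 'username', and
% 'password' are placeholders for your ODBC data source and credentials.
conn = database.ODBCConnection('mydatasource','username','password');

% JDBC fallback if the native ODBC interface does not work; this example
% assumes a hypothetical MySQL database -- substitute your own driver and URL.
% conn = database('mydatasource','username','password', ...
%     'com.mysql.jdbc.Driver','jdbc:mysql://hostname:3306/mydatasource');
```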
Import Large Data Sets into MATLAB
If you are selecting large volumes of data from a database to import into MATLAB, you
might experience out-of-memory issues or slow processing. To achieve the fastest
performance, you can import the data in batches.
When working with a native ODBC connection, you might be restricted by the amount
of memory available to MATLAB. In that case, process parts of your data in MATLAB
rather than the whole data set at once. Use the row limit argument of the fetch
function to limit the number of rows your query returns. In a MATLAB script, you can
then fetch data in increments of the row limit until all the data is retrieved. For an
example, see fetch.
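The incremental approach described above can be sketched as follows; the connection, table name, and batch size are hypothetical examples:

```matlab
% Assumes an open connection conn; the query and batch size are examples.
sqlquery = 'SELECT * FROM products';
curs = exec(conn,sqlquery);        % run the query without fetching rows yet
rowLimit = 10000;                  % number of rows to retrieve per batch

curs = fetch(curs,rowLimit);       % fetch the first batch
while ~strcmp(curs.Data,'No Data')
    batch = curs.Data;             % up to rowLimit rows of query results
    % ... process batch here ...
    curs = fetch(curs,rowLimit);   % fetch the next batch; Data is the
end                                % string 'No Data' when none remain
close(curs)
```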
When working with a JDBC connection, you might run into out-of-memory issues
because of JVM heap memory restrictions. To achieve the best performance when
importing large data sets into MATLAB, you might need to fetch the data in batches
by setting database preferences. To assess your memory needs and for options on
running an SQL query that returns large amounts of data, see “Preference Settings for
Large Data Import” on page 5-19.
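The batching preferences mentioned above can also be set programmatically with setdbprefs; this sketch assumes a batch size of 10,000 rows fits in your JVM heap:

```matlab
% Fetch query results in batches of 10,000 rows to limit JVM heap use.
setdbprefs('FetchInBatches','yes')
setdbprefs('FetchBatchSize','10000')
```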