In a Data Access Layer method, you execute a command that selects from dbo.Documents, where one column contains large binary data. You want to read this column as a stream. Which ExecuteReader option should you use when creating the DbDataReader?

Difficulty: Medium

Correct Answer: var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess);

Explanation:


Introduction / Context:
When working with large binary columns such as documents, images, or media files in ADO.NET, it is inefficient to load entire values into memory at once. Instead, you can stream data from the database using a DbDataReader configured to support sequential access. The CommandBehavior enumeration provides options that control how the reader behaves, including whether it supports streaming of large values. Understanding which behavior flag to use is crucial for building scalable and efficient data access code.


Given Data / Assumptions:

    The application uses .NET Framework 4 and a DbConnection to a SQL Server 2008 database.
    The dbo.Documents table contains a column with large binary data, such as a varbinary(max) column.
    You have created a DbCommand with CommandText set to SELECT * FROM dbo.Documents.
    You call cnx.Open() and then need to create a DbDataReader by calling ExecuteReader on the command.
    The ReadDocument method expects to be able to read large binary fields as a stream rather than loading the entire value at once.


Concept / Approach:
CommandBehavior.SequentialAccess is specifically designed for scenarios where you want to read large binary or text data streams in sequence from the reader. When this flag is used, the data provider does not buffer an entire row in memory; instead, it allows you to call methods such as GetBytes or GetChars to retrieve data in chunks. Other flags like SchemaOnly or KeyInfo focus on metadata and do not enable streaming behavior. Therefore, to efficiently read large binary values as a stream, you must pass CommandBehavior.SequentialAccess to ExecuteReader.


Step-by-Step Solution:
1. Create and configure the DbCommand object cmd, setting its CommandText to select from dbo.Documents.
2. Open the database connection by calling cnx.Open().
3. Call ExecuteReader on cmd, passing CommandBehavior.SequentialAccess: var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess);
4. Inside the ReadDocument method, iterate through the rows and use GetBytes (or GetChars for character data) to retrieve the binary column in manageable chunks. (DbDataReader.GetStream offers a similar streaming API, but it was introduced in .NET Framework 4.5 and is therefore not available in the .NET Framework 4 scenario described here.)
5. Because of SequentialAccess, the provider reads data sequentially from the server, which reduces memory usage and supports streaming of large values.
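The steps above can be sketched roughly as follows. This is a minimal illustration, not production code: the column names Id and Content, the 8 KB buffer size, and the file-per-row output are assumptions for the example, not details from the question.

```csharp
using System;
using System.Data;
using System.Data.Common;
using System.Data.SqlClient;
using System.IO;

class DocumentStreamer
{
    public static void ReadDocuments(string connectionString, string outputDir)
    {
        using (DbConnection cnx = new SqlConnection(connectionString))
        using (DbCommand cmd = cnx.CreateCommand())
        {
            // Assumed schema: Id int, Content varbinary(max).
            cmd.CommandText = "SELECT Id, Content FROM dbo.Documents";
            cnx.Open();

            // SequentialAccess tells the provider not to buffer the whole row,
            // so large columns can be read incrementally with GetBytes.
            using (DbDataReader reader =
                cmd.ExecuteReader(CommandBehavior.SequentialAccess))
            {
                while (reader.Read())
                {
                    // Columns must be read in ordinal order under SequentialAccess.
                    int id = reader.GetInt32(0);

                    string path = Path.Combine(outputDir, id + ".bin");
                    using (FileStream output = File.Create(path))
                    {
                        byte[] buffer = new byte[8192];
                        long offset = 0;
                        long bytesRead;
                        // GetBytes copies at most buffer.Length bytes per call,
                        // so the full value never has to fit in memory at once.
                        while ((bytesRead = reader.GetBytes(
                            1, offset, buffer, 0, buffer.Length)) > 0)
                        {
                            output.Write(buffer, 0, (int)bytesRead);
                            offset += bytesRead;
                        }
                    }
                }
            }
        }
    }
}
```

Note that the chunk loop passes a growing dataOffset (the second argument to GetBytes) so each call resumes where the previous one stopped; GetBytes returns 0 once the value is exhausted.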


Verification / Alternative check:
You can test this approach with large sample documents and monitor memory usage. With SequentialAccess, the application should handle large files more smoothly compared to the default behavior. Documentation for System.Data.CommandBehavior confirms that SequentialAccess is intended for reading large binary values as streams by requiring that fields be accessed in order and supporting incremental read operations.


Why Other Options Are Wrong:
CommandBehavior.Default uses standard buffering behavior and does not specifically optimize for streaming large binary fields, although it may work for smaller values.
CommandBehavior.SchemaOnly instructs the provider to return only column metadata, not the actual data rows, which is the opposite of what is needed for reading documents.
CommandBehavior.KeyInfo requests additional key and index information with the data and does not affect streaming behavior for large binary columns.


Common Pitfalls:
Developers sometimes attempt to load entire large binary values into memory, which can cause performance problems or even out-of-memory conditions. Another pitfall is forgetting to access fields in ordinal order when using SequentialAccess, which typically raises an InvalidOperationException because earlier columns are already behind the read cursor. When streaming large values, always read columns in order and populate your objects or streams incrementally.
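A hypothetical fragment illustrating the ordering pitfall (the column ordinals and types here are assumptions for the example):

```csharp
using (DbDataReader reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
{
    while (reader.Read())
    {
        // Reading column 1 first advances the cursor past column 0.
        string name = reader.GetString(1);

        // Under SequentialAccess this throws InvalidOperationException,
        // because column 0 can no longer be reached.
        int id = reader.GetInt32(0);
    }
}
```

With the default CommandBehavior the same two calls would succeed in either order, which is why this bug often surfaces only after SequentialAccess is introduced.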


Final Answer:
You should call ExecuteReader with CommandBehavior.SequentialAccess to read the large binary data as a stream.
