Checklist - Data Access Layer

J.D. Meier, Alex Homer, David Hill, Jason Taylor, Prashant Bansode, Lonnie Wall, Rob Boucher Jr, Akshay Bogawat

Design Considerations

  • Abstraction is used to implement a loosely coupled interface to the data access layer (see the sketch after this list).
  • Data access functionality is encapsulated within the data access layer.
  • Application entities have been mapped to data source structures.
  • Data exceptions that can be handled are caught and processed.
  • Connection information is protected from unauthorized access.
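
As a sketch of the first point, the C# fragment below shows the kind of narrow abstraction the rest of the application can depend on; the Customer entity and ICustomerRepository interface are hypothetical names used only for illustration.

    using System.Collections.Generic;

    // Hypothetical application entity mapped to a data source structure.
    public class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    // Callers depend only on this abstraction, which keeps them loosely
    // coupled to whichever data access implementation sits behind it.
    public interface ICustomerRepository
    {
        Customer GetById(int id);
        IEnumerable<Customer> GetAll();
        void Save(Customer customer);
    }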

BLOB

  • Images are stored in a database only when it is not practical to store them on the disk.
  • BLOBs are used to simplify synchronization of large binary objects between servers.
  • Additional database fields are used to provide query support for BLOB data.
  • BLOB data is cast to the appropriate type for manipulation within your business or presentation layer (see the sketch after this list).
  • You did not store BLOBs in the database when using buffered transmission.
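
The sketch below shows one way to retrieve BLOB data and cast it to a typed value for use in the business or presentation layer, assuming ADO.NET against SQL Server; the connection string, table, and column names are hypothetical.

    using System;
    using System.Data.SqlClient;

    public static class BlobReader
    {
        // Reads an image column and casts the BLOB to a byte array so the
        // business or presentation layer can work with a typed value.
        public static byte[] LoadPhoto(string connectionString, int employeeId)
        {
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(
                "SELECT Photo FROM Employees WHERE EmployeeId = @Id", connection))
            {
                command.Parameters.AddWithValue("@Id", employeeId);
                connection.Open();

                object result = command.ExecuteScalar();
                return result == null || result == DBNull.Value ? null : (byte[])result;
            }
        }
    }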

Batching

  • Batched commands are used to reduce round trips to the database and minimize network traffic (see the sketch after this list).
  • Queries that are largely similar are batched together for maximum benefit.
  • Batched commands are used with a DataReader to load or copy multiple sets of data.
  • Bulk copy utilities are used when loading large amounts of file-based data into the database.
  • You did not place locks on long-running batch commands.
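
A minimal ADO.NET sketch of the batching ideas above; the statements, table names, and the SqlBulkCopy destination are hypothetical.

    using System.Data;
    using System.Data.SqlClient;

    public static class BatchExamples
    {
        // Two related statements sent to the server in a single round trip.
        public static void UpdateStatuses(string connectionString)
        {
            const string batch =
                "UPDATE Orders SET Status = 'Shipped' WHERE Status = 'Packed';" +
                "UPDATE Orders SET Status = 'Packed'  WHERE Status = 'Picked';";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(batch, connection))
            {
                connection.Open();
                command.ExecuteNonQuery();
            }
        }

        // Bulk copy for loading large amounts of file-based data that has
        // already been staged in a DataTable.
        public static void BulkLoad(string connectionString, DataTable stagedRows)
        {
            using (var bulkCopy = new SqlBulkCopy(connectionString))
            {
                bulkCopy.DestinationTableName = "dbo.ImportedOrders";
                bulkCopy.WriteToServer(stagedRows);
            }
        }
    }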

Connections

  • Connections are opened as late as possible and closed as early as possible.
  • Trusted sub-system authentication was used to maximize the effectiveness of connection pooling.
  • Transactions are performed through a single connection where possible.
  • Connection information is not stored in a System or User Data Source Name (DSN) where possible.
  • Retry logic was used to manage situations where the connection to the data source is lost or times out (see the sketch after this list).
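
The sketch below opens the connection as late as possible, lets the using block close it as early as possible, and retries a failed attempt; the simple fixed-attempt retry policy and the query are assumptions for illustration, not a prescribed implementation.

    using System;
    using System.Data.SqlClient;
    using System.Threading;

    public static class ConnectionHelper
    {
        public static int GetOrderCount(string connectionString)
        {
            const int maxAttempts = 3;

            for (int attempt = 1; ; attempt++)
            {
                try
                {
                    using (var connection = new SqlConnection(connectionString))
                    using (var command = new SqlCommand(
                        "SELECT COUNT(*) FROM Orders", connection))
                    {
                        connection.Open();                    // opened late
                        return (int)command.ExecuteScalar();  // closed early by 'using'
                    }
                }
                catch (SqlException)
                {
                    if (attempt >= maxAttempts)
                    {
                        throw;                                   // give up and propagate
                    }
                    Thread.Sleep(TimeSpan.FromSeconds(attempt)); // simple back-off
                }
            }
        }
    }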

Data Format

  • You have considered the use of custom data or business entities for improved application maintainability.
  • You did not implement business rules in data structures associated with the data layer.
  • XML was used when working with structured data that changes over time.
  • DataSets have been considered for disconnected operations when dealing with small amounts of data.
  • Serialization and interoperability requirements have been considered.

Exception Management

  • You have identified data access exceptions that should be handled in the data layer.
  • A retry process is implemented for operations that can fail because of data source errors or timeouts.
  • An appropriate exception propagation strategy has been designed (see the sketch after this list).
  • Global exception handling has been implemented to catch unhandled exceptions.
  • You have designed an appropriate logging and notification strategy for critical errors and exceptions.
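
One possible shape for a catch, log, wrap, and propagate strategy, assuming ADO.NET; the DataAccessException type and the Trace-based logging are hypothetical placeholders for your own exception and logging framework.

    using System;
    using System.Data.SqlClient;
    using System.Diagnostics;

    // Hypothetical layer-specific exception used to propagate data access
    // failures without exposing provider details to callers.
    public class DataAccessException : Exception
    {
        public DataAccessException(string message, Exception inner)
            : base(message, inner) { }
    }

    public static class ExceptionPolicy
    {
        // Catches a provider exception the data layer can meaningfully handle
        // (here: log it), then wraps and rethrows so the caller sees a
        // layer-appropriate exception with the original preserved as inner.
        public static void Execute(Action dataOperation)
        {
            try
            {
                dataOperation();
            }
            catch (SqlException ex)
            {
                Trace.TraceError("Data access failure: {0}", ex);
                throw new DataAccessException("A data access error occurred.", ex);
            }
        }
    }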

Object Relational Mapping Considerations

  • A framework has been used or developed that provides a layer between domain entities and the database.
  • In a greenfield environment, an O/RM tool is used to generate a schema that supports the object model and to provide the mapping between the database and domain entities.
  • In a brownfield environment, tools are used to map between the existing domain model and the relational model.
  • A data access pattern, such as Repository, is used for smaller applications or when O/RM tools are not available.

Queries

  • Parameterized SQL statements are used, instead of embedding literal values, to reduce the chance of SQL injection (see the sketch after this list).
  • User input has been validated when used with dynamically generated SQL queries.
  • String concatenation has not been used to build dynamic queries in the data layer.
  • Objects are used to build the database query.
  • Business processing logic is not mixed with the logic used to generate SQL statements when building dynamic SQL.
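
A short sketch of a parameterized query, assuming ADO.NET with SQL Server; the table, column, and parameter names are hypothetical.

    using System.Collections.Generic;
    using System.Data;
    using System.Data.SqlClient;

    public static class CustomerQueries
    {
        // User input is bound to a typed parameter rather than concatenated
        // into the SQL text, which removes the usual SQL injection vector.
        public static IList<string> FindNamesByCity(string connectionString, string city)
        {
            var names = new List<string>();

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(
                "SELECT Name FROM Customers WHERE City = @City", connection))
            {
                command.Parameters.Add("@City", SqlDbType.NVarChar, 50).Value = city;
                connection.Open();

                using (SqlDataReader reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        names.Add(reader.GetString(0));
                    }
                }
            }

            return names;
        }
    }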

Stored Procedures

  • Output parameters are used to return single values (see the sketch after this list).
  • Individual parameters are used for single data inputs.
  • XML parameters have been considered for passing lists or tabular data.
  • Memory-based temporary tables are used only when required.
  • Error handling has been implemented to return errors that can be handled by the application code.
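
A sketch of calling a stored procedure with an individual typed input parameter and a single value returned through an output parameter; the procedure name and parameters are hypothetical.

    using System.Data;
    using System.Data.SqlClient;

    public static class OrderProcedures
    {
        public static decimal GetOrderTotal(string connectionString, int orderId)
        {
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand("dbo.GetOrderTotal", connection))
            {
                command.CommandType = CommandType.StoredProcedure;

                // Individual typed parameter for the single data input.
                command.Parameters.Add("@OrderId", SqlDbType.Int).Value = orderId;

                // Output parameter used to return a single value.
                SqlParameter total = command.Parameters.Add("@Total", SqlDbType.Money);
                total.Direction = ParameterDirection.Output;

                connection.Open();
                command.ExecuteNonQuery();
                return (decimal)total.Value;
            }
        }
    }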

Transactions

  • Transactions are enabled only when you need them.
  • Transactions are kept as short as possible to minimize the amount of time that locks are held.
  • An appropriate isolation level has been chosen by trading off data consistency against contention.
  • Manual or explicit transactions are used when performing transactions against a single database (see the sketch after this list).
  • You have considered the use of Multiple Active Result Sets (MARS) in transaction-heavy concurrent applications to avoid potential deadlock issues.
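
The sketch below keeps an explicit (manual) transaction against a single database as short as possible and sets the isolation level explicitly; the account table, column names, and the choice of ReadCommitted are assumptions for the example.

    using System.Data;
    using System.Data.SqlClient;

    public static class TransferService
    {
        public static void Transfer(string connectionString, int fromId, int toId, decimal amount)
        {
            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();

                // Isolation level chosen explicitly as a consistency/contention trade-off.
                using (SqlTransaction transaction =
                    connection.BeginTransaction(IsolationLevel.ReadCommitted))
                {
                    try
                    {
                        using (var debit = new SqlCommand(
                            "UPDATE Accounts SET Balance = Balance - @Amount WHERE Id = @Id",
                            connection, transaction))
                        {
                            debit.Parameters.AddWithValue("@Amount", amount);
                            debit.Parameters.AddWithValue("@Id", fromId);
                            debit.ExecuteNonQuery();
                        }

                        using (var credit = new SqlCommand(
                            "UPDATE Accounts SET Balance = Balance + @Amount WHERE Id = @Id",
                            connection, transaction))
                        {
                            credit.Parameters.AddWithValue("@Amount", amount);
                            credit.Parameters.AddWithValue("@Id", toId);
                            credit.ExecuteNonQuery();
                        }

                        transaction.Commit();   // commit quickly so locks are released
                    }
                    catch
                    {
                        transaction.Rollback();
                        throw;
                    }
                }
            }
        }
    }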

Validation

  • All data received by the data layer is validated.
  • User input used for dynamic SQL has been validated to prevent SQL injection attacks.
  • All trust boundaries are identified, and data that crosses these boundaries is validated.
  • You have determined whether validation that occurs in other layers is sufficient, or whether the data layer must validate the data again.
  • The data layer returns informative error messages if validation fails.

XML

  • XML readers and writers are used to access XML-formatted data.
  • An XML schema was used to define formats for data stored and transmitted as XML.
  • XML data is validated against the appropriate schemas (see the sketch after this list).
  • Custom validators are used for complex data parameters within your XML schema.
  • XML indexes have been considered for read-heavy applications that use XML in SQL Server.
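
A sketch of reading XML-formatted data with an XmlReader while validating it against a schema, assuming System.Xml; the file and schema paths are hypothetical.

    using System.Xml;
    using System.Xml.Schema;

    public static class XmlValidation
    {
        // Validates the document against the supplied schema as it is read.
        public static void ReadOrders(string xmlPath, string schemaPath)
        {
            var settings = new XmlReaderSettings();
            settings.Schemas.Add(null, schemaPath);          // target namespace taken from the schema
            settings.ValidationType = ValidationType.Schema;
            settings.ValidationEventHandler += (sender, e) =>
            {
                throw new XmlSchemaValidationException(e.Message, e.Exception);
            };

            using (XmlReader reader = XmlReader.Create(xmlPath, settings))
            {
                while (reader.Read())
                {
                    // Process nodes here; schema violations surface via the handler.
                }
            }
        }
    }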

Manageability Considerations

  • A common interface or abstraction layer is used to provide an interface to the data layer.
  • You have considered creating custom entities, or if other data representations better meet your requirements.
  • Business or data entities are defined by deriving them from a base class that provides basic functionality and encapsulates common tasks.
  • Business or data entities rely on data access logic components for database interaction.
  • You have considered the use of stored procedures to abstract data access from the underlying data schema.

Performance Considerations

  • Connection pooling has been optimized based on performance testing.
  • Isolation levels have been tuned for data queries.
  • Commands are batched to reduce round-trips to the database server.
  • Optimistic concurrency is used with non-volatile data to mitigate the cost of locking data in the database.
  • Ordinal lookups are used for faster performance when using a DataReader (see the sketch after this list).
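
A sketch of ordinal lookups with a DataReader: the column ordinal is resolved once, outside the read loop, and reused for each row; the query and column names are hypothetical.

    using System.Collections.Generic;
    using System.Data.SqlClient;

    public static class ProductReader
    {
        public static IList<string> GetProductNames(string connectionString)
        {
            var names = new List<string>();

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(
                "SELECT ProductId, Name FROM Products", connection))
            {
                connection.Open();

                using (SqlDataReader reader = command.ExecuteReader())
                {
                    int nameOrdinal = reader.GetOrdinal("Name");   // resolve the ordinal once

                    while (reader.Read())
                    {
                        names.Add(reader.GetString(nameOrdinal));  // ordinal access per row
                    }
                }
            }

            return names;
        }
    }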

Security Considerations

  • Windows authentication has been used instead of SQL authentication when using Microsoft SQL Server.
  • Encrypted connection strings in configuration files are used rather than using a system or user DSN.
  • A salted hash, rather than an encrypted version of the password, is stored when persisting passwords (see the sketch after this list).
  • Identity information is passed to the data layer for auditing purposes.
  • Typed parameters are used with stored procedures and dynamic SQL to protect against SQL injection attacks.
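
A sketch of producing a salted hash for password storage rather than an encrypted (reversible) value; the salt size and the use of SHA-256 are assumptions, and an iterated key-derivation scheme such as Rfc2898DeriveBytes is usually preferable in practice.

    using System;
    using System.Security.Cryptography;
    using System.Text;

    public static class PasswordHasher
    {
        // Returns a random per-user salt and the hash of salt + password;
        // store both, never the password itself.
        public static void CreateSaltedHash(string password, out byte[] salt, out byte[] hash)
        {
            salt = new byte[16];
            RandomNumberGenerator.Create().GetBytes(salt);

            byte[] passwordBytes = Encoding.UTF8.GetBytes(password);
            byte[] salted = new byte[salt.Length + passwordBytes.Length];
            Buffer.BlockCopy(salt, 0, salted, 0, salt.Length);
            Buffer.BlockCopy(passwordBytes, 0, salted, salt.Length, passwordBytes.Length);

            using (SHA256 sha = SHA256.Create())
            {
                hash = sha.ComputeHash(salted);
            }
        }
    }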

Deployment Considerations

  • The data access layer is located on the same tier as the business layer to improve application performance.
  • The TCP protocol has been used to improve performance when you need to support a remote data access layer.
  • You did not locate the data access layer on the same server as the database.
