DoDAF Viewpoints and Models
Data and Information Viewpoint
DIV-2: Logical Data Model
The DIV-2 allows analysis of an architecture's data definition aspect without consideration of implementation-specific or product-specific issues.
Another purpose is to provide a common dictionary of data definitions so that models consistently express logical-level data elements wherever they appear in the descriptions. Relationships to data definitions in other models include:
- Data described in a DIV-2 may be related to Information in an OV-1 High Level Operational Concept Graphic or an Activity Resource (where the Resource is Data) flow object in an OV-5b Operational Activity Model. This relation may be a simple subtype, where the Data is a proceduralized (structured) way of describing something. Recall that Information describes something. Alternatively, the relation may be complex, using Information and Data whole-part (and overlap) relationships.
- The DIV-2 information entities and elements can be constrained and validated by the capture of business requirements in the OV-6a Operational Rules Model.
- The information entities and elements modeled in the DIV-2 also capture the information content of messages that connect life-lines in an OV-6c Event-Trace Description.
- The DIV-2 may capture elements required due to Standards in the StdV-1 Standards Profile or StdV-2 Standards Forecast.
The DIV-2 is a generalized formal structure in the computer science sense. It directly reflects the paradigm- or theory-oriented mapping from the DIV-1 Conceptual Data Model to the DIV-2.
Possible Construction Methods: DoDAF does not endorse a specific data modeling methodology. The appropriate way to develop a logical data model depends on the technology chosen as the main design solution (e.g., relational theory or object-orientation). For relational theory, a logical data model seems best described using an entity relationship diagramming technique. For Object-Oriented, a logical data model seems best described using Class and/or Object diagrams.
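To make the object-oriented construction style concrete, the following is a minimal sketch of logical-level entities expressed as classes. All entity and attribute names here (Platform, Track, and their fields) are hypothetical illustrations, not drawn from any DoDAF standard; an equivalent relational model would express the same content as entities, attributes, and keys in an entity relationship diagram.

```python
from dataclasses import dataclass

# Hypothetical logical-level entities for an object-oriented DIV-2 sketch.
# Names and attributes are illustrative only; a real model would come from
# the architecture's own data dictionary.

@dataclass
class Platform:
    platform_id: str          # primary identifier at the logical level
    platform_type_code: str   # domain-constrained value, e.g. "AIR" or "SEA"
    name: str

@dataclass
class Track:
    track_id: str
    reported_by: Platform     # association to the reporting Platform
    latitude_deg: float       # attribute with an explicit unit in its name
    longitude_deg: float
```

Note that the sketch stays implementation-neutral: it records entities, attributes, and associations, but says nothing about storage, indexing, or product-specific features, which belong in the DIV-3 Physical Data Model.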
In either case, attention should be given to the quality characteristics of the data model. Definitions and accepted measures of data model quality (as distinct from data quality) for logical data models are sparse, though some research and best practices exist. Framed in terms of software verification, validation, and quality factors, the types of best practices include:
- Validation Factors - Was the Right Model Built?
  - Information Requirements Fidelity.
  - Conceptual, Logical, and Physical Traceability.
  - Adherence to Government and Industry Standards and Best Practices.
  - Domain Values.
  - Resource Exchange and Other Interoperability Requirements.
  - Net-Centric Factors.
    - XML Registration.
    - COI Participation.
    - DDMS Compatibility.
    - Identifiers and Labels.
- Verification Factors - Was it Well Built?
  - Design Factors.
    - Abstraction and Generalization.
    - Ontologic Foundations.
    - Semantic Purity.
    - Logical and Physical Redundancy.
    - Separation of Concerns.
  - Software Quality Factors.
    - Naming Conventions.
    - Naming and Business Languages.
    - Enumerations/free text ratio.
An example design factor is normalization: essentially, one representation for any particular real-world object. There are degrees of normalization, with third normal form (3NF) being commonly used. At 3NF, there are no repeating attributes; instead, techniques such as lookup tables, super-subtyping (carrying common attributes at the supertype level), and decomposition of entities into smaller attribute groupings are used. For the DIV-2, care should be taken to avoid hidden overlaps, where there is a semantic overlap between concepts with different entity, attribute, or domain value names.
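The normalization techniques above can be sketched as follows. This is an illustrative example only, with hypothetical entity and attribute names: a denormalized structure with a repeating phone-number attribute group and free-text status is refactored so that the phones become their own entity and the status becomes a lookup (domain-value) entity, in the spirit of 3NF.

```python
from dataclasses import dataclass

# Denormalized sketch: a repeating attribute group (phone1, phone2) and a
# free-text status that invites inconsistent spellings on every row.
# All names here are hypothetical illustrations.

@dataclass
class UnitDenormalized:
    unit_id: str
    status_text: str      # "Active", "active", "ACT" ... uncontrolled
    phone1: str
    phone2: str           # repeating attribute group

# 3NF-style refactoring: the repeating phones decompose into their own
# entity, and the status free text becomes a lookup-table entity so each
# real-world status has exactly one representation.

@dataclass
class StatusCode:         # lookup table for the status domain
    code: str             # e.g. "ACT"
    description: str      # e.g. "Active"

@dataclass
class Unit:
    unit_id: str
    status_code: str      # reference into StatusCode, not free text

@dataclass
class UnitPhone:          # one instance per phone number; no repeats
    unit_id: str
    phone_number: str
```

The refactored form also helps expose hidden overlaps: because each concept has a single named entity and controlled domain values, two differently named attributes describing the same real-world thing are easier to spot and merge.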