  • Enterprise Architecture 3.0 (EA3.0) - the third wave in Enterprise Architecture approaches

  • Generalized Enterprise Function Framework (GEFF)

  • A3 KAM - Triple A (Architectural, Actionable and Augmented) Knowledge Assets Management

  • ABODE™ business transformation methodology, founded on EA3.0

Data Management Products and Services

There are several reasons to use our Data Management Products & Services. Our approach is based on the industry standard DAMA DMBoK 2 (Data Management Body of Knowledge, version 2). These tools mainly support the Transformation phase, although they may also relate to Business, Architecture, or Lifecycle Management.

Data Management Suite

The Data Management Suite (DMS) supports the Transition Capability Information Management, especially the Business Data Management part. It is based on DAMA DMBoK 2, an international standard in the Data Management area. The suite consists of tools that, on the one hand, design, build, use and maintain a Data Hub, Data Warehouse, Data Vault or Data Lake and, on the other hand, support the DAMA DMBoK 2 knowledge areas. The current focus is on organising and creating a central data store of the Data Hub storage type and delivering the requested data; support for the other data storage types is easily added.

 

With the Enterprise Architects Data Management Suite the following is realized:

  1. Data Governance
    • Predefined Data Governance processes can be used by a customer’s data management team; the processes are managed and monitored case by case, support many Data Management policies (requests, incidents, Data Ownership/Stewardship) and are based on the Data Management Capabilities as defined by the Data Architecture;
    • Processes can be published and executed based on the ABODE/EA3.0 approach; amendments and changes can be delivered very rapidly, and the Design and Architecture, including user and functional management, are published on the predefined Data Management Portal.
  2. Data Architecture (also see the Enterprise Architecture Suite)
    • The Data Management team maintains the predefined Data Architecture; this is easily done with the AAM ArchiMate tool;
    • The PROSA (Principles, Requirements, design Objectives, Standards & Agreements) can easily be maintained with the APT (ABODE PROSA Tool);
    • The DtA (Design to Architecture) tool is used to gain insight into the architectural progress of the different delivery objectives.
  3. Data Modeling & Design
    • For Modeling and Design the AOM (ABODE Object Management) tool is used; please click on AOM for more insight into how the different Data Models are interrelated, how the Data Lineage of any Object and its Attributes is kept (a lineage sketch follows after this list), etc.
    • Models can be designed manually or, where the database schema is accessible and usable enough, generated automatically. Exchange with other data modeling tools is also possible via different interfaces (XML, JSON, etc.).
  4. Data Storage & Operations
    • The idea is that a central store is created (a Data Hub or another storage type) and that different data delivery approaches can be used: Source Data Stores are integrated into the central store, and Target Data Stores for data marts, BI reporting, services, etc. can be updated either in batch (ETL-like solutions) or in real time (services based on publish/subscribe patterns; see the sketch after this list).
  5. Data Security
    • Data Security is supported by extending the services that access databases with ARBAC (Attribute & Role Based Access Control) enforcement (a sketch follows after this list).
  6. Data Integration & Interoperability
    • Data Integration is the batch approach, delivered by:
      • Generating Stored Procedures (SQL Scripts) based on the Data Mapping technique (a sketch follows after this list)
      • Integrating the Stored Procedures in ETL scripts
    • Data Interoperability is the real-time approach, delivered by (micro)services:
      • Generating messages based on the Data Mapping technique
      • Generating (micro)services for source and target data stores
    • A combination of batch and real-time integration is possible
  7. Reference & Master Data
    Master Data models are created separately from the Reference Data. This keeps the models simple while supporting maintainability.
  8. Data Warehousing & Business Intelligence
    A Data Warehouse can serve as either a Corporate Data Store or a Target Data Store, while Data Marts are always Target Data Stores.
  9. Metadata Management
    Metadata describes information about data components (objects, attributes, relations), specifically for managing and understanding data and their models. During or after modeling, metadata is added to these components. The use of a Data Dictionary (repository; a sketch follows after this list)
    • Helps with maintaining the data components and their metadata separately
    • Enables publication, for example in a Data Management Portal, so that definitions, validation, integrity and security rules, and metadata are accessible and usable for everyone.
  10. Data Quality
    Data Quality is partly defined by the data integrity rules and partly by other quality rules. These rules act as controls during execution (ETL scripts and services). Because error codes and/or error messages can be attached to the rules, exceptions can be reported during data integration and interoperability, and a roll-back of the complete ETL run is possible (a sketch follows after this list).
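
The following sketches make several of the mechanisms above concrete. First, item 3 mentions keeping the Data Lineage of any Object and its Attributes. A minimal sketch of recording and tracing such lineage is shown below; the LINEAGE structure and the attribute names are illustrative assumptions, not the actual AOM data model.

    # Hypothetical lineage store: target attribute -> list of source attributes.
    # This is an illustrative structure, not the actual AOM data model.
    LINEAGE = {
        "dwh.customer.name":    ["crm.client.full_name"],
        "crm.client.full_name": ["src.person.first_name", "src.person.last_name"],
    }

    def trace(attribute):
        """Recursively resolve an attribute back to its original sources."""
        sources = LINEAGE.get(attribute)
        if not sources:                      # no mapping: this is an origin
            return [attribute]
        result = []
        for src in sources:
            result.extend(trace(src))
        return result

    print(trace("dwh.customer.name"))
    # ['src.person.first_name', 'src.person.last_name']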
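
Item 4 distinguishes batch delivery from real-time delivery via publish/subscribe patterns. The sketch below shows the publish/subscribe pattern in its simplest in-process form; the bus, the topic name and the target store are assumptions for illustration, not the suite's actual service infrastructure.

    from collections import defaultdict

    # Minimal in-process publish/subscribe bus (illustrative only).
    subscribers = defaultdict(list)

    def subscribe(topic, handler):
        subscribers[topic].append(handler)

    def publish(topic, message):
        for handler in subscribers[topic]:
            handler(message)

    # A hypothetical Target Data Store updates itself on each change event.
    data_mart = {}
    subscribe("customer.changed",
              lambda msg: data_mart.update({msg["id"]: msg}))

    # The central store publishes a change; the subscriber picks it up in real time.
    publish("customer.changed", {"id": 42, "name": "Acme BV"})
    print(data_mart)   # {42: {'id': 42, 'name': 'Acme BV'}}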
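
Item 5 refers to ARBAC controls. A minimal sketch of combining role checks with attribute constraints follows; the policy format and the is_allowed function are hypothetical and chosen only to illustrate the principle.

    # Hypothetical ARBAC policy: a role grants an action on a resource,
    # optionally constrained by user attributes (illustrative format).
    POLICY = [
        {"role": "data_steward", "action": "read",  "resource": "customer"},
        {"role": "data_steward", "action": "write", "resource": "customer",
         "attributes": {"department": "sales"}},
    ]

    def is_allowed(user, action, resource):
        """Grant access only if a rule matches role, action and resource,
        and every attribute constraint is satisfied by the user."""
        for rule in POLICY:
            if (rule["role"] in user["roles"]
                    and rule["action"] == action
                    and rule["resource"] == resource
                    and all(user["attributes"].get(k) == v
                            for k, v in rule.get("attributes", {}).items())):
                return True
        return False

    user = {"roles": ["data_steward"], "attributes": {"department": "finance"}}
    print(is_allowed(user, "read", "customer"))   # True  (role suffices)
    print(is_allowed(user, "write", "customer"))  # False (wrong department)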
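
Item 6 describes generating Stored Procedures (SQL Scripts) from the Data Mapping technique. The suite's own generator is not documented here; the sketch below only illustrates the principle of deriving an SQL script from a declarative mapping, with assumed table and column names.

    # Hypothetical data mapping: target column -> source expression.
    MAPPING = {
        "customer_id":   "c.id",
        "customer_name": "c.first_name || ' ' || c.last_name",
        "segment":       "COALESCE(c.segment, 'UNKNOWN')",
    }

    def generate_insert(target_table, source_table, mapping):
        """Render an INSERT ... SELECT statement from a column mapping."""
        cols  = ", ".join(mapping.keys())
        exprs = ",\n       ".join(mapping.values())
        return (f"INSERT INTO {target_table} ({cols})\n"
                f"SELECT {exprs}\n"
                f"FROM {source_table} c;")

    print(generate_insert("dwh_customer", "src_customer", MAPPING))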
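
Item 9 introduces the Data Dictionary in which data components and their metadata are maintained and published. A minimal sketch of such a repository follows; the entry structure is an assumption, not the suite's actual format.

    from dataclasses import dataclass, field

    @dataclass
    class DictionaryEntry:
        """One data component with its descriptive metadata (illustrative)."""
        name: str
        definition: str
        data_type: str
        rules: list = field(default_factory=list)   # validation/integrity rules

    DATA_DICTIONARY = {
        "customer.email": DictionaryEntry(
            name="customer.email",
            definition="Primary contact e-mail address of the customer.",
            data_type="VARCHAR(254)",
            rules=["must contain '@'", "unique per customer"],
        ),
    }

    # Publication, e.g. on a Data Management Portal, can be as simple as
    # rendering the entries to text:
    for entry in DATA_DICTIONARY.values():
        print(f"{entry.name} ({entry.data_type}): {entry.definition}")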
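
Item 10 states that quality rules carry error codes and that the complete ETL run can be rolled back when exceptions occur. The sketch below demonstrates that mechanism with a database transaction; the rule format and the error codes are illustrative assumptions.

    import sqlite3

    # Hypothetical quality rules: (error code, message, check function).
    RULES = [
        ("DQ001", "customer name must not be empty", lambda row: bool(row[1])),
        ("DQ002", "customer id must be positive",    lambda row: row[0] > 0),
    ]

    rows = [(1, "Acme BV"), (-5, "")]        # second row violates both rules

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE dwh_customer (id INTEGER, name TEXT)")

    try:
        with con:                            # one transaction for the whole load
            for row in rows:
                errors = [(code, msg) for code, msg, check in RULES
                          if not check(row)]
                if errors:
                    raise ValueError(f"row {row}: {errors}")
                con.execute("INSERT INTO dwh_customer VALUES (?, ?)", row)
    except ValueError as exc:
        print("load rolled back:", exc)

    # The rollback removed the first, valid row as well:
    print(con.execute("SELECT COUNT(*) FROM dwh_customer").fetchone()[0])  # 0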

Data Modeling Suite

Enterprise Architects distinguishes the following tools (please click for extra information):

  • Object Modeling and Management tool: specifically for manual design or data model import/export
  • Integrated Schema Solution: imports the data model directly from the source data store's database schema or, vice versa, generates the database schema from the data model (a sketch follows below)
  • Integrated Service Engineering: generates microservices based on the Data Mapping techniques to enable real-time delivery
  • Integrated Stored Procedure: generates Stored Procedures (SQL Scripts) based on the Data Mapping techniques to enable batch data delivery via ETL
  • Extract-Transform-Load: tool for composing and executing stored procedures for migration and/or conversion from one (source) data store to another (target) data store, including data validation and integrity controls
  • Monitoring Data Lineage (model components) & Logging (migration & conversion)
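
To illustrate the schema-to-model direction of the Integrated Schema Solution, the sketch below reads the table structure of a live database and turns it into simple model objects. SQLite and the Entity/Attribute classes are assumptions chosen to keep the example self-contained; the actual tool and the databases it supports may differ.

    import sqlite3
    from dataclasses import dataclass

    @dataclass
    class Attribute:            # illustrative model element
        name: str
        data_type: str

    @dataclass
    class Entity:               # illustrative model element
        name: str
        attributes: list

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")

    def import_entity(con, table):
        """Derive a model Entity from the live database schema."""
        cols = con.execute(f"PRAGMA table_info({table})").fetchall()
        # each row: (cid, name, type, notnull, default, pk)
        return Entity(table, [Attribute(c[1], c[2]) for c in cols])

    print(import_entity(con, "customer"))
    # Entity(name='customer', attributes=[Attribute(name='id', data_type='INTEGER'),
    #                                     Attribute(name='name', data_type='TEXT')])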