Which Version Control Systems Are Supported by Denodo Virtualization 7.0?

Using version control is a recommended best practice for Denodo Virtual DataPort (VDP). Version 7.0 of Denodo virtualization supports three Version Control Systems (VCS):

  • Microsoft Team Foundation Server (TFS) 2010 or later
  • Apache Subversion (1.7), and
  • Git

Denodo Data Virtualization Project Roles

A Denodo virtualization project typically classifies the duties of the primary implementation team into five primary roles:

  • Data Virtualization Architect
  • Denodo Platform Administrator
  • Data Virtualization Developer
  • Denodo Platform Java Programmer
  • Data Virtualization Internal Support Team

Role To Project Team Member Alignment

While Denodo project roles group security permissions and sets of duties, it is important to note that the assignment of roles among project team members can be very dynamic.  The team member who performs a given role can change over the lifecycle of a Denodo project.  One team member may hold more than one role at any given time, or may acquire or lose roles based on the needs of the project.

Denodo Virtualization Project Role Duties

Data Virtualization Architect

The knowledge, responsibilities, and duties of a Denodo Data Virtualization Architect include:

  • A deep understanding of Denodo security features and data governance
  • Defines and documents best practices for users, roles, and security permissions
  • Has a strong understanding of enterprise data/information assets
  • Defines data virtualization architecture and deployments
  • Guides the definition and documentation of the virtual data model, including delivery modes, data sources, data combinations, and transformations

Denodo Platform Administrator

The knowledge, responsibilities, and duties of a Denodo Platform Administrator include:

  • Denodo platform installation and maintenance, such as:
    • Installs Denodo platform servers
    • Defines Denodo platform update and upgrade policies
    • Creates, edits, and removes environments, clusters, and servers
    • Manages Denodo licenses
    • Defines Denodo platform backup policies
    • Defines procedures for artifact promotion between environments
  • Denodo platform configuration and management, such as:
    • Configures Denodo platform server ports
    • Configures platform memory and Java Virtual Machine (JVM) options
    • Sets the maximum number of concurrent requests
    • Sets up the cache database configuration, including the specific cache server
    • Configures authentication for users connecting to the Denodo platform (e.g., LDAP)
    • Secures (SSL) communications connections of Denodo components
    • Provides connectivity credential details for client tools/applications (JDBC, ODBC, etc.)
    • Configures resources
    • Sets up the Version Control System (VCS) configuration for Denodo
    • Creates new virtual databases
    • Creates users and roles, and assigns privileges/roles
    • Executes diagnostics and monitoring operations, analyzes logs, and identifies potential issues
    • Manages load balancing variables

Data Virtualization Developer

The Data Virtualization Developer role is divided into the following sub-roles:

  • Data Engineer
  • Business Developer
  • Application Developer

The knowledge, responsibilities, and duties of a Denodo Data Virtualization Developer, by sub-role, include:

Data Engineer

The Denodo Data Engineer’s duties include:

  • Implements the virtual data model construction by:
    • Importing data sources and creating base views, and
    • Creating derived views, applying combinations and transformations to the datasets
  • Writes documentation and defines testing to eliminate development errors before code promotion to other environments

Business Developer

The Denodo Business Developer’s duties include:

  • Creates business views for a specific business area from derived and/or interface views
  • Implements data services delivery
  • Writes documentation

Application Developer

The Denodo Application Developer’s duties include:

  • Creates reporting views from business views for reports and/or datasets frequently consumed by users
  • Writes documentation

Denodo Platform Java Programmer

The Denodo Platform Java Programmer is an optional, specialized role, which:

  • Creates custom Denodo components, such as data sources, stored procedures, and VDP/ITPilot functions
  • Implements custom filters in data routines
  • Tests and debugs any custom components using Denodo4e

Data Virtualization Internal Support Team

The Denodo Data Virtualization Internal Support Team’s duties include:

  • Access to and knowledge of the use and troubleshooting of developed solutions
  • Tools and procedures to manage and support project users and developers

Denodo Virtual DataPort (VDP) Naming Convention Guidance

Denodo provides some general Virtual DataPort naming convention recommendations and guidance.  First, there is general guidance for basic Virtual DataPort object types and, second, more detailed naming guidance recommendations.

Denodo Basic Virtual DataPort (VDP) Object Prefix Recommendations

  • Associations Prefix: a_{name}
  • Base Views Prefix: bv_{name}
  • Data Sources Prefix: ds_{name}
  • Integration View Prefix: iv_{name}
  • JMS Listeners Prefix: jms_{name}
  • Interfaces Prefix: i_{name}
  • Web Service Prefix: ws_{name}
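For illustration, the prefixes might be applied to a hypothetical customer subject area as follows (all object names below are invented examples, not Denodo-supplied names):

  ds_crm (data source for the CRM database)
  bv_crm_customer (base view over the CRM customer table)
  iv_customer_sales (integration view combining customer and sales data)
  i_customer (interface view defining the customer contract)
  a_customer_sales (association between the customer and sales views)
  jms_customer_events (JMS listener for customer event messages)
  ws_customer (web service publishing the customer views)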

Virtual DataPort (VDP) High-Level Project Structure

Different layers are identified when creating logical folder hierarchies within each data virtualization project.  The recommended high-level project folders are:

Connectivity

  • The Connectivity folder contains the related physical systems, data sources, and base views.

Integration

  • The Integration folder contains the combination and transformation views used by the next layers.  Views at this level are not directly consumed.

Business Entities

  • The Business Entities folder contains the canonical business entities exposed to all users.

Report Views

  • The Report Views folder contains pre-built reports and analyses frequently consumed by users.

Data Services

  • The Data Services folder contains web services for publishing views from other levels.  It can also contain views needed for data formatting and manipulation.

Associations

  • This folder stores associations.

JMS listeners

  • This folder stores JMS listeners.

Stored procedures

  • This folder stores custom stored procedures developed using the VDP API.
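Putting these layers together, a minimal sketch of a project folder hierarchy might look like the following (the folder names and numbering are illustrative assumptions, not Denodo requirements):

  01 Connectivity
     Data Sources
     Base Views
  02 Integration
  03 Business Entities
  04 Report Views
  05 Data Services
  06 Associations
  07 JMS Listeners
  08 Stored Procedures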

Denodo Knowledge Base VDP Naming Conventions

Additional, more detailed naming convention and Virtual DataPort organization guidance is available in the Denodo Community Knowledge Base, under Operations:

Knowledge Base Virtual DataPort (VDP) Naming Conventions Online Page

Virtual DataPort (VDP) Naming Conventions Downloadable PDF

What is Information Management?

Information Management Definition

The definition of Information Management (IM) tends to vary based on your business perspective, but broadly, IM encompasses all the systems, processes, and practices (business and technical) within an organization for the creation, use, and disposal of business information to support business operations.

Information Management (IM) Activities

Information Management activities may include, but are not limited to:

  • Information creation, capture, storage, and disposal
  • The governance of information, practices, meaning, and usage
  • Information protection, regulatory compliance, privacy, and limiting legal liability
  • Technological infrastructure, such as architecture, strategies, and delivery enablement

Netezza / PureData – How to add comments on a field

The ‘COMMENT ON COLUMN’ statement provides the same self-documentation capability as ‘COMMENT ON TABLE’, but drives the capability down to the column (field) level.  This provides an opportunity to describe the purpose, business meaning, and/or source of a field to other developers and users.  The comment code is part of the DDL and can be migrated with the table structure DDL.  The statements can be run independently or, when working with Aginity for PureData System for Analytics, they can be run as a group with the table DDL using the ‘Execute as a Single Batch’ (Ctrl+F5) command.

Basic ‘COMMENT ON COLUMN’ Syntax

  • The basic syntax to add a comment to a column is:

COMMENT ON COLUMN <<Schema.TableName.ColumnName>> IS '<<Descriptive Comment>>';

Example ‘COMMENT ON COLUMN’ Syntax

  • This is example syntax, which would need to be adapted and applied to each column:

COMMENT ON COLUMN time_dim.time_srky IS 'time_srky is the primary key and is a surrogate key derived from the date business/natural key';
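For context, here is a minimal sketch showing how the column comments might be grouped with the table DDL and executed together as a single batch; the sales_dim table, its columns, and the comment text are hypothetical examples:

-- Hypothetical Netezza dimension table (illustrative only)
CREATE TABLE sales_dim
(
    sales_srky BIGINT NOT NULL,
    sale_date  DATE,
    sales_amt  NUMERIC(18,2)
)
DISTRIBUTE ON (sales_srky);

-- Self-documenting comments migrated with the table DDL
COMMENT ON TABLE sales_dim IS 'Sales dimension; one row per sale';
COMMENT ON COLUMN sales_dim.sales_srky IS 'sales_srky is the primary key and is a surrogate key';
COMMENT ON COLUMN sales_dim.sale_date IS 'Business date of the sale';
COMMENT ON COLUMN sales_dim.sales_amt IS 'Total sale amount in US dollars';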

What Are the Factors Affecting the Selection of a Data Warehouse Naming Convention?

The primary factors affecting the choices in the creation of Data Warehouse (DW) naming convention policy standards are the type of implementation, the pattern of the implementation, and any preexisting conventions.

Type of implementation

The type of implementation will affect your naming convention choices.  Basically, this boils down to whether you are working with a Commercial-Off-The-Shelf (COTS) data warehouse or doing a custom build.

Commercial-Off-The-Shelf (COTS)

If it is a Commercial-Off-The-Shelf (COTS) warehouse, which you are modifying and/or enhancing, then it is very strongly recommended that you conform to the naming conventions of the COTS product.  However, you may want to add an identifier to the conventions to identify your custom objects.

Using this information as an exemplar:

  • FAV = Favinger, Inc. (Company Name – Custom Identifier)
  • GlobalSales = Global Sales (Subject)
  • MV = Materialized View (Object Type)

Suffix Pattern Naming Convention

<<Custom Identifier>>_<<Object Subject Name>>_<<Object Type>>

Example:  FAV_GlobalSales_MV

Prefix Pattern Naming Convention

<<Object Type>>_<<Custom Identifier>>_<<Object Subject Name>>

Example: MV_FAV_GlobalSales
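To make the two patterns concrete, here is how each might extend across several object types for the same exemplar (the TBL and VW type codes are illustrative assumptions, not part of the original exemplar):

Suffix pattern:

  FAV_GlobalSales_TBL (table)
  FAV_GlobalSales_VW (view)
  FAV_GlobalSales_MV (materialized view)

Prefix pattern:

  TBL_FAV_GlobalSales (table)
  VW_FAV_GlobalSales (view)
  MV_FAV_GlobalSales (materialized view)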

Custom Data Warehouse Build

If you are creating a custom data warehouse from scratch, then you have more flexibility in choosing your naming convention.  However, you will still need to take into account a few factors to achieve the maximum benefit from your naming conventions.

  • What is the high-level pattern of your design?
  • Are there any preexisting naming conventions?

Data Warehouse Patterns

Your naming convention will need to take into account the overall intent and design pattern of the data warehouse.  The objects and naming conventions of each pattern will vary, if for no other reason than the differences in the objects, their purposes, and the depth of their relationships.

High level Pattern of the Data Warehouse Implementation

The high-level pattern of your design, whether an Operational Data Store (ODS), Enterprise Data Warehouse (EDW), Data Mart (DM), or something else, will need to guide your naming convention, as the depth of the logical and/or processing zones of each pattern will vary and have some industry generally accepted conventions.

Structural Pattern of the Data Warehouse Implementation

The structural pattern of your data warehouse design, whether Snowflake, Third Normal Form (3NF), or Star Schema, will need to guide your naming convention, as the depth of relationships in each pattern will vary, have some industry generally accepted conventions, and relate directly to your high-level data warehouse pattern.

Preexisting Conventions

An often omitted factor in data warehouse naming conventions is the set of preexisting conventions, which can have significant impacts from both an engineering and a political point of view.  The sources of these conventions can vary and may or may not be formally documented.

A common source of naming convention conflict is preexisting implementations, which may not even be documented.  However, the system objects and conventions that consumers are familiar with, and will be exposed to, will need to be taken into account when assessing impacts to systems, political culture, user training, and the creation of a standard convention for your data warehouse.

The Relational Database Management System (RDBMS) in which you intend to build the data warehouse may have generally accepted conventions with which consumers may be familiar and about which they may have preconceived expectations (whether expressed or implied).

Change Management

Whatever data warehouse naming convention you choose, the naming conventions, along with the data warehouse design pattern assumptions, should be well documented and placed in a managed, readily accessible change management (CM) repository.
