Denodo Security Enforcement

As the Virtual DataPort Administration Guide explains in the “Types of Access Rights” section, access rights can be granted on VDP databases, views, rows, and columns. The denodo role-based access mechanism controls what a user or user role can see and use in the virtual layer, including the data catalog.

Important Denodo Security Notes

  • Consumer security authorization is enforced at the object level first, then at the data level
  • Consumer security authorization is not enforced on modeling layers/VDP folders
  • Using a virtual database to partition projects or subjects is a best practice

The ability to grant security is as follows:

VDP Database

  • Permission grants include connection, creation, read, write, and admin privileges over a VDP database.

VDP Views

  • Permission grants include read, write, insert, update, and delete privileges over a view.

VDP Columns Within a VDP View

  • Permission grants include denying the projection of specific columns/fields within a view.

Row Level Security

  • Row-level restrictions can be added so that users obtain only the rows that match a certain condition, or so that all rows are returned with the sensitive fields masked (see the sketch below).
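
Conceptually, a row restriction behaves like the sketch below. This is a plain Python illustration only, not denodo VQL; the field names, the region condition, and the masking value are hypothetical.

    # Hypothetical rows; "ssn" stands in for a sensitive field.
    rows = [
        {"region": "EMEA", "customer": "Acme", "ssn": "123-45-6789"},
        {"region": "APAC", "customer": "Globex", "ssn": "987-65-4321"},
    ]

    def restrict_rows(rows, allowed_region):
        # Option 1: return only the rows that match the user's condition.
        return [row for row in rows if row["region"] == allowed_region]

    def mask_sensitive_fields(rows, sensitive=("ssn",)):
        # Option 2: return every row, but mask the sensitive fields.
        return [
            {key: ("****" if key in sensitive else value) for key, value in row.items()}
            for row in rows
        ]

    print(restrict_rows(rows, "EMEA"))        # only the EMEA row
    print(mask_sensitive_fields(rows))        # all rows, ssn masked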

Denodo Virtual DataPort (VDP) Administration Guide

For more information, see these sections of the denodo Virtual DataPort Administration Guide:

  • Section 12.2 of the guide describes the general concepts of user and access rights management in Virtual DataPort.
  • Section 12.3 describes how privileges are managed and assigned to users and roles using the VDP Administration Tool.

Related References

Virtual DataPort Administration Guide

Denodo Data Consolidation & Denormalization

As the data modeling process in denodo moves through the conceptual layers of the data warehouse, the data structures and their associated metadata evolve.

The Base Layer

As the modeling process begins, the base layer is the ingestion layer, where the source system data structures are recreated in denodo and fields are converted to denodo Virtual Query Language (VQL) data types. The base layer is what folks with a traditional data warehousing background would think of as staging or landing. These base layer views should most closely mirror the technical structure and data characteristics of the input data source and will be the least business-friendly in their organization, naming, and metadata.

The Semantics Layer

The semantics layer is where the major data reorganization, data transformation, and the application of business-friendly field names and metadata begin. The semantics layer is what folks with a traditional data warehousing background would think of as the Data Warehouse (DW) or Enterprise Data Warehouse (EDW). The semantics layer of the logical data warehouse (LDW) performs several tasks:

  • Data from multiple input sources is consolidated (illustrated in the sketch after this list)
  • The model becomes multi-dimensional (Fact and Dimension oriented)
  • Field names and descriptive metadata are changed to meaningful, domain normalized, business-friendly names and descriptions.
  • Domain normalizing business rules and transformations are applied.
  • Serves as a data source for the business layer and reporting layer.
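
As a rough illustration of the consolidation and renaming described above, consider the plain Python sketch below; the source systems, field names, and domain-normalized labels are hypothetical and not part of any denodo API.

    # Hypothetical: records from two sources with technical field names are
    # consolidated into one record with business-friendly, domain-normalized names.
    crm_row = {"cust_nm": "Acme", "rgn_cd": "01"}
    erp_row = {"customer_name": "Acme", "region_code": "01", "credit_limit": 50000}

    def to_semantic(row, field_map):
        # Rename technical source fields to the agreed business-friendly names.
        return {field_map[key]: value for key, value in row.items() if key in field_map}

    crm_map = {"cust_nm": "customer_name", "rgn_cd": "region_code"}
    erp_map = {"customer_name": "customer_name", "region_code": "region_code",
               "credit_limit": "credit_limit"}

    # Consolidate the two sources into a single semantic-layer record.
    semantic_row = {**to_semantic(crm_row, crm_map), **to_semantic(erp_row, erp_map)}
    print(semantic_row)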

The Business Layer

The business layer, which is considered optional by denodo, is modeled along a narrower business subject orientation, and more specialized business rules are applied. This is what folks with a traditional data warehousing background would think of as a Datamart (DM).

The business layer of the logical data warehouse (LDW) performs several tasks:

  • Limits and optimizes the data to facilitate business intelligence and reporting activities concerning a specific line of business or business topic (e.g., Financials, Human Resources, Inventory, Asset Management, etc.)
  • Business-specific/customized rules and metadata are applied
  • Supplements the semantics layer and serves as a data source for the reporting layer.
  • Additional data consolidation and data structure denormalization (flattening) may occur in the business layer

The Reporting Layer

The reporting layer, which is considered optional by denodo, is the most customized layer and sees the most reporting-topic specialization and need-specific transformation. The reporting layer is where a traditional data warehouse might provide customized reporting or system interface views, where interface ETLs produce interface files, and where reporting teams do more of their own development.

The reporting layer of the logical data warehouse (LDW) performs several tasks:

  • Provides consumer-specific customized rules and metadata
  • Provides consumer-specific data organization/layouts
  • Data is optimized for consumer purposes and may be highly or entirely denormalized to meet consumer needs.

Denodo Best Practices For Base Views

The denodo “Base Layer” in the Logical Data Warehouse (LDW) can be thought of as the data staging layer in a more traditional data warehouse (DW) development pattern. The base layer is the level at which the source system data structures are converted into denodo field types and the source data structures are recreated as base views (bv).

Base views (bv) are the denodo structures that reflect the source system structure. They are the second step in virtualizing data, immediately after the data source connection, and are essential building blocks for the other layers of the Logical Data Warehouse (LDW). To help make base views useful and performant, here are some best practices:

  • Use consistent object naming conventions. It is strongly recommended that the denodo standard naming conventions be used (see the naming check sketch after this list).
  • Import and/or create the Primary Keys (PK), Foreign Keys (FK), and Associations.
  • Set up statistics collection and make sure it includes all critical fields.
  • Base views, as a rule, should not be cached unless absolutely necessary for performance reasons.
  • Create indexes on Primary Keys (PK), Surrogate Keys, and Foreign Keys (FK).
  • Create performance indexes that mirror the source system indexes to improve performance.
  • Populate view metadata properties describing the type and nature of the data the view contains. Note: if you have a governance team, they may want to manage the metadata in the denodo data catalog.
  • Retain the original table name (applying the naming convention prefix) and field names to facilitate data lineage traceability.
  • Use denodo tools against tables, where possible, rather than (manual) SQL views, database views, or stored procedures. Denodo cannot rewrite or optimize these objects.
  • Field metadata should be annotated with “Not Used” if the field is always null, blank, or empty. This saves time and labor when working with later layers and researching data issues.
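
The sketch below illustrates the naming-convention and lineage points above. It is plain Python, not a denodo feature; the “bv_” prefix and the sample table names are assumptions, so substitute your organization's actual standard.

    # Hypothetical check that base-view names keep the source table name
    # plus an agreed prefix, which preserves lineage traceability.
    BASE_VIEW_PREFIX = "bv_"  # assumed convention, not a denodo requirement

    source_tables = ["customer", "sales_order", "product"]
    base_views = ["bv_customer", "sales_order", "bv_product"]

    for table, view in zip(source_tables, base_views):
        expected = BASE_VIEW_PREFIX + table
        if view != expected:
            print(f"Naming issue: expected '{expected}' but found '{view}'")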


The Generalizing Specialist: The Key To Success

Generalizing specialist

A generalizing specialist is simply someone who is multi-skilled. Such an individual can be a specialist in one or more technical disciplines while actively seeking to expand their skill set into areas beyond their present specialties. Generalizing specialists are also referred to as cross-functional developers, multi-disciplinary developers, and versatilists.

While they can become more skilled with time, don’t mistake them for being super skilled in every discipline. However, their technical knowledge and general software development knowledge, as well as a good understanding of the relevant business domains, can be critical to getting things done in real time. Such a person can easily be redeployed based on changes in business strategy or other requirements necessary to remain competitive.

Benefits of being a generalizing specialist

We live in a fast-changing industry, where being a specialist in a single discipline may not cut it in the larger scheme of things. Since generalizing specialists have knowledge of a broad range of issues, they can see the bigger picture and help make better decisions for greater productivity. As such, they will likely have more job opportunities available to them than narrow specialists, and they will be able to attract better job offers.

Importance of generalizing specialists

Generalizing specialists are essential to developing high-performing agile teams in companies, and here are some of the reasons they are considered the key to success.

• Better collaboration

While a company will have different departments, those departments are connected with one another and geared towards accomplishing the same end goal. Communication and collaboration among the teams involved are important to achieving that goal, and this is something most specialists aren’t good at. When you don’t have a good understanding of how everything fits together, it’s very easy to look down on what your teammates are doing, and working together effectively can prove challenging.

Generalizing specialists are more likely to appreciate the work of others simply because they have a good grasp of different technical and domain disciplines. Their background allows them to understand the issues teammates are trying to find solutions to.

• Improved flexibility

The IT industry, by its very nature, faces significant changes that serious businesses must adapt to in order to remain relevant. With a generalizing specialist, dynamically transitioning and allocating new tasks wouldn’t be a huge problem. Things would look quite different if a team were built of specialists who are accustomed to doing the same type of tasks over and over again. In fact, that is considered risky, as it can result in productivity loss.

• Increased efficiency

Generalizing specialists bring less dependency, which can go a long way in increasing efficiency and productivity. The problem when working with specialists is that they can easily become bottlenecks, especially when they have a lot on their plate. There’s a good chance that multiple development teams will be waiting on the specialist, and this can negatively affect overall team efficiency.

Conclusion

Generalizing specialists are surely taking over. There’s still room for some specialists within IT departments, but as things look at the moment, more departments are moving towards becoming more agile. Over time, it’s likely that only a few pure specialists will survive in the information technology industry.

Why Unit Testing is Important

Whenever a new application is in development, unit testing is a vital part of the process and is typically performed by the developer. During this process, sections of code are isolated at a time and are systematically checked to ensure correctness, efficiency, and quality. There are numerous benefits to unit testing, several of which are outlined below.
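
As a minimal illustration (written in Python with a hypothetical discount function, not code from any particular project), a unit test isolates one small piece of behavior and checks it automatically:

    import unittest

    def apply_discount(price, percent):
        # Hypothetical function under test: apply a percentage discount.
        if not 0 <= percent <= 100:
            raise ValueError("percent must be between 0 and 100")
        return round(price * (1 - percent / 100), 2)

    class ApplyDiscountTest(unittest.TestCase):
        def test_typical_discount(self):
            # 10% off 200.00 should be 180.00
            self.assertEqual(apply_discount(200.00, 10), 180.00)

        def test_zero_discount_returns_original_price(self):
            self.assertEqual(apply_discount(99.99, 0), 99.99)

        def test_invalid_percent_is_rejected(self):
            with self.assertRaises(ValueError):
                apply_discount(50.00, 150)

    if __name__ == "__main__":
        unittest.main()

If a later refactor breaks the discount calculation, these tests fail immediately, which is exactly the safety net the points below describe.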

1. Maximizing Agile Programming and Refactoring

During the coding process, a programmer has to keep in mind a myriad of factors to ensure that the final product is correct and as lightweight as possible. However, the programmer also needs to make certain that if changes become necessary, refactoring can be done safely and easily.

Unit testing is the simplest way to support agile programming and refactoring, because the isolated sections of code have already been tested for accuracy, which helps minimize refactoring risks.

2. Find and Eliminate Any Bugs Early in the Process

Ultimately, the goal is to find no bugs and no issues to correct, right? But unit testing is there to ensure that any existing bugs are found early on so that they can be addressed and corrected before additional coding is layered on. While it might not feel like a positive thing to have a unit test reveal a problem, it’s good that it’s catching the issue now so that the bug doesn’t affect the final product.

3. Document Any and All Changes

Unit testing provides documentation for each section of coding that has been separated, allowing those who haven’t already directly worked with the code to locate and understand each individual section as necessary. This is invaluable in helping developers understand unit APIs without too much hassle.

4. Reduce Development Costs

As one can imagine, fixing problems after the product is complete is both time-consuming and costly. Not only do you have to sort back through a fully coded application’s worth of material, but any bugs may have been compounded and repeated throughout the application. Unit testing not only helps limit the amount of work that needs to be done after the application is completed, it also reduces the time it takes to fix errors because it prevents developers from having to fix the same problem more than once.

5. Assists in Planning

Thanks to the documentation aspect of unit testing, developers are forced to think through the design of each individual section of code so that its function is determined before it’s written. This can prevent redundancies, incomplete sections, and nonsensical functions because it encourages better planning. Developers who implement unit testing in their applications will ultimately improve their creative and coding abilities thanks to this aspect of the process.

Conclusion

Unit testing is absolutely vital to the development process. It streamlines the debugging process and makes it more efficient, saves on time and costs for the developers, and even helps developers and programmers improve their craft through strategic planning. Without unit testing, people would inevitably wind up spending far more time on correcting problems within the code, which is both inefficient and incredibly frustrating. Using unit tests is a must in the development of any application.


Denodo Data Catalog Roles

The denodo data catalog provides data governance and self-service capabilities that supplement the denodo Virtual DataPort (VDP) core capabilities. Six roles provide the ability to assign or deny capabilities within the denodo data catalog, supplementing the database, row, and column security and permissions of denodo Virtual DataPort (VDP).

Denodo Data Catalog roles and the tasks they can perform:

  • data_catalog_classifier: Assign categories, tags, and custom properties groups to views and web services.
  • data_catalog_editor: Edit views, web services, and databases; create, edit, and delete tags, categories, custom properties groups, and custom properties.
  • data_catalog_manager: Can do the same as a user with the “data_catalog_editor” and “data_catalog_classifier” roles.
  • data_catalog_content_admin: Configure personalization options and content search.
  • data_catalog_admin: Can perform any action of all the other data catalog roles.
  • data_catalog_exporter: Can export the results of a query from the Denodo Data Catalog.

Related References

denodo > User Manuals > Denodo Platform New Features Guide

denodo > User Manuals > Data Catalog Guide > Administration

Quote – Doing The Work Yourself

Writing and The Written Word

The wise man will commit no business of importance to a proxy when he may do it himself.

— Roger L’Estrange
