Denodo Data Catalog Roles

The Denodo Data Catalog provides data governance and self-service capabilities that supplement the core capabilities of Denodo Virtual DataPort (VDP). Six roles provide the ability to assign or deny capabilities within the Data Catalog and supplement the database, row, and column security and permissions of VDP.

Denodo Data Catalog Role Name – The Tasks the Role Can Perform

  • data_catalog_classifier – Assign categories, tags, and custom properties groups to views and web services.
  • data_catalog_editor – Edit views, web services, and databases; create, edit, and delete tags, categories, custom properties groups, and custom properties.
  • data_catalog_manager – Can do the same as a user with the roles “data_catalog_editor” and “data_catalog_classifier”.
  • data_catalog_content_admin – Configure personalization options and content search.
  • data_catalog_admin – Can perform any action of all the other Data Catalog roles.
  • data_catalog_exporter – Can export the results of a query from the Denodo Data Catalog.

Related References

Denodo > User Manuals > Denodo Platform New Features Guide

Denodo > User Manuals > Data Catalog Guide > Administration

Denodo Model Best Practices For Creation of Associations

What Are Denodo Associations?

In Denodo, associations follow the same concept as in data modeling tools and can be described as an ‘on-demand join.’
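
As an illustration (a minimal sketch using hypothetical views iv_customer and iv_order with illustrative column names, not taken from an actual model), an association does not execute anything when it is defined; it simply documents the relationship so that a join like the one below can be performed on demand when a consumer navigates from one view to the other:

  -- Hypothetical views: iv_customer (customer_id is the key) and iv_order (customer_id references it).
  -- The association documents this relationship; the join only runs when it is actually requested.
  SELECT c.customer_id, c.customer_name, o.order_id, o.order_total
  FROM iv_customer c
  INNER JOIN iv_order o
    ON c.customer_id = o.customer_id;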

Where Should Associations Be Created In the Denodo Model?

You don’t necessarily need to define an Association at every level; usually, the best practice is to apply associations at the following points:

  • On final views published for data consumers, indicating relationships between related views, especially on published web services.
  • On the component views below them: any derived view that brings together disparate (dissimilar) data sources. These associations should be defined with referential constraints whenever appropriate to aid the optimization engine (see the sketch after this list).
  • On the component views below them: any derived view that joins a “Base View from Query” with standard views, since Base Views from Query cannot be rewritten by the Denodo optimization engine and often create performance bottlenecks.
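
To illustrate why the referential constraint matters (a sketch using the same hypothetical iv_customer and iv_order views as above), when the association declares that every order row has exactly one matching customer row, the optimizer can eliminate the join for queries that only request order columns:

  -- Query as written by the consumer:
  SELECT o.order_id, o.order_total
  FROM iv_order o
  INNER JOIN iv_customer c
    ON o.customer_id = c.customer_id;

  -- With the referential constraint declared on the association, the engine can
  -- rewrite it to this equivalent, cheaper form (join pruning):
  SELECT o.order_id, o.order_total
  FROM iv_order o;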

These best practices should cover the majority of scenarios; beyond these guidelines, it is best to take an ad hoc approach and create associations when you identify a specific performance or optimization opportunity.

Why Are Associations Important in Denodo?

In a nutshell, associations drive the performance and efficiency of the Denodo execution optimizer, along with other model metadata, such as:

  • The SQL of the view(s)
  • Table metadata (table keys (PK, FK), virtual partitions, etc.)
  • Data statistics, which are used by the Cost Based Optimizer (CBO)

Related References

Associations in Denodo

Importing Associations And Joins From A Database Schema in Denodo

A coworker recently asked whether Denodo generates joins automatically from the source RDBMS database schema.  After searching, a few points became clear.  First, the subject of inheriting join properties is broader than joins and needs to include modeling associations (joins on demand). Second, there are some Denodo design best practices to consider to optimize associations.

Does Denodo Automatically Generate Joins From the Source System?

After some research, the short answer is no.

Can Denodo Inherit Associations From A Logical Model?

The short answer is yes. 

Denodo bridges allow models to be passed to and from other modeling tools, so it is possible to have the associations built automatically by using the top-down design approach and importing a model at the Interface View level, which is the topmost level of the top-down design process.

However, below the Interface View level, associations and/or joins are created manually by the developer.

Where Should Associations Be Created?

You don’t necessarily need to define an association at every level; usually, the best practice is to apply associations at the following points:

  • On final views published for data consumers, indicating relationships between related views, especially on published web services.
  • On the component views below them: any derived view that brings together disparate (dissimilar) data sources, defining referential constraints whenever appropriate to aid the optimization engine.
  • On the component views below them: any derived view that joins a “Base View from Query” with standard views, since Base Views from Query cannot be rewritten by the Denodo optimization engine and often create performance bottlenecks.

These best practices should cover the majority of scenarios; beyond these guidelines, it is best to take an ad hoc approach and create associations when you identify a specific performance or optimization opportunity.

Related References

Associations in Denodo

denodo SQL Type Mapping

Denodo 7.0 saves some manual coding when building ‘Base Views’ by performing initial data type conversions from ANSI SQL types to Denodo Virtual DataPort data types. So, here is a quick reference showing what the Denodo Virtual DataPort data type mappings are:

ANSI SQL types To Virtual DataPort Data types Mapping

ANSI SQL Type – Virtual DataPort Type

BIT (n) – blob
BIT VARYING (n) – blob
BOOL – boolean
BYTEA – blob
CHAR (n) – text
CHARACTER (n) – text
CHARACTER VARYING (n) – text
DATE – localdate
DECIMAL – double
DECIMAL (n) – double
DECIMAL (n, m) – double
DOUBLE PRECISION – double
FLOAT – float
FLOAT4 – float
FLOAT8 – double
INT2 – int
INT4 – int
INT8 – long
INTEGER – int
NCHAR (n) – text
NUMERIC – double
NUMERIC (n) – double
NUMERIC (n, m) – double
NVARCHAR (n) – text
REAL – float
SMALLINT – int
TEXT – text
TIMESTAMP – timestamp
TIMESTAMP WITH TIME ZONE – timestamptz
TIMESTAMPTZ – timestamptz
TIME – time
TIMETZ – time
VARBIT – blob
VARCHAR – text
VARCHAR ( MAX ) – text
VARCHAR (n) – text

ANSI SQL Type Conversion Notes

  • The function CAST truncates the output when converting a value to text, when these two conditions are met:
  1. You specify a SQL type with a length for the target data type, e.g., VARCHAR(20).
  2. And this length is lower than the length of the input value.
  • When casting a boolean to an integer, true is mapped to 1 and false to 0.
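
A minimal sketch of both notes, assuming a hypothetical view my_view with a text column comments and a boolean column is_active:

  -- Truncation: the target type VARCHAR(5) is shorter than the input value,
  -- so the returned text is cut to 5 characters (e.g., 'abcdefgh' becomes 'abcde').
  -- Boolean to integer: true is returned as 1, false as 0.
  SELECT CAST(comments AS VARCHAR(5)) AS comments_short,
         CAST(is_active AS INTEGER) AS is_active_num
  FROM my_view;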

Related References

denodo 7.0 Type Conversion Functions

Infosphere DataStage – Boolean Handling for Netezza

Beware when you see this message when working with Booleans in DataStage; the message displays as informational (at least it did for me), not as a warning or an error.  Even though it seems innocuous, what it meant for my job was that the Boolean (‘true’ / ‘false’) was not being interpreted, and everything was posted as ‘false’.

In DataStage, the Netezza ‘Boolean’ field/data SQL type maps to the ‘Bit’ SQL type, which expects a numeric input of zero (0) or one (1).  So, my solution (once I detected the problem during unit testing) was to put Transformer Stage logic in place to convert the Boolean input to the expected numeric value.

Netezza to Datastage Data Type Mapping

Netezza data type – InfoSphere DataStage data type (SQL type) – Expected input value

BOOLEAN – Bit – 0 or 1 (1 = true, 0 = false)

Transformer Stage Boolean Handling Logic

A Netezza Boolean field can store true values, false values, and null, so some thought should be given to your desired data outcome for nulls.

This first example sets nulls to a specific value, which can support a specific business rule for null handling and also provides null handling for non-nullable fields.  Here we are setting nulls to the numeric value for ‘true’ and all other non-true inputs to ‘false’.

If IsNull(Lnk_Src_In.USER_ACTIVE) Then 1 Else If Lnk_Src_In.USER_ACTIVE = 'true' Then 1 Else 0

These second examples let nulls fall through to the Else value; if your logic direction matches the incoming data, this still provides null handling for non-nullable fields.

  • If Lnk_Src_In.USER_ACTIVE = 'true' Then 1 Else 0

  • If Lnk_Src_In.USER_ACTIVE = 'false' Then 0 Else 1
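
Another option, shown only as a sketch and assuming the source is a Netezza table (the table ADMIN.USERS and column USER_ACTIVE below are hypothetical), is to perform the conversion in the connector's source SQL so that DataStage already receives the numeric value the Bit type expects:

  -- Map the Netezza boolean to 0/1 before it reaches the DataStage Bit column;
  -- nulls fall through to the ELSE branch and become 1, matching the first Transformer example above.
  SELECT CASE
           WHEN USER_ACTIVE = false THEN 0
           ELSE 1
         END AS USER_ACTIVE_NUM
  FROM ADMIN.USERS;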

Director Log Message

Message ID

  • IIS-DSEE-TBLD-00008

Message Text

  • <<Link Name Where Message Occurred>>: Numeric string expected. Use default value.

Or something like this:

  • <<Link Name Where Message Occurred>>: Numeric string expected for input column ‘<<Field Name Here>>‘. Use default value.

Related References

Boolean

PureData System for Analytics, PureData System for Analytics 7.2.1, IBM Netezza user-defined functions, UDX data types reference information, Supported data types, Boolean

https://www.ibm.com/support/knowledgecenter/en/SSULQD_7.2.1/com.ibm.nz.udf.doc/r_udf_boolean_datatype.html

Data types and aliases

PureData System for Analytics, PureData System for Analytics 7.2.1, IBM Netezza stored procedures, NZPLSQL statements and grammar, Variables and constants, Data types and aliases

https://www.ibm.com/support/knowledgecenter/en/SSULQD_7.2.1/com.ibm.nz.sproc.doc/c_sproc_data_types_aliases.html

Logical data types

PureData System for Analytics, PureData System for Analytics 7.2.1, IBM Netezza database user documentation, Netezza SQL basics, Data types, Logical data types

https://www.ibm.com/support/knowledgecenter/en/SSULQD_7.2.1/com.ibm.nz.dbu.doc/r_dbuser_data_types_logical.html

Data type conversions from Netezza to DataStage

InfoSphere Information Server, InfoSphere Information Server 11.5.0, Connecting to data sources, Databases, Netezza Performance Server, Netezza connector, Designing jobs by using the Netezza connector, Defining a Netezza connector job, Data type conversions, Data type conversions from Netezza to DataStage

https://www.ibm.com/support/knowledgecenter/en/SSZJPZ_11.5.0/com.ibm.swg.im.iis.conn.netezza.use.doc/topics/nzcc_mappingdatatypes.html

Infosphere DataStage – Data type conversions from Netezza to DataStage

Some recent research to eliminate some ETL data conversion issues made me want to add an enhancement to the documentation provided by IBM, to avoid repeating the research.

Netezza data types and their equivalent InfoSphere DataStage data types

Netezza data type – InfoSphere DataStage data type (SQL type) – Notes

BYTEINT – TinyInt
SMALLINT – SmallInt
INT – Integer
BIGINT – BigInt
NUMERIC(p, s) – Numeric, Decimal, Double – Decimal and Double are aliases of Numeric
FLOAT(p) – Float
REAL – Real
DOUBLE PRECISION – Double
CHAR(n) – Char
VARCHAR(n) – VarChar
NCHAR(n) – NChar
NVARCHAR(n) – NVarChar
BOOLEAN – Bit
DATE – Date
TIME – Time
TIME WITH TIME ZONE – VarChar – The value of time with time zone will be returned without the time zone information.
TIMESTAMP – TimeStamp
INTERVAL – VarChar – You cannot load the interval data type from an external table.
ROWID – BigInt
TRANSACTION ID – BigInt
DATASLICE – Integer

Related References

Data type conversions from Netezza to DataStage

InfoSphere Information Server, InfoSphere Information Server 11.5.0, Connecting to data sources, Databases, Netezza Performance Server, Netezza connector, Designing jobs by using the Netezza connector, Defining a Netezza connector job, Data type conversions, Data type conversions from Netezza to DataStage

Netezza / PureData – What is the maximum length of CHAR, VARCHAR, and NVARCHAR fields

During a recent project, I had reason to investigate the maximum character field lengths in Netezza to be able to fit the data ETL’ed from the source.  The constraints on Netezza character fields are not as large as in some other databases, so I thought I would document them in a quick post for future reference.  Knowing these limits will help with your ETL design and construction, and you may need to know where to truncate your field input data (see the sketch after the table below).

Data type – Alias names – InfoSphere DataStage data type (SQL type) – Maximum length – Notes

CHAR – CHARACTER, CHAR(n), CHARACTER(n) – Char – 64,000 – Fixed-length character string, blank padded to length n. If you do not specify n, the default is an unsized CHAR value.
VARCHAR – CHARACTER VARYING, VARCHAR(n), CHARACTER VARYING(n), CHAR VARYING(n) – VarChar, LongVarChar – 64,000 – Variable-length character string with a maximum length of n. If you do not specify n, the default is an unsized VARCHAR value. There is no blank padding, and the value is stored as entered.
NCHAR – NATIONAL CHARACTER, NATIONAL CHAR(n), NCHAR(size) – NChar, Char – 16,000 – Fixed-length character string, blank padded to length n. If you do not specify n, the default is an unsized NCHAR value.
NVARCHAR – NATIONAL CHARACTER VARYING, NATIONAL CHAR VARYING, NATIONAL CHAR VARYING(n), NATIONAL CHARACTER VARYING(n), NVARCHAR(n) – VarChar, NVarChar, LongVarChar, LongNVarChar – 16,000 – Variable-length character string with a maximum length of n. If you do not specify n, the default is an unsized NVARCHAR value.
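
For example, a quick way to check whether incoming data will fit (a sketch against a hypothetical staging table STAGE.SOURCE_COMMENTS with a column long_comment) is to profile the source lengths and, if necessary, truncate on the way in:

  -- Profile the source column to see whether any value exceeds the Netezza limit.
  SELECT MAX(LENGTH(long_comment)) AS max_len
  FROM STAGE.SOURCE_COMMENTS;

  -- If needed, truncate so the value fits a VARCHAR(64000) target column.
  SELECT SUBSTR(long_comment, 1, 64000) AS long_comment
  FROM STAGE.SOURCE_COMMENTS;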

Related References

Data types and aliases

PureData System for Analytics, PureData System for Analytics 7.2.1, IBM Netezza stored procedures, NZPLSQL statements and grammar, Variables and constants, Data types and aliases

Data type conversions from DataStage to Netezza

InfoSphere Information Server, InfoSphere Information Server 11.5.0, Connecting to data sources, Databases, Netezza Performance Server, Netezza connector, Designing jobs by using the Netezza connector, Defining a Netezza connector job, Data type conversions, Data type conversions from DataStage to Netezza