denodo SQL Type Mapping

denodo 7.0 saves some manual coding when building ‘Base Views’ by performing initial data type conversions from ANSI SQL types to denodo Virtual DataPort data types. Here is a quick reference showing how the ANSI SQL types map to Virtual DataPort data types:

ANSI SQL Types to Virtual DataPort Data Types Mapping

ANSI SQL Type                Virtual DataPort Type
BIT (n)                      blob
BIT VARYING (n)              blob
BOOL                         boolean
BYTEA                        blob
CHAR (n)                     text
CHARACTER (n)                text
CHARACTER VARYING (n)        text
DATE                         localdate
DECIMAL                      double
DECIMAL (n)                  double
DECIMAL (n, m)               double
DOUBLE PRECISION             double
FLOAT                        float
FLOAT4                       float
FLOAT8                       double
INT2                         int
INT4                         int
INT8                         long
INTEGER                      int
NCHAR (n)                    text
NUMERIC                      double
NUMERIC (n)                  double
NUMERIC (n, m)               double
NVARCHAR (n)                 text
REAL                         float
SMALLINT                     int
TEXT                         text
TIMESTAMP                    timestamp
TIMESTAMP WITH TIME ZONE     timestamptz
TIMESTAMPTZ                  timestamptz
TIME                         time
TIMETZ                       time
VARBIT                       blob
VARCHAR                      text
VARCHAR (MAX)                text
VARCHAR (n)                  text

ANSI SQL Type Conversion Notes

  • The CAST function truncates the output when converting a value to a text type if these two conditions are met:
  1. You specify a SQL type with a length for the target data type, e.g., VARCHAR(20).
  2. That length is shorter than the length of the input value.
  • When casting a boolean to an integer, true is mapped to 1 and false to 0, as in the example below.
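
For illustration, here is a minimal sketch of both behaviors (the literal values are made up; the syntax follows the standard CAST form, queried against VDP’s Dual() view):

  -- The target type VARCHAR(5) is shorter than the 14-character input, so the output is truncated
  SELECT CAST('Virtualization' AS VARCHAR(5)) FROM Dual();   -- returns 'Virtu'

  -- Boolean-to-integer casts follow the mapping described above
  SELECT CAST(true AS INTEGER) FROM Dual();    -- returns 1
  SELECT CAST(false AS INTEGER) FROM Dual();   -- returns 0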

Related References

denodo 7.0 Type Conversion Functions

Analytics Model Types

Every day, businesses create around 2.5 quintillion bytes of data, making it increasingly difficult to make sense of it all and extract valuable information. While this data can reveal a lot about customer bases, users, and market patterns and trends, it is useless unless it is tamed and analyzed. Therefore, for organizations to realize the full value of this big data, it has to be processed. This way, businesses can pull powerful insights from this stockpile of bits.

And thanks to artificial intelligence and machine learning, we can now do away with mundane spreadsheets as a tool to process data. Through the various AI and ML-enabled data analytics models, we can now transform the vast volumes of data into actionable insights that businesses can use to scale operational goals, increase savings, drive efficiency and comply with industry-specific requirements.

We can broadly classify data analytics into three distinct models:

  • Descriptive
  • Predictive
  • Prescriptive

Let’s examine each of these analytics models and their applications.

Descriptive Analytics: A Look Into What Happened

How can an organization or an industry understand what happened in the past to make decisions for the future? Well, through descriptive analytics.

Descriptive analytics is the gateway to the past: it helps us gain insights into what has happened. It allows organizations to look at historical data and, upon further analysis, gain actionable insights that can be used to make decisions for “the now” and the future.

For many businesses, descriptive analytics is at the core of their everyday processes. It is the basis for setting goals. For instance, descriptive analytics can be used to set goals for better customer experience. By looking at the number of tickets raised in the past and their resolutions, businesses can use ticketing trends to plan for the future.

Some everyday applications of descriptive analytics include:

  • Reporting of new trends and disruptive market changes
  • Tabulation of social metrics, such as the number of tweets, followers gained over a period of time, or Facebook likes garnered on a post.
  • Summarizing past events, such as customer retention, regional sales, or marketing campaign success (a simple example is sketched below).
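
In practice, much of this reduces to simple aggregation over historical records. Here is a minimal SQL sketch of the ticketing example above (the table and column names are hypothetical):

  -- Summarize historical support tickets by month to reveal resolution trends
  SELECT ticket_month,
         COUNT(*)              AS tickets_raised,
         AVG(resolution_hours) AS avg_resolution_time
  FROM support_tickets
  GROUP BY ticket_month
  ORDER BY ticket_month;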

To enhance their decision-making capabilities, businesses have to refine the data further so they can make better predictions about the future. That’s where predictive analytics comes in.

Predictive Analytics Takes Descriptive Data One Step Further

Using both new and historical data sets, predictive analytics helps businesses model and forecast what might happen in the future. Using various data mining and statistical algorithms, we can leverage the power of AI and machine learning to analyze currently available data and model it to make predictions about future behaviors, trends, risks, and opportunities. The goal is to go beyond the surface-level “what has happened and why it has happened” and identify what will happen.

Predictive data analytics allows organizations to be prepared and become more proactive, and therefore make decisions based on data and not assumptions. It is a robust model that is being used by businesses to increase their competitiveness and protect their bottom line.

The predictive analytics process is a step-by-step process that requires analysts to:

  • Define project deliverables and business objectives
  • Collect historical and new transactional data
  • Analyze the data to identify useful information. This analysis can involve inspection, data cleaning, data transformation, and data modeling.
  • Use various statistical models to test and validate the assumptions.
  • Create accurate predictive models about the future.
  • Deploy the data to guide your day-to-day actions and decision-making processes.
  • Manage and monitor the model performance to ensure that you’re getting the expected results.

Instances Where Predictive Analytics Can be Used

  • Propel marketing campaigns and reach customer service objectives.
  • Improve operations by forecasting inventory and managing resources optimally.
  • Fraud detection such as false insurance claims or inaccurate credit applications
  • Risk management and assessment
  • Determine the best direct marketing strategies and identify the most appropriate channels.
  • Help in underwriting by predicting the chances of bankruptcy, default, or illness.
  • Health care: Use predictive analytics to determine health-related risk and make informed clinical support decisions.

Prescriptive Analytics: Developing Actionable Insights from Descriptive Data

Prescriptive analytics helps us to find the best course of action for a given situation. By studying interactions between the past, the present, and the possible future scenarios, prescriptive analytics can provide businesses with the decision-making power to take advantage of future opportunities while minimizing risks.

Using Artificial Intelligence (AI) and Machine Learning (ML), we can use prescriptive analytics to automatically process new data sets as they become available and provide the most viable decision options in a manner beyond any human capabilities.

When effectively used, it can help businesses avoid the immediate uncertainties resulting from changing conditions by providing them with fact-based best and worst-case scenarios. It can help organizations limit their risks, prevent fraud, fast-track business goals, increase operational efficiencies, and create more loyal customers.

Bringing It All Together

As you can see, different big data analytics models can help you make sense of raw, complex data by leveraging AI and machine learning. When effectively done, descriptive, predictive, and prescriptive analytics can help businesses realize better efficiencies, allocate resources more wisely, and deliver superior customer success in the most cost-effective way. Ideally, though, if you wish to gain meaningful insights from predictive or even prescriptive analytics, you must start with descriptive analytics and build up from there.

Windows – Hosts File Location

Occasionally, I need to update the Windows hosts file, but I seem to have a permanent memory block about where the file is located. I have written the location into numerous documents; however, every time I need to verify and/or update the hosts file, I have to look up the path. Today, when I went to look it up, I discovered that I had not actually posted it to this blog site. So, for future reference, I am adding it now.

Here is the path of the Windows hosts file; the drive letter may vary depending on the drive onto which Windows was installed.

C:\WINDOWS\system32\drivers\etc
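
For reference, each entry in the hosts file maps an IP address to one or more host names, and lines beginning with # are comments. A hypothetical example:

  # Format: IP-address  hostname  [aliases...]
  127.0.0.1      localhost
  192.168.1.50   devserver.example.local  devserver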

Which Version Control Systems Are supported by denodo Virtualization 7.0?

Using Version Control is a denodo Virtual DataPort (VDP) recommended best practice. Version 7.0 of denodo virtualization supports three Version Control Systems (VCS):

  • Microsoft Team Foundation Server (TFS) 2010 or later
  • Apache Subversion (1.7), and
  • Git


New CentOS 8 Linux Release

The new CentOS 8 rebuild is out. Christened version 8.0-1905, this release provides a secure, stable, and more reliable foundation for CentOS users, such as organizations running high-performance websites and businesses with Linux experts who use CentOS daily for their workloads but do not need strong commercial support.

The new OS comes after Red Hat released RHEL 8 (Red Hat Enterprise Linux 8) in May of this year. The contributors note in the CentOS 8 release notes that this rebuild is 100% compliant with Red Hat’s redistribution policy. This Linux distro allows users to achieve successful operations using the robust power of an enterprise-class OS, but without the cost of support and certification. Below are some of the updates outlined in the release notes that you can expect with this new release, along with some of the deprecated features.

What’s New in the Just Released CentOS 8?

  • BaseOS and Appstream
  • New container tools
  • Systemwide crypto policies
  • TCP stack improvements
  • DNF

· BaseOS and Appstream

The main repository, BaseOS, offers the components of the distribution that provide the running user space on hardware, virtual machines, or containers. The Application Stream (AppStream) repository offers all the applications you might want to run in a given user space. The Supplemental repository offers software that comes with special licensing.

· New Container Tools

With the aid of Podman, CentOS 8 supports Linux containers. Podman replaces Docker and Moby, which depend on a daemon and run as root. Unlike in the previous release, Podman does not depend on a daemon, and it allows users to create images from scratch using Buildah.
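
As a quick illustration of the daemonless workflow (the image and names below are arbitrary examples):

  # Pull and run a container without a central daemon
  podman pull docker.io/library/nginx
  podman run -d --name web -p 8080:80 docker.io/library/nginx

  # Start building an image from an empty base with Buildah
  newcontainer=$(buildah from scratch)
  buildah containers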

· Systemwide Crypto Policies

The update-crypto-policies command can be used to manage the system-wide cryptographic policy on the new OS. The policies carry settings for the following applications and libraries: the NSS TLS library, the Kerberos 5 library, the OpenSSH implementation of the SSH2 protocol, Libreswan (IPsec and the IKE protocol), the OpenSSL TLS library, and the GnuTLS TLS library.
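
For example (run as root; FUTURE is one of the predefined policy levels, alongside DEFAULT, LEGACY, and FIPS):

  # Show the currently active system-wide policy
  update-crypto-policies --show

  # Switch the whole system to a stricter policy; reboot for full effect
  update-crypto-policies --set FUTURE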

· TCP Stack Improvements

The CentOS 8 Linux distro also brings with it TCP networking stack version 4.16, with an improved ingress connection rate. The Linux kernel is now able to support the new BBR and NV congestion control algorithms, which can help improve a Linux server’s network throughput.
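
To experiment with BBR, for instance, you can switch the congestion control algorithm via sysctl (a sketch, assuming the bbr module is available; adding the setting to a file under /etc/sysctl.d/ makes it persistent):

  # Check which congestion control algorithm is currently in use
  sysctl net.ipv4.tcp_congestion_control

  # Switch the running system to BBR (requires root)
  sysctl -w net.ipv4.tcp_congestion_control=bbr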

· DNF – Dandified Yum

The new operating system keeps the familiar foundations of the Yum package manager but upgrades it to DNF (Dandified Yum). Though DNF maintains a command-line interface and API similar to its predecessor, it promises to be faster, more seamless, and more efficient.
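
Day-to-day usage looks much like Yum (the package name is just an example):

  # Search for, install, and update packages with DNF
  dnf search nginx
  dnf install nginx
  dnf update

  # The yum command remains available as a compatibility alias for DNF
  yum install nginx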

· Other Improvements

CentOS 8 also ships a compiler toolchain based on GCC 8.2, which includes support for more recent C++ language standard versions, improved optimizations, new code-hardening techniques, new hardware support, and better warnings.

In addition to those features, the new CentOS 8 also supports secure guests, which use cryptographically signed images to ensure that a program retains its integrity. It also boasts improved memory management. The CentOS 8 release notes state that the new OS allows a crash dump to capture a kernel crash during any boot phase, which was not possible before.

CentOS 8 moves encrypted storage to the LUKS2 format. It also brings enhancements to the process scheduler, including the new deadline process scheduler. This Linux distro also enables installation on, and booting from, dual in-line non-volatile memory modules (NVDIMMs).

A great bonus feature is that you can manage the new software with Cockpit via a web browser. This feature is very user-friendly, making it great for system administrators and new users alike.
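
Getting Cockpit running typically takes just two commands (run as root), after which the console is reachable at https://<host>:9090:

  # Install and enable the Cockpit web console
  dnf install cockpit
  systemctl enable --now cockpit.socket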

Deprecated Features and Functionalities

If you are upgrading from a previous CentOS version, the most significant change is the nftables framework, which has replaced iptables. Nftables allows users to perform network address translation (NAT), packet mangling, packet classification, and packet filtering. Unlike iptables, nftables provides secure firewall support with enhanced performance, increased scalability, and easier code maintenance.
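
A brief sketch of the nft command-line syntax (the table and chain names are arbitrary):

  # Create a table and an input chain, then allow SSH traffic
  nft add table inet filter
  nft add chain inet filter input '{ type filter hook input priority 0; policy accept; }'
  nft add rule inet filter input tcp dport 22 accept

  # List the resulting ruleset
  nft list ruleset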

These changes, though not major, may cause problems with firewall functionality. Although in-place upgrades may be supported on RHEL, it is not advisable to upgrade directly from much older versions such as CentOS 6 and below, as they may not be compatible.

Users of CentOS as a desktop will see an update of the default GNOME Shell interface to version 3.28, while Wayland remains the default display server.

Final Thoughts

If you are looking to upgrade from a previous version, a tool for doing so directly is yet to be released. As such, your most favorable option is to back up your data and perform a fresh install of the newly released CentOS 8. When it is up and running, you can then move all the data to the new system.

Nonetheless, the new CentOS 8 Linux release is an exciting feat. This OS provides a manageable and consistent platform that suits a wide variety of deployments. It comes with well-thought-out and ingenious software updates that will help avid users to build more robust container workloads and web apps.

Denodo Data Virtualization Project Roles

A Denodo virtualization project typically classifies the project duties of the primary implementation team into five primary roles.

Denodo Data Virtualization Project Roles

  • Data Virtualization Architect
  • Denodo Platform Administrator
  • Data Virtualization Developer
  • Denodo Platform Java Programmer
  • Data Virtualization Internal Support Team

Role To Project Team Member Alignment

While each denodo project role groups a set of security permissions and duties, it is important to note that role assignments among project team members can be very dynamic. Which team member performs a given role can change over the lifecycle of a denodo project, and one team member may hold more than one role at any given time, or acquire or lose roles based on the needs of the project.

Denodo Virtualization Project Role Duties

Data Virtualization Architect

The knowledge, responsibilities, and duties of a denodo data virtualization architect include:

  • A deep understanding of denodo security features and data governance
  • Defining and documenting best practices for users, roles, and security permissions
  • A strong understanding of enterprise data/information assets
  • Defining data virtualization architecture and deployments
  • Guiding the definition and documentation of the virtual data model, including delivery modes, data sources, data combinations, and transformations

Denodo Platform Administrator

The knowledge, responsibilities, and duties of a Denodo Platform Administrator include:

  • Denodo platform installation and maintenance, such as:
    • Installs denodo platform servers
    • Defines denodo platform update and upgrade policies
    • Creates, edits, and removes environments, clusters, and servers
    • Manages denodo licenses
    • Defines denodo platform backup policies
    • Defines procedures for artifact promotion between environments
  • Denodo platform configuration and management, such as:
    • Configures denodo platform server ports
    • Platform memory configuration and Java Virtual Machine (VM) options
    • Set the maximum number of concurrent requests
    • Sets up database configuration
      • Specifies the cache server
    • Configures authentication for users connecting to the denodo platform (e.g., LDAP)
    • Secures (SSL) communications connections of denodo components
    • Provides connectivity credential details for client tools/applications (JDBC, ODBC, etc.)
    • Configures resources
    • Setup Version Control System (VCS) configuration for denodo
    • Creates new Virtual Databases
    • Creates users and roles, and assigns privileges/roles
    • Executes diagnostics and monitoring operations, analyzes logs, and identifies potential issues
    • Manages load balancing variables

Data Virtualization Developer

The Data Virtualization Developer role is divided into the following sub-roles:

  • Data Engineer
  • Business Developer
  • Application Developer

The knowledge, responsibilities, and duties of a Denodo Data Virtualization Developer, by sub-role, include:

Data Engineer

The denodo data engineer’s duties include:

  • Implements the construction of the virtual data model by
    • Importing data sources and creating base views, and
    • Creating derived views, applying combinations and transformations to the datasets
  • Writes documentation and defines testing to eliminate development errors before code promotion to other environments

Business Developer

The denodo business developer’s duties include:

  • Creates business views for a specific business area from derived and/or interface views
  • Implements data services delivery
  • Writes documentation

Application Developer

The denodo application developer’s duties include:

  • Creates reporting views from business views for reports and/or datasets frequently consumed by users
  • Writes documentation

Denodo Platform Java Programmer

The Denodo Platform Java Programmer role is an optional, specialized role, which:

  • Creates custom denodo components, such as data sources, stored procedures, and VDP/iTPilot functions.
  • Implements custom filters in data routines
  • Tests and debugs any custom components using Denodo4e

Data Virtualization Internal Support Team

The denodo data virtualization internal support team’s duties include:

  • Access to, and knowledge of, the use and troubleshooting of developed solutions
  • Tools and procedures to manage and support project users and developers

Denodo Virtual Dataport (VDP) Naming Convention Guidance

Denodo provides some general Virtual DataPort naming convention recommendations and guidance. First, there is general guidance for basic Virtual DataPort object types; second, there are more detailed naming recommendations.

Denodo Basic Virtual Dataport (VDP) Object Prefix Recommendations

  • Associations Prefix: a_{name}
  • Base Views Prefix: bv_{name}
  • Data Sources Prefix: ds_{name}
  • Integration View Prefix: iv_{name}
  • JMS Listeners Prefix: jms_{name}
  • Interfaces Prefix: i_{name}
  • Web Service Prefix: ws_{name}
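
As an illustration, a chain of objects built from a hypothetical ‘orders’ source might be named:

  ds_orders             (data source connecting to the orders database)
  bv_orders             (base view imported from ds_orders)
  iv_customer_orders    (integration view combining bv_orders with customer views)
  i_customer_orders     (interface exposing the integration view)
  ws_customer_orders    (web service publishing the interface)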

Virtual Dataport (VDP) High-Level Project Structure

Different layers are identified when creating logical folder hierarchies within each data virtualization project. The recommended high-level project folders are:

Connectivity

  • Connectivity contains the related physical systems, data sources, and base views.

Integration

  • Integration contains the combination and transformation views for the next layers; views at this level are not directly consumed.

Business Entities

  • Business Entities contains the canonical business entities exposed to all users.

Report Views

  • Report Views contains pre-built reports and analyses frequently consumed by users.

Data Services

  • Data Services contains web services for publishing views from other levels; it can also contain views needed for data formatting and manipulation.

Associations

  • This folder stores associations.

JMS listeners

  • This folder stores JMS listeners.

Stored procedures

  • This folder stores custom stored procedures developed using the VDP API.
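
Put together, a project’s high-level folder tree might look like this, with the object prefixes from the earlier list mapping naturally onto the layers (the names are illustrative, not mandated):

  /Connectivity        (ds_*, bv_*)
  /Integration         (iv_*)
  /Business Entities
  /Report Views
  /Data Services       (ws_*)
  /Associations        (a_*)
  /JMS Listeners       (jms_*)
  /Stored Procedures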

Denodo Knowledge Base VDP Naming Conventions

Additional, more detailed naming convention and Virtual DataPort organization guidance is available in the Denodo Community Knowledge Base, under Operations.

Knowledge Base Virtual Dataport (VDP) Naming Conventions Online Page

Virtual Dataport (VDP) Naming Conventions Downloadable PDF