Data-driven decision making is at the center of all things. The emergence of
data science and machine learning has further reinforced the importance of data
as the most critical commodity in today’s world. From FAAMG (the biggest five
tech companies: Facebook, Amazon, Apple, Microsoft, and Google) to governments
and non-profits, everyone is busy leveraging the power of data to achieve their
goals. Unfortunately, this growing demand for data has exposed the inefficiency
of current systems to support ever-growing data needs. This inefficiency is what led to the evolution of what we today know as the logical data lake.

What Is a Logical Data Lake?

In simple words, a data lake is a data repository that is capable of storing any
data in its original format. As opposed to traditional data warehouses, which use the ETL (Extract, Transform, and Load) strategy, data lakes work on the ELT (Extract, Load, and Transform) strategy. This means data does not have to be transformed before it is loaded, which essentially translates into reduced time and effort. Logical data lakes have captured the attention of
millions as they do away with the need to integrate data from different data
repositories. Thus, with this open access to data, companies can now begin to
draw correlations between separate data entities and use this exercise to their advantage.
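The ELT pattern described above can be sketched in a few lines of Python. This is a hedged illustration only; the function and field names are invented and not part of any data lake product. The point is that records land in the lake untouched, and transformation happens at read time:

```python
import json

# A minimal ELT sketch (names invented for illustration): records are loaded
# into the "lake" exactly as received; transformation happens only at read time.

lake = []  # stands in for the data lake's raw storage

def extract_and_load(raw_record: str) -> None:
    """Load the record as-is -- no upfront transformation (the "EL" of ELT)."""
    lake.append(raw_record)

def transform_on_read(field: str):
    """Apply the transformation lazily, when the data is actually queried."""
    for raw in lake:
        record = json.loads(raw)  # parse only at read time
        if field in record:
            yield record[field]

extract_and_load('{"user": "alice", "amount": 42}')
extract_and_load('{"user": "bob"}')  # different shape, still loads fine

print(list(transform_on_read("amount")))  # [42]
```

Because nothing is transformed up front, a record with an unexpected shape never blocks the load step; it simply falls out of any query that does not match it.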
Primary Use Case Scenarios of Data Lakes
Logical data lakes are a
relatively new concept, and thus, readers can benefit from some knowledge of
how logical data lakes can be used in real-life scenarios.
Experimental Analysis of Data:
Logical data lakes can
play an essential role in the experimental analysis of data to establish its
value. Since data lakes work on the ELT strategy, they lend agility and speed
to processes during such experiments.
To Store and Analyze IoT Data:
Logical data lakes can
efficiently store Internet of Things (IoT) data. Data lakes are capable
of storing both relational and non-relational data. Under logical data
lakes, it is not mandatory to define the structure or schema of the data
stored. Moreover, logical data lakes can run analytics on IoT data and come up
with ways to enhance quality and reduce operational cost.
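As a rough sketch of this schema-on-read idea, the following Python snippet (device names and fields are hypothetical) stores differently shaped IoT readings side by side and imposes a schema only when an analytic question is asked:

```python
import json
import statistics

# A hedged sketch of "schema-on-read" for IoT data: readings with different
# shapes live together in raw form; a schema is applied only when analytics run.

readings = [
    '{"device": "thermostat-1", "temp_c": 21.5}',
    '{"device": "thermostat-2", "temp_c": 19.0, "battery": 0.87}',
    '{"device": "door-sensor", "open": false}',  # different schema, still stored
]

def average_temperature(raw_readings):
    """Impose a schema at read time: keep only records with a temp_c field."""
    temps = [record["temp_c"]
             for record in map(json.loads, raw_readings)
             if "temp_c" in record]
    return statistics.mean(temps)

print(average_temperature(readings))  # 20.25
```

The door sensor's reading needed no schema change to be stored, yet it is cleanly ignored by an analysis it does not apply to.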
To Improve Customer Interactions:
Logical data lakes can
methodically combine CRM data with social media analytics to give businesses an
understanding of customer behavior as well as customer churn and its various drivers.

To Create a Data Warehouse:
Logical data lakes
contain raw data. Data warehouses, on the other hand, store structured and
filtered data. Creating a data lake is the first step in the process of data
warehouse creation. A data lake may also be used to augment a data warehouse.
To Support the Reporting and Analytical Function:
Data lakes can also be
used to support the reporting and analytical function in organizations. By
storing maximum data in a single repository, logical data lakes make it easier
to analyze all data to come up with relevant and valuable findings.
The logical data lake is a comparatively new area of study. However, it can be said with some certainty that logical data lakes will reshape traditional approaches to data management.
The 360-degree view of
the consumer is a well-explored concept, but it is not adequate in the digital
age. Every firm, whether it is Google or Amazon, is deploying tools to
understand customers in a bid to serve them better. A 360-degree view demanded
that a company consult its internal data to segment customers and create
marketing strategies. It has now become imperative for companies to look beyond
their own channels, to platforms like social media and reviews, to gain insight into
the motivations of their customers. The 720-degree view of the customer is
further discussed below.
What Is the 720-Degree View of the Customer?
A 720-degree view of the customer refers to a
three-dimensional understanding of customers, based on deep analytics. It
includes information on every customer’s level of influence, buying behavior,
needs, and patterns. A 720-degree view will enable retailers to offer relevant
products and experiences and to predict future behavior. If done right, this
concept should help retailers leverage emerging technologies, such as mobile commerce, social media, cloud-based services, and analytics, to sustain lifelong customer relationships.
What Does a 720-Degree View of the Customer Entail?
Every business desires to cut costs, gain an
edge over its competitors, and grow its customer base. So how exactly will a
720-degree view of the customer help a firm advance its cause?
Social media channels help retailers interact
more effectively and deeply with their customers. These channels offer reliable insights
into what customers would appreciate in products, services, and marketing
campaigns. Retailers can not only evaluate feedback, but they can also deliver
real-time customer service. A business that integrates its services with social
media will be able to assess customer behavior through signals like likes and
dislikes. Some platforms also enable customers to buy products directly.
Customer analytics will construct more detailed customer profiles by
integrating different data sources like demographics, transactional data, and
location. When this internal data is added to information from external
channels like social media, the result is a comprehensive view of the customer’s
needs and wants. A firm will subsequently implement more-informed decisions on
inventory, supply chain management, pricing, customer segmentation, and
marketing. Analytics further come in handy when monitoring transactions,
personalized services, waiting times, and website performance.
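A minimal sketch of this kind of profile-building, assuming hypothetical CRM and social-media records keyed by customer id, could look like this in Python:

```python
# A simplified sketch of building a unified customer profile by merging
# internal CRM data with external (e.g. social media) signals.
# All field names, ids, and records here are hypothetical.

crm = {
    "c-100": {"name": "Alice", "lifetime_value": 1200.0},
    "c-200": {"name": "Bob", "lifetime_value": 300.0},
}

social = {
    "c-100": {"sentiment": "positive", "followers": 5400},
    "c-200": {"sentiment": "negative", "followers": 120},
}

def build_profiles(internal, external):
    """Combine internal and external views keyed by customer id."""
    profiles = {}
    for customer_id, record in internal.items():
        profile = dict(record)                         # start from CRM data
        profile.update(external.get(customer_id, {}))  # enrich with social data
        profiles[customer_id] = profile
    return profiles

profiles = build_profiles(crm, social)
print(profiles["c-100"]["sentiment"])  # positive
```

In a real deployment the external data would arrive from APIs rather than a dict, but the merge-on-identity step is the essence of moving from a 360- to a 720-degree view.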
The modern customer demands convenience and
device compatibility. Mobile commerce also accounts for a significant amount of
retail sales, and retailers can explore multi-channel shopping experiences. By
leveraging a 720-degree view of every customer, firms can provide consumers
with the personalized experiences and flexibility they want. Marketing
campaigns will also be very targeted as they will be based on the transactional
behaviors of customers. Mobile commerce can take the form of mobile
applications for secure payment systems, targeted messaging, and push
notifications to inform consumers of special offers. The goal should be to
provide differentiated shopper analytics.
Cloud-based solutions provide real-time data across multiple channels, which yields an enhanced view of the customer. Real-time analytics influence decision-making in retail, and they also harmonize the physical and digital retail environments. Management will be empowered to detect sales trends as transactions take place.
The Importance of the 720-Degree Customer View
Traditional marketers were all about marketing
to groups of similar individuals, a practice often termed segmentation. This technique
is, however, giving way to the more effective concept of personalized
marketing. Marketing is currently channeled through a host of platforms,
including social media, affiliate marketing, pay-per-click, and mobile. The
modern marketer has to integrate the information from all these sources and
match them to a real name and address. Companies can no longer depend on a
fragmented view of the customer, as there has to be an emphasis on
personalization. A 720-degree customer view can offer benefits like:
Firms can improve customer acquisition by
drawing on the segment differences revealed by a new database of customer
intelligence. Consumer analytics will expose any opportunities to be taken
advantage of while external data sources will reveal competitor tactics. There
are always segment opportunities in any market, which are best revealed by
real-time consumer data.
Marketers who rely on enhanced digital data can
contribute to cost management in a firm. It takes less investment to serve
loyal and satisfied consumers because the firm is directly addressing their needs. Technology can be used to set customized pricing goals and to segment customers.

New Products and Pricing

Real-time data, in addition to third-party information, has a crucial impact on pricing. Only firms with robust and relevant competitor and customer data and analytics can take advantage of this. Marketers with a 720-degree view of the consumer across many channels will be able to seize opportunities for new products and personalized pricing to support business growth.
The first 360 degrees consist of an enterprise-wide
and timely view of all consumer interactions with the firm. The other 360
degrees consist of the customer's relevant online interactions, which
supplements the internal data a company holds. The modern customer is making
their buying decisions online, and it is where purchasing decisions are
influenced. Can you predict a surge in demand before your competitors? A
720-degree view will help you anticipate trends while monitoring the current ones.

The 720-Degree View and Big Data
Firms are always trying to make decision making
as accurate as possible, and this is being made more accessible by Big Data and
analytics. To deliver customer-centric experiences, businesses require a
720-degree view of every customer collected with the help of in-depth analysis.
Big Data analytical capabilities enable monitoring
of after-sales service-associated processes and the effective management of
technology for customer satisfaction. A firm invested in staying ahead of the
curve should maintain relevant databases of external and internal data with
global smart meters. Designing specific products to various segments is made
easier with the use of Big Data analytics. The analytics will also improve
asset utilization and fault prediction. Big Data helps a company maintain a
clearly defined roadmap for growth.
It is the dream of every enterprise to tap into
customer behavior and create a rich profile for each customer. The importance
of personalized customer experiences cannot be overstated in the digital era.
The objective remains to develop products that can be advertised and delivered
to customers who want them, via their preferred platforms, and at a lower cost.
Data virtualization is a data management approach that allows retrieval and manipulation of data without requiring technical details, such as where the data is physically located or how it is formatted at the source. Denodo is a data virtualization platform that offers more use cases than those supported by many data virtualization products available today. The platform supports a variety of operational, big data, web integration, and typical data management use cases helpful to technical and business teams. By offering real-time access to comprehensive information, Denodo helps businesses across industries execute complex processes efficiently. Here are 10 Denodo data virtualization use cases.
1. Big data analytics
Denodo is a popular data virtualization tool for examining large data sets to uncover hidden patterns, market trends, and unknown correlations, among other analytical information that can help in making informed decisions.
2. Mainstream business intelligence and data warehousing
Denodo can collect corporate data from external data sources and operational systems to allow data consolidation, analysis, and reporting, presenting actionable information to executives for better decision making. In this use case, the tool can offer real-time reporting, a logical data warehouse, hybrid data virtualization, and data warehouse extension, among many other related applications.
3. Data discovery
Denodo can also be used for self-service business intelligence and reporting as well as “What If” analytics.
4. Agile application development
Data services requiring software development where requirements and solutions keep evolving via the collaborative effort of different teams and end-users can also benefit from Denodo. Examples include Agile service-oriented architecture and BPM (business process management) development, Agile portal & collaboration development as well as Agile mobile & cloud application development.
5. Data abstraction for modernization and migration
Denodo also comes in handy for abstracting underlying data sources to allow for data migration and modernization. Specific applications for this use case include, but aren't limited to, data consolidation processes in mergers and acquisitions, legacy application modernization, and data migration to the cloud.
6. B2B data services & integration
Denodo also supports big data services for business partners. The platform can integrate data via web automation.
7. Cloud, web and B2B integration
Denodo can also be used in social media integration, competitive BI, web extraction, cloud application integration, cloud data services, and B2B integration via web automation.
8. Data management & data services infrastructure
Denodo can be used for unified data governance, providing a canonical view of data, enterprise data services, virtual MDM, and enterprise business data glossary.
9. Single view application
The platform can also be used for call centers, product catalogs, and vertical-specific data applications.
10. Agile business intelligence
Last but not least, Denodo can be used in business intelligence projects to address the inefficiencies of traditional business intelligence. The platform can support methodologies that enhance the outcomes of business intelligence initiatives, helping businesses adapt to ever-changing needs. Agile business intelligence ensures business intelligence teams and managers make better decisions in shorter periods.
With over two decades of innovation, applications in 35+ industries, and the multiple use cases discussed above, it's clear why Denodo is a leading platform in data virtualization.
Timestamp from date is one of those data type conversions that I occasionally have to do in DataStage but can never seem to remember. So, I thought I would write this quick post to document the data type conversion code, which is easy once I finally remember how to do it again.
I use the TimestampFromDateTime(%date%,%time%) function to
do this data type conversion. I’m sure there are other ways to achieve the
result, but I find this method clean and easy to perform. The TimestampFromDateTime(%date%,%time%)
function is in the Functions > Date & Time menu.
To populate the function, you need only add your date field and use '00:00:00' as your time element:

TimestampFromDateTime(<<Date Field Here>>, '00:00:00')
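For readers outside DataStage, the same idea can be illustrated in Python with `datetime.combine`. This is only an analogy, not DataStage code:

```python
from datetime import date, datetime, time

# A rough Python analogue of DataStage's TimestampFromDateTime(%date%,%time%):
# combine a date with a fixed '00:00:00' time element to build a timestamp.

def timestamp_from_date(d: date) -> datetime:
    """Build a timestamp from a date, using midnight as the time element."""
    return datetime.combine(d, time(0, 0, 0))

print(timestamp_from_date(date(2024, 3, 15)))  # 2024-03-15 00:00:00
```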
Today, newfound efficiencies and innovation are key to any business's success, whether small, medium, or large. In the rapidly evolving field of data analytics, innovative approaches to handling data are particularly important, since data is the most valuable resource any business can have. The IBM Common SQL Engine delivers application and query compatibility that allows companies to turn their data into actionable insights. This lets businesses unleash the power of their databases without constraints.
But, is this really important?
Yes. Many businesses have accumulated tons of data over the years. This data resides in higher volumes, in more locations throughout an enterprise (on-premises and in the cloud), and in greater variety. Typically, this data should be a huge advantage, providing enterprises with actionable insights. But often, this doesn't happen.
IBM Hybrid Data Management
With such a massive barrel of complex legacy data, many organizations find it confusing to decide what to do with it. Or where to start. The process of migrating all that data into new systems is simply a non-starter. As a solution, enterprises are turning to IBM Db2 – a hybrid, intuitive data approach that marries data and analytics seamlessly. IBM Db2 hybrid data management allows flexible cloud and on-premises deployment of data.
However, such levels of flexibility typically require organizations to rewrite or restructure their queries, and applications that will use the diverse, ever-changing data. These changes may even require you to license new software. This is costly and unfeasible. To bridge this gap, the Common SQL Engine (CSE) comes into play.
How Is the IBM Common SQL Engine Positioning Db2 for the Future?
The IBM Common SQL Engine inserts a single layer of data abstraction at the very data source. This means that, instead of migrating the data all at once, you can now apply data analytics wherever the data resides – whether on private, public or hybrid cloud – by using the Common SQL Engine as a bridge.
IBM's Common SQL Engine provides portability and consistency of SQL commands, meaning that SQL is functionally portable across multiple implementations. It allows seamless movement of workloads to the cloud and allows for multiplatform integration and configuration regardless of programming language.
Ideally, the Common SQL Engine is the heart of query and application compatibility. But it does so much more!
Its compatibility extends beyond data analytic applications to include security, management, governance, data management, and other functionalities as well.
How does this improve the quality, flexibility, and portability of Db2?
By allowing for integration across multiple platforms, workloads, and programming languages, the Common SQL Engine ultimately leads to a “data without limits” environment for the Db2 hybrid data management family through:
Query and application compatibility
The Common SQL Engine (CSE) ensures that users can write a query and be confident that it will work across the Db2 hybrid data management family of offerings. With the CSE, you can change your data infrastructure and location, on-cloud or on-premises, without having to worry about license costs and application compatibility.
Data virtualization and Integration
The Common SQL Engine has a built-in data virtualization service that ensures you can access your data from all your sources. These services support the Db2 family of offerings, including IBM Db2, IBM Db2 Warehouse, and IBM Db2 Big SQL, among others.
These services also apply to the IBM Integrated Analytics System, Teradata, Oracle, PureData, and Microsoft SQL Server. Besides, you can work seamlessly with open-source solutions such as Hive, and with cloud sources such as Amazon Redshift. Such levels of integration are unprecedented!
By allowing users to effectively pull data from Db2 data stores and integrate it with data from non-IBM stores using a single query, the common SQL engine places Db2 at an authoritative position as compared to other data stores.
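A minimal sketch of this single-query idea, using two SQLite databases to stand in for a Db2 store and a non-IBM store (table and column names are invented; a real federation layer such as the CSE does far more than this):

```python
import os
import sqlite3
import tempfile

# Two SQLite databases stand in for a Db2 store and a third-party store.
# The ATTACH statement plays the role of the virtualization layer, letting
# one SQL statement join data from both "sources". Names are illustrative.

db2_like = sqlite3.connect(":memory:")
db2_like.execute("CREATE TABLE orders (customer_id INTEGER, total REAL)")
db2_like.execute("INSERT INTO orders VALUES (1, 99.0), (2, 150.0)")

# The second "source" is written to a file so it can be attached.
path = os.path.join(tempfile.mkdtemp(), "other_store.db")
other = sqlite3.connect(path)
other.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
other.execute("INSERT INTO customers VALUES (1, 'Alice'), (2, 'Bob')")
other.commit()
other.close()

# One query spans both stores, as if they were a single database.
db2_like.execute("ATTACH DATABASE ? AS ext", (path,))
rows = db2_like.execute(
    "SELECT c.name, o.total"
    " FROM orders o JOIN ext.customers c ON c.id = o.customer_id"
    " ORDER BY c.name"
).fetchall()
print(rows)  # [('Alice', 99.0), ('Bob', 150.0)]
```

The application code never needs to know that `orders` and `customers` live in different stores; that locational transparency is the core promise of the federation approach described above.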
Flexible licensing

Licensing is one of the hardest nuts to crack, especially for smart organizations that rely on technologies such as the cloud to deliver their services. While application compatibility and data integration will save you time, flexible licensing saves you money on the spot.
IBM's Common SQL Engine allows flexible licensing, meaning that you can purchase one license model and deploy it wherever needed, or as your data architecture evolves. Using IBM's FlexPoint licensing, you can purchase FlexPoints and use them across all Db2 data management offerings. This is convenience in one place.
The flexible licensing will not only simplify the adoption and exchange of platform capabilities, but it also positions your business strategically by making it more agile. Your data managers will be able to access the tools needed on the fly, without going through a lethargic and tedious procurement process.
IBM Db2 Data Management Family Is Supported by the Common SQL Engine (CSE)
IBM Db2 is a family of custom, deployable databases that allows enterprises to leverage existing investments. IBM Db2 allows businesses to use any type of data, whether from a structured or unstructured database (or data warehouse). It provides the right data foundation with industry-leading data compression, on-premises and cloud deployment options, modern data security, robust performance for mixed workloads, and the ability to adjust and scale without redesigning.
The IBM Db2 family enables businesses to adapt and scale quickly and remain competitive without compromising security, risk levels, or privacy. It features:
Deployment and flexibility: on-premises, scale-on-demand, and private or cloud deployments
Compression and performance
Embedded IoT technology that allows businesses to act fast on the fly
Some of these Db2 family offerings that are supported by the common SQL engine include:
Db2 Big SQL
Db2 on Cloud
Db2 Warehouse on Cloud
IBM Integrated Analytics System (IIAS)
Db2 Family Offerings and Beyond
Since the Common SQL Engine mainly focuses on data federation and portability, other non-IBM databases can also plug into the engine for SQL processing. These third-party offerings include:
Watson Data Platform
Microsoft SQL Server
The IBM Common SQL Engine allows organizations to fully use data analytics to future-proof their business while remaining agile and competitive. In fact, besides the benefits of having robust tools woven into the CSE, this SQL engine offers superior analytics and machine-learning positioning, with data processing 2X to 5X faster. The IBM Common SQL Engine adds important capabilities to Db2, including freedom of location, freedom of use, and freedom of assembly.