Business teams need the IT department more than ever

Company departments such as marketing, sales, finance, operations, and HR have come to rely on IT to provide expertise on current technology and, more importantly, a road map that shows where technology is headed and how to make the best use of increasingly sophisticated tools.

In the past, business teams might have bought or developed their own tools, databases, hardware, and software without considering the maintenance of those systems. In our experience, this was often because IT was seen as a roadblock or didn’t move as fast as expected.

At BrightBI Consulting, we’ve seen that business departments that embrace the IT team as a partner succeed far more than others. On one hand, companies need to rely on IT for technology, integration, and implementation expertise; the care and feeding of the data; the reliability of systems; and an eye toward future technologies that may add value for the organization. On the other hand, IT needs to rely on the business to define a direction and establish clear objectives.


In our experience, this relationship needs to focus on the following main practices:

Focus on a common goal. Business departments and IT have one important thing in common: they both want to demonstrate value to the organization, for example by improving the customer experience, raising sales, or reducing costs.

Because we now rely heavily on data and analytical tools to do our work, we need a greater level of help from IT experts, especially in how to create and collect enterprise-level customer data while ensuring data integrity.

Collaborate on a digital road map. Emerging IT solutions are increasingly focused on external customers. This means IT must be more strategic, and being more strategic means being more engaged in discussions and planning with all business departments. Companies must recognize the need to create this engagement by having IT representatives attend other departments’ meetings. Only by doing so can IT become more proactive in allocating resources and suggesting new technologies that help departments reach their goals sooner.


Making the shift to a deeper partnership between IT and other business departments can take time, so enter this transition with patience in mind.


Use Artificial Intelligence to Stay On Top

Businesses have never faced so much pressure to deliver value and results in less time – and that calls for some serious disruption.

One source of assistance is Artificial Intelligence (AI). Our experience tells us that humans can no longer ingest and process data, and produce insights, faster and better than machines. Let’s look at some examples:

  • Better Service Delivery: A machine that can learn a range of tasks and then complete them adds huge value to the business. Many companies have explored chatbot functionality, where software is trained to help clients and learns how to solve their problems case after case. This innovation has helped businesses reduce the strain on call centers while giving consumers access to a service that suits them. Without AI, this would not have been possible. But even though this added service capacity is rooted in technology, it is built on the fact that great service keeps customers coming back.
  • Develop Better Products: Every company, product, or service can be improved by incorporating Artificial Intelligence. Whether you use dynamic branching to make it easier for your customers to fill in an online form, or program a digital concierge to give a more accurate, broad range of responses to questions, Artificial Intelligence carries value and the power to enhance customer experiences. Chances are your competitors are already trying to innovate with this kind of technology, and you need to keep up if you don’t want your business to get left behind.


  • Smarter Financial Decisions: One of the biggest problems that businesses face is balancing their expenditure with their income. Rising salaries, increased staff quotas, and ever-shrinking profit margins mean it is more important than ever to make shrewd financial decisions. Artificial Intelligence that uses data-mining software can help you make those decisions. Your financial adviser might be able to refer you to a few sources for affordable small business capital, but they can never process the sheer volume of information that an intelligent algorithm can. We have not yet reached a point where financial advisers should be replaced, but you can empower them to make better choices by investing in AI that analyses endless streams of information to find the best financial products for your business.
  • Automation: Imagine working in a factory and struggling every day to manage staff overtime. An algorithm can be trained to recognize the patterns that usually indicate a need for overtime and adjust output accordingly to avoid extra cost. It can also warn operators of imminent machine failures based on thousands of hours of historical data. One of the biggest business benefits of AI is therefore the reduction in staff hours that comes with better processes.
  • Monitor Human Capital: The world of human capital and people analytics is focused on using data to understand which employees are giving their best, and which are not delivering. With the backing of Artificial Intelligence, business leaders can see who is likely to let them down, and train or replace them accordingly. While it seems brutal, this is a primary example of the way AI and new technologies can drive better business performance.
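
The overtime scenario above can be sketched as a simple forecasting rule. This is a minimal illustration, not a production model; the production history, shift capacity, and averaging window are all made-up assumptions:

```python
from statistics import mean

# Hypothetical daily units produced over the last two weeks.
history = [510, 495, 530, 620, 640, 505, 500, 515, 630, 650, 520, 498, 640, 660]

SHIFT_CAPACITY = 560  # units one regular shift can produce (assumed)
WINDOW = 3            # days used for the moving-average forecast

def forecast_next_day(history, window=WINDOW):
    """Naive moving-average forecast of tomorrow's demand."""
    return mean(history[-window:])

def overtime_needed(history, capacity=SHIFT_CAPACITY):
    """Flag whether forecast demand exceeds regular-shift capacity."""
    return forecast_next_day(history) > capacity

print(forecast_next_day(history))  # average of the last 3 days
print(overtime_needed(history))    # True: overtime should be scheduled
```

A real system would use a trained model rather than a moving average, but the decision rule (forecast demand, compare to capacity, schedule ahead of time) is the same.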

Our experience tells us that to progress, businesses need to embrace new technologies. AI is as disruptive and impactful as cell phones and the internet itself, so use it to stay afloat, or you run the risk of struggling to stay competitive in a rapidly changing business environment.

Key Pricing and TCO Practices for Cloud BI

Leaders around the globe face pressure to accomplish more tasks in shorter periods of time, and they demand agile analytics tools to make fully informed decisions quickly.

Also, business users expect access to a system where they can analyze data when and where they choose, without waiting for IT to deliver customized reports or to deploy analytics tools to departments once there’s availability in the IT project pipeline.

Indeed, as organizational workloads grow more sophisticated and legacy systems are pushed to their limits, businesses are demanding greater agility, increased flexibility, faster time-to-value, and real economic impact.

That is why senior executives across organizations are considering migrating to cloud-based analytics platforms.


One of the most attractive selling points for cloud analytics is its more flexible OpEx (operating expenditures) pricing model. Under an OpEx pricing model for cloud analytics, there’s no long-term commitment to the tools and assets being invested in. Companies typically pay for cloud analytics services on a monthly, quarterly, or annual basis, which offers greater flexibility.

In addition, companies that rely on cloud analytics receive software updates and new features immediately, without enduring costly and time-consuming installation efforts from IT departments.

Moreover, the up-front cost avoidance offered through the OpEx model enables organizations to free up cash flow.

By comparison, under a CapEx (capital expenditures) model for premise-based analytics, companies not only pay annual software licensing and maintenance fees for analytics tools but also pay for associated servers, peripheral devices, and the IT staff to support these technologies.


Shifting to an OpEx model under the use of cloud-based analytics also enables organizational leaders to improve time-to-value through the use of analytics by avoiding the lengthy process for obtaining budgetary approvals for capital investments. Instead of waiting weeks or months to have funding approved and to install premise-based analytics software, companies that use cloud-based analytics tools can become operational within days or even hours.
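
To make the OpEx/CapEx comparison concrete, here is a rough three-year TCO sketch. Every figure is hypothetical; real pricing varies widely by vendor, scale, and deployment:

```python
# Illustrative 3-year TCO comparison; all figures are made up.

def capex_tco(license_fee, annual_maintenance, server_cost, annual_it_support, years=3):
    """Premise-based model: up-front license and servers, plus yearly maintenance and IT staff."""
    return license_fee + server_cost + years * (annual_maintenance + annual_it_support)

def opex_tco(monthly_subscription, years=3):
    """Cloud model: a recurring subscription covers software, hardware, and updates."""
    return 12 * years * monthly_subscription

on_premise = capex_tco(license_fee=120_000, annual_maintenance=24_000,
                       server_cost=60_000, annual_it_support=50_000)
cloud = opex_tco(monthly_subscription=6_000)

print(f"3-year CapEx TCO: ${on_premise:,}")  # $402,000
print(f"3-year OpEx TCO:  ${cloud:,}")       # $216,000
```

Beyond the totals, note where the cash leaves: the CapEx model front-loads $180,000 before any value is delivered, while the OpEx model spreads spending evenly, which is the cash-flow benefit described above.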

What’s Data Governance?

Data Governance is the exercise of decision-making and authority for data-related matters. It is also a system of decision rights and accountabilities for information-related processes, executed according to agreed-upon models that describe the six Ws:


  • who can take
  • what actions
  • with what information
  • when
  • under what circumstances
  • using what methods.
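
As a sketch, the six Ws can be modeled as a decision-rights rule that a request must match in full. The roles, actions, and tools below are hypothetical examples, not part of any standard:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    who: str            # who can take
    action: str         # what actions
    information: str    # with what information
    when: str           # when
    circumstance: str   # under what circumstances
    method: str         # using what methods

# An agreed-upon policy: one example rule (all names are made up).
POLICY = [
    Rule("data-steward", "update", "customer-master", "business-hours",
         "change-approved", "mdm-console"),
]

def is_allowed(request: Rule) -> bool:
    """A request is allowed only if it matches a rule on all six Ws."""
    return request in POLICY

print(is_allowed(Rule("data-steward", "update", "customer-master",
                      "business-hours", "change-approved", "mdm-console")))  # True
print(is_allowed(Rule("analyst", "update", "customer-master",
                      "business-hours", "change-approved", "mdm-console")))  # False
```

The point is not the code itself but that every decision right is explicit: change any one of the six Ws and the request no longer matches the agreed-upon model.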

Data Governance programs can differ significantly depending on their focus. They can focus on Compliance, on Data Integration, on Master Data Management, and so on. However, regardless of the focus adopted, every program will have essentially the same three-part mission:

  1. to make/collect/align rules,
  2. to resolve issues, and
  3. to monitor/enforce compliance while providing ongoing support to Data Stakeholders.

Typical goals of a Data Governance Program

  1. Enable better decision-making.
  2. Reduce operational friction.
  3. Protect the needs of data stakeholders.
  4. Train management and staff to adopt common approaches to data issues.
  5. Build standard, repeatable processes.
  6. Reduce costs and increase effectiveness through coordination of efforts
  7. Ensure transparency of processes.


You may have other goals, depending on the focus of your program. Some of these goals will address your general infrastructure and culture, such as identifying data stakeholders and their specific value propositions for solving data-related issues. Others may be very specific, such as involving the business in validating a certain percentage of core business data definitions.

Who is involved with Data Governance?

Data Governance is of concern to any individual or group who has an interest in how data is created, collected, processed, manipulated, stored, made available for use, or retired. These people are called Data Stakeholders. Often, Data Stakeholders are happy to let the various IT Management and Data Management teams decide how to carry out these tasks. But sometimes these activities require decisions that really should be made by groups of stakeholders according to an agreed-upon process; that’s when Data Governance comes into play. Such decision-making (and other activities) is facilitated and coordinated by centralized resources.

When do organizations need formal Data Governance?

Organizations need to move from informal governance to formal Data Governance when one of four situations occurs:


  1. The organization gets so large that traditional management isn’t able to address data-related cross-functional activities.
  2. The organization’s data systems get so complicated that traditional management isn’t able to address data-related cross-functional activities.
  3. The organization’s Data Architects, SOA teams, or other horizontally-focused groups need the support of a cross-functional program that takes an enterprise view of data concerns and choices.
  4. Regulation, compliance, or contractual requirements call for formal Data Governance. 

Where in an organization are Data Governance Programs located?

Data Governance Programs can be placed within Business Operations, IT, Compliance/Privacy, or Data Management organizational structures. What’s important is that they receive appropriate levels of leadership support and appropriate levels of involvement from Data Stakeholder groups.

Why use a formal Data Governance Framework?

Frameworks help us organize how we think and communicate about complicated or ambiguous concepts. The use of a formal framework can help Data Stakeholders from Business, IT, Data Management, Compliance, and other disciplines come together to achieve clarity of thought and purpose.

The use of a framework can help management and staff make good decisions – decisions that stick. It can help them reach consensus on how to “decide how to decide.” That way, they can more efficiently create rules, ensure that the rules are being followed, and deal with noncompliance, ambiguities, and issues.

How does an organization “do” Data Governance?

First, they decide what’s important to them – what their program will focus on. Then they agree on a value statement for their efforts, which helps establish scope, SMART goals, success measures, and metrics. Next, they develop a roadmap for their efforts and use it to acquire the support of stakeholders.


Once that support is achieved, they design the program, deploy it, carry out the processes involved in governing data, and perform the processes involved in monitoring, measuring, and reporting the status of the data, the program, and its projects.

Data Governance programs tend to start by focusing their attention on finite issues, then expanding their scope to address additional concerns or additional sets of information. And so, the establishing of Data Governance tends to be an iterative process; a new area of focus may go through all of the steps described above, at the same time that other governance-led efforts are well-established in the “govern the data” phase.

How much Data Governance do we need?

Only as much as it takes to meet your goals. The DGI Data Governance Framework can be applied to pervasive, “big-bang” programs, but it was specifically designed for organizations that intend to apply governance in a limited fashion and then scale as needed. All ten components of Data Governance described in the framework will be present in even the smallest programs and projects; the level of complexity grows as the number of participants or the complexity of the data systems increases.


By standardizing your teams on the terminology and concepts described in the framework, you’re training your Business, IT, and Compliance staff to communicate with each other in a way that leads to realizing value from your data assets, managing cost and complexity, and ensuring compliance. An “act locally, think globally” approach to Data Governance means your teams will be ready when it’s time to tackle large or complex data-related issues.

How do we assess whether we are ready for Data Governance?

It’s important to assess readiness for Data Governance before you move from your current state to a more formal approach to governance and stewardship. Why? There may be a valid reason why the current model is in place. Likewise, there may be a good reason why change could be detrimental to the enterprise, a particular program or project, or even an individual’s career. Red flags include:

  • Refusal of business groups to get involved
  • Refusal of leadership to sponsor a Data Governance effort
  • The decision to implement a bottom-up program when the decisions and rules that must be implemented clearly must come down from the top of the organization
  • The decision to empower a group (an outsourcer, partner, or team) to make data-related decisions for a data-related effort where they would benefit from NOT:
      • Considering an enterprise view.
      • Involving data stakeholders.
      • Correcting data issues.
      • Acknowledging data issues.

Advantages of Data Governance

In today’s world many businesses are growing rapidly, and each day their systems process countless transactions and create vast amounts of new data, some as simple as adding a new customer, vendor, material, payment, credit, or debit. Whether data is entered manually or digitally, there is always a chance of entering incorrect or duplicate records, and that can lead to a data disaster for decision making and for implementing new business strategies.


Companies are starting to realize this and see that their data must be cleansed and enriched to compete and to get the full benefit of their historical and present master and transactional data. To get a better handle on data as a strategic asset, companies are empowering their people as well as technology and processes to manage the long term quality of their data.

Data governance controls the quality of the data and provides consistent and trusted data that business users can rely upon to make critical decisions. Below are some advantages of data governance:

  • Making data consistent.
  • Improving data quality.
  • Making data accurate, complete.
  • Maximizing the use of data to make decisions.
  • Improving business planning.
  • Improving financial performance.
  • Maximizing profits of the company.

A good data governance process lets companies know that, whether the data they are accessing is current or historical, it will be reliable and usable for analysis. The benefits of data governance, such as those listed above, are an ROI the company can realize well into the future.
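
Two of the quality dimensions listed above, consistency and completeness, can be sketched as simple automated checks. The customer records and required fields are made up for illustration:

```python
# Minimal data-quality checks a governance program might enforce:
# duplicate detection and completeness, over made-up customer records.

REQUIRED_FIELDS = ("id", "name", "email")

customers = [
    {"id": 1, "name": "Acme Corp", "email": "info@acme.example"},
    {"id": 2, "name": "Globex",    "email": "sales@globex.example"},
    {"id": 3, "name": "Acme Corp", "email": "info@acme.example"},  # duplicate entry
    {"id": 4, "name": "Initech",   "email": ""},                   # incomplete record
]

def find_duplicates(records, keys=("name", "email")):
    """Return records whose key fields repeat an earlier record."""
    seen, dupes = set(), []
    for r in records:
        fingerprint = tuple(r[k].strip().lower() for k in keys)
        if fingerprint in seen:
            dupes.append(r)
        seen.add(fingerprint)
    return dupes

def find_incomplete(records, required=REQUIRED_FIELDS):
    """Return records missing a value for any required field."""
    return [r for r in records if any(not r.get(f) for f in required)]

print([r["id"] for r in find_duplicates(customers)])  # [3]
print([r["id"] for r in find_incomplete(customers)])  # [4]
```

In a real program, checks like these would run continuously against master data, with the governance process deciding who fixes the flagged records and how.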

In-Memory Database Systems and Solutions

Big data and analytics have become a major competitive differentiator, but managing massive amounts of data with around-the-clock uptime is an ongoing challenge for IT. It’s more critical than ever to achieve the performance, availability and robust security that enterprises need for mission-critical workloads while keeping costs low.

“What’s an In-Memory Database system?”

An in-memory database system (IMDB) is a database management system that relies primarily on main memory (RAM) for data storage in order to deliver faster response times. The following are some key characteristics of In-Memory Database systems:

  1. In-Memory Database systems use RAM as a first-class storage layer, reading and writing directly to and from memory without touching the disk.
  2. Source data is loaded into system memory in a compressed format, which means the footprint required compared with traditional database systems is reduced considerably.
  3. An In-Memory Database can serve as an analytic database: a read-only system that stores historical or infrequently used data on disk.
  4. IMDB systems allow users to run queries and reports on the information they contain, which is regularly updated to incorporate recent transaction data from the organization’s operational systems.
  5. In addition to providing extremely fast query response times, in-memory analytics can reduce or eliminate the need for data indexing and for storing pre-aggregated data in OLAP cubes or aggregate tables. This reduces IT costs and allows faster implementation of BI/BA applications.
  6. In-Memory Database systems use internally optimized algorithms that are simpler and execute fewer CPU instructions.
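
To illustrate the core idea, the sketch below uses SQLite’s ":memory:" mode, which keeps the entire database in RAM. SQLite is not an enterprise analytic IMDB like the products discussed later, but it demonstrates RAM-resident reads and writes:

```python
import sqlite3

# ":memory:" keeps the whole database in RAM: reads and writes never
# touch the disk, which is the core idea behind IMDB systems.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("EMEA", 1200.0), ("APAC", 950.0), ("EMEA", 800.0)])

# Queries run entirely against memory-resident data.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('APAC', 950.0), ('EMEA', 2000.0)]
conn.close()
```

Note the trade-off this sketch makes visible: when the connection closes, the data is gone, which is why production IMDBs add logging, replication, or snapshots for durability.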

Advantages of an In-Memory Database over traditional Databases

The purpose of an In-Memory Database is to provide increased performance over traditional database queries, especially for organizations that deal with big data or that regularly struggle with performance.

The premise behind the In-Memory Database comes from its ability to put the working set of either complete or partial data into system memory. In the case of partial data being moved in-memory, the tables selected are those that would benefit most from the increased speed gained from dynamic RAM storage, which brings us to the first advantage in-memory databases have over their traditional counterparts.

1. Speed (faster queries)

When a user queries a large data set stored in a traditional database, it takes time to process the request. Using an In-Memory Database speeds up queries because it takes much less time to search data held in memory. According to those who use in-memory databases, the speed difference is significant.

2. Real-Time Decision Making

All of the queries can be returned in the time it takes to get business done. In-memory data helps ensure that decision makers have the most relevant and timely information available when they are speaking to a customer, a supplier, or even management in their own company. With this information, they can present an accurate picture rather than waiting for information to be returned or basing their decisions on outdated information. This is why having this data is a definite competitive advantage.

3. Big Data Management

An in-memory database helps with big data management because it is used with applications that require very fast data access, storage, and manipulation, even in systems that have no disk but still need to manage large volumes of data.

4. Real-time Updates

An important advantage of in-memory database systems is their suitability for real-time embedded systems, which are highly resource-constrained and require a small memory and CPU footprint.

5. Reduced IT Costs

In-Memory analytics can reduce or eliminate the need for data indexing and storing pre-aggregated data in OLAP cubes or aggregate tables. This capacity reduces IT costs and allows faster implementation of BI/BA applications.

6. Reduced burden on IT Resources

How can these big databases, which commonly reside on large server hard drives, sit in memory?


This can be attributed to the low cost of memory: more memory is built into servers. But cost alone doesn’t account for how In-Memory databases work. The databases themselves have to be designed efficiently, which means less redundancy in data tables. Furthermore, data is compressed to help it fit within the smaller confines of memory as opposed to disk storage. As a result, there’s less need to purchase servers equipped with large hard disk drives, and resources are saved by querying memory rather than power-consuming hard drives.
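
The compression point can be illustrated with Python’s standard zlib module: repetitive, column-like data shrinks dramatically, which is part of why memory-resident storage is feasible. The sample data is artificial:

```python
import zlib

# A highly repetitive "column" of values, as columnar analytic data often is.
column = ("EMEA," * 50_000).encode()  # 250,000 bytes uncompressed

compressed = zlib.compress(column)
print(len(column), len(compressed))   # the compressed form is far smaller
```

Real IMDBs use column-oriented encodings (dictionary, run-length) rather than general-purpose zlib, but the effect is the same: repetitive values collapse to a fraction of their raw size.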

Main In-Memory Database players


1. Microsoft SQL Server (Hekaton)

Microsoft SQL Server’s in-memory technology (Hekaton) is a new database engine optimized for memory-resident data and OLTP workloads, fully integrated into SQL Server: it is not a separate system. To take advantage of Hekaton, a user simply declares a table to be memory-optimized.


Hekaton is designed around 4 architectural principles:

  1. Optimize for main memory data access.
  2. Accelerate business logic processing.
  3. Provide frictionless scale-up.
  4. Built into SQL Server.

Main Capabilities:

  • Achieve breakthrough performance with in-memory technology built in across all workloads.
  • Get the reliability and high availability you need with AlwaysOn.
  • Gain enterprise-class scalability and predictable performance.
  • Row Level Security and Always Encrypted technology in SQL Server 2016 protect data at rest and in motion.
  • SQL Server Management Studio helps to centrally manage database infrastructure across your datacenter and the cloud.

2. ORACLE TimesTen

Oracle TimesTen In-Memory Database is a full-featured relational database that runs in the application tier, storing all data in main memory. This dramatically reduces latency and increases throughput.


  • Oracle TimesTen In-Memory Database stores data in application tier main memory, and with no network latency or disk I/O, transactions take just microseconds and complex analytic queries happen at the speed of thought.
  • Provides enterprise-class reliability and availability by logging data and transactions to disk to enable a full recovery, and with high-speed replication, Oracle TimesTen In-Memory Database can be configured for high availability and instant failover.
  • Oracle TimesTen In-Memory Database is embedded in Oracle Exalytics In-Memory Machine, enabling Oracle Business Intelligence Standard Edition users to perform complex analytic queries at real-time speeds.
  • Oracle TimesTen In-Memory Database supports full SQL transaction semantics and includes OCI, Pro*C and PL/SQL for compatibility with Oracle Database.
  • Accelerates existing Oracle Database applications when used as a high-performance cache for Oracle Database, Enterprise Edition (see Oracle TimesTen Application-Tier Database Cache).


3. SAP HANA

SAP HANA’s in-memory database powers real-time insights across your business systems and data.

Main Functional Capabilities

  • High-performance computing: It leverages the latest hardware and software innovations to accelerate performance for all applications.
  • Comprehensive data processing: Embeds multiple data processing engines and predictive libraries to maximize value from Big Data and the Internet of Things (IoT).
  • OLAP and OLTP support: Allows processing for transactional and analytic workloads on the same system with online transaction processing (OLTP) and online analytical processing (OLAP).
  • Administration and security: It helps you monitor system health and network security which are key tasks for administrators.
  • Integration services: All of your data sources can be integrated into SAP HANA – to complement your SAP HANA applications or to perform in-depth analyses.

Main Technical Capabilities

  • Open environment: Supports standard JDBC/ODBC and RESTful web services to help you build applications that can be easily integrated with your legacy systems.
  • Componentized data integration: Allows maximum flexibility and shrinks TCO.
  • Efficient system management: Integrates development, administration, and monitoring tools so systems can be managed more efficiently.