Thursday, March 27, 2014

The Three Most Common Mistakes Made in Oracle BI Application projects

For a variety of reasons, completing a successful Oracle BI Applications project is not as straightforward as one might expect, given that the BI Applications are touted as a pre-built, end-to-end solution out of the box.

Based on our experience implementing new Oracle BI Applications projects and rescuing failed ones, there are three common mistakes that can determine the success or failure of a project:


1.   Failing to follow the installation and configuration guides


This may seem hard to believe, but many projects have been implemented without following the specific instructions in the installation guide and/or the configuration guide.

The installation guide is critical for setting up the infrastructure for Informatica, DAC, and the OBIEE platform.

Oracle Business Intelligence Applications Installation Guide for Informatica PowerCenter Users

Some of the more common steps missed include:
  • Setting up the SSE_ROLE for the target data warehouse user
  • Configuring the proper Code Page settings for data movement between source and target
  • Reviewing and applying the supplied Oracle database parameter settings from the parameter template file (for example, init11g.ora for Oracle 11g)
  • Setting the PowerCenter Integration Services custom properties, specifically the overrideMpltVarWithMapVar property, which enables Informatica to evaluate parameters within mapplets
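As an illustration of the first item above, a minimal sketch of the role setup on an Oracle target, assuming a hypothetical warehouse schema owner named OBAW_USER (the role name SSE_ROLE itself comes from the installation guide, but consult it for the exact grants your release requires):

```sql
-- Sketch only: OBAW_USER is a hypothetical warehouse schema owner.
-- The ETL connects as this user and expects the SSE_ROLE to exist.
CREATE ROLE SSE_ROLE;
GRANT SSE_ROLE TO OBAW_USER;
```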
The configuration guide provides instructions for setting up the common Oracle BI Applications areas and dimensions as well as the functional area configuration steps.

Oracle® Business Intelligence Applications Configuration Guide for Informatica PowerCenter Users

Under common dimensions, it is critical that the exchange rate and calendar configurations are completed correctly and relate to the specific source system environment that will be used.

For the functional areas, there are a series of configuration files that must be reviewed and edited to conform to the source system. For example, the Human Resources functional area requires that the band dimension files for Person Age, Job Requisition Age, Performance Ratings, and Period of Service be configured before the data is loaded.

Also, the domain CSV files for Ethnic Group, Sex, Employment, and other HR attributes should be reviewed prior to the data load. For HR Analytics, the most critical domain file that must be configured is the one that populates the Workforce Event dimension. This file (named domainValues_Wrkfc_EventType_psft.csv for PeopleSoft implementations) maps each employee assignment event from the PS_JOB table to a standard set of values for Hires, Terminations, and other job-related activities. It should be reviewed with knowledgeable HR subject matter experts to properly categorize each Action/Reason Code combination into the standard event types.
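As a purely illustrative sketch of what such a mapping looks like, the rows below pair PeopleSoft-style Action/Reason codes with standard event types. The column names and codes here are invented for illustration; the delivered domainValues_Wrkfc_EventType_psft.csv defines the actual layout and the valid event type values:

```
ACTION,ACTION_REASON,EVENT_TYPE
HIR,NEW,Hire
TER,RES,Voluntary Termination
TER,LAY,Involuntary Termination
PRO,MER,Promotion
```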

2.   Developing dashboards without continuous user involvement

The four words that strike fear into the heart of any Oracle BI Applications consultant are "I have a spreadsheet". In many implementations, dashboard development requirements are taken directly from one or more existing spreadsheets that are passed among various business organizations. This approach more often than not leads to a disappointed user base when the final dashboard is delivered, because OBIEE, while quite powerful, cannot always replicate the form and function that is easily built into a spreadsheet.

A far better approach is to take the existing spreadsheets and work through a fit-gap analysis to understand the business requirements and metrics that drive them. Once that is complete, the OBIA data model should be modified to reflect those requirements before any dashboard and analysis development begins. Once the data model is ready and populated with actual or test data, schedule workshops with users to demonstrate the capabilities of OBIEE on top of that model. Rather than duplicating spreadsheets, focus on the data model and the flow of an analysis. Many spreadsheets have thousands of rows that users filter and then pivot to create other summary analyses.

Start with a top down approach on the dashboards, focusing on:  
  • dashboard prompts to filter reports automatically 
  • drilling and navigation 
  • conditional highlighting
  • ranking reports to identify outliers and top performers
  • charts that visually display trends
  • multiple view types of the same data using view selectors
  • column selectors
  • filters and calculations driven by presentation variables 
The key is to get users to think about interactive analysis instead of data dumps and scrolling through long table format reports.   

It is important to push back when users ask for features that are not easily achieved in OBIEE or that require significant modification to the data model just to meet a very specific reporting requirement. Weigh the development and maintenance cost of such changes against the value of an occasionally excessively specific report requirement before heading down a path that can lead to project delays.

Involve users throughout the development process to get their input and feedback.  With the rapid development capabilities of Answers, it is very easy to modify the layout of dashboards and analyses on the fly to get buy-in from the users. 


3.   Implementing the RPD without modification

The delivered metadata repository (RPD) that comes with the Oracle Business Intelligence Applications should not be considered a final product. On every OBIA project, one of the first tasks that should be performed is an RPD review with the business users to develop a list of customizations that will make the Presentation Layer of the RPD a more effective representation of the business. Performing this process early greatly reduces development time later when reports are built. It also greatly improves adoption among users who are new to the OBIEE Answers tool.

The three R's of the RPD review process are:  Rename, Remove, and Reorder.

Rename any presentation column or table to reflect the business definition. It is far easier to rename a column than to get users to convert their known business vocabulary to match that of OBIA. For example, rename the out-of-the-box Employee Organization table and columns to Department.

Remove any presentation columns and tables that are not required for analysis.  This includes any columns that may be exposed in the Presentation Layer but are not populated by the ETL for the particular source system for the implementation.   Work under the assumption that any column exposed in the Presentation Layer must be populated by ETL, unit tested for accuracy, and useful for creating analyses.    Simplicity yields project success. 

Reorder presentation tables and columns to be more effective for users.   Put most frequently used columns at the top of presentation tables.    Put dimension tables at the top and facts at the bottom of the subject area.    Group similar metrics together either by purpose or by time series.    Make good use of presentation table foldering to minimize the number of attributes and metrics displayed. 

Conclusion:


There are no guarantees of success when implementing a BI application. But there are certainly ways to increase the possibility of attaining the ultimate goal: satisfied users with a useful business analysis tool delivered on time and on budget. It can be done.

Sunday, March 9, 2014

(ROI) Return on Investment for Business Intelligence

How to justify a business intelligence system has occupied people ever since these systems were called decision support systems. Supposedly the numbers are "soft". Reductions in IT costs virtually never cover the cost of replacing spreadsheets and Access databases with a proper BI system. Yet organizations spend over a billion dollars a year on BI systems, and that's just in the USA. In today's environment, CFOs will not let capital investments pass without some kind of business case. So what do these companies do?


Let’s briefly dissect the formula for return on investment:
ROI = Expected value(NPV(cash flows from revenue increases or cost savings))
      ÷ Expected value(NPV(cash outlays for hardware, software, labor, and services))

This formula tells us what we have to look for and how we should maximize the value.  We need to estimate:
  • Cash flows and their timings
  • Likelihood of these benefits occurring
  • Cash outlays for the BI system and their timings, including
    • Hardware and software costs
    • Implementation costs
    • Training costs
    • Administration, enhancement and upgrade costs
  • The likelihood of the cash outlays occurring

The formula also tells us that benefits and costs that occur in the near future are more valuable than those in the distant future because they are discounted less AND because the probability of them occurring is higher.

The classic definition of ROI looks at cash flows. Non-cash accounting items do not count.
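As a sketch of how the pieces of the formula combine, the snippet below discounts probability-weighted cash flows to compute an expected-value ROI. All figures, rates, and probabilities are invented for illustration only:

```python
# Sketch: expected-value ROI for a BI project.
# All cash flows, probabilities, and the discount rate are hypothetical.

def npv(cash_flows, rate):
    """Net present value; cash_flows[t] is received at the end of year t+1."""
    return sum(cf / (1 + rate) ** (t + 1) for t, cf in enumerate(cash_flows))

def expected_npv(scenarios, rate):
    """Probability-weighted NPV across (probability, cash_flows) scenarios."""
    return sum(p * npv(flows, rate) for p, flows in scenarios)

rate = 0.10  # annual discount rate

# Benefits: 70% chance of strong savings, 30% chance of weaker savings
benefit_scenarios = [
    (0.7, [200_000, 300_000, 300_000]),
    (0.3, [100_000, 150_000, 150_000]),
]

# Outlays: licenses and hardware up front, then implementation and support
outlay_scenarios = [
    (1.0, [250_000, 100_000, 50_000]),
]

roi = expected_npv(benefit_scenarios, rate) / expected_npv(outlay_scenarios, rate)
print(f"Expected-value ROI: {roi:.2f}")
```

Note how the discounting makes near-term benefits worth more than distant ones, which is exactly why a short delivery timeline improves the ratio.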

Someone who wants to be seen as delivering valuable BI projects will therefore look for a project that delivers a set of benefits that drive value over a short period of time. If the scope is too small, the benefits will be minimal. If the scope is too large, the risk and time to deliver will be unacceptable.

I understand this is not a new finding.  The formula just lays it out in black and white.

In future posts, I will discuss how to estimate these numbers, even when people feel they cannot be quantified.

Fundamentals of Data and So-Called Big Data

The hot topic that seems to be around every corner these days is “Big Data”.   Most publications work under the premise that everyone already understands Big Data and the value it can bring to the organization.  My experience shows that assumption is not always correct.  Many folks are unclear how to recognize “Big Data” within their particular organizations.  More importantly, folks may not understand the possible business value that can be extracted from it. Without both aspects of understanding, adoption and success of Big Data initiatives will face difficulties.  This blog addresses those two aspects by identifying:

a) Typical “Big Data” examples within organizations
b) Real-world value propositions from harvesting “Big Data”

Simple Definition and Context

Let’s start with establishing a definition and context for “Big Data” since the name alone can be misleading.  Big Data is a reference to the very large scale of data available or being created that cannot be easily handled using traditional processing, methodologies or technologies.  Big Data can relate to structured or unstructured data.  The challenges with Big Data can include how to capture it, the methods for storing it, how to understand it, how to analyze it, how to search it, or how to visualize it and correlate it to something familiar.  Data can be tagged as “Big Data” by evaluating it against the above classifications and by considering at least two factors:

1) The scale and sheer volume of the data in comparison to what is reasonably expected
2) The speed at which it is created or is expected to grow

Since there is a lot of subjectivity in defining “Big Data”, let’s list some real-world examples to solidify the understanding.

Big Data within Organizations

Using the definition above as guidance, a growing number of possible examples may belong in the “Big Data” category. For the sake of being concise, the following represent the types of “Big Data” organizations commonly encounter and the value that can be derived from them.

Call & Contact Details
Definition: Organizations that offer products and services directly to consumers will have significant call center operations, with several platforms supporting various methods of customer interaction and engagement. With each interaction comes information that collectively can be significant in quantity. While organizations have used some of this data for operational support, many have struggled to maximize its analytical value.

Value Proposition:
  1. Re-allocation of agents across centers and queues based on predictive demand planning using diverse criteria.  Improve efficiency on agent utilization and load balancing of resources; enhance customer experience by reducing wait time and abandon rates.
  2. Analytics against voice data for accurate tagging of reasons for call, reasons for transfers, etc.  Optimizes understanding of customer satisfaction & response levels and improves call routing.
  3. Mining of call characteristics (length, number, source, type, reason) to create stronger correlations for better insight
  4. Correlations of contact metrics to recent marketing efforts to measure effectiveness and acceptance.
  5. Creating correlations between account/customer details to offers made/accepted to increase upselling & cross-selling opportunities


Transaction Details
Definition: Industries including financial firms, trading firms, and large retailers amass a continuous stream of transaction detail. This can include authorization details, trade transactions, and purchases. Most organizations deal with this vast amount of data by limiting the amount of detail used during analysis or by applying standard practices to summarize and aggregate it for business intelligence.

Value Proposition:
  1. Fraud detection techniques using transactions well beyond individual events; longer time spans, more criteria, un-related events.  Leads to improved risk exposure management.
  2. Trending of activity across various time periods to recognize patterns of behavior.  Effective for identifying revenue opportunity or measuring strategy effectiveness.
  3. Developing correlations between transactions and customer/account details (demographics, purchase history) to improve marketing strategies.
  4. Correlations of transactions to external factors (marketing offers, news events, regional criteria) to understand behavior and measure marketing effectiveness.
  5. Analyzing transactions using broad criteria across extensive time periods to improve forecasting accuracy.

Web Clicks & Logs
Definition: Customers and prospects visiting the organization's web sites have their own distinct behaviors and patterns. What they click, what they click next, what piques their current interest, and what piques the interest of other visitors at the same moment are all behavior patterns worth understanding. Furthermore, each visit brings other interesting criteria such as the originating source of the visit, geographical tags, SEO tags, etc. Only a handful of companies recognize the strategic value in this treasure chest of information.

Value Proposition:
  1. Correlating product sales to one another to understand buying patterns, i.e., what other products are bought along with a given product. Improves offers and up-selling opportunities.
  2. Analyzing regional considerations for customers during a given experience to optimize target marketing and ensure the relevance of offers.
  3. Analyzing click and navigational patterns to improve the customer experience, i.e., offering online chat or displaying tips to improve "stickiness" and the overall experience.
  4. Correlating traffic, interest, and behavior to external factors, including media efforts and regional criteria, to measure strategy effectiveness.
  5. Analyzing details of customers with similar interests and behavioral patterns to maximize the effectiveness of offers made to individual customers. Improves and extends sales.

Other interesting Big Data examples you may encounter include:

Application Logs
Informational, warning, error, monitoring and event messages are continuously produced by software systems, hardware devices and application platforms.  Proactively recognizing potential issues from the patterns can help improve the quality of service and reliability that the IT groups need to ensure.  Furthermore, it can be an element of a good risk mitigation strategy if the services and platforms are a critical part of your business.  This content is often overlooked for the value it possesses.

Social Media
Tweets, Facebook and Google+ posts, and blogs and their responses have quickly become accepted means of social interaction between people. The sheer number of people using these channels creates a "Big Data" problem. The volume and growth of the data are exceptionally large. The content is text-based and must be evaluated in context to arrive at the right interpretation. Furthermore, the relevance of the content (eliminating noise) is difficult to decipher. The jury is still in deliberation over the ROI for harvesting this information. Nonetheless, it is difficult to ignore.

GPS Trace Records
Equipment, products and personnel are increasingly fitted with GPS technologies that can track every move from point A to point B.  The ability to proactively analyze this movement can lead to supply chain efficiencies, human capital effectiveness, bottom-line cost reduction, fraud mitigation and allow for overall control and continuous visibility.

Technology & Instrument Output
Needless to say, there are countless unique examples within industries. Utility and communication companies produce incredible amounts of usage detail that can be used to manage demand and optimize performance. Genomics and scientific organizations are deploying technologies that produce ever more granular bits of potentially important information.

Documents & Other Unstructured Data
Virtually every organization produces an immense amount of unstructured data, or, in other words, information that does not easily conform to a defined data model. This can include internal documentation, publications, correspondence, health records, audio recordings, etc. Not only is this a content management problem, but it also requires unique analytical techniques to harvest value from the content. Increase the scale and it becomes a Big Data challenge. Businesses can use this data to ensure compliance, manage risk, and achieve more complete records.

Many more organizations will face "Big Data" challenges over the coming years. Some of this can be attributed to their individual growth as a company, but much of it is the result of technology advances and outside factors. It is safe to conclude that all organizations with Big Data will need to do something valuable with it, if only to remain competitive.

In future blogs I will talk in more specifics about individual approaches, technologies and business application around Big Data.  In the meantime, please feel free to comment below or reach out to me to talk about “Big Data” challenges you are facing.

Saturday, March 8, 2014

So why do successful IT implementations fail to achieve desired results?

The answer, in almost all cases, is lack of user adoption. We have heard it before: humans are creatures of habit. Once we are comfortable with a method, no matter how cumbersome it is, we like to stick with it. We build our comfort zone around it and resist any effort to change it. Your IT team can implement the nirvana system that solves all problems, but it will never achieve the desired results if end users don't buy into it. The IT team needs to act as both sales and marketing to win that buy-in. In reality, most IT projects focus heavily on the technical and completely miss the human factor. And the sales and marketing process doesn't start at your launch party: your end users need to be sold on the change before the project even starts. So put on your marketing and sales hat and get started.

Keep these 5 things in mind before you launch your next IT project:

1. Business Requirements: Business requirements come from business users, not IT. This sounds obvious but is not always practiced. Too often the project team, in all sincerity, tries to address business problems without consulting the business users, or involves them too late in the process, well after the technology and tool selection occurs. For a successful implementation, it is important that the users whose pain you are working to resolve are involved in defining the scope of the project.

2. The right tool for the job: All too often, the tool or technology is selected before the problem is completely defined. This is a recipe for disaster. You should not base your business needs around a tool's capabilities; the tool should be chosen in response to your business needs. Sometimes this will lead to a combination of various tools and technologies to get to the right solution.

3. Focus Groups: Once you have a good understanding of the business requirements and have decided on the technology, don't try to solve everything at once. A phased approach is best for getting buy-in. Start with a small group of end users and walk them through the complete life cycle of the project. Get this group comfortable with the solution and you will have end-user champions who serve as an extended team to help promote it within the organization.

4. Training and Rollout: Before you open the system for end users, make sure that you have proper training materials and a training schedule in place. Proper training is a must for successful end-user adoption. As I said earlier, people resist abrupt change. Make sure that your new implementation eases into their daily routine. End users will then realize the true value of the system and the problem it solves. Change will gradually occur.

5. BICC: Last but not least is the implementation of proper processes and procedures. You should think about developing a BICC (Business Intelligence Competency Center), comprised of members from both IT and the business. The days of IT producing data and end users consuming it are gone. With fourth-generation reporting tools like Oracle's OBIEE, your data consumers are self-enabled to access the information themselves when needed. This behooves us to make sure that the information being accessed is accurate and readily available. The BICC ensures that the right personnel are involved in the development, distribution, and consumption of information in your organization.

Of course, there are many more factors involved in making sure the implementation of an IT project is successful, but following the above mentioned best practices will ensure that your IT project will result in a successfully utilized business solution.  Stay tuned for more on this topic!