The Object One BI&DW model is an end-to-end approach to your business intelligence solution: it begins with planning a Business Intelligence strategy and continues through execution, including data preparation, creation of analysis and reporting tools, and dissemination of the business intelligence tools and results. Planning the strategy involves:
a) Understanding the information needs of the enterprise by business role
b) Understanding the possible sources of the data elements that could comprise the needed information
c) Documenting the priorities of these needs, including cost-benefit analysis
d) Defining a BI roadmap/strategy for the enterprise that will most effectively and efficiently enable informed decision making through BI.
Excellence in business intelligence solutions requires clean data that is accessible to systems and users throughout your organization. At Object One we use a variety of techniques that ensure the data management services are adequate to support the best BI solution we can build for your organization.
The goal of data management is to make data accessible, usable, understandable, and timely; the goal of a data quality initiative is to make that data correct. Data quality is the cornerstone of all business information, and Object One believes it is particularly essential in the BI environment. Data that is correct leads to good business decisions; data that is incorrect leads to bad ones.
A decision maker will certainly know when data is late, incomprehensible, irrelevant, or inaccessible; but they may not know when it is incorrect. That is why incorrect data is not only useless, it is dangerous.
Object One pays close attention to data quality throughout the planning, implementation, and dissemination of your BI solution. Your company's source data is critically examined for possible errors. Our analysis shows where data is missing, out of range, invalid, or lacking referential integrity. Object One reports those errors, fixes them at the source, and rebuilds the data warehouse using the corrected data. We also propose how to fix the data during the extraction process so that the error does not recur. Software tools are available that help ensure data quality: they examine data to see that it meets certain criteria, and some correct the data when it fails these tests. These tools avoid the need to manually assess every scrap of information or to manually build custom systems to do the job.
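The kinds of checks described above can be sketched in a few lines of code. This is a minimal illustration only: the field names, the valid range, and the reference set below are assumptions for the example, not taken from any particular system.

```python
# Minimal sketch of row-level data-quality checks: missing values,
# out-of-range values, and referential integrity.
# All field names, ranges, and reference data are illustrative.

def check_row(row, valid_region_ids):
    """Return a list of quality errors found in one source row."""
    errors = []
    if row.get("customer_id") is None:
        errors.append("missing customer_id")
    amount = row.get("order_amount")
    if amount is not None and not (0 <= amount <= 1_000_000):
        errors.append("order_amount out of range")
    if row.get("region_id") not in valid_region_ids:
        errors.append("region_id fails referential integrity")
    return errors

rows = [
    {"customer_id": 1, "order_amount": 250.0, "region_id": "EU"},
    {"customer_id": None, "order_amount": -5.0, "region_id": "XX"},
]
report = {i: check_row(r, {"EU", "NA", "APAC"}) for i, r in enumerate(rows)}
# report[0] is empty; report[1] flags all three error types
```

In practice these rules would be driven by the metadata and master-data definitions discussed elsewhere in this document, rather than hard-coded.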
However, building an ETL process with impeccable data quality requires careful testing and analysis to make sure these processes are working, and Object One checks every step to make sure that your data is of the highest quality. Data is not only tested during the ETL phase, it is tested for accuracy where it is used in the BI solution we build, including the reporting and analysis applications. The bottom line is that Object One makes an active commitment to ensuring the accuracy and quality of the data that goes into your BI solution, and that its quality is maintained throughout its use in the system.
Information by itself is never enough. In order to evaluate the numbers in your data warehouse, you need to understand them, and that is where metadata comes into play. Metadata tells you how the information is defined, whether the data are calculated, where they came from, how old they are, whether the numbers have been cleansed for quality, filtered, or summarized, who the information steward is, and how the information was approved by that steward.
In short, metadata is data that documents data, and tells you how it was processed through the BI environment, from its source in a business process system, to its appearance in a BI report or use in an analytical model. That, in turn, will tell you how you can make appropriate use of the information, as well as when someone is using it inappropriately.
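As a concrete illustration, a metadata repository might hold a record like the following for one warehouse measure. Every field name and value here is an illustrative assumption, not a prescribed schema.

```python
# Sketch of the kind of record a metadata repository might hold for one
# warehouse measure. All field names and values are illustrative.

net_revenue_metadata = {
    "name": "net_revenue",
    "definition": "gross revenue minus returns and discounts",
    "derivation": "gross_revenue - returns - discounts",  # calculated, not sourced directly
    "source_system": "order processing system",
    "refreshed": "nightly ETL load",
    "quality": "cleansed and range-checked during extraction",
    "granularity": "summarized to one row per day per region",
    "steward": "finance data steward",
}

def describe(meta):
    """Answer the practical question a report user asks: what is this number?"""
    return f"{meta['name']}: {meta['definition']} (source: {meta['source_system']})"
```

A BI user who can call up this record alongside a report knows how the number was made, where it came from, and who vouches for it.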
Gathering the metadata and storing it is a critical part of the main process of gathering, transforming and storing information that Object One performs at the start of your BI solution project. Once the metadata is in place, you may not access it every day, but it will become part of how you think about the information you are using. You will of course need to review and revise the metadata when your business, and therefore your BI solution, changes in a way that affects how data are created for your system.
Metadata management has to be an integral part of your Business Intelligence in order for it to have substantial business value. Object One's consulting team works with your team to create a metadata strategy that is appropriate for your company, and that can be successfully integrated into your BI solution to provide you with a full set of current, accurate and available metadata.
Business Intelligence analytics and reporting require data to be broken down into various categories and types. The categories used for these breakdowns are called Master Data, and their definition and management is known as Master Data Management, or MDM.
While some breakdown entities, such as date and geography, have well understood and agreed-upon definitions, others usually do not, and a Master Data Management system has to enable the creation and use of several different categorizations. Without a set of clear definitions and an MDM system to manage them, the value of your business information and your company's BI solution will be seriously reduced.
Consider the customer entity: it is often defined and used differently by different groups, and if these definitions are not understood and properly used, the information will have little value. For example, a customer may be someone who buys a product, but a customer may also be someone who replied to an advertisement, had a product repaired, called for technical support, or returned a gift item.
That makes it unclear how to respond when your CEO asks how many customers your company has. If you can say that there are 38,472 customers who have bought a product, 16,755 who have called for tech support, and 75,398 who have responded to a recent advertising campaign, you will have done your job correctly. If you can only say that there are 38,472 customers without qualifying what kind of customer they are and what they have done, you have not.
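The discipline described above amounts to never reporting a blended total, but always counting under an explicit definition. A minimal sketch, with illustrative interaction records and category names:

```python
# Sketch of counting "customers" under explicit master-data definitions,
# so every count is qualified by what kind of customer it measures.
# The interaction records and category names are illustrative.

interactions = [
    {"person": "a", "type": "purchase"},
    {"person": "b", "type": "tech_support_call"},
    {"person": "a", "type": "ad_response"},
    {"person": "c", "type": "ad_response"},
]

def customers_by_definition(records):
    """Count distinct people per interaction type, never one blended total."""
    groups = {}
    for rec in records:
        groups.setdefault(rec["type"], set()).add(rec["person"])
    return {kind: len(people) for kind, people in groups.items()}

totals = customers_by_definition(interactions)
# e.g. "purchase" customers and "ad_response" customers are reported separately
```

The same person can legitimately appear under several definitions, which is exactly why an unqualified "number of customers" is meaningless.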
You can give unambiguous answers, and use unambiguous information in your reporting and analysis, only if you have an adequate MDM system in place as part of your BI solution. Object One not only understands that, our consultants will work closely with your team to make sure that your system is adequate, and that your definitions are not only clear, but can be supported in your ETL environment. Each of your business process systems will be thoroughly analyzed to ensure that the definitions your team agrees upon can be supported by the extraction process, and that the information arriving in your data warehouse conforms to the definitions.
Our goal is to make sure that when your dashboard or model says there are 38,472 customers, you know and can say what that means, and that you and your colleagues will use the information correctly in your work.
No matter how much information is extracted from your company's business process networks and stored in your data warehouse, it will have little value in your decision making processes without adequate reporting structures and analytical tools. Distilling the intelligence requires that your BI solution produce informative reports on a timely basis and provide analytical tools that support your decision making processes.
Today's enterprise IT environment vigorously supports the technique of asking questions and deriving answers through simple data filters, data groupings, and other basic data "slicing and dicing" techniques. The analytics are housed in the mind of the user, as are the basic queries that support them, in something that resembles a Socratic method of Business Intelligence.
But the large and rapidly increasing amount of data produced and captured in enterprise business process systems limits the value, and even the feasibility, of making decisions based on this crude methodology. The capacity of the human mind limits how many variables can be juggled at once, and even the data capacity of computer screens and computer-generated reports limits the usefulness of these methods.
The academic and scientific communities, along with technically advanced business analysts, have evolved a new set of techniques during the past decade that thrive on these large volumes of data instead of choking on them. They take advantage of the processing power of today's powerful computers to identify correlations and trends based on sophisticated statistical algorithms rather than crude data sifting methods.
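The simplest building block of this statistical approach is measuring the correlation between two business variables instead of eyeballing filtered reports. A minimal sketch, computing a Pearson correlation coefficient in pure Python over illustrative figures:

```python
# Minimal sketch of the statistical step described above: quantifying the
# linear relationship between two business variables. The monthly figures
# below are illustrative, not real data.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

ad_spend = [10, 20, 30, 40, 50]   # illustrative monthly figures
revenue  = [12, 24, 31, 45, 52]
r = pearson(ad_spend, revenue)    # close to 1.0: strong positive relationship
```

Production analytics would of course use vetted statistical libraries and test for significance, but the principle, letting the algorithm find the relationship in volume data, is the one described above.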
Business analysis tools can mine enterprise data to answer questions and create predictions that can be used to build business strategies. The use of these analytical methods is not new in business. But their presence at the center of business decision-making in a wide range of large enterprises is new, and it is providing the companies that use these methods with more successful and competitive business strategies. What is also new is that while the early adopters were very large corporations with very large IT budgets, today's inexpensive computing power makes advanced analytics available to companies of all sizes.
Besides computer power, a company that wants to use advanced analytics needs to have skilled business analysts available to work with business experts in order to build the models, run them and to report and disseminate the results. This teamwork is required both to insure that the models are statistically valid and accurate and to make sure that questionable or meaningless results are rejected. This is particularly important when predictive analytics are used in a planning environment because a questionable result can lead to a very poor business decision.
Object One's consulting teams are particularly adept at delivering all the analytical skills you will need to bring your data warehouse together with sophisticated business analysis tools. They will work with your business experts to make sure that your analysts become as expert at using these tools and interpreting the results as our consultants are, assuring you of successful analytical and predictive results.
Distributing intelligence means delivering the resulting information through tools such as portals, dashboards, scorecards, report repositories and PDAs that enable the final consumers of information to receive it in a form palatable to them. Once we have a strong layer that has captured data in a consolidated fashion (preparing for intelligence) and have extracted the required information (distilling intelligence), the function of the dissemination layer is to ensure the integrity of the data and deliver it in any form the end user desires. This can be achieved in many ways. The dissemination layer (say, a dashboard) may be closely tied to the underlying extraction layer. Alternatively, in a more loosely coupled scenario, the extraction layer passes the data on in a generic form such as XML, and the dissemination layer can then choose to "visualize" the data in any form desired. The idea is to extract once and use the data in multiple ways, in different forms, as needed. The main channels of dissemination are:
Dashboards and Scorecards: These are essentially the same, except that a scorecard has pre-determined targets against which we can compare the performance of a given measure (a Key Performance Indicator, or KPI). So, if our car dashboard shows the engine running at 4,000 rpm, it conveys one piece of information; but if we had a pre-determined level indicator there, it would be able to tell us that we are above the desired rpm and may need to take action to correct the situation. In short, a scorecard tells us how we are performing against a pre-determined target or targets (multiple levels of targets are possible, such as green, yellow and red zones).
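The scorecard idea above reduces to comparing a measure against pre-determined thresholds and mapping it to a zone. A minimal sketch; the rpm thresholds are illustrative assumptions, not prescribed values:

```python
# Sketch of the scorecard concept: a KPI value compared against
# pre-determined target levels and mapped to a green/yellow/red zone.
# The engine-rpm thresholds below are illustrative.

def kpi_zone(value, yellow_at, red_at):
    """Classify a measure against two pre-determined target levels."""
    if value >= red_at:
        return "red"
    if value >= yellow_at:
        return "yellow"
    return "green"

# The car-dashboard example: 4,000 rpm against illustrative thresholds.
zone = kpi_zone(4000, yellow_at=3500, red_at=4500)  # "yellow": above desired rpm
```

A dashboard would simply display the 4,000 rpm figure; the scorecard adds the zone, turning a raw number into a call to action.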
Portals: With well-established portal standards, there should be no need for users to bookmark different URLs and visit different sites to access different pieces of information. Portal integration is possible at multiple levels. At minimum, through single sign-on, references to the desired reports can be delivered in one centralized place. But tighter integration is also possible: the portal may host a portlet that comes from a reporting infrastructure alongside other information, or it may be capable of consuming XML-based data and then visualizing it according to the portal standard. Regardless of the level of integration, the security and integrity of the data must be maintained.
Mobility: Distributing information in a usable form through PDAs and mobile devices is not just a nice-to-have but a requirement, and thankfully an achievable reality, with out-of-the-box support for PDAs and mobile devices in Business Intelligence tools. We believe this area will mature further, but today's solutions already offer a secure way to view and interact with your data across a range of PDAs and mobile devices.
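The loosely coupled "extract once, visualize anywhere" scenario described earlier can be sketched concretely: the extraction layer emits generic XML a single time, and each dissemination channel renders that same payload in its own way. The element names and measures below are illustrative assumptions.

```python
# Sketch of a loosely coupled dissemination architecture: the extraction
# layer packages measures as generic XML once, and any consumer (dashboard,
# portal, mobile view) renders the same payload. Names are illustrative.
import xml.etree.ElementTree as ET

def extract_to_xml(measures):
    """Extraction layer: package measures as generic XML, once."""
    root = ET.Element("measures")
    for name, value in measures.items():
        m = ET.SubElement(root, "measure", name=name)
        m.text = str(value)
    return ET.tostring(root, encoding="unicode")

def render_as_text(xml_payload):
    """One possible dissemination layer: a plain-text dashboard view."""
    root = ET.fromstring(xml_payload)
    return "\n".join(f"{m.get('name')}: {m.text}" for m in root)

payload = extract_to_xml({"customers": 38472, "open_orders": 912})
dashboard_view = render_as_text(payload)
# A portal or mobile renderer would consume the same payload differently.
```

The design choice is that consumers depend only on the generic payload format, so adding a new dissemination channel never requires touching the extraction layer.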
Object One has extensive experience providing innovative, high-performance, scalable solutions in this area.
In-depth Business Intelligence experience that spans domain knowledge, methodology, technologies and tools
Cost-effective solutions delivered by leveraging a global delivery model
Comprehensive BI service offerings
Solutions built on industry "best practices".