The IBM Business Intelligence Offering

Presentation transcript:

1 The IBM Business Intelligence Offering
Patrick COOLS, Business Intelligence Specialist

2 The Accelerating Pace of Decision-Making
The Business Intelligence market. The current challenge for companies is to become more "agile": expanding markets while maintaining competitive advantages, adapting to changes in customer behavior, offering the right product or service at the right moment, adjusting pricing policy, and building customer and partner loyalty. In today's marketplace, successful businesses are those with the ability to anticipate and respond to marketplace and customer behavioral changes, well before the competition is even aware that they occurred. "Connecting supply to demand."

For a company, being "agile" means being able to adapt to an increasingly competitive environment subject to unpredictable change, and to retain an increasingly demanding customer base. Together this forms a process leading to an organizational ideal: the agile enterprise, characterized by horizontal coordination, information sharing, and a high degree of short-term flexibility. The accelerating pace of decision-making is leading executives to question the return on their investments in decision-support information systems. What competitive advantages do they provide? Do they meet strategic needs? This questioning drives the search for new, more effective and more responsive approaches to information management, such as BPM.

Investments in decision-support information systems have been steady for the past ten years, whether to share and standardize information across the company (through information centers, data warehouses, ERP systems, and so on) or under external constraints (such as the year 2000 transition, the adoption of the euro, or IFRS standards). The result: managers often face a collection of juxtaposed or stacked tools (each addressing a specific need: planning/budgeting, operational reporting, financial reporting, consolidation, and so on), at best linked to one another by interfaces to avoid re-keying or manual transfers! A broader analysis also reveals the lack of a clear view of how this kind of information is used within the company and of its real contribution to creating added value. Hence a questioning of the soundness of these investments and a redefinition of the role of these decision-support systems. This management reflection cannot be framed solely in terms of information systems. Like any project, it must take into account how the data is used and the roles of the various people involved in the process.

Systems that integrate simulation, estimation, probability analysis, and tracking of forecast information help organizations adopt effective forecasting methods. These new tools, which mostly appeared on the market during 2002, unify in a single environment the traditional presentation and reporting of actuals with budgeting, simulation, and even statutory consolidation and data integration capabilities, within a strategic view of the enterprise (drawing, for example, on approaches such as the Balanced Scorecard).
Known as "Business Performance Management" (BPM), "Corporate Performance Management", or "unified, integrated management system" solutions, these powerful environments are to decision support what ERP was, a few years ago, to operational environments.

3 The Current Challenges of BI
The current challenges of BI: real-time information and integration into business processes; broader deployment of BI across the enterprise. Portals ease use and deployment; in-line analysis integrates analysis and business process management (DB2 Alphablox). The challenge of a global, centralized Enterprise Data Warehouse (EDW): enterprise warehouses emerge, scalability reduces application complexity, mixed workload support (DB2 Universal Database, data marts, warehouse, ODS / staging). The BI field is changing and evolving!

Like everything else, business intelligence is evolving in three important ways. First, the capabilities of the core database have become more critical than ever. The late 90s saw the emergence of integrated enterprise warehouses supporting multiple analytic applications and consolidating multiple physical data marts. As a result, this puts a premium on scalability and the ability to support a mixture of workloads. Historically, analysis has been based on a snapshot of past information. Today, we're finding companies want to make decisions in real time, so the ability to federate real-time information with the analysis from the warehouse becomes important. Another aspect of real time is feeding the warehouse itself with more current information. This means gathering new information from message queues rather than traditional batch updates, which in turn drives the need for ETL and data quality processes to happen in real time as well. The third important shift is the expanding use of analysis. If more people can analyze information or have access to the analysis, there is more benefit to be gained. First, analysis has to be made accessible to more audiences, not just nerds locked in an office beating the data to death with advanced mathematics; portal enablement is critical to this process. Second, analysis becomes an in-line component of the actual business process. For instance, give a financial advisor access to analytic capabilities as part of the loan approval process: how will approving this loan impact their monthly objective?

Real-time analysis enabling "On Demand" business: federation extends warehouses, message queues stream data in, ETL handles batch and real-time workloads. WebSphere Information Integrator.

4 DB2: Architecture of the Decision-Support Information System
Diagram components: operational systems; extract, transform, load; line-of-business data marts; Enterprise Data Warehouse; ODS; DB2 Warehouse Manager; Ascential DataStage; DB2 ESE / DPF; DB2 Cube Views; Query Patroller; DB2 Workgroup; DB2 OLAP Server; ETL metadata; DB2; end-user analytic tools; ISV query tools and applications; Intelligent Miner; Office Connect; Spatial Extender; Alphablox.

This is the basic BI flow diagram used in other education modules as well. At the top is the production data being sent to the warehouse. It funnels through DB2 Warehouse Manager or Ascential DataStage, where the data is consolidated and transformed before being loaded into the warehouse. In the middle, data goes into DB2 ESE, although in small SMB accounts Workgroup edition may be a good choice. Note that the operational data store is inside the data warehouse, not in a separate database. DB2 then feeds the dependent data marts with data. One of these marts will be DB2 OLAP Server, a multidimensional OLAP cube. In the user layer, we see the tools end users choose to aggregate and distill the data into its final business form. The predominant process here is query and reporting, which is performed by business partner tools. The Data Management Division has purposely chosen not to invest in building end-user tools, instead relying on Cognos, Business Objects, MicroStrategy, Crystal Decisions, and others for this capability. Also in the user analytics layer are Intelligent Miner, Query Patroller, Office Connect, and the spatial extenders. While most of these tools are infrastructure components, they provide specific transforms to refine the data for its final use.
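To make the warehouse-to-mart flow concrete, here is a minimal SQL sketch of how a dependent line-of-business mart could be refreshed from the enterprise warehouse. All schema, table, and column names (EDW.SALES_FACT, MART.SALES_BY_REGION, and so on) are hypothetical and only illustrate the pattern; in practice this step would typically be driven by DB2 Warehouse Manager or Ascential DataStage jobs rather than hand-written SQL.

```sql
-- Hypothetical incremental refresh of a dependent data mart from the EDW.
-- EDW.SALES_FACT and MART.SALES_BY_REGION are illustrative names only.
INSERT INTO MART.SALES_BY_REGION
       (region_id, product_id, sales_year, sales_month, total_revenue, units_sold)
SELECT f.region_id,
       f.product_id,
       YEAR(f.sale_date)  AS sales_year,      -- aggregate detail rows to month level
       MONTH(f.sale_date) AS sales_month,
       SUM(f.revenue)     AS total_revenue,
       SUM(f.quantity)    AS units_sold
FROM   EDW.SALES_FACT f
WHERE  f.sale_date >= CURRENT DATE - 1 MONTH  -- only the most recent load window
GROUP BY f.region_id, f.product_id, YEAR(f.sale_date), MONTH(f.sale_date);
```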

5 Model & ETL tool metadata
DB2 Cube Views: OLAP metadata. Diagram components: metadata bridges; DB2 Data Warehouse; RDBMS metadata; model and ETL tool metadata; DML; DDL; OLAP metadata; data; XML; ER design tools; BI tool metadata (Business Objects, Hyperion, QMF for Windows).

There are two issues with OLAP metadata: it is decentralized, and it has a more complex structure than standard SQL metadata. Decentralized means every tool reinvents the hierarchical structures. This means proprietary lock-in but, more importantly, high maintenance costs when similar OLAP metadata structures must be duplicated in several front-end tools: changes in one hierarchy then have to be repaired in many repositories. The complex structure simply means going beyond simple rows, columns, and keys to multilevel unbalanced hierarchies, some of which are recursive. Among the example partners, only Brio does not carry complex OLAP metadata in its repository; Brio handles OLAP differently from the other vendors, which capture the OLAP metadata in their own repositories.
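One concrete way this centralized OLAP metadata pays off inside DB2 is through relational summary tables: Cube Views can use the dimension and hierarchy definitions to derive materialized query tables (MQTs) that precompute the rollups OLAP-style queries need. The following is a minimal sketch over a hypothetical star schema (DWE.SALES, DWE.PRODUCT, DWE.TIME); it shows the kind of summary object involved, not Cube Views' actual output.

```sql
-- Hypothetical MQT precomputing a rollup along the Product and Time hierarchies.
-- OLAP metadata of the Cube Views kind describes these hierarchies centrally, so
-- summary tables like this can be derived and matched by the DB2 optimizer.
CREATE TABLE DWE.SALES_BY_LINE_MONTH AS (
    SELECT p.product_line,
           t.cal_year,
           t.cal_month,
           SUM(s.revenue) AS total_revenue,
           COUNT(*)       AS fact_rows
    FROM   DWE.SALES s, DWE.PRODUCT p, DWE.TIME t
    WHERE  s.product_id = p.product_id
      AND  s.time_id    = t.time_id
    GROUP BY p.product_line, t.cal_year, t.cal_month
) DATA INITIALLY DEFERRED REFRESH DEFERRED;

-- Populate the summary and take it out of check-pending state:
REFRESH TABLE DWE.SALES_BY_LINE_MONTH;
```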

6 DB2 OLAP Server: Multidimensional Analysis
Diagram components: source data; multidimensional cubes; OLAP Server; OIS; partitioning option; high concurrency option; Analyzer, Brio, or Windows client.

DB2 OLAP Server consists of two primary subsystems: the Essbase OLAP cube services and the DB2 star schema services. The Essbase cube is an OEM product from Hyperion Corporation that is tightly integrated with DB2. DB2 is used to supply the underlying data for cube creation as well as drill-down capability for users of the cube. These services are offered via DB2 OLAP Integration Server, or OIS. The Essbase cube subsystem is the most advanced in the industry today. It provides summarized data in multiple perspectives called dimensions. The dimensions might be "Sales region", "Product category", "Gross margins", "Customer segment", or any other major metric used to operate the business. All data in the database is distilled into summarized reporting "cells" for every dimension. This allows the user to quickly extract summary reports based on the corporate hierarchy (i.e. organization chart or general ledger account structures). Since the data is summarized and all formula calculations are applied when the data is loaded into the cube, the response time for pulling reports from the cube is at transaction speed, or "speed of thought". This is why it is called online analytical processing (OLAP). For the Essbase cube to function, it requires copious amounts of data to compute the summaries (called the load and calc cycle). While Essbase can get this data from flat files, Oracle, or any number of sources, OIS provides a purpose-built integration from star schema data into the Essbase cube. With simple GUI screens, the administrator can import the database design from DB2, map those table layouts into the summaries of the cube, and activate the load/calc cycle to acquire data and update the cube. This design greatly simplifies administration of the cube, keeps it synchronized with the source data, and provides a seamless link between relational data and "MOLAP" summaries. More important than the easy administration, the end user who queries the cube and gets a summary can click through on the summary report to "drill down" into the DB2 data to understand what makes up that summary result. This is a crucial feature, because when a summary result shows a business problem, the end user frequently needs the details of the situation to be able to make decisions. OIS makes this so easy for the user that they never know they have switched from a MOLAP cube to a DB2 star schema database. Hyperion Analyzer is an end-user tool for Web or Windows-based interactive analysis. It is ideal for a broad range of sales, marketing, and other enterprise analysis applications, such as sales analysis, product profitability, and key performance measurement, and it provides drill-through to relational data. Keep in mind that Essbase is so popular that most BI front-end user tools also connect to the Essbase cube and can drill down into DB2 detailed data as well. The partitioning option allows OLAP models to be divided into separate logical and physical partitions, which can be accessed seamlessly by users; this distributes the summarized data across servers for increased performance. A new priced feature, the High Concurrency Option, is required for the cube clustering needed for load balancing and failover capabilities.
The tools bundle includes an API that provides data navigation features to any customized client-side software; a SQL Interface and Drill Through, which is an ODBC-based link to SQL-compliant data stores; and a Currency Conversion module that translates monetary values according to any exchange rates. Additionally, there is an Excel macro that can be used to link spreadsheets directly to the Essbase cube. (Diagram components: SQL interface; Essbase API; MS Excel macros; currency conversions.)
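As a rough illustration of the drill-through step described above, this is the kind of relational query OIS could issue against the star schema when a user clicks through a suspicious summary cell. The schema, table, and member names (DWE.SALES, region 'West', product line 'A') are hypothetical.

```sql
-- Hypothetical drill-through: fetch the detail rows behind one cube cell
-- ("West region, product line A, March 2004"). Names are illustrative only.
SELECT s.order_id, s.sale_date, c.customer_name, p.product_name, s.revenue
FROM   DWE.SALES    s
JOIN   DWE.CUSTOMER c ON c.customer_id = s.customer_id
JOIN   DWE.PRODUCT  p ON p.product_id  = s.product_id
JOIN   DWE.REGION   r ON r.region_id   = s.region_id
WHERE  r.region_name     = 'West'
  AND  p.product_line    = 'A'
  AND  YEAR(s.sale_date)  = 2004
  AND  MONTH(s.sale_date) = 3
ORDER BY s.revenue DESC;
```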

7 DB2 Query Patroller: Query Management and Monitoring
Slide points: intercept queries from the client; queues for user requests; ensure short queries get through fast; control queries that burn high resources; flexible result set options; queued handling of returned reports; improved interoperability with end-user tools. Client benefits: end-user satisfaction with performance; postponed hardware purchases. (Diagram: answers, explains, answer sets, queued, queue, Data Warehouse, Query Patroller, DB2 Optimizer, SQL 23, SQL 21, SQL 15, SQL 14.)

Query Patroller is a product that sits between DB2 and the BI tool that submits queries. It intercepts the SQL and uses the DB2 explain function to estimate the amount of CPU, memory, and disk resources the query will consume. It then chooses to execute the query or to delay it to run later, when server resources are available. Query Patroller tries to ensure that queries are prioritized into DB2 based on how much they will consume and on the priority of the user who submitted them. Query Patroller also allows the user to choose to have the answer set sent back at a later time. While this is not the default, it may be useful when you know the query will run for a long time. Query Patroller will delay the SQL submission until resources are available and capture the answer set in a disk file until the user is ready to accept it. For example, imagine a query that runs an hour being submitted at 10 AM. Query Patroller can delay running it until just after noon, when people are at lunch and resources are available. By 2 PM the query is complete, but the user who submitted it is in meetings, so the answer set (rows of data) is held on disk until the user logs on again and accepts the results.
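The cost estimate that Query Patroller bases its decisions on comes from the DB2 explain facility. Below is a minimal sketch of inspecting that estimate by hand; it assumes the explain tables have already been created (DB2 ships an EXPLAIN.DDL script for this), and the warehouse table name is hypothetical.

```sql
-- Capture the optimizer's plan and estimated cost (in timerons) for a query,
-- assuming the explain tables already exist in the current schema.
EXPLAIN PLAN FOR
    SELECT region_id, SUM(revenue)
    FROM   DWE.SALES                   -- hypothetical warehouse fact table
    GROUP BY region_id;

-- Read back the estimated total cost, the same kind of figure Query Patroller
-- compares against its thresholds before running, queuing, or holding a query.
SELECT total_cost, statement_text
FROM   EXPLAIN_STATEMENT
WHERE  explain_level = 'P'             -- 'P' = optimized plan, 'O' = original statement
ORDER BY explain_time DESC
FETCH FIRST 1 ROW ONLY;
```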

8 models & scores used in applications models & scores used in BI Tools
DB2 Intelligent Miner. Diagram: models; visualize; scoring; the analyst defines the model and runs the analysis; DB2 Data Warehouse; extracts; XML; models and scores used in applications; models and scores used in BI tools. DB2 has each of these functions embedded in the database: IM Modeling, IM Scoring, and IM Visualization.

Modeling is the process of running algorithms against the data and finding new and useful relationships in it. The models (the new business relationships that result from the modeling process) are stored in PMML and can be transferred to any PMML-compliant system. The data warehouse or data mart does not need to be DB2 (for IM Modeling, the database must be DB2). The target database where scoring happens must be DB2 v8.1 with IM Scoring. There are two forms of the data mining offering: Intelligent Miner for Data is a data mining workbench, while Intelligent Miner Scoring, Modeling, and Visualization are DB2 extenders for developing applications. In both cases, advanced mathematics is used to analyze the data and provide insights into customer or business behaviors that are otherwise impossible to discover. The extenders use parallelism in DB2 to access all the data instead of a small sample. This provides higher accuracy but requires that data mining be invoked through a programmatic SQL interface; this is the best choice for embedding data mining in an application or for interacting with business partner tools. Intelligent Miner offers several popular algorithms: neural networks, radial basis functions, regression, and so on. Each algorithm has strengths and weaknesses in detecting patterns, and some are more appropriate for certain kinds of data than others. Sometimes we try two or more algorithms on the same data to determine which gives the most accurate result; many algorithms return confidence factors about the accuracy of the results, so running two algorithms for the same business problem allows us to choose the results with the highest confidence factor. Sometimes a mining algorithm produces a "training set" or a "model": the algorithm consumes the data and identifies a pattern, saving the pattern into an XML document according to the PMML data mining standard. The XML model can then be used at a later time by the scoring algorithms or by a business partner tool compatible with PMML models. The business functions we try to use mining for include clustering for segmenting consumers, scoring for applying a score to each person in a segment (for example, gold/silver/bronze customers), prediction of cross-selling probabilities, and outlier detection to locate abnormal situations, usually for fraud detection.
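To show the in-database scoring pattern in plain SQL without reproducing the actual IM Scoring UDF names (which are not quoted on this slide), here is a sketch that uses a simple stand-in SQL function. The real IM Scoring extenders work analogously at this level: a scalar function applies a PMML model to each row where the data lives, instead of extracting a sample.

```sql
-- Stand-in scoring function for illustration only; the real IM Scoring UDFs
-- apply PMML models (clustering, classification, ...) rather than a fixed rule.
CREATE FUNCTION DEMO.SCORE_CUSTOMER (age INTEGER, income DECIMAL(12,2))
RETURNS VARCHAR(10)
LANGUAGE SQL
CONTAINS SQL
RETURN CASE WHEN income > 75000 THEN 'GOLD'
            WHEN income > 40000 THEN 'SILVER'
            ELSE 'BRONZE'
       END;

-- Scoring happens inside DB2, row by row, over all the data (no sampling):
SELECT c.customer_id,
       DEMO.SCORE_CUSTOMER(c.age, c.income) AS segment
FROM   DWE.CUSTOMER c;                  -- hypothetical customer table
```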

9 DB2 Spatial Extender Longitude & Latitude coordinates
Slide points: load spatial "maps" into tables; create spatial indexes; add spatial data to customer records (just one more column of data); run queries with spatial functions (the extender is invoked by SQL: touch, intersect, nearest, and similar functions); ESRI visualization tools (ArcPlan, ArcInfo, ArcWeb). Uses: city planning, emergency routes, cell phone coverage, branch/store location selection, risk analysis, transportation logistics, marketing. (Illustration: ESRI.com map of Phoenix.)

Spatial data is location-based data that has properties of proximity, area/shape, distance, and so on. Spatial visualization is a technique that leverages the pattern-detection capability of the human eye as the "data mining" detection device: people can see things on a map when reports and numbers are difficult to use for the same purpose. Spatial Extender provides the ability to store and manipulate spatial data in DB2. To do this, DB2 Spatial Extender builds on DB2 user-defined functions to create spatial data types, spatial indexes, and the functions that operate on them. There are also special indexes in DB2 to support SQL functions on spatial data. For example, you can ask via SQL for all home addresses within 5 kilometers of a defined map point; DB2 indexes and the Spatial Extender make this a relatively simple request. Why spatial analysis? The answer starts with an understanding of spatial data. Any location-based data is considered spatial, for example: customer addresses, store or branch locations, sales zones, delivery routes, the scene of accidents, and so on. You can see on a map where your most valuable customers live in relation to where your stores are located. You can quickly see where there are risk exposures in traffic patterns when a major disaster like flooding or fire occurs. Spatial visualization answers the simple business questions associated with "where" something happens. You might want to identify customers with a home insurance policy living within 1000 yards of a river who DO NOT have flood insurance, or analyze how many accidents that caused more than $1000 in damage occurred within 2 kilometers of a highway exit, in order to set insurance prices or plan better city roads. Spatial analysis is hottest in government: city planning and security are two obvious areas where spatial visualization is needed, but retailers, energy suppliers, and telecommunications companies are also investigating it. You use DB2 Spatial Extender to create a geographic information system (GIS), that is, objects, data, and applications that allow you to generate and analyze spatial information about geographic features. Geographic features include the objects that make up the earth's surface and the objects that occupy it: the natural environment (rivers, forests, hills, deserts) and the cultural environment (cities, residences, office buildings, landmarks, and so on). You could also map systems such as subways, electrical wiring, or water and gas lines in buildings or cities. In some cases, foot traffic patterns through large stores help retailers design the store layout and the placement of promotional items.
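The "within 5 kilometers of a defined map point" example above translates into a short spatial query. This is a sketch only: the table, column, coordinates, and spatial reference system id are hypothetical, and it assumes the geometry column was registered in a projected spatial reference system whose linear unit is metres, so that ST_Distance returns metres.

```sql
-- Hypothetical proximity query with DB2 Spatial Extender: customers within
-- 5 km of a candidate branch location. CUSTOMER.LOCATION is assumed to be a
-- spatial column in a projected SRS (id 2) whose unit is metres.
SELECT c.customer_id, c.customer_name
FROM   DWE.CUSTOMER c
WHERE  db2gse.ST_Distance(
           c.location,
           db2gse.ST_Point(735000.0, 3615000.0, 2)    -- candidate site (x, y, srs_id)
       ) <= 5000;                                      -- 5 km, expressed in metres
```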

10 DB2 Alphablox: Analysis Embedded in Business Processes
Alphablox = for developing analytic applications, based on reusable components. These components can be embedded in web applications and in business processes. Analyze, Access, Business Logic, Present: a J2EE framework ties it all together. Alphablox is complementary to Business Objects, Hyperion, MicroStrategy, Cognos, and others.

11 DB2 Alphablox This is what the same blox (actually several of them) look like when you customize them and embed them into an existing web application. This is also an example of how you can have more than one "blox" on a page; in this example, most have their toolbars turned off. Finally, they have been embedded into an existing web page application. Note that if you were to double-click on any one of the charts or graphics, you would not "launch" the BI product, which would open up as a whole new window; instead, you would drill down on that particular component. So far we have addressed the LOB issues and covered customization and embedding; what is left is to discuss when you would use Alphablox versus the other IBM BI alternatives (Business Objects, Cognos, MicroStrategy, Siebel, or others).

12 WebSphere Information Integrator
WebSphere Information Integrator complements the data warehouse with real-time information. (Diagram: WebSphere Information Integrator; application; ODS; operational systems; Enterprise Data Warehouse; mart; DB2; DBMS.) Real-time access to specific operational information (CRM, SCM, …).
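As a rough sketch of the federation idea, the DDL below registers a remote operational source as nicknames in the warehouse database and then joins warehouse history with the live data in one SQL statement. Wrapper, server, schema, and table names are hypothetical; the statements follow the DB2/WebSphere Information Integrator federated database pattern (wrapper, server, user mapping, nickname).

```sql
-- Register a hypothetical Oracle-based CRM system as a federated source.
CREATE WRAPPER NET8;                                    -- Oracle client wrapper
CREATE SERVER CRM_SRV TYPE ORACLE VERSION '9' WRAPPER NET8
       OPTIONS (NODE 'crmnode');                        -- tnsnames entry (hypothetical)
CREATE USER MAPPING FOR dwuser SERVER CRM_SRV
       OPTIONS (REMOTE_AUTHID 'crm_ro', REMOTE_PASSWORD '********');
CREATE NICKNAME CRM.OPEN_ORDERS FOR CRM_SRV."SALES"."OPEN_ORDERS";

-- One query blends warehouse history with the live operational order book.
SELECT w.customer_id, w.revenue_last_12m, o.order_value, o.order_status
FROM   DWE.CUSTOMER_SUMMARY w                           -- local warehouse table
JOIN   CRM.OPEN_ORDERS      o ON o.customer_id = w.customer_id
WHERE  o.order_status = 'PENDING';
```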

13 DB2 DataWarehouse Edition
An integrated BI offering: DB2 Data Warehouse Edition. The flagship product in the DB2 information management software family from IBM, DB2 Data Warehouse Edition extends the powerful information management functions of DB2 Universal Database (the world's most scalable database) with advanced BI features for building and working with data warehouses and data marts. By integrating your information assets and applying real-time analytics to turn your information into intelligence, your DB2 data warehouses and data marts can enable your company to deliver information on demand: selective, transparent access to distributed data sources. And by developing your solution with the backing of IBM services and support, you can transform your company into an on demand e-business. With DB2 Data Warehouse Edition, BI functions, including administration, ETL, data mining, and OLAP, run on the same data structures, in the same database, on the same warehouse tier. Let's take a closer look.

