Sep 30, 2014

SAP HANA - Series - 1

HANA Overview

What is SAP HANA?


SAP HANA is an in-memory database and application platform which, for many operations, is 10-1000x faster than a regular database like Oracle on the same hardware. This allows a simplification of design and operations, as well as real-time business applications. Customers can finally begin to reduce IT complexity by removing the need for separate and multiple Application Servers, Operational Data Stores, Datamarts and complex BI tool implementations.

SAP HANA is a “reinvention” of the database, based on 30 years of technology improvements, research and development. It allows building applications that are not possible on a traditional RDBMS, and renewing existing applications like the SAP Business Suite.


Why did SAP build a database?

SAP co-founder and Chairman Hasso Plattner believed that if a database could be built with a zero response time, business applications would be written fundamentally differently – and IT landscapes could be simplified. Researchers at the Hasso Plattner Institute in Potsdam theorized that with modern computers and software design, this would be very nearly possible.

SAP makes business applications, and since it was clear that none of the incumbent software vendors like Oracle would write such a database and application platform, they needed to build their own. In addition, this would be the springboard for a complete renewal and simplification of SAP’s applications to take them through the next 20 years.


Is SAP HANA just a database?


No. When SAP went to build HANA, they realized that the next generation of business applications would require a much more integrated approach than in the past.

SAP HANA contains – out of the box – the building blocks for entire enterprise applications. HANA can take care of the requirements that would be served by many layers in other application platforms, including transactional databases, reporting databases, integration layers, search, predictive and web. All of this works out of the box, with a single installation.


Sep 22, 2014

Generic Delta Extraction using Function Module along with currency conversion in source system


Business Scenario:        
1. Additional fields from the Sales Partner table (VBPA) are needed for all order line items in the Sales Item table (VBAP), for sales order line item level reporting. The standard DataSources cannot be used because many InfoProviders such as DSOs and Cubes already depend on them, so maintaining them would take a lot of effort in terms of time and money. This approach also carries the risk of objects getting deactivated during transports.
2. In addition to this, currency conversion has to be done in the source system as per client norms.
Note: It is generally recommended to do currency conversions in the BW system.

R3 Side: In order to meet the above two requirements we decided to go for Generic Delta Extraction using a Function Module. We need to make sure the generic extraction is delta based, because the Sales Order Items table (VBAP) contains all the line item level information for orders, and extracting everything – i.e. doing a full update on a daily basis and then maintaining it in BW – is not practical. Here, we build logic using ERDAT (Created on) and AEDAT (Changed on) of VBAP to extract the order items created or changed since the last BW extraction.

Steps for a Delta-Enabled, Function Module Based DataSource

1.  Create an extract structure including a DLTDATE field in addition to all other required fields. DLTDATE will be used to build the logic to extract the delta using AEDAT and ERDAT.

             Reason for Addition of a DLTDATE Field in Extract Structure

While configuring delta in the RSO2 screen, the field on which the delta is requested must be a field present in the extract structure. To allow the extractor to provide delta on timestamp, there must be a timestamp field in the extract structure. Hence the timestamp field is added here – it is merely a dummy field created to allow us to use the extractor for delta purposes, as will become clear later.
2.  Copy function group RSAX in SE80 to a new function group, e.g. ZRSAX_TEST.
3.  When copying the function modules, deselect all and select only RSAX_BIW_GET_DATA_SIMPLE; name the copy ZBW_FUNCTION.
4.  Go to the Includes folder and double-click on LZRSAX_TESTTOP; define the structures, field symbol and internal tables as below.

INCLUDE LZRSAX_TESTTOP.

* Structure for the Cursor - extraction of data
TYPES: BEGIN OF ty_vbap,
         Vbeln TYPE vbeln,
         Posnr TYPE posnr,
         Netwr TYPE netwr,
         Waerk TYPE waerk,
         Dltdate TYPE dats,
       END OF ty_vbap.

* Structure for VBPA to extract PERNR
TYPES: BEGIN OF ty_vbpa,
         Vbeln TYPE vbeln,
         Posnr TYPE posnr,
         Pernr TYPE pernr,
       END OF ty_vbpa.

* Structure for the final Table
TYPES:  BEGIN OF ty_ord_final,
           Vbeln TYPE vbeln,
           Posnr TYPE posnr,
           Pernr TYPE pernr,
           Dltdate TYPE datum,
           Netwr   TYPE netwr,
           Waerk   TYPE waerk,
           Netwr_loc_val TYPE WERTV8,
           Loc_curr       TYPE waers,
           Netwr_rep_val TYPE WERTV8,
           Rep_curr       TYPE waers,
         END OF ty_ord_final.

* Internal table
DATA:    t_vbap TYPE STANDARD TABLE OF ty_vbap,
         t_vbpa TYPE STANDARD TABLE OF ty_vbpa.

*Work areas
DATA:    wa_vbap TYPE ty_vbap,
         wa_vbpa TYPE ty_vbpa.

* Variables
DATA:    lv_bukrs TYPE bukrs,
         lv_vkorg TYPE vkorg,
         lv_waers TYPE waers,
         lv_prsdt TYPE prsdt,
         lv_netwr TYPE netwr.

* Currency conversions Variables
DATA:    save_ukurs     LIKE tcurr-ukurs,
         save_kurst     LIKE tcurr-kurst,
         save_ukurx(8)  TYPE p,
         save_ffact1    LIKE  tcurr-ffact,
         save_tfact     LIKE  tcurr-tfact,
         save_ffact     LIKE  tcurr-ffact,
         save_ukurs1(11) TYPE p DECIMALS 5.

* Field symbol declaration
FIELD-SYMBOLS: <i_fs_order_item> LIKE LINE OF t_vbap.

Save and Activate it.

Creating the Function Module

* Auxiliary Selection criteria structure
  DATA: l_s_select TYPE srsc_s_select.

* Maximum number of lines for DB table
  STATICS: s_s_if TYPE srsc_s_if_simple,

* counter
          s_counter_datapakid LIKE sy-tabix,

* cursor
          s_cursor TYPE cursor.

* Select ranges
  RANGES:  l_r_vbeln        FOR vbap-vbeln,    "DOC
           l_r_posnr        FOR vbap-posnr,    "ITEM
           l_r_dltdate      FOR vbap-erdat.    "DELTA DATE

DATA       t_final  LIKE LINE OF e_t_data.
* Initialization mode (first call by SAPI) or data transfer mode
* (following calls)?
  IF i_initflag = sbiwa_c_flag_on.

************************************************************************
* Initialization: check input parameters
*                  buffer input parameters
*                 prepare data selection
************************************************************************

* Check DataSource validity
    CASE i_dsource.
      WHEN 'ZBW_DS_TEST'.
      WHEN OTHERS.
        IF 1 = 2. MESSAGE e009(r3). ENDIF.
* This is a typical log call. Please write every error message like this
        log_write 'E'                  "message type
                  'R3'                 "message class
                  '009'                "message number
                  i_dsource   "message variable 1
                  ' '.         "message variable 2
        RAISE error_passed_to_mess_handler.
    ENDCASE.

    APPEND LINES OF i_t_select TO s_s_if-t_select.

* Fill parameter buffer for data extraction calls
    s_s_if-requnr    = i_requnr.
    s_s_if-dsource   = i_dsource.
    s_s_if-maxsize   = i_maxsize.

* Fill field list table for an optimized select statement
* (in case that there is no 1:1 relation between InfoSource fields
* and database table fields this may be far from being trivial)
    APPEND LINES OF i_t_fields TO s_s_if-t_fields.

  ELSE.                 "Initialization mode or data extraction ?

************************************************************************
* Data transfer: First Call      OPEN CURSOR + FETCH
*                Following Calls FETCH only
************************************************************************

* First data package -> OPEN CURSOR
    IF s_counter_datapakid = 0.

* Fill range tables BW will only pass down simple selection criteria
* of the type SIGN = 'I' and OPTION = 'EQ' or OPTION = 'BT'.
      LOOP AT s_s_if-t_select INTO l_s_select.
        CASE l_s_select-fieldnm.
          WHEN 'VBELN'.
            l_r_vbeln-sign       = l_s_select-sign.
            l_r_vbeln-option     = l_s_select-option.
            l_r_vbeln-low        = l_s_select-low.
            l_r_vbeln-high       = l_s_select-high.
            APPEND l_r_vbeln.
          WHEN 'POSNR'.
            l_r_posnr-sign       = l_s_select-sign.
            l_r_posnr-option     = l_s_select-option.
            l_r_posnr-low        = l_s_select-low.
            l_r_posnr-high       = l_s_select-high.
            APPEND l_r_posnr.
          WHEN 'DLTDATE'.
            l_r_dltdate-sign     = l_s_select-sign.
            l_r_dltdate-option   = l_s_select-option.
            l_r_dltdate-low      = l_s_select-low.
            l_r_dltdate-high     = l_s_select-high.
            APPEND l_r_dltdate.
        ENDCASE.
      ENDLOOP.

* Determine number of database records to be read per FETCH statement
* from input parameter I_MAXSIZE. If there is a one to one relation
* between DataSource table lines and database entries, this is trivial.
* In other cases, it may be impossible and some estimated value has to
* be determined.


  OPEN CURSOR WITH HOLD s_cursor FOR
      SELECT itm~vbeln AS vbeln itm~posnr AS posnr
             itm~netwr AS netwr itm~waerk AS waerk
             itm~aedat AS dltdate
             FROM vbap AS itm
             WHERE itm~vbeln IN l_r_vbeln
             AND itm~posnr IN l_r_posnr
             AND ( ( itm~aedat EQ '00000000'
             AND itm~erdat IN l_r_dltdate )
             OR  ( itm~aedat NE '00000000'
             AND itm~aedat IN l_r_dltdate ) ).
    ENDIF.                             "First data package ?

* Fetch records into interface table.
*   named E_T_'Name of extract structure'.

  REFRESH: t_vbap.
  FETCH NEXT CURSOR s_cursor
        APPENDING CORRESPONDING FIELDS
        OF TABLE t_vbap
        PACKAGE SIZE s_s_if-maxsize.
  IF sy-subrc <> 0.
    CLOSE CURSOR s_cursor.
    RAISE no_more_data.
  ENDIF.
* Loop at t_vbap to build the final table

LOOP AT t_vbap ASSIGNING  <i_fs_order_item> .
        CLEAR: t_final, lv_vkorg,lv_bukrs,lv_waers,lv_prsdt.
        MOVE: <i_fs_order_item>-vbeln TO t_final-vbeln,
              <i_fs_order_item>-posnr TO t_final-posnr,
              <i_fs_order_item>-netwr TO t_final-netwr,
              <i_fs_order_item>-waerk TO t_final-waerk,
              <i_fs_order_item>-dltdate TO t_final-dltdate.

        SELECT SINGLE pernr FROM vbpa INTO  t_final-pernr
            WHERE vbeln = <i_fs_order_item>-vbeln
            AND  posnr = '000000'
            AND parvw = 'ZM'.
          IF sy-subrc NE 0.
             t_final-pernr = space.
          ENDIF.

*Select the order header data based on sales document
       SELECT SINGLE vkorg FROM vbak INTO lv_vkorg
              WHERE vbeln = <i_fs_order_item>-vbeln.
         IF sy-subrc = 0.
* Select the company code based on sales org
         SELECT SINGLE bukrs FROM tvko INTO lv_bukrs
            WHERE vkorg = lv_vkorg.
            IF sy-subrc = 0.
* Select the local currency based on company code
         SELECT SINGLE waers FROM t001 INTO lv_waers
            WHERE bukrs = lv_bukrs.
           IF sy-subrc = 0.
             t_final-loc_curr = lv_waers.
             t_final-rep_curr = 'USD'.
           ENDIF.

* Select the pricing date based on sales document and sales docu item
         SELECT SINGLE prsdt FROM vbkd INTO lv_prsdt
                WHERE vbeln = <i_fs_order_item>-vbeln
                AND  posnr = <i_fs_order_item>-posnr.
              IF sy-subrc NE 0.
                SELECT SINGLE erdat FROM vbap INTO lv_prsdt
                WHERE vbeln = <i_fs_order_item>-vbeln
                AND  posnr = <i_fs_order_item>-posnr.
              ENDIF.

* Convert to local currency
         IF <i_fs_order_item>-waerk NE lv_waers.

               CALL FUNCTION 'CONVERT_TO_LOCAL_CURRENCY'
                  EXPORTING
                    date                    =  lv_prsdt
                    foreign_amount           =  1
                    foreign_currency        = <i_fs_order_item>-waerk
                    local_currency          = lv_waers
                    type_of_rate            = 'M'
                  IMPORTING
                    exchange_rate           = save_ukurs
                    foreign_factor          = save_ffact
                    local_amount            = save_tfact
                    local_factor            = save_ffact1
                    exchange_ratex          = save_ukurx
                    derived_rate_type       = save_kurst
                  EXCEPTIONS
                    no_rate_found           = 1
                    overflow                = 2
                    no_factors_found        = 3
                    no_spread_found         = 4
                    derived_2_times         = 5.
               IF sy-subrc = 0.
                  save_ukurs1 = save_ukurs / save_ffact.
                  IF save_ffact1 NE 0 .
                     save_ukurs1 = save_ukurs1 * save_ffact1.
                  ENDIF.
                  lv_netwr = <i_fs_order_item>-netwr * save_ukurs1.
                  t_final-netwr_loc_val = lv_netwr.
               ENDIF.
          ELSE.
              t_final-netwr_loc_val = <i_fs_order_item>-netwr.
          ENDIF.

* Compare the Local currency with reporting currency
          IF lv_waers NE 'USD' .
* Convert the currency in the reporting currency
               CLEAR : save_ukurs1,lv_netwr,save_ukurs,
                       save_ffact,save_tfact,save_ffact1,
                       save_ukurx,save_kurst.

               CALL FUNCTION 'CONVERT_TO_LOCAL_CURRENCY'
                 EXPORTING
                   date                    = lv_prsdt
                   foreign_amount          = 1
                   foreign_currency        = lv_waers
                   local_currency          = 'USD'
                   type_of_rate            = 'M'
                 IMPORTING
                   exchange_rate           = save_ukurs
                   foreign_factor          = save_ffact
                   local_amount            = save_tfact
                   local_factor            = save_ffact1
                   exchange_ratex          = save_ukurx
                   derived_rate_type       = save_kurst
                 EXCEPTIONS
                   no_rate_found           = 1
                   overflow                = 2
                   no_factors_found        = 3
                   no_spread_found         = 4
                   derived_2_times         = 5.
                 IF sy-subrc = 0.
                   save_ukurs1 = save_ukurs / save_ffact.
                    IF save_ffact1 NE 0 .
                      save_ukurs1 = save_ukurs1 * save_ffact1.
                   ENDIF.
                   lv_netwr = t_final-netwr_loc_val * save_ukurs1.
                   t_final-netwr_rep_val = lv_netwr.
                 ENDIF.
          ELSE.
            t_final-netwr_rep_val = t_final-netwr_loc_val.
          ENDIF.
        ENDIF.
      ENDIF.
   APPEND t_final TO e_t_data.
  ENDLOOP.
    s_counter_datapakid = s_counter_datapakid + 1.
  ENDIF.              "Initialization mode or data extraction ?

ENDFUNCTION.



Explanation of the Code

RANGES:  l_r_vbeln        FOR vbap-vbeln,    "DOC
         l_r_posnr        FOR vbap-posnr,    "ITEM
         l_r_dltdate      FOR vbap-erdat.    "DELTA DATE

The l_r_dltdate range is created for the timestamp. The selection criteria for the extractor are filled into this range. It is used to build the logic for extracting the delta from VBAP using AEDAT and ERDAT.

LOOP AT s_s_if-t_select INTO l_s_select.
        CASE l_s_select-fieldnm.
          WHEN 'VBELN'.
            l_r_vbeln-sign       = l_s_select-sign.
            l_r_vbeln-option     = l_s_select-option.
            l_r_vbeln-low        = l_s_select-low.
            l_r_vbeln-high       = l_s_select-high.
            APPEND l_r_vbeln.

This part of the code passes the selections for VBELN down from OLAP to OLTP. The same applies to the two other fields, POSNR and DLTDATE.

SELECT itm~vbeln AS vbeln itm~posnr AS posnr
             itm~netwr AS netwr itm~waerk AS waerk
             itm~aedat AS dltdate
             FROM vbap AS itm
             WHERE itm~vbeln IN l_r_vbeln
             AND itm~posnr IN l_r_posnr
             AND ( ( itm~aedat EQ '00000000'
             AND itm~erdat IN l_r_dltdate )
             OR  ( itm~aedat NE '00000000'
             AND itm~aedat IN l_r_dltdate ) ).

Here, we extract the delta records from VBAP based on the selection passed for DLTDATE, using ERDAT and AEDAT. Two conditions are used to extract the delta:
1.  ( itm~aedat EQ '00000000' AND itm~erdat IN l_r_dltdate ) – for new records
2.  ( itm~aedat NE '00000000' AND itm~aedat IN l_r_dltdate ) – for changed records
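The same two conditions can be sketched outside ABAP as well. The following Python snippet is an illustrative sketch only (the function name, date handling and example dates are assumptions, not part of the extractor) that mirrors the WHERE clause:

```python
from datetime import date

def is_in_delta(erdat, aedat, delta_from, delta_to):
    """Mirror of the delta WHERE clause: a VBAP line belongs to the delta
    if it was created in the window (AEDAT still initial) or changed in
    the window (AEDAT set). AEDAT = None stands for '00000000'."""
    if aedat is None:                       # never changed -> new record
        return delta_from <= erdat <= delta_to
    return delta_from <= aedat <= delta_to  # changed record

# A line created long ago but changed inside the window is picked up:
assert is_in_delta(date(2014, 1, 1), date(2014, 9, 20),
                   date(2014, 9, 15), date(2014, 9, 22))
# A line created inside the window and never changed is picked up:
assert is_in_delta(date(2014, 9, 16), None,
                   date(2014, 9, 15), date(2014, 9, 22))
```

Note that a record changed inside the window is selected on its change date only, so the same item is re-extracted with each change – which is exactly what an after-image delta needs.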

Thereafter, the fetched records are placed in the internal table T_VBAP.

LOOP AT t_vbap ASSIGNING  <i_fs_order_item>.

SELECT SINGLE pernr FROM vbpa INTO t_final-pernr
            WHERE vbeln = <i_fs_order_item>-vbeln
            AND  posnr = '000000'
            AND parvw = 'ZM'.
          IF sy-subrc NE 0.
             t_final-pernr = space.
          ENDIF.

We loop over this table to extract additional fields as per the requirement. In our case we need to extract additional partner fields from VBPA for the orders created or changed since the last BW extraction.

Currency Conversions

1. Document Currency: the currency in which a document is posted in R/3. It is stored at document level and is available in VBAP (field WAERK), as VBAP contains sales document item level information.
2. Local Currency: the company code currency. It is stored at company code level in table T001.
3. Reporting Currency: In our case it is fixed as 'USD'.

Logic to get Local Currency:

Now, in order to get the local currency one needs the Company Code (BUKRS), but this is not available at sales order item level (VBAP).

1. Fetch Sales Organization (VKORG) first for all the Orders extracted earlier.
     SELECT SINGLE vkorg FROM vbak INTO lv_vkorg
                 WHERE vbeln = <i_fs_order_item>-vbeln.

2. Fetch Corresponding Company Code (BUKRS) for all the Sales Organization (VKORG) extracted above from TVKO.
       SELECT SINGLE bukrs FROM tvko INTO lv_bukrs
             WHERE vkorg = lv_vkorg.

3. Fetch Corresponding Local Currency from the Table T001 for all company Code (BUKRS) extracted above.
       SELECT SINGLE waers FROM t001 INTO lv_waers
            WHERE bukrs = lv_bukrs.

This way we have fetched all the currencies for all the Order Items i.e. Document, Local and Reporting Currency. Now we need to do the conversion of the net value in Document Currency to net value in Local and Reporting Currency.
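To make the chain of lookups easier to follow, here is a minimal Python sketch of the three SELECT SINGLEs chained together. The mini-tables and the document number are invented for illustration; only the lookup path VBAK → TVKO → T001 comes from the text above:

```python
# Hypothetical mini-tables standing in for VBAK, TVKO and T001.
vbak = {"0000004711": "1000"}   # sales document (VBELN) -> sales org (VKORG)
tvko = {"1000": "US01"}         # sales org (VKORG) -> company code (BUKRS)
t001 = {"US01": "USD"}          # company code (BUKRS) -> local currency (WAERS)

def local_currency(vbeln):
    """Chain the three lookups: VBAK -> TVKO -> T001."""
    vkorg = vbak.get(vbeln)
    bukrs = tvko.get(vkorg)
    return t001.get(bukrs)      # None when any step finds nothing
```

Each step can fail independently (sy-subrc NE 0 in the ABAP), which is why the ABAP nests the selects inside IF checks.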

Logic to do Currency Conversions:

We shall be using a Standard Function Module 'CONVERT_TO_LOCAL_CURRENCY' for doing various conversions i.e. Local and Reporting Currency from Document Currency.

Input Parameters:
Date – PRSDT "Pricing Date". Need to fetch it for doing conversions.
From Currency - already fetched above.
To Currency – already fetched above.
Type of Conversion: Fixed as 'M' in our case

SELECT SINGLE prsdt FROM vbkd INTO lv_prsdt
                WHERE vbeln = <i_fs_order_item>-vbeln
                AND posnr = <i_fs_order_item>-posnr.

In this part of the code we are fetching Pricing Date (PRSDT) from VBKD based upon Order Items already extracted from VBAP.
Finally we are passing everything to the Function Module to have the Conversion done to Local Currency first and later to Reporting Currency.
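The arithmetic applied after the function module returns – divide the exchange rate by the foreign factor, multiply by the local factor when it is set, then scale the amount – can be sketched as follows. This is an illustrative Python rendering of the ABAP above; the numbers in the example are made up:

```python
def convert_amount(amount, ukurs, ffact, ffact1):
    """Apply an exchange rate the way the extractor does after
    CONVERT_TO_LOCAL_CURRENCY returns: rate / foreign factor,
    times local factor (when non-zero), times the amount."""
    rate = ukurs / ffact
    if ffact1 != 0:
        rate = rate * ffact1
    return amount * rate

# e.g. a rate of 150 quoted per 100 units of the foreign currency:
assert convert_amount(200.0, 150.0, 100, 1) == 300.0
```

The factors matter because some currency pairs are maintained per 100 or per 1000 units; ignoring them would skew the converted net value by orders of magnitude.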

Putting it All Together: Creating the DataSource

1. Go to RSO2 to create the data source.
2. Fill in the various Details including the Function module and Structure name.


3. Select the option Timestamp and choose the DLTDATE field you added to your extract structure. Also set the safety limits as required.
Note: We could have selected Calend. Day, but in that case the delta extraction could only be done once a day.

4. Click Save to go back to the previous screen and click Save again. The following screen comes up.
Note that the DLTDATE field is disabled for selection; this is because it is populated automatically as part of the delta. As a result, it is unavailable for manual entry in the InfoPackage or in RSA3.
Following this step, create the corresponding ODS, DataSource etc. on the BW side and replicate. These steps are similar to those for a normal generic DataSource.
Later, the active table of this ODS is read to provide these additional fields in the BW old flow.

Hope it helps.

Sep 16, 2014

10 Golden Rules for SAP BW on HANA Migrations

Via Content in SCN

This blog is the third in a sequence of blogs. It starts with Licensing, Sizing and Architecting BW on HANA and moves on to 10 Golden Rules for SAP HANA Project Managers. In this piece, I'm going to discuss some tips for making a migration a success from a technical perspective.

1) Create benchmarks

Always benchmark before and after. Make sure the benchmarks are business-relevant, and are run in a place that excludes network and PC problems – the same place each time. Get the query variants and teach the technical team how to run and time the queries. Create an Excel workbook with queries on the rows and query runs on the columns, and run it every day after the migration. Now you can track project success.
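The queries-on-rows, runs-on-columns workbook can just as well be tracked programmatically. The sketch below is only an illustration of the idea – the query names and timings are invented, and nothing here is a prescribed tool:

```python
from statistics import mean

# Runtimes in seconds, one list of runs per business-relevant query,
# captured before and after the migration (values invented).
runs = {
    "Q01 open orders":    {"before": [42.0, 41.5], "after": [3.9, 4.1]},
    "Q02 revenue by org": {"before": [65.0, 64.0], "after": [5.0, 5.2]},
}

def speedup(query):
    """Average pre-migration runtime divided by average post-migration
    runtime: the headline number for tracking project success."""
    r = runs[query]
    return mean(r["before"]) / mean(r["after"])

for q in runs:
    print(f"{q}: {speedup(q):.1f}x faster")
```

Averaging several runs per query smooths out caching effects, which is also why the blog insists on running from the same place each time.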

2) Exclude unnecessary risks

I've seen so many projects that include unnecessary risks. Here are some examples

- Not doing full backups that allow proper restore points
- Putting installation files on network shares
- Having application servers on different networks to database servers

Find ways to remove these risks, you don't need them in your project.

3) Get your landscape in sync

Your landscape needs to be synchronized. Check your BW versions are part of a proper Support Package Stack and they aren't on wrong revisions.

On the HANA side, make sure you are on the latest revision of HANA. Yes, that means database and client and DBSL. A landscape that is not in sync is a landscape which is likely to fail.

4) Run an independent check of your HANA system

Unfortunately, hardware vendors sometimes mess up HANA installation and configuration. SAP Note 1943937 has the details, and there is an IBM-specific tool which checks for GPFS issues too.

Whilst you're there, check the trace folder of your HANA system. If it has a lot of logs, crashes or dumps then you have a problem. Get someone to resolve it before continuing.

5) Get the latest table sizing and table redistribution notes

There are SAP notes that apply fixes for table sizing and redistribution and these need applying prior to the export process. Search for the latest version and then implement them before exporting.

Specifically, check SAP Note 1921023 for SMIGR_CREATE_DDL and 1908075 for Landscape Redistribution. Also make sure you download the latest row store list from SAP Note 1659383.

6) Get everything completely up to date

This is the #1 reason why I see problems in HANA migrations. You have to get everything up to date, especially SWPM, kernel, disp+work and r3load software. It requires manual work and you have to repackage the SWPM kernel files at times so you get the latest version. If you skip this or cut corners, you will pay for it later in the upgrade project.

7) Include the latest patches and notes

There is a common wisdom on SAP projects where people go to patch level N-1. This isn't the case with HANA, and you need to make sure you are on the latest patch during your project lifecycle. In addition, you need to search OSS for notes which need applying. One useful tip here is to use the SAP Note Search "Restrict by Product Components" --> SAP_BW --> Release 740 to 740 -> SAPKW74005 (for BW 7.40 SP05). You will now only see notes that relate to BW 7.40 SP05, which is neat.

Make sure you check SAP Note 1908075.

It's well worth spending time doing this analysis when you have some spare moments and noting the SAP Notes that you need to apply after the migration. If you don't do this, you will need to do it when you are tired, immediately after the migration.

8) Check the Important Notes list

The master note for BW 7.4 is SAP Note 1949273 and for BW 7.3.1 it is 1846493. You need to check and apply the corrections (and connected corrections) in these notes.
9) Follow post-processing steps

There are a number of post-processing steps detailed in the master upgrade guide. Amongst those, you should ensure you run ABAP Report RSDU_TABLE_CONSISTENCY to check for any problems, and refer to SAP Note 1695112 for more details.

10) Don't cut corners

You can't cut corners in a migration - you need to spend the time to get it right. I've been thinking lately, and SAP could really help make migration projects more successful by automating the above process. In the meantime, you need to be methodical and read all the available information and pay attention to the details. If you want your project to be a success then don't cut corners.

Final Words

The migration to SAP HANA can be a very smooth and simple process, if the technical team pay attention to the details. Every BW on HANA project I have seen in trouble has been either because of governance problems mentioned in my blog 10 Golden Rules for SAP HANA Project Managers, because of poorly architected hardware as per my blog Licensing, Sizing and Architecting BW on HANA, or because of a technical team that didn't pay attention to the detail.

Sometimes you can get away with skipping some of the detail, but usually, you cannot.

Sep 10, 2014

Posted by Thomas Zurek 

This is yet another question that I get from all angles – partners, customers, and even colleagues. BW has been the spearhead SAP application to run on HANA. Actually, it is also one of the top drivers for HANA revenue. We've created the picture in figure 1 to describe – on a high level – what has happened. I believe that this not only tells a story of BW's evolution but underlines the overall HANA strategy of becoming not only a super-fast DBMS but an overall, compelling and powerful platform.

Fig. 1: High level comparison between a classic BW and the two versions of BW-on-HANA.

Classic BW
Classic BW (7.3ff) follows the classic architecture with a central DBMS server with one or more application servers attached. The latter communicate with the DBMS in SQL via the DBSL layer. Features and functions of BW - the red boxes in the left-most picture of fig. 1 - are (mostly) implemented in ABAP on the application server.

BW 7.3 on HANA
At SAPPHIRE Madrid in November 2011, BW 7.3 was the first version to be released on HANA as a DBMS. There, the focus was (a) to enable HANA as a DBMS underneath BW and (b) to provide a few dedicated and extremely valuable performance improvements by pushing the run-time of certain BW features to the HANA server. The latter is shown in the centre of fig. 1 by moving some of the red boxes from the application server into the HANA server. As the BW features and functions are still parameterised, defined, orchestrated from within the BW code in application server, they are still represented as striped boxes in the application server. Actually, customers and their users do not note a difference in usage other than better performance. Examples are: faster query processing, planning performance (PAK), DSO activation. Frequently, these features have been implemented in HANA using specialised HANA engines (most prominently the calculation and planning engines) or libraries that go well beyond a SQL scope. The latter are core components of the HANA platform and are accessed via proprietary, optimised protocols.

BW 7.4 on HANA
The next step in the evolution of BW has been the 7.4 release on HANA. Beyond additional functions being pushed down into HANA, there have been a number of features (pictured as dark blue boxes in fig. 1) that extend the classic BW scope and allow things that were not possible before: the HANA analysis process (e.g. using PAL or R), and the reworked modeling environment with new Eclipse-based UIs that smoothly integrate with (native) HANA modeling UIs and concepts, leading also to a reduced set of InfoProvider types necessary to create the data warehouse. Especially the latter have triggered comments like
"This is not BW."
"Unbelievable but BW has been completely renewed."
"7.4 doesn't do justice to the product! You should have given it a different name!"

It is especially those dark blue boxes that surprise many, both inside and outside SAP. It is the essence that makes dual approaches, like within the HANA EDW, possible, which, in turn, leads to a simplified environment for a customer.

This blog has been cross-published here. You can follow me on Twitter via @tfxz.