Channel: SCN : Document List - SAP BW Powered by SAP HANA

Easier Migration to SAP BW powered by SAP HANA...


...with ABAP Post-Copy Automation
for SAP Business Warehouse (SAP BW)

To reduce downtime in your production landscape, one of the recommended migration paths from SAP BW to SAP BW on SAP HANA comprises a system copy of your SAP BW system. However, the system copy procedure for SAP BW systems and landscapes is complex for a number of reasons. For example, a large number of configuration settings (such as connections and delta queue handling for data loading) and several SAP BW system copy scenarios (each with different landscape aspects) have to be handled as part of every system copy, regardless of whether the copy is part of the migration to SAP HANA or a regular system copy of your SAP BW landscape.
SAP NetWeaver Landscape Virtualization Management offers preconfigured "task lists" used by the ABAP task manager for lifecycle management automation. You can also enable SAP BW powered by SAP HANA to "go productive" in parallel operation with your existing production system, both connected to the same back-end systems. This is achieved using a special and unique automated solution for delta queue cloning and synchronization on the production systems. SAP note 886102 (SMP logon required) thus becomes obsolete. Using the post-copy automation for SAP BW (BW PCA) in the migration process from SAP BW to SAP BW on SAP HANA, this process can be shortened by weeks and becomes easier, faster and more reliable.
  • You can download the license of SAP NetWeaver Landscape Virtualization Management via the SAP Service Marketplace: http://service.sap.com/swdc -> Installations and Upgrades -> Browse our Download Catalog -> SAP NetWeaver and Complementary Products -> SAP NW LANDSC VIRT MGT ENT -> SAP NW LANDSC VIRT MGT ENT 1.0 -> Installation. Before downloading, it is strongly recommended to read the above-mentioned guides thoroughly!

 

 


How to Configure a BW BEx Query in a Custom Fiori Tile


Hello All,

 

SAP has been marketing Role based Fiori applications very extensively.

 

These days, customers want most of their application types to be configured as Fiori tiles, and it's a fact that the look and feel of a Fiori app is simply "BINDAAS" (awesome) when compared to our old legacy screens.

 

Last week I was asked by a customer to configure a BW BEx report in a custom Fiori tile.

As Fiori is still a relatively new topic, I will cover a very detailed step-by-step process here to demonstrate the configuration and will include as many screenshots as possible.

 

So here we go,

 

A small introduction to Fiori:

Fiori is a collection of apps with a simple and easy to use experience for broadly and frequently used SAP software functions that work seamlessly across devices – desktop, tablet, or smartphone.

 

Experience SAP Fiori:

http://experience.sap.com/fiori/

 

See it on SAP HANA Marketplace:

SAP HANA

 

All things SAP Fiori

All Things SAP Fiori

 

Fiori Apps Library

Fiori Apps Library

 

Fiori Configuration Overview:

http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/500de2f4-67fb-3010-c194-cb98eb3fffd8?QuickLink=index&…

 

 

Architecture:

  • Fiori client
  • SAP NetWeaver Gateway (on ABAP)
  • BW

SAP BW (7.3+)  --RFC-->  SAP NetWeaver Gateway 7.4  --HTTPS-->  Fiori apps

 

Steps to be done on the SAP BW side:

1) Create a BW BEx report using the BEx Query Designer.

2) Make the necessary settings by selecting the 'By Easy Query' and 'By OData' options in the BEx Extended tab.

1.png

3) In our case, the technical name of the query is 'ZSALES_ANALYZER_FIORI' and the description is 'Sales Revenue Analysis in Fiori Tile'.

  1.png

4) Once the settings from step 2 are made, an entry appears in transaction EQMANAGER in BW.

1.png

When we click on 'Test Easy Query', the corresponding URL and the BEx output are shown. (In our case, the URL is the following.)

https://XXXXXXX/sap/bc/gui/sap/its/webgui?sap-client=800&~transaction=*EQPREVIEW%20P_WS=/BIC/NF_30;DYNP_OKCODE=WB_EXEC

 

If we want to bypass the credential prompt, we can change the link to the following:

https://XXXXXXX/sap/bc/gui/sap/its/webgui?sap-client=800&~transaction=*EQPREVIEW%20P_WS=/BIC/NF_30;DYNP_OKCODE=WB_EXEC&sap-user=<  >&sap-password=<  >
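As an alternative to putting sap-user and sap-password into the link itself, the same URL can be called with HTTP Basic authentication. The following is only a minimal sketch (the host, client and credentials are placeholders, not values from this configuration):

    import requests

    # Placeholder host and client taken from the masked URL above; replace with your system's values
    EASY_QUERY_URL = (
        "https://XXXXXXX/sap/bc/gui/sap/its/webgui"
        "?sap-client=800&~transaction=*EQPREVIEW%20P_WS=/BIC/NF_30;DYNP_OKCODE=WB_EXEC"
    )

    # HTTP Basic authentication keeps the password out of the URL
    response = requests.get(EASY_QUERY_URL, auth=("MYUSER", "MYPASSWORD"))
    print(response.status_code)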

 

How to enable EQMANAGER Tcode in BW:

Use transaction SBGRFCCONF for EQMANAGER configuration.

http://help.sap.com/saphelp_tm90/helpdata/en/35/0da8627e8040ada63550cf1fee5f6d/frameset.htm

 

Steps to be done in the SAP Gateway system:

5) Log on to the Gateway system with your user name and password.

6) Enter transaction LPD_CUST to create a launchpad role. Here I have created the role ZC_BWQR_00.

1.png

 

7) Double-click the launchpad role and create a new application.

1.png

 

8) Enter the following details in the Link Details screen (the details depend purely on your requirements):

i)  Link Text,

ii)  Description,

iii) Application Type as URL

iv) The actual URL(that we got in Step 4)

v)  Application Alias

 

NOTE: The following is one of the main screens, where the various application types listed below can be configured:

1) BI Web Template

2) Crystal Report

3) Infoset Query

4) List Report

5) Object Based Navigation

6) Portal page

7) Report Writer

8) Transaction (Backend Transactions for Example :  ME23N -> Display Purchase order)

9) URL(Lumira, Crystal Reports, BW Bex reports)

10) Visual Composer xApps

11) Web Dynpro ABAP

12) Web Dynpro Java

13) Xcelsius Dashboard

1.png

The System Alias field is required when calling a transaction or a Web Dynpro application from the back-end system. For the URL type, a system alias is not required.

Also please note that only HTTPS links can be configured in Fiori.

 

9) Proceed without providing the Name Space

1.png

 

10) Create a semantic object using transaction /n/UI2/SEMOBJ. Here we have created ZBWBEXQUERY. To avoid confusion, we can give the same name in all the columns.

1.png

 

Click on SAVE button and save the configuration under a transport request.

 

 

Steps to be done in the Fiori admin page

11) Open the Fiori admin page and provide the required credentials:

https://XXXXXX/sap/bc/ui5_ui5/sap/arsrvc_upb_admn/main.html?scope=CUST

 

12) Create a catalog using the + button (available in the bottom-left corner):

1.png

 

13) Enter the title and ID (as per your requirements).

1.png

 

14) Click on + Icon:

1.png

 

15) Click on 'App Launcher - Static'

1.png

In general, the following types of tiles are available.

Static tile: shows predefined static content.

Dynamic tile: shows numbers that are read dynamically.

News tile: flips through news messages according to the configuration of the tile.


16) After clicking App Launcher - Static, you will be able to see the following screen:

1.png

17) Click on the 'Tiles' option and enter the General details:

1.png


The title and subtitle given here will be shown on the Fiori tile after configuration, so provide the names accordingly. You also have the option to choose the display icon here.


18) Enter the navigation details:

1.png

Provide the semantic object created in Step 10.


19) Click on SAVE Button and Confirm the Changes as OK

2.png

 

20) Click the 'Target Mapping' option and then the 'Create Target Mapping' button available in the bottom-right corner.

1.png


21) Provide the details on the Intent screen.

1.png

Provide the semantic object created in Step 10.


22) Enter the details in the Target Screen:

1.png

Enter the Launchpad Role (created in Step 6) and Application Alias(created in Step 8)

 

23) Proceed by Saving the Configuration changes:

1.png

 

 

Role- and user-related steps to be done in the Gateway system:

24) Log on to the Gateway system and enter transaction PFCG.

Create a single role, provide the description, and save the role.

2.png

25) Click on Transaction drop down and Select Fiori Catalog:

1.png  2.png

 

26) Select the catalog that you created in step 13 and click SAVE.

1.txt.png

 

27) Click the User tab and add the user who will be launching the tile from the Fiori launchpad (in my case, the user is KPIDEMO).

1.txt.png

 

Steps to be done in the Fiori launchpad:

28) Open the Fiori launchpad page and provide the required credentials:

https://XXXXXX/sap/bc/ui5_ui5/ui2/ushell/shells/abap/FioriLaunchpad.html


 

29) Select the Tile Catalog after selecting the List Icon.

1.txt.png --> 2.png

 

30) Search for and select the catalog created in step 13 (in our case, the description 'Sales Revenue Analysis').

1.png

 

31) Add the tile to any group in the launchpad (here I am adding it to the 'My Home' group).

1.png

 

32) The configured tile now appears on the Fiori launchpad home page of the user KPIDEMO.

Untitled.png

 

33) Now click the configured tile and you will see the BW BEx query output.

1.png

 

I hope you were able to follow the steps and configure the BEx report in a custom Fiori tile.

 

 

Also, please note that this is just one of the methods through which we can configure a BEx report in Fiori.

Another option has been explained by Gavin in the following Blog:

SAP BEx Query to Fiori App - From ASUG MN 2015

 

BR

Prabhith

SAP BW 7.4 SP5 on SAP HANA Overview and Roadmap

SAP BW NLS- Frequently Asked Questions


sapmeeting.jpg

Implementation and Change Management

 

Q: What kind of structural changes are permitted if a cube/DSO has sent data to NLS already?

Please refer to http://service.sap.com/sap/support/notes/1005040.
Q: How do BW and NLS tables sync up again after an InfoProvider has been extended?

Adding a new field to an InfoProvider deactivates an associated Data Archiving Process (DAP). Then reactivating the DAP adds the new field to the NLS table. Please note that the added field is empty. If the added field should be filled you have to reload the data, update the new field and archive the data again. DAP activation can be included as part of the transport to adjust QA and Production systems.

Q: What happens when a data archiving process gets deleted in BW?

When a DAP is deleted in BW, the corresponding tables and all archiving requests are deleted from the Sybase IQ database.
Q: What happens when an InfoProvider with an associated Data Archiving Process (DAP) gets deleted?

If archiving has not been executed before, the DAP and corresponding tables from IQ will be deleted.
On the other hand, if archiving has been executed previously, the DAP as well as previously archived data on Sybase IQ remain in place for future usage.

Q: What to consider when transporting Data Archive Processes and associated NLS connections?

In general you should use the same logical name for your NLS connection in all systems throughout your transport landscape pointing to the particular NLS target database (such as Sybase IQ) per system. Then you can transport the DAPs and generate the archiving requests per system. There is no automatic conversion of NLS connection names in BW transport management (like for BW source systems).

Q: Can I archive data from a DataStore Object that has no time characteristic included?

It is not possible to archive data in an InfoProvider that has no time characteristics / date field at all. It is currently not possible to select the flag “Free selection for archiving” in the Data Archiving Process.


Reporting

 

Q: Is it possible to use SAP BusinessObjects Explorer on BW generated views, including NLS data?

In order to achieve this, we recommend to use the HANA Model Generation for BW InfoProviders provided with BW 7.40 SP8. These generated HANA Views on top of BW InfoProviders can also access NLS archived data.
For more information on HANA Model generation with BW 7.40, see http://scn.sap.com/docs/DOC-52790 and the respective documentation: http://help.sap.com/saphelp_nw74/helpdata/en/66/33d851345c4770bd4e523701b9f5b0/content.htm?frameset=/en/66/33d851345c4770bd4e523701b9f5b0/frameset.htm (the documentation is on the information level of the latest SP released)


Before BW 7.40 SP5 (and since BW 7.30 SP8), you could use the BW model import wizard in HANA Studio to generate a HANA view based on the BW InfoProvider metadata directly on the BW tables. These generated HANA views are not able to access NLS archived data. In that case a workaround is possible that involves a HANA calculation view (union) on top of that generated HANA view and a HANA view based on a virtual table on the NLS table in Sybase IQ (using HANA Smart Data Access). Because of the complexity of that workaround, we strongly recommend using the HANA Model Generation with BW 7.40 SP5.

 

 

Q: What's new in SAP BW 7.4 SP8/SP9 with regards to BW Query Execution on archived data?

As of BW 7.40 SP8 and HANA SP08, BW queries can use the so-called "HANA API" to read from archived data. This means that more query features can be pushed down to HANA to reduce post-processing in the ABAP server. This can have a big effect on the query runtime, especially when restricted key figures are used. Technically speaking, as of SP8 the features of query mode 2 can be used to read from archived data if query mode 2, 3 or 6 is used in the query settings. The query modes are explained in this blog: https://blogs.saphana.com/2013/06/10/bw-on-hana-and-the-query-execution-mode/

 

As of BW 7.40 SP8, InfoCubes with non-cumulative key figures can be archived, too. HANA Smart Data Access is required for query processing. More information can be found at SAP Help:

https://help.sap.com/saphelp_nw74/helpdata/en/db/706ccccc2949ab8b158589b7dd395c/content.htm?frameset=/en/4d/f20e7a63bf4b0ca969679ea1527932/frameset.htm&current_toc=/en/c5/f1c99abdd3fb4aa7febe103b5d77e5/plain.htm&node_id=183

Note 2165650 - FAQ: BW Near-Line Storage with HANA Smart Data Access

 

 

Data Loading

 

Q: What happens if an NLS job is running and a load DTP also kicks off?

It is not possible to load data during archiving, especially during the delete step. It is a best practice to build a process step within the process chain for data archiving, so that data loads do not conflict with the archiving process.
Please also refer to the documentation provided here: http://help.sap.com/saphelp_nw73/helpdata/EN/4a/40bf4197ea1d0fe10000000a42189c/frameset.htm
Q: Will there be duplicate records between objects stored in the NLS and BW Objects at any time?

Sybase IQ uses snapshot versioning: a write operation cannot block a read operation or vice versa. Readers will see the data that was valid at the beginning of their extract, even if more recent changes have been applied and committed in the meantime. There is a time window around the archiving process during which data exists in two identical instances (once in the primary BW DB, once in NLS). BW is able to distribute the search criteria correctly, restricting them so that only one instance is taken into account.

Q: Is it possible to parallelize the Data Archiving Process (DAP) request in SAP NetWeaver BW?

Currently the parallel processing option for DAP requests is not supported at batch process level (transaction code: RSBATCH). However, there is a workaround at the data modeling level: by splitting the archiving request into several data archiving processes and scheduling them in parallel in a process chain, it is possible to achieve parallelism. See the workaround description.

 

 

Monitoring

 

Q: How can Nearline Storage Tables and Partitions be accessed and managed from BW?

In order to access and manage partitions on Sybase IQ from BW, SAP provides the program “RSDA_SYB_PARTITION_MONITOR” which is the “Partition Monitor for Nearline Table in Sybase IQ”. This report provides a detailed overview of all Nearline Tables in Sybase IQ and the corresponding partitions for each table.
Since the number of partitions for a data table on Sybase IQ is limited to 1024, the program allows users to define threshold values in order to create alerts and monitor all critical tables which exceed the upper threshold value. In this case the report can also be used to drop and/or merge the individual partitions. See the detailed example.

Q: How can I check performance for NLS query access as well as how many records have been read from NLS during reporting?

You can check this with the query monitor in BW (transaction RSRT). Enter your respective BW query --> "Execute + Debug" --> Others --> check "Display Statistics Data" (and optionally "Do Not Use Cache"). Then you can check how many records were read from NLS and how long reading from NLS took.

FAQ Pic1.png

Suffix $X reflects the active data and $N reflects Near-line Storage (Sybase IQ).

 

System Operations and Set Up

 

Q: Can one NLS system connect to multiple BW instances?

Yes. One Sybase IQ server can talk to multiple BW systems. You should create separate databases in IQ for the different BW systems.

Composite Provider Based on BW on HANA


What is a CompositeProvider?

 

  • A CompositeProvider is an InfoProvider (view) that combines data from several analytic indexes or from other InfoProviders by join or union, and makes this data available for reporting and analysis.
  • The CompositeProvider is defined in a graphical environment, thus facilitating rapid modeling.
  • This makes it possible to bring together data from multiple sources using analytic indexes.
  • The main advantage of CompositeProviders is that BW InfoProviders can be combined using the JOIN operation, allowing us to create new scenarios that are not possible, or very expensive, with standard techniques (MultiProvider, InfoSet); a small illustration of the union/join difference follows this list.
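To make the union/join distinction concrete, here is a purely illustrative sketch in pandas (it is not part of the BW tooling); the two DataFrames merely stand in for two providers that share the characteristic "material":

    import pandas as pd

    # Two hypothetical provider extracts sharing the characteristic "material"
    sales = pd.DataFrame({"material": ["M1", "M2"], "revenue": [100, 200]})
    stock = pd.DataFrame({"material": ["M1", "M3"], "quantity": [5, 7]})

    # Union (MultiProvider-like): rows of both sources stacked over the common fields
    union = pd.concat([sales, stock], ignore_index=True, sort=False)

    # Join (the CompositeProvider addition): rows combined on the shared key
    join = sales.merge(stock, on="material", how="inner")

    print(union)
    print(join)

The join produces one combined record per matching key, which is the kind of result that a MultiProvider or InfoSet can only deliver with considerably more effort.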

 

Creating CompositeProviders from analytic indexes and from other InfoProviders


  1. Call transaction RSLIMOBW

 

           1.png

     2. Enter a name for your CompositeProvider. The maximum length of the name is 10 characters. The system adds the prefix @3 to the InfoProvider name.

     3. Choose Create to open the graphical modeling environment.

  

          1.png

 

     4. Choose an InfoCube and/or analytic index to be used in the CompositeProvider.

     5. Drag and drop the InfoCube and/or analytic index into the modeling area. A popup asks for the binding type, Union or Join.

     6. Choose OK.

 

          1.png

 

    7. Select fields from the InfoCube and/or analytic index by dragging and dropping them into the empty CompositeProvider.

    8. A bold line represents a join connection.

 

          1.png

 

     9. Check the join connections. If a CompositeProvider query delivers an unexpected result, you can run different types of join field analysis.

 

          1.png

    10. You can change a field's name and add a description. Choose Change Properties via the field's context menu. A field's name can be at most 10 characters long; if it is longer, the system assigns a name such as F001, F002, etc.

 

          1.png

        11. Activate the CompositeProvider.

  

                    1.png

Understanding the Stock Coverage Option in Key-Figure


Introduction

A plethora of features, big and small, have been introduced with SAP BW 7.4 powered by SAP HANA. One interesting feature that caught my eye was the 'Stock Coverage' option introduced for key figures. This document provides a business case where this functionality can be used.

 

Scope

The scope of this document is restricted to an Excel-based planning application developed using the Integrated Planning component of BW.

 

Software Components

- SAP BW7.4 powered by SAP HANA

- SAP BEx Query Designer

- SBOP Analysis for Office V2.03

- MS Excel 2013 (32-bit)

 

Note

I will be using technical names of the objects that I created as part of this explanation. In your case, technical names can differ and hence you need to change or adjust per your requirements.

 

Using Key-Figure with Stock Coverage Option

Go to transaction RSD1 and create a key figure as shown below. We are naming the key figure ZKF_NOM in our case. You will notice that when you check the Stock Coverage box, a new tab with the same name becomes visible on the screen.

 

KF.jpg

When you click on 'Stock Coverage', you will be directed to the tab named "Stock Coverage". This tab needs four parameters:

- Referenced Stock Key Figure: the key figure that is going to be the equivalent of stock. It could be receipt or supply, depending on your requirement. In our case, we are using a key figure for projected inventory here - ZKF_PRINV.

- Referenced Demand Key Figure: the key figure against which coverage will be provided by ZKF_PRINV.

- Max. Number of Covered Periods: the number of periods for which ZKF_PRINV should provide coverage. In our case, we are limiting it to 2.

- Stock Type: this defines the stock consideration period - beginning of period or end of period. We are choosing End of Period.

 

KF1.jpg

 

When you select Stock Type, you will be directed to "Aggregation" tab where you need to provide time granularity for Stock Coverage. Currently only standard SAP time characteristics can be selected. Even if you create a custom characteristic with reference to standard, it will not be supported here. For our purpose, we are planning at month level and will be using 0CALMONTH.

KF2.jpg

We are now done with stock coverage configuration for this key-figure. Activate it.

 

Create a real-time InfoCube using the Administrator Workbench. You must ensure that 0CALMONTH, ZKF_PRINV and ZKF_TDMND are included when using ZKF_NOM.

 

KF3.jpg

 

Activate your InfoCube and load any required data. You will notice that ZKF_NOM is not visible in transformations or in the LISTCUBE transaction, much like non-cumulative key figures. The reason is that the calculation for this key figure has been pushed down to the database and is performed at run time.

 

Create an aggregation level and create an Input-ready query on top of it. ZKF_PRINV and ZKF_TDMND are marked as input-ready in the BEx query.

 

We will now call this query in Analysis Office. Launch "Analysis for Microsoft Excel" from your Programs Menu.

KF5.jpg

Ensure that planning ribbon is activated. If not, follow the settings below:

KF4.jpg

 

Insert the input-ready query created before as datasource and execute it by passing values into variables, if any.

KF6.jpg

 

Now I will enter a value of 10000 for 07/2015 for Projected Inventory, and 5000 and 7000 for 08/2015 and 09/2015 respectively for Total Demand.

KF7.jpg

 

Now click the save icon in the Planning ribbon of your Analysis tab and the number of months is calculated. KF8.jpg

 

Now let us understand the calculation that happened. Remember that when we configured the key figure ZKF_NOM, we set the parameter "Max. Number of Covered Periods" to 2.

The total projected inventory available in 07/2015 is 10000 units, to cover the demand of the next 2 months. However, the combined demand of the next two months, 08/2015 and 09/2015, is 5000 + 7000 = 12000, which is greater than the available quantity.

Demand is completely covered for 08/2015, but for 09/2015 it falls short by 2000 units. The available stock can therefore cover only a portion of that month, which can be calculated as 5000/7000 = 0.71 months.

So the total stock coverage available in this case is 1.71 months. To bring it to the desired level, I could either raise the projected inventory or carry the excess demand forward. There are many ways to tackle this scenario.
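The logic behind the calculated value can be expressed in a few lines. The following is only an illustrative re-computation of the example above, not the actual in-database implementation:

    def stock_coverage(stock, demands, max_periods):
        # Number of future periods (possibly fractional) covered by the available stock
        covered = 0.0
        for demand in demands[:max_periods]:
            if stock >= demand:
                covered += 1.0             # period fully covered
                stock -= demand
            else:
                covered += stock / demand  # only a fraction of the period is covered
                break
        return covered

    # 10000 units in 07/2015 against demands of 5000 (08/2015) and 7000 (09/2015)
    print(round(stock_coverage(10000, [5000, 7000], 2), 2))  # 1.71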

 

This is a very handy feature and saves a great deal of time. You can get your Inventory in Number of Days if your reference time characteristic is 0CALDAY.

 

While loading data into your InfoCube, if you have data for stock and demand key-figures, the stock-coverage key-figure will calculate those during loading and the values will be readily available in your query or application or report.

SAP HANA Memory Utilization-Corporate Memory


1 Corporate Memory

 

 

While working with big clients that have businesses throughout the globe, it becomes really important to manage the data and, wherever possible, reduce the database size. When dealing with millions of records, we often come to a situation where the database size reaches almost the maximum capacity and data loading is no longer possible. We cannot always ask the client to increase the DB size, as a lot of money is involved, especially if you are using the HANA database. Several archiving techniques have been introduced for HANA, such as NLS and Dynamic Tiering with extended tables, but these are not only expensive but also have some restrictions. For example, if a document has already been archived using NLS and you then want to load a delta for that record into a DSO, you will run into a DSO activation error.

We have a simple technique that can be used to reduce the main memory size without incurring much implementation cost. Here we store the less important data in a separate area called corporate memory.

 

 

 

1.1 Data in Corporate memory

 

First we need to analyze which are the large tables in DBACOCKPIT. Then we need to decide, based on business requirements, which data is less important. The data is classified into three types: hot, warm and cold.

Hot data: data which is regularly used and often changed, e.g. the last 3 years of data.

Warm data: data which is rarely used and may be needed once or twice a quarter, e.g. data between 2006 and 2012.

Cold data: data which is most likely never going to be used but which the business still wants to keep, e.g. data older than 2006.

The production data can be classified into these groups based on business requirements. The selection field can be time, company code or anything else; it should be purely a business decision.
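As a simple illustration, such a classification by year could look like the sketch below. The thresholds are only the example values used later in this document, not a fixed rule:

    def classify(record_year, current_year=2015):
        # Hypothetical hot/warm/cold split along the lines described above
        if record_year >= current_year - 3:
            return "hot"    # keep in HANA main memory
        if record_year >= 2006:
            return "warm"   # candidate for corporate memory
        return "cold"       # corporate memory, very rarely accessed

    print(classify(2014))  # hot
    print(classify(2009))  # warm
    print(classify(2004))  # cold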

 

With corporate memory we intend to keep both warm data and cold data out of main memory and thus drastically reduce the size of main memory.

Create an inflow DSO which will be a copy of the large BW tables.

 

 

1.2 Hana Data Base

 

To understand the advantages of corporate memory, we first need to understand how the data is stored in the HANA database.

When data is loaded into an InfoProvider in HANA, it actually gets loaded into two storage areas: memory and disk. Data can thus be accessed very quickly, since it resides in memory (RAM).

 

SC1.jpg

The screenshot shows that there are two sizes:

Current total memory size and size on disk.

With the corporate memory concept, we can offload some of the data from main memory and keep it only on disk, so the main memory size is reduced.

 

 

Unload "SAPMIW"."/BIC/B0002683000"

 

2 Architecture

2.1 Functional Details

 

Let's take the example of an FI DSO. We assume that it has data from 2000 to 2015.

The business has approved that only data from 2012 onwards will be useful for reporting; the rest is historical data that may not be required often. Hence the data from 2000 to 2011 can be considered cold and warm data and does not need to be kept in main memory. This data will be stored in the inflow DSO and then removed from the main DSO.

 

 

 

 

 

2.2 Technical Details

 

  • Create an inflow DSO (corporate DSO) which will be a copy of the main DSO.
  • Create transformations and DTPs in both directions between the corporate DSO and the main DSO.
  • Load the data period-wise to the corporate DSO (years 2000 to 2011).
  • Once the data is loaded in the corporate DSO, perform a selective deletion of the years 2000 to 2011 from the main DSO.
  • In DBACOCKPIT run the SQL statement:

Unload "SAPMIW"."/BIC/AFIXXXXX" (here /BIC/AFIXXXXX is the active table of the corporate DSO)

  • The above statement unloads all data of the corporate DSO from main memory.

 

 

3 DBACOCKPIT Test

3.1 Before Corporate memory Load

 

  • Run t-code DBACOCKPIT

SC2.jpg

 

 

  • Expand Current Status
  • Double click on Overview

SC3.jpg

 

  • We will get the overall data base size

Sc4.jpg

3.2 Corporate memory Load

After taking the statistics from the cockpit, we need to perform the simple steps below:

  • Load the data period-wise to the corporate memory DSO from the main DSO.
  • Perform a selective deletion from the main DSO of the periods that were loaded.
  • Check the size of the main memory table in DBACOCKPIT.
  • In DBACOCKPIT, expand Diagnostics and double-click on Tables/Views.

 

  

sc5.jpg

 

 

 

  • Enter the active data table name in the Table/View section and choose Display.

 

 

 

 

sc6.jpg

 

 

 

 

 

 

 

 

  • The next screen will give you the current size of the Corporate table

 

 

sc7.jpg

 

  • Here we can see that 1.12 GB of main memory is allocated for this DSO.

 

3.3 Offload of Main Memory

 

 

  • Run SQL command in DBACOCKPIT SQL Editor:

Unload "SAPMIW"."/BIC/AFIXXXXX"  

  • Expand Diagnostics

  

 

sc8.jpg

 

 

 

 

 

  • Double click on SQL editor and enter the command

 

 

  • Click on execute
  • All the data of this corporate DSO will be removed from main memory.
  • Check again the size of this table in DBACOCKPIT- Tables View

   sc9.jpg

 

 

  • Here we can see that the memory size is zero.
  • The size on disk remains the same, as we have only removed the data from main memory, not from disk.

 

 

 

 

 

 

 

4 Important Notes and Conclusion

4.1 Notes:

  • We can repeat this activity for as many DSOs or cubes as we want.
  • Once we finish this activity, we sometimes see that the data comes back into main memory automatically. This happens if someone merely looks at the data of the corporate memory DSO in SE11 or LISTCUBE, because of the HANA functionality called delta merge.
  • Delta merge is standard HANA functionality: delta data is initially stored in delta memory and later moved to main memory.
  • To avoid this, we can schedule a program that performs the offload of main memory using the above SQL command on a periodic basis. Then, even if someone opens the corporate memory DSOs and the data is loaded into main memory again, it is unloaded periodically and the size is kept under control (see the sketch below).
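A minimal sketch of such a periodic offload job, using the SAP HANA Python client (hdbcli). The connection data, schema and table names are placeholders, and the monitoring view M_CS_TABLES is only queried to show the effect of the UNLOAD:

    from hdbcli import dbapi  # SAP HANA Python client

    # Placeholder connection data for the HANA database underneath BW
    conn = dbapi.connect(address="hana-host", port=30015, user="DBUSER", password="***")
    cursor = conn.cursor()

    # Active tables of the corporate memory DSOs (placeholders)
    corporate_tables = ['"SAPMIW"."/BIC/AFIXXXXX"']

    for table in corporate_tables:
        # Show how much main memory the table currently occupies
        table_name = table.split(".")[-1].strip('"')
        cursor.execute(
            "SELECT MEMORY_SIZE_IN_TOTAL FROM M_CS_TABLES WHERE TABLE_NAME = ?",
            (table_name,),
        )
        print(table, cursor.fetchone())

        # Drop the in-memory copy; the data stays persisted on disk
        cursor.execute("UNLOAD " + table)

    cursor.close()
    conn.close()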

 

 

4.2 Conclusion  

 

  • After the corporate memory activity is performed, check the overall size in DBACOCKPIT again; you will see that there is a massive reduction.
  • This helps to really control the main memory size in the database and can save a good amount of money for the customer by not having to increase the size.
  • This is a cost-effective solution and also has a low maintenance cost.
  • Selective deletion from the original DSO helps reduce the size.

SAP-NLS Solution - Update Q2/2015


With the current release of SAP BW 7.40 Feature Pack 02, some interesting improvements and updates were released for the SAP-NLS solution based on SAP IQ SP08. In this short presentation you will find some important updates in the area of data loading and query performance together with SAP-NLS. Together with the existing SAP First Guidance document - https://scn.sap.com/docs/DOC-39627 - this update gives you a quick overview of the SAP-NLS area.

View this Presentation


Frequently asked Questions when Using BW Post Copy Automation


- in an SAP NetWeaver BW on SAP HANA Migration Context

Picture5.jpg
This page lists the topics to consider when using  BW PCA in a SAP NetWeaver BW on SAP HANA migration context. Last update: 2nd July 2015

 

 

1.    Installation

 

1.1  LVM license

Typically the process starts with getting the license for LVM in place. As BW PCA is embedded into the Post-Copy Automation framework, it is licensed by LVM and requires a valid LVM license. Customers wanting to use BW PCA Initial Copy (incl. the delta queue cloning feature) need to contact their Account Executive / Global Account Director to get the corresponding license. For questions on licensing, please contact your AE or LVM Solution Management (Jens.Rolke@sap.com). Because different parties are involved in getting the license in place, you should start obtaining the license early in your project phase. You can find the details on the license installation in the installation guide at http://service.sap.com/~sapidb/011000358700000175442014E.

 

1.2   Where can I download the software? I can't find it in the Service Marketplace.

Before being able to see the software you have to be registered as an LVM customer. This requires the license for LVM (see point 1). If this is in place, you can find the software by following the path http://service.sap.com/swdc --> Installations and Upgrades --> Browse our Download Catalog --> SAP NetWeaver and Complementary Products --> SAP NW LANDSC VIRT MGT ENT --> SAP NW LANDSC VIRT MGT ENT 2.0 --> Installation.

 

1.3   What is the install footprint for BW PCA?

The functionality of BW PCA is delivered via notes and support packages. However, in order to use the tool, you need a license key for LVM. This key is an add-on, so SAP is able to restrict access to the functionality.

 

1.4   What are the required SAP notes I have to implement?

To get a list of the notes required in ECC and BW, please follow the instructions mentioned in note http://service.sap.com/sap/support/notes/1707321 (for BW notes) and http://service.sap.com/sap/support/notes/1614266 (for Basis notes) in all systems affected by BW PCA. Execute the note analyzer report attached to note 1614266 and choose the right option for the affected system to download and implement the needed notes. A lesson learned from customers that used BW PCA is that you should execute this report with the latest XML file in ECC and BW immediately before starting your system copy procedure. That ensures that you have the latest developments / corrections in your systems.

 

1.5   I don’t copy the ERP system. Do I still have to install notes in the ECC?

There are currently several notes that need to be applied to your ECC systems. You can check which of them apply to your system using the note analyzer report attached to note http://service.sap.com/sap/support/notes/1707321; choose the option "Post-Copy Automation requirements for BW source system that will not be copied".

 

2.   Further information

 

2.1   Where within SAP do I find more information about BW PCA Initial Copy?

We have created an official SCN page regarding the BW PCA Initial Copy functionality with presentation material and details on which licenses are required, etc. Please visit http://scn.sap.com/docs/DOC-32414. For more information about the whole scope of BW PCA, please visit the SCN page http://scn.sap.com/docs/DOC-54097 (System Copy Automation for SAP Business Warehouse System Landscapes (BW PCA)).

 

 

2.2  Are there any recorded sessions available?

We have 2 recorded sessions that should provide an overview of the BW PCA Initial Copy procedure.

  1. German session: https://sap.emea.pgiconnect.com/p81522211/
  2. English session: http://scn.sap.com/docs/DOC-34256

 

2.3  Is there an RDS solution available?

There is an RDS solution available. In case you want to make yourself familiar with it have a look at https://service.sap.com/rds-hana-bwmig.

 

2.4  More practical infos :

 

3.   Impact of the cloning on ECC and BW

3.1   What happens to my ECC environment?

3.1.1   Do I have to lock ECC against data changes?

The cloning of the queues can happen without any implication to the operational processes in ECC. All Queues visible in transaction RSA7 get cloned and will be populated as soon as they are available. Existing queues are not influenced. Cloning means, we double the pointers for delta queues, not the data behind. The data is kept in ECC as long as the cloned BW system hasn’t picked up the deltas as well (growth of ARFCSDATA table). So please consider that in your overall planning as this phase of growing delta queues should be as short as possible.

 

3.1.2  Do I have to open the ECC for metadata changes?

If note 1855474 is implemented (it is part of aforementioned note analyzer list), then the source system doesn’t need to be changeable during the PCA procedure.

 

 

3.2  What happens during the cloning process, and what should I consider?


The cloning of the queues can be divided into 2 phases.

  1. The whole procedure starts by executing the step "Clone delta queues in BW source system" of the task list SAP_BW_COPY_INITIAL_PREPARE in the original BW system. This step processes the queues of each configured BW source system that is connected to BW. The process clones the relevant queues in the underlying BW source system by creating the relevant entries in the metadata tables.
  2. In the step "Synchronize delta queues in BW source system" the task list synchronizes the cloned queues with the original queues in the underlying BW source systems by processing the single queues sequentially. With recent feature improvements, the synchronization procedure enables parallelization across different BW source systems to speed up the process. If you have created a large number of LUWs in ECC between the clone step and the synchronization step, the runtime of this synchronization step can be significantly long. In pilot projects, we have seen runtimes of up to 5 hours. As a recommendation, you should empty the queues before cloning by getting the deltas into BW. Furthermore, you should keep the time between the clone step and the synchronization step as short as possible.

 

 

3.3  Which delta queues get cloned and which applications do not support BW PCA?

All delta queues which are visible in transaction RSA7 get cloned. However, extractors that have implemented their own change pointer management, which can only deliver to one single BW system, are not handled by the BW PCA delta queue clone; i.e. although the cloning in RSA7 might appear OK, the DataSource can encounter errors during data load. For those applications you have to check whether a workaround is possible (e.g. duplication of the DataSource, own logic, etc.). The industry solution IS-H is such an example.

 

Please have a look at note http://service.sap.com/sap/support/notes/1932459 for unsupported data sources.

 

3.4  What should I consider if the delta queues get very large between the single phases in which I am able to extract to the connected BW?

If the delta queue got very large (e.g. during system copy and HANA Migration) you can use the parameter MAXPAK in the InfoPackage of a delta upload (->Scheduler –> Data Source Default Data Transfer) and configure the relevant parameter. See notes http://service.sap.com/sap/support/notes/1160555 and http://service.sap.com/sap/support/notes/1231922 for details.

 

3.5  What is the impact on the BW system?

3.5.1  Can I still load data and execute Queries?

After the cloning task has started, new delta initializations cannot be done anymore until the prepare task list is finished after the database export. When the queue synchronization phase has started (which can take a few hours in case of large delta queues) delta loads or initializations of deltas cannot be executed until the synchronization process is over, the BW database is exported and the prepare task list is finished. You can minimize the time needed for synchronization by following the advices given below in section “Runtime considerations”. Reporting on existing data is however still possible during that phase until the BW is shut down for database export.

 

3.5.2  Do I have to set the BW system to changeable?

If you have installed the latest version of the XML file of note http://service.sap.com/sap/support/notes/1707321 then the BW system does not have to be changeable during the PCA procedure.

 

4.  How to Use the Task Lists

4.1   What are the task lists I have to execute and in which system do I have to do that?

In the original BW you have to execute the task list SAP_BW_COPY_INITIAL_PREPARE. After you copied the system you execute the task list SAP_BW_BASIS_COPY_INITIAL_CONFIG in the copied system. For details on the procedure please check the configuration guide at http://service.sap.com/~sapidb/011000358700000368892013E

 

 

4.2  Should I continue the execution of the task list SAP_BW_COPY_INITIAL_PREPARE (i.e. after the step Confirm export) in the copied system?

No, the task list SAP_BW_COPY_INITIAL_PREPARE should only be executed in the original BW system. Please continue the task list after the step Confirm export in the original BW system using transaction STC02.

Don’t create a new execution of a task list (e.g. SAP_BW_COPY_INITIAL_PREPARE) via transaction STC01 while there is already an execution in process. Use transaction STC02 in order to resume the execution instead.

 

 

4.3  How should I execute the cloning step-by-step?

A good procedure, in case you don't use the Database Migration Option of the Software Update Manager (DMO of SUM), is the following:

 

  1. Extract all deltas in the original BW system using the standard procedure. Some of our customers created their own process chain that includes all delta InfoPackages and executed this process chain. Ideally you do this twice in order to also remove the data in the delta queue which is stored for a possible delta repeat request.
  2. Clone the delta queues by executing the BW PCA task list SAP_BW_COPY_INITIAL_PREPARE in the original BW system.
  3. Extract all deltas in the original BW system once more (this is checked in the BW PCA procedure). Ideally you do this twice in order to also remove the data in the delta queue which is stored for a possible delta repeat request.
  4. Synchronize the delta queues by continuing the BW PCA task list SAP_BW_COPY_INITIAL_PREPARE in the original BW system.
  5. Export the BW database and start up the original BW, then execute your regular tasks as usual.
  6. Resume the task list SAP_BW_COPY_INITIAL_PREPARE in transaction STC02 in the original BW system.
  7. In parallel, you can import the BW database to build a new cloned BW system.
  8. Extract deltas in the cloned BW system twice (using the process chain from step 1).
  9. Optionally, upgrade the cloned BW system to your target release.
  10. Extract deltas in your upgraded BW system twice (using the process chain from step 1).
  11. Migrate your database to SAP HANA.
  12. Run the post-migration activities.
  13. Extract and execute deltas in BW on HANA on a regular basis.

 

4.4  Are there further task lists which are interesting for me?

BW Housekeeping task list can be applied prior to system copy and migration (see note http://service.sap.com/sap/support/notes/1829728).

 

5.   Runtime considerations

5.1   How long will the execution of the task list take in the original system

The cloning phase itself is quite fast. The synchronization phase will take longer, it was however parallelized and hence also won’t take days, especially if you follow the step by step procedure given above. It is however crucial, that the sync step doesn’t encounter failed delta data loads. These loads are already given as warning in the clone step, and will cause the sync step to halt in error in the check phase. You are then asked to repair these loads, and in case the only possible repair is to delete the initialization, then you are in trouble, because no changes to initializations are allowed anymore after the clone step. Most time in past projects has been spent with repairing old and outdated loads rather than the plain run time of the automated steps. For that reason make sure that all DataSources with active delta initialization do indeed work and have been loaded recently (cf step i. in the procedure outlined above). Run the SAP_BW_COPY_INITIAL_PREPARE in check mode first to detect potential problematic candidates and fix them before you run the task list in execution mode.

 

5.2  How long will the execution of the task list take in the cloned system?

Most long running steps like reconnect or TSPREFIX-Rename have recently been parallelized in the different BW source systems to speed up the process. We have seen that the runtime is sometimes determined by the runtime of BDLS, not so much for the initial copy (which this FAQ concentrates on), because usually only the BW system itself is renamed, but for refresh scenarios, where also the source systems are renamed. The BDLS runtime has been improved a lot recently within PCA. With parallelization of conversion jobs based on table entries, the runtime of BDLS especially for BW systems can be reduced significantly. For better planning and preparation of your project you should run the analysis report RBDLS_CHECK beforehand in your productive BW system to extract the relevant tables for conversion and get estimation about how to schedule the BDLS process. For more information look at the following links: 
http://scn.sap.com/docs/DOC-60122

https://scn.sap.com/docs/DOC-62597

 

In case you used the BW InfoObject 0LOGSYS in DSOs, InfoCubes or master data objects, the runtime of BDLS can be significant. In that case, consider executing the procedure given in note http://service.sap.com/sap/support/notes/1894679

 

6.  Landscape considerations

6.1   Can we have a mixed BW on HANA / BW on RDBMS system landscape?

With the help of BW PCA (delta queue cloning and initial copy configuration) you can copy a BW system, which can then be upgraded and migrated to HANA. However, it is not recommended to have a mixed system landscape (Dev, QA and Prod) running on different databases. For more information please refer to http://scn.sap.com/docs/DOC-41509 (page 25 ff.) and SAP Note 1808450 - Homogeneous system landscape for BW on HANA.

 

6.2  I want to upgrade/migrate my original system. Does PCA initial copy help me here?

You can use PCA initial copy to create a temporary synchronized copy of your original system which can be used by the end users during the time in which you upgrade / migrate your original system. After the original system is again productive, you can decommission the copied system.

 

6.3  Is the copied system synchronized in all aspects?

The cloning procedure covers only Service API extraction from ERP systems. ODP extraction is currently under development.

 

If you use DB Connect source systems, then the DBCON entries are copied as well, so you would extract the same data into either BW system. In case this is not desired, you would have to change the DBCON entries manually.

 

If you use WebService push data load, then the remote WebService systems will continue to send the data into the original BW only. If they shall send it to the new BW instead (or as well), the WebService system must be changed, possibly even by adapting the code of the service.

If you load data into further (data mart) BW systems, then you have to decide whether the data mart shall receive the data from the original BW system, from the new one, or from both. If you choose to switch the delivery to the new BW, then you need to manually change the host in the RFC destinations of the data mart system and execute BDLS there to point to the new BW. If you choose to receive data from both BW systems, you can connect the new BW system manually and use the data flow copy tool within the data mart to copy the data flows which load from the original BW.

 

If you use planning tools, then the plan data will be stored in one BW only, depending on which one you execute the planning in. The other system is not synchronized, unless you use dedicated SLT features. Any other change not related to data loads, such as workflow processes, is also not synchronized, but could be with a specialized setup of SLT processes.

 

 

6.4  How long can both BW systems be maintained parallel?

If the restrictions given above are mitigated, then the systems can in principle be operated infinitely in parallel.

 

 

 

 

7.   Deletion / Undo of the Cloning

7.1    When the original BW is decommissioned, the original delta queues should not be filled anymore. What is the advised method of deleting these delta queues?

The easiest way to decommission the original BW system is to stop all loading processes, followed by a deletion of the source systems in the Administrator Workbench. The related delta queues of the source systems have to be deleted in RSA7 manually or with the report given in note http://service.sap.com/sap/support/notes/2014906.

 

7.2  Customer wants to decommission a BW source system or reset the cloned delta queues created by BW PCA and restart the configuration. What are the required steps?

 

 

Suppose you have a BW source system connection from a source system A to a BW system B. For this connection, delta is recorded. You want to delete the connection and the delta recording. There are three cases to consider, for which we need to describe what happens in BW system B and BW source system A when decommissioning the system.

 

 

7.2.1   No entry in table RSBASIDOC exists for BW system B, not even in source system A

This is the case after delta queue cloning for PCA, when the new BW system B has not yet been created and connected to source system A.

You should copy the Z-report attached to SAP note http://service.sap.com/sap/support/notes/2014906 to your source system A and execute it, passing the logical system name of BW system B as input parameter.

This will cause the following changes in source system A:

 

  1. delete the ALE change pointer delta recording in table TBDA2. Delta is no longer recorded for those.
  2. delete the delta DataSources from RSA7 belonging to BW system B. Delta is no longer recorded for those.

 

7.2.2  Within the BW system B, the source system A is visible in the Administrator Workbench.

 

In this case you have to delete the source system A from the Administrator Workbench in the source system tree in BW system B.

 

This will cause the following changes in BW system B:

  1. the entry for source system A in table RSBASIDOC will be deleted
  2. all source-system-dependent metadata for source system A will be deleted

 

This will cause the following changes in source system A:

 

  1. the entry for BW system B in table RSBASIDOC will be deleted
  2. all entries in table ROOSGEN for BW system B will be deleted
  3. the ALE change pointer delta recording is set to inactive in table TBDA2. The delta is no longer recorded for those. The entries of table TBDA2 are not deleted, as this is not required, as delta is no longer recorded.
  4. all delta DataSources in RSA7 of source system A pointing to BW system B appear as red. Nevertheless the delta is still recorded for those.

 

Therefore you have to copy the Z-report attached to SAP note http://service.sap.com/sap/support/notes/2014906 to your source system A and execute it, passing the logical system name of BW system B as input parameter.

In the source system A following changes are caused by this report:

  • delete the ALE change pointer delta recording in table TBDA2. Delta is no longer recorded for those.
  • delete the delta DataSources from RSA7 belonging to BW system B. Delta is no longer recorded for those.

 

 

7.2.3  Within the BW system B the source System A is not visible in the Administrator Workbench or does not exist in the metadata tables of BW system B. But in source system A, in table RSBASIDOC, the entry for connection towards BW system B is still present.

 

In this case you have to execute function module RSAP_BIW_DISCONNECT in single test in source system A, passing the logical system name of BW system B as input parameter for the BW system, and the logical system name of source system A as input parameter for the OLTP-system. Do also set the FORCE_DELETE parameter to X.

This will cause the following changes in source system A:

 

  • the entry in table RSBASIDOC is deleted all entries in table ROOSGEN for BW system B are deleted
  • the ALE change pointer delta recording is set to inactive in table TBDA2. The delta is no longer recorded for those. The table entries are not deleted.
  • all delta DataSources in RSA7 of source system A pointing to BW system B appear as red. Nevertheless the delta is still recorded for those.

Therefore you have to copy the Z-report attached to SAP note 2014906 to your source system A and execute it, passing the logical system name of BW system B as input parameter.

In the source system A following changes are caused by this report:

 

  • delete the ALE change pointer delta recording in table TBDA2. Delta is no longer recorded for those.
  • delete the delta DataSources from RSA7 belonging to BW system B. Delta is no longer recorded for those.

 

7.3  Useful reports

We had a few customer situations in which the PCA procedure detected inconsistencies in already existing queues in the ECC environment. As this leads to a stop in the whole PCA procedure until the issue is fixed, we recommend scheduling the report RSC1_DIAGNOSIS in ECC to check the consistency of the queues which are to be cloned.

Advanced DSO - What's so Advanced about it


Context:

With the Advent of BW 7.4 Powered by HANA, we are observing sea changes in the world of SAP BW, now transforming into EDW.

One of the prominent changes is the introduction of a new Infoprovider known as Advanced DSO.

I believe the name 'Advanced' is well justified because it advances to replace the InfoCube and the DSO (now known as the classic DSO) and is positioned to become the preferred InfoProvider for persistence in SAP BW.

I got a chance to lay my hands on this new kid on the block and wanted to share my experience with it.

 

Structure:

I will be presenting it with numbered points and screenshots. Rest is self-explanatory.

 

Content:

The document contains sequenced content with screenshots. Therefore, attaching XML version.



Update to SAP BW 7.50


SAP BW 7.50 is the next cornerstone on the simplification road towards S/4HANA. With full support of Advanced DSO (aDSO) objects for SAP-NLS and dynamic tiering (DT), further enhancements to the BW modeling environment in Eclipse (query designer), and the new HANA source system to stay compatible with S/4HANA, Business Planning and Consolidation (BPC) is now also fully integrated into the SAP BW software layer for the standard and embedded use cases. This update also explains the decision for the code split from 7.40 to 7.50 and further considerations for the upgrade and implementation.

View this Presentation

SAP BW on SAP HANA References

273819_l_srgb_s_gl.jpg

This page gives an overview of presentations that list BW references from customers by industry, as well as additional customer references.

 

 

 

 

 

 

 

 

 

 

 

Last Update: August 2015

 

 

 

Avnet: Achieving One Version of the Truth with SAP BW powered by SAP HANA

Avnet is a global distributor of technology goods and services that operates in more than 120 countries, working with more than 800 suppliers and 100,000 customers worldwide. Hear how the strong partnership with SAP has allowed Avnet to transform its business warehouse environment and provide game-changing analytics that accelerate success for its customers.

 

 

Also check out Thomas Zurek's collection of public customer references related to BW-on-HANA, such as articles, comments, videos, demos and tweets.

SAP Business Warehouse 7.5, edition for SAP HANA / Ramp-Up


 

 

 

Ramp-Up for SAP BW 7.5, edition for SAP HANA

Would you like to be in the front row and experience the new features firsthand? We have recently opened nominations for customers who want to participate in the Ramp-Up for SAP Business Warehouse 7.5, edition for SAP HANA.

Please note that the Ramp-Up program only addresses SAP BW 7.5, edition for SAP HANA, which starts with the release of SP01, planned for 23 November 2015.

All other features of SAP BW 7.5 powered by SAP HANA are generally available as of SP00, planned for 24 October 2015, without Ramp-Up.


The Ramp-Up of SAP BW 7.5, edition for SAP HANA mainly targets customers who want to perform a new install/greenfield implementation, or who are already on BW 7.4 powered by SAP HANA and leverage the HANA-optimized BW objects, especially for new modeling projects. SAP BW 7.5, edition for SAP HANA supports modeling only with HANA-optimized BW objects, which leads to simplified governance of new implementation projects and a future-proof architecture for your BW implementation.

 

 

 

 

 

SAP BW Roadmap driven by SAP HANA


 

 

What’s SAP BW 7.5, edition for SAP HANA?


SAP BW run simpler

      • Simplified governance
      • Faster time to market with new agile and flexible data modelling patterns

An option to run SAP BW 7.5 only with new and optimized SAP HANA objects

      • Simplified modelling with SAP HANA-only objects and renewed, intuitive UIs in Eclipse and SAPUI5

Non-disruptive transition and switch into the edition for SAP HANA

      • Restricted usage of classic / non-HANA functionality

 

 


 

Run SAP BW 7.5 in a future-ready and recommended way

      • Use only the new SAP HANA-specific modelling and data flow objects
      • No overhead from legacy objects
      • Optimized data loads with high flexibility and agile administration
      • Intuitive Eclipse-based modelling environment for the main objects and the query
      • All administration tools based on SAPUI5
      • New BW on HANA-optimized Business Content

Helps to adopt the latest best practices in a simplified mode

 

 

 

Choose the Right Path towards SAP BW 7.5, edition for SAP HANA

 


Existing SAP BW on AnyDB

      • Non-disruptive Upgrade to BW on HANA as starting point towards SAP BW, edition for SAP HANA

Existing SAP BW on HANA

      • Upgrade to SAP BW 7.5 on HANA (or higher)
      • Switch to the SAP BW, edition for SAP HANA prepare mode
      • Use the new tools to convert existing classic objects
      • Check tool to verify that the prerequisites are fulfilled
      • Restricted usage of classic objects

New Installation / Greenfield

      • Install SAP BW 7.5 on HANA (or higher)
      • SAP BW, edition for SAP HANA as the default

 

 

Snapshot reporting on APO Live Cache data


Scenario: The majority of business scenarios require analyzing the live cache planning book data over a period of time; this enables the business to take informed decisions and plan its activities effectively.

 

Problem Statement: With traditional BW datasources, we have many limitations in reading live cache data. The most prominent one is the inability of the datasource to capture changes (called delta in BW terminology), because of which the whole dataset has to be read to move a copy to BW. The performance of a BW extractor reading the live cache, combined with the amount of data to be read on a daily basis, makes this a non-scalable and inefficient solution in SAP BI.

 

Alternative Approach: SAP has introduced the ability to replicate live cache data into a transparent table using SAP infocenter replication reports. This functionality can be used provided the APO system runs on a HANA database. It can be treated as an alternative to a BW datasource, with SAP using the capabilities of HANA to make the replication faster and to track deltas.

 

Here are the high-level steps to create a replication model; these steps have already been detailed in many other documents shared on SDN. In this document we will focus on generating a snapshot of the data, as opposed to reporting on the 'as is' data, which is a plug-and-play solution already configured by SAP.

 

High-level steps:

  1. Create a replication model with Tcode /SAPAPO/REPL_MOD_VIE.
  2. Specify the connection as ‘Default’; do not use the external connection.
  3. Add the key figures you need to the replication model.


4.  Activate the model in Tcode /SAPAPO/REPL_MODEL_MANAGE.



5.     Run the replication with Tcode /SAPAPO/REPL_REPLICATION_MODEL.


 

Generating a Snapshot:

By default, the SAP infocenter is configured to replicate the planning book into a HANA table that stays in the HANA schema. However, it is technically possible to replicate this table into the SAP schema, where it can then be used for snapshotting in BW.

 


 

Steps to generate the replication table in the SAP schema:

  1. Choose the ‘default’ connection while replicating the table in Tcode /SAPAPO/REPL_MOD_VIE.
  2. The table is generated in the SAP schema and is accessible via SE11. Search for tables with the prefix “/1APO*” using the F4 value help.
  3. You will find a table whose description matches your replication model.


 

 

Pulling this data into BW for snapshots:

  1. Create a view on the table generated by the replication model.
  2. Create a custom datasource on the view created on top of the replication model.
  3. Create a DSO and load it from the datasource created above.

 

Risk: As per SAP, every time the model is deactivated and activated again, a new table is generated. This is an issue for BW, as the view cannot be changed in a production environment.

Mitigation: Instead of the view-based datasource, we created a function module based datasource that can read any specified table and pull the data. The name of the table to be read can be maintained in the TVARV table or in a custom Z table that the application support team can maintain in the live environment. A sketch of such an extractor is shown below.
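
For illustration, here is a minimal sketch of such a function module based extractor. Everything in it is an assumption made for this example: the function module name ZBW_GET_SNAPSHOT_DATA, the extract structure ZAPO_SNAPSHOT and the TVARVC parameter ZBW_APO_SNAPSHOT_TABLE are hypothetical, and the interface is modeled on the standard simple-extractor pattern (compare RSAX_BIW_GET_DATA_SIMPLE), simplified to return all data in a single package.

FUNCTION zbw_get_snapshot_data.
*"--------------------------------------------------------------------
*" Hypothetical generic extractor for the snapshot scenario.
*" Assumed interface (modeled on the simple-extractor pattern):
*"   TABLES     e_t_data STRUCTURE zapo_snapshot
*"   EXCEPTIONS no_more_data
*"              error_passed_to_mess_handler
*"--------------------------------------------------------------------

  STATICS sv_data_sent TYPE c LENGTH 1.

  DATA lv_tabname TYPE tabname.

* Read the name of the currently generated "/1APO/*" replication table
* from TVARVC, so that a regenerated table only requires a customizing
* change instead of a transported change to a DDIC view.
  SELECT SINGLE low FROM tvarvc
    INTO lv_tabname
    WHERE name = 'ZBW_APO_SNAPSHOT_TABLE'
      AND type = 'P'.
  IF sy-subrc <> 0 OR lv_tabname IS INITIAL.
    RAISE error_passed_to_mess_handler.
  ENDIF.

  IF sv_data_sent IS INITIAL.
*   Simplified: return the whole table in one data package. A productive
*   version would read in packages and apply the selections passed in
*   by the extraction framework.
    SELECT * FROM (lv_tabname)
      INTO CORRESPONDING FIELDS OF TABLE e_t_data.
    sv_data_sent = 'X'.
  ELSE.
*   Second call: signal the end of the extraction.
    CLEAR sv_data_sent.
    RAISE no_more_data.
  ENDIF.

ENDFUNCTION.

The generic datasource would then typically be created in transaction RSO2 with extraction by function module, and the table name parameter would be maintained in transaction STVARV (table TVARVC) by the application support team whenever the replication model is regenerated.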

 

Conclusion: This model enables trend reporting on planning book data; we can now store various snapshots of the planning books using the delta replication capability offered by the SAP infocenter replication jobs. However, this model is not intended for real-time data reporting; that is a plug-and-play solution offered by SAP via the HANA views that are automatically generated with the replication models.


SAP First Guidance Collection for SAP BW powered by SAP HANA


 

SAP First Guidance

 

The First Guidance documents are the beginning of a series of documents that should help you better understand the various concepts of SAP BW powered by SAP HANA. The documents are still “work in progress”, so these guides are not intended to be exhaustive. The purpose of these documents is to deliver additional information, besides SAP Help and blogs, to give a better understanding of the concepts of SAP BW on SAP HANA.

 

 

This content replaced the retired "SAP HANA Cookbook".

End-to-End Implementation Roadmap for SAP BW

See also the Customer Adoption Journey Map

 

 

 

 

SAP First Guidance - Business Warehouse on SAP HANA Installation
(Updated version: now reflects the 7.31/7.40 releases as well, the usage of the SL Toolset (SWPM, SUM, etc.), and the current SAP HANA 1.0 SP08 release and its actual revisions.)

BW on HANA has been available since mid-April 2012, and this document has been enhanced a few times since the first version.
On the basis of this information, this "SAP First Guidance" document is released to support Basis administrators in running and configuring a Business Warehouse on an SAP HANA system. SAP First Guidance - Business Warehouse on SAP HANA Installation provides answers to major questions and issues, as well as workarounds and additional details, to complement the standard SAP guides and SAP Notes. This SAP First Guidance document has no claim of completeness, but it is the most complete starting point for a successful implementation.

For more information please contact roland.kramer@sap.com

 

 

SAP First Guidance - Migration BW on HANA using the DMO option in SUM


New version 1.7x is available, using the latest SUM/DMO SP13/14 features and the new UI

SAP First Guidance - Using the new DMO to Migra... | SCN


The database migration option (DMO) is an option inside the Software Update Manager (SUM) for a combined update and migration: update an existing SAP system to a higher software release and migrate it to the SAP HANA database in one technical step. As the technical SUM steps are the same, this “SAP First Guidance” document should make all customer-specific documentation obsolete. It is the complementary documentation to the existing Notes and SUM/DMO upgrade guides.

DMO can be used with every BW release from 7.0x onwards. It makes the two-step approach (upgrade, then migration) and the usage of the BW post-copy automation (BW-PCA) obsolete. It can also be used within a release, e.g. to migrate from 7.40 SP06 on anyDB to 7.40 SP07 on HANA.

For more information please contact roland.kramer@sap.com

 

 


SAP First Guidance - Implementing BW-MT as the new SAP BW Modeling Experience

SAP BW 7.4 SP08 powered by SAP HANA is the next milestone for enterprise data warehousing with BW on HANA and provides the next level of simplification for BW. In addition, the SAP BW on HANA Virtual Data Warehouse capabilities have been enhanced for even more flexibility. In the mid term, the advanced DSO shall replace the main BW InfoProviders with persistency (InfoCube, DSO, PSA). The classic InfoProviders are of course still supported and co-exist with the advanced DSOs.

For more information, please contact roland.kramer@sap.com

 



SAP First Guidance - SAP NLS Solution with Sybase IQ

This “SAP First Guidance” document provides all the necessary information to help quickly implement this newly released option: store historical BW data on an external IQ server to optimize the system size when preparing to migrate to SAP BW powered by SAP HANA. The SAP NLS solution with Sybase IQ also helps to keep down the TCO of investing in SAP In-Memory technology, as historical (“frozen”) data is stored on a less sophisticated device, which acts like a “BW Accelerator on disk”. SAP Sybase IQ is therefore the perfect smart store for this kind of data. Please note that the SAP NLS solution can be used with all database versions supported by SAP BW 7.3x; SAP HANA is not necessary. The document is a “work in progress” and is not intended to be exhaustive. It does however contain all information required for a successful implementation of the SAP-NLS solution with Sybase IQ. SAP-NLS can be used with every SAP-supported database, where SAP IQ will be the secondary database for data reallocation.

For more information, please contact roland.kramer@sap.com

 

 

SAP First Guidance - SEM/BW Modelling in SolMan 7.1 with MOPZ

Due to constant questions about the upgrade to NetWeaver 7.3x and 7.40 including SEM add-on components, we created an SAP First Guidance document that describes the successful definition of an SAP BW 7.0/7.01 system with the SEM add-on installed on top. With this information you will be able to integrate the stack.xml into the SUM (Software Update Manager) process and into the DMO (database migration option) process included in SUM, as the first input is the location of the stack.xml, which defines the download directory for SUM. Furthermore, the interaction of the stack.xml in the upgrade process enables a smooth integration into the upgrade process included in DMO.

For more information please contact roland.kramer@sap.com

 

 

 

 

SAP First Guidance - BW Housekeeping and BW-PCA

The system copy procedure of SAP BW systems and landscapes is complex for a number of reasons. There is a large number of configuration settings (such as connections and delta queue handling for data loading) and system copy scenarios of SAP BW (each with different landscape aspects).  These have to be handled as part of every system copy, regardless of whether the system copy is part of the migration to SAP HANA or if you want to perform regular system copies of your SAP BW landscape. BW-PCA can be used from SAP BW 7.0 onwards depending on the SPS level.

Additionally, see the usage and implementation of the BW Housekeeping task list and the pre/post task lists for upgrade/migration. These are included in the database migration option (DMO) as part of the Software Update Manager (SUM). Additional details can also be found on the BW ALM page.

For more information please contact roland.kramer@sap.com

 

------------------------------------------------------------------------------------------------------------

 

 

SAP First Guidance SAP BW 7.40 on HANA - HANA Analysis Processes

Starting with SAP BW 7.4 SP5 on HANA, a new feature to analyze data from certain perspectives, for example to calculate ABC classes or scoring classes, is introduced. This new feature is called the SAP HANA Analysis Process and enables the data warehouse modeler to analyze the data using different predefined or self-written functions or scripts. HANA provides integration with numerous specialized libraries such as PAL, AFL and R to understand the correlation of the data in the existing EDW.

 

SAP First Guidance - SAP BW 7.40 on HANA - View Generation

The focus of this document is the exposure of BW data natively in HANA as HANA views that point directly to the data and tables managed by BW. This enables HANA-native consumption of BW data.

 

 

 

SAP First Guidance - SAP BW 7.30 on HANA - Inventory InfoCubes

This SAP First Guidance document describes the handling of non-cumulative InfoCubes on SAP HANA. As the handling of inventory InfoCubes changed with SAP BW 7.30 based on SAP HANA, this document briefly describes the differences. For a better understanding, it is recommended to read the how-to guide based on the previous release, How to Handle Inventory Management Scenarios in BW.

 

 

SAP First Guidance - SAP BW 7.40 SP05 on HANA - OpenODSView

In SAP BW 7.4 SP05 on HANA the new metadata object Open ODS View is introduced, which provides the data warehouse modeler with a flexible, easy to use tool to integrate external data in the EDW Core Data Warehouse.

 

 

SAP First Guidance - SAP BW 7.40 SP05 powered by SAP HANA - CompositeProvider

A CompositeProvider is an InfoProvider in which you can combine data from BW InfoProviders such as InfoObjects, DataStore Objects, SPOs and InfoCubes, or SAP HANA views such as Analytical or Calculation Views using join or union operations to make the data available for reporting. This paper explains how to create a CompositeProvider in SAP BW 7.4 SP5 using the BW modeling tools in SAP HANA Studio.

SAP First Guidance - Implementing BW-MT for BW-aDSO


SAP BW 7.4 SP8 powered by SAP HANA is the next milestone for enterprise data warehousing with BW on HANA and provides the next level of simplification for BW. In addition, the SAP BW on HANA Virtual Data Warehouse capabilities have been enhanced for even more flexibility. In the mid term, the advanced DSO shall replace the main BW InfoProviders with persistency (InfoCube, DSO, PSA). The classic InfoProviders are of course still supported and co-exist with the advanced DSOs.

View this Document

SAP BW on HANA - Complete Reference Guide


This article provides a detailed overview of BW on HANA. It can be used when modelling HANA-based scenarios. Various BW 7.4 objects are explained in detail. It can be a handy reference for BW consultants and project managers who implement BW on HANA.

View this Document

SAP Business Warehouse 7.5


Introduction

 

SAP BW powered by SAP HANA was the very first SAP application that ran completely on top of SAP HANA and has meanwhile reached great adoption. For many customers it has become the entry point for their SAP HANA roadmap. Customers benefit not only from fantastic performance improvements for data loads, query runtime and planning functions, but also from simplified data modeling, as the number of InfoProviders is greatly reduced and all modeling is done in a new, modern UI that is uniform across all modeling tools. At the same time, data layers are reduced, and with the new virtualization capabilities BW can also serve as the “Logical Data Warehouse”.

 

Read more about SAP's overall data warehouse strategy and find out about the highlights of SAP BW 7.5 in the SAP BW 7.5 SP1 powered by SAP HANA Overview and Roadmap.

 

 

HANA Data Warehousing: The HANADW

 

In this blog, Thomas Zurek sheds some light on the direction that SAP is taking towards a unified offering for building a data warehouse on top of HANA. The unofficial working title is the HANA DW. The blog is divided into three sections, each addressing the most pressing questions from customers who have already seen flavours of this.


 

 

 

 

Ramp Up

 

Ramp-Up information is available on the Ramp-Up page.

 

 

 

 

 

 

 

Roadmap

 


 


Highlights


SAP S/4HANA Analytics & SAP BW Data Integration


This guide gives an overview of different SAP S/4HANA Analytics & SAP BW Data Integration scenarios and shows how to implement them. The focus is on integrating data provided by SAP S/4HANA Analytics with corresponding data in SAP BW using the example of ERP – Sales & Distribution.

View this Presentation
