
Friday 14 September 2012

Easy way to remember all the main SAP tables used for any development


This will serve as an easy way to remember all the main SAP tables used for any development.

Remember Bank and accounting tables start with B, say "BNKA, BKPF".
Remember Customer tables start with K, say "KNA1, KNB1".
Remember Material tables start with M, say "MARA, MAKT, MARC".
Remember configuration (customizing) tables start with T, say "T001, T001W".
Remember Purchasing tables start with E, say "EKKO, EKPO".
Remember Sales tables start with V, say "VBAK, VBAP".
Remember Vendor tables start with L, say "LFA1".

Six important FI tables (the open/cleared item index tables):

They contain an I if they hold open items, say "BSIS".
They contain an A if they hold cleared (closed) items, say "BSAS".
They contain an S if they are G/L account tables, say "BSIS, BSAS".
They end with a D if they are customer tables, say "BSID, BSAD".
They end with a K if they are vendor tables, say "BSIK, BSAK".

Important table names of Billing, Delivery, Sales and Purchasing:

Each table ends with a K if it holds header data, say "VBAK, LIKP, VBRK, EKKO".
Each table ends with a P if it holds item data, say "VBAP, LIPS, VBRP, EKPO" (see the short example below).
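
A minimal classic-ABAP sketch of the header/item convention in practice: reading a sales order header (VBAK) and its items (VBAP), joined on the shared document number VBELN. The report name and order number are placeholders for illustration only.

REPORT zdemo_header_item.

* Placeholder order number, for illustration only.
PARAMETERS p_vbeln TYPE vbak-vbeln.

TYPES: BEGIN OF ty_item,
         vbeln TYPE vbak-vbeln,   " document number (shared key)
         posnr TYPE vbap-posnr,   " item number
         matnr TYPE vbap-matnr,   " material
       END OF ty_item.
DATA: lt_items TYPE TABLE OF ty_item,
      ls_item  TYPE ty_item.

* Header table ends with K (VBAK), item table ends with P (VBAP).
SELECT k~vbeln p~posnr p~matnr
  INTO TABLE lt_items
  FROM vbak AS k INNER JOIN vbap AS p ON p~vbeln = k~vbeln
  WHERE k~vbeln = p_vbeln.

LOOP AT lt_items INTO ls_item.
  WRITE: / ls_item-vbeln, ls_item-posnr, ls_item-matnr.
ENDLOOP.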




Delta Errors

Error 1: Delta invalidation (Data mart scenario):

To solve the problem, SAP recommends reinitializing the load:
1) Delete the requests from the cube that have the ODS as source. (Otherwise you will get duplicate data.)
2) Delete the initialization in the InfoPackage for the ODS InfoSource.
3) Schedule a new init from the ODS to the InfoCube.

Workaround (customer's responsibility; SAP won't support it in case of issues with this procedure).
You should ensure:

- The requests that have genuinely not been loaded to the target cube are still available in the PSA for the loads to the ODS, and
- you are deleting only the requests that have not already been loaded.

Solution:

1. Delete ONLY the requests from the source ODS that have NOT already been loaded to the target cube. If the requests you need to delete are still available in the PSA for the load to the DSO, they can be reloaded to the target in a later step (see step 4 below). If all requests are already loaded to the target and none are missing, you don't need to do anything in this step; proceed with step 2.
2. You now need to delete the existing init request for the data mart load. To do this, in the InfoPackage (just take one if there are several) reset the delta management via menu 'Scheduler' -> 'Init options for source system'.
3. Now execute an init load without data transfer: in the InfoPackage set the extraction mode to "Initialize without Data Transfer" and schedule the InfoPackage. This fills RSDMDELTA with ALL requests contained in the source ODS's change log at that time. In other words, this process resets the DataMart flag in the source ODS for the requests that have already been loaded.
4. Now reload the requests from the PSA to the source ODS that you deleted in step 1 (if you deleted any).
5. The next DataMart delta from the source ODS should load these requests into the targets.

Error 2: Duplicate data in case of repeat delta:

If a delta load fails, it does NOT mean that a repeat delta should automatically be executed. Before you execute a repeat delta, analyse whether the data was already extracted correctly and whether the corresponding tables (ROOSPRMSC) were updated.
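
The table can also be inspected programmatically in the OLTP system before requesting the repeat. A minimal sketch, assuming the DataSource name; the key field OLTPSOURCE follows the usual RO* naming convention, but verify the structure in SE11 first.

REPORT zcheck_roosprmsc.

* Placeholder DataSource; OLTPSOURCE as key field is an assumption.
PARAMETERS p_ds TYPE roosource-oltpsource.

DATA: lt_prm TYPE TABLE OF roosprmsc,
      lv_cnt TYPE i.

SELECT * FROM roosprmsc INTO TABLE lt_prm
  WHERE oltpsource = p_ds.
DESCRIBE TABLE lt_prm LINES lv_cnt.
WRITE: / 'ROOSPRMSC entries for', p_ds, ':', lv_cnt.
* Inspect the init flags/timestamps of these rows (e.g. in SE16)
* to decide whether the previous delta was extracted correctly.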

Error 3: DataMart repeat request returns 0 records

After an incorrect delta, the repeat delta does not fetch the delta records. Reset the Data Mart status of the source request corresponding to the deleted Data Mart request in the administration view of the source object.

Error 4: There are already full uploads in the ODS object and therefore inits or deltas cannot be posted or activated.

You have already loaded full requests into the ODS. Due to a business requirement, you want to change this to delta. Once a full upload has occurred for an ODS object, you can no longer activate an init for the same DataSource/source system combination in this ODS object. The system does not check for overlapping selection conditions. Execute report RSSM_SET_REPAIR_FULL_FLAG, which converts all full loads into repair full requests.

Error 5: Unable to do compression though the delta seems to be updated in the data target

When a Delta-DTP is started, pointer DMEXIST is increased to prevent the deletion of the source-request. As long as there is still a Delta-DTP or Export-Data Source that has NOT yet picked up the request, DMALL is not updated and the request cannot be compressed. When ALL Delta-DTPs AND Export-Data Sources have picked up the source-request, the pointer DMALL is increased.

So, if you are unable to compress due to delta errors, check that the deltas to all downstream targets were successful. Even if one downstream target has not been filled with delta data, compression is not allowed.

For InfoCubes, the contents of the RSMDATASTATE-DMEXIST and RSMDATASTATE-DMALL fields must always be equal to the largest request in the RSDMDELTA table of the relevant cube (ICNAME field).

Only then is compression possible.

In general, in table RSMDATASTATE:
DMALL <= DMEXIST <= QUALOK.

Tables to be checked in case of a 3.x flow: RSDMDELTA / RSDMSTAT.
Tables to be checked in case of a 7.x dataflow: RSBKREQUEST and RSSTATMANREQMAP.

For ODS objects, the contents of the RSBODSLOGSTATE-PROCESSED_ALL and RSBODSLOGSTATE-PROCESSED_ONE fields must always be equal to the largest request in the RSDMDELTA table. In general, in table RSBODSLOGSTATE:
PROCESSED_ALL <= PROCESSED_ONE <= ACTIVE.
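
As a quick consistency check, here is a sketch in classic ABAP that reads RSMDATASTATE for a given InfoCube and verifies the invariant above. The cube name is a placeholder, and the key field INFOCUBE is assumed from the table's standard layout.

REPORT zcheck_dm_pointers.

PARAMETERS p_cube TYPE rsmdatastate-infocube.

DATA ls_state TYPE rsmdatastate.

SELECT SINGLE * FROM rsmdatastate INTO ls_state
  WHERE infocube = p_cube.
IF sy-subrc <> 0.
  WRITE: / 'No RSMDATASTATE entry for', p_cube.
ELSEIF ls_state-dmall   <= ls_state-dmexist AND
       ls_state-dmexist <= ls_state-qualok.
  WRITE: / 'OK: DMALL <= DMEXIST <= QUALOK holds.'.
ELSE.
  WRITE: / 'Invariant violated - compare against RSDMDELTA.'.
ENDIF.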

Error 6: Last delta incorrect. A repeat must be requested. This is not possible.

For some 0FI* extractors the delta has failed and you want to repeat the delta, which is not possible. In the standard system, you can repeat the delta extractions for the last sixty days. The number of days is defined by the 'DELTIMEST' parameter in the BWOM_SETTINGS table. If you still want to repeat the delta, you need to change the time stamp. Please refer to note 860500 for more details.
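
To see the current value before changing anything, a small lookup sketch; the column names PARAM_NAME / PARAM_VALUE are assumed from the usual layout of BWOM_SETTINGS, so verify them in SE11 first.

REPORT zshow_deltimest.

* Column names are an assumption - check BWOM_SETTINGS in SE11.
DATA lv_value TYPE bwom_settings-param_value.

SELECT SINGLE param_value FROM bwom_settings INTO lv_value
  WHERE param_name = 'DELTIMEST'.
IF sy-subrc = 0.
  WRITE: / 'DELTIMEST =', lv_value.
ELSE.
  WRITE: / 'DELTIMEST not maintained.'.
ENDIF.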

Error 7: Even after init simulation, further deltas have not fetched relevant data

After an initialization without data transfer, the delta has not fetched data. You can check whether the DataSource is designed to perform init simulations by checking the field ROOSOURCE-INITSIMU for the respective DataSource. If it is not enabled and there is delta invalidation, the only option is to re-init with data transfer.
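
A small sketch for that check, reading the active (OBJVERS = 'A') version of the DataSource; the DataSource name is a placeholder.

REPORT zcheck_initsimu.

PARAMETERS p_ds TYPE roosource-oltpsource.

DATA lv_initsimu TYPE roosource-initsimu.

SELECT SINGLE initsimu FROM roosource INTO lv_initsimu
  WHERE oltpsource = p_ds
    AND objvers    = 'A'.
IF sy-subrc <> 0.
  WRITE: / 'DataSource not found:', p_ds.
ELSEIF lv_initsimu = 'X'.
  WRITE: / p_ds, 'supports init simulation.'.
ELSE.
  WRITE: / p_ds, 'does NOT support init simulation - re-init with data transfer.'.
ENDIF.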

Error 8: Missing delta records - in generic delta with calendar day.

Using "Calendar day" as the delta-relevant field has certain disadvantages compared to using time stamps: records posted on the day of extraction, after the delta run, can be missed. If there is an adequate time stamp field in the source structure, use that field as the delta-relevant field instead.

Error 9: Dump while execution of delta info package / opening an info package.

"MESSAGE_TYPE_X" "SAPLRSSM" or "LRSSMU36" "RSSM_OLTPSOURCE_SELECTIONS"
Reasons: improper system copy / terminated initialization / overlapping initialization selections.

This dump happens due to inconsistent entries in tables RSSDLINIT (BW system) and ROOSPRMSC / ROOSPRMSF (OLTP system).

Error 10: Delta is lost after a system copy:

You have copied a source system, but the BW system was not copied. For delta to work from the newly copied source system, repair the delta queue in the BW system: execute report RSSM_OLTP_INIT_DELTA_UPDATE for each DataSource that has an active init request, and provide the field 'ALWAYS' with the value 'X'. This pushes the init/delta information of the BW into the source system, which allows you to continue loading deltas or to do a new init.

This report updates the delta tables in the source, and from a technical point of view delta upload will then be possible. Be aware, however, that in case of a system copy (if you copy only one system, not both) you cannot compare the data between the source and BW systems; a new init is suggested.
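
If many DataSources are affected, the report can also be submitted from a small wrapper. This is only a sketch: the procedure above names only the field 'ALWAYS'; the parameter name for the DataSource (p_osource below) is hypothetical, so check the report's actual selection screen in SE38 before using SUBMIT.

* Sketch only - verify the real selection-screen names in SE38.
SUBMIT rssm_oltp_init_delta_update
  WITH p_osource = '2LIS_11_VAITM'   " hypothetical parameter name and DataSource
  WITH always    = 'X'               " field named in the procedure above
  AND RETURN.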

Error 11: How to load delta from a failed delta request:

The delta has missing records, and it is not the latest delta, as we do not know when the problem occurred.

Error 12: CO-PA Data source: No new data since last delta update (Though the delta data exists!)

The first delta has failed, and the repeat delta has fetched 0 records even though there are records in the source. It is normal that the first delta after a failed one finishes successfully but extracts zero records. The reason is that this is a repeat from the delta queue: because the previous delta run did not extract anything (due to the status of the DataSource in KEB2), the delta queue was empty, so the repeat just extracts the empty delta queue. After such a successful repeat, simply execute another delta run; it should extract the desired data.

Error 13: Write-optimized DSO has not fetched delta

You need to run the report RSSM_WO_DSO_REPAIR_RSBODSLOG, which generates the missing entries.

Questions:

1. What are the possible reasons for delta invalidation?

Delta management can be invalidated for several reasons, for example:
1) InfoCube: a request is compressed that has not yet been delta-updated to all downstream targets.
2) A request is deleted that had already been delta-updated to one or more downstream targets.

How to find the delta invalidation:

To check whether delta management is invalidated, look at field DMDELTAINA (delta status) in table RSDMSTAT: it will be set to 'X'. Also, the table RSDMDELTA will not contain any data for this source/target combination.
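
A sketch that lists how many source/target combinations are currently invalidated; only the field DMDELTAINA from above is used, everything else is generic.

REPORT zlist_invalid_deltas.

DATA: lt_stat TYPE TABLE OF rsdmstat,
      lv_cnt  TYPE i.

* DMDELTAINA = 'X' marks an invalidated data mart delta.
SELECT * FROM rsdmstat INTO TABLE lt_stat
  WHERE dmdeltaina = 'X'.
DESCRIBE TABLE lt_stat LINES lv_cnt.
WRITE: / lv_cnt, 'invalidated source/target combination(s) found.'.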

2. Is it good to go for a repair full request / re-init in case of missing delta?

It depends on factors such as the volume of the missing delta, the possibility of selective deletion, and the type of downstream data target.
Note that you should only use a repair full request if you have a small number of incorrect or missing records. Otherwise, SAP always recommends a re-initialization (possibly after a previous selective deletion, followed by a restriction of the delta-init selection to exclude areas that were not changed in the meantime).
Please refer to note 739863 for further details.

3. Is it possible to do initialization without data transfer in a DTP?

Unlike delta transfer using an InfoPackage, an explicit initialization of the delta process is not necessary for delta transfer with a DTP. When the data transfer process is executed in delta mode for the first time, all existing requests are retrieved from the source and the delta status is initialized. This implies that, in order to re-init, all existing requests must be deleted.

If you want to execute a delta without transferring data, analogous to the simulation of the delta initialization with the InfoPackage, select No data transfer; delta status in source: fetched as processing mode. This processing mode is available when the data transfer process extracts in delta mode. A request started like this marks the data that is found in the source as fetched, without actually transferring it to the target.

If you don't need the delta DTP anymore, you can execute report RSBK_LOGICAL_DELETE_DTP for the delta DTP that you want to deactivate. Afterwards, the delta DTP can no longer be executed, and requests that have already been transferred are no longer taken into account in the calculation of the data mart water level.
For the data mart, the system behaves as if the delta DTP never existed.
Please refer note 1450242 for further details.

4. Important tables in Delta / Data mart Scenarios:

RSSDLINIT: (BW system): Last Valid Initializations to an OLTP Source
ROOSPRMSC (OLTP system): Control Parameter Per DataSource Channel
ROOSPRMSF : Control Parameters Per DataSource
RSSDLINITSEL: Last Valid Initializations to an OLTP Source
RSDMDELTA: Data Mart Delta Management
This table contains the list of all the requests which are already fetched by the target system.
ROOSGENDLM: Generic Delta Management for DataSources (Client-Dep.)
RSBKREQUEST: DTP Request
RSSTATMANREQMAP: Mapping of Higher- to Lower-Level Status Manager Request
RSDMSTAT: Data mart Status Table
This table is used for processing a Repeat Delta. Using field DMCOUNTER we can follow up in RSDMDELTA.
RSMDATASTATE: Status of the data in the InfoCubes
This table is used to know the state of the data target regarding compression, aggregation etc. For data marts the two fields DMEXIST and DMALL are used.
RODELTAM: BW Delta Process
ROOSGEN: Generated Objects for OLTP Source
RSSELDONE: Monitor: Selections for executed request
RSSTATMANPART: Store for G_T_PART of Request Display in DTA Administration
RSBODSLOGSTATE: Change log Status for ODS Object
ROOSOURCE: Table Header for SAP BW OLTP Sources
Mainly used to get data regarding the initialization of the extractor. It contains the header metadata of DataSources, e.g. whether the DataSource is capable of delta extraction.

5. Important corrections in BC-BW area:

1460643 Performance problems with asynchronous RFC
1250813 SAPLARFC uses all dialog work processes
1377863 Max no of gateways exceeded
1280898 IDoc hangs when it is to be sent immediately with
1055679 IDoc-tRFC inbound with immediate processing may not
1051445 qRFC scheduler does not use all available resources
1032638 Myself extraction: Cursor disappears, IDocs in
995057 Multiple execution of tRFC
977283 Outbound scheduler remains in WAITING status for a long time

Wednesday 12 September 2012

BEx

BI Statistics

Data Transfer Process (DTP)

Process Chain

Routines

Transformation: Rule Type & Rule Group

Delta Process

Remodeling

Data Warehousing: Step by Step

Performance

Broadcasting BI Content

Integrating Content from BI into the Portal

Data Analysis with Microsoft Excel

Query Design

OLAP

Time stamp error

Reason

The "Time Stamp" error occurs when the transfer rules or transfer structure are internally inactive in the system.
It can also occur whenever a DataSource is changed on the R/3 side or a DataMart is changed on the BW side. In that case the transfer rules show an active status when checked, but they are actually inactive, because the time stamps of the DataSource and the transfer rules differ.


Solution 

1. Go to RSA1 --> Source system --> Replicate DataSource


2. Run the program RS_TRANSTRU_ACTIVATE_ALL 


3. Enter the source system and InfoSource, then execute.


The transfer structure is then activated automatically; proceed with the reload, which should now succeed.
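
Steps 2 and 3 can also be scripted. This is only a sketch: the selection-screen parameter names below are hypothetical, so verify the actual selection screen of RS_TRANSTRU_ACTIVATE_ALL in SE38 before use.

* Sketch only - the parameter names below are hypothetical.
SUBMIT rs_transtru_activate_all
  WITH p_logsys = 'R3CLNT100'   " hypothetical source system
  WITH p_isourc = 'ZIS_SALES'   " hypothetical InfoSource
  AND RETURN.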

LO COCKPIT

The Logistics Cockpit (LC) is the technique used to extract logistics transaction data from R/3.

All the Data Sources belonging to logistics can be found in the LO Cockpit (Transaction LBWE) grouped by their respective application areas. 


The DataSources for logistics are delivered by SAP as part of its standard business content in the SAP ECC 6.0 system and have the following naming convention. A logistics transaction DataSource is named as follows: 2LIS_<Application>_<Event><Suffix>, where:
  • Every LO Data Source starts with 2LIS. 
  • The application is specified by a two-digit number that identifies the application relating to a set of events in a process, e.g. application 11 refers to SD sales. 
  • The event specifies the transaction that provides the data for the application and is optional in the naming convention, e.g. event VA refers to creating, changing or deleting sales orders ("Verkauf, Auftrag" stands for sales order in German). 
  • The suffix specifies the details of the information that is extracted, e.g. ITM refers to item data, HDR to header data, and SCL to schedule lines.
Upon activation of the business content DataSources, all components such as the extract structure and extractor program are also activated in the system.

The extract structure can be customized to meet specific reporting requirements at a later point of time and necessary user exits can also be made use of for achieving the same.

A generated extract structure has the naming convention MC<Application><Event>0<Suffix>, where the suffix is optional. Thus, e.g., 2LIS_11_VAITM (sales order item) has the extract structure MC11VA0ITM.
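
As a tiny illustration of the convention (nothing more than string assembly):

* Illustration: building the extract structure name for 2LIS_11_VAITM.
DATA: lv_appl   TYPE string VALUE '11',    " application (SD sales)
      lv_event  TYPE string VALUE 'VA',    " event (sales order)
      lv_suffix TYPE string VALUE 'ITM',   " suffix (item data)
      lv_name   TYPE string.

CONCATENATE 'MC' lv_appl lv_event '0' lv_suffix INTO lv_name.
WRITE: / lv_name.   " -> MC11VA0ITM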

Delta Initialization:
  • LO Data Sources use the concept of setup tables to carry out the initial data extraction process. 
  • The restructuring/setup tables prevent the BI extractors from directly accessing the frequently updated, large logistics application tables; they are used only for the initialization of data to BI.
  • For loading data first time into the BI system, the setup tables have to be filled.
Delta Extraction:
  • Once the initialization of the logistics transaction DataSource has been successfully carried out, all subsequent new and changed records are extracted to the BI system using the delta mechanism supported by the DataSource. 
  • The LO DataSources support the ABR delta mechanism, which is compatible with both DSOs and InfoCubes. The ABR delta creates after, before and reverse images that are updated directly to the delta queue, which is generated automatically after a successful delta initialization.
  • The after image provides the status after the change, the before image gives the status before the change with a minus sign, and the reverse image sends the record with a minus sign for deleted records (see the worked example after this list).
  • The delta provided by the LO DataSources is a push delta, i.e. the delta data records from the respective application are pushed to the delta queue before they are extracted to BI as part of the delta update. Whether a delta is generated for a document change is determined by the LO application. This is a very important aspect of the logistics DataSources, as the very program that updates the application tables for a transaction also triggers/pushes the data for the information systems, by means of an update type, which can be a V1 or a V2 update. 
  • The delta queue for an LO DataSource is generated automatically after a successful initialization and can be viewed in transaction RSA7, or in transaction SMQ1 under the name MCEX<Application>.
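
A worked example of the ABR images: suppose a sales order item is created with quantity 10 and later changed to 15. The creation sends a new image (+10); the change sends a before image (-10) and an after image (+15). For an InfoCube the images are simply added up (10 - 10 + 15 = 15, the correct current value), while a DSO takes the after image to overwrite the previous value. This is why the ABR delta is compatible with both target types.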
Update Method

The following three update methods are available

Synchronous update (V1 update) 
Asynchronous update (V2 update) 
Collective update (V3 update)

Synchronous update (V1 update)

The statistics update is carried out at the same time as the document update in the application tables; whenever we create a transaction in R/3, the entries get into the R/3 tables as part of the V1 update.

Asynchronous update (V2 update)

The document update and the statistics update take place in different tasks. The V2 update starts a few seconds after the V1 update; with this update the values get into the statistical tables, from which we extract into BW.
V1 and V2 updates do not require any scheduling activity.

Collective update (V3 update)

The V3 update uses delta queue technology and is similar to the V2 update. The main difference is that V2 updates are always triggered by applications, while the V3 update may be scheduled independently.

Update modes

Direct Delta 
Queued Delta 
Unserialized V3 Update


Direct Delta
  • With this update mode, extraction data is transferred directly to the BW delta queues with each document posting. 
  • Each document posted with delta extraction is converted to exactly one LUW in the related BW delta queues. 
  • In this update mode there is no need to schedule a job at regular intervals (through LBWE "Job control") in order to transfer the data to the BW delta queues; thus, additional monitoring of update data or the extraction queue is not required. 
  • This update method is recommended only for customers with a low occurrence of documents (a maximum of 10,000 document changes - creating, changing or deleting - between two delta extractions) for the relevant application. 
Queued Delta
  • With the queued delta update mode, the extraction data is written to an extraction queue and then transferred to the BW delta queues by an extraction collective run. 
  • If we use this method, it is necessary to schedule a job to regularly transfer the data to the BW delta queues, i.e. the extraction collective run. 
  • SAP recommends scheduling this job hourly during normal operation after a successful delta initialization, but there is no fixed rule: it depends on the peculiarities of each specific situation (business volume, reporting needs and so on).
Unserialized V3 Update
  • With the unserialized V3 update, the extraction data is written to an update table and then transferred to the BW delta queues by a V3 collective run.
Setup Table 

  • A setup table is a cluster table that is used to extract data from R/3 tables of the same application. 
  • Setup tables store the data before it is updated to the target system. Once you fill the setup tables, you need not go to the application tables again and again, which in turn improves system performance.
  • The LO extractor takes data from the setup tables during initialization and full upload. 
  • As setup tables are required only for full and init loads, we can delete the data after loading in order to avoid duplicate data. 
  • We fill the setup tables in LO by using OLI*BW, or via SBIW -> Settings for Application-Specific DataSources -> Logistics -> Managing Extract Structures -> Initialization -> Filling in the Setup Table -> Application-Specific Setup of Statistical Data. 
  • We can delete the setup tables by using transaction LBWG. You can also delete setup tables application-wise by going to SBIW -> Settings for Application-Specific DataSources -> Logistics -> Managing Extract Structures -> Initialization -> Delete the Contents of the Setup Tables. 
  • The technical name of a setup table is <ExtractStructure>SETUP. For example, for DataSource 2LIS_11_VAHDR the extract structure is MC11VA0HDR, so the setup table is MC11VA0HDRSETUP.
LUW
  • LUW stands for Logical Unit of Work. When we create a new document it forms a new image 'N'; whenever an existing document is changed it forms a before image 'X' and an after image ' ', and these after and before images together constitute one LUW.
Delta Queue (RSA7)
  • The delta queue stores records that have been generated since the last delta upload and have not yet been sent to BW. 
  • Depending on the method selected, generated records either come directly to this delta queue or go through the extraction queue.
  • The delta queue (RSA7) maintains two images: the delta image and the repeat delta. When we run a delta load in the BW system it sends the delta image; when a delta load fails and we request a repeat delta, it sends the repeat delta records.
Statistical Setup
  • The statistical setup is a program that is specific to an application component. Whenever we run this program it extracts all the data from the database tables and puts it into the setup tables.

Error calling number range object


Solution

1. Note down the InfoCube and dimension name.
2. Go to T-code RSRV -> All Elementary Tests -> Transactional Data, then double-click "Comparison of Number Range of a Dimension and Maximum DIMID" -> click the same on the right-side pane -> enter the InfoCube name and dimension name, click the Transfer button -> click "Correct Error" at the top.