Saturday, 11 June 2016

SAP - Project Lifecycle

An SAP project lifecycle consists of various stages, starting from evaluation through to the project's subsequent support.

SAP Project Lifecycle

Stages of SAP Project Lifecycle

A typical SAP project undergoes the following stages in its lifecycle −


Evaluation

Evaluation may be a decision to choose between different software vendors or a selection of products from a single vendor.

Project Preparation

Since an SAP implementation intends to map the organization's processes to the ones defined by SAP, the implementation needs to have on board people with complete knowledge of the organization's business processes. The project preparation phase, amongst other things, aims to identify this team.

Business Blueprint

A business blueprint includes which modules of an SAP product will be used and the mapping of the existing business processes to the processes provided by SAP.


Realization

The actual work of customizing the SAP software to be in sync with the organization's business processes is done in this phase. It includes customization of the existing SAP package and solution, along with the development of new objects based on requirements.


Testing

The changes made in the realization phase need to be tested in isolation as well as in a consolidated manner using real-time data. This is done in the testing phase.

Final Preparation

The production system is prepared using the changes from the realization and testing phases. Certain activities need to be done directly in the production system as well. These activities take place during the final preparation phase.


Go Live

In this stage, the final product is released to the end users. The go-live may be done in a Big Bang (all modules in one go) or in a phase-by-phase manner.

Sustain / Support

The project now moves into the "sustain and support" phase, where end users' issues are resolved and ongoing maintenance of the system is taken care of.

SAP Multiphase Implementation


Thursday, 9 June 2016

What is the difference between dynamic and static parameter

In SAP, parameters are used to set the configuration and define the functionality of an SAP system, such as the number of work processes, buffer sizes, the lock table size, etc.

Basically, there are two types of parameters:

Static parameters are those that do not take effect immediately when the value is set. A restart of the SAP system is required for them to take effect.

Dynamic parameters are those that take effect immediately at runtime. No restart of the SAP system is required for them to take effect.

Whether a parameter is dynamic or static can be checked in transaction RZ11.

In RZ11, enter the parameter name and display it. The screen shows a checkbox labelled "Dynamically switchable". If that checkbox is ticked, the parameter is dynamic and its value can be changed at runtime without restarting the system. If it is not ticked, the parameter is static and a restart of the SAP system is needed for the changes to take effect.
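The behavior described above can be sketched as a small model (this is illustrative Python, not SAP code; the parameter names below are examples of real profile parameters, but whether a given parameter is dynamically switchable depends on your release, so always verify in RZ11):

```python
# Illustrative model of static vs. dynamic profile parameter behavior.

class ProfileParameter:
    def __init__(self, name, value, dynamically_switchable):
        self.name = name
        self.active_value = value        # value the running system uses
        self.profile_value = value       # value written to the profile
        self.dynamically_switchable = dynamically_switchable

    def set_value(self, new_value):
        """Change the profile value; only dynamic parameters apply it immediately."""
        self.profile_value = new_value
        if self.dynamically_switchable:
            self.active_value = new_value  # takes effect at runtime

    def restart(self):
        """A system restart activates the profile value for any parameter."""
        self.active_value = self.profile_value


# Static parameter: the change only takes effect after a restart
static_param = ProfileParameter("rdisp/wp_no_dia", 10, dynamically_switchable=False)
static_param.set_value(12)
assert static_param.active_value == 10   # still the old value
static_param.restart()
assert static_param.active_value == 12   # now active

# Dynamic parameter: the change takes effect immediately
dynamic_param = ProfileParameter("rdisp/max_wprun_time", 600, dynamically_switchable=True)
dynamic_param.set_value(1200)
assert dynamic_param.active_value == 1200
```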

Wednesday, 8 June 2016

How to view import history in SAP?

In real-time scenarios, there will be circumstances like:
i) You have been facing a particular issue since a certain day. After analysis, you suspect it could be due to transports imported during a particular period, so you need to know which transports were imported in that period.
ii) You restored data from the SAP quality system and performed a system refresh to set up another (training) SAP system. Some days later, further transports were imported into the quality system. Now you would like to synchronize transports to ensure these systems are consistent.
In both scenarios, you want to know which transports were imported into an SAP system between two dates/times. This can be done by viewing the import history at SAP level.

To view the import history, proceed as follows:

1) Log in to the SAP ABAP system, go to transaction STMS, and click the truck icon to open the import overview of the SAP system.

2) In the import overview, select and double-click the SAP system for which you would like to view the import history.

3) The Import Queue of the SAP system is then displayed.

4) To view the transports imported between any two dates (the import history), right-click on this screen to open a context menu. Select Import History from that menu, or press Ctrl + F7, to open the import history screen.

5) In the import history screen, select the From date and To date and specify a time interval to view the transports that were imported between those two points in time.

The resulting output lists the transports imported in that interval.
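Conceptually, the import history filter does what this small sketch shows (illustrative Python, not SAP code; the transport request names and timestamps are made up for the example):

```python
# Sketch of filtering transport imports by a from/to timestamp,
# the way the STMS import history date/time filter works.
from datetime import datetime

# (transport request, import timestamp) - example data only
imports = [
    ("DEVK900101", datetime(2016, 6, 1, 10, 30)),
    ("DEVK900105", datetime(2016, 6, 3, 14, 5)),
    ("DEVK900110", datetime(2016, 6, 7, 9, 0)),
]

def imports_between(entries, start, end):
    """Return transports imported in the closed interval [start, end]."""
    return [tr for tr, ts in entries if start <= ts <= end]

result = imports_between(imports,
                         datetime(2016, 6, 2), datetime(2016, 6, 5))
assert result == ["DEVK900105"]
```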

Tuesday, 7 June 2016

Access to a particular tcode in an SAP system

How to find out who has access to a particular tcode in an SAP system?

This article answers the following queries:
  •  How to find out which users have access to a particular tcode in an SAP system?
  •  How to find out which user IDs have access to a transaction in an SAP system?

In some real-time scenarios there may be an emergency: for example, a change to a table needs to be made, but the end user concerned does not have access to change tables. They may then approach the Basis administrator or security (authorization) consultant to identify the users who do have that access, so the change can be requested from them in an emergency. You should therefore be able to identify those users and confirm them to the business users.

In general, a Basis/authorization consultant should be able to identify all user IDs that have access to a particular tcode. This can be done in the following way. In this example, I demonstrate how to identify all users who have access to SE16 in a given SAP system.

Log in to the SAP system and go to transaction SUIM:

In the SUIM screen, navigate to User -> Users by Complex Selection Criteria -> By Transaction Authorizations. Select it and press F8 or click the execute (clock) icon. This results in the next screen:

Enter the transaction code for which you would like to find all users with access; in our example, it is SE16.
Once you have entered the transaction code, click the execute (clock) button. The resulting screen displays the list of users who have access to SE16 in this SAP system.

Monday, 6 June 2016

How to Migrate Data using LSMW

Enter transaction LSMW in SAP to start the workbench.


The LSMW workbench shows the following information:
  • Project: An ID with a maximum of 10 characters to name your data transfer project. If you want to transfer data from several legacy systems, you may create a project for each legacy system.
  • Subproject: An ID with a maximum of 10 characters that is used as a further structuring attribute.
  • Object: An ID with a maximum of 10 characters to name the business object.
Enter the Project ID, Subproject ID, and Object ID, then click Execute. The next screen shows the steps of your LSMW data migration.


You can select a desired step and click Execute. Let's look into each step in detail.

Step 1 - Maintain Object Attributes

There are four modes of data transfer:
  1. Standard/ Batch Input : Standard upload Programs
  2. Batch Input Recording : Here you can create a recording of your own and use it to upload / change data
  3. BAPIs : Standard BAPIs are used to upload Data
  4. IDOCs : Any Inbound IDOC function modules can be used to process the data
Based on the requirement, we try to find a suitable method. If it is a standard master, we can use the first method; otherwise, we try BAPIs or IDocs. If the requirement is very custom, we use a recording to process the data.

Step 2 - Maintain Source Structures


The source structures can be used to design the hierarchy of the files to be uploaded.

Step 3 - Maintain Source Fields

The fields that will be uploaded from the text file are maintained in this screen. Fields with identical names are taken as the key.


The source field indicator is used to identify whether a certain record should go to a specific structure. For example, suppose a file contains header rows and item rows; we can use the first field as the indicator, say 'H' for header and 'I' for item. When the file is read, the first field is checked: if it is 'H', the record is read into the header source structure; otherwise it is written to the item source structure.
The source fields can be maintained easily in a table-maintenance view.
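The routing logic just described can be sketched as follows (illustrative Python, not LSMW code; the 'H'/'I' indicators and the comma-separated file layout are assumptions for the example):

```python
# Route each record to a header or item source structure based on the
# indicator in its first field, as described above.

def route_records(lines):
    headers, items = [], []
    for line in lines:
        fields = line.split(",")
        if fields[0] == "H":
            headers.append(fields[1:])   # header source structure
        else:
            items.append(fields[1:])     # item source structure
    return headers, items

sample = [
    "H,PO4711,VendorA",
    "I,10,MaterialX",
    "I,20,MaterialY",
]
headers, items = route_records(sample)
assert len(headers) == 1 and len(items) == 2
```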

Step 4 - Maintain Structure Relationships

The structures needed for processing the data are assigned here. The object may contain many target structures and many source structures. The mapping between the source and target structures should be done after careful checking.



Step 5- Maintain Field Mapping and Conversion Rules

In this step, you assign source fields to target fields and define how the field contents will be converted.

  All fields of all target structures, which you selected in the previous step, will be displayed. For each target field the following information is displayed:
  • Field description
  • Assigned source fields (if any)
  • Rule type (fixed value, translation etc.)
  • Coding.
 Note: Some fields are preset by the system. These fields, called "technical fields", are marked with "Default setting". The coding for these fields is not displayed when first entering the field mapping; it can be displayed via the display variant. Changing the default setting may seriously affect the flow of the data conversion. If you have erroneously changed the default setting, you can restore it by choosing Extras -> Restore default.

Step 6 - Maintain Fixed Values, Translations and User-Written Routines

Here, three reusable functions are maintained:
  1. Fixed Values: Fixed values are values that are fixed across the project, e.g. the company code. We can assign a fixed value to BUKRS, and this fixed value can then be used in all objects in this project. So if the value changes, we only change it in one place, i.e. in the fixed values, instead of in each and every object.
  2. Translations: Here you can maintain a fixed translation for any legacy field, and the translation can be assigned to the field in Field Mapping and Conversion Rules. A translation can be 1:1, many:1, etc.
  3. User-Defined Routines: These are user-defined subroutines that are used in the object for processing the data.
All three functions mentioned above are reusable rules that are valid for all objects in one project.
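A minimal sketch of the first two rule types (illustrative Python, not LSMW code; the field names BUKRS/VKORG and the mapping values are example assumptions):

```python
# Sketch of reusable conversion rules: a fixed value applied to every
# record, and a translation table mapping legacy values to SAP values.

FIXED_VALUES = {"BUKRS": "1000"}              # company code fixed per project
TRANSLATION = {"US": "1000", "DE": "2000"}    # legacy country -> SAP org unit

def convert(record):
    converted = dict(record)
    converted["BUKRS"] = FIXED_VALUES["BUKRS"]          # fixed value rule
    converted["VKORG"] = TRANSLATION[record["COUNTRY"]] # translation rule
    return converted

out = convert({"COUNTRY": "DE", "NAME": "ACME"})
assert out["BUKRS"] == "1000" and out["VKORG"] == "2000"
```

If the company code ever changes, only the `FIXED_VALUES` entry needs updating, mirroring the "change in one place" benefit described above.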


Step 7 - Specify Files

Here we define the files that we use to upload the data. A file can be on the front end or on the application server.


Step 8 - Assign Files

Here we define which file we are going to use for the current upload, i.e. whether the file is on the presentation server or the application server.


Step 9- Read Data

Reading the data from the file gives us the option to read only a few records rather than the entire file, to enable testing with the first few records. It also provides a user-defined selection parameter that can be used to restrict the data read, based on a specified condition.
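The "read only a few records" option amounts to limiting the input, as in this sketch (illustrative Python, not LSMW code):

```python
# Read all records, or only the first few for a test run, mirroring the
# record-limit option of the LSMW Read Data step.
from itertools import islice

def read_data(lines, limit=None):
    """Read all lines, or only the first `limit` lines when testing."""
    return list(islice(lines, limit))

data = ["rec1", "rec2", "rec3", "rec4"]
assert read_data(data, limit=2) == ["rec1", "rec2"]   # test run
assert read_data(data) == data                        # full run
```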

Step 10- Display Read Data

  • In this step, you can display all or part of the read data in table form. Clicking a line displays all information for that line in a clear way. The same happens when you click Field contents.
  • Change display allows you to select either a one-line or multi-line view.
  • Display color palette displays the colors for the individual hierarchy levels.

Step 11- Convert Data

Converting the data means transferring it from the source structures to the target structures, based on the conversion rules written in the Maintain Field Mapping and Conversion Rules step.


Step 12- Import Data

The steps displayed by the program depend on the selected object type:
Standard batch input or recording:
  1. Generate batch input session
  2. Run batch input session
Standard direct input:
  1. Start direct input session
BAPI or IDoc:
  1. Start IDoc creation
  2. Start IDoc processing
  3. Create IDoc overview
  4. Start IDoc post processing
This completes a detailed overview of steps to transfer your data using LSMW in SAP.

What is SAP LSMW ?

The LSMW Workbench is a tool that supports the transfer of data from non-SAP systems ("legacy systems") to SAP R/3 systems. This can be a one-time transfer as well as a periodic one.
LSMW also supports conversion of the legacy system's data in numerous ways. The data can then be imported into the SAP R/3 system via batch input, direct input, BAPIs, or IDocs.
Furthermore, the LSM Workbench provides a recording function that allows generating a "data migration object" to enable migration from any required transaction.
The main functions of the LSM Workbench are:
  1. Read data (legacy data in spreadsheet tables and/or sequential files)
  2. Convert data (from source format to target format)
  3. Import data (into the database of the R/3 application)
To start the LSMW Workbench, use transaction LSMW.

Also check out the next tutorial on executing LSMW step by step.

Saturday, 4 June 2016

Extension IDOC type

What is Extension IDOC type ?

An IDOC is of two types:
  1. Basic
  2. Extension
SAP provides many pre-defined basic IDOC types, which cannot be modified. If you want to add more data to such a restricted basic type, you may use an extension type. Most of the time, you will NOT use an extension.


Each IDOC type is thoroughly documented in transaction WE60.

Message Type

A message represents a specific type of document that is transmitted between two partners, e.g. orders, order responses, invoices, etc.

An IDOC type can be associated with many message types.

Also, a message type can be associated with different IDOC types. See transaction WE81.

IDOC Views

An IDOC type can be used for more than one message type, which results in IDOCs containing more fields than required for a particular message type.
IDOC views are used to improve performance in generating IDOCs, by ensuring that only the relevant segments are filled with data. IDOC views are important only for outbound processing.

Partner Profiles

A partner is defined as a business partner with whom you conduct business and exchange documents.

In the partner profile of a partner that we exchange Idocs with, we maintain the parameters that are necessary for exchanging the data. The transaction used is WE20.


Ports

The port defines the technical characteristics of the connection between your SAP system and the other system you want to transfer data with (the subsystem). The port defines the medium through which data is exchanged between the two systems.

There are different types of ports. The two most commonly used are tRFC ports, used by ALE, and file ports, which EDI uses.

For tRFC ports, we have to give the name of the logical destination created using SM59.

When using a file port, you can specify the directory where the IDOC file should be placed. The other system or the middleware will pick up the file from there. A function module can be used to generate a file name for the IDOC. While testing, you can use "Outbound file" to specify a constant file name. The "Outbound trigger" tab can be used to supply information if we want to trigger some processing on the subsystem when an IDOC is created at this location; we have to specify the command file name and the directory in which it has to be run.

This is so CONFUSING!

Let's understand the process of creating an IDOC with an example:
  • Whenever a purchase order (PO) is created, we want to send an IDOC to a vendor.
  • The PO is sent in the form of an IDOC to the vendor (partner). That partner has to be EDI-enabled in the system, so SAP knows it can send documents to this vendor electronically.
  • The PO sent as an outbound IDOC by the customer will be an inbound IDOC for the vendor. The SAP system on the vendor's side can process this to create an application document (a sales order) on their system.
  • Quotation, RFQ, PO, SO, invoice, delivery note, etc. are some of the documents commonly exchanged through IDOCs.
The process of data transfer out of your SAP system is called the outbound process, while that of data moving into your SAP system is called the inbound process. As a developer or consultant, you will be involved in setting up these processes for your organization. Here are the steps to set them up:

The Outbound Process

Steps Involved -
  1. Create segments(WE31)
  2. Create an idoc type(WE30)
  3. Create a message type (WE81)
  4. Associate a message type to idoc type(WE82)
  5. Create a port(WE21)
  6. If you are going to use the message control method to trigger IDOCs, create the function module for creating the IDOC and associate the function module with an outbound process code
  7. Otherwise, create the function module or standalone program that will create the IDOC
  8. Create a partner profile (WE20) with the necessary information in the outbound parameters for the partner you want to exchange IDOCs with
  9. Trigger the IDOC

The Inbound Process

Steps Involved-
  1. Creation of basic Idoc type (Transaction WE30)
  2. Creating message type (Transaction WE81)
  3. Associating the Message type to basic Idoc type (Transaction WE82)
  4. Create the function module for processing the idoc
  5. Define the function module characteristics (BD51)
  6. Allocate the inbound function module to the message type(WE57)
  7. Defining process code (Transaction WE42)
  8. Creation of partner profile (Transaction WE20)

Friday, 3 June 2016


What is an IDOC?

IDOC is simply a data container used to exchange information between any two processes that can understand the syntax and semantics of the data.

In other words, an IDOC is like a data file with a specified format, which is exchanged between two systems that know how to interpret that data.

IDOC stands for " Intermediate Document"

When we execute an outbound ALE or EDI Process, an IDOC is created.

In the SAP system, IDOCs are stored in the database. Every IDOC has a unique number (within a client).

Key Features
  • IDOCs are independent of the sending and receiving systems (SAP-to-SAP as well as non-SAP).
  • IDOCs are based on the EDI standards ANSI ASC X12 and EDIFACT. In case of any conflict in data size, the one with the greater length is adopted.
  • IDOCs are independent of the direction of data exchange, e.g. ORDERS01 (purchasing module) is used both inbound and outbound.
  • IDOCs can be viewed in a text editor; the data is stored in character format instead of binary format.

Structure of an IDOC

The IDOC structure consists of three parts:
  1. The administration part (control record), which has the IDOC type, message type, current status, sender, receiver, etc. This is referred to as the control record.
  2. The application data (data records), which contain the data. These are called the data records/segments.
  3. The status information (status records), which give you information about the various stages the IDOC has passed through.
You can view an IDOC using transaction WE02 or WE05.

As seen in the screenshot above, an IDOC record has three parts: control, data, and status. Let's look into them in detail.

Control Record
  • All control record data is stored in the EDIDC table. The key to this table is the IDOC number.
  • It contains information like the IDOC number, the direction (inbound/outbound), sender and recipient information, the channel it is using, which port it is using, etc.
  • Direction '1' indicates outbound, '2' indicates inbound.
Data Record
  • Data records contain application data, like employee header info, weekly details, client details, etc.
  • All data record data is stored in the EDID2 to EDID4 tables; EDIDD is a structure where you can see their components.
  • A data record contains the IDOC number, the name and number of the segment in the IDOC, the hierarchy, and the data.
  • The actual data is stored as a string in a field called SDATA, which is a 1000-character-long field.
Status Record
  • Status records are attached to an IDOC at every milestone or when it encounters errors.
  • All status record data is stored in the EDIDS table.
  • Statuses 01-42 are for outbound, while 50-75 are for inbound.
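The three record types and the direction convention can be summarized in a small model (illustrative Python, not SAP code; the field selection is simplified and the example values are made up):

```python
# Simplified model of the three IDOC record types described above.
from dataclasses import dataclass

@dataclass
class ControlRecord:          # stored in EDIDC, keyed by IDOC number
    docnum: str
    direction: str            # '1' = outbound, '2' = inbound
    message_type: str

@dataclass
class DataRecord:             # stored in EDID2-EDID4; payload is a flat string
    segment: str
    sdata: str                # actual data, up to 1000 characters

@dataclass
class StatusRecord:           # stored in EDIDS
    status: int               # 01-42 outbound, 50-75 inbound

def is_outbound(control):
    """Direction '1' marks an outbound IDOC, '2' an inbound one."""
    return control.direction == "1"

ctl = ControlRecord(docnum="0000000123456789", direction="1",
                    message_type="ORDERS")
assert is_outbound(ctl)
assert len(DataRecord("E1EDK01", "x" * 1000).sdata) <= 1000
```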

IDOC Types

An IDOC type (basic) defines the structure and format of the business document that is to be exchanged. An IDOC is an instance of an IDOC type, just like the concept of variables and variable types in programming languages. You can define IDOC types using WE30.

What is a Segment?

A segment defines the format and structure of a data record in an IDOC. Segments are reusable components.

For each segment, SAP creates:
  • Segment type (version independent)
  • Segment definition (version dependent)
  • Segment documentation
The last three characters of the segment definition are its version.

Definitions keep changing with the version, but the segment type remains the same.

Transaction: WE31

Tuesday, 31 May 2016

How to Configure and Test RFC.

This tutorial is divided into four sections:
  1. Setting up an RFC connection
  2. Trusted RFC connections
  3. Testing an RFC connection
  4. Error resolution

Procedure to set up an RFC connection:

Enter Transaction Code SM59

In the SM59 screen, you can navigate through already created RFC connections with the help of the option tree, which is a menu-based way of organizing all the connections by category.
Click the 'CREATE' button. In the next screen, enter:
  • RFC Destination – the name of the destination (could be the target system ID or anything relevant)
  • Connection Type – here we choose one of the RFC connection types (as explained previously) as per requirements
  • Description – a short informative description, e.g. to explain the purpose of the connection
After you 'SAVE' the connection, the system takes you to the 'Technical Settings' tab, where we provide the following information:
  • Target Host – here we provide the complete hostname or IP address of the target system.
  • System Number – this is the system number of the target SAP system.
  • Click Save.
In the 'Logon and Security' tab, enter the target system information:
  • Language – as per the target system's language.
  • Client – in SAP we never log on to a system without a particular client, so we need to specify the client number here for correct execution.
  • User ID and Password – preferably not your own login ID; there should be a generic ID so that the connection is not affected by constantly changing end-user IDs or passwords. Usually a user of type 'System' or 'Communication' is used here. Note that this is the user ID for the target system, not the source system where we are creating this connection.
Click Save. The RFC connection is ready for use.
Note: By default, a connection is defined as an aRFC. To define a connection as tRFC or qRFC, go to Menu Bar -> Destination -> aRFC options / tRFC options and provide inputs as per requirements. To define a qRFC, use the special options tab.

Trusted RFC connection

There is an option to mark the RFC connection as 'Trusted'. Once selected, the calling (trusted) system does not require a password to connect to the target (trusting) system.
The following are the advantages of using trusted channels:
  • Cross-system single sign-on facility.
  • The password does not need to be sent across the network.
  • A timeout mechanism for the logon data prevents the misuse and mishandling of logon data.
  • User-specific logon details of the calling/trusted system are checked.
The RFC users must have the required authorizations in the trusting system (authorization object S_RFCACL). Trusted connections are mostly used to connect SAP Solution Manager systems with other SAP systems (satellites).

Testing the RFC Connection

After the RFCs are created (or sometimes in the case of already existing RFCs), we need to test whether the connection is established successfully.
As shown above, we go to SM59, choose the RFC connection to be tested, and expand the menu "Utilities -> Test -> …". We have four options:

Connection Test -> This attempts to make a connection with the remote system and hence validates the IP address/hostname and other connection details. If the two systems cannot connect, it throws an error; on success, it displays a table with response times. This test only checks whether the calling system can reach the remote system.

Authorization Test -> This is used to validate the user ID and password (provided under the 'Logon and Security' tab for the target system) and also the authorizations that are provided. If the test is successful, the same response-time screen appears as for the connection test.

Unicode Test -> This checks whether the target system is a Unicode system.

Remote Logon -> This is also a kind of connection test, in which a new session of the target system is opened; we need to specify a login ID and password (if not already entered under the 'Logon and Security' tab). If the user is of type 'Dialog', a dialog session is created. On a successful connection test, the output will be the response times for the communication packets; otherwise an error message will appear.


What went wrong?

If the RFC connection is not established successfully, we can check the logs (to analyze the issue) at OS level in the instance 'work' directory. There we can find log files with the naming convention "dev_rfc<sequence no.>"; the error description can be read from these files.
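Locating those trace files can be scripted, as in this hedged sketch (the directory path shown in the comment is an assumption; the actual location depends on your SID and instance):

```python
# Find dev_rfc* trace files in an instance work directory.
import glob
import os

def find_rfc_logs(work_dir):
    """Return dev_rfc* trace files in the given work directory, if any."""
    return sorted(glob.glob(os.path.join(work_dir, "dev_rfc*")))

# Example usage (path is hypothetical, typically /usr/sap/<SID>/<instance>/work):
# for log in find_rfc_logs("/usr/sap/DEV/DVEBMGS00/work"):
#     print(log)
```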