We are delighted that the DSAG has accepted our membership application. In my time as an employee of zetVisions AG, I often and gladly attended the meetings of the Development and UI working groups. First-hand information from SAP is irreplaceable, and the meetings offer plenty of interesting contacts.
In autumn 2021 we would like to take on an apprentice IT specialist (Fachinformatiker) specializing in application development or in data and process analysis. We are currently still looking for a suitable candidate.
Brandeis Consulting GmbH is a small training and consulting company with four employees. Our focus is on SAP software solutions, in particular the SAP HANA database and the SAP BW/4HANA data warehouse. A look at our homepage gives a good overview of the topics we work on.
What we can offer
- A well-equipped workplace close to Mannheim main station
- Training in qualifications that are in high demand on the job market, e.g. the programming languages ABAP and SQLScript
- Very varied work
What we are looking for
As an apprentice, you should bring the following:
- Logical thinking and a willingness to learn
- Knowledge of a programming language
- Independence and reliability in your work
If we have sparked your interest, we look forward to receiving your application at firstname.lastname@example.org.
With ABAP Managed Database Procedures, or AMDP for short, SAP makes the performance of the SAP HANA database easy to use in ABAP developments. AMDP is a framework that now covers not only SQLScript procedures but also functions, and with CDS table functions it even allows views programmed in SQLScript.
On this page I have compiled some useful information on this topic.
- Excerpts from my German book SQLScript für SAP HANA (SAP PRESS)
- An example of BW AMDP transformation routines (German)
- SAP Documentation
Use cases of AMDP
AMDP technology is used in a variety of scenarios. Probably the most important is its use in SAP BW/4HANA:
AMDP for SAP BW/4HANA Transformation Routines
With BW on HANA, SAP re-implemented the execution of data transfer processes (DTPs) directly on the HANA database. This brought an enormous acceleration and reduced runtimes by a factor of 10. A prerequisite for this HANA execution, however, is that no ABAP routines are used in the transformations.
The alternative to ABAP routines are procedures in SQLScript, which are developed as AMDP methods. The structure of these classes and methods is generated by the BW system to match the definition of the transformation; the developer then only has to write the SQLScript code. A deeper understanding of how database procedures are defined is not required.
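The shape of such a routine can be sketched as follows. In a real system, the class skeleton, the type definitions, and the method signature are generated by BW/4HANA; only the SQLScript body is written by the developer. All names and fields here are simplified for illustration:

```abap
" Sketch of an AMDP transformation routine (names are illustrative).
CLASS zcl_tr_routine_sketch DEFINITION PUBLIC FINAL CREATE PUBLIC.
  PUBLIC SECTION.
    " Marker interface that turns the class into an AMDP class
    INTERFACES if_amdp_marker_hdb.
    TYPES: BEGIN OF ty_data,
             costcenter TYPE c LENGTH 10,
             amount     TYPE p LENGTH 9 DECIMALS 2,
             record     TYPE i,
           END OF ty_data,
           tt_data TYPE STANDARD TABLE OF ty_data WITH EMPTY KEY.
    " AMDP parameters must be passed by value and be table- or scalar-typed
    CLASS-METHODS procedure
      IMPORTING VALUE(intab)  TYPE tt_data
      EXPORTING VALUE(outtab) TYPE tt_data.
ENDCLASS.

CLASS zcl_tr_routine_sketch IMPLEMENTATION.
  METHOD procedure BY DATABASE PROCEDURE FOR HDB
                   LANGUAGE SQLSCRIPT
                   OPTIONS READ-ONLY.
    -- This SQLScript body is the only part the developer writes
    outtab = SELECT costcenter,
                    amount * 2 AS amount,
                    record
               FROM :intab;
  ENDMETHOD.
ENDCLASS.
```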
AMDP in ABAP programs
The use of SQLScript is especially useful when
- there are large amounts of data, and
- the execution time is relevant.
Both apply in SAP BW, which is why AMDP plays a major role there. But when these factors also apply to an ABAP program, we can use SQLScript via AMDP there as well. One option is to run procedures directly on the database; these can have read and write access to tables.
The other is CDS table functions as programmed views. These are implemented in SQLScript as a function, and a CDS entity serves as a wrapper that makes them available to ABAP programs, where they can be used like a DDic view in a SELECT query.
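A minimal sketch of such a CDS table function and its AMDP implementation could look like this, using the SAP demo table SFLIGHT; the function and class names are made up for illustration:

```abap
" CDS DDL source (hypothetical name ztf_flights_per_carrier):
"
"   define table function ztf_flights_per_carrier
"     returns {
"       clnt    : abap.clnt;
"       carrid  : s_carr_id;
"       flights : abap.int4;
"     }
"     implemented by method zcl_flight_tf=>get_flights;

CLASS zcl_flight_tf DEFINITION PUBLIC FINAL CREATE PUBLIC.
  PUBLIC SECTION.
    INTERFACES if_amdp_marker_hdb.
    " The signature is derived from the CDS table function definition
    CLASS-METHODS get_flights FOR TABLE FUNCTION ztf_flights_per_carrier.
ENDCLASS.

CLASS zcl_flight_tf IMPLEMENTATION.
  METHOD get_flights BY DATABASE FUNCTION FOR HDB
                     LANGUAGE SQLSCRIPT
                     OPTIONS READ-ONLY
                     USING sflight.
    RETURN SELECT mandt    AS clnt,
                  carrid,
                  COUNT(*) AS flights
             FROM sflight
            GROUP BY mandt, carrid;
  ENDMETHOD.
ENDCLASS.
```

In ABAP, the table function can then be queried like any view, e.g. `SELECT * FROM ztf_flights_per_carrier INTO TABLE @DATA(lt_result).`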
Another use case is access to other database schemas of the underlying HANA system. You can query them without problems from AMDP procedures, as long as the ABAP database user (usually SAPHANADB) has the required privileges.
The AMDP topic is interesting for all ABAP developers: many use cases can be solved very efficiently and elegantly. However, good knowledge of the SQLScript programming language is required.
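Inside an AMDP method body, such a cross-schema access is a plain SELECT with a qualified table name. A sketch, with the schema and table names as placeholders (the surrounding AMDP class and the method signature are omitted for brevity):

```abap
" Sketch: reading from a foreign database schema in an AMDP method.
" "OTHER_SCHEMA" and "EXTERNAL_ORDERS" are placeholders; the ABAP DB
" user (usually SAPHANADB) needs SELECT privileges on them.
METHOD read_foreign_schema BY DATABASE PROCEDURE FOR HDB
                           LANGUAGE SQLSCRIPT
                           OPTIONS READ-ONLY.
  et_result = SELECT id,
                     amount
                FROM "OTHER_SCHEMA"."EXTERNAL_ORDERS";
ENDMETHOD.
```

Note that objects outside the ABAP dictionary do not appear in the USING clause; only their runtime privileges matter.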
With BW/4HANA 2.0 I noticed that the package size of the DTPs is suggested dynamically by the system. Figure 1 shows, for example, a suggested size of 1,311,000 data records. The system thus ventures into an order of magnitude that, in my practical experience, works better than the former fixed default of 100,000 rows.
In this blog post I have collected some useful information on the DTP package size and added an example of my own with a series of measurements.
Actual Package Size for HANA Execution
When HANA executes DTPs, the monitor shows only a single record count per package. With ABAP execution, a record count was shown for each substep: extraction, transformation, insertion, and so on. With HANA execution these substeps can no longer be distinguished: all logic from the extraction, the transformation rules, and the routines, possibly across several levels, is combined into a single CalculationScenario from which the data is read, and this scenario is optimized as a whole. We therefore no longer get individual values for the respective steps. This also explains why the record count in the monitor is only displayed after a whole package has been processed, and no longer step by step as in the past.
How much data is actually processed at once depends on several parameters. The package size in the DTP specifies the number of data records in the source. Normally, N packages of this size are formed, with the last package correspondingly smaller.
If the data is retrieved request by request in delta mode, this procedure is repeated for each individual request. This means that for each request there is a package that is not the full package size.
If semantic partitioning is set, as many partitions as necessary are inserted into each package until the package size is exceeded. If the partitioning criteria are chosen inappropriately, this can result in enormously large packages or uneven package sizes.
However, the actual package size is also influenced by the transformation routines. If, for example, a filter with a WHERE condition is applied to the INTAB, the number of data records in the package is reduced accordingly. This condition is pushed down into the extraction of the data. Since package building takes place before the individual packages are processed, the packages are effectively reduced by the filter criterion.
Conversely, the actual package size can also be increased by the logic in the routines. This is the case, for example, if data is multiplied by a join or if data is transposed from a key-figure model into a corresponding characteristic model.
It is this actual number of data records that matters for good performance.
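In the routine body, such a filter is simply a WHERE condition on the :inTab table variable. A minimal sketch (the field name and condition are made up):

```sql
-- Filter inside an AMDP transformation routine body. HANA pushes this
-- condition down into the extraction, so the effective package size
-- shrinks by the filtered share of the data.
outTab = SELECT *
           FROM :inTab
          WHERE doc_type <> 'X';   -- illustrative condition
```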
Optimal Package Size for HANA Execution
An unfavorable package size increases the runtime. With HANA execution, the packages can be considerably larger than with ABAP execution. An order of magnitude of 1,000,000 data records is in most cases a good starting value if the system does not suggest one. It is important that this refers to the actual number of data records: if your routines filter out 90% of the records, you should take this into account in the DTP package size. If the packages become too large, the available parallel processes may not all be used. You can see this in the example below.
The optimal package size can only be determined by testing. These tests should be performed on the production system with real data.
Example of a DTP with Processing Logic
In the following, I show the runtimes for a DTP that contains a transformation creating 16 records from each source record by transposing it. This is necessary to convert the PCA plan data from a key-figure-based model into a characteristic-based model in which each data record represents one period. In the example, there are 1,994,590 data records in the source, which the routine turns into 31,913,440 data records in the target.
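The transposition itself can be sketched in SQLScript as a UNION ALL over the period columns; with 16 period key figures, each source record yields 16 target records. All names here are illustrative, not the actual routine:

```sql
-- Sketch: turning one source record with period columns 001..016
-- into 16 target records, one per period.
outTab = SELECT costcenter, '001' AS period, amount_001 AS amount FROM :inTab
         UNION ALL
         SELECT costcenter, '002', amount_002 FROM :inTab
         UNION ALL
         -- ... periods 003 to 015 analogously ...
         SELECT costcenter, '016', amount_016 FROM :inTab;
```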
| Package Size in the DTP | Actual Number of Data Records | Runtime in Seconds |
Table 1: Runtime depending on the Package Size
The default value suggested by the BW/4HANA system for the DTP package size is 1,000,000 data records. Due to the multiplication of the data records, however, this is no longer optimal. If we instead choose a package size such that the actual number of data records is approximately 1,000,000, the runtime is considerably better. In our example, the optimum is 1.6 million actual records.
Influence of the Number of Work Processes in HANA Execution
In addition to the package size, the number of work processes also influences the total runtime of a DTP. However, the effect of doubling the number of work processes is by far not as great as one might naively expect:
For the above example, I increased the number of work processes from 3 to 6. This shortened the runtime by only about 12%, from 138 seconds to 121 seconds.
From ABAP execution, we know a roughly linear relationship between runtime and the number of work processes. When executing process chains, you must always make sure that the total number of background processes (type BTC) is sufficient.
The package size is an important parameter for optimizing DTP runtime. With HANA execution, we can now choose considerably larger packages. The default value of BW/4HANA 2.0 sometimes has to be adjusted, especially if the transformation logic changes the actual number of data records. Often the only way to find the optimum is to try it out.
BW/4HANA 2.0 no longer supports the administration of InfoProviders and requests in the SAP GUI. Instead, a browser window with the SAP BW/4HANA Cockpit opens when we click on the administration of an ADSO, or the DTP monitor opens when we start a DTP. You can find the SAP documentation for the BW/4HANA Cockpit here: https://help.sap.com/viewer/107a6e8a38b74ede94c833ca3b7b6f51/2.0.0/de-DE/2447401c5d01428a9c4bb8edbd567cd8.html
The BW/4HANA Cockpit is a Fiori web application that certainly has its advantages. But sometimes it is not accessible, e.g. when there are problems with the configuration of the web server, or certificates or the corresponding roles are missing (even though you have the profile SAP_ALL).
A simple workaround is the SAP GUI transaction RSMNG. Here you can (still) reach the old interface for the administration of a data target, which lets you navigate to the individual DTP requests.
If the BW/4HANA cockpit cannot be opened
If the BW/4HANA Cockpit does not open when you click on “Manage the DataStore Object (advanced)” in Eclipse, this may be due to your browser setting in Eclipse. For me it did not work with the setting “Default system web browser”; after explicitly selecting Chrome, a window opened.
You should of course get used to the new interface of the BW/4HANA Cockpit in the browser, because SAP may switch off or restrict the “old” GUI transactions. But as a workaround, this is definitely a good interim solution.
In the SAP Cloud Appliance Library (CAL), you can use SAP’s latest systems in the cloud with little effort. Some of the systems are free for certain purposes and are great for learning new technologies. The DEV systems in particular are well suited to this.
If you work with such a system as a regular developer, you will soon reach the point where you want to import or create transports. This is possible in principle, but it requires access at the level of the Linux operating system.
I have written a simple program that copies transport files from the client computer directly into the transport directories (trans/cofiles and trans/data) via the SAP GUI. This makes life much easier.
If necessary, the paths in the source code must be adjusted.
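A minimal sketch of this approach could look like the following report. All names, paths, and comments are illustrative, and error handling is omitted; the actual program may differ:

```abap
REPORT zcopy_transport_file.
" Sketch: upload a transport file (cofile or data file) from the client
" PC via SAP GUI and write it into the server-side transport directory.
PARAMETERS: p_src TYPE c LENGTH 255 LOWER CASE,  " e.g. local cofile path
            p_dst TYPE c LENGTH 255 LOWER CASE.  " e.g. path under /usr/sap/trans

START-OF-SELECTION.
  TYPES ty_raw TYPE x LENGTH 255.
  DATA lt_data TYPE STANDARD TABLE OF ty_raw.
  DATA lv_len  TYPE i.

  " Read the file from the frontend PC in binary mode
  cl_gui_frontend_services=>gui_upload(
    EXPORTING
      filename   = CONV #( p_src )
      filetype   = 'BIN'
    IMPORTING
      filelength = lv_len
    CHANGING
      data_tab   = lt_data ).

  " Write it to the transport directory on the application server,
  " trimming the last chunk to the real file length
  OPEN DATASET p_dst FOR OUTPUT IN BINARY MODE.
  DATA(lv_left) = lv_len.
  LOOP AT lt_data INTO DATA(ls_chunk).
    DATA(lv_chunk) = nmin( val1 = lv_left val2 = 255 ).
    TRANSFER ls_chunk TO p_dst LENGTH lv_chunk.
    lv_left = lv_left - lv_chunk.
  ENDLOOP.
  CLOSE DATASET p_dst.
```

After copying both the cofile and the data file, the transport can be added to the import queue as usual, e.g. in transaction STMS.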