
Automation Objects contain configuration for each test execution. Automation Objects define the interface under test, test conditions, and necessary data manipulation. Automation Objects can also point to other objects, UI scripts, database validation definitions, and more.


Automation Objects Management

To start working with Automation Objects, go to the Fiori launchpad and select the Automation Objects tile.

The Automation Objects list will open, with typical list filtering features available.

Elements of the Automation Objects list

  • Automation Obj. - technical name of the Automation Object

  • Description - user-friendly, free-form description

  • Type - type of test case supported by the Automation Object

  • Interface - interface name as defined in the Automation Object

Automation Object list features

  • List filtering

  • Create new Automation Object

  • Copy selected Automation Objects

  • Delete selected Automation Objects

  • Export selected Automation Objects to file

  • Import Automation Objects from file

  • Export the Automation Objects list to a spreadsheet file

Working with Automation Objects

Clicking an Automation Object from the list or creating a new one will open the Automation Object editing screen.

Video guide for Automation Object creation

Automation Object edit screen sections

The first items from the top are the Automation Object name, description, and assigned test type. Below that, the screen is divided into several sections. You can navigate between these sections by scrolling or by clicking the section names.

The sections presented vary based on the assigned test type. This part of the manual describes all of the sections and their usage. For more details on each test type, see the dedicated pages of this manual.

Basic Information

This section contains the basic information about the integration being tested. Depending on the selected test type, this section will contain different fields relating to the specific integration technology - e.g. for SAP IS (CPI) this would be the iFlow name.

Backend Systems

This feature is available only in APITester

Backend Systems allows you to specify the system lines for database validation. The specified system lines are read from the configuration and, together with the landscape on which the test case is executed, map to an RFC connection. Int4 Suite uses this RFC connection to execute dedicated function modules and read data from the system’s database for comparison.

Two system lines need to be specified. The Current doc. system line points to the system where we expect to find data from the current test execution. The reference document will be read from the system specified by the Reference doc. system line.

Based on your testing scenario, the current and reference system lines can be the same or different. For regression testing they will usually be the same system line, e.g. ECC; the testing landscapes are then different, e.g. Test and Production, and the system reads the reference document from Production and the current document from Test. For migration scenarios, e.g. ECC to S/4HANA, the reference would usually be ECC and the current would be S/4HANA. Please refer to the https://int4support.atlassian.net/wiki/spaces/IUM/pages/2068021270/Int4+APITester#Environments-and-System-lines section for more details on system line definition.

Database Validations

This feature is available only in APITester


The Database Validation ruleset points to a specific Database Validation object. Its settings are read and used for the execution of the database tables and fields comparison. This allows a single Database Validation ruleset to be used with multiple Automation Objects.

Read more here on defining the Database Validation rulesets: DB Validation Rulesets

The Persist reference DB Data parameter allows the user to decide when the DB reference data should be fetched. When it is switched on, fetching happens during test case creation and the data is stored as a test case payload named Reference DB Data. Otherwise, it happens during test case execution.

In the Test Case Details screen, in the Payloads section, the user can refresh this data using the refresh button or change it manually by clicking the Reference DB Data line in edit mode.


An option for refreshing Reference DB Data for many test cases at once is available in the Int4 API Tester Cockpit. This action can be performed for all test cases in a folder or for selected items.


Variables

Variables in Int4 Suite represent a powerful feature that allows for data manipulation in processed test cases. There are various scenarios where Variable processing can be useful:

  • Updating document/message content before test execution

  • Capturing data from document/message content after execution

  • Matching documents based on Variable content

  • Passing data between test cases

In this section you can add, remove, and edit Variable settings.

Read more here on details of Variables and their processing: Variables & Variable processing
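As a rough illustration of the capture-and-replace idea (all function names here are hypothetical, not the Int4 Suite API - the actual mechanics are described on the Variables & Variable processing page), capturing a value from an executed message and injecting it into the next test case could look like:

```python
# Illustration only, not Int4 Suite code: capture a value from one
# message after execution, then write it into the next payload.
import xml.etree.ElementTree as ET

def capture_variable(payload: str, path: str) -> str:
    """Read a value from an executed message (capture after execution)."""
    return ET.fromstring(payload).findtext(path)

def apply_variable(payload: str, path: str, value: str) -> str:
    """Write a captured value into the next test case's payload."""
    root = ET.fromstring(payload)
    root.find(path).text = value
    return ET.tostring(root, encoding="unicode")

order = "<Order><Number>4500001234</Number></Order>"
po_number = capture_variable(order, "Number")
invoice = apply_variable("<Invoice><PORef>?</PORef></Invoice>", "PORef", po_number)
# invoice now carries the captured PO number in its PORef field
```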

Payload Validations

Payload Validation enables additional checks on message content and structure. In this section you can create, edit, and delete these rules. Use the “Create” button to add a rule and the “X” icon to delete one.

Int4 Suite basic test execution compares the messages (reference and current execution), reports any difference as an error, and fails the test. While this might be acceptable for some migration scenarios, not all message differences are errors; some differences are even expected.

A common scenario for differing data in messages is date and time. It makes sense to ignore differences in these fields.

The definition of a Payload Validation rule consists of:

  • Description - free form text describing the purpose of the rule

  • Expression type - XPath or Flat File expression

    • XPath is a standardized way of locating nodes and data in XML messages. For JSON messages, JSONPath syntax is accepted as the expression if the XPath type is selected.

    • The Flat File expression language was developed by Int4 and allows for matching text patterns in flat file processing. Please read here for more details: Expression language for flat files

  • Expression - the XPath, JSONPath or Flat File expression

  • Rule - specifies the behaviour if data differs at the point specified by the Expression

    • Ignore - the difference has no impact on test result

    • Warning - the difference will trigger a warning and will be highlighted in the test report

    • Warning when different based on Variable replacement - the difference will trigger a warning if the data matches data in a Variable; otherwise the difference will trigger an error

    • Warning when different based on Value Mapping - the difference will trigger a warning if the data matches the definition in the Value Mapping; otherwise the difference will trigger an error

  • Parameter - additional processing parameter: the Variable name or Value Mapping name used for comparison
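To make the Ignore rule concrete, here is a minimal sketch (not Int4 Suite code; the expression and payloads are made up) of comparing two XML payloads while blanking out nodes matched by an Ignore expression, using Python’s limited built-in XPath support:

```python
# Sketch: apply an "Ignore" Payload Validation rule before comparison.
import xml.etree.ElementTree as ET

IGNORE_PATHS = [".//CreationDate"]  # hypothetical Ignore rule expression

def normalize(payload: str) -> str:
    """Blank out nodes matched by Ignore rules, then serialize for comparison."""
    root = ET.fromstring(payload)
    for path in IGNORE_PATHS:
        for node in root.findall(path):
            node.text = ""  # the ignored value no longer affects the diff
    return ET.tostring(root, encoding="unicode")

reference = "<Order><Id>42</Id><CreationDate>2024-01-01</CreationDate></Order>"
current   = "<Order><Id>42</Id><CreationDate>2024-06-30</CreationDate></Order>"
# the dates differ, but after normalization the payloads compare equal
```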

JSONPath and XPath are well-known and popular standards for structured data manipulation. You can find plenty of references and educational material online. Worth noting are the testing tools that allow you to experiment with XPath or JSONPath definitions. See here:

Please note that Int4 does not maintain these tools and can’t guarantee for their accuracy.

Payload Matching

In scenarios with multiple outputs for the same receiver, they need to be compared in the same order as the reference documents used for test case creation. Integration platforms don’t guarantee that outputs will always be sent in the same order; therefore, the solution is to sort the messages before comparison.
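The sorting idea can be sketched as follows (illustrative only, not Int4 Suite code; the key path is hypothetical): both message sets are ordered by a matching key so that reference and current outputs pair up deterministically, regardless of delivery order.

```python
# Sketch: sort reference and current outputs by a key before pairwise comparison.
import xml.etree.ElementTree as ET

def sort_by_key(payloads, key_path):
    """Order messages by the value found at key_path in each payload."""
    return sorted(payloads, key=lambda p: ET.fromstring(p).findtext(key_path))

reference = ["<Msg><Id>B</Id></Msg>", "<Msg><Id>A</Id></Msg>"]
current   = ["<Msg><Id>A</Id></Msg>", "<Msg><Id>B</Id></Msg>"]
pairs = list(zip(sort_by_key(reference, "Id"), sort_by_key(current, "Id")))
# each pair now holds the reference and current message for the same Id
```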

Please see the detailed pages for specific test types for more information.

IDoc Status Validations

This feature is only available in APITester

IDoc Status Validation parameters are used to override the default status handling rules for the Inbound and Outbound IDoc test types. Please see the relevant test type manual page (e.g. IDoc Inbound) for more details.

Execution Settings

For each test type, there are various settings for detailed control of test case execution. While more details can be found on the manual pages for specific test types, these are some general settings usually found in the Automation Object.

Debug Log

Enables additional debugging messages in the test execution log. This is normally disabled, as a log with debug messages contains many details not needed for normal test execution. It is very useful in the early stages of working with Int4 Suite, as it explains the details of each unsuccessful run and helps to track down and resolve connectivity problems and security issues.

Display wait popup before validation

In certain scenarios, manual steps must be taken between the message injection by Int4 Suite and readiness for test validation. Imagine a scenario where an XML message is injected into SAP CPI, mapped to an IDoc, and transferred to SAP S/4. Such an IDoc might be processed automatically by the backend, but certain configurations might prevent that from happening. In that case, the tester has to log into S/4 and manually request IDoc processing. This can be handled using the wait popup: Int4 Suite sends the message and displays a popup before starting to validate backend data. The tester can then go to the system, complete IDoc processing, return to Int4 Suite, and confirm the popup. Only then will Int4 Suite perform the data validations.

Delay between execution and validation

Similarly to the wait popup, these options introduce a delay between starting the test run and/or between injecting test case data and result validation. If backend processing or integration platform mapping could take a significant amount of time, such a delay can be configured in this section. Int4 Suite will then wait before trying to validate the integration platform and/or the backend, depending on the test type.

Number Ranges

Int4 Suite uses private number ranges for data processing needs. Quite often, when sending test data to test systems repeatedly, the message can’t be exactly the same each time. For example, sending the same purchase order to generate a sales order in S/4 could result in an error if the system is configured to treat such situations as duplicates. Also, a specific data field in the document might be needed to find another document - e.g. by generating a unique PO number for the Sales Order, we can find the newly created Sales Order in the database. The actual mechanics of such replacements are described in the section Variables & Variable processing.

Number range definition consists of:

  • Number range name - a technical code identifying the number range

  • Prefix - string of characters added before the resulting number

  • Low value - starting value of the number range

  • High value - final value of the number range

  • Current value - value which will be assigned when the number range is next used

  • Suffix - string of characters added after the resulting number

  • Add zeroes - check to add leading zeroes so that the number always has the same length

  • Incr. per Test Run - normally the number is increased each time a test case is executed. If this is checked, the number range advances only once per test run instead
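The fields above combine into the generated value roughly like this (an illustrative sketch, not the actual ABAP implementation; field names mirror the settings listed):

```python
# Sketch of how a number range definition could produce values.
class NumberRange:
    def __init__(self, prefix="", low=1, high=9999, suffix="", add_zeroes=True):
        self.prefix, self.low, self.high = prefix, low, high
        self.suffix, self.add_zeroes = suffix, add_zeroes
        self.current = low  # "Current value": assigned at next use

    def next_value(self) -> str:
        if self.current > self.high:
            raise RuntimeError("number range exhausted")
        number = str(self.current)
        if self.add_zeroes:
            # pad with leading zeroes to the width of the high value
            number = number.zfill(len(str(self.high)))
        self.current += 1
        return f"{self.prefix}{number}{self.suffix}"

nr = NumberRange(prefix="PO", low=1, high=9999, suffix="T")
first = nr.next_value()   # -> 'PO0001T'
second = nr.next_value()  # -> 'PO0002T'
```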

Data Scrambling

Data scrambling helps to reduce or remove GDPR, privacy, and security concerns when it comes to testing. When Int4 Suite extracts test data from existing messages or reports testing results, the visibility and availability of sensitive data could be an issue. To avoid this, sensitive data can be scrambled. Scrambling can occur at test case creation time and during test execution. The actual behaviour of the scrambling engine is configured by a set of rules.

Data scrambling rule settings
  • Rule type - decide when the rule needs to execute, at Test Case Creation or at Runtime

  • Rule - free form text description of the rule

  • Method - actual scrambling method

  • Expression type - XPath or Flat File. See the Payload Validation section for more details on these Expressions

  • Expression - actual expression that points to specific data object in the message

  • Parameters - additional parameters for specific Methods

Available scrambling Methods
  • CONSTANT - replace data with a constant text, provided in the first Parameter

  • CUSTOM - calls a custom ABAP code for scrambling - takes both Parameters as input

  • GUID - generates a unique ID based on the scrambled value

  • HASH - generates a numerical shortcut (using one-way hash function) that represents the scrambled value

  • MASK - replace each character in the value with a character specified in the first Parameter

  • RANDOM - generates a random number value
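Three of these methods can be sketched as follows (illustrative only; the actual scrambling engine is part of Int4 Suite, and the hash truncation here is an arbitrary choice for the example):

```python
# Sketch of the CONSTANT, MASK, and HASH scrambling methods.
import hashlib

def scramble_constant(value: str, constant: str) -> str:
    return constant  # CONSTANT: replace data with the text from Parameter 1

def scramble_mask(value: str, mask_char: str) -> str:
    return mask_char * len(value)  # MASK: replace every character

def scramble_hash(value: str) -> str:
    # HASH: one-way digest; the same input always yields the same output,
    # which still allows equality checks on the scrambled data
    return hashlib.sha256(value.encode()).hexdigest()[:16]

masked = scramble_mask("John Smith", "*")  # -> '**********'
same = scramble_hash("John Smith") == scramble_hash("John Smith")  # True
```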

Depending on the actual data, a scrambled value may or may not be relevant for the test result. Textual data in messages is usually forwarded without modification and is less relevant for processing. For example, employee data can contain a name and address, but only the personnel number is relevant for process execution. So, the name and address could be replaced by random data, masked, or even removed by providing an empty constant.

If the data to be scrambled needs to be compared, consider using HASH. While it hides the actual data, it always returns the same hash value for the same input value. This enables partial validation of the sensitive data.

Parameters

Parameters control the interface-technology-specific behaviour of the testing program. They vary significantly based on the test type selected for the specific Automation Object. Please see the manual pages for specific test types for details on these parameters.
