OpenScript is part of the Oracle Application Testing Suite (OATS) and allows testers, or anyone else interested in testing web environments and Oracle applications, to record workflows and user actions performed on a website. In this tutorial I show how to record a realistic use case scenario and replay it for as many iterations as we like, using data fetched from a ‘databank’.

So here are the steps to run a test in OpenScript:

Record new scripts in OpenScript

As in any other environment, the first thing we have to do in order to create a new project is to click File -> New.

Figure 1: Create new script

Then, we should specify the type of the script, which is Web in our case.

Figure 2: Define script type

An optional step, the first time we use the tool, is to create an additional repository for storing the scripts and their data instead of using the default one.

In this tutorial, the repository we will use is located in C:\OracleATS\ATSWorkshop. Continuing from the previous step, after clicking Next we right-click on My Repository and then click New Repository.

Figure 3: Create new repository

To create and use the new repository, we should specify its location and give it a name (ATSWorkshop).

Figure 4: Set location of repository

By clicking OK, we are now able to select the new repository and finally give a name (duckshop_register_user) to the script we are about to create.

Figure 5: Set script name

In order to start recording the new script, all we have to do is click the red circle.

Figure 6: Record new script

Right after we click the record button, a toolbar appears that lets us pause or stop the recording, or use more advanced features such as extracting information from the page.

When the browser opens, all we have to do is to navigate through the page and execute the use case scenario steps that we have planned to test.

Figure 7: Browser and toolbar helper during recording

For the recording to work properly, we should first enable the add-ons for the OpenScript helpers in the browser. In this case we have chosen Internet Explorer, as it is the easiest to use for recording compared to Google Chrome and Mozilla Firefox, and it needs no additional configuration.

Figure 8: Browser add-ons

The script we have recorded is the registration of a new user. More specifically, the new user navigates to the Duckshop home page, clicks the link to register, enters their information and clicks the button to create the account. At that point the user is automatically logged in, and we then log them out.

The page performs some basic validation, such as checking that all required fields are completed and that the postcode, email and telephone number have an appropriate format.
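Format checks of this kind are typically regex-based. A minimal sketch in Java; the patterns below are illustrative assumptions, not the ones the Duckshop page actually uses:

```java
import java.util.regex.Pattern;

public class FieldValidation {
    // Illustrative patterns only; the real page's validation rules may differ.
    static final Pattern EMAIL = Pattern.compile("^[^@\\s]+@[^@\\s]+\\.[^@\\s]+$");
    static final Pattern UK_POSTCODE = Pattern.compile("^[A-Z]{1,2}\\d[A-Z\\d]? ?\\d[A-Z]{2}$");

    // A required field must contain something other than whitespace.
    static boolean isRequiredPresent(String value) {
        return value != null && !value.trim().isEmpty();
    }

    static boolean isEmail(String value) {
        return EMAIL.matcher(value).matches();
    }

    static boolean isPostcode(String value) {
        return UK_POSTCODE.matcher(value.toUpperCase()).matches();
    }

    public static void main(String[] args) {
        System.out.println(isEmail("jane@example.com"));   // true
        System.out.println(isEmail("not-an-email"));       // false
        System.out.println(isPostcode("SW1A 1AA"));        // true
        System.out.println(isRequiredPresent("  "));       // false
    }
}
```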

In the tool’s Tree View tab, we can see all the steps we followed during the recording of the above use case. The steps are categorised as navigating to a page, waiting idle for a page, clicking on page elements, and actions such as pressing the Tab key or typing text into an input area.

Figure 9: Recording steps

In the tool’s Java Code tab, we can view how the steps from the tree view translate into the Java programming language. After the browser is launched in the initialize() method, the steps are executed in the run() method.

A typical step consists of the page we want to navigate to, the time we want to wait on that page, and the action we are going to perform on the page or in a text box (e.g. click, press or type).

Figure 10: Java code tab
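As a rough illustration of that skeleton, here is a self-contained mock in plain Java. The structure (initialize/run/finish, with each recorded step wrapped in a begin/end pair) mirrors the shape of what OpenScript generates, but the browser actions are replaced with log lines, since the real script depends on OpenScript’s own libraries; the step names and actions are invented for illustration:

```java
import java.util.ArrayList;
import java.util.List;

// Toy stand-in for an OpenScript-generated class. The real script extends an
// OpenScript base class and drives a browser through its `web` module; here
// each browser action just appends a log entry so the skeleton is runnable.
public class DuckshopRegisterUserMock {
    final List<String> log = new ArrayList<>();

    void beginStep(String name) { log.add("BEGIN " + name); }
    void endStep(String name)   { log.add("END " + name); }

    void initialize() { log.add("launch browser"); }

    void run() {
        beginStep("[1] Duckshop Home");
        log.add("navigate to home page");        // browser navigation
        endStep("[1] Duckshop Home");

        beginStep("[2] Register");
        log.add("click Register link");          // click a page element
        log.add("type first name into textbox"); // set text in an input area
        log.add("click Create Account");         // submit the form
        endStep("[2] Register");
    }

    void finish() { log.add("close browser"); }

    public static void main(String[] args) {
        DuckshopRegisterUserMock script = new DuckshopRegisterUserMock();
        script.initialize();
        script.run();
        script.finish();
        script.log.forEach(System.out::println);
    }
}
```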

 

Replay scripts

The main feature of this tool is that it allows us to replay the test cases we have recorded. In the registration example, just by clicking the button with the purple arrow, OpenScript opens a web browser and tries to recreate all the steps in real time.

Figure 11: Replaying test case

In this scenario, each user’s email must be unique; that’s why replaying the registration with the same data displays an error message and causes the test case execution to fail.

Figure 12: Error message – Email already exists

 

Use databanks

If we need to execute the script with different values each time, we need a way to supply multiple sets of credentials and field data. The solution is to create a databank: a .csv file containing the details of all the users we want to register. We place this file in our workspace folder so that we can later select it as the data source for our tests.

Figure 13: Path to databank

The data file we have created is called duckshop_users and contains the data below. Its format is .csv (comma-separated values): each row holds the data of a single user, with the fields separated by commas.

Figure 14: Databank data
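In Java terms, a databank row is just a comma-split line mapped onto the header’s field names. A minimal reader sketch, assuming a simple file with no quoted or embedded commas; the column names used here are illustrative, not necessarily those of the real duckshop_users file:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Minimal .csv databank reader: the first row holds field names and every
// other row holds one user's data. Assumes no quoted or embedded commas.
public class DatabankReader {
    static List<Map<String, String>> parse(List<String> lines) {
        String[] header = lines.get(0).split(",");
        List<Map<String, String>> records = new ArrayList<>();
        for (String line : lines.subList(1, lines.size())) {
            String[] values = line.split(",");
            Map<String, String> record = new LinkedHashMap<>();
            for (int i = 0; i < header.length; i++) {
                record.put(header[i].trim(), values[i].trim());
            }
            records.add(record);
        }
        return records;
    }

    public static void main(String[] args) {
        // Illustrative rows; the real duckshop_users file has its own columns.
        List<Map<String, String>> users = parse(List.of(
            "Name,Surname,Email",
            "Jane,Doe,jane.doe@example.com",
            "John,Smith,john.smith@example.com"));
        System.out.println(users.get(0).get("Email")); // jane.doe@example.com
    }
}
```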

After creating the file, we navigate to the tool’s Assets tab and click Databanks -> Add -> CSV File.

Figure 15: Add CSV file

The next step before using this data file is to locate it on our system and give it a name in OpenScript.

Figure 16: Set databank location

Having set the databank for this script, we can then replace the default data we typed into each input area during the recording with the corresponding fields from the databank.

By right-clicking on a step and then clicking Properties, we can view the default values for the input areas.

Figure 17: Textbox properties

To change the default input, we click the icon next to the Text box and select the databank field we want to map this input area to. For example, Jane was the value we recorded for the First Name input area, so we choose the Name field from the databank and map them.

Figure 18: Replace default recorded data typed in input area
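OpenScript typically represents such a binding as a placeholder of the form {{db.&lt;databank&gt;.&lt;Field&gt;,&lt;recordedValue&gt;}}, keeping the originally recorded value as a fallback. A toy resolver showing how a placeholder like that could be expanded from a record; this is a simplified illustration, not OpenScript’s actual implementation:

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Expands placeholders shaped like {{db.<databank>.<Field>,<recordedValue>}}.
// If the field exists in the current record its value is used; otherwise the
// originally recorded value is kept. Simplified illustration only.
public class PlaceholderResolver {
    static final Pattern DB_REF =
        Pattern.compile("\\{\\{db\\.[^.]+\\.([^,}]+),([^}]*)\\}\\}");

    static String resolve(String text, Map<String, String> record) {
        Matcher m = DB_REF.matcher(text);
        StringBuilder out = new StringBuilder();
        while (m.find()) {
            String field = m.group(1);     // e.g. "Name"
            String recorded = m.group(2);  // e.g. "Jane", the recorded fallback
            m.appendReplacement(out,
                Matcher.quoteReplacement(record.getOrDefault(field, recorded)));
        }
        m.appendTail(out);
        return out.toString();
    }

    public static void main(String[] args) {
        Map<String, String> record = Map.of("Name", "Maria");
        System.out.println(resolve("{{db.duckshop_users.Name,Jane}}", record)); // Maria
    }
}
```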

After selecting the appropriate input, we can review the change and click OK.

Figure 19: New data source for input area

At this point, it’s worth emphasising that all recorded passwords are encrypted, protecting users’ privacy.

Figure 20: Encrypted passwords

Changing the source of all the input areas results in the following tree view. As we can see, every field now fetches its data from the appropriate field in the duckshop_users databank.

Figure 21: Updated tree view – New data sources

After setting the databank, we can replay the test script for one or more iterations. We can choose either how many iterations to execute or how many databank records to run the test against. We can also specify whether the tool should pick databank records sequentially or at random.

Figure 22: Replay for more than one iteration – Settings
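The two selection modes boil down to walking the records in order (wrapping around when the databank is exhausted) versus drawing a record at random on each iteration. A small sketch of the idea, not OpenScript’s actual implementation:

```java
import java.util.List;
import java.util.Random;

// Sequential vs. random databank record selection across replay iterations.
public class RecordSelector {
    // Sequential: iteration i gets record i, wrapping around at the end.
    static String sequential(List<String> records, int iteration) {
        return records.get(iteration % records.size());
    }

    // Random: each iteration draws an arbitrary record.
    static String random(List<String> records, Random rng) {
        return records.get(rng.nextInt(records.size()));
    }

    public static void main(String[] args) {
        List<String> records = List.of("jane@example.com", "john@example.com");
        for (int i = 0; i < 3; i++) {
            // Prints jane, john, then wraps back to jane.
            System.out.println(sequential(records, i));
        }
    }
}
```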

At each iteration of registering a new user, we can see that the test fetches the appropriate combination of data from the .csv file.

Figure 23: Replay for more than one iteration – Browser

When the replay finishes all its iterations, a result report is displayed. The report lets us compare the duration of each step with the recorded one, and check whether each step passed the test or whether a problem occurred during its execution.

Figure 24: Replay result report

So, that’s a quick overview of recording scripts, using databanks and performing the tests.