OpenScript is part of Oracle Application Testing Suite (OATS) and allows a tester, or anyone else interested in testing web environments and Oracle applications, to record workflows and user actions while using a website. In this tutorial I’m showing how to record a realistic use case scenario and replay it for as many iterations as we like, using data fetched from a ‘databank’.
So here are the steps to run a test in OpenScript:
Record new scripts in OpenScript
Similar to any other environment, the first thing we have to do in order to create a new project is to click on File -> New.
Then, we should specify the type of the script, which is Web in our case.
An optional step, the first time we use the tool, is to create an additional repository for storing the scripts and their data instead of using the default one.
In this tutorial, the repository we will use is located in C:\OracleATS\ATSWorkshop. After clicking Next in the previous step, we right-click on My Repository and then on New Repository.
To create and use the new repository, we should specify its location and give it a name (ATSWorkshop).
By clicking OK, we are now able to select the new workspace and finally give a name (duckshop_register_user) to the script we are about to create.
In order to start recording the new script, all we have to do is click the red circle.
Right after we click the recording button, a toolbar appears so that we can pause or stop the recording, or use other, more advanced features such as extracting information from the page.
When the browser opens, all we have to do is to navigate through the page and execute the use case scenario steps that we have planned to test.
For the recording to work properly, we should first enable the add-ons for the OpenScript helpers in the browser. In this case, the browser we have chosen is Internet Explorer, as it is the easiest to use for recording compared to Google Chrome and Mozilla Firefox, and it does not need any additional configuration.
The script we have recorded is the registration of a new user. More specifically, the new user navigates to the Duckshop home page, clicks the link to register, enters their information and clicks the button to create the account. At that point they are automatically logged in, and we then log them out.
On this page there is some basic validation, such as checking that all required fields are completed and that the postcode, email and telephone number have an appropriate format.
In the tool’s Tree View tab, we can see all the steps we followed during the recording of the above use case. The steps are categorised as navigating to a page, idling and waiting for a page, clicking on page elements, and actions such as pressing the Tab key or setting text in an input area.
In the Java Code tab of the tool, we can view how the steps from the tree view are translated to the Java Programming language. After we launch the browser through the initialize() method, we start executing the steps in the run() method.
An example of a step consists of the page we want to navigate to, the time we want to wait on that page, and the action we are going to do on that page or in that text box (e.g., click, press or type).
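To make this structure concrete, here is a hand-written sketch of what a recorded script's skeleton roughly looks like. The step names, timings, selector paths and the duckshop.example URL are all invented for illustration, and generated code like this only runs inside the OpenScript runtime; the exact API details vary by OATS version.

```java
public void initialize() throws Exception {
    // Launch the browser chosen for recording/replay.
    browser.launch();
}

public void run() throws Exception {
    beginStep("[1] Duckshop Home", 0);
    {
        // Navigate to the page recorded for this step.
        web.window(2, "/web:window[@index='0' or @title='about:blank']")
           .navigate("http://duckshop.example/index.php");
    }
    endStep();

    beginStep("[2] Register", 3000); // waited ~3 s on the previous page
    {
        // Set text in an input area, then click a page element.
        web.textBox(5, "/web:window[@index='0']//web:input_text[@name='firstname']")
           .setText("Jane");
        web.button(6, "/web:window[@index='0']//web:button[@name='register']")
           .click();
    }
    endStep();
}
```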
The main feature of this tool is that it allows us to replay the test cases we have recorded. In the registration example, just by clicking the button with the purple arrow, OpenScript opens a web browser and tries to recreate all the steps in real time.
In this scenario, each user’s email must be unique; that’s why replaying the registration with the same data will display an error message and cause the test case execution to fail.
If we need to execute the script using different values each time, we should find a way to use multiple credentials and data in the fields. A solution to the above problem is to create a databank using a .csv file which contains all the information of the users we want to register. To achieve that, we have to create this .csv file in our workspace’s folder so that we can select it as the source of our data for the tests later.
The data file we have created is called duckshop_users and contains the data below. The format of this file is .csv which stands for comma-separated values and consists of multiple fields in each row separated by a comma. Each row is related to the data of a single user.
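Since the databank is just a CSV file, the row-to-fields mapping is easy to illustrate. The plain Java sketch below splits a header row and a data row into comma-separated fields and pairs them up; the field names and values here are invented examples, not the actual contents of duckshop_users.

```java
import java.util.Arrays;
import java.util.List;

public class DatabankDemo {
    // Split one CSV row into fields (simple case: no quoted commas).
    static List<String> parseRow(String row) {
        return Arrays.asList(row.split(",", -1));
    }

    public static void main(String[] args) {
        // Hypothetical rows in the shape a duckshop_users.csv might use:
        // a header naming the fields, then one row per user to register.
        String header = "Name,Surname,Email,Password,Telephone,Postcode";
        String user   = "Jane,Doe,jane.doe@example.com,S3cret!,0123456789,AB1 2CD";

        List<String> fields = parseRow(header);
        List<String> values = parseRow(user);
        for (int i = 0; i < fields.size(); i++) {
            System.out.println(fields.get(i) + " = " + values.get(i));
        }
    }
}
```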
After creating the file, we navigate to the Assets Tab of the tool and we click on Databanks -> Add-> CSV File.
The next step before using this data file is to locate it on our system and give it a name through OpenScript.
Having set the databank for this script, we can then replace the default data we typed into each input area during the recording with the new data fields located in the databank.
By right-clicking on a step and then clicking on Properties, we can view the default values for the input areas.
To change the default input, we click on the icon next to the Text box and select the databank field name we want to map this input area to. For example, Jane was the value we recorded for the First Name input area, so from the databank we choose the Name field and map the two.
After selecting the appropriate input, we can review the change and click OK.
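Conceptually, the mapping replaces the literal recorded value with a placeholder that is resolved against the current databank record on each iteration. The plain Java sketch below imitates that substitution using a `{{db.alias.Field}}` placeholder pattern; it is a simplified illustration, not OpenScript's actual implementation, which also handles things like default values and encrypted fields.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class DatabankSubstitution {
    // Replace {{db.alias.Field}} placeholders with values from the current
    // databank record; unknown field names are left untouched.
    static String resolve(String text, String alias, Map<String, String> record) {
        Pattern p = Pattern.compile("\\{\\{db\\." + Pattern.quote(alias) + "\\.(\\w+)\\}\\}");
        Matcher m = p.matcher(text);
        StringBuffer out = new StringBuffer();
        while (m.find()) {
            m.appendReplacement(out, Matcher.quoteReplacement(
                record.getOrDefault(m.group(1), m.group(0))));
        }
        m.appendTail(out);
        return out.toString();
    }

    public static void main(String[] args) {
        // One record fetched from the databank for this iteration.
        Map<String, String> record = new HashMap<>();
        record.put("Name", "Jane");

        // "Jane" was typed during recording; after mapping, the script
        // stores a placeholder that is resolved at replay time.
        String mapped = "{{db.duckshop_users.Name}}";
        System.out.println(resolve(mapped, "duckshop_users", record));
    }
}
```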
At this point, it’s worth emphasising that all recorded passwords are encrypted, which protects the user’s privacy.
Changing the source of all the input areas results in the following tree view. As we can see, all fields now fetch data from the appropriate field in the duckshop_users databank.
After setting the databank, we can replay the test script for one or more iterations. We can either choose how many iterations to execute or how many records from the databank to run the test with. We can also specify whether the tool should pick records from the databank randomly or sequentially.
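The sequential-versus-random choice boils down to how the record for each iteration is selected. The plain Java sketch below illustrates the two policies (sequential walking wraps around when the iterations outnumber the records); it is an illustration of the idea, not OpenScript's actual code.

```java
import java.util.List;
import java.util.Random;

public class RecordSelector {
    // Pick the record for a given iteration: either walk the databank in
    // order (wrapping around at the end) or draw a record at random.
    static String pick(List<String> records, int iteration, boolean random, Random rng) {
        if (random) {
            return records.get(rng.nextInt(records.size()));
        }
        return records.get(iteration % records.size());
    }

    public static void main(String[] args) {
        // Hypothetical databank records (one key field shown per record).
        List<String> users = List.of("jane@example.com", "john@example.com", "ada@example.com");
        Random rng = new Random(42); // fixed seed so the demo is repeatable
        for (int i = 0; i < 4; i++) {
            System.out.println("iteration " + i
                    + "  sequential: " + pick(users, i, false, rng)
                    + "  random: " + pick(users, i, true, rng));
        }
    }
}
```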
At each iteration of registering a new user, we can see that the test fetches the appropriate combination of data from the .csv file.
When the replay finishes all the iterations, a result report is displayed. This report helps us compare the duration of each step with the recorded one, and check whether a step has passed or whether a problem occurred during its execution.
So, that’s a quick overview of recording scripts, using databanks and performing the tests.