Writing Qorus Tests in Python

In this blog post, you’ll learn how to write Qorus tests in Python. You’ll test the functionality developed in the “Poll SFTP Server For CSV Data And Import To DB” blog post series, starting with the logic flow for the test and then walking through the code involved in each step.

ℹ️ Note: The complete test script is available at examples/csv-sftp-to-db-import/example-import-csv-file.qtest.py in the Qorus building blocks repo.

Logic Flow

1. Set the job to run only on demand during the test
2. Create a test CSV file and put it on the SFTP server
3. Create a “RunJobResult” action and test for a COMPLETE status
4. Run the job and get the result
5. Check that a workflow order is created
6. Wait for order status
7. Get workflow order info and assert that it’s COMPLETE
8. Check if the test data has been added to the table
9. Check for duplicate file handling
10. Reset the job to run normally after the test completes
ℹ️ Note: Before you can run any Qorus tests, set the qorus-client.allow-test-execution option to true in the /opt/qorus/etc/options file.
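
For reference, enabling test execution is a single line in the options file; a minimal example, assuming the default $OMQ_DIR of /opt/qorus:

qorus-client.allow-test-execution: true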

Qorus Modules and Classes Required

Object | Imported Classes | Language | Type
qoreloader | --- | Python | Module
qore.QorusInterfaceTest | QorusJobTest, RunJobResult, Action | Qore | Class
qore.__root__.OMQ.UserApi | UserApi | Qore | Class
qore.__root__.OMQ.Client | QorusSystemRestHelper | Qore | Class
qore.ssh2 | SFTPClient | Qore | Class
qore.SqlUtil | AbstractTable | Qore | Class

ℹ️ Note: The qoreloader module ships with Qorus and provides the bridge between Python ⇋ Qore and, via Qore’s “jni” module, between Python ⇋ Java.

The qoreloader module provides the magic package “qore” which allows us to import Qore APIs and use them as if they were Python APIs. The qoreloader module also enables the Qorus client to be used from native Python code directly.
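
Translated into code, the imports from the table above look roughly like this; the from qore.__root__ import OMQ line is an assumption for accessing constants such as OMQ.StatComplete, and the standard-library imports cover the helper code shown later in this post:

import uuid
from datetime import datetime
from random import randint
from time import time, sleep

# qoreloader exposes Qore and Qorus APIs under the magic "qore" package
import qoreloader
from qore.QorusInterfaceTest import QorusJobTest, RunJobResult, Action
from qore.__root__ import OMQ  # assumed import for constants such as OMQ.StatComplete
from qore.__root__.OMQ.UserApi import UserApi
from qore.__root__.OMQ.Client import QorusSystemRestHelper
from qore.ssh2 import SFTPClient
from qore.SqlUtil import AbstractTable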
Class | Description
QorusJobTest | The main job test class
RunJobResult | A test action that runs a job and allows you to retrieve the result info and make assertions
Action | The base class for Qorus test actions
UserApi | The main Qorus interface / utility class
QorusSystemRestHelper | A class that makes it easy to talk to the Qorus REST API. When run on a Qorus server, it handles data serialization, deserialization, and local authentication: it uses the network encryption key, if it’s readable, to obtain a token with system permissions to execute Qorus API calls; if the key is not readable, a username and password must be configured at the qorus-client.client-url option in the $OMQ_DIR/etc/options file
SFTPClient | Allows Qorus code to communicate with SFTP servers; this is the object type returned from an sftp:// connection
AbstractTable | The low-level parent class for the actual Table implementation in SqlUtil modules
ℹ️ Note: Make sure your scripts have the executable bit set (on UNIX systems) and an appropriate hash-bang as the first line: #!/usr/bin/env python3
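
The overall shape of the test script then looks like the sketch below. This is a sketch only: the constructor arguments, the test case name, and the store codes are illustrative assumptions following the usual QUnit pattern, and the authoritative version is the script in the building blocks repo:

#!/usr/bin/env python3
# (imports as shown above)

class MyTest(QorusJobTest):
   # class-level state shared by the helpers in this post; the store codes
   # here are placeholders, not the values used in the repo script
   qrest: QorusSystemRestHelper = QorusSystemRestHelper()
   sftp = None
   stores: list = ['STORE-A', 'STORE-B', 'STORE-C', 'STORE-D']

   def __init__(self):
      # the test name / version arguments and the addTestCase() + main() calls
      # follow the standard QUnit pattern; check the repo script for the exact
      # constructor signature and argument handling used there
      super(MyTest, self).__init__('example-import-csv-file', '1.0')
      self.addTestCase('import CSV file test', self.importTest)
      self.main()

   def importTest(self):
      # steps 1-10 described below go here
      ...

if __name__ == '__main__':
   MyTest()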

1. Set the job to run only on demand during the test

You can set the job to run only on demand during the test with:
MyTest.qrest.put("jobs/example-import-csv-file/setActive", {"active": False})
ℹ️ Note: qrest is a static variable declared in the class:
qrest: QorusSystemRestHelper = QorusSystemRestHelper()

2. Create a test CSV file and put it on the SFTP server

Create a CSV file with test data:
csv: str = MyTest.getFileData(5)
filename: str = 'StockReport-{}.csv'.format(uuid.uuid4())

The getFileData method above generates CSV data for the number of records supplied as an argument:

def getFileData(num_records: int) -> str:
   csv: str = "StoreCode,ProductCode,ProductDescription,Available,Ordered,InTransit,ReportDate\n"
   for x in range(num_records):
      prod: dict = MyTest.getProductInfo()
      csv += '{},{},\"{}\",{},{},{},{}\n'.format(
         MyTest.stores[randint(0, 3)],
         prod['code'], prod['desc'],
         randint(0, 9), randint(0, 9), randint(0, 9), datetime.fromtimestamp(time())
      )
   return csv

The getProductInfo method is defined like so:

def getProductInfo() -> dict:
   x: int = randint(0, 2)
   if x == 0:
      return {
         "code": "SV300S37A/120G",
         "desc": "Kingston SSDNow V300 120GB 7mm",
      }
   elif x == 1:
      return {
         "code": "SSDSC2BW120A401",
         "desc": "Intel 530 120GB SSD bulk",
      }
   return {
      "code": "MZ-7PD256BW",
      "desc": "Samsung SSD840 256GB 7mm, Pro",
   }
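
For illustration only, the generated data looks something like the two-record example below; the store codes come from the MyTest.stores class variable (not shown in this post), so the values here are made up. Note that the product description is wrapped in quotes because it can contain commas:

StoreCode,ProductCode,ProductDescription,Available,Ordered,InTransit,ReportDate
STORE-A,SV300S37A/120G,"Kingston SSDNow V300 120GB 7mm",3,1,7,2024-05-01 10:15:42.123456
STORE-C,MZ-7PD256BW,"Samsung SSD840 256GB 7mm, Pro",0,9,4,2024-05-01 10:15:43.654321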
You can put the generated CSV onto the SFTP server with:
def putFileOnSftpServer(self, filename: str, csv: str):
   self.getClient()
   tempname: str = '{}.part'.format(filename)
   bytes: int = MyTest.sftp.putFile(csv, tempname)
   # rename file to target name
   MyTest.sftp.rename(tempname, filename)
   if self.m_options.get('verbose', 0) > 2:
      print('wrote {} bytes of {} to sftp://{}:{}'.format(bytes, filename, MyTest.sftp.getHost(),
         MyTest.sftp.getPort()))
The getClient method establishes the SFTP connection and is defined as:
def getClient(self):
   if not MyTest.sftp:
      # get connection name
      conn: str = MyTest.qrest.get('jobs/example-import-csv-file/config/sftp-polling-connection-name/value')
      MyTest.sftp = UserApi.getUserConnection(conn)

3. Create a "RunJobResult" action and test for a COMPLETE Status

The RunJobResult and Action classes are obtained from the qore.QorusInterfaceTest module.
action: Action = RunJobResult(OMQ.StatComplete)
The OMQ namespace, which provides constants such as OMQ.StatComplete, is made available by importing the QorusInterfaceTest module.

4. Run the job and get the result

Before executing the job, get the current number of records in the table:
inventory_example: AbstractTable = UserApi.getSqlTable("omquser", "inventory_example")
num_recs: int = inventory_example.rowCount()
Execute the action, get the result, and store it in a dictionary. You’ll use the exec method, which the MyTest class inherits from the QorusJobTest class:
result: dict = self.exec(action).getJobResult()

5. Check that a workflow order is created

Get the job info with:
jinfo: tuple = self.getJobResultHash(result['job_instanceid'])['info']
Now, check that exactly one workflow order was created:
self.assertEq(1, len(jinfo), 'check wf orders created')
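
Each entry in jinfo describes one workflow order created by the job run; the only keys this test relies on are workflow_instanceid (used in the next step) and the duplicate flag checked in step 9:

# the workflow order ID created by this job run
wfiid: int = jinfo[0]['workflow_instanceid']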

6. Wait for order status

Wait for the order and get its status with:
def waitForStatus(self, wfiid: int, status: str = OMQ.StatComplete):
   h: dict = None
   while True:
      h = self.qrest.get("orders/" + str(wfiid))
      if h['workflowstatus'] == status or h['workflowstatus'] == OMQ.StatError:
         break

      # wait for status to change
      sleep(0.250)

   if self.m_options.get('verbose', 0) > 2:
      print("workflow order ID {} has status {}".format(wfiid, h['workflowstatus']))

   self.assertEq(h['workflowstatus'], status, "wfiid " + str(wfiid) + " has status " + status)

Call the above waitForStatus method like so:

self.waitForStatus(jinfo[0]['workflow_instanceid'])

7. Get workflow order info and assert that it's COMPLETE

You can get the workflow order info and assert that it’s COMPLETE with:

# get workflow order info
winfo: dict = self.qrest.get("orders/" + str(jinfo[0]['workflow_instanceid']))
# assert that it's COMPLETE
self.assertEq(OMQ.StatComplete, winfo['workflowstatus'], 'check order status')

8. Check if the test data has been added to the table

Check that the test data has been added to the table. Since the CSV was generated with five records, verify that five new rows were added:

self.assertEq(num_recs + 5, inventory_example.rowCount(), 'check data imported in DB')

9. Check for duplicate file handling

Let’s resubmit the same file and repeat the process to test duplicate file handling:
self.putFileOnSftpServer(filename, csv)
# create a "RunJobResult" action and test for a COMPLETE status
action: Action = RunJobResult(OMQ.StatComplete)
# run the job and check the result
result: dict = self.exec(action).getJobResult()
# check job results that no new workflow order was created
jinfo2: tuple = self.getJobResultHash(result['job_instanceid'])['info']
self.assertEq(1, len(jinfo2), 'check duplicate job result info length')
self.assertEq(jinfo[0]['workflow_instanceid'], jinfo2[0]['workflow_instanceid'],
   'check duplicate wf order ID')
self.assertTrue(jinfo2[0]['duplicate'], 'verify duplicate flag')

10. Reset the job to run normally after the test completes

Reset the job to run normally after the test completes with:
MyTest.qrest.put("jobs/example-import-csv-file/setActive", {"active": True})

Running the test

Go to the Qorus IDE and, in the interface hierarchy view under the Tests file type, find the example-import-csv-file.qtest test (or the name of your test script). Run the test by clicking the play icon next to the test’s name.

You should see output like the following after a successful test in VS Code under View → Output → Qorus Remote Development:

   ... response: 1
QUnit Test "example-import-csv-file" v
Ran 1 test case, 1 succeeded (13 assertions)
... status: FINISHED
Test execution finished successfully (ID: 20)
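
You can also run the script directly from a shell on a machine where the Qorus client is configured, which is why the note above mentions the executable bit and hash-bang; whether options such as QUnit’s -v verbose flag are honored depends on how the script passes its command-line arguments to the test constructor:

./example-import-csv-file.qtest.py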

Conclusion

In this blog post, you explored the steps involved in testing the functionality developed in the Poll SFTP Server For CSV Data And Import To DB blog post series using Python. The complete script is available at examples/csv-sftp-to-db-import/example-import-csv-file.qtest.py in the Qorus building blocks repo.
