Version Control

There are two primary mechanisms for version control in API Fortress. The first is the Publish Test feature, which allows you to push updated test code to the live version of a test.

The second mechanism for version control is the API Fortress-Git integration. The integration is powered by the API Fortress Post-Receive Git Hook, which can be found here. Documentation is located here.

Testing GraphQL

GraphQL is a fantastic tool for creating versatile, eminently flexible servers. With API Fortress, testing GraphQL queries is as easy as testing regular REST endpoints.

API Fortress provides a demonstration GraphQL environment at:

GraphServerlet

The query and mutation below are formatted for this environment.

Testing a Query:

If the server has GraphiQL enabled, creating your queries through that tool and then copying the generated URLs into an API Fortress GET component is an acceptable process. However, this process does not lend itself to readability or replicability. The preferred method is passing the query as a POST body.

If we’re sending a query request to our GraphQL server, we would format our POST body as follows:

{
  "query": "query (\$id: Int!) { course(id: \$id) { id, title, author, description, topic, url }}",
  "variables": { "id": 1 }
}

The above object says the following: we are querying for a specific course by ID, and wish our response body to contain the ID, Title, Author, Description, Topic, and URL of that entry. If we send this body as a POST to our test GraphQL environment, our response will look like this:

{
  "data": {
    "course": {
      "id": 1,
      "title": "The Complete Node.js Developer Course",
      "author": "Andrew Mead, Rob Percival",
      "description": "Learn Node.js by building real-world applications with Node, Express, MongoDB, Mocha, and more!",
      "topic": "Node.js",
      "url": "https://codingthesmartway.com/courses/nodejs/"
    }
  }
}

Since our response is a JSON body, API Fortress is fully capable of automatically generating a test to validate future responses from this query.
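Outside of the composer, the same request is straightforward to reproduce. Below is a minimal sketch using Python's requests library, with a hypothetical endpoint URL standing in for the demo environment; note that the backslash before $ in the composer example is an API Fortress escape, and a plain $ is used here.

import requests

# Hypothetical URL; substitute the address of your GraphQL server.
GRAPHQL_URL = "https://example.com/graphql"

body = {
    "query": "query ($id: Int!) { course(id: $id) { id, title, author, description, topic, url }}",
    "variables": {"id": 1},
}

# A GraphQL query is just a JSON POST body.
response = requests.post(GRAPHQL_URL, json=body)
response.raise_for_status()
print(response.json()["data"]["course"]["title"])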

Testing a Mutation:

A mutation is a GraphQL operation that makes a change to the data being provided. Whereas a GraphQL Query performs a READ, a Mutation performs a CREATE, UPDATE, or DELETE, rounding out our CRUD operations.

A Mutation is also passed as a POST body to the GraphQL endpoint in question:

{
  "query": "mutation (\$id: Int!, \$topic: String!) { updateCourseTopic(id: \$id, topic: \$topic) { title, topic }}",
  "variables": { "id": 1, "topic": "Ruby" }
}

This Mutation executes the ‘updateCourseTopic’ operation on the database entry with the ID of 1, changing its ‘topic’ value to ‘Ruby’. Note the ‘mutation’ keyword in place of the ‘query’ keyword from the Query example. This Mutation would return the following response from our test GraphQL environment:

{
  "data": {
    "updateCourseTopic": {
      "title": "The Complete Node.js Developer Course",
      "topic": "Ruby"
    }
  }
}

Again, as this is a valid JSON response body, API Fortress is able to generate a test automatically to validate its schema in the future.
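The mutation can be exercised the same way. A short continuation of the sketch above, reusing its imports and hypothetical GRAPHQL_URL:

mutation = {
    "query": "mutation ($id: Int!, $topic: String!) { updateCourseTopic(id: $id, topic: $topic) { title, topic }}",
    "variables": {"id": 1, "topic": "Ruby"},
}

response = requests.post(GRAPHQL_URL, json=mutation)
print(response.json()["data"]["updateCourseTopic"])  # {'title': '...', 'topic': 'Ruby'}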

Import / Export Tests

API Fortress makes it very simple to import and export your tests. This is useful when moving tests to another project, migrating to another API Fortress deployment, or sharing tests with our support team for help.

Export Tests

After you log in to the platform, the flow is as follows:

  1. Click Tests for a Project (folder) of your choice
  2. To download, click the circles to the left of the test names
  3. Click Export Selected Tests

This will download a zip file containing the tests you exported.


Import Tests

Importing a test is as easy as clicking the +Import Tests button next to the +New Test button.


Different ways to compose a Request Body

In this post we will show you the different ways you can compose a Request Body: from the simplest to the most complex.

  1. Copy and paste the body from somewhere

    The first and easiest way is when we have a body from somewhere to copy and paste as-is into the call. Let’s see how this is done:

    1. In the composer, we add the POST component and type the URL and all of the required fields.
      Url: https://mydomain.com/myPost (the url of the resource you want to test)
      Variable: payload (the name of the variable that contains the response)
      Mode: json (the type of the response)


    2. Now we add the Body component and, after selecting the Content-Type, we paste the body into the Content field.
      Content-Type: application/json
      Content: {"method":"post","url":"http://www.testme.com/api/run/test"} (the body required in your call)


    3. Now we can execute the call and proceed with the test.
  2. Using Variables in the Request Body

    Another way to compose a request is using variables in the body.

    1. In the composer, we add the POST component and type the URL and all the required fields.
      Url: https://mydomain.com/myPost (the url of the resource you want to test)
      Variable: payload (the name of the variable that contains the response)
      Mode: json (the type of the response)


    2. We add the Body component. In the Content-Type field we choose the proper one, let’s say application/json. In this scenario we need to use a variable, so in the Content field we enter the following:
       {
         "user": ${user},
         "password": ${password},
         "url": "http://www.testme.com/api/run/test"
       }

      In this scenario, “user” and “password” are not passed directly in the body; they are variables defined as global parameters in the data set.

    3. The POST is now complete and can be executed. (A rough sketch of how the substitution works appears below.)
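Conceptually, the composer resolves each ${...} placeholder from the data set before the request is sent. As a loose Python analogy (the names and values here are purely illustrative):

from string import Template

# Global parameters, as they might be defined in the data set
params = {"user": "myUser", "password": "myPass"}

template = Template('{"user": "${user}", "password": "${password}", "url": "http://www.testme.com/api/run/test"}')
request_body = template.substitute(params)  # placeholders are replaced before the call is made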
  3. Using a Variable from another call

    The next way to compose a Request Body is by using a variable from another call. Let’s see how this can be done.

    1. The first thing we need to do is add the call we will retrieve the variable from. Let’s consider, as an example, the common scenario where we need to perform a login for authentication and retrieve the authentication token required for the following call.
      Url: http://www.mydomain.com/login (the url of the resource you want to test)
      Variable: payload (the name of the variable that contains the response)
      Mode: json (the type of the response)
      Header: 
         Name: Authorization
         Value: Basic YWRtaW46cGFzc3dvcmQ= (this value comes from encoding username:password in Base64)


    2. Executing the login returns the desired token in the response. Let’s view it using our console.
    3. Now we need to save the token as a variable.
      Var: token (the name of the variable)
      Variable mode: String (the type of the variable)
      Value: ${payload.access_token} (we retrieve the value from the previous 'payload')


    4. Once the token has been saved as a variable, we can proceed to add the second call and use that token in the Request Body (see the sketch after this step).
      Content-Type: application/json
      Content: {"token":"${token}"}

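End to end, the flow looks roughly like the following Python sketch (the second URL is hypothetical, and the login is assumed to be a GET returning a JSON body with an access_token field):

import base64
import requests

LOGIN_URL = "http://www.mydomain.com/login"
SECOND_URL = "http://www.mydomain.com/api/run/test"  # hypothetical

# 'Basic YWRtaW46cGFzc3dvcmQ=' is simply base64("admin:password")
credentials = base64.b64encode(b"admin:password").decode()

login = requests.get(LOGIN_URL, headers={"Authorization": "Basic " + credentials})
token = login.json()["access_token"]  # the value stored as ${token} in the test

# The second call embeds the saved token in its request body
requests.post(SECOND_URL, json={"token": token})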

  4. Using an object from another call

    In the next example we will show you a more complex case. We will consider the scenario where we need to use an object retrieved from a previous call in the body of a subsequent call. Let’s take a look at an example:

    1. First, we perform the call we retrieve the object from.
    2. Let’s execute the call in our console in order to see the response.
      {
        "id": 123,
        "items": [
          { "id": 11, "name": "stuff1" },
          { "id": 12, "name": "stuff2" },
          { "id": 13, "name": "stuff3" }
        ]
      }


    3. Let’s say we need the object ‘items’ as the body in the subsequent call. So, as a second call, we will add a POST and type the following as the body:
      ${searchPayload.items.asJSON()}

    4. Now we can proceed with the test. (A rough Python equivalent of this step is sketched below.)
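In the same Python terms as before, ${searchPayload.items.asJSON()} simply re-serializes the extracted object as the new request body (both URLs are hypothetical):

import requests

SEARCH_URL = "http://www.mydomain.com/search"  # hypothetical
POST_URL = "http://www.mydomain.com/post"      # hypothetical

search_payload = requests.get(SEARCH_URL).json()

# ${searchPayload.items.asJSON()} -- the extracted array becomes the next body
requests.post(POST_URL, json=search_payload["items"])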

     

  5. Creating a new structure to add as a body

The last scenario is the most complex of the set. In this case, we consider the scenario where we need to create a new structure to add as a body, using data from a previous call. Let’s see how we can do this:

  1. The first thing we have to do is perform the call which retrieves the data we’re using. Let’s consider a GET that returns an array of items.
  2. Let’s see the response using our console.
    {
      "items": [
        { "id": 11, "price": 5.99 },
        { "id": 12, "price": 6.99 },
        { "id": 13, "price": 10.99 },
        { "id": 14, "price": 15.99 }
      ]
    }


  3. Now we need to create the new data structure. To do so, we add a SET component as follows:
    payload.items.each { it -> it.currency='$' }; return payload.asJSON(); (for each item in the array, we add a currency attribute with "$" as its value)


  4. Now we can add the POST and use the new structure as the POST request body. (A Python equivalent of the transformation is sketched below.)
  5. That’s it. Now we can proceed with the test.
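The SET expression reshapes the payload before re-serializing it. The same transformation in Python terms (URLs hypothetical):

import requests

ITEMS_URL = "http://www.mydomain.com/items"  # hypothetical
POST_URL = "http://www.mydomain.com/post"    # hypothetical

payload = requests.get(ITEMS_URL).json()

# payload.items.each { it -> it.currency='$' }
for item in payload["items"]:
    item["currency"] = "$"

# return payload.asJSON() -- the reshaped structure becomes the new POST body
requests.post(POST_URL, json=payload)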

Note: in this post we have used the POST method, but all of the examples shown can be applied to the other REST methods. Likewise, we have demonstrated scenarios with Request Bodies, but all of the examples can also be used for Header or Param cases.

 

 

Setup Connectors (DataDog)

Here is a quick guide to setting up a DataDog integration.

  1. First, we need to generate a new API key in DataDog.
    1. Log in to your DataDog account.
    2. Hover over Integrations, then click API.
    3. Create a new API key at the top of the view (Note: You must have Admin DataDog account access.)


  1. In API Fortress go to company settings (top right gear icon)
  2. Click on Alert Groups
  3. Create a new Alert Group (if necessary)
  4. Add recipients to the Alert Group (if necessary)
  5. Click on the Connectors icon
  6. Choose one of the DataDog connectors from the dropdown
  7. Add your DataDog API Key created previously and the DataDog host you wish the connector to pass data to.


Once this process is complete, API Fortress will begin passing data to DataDog where it can be charted in any way you like!

Note: This connector shares events with DataDog; events, in this context, are outages. If you would like to include performance metrics, such as latency and fetch, please let us know and we can help set that up. It requires a small script.

Load Agent Deployment (On-Premises)

A Load Agent is a server instance that provides the simulated users in a load test. Load Testing cannot function without at least one Load Agent.

The provided files (contained in core-server.tgz) are all that you need in order to deploy a Load Agent. This tutorial will explain what changes need to be made to those files in order to properly deploy the Load Agent.

Before starting the process, there is a step that needs to be taken by clients who received their API Fortress containers before the introduction of Load Testing.

Step 0 (Not for all users) – Activate the Node Container

Open the docker-compose.yml in the main API Fortress directory. It is located at /core/bin/docker-compose.yml.

  • Paste the following code snippet in after the #RABBITMQ section and before the #APIFORTRESS DASHBOARD section:
#NODE
apifortress-node:
   image: theirish81/uitools
   hostname: node.apifortress
   networks:
      - apifortress
   domainname: node.apifortress
   labels:
      io.rancher.container.pull_image: always
  • In the links section of the #APIFORTRESS DASHBOARD configuration, add the following line:
- apifortress-node:node.apifortress
  • Save and close the docker-compose.yml.
  • Open the start_all.sh file in a code editor. It is also located in /core/bin.
  • Copy and paste the following and overwrite the entire contents of the file:
#!/bin/bash
sudo docker-compose up -d apifortress-postgres
sleep 5s
sudo docker-compose up -d apifortress-mongo
sleep 5s
sudo docker-compose up -d apifortress-rabbit
sudo docker-compose up -d apifortress-node
sleep 30s
sudo docker-compose up -d apifortress
sleep 1m
sudo docker-compose up -d apifortress-mailer
sudo docker-compose up -d apifortress-scheduler
sudo docker-compose up -d apifortress-connector
  • Your API Fortress instance can now utilize the API Fortress Node Container which powers Load Testing.

Step 1 – Unzip the provided file (core-server.tgz)

First, unzip the provided file.


Step 2 – Define the maximum users per Load Agent

Users per agent are the maximum number of virtual users that each Load Agent can provide.

It’s important to remember that large numbers of simulated users will require large amounts of hardware resources. Contact your DevOps team to develop a strategy for resource allocation. 

  • Locate and open the file named application.conf. It is located in core-server/etc.
  • Line 14 of this file (fixed-pool-size) should have its value adjusted to match the desired number of maximum users per agent.
  • Line 48 of this file (nr-of-instances) should have its value adjusted to match the desired number of maximum users per agent. These two values should match.

Step 3 – Configure config.yaml

  • Locate and open config.yaml. It is located at core-server/etc.
  • First, we have to configure the baseURL
    • baseURL is located on line 3.
    • If the Load Agent and the API Fortress Dashboard are located on the same server, then you can replace the baseURL with the internal address and port of the Dashboard on the server.
    • If the Load Agent and the API Fortress Dashboard are located on different servers, you can replace the baseURL with the actual URL of the Dashboard. That is to say, the URL you would use to access it via web browser.
  • Next, we need to provide the API Key and Secret.
    • Open the main API Fortress dashboard and click the gear icon in the upper right corner to access the settings menu
    • Click the “API Keys” option in the left sidebar.
    • Click “+API Key” 


  • Copy the API Key to line 5 of config.yaml.
  • Copy the Secret to line 6 of config.yaml.

Step 4 – Adding the Engine

  • The next step is to add the new Engine to API Fortress itself.
  • Log into API Fortress as an administrator.
  • Click the user icon in the upper right corner, and then click “Admin Panel”
  • Click “Engines” on the left side of the screen.
  • Click “+Engine”
  • Enter the name and location of the Engine.
  • The CRN value defaults to a random string. You must change it to something human-readable. This is the internal name of the engine.
  • After modifying the CRN, copy the value to line 11 of config.yaml.
  • Copy the secret to line 12 of config.yaml.
  • Select the Owning Company of the Engine. An Engine must be owned by a single company. The default value (Public Engine) should not be chosen.
  • Select “Yes” for “Dedicated to Load Testing.”
  • Click the green check to save the Engine settings.


Step 5 – Deploy the Load Agent

At the desired server location, use the “docker-compose up -d” command to deploy the Load Agent container. After the operation is complete, the Load Agent will be visible to your API Fortress Load Tests. 

Key/Value Store

The Key/Value Store

The Key/Value store allows API Fortress users to create temporary key/value pairs that can be accessed across different tests. The Key/Value store is accessed via the Key/Value Store Component.


An extremely important point to note is that these key/value pairs are temporary. They expire after 24 hours have elapsed since the last update to the value itself.

The Key/Value Store Component has 4 methods available for use. They are:

Set

Set will create a new key/value pair in the Key/Value store. The value is entered in the “Object” field.


Load

Load will recall a value from the Key/Value store when provided with a key.


 

Push

Push will add a value to the end of an existing value of the datatype “Array” in the Key/Value store. If no such key exists, it will create a new array containing the passed-in value. The passed-in value is entered in the “Object” field.


 

Pop

Pop will remove a value from the end of an existing value of the datatype “Array” in the Key/Value store.

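In plain terms, the four methods behave like operations on a shared map. A hypothetical Python analogy:

store = {}  # the Key/Value store, shared across tests (entries expire after 24 hours)

store["prods"] = "Baseball Cap"             # Set: create (or overwrite) a pair
kvprods = store["prods"]                    # Load: recall a value by key
store.setdefault("colors", []).append(999)  # Push: append to an array value, creating it if absent
popped = store["colors"].pop()              # Pop: remove and return the array's last value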

 

Basic Workflow

Let’s take a look at how this workflow works in a practical setting. The first example will be a simple set and retrieve of a value in the Key/Value Store.

First, we’ll make a GET request to an endpoint.


Next, we’ll add a K/V Store component.


This first K/V Store component (we’re going to incorporate several) is going to set the Key/Value pair in the Store, so we’re going to use “Set.”


Here, we’re setting the Key “prods” equal to “products[0].name”, which in this case evaluates to “Baseball Cap.”

Next, we’re going to retrieve this Key/Value pair from the store with the “Load” method. In the K/V Store “Load” component, we’re going to assign the retrieved value to the variable “kvprods.”


Finally, we’ll add in a “Comment” component to ensure that the data was recovered successfully.


When we run the test, we’re presented with the following result:


Success!

Push/Pop Workflow

Next, we’re going to take a look at how “Push” and “Pop” work. “Push” and “Pop” are both array methods and behave as they normally do outside of this context. “Push” will append a value to the end of an array. “Pop” will remove the last value in an array.

First, we’re going to use “Push.” It should be noted that “Pop” works similarly but with the opposite result. “Pop” also assigns the removed value to a variable which can be used in the context of the test, but can no longer be accessed from the Key/Value Store. We’ll discuss this further when we take a look at “Pop.”

First, we’re going to send a GET request and assign a key in the Key/Value Store to a value from the response body. In this case, we’re going to use “Color,” which is an array.


Next, we’re going to “Load” and “Comment” this key. We’re doing that so we can actually see the change on the test report at the end of this workflow.

The next step is to “Push” the new data on to the end of the existing array.


In this case, we’re pushing the integer 999 onto the prods array.

Finally, we’re going to “Load” the modified data into the test from the K/V Store.


When we run the test, we’re presented with the following test report.


The comments show us clearly that we have pushed the number 999 onto the array stored in the key prods. 

Now, we’ve added something to the array. Let’s remove it with “Pop!”

The first step is to introduce a “Pop” K/V Store component.


We provide the “Pop” component with the name of the key from the Key/Value Store, and the name of the variable we’d like to assign the popped value to. Remember, “Pop” removes the last value in an array and returns the value itself. In this case, we’re going to assign it to a variable called “Popped.”

Next, we’re going to recall the modified key from the Key/Value Store. Then, we’re going to Comment both the recalled Key/Value Store value AND the previously popped value.


In the Test Report, we can clearly see the full workflow. First, we assigned an array to the Key/Value Store with “Set.” Then, we added to that array with “Push.” Finally, we removed the added value with “Pop.” Each time we made a change, we used “Load” to retrieve an updated value from the Key/Value Store.


The last two comments show the final state of the array in the Key/Value Store and the popped value itself. The popped value will only be available within the scope of this test run. The array in the Key/Value Store will remain retrievable until 24 hours after its most recent modification.

Note: “Load” does not reset the timer. Only “Set,” “Push,” and “Pop” reset the timer. 

Footprint

Consider a scenario where a route has a parameter in it. Let’s take a look at an example:

http://www.whereever.com/whatever/<id>/details

Each individual test run for this route will produce a new line in the metrics view:
http://www.whereever.com/whatever/1/details
http://www.whereever.com/whatever/2/details
http://www.whereever.com/whatever/3/details
http://www.whereever.com/whatever/4/details

This sort of reporting will quickly turn our dashboard into a disorganized mess. To produce a single endpoint for reporting from each one of these calls, you can use what we call a ‘footprint.’

How is this accomplished? In the test, you need to add a Config component to the I/O component:


The Config component has two fields:
Name: the name you want to assign. In this case, you MUST enter ‘footprint’.
Value: the value for the configuration component. To set up a footprint, you must enter the same URL that’s in the I/O Component. Any parameterized portion of the URL must be wrapped in square brackets.

Based upon our example, the value in this case would be:

http://www.whereever.com/whatever/[id]/details

In the project dashboard, after every run of the test, instead of
http://www.whereever.com/whatever/1/details
http://www.whereever.com/whatever/2/details
http://www.whereever.com/whatever/3/details
http://www.whereever.com/whatever/4/details

you will find only one endpoint, displayed as:
http://www.whereever.com/whatever/[id]/details

For each endpoint you can use more square brackets, one for each variable that could assume multiple values. For example: http://www.whereever.com/[whatever]/[id]/details/[colors]/whatever

When you write the value of the config, the ‘static’ part of the endpoint can also reference a variable, as in any I/O operation.

Example: ${protocol}/${domain}/[whatever]/[id]/details/[colors]/whatever

is valid syntax.
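To illustrate the idea (this is not how API Fortress implements it), a short Python sketch showing how a bracketed footprint collapses many concrete URLs into one reporting endpoint:

import re

footprint = "http://www.whereever.com/whatever/[id]/details"

# Each [param] segment matches any single path segment
pattern = re.sub(r"\\\[\w+\\\]", "[^/]+", re.escape(footprint))

urls = [
    "http://www.whereever.com/whatever/1/details",
    "http://www.whereever.com/whatever/2/details",
]
assert all(re.fullmatch(pattern, url) for url in urls)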

 

 

Dynamic dates

Have you ever needed to pass a future date as part of a request inside of a test? Perhaps as a check-in or check-out date? You could enter it as a static value, but that means you would have to periodically update the date as time went on. Creating a dynamic date in API Fortress is a simple solution for this sort of situation.

Here’s the procedure:

  1. First, open the Composer and add a Set (variable) component
  2. Then, enter the variable name and leave the mode as String
  3. Lastly, enter the following string in the Value field:
     ${D.format(D.plusDays(D.nowMillis(),35), 'yyyy-MM-dd')}

Let’s analyse what this string means:

D.nowMillis(): returns the current Unix epoch in milliseconds
D.plusDays(): returns the provided milliseconds, plus the provided number of days (in our example, we have added 35 days to today’s date)
D.format(): creates a timestamp with the given format, using the current timezone (in our example, yyyy-MM-dd)

As a result, you will have something like 2018-05-15.

You can obtain a past date, starting from today’s date, with the following string:
${D.format(D.minusDays(D.nowMillis(),35), 'yyyy-MM-dd')}

You can also create a date based on a specified timezone:
${D.format(D.plusDays(D.nowMillis(),35), 'yyyy-MM-dd', 'America/New_York')}
The above string creates the same date as our first example, using New York (EST) as the timezone.
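For comparison, the same arithmetic in a small Python sketch (the D.* helpers are API Fortress's own; the Python below is only an illustration):

from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo

# D.format(D.plusDays(D.nowMillis(), 35), 'yyyy-MM-dd')
future = datetime.now(timezone.utc) + timedelta(days=35)
print(future.strftime("%Y-%m-%d"))  # e.g. 2018-05-15

# ...with 'America/New_York' as the explicit timezone
ny_future = datetime.now(ZoneInfo("America/New_York")) + timedelta(days=35)
print(ny_future.strftime("%Y-%m-%d"))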

For more details, you can check our reference page.

 

How to create a dynamic header

Most APIs have only one response format, either JSON or XML. But what should we do in the case of an API endpoint that can return either JSON OR XML? We could write two different tests, with one supporting each response type, but we’d be repeating a good amount of code in both tests. API Fortress allows you to test both, using the same I/O component and assertion components for almost all test cases. In a few cases, we’ll need to add a bit of extra logic to allow the platform to distinguish between the two formats.

Let’s consider an example. The API call in question requires an “Accept” header. This “Accept” header needs a value of “application/json” if you are testing the JSON case and “application/xml” if you are testing the XML case. Usually, this would require setting up two separate calls, one for each value of the header, which is not particularly adherent to the DRY rule of programming. (Don’t repeat yourself!)

API Fortress allows you to solve this issue by making only one call.

    1. Let’s start by adding the different formats as variables.
    2. Now, we can remove one call and add the format variable in the “Mode” input.
    3. The header is still static at this point. We need to turn it into a dynamic one which changes to suit the data type of the API we are testing. We add a variable component above the I/O operation that will return the appropriate value:
    4. if (format == 'xml')
         return "application/xml";
       else
         return "application/json";
      To explain: the ‘acceptHeader’ variable will have ‘application/xml’ as its value if format is ‘xml’, and ‘application/json’ otherwise (since we have only two formats, it will be ‘application/json’ only for the JSON format).
    5. Now, we can finally remove the ‘static’ header and add the ‘dynamic’ header by changing the Header value to ‘${acceptHeader}’.
      Now, the test will be executed twice: once for ‘XML’ and once for ‘JSON’, ensuring that the header will have the correct value.
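Approximated in Python, the whole pattern looks like this (hypothetical URL; in API Fortress the loop is driven by the input sets rather than written by hand):

import requests

URL = "http://www.mydomain.com/resource"  # hypothetical

for fmt in ("json", "xml"):
    accept = "application/xml" if fmt == "xml" else "application/json"
    response = requests.get(URL, headers={"Accept": accept})
    # ...assertions on the response would follow here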