Optimizing Rainforest on Rainforest

Akhila Iruku, Tuesday November 15, 2016

Recently, we made an effort to optimize the way we use our own product at Rainforest in order to be more efficient with our step usage, which costs us dollars just like it does for our customers.

Why we decided to optimize the way we use Rainforest

We wanted to learn how to be smarter about QA. In the process, we eliminated some redundancies in our testing and can now use our QA resources more effectively.

By the time we were done, we had significantly changed the way we use our own product while maintaining the level of quality needed at each stage of the development process. As a result, we reduced our step usage by 50%, allowing us to add Rainforest coverage in strategic places that we had not been targeting before.

Shifting quality upstream

One of the most significant improvements we made was moving Rainforest tests upstream in our development process. Rather than running a regression suite of 50+ tests to catch issues at the end of our process, we now run a lightweight suite of Rainforest tests as soon as developers check in their code. This lightweight suite consists of ~5 smoke tests per feature that ensure all key workflows are functioning. If there is an issue, developers can address it while it’s still fresh in their minds. This makes the way we use Rainforest during development analogous to unit testing.
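To make this concrete, here is a minimal sketch of the kind of post-commit CI step that could kick off a smoke suite. It assumes the smoke tests are tagged `smoke` in Rainforest, that the `rainforest-cli` is installed with an API token in the environment, and that the `--tag` and `--fg` flags are available in your CLI version; treat the exact names and flags as assumptions and check the CLI documentation for your setup.

```python
import subprocess
import sys


def run_smoke_suite():
    """Run the lightweight smoke suite right after a commit.

    Assumes smoke tests are tagged 'smoke' in Rainforest, that the
    rainforest-cli is on PATH, and that RAINFOREST_API_TOKEN is set
    in the environment (all assumptions for this sketch).
    """
    result = subprocess.run(
        ["rainforest", "run", "--tag", "smoke", "--fg"],
        check=False,
    )
    return result.returncode


if __name__ == "__main__":
    # A non-zero exit code fails the CI job, surfacing the issue
    # to the developer while the change is still fresh.
    sys.exit(run_smoke_suite())
```

Wiring a script like this into the commit stage of CI means a failing smoke test flags the change while the developer still has the context in mind.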

By running a lightweight test suite earlier in the development cycle, we can be more strategic about when we run our larger regression suite before a release. This saves us steps, but more importantly, it makes our release process much smoother: our critical flows have been checked repeatedly throughout the cycle, so we avoid being overwhelmed with Rainforest failures during the final release process.

Testing based on business value

We also revisited the set of browsers we test with Rainforest and are taking a more balanced approach. Rather than placing equal emphasis on all browsers, we now run tests more frequently on the browsers with the most traffic and run our regression suite less often against lower-traffic browsers. Overall, we save steps without compromising quality.
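As a rough sketch of this idea, a scheduled job can weight how often each browser gets the full regression run by its share of traffic. The browser names, the `regression` tag, and the `--browsers` flag below are assumptions for illustration rather than a prescribed setup:

```python
import datetime
import subprocess

# Hypothetical traffic-weighted schedule: the highest-traffic browsers
# get the regression suite every night; lower-traffic browsers get it
# once a week. Browser identifiers here are assumptions for illustration.
BROWSER_CADENCE = {
    "chrome": "daily",
    "firefox": "daily",
    "safari": "weekly",
    "ie11": "weekly",
}


def browsers_for_today(today=None):
    """Return the browsers whose cadence is due today."""
    today = today or datetime.date.today()
    weekly_day = today.weekday() == 0  # run weekly browsers on Mondays
    return [
        name
        for name, cadence in BROWSER_CADENCE.items()
        if cadence == "daily" or (cadence == "weekly" and weekly_day)
    ]


def run_regression(browsers):
    # Assumes a 'regression' tag and a --browsers flag in rainforest-cli;
    # both are assumptions for this sketch.
    subprocess.run(
        ["rainforest", "run", "--tag", "regression",
         "--browsers", ",".join(browsers), "--fg"],
        check=True,
    )


if __name__ == "__main__":
    run_regression(browsers_for_today())
```

Run on a nightly schedule, a job like this keeps high-traffic browsers covered every day while lower-traffic browsers still get regular regression passes.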

With these changes, we can be confident in the quality of our product from the early stages of development rather than working backwards from the end of the cycle.

Key Takeaways:

1) Push quality upstream by running a lightweight test suite after each commit.

Result: Fix errors early on and reduce the number of failures that come up pre-release.

2) Test across browsers and devices on Rainforest in proportion to how your users actually use your application.

Result: Ensure cross-browser coverage while using your steps effectively.

Have questions? Contact your Customer Success Manager or csm@rainforestapp.com for a consultation on how you can optimize your use of Rainforest.

Learn more about Rainforest