The following sections discuss Telligent's strategy for testing and verifying the quality of its products.

How we test releases of Telligent's products

We have significantly evolved how we test our products throughout development, packaging, and release. Our latest releases of Telligent Community, Telligent Enterprise, and Telligent Analytics demonstrate high quality as a result of these improved test and release processes.

Quality strategy

Our quality assurance strategy is centered on continually evolving our testing process to prevent bugs from being created as the software is developed or, worse yet, from being discovered by our customers. We do extensive manual testing and use a number of automated tools to validate that our products function properly. We are also using our products in real-world scenarios as they're being actively developed, which helps us to identify problems before our customers do.

Manual testing

Our manual test strategy uses extensive and continually updated test plans to cover our products with formal test cases. We take care to test our products on all supported platforms and browsers. We also make use of our Quality Assurance team's experience by unleashing them on ad hoc exploratory testing, in which they carefully examine different areas of our products.

Test plans

Manual test plans lay out individual test cases to check correct functionality across our various products. We divide our test plans into two parts: P0 and Depth. P0 tests are "line of death" checks that ensure proper behavior of critical features or functionality. We run P0 testing on our components weekly. Depth tests are much more time-consuming and check important but less-critical functionality.

We've worked hard to evolve our test plans for the latest releases. We have dramatically expanded the number of test cases we cover, and we have created new test plans for components that weren't formally covered previously.

For the first release of Telligent Enterprise 2.0, we completed a total of 366 test cases. For the Telligent Enterprise 2.5 release, we completed over 1,300 test cases - with a pass rate of 98%. For the latest release, Telligent Enterprise 3, we repeatedly executed over 3,000 individual manual test cases. These cases covered more areas than any previous release.

Telligent makes our test plans available to anyone who is interested in seeing exactly how we're testing our software. The test plans can also help customers validate their own installations of Telligent products, both for out-of-the-box and customized installations.

Ad hoc testing

Ad hoc testing encourages our team to use their experience and knowledge of the system to explore areas of the system which may not fall under automated or manual test plans. Ad hoc testing takes place during many different phases of our work including manual test passes, bug validation, and general exploratory testing.

Platform test matrix (Telligent Community and Telligent Enterprise)

Telligent products run on several different platforms, so we cover a number of different combinations of platforms and browsers during our testing.

Software                  IE 7, 8, and 9    FF 3.5 and 4.0
Server 2008 + SQL 2008          X                 X
Server 2003 + SQL 2005          X                 X


We also do checks with Internet Explorer 6 in order to support our few customers who aren't yet able to move to a more stable browser.

Automated testing

For automated testing, we use a variety of tools to create a suite of tests that are repeatable and can be run with the click of a metaphorical button. These suites ensure features work as advertised, and also provide a safety net against regressions. These automated tests are run against our system on a regular basis - in some cases, several times each day.
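
As a rough illustration of the "one click" idea (these test cases are hypothetical stand-ins, not Telligent's actual harness), a repeatable suite can be assembled and run with a single call:

```python
import unittest

# Hypothetical stand-in test cases; a real suite would be discovered
# from the product's codebase rather than defined inline.
class SlugTests(unittest.TestCase):
    def test_lowercases_and_hyphenates(self):
        self.assertEqual("hello-world", "Hello World".lower().replace(" ", "-"))

class MathTests(unittest.TestCase):
    def test_sum(self):
        self.assertEqual(6, sum([1, 2, 3]))

def run_suite():
    """Build and run the whole suite in one call -- the 'metaphorical button'."""
    loader = unittest.TestLoader()
    suite = unittest.TestSuite()
    suite.addTests(loader.loadTestsFromTestCase(SlugTests))
    suite.addTests(loader.loadTestsFromTestCase(MathTests))
    return unittest.TextTestRunner(verbosity=0).run(suite)

if __name__ == "__main__":
    result = run_suite()
    print("passed" if result.wasSuccessful() else "failed")
```

Because the entire run is a single function call, the same suite can be triggered by a scheduler or a build server several times a day without any manual setup.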

We have a range of automated tests across all our products and components. We use unit tests to check small, discrete functionality within components of our software. Integration tests check behavior between system components such as REST Web services. Functional tests ensure all user features operate correctly.

Unit tests

Developers are responsible for writing unit tests to validate portions of the work they do. Our latest release had over 730 unit tests.
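
As a sketch of what such a unit test looks like (the slugify helper below is a hypothetical example, not code from Telligent's products):

```python
import unittest

def slugify(title):
    """Hypothetical helper: turn a post title into a URL-friendly slug."""
    cleaned = "".join(ch if ch.isalnum() or ch == " " else "" for ch in title)
    return "-".join(cleaned.lower().split())

class SlugifyTests(unittest.TestCase):
    """Small, discrete checks of a single component -- the unit-test spirit."""

    def test_spaces_become_hyphens(self):
        self.assertEqual("my-first-post", slugify("My First Post"))

    def test_punctuation_is_dropped(self):
        self.assertEqual("hello-world", slugify("Hello, World!"))

    def test_empty_title(self):
        self.assertEqual("", slugify(""))

if __name__ == "__main__":
    unittest.main(argv=["slugify_tests"], exit=False)
```

Each test exercises one narrow behavior of one function, so a failure points directly at the code that broke.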

Integration tests

Our Web Services team creates integration tests around every piece of software they write. These integration tests exercise the Web services components to ensure proper behavior and fend off any regression bugs.

We started work on the Telligent Enterprise 2.0 release with 75 integration tests and had grown to 630 by the Telligent Enterprise 3 release. Across all categories, Telligent Enterprise 3 was covered by over 3,000 manual tests and over 16,000 automated cases.
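
To illustrate the idea (against a stand-in endpoint, not Telligent's actual REST API), an integration test can stand up a service and exercise it over real HTTP:

```python
import json
import threading
import unittest
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class StubApiHandler(BaseHTTPRequestHandler):
    """Stand-in REST endpoint; a real test would target the product's API."""
    def do_GET(self):
        if self.path == "/api/users/1":
            body = json.dumps({"id": 1, "name": "admin"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, fmt, *args):
        pass  # keep test output quiet

class RestIntegrationTests(unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        # Port 0 lets the OS pick a free port, so tests never collide.
        cls.server = HTTPServer(("127.0.0.1", 0), StubApiHandler)
        cls.port = cls.server.server_address[1]
        threading.Thread(target=cls.server.serve_forever, daemon=True).start()

    @classmethod
    def tearDownClass(cls):
        cls.server.shutdown()

    def test_get_user_returns_expected_json(self):
        url = f"http://127.0.0.1:{self.port}/api/users/1"
        with urllib.request.urlopen(url) as resp:
            self.assertEqual(200, resp.status)
            data = json.loads(resp.read())
        self.assertEqual("admin", data["name"])

    def test_unknown_resource_is_404(self):
        url = f"http://127.0.0.1:{self.port}/api/users/999"
        with self.assertRaises(urllib.error.HTTPError):
            urllib.request.urlopen(url)

if __name__ == "__main__":
    unittest.main(argv=["rest_tests"], exit=False)
```

Unlike a unit test, this exercises the full request/response path - serialization, status codes, and routing - which is exactly where regressions between components tend to hide.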

Functional tests

The latest release has over 10,000 functional tests using the Selenium Web test framework, nearly 6,000 integration tests for our APIs, and almost 1,000 integration and unit tests covering the internal functionality of our core platform.


Web tests

The QA team also performs testing via automated browser sessions. Like the REST integration tests, these tests verify proper functionality and provide another regression safety net, but they exercise the product through the user interface itself rather than at the Web services layer.

Using our own product day in, day out

We use Telligent Enterprise as the backbone for our corporate intranet. We update our intranet site roughly every two weeks with the most current version we are developing. This helps us in a number of ways.

First, we get immediate feedback from real users on new features: Are the features working as expected? Do they fill a real need? Second, we get users who exercise the system in ways our test plan doesn't cover. This helps us find bugs we might have otherwise missed, and it helps us expand our test plan with those new use cases.

Telligent’s intranet, like every other corporation’s, provides critical functionality necessary to the success of the company. We couldn’t roll out our regular releases and be successful if our quality was too low – our own colleagues wouldn’t be able to get their daily work done and we’d rapidly lose their trust.

Delivering well-tested product releases

As pointed out above, we are constantly testing our products as they are being developed; however, we focus carefully on testing during our release-for-delivery process. Every product package gets additional attention as it is completed and readied for release to our customers.

In addition to the regular testing already discussed, our major releases such as Telligent Analytics 3.5 undergo several additional test passes, during which the entire product development team focuses on executing detailed test passes on each of our Release Candidate packages. This final vetting gives the entire system an additional level of scrutiny before it ships.

Telligent also releases hotfixes for our products. These differential versions hold incremental fixes we release to our customers in between regular product versions. Our hotfix releases undergo the same level of testing as our regular releases do; however, we also take care to specifically validate each issue we’re resolving in that release.

Moving forward

We’re committed to continuing the expansion of our testing, both manual and automated. We’re aggressively using customer feedback to evolve our testing plans as we resolve issues identified in our support forums, custom services, and formal support chains. Additionally, each new unit of development drives test cases specific to that work.

The next releases of Telligent Enterprise, Telligent Community, and Telligent Analytics will see vastly expanded test coverage, ensuring we continue to improve the quality of products we are delivering to our customers.

You can view a copy of Telligent's Core Test plan.