16 December 2024

Carry On Testing: A Couple of Funny QA Mishaps


Ah, software testing—where meticulous planning meets unexpected chaos and where even the most rigorous quality assurance can sometimes go awkwardly awry. As we bid farewell to 2024, it’s time to take a step back and indulge in some light-hearted reflection on the quirky side of our profession.

We’ve all been there: that moment when seemingly well-tested code does the unexpected.

Whether it’s a seemingly coincidental system crash or a code miscalculation that clears out a bank account, testing mishaps remind us that Murphy’s Law is often lurking around the corner…

The Game Where Winning Means Losing (Your System)

Back in the 1980s, I was learning to code and attending IBM training courses. This was well before PCs with graphical capability were common, but there were still text-based games available on the System/38 we were using.

During course downtime, we were allowed to play dice games, an adventure game and—my personal favourite—a racing game! Even better, I was allowed to copy the games and load them onto my company's system.

The racing game was easily my most played, and I was hooked! Playing during breaks, I would choose the track and navigate around using the right amount of acceleration, braking and steering.

I stuck at the game for weeks, trying to get a perfect lap. The tolerances were ridiculous, and I just kept crashing out—games from the 70s and 80s were generally way tougher than current ones.

Then, one day, finally, I made it! I didn’t crash once, completed a perfect lap and finished the race.

Obviously buzzing, I wanted to show off to my colleagues when, all of a sudden, the computer shut down. As I said earlier, this was the pre-PC era, and we were running the business on a minicomputer, with 100+ people using that single machine.

It was an inconvenience, but these things would happen occasionally, so we rebooted and continued working.

Fast forward a few weeks, and I was playing again. I was getting closer to another perfect lap… and then I managed it and, shock horror, the system shut down again!

This time, we suspected the game was the cause—a bit of a harsh reward for winning.

Anyway, the next time I went back to IBM, I found out that their system engineers had written the games in the machine's early days, partly in machine code, referencing something the system didn't know how to handle.

Needless to say, my company quickly implemented new policies on what we could and couldn’t use the work computer for!

Maybe They Didn’t See the Point of Testing?

In another classic case of “oops, didn’t see that coming,” my company set out to implement a system for making payments to customers via BACS (Bankers’ Automated Clearing Services). These days, of course, we all have Faster Payments.

The programmer diligently wrote the code, tested it, and gave it the green light. What could go wrong?

Well, the first time it ran, the Finance Director got a call from the bank manager concerned that the account was massively overdrawn!

Now that I think about it, this was around the time of Superman III, with its plot to siphon off fractions of a penny, but this was more cock-up than conspiracy.

So, what could have happened?

It turns out that the developer had forgotten to allow for the two decimal places, and the tape was instructing the bank to pay out 100x the amount it should have been paying!
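
To see how easily this class of bug slips through, here's a minimal sketch in Python (purely hypothetical; the original ran on a System/38), assuming amounts were held internally as integer pence while the payment file expected pounds with two decimal places:

```python
# Hypothetical sketch of a pence-vs-pounds bug, not the original code.
# Assumption: amounts are stored as integer pence; the payment file
# expects pounds with two decimal places.

def format_payment_buggy(amount_pence: int) -> str:
    # Bug: writes the raw pence value as if it were already pounds,
    # so 1234 pence (£12.34) goes out as 1234.00 -- exactly 100x too much.
    return f"{amount_pence:.2f}"

def format_payment_fixed(amount_pence: int) -> str:
    # Fix: convert pence to pounds before formatting.
    return f"{amount_pence / 100:.2f}"

amount = 1234  # £12.34 held as integer pence
print(format_payment_buggy(amount))  # "1234.00" -> the bank pays £1,234.00
print(format_payment_fixed(amount))  # "12.34"   -> the bank pays £12.34
```

Notice that a test amount of zero formats identically either way, which is one reason a superficial check can wave this kind of defect through.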

We never did find out what happened with the money, but I will hazard a guess that they didn’t get all the money back.

Share Your Testing Tales!

Now it’s your turn! What’s the funniest or most cringe-worthy testing story you’ve encountered in your career?

The best way to avoid future blunders is to learn from past mistakes—preferably with a good sense of humour—so share your tales in the LinkedIn comments!

by Stephen Davis

Stephen Davis is the founder of Calleo Software, an OpenText (formerly Micro Focus) Gold Partner. His passion is to help test professionals improve the efficiency and effectiveness of software testing.

To view Stephen's LinkedIn profile and connect: Stephen Davis LinkedIn profile

Related Articles

How to Choose The Right Test Management Tool

Test management tools ensure efficient, effective, and auditable testing processes. When choosing an enterprise-level test management tool, it’s essential to use a proven and trusted solution.


Performance Testing: 6 Reasons Companies Choose LoadRunner

Despite what some might think, LoadRunner is not a single performance testing tool but a family of world-class performance testing solutions. The family has recently undergone a name change that you may not be familiar with. Today, I will explain why the largest companies rely on LoadRunner family products.

Tomorrow’s World v. Today’s Reality: My WQR Thoughts

The World Quality Report (WQR) 2024-25 looks at the current state and future trends of Quality Engineering—albeit from a particular perspective. It is based on a survey of 1,775 executives across multiple sectors and regions. While it is a potentially interesting source of information, I’m always left asking the question: Is it useful?


LoadRunner Enterprise 24.3 – Release Highlights

Today we look at OpenText LoadRunner Enterprise 24.3, which introduced several enhancements to their enterprise-grade performance testing solution. We’ve been through the release notes so you don’t have to.

OpenText Customer? Read This to Avoid Overpaying for Support Costs

Following its acquisition of Micro Focus, OpenText has implemented a more stringent policy regarding support renewals and term/subscription renewals. This policy could increase your support costs. Today I’ll explain how to minimise costs when renewing your OpenText software support or subscriptions.


WQR: This 1 AI Recommendation Could Derail Your QA Strategy

The World Quality Report 2024-25 (WQR) provides a few interesting insights into adopting AI in software testing. And, while many of its recommendations are sound, one specific recommendation regarding AI implementation is a terrible idea for most businesses.


To get software testing insights like this direct to your inbox, join the Calleo mailing list. You can, of course, unsubscribe at any time!

By signing up you consent to receiving regular emails from Calleo with updates, tips and ideas on software testing along with the occasional promotion for software testing products. You can, of course, unsubscribe at any time. Click here for the privacy policy.
