Have you ever wondered how testing is done at big companies like Microsoft? Curious about what it takes to succeed with testing and automation across a huge enterprise with a wide range of products and technologies? Alan has seen it all and shares his years of testing wisdom, and your ears will be ringing from all the test automation knowledge bombs he drops in this episode.
About Alan Page
Alan Page has been a software tester for nearly 20 years. He was the lead author of the book How We Test Software at Microsoft and contributed chapters to Beautiful Testing and Experiences of Test Automation: Case Studies of Software Test Automation. He also writes about a variety of software engineering subjects on his blog at http://angryweasel.com/blog.
Alan joined Microsoft as a member of the Windows 95 team, and since then has worked on a variety of Windows releases, early versions of Internet Explorer, and Office Lync. Notably, Alan served for over two years as Microsoft’s Director of Test Excellence.
Alan is currently working on something new at Microsoft. He has held many roles there; most recently he was a Principal Software Design Engineer in Test (SDET) on the Xbox team, where he spent his time designing and implementing test infrastructure and tests, and coaching and mentoring testers and test managers across the Microsoft organization.
Alan also leads company-wide quality- and testing-focused communities made up of senior engineering employees.
Quotes & Insights from this Test Talk
- Exploratory Debugging – write your test, and right where your test determines whether it passed or failed, set a breakpoint and look at the variable values. What’s going on when it passes or fails? This will give you an idea for things to log — or you may find an error path that you don’t have a test for.
- The difference between short tests and long tests
- Mean Time to Diagnosis – a testing metric: when a test fails, how long does it take you to figure out why?
- If a test fails and you need to hook up a debugger or run the test again, you’ve lost the battle.
- You win the battle when you can look at the log and, within two minutes or less, say what the issue most likely is.
- Flaky tests – if you have tests that are failing but should be passing and you’re okay with that, are you also okay that you have tests that are passing but should be failing? Those are harder to find, so it’s important to have reliability on both sides. Until you can trust that your failing tests aren’t flaky, you can’t trust your pass rate, either.
- Much, much more!
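To make the exploratory-debugging and mean-time-to-diagnosis ideas above concrete, here is a minimal sketch in Python. All names (`apply_discount`, `check_discount`) are invented for illustration, not from the episode: the point is that right where the test decides pass or fail, you log the full context, so a failing run explains itself from the log alone instead of requiring a debugger re-run.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("checkout-tests")


def apply_discount(price, percent):
    """Toy function under test (invented for this sketch)."""
    return round(price * (1 - percent / 100), 2)


def check_discount(price, percent, expected):
    actual = apply_discount(price, percent)
    # Log every value involved at the pass/fail decision point, so a
    # failure can be diagnosed from the log in minutes, not by re-running
    # under a debugger. While exploring, you could instead call
    # breakpoint() here and inspect these variables live.
    log.info("price=%s percent=%s expected=%s actual=%s",
             price, percent, expected, actual)
    if actual != expected:
        raise AssertionError(
            f"apply_discount({price}, {percent}) -> {actual}, "
            f"expected {expected}")
    return actual


check_discount(100.0, 15, 85.0)
```

Stepping through `check_discount` with a breakpoint at the decision point is also a cheap way to discover error paths you have no test for yet, which is the exploratory-debugging idea in the list above.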
Resources
- How We Test Software at Microsoft
- Beautiful Testing: Leading Professionals Reveal How They Improve Software
- Experiences of Test Automation: Case Studies of Software Test Automation
- Where Good Ideas Come From
- Testing object-oriented systems
- A Practitioner’s Guide to Software Test Design
- The “A” Word – Under the Covers of Test Automation by Alan Page
- See Alan this year at the Star Canada Software Testing Analysis & Review Conference
Connect with Alan Page
- Twitter: @alanpage
- Blog: angryweasel.com
- Podcast: AB Testing – a must-listen podcast about software testing
May I Ask You For a Favor?
Thanks again for listening to the show. If it has helped you in any way, shape or form, please share it using the social media buttons you see on the page.
Additionally, reviews for the podcast on iTunes are extremely helpful and greatly appreciated! They do matter in the rankings of the show and I read each and every one of them.
Special offer for TestTalks listeners: get 20 hours of automated testing for free when you sign up with promo code testtalks14 (more info).