Testing - a way to establish whether something is acceptable or not - is one of the world's most commonly performed procedures. Billions of years ago, Mother Nature established agile processes to test molecules, cells and organisms while creating them. Stone Age toolmakers checked that a tool did what it was supposed to do before using it. Ancient civilizations assessed people for civil service and education. How did all those tests evolve over time?
The need to scale up led to automation. Educators got standardized tests and scanners. Software Quality Assurance professionals mastered machine-driven test cases and new test automation frameworks, teaming up with software developers for better productivity. Medical testing evolved from tasting urine to sophisticated techniques and molecular diagnostics. At the forefront of test automation, electronic design engineers moved from asking "Does it work?" to "Are all elements present and working?" to "What could go wrong with this design?", focusing on defect-, circuit-, environment- and equipment-dependent variations.
Will artificial intelligence take over, testing software and hardware and interviewing people for the AI-proof jobs? Will the Internet of Things and wearables improve the assessment of people's health, educational progress and behavior? Will medical diagnostics merge with therapeutics, enhancing nature's proofreading and error-correction mechanisms? Will crowdsourcing evolve into testsourcing, with everyone everywhere being a tester of something for the public good?
Perhaps. We have already started to explore AI-powered bots that test apps, and we can see artificial intelligence challenging medical doctors on their home turf, so anything is possible.