Artificial Intelligence is fashionable again. Aided by cheap computing and cheaper memory, the computer behemoths have been diving head first into the abyss. It has happened before, albeit with far more limited computing power, and just as in the previous iteration in the 1980s, the current excitement may die down in a few years. Simulating a human voice and winning Jeopardy! are relatively easy problems; finding a cure for cancer is far harder. Similarly, churning through limitless text, "search," is a mechanistic process, but "curing death" is not. Soon, one hopes, the "leaders" will realize the naked truth: machines do not learn, and the "singularity" they envision is many centuries away.
Machines have captured the human imagination from the start. When humans trained animals to drive farm equipment many thousands of years ago, they understood how machines could be built and powered to enhance productivity. More recently, the silicon chip, a conventional and mechanistic processor, has raised their ambitions to a level that may not be realistic. At the outset, the focus was on productivity, much as with farm equipment, but now some believe their machines can learn. If so, this would be a departure for humans from sustenance to imagination, from mediocrity to advancement, from tactics to strategy. But alas, machines do not learn!
"Machine learning," a term used too liberally by information technology and consulting firms, needs further definition. One can clearly demonstrate that machines do not learn by themselves; if that is so, we are back to using machines to enhance productivity, a posture humans have held for ten thousand years.