
Scientific Sense Podcast

Wednesday, December 14, 2016

Milking data

Milk, a new computer language created by MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL), promises a fourfold increase in the speed of analytics on big data problems. Although true big data problems are still rare, even as the term is freely applied to anything from large Excel sheets to relational data tables, Milk is a step in the right direction. Computer chip architecture has stagnated, still trying to double speed every 18 months by packing silicon ever closer, with little real innovation.

Efficient use of memory has been a perennial problem for analytics, which has to deal with sparse and noisy data. Rigid hardware designs shuttle unwanted information around based on archaic design concepts, never asking whether the data transport is necessary or timely. With hardware and even memory costs in precipitous decline, there has been little pressure to change these age-old designs. Now that exponentially growing data is beginning to challenge available hardware again, and speed is needed to sift through the proverbial haystack of noise for the golden needle, we may need to innovate once more. Milk paves the path for possible software solutions.
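To make the memory problem concrete, here is a minimal sketch in C of the kind of indirect, data-dependent access pattern that work like Milk targets. The function name, the scaling constant, and the placement of a hypothetical "milk" clause on the OpenMP pragma are illustrative assumptions, not the actual Milk API; the point is only that d[idx[i]] scatters reads across memory, defeating caches and prefetchers, and that a compiler allowed to batch and reorder those reads by cache line could serve them far more efficiently.

#include <stddef.h>

/* A plain OpenMP loop over an indirect, data-dependent index array.
 * Each iteration reads d[idx[i]], so consecutive iterations touch
 * unrelated cache lines; this is the pattern Milk is said to optimize
 * by grouping accesses to the same cache line before executing them. */
void accumulate(double *restrict out, const double *restrict d,
                const size_t *restrict idx, size_t n)
{
    /* Hypothetically, a Milk-enabled compiler would accept an extra
     * clause here (syntax assumed) to defer and batch the scattered reads. */
    #pragma omp parallel for
    for (size_t i = 0; i < n; i++) {
        out[i] = d[idx[i]] * 0.5;  /* indirect read: cache-hostile as written */
    }
}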

Using just enough data at the right time to make decisions is a good habit, not only in computing but in every other arena. Over the past two decades, computer companies and database vendors sought to sell the biggest iron to all their customers on the future promise of analytics, once all the garbage had been collected and stored in warehouses. Now that analytics has "arrived," refining that garbage into usable insights has become a major problem for companies.

Extracting insights from sparse and noisy data is not easy. Perhaps academic institutions can lend a helping hand and jump-start innovation at the computer behemoths, which remain stuck in the status quo.

Monday, December 12, 2016

Democracy's event horizon

Recent results from a survey (1) of 2,200 Americans, showing that more than 1 in 4 believe the sun goes around the earth, are problematic for democracy. The system, which reflects the aggregate opinion of all participants, has served humanity well in recent years. However, the same characteristic could be its Achilles' heel, since its leaders will necessarily reflect its population. If the aggregate knowledge present in a democratic society falls below a threshold value, it can act like the event horizon of a black hole: once through it, there is no turning back, and the society spirals down to a singularity.

There have been telltale signs in many democratic societies for some time. In the world's largest democracy, elections have been decided by last names rather than policy choices. In Southern Europe, star power has been more dominant. More recently, powerful democratic countries have opted for less than optimal outcomes. All of this may imply that democracy, as a system, is straying from its originally intended use: assuring optimal outcomes for society in the long run. Instead, it is now more likely to reinforce low knowledge content, if that is what dominates.

One democracy appears to have resisted the race to the bottom. Down Under, where penalties are imposed on those who do not bother to vote, high turnout has kept the knowledge content of the electorate above the democratic event horizon. The prescription for ailing democracies returning sub-optimal results, then, is to enhance voter turnout, possibly by imposing penalties. The biased selection in the non-voter cohort may be just enough to keep the system from the plunge into the unknown.

(1) http://www.npr.org/sections/thetwo-way/2014/02/14/277058739/1-in-4-americans-think-the-sun-goes-around-the-earth-survey-says

Monday, November 28, 2016

The irony of Artificial Intelligence

The indiscriminate use of the term "Artificial Intelligence" by analysts and consultants demeans what it was originally meant to be and is still supposed to imply. Those who have worked in this area for decades, long before the internet, search, and autonomous vehicles existed, know full well that there is nothing "artificial" about it. The lack of a definition for "consciousness" has constrained computer scientists and philosophers alike from conceptualizing intelligence. Faster computers, more memory, and unlimited capital are unlikely to advance the field any further unless we return to thoughtfully studying the underlying issues. Let's do that before a few gamers think they have accidentally invented the "artificial mind."

Intelligence, something attributed solely to humans until very recently, has now been shown to be present in a plethora of biological systems. Biological systems, however, are not an infinite array of on-off switches, nor do they process information as computer systems do. So the idea that one can network a large number of silicon-based idiot boxes, using mathematics from the 60s, to replicate the brain is fraught with issues. The recent fad of "artificial general intelligence," the ability to teach a computer virtually anything so that it can not only replicate a human but become a lot better, is a nice fantasy. What the technologists may be missing is that this is not a software problem. The millennials have become good at software, but projecting what one is good at onto hard problems may not be the best path forward.

Nature had nearly 3.8 billion years to perfect the underlying hardware that delivers intelligence. It was a slow and laborious process, and it was never a software problem. Once the hardware was perfected, it could run virtually any software. This may offer a clue to those plunging headfirst into teaching machines to "think" using software and age-old mathematics. More importantly, the current architecture of computing, essentially that of a calculator, is not amenable to modeling intelligence. Just as there is a distinct difference between arithmetic and mathematics, conventional computers differ significantly from a true quantum computer. Adding and subtracting numbers fast is one thing; conceptualizing what the numbers mean is quite another.

The unfortunate term "Artificial Intelligence" has led many mediocre lives astray. And God, with an immense sense of humor, seems to still lead them down blind alleys.