Thursday, May 28, 2015

Accent inefficiency

A recent study from Washington University in St. Louis suggests that accents may be strongly negatively correlated with comprehension and recall among native listeners. If true, this is a significant loss of efficiency for the world's seven billion people. Humans, already reeling from a plethora of disconnected languages and incomprehensible accents, seem to have painted themselves into a corner. Language, the foundation of communication that propelled humans away from their close cousins, the chimps, may be their Achilles' heel as they struggle to understand each other.

And most humans do not understand each other. Their natural inclination, driven by evolutionary forces, has been to distrust anything foreign, in structure and in accent. The segmentation schemes they have invented, countries, religions and now accents, have kept them bottled up for nearly a hundred thousand years. Even as they take pride in their understanding of the universe, in computers that run faster than ever, in aluminum tubes that propel them across continents and into space, in medicines that keep them alive (and in pain) for an incremental five years, and in egos that keep them stressed forever, they worry primarily about color, language and accents.

As the space agencies search for dominant extra-terrestrials across space-time, as the intellectuals seek a meaning for life, as physicists seek the next particle from heavy bombardment, as economists seek to define how money flows from one hand to another, and as chemists and biologists seek to keep the dying human on life-extending machines, one has to wonder whether a world with a single language and indistinguishable accents could have made a difference.

Sunday, May 17, 2015

Downward sloping cognition

Humans, apparently at the top of the mammalian evolutionary chain, appear to lack some basic cognitive capabilities exhibited by their distant cousins, rats. A recent article in Animal Cognition describes advanced cognitive capabilities in rats, which will unilaterally lend a helping paw to another rat sinking in water. This instinctual reaction appears to supersede food-based reward, a dominant aspect of mammalian life.

Evolution, held sacred by scientists as a sure way to higher intelligence and cognition, has to be rethought. It appears that tactical advantages gained by random mutations are more likely to create freak systems, such as humans. If there is a physical reason for life, such as accelerated entropy, this does make sense at the macro level. Systems able to think through many permutations and combinations that enhance entropy will be selected, and humans certainly fit the bill. The organ they carry on their shoulders certainly helped them invent fire, and they have been burning everything they could find for over a hundred thousand years. And burning, certainly, is a sure way to increase entropy.

Somewhere along this evolutionary cycle, humans seem to have picked up some bad habits, such as observation, societal formation and learning. These traits run against the prescribed objective and will be deselected if the objective function is indeed clean and includes only positively sloped entropy. Since rats appear to be significantly less efficient than humans at accelerating entropy, the forward momentum of evolution will likely correct for any random noise that was introduced, such as empathy, knowledge and the desire for better societies.

Humans, a dominant evolutionary construct, have been efficient in optimizing a simple objective function – accelerate entropy at any cost. And the laws of physics indicate that they will get more efficient at it over time.

Friday, May 15, 2015

Seeing is not believing

Experimentalism and empiricism, cornerstones of modern scientific exploration, have substantially dampened step-function changes in knowledge in almost every field. In economics, the abundant availability of metrics and statistics has kept academics spending most of their careers proving established theories. In physics, the ability to generate data at will with heavy machines has kept any innate creativity bottled up. In the life sciences, manufacturers staffed with conventional statisticians, and a regulatory regime with little understanding of risk management, have assured that breakthrough drugs are yesterday's story.

It is a perfect storm. As a vanishing generation, steeped in qualitative and non-scientific processes of information gathering, is succeeded by another trained to see and analyze data, we are left with little hope of advancing knowledge. For the former, data do not matter; for the latter, it appears, only data matter. Neither could be further from the truth. From a societal perspective, one has to worry less about the former, as they are checking out of the ecosystem. But it is troubling to see educational systems the world over catering to processes that start from data and not from knowledge. The implicit assumption of modern science, that experimentalism and its associated empiricism are necessary conditions for the creation and establishment of theories, is fundamentally incorrect.

If we create a society enslaved to data, and thus prone to treating experimentalism and empiricism as the primary tools to generate and advance knowledge, we are doomed.

Saturday, May 9, 2015

The cost of thinking

A recent article from MIT (1) argues that consumers' decision processes in the retail arena, replete with confusing choices and an overabundance of brands, are dominated by an "indexing strategy." To reach a decision, consumers may be indexing (utilizing a bundle of proxies to compare) rather than conducting an exhaustive search, as such an optimization process would carry high cognitive costs. If true, this finding has implications for retail companies in product design, promotions, delivery and pricing.

First, it opens up a dimension in the psyche of the consumer that makes decisions more complex for the observer (the retailer) even as it simplifies the process for the consumer. In a world with a large number of close substitutes for any product or service, a consumer who prefers to minimize cognitive cost will consider only a subset of products, and that subset is not necessarily obvious to the retailer. As the consumer "indexes" against an unknown subset of substitutes, she will likely consider all aspects of comparability, since the simplified process lets her do so at an already reduced cognitive cost. Ironically, these attributes may include both physical and virtual aspects, with differing weights, making it very difficult for the retailer to define "competition" in a world of interacting product definitions.

Second, status-quo strategies such as price discounting, bundling and couponing may have a longer-lasting effect on the indexing strategy followed by the consumer. Such tactics could move the brand closer to, or away from, the indexing bundle used by the consumer. Although the impact of such tactics on the consumer's near-term decisions is ambiguous, it does increase the complexity of optimizing them. And finally, retailers who hold a rigid view of who their competition is may find themselves drifting, as consumer preferences and retailing tactics enroll them in, or remove them from, the proxy bundles the consumer considers.
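The indexing idea above can be made concrete with a toy sketch: rank a large catalogue by one cheap proxy attribute, then compare only the short list on everything. The attribute names, weights and shortlist size below are illustrative assumptions, not the MIT authors' actual model.

```python
import random

random.seed(7)

# Hypothetical catalogue: each product scored 0..1 on a few attributes.
ATTRIBUTES = ["price_value", "quality", "brand", "convenience"]
products = []
for i in range(1000):
    p = {a: random.random() for a in ATTRIBUTES}
    p["name"] = f"product_{i}"
    products.append(p)

weights = {a: 1.0 for a in ATTRIBUTES}  # equal weights, for illustration

def exhaustive_choice(candidates, weights):
    """Full optimization: score every candidate on every attribute."""
    return max(candidates, key=lambda p: sum(weights[a] * p[a] for a in ATTRIBUTES))

def indexed_choice(candidates, weights, proxy, k=10):
    """Indexing heuristic: shortlist by a single proxy attribute,
    then compare only the shortlist on all attributes."""
    shortlist = sorted(candidates, key=lambda p: p[proxy], reverse=True)[:k]
    return exhaustive_choice(shortlist, weights)

best = exhaustive_choice(products, weights)
picked = indexed_choice(products, weights, proxy="brand")
print("exhaustive pick:", best["name"])
print("indexed pick:   ", picked["name"])
```

The retailer's problem is visible in the sketch: the indexed pick depends on which proxy the consumer happens to rank by, so two consumers with identical preferences can land on different shortlists, none of which the retailer can observe directly.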

Retailers may have to move away from long held views on the competitive landscape and tactics that may have brought customers to their doorstep in the past. Flexible and dynamic strategies in design, delivery and pricing may be needed to win the consumer indexing game.

(1) "The brain in the supermarket," Science News, Mathematics & Economics, March 27, 2015.

Sunday, May 3, 2015


Statistical significance has let many industries down and built fortunes for many riding their luck. A recent paper from Duke University explains what most non-statisticians and non-financiers always knew: models that do not make practical sense are unlikely to work. The industries atop the modern economy, those that attract the best and the brightest (physics, medicine and finance), have been playing with statistical fire, discovering and proving everything there is, some for money and others for fame. As the Duke paper points out, almost any hypothesis can be proven given a sufficiently large number of trials. And proving hypotheses is front and center for any "scientific profession."

In this context, it may be interesting to make the following predictions:

1. LHC: If 6 trillion trillion collisions are made and the data analyzed, the LHC could prove that God exists. Just 6 trillion were enough to find the "God particle" at 5 sigma. This is a good experiment; proving that God exists may solve many of the vexing problems faced by humanity.

2. SETI: If an antenna were placed on every rooftop in the world and the "search" accelerated a billion-fold, SETI could prove that an ET with pointy ears, a cone head and red, white and blue stripes across the body exists in some distant galaxy.

3. Wall Street: If the number of idiots trading securities back and forth every day were increased by an order of magnitude, Wall Street could create somebody who wins on every trade for 10 years running.
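The Wall Street prediction is easy to check numerically. The sketch below is a toy simulation, not anything from the Duke paper: the trader count and the "perfect decade" criterion are illustrative assumptions, and every yearly result is a fair coin flip, so no skill exists anywhere in the population.

```python
import random

random.seed(42)

N_TRADERS = 100_000  # assumed population of traders, all flipping fair coins
N_YEARS = 10         # "wins every year for a decade" criterion

# Count traders whose every yearly result comes up a win by pure chance.
perfect_records = sum(
    all(random.random() < 0.5 for _ in range(N_YEARS))
    for _ in range(N_TRADERS)
)

expected = N_TRADERS * 0.5 ** N_YEARS  # = 100_000 / 1024, about 98
print(f"traders with a perfect {N_YEARS}-year record: {perfect_records}")
print(f"expected from chance alone: {expected:.1f}")
```

With enough trials, dozens of zero-skill "geniuses" emerge on schedule, which is exactly the multiple-trials trap the post describes: a large enough sample will always supply an impressive-looking record to celebrate.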

Statistics, the “science” that is foundational for “accelerating knowledge” of humanity, may singlehandedly bring knowledge-seekers to a standstill in the presence of “big data.”