Scientific Sense Podcast

Thursday, January 17, 2013

Returning language to the wild

Recent research from the University of Warwick shows that most people remember Facebook and Twitter updates better than faces and polished writing. This suggests that modern technology has been able to simplify the human mind and return it to its origins. When the species first stood upright in the African savannahs, simpler communication was not far behind. The origin of language, though hotly debated, had only one purpose – organizing to achieve common objectives. From its inception, language was simple, as the faster one can communicate the limited set of available information, the better.

Later, modern humans took the simple construct of language and used it as an art form to kill time and nourish the inner mind. Literature and philosophy moved language into realms it was never meant to inhabit. Both created complexity and stretched the concept in pleasurable ways, but still only for a limited subset of the population. Much later, computers ushered in the first step back in language evolution, as machines, being efficient and systematic, were not fond of ornamentation. Computer languages provided the first glimpse of the original language, able to pack maximum information into simple constructs. Now, Facebook and Twitter complete the circle, returning language to its origins. After all, a constraint on characters was exactly what the original languages faced. They had to communicate limited information efficiently.

For some, however, this is a sad story, as it is ample evidence that the human psyche will revert to its origins at the first opportunity provided by technology or society.

Wednesday, January 16, 2013

Spooky action, closer

A recent article in the journal Physical Review Letters discusses the possibility of practical teleportation using the known phenomenon of entanglement. By proving the possibility of serial and bulk teleportation of qubits, the authors open up endless possibilities in both quantum computing and space travel.

Ever since entanglement was shown to be real, it has been a constant source of inspiration and annoyance, depending on where one sits on the quantum divide. Classical and traditional physics long overwhelmed the real but counterintuitive possibilities offered up by a theory that exposed a fundamental human weakness – a bias toward one's own scale and a preference for the status quo. For over fifty thousand years, humans have been accustomed to object permanence, and even babies are born with firmware that fully accommodates such expectations. Research has provided tantalizing evidence that the brain is indeed a quantum computer, but ironically, most of it is unable to reprogram itself away from the stupor of evolution.

Extra-terrestrial life at different scales and levels of sophistication may be able to accommodate such broader frameworks faster, with an exponential increase in knowledge.

Sunday, January 13, 2013

Hundred year yawn

Recent research from the University of Arizona illustrates that the parameter space that still “fits” the “dark energy” explanation for the observed expansion of the universe is running out. The study shows that the remaining viable patch for dark energy enthusiasts is a two-inch square in a football field of available ranges. The original explanation, proposed a hundred years ago, still seems more relevant than the “theories” that have been doled out since.

Theoretical physics – a hundred year yawn – has accomplished little since the remarkable insights of Einstein, Planck and Heisenberg. More importantly, it has opened so many dead ends that it has drained the intelligence and creativity of two generations of physicists, with nothing to show. Granted, in the last few decades engineers have made telescopes, atom smashers, space vehicles and other remarkable instruments to see, listen and measure that would have made Einstein blush. But what they have delivered is noise in an expanding ocean of particles, confusion and empiricism.

There are three primary trends that have held the field back. First, educational institutions around the world appear to have given theory lower priority than experimentation. They have led budding physicists into rabbit holes of such immense size and scope that they get lost in them for their entire careers. Second, space exploration has become an activity of pride and prejudice, with entire nations competing to send people and machines to nearby planets, asteroids and moons. Now, seeking extra-terrestrial life by looking for “earth-like planets,” or finding water, is considered research of the highest order. Finally, scientists seem to have lost the pure joy of seeking the truth – instead they appear happier if they can publish more papers or grab a Nobel prize.

Let’s hope the hundred year yawn in theoretical physics will end soon – but that will take the emergence of beautiful brains again.

Saturday, January 5, 2013

Research computer

Recent work by Emory mathematician Ken Ono finds further explanation and meaning in the work of Ramanujan, who died nearly a century ago. Applications stretch in many directions, including explaining the behavior of certain types of black holes. This illustrates that groundbreaking insights can originate from pure imagination, even in the absence of a formal education and proven tools. More importantly, it is possible that the systematization of education, tools and processes is at the heart of declining productivity in true insight generation.

Most of today’s science is geared toward proving a stated hypothesis. This is akin to manufacturing – the execution of a defined and specified process that includes design of experiments, data gathering, statistical analysis and hypothesis testing. Indeed, humanity is fast approaching the “singularity,” in which computers can be trained to conduct research better than humans, for the research process is more systematized and computer-programmable than almost anything else. It is possible that the first type of Artificial Intelligence to become practical is a research computer, able to design studies, collect data, analyze, test and prove a hypothesis given to it. Such a machine, however, will be incapable of imagination and insight generation. So, it may walk, move objects and conduct research, but it will never sleep, weep or dream.
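The systematized loop described above – gather data, analyze, test a stated hypothesis – can be sketched in a few lines. This is a minimal, hypothetical illustration, not any existing system: the dataset, the effect size and the decision threshold are all invented, and a crude Welch t statistic stands in for a full statistical analysis.

```python
import random
import statistics

def run_study(control, treated, t_threshold=2.0):
    """A toy 'research computer' step: given two samples and a stated
    hypothesis (the treated group differs in mean), decide whether the
    hypothesis is supported.

    Computes Welch's t statistic for the difference in means and calls
    the hypothesis 'supported' when |t| exceeds a fixed threshold –
    a crude stand-in for a proper significance test."""
    m1, m2 = statistics.mean(control), statistics.mean(treated)
    v1, v2 = statistics.variance(control), statistics.variance(treated)
    n1, n2 = len(control), len(treated)
    t = (m2 - m1) / ((v1 / n1 + v2 / n2) ** 0.5)
    return abs(t) > t_threshold

# Hypothetical experiment: a 'treatment' that shifts the mean by 1.0.
random.seed(0)
control = [random.gauss(0.0, 1.0) for _ in range(100)]
treated = [random.gauss(1.0, 1.0) for _ in range(100)]
print(run_study(control, treated))  # the machine 'proves' the hypothesis
```

Everything here is mechanical once the hypothesis is given – which is exactly the point: the one step the sketch cannot perform is inventing the hypothesis in the first place.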

Perhaps the “singularity” that many look forward to amounts to a complete elimination of imagination from the human psyche.

Thursday, January 3, 2013

Size matters

Recent research from a university in Sweden demonstrates that a larger brain is expensive for biological entities and compromises the size of the gut and reproductive ability. Using a study on guppies, the researchers show that the increased cognitive abilities of a large brain come with a significant energy demand. Similarly, in humans, the brain, which makes up only 2% of total body mass, consumes over 20% of the available energy, starving other organs.

Given this data, it is unclear why evolution would push in the direction of aiding the growth of the most inefficient organ in the body, especially if it substantially affects the ability to reproduce. Large brains have unambiguously pushed humans, other primates, dolphins and whales toward smaller family structures, increasing the risk of extinction. Humans, for example, were nearly wiped out, with only a few thousand individuals surviving an incredible population bottleneck.

Many have considered natural selection an effective way to optimize systems. The data show this may not be true. Optimizing locally, allowing certain participants to gain an advantage over others in short time horizons, has long-term deleterious effects on the survival of the species. In the case of humans, the large brain seems to have done further damage: individuals, attempting to maximize utility locally and within their own life spans, have accelerated the destruction of the environment, paving the way to extinction.

Size matters – an evolutionary accident and myopic natural selection processes have aided the development of an inefficient and ineffective organ that will drive the species to extinction.

Wednesday, January 2, 2013

Bonobo Facebook

Recent research from Duke University shows that bonobos are much more sophisticated in social networking than humans. Several thousand years of clan behavior have left humans with distinct disadvantages in the modern world. In every realm, it has been shown that humans prefer to interact and share with those they know. Discrimination and racism – somewhat crude manifestations of the same deficiency, a proclivity to share with known individuals rather than strangers – may have a significant negative impact on the world economy.

Bonobos appear to be different. They prefer to share (food, in this case) with strangers rather than known friends and family. This allows them to extend their social network faster than otherwise. It is a highly sophisticated behavior that maximizes long-term utility, something most humans appear incapable of doing. It has also been shown that bonobos are less likely than humans to share anonymously. This further illustrates that the behavior is driven by strategy, with an understanding of long-term and system effects. The behavior of humans in social networking channels further illustrates this handicap. Most connections clump closely, paralleling clan behavior, indicating that it will be difficult for humans to shed this baggage. Xenophobia and the prevalent resistance to free markets, trade and immigration are also manifestations of the same issue.

Accelerating social software has not improved the human firmware – that is likely to remain rigid for the foreseeable future, capping economic growth and welfare.

Saturday, December 29, 2012

Society initiated disease

A recent study from the University of Cambridge that shows a clear link between the incidence of schizophrenia and deprivation, population density and societal inequality poses an important question. Do all mental health issues have an underlying societal cause? Are individuals from a sick society more likely to develop mental diseases?

Lack of education, employment and income (deprivation), high crime and inequality (the gap between rich and poor) are characteristics of urban societies with generally high population density. Although cause and effect cannot be fully teased out, the high correlation seen between mental diseases and these types of habitats suggests, at the very least, a strong environmental effect on mental diseases. Treatment of CNS disorders by chemical means has been controversial. For example, a recent report shows that the perpetrators of the last two dozen mass killings in the US were on medication. If the environment is the primary cause of mental health issues, then both the diagnosis and the treatment of these diseases need to be rethought.

More importantly, society has to assess the total cost of design and redesign. If status-quo designs lead to segmentation in which certain parts of the system unavoidably pick up higher levels of deprivation, density and inequality, and the related high costs of mental disease, it is important to consider alternatives. The self-reinforcing nature of sick societies leading to sicker inhabitants means that breaking this cycle is hard. Investing in improving these environments and their characteristics could have a high return for society if all costs are considered.

The apparent connection between mental health and societal characteristics is an important notion. Both researchers and policy-makers need to consider this in their work.

Monday, December 24, 2012

Error correction

A recent study in the Proceedings of the National Academy of Sciences demonstrates that birds are able to correct small errors in their songs but not large ones. In fact, a fault threshold can be demonstrated beyond which there is no recovery. If we abstract this to complex systems, it points to many different implications.

Errors that arrive within certain thresholds are easier for complex organizations to correct. This seemingly obvious conclusion has many practical applications. It has been shown that small errors in airplane cockpits happen on average once every 10 minutes, giving ample time for correction. So the best way to avoid catastrophic failures is to divide aggregate error into chunks and feed them to decision-makers so that correction is optimal. This is true in airplane cockpits, companies and policy-making bodies. The inability of the bird brain – and presumably the human brain – to correct large errors should be considered in all such designs.

Decisions can be optimized by dividing aggregate error into packets of manageable size spread over manageable time.
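The chunking idea can be made concrete with a tiny sketch. This is a hypothetical toy model, not drawn from the study: it simply assumes a hard correction threshold (mirroring the birds' recovery cliff) and compares delivering one large aggregate error against the same total error split into small packets.

```python
def corrected(error, threshold=1.0):
    """Assumed model: a decision-maker fully corrects any error at or
    below the threshold; a larger error goes entirely uncorrected
    (the no-recovery cliff observed in birdsong)."""
    return error if error <= threshold else 0.0

def residual(errors, threshold=1.0):
    """Total error left over after each packet is (or is not) corrected."""
    return sum(e - corrected(e, threshold) for e in errors)

total_error = 5.0
one_lump = [total_error]   # the same aggregate error, delivered at once
packets = [0.5] * 10       # ...or as ten small, manageable packets

print(residual(one_lump))  # 5.0 – too large, nothing gets corrected
print(residual(packets))   # 0.0 – every packet is within the threshold
```

Under this model the total error is identical in both cases; only the packaging differs, yet one delivery schedule is fully recoverable and the other is not.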