Scientific Sense Podcast

Saturday, December 29, 2012

Society initiated disease

A recent study from the University of Cambridge showing a clear link between the incidence of schizophrenia and deprivation, population density and societal inequality poses an important question. Do all mental health issues have an underlying cause? Are individuals from a sick society more likely to develop mental diseases?

Lack of education, employment and income (deprivation), high crime and inequality (the gap between rich and poor) are characteristics of urban societies with generally high population density. Although cause and effect cannot be fully teased out, the high correlation seen between mental diseases and these types of habitats suggests, at the very least, a strong environmental effect on mental diseases. Treatment of CNS disorders by chemical means has been controversial. For example, a recent report shows that the perpetrators of the last two dozen mass killings in the US were on medication. If the environment is the primary cause of mental health issues, then both the diagnosis and the treatment of these diseases need to be rethought.

More importantly, society has to assess the total cost of design and redesign. If status-quo designs lead to segmentation in which certain parts of the system unavoidably pick up higher levels of deprivation, density and inequality, and the related high costs of mental disease, it is important to consider alternatives. The self-reinforcing nature of sick societies producing sicker inhabitants means that breaking this cycle is hard. Investing in improving these environments and their characteristics could yield high returns to society if all costs are considered.

The apparent connection between mental health and societal characteristics is an important notion. Both researchers and policy-makers need to consider this in their work.

Monday, December 24, 2012

Error correction

A recent study in the Proceedings of the National Academy of Sciences demonstrates that birds are able to correct small errors in their songs but not large ones. In fact, a threshold fault can be demonstrated beyond which there is no recovery. If we abstract this for complex systems, it points to many different implications.

Errors that arrive within certain thresholds are easier for complex organizations to correct. This seemingly obvious conclusion has many practical applications. It has been shown that small errors in airplane cockpits happen on average once every 10 minutes, giving ample time for correction. So the best way to avoid catastrophic failures is to divide aggregate error into chunks and feed them to decision-makers so that correction is optimal. This is true in airplane cockpits, companies and policy-making bodies. The inability of the bird brain, and presumably the human brain, to correct large errors should be considered in all such designs.

Decisions can be optimized by dividing aggregate error into packets of manageable size spread over manageable time.

Sunday, December 23, 2012

Stable pairing

The Gale-Shapley algorithm, which showed stable and optimal pairing in a multi-period game, was the basis of this year’s Nobel Prize in Economics. This demonstrates that experiments and thought processes based on market economics are the best way to advance knowledge. Market design and game theory have much to offer for better policy-making as well. Corrupting theory with such artifacts, however, is a dangerous but necessary game.
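The deferred-acceptance procedure at the heart of that work can be sketched in a few lines. The following is a minimal illustration, not the exact formulation of the original paper; the names and preference lists in the example are hypothetical.

```python
# Minimal sketch of the Gale-Shapley deferred-acceptance algorithm.
# One side proposes in order of preference; the other side tentatively
# holds the best proposal seen so far. The result is a stable matching:
# no pair would prefer each other to their assigned partners.

def gale_shapley(proposer_prefs, reviewer_prefs):
    # rank[r][p] = position of proposer p in reviewer r's list (lower is better)
    rank = {r: {p: i for i, p in enumerate(prefs)}
            for r, prefs in reviewer_prefs.items()}
    free = list(proposer_prefs)                    # proposers not yet matched
    next_choice = {p: 0 for p in proposer_prefs}   # next reviewer each will try
    match = {}                                     # reviewer -> proposer

    while free:
        p = free.pop()
        r = proposer_prefs[p][next_choice[p]]
        next_choice[p] += 1
        if r not in match:
            match[r] = p                           # first proposal is accepted
        elif rank[r][p] < rank[r][match[r]]:
            free.append(match[r])                  # reviewer trades up
            match[r] = p
        else:
            free.append(p)                         # rejected; p proposes again later

    return {p: r for r, p in match.items()}

# Hypothetical example: two proposers, two reviewers.
proposers = {"a": ["x", "y"], "b": ["y", "x"]}
reviewers = {"x": ["b", "a"], "y": ["a", "b"]}
print(gale_shapley(proposers, reviewers))  # {'b': 'y', 'a': 'x'}
```

A well-known property of this procedure is that it is optimal for the proposing side, which is exactly the kind of design lever that matters when the "market" being designed is a policy auction rather than a marriage market.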

The current stalemate in Washington, over an artificial and over-articulated fiscal cliff, is a case in point. Although policy-makers are not equipped with a level of competence that will make the game possible, what is needed here are better auctions. Mechanisms have to be designed for auctions of policy priorities within the policy-makers’ sphere of influence and the country at large. What is at stake is a trade-off: a transfer of wealth across generations and uncertainty in growth patterns that can be influenced by the allocation of capital among private and public users. A market price of policy alternatives has to be revealed, and a pragmatic auction process needs to be designed to move toward better outcomes.

Leaving complex policy choices to a handful of idiots in the nation’s capital is neither optimal nor practical.

Saturday, December 22, 2012

The elusive truth

A recent colloquium conducted by the Kavli Institute for Cosmological Physics at the University of Chicago seems to take much comfort from the fact that more physicists now take research on dark matter seriously. Some have also expressed hope that we are closing in on the answer. Such optimism may be premature.

Dark matter, energy and flow – mere abstract plugs to make the equations work within the accepted framework and account for observations – may simply be delaying the inevitable. Exponentially growing data have been aiding the feeding frenzy, and every budding experimentalist around the world has been striving to find the predetermined truth – verification of the existence of “dark matter.” Just as the Higgs boson appeared out of the blue – some even willing to sweep aside the fact that the measurements do not quite agree, or may point to a duality – it is almost certain that dark matter is about to pour out of the collection bins, spring-loaded for ejection. Such is the state of science that questioning theory has become unfashionable and accepting uncertain experimental evidence for the stated hypothesis, the norm.

As the world nourishes another generation of tacticians, willing and able to prove the obvious and unwilling and unable to challenge the questionable, we simply sink to higher ignorance.

Friday, December 14, 2012

Removing the human

A recent report from the National Research Council asserts that a lack of national consensus on the priorities of NASA has held the agency back. This is certainly true, as it has resulted in both uncertainty in funding and reduced objectivity in the selection and design of programs. The problem faced by NASA is nothing new – the agency requires a top-down portfolio management process that maximizes societal utility, assuming that is the objective function to maximize.

Take the human spaceflight program, for example. NASA has not been able to demonstrate the need for and the value of such programs, but traditionalists both inside and outside the agency always introduce these designs into the portfolio without any valuation criteria. The current objective of landing a human on an asteroid is bizarre, and it points to picking design features to satisfy all participants – scientists, politicians and propeller-heads. Dumb politicians want to send people to the Moon to mine and others to Mars to colonize. What is disappointing is that the level-headed scientists and engineers at the agency will play along with these ideas just to capture the funding.

It is time NASA was held to a higher standard – that of market-based economics. It has to show how value to society is maximized by the portfolio of programs it is pursuing. This valuation has to consider the significant tail risks that exist for humanity from an asteroid collision, as well as the huge benefits that can accrue from understanding physics more deeply. Sending a human to an asteroid with a pick-axe is unlikely to fit into such a portfolio.

Wednesday, December 12, 2012

Inefficient clumping

Using the incomes of political lobbyists, economists from the University of Warwick and the London School of Business demonstrate that the value of whom you know is at least 24% higher than that of what you know. This has been intuitively clear to most people seeking a living, but strong empirical support for it from a contained experiment is satisfying.

This has multiple implications. First, it results in inefficient clumping and a loss of meritocracy in all aspects of life. Since the value of increasing connections is much higher than that of acquiring content, more time is spent on the former. The process starts early, with most people learning to optimize utility within these competing constraints. At the micro-level, it creates greater segregation between success and failure; since connectivity is self-reinforcing, these trajectories can only diverge over time. Second, at the macro-level, resources are misallocated, chasing connections rather than content, with suboptimal outcomes for society.

Better designs of society that equilibrate the value of content and connections can set humanity on a faster trajectory to the next stage.

Sunday, December 9, 2012

Peripheral productivity

Recent revelations that the King Abdullah University of Science and Technology (KAUST), which was specifically set up to jump-start research in Saudi Arabia, may be having some growing pains demonstrate that research productivity is a complex phenomenon, not correlated with resources. We have seen this in large pharmaceutical and technology companies that turned in mediocre results when they were flush with cash.

What are the primary causes of high research productivity? Certain academic institutions in the US have shown sustained levels of highly valuable research output over long periods of time. Although these institutions have been resource-rich, that does not appear to have been a determining factor. People matter, but of course many other educational institutions have had access to an equal number of high-quality researchers. One factor that differentiates institutions, companies and organizations is their ability to pursue a diverse portfolio of “peripheral research” – topics others shun as less interesting. So it is not the fact that an organization has access to resources and high-quality researchers that portends valuable output; it is rather what it chooses to do with those riches. Increasingly, the most valuable research occurs at the boundaries of different areas – such as materials science and energy, neuroscience and computer science, space travel and biology, and economics and psychology. Research in such obscure intersections does not attract grants or focus – and those who pursue it do so at their own peril.

Research productivity cannot be bought, nor can it be extracted from processes that mingle resources, traditionalists and conventionalism. It requires a level of imagination that typically exists only in those who will remain obscure till the world learns about them.

Sunday, December 2, 2012

The knowledge weapon

A recent article in Genetics seems to confirm the groundbreaking research of biologist Spencer Wells that Native Americans and Northern Europeans are closely related. The “nursery effect,” eloquently portrayed by Dr. Wells, goes further than the limited conclusion of the current study. More importantly, it is time educators took up such information to ameliorate knowledge gaps in the next generation that eventually lead to ignorance and racism.

Two pieces of information should substantially broaden the perspective of students across the world: first, that humans survived an unimaginable bottleneck close to forty thousand years ago that reduced them to a few thousand specimens, pushing them to the edge of extinction; and second, that only a few dozen humans made it from Siberia to Alaska over the land bridge twenty thousand years ago. The first fact implies that the seven billion people sharing the blue planet are so closely related that it will be genetically impossible to separate them within time scales spanning a few hundred generations. The second implies that all of the original occupants of the Americas originated from a handful of individuals, before they were met by their cousins taking a different route.

As they mutilate and kill across the world, as they plot to wipe out countries and nuclear factories and as they posture and haggle over land and people, let’s hope that information spreads fast and the next generation will look forward to a brighter future and shun the nightmares created by ignorance.