Scientific Sense Podcast

Saturday, December 29, 2012

Society initiated disease

A recent study from the University of Cambridge showing a clear link between the incidence of schizophrenia and deprivation, population density and societal inequality poses an important question. Do all mental health issues have an underlying environmental cause? Are individuals from a sick society more likely to develop mental diseases?

Lack of education, employment and income (deprivation), high crime and inequality (the gap between rich and poor) are characteristics of urban societies with generally high population density. Although cause and effect cannot be fully teased out, the high correlation seen between mental diseases and these types of habitats suggests, at the very least, a strong environmental effect on mental diseases. Treatment of CNS disorders by chemical means has been controversial. For example, a recent report shows that the perpetrators of the last two dozen mass killings in the US were on medication. If the environment is the primary cause of mental health issues, then both the diagnosis and the treatment of these diseases need to be rethought.

More importantly, society has to assess the total cost of design and redesign. If status-quo designs lead to segmentation in which certain parts of the system unavoidably pick up higher levels of deprivation, density and inequality, and the related high costs of mental disease, it is important to consider alternatives. The self-reinforcing nature of sick societies producing sicker inhabitants means that breaking this cycle is hard. Investing in improving these environments and their characteristics could have a high return to society if all costs are considered.

The apparent connection between mental health and societal characteristics is an important notion. Both researchers and policy-makers need to consider this in their work.

Monday, December 24, 2012

Error correction

A recent study in the Proceedings of the National Academy of Sciences demonstrates that birds are able to correct small errors in their songs but not large ones. In fact, a threshold can be demonstrated beyond which there is no recovery. If we abstract this to complex systems, it points to many different implications.

Errors arriving within certain thresholds are easier for complex organizations to correct. This seemingly obvious conclusion has many practical applications. It has been shown that small errors in airplane cockpits happen on average once every 10 minutes, giving ample time for correction. So the best way to avoid catastrophic failures is to divide aggregate error into chunks and feed them to decision-makers so that correction is optimal. This is true in airplane cockpits, companies and policy-making bodies. The inability of the bird brain, and presumably the human brain, to correct large errors should be considered in all such designs, as the sketch below illustrates.
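A minimal sketch of that idea, assuming an arbitrary correction threshold and illustrative error sizes (none of these numbers come from the study): the same aggregate error is recoverable when delivered in small batches but overwhelms correction when delivered in large ones.

```python
# Toy model of threshold-limited error correction; threshold and error
# sizes are illustrative assumptions, not taken from the PNAS study.
import random

random.seed(0)
THRESHOLD = 5                      # largest error a decision-maker can absorb at once
errors = [random.uniform(0.5, 1.5) for _ in range(40)]   # small errors arriving over time

def residual(batch_size):
    """Deliver errors in batches; any batch above the threshold goes uncorrected."""
    uncorrected = 0.0
    for i in range(0, len(errors), batch_size):
        batch = sum(errors[i:i + batch_size])
        if batch > THRESHOLD:
            uncorrected += batch   # beyond the threshold, no recovery
    return uncorrected

for size in (1, 5, 20):
    print(size, round(residual(size), 2))
# small batches stay below the threshold and get fixed;
# large batches of the same aggregate error overwhelm correction
```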

Decisions can be optimized by dividing aggregate error into packets of manageable size spread over manageable time.

Sunday, December 23, 2012

Stable pairing

The Gale-Shapley algorithm, which showed stable and optimal pairing in a multi-period game, was the basis of this year’s Nobel Prize in Economics. This demonstrates that experiments and thought processes based on market economics are the best way to advance knowledge. Market design and game theory have much to offer for better policy-making as well. Corrupting theory with such artifacts, however, is a dangerous but necessary game.
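For the curious, here is a minimal sketch of the deferred-acceptance procedure at the heart of the Gale-Shapley result; the preference lists are illustrative, not drawn from any real matching market.

```python
# Gale-Shapley deferred acceptance (proposer-optimal), minimal sketch.
def gale_shapley(proposer_prefs, reviewer_prefs):
    """Return a stable matching as {proposer: reviewer}."""
    # rank[r][p]: position of proposer p in reviewer r's list (lower = preferred)
    rank = {r: {p: i for i, p in enumerate(prefs)}
            for r, prefs in reviewer_prefs.items()}
    free = list(proposer_prefs)                    # proposers without a match
    next_choice = {p: 0 for p in proposer_prefs}   # next reviewer each will try
    engaged = {}                                   # reviewer -> current proposer

    while free:
        p = free.pop()
        r = proposer_prefs[p][next_choice[p]]      # best reviewer not yet tried
        next_choice[p] += 1
        if r not in engaged:
            engaged[r] = p                         # first proposal is accepted
        elif rank[r][p] < rank[r][engaged[r]]:
            free.append(engaged[r])                # reviewer trades up
            engaged[r] = p
        else:
            free.append(p)                         # rejected; p tries the next one
    return {p: r for r, p in engaged.items()}

proposers = {"A": ["X", "Y", "Z"], "B": ["Y", "X", "Z"], "C": ["X", "Y", "Z"]}
reviewers = {"X": ["B", "A", "C"], "Y": ["A", "B", "C"], "Z": ["A", "B", "C"]}
print(gale_shapley(proposers, reviewers))  # stable: no pair prefers each other to their match
```

The matching is stable in the sense that no proposer and reviewer would both rather be paired with each other than with their assigned partners.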

The current stalemate in Washington over an artificial and over-articulated fiscal cliff is a case in point. Although policy-makers are not equipped with a level of competence that would make the game possible, what is needed here are better auctions. Mechanisms have to be designed for auctions of policy priorities within the policy-makers’ sphere of influence and the country at large. What is at stake is a trade-off: a transfer of wealth across generations and uncertainty in growth patterns that can be influenced by the allocation of capital among private and public users. A market price of policy alternatives has to be revealed, and a pragmatic auction process needs to be designed to move toward better outcomes.

Leaving complex policy choices to a handful of idiots in the nation’s capital is neither optimal nor practical.

Saturday, December 22, 2012

The elusive truth

A recent colloquium conducted by the Kavli Institute for Cosmological Physics at the University of Chicago seems to take much comfort in the fact that more physicists are taking research on dark matter seriously. Some have also expressed hope that we are closing in on the answer. Such optimism may be premature.

Dark matter, energy and flow – mere abstract plugs to make the equations work within the accepted framework and account for observations – may be simply delaying the inevitable. Exponentially growing data have been feeding the frenzy, and every budding experimentalist around the world has been striving to find the predetermined truth – the verification of the existence of “dark matter.” Just as the Higgs boson appeared out of the blue – with some even willing to sweep aside the fact that the measurements do not quite agree, or may point to a duality – it is almost certain that dark matter is about to pour out of the collection bins, spring-loaded for ejection. Such is the state of science that questioning theory has become unfashionable and accepting uncertain experimental evidence for the stated hypothesis, the norm.

As the world nourishes another generation of tacticians, willing and able to prove the obvious and unwilling and unable to challenge the questionable, we simply sink to higher ignorance.

Friday, December 14, 2012

Removing the human

A recent report from the National Research Council asserts that a lack of national consensus on the priorities of NASA has held the agency back. This is certainly true, as it has resulted in both uncertainty in funding and reduced objectivity in the selection and design of programs. The problem faced by NASA is nothing new – the agency requires a top-down portfolio management process that maximizes societal utility, assuming that is the objective function to maximize.

Take the human spaceflight program, for example. NASA has not been able to demonstrate the need for and value of such programs, but traditionalists both inside and outside the agency always introduce these designs into the portfolio without any valuation criteria. The current objective of landing a human on an asteroid is bizarre, and it points to picking design features to satisfy all participants – scientists, politicians and propeller heads. Dumb politicians want to send people to the Moon to mine and others to Mars to colonize. What is disappointing is that the level-headed scientists and engineers at the agency will play along with these ideas just to capture the funding.

It is time NASA was held to a higher standard – that of market-based economics. It has to show how value to society is maximized by the portfolio of programs it is pursuing. This valuation has to consider the significant tail risks that exist for humanity from an asteroid collision, as well as the huge benefits that can accrue from understanding physics more deeply. Sending a human to an asteroid with a pick-axe is unlikely to fit into such a portfolio.
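A toy sketch of what such a valuation could look like: an exhaustive search over program subsets under a budget, scoring each portfolio on expected societal value plus tail-risk mitigation. The program names, costs, values and weights below are all invented for illustration; this is not NASA’s actual process.

```python
# Toy top-down portfolio selection under a budget; all numbers are
# illustrative assumptions, not real NASA figures.
from itertools import combinations

programs = {
    # name: (cost in $B, expected societal value, tail-risk mitigation value)
    "asteroid_detection":   (2.0, 3.0, 9.0),   # guards against collision tail risk
    "fundamental_physics":  (3.0, 8.0, 0.0),   # deep-physics payoff
    "crewed_asteroid_trip": (8.0, 1.0, 0.5),
    "robotic_exploration":  (4.0, 6.0, 1.0),
}
BUDGET = 10.0

def portfolio_value(names):
    """Total utility of a subset, or None if it exceeds the budget."""
    cost = sum(programs[n][0] for n in names)
    if cost > BUDGET:
        return None
    # utility = expected value + tail-risk mitigation, equally weighted here
    return sum(programs[n][1] + programs[n][2] for n in names)

best = max(
    (subset for r in range(len(programs) + 1)
            for subset in combinations(programs, r)
            if portfolio_value(subset) is not None),
    key=portfolio_value,
)
print(best, portfolio_value(best))
# the crewed asteroid trip drops out: high cost, little value or risk reduction
```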

Wednesday, December 12, 2012

Inefficient clumping

Using the income of political lobbyists, economists from the University of Warwick and the London School of Economics demonstrate that the value of who you know is at least 24% higher than that of what you know. This has been intuitively clear to most people seeking a living, but strong empirical support for it in a contained experiment is satisfying.

This has multiple implications. First, it results in inefficient clumping and a loss of meritocracy in all aspects of life. Since the value of increasing connections is much higher than that of acquiring content, more time is spent on the former. The process starts early, with most people learning to optimize utility within these competing constraints. At the micro level, it creates higher segregation between success and failure; since connectivity is self-reinforcing, these trajectories can only diverge over time. Second, at the macro level, resources are misallocated, chasing connections rather than content, with suboptimal outcomes for society.
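A toy sketch of that self-reinforcing dynamic, using simple preferential attachment (the parameters are illustrative assumptions, not from the Warwick/LSE study): each new connection goes to an agent in proportion to the connections it already has, and a few agents quickly pull away from the rest.

```python
# Preferential attachment: connections beget connections.
import random

random.seed(1)
agents = [1] * 20              # everyone starts with one connection
for _ in range(1000):          # each new connection favors the already-connected
    total = sum(agents)
    r = random.uniform(0, total)
    acc = 0
    for i, c in enumerate(agents):
        acc += c
        if r <= acc:
            agents[i] += 1     # the rich get richer
            break

agents.sort(reverse=True)
print(agents)  # a few heavily connected agents dominate: trajectories diverge
```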

Better designs of society that equilibrate the value of content and connections can set humanity on a faster trajectory to the next stage.

Sunday, December 9, 2012

Peripheral productivity

Recent revelations that the King Abdullah University of Science and Technology (KAUST), which was specifically set up to jump-start research in Saudi Arabia, may be having some growing pains demonstrate that research productivity is a complex phenomenon, not correlated with resources. We have seen this in large pharmaceutical and technology companies that turned in mediocre results when they were flush with cash.

What are the primary causes of high research productivity? Certain academic institutions in the US have shown sustained levels of highly valuable research output over long periods of time. Although these institutions have been resource-rich, that does not appear to have been the determining factor. People matter, but of course many other educational institutions have had access to an equal number of high-quality researchers. One factor that differentiates institutions, companies and organizations is their ability to pursue a diverse portfolio of “peripheral research” – topics others shun as less interesting. So it is not the fact that an organization has access to resources and high-quality researchers that portends valuable output; it is rather what it chooses to do with those riches. Increasingly, the most valuable research occurs at the boundaries of different areas – such as materials science and energy, neuroscience and computer science, space travel and biology, and economics and psychology. Research in such obscure intersections does not attract grants or focus – and those who pursue them do so at their own peril.

Research productivity cannot be bought, nor can it be extracted from processes that mingle resources, traditionalists and conventionalism. It requires a level of imagination that typically exists only in those who will remain obscure till the world learns about them.

Sunday, December 2, 2012

The knowledge weapon

A recent article in Genetics seems to confirm the groundbreaking research of biologist Spencer Wells that Native Americans and Northern Europeans are closely related. The “nursery effect,” eloquently portrayed by Dr. Wells, goes further than the limited conclusion of the current study. More importantly, it is time educators took up such information to ameliorate knowledge gaps in the next generation that eventually lead to ignorance and racism.

Two pieces of information – first, that humans survived an unimaginable bottleneck close to forty thousand years ago that reduced them to a few thousand specimens, pushing them to the edge of extinction, and second, that only a few dozen humans made it from Siberia to Alaska over the land bridge twenty thousand years ago – should substantially broaden the perspective of students across the world. The first fact implies that the seven billion people sharing the blue planet are so closely related that it will be genetically impossible to separate them within time scales spanning a few hundred generations. And the second, that all of the original occupants of the Americas originated from a handful of individuals, before they were met by their cousins taking a different route.

As they mutilate and kill across the world, as they plot to wipe out countries and nuclear factories and as they posture and haggle over land and people, let’s hope that information spreads fast and the next generation will look forward to a brighter future and shun the nightmares created by ignorance.

Wednesday, November 28, 2012

Solar maximum

Recent statistics issued by the Department of Energy (DOE) show that the price of solar photovoltaic (PV) modules continues to decline at nearly 15% per year. Solar PV, however, still remains uneconomical without subsidies, costing over $6/W for residential systems and over $3/W at utility scale. The report also indicates that the Balance of System (BOS) costs, including labor, have been declining, but they still represent nearly a third of total costs.

We are inching toward fossil fuel parity – which requires costs to fall by roughly another factor of two at utility scale and a factor of three at the residential level. In addition to research in materials science to fundamentally change module design, one cannot ignore the need to further drive down BOS costs. Simpler installation, lower losses in transmission and storage, and better energy management systems can substantially reduce the costs of installation and maintenance. Reaching fossil parity for solar is a huge game changer – effectively eliminating greenhouse concerns and ushering in the era of distributed generation, which is much more robust, providing higher security and efficiency. Such a technology will also lift developing countries clamoring for cheaper transportation and refrigeration.
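As a back-of-envelope check, assuming the roughly 15% annual decline persists and taking the parity gap as a factor of two at utility scale and three at the residential level, compounding gives the time to close the gap:

$$
t_{1/2} = \frac{\ln(1/2)}{\ln(0.85)} \approx 4.3 \ \text{years}, \qquad
t_{1/3} = \frac{\ln(1/3)}{\ln(0.85)} \approx 6.8 \ \text{years}.
$$

On those assumptions, utility-scale parity would be roughly four to five years out, and residential parity closer to seven.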

It is important to solve this problem holistically. Let’s not ignore the design of the overall system – including wiring, labor, inverters, storage and maintenance – in addition to the PV modules.

Thursday, November 22, 2012

Broken universe

A recent report in Physical Review Letters shows data that point to an anticipated break in time symmetry. This rejects symmetry in every dimension of CPT, and we are left with a broken universe with no mirror images in anything. So there is no looking back after all, and contemporary time should flow uninterrupted in a singular direction. This, coupled with the current hypothesis that the universe will expand forever toward an unimaginable and unending outcome of complete darkness, completes the pessimistic view that humans have strived to reach for generations. We are completely and permanently sealed in an irrelevant corner of a fish bowl that leaks.

Is such knowledge useful? Would the participants of a system with programmed bad outcomes like to know them in advance? How could science bridge the gap between the limited lifespan of a human and her yearning to understand a future she does not control? Are we coming full circle to the beginning, where ignorance was bliss? If knowledge cannot be shown to be any more utility-maximizing than ignorance, why would we engage in such a process?

The reason, ironically, may be that humans are programmed to seek information. This is likely a property of the system and it is no different from entropy. It seems that knowledge and entropy will increase unambiguously along an unwavering direction of time. So, it is unavoidable that humans will fully internalize the inevitable but sad truth.

Tuesday, November 20, 2012

Can you huddle?

A recent study from the American Physical Society’s Division of Fluid Dynamics demonstrates that penguins facing harsh Antarctic winters huddle optimally. There are two important findings. First, even though each individual penguin is driven by a selfish motivation to capture the maximum available heat for itself, the ultimate outcome appears to be a democratic allocation of aggregate heat among the participants. And second, the shape of the societal huddle seems to be utility-maximizing when all uncertainties are taken into account.

These findings have multiple implications. They reinforce the idea that a system in which individual participants are local utility maximizers tends toward a societal optimum. Many have been concocting theories to explain the behavior of complex systems top-down, and they get lost in the process as the complexity of their own theories overtakes them. The efficient market hypothesis, which argues that prices tend toward optimal levels through the actions of selfish participants, has been objected to by those who think it is too simplistic. Similarly, the origins of planned and socialistic societies follow a consistent disbelief in the actions of the selfish individual and a misplaced trust in the few who govern them.
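A toy sketch of that first finding – my own illustration, not the model in the study: penguins stand in a line facing the wind, and at each step the most exposed penguin selfishly relocates to the most sheltered end. Purely selfish moves end up sharing the cold almost exactly equally.

```python
# Selfish huddling toy: windward penguin always relocates leeward.
N, STEPS = 10, 100
line = list(range(N))            # position 0 is windward (coldest)
exposure = [0] * N               # cumulative cold exposure per penguin

for _ in range(STEPS):
    exposure[line[0]] += 1       # the windward penguin bears the cold this step
    line.append(line.pop(0))     # it selfishly moves to the sheltered leeward end

print(exposure)  # near-equal exposure: selfish moves yield democratic heat sharing
```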

Those who do not believe in the power of free markets, trade and societal organizations, may want to look South, where those less endowed have constructed better systems.

Saturday, November 17, 2012

Hardware or software?

A recent study seems to argue that the remarkable intellectual abilities demonstrated by Einstein are related to the unusual characteristics of his brain. The asymmetric shapes in the prefrontal, somatosensory, primary motor, parietal, temporal and occipital cortices of his brain may have contributed to his success, the study suggests. This may be premature.

By comparing the shape and structure of a singular brain against a cross-section of ordinary brains, the study seems to find oddities. However, cause and effect are not established. More importantly, the question is whether such odd shapes exist in larger populations and, if so, why such brains have not been able to push imagination to the limit. An alternative hypothesis has to be that the hardware of the brain has little to do with the demonstrated abilities of a human. It is the software – both the installed operating system (culture and societal effects) and the applications (experience and curiosity) – that makes each brain unique. It is neither the capacity to memorize nor the ability to process information quickly that leads to remarkable innovations and discoveries.

Ordinary men and women will perish cutting and dicing a remarkable organ but finding nothing important. Meanwhile, the imagination that propelled the beautiful mind to eternity will wither away.

Saturday, November 3, 2012

Occam’s razor

In every field, from science to psychology and from education to entertainment, those with simpler ideas and fewer assumptions seem to do better. Theories that can be expressed in simpler terms, educational institutions with fewer constraints, and movies, music and dramas with simpler and more enjoyable plots seem to win. Yes, economy and parsimony, terms considered anathema by those reaching for the stars under city lights, may rule again. A complex theory that fails to explain what is observed is certainly inferior to one that is simple and equally incompetent. If observed complexity in a system can be attributed to a singular origin, such as God or dark matter, then one could argue that it moves in a direction favorable to Occam’s razor. In a world basking in statistical noise, from pharmaceuticals to astrophysics and from Wall Street to the forgotten alleys, one could anticipate a break from mediocrity, but that is not certain. It is dangerous to open one’s mind in the middle of an open field in the midst of a storm and lightning.

Those who anticipate the impending singularity may be better advised to attempt to make sense out of confusion.

Saturday, October 6, 2012

Reality bites

Recent news that the Chandra observatory has produced data showing the Milky Way is embedded in a vast halo of hot gas extending many hundreds of thousands of light years, and possibly forming a bridge across the Local Group, may help reintroduce reality and temper fanciful speculation in astrophysics. The field has been dominated for many decades by those adept at concocting undefined constructs – dark matter, energy and flow – to account for the missing baryons and the inexplicable tilt of the universe. Such a simple explanation – that one has to look harder to find the thin veil – if proven correct, may inject a much-needed reality check, with many positive effects.

Occam’s razor rules, and those who forget it in any field are not advancing knowledge, just introducing noise to constrain it. One should never underestimate the power of simplification, and this is especially true in a regime of exponentially increasing data, driving every scientist into analysis-paralysis and pattern finding. Empiricism and speculation without a mathematical framework are unlikely to be useful. The tendency has been either to take the status-quo equation as sacrosanct and reject observations that do not fit, or to use observations to create hypotheses without a mathematical foundation.

Next time, perhaps, it is better to seek simpler explanations for exotic phenomena and badly behaving equations.

Tuesday, October 2, 2012

Green Brain

Recent news from the universities of Sheffield and Sussex (1) about a highly ambitious project to replicate a honey bee brain, in an attempt to advance the stagnant field of Artificial Intelligence, is encouraging. Brains driven largely by instinct are likely more amenable to replication by currently available techniques. However, such replication is closer to automation than intelligence. Research in this direction is useful for building more intelligent automatons, and adding a layer of cognition to machines could be valuable. If this can be considered distinct from the larger vision of AI – the ability to replicate the human brain in all its grandeur – we may be able to advance both fields faster.

The engineering concept of Artificial Intelligence has been stuck for many decades, attempting to connect brain replication with automation. One of the primary reasons is that the structure and semantics of contemporary software are not amenable to modeling holistic phenomena. It is easier to build an airplane or a robot systematically from component parts. Engineers have been trying to extend this basic idea to brains with very little success.

Advancing contemporary AI techniques to create brains with high programmability, such as the honey bee brain, is a useful exercise to advance robotics. But it is unlikely to advance our understanding of complex brains.

(1) 'Green Brain' project to create an autonomous flying robot with a honey bee brain. October 1, 2012.