Scientific Sense Podcast

Monday, May 23, 2016

Salt water bubbles

Economists, closer to salt water, appear to be prone to thoughts of inefficiency and bubbles in the financial markets, something that can be cured by a single trip to the windy city. A recent study from Columbia University (1) asserts that it found over 13,000 bubbles in the stock market between 2000 and 2013. Using supercomputers, no less, and "big data," the authors appear to have "conclusively shown" that stock prices take wild and persistent excursions from their "fair values." Unfortunately, these academics, who profess to be "data scientists," are yet to encounter the phenomenon of the "random walk" - further evidence that "data scientists" should stay away from financial markets. After all, the "physicists" who descended on Wall Street have had a checkered history of "abnormal returns" wrapped in consistent negative alpha.
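The random-walk point is easy to demonstrate: a price series generated by pure noise - no information, no irrationality - routinely wanders far from its starting "fair value" and stays there for long stretches. A minimal sketch (a toy simulation, not the Columbia methodology; the thresholds are invented for illustration):

```python
import random

random.seed(42)

def random_walk(steps, start=100.0, vol=1.0):
    """Generate a price path as a pure random walk: accumulated noise only."""
    price = start
    path = [price]
    for _ in range(steps):
        price += random.gauss(0, vol)
        path.append(price)
    return path

path = random_walk(3500)  # roughly daily steps over 14 years

# Count long "excursions": runs of 50+ consecutive steps spent more than
# 10% away from the starting value - exactly what a bubble-hunter would flag.
excursions, run = 0, 0
for p in path:
    if abs(p - 100.0) / 100.0 > 0.10:
        run += 1
    else:
        if run >= 50:
            excursions += 1
        run = 0
if run >= 50:
    excursions += 1
print(excursions)
```

A detector that counts such excursions will happily report "bubbles" in a series that, by construction, contains none.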

The remark from a graduate student from Harvard - "I expected to see lots of bubbles in 2009, after the crash, but there were a lot before and a lot after" - is symptomatic of the problem faced by "data scientists" seeking problems to solve in super-domains they have no clue about, where the participants who determine outcomes are themselves equipped with pattern-finding technology. They may have better luck in real markets, for prices in financial markets are determined by a large number of participants, each with her own inefficient algorithms. The most troubling aspect of the study is that its authors believe that "a bubble happens when the price of an asset, be it gold, housing or stocks, is more than what a rational person would be willing to pay based on its expected future cash flows." In a world immersed in intellectual property, where future cash flows cannot be forecast precisely, the value of an asset cannot be determined by such simple constructs, which have been rendered invalid for decades.
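To see why "expected future cash flows" is a weak anchor, consider a toy discounted-cash-flow valuation (illustrative parameters, not any study's model): modest shifts in assumed growth and discount rates move the "fair value" by a factor of two or more, so a price deviating 10% from one analyst's number proves very little.

```python
def dcf_value(cash_flow, growth, discount, years=10, terminal_multiple=15):
    """Toy DCF: grow a cash flow, discount it back, add a terminal value."""
    value = 0.0
    cf = cash_flow
    for t in range(1, years + 1):
        cf *= (1 + growth)
        value += cf / (1 + discount) ** t
    value += cf * terminal_multiple / (1 + discount) ** years
    return value

base = dcf_value(100, growth=0.05, discount=0.10)
bull = dcf_value(100, growth=0.08, discount=0.08)
bear = dcf_value(100, growth=0.02, discount=0.12)
print(round(bear), round(base), round(bull))  # bear and bull differ ~2x
```

The spread between the "bear" and "bull" values dwarfs any plausible bubble threshold, and these are tame assumptions for an IP-heavy firm.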

The lure of financial markets has been problematic for "data scientists" and "physicists." However, a cure is readily available in the academic literature emanating from the sixties.


Monday, May 16, 2016

Small step toward bigger hype

Recent research from the University of Liverpool (1) suggests a method by which computers could learn languages through semantic representation and similarity look-ups. Although this may be a step in the right direction, it is important to remember that most of the work in teaching computers language, or even fancy tricks, is not in the realm of "artificial intelligence" but rather belongs to the age-old and somewhat archaic notion of expert systems. Computer giants, while solving grand problems such as Chess, Jeopardy, Go and self-driving cars, seem to have forgotten that rules-based expert systems have been around since the inception of computers, long before some of these companies were founded. The fact that faster hardware can churn through larger sets of rules more quickly is not advancing intelligence, though it is certainly helping efficient computing.
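The age-old pattern is worth spelling out. A forward-chaining rule engine - the core of a classic expert system - fits in a dozen lines; the rules and facts below are invented for illustration:

```python
# A minimal forward-chaining rule engine: each rule maps a set of required
# facts to a conclusion. All rules and facts here are toy examples.
rules = [
    ({"has_fur", "gives_milk"}, "mammal"),
    ({"mammal", "eats_meat"}, "carnivore"),
    ({"carnivore", "striped"}, "tiger"),
]

def infer(facts):
    """Fire every rule whose conditions are met, until nothing new appears."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(infer({"has_fur", "gives_milk", "eats_meat", "striped"}))
```

All of the "intelligence" lives in the hand-written rules; faster hardware merely churns through more of them per second.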

Engineering schools appear to still teach ideas that are already obsolete. Programming languages have been frozen in time, with prescriptive syntax and rigid control flow. Today's high-level languages are certainly practical and immensely capable of producing inferior applications. Even those who could have "swiftly" assembled knowledge from previous attempts seem to have concocted a compiler that borrows from the worst of what has gone before it. As they proclaim "3 billion devices already run it," push an update every hour, or conduct conferences around the globe dotting and netting, the behemoths don't seem to understand that their technologies have inherent limitations.

Computer scientists, locked behind ivy walls, are given skills that the world does not need anymore.


Thursday, May 12, 2016

Nutritional genetics

Research from Indiana University (1) speculates that physical traits could be substantially impacted by food. The adage that "you are what you eat," appears to work at a deeper genetic level. In low complexity biological systems, such as ants and bees, variation in food at the larvae stage seems to explain specialization at the genetic level. If true, this has implications beyond what has been observed.

Food, a complex external chemical, has to be metabolized, utilized and purged by biological systems routinely. Although it is clear that available energy content and processing efficiency will depend on the variation and complexity in inputs, the idea that food could cause genetic specialization is fascinating. More importantly, this may lead to better design of food to favorably impact physical and mental conditions, the latter possibly holding higher promise for humans.

Ancient cultures and medicines have routinely relied on food as the primary way to remedy tactical issues. The Indiana research may provide a path to propel this idea into more systematic and planned impacts.


Thursday, May 5, 2016

No safety net

Recent research from Johns Hopkins (1) suggests there are over a quarter of a million deaths in the US per year due to medical errors. It is a sobering observation that future generations will look back on with anguish and, perhaps, incredulity. At the height of technology, we are slipping, not because of a lack of know-how, but rather a lack of application. One preventable death is too many, and the fact that medical errors are the third leading cause of death in the US is immensely troubling.

Unfortunately, technology does not solve problems by itself. Bigger data and faster computers are likely irrelevant if they cannot fundamentally influence decision processes and allow information flow to enhance decision quality. It is not about precision - there is no such thing - but about the systematic use of all available information at the point of decision. Further, the human brain, with its inherent limitations, is unable to minimize downside risk in a regime of high utilization and volatility. A loss of life, a traumatic and life-changing event for any healthcare provider, looms large, but the environment simply does not allow anything more than what is tactically possible. The lack of a safety net below cascading, complex and error-prone processes suggests the need for a sudden and impactful change that most technology companies are unable to help with.

It is high time that healthcare embraced practical applications of available technologies to improve patient health and welfare.


Saturday, April 30, 2016

Predictions with a single observation

A recent study from the University of Rochester (1) claims to improve the "Drake equation" - a constant reminder that multiplying random numbers does not provide any additional insights that are not present in the components. Now, with exoplanets galore, the study, which appeared in Astrobiology, claims it can put a "pessimistic estimate" on the probability of the non-existence of advanced civilizations elsewhere in the universe. Just as in the previous attempt, the study suffers from traditional statistics. Most agree that a single observation is not sufficient to predict anything, let alone the age and capabilities of such civilizations or the distance from Earth at which they could reasonably exist.

As the authors note, the space-time window afforded to humans is too narrow to prove or disprove anything. However, the tendency to be amazed by large numbers and the multiplicative effects of such constructs has led scientists to make non-scientific claims. Until at least a second data point becomes available, the effort expended on statistical analysis in this area is a waste of time. Availability of data is necessary but not sufficient to assign probabilities. Even those clinging to normality statistics, centuries old by now, know that they are not a good tool for making predictions.
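The point about multiplying random numbers can be made concrete with a Monte Carlo sketch of a Drake-style product (illustrative ranges, not the Rochester study's parameters): when each factor spans orders of magnitude, the product spans even more, so the output carries no insight the inputs lacked.

```python
import random

random.seed(7)

def drake_sample():
    """One draw of a Drake-style product; every range below is illustrative."""
    R_star = random.uniform(1, 10)        # star formation rate per year
    f_p    = random.uniform(0.2, 1.0)     # fraction of stars with planets
    n_e    = random.uniform(0.1, 5.0)     # habitable planets per system
    f_l    = 10 ** random.uniform(-6, 0)  # fraction developing life
    f_i    = 10 ** random.uniform(-6, 0)  # fraction developing intelligence
    f_c    = 10 ** random.uniform(-3, 0)  # fraction that communicate
    L      = 10 ** random.uniform(2, 8)   # civilization lifetime in years
    return R_star * f_p * n_e * f_l * f_i * f_c * L

samples = sorted(drake_sample() for _ in range(100_000))
lo, hi = samples[2_500], samples[97_500]  # central 95% of outcomes
print(f"95% of outcomes span a factor of {hi / lo:.0e}")
```

When the plausible answers range over many orders of magnitude, any "estimate" read off the product is a restatement of the assumptions, not a finding.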

More importantly, those awaiting ET's arrival have almost infinite flexibility to keep on searching. If one has a hypothesis, then an accumulation of negative findings against it, regardless of how many trials are possible, has to be given due consideration. As an example, if one claims that favorable conditions for life exist on Enceladus, Saturn's famous moon - water, oxygen and a heat source - then investing in the exploration of the icy rock is reasonable. But if that exploration comes up empty, the result cannot be treated as irrelevant. Just because there are a trillion other rocks in the solar system alone that could be explored, one cannot simply ignore such an observation. At the very least, it should challenge the assumptions used by the space agency and others to justify such explorations. This "new" statistics - perhaps called the "Statistics of large numbers" - in which no negative observation has any utility, is very costly, even though it is well positioned to pump out publications.

Scientists engaged in irrelevant and invalid observations, aided by large numbers, may need to challenge themselves to advance the field.


Tuesday, April 26, 2016

Uncertain networks

Recent research from MIT, Chicago and Harvard (1) contends that smaller shocks in the economy could be magnified significantly by network effects. If true, it may provide guidance on policy that tries to "jump start" large economic systems through targeted investments. If the transmission of such shocks across the economy is predictable, then it could favorably impact macro-economic decisions. However, looking back with a deterministic view of network evolution may have some downside.

Economic growth is driven by the conscious harvesting of uncertainty and not by strategic investments by bureaucrats or even corporations. Further, networks are in a constant state of evolution. Measuring GDP impact, a backward-looking measure, has less meaning in an economy driven by information, innovation and intellectual property. Firms locked into the status quo, with a rigid view of demand and supply, indeed fall prey to shocks amplified by static networks. But those keenly aware of unpredictable uncertainty and the value of flexibility could certainly surpass such external noise. The question is not how the present network amplifies shocks but rather how the networks are built. If they are built by organizations with a static view of the future, then they will be brittle and consumed by minor shocks. The measurement of intellectual property by patents is symptomatic of the adherence to known metrics and a lack of awareness of where value originates.
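The amplification mechanism itself is standard. In a Leontief-style input-output sketch (a textbook construction, not the paper's model; the sectors and coefficients are invented), a shock to a hub sector propagates through successive rounds of purchases, and the economy-wide impact exceeds the direct hit:

```python
# Propagate a sector shock through a static input-output network.
# A[i][j] is the share of sector i's inputs bought from sector j.
def propagate(A, shock, rounds=50):
    """Total impact = shock + A*shock + A^2*shock + ... (geometric series)."""
    n = len(shock)
    total = shock[:]
    current = shock[:]
    for _ in range(rounds):
        current = [sum(A[i][j] * current[j] for j in range(n)) for i in range(n)]
        total = [t + c for t, c in zip(total, current)]
    return total

# Three sectors; sector 0 is a hub that the others buy heavily from.
A = [[0.0, 0.1, 0.1],
     [0.4, 0.0, 0.1],
     [0.4, 0.1, 0.0]]
direct = [-1.0, 0.0, 0.0]   # a unit shock hitting the hub only
total = propagate(A, direct)
print(total)  # the economy-wide loss exceeds the direct hit
```

Note that the amplification is baked into the fixed matrix A; a network that rewires in response to the shock, as real ones do, would not obey this arithmetic.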

Empirical analyses in the context of accepted theories have less value for the future - policy or not. The field of economics has to evolve with the modern economy. Lack of innovation will always have a negative effect on the economy - no further analysis is needed.


Monday, April 25, 2016

Biological storage

Recent news (1) that University of Washington researchers have successfully stored and retrieved data using DNA molecules is exciting. As the world accumulates data - already many tens of millions of gigabytes, growing at an alarming rate - storage capacity and efficiency are becoming top priorities in computing. With material sciences still lagging and computer scientists clinging to the Silicon status quo, it is important that the field takes a biological direction. After all, nature has coded and retrieved complex information forever. Using the same mechanism is likely a few orders of magnitude more efficient than what is currently available.

DNA, the most important biological breakthrough in four billion years, has been almost incomprehensible to humans, arguably the best product of evolution. Lately, however, they have been able to understand it a bit better. Although some argue that the human genome map is the end game, it is likely just a humble beginning. The multifactorial flexibility afforded by the DNA molecule may allow newer ways to store binary data, making it akin to that other belated innovation - quantum computing. There, thus far, research has focused on mechanistic attempts to force-fit the idea into the Silicon matrix. Taking a biological route, perhaps aided by the best functioning quantum computer, the human brain, may be a more profitable path.
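The storage idea itself is simple at its core: with four bases, each position in a strand carries two bits. The mapping below is the naive textbook scheme, not the Washington team's actual encoding, which adds addressing and error correction:

```python
# Two bits per base: the simplest possible binary-to-DNA mapping.
TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
TO_BITS = {v: k for k, v in TO_BASE.items()}

def encode(data: bytes) -> str:
    """Pack each byte into four bases."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(dna: str) -> bytes:
    """Recover the original bytes from a strand."""
    bits = "".join(TO_BITS[base] for base in dna)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"hi")
print(strand)  # CGGACGGC - 8 bases for 2 bytes
assert decode(strand) == b"hi"
```

The real engineering difficulty is not the mapping but synthesis, sequencing and error rates, which is where the research effort actually goes.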

Biology could accelerate lagging innovation in material sciences and computer science.


Sunday, April 17, 2016

Bacteria rising

Recent research from Michigan State University (1) demonstrating the rise of multi-drug-resistant bacteria due to the overuse of antibiotics in animals is troubling. It has long been known that flu originates in farms with multi-species interactions. As mentioned in the article, the swine farms in China are particularly problematic, as they allow easy gene transfers among bacteria. This, in conjunction with the lack of antibiotics research for decades due to declining commercial economics, could result in a perfect storm.

Bacteria have been dominant all through the history of the planet. A robust architecture, coupled with fast evolution through sheer numbers, has let them largely supersede every other biological life form on earth. For the past several decades, they have been put on the back foot, for the first time, by humans. All they need, however, is a sufficient number of trials to develop resistance against any anti-bacterial agent. Data show that they are well on their way, thanks to the variety of experiments afforded to them by inter-species breeding and the overuse of known agents.
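The arithmetic of "a sufficient number of trials" is unforgiving. With an illustrative per-division mutation rate (the figures below are orders of magnitude for illustration, not measurements), the chance of at least one resistant mutant follows 1 - (1 - p)^n:

```python
import math

p = 1e-9   # chance that one cell division yields a resistant mutant (assumed)
n = 1e9    # cell divisions over the course of one infection (assumed)

# P(at least one resistant mutant) = 1 - (1 - p)^n, well approximated
# by 1 - exp(-p * n) when p is tiny.
prob = 1 - math.exp(-p * n)
print(f"{prob:.3f}")  # 0.632 - better than even odds from a 1-in-a-billion event
```

Multiply that by billions of hosts under constant antibiotic pressure and resistance stops being a possibility and becomes a schedule.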

The economics of this indicates that commercial organizations are unlikely to focus on it until it is too late. If R&D, commercial or publicly funded, is not focused on this developing problem, we may be heading toward a regime that makes Ebola look like a household pet. In a highly connected world of intercontinental travel, it is easy for single-celled organisms to hitch a ride to anywhere they would like to go. Thus, local efforts are not sufficient, and a true global push is needed to compete against the abundant experience collected over four billion years.

R&D prioritization at the societal level needs to take into account the downside risk and value of investments.