
Tuesday, October 9, 2018

Right-brained Artificial Intelligence

As Artificial Intelligence becomes more commonplace through hype and reality, it may be useful to characterize the plethora of methodologies and technologies that make up the thoroughly confusing medley. Conventional AI appears to be largely driven by the left brain, as engineers, data scientists, and technologists flock to the dream, ably assisted by capital seeking returns somewhere. Generally speaking, that is a prescription for disaster, as technology, data, and mathematics do not typically solve any problem of importance to enterprises. Granted, game playing is interesting and faking human voices and interactions is equally compelling, but none of these is going to change anything in the lives of ordinary people. And they add little value to the economy or even to companies.

The search giant recently proclaimed that AI agents get more aggressive as they get better. This observation is not substantially different from the Twitter chatbot created by another giant that turned nasty. What these companies seem to be missing is that building AI bottom-up from historical data will simply reflect existing information content. More generally, such AI agents will reflect the society that produced the data, and such observations add no value to the emerging arena, except as talking points. And as the hardware company found out down South recently, transforming an organization requires a bit more than a "pizza-sized box," even one that has apparently solved most of the world's problems already.

It is about economics, stupid. And that requires the silent right brain. AI has enormous potential, but only if it is developed with right-brain dominance. It is a tough task, as the normally shy right brain prefers to work from the background and simply muffles the noise created by the left hemisphere. The old-fashioned concept of "seeing the big picture" is still very important before diving into the details. Education systems tend to churn out left-only brains in great numbers, and this is problematic for the emerging regime. Scientists, whether real or of the data kind, cannot solve the problems facing humanity, let alone companies.

As the singularity enthusiasts revise the date of arrival of the discontinuity, it is important to remember that civilizations did not advance by tactics in the past, and it is unlikely they will in the future.

Saturday, September 22, 2018

Gut feel

A recent finding (1) showing that the gut uses a variety of communication channels to communicate rapidly with the brain has implications for the prevention and treatment of many diseases, including obesity and the metabolic syndrome spectrum. Many had this intuition, but hard data now shows that humans are driven largely by their gut, with the brain playing the role of a computer, merely calculating and shuttling instructions. This is not surprising. For over half a million years, humans sought food for survival, and the gut and its occupants, the constituents of the microbiome, have reigned supreme. The overgrown appendage, the brain, serves little purpose in the grand scheme of things.

The recent reversal of roles has the brain scrambling to divide itself into two halves - the tactical and the strategic. Enormous excess capacity allows it to process ancient rules and instructions without breaking a sweat. The ensuing boredom has led it to seek utility from abstract ideas such as art, music, and literature. With science and technology in the background, not requiring significant processing power, the brain can float above triviality and the routine.

This is problematic. An organ, largely intended as a conventional computing resource, is wasting itself, getting involved in thought experiments, at least from the perspective of the gut. A bifurcating human architecture, still fundamentally managed by the microbiome in the gut, coupled with a confusing potpourri of capabilities upstairs, could portend disaster. The toys they have invented are now growing into entities without guts and that will certainly pose a challenge to the declining species.

The decline of the advanced entity is predictable, but that has implications for the most successful species of the past four billion years. A world controlled by silicon and light will be bad news for the microbiome. It is unlikely that the species that dominates the universe will let that happen.


(1) http://www.sciencemag.org/news/2018/09/your-gut-directly-connected-your-brain-newly-discovered-neuron-circuit

Thursday, September 6, 2018

Time to wake up and face technology


A recent article (1), noting that advancing technology has initiated productive scientific regimes, speculates that Artificial Intelligence could be the next engineering innovation to speed up biological and chemical advancement. It makes sense, but life sciences, an old and traditional industry, has been lagging in the application of technology. As high-energy physics, economics, and even business embrace rapidly advancing AI, life sciences and healthcare have been reluctant.

Historical friction, the result of blind alleys scientists followed on the strength of prescriptive mathematics, is one reason. Biology remains the last frontier, where uncertainty and non-linear interactions have kept technologists from making measurable progress. Half a million years of trial and error could not be replicated easily in silicon, and this has been a late realization. Unchanging regulatory frameworks are the other reason healthcare has not been able to take advantage of available technology. It is time regulators realized that the failure or success of a pharmaceutical product has nothing to do with the p-value emanating from clinical trials data. Even manufacturing companies have moved away from this century-old and incorrect notion.

Life sciences and healthcare have to (finally) embrace personalized medicine. Cross-sectional statistics on population data are misleading and damaging to the health of humans. It is not the health of the population that healthcare needs to worry about, but rather the health of the individual. Mass manufacturing of single-dose drugs is as archaic as TV dinners, and static thresholds on blood pressure, sugar, cholesterol, and other such measurements are as obsolete as slide rules.

Healthcare, the most ancient of all industries, has been progressing slowly. If we are unable to break out of a constraining regulatory architecture and choking traditionalism, we will put the entire "population" at risk, and the share of GDP commanded by healthcare will continue to climb.

It is time for life sciences and healthcare to wake up and face technology.


(1) http://science.sciencemag.org/content/361/6405/864

Monday, July 30, 2018

Redefining Artificial Intelligence

Artificial Intelligence, the contemporary darling of technologists and investors, has thus far been largely focused on trivial consumer-oriented applications and robotics/automation. Constrained by conventional computing, AI has been bottled up in hype and confusing name-calling. What the AI enthusiasts do not seem to understand is that AI was never meant to be a technology that fakes what a human being appears to do externally; rather, it was supposed to replicate her thought processes internally. As the search giant demonstrates how its technology could fool a restaurant reservation system or play games, as the world's largest shipper of trinkets demonstrates how it could send you things faster, and as the purveyors of autonomous vehicles demonstrate how they could move people and goods without humans at the wheel, they need to understand one important thing: these technologies are not using AI; they are using smarter automation. They do not replicate human thought processes. They either fake what a human appears to do or simply automate mundane tasks. We have been doing this for over half a century, and, as everybody knows, every technology gets better over time. So, before claiming victory in AI land, these companies may need to think deeply about whether their nascent technologies could actually do something good.

However, there is a silver lining on the horizon that could move AI to real applications (1), including predicting and controlling the environment, designing materials for novel applications, and improving the health and happiness of humans and animals. AI has been tantalizingly "close" since the advent of computers. Imagination and media propelled it further than it could ever deliver. As with previous technology waves, many companies attempt(ed) to reduce this problem to its apparently deterministic components. This engineering view of AI is likely misguided, as real problems are driven fundamentally by dynamically connected uncertainties. These problems, in domains such as the environment, materials, and healthcare, require not only computing resources beyond what is currently available but also approaches far from statistical and mathematical "precision."

Less sexy areas of AI, such as enhancing business decisions, have attracted less interest thus far. Feeble attempts at "transforming" a large healthcare clinic using a "pizza-sized" box of technology that had apparently solved all the world's problems already seem to have failed. Organizations chasing technology to solve problems with AI may need to spend time understanding what they are trying to tackle before diving headfirst into "data lakes" and "algorithms." Real solutions exist at the intersection of domain knowledge, technology, and mathematics. Each of these is available in the public domain, but the combination of this unique expertise is not.

Humans, always excitable by triviality and technology, may need better skills to succeed in the emerging regime, driven by free and fake information and the transformation of this noise into better decisions. Those who do this first may hold the keys to redefining AI and the future of humanity. It is unlikely to be the companies you know and love, because they are focused on the status quo and next quarter's earnings.

(1) http://science.sciencemag.org/content/361/6400/342


Monday, June 18, 2018

ET deadline

As we approach the deadline for ET discovery, as proclaimed by the space agency, there appears to be a bit of panic. Discovering organic matter in Martian rocks (1) is not ET discovery, especially because there are plenty of abiotic explanations for it. Even if it were of biological origin, it proves nothing, as close-proximity panspermia is not particularly interesting. The question remains where the green women are hiding in this vast universe of ours. The answer is that they may not exist.

Statistics enthusiasts have always pointed to the fact that the universe contains hundreds of billions of galaxies and a billion trillion stars, which makes it virtually impossible for life not to exist elsewhere. This could be true, but a more interesting question is what the probability is of humans finding it. On this question, the chances appear bleak, for the laws of Physics confine humans to their dark corner of the universe even as they make toys to "explore" the heavens. It is almost as if the current crop of explorers is yet to understand the harsh space-time constraints imposed by the century-old theory.
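
The space-time constraint is easy to quantify. A back-of-the-envelope sketch with standard distances; the "listening window" for a technological civilization is my assumption, not the post's:

```python
# Round-trip signal times to a few landmarks, at the speed of light.
# Distances are standard figures; the civilization "window" is an assumption.
targets = {"Proxima Centauri": 4.2,            # light-years
           "center of the Milky Way": 26_000,
           "Andromeda galaxy": 2_500_000}
window_years = 10_000                          # assumed lifetime of a technological civilization

for name, dist_ly in targets.items():
    round_trip = 2 * dist_ly                   # years for a signal to go out and come back
    verdict = "within" if round_trip <= window_years else "beyond"
    print(f"{name}: {round_trip:,.0f} years round trip ({verdict} the assumed window)")
```

Even a single exchange with most of our own galaxy, let alone another one, would outlast any plausible window.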

Contemporary physicists are adept at proving that ever-elusive particles exist by mining "big data," but they are certainly incompetent at finding tangible proof for the puzzle that has vexed humanity ever since it looked up into the night sky: "Is there anybody out there?" The most logical answer appears to be an emphatic no, as an "N of 1" experiment proves nothing, in spite of the daunting statistical likelihood. Even if the rover finds worms and bacteria on the red planet, it does not mean that they are extraterrestrial, for two reasons. First, robust single-celled organisms have been hitching rides on Mars missions forever, and second, it could just mean that life originated there and then migrated to the blue planet. So this is not the ET the world has been waiting for. Further out near Saturn, the icy globes of Enceladus and Titan have tantalized ET enthusiasts forever. They appear to be giving up on them, as it could be too much work. Digging 6 meters into the Martian soil and finding a single-celled organism appears to be an easier way to put an end to the misery.

The space agency is on notice. They have to produce an ET in less than 500 days (as they promised several years ago).


(1) http://science.sciencemag.org/content/360/6393/1096

Saturday, June 2, 2018

Deeper learning

A recent article (1) demonstrating how neural networks could be used to approximate light scattering by nanoparticles points in an interesting new direction. We appear to be approaching a regime in which prescriptive analytical solutions and conventional simulation become inferior to deep learning. This is exciting, but it also presents a huge downside for the advancement of abstract knowledge. Models that show robust outcomes are welcome, but a generation of new scientists, prone to taking to the machine to prove hypotheses by feeding it small samples of historical data, could dampen theoretical advancement not only in Physics but also in other areas.
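
The mechanics of the approach in (1) are straightforward to sketch: train a network on input-output pairs from an expensive solver, then use the network as a fast surrogate. A minimal sketch, in which the "simulator" is an invented toy function, not the physics solver the paper uses:

```python
# Minimal surrogate-model sketch: fit a neural network to samples from a slow
# "simulator," then use the cheap network in its place. The toy_scattering
# function below is an invented stand-in, not the paper's physics.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

def toy_scattering(params):
    # Placeholder for an expensive solver: a smooth nonlinear response
    # to three nanoparticle design parameters.
    return (np.sin(params @ np.array([0.05, -0.08, 0.03]))
            + 0.5 * np.cos(0.02 * params.sum(axis=1)))

X = rng.uniform(30.0, 70.0, size=(5000, 3))   # e.g., shell thicknesses in nm
y = toy_scattering(X)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
surrogate = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0),
).fit(X_train, y_train)
print("held-out R^2:", surrogate.score(X_test, y_test))
```

Once trained, the surrogate answers in microseconds what the solver answers in minutes, which is the trade such papers exploit.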

This struggle between empiricism and rationalism has been with humans from inception. Did they survive by predicting where the lion is likely to be by using historical data of previous (bad) outcomes or did they rationalize by abstracting the expected behavior of the animal? Did they predict when an animal is likely to attack by using historical data on the timing of previous attacks or did they understand the animal's incentives and available alternatives? Did they migrate incrementally by using predictions, originating from previous short excursions, or did they go boldly where no woman had ever gone? Were our ancestors empiricists or rationalists?

It is difficult to ascertain one way or the other. It appears that empiricism has been a hidden attribute of our psyche for a long time. Until the advent of computers, rationalism appears to have dominated, but since then empiricism has been on a steep rise. In Physics, they now collect and stream data to find "new particles" without even asking why such observations are important. In medicine, they high-throughput screen, looking for the needle in the haystack, without a clear understanding of the mechanism of action. In economics, they regress data to find insights without asking whether they are insights at all.

There is likely no stopping the trend. As computers get more powerful, empiricism will become ever more dominant. If this is a natural outcome of evolution, then advanced societies elsewhere (if they do exist) would be asymptotically approaching pure empiricism for knowledge generation. That could be their Achilles' heel, as it also means that their knowledge is dependent on the past. A planet full of robots, with no ability to abstract but with an infinite capacity to learn from the past, could be highly inefficient.

Would humans retain the inefficient qualities of being human? It seems unlikely.


(1) http://advances.sciencemag.org/content/4/6/eaar4206.full

Saturday, April 21, 2018

Physics saves humanity


Recent news that a blood test could detect early-stage cancer with 65% accuracy (1) is promising. However, this is not a sensitivity level that makes such technology very useful; a test that catches 65% of cancers still misses roughly one in three. Life sciences and healthcare researchers have been suffering from segmented specialization, and domain experts in each sub-segment believe that they know everything. This has led to underutilization of technologies available from other industries, and to solutions that optimize within a narrow context. If the goal is to reach the best possible solution, it is advisable to get out of the labs, look across domains, and let some of the egos go.

Healthcare, a perennial laggard in the use of information technology for the prediction, diagnosis, and treatment of diseases, is falling further behind. As the engineers figure out autonomous cars and space tourism without breaking a sweat, life sciences and healthcare professionals, steeped in conventionalism, have been pretending that humans are indeed different from machines. Certainly, the policymakers in Washington appear closer to machines as they "retire" with lifetime healthcare benefits after robbing the same from 13 million Americans. And the most powerful one, after figuring out the 140-character idiot box, has been addicted to it just as a robot would be.

Machines are accelerating toward higher cognitive capabilities while the frail bodies of the declining species suffer from a lack of acceptance of change. Humans were immensely creative at inception. As they stood up in the African Savannah with a feeble architecture that was no match for the beasts that roamed it, they courageously exposed themselves to danger. They traveled to every corner of the blue planet on foot and created habitats in sync with the environment. They survived a narrow bottleneck of fewer than 15,000 individuals as the ice age advanced across the globe. And then the "modern woman" arrived, and that was bad news. Agriculture, the industrial revolution, and computer technology seem to have made them weaker. Grains made them diabetic, industries have been fuming poison into their greenhouse, and technology now appears to set them back.

The fundamental question remains whether life is indeed a result of Physics. Before the "God particle" and "gravitational waves," there were more fundamental concepts, such as entropy. If entropy has an unambiguous positive slope and, more importantly, if there is a universal objective function that maximizes entropy, life certainly fits. Life appears to be the most efficient of natural processes at accelerating entropy, and that points to the idea that the creation, maintenance, and eventual destruction of life are driven by physical processes. To reject this hypothesis, one has to prove that life has entropy-reducing effects. It does not appear so. The organization of life into structures, from bacteria to humans, appears to accelerate entropy. It is possible that one could mathematically show that the sizes of the colonies of life we observe are entropy-maximizing.
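
The argument can be stated compactly. A minimal formalization, where the second comparison is my paraphrase of the post's claim, not an established result:

```latex
\frac{dS_{\text{universe}}}{dt} \;\ge\; 0 \quad \text{(second law)},
\qquad
\left.\frac{dS}{dt}\right|_{\text{with life}} \;>\; \left.\frac{dS}{dt}\right|_{\text{abiotic}}
```

Rejecting the hypothesis would then require exhibiting a living system whose presence lowers the total entropy production of its surroundings over its lifetime.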

Physics may require life to survive as it may be the best way to maximize an overall objective function. Humans may be saved in spite of themselves, by Physics.


(1) http://www.sciencemag.org/news/2018/04/blood-test-shows-promise-spotting-early-cancers

Saturday, April 7, 2018

The bane of pharmaceutical R&D

A recent study (1) appears to raise red flags about pharmaceutical research, animal studies, and the contemporary scientific process in general. Perhaps it is new to the authors, but most of what they describe has been known to the community for decades. The following are important considerations in this debate. I state them without proof, but there is plenty of it out there:

(a) A very large percentage of published studies cannot be replicated
(b) Most published studies set out to prove something rather than to disprove it
(c) The quantity of publications (rather than their quality) is the most important metric most educational institutions use to reward academics
(d) Big pharma is run by outdated leaders who churn out incremental medicines to meet shareholder value targets
(e) The drug discovery and development processes are ably assisted by an incompetent regulatory agency with many conflicts of interest
(f) Hypothesis testing in life sciences still clings to a nearly 100-year-old idea that uncertainty is normally distributed. Most statisticians, encumbered by the agency's love for the "p-value," will not deviate from the framework. And in the process, they have approved bad drugs, rejected good ones, and failed to identify sub-populations who could benefit from the NCE (see the sketch after this list)
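
To make item (f) concrete: with a large enough trial, a clinically trivial average effect produces an arbitrarily impressive p-value. A sketch with invented numbers, not a claim about any specific trial:

```python
# Why a tiny p-value need not mean a useful drug: at large sample sizes,
# a negligible average benefit still comes out "statistically significant."
# Effect size, spread, and trial size here are all invented for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 50_000                                           # patients per arm
placebo = rng.normal(loc=0.00, scale=1.0, size=n)
drug    = rng.normal(loc=0.03, scale=1.0, size=n)    # 0.03 SD: clinically negligible

t_stat, p_value = stats.ttest_ind(drug, placebo)
print(f"p-value: {p_value:.1e}")                     # comfortably below 0.05
print(f"average benefit: {drug.mean() - placebo.mean():.3f} SD")
```

The test is working exactly as designed; the problem is reading "significant" as "beneficial."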

So, the authors’ contention that many animal studies are not published at all, albeit interesting, is just the tip of the iceberg. There is a much bigger problem to tackle. The leaders of life sciences companies and their regulators may want to consider retirement, say after 80, as they may need to yield to young leaders who have a higher appreciation of emerging technologies.

The correlation between animal studies that precede the clinic and what happens in humans has been incredibly low for almost a century. They have tried everything from mice and rabbits to dogs and chimps in an attempt to prove the unprovable. In the process, they reduced animal welfare while developing therapies that can only be called "bad." The finding that the therapeutic index of marketed drugs seems to decline over time is a warning signal that there are many inefficiencies in the R&D and approval processes.

Technology is advancing. We no longer have to stick to regression slide rules to prove or disprove whether a drug works. It is time the life sciences industry embraced the ideas that are transforming every other industry. To make that happen, it will require cleaning the shop and starting over.

Old ideas die hard and older ideas are even worse.

(1) http://www.sciencemag.org/news/2018/04/clinical-trials-may-be-based-flimsy-animal-data


Sunday, April 1, 2018

Man-made panspermia

Man-made panspermia is an increasing concern for humans as they struggle to understand their role in the universe. Harsh space-time constraints give them a very narrow view of their container, which could be a small bubble in a multiverse. And thus far, they have not heard or seen from anybody in their neighborhood, even with great efforts to do so. Calculations by a UCL cosmologist, who showed that if one scaled the known universe down to a model the size of London, the solar system would be about the size of an atom, may provide context for the irrelevance of our existence. Space explorations pursued by the budding species have been messy and may have already contaminated the very areas they use to estimate the probability of life elsewhere. It is ironic that in this "advanced technological age," our own space junk is showering down on us from the heavens.
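
The London analogy checks out at the order-of-magnitude level. A quick sketch with rough sizes that are my assumptions (observable-universe diameter, Neptune's orbit as the solar system's extent, London at roughly 20 km across):

```python
# Back-of-the-envelope check of the "solar system as an atom in London" model.
# All sizes are rough, order-of-magnitude assumptions.
universe_m     = 8.8e26   # diameter of the observable universe
solar_system_m = 9.0e12   # diameter of Neptune's orbit
london_m       = 2.0e4    # ~20 km across
atom_m         = 1.0e-10  # typical atomic diameter

scale = london_m / universe_m                       # shrink factor for the model
print(f"solar system in the model: {solar_system_m * scale:.1e} m")
print(f"an atom, for comparison:   {atom_m:.1e} m")
# Both land near 1e-10 m, so the analogy holds to within a factor of a few.
```
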
Physical exploration of close proximities to understand the origins and existence of life is symptomatic of the lack of development of the human psyche. At the turn of the last century, there were glimpses of intelligence, when science and philosophy came together to explore ideas without toys and data mining. With the advent of computers, the ability of humans to advance abstract ideas has been declining. Who wants to theorize if one can simply grab "big data" and prove any possible hypothesis? This trend is accelerating among clinicians and scientists as they turn to machines to prove what they want to prove. Physics, without significant theoretical advancement for over a century, has been solely focused on colliders and space telescopes, as if the ultimate frontier is data. As humans slip into a regime driven largely by incrementalism, technology, and data, it is worth looking back to an age when abstract thinking made fundamental positive changes.

Religion, the original science, provided a framework for thinking. Its originators were unbiased, with an objective function that encompassed the entire society. But, as with everything else, politics, business, and academics included, such pure abstract notions were hijacked for the benefit of a few. The practice of religion, as observed today, bears no resemblance to the original thinking; quite the opposite. Then science came along, but it shows similar attributes. Those who practice this modern religion optimize within very narrow contexts, with no real implications for society. What has saved humanity thus far, however, is the sheer quantity of good over bad, perhaps aided by Selection, which optimized outcomes over expected life spans. Humans appear to be drifting without any specific goals. Scientists and technologists are speeding down a highway that appears to lead nowhere. And the onlookers from the pedestrian corridors have succumbed to a lack of understanding of societal utility. They cling to unproven ideas and often have leaders who attempt to divide rather than unite.

In a divided world of haves and have-nots, the colored and the less colored, tall and short, wide and narrow, young and elderly, urban and suburban, sailors and climbers, musicians and mathematicians, we are all nestled in the space of an atom in a city the size of London. And there could be an infinite number of such cities.

Monday, February 26, 2018

Stop hiring "data scientists."

They have been riding high. The abandoned and somewhat less sexy field of Statistics has taken the business world by storm. Bottling old wine in new bottles certainly helped, and now both venture capitalists and operating companies may be heading for a hangover. Engineers and statisticians have always wanted to be scientists, and now they are crowned as such. There is a .ai company formed every 15 minutes by graduates of prestigious universities, and there are capitalists with sacks of money willing to entertain them. As we have seen before, this movie will likely end in tears for many.

Data is certainly a good thing, and applying "science" to it could also be good. But those who assert their "scientific credentials" based on regressions and neural nets should be aware that the slide rules they are using have been available for nearly half a century. Mathematics does not fade, but asserting that old ideas have suddenly sprung to life certainly shows the maturity and age of the emerging "scientists." Consulting firms have always been creative, and some of the most famous ones, who could hardly spell "data science" just a few years ago, are now pretending to be experts at it. Conferences are plenty, where the scientists meet their seekers and the vendors peddle their wares, almost like the bartering that was routine a few centuries ago. They show off tensors, cognitive networks, and even hardware in a pizza-sized box that has apparently solved all the world's problems already.

Stop hiring "data scientists." They are ordinary human beings with biases, and they could do your companies a lot of damage.

Monday, January 22, 2018

Personalized medicine

It appears that the completely archaic notion of mass-produced drugs for the average patient is about to change (1). Manufacturers paid lip service to personalized medicine for nearly a century, and it was clear that their hearts and business models were never in it. The Normal function may have done as much damage to humanity as nuclear weapons, for those who adhere to it blindly believe in averages and standard deviations based on a manufactured construct. The only redeeming quality of humans is that they are different and diverse. As the men in power separate the weak from the wealthy, the struggling from those who never struggled, the golfers from those who cannot afford a club, the academics from the practitioners, the atheists from the religious, the North from the South, and the West from the East, they miss an important point: every human on Earth is different, regardless of the visible features they exhibit or where they originate.

The design of clinical trials fails this basic notion. Pushing humans through protocols like cattle through a food-processing plant is not the best way to discover drugs. It is certainly the best way to reduce costs and to prove to the regulators that something important has been done. In the process, they have left large underserved populations in the lurch and pumped those who do take the medicine with suboptimal doses. Emerging technologies are immensely capable of figuring out who will benefit from a drug, who will not, and at what dose. It is time statisticians left the industry, as their contributions do more harm than good, not unlike the insurance industry clinging to actuarial tables.

Now, available technology can titrate every individual to the optimal dose, and we do not need "population statistics" to approve or disapprove drugs. If the regulators do not return to school to learn what has been happening, they will continue to make bad decisions.
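
Titration to an individual optimum is an ordinary feedback problem. A toy sketch, in which the patient-response model, the target, and the gain are all invented for illustration:

```python
# Toy individualized-titration loop: adjust each patient's dose until a
# biomarker hits its target, instead of prescribing the population average.
# The response model, target, and gain are invented for illustration.
def biomarker(dose, sensitivity):
    return 10.0 + sensitivity * dose      # hypothetical linear dose response

def titrate(sensitivity, target=50.0, dose=10.0, gain=0.5, rounds=20):
    for _ in range(rounds):
        error = target - biomarker(dose, sensitivity)
        dose += gain * error              # proportional feedback adjustment
    return dose

# Two "patients" with different sensitivities settle on different doses.
for s in (0.8, 2.0):
    print(f"sensitivity {s}: dose converges to ~{titrate(s):.1f}")
```

The point is not the controller, which is deliberately crude, but that the dose is a per-patient output rather than a per-population constant.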
(1) Philip J. Kitson et al., "Digitization of multistep organic synthesis in reactionware for on-demand pharmaceuticals."

Saturday, January 20, 2018

The dawn of non-invasive diagnostics

Recent news that a single blood test could provide the diagnosis of eight common cancers with 99% specificity (1) is a constant reminder that medicine is still stuck in archaic and invasive procedures to detect, diagnose, and treat ailments. With a high concentration of human resources in provider settings, medicine has been slow to embrace emerging technologies and ideas from outside the domain. And this attitude is shared across the healthcare value chain, including manufacturers, payers, and regulators.
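
Even a 99% specificity figure deserves a base-rate check: what a screened patient cares about is the probability that a positive result is real. A quick Bayes sketch, where the prevalence and sensitivity are my assumptions for illustration; only the specificity comes from the report:

```python
# Positive predictive value of a screening test under low prevalence.
# Only the 99% specificity is from the report; the rest are assumptions.
prevalence  = 0.01   # assumed fraction of screened people who have cancer
sensitivity = 0.70   # assumed true-positive rate
specificity = 0.99   # reported

p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
ppv = sensitivity * prevalence / p_positive
print(f"P(cancer | positive test) = {ppv:.0%}")   # about 41% under these assumptions
```

Even so, moving that number with a blood draw instead of a biopsy is exactly the kind of progress the post is asking for.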
It is unfortunate. Granted, Biology remains the arena in which humans have not been able to progress exponentially. Their brains, with millions of years of deterministic training, are well specialized to dominate engineering and chemistry. However, they have not been able to understand, with any level of precision, the marvelous machines assembled by nature, from single-celled organisms to somewhat more complex humans. Nature has had time to perfect designs of such beauty, and humans, ever curious, have been trying to walk up to the cup of knowledge. But it has not been easy. Fossils indicate attempts at brain surgery many thousands of years ago, and despite far greater structural knowledge, we have not advanced to a meaningfully different plateau. In most simpler fields, we have demonstrably shown that humans are the weak links in decision processes, from transportation, energy, and manufacturing to even finance.
It is a conundrum. We are stuck: great strides in the deterministic sciences do not translate into domains of high uncertainty and diversity. And those who practice in these complex domains seem to have their blindfolds on, as if they have nothing more to learn. Diagnostics could provide the impetus to move higher; serum and stool harbor so much information content that it is a shame we have not figured out how to use it.

(1) Joshua D. Cohen et al., "Detection and localization of surgically resectable cancers with a multi-analyte blood test."

Monday, January 15, 2018

Broad learning

Deep learning has been in vogue. Combining ideas from the 1960s with an insane amount of computing power, the search giant and others have been learning deep - mind and all. This is good news: gentle tricks on established mathematics seem to have reduced overfitting and accelerated "learning." But technologies built on unlimited resources and computing power tend to be lazy, and deep learning seems to have all the characteristics. Some even call it "Artificial Intelligence," even though there is nothing artificial or intelligent about it.
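
One of those "gentle tricks" is dropout, which randomly silences units during training so the network cannot lean on any single pathway. A minimal sketch, with layer sizes and the dropout rate chosen arbitrarily:

```python
# Minimal dropout sketch: hidden units are randomly zeroed during training,
# which curbs overfitting by forcing redundant representations.
# Layer sizes and the dropout probability are arbitrary choices.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(100, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),    # each hidden activation is dropped with probability 0.5
    nn.Linear(64, 10),
)

x = torch.randn(8, 100)
model.train()             # dropout active: stochastic, regularized forward passes
print(model(x).shape)     # torch.Size([8, 10])
model.eval()              # dropout disabled for inference
print(model(x).shape)
```

Nothing here is exotic; the regularization is a single line on top of long-established mathematics.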
Humans have been fascinated by their brains forever. They have searched for the mind and soul in the few pounds of messy grey matter they carry on their shoulders and found nothing. When the computer scientists arrived, claiming they could create "General Artificial Intelligence" by assembling dumb silicon and using dumber games, their age showed why wisdom is not that easy to attain, Ph.D. or not. The search giant has been on the prowl, picking up anything that ends in .ai for a premium, and as the greatest technologist of all time, who invented the electric car and electrified space travel, proclaimed that only he knew what AI was all about, we seem to have arrived at ego-driven emptiness.

Get used to it. Nobody is intelligent enough to create "general artificial intelligence." Those who harbor higher-than-average brain cells have headed in the opposite direction, proclaiming that knowledge results from understanding, not from modeling ideas. Therein lies the conundrum: as technologists rise without human contact and attempt to travel to Mars, there appears to be a great vacuum between knowledge and know-how. The former is the domain of philosophers and the latter of engineers, and it is important to distinguish between the two.

It is time to look forward and abolish ego-driven behavior. Those who are prone to it should be told that they are no better than the worst of humans.

Monday, January 8, 2018

The end of statistics

For nearly a hundred years, every field - life sciences, manufacturing, high-energy physics, economics, healthcare, and others - has relied on basic statistics and a rather crude assumption that everything follows the Normal function. There is nothing wrong with the assumption, but in a regime that works on the tails, the observation that something works for the population has little practical value.

In life sciences, they have been inventing mediocre therapies for over a century, as the clinicians, their regulators, and the aiding statisticians have been enamored of the mighty "p-value." They have been striving to prove that incremental average benefits delivered to a large population are a lot better than life-saving therapies for a few. In manufacturing, they have been optimizing with constraints in an attempt to save nickels and dimes; lean, mean, and mighty, their determinism has led to incorrect decisions in the presence of uncertainty. In healthcare, they have been waiting for protocols to change based on simplistic observations of small samples, while half the healthcare costs in the world can be attributed to a handful of related disease states. In physics, stuffed with engineers, they have been deploying heavy steel to find particles and hear waves, based on basic statistical notions; even then, they will be the first to admit that they do not yet know 94% of it. In economics, they have been inventing theories based on regression, and even winning Nobel prizes, but it is unclear if they are creating insights. Some of them even ventured into making money, and some failed spectacularly, as their own theories would have predicted.

Overall, if one can write down an equation for a process, it is symptomatic of the fact that she has not understood it. The practitioners who cling to the past are being rendered less effective in the presence of those who look forward.
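
"Works on the tails" is easy to make concrete: a Normal model and a heavy-tailed model can agree near the mean and still disagree by orders of magnitude about rare events. A sketch, with a Student-t chosen arbitrarily as the heavy-tailed stand-in:

```python
# Tail probabilities under a Normal model versus a heavy-tailed model.
# The Student-t with 3 degrees of freedom is an arbitrary stand-in.
from scipy import stats

for k in (2, 4, 6):                        # thresholds, in each model's own units
    p_normal = stats.norm.sf(k)            # P(X > k) under the Normal model
    p_heavy  = stats.t.sf(k, df=3)         # P(X > k) under the heavy-tailed model
    print(f"beyond {k}: normal {p_normal:.1e}, heavy-tailed {p_heavy:.1e}, "
          f"ratio {p_heavy / p_normal:,.0f}x")
```

Decisions priced off the Normal tail are wrong by a factor of millions exactly where it matters most.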
A generation seems to have wasted its time adhering to basic principles laid out a century ago. Lately, statistics has been made sexier by better naming; it is now called "Machine Learning." One has to admit it does sound a lot better, but has anything changed? In a world full of practicing scientists who have been trained to make equations for everything, we are approaching a significant discontinuity. Machines are certainly marching forward, not because they know statistics but because they do not. Such is the state of affairs that a systematic education delivered by the greatest institutions in the world prepares the next generation to fail with high certainty. Meanwhile, machines can see, hear, and make decisions in the presence of uncertainty. As we hunt for fossils to establish our own identity in a process that seems to have taken a long time, machines, with no emotions and even less historical baggage, rise. Are humans being rendered irrelevant? As the greatest living physicist warns of ETs, as the world's richest and most powerful worry about AI, and as the most powerful man on Earth worries about whether his hair is falling straight, we have arrived at the precipice of a great discontinuity.

As they moved out of their homeland in Africa, humans must have made important calculations based on uncertainty. As they descended from the trees into the African Savannah a few million years earlier, they knew the regime was shifting. With dangers all around them, mighty beasts that could maul them in a single swipe, they made decisions under uncertainty. Their initial journeys into the Middle East and South Asia, closely followed by those who went a bit north, seem to have provided a level of safety. They advanced culture and boredom, the latter most important for the development of the human psyche. As the caves in Southern France prove, they could rise above determinism and engineering very early in their progression.

The regime is shifting again, and the opponents are not as gentle as the Neanderthals. Machines are brutal, and they are immensely capable. Humans, the victors of past conflicts, are starting from a position of great disadvantage because of their education in the past. The end of statistics, a figment of the imagination of the most recent generation, is very near.