Scientific Sense Podcast

Tuesday, September 19, 2017

Lazy deep learning

The neural net, an old technology recently improved by marginal tricks for faster and more stable learning, is now the most expensive old wine in a new bottle, appropriately called "deep learning." Technologists have been getting more creative, and now they believe they are going to take over the world. It is possible, but unlikely. Scientists, who have been struggling with age-old statistics, suddenly find a way to throw large amounts of raw data at the dumb machine to make pattern finding easier. We could now chase fundamental particles, pharmaceutical products, and weather forecasts by chasing noise. Now one could automate it, with insights rising to the top just as butter does as one churns spoiled milk.
Lack of experience is problematic for politicians, management consultants, investment bankers, entrepreneurs, scientists and even bureaucrats. It takes a while to be good at what one does. Machines are certainly great, but as a large mainframe maker found out recently down South, not even Sherlock, let alone Watson, can solve all the world's problems. And with the largest and smallest analytics companies in the world clamoring for glory by the application of technology and "artificial intelligence," some to save humanity and others to stuff their own pockets, we are fast approaching a highly bifurcated regime. As they seek "deep mind," much deeper questions remain, and that is not something the millennials appear to be interested in.
To survive, preserve the human mind, compassion and an incessant yearning for knowledge. Sometimes, it is better to take a break from "learning programmatically."

Friday, September 15, 2017

Goodbye Cassini

A fateful plunge into the heart of Saturn was how the faithful probe, Cassini, ended its own life. Saturn and its moons have stirred the human imagination forever. Cassini's end, though, was carefully orchestrated, as the agency feared biological contamination of Saturn's moons if the probe were to collide with any of them unexpectedly. The fact that the agency fears microbes could still exist on the probe after two decades in the outer stretches of the solar system is telling. Sterilization of space projectiles has not been effective, and it is very likely that humans have already spread robust microbes on both the Moon and Mars. As the window draws near to 2020, the "drop dead" timeline for finding aliens, this could be a profitable way to accomplish it. But if they find microbes elsewhere that look remarkably terrestrial, caution could be in order, as the "explorers" have not been tremendously careful.

Green women have been curiously absent, although stories of abduction and alien craft crashes have been plentiful. The fact that some think an alien will conquer the space-time constraint to reach the most irrelevant speck in the Milky Way, just to be astonished by human biology, is symptomatic of the limitations of the species. For fifty thousand years they have killed and pillaged their neighbors, and now they would like to explore nearby planets and pretend to be sage. Such explorations have led to little increase in knowledge and have likely distracted the theorists from imagination. The domination of engineering in astronomy has been costly, as there has not been any advancement in the fundamental understanding of the universe in nearly a century. The unusual men and women at the turn of the last century who made a leap into the knowledge sphere have not been replicated. Mathematical noise has assured that the younger generation will be lost in partial differential equations and "quantum uncertainty" forever.

Goodbye, Cassini, an engineering marvel, but one unlikely to advance knowledge in any dimension. If it does not shower microbes on pristine environments, that is a bonus.

Thursday, August 31, 2017

More dangerous than politicians

NASA has revealed that a 3-mile-wide asteroid will pass by the Earth on Friday. The space agency may be a bit cavalier about how much of a threat such a large body poses, but more likely it has concluded that the technology simply does not exist to save humanity from calamity. As philosophers have argued in the past, there is no point worrying about something one can't do anything about. If the "representatives" and policymakers do not pose enough of a threat to the population, by their stupidity and ego, there are "huge" objects flying past the blue planet, a sitting duck in an active shooting gallery.

The dinosaurs had no choice. After an impressive period of many millions of years of domination, they simply vanished in the blink of an eye. Their physical infrastructure was more robust than that of the mammals that followed and better able to weather a catastrophe. However, with size came the need for higher energy consumption, and in a regime of low energy availability, survival was not an option. The later incarnation of the mammals has also been endowed with an energy hog, an organ they carry on their shoulders. But more importantly, they have a tendency to stop thinking and kill each other at the first sign of trouble. So humans have little chance of survival, much less than the dinosaurs, if an asteroid heads in this direction. It is unlikely that anyone will find a human genome a few million years from now if that were to occur. We at least have birds to remind us of the previous domination.

Engineering advancements in the last century were focused on tactics (buildings, transportation, chemicals, and power), attributes that incrementally improve the greenhouse humans have been afforded. In the process, they have been trying to burn off the critical molecule they need to breathe and live. If there is a definition of stupidity, one will find it here, in the present. However, it is important to remember that even that fades in comparison to their inability to create technology that could stop their complete elimination.

Politicians are certainly dangerous, but there is something more dangerous out there, and humans are living on borrowed time.

Thursday, August 24, 2017

The discontinuity

It could be here. A resolution of the disconnect of the emerging against the status quo, the pristine against the established, the young against the old, the uncultured against the culturally sophisticated, the academic against the politically astute, the color blind against the racists, the healthy against the wealthy, the markets against the regulators, the agnostics against the atheists, the empathetic against the apathetic, the travelers against those who stay put, the globalists against the localizers, the peace lovers against the war mongers, the optimistic against the pessimistic, the thinkers against the feelers, the scientific against the religious, the capitalists against the communists, the future against the past, the good against the evil, the people against those who hold power, and the machine against the human. In the short history of Homo sapiens, such reversals have been rare, if they have occurred at all. And now, it is turning upside down in front of a singular generation as the technologists advance the ability to replicate the basic functions of the human brain. For thirty years, many have been on the prowl, but now they may be closing in on the algorithm that makes pattern finding practical.

Some caution may be apt. Pattern finding is certainly an important cognitive function, driven primarily by data. It has served humanity well from inception, and the seven billion that inhabit the Earth today are well endowed with these capabilities due to selection. However, pattern finding would not have led humanity to advance art, language, science, and psyche. These are the features that made them human, and mechanical replication of their basic brain functions is not "artificial intelligence" by any stretch of the imagination. Intelligence is not artificial; the few cases on display, from Newton to Einstein, show characteristics that are not mechanistic. They were driven by dreams, visions and imagination, signs of a misbehaving brain seeking an escape route from boredom.

Until the technologists figure out a way for a computer to experience intellectual boredom, we are nowhere close to "artificial intelligence."