40,000 fMRI studies to throw away!
A research team recently revealed what they called a "cluster failure".
The conclusions of most functional MRI studies may be erroneous because of their statistical basis: widely used cluster inference techniques are actually pretty bad at properly inferring clusters, due to "spatial autocorrelation functions that do not follow the assumed Gaussian shape".
This means that the "most common software packages for fMRI analysis (SPM, FSL, AFNI) can result in false-positive rates of up to 70%", suggesting that some results were so inaccurate they could indicate brain activity that does not exist at all! What is interesting is that neuroscientists interpret what the statistical software tells them rather than the images themselves.
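The failure mode can be sketched with a toy Monte Carlo experiment (a deliberately simplified 1-D sketch of my own, not the team's actual method): calibrate a cluster-extent threshold on null data whose spatial autocorrelation is short, then apply it to null data whose autocorrelation is much heavier, and watch the false-positive rate blow past the nominal 5%.

```python
import random

random.seed(0)

def smooth_null(length, window):
    """One null 'brain line': white noise passed through a moving average
    of the given window, rescaled back to unit variance."""
    noise = [random.gauss(0.0, 1.0) for _ in range(length + window - 1)]
    prefix = [0.0]
    for x in noise:
        prefix.append(prefix[-1] + x)
    scale = 1.0 / window ** 0.5  # the sum of w iid N(0,1) values has sd sqrt(w)
    return [(prefix[i + window] - prefix[i]) * scale for i in range(length)]

def max_cluster(signal, z):
    """Largest run of consecutive voxels above the cluster-forming threshold z."""
    best = run = 0
    for v in signal:
        run = run + 1 if v > z else 0
        best = max(best, run)
    return best

def null_max_clusters(n_sims, length, window, z):
    return [max_cluster(smooth_null(length, window), z) for _ in range(n_sims)]

Z, L, N = 2.3, 400, 500
# Calibrate the cluster-extent threshold under the assumed (short) smoothness.
assumed = sorted(null_max_clusters(N, L, window=5, z=Z))
k = assumed[int(0.95 * N)]  # nominal 5% familywise threshold
# Apply it to null data whose autocorrelation is actually much heavier.
fpr = sum(m >= k for m in null_max_clusters(N, L, window=25, z=Z)) / N
print(f"cluster threshold k = {k}, empirical false-positive rate = {fpr:.2f}")
```

With a heavier-than-assumed autocorrelation, pure noise triggers "significant" clusters far more often than the nominal 5%, which is the qualitative effect the study reported (the real analysis, of course, used 3-D resting-state data, not this toy).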
These findings speak to the need to validate the statistical methods used in the field of neuroimaging.
As a consequence, 15 years of research on brain function could be invalidated!
And the issue is not limited to research: it also extends to clinical use, which is far more worrying.
Image credits: fMRI by OpenStax, from the textbook OpenStax Anatomy and Physiology, published May 18, 2016
Don’t rely (solely) on your external provider
Something incredible occurred last week in France…
Most mass-media companies in France lost their subscriber databases in a matter of seconds.
In fact, the main subscription-management provider lost all of its data, from production data to source code and backups. The cause remains to be determined.
The event highlights a strong need for companies to keep control over their data. More and more companies put their data in the cloud and outsource its management. Yet (client) data is a company's single biggest asset. Since data is now a form of capital, highly strategic and monetizable, companies must become aware that they need to keep control over it.
Data loss occurs so easily, and data breaches are so abundant, that relying exclusively on a few providers without keeping control over your data puts your company at risk. Any company should retain dedicated teams of specialists when outsourcing or moving critical systems to the cloud. I have heard of companies that had to pay huge sums to keep operating after another company's service failure, or after a conflict with it.
Good agreements (especially service-level agreements) also require intense negotiation, and only after broad consultation involving not just business decision makers but also IT decision makers… and data specialists. By the way, remember that every big company needs true data specialists (data managers, data scientists, data analysts, …) led by a chief data officer if it wants to conduct efficient and effective data projects and run a data-driven business.
Think about it next time.
Picture credits: Newspapers B&W by Jon S
First mammal species wiped out by global warming
I recently heard some very bad news.
A mammal species (the Bramble Cay melomys, Melomys rubicola), discovered in 1845 and the only mammal endemic to the Great Barrier Reef, has been wiped out… by human-induced climate change.
There were an estimated several hundred individuals on Bramble Cay, a small uninhabited island that belongs to the Torres Strait Islands, a group of more than 274 small islands located in the waterway separating far northern continental Australia's Cape York Peninsula from the island of New Guinea.
The species lived on a 3.62-hectare (8.9-acre) sand cay, predominantly grassland, populated by seabirds and green turtles.
It had not been seen since 2007, despite a search by a team of scientists. In 2014, a report recommended changing the animal's status from "endangered" to "extinct". An extensive search was then conducted, without success.
In their report, Natalie Waller and Luke Leung of the University of Queensland recently concluded that the root cause of the extinction was sea-level rise and extreme climate events.
"According to our predictions, 10,000 islands will be under water by the end of the century."
– Franck Courchamp, CNRS senior researcher
Picture credits: Bramble Cay melomys by State of Queensland
Our renewable future: favoring the next disruptive technologies
We all agree on the need to switch to renewable energy sources: it will ensure energy security and help mitigate climate change.
But one question remains: which power source is the best option? There is no single good answer, because it really depends on several factors, and even experts don't share a common view.
The main factor to consider is certainly “where are you?”:
Some parts of the U.S. are windier than others, some are sunnier, and some have better access to hydroelectricity or geothermal resources…. You get the point. – Kate Gordon
Distributed energy infrastructure requires the use of disruptive technologies that are able to locally enhance efficiency and energy storage.
The blockchain revolution
We are reaching the point where we will be required to pool resources: not only to bring power to the developing world in a shared effort, but also to optimize energy distribution and allow energy independence, both for some legal entities at the top level and for cities or districts at the lowest level.
Besides, the next electricity infrastructure will be far more resilient and optimized in terms of energy consumption. But the quest for energy optimization raises privacy concerns. Smart meters are considered a threat to privacy: GCHQ was forced to intervene because of their insecure design, and they are also pretty talkative; it is even possible to tell what your favorite TV shows are.
Now, I'd like to say a few words about something new that has the potential to revolutionize the world.
I am convinced that blockchain is crucial for developing efficient, locally distributed networks (see an initiative here). However, blockchain currently comes at a cost: it consumes a lot of power. But this could be tackled by well-optimized A.I. algorithms running on well-suited chips.
Beyond the fact that research on every renewable energy source should be supported to ensure resilience and adequate distribution, we have to develop better, disruptive technologies.
Let me present some initiatives that look promising.
Towards more efficient solar panels
First, Canadians have developed far more efficient solar panels by pairing concentrators with high-efficiency PV cells.
Second, Chinese researchers have developed solar panels that work 24/7 (at least when it rains).
These two technologies are destined for a bright future, especially since PV prices will keep dropping.
Hydropower and wave power… are competitive sources
Harnessing the kinetic energy of marine currents looks promising, since some ocean currents are strong.
A 2006 report from the US Department of the Interior estimates that capturing just 1/1,000th of the available energy from the Gulf Stream would supply Florida with 35% of its electrical needs. – Wikipedia
EDF estimates that marine current power could generate up to 12.5 GW in Europe, equivalent to the output of about 14 nuclear reactors.
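As a quick sanity check on that comparison (taking a typical French 900 MW-class reactor as the unit, which is my assumption, not EDF's):

```python
marine_current_gw = 12.5   # EDF's estimate for marine current power in Europe
reactor_gw = 0.9           # assumed output of one 900 MW-class reactor
print(round(marine_current_gw / reactor_gw))  # → 14
```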
There are innovative open-flow devices, like this one, where the central rotor is the only moving part:
Credits: france info
It was invented by EDF and DCNS in France.
Some initiatives aim both to limit the impact on animals, sediments and plants, and to keep working even in weak currents, such as open-flow, fish-friendly turbines in rivers.
I find this other one really interesting: it converts wave power, is inspired by eels (an example of biomimetics), and is being tested in cooperation with Ifremer… Its energy efficiency is really high and it works in weak currents.
Credits: EEL energy
A better future?
As you can see, there are plenty of disruptive technologies that can help us. Such technologies are more than necessary: they are in the public interest. That is why both their R&D and their deployment must be encouraged.
However, technology is only one means among many. We also need global awareness of the effects our habits have, and public policies have an essential role to play.
Your brain does process information but it is not a computer
I recently came across an article claiming that your brain "does not process information, retrieve knowledge or store memories" and cannot be viewed as a computer.
I personally think this assertion is fundamentally wrong as far as information processing is concerned.
The computer metaphor is still a valid one
First, I'd like to show you that the computer metaphor can't be discarded so easily.
Of course our brain is not literally a computer: it is embodied and cannot be considered an autonomous system.
Credits: Exercise Plays Vital Role Maintaining Brain Health, by a health blog
An embodied system…
“Many features of cognition are embodied in that they are deeply dependent upon characteristics of the physical body of an agent” – RA Wilson and L Foglia, Embodied Cognition in The Stanford Encyclopedia of Philosophy
A downloaded brain, if such a thing were possible, would probably not be able to function without a body equipped with human-like sensors and effectors, since a lot of brain activity is dedicated to monitoring sensory inputs and to regulating and interacting with the environment.
A system which is itself made up of multiple systems
Rather, the brain should be considered as multiple systems that interact as a whole. Each is specialized and communicates with the others through different means. The basic elements of those systems transmit signals at different speeds, through different pathways: electrical signaling, chemical signaling (neurohormones)… and even via large diffuse systems (glia and the immune system).
A limited metaphor…
To put it in a nutshell,
“Humans, along with other organisms with brains, differ from computers because they are driven by emotions and motivations. The brain is much too hot and wet to be represented by a computer. The brain is electrical, but it is also driven by fluids (blood) and chemicals (hormones and neurotransmitters). Most importantly, the brain is part of a body which it drives to action, and research from an embodiment perspective also shows that the whole body (not just the brain) affects emotion, motivation, and other psychological processes.”
So what is the adequate metaphor?
Eddie Harmon-Jones, Ph.D., Professor of Psychology at Texas A&M University, suggested:
“So, how can we replace the computer metaphor with a metaphor that more accurately represents the brain of an emotion-driven, motivated organism such as a human? I like the metaphor of a car.
A car may have a computer on board, and may be able to process information. But it is driven by fluids (gasoline, oil, etc.). It is both electrical and mechanical, and it can move.”
Credits: Renault Scénic Front Cut by Sovxx
Even though computers can be highly complex thanks to the most recent advances (VLSI, for instance, or simply supercomputers), the car metaphor is far more comprehensible.
We should also consider that computers have no significance except from a human perspective. By contrast, all beings that have a brain exist by themselves.
But a useful metaphor
The metaphor of the brain as a computer has helped scientists gain a better understanding of how our brain functions. No more, no less. Just like past and current metaphors of the brain.
Cognitivism (a top-down approach), connectionism (a bottom-up approach) and embodied cognition all succeed in explaining or modeling some aspects of our cognition. In neuroscience, several approaches contribute to explaining how the neural substrate functions.
The working memory model, as well as the second revolution in linguistics, etc., were all useful for gaining better knowledge: knowledge that helps evaluate an individual's cognitive abilities, model functions and, to some extent, understand how the brain works.
These theories are still useful for natural language processing, cognitive remediation, and so on. We didn't discard Newtonian theory when Einstein published his general relativity; the same goes for the computer metaphor.
The metaphor is helpful for exploring the brain and exploiting its properties, but it is not necessarily the ultimate metaphor. Nobody knows what the future has in store…
The brain processes information
The brain cannot recall in detail a picture you have seen thousands of times (a banknote, for instance). That does not mean it doesn't store information.
As far as we know, the brain stores information through synaptic plasticity (1, 2), i.e. connectivity changes, and through brain network topology. The information is scattered across the brain and can be unlearned or mixed with new pieces of information.
Exactly what we store is still debated, but it is synthesized and segmented information.
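The idea that memories live in connectivity changes, scattered across a network, can be illustrated with a classic toy model, a Hopfield network with Hebbian weights (a textbook sketch, not a claim about real synapses): one pattern is stored purely as pairwise connection strengths, and can still be recovered from a degraded cue.

```python
import random

random.seed(2)
N = 64
pattern = [random.choice([-1, 1]) for _ in range(N)]

# Hebbian storage: strengthen connections between co-active units.
# The "memory" lives only in this weight matrix, spread over all pairs.
weights = [[pattern[i] * pattern[j] if i != j else 0 for j in range(N)]
           for i in range(N)]

# Corrupt a quarter of the pattern, then let the network settle.
cue = list(pattern)
for i in random.sample(range(N), N // 4):
    cue[i] *= -1

for _ in range(5):  # a few synchronous update sweeps
    cue = [1 if sum(w * s for w, s in zip(row, cue)) >= 0 else -1
           for row in weights]

print(cue == pattern)  # the degraded cue is completed back to the stored memory
```

Note that no single weight holds the memory: storage is distributed, and degrading some weights only degrades recall gradually, much like the "scattered" storage described above.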
Actually, our brain does process information:
“Information is what is conveyed or represented by a particular arrangement or sequence of things” – Oxford dictionary
Which is exactly what the brain does.
To go further in this explanation:
“Information is an abstract concept. There is a temptation to think of information as fully representable by the bits used in a digital computer. However with signals, the timing of the signal makes a difference. When precisely a signal arrives at a neuron carries information about what the signal means, and there is increasing evidence that the relative timing among neural signals carries information as well. The brain accomplishes information processing using signal processing, but there is more going on than the phrase “information processing” alone would suggest. […] All signal processing is carried out by the spontaneous “random” interactions of molecular collisions, which are loosely guided by the continuously hierarchical structural form of the brain.” – Paul King, Computational Neuroscientist, Data Scientist, Technology Entrepreneur
What is your opinion on the question?
Weird black holes may reveal new physics
A new discovery (still to be confirmed) suggests that some large regions of the universe were once rotating.
Astronomers in South Africa discovered a mysterious alignment of black holes (which are also spinning synchronously):
They were observing the ELAIS-N1 region of the sky with the Giant Metrewave Radio Telescope (GMRT) when they unexpectedly discovered the strange orientation of all the black holes in the region.
The only explanation for now is that this region of space, spanning hundreds of millions of light-years, was rotating in the early stages of galaxy formation.
Researchers speculate about what caused the phenomenon: primordial cosmological magnetic fields, or new physics involving cosmic strings or axion fields. But no one knows.
Some people even suggest it could be aliens harvesting energy from black holes, or synchronizing them as a kind of GPS positioning system. Highly speculative!
The only certainty is that nobody was able to predict these observations.
A PM simply explains quantum computing
I'd like to share with you the following video, which is extraordinary:
For your information, he's the Canadian PM.
Do we live in a discrete world?
Today, I will share with you some recent thoughts I had.
Do you remember analog chips (a.k.a. linear integrated circuits)? These circuits are hardly used any more, having been replaced by discrete-processing, digital chips. Some analog signal processing remains in use in vacuum tubes and, thanks to recent improvements, may soon be integrated into current devices.
To sum up, we have the following kinds of computers: analog, digital and quantum.
The ubiquitous type is the digital computer (even your smartwatch is basically a computer). Analog computers have almost disappeared (they are still used in aircraft) but may become more prominent thanks to interest in very-large-scale integration. Quantum computers are more theoretical than common.
I have already mentioned that our universe's behaviour may be mathematical or quantum, perhaps both.
Nick Bostrom even suggested that we are most likely living in a computer simulation. Some go further, hypothesizing in a thought experiment that some of us may be p-zombies.
The real nature of our universe is a prolific question in physics, philosophy, metaphysics, spirituality and even cognitive science, since we rely extensively on our senses and on the perception of a physical world (a perception that is easily tricked, even at higher levels of integration).
If the world is quantum, then it can be reduced to a quantum computational system. If it is analog, it can be reduced to an analog computational system. The two are not mutually exclusive.
What if our universe were discrete?
The discrete option is really interesting in that it supports the metaphor of a computer and, ultimately, of a simulation. This is theorized in digital physics.
One can notice that if the universe processes information, then it can also generate and process knowledge. We know that information and knowledge spread through societies via well-known social processes and network dynamics. Directed, labelled graphs can organize knowledge, just like ontologies do in information science. I find the analogy quite interesting, since both societies and ontology-based systems in computer science can generate knowledge.
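As a toy illustration of that analogy (the triples and the transitivity rule here are my own example, not drawn from a specific ontology standard): a labelled, directed graph of facts can "generate knowledge" by materializing what its edges imply.

```python
# A tiny knowledge graph: labelled, directed edges (subject, relation, object).
triples = {
    ("Paris", "located_in", "France"),
    ("France", "located_in", "Europe"),
    ("Berlin", "located_in", "Germany"),
    ("Germany", "located_in", "Europe"),
}

def infer_transitive(triples, relation):
    """Derive new facts by closing a transitive relation, the way an
    ontology reasoner materializes implied knowledge."""
    facts = set(triples)
    changed = True
    while changed:
        changed = False
        for (a, r1, b) in list(facts):
            for (c, r2, d) in list(facts):
                if r1 == r2 == relation and b == c and (a, r1, d) not in facts:
                    facts.add((a, r1, d))
                    changed = True
    return facts - triples

new_facts = infer_transitive(triples, "located_in")
print(sorted(new_facts))  # facts nobody wrote down, implied by the graph
```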
In my humble opinion, this tends to extend the computer metaphor from the universe to societies: the same process underpins the quantum level, the cognitive level and the social level. It explains why discrete graph models succeed in explaining information processing at some of these levels (1, 2). Information processing would be the essential nature of the universe, and knowledge discovery (perhaps?) a goal for us. Enaction and embodiment have taught us that this appropriation of knowledge is not necessarily academic but can also be achieved through everyday life or manual work.
It also reminds me of Plato's theory of Forms (and its limitations), and of the question of how to access ultimate reality.
"We come here to a difficulty which has troubled many philosophic theologians. Only the contingent world, the world in space and time, can have been created; but this is the every-day world which has been condemned as illusory and also bad. Therefore the Creator, it would seem, created only illusion and evil. Some Gnostics were so consistent as to adopt this view; but in Plato the difficulty is still below the surface, and he seems, in the Republic, to have never become aware of it." – Bertrand Russell, philosopher (sorry for the mention of a Creator; you can replace it with any concept that suits your beliefs).
And what is the intermediate level? The human being. As you may already know, fractals are everywhere in our universe, and information processing is everywhere. So it sounds interesting to explore the possibility that the same processes can be observed at all levels, from atoms to societies.
The point is that if our universe is discrete, then each level is discrete too.
Societies are discrete in the sense that information processing and knowledge spreading occur in discrete temporal steps, and the processing speed keeps increasing in our connected society.
Our brain is also discrete (a counterintuitive idea): from the triggering of postsynaptic potentials to the time slices of perception we recently discovered.
What about the quantum level?
Humanity might never be able to prove with certainty whether the universe is simulated, Chalmers said.
“You’re not going to get proof that we’re not in a simulation, because any evidence that we get could be simulated.” – David Chalmers, Professor of Philosophy and Director of the Centre for Consciousness at the Australian National University
But we may have hints of a simulation, since error-correcting codes seem to appear in the universe's equations, just as they do in computer science.
“Error-correcting codes are what make browsers work, so why were they in the equations that I was studying about quarks, and leptons, and supersymmetry? […] That’s what brought me to this very stark realization that I could no longer say that people like Max [Tegmark] are crazy.” – James Gates, a physicist at the University of Maryland
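The kind of code Gates is alluding to can be made concrete with the textbook Hamming(7,4) code, the sort of error correction used throughout computing (a standard illustration, not the specific codes he found in the supersymmetry equations):

```python
def hamming74_encode(d1, d2, d3, d4):
    """Encode 4 data bits into a 7-bit codeword; parity bits sit at
    positions 1, 2 and 4 (1-based), each covering half the positions."""
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """Recompute the parities; the syndrome spells out the 1-based index
    of the corrupted bit (0 means the word is clean). Flip it back."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:
        c[syndrome - 1] ^= 1
    return c

word = hamming74_encode(1, 0, 1, 1)
corrupted = list(word)
corrupted[4] ^= 1                            # one bit flipped "in transit"
print(hamming74_correct(corrupted) == word)  # → True
```

Three redundant bits are enough to locate and repair any single flipped bit, which is why such codes keep noisy channels (and browsers) working.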
We may also be able to prove that our universe is discrete by nature. The question is, ultimately: is there a smallest unit of length, beyond which you can't divide any further?
It may soon be possible to confirm Giovanni Amelino-Camelia's observations of Hubble's quasar shift at high frequencies.
We may not even need to peer deep into the universe, since discrete and continuous may be two sides of the same coin!
“The most significant level of interaction is when one and the same phenomenon appears in both the continuous and discrete setting. In such cases, intuition and insight gained from considering one of these may be extremely useful in the other.” – László Lovász, Microsoft Research
One of the best chips is bad at maths
I will share with you a promising innovation in chips. An MIT project funded by DARPA resulted in a chip…
capable of processing frames almost 100 times faster than a conventional processor restricted to doing correct math—while using less than 2 percent as much power
The chip actually performs imprecise summations, but that is enough for it to do very well on some hard tasks that do not require exact calculations. That is exactly the opposite of what one expects from a chip!
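A software toy can mimic the trade-off (my own sketch of the general idea of approximate computing, not the MIT chip's actual design): keep only a few significant bits per addend, and the aggregate answer barely moves.

```python
import random

random.seed(1)
pixels = [random.uniform(0.0, 255.0) for _ in range(10_000)]

def imprecise_sum(values, keep_bits=4):
    """Mimic a low-precision adder: round each value to `keep_bits`
    significant bits before accumulating."""
    total = 0.0
    for v in values:
        shift = 1
        while v / shift >= (1 << keep_bits):
            shift *= 2               # drop low-order detail for big values
        total += round(v / shift) * shift
    return total

exact = sum(pixels)
approx = imprecise_sum(pixels)
rel_error = abs(approx - exact) / exact
print(f"relative error with 4-bit addends: {rel_error:.3%}")
```

The individual rounding errors largely cancel out across thousands of addends, while in hardware each discarded bit is an adder stage that no longer needs to exist, which is roughly where the speed and power savings come from.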
You can find more detail in this MIT review.