An affordable credit card-sized supercomputer by NVIDIA

Jetson TX1

NVIDIA yesterday announced the Jetson TX1, a credit-card-sized Linux system-on-module designed for applications ranging from autonomous navigation to deep-learning-driven inference and analytics.

It will soon be available as a development kit: a mini-ITX carrier board with the module pre-mounted, which has low power consumption and provides an out-of-the-box desktop experience (it ships with a custom Ubuntu Linux distribution). Unfortunately, the development kit requires a USB hub to connect both a keyboard and a mouse, and the 16GB of eMMC storage is probably too little.

Since I really enjoyed working on artificial intelligence at university and during a stint as a contractor in a public research center, I think I will ask for the developer kit for Christmas. I plan to use it as a media center, for intelligent home automation, and for personal deep learning projects.

You may wonder why I chose this solution. Simply because this board packs several interesting characteristics:

  • a Tegra X1 SoC: an ARM Cortex-A57 CPU and a Maxwell-based GPU packing 256 CUDA cores (delivering 1 teraflop at 5.7W, i.e. the same peak speed as a small supercomputer from 15 years ago!)
  • 4GB of RAM shared between the CPU and the GPU
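A quick back-of-envelope check of the efficiency figure quoted above (the 1 teraflop and 5.7W numbers come from the announcement; the rest is just arithmetic):

```python
# Figures quoted for the Tegra X1 in the announcement:
peak_flops = 1e12      # 1 teraflop peak throughput
power_watts = 5.7      # power draw at that peak

# Energy efficiency in GFLOPS per watt
gflops_per_watt = peak_flops / power_watts / 1e9
print(f"{gflops_per_watt:.0f} GFLOPS per watt")  # ≈ 175
```

Around 175 GFLOPS per watt is what makes a "supercomputer" fit in a credit-card form factor.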

It sounds interesting to me.

Some hints for a mathematical world


Max Tegmark, a cosmologist, considers the external reality to be essentially mathematical.

In that sense, mathematical entities (such as groups or varieties) may not differ in nature from physical entities (photons, magnetic fields, etc.), which would explain why mathematics is so successful at describing the physical world. In other words, the so-called "universal structural realism" asserts that our physical universe is isomorphic to a mathematical structure.


pi number by J.Gabás Esteban

A link between quantum physics and mathematics

I personally believe in the power of loop quantum gravity, combined with the idea that physical laws are reducible to a quantum circuit.

The universe may therefore exhibit fundamental mathematical properties inherent in the quantum circuit's subroutine "archetypes". That's it for the theoretical digression.

If the universe has fundamental mathematical properties, then we should be able to find the same patterns recurring in formulas.

And that is the case: researchers found, in quantum-mechanical calculations of the energy levels of the hydrogen atom, the same formula that the mathematician John Wallis derived for π as the product of an infinite series of ratios (the so-called Wallis product for π, published in the Arithmetica Infinitorum in 1655).
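The Wallis product mentioned above is easy to check numerically. A quick sketch (the formula π/2 = ∏ (2k/(2k−1))·(2k/(2k+1)) is Wallis's; the code is just an illustration):

```python
import math

def wallis_pi(n_terms):
    """Approximate pi via the Wallis product:
    pi/2 = prod_{k=1..inf} (2k/(2k-1)) * (2k/(2k+1))."""
    prod = 1.0
    for k in range(1, n_terms + 1):
        prod *= (2 * k) / (2 * k - 1) * (2 * k) / (2 * k + 1)
    return 2 * prod

print(wallis_pi(100_000))  # slowly approaches math.pi
```

The product converges slowly (the error shrinks roughly like 1/n), which is part of its charm: a 1655 formula that a hydrogen atom apparently "knows".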

It reminds me of something…

The formula links π and quantum mechanics. That's a fact.

I recently read an article in French about the tau manifesto, which argues that π is the wrong circle constant and should be replaced by τ = 6.28…

Interestingly, if we use τ = 2π then several classical formulas in mathematics and physics take on the same appearance:

area of a disk = ½ τ r²

Ec = ½ m v²

d = ½ g t²
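The disk-area version of this pattern is easy to verify: with the manifesto's τ = 2π, the formula ½ τ r² is identical to the classic π r². A quick check (the sample values are arbitrary):

```python
import math

tau = 2 * math.pi  # the tau manifesto's circle constant

r, m, v, g, t = 2.0, 3.0, 4.0, 9.81, 5.0  # arbitrary sample values

area = 0.5 * tau * r**2       # area of a disk, same shape as the others
kinetic = 0.5 * m * v**2      # Ec = 1/2 m v^2
distance = 0.5 * g * t**2     # d = 1/2 g t^2

# The tau form agrees exactly with the classic pi r^2
assert math.isclose(area, math.pi * r**2)
```

All three quantities are "half of a constant times a square" — the pattern the manifesto finds so appealing.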

Physics and mathematics are maybe two sides of the same coin…

Home reading environment is beneficial to children

Reading at school

While we already know that reading fiction improves brain connectivity and function and that its effects are long-lasting, a new study shows that, while children listen to stories, "greater home reading exposure is positively associated with activation of brain areas supporting mental imagery and narrative comprehension, controlling for household income".

If you want smart kids, just don't put your children in front of the TV for long hours. Instead, give priority to active play and reading. They will develop better cognitive abilities and be healthier.

Deep learning for everyone!


That’s great news! Google just open-sourced TensorFlow, its deep (machine) learning library.

The engine is widely used at Google: by speech recognition systems, in the new Google Photos product, in Gmail, in Search, etc.


From now on, startups will be able to develop systems as intelligent as a 4-year-old child. More interestingly, sharing Python code between researchers and data scientists has never been easier.

The limitations of the previous system no longer exist:

[DistBelief] was narrowly targeted to [artificial] neural networks, it was difficult to configure, and it was tightly coupled to Google’s internal infrastructure — making it nearly impossible to share research code externally. […] TensorFlow has extensive built-in support for deep learning, but is far more general than that — any computation that you can express as a computational flow graph, you can compute with TensorFlow (see some examples). Any gradient-based machine learning algorithm will benefit from TensorFlow’s auto-differentiation and suite of first-rate optimizers. And it’s easy to express your new ideas in TensorFlow via the flexible Python interface.
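To give an intuition for what "any computation expressed as a computational flow graph" plus auto-differentiation means, here is a toy sketch in plain Python. This is not TensorFlow's actual API — just a minimal illustration of the underlying idea of reverse-mode automatic differentiation on a graph:

```python
class Node:
    """One node in a tiny computational flow graph (an illustrative
    sketch of the idea behind TensorFlow, not its real API)."""

    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # list of (parent_node, local_gradient)
        self.grad = 0.0

    def __mul__(self, other):
        # d(x*y)/dx = y,  d(x*y)/dy = x
        return Node(self.value * other.value,
                    [(self, other.value), (other, self.value)])

    def __add__(self, other):
        # d(x+y)/dx = 1,  d(x+y)/dy = 1
        return Node(self.value + other.value,
                    [(self, 1.0), (other, 1.0)])

    def backward(self, seed=1.0):
        # Reverse-mode auto-differentiation: accumulate the incoming
        # gradient and push it back along each edge of the graph.
        self.grad += seed
        for parent, local_grad in self.parents:
            parent.backward(seed * local_grad)

# Build the graph for f(x, y) = x*y + x, then differentiate it.
# Expected: df/dx = y + 1 = 5, df/dy = x = 3.
x, y = Node(3.0), Node(4.0)
f = x * y + x
f.backward()
print(f.value, x.grad, y.grad)  # 15.0 5.0 3.0
```

Any gradient-based learning algorithm boils down to this pattern: build the graph once, then let the framework derive all the gradients mechanically — which is exactly the service TensorFlow provides at scale.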

Maybe the engine will soon become available as a cloud-based service on a clustered architecture…