TLDR
Google researchers are developing Flax: Google's Open Source Approach To Flexibility In Machine Learning. Its numerical base is JAX rather than NumPy; JAX is itself a Google research project. One of JAX's biggest advantages is XLA, a dedicated compiler for linear algebra that enables execution on GPUs and TPUs as well. Unfortunately, you will still need TensorFlow at this point, because Flax lacks a good data input pipeline.
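A minimal sketch of what "JAX instead of NumPy" means in practice: `jax.numpy` mirrors the familiar NumPy API, and `jax.jit` compiles a function through XLA so the same code runs on CPU, GPU, or TPU without changes. The function name `predict` here is just an illustrative example, not part of the Flax API.

```python
import jax
import jax.numpy as jnp  # drop-in, NumPy-like API

@jax.jit  # trace once, then compile via XLA for the available backend
def predict(w, x):
    # ordinary NumPy-style code; XLA fuses and optimizes it
    return jnp.dot(x, w)

w = jnp.ones((3,))
x = jnp.arange(6.0).reshape(2, 3)
y = predict(w, x)  # first call triggers XLA compilation
print(y)  # → [ 3. 12.]
```

The same source runs unmodified on an accelerator if one is present; JAX picks the backend, and XLA generates the device code.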
Written by fabian | does something with computer