Sharing some of the latest research, announcements, and resources on deep learning. By Isaac Madan (email).

Continuing our series of deep learning updates, we pulled together some of the awesome resources that have emerged since our last post. In case you missed it, you can find all past updates here. As always, this list is not comprehensive, so let us know here if there's something we should add, or if you're interested in discussing this area further. If you're a machine learning practitioner or student, join our Talent Network to get exposed to awesome ML opportunities.

Research & Announcements

Colaboratory by Google Research. Google releases its own cloud notebook platform — try it here. Colaboratory is a data analysis tool that combines text, code, and code outputs into a single collaborative document.

OpenFermion: The Electronic Structure Package For Quantum Computers by Google. Original paper here. OpenFermion is an open source effort for compiling and analyzing quantum algorithms to simulate fermionic systems, including quantum chemistry. Among other functionalities, the current version features data structures and tools for obtaining and manipulating representations of fermionic and qubit Hamiltonians.

Deep learning and the Schrödinger equation by Mills et al of University of Ontario. We have trained a deep (convolutional) neural network to predict the ground-state energy of an electron in four classes of confining two-dimensional electrostatic potentials.

NVIDIA Deep Learning Accelerator. NVIDIA open sources its deep learning chip architecture to broaden its adoption as an IoT standard. The NVIDIA Deep Learning Accelerator (NVDLA) is a free and open architecture that promotes a standard way to design deep learning inference accelerators.

Micromouse contest first place video. Micromouse is an event where small robot mice solve a 16x16 maze (Wikipedia). Watch first place winner Ning6A1 by BengKiat Ng solve the maze (YouTube video). Read more about the contest via this blog post.
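The Schrödinger-equation paper above maps a confining 2D potential, sampled on a grid, to a single scalar (the ground-state energy) with a convolutional network. As a rough illustration of that grid-in, scalar-out setup — not the authors' architecture; the toy harmonic potential, filter sizes, and random untrained weights below are all made up — here is a single convolution, pooling, and linear readout in NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "confining potential": a 32x32 grid sampling V(x, y) = x^2 + y^2 (a 2D harmonic well).
xs = np.linspace(-1.0, 1.0, 32)
V = xs[None, :] ** 2 + xs[:, None] ** 2            # shape (32, 32)

def conv2d(img, kernels):
    """Valid 2D cross-correlation of one image with a bank of k x k kernels."""
    k = kernels.shape[-1]
    h, w = img.shape
    out = np.empty((kernels.shape[0], h - k + 1, w - k + 1))
    for i in range(h - k + 1):
        for j in range(w - k + 1):
            patch = img[i:i + k, j:j + k]
            out[:, i, j] = (kernels * patch).sum(axis=(1, 2))
    return out

# Randomly initialized (untrained) parameters -- stand-ins for learned weights.
kernels = rng.normal(scale=0.1, size=(8, 3, 3))    # 8 filters of size 3x3
w_out = rng.normal(scale=0.1, size=8)              # linear readout over channels
b_out = 0.0

feats = np.maximum(conv2d(V, kernels), 0.0)        # convolution + ReLU -> (8, 30, 30)
pooled = feats.mean(axis=(1, 2))                   # global average pool -> (8,)
energy_pred = float(pooled @ w_out + b_out)        # scalar "energy" prediction
print(energy_pred)
```

In the paper a much deeper network is trained on many (potential, energy) pairs; this sketch only shows the shape of the task a network like theirs has to learn.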
Neural Networks API by Google. Google announces the Neural Networks API for Android, which executes machine learning models on the device, bringing more AI to the edge. The Android Neural Networks API (NNAPI) is an Android C API designed for running computationally intensive operations for machine learning on mobile devices.

Generative Adversarial Networks: An Overview by Creswell et al of Imperial College London. The aim of this review paper is to provide an overview of GANs for the signal processing community, drawing on familiar analogies and concepts where possible. In addition to identifying different methods for training and constructing GANs, we also point to remaining challenges in their theory and application.

Announcing PlaidML: Open Source Deep Learning for Every Platform by Vertex.AI. Open source, portable deep learning engine. Our mission is to make deep learning accessible to every person on every device.

Resources, Tutorials & Data

Arxiv Vanity by Andreas Jansson and Ben Firshman. A handy tool that renders arXiv academic papers as easy-to-read web pages, so you don't have to read the PDF versions typical of most ML papers.

Video lectures accompanying the Deep Learning book by Alena Kruchkova. Great series of lecture videos that follow the Deep Learning book by Goodfellow et al. Original book here.

Raspberry Pi: Deep learning object detection with OpenCV by Adrian Rosebrock. Tutorial demonstrating near real-time object detection with a Raspberry Pi.

Dimensionality Reduction: Principal Components Analysis, Part 1 by Data4Bio. Thorough and understandable explanation of Principal Component Analysis (YouTube video).

Explaining Your Machine Learning Model (or 5 Ways to Assess Feature Importance) by ClearBrain. Knowing which features, inputs, or variables in a model are influencing its effectiveness is valuable to improving its actionability.
Assessing feature importance, though, is not straightforward. Below we outline five ways of addressing feature importance, with a focus on logistic regression models for simplicity.

Word embeddings in 2017: Trends and future directions by Sebastian Ruder. Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers (Wikipedia). This post focuses on the deficiencies of word embeddings and how recent approaches have tried to resolve them.

How to Win a Data Science Competition: Learn from Top Kagglers by Coursera. Course just started October 23. In this course, you will learn to analyse and competitively solve such predictive modelling tasks.

By Isaac Madan. Isaac is an investor at Venrock (email). If you're interested in deep learning, or there are resources I should share in a future newsletter, I'd love to hear from you here. If you're a machine learning practitioner or student, join our Talent Network to get exposed to awesome ML opportunities.

Requests for Startups is a newsletter of entrepreneurial ideas & perspectives by investors, operators, and influencers.

Please tap or click "❤" to help promote this piece to others.