Up to Speed on Deep Learning: June Update

by Requests for Startups, June 1st, 2017

Sharing some of the latest research, announcements, and resources on deep learning.

By Isaac Madan (email)

Continuing our series of deep learning updates, we pulled together some of the awesome resources that have emerged since our last post. In case you missed it, here are our past updates: May, April part 2, April part 1, March part 1, February, November, September part 2 & October part 1, September part 1, August part 2, August part 1, July part 2, July part 1, June, and the original set of 20+ resources we outlined in April 2016. As always, this list is not comprehensive, so let us know if there’s something we should add, or if you’re interested in discussing this area further.

Announcements

Convolutional Sequence to Sequence Learning by Gehring et al. A Facebook team demonstrates a novel convolutional neural network (CNN) approach to language translation that achieves state-of-the-art accuracy at nine times the speed of recurrent neural systems. The FAIR sequence modeling toolkit (fairseq) is available on GitHub so researchers can build custom models for translation, text summarization, and other tasks. GitHub repo here. Original paper here.
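The core building block is easy to sketch. Below is a minimal NumPy illustration of a gated convolutional block in the spirit of the paper's gated linear units (GLU) — an illustrative sketch only, not the fairseq implementation, and all names are our own:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def glu_conv_block(X, W, V, b, c, k=3):
    """One gated convolutional block (sketch, ConvS2S-style).

    X: (seq_len, d) input embeddings; W, V: (k*d, d) convolution weights
    for the linear and gate paths; b, c: (d,) biases; k: kernel width.
    Returns (seq_len, d) after GLU gating plus a residual connection.
    """
    seq_len, d = X.shape
    pad = k // 2
    Xp = np.pad(X, ((pad, pad), (0, 0)))
    out = np.empty_like(X)
    for t in range(seq_len):
        window = Xp[t:t + k].reshape(-1)   # flattened k*d receptive field
        a = window @ W + b                 # linear path
        g = sigmoid(window @ V + c)        # gate path
        out[t] = a * g + X[t]              # GLU output + residual
    return out
```

Stacking several such blocks widens each position's receptive field, and because positions do not depend on previous time steps, the whole sequence can be processed in parallel — which is where the speedup over recurrent models comes from.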

NVIDIA GPU Tech Conference 2017 Highlights in 12 Minutes by Engadget. Relevant announcements by NVIDIA pertaining to AI & machine learning hardware advancements. Video.

TensorFlow Research Cloud by Google. 1,000 Cloud TPUs for the world’s top researchers to accelerate deep learning research.

Build and train machine learning models on our new Google Cloud TPUs by Jeff Dean and Urs Hölzle. We’re excited to announce that our second-generation Tensor Processing Units (TPUs) are coming to Google Cloud to accelerate a wide range of machine learning workloads, including both training and inference. We call them Cloud TPUs, and they will initially be available via Google Compute Engine.

Deep Voice 2 by Baidu. Human speech generation with less training data. It can learn the nuances of a person’s voice with just half an hour of audio, and a single system can learn to imitate hundreds of different speakers (article).

Research

Phase-Functioned Neural Networks for Character Control by Holden et al. University of Edinburgh researchers use neural networks to more realistically animate the way characters move in a real-time game environment, trained on a locomotion dataset of movement in virtual scenes. Original paper here.

Robots that Learn by OpenAI. We’ve created a robotics system, trained entirely in simulation and deployed on a physical robot, which can learn a new task after seeing it done once.

Using Machine Learning to Explore Neural Network Architecture by Quoc Le et al. of Google. The process of manually designing machine learning models is difficult because the search space of all possible models can be combinatorially large. Google demonstrates a reinforcement learning approach to automate the design of machine learning models, making them more accessible.
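To make the idea concrete, here is a toy, self-contained sketch in the same spirit: a REINFORCE-style controller samples architectural choices from a tiny discrete space and is nudged toward choices that score well. The search space and the `proxy_accuracy` stand-in are hypothetical — a real system would train and validate each sampled child network rather than call a closed-form reward:

```python
import math, random

# Toy search space: an "architecture" is a (num_layers, width) pair.
LAYERS = [2, 4, 8]
WIDTHS = [16, 32, 64]

def proxy_accuracy(layers, width):
    """Hypothetical stand-in for training a sampled child model and
    measuring its held-out accuracy."""
    return 1.0 - abs(layers - 4) * 0.05 - abs(math.log2(width) - 5) * 0.04

def softmax(logits):
    m = max(logits)
    exps = [math.exp(l - m) for l in logits]
    z = sum(exps)
    return [e / z for e in exps]

def sample(probs, rng):
    r, acc = rng.random(), 0.0
    for i, p in enumerate(probs):
        acc += p
        if r <= acc:
            return i
    return len(probs) - 1

def search(steps=300, lr=0.5, seed=1):
    """REINFORCE-style search: one categorical distribution per
    architectural choice, updated toward above-baseline rewards."""
    rng = random.Random(seed)
    logits_l = [0.0] * len(LAYERS)
    logits_w = [0.0] * len(WIDTHS)
    baseline = 0.0
    for _ in range(steps):
        pl, pw = softmax(logits_l), softmax(logits_w)
        i, j = sample(pl, rng), sample(pw, rng)
        reward = proxy_accuracy(LAYERS[i], WIDTHS[j])
        adv = reward - baseline
        baseline = 0.9 * baseline + 0.1 * reward   # moving-average baseline
        # gradient of log-prob of a categorical sample: one-hot minus probs
        for k in range(len(logits_l)):
            logits_l[k] += lr * adv * ((1.0 if k == i else 0.0) - pl[k])
        for k in range(len(logits_w)):
            logits_w[k] += lr * adv * ((1.0 if k == j else 0.0) - pw[k])
    best_l = LAYERS[max(range(len(logits_l)), key=lambda k: logits_l[k])]
    best_w = WIDTHS[max(range(len(logits_w)), key=lambda k: logits_w[k])]
    return best_l, best_w
```

The expensive step in practice is the reward evaluation — each sample means training a network — which is why the blog post emphasizes how large the search space is.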

Fully Convolutional Instance-aware Semantic Segmentation by Haozhi Qi et al. A fully convolutional, end-to-end solution for instance segmentation, which won first place in the COCO 2016 segmentation challenge. See sample images of their instance segmentation at work here. Original paper here.

Adversarial Neural Machine Translation by Lijun Wu et al. In this paper, we study a new learning paradigm for Neural Machine Translation (NMT). Instead of maximizing the likelihood of the human translation as in previous works, we minimize the distinction between human translation and the translation given by an NMT model, via an adversarial training architecture, inspired by recent successes of generative adversarial networks.

Taming Recurrent Neural Networks for Better Summarization by Abigail See of Stanford. Enhancing abstractive, automatic text summarization via novel deep neural network architecture. In this work we propose a novel architecture that augments the standard sequence-to-sequence attentional model in two orthogonal ways. Original paper here.

Resources

AlphaGo, in context by Andrej Karpathy of OpenAI. Digs into the questions: “To what extent is AlphaGo a breakthrough?”, “How do researchers in AI see its victories?”, and “What implications do the wins have?”

An overview of AI and its potential in image & video recognition by Fei-Fei Li, chief scientist of AI/ML at Google Cloud, associate professor of computer science at Stanford.

A new kind of deep neural networks by Eugenio Culurciello. Describes novel deep neural networks suitable for unsupervised learning, such as generative ladder networks, recursive ladder networks, and predictive coding networks, and their relation to generative adversarial networks. Paper on deep predictive coding networks here.

Navigating the Unsupervised Learning Landscape also by Eugenio Culurciello. Overview of unsupervised learning methods spanning general concepts, autoencoders, clustering, generative models, and more.
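As a concrete reference point for the autoencoder idea covered in that overview, here is a minimal linear autoencoder in NumPy — a generic sketch, not code from the article — that compresses data to a low-dimensional code and trains by gradient descent on reconstruction error:

```python
import numpy as np

def train_autoencoder(X, hidden=2, epochs=300, lr=0.05, seed=0):
    """Minimal linear autoencoder (sketch): encode d-dim data into a
    `hidden`-dim code and decode it back, minimizing reconstruction MSE
    by plain gradient descent on the encoder/decoder weights."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W_e = rng.normal(scale=0.1, size=(d, hidden))   # encoder weights
    W_d = rng.normal(scale=0.1, size=(hidden, d))   # decoder weights
    for _ in range(epochs):
        Z = X @ W_e                  # encode
        X_hat = Z @ W_d              # decode
        err = X_hat - X              # reconstruction error
        grad_Wd = Z.T @ err / n      # gradient w.r.t. decoder
        grad_We = X.T @ (err @ W_d.T) / n   # gradient w.r.t. encoder
        W_d -= lr * grad_Wd
        W_e -= lr * grad_We
    return W_e, W_d
```

Because no labels appear anywhere above, this is unsupervised in the sense the article discusses: the data itself supplies the training signal.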

Roboschool by OpenAI. Open-source software for robot simulation, integrated with OpenAI Gym.

Baselines: high-quality implementations of reinforcement learning algorithms also by OpenAI. A set of high-quality implementations of reinforcement learning algorithms. These algorithms will make it easier for the research community to replicate, refine, and identify new ideas, and will create good baselines to build research on top of.

Dive into Deep Learning with 15 free online courses by David Venturi. Overview & reviews of various deep learning courses.

Top 15 Python Libraries for Data Science in 2017 by Igor Bobriakov. Overview of Python libraries for basic data science, visualization, machine learning, NLP, data mining, and stats.

Tutorials & Data

3 Million Instacart Orders, Open Sourced by Jeremy Stanley of Instacart. This anonymized dataset contains a sample of over 3 million grocery orders from more than 200,000 Instacart users. We hope the machine learning community will use this data to test models for predicting products that a user will buy again, try for the first time or add to cart next during a session.
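As a starter for that prediction task, one simple baseline is a per-product reorder rate. A sketch in plain Python — the `product_id` and `reordered` fields mirror the released CSV schema, but double-check them against the actual files:

```python
from collections import defaultdict

def reorder_rates(order_rows):
    """Per-product reorder rate from order-product rows (sketch).

    Each row is a dict with 'product_id' and 'reordered' (0/1) keys,
    assumed to match the dataset's order_products CSVs. A high rate
    suggests a product users habitually buy again.
    """
    counts = defaultdict(lambda: [0, 0])   # product_id -> [reorders, total]
    for row in order_rows:
        c = counts[row["product_id"]]
        c[0] += row["reordered"]
        c[1] += 1
    return {pid: reorders / total for pid, (reorders, total) in counts.items()}
```

A baseline like this is also a useful sanity check before reaching for a learned model: if a sequence model can't beat "recommend each user's most-reordered products," something is wrong.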

Effective TensorFlow for Non-Experts (Google I/O ’17) by Google. In this talk, you will learn how to use TensorFlow effectively. TensorFlow offers high level interfaces like Keras and Estimators, which can be used without being an expert. This talk will show how to implement complex machine learning models and deploy them on any platform that supports TensorFlow. Video.

The $1700 great Deep Learning box: Assembly, setup and benchmarks by Slav Ivanov. Walk-through of building a desktop for deep learning from scratch.

By Isaac Madan. Isaac is an investor at Venrock (email). If you’re interested in deep learning, we’d love to hear from you.

Requests for Startups is a newsletter of entrepreneurial ideas & perspectives by investors, operators, and influencers.
