Oct 9 · Issue 129

Hey folks,

This week in deep learning we bring you a new code search challenge from Facebook, news of a Tesla acquisition, a text summarization model from Google, and an investor’s perspective on machine learning deployment.

You may also enjoy learning about 150 ML models deployed at Booking.com, a review of DL-based crowd counting models, an image deduping tool, neural machine translation in TensorFlow, a generative model for spatial graphs, and more.

As always, happy reading and hacking. If you have something you think should be in next week's issue, find us on Twitter: @dl_weekly.

Until next week!

Industry

Releasing a new benchmark and data set for evaluating neural code search models [Facebook]

Just a week after GitHub announced CodeSearchNet, Facebook has announced its own code search dataset and challenge.

 

Get outfit inspiration with style ideas in Google Lens [Google]

Google Lens now lets users take a picture of a piece of clothing and get suggestions for similar styles they can purchase.

 

The Next Word: Where will predictive text take us? [The New Yorker]

A thoughtful exploration of recent advances in natural language processing and text generation.

 

Tesla Acquires DeepScale

Tesla has acquired DeepScale, an AI startup building models used for self-driving cars.

 

Google’s SummAE AI generates abstract summaries of paragraphs

Google releases code and data for a new text summarization model.

 

Machine learning deployment

Benedict Evans, investor at a16z, on where we are in the ML adoption cycle and where we might end up.

Learning

150 successful machine learning models: 6 lessons learned at Booking.com

A nice review of a recent KDD paper on deploying customer-facing ML models at scale.

 

Dense and Sparse Crowd Counting Methods and Techniques: A Review

A nice roundup of crowd counting models.

 

Ultra-Wide Deep Nets and the Neural Tangent Kernel (NTK)

The mathematics behind the question of what happens when convolution layers have an infinite number of channels.

 

Watch AI help basketball coaches outmaneuver the opposing team

Researchers develop a conditional adversarial network that produces basketball set plays and ways to defend them.

 

The Joy of Neural Painting

A nice overview of training models that paint with brush strokes. Code included.

 

The Paths Perspective on Value Learning

A closer look at how Temporal Difference learning merges paths of experience for greater statistical efficiency.
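For readers new to the idea, here is a minimal, hedged sketch of the core TD(0) update the article builds on — a toy three-state chain with illustrative values, not code from the article itself:

```python
# Minimal TD(0) sketch: bootstrap state values from sampled transitions.
# Toy 3-state chain: 0 -> 1 -> 2 (terminal), reward 1.0 on reaching state 2.
alpha, gamma = 0.5, 1.0      # illustrative step size and discount
V = [0.0, 0.0, 0.0]          # value estimates; state 2 is terminal

episode = [(0, 0.0, 1), (1, 1.0, 2)]   # (state, reward, next_state)
for _ in range(100):                    # replay the episode until values settle
    for s, r, s_next in episode:
        # TD(0): nudge V[s] toward the bootstrapped target r + gamma * V[s_next],
        # merging information across the paths that pass through each state.
        V[s] += alpha * (r + gamma * V[s_next] - V[s])

print([round(v, 2) for v in V])  # both non-terminal states approach 1.0
```

The bootstrapping in the target `r + gamma * V[s_next]` is exactly what lets TD reuse value estimates across overlapping paths of experience, which is the statistical-efficiency argument the article explores.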

Libraries & Code

[GitHub] krasserm/super-resolution

TensorFlow 2.0 based implementation of EDSR, WDSR, and SRGAN for single image super-resolution.

 

[GitHub] idealo/imagededup

Finding duplicate images made easy!
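The library offers hashing- and CNN-based methods; as a rough, hedged sketch of the underlying perceptual-hashing idea (plain Python, not the imagededup API), near-duplicate images hash to bit strings with a small Hamming distance:

```python
# Toy average-hash sketch of perceptual image dedup (not the imagededup API).
# An "image" here is a 2D list of grayscale values; the hash has one bit per
# pixel: 1 if the pixel is above the image mean, else 0.

def average_hash(img):
    pixels = [p for row in img for p in row]
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(h1, h2):
    # Number of differing bits; small distance suggests a near-duplicate.
    return sum(a != b for a, b in zip(h1, h2))

original        = [[10, 200], [220, 30]]
slightly_edited = [[12, 198], [221, 29]]   # tiny brightness changes
different       = [[200, 10], [30, 220]]   # inverted layout

d_dup = hamming(average_hash(original), average_hash(slightly_edited))
d_diff = hamming(average_hash(original), average_hash(different))
print(d_dup, d_diff)  # near-duplicate differs in 0 bits; distinct image in 4
```

Real tools downscale the image to a fixed grid first so the hash is resolution-invariant; a distance threshold then decides what counts as a duplicate.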

 

[GitHub] OpenNMT/OpenNMT-tf

Neural machine translation and sequence learning using TensorFlow.

 

[GitHub] google/xnnpack

High-efficiency floating-point neural network inference operators for mobile and Web.

Papers & Publications

Neural Turtle Graphics for Modeling City Road Layouts

Abstract: We propose Neural Turtle Graphics (NTG), a novel generative model for spatial graphs, and demonstrate its applications in modeling city road layouts. Specifically, we represent the road layout using a graph where nodes in the graph represent control points and edges in the graph represent road segments. NTG is a sequential generative model parameterized by a neural network. It iteratively generates a new node and an edge connecting to an existing node conditioned on the current graph. We train NTG on Open Street Map data and show that it outperforms existing approaches using a set of diverse performance metrics. Moreover, our method allows users to control styles of generated road layouts mimicking existing cities as well as to sketch parts of the city road layout to be synthesized. In addition to synthesis, the proposed NTG finds uses in an analytical task of aerial road parsing. Experimental results show that it achieves state-of-the-art performance on the SpaceNet dataset.

For more deep learning news, tutorials, code, and discussion, join us on Slack, Twitter, and GitHub.
Copyright © 2019 Deep Learning Weekly, All rights reserved.