Deep learning is many things, but it isn’t simple.
Even if you’re a data scientist who has mastered the basics of artificial neural networks, you may need time to get up to speed on the intricacies of convolutional, recurrent, generative, and every other species of multilayered deep learning algorithm. As deep learning innovations proliferate, there’s a risk this technology will grow too complex for average developers to grasp without intensive study.
But I’m confident that, by the end of this decade, the deep learning industry will have simplified its offerings considerably so that they're comprehensible and useful to the average developer. The chief trends toward deep learning tool, platform, and solution simplification are as follows:
1. The deep learning industry will adopt a core set of standard tools
By the end of this decade, the deep learning community will converge on a core set of de facto tooling frameworks. Currently, deep learning professionals have a glut of tooling options, most of which are open source. The most popular include TensorFlow, BigDL, OpenDeep, Caffe, Theano, Torch, and MXNet.
2. Deep learning will gain native support within Spark
The Spark community will beef up the platform’s native deep learning capabilities in the next 12 to 24 months. Judging by the sessions at the recent Spark Summit, the community appears to be leaning toward stronger support for TensorFlow at the very least, with BigDL, Caffe, and Torch also picking up adoption.
3. Deep learning will find a stable niche within the open analytics ecosystem
Most deep learning deployments already depend on Spark, Hadoop, Kafka, and other open source data analytics platforms. What’s becoming clear is that you can’t adequately train, manage, and deploy deep learning algorithms without the full suite of big data analytics capabilities provided by these other platforms. In particular, Spark is becoming an essential platform for scaling and accelerating deep learning algorithms built in various tools. As I noted in this recent article, many deep learning developers are using Spark clusters for such specialized pipeline tasks as hyperparameter optimization, fast in-memory data training, data cleansing, and preprocessing.
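The hyperparameter optimization pattern mentioned above can be sketched in a few lines. This is an illustrative toy, not Spark code: the `score` function is a stand-in objective, and the thread pool merely mimics the fan-out that a Spark cluster would perform across executors.

```python
from concurrent.futures import ThreadPoolExecutor

def score(params):
    # Stand-in objective function; a real pipeline would train and
    # validate a model for each hyperparameter candidate.
    lr, layers = params["lr"], params["layers"]
    return layers / (1.0 + lr * 100)

# Grid of hyperparameter candidates to evaluate in parallel.
grid = [{"lr": lr, "layers": n}
        for lr in (0.1, 0.01, 0.001)
        for n in (2, 4, 8)]

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(score, grid))

# Pick the candidate with the best validation score.
best = grid[results.index(max(results))]
print(best)  # → {'lr': 0.001, 'layers': 8}
```

On a Spark cluster the same shape appears as a parallelized map over candidates, which is what makes the platform attractive for this stage of the pipeline.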
4. Deep learning tools will incorporate simplified programming frameworks for fast coding
The application developer community will insist on APIs and other programming abstractions for fast coding of the core algorithmic capabilities with fewer lines of code. Going forward, deep learning developers will adopt integrated, open, cloud-based development environments that provide access to a wide range of off-the-shelf and pluggable algorithm libraries. These will enable API-driven development of deep learning applications as composable containerized microservices. The tools will automate more deep learning development pipeline functions and present a notebook-oriented collaboration and sharing paradigm. As this trend intensifies, we’ll see more headlines such as “Generative Adversarial Nets in 50 Lines of Code (PyTorch).”
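The “fewer lines of code” idea rests on composable, chainable abstractions. Here is a hypothetical sketch of that style in plain Python; the `Pipeline` class is illustrative, not a real library API:

```python
# Hypothetical fluent-API sketch: chaining keeps the whole
# pipeline definition down to a few lines.
class Pipeline:
    def __init__(self):
        self.stages = []

    def add(self, fn):
        self.stages.append(fn)
        return self  # returning self enables method chaining

    def run(self, x):
        for fn in self.stages:
            x = fn(x)
        return x

# Two toy stages: drop missing values, then scale to the max.
clean = lambda xs: [v for v in xs if v is not None]
scale = lambda xs: [v / max(xs) for v in xs]

model = Pipeline().add(clean).add(scale)
print(model.run([2, None, 4, 8]))  # → [0.25, 0.5, 1.0]
```

Real frameworks apply the same composition idea to layers and training steps, which is why a nontrivial model can fit in tens of lines rather than thousands.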
5. Deep learning toolkits will support visual development of reusable components
Deep learning toolkits will incorporate modular capabilities for easy visual design, configuration, and training of new models from pre-existing building blocks. Many such reusable components will be sourced through “transfer learning” from prior projects that addressed similar use cases. Reusable deep learning artifacts, incorporated into standard libraries and interfaces, will consist of feature representations, neural-node layerings, weights, training methods, learning rates, and other relevant features of prior models.
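The reuse pattern described above can be made concrete with a toy sketch. The dictionaries here stand in for real pretrained layer weights; the layer names and the `build_from_prior` helper are hypothetical, not any framework’s API:

```python
# Illustrative transfer-learning sketch: reuse a prior model's
# learned layers, frozen, and attach a new trainable task head.
prior_model = {
    "conv1": [0.2, -0.1, 0.5],   # stand-in feature-extraction weights
    "conv2": [0.3, 0.7, -0.4],
}

def build_from_prior(prior, frozen=("conv1", "conv2")):
    # Copy the pretrained layers and mark them non-trainable.
    model = {name: {"weights": list(prior[name]), "trainable": False}
             for name in frozen}
    # Only the new task-specific head gets trained.
    model["head"] = {"weights": [0.0, 0.0], "trainable": True}
    return model

new_model = build_from_prior(prior_model)
trainable = [n for n, layer in new_model.items() if layer["trainable"]]
print(trainable)  # → ['head']
```

In a visual toolkit, the same operation becomes drag-and-drop: a pretrained block from a component library, frozen by default, with a fresh output layer wired on.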
6. Deep learning tools will be embedded in every design surface
It’s not too soon to start envisioning “democratized deep learning.” Within the next five to 10 years, deep learning development tools, libraries, and languages will become standard components of every software development toolkit. Equally important, user-friendly deep learning development capabilities will be embedded in generative design tools used by artists, designers, architects, and creative people of all stripes who would never go near a neural network. Driving this will be a popular mania for deep learning-powered tools for image search, autotagging, photorealistic rendering, resolution enhancement, style transformation, fanciful figure inception, and music composition.
As the deep learning market advances toward mass adoption, it will follow in the footsteps of data visualization, business intelligence, and predictive analytics markets. All of them have moved their solutions toward self-service cloud-based delivery models that deliver fast value for users who don’t want to be distracted by the underlying technical complexities. That’s the way technology evolves.