How Deep is Your Net?

Posted by Vintra on February 21, 2019


From the offices of Vintra's engineers, where racks of servers hum away processing thousands of images through Vintra's deep neural nets, an article has made its way to the marketing department for your consumption and intellectual provocation.

"The Limitations of Deep Learning for Vision and How We Might Fix Them," from The Gradient, is an in-depth look at the powerful, game-changing advances deep learning has brought to computer vision thanks to GPUs (and talented engineers like the ones here at Vintra). It also takes a realistic look at the limitations deep learning models face when attempting to attain a more biological form of vision.

From the article:

History of Deep Learning

We are witnessing the third rise of deep learning. The first two waves — 1950s–1960s and 1980s–1990s — generated considerable excitement but slowly ran out of steam, since these neural networks neither achieved their promised performance gains nor aided our understanding of biological vision systems. The third wave — 2000s–present — is different because deep learning has blown past its competition on a plethora of benchmarks and real-world applications. While most of the basic ideas of deep learning were already developed during the second wave, their power could not be unleashed until large datasets and powerful computers (GPUs) became available.

The rise and fall of deep learning reflects changes in intellectual fashion and the popularity of learning algorithms. The second wave saw the limitations of classical AI in the form of underwhelming performance on overwhelming promises. Thus began the AI winter of the mid-1980s. The decline of the second wave transitioned to the rise of support vector machines, kernel methods, and related approaches. We applaud the neural network researchers who carried on despite discouragement, but note that the pendulum has swung once again. Now it is difficult to publish anything that is not neural network related. This is not a good development. We suspect that the field would progress faster if researchers pursued a diversity of approaches and techniques instead of chasing the current vogue. It is doubly worrying that student courses in AI often completely ignore the older techniques in favor of the current trends.

This is a great, thought-provoking article that we hope you enjoy.


Topics: What is artificial intelligence, Deep learning, Neural Nets, AI regulation, The Gradient

