The Belamy | Lottery Ticket Hypothesis, Andrew Ng & Turing Award

Top AI Stories from last week


The Lottery Ticket Hypothesis

A 2019 MIT study titled "The Lottery Ticket Hypothesis" shocked the ML world.

In machine learning, bigger may not always be better. As datasets and models keep expanding, researchers are racing to set state-of-the-art results on benchmarks. However, larger models demand resources that are not always available.

Over time, researchers have developed several ways to shrink deep learning models while optimizing training datasets.

In the Lottery Ticket Hypothesis, MIT researchers showed it was possible to remove many unnecessary connections from a neural network and still achieve the same or even better accuracy.

As the network size increases, the number of possible subnetworks, and with it the probability of finding the 'lucky subnetwork', also increases. As per the lottery ticket hypothesis, if we find this lucky subnetwork, we can train small, sparsified networks that match or exceed the full network's performance even after 90% of its parameters are removed.
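The core mechanism behind the paper is iterative magnitude pruning: train the network, drop the smallest-magnitude weights, then rewind the surviving weights to their original initialisation before retraining. A minimal NumPy sketch of one pruning round (a stand-in illustration, not the paper's actual code):

```python
import numpy as np

def magnitude_prune_mask(weights, sparsity):
    """Return a binary mask that keeps only the largest-magnitude weights.

    weights  -- trained weight matrix
    sparsity -- fraction of weights to remove (e.g. 0.9 removes 90%)
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)                 # number of weights to drop
    threshold = np.sort(flat)[k - 1] if k > 0 else -np.inf
    return (np.abs(weights) > threshold).astype(weights.dtype)

# The lottery-ticket procedure, simplified: train, prune by magnitude,
# then rewind the surviving weights to their ORIGINAL initial values.
rng = np.random.default_rng(0)
w_init = rng.normal(size=(4, 4))                  # initialisation (the "ticket")
w_trained = w_init + rng.normal(scale=0.5, size=(4, 4))  # stand-in for training

mask = magnitude_prune_mask(w_trained, sparsity=0.9)
winning_ticket = mask * w_init                    # sparse subnetwork, reset to init
```

In the full procedure this prune-and-rewind loop is repeated over several rounds, removing a fraction of the remaining weights each time, and the sparse subnetwork is retrained from its original initialisation.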


Andrew Ng Urges The ML Community To Be More Data-Centric, Less Model-Centric

Last week, Andrew Ng drew the ML community's attention towards MLOps, a field dealing with building and deploying machine learning models more systematically. Andrew Ng explained how machine learning development could accelerate if teams put more emphasis on the data than on the model.

Traditional software is powered by code, whereas AI systems are built using both code (models + algorithms) and data. “When a system isn’t performing well, many teams instinctually try to improve the code. But for many practical applications, it’s more effective instead to focus on improving the data,” he said.


The Brains Behind Modern Day Programming Honored With Turing Award

Modern-day computing stands on the shoulders of giants. The origin of many tools, ranging from compilers to databases and from smartphone apps to no-code machine learning platforms, can be attributed to two computer scientists: Alfred Aho and Jeffrey Ullman, the recipients of the prestigious Turing Award for the year 2020.

Last week, the Association for Computing Machinery (ACM) named Alfred Vaino Aho and Jeffrey David Ullman as the 2020 ACM A.M. Turing Award recipients for their contributions to the fundamentals of programming languages. Since 1966, the Turing Award, named after British computer science legend Alan M. Turing, has been given to contributions that push the boundaries of computer science.


Featured Video | Big Data To Good Data


What The Floq

Google has released Floq, an API that allows developers to harness TPUs to simulate quantum computing workloads. Sandbox at Alphabet, Google's secretive software development team, is the architect behind Floq.

Google provided Floq to 50 teams as part of the QHack Open Hackathon. The Sandbox at Alphabet team repurposed TPUs to accelerate simulations in the cloud, allowing developers to use frontends such as TensorFlow Quantum and PennyLane to build quantum models and run them remotely on Floq.
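Why TPUs? Simulating a quantum circuit classically means holding a length-2^n complex statevector and applying gates to it via matrix multiplication, exactly the dense linear algebra TPUs are built for. Floq's internals aren't public; the following is a minimal NumPy sketch of the underlying workload, preparing a two-qubit Bell state:

```python
import numpy as np

# An n-qubit state is a length-2**n complex vector; applying a gate is a
# (reshaped) matrix multiplication. This is the workload TPUs accelerate.

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)               # controlled-NOT

def apply_gate(state, gate, n_qubits, target):
    """Apply a single-qubit gate to qubit `target` via tensor reshaping."""
    state = state.reshape([2] * n_qubits)
    state = np.moveaxis(state, target, 0)          # bring target axis forward
    state = np.tensordot(gate, state, axes=([1], [0]))
    state = np.moveaxis(state, 0, target)          # restore axis order
    return state.reshape(-1)

# Start in |00>, apply H to qubit 0, then entangle with CNOT.
state = np.zeros(4, dtype=complex)
state[0] = 1.0
state = apply_gate(state, H, n_qubits=2, target=0)
state = CNOT @ state                               # Bell state (|00>+|11>)/sqrt(2)
```

The statevector doubles in size with every added qubit, so simulating circuits beyond roughly 30 qubits becomes a massive matrix-multiplication problem, which is where repurposed TPU clusters pay off.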


Hands-On Guides for ML Developers

Guide to StyleCLIP: Text Driven Image Manipulation

Guide To GPBoost: A Library To Combine Tree-Boosting With Gaussian Process

Comprehensive Guide To K-Medoids Clustering Algorithm

Object-Oriented Programming with Python

Guide to Pykg2vec: A Python Library for Knowledge Graph Embedding



How This Lingerie Brand In India Uses AI To Push Sales

Pukhraj Singh, Cyber Intelligence Analyst

Karthik Kumar, Director Of Data Science For Auto Practice, Epsilon

Zoomcar’s AI & Data Science Roadmap For 2021

Rashmi Chandrashekar, DXC Technology

3-2-1 Strategy Is Quite Effective In Securing Data: Nikhil Korgaonkar, Arcserve

In Conversation With Dr Suman Sanyal, NIIT University

The Startup Story Of Augnito

How Lendingkart Uses Machine Learning


Google Announces TensorFlow Quantum 0.5.0

Google is celebrating the first anniversary of TensorFlow Quantum (TFQ), a library for rapid prototyping of hybrid quantum-classical ML models. TFQ is regarded as a tipping point for the hybrid quantum-classical machine learning models the company has been pushing for years.

Google will soon release TensorFlow Quantum 0.5.0, with more support for distributed workloads, many new quantum-centric features and performance boosts.


Uber’s Michelangelo

In 2017, Uber introduced its ML-as-a-service platform Michelangelo to democratise machine learning and make scaling AI 'as easy as requesting a ride'. In Q1 2020, Uber completed a staggering 1,658 million trips, averaging over 18 million a day.

With such a big fleet of vehicles and drivers and an ever-growing customer base, Uber has access to a rich dataset. Uber has always been bullish on AI and machine learning, and Michelangelo is one of its pet projects.



Here's everything that happened last week.

IIT Madras Launches Fellowship In AI For Social Good

Julia Computing Launches JuliaSim For Scientific Machine Learning In Cloud

Bangalore-based No-code AI Video Communication Platform Raises $300K In Seed Funding

Google Announces Recipients Of Research Scholar Program 2021

Accenture Acquires Core Compete To Expand Cloud And Data Science Capabilities