The Decline of TensorFlow and the Rise of PyTorch in AI Workloads: A Documented Shift
TensorFlow, formerly the preeminent deep learning framework, created by Google, has been eclipsed in recent years by PyTorch, developed by Facebook AI Research (now Meta AI). Once the default choice for researchers and corporations alike, TensorFlow has seen a gradual decline as PyTorch has emerged as the favoured framework for AI workloads. This shift is visible not only in research laboratories and enterprises but also on platforms such as Hugging Face, where PyTorch now dominates in the number of models available.
Let us examine the documented cases of this transition, investigate the factors behind TensorFlow’s decline, and review empirical data, such as model counts on Hugging Face, to emphasise this change.
Hugging Face Model Counts: A Clear Indicator
Some of the clearest evidence of PyTorch’s dominance over TensorFlow is the number of models available on Hugging Face, the premier model-sharing platform in the AI domain. Hugging Face has become the principal repository for pre-trained models, commonly used by developers and researchers as a starting point for further experimentation.
- As of October 2024, Hugging Face hosts over 100,000 models.
- Of these, over 85,000 models are implemented in PyTorch.
- By comparison, TensorFlow has around 5,000 models listed.
This stark gap in model counts underscores how decisively developer preference has shifted towards PyTorch, particularly for new AI and machine learning projects.
Source: Hugging Face Model Hub
Documented Shifts from TensorFlow to PyTorch
OpenAI’s Transition to PyTorch
One of the most prominent migrations was OpenAI’s move from TensorFlow to PyTorch in 2020. OpenAI originally built many of its projects, including early iterations of GPT, on TensorFlow. In a blog post, OpenAI explained that PyTorch’s dynamic graph capabilities and ease of experimentation made it better suited to their rapidly evolving research needs. This move reverberated throughout the AI community, signalling that even a leading TensorFlow user had switched to PyTorch.
Reference: OpenAI Blog
Tesla’s AI Team Transition
Another notable example is Tesla’s AI team, which oversees the development of the company’s autonomous driving technology. At Tesla’s AI Day in 2021, Andrej Karpathy, then Tesla’s Director of AI, confirmed that the company had moved entirely from TensorFlow to PyTorch. Karpathy praised PyTorch for its flexibility and developer-centric design, which let the team iterate faster and manage the intricate model construction behind their autonomous driving software. This change solidified PyTorch’s status as the favoured framework for AI-intensive, real-world applications.
Reference: Tesla with PyTorch
University of California, Berkeley – AMP Lab
The AMP Lab at UC Berkeley, recognised for its groundbreaking contributions to AI and ML, also transitioned from TensorFlow to PyTorch. The lab’s researchers explained in several blog posts and research papers that PyTorch’s ease of use, dynamic execution, and strong support for cutting-edge research made it the better fit for a university environment. PyTorch’s clear, Pythonic code let researchers focus on innovation rather than wrestle with TensorFlow’s rigid structure.
Reference: AMP Lab Blog
Why TensorFlow is Losing Ground
Steep Learning Curve and Complexity
TensorFlow has long been criticised for its steep learning curve, particularly in its early versions. Its static computational graph model, which required users to define the entire graph before running any computation, made experimentation and debugging cumbersome. TensorFlow 2.0 introduced dynamic (“eager”) execution by default, but it came too late: PyTorch had already built considerable momentum on the strength of its user-friendly design.
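The difference between the two styles can be illustrated with a toy sketch in plain Python (deliberately not the real TensorFlow or PyTorch APIs): a static graph is assembled first and executed in a separate step, whereas eager execution computes concrete values line by line.

```python
# Toy contrast between the two execution styles (plain Python, not the
# actual TensorFlow/PyTorch APIs).

class Node:
    """A deferred operation in a static graph: nothing runs until run()."""
    def __init__(self, op, *inputs):
        self.op, self.inputs = op, inputs

    def run(self, feed):
        if self.op == "input":
            return feed[self.inputs[0]]          # look up the fed value
        vals = [n.run(feed) for n in self.inputs]
        return vals[0] + vals[1] if self.op == "add" else vals[0] * vals[1]

# Static style: build the whole graph first, execute it in a second step.
x = Node("input", "x")
y = Node("input", "y")
graph = Node("mul", Node("add", x, y), y)        # represents (x + y) * y
static_result = graph.run({"x": 2.0, "y": 3.0})

# Eager style: each line computes a concrete value immediately, so you can
# print, branch, and debug with ordinary Python tools at every step.
a, b = 2.0, 3.0
eager_result = (a + b) * b

print(static_result, eager_result)  # both 15.0
```

In the static style, an error inside the graph only surfaces when `run()` is finally called, which is exactly what made debugging TensorFlow 1.x programs painful; in the eager style, the failing line raises immediately.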
Developer and Researcher Preferences
PyTorch’s appeal lies largely in its flexibility. Its dynamic computational graph approach feels far more natural to developers, particularly those already comfortable with Python. Researchers value being able to modify code on the fly, which makes prototyping and refining new ideas straightforward. TensorFlow’s originally rigid approach was poorly suited to the rapid experimentation demanded in both academic and industrial settings.
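The “define-by-run” idea behind this flexibility can be sketched in a few lines of pure Python: the gradient graph is recorded as the code executes, so ordinary Python control flow (an `if`, a loop) can change the graph’s shape on every run. This is an illustrative sketch of the concept only, not PyTorch’s actual autograd implementation.

```python
# Minimal define-by-run reverse-mode autodiff (a conceptual sketch of the
# idea behind PyTorch's dynamic graphs, not its real implementation).

class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # pairs of (parent Var, local gradient)
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        # The graph edge is recorded at the moment the Python line executes.
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, seed=1.0):
        # Accumulate gradients by walking the recorded graph backwards.
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

x = Var(3.0)
# Ordinary Python control flow decides the graph shape on every run:
y = x * x if x.value > 0 else x + x
y.backward()
print(y.value, x.grad)  # 9.0 and dy/dx = 2*x = 6.0
```

Because the branch is plain Python, a different input would record a different graph, with no special graph-level conditional operators required, which is precisely the ergonomic advantage researchers cite.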
The growth of PyTorch usage in research publications is evident. At conferences such as NeurIPS and ICML, over 70% of papers now use PyTorch as their framework of choice. TensorFlow, once favoured in these venues, has steadily lost ground to PyTorch.
Hugging Face Model Dominance
The data from Hugging Face’s model repository reveals a clear trend: developers and academics overwhelmingly choose PyTorch when sharing models with the AI community. The wide gap in model counts between PyTorch and TensorFlow on Hugging Face confirms PyTorch’s status as the de facto standard for distributing pre-trained models, while TensorFlow’s diminished presence there signals its waning appeal for new AI projects.
The Future of AI Frameworks: PyTorch Leading the Charge
PyTorch’s rise has transformed the landscape of AI frameworks. Its success stems from its flexibility, ease of use, and ability to meet the needs of both researchers and developers. TensorFlow, although still relevant in certain niches (e.g., Google Cloud), has clearly relinquished its pre-eminence. Its complexity and slow embrace of dynamic execution gave PyTorch the opening to pull ahead.
The breadth of models on Hugging Face, the migrations of prominent organisations such as OpenAI and Tesla, and PyTorch’s widespread adoption in academic research all point to the same conclusion: PyTorch has become the leading platform for AI development. Absent a substantial resurgence from TensorFlow, PyTorch appears likely to sustain its pre-eminence in AI workloads for the foreseeable future.