The State of PyTorch: A Year in Review
Welcome back, everyone, and thank you for joining us today! I'm Joe, an engineering manager at Meta on the PyTorch team, and I'm excited to co-present the state of PyTorch with Guido Charon. Continuing our annual tradition of a lightning overview of our major feature launches, we're thrilled to share the developments that have taken place in the world of PyTorch over the past year.
First and foremost, I'd like to acknowledge the incredible community that has made PyTorch so successful. From active contributors on GitHub to enthusiastic users around the world, we couldn't do it without your support and engagement. As we look back on the past 12 months, we're proud of the progress we've made towards our goals of making PyTorch faster, more efficient, and more accessible to everyone.
One of the most significant updates this year is PyTorch support for AWS's Trainium ML chips for training large language models. This new hardware backend has shown near-linear scaling across training clusters, and we're thrilled to see the impact it's already having on the AI community. We're also proud to report that our work on distributed training has continued to gain momentum, with improvements like the S3 plugin for PyTorch and our support for fully sharded data parallelism (FSDP).
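Since fully sharded data parallelism comes up again later in this talk, here is a minimal sketch of what wrapping a model with PyTorch's FSDP API looks like. The toy model, layer sizes, and the assumption of a torchrun launch with one process per GPU are ours for illustration, not details from the talk.

```python
# Minimal sketch: wrapping a model with PyTorch's FullyShardedDataParallel (FSDP).
# Assumes a torchrun launch (one process per GPU) that sets RANK, WORLD_SIZE, LOCAL_RANK.
import os

import torch
import torch.distributed as dist
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

dist.init_process_group("nccl")
local_rank = int(os.environ["LOCAL_RANK"])
torch.cuda.set_device(local_rank)

# A toy model standing in for a much larger network.
model = torch.nn.Sequential(
    torch.nn.Linear(1024, 4096),
    torch.nn.ReLU(),
    torch.nn.Linear(4096, 1024),
).cuda()

# FSDP shards parameters, gradients, and optimizer state across ranks,
# so each GPU only materializes full layers when they are needed.
model = FSDP(model)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

x = torch.randn(8, 1024, device="cuda")
loss = model(x).sum()
loss.backward()
optimizer.step()

dist.destroy_process_group()
```

Launched with something like `torchrun --nproc_per_node=8 train.py`, each rank holds only a shard of the model state, which is how training can scale to models that would not fit on a single device.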
In addition to these technical advancements, we've made significant strides in collaboration with leading companies across the industry. From cloud providers like AWS and Google Cloud to startups like Stability.ai and Predibase, we're committed to building a vibrant ecosystem around PyTorch that enables innovation and growth. Our partnerships have resulted in exciting new applications of PyTorch, such as Amazon's use of large-scale models across multiple modalities to power their discovery engine.
But what about the community itself? How have you all been using PyTorch over the past year? We're excited to share some highlights from GitHub, including 20,000 replies to community questions and concerns, and 3,000 people who have contributed to the project. We also want to recognize the many organizations and individuals who have demonstrated exceptional dedication to PyTorch, whether through open-source projects or innovative applications.
As I wrap up this part of the presentation, I'd like to extend a heartfelt thank you to each and every one of you for being part of the PyTorch community. Your passion, creativity, and hard work are what make PyTorch so special, and we're honored to be on this journey with all of you.
---
Industry Usage: Collaborations and Success Stories
As I hand it over to Guido Charon to talk about industry usage, I'd like to underscore how much of this year's progress has come from collaboration with leading companies across the industry. Our partnerships, from cloud providers like AWS and Google Cloud to startups like Stability.ai and Predibase, have resulted in exciting new applications of PyTorch that are driving innovation and growth.
One of the most notable successes we've seen is from Stability.ai, a startup focused on building open tools for developing cutting-edge AI models. By leveraging PyTorch, they're able to accelerate their development process while maintaining high standards of quality and accuracy. We're thrilled to have collaborated with them on FSDP integrations for PyTorch.
Another standout success story is from Predibase, a startup that's developed an alternative to AutoML using a declarative approach. By harnessing the power of PyTorch, they've been able to reduce development time while maintaining the highest standards of model quality. We're excited to see their work and look forward to continuing our collaboration.
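To give a flavor of what a declarative approach looks like in practice, here is a rough sketch using the open-source Ludwig library that Predibase builds on; the column names, config values, and tiny in-memory dataset are invented for illustration and are not taken from Predibase's own examples.

```python
# Illustrative only: a declarative model definition sketched with the open-source Ludwig library.
# Column names, config values, and the toy dataset are made up for this example.
import pandas as pd
from ludwig.api import LudwigModel

# Instead of writing training code, you declare inputs, outputs, and optional training knobs.
config = {
    "input_features": [{"name": "review_text", "type": "text"}],
    "output_features": [{"name": "sentiment", "type": "category"}],
    "trainer": {"epochs": 3},
}

# A tiny in-memory dataset purely for demonstration.
df = pd.DataFrame(
    {
        "review_text": ["great product, would buy again", "arrived broken and support was slow"],
        "sentiment": ["positive", "negative"],
    }
)

model = LudwigModel(config)
train_stats, _, _ = model.train(dataset=df)  # trains a PyTorch model under the hood
predictions, _ = model.predict(dataset=df)
print(predictions.head())
```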
We've also seen significant progress from Microsoft Azure, which has launched PyTorch containers on Azure along with the DeepSpeed-MII library. These releases enable faster model inference at a lower cost, with built-in support for over 24,000 models. We look forward to continuing our collaboration with the Azure team.
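As a rough sketch of the workflow DeepSpeed-MII enables, the snippet below uses MII's pipeline interface from more recent releases; the model name and generation settings are placeholders, and the exact API may differ from the version discussed here.

```python
# Rough sketch of low-latency text generation with DeepSpeed-MII's pipeline interface.
# The model name, prompt, and generation settings are placeholders for illustration.
import mii

# Loads the model and applies DeepSpeed's inference optimizations behind a simple pipeline object.
pipe = mii.pipeline("mistralai/Mistral-7B-v0.1")

# Generate text for a batch of prompts.
responses = pipe(["PyTorch makes it easy to"], max_new_tokens=64)
print(responses)
```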
In addition to these success stories, we've also seen many other organizations and startups leveraging PyTorch in innovative ways. From Tesla's use of PyTorch in their self-driving car project to Amazon's deployment of large-scale models across multiple modalities, the applications of PyTorch are endless and exciting.
As I hand it back to Joe, I'd like to take a moment to acknowledge the many organizations and individuals who have demonstrated exceptional dedication to PyTorch over the past year. Your hard work and passion are what make PyTorch so special, and we're honored to be on this journey with all of you.
---
The Future of PyTorch: Trends, Opportunities, and Looking Ahead
Taking things back from Guido to summarize our presentation, I'd like to take a moment to reflect on the incredible progress we've made as a community. From technical advancements like AWS Trainium support and distributed training to exciting collaborations with leading companies across the industry, we're proud of what we've accomplished.
Looking ahead, several trends will continue to shape the world of PyTorch. One area of significant focus will be accelerating the development of large-scale models, which have already shown tremendous promise in areas like computer vision and natural language processing. We're excited to explore new applications of PyTorch in these areas and to collaborate with leading companies and startups to drive innovation.
Another trend we expect to continue is the growth of open-source projects and communities around PyTorch. With 20,000 replies to community questions and concerns on GitHub, it's clear that our community is passionate and engaged. We're committed to supporting this growth and to providing tools and resources that enable developers to build amazing things with PyTorch.
As we look ahead to the future, I'd like to leave you all with a few final thoughts. Firstly, thank you again to every single one of you for being part of the PyTorch community. Your passion, creativity, and hard work are what make PyTorch so special, and we're honored to be on this journey with all of you.
Secondly, I'd like to emphasize the importance of collaboration and community in driving innovation and growth. By working together, we can achieve far more than we ever could alone. Whether through open-source projects or innovative applications, your contributions are invaluable and will shape the future of PyTorch for years to come.
Finally, I'd like to leave you all with a message of hope and excitement for what's to come. The world of AI is rapidly evolving, and PyTorch is at the forefront of this revolution. With your help, we can build an even more amazing future together.