elections-2024-10: What is the most important thing Kata Containers needs to achieve in the next year and how should we get there? #411
This issue is quite timely, especially as we approach the end of 2024 and begin planning the community work agenda for 2025. It gives everyone an opportunity to contribute ideas and suggestions for the community's development.

First, runtime-rs, the new architecture for Kata 3.0, has been in the community for two years. Judging by current adoption, however, most users still run the previous Go implementation, and only a few have attempted the transition to runtime-rs. One reason is that many users are accustomed to the Go version and are reluctant to switch. Rust also has a steep learning curve, which makes the move from Go challenging, and this has left developers and users hesitant about runtime-rs's architecture and usage. Most importantly, there may be lingering concerns about the stability and usability of runtime-rs.

To address these concerns and help users and developers quickly understand the architecture and technical details of runtime-rs, we could supplement the existing documentation with detailed technical documentation dedicated to runtime-rs. We could also encourage the developers and maintainers of the various subsystems to write technical blog posts about their areas of expertise, so that users and developers can grasp the architecture and technical specifics of runtime-rs more quickly and thoroughly. Once developers and users understand the technology deeply, their hesitation about developing on or adopting runtime-rs should ease.

In addition, based on our experience guiding students in mainland China through their first contributions to the Kata community, there is still a lack of documentation for beginners to learn and familiarize themselves with Kata. I therefore hope that in the coming year we can systematically create introductory documentation or manuals to help new users quickly deploy and use Kata.
Finally, and most importantly in my view, with the rapid development and adoption of large AI models, there will be increasing demand for running model inference with Kata. Currently, Kata mainly supports full GPU passthrough, but inference workloads typically do not use an entire GPU's resources. Supporting GPU virtualization, so that GPU resources can be partitioned and isolated, will therefore become a popular requirement. I hope that in the coming year we can deliver a complete GPU virtualization solution that is suitable for Kata.
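For context, the full-passthrough model described above is typically consumed in Kubernetes along these lines. This is a minimal sketch, not an official example: it assumes a `kata` RuntimeClass and the NVIDIA device plugin are already deployed, and the pod/image names are illustrative.

```yaml
# Pod requesting one whole GPU, passed through into the Kata guest VM.
# Assumes a RuntimeClass named "kata" and the NVIDIA device plugin exist.
apiVersion: v1
kind: Pod
metadata:
  name: inference-demo            # illustrative name
spec:
  runtimeClassName: kata          # run this pod inside a Kata VM
  containers:
  - name: inference
    image: nvcr.io/nvidia/pytorch:24.08-py3   # example inference image
    resources:
      limits:
        nvidia.com/gpu: 1         # whole-GPU granularity only
```

The `nvidia.com/gpu: 1` line is the crux of the problem: passthrough allocates GPUs at whole-device granularity, whereas GPU virtualization would let an inference pod consume a fraction of a device while keeping isolation.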
This initial Q&A period question came from @ildikov:
cc @gkurz @lifupan @sprt @wainersm