StaffAttract
Principal Machine Learning Engineer, Distributed vLLM ...

Red Hat, Inc. - Boston, MA


Job Description

Overview

At Red Hat, we believe the future of AI is open, and we are on a mission to bring the power of open source LLMs and vLLM to every enterprise. The Red Hat Inference team accelerates AI for the enterprise and brings operational simplicity to GenAI deployments. As leading developers and maintainers of the vLLM and llm-d projects, and inventors of state-of-the-art techniques for model quantization and sparsification, our team provides a stable platform for enterprises to build, optimize, and scale LLM deployments.

As a Principal Machine Learning Engineer focused on distributed vLLM infrastructure in the llm-d project, you will collaborate with our team to tackle the most pressing challenges in scalable inference systems and Kubernetes-native deployments. Your work on distributed systems and cloud infrastructure will directly impact enterprise AI deployments. If you want to solve challenging technical problems in distributed systems and cloud-native infrastructure the open source way, this is the role for you. Join us in shaping the future of AI!
Responsibilities
  • Develop and maintain distributed inference infrastructure leveraging Kubernetes APIs, operators, and the Gateway API Inference Extension for scalable LLM deployments.
  • Create system components in Go and/or Rust to integrate with the vLLM project and manage distributed inference workloads.
  • Design and implement KV cache-aware routing and scoring algorithms to optimize memory utilization and request distribution in large-scale inference deployments.
  • Improve the resource utilization, fault tolerance, and stability of the inference stack.
  • Contribute to the design, development, and testing of inference optimization algorithms.
  • Actively participate in technical design discussions and propose innovative solutions to complex challenges.
  • Provide timely and constructive code reviews.
  • Mentor and guide fellow engineers, fostering a culture of continuous learning and innovation.

Qualifications
  • Strong proficiency in Python and at least one systems programming language (Go, Rust, or C++), with Go highly preferred.
  • Experience with cloud-native Kubernetes service mesh technologies such as Istio, Cilium, Envoy (WASM filters), and CNI.
  • A solid understanding of Layer 7 networking, gRPC, and the fundamentals of API gateways and reverse proxies.
  • Working knowledge of high-performance networking protocols and technologies, including UCX, RoCE, InfiniBand, and RDMA, is a plus.
  • Excellent communication skills, capable of interacting effectively with both technical and non-technical team members.
  • A bachelor's or master's degree in computer science, computer engineering, or a related field.

Preferred qualifications
  • Experience with the Kubernetes ecosystem, including core concepts, custom APIs, operators, and the Gateway API Inference Extension for GenAI workloads.
  • Experience with GPU performance benchmarking and profiling tools such as NVIDIA Nsight, or distributed tracing libraries and techniques such as OpenTelemetry.
  • A Ph.D. in an ML-related domain is a significant advantage.

Salary range: $189,600.00 - $312,730.00. The actual offer will be based on your qualifications.

Pay Transparency
Red Hat determines compensation based on several factors, including but not limited to job location, experience, applicable skills and training, external market value, and internal pay equity. Annual salary is one component of Red Hat's compensation package. This position may also be eligible for bonus, commission, and/or equity. For positions with Remote-US locations, the actual salary range for the position may differ based on location but will be commensurate with job duties and relevant work experience.

About Red Hat
Red Hat is the world's leading provider of enterprise open source software solutions, using a community-powered approach to deliver high-performing Linux, cloud, container, and Kubernetes technologies. We hire creative, passionate people ready to contribute their ideas, help solve complex problems, and make an impact.

Benefits
  • Comprehensive medical, dental, and vision coverage
  • Flexible Spending Account - healthcare and dependent care
  • Health Savings Account - high-deductible medical plan
  • Retirement 401(k) with employer match
  • Paid time off and holidays
  • Paid parental leave plans for all new parents
  • Leave benefits including disability, paid family medical leave, and paid military leave
  • Additional benefits including employee stock purchase plan, family planning reimbursement, tuition reimbursement, transportation expense account, employee assistance program, and more!

Note: These benefits apply only to full-time, permanent associates at Red Hat located in the United States.

Inclusion and Equal Opportunity
Red Hat's culture is built on the open source principles of transparency, collaboration, and inclusion. We welcome applicants from all backgrounds and experiences and strive to ensure equal opportunity and access for all.
Equal Opportunity Policy (EEO)
Red Hat is proud to be an equal opportunity workplace and an affirmative action employer. We review applications for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, ancestry, citizenship, age, veteran status, genetic information, physical or mental disability, medical condition, marital status, or any other basis prohibited by law.

Accommodation
Red Hat supports individuals with disabilities and provides reasonable accommodations to job applicants. If you need assistance completing our online job application, email . General inquiries regarding application status will not receive a reply.
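The "KV cache-aware routing and scoring" responsibility above can be illustrated with a minimal Go sketch: route each request to the replica whose resident KV-cache blocks best overlap the request's prompt prefix, discounted by queue depth. The Endpoint type, block hashes, and the 0.1 load weight are illustrative assumptions for this sketch, not the actual llm-d or Gateway API Inference Extension interfaces.

```go
package main

import "fmt"

// Endpoint models a vLLM replica: which KV-cache blocks it already holds
// (by block hash) and how many requests are queued on it. Hypothetical type.
type Endpoint struct {
	Name         string
	CachedBlocks map[string]bool // hashes of KV-cache blocks resident on this replica
	QueueDepth   int             // pending requests on this replica
}

// score favors replicas that already hold more of the request's prefix blocks
// (cache hits skip prefill recompute) and penalizes deep queues.
// The 0.1 penalty per queued request is an arbitrary illustrative weight.
func score(e Endpoint, requestBlocks []string) float64 {
	if len(requestBlocks) == 0 {
		return -0.1 * float64(e.QueueDepth)
	}
	hits := 0
	for _, b := range requestBlocks {
		if e.CachedBlocks[b] {
			hits++
		}
	}
	hitRate := float64(hits) / float64(len(requestBlocks))
	return hitRate - 0.1*float64(e.QueueDepth)
}

// pick returns the name of the best-scoring endpoint for a request.
func pick(endpoints []Endpoint, requestBlocks []string) string {
	best, bestScore := "", -1e18
	for _, e := range endpoints {
		if s := score(e, requestBlocks); s > bestScore {
			best, bestScore = e.Name, s
		}
	}
	return best
}

func main() {
	eps := []Endpoint{
		{"pod-a", map[string]bool{"b1": true, "b2": true}, 8}, // full cache hit, deep queue
		{"pod-b", map[string]bool{"b1": true}, 0},             // partial hit, idle
	}
	// pod-a: 1.0 - 0.8 = 0.2; pod-b: 0.5 - 0.0 = 0.5
	fmt.Println(pick(eps, []string{"b1", "b2"})) // prints "pod-b"
}
```

A production scorer would also weigh block recency, replica capacity, and tail-latency targets, but the hit-rate-minus-load trade-off is the core of cache-aware request distribution.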

Created: 2025-09-29
