Machine Learning Engineer, LLM Training Datasets
at Nvidia
Santa Clara, United States
USD 148,000-287,500 per year
Used Tools & Technologies
Not specified
Required Skills & Competences
Python @ 3, GCP @ 3, Algorithms @ 3, Machine Learning @ 3, Data Science @ 3, TensorFlow @ 3, Hiring @ 3, AWS @ 3, Azure @ 3, Communication @ 6, Data Engineering @ 3, LLM @ 3, PyTorch @ 3
Details
NVIDIA is seeking a Machine Learning Engineer specializing in LLM training datasets engineering. This is a highly technical role requiring deep expertise in machine learning, data science, and data engineering to develop innovative solutions that address the unique challenges of training foundation models. The role focuses on building and improving the data ecosystem for pre-training, fine-tuning, RL, and multi-modal model workflows.
Responsibilities
- Develop datasets for LLM pre-training and post-training (fine-tuning and reinforcement learning), optimize models, and evaluate performance.
- Design and implement data strategies for model training and evaluation, including data collection, cleaning, labeling, augmentation, and RL verifier datasets to improve model performance.
- Actively identify and manage data issues such as outliers, noise, and biases.
- Generate high-quality synthetic data to augment existing datasets, especially for domain-specific or safety-critical use cases and multi-modal use cases (text, image, video, etc.).
- Define data annotation guidelines and curate high-quality labeled datasets for model alignment, including reinforcement learning with human feedback (RLHF).
- Conduct experiments to optimize Large Language Models with supervised fine-tuning (SFT) and reinforcement learning (RL) techniques.
- Design and develop systems for reasoning, tool calling, multi-modal processing, and RL verifiers.
- Implement post-training tasks for LLMs, including fine-tuning, RL, distillation, and performance evaluation; adjust hyperparameters to improve model quality.
- Partner with ML researchers, data scientists, and infrastructure teams to understand data needs, integrate datasets, and deploy efficient ML workflows.
Requirements
- Master's or PhD in Computer Science, Electrical Engineering, or a related field, or equivalent experience.
- 3+ years of work experience in developing datasets and training large language models or other generative AI models.
- Hands-on programming expertise in Python.
- Solid understanding of machine learning concepts and algorithms for managing data and experiments, including multi-modal datasets.
- Experience with synthetic data generation techniques and evaluation strategies.
- Experience with high-performance data processing libraries and machine learning frameworks such as PyTorch (DataLoader) and TensorFlow (tf.data).
- Experience with alignment/fine-tuning of LLMs and VLMs (image-to-text, video-to-text) or any-to-text large models.
- Familiarity with distributed training paradigms and optimization techniques.
- Strong problem solving and analytical ability, plus excellent collaboration and communication skills.
- Demonstrated behaviors that build trust: humility, transparency, respect, and intellectual integrity.
Ways to Stand Out
- Strong track record of contributions to open-source data tools or research publications.
- Experience with cloud platforms (e.g., AWS, GCP, Azure) and data storage systems (e.g., S3, Google Cloud Storage).
- A track record of continuously evaluating new tools, techniques, and methodologies in data engineering and generative AI to improve training data infrastructure.
- Passion for AI and demonstrated commitment to advancing the field through innovative research and publications.
Compensation & Benefits
- Base salary is determined by location, experience, and comparable employee pay. Posted base salary ranges:
- Level 3: 148,000 USD - 235,750 USD
- Level 4: 184,000 USD - 287,500 USD
- Eligible for equity and company benefits (see NVIDIA benefits).
Additional Information
- Applications for this job will be accepted at least until September 18, 2025.
- NVIDIA is an equal opportunity employer and values diversity in hiring and promotion practices.