Silicon Validation Methodology Engineer
at Nvidia
Santa Clara, United States
$164,000-$304,800 per year
Used Tools & Technologies
Not specified
Required Skills & Competences
Communication @ 4
Details
NVIDIA's Silicon Solutions Group (SSG) is seeking a versatile engineer to join the HW ArchDev team. The SSG team is uniquely positioned with an end-to-end view of the product development cycle, from early architecture definition, through bringup, to product release.
Responsibilities
- Evaluate new architecture, feature, and product use cases through a validation and debug lens.
- Develop and improve methodologies to incorporate new coverage and use case needs.
- Deliver requirements to SSG, Arch/HW, SW, FW, and Application teams.
- Lead debug efforts from the HW side to root-cause feature-sequencing bugs, silicon bugs, and complex system-level issues caused by interactions between multiple HW and SW features.
- Develop new methodologies, processes, and workflows for feature architecture and development.
- Apply insights from bring-up execution and post-action reviews to continually improve coverage.
Requirements
- BS or MS degree in EE/CE or equivalent experience.
- Effective in a collaborative environment.
- 10+ years of experience in some of the following areas:
- Defining HW validation and bringup methodologies for next generation silicon.
- Deep understanding of GPU/SOC system level architecture.
- Working experience with silicon active- and low-power features, boot, binning, PVT sensitivity, and platform component losses.
- Post-silicon debug and evaluation of fix options against product needs.
- Effective collaboration and communication across different functional teams.
With competitive salaries and a generous benefits package, NVIDIA is widely considered one of the technology world's most desirable employers. We encourage you to join our team, where some of the most hard-working people in the world work together to drive rapid growth. Are you passionate about becoming part of an outstanding team supporting the latest in GPU and AI technology? If so, we want to hear from you.