
Software Engineer, Privacy

Google DeepMind

On Site
Full Time
$175,000
New York, NY



Job Description

Software Engineer, Privacy

At Google DeepMind, our mission is to build the world's first general-purpose learning agent. Central to this mission is the complex task of measuring the intelligence of our prototypes. As a Software Engineer, you will work with the cutting-edge AI agents developed by our exceptional team of Machine Learning and Neuroscience research scientists. Your responsibilities will range from creating systems for agent testing using 2D and 3D games to developing test problems within physics simulators. You will create graphical visualizations of results, build competitive agent leaderboards, and test new algorithms on robots. To succeed in this role, you will need a strong foundation in software engineering and an appetite for a wide range of challenging problems within a mission-driven team.

As a Software Engineer, Privacy, you will join the Google DeepMind (GDM) Privacy Working Group team to help build a culture of privacy within GDM's product and engineering teams. In this role, you'll have the opportunity to shape the development of models and products that will profoundly impact humanity's future over the coming decades. Artificial intelligence will be one of humanity's most transformative inventions. At Google DeepMind, we are a pioneering AI lab with exceptional interdisciplinary teams focused on advancing AI development to solve complex global challenges and accelerate high-quality product innovation for billions of users. We use our technologies for widespread public benefit and scientific discovery, ensuring safety and ethics are always our highest priority. We are pushing the boundaries across multiple domains. Our global teams offer learning opportunities and varied career pathways for those driven to achieve exceptional results through collective effort.

Responsibilities

  • Lead teams through research and development to ensure transparent data practices and risk mitigation.
  • Conduct policy and foundational model reviews (large language models/generative media) to identify and escalate privacy concerns.
  • Contribute to product/business strategy, legal initiatives, and proactive engineering improvements.
  • Establish sustainable data access standards and complex access control frameworks.
  • Conduct outreach to educate Googlers within GDM on privacy topics.
  • Build a culture of privacy by collaborating with engineering teams to develop infrastructure.
  • Stay current with developments in ML privacy.

Minimum Qualifications

  • Bachelor’s degree or equivalent practical experience.
  • 2 years of experience managing and maintaining the privacy posture of an organization by analyzing/assessing proposed solutions (e.g., product features, infrastructure systems), and collaborating with stakeholders.
  • 2 years of experience applying privacy technologies (e.g., differential privacy, automated access management solutions), customizing existing solutions and frameworks to meet organizational needs.
  • 2 years of experience applying technologies and processes to mitigate risks (e.g., business, regulatory, etc.) using established frameworks such as privacy by design.
  • Experience working cross-functionally with attorneys, software developers, security teams, and product managers.
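For context on the "privacy technologies" qualification above: differential privacy works by adding calibrated noise to query results so that no individual record can be confidently inferred from the output. The sketch below is a minimal, self-contained illustration of the Laplace mechanism for a count query; the function names and parameters are our own, not part of the role or any Google library.

```python
import math
import random


def laplace_noise(scale: float) -> float:
    """Draw one sample of Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))


def dp_count(values, threshold: float, epsilon: float) -> float:
    """Epsilon-DP count of values above a threshold.

    A counting query has sensitivity 1 (adding or removing one record
    changes the count by at most 1), so Laplace noise with scale
    1/epsilon yields epsilon-differential privacy.
    """
    true_count = sum(1 for v in values if v > threshold)
    return true_count + laplace_noise(1.0 / epsilon)
```

Smaller `epsilon` values mean stronger privacy but noisier answers; choosing `epsilon` and tracking the cumulative privacy budget across queries is the core engineering work the qualification refers to.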

Preferred Qualifications

  • Experience with machine learning privacy, including memorization/recitation, risks of hallucination, and security considerations.
  • Experience motivating and collaborating across different sets of teams with different working styles.
  • Proficiency with privacy principles, frameworks, and regulations.
  • Understanding of Google products and technologies.

Key Skills and Competencies

  • Software Engineering
  • Privacy Engineering
  • Machine Learning Privacy
  • Differential Privacy
  • Risk Mitigation
  • Data Access Management
  • Cross-functional Collaboration
  • AI Ethics
  • Regulatory Compliance
  • Policy Review

Tags:

Software Engineer
Privacy Engineering
Machine Learning Privacy
AI
Deep Learning
Differential Privacy
Risk Management
Data Privacy
Google
DeepMind
Artificial Intelligence
ML Ops
Cloud Engineering
Cybersecurity
Tech Lead


How to Get Hired at Google DeepMind

  • Tailor your resume: Highlight your 2+ years of experience in privacy posture management, privacy technologies, and risk mitigation, using keywords from the job description like 'differential privacy' and 'privacy by design'.
  • Showcase cross-functional skills: Emphasize your experience collaborating with attorneys, software developers, product managers, and security teams to demonstrate your ability to work effectively in a team environment.
  • Prepare for technical interviews: Be ready to discuss your experience with privacy technologies, ML privacy, and risk assessment frameworks. Practice explaining complex privacy concepts clearly.
  • Understand DeepMind's mission: Research Google DeepMind's work on AI agents, their mission, and their commitment to safety and ethics. Align your application and interview responses with these core values.
  • Express interest in specific locations: Clearly state your preferred working location from Mountain View, CA, or New York, NY, as indicated in your application.
