
Applied Researcher II

Capital One · Cambridge, MA (via LinkedIn)
Posted April 3, 2026
Requirements
  • You’re comfortable with open-source languages and are passionate about developing further.
  • You have hands-on experience developing AI foundation models and solutions using open-source tools and cloud computing platforms.
  • You have a deep understanding of the foundations of AI methodologies.
  • You have experience building large deep learning models, whether on language, images, events, or graphs, as well as expertise in one or more of the following: training optimization, self-supervised learning, robustness, explainability, RLHF.
  • You have an engineering mindset, as shown by a track record of delivering models at scale in terms of both training data and inference volumes.
  • You have experience delivering libraries, platform-level code, or solution-level code to existing products.
  • You have a track record of originating new ideas or improving on existing ideas in machine learning, demonstrated by accomplishments such as first-author publications or projects.
  • You can own and pursue a research agenda, including choosing impactful research problems and autonomously carrying out long-running projects.
Preferred Skills
  • PhD in Computer Science, Machine Learning, Computer Engineering, Applied Mathematics, Electrical Engineering or related fields
LLM
  • PhD focus on NLP, or a Master’s with 5 years of industrial NLP research experience
  • Multiple publications on topics related to the pre-training of large language models (e.g. technical reports of pre-trained LLMs, SSL techniques, model pre-training optimization)
  • Member of a team that has trained a large language model from scratch (10B+ parameters, 500B+ tokens)
  • Publications in deep learning theory
  • Publications at ACL, NAACL, EMNLP, NeurIPS, ICML, or ICLR
Behavioral Models
  • PhD focus on topics in geometric deep learning (Graph Neural Networks, Sequential Models, Multivariate Time Series)
  • Multiple papers on topics relevant to training models on graph and sequential data structures at KDD, ICML, NeurIPS, or ICLR
  • Worked on scaling graph models to more than 50M nodes
  • Experience with large-scale deep learning based recommender systems
  • Experience with production real-time and streaming environments
  • Contributions to common open source frameworks (pytorch-geometric, DGL)
  • Proposed new methods for inference or representation learning on graphs or sequences
  • Worked with datasets of 100M+ users
Optimization (Training & Inference)
  • PhD focused on topics related to optimizing training of very large deep learning models
  • Multiple years of experience and/or publications on one of the following topics: Model Sparsification, Quantization, Training Parallelism/Partitioning Design, Gradient Checkpointing, Model Compression
  • Experience optimizing training for a 10B+ model
  • Deep knowledge of deep learning algorithmic and/or optimizer design
  • Experience with compiler design
Finetuning
  • PhD focused on topics related to guiding LLMs with further tasks (Supervised Finetuning, Instruction-Tuning, Dialogue-Finetuning, Parameter Tuning)
  • Demonstrated knowledge of principles of transfer learning, model adaptation and model guidance
  • Experience deploying a fine-tuned large language model
Data Preparation
  • Publications studying tokenization, data quality, dataset curation, or labeling
  • Contribution to a major open source corpus
  • Contribution to open source libraries for data quality, dataset curation, or labeling

Capital One will consider sponsoring a new qualified applicant for employment authorization for this position.
Education
  • (Required) – PhD in Electrical Engineering, Computer Engineering, Computer Science, AI, Mathematics, or a related field (currently held, in progress, or to be obtained on or before the scheduled start date) plus 2 years of experience in Applied Research; or an M.S. in one of those fields plus 4 years of experience in Applied Research
  • (Not required) – PhD in Computer Science, Machine Learning, Computer Engineering, Applied Mathematics, Electrical Engineering or related fields
  • (Not required) – PhD focus on NLP or Masters with 5 years of industrial NLP research experience
  • (Not required) – PhD focus on topics in geometric deep learning (Graph Neural Networks, Sequential Models, Multivariate Time Series)