
Senior Software Engineer III

Pacific Northwest National Laboratory, Seattle, WA (via LinkedIn)
Posted March 29, 2026
Requirements
Core Engineering Excellence
  • Demonstrated proficiency in Python and working knowledge of at least one additional language (Java, C#/.NET, Go, C++, Rust) with deep knowledge of software engineering principles including object-oriented design, design patterns, data structures, algorithms, and clean code practices
  • Proficiency with version control systems (Git), collaborative development workflows, and strong foundation in automated testing methodologies including unit testing, integration testing, end-to-end testing, and test-driven development (TDD)
  • Understanding of CI/CD pipelines and DevOps practices with ability to contribute to build automation, deployment processes, and release management while writing maintainable, well-documented, and performant code
  • Ability to lead technical discussions around system design, microservice architecture, and distributed computing patterns while consistently leveraging AI-assisted development tools (e.g., GitHub Copilot, Claude, Cursor) to accelerate development, generate test cases, and enhance problem-solving throughout the software development lifecycle
AI/ML & Deep Learning
  • Knowledge of machine learning fundamentals including supervised/unsupervised learning, model evaluation metrics, and common algorithms
  • Understanding of the machine learning lifecycle including data preparation, model training, hyperparameter tuning, evaluation, deployment, and monitoring
  • Knowledge of ML model serving architectures and ability to integrate models into production applications via APIs or batch processing
  • Understanding of ML best practices including experiment tracking, model versioning, A/B testing, and model performance monitoring
Cloud Native Application Development
  • Demonstrated experience building and deploying applications on cloud platforms (AWS, Azure, or GCP) with proficiency in containerization (Docker) and container orchestration (Kubernetes) for scalable application deployment (multi-cloud experience highly valued)
  • Ability to design and implement event-driven architectures using message queues, pub/sub systems, and serverless functions (Lambda, Azure Functions, Cloud Functions) with understanding of asynchronous processing patterns
  • Strong understanding of API design including RESTful principles (resource modeling, authentication, versioning) and alternative paradigms (GraphQL, gRPC) with ability to select appropriate protocols for different use cases
  • Experience designing microservice architectures with understanding of service boundaries, inter-service communication, and distributed system challenges, plus knowledge of both relational (PostgreSQL, MySQL) and NoSQL databases (MongoDB, DynamoDB, Cassandra) to select appropriate storage solutions
Data Engineering & Distributed Storage
  • Understanding of data pipeline architectures and ETL/ELT patterns using cloud-native services (AWS Glue, Lambda, Step Functions, Azure Data Factory) with knowledge of batch vs. streaming processing trade-offs
  • Knowledge of cloud-based data storage systems and their use cases (S3, Redshift, Delta Lake, BigQuery, PostgreSQL, MongoDB, OpenSearch, Snowflake) with understanding of data modeling principles including schema design, normalization/denormalization trade-offs, and data quality validation
  • Understanding of distributed data processing frameworks (Spark/Databricks, Kafka, Flink, Ray) and streaming architectures with ability to build applications that integrate with these platforms for parallel and real-time processing
  • Ability to design scalable systems handling large-scale data workloads with appropriate partitioning, indexing, and query optimization strategies while selecting optimal data formats (Parquet, Avro, JSON, Protocol Buffers) for different scenarios
Collaboration & Professional Effectiveness
  • Ability to collaborate effectively within cross-functional teams including product managers, data scientists, DevOps engineers, and other stakeholders while participating actively in Agile ceremonies, technical planning, and sprint activities
  • Strong communication skills to articulate complex technical concepts clearly through documentation, architecture diagrams, code reviews, and presentations with focus on knowledge sharing and maintaining team standards
  • Demonstrated capacity to mentor junior engineers through pair programming, constructive code reviews, and technical guidance while fostering a culture of continuous learning and improvement
  • Ability to balance technical excellence with pragmatic delivery, making appropriate trade-offs between ideal solutions and business value while demonstrating adaptability to rapidly learn new technologies and domains
  • Applying image classification for nuclear forensics analysis [Link]
  • Developing capabilities for scalable geospatial analytics [Link]
  • PhD and 1 year of software engineering experience; OR
  • MS/MA and 3 years of software engineering experience; OR
  • BS/BA and 5 years of software engineering experience; OR
  • AA and 14 years of software engineering experience in designing, architecting, programming, deploying, and automating software solutions in support of scientific research or consumer digital product development; OR
  • HS/GED and 16 years of software engineering experience in designing, architecting, programming, deploying, and automating software solutions in support of scientific research or consumer digital product development.
  • Expertise in Python and proficiency in at least one other language (Java, C#/.NET, C++, Go, Rust)
  • This position requires the ability to obtain and maintain a federal security clearance.
Preferred Skills
  • Degree in computer science, software engineering, or related field
  • Expertise in Python and proficiency in at least one other language (Java, C#/.NET, C++, Go, Rust)
  • Ability to contribute to technical direction and independently structure complex problems into actionable work, in collaboration with senior engineers and cross-functional teams
  • Experience designing or implementing components of large-scale ETL pipelines and analytics systems (petabyte-scale experience valued)
  • Experience contributing to Cloud-native system design: API and microservice architecture, DevOps, containerization and orchestration (Docker/Kubernetes), infrastructure as code, and full-stack observability (logging, metrics, tracing)
  • Active technical community engagement demonstrated through meaningful open-source contributions, maintained GitHub repositories, technical blog posts or presentations, conference participation, mentoring activities, or self-initiated projects exploring emerging technologies that showcase continuous learning and passion for the engineering craft