Overview
Job Purpose
The AI Center of Excellence (AICOE) at Intercontinental Exchange (ICE) drives innovation by developing and deploying cutting-edge AI/ML solutions across the enterprise. We are seeking a Principal Engineer to lead the design and implementation of scalable, cloud-native platforms and to experiment with emerging AI/ML technologies. The role spans the full lifecycle, from development and architecture through deployment and optimization, enabling ICE to harness the power of data and machine learning for transformative business outcomes.

The ideal candidate is a hands-on technologist with deep expertise in Python, distributed systems, and modern data architectures, combined with a passion for exploring generative AI, large language models (LLMs), and advanced ML frameworks. This position requires strong technical proficiency, problem-solving skills, and the ability to collaborate across engineering, data science, and business teams.

Responsibilities
- Architect and implement cloud-native applications and data platforms optimized for AI/ML workloads.
- Design and build high-performance ETL frameworks, data lakes, and serverless systems using technologies like Apache Iceberg, Delta Sharing, and Databricks.
- Research and prototype emerging AI/ML technologies (e.g., LLMs, generative AI, vector databases) to assess feasibility and business impact.
- Develop and maintain CI/CD pipelines and infrastructure-as-code (Terraform) for seamless deployment of AI/ML models and applications.
- Optimize performance and cost efficiency across ML pipelines and distributed systems.
- Collaborate with data scientists and ML engineers to integrate models into production environments.
- Ensure secure data sharing and compliance with enterprise standards.
- Participate in team exercises to identify and act on opportunities for continuous improvement.
Knowledge and Experience
- 8+ years of software development experience with strong Python expertise.
- Hands-on experience with AWS services (ECS, EMR, Lambda, Athena, DynamoDB) and cloud-native architectures.
- Deep knowledge of modern data lake technologies (Apache Iceberg, Delta Sharing, Databricks Unity Catalog).
- Experience building scalable ETL frameworks and serverless systems.
- Familiarity with ML frameworks (PyTorch, TensorFlow) and interest in LLMs and generative AI.
- Strong understanding of CI/CD tools (Jenkins, GitHub Actions) and infrastructure automation (Terraform).
- Experience with distributed computing and big data technologies (Spark, DuckDB).
Preferred Knowledge and Experience
- Exposure to deploying ML models in production environments.
- Knowledge of vector databases, feature stores, and real-time inference systems.
- Experience with Airflow or AWS Step Functions for orchestration.
- Contributions to open-source projects or AI/ML communities.
Intercontinental Exchange, Inc. is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to legally protected characteristics.