
Senior Data Engineer

Aleron
United States, California, Rancho Cucamonga
11069 Cedar Creek Drive (Show on map)
Apr 22, 2026


Description
Our client, located in Rancho Cucamonga, CA, is looking for a Data Engineer to join their team!
Duration: Full-time
Pay: Up to $157,000 depending on experience
Under the direction of Department Leadership, the Data Engineer III is responsible for the design, planning, and development of data solutions, and will lead the design and development of data transformations. This position is also involved in data architecture, and the Data Engineer III is expected to lead by example. In this role, the Data Engineer III will follow coding standards throughout all aspects of solution development to produce efficient, high-quality solutions, and will collaborate across departments to ensure member needs are met while building strong peer relationships.

Key Responsibilities:

1. Design, develop and implement reliable and effective data solutions based on business requirements.
2. Maintain process design artifacts like data flow diagrams, end user process maps and technical design documents.
3. Find trends in data sets and develop algorithms to help make raw data more useful to IEHP.
4. Create and maintain optimal data pipeline architecture that meet security standards.
5. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability.
6. Develop best practices for database design and development activities.
7. Create complex functions, scripts, and services to support the Data Services team.
8. Ensure all data solutions meet company and performance requirements.
9. Work under minimal supervision with wide latitude for independent judgment.
10. Conduct code reviews.
11. Direct and mentor level I and II engineers.
12. Serve as a subject matter expert in key business projects.
13. Recommend improvements to existing Data Services processes as necessary.
14. Provide detailed analysis of data issues; data mapping; and the process for automation and enhancement of data quality.
15. Analyze and integrate new technologies with existing applications to improve the design and functionality of applications.
16. Maintain proficient programming skills in REST web services, C#.NET, TSQL, XML, JSON, and other relevant languages and/or frameworks.
17. Develop and automate solutions to consume data from multiple data sources, including external APIs.
18. Program and modify code in languages and frameworks such as Java, Python, and Spark, working with JSON data, to support and implement Data Warehouse solutions.
19. Design and deploy enterprise-scale cloud infrastructure solutions.
20. Research, analyze, recommend and select technical approaches for solving difficult and meaningful development and integration problems.
21. Work closely with the Data and Engineering teams to design best in class Azure implementations.
22. Clearly and regularly communicate with management, colleagues, and domain units.

Job Requirements
Required Skills / Qualifications:
  • 8+ years of experience in provisioning, configuring, and developing solutions in Azure Data Lake, Azure Data Factory, Azure SQL Data Warehouse, Azure Synapse and Cosmos DB.
  • 8+ years implementing software development methodologies.
  • 8+ years working with relational databases.
  • Experience building and optimizing big data pipelines, architectures, and data sets.
  • 8+ years of experience performing root cause analysis on internal and external data and processes.
  • Experience using source control and management tools such as Azure DevOps and GitLab/GitHub.
  • Experience transforming requirements into design concepts and ERDs using Visio or similar tools.
  • 5+ years of hands-on experience with cloud orchestration and automation tools and CI/CD pipeline creation.
  • Bachelor's degree in a quantitative discipline such as Computer Science, Statistics, Mathematics or Engineering from an accredited institution required.
Preferred Skills / Qualifications:
  • Strong knowledge and understanding in the following areas:
    - DevOps practices
    - Python or Java; JSON (HL7/FHIR is a plus)
    - Applicable data privacy practices and laws
    - Common SDLC models (Waterfall, Agile - Scrum and Kanban)
    - Non-relational database (NoSQL) designs using MongoDB and others
    - Relational databases like MS SQL Server
    - Fluency with at least one scripting or programming language such as Python.
    - Message queuing, stream processing, and highly scalable 'big data' data stores.
After you apply, you may receive a call or message from our AI Talent Scout about this role or other opportunities that match your skills and preferences. The AI agent's role is to help speed up your hiring process by answering questions, confirming basic information, and identifying whether there's a mutual fit.
The call or chat may be recorded so that our recruiting team can review it - they make all final hiring decisions, while the AI agent simply helps move you forward faster. The best part? It is available 24/7, so you can connect whenever it's convenient for you.
Aleron companies (Acara Solutions, Aleron Shared Resources, Broadleaf Results, Lume Strategies, TalentRise, Viaduct) are an Equal Opportunity Employer. Race/Color/Gender/Religion/National Origin/Disability/Veteran.
Applicants for this position must be legally authorized to work in the United States. This position does not meet the employment requirements for individuals with F-1 OPT STEM work authorization status.
