Senior Data Engineer
Job Description
Are you a skilled Senior Data Engineer passionate about transforming complex data into actionable insights? Do you thrive on tackling complex data ecosystems with cutting-edge tools and technologies? Join ICF’s mission-driven team and make a real impact on critical healthcare initiatives for our CMS client.
About ICF
ICF is more than just a global advisory and technology services provider. We are a mission-driven organization powered by individuals dedicated to making a positive difference in the world and improving lives. Our culture values Embracing Difference, fostering an inclusive environment where diverse perspectives drive innovation and success. We combine unmatched expertise with innovative technology to solve our clients’ most complex challenges, navigate change, and shape a better future.
Meet the Team: Health Engineering Solutions (HES)
Our Health Engineering Solutions (HES) team partners closely with clients like CMS to define a clear vision for success and then bring it to life. We know impactful results require the right people working together on the most effective solutions. We are actively seeking a seasoned Senior Data Engineer to be a key contributor in achieving these vital outcomes.
The Opportunity
As a core member of the CMS Public Data Platform Team within ICF’s Health Engineering Solutions group, you will play a crucial role in advancing strategic initiatives for our Centers for Medicare & Medicaid Services (CMS) client. This position is instrumental in transforming intricate data landscapes into valuable insights that fuel innovation, enhance integration, and optimize operational efficiency across various content and data programs. If you are a technically adept and innovative thinker who enjoys leveraging AI/ML tools, low-code platforms, and custom code to extract intelligence from vast amounts of structured and unstructured data, this role is the perfect fit.
What You’ll Do (Key Responsibilities)
- Rapidly analyze and synthesize large-scale structured and unstructured datasets utilizing a combination of AI tools, low-code platforms, and custom code.
- Design and implement advanced data structures, including knowledge graphs, to effectively map relationships, hierarchies, and dependencies across disparate data sources.
- Perform comprehensive data acquisition activities, including web scraping, API integration, and ingesting data from external knowledge bases.
- Identify patterns, overlaps, and redundancies within datasets and resolve them to support robust data harmonization and enrichment efforts.
- Develop and maintain resilient and scalable data pipelines and workflows that underpin advanced analytics and machine learning initiatives.
- Collaborate effectively with data scientists, analysts, and domain experts to translate complex business requirements into data-driven solutions.
- Champion data quality, consistency, and data governance best practices throughout the entire data lifecycle.
What You’ll Need (Required Qualifications)
- Bachelor’s degree in Computer Science, Engineering, or a closely related technical field.
- 5+ years of experience in data engineering, data integration, or relevant technical roles.
- 3+ years of hands-on experience with Python.
- Minimum of 1 year of experience working with AI/ML tools and platforms (e.g., SageMaker, Azure ML, Vertex AI).
- Minimum of 1 year of experience with low-code/no-code platforms for data processing and automation.
- Minimum of 1 year of experience with knowledge graph construction, graph databases or RDF triple stores (e.g., Neo4j), and graph theory concepts.
- Solid understanding of data modeling, data warehousing, and ETL/ELT processes.
- Excellent verbal and written communication, along with strong collaboration skills.
- Ability to work effectively as a team player in a dynamic, fast-paced environment.
Bonus Points (Preferred Qualifications)
- Experience applying Generative AI and LLMs in data transformation and enrichment workflows.
- Proficiency in web scraping techniques, API development, and external data ingestion strategies.
- Familiarity with semantic technologies (e.g., SPARQL, OWL, RDF).
- Experience with data visualization and data storytelling.
- Proven ability to manage tasks independently and prioritize effectively in a fast-paced setting.
- Exceptional critical thinking and problem-solving abilities.
Location
This position is a Nationwide Remote Office role within the U.S.
Compensation
The pay range for this full-time position is $89,203.00 – $151,646.00. Please note that final compensation is determined by various factors, including experience, skills, certifications, location, education, and contract provisions.
Working at ICF
Join ICF and become part of a team that’s committed to making a difference. We are a global advisory and technology services provider, known for combining unmatched expertise with innovative technology to help our clients navigate complex challenges and shape a better future. We believe our greatest strength lies in our people and our diverse perspectives.
Equal Opportunity & Accessibility
ICF is an equal opportunity employer, committed to building an inclusive workplace where everyone can thrive. We consider all qualified applicants without regard to protected characteristics. We provide reasonable accommodations for applicants and employees with disabilities, for veterans, and for sincerely held religious beliefs. To request an accommodation, please email candidateaccommodation@icf.com.
Additional Information
Learn more about your workplace discrimination rights, the Pay Transparency Statement, and our benefit offerings under the Transparency in Coverage Act.
Candidate AI Usage Policy
ICF values the authenticity of our interview process. The use of AI tools to generate or assist with interview responses is not permitted, except as a pre-approved accommodation. If you require an accommodation involving AI, please contact candidateaccommodation@icf.com in advance.