Our Digital Modernization and Experience (DMX) Group is growing, and we are looking for a motivated, experienced Senior Databricks SME who is passionate about turning complex data into actionable solutions that improve public systems and services. This role supports an enterprise initiative focused on platform infrastructure and analytics modernization for a federal customer.
You'll be joining a cross-functional team of full stack developers, data engineers, and data analysts working within a modular, cloud-native platform supporting the emergency management sector. Your work will help ensure disaster management and mitigation decision-makers have access to accurate, timely, and meaningful data and data products to drive effective service delivery and measurable mission outcomes.
If you thrive in a collaborative environment and enjoy working independently to solve real-world challenges through data, we want to hear from you.
Job Location: This position is fully remote, with up to 10% travel to the DC metropolitan area for client meetings. Candidates must be able to interview in person in Raleigh/Durham, NC; Reston, VA; or Atlanta, GA.
What you'll be doing:
Enable secure, scalable, and efficient data exchange between the federal client and external data sharing partners using Databricks Delta Sharing (see the Delta Sharing sketch after this list).
Support the design and development of data pipelines and ETL routines in an Azure cloud environment for many source system types, including RDBMS, API, and unstructured data, using CDC, incremental, and batch loading techniques (see the incremental-load sketch after this list).
Conduct data profiling, transformation, and quality assurance on structured, semi-structured, and unstructured data.
Identify underlying issues and translate them into technical requirements.
Assist in building and optimizing data lakes, feature stores, and data warehouse structures to support analytics and machine learning.
Prepare, structure, and validate data for data science and MLOps workflows, ensuring it meets the quality and format requirements for modeling.
Help monitor and maintain the flow of data across BI dashboards, analytics environments, and machine learning pipelines.
Engage directly with clients and stakeholders to understand data needs and translate them into scalable solutions.
Collaborate with UX designers, business analysts, developers, and end users to define data and reporting requirements.
Work with external data partners to determine their data product needs, and work within the Databricks platform to enable rapid prototyping and extensible use cases.
Meet with government employees at executive levels, platform stakeholders, and vendor partners.
Work within Agile teams to support iterative development, backlog grooming, and sprint-based delivery.
Provide mentorship to junior team members.
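To give a concrete flavor of the Delta Sharing work described above, here is a minimal sketch of consuming a shared Delta table with the open-source delta-sharing Python client. The profile path and the share, schema, and table names are hypothetical placeholders, not the customer's actual assets.

```python
# Minimal Delta Sharing consumer sketch (hypothetical names throughout).
import delta_sharing

# A .share profile file issued by the data provider (hypothetical path).
profile = "/secrets/partner.share"

# Fully qualified as <share>.<schema>.<table> (all hypothetical).
table_url = f"{profile}#disaster_share.public.incident_reports"

# Load the shared table into pandas for lightweight inspection;
# delta_sharing.load_as_spark is the Spark-native alternative.
df = delta_sharing.load_as_pandas(table_url)
print(df.head())
```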
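Likewise, the CDC and incremental loading techniques mentioned above commonly reduce to a Delta Lake MERGE on Databricks. Below is a minimal sketch under assumed table names (staging.incident_changes, bronze.incidents) and an assumed updated_at watermark column; first-run bootstrapping and error handling are omitted for brevity.

```python
# Incremental (CDC-style) upsert into a Delta table; names are hypothetical.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# High-water mark from the previous load (assumes the target is non-empty).
last_load = spark.sql(
    "SELECT max(updated_at) AS ts FROM bronze.incidents"
).first()["ts"]

# Pull only the rows changed since the last load.
changes = (
    spark.read.table("staging.incident_changes")
         .where(f"updated_at > '{last_load}'")
)

# Upsert the change set into the target table, keyed on incident_id.
target = DeltaTable.forName(spark, "bronze.incidents")
(target.alias("t")
       .merge(changes.alias("s"), "t.incident_id = s.incident_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())
```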
What you must have:
Bachelor's degree in Computer Science, Information Systems, Data Analytics, or a related discipline.
Minimum 5 years in data engineering, data security practices, data platforms, and analytics.
3 years of Databricks platform expertise at SME-level proficiency, including:
Databricks Delta Lake and Delta Sharing
Deep experience with distributed computing using Apache Spark
Knowledge of Spark runtime internals and optimization (see the tuning sketch after this list)
Ability to design and deploy performant end-to-end data architectures
4 years of ETL pipeline development, building robust, scalable data pipelines
Candidate must be able to obtain and maintain a Public Trust clearance
Candidate must reside in the U.S., be authorized to work in the U.S., and perform all work in the U.S.
Candidate must have lived in the U.S. for three (3) full years out of the last five (5) years
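As one example of the Spark runtime optimization expertise called for above, the sketch below inspects a join's physical plan and applies a broadcast hint, a common fix for shuffle-heavy joins. The table names are hypothetical.

```python
# Broadcast-join tuning sketch; table names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.getOrCreate()

facts = spark.read.table("silver.shipments")   # large fact table
dims = spark.read.table("silver.regions")      # small dimension table

# Broadcasting the small side replaces a shuffle-heavy sort-merge join
# with a BroadcastHashJoin.
joined = facts.join(broadcast(dims), "region_id")

# explain() prints the physical plan so the join strategy can be verified.
joined.explain()
```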
Technologies you'll use:
Databricks on Azure for data engineering and ML pipeline support.
SQL, Python, Spark, Tableau.
Git and CI/CD tools (e.g., Jenkins, CodeBuild).
Jira, Confluence, SharePoint.
Mural, Miro, or other collaboration/whiteboarding tools.
What we'd like you to have:
Databricks certifications (Professional or specialty).
Hands-on experience with Azure services such as Synapse, Data Factory, or Databricks.
Familiarity with data visualization tools such as Tableau, Power BI, or similar.
Machine learning and analytical skills, including:
MLOps - working knowledge of ML deployment and operations (see the sketch after this list)
Data science methodologies - statistical analysis, modeling, and interpretation
Big data technologies - experience beyond Spark with distributed systems
Experience with deployment pipelines, including Git-based version control, CI/CD pipelines, and DevOps practices using Terraform for infrastructure as code (IaC).
Emergency management domain knowledge is a plus.
Advanced proficiency in data engineering and analytics using Python; expert-level SQL skills for data manipulation and analysis; experience with Scala preferred but not required (Python expertise can substitute).
Proven experience breaking down complex ideas into manageable components.
Demonstrable experience developing rapid POCs and prototypes.
History of staying current with evolving data technologies and methodologies.
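To make the MLOps item above concrete, here is a minimal sketch of tracking a model run with MLflow, which ships with the Databricks runtime. The experiment path, model, and data are hypothetical illustrations, not the project's actual workflow.

```python
# Minimal MLflow tracking sketch; experiment path and data are hypothetical.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in data for illustration only.
X, y = make_classification(n_samples=200, n_features=5, random_state=0)

mlflow.set_experiment("/Shared/incident-triage")  # hypothetical workspace path
with mlflow.start_run():
    model = LogisticRegression(max_iter=200).fit(X, y)
    mlflow.log_param("max_iter", 200)
    mlflow.log_metric("train_accuracy", model.score(X, y))
    mlflow.sklearn.log_model(model, "model")  # logs the model artifact to the run
```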
Professional skills:
Strong analytical thinking, attention to detail, and willingness to learn new tools and technologies.
Consulting experience, with the ability to work directly with clients and executive-level stakeholders and to manage conflicts.
Why you'll love working here:
Generous vacation and retirement plans.
Comprehensive health benefits.
Flexible work environment.
Ongoing training and development opportunities.
Inclusive and collaborative culture.
Meaningful work that impacts communities.
Working at ICF
ICF is a global advisory and technology services provider, but we're not your typical consultants. We combine unmatched expertise with cutting-edge technology to help clients solve their most complex challenges, navigate change, and shape the future. We can only solve the world's toughest challenges by building a workplace that allows everyone to thrive. We are an equal opportunity employer. Together, our employees are empowered to share their expertise and collaborate with others to achieve personal and professional goals. For more information, please read our EEO policy.
We will consider for employment qualified applicants with arrest and conviction records.
Reasonable accommodations are available, including but not limited to disabled veterans, individuals with disabilities, and individuals with sincerely held religious beliefs, in all phases of the application and employment process. To request an accommodation, please email us and we will be happy to assist. All information you provide will be kept confidential and will be used only to the extent necessary to provide needed reasonable accommodations.
Read more about workplace discrimination rights or our benefit offerings, which are included in the Transparency in (Benefits) Coverage Act.
At ICF, we are committed to ensuring a fair interview process for all candidates based on their own skills and knowledge. As part of this commitment, the use of artificial intelligence (AI) tools to generate or assist with responses during interviews (whether in-person or virtual) is not permitted. This policy is in place to maintain the integrity and authenticity of the interview process.
However, we understand that some candidates may require an accommodation that involves the use of AI. If such an accommodation is needed, candidates are instructed to contact us in advance. We are dedicated to providing the necessary support to ensure that all candidates have an equal opportunity to succeed.
Pay Range - There are multiple factors considered in determining final pay for a position, including but not limited to relevant work experience, skills, certifications and competencies that align to the specified role, geographic location, education and certifications, as well as contract provisions regarding labor categories that are specific to the position.
The pay range for this position based on full-time employment is:
$98,124.00 - $201,840.00
Virginia Remote Office (VA99), Full-Time