GCP Data Engineer
Submissions must include a LinkedIn profile. I will NOT screen candidates if the dates don't match their LinkedIn or if a different resume for them exists in our database. Genuine visa status only.
The interview WILL be onsite in Charlotte or Hartford; the role is fully remote thereafter!
Duration: 6-month contract-to-hire
Conversion salary: $140k-$175k
Required Skills & Experience
8 years of experience across data engineering, utilizing SQL and NoSQL data solutions, Snowflake, ELT/ETL tools, CI/CD, big data, GCP, Python/Spark, and data mesh, data lake, and data fabric architectures
1 year of hands-on experience supporting generative AI (Gen AI) initiatives in a data environment, ideally with an insurance background
Experience building within a public cloud environment, ideally GCP, but AWS or Azure is acceptable
Strong programming skills in Python and familiarity with deep learning frameworks such as PyTorch or TensorFlow
Nice to Have Skills & Experience
AI certifications
GCP/Azure/AWS certifications
Experience in Property & Casualty or Employee Benefits industry
Knowledge of natural language processing (NLP) and computer vision technologies
Job Description
Responsible for implementing data pipelines that bring together structured, semi-structured, and unstructured data to support AI and agentic solutions; this includes pre-processing with extraction, chunking, embedding, and grounding strategies to get the data ready
Develop GCP-driven systems to improve data capabilities, ensuring compliance with industry best practices for insurance-specific data use cases and challenges
Develop data domains and data products for various consumption archetypes, including Reporting, Data Science, AI/ML, Analytics, etc.
Partner with architects and stakeholders to shape and implement the vision for the AI and data pipelines while safeguarding the integrity and scalability of the GCP environment
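For candidates unfamiliar with the pre-processing steps named above (extraction, chunking, embedding, grounding), the sketch below illustrates just the chunking stage in plain Python. It is a minimal, hypothetical example; real pipelines would typically use a managed splitter/embedding service (e.g., on Vertex AI), and the chunk size and overlap values here are illustrative assumptions, not requirements from this posting.

```python
def chunk_text(text: str, chunk_size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into overlapping character windows, the typical shape of
    the 'chunking' step that precedes embedding in a RAG-style pipeline.

    chunk_size and overlap are hypothetical defaults for illustration.
    """
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    step = chunk_size - overlap  # advance so consecutive chunks share `overlap` chars
    for start in range(0, len(text), step):
        chunks.append(text[start:start + chunk_size])
        if start + chunk_size >= len(text):
            break  # this chunk already reaches the end of the text
    return chunks


if __name__ == "__main__":
    # A toy insurance-flavored document (contents are made up).
    doc = "Policyholder filed a claim after storm damage to the roof. " * 10
    pieces = chunk_text(doc)
    print(f"{len(pieces)} chunks, first chunk {len(pieces[0])} chars")
```

Each chunk would then be passed to an embedding model and stored in a vector index; the overlap keeps sentences that straddle a chunk boundary retrievable from at least one chunk.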