Overview:
TekWissen is a global workforce management provider headquartered in Ann Arbor, Michigan, that offers strategic talent solutions to our clients worldwide. Our client is a trusted engineering, construction, and project management partner to industry and government. Differentiated by the quality of their people and their relentless drive to deliver the most successful outcomes, they align their capabilities to their customers' objectives to create a lasting positive impact. Since 1898, they have helped customers complete more than 25,000 projects in 160 countries on all seven continents, projects that have created jobs, grown economies, improved the resiliency of the world's infrastructure, increased access to energy, resources, and vital services, and made the world a safer, cleaner place.
Job Title: Data Engineer
Location: Glendale, AZ 85301
Duration: 12 Months
Job Type: Temporary Assignment
Work Type: Hybrid
JOB DESCRIPTION:
Who you are:
- You yearn to be part of groundbreaking projects and cutting-edge research that work to deliver world-class solutions on schedule
- You are motivated to find opportunity in, and develop solutions for, evolving challenges; you are passionate about your craft and driven to deliver exceptional results
- You love to learn new technologies and mentor junior engineers to raise the bar on your team
- You are imaginative and enthusiastic about intuitive user interfaces as well as new and emerging concepts and techniques
Responsibilities:
- Big data design and analysis; data modeling; and the development, deployment, and operation of big data pipelines
- Collaborate with a team of other data engineers, data scientists, and business subject matter experts to process data and prepare data sources for a variety of use cases, including predictive analytics, generative AI, and computer vision
- Mentor other data engineers to develop a world-class data engineering team
- Ingest, process, and model data from structured, unstructured, batch, and real-time sources using the latest techniques and technology stack
Basic Qualifications:
- Bachelor's degree or higher in Computer Science or an equivalent degree, plus 5 years of working experience
- In-depth experience with a big data cloud platform such as Azure, AWS, Snowflake, Palantir, etc.
- Strong grasp of programming languages and tools (Python, Scala, SQL, pandas, PySpark, or equivalent) and a willingness to learn new ones; strong understanding of structuring code for testability
- Experience writing database-heavy services or APIs
- Strong hands-on experience building and optimizing scalable data pipelines, complex transformations, architecture, and data sets with Databricks or Spark, Azure Data Factory, and/or Palantir Foundry for data ingestion and processing
- Proficient in distributed computing frameworks, with familiarity in handling drivers, executors, and data partitions in Hadoop or Spark
- Working knowledge of queueing, stream processing, and highly scalable data stores such as Hadoop, Delta Lake, Azure Data Lake Storage (ADLS), etc.
- Deep understanding of data governance, access control, and secure view implementation
- Experience in workflow orchestration and monitoring
- Experience working with and supporting cross-functional teams
Preferred Qualifications:
- Experience with schema evolution, data versioning, and Delta Lake optimization
- Exposure to data cataloging solutions in Foundry Ontology
- Professional experience implementing complex ML architectures in popular frameworks such as TensorFlow, Keras, PyTorch, scikit-learn, and CNTK
- Professional experience implementing and maintaining MLOps pipelines in MLflow or AzureML
TekWissen Group is an equal opportunity employer supporting workforce diversity.