Job Title: Data Engineer
Experience: 8 Years
Location: California
Job Type: C2C / W2
Visa Eligibility: H1B, GC, USC, and H4 candidates can apply
About the Job
We are seeking a skilled and proactive Data Engineer to join our team and contribute to the design, development, and maintenance of scalable data pipelines. In this role, you will play a key part in transforming raw data into actionable insights, supporting analytics, business intelligence, and AI/ML initiatives across the organization. You will collaborate with cross-functional teams to ensure data reliability, efficiency, and quality while driving process improvements and automation.
Duties to Include
Build, maintain, and optimize data pipelines and ETL/ELT workflows across cloud and on-premises environments.
Transform data from multiple structured and unstructured sources into usable formats for analytics and reporting.
Develop, maintain, and optimize relational and non-relational data models, schemas, and storage solutions.
Ensure high-quality, reliable, and secure data pipelines by implementing data validation, profiling, and monitoring.
Collaborate with data scientists, analysts, and business stakeholders to understand requirements and deliver solutions that meet business needs.
Design and implement automation and CI/CD processes for data workflows, integrating tools like Airflow, Jenkins, or similar orchestration platforms.
Identify opportunities to optimize performance, reduce latency, and improve the operational efficiency of data pipelines.
Document and test data solutions, ensuring accuracy, compliance, and maintainability.
Stay current with emerging technologies, cloud platforms, and data engineering best practices.
Requirements
8 years of experience in data engineering, data integration, or similar roles.
Strong experience with cloud platforms such as AWS, Azure, or GCP, including cloud-native data services.
Hands-on expertise in Python, SQL, Spark, Hadoop, Kafka, and ETL/ELT tools.
Experience with both relational and NoSQL databases (MySQL, PostgreSQL, MongoDB, Cassandra, etc.).
Proficiency in building scalable, high-performance data pipelines and implementing data governance and security practices.
Familiarity with CI/CD, automation, and containerization (Docker, Kubernetes) in data workflows.
Strong collaboration and communication skills, with the ability to explain technical concepts to both technical and non-technical stakeholders.
Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
Required Skills:
8 years of experience in data engineering with strong hands-on skills in Python, SQL, Spark, Hadoop, Kafka, ETL/ELT tools, and cloud platforms (AWS/Azure/GCP).