Hiring: W2 Candidates Only
Visa: Open to any visa type with valid work authorization in the USA
We are seeking a skilled Data Engineer to design, build, and maintain scalable data pipelines and infrastructure. The ideal candidate will work closely with data scientists, analysts, and software engineers to ensure reliable, high-quality data is available for analytics and decision-making.
Key Responsibilities
- Design, develop, and maintain ETL/ELT data pipelines
- Build and optimize data warehouses and data lakes
- Ensure data quality, integrity, and reliability
- Process large-scale structured and unstructured datasets
- Collaborate with data analysts and data scientists to support analytics and ML use cases
- Optimize database performance and query efficiency
- Implement data security, governance, and compliance standards
- Monitor and troubleshoot data pipeline failures
- Automate data workflows and improve system scalability
Required Skills & Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field
- Strong experience with SQL and relational databases
- Proficiency in Python, Java, or Scala
- Experience with ETL tools (Airflow, Informatica, Talend, etc.)
- Knowledge of Big Data technologies (Spark, Hadoop, Kafka)
- Experience with cloud platforms (AWS, Azure, or GCP)
- Understanding of data modeling and schema design
Preferred Skills
- Experience with data warehousing (Snowflake, Redshift, BigQuery)
- Knowledge of DevOps & CI/CD pipelines
- Familiarity with machine learning data pipelines
- Experience with NoSQL databases