Why Us
NewRocket is the AI-first Elite ServiceNow Partner that activates real value on the Now Platform. As a trusted advisor to enterprise leaders, we combine industry expertise, human-centered design, and enterprise-grade AI to help organizations navigate change and scale. With two decades of experience guiding clients to realize the full potential of the ServiceNow AI Platform, we are one of the largest pure-play ServiceNow partners, uniquely focused on enabling enterprises to adopt AI they trust: AI that delivers lasting business value.
We #GoBeyondWorkflows to create new kinds of experiences for our customers.
Come join our Crew!
The Role
We are looking for a Data Engineer to join our AI Product Team. You will be responsible for working with large-scale datasets, developing robust data pipelines, and ensuring that data is cleaned, transformed, and made available for AI, Machine Learning, and Data Science applications.
We are #GoingBeyond. Come join our crew!
What You Will Be Doing
- Build and maintain scalable and reliable data pipelines to support various data science and machine learning initiatives.
- Ingest, clean, and process large volumes of structured and unstructured data from multiple sources.
- Implement data validation, cleansing, and transformation logic to ensure data quality and consistency.
- Work with cloud platforms including AWS (SageMaker, Lambda, S3), Azure, and Google Cloud (Vertex AI).
- Collaborate with the team to understand data needs and deliver high-quality datasets.
- Work with big data technologies such as Apache Spark and Snowflake for large-scale data processing and analytics.
- Design and optimize ETL pipelines for data quality management, transformation, and validation.
- Utilize SQL, MySQL, PostgreSQL, and MongoDB for database management and query optimization.
- May perform additional duties as assigned.
What You Bring Along
- 2-3 years of experience as a Data Engineer.
- Strong SQL skills and understanding of relational databases.
- Familiarity with data processing frameworks like Apache Spark, Pandas, or PySpark.
- Knowledge of data cleaning techniques and experience handling missing or inconsistent data.
- Proficiency in Python or another scripting language used in data workflows.
- Basic understanding of data modelling concepts and data warehousing.
- Strong problem-solving skills and attention to detail.
Nice to Have
- Experience with cloud platforms like AWS, GCP, or Azure.
- Experience with unstructured data.
- Exposure to tools like Airflow, DBT, or Kafka.
- Understanding of version control systems (e.g., Git).
- Familiarity with ML workflows and data preparation for ML models.
Education:
- Bachelor's or Master's degree in Computer Science, Information Technology, AI/ML, or a related field.
We Take Care of Our People
NewRocket is committed to a diverse and inclusive workplace. We value and celebrate diversity, believing that every employee matters and should be respected, and we are proud to be an equal opportunity workplace and affirmative action employer. We are committed to providing employment opportunity regardless of sex, race, creed, color, gender, religion, marital status, domestic partner status, age, national origin or ancestry, physical or mental disability, medical condition, sexual orientation, pregnancy, citizenship, or military or Veteran status. Individuals with disabilities who would like to request an accommodation, please contact .