Description
As a Data Engineer at JPMorgan Chase within the Cybersecurity and Technology Controls organization, you'll leverage your skills to develop and implement robust data solutions using cutting-edge technologies. You'll play a critical role in analyzing complex data structures, ensuring data accuracy, and enhancing our security analytics capabilities. You will collaborate with cross-functional teams to drive innovation, implement scalable solutions, and protect our digital assets. Your contributions will be pivotal in shaping the future of cybersecurity, fostering a culture of excellence, and ensuring the integrity and security of our data infrastructure.
Job Responsibilities:
- Design and implement complex, scalable solutions to efficiently process large volumes of data, ensuring consistent and timely delivery and availability.
- Troubleshoot and resolve complex issues related to data architecture, including data ingestion, indexing, and search performance.
- Create reusable frameworks with a strong emphasis on quality and long-term sustainability.
- Collaborate with key partners to enhance understanding of data usage within the business.
- Serve as a subject matter expert on the content and application of data in the product and related business areas.
- Integrate data from multiple sources, including structured, semi-structured, and unstructured data.
- Implement data quality checks and validation processes to ensure the accuracy, completeness, and consistency of the data.
- Analyze complex data structures from various security data sources and scale data engineering pipelines.
- Perform all Data Engineering job activities, including ELT project development, testing, and deployment activities.
- Document data engineering processes, workflows, and systems for reference and knowledge-sharing purposes.
- Add to team culture of diversity, opportunity, inclusion, and respect.
Required Qualifications, Capabilities, and Skills:
- Formal training or certification on SQL concepts and proficient applied experience.
- Proficient in database management with experience in both relational databases (SQL) and NoSQL databases.
- Experience with Python and SQL.
- Expertise in Python and Java development.
- Extensive experience with Big Data technologies, including Spark, Hive, Redshift, Kafka, and others.
- Strong understanding of ETL/ELT frameworks and tools, including DBT, Apache Airflow, Trino, Kestra, or similar technologies.
- Hands-on experience in data pipelines and ETL/ELT processes using Python and SQL.
- Experience with Kafka data streaming or other streaming/messaging services like Kinesis, SNS, or SQS.
- Demonstrated experience developing, debugging, and tuning complex SQL statements.
- Experience working on real-time and streaming applications.
- Exceptional understanding of distributed systems and cloud platforms.
- Skilled in working with data streaming platforms such as Apache Flink and Apache Kafka.
- Expertise in performing data transformations to enhance data quality and usability.
- Comprehensive knowledge of the Software Development Life Cycle.
- Solid grasp of agile methodologies, including CI/CD, application resiliency, and security best practices.
Preferred Qualifications, Capabilities, and Skills:
- Relevant industry experience, preferably in a data engineering role focused on threat detection and security analytics.
- Experience with advanced data streaming and transformation tools.
- Experience with Kubernetes for container orchestration is a plus.
- Experience onboarding datasets to Splunk, ensuring CIM compliance.
- Team player who works collaboratively with team members.