Description: We're driven by curiosity, passion, optimism, and the belief that everybody can grow.
Your contributions will be pivotal in shaping the future of cybersecurity, fostering a culture of excellence, and ensuring the integrity and security of our data infrastructure.
As a Data Engineer at JPMorgan Chase within Cybersecurity and Technology Controls, you'll leverage your skills to develop and implement robust data solutions using cutting-edge technologies. You'll play a critical role in analyzing complex data structures, ensuring data accuracy, and enhancing our security analytics capabilities. You'll collaborate with cross-functional teams to drive innovation, implement scalable solutions, and protect our digital assets.
Job Responsibilities:
- Develop and implement processes and procedures to identify, monitor, and mitigate data risks within the product.
- Design and implement complex, scalable solutions to efficiently process data, ensuring consistent and timely delivery and availability.
- Focus on building robust systems that can handle large volumes of data with minimal downtime.
- Develop solutions using Agile and DevOps methodologies and continuous integration/continuous deployment (CI/CD) practices on public cloud platforms.
- Troubleshoot and resolve complex issues related to data architecture, including data ingestion, indexing, and search performance.
- Create reusable frameworks with a strong emphasis on quality and long-term sustainability.
- Perform root cause analysis on data to answer specific business questions or issues.
- Collaborate with key partners to enhance understanding of data usage within the business.
- Serve as a subject matter expert on the content and application of data in the product and related business areas.
- Document and enforce requirements for data accuracy completeness and timeliness within the product.
- Integrate data from multiple sources, including structured, semi-structured, and unstructured data, and implement data quality checks and validation processes to ensure the accuracy, completeness, and consistency of the data.
Required Qualifications, Capabilities, and Skills:
- Formal training or certification in SQL concepts and 3 years of applied experience with data transformation tools such as DBT.
- Proficient in database management with experience in both relational databases (SQL) and NoSQL databases.
- Minimum 5 years of experience with Python and SQL.
- Expertise in Python and Java development.
- Extensive experience with Big Data technologies, including Spark, Hive, Redshift, Kafka, and others.
- Excellent understanding of ETL/ELT frameworks and tools, including DBT, Apache Airflow, Trino, Kestra, or similar technologies.
- Hands-on experience building data pipelines and ETL/ELT processes using Python and SQL.
- Experience with Kafka data streaming or other streaming/messaging services such as Kinesis, SNS, or SQS.
- Demonstrated experience developing, debugging, and tuning complex SQL statements.
- Experience working on real-time and streaming applications, with a solid grasp of agile methodologies, including CI/CD, application resiliency, and security best practices.
- Exceptional understanding of distributed systems and cloud platforms, with expertise in performing data transformations to enhance data quality and usability.
Preferred Qualifications, Capabilities, and Skills:
- Relevant industry experience, preferably in a data engineering role focused on threat detection and security analytics.
- Experience with advanced data streaming and transformation tools.
- Experience with Kubernetes for container orchestration is a plus.
- A team player who works collaboratively with team members.