Data Engineer

Employer Active

Job Location

Fort Washington - USA

Salary

$110,400 - $160,300

Vacancy

1 Vacancy

Job Description

At Accenture Federal Services, nothing matters more than helping the US federal government make the nation stronger and safer and life better for people. Our 13,000 people are united in a shared purpose to pursue the limitless potential of technology and ingenuity for clients across defense, national security, public safety, civilian, and military health organizations.

Join Accenture Federal Services, a technology company and part of global Accenture, to do work that matters in a collaborative and caring community where you feel like you belong and are empowered to grow, learn, and thrive through hands-on experience, certifications, industry training, and more.

Join us to drive positive lasting change that moves missions and the government forward!

Job Description

Accenture Federal Services is searching for a Data Engineer to design, develop, and maintain data solutions for data generation, collection, and processing. This role will help create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.

Responsibilities:

Data Pipeline Development & Management

  • Design and implement data pipelines utilizing AWS S3, Apache NiFi, and the ELK Stack (see the ingestion sketch after this list)
  • Create automated data ingestion workflows with API integration
  • Implement NLP-based data transformation and conditioning processes
  • Ensure pipeline monitoring, maintenance, and optimization
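
To make the pipeline items above concrete, here is a minimal illustrative sketch of one ingestion step in Python: pull records from a REST API and land them in S3. The endpoint, bucket name, and helper function are hypothetical placeholders, and the sketch assumes the requests and boto3 libraries with AWS credentials available in the environment; it is not a prescribed design for this role.

# Minimal sketch of one pipeline step: pull records from an API and land them in S3.
# All names (API_URL, BUCKET) are hypothetical placeholders; assumes the requests and
# boto3 libraries and valid AWS credentials in the environment.
import json
from datetime import datetime, timezone

import boto3
import requests

API_URL = "https://example.com/api/records"   # hypothetical source endpoint
BUCKET = "example-raw-data-bucket"            # hypothetical S3 landing bucket

def ingest_once() -> str:
    """Fetch one batch from the API and write it to S3 as a timestamped JSON object."""
    response = requests.get(API_URL, timeout=30)
    response.raise_for_status()
    records = response.json()

    key = f"raw/records/{datetime.now(timezone.utc):%Y/%m/%d/%H%M%S}.json"
    boto3.client("s3").put_object(
        Bucket=BUCKET,
        Key=key,
        Body=json.dumps(records).encode("utf-8"),
        ContentType="application/json",
    )
    return key

if __name__ == "__main__":
    print("wrote", ingest_once())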

Data Architecture & Storage

  • Design and implement databases based on documented data models
  • Lead data storage solutions and architecture decisions
  • Establish data cleansing and normalization protocols
  • Integrate multiple data sources into consistent machine-readable formats (see the normalization sketch after this list)
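
As an illustration of integrating sources into a consistent machine-readable format, the following sketch normalizes two hypothetical extracts to one canonical schema with pandas. All column names, values, and the normalize helper are made-up examples for illustration only.

# Minimal sketch of integrating two differently shaped sources into one consistent,
# machine-readable schema. Column names and sources are hypothetical; assumes pandas.
import pandas as pd

# Two hypothetical source extracts with inconsistent field names and date formats.
source_a = pd.DataFrame({"CustID": [1, 2], "Signup Date": ["2024-01-05", "2024-02-11"]})
source_b = pd.DataFrame({"customer_id": [3], "signup": ["03/04/2024"]})

CANONICAL = ["customer_id", "signup_date"]

def normalize(df: pd.DataFrame, mapping: dict) -> pd.DataFrame:
    """Rename columns to the canonical schema and coerce dates to a uniform type."""
    out = df.rename(columns=mapping)[CANONICAL].copy()
    out["signup_date"] = pd.to_datetime(out["signup_date"]).dt.date
    return out

combined = pd.concat(
    [
        normalize(source_a, {"CustID": "customer_id", "Signup Date": "signup_date"}),
        normalize(source_b, {"customer_id": "customer_id", "signup": "signup_date"}),
    ],
    ignore_index=True,
)
print(combined)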

Data Quality & Governance

  • Implement methodologies for improving data reliability and quality (see the quality-check sketch after this list)
  • Manage data categorization, labeling, and retention policies
  • Enforce data governance standards across projects
  • Develop and maintain data quality monitoring systems
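
Below is a minimal sketch of the kind of automated data-quality check this work involves, assuming pandas; the rule set (null, duplicate, and allowed-value checks) and the field names are hypothetical examples, not rules from this posting.

# Minimal sketch of automated data-quality checks (nulls, duplicates, allowed values).
# The rules and field names are hypothetical examples; assumes pandas.
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable data-quality violations found in the frame."""
    problems = []
    if df["record_id"].isna().any():
        problems.append("null record_id values found")
    if df["record_id"].duplicated().any():
        problems.append("duplicate record_id values found")
    bad_status = set(df["status"]) - {"active", "inactive"}
    if bad_status:
        problems.append(f"unexpected status values: {sorted(bad_status)}")
    return problems

if __name__ == "__main__":
    sample = pd.DataFrame({"record_id": [1, 2, 2], "status": ["active", "ACTIVE", "inactive"]})
    for issue in run_quality_checks(sample):
        print("QUALITY ISSUE:", issue)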

Visualization & Analytics

  • Lead dashboard development and customized visualization initiatives
  • Create and maintain reporting solutions
  • Provide technical guidance for program study areas

Technical Infrastructure & Tools

  • Proficiency in version control and CI/CD practices for data pipelines
  • Knowledge of stream processing frameworks (Apache Kafka, Kinesis); see the consumer sketch after this list
  • Experience with data warehousing solutions (Snowflake, Redshift)
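
For the stream processing item above, here is a small illustrative consumer using the kafka-python client: it reads JSON events from a topic and keeps a running count per event type. The topic name, broker address, and event schema are assumptions for the example only, not part of this role's stack.

# Minimal stream-processing sketch with the kafka-python client: consume JSON events
# from a topic and print a running count per event type. Topic, broker, and event
# schema are hypothetical.
import json
from collections import Counter

from kafka import KafkaConsumer   # pip install kafka-python

consumer = KafkaConsumer(
    "example-events",                         # hypothetical topic name
    bootstrap_servers="localhost:9092",       # hypothetical broker address
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

counts = Counter()
for message in consumer:
    event = message.value                     # already deserialized to a dict
    counts[event.get("event_type", "unknown")] += 1
    print(dict(counts))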

Programming & Development

  • Strong Python programming skills for data pipeline development
  • Advanced SQL knowledge for complex data transformations (see the SQL sketch after this list)
  • Familiarity with containerization (Docker, Kubernetes)
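
To illustrate the SQL transformation item, the sketch below runs an aggregation with the standard-library sqlite3 module so it is self-contained; in practice the same pattern would run through a warehouse driver (for example Snowflake or Redshift). Table and column names are hypothetical.

# Minimal sketch of a SQL-driven transformation using the standard-library sqlite3
# module. Table and column names are hypothetical examples.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (order_id INTEGER, customer TEXT, amount REAL, status TEXT);
    INSERT INTO raw_orders VALUES
        (1, 'acme', 120.0, 'complete'),
        (2, 'acme',  80.0, 'complete'),
        (3, 'globex', 50.0, 'cancelled');
""")

# Transformation: aggregate completed orders per customer into a reporting table.
conn.executescript("""
    CREATE TABLE customer_totals AS
    SELECT customer, COUNT(*) AS orders, SUM(amount) AS total_amount
    FROM raw_orders
    WHERE status = 'complete'
    GROUP BY customer;
""")

for row in conn.execute("SELECT * FROM customer_totals ORDER BY customer"):
    print(row)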

Data Testing & Monitoring

  • Set up monitoring and alerting for pipeline failures
  • Create data validation checks and reconciliation processes (see the reconciliation sketch after this list)
  • Develop automated testing for data transformations
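
As a sketch of a reconciliation check with alerting, the function below compares source and target row counts after a load and logs an alert when they drift beyond a tolerance. The counts, tolerance, and logging channel are placeholder assumptions.

# Minimal sketch of a post-load reconciliation check: compare source and target row
# counts and alert on drift. Counts, tolerance, and alert channel are hypothetical.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("reconciliation")

def reconcile(source_rows: int, target_rows: int, tolerance: float = 0.0) -> bool:
    """Return True when the target row count is within tolerance of the source count."""
    if source_rows == 0:
        log.warning("source returned zero rows; skipping reconciliation")
        return False
    drift = abs(source_rows - target_rows) / source_rows
    if drift > tolerance:
        # In a real pipeline this would page on-call or post to a monitoring channel.
        log.error("reconciliation failed: source=%d target=%d drift=%.2f%%",
                  source_rows, target_rows, drift * 100)
        return False
    log.info("reconciliation passed: %d rows", target_rows)
    return True

if __name__ == "__main__":
    reconcile(source_rows=10_000, target_rows=9_750)   # fails with the default zero tolerance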

Performance Optimization

  • Experience optimizing large-scale data processing jobs (see the chunked-processing sketch after this list)
  • Knowledge of query optimization techniques
  • Ability to troubleshoot performance bottlenecks
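
One common optimization for large-scale jobs is streaming data in chunks rather than loading it all into memory; the sketch below sums a column from an arbitrarily large CSV that way, assuming pandas. The file name and chunk size are hypothetical.

# Minimal sketch of chunked processing: stream a large CSV in fixed-size chunks
# instead of loading it into memory at once. File name and chunk size are hypothetical.
import pandas as pd

CHUNK_ROWS = 100_000   # tune to available memory and row width

def total_amount(path: str) -> float:
    """Sum an 'amount' column from an arbitrarily large CSV without loading it at once."""
    total = 0.0
    for chunk in pd.read_csv(path, usecols=["amount"], chunksize=CHUNK_ROWS):
        total += chunk["amount"].sum()
    return total

if __name__ == "__main__":
    print(total_amount("large_transactions.csv"))   # hypothetical input file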

Security & Compliance

  • Implement data security best practices
  • Understanding of security protocols for sensitive data

Basic Qualifications:

  • 3 years of experience creating data pipelines, ensuring data quality, and implementing ETL (extract, transform, and load) processes to migrate and deploy data across systems
  • Experience with any of the following technologies:
    • Python programming
    • AWS cloud
    • Apache NiFi
    • ELK Stack
    • Docker or Kubernetes

Preferred experience:

  • Bachelor's degree
  • Must obtain IAT Level II certification within 90 days of starting
  • Experience with AWS cloud services particularly S3
  • Proficiency in Apache NiFi and ELK Stack
  • Experience with API integration and data extraction
  • Knowledge of NLP and data transformation techniques
  • Understanding of data governance principles
  • Experience with Apache NiFi / Kafka
  • Experience in relational databases
  • Experience in NoSQL solutions
  • Proficiency in Python
  • Proficiency in Java
  • Experience with ETL pipeline development and maintenance
  • Experience with Elasticsearch, Logstash, and Kibana

As required by local law, Accenture Federal Services provides reasonable ranges of compensation for hired roles based on labor costs in the states of California, Colorado, Hawaii, Illinois, Maryland, Minnesota, New Jersey, New York, Washington, Vermont, and the District of Columbia. The base pay range for this position in these locations is shown below. Compensation for roles at Accenture Federal Services varies depending on a wide array of factors including but not limited to office location, role, skill set, and level of experience. Accenture Federal Services offers a wide variety of benefits. You can find more information on benefits here. We accept applications on an ongoing basis and there is no fixed deadline to apply.

The pay range for the states of California, Colorado, Hawaii, Illinois, Maryland, Minnesota, New Jersey, New York, Washington, Vermont, and the District of Columbia is:

$110,400 - $160,300 USD

Employment Type

Full Time
