TITLE: Programmer Analyst (US Citizenship or Green Card required)
RATE RANGE: $60/hr - $74/hr W2 (no health benefits while on contract; an hour worked is an hour paid)
LOCATION: Remote or Falls Church, VA
DURATION: 6 months
SCHEDULE: 9/80 (every other Friday off)
***No C2C: we cannot work with outside agencies/vendors, and we cannot do 1099. US CITIZENSHIP or a Green Card IS REQUIRED.***
KEY RESPONSIBILITIES
Design and implement extract, transform, load (ETL) pipelines to ingest and store large datasets from various sources (a minimal sketch of such a pipeline follows this list)
Build and maintain data warehouses, including data modeling, data governance, and data quality
Ensure data quality, integrity, and security by implementing data validation, data cleansing, and data governance policies
Optimize data systems for performance, scalability, and reliability
Collaborate with customers to understand their technical requirements and provide guidance on best practices for using Amazon Redshift
Work with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver data solutions
Provide technical support for Amazon Redshift, including troubleshooting, performance optimization, and data modeling
Identify and resolve data-related issues, including data pipeline failures, data quality issues, and performance bottlenecks
Develop technical documentation and knowledge base articles to help customers and AWS engineers troubleshoot common issues
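For illustration only, here is a minimal sketch of the kind of AWS Glue ETL job described above. It reads a cataloged source, applies a basic cleansing step, and loads the result into Redshift; every database, table, connection, and S3 name below is a placeholder assumption, not a detail from this posting.

    import sys
    from awsglue.transforms import Filter
    from awsglue.utils import getResolvedOptions
    from awsglue.context import GlueContext
    from awsglue.job import Job
    from pyspark.context import SparkContext

    # Standard Glue job setup
    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext())
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Ingest: read raw data registered in the Glue Data Catalog (placeholder names)
    raw = glue_context.create_dynamic_frame.from_catalog(
        database="example_db", table_name="raw_orders"
    )

    # Transform: drop records missing the primary key
    cleaned = Filter.apply(frame=raw, f=lambda row: row["order_id"] is not None)

    # Load: write into Redshift via a preconfigured Glue connection (placeholder)
    glue_context.write_dynamic_frame.from_jdbc_conf(
        frame=cleaned,
        catalog_connection="example-redshift-connection",
        connection_options={"dbtable": "public.orders", "database": "dev"},
        redshift_tmp_dir="s3://example-bucket/temp/",
    )

    job.commit()

In production, a job like this would typically be parameterized, covered by test cases, and scheduled through Glue workflows or Apache Airflow, as the requirements below note.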
REQUIREMENTS
Bachelor's or Master's degree in Computer Science or a related field, with at least 6 years of experience in Information Technology
Proficiency in one or more programming languages (e.g., Python, Java, Scala)
8 years of experience in data engineering, with a focus on designing and implementing large-scale data systems
5 years of hands-on experience writing complex, highly optimized queries across large data sets using Oracle, SQL Server, and Redshift
5 years of hands-on experience using AWS Glue with Python/PySpark to build ETL pipelines in a production setting, including writing test cases
Strong understanding of database design principles, data modeling, and data governance
Proficiency in SQL, including query optimization, indexing, and performance tuning
Experience with data warehousing concepts, including star and snowflake schemas
Strong analytical and problem-solving skills, with the ability to break down complex problems into manageable components
Experience with data storage solutions such as relational databases (Oracle, SQL Server), NoSQL databases, or cloud-based data warehouses (Redshift)
Experience with data processing frameworks and tools such as Apache Kafka and Fivetran
Experience building ETL pipelines using AWS Glue, Apache Airflow, and programming languages including Python and PySpark
Understanding of data quality and governance principles and best practices (a sketch of a basic data quality check follows this list)
Experience with agile development methodologies such as Scrum or Kanban
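Similarly, the data quality work listed above often takes the form of automated batch checks. The PySpark sketch below assumes a hypothetical orders dataset (the path and column names are invented for illustration) and fails the run if a validation rule is violated.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("dq-checks").getOrCreate()

    # Load a curated dataset (placeholder path)
    orders = spark.read.parquet("s3://example-bucket/curated/orders/")

    # Rule 1: the primary key must be present and unique
    null_keys = orders.filter(F.col("order_id").isNull()).count()
    dup_keys = orders.groupBy("order_id").count().filter(F.col("count") > 1).count()

    # Rule 2: monetary amounts must be non-negative
    negative_totals = orders.filter(F.col("order_total") < 0).count()

    # Fail loudly rather than let bad data flow downstream
    violations = {
        "null_keys": null_keys,
        "duplicate_keys": dup_keys,
        "negative_totals": negative_totals,
    }
    failed = {name: count for name, count in violations.items() if count > 0}
    if failed:
        raise ValueError(f"Data quality checks failed: {failed}")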
PREFERRED SKILLS
Experience with Dataiku
Experience building reports using Power BI and Tableau
Experience with Alteryx
Relevant cloud certifications (e.g., AWS Certified Data Analytics - Specialty)
Experience with AWS services and best practices
EDUCATION PREFERRED
Bachelor's or Master's degree in Computer Science or a related field, with at least 8 years of experience in Information Technology
If you would like to interview for this position, please send an updated resume to Dee Smith.
GeoLogics is an Equal Opportunity/Affirmative Action Employer that is committed to hiring a diverse and talented workforce. EOE/Disability/Veteran