Data Platform Engineer


Job Location: Santa Clarita, CA - USA

Monthly Salary: Not Disclosed
Posted on: 5 hours ago
Vacancies: 1 Vacancy

Job Summary

Job Description:

This position is for a Data Platform Engineer with data engineering knowledge on AWS (preferred) and GCP (optional), plus SQL and Databricks experience.

Managed File Transfer (MFT) - GoAnywhere MFT; other MFT tools (Control-M, etc.)

Workflow Orchestration - Apache Airflow; job scheduling, monitoring, dependencies.

Cloud Platforms - AWS (S3, CloudWatch, Lambda); GCP (GCS, Compute Engine, Cloud Functions)

Logging & Monitoring - Splunk (must-have); reviewing MFT logs / system logs; CloudWatch alerts

Scripting & Automation - Python, Shell/Bash; automation of pipelines & processes

Linux / Unix - File systems, permissions, troubleshooting skills

Data Transfer Protocols - SFTP / FTP / FTPS, HTTPS; encryption/decryption standards; regex; SMB & message queues

Operational Excellence - Incident management, ticketing systems, UAT & production validation, creating SOPs & runbooks.

Data Workflow Skills - File completeness validation, onboarding new workflows, hybrid data movement (on-prem to cloud)

Additional Skills (Strong Differentiators) - Advanced Splunk (SPL dashboards), custom Airflow DAG development, SQL for validation, cloud automation (Lambda / Cloud Functions).


Key Skills

  • Apache Hive
  • S3
  • Hadoop
  • Redshift
  • Spark
  • AWS
  • Apache Pig
  • NoSQL
  • Big Data
  • Data Warehouse
  • Kafka
  • Scala