Title: Data Engineer
Location: Menlo Park, CA
Duration: 12 months
Hybrid: 3 days a week onsite
Open to either the Alpharetta, GA or Menlo Park, CA location. Please note at the top of the resume which location the candidate is being considered for. Candidates must be local to either location.
The real-time operations intelligence team in Brokerage Enterprise Computing is responsible for streaming terabytes of data daily. We have built job frameworks to run large-scale ETL pipelines with Kafka, Elasticsearch (ELK), Snowflake, and Hadoop. Our applications run both on-prem and in the cloud. Hundreds of dashboards built for business and operations teams provide real-time insight and actionable items.
We are looking for a streaming data engineer who can:
- Understand distributed systems architecture, design, and trade-offs.
- Design and develop ETL pipelines with a wide range of technologies.
- Work across the full development cycle, including requirements definition, design, implementation, testing, and deployment.
- Communicate well and collaborate with various teams.
- Learn new technologies and work independently.
Requirements:
- 5 years of application development experience, including at least 2 years of data engineering with Kafka
- Working experience writing and running applications on Linux
- 5 years of coding experience with at least one of the following languages: Python, Ruby, Java, C/C++, Go
- SQL and database experience
Optional:
- AWS or other cloud technologies
- Elasticsearch (ELK stack)
Best regards,
Scott Solanki
Direct -
Email -