Requirements for Resource: - Proficient in implementing and supporting both the functional and non-functional aspects of Flink with the DataStream API.
Flink Functional Requirements: - Expertise in Flink APIs (DataStream, process functions, etc.).
- Competence in state management (checkpoints and savepoints) with local storage.
- Configuration of connectors such as Event Hubs, Kafka, and MongoDB.
- Implementation of aggregations with the Flink APIs (an AggregateFunction sketch follows this list).
- Handling watermarks for out-of-order events (see the source/watermark sketch after this list).
- Management of state using Azure Data Lake Storage (ADLS); a checkpoint-storage sketch also follows this list.
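To illustrate the connector and watermark items above, here is a minimal sketch of a DataStream job consuming from an Event Hubs Kafka-compatible endpoint with bounded out-of-orderness. The namespace, topic, and consumer group are placeholders, and a real Event Hubs connection would additionally need SASL/SSL properties.

```java
import java.time.Duration;

import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class EventPipelineSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Event Hubs exposes a Kafka-compatible endpoint on port 9093, so the
        // standard Kafka connector can consume from it (SASL properties omitted).
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("my-namespace.servicebus.windows.net:9093")
                .setTopics("events")
                .setGroupId("flink-consumer")
                .setStartingOffsets(OffsetsInitializer.latest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        // Bounded out-of-orderness: events may arrive up to 30 seconds behind
        // the watermark before being treated as late.
        DataStream<String> events = env.fromSource(
                source,
                WatermarkStrategy.<String>forBoundedOutOfOrderness(Duration.ofSeconds(30)),
                "eventhub-source");

        events.print();
        env.execute("event-pipeline-sketch");
    }
}
```

With a KafkaSource, the watermark strategy draws on the Kafka record timestamps unless an explicit timestamp assigner is supplied.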
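The aggregation item can be satisfied with Flink's AggregateFunction, which keeps only a compact accumulator in state rather than buffering every event. A minimal sketch computing a running average (class name is illustrative):

```java
import org.apache.flink.api.common.functions.AggregateFunction;
import org.apache.flink.api.java.tuple.Tuple2;

// Incremental average: the accumulator is a (sum, count) pair, so window
// state stays constant-size no matter how many events arrive.
public class AverageAggregate
        implements AggregateFunction<Double, Tuple2<Double, Long>, Double> {

    @Override
    public Tuple2<Double, Long> createAccumulator() {
        return Tuple2.of(0.0, 0L);
    }

    @Override
    public Tuple2<Double, Long> add(Double value, Tuple2<Double, Long> acc) {
        return Tuple2.of(acc.f0 + value, acc.f1 + 1);
    }

    @Override
    public Double getResult(Tuple2<Double, Long> acc) {
        return acc.f1 == 0 ? 0.0 : acc.f0 / acc.f1;
    }

    @Override
    public Tuple2<Double, Long> merge(Tuple2<Double, Long> a, Tuple2<Double, Long> b) {
        return Tuple2.of(a.f0 + b.f0, a.f1 + b.f1);
    }
}
```

It would typically be applied as stream.keyBy(...).window(...).aggregate(new AverageAggregate()).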
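For the checkpoint, savepoint, and ADLS items, one plausible wiring is RocksDB for local working state with durable checkpoints on ADLS Gen2 via the abfss:// scheme. The storage account and container below are placeholders, and the flink-azure-fs-hadoop plugin must be installed on the cluster for abfss:// paths to resolve:

```java
import org.apache.flink.contrib.streaming.state.EmbeddedRocksDBStateBackend;
import org.apache.flink.streaming.api.CheckpointingMode;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class StateSetupSketch {
    public static StreamExecutionEnvironment configure() {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Checkpoint every 60 seconds with exactly-once guarantees; savepoints
        // reuse the same snapshot machinery and are triggered on demand.
        env.enableCheckpointing(60_000, CheckpointingMode.EXACTLY_ONCE);

        // RocksDB keeps working state on local disk; enabling incremental
        // checkpoints uploads only changed files to durable storage.
        env.setStateBackend(new EmbeddedRocksDBStateBackend(true));

        // Durable checkpoints on ADLS Gen2 (placeholder container/account).
        env.getCheckpointConfig().setCheckpointStorage(
                "abfss://checkpoints@mystorageaccount.dfs.core.windows.net/flink");

        return env;
    }
}
```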
Flink Non-Functional Requirements: - Set up a private Flink cluster within a designated AKS environment.
- Configure both session-mode and application-mode deployments.
- Define and provision nodes and task slots.
- Manage and configure JobManagers and TaskManagers.
- Establish necessary connectors (e.g., external storage) for the Flink cluster.
- Configure heap memory and RocksDB for state management (see the configuration sketch after this list).
- Define and set up checkpoints and savepoints for state recovery.
- Enable AutoPilot capabilities.
- Integrate network resources such as Azure Event Hubs and external databases such as MongoDB (a sink sketch follows this list).
- Implement integration with ArgoCD for job submissions.
- Install LTM agents for logging and Dynatrace agents for monitoring purposes.
- Provide access to the Flink Dashboard.
- Establish High Availability (HA) and Disaster Recovery (DR) configurations.
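Several of the items above (task slots, heap/process memory, RocksDB, HA) reduce to cluster configuration. The sketch below expresses the relevant keys through Flink's Configuration API; in an AKS deployment they would normally live in flink-conf.yaml or be passed as -D flags, some key names vary across Flink versions (e.g., high-availability.type is the newer spelling of high-availability), and the storage path is a placeholder.

```java
import org.apache.flink.configuration.Configuration;
import org.apache.flink.configuration.HighAvailabilityOptions;
import org.apache.flink.configuration.JobManagerOptions;
import org.apache.flink.configuration.MemorySize;
import org.apache.flink.configuration.TaskManagerOptions;

public class ClusterConfigSketch {
    public static Configuration build() {
        Configuration conf = new Configuration();

        // Kubernetes HA: leader election and job metadata survive JobManager
        // pod restarts; the metadata itself is persisted to durable storage.
        conf.setString("high-availability.type", "kubernetes");
        conf.set(HighAvailabilityOptions.HA_STORAGE_PATH,
                "abfss://ha@mystorageaccount.dfs.core.windows.net/flink");

        // Process memory budgets for the JobManager and TaskManager pods;
        // Flink derives heap, managed, and network memory from these totals.
        conf.set(JobManagerOptions.TOTAL_PROCESS_MEMORY, MemorySize.parse("2g"));
        conf.set(TaskManagerOptions.TOTAL_PROCESS_MEMORY, MemorySize.parse("4g"));

        // Four task slots per TaskManager.
        conf.set(TaskManagerOptions.NUM_TASK_SLOTS, 4);

        // RocksDB state backend with incremental checkpoints.
        conf.setString("state.backend.type", "rocksdb");
        conf.setBoolean("state.backend.incremental", true);

        return conf;
    }
}
```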
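For the MongoDB integration item, the official Flink MongoDB connector (flink-connector-mongodb) provides a MongoSink. A minimal sketch, with the URI, database, and collection as placeholders:

```java
import com.mongodb.client.model.InsertOneModel;
import org.apache.flink.connector.mongodb.sink.MongoSink;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.bson.BsonDocument;

public class MongoSinkSketch {
    // Attaches a MongoDB sink that inserts each JSON string as one document;
    // batching and delivery-guarantee settings are left at their defaults.
    public static void attach(DataStream<String> jsonStream) {
        MongoSink<String> sink = MongoSink.<String>builder()
                .setUri("mongodb://user:password@mongo-host:27017")
                .setDatabase("events_db")
                .setCollection("processed_events")
                .setSerializationSchema(
                        (element, context) -> new InsertOneModel<>(BsonDocument.parse(element)))
                .build();

        jsonStream.sinkTo(sink);
    }
}
```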
Experience: - 10 years of hands-on design and Java coding experience in backend system development.
- 5 years of hands-on experience with Kafka, Flink, cloud platforms, unit/functional/integration testing, SQL or KSQL, Java, GitHub Actions, Dynatrace, code scanners, and MongoDB.
Additional details: Delivery of a fully configured Confluent Cloud infrastructure with Flink integration and an automated CI/CD pipeline to deploy to all environments (Test, Stage, Prod).