Job Description:
Primary Skills: Databricks Pyspark
- Proficiency in PySpark and Databricks (Delta Lake, clusters, jobs).
- Experience in architecting designs for integrating Databricks (DBX) with different applications, such as Salesforce and MDM, and with tools such as Collibra.
- Hands-on with Apache Airflow (DAG design and monitoring).
- Strong in AWS services: S3, EC2, Lambda, IAM.
- Strong SQL and Python for transformations and orchestration.
- Knowledge of Lakehouse architecture (Delta Lake) and data modeling.
- Experience in ETL/ELT and data warehousing best practices.