Data Modeller
Job Summary
It's fun to work in a company where people truly BELIEVE in what they are doing!
We're committed to bringing passion and customer focus to the business.
Responsibilities
- Participate in requirements definition and analysis, and in the design of logical and physical data models for dimensional, NoSQL, or graph data models.
- Lead data discovery discussions with the business in JAD sessions, and map the business requirements to logical and physical data modeling solutions.
- Conduct data model reviews with project team members.
- Capture technical metadata through data modeling tools.
- Ensure database designs efficiently support BI and end user requirements.
- Drive continual improvement and enhancement of existing systems.
- Collaborate with ETL/Data Engineering teams to create data process pipelines for data ingestion and transformation.
- Collaborate with Data Architects on data model management, documentation, and version control.
- Maintain expertise and proficiency in the various application areas.
- Maintain current knowledge of industry trends and standards.
Required Skills
- Strong data analysis and data profiling skills.
- Strong conceptual, logical, and physical data modeling skills for VLDB, data warehouse, and graph DB environments.
- Hands-on experience with modeling tools such as ERWIN or another industry-standard tool.
- Fluent in both normalized and dimensional model disciplines and techniques.
- Minimum of 3 years' experience with Oracle Database.
- Hands-on experience with Oracle SQL, PL/SQL, or Cypher.
- Exposure to Databricks, Spark, Delta technologies, Informatica ETL, or other industry-leading tools.
- Good knowledge of, or experience with, AWS Redshift and graph DB design and management.
- Working knowledge of AWS Cloud technologies, mainly the VPC, EC2, S3, DMS, and Glue services.
- Bachelor's degree in Software Engineering, Computer Science, or Information Systems (or equivalent experience).
- Excellent verbal and written communication skills including the ability to describe complex technical concepts in relatable terms.
- Ability to manage and prioritize multiple workstreams, with the confidence to make prioritization decisions.
- Data-driven mentality. Self-motivated, responsible, conscientious, and detail-oriented.
- Ability to learn and maintain knowledge of multiple application areas.
- Understanding of industry best practices pertaining to Quality Assurance concepts and procedures.
Education/Experience Level
- Bachelor's degree in Computer Science, Engineering, or a relevant field, with 3 years of experience as a Data and Solution Architect supporting enterprise data and integration applications, or in a similar role for large-scale enterprise solutions.
- 3 years of experience in big data infrastructure, including tuning experience in a lakehouse data ecosystem spanning data lakes, data warehouses, and graph DBs.
- AWS Solutions Architect Professional-level certification.
- Extensive experience in data analysis on critical enterprise systems such as SAP, E1, mainframe ERP, SFDC, Adobe Platform, and eCommerce systems.
If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!
Hiring Related Queries
Please share your resume via the job postings.
Not the right fit? Let us know you're interested in a future opportunity by clicking "Introduce Yourself" in the top-right corner of the page, or create an account to set up email alerts as new job postings that match your interests become available!
About Company
Fractal Analytics helps global Fortune 100 companies power every human decision in the enterprise by bringing analytics and AI to the decision.