Position: Data Analyst-Data Modeler/Engineer-ETL-Source-to-Target Mapping
Location: Minnetonka, MN (Hybrid: 3 days in-office). ONLY LOCAL CANDIDATES ACCEPTED.
Duration: 8-12 Months
Required Skills:
- Data Engineering
- ETL-Informatica
- SQL / Oracle PL/SQL
- Informatica PowerCenter
- Data Modeling
- Data Analysis
- Snowflake, Azure, or Kafka (preferred)
- Healthcare insurance industry experience
Key Accountabilities:
- Designing and implementing scalable data pipelines and storage solutions to support enterprise analytics.
- Ensuring data quality, integrity, and security across all stages of the data lifecycle.
- Collaborating with stakeholders to define data requirements and translate them into technical specifications.
- Monitoring and optimizing performance of data systems and ETL processes.
- Supporting the deployment and maintenance of data infrastructure in cloud environments.
- Developing and maintaining complex SQL scripts and stored procedures for data transformation and reporting.
- Leveraging Informatica IDMC for cloud-native data integration, data quality, and governance workflows.
In addition to engineering responsibilities, the Data Generalist component of this role includes:
- Performing exploratory data analysis and generating actionable insights.
- Creating dashboards and visualizations using tools such as Power BI, Tableau, or Excel.
- Collaborating with cross-functional teams to align data efforts with business goals.
- Automating data workflows using scripting languages such as Python or R.
- Supporting business intelligence initiatives and translating data into strategic recommendations.
- Documenting data processes and contributing to data governance standards.
- Applying AI and machine learning techniques to enhance predictive analytics, automate decision-making, and uncover deeper insights from complex datasets.
- Utilizing preferred AI tools and platforms such as TensorFlow, Azure Machine Learning, scikit-learn, and PyTorch to build and deploy models that support enterprise analytics and innovation.