Data Research - Database Engineer

Job Location

Mumbai - India

Monthly Salary

Not Disclosed


Vacancy

1 Vacancy

Job Description

Responsibilities

  • Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely.

  • Work with databases of varying scales, from small-scale databases to those involving big-data processing.

  • Work on data security and compliance by implementing access controls, encryption, and compliance standards.

  • Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture.

  • Migrate data from spreadsheets or other sources to a relational database system (e.g., PostgreSQL, MySQL) or cloud-based solutions such as Google BigQuery.

  • Develop import workflows and scripts to automate the data import process and ensure data accuracy and consistency.

  • Optimize database performance by analyzing query plans, implementing indexing strategies, and improving data retrieval and storage mechanisms.

  • Work with the team to ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity.

  • Monitor database health and identify and resolve issues.

  • Collaborate with the team's full-stack web developer to support the implementation of efficient data access and retrieval mechanisms.

  • Implement data security measures to protect sensitive information and comply with relevant regulations.

  • Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows.

  • Embrace a learning mindset, staying current with emerging database technologies, tools, and best practices.

  • Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines.

  • Familiarize yourself with tools and technologies used in the team's workflow, such as KNIME for data integration and analysis.

  • Use Python for tasks such as data manipulation, automation, and scripting.

  • Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines.

  • Assume accountability for achieving development milestones.

  • Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities.

  • Collaborate with and assist fellow members of the Data Research Engineering Team as required.

  • Perform tasks with precision and build reliable systems.

  • Leverage online resources such as Stack Overflow, ChatGPT, and Bard effectively, while considering their capabilities and limitations.
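The import-workflow responsibility above — moving spreadsheet data into a relational database while enforcing accuracy and consistency — can be sketched in a few lines. This is a minimal illustration, not the team's actual pipeline: the `products` table, its columns, and the CSV layout are invented for the example, and SQLite (from the Python standard library) stands in for PostgreSQL or MySQL so the sketch is self-contained.

```python
import csv
import io
import sqlite3

def import_rows(conn, csv_text):
    """Validate spreadsheet-style rows and load them in one transaction."""
    # Hypothetical schema: constraints encode the data-quality rules
    conn.execute(
        """CREATE TABLE IF NOT EXISTS products (
               sku   TEXT PRIMARY KEY,          -- uniqueness constraint
               name  TEXT NOT NULL,
               price REAL CHECK (price >= 0)    -- simple validation rule
           )"""
    )
    rows, rejected = [], []
    for row in csv.DictReader(io.StringIO(csv_text)):
        try:
            rows.append((row["sku"], row["name"], float(row["price"])))
        except (KeyError, ValueError):
            rejected.append(row)  # quarantine malformed rows for review
    with conn:  # single transaction: the import is all-or-nothing
        conn.executemany("INSERT OR REPLACE INTO products VALUES (?, ?, ?)", rows)
    return len(rows), rejected

conn = sqlite3.connect(":memory:")
sheet = "sku,name,price\nA1,Widget,9.99\nA2,Gadget,oops\n"
loaded, rejected = import_rows(conn, sheet)  # loads A1, quarantines A2
```

Rejecting malformed rows into a quarantine list rather than aborting the whole import is one common design choice; a production workflow would log or re-route those rows.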

Skills and Experience

  • Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential.

  • Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team's data presentation goals.

  • Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes.

  • Proficiency with tools like Prometheus, Grafana, or the ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting.

  • Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions).

  • Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration.

  • Knowledge of database development and administration concepts especially with relational databases like PostgreSQL and MySQL.

  • Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark.

  • Knowledge of SQL and understanding of database design principles, normalization, and indexing.

  • Knowledge of data migration, ETL (Extract, Transform, Load) processes, and integrating data from various sources.

  • Knowledge of cloud-based databases such as AWS RDS and Google BigQuery.

  • Eagerness to develop import workflows and scripts to automate data import processes.

  • Knowledge of data security best practices, including access controls, encryption, and compliance standards.

  • Strong problem-solving and analytical skills with attention to detail.

  • Creative and critical thinking.

  • Strong willingness to learn and expand knowledge in data engineering.

  • Familiarity with Agile development methodologies is a plus.

  • Experience with version control systems such as Git for collaborative development.

  • Ability to thrive in a fast-paced environment with rapidly changing priorities.

  • Ability to work collaboratively in a team environment.

  • Strong, effective communication skills.

  • Comfortable with autonomy and ability to work independently.
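The indexing and query-plan skills listed above can be demonstrated concretely. The sketch below uses SQLite from the Python standard library (standing in for PostgreSQL/MySQL, whose `EXPLAIN` output differs); the `orders` table and index name are invented for the example. It shows the same query switching from a full table scan to an index search once an index exists on the filtered column.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],  # synthetic rows
)

def plan(sql):
    # EXPLAIN QUERY PLAN reports whether SQLite scans the table or uses an index
    return " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(query)  # without an index: a full scan of orders
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(query)   # with the index: a search using idx_orders_customer
```

Inspecting the plan before and after is the habit the posting's "analyzing query plans" bullet refers to: the index pays off only when the planner actually chooses it.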

 


Qualifications:

5 years of experience in database engineering.


Additional Information:

Perks:

  • Day off on the 3rd Friday of every month (one long weekend each month)

  • Monthly Wellness Reimbursement Program to promote health and wellbeing

  • Monthly Office Commutation Reimbursement Program

  • Paid paternity and maternity leaves


Remote Work:

Yes


Employment Type:

Full-time
