This is an onsite job in Batumi!
MIGx is a global consulting company focused exclusively on the healthcare and life science industries, with their particularly demanding quality and regulatory requirements. We manage challenges and solve problems for our clients in compliance, business processes, and many other areas.
Our company is growing rapidly, and we are now looking for a data enthusiast who enjoys transforming and organizing structured and unstructured data, to work on state-of-the-art data fabric and data mesh projects.
Project Description
In this role you will work as a Data Engineer on complex projects with multiple data sources and formats. You will be part of a larger team at MIGx responsible for Data Services and for building Data Products for our customers (mid- to large-size enterprises). You will have the opportunity to keep growing in all things data related. You will help build the overall Data Mesh architecture for the customer while focusing on one specific visualization project, with more to come.
Responsibilities
- Develop ETL pipelines in Python and Azure Data Factory, as well as their DevOps CI/CD pipelines.
- Perform software engineering and systems integration via REST APIs and other standard interfaces.
- Work together with a team of professional engineers with the objective of:
  - developing data pipelines and automating processes,
  - deploying and building infrastructure as code,
  - managing the solutions designed in multi-cloud systems.
- Participate in agile ceremonies, weekly demos, and similar team activities.
- Communicate your daily commitments.
- Configure and connect different data sources, especially SQL databases (see the illustrative sketch after this list).
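For candidates who want a concrete flavor of the work described above, here is a minimal, illustrative Python (PySpark) sketch: reading from a SQL source over JDBC, applying basic cleansing, and landing the result as a Delta table. It is not project code; every name below (host, database, table, credentials, path) is a placeholder.

# Illustrative sketch only; all connection details are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Read from a SQL source via JDBC (placeholder URL and credentials;
# in practice, secrets would come from a vault, not plain text).
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://example-host:1433;databaseName=sales")
    .option("dbtable", "dbo.orders")
    .option("user", "etl_user")
    .option("password", "<secret-from-key-vault>")
    .load()
)

# Basic cleansing: deduplicate, normalize a column, drop invalid rows.
cleansed = (
    orders.dropDuplicates(["order_id"])
    .withColumn("country", F.upper(F.col("country")))
    .filter(F.col("amount") > 0)
)

# Land the result as a Delta table (placeholder path; needs the Delta
# Lake package, which Databricks clusters include by default).
cleansed.write.format("delta").mode("overwrite").save("/mnt/datalake/curated/orders")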
Requirements - Must have
- Studies in Computer Science (BSc and/or MSc desired).
- 3+ years of practical experience working in similar roles.
- Proficient with ETL products (Spark, Databricks, Snowflake, Azure Data Factory, etc.)
- Proficient with Azure Data Factory.
- Proficient with Databricks/Snowflake and PySpark.
- Proficient in developing DevOps/CI/CD pipelines.
- Proficient with Azure DevOps Classic/YAML Pipelines.
- Proficient with Azure cloud services: ARM templates, API management, App Service, VMs, AKS, Gateways.
- Advanced SQL knowledge and a background in relational databases such as MS SQL Server, Oracle, MySQL, and PostgreSQL.
- Understanding of landing and staging areas, data cleansing, data profiling, data security, and data architecture concepts (DWH, Data Lake, Delta Lake/Lakehouse, Data Mart).
- Data Modeling skills and knowledge of modeling tools.
- Advanced programming skills in Python.
- Ability to work in an agile development environment (Scrum, Kanban).
- Understanding of CI/CD principles and best practices.
Requirements - Nice to have
- Proficient with .NET/C#.
- Terraform.
- Bash/PowerShell.
Seniority Level
- Mid to Senior
Languages:
- English B1+
We Offer:
- Excellent compensation package
- Full medical insurance for the employee and their family
- Hybrid work model and flexible working schedule
- Newly built, comfortable office near the seashore, with a kitchen full of snacks, drinks, and food, plus showers, toilets, etc.
- Free English classes
- Different training programs to support your personal and professional development
- Possibilities for career development and the opportunity to shape the company's future
- Work in a fast-growing, international company
- Friendly atmosphere and a supportive management team
Where you will work
Batumi, Tbel Abuseridze Street, 5A
Vacancy published on 29 January 2025 in Tbilisi