Core responsibilities
- ETL/ELT Development: Design, build, and optimize Matillion ETL/ELT pipelines for batch and near-real-time data processing.
- Environment Management: Configure and maintain Matillion environments, projects, jobs, agents, and connections (cloud & on-premises).
- Data Integration: Ingest data from diverse sources (databases, APIs, flat files, cloud storage) into cloud data warehouses/data lakes.
- Transformation & Modeling: Implement complex data transformations, data quality checks, and data modeling to support analytics and reporting.
- Orchestration & Scheduling: Manage job orchestration, dependency handling, and scheduling using Matillion, cloud schedulers, or DevOps tools.
- API Integrations: Work with REST/SOAP APIs in Matillion for third-party integrations.
- Performance Optimization: Tune Matillion pipelines and SQL queries for scalability and performance.
- Collaboration: Partner with data engineers, analysts, and architects to deliver high-quality, production-ready solutions.
- CI/CD & DevOps: Leverage version control (Git), CI/CD pipelines, and deployment automation for Matillion jobs.
- Databricks Development: Build scalable, high-performance data solutions on Azure Databricks using Spark SQL or PySpark.
- Azure Data Platform: Develop scalable solutions leveraging Azure Databricks, ADLS, and Synapse Analytics.
- Workflow Orchestration: Develop and orchestrate data workflows using Azure Data Factory (ADF).