Design and build data pipelines using Azure Data Factory and related tools. Collaborate with data scientists, analysts, and business stakeholders to deliver actionable insights. Troubleshoot and optimize data workflows using Azure Monitor and Azure DevOps.
Job Description:
Location: Guadalajara
Relocation support to Guadalajara is available.
Mode: Onsite, 5 days per week in the office
English Communication Level: Advanced / Professional (B2–C1)
Key Responsibilities
- Design and Build Data Pipelines: Develop and maintain modern ETL/ELT processes using Azure Data Factory and related tools.
- Data Integration: Ingest data from multiple sources (databases, APIs, streaming platforms) into Azure environments.
- Data Modeling & Storage: Implement and optimize data models using Azure SQL Database, Azure Synapse Analytics, Azure Data Lake, and Cosmos DB.
- Data Transformation & Processing: Use Azure Databricks, Apache Spark, and Azure Synapse for large-scale data processing.
- Real-Time Data Processing: Configure streaming pipelines with Azure Stream Analytics for IoT or live data feeds.
- Data Governance & Security: Implement encryption, RBAC, compliance policies, and monitor data quality.
- Performance Optimization: Troubleshoot and optimize data workflows using Azure Monitor and Azure DevOps.
- Collaboration: Work closely with data scientists, analysts, and business stakeholders to deliver actionable insights.
Excellent analytical, problem-solving, and communication skills are required.