Databricks Architect, Data Modeler, Python, Azure, 3+ Yrs Experience
Description
Location: Mumbai
Job Description
As part of the Astellas commitment to delivering value for our patients, our organization is currently undergoing transformation to achieve this critical goal. This is an opportunity to work on digital transformation and make a real impact within a company dedicated to improving lives.
DigitalX, our new information technology function, is spearheading this value-driven transformation across Astellas. We are looking for people who embrace change, manage technical challenges, and have exceptional communication skills.
We are seeking a committed and talented Business Intelligence (BI) Architect to join our new FoundationX team, which lies at the heart of DigitalX. As a member of FoundationX, you will be responsible for ensuring our BI applications are performant, operational, and scalable, and continue to drive business value.
This is a remote position and is based in India. Remote work from certain states may be permitted in accordance with Astellas’ Responsible Flexibility Guidelines. Candidates interested in remote work are encouraged to apply.
Purpose and Scope:
We are seeking committed and talented Databricks Architects with experience in performance tuning and optimization and expert data modeling experience, to join our new FoundationX team, which lies at the heart of DigitalX. As a member of our team within FoundationX, you will be responsible for ensuring our data-driven systems are operational and scalable, and continue to contain the right data to drive business value.
Essential Job Responsibilities:
Your responsibilities will include executing complex data projects, ensuring smooth data flows between systems, and maintaining the efficiency and reliability of databases. This is a fantastic global opportunity to use your proven agile delivery skills across a diverse range of initiatives, utilize your development skills, and contribute to the continuous improvement/delivery of critical IT (Information Technology) solutions. You will also be contributing to the following areas:
End-to-End Data Solutions:
- Design end-to-end scalable data streams, storage, data serving systems, and analytical workflows using Databricks.
- Define overall architecture, capabilities, platforms, tools, and governing processes.
Data Pipeline Development:
- Build data pipelines to extract, transform, and load data from various sources.
- Set up metadata and master data structures to support transformation pipelines in Databricks.
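The extract-transform-load pattern described above can be sketched in plain Python (standard library only). In a Databricks pipeline the same stages would typically be PySpark DataFrame operations; all table and column names below are illustrative, not taken from any actual Astellas system:

```python
import csv
import io
import sqlite3

# Illustrative raw source data, standing in for a real upstream system.
RAW_CSV = """product,region,units
Widget,EU,10
Widget,US,5
Gadget,EU,3
"""

def extract(raw: str) -> list:
    """Read rows from a CSV source into dictionaries."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list) -> list:
    """Cast types and keep only the columns the target table needs."""
    return [(r["product"], r["region"], int(r["units"])) for r in rows]

def load(rows: list, conn: sqlite3.Connection) -> None:
    """Write transformed rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales (product TEXT, region TEXT, units INTEGER)"
    )
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(units) FROM sales").fetchone()[0]
print(total)  # 18
```

Keeping the three stages as separate functions, as above, mirrors how pipeline steps are usually isolated so each can be tested and monitored independently.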
Data Warehousing and Data Lakes:
- Create data warehouses and data lakes for efficient data storage and management.
- Develop and deploy data processing and analytics tools.
Collaboration with DataX and other key stakeholder teams:
- Collaborate with data modelers to create advanced data structures and models within the Databricks environment.
- Develop and maintain Python scripts for data processing, transformation, and analysis.
- Utilize Azure and AWS cloud services (e.g., Azure Data Lake, AWS S3, Redshift) for data storage and processing.
- Apply expertise in Databricks to enhance data architecture, performance, and reliability. Lead relevant data governance initiatives and ensure compliance with industry standards.
- Work closely with data scientists to develop and deploy data-driven solutions.
- Provide technical direction to Data Engineers and perform code reviews.
Continuous Learning:
- Stay up to date on the latest data technologies, trends, and best practices.
- Participate in smaller, focused mission teams to deliver value-driven solutions aligned with our global and bold move priority initiatives and beyond.
- Collaborate with cross-functional teams and practices across the organization, including Commercial, Manufacturing, Medical, DataX, and GrowthX, and support other X (transformation) Hubs and Practices as appropriate, to understand user needs and translate them into technical solutions.
- Provide Level 3 and Level 4 technical support to internal users, troubleshooting complex issues and restoring system uptime as quickly as possible.
- Champion continuous improvement initiatives, identifying opportunities to optimize the performance, security, and maintainability of existing data and platform architecture and other technology investments.
- Participate in the continuous delivery pipeline, adhering to DevOps best practices for version control, automation, and deployment, and ensuring effective management of the FoundationX backlog.
- Leverage your knowledge of Machine Learning (ML) and data engineering principles to integrate with existing data pipelines and explore new possibilities for data utilization.
- Stay up to date on the latest trends and technologies in full-stack development, data engineering, and cloud platforms.
Requirements
Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or related field.
- 3+ years of experience as a Data Engineer or Databricks Developer.
- Proficiency in Python for data manipulation, scripting, and analytics.
- Strong understanding of data modeling concepts and practices.
- Any relevant cloud-based Databricks, AWS, or Azure certifications, for example:
- Databricks Data Engineer
- AWS Certified Data Analytics – Specialty (will be considered with relevant experience)
- Microsoft Certified Azure Data Engineer Associate
- Microsoft Certified Azure Database Administrator
- Microsoft Certified Azure Developer
- Experience using ETL tools like Talend / Talend Cloud and DataStage (Essential)
- Knowledge and experience using Azure DevOps (Essential)
- Knowledge and experience of working with Salesforce / SAP (Desirable)
- Experience working with MPP databases such as Amazon Redshift
- Experience delivering architectural solutions effectively within the Life Sciences or Pharma domains.
- Architectural knowledge of deployment modes of various SAP Analytics solutions (Cloud, On-Premises, Hybrid)
- Working knowledge of connectors such as SDA/SDI, exposing models or queries as OData for external consumption, and consuming external APIs into BW models.
- Good knowledge of latest trends in Analytics space including platforms and solutions from other vendors
- Proven work experience with mixed modeling using BW/4HANA and native HANA modeling.
- Hands-on experience in S/4HANA analytics, such as building and extending CDS views, creating query views, or building KPIs using the KPI Modeler.
- Hands-on experience in the advanced analytics domain, such as predictive analytics or SAC Smart Features; SAC Application Designer experience is a plus.
Preferred Qualifications:
- Experience analyzing and building star schema data warehouses
- Experience writing SQL and creating stored procedures (Essential)
- Data Analysis and Automation Skills: Proficient in identifying, standardizing, and automating critical reporting metrics and modeling tools.
- Analytical Thinking: Demonstrated ability to lead ad hoc analyses, identify performance gaps, and foster a culture of continuous improvement.
- Experience integrating data from multiple data sources such as relational databases, Salesforce, SAP, and API calls.
- Agile Champion: Adherence to DevOps principles and a proven track record with CI/CD pipelines for continuous delivery.
- Ability to understand and interpret business requirements and translate them into technical requirements.
- Create and maintain technical documentation in line with CI/CD principles.