Role Overview:
Do you want to define the way our Big Data Platforms on Databricks infrastructure operate? We are looking for someone like you who can help us to:
• Drive internal product consulting and support for Azure Databricks deployments, including integration touch points such as ADLS, Azure SQL, and Power BI
• Capture and drive best practices across the firm to achieve effective use of Databricks
• Manage incidents and issues, focusing on effective communication and follow-up with stakeholders and other technical teams, including Microsoft
• Manage the interface to our Azure infrastructure engineering team, addressing issues and tracking them end to end until they are resolved
• Participate in Databricks-related activities such as working with users, providing training, and establishing a vibrant internal community of practitioners
• Maintain documentation and learning resources
• Engineer solutions and develop software modules, as defined by the software development life cycle
• Resolve high-priority defects and deploy fixes to production systems
• Communicate effectively with other technical teams and vendor partners

Qualifications Required:
You have:
• Deep technical knowledge of Databricks deployments in Azure, from both an administrative and a consulting standpoint
• Solid understanding of big data best practices, especially experience with very large clusters in terms of both data and compute
• Professional certification in both Databricks / Spark / Lakehouse and Azure infrastructure is essential, preferably in data engineering (DP-203)
• Proven expertise in full-stack integration of Azure Databricks (Lakehouse, GitLab, SQL Analytics, MLflow, Azure Policy, ADLS, SCIM, Power BI)
• Ideally 5 years of solid experience building data processing and data analytics solutions using Databricks on Azure
• Strong knowledge of a cloud platform such as Azure or AWS (Azure preferred)
• Proven experience in cloud integration and container services, and a good understanding of Agile execution processes
• Strong analytical, problem-solving, and debugging skills
• A passion for solving challenging problems that require multi-disciplinary skills across distributed software development, engineering, and deep Azure knowledge
• Proactive and detail-oriented, yet comfortable in an environment with fast-paced deliveries, changing requirements, and a focus on community and user engagement
• Able to work independently and apply your own initiative

About the Company:
UBS is the world's largest and only truly global wealth manager. We operate through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management, and the Investment Bank. Our global reach and the breadth of our expertise set us apart from our competitors. With more than 70,000 employees, we have a presence in all major financial centers in more than 50 countries.
If you want to be part of a purpose-led culture and global infrastructure that fosters diversity, equity, and inclusion, UBS is the place for you to make an impact and be part of #teamUBS.