1. Background
-Designing, testing, deploying, and documenting data quality procedures and their outputs
-The primary tasks of this role are to use data quality tooling to profile the project source data, define or confirm metadata definitions, cleanse and accuracy-check the project data, check for duplicate or redundant records, and advise on how to proceed with backend ETL processes (an illustrative profiling sketch follows this list)
-Partner with data stewards to provide summary results of the data quality analysis, which are used to decide how business rules and data quality will be measured
-Analyse and translate the client's needs into an enterprise data architecture that integrates the current systems and data assets with the new strategy
-Document at a functional level how the procedures work within the data quality applications
-Work closely with data stewards, ETL developers, and business analysts; build and maintain trusted partnerships, coordinate disparate information, and translate it for downstream consumption
-Research available data quality technologies and solutions, assess their suitability, and recommend the best solution for the project at hand
-Guide team members with support, advice, and best-practice recommendations throughout the project implementation
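
An illustrative profiling pass of the kind described above might look like the following minimal sketch. It assumes pandas; the file name and key columns are hypothetical placeholders, not project specifics.

```python
# Minimal data-profiling sketch (hypothetical file and column names).
import pandas as pd

def profile(df: pd.DataFrame, key_cols: list) -> dict:
    """Summarise completeness, uniqueness, and duplicate keys for review."""
    return {
        "row_count": len(df),
        "null_ratio_per_column": df.isna().mean().round(3).to_dict(),
        "distinct_values_per_column": df.nunique().to_dict(),
        "duplicate_key_rows": int(df.duplicated(subset=key_cols).sum()),
    }

if __name__ == "__main__":
    source = pd.read_csv("source_extract.csv")  # placeholder source extract
    for metric, value in profile(source, key_cols=["customer_id", "email"]).items():
        print(metric, ":", value)
```

In practice the resulting summary would feed the data quality analysis shared with data stewards, and the duplicate-key count would drive the duplicate/redundant-record checks.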
2. Knowledge
-Effective communication skills, with the ability to explain technical concepts to developers, product managers, and business partners
-Excellent problem-solving and critical-thinking skills
-Ability and desire to work with a team of people solving complex problems that often require independent research with minimal supervision
-Knowledge of data warehousing, OLAP, multidimensional modelling, and star and snowflake schemas
-Knowledge of and experience with database design principles, including referential integrity, normalization, and indexing to support application development (see the first sketch after this list)
-Automated testing of multi-tenant Azure Data Factory data pipelines for ELT
-Automated testing of data ingestion into multi-tenant Azure Data Lake Storage Gen2
-Advanced hands-on experience testing data warehouses and automating tests of data security (RBAC, RLS, CLS, data masking); see the second sketch after this list
-Advanced hands-on experience with automated testing of Azure APIs (Azure AD Provisioning API, Azure Data Lake API, Azure Data Factory API)
-Advanced hands-on experience with automated testing of Power BI Embedded, PowerShell, Azure CLI, and ARM templates
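
Two sketches follow for the items referenced above. The first illustrates referential integrity and indexing; it uses SQLite's standard-library driver purely so the example is self-contained, and the table and column names are hypothetical rather than taken from any project.

```python
# Illustrative only: referential integrity and indexing shown with SQLite.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # enforce foreign-key constraints

conn.execute("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,
        email       TEXT NOT NULL UNIQUE
    )""")
conn.execute("""
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL
            REFERENCES customer (customer_id),  -- referential integrity
        amount      REAL NOT NULL
    )""")
# Index the foreign key so joins and lookups stay fast.
conn.execute("CREATE INDEX ix_orders_customer ON orders (customer_id)")

conn.execute("INSERT INTO customer VALUES (1, 'a@example.com')")
try:
    # Violates the FK constraint: customer 99 does not exist.
    conn.execute("INSERT INTO orders VALUES (1, 99, 10.0)")
except sqlite3.IntegrityError as exc:
    print("rejected:", exc)
```

The same principles carry over to the project's actual DBMS; only the DDL dialect changes.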
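The second sketch shows the flavour of automated post-load checks for pipeline output and data security, written as plain pytest tests over pandas frames. The fixtures are stand-ins for however the project would actually read the source extract and the pipeline output (for example, through a low-privilege connection that should only see masked columns); all names and values are hypothetical.

```python
# Hedged sketch of post-load checks; fixtures stand in for real reads of the
# source extract and of the pipeline output as seen by a low-privilege reader.
import pandas as pd
import pytest

@pytest.fixture
def source() -> pd.DataFrame:
    # Stand-in for the project source extract.
    return pd.DataFrame({"customer_id": [1, 2],
                         "email": ["a@example.com", "b@example.com"]})

@pytest.fixture
def target() -> pd.DataFrame:
    # Stand-in for pipeline output read through a masked, low-privilege login.
    return pd.DataFrame({"customer_id": [1, 2],
                         "email": ["aXXX@XXXX.com", "bXXX@XXXX.com"]})

def test_row_counts_reconcile(source, target):
    # The load should neither drop nor duplicate rows.
    assert len(source) == len(target)

def test_no_duplicate_business_keys(target):
    # Duplicate/redundant-record check on the business key.
    assert not target.duplicated(subset=["customer_id"]).any()

def test_raw_emails_not_exposed(source, target):
    # Masking/CLS check: raw source e-mails must not be readable here.
    assert not target["email"].isin(source["email"]).any()
```

In a real run the fixtures would be replaced by reads against the Data Lake output and a restricted warehouse connection, and the same pattern extends to per-tenant RLS checks in the multi-tenant setup.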