Job Location: Toronto, Ontario, M5A 3N7
Education: Not Mentioned
Salary: $589 / Daily
Industry: Not Mentioned
Functional Area: Not Mentioned
Job Type: Full time
Job Title: Apptad - Specialized IT Consultant - Senior
Job Location: Toronto (Hybrid)
Job Duration: Long-Term Contract

One of our Government clients is looking for a Senior Specialized IT Consultant.

Job Description

General Responsibilities
- Works in partnership with clients, advising them on information technology in order to meet their business objectives or overcome problems, and works to improve the structure and efficiency of an organization's I&IT systems.
- The I&IT Consultant may be used to provide strategic guidance to organizations with regard to Information Management & IT technology, IT infrastructures and the enablement of major business processes through enhancements to IT.
- Provides subject matter expertise in their field and highly expert technical assistance.

General Skills
- Acts as the technical advisor/expert on all aspects of a specific deliverable
- Provides quality assurance/quality control of specific deliverables
- Anticipates and resolves problems to ensure that deliverables are completed within budget, to the highest quality, meeting or exceeding expectations
- Develops processes and procedures for implementing deliverables
- Prepares reports and presentations including options, recommendations, implementation plans, etc.
- Works with clients to define the scope of a project and to determine requirements
- Defines software, hardware and network requirements
- Analyzes I&IT requirements, giving independent and objective advice on the use of I&IT
- Designs, tests, installs and monitors new systems, and develops solutions for and implements new systems
- Familiar with change-management principles and methodology
- Knowledge and understanding of Information Management principles, concepts, policies and practices

Additional Responsibilities
This role will focus on data architecture, data warehousing, data lakes, and analytics. The individual will be designing, developing, maintaining, and optimizing ETL (Extract, Transform, Load) processes in Databricks for data warehousing, data lakes, and analytics.
The individual will work closely with data architects and business teams to ensure the efficient transformation and movement of data to meet business needs, including handling Change Data Capture (CDC) and streaming data.

- Review business requirements; familiarize with and understand the business rules and the transactional data model
- Define the conceptual and logical models and the physical model mapping from data source to curated model and data mart
- Analyze requirements and recommend changes to the physical model
- Develop scripts for the physical model; create the database and/or delta lake file structure
- Access Oracle DB environments and set up the necessary tools for developing the solution
- Implement data design methodologies, historical and dimensional models
- Perform data profiling, assess data accuracy, and design and document data quality and master data management rules
- Functionality review, data load review, performance review and data consistency checks
- Help troubleshoot data mart design issues
- Review ETL performance with developers and suggest improvements
- Participate in end-to-end integrated testing for Full Load and Incremental Load and advise on issues

Tools used are:
- Azure Databricks, Delta Lake, Delta Live Tables, and Spark to process structured and unstructured data.
- Azure Databricks/PySpark (good Python/PySpark knowledge required) to build transformations of raw data into the curated zone in the data lake.
- Azure Databricks/PySpark/SQL (good SQL knowledge required) to develop and/or troubleshoot transformations of curated data into FHIR.

Data design
  o Understand the requirements; recommend changes to models to support the ETL design.
  o Define primary keys, indexing strategies, and relationships that enhance data integrity and performance across layers.
  o Define the initial schemas for each data layer.
  o Assist with data modelling and updates of source-to-target mapping documentation.
  o Document and implement schema validation rules to ensure incoming data conforms to expected formats and standards.
  o Design data quality checks within the pipeline to catch inconsistencies, missing values, or errors early in the process.
  o Proactively communicate with business and IT experts on any changes required to conceptual, logical and physical models; communicate and review timelines, dependencies, and risks.

Development of ETL strategy and solution for different sets of data modules
  o Understand the tables and relationships in the data model.
  o Create low-level design documents and test cases for ETL development.
  o Implement error-catching, logging, retry mechanisms, and handling of data anomalies.
  o Create the workflow and pipeline design.

Development and testing of data pipelines with Incremental and Full Load (an illustrative PySpark sketch appears below)
  o Develop high-quality ETL mappings/scripts/notebooks.
  o Develop and maintain the pipeline from the Oracle data source to Azure Delta Lakes and FHIR.
  o Perform unit testing.
  o Ensure performance monitoring and improvement.

Performance review, data consistency checks
  o Troubleshoot performance issues and ETL issues; log activity for each pipeline and transformation.
  o Review and optimize overall ETL performance.

End-to-end integrated testing for Full Load and Incremental Load
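As a rough illustration of the raw-to-curated transformation and incremental-load work described above, the sketch below uses PySpark and the Delta Lake MERGE API. The paths, table names, and business keys (patient_id, encounter_id) are hypothetical placeholders, not the actual source-to-target mappings, which come from the project's design documents.

# Minimal sketch of a raw-to-curated incremental load on Azure Databricks.
# Paths, table names, and business keys below are hypothetical examples.
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

# Read today's raw extract (e.g. landed from the Oracle source) out of the raw zone.
raw = (
    spark.read.format("delta")
    .load("/mnt/datalake/raw/encounters")                      # hypothetical raw-zone path
    .filter(F.col("load_date") == F.current_date())
)

# Basic data-quality checks before promoting rows to the curated zone.
curated_batch = (
    raw.dropDuplicates(["patient_id", "encounter_id"])          # hypothetical business keys
       .filter(F.col("patient_id").isNotNull())
       .withColumn("encounter_ts", F.to_timestamp("encounter_ts"))
)

# Incremental load: MERGE the batch into the curated Delta table, updating
# changed rows and inserting new ones.  A full load would overwrite the
# target table instead of merging.
target = DeltaTable.forPath(spark, "/mnt/datalake/curated/encounters")
(
    target.alias("t")
    .merge(
        curated_batch.alias("s"),
        "t.patient_id = s.patient_id AND t.encounter_id = s.encounter_id",
    )
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)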
Plan for Go Live, Production Deployment
  o Create production deployment steps.
  o Configure parameters and scripts for go live.
  o Test and review the instructions.
  o Create release documents and help build and deploy code across servers.

Go Live Support and Review after Go Live
  o Review the existing ETL process and tools and provide recommendations for improving performance and reducing ETL timelines.
  o Review infrastructure and remediate issues for overall process improvement.

Knowledge Transfer to Ministry staff, development of documentation on the work completed
  o Document the work and share the ETL end-to-end design, troubleshooting steps, configuration and scripts review.
  o Transfer documents and scripts, and review the documents with the Ministry.

Must Have Skills
- 7+ years using ETL tools such as Microsoft SSIS, stored procedures, T-SQL
- 2+ years with Delta Lake, Databricks and Azure Databricks pipelines
  o Strong knowledge of Delta Lake for data management and optimization.
  o Familiarity with Databricks Workflows for scheduling and orchestrating tasks.
- 2+ years Python and PySpark
- Solid understanding of the Medallion Architecture (Bronze, Silver, Gold) and experience implementing it in production environments
- Hands-on experience with CDC tools (e.g., GoldenGate) for managing real-time data (see the Delta Live Tables sketch after this list)
- SQL Server, Oracle
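For the CDC requirement, a minimal Delta Live Tables sketch is shown below. It assumes a GoldenGate-style change feed that lands change rows with an operation code ("op") and a commit timestamp ("op_timestamp"); those column names, the feed path, the key, and the table names are assumptions for illustration only, and the code runs inside a Delta Live Tables pipeline rather than as a standalone script.

# Minimal Delta Live Tables sketch: applying a CDC feed into a streaming table.
# Runs inside a DLT pipeline, where `spark` is provided; the feed path, the
# "op"/"op_timestamp" columns, and the "encounter_id" key are hypothetical.
import dlt
from pyspark.sql import functions as F

@dlt.view
def encounters_changes():
    # Change rows landed by the CDC tool (e.g. GoldenGate) into the raw zone.
    return spark.readStream.format("delta").load("/mnt/datalake/raw/encounters_cdc")

# Target streaming table that APPLY CHANGES keeps in sync with the source feed.
dlt.create_streaming_table("encounters_silver")

dlt.apply_changes(
    target="encounters_silver",
    source="encounters_changes",
    keys=["encounter_id"],                       # business key
    sequence_by=F.col("op_timestamp"),           # ordering column from the CDC feed
    apply_as_deletes=F.expr("op = 'D'"),         # treat delete operations as deletes
    except_column_list=["op", "op_timestamp"],   # drop CDC metadata from the target
)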
Experience
- 7+ years working with SQL Server, T-SQL, Oracle, PL/SQL development or similar relational databases
- 2+ years working with Azure Data Factory, Databricks and Python development
- Experience building data ingestion and change data capture using Oracle GoldenGate
- Experience designing, developing, and implementing ETL pipelines using Databricks and related tools to ingest, transform, and store large-scale datasets
- Experience leveraging Databricks, Delta Lake, Delta Live Tables, and Spark to process structured and unstructured data
- Experience building databases and data warehouses and working with delta and full loads
- Experience with data modeling and data modeling tools, e.g. SAP PowerDesigner, Visio, or similar
- Experience working with SQL Server SSIS or other ETL tools; solid knowledge of and experience with SQL scripting
- Experience developing in an Agile environment
- Understanding of data warehouse architecture with a delta lake
- Ability to analyze, design, develop, test and document ETL pipelines from detailed and high-level specifications, and assist in troubleshooting
- Ability to utilize SQL to perform DDL tasks and complex queries
- Good knowledge of database performance optimization techniques
- Ability to assist in the requirements analysis and subsequent developments
- Ability to conduct unit testing and assist in test preparations to ensure data integrity
- Work closely with Designers, Business Analysts and other Developers
- Liaise with Project Managers, Quality Assurance Analysts and Business Intelligence Consultants
- Design and implement technical enhancements of the Data Warehouse as required

Selection Criteria

Technical Skills (70 points)
- Experience in developing and managing ETL pipelines, jobs, and workflows in Databricks
- Deep understanding of Delta Lake for building data lakes and managing ACID transactions, schema evolution, and data versioning
- Experience automating ETL pipelines using Delta Live Tables, including handling Change Data Capture (CDC) for incremental data loads
- Proficient in structuring data pipelines with the Medallion Architecture to scale data pipelines and ensure data quality
- Hands-on experience developing streaming tables in Databricks using Structured Streaming and readStream to handle real-time data (see the Auto Loader sketch after this list)
- Expertise in integrating CDC tools like GoldenGate or Debezium for processing incremental updates and managing real-time data ingestion
- Experience using Unity Catalog to manage data governance and access control and ensure compliance
- Skilled in managing clusters, jobs, autoscaling, monitoring, and performance optimization in Databricks environments
- Knowledge of using Databricks Auto Loader for efficient batch and real-time data ingestion
- Experience with data governance best practices, including implementing security policies, access control, and auditing with Unity Catalog
- Proficient in creating and managing Databricks Workflows to orchestrate job dependencies and schedule tasks
- Strong knowledge of Python, PySpark, and SQL for data manipulation and transformation
- Experience integrating Databricks with cloud storage solutions such as Azure Blob Storage, AWS S3, or Google Cloud Storage
- Familiarity with external orchestration tools like Azure Data Factory
- Implementing logical and physical data models
- Knowledge of FHIR is an asset
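As one illustration of the streaming and Auto Loader criteria above, the sketch below ingests raw files into a Bronze-layer table with Structured Streaming. The storage account, container, checkpoint locations, file format, and Unity Catalog table name are hypothetical assumptions, not project specifics.

# Minimal sketch of Bronze-layer ingestion with Databricks Auto Loader
# (cloudFiles) and Structured Streaming.  Paths and names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

bronze_stream = (
    spark.readStream.format("cloudFiles")                  # Auto Loader source
    .option("cloudFiles.format", "json")                   # raw files assumed to be JSON
    .option("cloudFiles.schemaLocation", "/mnt/checkpoints/encounters/schema")
    .load("abfss://raw@examplestorage.dfs.core.windows.net/encounters/")
)

(
    bronze_stream.writeStream
    .option("checkpointLocation", "/mnt/checkpoints/encounters/bronze")
    .trigger(availableNow=True)          # incremental batch run; drop for continuous streaming
    .toTable("example_catalog.bronze.encounters")   # hypothetical Unity Catalog table
)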
Design Documentation and Analysis Skills (20 points)
- Demonstrated experience in creating design documentation such as:
  o Schema definitions
  o Error handling and logging
  o ETL Process Documentation
  o Job Scheduling and Dependency Management
  o Data Quality and Validation Checks
  o Performance Optimization and Scalability Plans
  o Troubleshooting Guides
  o Data Lineage
  o Security and Access Control Policies applied within ETL
- Experience in Fit-Gap analysis, system use case reviews, requirements reviews, coding exercises and reviews
- Participate in defect fixing, testing support and development activities for ETL
- Analyze and document solution complexity and interdependencies, including providing support for data validation
- Strong analytical skills for troubleshooting, problem-solving, and ensuring data quality

Communication and Leadership Skills (10 points)
- Ability to collaborate effectively with cross-functional teams and communicate complex technical concepts to non-technical stakeholders
- Strong problem-solving skills and experience working in an Agile or Scrum environment
- Ability to provide technical guidance and support to other team members on Databricks best practices
- Must have previous work experience in conducting Knowledge Transfer sessions, ensuring the resources will receive the required knowledge to support the system
- Must develop documentation and materials as part of a review and knowledge transfer to other members