
Fahad Mahmood
Technical Architect in the Data & Analytics (D&A) practice at HCL Technologies, with 10+ years of experience in Snowflake DWH... | Noida, Uttar Pradesh, India
Fahad Mahmood’s Emails: fa****@hc****.com
Fahad Mahmood’s Phone Numbers: No phone number available.
Fahad Mahmood’s Location: Noida, Uttar Pradesh, India
Fahad Mahmood’s Expertise
Technical Architect in the Data & Analytics (D&A) practice at HCL Technologies, with 10+ years of experience.

Core skills: Snowflake DWH (cloud), dbt Cloud, Spark, Azure ADF, Azure Databricks, Azure Synapse, ADLS, Snowflake, Informatica ETL, Oracle SQL, Snowpipe, migrating applications from SAP HANA/BW to Snowflake via Datavard, hybrid cloud ingestion from AWS into Snowflake tables, continuous data-load ingestion pipelines for ETL/ELT, dynamic DDL schema-change detection, building an ECC framework for data-load strategy using automated JavaScript stored procedures with unloading to S3, automating hybrid merges with JavaScript stored procedures, CDC implementation using Streams and Tasks, RBAC, Azure Cloud, PL/SQL, SQL Server.

Domain knowledge: US healthcare, data analytics, telecom.

Key roles:
• Data warehouse design and implementation.
• Extracting data from various source systems and data marts, i.e. from Big Data platforms such as the Data Lake and Hive.
• Creating transformation rules from mapping documents and loading the target system for different subject areas using Informatica as the ETL tool.
• Loading semi-structured data (JSON, Parquet) from an AWS S3 external stage into Snowflake through Snowpipe and the COPY command (see the sketch after this section).
• Developed a data validation framework.
• Informatica Designer work: ETL mappings from source to target tables; implementing SCD Type 2, Router, Lookup, Joiner, and Update Strategy transformations; incremental loads; loading data from flat files into the staging layer and on into integration tables.
• Created Snowpipe for continuous data loading.
• Building ETL pipelines into and out of the data warehouse using Snowflake SnowSQL and hand-written SQL.
• Creating workflows for jobs, monitoring logs, transferring files to downstream applications through Unix shell scripts using ECG, creating parameter files and session variables, and maintaining entries in the CFW (Common Framework) table.
• Logical and physical data modeling, data flow diagrams, and database normalization techniques.
• Creating TWS jobs to schedule jobs across different job streams and updating their dependencies.
• Cursors and ref cursors, exception handling for error logging, views and materialized views.
• Creating indexes for fast data retrieval; performance tuning with DBMS_PROFILER, TKPROF, and hints; database design and migration; Jenkins and CI/CD implementation; XL deployment setup on all servers; GitHub setup.
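The Snowpipe, COPY, and Streams/Tasks items above describe a common Snowflake continuous-ingestion pattern. The following is a minimal sketch of that pattern, not taken from this profile: all object names (raw_s3_stage, raw_events, raw_events_stream, merge_events_task, etl_wh, curated_events) and the JSON fields (id, status) are illustrative assumptions, and a real setup would typically use a storage integration plus S3 event notifications rather than inline credentials.

-- Hypothetical example: external S3 stage holding JSON files.
CREATE OR REPLACE STAGE raw_s3_stage
  URL = 's3://example-bucket/landing/'
  CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...')  -- placeholder; a STORAGE INTEGRATION is the usual choice
  FILE_FORMAT = (TYPE = JSON);

-- Landing table with a single VARIANT column for the raw payload.
CREATE OR REPLACE TABLE raw_events (payload VARIANT);

-- Curated target table for the merged output.
CREATE OR REPLACE TABLE curated_events (id STRING, status STRING, updated_at TIMESTAMP_LTZ);

-- Snowpipe wrapping a COPY command; AUTO_INGEST requires S3 event notifications on the bucket.
CREATE OR REPLACE PIPE raw_events_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_events FROM @raw_s3_stage;

-- Stream captures changes (CDC) on the landing table.
CREATE OR REPLACE STREAM raw_events_stream ON TABLE raw_events;

-- Task periodically merges captured changes into the curated table.
CREATE OR REPLACE TASK merge_events_task
  WAREHOUSE = etl_wh
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('RAW_EVENTS_STREAM')
AS
  MERGE INTO curated_events t
  USING (
    SELECT payload:id::STRING AS id, payload:status::STRING AS status
    FROM raw_events_stream
  ) s
  ON t.id = s.id
  WHEN MATCHED THEN UPDATE SET t.status = s.status, t.updated_at = CURRENT_TIMESTAMP()
  WHEN NOT MATCHED THEN INSERT (id, status, updated_at)
    VALUES (s.id, s.status, CURRENT_TIMESTAMP());

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK merge_events_task RESUME;

When the MERGE commits, the stream's offset advances, so each batch of ingested changes is processed only once.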
Fahad Mahmood’s Current Industry: HCL Technologies
Fahad Mahmood’s Prior Industries: iGate | HCL Technologies | Wipro | UnitedHealth
Work Experience

HCL Technologies
Technical Architect
Feb 2022 – Present

UnitedHealth
Data Analyst
Jun 2016 – Feb 2022

Wipro
Senior Project Engineer
Feb 2015 – Jun 2016

HCL Technologies
Senior Software Engineer
Jan 2014 – Jan 2015

iGate
Software Engineer
Jun 2010 – Dec 2013