Candidate Information
Title: SQL Server Data Warehouse
Target Location: US-VA-Falls Church

OBJECTIVE

Seeking a challenging ETL Developer position.

SUMMARY

- 15 years of hands-on data migration experience across multiple data sources and targets in data warehouse environments
- Strong in MS SQL Server SSIS: designing, creating, and deploying Integration Services packages; developing control flows and data flows; familiar with connections, tasks, transformations, variables, and package configurations
- Solid work experience with the Informatica PowerCenter ETL tool, including mapping and workflow design
- Designed and built scalable DataStage solutions for a financial data mart
- Extensive experience in ETL, data warehousing/data marts, OLTP (online transaction processing), relational OLAP (ROLAP), and multidimensional OLAP (MOLAP)
- Extensive experience in enterprise BI report performance optimization and data organization
- Design and implement BI/reporting solutions from requirements gathered with business stakeholders, collecting the required data with BI tools such as Power BI, Dremio, Alteryx, Tableau, and MicroStrategy
- Rich database experience with Redshift, S3, Netezza, Oracle, and MS SQL Server, as well as PostgreSQL; SQL, PL/SQL, and T-SQL programming
- Hands-on experience with AWS S3, Lambda, EC2, CloudWatch, RDS, and Redshift, plus PostgreSQL skills
- Used Python 3.6 pandas to extract data from AWS Redshift and write CSV files for downstream consumers
- Converted SAS scripts to source from a universal CSV file instead of a retiring database
- Improved report performance by tuning queries, improving database design and processes, limiting final input data, and using proper automation control
- 10+ years of experience across the enterprise application development lifecycle, including business analysis, design, development, testing, and implementation
- Implemented conceptual, logical, and physical data models for RDBMSs, data warehouses, and data marts using ERwin, Toad Data Modeler, and Visio
- Advanced experience with the SDLC and iterative Scrum/Agile development using Jira, Bitbucket, and Jenkins
- Professional documentation experience with effective categorization, naming conventions, and standardized templates
- Excellent verbal and written communication skills and the ability to interact professionally with diverse groups
- Strong analytical and problem-solving abilities; detail oriented with strong organizational skills
- Proactive team player and self-motivated, quick-learning, hard-working

WORK EXPERIENCE

Sr. Data Analyst
Fannie Mae, Washington, DC | May 2019 - May 2024
Industry: Finance

- Developed Informatica mappings and workflows and performed upgrade validation
- Sourced data from S3 files with Dremio and created reports in Tableau
- Used S3 copy and the Redshift COPY command to load flat-file data into AWS Redshift
- Extracted data from Redshift with Python 3.6 pandas to generate files for downstream applications
- Solved a Redshift connection timeout issue with pandas chunked loads (sketched after this section) and a Netezza timeout issue by creating an external file function to improve extract performance
- Configured the S3 bucket/key and Redshift properties for data migration processing
- Developed SAS scripts that sourced data from files and created datasets for reports
- Reconciled old-model data against the redesigned Enterprise Data Warehouse data to confirm reports generated correctly from the new data model
- Designed data migration automation for the application with AutoSys and UNIX scripts
- Performed end-to-end tests with the SF3 application, calling and debugging the SAS code for business UAT
- Collaborated with Quality Assurance, DBA, and release management teams to implement projects
- Identified, documented, and performed root cause analysis to keep production issues from recurring
- Used Jenkins to deploy code to Test and UAT
- Tested AWS Lambda functions and Step Functions and validated runs in CloudWatch
- Built JIL contingency AutoSys jobs using the App ID, and used YBYO to manipulate files for testing
- Compared data across databases, files, Alteryx, and other tools

Environment: AWS, Lambda, EC2, Redshift, S3, CloudWatch, Python, SAS, AutoSys, Netezza, Oracle, Bitbucket, Jira, Jenkins
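
A minimal sketch of the chunked pandas extract described above, assuming a psycopg2 connection to Redshift. The cluster endpoint, credentials, table, and file names are illustrative placeholders, not details from the project:

    import pandas as pd
    import psycopg2

    # Illustrative connection settings; real values would come from config.
    conn = psycopg2.connect(
        host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
        port=5439, dbname="dw", user="etl_user", password="...",
    )

    # Streaming the result set in chunks keeps each fetch short, so a large
    # extract no longer holds one long-running cursor open past the
    # connection timeout, and memory use stays flat.
    first = True
    for chunk in pd.read_sql("SELECT * FROM staging.daily_extract", conn,
                             chunksize=50000):
        chunk.to_csv("daily_extract.csv", mode="w" if first else "a",
                     header=first, index=False)
        first = False

    conn.close()

Writing the header only with the first chunk yields one continuous CSV file for the downstream application.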
Sr. BI Consultant
DXC Technology, McLean, VA | Feb 2018 - May 2019
Project: EDW on Azure Cloud/Hive
Industry: Retail

- Solved a client performance problem by building an EDW that extracts, loads, and transforms data from the current system into big data Hive (the data lake) and presents it through SQL DB to Power BI
- Ingested files from brands around the world into the cloud Hive data lake and loaded them into the data mart, organizing source-data time zones and dependencies to prepare the data for final report runs (see the sketch after this section)
- Designed the client's data warehouse solution end to end: analysis, modeling, ETL, and BI reporting
- Converted the client's old report database to a dimensional model to reduce report latency
- Took advantage of elastic cloud compute to add capacity for peak runs and improve report performance
- Collected and summarized file arrival times and job dependencies to generate the schedule for automated loading
- Provided a solution for the initial load of large-volume historical data and handled data quality and file format issues
- Extracted data directly from the AX2012 and D365 ERP systems and loaded it into the data warehouse for reporting
- Developed Power BI visualizations for business KPI needs

Environment: Azure, Cloud DB, HDInsight, Hadoop, Hive 3.0, SQL DB, Power BI, SSIS, Jira, DevOps Git, ORC File, HQL, Logic App
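
A hedged sketch of the ingest pattern this role describes: landing files exposed through an external table, then rewritten into a partitioned ORC table so report runs only scan the slices they need. It is issued through PyHive as one common way to run HQL from Python; the host, schema, table, and column names are illustrative assumptions:

    from pyhive import hive

    conn = hive.Connection(host="hdinsight-head.example.net", port=10000)
    cur = conn.cursor()

    # External table over the raw landing files (schema is illustrative).
    cur.execute("""
        CREATE EXTERNAL TABLE IF NOT EXISTS lake.sales_landing (
            order_id BIGINT, store_id INT, amount DECIMAL(18,2),
            brand STRING, biz_date STRING)
        ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
        LOCATION '/landing/sales/'
    """)

    # Managed ORC table, partitioned by brand and business date so each
    # report run reads only the partitions it needs.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS lake.sales_raw (
            order_id BIGINT, store_id INT, amount DECIMAL(18,2))
        PARTITIONED BY (brand STRING, biz_date STRING)
        STORED AS ORC
    """)

    # Dynamic partitioning routes each row to its brand/date partition.
    cur.execute("SET hive.exec.dynamic.partition=true")
    cur.execute("SET hive.exec.dynamic.partition.mode=nonstrict")
    cur.execute("""
        INSERT OVERWRITE TABLE lake.sales_raw PARTITION (brand, biz_date)
        SELECT order_id, store_id, amount, brand, biz_date
        FROM lake.sales_landing
    """)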
Big Data Consultant
Toronto Big Data Professional Association, Toronto, ON | Oct 2016 - Jan 2018
Project: Big Data in Finance
Industry: Finance

- Explored data insights to inform decisions and improve departmental and corporate performance
- Designed and developed a POC for a client to evaluate the big data platform against the current platform
- Migrated data from a traditional database into a data lake to improve report performance and reduce cost
- Presented data to clients for marketing and analytical purposes
- Evaluated different tools and platforms against the client's environment and budget

Environment: Hadoop, Hive, Netezza, UNIX Linux, SQL, Python, Informatica, XML, Agile, Git

Senior ETL Developer
Next Pathway, Toronto, ON | May 2016 - Apr 2017
Project: Scotia RCRR, CPP, CRP, GLRAS
Industry: Banking

- Organized data through layers: ingested to a temporary storage zone, loaded to the snapshot layer, transformed to the semantic layer, and presented to users for report generation
- Designed and developed ELT processes that solved complex data problems with HQL and Java scripts
- Implemented big data ELT with Talend 6.1 using tELTHiveInput, tELTHiveMap, tELTHiveOutput, and tRunJob
- Embraced the Agile process, involving the client in every step of the project and responding quickly to change
- Designed and built scalable DataStage solutions for a financial data mart
- Used table partitions to balance storage and performance
- Evaluated the MapReduce, Tez, and Spark execution engines to improve run time
- Collaborated with data architects and BAs to refine requirements and designs
- Tested code, documented unit test cases, and resolved QA and UAT issues in Jira

Environment: Hadoop, Hive, Spark, Java, Python, Informatica, Talend, DataStage, Agile, Toad, Linux, Jira, Confluence, HUE, Git, Avro File, HQL, SAS

Senior ETL Developer/Lead
BMO, North York, ON | Nov 2014 - May 2016
Project: AML, MECH, OSFI, Basel III, GTSCM, Finance
Industry: Banking (Capital Markets)

- Designed and developed ELT loading processes to meet reporting timing and data requirements
- Coordinated with different departments to set the SLA for each source file and designed seamless schedules and dependencies for daily BI reports
- Improved Netezza database performance using the Organize, Distribute, and Groom functions (see the sketch after this section)
- Developed ELT code to load data from multiple sources into a landing database and then into the data mart for reporting
- Led development applying ELT, framework, and database standards
- Retrieved MFA log summary data from Hive into Netezza through Fluid Query to generate part of the risk score factor, and used Spotfire to generate analytic reports
- Migrated a large volume of data (5 TB) in production as a pre-deployment activity for a new release
- Implemented large project releases by coordinating the development, QA, scheduling, and release teams
- Used the development framework to manage ELT metadata and customized it for special requirements
- Communicated efficiently with BAs, the business, PMs, and vendors to collect and clarify requirements
- Performed data validation to control data errors, preventing bad data from loading into the database

Environment: Netezza, DB2, SQL Server 2012, PowerDesigner, UNIX Linux, SQL, XML, Hadoop, Hive, Spotfire, Agile
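
A rough sketch of the Netezza tuning this role mentions, issued from Python via pyodbc; the DSN, schema, and key choices are illustrative assumptions, not the project's actual design. DISTRIBUTE ON spreads rows across data slices, ORGANIZE ON clusters the table for zone-map pruning, and GROOM reclaims deleted rows and applies the new organization:

    import pyodbc

    # Illustrative DSN; a real job would pull this from configuration.
    conn = pyodbc.connect("DSN=NZSQL", autocommit=True)
    cur = conn.cursor()

    # Distribute on the join key so co-located joins avoid data movement.
    cur.execute("""
        CREATE TABLE mart.txn_fact (
            txn_id BIGINT, acct_id BIGINT, txn_dt DATE, amt NUMERIC(18,2))
        DISTRIBUTE ON (acct_id)
    """)

    # Organize on common filter columns to sharpen zone maps ...
    cur.execute("ALTER TABLE mart.txn_fact ORGANIZE ON (txn_dt, acct_id)")

    # ... then groom to rewrite existing rows into that order and
    # reclaim logically deleted records.
    cur.execute("GROOM TABLE mart.txn_fact RECORDS ALL")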
Senior ETL Developer/Lead
Rogers Inc., Brampton, ON | Jun 2011 - Nov 2014
Project: CDR, CRM, Billing System
Industry: Telecom

- Led ETL development to coding standards, using a restartable and scalable strategy in Informatica workflows and automating ETL processes with the Control-M scheduler
- Analyzed, defined, and documented requirements for data, workflow, and logical processes
- Designed and developed ETL code with Informatica, Control-M, and UNIX scripts to meet business and project requirements and help the business solve reporting problems
- Used Hadoop to store high-volume CDR data and unstructured data as a source for the data warehouse
- Used the control framework to collect metadata from each workflow for data quality control
- Handled data errors at the code, database, and control framework levels based on requirement needs
- Applied Teradata and Informatica functions to handle huge volumes of CDR and billing data
- Performed unit tests, code reviews, and integration tests, and produced test documents
- Created detailed design documents, unit test documents, implementation plans, ETL process documentation, and run books
- Assisted business analysts with data profiling activities

Environment: Informatica 9.5/8.6/7, Teradata v13, Control-M 7/8, Toad 10, Cognos 8, MicroStrategy; Oracle 11g/9i, SQL Server; UNIX AIX, PL/SQL, XML, Hadoop, Hive

Senior ETL Developer
CGI Inc., Moncton, NB | Nov 2010 - Jun 2011
Project: Atlantic Lottery Corporation Information System
Industry: Government

- Analyzed business requirements to design, architect, develop, and implement an efficient and scalable ETL flow
- Reviewed, analyzed, and developed fact tables, dimension tables, and reference tables for data marts
- Inspected and analyzed business and system requirement documents for the EDW (Enterprise Data Warehouse)
- Recommended ETL design and performance enhancements for large-volume data loads and processing
- Prepared, maintained, and published ETL documentation, including source-to-target mappings, business-driven transformation rules, functional designs, test reports, installation instructions, and run books
- Developed SSIS solutions using Integration Services projects for control, extract, data cleansing, transform, and load
- Developed and maintained data management standards and conventions, data dictionaries, data element naming standards, and metadata standards
- Created ETL test cases and prepared test data and test scripts to test fact and dimension tables

Environment: SSIS, Cognos; SQL Server 2005/2008 R2, Informix, DB2; AccuRev, Jira; ERwin, Toad, XML, Espresso

EDUCATION

Bachelor of Engineering, Computer Science
South China University of Technology

Bachelor of Economics
Guangdong University of Finance and Economics

REFERENCES

Available upon request.
