Candidate Information
Title: SQL Server Data Warehouse
Target Location: US-IL-Aurora

Candidate's Name
Mobile: PHONE NUMBER AVAILABLE
Email: EMAIL AVAILABLE

Professional Summary:
- 21 years of IT experience in the analysis, design, development, implementation, support and testing of business applications; the most recent 12+ years focused on the design, development and testing of data warehouses using ETL and reporting tools.
- 12+ years of experience in data integration and data migration using Informatica PowerCenter, IDQ, IICS, Big Data Developer, MDM, SSIS, DataStage and Red.
- Experience in Snowflake procedures, PL/SQL, SQL, Unix shell scripts, Python, AutoSys JIL scripts and mainframe programming (COBOL, JCL and CICS).
- Experience in migrating on-premise data stores (Oracle, SQL Server) onto AWS Snowflake and Microsoft Azure database platforms (a minimal load sketch follows the technical skills section).
- Experience in data modelling, building data warehouse solutions and integration architecture.
- Experience in data extraction, transformation and loading (ETL) from data sources such as Hadoop HDFS, Hive, Oracle, Teradata, Sybase, Informix, Azure SQL Server, Azure SQL Data Warehouse and DB2 using Informatica and SSIS.
- Experience with MS SQL Server Integration Services (SSIS), Reporting Services (SSRS), Power BI and Analysis Services (SSAS).
- Expertise in data modelling using star and snowflake schemas, ODS, fact and dimension tables.
- Well versed in the architecture of big data technologies such as HDFS and Spark.
- Experience with job scheduling tools such as AutoSys, Control-M and Tidal.
- Well versed in database testing, data warehouse testing, functional testing, mainframe testing and testing of web-based applications.
- Experience in developing software test plans, test metrics, test case designs and test scripts based on user requirements.
- Experience in QA methodologies and the Software Testing Life Cycle (STLC).
- Conducted peer reviews and knowledge transfer sessions and created training materials for the projects worked on.
- Hands-on experience across all stages of the Software Development Life Cycle (SDLC), including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
- Worked across diverse industry domains including Finance, Retail, Oil & Gas, Insurance and Energy.

Experience Summary:
- Lead Data Engineer at RCG Global Services (contracting through Compunnel Staffing), November 2019 to date.
- Data Integration Consultant at UBS (contracting through Compunnel Staffing), February 2016 to October 2019.
- Lead Developer at British Petroleum (contracting through Wipro Technologies), June 2009 to January 2016.
- ETL Tester at Oracle (contracting through Wipro Technologies), January 2008 to May 2009.
- Mainframe Developer at Thames Water, UK (contracting through Wipro Technologies), April 2006 to December 2007.
- Mainframe Developer at Verizon Data Services, August 2005 to April 2006.
- Mainframe Developer at Accenture, May 2005 to July 2005.
- Mainframe Developer at GE (contracting through Satyam Computer Services Ltd), October 2003 to February 2005.
- Mainframe Developer at CITI Bank (contracting through Excellbex Corporation Sdn Bhd), February 2003 to July 2003.
- Mainframe Developer at Fortune Technology, July 2001 to January 2003.

Technical Skills:
Databases: Snowflake, Hadoop HDFS, Hive, Oracle, Sybase, Informix, DB2, Teradata and Microsoft SQL Server
Database Tools: Toad, SQL Developer and DB2 Control Centre
ETL Tools: Informatica PowerCenter, IICS, IDQ, Big Data Developer, MDM, SSIS, DataStage and Red
Operating Systems: Windows and UNIX
Programming Languages: Snowflake procedures, PL/SQL, COBOL, JCL, Python and JIL scripts
Scheduling: AutoSys, Control-M and Tidal
Tools: HP ALM and JIRA
Cloud Databases: Snowflake and Azure SQL
Deployment Tools: Git, Jenkins and Bitbucket
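As a minimal sketch of the on-premise-to-Snowflake migrations mentioned in the summary above, the snippet below shows one common way to land an extracted flat file in Snowflake using the snowflake-connector-python package. The account, credentials, file path, stage and table names are illustrative assumptions and are not taken from any of the projects described here.

# Hypothetical sketch only: stage an extracted flat file and bulk-load it into
# Snowflake. Account, credentials, file, stage and table names are assumptions.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # assumed account identifier
    user="etl_user",           # assumed service account
    password="***",
    warehouse="ETL_WH",
    database="EDW",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    # Upload the local extract to the table's internal stage...
    cur.execute("PUT file:///data/extracts/customers.csv @%CUSTOMERS_STG")
    # ...then bulk-load it into the staging table.
    cur.execute(
        "COPY INTO CUSTOMERS_STG FROM @%CUSTOMERS_STG "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
finally:
    conn.close()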
Project Summary:

Project: ROP KPIs Project
Client: Martin Brower, Chicago
Duration: November 2019 to date
Skills:
  ETL Tools: DataStage, Informatica Big Data Developer 10.2.2, Informatica PowerCenter 10.1, Informatica IICS, SSIS, DBT, Matillion
  Reporting: Power BI
  Databases: Hadoop HDFS, Hive, Snowflake, Teradata and Azure SQL
  Cloud Platforms: AWS and Azure
  Scheduling Tool: Control-M
  Deployment Tools: Jenkins and CI/CD pipelines

Description:
Martin Brower is part of the Reyes Family of Businesses, one of the largest privately held companies in the US. As one of the leading supply chain providers for global quick-service restaurants, Martin Brower boasts over 60 years of successful partnerships with consumers, suppliers and communities.
To achieve optimum performance, the supply chain must be closely tracked, monitored and measured. Supply chain metrics are used to cover the following areas: procurement, production, distribution, warehousing, inventory, transportation and customer service. The metrics are designed to highlight areas that are not performing as expected, allowing the organization to perform root cause analysis and improve the associated processes while maintaining a balance between service levels and cost.
ROP (Restaurant Order Processing) metrics are a set of indicators that track the effectiveness of the people and processes involved in managing ROP. These metrics help the MB planning organization and McDonald's understand how the ROP service is performing over a given period and drive an action plan when performance is not at the expected level.

Responsibilities:
- Working as an Informatica architect and data integration consultant.
- Proposed architectures with Azure cost/spend in mind and developed recommendations to right-size the data infrastructure.
- Built a proof of concept for an Azure data warehouse to migrate the existing data mart.
- Involved in the end-to-end implementation of the project.
- Worked with the Informatica admin team to establish the Informatica environment and database connectivity.
- Participated in business analysis, ETL requirements gathering, data modelling and documentation.
- Analysed SSIS packages.
- Designed the data transformation mappings, the ODS and data quality verification programs.
- Extracted data from Teradata, Hadoop HDFS and Hive and loaded it into the Azure SQL cloud environment (a minimal load sketch follows this project).
- Migrated the existing SQL Server database to Azure SQL Database and Snowflake.
- Implemented alerts, threat protection and transparent data encryption for Azure SQL databases.
- Created reports and dashboards using Power BI.
- Investigated, debugged and fixed problems with mappings and workflows, and handled performance tuning.
- Participated in the decision support team to analyse user requirements and translate them for the technical team for new and change requests.
- Coordinated/managed development, project implementation and production support activities.
- Performed unit testing and user acceptance testing.
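As a minimal sketch of the Azure SQL loading step referenced above, the snippet below bulk-inserts extracted rows into an Azure SQL table using pyodbc. The server, database, credentials, table and column names are illustrative assumptions rather than details of the actual ROP KPIs pipeline.

# Hypothetical sketch only: bulk-insert extracted rows into an Azure SQL table
# with pyodbc. Server, database, credentials, table and column names are assumptions.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;"
    "DATABASE=rop_kpis;UID=etl_user;PWD=***"
)
cursor = conn.cursor()
cursor.fast_executemany = True  # batch the inserts instead of sending row by row

rows = [
    ("2019-11-04", "STORE-001", 42),   # (order_date, restaurant_id, cases_shipped)
    ("2019-11-04", "STORE-002", 17),
]
cursor.executemany(
    "INSERT INTO dbo.rop_order_metrics (order_date, restaurant_id, cases_shipped) "
    "VALUES (?, ?, ?)",
    rows,
)
conn.commit()
conn.close()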
Project: Asset Management Unify
Client: UBS, Chicago
Duration: February 2016 to October 2019
Skills:
  ETL Tools: Informatica PowerCenter 10.1, Informatica IDQ 10.1 and Informatica MDM
  Databases: AWS Snowflake, Oracle, Sybase, Teradata and Informix
  Programming Language: PL/SQL
  Scheduling Tool: AutoSys

Description:
The UNIFY program is a major transformation program for the Asset Management IT platform. Today's system is a complex application landscape that grew historically out of several separate platforms which were only partially integrated. The result is:
- a high degree of functional redundancy, resulting in high cost to maintain the overall system;
- no consistent data sourcing, with the same data sourced from different systems by different applications;
- largely point-to-point integrations without re-use or standardization of data formats.
From an integration and transition perspective, GIM2 and IDL (Integrated Distribution Layer) are important building blocks of today's landscape. IDL serves as an integration tool in the current state: it is a data hub that stores, transforms and distributes data to many downstream systems. The target state will be achieved by introducing a new portfolio accounting system that provides the capabilities of the decommissioned systems. From an application perspective, three new applications will be introduced:
1. Simcorp Dimensions
2. Simcorp IMW
3. AIM-GAIN (Corporate Actions Management)
The overall program goal can only be achieved if effective integration solutions are in place, given that the transition from the current state to the target state has to be implemented in a coordinated way.

Responsibilities:
- Involved in the end-to-end implementation of the project.
- Worked with DIS (the data integration support team) to establish the Informatica environment, UNIX, database connectivity and AutoSys.
- Participated in business analysis, ETL requirements gathering, data modelling and documentation.
- Designed the data transformation mappings and data quality verification programs.
- Created database tables in AWS Snowflake.
- Extracted data from Teradata, Oracle, Informix, Sybase and flat files into AWS Snowflake.
- Developed AutoSys JIL scripts for scheduling the Informatica workflows (a workflow-trigger sketch follows this project).
- Investigated, debugged and fixed problems with Informatica mappings and workflows.
- Participated in the decision support team to analyse user requirements and translate them for the technical team for new and change requests.
- Coordinated/managed development, project implementation and production support activities.
- Created a file transfer process using MFT and BHUB.
- Performed unit testing and user acceptance testing.
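The AutoSys JIL scripts mentioned above typically schedule command jobs that invoke Informatica's pmcmd utility to start workflows. The snippet below is a minimal Python wrapper around such a call; the integration service, domain, folder and workflow names are assumptions, and credentials would normally come from a secure store rather than being passed inline.

# Hypothetical sketch only: trigger an Informatica PowerCenter workflow from a
# wrapper script via the pmcmd command line. Service, domain, folder and workflow
# names are assumptions.
import subprocess

cmd = [
    "pmcmd", "startworkflow",
    "-sv", "INT_SVC_DEV",     # integration service name (assumed)
    "-d", "Domain_Dev",       # PowerCenter domain (assumed)
    "-u", "etl_user",
    "-p", "***",
    "-f", "AM_UNIFY",         # repository folder (assumed)
    "-wait",                  # block until the workflow finishes
    "wf_load_positions",      # workflow name (assumed)
]
result = subprocess.run(cmd, capture_output=True, text=True)
if result.returncode != 0:
    raise RuntimeError(f"Workflow failed:\n{result.stdout}\n{result.stderr}")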
Project: Operational Planning and Data System 2 (OPSD2)
Client: British Petroleum, Chicago
Role: Lead Developer
Duration: June 2009 to January 2016
Environment:
  ETL Tool: Informatica PowerCenter 9.5
  Programming Languages: COBOL and JCL
  Databases: Oracle and DB2
  Scheduling Tool: Tidal

Description:
BP, sometimes referred to by its former name British Petroleum, is a British multinational oil and gas company headquartered in London, England. BP is vertically integrated and operates in all areas of the oil and gas industry, including exploration, production, refining, distribution, marketing, petrochemicals, power generation and trading. It also has renewable energy activities in biofuels and wind power.
OPSD2 is a BP North America downstream proprietary application which provides the means to (1) capture the demand forecast, (2) schedule bulk shipments to meet the forecast while maintaining inventory targets, and (3) monitor the execution of the forecast. OPSD2 also generates the forecast, inventory, scheduling, lifting, material/plant data, contract and third-party pipeline scheduling reports. The application was originally developed on the mainframe, and the data was migrated from DB2 to Oracle.

Responsibilities:
- Coordinated/managed development, project implementation and production support activities.
- As a mainframe programmer, had a complete understanding of the application's functionality.
- Participated in business analysis, ETL requirements gathering, physical and logical data modelling and documentation.
- Designed the data transformation mappings and data quality verification programs using Informatica and PL/SQL.
- Extracted source definitions from DB2, Oracle, MS SQL Server, MS Access and flat files into the Informatica repository.
- Developed and tested mappings using Informatica PowerCenter Designer.
- Designed reusable transformations and mapplets; used transformations such as Source Qualifier, Joiner, Update Strategy, Lookup, Rank, Expression, Aggregator, Filter and Sequence Generator to load data into the Oracle database.
- Designed workflows, reusable tasks and worklets using Informatica Workflow Manager; performed SQL and database tuning.
- Investigated, debugged and fixed problems with Informatica mappings, workflows and BO reports.
- Participated in the decision support team to analyse user requirements and translate them for the technical team for new and change requests.
- Performed unit and integration testing in User Acceptance Test (UAT), Operational Acceptance Test (OAT), Production Support Environment (PSE) and production environments.
- Provided production support for data loads to the data warehouse.

Education:
- Completed a B.E. in Computer Science from M.S. University, India, in 2001.
