Navin Kumar Vishwakarma
Data Architect, SQL Server
Bloomsburg, PA
EMAIL AVAILABLE
PHONE NUMBER AVAILABLE

Decisive, strategic, and performance-driven professional targeting senior-level assignments in data warehousing with an organization of high repute, across multiple technologies and roles.
PROFILE SUMMARY

- Results-oriented professional with 17 years of experience across multiple data warehouse (DWH) platforms and technologies
- Proven success working with databases and ETL tools including Greenplum, Teradata, Oracle, and Vectorwise
- Hands-on experience with Informatica PowerCenter 10.5.2/9.5, IICS, Informatica PowerExchange 10.4/9.5, HVR, DataStage, SAP BODS, Talend, Hadoop, Scala, and Spark SQL
- Experienced in creating design and project-plan documents such as PGP and SGP; well versed in gathering business requirements and in developing and deploying custom data systems and applications
- Good experience designing and developing audit, error-identification, and reconciliation processes to ensure the data quality of the data warehouse
- Skilled in creating and maintaining database/application architecture and standards following best practices
- Mined and analyzed data from multiple sources to drive optimization and improvement, delivering data-driven solutions to business challenges
- Experienced with data lake implementations; developed Informatica jobs to load data into Hive and Impala systems
- Extensively worked on Informatica B2B Data Exchange setup: endpoint creation, scheduler, partner setup, profile setup, event attribute creation, event status creation, etc.
- Excellent knowledge of identifying performance bottlenecks and tuning Informatica loads for better performance and efficiency
- Extensive knowledge of developing Teradata FastExport, FastLoad, MultiLoad, and BTEQ scripts; coded complex scripts and fine-tuned queries to enhance performance
- Performed requirements gathering, design, development, implementation, migration, and testing for various ETL implementations using Informatica; designed ETL solutions using sources such as Oracle, SQL Server, JD Edwards, Siebel ERP, and AWS S3
- Profound knowledge of the Teradata database architecture
- Experienced in writing PL/SQL and T-SQL procedures to process business logic in the database, and in tuning SQL queries for better performance
- Involved in all phases of the data warehouse project life cycle; designed and developed ETL architectures to load data from sources such as DB2 UDB, Oracle, flat files, XML files, Sybase, and MS SQL Server into Oracle, Teradata, XML, SQL Server, Hive, Impala, and Azure targets
- Significant multidimensional and relational data modeling experience, including data flow diagrams, process models, and ER diagrams with modeling tools such as ERwin and Visio
- Extensive experience implementing data-cleanup procedures, transformation scripts, triggers, and stored procedures, and executing test plans to load data successfully into targets
- Extensive experience maintaining Informatica connections in all environments, including relational, application, and FTP connections
- Excellent at managing design and development, testing, debugging, and troubleshooting, and at facilitating smooth implementation of applications

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 10.5.2/9.5, IICS, Informatica PowerExchange 10.4/9.5, DataStage, SAP BODS, Talend, HVR
Big Data: HDFS, Hive, Scala, Spark Core, Spark DataFrames, Spark SQL
Performance Tuning: partitioning, bucketing, map-side joins, and broadcast joins in Spark SQL (see the sketch after this list)
RDBMS: Greenplum, Teradata, Oracle, SQL Server, DB2, Vectorwise
Scheduling Tools: Control-M 6.3, Tidal
Operating Systems: UNIX
Data Modeling: dimensional modeling from conceptual to logical and physical
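For illustration, a minimal Spark SQL sketch of the broadcast-join tuning listed above; the table and column names (fct_sales, dim_item) are hypothetical:

    -- Ask Spark to replicate the small dimension table to every executor,
    -- so the large fact table is joined map-side without a shuffle.
    SELECT /*+ BROADCAST(d) */
           d.item_name,
           SUM(f.sales_amt) AS total_sales
    FROM   fct_sales f
    JOIN   dim_item  d
      ON   f.item_id = d.item_id
    GROUP  BY d.item_name;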
EDUCATION

2013: MBA, Sikkim Manipal University
2002: B.Com, IGNOU, Dhanbad, Jharkhand
2001: Diploma in Computer Application, NIIT Dhanbad

CORE COMPETENCIES

- ETL Architecture
- Data Analysis and Design
- Project Execution & Management
- Automation
- Client/Stakeholder Management
- Performance Tuning
- Cross-functional Coordination
- Code & Unit Test Case Review

WORK EXPERIENCE

Lead ETL Developer and Data Architect
Geisinger Health Plan, since Jul 2022

Project: ETL Managed Services
Role: Lead ETL Developer and Data Architect
Customer: Geisinger
Period: Jul 2022 - till date
Environment: Informatica, SQL Server, Tidal
Domain: Health Care

Description: Geisinger is one of the leading hospital and health-plan organizations in Pennsylvania. Its primary care facility is the Geisinger Medical Center, founded in 1915. Geisinger Health System serves over half a million patients across multiple states in the northeastern United States.

Responsibilities:
- Worked on the GHP project across multiple PBIs, mostly handling Facets/Medicaid/CHIP data to generate member, medical, dental, and Rx claim and eligibility files
- Built ETL code for the Cohere, CMS Interop, Navitus, and Facets sources
- Working on the cloud (IICS) conversion of 350+ mappings; this engagement will conclude at the end of next year
- Automated and scheduled the cloud (IICS) jobs to run daily, with email notifications for any failures
- Working on CMS Interop and the big data migration to SQL Server
- CMS Interop implementations for Navitus and Cohere are in progress
- Helping with the FHIR Accelerator implementation based on the current CMS Interop data set
- Provided 24/7 production support for delivered projects
- Extensively worked with optimization techniques and active transformations such as Filter, Sorter, Aggregator, Router, SQL, Union, and Joiner for best performance on large data sets
- Extensively worked with passive transformations such as Expression, Lookup, Sequence Generator, Mapplet Input, and Mapplet Output
- Used Informatica PowerCenter for extraction, transformation, and loading (ETL) of data from heterogeneous source systems into the target database
- Ensured data consistency is maintained in each layer of data (see the reconciliation sketch after this list)
- Coordinated with multiple teams across the environment to ensure no roadblocks impact project timelines
- Designed data models with business analysts; designed programs for data extraction and loading into the SQL Server database
- Led a team of 7 members and served as the primary point of contact for ongoing support
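As a sketch of the layer-to-layer consistency checks mentioned above, here is a minimal T-SQL reconciliation audit; the table and column names (etl_audit, stg_claims, dw_claims, claim_amt) are hypothetical:

    -- Record row counts and claim-amount totals for the staging and
    -- warehouse layers; a downstream check flags any mismatch.
    INSERT INTO etl_audit
        (load_date, src_table, tgt_table, src_rows, tgt_rows, src_amt, tgt_amt)
    SELECT CAST(GETDATE() AS date),
           'stg_claims',
           'dw_claims',
           (SELECT COUNT(*)       FROM stg_claims),
           (SELECT COUNT(*)       FROM dw_claims),
           (SELECT SUM(claim_amt) FROM stg_claims),
           (SELECT SUM(claim_amt) FROM dw_claims);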
Techno Manager/Architect
Cox Communication, Jan 2019 - Jun 2022

Project: Enterprise Data Platform Services
Role: Techno Manager/Architect
Customer: Cox Communication
Period: Jan 2019 - Jun 2022
Environment: Hadoop, Scala and Spark, HVR
Domain: Telecom

Description: Cox Communications is the largest private broadband company in America, providing advanced digital video, Internet, telephone, and home security and automation services over its own nationwide IP network. Cox serves more than 6.5 million residences and businesses across 18 states.

Responsibilities:
- Managed projects end to end, from initiation through monitoring, control, and closure, including planning, estimation and scheduling, keeping all stakeholders informed, integrating change control, controlling baselines, planning risk responses, and contingency planning
- Implemented project plans within preset budgets and deadlines, monitoring and reporting on project progress
- Led a team of 37 members and served as the primary point of contact for ongoing support in areas of responsibility, including setting the analytics roadmap and budget for information-delivery functions
- Ensured the project delivery team structure was adequate; enforced compliance, best practices, approach, and direction for the technical aspects of the organization's practice, providing technical leadership to fellow team members
- Involved in migration projects that moved data warehouses from Oracle/DB2 to Teradata
- Produced health reports for the environment
- Worked with member, pharmacy, patient, provider, encounter, and claim data
- Parsed structured and unstructured files using Informatica PowerCenter
- Implemented slowly changing dimensions Type 1 and Type 2 for change data capture (see the sketch after this list)
- Worked with various lookup caches: dynamic cache, static cache, persistent cache, re-cache from database, and shared cache
- Loaded data into Hive and Impala systems for the data lake implementation
- Designed the solution for file handling and movement, file archival, file validation, file processing, and file-error notification using Informatica
- Worked extensively with the Update Strategy transformation to implement inserts and updates
- Implemented auditing and balancing on the transactional sources per business requirements, so that every record read is either captured in the maintenance tables or written to the target tables
- Extensively used email tasks to deliver generated reports to mailboxes, and command tasks to run pre-session and post-session commands
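The Type 2 change capture referenced above was built in Informatica; as a language-neutral illustration, here is the same pattern in plain SQL, with a hypothetical dim_customer dimension keyed on customer_id and tracked on address:

    -- Step 1: expire the current version of any customer whose tracked
    -- attribute changed in the incoming staging data.
    UPDATE dim_customer
    SET    eff_end_date = CURRENT_DATE,
           current_flag = 'N'
    WHERE  current_flag = 'Y'
      AND  EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.customer_id = dim_customer.customer_id
                     AND  s.address    <> dim_customer.address);

    -- Step 2: insert a fresh current version for new and changed customers
    -- (changed customers have no current row after step 1).
    INSERT INTO dim_customer
        (customer_id, address, eff_start_date, eff_end_date, current_flag)
    SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_customer d
                       WHERE  d.customer_id = s.customer_id
                         AND  d.current_flag = 'Y');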
Techno Manager
Baker Hughes General Electric, May 2017 - Jan 2019

Project: BHGE Application Migration
Role: Techno Manager
Customer: BHGE
Period: May 2017 - Jan 2019
Environment: Informatica, Greenplum, Talend
Domain: Oil and Gas

Description: BHGE had a large strategic synergy initiative to move analytics applications from legacy on-premise platforms to the new AWS data lake, hosted in Greenplum. The strategic visualization technologies remain OBIEE and Tableau; the strategic data integration technology is Talend. The purpose of the project was to move from many data lakes to one and to drive a common technology stack that improves operations, reduces costs, and improves uptime for users.

Responsibilities:
- Steered development efforts across Informatica, Talend, GPDB, HVR, and reporting
- Created new labels for metadata entry in Talend, checked file formats, and handled code testing and deployment to higher environments
- Managed the entire gamut of operations, from gathering requirements and designing interfaces to reviewing designs and assisting the team with coding, testing, and performance tuning
- Participated in the development and review of business and system requirements to obtain a thorough understanding of business needs and deliver accurate solutions
- Communicated with internal/external clients to determine specific requirements and expectations, managing client expectations as an indicator of quality
- Created mappings and mapplets in Informatica PowerCenter to transform data according to business rules
- Designed the ETL mappings from sources to operational staging targets, and then to the data warehouse, using PowerCenter Designer (see the sketch after this list)
- Involved in the design, analysis, implementation, testing, and support of ETL processes for the stage, ODS, and mart layers
- Documented Informatica mappings in an Excel spreadsheet
- Developed mappings, reusable objects, transformations, and mapplets using Mapping Designer, Transformation Developer, and Mapplet Designer in Informatica PowerCenter
- Worked with Informatica PowerCenter Designer, Workflow Manager, Workflow Monitor, and Repository Manager
- Used Informatica PowerCenter to migrate data from different source systems
- Extensively used Autosys for scheduling and monitoring
- Developed, supported, and maintained ETL processes using Informatica PowerCenter 9.1 with transformations such as Expression, Source Qualifier, Filter, Router, Sorter, Aggregator, Update Strategy, and connected and unconnected Lookups
- Extensively used various performance-tuning techniques to improve session performance
- Developed and maintained ETL mappings to extract data from multiple source systems such as Oracle, SQL Server, and flat files, loading into Oracle
- Created new table structures and modified existing tables to fit the existing data model
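Since the target platform here was Greenplum, a minimal sketch of the staging-to-warehouse load pattern described above; the schema, table, and column names are hypothetical, and the distribution key choice is only illustrative:

    -- Greenplum spreads rows across segments by the distribution key;
    -- distributing fact and dimension tables on the same key avoids
    -- data motion when they are joined.
    CREATE TABLE dw.fct_orders
    (
        order_id    bigint,
        customer_id bigint,
        order_amt   numeric(18,2),
        load_ts     timestamp
    )
    DISTRIBUTED BY (customer_id);

    -- Move cleansed rows from the operational staging layer into the
    -- warehouse layer.
    INSERT INTO dw.fct_orders (order_id, customer_id, order_amt, load_ts)
    SELECT order_id, customer_id, order_amt, now()
    FROM   stg.orders
    WHERE  order_amt IS NOT NULL;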
Project Lead/Manager
Baker Hughes General Electric, Feb 2014 - Mar 2016

Project: OFS Billing
Role: Project Lead/Manager
Customer: Baker Hughes
Period: Feb 2014 - Mar 2016
Environment: Informatica, SAP BODS, Oracle, Vectorwise, Tableau
Domain: Power and Water

Description: GE was unable to drive operational decisions because of multiple operational definitions, no cross-functional reporting, and a lack of drill-down facilities for its finance users. This implementation of the BI platform NexGen provides a near-real-time, consistent global view and true analysis of operational and financial data to support business drivers and save manual effort. NexGen provides more accurate reporting based on one single table, as opposed to the earlier Guide, which provided a near-real-time trial balance sheet.

Responsibilities:
- Gained an understanding of the requirements of the existing ARGO model and created a lineage document
- Gained an understanding of the existing business model and planned for the challenges of the new databases
- Communicated with the code review/migration team to find issues and solutions
- Ensured data consistency is maintained in each layer of data
- Assisted project managers in establishing plans, risk assessments, and milestone deliverables
- Designed data models using Oracle Designer; designed programs for data extraction and loading into the Oracle database
- Developed complex reports using multiple data providers, user-defined objects, aggregate-aware objects, charts, and synchronized queries

PREVIOUS EXPERIENCE

Track Lead
Target Corporation, Bangalore, Jun 2006 - Feb 2014*

Projects Undertaken:
- EGP: DataStage, DB2, Teradata and Oracle
- MBI-Inventory: DataStage, DB2, Teradata and Oracle
- PARS: DataStage, DB2, Teradata and Oracle
- Merchandising Item: DataStage 7.5, DB2, Oracle and UNIX
- DQ SCR: DataStage 7.5, DB2, Oracle and UNIX
- Vendor Compliance Integration: DataStage 7.5, DB2, Oracle and UNIX
- Vendor Report Card: DataStage 7.5, DB2, Oracle and UNIX
- Gift Registry: DataStage 7.5, DB2, Oracle and UNIX

Junior Engineer
Jazzcon, New Delhi, Mar 2003 - Mar 2006

PERSONAL DETAILS

Date of Birth: 07/09/1979
Languages Known: English & Hindi
Address: 408 Iron Street, Bloomsburg, Pennsylvania

* Refer to the annexure for projects undertaken.

ANNEXURE (PROJECTS UNDERTAKEN)

Project: EGP
Environment: DataStage, DB2, Teradata and Oracle
Role: Track Lead
Period: Jan 2013 - Feb 2014

Description: This IT initiative project's major focus is to bring all systems onto a uniform platform and database. While maintaining the inventory of Target stores, the EGP program transfers ADW capabilities such as guest scoring, item affinity, and market basket analysis. The current ADW is retained, driving the need for data, application, and report integration into the EDW.

Project: MBI-Inventory
Environment: DataStage, DB2, Teradata and Oracle
Role: Track Lead
Period: Dec 2011 - Dec 2012

Description: Inventory data in the legacy UDB environment did not meet all the analytical requirements of the food and general merchandise businesses. Ensuring that inventory data meets these analytical needs enables more timely, accurate, and actionable reporting to be delivered back to the business via parallel partner projects: the 2012 MBI reporting release, Presentation & Space Reporting, and Strategic Planning Funnel reporting. By implementing a more retail-industry-specific model and leveraging the learnings from building the FMA/Unsalables marts, Target will be better positioned to anticipate future market trends and to leverage its own inventory data more effectively. Inventory was initially implemented in the legacy UDB environment as part of the Food Merchant Analytics (FMA) project, but had a decidedly food slant to it.
During the course of our re-platforming analysis, the project team identified a number of logical and physical recommendations to improve the architecture and usability of this data for general merchandise.

Project: PARS (Platform Architecture, Replacement and Support)
Environment: DataStage, DB2, Teradata and Oracle
Role: Track Lead
Period: Feb 2010 - Dec 2011

Description: The objective of this project was to migrate the code and warehouse from DB2 to Teradata; it was essentially a migration project, undertaken because the warehouse was outgrowing DB2. The work involved migrating the code to DataStage 8.5 and converting DB2 SQL into Teradata SQL using Perl programming (a minimal BTEQ sketch follows).
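For flavor, a minimal Teradata BTEQ sketch of the batch style such a DB2-to-Teradata migration produces; the logon placeholder and table names are hypothetical, and the .IF line aborts the script with a non-zero return code on any SQL error:

    .LOGON tdprod/etl_user,password;

    /* Load the day's staged rows into the warehouse table. */
    INSERT INTO dw.sales_fct
    SELECT *
    FROM   stg.sales_day;

    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;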
Project: Merchandising Item
Environment: DataStage, Informatica and DB2
Role: Track Lead
Period: Mar 2009 - Jan 2010

Description: The foundation and dimension layer of Item was created to provide granular information on every Target item at each hierarchy level. Shared Objects (SO) was built with the vision of being the source of all reference information. Shared Objects holds the reference conformed dimensions, which include the item hierarchy, location hierarchy, calendar information, senior buyer, vendor, and team member. SO is a never-ending project: the number of entities in it grows as new reference objects are identified in EBI.

Project: DQ SCR
Environment: DataStage, Informatica and DB2
Role: Track Lead
Period: Dec 2008 - Mar 2009

Description: Handled data quality issues in the implemented project, according to the severity of the tickets raised by the business data quality team.

Project: VCI (Vendor Compliance Integration)
Environment: DataStage 7.5, DB2, Oracle and UNIX
Role: Software Engineer
Period: May 2008 - Dec 2008

Description: The objective of this project was to integrate the source systems Vendor Compliance System (VCS) and Import Claim System (ICL) into the VRC data mart, enabling performance calculations and compliance charges to be seen in a single view and dispute research to be done in one report. A further objective was to view the relationships between chargebacks, reversals, and denials, along with a reason code for reversals and denials. The new system needed the ability to view chargebacks that are in progress (not fully paid).

Project: VRC (Vendor Report Card)
Environment: DataStage, Informatica, DB2, Teradata, Oracle and Tandem
Role: Software Engineer
Period: Nov 2006 - Apr 2008

Description: The objective of this project was to improve data collection, expand measures to align with the needs of the business, improve vendor management, and improve vendor performance. By fulfilling these objectives, Target Corporation aimed to increase overall vendor performance by 1 to 2 percentage points in Fill Rate Original, Fill Rate Revised, On Time Shipment, EDI 856 % Match, Lead-Time Reliability, and Automated Receiving Technology.

Project: Gift Registry
Environment: DataStage 7.5, DB2 and UNIX
Role: Software Engineer
Period: Jun 2006 - Nov 2006

Description: The Gift Registry is an application through which a registered user can select gifts for an occasion on the target.com website; the popular registries are baby products, wedding gifts, and holiday gifts. The user adds the gifts he or she wants to receive, and registries are searchable by username.
