Data Architect Resume - South Riding, VA
 Candidate's Name
EMAIL AVAILABLE | PHONE NUMBER AVAILABLE

Suresh has over 20 years of IT industry experience in Healthcare, Finance, and Banking. He is an expert at managing and implementing large-scale programs involving data integration, data warehouse construction, business intelligence systems, information delivery strategies and load patterns, ETL frameworks, data architecture guidelines, and MDM architecture. He is a proven leader across many aspects of technology implementation, with the eventual goal of providing an integrated 360-degree view of the customer and a higher quality of data.
Data Warehousing / Data Architect / ETL Architect / Business Intelligence Expertise

- Envisioning enterprise-level solutions, translating high-level goals into requirements and architectural designs, establishing architectural roadmaps, application design and development, testing, and production support, with a comprehensive background in database and data warehousing concepts
- Experienced as an Enterprise Data Architect / ETL Architect / Data Modeler, covering Data Warehouse, Data Integration, and Business Intelligence models (conceptual, logical, and physical), Data Analysis, Data Mapping, Data Profiling, Data Acquisition, Data Replication, Data Lineage, and Data Governance
- Has worked as a Data Management Architect for over 10 years; of late, has been helping clients evolve and build their data management architecture using cloud technologies and services from Azure and AWS
- Created future-state architecture road map documents for building data warehouse architectures using emerging solutions such as Data Vault, Data Mesh, Data Fabric, Data Virtualization, and Azure Synapse Analytics
- Implemented multi-tier architecture solutions comprising Operational Data Store (ODS), staging, data warehouse, and business intelligence/reporting layers
- Created future-state architecture road map documents to streamline data warehouse architecture using emerging technologies like Hadoop
- Led an effort to build a Big Data POC initiative, including installing and configuring a 7-node cluster using the Cloudera distribution of Hadoop (CDH 4.7) and using Hive, HBase, and Sqoop
- Experienced in database/ETL programming, creating data warehouses (star schemas and snowflake schemas; see the sketch after this list) using leading ETL tools such as Ab Initio, Informatica, and DataStage
- Well versed in cloud computing fundamentals and technologies, specifically IaaS, SaaS, and PaaS
- Experienced in creating enterprise-level data strategies for new system initiatives, building data architecture standards, and understanding and championing industry best practices
- Led initiatives to create enterprise standards and design patterns for data marts, the enterprise data warehouse, and application databases
- Led technical discussions and solution planning involving Enterprise and Solution Architecture teams to roll out detailed designs for various solutions and, more importantly, to align all architecture teams with the IT goals set
- Provided guidance on and created enterprise data profiling reports for all CareFirst databases, particularly for the Data Governance team; project teams use these reports to determine the quality of data and to assist with source-to-target mappings
- Assessed the current-state MDM implementation and produced a future-state architectural road map document for Master Data Management
- Worked extensively on the Oracle Exadata platform
- Experienced in performance tuning and capacity planning of databases, optimization of SQL queries, and performance tuning of Informatica ETL objects such as sources, mappings, targets, and sessions
- Well versed in data profiling, maintaining data dictionaries, and performing data lineage to troubleshoot production issues
- Well versed in CI/CD processes and code versioning using Git; also well versed in using Docker containers for code development
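The star/snowflake distinction above is easiest to see in schema form. Below is a minimal, purely illustrative sketch in Python using the standard library's sqlite3; the healthcare-flavored table and column names are hypothetical, not drawn from any client system:

    import sqlite3

    # A star schema keeps one central fact table keyed to small,
    # denormalized dimension tables; a snowflake schema would further
    # normalize those dimensions into sub-dimensions.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE dim_member (
            member_key   INTEGER PRIMARY KEY,
            member_name  TEXT,
            region       TEXT
        );
        CREATE TABLE dim_date (
            date_key        INTEGER PRIMARY KEY,  -- e.g. 20240131
            calendar_date   TEXT,
            fiscal_quarter  TEXT
        );
        CREATE TABLE fact_claim (
            claim_id    INTEGER PRIMARY KEY,
            member_key  INTEGER REFERENCES dim_member(member_key),
            date_key    INTEGER REFERENCES dim_date(date_key),
            paid_amount REAL
        );
    """)

    # Typical BI query shape: aggregate the fact, slice by dimensions.
    rows = conn.execute("""
        SELECT m.region, SUM(f.paid_amount)
        FROM fact_claim f
        JOIN dim_member m ON m.member_key = f.member_key
        GROUP BY m.region
    """).fetchall()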
Employer History
- February 2020 - Present: Data Architect with AFS
- October 2018 - January 2020: Digital Insights Lead / Data Architect with CGI
- October 2017 - September 2018: Independent Data Consultant
- February 2009 - September 2017: ETL Architect, Consultant Specialist, Technical Delivery Manager, and Enterprise Data Architect with CareFirst BCBS
- March 2007 - January 2009: Expert Consultant with Hewlett Packard
- October 2004 - February 2007: Principal Consultant with Knightsbridge Solutions
- April 2000 - September 2004: Independent Consultant, working mostly for IBM Global Services
- July 1997 - March 2000: Programmer/Analyst with Sristek Technologies Ltd. (India)

Education

BE in Electrical Engineering, 1995

Citizenship

US Citizen, with active Secret Clearance
Certifications
- SAS Certified Professional
- ITIL Certified

Technical Skills

Federal: Department of State
Industries: Healthcare Insurance, Financial, Banking, Telecommunications
Project Domain: Data Integration / Data Warehousing initiatives, systems built for government-mandated programs, Financial Management Solutions, Performance Improvement, Business Process Re-engineering, Application Development, etc.
Databases: Oracle 11g, Teradata, DB2, Sybase, SQL Server
ETL Tools: Ab Initio, Informatica, DataStage, Oracle Data Integrator
Modeling Tools: ER/Studio Data Architect 10.0.2, Erwin
MDM Tools: IBM Initiate Systems
Reporting Tools: MicroStrategy 10, SSRS, Business Objects 4.0/5.0, Salesforce Einstein
Operating Systems: Unix, Windows, IBM AIX, Sun Solaris, MS-DOS
Computer Languages: Python, R, C, C++, ASP 3.0, HTML, DHTML, VBScript, JavaScript, SQL, PL/SQL, XML, CGI, Perl, Java, Servlets 2.1, JSP 1.0, EJB
GUI: Visual Basic 6.0/5.0/4.0, Developer 2000
CRM/Cloud Tech: Salesforce, AWS
Web Servers: Java Web Server 2.0, BEA WebLogic 4.5/5.1, IIS 4.0/5.0
Development Tools: TOAD, SQL Developer, ER Studio, Visual Studio Code
Applications: Microsoft Office 2000, ClearQuest, HP Service Manager
Training: Data Modeling, IBM Initiate, MicroStrategy 10, Ab Initio, DataStage, Oracle 8i

PROFESSIONAL EXPERIENCE

DOS (Dept. of State)
February 2020 to Present
Data Architect
As part of the IT modernization effort at DOS, I led the Azure cloud environment setup and configuration to boost the department's data management capabilities and enable a data strategy ecosystem that supports the people who use it, the technology that underpins it, and the processes that facilitate it.
Following are some of the high-level tasks and accomplishments:

- Created and maintained an execution plan for the cloud infrastructure build, involving Accenture, Microsoft, Informatica, and DOS
- Set up IaaS servers and configured the Informatica products (IDQ, EDC, and Axon)
- Set up and configured ADLS Gen2
- Enabled Synapse Analytics
- As a Data Architect, was responsible for creating the data architecture standards
- As part of a pilot effort, built the data model for a Project domain covering all project activities and reviewed it with the CTO and BCDO
- Was responsible for creating the ETL standards
- Built the future-state data management architecture and presented it to the CTO
- Contributed heavily toward getting the Informatica products and Azure services ATO'd (Authorization to Operate)
- As data architect and technology sub-lead on the Data Strategy team, contributed heavily to documenting and creating a data strategy ecosystem deck that captured the current state and a proposal for a future state
- Built data pipelines using Python to process CSV and Excel files (a minimal sketch follows this list)
- As part of the Data Strategy team, was heavily involved with an Analysis of Alternatives for the following areas in the data management realm:
  - Operational systems: low-code/no-code apps
  - Data streaming services
  - Data Lakehouse
  - Data Mesh and Data Fabric
  - Analytics and reporting platform
  - DevOps
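The Python pipelines mentioned above followed the usual read-normalize-land shape. A minimal sketch with pandas, where file names, columns, and the output format are hypothetical assumptions rather than details of the DOS work:

    import pandas as pd

    # Read a CSV extract and an Excel workbook, normalize the column
    # names, de-duplicate, and land one combined curated file.
    csv_df = pd.read_csv("extract_a.csv")     # hypothetical input
    xls_df = pd.read_excel("extract_b.xlsx")  # hypothetical input

    frames = []
    for df in (csv_df, xls_df):
        df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
        frames.append(df)

    combined = pd.concat(frames, ignore_index=True).drop_duplicates()
    combined.to_parquet("combined_curated.parquet", index=False)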
CGI
October 2018 - January 2020
Digital Insights Lead

As a Digital Insights Lead, I was involved with pre-sales and POC initiatives for various clients. My responsibilities included:
- Analyzing the current state and identifying opportunities to revamp the architecture around data management, data lake, data warehouse, analytics, and business intelligence with more contemporary tools and technologies, using modern data architecture techniques and best practices
- Leading ideation sessions and collaborating with clients to understand their pain points and issues with the current-state architecture, and producing a blueprint / road map around data engineering involving Data Profiling, Data Acquisition, Data Quality, Data Mapping, and Master Data Management
- Identifying opportunities and use cases for implementing advanced analytics, and helping clients realize the importance of data science, predictive analytics, data visualization, and machine learning & artificial intelligence
Creative Systems and Solution (Contracting)
December 2017 - September 2018
Data Architect (Dept. of Treasury)
The Community Development Financial Institutions Fund (CDFI Fund) plays an important role in generating economic growth and opportunity in some of our nation's most distressed communities. By offering tailored resources and innovative programs that invest federal dollars alongside private-sector capital, the CDFI Fund serves mission-driven financial institutions that take a market-based approach to supporting economically disadvantaged communities. The Bureau of the Fiscal Service (Fiscal Service), on behalf of the CDFI Fund, sought assistance with re-implementing the legacy Community Investment Impact System (CIIS) in the Awards Management Information System (AMIS). AMIS is a cloud-based solution developed on the Salesforce platform. It supports the CDFI Fund's certification and award/allocation programs through all phases of the programs' life cycles, from program announcement through award close-out.

As a Data Architect on this effort:

- Analyzed the legacy system to understand all of its functionality
- Led and planned the discovery phase, identified topics, and set the agenda for discussions with the Fund
- Reverse engineered the existing data model to facilitate understanding of the data and the business
- Developed a migration strategy for moving the historical data into the new Salesforce AMIS system, vetted the plan with all stakeholders, and obtained approvals
- Built the source-to-target mapping document to capture the mapping between source and target elements
- Used Data Loader to facilitate migrating the data over
- Analyzed and identified use cases within the business functionality to build the case for implementing the Salesforce Einstein (AI-enabled) reporting platform for the client
- Used the Informatica Analyst tool extensively to profile data, catching data anomalies and reporting them to the governance team (an illustrative profiling sketch follows the skills list)
- Used ER Studio to build and maintain data dictionaries

Skills: MS Visio, SQL Server, Salesforce, Data Loader, Workbench, Einstein, Informatica Analyst tool, ODI 12c
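The profiling itself was done in the Informatica Analyst tool; purely to illustrate the kind of checks involved, a rough pandas equivalent (file and column names are hypothetical) might be:

    import pandas as pd

    df = pd.read_csv("awards_extract.csv")  # hypothetical extract

    # Column-level profile: null rate, distinct count, and data type,
    # the same basic statistics a profiling tool reports per column.
    profile = pd.DataFrame({
        "null_pct": df.isna().mean().round(3),
        "distinct": df.nunique(),
        "dtype": df.dtypes.astype(str),
    })
    print(profile)

    # Example anomaly checks to hand to a governance team:
    dupes = df[df.duplicated(subset=["award_id"], keep=False)]
    bad_amounts = df[df["award_amount"] < 0]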
CareFirst BCBS
Enterprise Data Architect Consultant
September 2013 - November 2017

Technical Delivery Manager / ETL Architect
February 2009 - August 2013
As an Enterprise Data Architect Consultant:

- Collaborated with executives in defining the strategic view of corporate business information needs
- Responsible for defining the vision, strategies, data patterns, and policies around how data is acquired, stored, integrated, conformed, and distributed within the enterprise
- Responsible for designing enterprise data models, including models for the data warehouse, data marts, application data stores, and Big Data (HDFS) structures, and for the creation, instantiation, and deployment of databases (working with DBAs)
- Led the EDA team responsible for building the future-state CareFirst Business Intelligence Warehouse (CBIW)
- Led the effort to create enterprise standards and design patterns for data marts, the enterprise data warehouse, and application databases
- Used Informatica Metadata Manager to track data lineage
- Responsible for working closely with various project stakeholders (Business, Data Governance, Enterprise Architecture, Solution Architecture, Security, DBA, ODS, and CBI) throughout the project SDLC phases
- Responsible for performing data profiling on all incoming data into the warehouse to check and ensure quality
- Responsible for working with Legal to ensure all external vendors have a Data Use Agreement
- Responsible for keeping track of industry trends and directions, and for developing and presenting substantive technical recommendations to project teams
- Responsible for collaborating with SMEs, Enterprise Architects, and Solution Architects to analyze the current and future information needs of a project based on short- and long-term business goals, determine the enterprise data impact, and help create L1 estimates and a high-level delivery schedule for data deliverables
- Created a current-state assessment document and a future-state architectural road map for Master Data Management
- Communicated with business stakeholders to understand business goals and strategy (particularly for data warehouse related projects)
- Coordinated technical design sessions across multiple domains' development teams, ensuring a consistent design approach and swift resolution of cross-team design issues
- Contributed toward building MicroStrategy schema objects

As a Technical Delivery Manager / ETL Architect (February 2009 to August 2013)

ODS & EDM: As part of the corporate-wide Model Office initiative, the Data and Informatics team was charged with creating a consolidated Operational Data Store (ODS) containing data from all the major source systems CareFirst currently uses, so that the data is housed in one location and, more importantly, decoupled from the sources for further dissemination and use.
MLR: The Medical Loss Ratio (MLR) regulation, as mandated by Section 2718(b) of the Public Health Service Act (PHSA) and the Patient Protection and Affordable Care Act (PPACA), requires health plans offering group and individual health insurance to publicly report their medical loss ratio, in order to create transparency around a plan's efficiency in allocating premiums to actual healthcare costs. Plans are required to issue a rebate to groups and subscribers if the share of premium spent on claims expenses is less than 85% for the large group market or 80% for the small group and individual markets (a worked example of this rebate rule appears after the list below).

My roles and responsibilities on the above programs were:

- The above program was multifaceted, with multiple releases
- Was responsible for designing the ETL framework
- Was responsible for designing the ABC framework
- Responsible for conducting JAD and design sessions
- Responsible for completing the SAP 3 (Solution Architecture Package) documents
- Responsible for engaging with the business to review requirements and source-to-target mappings
- Apart from the technical side of the effort, was responsible for planning tasks, scheduling, monitoring, and reporting on the objectives and progress of the project
- Involved with benchmarking/comparing ETL efficiency when loading data into Oracle 12c versus the Oracle Exadata platform
- Involved with benchmarking/comparing ETL efficiency using full and partial pushdown optimization to leverage the power of the Exadata server
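To make the MLR rebate rule above concrete, here is a minimal worked example in Python. The 85%/80% thresholds come from the regulation as described above; the statutory calculation involves further adjustments (e.g., quality improvement expenses) that are omitted here, and the dollar figures are hypothetical:

    # MLR = claims expenses / premium; a rebate is owed when the MLR
    # falls below the market's threshold.
    THRESHOLDS = {"large_group": 0.85, "small_group": 0.80, "individual": 0.80}

    def mlr_rebate(premium: float, claims: float, market: str) -> float:
        """Return the rebate owed: the shortfall below the threshold,
        expressed as a share of premium, or 0.0 if the plan complies."""
        mlr = claims / premium
        return max(0.0, (THRESHOLDS[market] - mlr) * premium)

    # Hypothetical large-group plan: $1,000,000 premium, $780,000 claims.
    # MLR = 78% < 85%, so the rebate is 7% of premium = $70,000.
    print(mlr_rebate(1_000_000, 780_000, "large_group"))  # 70000.0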
Hewlett Packard (Expert Consultant)
August 2007 - January 2009
Client: Republic Bank Ltd, Trinidad
ETL Architect - Enterprise Data Warehouse

Republic Bank is one of the leading banks in Trinidad and Tobago. The project was a massive enterprise-level data integration initiative: around 12 independent source systems (credit cards, regular banking, investment banking, and mortgage systems) were identified to enable customer relationship management and other reporting needs.
Activities
- Acted as a trusted advisor, leading the analysis and implementation of the DI effort
- Collaborated with other architects to align the DI architecture with the business case and the overall solution architecture
- Contributed heavily to discussions on implementing the design of the CDC process
- Set DI standards and architecture for the project
- Guided the design and implementation of the DI solution from a technical perspective
- Developed a template job for performing the CDC process (an illustrative sketch follows the skills list)

Skills: MS Word 2003, MS PowerPoint 2003, MS Visio, DataStage, Neoview R2.3, Sybase, SQL Server
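The template job itself was built in DataStage; as an illustration of the snapshot-diff idea behind change data capture, a rough Python equivalent (key and file names are hypothetical) might be:

    import pandas as pd

    # Compare yesterday's and today's snapshots of a source table and
    # classify rows as inserts, deletes, or updates.
    KEY = "account_id"
    prev = pd.read_csv("snapshot_prev.csv").set_index(KEY)
    curr = pd.read_csv("snapshot_curr.csv").set_index(KEY)

    inserts = curr.loc[curr.index.difference(prev.index)]
    deletes = prev.loc[prev.index.difference(curr.index)]

    common = curr.index.intersection(prev.index)
    # Note: NaN != NaN evaluates True, so a production job would
    # handle nulls explicitly before comparing.
    changed = (curr.loc[common] != prev.loc[common]).any(axis=1)
    updates = curr.loc[common][changed]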

Knightsbridge Solutions LLC (Principal Consultant)
October 2004 - September 2007
Client: Fannie Mae, Washington, DC
DI Architect / DI Designer
FDW (Financial Data Warehouse)

The goal of this project was to obtain and store Fannie Mae securities data from various Fannie Mae source systems in the Financial Data Warehouse (FDW). To achieve this, the FDW Securities team had to develop the processes and the associated extract, transform, and load jobs. One of the key goals of this effort was to provide Portfolio Data Services (an internal Fannie Mae system) with the data needed to perform portfolio management and modeling.

Client: Fannie Mae, Washington, DC
Project Manager / ETL Architect / BI Architect

Evaluation of RFDW & FDW

The assignment was to evaluate two robust and significantly different data warehouse systems used to track loan-level data. Part of the assignment was to study the feasibility of integrating the two systems (FDW, the Finance Data Warehouse, and RFDW, the Restatement Finance Data Warehouse) into one consolidated system, ADW (Accounts Data Warehouse). ADW would replace the two systems to support Fannie Mae's current and future financial reporting needs.

Client: Kaiser Permanente, Silver Spring, MD
Project Manager / ETL Architect

IDX (GE ImageCast)

The implementation of GE ImageCast version 10.4.1 in the Mid-Atlantic States region replaced the KMATE Radiology Information System and brought the region onto a radiology platform capable of supporting digital imaging and PACS. ImageCast was also interfaced with existing systems in the region, such as KMATE, KP HealthConnect, MRS, and Dictaphone, as well as the radiology modalities. The application scope involved moving the data from the OLTP system (700+ tables on SQL Server) into the data warehouse maintained on Sybase.

Client: Kaiser Permanente, Silver Spring, MD
ETL Architect / Business Analyst / Data Analyst

PMD (Project Management Database)

The ASM Information Integration Services (IIS) Project Management Database was a POC used to provide centralized tracking and program management of development projects for the ASM IIS group. The IIS team was located in three service centers: Pasadena, CA; Portland, OR; and Silver Spring, MD. Additional staff members resided in multiple geographical regions. The IIS PMD system allowed executive management to track metrics, such as resource allocation, financial information, and project status, at a national level. It resulted in reduced administrative costs, faster delivery time, and overall lower total cost of ownership through improved efficiencies.
America Online, Dulles, VA
ETL Designer / Lead Developer

SIFT
SIFT collected usage counts, such as the number of members using a specific product, the total number of unique sessions for active unique accounts per product, and the total number of unique sessions for active sub-accounts per product. It also provided information on usage frequency, such as the number of sessions in which a unique account or sub-account used a product over a specified period of time.
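Restated purely as an illustration (the log layout and column names are hypothetical), the SIFT-style aggregations map naturally onto pandas group-bys:

    import pandas as pd

    # One row per (session, account, sub-account, product) usage event.
    log = pd.read_csv("usage_log.csv")  # hypothetical log extract

    # Members using each product, and unique sessions per product.
    per_product = log.groupby("product").agg(
        unique_accounts=("account_id", "nunique"),
        unique_sessions=("session_id", "nunique"),
    )

    # Usage frequency: unique sessions per account per product
    # over whatever period the extract covers.
    freq = (log.groupby(["product", "account_id"])["session_id"]
               .nunique()
               .rename("sessions_in_period"))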
Consultant, IBM Global Services
May 2000 - May 2004
Capital One, Glen Allen, VA
Team Lead / Onsite Coordinator

AMDD

This was a massive enterprise-level application migration program to enable legacy application decommissioning. The effort reduced legacy database space consumption by over 11 terabytes and involved replacing the clustered Oracle setup with Teradata. The work predominantly involved evaluating the ETL jobs (written in PL/SQL), rebuilding the ETL framework, and pointing the jobs to the new Teradata database.
United Health Care, St. Louis, MO
Ab Initio Developer / SAS Programmer Analyst

Galaxy Management System

Galaxy Management System (GMS) was a multi-terabyte data warehouse; the fundamental goal of the effort was to integrate and conform data from various source systems. We were also engaged in building a prototype using Ab Initio and evaluating its performance and capabilities against the existing PL/SQL scripts and Base SAS jobs used to move huge volumes of data. As an SAS programmer/analyst, was responsible for enhancing the existing system, which included analyzing, coding, reviewing, testing, and implementing SAS programs. Also developed common utility programs, using SAS macros, for use by the development team.

Sr. SAS Developer, Sristek Technologies
July 1997 to April 2000

Dr. Reddy's Lab, Hyderabad, India
SAS Programmer Analyst

Maintenance of Data Warehouse
Activities

- Was a member of the IT division's data warehouse team that installed SAS v6 on client servers and demonstrated various SAS tools. Developed numerous prototype applications to demonstrate the capabilities of the SAS suite of products, and later developed applications that extracted data from flat files and OLTP systems and loaded it into the target data marts using Base SAS, SAS/ACCESS, and SAS macros.

Gatti Cargo, Hyderabad, India
Programmer

The basic objective of this project was to make dealer business dividend incentive reports available on the company's intranet. Sales and dealer finance detail data in the database was used for dividend incentive calculations and report generation.
