

Candidate Information
Title: Data Engineer Science
Target Location: US-CO-Highlands Ranch

Candidate's Name
E-mail: EMAIL AVAILABLE
Contact: PHONE NUMBER AVAILABLE
LinkedIn URL: https://LINKEDIN LINK AVAILABLE

Professional Summary
- Data Engineer/Sr. SAS Developer with over 12 years of experience transforming business requirements into analytical models, designing algorithms, building models, and devising data mining and reporting solutions that scale across massive volumes of structured and unstructured data, with expertise in a variety of industries including retail and healthcare.
- Experience in development of SAS reports and in production support environments.
- Experience in SAS admin activities, including provisioning user-level roles and installing SAS EG software.
- Advanced expertise in data manipulation using SAS DATA step statements such as SAS formats/informats, MERGE, DATA _NULL_, SET, UPDATE, functions, and conditional statements, and advanced SAS programming techniques such as PROC SQL (JOIN/UNION), PROC APPEND, PROC DATASETS, and PROC TRANSPOSE.
- Experience in Base SAS, SAS/MACRO, SAS/ODS, SAS/GRAPH, and SAS/SQL.
- Extensive experience in generating reports using procedures such as PROC REPORT, PROC TABULATE, PROC TRANSPOSE, PROC FORMAT, PROC SUMMARY, PROC MEANS, PROC FREQ, PROC SQL, and PROC PRINT.
- Proficient in writing macros to create SAS data sets, tables, reports, and graphs from procedure output and automatically send them to various destinations (including HTML, RTF, PDF, printer, and listing) using the SAS ODS statement, as well as converting various file types to SAS data sets.
- Hands-on experience in SAS programming for extracting data from flat files, Excel spreadsheets, and external RDBMS tables (Oracle, Teradata, DB2, Netezza) using LIBNAME and the SQL pass-through facility (a sketch follows this list).
- Quick learner and excellent team player, always ready to take on additional responsibilities.
- Able to work on multiple tasks simultaneously, meet project deadlines, and work irregular hours when needed.
- Proficient in the data science process life cycle: data acquisition, data preparation, modeling (feature engineering, model evaluation), and deployment.
- Familiar with key data science concepts such as statistics, data visualization, and machine learning; experienced in Python, R, MATLAB, SAS, and PySpark programming for statistical and quantitative analysis.
- Good knowledge of building production-quality, large-scale deployments with SAS.
- Acquainted with high-performance computing (cluster computing on AWS with Spark/Hadoop) and building real-time analyses with Kafka and Spark Streaming; working knowledge of Qlik, Tableau, and Power BI.
- Extensive experience in writing macro programs around existing prediction models.
- Excellent analytical and problem-solving skills.
- Good communication and presentation skills.
- Experience using batch processing methods for SAS file deployment and report generation.
- Experience in writing shell scripts and UNIX commands.
- Extensive experience working with RDBMSs such as SQL Server and MySQL, and with NoSQL databases such as MongoDB and HBase.
- Reduced the development time for each dashboard from approximately 1 week to 6 hours through extensive automation with SAS macros and Excel macros.
- Ran SQL queries in Teradata SQL Assistant for time efficiency, then extracted the desired results into SAS through the SQL pass-through facility for report generation.
- Used macro recording in MS Excel for data summarization when needed.
- Generated data visualizations using tools such as Tableau, Python Matplotlib, Python Seaborn, and R.
- Experienced in Agile environments, including the Scrum process; used project management tools such as ProjectLibre and Jira, and version control tools such as GitHub/Git.
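A minimal Base SAS sketch of the SQL pass-through extraction named above; the Teradata server name, credential macro variables, and the dw.claims source table are hypothetical placeholders, not details from this resume.

/* Hypothetical Teradata connection and source table; illustrates the
   PROC SQL pass-through pattern only. */
proc sql;
   connect to teradata (user=&td_user password=&td_pass server=tdprod);
   create table work.member_claims as
   select * from connection to teradata
      ( select member_id, claim_dt, paid_amt
        from dw.claims
        where claim_dt >= date '2023-01-01' );
   disconnect from teradata;
quit;

The inner query executes on Teradata itself, so only the filtered rows travel back to SAS; the LIBNAME approach instead exposes the warehouse tables directly as SAS librefs.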
Programming Skills:
DATA SOURCES: Snowflake, PostgreSQL, MS SQL Server, MongoDB, MySQL, HBase, Amazon Redshift, Teradata.
LANGUAGES: SAS BI tools (Base SAS, Advanced SAS, SAS EG, SAS Management Console), Python (NumPy, Pandas, scikit-learn, Matplotlib, Seaborn), R, SQL.
DATA VISUALIZATION: Tableau, Python (Matplotlib, Seaborn), R (ggplot2), Power BI, QlikView, SAS Web Report Studio.
OPERATING SYSTEMS: UNIX shell scripting (via PuTTY client), Linux, Windows, macOS.

EDUCATION
Bachelor of Technology in Information Technology from J.N.T. University, Kakinada, 2011, with First Class.

Professional Experience:

Charter Communications, Denver, CO JAN 2023 to present
Sr. Consultant / Sr. SAS Developer

Description of the Project:
The client is one of the leading video and internet service providers. This project involves developing reports using SAS, Python, Azure, and CI/CD (DevOps) applications to improve performance. We fetch data from multiple data sources such as DB2, Oracle, Teradata, and Hadoop, based on business requirements.

Responsibilities:
- Used different PROCs to analyze the data.
- Created SAS data sets for analysis, building SAS macros for reusability.
- Handled big data to prepare extensive reports that can be used for further analysis.
- Developed ad-hoc code, based on requirements, to examine customer utilization.
- Created reports referencing member demographic details.
- Wrote code for feature engineering, principal component analysis (PCA), and hyperparameter tuning to improve model accuracy.
- Responsible for developing statistical reports using SAS EG and SAS Studio.
- Worked with SAS/BASE, SAS/MACRO, SAS/SQL, and SAS/ODS procedures.
- Worked with SAS/MACRO to create macro variables, macro programs, and an autocall macro library, modifying existing SAS programs for ease of maintenance while keeping results consistent.
- Worked with SAS procedures such as PROC PRINT, PROC REPORT, PROC TABULATE, PROC FREQ, PROC MEANS, PROC SQL, PROC SORT, PROC CONTENTS, and PROC TRANSPOSE (a reporting sketch follows this section).
- Developed a Python automation script for consuming data from Teradata, DB2, and Hadoop.
- Developed a Python script to automate data cataloging in the Alation data catalog tool.
- Tagged all Personally Identifiable Information (PII) in the Alation enterprise data catalog tool to flag sensitive consumer information.
- Ingested various types of data into Hive using the ELake ingestion framework, which internally uses Pig, Hive, and Spark for data processing.
- Practical understanding of data modeling (dimensional and relational) concepts such as star schema modeling, snowflake schema modeling, and fact and dimension tables.

Job Environment: SAS BI tools (Base SAS, Advanced SAS, SAS Management Console), Python (NumPy, Pandas, Matplotlib, Seaborn), Agile/Scrum.
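A short sketch of the reusable macro plus PROC REPORT plus ODS pattern described in this role; the data set, column names, and output file are hypothetical placeholders.

/* Hypothetical data set and columns; sketches a parameterized
   reporting macro, not production code. */
%macro usage_report(ds=, class=, measure=, outfile=);
   ods pdf file="&outfile";
   proc report data=&ds nowd;
      column &class &measure;
      define &class   / group;
      define &measure / analysis sum format=comma12.;
   run;
   ods pdf close;
%mend usage_report;

%usage_report(ds=work.cust_usage, class=region, measure=gb_used,
              outfile=usage_by_region.pdf)

Parameterizing the data set and columns is what makes one macro serve many ad-hoc report requests with consistent output.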
CIGNA, Bloomfield, CT AUG 2018 to DEC 2022
Sr. Data Engineer / Sr. SAS Developer

Description of the Project:
A development and production support project for a leading healthcare client. This project involves developing reports using SAS and Hyperion. SAS reports are developed using SAS EG and SAS Studio, which fetch data from multiple data sources such as DB2, Oracle, Teradata, and Hadoop based on client requirements.

Responsibilities:
- Used different PROCs to analyze diagnosis-based claims.
- Created SAS data sets for analysis, building SAS macros for reusability.
- Handled claims data to prepare extensive reports that can be used for further analysis.
- Developed ad-hoc code, based on requirements, for the medical claims process.
- Created reports referencing provider and member demographic details (a join-and-summarize sketch follows this section).
- Responsible for developing statistical reports using SAS EG and SAS Studio.
- Worked with SAS/MACRO to create macro variables, macro programs, and an autocall macro library, modifying existing SAS programs for ease of maintenance while keeping results consistent.
- Worked with SAS procedures such as PROC PRINT, PROC REPORT, PROC TABULATE, PROC FREQ, PROC MEANS, PROC SQL, PROC SORT, PROC CONTENTS, and PROC TRANSPOSE.
- Practical understanding of data modeling (dimensional and relational) concepts such as star schema modeling, snowflake schema modeling, and fact and dimension tables.
- Highly experienced in developing data marts and warehouse designs using distributed SQL concepts, Presto SQL, Hive SQL, Python (Pandas, NumPy, SciPy, Matplotlib), and PySpark to cope with the increasing volume of data.
- Created tables in Hive and performed queries using HiveQL.
- Coordinated with clients to arrange meetings and discuss issues and challenges in the project.
- Involved in SAS client application installation, user creation, monitoring job streams, handling incidents and enhancements in ServiceNow, creating jobs and modifying existing jobs, and analyzing issues raised by the client.
- Extracted data from different sources and transformed it per the design document using SAS DI jobs, Base SAS, and macros.
- Involved in the migration from SAS 9.4 M3 to M6.
- Took initiative in accepting critical issues and supporting peers; involved in daily and monthly refresh loads.
- Responsible for developing customized and scheduled Hyperion reports based on client requests.
- Automated and migrated 200 scheduled Tableau and Hyperion reports to SAS programs, reducing manual effort by 90%.
- Interacted with clients and business customers to understand requirements.
- Estimated the effort needed from offshore for report development.
- Led the offshore team, provided knowledge transfer, and allocated reports to the team.
- Helped the team resolve critical issues and deliver reports on time.
- Responsible for unit testing the reports, verifying that the developed components matched the requirements.
- Created unit testing test case documents, unit testing documents, etc.

Job Environment: SAS BI tools, Python, Hyperion, PostgreSQL, NumPy, Pandas, Jira, GitHub, Agile/Scrum.
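A hedged sketch of the join-then-summarize flow behind the demographic claim reports mentioned above; the claim and member tables, columns, and key are hypothetical.

/* Hypothetical claims and member tables; illustrates joining claims
   to member demographics, then cross-tabulating the result. */
proc sql;
   create table work.claims_demog as
   select c.claim_id, c.diag_cd, c.paid_amt,
          m.age_band, m.state
   from work.claims as c
        inner join work.members as m
        on c.member_id = m.member_id;
quit;

proc freq data=work.claims_demog;
   tables diag_cd*age_band / norow nocol;
run;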
Accenture Solutions Pvt Ltd, India JUL 2015 to AUG 2018
Sr. SAS Developer / Data Analyst

Description of the Project:
The client is an American multinational food manufacturing company. This project involves providing forecasts for many food-industry products based on historical data, and providing server maintenance support to the customers.

Role: Data Analyst

Responsibilities:
- Collaborated with data engineers and the operations team to implement the ETL process; wrote and optimized SQL queries to perform data extraction to fit the analytical requirements.
- Performed data analysis by retrieving data from the Hadoop cluster.
- Involved in SAS client application installation, user creation, monitoring job streams, handling incidents and enhancements in ServiceNow, creating jobs and modifying existing jobs, and analyzing issues raised by the client.
- Extracted data from different sources and transformed it per the design document using SAS DI jobs, Base SAS, and macros.
- Involved in the migration from SAS 9.4 M3 to M6.
- Day-to-day responsibilities included developing ETL pipelines in and out of the data warehouse and developing major regulatory and financial reports using advanced SQL queries in Snowflake.
- Performed univariate and multivariate analysis on the data to identify underlying patterns and associations between variables (a profiling sketch follows this section).
- Used Python (NumPy, SciPy, Pandas, scikit-learn, Seaborn) and R to develop a variety of models and algorithms for analytic purposes.
- Excellent experience in designing, developing, documenting, and testing ETL jobs and mappings in server and parallel jobs using DataStage to populate tables in the data warehouse and data marts.
- Proficient in developing strategies for the extraction, transformation, and loading (ETL) mechanism.
- Experienced in integrating various data sources (DB2-UDB, SQL Server, PL/SQL, Oracle, Teradata, XML, and MS Access) into the data staging area.

Job Environment: Base SAS, SAS/MACRO, SAS DI Studio, Hadoop, Python, R, Tableau, Jira, GitHub, Agile/Scrum.
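A minimal sketch of the univariate profiling and simple association checks named above, assuming a hypothetical forecast-input data set and measure columns.

/* Placeholder data set and variables; sketches distribution
   profiling and pairwise correlations only. */
proc univariate data=work.sales_hist;
   var units_sold;
   histogram units_sold / normal;
run;

proc corr data=work.sales_hist;
   var units_sold unit_price promo_spend;
run;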
GENPACT INDIA PVT LTD SEP 2014 to JUL 2015
Data Engineer / Data Analyst

Description of the Project:
The client is one of the leading pharmacy benefit managers in the USA, acting as a TPA between pharmacies and insurance companies to provide insurance on drugs. As part of the re-pricing team, we generate forecasted pricing for the next 3 years based on the incumbent pricing and share it with the underwriting team. The UW team comes back with the approved financial discounts, and we segment the data into Retail, Mail, and Specialty with those guarantees.

Responsibilities:
- Combined SAS data sets, created basic and detailed summary reports using SAS procedures, and identified and corrected data, syntax, and programming logic errors (a combine-and-summarize sketch follows this section).
- Hands-on writing code in Python and PySpark to manipulate data for data loads, extracts, statistical analysis, modeling, and data validation.
- Developed streaming and batch applications using an enterprise scheduler and Python.
- Mainly responsible for true-sourcing the data through our pipeline, applying various data handling techniques: data quality checks, schema validation, row count validation, data standardization, and data lineage.
- Wrote custom Python scripts to transform data into ETL logic and performed data-driven analysis, data quality checks, and data profiling.
- Developed Python scripts to transfer files across regions.
- Used Python, Pandas, and Spark to read big files in different formats, such as Parquet, EBCDIC, and text, and to validate millions of records of data.
- In Python, worked with Pandas, NumPy, and Dask data frames to analyze millions of transactions and produce validation reports on the data.
- For the largest files, used PySpark data frames for analysis, saving time on data validations.
- In Pandas, used the loc and iloc functions to filter data based on conditions and performed data slicing and indexing on big files.
- Wrote custom scripts in Python and PySpark to produce refined and transformed files per consumers' requests.
- Worked with the Jenkins continuous integration tool for project deployment.
- Performed data analysis and data modeling to support business users' needs.
- Backed up databases and tested the integrity of the backups.

Job Environment: SAS BI tools, Agile, Python, Pandas, Jupyter Notebook, PySpark, MS Visio, Word, Excel, PowerPoint, SQL, SAS, SharePoint 2010, MS Project.
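A hedged sketch of the combine-then-summarize step named in the first bullet, assuming hypothetical Retail, Mail, and Specialty claim extracts with an ingredient_cost column.

/* Hypothetical channel extracts; stacks them, tags each row with its
   source data set via INDSNAME=, then summarizes by channel. */
data work.all_claims;
   set work.retail_claims work.mail_claims work.specialty_claims
       indsname=src;
   channel = scan(src, 2, '.');   /* member name of the source set */
run;

proc means data=work.all_claims n sum mean maxdec=2;
   class channel;
   var ingredient_cost;
run;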
GENPACT INDIA PVT LTD NOV 2011 to SEP 2014
Data Analyst

Description of the Project:
The client is one of the leading general insurance providers in the USA, with general insurance products such as auto, home, and commercial insurance, along with life insurance. We provide and support IT solutions for Allstate Insurance Company.

Responsibilities:
- Implemented and followed a Scrum Agile development methodology within the cross-functional team and acted as a liaison between the business user group and the technical team.
- Involved in defining source-to-target data mappings, business rules, and data definitions.
- Created data trace maps and data quality mapping documents.
- Performed data analysis and data profiling using complex SQL queries on various source systems.
- Worked with various UNIX commands and shell scripting using the vi editor and UltraEdit.
- Performed exploratory data analysis using R; also involved in generating various graphs and charts for analyzing the data using Python libraries.
- Worked with internal architects, assisting in the development of current- and target-state enterprise data architectures.
- Designed and developed the UI for the website with HTML, XHTML, CSS, JavaScript, and AJAX.
- Created SQL scripts using OLAP functions to improve query performance when pulling data from large tables.
- Developed SQL queries to perform data extraction from existing sources to check format accuracy.
- Co-developed the SQL Server database system to maximize performance benefits for clients.
- Created UNIX scripts for automation.
- Worked with project team representatives to ensure that logical and physical data models were developed in line with corporate standards and guidelines.
- Wrote and executed various MySQL database queries from Python using the MySQL connector and the MySQLdb package.
- Experience with AWS cloud implementation using Lambda, SQS, and Redshift.
- Used Base SAS to perform sorting, merging, and updating, and generated reports and various output formats (RTF, PDF, HTML) using the SAS ODS facility (a sketch follows this section).
- Involved in converting flat files and Excel data into SAS data sets as required.
- Performed data validation with Redshift and constructed pipelines designed for over 100 TB per day.
- Worked with business users on new Tableau version features, explaining self-service capabilities.
- Designed and developed logical and physical data models using data warehouse methodologies.
- Created summary and detail dashboards for identifying data mismatches between source and reporting systems using Tableau Desktop.
- Performed data profiling and preliminary data analysis; handled anomalies such as missing values, duplicates, and outliers, and imputed or removed irrelevant data.

Job Environment: Rapid SQL, Data Organization, Python, Spark, Data Profiling, MVS Assembler, MS Visio, MS Project, MS Office, Windows.
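A minimal sketch of the flat-file import, sort, and ODS reporting steps described above; the file path, record layout, and output file are hypothetical placeholders.

/* Hypothetical CSV layout; reads a delimited flat file into a SAS
   data set, sorts it, and writes an HTML report via ODS. */
data work.policies;
   infile '/data/in/policies.csv' dsd firstobs=2;
   input policy_id :$12. state :$2. premium :comma10.;
run;

proc sort data=work.policies;
   by state;
run;

ods html file='policy_summary.html';
proc print data=work.policies noobs;
   by state;
   sum premium;   /* subtotal of premium within each state */
run;
ods html close;

Swapping the ODS destination (RTF, PDF, LISTING) reroutes the same PROC PRINT output to the other formats named in the bullet.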
CERTIFICATION
Base and Advanced SAS Certified Professional (2015).

Agile Methodology (Trained and Applied):
- Agile methodologies aim to deliver the right product, with incremental and frequent delivery of small chunks of functionality, through small cross-functional self-organizing teams, enabling frequent customer feedback and course correction as needed.
- Exceptional skills in gathering, analyzing, and translating business requirements into functional specifications used to design and implement business solutions.
- Accomplished at introducing and facilitating the adoption of Scrum principles, removing impediments, and fostering self-organization.
- Proficient in facilitating Scrum ceremonies (daily scrum, sprint planning, sprint review, and sprint retrospectives) and leading agile projects.

AWARDS & RECOGNITION
- Received recognition for automating and migrating Hyperion code to SAS.
- Received incentive performance points from clients for excellent work.
- Received the ACE (Accenture Center of Excellence) award for excellence in 2017 at Accenture.
- Trained and tested in Lean and Six Sigma tools.
- Best Employee award at Genpact, 2014.
