Candidate's Name
Street Address
PHONE NUMBER AVAILABLE (cell)
EMAIL AVAILABLE

Summary of Skills

- Holder of both the Base Programming for SAS 9 and Advanced Programming for SAS 9 certifications, with over 6 years of SAS experience, including 3 years programming in a clinical trials (phase I-IV) environment.
- Holder of a BS in Applied Mathematics/Statistics and an MS in Software Engineering, with knowledge of Operational Research, Numerical Computations, Statistical Modeling, Statistical Inference, Time Series Analysis, Design and Analysis of Statistical Experiments and Neural Networks.
- Over 4 years of experience in Predictive Analytics, including 3 years of software development related to credit scoring.
- Proficient in SAS DATA steps as well as SAS procedures, including SAS/STAT, PROC FREQ, PROC MEANS, PROC UNIVARIATE, PROC DATASETS, PROC REG, PROC FACTOR, SAS Macros, PROC CORR, PROC SQL, PROC CONTENTS, PROC GPLOT, PROC PRINT, PROC REPORT, PROC CHART, PROC FORMAT, PROC TABULATE, PROC ANOVA, PROC CDISC, PROC IMPORT, PROC TRANSPOSE, SAS/GRAPH and many others.
- Over fifteen years of experience in software development in both business and academic settings.
- Competent knowledge of the full software development life cycle, from Requirements Analysis, Technical Specification, Design, Coding and Unit Testing through Regression Testing and Maintenance.
- Over 7 years of experience with C/C++, STL, DOS/VSE, OpenVMS, UNIX, C shell, Korn shell, Windows NT/2000, SQL, T-SQL and SQL Server 2008 R2.
- Extensive experience in VAX/VMS and OpenVMS COBOL and DCL procedures.
- Experience in migrating from ICL 2950 to VAX/VMS, from VAX/VMS to OpenVMS, and from OpenVMS to UNIX (COBOL to Oracle).
- Over 10 years of experience in Statistical Programming and Statistical Data Analysis using SAS, SPSS and R.
- Self-motivated; able to set effective priorities in order to achieve immediate and long-term goals and to meet operational deadlines.
- Functions well in a fast-paced, high-pressure environment.

Objective

To find a position in a reputable firm where my software engineering skills, together with my strong mathematics/statistics background, can be utilized to produce software that meets specifications, budget and schedule.

Technical Proficiencies

Languages: COBOL, C/C++, C#, dBaseIII+, Clipper, Ada, Pascal, Miranda, Smalltalk, BASIC, FORTRAN, Z, HTML, Visual Basic, UML, SAS, SAS Viya, R, Python
Scripting: Unix shell scripting, OpenVMS DCL scripting, R scripting, SPSS scripting
Operating Systems: UNIX, OpenVMS, VAX/VMS, Solaris, MS-DOS, Windows 9x/2000/NT
Databases: Oracle, dBaseIII+, Datatrieve, MS Access, DB2, RDB, SQL, T-SQL
Hardware: PCs, Sun SPARC, DEC, IBM mainframe, ICL mainframe
Tools: MS Office, CMS, MMS, DECforms, ACMS, TDMS, Lotus Notes, SPSS, CICS, Continuus, FrontPage, ASP
Protocols: TCP/IP, FTP, socket programming

Professional Experience

Bank of America, July 2016 - Present
Position: Quantitative Operations Associate
Role: ATM cash forecasting, development of SAS programs, statistical programming and monitoring the performance of ATM cash forecasting models. Specifically:
- Attend weekly team meetings where progress on the various projects that impact ATM cash levels is reported and new issues are identified.
- Gathering of requirements for proposed reports as well as current reports that need enhancement.
- Writing of code that meets the stated requirements (Daily Defects Report, Daily Volumes Report, Cash Outages, etc.).
- Improvement of the user experience. Many of the programs have been written in SAS Enterprise Guide, and production of reports is often done by members of the group tasked with running these reports, which also automatically email the reports to users.
- Improving the user experience by creating suitable user interfaces that let users interact with SAS data sets without having to invoke SAS, reducing the risk of accidentally deleting or corrupting these valuable data sets. In addition, these user interfaces enable users to determine and select exactly what they need, allowing them to customize their own reports according to their individual needs.
- Construction of dashboards for ATM Cash Dispense Availability metrics, ATM Deposit Bin Availability metrics, Cash Out Event Count and Bin Full Event Count, all by servicer and by day for the past two months.
- Code testing and peer review to verify that the code meets specification.
- Monitoring performance of the programs.
- Maintenance of existing code.
- Created C/C++ tools for accessing and manipulating SAS data via user interfaces that use the Integrated Object Model (IOM) interface.
Tools: SAS Enterprise Guide 9.4, SAS Grid, Linux, Visio, SAS Base, SAS/STAT, PROC REPORT, SAS Macro, Tableau, C#/.NET, VBA (Visual Basic for Applications), VBScript, Hadoop, Hue, machine learning, Python, Pandas, Spark, NumPy, SciPy, Git and GitHub, Anaconda, Jupyter Notebooks, Spyder, scikit-learn, PyTorch, C/C++.

Fidelity Investments, Westlake, TX, March 2015 - December 2015
Position: Software Engineer
Role:
- Migration of SAS programs and SAS data sets from the mainframe to Linux.
- Validating the programs in the Linux environment.
- Automation of the validation of SAS data using a suite of tools created in C/C++.
Tools: SAS Base, SAS/STAT, SAS Macro, TSO, MVS, z/OS, JCL, ISPF, Vim, WinSCP, Linux, Excel, Control-M, C/C++.

MetLife, Somerset, NJ, August 2014 - March 2015
Position: SAS Developer/Analytics Consultant
Role: SAS/SPSS/SAS Macro/QlikView development and modeling of insurance data, including writing programs that reported on key performance indicators for various MetLife insurance products.
Tools: SAS Base, SAS/STAT, SAS Macro, QlikView, Netezza, SQL, ETL.

Wells Fargo, St. Louis, MO, March 2014 - May 2014
Position: SAS Developer/Analytics Consultant
Role: SAS/Perl/Linux development and validation of mortgage data, including writing SAS macros to automate various loan monitoring activities. The data, which consisted of a number of carefully selected variables, was scrubbed and then used for predictive modeling in order to gauge how the bank was doing financially and, more specifically, to sense whether a financial crisis similar to that of 2008 could be approaching.
Other Tools: Teradata, ODS, SAS/Base, SAS/STAT, C/C++.

Centene Corporation, Clayton, MO, April 2013 - March 2014
Position: Sr. Provider Network Analyst
Role: Provision of analytical support to the Cost of Care and/or Provider Contracting organizations. Focused efforts on lowering claims costs, improving the quality of care, and increasing member and provider network satisfaction by examining the variables deemed to drive costs and using these variables to model healthcare costs.
- Provision of advice and analytic and consultative support to Medical Directors and management on cost of care issues.
Technologies used: SQL, SQL Server Management Studio, TOAD, OLAP cubes, Teradata, SharePoint, MicroStrategy, SAS, Impact Intelligence, SPSS, R, Unix, Linux, Perl.

Hewlett-Packard (contracting for Insight Global), Houston, TX, August 2012 - January 2013
Position: Analytics Consultant
Role:
- Testing the SQL queries for accuracy.
- Conversion of the SQL queries to R.
- Verifying that what was produced by the R queries in RStudio matched what was produced by SQL Server Management Studio.
- Using the R queries and Virtual Factory to create analytic query operations.
- Using R and Virtual Factory to create wrapper analytic operations to further manipulate the returned rows and create R objects.
- Using Virtual Factory to create analytic tasks consisting of analytic operations (queries, wrappers and scripts for producing u-charts, p-charts, Pareto charts, XmR charts, histograms, dials, etc.).
- Creation of dashboards and other visuals.
- Submission of each task for Quality Assurance.
- Daily morning meetings with offshore teams in India and Puerto Rico to compare notes and report progress and difficulties.
- Using the analytic tasks to extract supplier data, field data, regional factory data and user community data from the Hewlett-Packard Global Data Warehouse, and using this data to create quality and operational metrics, analytics, dashboards and other reporting mechanisms, including alerting mechanisms for monitoring the quality of components from various sources as well as products produced by Hewlett-Packard.
Technologies used: SQL Server 2008 R2, SQL, T-SQL, R, SAS, Virtual Factory, Microsoft Office tools, Linux, Unix, C/C++.

Biomedical Systems, St. Louis, MO, July 2009 - May 2012
Position: Data Manager/SAS Programmer/Applications Developer
Role:
- Kicking off each protocol with a meeting involving the Clinical Project Manager, a client representative and myself to outline the nature of the study, go through requirements and concerns, and agree on both possibilities and limitations.
- Preparation of a Load Specification document based on inputs from the client. The Load Specification specifies, among other things, the data definition tables (DDT) that detail the structure of each data set and how the values in each variable are derived, the types of data sets (SAS, ASCII, Excel, SAS transport files), whether CDISC standards/formats are to be applied, transfer schedules, the edit checks to be applied as part of validation, and contact information.
- The Load Specification is the contract document between Biomedical Systems and the pharmaceutical company (client), establishing the data characteristics, transfer file characteristics, schedules for data transfers and the methods to be used to make the transfers.
- Writing of SAS/VB6 programs that extracted data from the clinical database and prepared test data sets according to the Load Specifications and Processing Instructions, taking into account the companywide and department standard operating procedures (SOPs), the study protocol, Good Clinical Practice (GCP) and the Code of Federal Regulations (CFR), specifically 21 CFR Part 11 and 21 CFR Part 820.
- Submitting test data for Quality Assurance.
- Making test data transfers and notifying the client.
- Developing validation protocols that detailed the tests to be carried out to validate the programs and to establish that the programs were producing data as specified in the Load Specifications.
- Participating in peer review as one of the Quality Control activities, and submitting my own programs for peer review.
- Making both scheduled and ad-hoc transfers as agreed.
- Developing systems used in the overall management of the data that enhanced the functionality of the data infrastructure and produced reports about the data as determined by Biomedical Systems.
- Working as part of teams that processed and submitted study data on the efficacy of various drugs that eventually received FDA approval.
- Communicating directly with the clients that relied on the study data I processed at Biomedical Systems on behalf of the pharmaceutical companies, and being answerable to them in matters regarding accuracy of the data, format of the data, submission schedules and any ad-hoc adjustments that were needed.
Technologies used: Base SAS, SAS Macros, SAS/STAT, ODS, SAS/GRAPH, SAS System Viewer 8.2, SAS/ACCESS Interface for PC Files, Visual Basic 6, Visual Basic .NET, ADO, SQL, Visual C++, CDISC, SDTM, ADaM, C/C++, Excel, Outlook, Seapine and TFS. Also gained exposure to the Case Report Tabulation Data Definition Specification (define.xml).

H & R Block (UST Global, Inc), Dublin, OH, July 2008 - October 2008
Position: Software Engineer
- Designed, developed and maintained software for H & R Block that helps provide online and in-office tax solutions.
Technologies used: C, C++, Visual C++, COBOL, DCL (for scripting on the OpenVMS platform), MMS (Make for OpenVMS), CMS (for change management), FTP, the OpenVMS Debugger, Base SAS, SAS Macros, SAS/STAT, SAS/GRAPH and SPSS.

Reuters (Advanced Resources, Inc), Creve Coeur, MO, February 2006 - June 2008
Position: Software Engineer
- Designed, developed and maintained financial message interpreters for the Reuters Alpha Quotation Systems (RAQS), part of a high-speed, integrated information and transaction service that gives users a view of the global real-time financial arena and provides information and insight as well as access to the global Reuters trading community.
- Performed statistical analysis, especially Regression Analysis and Time Series Analysis, for statistical modeling.
Technologies used: Pascal, C, C++, Visual C++, DCL (for scripting on the OpenVMS platform), MMS (Make for OpenVMS), CMS (for change management), FTP, the OpenVMS Debugger, COBOL, Base SAS, SAS Macros, SAS/STAT, SAS/GRAPH and SPSS.

Express Scripts (Solution Consultants, Inc), Maryland Heights, MO, May 2005 - February 2006
Position: Programmer/Analyst
- Designed, developed and maintained software tools for Express Scripts, Inc. that are used to analyze thousands of COBOL source programs for quality.
- Designed, developed and maintained software that processes over 400 million retail prescriptions and over 50 million mail order prescriptions each year.
- Designed and developed tools that help in cleaning data and checking for consistency.
- Designed, developed and maintained a suite of software using C, COBOL and DECforms for maintenance of FTP addresses, usernames and passwords, and encryption and decryption of user passwords.
- Designed, developed and maintained PARAGRAPH_ANALYZER, a tool developed in C and used by the Code Review team to analyze COBOL source programs for proper use of PERFORM and GO TO statements.
- Designed, developed and maintained ACCESS_ANALYZER, a tool developed in C and used by the Code Review team to analyze COBOL source programs for proper use of different file access methods.
- Maintained various C and COBOL programs used in processing retail and mail prescriptions.
Additional technologies used: MMS, CMS, FTP, DCL for scripting, DATATRIEVE and the OpenVMS Debugger.

Amdocs Broadband Cable Satellite (SMCI), Charlotte, NC, October 2004 - February 2005
Position: Programmer/Analyst
- Maintained parts of a billing application for DST on behalf of Software Management Consultants, using VAX COBOL, SQL and C in an OpenVMS environment.
- Designed and developed programs for statistical data analysis and the production of statistical reports using SAS and SPSS.

Lightbridge, Inc, Burlington, MA, July 2000 - March 2003
Position: Software Engineer
- Designed, developed and maintained portions of the Talkers, a subsystem of the Customer Acquisition System (CAS) used for extracting individual/company credit information from various credit bureaus and merging the information into a Lightbridge standard format and score. These products are used to process wireless subscriber applications and for fraud monitoring and prevention, among many other purposes.
- Used both internally developed and customer-provided statistical models to determine the key independent variables that help in scoring and in predicting certain dependent variables such as customer default rate.
- Designed, developed and maintained the Sprint Preprocessor using C on an OpenVMS platform, a Lightbridge product used for parsing and pre-processing pre-approved Sprint wireless subscriber applications into a specific Lightbridge format.
- Designed, developed and maintained components of the Talkers using Ada, C/C++, DATATRIEVE and SQL.
- Provided statistical analysis support by designing and writing programs to extract data, perform statistical data analysis and Time Series Analysis, and produce statistical reports using SAS and SPSS.
- Made Talker credit models configurable, thus reducing the amount of time it took to introduce or modify a credit model by a factor of 5.
- Designed, developed, coded and tested software to build support for the TransUnion TU40v1 new/improved set of products while at the same time maintaining support for TU40v0 (the older set of products).
- Designed, developed and maintained portions of the Lightbridge Insight products used for monitoring existing records, major business accounts and previous applications for pre-approval of wireless subscriber applications.
- Designed, developed, coded and maintained the Process Monitor using Ada and C/C++, a Lightbridge utility for monitoring Lightbridge processes running on UNIX platforms.
- Maintained various utilities and components of the Customer Acquisition System (CAS) as assigned.
Additional technologies used: AIX, MMS (Make for the OpenVMS platform), credit scoring, CMS and Continuus for change control, STL, DCL scripting, Unix scripting, the DBX debugger, FTP, OpenVMS and Ada.

Education

University of Michigan (US): Master of Applied Data Science (student)
University of Aston (UK): Master of Science, Software Engineering
University of Nairobi (Kenya): Bachelor of Science, Mathematics/Statistics

Certifications

Coursera: Python 3
SAS Institute: SAS Advanced Programming for SAS 9
SAS Institute: SAS Base Programming for SAS 9
University of Cambridge (England): Certificate, International English Language Testing System (IELTS)

Referees

Available on request

Visa Status

US citizen