Sean McGill
DATA SCIENTIST · M.S. DATA ANALYTICS · B.S. INDUSTRIAL AND SYSTEMS ENGINEERING
12220 Hunters Chase Dr. Apt 5306, Austin, TX 78729
(714) 615-0852 | seanmcgill714@gmail.com | www.linkedin.com/in/seantaesoomcgill

Education

Georgia Institute of Technology — Atlanta, GA
M.S. IN DATA ANALYTICS · Graduated: Aug. 2020

Texas A&M University — College Station, TX
B.S. IN INDUSTRIAL AND SYSTEMS ENGINEERING · Graduated: Dec. 2016

Skills & Awards

Analytics/Machine Learning: SQL (Snowflake, T-SQL, Spark SQL, Hive) · Python (pandas, numpy, sklearn) · R
ETL/Data Engineering: AWS (S3, API Gateway, Lambda, Redshift) · Git · Airflow · Hadoop (Hive, Spark)
Data Visualization: Tableau · Power BI · Splunk · Python · d3.js
Programming: Python · Excel VBA · Automation Anywhere (RPA) · Java
Awards: BSA Eagle Scout, Troop 174

Work Experience

Zoom Video Communications — Remote
DATA SCIENTIST, REVENUE TEAM · Mar. 2021 – Present
• Developed and maintained Airflow data pipelines that load key revenue data into Snowflake from Oracle, AWS S3, Box, Zuora, and other sources.
• Automated the ingestion of 4 manually loaded reports, saving 20 hours per week: presigned URLs sent from Zuora are consumed via a REST API in AWS API Gateway, triggering AWS Lambda functions that store the files in S3 and ingest them into Snowflake.
• Maintained and updated Zoom's revenue model in Snowflake to keep revenue tables 100% accurate, coordinating with stakeholders on refresh timing for downstream processes, reporting mechanisms, and dashboards.

General Motors — Austin, TX
DATA SCIENTIST, AI/ML TEAM · Jan. 2020 – Mar. 2021
• Developed Jupyter notebooks for ad-hoc analytics requests in support of GM's Super Cruise autonomous-vehicle system, working with data from multiple sources (Hadoop, Oracle, Cassandra, etc.) to investigate and evaluate key metrics with appropriate visualizations (choropleths, Pareto charts, etc.).
• Assisted in developing and implementing the Intelligent Vehicle Selection model, a predictive model that identifies the vehicles most likely to enter a specified geographic bounding box for automated collection of telemetry data.
• Created a self-service analytics tool with a Jupyter-based GUI that passes user selections to a PySpark query, generates CSVs in HDFS, and outputs analyses to Excel workbooks.
• Updated Power BI dashboards, coordinating between data engineers and dashboard owners to define requirements; aggregated and visualized a multi-million-row dataset in Jupyter as a customer deliverable.

AT&T — Dallas, TX
TECHNOLOGY DEVELOPMENT ENGINEER, ASSET LIFECYCLE MANAGEMENT TEAM · Jul. 2017 – Jan. 2020
• Identified automation opportunities across multiple business groups and developed and moved 6 Robotic Process Automation (RPA) bots into production using Automation Anywhere, T-SQL, VBScript, Python, and Excel VBA, for annual savings of $230,500.
• Owned all ad-hoc reporting and dashboarding from the master SQL Server database, developing ad-hoc queries for external groups, vendors, and audits.
• Automated daily monitoring of an internal Q&A chatroom by developing and deploying a Java-based chatbot that answers customer questions or directs them to the appropriate resources (20 employee hours saved per week).
• Helped plan and implement an IQ Bot solution (a machine-learning package within Automation Anywhere) to automate the ingestion of handwritten technician forms into a cloud-based tool for tracking task completions (potential savings of $115k/year).
• Developed an Excel-based dashboard using custom VBA userforms to generate dynamic queries and visualizations from SQL Server, replacing the cost of Power BI licenses across the organization.
• Co-founded the Asset Lifecycle Management (ALM) Bot Team to increase and coordinate automation
efforts.
• Mentored 4 junior developers and assisted with bot development for projects with savings totaling $65,000/year.

Boon Chapman — Austin, TX
ELECTRONIC DATA INTERCHANGE DEVELOPER, EDI TEAM · Jan. 2017 – Jul. 2017
• Developed a SQL stored procedure for evaluating COBRA eligibility and implemented a process for automated loading into T-COBRA software.
• Automated the ACH billing process with a SQL stored procedure, a Python script to generate the text file, and a scheduled Talend job.
• Optimized outbound ETL jobs by analyzing and standardizing the referenced SQL views and Talend jobs.

January 27, 2022 · Sean McGill · Résumé