Brandon Glover (Dallas, TX) | US Citizen
(940) 326-7151 | brandonglover399@gmail.com
Sr. Data Analyst / Business Data Analyst

Open to work on W2 / 1099 / C2C positions within Texas.



Summary:
Business Data Analyst with close to 7 years of diversified experience spanning business data analysis, data management, and data governance, with strong requirements gathering and documentation skills and solid oral and written communication, seeking a business data analyst position that will apply and expand my current skills and experience to help the organization succeed.
Strong experience in Data Analysis, Business Analysis, Data Profiling, Data Migration, Data Quality, Data Integration, Data Mapping, Data Modeling, and Metadata Management services across all phases of the Software Development Life Cycle (SDLC), including requirements analysis, implementation, and maintenance, with good experience in Agile methodologies and Jira.
Skilled in analyzing business processes, eliciting requirements, defining business needs, and translating them into Business Requirement Documents (BRD), Functional Requirement Documents (FRD), Software Requirement Specifications (SRS) / Technical Requirement Documents (TRD), user stories, use-case specifications, and non-functional requirements.
Extensive experience interacting with stakeholders and customers, gathering requirements through interviews, workshops, and existing system documentation or procedures, defining business processes, and identifying and analyzing risks using appropriate templates and analysis tools.
Experienced in various phases of Software Development life cycle (Analysis, Requirements gathering, Designing) with expertise in documenting various requirement specifications, functional specifications, Test Plans, Source to Target mappings.
Experience in the insurance domain, especially property and casualty (P&C) insurance claims.
P&C insurance provides coverage for property and helps protect individuals, businesses, and organizations from financial loss due to unexpected events.
Proficient in Python, with a strong grasp of the language's syntax, data structures, and libraries.
Worked on Data Management using Collibra for Enterprise Data Governance projects on areas of Data Quality, Reference data management, Data Dictionary, Business Glossary, Data Discovery, Metadata management.
Worked in projects involving Data Mapping, Data Profiling, Data Migration, Data Conversion, Data Quality, Data Governance, Data Lineage, Data Integration.
Worked in accordance with Agile SAFe/Scrum methodology for various phases of the business and created business vision documents, business architecture documents, and mock reports.
Experienced in extracting data from various sources, such as databases, APIs, and files, using ETL processes, data integration tools, and query languages like SQL.
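For illustration, a minimal Python sketch of this kind of SQL-based extraction; SQLite stands in for the actual source systems, and the table, column, and file names are hypothetical:

    import csv
    import sqlite3

    # SQLite stands in for a production source; swap in oracledb/teradatasql as needed.
    conn = sqlite3.connect("source.db")
    cur = conn.cursor()
    cur.execute("SELECT claim_id, claim_amount, status FROM claims WHERE status = ?", ("OPEN",))

    # Land the result set as a flat file for downstream analysis.
    with open("open_claims.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([col[0] for col in cur.description])  # header row from cursor metadata
        writer.writerows(cur.fetchall())
    conn.close()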
Strong experience with MS Excel pivot tables and VLOOKUP for data analysis.
Proficient in data analysis using SQL, Tableau, and other data analysis techniques.
Used SQL to extract data from Oracle ERP databases to analyze the data.
Strong understanding of project life cycle and SDLC methodologies including RUP, RAD, Waterfall and Agile.
Expertise in Master Data Management, Metadata, Informatica Business Glossary, and Data Quality.
Involved in analyzing the data using MS Excel, SQL and Access and generating different reports to gain more knowledge about the business status.
Strong experience in using Excel and MS Access to stage data and analyze it based on business needs.
Involved in defining the source to target data mappings, business rules and data definitions.
Worked on the implementation of Metadata Repository, Maintaining Data Quality, Data Clean-up procedures, Transformations, Data Standards, Data governance program, Scripts, Stored Procedures, triggers and execution of test plans.
Utilized SSIS to design and implement data integration workflows, seamlessly moving and transforming data from diverse sources for analysis.
Developed robust ETL processes using SSIS, cleansing and enriching data to ensure accuracy and consistency for analytical insights.
Industrious, results-oriented professional with technical and business experience, dedicated to organizational growth and profitability through excellent project management, business analysis, and leadership capabilities.
Implemented and Administered Collibra Data Governance: Setup Business Glossary, Workflows, Data Governance Assets, Setup users and Data Catalogue.
Ingested business and technical metadata into Collibra via the Excel import template, based on source, region, and asset type.
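As a sketch of that ingestion step, the import sheet can be assembled with pandas before upload; the column names below mirror a typical import template and are assumptions, not the actual Collibra schema (pandas writes .xlsx via openpyxl):

    import pandas as pd

    # Hypothetical template columns; the real Collibra import template defines its own.
    assets = pd.DataFrame([
        {"Name": "customer_id", "Asset Type": "Column",
         "Domain": "Retail Banking", "Source": "Teradata", "Region": "US"},
        {"Name": "Customer", "Asset Type": "Business Term",
         "Domain": "Retail Banking", "Source": "Glossary", "Region": "US"},
    ])
    assets.to_excel("collibra_import.xlsx", index=False)  # then loaded through Collibra's import wizard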
Ensured Data Governance standards (metadata, data lineage, data mappings, appropriate data ingestion into the EDL, etc.) to support business and technology needs to locate data, understand data, and ensure appropriate use.
Served as an expert in business metadata in Collibra and promoted understanding of data definitions at the application, data element, and data domain level.


Education:
Bachelor's in Information Systems from the University of Texas at Arlington

Skill Matrix:

Data Analysis and Modeling: Power Designer, SQL, R, Informatica
Project Management: MS Project, MS Excel, Confidential Rational Portfolio Manager
Database Management: Oracle SQL, MySQL, MS SQL Server, MS Access, Data Lineage, Hadoop (Hive, PIG), Teradata
Data Visualization: Tableau, Power BI, Excel, Macros
Cloud Platforms: AWS, Google Cloud, Azure
Version Control and Documentation: MS Office Word, MS Project, Advanced Excel, Lotus Notes, GitHub repository
Business Intelligence: Data Warehousing, RDBMS, Hypothesis Testing, A/B Testing, Data Mining
Collaboration Tools: JIRA, Confluence
Testing and Quality Assurance: User Acceptance Testing (UAT), Test Case Design, Test Planning

Professional Work Experience:
Wells Fargo, Dallas, Texas Jan 2022 - Present
Role: Data Engineer / Analyst / Business Data Analyst

Responsibilities:
Worked on projects involving Data Analysis, Data Management, SQL, problem-solving, and multi-tasking between multiple projects.
Wrote complex SQL queries on a daily basis to retrieve data, working with Teradata, Oracle, MS Access, SQL Server, etc.
Involved in documenting the Data quality scores in Collibra using integrations with Informatica Data Quality.
Performed data analysis and data profiling on a day-to-day basis using complex SQL on various source systems including Oracle and Teradata.
Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
Wrote business requirements as Jira epics and user stories, and uploaded and assigned the user stories in Jira for the Agile (sprint) methodology.
Proficient in using data modeling tools like ERwin, ER/Studio, or Lucidchart to design data models.
Experienced in web development using Python, including web frameworks like Django and Flask to build web applications and APIs.
Proficient in Python programming language for data analysis, manipulation, and visualization using libraries such as Pandas, NumPy, and Matplotlib.
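A short, self-contained example of that pandas/NumPy/Matplotlib workflow; the claims data here is synthetic:

    import numpy as np
    import pandas as pd
    import matplotlib.pyplot as plt

    # Synthetic monthly claim counts for illustration.
    rng = np.random.default_rng(42)
    df = pd.DataFrame({
        "month": pd.date_range("2023-01-01", periods=12, freq="MS"),
        "claims": rng.poisson(lam=120, size=12),
    })
    df["rolling_avg"] = df["claims"].rolling(3).mean()  # 3-month smoothing

    df.plot(x="month", y=["claims", "rolling_avg"], title="Monthly claim volume")
    plt.tight_layout()
    plt.savefig("claims_trend.png")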
Proficient in using Power Automate to automate repetitive tasks and streamline business processes without the need for coding.
Integrated Power Automate with Microsoft Excel, SharePoint, SQL Server, and other data sources to automate data extraction, transformation, and loading (ETL) processes.
Leveraged Power Automate connectors and templates to quickly connect to data sources and automate data workflows.
Built workflows in Power Automate to perform data cleansing, enrichment, and transformation tasks, such as removing duplicates, formatting data, and calculating metrics.
Implemented the continuous delivery pipeline using Docker and Jenkins.
Developed automated data processing scripts and pipelines in Python to streamline data cleansing, transformation, and analysis, resulting in increased efficiency and accuracy of insights generated.
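A minimal sketch of such a cleansing/transformation script, assuming a pandas-based pipeline; the file names, business key, and rules are hypothetical:

    import pandas as pd

    def clean(df: pd.DataFrame) -> pd.DataFrame:
        df = df.drop_duplicates(subset="account_id")              # dedupe on the business key
        df["open_date"] = pd.to_datetime(df["open_date"], errors="coerce")
        df["balance"] = pd.to_numeric(df["balance"], errors="coerce").fillna(0)
        return df[df["open_date"].notna()]                        # drop rows with unparseable dates

    raw = pd.read_csv("accounts_raw.csv")
    clean(raw).to_csv("accounts_clean.csv", index=False)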
Collaborated with cross-functional teams to integrate Python-based data analysis solutions into production systems, ensuring seamless data pipelines and scalable solutions for real-time data processing.
Demonstrated strong problem-solving skills through the implementation of Python-based algorithms for data wrangling, feature engineering, and predictive modeling, resulting in improved accuracy and performance metrics.
Utilized Kibana extensively to create visually appealing and informative dashboards, enabling comprehensive data exploration and analysis.
Implemented data cleansing and deduplication techniques within MDM processes to improve data accuracy and consistency, resulting in enhanced reporting and analytics capabilities.
Experience in integrating master data from various sources and synchronizing it across enterprise systems, ensuring that critical business data remains consistent and up-to-date for analysis purposes.
Utilized MDM tools and techniques to conduct data quality assessments, identifying discrepancies, anomalies, and inconsistencies in master data sets, and recommending corrective actions to improve overall data quality.
Leveraged Kibana's robust querying and filtering capabilities to perform in-depth data analysis and identify key trends and patterns.
Demonstrated ability to create interactive and visually appealing dashboards and reports using Power BI.
Skilled in conducting in-depth data analysis using Power BI to identify trends, patterns, and actionable insights.
Proficient in creating interactive and visually compelling reports and dashboards using Power BI Desktop.
Published and shared Power BI reports and dashboards securely with stakeholders using Power BI Service.
Collaborated with team members by sharing datasets, reports, and insights through Power BI Workspaces and Apps.
Designed and optimized Power BI reports and dashboards for mobile devices, ensuring accessibility and usability on smartphones and tablets.
Enabled users to access and interact with reports offline using Power BI Mobile app for iOS and Android.
Experienced in developing customized dashboards in Tableau to provide actionable insights and drive data-driven decision-making processes.
Extracted, transformed, and loaded data from source systems to Azure data storage services using a combination of Azure Data Factory, T-SQL, Spark SQL, and Azure Data Lake Analytics.
Ingested data into one or more Azure services and processed the data in Azure Databricks.
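As a hedged sketch of the Databricks processing step, a PySpark job of this shape; the storage path, column names, and table name are placeholders:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()  # Databricks provides this session automatically

    # Read landed files from ADLS Gen2 (placeholder path), fix types, filter bad rows.
    raw = spark.read.option("header", "true").csv("abfss://landing@storageacct.dfs.core.windows.net/claims/")
    curated = (raw.withColumn("claim_amount", F.col("claim_amount").cast("double"))
                  .filter(F.col("status").isNotNull()))

    # Persist as a managed table for downstream Spark SQL queries.
    curated.write.mode("overwrite").saveAsTable("curated_claims")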
Experienced in leveraging Tableau to create dynamic and interactive visualizations that facilitate in-depth data analysis and interpretation.
Prepared test data sets, performed data analysis, and used MS Excel for data mining, data cleansing, data mapping, data dictionaries, and data analysis.
Worked on Data Management using Collibra for Enterprise Data Governance projects on areas of Data Quality, Reference data management, Data Dictionary, Business Glossary, Data Discovery, Metadata management.
Managed all data collection, extraction, transformation, and load (ETL) activities using Microsoft SSIS, Talend, and Informatica, including data profiling, data cleansing, data conversion, and quality control.
Experienced in using data modeling languages like SQL Data Definition Language (DDL) to create and modify database objects.
Reviewed the data model and reporting requirements for Cognos Reports with the Data warehouse/ETL and Reporting team.
Performed detailed analysis of complex business processes, data requirements and proposes solutions and worked directly with internal customers to understand the business environment and needs.
Involved in several complex financial reports related to Non-performing Loans portfolio, which includes Credit Loss Reporting Package and Annual Reports Package.
Created (Extract, Transform, Load) ETL design mapping sheets, data reconciliation strategy, data archival strategy- ETL framework, stored procedures and built SQL query objects to detect data loss.
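For example, the data-loss check can be as simple as reconciling row counts between source and target; a minimal sketch with SQLite standing in for the real databases and hypothetical table names:

    import sqlite3

    src = sqlite3.connect("source.db")        # stand-in for the operational source
    tgt = sqlite3.connect("warehouse.db")     # stand-in for the warehouse target

    src_count = src.execute("SELECT COUNT(*) FROM claims").fetchone()[0]
    tgt_count = tgt.execute("SELECT COUNT(*) FROM dw_claims").fetchone()[0]

    # Any mismatch flags potential data loss for investigation.
    if src_count != tgt_count:
        print(f"Data loss detected: source={src_count}, target={tgt_count}")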
Involved in writing Data Mapping Documents for the system and involved in documenting the ETL process and writing SQL Queries for the retrieval of the data including MS SQL Server and Teradata.
Created Interactive Tableau Dashboard after gathering and analyzing the data from the warehouse to illustrate the metrics of the business process.
Responsible from the user perspective to analyze dashboards and analytical reporting requirements using Tableau.
Worked with the business users to understand & document the design and functionality requirements.
Involved in testing and documentation for applications developed in SDLC Environment.
Performed BI and ETL Analyst role interfacing with subject matter experts to understand and define the functional requirements supporting reporting applications by outlining data strategy and report specifications.
Migrated data from different data sources (SQL Server, flat files, Excel source files, Oracle) and loaded it into target tables and the data warehouse using SSIS, applying transformations such as conditional split, derived columns, and aggregation to validate the data before loading, based on business needs.
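For illustration, the same conditional-split and derived-column logic expressed in pandas rather than SSIS; the threshold, columns, and file names are made up:

    import pandas as pd

    df = pd.read_csv("policies.csv")
    df["annual_premium"] = df["monthly_premium"] * 12     # derived column

    # Conditional split: route rows to different targets by business rule.
    high_value = df[df["annual_premium"] >= 10_000]
    standard = df[df["annual_premium"] < 10_000]

    high_value.to_csv("high_value_policies.csv", index=False)
    standard.to_csv("standard_policies.csv", index=False)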
Used Data Integrator to load data from various sources such as MS Access, MS Excel, CSV files to MS SQL Server.
Extracted data from SQL Server database tables and Oracle database tables into flat data files and Excel sheets.
Used SQL*Loader and SQL Developer to load data from flat data files and Excel sheets into the Oracle database.
Designed and implemented data profiling and data quality improvement solution to analyze, match, cleanse, and consolidate data before loading into data warehouse.

Blue Cross and Blue Shield, Richardson, Texas July 2017 - Dec 2021
Role: Data Analyst / Business Data Analyst

Responsibilities:
Created data dictionary, Data mapping for ETL and application support, DFD, ERD, mapping documents, metadata, DDL and DML as required.
Created and maintained product backlog items, bugs, features, etc. in JIRA and TFS / Azure DevOps for story tracking and documentation.
Prepared user stories documentation in JIRA for Agile projects, documented requirements for the development and data migration epic to convert customer data to oracle instance with upgrades.
Conducted and automated the ETL operations to Extract data from multiple data sources, transform inconsistent and missing data to consistent and reliable data and finally load it into the Multi-dimensional data warehouse.
Led the go-live implementation plan from the business side for digital banking initiatives and partnered with other management leads in IT to align organizational readiness plans with delivery plans.
Documented Data Quality scores in Collibra using integrations with Informatica Data Quality.
Performed data analysis and data profiling using complex SQL on various source systems including Oracle and Teradata.
Gathered and analyzed large datasets to identify trends and patterns in property and casualty insurance claims.
Developed statistical models to forecast risk and predict potential claims, improving accuracy in underwriting decisions.
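A toy sketch of such a claims-prediction model, assuming scikit-learn; the features and data below are synthetic, not the actual model:

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # Synthetic stand-ins for features like property age, value, and regional risk score.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 3))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=1000) > 1).astype(int)  # 1 = claim filed

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = LogisticRegression().fit(X_train, y_train)
    print(f"Holdout accuracy: {model.score(X_test, y_test):.2f}")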
Conducted comprehensive risk assessments by evaluating historical data and current market trends to support strategic decision-making in property and casualty insurance policies.
Created detailed reports and visualizations to communicate findings and insights to stakeholders, enabling data-driven decisions in property and casualty insurance operations.
Utilized tools like Tableau, Power BI, and Excel to present complex data in an accessible and actionable format.
Identified and implemented process improvements to enhance data accuracy and efficiency in property and casualty insurance data management.
Ensured data integrity and compliance with regulatory standards in the property and casualty insurance sector.
Implemented Power Automate workflows to monitor data quality, track changes, and notify stakeholders of important updates or deviations.
Scheduled Power Automate flows to run on a recurring basis, automating batch processing tasks such as data imports, exports, and report generation.
Integrated Power Automate with Power BI to automate data refreshes, report distribution, and dashboard updates.
Skilled in creating comprehensive documentation for data models, including data dictionaries, naming conventions, and data model metadata.
Orchestrated data workflows between Power Automate and other analytics tools such as Tableau, Qlik, or Google Data Studio to streamline data analysis and visualization processes.
Prepared business glossary and metadata with Confidential for all deposit-related CDEs (critical data elements).
Used advanced Excel to work on large datasets, developing formulas, creating pivot tables, building VLOOKUP and HLOOKUP formulas, and performing regression analysis.
Developed and maintained interactive data visualizations and dashboards using Tableau, enabling business stakeholders to easily understand and analyze complex data.
Working experience with interactive dashboards and reports in Tableau for monitoring operational performance on a day-to-day basis.
Experienced in connecting to and manipulating databases using Python, including SQL databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB).
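A hedged sketch of querying both kinds of stores from Python; the connection details, database names, and collections are placeholders, and psycopg2/pymongo are assumed to be installed:

    import psycopg2
    from pymongo import MongoClient

    # Relational side (PostgreSQL); credentials are placeholders.
    pg = psycopg2.connect(host="localhost", dbname="analytics", user="analyst", password="...")
    with pg.cursor() as cur:
        cur.execute("SELECT member_id, plan FROM members WHERE active = TRUE")
        members = cur.fetchall()
    pg.close()

    # Document side (MongoDB).
    mongo = MongoClient("mongodb://localhost:27017")
    open_claims = mongo["analytics"]["claims"].count_documents({"status": "open"})
    print(len(members), "active members;", open_claims, "open claims")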
Proficient in data modeling for NoSQL databases like MongoDB, Cassandra, or Neo4j, considering their unique structures and querying requirements.
Experience in creating Data Governance Policies, Business Glossary, Data Dictionary, Reference Data, Metadata, Data Lineage, and Data Quality Rules.
Used SQL to query databases and analyze data.
Worked on the Snowflake cloud database for data analysis and used ETL tools like Alteryx and visualization tools like Tableau to clean, prep, analyze, and present data.
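As a sketch of pulling Snowflake data into pandas for the prep step, assuming the snowflake-connector-python package (with its pandas extra); the account, credentials, and table are placeholders:

    import snowflake.connector

    conn = snowflake.connector.connect(account="...", user="...", password="...",
                                       warehouse="ANALYTICS_WH", database="CLAIMS_DB")
    cur = conn.cursor()
    cur.execute("SELECT claim_id, paid_amount FROM CLAIMS_DB.PUBLIC.PAYMENTS LIMIT 1000")
    df = cur.fetch_pandas_all()  # hand the result to pandas for cleaning and prep
    conn.close()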
Extracted, transformed, and loaded data from source systems to Azure Storage services using a combination of Azure Data Factory, T-SQL, Spark SQL, and U-SQL (Azure Data Lake Analytics).
Created SQL queries for Extracting, Transforming, and Loading (ETL) mapping using PostgreSQL, Oracle, and SQL Server.
Worked with Finance, Risk, and Investment Accounting teams to create Data Governance glossary, Data Governance framework and Process flow diagrams.
Proficiency in multiple databases like MongoDB, Cassandra, MySQL, Oracle, and MS SQL Server.
Designed logical model with all relationships for database; forward engineered it to SQL Server with Erwin.
Worked with SQL Server, Visual Studio, Power BI, Tableau, and SharePoint.
Created data stories, reports, and visualizations using Tableau and Power BI.
Managed full SDLC processes involving requirements management, workflow analysis, source data analysis, data mapping, metadata management, data quality, testing strategy and maintenance of the model.
Created business scenarios and insights that are meaningful, address critical customer-facing issues, and help drive process improvements and decisions.
Extracted data through SQL queries and business intelligence solutions, then analyzed and interpreted the results.
Designed and implemented ETL processes based on SQL, T-SQL, stored procedures, triggers, views, tables, user-defined functions, and security using SQL Server 2012 and SQL Server Integration Services (SSIS).