Sr. Data Architect/ Data Modeler Resume
Nicholasville, KY
SUMMARY
- 9+ years of strong IT experience in Data Architecture, Data Modeling, and Big Data reporting design and development
- Experience in data modeling, design, and data analysis for Online Transaction Processing and Online Analytical Processing (OLTP & OLAP) systems
- Extensive work experience in ER Modeling, Dimensional Modeling (Star Schema, Snowflake Schema), Data Warehousing, and OLAP tools
- Proficient in interacting with users, analyzing client business processes, documenting business requirements, performing design analysis and developing design specifications
- Proficient in Normalization/De-normalization (3NF) techniques in relational/dimensional database environments
- Expertise in many Software Development Life Cycle (SDLC) implementations, performing process planning, documentation, functional and system design, implementation, unit, integration, regression testing, and system maintenance
- Very strong experience in data modeling - Conceptual, Logical/Physical, Relational and Multi-dimensional modeling, Data analysis for Decision Support Systems (DSS), Data Transformation (ETL), and Reporting
- Expertise in UML (class diagrams, object diagrams, use case diagrams, state diagrams, sequence diagrams, activity diagrams, and collaboration diagrams) as a business analysis methodology for application functionality designs using Rational Rose and MS Visio
- Experienced in designing data marts using Ralph Kimball's dimensional and Bill Inmon's data mart modeling techniques
- Experience working with business intelligence and data warehouse software, including SSAS, Pentaho, Cognos, OBIEE, QlikView, Greenplum Database, Amazon Redshift, and Azure SQL Data Warehouse
- Expert in implementing projects end to end and in providing architectural guidance, with emphasis on requirements analysis, design, coding, testing, and documentation
- Additionally experienced with the Hadoop NameNode, which stores all HDFS file-location information and tracks file data across the cluster's machines
- Experience in Teradata database design, implementation, and maintenance, mainly in large-scale data warehouse environments; experienced with the Teradata RDBMS using the FastLoad, MultiLoad, TPump, FastExport, Teradata SQL Assistant, Teradata Parallel Transporter, and BTEQ utilities
- Excellent working knowledge of CASE tools such as Erwin, Embarcadero ER/Studio, and Power Designer; used Erwin for forward and reverse engineering
- Exposure to and knowledge of Data Mart design, creation of cubes, and identifying facts and dimensions
- Experienced in conducting gap analysis and User Acceptance Testing (UAT); facilitated and participated in Joint Application Development (JAD) sessions, Joint Requirement Planning (JRP), and whiteboard sessions to resolve open issues and coordinate between teams
- Experienced in query optimization & performance tuning
- Experienced in dealing with Slowly Changing Dimensions (SCD), Conformed Dimensions, Junk Dimensions, Status Dimensions, Role-Playing Dimensions, Mini dimensions and Outriggers
- Experienced in writing and debugging complex SQL queries to perform end-to-end ETL validations and support ad-hoc business requests (see the reconciliation sketch after this list)
- Excellent experience in developing Stored Procedures, Triggers, Functions, Packages, Inner Joins & Outer Joins, and views using T-SQL/PL-SQL
- Excellent experience with IBM InfoSphere, utilizing MDM, data profiling, and data modeling
- Experience in importing and exporting data using Sqoop from HDFS to Relational Database Systems (RDBMS) and from RDBMS to HDFS.
- Excellent understanding of MDM hub architecture styles: the registry, repository, and hybrid approaches
- Experienced in migrating data from Excel, DB2, Sybase, flat files, Teradata, Netezza, and Oracle to MS SQL Server using the BCP and DTS utilities, and in extracting, transforming, and loading data
- Comfortable in work environments consisting of business analysts, production/support teams, subject matter experts, database administrators, and database developers
- Good understanding of and experience with Agile and Waterfall environments
- Exceptional communication and presentation skills and established track record of client interactions
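A minimal sketch of the kind of end-to-end ETL validation query referenced above, reconciling row counts and an amount checksum between a staging table and its warehouse target; the table and column names (stg_orders, dw_orders, order_amt) are hypothetical placeholders, not from an actual engagement:

```sql
-- Compare per-load-date row counts and amount checksums between a
-- hypothetical source staging table and its warehouse target.
SELECT COALESCE(s.load_dt, t.load_dt) AS load_dt,
       s.row_cnt                      AS src_rows,
       t.row_cnt                      AS tgt_rows,
       s.amt_sum - t.amt_sum          AS amt_diff
FROM   (SELECT load_dt, COUNT(*) AS row_cnt, SUM(order_amt) AS amt_sum
        FROM   stg_orders GROUP BY load_dt) s
FULL OUTER JOIN
       (SELECT load_dt, COUNT(*) AS row_cnt, SUM(order_amt) AS amt_sum
        FROM   dw_orders  GROUP BY load_dt) t
  ON   s.load_dt = t.load_dt
WHERE  COALESCE(s.row_cnt, -1) <> COALESCE(t.row_cnt, -1)
   OR  COALESCE(s.amt_sum, 0)  <> COALESCE(t.amt_sum, 0);
```

Any load date the query returns indicates a count or checksum mismatch between source and target worth investigating.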
TECHNICAL SKILLS
Data Modeling Tools: Erwin 9.6/9.5, Sybase Power Designer, Oracle Designer, ER/Studio 9.7
Database Tools: Oracle 12c/11g, MS Access, MS SQL Server 16.0, DB2 11.1, Teradata 13.1, Hive, PostgreSQL, Netezza
Web technologies: HTML 5, DHTML, XML
Project Execution Methodologies: Ralph Kimball and Bill Inmon data warehousing methodology, RUP, JAD, Agile, Waterfall, RAD
Reporting Tools: Business Objects, Crystal Reports 8/7, Tableau
Big Data tools: Hadoop, HDFS 2, Hive, Pig, HBase, Sqoop, Flume, AWS, EC2, S3
ETL/Reporting Tools: SSIS, SSRS, Informatica PowerCenter 9.6
Tools & Software: TOAD, BTEQ, Teradata SQL Assistant, MS Office suite (Word, Excel, MS Project, and Outlook)
Programming Languages: SQL, PL/SQL, UNIX shell Scripting
Operating System: Windows 8/7, Unix, Linux
PROFESSIONAL EXPERIENCE
Confidential - Nicholasville, KY
Sr. Data Architect/ Data Modeler
Responsibilities:
- Designed and developed architecture for data services ecosystem spanning Relational, NoSQL, and Big Data technologies.
- Designed, created and maintained QlikView applications
- Assigned tasks among development team, monitored and tracked progress of project following Agile methodology.
- Monitored and measured data architecture processes and standards to ensure value is being driven and delivered as expected.
- Worked with data governance, data quality, data lineage, and data architecture teams to design various models and processes.
- Worked as architect and built data marts using hybrid Inmon and Kimball DW methodologies
- Designed the new Teradata data warehouse with star and snowflake schemas for the most efficient use of MicroStrategy and Teradata resources; the model was approved by a MicroStrategy senior architect and a Teradata expert.
- Created reports from Greenplum (Pivotal) referencing the positions/transactions for each customer's monthly invoice across all jurisdictions (CFTC, ESMA, Canada, ASIC, and MAS) and made them available on the portal
- Involved in all steps and the scope of the project's reference data approach to MDM, creating a data dictionary and source-to-target mappings for the MDM data model.
- Reviewed system architecture, data flows, the data warehouse dimensional model, and DDL to identify areas for improvement and reduce loading and reporting time for a meter-reading system; maintained the database architecture and metadata supporting the Enterprise Data Warehouse (EDW)
- Proposed, designed, and supported the ETL implementation using Teradata tools and technologies such as BTEQ, MultiLoad, FastLoad, FastExport, SQL Assistant, and Teradata Parallel Transporter (TPT); see the BTEQ sketch after this list
- Analyzed business processes; involved in requirements gathering and template creation; designed and created the conceptual, logical, and physical data models utilizing data warehousing components
- Designed both 3NF data models for ODS and OLTP systems and dimensional data models
- Developed new and innovative analytical solutions that meet customer requirements
- Developed Database Architecture using design standards and tools.
- Designed and built relational database models and defined data requirements to meet the business requirements.
- Wrote shell scripts to accumulate the MTD source files; collaborated with architects and managers to review solutions and data strategy
- Used a data virtualization tool to connect multiple heterogeneous sources without physically moving the data.
- Collaborated with various stakeholders in authoring policies and procedures, including the Data Management Working Group (DMWG) Charter and the DHS Authoritative and/or Trusted Data Methodology.
- Worked with data compliance and data governance teams to maintain data models, metadata, and data dictionaries; defined source fields and their definitions.
- Worked with the DBA to create a best-fit physical data model from the logical data model using forward engineering in Erwin 9.6
- Gathered business requirements and analyzed business processes (user- and source-driven); performed the role of data analyst; developed use cases.
- Called Greenplum Business Rules, Data Rules and Transform Rules functions using Informatica Stored Procedure Transformation.
- Performance Tuning (Database Tuning, SQL Tuning, Application/ETL Tuning).
- Responsible for design and implementation of ad-hoc tools so users could view the results and compare them against other classrooms, districts, and statewide entities
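A minimal BTEQ sketch of the kind of Teradata ETL step described above; the logon string and the database and table names (stg.meter_reading, dw.fact_meter_reading) are hypothetical placeholders:

```sql
.LOGON tdprod/etl_user,password;

/* Move one month of meter readings from staging into the warehouse
   fact table; BTEQ dot-commands handle session control and errors. */
INSERT INTO dw.fact_meter_reading (meter_id, read_dt, read_value)
SELECT meter_id, read_dt, read_value
FROM   stg.meter_reading
WHERE  read_dt BETWEEN DATE '2017-01-01' AND DATE '2017-01-31';

/* Abort with a nonzero return code if the insert failed, so the
   calling shell script can detect the failure. */
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```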
Environment: Erwin 9.6, Teradata 15, Informatica 9.6, Metadata, MS Office, SQL Architect, DBA, OLAP, T-SQL, PL/SQL, SharePoint, Agile, NoSQL, QlikView, ODS, Hadoop, OLTP
Confidential - Austin TX
Sr. Data Architect/Data Modeler
Responsibilities:
- Created and oversaw the data warehouse, data quality assessments and reporting, data modeling and integration, and data profiling.
- Interacted with business users and studied available documents/applications to understand the requirements.
- Architected a detailed migration methodology to transfer data from multiple sources (flat files, Teradata, and Oracle DB) and presented it to senior management.
- Analyzed requirements and put together the overall solution architecture for the project; created the data models, system design, and technical design documents for multiple work streams on the project.
- Worked closely with business analysts and business users to identify source data, source validation rules, and process calculations; finalized the granularity and documented the business rules
- Designed ER diagrams, logical model (relationship, cardinality, attributes, and, candidate keys) and physical database (capacity planning, object creation and aggregation strategies) as per business requirements.
- Coordinated all teams to centralize metadata-management updates and follow the standard naming and attribute conventions for data and ETL jobs.
- Implemented logical and physical data modeling with Star and Snowflake techniques using Erwin in the Data Mart.
- As an architect, implemented an MDM hub to provide clean, consistent data for an SOA implementation
- Researched, evaluated, architected, and deployed new tools, frameworks, and patterns to build sustainable Big Data platforms for the clients
- Cleansed, extracted, and analyzed business data on a daily basis and prepared ad-hoc analytical reports using Excel and SQL
- Prepared data models from existing archived data to support reporting efforts; assisted business analysts in mapping the data from source to target
- Performed Greenplum database version upgrades on QA, development, and production environments.
- Converted native Oracle functions to Greenplum DDL for Business Objects to use.
- Wrote numerous BTEQ scripts to run complex queries on the Teradata database.
- Generated ad-hoc SQL queries using joins, database connections and transformation rules to fetch data from legacy database systems and DB2
- Designed, created and tuned physical database objects (tables, views, indexes, PPI, UPI, NUPI, and USI) to support normalized and dimensional models.
- Forward-engineered the physical model to generate the DDL scripts implemented on the Oracle 12c database.
- Met with technical and business staff to document requirements and current processes for the data warehouse solution.
- Developed scripts using NZSQL, NZLOAD functions to load new database tables.
- Implemented migration tasks from Oracle to PostgreSQL databases
- Developed Source to Target Mappings using Informatica PowerCenter Designer from Oracle, Flat files sources to Teradata database, implementing the business rules.
- Developed a Conceptual model using Erwin based on requirements analysis
- Worked on the IBM Optim tool to facilitate data archival and supported decommissioning of outdated/unused applications.
- Developed Star- and Snowflake-schema dimensional models for the data warehouse.
- Developed and maintained data dictionary to create metadata reports for technical and business purpose.
- Generated ad-hoc reports using Crystal Reports
- Performed tuning and optimization of complex SQL queries using Teradata EXPLAIN (see the sketch after this list).
- Designed and developed Informatica ETL mappings to extract master and transactional data from heterogeneous data feeds and load it into the target tables
- Designed and Developed Oracle PL/SQL and Shell Scripts, Data Import/Export, Data Conversions and Data Cleansing
- Created action plans to track identified open issues and action items related to the project.
- Created several Tableau dashboard reports and heat-map charts, and supported numerous dashboards, pie charts, and heat-map charts built on the Teradata database.
- Created a series of macros for various applications in Teradata SQL Assistant.
- Prepared analytical and status reports and updated the project plan as required.
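A small sketch of the EXPLAIN-driven tuning mentioned above, in Teradata-flavored SQL; the fact and dimension names (dw.fact_txn, dw.dim_customer) and columns are hypothetical:

```sql
/* Surface the optimizer's plan for a slow query: EXPLAIN reports the
   join strategy, spool-size estimates, and any full-table scans. */
EXPLAIN
SELECT c.customer_id, SUM(t.txn_amt) AS yearly_amt
FROM   dw.fact_txn t
JOIN   dw.dim_customer c ON c.customer_sk = t.customer_sk
WHERE  t.txn_dt BETWEEN DATE '2016-01-01' AND DATE '2016-12-31'
GROUP  BY c.customer_id;

/* Refresh statistics on the join and filter columns so the optimizer
   has accurate demographics to plan with. */
COLLECT STATISTICS ON dw.fact_txn COLUMN customer_sk;
COLLECT STATISTICS ON dw.fact_txn COLUMN txn_dt;
```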
Environment: Oracle 12c, Erwin 9.6, IBM Optim, Guardium, Tableau, Aginity Netezza Workbench, Windows 7 Enterprise, MS OFFICE, Teradata SQL Assistant, Netezza, PL/SQL, SQL, SQL Server, Metadata, PostgreSQL
Confidential, St. Louis, MO
Sr. Data Modeler/Data Analyst
Responsibilities:
- Analyzed business processes, gathered requirements from business users and project stakeholders
- Participated in sessions with business and technical teams to align the project goals to the data warehouse requirements.
- Created the logical data model from the conceptual model and converted it into the physical data models.
- Identified the Facts and Dimensions
- Identified and tracked the slowly changing dimensions and determined the hierarchies in dimensions
- Prepared data dictionaries and source-to-target mapping documents to ease the ETL process and users' understanding of the data warehouse objects
- Extensively used Star and Snowflake Schema methodologies in building and designing the logical data model into Dimensional Models.
- Designed and developed database tables and loaded into Netezza Database
- Performed GAP analysis with Teradata MDM and Drive (SQL) data models to get clear understanding of requirements.
- Extensively used the reverse engineering feature of the Oracle Data Modeler tool to create logical data models from the existing OLTP system.
- Generated ad-hoc SQL queries using joins, database connections and transformation rules to fetch data from legacy Oracle database systems
- Performed bulk data loads from multiple data sources (Oracle, legacy systems) to the Teradata RDBMS using BTEQ, MultiLoad, and FastLoad.
- Identified and tracked slowly changing dimensions, Role-Playing and Multi-Valued Dimensions.
- Worked on the client's financial database to identify reporting and data analysis requirements and translated them into conceptual, logical, and then physical data models
- Worked with ETL teams and used Oracle Data Integrator to create repositories and establish users, groups, and their privileges
- Enforced referential integrity in the OLTP data model for consistent relationship between tables and efficient database design
- Used forward engineering to create a Physical Data Model with DDL, based on the requirements from the Logical Data Model
- Facilitated in developing testing procedures, test cases and User Acceptance Testing (UAT)
- De-normalized the database tables to fit them into the star schema of the data warehouse.
- Heavily involved in writing complex SQL queries to pull the required information from the database using Teradata SQL Assistant.
- Created UNIX scripts for triggering the stored procedures and macros (see the macro sketch after this list).
- Worked with project team to create and update project documents to communicate project roadmap to higher management during weekly project meeting
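A hypothetical sketch of the kind of Teradata macro those UNIX scripts triggered; the macro, database, and table names (rpt.monthly_balance, dw.fact_txn) are illustrative only:

```sql
/* Wrap a parameterized monthly summary in a Teradata macro, so a
   UNIX batch script can trigger it with a single BTEQ statement. */
CREATE MACRO rpt.monthly_balance (run_month DATE) AS (
  SELECT acct_id, SUM(txn_amt) AS month_total
  FROM   dw.fact_txn
  WHERE  txn_dt >= :run_month
    AND  txn_dt <  ADD_MONTHS(:run_month, 1)
  GROUP  BY acct_id;
);

/* Invoked from the shell-driven BTEQ session as: */
EXEC rpt.monthly_balance (DATE '2015-06-01');
```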
Environment: Erwin 9.5/9.1, Oracle SQL Developer, Oracle Data Modeler, Tableau, PL/SQL, Windows XP, Oracle 11g, MS OFFICE, OBIEE 11g, GitHub, Teradata, Netezza, Informatica
Confidential - Malvern, PA
Sr. Data Modeler/ Data Analyst
Responsibilities:
- Participated in requirement gathering session with business users and sponsors to understand and document the business requirements as well as the goals of the project
- Conducting the meetings with business users to gather data warehouse requirements
- Conducted team meetings and Joint Application Design (JAD) session
- Worked with the Business Analysts and the QA team on their testing, and with the DBA, for requirements gathering, business analysis, testing, and project coordination
- Developed normalized Logical and Physical database models to design OLTP system for education finance applications
- Wrote, tested, and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts, DML, and DDL.
- Created dimensional model for the reporting system by identifying required dimensions and facts using ERwin
- Implemented the slowly changing dimension scheme (Type II) for most of the dimensions (see the SQL sketch after this list)
- Implemented Referential Integrity using primary key and foreign key relationships
- Translated business concepts into XML vocabularies by designing XML Schemas with UML
- Exhaustively collected business and technical Metadata and maintained naming standards
- Used Erwin for reverse engineering to connect to existing database and ODS to create graphical representation in the form of Entity Relationships and elicit more information
- Created physical data models using forward engineering
- Worked with ETL teams and used Informatica Designer, Workflow Manager and Repository Manager to create source and target definition, design mappings, create repositories and establish users, groups and their privileges
- Used Erwin's Model Mart for effective model management - sharing, dividing, and reusing model information and designs for productivity improvement; involved in data mapping
- Created UNIX shell scripts for Informatica ETL tool to automate sessions.
- Consulted with client management and staff to identify and document business needs and objectives, current operational procedures for creating the logical data model.
- Coded using Teradata analytical functions and Teradata BTEQ SQL; wrote UNIX scripts to validate, format, and execute the SQL in the UNIX environment.
- Participated in performance management and tuning for stored procedures, tables and database servers
- Worked with ad-hoc reporting using Crystal Reports 9 and T-SQL.
- Responsible for integrating the work tasks with relevant teams for a smooth transition from the testing to the implementation phase.
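A minimal two-step sketch of the Type II slowly-changing-dimension pattern mentioned above, in Teradata/Oracle-flavored SQL; the dw_dim_customer and stg_customer tables and the tracked address column are hypothetical, and surrogate-key generation (identity column or sequence) is omitted for brevity:

```sql
/* Step 1: expire the current row for any customer whose tracked
   attribute (here, address) changed in the latest staging feed. */
UPDATE dw_dim_customer
SET    eff_end_dt   = CURRENT_DATE - 1,
       current_flag = 'N'
WHERE  current_flag = 'Y'
  AND  customer_id IN (SELECT s.customer_id
                       FROM   stg_customer s
                       JOIN   dw_dim_customer d
                         ON   d.customer_id  = s.customer_id
                        AND   d.current_flag = 'Y'
                       WHERE  d.address <> s.address);

/* Step 2: insert a fresh current version for new customers and for
   the rows just expired in step 1. */
INSERT INTO dw_dim_customer
       (customer_id, address, eff_start_dt, eff_end_dt, current_flag)
SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dw_dim_customer d
                   WHERE  d.customer_id  = s.customer_id
                     AND  d.current_flag = 'Y');
```

Because step 1 flips changed rows to 'N' before step 2 runs, the NOT EXISTS check inserts new versions for both brand-new customers and just-expired ones.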
Environment: Erwin, SQL, SQL Server 2012, Rational Rose, Oracle 10g, Windows XP/NT/2000, Microsoft Access, Teradata SQL Assistant 12.0/13.11, DataStage 8.1, UNIX, Tableau, GitHub, Control-M 6.3.01, MS OFFICE, DB2.
Confidential - Charlotte, NC
Sr. Data Modeler/ Data Analyst
Responsibilities:
- Gathered and translated business requirements into detailed, production-level technical specifications, new features, and enhancements to existing technical business functionality.
- Involved in the redesign of the existing OLTP system.
- Involved in preparing Conceptual Data Models and conducted controlled brainstorming sessions with project focus groups.
- Used Model Mart of ERWIN for effective model management of sharing, dividing and reusing model information and design for productivity improvement.
- Conducted team meetings and JAD sessions
- Identified the objects and the relationships between them to develop a logical model in Erwin, and later translated the model into a physical model using Erwin's Forward Engineering technique.
- Used Model Marts to understand different versions of data models existing in the system.
- Involved in dimensional modeling, identifying the Facts and Dimensions and different hierarchies.
- Implemented Referential Integrity using primary key and foreign key relationships.
- Used Erwin tool for relational database and dimensional data warehouse designs.
- Created entity-relationship diagrams, functional decomposition diagrams and data flow diagrams.
- Worked extensively on SQL querying using Joins, Alias, Functions, Triggers and Indexes.
- Managed all indexing, debugging, optimization, and query-optimization techniques for performance tuning using T-SQL.
- Created/modified SQL joins, sub-queries, and other T-SQL code to implement business rules.
- Developed Logical data model using Erwin and created physical data models using forward engineering.
- Created documentation and test cases, worked with users for new module enhancements and testing.
- Extensively used SharePoint for version control
- Executed the UNIX shell scripts that invoked SQL*Loader to load data into tables.
- Helped in creating Source-Target Mapping document.
- Conducted performance tuning of the database, including creating indexes and optimizing SQL statements (see the T-SQL sketch after this list).
- Worked with business analysts to design weekly reports using Crystal Reports.
- Reviewed the existing data model and documented suspected design issues affecting system performance.
- Conducted logical data model walkthroughs and validation.
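A small T-SQL sketch of the index-driven tuning described above; dbo.orders and its columns are hypothetical, with order_id assumed to be the clustered primary key:

```sql
-- Hypothetical slow report query: range filter on order_dt.
SELECT order_id, customer_id, order_amt
FROM   dbo.orders
WHERE  order_dt >= '2012-01-01' AND order_dt < '2012-02-01';

-- A covering nonclustered index on the filter column, INCLUDE-ing the
-- other selected columns, lets the query avoid key lookups entirely
-- (order_id rides along as the clustered key).
CREATE NONCLUSTERED INDEX ix_orders_order_dt
    ON dbo.orders (order_dt)
    INCLUDE (customer_id, order_amt);
```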
Environment: SQL Server 2008R2, SQL Server Reporting Services (SSRS), Business Intelligence Development Studio (BIDS), MS Excel, Erwin, Windows XP, Oracle, Teradata, Netezza, Metadata, Informatica.
Confidential
Data Analyst/Data Modeler
Responsibilities:
- Gathered the requirements document and analyzed the process.
- Performed data profiling, analyzed the requirements, and developed models using reverse engineering.
- Translated business requirements into conceptual, logical, and integration data models; modeled databases for integration applications in a highly available, high-performance configuration using ER/Studio.
- Created the Physical Data Model from the Logical Data Model using the Compare and Merge Utility in ER/Studio and worked with the naming standards utility.
- Developed advanced SQL queries to extract, manipulate, and/or calculate information to fulfill data and reporting requirements including identifying the tables and columns from which data is extracted.
- Worked in managing the Business Unit's KYC data and associated reports on file servers and databases.
- Supported, promoted, and documented all data management processes and procedures related to banking data per KYC guidelines.
- Ensured the quality, consistency, and accuracy of data in a timely, effective and reliable manner.
- Understood user requirements around technical data, data interaction between systems, and system dependencies.
- Worked to ensure high levels of data consistency between systems.
- Worked with the global technical data community and assisted with implementing the global data management strategy.
- Worked with Oracle SQL Developer to develop the SQL scripts.
- Developed PL/SQL programs, including views, stored procedures, packages, functions, and database triggers (see the sketch after this list).
- Worked with ETL teams and used InformaticaDesigner, Workflow Manager and Repository Manager to create source and target definition, design mappings, create repositories.
- Developed and maintained data dictionary to create metadata reports for technical and business purpose.
- Prepared analytical and status reports and updated the project plan as required.
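A minimal PL/SQL sketch in the spirit of the stored-procedure work above, flagging KYC records whose review date has lapsed; the kyc_customer table and its columns are hypothetical placeholders:

```sql
CREATE OR REPLACE PROCEDURE flag_stale_kyc (p_cutoff IN DATE) AS
BEGIN
  -- Mark customer KYC records not reviewed since the cutoff date.
  UPDATE kyc_customer
  SET    review_status = 'STALE'
  WHERE  last_review_dt < p_cutoff
    AND  review_status  = 'CURRENT';

  -- Report how many rows were flagged, then persist the change.
  DBMS_OUTPUT.PUT_LINE(SQL%ROWCOUNT || ' records flagged for re-review.');
  COMMIT;
END flag_stale_kyc;
/
```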
Environment: ER/Studio, Oracle SQL Developer, MS Access, Teradata, Informatica PowerCenter 4.7, Windows XP, Excel, Crystal Reports.