The document provides a summary of a candidate's experience as a Senior Datastage Designer and Developer with over 9 years of experience building data warehouses. Key skills include data modeling, ETL job design and development in Datastage, data profiling and cleansing, and administration of Datastage environments. Recent projects include roles at Macy's, NYC DOHMH, and Sears developing ETL solutions to integrate data from various sources into data marts and warehouses.
KRISHNA KISHORE PUSAPATI
kishore1.pusapati@gmail.com
SUMMARY
• 9+ years of experience building data warehouses as a Senior DataStage Designer, Senior DataStage Developer, DataStage Administrator, and QualityStage Developer in consultant roles.
• Strong knowledge of Data Warehouse Architecture, including Star Schema, Snowflake Schema, Fact and Dimension Tables, and Physical and Logical Data Modeling.
• Extensive knowledge and wide-ranging experience in data profiling, data cleansing, and designing and developing DataStage jobs in DataStage versions 11.3, 9.1, 8.7, 8.5, and 8.1 (IBM InfoSphere Information Server suite).
• Actively participated in migrating DataStage jobs between different versions.
• Good experience with the DataStage Administrator, DataStage Director, and DataStage Designer clients.
• Involved in DataStage version migrations, such as 8.5 to 9.1 and 8.1 to 9.1.
• Actively participated in user/client meetings to understand requirements and align them with the ETL design on both OLAP and OLTP systems.
• Involved in and assisted the team during different phases of the project: Design, Development, Testing, Pre-production, and Production.
• Good experience writing SQL*Loader scripts and UNIX shell scripts to automate file manipulation and data loading procedures.
• Experience gathering business requirements, preparing software requirement specification (SRS) documents, and actively participating in designing the ETL model with the architect.
• Implemented procedures for maintenance, monitoring, backup, and recovery operations for the ETL environment.
• Reported daily, weekly, and monthly on the health and performance of the ETL environment and jobs.
• Infrastructure optimization, troubleshooting, and debugging.
• Actively participated in DataStage administration activities.
• Maintained ownership of release activities that interact with ETL projects.
• Work experience in production support for critical applications.
• Processed large volumes of data (more than 10 million records) on a daily frequency, from flat files as well as database tables.
• Worked on performance tuning with DataStage and SQL, and on Oracle and DB2 bulk loads and simple inserts.
• Experience in performance tuning DataStage jobs for both historical migrations and daily loads.
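The SQL*Loader and shell-scripting automation mentioned in the summary can be sketched as a small wrapper script. This is a minimal illustration only: the directory names, table name, control-file contents, and sample data below are hypothetical placeholders, not details from any actual project, and the real `sqlldr` invocation is left as a comment.

```shell
#!/bin/sh
# Minimal sketch of a shell wrapper that stages a flat file and prepares a
# SQL*Loader bulk load. All paths, table names, and file contents are
# illustrative placeholders.

set -e

DATA_DIR=./incoming
ARCHIVE_DIR=./archive
CTL_FILE=customers.ctl

mkdir -p "$DATA_DIR" "$ARCHIVE_DIR"

# Stage a sample comma-delimited input file for the demonstration.
printf '1,Alice,Atlanta\n2,Bob,Boston\n' > "$DATA_DIR/customers.dat"

# Write a SQL*Loader control file describing the load (in practice this is
# usually maintained alongside the job rather than generated on the fly).
cat > "$CTL_FILE" <<'EOF'
LOAD DATA
INFILE 'incoming/customers.dat'
APPEND INTO TABLE stg_customers
FIELDS TERMINATED BY ','
(cust_id, cust_name, cust_city)
EOF

# A real run would now invoke SQL*Loader against the database, e.g.:
#   sqlldr userid="$DB_USER/$DB_PASS" control="$CTL_FILE" log=load.log
# Here we only verify the inputs before archiving the processed file.
if [ -s "$DATA_DIR/customers.dat" ] && [ -s "$CTL_FILE" ]; then
    echo "staged $(wc -l < "$DATA_DIR/customers.dat" | tr -d ' ') rows for load"
    mv "$DATA_DIR/customers.dat" "$ARCHIVE_DIR/"
else
    echo "load aborted: missing input file or control file" >&2
    exit 1
fi
```

A wrapper of this shape is what a scheduler such as Autosys or Control-M would typically call, with the archive step preventing the same file from being loaded twice.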
TECHNICAL SKILLS
ETL Tools: DataStage 11.3, 9.1, 8.7, 8.5, 8.1; QualityStage; Information Services Director; Standardization Rules Designer; Information Analyzer
Databases: Oracle 11g, Oracle 10g, Oracle 9i, DB2, Teradata, Netezza
Operating Systems: AIX, Linux, UNIX, Windows 2003 Server, Windows XP
Languages: SQL, PL/SQL, C++, Perl scripting
Scheduling Tools: UNIX cron, Batchman scheduler, DataStage scheduler, Autosys, Control-M
Certifications: Certified IBM DataStage Professional; Certified IBM QualityStage Professional
PROFESSIONAL EXPERIENCE
MACY'S SYSTEMS AND TECHNOLOGY, Johns Creek, GA Oct 2015 - Present
DataStage Lead Consultant
The business owner requires the ability to manage inbound and outbound campaigns (offers/messages), including those from
external lists, and operational reporting for Email, SMS, Mobile Push and Site media types for Macy’s/Bloomingdales
(M/BL) using the SAS MA/MO tools. The key goal of the RTO effort is to replace separate and disparate systems by channel
with a single enterprise omnichannel Campaign, Optimization, and Real-Time Offer/Message solution to enable a new
platform for delivering targeted, relevant, and personalized marketing campaigns and offers/messages to customers. Support
a shift in Marketing from the traditional event/calendar-driven paradigm towards a customer-centric, relationship-driven
approach to communicate with our customers in a contextually relevant dialogue across omnichannel marketing vehicles. To
achieve the above goals Macy’s/Bloomingdales (M/BL) will utilize the SAS Campaign Management/Campaign Optimization
(MA/MO) product, which will integrate with existing execution/delivery systems and a newly created data mart. DataStage is the ETL tool used to build the required data mart.
• Working on the environment set up for the current project.
• Gathering business requirements, preparing software requirement specification (SRS) documents, and actively
participating in the ETL model design with the architect.
• Designing the mapping documents for the development of ETL jobs.
• Actively participating in the dimensional modeling for the data marts and building the flattened structure for SAS.
• Working on the IBM DataStage job design templates.
• Developing the ETL jobs using DataStage and conducting unit testing for the jobs.
• Performing administrative tasks for the setup of the DataStage environment.
• Creating tables, indexes and referential-integrity constraints for the data marts.
• Maintaining data quality and performing data profiling of the client data.
• Managing offshore delivery teams directly (assigning tasks, tracking status, holding regular calls, etc.).
• Preparing unit test cases and maintaining the quality of the ETL code.
• Working on Oracle performance tuning, data modeling and database design.
• Responsible for the code deployment process between the environments.
• Working on performance issues with the initial and incremental loads, and fixing code bugs.
• Working on the reject processing and audit control jobs.
• Extensive experience creating/deleting projects, creating/modifying environment parameters, and managing multiple
APT_CONFIG_FILE configurations.
• Environment: IBM InfoSphere QualityStage and DataStage 11.3.2, Flat Files, DB2, Oracle 11g, SAS, Subversion,
Control-M.
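The code-deployment responsibility above typically revolves around istool export/import between project tiers. The sketch below shows the general shape of such a promotion; the hosts, project names, job name and paths are all hypothetical, and the commands are collected into a dry-run plan file rather than executed, since an actual run needs a live InfoSphere services tier.

```shell
#!/bin/sh
# Dry-run sketch of promoting one DataStage job from a DEV to a QA project
# with istool. Hosts (devhost/qahost), projects (RTO_DEV/RTO_QA) and the
# job name are hypothetical examples, not values from the resume.
PLAN="/tmp/deploy_plan.txt"
: > "$PLAN"

plan() { echo "$*" >> "$PLAN"; }   # swap the echo for "$@" on a real tier

JOB="LoadCustomerDim"              # hypothetical job name
ARCHIVE="/tmp/${JOB}.isx"

# 1. Export the job from the DEV project into an .isx archive
plan istool export -domain devhost:9080 -username dsadm \
     -archive "$ARCHIVE" -datastage "devhost/RTO_DEV/Jobs/${JOB}.pjb"

# 2. Import the archive into the QA project
plan istool import -domain qahost:9080 -username dsadm \
     -archive "$ARCHIVE" -datastage "qahost/RTO_QA"

cat "$PLAN"
```

In practice the plan file would be replaced by direct execution plus a compile step in the target project before Control-M picks the job up.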
NYC DOHMH, Long Island City, NY April 2015- Sep 2015
DataStage and QualityStage Lead Consultant
NYC DOHMH is continuing a project to evaluate the NY/NY III Supportive housing program by comparing and identifying
individuals in supportive housing to individuals who are not by using health and administrative data sets. The Division of
Informatics, Information Technology, and Telecommunications (DIITT) is to develop a probabilistic matching algorithm
utilizing IBM Infosphere Quality Stage/Data Stage v9.1 to de-duplicate persons in supportive housing against multiple
disparate health data sets. These data are then included in a SQL relational data warehouse, based on the results of the
match, to link person-level health data across different sources. DIITT is a multidisciplinary unit with the goal of combining
cutting-edge informatics, application development, analysis and reporting to implement innovative IT solutions for public health.
• Responsible for standardizing different data sources in one common SQL table structure for de-duplication.
• Extensively worked with IBM InfoSphere QualityStage v9.1 investigation, standardization (including CASS address
verification), and Unduplicate and Reference file matching.
• Set up a one-time person match using historic data; worked with Informatics, Technology and Epidemiology staff
to provide data sets for evaluating the algorithm and setting thresholds.
• Implemented stored procedures to perform recurring matching going forward, including updating the SQL data
warehouse.
• Experience in real-time stages like XML and JSON (JavaScript Object Notation) parsing to handle web-based data.
• Worked with contract libraries containing the schema that describes the JSON document for company profiles.
• Prepared technical documentation of the implementation.
• Performed Unit Testing, Integration Testing, Regression Testing and User Acceptance Testing (UAT) for every code
change and enhancement.
• Implemented reusable components (Containers) for the error handling and error mailing.
• Worked on improving the performance of the designed jobs by using various performance tuning strategies
• Experience with complex SQL scripts, stored procedures, creation of indexes and complex joins in SQL Server 2012.
• Ability to assist with operations and production coverage.
• Experience installing and upgrading the DataStage server product suite on different tiers.
• Experience with post-install configuration and with configuring various data sources for the enterprise plug-ins.
• Experience with stopping and starting the server and monitoring the logs.
• Extensive experience creating/deleting projects, creating/modifying environment parameters, and managing multiple
APT_CONFIG_FILE configurations.
• Outstanding communication and customer-support skills and experience.
• Environment: IBM InfoSphere QualityStage and DataStage 9.1, Flat Files, SQL Server 2012.
SEARS, Livonia, MI February 2013- March 2015
Data Stage and Quality Stage Lead Consultant/Infosphere Data Stage Admin
The objective of the project is to process data from source files and databases by performing ETL to develop job streams
for the interfaces, which process data from source to target. These feeds provide automated reporting capability for the
client businesses, achieved through extraction, transformation and loading using DataStage.
• Worked with the Business process analyst to thoroughly understand the different business processes and requirements.
• Acted as onsite coordinator and led the team.
• Implemented administrative tasks using DataStage Administrator.
• Involved in migration of the DataStage jobs from the older version (8.1) to the newer version (9.1).
• Created backups of the jobs and generated configuration files.
• Created users and groups, and assigned roles and responsibilities to the users.
• Documented user requirements, translated them into system solutions, and developed the implementation plan and
schedule.
• Involved in the implementation of the clustering architecture (grid technology).
• Developed the OLTP data models for the real time retail data.
• Developed parallel jobs between multiple source systems and sequencers.
• Extensively worked on Information Analyzer to perform data profiling and to identify and analyze metadata
(column analysis). Developed detailed data profiling reports.
• Published XML documents from tabular data.
• Parsed XML documents into tabular data.
• Accessed Web services with input and output data.
• Extensive experience creating/deleting projects, creating/modifying environment parameters, and managing multiple
APT_CONFIG_FILE configurations.
• Experience in real-time stages like XML and JSON (JavaScript Object Notation) parsing to handle web-based data.
• Worked with contract libraries containing the schema that describes the JSON document for company profiles.
• Used the JSON stage to get data from the .XSD format.
• Implemented reusable components (Containers) for the error handling and error mailing.
• Worked on Information Analyzer for column analysis, primary key analysis and foreign key analysis and developed
detailed data quality assessment (DQA) reports.
• Ability to assist with operations and production coverage.
• Used DataStage and QualityStage Designer to develop customized rule sets for standardizing fields like
individual names, organization names, addresses and phone numbers.
• Extensively used Data Stage and Quality Stage Designer to design and develop jobs for extracting, cleansing,
transforming, integrating, and loading data using various stages like Standardize, Match Frequency, Reference Match,
Unduplicate Match, Survive, Remove Duplicate, Surrogate Key, Aggregator, Funnel, Join, Merge, Lookup, Change
Capture, Change Apply and Copy.
• Worked with complex SQL scripts, stored procedures, creation of indexes and complex joins in DB2.
• Worked with Data Stage Director to monitor and analyze performance of individual stages and run Data Stage parallel
jobs and sequencers.
• Extensively worked on migrating Data Stage jobs from development to test and to production environments.
• Performed Unit Testing, Regression Testing, and User Acceptance Testing (UAT) for every code change and
enhancement.
• Worked on improving the performance of the designed parallel jobs and sequencers by using various performance tuning
strategies.
• Environment: IBM InfoSphere DataStage 8.1/9.1, Information Analyzer, IBM AIX, Flat Files, Netezza NPS 7.0,
DB2 9.7.
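Running and monitoring parallel jobs from DataStage Director, as described in the bullets above, has a command-line counterpart in dsjob that is commonly scripted for operations coverage. The sketch below uses hypothetical project and job names and writes the commands to a dry-run plan file instead of executing them, since dsjob only exists on an engine tier.

```shell
#!/bin/sh
# Dry-run sketch of driving and monitoring a job sequence with dsjob.
# Project (SEARS_DW), sequence name and parameter are hypothetical examples.
PLAN="/tmp/dsjob_plan.txt"
: > "$PLAN"

plan() { echo "$*" >> "$PLAN"; }   # swap the echo for "$@" on an engine tier

PROJECT="SEARS_DW"
JOB="seq_DailyInterfaceLoad"

# Run the sequence with a runtime parameter and wait for completion
plan dsjob -run -wait -param pLoadDate=2014-07-01 "$PROJECT" "$JOB"
# Check the finishing status of the last run
plan dsjob -jobinfo "$PROJECT" "$JOB"
# Pull a summary of recent log entries for triage
plan dsjob -logsum -type INFO -max 50 "$PROJECT" "$JOB"

cat "$PLAN"
```

A production wrapper would inspect the dsjob exit code (0/1 for OK/warnings) and page the on-call developer on anything higher, which is the usual shape of the alerting mentioned elsewhere in this resume.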
WALGREENS, Detroit, MI March 2011 – January 2013
Senior ETL Lead Data Stage Developer/Admin
The scope of the project is to develop a data warehouse built upon integrated legacy-system data feeds received from many
business divisions. Data marts were built according to the business-critical user requirements.
• Involved in different phases of building the Data Marts like analyzing business requirements, ETL process design,
performance enhancement, go-live activities and maintenance.
• Interacted with the end users, Business Analysts and Architects to collect and understand the business requirements;
documented them and translated the requirements into system solutions.
• Worked as a lead data stage developer.
• Experienced with creation of users, groups and different tasks using DataStage Administrator.
• Added environment variables to the project and was involved in the migration of the DataStage version from 8.5 to 9.1.
• Maintained version control of the jobs.
• Implemented reusable components (Containers) for the error handling and error mailing.
• Designed ETL process as per the requirements and documented ETL process using MS Visio.
• Extensively worked on Data Stage Designer, Director and Administrator.
• Extensively used Data Stage Designer to design, develop ETL jobs for extracting data from Oracle tables and Flat files
and load them into respective DataMarts.
• Importing and Exporting of jobs through DataStage Designer into different projects.
• Extensively worked with Fact and Dimension tables to produce source - target mappings based upon business specs.
• Extensively worked on Information Analyzer to perform data profiling and to identify and analyze metadata
(column analysis). Developed detailed data profiling reports.
• Worked on UNIX Shell Scripts to automate the file transfer process (FTP) and Scheduled jobs using AutoSys.
• Worked with DataStage Director to monitor and analyze performance of individual stages and run DataStage jobs.
• Experience in complex SQL scripts, stored procedures, creation of indexes and complex joins in Oracle.
• Implemented enhancements to the current ETL programs based on the new requirements.
• Involved in setting up of war room sessions for the closure of issues.
• Involved in the design of BIS architecture, Code reviews of DataStage jobs and implemented best practices in DataStage
jobs.
• Involved in tool upgrades and in migration of jobs from lower to higher versions.
• Performed Unit Testing, Integration Testing, Regression Testing and User Acceptance Testing (UAT) for every code
change and enhancement.
• Worked on improving the performance of the designed jobs by using various performance tuning strategies.
• Extensively worked on coordination with Offshore.
• Environment: IBM InfoSphere DataStage 8.5/9.1, Oracle (10g/11g), SunOS 5.1, HP Quality Center, Cognos 10,
AutoSys, MS Visio, PL/SQL and Flat Files.
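The file-transfer automation mentioned above (UNIX shell scripts driving FTP ahead of AutoSys-scheduled loads) usually takes the form of a wrapper that stages source files into a landing directory. This is a minimal sketch under assumed names: the host, user, remote path and file pattern are hypothetical, and the generated ftp command script is written out rather than executed.

```shell
#!/bin/sh
# Sketch of an FTP staging wrapper of the kind used before DataStage loads.
# ftp.example.com, etl paths and the extract_*.dat pattern are hypothetical.
FTP_HOST="ftp.example.com"
LANDING_DIR="/tmp/landing"
CMD_FILE="/tmp/ftp_cmds.txt"

mkdir -p "$LANDING_DIR"

# Build the ftp command script: fetch the daily extract files in binary mode
cat > "$CMD_FILE" <<EOF
cd /outbound/daily
lcd $LANDING_DIR
binary
mget extract_*.dat
bye
EOF

# In production this would run:  ftp -inv "$FTP_HOST" < "$CMD_FILE"
# typically followed by a row-count check against a control file before the
# AutoSys job releases the downstream DataStage sequence.
echo "ftp command file written to $CMD_FILE"
```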
THAMESWATER, Reading, UK June 2010 – February 2011
DataStage and QualityStage Consultant
The scope of the WAMI (Thames Water) project was to give the client a single view of customer data. Initially the customer
data was scattered across different systems, so the client decided to create a single customer solution using the SAP CRM system.
• Involved in different phases of Data Integration like gathering and analyzing business requirements, ETL process design,
performance enhancement, go-live activities and support.
• Interacted with the users and Business Analysts to collect and understand the business requirements.
• Designed ETL process as per the requirements and documented ETL process using MS Visio.
• Extensively worked on DataStage Enterprise Edition (Formerly Parallel Extender) Designer, Director and Administrator.
• Extensively used DataStage Designer to design, develop ETL jobs for extracting data from different source system,
transforming the legacy data to SAP ready data and loading the data into SAP system.
• Extensively worked on administration tasks like creating projects for the respective releases and creating global
parameters which can be commonly used across the environments.
• Designed and developed DataStage jobs, containers to implement business rules.
• Used after/before-job routines to perform specific tasks.
• Worked with master data like Vendor Master and Customer Master, and with transaction data like AR invoices, IM- and
WM-managed inventory, and packing conversion.
• Scheduled jobs using DataStage Director and Unix Shell Scripts.
• Worked on UNIX shell scripts to automate the file transfer process (FTP).
• Worked with DataStage Director to monitor and analyze performance of individual stages and run DataStage jobs.
• Provided 24/7 support during the various test and mock phases and production support during the Go-Live.
• Extensively worked on migrating DataStage codes from development to quality and to production environments.
• Actively participated in discussions and implementation of Error Handling and Reject management functions.
• Experience in monitoring the InfoSphere suite server for I/O usage, memory usage, CPU usage, network usage, storage
usage.
• Involved in setting up of war room sessions for the closure of issues.
• Performed Unit Testing, Integration Testing, Regression Testing and User Acceptance Testing (UAT) for every code
change and enhancement.
• Worked on improving the performance of the designed parallel jobs and sequencers by using various performance
tuning strategies.
• Extensive Offshore coordination.
• Environment: IBM InfoSphere DataStage and QualityStage 8.7 Parallel Extender, HP Quality Center, MS SQL Server
2000/2005, Sybase, SAP BW 7.0, IBM AIX 5.x, SAP R/3, AutoSys, MS Visio, PL/SQL, DOS and Flat Files, Oracle
10g, DB2 9.7.
Hutchison 3G, Maidenhead, UK May 2009 – March 2010
DataStage and QualityStage Consultant
Hutchison 3G UK Limited's business and main IT focus is now on BI – analyzing data to predict future strategies for
driving business behavior, developing campaigns, and targeting clients to best effect with reliable, up-to-date information
that can help the company differentiate itself in the marketplace from its competition.
• Worked with the Business analyst to thoroughly understand the different business processes and requirements.
• Documented user requirements, translated them into system solutions, and developed the implementation plan and
schedule.
• Developed parallel jobs between multiple source systems and sequencers.
• Extensively worked on Investigate Stage to perform data profiling and identify and analyze the data patterns and
inconsistencies. Developed a detailed data profiling report.
• Worked on Information Analyzer for column analysis, primary key analysis and foreign key analysis.
• Extensively used DataStage and QualityStage Designer to design and develop jobs for extracting, cleansing,
transforming, integrating, and loading data using various stages like Standardize, Match Frequency, Unduplicate Match,
Survive, Remove Duplicate, Surrogate Key, Aggregator, Funnel, Join, Change Capture, Change Apply and Copy.
• Worked with DataStage Director to schedule, monitor and analyze performance of individual stages and run DataStage
jobs.
• Extensively worked on administration tasks like creating the repository, user groups and users, managing users by setting
up their profiles and privileges, and creating global parameters that can be commonly used across the environments.
• Designed and developed DataStage jobs to populate the MDM and send notifications as XML messages using the web
service transformer.
• Wrote various UNIX shell scripts for scheduling the jobs.
• Extensively worked on migrating DataStage jobs from development to test and to production environments.
• Involved in setting up of war room sessions for the closure of issues.
• Designed mechanism for identifying bad records during the loading process.
• Performed Unit Testing, Regression Testing, and User Acceptance Testing (UAT) for every code change and
enhancement.
• Worked on improving the performance of the designed jobs by using various performance tuning strategies.
• Environment: IBM WebSphere DataStage and QualityStage 8.1 Parallel Extender, Information Analyzer, IBM MDM,
IBM AIX 6.1, Oracle 10g, PL/SQL, Flat Files.
ALLSTATE Insurance, Fort Wayne, IN January 2009 – April 2009
DataStage Consultant
The main objective of the project was to develop an Enterprise Data Warehouse for reporting and analysis purposes. Matrix
extracts data from different source systems, applies business rules, loads the data into the warehouse and the different
DataMarts, and sends data to the representatives in the field.
• Involved in all the phases of building the DataMarts like analyzing the business requirements, ETL process design,
performance enhancement and maintenance.
• Extensively worked on DataStage Enterprise Edition (Formerly Parallel Extender) Administrator, Designer, Director,
and Manager.
• Extensively used DataStage Designer to design, develop ETL jobs for extracting, transforming and loading the data into
different DataMarts.
• Extensively worked on administration tasks like creating user profiles and assigning user privileges, creating projects for
the respective releases, and creating global parameters that can be commonly used across the environments.
• Involved in data modeling and creation of star-schema and snowflake dimensional DataMarts using the ERwin tool.
• Extensively worked in Teradata RDBMS V2R6.1.
• Strong working experience with various Teradata client utilities like MultiLoad, FastLoad and FastExport, and with
Teradata administrator activities.
• Designed mechanism to send alerts as soon as the jobs failed to PSO and the respective developers.
• Provided production support during the various phases.
• Extensively worked on migrating DataStage jobs from development to test and to production environments.
• Created Stored Procedures to transform the data and worked extensively in PL/SQL for various needs of the
transformations while loading the data.
• Experience in complex SQL scripts, stored procedures, creation of indexes and complex joins.
• Extensively worked on requirement analysis and impact analysis.
• Extensively worked on designing the solution framework and on data analysis.
• Extensively coordinated with Offshore.
• Performed Unit Testing, Integration Testing, Regression Testing and User Acceptance Testing (UAT) for every code
change and enhancement.
• Worked on improving the performance of the designed jobs by using various performance tuning strategies.
• Environment: IBM WebSphere DataStage 8.1 Parallel Extender, Cognos, AutoSys, Windows NT 4.0, IBM AIX 4.1, Red
Hat Enterprise Linux 5, Oracle 10g, PL/SQL, DOS and Sequential Files.
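The Teradata FastLoad experience claimed above typically means driving the utility from a shell-generated script. Below is a minimal sketch under stated assumptions: the server, credentials, database and table names are all hypothetical, and the script is only generated here (a real run would submit it with `fastload < "$FL_SCRIPT"`).

```shell
#!/bin/sh
# Sketch of generating a Teradata FastLoad script for a bulk load into an
# empty staging table. tdprod, stage_db, policy_stage, column names and the
# input file path are hypothetical examples.
FL_SCRIPT="/tmp/load_policy_stage.fl"

cat > "$FL_SCRIPT" <<'EOF'
LOGON tdprod/etl_user,etl_pwd;
DATABASE stage_db;
BEGIN LOADING policy_stage
  ERRORFILES policy_err1, policy_err2
  CHECKPOINT 100000;
SET RECORD VARTEXT "|";
DEFINE policy_id (VARCHAR(18)),
       holder_name (VARCHAR(60)),
       premium_amt (VARCHAR(12))
  FILE = /tmp/landing/policy_extract.dat;
INSERT INTO policy_stage VALUES (:policy_id, :holder_name, :premium_amt);
END LOADING;
LOGOFF;
EOF

echo "FastLoad script written to $FL_SCRIPT"
```

FastLoad requires an empty target table and pairs of error tables, which is why loads of this shape land in a staging table first and are merged onward with SQL.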
Ryvo Electronics, Atlanta, GA April 2007 – December 2008
PL/SQL and DataStage Developer
The objective of the project is to develop a system that gives the business owner intelligent information reports on the
existing business situation and on how the business can be analyzed. Involved in all the phases of building the DataMarts,
like analyzing the business requirements, ETL process design, performance enhancement and maintenance.
• Extensively Worked on Parallel Extender (PX), QualityStage.
• Extensively used DataStage Designer to design, develop ETL jobs for extracting, transforming and loading the data into
different DataMarts.
• Performed Import and Export of DataStage components and table definitions using DataStage Manager.
• Extensively worked on administration tasks like creating user profiles and assigning user privileges, creating projects for
the respective releases, and creating global parameters that can be commonly used across the environments.
• Developed Stored Procedures, Functions, database Triggers and created Packages to access databases from front end
screens.
• Developed both inbound and outbound interfaces to load data into targeted database and extract data from database to
flat files.
• Extensively used Data Stage Designer to develop a variety of jobs for extracting, cleansing, transforming, integrating and
loading data into target database.
• Extracted data from various source systems like Siebel, Oracle, Flat Files and SAP R/3.
• Extensively used SAP integration using SAP connection.
• Created Stored Procedures to transform the data and worked extensively in PL/SQL for various needs of the
transformations while loading the data.
• Extensively worked on requirement analysis and impact analysis.
• Performed Unit Testing, Integration Testing and User Acceptance Testing (UAT) for every code change and
enhancement.
• Worked on improving the performance of the designed jobs by using various performance tuning strategies.
• Wrote various UNIX shell scripts for scheduling the jobs.
• Environment: IBM WebSphere DataStage 8.1, Windows NT 4.0, UNIX, Linux, Oracle 8i, PL/SQL, IBM DB2, dBase III
Files, DOS and UNIX Sequential Files, MS Access, .csv Files, XML Files.
MetLife Insurance Company, Bangalore, India January 2007 - March 2007
Software Developer
The primary objective of the project is to develop a single process for the client's daily activities, used to calculate
premiums and analyze what types of policies people prefer.
• Automated SQL*Loader to load the data from the flat files.
• Used Materialized Views, Packages and Dynamic SQL.
• Created programs to transform data from legacy systems into the Oracle database.
• Worked on DataMarts using the ERwin tool.
• Created Stored Procedures to transform the data and worked extensively in PL/SQL for various needs of the
transformations while loading the data.
• Wrote various UNIX shell scripts for scheduling the jobs.
• Implemented enhancements to the current programs based on the new requirements.
• Experience in complex SQL scripts, stored procedures, creation of indexes and complex joins.
• Performed Unit Testing, Integration Testing and User Acceptance Testing (UAT) for every code change and
enhancement.
• Worked on improving the performance of the designed jobs by using various performance tuning strategies.
• Worked closely with Business Users and Data Quality Analysts after loading data for accuracy and consistency of data.
• Environment: MS Word, Mercury WinRunner, Mercury LoadRunner, Windows NT 4.0, AutoSys, UNIX, Knoppix
Linux, Oracle 9i/8i, PL/SQL, dBase III, DOS and UNIX Sequential Files, MS Access, ERwin, DB2 UDB EE Database.
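"Automated SQL Loader to load the data from the flat files," as listed above, generally means a shell script that generates a control file and invokes sqlldr. This sketch uses hypothetical table, column and file names; the control file is generated and the sqlldr invocation is only echoed, since it needs an Oracle client and credentials.

```shell
#!/bin/sh
# Sketch of automating SQL*Loader for a daily premium flat file. The table
# premium_stage, its columns and the file paths are hypothetical examples.
CTL_FILE="/tmp/premium_load.ctl"

cat > "$CTL_FILE" <<'EOF'
LOAD DATA
INFILE '/tmp/landing/premium_daily.csv'
APPEND INTO TABLE premium_stage
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( policy_no      CHAR,
  premium_amt    DECIMAL EXTERNAL,
  effective_date DATE "YYYY-MM-DD" )
EOF

# Real run (requires an Oracle client and valid credentials):
echo "sqlldr userid=etl_user@proddb control=$CTL_FILE log=/tmp/premium_load.log"
```

In a scheduled setup the wrapper would also parse the sqlldr log for rejected-row counts before marking the load successful.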
CERTIFICATION
IBM InfoSphere DataStage
IBM InfoSphere QualityStage
Received multiple performance awards.