Call: +91 8055223360

INFORMATICA 10.x DEV + ADMIN TRAINING IN KOCHI

2546 Ratings | 5/5 | 1880 Learners

In challenging times, good decision-making becomes critical. The best decisions are made when all the relevant available data is taken into consideration, and the best possible source for that data is a well-designed data warehouse. Whether you are making a new decision or introducing a new plan, data warehousing is very important.

Overview

Data warehouses are widely used within the largest and most complex businesses in the world. Usage within moderately large organisations, even those with more than 1,000 employees, remains surprisingly low at the moment. We are confident that the use of this technology will grow dramatically in the next few years.

ETL is one of the main processes in data warehousing. ETL stands for Extract, Transform and Load: data is extracted from source systems, transformed, and loaded into the data warehouse. Informatica is an ETL tool; it is very flexible and cheaper compared to other ETL tools.
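
To make the three steps concrete, here is a minimal, illustrative sketch in Python using the built-in sqlite3 module; the table and column names are invented for this example. Informatica lets you build the same flow visually, without hand-written code.

```python
import sqlite3

# Toy in-memory database standing in for a real source system and warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_orders (order_id INTEGER, amount TEXT)")
conn.execute("INSERT INTO src_orders VALUES (1, '100.50'), (2, '200.00')")
conn.execute("CREATE TABLE dw_orders (order_id INTEGER, amount REAL)")

# Extract: read raw rows from the source table.
rows = conn.execute("SELECT order_id, amount FROM src_orders").fetchall()

# Transform: cast the amount from text to a numeric type.
clean = [(order_id, float(amount)) for order_id, amount in rows]

# Load: write the transformed rows into the warehouse table.
conn.executemany("INSERT INTO dw_orders VALUES (?, ?)", clean)
conn.commit()
print(conn.execute("SELECT * FROM dw_orders").fetchall())
```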


Benefits

Today, the following IT companies use Informatica as their ETL tool:

1) IBM

2) Accenture

3) Amdocs

4) CTS

5) HSBC

And many more.

Course Curriculum

Course description

Informatica is a tool that supports all the steps of the Extract, Transform and Load process. Nowadays Informatica is also used as an integration tool. It is easy to use, with a simple visual interface similar to forms in Visual Basic: you just drag and drop different objects (known as transformations) and design the process flow for data extraction, transformation and load. These process-flow diagrams are known as mappings. Once a mapping is made, it can be scheduled to run as and when required; in the background, the Informatica server takes care of fetching data from the source, transforming it, and loading it into the target systems/databases.

Informatica can communicate with all major data sources (mainframe, RDBMS, flat files, XML, VSAM, SAP, etc.) and can move and transform data between them. It can move huge volumes of data very effectively, often better than even bespoke programs written for one specific data movement. It can throttle transactions (doing big updates in small chunks to avoid long locking and a full transaction log). It can effectively join data from two distinct data sources (even an XML file can be joined with a relational table). In all, Informatica has the ability to effectively integrate heterogeneous data sources and convert raw data into useful information.
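
The chunked-commit idea mentioned above can be shown in miniature. The sketch below is our own illustration in Python with sqlite3, not Informatica's implementation; in PowerCenter the same effect is configured declaratively through the session's commit interval.

```python
import sqlite3

def load_in_chunks(conn, rows, chunk_size=1000):
    # Commit after every chunk so each transaction stays short:
    # locks are released early and the transaction log never has
    # to hold one huge update, which is the throttling idea above.
    for start in range(0, len(rows), chunk_size):
        conn.executemany("INSERT INTO dw_orders VALUES (?, ?)",
                         rows[start:start + chunk_size])
        conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dw_orders (order_id INTEGER, amount REAL)")
load_in_chunks(conn, [(i, i * 1.5) for i in range(10_000)])
print(conn.execute("SELECT COUNT(*) FROM dw_orders").fetchone())
```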

Informatica is not only an ETL tool but also a data integrator, and it is one of the best tools in the current industry, providing versatile solutions for any kind of data. At Radical we provide scenario-based training with real-time examples, project explanations, and help with project-related queries.

Pre-requisites

Anyone who has basic knowledge of SQL queries can start learning the Informatica tool. As it is a GUI-based tool, coding knowledge is not necessary. At least 1-3 years of experience is good enough to get interview calls.
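
As a rough benchmark of the SQL level expected, a simple join like the one below should feel comfortable. The snippet wraps the SQL in Python's sqlite3 only to keep it self-contained; the tables are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE emp  (emp_id INTEGER, name TEXT, dept_id INTEGER);
    CREATE TABLE dept (dept_id INTEGER, dept_name TEXT);
    INSERT INTO emp  VALUES (1, 'Asha', 10), (2, 'Ravi', 20);
    INSERT INTO dept VALUES (10, 'Finance'), (20, 'Sales');
""")

# The kind of everyday query a learner should already be able to
# write: an inner join with a filter.
query = """
    SELECT e.name, d.dept_name
    FROM emp e
    JOIN dept d ON e.dept_id = d.dept_id
    WHERE d.dept_name = 'Sales'
"""
print(conn.execute(query).fetchall())  # [('Ravi', 'Sales')]
```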

Course Content

  • Understanding Domains
  1. Nodes
  2. Application Services
  • Using Administration Console
  • Managing the Domain
  1. Managing Alerts
  2. Managing Folders
  3. Managing Permissions
  4. Managing Application Services
  5. Managing the Nodes
  • Managing Users and Groups
  • Managing Privileges and Roles
  1. Domain Privileges
  2. Repository Services Privileges
  3. Reporting Service Privileges
  4. Managing Roles – Assigning Privileges and Roles to Users and Groups
  • Creating and Configuring the Repository Services
  • Managing the Repository
  • Creating and Configuring Integration Services
  1. Enabling and Disabling the Integration Services
  2. Running in Normal and Safe Mode
  3. Configuring the Integration Services Processes
  • Integration Services Architecture
  • Creating the Reporting Services
  1. Managing the Reporting Services
  2. Configuring the Reporting Services
  • Managing License
  • Understanding Pipeline Partitioning
  1. Partitioning Attributes
  2. Dynamic Partitioning
  3. Partitioning Rules
  4. Configuring Partitioning
  • Partitioning Points
  1. Adding and Deleting Partitioning Points
  2. Partitioning Relational Sources
  3. Partitioning File Targets
  4. Partitioning Transformations
  • Partitioning Types
  • Real Time Processing
  • Commit Points
  • Workflow Recovery
  • Stopping and Aborting
  1. Error Handling
  2. Stopping and Aborting Workflows
  • Concurrent Workflows
  • Load Balancer
  • Workflow Variables
  1. Predefined Workflow Variables
  2. User-Defined Workflow Variables
  3. Using Worklet Variables
  4. Assigning Variable Values in a Worklet
  • Parameters and Variables in Sessions
  1. Working with Session Parameters
  2. Assigning Parameter and Variables in a Session
  • Parameter File
  • Session Caches
  • Incremental Aggregation
  • Session Log Interface
  • Using Command Line Programs (see the pmcmd sketch after this list)
  1. infacmd
  2. infasetup
  3. pmcmd
  4. pmrep
  • Using the Designer
  1. Configuring Designer Options
  2. Using Toolbars
  3. Navigating the Workspace
  4. Designer Tasks
  5. Viewing Mapping and Mapplet Reports
  • Working with Sources
  1. Working with Relational Sources
  2. Working with COBOL Sources
  3. Working with COBOL Source Files
  • Working with Flat Files
  1. Importing Flat Files
  2. Editing Flat File Definitions
  3. Formatting Flat File Columns
  • Working with Targets
  1. Importing Target Definitions
  2. Creating Target Definition from Source Definition
  3. Creating Target Definition from Transformations
  4. Creating Target tables
  • Mappings
  1. Working with Mappings
  2. Connecting Mapping Objects
  3. Linking Ports
  4. Propagating Port Attributes
  5. Working with Targets in a Mapping
  6. Working with Relational Targets in a Mapping
  7. Validating a Mapping
  8. Using Workflow Generation Wizard
  • Mapplets
  1. Understanding Mapplets Input and Output
  2. Using Mapplet Designer
  3. Using Mapplets in Mapping
  • Mapping Parameters and Variables
  • Working with User-Defined Functions
  • Using the Debugger
  1. Creating Breakpoints
  2. Configuring the Debugger
  3. Monitoring the Debugger
  4. Evaluating Expression
  • Creating Cubes and Dimensions
  • Using Mapping Wizard
  • Naming Conventions
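
As flagged under Using Command Line Programs above, workflows are commonly started from scripts through pmcmd. Below is a hypothetical Python wrapper; the domain, service, folder, and workflow names are placeholders, and it assumes an Informatica client installation with pmcmd on the PATH.

```python
import subprocess

def start_workflow(workflow, folder, service, domain, user, password):
    # Build a typical pmcmd startworkflow invocation; all names here
    # are placeholders, not values from any real environment.
    cmd = [
        "pmcmd", "startworkflow",
        "-sv", service,    # Integration Service name
        "-d", domain,      # Informatica domain name
        "-u", user,        # repository user
        "-p", password,    # repository password
        "-f", folder,      # repository folder containing the workflow
        "-wait",           # block until the workflow finishes
        workflow,
    ]
    return subprocess.run(cmd, capture_output=True, text=True)

result = start_workflow("wf_load_orders", "SALES", "IS_Primary",
                        "Domain_Dev", "admin", "secret")
print(result.returncode, result.stdout)
```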

Performance Tuning Overview

  • Bottlenecks
  1. Using Thread Statistics
  2. Target Bottlenecks
  3. Source Bottlenecks
  4. Mapping Bottlenecks
  5. Session Bottlenecks
  6. System Bottlenecks
  • Optimizing the Targets
  • Optimizing the Source
  • Optimizing the Mapping
  • Optimizing the Transformations
  • Optimizing the Sessions
  • Optimizing the PowerCenter Components
  • Optimizing the System
  • Using Pipeline Partitions
  • Performance Counters
  • Understanding the Repository
  • Using Repository Manager
  • Folders
  • Managing Object Permissions
  • Working with Versioned Objects
  • Exporting and Importing Objects
  • Copying Objects
  • Working with Transformations
  1. Configuring Transformations
  2. Working with Ports
  3. Working with Expressions
  4. Reusable Transformations
  • Aggregator Transformation
  • Custom Transformation
  • Expression Transformation
  • External Procedure Transformation
  • Filter Transformation
  • Joiner Transformation
  • Java Transformation
  • Lookup Transformation
  • Lookup Caches
  • Normalizer Transformation
  • Rank Transformation
  • Router Transformation
  • Sequence Generator Transformation
  • Sorter Transformation
  • Source Qualifier Transformation
  • SQL Transformation
  • Stored Procedure Transformation
  • Transaction Control Transformation
  • Union Transformation
  • Update Strategy Transformation
  • The Transformation Language (see the expression example after this list)
  • Constants
  • Operators
  • Variables
  • Dates
  • Functions
  • Creating Custom Function
  • Workflow Manager
  1. Workflow Manager Options
  2. Navigating the Workspace
  3. Working with Repository Objects
  4. Copying Repository Objects
  5. Comparing Repository Objects
  • Workflow and Worklets
  1. Creating a Workflow
  2. Using Workflow Wizard
  3. Assigning an Integration Service
  4. Working with Worklets
  5. Working with Links
  • Sessions
  1. Creating a Session Task
  2. Editing a Session
  3. Pre- and Post-Session Commands
  • Session Configuration Objects
  • Tasks
  1. Creating a Task
  2. Configuring Tasks
  3. Working with Command Task
  4. Working with Decision Task
  5. Working with Event Task
  6. Working with Timer Task
  7. Working with Assignment Task
  • Sources
  1. Configuring Sources in a Session
  2. Working with Relational Sources
  3. Working with Flat File Sources
  • Targets
  1. Configuring Targets in a Session
  2. Working with Relational Targets
  3. Working with File Targets
  4. Reject Files
  • Validation
  1. Validating Tasks
  2. Validating Worklets
  3. Validating Sessions
  4. Validating Workflows
  • Scheduling and Running Workflows
  1. Scheduling a Workflow
  2. Manually Starting a Workflow
  • Sending Email
  1. Working with Email Tasks
  2. Working with Post-Session Email
  • Workflow Monitor
  1. Using Workflow Monitor
  2. Customizing Workflow Monitor Options
  3. Working with Tasks and Workflows
  4. Using Gantt Chart View and Task View
  • Workflow Monitor Details
  1. Integration Services Properties
  2. Workflow Run Properties
  3. Worklet Run Properties
  4. Session Task Run Properties
  5. Performance Details
  • Session and Workflow Logs
  1. Log Events
  2. Log Events Window
  3. Working with Log Files
  4. Workflow Logs
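
To give a flavour of the transformation language flagged above: a typical port expression looks like IIF(ISNULL(SALARY), 0, SALARY), which substitutes 0 for null salaries. Functions such as TO_DATE and DECODE, and aggregate functions such as SUM, follow the same function-call style; all of these are practised in the lab sessions.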

Note: Lab sessions will be conducted for all the points mentioned above.

  • Evolution of Datawarehousing – History
  • The Need for Datawarehousing
  • Why Datawarehousing
  • What is Datawarehousing – The Definition
  1. Subject-Oriented
  2. Integrated
  3. Non-Volatile
  4. Time-Variant
  • Datawarehousing Architecture
  1. Data Source Layer
  2. Data Extraction Layer
  3. Staging Layer
  4. ETL Layer
  5. Data Storage Layer
  6. Data Logic Layer
  7. Data Presentation Layer
  8. Metadata Layer
  9. System Operation Layer
  • Dimension table
  • Fact table
  1. Additive Facts
  2. Semi-Additive Facts
  3. Non-Additive Facts
  4. Cumulative
  5. Snapshot
  • Attribute
  • Hierarchy
  • Types of Schema
  1. Star Schema
  2. Snowflake Schema
  3. Fact Constellation Schema
  • Slowly Changing Dimensions (see the SCD Type 2 sketch after this list)
  1. SCD1 – Advantages/Disadvantages
  2. SCD2 – Advantages/Disadvantages
  3. SCD3 – Advantages/Disadvantages
  • OLAP and OLTP
  1. Difference between OLAP and OLTP
  2. Types of OLAP
  3. Multi-Dimensional (MOLAP)
  4. Relational (ROLAP)
  5. Hybrid (HOLAP)
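
Because SCD handling comes up constantly in projects and interviews, here is a minimal sketch of the SCD Type 2 pattern flagged in the list above: expire the current row and insert a new version so that history is preserved. The customer table and columns are invented, and sqlite3 stands in for a real warehouse database.

```python
import sqlite3
from datetime import date

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE dim_customer (
    cust_key INTEGER PRIMARY KEY AUTOINCREMENT,
    cust_id  INTEGER, city TEXT,
    eff_date TEXT, end_date TEXT, is_current INTEGER)""")
conn.execute("INSERT INTO dim_customer (cust_id, city, eff_date, end_date, is_current) "
             "VALUES (101, 'Kochi', '2020-01-01', '9999-12-31', 1)")

def scd2_upsert(conn, cust_id, city):
    today = date.today().isoformat()
    current = conn.execute(
        "SELECT cust_key, city FROM dim_customer "
        "WHERE cust_id = ? AND is_current = 1", (cust_id,)).fetchone()
    if current and current[1] == city:
        return  # nothing changed: keep the current row as-is
    if current:
        # Type 2: close out (expire) the existing current version.
        conn.execute("UPDATE dim_customer SET end_date = ?, is_current = 0 "
                     "WHERE cust_key = ?", (today, current[0]))
    # Insert a new current version, preserving the full history.
    conn.execute("INSERT INTO dim_customer (cust_id, city, eff_date, end_date, is_current) "
                 "VALUES (?, ?, ?, '9999-12-31', 1)", (cust_id, city, today))
    conn.commit()

scd2_upsert(conn, 101, "Pune")  # customer moved: old row is expired, not lost
for row in conn.execute("SELECT * FROM dim_customer"):
    print(row)
```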

Training Options

Live Online Training

  • Highly practical, hands-on training
  • Installation of software on your system
  • 24/7 Email and Phone Support
  • 100% Placement Assistance until you get placed
  • Global Certification Preparation
  • Trainer Student Interactive Portal
  • Assignments and Projects Guided by Mentors
  • And Many More Features

A course completion certificate and global certification preparation are part of all our Master Programs.

Live Classroom Training

  • Weekend / Weekdays / Morning / Evening Batches
  • 80:20 Practical and Theory Ratio
  • Real-life Case Studies
  • Easy catch-up if you miss any sessions
  • PSI | Kryterion | Red Hat Test Centers
  • Lifetime Video Classroom Access (coming soon)
  • Resume Preparations and Mock Interviews
  • And Many More Features

A course completion certificate and global certification preparation are part of all our Master Programs.

Exam & Certification

Course Reviews

Why Radical Technologies Is the Best

Radical Technologies is continually progressing and offers the best possible services. Recognition of Radical Technologies is rising steeply as demand grows rapidly.

Creative | Innovative | Student Friendly | Practical Oriented | Valued Certification

Training FAQs

Similar Courses

ENQUIRE NOW










