51. What do you mean by direct loading and indirect loading in session properties?
Direct loading: Indicates that the source file contains the source data.
Indirect loading: Indicates that the source file contains a list of files with the same file properties. When you select Indirect, the Integration Service finds the file list and reads each listed file when it runs the session.
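The indirect-loading behavior can be sketched in plain Python: read the file list first, then read each listed data file in turn. Everything here (file names, data) is invented for illustration, not Informatica internals.

```python
# Conceptual sketch of indirect loading: a "file list" file names the
# actual data files, and rows are read from each listed file in order.

def read_indirect(file_list_path):
    """Yield rows from every data file named in the file list."""
    with open(file_list_path) as file_list:
        for data_file in (line.strip() for line in file_list if line.strip()):
            with open(data_file) as f:
                for row in f:
                    yield row.rstrip("\n")

# Build a tiny example: two data files plus a list file pointing at them.
with open("part1.txt", "w") as f: f.write("a\nb\n")
with open("part2.txt", "w") as f: f.write("c\n")
with open("filelist.txt", "w") as f: f.write("part1.txt\npart2.txt\n")

print(list(read_indirect("filelist.txt")))  # ['a', 'b', 'c']
```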
52. What is the status code?
The status code provides error handling for the Informatica server during the session. The stored procedure issues a status code that notifies whether or not the stored procedure completed successfully. This value cannot be seen by the user; it is only used by the Informatica server to determine whether to continue running the session or stop it.
53. What is Data Driven?
The Integration Service follows the instructions coded into Update Strategy transformations within the session mapping to determine how to flag rows for insert, update, delete, or reject. If you do not choose the Data Driven option, the Integration Service ignores all Update Strategy transformations in the mapping.
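The Data Driven behavior can be sketched as a per-row rule that returns one of the update-strategy flags. The `DD_*` values below match Informatica's constants; the rule and the rows are a hypothetical example.

```python
# Conceptual sketch of Data Driven: an update-strategy expression decides
# per row whether to insert, update, delete, or reject.
# Constants mirror Informatica's DD_* values; the rule itself is invented.
DD_INSERT, DD_UPDATE, DD_DELETE, DD_REJECT = 0, 1, 2, 3

def flag_row(row):
    """A hypothetical IIF-style rule: rows without an id are rejected,
    new rows inserted, inactive rows deleted, the rest updated."""
    if row.get("id") is None:
        return DD_REJECT
    if row.get("is_new"):
        return DD_INSERT
    if row.get("status") == "inactive":
        return DD_DELETE
    return DD_UPDATE

rows = [{"id": 1, "is_new": True}, {"id": 2, "status": "inactive"}, {"id": None}]
print([flag_row(r) for r in rows])  # [0, 2, 3]
```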
54. Can you use the mapping parameters or variables created in one mapping in another mapping?
No, mapping parameters or variables created in one mapping cannot be used in another mapping.
55. When we can join tables at the Source Qualifier itself, why do we go for the Joiner transformation?
The Source Qualifier transformation can join source data only from relational sources residing in the same schema, database, or system. The Joiner transformation is used to join source data from two related heterogeneous sources residing in different locations or file systems. You can also use it to join data from the same source.
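A minimal sketch of the heterogeneous join a Joiner performs, here with a "flat file" and a "database table" stand-in joined on a common key. The data and key names are made up for illustration.

```python
# Joining two heterogeneous "sources" on a common key, the way a Joiner
# transformation would: build the master side once, stream the detail
# side past it. All names and data below are hypothetical.
orders = [{"cust_id": 1, "amount": 50}, {"cust_id": 2, "amount": 75}]  # "flat file"
customers = {1: "Alice", 2: "Bob"}                                     # "database table"

joined = [{"name": customers[o["cust_id"]], **o}
          for o in orders if o["cust_id"] in customers]
print(joined)
```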
56. What are the DTM and the Load Manager?
DTM: The PowerCenter Integration Service starts a DTM process to run each Session and Command task within a workflow. The DTM process performs session validations, creates threads to initialize the session and to read, write, and transform data, and handles pre- and post-session operations.
Load Manager: The PowerCenter Integration Service process performs the Load Manager functions and uses the Load Balancer to dispatch tasks. The Load Balancer dispatches tasks to achieve optimal performance. It may dispatch tasks to a single node or across the nodes in a grid.
57. What is tracing level and what are its
types?
The tracing level determines the amount of detail displayed in the session log for a transformation.
There are four types of tracing levels:
Normal: Integration Service logs initialization and status
information, errors encountered, and skipped rows due to transformation row
errors. Summarizes session results, but not at the level of individual rows.
Terse: Integration Service logs initialization information and
error messages and notification of rejected data.
Verbose Initialization: In addition to normal tracing, Integration Service logs
additional initialization details, names of index and data files used, and
detailed transformation statistics.
Verbose Data: In addition to verbose initialization tracing, Integration
Service logs each row that passes into the mapping. Also notes where the
Integration Service truncates string data to fit the precision of a column and
provides detailed transformation statistics.
It also allows the Integration Service to write errors to both the session log and the error log when you enable row error logging. When you configure the tracing level to Verbose Data, the Integration Service writes row data for all rows in a block when it processes a transformation.
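The four levels are analogous to verbosity levels in ordinary logging libraries. A rough Python analogy (the mapping below is illustrative only, not an Informatica feature):

```python
# Rough analogy: mapping the four tracing levels onto Python's logging
# module, from least to most detail. Purely illustrative.
import logging

TRACING = {
    "Terse": logging.ERROR,                 # init info + errors/rejects only
    "Normal": logging.WARNING,              # + skipped rows, session summary
    "Verbose Initialization": logging.INFO, # + cache/index file names, stats
    "Verbose Data": logging.DEBUG,          # + every row passing through
}

logger = logging.getLogger("session")
logger.setLevel(TRACING["Normal"])
print(logger.isEnabledFor(logging.DEBUG))  # False: row-level detail suppressed
```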
58. Which command is used to run a batch?
pmcmd. For example, `pmcmd startworkflow` runs a workflow from the command line.
59. What are sequential and concurrent runs?
Sequential run: the tasks run one after another, based on the link conditions.
Concurrent run: a set of tasks runs in parallel, based on the link conditions.
60. How do we improve the performance of the Aggregator transformation?
- By passing sorted input to the Aggregator transformation (select the Sorted Input option), which reduces the amount of data cached during the session and improves session performance.
- By limiting the number of connected input/output or output ports, to reduce the amount of data the Aggregator transformation stores in the data cache.
- By using a Filter transformation in the mapping, placed before the Aggregator transformation, to reduce unnecessary aggregation.
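The first point above can be illustrated in plain Python: when the input is sorted on the group key, each group can be aggregated and emitted as soon as the key changes, instead of caching every group until end of input. The data here is made up.

```python
# Why sorted input helps an aggregator: itertools.groupby makes one pass
# over sorted rows, holding only the current group in memory.
from itertools import groupby

rows = sorted([("A", 10), ("B", 5), ("A", 7), ("B", 3)])  # sorted on group key

sums = {key: sum(v for _, v in grp)
        for key, grp in groupby(rows, key=lambda r: r[0])}
print(sums)  # {'A': 17, 'B': 8}
```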
61. What is a code page? Explain the types of code pages.
A code page contains the encoding used to specify characters in a set of one or more languages, and it is selected based on the source of the data. A code page setting refers to a specific set of data that describes the characters the application recognizes. This influences the way the application stores, receives, and sends character data.
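A code page is essentially a character encoding. This small sketch shows the same text surviving a round trip through one encoding but not another, which is why code-page compatibility matters when moving character data:

```python
# A round trip through UTF-8 is lossless for this text, but ASCII cannot
# represent the accented character at all - an encoding (code page) mismatch.
text = "café"
assert text.encode("utf-8").decode("utf-8") == text  # lossless round trip

try:
    text.encode("ascii")  # 'é' is outside the ASCII code page
except UnicodeEncodeError as e:
    print("lossy:", e.reason)
```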
62. In how many ways can you delete duplicate records?
We can delete duplicate records in 3 ways:
1. By selecting the Distinct option in the Source Qualifier (in the case of a relational source, and only if there is no SQL override query).
2. By selecting the Distinct option in the Sorter transformation (in the case of a file source).
3. By grouping by all ports in the Aggregator transformation.
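The effect of all three options is the same as de-duplicating rows while keeping one copy of each. A plain Python sketch (the rows are hypothetical):

```python
# De-duplication analogous to "select distinct" or grouping by all ports:
# dict.fromkeys keeps the first occurrence and preserves order.
rows = [(1, "a"), (2, "b"), (1, "a")]

distinct = list(dict.fromkeys(rows))
print(distinct)  # [(1, 'a'), (2, 'b')]
```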
63. What is the difference between PowerCenter & PowerMart?
PowerCenter - provides the ability to organize repositories into a data mart domain and share metadata across repositories.
PowerMart - only a local repository can be created.
64. Can you copy a session into a different folder or repository?
Yes. By using the Copy Session Wizard you can copy a session into a different folder or repository, but the target folder or repository must contain the mapping of that session. If the target folder or repository does not have the mapping of the session being copied, you have to copy that mapping first, and then copy the session.
65. After dragging the ports of three sources (SQL Server, Oracle, Informix) to a single Source Qualifier, can we map these three ports directly to a target?
No. Each source definition requires a separate Source Qualifier, as they are tables from different databases.
66. What is the Debugger?
The Debugger is used to validate a mapping and gain troubleshooting information about data and error conditions. To debug a mapping, we configure and run the Debugger from within the Mapping Designer. The Debugger uses a session to run the mapping on the Integration Service. When you run the Debugger, it pauses at breakpoints, and you can view and edit transformation output data.
You might want to run the Debugger in the following situations:
· Before you run a session: After you save a mapping, you can run some initial tests with a debug session before you create and configure a session in the Workflow Manager.
· After you run a session: If a session fails or if you receive unexpected results in the target, you can run the Debugger against the session. You might also want to run the Debugger against a session if you want to debug the mapping using the configured session properties.
67. What are the different threads in DTM
process?
The PowerCenter Integration
Service process starts the DTM process to run a session. The DTM process is
also known as the pmdtm process. The DTM is the process associated with the
session task.
Read the Session Information: The
PowerCenter Integration Service process provides the DTM with session instance
information when it starts the DTM. The DTM retrieves the mapping and session
metadata from the repository and validates it.
Perform Pushdown Optimization: If the
session is configured for pushdown optimization, the DTM runs an SQL statement
to push transformation logic to the source or target database.
Create Dynamic Partitions: The DTM adds
partitions to the session if you configure the session for dynamic
partitioning. The DTM scales the number of session partitions based on factors
such as source database partitions or the number of nodes in a grid.
Form Partition Groups: If you
run a session on a grid, the DTM forms partition groups. A partition group is a
group of reader, writer, and transformation threads that runs in a single DTM
process. The DTM process forms partition groups and distributes them to worker
DTM processes running on nodes in the grid.
Expand Variables and
Parameters: If the workflow uses a parameter file, the
PowerCenter Integration Service process sends the parameter file to the DTM
when it starts the DTM. The DTM creates and expands session-level,
service-level, and mapping-level variables and parameters.
Create the Session Log: The
DTM creates logs for the session. The session log contains a complete history
of the session run, including initialization, transformation, status, and error
messages. You can use information in the session log in conjunction with the
PowerCenter Integration Service log and the workflow log to troubleshoot system
or session problems.
Validate Code Pages: The
PowerCenter Integration Service processes data internally using the UCS-2
character set. When you disable data code page validation, the PowerCenter
Integration Service verifies that the source query, target query, lookup
database query, and stored procedure call text convert from the source, target,
lookup, or stored procedure data code page to the UCS-2 character set without
loss of data in conversion. If the PowerCenter Integration Service encounters
an error when converting data, it writes an error message to the session log.
Verify Connection Object
Permissions: After validating the session code pages, the DTM
verifies permissions for connection objects used in the session. The DTM
verifies that the user who started or scheduled the workflow has execute
permissions for connection objects associated with the session.
Start Worker DTM Processes: The
DTM sends a request to the PowerCenter Integration Service process to start
worker DTM processes on other nodes when the session is configured to run on a
grid.
Run Pre-Session Operations: After
verifying connection object permissions, the DTM runs pre-session shell
commands. The DTM then runs pre-session stored procedures and SQL commands.
Run the Processing Threads: After
initializing the session, the DTM uses reader, transformation, and writer
threads to extract, transform, and load data. The number of threads the DTM
uses to run the session depends on the number of partitions configured for the
session.
Run Post-Session Operations: After
the DTM runs the processing threads, it runs post-session SQL commands and
stored procedures. The DTM then runs post-session shell commands.
Send Post-Session Email: When
the session finishes, the DTM composes and sends email that reports session
completion or failure. If the DTM terminates abnormally, the PowerCenter
Integration Service process sends post-session email.
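The reader/transformation/writer thread model described in "Run the Processing Threads" can be sketched with ordinary threads and queues. Everything in this sketch (the data, the "transformation") is an illustrative stand-in, not Informatica internals.

```python
# Toy reader -> transformation -> writer pipeline, one thread per stage,
# connected by queues, with a sentinel to signal end of data.
import queue
import threading

SENTINEL = object()
read_q, write_q = queue.Queue(), queue.Queue()
target = []

def reader():
    for row in [1, 2, 3]:          # stand-in for extracting source rows
        read_q.put(row)
    read_q.put(SENTINEL)

def transformer():
    while (row := read_q.get()) is not SENTINEL:
        write_q.put(row * 10)      # stand-in for transformation logic
    write_q.put(SENTINEL)

def writer():
    while (row := write_q.get()) is not SENTINEL:
        target.append(row)         # stand-in for loading the target

threads = [threading.Thread(target=t) for t in (reader, transformer, writer)]
for t in threads: t.start()
for t in threads: t.join()
print(target)  # [10, 20, 30]
```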
68. What are the data movement modes in Informatica?
The data movement mode determines how the Informatica server handles character data. You choose the data movement mode in the Informatica server configuration settings.
Two data movement modes are available in Informatica:
ASCII mode
Unicode mode
69. What is the difference between $ & $$ in a mapping or parameter file?
$ denotes system-defined (built-in) variables, and $$ denotes user-defined mapping parameters and variables.
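A hedged sketch of how the two prefixes typically appear in a parameter file. The folder, workflow, session, and parameter names here are invented; `$PMSessionLogDir` is a built-in service variable, while `$$LoadDate` is user-defined.

```
[MyFolder.WF:wf_load.ST:s_load_orders]
$$LoadDate=2024-01-01
$PMSessionLogDir=/infa/logs
```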
70. What is the aggregate cache in the Aggregator transformation?
The Integration Service stores data in the aggregate cache until it completes the aggregate calculations. It stores group values in an index cache and row data in the data cache. If the Integration Service requires more space, it stores overflow values in cache files.
71. What do you mean by SQL override?
SQL override: overriding the default SQL query generated by the transformation with a customized, user-defined SQL query.
72. What is a shortcut in Informatica?
Any object that is used in more than one folder is called a shortcut; it should be created in a shared folder. At any point of time, if you want to make any modification to a shortcut, you need to make it to the original object in the shared folder.
73. What will happen if you copy a mapping from one repository to another repository and there is an identical source?
It will ask whether to Rename, Replace, Reuse, or Skip the source.
74. What is a dynamic lookup and what is
the significance of NewLookupRow?
To cache a table, flat file, or source definition and update the cache, configure a Lookup transformation with a dynamic cache. The Integration Service dynamically inserts or updates data in the lookup cache and passes the data to the target. The dynamic cache is synchronized with the target.
Significance of NewLookupRow:
The Designer adds this port to a Lookup transformation configured to use a dynamic cache. It indicates with a numeric value whether the Integration Service inserts or updates the row in the cache, or makes no change to the cache. To keep the lookup cache and the target table synchronized, pass rows to the target when the NewLookupRow value is equal to 1 or 2.
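The dynamic-cache behavior and the NewLookupRow flag can be sketched in plain Python. The flag values below (0 = no change, 1 = insert, 2 = update) match Informatica's NewLookupRow semantics; the cache keys and values are invented.

```python
# Sketch of a dynamic lookup cache: each incoming row either inserts into
# the cache (1), updates it (2), or leaves it unchanged (0).
cache = {}

def lookup_dynamic(key, value):
    """Return the NewLookupRow value for one incoming row."""
    if key not in cache:
        cache[key] = value
        return 1                   # NewLookupRow = 1: inserted
    if cache[key] != value:
        cache[key] = value
        return 2                   # NewLookupRow = 2: updated
    return 0                       # NewLookupRow = 0: no change

flags = [lookup_dynamic(k, v) for k, v in [(10, "a"), (10, "a"), (10, "b")]]
print(flags)  # [1, 0, 2]
```

Rows with a flag of 1 or 2 are the ones you would pass on to the target to keep it synchronized with the cache.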
75. What happens if the cache size gets full?
If the cache fills up, the Integration Service stores the overflow values in cache files. When the session completes, the Integration Service releases cache memory and deletes the cache files, unless you configure the Lookup transformation to use a persistent cache.