Saturday, September 21, 2013

Transferring CLOB Data to BLOB using Informatica... Part - III

After discussing with Nilesh, I came to know that if we define the Source Qualifier data type as varchar [instead of text, which comes by default when you pull a CLOB source into an Informatica mapping], it will load 4000+ character data without giving any Oracle error.

To verify the same, I created a new CLOB table as below:

create table ClobTest
(id number,
clobData clob
);

I inserted 4000+ characters of data into the clobData column of the ClobTest table. To know how to insert 4000+ characters into a CLOB column, please refer to http://gogates.blogspot.in/2013/09/inserting-clob-data-which-is-more-than.html

Now, I imported the ClobTest table into a mapping as below. In this mapping, my target is a flat file.

As you can see, by default Informatica has taken the data type as text with length/precision 4000.

When I ran the mapping, the workflow executed successfully and inserted the 4000+ character data into the target text file.

Now, the error that occurred at the office might be because of the Informatica or Oracle version, which I need to confirm. Here, I am using Informatica 9.0.1 Hot Fix and Oracle 10.2.0 Standard Edition.

Will verify it soon..

Inserting CLOB data which is more than 4000 characters into Oracle Table

I was trying to insert 4000+ characters of data into an Oracle table having a CLOB column, but could not do it using a simple insert statement.

The table structure I used is as below:

create table ClobTest
(id number,
clobData clob
);

The table structure is pretty simple, having only 2 columns: a number and a CLOB.

Now, if you run the below query, it will execute without any error:

insert into ClobTest values(1, 'ClobTestData');

However, when you increase the length of your data beyond 4000 characters, it will not succeed and will give the error below:

Error report:
SQL Error: ORA-01704: string literal too long
01704. 00000 -  "string literal too long"
*Cause:    The string literal is longer than 4000 characters.
*Action:   Use a string literal of at most 4000 characters.
           Longer values may only be entered using bind variables.

After googling this issue, I came to know that if you want to insert 4000+ characters of data into a CLOB column, you should do it using a PL/SQL block.

The root cause of this is that Oracle SQL does not support char/varchar literals longer than 4000 characters; however, PL/SQL does (a PL/SQL varchar2 variable can hold up to 32767 characters).

I used the below anonymous PL/SQL block to insert 4000+ characters of data into the CLOB column:

declare
  vClob varchar2(32767); -- PL/SQL allows varchar2 up to 32767 characters
begin
  vClob := '<Character data having 4000+ characters>';
  insert into ClobTest values (1, vClob);
  commit;
end;
/

To verify that 4000+ characters really got inserted into the target, I executed the below query:

select length(clobData) from ClobTest;

Sunday, September 15, 2013

"Hello World" ... First Map Using Informatica Developer

In the last blog we saw how to connect to Informatica Developer, and today it is time for our first map using Informatica Developer.

Once you are connected to Informatica Developer, right click on the Repository Service, and you will get the option to create a new project or import an existing one.

Here, we are going to create a new project "HelloWorld"

Once you click on Project, a new window will appear asking you to give a name for your project.
In this case, I have given the name "HelloWorld". Once you click on Finish, a new project called "HelloWorld" will appear in the left panel. Right click on the new project "HelloWorld" and create a new mapping as shown below.


Give the new mapping the same name, "HelloWorld".. we will keep the mapping simple, with a source and a target.
I have created a table called EMP2 having the same structure as EMP in the scott schema under the orcl DB, as below.
------------------------------------------------------------------------------------
Copyright (c) 1982, 2005, Oracle.  All rights reserved.

Enter user-name: scott
Enter password:

Connected to:
Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options

SQL> create table EMP2 as select * from EMP where 1=2;

Table created.

SQL> exit;
Disconnected from Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options

------------------------------------------------------------------------------------

We will import source EMP and target EMP2 from the Scott schema in the ORCL database as below.

To import a new Source/Target as a Physical Object into the mapping:
1. Right click on mapping "HelloWorld"
2. Click on New --> Other



3. A new window will appear giving different options. Out of the available options, select Relational Data Object to import a new Source/Target from a physical RDBMS schema.
4. A new window will appear as below after you click on "Relational Data Object". Choose an available connection [if there is not one, then create a new one].

In this case "MyConnection" object was already available.


5. Once you choose the connection, a new window will appear to select the tables/objects from the available schema for that particular connection.

6. Import EMP and EMP2, and treat them as the source and target for your mapping.
7. Once you import both tables, they will appear in the left panel of your mapping window as below.

8. Drag both EMP and EMP2 onto your mapping panel, treat them as source and target, and make the port connections between them.

9. We are done with our first task; now we have to validate it and run it.

10. Right click on the mapping panel and validate the mapping [you will get the Validate option once you right click on the mapping window].

11. Once the mapping is valid, right click on the panel again and run the mapping.
12. The log file will show that the mapping completed successfully.
13. Check the target table count; the results are as below.
--------------------------------------------------------
C:\Documents and Settings\Administrator>sqlplus

SQL*Plus: Release 10.2.0.1.0 - Production on Sun Sep 15 15:07:19 2013

Copyright (c) 1982, 2005, Oracle.  All rights reserved.

Enter user-name: scott
Enter password:

Connected to:
Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options

SQL> select count(*) from emp2;

  COUNT(*)
----------
        14

SQL>
-----------------------------------------------------------
Here, the new thing we observed is that we do not require a Source Qualifier or a Workflow/Session.

More on this after some days...

Saturday, September 14, 2013

Installing "Informatica Developer" ...

After installing Informatica 9.0.1, I was doing only traditional work using Power Center Designer, creating normal mappings. Moreover, one new thing came to notice after installation: a new button at startup, "Launch Informatica Developer".

This was not available in Informatica 8.6 and prior versions. Though I am still not sure what its exact use is, at least now I am able to log in to it.

For at least 3-4 months I was struggling to find a way to get into it. I had an existing repository and Integration Service created for Power Center Designer; however, that was not useful for "Informatica Developer". The normal repository you created for "Power Center Designer" won't allow you to log in to "Informatica Developer".

After doing some trial and error, I noticed that to log in to "Informatica Developer" you need to create a separate service called a "Data Integration Service".

However, for the creation of a new Data Integration Service, Informatica requires a "Model Repository Service" to be created first.


So, the first step to build Rome is to have the "Model Repository Service" created.

Click on Actions --> New --> Model Repository Service, and a new prompt will appear as below.

The first step is pretty simple; here you just have to specify names and choose values from the drop-downs.

Here, you have to select the database type and specify the user ID and password of an existing Oracle user.

Though the process seems very simple, I encountered the below issues while creating it:

1. Error "MRS_50006 Failed to Startup Model Repository Service" --

After googling it, I came to know that if you specify the same schema you used to create another Informatica repository, Informatica won't allow you to create the new "Model Repository Service" there.

So, you need to specify a separate schema with no Integration Service installed prior.

2. I had specified the "Scott" user instead of HR, as the HR schema already had an old repository service installed. However, even after specifying the new schema, I got an error "DOM 10079", and when I checked the log, the error description was "user does not have privileges to create an object".

I altered the user by logging in via the sys user and granted all privileges to it as below:
------------------------------------------------------------------------------
Copyright (c) 1982, 2005, Oracle.  All rights reserved.

Enter user-name: sys as sysdba
Enter password:

Connected to:
Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options

SQL> grant all privileges to scott;

Grant succeeded.

SQL> exit

------------------------------------------------------------------------------

I tried again to create the "Model Repository Service", and this time it got created successfully.

Now, the remaining task is to create a "Data Integration Service", which is quite an easy task.

Once you have the "Data Integration Service" in place, try to log in to "Informatica Developer".

You will get connected easily to "Informatica Developer". Hereafter, creating a new mapping using Informatica Developer is a different task, and even I am unaware of it as of now.

Will keep you posted as soon as I create a new map using this...

Thursday, September 5, 2013

Transferring CLOB Data to BLOB using Informatica... Part - II

In the last part we saw how to transfer CLOB data to BLOB using Informatica. However, the solution we saw will not always hold true.

As seen earlier, when you import a CLOB data type into Informatica as a source, it will assign the data type TEXT with precision 4000 to that port by default [you can increase the precision later on; Informatica allows that]. But why is it always 4000 by default?

Let's create a sample table as below and add 1 record to it:

Create Table CLOB_Source
(Idx NUMBER(3),
Text_File CLOB);


Insert into  CLOB_Source values (1, '<b> Adding first record </b>');

Now, try to convert the CLOB record to char using to_char as below:

select idx, to_char(text_file)
from CLOB_Source

Oracle will convert it without any difficulty.

Now add one more record, but this time give 4000+ characters in the CLOB field. Your insert statement will not work, giving the below error:

Error report:
SQL Error: ORA-01704: string literal too long
01704. 00000 -  "string literal too long"
*Cause:    The string literal is longer than 4000 characters.
*Action:   Use a string literal of at most 4000 characters.
           Longer values may only be entered using bind variables.


Even if you do get more than 4000 characters into the CLOB field, when you try to convert it using the to_char function, Oracle will not allow it, giving the same 4000-character limit error.

So, Oracle will not allow you to convert a CLOB to char using to_char if it contains more than 4000 characters.

As seen earlier, Informatica by default assigns TEXT 4000 to the CLOB data type when we import it as a source. Internally, Informatica is effectively trying to convert the CLOB data to char, so when the CLOB holds more than 4000 characters, your mapping will fail with an Oracle database fetch error.

Solution -- after googling and some trial and error, we ultimately found the below workaround.

When the CLOB data has more than 4000 characters, divide such records into sets of 4000 characters each.

E.g., first find out the maximum length of the CLOB column. Let's say it is 7500; then divide your column into 2 sets: characters 1-4000 and 4001-8000.
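As a sketch of that first step (assuming the sample CLOB_Source table above), Oracle's dbms_lob.getlength can report the longest value, so you know how many 4000-character sets you need:

```sql
-- Find the longest CLOB value in CLOB_Source; if this returns 7500,
-- two substr sets (1-4000 and 4001-8000) are enough.
select max(dbms_lob.getlength(text_file)) as max_len
from CLOB_Source;
```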

select idx,
substr(text_file, 1, 4000),
substr(text_file, 4001, 4000)
from CLOB_Source;


Add the above query to your Informatica Source Qualifier, so in this case you have one extra port for the same column. Concatenate these 2 ports in an Expression transformation immediately after the Source Qualifier.

As we have limited substr to 4000 characters, Informatica can easily convert the CLOB data to text, and your mapping will not fail.
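In the Expression transformation, the concatenation is just the Informatica string concatenation operator; the port names below are illustrative, not the actual ones in your mapping:

```
-- output port expression (Informatica expression language)
TEXT_FILE_OUT = TEXT_FILE_1 || TEXT_FILE_2
```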

Sunday, September 1, 2013

Transferring CLOB Data to BLOB using Informatica...

A few days ago one of my colleagues had a requirement of transferring data from one source table to a target table as is... The requirement was quite simple as no logic was involved in between; however, there was a difference in the data types of the two tables.

Below are sample structures of the source and target tables, respectively.

Source Table
Create Table CLOB_Source
(Idx      NUMBER(3),
Text_File      CLOB);


Target Table
Create Table BLOB_Source
(Idx      NUMBER(3),
Text_File      BLOB);


Though the requirement was pretty clear and simple, the challenging part was to convert the CLOB data type to BLOB. We tried a number of things, as below.

When you import a CLOB data type as a source, the source definition will have the CLOB data type by default, as in the source, and the Source Qualifier will have the TEXT data type with precision 4000.

If you try to transfer this text data [from the source] directly to the target BLOB without converting it, Informatica won't allow it and will give the error "Data types text and BLOB are incompatible".

If you change the data type of the expression port from text to binary, it will allow you to connect; however, the mapping will still be invalid, giving the below error, because the Source Qualifier has the data type text.

------------------------------------------------------------------------------
9/1/2013 2:29:04 PM ** Validating mapping m_ConvertBlobToClob
------------------------------------------------------------------------------
Validating transformations of mapping m_ConvertBlobToClob...
...transformation validation completed with no errors.
Validating data flow of mapping m_ConvertBlobToClob...
...link has incompatible data types: from  SQ_CLOB_SOURCE: TEXT_FILE  to  exp_ClobToBlob: TEXT_FILE.
...there are data flow validation errors.
Parsing mapping m_ConvertBlobToClob...
...parsing completed with no errors.

******** Mapping m_ConvertBlobToClob is INVALID ********



So, we had to find some way to convert the incoming text data to binary.

After lots of hit and trial, we finally found a way to convert characters to binary using Java code.

The Java function <String>.getBytes() returns a byte array, which is the binary value of the particular string, and we pass the same ahead to the target column.

So our mapping will be as below


Here, we add a Java transformation between the source and target, which actually converts char to binary.

Code written in Java transformation is as below

We just call the getBytes() method of Java, which returns a byte array, and we pass the same to the target.
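The transformation code itself was shown only as a screenshot; as a standalone sketch of the same idea (the class and method names here are my own, not Informatica's transformation API), the conversion is nothing more than:

```java
// Standalone sketch of what the Java transformation does per row:
// take the incoming char/text port value and hand the target BLOB
// port its byte-array (binary) form via String.getBytes().
public class ClobToBlobSketch {

    // Stand-in for the per-row (On Input Row) logic of the transformation
    static byte[] toBinary(String textFile) {
        return textFile.getBytes(); // char data -> byte array
    }

    public static void main(String[] args) {
        byte[] out = toBinary("<b> Adding first record </b>");
        // 'out' is what gets wired to the target's BLOB port
        System.out.println(out.length + " bytes produced");
    }
}
```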

Validate the mapping and create the workflow.


If you run the workflow with Bulk load, it will fail, giving the below error.


Database errors occurred:
ORA-39778: the parallel load option is not allowed when loading lob columns

Database driver error...
Function Name : Prepare DPL
SQL Stmt : INSERT INTO BLOB_SOURCE(IDX,TEXT_FILE)  VALUES ( ?, ?)
Database driver error...
Function Name : Execute Multiple
SQL Stmt : INSERT INTO BLOB_SOURCE(IDX,TEXT_FILE)  VALUES ( ?, ?)  




Change the load type from Bulk to Normal and your workflow will succeed. Also, keep in mind that if you are creating a passive Java transformation [in this case it is passive, as we are not changing the number of incoming rows], you do not need to call the generateRow() API in the Java transformation; in fact, the Java transformation will not allow you to call it inside the code when it is a passive transformation.




