Thursday, June 21, 2012

Unix Day - 2 - DB Connection and File Reading

Reading a file and passing each value as a parameter to Oracle SQL to fetch data.

#!/bin/sh
# Remove the previous output file, if any (-f so a missing file is not an error)
rm -f FinalDepInfo.txt
for row in `cat myDep.txt`
do
C:/oraclexe/app/oracle/product/10.2.0/server/BIN/sqlplus -s hr/magogate@XE <<QUERY
spool mytest.txt
select first_name, last_name, email
from employees
where department_id = $row;
spool off;
exit;
QUERY
# Append this department's rows to the combined output
cat mytest.txt >> FinalDepInfo.txt
done
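The loop-and-append pattern above can be tried without a database by stubbing out the sqlplus call. A minimal sketch, where `fetch_department` is a hypothetical stand-in for the real query:

```shell
#!/bin/sh
# fetch_department is a hypothetical stub standing in for the sqlplus
# heredoc above; it just echoes one dummy row per department id.
fetch_department() {
    echo "first_name_$1 last_name_$1 email_$1"
}

# Sample input: one department id per line
printf '10\n20\n30\n' > myDep.txt

rm -f FinalDepInfo.txt
for row in `cat myDep.txt`
do
    fetch_department "$row" >> FinalDepInfo.txt
done
```

After running this, FinalDepInfo.txt holds one line per department id, just as the real script accumulates one spool file per loop iteration.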


Reading a file line by line

#!/bin/sh
while read line
do
    echo "$line"
done < FinalDepInfo.txt
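A more defensive variant (an addition, not from the original post) uses `IFS=` handling and `read -r` so backslashes survive intact, and shows how `read` can split each line into fields:

```shell
#!/bin/sh
# Sample file: colon-separated fields, one record per line
printf 'Mandar:Gogate:Mumbai\nAmit:Shah:Pune\n' > people.txt

# IFS=: splits each line on colons; read -r keeps backslashes literal
while IFS=: read -r first last city
do
    echo "$first $last lives in $city"
done < people.txt > out.txt

cat out.txt
```

Here each line is split into three variables in one pass, which avoids a separate cut or awk call per field.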

Sunday, June 10, 2012

Dynamic Lookup..

I have personally not seen a "Dynamic Lookup" implemented in any project. People say it gets used in SCD Type 2 implementations, but even there it is not mandatory; in my last project, SCD Type 2 was implemented with a static lookup itself.

But I came across one scenario where there was a genuine need to implement a Dynamic Lookup. My project had used a static lookup, and it was causing a problem in one particular situation.

We had layers of a data flow as below:
1. Source system, which was OP.
2. Staging tables pulling data from OP.
3. Dimension and fact tables pulling data from staging.

In staging, most of the tables were truncate-and-reload, and the indexes defined on them were dropped at load initiation and re-created at completion using a PL/SQL script.

Data for the staging load was picked up using the below condition in the Source Qualifier.

Select *
from ods_dw_op_per_all_people_f_stg
where ges_update_date >=
(select max(last_successful_completion_date) - 3
from dw_jobs
where map_name = <Hard Coded Value>)

So here, ods_dw_op_per_all_people_f_stg is our source system table; it is actually a synonym at the target database. Using the ges_update_date condition, we can pick up the latest data from the source system. This technique works fine most of the time, i.e. 99%. But consider a situation like the one below.

We have below data which is coming on 11-Jun-2012
Id - 1
Name - Mandar
SurName - Gogate
City - Mumbai
Location - Girgaon

While processing this data from the source system into the staging tables, some prior dependent jobs failed, and somehow this data did not get inserted into the staging tables. Because of that, the next data load will again pick up data from 11-Jun-2012.

On the source side, the source data changed on 12-Jun.

The old record was
Id - 1
Name - Mandar
SurName - Gogate
City - Mumbai
Location - Girgaon

The newly created record at source is
Id - 1
Name - Mandar
SurName - Gogate
City - Pune
Location - Bavdhan

So now two records exist at the source, one with ges_update_date as 11-Jun and the other as 12-Jun. Our Source Qualifier for the staging load will pick up both records together and try to insert them at the staging level. Since there is a unique index defined on employee_id, this insertion will ultimately fail.

In the normal scenario, if the job had not failed, the first record would have been inserted into the staging table on 11-Jun, and on 12-Jun, even if both records were picked up by the Source Qualifier, they would simply have been updated [as there is an update strategy defined at the mapping level].

The solution for this situation is the "Dynamic Lookup", which will pass only one value ahead; within its cache itself, it will mark rows as insert or update.
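That cache behaviour can be imitated with a small awk sketch (an illustrative analogy only, not how PowerCenter actually implements its lookup cache): an associative array remembers which keys have already arrived, flags the first occurrence of a key as an insert and every later one as an update, so one clear decision per row goes forward.

```shell
#!/bin/sh
# Rows as id,city,ges_update_date -- two versions of id 1, as in the
# failure scenario above, plus an unrelated id 2.
printf '1,Mumbai,11-Jun\n1,Pune,12-Jun\n2,Delhi,11-Jun\n' |
awk -F',' '{
    # seen[] plays the role of the dynamic lookup cache: the first time
    # a key arrives it is an INSERT, every later arrival is an UPDATE.
    if ($1 in seen)
        print "UPDATE", $0
    else
        print "INSERT", $0
    seen[$1] = 1
}'
```

This prints `INSERT 1,Mumbai,11-Jun`, then `UPDATE 1,Pune,12-Jun`, then `INSERT 2,Delhi,11-Jun` — exactly the insert/update marking that prevents the duplicate-key failure described above.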

Wednesday, June 6, 2012

Migration Process - PADA or PVCS

Approximately 2 years ago, I was at CISCO, following the PVCS process for Informatica workflow migrations from Dev to QA and then from QA to PROD. Here at CS the name is different, i.e. PADA, but the process is the same.

Just apply a label to the Dev objects and mention the workflows and folder; the application will migrate those Informatica objects to QA and subsequently to PROD.

I never asked myself how this migration actually happens, or what code sits behind it. I always assumed the people doing it were geniuses for automating the whole thing.

Yesterday, while checking with Nilesh, I realized that the process is quite simple. Below is the entire description.

Problem -- Nilesh wanted to delete [not purge] some objects from the PROD repository, as they were obsolete. But since the number of objects was large, the Infra admin would not be able to do it manually. So, what is the process to delete them?
We found one way, as below.

At the Dev repository:

1. In Informatica Repository Manager, create a query that displays all the objects you want to delete, or just apply a label to the required workflows [selecting all children/parents while applying the label] and create a query in Repository Manager giving that label as the search criterion.

2. Right-click on the result set, click "Change Object Status", and change the status to Deleted.

3. After changing the status, make sure you check in all those objects: just click on the folder, do Find Checkouts, select all results, and simply check them in.

After the third step, you have completed the deletion of the Informatica objects from the Dev repository. The next steps migrate those deletions to QA/PROD.

4. Once you have deleted all the objects, create a query to find the deleted objects, giving "version status = deleted" as the criterion. This query will display all deleted objects. Right-click on the result set and click Apply Label. Apply a new label [make sure you tick "apply to children and parents"].

5. Create a new query for the deployment group, giving "version status = deleted" and "label contains XXX" as criteria. Save this query with a specific name.


6. Create a dynamic deployment group and give the above query as its input.

7. Just drag and drop this deployment group onto the new repository. Click Next [no need to apply any label in this process] and your deployment will be done.

By doing this, you will have successfully deleted the objects at QA / PROD.

But the problem Nilesh was facing was different. Up to step 6 he had done everything correctly, but step 7 cannot be done manually; it has to be done by raising a PADA request, and the migration was failing at the PADA request.

When we checked the log of the PADA execution, we came across the below main steps:
1. PADA creates a label for the source repository folder using the createlabel pmrep command.
2. PADA creates a label for the target repository folder using the createlabel pmrep command.
3. PADA calls the deployment group we created in step 6 using the deploydeploymentgroup pmrep command.

And at step 3, it was failing with the error:

Error: Apply label failed for Source Definition object EMPLOYEES version 6 in repository EDWPROD because this version of the object has been purged. This object was skipped.

Now, when we tried deploying the deployment group manually [drag and drop] it worked fine, but from PADA it gave us the error. So we thought of doing this process the same way as PADA, i.e. via command-line execution.

To execute a deployment group from the command line, use the command below; you can also find it in the help file.

First, go to the C:\Informatica\PowerCenter8.6.0\server\bin folder from the command line and run pmrep.

Now execute the deploydeploymentgroup command, giving the below parameters as input:

pmrep>deploydeploymentgroup -p Migration -c MigrationTest.xml -r EDWPROD -h infra-34af3549f -o 6001 -n Administrator -x magogate

-p -- Deployment group
-c -- Control file
-r -- Repository
-h -- Host
-o -- Port
-n -- User
-x -- Password

Now, out of all these, only the control file was missing at our end. We searched the help and, following the standard format, created a new one as below.

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE DEPLOYPARAMS SYSTEM "depcntl.dtd">
<DEPLOYPARAMS DEFAULTSERVERNAME="Administrator"
COPYPROGRAMINFO="YES"
COPYMAPVARPERVALS="YES"
COPYWFLOWVARPERVALS="YES"
COPYWFLOWSESSLOGS="NO"
COPYDEPENDENCY="YES"
LATESTVERSIONONLY="YES"
RETAINGENERATEDVAL="YES"
RETAINSERVERNETVALS="YES">
<DEPLOYGROUP CLEARSRCDEPLOYGROUP="NO">
<OVERRIDEFOLDER SOURCEFOLDERNAME="ASA"
SOURCEFOLDERTYPE="LOCAL"
TARGETFOLDERNAME="ASA_PROD"
TARGETFOLDERTYPE="LOCAL"/>
<APPLYLABEL SOURCELABELNAME="NEW_SRC_LABEL_NAME"
SOURCEMOVELABEL="YES"
TARGETLABELNAME="NEW_TGT_LABEL_NAME"
TARGETMOVELABEL="YES" />
</DEPLOYGROUP>
</DEPLOYPARAMS>

Since the control file mentions source and target labels, we created those labels at the source and target repositories accordingly. If you do not create them, the command will fail with an error saying the mentioned labels do not exist.

We stored this control file in the same directory as pmrep and ran the above command, and this time we got the same error Nilesh got at the office:

Applying label "NEW_SRC_LABEL_NAME" in the source repository...
Applying label "NEW_TGT_LABEL_NAME" in the target repository...
Error: Apply label failed for Source Definition object EMPLOYEES version 6 in repository EDWPROD because this version of the object has been purged. This object was skipped.

So, what went wrong? We thought it was definitely because of the labels, and to verify that, we removed the label tag from the control file. The new control file looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE DEPLOYPARAMS SYSTEM "depcntl.dtd">
<DEPLOYPARAMS DEFAULTSERVERNAME="Administrator"
COPYPROGRAMINFO="YES"
COPYMAPVARPERVALS="YES"
COPYWFLOWVARPERVALS="YES"
COPYWFLOWSESSLOGS="NO"
COPYDEPENDENCY="YES"
LATESTVERSIONONLY="YES"
RETAINGENERATEDVAL="YES"
RETAINSERVERNETVALS="YES">
<DEPLOYGROUP CLEARSRCDEPLOYGROUP="NO">
<OVERRIDEFOLDER SOURCEFOLDERNAME="ASA"
SOURCEFOLDERTYPE="LOCAL"
TARGETFOLDERNAME="ASA_PROD"
TARGETFOLDERTYPE="LOCAL"/>
</DEPLOYGROUP>
</DEPLOYPARAMS>

We ran the command again with the same input, and this time the deployment completed successfully and the objects in the EDWPROD repository got deleted.

This proves that if you are migrating deleted objects from the command line, you should not specify labels in the control file.

The deployment log is as follows:
---------------------------------------------------------------------------------
pmrep>deploydeploymentgroup -p Migration -c MigrationTest.xml -r EDWPROD -h infra-34af3549f -o 6001 -n Administrator -x magogate

--------------------------------------------------------------------------------
Starting copy at 01/07/2009 23:44:10
--------------------------------------------------------------------------------


Creating temporary tables...
Obtaining locks...
Deploy trying to acquire shared lock on Repository EDWPROD in target repository.
Deploy trying to acquire shared lock on Deployment Group Migration in source repository.
Deploy trying to acquire shared lock on Folder ASA in source repository.
Deploy trying to acquire subtree shared lock on Folder ASA in source repository.

Finding Dependencies...
Finding dependent database connections...
Finding dependent FTP connections...
Finding dependent External loader connections...
The Repository Service is finding dependent message queue connections.
Finding dependent External loader connections...
Finding dependent servers...
Finding dependent local shared folders...
Checking for any global shortcuts in this folder...
No global shortcuts detected.
Objects in source folders [ASA] will be copied to the following target folders [ASA_PROD] respectively.
No mapping variable persisted values found in the repository for this folder.
No workflow variable persisted values found in the repository for this folder.
Existing CurrentValue's for Sequence Generator and Normalizer transformations or XML generated keys will be retained.
Existing Integration Service-grid-related values in workflows and session instances will be retained.
Copying Deployment Group Migration
Copying table OPB_MAP_PERSISVAL...
Deploy trying to acquire exclusive lock on Repository EDWPROD in target repository.
Copying temporary tables...
Saving group deployment history...
Clearing source deployment group...
Copying deleted objects:
Source Definition object "EMPLOYEES@XE" from folder "ASA"
Source Definition object "DEPARTMENTS@XE" from folder "ASA"
Target Definition object "EMPLOYEESDUMMY" from folder "ASA"
Target Definition object "DEPARTMENTS" from folder "ASA"
Mapping object "m_DeploymentGroupTest" from folder "ASA"
Workflow object "wf_DeploymentGroupTest" from folder "ASA"
The following deleted objects are not be deployed because they do not exist in the target repository:
Workflow object "wf_DeploymentTest" from folder "ASA"
Committing changes, deployed objects are now checked out...
Checking in checked out objects...
Making nonreusable objects reusable if needed...
Copy was successful, committing changes...


--------------------------------------------------------------------------------
Completed at 01/07/2009 23:44:14
--------------------------------------------------------------------------------


deploydeploymentgroup completed successfully.
