Sunday, 14 February 2016

DataStage Administrator

Administrator 

       Administers DataStage projects, manages global settings, and interacts with the system. Administrator is used to specify general server defaults, add and delete projects, set up project properties, and provide a command interface to the DataStage repository.
  • With DataStage Administrator, users can set job monitoring limits, user privileges, job scheduling options, and parallel job defaults.
  • Job parameters should be used in all DataStage server, parallel, and sequence jobs to give administrators access to change runtime values such as database login details, file locations, and job settings.
  • One option for maintaining these job parameters is to use project-specific environment variables. These are similar to operating system environment variables, but they are set up and maintained through the DataStage Administrator tool.
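To illustrate how these values are consumed once defined, DataStage references job parameters inside stage properties using #...# notation, and an environment variable exposed as a job parameter carries a $ prefix. The parameter names below are hypothetical examples, not defaults:

```
File name:       /data/#Environment#/input/#SourceFileName#
Password field:  #DBPassword#
Env-var param:   #$MY_DB_SERVER#
```

Because these references resolve at run time, an administrator can change the values in Administrator (or at job submission) without editing the job design.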
Start up DataStage Administrator.




Choose the project and click the "Properties" button.


General Tab:
  • Enable job administration in Director: enables administration functions within the DataStage Director client, so that tasks such as clearing a job's status file or cleaning up job resources can be performed directly from Director without opening Administrator.
  • Enable Runtime Column Propagation: makes runtime column propagation available to jobs designed in the Designer client. InfoSphere DataStage is flexible about metadata: you can omit column definitions partially or entirely and specify that, if the job encounters extra columns while running, it should interrogate their source, pick them up, and propagate them through the rest of the job. This is known as Runtime Column Propagation (RCP).

    On the General tab, click the "Environment..." button, then click the "User Defined" folder to see the list of project-specific environment variables. There are two types of variables: string and encrypted. If you create an encrypted environment variable, it appears as "*******" in the Administrator tool and as scrambled text when saved to the DSParams file or displayed in a job log. This hides the value from casual viewing, but treat it as obfuscation rather than strong encryption.

    
   
 Permissions tab: here you can create users and groups and assign privileges to the users.





Auto purging

  • The auto-purge setting applies only to NEW jobs created after Auto-purge has been enabled. Auto-purge is enabled from DS Admin; once enabled, it purges the job logs of all new jobs created in that project. To apply auto-purge to EXISTING jobs, you must export and re-import them; the setting is applied on import.
  • You can also free up space by removing jobs you do not need. It is highly advisable to do this via DS Director: delete the jobs you no longer need, or export the jobs and then remove the ones you no longer need or run.
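The export/import round trip can also be scripted from a client machine using DataStage's command-line export and import tools. This is only a sketch: the option syntax of dscmdexport/dscmdimport varies by version (check each tool's usage output before relying on it), and HOSTNAME, USER, PASSWORD, MyProject, and the file path are placeholders:

```
REM Export all jobs in the project to a .dsx file
dscmdexport /H=HOSTNAME /U=USER /P=PASSWORD MyProject C:\backup\MyProject.dsx

REM Re-import them; project settings such as auto-purge apply to jobs on import
dscmdimport /H=HOSTNAME /U=USER /P=PASSWORD MyProject C:\backup\MyProject.dsx
```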

  • If users create hashed files in the project using the Account option, consider having them use the Directory path option instead, so that user files are kept out of the project directory. Note: if UV stages are used in the jobs, switching from Account to Directory path will not be seamless; in that case, research the change further before making it.
  • If your jobs use sorts, they use the /tmp directory as a temporary working area, so you can check /tmp for temp files that are no longer needed and clean up this directory.
  • Clean up scratch and resource disk directories, as applicable.
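As a concrete sketch of the /tmp cleanup step above, the snippet below works in a throwaway sandbox directory rather than the real /tmp; the *.tmp name pattern and the 7-day age threshold are assumptions to adapt, and you should review find's output before ever adding -delete:

```shell
# Sandbox demo: find temp files older than 7 days without touching real /tmp.
workdir=$(mktemp -d)
touch "$workdir/sort_recent.tmp"                 # freshly created file
touch -t 201601010000 "$workdir/sort_stale.tmp"  # backdated file stands in for an old temp file
# List candidates older than 7 days; add -delete only after reviewing this list
stale=$(find "$workdir" -name '*.tmp' -mtime +7)
echo "$stale"
rm -rf "$workdir"
```

Running this prints only the backdated file, since the fresh one is younger than the threshold.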
