Tuesday, May 31, 2011

Simplified Method to Export and Import Data from/to DB2 tables

Here is an easy way to export and import data from and into a DB2 Table.

-- EXPORT

1. Create an SQL file (export.sql) with the following content:
--
CONNECT TO db2Instance USER username USING password;
EXPORT TO /filepath/tableDataFile.dat OF DEL MODIFIED BY NOCHARDEL COLDEL0X09
SELECT * FROM schema.tableName WITH UR;
CONNECT RESET;
TERMINATE;
--

2. Execute the sql file like this:

db2 -tvf export.sql &
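The exported .dat file is plain tab-separated text with no quote characters around strings (that is what the "no character delimiter" and tab-delimiter modifiers above request). Here is a minimal Python sketch of reading such a file; the file name and sample rows are made up for illustration:

```python
import csv

# Made-up file name; in the export above this would be /filepath/tableDataFile.dat.
path = "tableDataFile.dat"

# Fake a small export: tab-separated columns, no quote characters around strings.
with open(path, "w", newline="") as f:
    f.write("1\tAlice\t1000.50\n")
    f.write("2\tBob\t2000.75\n")

with open(path, newline="") as f:
    reader = csv.reader(f, delimiter="\t", quoting=csv.QUOTE_NONE)
    rows = list(reader)

print(rows)  # [['1', 'Alice', '1000.50'], ['2', 'Bob', '2000.75']]
```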



-- IMPORT

1. Create an SQL file (import.sql) with the following content:
--
CONNECT TO db2Instance USER username USING password;

LOAD CLIENT FROM '/filepath/tableDataFile.dat' OF DEL MODIFIED BY NOCHARDEL COLDEL0X09 MESSAGES '/filespath/tableDataFile.msg'
TEMPFILES PATH /tempfilespath INSERT INTO schema.tableName NONRECOVERABLE INDEXING MODE REBUILD;

RUNSTATS ON TABLE schema.tableName WITH DISTRIBUTION AND INDEXES ALL ALLOW WRITE ACCESS;
CONNECT RESET;
TERMINATE;
--
2. Execute the sql file like this:
db2 -tvf import.sql &
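Going the other way, a LOAD-compatible .dat file can be produced by any tool that writes tab-separated text without quote characters. A small Python sketch (file name and rows are made up):

```python
import csv

# Made-up rows; in practice these would come from your source system.
rows = [(1, "Alice", 1000.5), (2, "Bob", 2000.75)]

# QUOTE_NONE writes no quote characters (matching NOCHARDEL); the writer will
# raise an error if a field itself contains a tab, which would corrupt the file.
with open("tableDataFile.dat", "w", newline="") as f:
    writer = csv.writer(f, delimiter="\t", quoting=csv.QUOTE_NONE, lineterminator="\n")
    writer.writerows(rows)

with open("tableDataFile.dat") as f:
    content = f.read()
print(content)
```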

Wednesday, May 11, 2011

How to handle huge data in and out of DB2 database?

I often come across situations where I need to export or import huge amounts of data into and out of databases, so I decided to collect all the related information and steps in one place. Here it is.

1. CSV files: If you want to create sample or test data, you can use the RAND() function in Excel to generate random numbers, then copy-and-paste to build huge files.

I created a CSV file with more than 4 million rows (datfile.csv, say).
Here is the SQL to load this data into a table, say myschema.my_temp_table (assuming the table has already been created to match the CSV file):
--
IMPORT FROM '/home/data/datfile.csv' OF DEL MESSAGES '/tmp/messages.txt'
INSERT INTO myschema.my_temp_table ;
--

Embed this SQL in a shell script, which looks like this:
##
db2 connect to InstanceName user username using passwd
db2 -tvf import_temp_table.sql # shown above
db2 "RUNSTATS ON TABLE myschema.my_temp_table WITH DISTRIBUTION AND INDEXES ALL ALLOW WRITE ACCESS"
db2 connect reset
db2 terminate
##
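If Excel is not handy, the same kind of random test data can be generated with a short script. A Python sketch (the file name and column layout are made up; raise the row count for real volume tests):

```python
import csv
import random

random.seed(42)  # reproducible sample data

# Made-up layout: an id, a label, and a random amount per row.
with open("datfile.csv", "w", newline="") as f:
    writer = csv.writer(f)
    for i in range(1, 1001):  # bump this to millions for real volume tests
        writer.writerow([i, f"name_{i}", round(random.random() * 1000, 2)])

# Count the rows we just wrote.
with open("datfile.csv", newline="") as f:
    n_rows = sum(1 for _ in f)
print(n_rows)  # 1000
```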

Easy? Try it.

Monday, December 20, 2010

Tips and Tricks

To compare 2 tables:
(select * from Table1
except
select * from Table2)
union all
(select * from Table2
except
select * from Table1)

To find duplicates in a table:
select field1, count(field1)
from Table1
group by field1
having count(field1) > 1

To find SECOND highest value in a column:
SELECT MAX(SALARY) FROM EMPLOYEE WHERE SALARY NOT IN (SELECT MAX(SALARY) FROM EMPLOYEE)
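All three queries can be tried quickly outside DB2. Here is a sketch using Python's built-in sqlite3 module (SQLite supports EXCEPT and UNION ALL too; the sample data is made up):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE Table1 (field1 INTEGER);
    CREATE TABLE Table2 (field1 INTEGER);
    INSERT INTO Table1 VALUES (1), (2), (2), (3);
    INSERT INTO Table2 VALUES (1), (2), (4);
    CREATE TABLE EMPLOYEE (SALARY INTEGER);
    INSERT INTO EMPLOYEE VALUES (100), (200), (300);
""")

# 1. Compare 2 tables (SQLite wants the parenthesized compounds as subqueries).
diff = con.execute("""
    SELECT * FROM (SELECT field1 FROM Table1 EXCEPT SELECT field1 FROM Table2)
    UNION ALL
    SELECT * FROM (SELECT field1 FROM Table2 EXCEPT SELECT field1 FROM Table1)
""").fetchall()

# 2. Find duplicates.
dups = con.execute("""
    SELECT field1, COUNT(field1) FROM Table1
    GROUP BY field1 HAVING COUNT(field1) > 1
""").fetchall()

# 3. Second highest value in a column.
second = con.execute("""
    SELECT MAX(SALARY) FROM EMPLOYEE
    WHERE SALARY NOT IN (SELECT MAX(SALARY) FROM EMPLOYEE)
""").fetchone()[0]

print(diff)    # [(3,), (4,)] -- rows present in only one of the two tables
print(dups)    # [(2, 2)]     -- 2 appears twice in Table1
print(second)  # 200
```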




Friday, December 18, 2009

CSQA Information

CSQA:
This link has a whole lot of information about CSQA.
http://csqa.blogspot.com/

Test Plan Template

Here is a sample Test Plan Template.
-----------------------
Introduction
Summarize the product/feature being tested, and give a high-level outline of all the functions. Provide a list of relevant related documents, such as requirements, design documents, other test plans, etc.

Scope
Describe WHAT is being tested, such as all the functions of a specific product, interfaces, integration of all functions.
Describe HOW you will accomplish the above activities.
Also, mention the features that will NOT be tested.

Standards, Naming conventions

QA Organization and Resources
Mention the names of the QA resources, their roles and responsibilities.

Testing Approach
Describe the overall approach to testing. For each major group of features or feature combinations, specify the approach which will ensure that these feature groups are adequately tested. Mention the participants and methodology for each type of testing activity.
o Unit Testing
o System and Integration Testing
o Performance and Stress Testing
o User Acceptance Testing
o Automated Regression Testing
o Beta Testing

Assumptions, Risks and Dependencies.

Test Environment
Mention the Hardware and Software requirements for the Testing environment.
Also mention the setup and configuration issues.

Software CM Processes

QA Entry and Exit Criteria.
Here are some examples.
QA Entry Criteria
- All code developed and compiled on ALL applicable platforms
- Code Walk-thru and Peer review complete
- Unit Testing complete

QA Exit Criteria

- All test cycles complete
- System Test Cycle complete
- UAT complete
- All Test Runs recorded and the Test Run Metrics submitted to PM
- No S1 or S2 defects open

Tools
Mention the Automation tools and Bug tracking tool here.
Note: A brief discussion on the justification of Automation would be appropriate here.

Test Schedule
Describe the Testing cycles, durations and Completion dates. Also mention the resources.

QA Deliverables
Mention all the deliverables: Test Run charts, Pass-Fail Metrics and so on.

QA Plan

QA Plan is a high-level document that describes the overall QA Process and Strategy at the Organization level.
Following is a QA Plan Template.
----------------------------------------------------------------

Purpose
The purpose of this document is to describe in detail the overall QA process, which applies to all projects.
The intended audience for this document is as follows:
-     Project Manager(s)
-    Business Analyst(s)
-    Development Team
-    QA Team

2.0    QA Philosophy and Guidelines

It is generally expected that a variety of projects, features, releases, and hot fixes will be developed and released by a development shop, and a specific QA request shall be made for each of these tasks. Different types of tasks will have different QA expectations. It is critical that the QA team acknowledges this and covers all these possibilities.
The QA process should be flexible enough to accommodate all different kinds of QA requests, and should have a definite course of action for each kind of QA Requests.

2.1    Major Release
For a major release (possibly of an entire product), QA process consists of following steps:
-    Pre-Test Planning
-    Gathering Requirements
-    Creating Test Plan
-    Test Execution
-    Close-out and GO-LIVE
A detailed discussion is given in Section 5 of this document.

As far as testing is concerned, following methodologies should be considered:
-    Functional Testing
-    Integration Testing
-    Performance/Load Testing
-    Boundary Testing
-    Regression Testing
-    User Acceptance Testing

2.2    Feature Release
Many times, only a single feature is developed and added to the existing product. The QA considerations are slightly different here: we may not need a separate test plan, but a Test Execution Plan is definitely needed.
On the testing side, Performance/Load testing may or may not be needed, depending on the functionality of the feature. If it is a new feature, Integration testing will be necessary to make sure the feature integrates well with the existing features of the product.
Full Regression testing is absolutely necessary, to ascertain that the new feature does not introduce any undesired issues and does not break existing functionality.

2.3    Dot release (or Bug Fix Release)
This type of release contains code for bug fixes, and not necessarily any new functionality. Accordingly, the QA process may omit a separate test plan.
Apart from the individual bug fix testing scenarios, some level of regression is definitely needed. It is up to the QA lead to identify the regression scenarios around the functionality that is affected by the individual bugs.

2.4    Hot Fix
A hot fix is usually a very small fix, consisting of a few bug fixes and/or a small functionality change. It does not call for a full QA cycle; only a smoke test is necessary to verify the fix.

2.5    Emergency Fix
Sometimes there is an emergency fix and not enough time for full testing. In such cases, the QA team should be able to identify a subset of the test scenarios/cases that will effectively test the fix in the given amount of time.
The test scenarios can be prioritized based on various factors:
o    Core functionality of the code being fixed.
o    UI changes that will be most obviously visible to the users.
o    Mission-criticality of certain portions of the code.

3.0    Outstanding Issues
List outstanding issues (if any), and their resolution.
       
4.0    QA Process

QA Process is divided into following steps.
-    Pre-Test Planning
-    Gathering Requirements
-    Creating Test Plan
-    Test Execution
-    Close-out and GO-LIVE

4.1    Pre-Test Planning
This is a plan-ahead step.
The QA Manager will periodically review the anticipated QA activities for the next two sprints, perform an Ambiguity Review for all of them, and submit an Ambiguities Checklist to the PM, to be addressed and resolved.
In this planning phase, the QA Manager will also identify any security needs, such as access to databases/applications/sites, that the QA team requires in order to perform the scheduled QA activities. For example, if an application A accesses a database B, then QA will require access to both application A and database B, taking into account whether read-only or read-write access is needed.

4.2    Gathering Requirements
As a general practice, the QA Manager receives a QA request for a project from the PM team.
The QA Manager will then acquire all project-related documents required for conducting a successful QA activity. Documents can include (but are not limited to):
a.    Business Requirements Specifications
b.    Design specifications
c.    Functional specifications
d.    Data dictionaries
It is recommended that these documents be available in a shared location, such as SharePoint.

4.3    Creating Test Plan
Upon receiving all the required documents, QA Manager will work with QA Lead(s) to create a Test Plan. A test plan is a high-level document that outlines the QA Strategy, Effort, Resources, Schedule and Deliverables for a given project. It also includes an exhaustive list of test scenarios. Please see appendix for a Test Plan Template.
Specifically, following items shall be identified and described in a test plan:
o    Test Scenarios
o    Testing framework (Hardware/Software specifications)
o    Resources (Local/Off-shore)
o    Justification for Automation (if applicable), and requirements (Automation software etc).
o    Testing Environments (QA/Staging etc)
o    Test Execution Schedule: QA Cycles and their start/finish dates
o    QA Entry/Exit Criteria
o    Go-Live schedule
o    Post-Production Activities, if any.
The test plan shall be reviewed by the Development team, PM(s) and Business analyst(s).
A more detailed Test Execution Plan shall be developed by the QA lead(s). It will include detailed test cases, tester assignments, and the test completion schedule. Please see the Appendix for an example of a Test Execution Matrix.

4.4    Test Execution
The Test Execution phase begins once the QA Entry criteria are satisfied. Entry criteria include product readiness and QA readiness.
As a rule of thumb, test execution will take place in the Staging (or QA) environment. Please see the Appendix for a discussion of environments.
Test Execution phase will be divided into suitable test cycles. Contents and schedule for each test cycle shall be created and distributed by the QA Lead.
4.4.1    Test Execution
All manual test scenarios will be executed by the QA testers, and the test runs will be recorded in an online QA management system (Quality Center) that is accessible to QA as well as Developers.
All automated test scenarios will be executed by the automation team, and their test runs will be recorded in the same QA management system.
In addition, QA Lead(s) and QA Manager(s) may perform some ad-hoc tests.
QA lead(s) will use the test run details to create QA run status reports and QA metrics.
4.4.2    Defect Logging
Defects will be logged for each and every test failure. Each defect will have a severity index attached to it, from 1 to 5, 1 being a show-stopper (absolutely critical). Each defect will be assigned to a developer.
It is recommended that each test failure be associated with one (or more) defects, with a suitable cross-reference between the failed test and the associated defect(s). NOTE: if we use a single test management system for test execution as well as defect management, maintaining such a cross-reference becomes very easy.
4.4.3    Defect Tracking
Once logged, defects will be tracked by QA. A developer will fix a given defect and send it back to the QA person who logged it. QA will re-test it: if it is really fixed, QA will close the defect; if not, the defect goes back to the developer.

Please refer to Defect Management Guidelines Section in the Appendix of this document.

4.4.4    System Testing
Once feature testing is complete, System Testing shall be carried out. In this phase, QA will ensure all components work together and meet all non-functional requirements. This phase includes testing adjacent functional areas, end-to-end testing, and ad-hoc testing.

4.4.5    UAT Co-ordination
Once all the test cycles are complete and the QA Exit criteria are met, QA shall release the product for User Acceptance Testing. QA will work with the UAT testers and make the product available to them for testing.
QA will receive the feedback from UAT and address any issues reported. It is good QA practice to analyze the UAT feedback and create new test scenarios based on it.

4.5    Close-out and GO-LIVE
At the end of the Test Execution phase, the QA Manager will make a GO/NO-GO decision based on the Test Execution Report. If a GO-Live decision is made, the QA Manager will work with the Release team to make the GO-Live process smooth, and will also identify a set of smoke test cases to be run after the product is moved into Production.



5 Defect Management Guidelines 


Field        | Description                                                                 | Example
-------------|-----------------------------------------------------------------------------|----------------------------------------
Summary      | One-line summary of the defect                                              | 'Account Name field shows garbage data'
Project      | Project name                                                                | MSDM
Feature      | Feature name                                                                | Linkage Report
Code Drop    | Code drop reference                                                         | Drop_1_DEC152009
Environment  | Environment in which this defect was found                                  | Staging
Description  | Description of the defect, including the steps to reproduce it              | When we populate the Account ID and tab out of the field, the Account Name shows garbage characters.
Severity     | How severe the defect is; refer to the Defect Severity Reference table below | S1
Assigned To  | Name of the person (either Developer or QA) working on the defect at this moment |
Status       | Defect status: whether it is in DEV, in QA, or Closed                       | In QA
Attachments  | Screenshots, data sheets, or any additional information                     |

Thursday, August 20, 2009

Agile Methodology

I would like to share my understanding of the Agile methodology.
In the traditional waterfall model, the development process is step-by-step: we do not move to the next step of development until the previous step is approved and signed off. This can be problematic, especially when a defect found during the QA/testing phase turns out to be a design flaw; it is too expensive to go back, change the software design, and rewrite it.

In contrast, Agile development is a continuous cycle of Design-Code-Test, with shorter development cycles. In plain terms, instead of one huge laundry list of deliverables to be delivered in 18 months, the Agile method will have 3 shorter lists of deliverables, delivered every 6 months.

The process used to implement the Agile methodology is called 'Scrum'.
The development cycle is broken into sprints. Each sprint is typically 2 weeks (a general norm; it can be longer, but a sprint shorter than 2 weeks does not make much sense). Each project team member is given a specific task and deadline, which he/she has to break down into subtasks. At the beginning of each sprint, each team member comes up with a list of (sub)tasks that he/she plans to perform in that sprint and sends it to the scrum master (generally a senior team member, or someone from the PMO).
Every day, a scrum meeting is held, typically early in the morning. The scrum master asks each person about the progress of his/her task(s) and about any issues blocking completion. That way, the overall progress of the project is tracked daily, and problems, if any, are detected immediately.