Wednesday, January 29, 2014

Test Plan

TEST PLAN DEFINITION
A Software Test Plan is a document describing the testing scope and activities. It is the basis for formally testing any software/product in a project.
  • test plan: A document describing the scope, approach, resources, and schedule of intended test activities. It identifies, among other things, the test items, the features to be tested, the testing tasks, who will do each task, the degree of tester independence, the test environment, the test design techniques, the entry and exit criteria to be used and the rationale for their choice, and any risks requiring contingency planning. It is a record of the test planning process.
  • master test plan: A test plan that typically addresses multiple test levels.
  • phase test plan: A test plan that typically addresses one test phase.
TEST PLAN TYPES
One can have the following types of test plans:
  • Master Test Plan: A single high-level test plan for a project/product that unifies all other test plans.
  • Testing Level Specific Test Plans: Plans for each level of testing.
    • Unit Test Plan
    • Integration Test Plan
    • System Test Plan
    • Acceptance Test Plan
  • Testing Type Specific Test Plans: Plans for major types of testing like Performance Test Plan and Security Test Plan.
TEST PLAN TEMPLATE
The format and content of a software test plan vary depending on the processes, standards, and test management tools being implemented. Nevertheless, the following format, which is based on the IEEE 829 standard for software test documentation, provides a summary of what a test plan can/should contain.
Test Plan Identifier:
  • Provide a unique identifier for the document. (Adhere to the Configuration Management System if you have one.)
Introduction:
  • Provide an overview of the test plan.
  • Specify the goals/objectives.
  • Specify any constraints.
References:
  • List the related documents, with links to them if available, including the following:
    • Project Plan
    • Configuration Management Plan

Test Items:
  • List the test items (software/products) and their versions.
Features to be Tested:
  • List the features of the software/product to be tested.
  • Provide references to the Requirements and/or Design specifications of the features to be tested.
Features Not to Be Tested:
  • List the features of the software/product which will not be tested.
  • Specify the reasons these features won’t be tested.
Approach:
  • Mention the overall approach to testing.
  • Specify the testing levels [if it's a Master Test Plan], the testing types, and the testing methods [Manual/Automated; White Box/Black Box/Gray Box].
Item Pass/Fail Criteria:
  • Specify the criteria that will be used to determine whether each test item (software/product) has passed or failed testing.
Suspension Criteria and Resumption Requirements:
  • Specify criteria to be used to suspend the testing activity.
  • Specify testing activities which must be redone when testing is resumed.
Test Deliverables:
  • List test deliverables, and links to them if available, including the following:
    • Test Plan (this document itself)
    • Test Cases
    • Test Scripts
    • Defect/Enhancement Logs
    • Test Reports

Test Environment:
  • Specify the properties of the test environment: hardware, software, network, etc.
  • List any testing or related tools.
Estimate:
  • Provide a summary of test estimates (cost or effort) and/or provide a link to the detailed estimation.
Schedule:
  • Provide a summary of the schedule, specifying key test milestones, and/or provide a link to the detailed schedule.
Staffing and Training Needs:
  • Specify staffing needs by role and required skills.
  • Identify training that is necessary to provide those skills, if not already acquired.
Responsibilities:
  • List the responsibilities of each team/role/individual.
Risks:
  • List the risks that have been identified.
  • Specify the mitigation plan and the contingency plan for each risk.
Assumptions and Dependencies:
  • List the assumptions that have been made during the preparation of this plan.
  • List the dependencies.
Approvals:
  • Specify the names and roles of all persons who must approve the plan.
  • Provide space for signatures and dates. (If the document is to be printed.)
TEST PLAN GUIDELINES
  • Make the plan concise. Avoid redundancy and superfluous detail. If you think you do not need a section that has been mentioned in the template above, go ahead and delete it from your test plan.
  • Be specific. For example, when you specify an operating system as a property of a test environment, mention the OS Edition/Version as well, not just the OS Name.
  • Make use of lists and tables wherever possible. Avoid lengthy paragraphs.
  • Have the test plan reviewed a number of times prior to baselining it or sending it for approval. The quality of your test plan speaks volumes about the quality of the testing you or your team is going to perform.
  • Update the plan as and when necessary. An outdated and unused document is worse than not having the document in the first place.

Test Estimation for Manual and Automation Testing

Experience-based estimation: drawing on past projects, estimate how much time each area will take
Delphi Method
Function Point Analysis
Work Breakdown Structure (WBS)
Predefined template suited to the project's needs (e.g., a spreadsheet tracking project status)
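The techniques above boil down to simple arithmetic once you have counts and rates. Here is a minimal Java sketch of a test-point style estimate; the complexity weights and the hours-per-point rate are invented example values for illustration, not figures from Function Point Analysis or any standard:

```java
// Illustrative test-estimation sketch: test cases are weighted by
// complexity, summed into "points", and multiplied by an effort rate.
// The weights (1/2/4) and the rate are made-up example values.
public class TestEstimate {
    static double estimateHours(int simple, int medium, int complex,
                                double hoursPerPoint) {
        // Example complexity weights: simple=1, medium=2, complex=4 points
        double points = simple * 1 + medium * 2 + complex * 4;
        return points * hoursPerPoint;
    }

    public static void main(String[] args) {
        // 30 simple, 20 medium, 10 complex cases at 1.5 hours per point
        double hours = estimateHours(30, 20, 10, 1.5);
        System.out.println(hours); // prints 165.0
    }
}
```

In practice the weights and rate would be calibrated from your own past projects (the experience-based and Delphi inputs above), then tracked against actuals in the status spreadsheet.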

Testing Process

Company Testing Objectives
Project-wise
Product-wise
Company-wise
Shareholder-wise
Technical-staff-wise
Skill-set-wise

Automation Framework

Data-Driven
Functional
Keyword-Driven
Modular
Library
Hybrid
Native

Saturday, January 11, 2014

Selenium

•     Experience with TestNG and data-driven and hybrid frameworks
•     Creating the driver setup
•     Using Firebug and FirePath to inspect elements in Firefox
•     Integrating TestNG into Eclipse
•     Generating HTML and emailable reports for the client
•     Including and excluding the sanity and regression test cases in the TestNG XML file
•     Running the suite from the TestNG XML file
•     Creating a runnable JAR file so the client can review the suite
•     Configuring multiple environments
•     Reading test data from Excel through Java code
•     Maintaining a repository of IDs and test data in an Excel sheet
•     Handling alerts and popup windows in the driver
•     Experience handling Windows applications with AutoIt and keyboard events
•     Adding all suites to the Ant build
•     Creating the build.xml file and making the batch file
•     Scheduling the tasks with the scheduler
•     Using the getEval method to run JavaScript against unidentified objects
•     Creating regular expressions for ID/Name/Text/XPath values
•     Using commands such as Actions, Accessors, and Asserts
•     Understanding the command structure: Command, Target, Value


•     Identifying the automation test cases for regression and sanity
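The suite-file bullets above (including/excluding sanity and regression cases, running the suite from XML) can be sketched as a minimal testng.xml. The suite, test, group, and class names here are placeholders, not taken from any real project:

```xml
<!-- Minimal testng.xml sketch: runs only the "sanity" group and
     skips "regression". Names below are placeholders. -->
<suite name="ClientSuite">
  <test name="SanityRun">
    <groups>
      <run>
        <include name="sanity"/>
        <exclude name="regression"/>
      </run>
    </groups>
    <classes>
      <class name="com.example.tests.LoginTest"/>
      <class name="com.example.tests.PaymentTest"/>
    </classes>
  </test>
</suite>
```

Swapping the include/exclude names turns the same file into the regression run, which is what makes driving suites from TestNG XML convenient for client builds.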

JMeter

•      Running sanity test cases for JSON object validation of the Payment API
•      Testing a cloud-based application through Apache JMeter
•      Distributing different data to different user groups
•      Validating JSON request objects against response objects
•      Checking RESTful services
•      Creating functional test cases for JSON and JSONP forms over the cloud API
•      Verifying and exchanging data between the in-house API and third-party APIs
•      Validating and analyzing all SLAs (Service Level Agreements), NFRs, and network features
•      Creating a proxy server for the WorkBench
•      Enabling the browser settings for the proxy server
•      Creating thread groups
•      Adding user-defined samplers to thread groups
•      Adding Regular Expression Extractor post-processors
•      Parameterizing the samplers for test items
•      Adding test data elements through the CSV config file
•      Adding controllers for the respective test items
•      Creating assertions for test items
•      Generating user-defined test results in an Excel sheet

•      Creating appropriate listeners for the respective thread groups
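The "distributing different data to different user groups" bullet is usually done in JMeter with a CSV Data Set Config shared across threads. As a rough plain-Java model of that idea (not JMeter's actual implementation), each virtual user takes the next CSV row round-robin, wrapping at end of file:

```java
import java.util.*;

// Illustrative model of JMeter's shared CSV data set: each virtual
// user (thread) takes the next row in turn, recycling on EOF.
public class CsvRoundRobin {
    private final List<String> rows;
    private int next = 0;

    CsvRoundRobin(List<String> rows) { this.rows = rows; }

    // Each call hands out the next row, wrapping around ("Recycle on EOF")
    synchronized String nextRow() {
        String row = rows.get(next);
        next = (next + 1) % rows.size();
        return row;
    }

    public static void main(String[] args) {
        // Hypothetical test data: one login per line, as in a CSV file
        CsvRoundRobin data = new CsvRoundRobin(
            Arrays.asList("user1,pass1", "user2,pass2", "user3,pass3"));
        // Three virtual users each pick up a distinct row
        for (int vu = 1; vu <= 3; vu++) {
            System.out.println("VU" + vu + " -> " + data.nextRow());
        }
    }
}
```

In JMeter itself this behavior is configured, not coded: the CSV Data Set Config element's sharing mode decides whether rows are shared across all threads or per thread group.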

Silk Test

•     Built three different frameworks from scratch as per client needs
•     Worked with clients' existing frameworks for 4Test and VB.Net code
•     Experience in the 4Test language and VB.Net frameworks
•     Followed a data-driven framework with a specified folder structure (batch, data, error, extend, library, plan, result, scripts)
•     Worked closely on regression automation with the DG, BT, and NSC clients
•     Worked with the following file types: .vtp, .stp, .opt, .ini, .inc, .res/.rex, .s, .txt, .pln, accex.inc, pln.ini, .t, .g.t, .ino, .to, defaults.inc, transcript, 4Test.inc, cs.inc, oleclass.inc, vbclass.inc, winclass.inc
•     Mapping the respective test cases (.t) to the respective test plans (.pln)
•     Creating 4Test scripts for the required test cases (.t files)
•     Executing 4Test-language test cases as well as VB.Net test cases
•     Debugging the scripts whenever changes happen in the respective pages
•     Writing scripts using DB IDs for unidentified objects in Silk Test
•     Capturing, analyzing, and verifying test case results in .res and .rex files
•     Passing result data into flat files to verify test case results
•     Storing all project files (.stp or .vtp) on shared drives (common folder)
•     Executing data-driven tests with a .g.t file, a flat file, a direct .t file, or code written in a .t file
•     Creating regular expressions (wildcard characters) in tags for dynamically changing objects
•     Including files as needed, depending on the .t files
•     Repeating the above scenarios in the sanity pack and regression pack for the major functionalities
•     Handling exceptions through recovery and exception logic, and debugging the code

•     Worked on file handlers and file modes for flat files and Excel sheets

Silk Performer

SilkPerformer Introduction
•     Identify the business need for SilkPerformer
•     Describe the business and IT issues that SilkPerformer addresses
•     Compare the alternative approaches to Borland SilkPerformer
•     How Borland SilkPerformer integrates with other Borland products
•     Identify the overall benefits to a business using Borland SilkPerformer

Modeling and Implementing Load Tests

•     Project Plan, Test Plan, and Project Outline
•     Modeling the Scripts
•     Customizing the Test
•     Finding and Confirming a Baseline
•     Adjusting the Workload and Running the Test
•     Introduction to BDL Scripting
•     Data Types, Variables, and Randomizing
•     Profile Settings
•     Object Recognition
•     Capturing Your Application
•     Accessing Dynamic Objects

Results Analysis and Correlation

•     Reviewing the Basics
•     Using Performance Explorer
•     Understanding Quantified Data
•     Analyzing Client-Side Data
•     Analyzing Scenarios
•     Analyzing Server-Side Data
•     Other Client-Side Measures