Overview

Welcome to QTP interview questions

Hello, software quality engineers. Welcome to our software testing world, covering common QTP (QuickTest Pro) technical issues, interview questions, and more. For over two years I have been trying to collect every possible interview question in the QTP testing area. With your continuous comments and suggestions on these topics, we are growing day by day.

Wednesday, December 19, 2007

Fundamentals of testing

Fundamentals of Software Testing

Verification Testing
Verification is the process of examining a product to find its defects.
• Verification techniques include:
1. Desk checks
2. Peer reviews
3. Walk-throughs
4. Formal inspections
• Verification tools include:
1. Standards
2. Checklists
3. Static analyzers
4. Use Cases
5. Judgment and experience
• The objective of verification testing is to find defects:
1. Using human examination and experience.
2. As early as possible in the life of the defect.
Inspection criteria:
1. If an input specification is used, it has been previously verified.
2. Special criteria can occur as a list of questions or a targeted checklist.
3. Inspection criteria are usually tied to the type of deliverable rather than to the software product being developed. The typical formal inspection assigns the following responsibilities:
Producer - Have the product ready and available on time; work with the facilitator to establish and meet schedules; clarify issues for reviewers; resolve problems identified by the inspection team.
Facilitator - Learn the information being inspected; select the reviewers; schedule and coordinate meetings; lead the discussion on the inspection issues and mediate disputes; assign responsibilities; track each problem to closure.
Reader - Become familiar with the test subject; paraphrase the information into logical 'chunks'.
Reviewer - Learn the test subject and the inspection criteria; identify discrepancies between the two before the meeting; focus on identifying problems, not solving them; remain objective; review the product, not the producer.
Recorder - Become familiar with the work product and the inspection criteria; record accurately all issues raised by the review team.
Formal Inspection
• An inspection is a formal review process that evaluates a test subject using:
1. A structured inspection process.
2. A set of inspection criteria or an input specification.
3. A meeting facilitator, a recorder, and an optional reader.
4. A trained inspection team of 3 to 6 reviewers.
5. Reviewer preparation.
6. A formal report of findings, rework, and follow-up.

• Inspections apply a set of criteria to sections of a deliverable.
• The advantages of formal inspections are:
1. They are highly structured and require closure.
2. Success criteria are very visible.
3. They expose reviewers to the process and the deliverable.
• The disadvantages of formal inspections are:
1. They are very detailed and time-consuming.
2. Reviewers must be trained in inspection and product development.
3. The inspection criteria must be sound.
Walk-Through
• A walk-through evaluates a test subject by simulating (walking through) the process that it represents, using simple test cases. For example:
1. A code walk-through simulates the process of running an application using test data.
2. A requirements walk-through simulates user events being addressed in the way that the document specifies.
• The objective of a walk-through is to provide an interactive arena, using the simulation as the basis for discussion.
• A walk-through consists of:
1. The test subject (usually a document).
2. The producer, who leads the review.
3. 3 - 5 reviewers, often subject matter experts.
4. An optional report of findings.
• Walk-throughs are very effective for evaluations that benefit from creative thought.
• The advantages of walk-throughs are:
1. They apply diverse expertise and promote creative evaluation.
2. They usually make defects easily visible to the producer.
• The disadvantages of walk-throughs are:
1. They are not structured.
2. They rely solely on reviewer expertise.
Peer Review
• A peer review is an evaluation of a deliverable done by a group of individuals who do the same kind of work.
• Peer reviews typically focus on specific aspects of the subject rather than the deliverable as a whole. For example:
1. A code review may address only programming efficiency and style.
2. A design review may focus on the accuracy of algorithms.
3. A requirements review may focus on testability.
• The advantages of a peer review are:
1. People generally learn and get motivation best from peers.
2. Diversity pays off, in terms of finding defects and learning from them.
3. Peers become familiar with other deliverables.
4. Errors found in reviews cost less to fix than those found later.
• The disadvantages of a peer review are:
1. Reviews can be very taxing.
2. Peers may be uncomfortable critiquing each other's work.
3. The technique can cause resentment.
Desk Check
• A desk check is an evaluation of a deliverable by its producer.
• Desk checks are most commonly used by designers and programmers, but they can also be used for other deliverables.
• The advantages of a desk check are:
1. The skills required to do a desk check are the same as those needed to develop the product.
2. The producer does the check, so there is no preparation time, no meetings to schedule, and no communication issues.
3. The desk check is a redundant check of the product's requirements.
• The disadvantages of a desk check are:
1. It is often difficult for an individual to identify defects in his or her own work.
2. It will probably NOT find any misinterpretations of the requirements or standards.
USE CASE OUTLINE
Transaction, Actors, Pre-Conditions, Inputs, Wrap-up, Related Use Cases, and Steps
Use Cases
• A use case is a tool that defines a system requirement from an external (user) perspective.
• It defines an atomic business event as:
1. A set of one or more business conditions
2. A series of sequenced activities
3. A set of inputs
4. A set of one or more measurable outputs.
• The business conditions define the business event, the sources for the event, and the state of the system prior to the event.
• The series of activities describes the steps taken by the user to accomplish the business event.
• The set of inputs defines the data entered into the system to accomplish the event.
• The set of one or more measurable outputs describes the transactions performed by the system and the behaviors that the system must support.
• Use cases are used in many different ways during system development.
1. Designers use this tool to remove ambiguity from models of system functionality, behavior, and interfaces.
2. Testers use this tool to explore software requirements and verify them using atomic business events (a small sketch follows).
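As a rough sketch of how a tester might work from a use case, the Python fragment below models the outline above (transaction, actors, pre-conditions, inputs, steps, and measurable outputs) and prints a test skeleton derived from it. The "withdraw cash" event and every value in it are hypothetical, invented purely for illustration.

from dataclasses import dataclass

@dataclass
class UseCase:
    # Fields mirror the use case outline above.
    transaction: str
    actors: list
    pre_conditions: list
    inputs: dict
    steps: list
    expected_outputs: dict

# Hypothetical atomic business event, for illustration only.
withdraw_cash = UseCase(
    transaction="Withdraw cash",
    actors=["Account holder"],
    pre_conditions=["Account is active", "Balance >= requested amount"],
    inputs={"account_id": "12345", "amount": 100},
    steps=["Insert card", "Enter PIN", "Select amount", "Collect cash"],
    expected_outputs={"cash_dispensed": 100, "balance_change": -100},
)

def derive_test(case: UseCase) -> str:
    """Turn a use case into a human-readable test skeleton."""
    lines = [f"Test: {case.transaction}"]
    lines += [f"  Given: {c}" for c in case.pre_conditions]
    lines += [f"  Step {i}: {s}" for i, s in enumerate(case.steps, 1)]
    lines += [f"  Expect: {k} = {v}" for k, v in case.expected_outputs.items()]
    return "\n".join(lines)

print(derive_test(withdraw_cash))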
If you verify nothing else in the entire system, DO verify the requirements!
Sample Requirements Checklist
1. Ambiguous (Do requirements contain words that can be misinterpreted by readers? Are complex subjects displayed graphically? Are assumptions stated explicitly? Do we know who/what is doing the acting at all times?)
2. Inconsistent (Do they support the objectives of preceding phases?)
3. Contradictory (Does any requirement disagree with the statements, measures, or intention of any other requirement?)
4. Incomplete (Are all user and system objectives clearly specified? Do the requirements define all the information displayed to the user? Do they address all error conditions and required responses? Data integrity mechanisms? Transaction authorization? Precision of calculations? System maintenance requirements? Recovery and reconstruction? etc.)
5. Achievable (Can each requirement be met? Can each be sustained for the life of the product? Are these requirements feasible given the project constraints?)
6. Measurable (Is each requirement quantified and measurable?)
7. Traceable (Are the requirements arranged in such a way that they can be used as a source for all subsequent software products?)
8. Changeable (Can these requirements be maintained the way that they are written?)
Suggestion: When reviewing requirements, watch out for statements that contain adjectives and adverbs rather than measurable items. For example, compare (a sketch of how the measurable version becomes a test follows the example):
- “The system must be VERY fast under all normal circumstances.” OR
- “The system must deliver 3-second response time to ad hoc queries and 1-second response time to pre-defined queries. Response time refers to the period of time between the start of a query and the appearance of the last line of output.”
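Only the second wording can be automated. Here is a minimal sketch, assuming a hypothetical run_query helper and the 3-second / 1-second limits quoted above, of what such a check might look like; the vague "VERY fast" version gives us no threshold to assert against.

import time

# Hypothetical stand-in for issuing a query against the system under test.
def run_query(query: str) -> str:
    time.sleep(0.2)  # simulate work; a real harness would call the application here
    return "last line of output"

# Thresholds taken from the measurable requirement quoted above (in seconds).
LIMITS = {"ad hoc": 3.0, "pre-defined": 1.0}

def check_response_time(query: str, query_type: str) -> bool:
    start = time.monotonic()            # start of the query
    run_query(query)                    # returns when the last line of output appears
    elapsed = time.monotonic() - start
    limit = LIMITS[query_type]
    print(f"{query_type} query took {elapsed:.2f}s (limit {limit}s)")
    return elapsed <= limit

assert check_response_time("SELECT * FROM orders", "ad hoc")
assert check_response_time("daily_sales_report", "pre-defined")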
Verifying Requirements
• When we verify requirements, we are checking that:
1. We understand the users' needs before design begins.
2. We can effectively evaluate the product in terms of these needs.
• It is critical to verify the requirements document for several reasons:
1. This is the document that records the users' expectations.
2. It is the foundation of the whole software product; all work products are indirect products of this document.
3. All work products, including testware, must trace back to specific requirements.
4. The majority of software defects originate with errors in this document.
• There are several ways to verify a requirements specification:
1. Conduct peer reviews or walk-throughs.
2. Review the document using a requirements checklist.
3. Compare it to the concept document or a specification for a competitive product.
4. Issue it to people and ask them to describe the system.
5. Develop simple use cases and see if it addresses all of the issues raised.
The following are ideas for creating a verification checklist for a functional design.
Omissions
1. Is every requirement represented in the design? Are requirements referenced?
2. Are all the screens, reports, commands, inputs, and responses included?
3. Are there enough examples and diagrams?
4. Where necessary, are the reasons for design choices explained?
Errors
1. Are there mistakes in the translation of the user requirements?
2. Are there mistakes in the definitions presented?
3. Are there errors made in calculations?
4. Are there any features that are NOT included in the requirements specification?
Ambiguity
1. Can general statements be interpreted in multiple ways?
2. Are the verbs used the best ones to explain the intended behavior?
3. Are pronouns used properly, or can their subject be misunderstood?
4. Is it always clear which is acting, the program or the user?
Verifying the Functional Design
• Software design translates product requirements into specifications of:
1. How the system will be built - its internal design.
2. How the system will function - its functional or external design.
• The functional design describes the product's interface and behavior from the perspective of the USER.
• When testers speak of verifying the software design, they are almost always referring to verifying the functional design.
• When verifying the design, we look for:
1. omissions
2. errors
3. ambiguity.
• The functional design should be presented concisely, using language and diagrams that can be readily understood by both users and developers. Even though technical reviews are usually carried out by developers, testers can learn a lot in these sessions. We might also pick up useful information by reading the internal design documents. In these we should see things like product limits, possible failure conditions, boundary conditions, and other white-box testing considerations.
Note: all these terms are discussed in the next few chapters. A code checklist is too long to present here. Most checklists cover the various kinds of common programming errors, including mistakes in (a short illustration follows the list):
i. data referencing
ii. logic
iii. computations
iv. function interfaces
v. external functions
vi. standards for naming, comments, etc.
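As a hypothetical illustration of the first three items, the short Python fragment below contains one data-referencing mistake, one logic mistake, and one computation mistake of the kind such a checklist is meant to catch; the function and data are invented for this example.

# Buggy version: each comment names the checklist category it violates.
def average_order_value(orders):
    total = 0
    for i in range(len(orders) + 1):      # logic: off-by-one, iterates one index too far
        total += orders[i]["amount"]      # data referencing: orders[len(orders)] does not exist
    return total / len(orders) + 1        # computation: the stray '+ 1' silently skews the result

# Corrected version a reviewer would expect after the inspection.
def average_order_value_fixed(orders):
    total = sum(order["amount"] for order in orders)
    return total / len(orders)

orders = [{"amount": 10.0}, {"amount": 30.0}]
print(average_order_value_fixed(orders))  # 20.0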
Verifying the Internal Design and the Code
• Other software products that are usually verified include the internal design and the program code.
• These products require technical verification, so these tests are usually carried out by developers.
• Components of the internal design are: data structures, data flows, program structure, and program logic.
1. When verifying an internal design, we look primarily for omissions and errors.
2. IEEE Recommended Practice for Software Design
• The code is the application itself; it is often verified in desk checks, walk-throughs, and inspections.
• Code inspections rely on checklists developed using standards documents and common errors.
• Code walk-throughs simulate the application running simple scenarios that get participants thinking and questioning program assumptions and implementation, as sketched below.
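That kind of walk-through can be imitated with a few lines of code. The sketch below uses a made-up discount function and made-up test data to show how reviewers would trace simple scenarios step by step and compare the values they expect with what the code actually produces.

# Code under review: a hypothetical discount rule.
def discounted_price(price, quantity):
    discount = 0.10 if quantity >= 10 else 0.0
    return round(price * quantity * (1 - discount), 2)

# Walk-through: simple scenarios the reviewers trace by hand.
# Each tuple is (price, quantity, value the reviewers expect).
scenarios = [
    (5.00, 1, 5.00),    # below the discount threshold
    (5.00, 10, 45.00),  # exactly at the threshold: is '>=' really the intended rule?
    (0.00, 10, 0.00),   # boundary case: free item
]

for price, quantity, expected in scenarios:
    actual = discounted_price(price, quantity)
    status = "OK" if actual == expected else "QUESTION"
    print(f"price={price}, qty={quantity}: expected {expected}, got {actual} [{status}]")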
Verifying Testware
• Testware, like software, should be verified as it is built.
1. For testware, the 'requirements specification' is the test plan.
2. Test plans and tests should be verified before they are used.
• Test plans should be checked for the following:
1. A clear and feasible testing strategy
2. A functional description of what is to be tested and to what degree
3. Resource requirements and schedule
4. Testing dependencies
5. Descriptions of tests or test suites
6. Realistic completion criteria.
• Tests should be checked for the following (a minimal skeleton follows this list):
1. Author, objectives, test subject, and trace information
2. Configuration, resource, and setup requirements
3. Sequenced test steps
4. Expected results that correspond with source documents
5. Evaluation and disposition criteria.
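A minimal, hypothetical test skeleton built around this checklist might look like the structure below; the field names simply mirror the list above rather than any particular test-management tool, and the login scenario is invented.

test_case = {
    "author": "J. Tester",
    "objective": "Verify that login rejects an invalid password",
    "test_subject": "Login screen",
    "trace": ["REQ-AUTH-003"],           # trace information back to a source requirement
    "setup": {"configuration": "default build", "resources": ["test user account"]},
    "steps": [                           # sequenced test steps
        "1. Open the login screen",
        "2. Enter a valid user name",
        "3. Enter an invalid password",
        "4. Click Log In",
    ],
    "expected_results": "An 'invalid credentials' message appears; no session is created",
    "disposition": None,                 # set to Pass/Fail when the test is executed
}

# A quick verification pass over the test itself: every checklist field must be present.
required = ["author", "objective", "test_subject", "trace",
            "setup", "steps", "expected_results", "disposition"]
missing = [field for field in required if field not in test_case]
print("Test is complete" if not missing else f"Missing fields: {missing}")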
Verifying User Documentation
• User documentation, when well written, improves user satisfaction and lowers customer support costs.
• We verify user documentation looking for problems of:
1. Omission, such as missing features or incomplete explanations.
2. Accuracy, ranging from typos to incorrect commands, diagrams, or references.
3. Clarity, arising from confusing or ambiguous discussion or examples.
4. Organization that makes the document less usable.
• Use requirements and functional specifications (and any other information you have) to verify the document(s).
• Check every explicit and implicit fact presented.
• Enter every keystroke in every example and try every suggestion provided.
• Check every screen diagram against the working program.
The Code Review Log is used when reviewing project code. It can be employed regardless of the verification technique selected (Formal Inspection, Walk-Through, Peer Review, or Desk Check). It provides a record of defects found during the Code Review.
At the top of the Code Review Log, enter the following:
• Project name
• Release number
• Date of review
• Component being reviewed
• Module being reviewed
• Module version number
• The names of all review participants next to their roles
In the table, record each defect found during the review. Space is provided to enter the following information for each defect (a sample record structure follows the category and type lists below):
• Page #
• Line #
• Description of defect
• Category
• Type of defect
• Severity level

Sample Categories follow:
• Extra
• Incorrect
• Missing
• Suggestion/Enhancement
Sample Types follow:
• Data
• Documentation
• Functionality
• Interface
• Logic
• Performance
• Standard
• Other
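If the log were kept electronically, one hypothetical way to capture an entry with exactly these fields (page, line, description, category, type, and severity) is sketched below; the category and type values come from the sample lists above, while the severity scale and the example defect are invented.

from dataclasses import dataclass
from enum import Enum

class Category(Enum):          # sample categories from the log
    EXTRA = "Extra"
    INCORRECT = "Incorrect"
    MISSING = "Missing"
    SUGGESTION = "Suggestion/Enhancement"

class DefectType(Enum):        # sample types from the log
    DATA = "Data"
    DOCUMENTATION = "Documentation"
    FUNCTIONALITY = "Functionality"
    INTERFACE = "Interface"
    LOGIC = "Logic"
    PERFORMANCE = "Performance"
    STANDARD = "Standard"
    OTHER = "Other"

@dataclass
class CodeReviewLogEntry:
    page: int
    line: int
    description: str
    category: Category
    defect_type: DefectType
    severity: int              # assumed scale: 1 (high) to 3 (low); hypothetical

entry = CodeReviewLogEntry(
    page=4, line=122,
    description="Loop index can exceed the array bounds",
    category=Category.INCORRECT,
    defect_type=DefectType.LOGIC,
    severity=1,
)
print(entry)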
Notes
The Code Review Summary Report should be used in conjunction with the Code Review Log.
The Code Review Log lists the defects found during the review, and the Code Review Summary Report summarizes the types of defects found by category and severity. It also documents the outcome of the review (pass or fail) and captures metrics such as the time spent on the review. The information at the top of the Code Review Summary Report is the same as on the Code Review Log, with the addition of the column “Prep Time”. In the “Prep Time” column, enter the number of hours each participant spent preparing for the review. In the “Defect types by category and severity” table, tally up the number of defect types per category and severity level. In the “Inspection Summary” table, enter the information requested in the right column. At the bottom of the form, check whether the module passed the review or failed. If the module failed, enter:
• The estimated number of hours to fix the defects
• The date the fixes should be completed
• The date of the next review
The Review Tracking Sheet is used when reviewing project documents. It can be employed regardless of the verification technique selected (Formal Inspection, Walk-Through, Peer Review, or Desk Check). It provides a record of defects found during the document review. Enter the following Document Information:
• Project name
• Release number
• Title of d document being reviewed
• Document Number
• Document Version number
• Document Date
• Date the document will be distributed for review
• Date when the document must be reviewed, OR
• Meeting review date
Enter the following Author Information:
• Name(s)
• Group (e.g., Development, Test, QA, Project Management, etc.)
• Location (e.g., Address, Building #, Floor #, Room #, Cube/Office #)
• Phone #
Enter the following Participant Information:
• Name of participant next to the corresponding role
• Group (e.g., Development, Test, QA, Project Management, etc.)
• Prep Time - Amount of time spent preparing for the review
• Rating (use the following scale)
1. Approved
2. Approved with optional comments
3. Approved after required comments are incorporated
4. Not Approved. Another review is required
Review Tracking Sheet (template layout)

Document Information
Project:             Document Distribution Date:
Release:             Must Be Reviewed by Date:
Document Title:      Meeting Review Date (if applicable):
Document Number:
Document Version:
Document Date:

Author Information
Name | Group | Location | Phone

Participant Information
Name | Role | Group | Prep Time | Rating | Initial & Date
Facilitator
Reader
Recorder
Reviewer
Notes
Enter the following Document Comments:
• A comment about the document under review
• The name of the person who submitted the comment
• Note whether the comment is optional or required
• The Status column is for use after the review. The author can use this column to:
Specify whether the comment has been incorporated into the document or not.
Note any Defect numbers opened as a result of the comment.
Enter the following Action Item information:
• Description of the action item
• Name of the person responsible for resolving the action item
• The date the action item is required to be resolved
• The status of the action item (e.g., open, closed, etc.)
