Overview

Welcome to QTP interview questions

Hello, software quality engineers. Welcome to our software testing world, covering common QuickTest Professional (QTP) technical issues, interview questions and more. For over two years I have been updating all the QTP testing interview questions I can find, and with your continuous comments and suggestions on the topics, we are growing day by day.


Wednesday, December 19, 2007

Testing Activity Measurements

QTP Life Cycle Task List

Lifecycle for a Development Project

Roles and Responsibilities in QTP Testing

STANDARD FOR LIFECYCLE REPRESENTATIONS

8.1 Lifecycle definitions will comprise the following:

- Model representation: A simple flow chart outlining the activities/tasks.
- Task list: A table containing a list of all the tasks/activities along with their entry criteria, validation requirements and exit criteria.
- Measurements: A table containing a list of measurements collected from the tasks/activities.
- Tailoring: Guidelines for tailoring the lifecycle for different situations.

8.2 Lifecycles for the following types of projects are currently defined in this procedure:

- Development project: The project involves creation of a new system or enhancement of an existing system whose requirements are usually outlined by a customer. Enhancements may include the addition of new features and requirements, or a subset of the development activities such as testing existing features and requirements thoroughly.
- Product development: The project involves creation of a new off-the-shelf product or enhancement of an existing off-the-shelf product. Enhancements may include the addition of new features and requirements, or a subset of the development activities such as testing existing features and requirements thoroughly.
- Iterative model product development: This involves development of a new off-the-shelf product or enhancement of an existing off-the-shelf product on an iterative basis.
- Porting project: The project involves customising an off-the-shelf product to a specific, defined environment.
- Conversion project: The project involves converting a software system (entirely or partially), currently operating in a specific environment, in order to make the software operate in a different environment.
- Maintenance / product support project: The project involves customer-query handling and problem corrections to existing software in order to provide support to users of the software.
- Re-engineering project: The project involves refinement of existing software to meet current documentation, coding and other procedural standards.
- Lifecycle for a small development project: The project involves development of a small project/product, or prototyping for the same, using an integrated project technical document.
- Testing project: The project involves providing a complete test service for a developed project/product.

8.3 The following conventions are used while representing the various lifecycles: an arrow signifies that the task at the tail of the arrow must be completed before the task at the head of the arrow can commence.

8.4 The process performance model identifies the individuals responsible for monitoring various business objectives and, in turn, the process performance parameters.

Project Plan - Lifecycle Procedure

Responsibilities

6.1 For each project, the Project Manager identifies any variations from the lifecycle described in this document, and also which phases of the lifecycle are to be used for project control in the project.

6.2 The Quality Manager approves and the General Manager authorises any deviations from this guideline.


7. Procedure

7.1 During project planning, the Project Manager decides on the appropriate lifecycle for the project by selecting from the various lifecycles described in this section.

7.2 The selected lifecycle for the project is then tailored to the project's needs based on:

- Tailoring criteria provided along with the lifecycle model
- Any requirements for the project specified by the customer
- Tailoring of organisation baselines based on new processes that will be piloted.

7.3 The lifecycle for the project is documented in the process handbook for the project.

7.4 The following terms are used while describing the various lifecycles:

- Business Requirements Specification (BRS) - specifies in detail the requirements to be met by the software, such as functional, performance, interface and processing requirements.
- Functional Specification (FS) - describes the implementation aspects of the BRS, such as the functional, performance, interface and processing requirements, through business modelling, use cases, screen specifications and business rules.
- Design Specification:
- High Level Design - the system is broken down into individual components. The interfaces between the components are identified.
- Detailed Design - the behaviour and composition of each of the components is defined.
- Coding - the design is translated into code in an appropriate programming language.
- Unit Testing - each of the components is tested individually to verify that the component satisfies its goals, such as functional, performance and reliability goals (see the sketch after this list).
- Integration Testing - every combination of components is tested to verify that the combination functions correctly as intended.
- System Testing - the system is built as a whole and tested to verify its conformance to the defined requirements.
- Release - the components of the software (including all documentation and information necessary to build the software) are assembled together for shipment.
- Acceptance - the activity of obtaining the customer's agreement that all the required deliverables meet their defined requirements.
- Specification of conversion - the various items to be converted - such as operating system calls, blocks of code specific to the older system, utility routines, etc. - are identified, along with how they will be converted.
- Re-engineering specification - the software items to be created or modified in order to build a system that satisfies the requirements are identified, along with the modifications to be made to them.
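
As a concrete illustration of the Unit Testing term above, here is a minimal sketch using Python's built-in unittest module; the add function and its goals are hypothetical stand-ins for a real component, not part of this procedure.

```python
import unittest

# Hypothetical component under test; a real project would import
# one of its own units here.
def add(a, b):
    """Return the sum of two numbers."""
    return a + b

class TestAdd(unittest.TestCase):
    # The component is tested individually, against its own goals.
    def test_functional_goal(self):
        self.assertEqual(add(2, 3), 5)

    def test_boundary_inputs(self):
        self.assertEqual(add(0, 0), 0)
        self.assertEqual(add(-1, 1), 0)

if __name__ == "__main__":
    unittest.main()
```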

7.5 Prototyping can be performed at any of these phases and need not conform to the structure described in this document. However, all resulting code and/or design must undergo the lifecycle steps and controls before it can be used in the product. A process such as the one for Rapid Development will have to be approved by the General Manager/Quality Manager before it can be used in a project. In addition, such a development can be taken up only if the contract/LOI allows it.

Handling overlapping of phases

This Lifecycle Procedure recognises that the phases mentioned in the previous sections may overlap due
to project schedule requirements. In such cases, the following guidelines are applicable:

- The Project Manager identifies the phase overlap as a risk in the project plan and lists the steps taken to counter these risks. These may include the steps outlined below.
- The Project Manager ensures that the inputs for an activity are available before the activity commences. (For example, design specifications for a module must be reviewed and approved before coding for the module commences, although design of other modules might not be complete.) The Project Manager also ensures that such inputs are under version control.
- When a change in an item requires a change in other items / activities for which this has been used as an input, the Project Manager confirms if the change is really necessary.
- If yes, the Project Manager ensures that revised versions / amendments of the item, along with detailed descriptions of the changes, are made available to personnel performing subsequent activities.
- The Project Manager also performs an impact analysis and revises the project schedules, if necessary. Such schedule changes are highlighted in the project status report. The Project Manager also ensures that sufficient review / testing of the changes is performed.

Testing abbreviations (terminology)

The following abbreviations are used while representing the various lifecycles.

Abbreviations
- LOI - Letter of Intent
- WO - Work Order/TSOP
- ATP - Acceptance Test Plan / Acceptance Test Planning
- AT - Acceptance Test
- STP - System Test Plan / System Test Planning
- ST - System Test
- ITP - Integration Test Plan / Integration Test Planning
- IT - Integration Test
- UTP - Unit Test Plan / Unit Test Planning
- UT - Unit Test
- BRS - Business Requirements Specification
- FS - Functional Specification
- DS - Design Specification
- HLD - High Level Design
- DD - Detailed Design

Fundamentals of Software Testing

Verification Testing
Verification is the process of examining a product to find its defects.
• Verification techniques include:
1. Desk checks
2. Peer reviews
3. Walk-throughs
4. Formal inspections
• Verification tools include:
1. Standards
2. Checklists
3. Static analyzers
4. Use cases
5. Judgment and experience
• The objective of verification testing is to find defects
1. Using human examination and experience.
2. As early as possible in the life of the defect.
Inspection criteria:
1. If an input specification is used, it has been previously verified.
2. Special criteria can occur as a list of questions or a targeted checklist.
3. Inspection criteria are usually tied to the type of deliverable rather than the software product being developed. The typical formal inspection assigns the following responsibilities:
Producer - Have the product ready and available on time; work with the facilitator to establish and meet schedules; clarify issues for reviewers; resolve problems identified by the inspection team.
Facilitator - Learn the information being inspected; select the reviewers; schedule and coordinate meetings; lead the discussion on the inspection issues and mediate disputes; assign responsibilities; track each problem to closure.
Reader - Become familiar with the test subject; paraphrase the information into logical 'chunks'.
Reviewer - Learn the test subject and the inspection criteria; identify discrepancies between the two before the meeting; focus on identifying problems, not solving them; remain objective; review the product, not the producer.
Recorder - Become familiar with the work product and the inspection criteria; record accurately all issues raised by the review team.
Formal Inspection
• An inspection is a formal review process that evaluates a test subject using:
1. A structured inspection process.
2. A set of inspection criteria or an input specification.
3. A meeting facilitator, a recorder, and an optional reader.
4. A trained inspection team of 3 to 6 reviewers.
5. Reviewer preparation.
6. A formal report of findings, rework, and follow-up.

• Inspections apply a set of criteria to sections of a deliverable.
• The advantages of formal inspections are:
1. They are highly structured and require closure.
2. Success criteria are very visible.
3. They expose reviewers to the process and the deliverable.
• The disadvantages of formal inspections are:
1. They are very detailed and time-consuming.
2. Reviewers must be trained in inspection and product development.
3. The inspection criteria must be sound.
Walk-Through
• A walk-through evaluates a test subject by simulating (walking through) the process that it represents, using simple test cases. For example:
1. A code walk-through simulates the process of running an application using test data.
2. A requirements walk-through simulates user events being addressed in the way that the document specifies.
• The objective of a walk-through is to provide an interactive arena using the simulation as the basis for discussion.
• A walk-through consists of:
1. The test subject (usually a document).
2. The producer, who leads the review.
3. 3-5 reviewers, often subject matter experts.
4. An optional report of findings.
• Walk-throughs are very effective for evaluations that benefit from creative thought.
• The advantages of walk-throughs are:
1. They apply diverse expertise and promote creative evaluation.
2. They usually make defects easily visible to the producer.
• The disadvantages of walk-throughs are:
1. They are not structured.
2. They rely solely on reviewer expertise.
Peer Review
• A peer review is an evaluation of a deliverable done by a group of individuals who do the same kind of work.
• Peer reviews typically focus on specific aspects of the subject rather than the deliverable as a whole. For example:
1. A code review may address only programming efficiency and style.
2. A design review may focus on the accuracy of algorithms.
3. A requirements review may focus on testability.
• The advantages of a peer review are:
1. People generally learn and get motivation best from peers.
2. Diversity pays off, in terms of finding defects and learning from them.
3. Peers become familiar with other deliverables.
4. Errors found in reviews cost less to fix than those found later.
• The disadvantages of a peer review are:
1. Reviews can be very taxing.
2. Peers may be uncomfortable critiquing each other's work.
3. The technique can cause resentment.
Desk Check
• A desk check is an evaluation of a deliverable by its producer.
• Desk checks are most commonly used by designers and programmers, but they can also be used for other deliverables.
• The advantages of a desk check are:
1. The skills required to do a desk check are the same as those needed to develop the product.
2. The producer does the check, so there is no preparation time, no meetings to schedule, and no communication issues.
3. The desk check is a redundant check of the product's requirements.
• The disadvantages of a desk check are:
1. It is often difficult for an individual to identify defects in his or her own work.
2. It will probably NOT find any misinterpretations of the requirements or standards.
USE CASE OUTLINE
Transaction, Actors, Pre-Conditions, Inputs, Wrap up, Related Use Cases and Steps
Use Cases
• A use case is a tool that defines a system requirement from an external (user) perspective.
• It defines an atomic business event as:
1. A set of one or more business conditions
2. A series of sequenced activities
3. A set of inputs
4. A set of one or more measurable outputs.
• The business conditions define the business event, the sources for the event, and the state of the system prior to the event.
• The series of activities describes the steps taken by the user to accomplish the business event.
• The set of inputs defines the data entered into the system to accomplish the event.
• The set of one or more measurable outputs describes the transactions performed by the system and the behaviors that the system must support.
• Use cases are used in many different ways during system development.
1. Designers use this tool to remove ambiguity from models of system functionality, behavior, and interfaces.
2. Testers use this tool to explore software requirements and verify them using atomic business events.
If you verify nothing else in the entire system, DO verify the requirements!
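
As a rough illustration of the outline above, the use case fields can be captured in a small data structure; this is a sketch only, and the UseCase class and the cash-withdrawal values are invented for illustration.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class UseCase:
    """One atomic business event, following the outline above."""
    transaction: str                       # the business event
    actors: List[str]                      # sources for the event
    pre_conditions: List[str]              # state of the system prior to the event
    inputs: List[str]                      # data entered to accomplish the event
    steps: List[str]                       # sequenced user activities
    outputs: List[str]                     # measurable outputs / behaviours
    related_use_cases: List[str] = field(default_factory=list)

# Illustrative instance a tester might write to explore a requirement.
withdraw_cash = UseCase(
    transaction="Withdraw cash",
    actors=["Account holder"],
    pre_conditions=["Card is valid", "Account balance >= amount"],
    inputs=["PIN", "Withdrawal amount"],
    steps=["Insert card", "Enter PIN", "Choose amount", "Take cash"],
    outputs=["Cash dispensed", "Balance reduced by amount"],
)
```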
Sample Requirements Checklist
1. Ambiguous (Do requirements contain words that can be misinterpreted by readers? Are complex subjects displayed graphically? Are assumptions stated explicitly? Do we know who/what is doing the acting at all times?)
2. Inconsistent (Do they support the objectives of preceding phases?)
3. Contradictory (Does any requirement disagree with the statements, measures, or intention of any other requirement?)
4. Incomplete (Are all user and system objectives clearly specified? Do the requirements define all the information displayed to the user? Do they address all error conditions and required responses? Data integrity mechanisms? Transaction authorization? Precision of calculations? System maintenance requirements? Recovery and reconstruction? etc.)
5. Achievable (Can each requirement be met? Can each be sustained for the life of the product? Are these requirements feasible given the project constraints?)
6. Measurable (Is each requirement quantified and measurable?)
7. Traceable (Are the requirements arranged in such a way that they can be used as a source for all subsequent software products?)
8. Changeable (Can these requirements be maintained the way that they are written?)
Suggestion: When reviewing requirements, watch out for statements that contain adjectives and adverbs rather than measurable items. For example, compare:
- “The system must be VERY fast under all normal circumstances.” OR
- “The system must deliver 3 second response time to ad hoc queries and 1 second response time to pre-defined queries. Response time refers to the period of time between the start of a query and the appearance of the last line of output.”
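
The measurable version of this requirement can be checked mechanically, which the vague version cannot. A minimal sketch, where the run_query helper is a hypothetical stand-in for the real system under test:

```python
import time

# Hypothetical stand-in for issuing a query against the system
# under test; the 0.5 s sleep simulates real work.
def run_query(sql):
    time.sleep(0.5)

def response_time(sql):
    """Period between the start of a query and the last line of output."""
    start = time.monotonic()
    run_query(sql)
    return time.monotonic() - start

# Thresholds quoted in the measurable requirement above.
assert response_time("SELECT * FROM accounts") <= 3.0, "ad hoc query too slow"
assert response_time("CALL predefined_report()") <= 1.0, "pre-defined query too slow"
```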
Verifying Requirements
• When we verify requirements we are checking that:
1. We understand the users' needs before design begins.
2. We can effectively evaluate the product in terms of these needs.
• It is critical to verify the requirements document for several reasons:
1. This is the document that records the users' expectations.
2. It is the foundation of the whole software product; all work products are indirect products of this document.
3. All work products, including testware, must trace back to specific requirements.
4. The majority of software defects originate with errors in this document.
• There are several ways to verify a requirements specification:
1. Conduct peer reviews or walk-throughs.
2. Review the document using a requirements checklist.
3. Compare it to the concept document or a specification for a competitive product.
4. Issue it to people and ask them to describe the system.
5. Develop simple use cases and see if it addresses all of the issues raised.
The following are ideas for creating a verification checklist for a functional design.
Omissions
1. Is every requirement represented in the design? Are requirements referenced?
2. Are all the screens, reports, commands, inputs, and responses included?
3. Are there enough examples and diagrams?
4. Where necessary, are the reasons for design choices explained?
Errors
1. Are there mistakes in the translation of the user requirements?
2. Are there mistakes in the definitions presented?
3. Are there errors made in calculations?
4. Are there any features that are NOT included in the requirements specification?
Ambiguity
1. Can general statements be interpreted in multiple ways?
2. Are the verbs used the best ones to explain the intended behavior?
3. Are pronouns used properly, or can their subject be misunderstood?
4. Is it always clear which is acting, the program or the user?
Verifying the Functional Design
• Software design translates product requirements into specifications of:
1. How the system will be built - its internal design.
2. How the system will function - its functional or external design.
• The functional design describes the product's interface and behavior from the perspective of the USER.
• When testers speak of verifying the software design, they are almost always referring to verifying the functional design.
• When verifying the design, we look for:
1. Omissions
2. Errors
3. Ambiguity.
• The functional design should be presented concisely, using language and diagrams that can be readily understood by both users and developers. Even though technical reviews are usually carried out by developers, testers can learn a lot in these sessions. We might also pick up useful information by reading the internal design documents. In these we should see things like product limits, possible failure conditions, boundary conditions, and other white box testing considerations.
Note: all these terms are discussed in the next few chapters. A code checklist is too long to present here. Most checklists cover the various kinds of common programming errors, including mistakes in:
i. data referencing
ii. logic
iii. computations
iv. function interfaces
v. external functions
vi. standards for naming, comments, etc.
Verifying the Internal Design and the Code
• Other software products that are usually verified include the internal design and the program code.
• These products require technical verification, so these tests are usually carried out by developers.
• Components of the internal design are: data structures, data flows, program structure and program logic.
1. When verifying an internal design, we look primarily for omissions and errors.
2. IEEE Recommended Practice for Software Design
• The code is the application itself; it is often verified in desk checks, walk-throughs, and inspections.
• Code inspections rely on checklists developed using standards documents and common errors.
• Code walk-throughs simulate the application running simple scenarios that get participants thinking and questioning program assumptions and implementation.
Verifying Testware
• Testware, like software, should be verified as it is built.
1. For testware, the 'requirements specification' is the test plan.
2. Test plans and tests should be verified before they are used.
• Test plans should be checked for the following:
1. A clear and feasible testing strategy
2. A functional description of what is to be tested and to what degree
3. Resource requirements and schedule
4. Testing dependencies
5. Descriptions of tests or test suites
6. Realistic completion criteria.
• Tests should be checked for the following (see the sketch after this list):
1. Author, objectives, test subject, and trace information
2. Configuration, resource, and setup requirements
3. Sequenced test steps
4. Expected results that correspond with source documents
5. Evaluation and disposition criteria.
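
One way to make this checklist operational is to give each test a fixed record and flag missing fields. A sketch only: the TestRecord class and the one-expected-result-per-step rule are assumptions for illustration, not part of the checklist itself.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TestRecord:
    """Fields a test should carry, per the checklist above."""
    author: str
    objective: str
    test_subject: str
    trace_to: List[str]       # requirement IDs the test traces back to
    setup: List[str]          # configuration, resource and setup requirements
    steps: List[str]          # sequenced test steps
    expected: List[str]       # expected results tied to source documents
    disposition: str = ""     # evaluation / disposition criteria

def verify_test(t: TestRecord) -> List[str]:
    """Return the checklist items the test fails to satisfy."""
    problems = []
    if not t.trace_to:
        problems.append("no trace information")
    # Assumption for illustration: one expected result per test step.
    if len(t.expected) != len(t.steps):
        problems.append("steps without corresponding expected results")
    if not t.disposition:
        problems.append("no evaluation/disposition criteria")
    return problems
```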
Verifying User Documentation
• User documentation, when well written, improves user satisfaction and lowers customer support costs.
• We verify user documentation looking for problems of:
1. Omission, such as missing features or incomplete explanations.
2. Accuracy, ranging from typos to incorrect commands, diagrams, or references.
3. Clarity, arising from confusing or ambiguous discussion or examples.
4. Organization that makes the document less usable.
• Use requirements and functional specifications (and any other information you have) to verify the document(s).
• Check every explicit and implicit fact presented.
• Enter every keystroke in every example and try every suggestion provided.
• Check every screen diagram against the working program.
The Code Review Log is used when reviewing project code. It can be employed regardless of the verification technique selected (Formal Inspection, Walk-Through, Peer Review or Desk Check). It provides a record of defects found during the code review.
At the top of the Code Review Log enter the following:
• Project name
• Release number
• Date of review
• Component being reviewed
• Module being reviewed
• Module version number
• The names of all review participants next to their role
In the table, record each defect found during the review. Space is provided to enter the following information for each defect:
• Page #
• Line #
• Description of defect
• Category
• Type of defect
• Severity level

Sample Categories follow:
• Extra
• Incorrect
• Missing
• Suggestion/Enhancement
Sample Types follow:
• Data
• Documentation
• Functionality
• Interface
• Logic
• Performance
• Standard
• Other
Notes
The Code Review Summary Report should be used in conjunction with the Code Review Log.
The Code Review Log lists the defects found during the review, and the Code Review Summary Report summarizes the types of defects found by category and severity. It also documents the outcome of the review (pass or fail) and captures metrics such as time spent on the review. The information at the top of the Code Review Summary Report is the same as the Code Review Log, with the addition of the column “Prep Time”. In the “Prep Time” column enter the number of hours each participant spent preparing for the review. In the “Defect types by category and severity” table, tally up the number of defect types per category and severity level (see the sketch after this section). In the “Inspection Summary” table, enter the information requested in the right column. At the bottom of the form check whether the module passed the review or if it failed. If the module failed, enter:
• The estimated number of hours to fix the defects
• The date the fixes should be completed
• The date of the next review
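
The “defect types by category and severity” tally described above is simple to automate; a minimal sketch using Python's collections.Counter over hypothetical Code Review Log entries:

```python
from collections import Counter

# Hypothetical Code Review Log entries: (category, type, severity).
log = [
    ("Missing",   "Logic",         "High"),
    ("Incorrect", "Data",          "Medium"),
    ("Incorrect", "Logic",         "High"),
    ("Extra",     "Documentation", "Low"),
]

# Tally defects per category and per severity level, as the
# Code Review Summary Report requires.
by_category = Counter(category for category, _, _ in log)
by_severity = Counter(severity for _, _, severity in log)

print(by_category)  # Counter({'Incorrect': 2, 'Missing': 1, 'Extra': 1})
print(by_severity)  # Counter({'High': 2, 'Medium': 1, 'Low': 1})
```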
The Review Tracking Sheet is used when reviewing project documents. It can be employed regardless of the verification technique selected (Formal Inspection, Walk-Through, Peer Review or Desk Check). It provides a record of defects found during the document review. Enter the following Document Information:
• Project name
• Release number
• Title of the document being reviewed
• Document number
• Document version number
• Document date
• Date the document will be distributed for review
• Date when the document must be reviewed OR
• Meeting review date
Enter the following Author Information:
• Name(s)
• Group (e.g., Development, Test, QA, Project Management, etc.)
• Location (e.g., Address, Building #, Floor #, Room #, Cube/Office #)
• Phone #
Enter the following Participant Information:
• Name of participant next to corresponding role
• Group (e.g., Development, Test, QA, Project Management, etc.)
• Prep Time - amount of time spent preparing for the review
• Rating (use the following scale)
1. Approved
2. Approved with optional comments
3. Approved after required comments are incorporated
4. Not Approved. Another review is required
Review Tracking Sheet (template outline)
Document Information:
• Project / Document Distribution Date
• Release / Must Be Reviewed by Date
• Document Title / Meeting Review Date (if applicable)
• Document Number
• Document Version
• Document Date
Author Information:
• Name, Group, Location, Phone
Participant Information (one row each for Facilitator, Reader, Recorder and Reviewer):
• Name, Role, Group, Prep Time, Rating, Initial & Date
Notes
Enter the following Document Comments:
• A comment about the document under review
• The name of the person who submitted the comment
• Note whether the comment is optional or required
• The Status column is for use after the review. The author can use this column to:
- Specify whether the comment has been incorporated into the document or not.
- Note any defect numbers opened as a result of the comment.
Enter the following Action Item information:
• Description of the action item
• Name of the person responsible for resolving the action item
• The date the action item is required to be resolved
• The status of the action item (e.g., open, closed, etc.)

QTP Interview questions (faqs) set 9

1. Tell me about yourself.
2. Do you know any testing tools?
3. How many projects have you executed using WinRunner?
4. Which version of WinRunner have you used?
5. Do you know TestDirector?
6. What is your role as a Senior Test Engineer?
7. Tell me the contents of a test plan.
8. How do you do effort estimation?
9. Do you know anything about trade finance?
10. What is an LC?
11. What is a revolving LC and a non-revolving LC?
12. Do you know anything about core banking?
13. What are NOSTRO and VOSTRO accounts?
14. What is a payment gateway?
15. Do you know anything about SWIFT messages?
16. Tell me any one MT which is used frequently in payment systems.
17. What is a third-party transfer?
18. What is the difference between QA and QC?
19. Are you a QA tester or a QC tester?
20. What is white box testing?
21. What is black box testing?
22. What is gray box testing?
23. What is full and partial regression testing?
24. Tell me the defect life cycle (see the sketch after this list).
25. How will you assign severity to identified bugs?
26. What is defect dimension?
27. Have you written test cases for any project?
28. How many projects have you executed? Tell me the project names.
29. Have you done UAT for any project?
30. Have you done performance testing using LoadRunner?
31. Are you willing to work UK hours and weekends?
32. What is boundary value analysis?
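
For question 24 above, one common shape of the defect life cycle can be sketched as a state-transition table. State names and transitions vary between organisations, so treat this as an illustration, not a standard:

```python
# A typical defect life cycle; one common flow, not a universal standard.
DEFECT_TRANSITIONS = {
    "New":      ["Open", "Rejected", "Deferred"],
    "Open":     ["Fixed"],
    "Fixed":    ["Retest"],
    "Retest":   ["Closed", "Reopened"],
    "Reopened": ["Fixed"],
    "Rejected": ["Closed"],
    "Deferred": ["Open"],
}

def can_move(current, target):
    """True if a defect may move from `current` status to `target`."""
    return target in DEFECT_TRANSITIONS.get(current, [])

assert can_move("Retest", "Reopened")   # fix failed retest
assert not can_move("Closed", "Fixed")  # closed defects stay closed
```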

QTP Interview questions (faqs) set 8

1) What is the difference between black box testing and white box testing?
2) What are a test strategy document and a test approach document?
3) What is ad hoc testing?
4) Tell me about two critical bugs you found in your last project.
5) What are testing techniques?
6) What are boundary value analysis and equivalence partitioning?
7) Do you go out of your module to find bugs?
8) Tell me about yourself.
9) Can you brief me about your previous project?
10) Do you have any experience in the banking domain?
11) Do you have any knowledge of brokerage?
12) What is sanity testing?
13) Were you involved in preparation of the test strategy and test document?
14) What do you feel are the qualities of a test lead?
15) What are NOSTRO and VOSTRO accounts?
16) What are a current account and a savings account?
17) What is regression testing?

QTP Interview questions (faqs) set 7

a) How do you measure "100% bug free" through black box testing?
b) How comfortable are you with TSL (WinRunner)?
c) What is your current responsibility?
d) What is ad hoc testing? Why do people do ad hoc testing?
e) SQL
1. Tell me about yourself.
2. What is the difference between QA and QC?
3. Which version of WinRunner have you used?
4. Do you know TestDirector?
5. What is your role as a Test Engineer?
6. Tell me the contents of a test plan.
7. Which project are you currently handling?
8. Is it retail banking or corporate banking?
9. Do you know anything about core banking?
10. What are NOSTRO and VOSTRO accounts?
11. What is white box testing?
12. What is black box testing?
13. What is glass box testing?
14. What is regression testing?
15. Tell me the defect life cycle.
16. Have you written test cases for any project?
17. How many projects have you executed? Tell me the project names.
18. What is exhaustive testing?
19. Without an FS, what would your testing approach be?
20. Do you know SilkTest?
21. Which version of Silk did you use in your project?
22. Do you know Silk Performer?
23. What is the difference between Silk and WinRunner?
24. Are you willing to work UK hours and weekends?
25. What are the techniques used in testing?
26. What are BVA and equivalence partitioning?

QTP Interview questions (faqs) set 6

1) Explain different types of defect category, with examples.
2) What is the difference between black box and white box testing?
3) Explain your project with its workflow (current and previous project).
4) How are you assured your module is 100% tested using black box testing?
5) Explain defect tracking.
6) Explain test condition and test script.
7) What are your roles and responsibilities in the current project and the previous project?
8) How do you assure that testing is 100% complete in black box testing?

QTP Interview questions (faqs) set 5

1) What is the difference between black box and white box testing?
2) What is the difference between integration and system testing?
3) What are the recording modes in WinRunner?
4) What are regular expressions in WinRunner?
5) What is the difference between retesting and regression testing?
6) Explain different types of defect category.
7) Have you prepared any test plan? Explain.
8) What were the deliverable documents in your previous project?

QTP Interview questions (faqs) set 4

1. What is the difference between SDLC and SLC? Are they the same or different?
2. What is the testing life cycle?
3. What are the phases of testing?
4. What is the difference between severity and priority?
5. What is a test strategy?
6. How do you document the test conditions?
7. What is a mortgage loan?
8. What is the procedure to approach testing if the file specification is not available?
9. What are VOSTRO and NOSTRO accounts?
10. What is a pre-payment penalty?

QTP Interview questions (faqs) set 3

1. What is the testing life cycle?
2. What are the phases of testing?
3. What is a test strategy?
4. What is the difference between severity and priority?
5. Give examples of high/medium/low severity and priority.
6. What are the contents of a test strategy?
7. Define regression testing (do not explain).
8. What is defect management? How will you escalate and close defects?
9. What is the difference between integration testing and system testing?
10. What is the difference between system testing and UAT?
11. Where are severity and priority defined (classification)?
12. What are the contents of a defect report?
13. What model are you using for testing? Explain the V model.
14. What is grey box testing?

QTP Interview questions (faqs) set 2

1. What is defect density? (See the worked example after this list.)
2. What is a test log?
3. What are test effectiveness and efficiency?
4. How do you test the date and time through WinRunner, compared to the system?
5. Give me a traceability matrix format.
6. Give me negative test cases for uninstalling the software.
7. Give me negative test cases for an Excel sheet.
8. How do you do load testing manually?
9. Give an example of low priority, high severity.
10. Give an example of high priority, low severity.
11. What are entry criteria? What are their contents?
12. What are exit criteria?
13. What are acceptance criteria?
14. What are suspension criteria?
15. What are the disadvantages of bitmap checkpoints?
16. What is configuration management? Which tools are used?
17. What is a decision table, and where is it used?
18. How will you estimate the time to write the test cases? On what basis will you write the test cases?
19. How will you calculate the time to execute the test cases?
20. I have a client-server application where a client request to the server is taking 3 minutes. How will you test it?
21. How will you check the performance at this stage?
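
For question 1 above, defect density is usually reported as defects found per unit of product size, most often per thousand lines of code (KLOC). A minimal worked example with invented numbers:

```python
# Defect density: defects found per unit of size, commonly per KLOC.
defects_found = 45
size_kloc = 12.5          # 12,500 lines of code

defect_density = defects_found / size_kloc
print(f"{defect_density:.1f} defects per KLOC")  # 3.6 defects per KLOC
```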

QTP Interview questions (faqs) set 1

1. Explain the bug life cycle, and briefly explain the 'Status' field as well.
Example: if the bug is resolved, rejected, or re-opened, what status is the bug in at that point?
2. What is a traceability matrix?
3. If you had 10 days to deliver the product but the CEO of your organization tells you to deliver it in 5 days, what testing techniques or strategy will you follow in order to deliver it in that short time (5 days)?
4. What testing process are you using in your organization?
5. During the (initial) requirements phase, do you do any testing?
6. When do you stop testing?
7. What are the test case guidelines?
8. You have a scenario where there is a box to enter values from 0 to 1000.
9. What test data will you use? (See the sketch after this list.)
10. What are the positive and negative test cases?
11. How will you use equivalence partitioning?
12. If you have test cases already prepared and you have to execute those test cases in a very short time, how will you decide which test cases have to be executed first? Is there any technique?
13. During execution time, how will the testers calculate the number of test cases to be run? Is there any specific way to calculate it?
14. If you are executing test cases and you find some extra bugs in the application, but you have not written such scenarios in the test cases, what will you do?
15. If your project contains multiple bugs, how will you deliver it to the client in a very short span of time?
16. During regression testing, how will the testers judge or calculate whether the changed module (done by developers) will affect the rest of the modules or not?
17. Since the testers are just doing black box testing, how will the testers know whether the code change will affect the rest of the modules or not?
18. If the developers are unable to understand a bug sent by the testers, how will the testers convince or explain it to the developer?
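
For questions 8-11 above (a box accepting values from 0 to 1000), equivalence partitioning and boundary value analysis give a concrete test data set. A sketch, with a hypothetical accepts function standing in for the application under test:

```python
# A box accepting values 0 to 1000. Equivalence partitioning picks one
# representative per class; boundary value analysis adds the edges.
LOW, HIGH = 0, 1000

valid_partition    = [500]              # any one value inside 0..1000
invalid_partitions = [-500, 1500]       # below range, above range
boundary_values    = [LOW - 1, LOW, LOW + 1, HIGH - 1, HIGH, HIGH + 1]

def accepts(value):
    """Hypothetical system under test: accepts values in [0, 1000]."""
    return LOW <= value <= HIGH

for v in valid_partition + invalid_partitions + boundary_values:
    expected = LOW <= v <= HIGH          # positive or negative case
    assert accepts(v) == expected, f"unexpected result for {v}"
```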