Test Strategy for Performance Testing
Title of test: Test Strategy for Performance Testing
Description: MF302



Once the overall Test Strategy is defined for performance testing, a detailed test plan with dates, activities, people responsible and other details is to be defined. (True / False)

A performance test plan is prepared based on the analysis of the requirements collected. (True / False)

To unambiguously understand, plan, schedule and conduct performance testing within the constraints of time and budget, a Performance Test Strategy is a must. (True / False)

Identify the items that are dependent on the Test Strategy:
- System appreciation
- NFR capture
- NFR validation
- Workload profiling
- Understanding how performance testing will be carried out
- Identifying the critical scenarios to be tested
- Deciding on human, software and hardware resources
- Deciding on the test environment and the tools to be used for monitoring the parameters on each tier
- Requirements elicitation

Match each description with the type of testing it defines (options for each: Load Testing / Endurance Testing / Stress Testing / Spike Testing / Volume Testing; a minimal load-test sketch follows this question block):
- Conducted to determine response time by varying the user load.
- Conducted to test the reliability of the system or to check for memory leaks.
- Conducted to determine the maximum load a system can handle.
- Conducted by applying a sudden burst of load.
- Testing of a database with a large volume of data.

Performance Test Strategy planning and definition falls under which of the following SDLC stages? (Requirements Elicitation / Architecture and Design / Build / Testing / Deployment / Post-Deployment / Maintenance)

The performance testing methodology is not iterative in nature. (False / True)

Performance testing is taken as completed only when the exit criteria defined in the performance test plan are met. (True / False)

If the application does not meet the SLA, the performance parameters monitored during the test should be reported to the development team to identify problem areas. (True / False)

Identify the phase on which the first focus of the PTLC falls:
- Non-functional requirements from a performance perspective
- Finalizing the test strategy
- Deciding on the human, software and hardware resources required for the project

Order the following in the correct sequence: 1) Test Strategizing and Planning Overview, 2) Performance Test Requirements Gathering, 3) Test Analysis and Recommendation, 4) Performance Test Execution and Result Generation, 5) Test Design Overview. (1,2,3,4,5 / 2,1,5,4,3 / 5,4,3,2,1 / 5,4,1,2,3)

Order the following components of a Performance Test Strategy correctly: 1) Test Data, 2) Test Scenarios, 3) Risks and Mitigation Plans, 4) Performance Test Scope, 5) Parameters to be Monitored, 6) Test Environment, 7) Stakeholders, 8) Roles and Responsibilities, 9) Objectives of Performance Testing. (9,7,8,3,4,2,5,6,1 / 1,2,3,4,5,6,7,8,9 / 9,7,8,2,4,3,5,6,1 / 9,8,7,6,5,4,3,2,1)

Identify the objectives of performance testing:
- Determine application performance at various user loads
- Identify performance bottlenecks in the applications
- Predict the peak processing ability of the application before deployment
- Aid in capacity projection
- Increase uptime of mission-critical applications
- Analyze the effect of hardware and/or software changes
- External interfaces
- Resource intensiveness
- Response time minimization
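The distinction drawn above between load, endurance, stress, spike and volume testing is easiest to see in code. The following is a minimal sketch, not part of the original material: it assumes a hypothetical local endpoint and uses only Python's standard library in place of a dedicated tool such as JMeter or LoadRunner. It drives a fixed number of virtual users for a fixed duration and reports throughput and median response time; rerunning it with increasing user counts is load testing, pushing the count until the system fails is stress testing, and stretching the duration is endurance testing.

```python
import statistics
import threading
import time
from urllib.request import urlopen

TARGET_URL = "http://localhost:8080/"  # hypothetical application under test

def virtual_user(duration_s, samples):
    """One simulated user: issue requests in a loop, recording response times."""
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        start = time.monotonic()
        try:
            urlopen(TARGET_URL, timeout=10).read()
            samples.append(time.monotonic() - start)
        except OSError:
            pass  # a real test would count errors as well

def run_load_test(user_count, duration_s=60):
    """Run user_count concurrent virtual users and summarize the results."""
    samples = []
    threads = [threading.Thread(target=virtual_user, args=(duration_s, samples))
               for _ in range(user_count)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    if samples:
        print(f"{user_count:>3} users: {len(samples) / duration_s:6.1f} req/s, "
              f"median response {statistics.median(samples):.3f}s")

# Varying the user load and observing response time -- the definition of
# load testing in the question above.
for users in (1, 5, 10, 25):
    run_load_test(users)
```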
For each of the following, is it a useful question when setting a performance measure? (False / True)
- External interfaces
- Is the system capable of handling peak load?
- The type of workload
- Does the system provide consistency in performance?
- Is the system tuned optimally?
- Is the response time acceptable?
- Is the system capable of handling minimum load?
- Is the system scalable?

Identify the stakeholders in the PTLC:
- Client Senior Managers
- End Users
- Stakeholder Departments
- Call Center
- Performance Test Architects
- Performance Test Designers
- Performance Test Leads and Engineers
- DBAs
- Data Center Group

Identify the false statements:
- Performance test results must be compiled and presented to stakeholders.
- The strategy must satisfy a minimum of two groups to proceed.
- All stakeholders expect similar kinds of results that conform to a common SLA.
- The Test Strategy must capture the SLA for all the groups of stakeholders.

Overall ownership of the functioning and results of the performance testing exercise lies with: (Performance Test Manager / Performance Test Engineer / Performance Testing Analyst / Development Team / Performance Engineer)

Match each responsibility with the role that owns it (options for each: Performance Test Manager / Performance Test Engineer / Performance Testing Analyst / Development Team / Performance Engineer):
- Guides the test analyst in the preparation of the test plan.
- Interfaces with the customer, Infosys Offshore, product vendors and other consulting organizations.
- Analyzes test results and recommends migration to production.
- Guides the infrastructure expert in tuning.
- Prepares and executes test scripts.
- Prepares and executes test data creation scripts.
- Prepares the performance test plan.
- Manages changes to the test plan/testing strategy during the testing cycle.
- Interfaces with Infosys Offshore for effective resolution of issues to be fixed in code.
- Sets up the application on the test infrastructure provided.
- Provides scripts for database loading and sanitization.
- Analyzes the test results for tuning the application and removes any identified bottlenecks.
- Develops the Test Strategy.
- Projects the scale factor.

Identify the true statements about risks and mitigation plans:
- Risks are a major concern for successfully running test scenarios.
- Risks and mitigation plans must be detailed.
- Risks may come from the customer side or the service provider end.
- Risks are mainly documented from the customer side.

When sufficient production data is not available (a data-generation sketch follows this question block):
- The client will be responsible for providing a realistic volume of data, or a proper data creation strategy will be used based on realistic estimates of the data in the production scenario.
- This is a Performance Test Manager problem.
- The development team will be responsible for providing the realistic volume of data.

When exact industry-standard benchmarks for the hardware in production and test are not available:
- Benchmarks will be ordered from the hardware vendor, or comparable benchmarks will be used for projection.
- The development team will set new benchmarks based on their past experience.
- Customer department teams will set new benchmarks based on their past experience.
- Benchmarks will be requested by the Performance Test Analyst from the client, and the client will in turn order them from the vendor.

Match each item with the Test Strategy component in which it appears (options for each: Test Scenarios / Risks and Mitigation Plans / Performance Test Scope / Test Environment / Parameters to be Monitored / Stakeholders / Roles and Responsibilities / Objectives of Performance Testing):
- A clear understanding of the project scope is mentioned in it.
- The type of testing to be carried out is mentioned in it.
- The test environment must be defined in it.
- What is out of scope must be given explicitly in it.
- The types of testing are identified in it.
- The criteria for selection are identified in it.
- The performance testing transactions are identified in it.

Once performance-critical transactions are identified, the type of testing applicable needs to be decided. (True / False)
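The "proper data creation strategy" option above usually amounts to a small generator script when production data cannot be shared. Below is a hedged sketch only: the customer-record schema, field names and row count are invented for illustration, and a real strategy would size and skew the data from realistic production estimates.

```python
import csv
import random
import string

def random_name(rng):
    """Build a plausible-looking name from random lowercase letters."""
    return "".join(rng.choices(string.ascii_lowercase, k=8)).capitalize()

def generate_test_data(path, row_count, seed=42):
    """Write row_count synthetic customer records to a CSV file.

    The seed keeps runs reproducible, so two performance tests can start
    from identical data after a database refresh."""
    rng = random.Random(seed)
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["customer_id", "name", "balance", "region"])
        for i in range(row_count):
            writer.writerow([
                i + 1,
                random_name(rng),
                round(rng.uniform(0, 100_000), 2),
                rng.choice(["NORTH", "SOUTH", "EAST", "WEST"]),
            ])

# Scale the row count to the estimated production volume.
generate_test_data("customers.csv", 100_000)
```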
For the performance-critical transactions that have been chosen, strategic scenarios that exercise the code well are to be selected, and the appropriate volumes and workload mix need to be derived. (False / True)

Based on the test criteria chosen, match the following:
- The test is carried out for a single transaction.
- Identify bottlenecks in a given transaction and tune the code based on the results.
- The test is done for the identified critical transactions at different load levels to determine scalability.
- Transactions are selected based on workload analysis.
- Simulate a real-life load situation and identify performance issues.
- Utilization levels will be found at various CPU loads and used for capacity projection.
- Infrastructure validation is based on the metrics collected during this test.
- The test is conducted for a longer duration with a fixed number of users to simulate the real-life scenario.
- Identify the stability of an application running for a longer duration and identify memory leaks.
- Critical use cases are used and the test is carried out for long durations.

Identify the performance parameters to be monitored (a small reporting sketch follows this question block):
- Throughput of the system, measured in transactions per second at the server end.
- Response time of transactions as per the workloads defined in the design.
- Resource utilization during testing must be measured.
- Bottleneck measurements.
- Throughput of the system, measured in transactions per second at the client end.
- Response time of transactions as per the workloads defined in the test scenarios.

Identify the items covered under the test environment:
- Hardware
- Software
- Tools
- Testing infrastructure
- Application architecture
- Project partners
- Test data generators

Identify the true statements about using production data:
- If proper test data is not available, testing may not be sufficient even with a good test environment.
- Use production data if available.
- If production data is not available, creating test data will take a huge effort.
- If production data is not available, using data generators for test data is a good option.
- Using data generators is not a good option when no production data exists, as these tools do not give real-life scenarios to mock patterns.
- Production data cannot be used as it might not fit the coded logic; fudging the production data and using it for testing is a good option.

If performance test scripts get modified during the course of performance testing, the data setup should conform to:
- The database needs to be refreshed after every performance test.
- Mock data is generated between two performance tests using data generators.
- The database need not be refreshed after every performance test; a few records from production data can be copied over to bring a change to the prior data.
- It does not matter, as this depends on the user load being handled: the lower the user load, the lower the need to refresh the data.

As for the test data, the data in the database should be sanitized and uploaded at different injection rates depending on the user load used for a particular performance test. (True / False)

The volume of data in the database should be sufficient to handle the number of requests coming from the test scripts and close to satisfying the end-user response time. (False / True)
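The parameters listed above (server-side throughput in transactions per second, and response times against the workloads in the test scenarios) reduce to simple arithmetic over the timed transactions a test run collects. The sketch below is illustrative: the 90th-percentile convention and the 2-second SLA are assumptions, not values from the source; real thresholds come from the NFRs captured in the strategy.

```python
import statistics

def report(response_times_s, test_duration_s, sla_s=2.0):
    """Summarize a test run: throughput, response-time statistics, SLA check.

    response_times_s: per-transaction response times collected during the run.
    sla_s: illustrative SLA threshold; the real value comes from the NFRs."""
    completed = len(response_times_s)
    throughput = completed / test_duration_s            # transactions per second
    ordered = sorted(response_times_s)
    p90 = ordered[int(0.9 * (completed - 1))]           # simple 90th percentile
    print(f"throughput : {throughput:.2f} TPS")
    print(f"mean       : {statistics.mean(ordered):.3f} s")
    print(f"90th pct   : {p90:.3f} s")
    # Per the quiz: if the SLA is missed, report the monitored parameters
    # to the development team to identify problem areas.
    print("SLA met" if p90 <= sla_s else "SLA missed: escalate to development")

report([0.8, 1.1, 0.9, 2.4, 1.0, 1.2, 0.7, 1.9], test_duration_s=10)
```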





