Test Plan Template
(file, properties, summary, title to change)

Author:
Date:

Index

Revision History
Introduction
Goal of Project and Feature Team
Primary Testing Concerns
Primary Testing Focus
References
Personnel
Testing Schedule
Feature History
Features:
Files and Modules:
Files List:
Registry, INI Settings:
Setup Procedures:
De-installation Procedures
Database Setup and Procedures
Network Domain/Topologies Configuration Procedures
Performance Monitoring Counters Setup And Configurations
Operational Issues
Backup
Recovery
Archiving
Monitoring
Operational Problem Escalation/Alert Methods
Scope of Test Cases
Acceptance Criteria
Key Feature Issues
Test Approach
Design Validation
Data Validation
API Testing
Content Testing
Low-Resource Testing
Setup Testing
Modes and Runtime Options
Interoperability
Integration Testing
Compatibility: Clients
Compatibility: Servers
Beta Testing
Environment/System - General
Configuration
User Interface
Performance & Capacity Testing
Scalability
Stress Testing
Volume Testing
International Issues
Robustness
Error Testing
Usability
Accessibility
User Scenarios
Boundaries and Limits
Special Code Profiling and Other Metrics
Test Environment
Operating Systems
Networks
Hardware
Machines
Graphics Adapters
Extended and Expanded Memory Boards
Other Peripherals
Software
Unique Testing Concerns For Specific Features
Area Breakdown
Feature Name
Sub Feature One
sub 1.1
sub 1.2
sub 1.3
Sub Feature Two
Sub Feature Three (etc.)
Spec Review Issues
Test Tools
Smoke Test (acceptance test, build verification, etc.)
Automated Tests
Manual Tests
Regression Tests
Bug Bashes
Bug Reporting
Plan Contingencies
External Dependencies
Headcount Requirements
Product Support
Testing Schedule
Drop Procedures
Release Procedures
Alias/Newsgroups and Communication Channels
Regular Meetings
Feature Team Meetings
Project Test Team Meetings
Feature Team Test Meetings
Decision-Making Procedures
Notes

Revision History
First Draft:

Introduction
A single sentence describing the intent and purpose of the test plan. For example: "This test plan addresses the test coverage for the XXX release of the BAR area of feature Foo."

Goal of Project and Feature Team
Mission statement and goal of the overall project team. Mission statement and goal of the specific feature team. This section sets the stage for testing's plans and goals in relation to the feature team's and project's goals.

Primary Testing Concerns
A statement of what the main critical concerns of the test plan are.
An itemized list or a short paragraph will suffice.

Primary Testing Focus
A short statement of what items testing will focus on. The testing concerns above state what testing is worried about; focus indicates more of a methodology, a statement of how those concerns will be addressed.

References
document: location
test plan: test plan location
project specifications: project spec location
feature specification: feature spec location
development docs on feature: dev doc location
bug database queries: location for RAID queries
test case database queries: location for test case queries
schedule documents: location for schedule documents
build release server: location of build releases
source file tree: location of source file tree
other related documents: other locations

Personnel
Program Manager: name and email
Developer: name and email
Tester: name and email

Testing Schedule
Break the testing down into phases (e.g. Planning, Case Design, Unit & Component Tests, Integration Tests, Stabilization, Performance and Capacity Tuning, Full Pass, and Shipping) and make a rough schedule of sequence and dates. What tasks do you plan on having done in which phases? This is a brief, high-level summary, just to set the expectation that certain components will be worked on at certain times, and to indicate that the plan takes project schedule concerns into consideration. Include a pointer to more detailed feature and team schedules here.

Feature History
A history of how the feature was designed and evolved over time. It is a good idea to build this history up as test plans go; it gives a good feel for why the current release focuses on what it does, and it also serves as a good framework for where problems have been in the past. A paragraph or two is probably sufficient for each drop, indicating the original intent, feedback and successes, problems, resolutions, things learned from the release, and major issues dealt with or discovered in the release.
Basically, this section is a mini post-mortem. It eventually finishes with a statement regarding the development of the specific version. It is often helpful to update this history at each milestone of a project.

Features:
This section gives a breakdown of the areas of the feature. It is often useful to include a per-area statement of testing's thoughts. What type of testing is best used for each area? What is problematic about each area? Has this area had problems in the past? Quick statements are all that is needed in this list. NOTE: this is only here as a high-level summary of the features; the real meat is in the area breakdown, so this is a tad redundant in that respect.

Files and Modules:
Include in this section any files, modules, and code that must be distributed on the machine, and where they would be located. Also include registry settings, INI settings, setup procedures, de-installation procedures, special database and utility setups, and any other relevant data.

Files List:
filename: purpose; location on machine

Registry, INI Settings:
setting1: purpose; possible values
setting2: purpose; possible values

Setup Procedures:
blah blah

De-installation Procedures
blah blah

Database Setup and Procedures
blah blah

Network Domain/Topologies Configuration Procedures
blah blah

Performance Monitoring Counters Setup And Configurations

Operational Issues
Is the program being monitored/maintained by an operational staff? Are there special problem escalation or operational procedures for dealing with the feature/program/area?

Backup
Recovery
Archiving
Monitoring
Operational Problem Escalation/Alert Methods

Scope of Test Cases
Statement regarding the degree and types of coverage the testing will involve. For example, will focus be placed on performance? How about client vs. server issues? Is there a large class of testing coverage that will be intentionally overlooked or minimized? Will there be much unit and component testing?
This is a big sweeping picture of the testing coverage, giving an overall statement of the testing scope.

Acceptance Criteria
How is "good enough to ship" defined for the project? For the feature? What are the necessary performance, stability, and bug find/fix rates to determine that the product is ready to ship?

Key Feature Issues
What are the top problems/issues that are recurring or remain open in this test plan? What problems remain unresolved?

Test Approach

Design Validation
Statements regarding coverage of the feature design, including both specification and development documents. Will testing review the design? Is design an issue on this release? How much concern does testing have regarding design?

Data Validation
What types of data will require validation? What parts of the feature will use what types of data? What are the data types that test cases will address?

API Testing
What level of API testing will be performed? If none is being performed, what is the justification for taking this approach?

Content Testing
Is your area/feature/product content based? What is the nature of the content? What strategies will be employed in your feature/area to address content-related issues?

Low-Resource Testing
What resources does your feature use? Which are used most, and are most likely to cause problems? What tools/methods will be used in testing to cover low-resource (memory, disk, etc.) issues?

Setup Testing
How is your feature affected by setup? What are the necessary requirements for a successful setup of your feature? What testing approach will be employed to confirm valid setup of the feature?

Modes and Runtime Options
What are the different runtime modes the program can be in? Are there views that can be turned off and on? Controls that toggle visibility states? Are there options a user can set that will affect the run of the program? List here the different runtime states and options the program has available.
It may be worthwhile to indicate here which ones demonstrate a need for more testing focus.

Interoperability
How will this product interact with other products? What level of knowledge does it need to have about other programs: good neighbor, program cognizant, program interaction, fundamental system changes? What methods will be used to verify these capabilities?

Integration Testing
Go through each area in the product and determine how it might interact with other aspects of the project. Start with the ones that are obviously connected, but try every area to some degree; there may be subtle connections you do not think about until you start using the features together. The test cases created with this approach may duplicate the modes and objects approaches, but some areas do not fit in those categories and might be missed if you do not check each area.

Compatibility: Clients
Is your feature a server-based component that interacts with clients? Is there a standard protocol that many clients are expected to use? How many and which clients are expected to use your feature? How will you approach testing client compatibility? Is your server suited to handle ill-behaved clients? Are there subtleties in the interpretation of standard protocols that might cause incompatibilities? Are there non-standard but widely practiced uses of your protocols that might cause incompatibilities?

Compatibility: Servers
Is your feature a client-based component that interacts with servers? Is there a standard protocol supported by many servers that your client speaks? How many different servers will your client program need to support? How will you approach testing server compatibility? Is your client suited to handle ill-behaved or non-standard servers? Are there subtleties in the interpretation of standard protocols that might cause incompatibilities? Are there non-standard but widely practiced uses of protocols that might cause incompatibilities?
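Client/server compatibility coverage of this sort is easiest to track as an explicit pairing matrix. A minimal sketch in Python; the client and server names are placeholders, not part of this template:

```python
from itertools import product

# Hypothetical client and server versions to pair up for compatibility passes.
CLIENTS = ["ClientA 1.0", "ClientA 2.0", "ClientB 3.1"]
SERVERS = ["ServerX 4.0", "ServerX 5.0"]

def compatibility_matrix(clients, servers):
    """Enumerate every client/server pairing that needs a test pass."""
    return list(product(clients, servers))

pairs = compatibility_matrix(CLIENTS, SERVERS)
print(len(pairs))  # 3 clients x 2 servers = 6 pairings
```

Tracking pairings explicitly makes it obvious when a new client or server version silently adds untested combinations.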
Beta Testing
What is the beta schedule? What is the distribution scale of the beta? What are the entry criteria for beta? How is testing planning on utilizing the beta for feedback on this feature? What problems do you anticipate discovering in the beta? Who is coordinating the beta, and how?

Environment/System - General
Are there issues regarding the environment, system, or platform that should get special attention in the test plan? What are the runtime modes and options in the environment that may cause differences in the feature? List the components of critical concern here. Are there platform- or system-specific compliance issues that must be maintained?

Configuration
Are there configuration issues regarding hardware and software in the environment that may need special attention in the test plan? Some of the classical issues are machine and BIOS types, printers, modems, video cards and drivers, special or popular TSRs, memory managers, networks, etc. List the types of configurations that will need special attention.

User Interface
List the items in the feature that explicitly require a user interface. Is the user interface designed such that a user will be able to use the feature satisfactorily? Which part of the user interface is most likely to have bugs? How will the interface testing be approached?

Performance & Capacity Testing
How fast and how much can the feature do? Does it do enough, fast enough? What testing methodology will be used to determine this information? What criteria will be used to indicate acceptable performance? If this is a modification of an existing product, what are the current metrics? What are the expected major bottlenecks and performance problem areas for this feature?

Scalability
Is the ability to scale and expand this feature a major requirement? What parts of the feature are most likely to have scalability problems? What approach will testing use to define the scalability issues in the feature?
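Performance questions like the ones above are usually answered first with a small timing harness, before any formal tooling exists. A hedged sketch; the workload here is a stand-in, not a real feature:

```python
import statistics
import time

def measure(operation, repetitions=100):
    """Time one operation over many repetitions; return per-call stats in seconds."""
    samples = []
    for _ in range(repetitions):
        start = time.perf_counter()
        operation()
        samples.append(time.perf_counter() - start)
    samples.sort()
    return {
        "mean": statistics.mean(samples),
        "p95": samples[int(0.95 * len(samples))],  # rough 95th percentile
        "max": samples[-1],
    }

# Placeholder workload standing in for the feature under test.
stats = measure(lambda: sum(range(1000)))
print(sorted(stats))  # ['max', 'mean', 'p95']
```

Recording a percentile alongside the mean matters because the "does it do enough fast enough" question is usually about worst-case latency, not the average.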
Stress Testing
How does the feature do when pushed beyond its performance and capacity limits? How is its recovery? What is its breakpoint? What is the user experience when this occurs? What is the expected behavior when the client reaches stress levels? What testing methodology will be used to determine this information? What area is expected to have the most stress-related problems?

Volume Testing
Volume testing differs from performance and stress testing in that it focuses on doing volumes of work in realistic environments, durations, and configurations. Run the software as an expected user would: with certain other components running, for so many hours, with data sets of a certain size, or with a certain expected number of repetitions.

International Issues
Confirm localized functionality, that strings are localized, and that code pages are mapped properly. Ensure the program works properly on localized builds, and that international settings in the program and environment do not break functionality. How are localization and internationalization being done on this project? List the parts of the feature that are most likely to be affected by localization. State the methodology used to verify international sufficiency and localization.

Robustness
How stable is the code base? Does it break easily? Are there memory leaks? Are there portions of code prone to crash, save failure, or data corruption? How good is the program's recovery when these problems occur? How is the user affected when the program behaves incorrectly? What is the testing approach to find these problem areas? What are the overall robustness goal and criteria?

Error Testing
How does the program handle error conditions? List the possible error conditions. What testing methodology will be used to evoke and verify proper behavior for error conditions? What feedback mechanism is being given to the user, and is it sufficient? What criteria will be used to define sufficient error recovery?
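Error-condition cases like these reduce to "given this bad input, the program must fail in the defined way." A minimal sketch; `parse_port` is an invented example function, not part of this template:

```python
def parse_port(value):
    """Example function under test: accept only numeric ports in 1..65535."""
    port = int(value)  # raises ValueError for non-numeric input
    if not 0 < port < 65536:
        raise ValueError(f"port out of range: {port}")
    return port

def fails_as_expected(func, *args):
    """True if the call raises ValueError, the defined error behavior."""
    try:
        func(*args)
    except ValueError:
        return True
    return False

# Each listed error condition becomes one assertion.
assert fails_as_expected(parse_port, "not-a-number")
assert fails_as_expected(parse_port, "70000")
assert parse_port("8080") == 8080
```

The point of the helper is that an error case passes only when the failure happens, so a silently "successful" bad input shows up as a test failure.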
Usability
What are the major usability issues on the feature? What is testing's approach to discovering more problems? What sorts of usability tests and studies have been performed, or will be performed? What are the usability goal and criteria for this feature?

Accessibility
Is the feature designed in compliance with accessibility guidelines? Would a user with special accessibility requirements still be able to utilize this feature? What are the criteria for acceptance on accessibility issues for this feature? What is the testing approach to discover problems and issues? Are there particular parts of the feature that are more problematic than others?

User Scenarios
What real-world user activities are you going to try to mimic? What classes of users (e.g. secretaries, artists, writers, animators, construction workers, airline pilots, shoemakers) are expected to use this program, and doing which activities? How will you attempt to mimic these key scenarios? Are there special niche markets that your product is aimed at (intentionally or unintentionally) where mimicking real user scenarios is critical?

Boundaries and Limits
Are there particular boundaries and limits inherent in the feature or area that deserve special mention here? What is the testing methodology to discover problems handling these boundaries and limits?

Operational Issues
If your program is being deployed in a data center, or as part of a customer's operational facility, then testing must, at the very least, mimic the user scenario of performing basic operational tasks with the software.

Backup
Identify all files representing data and machine state, and indicate how they will be backed up. If it is imperative that the service remain running, determine whether it is possible to back up the data and still keep services or code running.

Recovery
If the program goes down, or must be shut down, are there steps and procedures that will restore program state and get the program or service operational again?
Are there holes in this process that may leave a service or state deficient? Are there holes that could cause loss of data? Mimic as many of the likely loss-of-service states as possible, and go through the process of successfully restoring service.

Archiving
Archival is different from backup: backup is when data is saved in order to restore service or program state; archival is when data is saved for later retrieval. Most archival and backup systems piggyback on each other's processes. Is archival of data going to be a crucial operational issue for your feature? If so, is it possible to archive the data without taking the service down? Is the data, once archived, readily accessible?

Monitoring
Does the service have adequate monitoring messages to indicate status, performance, or error conditions? When something goes wrong, are the messages sufficient for operational staff to know what to do to restore proper functionality? Are there "heartbeat" counters that indicate whether the program or service is working? Attempt to mimic the scenario of an operational staff trying to keep a service up and running.

Upgrade
Does the customer likely have a previous version of your software, or some other software? Will they be performing an upgrade? Can the upgrade take place without interrupting service? Will anything be lost (functionality, state, data) in the upgrade? Does it take unreasonably long to upgrade the service?

Migration
Are there data, scripts, code, or other artifacts from previous versions that will need to be migrated to a new version? Testing should create an example installation with an old version and migrate that example to the new version, moving all data and scripts into the new format. List here all data files, formats, or code that would be affected by migration, the solution for migration, and how testing will approach each.

Special Code Profiling and Other Metrics
How much focus will be placed on code coverage?
What tools and methods will be used to measure the degree to which testing coverage sufficiently addresses all of the code?

Test Environment
What are the requirements for the product? They should be reflected in the breadth of hardware configuration testing.

Operating Systems
Identify all operating systems under which this product will run. Include version numbers if applicable.

Networks
Identify all networks under which this product will run. Include version numbers if applicable.

Hardware
Identify the various hardware platforms and configurations.

Machines

Graphics Adapters
This includes the requirements for single or dual monitors.

Extended and Expanded Memory Boards

Other Peripherals
Peripherals include those necessary for testing, such as CD-ROMs, printers, modems, faxes, external hard drives, and tape readers.

Software
Identify software included with the product or likely to be used in conjunction with this product. Software categories would include memory managers, extenders, some TSRs, related tools or products, or similar-category products.

Unique Testing Concerns For Specific Features
List specific features which may require more attention than others, and describe how testing will approach these features. This serves as a sort of hot list.

Area Breakdown
This is a detailed breakdown of the feature or area, and is best done in an outline format. It is useful as a tool later when building test cases. The outline of an area can go on quite long. Usually it starts with a menu breakdown, and then continues with those features and functionalities not found on any menu in particular.

Feature Name

Sub Feature One

sub 1.1
Feature testing approach matrix: this will repeat for each subitem, including any class of testing relevant to any item. Put in NA if not applicable. The location of this matrix in the hierarchy determines scope; for example, data validation rules global to anything under Sub Feature One should go under Sub Feature One.
Inheritance should be implied. Each class is listed with its guiding questions (the Info column); the Auto? and Man.? columns record whether coverage is automated or manual:

Design Validation:
Data Validation: Valid data & expected results (e.g. alphanumeric). Invalid data & expected results (e.g. no ;, /, or @). How to validate?
API Testing: What APIs are exposed? What are the permutations of calling these APIs (order, specific args, etc.)?
Content Testing: What content exercises this feature? What content does this feature produce, modify, or manage?
Low-Resource Testing: What resource dimensions to test? What to do when a resource is low?
Setup Testing: What types of setups? How to confirm the feature after a setup?
Modes & Runtime Options: What modes and runtime options does this have? What should be tested during these modes? What are the expected results in different modes?
Interoperability: What do we interoperate with? Do what action with it?
Integration Testing: What do we integrate with? Do what action with it?
Compatibility: Clients: What clients? Doing what actions?
Compatibility: Servers: What servers? Doing what actions?
Beta Testing:
Environment/System: What environmental issues apply to this? What to do to expose them?
Configuration: What configuration issues apply to this? What to do to expose them?
User Interface: What are the interface points? How to exercise them?
Performance: What are the target performance dimensions? What will you do to exercise these?
Capacity: What is the target capacity? What will you do to test this?
Scalability: What is the target scale, and how? What will you do to test this?
Stress: What dimensions do you plan on stressing? What is the expectation? How will you stress it?
Volume Tests: What actions will be included in volume tests?
International: What are the international problems of this item?
Robustness: What robustness errors (crashes, corruption, etc.) are anticipated? How will you look for them?
Error Testing: What are the relevant error conditions that the program expects? What error situations do you plan on simulating?
Usability: What are the usability issues about this item?
Accessibility: What are the accessibility issues about this item?
User Scenarios: How would a user typically use this item? What tests will you do to simulate user scenarios?
Boundaries and Limits: What are the boundary conditions surrounding this item? What are the limits of this item? Max values? Minimum values?
Special Code Profiling and Other Metrics:
Schedule: When?
Code Paths and Sequences: What are the different ways to invoke or activate this item? What are things you can do just before this item that are supposed to change the way it operates? What should NOT change the way it operates?

sub 1.2

sub 1.3

Sub Feature Two

Sub Feature Three (etc.)

Test Case Structure
Where will test cases be stored? What is the naming scheme? What is the organizing structure? How do test cases correlate to the test plan?

RECOMMENDED: The test case structure follows the area breakdown structure highlighted above. Test cases will be stored in the TCM database. TCM was chosen because it supports arbitrary depths of hierarchy, and because it is SQL-based, allowing a great deal of flexibility in reporting, database management, etc. In TCM, the left pane holds the hierarchy and the right pane holds instances of test cases. The left pane will follow the hierarchy through all the levels of feature detail, and then add one more level to express test class types. For example, one might see the following hierarchy in the left pane:

Word
  Editing
    Format
      Font
        Typeface
          Data Validation
          Errors
          Boundaries
          Limits
          Etc.
        Style
        Etc.
      Paragraph

Test cases can exist at any level, but it is recommended that they be entered at the terminal level of the hierarchy so that they can be easily associated with similar classes of testing. When a given level is selected, the list of test cases associated with it is shown in the right pane.
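A test case's location in such a hierarchy can be referenced with a dotted path. A minimal sketch; `dotted_name` is an illustrative helper, not a TCM API:

```python
def dotted_name(path):
    """Join hierarchy levels into an item.item.item-style test-case name."""
    return ".".join(path)

# Levels taken from the example hierarchy above.
name = dotted_name(["Word", "Editing", "Format", "Font", "Typeface", "Boundaries"])
print(name)  # Word.Editing.Format.Font.Typeface.Boundaries
```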
This hierarchy allows one to avoid numbering schemes (which are a pain to maintain and organize), allows a test's location in the tree to be expressed with an "item.item.item.item" naming convention, and allows one to determine to what degree different classes of tests are covered for each feature. It is recommended that if a class is being skipped, an explanatory entry be placed in the right pane justifying why tests of that class are not relevant. The intent is to make this document drive the creation and evaluation of the test cases.

Spec Review Issues
Indicate the location and method being used for reporting problems against the specification and design.

Test Tools
List whatever test tools will be used, or need to be written, and for what purpose. It is often best to point to an external location for more details, as tools usually require an entire plan and architectural summary of their own.

Smoke Test (acceptance test, build verification, etc.)
The smoke test determines whether or not the build is good enough to be submitted to testing. This section gives a statement of what the basic smoke test consists of, how it is designed, and how it will be performed. A pointer to suite locations is helpful here too.

Automated Tests
What degree of automation will be used in testing this area? What platform/tools will be used to write the automated tests? What will the automation focus on? Where are the automated tools, suites, and sources checked in?

Manual Tests
What sorts of items will be tested manually rather than via automation? Why is manual testing being chosen over automation? Where are the manual tests defined and located?

Regression Tests
What is your general regression strategy? Are you going to automate? Where are the regressions stored? How often will they be re-run?

Bug Bashes
What is your strategy for bug bashes? How many? What goals? What incentives? What areas are targeted to be bashed? By whom?
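A build-verification smoke test of the kind described above is often just a short list of named checks run against every build. A hedged sketch; the checks themselves are placeholders standing in for real verification steps:

```python
def build_launches():
    return True  # placeholder: e.g. the binary starts and exits cleanly

def main_window_opens():
    return True  # placeholder: e.g. the primary UI appears

SMOKE_CHECKS = [
    ("build launches", build_launches),
    ("main window opens", main_window_opens),
]

def run_smoke(checks):
    """Run every check; return the names of the checks that failed."""
    return [name for name, check in checks if not check()]

failures = run_smoke(SMOKE_CHECKS)
print(failures)  # [] means the build is good enough to hand to testing
```

Returning the failed names (rather than stopping at the first failure) gives testing a complete picture of how broken a rejected build is.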
Bug Reporting
What tool(s) will be used to report bugs, and where are the bug reports located? Are there any special pieces of information regarding categorization of bugs that should be reported here (areas, keywords, etc.)?

Plan Contingencies
Is there anything that may require testing's plans to change? Briefly describe how you plan to react to those changes.

External Dependencies
Are there any groups or projects external to the team that are dependent on your feature, or that your feature is dependent on? What testing problems and issues does this create? How are deliverables from external groups going to be tested and confirmed with your own feature? Who are the individuals serving as primary contacts and liaisons in this relationship?

Headcount Requirements
How many people will it require to implement these plans? Are there currently enough people on staff? What effect will hiring more or fewer people have (slip in schedule, quality, or something else)?

Product Support
What aspects of this feature have been a problem for support in the past? How are those problems being addressed? What aspects of this feature will likely cause future support problems? How are those problems being resolved? What testing methodology is being used to prevent future support problems? How is information being sent to support regarding problems and functionality of the feature?

Testing Schedule
Break the testing down into phases (e.g. Planning, Case Design, Unit & Component Tests, Integration Tests, Stabilization, Performance and Capacity Tuning, Full Pass, and Shipping) and make a rough schedule of sequence and dates. What tasks do you plan on having done in which phases? This is a brief, high-level summary, just to set the expectation that certain components will be worked on at certain times, and to indicate that the plan takes project schedule concerns into consideration. Include a pointer to more detailed feature and team schedules here.
Drop Procedures
Define the methodology for handing off the code between Development and Testing.

Release Procedures
Describe the step-wise process for getting the product from the network testing version to ready-to-ship master diskette sets.

Aliases/Newsgroups and Communication Channels
List any email aliases and what they are for. List any bulletin boards, newsgroups, or other communication procedures and methodologies here.

Regular Meetings
For each meeting, note when and where it is held and its general agenda.
Feature Team Meetings
Project Test Team Meetings
Feature Team Test Meetings

Decision-Making Procedures
Who reviews decision points for the following sorts of things: build approval, bug triage, feature sign-off, test plan sign-off, development design sign-off? What is the process for each of these decisions?

Notes
Areas in Red are flagged to mark spec items that are non-existent, incomplete, unclear, or being considered as issues. Areas in Blue are flagged to mark an important issue that needs definition and/or resolution.

Test Plan Template, Revision 1 (05-Jun-2011)