A Project Assessment Questionnaire
There are no right or wrong answers to these questions; their purpose is to ascertain an accurate view of the past and current development processes, methodologies, and practices used for your project.
- Project and Development Team Information
- Name of project: __________________________________________________
- Please provide a brief description of the project and product.
- Names and roles of respondent(s) to this questionnaire:
- Size of project (lines of code, function points, or other units):
VA Java code?
DB-related code?
Other (C, C++, etc.)
- Delivery dates for key functions (or target delivery dates) including original dates and any reset dates: ______________________________________________ _______________________________________________________________
- Current stage of the project if not already shipped (e.g., functional test almost complete, in final test phase, product/release in beta, etc.):
- Does/did the project involve cross-site or cross-lab development?
- If yes, what site(s) and lab(s)?
- Is there a cross-lab development process available?
- At what organizational level is cross-site/cross-team development implemented (e.g., 1st line level, 2nd line/functional level, etc.)?
- Did the design point of the project serve to satisfy multiple users or constituencies?
- Was the project implemented on an open/common platform (e.g., Intel, PowerPC, Linux, Windows, FreeBSD)?
Please specify:
- On a scale of 1 to 10 (10 being the most complex), how would you rate the complexity of the project based on your experience and knowledge of similar types of software projects?
- Development cycle time (equate ship date with final delivery in an iterative model):
- From design start to ship: _____ months
- From design start to bring-up: _____ months
- From bring-up to code integration complete (all coding done): _____ months
- From code integration complete to internal customer use (all development tests complete) of the product: _____ months
- From development test complete to GA: _____ months
- Development team information (please provide estimates if exact numbers are not available):
- Total size of the team for the entire project: _____
- Number of VA Java programmers spending 100% of time on project: _____
- Number of VA Java programmers spending less than 100% of time on project: _____
- Number of database programmers spending 100% of time on project _____
- Number of database programmers spending less than 100% of time on project _____
- Number of other programmers _____ (specify skills)
- Distribution of team members by education background (percent):
Computer science _____ %
Computer engineering _____ %
Others (please specify) _____ %
_________________ _____ %
Total 100.0% (N = total number of members)
- Approximate annual turnover rate of team members: _____
- How would you describe the skills and experience levels of this team (e.g., years with tools experience, very experienced team, large percent of new hires, etc.)?
- Are there sufficient skilled technical leaders/developers in the organization to lead and support the whole team?
- If possible, please give percent distribution estimates with regard to years of industry software development experience:
< 2 years _____ %
2-5 years _____ %
> 5 years _____ %
TOTAL _____ 100.0%
- Requirements and Specifications
- To what extent did the development team review the requirements before they were incorporated into the project? (Please mark the appropriate cell for each row in the table.)
| | Always | Usually | Sometimes | Seldom | Never |
| --- | --- | --- | --- | --- | --- |
| Functional requirements | | | | | |
| Performance requirements | | | | | |
| Reliability/availability/serviceability (RAS) requirements | | | | | |
| Usability requirements | | | | | |
| Web Publishing/ID requirements | | | | | |
- Per your experience and assessment, how important is this practice (requirements review) to the success of tools projects? (Please mark the appropriate cell for each row in the table.)
| | Very Important | Important | Somewhat Important | Not Sure |
| --- | --- | --- | --- | --- |
| Functional requirements | | | | |
| Performance requirements | | | | |
| Reliability/availability/serviceability (RAS) requirements | | | | |
| Usability requirements | | | | |
| Web Publishing/ID requirements | | | | |
- Specifications were developed based on the requirements and used as the basis for project planning, design and development, testing, and related activities.
- Always
- Usually
- Sometimes
- Seldom
- Never
- Per your experience and assessment, how important is this practice (specifications and requirements to guide overall project implementation) to the success of this project?
- Very important
- Important
- Somewhat important
- Not sure
- If your assessment of the above is "very important" or "important" and your project's actual practice didn't match the level of importance, what were the reasons for the disparity (e.g., obstacles, constraints, process, culture, experiences, etc.)?
- How did your project deal with (a) late requirements and (b) requirements changes? Please elaborate.
Project Strengths and Weaknesses with Regard to Section B
(B1) Is there any practice(s) by your project with regard to requirements and specifications that you consider a strength and that should be considered for implementation by other projects? If so, please describe and explain.
(B2) If you were to do this project all over again, what would you do differently with regard to requirements and specifications, and why?
- Design, Code and Reviews/Inspections
- To what extent did the design work of the project take the following into account? (Please mark the appropriate cells.)

| | Largest Extent Possible | Important Consideration | Sometimes | Seldom | Don't Know |
| --- | --- | --- | --- | --- | --- |
| (a) Design for extensibility | | | | | |
| (b) Design for performance | | | | | |
| (c) Design for reliability/availability/serviceability (RAS) | | | | | |
| (d) Design for usability | | | | | |
| (e) Design for debugability | | | | | |
| (f) Design for maintainability | | | | | |
| (g) Design for testability | | | | | |
| (h) Design with modularity (component structure) to allow for component ownership and future enhancements | | | | | |
- Was there an overall high-level design document in place for the project, serving as overall guidance for implementation and for common understanding across teams and individuals?
a. Yes
b. No
- Per your experience and assessment, how important is this practice (overall design document) to the success of this project?
- Very important
- Important
- Somewhat important
- Not sure
- To what extent were design reviews of the project conducted? (Please mark the appropriate cell in each row in the table.)

| | All Design Work Done Rigorously | All Major Pieces of Design Items | Selected Items Based on Criteria (e.g., Error Recovery) | Design Reviews Were Occasionally Done | Not Done |
| --- | --- | --- | --- | --- | --- |
| Original design | | | | | |
| Design changes/rework | | | | | |
- Per your experience and assessment, how important is this practice (design review/verification) to the success of this project?
- Very important
- Important
- Somewhat important
- Not sure
- If your assessment in question 16(a) is "very important" or "important" and your project's actual practice didn't match the level of importance, what were the reasons for the disparity (e.g., obstacles, constraints, process, culture, experiences, etc.)?
- What is the most common form of design reviews for this project?
- Formal review meeting with moderators, reviewers/inspectors, and defect tracking; issue resolution and rework completion are part of the completion criteria
- Formal review but issue resolution is up to the owner
- Informal review by experts of related areas
- Codeveloper (codesigner) informal review
- Other (please specify).
- In your development process, are there entry/exit criteria for major development phases?
- If yes to question 18, is the review process related to the entry/exit criteria of process phases (e.g., is the successful completion of design reviews part of exit criteria of the design phase)?
- If yes to question 18a, how effectively are the criteria followed/enforced?
- Very effectively
- Effectively
- Somewhat effectively
- Not effectively
- If yes to question 18, what did you do if the entry/exit criteria were not met?
- Per your experience and assessment, how important is this practice (successful design review as part of exit criteria for the design phase) to the success of this project?
- Very important
- Important
- Somewhat important
- Not sure
- If your assessment in question 18d is "very important" or "important" and your project's actual practice didn't match the level of importance, what were the reasons for the disparity (e.g., obstacles, constraints, process, culture, experiences, etc.)?
- Were any coding standards used?
If yes, please briefly describe.
- To what extent did the code implementation of the project take the following factors into account? (Please mark the appropriate cell for each row in the table.)
| | Largest Extent Possible | Important Consideration | Sometimes | Seldom | Don't Know |
| --- | --- | --- | --- | --- | --- |
| Code for extendibility | | | | | |
| Code for performance | | | | | |
| Code for debugability | | | | | |
| Code for reliability/availability/serviceability (RAS) | | | | | |
| Code for usability | | | | | |
| Code for maintainability | | | | | |
- To what extent were code reviews/inspections conducted? (Please mark the appropriate cell for each row in the table.)
| | Rigorously, 100% of the Code | Major Pieces of Code | Selected Items Based on Criteria (e.g., Error Recovery Code) | Occasionally Done | Not Done |
| --- | --- | --- | --- | --- | --- |
| Original code implementation | | | | | |
| After significant rework/changes | | | | | |
| Final (or near final) code implementation | | | | | |
- Per your experience and assessment, how important is this practice (code reviews and inspections) to the success of this project?
- Very important
- Important
- Somewhat important
- Not sure
- If your assessment in question 21(a) is "very important" or "important" and your project's actual practice didn't match the level of importance, what were the reasons for the disparity (e.g., obstacles, constraints, process, culture, experiences, etc.)?
Project Strengths and Weaknesses with Regard to Section C
(C1) Is there any practice(s) by your project with regard to design, code, and reviews/inspections that you consider a strength and that should be considered for implementation by other projects? If so, please describe and explain.
(C2) If you were to do this project all over again, what would you do differently with regard to design, code, and reviews/inspections, and why?
- Code Integration and Driver Build
- Was code integration dependency (e.g., with client software, with database, with information development, with other software, with other organizations or even with other sites) a concern for this project?
a. Yes
b. No
- If yes to question 22, please briefly describe how such dependencies were managed from a code integration/driver build point of view for this project and what (tools, process, etc.) was used.
- Per your experience and assessment, how important is this practice (code integration dependency management) to the success of this project?
- Very important
- Important
- Somewhat important
- Not sure
- If your assessment in question 22b is "very important" or "important" and your project's actual practice didn't match the level of importance, what were the reasons for the disparity (e.g., obstacles, constraints, process, culture, experiences, etc.)?
- With regard to the integration and build process, how do you control part integration?
- In a cross-site development environment, how is the part integration handled from an organizational point of view? Is there an owning organization responsible for part integration?
- If yes to question 23(a), how is the development group involved in the integration/bring-up task?
- Please briefly describe your process, if any, in enhancing code integration quality and driver stability.
- Per your experience and assessment, how important is this practice (code integration control, action/process on integration quality and driver stability) to the success of this project?
- Very important
- Important
- Somewhat important
- Not sure
- If your assessment in question 24(a) is "very important" or "important" and your project's actual practice didn't match the level of importance, what were the reasons for the disparity (e.g., obstacles, constraints, process, culture, experiences, etc.)?
- What is your driver build cycle (e.g., daily, weekly, biweekly, monthly, flexible/build when ready, etc.)? Please provide your observations on your build cycle as it relates to your project's progress (schedule and quality). If it varied throughout the project, please describe how this was handled through the different phases (e.g., early function delivery and bring-up vs. fix-only mode).
Project Strengths and Weaknesses with Regard to Section D
(D1) Is there any practice(s) by your project with regard to code integration and driver build that you consider a strength and that should be considered for implementation by other projects? If so, please describe and explain.
(D2) If you were to do this project all over again, what would you do differently with regard to code integration and driver build, and why?
- Test
- Was there a test plan in place for this project at the functional (development test) and overall project level (including independent test team)? Who initiated the test plan? (Please fill in the cells in the table.)
| | Test Plan in Place (Yes/No) | Who Initiated | Who Executed |
| --- | --- | --- | --- |
| Development Test | | | |
| Overall Project | | | |
- What types of test/test phases (unit, simulation test, functional, regression, independent test group, etc.) were conducted for this project? Please specify and give a brief explanation of each.
- Please elaborate on your error recovery or "bad path" testing.
- Please elaborate on your regression testing.
- Was test coverage/code coverage measurement implemented?
If yes, for which test(s), and who does it?
- Are entry/exit criteria used for the major test phases/types?
If yes,
(a) Please provide a brief description.
(b) How are the criteria used or enforced?
- Per your experience and assessment, how important is this practice (entry/exit criteria for major tests) to the success of this project?
- Very important
- Important
- Somewhat important
- Not sure
- If your assessment in question 29a is "very important" or "important" and your project's actual practice didn't match the level of importance, what were the reasons for the disparity (e.g., obstacles, constraints, process, culture, experiences, etc.)?
- Is there a change control process in place for integrating fixes?
a. Yes (please briefly describe)
b. No
- If yes, how effectively in your assessment is the process being implemented?
- Very effectively
- Effectively
- Somewhat effectively
- Not effectively
- Per your experience and assessment, how important is this practice (change control for defect fixes) to the success of this project?
- Very important
- Important
- Somewhat important
- Not sure
- If your assessment in question 30b is "very important" or "important" and your project's actual practice/effectiveness didn't match the level of importance, what were the reasons for the disparity (e.g., obstacles, constraints, process, culture, experiences, etc.)?
Project Strengths and Weaknesses with Regard to Section E
(E1) Is there any practice(s) by your project with regard to testing that you consider a strength and that should be considered for implementation by other projects? If so, please describe and explain.
(E2) If you were to do this project all over again, what would you do differently with regard to testing, and why?
- Project Management
- Was there a dedicated project manager for this project?
- How would you describe the role of project management for this project?
- Project management was basically done by line management.
- There was a project coordinator, coordinating activities and reporting status across development teams and line managers.
- There was a project manager, but major project decisions and progress were driven by line management.
- The project manager, together with line management, was responsible for the success of the project. The project manager drove progress (e.g., dependency, schedule, quality) of the project and improvements across teams and line management areas.
- Other (please specify/describe).
- Per your experience and assessment, how important is this practice (effective role of project management) to the success of this project?
- Very important
- Important
- Somewhat important
- Not sure
- If your assessment in question 31(b) is "very important" or "important" and your project's actual practice didn't match the level of importance, what were the reasons for the disparity (e.g., obstacles, constraints, process, culture, experiences, etc.)?
- How were sizing estimates of the project (specifically the amount of design and development work) derived?
- How was the development schedule developed for this project? Please provide a brief statement (e.g., top-down [GA date mandated], bottom-up, bottom-up and top-down converged with proper experiences and history, based on sizing estimates, etc.).
- Per your experience and assessment, how important is this practice (effective sizing and schedule development process based on skills and experiences) to the success of this project?
- Very important
- Important
- Somewhat important
- Not sure
- If your assessment in question 33(a) is "very important" or "important" and your project's actual practice didn't match the level of importance, what were the reasons for the disparity (e.g., obstacles, constraints, process, culture, experiences, etc.)?
- Was a staged delivery/code drop plan developed early based on priorities and dependencies and executed?
a. Yes, please briefly describe.
b. No, please briefly describe.
- Per your experience and assessment, how important is this practice (good staging and code drop plan) to the success of this project?
- Very important
- Important
- Somewhat important
- Not sure
- If your assessment in question 34a is "very important" or "important" and your project's actual practice didn't match the level of importance, what were the reasons for the disparity (e.g., obstacles, constraints, process, culture, experiences, etc.)?
- Does this project have to satisfy multiple constituents or diverse users?
If yes,
- How was work prioritized?
- How was workload distribution determined?
- How was conflict resolved?
- If this is a cross-site development project (Question 5), please briefly describe how cross-site dependency was managed.
- Under what level of management were major dependencies for deliverables managed?
- Under the same development manager
- Under the same functional manager
- Across functional areas but under the same development directors
- Coordination across development directors
- Under the same project executive
- Other (please describe).
- What were the major obstacles, if any, to effective team communications for your project?
- Were major checkpoint reviews conducted at various stages of the project throughout the development cycle?
a. Yes
b. No
- If yes to question 39, please describe the major checkpoint review deliverables.
- If yes to question 39, how effective in your view were those checkpoint reviews? Please briefly explain.
- Very effective
- Effective
- Somewhat effective
- Not effective
- Per your experience and assessment, how important is this practice (effective checkpoint process) to the success of this project?
- Very important
- Important
- Somewhat important
- Not sure
- If your assessment in question 39(c) is "very important" or "important" and your project's actual practice didn't match the level of importance, what were the reasons for the disparity (e.g., obstacles, constraints, process, culture, experiences, etc.)?
Project Strengths and Weaknesses with Regard to Section F
(F1) Is there any practice(s) by your project with regard to project management that you consider a strength and that should be considered for implementation by other projects? If so, please describe and explain.
(F2) If you were to do this project all over again, what would you do differently with regard to project management, and why?
- Metrics, Measurements, Analysis
- Were any in-process metrics used to manage the progress (schedule and quality) of the project (e.g., function delivery tracking, problem backlog tracking, test plan execution, etc.)?
a. Yes
b. No
- If yes, please specify/describe where applicable.
- Metric(s) used at the front end of the development cycle (i.e., up to code integration)
- Metric(s) used for driver stability
- Metric(s) used during testing with targets/baselines for comparisons
- Others (simulation measurement, test coverage/code coverage measurement, etc.): please specify.
- Per your experience and assessment, how important is this practice (good metrics for schedule and quality management) to the success of this project?
- Very important
- Important
- Somewhat important
- Not sure
- If your assessment in question 40(b) is "very important" or "important" and your project's actual practice didn't match the level of importance, what were the reasons for the disparity (e.g., obstacles, constraints, process, culture, experiences, etc.)?
- Was there any defect cause analysis (e.g., problem components, Pareto analysis) that resulted in improvement/corrective actions during the development of the project?
If yes, please describe briefly.
Project Strengths and Weaknesses with Regard to Section G
(G1) Is there any practice(s) by your project with regard to metrics, measurements, and analysis you consider a strength and that should be considered for implementation by other projects? If so, please describe and explain.
(G2) If you were to do this project all over again, what would you do differently with regard to metrics, measurements, and analysis, and why?
- Development Environment/Library
- Please name and describe briefly your development environment/platform(s) and source code library system(s).
- To what extent was the entire team familiar with the operational, build, and support environment?
- Was your current development environment or any part of it a hindrance in any way? What changes might enhance the development process for quality, efficiency, or ease-of-use? Please provide specifics.
Project Strengths and Weaknesses with Regard to Section H
(H1) Is there any practice(s) by your project with regard to development environment/library system that you consider a strength and that should be considered for implementation by other projects? If so, please describe and explain.
(H2) If there is any development environment/library system that per your assessment is the best for tools development, please describe and explain.
(H3) If you were to do this project all over again, what would you do differently with regard to development environment/library system, and why?
- Tools/Methodologies
- In what language(s) was the code for the project written?
- Was the project developed with
- object-oriented methodology?
- procedural methods?
- Are multiple environments required in order to fully test the project? If so, please describe.
- Are any kind of simulation test environments available? Please describe.
- If yes to question 48, how important is this to the success of tools projects?
- Very important
- Important
- Somewhat important
- Not sure
- Please describe briefly any tools that were used for each of the following areas:
- Design
- Debug
- Test: code coverage
- Test: automation/stress
- Other. Please explain.
- What was the learning curve of the development team to become proficient in using the above tools and the development environment/library discussed earlier? Please provide information if any specific education is needed.
Project Strengths and Weaknesses with Regard to Section I
(I1) Is there any practice(s) by your project with regard to tools and methodologies you consider a strength and that should be considered for implementation by other projects? If so, please describe and explain.
(I2) If there are any tools and methodologies that, per your assessment, are best for projects similar to this one, please describe and explain.
(I3) If you were to do this project all over again, what would you do differently with regard to tools and methodologies?
- Project Outcome Assessment
- Please provide a candid assessment of the schedule achievement (vs. original schedule) of the project. Please provide any pertinent information as appropriate (e.g., adherence to original schedule, meeting/not meeting GA date, meeting/not meeting interim checkpoints, any schedule reset, any function cutback/increase, unrealistic schedule to begin with, etc.).
- Please provide a candid assessment of the quality outcome of the project. Please provide any pertinent information as appropriate (e.g., in-process indicators, test defect volumes/rates, field quality indicators, customer feedback, customer satisfaction measurements, customer critical situations, and any existing analysis and presentations; please attach files or documents as applicable).
- How would you rate the overall success of the project (schedule, quality, costs, meeting commitments, etc.)?
- Very successful
- Successful
- Somewhat successful
- Not satisfactory
- Comments
Please provide any comments, observations, insights with regard to your project specifically or tools projects in general.