When evaluating a Decision Support System (DSS) during development, which of the following is typically NOT a key component of the evaluation framework?

Difficulty: Medium

Correct Answer: means of measuring system-development time spent on the project

Explanation:


Introduction / Context:
DSS evaluation focuses on whether the system improves decision quality, usability, responsiveness, and stakeholder satisfaction. While project management metrics (like time spent) are useful for delivery oversight, they are not central to evaluating the effectiveness of the DSS itself. This question distinguishes product evaluation from project administration.


Given Data / Assumptions:

  • Evaluation criteria for DSS typically include accuracy, relevance, timeliness, usability, and impact on decisions.
  • Monitoring progress and formal reviews help structure iterative improvements.
  • Development time tracking is a project metric, not a direct quality metric of the DSS’s decision support.


Concept / Approach:

An effective evaluation framework specifies criteria (what “good” looks like), monitoring (how progress toward those criteria is tracked), and a formal review process (how findings drive decisions). Measuring development time is relevant to schedule and cost management, but it neither proves nor disproves the DSS’s utility in supporting decisions.
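
As a minimal illustrative sketch (the class names, criteria defaults, and review threshold below are hypothetical, not a standard API), this separates the components of an evaluation framework (criteria, monitoring measures, formal review) from a project-management record that tracks development time:

```python
# Hypothetical sketch: DSS evaluation framework vs. project-management record.
# Development-time tracking deliberately lives outside the evaluation framework.
from dataclasses import dataclass, field


@dataclass
class DSSEvaluationFramework:
    # Criteria: what "good" decision support looks like
    criteria: list[str] = field(default_factory=lambda: [
        "accuracy", "relevance", "timeliness", "usability", "decision impact"
    ])
    # Monitoring: tracked scores toward each criterion (e.g., usability ratings, KPI deltas)
    monitoring_measures: dict[str, float] = field(default_factory=dict)

    def formal_review(self) -> dict[str, bool]:
        # Formal review: check each criterion against a chosen threshold (0.7 is arbitrary here)
        return {c: self.monitoring_measures.get(c, 0.0) >= 0.7 for c in self.criteria}


@dataclass
class ProjectManagementRecord:
    # Delivery-oversight metric: useful for schedule/cost control,
    # but not a component of the DSS evaluation framework itself
    development_hours: float = 0.0
```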


Step-by-Step Solution:

  1. Separate product effectiveness (decision-support outcomes) from project efficiency (time and cost).
  2. Identify the elements integral to product evaluation: criteria, monitoring, and formal reviews.
  3. Recognize that “time spent” is peripheral to evaluating the system itself.
  4. Select “means of measuring system-development time spent on the project”.


Verification / Alternative check:

Best-practice frameworks (usability testing, KPI impact, A/B pilots) emphasize outcomes and user adoption, not engineering hours, when judging DSS success.
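
As an illustrative sketch only (the pilot figures below are hypothetical, not real results), success under such a check would be judged by the uplift in decision-quality KPIs between pilot groups, not by the hours logged building the system:

```python
# Hypothetical A/B pilot: judge the DSS by decision outcomes, not engineering hours.
def mean(values: list[float]) -> float:
    return sum(values) / len(values)


# Hypothetical decision-quality scores per group (e.g., from expert scoring of decisions)
with_dss = [0.82, 0.78, 0.85, 0.80]      # decisions made with DSS support
without_dss = [0.70, 0.74, 0.69, 0.72]   # control group without the DSS

kpi_uplift = mean(with_dss) - mean(without_dss)
print(f"Average decision-quality uplift attributable to the DSS: {kpi_uplift:.2f}")
```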


Why Other Options Are Wrong:

Criteria, monitoring, and formal reviews are core to systematic evaluation.

“All of the above” is incorrect because one option (measuring development time spent) is not a key part of the evaluation itself.


Common Pitfalls:

Equating on-time delivery with effectiveness; a timely but unhelpful DSS still fails users.


Final Answer:

means of measuring system-development time spent on the project
