Engineering economic decision problems encountered in practice are embedded in information-rich environments where large volumes of data are available from multiple sources. However, the information most relevant to solving the problem may be unavailable, inaccessible, inaccurate, or uncertain. In contrast, typical engineering economy textbook problems present only the relevant information, in a convenient format. To help bridge the gap between textbook and practice, we engage student teams in a series of ill-structured problems. Teams work in an online Problem Solving Learning Portal (PSLP) that provides access to a variety of information resources containing both relevant and irrelevant information. In one problem instance, some information relevant to the solution must be obtained from an external resource that is neither available nor mentioned in the PSLP.

Student work in the PSLP is organized into successive stages: specifying decision criteria, stating assumptions, expressing the solution in a spreadsheet file with a written rationale, and conducting a sensitivity analysis on a single variable the team judges to be critical. In addition, teams copy and paste information from the resources they see as relevant into a "working memory" repository.

We explore different methods for assessing students' ability to select relevant information. Direct measures include simple counts of "hits" and "false alarms" in the working memory, which are assessed as part of the grading rubric and analyzed using signal detection theory. The choice of parameter(s) for the sensitivity analysis can be considered an indirect measure, because the most relevant information is that which best predicts the most critical parameter (i.e., the parameter with the greatest impact on the decision criterion). The online environment also tracks which information resources the student teams visit and when.
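As a minimal sketch of the signal detection analysis mentioned above, the following Python fragment turns raw "hit" and "false alarm" counts from a team's working memory into the standard sensitivity index d'. The counting scheme, the log-linear correction, and the example counts are illustrative assumptions, not the rubric actually used in the study.

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Signal detection sensitivity index d' from raw counts.

    hits: relevant items the team copied into working memory
    misses: relevant items the team omitted
    false_alarms: irrelevant items the team copied
    correct_rejections: irrelevant items the team correctly ignored
    """
    # Log-linear correction (add 0.5 to each cell) keeps the z-scores
    # finite when an observed rate is exactly 0 or 1.
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf  # inverse standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Hypothetical team: copied 8 of 10 relevant items and 2 of 15 irrelevant ones.
score = d_prime(8, 2, 2, 13)
```

A higher d' indicates better discrimination between relevant and irrelevant information, independent of how liberally a team copies items into working memory.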
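The one-variable sensitivity analysis above can likewise be sketched in a few lines: vary each parameter one at a time around its base value and rank parameters by their impact on the decision criterion (here, net present value). The project model, parameter names, and ±20% swing are hypothetical illustrations, not the course's actual problem data.

```python
def npv(rate, cash_flows):
    """Net present value of cash flows indexed by period (period 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def one_at_a_time_sensitivity(base, model, swing=0.2):
    """Vary each parameter +/-swing (as a fraction) while holding the others
    at base values; return parameters ranked by their swing in model output."""
    impact = {}
    for name, val in base.items():
        lo = model(**{**base, name: val * (1 - swing)})
        hi = model(**{**base, name: val * (1 + swing)})
        impact[name] = abs(hi - lo)
    return sorted(impact.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical project: initial cost, uniform annual revenue over 10 years,
# and a discount rate, with NPV as the decision criterion.
def project_npv(cost, revenue, rate):
    return npv(rate, [-cost] + [revenue] * 10)

base = {"cost": 100_000, "revenue": 18_000, "rate": 0.10}
ranked = one_at_a_time_sensitivity(base, project_npv)
```

The top-ranked parameter is the one a well-calibrated team would choose for its sensitivity analysis, which is what makes that choice usable as an indirect measure of information selection.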
Data collected from a large engineering economy course are used to evaluate the effectiveness of these assessment methods. © American Society for Engineering Education, 2007.