departments, student chapters, blogs and architecture firms in the USA were approached. The survey comprised 22 questions and took approximately 8 to 12 minutes to complete. A welcome page explained the objective of the survey, informed participants of the approximate duration and the intended target group, and listed the tools to be covered. The questionnaire was structured into four parts:
The survey sampled the architects’ community with a prior interest in green building design and energy performance. Participants who did not fall into these criteria were excluded in order to ensure meaningful cross-discipline benchmarking.
The first part collected basic information concerning respondents’ current position and the types of software they use for energy simulation and CAD/3D modeling.
The second and third parts of the survey focused on two key criteria: (1) the usability and information management (UIM) of the interface and (2) the integration of an intelligent design knowledge-base (IIKB). Respondents were asked not only to judge the relative importance of these criteria, but also to share their experience by comparing the ten selected tools against them.
In the fourth part, prior to the closing message, respondents were asked to rank the most important criteria for a BPS tool to be considered ‘Architect Friendly’.
An open question followed every part of the questionnaire to allow respondents to share their thoughts and comments. At the end of the survey, respondents were invited to post their ideas about current limitations that should be avoided and improvements that should be integrated in the future development of BPS tools.
Fig. 2. Respondents’ current position & affiliation
Which of the following BPS tools do you use? Next, respondents were asked which BPS tools they use regularly across the different design phases; they could choose more than one tool. Figure 3 shows respondents’ choices. Over 64% of respondents (159 individuals) reported using ECOTECT, making it the most commonly used tool among respondents. 123 individuals (49% of all respondents) reported using eQUEST. Surprisingly, both EP and the EPSU plug-in were used by 32% of respondents. IES VE was used by 24% of all respondents, E10 by 22.6%, DB by 21.6%, DOE-2 by 19.2%, HEED by 18% and GBS by 10.8%. Although these figures cannot serve as an indicator of market penetration, they at least reflect these respondents’ preferences among simulation tools.
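As a quick arithmetic check on these figures, the reported shares follow directly from the stated counts over the 249 eligible respondents. A minimal sketch, using only the two counts given in the text (the question was multi-select, so shares across all ten tools need not sum to 100%):

```python
# Recompute the reported usage shares from the raw counts in the text.
ELIGIBLE = 249  # eligible respondents reported in the survey

# Counts explicitly stated in the text; the remaining tools are
# reported only as percentages, so they are omitted here.
counts = {"ECOTECT": 159, "eQUEST": 123}

shares = {tool: round(100 * n / ELIGIBLE) for tool, n in counts.items()}
print(shares)  # → {'ECOTECT': 64, 'eQUEST': 49}
```

Both values agree with the percentages quoted above (over 64% and 49%, respectively).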
RESULTS

Hosted online, the survey was available from mid December 2008 until mid January 2009. With the assistance of the AIA, it attracted visitors with over 249 eligible respondents; researchers and engineers were excluded. The survey cannot be proven to be representative of any given population. Despite that, with this amount of responses, patterns can emerge and cross-discipline analysis is possible.

Fig. 3. BPS tools used by respondents

Part 1: Basic Information

How do you describe your current position? Figure 2 shows the six available categories from which respondents could choose. The majority of respondents were designers (19.2%).
Undergraduate students accounted for 6%, while 4.8% of the respondents were intern-architects. Moreover, half of the respondents were LEED accredited professionals and almost a quarter (24%) were AIA accredited architects.
For which design phase would you use the following programs? In a follow-up question, respondents were asked to indicate the design phases in which they use each tool. Figure 4 shows the typical usage phase for the ten tools according to respondents’ answers. GBS, E10, HEED and DB were regarded as tools used in the early design phases. ECOTECT, eQUEST and IES VE were regarded as tools that can be used during the conceptual and design development phases. Finally, DOE-2, EP and EPSU were considered