
Home Energy Score Qualified Assessor Analysis Results from the Qualified Assessor Questionnaire and Pilot Summit

Table of Contents

Summary
Background
Methodology
Findings
Conclusions
Attachments

Summary

In 2011, the U.S. Department of Energy (DOE) pilot tested its Home Energy Score program and scoring tool in ten locations in order to collect information about its applicability and acceptance in a number of different climate regions and energy programs. As part of the pilot testing, the energy auditors used by the various pilot locations, known as qualified assessors or assessors, were questioned about the tool and the associated training and support materials. In particular, assessors were asked for feedback concerning the time required for data acquisition and data entry, the energy improvement recommendations created by the tool, assessor training, and the usefulness of the score to homeowners.

In terms of the time required to conduct the assessments and use the software tool, assessors indicated that data collection and entry time requirements were reasonable. Most assessors were able to collect the data and enter it into the tool within one hour. For assessors collecting additional data not required by the tool, collection of the tool-specific data averaged 17 minutes. Of the four assessors collecting only data for the Home Energy Score, 75 percent indicated that data collection and data entry together required less than a half hour. All assessors reported needing less than 30 minutes to enter data into the tool.

Assessors noted problems concerning a lack of consistency between the tool's recommendations and those generated by their particular energy program. They also commented that, in some cases, the scoring tool recommended installation of systems or elements that were already in place. Assessors further noted that the payback values provided by the scoring tool for individual recommendations did not always agree with local estimated costs of improvements.

Assessors were not always clear about what information was required by the tool and how it should be collected. Some assessors also demonstrated a lack of clarity regarding the intent of the Home Energy Score program and the purpose of the scoring tool.

DOE used the feedback provided by the assessors and pilot partners to make changes to the scoring tool, training materials, assessor tests, and the overall program. A summary of these changes, now incorporated into the Home Energy Score Program and Scoring Tool, is provided in the conclusions section of this report.[1]

Background

The U.S. Department of Energy developed the framework for the Home Energy Score, a residential energy efficiency program, in response to key recommendations of Vice President Biden's Middle Class Task Force, as outlined in the report Recovery Through Retrofit.[2] The overall objective of the program is to facilitate a new level of investment in home energy efficiency by overcoming a range of informational and market barriers frequently identified as hindering these investments. The Home Energy Score is intended to support and complement existing residential retrofit programs being implemented across the country.

[1] Additional information concerning DOE's refinements to the program and tool is provided on the DOE Home Energy Score web site at www.homenergyscore.gov.
[2] Middle Class Task Force. 2009, October. Recovery Through Retrofit. Council on Environmental Quality. 12 pp. http://www.whitehouse.gov/assets/documents/recovery_through_retrofit_final_report.pdf

As part of the process for evaluating the elements of the Home Energy Score, ten pilot locations were selected to administer the program and collect information to report back to DOE regarding the various facets of the training, the scoring tool, the recommendations provided by the program, ease of implementation, and assessor reaction. Pilot partners performed Home Energy Score assessments for their local clients between March and July 2011. Concurrently, homeowner assessments were conducted to gauge occupants' reactions to the score and associated information, such as their motivation and the information they would need in order to pursue energy improvements for their home.

A pilot summit was conducted July 19 and 20, 2011, where representatives from the pilot organizations were invited to meet, discuss their impressions of Home Energy Score implementation, and provide suggestions for improvements prior to a national launch of the program later in the year. Pilot participants were asked to give feedback on the functionality of the tool, the ability of the program to motivate homeowners to perform energy improvements, and areas for improvement for the tool and overall program. Specifically, the results from the summit were categorized by:

- Pilot feedback on the usefulness of the scoring tool, from both assessor and administrative standpoints
- Analysis of the skills required of assessors in order to effectively score a home in the Home Energy Score program
- The ability of the Home Energy Score program to integrate with other energy efficiency efforts
- Administration of the Home Energy Score program at the local level
- The homeowner experience with the Home Energy Score process and results
- Applications for the Home Energy Scoring Tool to link with other energy audit-based tools or programs

This paper focuses specifically on reactions from assessors, gathered through a questionnaire and direct interaction with assessors and the pilot representatives administering the programs.

Methodology

The Home Energy Score was developed with the goal of providing homeowners with reliable information concerning their home's energy performance, at a relatively low cost, and in a format that would encourage investment in energy improvements. The development process was informed through the use of focus groups, a federally published request for information, and feedback from industry conferences and webinars.

Once developed, DOE sought to pilot the program in various areas of the United States to represent different climate zones, construction types, administrative organizations, and credentials of administering energy assessors. Ten locations ultimately participated in the pilot effort and are represented in Figure 1.

One pilot partner in Utah (not shown in Figure 1) provided data but joined late in the process and did not provide the score information to homeowners.

Figure 1. Home Energy Score Pilot Locations

The ten pilots were asked to score at least 100 homes using staff certified as Building Performance Institute (BPI) Building Analysts or as Home Energy Raters through the Residential Energy Services Network (RESNET). These credentialed staff also had to pass a test administered by DOE to demonstrate proficiency in the Home Energy Scoring Tool in order to become Qualified Assessors. Only Qualified Assessors were given access to the Home Energy Scoring Tool. Pilot organizations were asked to provide information on the purpose of the Home Energy Score pilot process (Attachment A) to their Qualified Assessors and prospective homeowner clients.

Assessors were asked to fill out a questionnaire after scoring at least 10 homes in order to obtain their estimates of the time required to gather and input the data needed to generate the score, as well as their thoughts and recommendations on the results generated by the tool (Attachment B). Homeowners were also given a questionnaire that gauged consumer impressions of the process and of the information provided by the energy assessment.

In July 2011, a two-day summit was held where the pilot organizations gathered to collectively share organizational opinions on the Home Energy Scoring Tool and program, as well as provide recommendations for improvements prior to the national launch of the program (Attachment C). During a facilitated discussion, each pilot was asked to report on the basic characteristics of its effort, such as the number of assessors and homes scored; how homes were selected; lessons learned from the pilot process regarding the assessors and the scoring tool; any analysis done regarding the Home Energy Score and observations thought important to report to DOE; and recommendations for the program prior to a national launch.

The summit included attendees representing DOE and national laboratory staff, who reported on efforts underway related to the Home Energy Score and other information potentially of interest to pilot participants. Feedback obtained during the summit was recorded by the facilitator during flip chart exercises and by note takers.

Findings

Contractor experience was gauged primarily from the Qualified Assessor questionnaire and from verbal reporting by pilot representatives. Of the 31 Qualified Assessors participating in the pilot programs, 20 returned questionnaires, representing a 65 percent response rate. The responding assessors accounted for seven of the ten pilot locations (IL, MA, MN, OR, SC, TX, and VA).

Time Requirements

In interpreting responses to questions about the time needed to collect data for tool input, it is important to note that 16 (80 percent) of the responding assessors also collected additional data for their particular pilot program's administrative needs. For these assessors, the average time to collect the information required for the Home Energy Scoring Tool and not captured by their other program activities was 17 minutes, with responses ranging from no additional time to up to an hour. Of the four assessors who collected information only for the Home Energy Score program, two indicated 26 to 30 minutes, one indicated less than 15 minutes, and the fourth indicated needing greater than 30 minutes to capture the information required by the tool.

Responses regarding the time required for data entry into the Home Energy Scoring Tool appear to show that most assessors can enter the compiled home characteristic data in less than 20 minutes (Figure 2). Three assessors indicated that more time was required for data entry, and one assessor did not respond to the data entry question.

Data Requirements

Responses related to ease of collection indicated that physically obtaining the necessary data was not the main difficulty; instead, assessors asked for clarification of procedures or approaches where instruction was not explicitly given. For example, 75 percent of respondents indicated that the data input requirements were clear, but respondents who felt the input requirements were not clear listed the following items as confusing:

- Conditioned floor area
- Number of stories above grade
- Roof absorptance
- Multiple cases for an input field, such as the foundation of a home with both a slab and a crawlspace, or different wall assemblies each with unique insulation
- Solar Heat Gain Coefficient
- HVAC system efficiency values
- Duct location
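To make the ambiguity concrete, the sketch below models the kind of home-characteristics record these inputs belong to, using the fields assessors cited. The field names, types, and structure are illustrative assumptions for this discussion, not the scoring tool's actual input schema.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class WallAssembly:
    construction: str          # e.g., "wood frame"
    insulation_r_value: float  # assemblies may differ within one home

@dataclass
class HomeCharacteristics:
    # Fields assessors reported as confusing to collect or enter
    conditioned_floor_area_sqft: float
    stories_above_grade: int
    roof_absorptance: float              # fraction, 0.0-1.0
    foundation_types: List[str]          # e.g., ["slab", "crawlspace"]
    wall_assemblies: List[WallAssembly]  # one entry per distinct assembly
    window_shgc: float                   # Solar Heat Gain Coefficient
    hvac_efficiency: Optional[float]     # e.g., AFUE for a furnace
    duct_location: Optional[str]         # e.g., "attic", "conditioned space"

# Example record for a home with mixed foundation and wall types -- the
# "multiple cases for an input field" situation assessors found unclear.
home = HomeCharacteristics(
    conditioned_floor_area_sqft=1850.0,
    stories_above_grade=2,
    roof_absorptance=0.75,
    foundation_types=["slab", "crawlspace"],
    wall_assemblies=[WallAssembly("wood frame", 13.0),
                     WallAssembly("wood frame", 19.0)],
    window_shgc=0.30,
    hvac_efficiency=0.80,
    duct_location="attic",
)
```

Modeling the foundation and wall fields as lists makes explicit that a single home can legitimately carry several values for one input, which is precisely where assessors wanted clearer instruction.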

Figure 2. Time Required by Assessor to Enter Data into Tool

Assessors indicated that homes with the following characteristics were problematic to collect information on and to score:

- Homes with multiple HVAC systems
- Homes with complex roof, wall, or foundation elements, such as those with both conventional and cathedral ceilings, split-level homes, or homes containing multiple foundation types
- Homes with renewable energy generation or geothermal heat pumps
- Newer homes, or homes with evidence of properly installed weatherization features
- Homes equipped with electrical-resistance heating
- Log cabins or manufactured housing

Half of the respondents indicated the tool was effective in assessing different home types, based upon the home characteristic issues listed above.

Scoring Tool Energy Improvement Recommendations

Assessors noted that the recommendations provided by the scoring tool were often not in agreement with the improvement recommendations made by the local energy efficiency programs, which was difficult to reconcile with homeowners. Additionally, assessors commented that the recommendations were too generic and left homeowners wanting more detailed information. Recommendations for improved appliances or systems that were already in place further clouded understanding of the suggested improvements generated by the tool.
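The two recommendation problems assessors raised, suggesting measures already in place and payback figures that did not match local costs, map onto a simple prioritization loop. The sketch below illustrates that logic only; the data structures, cost and savings figures, and payback cut-off are invented for illustration and are not the scoring tool's actual implementation (the conclusions section notes the real tool was later revised along these lines).

```python
from dataclasses import dataclass

@dataclass
class Improvement:
    name: str
    installed_cost: float   # hypothetical local cost estimate, $
    annual_savings: float   # hypothetical modeled savings, $/year

def recommend(candidates, existing_features, max_payback_years=10.0):
    """Drop measures already present, then rank by simple payback.

    Simple payback = installed cost / annual savings. Measures whose
    payback exceeds the cut-off are excluded from the report entirely.
    """
    viable = [m for m in candidates
              if m.name not in existing_features and m.annual_savings > 0]
    ranked = sorted(viable, key=lambda m: m.installed_cost / m.annual_savings)
    return [m for m in ranked
            if m.installed_cost / m.annual_savings <= max_payback_years]

# The home already has attic insulation, so that measure is filtered out
# rather than recommended again; the long-payback window job is cut off.
candidates = [
    Improvement("attic insulation", 1200.0, 250.0),
    Improvement("duct sealing", 600.0, 120.0),
    Improvement("window replacement", 9000.0, 300.0),
]
existing = {"attic insulation"}
for m in recommend(candidates, existing):
    payback = m.installed_cost / m.annual_savings
    print(f"{m.name}: {payback:.1f}-year payback")
```

If the cost figures feeding such a calculation are national averages rather than local prices, the computed paybacks will disagree with assessors' local estimates, which is consistent with the mismatch reported here.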

The assessors indicated a less than neutral response to the questions regarding whether the score helped them make better recommendations or helped homeowners understand the recommendations. However, most assessors did not find the recommendations hard to understand.

Assessor Qualifications

The pilot program required participating organizations to use assessors certified as BPI Building Analyst Professionals or as HERS Raters under the RESNET-accredited provider system. The vast majority of assessors were BPI Building Analysts, with one pilot (Texas) including one HERS Rater as an assessor. The South Carolina pilot listed an assessor who possessed both credentials, but this individual did not respond to the assessor questionnaire, so no further information about this individual is known.

Whether HERS Raters and BPI Building Analysts were similarly qualified to score homes was not studied as part of the pilot, given that only one of the assessors was a HERS Rater. However, a small sample of homes was rescored by separate assessors to examine whether homes can be scored consistently by different, similarly trained individuals. In all but one of the rescored homes, the two independent scores were either equivalent or within one point of each other. Scoring consistency will be studied further in the next phase of program implementation as part of quality assurance procedures, as will the issue of different certifications.

Program Training & Materials

Some assessor comments demonstrated insufficient program training or a failure of the assessor to comprehend and retain information from the training. The data are not sufficient to determine whether these comments reflect weaknesses in assessor qualifications or deficiencies within the tool or its associated training and support materials. Based on responses and discussions with assessors and pilot administrators, some assessors needed more descriptive training materials concerning the program and/or the tool's input requirements. A number of assessors noted the importance of having visual examples of data requirements and collection procedures in training materials.

Based upon data from the questionnaires, assessors did not feel they were equipped to properly explain the score or the process in terms homeowners could understand. Seven of 19 respondents gave a negative response to a five-point Likert scale question asking about the assessor's understanding of the recommendations generated by the scoring tool. The average response to this question was 2.25, where 1 indicated strong understanding and 5 strong misunderstanding of the recommendations. Table 1 describes results from questions regarding homeowner communication and tool results.

It is not clear what factors led to the negative responses noted in Table 1. However, drivers could include inadequately prepared or presented training and introductory materials; pilot administrators not properly explaining to their assessors how the Home Energy Score can be integrated with their existing programs; and assessors not taking the time to read or understand program materials.
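The statistics reported above and in Table 1 are straightforward aggregations of five-point Likert responses. The short sketch below shows that computation; the response values in it are invented for illustration and are not the pilot's actual questionnaire data.

```python
# Aggregate five-point Likert responses the way the report's summary
# statistics are formed: mean, range, and a count of negative answers.
# These values are invented illustration data, not the pilot's results.
responses = [1, 2, 2, 3, 1, 2, 4, 3, 2, 1, 2, 3, 2, 2, 1, 3, 2, 4, 3]

n = len(responses)                          # respondents (here, 19)
average = sum(responses) / n                # mean on the 1-5 scale
low, high = min(responses), max(responses)  # observed range

# For the understanding question, 1 = strong understanding and
# 5 = strong misunderstanding, so 4s and 5s count as negative responses.
negative = sum(1 for r in responses if r >= 4)

print(f"average {average:.2f} ({n} respondents), "
      f"range {low}-{high}, {negative} negative")
```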

Table 1. Usefulness of Scoring Tool Recommendations
(Each question used a five-point scale: 1 = Strongly disagree, 5 = Strongly agree.)

Question Topic                                               Average Response (Respondents)   Range of Responses
Allowed Assessor to make better recommendations              1.6 (19)                         1-3
Helped homeowners understand recommendations                 2.3 (16)                         1-4
Score recommendations difficult for Assessor to understand   2.3 (20)                         1-5

Other Feedback

Recommendations from assessors regarding tool improvements were often contradictory (e.g., "remove inputs that don't affect energy use, such as fuel type"; "add more inputs, such as fuel type"; "add more house characteristics as long as it does not make the tool more complicated"). Several contractors indicated a desire to have the tool recognize renewable energy components such as photovoltaic systems or geothermal heat pumps.

Pilot representatives raised concerns that the scoring tool is web based, which could limit the ability of assessors to provide the Home Energy Score Report during their visits to homes in some remote areas. This concern was not noted specifically in responses to the questionnaire.

Conclusions

The following list summarizes the changes DOE made to the program specifically as a result of the feedback provided by assessors and pilot partner organizations.

Training
o DOE expanded training and informational tips to include more specific instructions on data collection and calculation procedures, including window measurement; locating efficiency data on appliances such as furnaces, water heaters, and air conditioners; and defining the thermal efficiency of dissimilar elements such as walls with different construction details.

o The program is developing guides that include pictures to illustrate concepts such as conditioned and unconditioned areas of a house.
o Prior to implementing the next phase in 2012, DOE significantly enhanced assessor training and supplemental materials for program partners and assessors, including information regarding the intent of the Home Energy Score program and how it can be used with existing energy programs or as a stand-alone program.

Tool
o The Home Energy Score report no longer includes payback values alongside individual energy improvement recommendations; however, the tool continues to use payback calculations to prioritize and set cut-offs for listed recommendations.
o LBNL revised the scoring tool's approach to developing recommendations so that it no longer recommends improvements that have already been made in the home (i.e., the new version of the tool evaluates improvement options strictly with respect to the existing home characteristics).

Overall Program
o DOE significantly enhanced materials and ongoing outreach efforts with the Home Energy Score Partners who manage the assessors working under them. Additional program materials are under development and will continue to be updated to address the needs of program partners as they are raised.

Other refinements to the program and the scoring tool, made as a result of separate analysis, are delineated in complementary reports available on the Home Energy Score web site. For specific information on the current Home Energy Scoring Tool, LBNL documents its methodology, the calculations used, and the changes made since the tool was created in 2010 on its web site at https://sites.google.com/a/lbl.gov/hes-public/home-energy-scoring-tool.

Assessor comments, as well as other pilot findings, were greatly beneficial in guiding the development of the current Home Energy Score program. As part of future program development, DOE will assess how best to address additional pilot feedback (e.g., whether and how to incorporate on-site renewables in the score) as well as findings from future analysis to be conducted in the next phase of program implementation.

Attachments

Attachment A: Pilot Supplementary Information
Attachment B: Assessor Questionnaire

Attachment A. Pilot Supplementary Information


Attachment B. Assessor Questionnaire
