
Remapping the DNA

Editor's note: This story was originally featured in the December issue of DS News, out now.

Thirty years ago, consumer-based computer technology began to touch our lives in substantially different ways. Microsoft Office was first released in 1989. That same year, the first of 24 satellites was launched to support the Global Positioning System (GPS) that we now rely on daily for navigation.

During this same period, the current methods of appraising property were initially fortified under the rules emanating from the Financial Institutions Reform, Recovery, and Enforcement Act (FIRREA). Congress formed the Appraisal Subcommittee, as well as the Appraisal Foundation, which authored and continues to update the Uniform Standards of Professional Appraisal Practice (USPAP). It is hard to imagine, but prior to FIRREA, no federal rules regarding appraisal practices existed.

The government-sponsored enterprises (GSEs) were coming of age with standards for loan production and servicing. At that time, appraisers were using notepads, manual tape measures, and bulky, low-resolution cameras whose photos were often taped to the appraisal reports. Many of the forms still in use today—including the infamous 1004 Uniform Residential Appraisal Report—were put into use as dictated by the GSEs.

While the entire mortgage process from application to servicing is being challenged to operate in a more efficient manner, one of the hottest topics in recent months is how best to appraise the value of a home. 

Conventional wisdom and regulatory history are being challenged across the industry as a direct result of the technology enhancements in data and analytics (DnA) now available to the marketplace.

What has evolved?

The appraiser operating in a local market has historically been considered the fiduciary, with access to the most, and the best, real estate data to assess the value of a home. Today, with the broad expansion of DnA, that is no longer always the case.

Though Automated Valuation Models (AVMs) have been available for two decades, their use was generally focused on low-dollar, low-risk loans and portfolio assessment. Much of that has changed. Today’s mature AVM modeling can provide not only a value, but also a value band, comp adjustments, property complexity scores, Home Price Index statistics, desirability, walk scores, enhanced confidence scores, and more. Moreover, the GSEs have assembled enough information to now be comfortable offering, in some cases, to acquire loans while waiving the traditional appraisal process altogether.
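To make the idea concrete, here is a minimal sketch of what one of these enriched AVM outputs might look like as a data record. The field names and the value-band calculation are illustrative assumptions for this article, not any vendor's actual schema.

```python
from dataclasses import dataclass

@dataclass
class AVMResult:
    """A hypothetical AVM output record; all field names are illustrative."""
    estimated_value: float   # the model's point estimate of the home's value
    confidence_score: float  # 0.0-1.0, how confident the model is in the estimate
    fsd: float               # forecast standard deviation, as a fraction of value

    def value_band(self) -> tuple[float, float]:
        """Low/high range implied by the forecast standard deviation."""
        spread = self.estimated_value * self.fsd
        return (self.estimated_value - spread, self.estimated_value + spread)

# Example: a $300,000 estimate with an 8% forecast standard deviation
result = AVMResult(estimated_value=300_000, confidence_score=0.92, fsd=0.08)
low, high = result.value_band()  # (276000.0, 324000.0)
```

The point of the sketch is that a modern AVM returns more than a single number: the band and confidence score let a lender decide whether the estimate is reliable enough to waive a traditional appraisal.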

Some in the industry are challenging both this pivotal change and its velocity, asking: “Are we, just 10 years later, lowering lending standards again? Will this serve as a pivotal point for the next market collapse?”

Many who have participated in the appraisal process are quick to remind the industry that the home is not just a commodity but its own ecosystem, and that each home maintains its own unique value based on a variety of constantly changing variables. Additionally, part of the value of a home is based on the surrounding neighborhood and the larger local economy. Many question whether the automated models properly account for these nuances. These are legitimate considerations when redeveloping the standards—and they serve as the foundation of the debate.

Can we truly reduce this myriad of moving parts to bits of data and analysis and still make sound lending decisions? For the naysayers, we need only look back at the FICO score, which revolutionized the credit-reporting industry. The emergence of FICO reduced mortgage-credit production time from minutes and hours to just seconds and brought the cost down from $60 to less than $10. There is an easy argument to make that a borrower’s propensity to pay is potentially more volatile than the fluctuation of a home’s value.

As a practical matter, reliance on technology is accelerating daily, and at an exponential rate. The DnA in our industry is no different, and that will likely not change. Machine-learning technology will bring additional velocity to this evolution.

The Human Element

Appraisers and AVMs are not mutually exclusive. AVMs and the data that accompanies them provide a tremendous snapshot of the subject collateral, and often the surrounding environment, but they are not always the entire answer.

As confidence in the models increases, so too will AVM usage, but there remain attributes that are less tangible. For properties with unique attributes, there is, and will continue to be, a need for humans to play a role in determining the value of a property.

AVMs do not easily or with high confidence make what we often consider to be subjective decisions or observations: 

  • Is the house truly in reasonable market condition? 
  • What is the remaining life of the roof or the painted surfaces? 
  • What is the quality of the landscape?
  • Is there a value to the view of the water?

There will always remain a significant volume of homes that have unique attributes that require human intervention. The appraiser is able to make those determinations that a machine cannot. 

Will the work appraisers do change? Yes. Enhanced camera and video technology is also impacting the appraisal process. There are now smartphone applications that can take videos of a house and digitally “stitch” the rooms together so that the ultimate viewer can “walk” through the house from a computer—this is no different than the street view available today on Google Maps. There will be a future when the person holding that video camera is not the appraiser. It may be the homeowner, the real estate agent, or even an Uber driver, while the appraiser assesses quality and condition from the desktop. The metadata attached to the video documents the time and location that the video was recorded and brings a level of validation that is not always available today. 

Appraisers in the near future will also start to receive what I refer to as “Case Files” from which to operate. The Case File will come from the lender, appraisal management company (AMC), or other third party and will serve as the preliminary dataset to be used for any specific transaction. It is not much different than what an appraiser assembles today by hand when accessing Multiple Listing Services and Public Records data. The primary difference is it will arrive in an automated form, will have substantially more data, and will provide a preliminary analysis of the larger market dynamics. It will serve as the precursor to the Work File that the appraiser is required to develop and maintain under USPAP.  The Case File will include all the relevant comparable properties within a given area.

These comparable properties will have been objectively sorted, ranked, and stacked according to local market attributes. The Case File will include average cost adjustments for the neighborhood, as well as other regression analysis, to assist in the development of a final value. The appraiser will then have the ability to override any or all of the data based on his or her local knowledge, with the opportunity to comment on why the change was made. The result will be a neater, more thorough, and more consistent form of reporting. This concept of beginning with a Case File is in the market today and should be explored and pursued by appraisers.
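The sorting and ranking of comparables described above can be sketched as a simple similarity score: comps closer in size and location to the subject property rank higher. The fields and weights below are illustrative assumptions for this article, not an industry-standard model.

```python
from dataclasses import dataclass

@dataclass
class Comparable:
    """A hypothetical comparable-sale record; fields are illustrative."""
    address: str
    sale_price: float
    gla_sqft: int          # gross living area in square feet
    distance_miles: float  # distance from the subject property

def rank_comparables(subject_gla: int, comps: list[Comparable]) -> list[Comparable]:
    """Sort comps so the most similar come first.

    The score blends relative size difference with distance; lower is
    more similar. The 0.1 weight on distance is an arbitrary example.
    """
    def score(c: Comparable) -> float:
        size_gap = abs(c.gla_sqft - subject_gla) / subject_gla
        return size_gap + 0.1 * c.distance_miles
    return sorted(comps, key=score)

# Example: a 2,000 sq. ft. subject with two candidate comps
comps = [
    Comparable("9 Elm St", 290_000, 1500, 0.2),
    Comparable("12 Oak St", 310_000, 2100, 0.4),
]
ranked = rank_comparables(2000, comps)  # "12 Oak St" ranks first
```

In a real Case File the ranking would draw on many more attributes (condition, age, site, market trend adjustments), but the principle is the same: the dataset arrives pre-sorted, and the appraiser reviews, overrides, and comments rather than assembling it by hand.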

Technology is affecting each of us in new ways and on a daily basis. All of us must appraise ourselves periodically to determine if we are remaining relevant in the business environment in which we operate. 

How an appraiser operates today and how the appraisal is derived will continue to change over the next few years. The appraisal community will have the opportunity to redefine itself as these changes occur.  

About Author: Mark Johnson
