PROPERTY JOURNAL

Managing real estate data

Why is the quality of data so important in real estate, and how can it be improved?

Author:

  • Kevin Grice

29 March 2020

Whether it's the nefarious collection practices of companies such as Google and Facebook or the apparently infinite possibilities of artificial intelligence and automation, data seems to be on everyone's lips. For those of us in real estate, the promises of the proptech explosion are equally alluring. What's not to like in a beautiful online dashboard that tells you everything you need to know about your portfolio?

However, as my old computer science professor used to tell me, it remains the case that garbage in equals garbage out. And as my own long experience in property management has since proven, too many companies are prepared to put up with poor-quality data when it is entirely in their power to improve it.

But here's the good news: fresh ideas about digital transformation – many of them from UK Government Digital Service (GDS) and its first leader Mike Bracken – can significantly improve data collection and the quality of the processes that follow.

Finding a place for big data

Let's pause to draw an important distinction. Big data is where much of the current buzz is. But the clue is in the name: big data sets need to be very sizeable indeed to be useful.

Only big data has the scale necessary to excite PhD statisticians, to power revolutionary algorithms and machine learning, or to predict one individual's pregnancy on the basis of the online shopping habits of millions of other women.

So while there is a growing place for big data in real estate, it requires automation – or automated sensors – capable of collecting data at massive scale if it is to provide useful insights. In everyday property management, which deals with leases, financial transactions and physical measurements, big data as yet plays little part.

If our data seems mundane by comparison, it still lies at the heart of our businesses. Even a small inaccuracy – missing a critical date, for example – can have a severe impact on profitability and reputation. So above all, good-quality data must be accurate and, ideally, should conform to common standards if it is to be of value.
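
To make that concrete, here is a minimal sketch in Python of the kind of check that catches an inaccuracy such as a missing critical date before it reaches the live system. It assumes dates are held to a common standard (ISO 8601); the record and field names are purely illustrative, not any particular product's schema.

```python
from datetime import date

# Hypothetical lease record; field names are illustrative only.
record = {
    "property_ref": "UK-LON-0042",
    "rent_review_date": "2021-03-25",   # conforms to the agreed
    "lease_expiry_date": None,          # standard: ISO 8601 (YYYY-MM-DD)
}

CRITICAL_DATE_FIELDS = ["rent_review_date", "lease_expiry_date"]

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for field_name in CRITICAL_DATE_FIELDS:
        value = record.get(field_name)
        if value is None:
            problems.append(f"{field_name} is missing")
        else:
            try:
                date.fromisoformat(value)   # enforce the common standard
            except ValueError:
                problems.append(f"{field_name} is not a valid ISO 8601 date: {value!r}")
    return problems

print(validate(record))   # ['lease_expiry_date is missing']
```

A check this simple, run at the point of entry rather than months later in an audit, is often all that stands between a clean portfolio database and a missed critical date.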

Fears of automating mistakes remain

Even if we were to enforce common standards of format or measurement method, however, this would still pale into insignificance compared with how that data is collected – because so much of our data collection remains manual. Ours is a profession that still collects and manages a significant proportion of its data by hand, and will continue to do so for the foreseeable future.

Manual data entry inevitably introduces a disconnect between the data collectors, often surveyors, and the computer systems that then do the processing. At some point, fingers have to connect with keys – and keying always introduces errors. Often, the processes themselves make this worse: the more complicated the process, the more likely the user will make mistakes.

This wastes time in checking, validating and approving data – but it also wastes resources, because the more uncertain you are of your data the less likely you are to automate higher-level processes.

All businesses should be actively addressing these issues, but it rarely happens; generally, it takes a massive business problem to provoke action. That was the situation one of our bigger clients faced not so long ago when it secured its largest ever instruction. Suddenly it was confronted with the need to upload, manage and maintain a massive amount of data all in the space of a few months.

The data related to an international estate of astonishing complexity, and its existing data management practices – built using a combination of off-the-shelf packages such as SharePoint, Workflow and Excel – were simply not going to cut it.

Not only did the project take huge resources to run, but the approach was also completely rigid: there was no way to change approval procedures on a form-by-form basis or add new fields specific to the instruction, and no documentary evidence could be attached or notes added to explain why something had been changed. Worst of all, it was impossibly complicated to use. Given that people around the world had to engage with it immediately and without training, this was clearly a disaster waiting to happen.

Template for refinement

Fortunately, if you're looking to solve the problems of manual data management in a systematic manner there is a solution right in front of us – courtesy of the UK's largest data collector, the civil service. It is the process called digital transformation, and it has been defined and refined by GDS over the past ten years.

The story of GDS and its revolutionary approach to process refinement has been told many times elsewhere, but it remains remarkable. At its heart was the understanding that the internet could change forever the way that a large organisation interacted with its end users – in all sorts of ways, but particularly when it came to data management.

If you're a car owner, you'll have encountered the GDS approach when you last paid your road tax. What was once an unbelievably complicated process, requiring the collation of multiple documents and the completion of complex forms, now takes little more than a couple of clicks and an online payment to update the central Driver and Vehicle Licensing Agency database fully and accurately.

To paraphrase GDS's definition of digital transformation, it is the use of internet-era principles and technologies to create processes that the user prefers to use. In other words, although the internet clearly enables new ways of thinking, digital transformation actually has much more to do with improving how the user experiences the process.

By ending the big IT culture of the past – where, for instance, the civil service awarded vast contracts for software that promised the earth but never delivered – digital transformation has saved UK taxpayers many millions of pounds. Unsurprisingly, it has since been adopted by governments and organisations around the world.

This was the path that my own company, Trace Solutions, chose to follow, in partnership with our client. In its initial incarnation the project was aimed squarely at data management. Following the principles of digital transformation, we put web apps for data collection directly into the hands of the person closest to the source, using whatever internet-enabled device they had to hand. That person might be a surveyor, a solicitor, an accountant, a manager, or even an external investor; but each app would focus on that user’s needs. Once we’d understood those needs, we could optimise the process.

So, for example, data could be stored as a draft, passed automatically along the approval chain and only posted to the live database once it was fully signed off. We stored comprehensive audit trails with commentary and supporting evidence alongside the data. We also allowed the removal of unnecessary fields and the addition of new ones, making some mandatory if required. In short, we made data collection and its subsequent maintenance simpler, to reduce the chance of introducing inaccurate information.
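
As a rough illustration of the shape such a lifecycle can take, here is a minimal sketch in Python. It is not our actual implementation – the names, types and in-memory "database" are all hypothetical – but it shows the essential idea: a record stays in draft, accumulating approvals and an audit trail, and only a fully signed-off record can be posted live.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AuditEntry:
    """One step in the audit trail: who did what, when and why."""
    user: str
    action: str
    comment: str
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class DraftRecord:
    """A record that stays in draft until every approver has signed off."""
    data: dict
    approval_chain: list[str]                         # approvers, in order
    approvals: list[str] = field(default_factory=list)
    audit_trail: list[AuditEntry] = field(default_factory=list)

    @property
    def signed_off(self) -> bool:
        return self.approvals == self.approval_chain

    def approve(self, user: str, comment: str) -> None:
        if self.signed_off:
            raise ValueError("record is already fully signed off")
        expected = self.approval_chain[len(self.approvals)]
        if user != expected:
            raise PermissionError(f"awaiting approval from {expected}")
        self.approvals.append(user)
        self.audit_trail.append(AuditEntry(user, "approved", comment))

    def post_to_live(self, live_database: list) -> None:
        # Only fully approved data ever reaches the live database.
        if not self.signed_off:
            raise ValueError("record is still in draft")
        live_database.append(self.data)
```

In use, a draft would pass along the chain – record.approve("surveyor", "measured on site"), then record.approve("manager", "checked against the lease") – with every step and its commentary preserved in the audit trail, and post_to_live succeeding only once the chain is complete.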

"Too many companies are prepared simply to put up with poor-quality data when it is entirely in their power to improve it"

Implementing improvements

It is clear that digital transformation, as defined by GDS, can be an immensely powerful way to improve the quality of real-estate data, wherever people are involved in its collection or maintenance. This encompasses the vast majority of data we work with as real-estate professionals on a day-to-day basis.

However, as GDS founder Mike Bracken points out, although digital transformation is not a complicated concept to grasp it can be very hard to implement. Anyone who seeks to change long-established ways of doing things will encounter resistance from those who feel threatened.

Few organisations will have the resources, skill sets or even the will to attempt such a project in house, so clarity of thinking about your data is needed from the outset. Do you need all the data you currently collect? Is the investment to improve data management actually worth it compared to the scale of the potential losses if you don’t?

Whichever way you choose to answer those questions, there is no doubt that digital transformation will improve data quality far more than traditional, procedural methods such as Six Sigma process design. And as for routine data audits – well, somehow you just never seem to get around to them.

For our large client, the investment in digital transformation has certainly paid off. As mobilisation began, cloud-based data entry apps were distributed automatically to the team of people doing the job. These people had never used any of our systems and yet they were able to enter and subsequently maintain the data on no fewer than 10,000 separate properties, all in a fraction of the time it would have taken using traditional systems. Best of all, the client knew that all this data was accurate.

"Anyone who seeks to change long-established ways of doing things will encounter resistance from those who feel threatened"

kevin.grice@tracesolutions.co.uk


@trace_solutions
