The business community at large has entered the Second Modern Age of Tech, where industry-specific programs and systems are replacing more generalized legacy software. In real estate investment, the ability to source and analyze opportunities quickly and efficiently, then share those findings internally, all while operating at scale, has become a true differentiator in a hypercompetitive global real estate market estimated to exceed $10 trillion in value.
As your investment firm’s first line of deal sourcing, analysts are under tremendous pressure to sort through copious amounts of data on a rolling basis, while distinguishing viable investment opportunities from those that don’t pencil out.
Once they’ve sifted through their most recent batch of opportunities, analysts typically narrow the field to the five to 10 deals they have determined are worth further examination. How, then, do they actually track that pipeline, share the data with their teams, and send it up to the C-Suite for that deeper dive?
In most companies, data collection is a tedious, time-consuming process in which analysts open a shared file and manually input new data. Because this process recurs with every batch, analysts spend a considerable amount of time inputting data rather than analyzing it.
Once the data input is complete, the analyst flags the most relevant data and deals, then manually transfers the remainder of their prep work into a separate folder for future review. Because the analyst deemed these data sets and deals ‘not relevant’ on first pass, correctly or incorrectly, they will remain buried in that folder, a data graveyard on the analyst’s individual computer, never to be reviewed again.
The analyst may also intend to use this ‘not relevant’ data as comps against future deals, but this, too, rarely happens, rendering the data essentially lost and inert.
If you are the chief investment officer or a comparable C-Suite officer at your firm, your team of analysts might have thousands of deals stored individually on their respective computers. But as the CIO relying on this internal process, you and the rest of your team don’t have access to those deals and data, and may not even be aware of the information your team has collected.
Consider this scenario: what if a deal opportunity that one of your analysts initially rejected and stored in that data graveyard resurfaces a month, or even a year, later? Because market conditions have changed or the analyst is looking more closely, the opportunity now appears, at least on the surface, viable for a deeper dive, yet the new offering includes less relevant data than the original did.
No one on your team, including you, will even know to look back at the previous iteration of the offering memorandum (OM), where in your system to look, or whom to consult about whether the deal had surfaced before. Without a programmatic system that’s accessible to your entire team, that historical data might as well not exist. And yet… it does.
Another common conundrum: most real estate investment firms are set up such that each analyst works more or less in a silo, unaware of what information the other analysts do or do not possess. If the analyst who initially reviewed a deal subsequently leaves your firm and there’s no trail to follow, you won’t know why they passed on that particular opportunity. There is no historical record to consult.
And what if, for example, your lone Chicago analyst leaves the firm and takes the data with them? Is that Chicago data package buried so deep in their saved files you’ll never find it? Or is it just gone, wiping out your entire Chicago data set, forcing you to start over again? In either scenario, that puts your firm at a startling disadvantage.
Most of the work done by the deal team is, ironically, not about the deals but about the manual capture, sorting, and initial setup of the inbound data. Analysis comes later. Under this inefficient model, you are placing an extraordinary amount of trust in what are, essentially, rogue agents.
But what if, instead of relying on legacy systems and outdated protocols predicated on tedious, time-consuming, manual data capture and sorting, your analysts could spend most or all of their time reviewing new deals, on an ongoing basis, in a pre-digitized format they are comfortable with? What if historical data were captured, sorted, and accessible to every member of your team, in real time, at exactly the same time?
New systems are emerging that enable your teams to analyze deals faster, more efficiently, more thoroughly, and at scale. That change translates to analysts passing better-quality deals up to the C-Suite, so strategic decisions can be made before a viable opportunity slips away, scooped up by another firm examining the same deal.
The Second Modern Age of Tech is focused on providing industry-specific tools to solve problems and unlock value in ways legacy systems cannot. These new systems are out there. If you are looking to arm your analysts with the ability to source deals that other firms miss, or get to them first, investing in modern technology is the path forward. Because if a deal falls in the woods but there’s no analyst there to record it, does that deal even exist?
Gary Kao is vice president of Business Development & Operations at Dealpath.