Andrew Weakland is senior vice president and director of systems development at W. P. Carey Inc. (NYSE: WPC), one of the largest diversified net lease REITs specializing in the acquisition of operationally critical, single-tenant properties in North America and Europe. Weakland has been at W. P. Carey since 2006.
How has the use of data science evolved at W. P. Carey?
We’ve always been a data-driven organization in my experience, and access to consistent and reputable information about our assets and tenants is key to our underwriting and management practices.
W. P. Carey has been successfully investing in commercial real estate for longer than the field of data science as we know it today has existed, so it is important not to get too caught up in terminology here. What has evolved most quickly over the years is not the fundamental approach to using data to make the best decisions, but rather the tools and technologies to better scale our decision making and deliver the necessary data. “Data science” is not a standalone discipline here, but a core value across our teams.
Where do you see the biggest advances in PropTech coming from?
In my opinion, the biggest advance in the next two to four years will be increasing standardization of data and connectivity between the myriad components of a PropTech stack. This will be more transformational for corporate landlords and investors than any single technology, as many offerings have already advanced beyond most potential customers’ ability to fully utilize their outputs.
Key drivers of this change will be ongoing consolidation amongst PropTech companies as well as increased scrutiny from investors and regulators, such as the upcoming SEC guidelines for climate-related disclosure.
With a potentially more challenging economic environment ahead, do you see data science playing an even more important role?
Running a data-driven organization is a perennial exercise in continuous improvement. Leveraging data in our decision making is fundamental to our process and will remain paramount regardless of the conditions of the commercial real estate market or capital markets more broadly.
Acute events such as the COVID-19 pandemic did showcase the value of being able to quickly integrate and analyze new data within existing models, but that is a benefit of consistently investing in data strategies during all market conditions, not a reaction to a specific one.
What are the key challenges in having to assess data across various property types and geographic markets?
Our portfolio is highly diversified across both geography and asset type, which leads to analytical obstacles many other portfolio managers don’t have. The key challenge is simply scale—each time you add a bespoke data set to cater to the needs of a single geography, statutory requirement, or asset-level model, you multiply the number of scenarios your systems must support.
We are lucky enough to have a single core system and data model across our entire enterprise, which is invaluable for data cleanliness. Whilst processes and requirements differ from market to market, all outputs can be traced back to the core data model, allowing for transparency of the data lineage and thus trust in the system.