AI is already being used in the real estate industry for a variety of development tasks, from assessing acquisition opportunities to zoning reviews.
ChatGPT and generative AI have recently been popularized in the press, and solutions leveraging this technology are already being used (sometimes unknowingly) in the industry.
While AI is a valuable tool, it is important to be aware of its potential pitfalls:
Hallucinations - ChatGPT sometimes “hallucinates”, meaning it gives you an answer it thinks is right even when it is blatantly wrong, and it sounds credible while doing so.
Bias & Prejudice - Bias and stereotyping remain significant problems for some AI systems, despite companies’ attempts to fix them.
Commercialization - There are already plug-ins for ChatGPT from companies such as Kayak and Instacart. The extent to which the answers provided by AI are influenced by these partnerships (and paid partners) is an open question.
Plagiarism - Referenced works may be used without their authors’ permission, and legal battles may follow.
Incomplete Data - Because these systems draw only on digital data sources, any data points missing from the analysis can distort the result.
Regulatory Uncertainty - As often happens in tech, the law is two steps behind, leaving room for unethical conduct or abuse.
It is important to be aware of these limitations and to take steps to mitigate the risks, such as adopting ethical AI practices and building in privacy protections.
Just as you wouldn't let a crane be operated by just anyone, personnel using powerful AI tools should be trained to a high standard to ensure diligent use of the technology.