Real Estate Pricing Prediction via Textual and Visual Features
Document Type
Article
Publication Date
10-21-2023
Publication Source
Machine Vision and Applications
Abstract
The real estate industry relies heavily on accurately predicting the price of a house based on numerous factors such as size, location, amenities, and season. In this study, we explore the use of machine learning techniques for predicting house prices by considering both visual cues and estate attributes. We collected a dataset (REPD-3000) of 3000 houses across 74 cities in the USA and annotated each house with 14 estate attributes and five images: exterior, living room, kitchen, bedroom, and bathroom. We extracted features from the input images using a convolutional neural network (CNN) and fed them, along with the estate attributes, into a multi-kernel deep learning regression model to predict the house price. Our model outperformed baseline models in extensive experiments, achieving the best result with a mean absolute error (MAE) of 16.60. We compared our model with a multi-kernel support vector regression model and analyzed the impact of incorporating individual feature sets. In future work, we plan to address class imbalance by balancing the number of houses in each class and to explore feature engineering to further improve the model's performance.
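The sketch below is not the authors' released code; it is a minimal PyTorch illustration of the pipeline the abstract describes, assuming one small CNN branch per room image and a dense branch for the 14 estate attributes, with the abstract's multi-kernel fusion simplified to plain feature concatenation. All layer sizes, class names, and the toy inputs are hypothetical.

```python
import torch
import torch.nn as nn

class ImageBranch(nn.Module):
    """Small CNN that maps one RGB image to a fixed-length feature vector."""
    def __init__(self, out_dim=128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),              # -> (B, 64, 1, 1)
        )
        self.fc = nn.Linear(64, out_dim)

    def forward(self, x):
        return self.fc(self.features(x).flatten(1))

class HousePriceModel(nn.Module):
    """Five image branches (exterior, living room, kitchen, bedroom, bathroom)
    plus a dense branch for the 14 estate attributes, fused for regression."""
    def __init__(self, n_attrs=14, img_dim=128):
        super().__init__()
        self.image_branches = nn.ModuleList([ImageBranch(img_dim) for _ in range(5)])
        self.attr_branch = nn.Sequential(nn.Linear(n_attrs, 64), nn.ReLU())
        self.head = nn.Sequential(
            nn.Linear(5 * img_dim + 64, 128), nn.ReLU(),
            nn.Linear(128, 1),                    # predicted price
        )

    def forward(self, images, attrs):
        # images: list of five (B, 3, H, W) tensors; attrs: (B, 14)
        feats = [branch(img) for branch, img in zip(self.image_branches, images)]
        fused = torch.cat(feats + [self.attr_branch(attrs)], dim=1)
        return self.head(fused).squeeze(1)

# MAE (L1) loss matches the evaluation metric reported in the abstract.
model = HousePriceModel()
loss_fn = nn.L1Loss()
images = [torch.randn(2, 3, 224, 224) for _ in range(5)]   # toy batch of 2 houses
attrs = torch.randn(2, 14)
loss = loss_fn(model(images, attrs), torch.tensor([250.0, 410.0]))  # toy prices
```

Using separate branches per room type lets each image contribute its own learned features before fusion with the tabular attributes; the actual model in the paper fuses feature sets through multiple kernels rather than simple concatenation.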
ISBN/ISSN
Print: 0932-8092; Electronic: 1432-1769
Document Version
Postprint
Publisher
Springer
Volume
34
Peer Reviewed
yes
Issue
6
Sponsoring Agency
This research was supported by the National Science Foundation (NSF) under Grant 2025234 and UD/UDRI Research Fellowship Program.
eCommons Citation
Yousif, Amira; Baraheem, Samah; Vaddi, Sai Surya; Patel, Vatsa S.; Shen, Ju; and Nguyen, Tam, "Real Estate Pricing Prediction via Textual and Visual Features" (2023). Computer Science Faculty Publications. 183.
https://ecommons.udayton.edu/cps_fac_pub/183
Comments
The authors' accepted manuscript will be made available for download upon expiration of the publisher's required embargo; permission documentation is on file. To read the document of record, use the DOI: https://doi.org/10.1007/s00138-023-01464-5