Airbnb Algorithm Analysis
Takeaways on algorithm design, limitations, algorithmic bias, ethical concerns and potential use cases
3 min read · Oct 17, 2020
Takeaways for algorithm design:
- Having a thorough understanding of the problem and dividing it into sub-problems.
- Continually asking questions to guide solution development.
- Conducting focus groups to observe behavior and identify loopholes.
- Continuing post-release analysis and tweaking based on newly validated observations; iterative testing improves the algorithm over time as more scenarios and factors are covered.
- Applying an ML approach that takes advantage of human intuition when necessary.
- Allowing humans to easily interpret the ML “thought process”.
- Building algorithms that learn from their mistakes and improve.
- Identifying major data types, attributes and the rule hierarchy for decision-making.
- Fixing bugs promptly before they amplify.
Algorithm limitations
- Guaranteeing privacy: the host's first name, rough location, exterior photos, location details and host/reviewer pictures are available before booking. In a digital era of internet search, social media and public records, one could reverse-engineer the address, making hosts easy prey for potential thieves. Image recognition and text mining could warn users when uploaded pictures or descriptions might compromise privacy, e.g. alerting a host who adds an exterior photo showing the mailbox number, or whose description mentions the neighborhood, a spouse's name, etc. (a text-scan sketch follows this list).
- Guests could contact the host outside the platform to avoid paying commissions.
- Reconsider whether showing the guest's name and picture at the time of booking is necessary; these can be used to infer race and facilitate discrimination.
- Identify fraudulent listings, duplicate pictures and suspicious accounts, e.g. a scammer listing multiple homes with similar pictures taken from different angles.
- Verify that the majority of reviews do not come from family, friends or paid reviewers (fudged data).
- Adequately warn hosts about pedophiles/felons/criminals or suspicious recent activity, giving the host a decision-making choice without revealing the guest's identity.
- Flag “high risk” reservations before they become incidents, e.g. a request for an upscale CA home for 1 night on Halloween for 12 people, where the owner has previous city violations, could be flagged for additional screening (a rule-based risk-scoring sketch also follows this list). Door cameras could be integrated to alert guests.
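The text-mining warning described in the privacy bullet above could start as something as simple as a keyword/pattern scan over the listing description. The sketch below is a minimal, hypothetical illustration; the patterns, function name and example text are assumptions, not Airbnb's actual checks:

```python
import re

# Hypothetical patterns suggesting a description reveals an exact address.
# These regexes and names are illustrative only.
PRIVACY_PATTERNS = {
    "street_address": re.compile(
        r"\b\d{1,5}\s+(?:[A-Z][a-z]+\s+){1,3}"
        r"(?:Street|St|Avenue|Ave|Road|Rd|Boulevard|Blvd|Lane|Ln|Drive|Dr)\b"),
    "mailbox_number": re.compile(r"\bmailbox\s*#?\s*\d+\b", re.IGNORECASE),
    "phone_number": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

def privacy_warnings(description):
    """Return the names of any patterns found in the listing description."""
    return [name for name, pattern in PRIVACY_PATTERNS.items()
            if pattern.search(description)]

# Example: warn the host before the listing is published.
sample = "Cozy cottage at 412 Maple Street, mailbox #7. Call 415-555-0134 to arrange check-in."
found = privacy_warnings(sample)
if found:
    print("This description may reveal your exact address:", found)
```

A production version would pair this kind of text scan with image recognition on uploaded photos (house numbers, street signs), as suggested above.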
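Similarly, the “high risk” reservation flag could begin as a transparent rule-based score before graduating to a learned model. Every rule, weight, threshold and field name below is an assumption made for illustration:

```python
from datetime import date

# Party-prone calendar dates (month, day); illustrative, not exhaustive.
PARTY_DATES = {(10, 31), (12, 31)}

def risk_score(check_in, nights, guests, host_has_violations, booked_days_ahead):
    """Toy additive risk score; the rules and weights are assumptions."""
    score = 0
    if nights == 1:
        score += 2          # one-night stays correlate with unauthorized parties
    if guests >= 10:
        score += 2          # unusually large group
    if (check_in.month, check_in.day) in PARTY_DATES:
        score += 2          # Halloween, New Year's Eve, etc.
    if host_has_violations:
        score += 1          # property has previous city violations
    if booked_days_ahead <= 2:
        score += 1          # last-minute request
    return score

# The example from the bullet above: 12 guests, 1 night, Halloween, prior violations.
if risk_score(date(2020, 10, 31), nights=1, guests=12,
              host_has_violations=True, booked_days_ahead=1) >= 4:
    print("Flag reservation for additional screening")
```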
Ethical concerns
- Potential price discrimination based on mapped micro-neighbourhoods, race, gender, ethnicity, income, etc. The host's picture and name could intentionally or unintentionally influence booking decisions, and, vice versa, the guest's picture and name could influence last-minute cancellations by the host.
- Algorithms could be used to facilitate vertical agreements, i.e. predicting prices for other products/services based on one firm's prices or consumers' reactions, resulting in higher prices.
- Algorithmic pricing can intensify competition concerns and infringe competition laws, resulting in different customers paying different prices for identical products/services.
- Pricing decisions based on past bias/discrimination mean that the algorithm continues the same practices rather than unlearning them (a simple disparity audit is sketched below).
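One way to check whether historical bias is being perpetuated is to audit outcomes across guest groups. The sketch below applies the common “four-fifths” rule of thumb to acceptance rates; the groups, data and threshold are hypothetical and only meant to show the shape of such an audit:

```python
from collections import defaultdict

# Hypothetical, anonymized booking requests; in practice this would be a large sample.
bookings = [
    {"group": "A", "accepted": True},
    {"group": "A", "accepted": True},
    {"group": "B", "accepted": True},
    {"group": "B", "accepted": False},
]

totals, accepted = defaultdict(int), defaultdict(int)
for b in bookings:
    totals[b["group"]] += 1
    accepted[b["group"]] += b["accepted"]

rates = {g: accepted[g] / totals[g] for g in totals}
best = max(rates.values())
for group, rate in rates.items():
    # Four-fifths rule of thumb: flag groups whose rate falls well below the best group's.
    if rate < 0.8 * best:
        print(f"Possible disparate impact against group {group}: {rate:.0%} vs {best:.0%}")
```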
Potential business cases
- The payment fraud detection component could be leveraged by businesses in developing countries to mature their existing fraud engines.
- Household income prediction based on neighborhood clusters, maps and additional data (a clustering sketch follows this list).
- Businesses in the restaurant and hospitality sectors, and related integration partners, could offer integrated services to hosts, e.g. cleaning companies, virtual welcoming, guestbooks, loyalty apps, discounted local event tickets, catering, automated check-in/check-out without the host being present, and sanitization/inspection after checkout.
- Many adjacent businesses could benefit from integrating with these algorithms.
- Guest preference data, supplemented with social media analytics, can provide insights to related business segments and enable personalized experiences.
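The neighborhood-cluster idea from the list above could be prototyped by clustering listings on location and price and attaching known census-style income figures to each cluster. Everything below, the features, the numbers and the choice of k-means, is an assumption for illustration, not a description of Airbnb's models:

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy listing features: latitude, longitude, average nightly price.
# In practice you would scale features (e.g. StandardScaler) before clustering.
listings = np.array([
    [37.77, -122.42, 250.0],
    [37.76, -122.41, 240.0],
    [37.80, -122.27, 120.0],
    [37.81, -122.26, 110.0],
])
# Hypothetical census-tract median incomes for the same listings.
known_income = np.array([150_000, 145_000, 80_000, 82_000])

clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(listings)

# Estimate a typical household income per neighborhood cluster.
for c in np.unique(clusters):
    estimate = known_income[clusters == c].mean()
    print(f"cluster {c}: estimated household income ≈ {estimate:,.0f}")
```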
References
https://hackernoon.com/how-to-rob-an-airbnb-252e7e7eda44
https://www.thestreet.com/personal-finance/airbnb-scams-15158841