Snug gives renters a score based on their profile but may be ‘incentivising renters to offer above the advertised price’, according to one expert. Photograph: Mike Bowers/The Guardian

Imperfect match: Australian renters in the dark over use of data by tech company Snug


Exclusive: Snug gives renters a score based on their profile that is meant to match them to properties, but critics say the process is opaque and skirts the law on rental bidding

One of Australia’s fastest growing rental application platforms is using renters’ data in obscure and potentially discriminatory ways to “score” their applications against rental properties, and gives them a higher score when they offer to pay more rent, a Guardian Australia investigation has found.

Snug, which serves as an intermediary between renters and real estate agents, denies that it facilitates rental bidding. It claims its mission is to “make renting easier” thanks to a platform designed “to facilitate a better relationship between renters and agents”. It says it believes “transparent rental relationships result in better outcomes for all parties”.

But the company’s use of renters’ data is far from transparent, and its relationship to one of Australia’s biggest sharehousing Facebook groups has exacerbated privacy concerns. One expert described Snug’s technology as a “troubling” development in “landlord tech”, and renters’ advocates have urged tougher regulation.

How does Snug work?

Renters apply for properties through Snug by filling out a profile, which includes data such as legal identification, payslips and references. Snug encourages, though doesn’t require, renters to submit bank statements and background checks at their own expense.

One of Snug’s key selling points is the Snug Match Score, which rates a renter’s compatibility with a property as a value out of 100. Snug promotes the Match Score as helping renters and owners “find the best fit for a rental property” by allowing owners to “express their preferences for their property” and enabling renters to “personalise their application”.

When a renter views a property listing, their Match Score appears at the bottom of the screen. Snug’s FAQ says the Match Score is based on “property owner preferences, property data, rental application attributes, renter profile completion and market conditions”.

How the score is calculated is mostly invisible to the renter. But one thing that makes it go up substantially is offering a higher rent.

Guardian Australia tested the process for two properties in Victoria. Applying to rent at the advertised price of $490 a week for a two-bedroom townhouse in the Melbourne suburb of Brunswick resulted in a Snug Match Score of 74.

But offering an extra $10 a week pushed the score up to 77. Each additional $10 a week offered increased the score by two or three points. It reached a score of 88 when the offer hit $540 a week.

The results were similar for a one-bedroom apartment in Footscray, advertised at $310 a week. An initial Snug Match Score of 72 at the advertised rent increased to 85 when the offer reached $340.
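Based on the figures from these two tests, the relationship between the rent offered and the displayed score appeared roughly linear. The sketch below is purely illustrative: it fits the Brunswick numbers reported above and assumes nothing about Snug's actual algorithm, which is not public; the base score and points-per-$10 values are inferred only from the observations published here.

```python
# Illustrative only: a rough linear fit of the Match Score jumps observed in
# the Guardian Australia test (74 at the advertised $490 a week, rising to 88
# at $540 a week for the Brunswick listing). Snug's real formula is not
# public; base_score and points_per_10_dollars are assumptions drawn from
# those published observations, not from Snug.

def estimated_match_score(offer: float, advertised: float,
                          base_score: float = 74.0,
                          points_per_10_dollars: float = 2.8) -> float:
    """Estimate how the observed score rose as the rent offer increased."""
    extra = max(0.0, offer - advertised)
    return min(100.0, base_score + points_per_10_dollars * (extra / 10))

# Reproduces the reported trend for the Brunswick townhouse ($490/week advertised):
for offer in (490, 500, 510, 520, 530, 540):
    print(offer, round(estimated_match_score(offer, 490)))
# 490 -> 74, 500 -> 77, ... 540 -> 88 (roughly two to three points per extra $10)
```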

In some states, including Victoria and Queensland, soliciting rental offers above the advertised price is illegal, but a higher rent offer may be accepted if it is made “unprompted”.

Linda Przhedetsky, a researcher in rental technologies and an associate professor at the Human Technology Institute of the University of Technology, Sydney, said the Snug Match Scores might be “incentivising renters to offer above the advertised price”.

“Laws including Victoria’s ban on rent bidding have attempted to curb the solicitation of higher rents, but it’s unclear whether these protections apply here,” Przhedetsky said.

The director of community engagement at Tenants Victoria, Farah Farouque, said Snug’s setup “certainly sparks concern for us”.

“We would welcome further clarity from Consumer Affairs Victoria as the regulator,” Farouque said. “As we see more renters going through third-party platforms, we hope to see corrective action being taken to address problems that are clearly emerging.”

A screenshot of Snug’s website. Photograph: Snug

Snug’s chief executive, Justin Butterworth, denied that its program facilitated rent bidding. “Snug wants to help address these issues and Snug certainly doesn’t facilitate it,” he said in a statement.

“Our platform is designed to discourage the kind of rent bidding that is already happening despite being illegal in some states, and to prioritise factors such as start date and lease term in matching tenants to owners.

“We are committed to making adjustments, whenever necessary, to help eliminate rent bidding.”

A spokesperson for Consumer Affairs Victoria said it could not comment on individual businesses but it was “monitoring Victoria’s new rental laws closely” and took cases of potential rental bidding “very seriously”.

“It is an offence for a rental provider or their real estate agent to ask for or invite offers of rent higher than the advertised price,” the spokesperson said. “Consumer Affairs Victoria will investigate potential breaches and take further action where necessary.”

Rental providers and agents who engage in rental bidding in Victoria face penalties of more than $11,000 for a person and more than $55,000 for a company.

Facebook friends and Uber ratings

Przhedetsky’s doctoral research focuses on technologies used to rate potential renters.

She said the Match Score was potentially discriminatory.

“There is no transparency about how the Match Score is calculated,” Przhedetsky said. “We don’t know if their Match Score is calculated fairly. We don’t know how their algorithm is using renters’ data.”

A patent application filed by Snug in 2018, which is still active, suggested that the company’s intention was to collect information from users that included friend lists, social media networks and ratings on third-party platforms such as Airbnb and Uber, and to develop a kind of rental credit system.

In the documents, the company listed various data categories or “attributes” that could be used to determine compatibility between a property and a person. Some relate to the property, including its internal and external aesthetics, crime rates in the area, the “type” of community it is located in and that community’s “cohesiveness” and “demographic profile”.

Attributes listed for renters include their claims history, details of income and employment, and “3rd party ratings from alternate sources (e.g. Facebook, Airbnb, eBay, Uber, Linkedin)”.


Data from the renter’s “social graph” – a complex map of relationships built from information on social media platforms, including their “friend network, professional network” – were also listed as potential attributes.

In many states, agents and landlords are not allowed to ask renters about attributes such as race, sexual orientation, age, disability, gender identity, marital status, children and religion.

Some states have additional protections. In Tasmania, renters cannot be discriminated against on the basis of an “irrelevant” criminal record. In Victoria, a renter’s profession (including sex work) is a protected attribute, as is a spent criminal conviction, or personal association with someone who has any one of those protected characteristics. Many states note that a renter’s appearance at a tribunal hearing cannot be counted against them.

Przhedetsky warned that the use of tenancy screening technology overseas had also shown the potential for discrimination on the basis of “proxy data” – information that seems neutral but correlates strongly with a protected attribute, such as living in a suburb that has a high proportion of people with a particular racial background.

Apps had been used in other countries to find and scan renters’ social media accounts for “red flags”, she said.

Snug declined to disclose what data points went into the Match Score, saying only that it was “based on far less personal information than would be collected in an old style application” and was “just one indicator within a complete rental application that property managers may consider”.

“Snug does not capture social activity data, nor does it generate ‘proxy data’,” Butterworth said.

“Snug has no foreseeable plans to implement further permissible data attributes contained conceptually in the patent application.”

Butterworth said the company took data security and privacy very seriously, and automatically deleted sensitive information and documents after 60 days to protect customers. Renters could receive notifications about property managers accessing their data, and were able to delete their profile data.

The executive director of Digital Rights Watch, James Clark, said renters should be concerned about the increasing prevalence of “landlord tech” such as Snug.

“Real estate companies already collect a huge amount of information about renters. While the public has recently become more aware of the risks of data breaches, the dangers of companies collecting vast amounts of personal information about our lives go far beyond that. This data gives companies the power to discriminate as well as influence markets,” Clark said.

“Housing is a human right. Decisions about housing shouldn’t be trusted to an opaque algorithm.”

‘Incredibly detailed profiles’

Snug may already have access to large amounts of personal information, even on people who have not applied for a property using the platform.

In 2017, Snug bought the administration rights to Fairy Floss Real Estate, a Facebook group with more than 316,000 members that has become synonymous with share housing in Melbourne.

Since taking over the group, Snug has updated the “About” page to state that users agree to abide by its community standards, which refer to Snug’s privacy policy. The policy states that Snug may “collect information regarding your professional qualifications, your employment history, your rental history and information regarding your social media accounts and social media activity”.

Przhedetsky said Snug’s purchase of the group “raises alarm bells”.

“When posting in these groups, users often share a range of personal information that they would not disclose to a real estate agent or tenancy screening company – this information can be highly personal, whether they are talking about their sexual preferences, political ideologies, or attitudes to drugs,” she said.

“It’s deeply concerning to think about the personal information that Snug could have access to, and how this could be used in a rental process. I’m also extremely concerned for the members that joined the group prior to Snug’s takeover. They wouldn’t have consented to Snug’s ‘community standards’.”

Clark said by accumulating data from third-party sources such as social media profiles, property tech companies could be building “incredibly detailed profiles” on renters.

“These apps use opaque processes to make recommendations or adjust prices, and in the context of housing this leaves renters vulnerable,” he said. “As long as agents have the ability to make you homeless, renters will do what they ask, and regulations mean very little if they are poorly enforced. It’s not just about what’s legal, it’s about who has power and who doesn’t. And landlord tech tilts the scales even further away from renters.”

Butterworth said there was “no data exchange between FFRE and Snug.com”.

“Snug’s privacy policy applies to users that use snug.com.

“Fairy Floss Real Estate group on Facebook provides a free platform to list rental spaces and operates stand alone. Facebook applies the privacy policy on its platform and Snug does not access any Facebook social graph information about Fairy Floss members,” Butterworth said.
