
Landlord Tech Watch project maps where landlords may be using tech to spy on tenants

The AI Now Institute, People Power Media, and the Anti-Eviction Mapping Project today launched Landlord Tech Watch, a crowdsourced map examining where surveillance and AI technologies are being used by landlords to potentially disempower tenants and community members. The site invites tenants to self-report the types of tech being installed in their residences and neighborhoods, and it aims to serve as a resource to help educate people about the widespread use and harms of these technologies.

Currently, there's little in the way of legislation governing the collection and use of data in the context of real estate. Owners and landlords often purchase and install tech products and platforms without discussing potential harms with their tenants, and sometimes without even notifying them.

For instance, in New York City, rent-stabilized tenants at the Atlantic Plaza Towers in Brownsville were subjected to a facial recognition security system from a third-party vendor. Elsewhere in the city, an elderly tenant in Hell's Kitchen charged that a keyless entry system installed by his landlord was too complicated, and feared that his movements would be tracked by the technology.

Residents and local elected officials were quick to rail against the systems, and last October, the City Council proposed legislation that would force landlords to provide tenants with traditional metal keys to enter their buildings and apartments. The tenant in Hell's Kitchen, together with neighbors, secured the right to physical keys in May after suing the landlord.

Landlord Tech Watch aims to give tenants and researchers a better sense of the scope and scale of landlord technology currently in use, such as camera, payment, and screening systems. It includes examples of different types of tech and the specific harms associated with each, along with a deployment map that indicates where such tech is being used and a survey that encourages people to share their experiences with the ways their buildings and neighborhoods are installing technology.


Residents at 406 West 129th Street in Manhattan have already used Landlord Tech Watch to report that intercoms from GateGuard were installed at their buildings without permission. (CNET recently reported that GateGuard has been pitching its technology to landlords in New York as a way to sidestep rent-control regulations.) At 61 Wyckoff Ave in Brooklyn, a tenant claims the landlord recently replaced buzzers with new camera-equipped digital buzzers.

“Facial and movement recognition cameras made by the Israeli-based FST21 [have been installed in our building],” a resident of New York’s 10 Monroe Street wrote. “This came after Hurricane Sandy inflicted damage on the building. The landlord then installed this without our consent … We don’t know what happens with the data being collected about us. It also doesn’t work well, and we all have to do humiliating dances to be recognized by it.”

The Landlord Tech Watch website notes that tech can be used to perform potentially prejudicial background, income, and credit checks on prospective tenants; while there's no registry of all tenant screening companies, it's estimated that there are over 2,000. (Last year, the U.S. Department of Housing and Urban Development began circulating rules that would make it harder for tenants to sue landlords when algorithms disproportionately deny housing to people of color.) Virtual property management platforms might prevent tenants from communicating with their actual landlord, resulting in neglect and less responsive management. And AI security systems could target and potentially endanger certain tenants depending on their ethnicity and skin color.

Consider facial recognition, which numerous studies have shown to be susceptible to bias. A study last fall by University of Colorado, Boulder researchers showed that AI from Amazon, Clarifai, Microsoft, and others maintained accuracy rates above 95% for cisgender men and women but misidentified trans men as women 38% of the time. Separate benchmarks of major vendors' systems by the Gender Shades project and the National Institute of Standards and Technology (NIST) suggest that facial recognition technology exhibits racial and gender bias, and that facial recognition programs can be wildly inaccurate, misclassifying people upwards of 96% of the time.
