If the past few years have taught us anything, it's that algorithms should not be blindly trusted.
The latest math-induced headache comes from Australia, where an automated compliance system appears to be issuing incorrect notices to some of Australia's most vulnerable people, asking them to prove they were entitled to past welfare benefits.
Politicians and community advocates have cried foul over the system, rolled out by Australia's social services provider, Centrelink.
Launched in July, the system was intended to streamline the detection of overpayments made to welfare recipients and automatically issue notices of any discrepancies.
The media and Reddit threads have since been inundated with complaints from people who say they are being accused of being "welfare cheats" without cause, thanks to faulty data.
The trouble lies in the algorithm's apparent difficulty accurately matching tax office data with Centrelink records, according to the Guardian, although department spokesperson Hank Jongen told Mashable the department remains "confident" in the system.
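One reported source of such mismatches was the practice of averaging a person's annual tax office income evenly across fortnightly welfare reporting periods. The sketch below, using entirely hypothetical figures, shows how that kind of crude matching can manufacture a "discrepancy" for a casual worker whose fortnightly reports and annual tax total agree exactly.

```python
# Illustrative sketch only -- all figures are hypothetical.
# Naively averaging annual tax office income across fortnights can
# manufacture a "discrepancy" for someone with irregular earnings.

ANNUAL_TAX_OFFICE_INCOME = 13_000  # yearly total from tax records
FORTNIGHTS = 26

# A casual worker who earned everything in the first half of the year,
# correctly reporting $1,000 per fortnight while working and $0 after.
reported = [1_000] * 13 + [0] * 13
assert sum(reported) == ANNUAL_TAX_OFFICE_INCOME  # the totals agree exactly

# Crude matching: smear the annual total evenly over every fortnight.
averaged = ANNUAL_TAX_OFFICE_INCOME / FORTNIGHTS  # $500 per fortnight

for i, actual in enumerate(reported, start=1):
    if actual < averaged:
        # The system "sees" unreported income in the quiet fortnights,
        # even though the person reported everything honestly.
        print(f"fortnight {i}: reported ${actual}, averaged ${averaged:.0f} -> flagged")
```

In this toy example, half the year gets flagged as under-reported income despite the yearly totals matching to the dollar, which is exactly the shape of complaint recipients have described.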
"People have 21 days from the date of their letter to go online and update their information," he said. "The department is determined to ensure that people get what they are entitled to, nothing more, nothing less."
Independent politician Andrew Wilkie accused the "heavy-handed" system of terrifying the community.
"My office is still being inundated with calls and emails from all around the country telling stories of how people have been deemed guilty until proven innocent and sent to the debt collectors immediately," he said in a statement in early December.
The situation is upsetting, albeit unsurprising. The siren call of big data has proved irresistible to governments globally, provoking a rush to automate and digitise.
What these politicians seem to like, above all, is that such algorithms promise speed and fewer man-hours.
Alan Tudge, the minister for human services, proudly announced that Centrelink's system was issuing 20,000 "compliance interventions" a week in December, up from 20,000 per year when the process was manual, a roughly fifty-fold increase. Such a jump seems incredible, and perhaps dangerous.
As data scientist Cathy O'Neil lays out in her recent book Weapons of Math Destruction, the judgments made by algorithms governing everything from our credit scores to our pension payments can easily be wrong -- they were created by humans, after all.
The math-powered applications powering the data economy were based on choices made by fallible human beings. Some of these choices were no doubt made with the best intentions. Nevertheless, many of these models encoded human prejudice, misunderstanding and bias into the software systems that increasingly managed our lives. Like gods, these mathematical models were opaque, their workings invisible to all but the highest priests in their domain: mathematicians and computer scientists.
These murky systems can inflict the greatest punishment on the most vulnerable.
Take, for example, a ProPublica report that found an algorithm used in American criminal sentencing to predict the accused's likelihood of committing a future crime was biased against black people. The corporation that produced the program, Northpointe, disputed the finding.
O'Neil also details in her book how predictive policing software can create "a pernicious feedback loop" in low-income neighbourhoods. These computer programs may recommend that areas be patrolled to counter low-impact crimes like vagrancy, generating more arrests, and so creating the data that gets those neighbourhoods patrolled still more.
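That loop is easy to reproduce in miniature. The toy simulation below assumes a made-up patrol-allocation rule and invented arrest counts; it shows how a one-arrest difference in the seed data can lock every future patrol, and all the arrest data those patrols generate, onto a single neighbourhood.

```python
# Toy model of the feedback loop -- the allocation rule and numbers
# are invented for illustration, not drawn from any real system.

arrests = {"neighbourhood_A": 11, "neighbourhood_B": 10}  # near-identical histories

for year in range(1, 6):
    # Send the patrol wherever the historical arrest data is highest.
    patrolled = max(arrests, key=arrests.get)
    # Patrol presence surfaces low-impact offences (vagrancy and the like)
    # that would otherwise go unrecorded, inflating that area's figures.
    arrests[patrolled] += 20
    print(f"year {year}: patrolled {patrolled}, data now {arrests}")

# A single extra arrest in the seed data sends every patrol to
# neighbourhood_A, whose numbers climb while neighbourhood_B's never move.
```

The model never asks whether neighbourhood_A actually has more crime; it simply ratifies its own earlier output, which is precisely the pernicious quality O'Neil identifies.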
Even Google doesn't get it right. Troublingly, in 2015, a web developer spotted the company's algorithms automatically tagging two black people as "gorillas."
Former Kickstarter data scientist Fred Benenson has come up with a good term for this rose-coloured glasses view of what numbers can do: "Mathwashing."
"Mathwashing can be thought of using math terms (algorithm, model, etc.) to paper over a more subjective reality," he told Technical.lyin an interview. As he goes on to to describe, we often believe computer programs are able to achieve an objective truth out of reach for us humans -- we are wrong.
"Algorithm and data driven products will always reflect the design choices of the humans who built them, and it's irresponsible to assume otherwise," he said.
The point is, algorithms are only as good as we are. And we're not that good.