Aggregator
Yogi Berra Play Catch in the Park event!
Fully Vaccinated? St. Louis Court Offers $100 Incentive As Part Of Warrant Forgiveness
Navajo Chef Explores Indigenous Midwestern Cuisine With Bulrush
20 Years After Release Of ‘Beautiful’ Along The Mississippi, Javier Mendoza Band To Play It Again
McPherson Talks ArchCity Defenders & More With McGraw Milhaven
Friday, August 20, 2021 - The Debate Over Teaching Critical Race Theory
Your Weekend Plans: August 19-22
Music and dancing fill every corner this weekend on Cherokee. Enjoy a set by Makeda Kravitz on Thursday, a curbside concert by the Gaslight Squares on Friday, and afternoon rock, an evening ska band, and late-night Afrobeat and soca DJs on Saturday.
How Photo Flood St. Louis Captured All 79 City Neighborhoods
How Modern Widows Club Is Helping New Widows During The Pandemic
8.21.2021 DeBaliviere Place Board Packet
Thursday, August 19, 2021 - Confusion Surrounds Missouri’s Residency Requirements For Elected Officials
Pictorial Maps
Your Library Podcast S4 Bonus- Celebrity Support
Your Library Podcast S4E4- Staff spotlight: Sunny Sickel
Wash U Biologist Explains How Lizards Evolved For Specialized Life In Trees
Apple’s device surveillance plan is a threat to user privacy — and press freedom
When Apple announced a new plan this month for scanning photos on user devices to detect known child sexual abuse material (CSAM), the company might have expected little controversy. After all, child sexual abuse is a problem everyone wants to solve.
But the backlash from privacy and human rights advocates was swift, loud, and nearly unanimous. The complaints were less about the precise implementation Apple announced than about the dangerous precedent it sets. The ways the technology could be misused once Apple and its partners come under outside pressure from governments or other powerful actors are almost too many to count.
Very broadly speaking, the privacy invasions come from situations where "false positives" are generated — that is to say, an image or a device or a user is flagged even though there are no sexual abuse images present. These kinds of false positives could happen if the matching database has been tampered with or expanded to include images that do not depict child abuse, or if an adversary could trick Apple’s algorithm into erroneously matching an existing image. (Apple, for its part, has said that an accidental false positive — where an innocent image is flagged as child abuse material for no reason — is extremely unlikely, which is probably true.)
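At its core, the scheme amounts to checking a fingerprint of each file against a database the user cannot inspect. The sketch below is a deliberately simplified illustration, not Apple's design: SHA-256 stands in for Apple's perceptual NeuralHash, and the file contents and database entries are invented. It shows why control of the database is the crux of the tampering concern: expanding the database silently retargets the same scanner at any file an adversary cares about.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # Toy stand-in: a cryptographic hash of the raw bytes. Apple's real
    # system uses a perceptual hash (NeuralHash) that also matches
    # visually similar images, but the database logic is analogous.
    return hashlib.sha256(data).hexdigest()

# The list the device checks files against. In Apple's design this is a
# blinded CSAM hash set supplied by outside organizations; here it is
# just a plain set with an invented entry.
database = {fingerprint(b"known-abuse-image")}

def scan(files):
    # Flag any file whose fingerprint appears in the database.
    return [f for f in files if fingerprint(f) in database]

user_files = [b"vacation photo", b"leaked-memo.pdf"]
assert scan(user_files) == []  # nothing flagged initially

# The concern: whoever controls the database can expand it. Adding the
# fingerprint of a leaked document turns the same scanner into a tool
# for finding whoever holds that document.
database.add(fingerprint(b"leaked-memo.pdf"))
assert scan(user_files) == [b"leaked-memo.pdf"]
```

Nothing about the scanning code changes between the two runs; only the database does, which is why the question of who can add entries matters more than the matching algorithm itself.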
The false positive problem most directly touches on press freedom in that first category: adversaries that can change the contents of the database Apple devices check files against. An organization that could add leaked copies of its internal records, for example, could find the devices holding that data, including, potentially, the whistleblowers and journalists who worked on a given story. It could also reveal the extent of a leak if that is not yet known. A government that could add images critical of its policies or officials could find the dissidents exchanging those files.
These concerns aren’t purely hypothetical. China reportedly already forces some of its citizens to install apps directly onto devices that scan for images it deems to be pro-Uyghur.
Apple has promised to stand up against the forced inclusion of non-CSAM images in the hash database, writing in an FAQ document it published amid the backlash: "Apple would refuse such demands and our system has been designed to prevent that from happening." If only it were that simple! Even with the best of intentions, Apple (and the organizations that maintain the databases in question) is likely to face extreme pressure from governments all over the world to expand these efforts to all sorts of other "illegal" content. And legal orders are not exactly something companies can just "refuse."
As EFF said, “if you build it, they will come.”
After extensive criticism, Apple last week issued more clarifications about efforts to mitigate those concerns. It would only match against images that had been flagged by groups in multiple countries, and sophisticated users would be able to check that the list of images their own phone was checking against was the same as on every other phone. While these assurances help mitigate the risk of a single point of failure, they do not fully address the risks posed by a state-level actor.
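The multi-jurisdiction safeguard Apple described can be pictured as a simple set intersection: a hash only counts if it appears in lists maintained by groups in more than one country, so no single organization can unilaterally add a target. The hash values and list names below are invented placeholders, not real database contents.

```python
# Hypothetical hash lists from child-safety groups in two jurisdictions.
db_country_a = {"h1", "h2", "h3"}
db_country_b = {"h2", "h3", "h4"}

# Only hashes vouched for by groups in multiple countries are matched,
# so an entry unique to one list (h1, h4) is never flagged on-device.
effective_database = db_country_a & db_country_b
assert effective_database == {"h2", "h3"}
```

The limitation the article notes follows directly from this picture: the intersection defends against one rogue list, but not against a state-level actor that can pressure the maintainers of several lists at once.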
And unfortunately, the company has in some cases yielded to that kind of pressure. Reporting earlier this year documented Apple agreeing to store user data and encryption keys in China, at the government's behest, and complying with requests for iCloud data. The company has also removed apps and games from its marketplace to comply with local regulations. What would it do differently in the face of new demands to misuse this image matching tech?
Beyond the possibility of database tampering, another way false positives could occur is if adversaries are able to generate files that are "collisions" with known images in the database. Since even before Apple's formal announcement, researchers have called for the company to publish its matching algorithm so they could see how susceptible it is to these kinds of generated bogus matches (which are usually called "adversarial examples" in the world of machine learning).
Apple has thus far declined to make that matching function available, even as the company has called on security researchers to check its work. However, researchers appear to have recently extracted the matching function from iOS, and even generated a "pre-image" match — that is, generating a file from scratch that Apple's matching function cannot distinguish from another known image.
This research represents a serious problem for Apple's plans: adversaries that can generate false positives could flood the system with bad data, even using the devices of unsuspecting users to host it. The earliest adversarial examples look like white noise, but it is likely only a matter of time before they can be embedded in another image entirely.
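Why are such collisions plausible at all? Perceptual hashes are designed so that similar-looking images hash alike, which means they cannot be collision-resistant the way cryptographic hashes are. The toy below makes the point with a deliberately weak stand-in (average pixel brightness bucketed into 16 levels) rather than Apple's actual NeuralHash; the pixel arrays are invented. Two visibly different inputs land in the same bucket, i.e., a crafted "match."

```python
def toy_perceptual_hash(pixels):
    # Deliberately weak stand-in for a perceptual hash: bucket the
    # average brightness into one of 16 levels. Real perceptual hashes
    # are far more sophisticated, but are likewise tolerant of input
    # changes by design, which is what collision attacks exploit.
    return (sum(pixels) // len(pixels)) // 16

target = [200, 210, 190, 205]    # stands in for a known database image
crafted = [255, 255, 150, 145]   # different content, same average bucket

assert crafted != target
assert toy_perceptual_hash(crafted) == toy_perceptual_hash(target)
```

An attacker who can produce such matches against the real function can manufacture the bad data described above without ever touching the database itself.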
Journalists, in particular, have increasingly relied on the strong privacy protections that Apple has provided even when other large tech companies have not. Apple famously refused to redesign its software to open the phone of an alleged terrorist, not because it wanted to shield the content on a criminal's phone, but because it worried about the precedent that would set for everyone else who relies on Apple's technology for protection. How is this situation any different?
No backdoor for law enforcement will be safe enough to keep bad actors from continuing to push it open just a little bit further. The privacy risks from this system are too extreme to tolerate. Apple may have had noble intentions with this announced system, but good intentions are not enough to save a plan that is rotten at its core.
At Rise, Terrell Carter Aims To ‘Come Alongside’ Local Communities Dreaming Big
St. Louis County Sees Fewer Jail Admissions, But Longer Stays
Small Spaces for Living Large in Dutchtown
There’s one term that always comes to mind when discussing Dutchtown’s historic housing stock: variety. Dutchtown has something for everyone—spacious two story homes, an array of bungalows big and small, shotgun cottages, two and four family flats, and everything in between.