The Netherlands wants to build an AI factory in Groningen, with SURF as a partner. It should give researchers, governments, companies, and start-ups access to powerful computing capacity, expertise, and reliable data for developing AI applications.

The factory will house a supercomputer, work with secure European data sources, and bring experts together.

We are proud of the proposal and look forward to EuroHPC's decision in September.

Lees meer: https://www.surf.nl/nieuws/nederland-gaat-voor-ai-fabriek?utm_medium=social&utm_campaign=2025-06-social&utm_kwd=nederland-gaat-voor-aifabriek&utm_source=mastodon&utm_content=surf

#ai

In a private #Github organization, in a private repo full of NDA code, GitHub decided to automatically start reviewing that code with Copilot.

Mind you, Copilot is disabled for this organization.

Could we please just fucking not?! Never mind the fact that the GitHub organization never enabled this, there is no data policy to be found anywhere. I have no clue what Copilot does with the data after it has "reviewed" the code, and I could potentially be breaking the signed NDA.

#AI #Github #Copilot

I learned something today: Google's Gemini "AI" on phones accesses your data from "Phone, Messages, WhatsApp" and other apps whether you have Gemini turned on or not. It just keeps the data longer if you turn it on. Oh, and it lets the data be reviewed by humans (!) to Google's advantage in training "AI", etc.

But this only came to my attention because of an upcoming change: it's going to start keeping your data long-term even if you turn it "off": "#Gemini will soon be able to help you use Phone, #Messages, #WhatsApp, and Utilities on your phone, whether your Gemini Apps Activity is on or off."

This is, of course, a #privacy and #security #nightmare.

If this is baked into Android, and therefore not removable, I'd have to say I'd recommend against using Android at all starting July 7th.

https://www.extremetech.com/mobile/gemini-ai-will-soon-access-calls-and-messages-on-your-android-even-if-you

#spyware #AI #LLM #Google #spying #phone #Android #private #data

Holy #surveillance hell, Batman.

Let me get this straight:

First, they feed your video, which is already stored in their cloud, into an #AI transformer to write descriptions.

Then they feed your descriptions into a pattern learning system (ML, maybe?) to figure out your patterns and habits.

All of this is stored in the cloud. So they not only have your video, but a narrative about your habits, ready to be exfiltrated, monetized, and shared with law enforcement.
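
To make the point concrete, here is a toy sketch (mine, not anything from Ring or the article) of how little it takes to turn AI-written video descriptions into a profile of someone's habits. The event strings and timestamps below are invented, and the "pattern learning" is nothing more than counting which descriptions recur at the same hour of day:

```python
from collections import Counter
from datetime import datetime

# Hypothetical event descriptions, of the kind an AI captioner might
# produce from cloud-stored video clips. Purely illustrative data.
events = [
    ("2025-06-20 08:05", "person leaves house with dog"),
    ("2025-06-20 17:40", "car pulls into driveway"),
    ("2025-06-21 08:10", "person leaves house with dog"),
    ("2025-06-21 17:55", "car pulls into driveway"),
]

# "Pattern learning" at its crudest: count how often each description
# occurs in each hour of the day, then call anything recurring a routine.
patterns = Counter()
for timestamp, description in events:
    hour = datetime.strptime(timestamp, "%Y-%m-%d %H:%M").hour
    patterns[(hour, description)] += 1

for (hour, description), count in patterns.items():
    if count >= 2:
        print(f"Routine: '{description}' around {hour:02d}:00 ({count}x)")
```

Once the video has been flattened into text, building the narrative about your routines is trivial, and all of it sits in someone else's cloud.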

#ai #enshittification #RingCamera

https://www.theregister.com/2025/06/25/amazons_ring_ai_video_description/

🐝 Dating platform #Bumble sends user data to #OpenAI without users' consent. We have therefore filed a GDPR complaint against the company.

📰 Read more on our website: https://noyb.eu/en/bumbles-ai-icebreakers-are-mainly-breaking-eu-law

#AI #law #MakePrivacyReality

Have you used Bumble in recent years? Well, then chances are that your data has been unlawfully fed into an AI system, including sensitive data such as your sexual orientation.

In December 2023, Bumble introduced the Icebreaker feature in the “Bumble For Friends” app. This means that Bumble sent your data to OpenAI to help it generate an opening message. For example, if your profile says you are vegetarian and mine says I like to eat out, the opening message could be: “Do you have a recommendation for a vegetarian restaurant?”

Users were then shown a popup in the app informing them about the new Icebreaker feature. They had the option to click “okay” or to try to close the popup, but the popup would reappear every time they opened the app until they clicked okay.

Okay, so what's the problem? Isn't it legal to collect data if users consent? Well, the problem is that Bumble only pretends to ask for consent and then relies on legitimate interest to transfer your data to OpenAI. And that legitimate interest is nowhere to be found in the privacy notice.

In summary, Bumble misleads users by giving them an illusion of control, while in the end it sends the data to OpenAI anyway. Because of that, we have filed a complaint with the Austrian Data Protection Authority. You can find all the details, as well as previous projects, on our website noyb.eu.
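
“Sending your data to OpenAI” is not an abstraction. As a purely hypothetical sketch (Bumble's actual prompt, payload, and model are not public), an Icebreaker-style call through OpenAI's public Python client could look like this, with the profile text itself leaving the app and landing on OpenAI's servers:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical profile snippets; not Bumble's real payload or prompt.
profile_a = "Vegetarian, loves hiking."
profile_b = "Likes to eat out and try new restaurants."

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {
            "role": "system",
            "content": "Suggest a short, friendly opening message for two people based on their profiles.",
        },
        {"role": "user", "content": f"Profile A: {profile_a}\nProfile B: {profile_b}"},
    ],
)
print(response.choices[0].message.content)
```

Whatever is written in those profiles, dietary habits, hobbies, or something as sensitive as sexual orientation, travels inside that request.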