Microsoft Word and Excel AI data scraping switched to opt-in by default (www.tomshardware.com)
64 points by oldnetguy 1 hour ago | 25 comments





Is this the correct use of "opt-in"?

To me, having things "opt-in" means they're off and you can turn them on if you want.

If it's "opt-out" it's automatically on, and you can turn it off.


Likewise, I think the title is literally the opposite of what is actually happening.

I think they mean 'Enabled by default'

Thus opt-out would be the correct term.

You are correct. The headline author likely meant "opted in by default" or "enabled by default."

> Microsoft's Connected Experiences feature automatically gathers data from Word and Excel files to train the company's AI models. This feature is turned on by default, meaning user-generated content is included in AI training unless manually deactivated.

Not to say that Microsoft products respect privacy, but I don't see evidence that user-generated content is being used for training.

The linked services agreement has had the same language (copy/transmit/etc. "to the extent necessary to provide the services") since at least 2015[0], and "connected experiences" seems to group a wide range of integrations; some like dictation/translation probably utilise ML, but that does not mean training on user content.

[0]: https://web.archive.org/web/20150608000921/https://www.micro...


Agreed. This was raised within our corp the other week, and we read through the privacy and security documentation as it relates to Connected Experiences. Microsoft has outlined specifically what Connected Experiences covers.[1] [2] You could argue that predictive text is a product of machine learning, but there is no clause allowing for training any generalized large language models on this data. The confusion may have arisen if they read an article about Copilot. If the user had a Microsoft 365 Copilot license, the data would be used as grounding for their personal interactions with Copilot, but still not to train any foundational LLMs. Even then, that data is still managed in compliance with Microsoft's data security and privacy agreements.

[1] https://learn.microsoft.com/en-us/microsoft-365-apps/privacy...

[2] https://learn.microsoft.com/en-us/microsoft-365-apps/privacy...


This seems like a security shit show.

Can we disable it by group policy across entire domains?

Surely no business would ever allow Microsoft to 'reformat, display, and distribute' confidential company documents?

Or am I missing something?
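You can. Microsoft ships Group Policy settings for connected experiences in the Office ADMX templates (under User Configuration > Administrative Templates > Microsoft Office 2016 > Privacy > Trust Center), so it can be pushed domain-wide. As a rough sketch, the policies map to registry values along these lines; the value names here are taken from Microsoft's privacy-controls documentation (2 = disabled), but verify against the current ADMX templates before deploying:

```reg
Windows Registry Editor Version 5.00

; Policy-backed privacy controls for Microsoft 365 Apps (Office 2016+ hive).
[HKEY_CURRENT_USER\Software\Policies\Microsoft\office\16.0\common\privacy]
; Disable connected experiences that analyze your content (2 = disabled).
"usercontentdisabled"=dword:00000002
; Disable connected experiences that download online content.
"downloadcontentdisabled"=dword:00000002
; Disable optional connected experiences.
"controllerconnectedservicesenabled"=dword:00000002
; Disable all connected experiences entirely.
"disconnectedstate"=dword:00000002
```

Once the corresponding policy is set, the toggle under File > Account > Account Privacy is greyed out for end users.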


Well, if there's some sort of cloud feature allowing you to share documents you write with others, it would make sense you would have to allow Microsoft to "reformat, display, and distribute" for the purpose of providing you that service.

However, the terms of service say "To the extent necessary to provide the Services to you and others, [...] and to improve Microsoft products and services". So they're saying they can use your content not just to provide you service, but to provide other people service and to improve all Microsoft products.


A word processor stealing the user's IP by default should carry massive fines in the EU. This is pure deception. 20% of annual revenue should be appropriate.

"In the M365 apps, we do not use customer data to train LLMs. This setting only enables features requiring internet access like co-authoring a document." @Microsoft365 https://twitter.com/Microsoft365/status/1861160874993463648

I just checked and this is turned off in my installation, but I’m not sure that’s from being EU based, or because my org has disabled it.

Microsoft = Spyware

Most tech these days seems to fall into that classification.

There aren't many pieces of technology these days that intentionally avoid collecting your data to sell to another company.


This would certainly be the cause of lots of GDPR violations, considering the kinds of information processed in Word and Excel. I know our condo's owners association keeps contact information of their members in Excel sheets, that's considered PII. It can also contain sensitive information like who is behind on their monthly contributions and by how much.

That's just the first thing I thought of. There must be tons of companies and organisations processing sensitive data in Word and Excel. What about doctors' offices and insurance companies handling medical information? What about banks, financial advisors, lawyers, etc.?


> "To the extent necessary to provide the Services to you and others, to protect you and the Services, and to improve Microsoft products and services, you grant to Microsoft a worldwide and royalty-free intellectual property license to use Your Content, for example, to make copies of, retain, transmit, reformat, display, and distribute via communication tools Your Content on the Services," the clause reads.

Well, this does make sense in the context of Office 365, OneDrive and the Office web apps in general. (Still dodgy regarding the "worldwide" part but there's no way around that because people can and do expect to access their stuff even while on vacation)

Silently enabling the training of remote AI however? That's not covered under any reasonable interpretation of the above legalese.


>… intellectual property license to use Your Content

Seems clear to me. Use any way Microsoft wants. The “for example” list is not exhaustive nor limiting.


IANAL again, but I don't think they get to do literally anything with your data. The phrase used is "to the extent necessary". For instance, I don't think they could scrape their user data for trade secrets and then sell those to the highest bidder.

Who defines “necessary?” Use of Your Content is Necessary to support Microsoft’s business activities, including, but not limited to, training their AI.

Why not? Isn’t that the essential ethos Microsoft was founded on?

Because they boxed themselves in with legalese. Companies would definitely switch off Microsoft services if at all possible if the company's lawyers thought their trade secrets were getting sold off. So I think the "as necessary" framing does probably prevent them from doing some things.

As I laid out in my other comment, I think training AI in particular is covered under the "improving Microsoft products or services" bit of legalese. I do wonder how companies' lawyers will respond to this, though. They probably thought of that phrase as just allowing Microsoft employees access to documents to see how Word or other pieces of software were being used, or to fix crashes, etc.


I thought it was founded on Bill Gates’s mommy having strong connections to IBM that allowed little Bill to keep the rights to the source code they paid him to write. And the privileged position of having access to a computer at his school when 99.9% of the population did not.

"The funds from the bidder will be invested in to products in order to make a better user experience" /s

Reminds me of “this call will be used for training and quality purposes.”

IANAL, but I think the "to improve Microsoft products and services" bit does mean that they do legally get to train their AI (which is a Microsoft service) on your data. Still a bastard move though.

Does this circumvent Azure Information Protection policies as well? Would be fucking hilarious if it did.


