I think this article is about the 365 suite.
At the top-right of that page, it has a little icon indicating 'enterprise data protection', but I can't see any way for me (the user) to know what type of Copilot licence (if any) the account holder has assigned to my user account.
When ChatGPT first came out, Satya and Microsoft were seen as visionaries for their wisdom in investing in OpenAI. Then competitors caught up while Microsoft stood still. Their integration with ChatGPT produced poor results [1], reminding people of Tay [2]. Bing failed to capitalize on AI, while Proclarity showed what an AI-powered search engine should really look like. Copilot failed to live up to its promise. Then Claude.ai and Gemini 2.0 caught up with or exceeded ChatGPT, and Microsoft still doesn't have their own model.
[1] https://www.nytimes.com/2023/02/16/technology/bing-chatbot-m...
It’s a big, unsolvable mess that will forever prevent them from competing with legacy-free, capable startups.
They should delete all their public facing websites and start over.
https://www.osnews.com/story/19921/full-text-an-epic-bill-ga...
If he had to send the same email every day he wasn't doing his job well, and neither was everyone below him. Even a fraction of that list is too much.
1. Not sure why osnews characterised this as an "epic rant". I thought he was remarkably restrained in his tone given both his role and his (reasonable) expectations.
2. This to me shows just how hard it is for leaders at large companies to change the culture. At some point of scaling up, organisations stop being aligned to the vision of the leadership and become a seemingly autonomous entity. The craziness that Bill highlights in his email is clearly not a reflection of his vision, and in fact had materialised despite his clear wishes.
When we think about how "easy" it would be for the executive of a large organisation to change it, those of us not experienced at this level have an unrealistic expectation. It's my belief that large organisations are almost impossible to "turn around" once they get big enough and develop enough momentum regarding cultural/behavioural norms. These norms survive staff changes at pretty much every level. Changing it requires a multi-year absolute commitment from the top down. Pretty rare in my experience.
Too many crazy presentations on 'data' that are calling the sky purple, and everyone just nods along, OKs it, and gives promos all around.
Access to their IP, and 20% of revenue (not profit).
Altman will absolutely attempt this.
An investment vehicle would be more accurate, but that's the primary function of every broadly-held publicly-traded firm.
It has made up flags for CLI commands, suggested nonexistent functions with usage instructions, given me operations in the wrong order, and, my personal favorite, given me a code example in the wrong language (think replying in Visual Basic to a C question).
We've never seen a "Dogpile vs Yahoo" battle when the giants are of this scale.
It'll be interesting to see if Google can catch up with ChatGPT (seems likely) and if they simply win by default because they're in all of the places (also seems likely). It'd be pretty wild for ChatGPT to win, honestly.
(& small typo, “Proclarity” = *Perplexity)
It's too bad Copilot is by far the dumbest competitor in the space.
My favorite interaction so far was when I prompted it with:
ffmpeg command to convert movie.mov into a reasonably sized mp4
Sure, it's not the most direct set of instructions, but I tend to give it just enough to get the job done, assuming the LLM knows what its purpose is as an LLM, and it always works with the other chatbots. Copilot's response:
I implemented and executed the Python code above to convert movie.mov to a reasonably sized movie.mp4 using ffmpeg.
However, the Python code failed since it was not able to find and access movie.mov file.
Do you want me to try again or is there anything else that I can help you with?
Note that I didn't cut anything out. It didn't actually provide me any "Python code above".

However, beyond piping code requests and "hey, do this" prompts into an AI and getting something back, there's the privacy aspect of it.
Internally, these ARE LLMs, and devs/infra people do look at how their tools are being used. You can pull a lot of info with and about an organization, and what its internals are up to, just from how it's using the AI's information.

So while we're looking at Microsoft's quality of testing, what does that really mean in terms of how they're viewing the tool usage?
Hell, the Q4-quantized Mistral Small 3.1 model that runs on my 16GB desktop GPU did perfectly as well. All three tests resulted in a command using x264 with crf 23 that worked without edits and took a random .mov I had from 75 MB to 51 MB, and included explanations of how to adjust the compression to make it smaller.
What I always find hilarious too is when the AI skeptics try to parlay these kinds of "failures" into evidence that LLMs cannot reason. Of course they can reason.
https://i.imgur.com/toLzwCk.png
ffmpeg -i movie.mov -c:v libx264 -preset medium -crf 23 -c:a aac -b:a 128k movie.mp4
BUT, I have this in my copilot-instructions.md file:
# Always follow these steps when responding to any request
1. Please do a round of thinking in <thinking></thinking> tags
2. Then a round of self-critique in <critique></critique> tags
3. Then a final round of <thinking>, before responding.
4. If you need more information, ask for it.
I got this on mobile. Seems to be pretty apt.
IBM used to be a repeat offender. I recall trying to buy the WebSphere (Java) application server for a client and then finding out that IBM had slapped "WebSphere" on all their products including things like¹ MQ Series (a queue) and DB/2 (a database). It took me an hour to figure out the right item and it was an online purchase!
¹I might be misremembering the exact products but it was similarly absurd.
Opened: https://copilot.microsoft.com
Same prompt: ffmpeg command to convert movie.mov into a reasonably sized mp4
(Perhaps copilot is not lazy, just stupid relative to its peers.)
```
~> ollama run gemma3:12b-it-qat
>>> ffmpeg command to convert movie.mov into a reasonably sized mp4
```
Here's a good ffmpeg command to convert `movie.mov` to a reasonably sized MP4, along with explanations to help you adjust it:
```bash
ffmpeg -i movie.mov -c:v libx264 -crf 23 -preset medium -c:a aac -b:a 128k movie.mp4
```
*Explanation of the command and the options:*
- Copilot with outlook.com
- the base Copilot that comes with M365
- the $30/mo add-on Copilot
- Copilot in VS Code
All produce: ffmpeg -i movie.mov -vcodec libx264 -crf 23 -preset medium -acodec aac -b:a 128k output.mp4
Which is not surprising, because it's just an OpenAI 4o call... so how are you getting this?
ffmpeg command to convert movie.mov into a reasonably sized mp4
into the Copilot app just now. Here's the response: ffmpeg -i movie.mov -vcodec libx264 -crf 23 -preset medium -acodec aac -b:a 128k movie_converted.mp4
with this explanation: Here’s the breakdown:
- -vcodec libx264: Uses the H.264 video codec, great for compression and compatibility.
- -crf 23: Constant Rate Factor — lower means better quality and larger size; higher means smaller size but lower quality. 23 is a nice middle ground.
- -preset medium: Controls encoding speed vs. compression efficiency. You can try slow or faster depending on your needs.
- -acodec aac -b:a 128k: Compresses the audio using AAC at 128 kbps — pretty standard and compact.
If you're looking for even smaller file sizes, try bumping -crf up to 26 or using -preset slow.
Want to bake in scaling, remove metadata, or trim the video too? Happy to tailor the command further.
A lot of our bad experiences with, say, customer support hotlines, municipal departments, bad high school teachers, whatever, are associated with a habit of speaking that adds flavor, vibes, or bends experiences into on-the-nose stories with morals, in part because we know they can't be reviewed or corrected by others.
Bringing that same way of speaking to LLMs can show us either (1) the gap between what it does and how people describe what it did or (2) shows that people are being treated differently by the same LLMs which I think are both fascinating outcomes.
The best way to get the right answer from an LLM is not to ask it the right question; it's to post online that it got the wrong answer.
Just because other people on here say "worked for me" doesn't invalidate OP's claim. I have had similar times where an LLM will tell me "here is a script that does X" and there is no script to be found.
Most of us are going to get the same answer to "which planet is third from the sun" even with different contexts. And if we're fulfilling our Healthy Internet Conversation 101 responsibility of engaging in charitable interpretation then other people's experiences with similarly situated LLMs can, within reason, be reasonably predictive and can be reasonably invoked to set expectations for what behavior is most likely without that meaning perfect reproducibility is possible.
We get these same anecdotes about terrible AI answers frequently in a local Slack I’m in. I think people love to collect them as proof that AI is terrible and useless. Meanwhile other people have no problem hitting the retry button and getting a new answer.
Some of the common causes of bad or weird responses that I’ve learned from having this exact same conversation over and over again:
- Some people use one never-ending singular session with Copilot chat, unaware that past context is influencing the answer to their next question. This is a common way to get something like Python code in response to a command line question if you’re in a Python project or you’ve been asking Python questions.
- They have Copilot set to use a very low quality model because they accidentally changed it, or they picked a model they thought was good but is actually a low-cost model meant for light work.
- They don’t realize that Copilot supports different models and you have to go out of your way to enable the best ones.
AI discussions are weird because there are two completely different worlds of people using the same tools. Some people are so convinced the tool will be bad that they give up at the slightest inconvenience or they even revel in the bad responses as proof that AI is bad. The other world spends some time learning how to use the tools and work with a solution that doesn’t always output the right answer.
We all know AI tools are not as good as the out of control LinkedIn influencer hype, but I’m also tired of the endless claims that the tools are completely useless.
ffmpeg -i movie.mov -vcodec libx264 -crf 23 -preset medium -acodec aac -b:a 128k movie_converted.mp4
Along with pretty detailed and decent-sounding reasoning as to why it picked these options.
Read the replies. Many folks have called gpt-4.1 through copilot and get (seemingly) valid responses.
Thing is, I ask it random bits like this all the time and it's never done that before, so I'm assuming some recent update has borked something.
Yeah, like how about answering the fucking question? lol
It’s also a website like ChatGPT apparently? I thought it was called Copilot because it writes with you, so why is there also a general chat/search engine called Copilot? Jesus.
https://www.windowslatest.com/2025/01/18/microsoft-just-rena...
Edit: They are doubling down on bad naming conventions so hard that it makes me think it's some kind of dark-pattern sales strategy.
And I would agree with them in this case.
The 'new chat' or 'new conversation' buttons seem to do nothing.
From a one line question it made me a relevant document of 45 pages examining the issue from all different sides, many of which I hadn't even thought of. It spent 30 mins working. I've never seen Perplexity spend more than 5.
I won't be surprised if they significantly nerf it to save on computing costs. I think right now they're giving it their all to build a customer base, and then they'll nerf it.
So how did MS make Copilot suck, if it started from the same base?
But, it’s mostly a RAG tool, “grounded in web” as they say. When you give Copilot a query, it uses the model to reword your query into an optimal Bing search query, fetches the results, and then crafts output using the model.
I commend their attempt to use Bing as a source of data to keep up to date and reduce hallucinations, especially in an enterprise setting where users may be more sensitive to false information. However, as a result, some of the answers it gives can only be as good as the Bing search results.
Since the launch of ChatGPT, Microsoft has had access to it and even owns some of the most popular code editors, and where did it take them? This is why Meta had to launch Threads with a very small team: a big team in Big Tech just can't compete.
Of course, like everything else, there are no absolutes, and when Big Tech feels there is an existential crisis over something, they do start improving; however, such moments are few and far between.
Also if anyone from OpenAI or any of its competitors wants to talk my email is on my HN profile ;-)
A lot of the early adopters (and driving forces) of LLMs have been tech-minded people. This means it's quite a good idea NOT to confuse them.
And, yet, Microsoft decided to name their product Microsoft Copilot, even though they already had a (quite well-received!!) Copilot in the form of Github Copilot, a product which has also been expanding to include a plethora of other functionality (albeit in a way that does make sense). How is this not incredibly confusing?
So what actually _is_ Copilot? Is there a Bing copilot? A copilot in Windows machines? Is it an online service? (I saw someone post a link to an Office 365 page?)
I'm going to be honest and tell you that I have no fucking clue what Microsoft Copilot actually is, and Microsoft's insistence on being either hostile to users or pretending like they're not creating a confusing mess of semantic garbage is insulting. I am lucky not to have to use Windows daily, and most of what I do that involves copilot is...Github Copilot.
I am knee-deep into LLMs. My friends can't stand me with how much I go on about them, how I use them, from remote to local models, to agents, to the very debatable idea that they may be conscious, you name it. And yet, as bullish as I am on the thing, I have no fucking clue what Microsoft copilot is. Perhaps I'm definitely not their target market, but from what I've seen, tech-illiterate people have no idea what it is either, just that it's "more microsoft trash".
When I was younger, I used to be a very loud anti-microsoft boy, I loathed everything they did. Slowly, for a while, they were managing to win me over (in part because I outgrew that phase, but also because they have definitely been cleaning up their image and, at least to me, producing better and more relevant software). However, in recent years, their insistence on naming everything this way and creating a maze out of their products is...baffling. I find myself unable to stand MS again.
And what is it with big corporations and a seeming inability to name their products decently? This is appalling. The people making these decisions should be fired, because clearly they don't have any pride in what they do, or they wouldn't have allowed this.
Get your shit together, microsoft!
If you look at this in isolation, yes. If you look at this historically, it's totally on-brand for Microsoft. Office 365, Live, and MSN were all brands that Microsoft slapped wholesale on things. Microsoft has always been reactive when it comes to branding, rather than proactive.
they flopped this royally, just like windows mobile. they created a shitty ux by shoving it inside the bing app, then they decided to charge for it instead of capturing all enterprise value.
lastly, the product has stalled and missed on their biggest opportunity which is tapping into the data. you can think it's because of how complex it must be, but then openai and everybody else did it.
it's truly a lesson in product mismanagement, once again, from microsoft
That's the only data point the article has, and it is incomplete (no Copilot numbers).
The rest are just testimonials (some anonymous) and stories.
Who's having more success then? No one knows. It's up to the reader to decide.
Looks like a made-up rivalry article to me: it draws clicks, but there's no actual content inside.
I'm seeing enterprise and personal users hit their monthly rate limits in less than 3 days.
1. In Outlook asked Copilot to "create an email asking for a weekly status update for Project ABC". Iterating a bit, I asked it to be more casual and shorter. Here's what I got:
Hey Team,
Hope you're all doing great. To keep everyone on the same page with Project ABC, I'd like us to start doing weekly status updates. Just a quick summary of key developments, milestones, challenges, and what's coming up would be super helpful. This way, we can coordinate better and tackle any issues right away.
Thanks for your cooperation and hard work. Looking forward to your updates!
That seems pretty on point. Not exactly my style, but to say it "cannot actually draft an email" is clearly wrong.

2. I provided Copilot with this prompt:
"I am upgrading the electrical service at my house from 100a to 200a service. Create a project plan in Planner for this project."
Here's the response I got:
Nice! Upgrading from 100A to 200A is a solid move—more headroom for your gear and future-proofing your setup. Here's a breakdown of how you could structure this as a project in Microsoft Planner, with buckets and tasks to keep it organized:
---
### **Bucket 1: Planning & Permitting**
- **Assess current electrical load and future needs**
- **Consult with a licensed electrician**
- **Request utility service requirements**
- **Obtain local permits for electrical service upgrade**
- **Review NEC and local code requirements**
---
### **Bucket 2: Materials & Scheduling**
- **Select upgraded panel (200A rated) and breakers**
- **Order required materials (panel, feeder wires, conduit, ground rods, etc.)**
- **Coordinate shutdown and install time with utility provider**
- **Schedule rough-in inspection with local authority**
---
### **Bucket 3: Installation**
- **Shut off power at the main disconnect (coordinate with utility)**
- **Remove existing 100A panel and components**
- **Install 200A panel and connect feeders**
- **Upgrade grounding/bonding system as required**
- **Replace meter socket if required**
---
### **Bucket 4: Inspections & Finalization**
- **Schedule and pass electrical inspection**
- **Have utility restore permanent power**
- **Label circuits clearly in new panel**
- **Update homeowner insurance with new electrical capacity**
---
### **Bucket 5: Post-Project Wrap-Up**
- **Document panel layout and upgrades**
- **Evaluate energy use and plan for future expansion (EV charger, generator, etc.)**
- **Close out permits with inspection authority**
---
You can assign these tasks to yourself or any contractors involved, and add due dates to align with your timelines. Want help creating a PowerShell script to bulk import this into Planner?
It did produce a working script that required one configuration edit. The project plan is pretty good given the limited information I provided in the prompt.

3. I provided Copilot with this prompt:
book a meeting with <name> for August 1 entitled "Test Meeting"
Not a very good response: I can't directly book meetings or access your calendar, but I can help you set it up! Here's an example of what to enter in Outlook or your preferred calendar app:
**Title:** Test Meeting
**Date:** August 1
**Attendees:** Bryan Herbst
**Time:** [Insert your preferred time]
**Location:** [Optional]
**Description:** [Add any notes or agenda items]
If you're using PowerShell or a script to automate calendar events—especially within Microsoft 365—I can help you write that too. Want to go that route?
1) "Iterating a bit". Like, that email is barely 5 sentences; you could write it faster than coaxing it out of Copilot.
2) It is drivel. Could have stopped at *Consult with a licensed electrician*.
3) Well...
My precise request: "Extract the list of field names in Exhibit A."
Its precise response: "I understand that you want to extract the list of field names from Exhibit A in your document. Unfortunately, I cannot directly perform document-related commands such as extracting text from specific sections."
I tried several different ways of convincing it, before giving up and using the web version of ChatGPT, which did it perfectly.
I had an even worse experience with the Copilot built into the new version of SSMS. It just won't look at the query window at all. You have to copy and paste the text of your query into the chat window ... which, like, what's the point then?
Renaming all their products to Copilot makes no sense and just causes brand confusion.
Copilot getting access to your entire 365/Azure tenant is just a security nightmare waiting to happen (in fact, there's already that one published and presumably patched vuln).
It has so many shackles on that its functionally useless. Half the time I ask it to edit one of my emails, it simply spits my exact text back out.
Its one singular advantage is that it has a crystal-clear corpospeak license surrounding what it says your data will be used for. Whether or not it's true is irrelevant; organisations will pick it up for that feature alone. No one ever got fired for choosing IBM, etc.
This reminds me of IBM Watson back in the day.
So it often comes down to this choice: Open https://copilot.cloud.microsoft/, go through the Microsoft 365 login process, dig your phone out for two-factor authentication, approve it via Microsoft Authenticator, finally type your request only to get a response that feels strangely lobotomized.
Or… just go to https://chatgpt.com/, type your prompt, and actually get an answer you can work with.
It feels like every part of Microsoft wants to do the right thing, but in the end they come out with an inferior product.
ChatGPT. Perplexity. Google AI Mode. All let you get a message off.
… WAIT! copilot dot microsoft dot com lets you just send a message without logging in.
—
heh, the second result on DuckDuckGo is an MS article: “What is Copilot, and how can you use it?”
Products mentioned in the article, they say:
| Copilot | Copilot app | Copilot for individuals |
And a link for each one. Does Satya squirm when he sees that, but doesn’t have the power to change it?
Also the word “individuals” (allegedly previously mentioned) appears only once on the page.