
The M365 Copilot Assistants, Zey Do Nothing


Many Liked It, But Didn’t Use It

The UK Government tested the effectiveness of Microsoft’s AI-powered M365 Copilot assistant, picking up 1,000 Copilot licenses to distribute to a selection of employees. The results are somewhat amusing, with 72% of those users stating they were satisfied or very satisfied in a survey at the end of the trial. Users mostly turned to Copilot in Word, Teams, and Outlook, with summarizing and tweaking text the most popular tasks. The actual usage tells a different story, however: only about 66% used Copilot at least once a week, 30% used it at least once a day, and the remainder even less. That averages out to 1.14 M365 Copilot actions taken per user per day.

At a cost running between £4.90 and £18.10 per user per month, this is not a great deal, as that adds up to a big bill for very little usage. It’s nice that the users liked M365 Copilot, but it’s hard to form a fully fledged opinion from such a tiny amount of use. The results of the usage that did take place were mixed. Copilot-enhanced emails were judged to be of higher quality than those written without AI assistance, but that was the only clear benefit. PowerPoint slides were created more quickly with Copilot, but the quality was poor and they required extra work to bring them up to an acceptable standard. Excel work was slower and significantly less accurate than spreadsheets built without Copilot, which is certainly a problem!

GitHub users are not quite as conflicted. The second most popular discussion on GitHub is how to disable Copilot, and as of yet no one, including Microsoft, has come up with a way to do so. Developers are up in arms over the fact that they can’t stop Copilot from training on the code they create, even when it is under a license that should prevent it. Completely inaccurate AI-generated bug reports are tying up huge amounts of their time and can’t be stopped, and Copilot then references those Copilot-generated posts to spread the false information across GitHub. Since there is almost no way to tell that a comment was written by Copilot about a bug hallucinated by another Copilot post, real information is being drowned in a sea of disinformation.

Yay Copilot. 
