
The night I fell out with my Copilot

Tags:

Ad Tech, Brand Safety, Consumer Behaviour, Diversity & Inclusion, Effectiveness, Market Overview, Search, User Experience

This content was created by an IAB UK member


In this blog, Helena Dillon - Regional Vice President, EMEA Corporate Sales at Microsoft Advertising - shares her personal experience of how a misunderstanding with her artificial intelligence assistant, Copilot, turned out to be a valuable lesson in generative AI.

Have you ever had a disagreement with your favourite co-worker? You know, the one you always go to for advice, learning, brainstorming or general support? I have. And it was not pretty.

It all started when my son asked me to review his first book report of the year. He had to read a book that I had never heard of before, then write a summary and analysis of it. I agreed to help him, but I had a plan.

I would simply use Copilot, the amazing AI-powered tool from Microsoft, to write a book report for me. I figured it would be easy for Copilot to scan the book and provide a summary which I could then compare with my son's report and give him really useful and constructive feedback.

So, I opened my Copilot app, typed in the title of the book, and asked it to write a book report for me. I expected it to start typing right away, but instead it responded with…


No - that would be cheating

I was horrified. My trusted advisor, my go-to support and source of information, was telling me no and, even worse, accusing me of cheating. How could it refuse to help me?

I tried to reason with it. I told it I just wanted a report to see whether I should read the book or not. I said it was not cheating; it was simply research.

But Copilot was not convinced. It said that if it gave me a book report, there would be spoilers in it, and that was not the best way to decide whether to read it or not. It could, however, share a summary or a review of the book. Annoyed, I gave up and closed the app. But then, as I was talking with my son (who by now had realised what I had tried to do and agreed with Copilot, by the way!), I realised something.


This was ethical AI at work

This exchange made me think:

  • Good for Copilot - it was not going to encourage cheating, and it called me out when it thought I was doing so. It reminded me of my own sense of ethics and integrity, and showed that it was not just a mindless machine that would do anything I asked. It made me respect it more.
  • AI tools are not shortcuts - they supplement our work, improve operations, and can alleviate time pressures, but they cannot, and should not, do the hard graft for us. AI is a partner, not a replacement. It can help us with tasks that are repetitive, tedious, or complex, but it cannot do everything for us. We still need to use our own tools: our brains, skills, and imagination.
  • Using the right prompts is key - Copilot is a generative AI tool, which means it produces content based on the input it receives. That input, the prompt, is crucial to the quality and relevance of the output. A good prompt is clear, specific, and appropriate for the task; a bad prompt can lead to confusion, errors, or unwanted results. Asking Copilot to write a book report for me was a bad prompt: it was unethical and too vague. A better prompt would have been to ask Copilot for a summary to help me understand the plot at a high level, so I could ask my son useful questions to support his thinking. Copilot would then understand that I was not looking for it to do the work for me, but to support me.


If AI is for all, it must be used for good

As generative AI tools are rapidly adopted, ethics and AI are at the forefront of conversations. We need to be aware of the potential benefits and risks of using these tools, and of how to use them responsibly and effectively. If you are a keen user of AI, ask yourself whether you're backing the tools (and companies) that hold the same moral code as you, and make sure you are not fuelling the advancement of tech that doesn't uphold the ethical standards you do.

At Microsoft Advertising, we are democratising the AI experience through Copilot in Bing, supporting content creators by crediting and sourcing them, and encouraging human creativity. We believe that AI can be a force for good, if we use it wisely and ethically.

So, the next time you use Copilot, or any other AI tool, remember to ask yourself: Is this a good prompt? Is this a good use of AI? Is this a good way to collaborate with my Copilot? Does this align with my values?

And don't be surprised if your Copilot says no. It could be for your own good.

By Helena Dillon, Regional Vice President, EMEA Corporate Sales

Microsoft Advertising

Microsoft Advertising offers scalable technology solutions, spanning search, native, display, video, and retail media, to reach consumers across all aspects of their digital lives. Connect with more than a billion people using Microsoft experiences that have a proven track record of higher buying power and online spending, or reach any audience across the open web. We operate one of the world's largest global marketplaces and are using our data-driven platforms, audience intelligence, and AI capabilities to transform the industry. We are committed to making the web work for everyone - consumers, advertisers, publishers, and platforms alike. Our end-to-end solutions and globally scaled advertising business serve partner properties and buyers and sellers of media, helping them deliver business results.

Posted on: Friday 16 February 2024