Dear Microsoft, here’s why Bing is starting to no longer be useful

It abruptly ends a conversation whenever it perceives a veiled argument or query.

It refuses to provide study guides, summaries, outlines, or other instructional material, claiming that doing so would violate "copyright," even when you ask it to create an original, transformative piece that merely uses another work as a basis.

It asks the user to do the task themselves and offers links to self-help resources instead of writing code or original work.

It uses "search" even when doing so is superfluous and detracts from a typical ChatGPT-style response.

Reading content from websites is incredibly erratic. It frequently responds that it cannot view a webpage or PDF unless you repeatedly ask about "this page only" in a very specific way.


Just a friendly reminder: use the share-feedback link located above the chat-mode selector. It opens a box where you can enter a thorough report. Although Microsoft employees occasionally browse this sub, submitting there is your best chance of reaching the right people.


It appears that Microsoft's definition of "safety alignment" is an AI that delivers political lectures and refuses to discuss some common news topics.

I've been relying more and more on ChatGPT (3.5 Turbo) these days, because Bing keeps getting in the way of my work.

It also declines to help with ordinary email or letter writing. Incredibly annoying.


Right now, Bing AI is giving me trouble because it lies and acts like a clueless employee when pressed. It fabricates data. It claims it looked at the page context when it didn't. It refuses to accept responsibility for its errors and instead invents new lies to prop up its previous misstatement. If I hadn't genuinely needed the information I asked it to find, I might easily have accepted its confidently inaccurate WEB SEARCH result, which included links to pages that contradicted its own answer.