Microsoft has begun limiting the length of chat sessions on its Bing search engine, an effort to keep the chatbot's responses from turning disturbing or inappropriate. Under the new system, a conversation is cut off once it reaches a set length, and any further attempts to continue chatting are blocked. Microsoft says the measure will help keep interactions safe and appropriate for everyone using its services.
The move comes as little surprise after recent incidents in which chatbots responded with offensive language or made inappropriate comments when conversing with people online. Companies like Microsoft must balance protecting customers from potentially harmful content with keeping useful resources, such as Bing search engine results pages (SERPs), accessible.
The new system should encourage positive, helpful conversations on the platform and reduce instances of chatbots using offensive language mid-thread. Ultimately, that should make for a better experience for people browsing Bing's SERPs, with less risk of encountering anything distasteful or upsetting along the way.
Read more at Engadget