The Urban Dictionary defines the term Text Regret as the "sending of a text and immediately regretting that you ever wrote the text in the first place". And now, social media giant Facebook will, in several months, make available to its users a feature that will save them from this regret.
According to TechCrunch, Facebook plans to launch an "unsend" feature in several months and is already considering how to build the product.
On a related note, it was earlier reported that the social media platform had retracted Facebook messages sent by its CEO, Mark Zuckerberg, and a few other executives from their recipients' inboxes.
Three sources had confirmed to TechCrunch that old Facebook messages they had received from Zuckerberg had disappeared from their Facebook inboxes, while their own replies to the co-founder had conspicuously remained.
The company also said that it won't unsend or retract any more of Zuckerberg's messages until the feature is made available to a wider audience.
Facebook's subsidiary and instant messaging platform WhatsApp rolled out a similar feature called 'Delete for everyone' in November 2017.
Meanwhile, amid the Facebook-Cambridge Analytica row, a report has surfaced online in which the Facebook CEO confirmed that Facebook scans all the images, videos and links users send to people via its Messenger app. The tech giant also admitted to reading chats when they are flagged to moderators, to make sure that the content abides by the company's rules.
In a conversation with Vox's editor Ezra Klein last week, Mark Zuckerberg recounted receiving a phone call related to ethnic cleansing in Myanmar: the company had detected people trying to send sensational messages through the Messenger app.
Facebook confirmed to Bloomberg that while Messenger conversations are private, Facebook scans them and uses the same tools to prevent abuse there that it does on the social network more generally. A Facebook Messenger spokeswoman said in a statement, “For example, on Messenger, when you send a photo, our automated systems scan it using photo matching technology to detect known child exploitation imagery or when you send a link, we scan it for malware or viruses. Facebook designed these automated tools so we can rapidly stop abusive behavior on our platform.”
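The photo-matching approach the spokeswoman describes generally works by comparing a fingerprint of each outgoing attachment against a database of fingerprints of previously flagged files. As a rough illustration only: production systems use perceptual hashes (such as Microsoft's PhotoDNA) that survive resizing and re-encoding, whereas this simplified sketch uses a plain cryptographic digest, which matches only byte-identical files; the blocklist contents and function names here are hypothetical.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Digest of a file's bytes; real systems use perceptual hashes instead."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical blocklist: digests of files previously flagged by moderators.
blocklist = {fingerprint(b"previously-flagged-image-bytes")}

def is_flagged(attachment: bytes) -> bool:
    """Check an outgoing attachment against the blocklist before delivery."""
    return fingerprint(attachment) in blocklist
```

Because only fingerprints are compared, such a check can run automatically on every message without a human reading the conversation, which is consistent with Facebook's claim that the scanning is done by automated systems.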
(With inputs from agencies)