The trend for sharing explicit selfies is fuelling a surge in blackmail, with cases up almost 60 per cent worldwide in a year. Now a phone firm has dreamed up a new way of helping exhibitionists preserve their modesty.

The latest handset from Japanese company Tone Mobile is fitted with artificial intelligence that knows when you’re trying to snap an explicit selfie. If it spots a vast expanse of naked flesh, the gadget flashes a warning: ‘Photo not taken due to inappropriate content.’

The tech company isn’t necessarily concerned about adults sending naked images to each other. Instead, it wants to protect children from sextortion attempts, in which predatory hackers solicit nudes and then blackmail victims by threatening to release the images. The system, called ‘smartphone protection’, can even be set up by parents so they receive an alert, along with a pixelated version of the image, should their child try to take a picture of their naked body.

An earlier report noted that around 70,000 photos of female Tinder users had been leaked on a forum dedicated to cybercrime, raising concern in the cybersecurity community over the potential malicious use of the exposed files.