This is fascinating: a way to trip up computer vision by placing a specially crafted sticker in a scene, fooling image classifiers into misreading what they see.
As the researchers write, the sticker “allows attackers to create a physical-world attack without prior knowledge of the lighting conditions, camera angle, type of classifier being attacked, or even the other items within the scene.” So, after such an image is generated, it could be “distributed across the Internet for other attackers to print out and use.” (via The Verge)
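To get a feel for the idea, here is a toy sketch of patch optimization. This is not the researchers' actual method (they optimize over real networks and over random placements, scales, and rotations); it is a minimal, assumed setup with a frozen random linear softmax "classifier", a fixed patch location, and invented sizes, just to show the core loop: gradient ascent on the target class's probability, restricted to the patch pixels.

```python
import numpy as np

# Toy sketch of the adversarial-patch idea (NOT the paper's method):
# optimize a patch of pixels so a frozen linear softmax "classifier"
# predicts a chosen target class for images the patch is pasted onto.
# All sizes, the classifier, and the loop here are illustrative assumptions.

rng = np.random.default_rng(0)

N_CLASSES, SIDE, PATCH = 4, 8, 6                # toy dimensions (assumed)
ROW, COL = 1, 1                                 # fixed patch location, for simplicity
W = rng.normal(size=(N_CLASSES, SIDE * SIDE))   # frozen random classifier weights

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def paste(img, patch):
    out = img.copy()
    out[ROW:ROW + PATCH, COL:COL + PATCH] = patch
    return out

def train_patch(target, steps=300, lr=0.5):
    patch = rng.uniform(0, 1, size=(PATCH, PATCH))
    for _ in range(steps):
        img = rng.uniform(0, 1, size=(SIDE, SIDE))   # random background scene
        x = paste(img, patch).ravel()
        p = softmax(W @ x)
        # gradient of log p[target] w.r.t. input pixels: W_t - sum_k p_k W_k
        g = (W[target] - p @ W).reshape(SIDE, SIDE)
        g_patch = g[ROW:ROW + PATCH, COL:COL + PATCH]
        patch = np.clip(patch + lr * g_patch, 0.0, 1.0)  # ascent; keep pixels valid
    return patch

TARGET = 2
patch = train_patch(TARGET)

# Evaluate: how often does the patch force the target prediction on new scenes?
hits = sum(
    int(np.argmax(W @ paste(rng.uniform(0, 1, size=(SIDE, SIDE)), patch).ravel()) == TARGET)
    for _ in range(100)
)
print(f"target-class rate with patch: {hits}%")
```

Because the patch is trained against many random backgrounds rather than one fixed input, it ends up working across scenes, which is a small-scale version of the scene- and angle-independence the researchers describe.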
Want to leave a comment? If you'd like to give me some feedback on this post, please contact me via email or on Twitter.