Deepfakes by fraudsters are on the rise—but financial institutions can catch them before accountholders are harmed. Deepfake images and video with increasingly realistic attributes can be used to circumvent identity verification and authentication methods when opening bank accounts or taking out credit. And it’s a financial crime made easier by the advent of generative artificial intelligence (gen AI) tools, according to a November alert issued by the U.S. Department of the Treasury’s Financial Crimes Enforcement Network (FinCEN).
FinCEN’s alert explains typologies associated with the latest deepfake schemes and provides red-flag indicators to assist with identifying and reporting related suspicious activity (see accompanying graphic on page 4).



