Eerke Boiten, Professor of Cyber Security at De Montfort University Leicester, explains his belief that current AI should not be used for serious applications.
Yes, but what you need to be doing is tons of multiply-accumulates, using a huge amount of memory bandwidth… which a GPU is designed for. You won't design anything much better with an FPGA.
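A minimal sketch of the point being made, using NumPy as a stand-in for a neural-network layer (the sizes here are illustrative assumptions): inference is dominated by dense multiply-accumulates, and each weight loaded from memory contributes only about two floating-point operations, so memory bandwidth, not raw compute, tends to be the bottleneck — exactly the profile GPUs are built for.

```python
import numpy as np

# A single dense layer y = W @ x is nothing but multiply-accumulates.
# For an (m, n) weight matrix there are m*n multiplies and m*n adds,
# while every weight must be streamed from memory once: roughly
# 2 FLOPs per 4 bytes loaded in fp32 — very low arithmetic intensity.
m, n = 4096, 4096  # illustrative layer size, not from the source
W = np.random.randn(m, n).astype(np.float32)
x = np.random.randn(n).astype(np.float32)

y = W @ x  # the multiply-accumulate workload itself

macs = m * n          # multiply-accumulate operations per forward pass
bytes_read = W.nbytes  # weight traffic dominates memory bandwidth use
print(macs, bytes_read)
```

With these sizes, that is ~16.8 million MACs against ~67 MB of weight reads, i.e. far more bytes moved per arithmetic operation than a CPU or a hand-built FPGA datapath can typically feed.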