Efficient AI Inference With Analog Processing In Memory

Tanner Andrulis is a Graduate Research Assistant at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL), specializing in accelerator design for tensor applications and machine learning, with a focus on innovative analog and processing-in-memory systems. With a diverse background encompassing embedded software, hardware, mathematics, and AI, Tanner is an adept researcher and problem solver.