Next-level computing powered by Intel Nervana
The Intel® Nervana™ Graph beta release enables bigger models with higher performance and more front-end framework connectors.
neon™ 2.0 delivers significantly improved performance on Intel architectures through integration with the Intel® Math Kernel Library.
Run your deep learning applications with the ultra-low power Neural Compute Stick from Movidius. This USB toolkit compiles, tunes, and accelerates neural networks at the edge.
NASA FDL is applying Intel Nervana technology to lunar prospecting.
Network Specialist, Deep Learning Platform
Ryan has over a decade of experience designing, building, managing, and updating IT and network infrastructure. Prior to joining Intel via the Nervana acquisition, he designed and built a multi-node cluster for training machine learning algorithms at a stealth mode data science startup.
Senior Software Engineer, Intel Nervana Graph
Anahita Bhiwandiwalla holds an M.S. in Computer Science from Columbia University, specializing in machine learning. Anahita's main interests are machine learning, natural language processing, speech recognition, and data mining. She has presented her work at various meetups, webinars, and conferences such as PyCon and the Data Science Summit.
Algorithms Engineer, AI Products
Hanlin builds deep learning models in computer vision and has applied them to domains such as satellite imagery and computational neuroscience. He also leads the group's AI projects with defense and intelligence agencies. Prior to Intel and Nervana, he investigated recurrent neural networks in the human brain as part of his PhD.
Head of Products, AI Products
Mark earned a BS and MS in electrical engineering from Cornell and Caltech, respectively, where he also studied neural networks. Mark holds an MBA from Harvard Business School. Prior to Intel, Mark served as VP Product for Influitive and, before that, as VP Product for Chegg. Before Chegg, Mark was co-founder/CEO of Grouply, a social networking startup. Before Grouply, Mark was Sr. Director of Product Management at Siebel Systems through its acquisition by Oracle in 2006.
Head of Data Science, AI Products
Yinyin leads the data science efforts, applying deep learning and Intel Nervana technologies to business applications across industry domains and driving the design and development of the Intel Nervana Platform. She and the Intel Nervana team developed the open-source deep learning frameworks neon and Intel Nervana Graph, bringing state-of-the-art models for image recognition, image localization, and natural language processing into the frameworks and into deep learning solutions. Yinyin also has experience in computer vision, neuromorphic computing, and robotics.
Senior Principal Engineer, AI Products
Moustapha was a key contributor to a number of performance features in successive Intel processors, most notably the architecture of AVX-512 and its performance evaluation. He has more than 200 patents granted or pending in the areas of computer architecture, performance optimization, HPC, cloud, and machine learning, as well as over 20 peer-reviewed research publications.
Staff Data Scientist, AI Products
Anthony is an engineer working on deep neural network models and algorithms within the Artificial Intelligence Products Group at Intel. Previously, he worked on a variety of problems at IBM Research, including quantum computing, spintronics, and neuromorphic computing. Anthony holds a PhD in theoretical physics from the University of California, Berkeley, where he worked on string theory as well as topological phases, topological insulators, and the fractional quantum Hall effect.
Keep tabs on all the latest news with our monthly newsletter.