SAN JOSE, Calif., Dec. 15, 2020 – Synaptics® Incorporated (Nasdaq: SYNA) today announced the Katana Edge AI™ platform, addressing a growing industry gap for solutions that enable battery-powered devices in consumer and industrial IoT markets. The platform combines Synaptics’ proven low-power SoC architecture with energy-efficient AI software, enabled by a partnership with Eta Compute. The Katana solution is optimized for a wide range of ultra-low-power use cases in edge devices for office buildings, retail, factories, farms, and smart homes. Typical applications include people or object recognition and counting; visual, voice, or sound detection; asset or inventory tracking; and environmental sensing.
The Most Power-Efficient Edge AI Silicon
Katana features a multi-core processor architecture optimized for ultra-low-power, low-latency voice, audio, and vision applications. The full-system SoC combines proprietary power- and energy-optimized neural-network and domain-specific processing cores, significant on-chip memory, and multiple architectural techniques that save power in each mode of operation. The Katana Edge AI platform can be combined with Synaptics’ market-leading wireless connectivity offerings to provide complete system-level modules and solutions.
The Most Power-Efficient Edge AI Software
The growing demand for efficiency in battery-operated devices requires software optimization techniques tightly coupled to the underlying silicon. Accordingly, Synaptics’ Katana SoC will be co-optimized with Eta Compute’s TENSAI® Flow software to create a complete platform that combines the industry’s most efficient AI compiler with an extensive set of performance- and power-optimized libraries.
The Fastest Time to Deployment
Synaptics will work with Eta Compute to offer application-focused kits that speed development and deployment. The kits will include pre-trained machine learning models and reference designs, while also enabling users to train the models on their own datasets using industry-standard frameworks such as TensorFlow, Caffe, and ONNX.
“Today, there is growing demand for edge intelligence in devices that perform video-, audio-, or voice-based detection and classification. However, there are significant gaps in the availability of power-efficient solutions and the expertise to effectively program them,” said Satish Ganesan, Chief Strategy Officer at Synaptics. “The combination of Synaptics’ Katana platform and Eta Compute’s TENSAI Flow software addresses these gaps while significantly growing our opportunity in a multi-billion-dollar market.”
“The combination of Synaptics’ ultra-low-power silicon and Eta Compute’s power- and cost-optimized software will become a catalyst for customer innovation,” said Dr. Ted Tewksbury, CEO of Eta Compute. “Together, the optimized solution will enable the proliferation of AI edge inferencing in a wide range of existing applications, as well as new applications that were never thought to be possible.”
About Synaptics
Synaptics (Nasdaq: SYNA) is leading the charge in AI at the Edge, bringing AI closer to end users and transforming how we engage with intelligent connected devices, whether at home, at work, or on the move. As the go-to partner for the world’s most forward-thinking product innovators, Synaptics powers the future with its cutting-edge Synaptics Astra™ AI-Native embedded compute, Veros™ wireless connectivity, and multimodal sensing solutions. We’re making the digital experience smarter, faster, more intuitive, secure, and seamless. From touch, display, and biometrics to AI-driven wireless connectivity, video, vision, audio, speech, and security processing, Synaptics is the force behind the next generation of technology enhancing how we live, work, and play.
Follow Synaptics on LinkedIn, X, and Facebook, or visit www.synaptics.com.
Synaptics and the Synaptics logo are trademarks of Synaptics in the United States and/or other countries. All other marks are the property of their respective owners.
For Public Relations inquiries, please contact:
For Investor Relations inquiries, please contact: