ADLINK Technology Inc., a global leader in edge computing, announces its collaboration with NVIDIA to incorporate NVIDIA® Jetson™ Orin SOMs (system-on-modules) into the newest members of its DLAP AI Inference Platform series. The ADLINK DLAP-400 and DLAP-200 series will feature NVIDIA® Jetson AGX Orin™ and Jetson Orin™ NX modules, respectively.
ADLINK’s new DLAP-400 and DLAP-200 series Edge AI Platforms, based on NVIDIA® Jetson™ Orin technology, are set to provide enhanced deep-learning acceleration with a wide variety of industrial I/O ports and visual inferencing capabilities, all packed in a durable, fanless, and energy-efficient design that supports extended operating temperatures.
“As demands for image rendering, analytics, compute acceleration, and artificial intelligence continue to grow in the embedded market segment, we are privileged to work with NVIDIA to bring the latest NVIDIA Jetson technology into ADLINK’s DLAP family,” said Ethan Chen, ADLINK’s senior manager of the embedded platforms and modules BU. “Since NVIDIA introduced the Jetson Orin modules, their enhanced AI performance and energy efficiency have been sought after by many of ADLINK’s clients. With NVIDIA Jetson Orin optimized for deep-learning AI applications, we are able to provide our clients with more powerful edge inference platforms for vertical industries, including medical, manufacturing, aerospace, transportation, maritime, hospitality, and more.”
The NVIDIA® Jetson™ Orin SOMs utilize the NVIDIA Ampere architecture with up to 2048 NVIDIA® CUDA® cores, 64 Tensor Cores, and an up-to-12-core Arm® Cortex®-A78AE v8.2 64-bit CPU to deliver up to 275 TOPS of AI performance, eight times that of the previous generation, while maintaining the same compact form factor and pin compatibility.