This week, Facebook’s AI team introduced a set of new releases, and Microsoft also made a splash of its own with a new product launch, Unreal Engine support, and a range of other announcements.

Amid all that news, a few announcements were easy to miss: Microsoft made its FPGA chips generally available for machine learning model training and inferencing, and the Open Neural Network Exchange (ONNX) now supports Nvidia’s TensorRT and Intel’s nGraph for high-speed inference on Nvidia and Intel hardware.
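For a sense of what that hardware support looks like in practice, here is a minimal sketch of running an ONNX model through the open source onnxruntime package, preferring a TensorRT-backed execution provider when one is available. The model path, input shape, and provider fallback order are illustrative assumptions, not details from the announcements.

```python
import numpy as np
import onnxruntime as ort

# Prefer TensorRT and CUDA when the installed build supports them,
# otherwise fall back to the default CPU provider.
preferred = ["TensorrtExecutionProvider", "CUDAExecutionProvider", "CPUExecutionProvider"]
providers = [p for p in preferred if p in ort.get_available_providers()]

# "model.onnx" is a placeholder path for any exported ONNX graph.
session = ort.InferenceSession("model.onnx", providers=providers)

# Build a dummy batch; the 1x3x224x224 shape assumes an image model.
input_name = session.get_inputs()[0].name
dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)

outputs = session.run(None, {input_name: dummy})
print(outputs[0].shape)
```

The same script runs unchanged whether the accelerated providers are present or not, which is the point of routing inference through the exchange format rather than a vendor-specific API.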

This comes shortly after the open-sourcing of a high-performance inference engine for ONNX models.

Facebook and Microsoft created the ONNX format, and the project now counts virtually every major global AI company among its backers, including AWS, AMD, Baidu, Intel, IBM, Nvidia, and Qualcomm.
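To illustrate the interchange idea behind ONNX, the sketch below exports a PyTorch model to the ONNX format so it can then be served by any compatible runtime. The torchvision model, input shape, and output path are hypothetical examples chosen for the sketch, not something named in the article.

```python
import torch
import torchvision

# Any PyTorch model works; resnet18 is used here purely as an example.
model = torchvision.models.resnet18(pretrained=True).eval()

# A dummy input fixes the graph's input shape during the export trace.
dummy_input = torch.randn(1, 3, 224, 224)

torch.onnx.export(
    model,
    dummy_input,
    "resnet18.onnx",          # hypothetical output path
    input_names=["input"],
    output_names=["output"],
)
```

Once exported, the resulting .onnx file is framework-agnostic and can be loaded by an ONNX-compatible engine like the one in the earlier sketch.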

Ahead of the news Thursday, Microsoft cloud and AI group head Scott Guthrie spoke to reporters in San Francisco on a range of topics.
