i.MX8 machine learning - inference engine

818 Views
eli2
Contributor I

Hi,

Is anyone familiar with a NON-proprietary GPU-accelerated inference engine for the i.MX8 family?

Thanks.

0 Kudos
Reply
2 Replies

613 Views
igorpadykov
NXP Employee

Hi

Such an approach is developed by the community and may be posted on the meta-fsl-arm mailing list:

https://lists.yoctoproject.org/listinfo/meta-freescale

Upstream Linux Support for New NXP i.MX 8 | Linux.com

Best regards
igor
-----------------------------------------------------------------------------------------------------------------------
Note: If this post answers your question, please click the Correct Answer button. Thank you!
-----------------------------------------------------------------------------------------------------------------------

0 Kudos
Reply

613 Views
eli2
Contributor I

Thanks, Igor.

I've subscribed to this mailing list. However, can you please provide direct pointers to more information about this?

Regards,

Eli.

0 Kudos
Reply