Device-based Models with TensorFlow Lite


Below are the top discussions from Reddit that mention this online Coursera course from DeepLearning.AI.

Offered by DeepLearning.AI. Bringing a machine learning model into the real world involves a lot more than just modeling. This ...


Taught by
Laurence Moroney
Instructor
and 10 more instructors

Offered by
DeepLearning.AI

Reddit Posts and Comments

0 posts • 2 mentions • top 2 shown below

r/learnmachinelearning • comment
1 point • CodeScraper

I think this might help you.

r/deeplearning • comment
1 point • mohself

On Android phones, you can either convert your model to TFLite if it is a TensorFlow model, or convert it to a DLC model for Qualcomm's SNPE SDK if it is TensorFlow or ONNX based, and then run it on the phone.

Both TFLite and DLC models can run on the CPU, or on the GPU and DSP if your Android phone uses a Qualcomm Snapdragon chipset. A large model will need to be optimized and/or quantized before it is executable on the phone.
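For readers wanting concrete details, the conversion and quantization step this comment describes maps onto the TensorFlow Lite converter API. A minimal sketch in Python, assuming a trained TensorFlow model exported as a SavedModel (the directory and file names here are placeholders):

    import tensorflow as tf

    # Load a trained model from a SavedModel directory (placeholder path).
    converter = tf.lite.TFLiteConverter.from_saved_model("./saved_model")

    # Dynamic-range quantization: weights are stored as 8-bit integers,
    # which shrinks the model and speeds up CPU inference on the phone.
    converter.optimizations = [tf.lite.Optimize.DEFAULT]

    tflite_model = converter.convert()

    # Write the flatbuffer to disk; this .tflite file is what ships in the app.
    with open("model.tflite", "wb") as f:
        f.write(tflite_model)

On the Android side, a GPU or NNAPI delegate can then be attached to the TFLite interpreter to target the accelerators the comment mentions.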


As for courses, please check out this Coursera course on deploying models on devices.
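Before deploying, it is common to sanity-check the converted model on a desktop using the TFLite interpreter bundled with TensorFlow. A rough sketch, assuming the model.tflite file produced above:

    import numpy as np
    import tensorflow as tf

    # Load the converted model and allocate its tensors.
    interpreter = tf.lite.Interpreter(model_path="model.tflite")
    interpreter.allocate_tensors()

    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()

    # Feed a dummy input that matches the model's expected shape and dtype.
    dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
    interpreter.set_tensor(input_details[0]["index"], dummy)
    interpreter.invoke()

    output = interpreter.get_tensor(output_details[0]["index"])
    print("output shape:", output.shape)

On a phone, the equivalent calls go through the TensorFlow Lite Interpreter API for Android (Java/Kotlin), which is the kind of workflow the linked course covers.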