
How to tell Core ML to use eGPU?

I am using a piece of software (StarXTerminator) to remove stars from an astrophoto. The program relies heavily on Core ML on macOS, and Core ML uses the GPU to accelerate the process.


If I run StarXTerminator on a 2019 Mac Pro (macOS 12.6) with an AMD 5700XT card installed, it uses the 5700XT flawlessly. However, if I run StarXTerminator on a 2018 MacBook Pro (macOS 12.6) with the 5700XT sitting in an Akitio eGPU enclosure, it always ignores the eGPU and uses the Radeon Pro 560X inside the MacBook Pro.


The author of StarXTerminator said he simply hands the data to Core ML and lets it decide which available GPU to use for the task; the choice of GPU is out of his hands.


The question: is there any way to force the Core ML library to use the eGPU on the MacBook Pro? Terminal commands? Utilities?


Thanks for any help.


Hojong Lin




MacBook Pro

Posted on Oct 7, 2022 6:37 PM

Question marked as Top-ranking reply

Posted on Oct 7, 2022 7:19 PM

There is an API to select the preferred GPU (see this link), but it has to be used by the app. You can't set it yourself.
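
For reference, a minimal Swift sketch of what that app-side selection could look like, assuming the model is loaded through MLModelConfiguration; the model class name below is hypothetical, since StarXTerminator's model is not public:

import CoreML
import Metal

// Look for an external GPU among all Metal devices; eGPUs report isRemovable == true.
// Fall back to the system default GPU if no eGPU is attached.
let allDevices = MTLCopyAllDevices()
let eGPU = allDevices.first(where: { $0.isRemovable }) ?? MTLCreateSystemDefaultDevice()

// Tell Core ML which GPU to prefer. This is a hint, not a guarantee.
let config = MLModelConfiguration()
config.computeUnits = .cpuAndGPU
config.preferredMetalDevice = eGPU

// Load the model with that configuration (hypothetical model class name).
// let model = try StarRemovalModel(configuration: config)

Because preferredMetalDevice lives on MLModelConfiguration, only the developer can set it when loading the model; there is no system-wide switch or Terminal command that forces Core ML onto the eGPU for an arbitrary app.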

