micromamba complains from @pypi_base about package incompatibility on Intel macOS #1930
@Shuaijun-Gao can you help me with the error message? |
Can someone assign this to me? I can take a look. |
It always says |
We don't create the environment on your local Mac for Linux - we just do a dry run to get a list of dependencies, download those packages, and upload them to S3 if they don't already exist there.
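The dry-run step described above can be sketched as follows. This is an illustrative sketch, not Metaflow's actual implementation (the function and environment names are mine), but the micromamba flags are real: --dry-run solves the environment without installing anything, and --platform targets a foreign subdir such as linux-64 even when running on macOS.

```python
# A minimal sketch of the dry-run idea described above (not Metaflow's actual
# code; helper and environment names here are illustrative).
def build_dry_run_cmd(packages, python="3.9", platform="linux-64"):
    """Build a micromamba invocation that only *solves* the environment
    for `platform` and reports the resolved packages as JSON."""
    specs = [f"python={python}"] + [f"{n}={v}" for n, v in packages.items()]
    return [
        "micromamba", "create",
        "--dry-run",             # solve only; nothing is installed locally
        "--json",                # machine-readable solver output
        "--platform", platform,  # e.g. linux-64 even when running on macOS
        "-n", "scratch",         # throwaway environment name
        *specs,
    ]

cmd = build_dry_run_cmd({"numpy": "1.26.4"})
```

The resolved package list from the JSON output is what would then be downloaded and uploaded to S3 for the remote task.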
This works well for me, but I am on an M1 Mac. |
Yeah, I tried it on my friend's M1 Mac and it worked. But on an Intel Mac, the dry run fails. |
Did you check if all your steps are indeed decorated with |
I am seeing a very similar issue. I have an Apple M2, and when I try to deploy a job to a Kubernetes cluster, micromamba gives me a very mysterious error:
I checked mamba in general and saw similar reports from people running on Mac M1/M2 chips. The problem goes away completely when I run on my Linux machine. So I'm pretty confident this is some low-level issue with micromamba doing a dry run of the environment before deploying the job. |
do you run into the same error if you set |
I didn't try that. I tried some other configurations that avoided this problem. However, the general problem still holds: one of the dependencies I wanted to install isn't available on Mac, so I wasn't able to progress. This time the error is
nothing provides requested vllm 0.6.1.post2
Which I interpret as saying I can't install vllm locally during the bootstrap process. Is there some way to skip this bootstrapping? I was hoping Metaflow would let me write something I can't run locally and deploy it smoothly. |
Yes, that is the expected behavior. What are the dependencies that you are trying to install? For remote executions, we download the packages to be shipped to remote compute by faking the architecture of your local machine, which should work reliably. This error is usually indicative of not specifying the conda channel where vllm is available. |
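To make the channel point concrete: conda package specs can carry an explicit channel using the "channel::package" form, which tells the solver exactly where to look. (I believe Metaflow's @conda accepts channel-qualified package names too; treat that as an assumption.) A tiny illustrative helper for the spec form:

```python
# Illustrative helper (not part of any library): conda specs may carry an
# explicit channel using the "channel::package" form.
def split_channel_spec(spec):
    """Split 'conda-forge::vllm' into ('conda-forge', 'vllm');
    a bare 'vllm' yields (None, 'vllm')."""
    channel, sep, name = spec.partition("::")
    return (channel, name) if sep else (None, spec)
```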
My dependencies are fairly simple:
@conda(
    python="3.9",
    packages={
        "vllm": "0.6.1.post2",
    },
)
Why would this work correctly on my Linux machine but not my Mac? |
vllm is not available as a conda package, so the dependencies shouldn't work on either Linux or Mac. |
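Since vllm is distributed on PyPI rather than conda (per the reply above), one possible direction is Metaflow's @pypi step decorator, which pins PyPI packages directly and avoids the conda solve for that step. A hedged configuration sketch (the flow and step names are mine, and whether this sidesteps the local-architecture issue is not verified here):

```python
# Sketch only: pinning vllm from PyPI with Metaflow's @pypi decorator,
# under the assumption that vllm is available solely on PyPI.
from metaflow import FlowSpec, pypi, step


class VllmFlow(FlowSpec):

    @pypi(python="3.10", packages={"vllm": "0.6.1.post2"})
    @step
    def start(self):
        import vllm  # resolved from PyPI rather than a conda channel
        self.next(self.end)

    @step
    def end(self):
        pass


if __name__ == "__main__":
    VllmFlow()
```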
Okay, after tweaking things I think I figured it out and realized it's my misunderstanding of how the decorator works in conjunction with running the launch step. I switched the decorator back to This at least is working for me, even though I don't totally get why. |
I use the decorators @pypi_base and @batch for cloud computing in a certain step. If I understand correctly, @pypi_base utilizes micromamba to test environment compatibility locally first. The problem is that when using @batch, it will download pip files for Linux, while my local computer is an Intel macOS machine. Then micromamba complains that some pip wheel files are not supported on my Intel macOS. Do you have a workaround for this? Micromamba only does a dry run, right? Is there a way to work around it?
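The "wheel not supported" complaint described above comes down to platform tags in wheel filenames (PEP 427): a manylinux wheel advertises Linux, and an installer running natively on macOS will reject it. A small illustrative parser (the helper names are mine, not micromamba's):

```python
# Illustrative only: a wheel's filename encodes its platform tag per
# PEP 427, e.g. vllm-0.6.1.post2-cp39-cp39-manylinux1_x86_64.whl.
# A natively running macOS installer rejects Linux-tagged wheels.
def wheel_platform_tag(filename):
    """Return the platform tag from a PEP 427 wheel filename
    (name-version[-build]-python-abi-platform.whl)."""
    return filename[: -len(".whl")].split("-")[-1]


def is_linux_only(filename):
    """True for wheels that only run on Linux."""
    tag = wheel_platform_tag(filename)
    return tag.startswith(("manylinux", "musllinux", "linux"))
```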