Title: Fix: Install correct CUDA toolkit during build by chamalgomes · Pull Request #2088 · abetlen/llama-cpp-python · GitHub
Description: This ensures that the correct requested cuda-toolkit version is installed via mamba during the CUDA build process. Fixes #2089.
URL: https://github.com/abetlen/llama-cpp-python/pull/2088
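The change described above amounts to passing the requested CUDA version through to the mamba install step instead of letting mamba resolve to whatever toolkit is current in the channel. The PR itself modifies the project's build workflow; the sketch below is only an illustration of that idea in Python, and the `CUDA_VERSION` environment variable, the `nvidia` channel, and the default version are assumptions, not details taken from the PR.

```python
import os
import subprocess

# Sketch only: the real fix lives in the CUDA build workflow, not in Python code.
# The variable name, channel, and fallback version here are assumed for illustration.
requested_version = os.environ.get("CUDA_VERSION", "12.4")

# Pin the requested cuda-toolkit version explicitly so mamba installs exactly
# what the build asked for rather than the latest toolkit in the channel.
subprocess.run(
    ["mamba", "install", "-y", "-c", "nvidia", f"cuda-toolkit={requested_version}"],
    check=True,
)
```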