# [BUG] Elementwise value assignment silently fails after in-place array downsize using `rows()` · Issue #3534 · arrayfire/arrayfire
{"@context":"https://schema.org","@type":"DiscussionForumPosting","headline":"[BUG] Elementwise value assignment silently fails after in-place array downsize using `rows()`","articleBody":"In some scenarios, it seems that, after downsizing an array using an in-place call to `rows()` (ie, overwrite an array with only subset of its current rows), elementwise value assignment silently fails (ie, the element's value does not change and there are no compile or runtime errors).\r\n\r\nDescription\r\n===========\r\n* Did you build ArrayFire yourself or did you use the official installers: **Built myself.**\r\n* Which backend is experiencing this issue? **CUDA.**\r\n* Do you have a workaround? **No.**\r\n* Can the bug be reproduced reliably on your system? **Yes.**\r\n* A clear and concise description of what you expected to happen. **Expected elementwise value assignment to succeed (eg, in example below, `Drows(0,0)` to have value `1234`). Given that that fails, expected a compile or runtime error.**\r\n* Run your executable with AF_TRACE=all and AF_PRINT_ERRORS=1 environment variables set:\r\n```\r\n# AF_TRACE=all AF_PRINT_ERRORS=1 aftest\r\n[platform][1708117940][9998] [ /tmp/arrayfire/src/backend/common/DependencyModule.cpp:102 ] Attempting to load: libforge.so\r\n[platform][1708117940][9998] [ /tmp/arrayfire/src/backend/common/DependencyModule.cpp:107 ] Unable to open forge\r\n[platform][1708117942][9998] [ /tmp/arrayfire/src/backend/cuda/device_manager.cpp:497 ] CUDA Driver supports up to CUDA 12.3.0 ArrayFire CUDA Runtime 11.3.0\r\n[platform][1708117942][9998] [ /tmp/arrayfire/src/backend/cuda/device_manager.cpp:478 ] CUDA driver version(12.3.0) not part of the CudaToDriverVersion array. Please create an issue or a pull request on the ArrayFire repository to update the CudaToDriverVersion variable with this version of the CUDA runtime.\r\n\r\n[platform][1708117942][9998] [ /tmp/arrayfire/src/backend/cuda/device_manager.cpp:566 ] Found 1 CUDA devices\r\n[platform][1708117942][9998] [ /tmp/arrayfire/src/backend/cuda/device_manager.cpp:588 ] Found device: NVIDIA RTX A3000 Laptop GPU (sm_86) (5.80 GB | ~12187.5 GFLOPs | 32 SMs)\r\n[platform][1708117942][9998] [ /tmp/arrayfire/src/backend/cuda/device_manager.cpp:652 ] AF_CUDA_DEFAULT_DEVICE: \r\n[platform][1708117942][9998] [ /tmp/arrayfire/src/backend/cuda/device_manager.cpp:670 ] Default device: 0(NVIDIA RTX A3000 Laptop GPU)\r\n[mem][1708117942][9998] [ /tmp/arrayfire/src/backend/common/DefaultMemoryManager.cpp:127 ] memory[0].max_bytes: 4.8 GB\r\n[mem][1708117942][9998] [ /tmp/arrayfire/src/backend/cuda/memory.cpp:155 ] nativeAlloc: 1 KB 0x7fe876800000\r\n[jit][1708117942][9998] [ /tmp/arrayfire/src/backend/cuda/compile_module.cpp:472 ] {14966320269747309860 : loaded from /root/.arrayfire/KER14966320269747309860_CU_86_AF_39.bin for NVIDIA RTX A3000 Laptop GPU }\r\n[kernel][1708117942][9998] [ /tmp/arrayfire/src/backend/cuda/Kernel.hpp:37 ] Launching arrayfire::cuda::range\u003cfloat\u003e: Blocks: [1, 1, 1] Threads: [32, 8, 1] Shared Memory: 0\r\nDrows\r\n[5 5 1 1]\r\n[mem][1708117942][9998] [ /tmp/arrayfire/src/backend/cuda/memory.cpp:155 ] nativeAlloc: 1 KB 0x7fe876800400\r\n[jit][1708117942][9998] [ /tmp/arrayfire/src/backend/cuda/compile_module.cpp:472 ] {291210446400920389 : loaded from /root/.arrayfire/KER291210446400920389_CU_86_AF_39.bin for NVIDIA RTX A3000 Laptop GPU }\r\n[kernel][1708117942][9998] [ /tmp/arrayfire/src/backend/cuda/Kernel.hpp:37 ] Launching arrayfire::cuda::transpose\u003cfloat,false,false\u003e: Blocks: [1, 1, 1] 
Threads: [32, 8, 1] Shared Memory: 0\r\n 0.0000 0.0000 0.0000 0.0000 0.0000 \r\n 1.0000 1.0000 1.0000 1.0000 1.0000 \r\n 2.0000 2.0000 2.0000 2.0000 2.0000 \r\n 3.0000 3.0000 3.0000 3.0000 3.0000 \r\n 4.0000 4.0000 4.0000 4.0000 4.0000 \r\n\r\n[jit][1708117942][9998] [ /tmp/arrayfire/src/backend/cuda/compile_module.cpp:472 ] {8447298384643760287 : loaded from /root/.arrayfire/KER8447298384643760287_CU_86_AF_39.bin for NVIDIA RTX A3000 Laptop GPU }\r\n[kernel][1708117942][9998] [ /tmp/arrayfire/src/backend/cuda/Kernel.hpp:37 ] Launching arrayfire::cuda::memCopy\u003cfloat\u003e: Blocks: [1, 5, 1] Threads: [32, 1, 1] Shared Memory: 0\r\n[jit][1708117942][9998] [ /tmp/arrayfire/src/backend/cuda/compile_module.cpp:472 ] {8641389271879371835 : loaded from /root/.arrayfire/KER8641389271879371835_CU_86_AF_39.bin for NVIDIA RTX A3000 Laptop GPU }\r\n[kernel][1708117942][9998] [ /tmp/arrayfire/src/backend/cuda/jit.cpp:512 ] Launching : Dims: [1,1,1,1] Blocks: [1 1 1] Threads: [128 1 1] threads: 128\r\nDrows\r\n[4 5 1 1]\r\n[kernel][1708117942][9998] [ /tmp/arrayfire/src/backend/cuda/Kernel.hpp:37 ] Launching arrayfire::cuda::transpose\u003cfloat,false,false\u003e: Blocks: [1, 1, 1] Threads: [32, 8, 1] Shared Memory: 0\r\n 0.0000 0.0000 0.0000 0.0000 0.0000 \r\n 1.0000 1.0000 1.0000 1.0000 1.0000 \r\n 2.0000 2.0000 2.0000 2.0000 2.0000 \r\n 3.0000 3.0000 3.0000 3.0000 3.0000\r\n```\r\n\r\nReproducible Code and/or Steps\r\n------------------------------\r\nProgram/output (note the `(0,0)` element of the last print):\r\n```\r\n#include \u003carrayfire.h\u003e\r\n\r\nusing namespace af;\r\n\r\nint main(int argc, char **argv)\r\n{\r\n array Drows = range(dim4(5,5));\r\n af_print(Drows);\r\n\r\n Drows = Drows.rows(0,3);\r\n Drows(0,0) = 1234;\r\n af_print(Drows);\r\n\r\n return 0;\r\n}\r\n\r\n\r\n# aftest\r\nDrows\r\n[5 5 1 1]\r\n 0.0000 0.0000 0.0000 0.0000 0.0000 \r\n 1.0000 1.0000 1.0000 1.0000 1.0000 \r\n 2.0000 2.0000 2.0000 2.0000 2.0000 \r\n 3.0000 3.0000 3.0000 3.0000 3.0000 \r\n 4.0000 4.0000 4.0000 4.0000 4.0000 \r\n\r\nDrows\r\n[4 5 1 1]\r\n 0.0000 0.0000 0.0000 0.0000 0.0000 \r\n 1.0000 1.0000 1.0000 1.0000 1.0000 \r\n 2.0000 2.0000 2.0000 2.0000 2.0000 \r\n 3.0000 3.0000 3.0000 3.0000 3.0000\r\n```\r\n\r\nInterestingly, initializing the array as a copy of an existing array yields the expected behavior: \r\n```\r\n#include \u003carrayfire.h\u003e\r\n\r\nusing namespace af;\r\n\r\nint main(int argc, char **argv)\r\n{\r\n array D = range(dim4(5,5));\r\n array Drows = D;\r\n af_print(Drows);\r\n\r\n Drows = Drows.rows(0,3);\r\n Drows(0,0) = 1234;\r\n af_print(Drows);\r\n\r\n return 0;\r\n}\r\n\r\n\r\n# aftest\r\nDrows\r\n[5 5 1 1]\r\n 0.0000 0.0000 0.0000 0.0000 0.0000 \r\n 1.0000 1.0000 1.0000 1.0000 1.0000 \r\n 2.0000 2.0000 2.0000 2.0000 2.0000 \r\n 3.0000 3.0000 3.0000 3.0000 3.0000 \r\n 4.0000 4.0000 4.0000 4.0000 4.0000 \r\n\r\nDrows\r\n[4 5 1 1]\r\n 1234.0000 0.0000 0.0000 0.0000 0.0000 \r\n 1.0000 1.0000 1.0000 1.0000 1.0000 \r\n 2.0000 2.0000 2.0000 2.0000 2.0000 \r\n 3.0000 3.0000 3.0000 3.0000 3.0000\r\n```\r\n\r\n**EDIT**: Additionally, here is the full debugging output for this case that succeeds (I note an additional `[mem]` line before the final `af_print`):\r\n```\r\n# AF_TRACE=all AF_PRINT_ERRORS=1 aftest\r\n[platform][1708119537][10870] [ /tmp/arrayfire/src/backend/common/DependencyModule.cpp:102 ] Attempting to load: libforge.so\r\n[platform][1708119537][10870] [ /tmp/arrayfire/src/backend/common/DependencyModule.cpp:107 ] Unable to open forge\r\n[platform][1708119537][10870] 
[ /tmp/arrayfire/src/backend/cuda/device_manager.cpp:497 ] CUDA Driver supports up to CUDA 12.3.0 ArrayFire CUDA Runtime 11.3.0\r\n[platform][1708119537][10870] [ /tmp/arrayfire/src/backend/cuda/device_manager.cpp:478 ] CUDA driver version(12.3.0) not part of the CudaToDriverVersion array. Please create an issue or a pull request on the ArrayFire repository to update the CudaToDriverVersion variable with this version of the CUDA runtime.\r\n\r\n[platform][1708119537][10870] [ /tmp/arrayfire/src/backend/cuda/device_manager.cpp:566 ] Found 1 CUDA devices\r\n[platform][1708119537][10870] [ /tmp/arrayfire/src/backend/cuda/device_manager.cpp:588 ] Found device: NVIDIA RTX A3000 Laptop GPU (sm_86) (5.80 GB | ~12187.5 GFLOPs | 32 SMs)\r\n[platform][1708119537][10870] [ /tmp/arrayfire/src/backend/cuda/device_manager.cpp:652 ] AF_CUDA_DEFAULT_DEVICE: \r\n[platform][1708119537][10870] [ /tmp/arrayfire/src/backend/cuda/device_manager.cpp:670 ] Default device: 0(NVIDIA RTX A3000 Laptop GPU)\r\nArrayFire v3.9.0 (CUDA, 64-bit Linux, build b59a1ae53)\r\nPlatform: CUDA Runtime 11.3, Driver: 545.23.08\r\n[0] NVIDIA RTX A3000 Laptop GPU, 5938 MB, CUDA Compute 8.6\r\n[mem][1708119537][10870] [ /tmp/arrayfire/src/backend/common/DefaultMemoryManager.cpp:127 ] memory[0].max_bytes: 4.8 GB\r\n[mem][1708119537][10870] [ /tmp/arrayfire/src/backend/cuda/memory.cpp:155 ] nativeAlloc: 1 KB 0x7fe1c8800000\r\n[jit][1708119537][10870] [ /tmp/arrayfire/src/backend/cuda/compile_module.cpp:472 ] {14966320269747309860 : loaded from /root/.arrayfire/KER14966320269747309860_CU_86_AF_39.bin for NVIDIA RTX A3000 Laptop GPU }\r\n[kernel][1708119537][10870] [ /tmp/arrayfire/src/backend/cuda/Kernel.hpp:37 ] Launching arrayfire::cuda::range\u003cfloat\u003e: Blocks: [1, 1, 1] Threads: [32, 8, 1] Shared Memory: 0\r\nDrows\r\n[5 5 1 1]\r\n[mem][1708119537][10870] [ /tmp/arrayfire/src/backend/cuda/memory.cpp:155 ] nativeAlloc: 1 KB 0x7fe1c8800400\r\n[jit][1708119537][10870] [ /tmp/arrayfire/src/backend/cuda/compile_module.cpp:472 ] {291210446400920389 : loaded from /root/.arrayfire/KER291210446400920389_CU_86_AF_39.bin for NVIDIA RTX A3000 Laptop GPU }\r\n[kernel][1708119537][10870] [ /tmp/arrayfire/src/backend/cuda/Kernel.hpp:37 ] Launching arrayfire::cuda::transpose\u003cfloat,false,false\u003e: Blocks: [1, 1, 1] Threads: [32, 8, 1] Shared Memory: 0\r\n 0.0000 0.0000 0.0000 0.0000 0.0000 \r\n 1.0000 1.0000 1.0000 1.0000 1.0000 \r\n 2.0000 2.0000 2.0000 2.0000 2.0000 \r\n 3.0000 3.0000 3.0000 3.0000 3.0000 \r\n 4.0000 4.0000 4.0000 4.0000 4.0000 \r\n\r\n[jit][1708119537][10870] [ /tmp/arrayfire/src/backend/cuda/compile_module.cpp:472 ] {8447298384643760287 : loaded from /root/.arrayfire/KER8447298384643760287_CU_86_AF_39.bin for NVIDIA RTX A3000 Laptop GPU }\r\n[kernel][1708119537][10870] [ /tmp/arrayfire/src/backend/cuda/Kernel.hpp:37 ] Launching arrayfire::cuda::memCopy\u003cfloat\u003e: Blocks: [1, 5, 1] Threads: [32, 1, 1] Shared Memory: 0\r\n[jit][1708119537][10870] [ /tmp/arrayfire/src/backend/cuda/compile_module.cpp:472 ] {8641389271879371835 : loaded from /root/.arrayfire/KER8641389271879371835_CU_86_AF_39.bin for NVIDIA RTX A3000 Laptop GPU }\r\n[kernel][1708119537][10870] [ /tmp/arrayfire/src/backend/cuda/jit.cpp:512 ] Launching : Dims: [1,1,1,1] Blocks: [1 1 1] Threads: [128 1 1] threads: 128\r\nDrows\r\n[4 5 1 1]\r\n[mem][1708119537][10870] [ /tmp/arrayfire/src/backend/cuda/memory.cpp:155 ] nativeAlloc: 1 KB 0x7fe1c8800800\r\n[kernel][1708119537][10870] [ /tmp/arrayfire/src/backend/cuda/Kernel.hpp:37 ] Launching 
arrayfire::cuda::transpose\u003cfloat,false,false\u003e: Blocks: [1, 1, 1] Threads: [32, 8, 1] Shared Memory: 0\r\n 1234.0000 0.0000 0.0000 0.0000 0.0000 \r\n 1.0000 1.0000 1.0000 1.0000 1.0000 \r\n 2.0000 2.0000 2.0000 2.0000 2.0000 \r\n 3.0000 3.0000 3.0000 3.0000 3.0000\r\n```\r\n\r\nSystem Information\r\n------------------\r\nPlease provide the following information:\r\n1. ArrayFire version: **3.9.0**\r\n2. Devices installed on the system **NVIDIA RTX A3000 Laptop GPU**\r\n3. Output from the af::info() function if applicable.\r\n```\r\nArrayFire v3.9.0 (CUDA, 64-bit Linux, build b59a1ae53)\r\nPlatform: CUDA Runtime 11.3, Driver: 545.23.08\r\n[0] NVIDIA RTX A3000 Laptop GPU, 5938 MB, CUDA Compute 8.6\r\n```\r\n4. Output from the following scripts:\r\nLinux output:\r\n```\r\n# bash afbugreport.sh \r\nNo LSB modules are available.\r\nDistributor ID:\tUbuntu\r\nDescription:\tUbuntu 20.04.6 LTS\r\nRelease:\t20.04\r\nCodename:\tfocal\r\nname, memory.total [MiB], driver_version\r\nNVIDIA RTX A3000 Laptop GPU, 6144 MiB, 545.23.08\r\nrocm-smi not found.\r\nclinfo not found.\r\n```\r\n\r\nChecklist\r\n---------\r\n\r\n- [x] Using the latest available ArrayFire release\r\n- [x] GPU drivers are up to date\r\n","author":{"url":"https://github.com/warnellg","@type":"Person","name":"warnellg"},"datePublished":"2024-02-16T21:36:39.000Z","interactionStatistic":{"@type":"InteractionCounter","interactionType":"https://schema.org/CommentAction","userInteractionCount":1},"url":"https://github.com/3534/arrayfire/issues/3534"}
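**EDIT 2**: Speculating on the mechanism: in the failing trace there is no `[mem] ... nativeAlloc` line between the downsize and the final `af_print`, while the succeeding trace allocates a fresh buffer (`nativeAlloc: 1 KB 0x7fe1c8800800`) at exactly that point. That suggests the failing path leaves `Drows` as a view that still references the original 5x5 buffer instead of giving it its own 4x5 allocation. Below is a minimal, untested probe sketch along those lines; it assumes `af::array::allocated()` reports the size of the underlying buffer (vs. `bytes()`, the logical size), so a difference between the two cases would support the aliasing theory:
```
#include <arrayfire.h>
#include <cstdio>

using namespace af;

int main()
{
    // Failing setup: Drows is the sole owner of its buffer at slice time.
    array Drows = range(dim4(5, 5));
    Drows = Drows.rows(0, 3);
    Drows.eval();  // force evaluation so the backing buffer exists
    printf("direct:           bytes()=%zu allocated()=%zu\n",
           Drows.bytes(), Drows.allocated());

    // Succeeding setup: the buffer is shared with D at slice time.
    array D = range(dim4(5, 5));
    array Drows2 = D;
    Drows2 = Drows2.rows(0, 3);
    Drows2.eval();
    printf("copy-initialized: bytes()=%zu allocated()=%zu\n",
           Drows2.bytes(), Drows2.allocated());

    return 0;
}
```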
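And in case it is useful to anyone hitting the same thing: if the missing reallocation is indeed the culprit, forcing a deep copy of the slice might serve as a stopgap. This is an untested sketch on my part (using `af::array::copy()` to detach the result from the parent buffer), not a confirmed workaround:
```
#include <arrayfire.h>

using namespace af;

int main()
{
    array Drows = range(dim4(5, 5));

    // Materialize the slice, then force a deep copy so the result owns
    // its own buffer rather than (potentially) aliasing the 5x5 parent.
    array sliced = Drows.rows(0, 3);
    Drows = sliced.copy();

    Drows(0, 0) = 1234;  // should stick if the aliasing theory holds
    af_print(Drows);

    return 0;
}
```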