From: "Florian Schmaus"
To: gentoo-commits@lists.gentoo.org
Content-Transfer-Encoding: 8bit
Content-type: text/plain; charset=UTF-8
Reply-To: gentoo-dev@lists.gentoo.org, "Florian Schmaus"
Message-ID: <1739033355.fb105c916a2ee882686351f01f149e71594261cf.flow@gentoo>
Subject: [gentoo-commits] repo/proj/guru:master commit in: sci-misc/llama-cpp/
X-VCS-Repository: repo/proj/guru
X-VCS-Files: sci-misc/llama-cpp/llama-cpp-0_pre4576.ebuild sci-misc/llama-cpp/llama-cpp-9999.ebuild
X-VCS-Directories: sci-misc/llama-cpp/
X-VCS-Committer: flow
X-VCS-Committer-Name: Florian Schmaus
X-VCS-Revision: fb105c916a2ee882686351f01f149e71594261cf
X-VCS-Branch: master
Date: Sun, 09 Feb 2025 10:28:11 +0000 (UTC)

commit:     fb105c916a2ee882686351f01f149e71594261cf
Author:     Alexey Korepanov yandex ru>
AuthorDate: Sat Feb 8 16:49:15 2025 +0000
Commit:     Florian Schmaus gentoo org>
CommitDate: Sat Feb 8 16:49:15 2025 +0000
URL:        https://gitweb.gentoo.org/repo/proj/guru.git/commit/?id=fb105c91

sci-misc/llama-cpp: add dependencies on curl and numpy

Signed-off-by: Alexey Korepanov yandex.ru>

 sci-misc/llama-cpp/llama-cpp-0_pre4576.ebuild | 9 +++++++++
 sci-misc/llama-cpp/llama-cpp-9999.ebuild      | 9 +++++++++
 2 files changed, 18 insertions(+)

diff --git a/sci-misc/llama-cpp/llama-cpp-0_pre4576.ebuild b/sci-misc/llama-cpp/llama-cpp-0_pre4576.ebuild
index 39210dddf..8b9c1438d 100644
--- a/sci-misc/llama-cpp/llama-cpp-0_pre4576.ebuild
+++ b/sci-misc/llama-cpp/llama-cpp-0_pre4576.ebuild
@@ -21,6 +21,14 @@ HOMEPAGE="https://github.com/ggerganov/llama.cpp"
 LICENSE="MIT"
 SLOT="0"
 CPU_FLAGS_X86=( avx avx2 f16c )
+IUSE="curl"
+
+# curl is needed for pulling models from huggingface
+# numpy is used by convert_hf_to_gguf.py
+DEPEND="curl? ( net-misc/curl:= )"
+RDEPEND="${DEPEND}
+	dev-python/numpy
+"
 
 src_configure() {
 	local mycmakeargs=(
@@ -28,6 +36,7 @@ src_configure() {
 		-DLLAMA_BUILD_SERVER=ON
 		-DCMAKE_SKIP_BUILD_RPATH=ON
 		-DGGML_NATIVE=0 # don't set march
+		-DLLAMA_CURL=$(usex curl ON OFF)
 		-DBUILD_NUMBER="1"
 	)
 	cmake_src_configure
diff --git a/sci-misc/llama-cpp/llama-cpp-9999.ebuild b/sci-misc/llama-cpp/llama-cpp-9999.ebuild
index 39210dddf..8b9c1438d 100644
--- a/sci-misc/llama-cpp/llama-cpp-9999.ebuild
+++ b/sci-misc/llama-cpp/llama-cpp-9999.ebuild
@@ -21,6 +21,14 @@ HOMEPAGE="https://github.com/ggerganov/llama.cpp"
 LICENSE="MIT"
 SLOT="0"
 CPU_FLAGS_X86=( avx avx2 f16c )
+IUSE="curl"
+
+# curl is needed for pulling models from huggingface
+# numpy is used by convert_hf_to_gguf.py
+DEPEND="curl? ( net-misc/curl:= )"
+RDEPEND="${DEPEND}
+	dev-python/numpy
+"
 
 src_configure() {
 	local mycmakeargs=(
@@ -28,6 +36,7 @@ src_configure() {
 		-DLLAMA_BUILD_SERVER=ON
 		-DCMAKE_SKIP_BUILD_RPATH=ON
 		-DGGML_NATIVE=0 # don't set march
+		-DLLAMA_CURL=$(usex curl ON OFF)
 		-DBUILD_NUMBER="1"
 	)
 	cmake_src_configure
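
For reference, a sketch (not part of the commit) of what the new
-DLLAMA_CURL=$(usex curl ON OFF) argument does: the IUSE="curl" flag gates the
optional net-misc/curl dependency, and usex maps that flag onto the CMake
option. Only the flag name, package atoms, and option names come from the diff
above; the rest is illustrative ebuild bash:

	# Equivalent to appending -DLLAMA_CURL=$(usex curl ON OFF) to mycmakeargs:
	# `usex curl ON OFF` prints ON when the curl USE flag is enabled, OFF otherwise.
	if use curl; then
		mycmakeargs+=( -DLLAMA_CURL=ON )   # model downloads via libcurl, satisfied by curl? ( net-misc/curl:= )
	else
		mycmakeargs+=( -DLLAMA_CURL=OFF )  # build without libcurl support
	fi

Users who want to pull models straight from Hugging Face would enable the flag,
e.g. USE="curl" emerge sci-misc/llama-cpp, while dev-python/numpy stays an
unconditional RDEPEND because convert_hf_to_gguf.py uses it.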