* [gentoo-user] Upgrading from FX-5200 to a GeForce 6200 512MB
From: Dale
Date: 2010-10-19 7:45 UTC
To: Gentoo User

Hi,

I am thinking of upgrading from an FX-5200 video card with 128 MB to a GeForce 6200 with 512 MB. It will be AGP since this is an older rig. My system is something like this:

Mobo: Abit NF7 2.0
CPU: AMD 2500+, no overclocking
Memory: 2 GB of 333 MHz
Monitor: Gateway 19" running 1280 x 1024

I think my memory is fine; it never uses all of it, or even half of it, except for caching. I may try to get a 3000+ or 3200+ CPU if I run up on a good deal. I'm thinking of doing the video card first because it is cheaper. I have also noticed that playing movies on here is getting a bit slow if I go full screen or close to full screen. I'm bad to download from youtube and then play them locally full screen, or as close as it will allow.

I do use the nvidia drivers. Currently:

nvidia-drivers-173.14.25

I'm on that one because I think I need to upgrade my kernel to use the latest one that was recently put in the tree. I'm looking at this card:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814133328

What kind of improvement can I expect from this video card upgrade? While I am at it, the CPU upgrade won't make that much difference, right? Maybe 20% or so faster, something like that?

Thoughts? Opinions?

Thanks.

Dale

:-) :-)
* Re: [gentoo-user] Upgrading from FX-5200 to a GeForce 6200 512MB
From: Florian Philipp
Date: 2010-10-19 8:51 UTC
To: gentoo-user

On 19.10.2010 09:45, Dale wrote:
> I am thinking of upgrading from an FX-5200 video card with 128 MB to a
> GeForce 6200 with 512 MB. It will be AGP since this is an older rig.
> [...]
> What kind of improvement can I expect from this video card upgrade?
> While I am at it, the CPU upgrade won't make that much difference,
> right? Maybe 20% or so faster, something like that?

Hi Dale,

first and foremost, a newer card will allow you to use the newest driver series (195.*.*), which is always a good thing ;)

It also gives you more texture units. You can use these to move more work to your GPU (mostly scaling and such). Take a look at `man mplayer`, the '-vo gl' section, for a list of options. VLC has similar options, I think. I don't know about gstreamer or xine.

I could be wrong, but I don't think adobe-flash uses these options. That is probably part of the reason flash is so much slower on GNU/Linux than on Windows. If my assumption is true, you are better off buying a faster CPU.

You could also test how gnash performs. Since it uses ffmpeg (AFAIK) it might be worth a try.

Please take my advice with a big dose of salt. While I still run an old desktop with nearly identical specs, I almost never use Youtube and therefore have no experience with that.

Hope this helps,
Florian Philipp
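[Note: as a quick check of which video output drivers a local mplayer build actually provides (the list depends on the USE flags and version it was built with), something like the following should print the available drivers, with gl among them if OpenGL output was enabled:

mplayer -vo help

The gl suboptions mentioned above are then documented under '-vo gl' in the man page.]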
* Re: [gentoo-user] Upgrading from FX-5200 to a GeForce 6200 512MB
From: Dale
Date: 2010-10-19 12:23 UTC
To: gentoo-user

Florian Philipp wrote:
> first and foremost, a newer card will allow you to use the newest driver
> series (195.*.*), which is always a good thing ;)
>
> It also gives you more texture units. You can use these to move more
> work to your GPU (mostly scaling and such). Take a look at `man mplayer`,
> the '-vo gl' section, for a list of options.
> [...]
> You could also test how gnash performs. Since it uses ffmpeg (AFAIK) it
> might be worth a try.

This particular card I think uses the latest 260.* drivers. That's according to the nvidia site, but sometimes that is not correct either.

Anyway, I always download the videos off youtube or wherever and then watch them with smplayer locally. It generally works better for the most part. I just have slow DSL, so some of them skip a bit if I don't download them first. I do need a faster CPU but want to get the card first. I do sometimes max out the CPU when watching a video, but I think most of the time it is the card that is just getting old and needs replacing with one that is at least a little faster. For the price, I was going to get a card that is a good bit faster.

You are right about the flash thingy. It is a lot slower on this rig. It even slows down scrolling on web pages. It sort of ticks me off sometimes. It is slow on my brother's system too, tho. He still has windoze XP. I'm working on putting Linux on there.

Work on the CPU next, I hope. ;-)

Dale

:-) :-)
* Re: [gentoo-user] Upgrading from FX-5200 to a GeForce 6200 512MB
From: Florian Philipp
Date: 2010-10-19 12:42 UTC
To: gentoo-user

On 19.10.2010 14:23, Dale wrote:
> Anyway, I always download the videos off youtube or wherever and then
> watch them with smplayer locally. [...] I do sometimes max out the CPU
> when watching a video, but I think most of the time it is the card that
> is just getting old and needs replacing with one that is at least a
> little faster.
> [...]

Ah, in that case tweaking your mplayer config might really help. Look at the man page for the options (-vo gl:...).

You really have to try every option and sometimes reasonable combinations. I've found that even when the man page says an option is slow, it is sometimes the fastest. As I said before, you will run into the maximum number of texture units on your card, so certain options will not work together, but the man page tells you how many texture units each option needs. With that info it should be easy to tweak your settings.

To get you going, try:

mplayer -vo gl:yuv=2:lscale=1:cscale=1 <file>

Don't forget to test it in fullscreen mode.

You can later apply these options either in /etc/mplayer/mplayer.conf or in ~/.mplayer/config like this: "vo=gl:yuv=2:lscale=1:cscale=1"

Hope this helps,
Florian Philipp
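[Note: a minimal sketch of how that last suggestion could look as a per-user config file. The option string is the one given above; whether yuv=2 and those scaler settings are the right trade-off depends on the card and the mplayer version, so treat the values as a starting point rather than a recommendation:

# ~/.mplayer/config
vo=gl:yuv=2:lscale=1:cscale=1
# if gl misbehaves, comment out the line above and fall back to:
# vo=xv

Anything set here applies to every mplayer (and smplayer) run, so it is worth confirming the values on the command line first.]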
* Re: [gentoo-user] Upgrading from FX-5200 to a GeForce 6200 512MB
From: Dale
Date: 2010-10-20 0:14 UTC
To: gentoo-user

Florian Philipp wrote:
> Ah, in that case tweaking your mplayer config might really help. Look at
> the man page for the options (-vo gl:...).
> [...]
> To get you going, try: mplayer -vo gl:yuv=2:lscale=1:cscale=1 <file>
> Don't forget to test it in fullscreen mode.
>
> You can later apply these options either in /etc/mplayer/mplayer.conf or
> in ~/.mplayer/config like this: "vo=gl:yuv=2:lscale=1:cscale=1"

I'll look into that in a bit. Sort of having a so-so day today. Those options may help, tho. I mostly play mp4s, tho, and they can be pretty big.

Dale

:-) :-)
* Re: [gentoo-user] Upgrading from FX-5200 to a GeForce 6200 512MB
From: Paul Hartman
Date: 2010-10-19 15:44 UTC
To: gentoo-user

On Tue, Oct 19, 2010 at 2:45 AM, Dale <rdalek1967@gmail.com> wrote:
> I am thinking of upgrading from an FX-5200 video card with 128 MB to a
> GeForce 6200 with 512 MB. It will be AGP since this is an older rig.
>
> Mobo: Abit NF7 2.0
> CPU: AMD 2500+, no overclocking
> Memory: 2 GB of 333 MHz
> Monitor: Gateway 19" running 1280 x 1024

Based on the selection at Newegg, I would highly recommend going with one of the Radeon HD 3650 or 4650 cards, which only cost a little more than the one you're looking at. The HD 3650 is going to be something like 5x faster than a GeForce 6200, and the HD 4650 probably 10x faster.

I think your motherboard supports AGP 8x. I'm not sure if there are any power supply considerations or other features (number of DVI heads, etc.), but anyway, that's my 2 cents. :)

I am an Nvidia video card guy through and through, but in this case the AGP Nvidia cards on offer there are ancient and slow compared to their ATI counterparts.
* Re: [gentoo-user] Upgrading from FX-5200 to a GeForce 6200 512MB
From: Dale
Date: 2010-10-19 20:27 UTC
To: gentoo-user

Paul Hartman wrote:
> Based on the selection at Newegg, I would highly recommend going with
> one of the Radeon HD 3650 or 4650 cards, which only cost a little more
> than the one you're looking at.
> [...]
> I am an Nvidia video card guy through and through, but in this case
> the AGP Nvidia cards on offer there are ancient and slow compared to
> their ATI counterparts.

I'm an nvidia guy. I'm not big on ATI at all; just not my cup of tea. I have read they have better Linux support than they did a long time ago, but they came along a little too late for me.

I just wish that thing had a bigger heat sink on it, with fans. I may change that pretty quickly.

Thanks.

Dale

:-) :-)
* Re: [gentoo-user] Upgrading from FX-5200 to a GeForce 6200 512MB
From: Paul Hartman
Date: 2010-10-19 22:59 UTC
To: gentoo-user

On Tue, Oct 19, 2010 at 3:27 PM, Dale <rdalek1967@gmail.com> wrote:
> I'm an nvidia guy. I'm not big on ATI at all; just not my cup of tea.
> [...]
> I just wish that thing had a bigger heat sink on it, with fans. I may
> change that pretty quickly.

Okay then :) To return to your original question, I think going from an FX-5200 to a GeForce 6200 should probably give you something like a 15% performance improvement. I don't think either card is new enough to be supported by vdpau, so there won't be anything gained there.

The 6200 uses the current drivers (260.xx) whereas the 5200 is on the legacy drivers (173.xx); maybe there are additional 3D effects supported by the newer chipset/drivers. There's a humongous matrix of nvidia chipset and model numbers somewhere on the internet that explains the differences, but I can't seem to find it at the moment. My Google-fu is failing me. :)
* Re: [gentoo-user] Upgrading from FX-5200 to a GeForce 6200 512MB
From: Adam Carter
Date: 2010-10-19 23:34 UTC
To: gentoo-user

> There's a humongous matrix of nvidia chipset and model numbers somewhere
> on the internet that explains the differences but I can't seem to find
> it at the moment. My Google-fu is failing me. :)

http://en.wikipedia.org/wiki/Comparison_of_NVIDIA_graphics_processing_units
* Re: [gentoo-user] Upgrading from FX-5200 to a GeForce 6200 512MB
From: Dale
Date: 2010-10-20 0:11 UTC
To: gentoo-user

Adam Carter wrote:
> http://en.wikipedia.org/wiki/Comparison_of_NVIDIA_graphics_processing_units

Nice link. I didn't even think of looking on that site. I guess one good thing to go by is the processing power and memory; after all, that's what makes it all work faster. Looks like I'm still getting a pretty old card, but I don't play any hard-core games or anything. Playing videos is about as much load as the card will see from me. I do play Kpatience, tho. Love my card games.

Thanks.

Dale

:-) :-)
* Re: [gentoo-user] Upgrading from FX-5200 to a GeForce 6200 512MB
From: Adam Carter
Date: 2010-10-20 3:54 UTC
To: gentoo-user

> Nice link. I didn't even think of looking on that site. [...] Playing
> videos is about as much load as the card will see from me.

I would have thought you would have no problems at all with your current system. IIRC I had no problem with full-screen SD video using mplayer on an Athlon 2200 with a crappy integrated 440MX video... what's the CPU utilization when you're playing full-screen video?

I'm just thinking changing the card might not make any difference.
* Re: [gentoo-user] Upgrading from FX-5200 to a GeForce 6200 512MB
From: Dale
Date: 2010-10-20 4:12 UTC
To: gentoo-user

Adam Carter wrote:
> I would have thought you would have no problems at all with your current
> system. [...] what's the CPU utilization when you're playing full-screen
> video?
>
> I'm just thinking changing the card might not make any difference.

The CPU is usually at about 40 or 50% or so. Sometimes it goes higher, but I can usually watch a video while emerge is running, as far as CPU time goes, although emerge takes longer that way.

I'm wondering if the card may be getting hot and slowing down because of that? I replaced the heat sink a good while back and I've got more than enough cooling in the case. The heat sink has a fan, and maybe it is not turning or something. I did blow out the dust a while back, and I do have filters over the intakes to help some.

Dale

:-) :-)
* Re: [gentoo-user] Upgrading from FX-5200 to a GeForce 6200 512MB
From: Paul Hartman
Date: 2010-10-20 16:25 UTC
To: gentoo-user

On Tue, Oct 19, 2010 at 11:12 PM, Dale <rdalek1967@gmail.com> wrote:
> I'm wondering if the card may be getting hot and slowing down because of
> that? I replaced the heat sink a good while back and I've got more than
> enough cooling in the case. The heat sink has a fan, and maybe it is not
> turning or something.

Some Nvidia cards can go into a slow-motion mode when they overheat. I had that happen on mine (it was a 6000 or 7000 series, I think) when the fan died and I didn't realize it. The slowdown was dramatic in those cases. It would usually happen while I was playing a game or a video; suddenly it would drop to 2 frames per second. I'd stop the game/video, and even things like opening a window were slow. After a minute or two, everything would be back to normal speed. Eventually I learned that the card was protecting itself by switching to an ultra-slow mode to try to fight the overheating.

nvidia-settings may be able to show you the temperature and speeds on your card. You might need to add:

Option "coolbits" "1"

to the Device section in your xorg.conf to get it to show you some of those options if they aren't initially visible.
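[Note: a minimal sketch of where that option would go. The Identifier string below is just a placeholder; match whatever the existing Device section in xorg.conf already uses. The nvidia-settings query is the usual way to read the temperature from a terminal, assuming the card and driver expose the GPUCoreTemp attribute at all:

Section "Device"
    Identifier "nvidia0"          # use your existing identifier
    Driver     "nvidia"
    Option     "coolbits" "1"     # unlocks extra clock/fan controls in nvidia-settings
EndSection

# from a terminal inside X, once the driver exposes it:
nvidia-settings -q GPUCoreTemp

If the attribute is not reported, the card simply has no readable sensor and the fan has to be checked by eye.]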
* Re: [gentoo-user] Upgrading from FX-5200 to a GeForce 6200 512MB
From: Dale
Date: 2010-10-20 18:25 UTC
To: gentoo-user

Paul Hartman wrote:
> Some Nvidia cards can go into a slow-motion mode when they overheat.
> [...]
> nvidia-settings may be able to show you the temperature and speeds on
> your card. You might need to add:
>
> Option "coolbits" "1"
>
> to the Device section in your xorg.conf to get it to show you some of
> those options if they aren't initially visible.

I appear to have another issue to deal with right now. This is weird. When I type in any nvclock command, I get something like this:

root@smoker / # nvclock -i
*** buffer overflow detected ***: nvclock terminated
======= Backtrace: =========
/lib/libc.so.6(__fortify_fail+0x50)[0xb75af850]
/lib/libc.so.6(+0xe18aa)[0xb75ad8aa]
/lib/libc.so.6(+0xe0f78)[0xb75acf78]
/lib/libc.so.6(__overflow+0x4a)[0xb753670a]
/lib/libc.so.6(_IO_vfprintf+0x50b9)[0xb750db39]
/lib/libc.so.6(__vsprintf_chk+0xa7)[0xb75ad027]
/lib/libc.so.6(__sprintf_chk+0x2d)[0xb75acf6d]
nvclock[0x8057317]
[0x30322e34]
======= Memory map: ========
08048000-08060000 r-xp 00000000 08:16 311032    /usr/bin/nvclock
08060000-08061000 r--p 00017000 08:16 311032    /usr/bin/nvclock
08061000-08062000 rw-p 00018000 08:16 311032    /usr/bin/nvclock
09369000-0938a000 rw-p 00000000 00:00 0         [heap]
b74cc000-b760c000 r-xp 00000000 08:16 3018521   /lib/libc-2.11.2.so
[... remaining libc, X library and /dev/nvidia0 mappings snipped ...]
bfe28000-bfe49000 rw-p 00000000 00:00 0         [stack]
Aborted
root@smoker / #

I guess I'll have to take the side off the case, use the infrared thingy, and look to see if the fan is turning. I'm not sure what is going on with the buffer overflow error, tho.

Dale

:-) :-)
* Re: [gentoo-user] Upgrading from FX-5200 to a GeForce 6200 512MB
From: Paul Hartman
Date: 2010-10-20 19:07 UTC
To: gentoo-user

On Wed, Oct 20, 2010 at 1:25 PM, Dale <rdalek1967@gmail.com> wrote:
> I appear to have another issue to deal with right now. This is weird.
> When I type in any nvclock command, I get something like this:
>
> root@smoker / # nvclock -i
> *** buffer overflow detected ***: nvclock terminated

Seems like maybe that is glibc stopping you from running a program with a (potential) buffer overflow. You can set an environment variable to make it stop doing that and let you run the program anyway, assuming you don't want to edit nvclock's source code to fix the problem. :)

Try:

MALLOC_CHECK_=0 nvclock -i

("man malloc" for more info)
* Re: [gentoo-user] Upgrading from FX-5200 to a GeForce 6200 512MB
From: Dale
Date: 2010-10-20 19:28 UTC
To: gentoo-user

Paul Hartman wrote:
> MALLOC_CHECK_=0 nvclock -i

It appears that it is more serious than that setting can overcome. Same error as before. I'm running glibc-2.11.2. Is anyone else having a similar issue with that version?

Try to fix one thing and find something else broke. lol

Dale

:-) :-)
* Re: [gentoo-user] Upgrading from FX-5200 to a GeForce 6200 512MB
From: Dale
Date: 2010-10-19 23:40 UTC
To: gentoo-user

Paul Hartman wrote:
> Okay then :) To return to your original question, I think going from an
> FX-5200 to a GeForce 6200 should probably give you something like a 15%
> performance improvement. I don't think either card is new enough to be
> supported by vdpau, so there won't be anything gained there.
>
> The 6200 uses the current drivers (260.xx) whereas the 5200 is on the
> legacy drivers (173.xx); maybe there are additional 3D effects supported
> by the newer chipset/drivers.
> [...]

One thing I was hoping is that the newer drivers would work better. I would think they only update the legacy drivers when they have to, for new kernels and such. That is my hope, anyway. It does seem to get slower as time goes on, but I'm not sure how much of that is the drivers and how much has to do with the new KDE4. I'm sure KDE4 has a good bit to do with it too.

That is about the fastest AGP card I could find, tho. I may look around and see what else I can find, too.

Dale

:-) :-)
* [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB
From: Dale
Date: 2010-10-25 23:42 UTC
To: Gentoo User

Dale wrote:
> I am thinking of upgrading from an FX-5200 video card with 128 MB to a
> GeForce 6200 with 512 MB. It will be AGP since this is an older rig.
> [...]
> What kind of improvement can I expect from this video card upgrade?
> While I am at it, the CPU upgrade won't make that much difference,
> right? Maybe 20% or so faster, something like that?

OK. I've been thinking on this. I decided to run glxgears to see what sort of frame rates I get. I used to get about 30 or so. I know this isn't the best test in the world, but it should give me something to compare to, at least. This is what I get:

16 frames in 5.1 seconds = 3.148 FPS
16 frames in 5.1 seconds = 3.165 FPS
15 frames in 5.3 seconds = 2.811 FPS
16 frames in 5.2 seconds = 3.075 FPS
16 frames in 5.1 seconds = 3.130 FPS
16 frames in 5.1 seconds = 3.159 FPS
16 frames in 5.1 seconds = 3.167 FPS
16 frames in 5.1 seconds = 3.154 FPS

So, same card as a year or so ago and same everything else, but now I get only about 1/10th the frame rate. What gives? Is this a driver issue? I'm going to take the side off and blow out the case in a bit and test again. I'm open to ideas in the meantime, tho. I may not need an upgrade; I may just need to fix what I've got.

Dale

:-) :-)
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB
From: Alex Schuster
Date: 2010-10-26 0:08 UTC
To: gentoo-user

Dale writes:
> So, same card as a year or so ago and same everything else, but now I
> get only about 1/10th the frame rate. What gives? Is this a driver
> issue?

Is OpenGL working at all? Does glxinfo produce lots of output, with 'direct rendering: Yes' near the top? If not, you're using software rendering, and everything is done by the CPU, not the GPU.

Wonko
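[Note: a quick way to pull out just those fields without scrolling through the full dump, assuming glxinfo is installed (it comes from x11-apps/mesa-progs, as comes up later in the thread):

glxinfo | grep -E 'direct rendering|OpenGL (vendor|renderer) string'

With the nvidia blob working you would expect 'direct rendering: Yes' and an OpenGL renderer string naming the card; a Mesa or software-rasterizer renderer string there usually means the CPU is doing the drawing.]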
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB
From: Dale
Date: 2010-10-26 1:48 UTC
To: gentoo-user

Alex Schuster wrote:
> Is OpenGL working at all? Does glxinfo produce lots of output, with
> 'direct rendering: Yes' near the top? If not, you're using software
> rendering, and everything is done by the CPU, not the GPU.

That's what I am thinking. I've noticed here lately that my CPU is being used a LOT more than it used to be when playing videos and such. I recently changed kernels and nvidia drivers; the kernel upgrade forced me to upgrade nvidia. It appears to have gotten worse with each upgrade. This is what I got from these two commands:

root@smoker / # eselect opengl list
Available OpenGL implementations:
  [1]   nvidia *
  [2]   xorg-x11

root@smoker / # glxinfo
name of display: :0.0
display: :0  screen: 0
direct rendering: Yes
server glx vendor string: NVIDIA Corporation
server glx version string: 1.4
server glx extensions:
    GLX_EXT_visual_info, GLX_EXT_visual_rating, GLX_SGIX_fbconfig,
    GLX_SGIX_pbuffer, GLX_SGI_video_sync, GLX_SGI_swap_control,
    GLX_EXT_texture_from_pixmap, GLX_ARB_multisample, GLX_NV_float_buffer
client glx vendor string: NVIDIA Corporation
client glx version string: 1.4
client glx extensions:
    GLX_ARB_get_proc_address, GLX_ARB_multisample, GLX_EXT_visual_info,
    GLX_EXT_visual_rating, GLX_EXT_import_context, GLX_SGI_video_sync,
    GLX_NV_swap_group, GLX_NV_video_out, GLX_SGIX_fbconfig, GLX_SGIX_pbuffer,
    GLX_SGI_swap_control, GLX_NV_float_buffer, GLX_ARB_fbconfig_float,
    GLX_EXT_fbconfig_packed_float, GLX_EXT_texture_from_pixmap,
    GLX_EXT_framebuffer_sRGB, GLX_NV_present_video
GLX version: 1.3
GLX extensions:
    GLX_EXT_visual_info, GLX_EXT_visual_rating, GLX_SGIX_fbconfig,
    GLX_SGIX_pbuffer, GLX_SGI_video_sync, GLX_SGI_swap_control,
    GLX_EXT_texture_from_pixmap, GLX_ARB_multisample, GLX_NV_float_buffer,
    GLX_ARB_get_proc_address
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: GeForce FX 5200/AGP/SSE/3DNOW!
OpenGL version string: 2.1.2 NVIDIA 173.14.25
OpenGL shading language version string: 1.20 NVIDIA via Cg compiler
<<SNIPPED>>

It says direct rendering is working, but it sure doesn't act like it. I watched a video a bit ago and, although it was a small window, only about 20% of my screen, it used just about all the CPU power. It didn't do that a few months ago. It used to take only 25% or so of the CPU to do a full-screen video. This is really weird.

I did take the side off my case an hour or so ago. I took my air tank and blew it out pretty good. I also checked to make sure the fan on the video card chip was turning. It was spinning fine and I could feel a little bit of air. It's a small fan, so I wasn't expecting a tornado or anything. Anyway, after blowing it out AND generating an xorg.conf with nvidia's program, I get this:

root@smoker / # glxgears
Running synchronized to the vertical refresh.  The framerate should be
approximately the same as the monitor refresh rate.
2932 frames in 5.0 seconds = 586.390 FPS
1260 frames in 5.7 seconds = 222.873 FPS
2 frames in 7.6 seconds = 0.263 FPS
2 frames in 8.0 seconds = 0.249 FPS
2 frames in 7.6 seconds = 0.264 FPS
2 frames in 7.7 seconds = 0.259 FPS
XIO:  fatal IO error 22 (Invalid argument) on X server ":0.0"
      after 58 requests (58 known processed) with 0 events remaining.
root@smoker / #

Well, if it wasn't bad enough before, it is really bad now. The first couple of readings were when the window was really small. I adjusted it to full screen, which is where the 0.2 FPS comes in. That used to be about 30 or so a while back. This is with the nvidia-generated xorg.conf file. I'm going back to my hand-made one; it seems to be a little better.

Any ideas as to why everything says it is working but it isn't?

Dale

:-) :-)
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB
From: Peter Humphrey
Date: 2010-10-26 17:12 UTC
To: gentoo-user

On Tuesday 26 October 2010 02:48:03 Dale wrote:
> root@smoker / # glxinfo
> root@smoker / # glxgears

Would someone tell me which package has these two programs? The bit of poking about that I've done doesn't find them.

--
Rgds
Peter.  Linux Counter 5290, 1994-04-23.
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB
From: Arttu V.
Date: 2010-10-26 17:28 UTC
To: gentoo-user

On 10/26/10, Peter Humphrey <peter@humphrey.ukfsn.org> wrote:
> Would someone tell me which package has these two programs? The bit of
> poking about that I've done doesn't find them.

~ $ equery belongs glxinfo
[ Searching for file(s) glxinfo in *... ]
x11-apps/mesa-progs-7.7 (/usr/bin/glxinfo)

~ $ equery belongs glxgears
[ Searching for file(s) glxgears in *... ]
x11-apps/mesa-progs-7.7 (/usr/bin/glxgears)

--
Arttu V. -- Running Gentoo is like running with scissors
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB
From: Peter Humphrey
Date: 2010-10-27 14:37 UTC
To: gentoo-user

On Tuesday 26 October 2010 18:28:16 Arttu V. wrote:
> ~ $ equery belongs glxinfo
> [ Searching for file(s) glxinfo in *... ]
> x11-apps/mesa-progs-7.7 (/usr/bin/glxinfo)
>
> ~ $ equery belongs glxgears
> [ Searching for file(s) glxgears in *... ]
> x11-apps/mesa-progs-7.7 (/usr/bin/glxgears)

Thank you, kind Sir! I didn't have that program installed.

--
Rgds
Peter.  Linux Counter 5290, 1994-04-23.
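[Note: for anyone following along who also lacks those tools, equery itself comes from app-portage/gentoolkit, and the gears/info programs live in the package Arttu found, so on a typical Gentoo box pulling both in would look something like:

emerge --ask app-portage/gentoolkit x11-apps/mesa-progs

The --ask flag just shows what would be installed before committing.]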
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB
From: Iain Buchanan
Date: 2010-10-26 2:04 UTC
To: gentoo-user

On Mon, 2010-10-25 at 18:42 -0500, Dale wrote:
> So, same card as a year or so ago and same everything else, but now I
> get only about 1/10th the frame rate. What gives? Is this a driver
> issue? I'm going to take the side off and blow out the case in a bit
> and test again. I'm open to ideas in the meantime, tho.

I'm having issues with the latest mix of nvidia-drivers, xorg, and whatever else it might be!

I'm getting bad performance when switching virtual desktops and moving windows and such. GL screensavers seem to be OK, though.

Someone posted recently about an upgrade that affected him (looking... can't find it). He downgraded to fix it, but from memory it wasn't nvidia or X. Sorry for being vague; I'll keep looking.

--
Iain Buchanan <iaindb at netspace dot net dot au>

When an episode of Walker Texas Ranger was aired in France, the French
surrendered to Chuck Norris just to be on the safe side.
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB
From: Dale
Date: 2010-10-26 2:26 UTC
To: gentoo-user

Iain Buchanan wrote:
> I'm having issues with the latest mix of nvidia-drivers, xorg, and
> whatever else it might be!
>
> I'm getting bad performance when switching virtual desktops and moving
> windows and such. GL screensavers seem to be OK, though.
> [...]

Well, at least we know there is a problem and it isn't just us. If you find something, let us know. I would hate to have to try to watch a DVD right now. I doubt it would even start up. o_O

BTW, hal is disabled on xorg here. It's enabled on other things, but not xorg.

Dale

:-) :-)
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB
From: Dale
Date: 2010-10-26 3:55 UTC
To: gentoo-user

Dale wrote:
> Well, at least we know there is a problem and it isn't just us. If you
> find something, let us know. I would hate to have to try to watch a DVD
> right now. I doubt it would even start up. o_O

I have an update. Check this out:

root@smoker / # glxgears
Running synchronized to the vertical refresh.  The framerate should be
approximately the same as the monitor refresh rate.
1380 frames in 5.0 seconds = 275.147 FPS
242 frames in 5.0 seconds = 48.345 FPS
241 frames in 5.0 seconds = 48.191 FPS
246 frames in 5.0 seconds = 49.056 FPS
240 frames in 5.0 seconds = 47.902 FPS
238 frames in 5.0 seconds = 47.453 FPS
238 frames in 5.0 seconds = 47.566 FPS
240 frames in 5.0 seconds = 47.839 FPS
244 frames in 5.0 seconds = 48.577 FPS
242 frames in 5.0 seconds = 48.257 FPS
241 frames in 5.0 seconds = 48.196 FPS
242 frames in 5.0 seconds = 48.380 FPS
238 frames in 5.0 seconds = 47.573 FPS
238 frames in 5.0 seconds = 47.596 FPS
241 frames in 5.0 seconds = 48.168 FPS
XIO:  fatal IO error 22 (Invalid argument) on X server ":0.0"
      after 900 requests (900 known processed) with 0 events remaining.
root@smoker / #

What did I do, you ask? Well, I did this:

root@smoker / # eselect opengl list
Available OpenGL implementations:
  [1]   nvidia *
  [2]   xorg-x11
root@smoker / # eselect opengl set 1
Switching to nvidia OpenGL interface... done
root@smoker / #

Yea, it SAID it was already set, but I told it to set it again anyway. Now I get some good frame rates again. Excuse me while I go watch some videos I've been wanting to watch but got tired of the spit and sputter. lol

YEPPIE ! ! ! It breathes again.

Dale

:-) :-)

P. S. Bonus points if someone can explain why that worked. o_O
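[Note on reading those numbers: as the banner itself says, with the driver's sync-to-vblank enabled glxgears is pinned near the monitor's refresh rate rather than measuring raw GL throughput, so the ~48 FPS figures are not a benchmark of the card. If the uncapped number is ever wanted for comparison, the nvidia blob respects an environment variable for this; a hedged sketch, specific to the proprietary driver:

__GL_SYNC_TO_VBLANK=0 glxgears

For everyday video playback, leaving sync-to-vblank on is usually the right choice anyway, since it avoids tearing.]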
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB
From: me
Date: 2010-10-26 5:18 UTC
To: gentoo-user

On Mon, Oct 25, 2010 at 11:55 PM, Dale <rdalek1967@gmail.com> wrote:
> What did I do, you ask? Well, I did this:
>
> root@smoker / # eselect opengl set 1
> Switching to nvidia OpenGL interface... done
>
> Yea, it SAID it was already set, but I told it to set it again anyway.
> Now I get some good frame rates again.
> [...]
> P. S. Bonus points if someone can explain why that worked. o_O

It's been a while since I last had X plus accelerated 3D on any of my systems here (my gaming box is running W7), but my guess is that, while Nvidia's libraries *were* configured as the GL implementation to be used, in the process of the last driver upgrade they got overwritten with upgraded versions, things that should have referenced them failed and fell back to mesa's libraries, and you got stuck with CPU-based 3D rendering. When you used eselect to set it to Nvidia's again, you refreshed the links to the proper, updated libraries and things started using the GPU again.

This is, of course, entirely a guess, and it is at least moderately undercut by this line from glxinfo:

server glx vendor string: NVIDIA Corporation

Still... it's the best guess I have at 1:17 AM here, my time.

--
Poison [BLX]
Joshua M. Murphy
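[Note: a guess like that can be checked on the spot. A hedged sketch, using commands available on a Gentoo box of that era; the exact paths shown by ldd depend on how eselect wires up the implementation directories under /usr/lib/opengl:

eselect opengl show
ldd /usr/bin/glxgears | grep -i libgl

If ldd resolves libGL.so.1 to Mesa's copy instead of the one under the nvidia implementation directory, the box is on software rendering regardless of what eselect claims is selected.]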
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB
From: Iain Buchanan
Date: 2010-10-26 5:27 UTC
To: gentoo-user

On Mon, 2010-10-25 at 22:55 -0500, Dale wrote:
> I have an update. Check this out:

hey, don't get my hopes up like that. Still no improvement on my box. But then, I am seeing nearly 6500 FPS .... :D

--
Iain Buchanan <iaindb at netspace dot net dot au>

The chief enemy of creativity is "good" sense -- Picasso
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-10-26 5:27 ` Iain Buchanan @ 2010-10-27 7:42 ` Dale 2010-11-06 15:03 ` Dale 0 siblings, 1 reply; 58+ messages in thread From: Dale @ 2010-10-27 7:42 UTC (permalink / raw To: gentoo-user Iain Buchanan wrote: > On Mon, 2010-10-25 at 22:55 -0500, Dale wrote: > >> I have a update. Check this out: >> > hey, don't get my hopes up like that. Still no improvement on my box. > But then, I am seeing nearly 6500 FPS .... :D > > I noticed something on mine when I did that. I was actually doing the command in a Konsole. It seemed to mess up again later on. It got REALLY slow. I decided to do things differently. I logged out of KDE, went to single user mode, typed in the command to set opengl to nvidia, then went back to default runlevel and logged in. It worked fine and has ever since. I have logged out several times, been experimenting with fluxbox, and it is still fast as it was. So, it may be best to run that when logged out of a GUI at least but I went to single user just to be certain. I would think that stopping xdm would work just as well but one never knows about these things. Maybe that will help. Never hurts to hope. Dale :-) :-) ^ permalink raw reply [flat|nested] 58+ messages in thread
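Dale's single-user detour can probably be shortened: the point is simply that X must not be running while the links are switched. A minimal sketch, assuming the display manager is started through the standard xdm init script on OpenRC:

    /etc/init.d/xdm stop        # leave the GUI so nothing holds the GL libraries open
    eselect opengl set nvidia   # rebuild the symlinks
    /etc/init.d/xdm start       # log back in and re-test with glxgears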
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-10-27 7:42 ` Dale @ 2010-11-06 15:03 ` Dale 2010-11-06 20:06 ` Robin Atwood 0 siblings, 1 reply; 58+ messages in thread From: Dale @ 2010-11-06 15:03 UTC (permalink / raw To: gentoo-user Dale wrote: > > I noticed something on mine when I did that. I was actually doing the > command in a Konsole. It seemed to mess up again later on. It got > REALLY slow. I decided to do things differently. I logged out of > KDE, went to single user mode, typed in the command to set opengl to > nvidia, then went back to default runlevel and logged in. It worked > fine and has ever since. I have logged out several times, been > experimenting with fluxbox, and it is still fast as it was. So, it > may be best to run that when logged out of a GUI at least but I went > to single user just to be certain. I would think that stopping xdm > would work just as well but one never knows about these things. > > Maybe that will help. Never hurts to hope. > > Dale > > :-) :-) > This is getting weird. I haven't rebooted in a few weeks now. I tried to watch a video a bit ago and it was slow again. It was down to about 2 or 3 frames per second. It is awful. If I go tell it to switch to opengl, it gets fast again but after a while it will go back to being really slow. Why do I have to keep telling it to use nvidia's opengl when it says it is using it and I have switched to a few times? If it is using it, why does it slow down until I tell it to switch? I did do a huge KDE upgrade the other day. I don't recall seeing anything else X related being updated but I could have missed something in that LONG list. I did do a baselayout upgrade and portage itself has been upgraded a few times. Any ideas on why this thing keeps doing this? Would a reboot even help in this situation? Dale :-) :-) ^ permalink raw reply [flat|nested] 58+ messages in thread
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-11-06 15:03 ` Dale @ 2010-11-06 20:06 ` Robin Atwood 2010-11-07 21:03 ` walt 2010-11-08 0:07 ` Dale 0 siblings, 2 replies; 58+ messages in thread From: Robin Atwood @ 2010-11-06 20:06 UTC (permalink / raw To: gentoo-user On Saturday 06 November 2010, Dale wrote: > Dale wrote: > This is getting weird. I haven't rebooted in a few weeks now. I tried > to watch a video a bit ago and it was slow again. It was down to about > 2 or 3 frames per second. It is awful. If I go tell it to switch to > opengl, it gets fast again but after a while it will go back to being > really slow. Why do I have to keep telling it to use nvidia's opengl > when it says it is using it and I have switched to a few times? If it > is using it, why does it slow down until I tell it to switch? > > I did do a huge KDE upgrade the other day. I don't recall seeing > anything else X related being updated but I could have missed something > in that LONG list. I did do a baselayout upgrade and portage itself has > been upgraded a few times. > > Any ideas on why this thing keeps doing this? Would a reboot even help > in this situation? When it gets very slow start up top and see what's using the CPU. My bet is the Xserver. I have a GeForce 9400 GT 512MB and the xserver will happily use 90% while nothing much is happening. Start a KDE4 app which constantly updates (ktorrent, kps are good 3rd party examples) and the xserver goes crazy. HTH -Robin -- ---------------------------------------------------------------------- Robin Atwood. "Ship me somewheres east of Suez, where the best is like the worst, Where there ain't no Ten Commandments an' a man can raise a thirst" from "Mandalay" by Rudyard Kipling ---------------------------------------------------------------------- ^ permalink raw reply [flat|nested] 58+ messages in thread
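If watching top interactively is awkward while the machine is crawling, a one-shot snapshot of the biggest CPU consumers can be taken with procps' ps, for example:

    ps -eo pid,user,pcpu,pmem,comm --sort=-pcpu | head -n 10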
* [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-11-06 20:06 ` Robin Atwood @ 2010-11-07 21:03 ` walt 2010-11-07 23:27 ` Robin Atwood 2010-11-08 0:07 ` Dale 1 sibling, 1 reply; 58+ messages in thread From: walt @ 2010-11-07 21:03 UTC (permalink / raw To: gentoo-user On 11/06/2010 01:06 PM, Robin Atwood wrote: > ,,,I have a GeForce 9400 GT 512MB and the xserver will happily use > 90% while nothing much is happening. Start a KDE4 app which constantly updates > (ktorrent, kps are good 3rd party examples) and the xserver goes crazy. That sounds to me like a bug somewhere. Do you have the fancy kde user interface enabled? (Can't remember what it's called.) ^ permalink raw reply [flat|nested] 58+ messages in thread
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-11-07 21:03 ` walt @ 2010-11-07 23:27 ` Robin Atwood 0 siblings, 0 replies; 58+ messages in thread From: Robin Atwood @ 2010-11-07 23:27 UTC (permalink / raw To: gentoo-user On Monday 08 November 2010, walt wrote: > On 11/06/2010 01:06 PM, Robin Atwood wrote: > > ,,,I have a GeForce 9400 GT 512MB and the xserver will happily use > > 90% while nothing much is happening. Start a KDE4 app which constantly > > updates (ktorrent, kps are good 3rd party examples) and the xserver goes > > crazy. > > That sounds to me like a bug somewhere. Do you have the fancy kde user > interface enabled? (Can't remember what it's called.) Compositing is turned on but turning it off doesn't help. -Robin -- ---------------------------------------------------------------------- Robin Atwood. "Ship me somewheres east of Suez, where the best is like the worst, Where there ain't no Ten Commandments an' a man can raise a thirst" from "Mandalay" by Rudyard Kipling ---------------------------------------------------------------------- ^ permalink raw reply [flat|nested] 58+ messages in thread
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-11-06 20:06 ` Robin Atwood 2010-11-07 21:03 ` walt @ 2010-11-08 0:07 ` Dale 2010-11-08 14:25 ` Robin Atwood 1 sibling, 1 reply; 58+ messages in thread From: Dale @ 2010-11-08 0:07 UTC (permalink / raw To: gentoo-user Robin Atwood wrote: > On Saturday 06 November 2010, Dale wrote: > >> Dale wrote: >> > >> This is getting weird. I haven't rebooted in a few weeks now. I tried >> to watch a video a bit ago and it was slow again. It was down to about >> 2 or 3 frames per second. It is awful. If I go tell it to switch to >> opengl, it gets fast again but after a while it will go back to being >> really slow. Why do I have to keep telling it to use nvidia's opengl >> when it says it is using it and I have switched to a few times? If it >> is using it, why does it slow down until I tell it to switch? >> >> I did do a huge KDE upgrade the other day. I don't recall seeing >> anything else X related being updated but I could have missed something >> in that LONG list. I did do a baselayout upgrade and portage itself has >> been upgraded a few times. >> >> Any ideas on why this thing keeps doing this? Would a reboot even help >> in this situation? >> > When it gets very slow start up top and see what's using the CPU. My bet is > the Xserver. I have a GeForce 9400 GT 512MB and the xserver will happily use > 90% while nothing much is happening. Start a KDE4 app which constantly updates > (ktorrent, kps are good 3rd party examples) and the xserver goes crazy. > > HTH > -Robin > Nope, it wasn't that here. This is what top says: PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND 17995 root 20 0 45360 15m 3360 R 89.6 0.7 0:35.72 glxgears 32113 dale 20 0 305m 162m 27m S 3.3 8.0 17:56.38 seamonkey-bin 31796 root 20 0 187m 76m 30m S 2.0 3.8 21:51.94 X 31914 dale 20 0 286m 47m 24m S 1.7 2.3 18:04.02 kwin It was glxgears that was taking up the most CPU time but I think the rest of it was processing the video. Thing is, nothing has been updated and I have not even logged out of KDE since it was working this morning. So, without me doing a single thing, it has stopped working as it should. It's like the card is being bypassed as far as it using its own CPU to process the picture. Oh, look at this miserable mess: 2 frames in 8.5 seconds = 0.236 FPS 2 frames in 8.7 seconds = 0.230 FPS 2 frames in 8.3 seconds = 0.241 FPS 2 frames in 8.1 seconds = 0.246 FPS 2 frames in 8.1 seconds = 0.247 FPS 2 frames in 8.1 seconds = 0.247 FPS 2 frames in 8.3 seconds = 0.241 FPS Trust me, to see those little wheels turn that slow is really boring. Going back to single user and switch this again. I have noticed that telling it to switch to nvidia's opengl while in single user mode does seem to last longer. Going to re-emerge the drivers to while I am at it. Can't hurt anything. Still open to ideas cause this is weird. Dale :-) :-) ^ permalink raw reply [flat|nested] 58+ messages in thread
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-11-08 0:07 ` Dale @ 2010-11-08 14:25 ` Robin Atwood 2010-11-08 17:24 ` Dale 0 siblings, 1 reply; 58+ messages in thread From: Robin Atwood @ 2010-11-08 14:25 UTC (permalink / raw To: gentoo-user On Monday 08 November 2010, Dale wrote: > Robin Atwood wrote: > > On Saturday 06 November 2010, Dale wrote: > >> Dale wrote: > >> > >> > >> This is getting weird. I haven't rebooted in a few weeks now. I tried > >> to watch a video a bit ago and it was slow again. It was down to about > >> 2 or 3 frames per second. It is awful. If I go tell it to switch to > >> opengl, it gets fast again but after a while it will go back to being > >> really slow. Why do I have to keep telling it to use nvidia's opengl > >> when it says it is using it and I have switched to a few times? If it > >> is using it, why does it slow down until I tell it to switch? > >> > >> I did do a huge KDE upgrade the other day. I don't recall seeing > >> anything else X related being updated but I could have missed something > >> in that LONG list. I did do a baselayout upgrade and portage itself has > >> been upgraded a few times. > >> > >> Any ideas on why this thing keeps doing this? Would a reboot even help > >> in this situation? > > > > When it gets very slow start up top and see what's using the CPU. My bet > > is the Xserver. I have a GeForce 9400 GT 512MB and the xserver will > > happily use 90% while nothing much is happening. Start a KDE4 app which > > constantly updates (ktorrent, kps are good 3rd party examples) and the > > xserver goes crazy. > > > > HTH > > -Robin > > Nope, it wasn't that here. This is what top says: > > PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND > 17995 root 20 0 45360 15m 3360 R 89.6 0.7 0:35.72 glxgears > 32113 dale 20 0 305m 162m 27m S 3.3 8.0 17:56.38 seamonkey-bin > 31796 root 20 0 187m 76m 30m S 2.0 3.8 21:51.94 X > 31914 dale 20 0 286m 47m 24m S 1.7 2.3 18:04.02 kwin > > It was glxgears that was taking up the most CPU time but I think the > rest of it was processing the video. Thing is, nothing has been updated > and I have not even logged out of KDE since it was working this > morning. So, without me doing a single thing, it has stopped working as > it should. It's like the card is being bypassed as far as it using its > own CPU to process the picture. > > Oh, look at this miserable mess: > > 2 frames in 8.5 seconds = 0.236 FPS > 2 frames in 8.7 seconds = 0.230 FPS > 2 frames in 8.3 seconds = 0.241 FPS > 2 frames in 8.1 seconds = 0.246 FPS > 2 frames in 8.1 seconds = 0.247 FPS > 2 frames in 8.1 seconds = 0.247 FPS > 2 frames in 8.3 seconds = 0.241 FPS > > Trust me, to see those little wheels turn that slow is really boring. > > Going back to single user and switch this again. I have noticed that > telling it to switch to nvidia's opengl while in single user mode does > seem to last longer. Going to re-emerge the drivers to while I am at > it. Can't hurt anything. > > Still open to ideas cause this is weird. AFAIK, all "eselect opengl" does is set up some symlinks so you use NVidia libraries and not Mesa ones. You might want to poke around and check last access dates. HTH -Robin -- ---------------------------------------------------------------------- Robin Atwood. 
"Ship me somewheres east of Suez, where the best is like the worst, Where there ain't no Ten Commandments an' a man can raise a thirst" from "Mandalay" by Rudyard Kipling ---------------------------------------------------------------------- ^ permalink raw reply [flat|nested] 58+ messages in thread
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-11-08 14:25 ` Robin Atwood @ 2010-11-08 17:24 ` Dale 2010-11-08 21:20 ` Dale 0 siblings, 1 reply; 58+ messages in thread From: Dale @ 2010-11-08 17:24 UTC (permalink / raw To: gentoo-user [-- Attachment #1: Type: text/plain, Size: 1044 bytes --] Robin Atwood wrote: > AFAIK, all "eselect opengl" does is set up some symlinks so you use NVidia > libraries and not Mesa ones. You might want to poke around and check last > access dates. > > HTH > -Robin > I was thinking the same thing. I figure something worked for a while and then had some sort of a error and then switched to something else that was slow. I don't know the inner workings of opengl so I am just guessing. I just know it worked for a while then didn't until I told it to switch again. It is weird tho. I did do this last night tho. I upgraded my kernel and updated to the latest nvidia drivers. I checked it again a few minutes ago by playing a video and it is still working like it should. At almost full screen my CPU was running at about 40 to 50% which is about like it was a while back. So, I figure it was either some sort of kernel issue or even more likely a nvidia driver issue. I'm just hoping it keeps working like this. Those little wheels are turning pretty good now. Dale :-) :-) [-- Attachment #2: Type: text/html, Size: 1504 bytes --] ^ permalink raw reply [flat|nested] 58+ messages in thread
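For completeness, the usual drill after a kernel upgrade is to rebuild the out-of-tree module before expecting acceleration to work; something along these lines, with X stopped for the module reload:

    emerge --oneshot x11-drivers/nvidia-drivers   # rebuilds nvidia.ko against the kernel /usr/src/linux points to
    modprobe -r nvidia && modprobe nvidia         # reload the freshly built module (fails while X is still up)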
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-11-08 17:24 ` Dale @ 2010-11-08 21:20 ` Dale 2010-11-08 21:26 ` Alex Schuster 2010-11-09 0:06 ` walt 0 siblings, 2 replies; 58+ messages in thread From: Dale @ 2010-11-08 21:20 UTC (permalink / raw To: gentoo-user [-- Attachment #1: Type: text/plain, Size: 1487 bytes --] Dale wrote: > > I was thinking the same thing. I figure something worked for a while > and then had some sort of a error and then switched to something else > that was slow. I don't know the inner workings of opengl so I am just > guessing. I just know it worked for a while then didn't until I told > it to switch again. It is weird tho. > > I did do this last night tho. I upgraded my kernel and updated to the > latest nvidia drivers. I checked it again a few minutes ago by > playing a video and it is still working like it should. At almost > full screen my CPU was running at about 40 to 50% which is about like > it was a while back. So, I figure it was either some sort of kernel > issue or even more likely a nvidia driver issue. > > I'm just hoping it keeps working like this. Those little wheels are > turning pretty good now. > > Dale > > :-) :-) Well, I worked on my air compressor and played in the dirt in my garden for a while and now I get this again: 2 frames in 7.6 seconds = 0.263 FPS 2 frames in 7.7 seconds = 0.259 FPS I don't know what the issue is but it is getting on my nerves. I have not even logged out of KDE and it is slow again. The only thing I have done was to downgrade gtkam to see if the old version crashes too. Nothing else has been messed with since this morning. Any ideas at all? I'm about ready to do a emerge -e world and see if that helps. It's getting cool so I could use the heat anyway. Dale :-) :-) [-- Attachment #2: Type: text/html, Size: 2070 bytes --] ^ permalink raw reply [flat|nested] 58+ messages in thread
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-11-08 21:20 ` Dale @ 2010-11-08 21:26 ` Alex Schuster 2010-11-08 22:29 ` Dale 2010-11-09 0:06 ` walt 1 sibling, 1 reply; 58+ messages in thread From: Alex Schuster @ 2010-11-08 21:26 UTC (permalink / raw To: gentoo-user Dale writes: > Well, I worked on my air compressor and played in the dirt in my garden > for a while and now I get this again: > > 2 frames in 7.6 seconds = 0.263 FPS > 2 frames in 7.7 seconds = 0.259 FPS D'ouch! > I don't know what the issue is but it is getting on my nerves. I have > not even logged out of KDE and it is slow again. The only thing I have > done was to downgrade gtkam to see if the old version crashes too. > Nothing else has been messed with since this morning. > > Any ideas at all? I'm about ready to do a emerge -e world and see if > that helps. It's getting cool so I could use the heat anyway. Anything in syslog, Xorg.log or dmesg about drm suddenly being turned off? I'd get back into the garden and turn the air compressor to reverse. Wonko ^ permalink raw reply [flat|nested] 58+ messages in thread
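The kind of check Alex is suggesting boils down to grepping the usual suspects, for example:

    grep -iE 'drm|glx|nvidia' /var/log/Xorg.0.log | tail -n 20
    dmesg | grep -iE 'drm|nvidia' | tail -n 20
    tail -n 50 /var/log/messages      # or wherever syslog writes on this box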
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-11-08 21:26 ` Alex Schuster @ 2010-11-08 22:29 ` Dale 0 siblings, 0 replies; 58+ messages in thread From: Dale @ 2010-11-08 22:29 UTC (permalink / raw To: gentoo-user Alex Schuster wrote: > Dale writes: > > >> Well, I worked on my air compressor and played in the dirt in my garden >> for a while and now I get this again: >> >> 2 frames in 7.6 seconds = 0.263 FPS >> 2 frames in 7.7 seconds = 0.259 FPS >> > D'ouch! > > >> I don't know what the issue is but it is getting on my nerves. I have >> not even logged out of KDE and it is slow again. The only thing I have >> done was to downgrade gtkam to see if the old version crashes too. >> Nothing else has been messed with since this morning. >> >> Any ideas at all? I'm about ready to do a emerge -e world and see if >> that helps. It's getting cool so I could use the heat anyway. >> > Anything in syslog, Xorg.log or dmesg about drm suddenly being turned off? > > I'd get back into the garden and turn the air compressor to reverse. > > Wonko > > I checked messages, Xorg.log and dmesg, nothing out of the ordinary in there. Just me plugging up my camera, ntpd setting the clock and such nothingness as that. I can't think of any other logs that I can check either. That air compressor has been giving me fits. First a dirt dobber built him a nice house in it and then that caused a wire to burn out on a run capacitor. I evicted the wasp a week or so ago and fixed the wire today. That bug better not try to move in again either. I'll evict him next time with bug spray. Make it a permanent eviction. lol Dale :-) :-) ^ permalink raw reply [flat|nested] 58+ messages in thread
* [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-11-08 21:20 ` Dale 2010-11-08 21:26 ` Alex Schuster @ 2010-11-09 0:06 ` walt 2010-11-09 0:21 ` Dale 1 sibling, 1 reply; 58+ messages in thread From: walt @ 2010-11-09 0:06 UTC (permalink / raw To: gentoo-user On 11/08/2010 01:20 PM, Dale wrote: > Dale wrote: >> >> I was thinking the same thing. I figure something worked for a while and then had some sort of a error and then switched to something else that was slow. I don't know the inner workings of opengl so I am just guessing.I just know it worked for a while >> then didn't until I told it to switch again. It is weird tho. >> >> I did do this last night tho. I upgraded my kernel and updated to the latest nvidia drivers. I checked it again a few minutes ago by playing a video and it is still working like it should. At almost full screen my CPU was running at about 40 to 50% >> which is about like it was a while back. So, I figure it was either some sort of kernel issue or even more likely a nvidia driver issue. >> >> I'm just hoping it keeps working like this. Those little wheels are turning pretty good now. >> >> Dale >> >> :-) :-) > > Well, I worked on my air compressor and played in the dirt in my garden for a while and now I get this again: > > 2 frames in 7.6 seconds = 0.263 FPS > 2 frames in 7.7 seconds = 0.259 FPS Is it possible that something slowly fills up RAM so your system has to start swapping? KDE used to have a 'system monitor' thingy that displays usage of all the various system resources like RAM and swap and CPU. I always have the equivalent gnome applet displayed on the gnome panel and it's alerted me to countless similar bugs over the years. ^ permalink raw reply [flat|nested] 58+ messages in thread
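The quick way to rule that out is a glance at memory and swap from the command line, e.g.:

    free -m                          # RAM and swap usage in megabytes
    grep -i swap /proc/meminfo       # SwapTotal / SwapFree / SwapCached
    cat /proc/sys/vm/swappiness      # the swappiness knob Dale mentions in his reply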
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-11-09 0:06 ` walt @ 2010-11-09 0:21 ` Dale 2010-11-10 20:18 ` Dale 0 siblings, 1 reply; 58+ messages in thread From: Dale @ 2010-11-09 0:21 UTC (permalink / raw To: gentoo-user walt wrote: > On 11/08/2010 01:20 PM, Dale wrote: >> Dale wrote: >>> >>> I was thinking the same thing. I figure something worked for a while >>> and then had some sort of a error and then switched to something >>> else that was slow. I don't know the inner workings of opengl so I >>> am just guessing.I just know it worked for a while >>> then didn't until I told it to switch again. It is weird tho. >>> >>> I did do this last night tho. I upgraded my kernel and updated to >>> the latest nvidia drivers. I checked it again a few minutes ago by >>> playing a video and it is still working like it should. At almost >>> full screen my CPU was running at about 40 to 50% >>> which is about like it was a while back. So, I figure it was either >>> some sort of kernel issue or even more likely a nvidia driver issue. >>> >>> I'm just hoping it keeps working like this. Those little wheels are >>> turning pretty good now. >>> >>> Dale >>> >>> :-) :-) >> >> Well, I worked on my air compressor and played in the dirt in my >> garden for a while and now I get this again: >> >> 2 frames in 7.6 seconds = 0.263 FPS >> 2 frames in 7.7 seconds = 0.259 FPS > > Is it possible that something slowly fills up RAM so your system has to > start swapping? > > KDE used to have a 'system monitor' thingy that displays usage of all the > various system resources like RAM and swap and CPU. I always have the > equivalent gnome applet displayed on the gnome panel and it's alerted me > to countless similar bugs over the years. > According to top, gkrellm and cat /proc/meminfo there is no swap in use. I have 2Gbs of ram and have swappiness set to 20 or 30. I rarely use swap unless I am compiling something huge, OOo comes to mind, or have a LOT of images open with GIMP. I did check to make sure tho. My swappiness did get magically changed once before. I wish it was something that easy tho. Still open to ideas. I started a emerge -e world. Dale :-) :-) ^ permalink raw reply [flat|nested] 58+ messages in thread
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-11-09 0:21 ` Dale @ 2010-11-10 20:18 ` Dale 2010-11-10 21:53 ` Alan McKinnon 0 siblings, 1 reply; 58+ messages in thread From: Dale @ 2010-11-10 20:18 UTC (permalink / raw To: gentoo-user Dale wrote: > > According to top, gkrellm and cat /proc/meminfo there is no swap in > use. I have 2Gbs of ram and have swappiness set to 20 or 30. I > rarely use swap unless I am compiling something huge, OOo comes to > mind, or have a LOT of images open with GIMP. > > I did check to make sure tho. My swappiness did get magically changed > once before. I wish it was something that easy tho. > > Still open to ideas. I started a emerge -e world. > > Dale > > :-) :-) > Just to update here. I started a emerge -e world. It has not even finished yet but it appears to be working fine now. It was working yesterday, last night, this morning and was working fine when I tried just a minute ago. So, it appears that something needed to be recompiled somewhere but no clue what that could have been. I'll keep testing over the next few days and may report back if it is still working correctly. I hope that it does tho. It was getting on my nerves. Thanks for the ideas and help. Dale :-) :-) ^ permalink raw reply [flat|nested] 58+ messages in thread
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-11-10 20:18 ` Dale @ 2010-11-10 21:53 ` Alan McKinnon 2010-11-11 2:00 ` Dale 0 siblings, 1 reply; 58+ messages in thread From: Alan McKinnon @ 2010-11-10 21:53 UTC (permalink / raw To: gentoo-user Apparently, though unproven, at 22:18 on Wednesday 10 November 2010, Dale did opine thusly: > Dale wrote: > > According to top, gkrellm and cat /proc/meminfo there is no swap in > > use. I have 2Gbs of ram and have swappiness set to 20 or 30. I > > rarely use swap unless I am compiling something huge, OOo comes to > > mind, or have a LOT of images open with GIMP. > > > > I did check to make sure tho. My swappiness did get magically changed > > once before. I wish it was something that easy tho. > > > > Still open to ideas. I started a emerge -e world. > > > > Dale > > > > :-) :-) > > Just to update here. I started a emerge -e world. It has not even > finished yet but it appears to be working fine now. It was working > yesterday, last night, this morning and was working fine when I tried > just a minute ago. So, it appears that something needed to be > recompiled somewhere but no clue what that could have been. > > I'll keep testing over the next few days and may report back if it is > still working correctly. I hope that it does tho. It was getting on my > nerves. Useful tip to keep in mind: Sometimes emerge -e world works out great. It's way overkill mostly but unlike a sledgehammer to kill a mosquito, doesn't break the wall as well as kill the insect :-) IIRC, revdep-rebuild came about from the same line of thought. Some libs were being wrongly linked or linked to missing stuff and it was a huge ball-ache to find them all. Imagine running ldd on every binary and grepping for "not found" :-) It might even have been a glibc update (memory weak this end). revdep-rebuild finds the easily detectable stuff. But there's other problems that can happen with binaries that are not so easy to check (or not known to the dev), and none of the Gentoo tools help locate the culprit. emerge -e world will just rebuild everything in sight with the nice side effect of taking care of these mysterious problems. Hello sledgehammer. Pity that it can't record what it fixed though. It's interesting to see why Ubuntu and other binary distros never have this problem. First, they don't rip foundation libs out underneath a running system and insert different ones on the fly, and the API/ABI of their libs doesn't change for the life of that release of the distro. Plus, their build farms that generate new rpms/debs/pkgs nightly, essentially do the equivalent of a full emerge -e world daily and copy the binaries to the download server So sometimes when all else fails and suicide seems attractive, this is a workable approach that can help. Now if we can just get the gcc upgrade docs changed to reflect intelligent reality, we can get newbies to grok that emerge -e world is not suitable for the *first* fault-finding tool one uses.... -- alan dot mckinnon at gmail dot com ^ permalink raw reply [flat|nested] 58+ messages in thread
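Alan's "ldd on every binary" remark is, in essence, what revdep-rebuild automates. A crude hand-rolled version, slow and missing the corner cases the real tool knows about, might look like:

    for f in /usr/bin/* /usr/lib/*.so*; do
        ldd "$f" 2>/dev/null | grep -q 'not found' && echo "$f has dangling library references"
    done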
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-11-10 21:53 ` Alan McKinnon @ 2010-11-11 2:00 ` Dale 2010-11-11 12:39 ` Robin Atwood 0 siblings, 1 reply; 58+ messages in thread From: Dale @ 2010-11-11 2:00 UTC (permalink / raw To: gentoo-user Alan McKinnon wrote: > Apparently, though unproven, at 22:18 on Wednesday 10 November 2010, Dale did > opine thusly: > > >> Dale wrote: >> >>> According to top, gkrellm and cat /proc/meminfo there is no swap in >>> use. I have 2Gbs of ram and have swappiness set to 20 or 30. I >>> rarely use swap unless I am compiling something huge, OOo comes to >>> mind, or have a LOT of images open with GIMP. >>> >>> I did check to make sure tho. My swappiness did get magically changed >>> once before. I wish it was something that easy tho. >>> >>> Still open to ideas. I started a emerge -e world. >>> >>> Dale >>> >>> :-) :-) >>> >> Just to update here. I started a emerge -e world. It has not even >> finished yet but it appears to be working fine now. It was working >> yesterday, last night, this morning and was working fine when I tried >> just a minute ago. So, it appears that something needed to be >> recompiled somewhere but no clue what that could have been. >> >> I'll keep testing over the next few days and may report back if it is >> still working correctly. I hope that it does tho. It was getting on my >> nerves. >> > Useful tip to keep in mind: > > Sometimes emerge -e world works out great. It's way overkill mostly but unlike > a sledgehammer to kill a mosquito, doesn't break the wall as well as kill the > insect :-) > > IIRC, revdep-rebuild came about from the same line of thought. Some libs were > being wrongly linked or linked to missing stuff and it was a huge ball-ache to > find them all. Imagine running ldd on every binary and grepping for "not > found" :-) It might even have been a glibc update (memory weak this end). > > revdep-rebuild finds the easily detectable stuff. But there's other problems > that can happen with binaries that are not so easy to check (or not known to > the dev), and none of the Gentoo tools help locate the culprit. emerge -e > world will just rebuild everything in sight with the nice side effect of > taking care of these mysterious problems. Hello sledgehammer. Pity that it > can't record what it fixed though. > > It's interesting to see why Ubuntu and other binary distros never have this > problem. First, they don't rip foundation libs out underneath a running system > and insert different ones on the fly, and the API/ABI of their libs doesn't > change for the life of that release of the distro. Plus, their build farms > that generate new rpms/debs/pkgs nightly, essentially do the equivalent of a > full emerge -e world daily and copy the binaries to the download server > > So sometimes when all else fails and suicide seems attractive, this is a > workable approach that can help. Now if we can just get the gcc upgrade docs > changed to reflect intelligent reality, we can get newbies to grok that emerge > -e world is not suitable for the *first* fault-finding tool one uses.... > Yea, this for me was only considered when there was no more ideas coming. People posted ideas and I tried different things but it still messed up with no error that I could find. I guess I could have just did a emerge -e nvidia-drivers and that would have rebuilt everything needed by nvidia and should in theory have worked. I do wish I knew what fixed it tho. 
It may be a bug or like when we have to rebuild keyboard and mouse drivers after a xorg update. It may be something that others need to know about as well. Right now, we don't know what was wrong. This particular hammer just hit everything instead of one nail that was popping up. I just checked again, it is still working. I'm liking that I can watch a video whenever I want instead of when it decides to work. ;-) Dale :-) :-) ^ permalink raw reply [flat|nested] 58+ messages in thread
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-11-11 2:00 ` Dale @ 2010-11-11 12:39 ` Robin Atwood 2010-11-11 14:13 ` Alan McKinnon 0 siblings, 1 reply; 58+ messages in thread From: Robin Atwood @ 2010-11-11 12:39 UTC (permalink / raw To: gentoo-user On Thursday 11 November 2010, Dale wrote: > I just checked again, it is still working. I'm liking that I can watch > a video whenever I want instead of when it decides to work. ;-) I also have good news to report. I upgraded to Qt 4.7.0 and my xserver is a reformed character! It sits and chugs away at 1.0% CPU like it used to with Qt/KDE 3.5. No other graphics related packages were updated at the same time, so it's definitely Qt which made the difference. HTH -Robin -- ---------------------------------------------------------------------- Robin Atwood. "Ship me somewheres east of Suez, where the best is like the worst, Where there ain't no Ten Commandments an' a man can raise a thirst" from "Mandalay" by Rudyard Kipling ---------------------------------------------------------------------- ^ permalink raw reply [flat|nested] 58+ messages in thread
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-11-11 12:39 ` Robin Atwood @ 2010-11-11 14:13 ` Alan McKinnon 2010-11-11 16:11 ` Dale 0 siblings, 1 reply; 58+ messages in thread From: Alan McKinnon @ 2010-11-11 14:13 UTC (permalink / raw To: gentoo-user; +Cc: Robin Atwood Apparently, though unproven, at 14:39 on Thursday 11 November 2010, Robin Atwood did opine thusly: > On Thursday 11 November 2010, Dale wrote: > > I just checked again, it is still working. I'm liking that I can watch > > a video whenever I want instead of when it decides to work. ;-) > > I also have good news to report. I upgraded to Qt 4.7.0 and my xserver is a > reformed character! It sits and chugs away at 1.0% CPU like it used to with > Qt/KDE 3.5. No other graphics related packages were updated at the same > time, so it's definitely Qt which made the difference. So I wasn't imagining things :-) I noticed the same and it seems even a bit better with the very latest qt-4.7.1. Some time ago I read a blog post about KDE and plasma, where stuff will be rewritten for kde-4.6. I can't find it now, it might have been on kde.org, slashdot or even the gentoo planet but it was by that Aaron fellow. He also said that there were speedups possible in Qt as well. Seems like we are now getting some of that benefit. -- alan dot mckinnon at gmail dot com ^ permalink raw reply [flat|nested] 58+ messages in thread
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-11-11 14:13 ` Alan McKinnon @ 2010-11-11 16:11 ` Dale 2010-11-11 16:49 ` Dale 0 siblings, 1 reply; 58+ messages in thread From: Dale @ 2010-11-11 16:11 UTC (permalink / raw To: gentoo-user Alan McKinnon wrote: > Apparently, though unproven, at 14:39 on Thursday 11 November 2010, Robin > Atwood did opine thusly: > > >> On Thursday 11 November 2010, Dale wrote: >> >>> I just checked again, it is still working. I'm liking that I can watch >>> a video whenever I want instead of when it decides to work. ;-) >>> >> I also have good news to report. I upgraded to Qt 4.7.0 and my xserver is a >> reformed character! It sits and chugs away at 1.0% CPU like it used to with >> Qt/KDE 3.5. No other graphics related packages were updated at the same >> time, so it's definitely Qt which made the difference. >> > > So I wasn't imagining things :-) > > I noticed the same and it seems even a bit better with the very latest > qt-4.7.1. Some time ago I read a blog post about KDE and plasma, where stuff > will be rewritten for kde-4.6. I can't find it now, it might have been on > kde.org, slashdot or even the gentoo planet but it was by that Aaron fellow. > He also said that there were speedups possible in Qt as well. > > Seems like we are now getting some of that benefit. > > Well, I'm on qt-4.7 so I guess I got that already. I have noticed that KDE is getting a little faster tho. I guess they are tightening up the code a bit. I did a test just then and my video was slow again. This is what I get now. root@smoker / # glxgears Running synchronized to the vertical refresh. The framerate should be approximately the same as the monitor refresh rate. 677 frames in 5.6 seconds = 120.183 FPS 1 frames in 7.6 seconds = 0.132 FPS 1 frames in 7.6 seconds = 0.132 FPS 1 frames in 8.2 seconds = 0.122 FPS 1 frames in 7.6 seconds = 0.131 FPS 1 frames in 7.6 seconds = 0.132 FPS XIO: fatal IO error 11 (Resource temporarily unavailable) on X server ":0.0" after 46 requests (46 known processed) with 0 events remaining. root@smoker / # So, looks like emerge -e world didn't help after all. So what should I check now? Do I need to get my hammer out? Dale :-) :-) ^ permalink raw reply [flat|nested] 58+ messages in thread
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-11-11 16:11 ` Dale @ 2010-11-11 16:49 ` Dale 2010-11-11 19:21 ` Alan McKinnon 0 siblings, 1 reply; 58+ messages in thread From: Dale @ 2010-11-11 16:49 UTC (permalink / raw To: gentoo-user Dale wrote: > > I did a test just then and my video was slow again. This is what I > get now. > > root@smoker / # glxgears > Running synchronized to the vertical refresh. The framerate should be > approximately the same as the monitor refresh rate. > 677 frames in 5.6 seconds = 120.183 FPS > 1 frames in 7.6 seconds = 0.132 FPS > 1 frames in 7.6 seconds = 0.132 FPS > 1 frames in 8.2 seconds = 0.122 FPS > 1 frames in 7.6 seconds = 0.131 FPS > 1 frames in 7.6 seconds = 0.132 FPS > XIO: fatal IO error 11 (Resource temporarily unavailable) on X server > ":0.0" > after 46 requests (46 known processed) with 0 events remaining. > root@smoker / # > > > So, looks like emerge -e world didn't help after all. So what should > I check now? Do I need to get my hammer out? > > Dale > > :-) :-) > False alarm. I looked at top and noticed that that stupid hp-systray was using about 90% of my CPU. I killed that thing and now it plays fine again. I do wish they would fix that thing. For some reason hp-systray does that from time to time. It is usually after dbus has a hiccup. Logging out of KDE and back in again usually fixes it tho. Anyway, back to normal. Well, normal for me anyway. ;-) Dale :-) :-) ^ permalink raw reply [flat|nested] 58+ messages in thread
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-11-11 16:49 ` Dale @ 2010-11-11 19:21 ` Alan McKinnon 2010-11-12 1:46 ` Dale 0 siblings, 1 reply; 58+ messages in thread From: Alan McKinnon @ 2010-11-11 19:21 UTC (permalink / raw To: gentoo-user Apparently, though unproven, at 18:49 on Thursday 11 November 2010, Dale did opine thusly: > Dale wrote: > > I did a test just then and my video was slow again. This is what I > > get now. > > > > root@smoker / # glxgears > > Running synchronized to the vertical refresh. The framerate should be > > approximately the same as the monitor refresh rate. > > 677 frames in 5.6 seconds = 120.183 FPS > > 1 frames in 7.6 seconds = 0.132 FPS > > 1 frames in 7.6 seconds = 0.132 FPS > > 1 frames in 8.2 seconds = 0.122 FPS > > 1 frames in 7.6 seconds = 0.131 FPS > > 1 frames in 7.6 seconds = 0.132 FPS > > XIO: fatal IO error 11 (Resource temporarily unavailable) on X server > > ":0.0" > > > > after 46 requests (46 known processed) with 0 events remaining. > > > > root@smoker / # > > > > > > So, looks like emerge -e world didn't help after all. So what should > > I check now? Do I need to get my hammer out? > > > > Dale > > > > :-) :-) > > False alarm. I looked at top and noticed that that stupid hp-systray > was using about 90% of my CPU. I killed that thing and now it plays > fine again. > > I do wish they would fix that thing. For some reason hp-systray does > that from time to time. It is usually after dbus has a hiccup. Logging > out of KDE and back in again usually fixes it tho. > > Anyway, back to normal. Well, normal for me anyway. ;-) Whatever hp-systray is, I'd dump it. When a single app or feature kills the entire machine and it's ability to draw on the screen, it's time to find a way to do without the app. -- alan dot mckinnon at gmail dot com ^ permalink raw reply [flat|nested] 58+ messages in thread
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-11-11 19:21 ` Alan McKinnon @ 2010-11-12 1:46 ` Dale 0 siblings, 0 replies; 58+ messages in thread From: Dale @ 2010-11-12 1:46 UTC (permalink / raw To: gentoo-user Alan McKinnon wrote: > Apparently, though unproven, at 18:49 on Thursday 11 November 2010, Dale did > opine thusly: > > >> Dale wrote: >> >>> I did a test just then and my video was slow again. This is what I >>> get now. >>> >>> root@smoker / # glxgears >>> Running synchronized to the vertical refresh. The framerate should be >>> approximately the same as the monitor refresh rate. >>> 677 frames in 5.6 seconds = 120.183 FPS >>> 1 frames in 7.6 seconds = 0.132 FPS >>> 1 frames in 7.6 seconds = 0.132 FPS >>> 1 frames in 8.2 seconds = 0.122 FPS >>> 1 frames in 7.6 seconds = 0.131 FPS >>> 1 frames in 7.6 seconds = 0.132 FPS >>> XIO: fatal IO error 11 (Resource temporarily unavailable) on X server >>> ":0.0" >>> >>> after 46 requests (46 known processed) with 0 events remaining. >>> >>> root@smoker / # >>> >>> >>> So, looks like emerge -e world didn't help after all. So what should >>> I check now? Do I need to get my hammer out? >>> >>> Dale >>> >>> :-) :-) >>> >> False alarm. I looked at top and noticed that that stupid hp-systray >> was using about 90% of my CPU. I killed that thing and now it plays >> fine again. >> >> I do wish they would fix that thing. For some reason hp-systray does >> that from time to time. It is usually after dbus has a hiccup. Logging >> out of KDE and back in again usually fixes it tho. >> >> Anyway, back to normal. Well, normal for me anyway. ;-) >> > Whatever hp-systray is, I'd dump it. > > When a single app or feature kills the entire machine and it's ability to draw > on the screen, it's time to find a way to do without the app. > > Well, it makes my printer print. That's what I *think* anyway. I may not need it anymore. I need to check on that. It has been doing that once in a blue moon for a long time. Usually, it is after I upgrade dbus and the config file changes and I haven't restarted both dbus and KDE. There seems to be some sort of clash between the two, or three, and the race is on and my CPU is the gas pedal. I didn't pay much attention this very last time but once before it was dbus, hp-systray and something KDE that was fighting. Bad thing is, it started last night and raced all night long and I didn't notice it. My eyes were closed. I'm trying to run folding since it is cool now and it wasted all my folding time on worthless crap. :-@ I just wish it would suicide itself instead of making my CPU go nuts. Glad I have a HUGE CPU heat sink too. Dale :-) :-) ^ permalink raw reply [flat|nested] 58+ messages in thread
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-10-26 2:04 ` Iain Buchanan 2010-10-26 2:26 ` Dale @ 2010-10-26 9:18 ` Marc Joliet 2010-10-26 12:01 ` Iain Buchanan 2010-10-26 15:24 ` Paul Hartman 2 siblings, 1 reply; 58+ messages in thread From: Marc Joliet @ 2010-10-26 9:18 UTC (permalink / raw To: Gentoo-User ML [-- Attachment #1: Type: text/plain, Size: 485 bytes --] Am Tue, 26 Oct 2010 11:34:58 +0930 schrieb Iain Buchanan <iaindb@netspace.net.au>: [...] > Someone posted recently about an upgrade that affected him (looking... > can't find it). He downgraded to fix it, but it wasn't nvidia or x from > memory. Sorry for being vague, I'll keep looking. Ah, I seem to remember the problem was/is mesa 7.8.2 being slow, in which case a downgrade helped. Was that it? I can't find the thread myself right now, though. -- Marc Joliet [-- Attachment #2: signature.asc --] [-- Type: application/pgp-signature, Size: 198 bytes --] ^ permalink raw reply [flat|nested] 58+ messages in thread
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-10-26 9:18 ` Marc Joliet @ 2010-10-26 12:01 ` Iain Buchanan 2010-10-26 14:30 ` Alan McKinnon 0 siblings, 1 reply; 58+ messages in thread From: Iain Buchanan @ 2010-10-26 12:01 UTC (permalink / raw To: gentoo-user On Tue, 2010-10-26 at 11:18 +0200, Marc Joliet wrote: > Am Tue, 26 Oct 2010 11:34:58 +0930 > schrieb Iain Buchanan <iaindb@netspace.net.au>: > > [...] > > Someone posted recently about an upgrade that affected him (looking... > > can't find it). He downgraded to fix it, but it wasn't nvidia or x from > > memory. Sorry for being vague, I'll keep looking. > > Ah, I seem to remember the problem was/is mesa 7.8.2 being slow, in which case > a downgrade helped. Was that it? I can't find the thread myself right now, > though. That's it! mesa tinka yousa system broken! 7.8.2 down to 7.7.1 worked for the OP: http://article.gmane.org/gmane.linux.gentoo.user/234610 "Preventing a package from being updated" thanks, -- Iain Buchanan <iaindb at netspace dot net dot au> Chuck Norris doesnt wear a watch, HE decides what time it is. ^ permalink raw reply [flat|nested] 58+ messages in thread
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-10-26 12:01 ` Iain Buchanan @ 2010-10-26 14:30 ` Alan McKinnon 2010-10-30 20:12 ` walt 0 siblings, 1 reply; 58+ messages in thread From: Alan McKinnon @ 2010-10-26 14:30 UTC (permalink / raw To: gentoo-user; +Cc: Iain Buchanan Apparently, though unproven, at 14:01 on Tuesday 26 October 2010, Iain Buchanan did opine thusly: > On Tue, 2010-10-26 at 11:18 +0200, Marc Joliet wrote: > > Am Tue, 26 Oct 2010 11:34:58 +0930 > > schrieb Iain Buchanan <iaindb@netspace.net.au>: > > > > [...] > > > > > Someone posted recently about an upgrade that affected him (looking... > > > can't find it). He downgraded to fix it, but it wasn't nvidia or x > > > from memory. Sorry for being vague, I'll keep looking. > > > > Ah, I seem to remember the problem was/is mesa 7.8.2 being slow, in which > > case a downgrade helped. Was that it? I can't find the thread myself > > right now, though. > > That's it! mesa tinka yousa system broken! 7.8.2 down to 7.7.1 worked > for the OP: > > http://article.gmane.org/gmane.linux.gentoo.user/234610 > > "Preventing a package from being updated" Can someone who hit this and cured it describe the symptoms seen? I might have the same issue but it's a lot of rebuilding. I might just be suffering from the recent 2.6.33 - 2.6.35 IO issues. Throw stuff at wall doesn't strike me as an effective troubleshooting method :-) -- alan dot mckinnon at gmail dot com ^ permalink raw reply [flat|nested] 58+ messages in thread
* [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-10-26 14:30 ` Alan McKinnon @ 2010-10-30 20:12 ` walt 0 siblings, 0 replies; 58+ messages in thread From: walt @ 2010-10-30 20:12 UTC (permalink / raw To: gentoo-user On 10/26/2010 07:30 AM, Alan McKinnon wrote: > Apparently, though unproven, at 14:01 on Tuesday 26 October 2010, Iain > Buchanan did opine thusly: > >> On Tue, 2010-10-26 at 11:18 +0200, Marc Joliet wrote: >>> Am Tue, 26 Oct 2010 11:34:58 +0930 >>> schrieb Iain Buchanan<iaindb@netspace.net.au>: >>> >>> [...] >>> >>>> Someone posted recently about an upgrade that affected him (looking... >>>> can't find it). He downgraded to fix it, but it wasn't nvidia or x >>>> from memory. Sorry for being vague, I'll keep looking. >>> >>> Ah, I seem to remember the problem was/is mesa 7.8.2 being slow, in which >>> case a downgrade helped. Was that it? I can't find the thread myself >>> right now, though. >> >> That's it! mesa tinka yousa system broken! 7.8.2 down to 7.7.1 worked >> for the OP: >> >> http://article.gmane.org/gmane.linux.gentoo.user/234610 >> >> "Preventing a package from being updated" > > > Can someone who hit this and cured it describe the symptoms seen? I might have > the same issue but it's a lot of rebuilding. I might just be suffering from > the recent 2.6.33 - 2.6.35 IO issues. I had a strange glitch recently where upgrading the nvidia drivers failed to create one important symlink. I think it was this one but I can't promise: /usr/lib/xorg/modules/extensions/libglx.so -> ../../../opengl/nvidia/extensions/libglx.so.173.14.28 I use 260.19.12 on my newer machine, so the glitch may have happened on that machine instead of this one. IIRC I fixed it by creating the symlink by hand. I haven't used eselect since then to find out if my fix is permanent, though. ^ permalink raw reply [flat|nested] 58+ messages in thread
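Walt's by-hand fix would amount to something like the commands below (the exact libglx version is whatever your installed nvidia-drivers provides, so treat the one shown as an example); re-running eselect achieves the same thing less fragilely:

    ln -sf ../../../opengl/nvidia/extensions/libglx.so.173.14.28 \
           /usr/lib/xorg/modules/extensions/libglx.so
    # or, more robustly, let eselect recreate every link in one go
    eselect opengl set nvidia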
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-10-26 2:04 ` Iain Buchanan 2010-10-26 2:26 ` Dale 2010-10-26 9:18 ` Marc Joliet @ 2010-10-26 15:24 ` Paul Hartman 2010-10-26 19:33 ` Alan McKinnon ` (2 more replies) 2 siblings, 3 replies; 58+ messages in thread From: Paul Hartman @ 2010-10-26 15:24 UTC (permalink / raw To: gentoo-user On Mon, Oct 25, 2010 at 9:04 PM, Iain Buchanan <iaindb@netspace.net.au> wrote: > I'm having issues with the latest mix of nvidia-drivers, xorg, and > whatever else it might be! > > I'm getting bad performance when switching virtual dekstops and moving > windows and such. GL screensavers seem to be ok though. Same here. My fast desktop with Core i7 920, Nvidia GX 240, has a slower KDE UI than my 6-year-old laptop that has AMD Athlon 3200+ and ATI Radeon Mobility 9700. Simply opening a konsole window on my desktop with compositing enabled can take 2-3 seconds, when it is instant on the laptop. People have been complaining about it for years, KDE and nvidia-drivers don't always get along with each other. The usual answer is that it works with Intel and ATI cards, and Nvidia's drivers are closed-source, so nobody can guess what the problem is and all we can do is hope Nvidia in their ivory tower can one day bless us with an update that makes things better. And then of course there are people who have Nvidia cards and everything works great and they don't know what the complainers are talking about. :) In my personal experience, on my Nvidia machine KDE 4.2 was the fastest, and it has gotten slower with each subsequent KDE release (with 4.5 being the worst one yet - so bad that I've disabled compositing entirely). Or maybe it has gotten slower with each nvidia-drivers release over the same period of time, I can't say. Maybe it is all a coincidence. However, on my old laptop with xorg radeon drivers, it has been getting faster with each KDE release, with KDE 4.5 is the fastest one yet. Both machines run latest everything on ~amd64. The only significant configuration difference between the two is nvidia-drivers vs radeon. ^ permalink raw reply [flat|nested] 58+ messages in thread
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-10-26 15:24 ` Paul Hartman @ 2010-10-26 19:33 ` Alan McKinnon 2010-10-28 17:21 ` Paul Hartman 2010-10-28 23:30 ` Iain Buchanan 2 siblings, 0 replies; 58+ messages in thread From: Alan McKinnon @ 2010-10-26 19:33 UTC (permalink / raw To: gentoo-user Apparently, though unproven, at 17:24 on Tuesday 26 October 2010, Paul Hartman did opine thusly: > On Mon, Oct 25, 2010 at 9:04 PM, Iain Buchanan <iaindb@netspace.net.au> wrote: > > I'm having issues with the latest mix of nvidia-drivers, xorg, and > > whatever else it might be! > > > > I'm getting bad performance when switching virtual dekstops and moving > > windows and such. GL screensavers seem to be ok though. > > Same here. My fast desktop with Core i7 920, Nvidia GX 240, has a > slower KDE UI than my 6-year-old laptop that has AMD Athlon 3200+ and > ATI Radeon Mobility 9700. Simply opening a konsole window on my > desktop with compositing enabled can take 2-3 seconds, when it is > instant on the laptop. Let me weep on your shoulder with you. This laptop has a pair of these: vendor_id : GenuineIntel cpu family : 6 model : 23 model name : Intel(R) Core(TM)2 Duo CPU T9300 @ 2.50GHz 4G RAM, 1920x1200 screen and this video card: 01:00.0 VGA compatible controller: nVidia Corporation G84 [GeForce 8600M GT] (rev a1) and nvidia-drivers-260.19.12 KDE performance is pathetic especially with compositing enabled. It was really bad sometime around 4.3 with that screw up in the driver for resizing windows. Some releases are better, some worse. I'm getting disheartened trying to figure out what to downgrade: mesa, xorg, drivers, kde.... A colleague has the identical machine running Ubuntu. Gnome flies on that. I don't want to go to Ubuntu - I detest it on a laptop and detest Gnome even more. I might go back to e17 and just put up with the reduced desktop functionality. > > People have been complaining about it for years, KDE and > nvidia-drivers don't always get along with each other. The usual > answer is that it works with Intel and ATI cards, and Nvidia's drivers > are closed-source, so nobody can guess what the problem is and all we > can do is hope Nvidia in their ivory tower can one day bless us with > an update that makes things better. And then of course there are > people who have Nvidia cards and everything works great and they don't > know what the complainers are talking about. :) > > In my personal experience, on my Nvidia machine KDE 4.2 was the > fastest, and it has gotten slower with each subsequent KDE release > (with 4.5 being the worst one yet - so bad that I've disabled > compositing entirely). Or maybe it has gotten slower with each > nvidia-drivers release over the same period of time, I can't say. > Maybe it is all a coincidence. > > However, on my old laptop with xorg radeon drivers, it has been > getting faster with each KDE release, with KDE 4.5 is the fastest one > yet. Both machines run latest everything on ~amd64. The only > significant configuration difference between the two is nvidia-drivers > vs radeon. -- alan dot mckinnon at gmail dot com ^ permalink raw reply [flat|nested] 58+ messages in thread
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-10-26 15:24 ` Paul Hartman 2010-10-26 19:33 ` Alan McKinnon @ 2010-10-28 17:21 ` Paul Hartman 2010-10-28 23:30 ` Iain Buchanan 2 siblings, 0 replies; 58+ messages in thread From: Paul Hartman @ 2010-10-28 17:21 UTC (permalink / raw To: gentoo-user On Tue, Oct 26, 2010 at 10:24 AM, Paul Hartman <paul.hartman+gentoo@gmail.com> wrote: > My fast desktop with Core i7 920, Nvidia GX 240, has a > slower KDE UI than my 6-year-old laptop that has AMD Athlon 3200+ and > ATI Radeon Mobility 9700. Simply opening a konsole window on my > desktop with compositing enabled can take 2-3 seconds, when it is > instant on the laptop. Reply to myself here :) I did some Googling and found some possible explanations/workarounds. I'm away from home at the moment, so I cannot try them yet, but maybe someone else can be the guinea pig. The problem seems to be that TextureFromPixmap in Nvidia's drivers is really slow, but it is really fast in other brands' drivers. This supposedly heavily affects anything that involved a window being created or resized. (Which is exactly where I see the worst slowdowns) Changing from OpenGL to xrender will cause those actions to be much faster, but other OpenGL things like animations will be slower. Disabling direct rendering may help in some areas and hurt in others. Changing OpenGL shared memory settings may help. If someone tries these, feel free to post your results. :) ^ permalink raw reply [flat|nested] 58+ messages in thread
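The OpenGL-to-XRender switch Paul describes lives in System Settings under Desktop Effects on KDE SC 4.x; from a shell it can presumably be flipped with kwriteconfig, though the group and key names here are from memory and worth double-checking before relying on them:

    kwriteconfig --file kwinrc --group Compositing --key Backend XRender
    kwin --replace &     # restart the window manager so the new backend takes effect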
* Re: [gentoo-user] Re: Upgrading from FX-5200 to a GeForce 6200 512MB 2010-10-26 15:24 ` Paul Hartman 2010-10-26 19:33 ` Alan McKinnon 2010-10-28 17:21 ` Paul Hartman @ 2010-10-28 23:30 ` Iain Buchanan 2 siblings, 0 replies; 58+ messages in thread From: Iain Buchanan @ 2010-10-28 23:30 UTC (permalink / raw To: gentoo-user On Tue, 2010-10-26 at 10:24 -0500, Paul Hartman wrote: > On Mon, Oct 25, 2010 at 9:04 PM, Iain Buchanan <iaindb@netspace.net.au> wrote: > > I'm having issues with the latest mix of nvidia-drivers, xorg, and > > whatever else it might be! > > > > I'm getting bad performance when switching virtual dekstops and moving > > windows and such. GL screensavers seem to be ok though. > > Same here. My fast desktop with Core i7 920, Nvidia GX 240, has a > slower KDE UI than my 6-year-old laptop that has AMD Athlon 3200+ and > ATI Radeon Mobility 9700. Simply opening a konsole window on my > desktop with compositing enabled can take 2-3 seconds, when it is > instant on the laptop. the difference being that for me, it wasn't always like this. I used to have a REALLY fast snappy UI, now it's a bit sluggish, so I assume it's a version of some package, not just all nvidia drivers for this chip... -- Iain Buchanan <iaindb at netspace dot net dot au> He who hates vices hates mankind. ^ permalink raw reply [flat|nested] 58+ messages in thread