I think that kid is mostly crying because he’s got so many extra fingers that he doesn’t have a middle finger to return the gesture.
Why can AI not figure out fingers?
Hands are hard even for real artists.
Yes, Nvidia is still a shitty company like every other. But their open source drivers run pretty well by now.
Assuming you’re talking about Nouveau, it’s pretty hit or miss depending on what card you have. My previous laptop had an MX330, and it couldn’t do hardware-accelerated stuff or 120 Hz via HDMI, not to mention screen sharing on Wayland was wonky.
Oh, and it’s worth mentioning that “their” open source driver had nothing to do with Nvidia themselves; they absolutely do not care, as opposed to AMD.
I think he means NVK. It’s a whole new world. I just heard it’s ready, so it’s worth checking out, I guess.
Judging by the mesa3d docs, I don’t even have a card that supports it.
NVK currently supports Turing (RTX 20XX and GTX 16XX) and later GPUs. Eventually, we plan to support as far back as Kepler (GeForce 600 and 700 series) GPUs but anything pre-Turing is currently disabled by default.
I’m not talking about Nouveau. I’m talking about the Open Source drivers from Nvidia. They released them a while ago and they’ve gotten pretty good lately.
I heard back then that the open source driver is pretty much just a layer of abstraction and that most of the actual code runs in the firmware, which is proprietary.
The driver you’re likely referring to is NVK, which is also not developed by Nvidia. Check out the announcement post by Collabora; it says:
As said above, NVK is a new open-source Vulkan driver for NVIDIA hardware in Mesa. It’s been written almost entirely from scratch using the new official headers from NVIDIA. We occasionally reference the existing nouveau OpenGL driver […]
And also
a few months ago, NVIDIA released an open-source version of their kernel driver. While this isn’t quite documentation, it does give us something to reference to see how NVIDIA drives their hardware. The code drop from NVIDIA isn’t a good fit for upstream Linux, but it does give us the opportunity to rework the upstream driver situation and do it right.
So they’re developing a driver based on headers made available by Nvidia and some of the reverse-engineered code from regular Nouveau. In fact, it seems to be a branch of Nouveau as it stands:
Trying out NVK is no different than any other Mesa driver. Just pull the nvk/main branch from the nouveau/mesa project, build it, and give it a try.
So the “OSS drivers from Nvidia” aren’t what makes it work, it’s the whole community effort to build NVK from scratch.
Regardless, it only supports recent cards, Turing architecture and later. From the mesa3d docs:
NVK currently supports Turing (RTX 20XX and GTX 16XX) and later GPUs. Eventually, we plan to support as far back as Kepler (GeForce 600 and 700 series) GPUs but anything pre-Turing is currently disabled by default.
And still, I’m not talking about NVK, nor am I talking about Nouveau. I’m talking about the open source drivers from Nvidia. https://github.com/NVIDIA/open-gpu-kernel-modules https://developer.nvidia.com/blog/nvidia-releases-open-source-gpu-kernel-modules/
The kernel driver isn’t all you need; you still have to pair it with their closed-source drivers or with Nouveau/NVK, which are the userspace device drivers.
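If you’re unsure which kernel modules you’re actually running, one rough indicator is the module’s license string: the open kernel modules report a dual MIT/GPL license, while the proprietary ones report NVIDIA’s own license. A minimal sketch (the helper name is made up; the `modinfo` invocation in the comment assumes the module is named `nvidia`):

```shell
# classify_nvidia_driver: guess open vs. proprietary from a module license string
classify_nvidia_driver() {
  case "$1" in
    *MIT*|*GPL*) echo "open kernel modules" ;;        # open-gpu-kernel-modules
    *)           echo "proprietary kernel modules" ;; # classic closed driver
  esac
}

# On a real system you might run (assumption: module is named "nvidia"):
#   classify_nvidia_driver "$(modinfo -F license nvidia 2>/dev/null)"
```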
They released them a while ago and they’ve gotten pretty good lately.
Based on what?
Based on what I heard from people who use it. I don’t have an Nvidia card, so I can’t tell.
The narrative around Nvidia seems to have done a complete 180 in the last year or two. I’m skeptical that it’s as good as many are stating.
Are these OSS drivers maintained by Nvidia? Or is it Nouveau?
I know practically nothing about Nvidia’s drivers, but I can see them doing a 180° flip, because they want to grab a chunk of that AI market.
Even before LLMs, they were investing there, and now it’s just completely settled that tons of AI-related hardware needs to be either a beefy server machine or a low-profile edge PC. For both, Linux is very much preferable.

Nvidia has pretty much always dominated the AI GPU market with their closed source driver and CUDA. Nothing has changed about that, except for more competition in AI-specific hardware, which you can buy from several vendors now. But no one has ever used AMD cards with OpenCL for AI or ML. If you were serious about it, you always used Nvidia with CUDA, or nowadays some dedicated AI accelerator cards (DPUs).
Yes, and with the addition of NVK it’s gotten a lot better. There are still some issues on Wayland and with specific problem cards, but it’s not nearly as bad as it was even two years ago.
I’ve been using NVIDIA cards on Linux for 20 years… I don’t get this
I tried it for 2 years. After having a lot of weird issues, I finally upgraded to an AMD card, and so many of those issues went away. For one, I can install updates without worrying about breaking games or random graphical things. AMD has been way more solid.
AMD is way better than NVIDIA
That is highly subjective
For Linux it is
Weeelll, not for long. The open kernel module works like a charm for me. Wayland support is now actively worked on and is already functional for the most part. HDMI 2.1 was always supported, while AMD’s open driver isn’t able to support it. HDR and 10-bit support also just dropped.
The biggest difference is that AMD’s drivers are open source.
I don’t know who would be mad about a 3090. I’d be cool fighting Nvidia for more than 3x the performance of my RX 480.
Missing the joke here? We run a 3090 and a 3900x just fine on ArchLinux.
Maybe it’s a Wayland joke?
I’m pretty sure the 3090 works fine under Wayland these days.
Does screen recording work yet? That’s the only issue I have with my 1660 on Wayland.
It depends, but yes. Screen recording has worked under Wayland for quite some time now.
Spectacle, used in the example, isn’t very good at screen recording, but it’s the best I can manage with the file size cap on Lemmy.
You’ll likely have to do some configuration. I’d recommend using OBS Studio and PipeWire, as well as checking the AUR for patched versions if needed (if you happen to be on Arch, of course).
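Wayland screen capture typically goes through PipeWire and the xdg-desktop-portal, so a quick sanity check is whether both are actually running. A hedged sketch with a made-up helper that just scans a process listing you pass in:

```shell
# check_wayland_capture_deps: scan a process listing for the services that
# Wayland screen capture usually relies on (PipeWire + the desktop portal).
check_wayland_capture_deps() {
  missing=""
  for svc in pipewire xdg-desktop-portal; do
    case "$1" in
      *"$svc"*) ;;                  # service found in the listing
      *) missing="$missing $svc" ;; # service not found
    esac
  done
  if [ -z "$missing" ]; then echo "ok"; else echo "missing:$missing"; fi
}

# On a real system you might feed it the actual process list:
#   check_wayland_capture_deps "$(ps -e)"
```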
Question: Is buying a laptop with an Nvidia graphics card a bad idea for Linux (XFCE user)?
Yes, IMO. If you haven’t bought the hardware yet, there’s no reason to subject yourself to the headache of lacking Linux support; instead, support companies that value open source.
AMD and Intel GPUs simply work out of the box with all features. And it’s not like you need the highest of high-end graphics acceleration on a laptop anyway.
Thanks
I use the integrated graphics of my Ryzen 9 7940HS, and they’re more than enough for all my workloads. Lightweight gaming also works pretty well on it.
If you actually have a choice, stay away from Nvidia, although it’s been a hot minute since I last saw a laptop with an AMD GPU.
Yes.
If possible, get a laptop with a Thunderbolt 3 port and a compatible external GPU.
I own a 15-inch Omen with a 3060. It has some issues, but it works fine. However, my next one will definitely be AMD.
One major issue is that I have to run my compositor (Mutter, for GNOME on Fedora) on the Nvidia drivers, not the integrated AMD GPU, otherwise external monitors do not work at all. This is a problem because the dedicated GPU cannot go to sleep and constantly uses at least 15 watts, reducing the battery life.
Another issue is that, a lot of the time, my laptop won’t wake up after sleeping. I have checked the logs, and I am 90% sure it is because I log in to my session using the dedicated GPU. If you don’t need an external monitor, or if you have a dedicated MUX switch, you should not have to face any of these problems.
A few minor problems: I cannot use the official builds with the Nvidia drivers if I want to use secure boot; for that, I have to rely on third-party developers. An issue I saw some time ago was that, when I used Manjaro, the maximum TDP of the GPU never exceeded 79 watts. On Fedora it goes up to 95 watts, and on Windows it used to go up to 100 watts. Also, there is some software, like the keyboard lighting manager and BIOS updater, that works on Windows only, not even in a VM. And the fans never exceed 4099 RPM on Linux, whereas on Windows they could go up to 6500. But I have always seen Linux to be 10-20% faster in my Blender render tests.
I hope this helps. If you have any questions, feel free to DM.
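For the battery drain issue mentioned above, one thing worth checking is the kernel’s runtime power-management status for the dedicated GPU; if it never reports `suspended`, the card never sleeps. A rough sketch (the helper name is made up, and the PCI address in the comment is only an example):

```shell
# dgpu_power_state: interpret the value of a PCI device's
# /sys/bus/pci/devices/<addr>/power/runtime_status file
dgpu_power_state() {
  case "$1" in
    suspended) echo "asleep" ;;      # dGPU powered down, good for battery
    active)    echo "awake" ;;       # dGPU drawing power
    *)         echo "unknown: $1" ;; # e.g. "suspending", "error"
  esac
}

# On a real system (example PCI address, yours may differ):
#   dgpu_power_state "$(cat /sys/bus/pci/devices/0000:01:00.0/power/runtime_status)"
```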
Thank you so much for such a detailed reply.
I’ve got a laptop with nvidia graphics and I’ve never had a problem.
Most of the serious problems have to do with Wayland, so XFCE will be fine. I’m running it on a T480 with a GeForce MX150 without trouble.
If it’s a good deal, take it. Even if you do decide to switch to Wayland at some point, those issues should be mostly fixed soon™.
Yeah, XFCE uses the X protocol for now, but maybe the next release will use Wayland; that’s why I was asking. Thanks!
They’re adding Wayland support, but XFCE moves slowly and focuses on stability. I doubt X11 will be gone any time soon.
Oh Thanks
Will it be called WFCE?
The funny thing is that the vast majority of NVIDIA GPUs are probably used in Linux-based systems because of the ML/AI hype.
This is what happens when you have the money to write software for your hardware and not give back to open source, because $$$.
I just bought an Nvidia GPU because my main use is VFIO, and AMD is hard to make work.
I’m still salty.