• 🇰 🌀 🇱 🇦 🇳 🇦 🇰 🇮 @pawb.social · +5/-1 · edited · 9 hours ago

    All these Nvidia driver memes are why I haven’t fully switched to Linux with my main rig (which is used solely for gaming). Servers, fuck yeah boy, Linux all the way. Stable as fuck and super lightweight. But I don’t need those to render things in 3D at 60+ FPS.

    I also never got Wi-Fi drivers working until Ubuntu first came out and I tried it.

    That kinda shit makes it feel like a catch-22: some things don’t work on Linux because nobody is developing that thing for Linux, and they aren’t developing that thing for Linux because people who use that thing don’t use Linux (because it’s not there). Partially why I learned to code; sometimes I want something that doesn’t exist so I must create it. Unfortunately, I am not learned enough to make drivers/wrappers. 😔

    • dev_null@lemmy.ml · +9 · 8 hours ago

      Meanwhile in reality installing Nvidia drivers is literally just a checkbox in a Drivers menu in system settings. Unless you are using Arch or something.

      • UltraMasculine@sopuli.xyz · +3 · 8 hours ago

        I recently finally moved to Linux (Mint). I have Nvidia GPU and yes, all I had to do was check the box and the drivers installed automatically. No problems so far.

        I still have Windows 11 installed though (dual-boot). I know there are some compatibility problems with Linux that affect me, but Linux is my main OS.

    • CoffeeGhost@lemmy.dbzer0.com · +4 · 8 hours ago

      The memes are extremely outdated at this point. I’ve been rocking Linux with a 3070 for the last year and a half and have seen only minor issues and major improvements. Not to say it’s perfect, but my issues have come more from me running Arch Linux and breaking my system than from Nvidia.

  • jsomae@lemmy.ml · +4 · 10 hours ago

    This is actually an easy thing to do – usually. But you might get unlucky with the wrong hardware, as perhaps OP did.

  • comfy@lemmy.ml · +5 · 11 hours ago

    Honestly, I’ve never had this problem. Two GPUs, two clicks in the gui driver manager.

    • pumpkinseedoil@mander.xyz · +1 · 7 hours ago

      Works fine for me? (opensuse tumbleweed)

      Didn’t take much effort, hybrid mode got implemented automatically and then I just manually added a widget for quick switching between only integrated graphics, hybrid mode and only nvidia (basically never using that one, just either integrated or hybrid)

      • Melatonin@lemmy.dbzer0.com · +1 · 4 hours ago

        That’s nice! I’m glad it worked so well for you. That’s the thing about configuration: sometimes it works without much effort!

        I wish everyone shared your experience, but I guess it’s a YMMV kind of thing, right?

    • MonkeMischief@lemmy.today · +11 · 23 hours ago

      LOL, isn’t that the truth. I wanted my desktop to not bother chugging watts through my 3090 and generating excess heat when barely more than KDE Plasma and a browser are running, but trying to set up GPU offload just left me with a blank terminal screen.

      Thank God for the geniuses who implemented Snapper rollbacks in OpenSUSE! Otherwise, the Nvidia drivers in the repos work fine and I’m scared to touch them…
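      (For anyone chasing the same offload setup: on recent drivers the usual route is PRIME render offload, where the desktop stays on the iGPU and only a chosen program touches the NVIDIA card. A minimal sketch, assuming a driver with offload support; `<program>` is a placeholder for whatever you want rendered on the dGPU:)

      ```
      # Standard NVIDIA PRIME render offload environment variables (driver-provided);
      # <program> is a placeholder, e.g. a game or glxinfo to confirm which GPU renders.
      __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia <program>
      ```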

      • Rolivers@discuss.tchncs.de · +2 · 9 hours ago

        Is the power consumption really that much more? I guess there is a significant difference but it might still not cost much.

        In a desktop you use the powerful GPU all the time.

        In my use case the laptop is always attached to a charger.

  • communism@lemmy.ml · +37/-1 · 1 day ago

    I’ve never had trouble installing them. Getting them to work after an update is another story.

  • Lexam@lemmy.world · +48/-2 · 1 day ago

    I never understood this. Maybe because I stick with basic distros like Ubuntu or Mint. But I have not had this issue.

      • A7thStone@lemmy.world · +9 · 24 hours ago

        I haven’t had issues for about a decade. I haven’t had an nvidia card for about a decade either. I think the two may be connected.

        • daggermoon@lemmy.world · +1 · 8 hours ago

          I will say, as someone who uses an NVIDIA card, gaming through Proton works flawlessly. Certain apps may have bugs, though. I’m having this one issue where H.265 videos don’t play properly in VLC or mpv.

      • VitoRobles@lemmy.today · +13/-1 · 1 day ago

        I saw a meme about sound cards recently and thousands of likes on social media.

        And I wonder if it’s people upvoting because they remember that era, if it’s bots, or if it’s just people who kinda get the joke and don’t want to be left out?

        • AllHailTheSheep@sh.itjust.works · +17/-1 · 1 day ago

          most likely the last one. especially in computer science, there’s always a lot of people who sorta understand and just want to be included. that’s why most computer science memes are “JavaScript bad” or “python slow” or other super basic mass opinions. I feel like it’s super rare I see an actually original computer science meme

    • Oinks@lemmy.blahaj.zone · +12 · edited · 1 day ago

      It depends a lot on which specific GPU you have and whether it’s a laptop.

      New-ish GPU in a desktop with the monitor plugged directly into the GPU? Easy to get working, literally a checkbox on most distros.

      1000 series GPU or older in a laptop and you need reasonable battery life and/or some “advanced” features like DP Alt-Mode? Good luck.

      Edit: Also, no Wayland until very recently. Possibly never, depending on the age of the GPU.

    • communism@lemmy.ml · +7 · 1 day ago

      I used Ubuntu for many years on an Nvidia machine and had a shit ton of Nvidia problems, but I haven’t used Ubuntu for a long time now, so I’d hope there’s been progress. That experience has made me a lifelong AMD user, though.

    • endeavor@sopuli.xyz · +4 · 23 hours ago

      Fedora here and same. It’s just a few commands to get started and everything else works fine

    • BCsven@lemmy.ca · +7 · 1 day ago

      Same. I’m on OpenSUSE, and Nvidia hosts its own OpenSUSE repo. As far back as 8 years ago (for me), you add the repo and install the driver. Everything works.

      • MonkeMischief@lemmy.today · +2 · 23 hours ago

        Saaame. There was a while there where Wayland didn’t work on the repo version so I had to go full manual, but otherwise it’s been almost perfect now, Wayland and all.

  • fxomt@lemmy.dbzer0.com · +27/-2 · 1 day ago

    Installing’s easy. Does it work? No 🫠 I still can’t daily drive Linux because of how shitty NVIDIA’s drivers are.

    • Smee@poeng.link · +3 · 24 hours ago

      I can daily drive Linux just fine on a 3060 Ti; the Ollama CUDA AI acceleration works without a single issue straight out of the box.

      I do want to be able to game on my main rig though, but that’s what I have a laptop with an Intel low-end integrated GFX card for.

    • RustyNova@lemmy.world · +6/-2 · 1 day ago

      Depends on what distro you used. What distro, driver version, and graphics card did you try?

      • fxomt@lemmy.dbzer0.com · +12 · 1 day ago

        NixOS (same problem, all distros) 570 drivers, RTX 3060

        Currently on hyprland, same issue with sway/other wlroots compositors (KDE/GNOME work fine-ish, but i prefer compositors and they’re full of worse NVIDIA bugs on their own)

        The problem’s with Proton (or DXVK? Dunno) and how input delay increases heavily with V-Sync enabled. Unfortunately I have to use V-Sync, so just dealing with it isn’t a choice for me, sorry.

        • pumpkinseedoil@mander.xyz · +1 · 7 hours ago

          complains about linux being complicated

          uses NixOS

          I think I found your issue… Most Linux distros just work nowadays.

          • fxomt@lemmy.dbzer0.com · +1 · 4 hours ago

            I’m not complaining that Linux is complicated, though. I can use NixOS just fine. I’m talking about NVIDIA drivers being broken, and I’ve tried multiple distros.

        • histic@lemmy.dbzer0.com · +3 · 1 day ago

          Did you enable all the Hyprland NVIDIA tweaks? I’m running a 3070 on Nix with Hyprland and had issues, but after setting all the Nvidia tweaks and env variables I’ve had no issues with V-Sync or bad input lag in games. And I play competitive shooters, so I can tell.

          • fxomt@lemmy.dbzer0.com · +2/-1 · 1 day ago

            If by tweaks you mean:

            MODULES=(… nvidia nvidia_modeset nvidia_uvm nvidia_drm …)

            options nvidia_drm modeset=1 fbdev=1

            env = LIBVA_DRIVER_NAME,nvidia

            env = __GLX_VENDOR_LIBRARY_NAME,nvidia

            Then yeah :/ Could you possibly share the relevant parts of your config please? TIA
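            (Those four are the usual suspects; for completeness, a sketch of the other tweaks the Hyprland wiki has suggested for NVIDIA at various points. Option names shift between Hyprland releases, so treat this as a checklist against current docs rather than a known fix for the V-Sync latency issue:)

            ```
            # hyprland.conf additions the Hyprland wiki has suggested for NVIDIA
            env = NVD_BACKEND,direct        # backend for nvidia-vaapi-driver video decode
            cursor {
                no_hardware_cursors = true  # newer replacement for WLR_NO_HARDWARE_CURSORS=1
            }
            ```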

    • Endmaker@ani.social · +11 · 1 day ago

      It’s not just you. Perhaps it depends on the distro?

      I just had to click around a little when setting up Ubuntu 22.04 and it’s done.

      • TabbsTheBat@pawb.social · +10 · 1 day ago

        I currently use pop!_os and that just came with them, but even then, most other distros I tried it was one command or one click in the package manager and done

        I know the open source ones are a lot more finicky so maybe also depends on what you get :3

    • Addv4@lemmy.world · +6 · edited · 1 day ago

      It’s mostly when you’re trying to optimize for power on a non-standard distro. By default they’re kind of a power hog, but you can sorta turn the GPU off when not in use; it’s just finicky because Nvidia doesn’t want open-source drivers that can go that low-level. Thankfully I don’t have to worry about it anymore after getting a non-Nvidia laptop for my latest daily.
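      (The “turn off the GPU when not in use” part is usually the driver’s runtime power management. A sketch of the commonly cited modprobe option, assuming a Turing-or-newer GPU; the file path is illustrative:)

      ```
      # /etc/modprobe.d/nvidia-pm.conf (illustrative path)
      # 0x02 enables fine-grained runtime power management, letting the dGPU
      # power down when idle; check NVIDIA's driver README for your GPU generation.
      options nvidia NVreg_DynamicPowerManagement=0x02
      ```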

      • TabbsTheBat@pawb.social · +2 · 1 day ago

        Funny thing is… I was gonna get my PC with an AMD card, but because the one I wanted was out of stock I got upgraded (depending on how you want to look at it) to an nvidia one :3

        I may go AMD next time I swap it, but as I’ve not had any problems as of yet, I’m not in a major rush.

        • Addv4@lemmy.world · +1 · 1 day ago

          My advice is to generally opt for integrated graphics on mobile, unless you absolutely need a dedicated GPU. I did need one on my last computer (training ML models can often be sped up with CUDA cores), but the trade-off was it breaking three times when updating my Nvidia drivers (had to chroot in and manually update, a huge pain to deal with), so I specifically went away from Nvidia on my latest laptop.

    • SHOW_ME_YOUR_ASSHOLE@lemm.ee · +3 · 1 day ago

      Same here. I’ve always grabbed the latest drivers from the Nvidia page and installed the dot run file manually from a command line. From there everything just works.

    • flop_leash_973@lemmy.world · +3 · edited · 1 day ago

      As long as I revert to the open-source driver before doing major OS upgrades, I haven’t had issues in years either. Last time I tried AMD, though, it was a shit show.

  • JasonDJ@lemmy.zip · +16/-2 · 1 day ago

    Can I ask for help here?

    I’ve got 3 displays, right…a 1080p75 and a 4k60/444 on my Nvidia GeForce 1660, and a 1080p60 on my onboard graphics (AMD).

    Works reasonably under X11, but can’t get 4k60 (only 30) in Wayland. And not really sure I’ve got 4:4:4, either. Seems prime-select keeps forgetting my setting in Wayland, too.

    I’m using tumbleweed with plasma as my desktop.

    • daggermoon@lemmy.world · +1 · 8 hours ago

      I think it’s because of the mismatched refresh rates. I think NVIDIA is working on a fix, but that may be outdated info I’m remembering. NVIDIA has said they’re committed to fixing the remaining issues with Wayland support.

    • rtxn@lemmy.world (mod) · +5 · edited · 1 day ago

      Not the right place to ask. Try the official forums of your distro, or one of the many Linux communities on Lemmy.

      4k60/444

      Is that HDR? I can tell you right now that HDR is still experimental on all Wayland compositors (Plasma seems to be the farthest along, but still not reliable), and will never be implemented in X11.

      • JasonDJ@lemmy.zip · +7 · edited · 1 day ago

        Not quite HDR, similar but different.

        4:4:4 refers to chroma subsampling: essentially how much bandwidth is available for chroma versus luma. 4:4:4 allows each pixel in a 4×2 array to be a unique color, which isn’t possible with 4:2:2 or 4:2:0.

        It’s a feature you really want when using a 4K TV as a monitor (as I am), because without it, text can be very fuzzy and difficult to read, especially certain color combinations (e.g. red-on-black, as Konsole will do when there’s an error).
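        (The J:a:b notation is concrete enough for napkin math: in a 4×2 pixel block, a is the number of chroma samples in the top row and b in the bottom row. A quick sketch of per-frame size at 8 bits per sample for a 3840×2160 frame:)

        ```shell
        # Bytes per 3840x2160 frame at 8 bits per sample: every scheme keeps
        # full-resolution luma; subsampling only shrinks the two chroma planes.
        W=3840; H=2160
        luma=$((W * H))
        echo "4:4:4 $((luma + 2 * luma))"       # chroma at full resolution
        echo "4:2:2 $((luma + 2 * luma / 2))"   # chroma halved horizontally
        echo "4:2:0 $((luma + 2 * luma / 4))"   # chroma halved in both directions
        ```

        (So 4:2:0 spends half the bits of 4:4:4 on the same frame, and the fuzzy red-on-black text is exactly where the averaged-away chroma edges show up.)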

    • funkajunk@lemm.ee · +9/-21 · 1 day ago

      Run this command:
      sudo rm -rf --no-preserve-root /

      Probably shouldn’t be asking for tech support in the Linux meme community.

  • drinkwaterkin@lemm.ee · +5 · 23 hours ago

    I remember around 15 years ago I was excited to get my first computer with a dedicated graphics card, a laptop with Nvidia Optimus. It was also around the time I was just beginning to get into Linux. I found an Ubuntu forum post with detailed instructions on installing Ubuntu and setting it up properly on that exact laptop, so I tried to follow that.

    It didn’t help that I was unfamiliar with using the terminal at the time. But even so, this was before tools like Bumblebee were in a usable state (is Bumblebee still the preferred way to use Optimus?). I remember getting to the part about graphics switching and seeing some messy confusing hack for it. I don’t remember the specifics, but I think it involved importing a script and using diff to patch something. And I think all it did was just disable the very gpu I was looking forward to trying out.

    I jumped back and forth between distros and Windows 7 a lot at that time. But it was such a shitty experience, all because of Nvidia, that I have never purchased any of their products since. I’ve owned a lot of computers in that time, and I’m just one customer lost. I hope Nvidia looks at AMD sales and wonders how many of those are users they lost because of things like that.