• palordrolap@kbin.social
    7 months ago

    You want to win me over? For starters, provide a layer that supports all the hooks and features of xdotool and wmctrl. As I understand it, nothing close to that exists yet, and some of it may even be deliberately impossible “for security reasons”.

    I know about ydotool and dotool. They’re something but definitely not drop-in replacements.
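
    To make the gap concrete, this is the kind of scripted window management xdotool and wmctrl allow on X11, and that ydotool/dotool cannot replicate (a sketch; the window titles are just examples):

    # Find a window by title, then move and resize it, no mouse involved.
    xdotool search --name "Firefox" windowmove 0 0 windowsize 960 1080
    # Maximize a window by title.
    wmctrl -r "Terminal" -b add,maximized_vert,maximized_horz
    # ydotool and dotool only synthesize input events; there is no
    # equivalent of search/windowmove, because Wayland exposes no global
    # window list to clients.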

    Unfortunately, I suspect I’ll be forced onto Wayland at some point, because the easy-to-use distros will switch to it, and I’ll just have to get used to moving and resizing my windows manually with the mouse. Over and over. Because that’s secure.

  • jmankman@lemmy.myserv.one
    7 months ago

    I’m not touching Wayland until it has feature parity with X and gets rid of all the weird bugs, like the cursor size randomly changing and my jelly windows being blurry as hell until they’re done animating.

    • AMDIsOurLord@lemmy.ml
      7 months ago

      Sure, let me dust off my fucking SPARCstation and connect to my fucking NIS server so I can fuck off and log in to my Solaris server and run X11.

      Fucking WHO needs mainframe-oriented network transparency in the 21st century? Leave that shit in 1989 where it belongs.

        • AMDIsOurLord@lemmy.ml
          7 months ago

          I know for a fact that an Nvidia GT 730 under Nouveau and an Intel HD 2500 can run Wayland without issues.

          • interdimensionalmeme@lemmy.ml
            7 months ago

            You misunderstand. I don’t want crap graphics on every computer; I want the 4090 driving every computer without having to buy one per machine.

            That’s what you could do with network transparency.

            • AMDIsOurLord@lemmy.ml
              7 months ago

              RDP (Remote Desktop Protocol) works leaps, bounds and miles better than the 1989 X11 network transparency system ever did, especially since X11 was never designed for hardware-accelerated compositing or 3D apps.

              • interdimensionalmeme@lemmy.ml
                7 months ago

                PCs were not intended to have more than 640 KB of RAM, and yet here we are.

                The blame for this decrepitude of X11 can be placed squarely on Nvidia, whose unlimited profit ambitions were at odds with keeping that functionality working.

                RDP is the anachronism. Why would I want to stream a whole desktop environment, with its own separate taskbar, clock, and user session? Especially given how janky and laggy it is.

                No, I want to stream -just- the application. It should use my system’s color and temperature scheme, interoperate with my clipboard and drag & drop, and be basically indistinguishable from a locally running app, except streaming at 500 Mbps hardware-encoded AV1, 12 ms latency max, 16K resolution (yes, that is not a typo), 16-bit HDR, HDR that actually works, and sound that works too, works every time, 8-channel 192 kHz 24-bit lossless. Also capable of pure IP multicast streaming. Yes, that means one application instance visible on multiple computers at the same time, usable by multiple users at the same time, with -no- need for the app to be aware of any of this.

                Do that with no jank and I’ll sing Wayland’s praises.
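
                For the record, the closest Wayland gets today is waypipe, which proxies a single application over SSH much like X forwarding did, and which is still nowhere near that wishlist (a sketch; host and app names are placeholders):

                # X11 network transparency: remote app, local display.
                ssh -X media-box mpv video.mkv
                # Wayland's rough equivalent: waypipe tunnels the Wayland
                # protocol over the same SSH connection.
                waypipe ssh media-box mpv video.mkv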

    • mr_right@lemmy.dbzer0.com
      7 months ago

      Was gonna say the same thing. HDR is like FLAC and expensive amps for audiophiles. Maybe we should start calling them visualphiles? 🤷‍♂️

      • Count Regal Inkwell@pawb.social
        7 months ago

        “FLAC? Mate, I destroyed my ears when I was 14, listening to Linkin Park MP3s grabbed off Kazaa in the cheapest Chinese earbuds my allowance could buy, at the highest volume my fake iPod could drive. I couldn’t hear the subtleties in your FLAC if I tried.”

        Cheek aside, I believe the word would be “videophiles”, to pair with “audiophiles”.

  • onlinepersona@programming.dev
    7 months ago

    Does Wine run on Wayland?

    Edit: had to look up what the hell HDR is. Seems like a marketing gimmick.

    Anti Commercial AI thingy

    CC BY-NC-SA 4.0

    Inserted with a keystroke running this script on linux with X11

    #!/usr/bin/env nix-shell
    #!nix-shell -i bash --packages xautomation xclip
    
    # Give the focused window a moment to settle before pasting.
    sleep 0.2
    # Wrap this script's own source in a spoiler block and copy it
    # to the X11 clipboard.
    (echo '::: spoiler Anti Commercial AI thingy
    [CC BY-NC-SA 4.0](https://creativecommons.org/licenses/by-nc-sa/4.0/)
    
    Inserted with a keystroke running this script on linux with X11
    ```bash'
    cat "$0"
    echo '```
    :::') | xclip -selection clipboard
    # Synthesize Ctrl+V with xte (X11 only) to paste into the focused window.
    xte "keydown Control_L" "key V" "keyup Control_L"
    
    
    • Eager Eagle@lemmy.world
      7 months ago

      It isn’t; it’s just that marketing is really bad at showing what HDR is about.

      HDR means each color channel that used 8 bits can now use 10, sometimes more. That’s an increase from 256 shades per channel to 1024, allowing a wider range of shades to be displayed in the same picture and avoiding the color banding problem.
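
      The arithmetic, as a quick shell check:

      # Shades per channel and total RGB colors at 8 vs 10 bits.
      echo "8-bit:  $((2**8)) shades/channel, $((2**8 * 2**8 * 2**8)) colors"
      echo "10-bit: $((2**10)) shades/channel, $((2**10 * 2**10 * 2**10)) colors"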

      • onlinepersona@programming.dev
        7 months ago

        Thank you.

        I assume HDR then has to be explicitly encoded into images (and video) for true HDR, otherwise it’s just upsampled? If that’s the case, I’m also assuming most media out there is not encoded with HDR, and if that’s correct, does it really make a difference? I assume upsampling means inferring new values, probably with Gaussian interpolation, dithering, or some other method.

        Somewhat related: my current screens support 4K, but when watching a 4K video at 60 fps side by side on a 4K screen and a 1080p one, no difference could be seen. It wouldn’t surprise me if the same were true of HDR, but I might be wrong.

        • Eager Eagle@lemmy.world
          7 months ago

          Yes, from capture (the camera) all the way to distribution, the content has to preserve the HDR bit depth. Some content on YouTube is in HDR (noted in the quality settings along with 1080p, etc.), but the option only shows up if the content is HDR and the playback device has HDR capabilities.

          Regarding streaming, there is already a lot of HDR content out there, especially newer shows. But stupid DRM has always pushed us to alternative sources when it comes to playback quality on Linux anyway.

          no difference could be seen

          If you’re not seeing a difference between 4K and 1080p, though, even up close, maybe your media isn’t really 4K. I find the difference quite noticeable.
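
          If in doubt, ffprobe reports both the real resolution and the HDR signalling (the filename is a placeholder):

          # HDR10 content typically shows pix_fmt=yuv420p10le,
          # color_transfer=smpte2084 and color_primaries=bt2020.
          ffprobe -v error -select_streams v:0 \
            -show_entries stream=width,height,pix_fmt,color_transfer,color_primaries \
            -of default=noprint_wrappers=1 video.mkv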

          • onlinepersona@programming.dev
            7 months ago

            Yes, from capture (the camera) all the way to distribution, the content has to preserve the HDR bit depth.

            Ah, that’s what I thought. Thanks.

            If you’re not seeing a difference between 4K and 1080p, though, even up close, maybe your media isn’t really 4K. I find the difference quite noticeable.

            I tried with the best-known test video, Big Buck Bunny. Their website is down now and only the Internet Archive has it, but I did the test back when it was up. I also found a few 4K videos on YouTube and elsewhere. Maybe the people I tested with and I just aren’t sensitive to 4K video on 30-35 inch screens.

            • accideath@lemmy.world
              7 months ago

              aren’t sensitive to 4K video

              So you’re saying you need glasses?

              But yes, it does make a difference how much of your field of view is covered. If it’s a small screen and you’re relatively far away, 4K isn’t doing anything. And of course, you need a 4K-capable screen in the first place, which is still not a given for PC monitors, precisely due to their size. For a 21″ desktop monitor, it’s simply not necessary. Although I’d argue that less than 4K on a 32″ screen that’s about an arm’s length away (as on a desktop) is noticeably low-res.
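
              Back-of-the-envelope, using the common rule of thumb that normal acuity tops out around 60 pixels per degree:

              # Pixels per degree for a given horizontal resolution, screen
              # width (inches) and viewing distance (inches).
              ppd() { awk -v px="$1" -v w="$2" -v d="$3" 'BEGIN {
                fov = 2 * atan2(w / 2, d) * 180 / 3.14159265
                printf "%s px over %.1f deg = %.1f px/deg\n", px, fov, px / fov }'; }
              ppd 3840 28 28   # ~32" 4K at arm's length: ~72 px/deg
              ppd 1920 28 28   # 1080p on the same screen: ~36 px/deg, visibly soft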

              • onlinepersona@programming.dev
                7 months ago

                So you’re saying you need glasses?

                No. Just like some people aren’t sensitive to 3D movies, we aren’t sensitive to 4K 🤷
