• balsoft@lemmy.ml · 1 day ago

    I don’t generally disagree, but

    You don’t just double the current you send over USB and expect cable manufacturers to adapt

    That’s pretty much how we got to the point where USB is the universal charging standard: by progressively pushing the allowed current from the initially standardized 100 mA all the way to today’s 5 A. A few of those pushes were just manufacturers winging it and pushing/pulling significantly more current than what was standardized, assuming the other side would adapt.

    • xthexder@l.sw0.com · 1 day ago

      The default standard power limit is still the same as it ever was on each USB version. There’s negotiation that needs to happen to tell the device how much power is allowed, and if you go over, I think overcurrent protection is part of the USB spec for safety reasons. There are a bunch of different protocols, but USB always starts at 5 V and 0.1 A for USB 2.0, and devices need to negotiate for more. (0.15 A, I think, for USB 3.0, which has more conductors.)
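
      A rough sketch of how those defaults stack up (the unit-load numbers are the USB 2.0 / 3.x spec values; the function and the negotiated_units idea are just illustrative, not a real API):

        # Illustrative sketch of USB default vs. negotiated current limits.
        # Unit-load values follow the USB 2.0 / 3.x specs; everything else
        # is a simplification for the sake of the example.
        UNIT_LOAD_MA = {"usb2": 100, "usb3": 150}   # one unit load
        MAX_UNITS    = {"usb2": 5,   "usb3": 6}     # 500 mA / 900 mA ceilings

        def allowed_current_ma(spec: str, negotiated_units: int = 1) -> int:
            """Current a device may draw: one unit load until it asks for more."""
            units = min(negotiated_units, MAX_UNITS[spec])
            return units * UNIT_LOAD_MA[spec]

        print(allowed_current_ma("usb2"))      # 100 mA before any negotiation
        print(allowed_current_ma("usb2", 5))   # 500 mA once configured
        print(allowed_current_ma("usb3", 6))   # 900 mA on USB 3.x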

      As an example, under USB 2.0 a charging port (5 V / 1.5 A max) can signal itself by putting a 200 ohm resistor across the data pins.
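
      A toy model of that detection (not the real charging-port state machine; the names and the measurement are made up, the 200 ohm figure is the one mentioned above):

        # Toy model: a device treats the port as a dedicated charger if the
        # data lines look shorted together (through at most ~200 ohms).
        DCP_MAX_OHMS = 200

        def classify_port(dplus_to_dminus_ohms: float) -> str:
            if dplus_to_dminus_ohms <= DCP_MAX_OHMS:
                return "charging port: may draw up to 1.5 A at 5 V"
            return "standard port: stay at the default until negotiated"

        print(classify_port(50))      # shorted data pins -> charger
        print(classify_port(10_000))  # ordinary host port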

      • balsoft@lemmy.ml · 1 day ago

        The default standard power limit is still the same as it ever was on each USB version

        Nah, the default power limit started at 100 mA, or 500 mA for “high-power” devices. There are very few devices out there today that limit the current to that amount.

        It all began with non-spec host ports which just pushed however much current the circuitry could muster, rather than just the required 500 mA. Some had a proprietary way to signal just how much they were willing to push (this is why iPhones used to be very fussy about the charger you plugged them into), but most cheap ones didn’t. Then all the device manufacturers started pulling as much current as the host would provide, rather than limiting themselves to 500 mA. USB-BC was mostly an attempt to standardize some of the existing usage, and USB-PD came much later.

        • xthexder@l.sw0.com · 1 day ago

          A USB host providing more current than the device supports isn’t an issue though. A USB device simply won’t draw more than it needs. There’s no danger of dumping 5 A into your 20-year-old mouse, because it defaults to being a low-power 100 mA device. Even if the port can supply 10 A at 5 V or something silly, the current is limited by the voltage and the load (the mouse).
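
          Back-of-the-envelope version of that (made-up numbers, just Ohm’s law):

            # The load sets the current: a mouse drawing 100 mA at 5 V looks
            # like a ~50 ohm load, no matter how capable the supply is.
            supply_volts = 5.0
            supply_max_amps = 10.0                 # hypothetical, absurdly capable port
            mouse_load_ohms = supply_volts / 0.1   # device designed to draw 100 mA at 5 V

            drawn_amps = supply_volts / mouse_load_ohms
            print(drawn_amps)                        # 0.1 A, set entirely by the load
            print(min(drawn_amps, supply_max_amps))  # still 0.1 A -- the 10 A ceiling never matters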

          • balsoft@lemmy.ml · 23 hours ago

            Well, the original comment was about “pushing more current through than the spec”, and that’s pretty much what we did…

            • xthexder@l.sw0.com · 21 hours ago

              Well, regardless, the spec only cares about devices drawing more current than the host can supply, and that has always been consistent. Electricity doesn’t really work in a way the host can “push” current; the only way it could do that would be with a higher voltage, which would damage anything not designed for it. But that’s what the USB-PD spec is for: negotiating what voltage to supply, up to 48 V now.
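
              Roughly what that negotiation boils down to (a simplification of PD’s capabilities/request exchange; the fixed voltages are real PD levels, everything else is made up for illustration):

                # Sketch of USB-PD voltage negotiation: the source advertises the
                # fixed voltages it can supply, the sink requests the highest one
                # it was built for, and 5 V stays the fallback.
                source_offers_v = [5, 9, 15, 20, 28, 48]   # e.g. an EPR-capable charger
                sink_supports_v = {5, 9, 20}               # what this device can handle

                def negotiate_voltage(offers, supported):
                    usable = [v for v in offers if v in supported]
                    return max(usable) if usable else 5    # everything starts at 5 V

                print(negotiate_voltage(source_offers_v, sink_supports_v))  # 20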

              • balsoft@lemmy.ml · 13 hours ago (edited)

                Electricity doesn’t really work in a way the host can “push” current

                On a basic level this is precisely how electricity works: a power supply literally pushes electrons by creating a difference in electric field magnitude between two points; or, in other words, by applying an electromotive force to the electrons; or, in other words, by creating a voltage between two points. A load then does something with those electrons that usually creates an opposing electric field, be it heating a wire, spinning a motor, or sustaining a chemical reaction within a battery. The power produced by the source and released at the load is proportional to (voltage) × (number of electrons pushed by the supply per unit of time), i.e. voltage times current; usually, this is the limiting factor for a power supply: it can hold a steady voltage until it has to push too many electrons, and then the voltage starts dropping.
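
                In numbers that’s just P = V × I, with the supply’s current limit as the ceiling (an idealized sketch; a real supply droops more gradually):

                  # Power is voltage times current; the supply holds its voltage
                  # until the load asks for more current than it can push.
                  def supply_output(load_amps, rated_volts=5.0, max_amps=3.0):
                      if load_amps <= max_amps:
                          return rated_volts, rated_volts * load_amps
                      # Past the limit, model the voltage sagging (hard current cap).
                      sagged_volts = rated_volts * max_amps / load_amps
                      return sagged_volts, sagged_volts * load_amps

                  print(supply_output(2.0))   # (5.0, 10.0)  -- holds 5 V
                  print(supply_output(6.0))   # (2.5, 15.0)  -- voltage sags, current capped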

                Edit: I see what you mean now. Yeah, for a given voltage, it is the load that determines the current, so there’s no safety issue with this for the load. However, there could be issues with the cables. IIRC there was an issue with noise introduced by higher current draws that meant you couldn’t charge and transfer data at the same time with some cables.