In other news, I do think desk makers should start incorporating the USB dock inside the top board of a desk. People spend a lot of money and put up with a lot of bullshit to keep their setup clean, especially those who swap computers.
Will be obsolete in 5 years every time
Hopefully there will be a matching sized one (60mm) doing something useful, but if not you can just put a normal grommet cover in and do the stuff under the desk.
I'm using these things a bunch.
It’s called GEO and it’s becoming a thing.
The connector is solid but my god have there been disasters because of USB-C.
1. Power delivery up to high wattages, not always with auto-sensing. 2. Wires rated for different data transmission speeds. 3. Standard USB data and Thunderbolt over the same connector and wire, but most accessories are not rated for Thunderbolt.
Omg I love it and I hate it.
....like?
USB-C hubs and my slow descent into madness (2021) - https://news.ycombinator.com/item?id=30911598 - April 2022 (513 comments)
The main gist was that almost every hub used the same board.
I was under the impression that the USB protocol just fell back to 5V at 1A when power negotiation was inconclusive.
What kinds of devices?
The manufacturers cheaped out by not including the right resistors.
And if you spin the new board and it works with the A->C cable sitting on your desk, then what could possibly be different about plugging it into a C<->C cable, right?
The spec was written by a SW engineer, this explains some things. /s
I have 2 powerbanks that cannot be charged via the USB-C port when at 0%. The signaling circuitry simply doesn't work. No idea who designed this. It is silly beyond belief. I have to charge them with a normal 5V A-to-C cable for 30 seconds, then unplug, and then all the PD stuff starts working and I can fast-charge via the USB-C port again. I'm screwed without an A-to-C cable.
Through a pair of resistors.
The unpowered device connects each of the two CC pins to the ground pin through a separate resistor of a specific value. The cable connects one of these CC pins to the corresponding pin on the other end of the cable (the second pin on each side is used to power circuitry within the cable itself, if it's a higher-spec cable). On the host side, each of the two CC pins is connected to the power supply through a separate resistor of another specific value. When you plug all this together, you have the host power supply connected to ground through a pair of resistors, which is a simple voltage divider. When the host detects the resulting voltage on one of the CC pins, it knows there's a device on the other end which is not providing power, and it can connect the main power pins of the connector to its power supply.
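A rough sketch of that divider math, assuming nominal values as I recall them from the Type-C spec (Rd about 5.1 kOhm on the sink; Rp about 56k / 22k / 10k to 5V on the source depending on advertised current; treat the exact figures as approximate):

    # Illustrative CC-line voltage divider: source pull-up (Rp) to 5 V,
    # sink pull-down (Rd) to ground. Resistor values are assumed nominals.
    RD = 5.1e3  # sink pull-down, ohms

    # Source pull-ups to 5 V and the current they advertise (assumed values)
    RP_OPTIONS = {
        "default USB power": 56e3,
        "1.5 A capable":     22e3,
        "3.0 A capable":     10e3,
    }

    for label, rp in RP_OPTIONS.items():
        v_cc = 5.0 * RD / (rp + RD)  # simple voltage divider
        print(f"{label:18s}: Rp = {rp/1e3:4.0f}k -> V_CC ~ {v_cc:.2f} V")

    # The sink reads V_CC against fixed thresholds to learn how much current
    # it may draw; the source sees a non-floating CC pin and enables VBUS.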
Out-of-spec USB-C devices sometimes skip that, and out-of-spec USB-C chargers often (somewhat dangerously) always supply 5V, so the two mistakes sort of cancel out.
Since there aren’t any active chips in these cables, an A-to-C cable happens to have 5V hot on the USB-C side, but this shouldn't be relied on, as it isn't true for C-to-C.
We can't have nice things.
Cheapness, bad design, shortcuts, etc. will result in an out-of-spec cable being necessary for an out-of-spec device to work correctly with an in- or out-of-spec hub. It's terrifically frustrating, but a fact of the world.
And this isn't just random no name knockoffs. The Raspberry Pi in certain versions didn't correctly follow the power spec. Both the Nintendo Switch and Switch 2 either incompletely, incorrectly, or intentionally abused the USB spec. The Lumen metabolism monitoring device doesn't follow the USB spec. This is one of those things where you want a bit of a walled garden to force users of a technology to adhere to certain rules. Especially when power and charging is involved which can cause fires.
That’s what consumer protection laws with teeth and electric safety certifications like CE or UL are for, not walled gardens.
History has shown that relying on hardware DRM, like Apple did with Lightning, doesn't prevent manufacturers from doing dangerous things, because they'll find ways around it sooner rather than later.
Found plenty of people online with the same issue but no resolution.
Finally just paid the $25 to get the OEM SteelSeries replacement cable and it charges fully again. wtf… I guess the replacement cable was USB-A to C and I’ve only tried USB-C to C cables?
They seem to be a popular brand, but can’t even get charging right. Ironically, the speaker doubles as a portable charger.
So sometimes I have to plug it, realize nothing is happening, unplug, flip the cable and plug it again for it to start charging.
Your Pixel 4A is entering debug accessory mode (DebugAccessory.SNK state in the USB-C port state machine); other devices probably don't support debug accessory mode and just shrug.
But I believe the specs are that way to avoid problems with OTG devices. If both devices decide to just push a voltage into the cable at the same time, you risk dissipating a great deal of energy or even starting a fire. That said, there are better ways to deal with this issue; they are just slightly more expensive.
And a rant, for free: Holy smokes, I think OTG may be the second most braindead marketing-drivel acronym to come out of tech, right behind Wi-Fi (wireless... fidelity? What does that even mean?)
It does include a pull-down resistor, but wired incorrectly (compliant devices need two), which results in compliant chargers only correctly detecting it when using a “dumb” (i.e. containing no e-marker chip) USB-C-to-C cable. Your Apple cable probably has a marker (all their MacBook charging cables have one, for example).
I had to get this USB "power blocker" that only passes the data pins through, otherwise the Pi runs off the computer it is plugged into all the time
That's because USB 3 is not natively provided by the SoC on the RPi 4, but rather by a dedicated IC, connected to the main SoC via PCIe :)
Hence there are two USB 3 ports and two USB 2 ports, but wired completely differently internally! Presumably the USB-C port is connected to the SoC more directly.
A couple 5.1k resistors add about $0.00001 to the BOM cost. The terrible mistake is on the designers of devices who try to forego these.
It's unlikely that anything will be damaged, but the device likely will not work until the issue is resolved.
Because all these supplies work with transistors, they do not act as a load to each other. It's as if the two had a diode on the output (in fact they do have one, but not directly at the output).
This is my typical experience on HN lately; it's getting full of people with absolutely no idea what they're talking about, who constantly downvote good comments.
This is utter nonsense, Ohm's law doesn't magically stop working with a transistor. I do know about this, I've designed power supplies and USB devices, and I've destroyed more than a few components accidentally by connecting two switching supplies together. Yes, there will be current flowing, and yes, sometimes a fuse or breaker will trip, I have experienced this many times, and just because you haven't doesn't mean it doesn't happen.
>It's as if the two had a diode on the output (in fact they do have one, but not directly at the output)
Sounds like you're referring to either ESD protection diodes, or flyback diodes, neither of which do anything in the case of two similar but unmatched power supplies.
I'd advise you to get a degree in engineering (as I have), or do some serious studying, as this kind of uninformed discussion is not productive or helpful to anyone, it's just noise.
Man… you really are a piece of work.
Let me make a last attempt, even when I know it will fail:
1) “Ohm's law doesn't magically stop working with a transistor”
Ohm's law works only with linear components; it is a linear relation. So NO, it does not work for a transistor or a diode. No, it doesn't. Not because of your magic ignorance, but because they are not frigging linear! Go study some physics.
2) No, I was certainly not referring to ESD diodes, but to the rectifier at the end of any SMPS. Some may have a last-stage linear regulator; in that case the diode is part of the junction of the output transistor. At any rate, ANY wall-mounted power supply, and 99% of all supplies in the world, will just shut down when the output is higher than the target voltage. GO TEST IT AND STOP with your nonsensical replies.
BTW: the 1% of supplies that do regulate down are called “four-quadrant supplies”; they are much more complicated and expensive, and make no sense to use in a USB charger.
I don't care which degree you have; if you really do have one, and it was expensive, ask for your money back. Unless it's a degree in prompt engineering…
If you have two power supplies at different voltages and connect them together, there will be a finite resistance through the cable and Ohm's Law applies. Current will flow. With a low resistance and big enough voltage difference, there will be a significant current, and it can trip the supply. This is not difficult to achieve.
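To put illustrative numbers on that (these figures are made up for the sake of the example, not from the thread):

    # Back-of-envelope: two hard voltage sources tied together through cable resistance.
    v1 = 9.0       # volts, supply A (e.g. a source negotiated up)
    v2 = 5.0       # volts, supply B
    r_cable = 0.2  # ohms, assumed round-trip resistance of a short cable

    i = (v1 - v2) / r_cable  # Ohm's law applied to the voltage difference
    print(f"about {i:.0f} A would try to flow")  # ~20 A: plenty to trip OCP or pop something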
The rectifier you're referring to is the flyback diode in that case. But now as you've said yourself, the power supply will shut down if the voltage coming in is too high, which is frequently due to either a polyfuse or an efuse tripping. So it sounds like you're just arguing to argue, while actually agreeing with my point. You said "nothing will happen", but if one shuts off, something has happened.
I don't need to test this, I have done it. I also have quite literally thousands upon thousands of other engineers, books, universities etc backing me up, and you do not. Connecting two USB supplies together is a bad idea and will likely result in one switching off. Don't do it.
Either way, I'm done trying to convince an amateur. Feel free to do what you want.
“If you have two power supplies at different voltages and connect them together, there will be a finite resistance through the cable and Ohm's Law applies.”
That is an error that only a person with minimal knowledge from TV shows could make. That is absolutely not true for regulated power supplies, SMPS or linear, because (even if the latter is called linear) they are not linear. So no, you cannot apply Ohm's law.
“The rectifier you're referring to is the flyback diode in that case. But now as you've said yourself, the power supply will shut down if the voltage coming in is too high, which is frequently due to either a polyfuse or an efuse tripping”
Noooo, I'm talking about the diode at the end of any SMPS, buck, boost, flyback or whatever type, or even a linear regulator. It's about how a regulated power supply works: as the voltage at the output goes up, it shuts down and stops the output current, in an attempt to lower the voltage. There is no fuse in 99% of supplies out there, because they have active protection. It's called a transistor, not a polyfuse.
“I also have quite literally thousands upon thousands of other engineers, books, universities etc backing me up”
That is what you think, but you are wrong. Sorry, mate. If you just went to a lab and put two power supplies in parallel, like I do pretty much daily, you would see you are wrong.
You are the most ignorant, obstinate and arrogant person I've seen on HN… and HN is full of them. I'm 100% sure you studied CS at a mediocre university, work as a SW dev, and think you can argue with an EE. You say you work with electrical stuff, but your other posts reveal you are a SW dev. Obviously you are a liar, trying to be right when you are so obviously wrong.
I do care what you do, because with such arrogance and incompetence combined, you are going to get somebody hurt or killed. Please start studying and stop being so stubborn when you are just wrong!
https://www.mikrocontroller.net/topic/458093
Basically the OP is asking what kind of resistors he needs so that he can get 5V out of USB-C.
The first response is "No, 5V is always present." (incorrect)
The second response by another poster is "5V is ALWAYS present at the USB port..."
Only the fourth person actually answers the question and does it in a single sentence: "There needs to be 5k1 between CC and GND."
Is it a charge-only cable (no data)? Is it USB 3 at 5Gbps? Is it 100W Power Delivery? Is it Thunderbolt 3/4/5?
For example: USB 5Gbps is the same regardless of whether it's labeled USB 3.0, 3.1, or 3.2 Gen 1. In fact, customers don't need to know that "3.0" number; they just need to know their ports support 5Gbps.
[0]: https://www.usb.org/sites/default/files/usb_type-c_cable_log...
Even if it were purely about cost, I think we still benefit: the alternative is that cheaper devices use a different, cheaper cable spec, and you end up with many different cable types that you can't use at all. Sure, maybe I won't get max speed from one cable, but still being able to get half speed and/or power is better than a cable I can't use at all.
Honestly, I've just never gotten this criticism: the alternative is "have completely different cables", which has literally all the same problems but worse, apart from "it can be hard to identify what this cable supports", and that is solvable in much better ways than making them incompatible (as with the cable I'm looking at on my desk right now, which explicitly says 40Gbps/240W on it).
I grew up in the era of every mobile phone having a different charger, and it was always a nightmare when you were at a friend's house and they didn't have the one you need. I may not get max charging speed, but now everyone always has a charger I can use for any of my devices. Please, I don't want to go back.
This seems like what will happen anyway to avoid cable confusion. Nobody wants to carry more than one type of cable.
I've been on a MacBook Air M1 for the last few years and wanted multiple screens, so I got a USB 3 hub (Dell D6000) which does DisplayLink. I had almost everything hooked in there, but still connected one screen directly via USB-C. DisplayLink is good for an M1 as you can add loads of screens if you want, but you can't watch streaming services on them, as macOS thinks your screen is being 'watched'.
I did want a thunderbolt hub but as far as I could tell at the time Thunderbolt and Displaylink just didn't come in the same package, so I was stuck with two cables.
Three years on, I picked up an M4 machine that can do two external screens natively, great, I can reach my goal of only plugging one cable into my macbook. But the Dell can't do that on a Mac because of something or other meaning it would work as mirrored-only.
Time to grab an actual thunderbolt dock. I picked up an HP TB3 refurb (HP P5Q58AA) which was cheap (30 AUD) and seemed great on paper, only to find it needed a type of power adaptor I didn't have that put me back another 60 bucks, and when I got it all hooked up it didn't always work and couldn't support decent refresh rates on my monitors, with one always being stuck at 30Hz. There was a software update available, but for that you need a windows machine with a TB3 port, not something I have.
So then I grabbed a Kensington SD5750T, which was another 140 bucks, but I am pleased to report that it just works and supports dual 4k60 over thunderbolt/USB-C. There is no HDMI or Displayport on this thing, but my monitors both have USB-C in so... Unfortunately, now that I've read the article, I can also confirm it contains a Realtek 0x8153, and is an OEM'd 'goodway' dock.
Just as well I'm happy with wireless networking!
LOL. Welcome to the world of OEM/ODM. As a conservative estimate I'd guess >95% of all consumer electronics is done this way. Even the big names like Apple, Dell, Lenovo, etc. do it.
However, if you are - according to Wikipedia - a two-billion-dollar company like Realtek, then I expect you to get your shit together. There are exactly zero excuses for Realtek to not have a driver release ready almost a year after Big Sur has been announced. Zero.
Mac users are in the minority. It's worth noting that the RTL8153 is a native RNDIS device, which has its history on the Windows side, and Realtek has only started contributing drivers to Linux relatively recently.
FWIW I've had great luck with Realtek NICs, although I don't specifically recall using their USB NICs.
I envy you. Realtek NICs (especially over USB) are tantamount to trash in my mind after 2 decades of fighting their frequent failures: failure to initialize at all, driver crashes, piss-poor feature sets (or claiming to have features that don't work at all), and a myriad of other problems. Granted, they usually work in Windows, but I don't work in Windows (I live and work in Linux/BSD). It has become my personal policy to avoid or disable any Realtek NICs and replace them with something actually functional whenever possible.
Hopefully their work on Linux-side drivers will change this, given their proliferation.
To be honest I've yet to find a reliable USB based network interface regardless of chipset/brand/manufacturer, outside of the ones that do PCIe passthrough via USB4/Thunderbolt and those tend to be quite expensive (though they are starting to come down in price).
Ironically, the only problems I've had with NICs were on an Intel and a Broadcom.
Most certainly. Doesn't change the fact that Realtek being present is a huge red flag, even if it's not a cheap device, regardless of whether it's Realtek's fault or the OEM/ODM/SI's that integrated them into the system in question. It basically screams "we phoned this part in"; though that's certainly not always true, it's true enough that I refuse to use them (be it by disabling them or just opting for entirely different hardware so I can avoid the headache).
Broadcom is certainly better than Realtek, but it's still in the "replace at soonest possible convenience" tier as well. Intel is far, far more reliable in my experience (save for some of their combo Bluetooth/Wi-Fi cards); their dedicated wired Ethernet cards have always been great for me. The i210/i211 class of integrated bottom-tier ones can be hit and miss, though.
I had reliability issues using a Realtek 2.5Gbps USB network interface. It kept locking up or having latency issues, until I switched which USB port I plugged it into (one that used a different chipset), and after that it was solid.
Realtek itself (Questionable quality on a good day)
The implementation of Realtek by the ODM/OEM/SI in whatever part is being shipped; given that Realtek is the de facto "budget" option for networking, it's often done as cheaply and shoddily as possible, even if the chip itself actually isn't crapware.
And the USB interface, as you point out. There's a whole rabbit hole that I'm unfortunately all too familiar with when it comes to diagnosing and dealing with USB. PCIe passthrough via a USB4/TB combo chip isn't as reliable as just using PCIe directly, but it's still better than a non-PCIe-passthrough USB interface.
Are they truly though, given that MacBooks were the first to drop all ports except USB-C, pushing people to look into buying hubs?
Yeah they cost more but they actually work properly.
In my experience USB ethernet adapters send out pause frames which shit-tier switches replicate to all ports in direct contravention of the ethernet specifications.
Other than that it seems to work fine. Even though it's only 100W I haven't had to use the Dell 130W charger in a while.
I slowly replaced my home network piece by piece, trying to find the bottleneck that was causing my gigabit internet to top out at ~300Mbps in my office on the other side of the house from the modem.
After replacing the Ethernet cable run from the second floor to the basement with fiber optic... and the switches in between... and seeing no improvement... I tried a different computer with a built-in Ethernet port on the same cable, and pulled 920Mbps.
The problem... was my CalDigit Thunderbolt dock. I replaced it with an OWC one from Facebook Marketplace for cheap and it solved the problem... roughly $500 in. I'm still angry I didn't check that earlier.
My network is 10 gigabit now though.
(although.... looks like it's a realtek with the r8169 driver)
My work laptop has just a single USB-C cable plugged into the monitor for everything making it super trivial to plug it back in when I use it away from my desk (which I do regularly).
My personal desktop has a single DP and also a USB A to B cable. The monitor has KVM capability so I can super conveniently switch between them even with the two screens.
Cables are completely minimized with this set up, I’m very happy.
The only thing that’s unfortunate is that I occasionally work on a MacBook Pro 16” M4 and it can’t drive the second monitor over the same USB-C cable as Apple CBA to have support for DP MST on even their premium priced hardware. So I have to also plug in an HDMI cable to the second monitor.
Also unfortunate with the MacBook Pro is that macOS UI scaling doesn't allow ratios like 125%, meaning the UI elements aren't quite at my preferred size. Windows 11 handles this perfectly.
[0] https://www.dell.com/en-us/shop/dell-pro-27-plus-video-confe...
I'm really glad we went with that, so far they've been great (for software devs, no color sensitive stuff).
Try BetterDisplay, and enable "Flexible Scaling" (a per-display setting).
Bonus: enable "UI Scale Matching" and output will be same physical size across all displays (as long as they report proper DPI, but I think you can hack that as well right there)
They do this for MBP market segmentation now, but really, this started with the Pro Display XDR.
At the time, people wondered about how they were able to drive the 6K 10 bit display, as the math didn't fit the spec.
People like me found out. Because they completely fucked with DP1.4 specs to make it happen with Big Sur.
In Catalina, my Intel Mac Pro happily drove 2 4K HDR monitors at 144 Hz. Upgrade to Big Sur, not any more. 95Hz for 4K SDR, 60Hz for 4K HDR. Not the cables, not the monitors. Indeed, "downgrading" the monitors advertised support to DP 1.2 gave better options, 120Hz SDR, 75Hz HDR.
Hundreds of reports, different card/cable/monitor combos.
Was at least still the case as of Ventura, which makes sense, because it wasn't a bug, it was just an Apple style fuck you to anyone not fully invested in "all Apple, all the time".
Another problem is that USB-A ports are dirt cheap and simple to implement, so hub makers feel like they're "leaving free IO on the table" by not sprinkling them on everything. Whereas each "decent" USB-C port has enough complexity to make you think twice about adding it.
Nevertheless, there are a couple of options. Try searching for "USB-C only hub". You will get some results, but they are basically the identical product (same IO card), just with different housings. So you can pretty much count on these specs: 1 USB-C in for power, 3–4 USB-C out at 5 or 10Gbps each, Power Delivery at various wattages, no video support.
I have one of these on my desk right now, it's from the brand "Minisopuru", I get power and four USB-C "3.2 Gen 2" ports. It's fine. But like I said, it's no Thunderbolt, and no video support, so I have to "waste" the other port on my MacBook just for my external display.
There are also Thunderbolt / USB4 devices which will give you a bonkers amount of IO, including good TB / USB-C ports usually (plus some USB-A of course, as a spit in the face – so you'd need to ignore those). But these are not hubs, they are docks, which is a different product class entirely (big and heavy, more expensive, dedicated power supply).
Something I've been doing recently to salvage the USB-A ports I still begrudgingly encounter, while continuing to (force myself to) upgrade all my devices to type-C, are these: [0]. 1-to-1 USB-A male to USB-C female adapters. I just stick them in all USB-A ports I encounter, leave them there all the time, and move on with my life. It's a bit bulky and looks kinda stupid, but it basically gives me USB-C everywhere I need (including work-issued PCs and docking stations) for just a couple of bucks. For low-bandwidth devices like headphones, keyboard / mice / gamepads, or even my phone, it works perfectly fine.
[0] – https://www.amazon.com/UGREEN-Adapter-10Gbps-Converter-Samsu...
https://www.aliexpress.com/item/1005008363288681.html
But you can also find them in smaller hub form.
It's the power consumption.
IIRC, USB-C has a base power per port of 15W (5V @ 3A) with just basic CC resistors. USB 2 starts at 0.5W (5V @ 0.1A) and is only supposed to allow 2.5W (5V @ 0.5A) after negotiation. USB 3 is 4.5W (5V @ 0.9A).
Note that the Caldigit hub linked in a sibling has a power supply of 20V @ 9A. That's 180W!
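For a sense of scale, here are those per-port budgets side by side (using the figures quoted above as nominal values):

    # Rough per-port power budgets quoted above: (volts, amps) -> watts.
    budgets = {
        "USB 2 (unconfigured)": (5.0, 0.1),
        "USB 2 (configured)":   (5.0, 0.5),
        "USB 3":                (5.0, 0.9),
        "USB-C, CC resistors":  (5.0, 3.0),
    }

    for name, (v, a) in budgets.items():
        print(f"{name:22s}: {v * a:4.1f} W")

    # Several 15 W type-C ports plus the hub's own electronics quickly
    # explain a 20 V x 9 A (180 W) brick.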
And the cheapest way to measure bit error rates?
Debug tools for e.g. FPGA transceivers or some redriver chips can measure BER and show fake eye diagrams. They're in the few-hundreds to few-thousands of USD range, but you may need a company to sign NDAs. E.g.:
https://www.intel.com/content/www/us/en/docs/programmable/68...
But customers can not see the difference between good and bad USB cables, just their design and their price.
Slightly off-topic, but I wonder if "beamer" is a "real" German word or one they borrowed from English in a weird way.
The first time I heard it completely threw me for a loop, as in the UK, "beamer" is shorthand for a BMW car.
It’s both :) See also: “Handy” for mobile phone.
Of course given the history of English it's not unlikely that "beam" came from German roots and they have the same meaning.
Same goes for the 3.5mm jack on phones. The freaking adapters are just awful to use. They are just bad and they always break. The port gets so loose that after 3 months of use they just start falling out of it. There is no decent phone left with a 3.5mm jack, which is a really sad state of things... Unless you know of one? Feel free to suggest.
Sony Xperia phones traditionally have a 3.5 jack and a normal screen (without holes and other nonsense for the camera).
They will start appearing, but this seems like a no-brainer to me.
So now it's 2 cables: 1 from the hub, 1 from the monitor. Both USB-C.
WTF guys?
My Apple monitor from 2009 just worked with 1 cable (no power, but still).