Does anyone know of a good comparison resource?
This is a little bit more interactive and detail-oriented. I think they also have flashy onesheet PDFs that are more marketing oriented.
For a typical consumer home use case, continuing to use 2.4 GHz is most likely ideal. Some apartment complexes have such bad 2.4 GHz interference, though, that even that might not be universal.
Really it's the same reason computers moved to 5 GHz, and now 6 GHz.
To an IoT application, it's the difference between chatting with a friend at a quiet outdoor cafe and trying to shout at her in a crowded bar.
5 GHz doesn't propagate very far, and putting IoT devices inside your home on 5 GHz makes a lot of sense. With 6 GHz coming online and being reserved for high-bandwidth applications, 5 GHz for IoT makes even more sense.
I think moving from Make in the old version of IDF to CMake was a mistake.
Considering that they are supporting Linux, there was no real reason to make it so Linux-specific that all other Unix-like systems got excluded.
And just like any build system for any language/stack, there is a small group of hardcore "enthusiasts" who create and push their one true build tech to rule them all, and then there is the large majority of people who have to deal with it and just want to build the damn thing.
I mean, I use it, but I'm not very happy about it.
My impulse purchase has been tempered with "eh, do I really need it?"
That alone puts US-based sellers at a mega disadvantage compared to cheap Chinese goods - and it's not a good thing.
These things are tiny and very cheap to ship. I could probably pack 40 of them into a USPS flat rate box shipped anywhere in the US for $9.30.
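A quick sanity check of those numbers (the $9.30 flat-rate figure and the 40-unit packing estimate are from the comment above, not verified):

```python
# Back-of-envelope per-unit shipping cost for small modules
# in a USPS flat rate box, using the comment's figures.
flat_rate_cost = 9.30  # USD, quoted flat rate box price
units_per_box = 40     # rough packing estimate

per_unit = flat_rate_cost / units_per_box
print(f"about ${per_unit:.2f} per unit shipped")
```

So shipping works out to pennies per device, which is the point being made.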
This was a decent argument when you could get things shipped from China for $0.50, but not now or in this case.
If all you need is Zigbee/BLE and a few IO pins, an nRF52840 dongle is still $10 on DigiKey.
interface class 3 (a Human Interface Device), sub class 1 (Boot protocol, or Boot interface)
In contrast, sub class 0 of HID is just the ordinary case, which can be arbitrarily complicated (six thousand keys and four-axis input? Why not), and so understandably a BIOS or similar environment might not implement that, but a full-blown OS usually does.
Doubtless tools exist for other platforms which can show the same information as lsusb
It was an eye opener for me.
Language does not matter (I used Go and Ruby) as long as bindings are reasonable.
So your original comment / request was regarding USB specifically then?
I ask because I'd have guessed (possibly incorrectly!) that by implementing HID via GATT (BLE) you'd be able to support anything the BLE hardware revision could implement?
Perhaps the disconnect for me is that it's unclear when there is some special hardware that exists within the ESP32 itself (I think I2C, I2S, etc would be examples of this) vs something you are just implementing by manipulating the IO pins. Perhaps HID is one of those things?
Maybe not applicable for this new one, but that's my understanding for the S3/C5 models (something like 16 MB NAND flash and 8 MB PSRAM).
It also says 320 KB of ROM, which seems low. Judging from the product name (DevKitC-1-N8R4) and their other products, it has 8 MB of flash.
Can anyone answer my question: will the C5-WROOM be a pin-for-pin drop-in replacement for a C6?
Contact them directly and you might get 10 at this point.
Does it have CAN?
How does the core compare to their old ones?
I'm a little disappointed that it only has one core even though I haven't used the second one on the older chips yet.
Pinout for the dev board.
https://www.erlendervik.no/ESP32-C5%20Beta_ESP32-P4_ESP8686_...
So you can't really use it yourself unless you don't want the wifi to be reliable.
One of the main reasons RISC-V is gaining popularity is that companies can implement their own cores (or buy cheaper IP cores than from ARM) and take advantage of existing optimizing compilers. Espressif are actually a perfect example; the core they used before (Xtensa) was esoteric and poorly supported and switching to RISC-V gives them better toolchain support right out of the gate.
The reason is that CPU cores form only a tiny part of the SoC; the rest of the SoC is proprietary, likely documented only to whatever level the company needs, and what documentation exists is hidden under layers of NDAs. Just because the ISA is open source does not mean you know anything about the rest of the chip.
Having said that, the C5 is a nice SoC, and it is nice that we have some competition to ARM.
For example, put a sleep(100us) as a hook before packet transmission to allow capacitors to recharge between packets.
Had to do this on a design powered by a cr2032 because the peak power draw from those batteries is really limited.
Though note that for most of its life, the cr2032 will deliver slightly below the ESP's minimum spec of 3V. From experience, that's not really an issue.
If you can get away with spending most of your time in the ESP's deep sleep state however, battery life is gonna be way better, and that's probably what you'd want to do if you're using a cr2032. In deep sleep, the ESP32-C3's data sheet says it consumes around 5µA (0.005mA). With 200mAh, this gives a battery life of 40 000 hours, or 4.5 years. At those time scales, battery self-discharge becomes relevant, so let's say a couple of years.
So a decent mental model is: you can do a few hours of cumulative compute over the course of a couple of years.
Unless you decide to boost the voltage of the cr2032 to be within the ESP's spec. In that case, the whole deep sleep discussion might be moot; I suspect the regulator's own power draw would absolutely dominate the 5µA of the ESP. But I'm not super familiar with the world of power regulators, maybe there are ultra low power regulators which can do the job with minimal losses even in the µA regime.
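The deep-sleep arithmetic above works out like this (figures from the comment: ~5 µA deep-sleep current per the ESP32-C3 datasheet, ~200 mAh nominal CR2032 capacity; self-discharge ignored):

```python
# Rough CR2032 battery-life estimate for an ESP32-C3 spending
# essentially all of its time in deep sleep.
capacity_mah = 200.0    # nominal CR2032 capacity
deep_sleep_ma = 0.005   # 5 uA deep-sleep draw

hours = capacity_mah / deep_sleep_ma
years = hours / (24 * 365)
print(f"{hours:.0f} h, about {years:.1f} years")
```

Which matches the "4.5-ish years before self-discharge matters" ballpark.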
I also don't think it's too unreasonable to use a C3 as an MCU in settings where a radio isn't required. The IC itself (aka not the wroom module etc) isn't that much more expensive than equivalently specced MCUs without a radio, and if you're already more familiar with developing for the ESP software ecosystem it might well be worth the few pennies extra. The ESP32-C3FH4X (so the IC with 4MB onboard flash, super easy to integrate into in a design) only costs $1.6 per unit on LCSC for 100 units (and mouser etc is similar).
Why do I have this sickening feeling that in a few years anyone doing anything with hardware is going to be ordering everything direct from China, like we're some kind of undeveloped client state?
This transition happened so quickly that most people haven't fully caught up to the implications. In my mind, China is already the center of gravity.
It's been well over a decade since I was doing embedded design professionally, so my perspective is coming more from a hobby/3d printing/"maker" place. But it feels like one of the main results of these tariffs is that Chinese and Chinese-adjacent sellers will stop preloading so much stuff into US warehouses ahead of sale and instead just ship orders direct from China. Using a US warehouse means the seller has to front the money for the tariffs as well, and takes the risk of them being lowered depending on Krasnov's whims. Whereas shipping direct from China, even if the seller is handling the tariffs (eg Aliexpress Choice), they've already got the cash in hand from a confirmed purchase.
Edit: oh the MSP430 is neat! If I cared about battery life (driving 200 mA of LEDs anyway...) I'd totally use that.
Use something based on the STM32WLE5JC or a discrete SX1262.
Yes, there's a security angle, but if I have the chip in my hands, I should be able to flip some pin to reprogram the chip and prevent all the e-waste.
If you're the vendor, you can add a tamper-resistant or tamper-evident design to raise the cost of component-replacement attacks. Which can be countered by whole-device replacement, which in turn is countered by device identity attestation, and so on, in an endless arms race.
Should we just be pushing harder for "Works with Home Assistant" certification?
Through hole parts need a lot more heat across a bigger area, or you have to go pin by pin. I've scorched many a through hole board trying to desolder something, cursing at those who didn't socket the chip in the first place.
Want to annoy a repair person? Pot the whole thing in epoxy.
For ESP these modules are the WROOM line.
To my understanding, there's nothing specifically preventing companies from giving the user the ability to disable write protection or load their own signing keys, but it means that the default will be to have locked-down devices and companies will have to invest extra resources and take extra risks with regard to certification into enabling users to do what they want with the hardware. I predict that the vast majority of companies making random IoT crap won't bother, so it's e-waste.
An (equally narrow ;)) quote:
"ensure that vulnerabilities can be addressed through security updates, including, where applicable, through automatic security updates that are installed within an appropriate timeframe enabled as a default setting, with a clear and easy-to-use opt-out mechanism, through the notification of available updates to users, and the option to temporarily postpone them;"
Thus, I expect RED to stipulate only radio firmware to be locked down to prevent you from unlocking any frequencies but the CRA to require all other software to be updatable to patch vulns.
I don't doubt you with regard to what the RED and the CRA actually says. However I'm afraid that my understanding of it better reflects the practical real-world implications of companies who just need to go through the certification process.
18031 requires an update mechanism for most products, yes, however it has some very stringent requirements for it to be considered a Secure Update Mechanism. I sadly don't have the 18031 standard anymore so I can't look up the specific decision nodes, but I know for sure that allowing anyone with physical access to just flash the product with new unsigned firmware would not count as a Secure Update Mechanism (unless, I think, you can justify that the operational environment of the product ensures that no unauthorized person has physical access to the device, or something like that).
EDIT: And I wanted to add, in one common use case for microcontrollers, namely as one part of a larger product with some SoC running Linux being the main application processor and with MCUs handling specific tasks, you can easily get a PASS in all the EN-18031 decision trees without an upgrade mechanism for the MCUs themselves. In such products, I can imagine a company deciding that it's easier to just permanently lock down the MCU with a write protect than to justify leaving it writeable.
These IoT manufacturers keep making all of these new products but the thing is, an ESP32 from several years ago is not that much different than one from today. They don't need much compute, anything difficult can take place on the cloud. So how do you sell someone new hardware if the first gen device is still perfectly capable? How do you sell a premium version if it's just the same parts inside? For the former, you can EoL a product by blocking it from cloud services (like Nest this week). If the firmware is locked, a hobbyist can't just flash modified gen 2 firmware and have the device functioning like normal. For the latter, you can lock the bootloader firmware so that it will only load the firmware that you want it to run (i.e. the basic or premium version).
Also, for what it’s worth, these ESP chips are unbelievably cheap when bought at scale. The box the product comes in is probably more expensive.
There's a liability angle too. If a company (or person) makes a product that has any potential for harm and you reprogram it prior to an accident, YOU should take responsibility, but probably won't.
Another angle is that the hardware may be cloneable and there's no reason anyone should be able to read out the code and put it into a clone device. There is a valid use case in making a replacement chip for yourself.
Companies will buy far more chips than hobbyists, so this feature caters to them and for valid reasons.
>> Yes, there's a security angle, but if I have the chip in my hands, I should be able to flip some pin to reprogram the chip and prevent all the e-waste.
What if the chip used mask ROM? Your desire is not always feasible. You can always replace the chip with another one - and go write your own software for it </sarcasm>.
BTW I'm a big fan of Free Software and the GPL, but there are places where non-free makes sense too.
Seriously now, where is that? The only scenarios I can think of are devices that could put others at risk. Large vehicles. But even then, many countries allow modified vehicles on the road.
But everything else should be fair game. If it's my device and only I am at risk, why should anyone else get a say?
It's not a particularly common thing yet, but smart home enthusiasts are becoming increasingly concerned about the expense and effort required to replace cloud-dependent hardware because the manufacturer decided the cloud service isn't worth maintaining anymore.
I recently reverse engineered an e-waste STEM toy from scratch ( https://github.com/padraigfl/awesome-arcade-coder ) and the general response I got from places were:
a. to salvage the microcontroller and other relevant parts (probably worth $4 off a board that would cost $100+ to replicate)
b. a weirdly hostile attitude about the ethics of reverse engineering regardless of the motives (guessing people have been burned a lot with people stealing their designs)
I've mostly worked on the frontend and don't have much knowledge of embedded systems at all but it wasn't anywhere near as hard as I expected. Keen to find some other ESP32 devices to tweak (suggestions welcome!). I guess even if making them unflashable becomes the norm it won't be too hard to just swap the ESP32 off the board with a new one.
I feel like the Trump admin is going to have to make a carve-out for the ESP32 or certain Espressif products. So many IoT businesses are going to go out of business if these MCUs balloon in price.
snvzz•1d ago
platevoltage•1d ago
colechristensen•1d ago
>Espressif Systems (SSE: 688018.SH) announced ESP32-C5, the industry’s first RISC-V SoC that supports 2.4 GHz and 5 GHz dual-band Wi-Fi 6, along with Bluetooth 5 (LE) and IEEE 802.15.4 (Zigbee, Thread) connectivity. Today, we are glad to announce that ESP32-C5 is now in mass production.
hughc•1d ago
bobmcnamara•1d ago
platevoltage•1d ago
bobmcnamara•14h ago
Still gonna be in a bunch of DSPs and stuff
snvzz•1d ago
ESP32-S3 was, AIUI, their last non-RISC-V chip.
It was announced in 2020 and released in 2022.
0. https://www.hackster.io/news/espressif-s-teo-swee-ann-confir...
osrec•1d ago
connicpu•1d ago
baby_souffle•1d ago
bobmcnamara•1d ago
bobmcnamara•1d ago
Better compiler support for RISC-V, but everything I've seen from them is a much shorter pipeline than the older Xtensa cores, so flash cache misses hit it harder.
Both RISC-V and Xtensa suffer from the lack of an ALU carry bit (dropped for the sake of simpler pipelining). But for these small cores it means 64-bit integer math usually takes a few more cycles than on a Cortex-M Arm chip
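On a 32-bit core with no carry flag, the compiler has to synthesize the carry itself, typically with an unsigned compare (RISC-V's `sltu`). A minimal Python sketch of what the emitted rv32 sequence does for a 64-bit add:

```python
MASK32 = 0xFFFFFFFF

def add64_rv32(a_lo, a_hi, b_lo, b_hi):
    """64-bit add from 32-bit halves, the way a carry-less ISA does it:
    add the low words, then derive the carry-out with an unsigned
    compare (sltu), since there is no carry flag to consume directly."""
    lo = (a_lo + b_lo) & MASK32
    carry = 1 if lo < a_lo else 0        # sltu: did the low add wrap?
    hi = (a_hi + b_hi + carry) & MASK32  # two adds here instead of one adc
    return lo, hi

# 0xFFFFFFFF + 1 carries into the high word
print(add64_rv32(0xFFFFFFFF, 0, 1, 0))  # (0, 1)
```

That extra compare-and-add per word is where the "few more cycles" versus a Cortex-M's ADDS/ADC pair comes from.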
viraptor•1d ago
So... depends on the project.
bobmcnamara•1d ago
fidotron•1d ago
My playing with the C3 showed that you have to use much larger buffers for things like I2S to make it work without glitching.
bobmcnamara•22h ago
IshKebab•3h ago
bobmcnamara•1h ago
Xtensa pays for it with crippled 64-bit performance, which has a lot of downstream impacts. Ex: division by a constant is also slower. Most compilers don't even bother fast-pathing 64-bit division by a constant.
I was surprised to find Apple kept ADC/ADCS in aarch64. Maybe this ends up being one of those things that's less useful or potentially a bottleneck depending on the specific implementation. Edit: backwards compatibility probably.
The fact that a few cores have bolted it on to RISC-V makes me think I must not be alone in missing it.