The new HSTX interface on the RP2350 seems to be squarely targeted at this use case (video output) and doesn't require using PIO or burning a ton of CPU cycles. There's a nice write-up on the capability here: https://www.cnx-software.com/2024/08/15/raspberry-pi-rp2350-...
The main advantage of the SIO TMDS encoder is that it allows you to output TMDS on any GPIO, instead of just the eight pins that the HSTX is restricted to.
It also allows an easy upgrade path for projects that were already bit-banging DVI on the RP2040. Other than those two points, I don't think the SIO TMDS encoder has any advantage over the HSTX TMDS encoder.
I get the impression that the SIO TMDS encoder was added to the design first, and there wasn't a good reason to remove it after the introduction of HSTX.
Isn't HSTX mainly good for streaming out? My naive guess would be that it doesn't have as much transceiver-offload capability, which is what I'd naively guess the SIO is good for.
The nice part about using SIO seems to be that you can do the TMDS encoding there. With HSTX you need the output bitstream in the right format already, which seems like you might be back to needing the CPU to do the encoding.
HSTX has a built-in TMDS encoder, and as far as I can tell, it has all the functionality of the SIO TMDS encoder.
You can configure it to directly consume any line buffer with any pixel format, from 1 to 8 bits per color channel. It even supports formats with different numbers of bits per channel, like 8-bit RGB332 and 16-bit RGB565.
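To make that concrete, here is a minimal sketch of that configuration for RGB565 with two pixels packed per 32-bit word. The register layout follows the RP2350 datasheet's HSTX_CTRL description; the struct and constant names (hstx_ctrl_hw, HSTX_CTRL_EXPAND_*) are assumed to match the Pico SDK's hstx_ctrl headers, so treat this as a sketch to verify against the docs rather than finished code:

```c
#include "hardware/structs/hstx_ctrl.h"  // assumed RP2350 SDK header

// Sketch: point the HSTX TMDS encoder at RGB565 scanline data,
// two pixels packed per 32-bit FIFO word.
static void hstx_tmds_rgb565_sketch(void) {
    // Lane 0 = blue (5 bits, bits 4:0), lane 1 = green (6 bits, bits 10:5),
    // lane 2 = red (5 bits, bits 15:11). NBITS is bits-per-channel minus 1,
    // ROT right-rotates the word so the channel's MSB lands at bit 7.
    hstx_ctrl_hw->expand_tmds =
        4u  << HSTX_CTRL_EXPAND_TMDS_L2_NBITS_LSB |   // red: 5 bits
        8u  << HSTX_CTRL_EXPAND_TMDS_L2_ROT_LSB   |   // bit 15 -> bit 7
        5u  << HSTX_CTRL_EXPAND_TMDS_L1_NBITS_LSB |   // green: 6 bits
        3u  << HSTX_CTRL_EXPAND_TMDS_L1_ROT_LSB   |   // bit 10 -> bit 7
        4u  << HSTX_CTRL_EXPAND_TMDS_L0_NBITS_LSB |   // blue: 5 bits
        29u << HSTX_CTRL_EXPAND_TMDS_L0_ROT_LSB;      // bit 4  -> bit 7

    // Shift the FIFO word right by 16 between the two encoded pixels;
    // raw (control-symbol) words are consumed whole.
    hstx_ctrl_hw->expand_shift =
        2u  << HSTX_CTRL_EXPAND_SHIFT_ENC_N_SHIFTS_LSB |
        16u << HSTX_CTRL_EXPAND_SHIFT_ENC_SHIFT_LSB    |
        1u  << HSTX_CTRL_EXPAND_SHIFT_RAW_N_SHIFTS_LSB |
        0u  << HSTX_CTRL_EXPAND_SHIFT_RAW_SHIFT_LSB;
}
```

An RGB332 line buffer would use the same pattern, just with different NBITS/ROT values per lane.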
I have the impression that the main motivation for this project was learning the intricacies of TMDS encoding and providing an open implementation as a reference for other people who want to learn too.
There are too many black-box electronics these days, so it's very welcome to virtually open them up by providing software equivalents of their operation.
This is amazing: not only did they port from the RP2040 to the RP2350, but they also ported all the functionality they could support over to the RV32 cores as well. PicoDVI is usable no matter which cores you choose to use on the RP2350. Well done!
The author is also the creator of the RISC-V core found in the RP2350; you don't get your own core design taped out as an ASIC and then not show it off :p
If anyone wants a PicoDVI sock or 4, I had 5 PCBs manufactured and am only using one.
I wanted to put the HDMI connector onto the PCB itself, so I could avoid soldering the very small pin pitch. So I took some measurements and did my first KiCad project!
Just send me an email if you're interested in having one, and I'd be happy to post it to you. The resistors are already populated, too, you just need to solder on the Pico using the through-hole connections.
It says 60 percent of CPU cycles on one core, with the other core free. What does the available 40 percent of that one core look like? I'm assuming it's got some high-frequency interrupts going that can't be disturbed? What restrictions are there on using the remaining capacity of that core?
My understanding is that the HSTX can run the bit clock at twice the CPU frequency, as opposed to 1x for the RP2040. So 800x480 (60 Hz) is possible with no overclocking (295 MHz bit clock), and Luke said [1] he got 1280x720 at 50 Hz with overclocking (530 MHz bit clock using CVT-R, according to the video timings calculator [2]).
Especially with the HSTX block, which can apparently output bits at double the system clock (while the RP2040's bit-banged output was limited to a bit clock equal to the system clock, i.e. a pixel clock of 1/10th of it).
If my rough math is correct, it should be possible to output 720p 60 Hz video with an overclock to ~320 MHz. Though, actually generating that many pixels might be hard when you have nowhere near enough memory.
I would love to see some solid information about how well the RP2350 overclocks; apparently 300 MHz is easy. 60 Hz 1080p is almost certainly out of reach, since it would require something like a 700 MHz overclock, but 30 Hz 1080p is probably viable.
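To put numbers on that rough math: TMDS carries 10 bits per pixel per lane, and (as noted above) the HSTX can shift out up to two bits per system clock, so the system clock has to be at least pixel_clock × 10 / 2. A quick sketch using standard CEA-861 pixel clocks; reduced-blanking (CVT-RB) modes have lower pixel clocks, which is presumably where the ~320 MHz figure for 720p60 comes from:

```c
#include <stdio.h>

// TMDS carries 10 bits per pixel per lane, and the HSTX can shift out up to
// two bits per system clock, so: sysclk >= pixel_clock * 10 / 2.
static double min_sysclk_mhz(double pixel_clock_mhz) {
    return pixel_clock_mhz * 10.0 / 2.0;
}

int main(void) {
    // Standard CEA-861 pixel clocks; CVT-RB modes come in lower.
    printf("720p60  (74.25 MHz pixel clock): >= %.1f MHz sysclk\n", min_sysclk_mhz(74.25));
    printf("1080p30 (74.25 MHz pixel clock): >= %.1f MHz sysclk\n", min_sysclk_mhz(74.25));
    printf("1080p60 (148.5 MHz pixel clock): >= %.1f MHz sysclk\n", min_sysclk_mhz(148.5));
    return 0;
}
```

That puts 1080p60 at roughly 743 MHz of system clock, consistent with the "something like a 700 MHz overclock" estimate, while 1080p30 and CEA 720p60 land around 371 MHz.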
> Though, actually generating that many pixels might be hard when you have nowhere near enough memory.
Even if you don't have enough memory or CPU time to render true 720p, it's nice to be able to output 360p or 240p inside a 720p container with each pixel and line duplicated two or three times. Doing it that way gives you nice crisp pixels rather than the blurry mess the display's internal upscaler would probably produce. You can even insert blank lines to get a faux-CRT-scanline effect.
And HSTX is actually pretty flexible. Nowhere near as flexible as PIO, but it can repeat pixels by itself (though it uses the shifter, so you can either pack multiple pixels per 32-bit word or repeat pixels, not both).
Scanline doubling/tripling and blank-line insertion can be done with nothing more than DMA chaining.
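To illustrate the DMA-chaining idea, here's a minimal sketch of line doubling: a control channel feeds scanline addresses (each listed twice) into a data channel's read-address trigger alias, and the data channel streams each line to the HSTX FIFO before chaining back. The control-block pattern is standard Pico SDK DMA; DREQ_HSTX and hstx_fifo_hw->fifo are my assumptions for the RP2350 names, and real DVI output would also need sync/control words interleaved, which this omits:

```c
#include "pico/stdlib.h"
#include "hardware/dma.h"
#include "hardware/structs/hstx_fifo.h"   // assumed RP2350 SDK header

#define LINE_WORDS (640 / 2)   // 640 RGB565 pixels, 2 per 32-bit word
#define SRC_LINES  240         // 240 source lines, each output twice -> 480
#define REPEAT     2

static uint32_t linebuf[SRC_LINES][LINE_WORDS];
static const uint32_t *addr_list[SRC_LINES * REPEAT + 1];

int main(void) {
    // Each source line appears REPEAT times in the address list, so it is
    // streamed to the HSTX FIFO that many times in a row (line doubling).
    for (int i = 0; i < SRC_LINES; ++i)
        for (int r = 0; r < REPEAT; ++r)
            addr_list[i * REPEAT + r] = linebuf[i];
    addr_list[SRC_LINES * REPEAT] = NULL;  // null trigger ends the chain

    int data_ch = dma_claim_unused_channel(true);
    int ctrl_ch = dma_claim_unused_channel(true);

    // Data channel: one scanline from RAM to the HSTX FIFO, paced by the
    // HSTX DREQ, then chain to the control channel to fetch the next address.
    dma_channel_config dc = dma_channel_get_default_config(data_ch);
    channel_config_set_transfer_data_size(&dc, DMA_SIZE_32);
    channel_config_set_dreq(&dc, DREQ_HSTX);          // assumed DREQ name
    channel_config_set_chain_to(&dc, ctrl_ch);
    dma_channel_configure(data_ch, &dc,
                          &hstx_fifo_hw->fifo,        // assumed FIFO register
                          NULL,                       // read addr set by ctrl_ch
                          LINE_WORDS, false);

    // Control channel: pop the next scanline address from addr_list and write
    // it to the data channel's READ_ADDR trigger alias, (re)starting it.
    dma_channel_config cc = dma_channel_get_default_config(ctrl_ch);
    channel_config_set_transfer_data_size(&cc, DMA_SIZE_32);
    dma_channel_configure(ctrl_ch, &cc,
                          &dma_hw->ch[data_ch].al3_read_addr_trig,
                          addr_list, 1, true);        // trigger: start the chain

    while (true) tight_loop_contents();
}
```

Blank-line insertion for the scanline effect is the same trick: interleave the address of an all-black line buffer into the list.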
The two chips I have will both run at 350 MHz with Vcore at 1.3 V, with no glitches seen. I'm not sure how safe it is to run with that Vcore for long, though. Probably fine. Stock Vcore will take you to 300 MHz, seemingly on all instances of the RP2350.
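For reference, a rough sketch of setting that up with the Pico SDK, assuming the hardware_vreg/clock APIs behave on the RP2350 the way they do on the RP2040 (and noting that exactly 350 MHz isn't reachable from the 12 MHz crystal with set_sys_clock_khz's integer dividers, so this asks for a nearby 348 MHz):

```c
#include "pico/stdlib.h"
#include "hardware/vreg.h"

int main(void) {
    // Raise Vcore before raising the clock; 1.30 V matches the setting above,
    // but long-term safety at that voltage is not established.
    vreg_set_voltage(VREG_VOLTAGE_1_30);
    sleep_ms(10);               // give the regulator time to settle

    // 348 MHz = 12 MHz * 116 (VCO 1392 MHz) / 2 / 2; panics if unreachable.
    // With stock Vcore, 300000 kHz here reportedly works on all RP2350s.
    set_sys_clock_khz(348000, true);

    stdio_init_all();
    while (true) tight_loop_contents();
}
```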
What kind of retro computing are you doing that would work well with high definition (720p)? Standard definition, I believe, only started to be limiting around 2007, when Xbox games like Dead Rising had captions that were too small to be legible on SD TVs.
Is there any way to select between two HDMI signals?
I have an Nvidia Jetson, but its boot sequence turns off the HDMI output at some point. It would be nice if I could show a placeholder screen while the Jetson is booting. Would something like that be possible using this RP2350 board?
Not directly, because it can't handle retransmitting the existing HDMI signal. You probably need to add an HDMI/DVI switch IC like this one: https://www.analog.com/en/products/ad8196.html
I've tried a few commercial HDMI switches to share a monitor between machines. The switch time is long and the monitor doesn't reliably wake up. Perhaps it's related to EDID?
Do you know how that chip is for real-world usability?
> switch time is long and the monitor doesn't reliably wake up
In my personal experience, a monitor's obtuse firmware is often to blame.
Using a switch with one extra input port whose data pins are all connected to ground (via series resistors), and switching to that port for a brief moment before switching to the real intended input, usually helps some.
Isn't it possible to create a valid and stable HDMI signal based on two inputs? I can imagine that some memory is required to synchronize both signals. Is there an IC that can do that?
Is it theoretically possible to have 2-way HID (keyboard, mouse) and 1-way video output on a single USB-C cable? It would simplify connection to a KVM or a software-emulated display/keyboard/mouse. Low FPS would be enough for dashboards and CLI use.
Yes, this could theoretically be done using DP Alt Mode over USB Type-C. It allows up to 4 lanes (commonly 2 lanes) of DP signal over the lanes typically assigned to USB3, and the USB2 data lines remain functional in this mode.
https://newnex.com/technology-articles-dp-alt-mode-over-usb-...
I recently found the Openterface Mini-KVM [1], which is supposed to let you plug it into your laptop over USB-C and into some arbitrary system with USB + HDMI. Then, using their host software, you can view the HDMI display output and send mouse and keyboard inputs with ease.
I haven't tried it, so I can't speak to how well it works, but it sounds promising.
It is a crowdfunded project, so the usual caveats apply.
> KVM-over-USB .. macOS, Windows, and Linux [clients], with Android support in development..
HDMI and emulated keyboard/mouse (HID) input.. video up to 1920x1080@30Hz with under 140ms latency.. play target device's audio directly on the host computer.. [send text] from the host to the target device, ideal for copying usernames, passwords.. Switchable USB-A Port.. for transferring files.. or sharing other USB devices.
With USB storage emulation, this approaches BMC-style remote management, minus the network attack surface.
I think most people just weren't sure what you're talking about.
Communication between such a thingamajig capture card and a laptop has nothing to do with DP Alt Mode or USB HID; it's whatever custom USB packet types that capture card's manufacturer comes up with. That technically wouldn't be an answer to your original comment in a strict sense.
The most likely heuristic paraphrasing of your original query is something like "are there ways to multiplex a DP _source_ and an HID _peripheral_ on a single USB3 + DP Alt connection". That made no sense at multiple levels, on top of being unlikely to be possible. That led to people hallucinating wildly.
The alternate question "are there ways to plug an SBC into laptop somehow for remote control, over a single USB-C cable" has that dongle as an answer as we now know in hindsight. But that is not a straightforward interpretation and response for your original comment, especially with your follow-up replies about DP Alt mode specs that would not be used by such a device.
As mentioned by others, there are USB-C docking stations/monitors where a single cable provides power to the device, takes video from the device, and sends keyboard/mouse events to the device. I should have asked whether software on a laptop could emulate a USB-C docking station, i.e. DP display sink + keyboard + mouse, since docking stations already exist.
> Communication between such a thingamajig capture card and a laptop has nothing to do with DP Alt Mode or USB HID; it's whatever custom USB packet types that capture card's manufacturer comes up with.
Revisiting the question above: could a manufacturer make a capture card's output compatible with existing USB-C docking stations, instead of inventing bespoke USB packet types? If custom hardware is needed anyway, why not emulate standard protocols?
I was surprised to learn that Windows/Linux/Mac/Android userspace software can encode/decode custom USB packets over a USB-C cable without a custom kernel driver. Could the RP2350 implement a similar custom protocol at the other end of the cable, removing the need for a hardware capture device?
The host and peripheral PHYs are different. Protocols going downstream and upstream are different. USB peripherals are literally not allowed to speak unless spoken to. It's always the king and his slaves. It's that way in the hardware.
You can make such a standalone computer, painted orange and marketed as an ice cream, that works as the king class when a slave-class device is connected to its sole USB-C port, and as the slave class when a king-class device like a laptop is connected to it. This is in fact how many smartphones work. But that's again not what you asked.
Exactly what you're asking in the way you're asking, standard USB protocols (and DisplayPort signals) going in and out of devices and computers freely like Ethernet packets, just isn't possible with USB.
> USB protocols (and DisplayPort signals) going in and out of devices and computers freely like Ethernet packets, just isn't possible with USB
That reminds me of Intel Thunderbolt Share [1][2], which offers sharing of screen/keyboard/mouse between 2 PCs and is probably software-emulated Ethernet over Thunderbolt.
Host-to-host connections over USB4 (which is Thunderbolt without Intel's marketing) actually just have a packet interface over which you can pass IP, no need for Ethernet emulation.
This already exists right now...? I've seen tons of USB hubs with an HDMI output and several USB ports that can be used concurrently. Hell, I'm using one right now. Are you asking about something else?
There are no current specs for dual-directional USB over any cable, AFAIK. There very well should be.
You could present as a device, offer networking, and have USB/IP advertised over multicast on that port. Easy, a weekend project at most; it would be dead obvious to any practitioner.
USB4, as a packetized protocol, really should offer something here. I do wish there were a half-speed 2.5 Gbit USB4 option that microcontrollers could have some hope of accelerated bit-banging.
[2.0] .. using only two lanes on the USB-C connector via DP Alt Mode to allow for simultaneous SuperSpeed USB data and video
[2.1] .. tightened its alignment with the USB Type-C specification as well as the USB4 PHY specification to facilitate a common PHY servicing both DisplayPort and USB4. In addition, DisplayPort 2.1 has added a new DisplayPort bandwidth management feature to enable DisplayPort tunnelling to coexist with other I/O data traffic more efficiently over the USB4 link.
I believe that the author is talking about plugging a USB-C cable between your laptop and some headless system, and having your laptop send HID data while also capturing a display output from the headless system.
Isn't that exactly what a USB-C docking station does? It receives display and audio data, and acts as a USB device (keyboard, mouse, webcam). You'd need hardware that can receive DisplayPort, but it's doable.
Looks like an Alt Mode host with DP sink capability is technically allowed...? But it also looks like USB host controllers with Alt Mode support expose internal DP source driver inputs rather than handing you a bunch of bare copper wires in Alt Mode, so I doubt it realistically has an implementation.
But neither USB nor DP (or DVI, VGA...) is a symmetrical interface. They don't work like RS-232C or Ethernet ports; a product that uses these standards must be designed from the beginning as a host or as a peripheral (or as a special 2-in-1 gadget that can exclusively switch roles). I think that's what GP meant by dual directional.
I wish the RP2350 would include a fuse to permanently disable the ARM cores, for those who want only the RISC-V cores (and maybe to avoid paying ARM royalties on those chips).
See p. 1259 in the datasheet: the ARM_DISABLE fuse does exactly that. Putting that in was a smart move, because it means they can always make a RISC-V-only variant of the chip without having to tape out new silicon; they just have to blow that fuse at the factory.
That's actually a super common way to handle licensing of hard IP. You don't want to spin different revisions of a chip with and without the IP; that's expensive. So you build in fuses to permanently disable the IP. This happens a lot with hardware video encoder/decoder blocks.
Or you do it the other way around, in software: remember the Pi 1/2 era, where you had to buy separate licenses for the MPEG/H.264 hardware decoders, tied to the Pi's serial number?
It means the chip's CPU cores won't be tied to ARM IP enforcement, and the vendor won't have to pay for a license to implement the ARM ISA (or to use an ARM core design).