Hacker News | past | comments | ask | show | jobs | submit | thefourthchime's comments

I can tell you the light rail in Austin is a complete failure. There was some ridership before the pandemic, but after a few years, the numbers are dismal. They've covered the windows with ads, so you can't even tell how empty they are inside. Meanwhile, they crisscross the city, constantly blocking streets with rail guards just to shuffle a handful of people north and south.

> light rail in Austin is a complete failure

Light rail is stupid. It’s a bus that can’t change lanes. A train that gets stuck in traffic.

And, as you said, they visibly disrupt drivers which generates class animosity.


I think you’re thinking of streetcars—trains that share right of way with cars. Light rail often has its own right-of-way with priority over cars. (That’s what the crossing guards are for.)

> Light rail often has its own right-of-way with priority over cars

It’s still at grade. Priority is meaningless if there is a car in the way when the guards come down. And those guards, in interrupting traffic, are annoying to drivers. (I’d also point out that the line between trams, streetcars and light rail is ambiguous. It’s an American term describing principally European infrastructure.)


> Priority is meaningless if there is a car in the way when the guards come down.

This possibility is so far outside my experience I can only think your perspective has more to do with emotion than logic. Maybe it happens more often in your city than mine.


The Austin train you are talking about is heavy rail. Not to be confused with Austin light rail, which is Coming Soon (TM).

It's still more reliable than the buses. I think it's pretty fun.


I'm not sure how you'd measure the effectiveness of light rail/trams vs buses - a hybrid of average journey duration, number of passengers, and I suppose some ROI type metric?

Either way, personally, priority bus lanes feel significantly more flexible and cheaper to implement than LR/trams... but that's just a personal opinion.


I completely agree. I can't even imagine using a local model when I can barely tolerate a model one tick behind SOTA for coding.


I've used all of these tools and for me Cursor works just as well but has tabs, easy ways to abort or edit prompts, great visual diff, etc...

Someone sell me on Claude Code; I just don't get it.


Having only used the base tier of each, I loved the UX of Cursor and what it enabled me to do, but I hit my monthly cap in two days. With Claude Code (on Pro) I do hit my session limit, and once even the weekly limit, but I've never had to be tools-down for 20+ days.

I hear Codex is even more generous.

Admittedly all seem cheap enough, but there does seem to be a large difference in pricing.


I’m with you, I’ve used CC but I strongly prefer Cursor.

Fundamentally, I don’t like having my agent and my IDE be split. Yes, I know there are CC plugins for IDEs, but you don’t get the same level of tight integration.


My father, who never did any public speaking, and as much an introvert as you'll find, did this for my wedding rehearsal.

I was amazed at how naturally and how well he did. All he wrote down were 6-7 topics to talk about. He got huge applause.


This is exactly how I write code. I never engineer anything until I have to. I don't try to get rid of code duplication until it works. And I try to be as "least clever" as possible.


I'm the same way. Underengineering is so much easier to fix than overengineering.


And yet somehow in enterprise software you always find an 'EntityModelFactoryProvider' or 'BusinessRelationValidationService'.


I've worked on a playout system for broadcast television. The software has to run for years at a time without any leaks. We need to send out one frame of television exactly on time, every time.

It is "C++", but we also follow the same standards: static memory allocation, no exceptions, no recursion. We don't use templates, and we barely use inheritance. It's more like C with classes.


I worked on the same for many years; same deal - playout system for broadcast, years of uptime, never miss a frame.

The C++ was atrocious. Home-made reference counting that was thread-dangerous, but depending on what kind of object the multi-multi-multi diamond inheritance would use, sometimes it would increment and sometimes it wouldn't. Entire objects made out of weird inheritance chains. Even the naming system was crazy; "pencilFactory" wasn't a factory for making pencils, it was anything made by the factory for pencils. Inheritance rather than composition was very clearly the model; if some other object had a function you needed, you would inherit from that too. Which led to some objects inheriting from the same class a half-dozen times in all.

The multi-inheritance system was given weird control by objects that, on creation, defined which kinds of objects (from the set of all kinds they actually were) they could be cast to, via a special function. But any time someone wanted one that wasn't on that list, they'd just cast to it using C++ anyway. You had to cast, because the functions were all deliberately private - to force you to cast. But not how C++ would expect you to cast, oh no!

Crazy, home-made containers that were like Win32 opaque objects; you'd just get a void pointer to the object you wanted, and to get the next one you'd pass that void pointer back in. Obviously trying to copy MS COM, with IUnknown and other home-made QueryInterface nonsense, in effect creating their own inheritance system on top of C++.

What I really learned is that it's possible to create systems that maintain years of uptime and keep their frame accuracy even with the most atrocious, utterly insane architecture decisions that make it so clear the original architect was thinking in C the whole time and using C++ to build his own terrible implementation of C++, and THAT'S what he wrote it all in.

Gosh, this was a fun walk down memory lane.


A multi-inheritance system is certainly not something somebody who "was thinking in C" would ever come up with. This sounds more like a true C++ mess.


I worked on a pure C system early in my career. They implemented multiple inheritance (a bit like Perl/Python MRO style) in pure C. It was nuts, but they didn't abuse it, so it worked OK.

Also, serious question: are there any GUI toolkits that do not use multiple inheritance? Even Java Swing uses multiple inheritance through interfaces. (I guess .NET does something similar.) Qt has it all over the place.


The best example I can think of is the Win32 controls UI (user32, CreateWindow/RegisterClass) in C. You likely can't read the source code for this, but you can see how Wine did it, or Wine alternatives (like NetBSD's PEACE runtime, now abandoned).

Actually the only toolkit that I know that sort of copied this style is Nakst's Luigi toolkit (also in C).

Neither really uses inheritance; both use composition, with "message passing" sent to the different controls.


I take this back ;-) People come up with crazy things. Still I would not call this "C thinking". Building object-oriented code in C is common though and works nicely.


One could say toolkits done in C++ use multiple inheritance because C++ doesn't have interfaces though.


This is a good point. It would be better for me to say pure abstract base classes that simulate interfaces in C++. That said, I can say from experience that Qt does more than multi-inheritance with pure abstract base classes. I think the QPainter class is mixed into a few places, and that class is fuckin' major -- it is responsible for painting every (cross-platform) pixel in the whole framework.


GTK does not support multiple inheritance afaik.


It doesn't but it definitely "implements" a single inheritance tree (with up casting/down casting) which I believe Xt toolkits (like Motif) also did.


it is also interesting that places where you would expect quite 'switched-on' software development practices tend to be the opposite - and the much-maligned 'codemonkeys' at 'big tech' in fact tend to be pretty damn good.

it was painful for me to accept that the most elite programmers i have ever encountered were the ones working in high frequency trading, finance, and mass-producers of 'slop' (adtech, etc.)

i still ache to work in embedded fields, in an 8 kB constrained environment, writing perfectly correct code without a cycle wasted, but i know from (others') experience that embedded software tends to have the worst software developers and software development practices of them all.


My test of a new model is always:

"Generate a Pac-Man game in a single HTML page." -- I'd never had a model produce a complete working game until a couple of weeks ago.

Opus 4.5 in Cursor was able to make a fully working game (I'll admit letting Cursor act as an agent on this is a little bit of cheating). Gemini 3 Pro also succeeded, but it's not quite as good because the ghosts seem to be stuck in their jail. Otherwise, it does appear complete.


Isn't that what GPT 4.5 was?


That was a large model that, IIUC, was too expensive to serve profitably.

Many people thought it was an improvement though


The price of inference has been dropping like a rock. I wouldn't expect that 2c to be true in a couple of years.


Likewise, was the cost of a Google search 20-odd years ago anything like those amounts?


As a serial DIYer, I respect the engineering depth here, especially the custom vector index, but I disagree on the self-hosted ML approach. The innovation in embeddings is just too fast to keep up with locally without constant refactoring. You can actually see the trade-off in the "girl drinking water" example where one result is a clear hallucination.


Currently the (semantic) ML model is the weakest (minorly fine-tuned) ViT-B/32 variant, acting more as a placeholder, i.e. very easy to swap with a desired model. (DINO models have been pretty great, being trained on a much cleaner and larger dataset; CLIP was one of the first image-text models!)

On the "girl drinking water" point: "girl" is the person's tagged name, and "drinking water" just re-ranks all of "girl"'s photos (rather than finding all photos of a generic girl drinking water).

I have been more focused on making the indexing pipeline more performant by reducing copies and speeding up bottleneck portions by writing them in Nim. Fusing semantic features with metadata is the more interesting and challenging part, compared to choosing an embedding model!

