
Man, I miss making a living as a C programmer. My happiest days as a dev were when my "stack" was Linux, C, Makefiles, and some shell scripts. Only thing I'd change is that source control was svn instead of git.

Sure, there was a lot more to type. Debugging was harder. But there was a beauty to that. A simple mental model. A sense of control.

These days all jobs seem to be sort of online crap. Piles and piles of layers and complexity and heterogeneous frameworks and tools. Being on call. Never being able to truly master anything because everything changes all the time.

/nostalgia



Oh man, I feel exactly the same, my friend. People who have never gotten into the C world find it frightening, but it's a beautifully simple language loaded with (at times dangerous) power. It's so much closer to the hardware that you can't use a declarative approach easily (which, now that I've drunk the functional programming kool-aid, I do love), but in many ways it's actually much simpler to understand. You can also be pretty declarative if you are just smart about breaking things into functions.

I still glance longingly at my dusty paper copy of The Linux Programming Interface (https://nostarch.com/tlpi)

/nostalgia


> People who have never gotten into the C world find it frightening...

I got into C over 25 years ago. Didn't find it frightening back then, but I sure do now.

Still use it pretty often for firmware and kernel driver development, but I want to replace it with something safer. Then again, I also use assembler for the same reasons... sometimes C just doesn't cut it for interrupt handlers when every cycle and/or code byte size counts.
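To make that concrete: a minimal sketch of the C-vs-asm trade-off, assuming an AVR target and avr-gcc (the vectors and port bit are just illustrative). The plain ISR() form lets the compiler generate the full register save/restore prologue; ISR_NAKED hands those cycles back, at the cost of you being responsible for everything:

    /* Sketch only: assumes avr-gcc and avr-libc. */
    #include <avr/io.h>
    #include <avr/interrupt.h>

    volatile uint8_t tick;

    /* Compiler-managed handler: safe, but the generated
       prologue/epilogue costs cycles and code bytes. */
    ISR(TIMER0_OVF_vect)
    {
        tick++;
    }

    /* Hand-rolled handler: no compiler save/restore, so the body
       must not clobber anything it doesn't preserve itself. */
    ISR(INT0_vect, ISR_NAKED)
    {
        __asm__ volatile(
            "sbi %0, 5 \n\t"   /* set PORTB bit 5: one instruction */
            "reti      \n\t"   /* return from interrupt ourselves */
            :: "I" (_SFR_IO_ADDR(PORTB)));
    }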


> Didn't find it frightening back then, but I sure do now.

Why is that? Security related stuff?


Yup. Things are increasingly connected to the network. Including legacy codebases.

Also, when you have larger teams and more people touching the code, C can really shoot you in the feet, and elsewhere, from new and surprising angles.


No doubt. It's amazing too how some code that was never expected to be exposed to untrusted/unsanitized data gets refactored into a new spot or called from somewhere else, and fails to sanitize its input, expecting that the caller will do it, or simply forgetting altogether (easy when under pressure to deliver). I coded a pretty bad security hole myself once by doing something like that, and I am a security specialist who knows what to look for lol!

I love C, but it really is a security nightmare full of footguns.
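A sketch of that failure mode (all names made up): a helper written for an internal path where the length was already validated, later called on raw input under deadline pressure:

    #include <string.h>

    /* Assumes the caller checked len <= 63 -- nothing enforces it here. */
    void parse_field(char *dst, const char *src, size_t len)
    {
        memcpy(dst, src, len);
        dst[len] = '\0';
    }

    void handle_packet(const char *payload, size_t payload_len)
    {
        char field[64];
        /* New call site added in a refactor: the old invariant is
           silently dropped, and this overflows when payload_len > 63. */
        parse_field(field, payload, payload_len);
    }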


> Still use it pretty often for firmware and kernel driver development, but I want to replace it with something safer. Then again, I also use assembler for the same reasons... sometimes C just doesn't cut it for interrupt handlers when every cycle and/or code byte size counts.

That's what you have ASM for :D j/k


> beautifully simple language

:-)

Of course there are those who claim it is actually frightfully complex, when it is those same people who are re-interpreting the standards to actually create that very complexity.


Not sure if this is what you meant, but a lot of the stuff they added to C in the latest standards really turns me off. C89 has a special place in my heart.


I still only code in C. I didn't start in the stone ages, but I love it. C and asm give me the feeling I'm programming my computer, and really, for low-level stuff, trying to find a good alternative which is as good and simple (yes, simple :D C/asm is just pure logic!) is difficult for me.

I don't code professionally though, since I can't for the life of me find a job in C that doesn't already have tons of guys like you, with a century of experience in the language, lining up to take it :D

C/asm is awesome, can't bear anything else


Honestly, if you use one of the available GCs out there (like Boehm's), and give up on static typing, and heavily rely on function pointers, you can write C similar to how you'd write something like Haskell. Yes, it won't go as fast as the most idiomatic C, and you can't really make an operating system if you have a GC, but really, how often do most of us actually write code that can't use a GC these days? Even with a GC, it'll still probably perform better than 90% of languages.
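A minimal sketch of that style, assuming the Boehm GC (libgc) is installed and linked with -lgc; the cons list and map_list are made up for illustration:

    #include <gc.h>     /* Boehm GC */
    #include <stdio.h>

    /* A GC-managed cons list: no free() anywhere. */
    typedef struct node { int head; struct node *tail; } node;

    static node *cons(int head, node *tail) {
        node *n = GC_MALLOC(sizeof *n);  /* collected when unreachable */
        n->head = head;
        n->tail = tail;
        return n;
    }

    /* Higher-order map via a function pointer, Haskell-style. */
    static node *map_list(int (*f)(int), node *xs) {
        return xs ? cons(f(xs->head), map_list(f, xs->tail)) : NULL;
    }

    static int dbl(int x) { return 2 * x; }

    int main(void) {
        GC_INIT();
        node *xs = cons(1, cons(2, cons(3, NULL)));
        for (node *p = map_list(dbl, xs); p; p = p->tail)
            printf("%d ", p->head);   /* prints: 2 4 6 */
        putchar('\n');
        return 0;
    }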


At that point it's practically not C anymore :)

I'm not a fan of adding GC to C. I've had my fair share of stress caused by GC issues. It's great 99% of the time, but when you run into performance issues caused by the GC, it becomes a very, very leaky abstraction.


I'm OK with it not being 70's-era C, to be honest; even with the extra stuff, I find the language fairly simple to pick up compared to C++.

I haven't had any problems with performance with the Boehm GC personally, though what I've used C for is non-real-time video processing stuff. I found I typically got better throughput using Boehm than I did when I was manually managing my memory, and for what I was using it for, a small pause wasn't really a problem as long as the throughput wasn't affected.


One may as well just use C# without classes.


Pedantically speaking that's impossible. But loosely speaking that's the essence of how I write most of my C#.


If you are doing that, you might as well use Go and get green threads and little or no undefined behaviour in a modern, actively developed language for free. Go is practically C without the undefined behaviours.


If you have a GC and no static typing, is it better than other functional languages (many of which have benefits of both GC and static typing)? Not a rhetorical question, I have never used a GC with C.


For what I was doing, which was video processing stuff, I'm not sure that it was faster than if I had written it in Haskell. For this job, I was required to use C or C++, and so I never attempted to port it to Haskell or OCaml or something. I'm more of a wannabe-academic and certainly not a systems programmer, so I did what I could.

If I were to guess, the C version would have a bit better performance in a single-threaded context due to the fact that I would occasionally use tricks to avoid reallocations/frees, and the Boehm GC is opt-in, so when I was reasonably certain I could handle the memory correctly, I would do it myself, minimizing how much was actually being done by the GC.

I do feel that a static functional language like Haskell might perform well (maybe even better on the average case?) in a multi-threading context, since I personally find dealing with locks in C (and C++) very difficult to do correctly, so some of the cool tricks Haskell has for avoiding manual locking might pay off. I was too much of a coward to use threads and locks much at that job, so I haven't had much of a chance to test it.


Static typing is the whole point behind Haskell.

C - static typing + closures and GC ≈ Scheme. If I wanted to write in that style, I'd just use Gambit and be done with it.


Programming is kind of a joke these days.

Each time a new "hotness" language, framework, pattern, or dev/devops process comes out, you have a rush of professional gurus working hard to build their business (e.g., speaking fees/books/blogging/training) by teaching that this is the new true way.

In the late 90's I began to notice how the new kids kept advocating for the newest so-called best practices to do things that the bad-old practices could handle just fine. Seeing the writing on the wall, I dipped out of the game a few years later.

Unfortunately, the unhelpful tech churn has worsened. Similarly, the quality of the product produced has worsened, or at best, not improved.

Note, hyper-scale advertisement services, global human tracking, and digital Skinner boxes are not an improvement to anything.

Meanwhile, 50-year-old programmers with deep general development knowledge cannot find jobs. I guess it is easier for young founders to justify using frothy tech if there are fewer old-timers around to suggest otherwise.

edit-to-add:

/get-off-my-lawn


There's a lot of BS, but there are some real improvements as well. Many innovations that are considered best practices today actually came out in the 1990s. The Java programming language became incredibly popular, being a machine-independent, memory-safe language and system that could deal out-of-the-box with concurrency and networking -- the Go of its day, to some extent (and indeed it had a whole lot in common with Go's predecessors, Alef and Limbo). Functional programming was popularized around that time. The C++ standard introduced us to the notion of zero-overhead abstractions combined with strong static checking, which Rust is refining today.

And of course, the Web became widespread around that time, as well. Whatever you might think about "hyper-scale advertisement services, global human tracking, and digital Skinner boxes", Amazon first became prominent in the dotcom era, and it's quite massive today.


I have been reacquainting myself with some so-called best practices as I muddle through my recent side-projects.

While some newer languages are interesting, the dev stacks today are a mess.

Also, while we have safe pointers and GC everywhere, the lack of technical discipline/professionalism in the industry is worse than ever. I recognize that the C-suite and VCs share in the blame for this, but devs are the ones building things and evangelizing the newest-hotness that comes onto the scene.

But I do have to remind myself that compared to traditional engineering tracks, software engineering is still in its infancy.

/get-off-my-lawn


> ...the lack of technical discipline/professionalism in the industry is worse than ever.

What concerns me quite a bit is the overt opposition we're now seeing to professionalism in the industry. The whole 'post-meritocracy' shtick, regardless of the best intention of the "useful innocents" who came up with that particular phrasing, is really a way of saying: "Professionalism? What professionalism? There's no such thing, we know better than that! Brogrammers r00lz FTW!" Again, this is clearly not what the proponents were seeking-- but in some sense, it's what the phrase actually means, out there in the real world.


Two week release cycles mean you never have to stand behind your work.

/salty

I will shut up and go back to building this Android app I am working on -- what a horror-show of a platform.


Caught the tiger by the tail on this one. Brilliant reduction and post-mortem, btw.


This is one reason I wonder whether there is room in the world for a better C. Low complexity programming languages with a simple machine mental model along the lines of Go (or perhaps in future, Zig and Jai?) for doing systems programming, with a strong static type system, and a rock solid build system. Early in my career I did a lot of bare metal and embedded systems programming and the one thing I miss about C is the predictable assembly output. I primarily use Rust for this purpose right now but I wonder if there's a place for something simpler for doing really low level stuff (i.e. programming hardware directly, device drivers) that's better than C.

EDIT: grammar.
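To make the "predictable assembly output" point concrete, a tiny sketch (the register address is made up): for a leaf function like this, most C compilers at -O1 and up emit essentially the same load/or/store sequence, with no hidden allocation or runtime behind it:

    #include <stdint.h>

    /* Hypothetical memory-mapped GPIO output register. */
    #define GPIO_OUT (*(volatile uint32_t *)0x40020014u)

    /* Read-modify-write of one hardware register; what you write is
       very close to what the compiler emits. */
    void led_on(uint32_t mask)
    {
        GPIO_OUT |= mask;
    }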


> I wonder if there's a place for something simpler for doing really low level stuff (i.e. programming hardware directly, device drivers) that's better than C.

Like this? https://andrewkelley.me/post/zig-stack-traces-kernel-panic-b...


I like Zig. Have you tried it? I’d be interested in hearing about your experience.

I haven’t been able to think of a project to use it for, though, as all of my work these days is web related.


(you replied to the creator of Zig, fyi)


*facepalm* I need to start looking at usernames before replying!


Maybe a cleaned-up Pascal would do the trick? It was a great teaching language back when I was a student. Low complexity, strong static typing, compiled language, no GC, pretty fast. No pointer arithmetic, harder to shoot oneself in the foot, but still easy access to pointers and easy ability to manage memory.

edit: What I meant by "cleaned-up" Pascal was addressing some of Kernighan's criticisms as seen in https://www.lysator.liu.se/c/bwk-on-pascal.html (also, the Pascal syntax is a bit bloated)


Freepascal pretty much is the cleaned up version you describe. Fast, free, multiplatform, and just plain sensible. Overdue for a resurgence of use. Maybe the foundation in charge could rename it Cpascal and it would suddenly lift in popularity.

Pascal was a language I learnt in 1982, and I love it for its elegance. The only thing I dislike, and it's still around, is the begin... end, and only because I'm a lazy typist and lazy reader. For me it is hard to find begin-end blocks... harder than looking for the stupid squiggles used in other languages. Go figure.


Check out Oberon which is the current Pascal. I use the OBNC compiler (https://www.miasap.se/obnc).


The Oberon specification on this website is also a great and concise read: https://www.miasap.se/obnc/oberon-report.html


Modula-2 comes to my mind. :)


Yep. Languages I used in school included Pascal, Modula-2, Scheme, and of course C. Fond memories. Everything seemed simple then.


Pssst, come over to embedded. We’re in C all day long using kilobytes not gigabytes.

I don’t need to chase the latest framework... but I’m also on my own for almost everything. Pros and Cons, but I wouldn’t leave it for anything web related.


The Rust Evangelism Strike Force is gunning for the embedded space, too. As soon as the tooling becomes widespread enough to support the most-used microcontrollers, C will be a niche language even in embedded.


Yea... We'll see. Rust has had quite a while to make an attempt at embedded, and unless you count drivers for a couple of STMs and a few other (mostly outdated) chips, I haven't seen a single thing that says progress.

I'd like to see Rust happen, because I don't see C++ as the embedded future. At least Rust had the good sense to leave garbage collection out.

However... I'll believe it when I see it. And by that, I mean when STM and Nordic and NXP and others are pushing out their own Rust device support files on their sites. When Keil or IAR or Rowley or Atollic pushes out a full featured IDE that uses a Rust compiler. When Rust is not only supporting the latest run of chips but there is a way to debug them with code-overlay. But until then...


> I don't see C++ as the embedded future.

AUTOSAR moved from C to C++14 as the certified language.

mbed is written in C++.


mbed is a toy, at the same professional level as Arduino, although ARM does a better job of marketing it as something much more serious.

However, AutoSAR is a fair point. There are a lot of hands in the pot on that, and them coming to an agreement to use C++ is rather surprising - counterpoint: automotive ECU code is absolutely HORRIBLE right now and no one seems to know it. I work in auto, and all the mfgs have moved to a fix-in-the-field model. It's really bad, and I wouldn't touch a first-year new model for any reason.

So while AutoSAR C++ is interesting, this isn’t exactly the group that I want to model myself on.

Fair point though.


The biggest area of progress has been in using those devices as a test bed to stabilize features for embedded work generally. You can now develop for those devices on stable, which is a huge advantage.


The year is 2048 - the year of Rust: literally, figuratively and mentally.


Was this comment meant pejoratively?


> ...But there was a beauty to that. A simple mental model. A sense of control.

There's something to that, I think. And it's still an open question whether the newer languages that are in development at present will succeed in bringing some of that underlying "beauty" back. It's an important issue if we want lower-level programming to be more widely accessible.

(To clarify - the underlying machine model of C is beautifully simple, and newer low-level languages tend to be built on something very much like it anyway. The C language itself is frightfully complex - albeit less so than that other language in the C family that's often conflated with it.)


I wonder if, now that processor speeds are increasing at a much slower rate, there will be a return to that lower level programming to focus on speed improvements through more efficient code.


I bet that's what some people were saying about assembly when C came out :)


People said it about assembly when it came out. The actual (binary or octal) machine codes gave you an intimacy with the hardware that assembly took away. (People actually said that.)


If you like "intimacy with hardware" you can drop down one level below machine code and design a processor on an FPGA with Verilog. Or, to go even deeper, design a custom circuit with SPICE.

C might be the optimal point on the abstraction ladder as far as the trade-off between (exposed) complexity and control goes.


You can go still lower. You can create custom transistors. You might even create custom semiconductor substrates...

... and there we stop. Nobody gets to create custom atoms.


We are always hiring full-time C developers doing exactly what you said. Msg me if you (anyone) are interested.


I see the insides of a lot of startups. The C/C++ based companies generally seem to be solving serious problems with good engineering.

Startups using frothier tech are often (not always) toy companies building toy products (hyper-scale toys are still toys).


This is why I feel that the solution to most of the modern ailments in development is to just put Lua everywhere. All the great stuff of C, and all the new-school shit too.

If you do this, it'll seem soon enough that the Javascript nightmare was just a dream. Takes balls though.


I wish! One of the coolest projects I've worked on used Lua. We wrote the core in C++ and then everything else on top of it in Lua.


I do the same thing with a twist [0], since I'm not very fond of some of Lua's design choices. Lack of a decent type system, the table mess, etc. And I think Forth and Lisp make better glue languages.

Can't stop smiling these days when I see people fighting over which language will rule them all. I spent tens of years searching for that language myself. Time I would rather have spent solving real problems using the best tools.

[0] https://gitlab.com/sifoo/snigl


What would some examples be of Lua addressing modern ailments? I'm also curious to hear why it's such a good fit for those ailments. Thanks.


It's not Lua specifically addressing modern ailments, it's the attitude that taking full control, to put the same common codebase on as many of the target platforms as possible, can be profitable, in light of the vendor mess which is, presumably, what we're talking about here. It can be a very disturbing thing to realise what a few tweaks here and there to package.json might do to one's love life.

Lua is a great, easy-to-use, easy-to-apply language -- with a healthy framework ecosystem, and it is very easy to put it to use in a legacy codebase, since it's C-based, and we all know that C still makes the world go around. However, it's not the fact of Lua, but the fact of 'put our own VM everywhere' that wins, imho.
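For the curious, a minimal sketch of that 'own VM everywhere' idea, assuming Lua 5.3 headers and linking with -llua; the inline script stands in for whatever the host would load from disk:

    #include <lua.h>
    #include <lualib.h>
    #include <lauxlib.h>
    #include <stdio.h>

    int main(void) {
        lua_State *L = luaL_newstate();   /* one self-contained VM */
        luaL_openlibs(L);                 /* standard libraries */

        if (luaL_dostring(L, "print('hello from the embedded VM')") != LUA_OK)
            fprintf(stderr, "lua error: %s\n", lua_tostring(L, -1));

        lua_close(L);
        return 0;
    }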


Lua? People still use that? How about the whole lmod debacle?


James Mickens has a great part of a talk where he talks about this: https://youtu.be/7Nj9ZjwOdFQ?t=1574

A little too close to home.


I was about to say "warning, these are the kind of talks that will revive COBOL.. "


Golang gives me that same C feeling.


The GC and goroutines make it more abstracted from the metal than it looks, while lacking the conveniences of the zero-cost abstractions available in lower-level languages like Rust. The only upside of taking the opposite side of the Rust tradeoff is compile time.


> The only upside of taking the opposite side of the Rust tradeoff is compile time.

Not so, there are a few specialized domains where having tracing GC easily available is genuinely useful. (Pretty much anything having to do with general graphs - the sort of stuff you'd traditionally use LISP for!) Go is a great fit for these use cases.


It still gives me the feel of C, despite the abstractions (which I generally appreciate).

Rust is like something else entirely.


I've been slowly making my way through A Tour of Go and I really like what I've seen so far.


Funny. I looked through the Go book and walked away feeling justified in being diligent with C. Maybe that's because I dislike jumping through other people's hoops when I know better.


Russ Cox's articles do have that simple, well-defined quality of the old days.


I miss it so much..

Btw, if you haven't, check out Casey Muratori's Handmade Hero on YouTube; it will fill you with joy.


>Man, I miss making a living as a C programmer.

Same here. I worked quite a lot in C on Unix and some on Windows (including working on a successful database middleware product on Windows, which was used in some VB+Sybase/Oracle client-server projects) before I got into other languages like Java, Ruby (and Rails) and now Python for quite a while. Great fun working in C, although of course frustrating at times too, debugging weird issues.

Also, somehow, I never found working with pointers (at least for simple to intermediate uses of them) confusing, like some do. (I once taught a C programming class at a large public sector company; while I was explaining the int main(int argc, char *argv[]) stuff, and the pointer-to-pointer stuff, the head of their IT dept., who was in the class, said "now it is 'overhead transmission' :)".) Maybe I didn't have trouble with pointers because I had some hobbyist assembly language programming background from earlier, including learning about different addressing modes, such as direct, indirect, indexed indirect (or vice versa) (on the 6502 processor), etc. I also used to read a lot on my own about microprocessors, computer architecture, and suchlike related areas, even though they were not directly relevant to my higher-level application programming work (hint hint, to kids learning computers today). Working close to the machine is good.

Also, a bit off topic: I kind of independently discovered the concept of refactoring. I was in a nice relaxed state one afternoon at work, after a good lunch (but not a heavy one, so not drowsy), working on some Unix C program (likely a CLI one), and over a period of time I noticed that I had been making small (behavior-preserving) incremental improvements to the code, one after another. In a very relaxed way, without any tension, taking my time for each small change, so I was fairly sure that each small refactoring change did not change the meaning or correctness of the program. Thought that was a good technique after I realized that I had been doing it. Unfortunately I did not keep doing it with other code. It was only some years later that I read about the term "refactoring" and about Martin Fowler's book on the subject. I'm sure others must have discovered the concept similarly. Anyway, interesting incident.
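(Aside: for anyone who does find that declaration frightening, a minimal sketch of what it means in practice; argv is just one level of indirection on top of another.)

    #include <stdio.h>

    /* argv points to an array of char pointers: one indirection to
       reach the array, another to reach each string. */
    int main(int argc, char *argv[])
    {
        for (char **p = argv; *p != NULL; ++p)  /* argv is NULL-terminated */
            printf("arg: %s\n", *p);
        (void)argc;  /* the count is also available directly */
        return 0;
    }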


>In a very relaxed way, without any tension, taking my time for each small change, so I was fairly sure that each small refactoring change did not change the meaning or correctness of the program.

Unlike the rushed, tense way in which some (many?) projects are conducted these days (and plenty earlier too), with people playing whack-a-mole with bugs introduced by said rush, "because we have to ship last week".


The high number of external dependencies that have to be used just to get basic projects off the ground doesn't help either.


Well said. And often buggy or rapidly changing (or both) dependencies too - because the authors are keen on showing they are keeping up with the Joneses (er, times) and so their project is the latest and greatest - never mind if stuff doesn't work, the next version will be even more awesome!!! [1]

[1] (Frequent use of) exclamation marks is obligatory or you're not a (team) player, go home. /s



Debugging was harder compared to what?

I'll take debugging C over modern C++ any day...



