mprovost's comments | Hacker News

I was in college when v6 was going through the RFC process. In my networking class we had to learn Netware (IPX) and v6, which have both turned out to be equally irrelevant, for different reasons. At this stage, I fully expect to retire having never deployed a single resource using v6.

The terminfo database is one of those thankless xkcd dependencies. In this case, it's been thanklessly maintained since forever by Thomas Dickey.

https://xkcd.com/2347/


It reminds me of the old OSKit project from the Univ of Utah, which was also developed for research and teaching.

https://www-old.cs.utah.edu/flux/oskit/


I don't think either SCCS or RCS tracked merges, so everything looks like a new revision.


Correct. I had used both at work up until around 2005. The idiot large companies I worked at did not believe in Source Code Control. That is the one thing I liked about RCS/SCCS: once I checked out an item, no one could check in their changes unless they contacted me, forcing a coordinated manual merge between us.
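
For anyone who never used it, that lock-based workflow looked roughly like this (a rough sketch; the file name and output are illustrative):

    $ co -l parser.c      # check out and lock; nobody else can check in this file
    RCS/parser.c,v  -->  parser.c
    revision 1.4 (locked)
    done
    $ vi parser.c         # edit while holding the lock
    $ ci -u parser.c      # check in a new revision, keep an unlocked working copy

Anyone else trying to ci the same file would be refused until the lock was released, which is what forced the phone call and the manual merge.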

I tried to get our org onto something for a while, but got massive pushback until 5 or 6 years ago, when they set up a corporate-wide paid GitHub repo.

Before that, around 2005, I found a small group of developers who used CVS, and they allowed me to leverage that for my group. But of course I was the only one who used it.

Back then I guess people loved losing source code, which happened a lot until Git.


I convinced a software company to use a version control system (RCS on a shared disk) back in 1993. To make it work we had to set up a network — Ethernet over (thin) coaxial cable at the time. This was so new to us that we didn't know we needed to use terminators on the two cable ends.


Also, rebases instead of merges wouldn't count as merges.


I don't think the concept of a rebase existed before BitKeeper and Git.


It wasn't called rebasing, but working on a shared branch and updating that branch while having local changes did the same thing.
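
Roughly the CVS version of it, where "cvs update" folds new upstream commits in underneath your uncommitted local edits (a sketch; the module and file names are made up):

    $ cvs checkout webapp           # get a working copy of the shared branch
    $ cd webapp
    $ vi handler.c                  # local changes, not yet committed
    $ cvs update                    # merge in everyone else's commits: M = merged, C = conflict
    $ cvs commit -m "fix handler"   # publish once the merge is clean

Conceptually that's a rebase: your unpublished work ends up sitting on top of the latest shared history.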


There's a really good emulator for the iPhone! Back when I bought it, it came from HP themselves, but a few years ago they sold it to another company which actually maintains it. They just released a major new version a few weeks ago.


Yes, the new version is really nice now in portrait mode. Nice app. Now if only we could have tactile buttons...


I have that and do like it even though it lacks that HP tactile feel.


Yes but they're using this fund to prop up their core business (and share price) by artificially creating demand for their own products. Most of the money that they invest comes back to them when these companies buy GPUs.


I wouldn't say they are artificially creating the demand. They artificially create capacity to make a purchase by enabling their customers to pay with ownership of their business rather than with money. It's just an alternative financing scheme.


> They artificially create capacity to make a purchase

If the company did not intend to purchase anything, but Nvidia used the investment to "incentivize" a purchase, then this is artificially creating demand where there used to be none. It's very different from Nvidia letting a company that already wanted Nvidia products pay for them with stock.


Are you implying that a $1B investment gives Nvidia control over Nokia procurement?


Then they are acting as a lender which can be problematic.


I think this will continue. They can't change 3GPP's vision with just Nokia. They need to bribe other companies. Ericsson is the other big vendor. I think there is a possibility of that. However, Huawei is impossible. Who is gonna provide a GPU to them? Therefore, they simply can't just put a GPU on every base station around the world.


>Therefore, they simply can't just put a GPU on every base station around the world.

I don't think that is what's happening here. Base stations have been power-limited for quite some time, and part of the whole 5G / Cloud RAN promise was moving a lot of the processing off the base station. Even ignoring GPUs, a lot of the current stack fits Nvidia's portfolio, especially DPUs. Maybe Nvidia has figured out a way to use CUDA and have it perform better than Ericsson and Huawei.

Nokia is also the smallest of the three and has been in decline for quite some time. Part of me also wishes Nvidia would just buy Nokia and start competing against Ericsson and Huawei.


You are right on your points. I agree on the DPU, but that's in the network stack. I think Nvidia wants to get into the PHY and MAC layers (cuPHY, etc.), and that's where I find it unlikely, due to the cost of latency. If Nvidia had wanted to buy Nokia, they could've already completed the deal. It's a possibility in the future, but this $1B investment kind of showed that they are more interested in creating artificial demand for their GPUs than in diversifying their product portfolio. I agree with you that Nvidia should just buy Nokia.


Strategic Enablement

Yup, give them $1B so they can build out AI DCs stocked with $1B of Nvidia chips.


More importantly than gaining a client for Nvidia's AI chips, this investment gives the company a solid foothold in a competitor to Broadcom in the wireless, datacenter and networking solutions space. I wouldn't be surprised if Nvidia eventually scoops up all of Nokia.


> I wouldn't be surprised if Nvidia eventually scoops up all of Nokia.

That is what I wish would happen as well.


Except Nokia use Broadcom chips in pretty much all their datacentre and cheapo networking products.


Buy in, let them use your money to buy your product back from you. Then if you think they'll actually succeed you make money, and if not you can slowly dump that stock back onto the market and end up ahead. You've basically manufactured a customer at that point.


Exactly, the risk is extremely small. If they fail, there will be others too, which means that WSBros can bundle and derivatives their way into at least a neutral exit.


This is a shorter version of Neal Stephenson's metaphor of Unix as a Hole Hawg drill from "In the Beginning was the Command Line".


It's the bean soup theory ("what if I don't like beans") in action.


Which I believe is an effect of main-character syndrome, or just good ol' classical narcissism.


Ed is the standard text editor.



Doing edlin as a high-school typing exam was already enough, and ed wasn't much better, which was an opinion shared by our customers back then.


And not installed by default in many distros. FML.


> And not installed by default in many distros. FML.

ed (pronounced as distinct letters, /ˌiːˈdiː/)[1] is a line editor for Unix and Unix-like operating systems. It was one of the first parts of the Unix operating system that was developed, in August 1969.[2] It remains part of the POSIX and Open Group standards for Unix-based operating systems

So it is a bug in those distros.
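
For anyone who has only read the jokes, a minimal ed session looks like this (a sketch, assuming notes.txt already contains the single 9-byte line "line one"):

    $ ed notes.txt
    9
    a
    remember to buy milk
    .
    w
    30
    q

"a" enters append mode, a lone "." ends it, "w" writes the file, and "q" quits. The bare numbers are the byte counts ed prints when it reads and writes the file.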


The point is that we always exist at a point on a continuum, not at some fixed time when the current standard is set in stone. I remember setting up Solaris machines in the early 2000s with the painful SysV tools they shipped with, and the first thing you would do is download a package of GNU coreutils. Now those utils are "standard", unless of course you're using a Mac. And newer tools are appearing (again, finally), and the folks saying to just stick with the GNU tools because they're everywhere ignore all of the effort that went into making that (mostly) the case. So yes, let's not let the history of the GNU tools dictate how we live in the present.

