What market share would they have without offering the free tier?
Much lower than what they have now, and that would make for a more decentralised and resilient internet
> and that would make for a more decentralised and resilient internet
The only people who say that haven't run a site on the open internet in the last decade plus. It's such an ignorant take that it's hard to take anything you say seriously.
I'm not advocating for having no protections at all. Without Cloudflare giving away protection for free, it's entirely possible that we would have multiple smaller providers offering protection at a fair price, so maybe only a smaller fraction of the internet goes offline the next time Cloudflare pushes a bad configuration.
I think the fine is wrong, but the attempt to weaponise JD Vance and Elon Musk doesn't look good at all.
The next time they see something they don't like hosted/protected by Cloudflare, they will only have to ask more or less nicely, and there is a good chance Cloudflare will handle it for them.
Happy to see this happening. You know what would make me even happier? Having open source alternatives available to use as soon as I buy the device, not only after it's discontinued
Has anyone noticed their employer actually cutting back on employee hardware purchases? Because if new laptops are still being handed over to developers on a 2–4 year cycle, then probably not.
Game developers might have to do something though if high-end GPUs are going to end up being $5000.
At least so far the RTX 5090 seems to be available and at the same price it’s been at for the past six months (around $3000). I’m not sure when you’d see GPUs affected by the RAM price increases.
Higher in some respects (bandwidth), lower in others (latency, even though ordinary DDR5 is already no speed demon there and LPDDR5 is worse). At least from the spec sheet, these kinds of RAM are so different that I don’t really understand how demand for one can cause a shortage of the other, unless they are competing for the same manufacturing lines.
FWIW: GDDR is not higher latency than DDR. It just seems that way because the GDDR interface clock is much higher, so the number of clock cycles is higher too. But in terms of wall-clock time, the latency is very similar.
Which makes sense: the latency is determined by the underlying storage technology and the way that storage is accessed, which is the same for both.
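To put rough numbers on that (purely illustrative figures, not taken from any datasheet), wall-clock latency is just cycles divided by the interface clock, so a part with twice the clock and twice the cycle count lands in the same place:

```python
# Illustrative numbers only -- not from any specific datasheet.
# The point: a higher interface clock means more cycles of latency,
# but roughly the same wall-clock latency.

def cas_latency_ns(cas_cycles: float, clock_mhz: float) -> float:
    """Convert a CAS latency in clock cycles to nanoseconds."""
    return cas_cycles / clock_mhz * 1000

# Hypothetical DDR5-ish part: ~3000 MHz command clock, CL 30
ddr5 = cas_latency_ns(cas_cycles=30, clock_mhz=3000)

# Hypothetical GDDR6-ish part: twice the cycle count at twice the clock
gddr6 = cas_latency_ns(cas_cycles=60, clock_mhz=6000)

print(f"DDR5-ish:  {ddr5:.1f} ns")   # -> 10.0 ns
print(f"GDDR6-ish: {gddr6:.1f} ns")  # -> 10.0 ns
```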
I've declined the refresh I'm overdue for. My 2021 model MBP has 32GB and a 1TB SSD. They're currently giving out the base model Air: 16GB and 256GB. No thanks.
We used to get whatever was most appropriate for our role, now we get whatever is cheapest.
Sure, except this time also add the US pressuring Korean companies not to sell their old equipment to Chinese manufacturers, so supply couldn't actually keep up. But no, no, it's all OpenAI's fault for behaving like the capitalist swine they are. Both suck, but one has a long-lasting impact; the other is just what capitalists have always done and will continue to do in the future.
Agreed - it's more likely to lead to poor user experience than anything else. Most end-user facing software/service companies will probably bet on the DRAM price peak being temporary.
Software takes a lot of time to build. Codebases live for decades. There's often an impossibly large cost in starting over with a less wasteful architecture/language/etc. Think going from an Electron/Chromium app to something built using some compiled language and native OS GUI constructs that uses 10x less resources.
The impossibly large cost is the difference between hardware and software returns.
Hardware by nature forces redesigns, whereas with software it's always possible to just keep building on top of old, bad designs, and it's so much cheaper and faster to do so. That's why hardware is 10,000x faster than 30 years ago, while even simple word processors are debatably faster than 30 years ago. Maybe even slower.
Hardware isn't much better actually. There isn't a good way I can show you this, but every x64 CPU contains an entire ARM CPU whose job is to initialize the x64 CPU. And of course it runs two operating systems - TrustZone and Minix.
The ARM core starts up, does crypto, loads the SecureOS and the BIOS, then it starts the x86 CPU - in 16-bit mode! Which then bootstraps itself through 32-bit and then 64-bit mode.
So in the first couple of seconds of power-on, your CPU is at various points ARM, i386, x86, and x86_64.
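Loosely sketched in code (a toy illustration only; the stage names and ordering are my own paraphrase of the above, not vendor documentation), that handoff looks something like:

```python
# Simplified, illustrative sketch of the boot handoff described above.
# Stage names and details are approximate, not taken from any vendor docs.

BOOT_SEQUENCE = [
    ("embedded ARM core powers on",       "runs its own small secure OS"),
    ("verify and load firmware",          "crypto checks on SecureOS + BIOS/UEFI images"),
    ("release main x86 cores from reset", "they wake up in 16-bit real mode"),
    ("switch to 32-bit protected mode",   "set up descriptor tables, enable protection"),
    ("switch to 64-bit long mode",        "enable paging, jump to the 64-bit entry point"),
    ("hand off to bootloader / OS",       "finally running the code you installed"),
]

for step, (stage, detail) in enumerate(BOOT_SEQUENCE, start=1):
    print(f"{step}. {stage}: {detail}")
```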
> The ARM core starts up, does crypto, loads the SecureOS and the BIOS, then it starts the x86 CPU - in 16-bit mode! Which then bootstraps itself through 32-bit and then 64-bit mode.
Well, what if I want to run a 16-bit OS?
Also, I wonder if the transistor count of a literal entire 8086 processor is so small relative to the total that they just do that.
Compatibility mode doesn't work by having a separate 16-bit core. It's random bits of spaghetti logic to make the core work like a 16-bit core when the 32-bit flag isn't set.
There isn't a separate 8086 core in every x64 core. The whole core has "if/then" spaghetti logic scattered throughout to alter its behaviour based on being in 16-bit mode.
At best, they might have been able to confine the needed logic patches to the instruction-decoding front end.
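As a toy illustration of that kind of mode-dependent branching (entirely hypothetical; real decoders are hardware, not Python, and the details differ), the same instruction can resolve to different operand sizes purely based on mode bits rather than being routed to a separate 16-bit core:

```python
# Toy illustration of mode-dependent decode logic -- entirely hypothetical,
# not based on any real CPU's implementation.

def decode_operand_size(opcode: int, long_mode: bool, protected_mode: bool,
                        operand_size_override: bool) -> int:
    """Return the operand size in bits for a generic ALU instruction."""
    if long_mode:
        # 64-bit mode: default operand size is 32 bits
        size = 32
        if operand_size_override:
            size = 16           # legacy prefix still flips things around
    elif protected_mode:
        size = 32 if not operand_size_override else 16
    else:
        # 16-bit real mode: the 8086-era default, with the override flipping it
        size = 16 if not operand_size_override else 32
    return size

# The same "instruction" behaves differently depending on the mode flags:
print(decode_operand_size(0x01, long_mode=True,  protected_mode=True,  operand_size_override=False))  # 32
print(decode_operand_size(0x01, long_mode=False, protected_mode=True,  operand_size_override=False))  # 32
print(decode_operand_size(0x01, long_mode=False, protected_mode=False, operand_size_override=False))  # 16
```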
It began life as an "out of band" way to administer servers, so that an ops team could remotely do everything (other than actual hardware changes) that would otherwise need a person standing in front of the server in the datacenter, poking commands into a keyboard.
It then grew in responsibilities to also support the "secure boot" aspect of system startup, and beyond a certain Intel CPU generation (I do not remember which), it exists in every Intel CPU produced.
I remember being flabbergasted when I worked at the Open Source Development Lab and we got our first Itanium system in: a multi-core, multi-rack NEC system, with its own Windows PC that had to be booted up in order to get to Linux.
This. In my tenure I've worked on projects that ended up being blocked by absolutely everything; then there's a reorg, and when you finally need to do something, technical debt keeps you at a leisurely pace.
If you want to do absolutely the least, go to an office (RTO for the win!!!), but make sure your team is on the other side of the pond. You get some annoying early/late meetings and the rest of the day to rest.
As long as you have something to say at standup (it doesn't matter if it's meaningful), you're golden.
Sarah Wynn-Williams - Careless People: A Cautionary Tale of Power, Greed, and Lost Idealism
I'm not a Facebook fan but the more I read the book the more I dislike them.
The author seems well-intentioned, but the overall feeling is that she was part of the system. I know it's difficult, but she followed along, enabling what the Facebook leadership wanted to do.
I found that book very interesting. The author doesn't come out of it looking good. That made the book feel more credible to me, although considering that it's by and about a group of people who aren't at all trustworthy, I still take the details with a large grain of salt.
What do you want to run? Small services with a handful of users? Anything can serve them.
Media libraries? As long as you have a CPU with QuickSync, you're good for on-the-fly transcoding, and the real limiting factor becomes storage.
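As a minimal sketch of what that looks like in practice (assuming an ffmpeg build with Quick Sync/QSV support and an Intel iGPU; the file names and bitrate are placeholders, not recommendations):

```python
# Minimal sketch of a hardware-accelerated transcode using Intel Quick Sync.
# Assumes an ffmpeg build with QSV support and an available iGPU;
# input/output paths and bitrate are placeholders.
import subprocess

cmd = [
    "ffmpeg",
    "-hwaccel", "qsv",            # decode on the iGPU
    "-i", "input.mkv",            # placeholder source file
    "-c:v", "h264_qsv",           # encode with the Quick Sync H.264 encoder
    "-preset", "fast",
    "-b:v", "4M",                 # streaming-friendly target bitrate
    "-c:a", "copy",               # pass audio through untouched
    "output.mp4",
]
subprocess.run(cmd, check=True)
```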
A TinyMiniMicro https://www.servethehome.com/introducing-project-tinyminimic... used PC is more than adequate for most workloads (except for local AI, or if you want a huge amount of storage).
Last time I checked the prices were in the ballpark of $100/$150 for a working machine.
New machines with an N-series Intel CPU are in a similar ballpark.
If I am being honest: my father is in the broadband/bandwidth business where I live, and I recently told him that I was thinking of opening up a simple, extremely low-cost (in hardware) cloud/mini datacenter, or at least tinkering with the software side of these things (Proxmox, Incus and a lot more). He was interested in converting his office into a rack, and he can get a lot of static IPs and power, so I have been thinking about this workflow, and the biggest problem to me seems to be the hardware.
He is really excited about this project; he brought me newspaper clippings the other day showing that my idea has potential, which is nice. I have given him the task of asking his contacts in our small city about hardware, auctions and rentals, and getting more information about some cheaped-out specs to start with, as I don't want us to invest in a lot of hardware up front, but rather reinvest the profits and maintain clear transparency.
Do you think we should postpone this idea for 3-4 years? I honestly think so, because I would love to build my own software, and within those years I can try out other providers' pain points and build a list of the nice features I like (if you know of any, please let me know, as I am still making the list).
I am not aiming at AI workloads at all, but rather simple compute (even low-end compute starting out).
Power consumption, by comparison, isn't that much of an issue, I think.
Honestly, I am thinking that we should wait out this cycle of rising hardware prices so we can buy at the start of the next cycle, but I am interested in whether NUCs would be good enough for my workload, so I can point my father in the right direction. I am not that expert on the hardware side of things, so I would really appreciate it if you could tell me more about it and what the best use cases for those machines are.
I saw from your article that Chick-fil-A uses Intel NUCs to run their Kubernetes clusters, so I am assuming they can be good enough for my use case as well?
Also, there is no guarantee that I end up doing this; it's still more of an idea than anything, and I would probably do some projections to see if it's worth it before we get ourselves some basic cheap equipment to even start. If we do, we would probably start out with homelab equipment. Just to be clear, storage compactness isn't that big of a worry starting out, as I think his office is big enough.
Honestly, right now, in my understanding RAM prices are what really kill the project and make me want to focus on the software side of things (building things myself and learning more) for a few years, so that we can build the hardware later. I think this is the way to go, but after talking to my father, who was super excited about it, I am not exactly sure. Still, it might give him a few years of spare time to become more familiar with the hardware side of auctions etc., so he can find us better deals too. Please share any advice that you (or anyone) has about this, as I would love to pass it along to my father so he can make some queries in the local markets and with his contacts as well.
The Plex rug-pull from excellent software to commercial gimmick happened years ago when they removed your ability to search your personal media library.
I assumed that they were being forced by the copyright mafia, but they’re perfectly capable of making these decisions on their own.