

This is about datacenters and HPC, hence the reference to Meta.




Ew humans


Is this the most bizarre tech? These things seem lame as hell to me. Except for the knife, that’s cool. I’ve seen those ultrasonic cutters used in hobby and industry and those work well. Using them in the kitchen I don’t know about, especially given the price, but still a cool thing.


Remember this is the same company that had a huge lead and monopoly in the mobile OS market and they fucked that up royally.
How exactly did you get Internet Explorer working on Arch?

Plants don’t really absorb CO2, at least not permanently. Sure, through photosynthesis they take the C out of the CO2 and use it to grow. The O2 gets released, the CO2 is no more and everyone is happy, right? Well, not really. While this is true in the short term, it isn’t true in the long term. In the long term the plant dies and all of that C gets combined with O2 again, releasing CO2. How long this takes depends on the plant of course: some only live a year or less, others go for decades. And while trees, for example, can live for hundreds of years, that is very much the exception to the rule. Even among trees only a fraction get that old, with most dying within a couple of decades.
The core concept is carbon cycles: https://en.wikipedia.org/wiki/Carbon_cycle
The fast (sometimes called short) carbon cycle is what plants usually do, they grow and they absorb, they die and they release. This cycle works on the order of years or decades. This is super duper overly simplified, but you get the idea.
The slow (sometimes called long) carbon cycle is where carbon is captured in geological processes. Carbon gets absorbed from the atmosphere and is locked away for millions of years. This cycle is why our planet isn’t a super hot house like Venus, as the carbon gets incorporated into the crust and mantle. This cycle works on the order of millions of years.
What humans have been doing for the past 150 years is taking carbon from the slow carbon cycle and putting that carbon into the atmosphere. We’ve mined all sorts of carbon rich materials for our industrial revolution. We’ve taken a huge shortcut, exploding our population and technology with essentially stolen energy.
Extracting carbon from the slow cycle and injecting it into the fast cycle isn’t the same as putting it back into the slow cycle. It’s a short-term hack that might seem to work, but in reality only slows down the process a tiny bit. We barely know how to put carbon back into the slow cycle, and from what we do know, it probably takes at least 5 times as much energy as we gained by releasing it in the first place. So we don’t just need to pay back what we stole, we also need to pay interest and a fine. The debt humanity has been accumulating has gotten bigger and bigger, and the time left to pay has gotten shorter and shorter.
Sure, in theory, if we substantially increase the amount of plant mass on the planet over a long period of time, we can store some carbon in that surplus. In practice, however, this has often led to so-called greenwashing. Companies buy carbon credits to offset the carbon they release; the idea is that the money they pay is used to increase plant mass and thus store some of that carbon. This turned out to be mostly a scam. A big issue was that the increase in plant mass didn’t match the carbon released, so it was a net release and not neutral, let alone capture. Another issue was credits being sold for plant mass that already existed and wouldn’t be increased at all. Often the sellers promised not to decrease it, but that promise was either not backed up by anything, or the plant mass was already protected and wasn’t going to decrease anyway. Yet another huge issue was credits being sold multiple times for the same plant mass, so in fact just a straight-up scam. And like I said, unless the plant mass is increased for at least hundreds of years in a sustainable way, it’s just a small bandaid. Most companies have stopped greenwashing these days, as people have learnt it is a total scam and stricter laws have been applied to prevent these bogus claims.
Another big issue is that the amount we release each second is so huge, we can’t fight it with an increase in plant mass. So whilst I support increasing plant mass and protecting existing nature (for more reasons than just the carbon), it isn’t a solution, and I hate that greed fucked up this entire concept. Another L for capitalism, I guess.
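To put rough numbers on "the amount we release each second": the two constants below are ballpark assumptions on my part (global fossil CO2 emissions of roughly 37 billion tonnes a year, and roughly 20 kg of CO2 absorbed per mature tree per year), not precise data, but they show the scale of the problem.

```python
# Back-of-envelope arithmetic; both constants are rough ballpark assumptions.
ANNUAL_EMISSIONS_TONNES = 37e9  # ~37 billion tonnes of fossil CO2 per year (assumption)
TREE_UPTAKE_TONNES = 0.02       # ~20 kg of CO2 absorbed per mature tree per year (assumption)

SECONDS_PER_YEAR = 365 * 24 * 3600

per_second = ANNUAL_EMISSIONS_TONNES / SECONDS_PER_YEAR
trees_to_break_even = ANNUAL_EMISSIONS_TONNES / TREE_UPTAKE_TONNES

print(f"CO2 released: roughly {per_second:,.0f} tonnes every second")
print(f"Mature trees needed just to break even: roughly {trees_to_break_even:,.0f}")
```

That lands at over a thousand tonnes of CO2 every second, and on the order of a trillion extra mature trees just to cancel out current emissions, which is in the same ballpark as the total number of trees already on the planet.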
Now not all is lost, there is still time to pay off the debt. But we need to reduce the amount we are actively stealing and start thinking about paying some of it back. As for how to do this? I have no idea, but I’m sure it can be done. We need to move away from capitalism, number-must-go-up and consumerism. We must start building products we use for life, not products so cheap they fall apart or are even planned not to last. We need to reduce, re-use and recycle. We need our food to be more sustainable (and might even make it healthier while we are at it). We need right to repair, with manufacturers legally forced to provide documentation, spare parts and repairable designs. And all firmware and accompanying software needs to be open sourced.
But in a world where technology moves so fast and everything is designed around consumerism, I doubt we will be able to do it. Especially in the last few years we have accelerated in the wrong direction. But we shouldn’t give up hope, we can still do it.
And don’t believe in that bullshit argument that we will create an AI that solves global warming for us. We already know exactly how to stop global warming, we just lack the motivation to do so. AI can only accelerate global warming, not stop it or reverse it.


The most annoying thing is: there are so many risks with AI use right now. Why are we talking about some fantasy where the AI turns into Skynet and destroys the world? We are already destroying the world with the tech we have today, why aren’t we talking about that?
We currently have:
We should be working on this shit, not some fantasy. But these AI companies deflect all the blame saying they have an AI safety team working on it.


He mentioned Cuba on his wish list




I really liked that dude who, at the start of his presentation, introduced a little dude he had drawn on paper, gave it a name and did a skit with it. He then beheaded the little dude and proclaimed he was dead. The audience went D: and were shocked and appalled. He then explained that this is exactly what humans always do and how we treat AI: our brains automatically anthropomorphise anything and everything. We assign properties based on feelings, not on what something really is. The audience got it right away, a really convincing demo. I don’t remember who it was, but it was so good to watch it happen with the audience there.


I remember I had a Voodoo card at the time of GTA2. Playing the Glide version of that game (if you could get it working) was like being transported into the future. The resolution was higher, the framerate was higher and smoother, and the lighting effects were insane. Especially on a large CRT with vibrant colors, that game looked absolutely amazing.
Skill issue
MongoDB is normally not public facing right?


He also points out that there are many ISBNs that are “wrong”, but are actually correct in the real world. This is because publishers don’t always understand the checksum and just increment the ISBN when publishing a new book. Many library systems have a checkbox next to the ISBN entry field saying something like “I understand this ISBN is wrong, but it is correct in the real world”.
So just flagging wrong ISBNs would lead to a lot of false positives and would need specific structures to deal with that.
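For reference, the ISBN-13 check digit is just a weighted digit sum mod 10, which is exactly why "just incrementing" the number produces a technically invalid ISBN. A minimal sketch (the function name is mine, not from any particular library system):

```python
def isbn13_is_valid(isbn: str) -> bool:
    """Check the ISBN-13 check digit: the weighted digit sum must be divisible by 10."""
    digits = [int(c) for c in isbn if c.isdigit()]
    if len(digits) != 13:
        return False
    # Weights alternate 1, 3, 1, 3, ... across the 13 digits.
    total = sum(d * (1 if i % 2 == 0 else 3) for i, d in enumerate(digits))
    return total % 10 == 0

print(isbn13_is_valid("978-0-306-40615-7"))  # a valid ISBN-13
print(isbn13_is_valid("978-0-306-40615-8"))  # the "incremented" number fails the checksum
```

So a system that hard-rejects invalid checksums would reject the incremented ISBN even though it is the number actually printed on the book, hence the override checkbox.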


Alright, I’ve got nothing for you then.
I didn’t think the thing would be good. When he got it in, we spent a day running benchmarks and fooling around with it. We compared it to his old workstation and my desktop system. It wasn’t a very controlled environment, we were just having fun and putting the thing through its paces.
I asked my friend yesterday how he likes the machine now that he’s worked with it for some time, and he’s really happy with it. It is faster across the board than his old machine and wonderful to work with. He can set up complex simulations and take it with him to the office. This was always a bit of a pain point in the past, where he would run the simulations at home on his workstation but could then only share the results. Sometimes they would rent server time to run the simulation on a cloud system, but that was a bit of a hassle and had costs. Now he just unplugs his notebook, puts it in the bag and off he goes. He also no longer has two systems from work he needs to regularly log into and keep up to date. Sometimes he had a couple of months where he didn’t need the laptop and it would get fussy over missing updates etc. So for him at least it’s a big win, and to me it shows you can run some pretty heavy stuff on those machines.
Are there faster machines out there? Absolutely. Are there even better notebooks out there? For sure, Apple M3 is faster and M4 is even faster still. And with Apple the performance per watt is better as well. But running Windows on those is (for now at least) not something that’s suitable for work. The security department would certainly not approve of a highly modified version of Windows.
The whole point of this post was Arm chips might be huge in the future and I have to agree. These current gen Arm CPUs are impressive and the next gen will be even more so.
You also seem to suggest that running benchmarks and running applications is somehow not the same thing? Sure, not all benchmarks are realistic; they’re more of an indication of relative performance, to easily compare different systems. And not all applications stress the system the same way. But every benchmark I’ve seen says that notebook is on par with or exceeds the performance of my 5950X desktop, and to me that’s impressive. In the real world, if we are using simple office applications or websites/web-apps, I doubt we would notice the difference in performance; both are equally fast, and perhaps the latency of the internet connection is a bigger factor there. But something like Speedometer shows the real-world browser performance of the laptop is better than on my desktop.
Did the engineers at Qualcomm spend a couple of weeks with a small team to optimise a custom Linux environment for Geekbench and put a boatload of cooling on the chip? Sure, I believe that. They want to show the CPU in the best possible conditions. Is the real world performance still very good? Yes, it is. And there are so many notebook reviews that back this up.
Are there also terrible notebooks with a CPU throttled all the way down and lacking enough cooling? Also yes. But the same can be said for x86 notebooks. Especially Intel notebooks of 12th and 13th gen, those ran hot and slow all the time.
If you are convinced all Arm notebooks suck, I’m not here to change your mind, I’m not here to provide any kind of proof. All I can tell you is I know of one real life case where I saw with my own eyes the thing was pretty damned good. If you don’t believe me, that’s just fine. It’s just a discussion on the internet, don’t take it too seriously.
It’s not like anyone can afford a new laptop in 2026, with the RAM prices being what they are. So it probably won’t be the year of the Arm CPU, no matter how good those chips actually are.


It’s a work machine, he uses it for work. He runs a custom simulation package from work; I can’t name the app without doxxing my friend. It scales well with CPU and memory and uses an optimal number of threads for the number of cores (and it even handles stuff like multiple CPUs or different cores having access to different caches). Running most stuff requires at least 32GB of memory, and for the stuff he does 64GB is an absolute must. Simulations take between 20 mins and 8 hours depending on what he wants from them. The simulation tool does not use the GPU at all, so that’s a non-factor. The tool is x86 based, with an Arm version coming soon™, so there might even be performance improvements in the future. The simulations run faster in all scenarios compared to his old workstation, even the long ones. Cooling is not an issue on this particular machine, and with the many-core load, boosting isn’t a factor anyway.
We ran Speedometer because many laptop reviews include that one and it’s very quick and easy to run. Specifically the 3.0 version because we had a source open with Apple M benchmarks that included that one. The result was somewhere around Apple M2, maybe a bit faster than an M2 but def slower than an M3.
You can try it yourself and read about it here: https://browserbench.org/Speedometer3.0/ It benchmarks regular use in web-apps, as a lot of apps these days are web-apps. So it gives an impression of every-day tasks in websites and web-apps.
I think the results you mention back up what I said? It regularly outperformed my 5950X desktop machine in benchmarks or was at least on par in other cases. My desktop is a big case with water-cooling and when benching the fans do make a bit of noise. That little notebook outperformed it and the fan was barely noticeable.
Like I said, I was sceptical, but that thing impressed me a lot. You can draw your own conclusions, that doesn’t really matter to me. If you think Arm laptops suck by definition, that’s fine, you do you. But don’t say you can’t use them for heavy applications, because at least for some cases that’s just not true. I think the GPU (especially the driver) is a weak point for these systems, so anything that leans on the GPU should probably use a system with a separate GPU and not the built-in one. But this is also true for Intel and AMD, so not really any difference there.


And because these machines don’t use any memory, we will actually be able to afford them!
Wait… what’s that? Hmmm, yes, I understand.
Correction: They do in fact use the same memory all other systems use, so we will not be able to afford them at all.


We have tried a whole bunch of benchmarks and the laptop was on par or faster than the older Threadripper workstation and my 5950X desktop. Most benchmarks were multithreaded, but there was some singlethreaded stuff as well. He uses the system to run simulations for work and that software also runs faster than the old workstation. I can’t run that on my system, so I wouldn’t know how it compares.
I don’t have the exact bench results as we didn’t write them all down, just ran and compared. But I do have a result screenshotted of 27.9 in Speedometer 3.0, which is pretty good I think.
As it’s the laptop from work he runs Windows on it, the new Windows Arm version which wasn’t even fully released at the time he got it. That version seems to be a big step up from the old Arm Windows which was used for budget Bing books. His model is the most high end one, 15" with 64GB of memory and 1TB SSD with a Qualcomm Snapdragon X Elite X1E-80-100 cpu. That one has some pretty good cooling inside.
I was sceptical at first as well; I would have thought the performance wouldn’t be great and there would be compatibility issues. But he’s been using it for a while now and says everything works just fine. Replacing a big-box workstation with a thin and light notebook and having it perform better is pretty wild. There are absolutely faster systems available; for example a 9950X system or a latest-gen Threadripper workstation would be faster. But that would have been more expensive, because those systems cost more to start with and he would then need a separate laptop as well. Having something in a thin laptop form factor be an upgrade in performance is pretty mind blowing.