'Kernel memory leaking' Intel processor design flaw forces Linux, Windows

Your nick is noted and appreciated, but I’m not absolutely sure you’re correct in this case. C-suite, yes. Engineering, it’s much harder to conceal lack of talent in a very competitive environment.

5 Likes

High pay is a great way to attract talent, especially if it serves as a way of increasing the size of the pool of available people from which you can choose your next employee.

That said, there are a lot of reasons other than pay that someone might choose to work for, say, GCHQ rather than a dot-com, immunity from prosecution not being the least of these.

Governments attract a lot of talent, for a lot of different reasons. One only needs to look at the Snowden papers to realise that there are a lot of very talented black hats working for the Five Eyes, whatever their personal motivations may be.

4 Likes

Many of which are extremely self-serving, btw. I am sure that NSA and GCHQ have some very good people, but, to be as diplomatic as possible, I have heard that morale at GCHQ at least is not of the highest. I am merely venturing a guess that Google might well have beaten the agencies to it. Apart from anything else, Google wants a share of Wintel’s desktops. Finding a serious weakness in Windows that doesn’t exist in Chrome or Android, or is easily mitigated, isn’t exactly doing them any harm.

3 Likes

I just had a nice thought: could this be used to extract media decode keys?

7 Likes

It took a while, but oblig:

8 Likes

He’s wrong about one thing though; we don’t suck at computers. Compared to just about every other technology we’ve developed, computers have been rather defect-free, given that they’ve been around for 70 years. More than a hundred years after the development of steam engines, boilers and engines were still blowing up with depressing regularity, and over a hundred years after the Diesel cycle was made practical, the things were literally poisoning people in cities.

7 Likes

Having read, reviewed, debugged, and written code for many years, I’d say given the code that we write, it’s kind of a miracle that they do anything successfully. A double miracle when they do what we want them to. It’s ridiculously easy to overlook some tiny detail that could result in catastrophe but might not be noticed for years until just the right combination of circumstances triggers it.

10 Likes

A confession here.

Years ago, early in my career, I had to design a piece of equipment intended to carry out accurate resistance measurements at 170°C ±0.025 degrees. The measurement was carried out in Fluorinert. The Fluorinert was agitated by a big paddle and stationary blades; it was heated by a PWM-controlled 450W DC power supply driving an element immersed in the Fluorinert. It was all controlled by a TMS9995 microprocessor (that long ago) running at 12MHz, the performance being why we didn’t use a more conventional CPU.
The code was entirely written in assembler, carefully checked. It was quite a short program: it fitted into two 32K EPROMs. That’s bits, not bytes.

The thing ran perfectly for 6 months, 24 hours a day, 7 days a week, and then one morning the Fluorinert had boiled dry, all $2500 worth of it.

As you can probably guess, there was a single bad instruction, a jump to a label which was one instruction away from the correct label. Under an error condition the thing was supposed to shut down. It should have jumped to the label that wrote an I/O instruction that killed power to the heater, then went into the main shutdown sequence. Instead, it went into the shutdown sequence without turning the heater off. And the error? The PWM controller had failed hard on, and so when the shutdown sequence set power to zero nothing happened, leaving the heater on full power. So, the tank boiled dry.

Amazingly, the company forgave me, partly because my boss, who was a good guy, pointed out that the electrical engineer who built it, and who had since left, had failed to include a simple temperature trip - which was duly added. But I have never forgotten how disastrous a single bad instruction can be, even in a system which is supposed to have failsafes.
Now think of a modern multi-CPU system running a complex multitasking operating system, such as a phone, with gigabytes of storage and of running code, and consider that my phone currently has an uptime of over a month.

14 Likes

Well, the first round of kernel updates is out from Debian:

https://www.debian.org/security/2018/dsa-4078
https://www.debian.org/security/2018/dsa-4082

I am not noticing any difference in performance :+1:

6 Likes

https://cloudblogs.microsoft.com/microsoftsecure/2018/01/09/understanding-the-performance-impact-of-spectre-and-meltdown-mitigations-on-windows-systems/

With Windows 10 on older silicon (2015-era PCs with Haswell or older CPU), some benchmarks show more significant slowdowns, and we expect that some users will notice a decrease in system performance.

looks at his 4690-equipped iMac, and cries

5 Likes

Microsoft’s apparently having issues making the patches work on older AMD machines, and is blaming AMD’s documentation. Which is odd, considering that the Meltdown patches aren’t supposed to be needed for AMD processors, and the Spectre ones shouldn’t be causing the kernel to crash…

4 Likes

I’ve got a PC that was high-end about 5 or 6 years ago: an i7-3770K, 16GB of RAM, a GTX 680, and a couple of SSDs.

I use it for gaming, mostly, with some low-end MS Office stuff for when I absolutely have to work from home. The security patch has yet to hit, but am I right in thinking that (even with a 30% hit to CPU performance) I’m not going to notice the difference?

I bought it second hand and I’ve been trying to work out what it’s capable of but weirdly, I’ve found that underclocking the GPU by about 10% makes it a lot more stable.

Given that the CPU never seems to be the bottleneck in my setup (I’m looking at you, dusty old GPU with a weird overclock), should I carry on regardless? Do I need to tweak the GPU settings more or less to compensate for the hit to CPU performance?

PC gaming is so weirdly arcane, I’m tempted just to gut a rooster over it and hope for the best. Any more-productive solutions would be greatly appreciated…

4 Likes

No, you need to draw a pentagram on it with thermal paste.

8 Likes

(note: it’s been a long time since I’ve delved deeply into hardware tweaking, so take this with a grain of salt)

I wouldn’t expect tweaks to the GPU to be needed. Most of the performance overhead involved with the patches seems to involve I/O-intensive operations, like large amounts of disk accesses. I’ve seen databases mentioned a lot as being impacted by it… that might also hit things like video editing or 3D rendering. There may be a framerate hit on gaming, but for the most part games don’t appear to do as much of the operations that are hit.

Here’s an article I found on a quick search with some real-world testing:

5 Likes

I have a lot of trouble with opening larger JPX-based PDFs, such as many Internet Archive PDFs, with navigating them, OCRing them, and processing them so they are faster on the Mac and readable on the Kindle. I expect the update will give me more trouble.

For scans, I usually use k2pdfopt, but I often get segfaults with large files. For others, I get better results by converting to grayscale on an older version of OS X.

2 Likes

Thanks; I should have thought to check EG. The article’s pretty inconclusive but the comments on it make for interesting reading. Looks like I’m spending this evening patching everything… :confused:

2 Likes

https://www.washingtonpost.com/world/national-security/the-nsas-top-talent-is-leaving-because-of-low-pay-and-battered-morale/2018/01/02/ff19f0c6-ec04-11e7-9f92-10a2203f6c8d_story.html

The NSA is bleeding people like crazy. If you read the article, one of the cited reasons is the pay compared to… Silicon Valley.

9 Likes

Of course the money offered by Silicon Valley attracts a lot of people but that’s not the same as saying that all the talent is where the money is.

If anything, the fact that the intelligence services are losing enough people to Silicon Valley for it to be considered a problem suggests that there are (or were) talented people working for the government despite the poor pay…

1 Like

It’s not even that there haven’t been defective computers (remember the Cyrix coma bug, and the Pentium FDIV and F00F bugs?). It’s just that there’s rarely been something so catastrophic that it couldn’t be worked around with clever software tricks. Spectre’s breadth across numerous processor architectures is definitely an anomaly.

6 Likes

No one has said that the NSA didn’t have talented people, unless I misread the thread above. The point that was made is that they aren’t the best (at least not any longer). Patriotism and a sense of mission only get you so far, and Snowden didn’t help either, since it exposed their hypocrisy.

6 Likes