NIST Issues Call for "Lightweight Cryptography" Algorithms

This is interesting:

Creating these defenses is the goal of NIST’s lightweight cryptography initiative, which aims to develop cryptographic algorithm standards that can work within the confines of a simple electronic device. Many of the sensors, actuators and other micromachines that will function as eyes, ears and hands in IoT networks will work on scant electrical power and use circuitry far more limited than the chips found in even the simplest cell phone. Similar small electronics exist in the keyless entry fobs to newer-model cars and the Radio Frequency Identification (RFID) tags used to locate boxes in vast warehouses.

All of these gadgets are inexpensive to make and will fit nearly anywhere, but common encryption methods may demand more electronic resources than they possess.

The NSA’s SIMON and SPECK would certainly qualify.
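
For a sense of what "lightweight" means here: Speck is pure ARX (add, rotate, xor), so a software implementation fits in a few lines and needs no tables. A minimal sketch of the smallest variant, Speck32/64, checked against the test vector from the Simon and Speck paper:

```python
# Speck32/64: 16-bit words, 22 rounds, rotations 7 and 2. ARX only --
# no S-boxes, no lookup tables, almost no RAM.
MASK = 0xFFFF

def ror(x, r): return ((x >> r) | (x << (16 - r))) & MASK
def rol(x, r): return ((x << r) | (x >> (16 - r))) & MASK

def speck_round(x, y, k):
    x = ((ror(x, 7) + y) & MASK) ^ k
    y = rol(y, 2) ^ x
    return x, y

def speck32_64_encrypt(pt, key):
    """pt = (x, y); key = four 16-bit words as printed in the paper."""
    l = [key[2], key[1], key[0]]      # l0, l1, l2
    k = [key[3]]                      # k0
    for i in range(21):               # key schedule reuses the round function
        l_next, k_next = speck_round(l[i], k[i], i)
        l.append(l_next)
        k.append(k_next)
    x, y = pt
    for rk in k:                      # 22 encryption rounds
        x, y = speck_round(x, y, rk)
    return x, y

# Test vector from the Simon and Speck paper:
assert speck32_64_encrypt((0x6574, 0x694C),
                          (0x1918, 0x1110, 0x0908, 0x0100)) == (0xA868, 0x42F2)
```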

Posted on May 2, 2018 at 6:40 AM

Comments

Dave May 2, 2018 7:23 AM

We don’t need lightweight cryptography, we need lightweight protocols. The crypto algorithm is never more than a tiny fraction of the overall code and overhead required to secure a system; it’s the rest of the protocol that’s the problem. I mean, it’s a fun area to play in, but unless they also put out a call for lightweight security protocols to replace the current collection of heavyweight, super-heavyweight, and ultra-heavyweight ones, they may as well not bother.

MGD May 2, 2018 7:46 AM

Please consider this …
“Champagne is a wine of legend. The entire history of the Veuve Clicquot House is marked by mythical wines, all of which respect the demand for quality that was the force that drove Madame Clicquot.
True to this heritage, the House is proud of its motto: “Only one quality, the finest”.”

M. Turan May 2, 2018 8:35 AM

The block ciphers Simon and Speck don’t qualify, as they are block ciphers, not AEAD schemes. (At least, they need to be merged with an AE mode).
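
To make the distinction concrete, here is a sketch of the AEAD side using AES-GCM as a stand-in (via the Python `cryptography` package); a standardized Simon or Speck would need an equivalent wrapper before it addressed the IoT use case:

```python
# A block cipher alone permutes one fixed-size block; an AEAD scheme adds
# a nonce, handles arbitrary lengths, and authenticates the result.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.exceptions import InvalidTag

key = AESGCM.generate_key(bit_length=128)
nonce = os.urandom(12)            # 96-bit nonce; must never repeat per key
aad = b"device-id=42"             # authenticated but not encrypted

ct = AESGCM(key).encrypt(nonce, b"sensor reading: 21.5 C", aad)
# ct is ciphertext plus a 16-byte tag; any bit flip is detected:
try:
    AESGCM(key).decrypt(nonce, ct[:-1] + bytes([ct[-1] ^ 1]), aad)
except InvalidTag:
    print("tampering detected")
```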

PeaceHead May 2, 2018 9:08 AM

I think what the IoT needs most along these lines is a “skeleton key” style OFF SWITCH. Preferably, it would be like a universal TV remote with built in manufacturer scan codes. It would scan through the list of all known model groups and send the POWERDOWN / POWEROFF / SHUTDOWN / SHUTOFF / STANDDOWN / YANK THE PLUG command!

Even better, it ought to be available at all dollar stores for 10 dollars or less and similarly at Wal-Marts, Targets, Best Buys, WalGreens, and all other convenience and consumer electronics stores.

There will be times when we can’t hold the power button down. So just in case, it would also include the capability to send an EMP (electromagnetic pulse) to knock the gear offline.

So, as you can see, problems have solutions.

–May Peace Prevail Within All Realms of Existence.

uh, Mike May 2, 2018 10:17 AM

One standard crypto method is a lot like having one standard national authenticator, isn’t it?
Wouldn’t we benefit from using a small number of very different crypto methods, so we have alternatives if^H^H when one fails?

echo May 2, 2018 10:18 AM

While the US has become a regime where business does the outsourced dirty work, the US, for good or ill, does take a position and spread opportunity to one degree or another. In contrast, in the UK, GCHQ are playing feudal lords and “regulatory capture” with business at the start-up and recently-qualified-graduate level. We never get well-publicised and easy-to-understand specifications for action in the UK (and nobody seems to know how to write a manual either), let alone opportunities like this for creating public discussion and self-starting education, or anything, heaven forbid, which can be successfully commercialised.

I cannot think of one even half-decent security-orientated blog in the UK that isn’t either heavily bureaucratic or academic.

Bauke Jan Douma May 2, 2018 12:51 PM

What’s with the promotion of the NSA’s ‘SIMON and SPECK’ lately?

I am trapped in a endless loop May 2, 2018 12:55 PM

Oh come on, people, @Bruce was being snarky. Now, I admit that is a tad unusual for him, but still… obvious troll is obvious.

Ratio May 2, 2018 1:05 PM

@I am trapped in a endless loop, no, @Bruce isn’t being snarky, I don’t think.

Spooky May 2, 2018 1:16 PM

NSA generates weakened algorithms, the world politely declines, so NIST generates a manufactured demand for the same algorithms as part of a de facto standards setting process. And thus, all of Clive’s earlier concerns about compliant implementations requiring automatic fallback to shortened block/key sizes during autonegotiation are now realised…

On the one hand, better security in the world of IoT is desperately needed; we’ve talked about that extensively here. So, encrypting and signing all sensitive payloads would seem to be a relatively good start in that direction (irrevocable keys in cheap, mass-produced devices are problematic tho). On the other hand, if common devices begin generally encrypting all communications, you are prevented from ever knowing what data is being collected and exfil’d over your local network. At some point, you can easily imagine every conceivable device demanding inscrutable network access in the name of security while conveniently using the same channel for monetization and surveillance. If I have to constantly worry about whether the refrigerator and microwave are conspiring with my thermostat to dox me, I will grudgingly activate a low-yield EMP and happily return to the joys of 19th century living… 🙂

Our future smells a lot like manure.

Cheers,
Spooky

Alejandro May 2, 2018 2:00 PM

Wow. This sounds familiar.

Anyway, I propose allowing NSA-NIST to issue “Stickers of Trust -SoT” to apply to all IoT devices certifying you can trust the device because: NSA/NIST. Very cheap, very lightweight.

willmore May 2, 2018 3:54 PM

I’m with @Dave on this. AES, for example, can be performed on even the smallest of 8 bit microcontrollers at kilobit/s speeds. That’s more than enough for the amounts of data these devices will be passing. If we need anything, it’s the communications protocols to be lighter weight. TCP/IP is huge and way more than these little things need. Layer HTTP on that, maybe some JSON and things balloon out of control.
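
To put rough numbers on the ballooning (the field layout below is made up for illustration):

```python
# The same sensor reading as a typical JSON payload vs. a packed binary
# record; neither includes the far larger TCP/IP and HTTP framing costs.
import json, struct

reading = {"device_id": 1042, "temp_c": 21.5, "battery_mv": 2970, "seq": 77}

as_json = json.dumps(reading).encode()
as_binary = struct.pack("<IfHH", 1042, 21.5, 2970, 77)  # hypothetical layout

print(len(as_json), "bytes as JSON")   # ~66 bytes
print(len(as_binary), "bytes packed")  # 12 bytes, roughly 5x smaller
```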

Come on, NIST, you’re solving the wrong problem.

Or, maybe “obvious NSA water carrying is obvious”, NIST.

Festus May 2, 2018 6:38 PM

@Spooky

There are very few in the western world who could live in an 1800s society. Too accustomed to electricity, plumbing, food in a grocery store, medical care, dentistry, a/c, central heat, et cetera. You wouldn’t like it either.

Craig McQueen May 2, 2018 7:04 PM

I’ve made an implementation of 128-bit AES for small microprocessors (aes-min). It’s intended to be as small as possible yet still reasonably fast. I would consider AES to be suitably light-weight for small embedded systems, suitable for IoT. So I don’t see a need for anything smaller.

So far, I haven’t implemented, or seen implementations of, hash algorithms (e.g. SHA-2 or SHA-3) on small microprocessors, but I’m very interested to try it.

I’d also be interested to try implementing the 128-bit Galois multiply used in AES-GCM.
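
For anyone curious, the multiply itself is small; a minimal, unoptimized sketch of the bitwise version from NIST SP 800-38D (real 8-bit implementations usually trade RAM for speed with per-byte tables of the hash key):

```python
# GF(2^128) multiply as GHASH defines it: reduction polynomial
# x^128 + x^7 + x^2 + x + 1, with GCM's reflected bit ordering.
R = 0xE1000000000000000000000000000000

def gf128_mul(x: int, y: int) -> int:
    z, v = 0, x
    for i in range(128):
        if (y >> (127 - i)) & 1:                 # walk y's bits MSB-first
            z ^= v
        v = (v >> 1) ^ R if v & 1 else v >> 1    # multiply v by the field's x
    return z
```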

What seems much more challenging for small microprocessors is public-key encryption components that run at a reasonable speed.

Nick P May 2, 2018 7:37 PM

@ Bruce

You might find this comparison of eSTREAM algorithms implemented on 8-bit microcontrollers interesting, as a look at which could replace AES. I’m glad NIST is trying to generate more, though, as I’m sure stuff purpose-built for the resource constraints can get more work done per cent and per watt. That market minimizes those two things at about any cost. So, the security has to cost about nothing for widest deployment outside regulations.
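
For a flavor of why ARX stream ciphers do well on small CPUs: Salsa20, an eSTREAM software-portfolio cipher, builds its whole core from quarter-rounds like the sketch below. Just 32-bit add, rotate, xor; no tables, so no flash-resident S-boxes and almost no RAM:

```python
# One Salsa20 quarter-round (rotation constants 7, 9, 13, 18 per the spec).
M = 0xFFFFFFFF

def rotl32(x, r):
    return ((x << r) | (x >> (32 - r))) & M

def quarterround(a, b, c, d):
    b ^= rotl32((a + d) & M, 7)
    c ^= rotl32((b + a) & M, 9)
    d ^= rotl32((c + b) & M, 13)
    a ^= rotl32((d + c) & M, 18)
    return a, b, c, d
```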

Sam in Sydney May 2, 2018 9:18 PM

I agree with Craig – what we really need for IoT is usable PKI and lightweight protocols. It’s very easy to end up paying by the byte for data, including packet headers. GPRS, for example, is still widely used and also widely billed by the byte. But if you can live with 1 MB/mo you can get it for about a dollar/euro a month. Then you set up one TCP session to send one 16-byte AES block and burn over a kilobyte. The payload percentage in that transaction is a joke (most IoT uses UDP partly for that reason).
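
A back-of-envelope version of that payload percentage (sizes are typical minimums, ignoring retransmits and any TLS):

```python
# One 16-byte payload over a short-lived TCP session, counting headers.
IP, TCP = 20, 20                 # minimal header bytes per packet
handshake_pkts = 3 + 4           # 3-way open plus 4-segment close

wire = (handshake_pkts * (IP + TCP)      # empty control packets
        + (IP + TCP + 16)                # the data packet itself
        + (IP + TCP))                    # the ACK coming back
print(wire, "bytes on the wire for 16 bytes of payload")  # 376, ~4% useful
# Add a TLS handshake and the total easily passes a kilobyte -- hence UDP.
```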

We’d also be better off if we didn’t constantly have to poll in order to be reachable. Many devices constantly pump garbage out in order that the phone app can contact them if it needs to. Since there’s no standard timeout for UDP “connections” a lot of devices just poll every 10 seconds to make sure. Or every 60. Even if the server at the other end has stopped responding (like Nest servers did).

Thoth May 2, 2018 11:47 PM

@Spooky, Clive Robinson, all

The NSA must be feeling the hurt and sourness of not being able to get into the RFC standards, so, as @Spooky mentioned, they want to try again with weakened and possibly backdoored algorithms, by churning the commercial-demand engine that NIST is so good at driving.

They simply need to nudge NIST after their defeat at the RFC standards meeting. Once there is a NIST FIPS PUB with Simon and Speck as the default algorithms for the lightweight cipher class, the RFC process would have no choice but to accept Simon and Speck, the reason being that companies wanting to conform to national regulatory demands would have to conform to the FIPS PUB, thus creating a real-world flood of demand for FIPS-PUB-based Simon and Speck.

Because FIPS and CC are globally recognized, NSA manages to ride on NIST to ensure that everything is backdoored and accessible in a NOBUS fashion, to NSA only.

This will effectively force the RFC process to reconsider its stance and accept the NSA-controlled, NIST-standardised Simon and Speck as a must-include in the RFC standards, if the RFC process is to remain relevant as a standards body.

I hate to praise them, but this is a brilliant move: an indirect attack on the RFC process, thumbing their noses at it for rejecting their possibly backdoored and weakened algorithms.

Before this turns into reality, I think a lot of noise must be made against NIST about the uselessness of this so-called lightweight cipher category, with the focus put instead on flexible ciphering as mentioned by @Clive Robinson and lightweight protocols as mentioned by @Dave.

Stephen Chasko May 3, 2018 12:01 AM

Resource-constrained devices have been running hashing and encryption algorithms for thirty years, and asymmetric algorithms for over fifteen years, and that includes even energy- and memory-constrained devices such as RFID tags. It’s unclear what the motivation may be, but it is unlikely to be mitigating risk for IoT devices.

Clive Robinson May 3, 2018 3:07 AM

@ Willmore,

    Come on, NIST, you’re solving the wrong problem

They have been for years in my opinion.

Base crypto algorithms are nice, but there is a whole heap of things above them that needs to be fixed and standardised, and thus made useful.

What we need is a traditional framework or “bus structure” that allows different algorithms, modes and protocols to be plugged in and used. But more importantly, a reliable method to deprecate/remove/replace crypto algorithms, modes and methods, or put new ones in.
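
In miniature, such a bus might be no more than a registry in which deprecation is data rather than a code rewrite; a sketch (names and statuses purely illustrative):

```python
# Cipher suites are plugged into a table; negotiation consults the table
# and refuses deprecated entries outright -- no silent fallback.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Suite:
    name: str
    encrypt: Callable            # (key, nonce, data) -> ciphertext
    status: str                  # "preferred" | "allowed" | "deprecated"

REGISTRY: dict[str, Suite] = {}

def negotiate(peer_offers: list[str]) -> Suite:
    for wanted in ("preferred", "allowed"):
        for name in peer_offers:
            suite = REGISTRY.get(name)
            if suite and suite.status == wanted:
                return suite
    raise ValueError("no acceptable common suite; refuse to connect")
```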

Let’s be honest: the history of crypto algorithms and their modes suggests they have at best a 25-year life cycle before they have to be significantly upgraded for key size, or have question marks hanging over their security due to the progress of research, and thus need to be changed or replaced.

This in turn means, of course, that the underlying electronics has to be capable of being upgraded, which in most modern embedded systems is not the case. If you look at the likes of implanted medical devices, utility meters and Industrial Control Systems, these often have “normal lifetime” expectations of thirty years, plus or minus a decade. Which means they will in all probability need upgrading at least twice in their expected working life.

This requires a high-availability update process that is above all safe, reliable and secure. Current patching methods, including those with “signed code”, are a very, very long way from this. To a lesser extent you need the same sort of process for Key Management (KeyMan), and we are not even close to that, as we have found the likes of CAs are at best “paper tigers”, promising everything but delivering very little.

We really need progress on these things rather more than we need new base crypto algorithms. As others have pointed out, most if not all modern CPUs used in embedded systems are quite capable of doing AES and, to a limited extent, some public-key algorithms. It is only very much older 8-bit technology such as the old 6502, 68H, PIC and 8041/51 chips that “do not have the grunt” to do this. But then, not being nasty, they had trouble with DES… As such, the price differential between an old 8-bit CPU and a modern CPU on silicon is at a point where, most things considered, the old CPUs now cost more than lower-end ARM and MIPS CPUs when looked at over the development cycle. And it’s a “no brainer” over expected product lifetimes.

Such old CPU designs need to be assigned a place in landfill as quickly as possible, and that is only going to happen via some form of regulatory process. Whilst legislation and government regulation are the 132,000 and 22,000 pound steam hammers that will last a lifetime, standards are much faster in response and technically closer to being what is required, so the better tool for the job.

Of course you will hear “vested interest screams”, as you always will when certain types feel their rice bowl is in danger. History shows, however, that they are usually wrong, dead wrong, not just in the long term but in the almost immediate short term. I often point to the car safety legislation sometimes referred to as “Lemon Laws”. It’s clear that not only was a lack of regulation killing the industry’s customers in their new cars, but the auto industry had entered a slow tail spin, a race for the bottom that was turning into a death spiral for the entire industry. The legislation and regulation changed that and gave new ways and methods to the industry. The result is safer, better-quality cars at a lower percentage-of-income point, and a still-viable auto industry (even though they still cheat where they think they can get away with it, as the recent emissions-standards fiddle by various manufacturers shows).

As was once observed, “The price of freedom is eternal vigilance”, and an older observation says why: “the mice will play when the cat is away”. Any form of regulation needs to be checked and double-checked to stop the management types starting another short-term-profit, long-term death spiral.

Clive Robinson May 3, 2018 4:01 AM

@ Bruce,

    Creating these defenses is the goal of NIST’s lightweight cryptography initiative, which aims to develop cryptographic algorithm standards that can work within the confines of a simple electronic device.

This shows a distinct lack of common sense on NIST’s behalf, and it will without a doubt turn into a long-term nightmare.

As you have often pointed out, the strength of security fails with the weakest link, and methods of attack generally get better with time, never worse.

But there is another issue we should consider, and that is “increasing connectivity”. That is, as time goes on, more and more low-power, low-end-functionality devices are given –usually wireless– connectivity. Due to some kind of mass stupidity originating with the likes of Google et al, we make this connectivity onwards available to “the world and his dog”.

If you think back, some on this blog were waving big red flags about connectivity and Industrial Control Systems, more commonly known as SCADA systems, and how they were getting hooked up to the Internet to save costs. The significant risks were pointed out, and it took the industry a good decade to realize that there was risk; when they looked, APT malware was found. We also have reasonable suspicion that both the US and Russia have, at state level, attacked the infrastructure of other sovereign nations, causing harm and disruption. Worse, we have reason to believe that other nation states have used APTs for economic warfare in the likes of the oil industry in both the Middle East and South America. Realistically we can only expect such things to get worse as worldwide connectivity descends down the functionality stack… After all, do we really need a Bluetooth-enabled toothpick? Because that’s the next layer down from toothbrushes, which are already enabled…

But more importantly there are two major incorrect assumptions in the NIST proposal.

Firstly, the power level of embedded devices. It’s an out-of-date illusion: go and have a look at the computing power required to do the level of communications in Bluetooth, ZigBee and many other wireless protocols such as WiFi. It’s at least a couple of orders of magnitude more than for doing AES. That alone should be ringing alarm bells in people’s heads.

Secondly, there is the “environmental pollution” that causes “interoperability failure”. Much as we might like otherwise, the usable RF spectrum is finite. Back in the 1970s engineers realised there were problems, which is why we have EMC regulation. But the sheer number of devices trying to connect wirelessly is rising rapidly, and with it the effective noise floor. The range a device can communicate over is based on “signal to noise ratio”: the more noise, the less range you have, unless you increase the signal. But when the noise is an amalgamation of the interference effects of other signals, raising the signal level becomes quite pointless. Even switching to other transmission types does not solve this problem. That is, the more devices you have in a given area, the shorter the effective omnidirectional communication range they can achieve.
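
Rough numbers, using the standard free-space path-loss formula (all figures illustrative):

```python
# FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44; solve for the
# largest distance at which received power still clears noise floor + SNR.
import math

def max_range_km(tx_dbm, f_mhz, noise_floor_dbm, snr_db):
    budget = tx_dbm - (noise_floor_dbm + snr_db)     # allowable path loss
    return 10 ** ((budget - 32.44 - 20 * math.log10(f_mhz)) / 20)

# A 0 dBm device at 2400 MHz needing 10 dB of SNR:
print(max_range_km(0, 2400, -100, 10))  # quiet band: ~0.31 km
print(max_range_km(0, 2400, -85, 10))   # noise floor up 15 dB: ~0.056 km
```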

But there is a fly in that ointment: from a security standpoint, an EmSec opponent does not need to use omnidirectional antennas, or just one antenna. They can use high-gain, narrow-beamwidth antennas set up in a Very Long Baseline (VLB) configuration, effectively only picking up signals from a much-reduced area, so their effective range to an omnidirectional device is many, many times that between omnidirectional devices. This range can be further increased by other phase- and amplitude-based “nulling techniques” which in effect remove the stronger interfering signals. Such techniques using adaptive systems are now used in what are called “MIMO systems”.

Which brings us back to that second flawed assumption: increasing the range, by either upping the radiated power or by using MIMO techniques, shoots another hole in the NIST argument.

Put simply, we do not need these algorithms for the reasons NIST has stated; those reasons are a provable falsehood. So why is NIST going down this path?

Thus I have very deep suspicions as to why, or to be more honest, as to who is behind this.

At best, NIST is living two decades in the past and has failed to realise the world has moved way, way beyond the old use case for such algorithms.

Thus people should be asking NIST some long, hard questions, as well as examining the provenance of those behind the decision. I suspect you will find the usual first- and second-hand IC/SigInt agency personnel pulling the strings. Oh, and management types who have not one jot of real-world communications engineering experience.

That is, the whole idea is a costly mistake that will keep loading up technical debt until the whole thing becomes a major disaster…

Clive Robinson May 3, 2018 4:12 AM

@ Spooky,

    And thus, all of Clive’s earlier concerns about compliant implementations requiring automatic fallback to shortened block/key sizes during autonegotiation are now realised…

And a number of other issues as well (see my two above for some of them).

Thus I have a very bad feeling about the whole issue.

The fact that I can not see a real “use case” any longer raises lots of flags for me. But worse, even if there were a minimal, wired-communications-only use case inside a system like, say, a car, I know that “feature creep” will make it world-available. We still see this sort of nonsense with “plaintext” communications in utility networks, which in effect provide “backdoors” into their infrastructure in a number of ways. And this is at least a third of a century after people were making noises about the Telnet protocol leaking username and password details on networks where they could be seen. Apparently we don’t learn by common sense but by lurching from one disaster to the next.

Clive Robinson May 3, 2018 4:48 AM

@ Thoth,

    Because FIPS and CC are globally recognized, NSA manages to ride on NIST to ensure that everything is backdoored and accessible in a NOBUS fashion, to NSA only.

The known information so far does support that viewpoint, with one exception… We know that NOBUS is short-lived at best; they get found out, as the work by Dan Shumow and Niels Ferguson, building on work by others such as Matt Green, together with Bruce Schneier’s far more public comment on it, showed [1].

Oh, and then there is the diesel emissions test workaround by Volkswagen and others, which got found out even though the entire industry tried to keep it secret.

But as I’ve pointed out, NOBUS has distinct limitations: if you know or suspect that it’s in place, there are mitigations you can use that make NOBUS worthless. Better, such mitigations give you way more control over your security than the likes of any supposedly secure application on a computer, pad or phone…

Thus, whilst my view is that there is no “use case” for this any longer and it should be dropped as a waste of resources, even if it does get through there are easily understood mitigations that can be put in place with a little effort and knowledge.

Thus watch out for any wording in the standard that is designed to stop, or make difficult, any mitigations. Which is the “eternal vigilance” aspect arising yet again.

[1] There is a timeline presentation on the Dual Elliptic Curve Deterministic Random Bit Generator (Dual_EC_DRBG) getting into, and then being withdrawn from, a NIST standard at,

https://csrc.nist.gov/csrc/media/projects/crypto-standards-development-process/documents/dualec_in_x982_and_sp800-90.pdf

Which includes the NSA originated email sent to Bruce from NIST.

Clive Robinson May 3, 2018 5:01 AM

@ Stephen Chasko,

    It’s unclear what the motivation may be, but it is unlikely to be mitigating risk for IoT devices.

Your reasoning is similar to my own, which is why I say I can not see “the use case”.

But worse, standards being standards, it is very likely they will put in a whole load more than just a couple of base crypto algorithms. That is in all probability going to significantly increase the complexity of any resulting code, which will not just make the “use case” even less likely but also pose a significant danger of adding vulnerabilities.

But worse, such code will almost certainly be pulled in from a CPU manufacturer’s “App Note” or a public-domain high-level-language library. In either case, history has shown us that it will have, at a minimum, implementation faults that open up timing and other side channels, thus making for “standard attacks” for the likes of the SigInt agencies to deploy.

Which at the end of the day is something we most definitely do not need…

Thoth May 3, 2018 5:11 AM

@Clive Robinson

“But as I’ve pointed out NOBUS has distinct limitations, if you know or suspect that it’s in place there are mitigations you can use that makes NOBUS worthless.”

It still has its place in the NOBUS arena despite it being suspected of backdoors.

We know that most industries are more deeply invested in meeting certifications than in real-world security. A few good examples include governments (i.e. diplomatic and military), the financial sector, the medical sector, Industrial Control Systems and many more.

When you sell a security product to them, what they want to see is the golden stickers and seals all over the labels and that’s about it.

It is really that shallow for most security businesses. All people care about are the golden stickers and seals. Even if you have a fantastic theory with some high-assurance methodology you can pull out from some Orange Book, in a Government bidding scenario where you bid for security contracts, or even in the private sector (especially Banking and Finance), if you do not have those golden stickers and seals, those wonderful high-assurance products simply will not sell. I have tripped over these stickers and seals so many times in my career that I have lost track of them.

The power of those golden stickers and seals should not be underestimated.

I won’t be surprised by how many old banking and financial systems are still stuck with old BSAFE libraries, with the backdoored Dual_EC RNG still running.

These banks and financial institutions are unwilling to “upgrade” their old clunky mainframes, and thus the backdoored BSAFE libraries are still sitting in there running happily. They know it’s backdoored, but they do not see the danger because they think there is nothing to worry about … until they are hit by something …

Also, I have recently had to expand my product offerings in order to make some bucks by partnering with another HSM manufacturer, and those NDAs they bring … 😀

Yes, I am guilty of peddling golden-sticker-and-seal-covered security products to make some cash, barely enough to allow the business to survive. Without those golden-sticker-covered products, nobody would even bat an eye at the sales pitches I try to make; they would simply walk away.

Reality is very harsh, and I found that out the hard way by stumbling over all this red tape while running my own security business.

Keeping these nasty backdoored algorithms out of the NIST and CC processes as early as possible is highly important, because once they get into the recommendations and turn into real-world implementations, these backdoored ciphers will be really sticky to remove due to legacy systems and all sorts of excuses.

One example: reports show that vulnerable SSL library versions, protocols and ciphers are still in the wild, and that is how much lasting damage things like export-grade ciphers, backdoored algorithms and RNGs, and so forth do to security.

Guten Tag.

Dave May 3, 2018 5:49 AM

@Sam: “what we really need for IoT is usable PKI and lightweight protocols”

So which one do you want, PKI or a lightweight protocol?

Cassandra May 3, 2018 5:53 AM

I can only agree with Clive Robinson’s reasoning in the previous comments on this topic.

One minor thing I might add is that standardisation only works well over multiple development cycles when it is flexible. Clive is right to spotlight the failure to develop robust update processes that allow retirement of old algorithms and hardware and the introduction of new ones. In some cases, making a standard too soon and too inflexible leads to a monoculture rather than an ecosystem, and monocultures are susceptible to nasty shocks. One further problem is that when a minimum standard exists, too often most implementations conform only to that minimum, rather than checking whether the standard is fit for the purpose to which it is being put. In other words: the box is ticked, but no one is checking whether it is the right box for the situation.

The ability to easily and quickly transition to a different class of algorithm is important. While the NSA can claim to have saved everybody’s bacon by choosing the S-box parameters for DES, if they had not, and differential cryptanalysis had been discovered in the public domain, tremendous economic damage would have ensued. If someone were to find a fast way of determining the prime factors of large composite numbers*, or discrete logarithms*, we would be in a similar situation. People are not planning for failure of the algorithms that protect their most important secrets.

Of course, this all adds cost, which will only be accepted if it is part of the regulatory burden – e.g. standards compliance includes demonstrating your implementation has a tested method of moving to a different class of algorithm, should it become necessary. While I am realistic about the chances of such a far-sighted approach being taken, I can at least try and plant a seed, and hope never to have the opportunity (despite my pseudonym) to say “I told you so”.

Cassandra

*I know both of these are unlikely – but ‘unlikely’ and ‘impossible’ are different, and there is more than one avenue of attack. The theory may well be flawless, but actual implementations usually are not. Programmers can be surprisingly predictable, and it is not unknown for separate teams to independently produce code with the same flaws, due to extensive sharing of concepts, programming algorithms, coding snippets (e.g. Stack Exchange) and libraries (e.g. CPAN) across the industry.

Clive Robinson May 3, 2018 5:57 AM

@ Festus, Spooky,

There are very few in the western world who could live in an 1800s society.

I suspect Spooky actually meant 20th century, not 19th century. It’s an oh-so-common mistake; even I make it from time to time.

A low-yield EMP device would take us back to around 1960s technology, when radios and TVs were still predominantly based on thermionic valves/tubes and the phone system was, at the local level, still electro-mechanical.

Having lived through the 1960s, I don’t actually remember it being that bad. For instance, the fridge in our house did not run on electricity but on gas, as did a number of other things. I’m hard pressed to remember what our first “transistorised” piece of equipment might have been. I suspect it was a tie between a 20-in-One electronics experiment kit for me and a transistor radio for my dad, both given as Christmas presents.

Funnily enough, it was the IC in a calculator that appeared in our house quite some time before the transistor in the TV; dad needed it for work. In fact I got my first “programmable” device before the TV with a transistor in it. It was the Sinclair Programmable Calculator, with the four books of programs including a lunar lander game; I’ve still got them somewhere… I also have, and still use, my dad’s portable typewriter, which is still “finger powered”, and though it’s getting harder to find, yes, I still use “carbon paper” with it.

Would I object to being thrown back to 1960s technology in and around the house? Well, apart from a few items that are not actually that important to me, no, probably not. It’s mostly “entertainment” and hobby stuff, only some of which would stop entirely. As for the professional side of my life, yes, it’s very much based on modern electronics, but more for convenience and speed than need. I can still design using valve/tube technology, and it’s only fairly recently that high-power LDMOS devices have got to the point where high-power RF amplifiers can be designed into smaller cases than the valve equivalents (i.e. 5-10 kW, MF through VHF/UHF).

However, I suspect my son would find it a bit more difficult, even though he does a lot of outdoor, almost “backwoodsman”-type activities for entertainment and hobbies.

The people who would be most hurt are what certain economists have referred to as the “make-work” generations.

It was around 1973 that office work was at its most efficient and lean. Since then, “paperwork for paperwork’s sake” has become a major form of mainly useless employment. In the UK it has been estimated that about 1/3 of work done serves absolutely no purpose other than to fill time, and a little over 50% of the working population are involved with such processes.

Much of this work is based on “Office Systems” and other ICT systems that are absolutely reliant on semiconductors, which would cease working with, say, a low-yield EMP device exploded over the North Sea…

Thus it is those “make-work workers”, and the ecosystem that supports the make-work, who are probably most vulnerable… But if you ask some of Jeffrey Preston Bezos’s employees, it’s at best “drudge work” they do, in what some have described as “labour camp conditions”, with a high degree of fear involved due to the likes of unfair terms of employment. Some would almost certainly have a greater quality of life, and greater life expectancy, if such jobs did not exist.

Which brings me to the point that would affect us all: social care. If you are young, fit and healthy you don’t think about pensions, life cover and health care… However, they are all fairly heavily based on ICT these days, along with banking. If these got EMP’d we would have problems, though most things would, in a year or two, return to older ways of being done. Health care is probably the major exception in most people’s day-to-day lives.

PeaceHead May 3, 2018 10:40 AM

On a more serious note than my previous post about just “holding down the power-off button”: I pretty much agree with Dave. In terms of bandwidth, most everything else is too much. I almost feel like there should be entirely separate internets for each type of dataflow. Having everything, all at the same time, all sharing the same resources, is just too much and exacerbates security issues.

I know this is evading the question, somewhat, but I do feel that it would solve a lot of problems.

For example, if there were separate internets:

1) academia and science only
2) banking only
3) consumer purchases only
4) software and firmare downloads only
5) video streaming only
6) audio streaming only
7) government and military only
8) news only
9) chat and forums only
10) international exchanges only
11) personals and dating only
12) hotels and housing only
13) IoT and Black Market TOR stuff (turn this one off)
14) internet infrastructure only
15) experimental networks and protocols and softwares and firmwares

It would be nice, but I know it’s kind of just a pipe dream. And yet the current internet is way too sloppy…

With total isolation and insulation between internet tiers, it would probably be a LOT better.

(required) May 4, 2018 2:46 AM

NIST, you will recall, also backed the sufficiently-trodden 9/11 report: the part about the buildings falling right into their (steel, reinforced) footprint perfectly, because that’s just what happens in giant (steel, reinforced) skyscraper fires that have never happened before or since. We can take their word; they MODELED it.

https://www.youtube.com/watch?v=SoCS6lNpqhE

Clive Robinson May 4, 2018 10:31 AM

@ BrianR,

any links for those figures on make-work?

Funny you should mention that: according to yesterday’s London Evening Standard, somebody has just published a book with up-to-date info. However, I’ve just looked in the ES archive and it’s not yet posted,

https://www.standard.co.uk/archive/2018-5-3

Clive Robinson May 4, 2018 4:30 PM

@ BrianR,

As the London Evening Standard, a Tory rag edited by the ex-UK-Chancellor George Osborne, failed to archive their article…

We will have to switch to another newspaper, one where the ex-Chancellor is also known as “Gidiot”. The Guardian journalists have put a nicer version online with a much more humorous turn of phrase,

https://www.theguardian.com/money/2018/may/04/i-had-to-guard-an-empty-room-the-rise-of-the-pointless-job

It’s still light on facts and figures so you might have to buy the book.

Spooky May 8, 2018 3:09 PM

Sorry, snowed under with work lately. I may have to answer in two parts if I run out of time…

@ Festus,

I expect you’re probably right. To be honest, I’d keep the advances in dental as well as surgical anesthetics and perhaps some modern metallurgy, etc. Radios and calculators are certainly nice. I’ve also lived in situations that required washing with a pitcher and basin, living without heat or A/C, and so forth. The human body is amazingly adaptable, the mind a bit less so. 🙂

@ Thoth,

You bring up several good points here and at times I wonder whether the collective economic weight of NIST standards would generally trump RFC recommendations in cases where the two are in conflict. The desire for continued interoperability (even with a bad standard) strongly figures into most IT purchasing decisions. I am curious whether European leaders would strenuously object to the flawed standard specifically and choose deliberate non-compliance, or allow themselves to be bought off by the usual assurances and concessions from the US, Microsoft and other US corporations.

@ Cassandra, @ Clive,

Out of time again. Will come back this evening to finish (no worries, I think this thread has gone quiet).

Cheers,
Spooky

Spooky May 8, 2018 5:33 PM

@ Cassandra,

Apart from your larger point on the benefits of flexible standards (if they’re good flexible standards), I’d like to briefly discuss your bit about DES. See, although NSA’s modifications of what became the DES S-boxes (which, if memory serves, were allegedly minor) did improve overall resistance to differential cryptanalysis, what intrigues me is that a default S-box providing ideal resistance to attack was found through exhaustive computer search, not human design. In other words, it is not enough for an S-box to merely be random or randomly selected, as the actual values chosen may lend inherent (unwanted) structure to the resulting ciphertext as it passes through the iterations of whatever round function is being used. It is quite interesting to consider that for a given block and a given number of passes through the round, some defaults are significantly better than others at producing a good cascade and overall output distribution. That’s fascinating, really. I have periodically wondered whether the Blowfish S-box defaults, using the hex digits of Pi, are equally strong, or possibly not? So far as published attacks go, I think they’ve only managed to discover a few weak classes of keys…
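
The property being searched for is easy to state in code. A toy illustration with a 4-bit S-box (PRESENT's, as it happens): tally every output difference for every input difference, and keep the worst count small:

```python
# Difference distribution: for each input difference a, count how often
# each output difference S(x) ^ S(x ^ a) occurs. Lower max = stronger.
SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]   # PRESENT's 4-bit S-box

def max_differential(sbox):
    worst = 0
    for a in range(1, 16):                  # every nonzero input difference
        counts = [0] * 16
        for x in range(16):
            counts[sbox[x] ^ sbox[x ^ a]] += 1
        worst = max(worst, max(counts))
    return worst

print(max_differential(SBOX), "/ 16")       # 4/16 -- optimal for 4 bits
# Randomly chosen 4-bit S-boxes frequently score 6/16 or worse, which is
# the sense in which "random" is not the same as "strong".
```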

If DES had been compromised through differential cryptanalysis at the time of its actual use, the results would probably not have been catastrophic. The hardware to break it at the time would have cost millions of dollars, and the break would still have required considerable time (days, not real time), so the pay-off would have had to justify the expenditure and would likely need to be useful before the system being attacked re-keyed (depending on the type of attack, of course). DES was followed in the 90s by 3DES, using three unique keys, although there were weaker variants. I believe that only granted a small multiple increase in the effort needed to attack it, so not a huge win, though it extended the useful lifetime of the algorithm.

Okay, sorry, have to take off again. BBL, as they say…

Cheers,
Spooky

Patriot May 31, 2018 7:51 PM

This is the NOBUS philosophy in action.

The cat is already out of the bag, and the blame for that lies directly with the DIRNSAs who decided to spy on all of humanity, to include U.S. citizens–to ignore the U.S. Constitution, to grow a bureaucracy until it is unmanageable, and maybe to grow their own pay too. It is all a big mistake, amazingly dumb, and downright pathetic. It has been profoundly damaging to the United States, to free speech, and to the dignity of what America is and what it aspires to. The positive image of the U.S. as a freedom-loving and decent place has now been sullied by serial liars pumped up on greed. It is a monumental failure on the scale of nation states and whether or not they continue to exist.

If spying on its own population is a growth industry in the U.S., if tearing up the Constitution while taking an oath to it is the new normal, the U.S. is not going to last as long as many think.

They can push weak encryption all they want, but too many people in too many countries are not going to accept this. OpenPGP can use weak authentication and offer compromised curves from NIST, but they too will someday be held to account–and questions will be asked.

As Mr. Schneier has said, NIST is in danger of presenting protocols that no one will use. They are also in danger of being treated with contempt.

Patriot May 31, 2018 8:41 PM

Look at this from the viewpoint of the attacker. The IoT is clearly wonderful; its birth is an occasion for joy, for backslapping and drinks all around, for “fine dining”. You want streams of electrons–that’s data. You want a lush forest of devices that are not updated, secured, patched, or standardized to be secure in any manner. Whatever the Muppets (consumers) are told in the U.S. does not really matter. Offer them “hackerproof” devices and (yawn) they will feel good and even pay more–for something that does no such thing. +1 So far so good. Real security would take a lot of cooperation and hard work across several industries–and that is not going to happen–so entropy is on our side. +2

You want devices to be standardized to be unsecure according to NOBUS. If NOBUS is actually NOBUS + NOBTHEM + NOB-ANY-1, that is OK. NOBUS is still cool as an ideal. Everyone needs an ideal. We dominate; we win. We are winning, and we have always been winning.

Standardized non-security starts with weak encryption. You also want devices to proliferate. You want to be able to use public-key cryptography against devices to exfil data (kleptography). +3 So you get more access to people. +4 Even 24/7 access, like in 1984. +5 America will then be fully protected.

One gets the sense that this effort is not intended for specific targets such as, say, the Foreign Minister of Belarus, or a Taliban leader in a mud hut on the Pakistani border. It looks like more of an inward-looking effort, a collection package against the average American or European.
