Who’s responsible for Security By Design? (not who you think of first)

In the latest episode of OT Security Made Simple, OT security experts Sarah Fluchs and Klaus Mochalski discuss the sense and nonsense of security by design and ask the crucial question of who is responsible for its implementation. Their answer: device and system vendors are not the only ones on the hook.

 

 

 


Transcript

 

Klaus Mochalski

Hello and welcome to a new episode of our podcast OT Security Made Simple. My guest today is Sarah Fluchs. She is CTO at admeritia GmbH. She has been working on the topic of OT and OT security for many, many years and, unusually for Germany, is also one of the most visible experts internationally. And that is very, very exciting. Sarah, why don't you briefly introduce yourself?

Sarah Fluchs

Actually, you did a great job already. I'm CTO at admeritia. CTO for a consulting company, a somewhat unusual position. That's simply because we do quite a lot of research and develop methods. And also develop a piece of software. We got there by accident, so maybe we'll come back to that in a moment. And so, in principle, I cover everything relating to security methods for OT only, i.e. automation technology. Originally, I'm also an automation engineer. I once studied it, but then I quickly took a detour via the German Federal Office for Information Security (BSI) in the direction of security.

Klaus Mochalski

In other words, this is a topic that has actually been on your mind for a very, very long time.

Sarah Fluchs

At least definitely since graduating, yes. Automation has traditionally - and that's part of the problem - not been much concerned with security, but rather with its own technology, of course. And it's only slowly moving closer to IT, using more IT components and so on. That's something you only learn slowly. The topic of security was at most something you could look into as an elective in our Master's program, and then not really specifically for automation systems. It's a topic I only picked up because the BSI had advertised Master's theses on it.

Klaus Mochalski

Yes, it's interesting that this is still the case. There's a lot of talk about this problem, but you'd think that it would have become part of training by now. Well, that was a few years ago for you too. I don't know what it's like today, but I think that's a different topic.

Today we are looking at the topic of "security by design". And "security by design" has become a bit of a buzzword in recent months - you could even say it has degenerated into one. In preparation, we talked about what else can be said about it. There's a lot of talk about it at the moment, and today we want to dispel some of the myths that are circulating.

A few weeks ago, I was at an event in Israel where the topic was "Security by Design", especially for critical infrastructures. And during the panel discussion, I actually had the feeling that this topic is still very much in its infancy. And that, above all, many people are throwing up their hands at the challenge of implementing "security by design" in the long term. This is because the solution to the problem is often sought from the manufacturers of components, software and systems. From your point of view, tell us what the current status of "security by design" is and what you have to consider as an infrastructure operator, also known as an asset owner, and how best to proceed. What is the current status of "Security by Design" as you see it today with your customers?

Sarah Fluchs

First of all, it's really interesting that the topic of "security by design" is still not really established, because it's not a new invention. Even if it has recently gained a bit more momentum. Instead, in relation to software systems, it's been around since the 1980s and 1990s, with people saying "Hey, maybe it would be good if we took this into account directly when we develop systems and not just at some point later. Let's think about how we can do that".

Klaus Mochalski

The idea sounds very good, and it makes intuitive sense to say that we are thinking about security. Not once the product is out in the field, but at the very beginning, when we are planning and building it. In other words, the idea has of course been around for many, many years, but apparently it hasn't really made it into practice yet.

Sarah Fluchs

It's completely obvious and I think that's also the reason why it's so successful as a buzzword. Because, of course, it's something that everyone likes to jump on and say "of course, it makes sense". So I don't think you really, really need to explain to anyone why "security by design" is or at least would be a good idea.

After all, there is also a much-documented thesis in engineering - in other areas too. Although it is no longer a thesis. It really has been proven. That if you find errors early in the design, it is exponentially cheaper to fix them than if you find them later. There are such nice curves for that, and of course that applies to security just as much. And you just said: What do operators have to consider? And I think that's probably the first knot in the listeners' heads, because "security by design" is usually a topic that is very strongly associated with manufacturers. Which also makes sense when you say "Okay, security should be taken into account during the development of a component". Of course, you talk to the people who are developing the component. They have to do something. That also makes sense.

We are dealing with a very specific segment here - security for automation systems and automation technology. And there are different roles for the system on the market. There are manufacturers of automation systems. There are the so-called integrators, who assemble the whole thing into an overall system and are really there to automate a plant. Then there are the plant operators. Say I have a sewage treatment plant; I am the operator of that plant, and the operator usually has one or two service providers. The operator says: "Okay, I would like to have a control system from Siemens." Someone has to turn this control system from Siemens and the controllers from Siemens and the sensors from Wago - or whatever - into a system that actually automates this water treatment process.

In other words, you always have these three roles. But that means that if you say - which is relatively easy to implement in software development - "Yes, security by design is the job of the software manufacturer or the component manufacturer", then Microsoft or whoever produces this software has to do it. In automation technology, however, it's not that simple. You can of course say - as in software development - "Siemens, you make the control system. Wago, you make the sensor. Now you have to take security by design into account". That's analogous; you only talk about manufacturers. But that is by no means all there is to designing an automation system. It only becomes an automation system when it is integrated into an overall system. And that's where it stops with the manufacturers.

They sometimes also take on this role of integrator. Nevertheless, it is still another role. And it is often the case - it also depends on the organization - that the operator of the sewage treatment plant, for example, or the manufacturer of chemical products, or whatever, also has engineering departments that ensure that the overall system works afterwards. And of course there is also a piece of design and security has to be part of it. That's why it's really not so easy in automation technology to say that the manufacturer is responsible and everybody else is off the hook.

Klaus Mochalski

That is, if I understand you correctly... First of all, there are three roles: The manufacturers, the integrators and the operators. And you're saying that it's not just the manufacturers, as one would assume with this topic, but the integrators play a very important role. So what role do operators play? Do they also play a role? Is the responsibility equally distributed, or does one responsibility stand out in a particular area?

Sarah Fluchs

The problem is that this varies greatly. So this is exactly the question we asked ourselves. Three years ago, we started a major research project on the topic of "security by design". We said: "It would be good - as we want to incorporate security into the engineering process - if we could also do this for automation systems". And then you think about it like a drawing board: what do you do there? You say: "Okay, it's actually relatively straightforward. We'll take a look at how the engineering process works and then we'll see how we can get security into it". So we had a plan.

Then we started to look at how the engineering process works. And we found that, firstly, it is very difficult to generalize because the studies on this are also completely different, if only for the different geographical conditions. The engineering process is completely different in Europe than in the USA or South America. This also has to do with how many new plants are actually being built or whether there is a tendency to build completely new plants or expand existing ones. Europe has a space problem, so new plants are no longer being built so often. The situation is different in other countries. There are other standards that they have to meet, which have major implications and so on.

But the important thing was that in the end we said: "How this responsibility is distributed is really very individual and depends entirely on the corporate culture". For example, we have a use case in our research project where the culture is that they do quite a lot of the engineering themselves. Accordingly, they also do quite a lot of these integrator jobs. So the questions "How do I get the components to form a complete system? How do I program my controller, etc.?" - they actually answer those themselves. At the same time, there are also a lot of operator organizations that don't do this themselves and are completely dependent on service providers. That's why it makes more sense to think in terms of these roles and less in terms of organizations that are firmly attached to them. In the end, what matters is who contributes to the design, and that can be totally distributed across these roles.

Klaus Mochalski

Are there standards, or is there a lack of standards and standardization frameworks? When I hear "corporate culture", that is an aspect that is very difficult to measure. And there are many different forms of it. But I still need a certain basis from which to start in order to achieve measurable progress. Which is what I want to do when I securely design and implement my system.

Sarah Fluchs

We are now entering a very large field of research, one that no longer has much to do with security, where at some point we also said "Not our business". Incidentally, this has also been a very important realization along the way, that security cannot presume to change these existing engineering processes that have long been optimized. We have to work with what we have.

So from a security perspective, we're not going to say "Engineer the system differently". They would quite rightly give us the cold shoulder. Because they've been working on it for years, for decades. How do you do that?

And of course there are such standards. In Germany, for example, there is the Namur standard NA 35 for the process industry, which describes relatively clearly how such an engineering process actually works. But when you talk to the people who developed it, they say "Well, what you can write in the standards is not so much what we implement". There's probably no organization in the world that follows it exactly the same way. It's more the distillation of what everyone does. And often, of course, the companies themselves have their own standards, which they themselves have adapted to what they do and implement. And some of the companies we're talking about are large companies. So if you're talking about, I don't know, Shell or BASF or whatever, they're global and of course they have standards for how they do things. This is driven from many directions. So I wouldn't presume to have an opinion on whether there are any standards missing, nor would I say that there aren't any. In fact, security should stay out of it. They have to work with the material they have. Nobody should change their engineering process just because of security.

Klaus Mochalski

That's a very good statement that we - and by we I mean us security service providers - have to adapt to the circumstances and that the options for changing certain things are very limited. We can't tell the engineers, "You have to build the system differently now because it's not secure".

Sarah Fluchs

That would be completely presumptuous. Honestly, it would be like the tail wagging the dog. Security has no reason to exist if this system is not running. So in that sense, security is, I like to say provocatively, an auxiliary science. Yes, we are there to ensure that it runs securely and resiliently, etc. But if the system doesn't exist, then we no longer exist.

Klaus Mochalski

Of course. And I can also very well imagine how it works in the really big companies. Because there is of course a lot that already exists, including internal company standards on how to do certain things. And in the end, security implementation is an extension of this very internal framework that is set there. But how do I do this from the point of view of a smaller company that operates a plant and has not yet dealt much with the topic? A company that doesn't have a large department for this and perhaps can't afford the large consulting support. As a small to medium-sized company, I would like to start with the topic of "security by design" for my critical systems. What are your specific recommendations for these companies? What should they do first? What are the priorities and how can the fastest measurable added value be achieved?

Sarah Fluchs

So I think, first of all, you have to understand that, especially if you don't do much engineering yourself as an operator or as the owner of a system, then of course "security by design" is primarily a communication problem. After all, I don't do much engineering myself. But I can't change that much in engineering either. But I have to make it very clear to the service providers - the engineering is probably done by an integrator or a manufacturer or someone in both roles - what I need. And that is the responsibility of the asset owner, i.e. the operator.

If I really don't do any engineering, then of course I can't integrate security into my engineering. [And there is a difference between the user and the operator.] If I have pure software, then I am a user of a system and I don't do much with it. But an operator interacts with such a system on a completely different basis and still makes a few adjustments. And they have more responsibility and, above all, often [legally] have operator responsibility in terms of security.

Klaus Mochalski

That is very intriguing. In other words, as the operator, I am responsible and cannot completely rely on my manufacturers and system integrators and tell them: "Your job is to supply me with a system that is secure by design". Instead, I have to give them more specific information about what I need.

Sarah Fluchs

So let's put it this way: it's no surprise that the operators have an operator responsibility. That's why we have so much regulation for critical infrastructures etc. - this responsibility cannot simply be handed off to someone else.

Klaus Mochalski

So, of course, responsibility for the secure operation of the systems. That is of course indisputable, but also responsibility for the successful implementation of a "security by design" concept. Because you could have said that the [operators] have relatively little to do with it. But that doesn't seem to be the case.

Sarah Fluchs

Let's put it this way: from the operator's point of view, you can say "Security by design is not my job". Experience shows that the operator then says: "Not my job. You do security by design; offer me something that is secure by design". The manufacturer then comes back with what they see as a secure-by-design system. They have to interpret what the operator wants, and the operator thinks it's all great. And then the manufacturer puts a price tag on it. And at this point, what often happens with security is: "Well, okay, if that's the case, then maybe a little less security would be enough". And that's when the gambling starts: you cut back, or go for the insecure system after all.

So there are actually many manufacturers who have two versions of all components for precisely this reason. "I have my engineering station in secure and non-secure. Which do you want? One can do this, this, and this. The other can do this, this, and that. Here are the price tags." And now you can make a good guess as to which one most operators take. But if you throw in the issue of security at the very end, many other things are already fixed. Then you have to work with the budget you have. And then there's simply no more room for it.

Klaus Mochalski

And how do I, as the operator, get out of this dilemma? That's the interesting question.

Sarah Fluchs

Of course, the operators have a responsibility here. It's there in black and white [in the regulations]. Either I don't care at all and I just do it. Something will come of it. Or, of course, I can say in advance and consider - as with all features in this world - what I need. And that's not so easy in security. So there are also differences of opinion about a concept like security level etc., where you can say, "Okay, for security level four you have these features, for three you have these features, for two you have these features". And then the wishful thinking is always that such an operator can go and say: "Why don't you make me security level X and then I'll be happy".

But the big question is, how do I as an operator even come to the realization of what is important to me? In the end, it's not important to throw all the security features I can think of at my product. Instead, it is important that I think about what is really relevant for the operation of my system. And of course the manufacturer can only know this to a limited extent, because they can only know to a limited extent what the priorities of such an operator actually are. Is it bad for the [operator] if the system fails or not? Is it bad for him if the system fails for X minutes? Is this sensor important or that one? The [manufacturer] can't know everything. You often know parts of it from functional safety, but that has a different focus.

This means that the operator has a responsibility to communicate what they actually want. And communicating what I actually want always goes hand in hand with thinking about what I want to a certain extent [in the first place]. And that brings us to the topic of "security by design".

Klaus Mochalski

This means that the operator cannot avoid consulting both the system operation expert, which he most likely has, and the expert for the security of such systems, which he may or may not have - or more often does not have. And they have to sit down together in advance and think about what my optimization goals are? What are the critical elements of my plant and how much security and what kind of security do I need? And only when I have gathered this information for my specific system - and for this I need both technical and specialist cyber security expertise - only then can I really approach a system integrator or manufacturer and say "I would like to have this system from you". Is that right?

Sarah Fluchs

There are two points that I would like to put in a different light. One is what you just said. That the system expert and the security expert need to sit down together. Experience has shown that this doesn't work so well. Experience shows that you have two camps. One is the person who wants everything to work. And then there's the spoilsport who wants the security, but has no idea about the system and therefore has to work hard to establish a position in the company on this topic, to put it mildly.

Over the last three years, we have carried out many test projects on the subject of "security by design" with manufacturers and operators. Experience has shown that it works much better if you take these system experts, present them with these decisions and ask them certain questions. So that the [system experts] take over the topic. It's not always easy from an organizational point of view, but experience has shown that it works better to say: "We take system experts and instruct them so that they can help make certain security decisions". Instead of saying that the security expert has to make this decision. The system expert would drop out at this point, because the most important knowledge an operator can put in there is knowledge about their systems. And if this is decoupled from the security decisions, then you make botched decisions. If they come together, it can work. And, of course, the security expertise has to come from somewhere - that's a question of how to solve this organizationally. There are many roads to Rome, but the system expert should be involved in the decision at all costs.

Klaus Mochalski

This is of course a very controversial topic, which I believe has already been the subject of much discussion and debate. Is it even solvable? There are also statements circulating out there that a system expert can never make up for years of experience and attitude with regard to cyber security and that I therefore always need this cyber security expert. But then, of course, I have this conflict that you describe. In other words, I still think the ideal solution has not been found. It will probably never be found. That's also my feeling from my experience with our customers. Every customer behaves differently because of the people involved. So what expertise do I get from which person? How much or little do they know about security? How open are they to it? There is often a certain defensive attitude. I think this is a huge topic that could fill an entire podcast.

Sarah Fluchs

A quick point on this. It's the other way around. You can throw the security decision into the room and take the security expert out once you've made the decision: "This needs to be decided: Do we want X or Y?" You can then take the security expert out of the equation and the result will be a halfway sensible decision. But if you take the system expert out of the equation, all you'll end up with is a mess. The security people simply can't make a decision without this system expertise. It has to be in there somehow, and the [system expertise] is very specific to a particular company. Security expertise is often not so company-specific.

Klaus Mochalski

Sure, the system won't run without the system expert, but of course it will run without the security expert. The only question is how long and how [high] is the operating risk.

Sarah Fluchs

I did not say that the system is running. I said that the security decision can be made. So the security decision, if you have made it once. And that's the big "if" behind it, I know. Once you have made it clear that we have to decide, and that these are the parameters. But this is knowledge that is relatively independent of the system itself. That's why it's more helpful to include it in such a decision than the system knowledge for an individual company. The system expert can make such a decision with a little guidance. But the security expert cannot make it without this system knowledge. And this system knowledge is difficult to generalize across systems. That is not possible.

Klaus Mochalski

Yes, that makes sense. I think the conclusion from what we have discussed so far could really be - and this is also a bit of a twist that you give to the topic of "security by design", which is quite different from what we usually hear. The responsibility cannot be given to the manufacturers and system integrators alone, but the operator of the system, the asset owner, has a very important role to play here. And only if substantial specifications come from this direction can the overall project be successful. Can you summarize it like that, or what else would you add?

Sarah Fluchs

So two small exceptions. Firstly, we are of course talking about automation systems, where there is a great deal of distributed engineering across different organizations. Of course, this does not apply to commodity IT products, where the entire user community can be asked what they would like. So I don't want to bring the grassroots democratic approach into this at all.

And the second thing is that there are, of course, certain practices and principles that make sense for manufacturers to observe regardless. Of course, it is still desirable to have as few vulnerabilities as possible in the end product - that is what "security by design" always demands, but it is a very difficult requirement. It's easy to demand, just like "security by design" itself, and everyone signs up to it. It's a great thing. But how do I get there? How do I get it right? And of course that's something the operator can do relatively little about. They can only say where weak points are particularly bad for them and where it is particularly important. Or what they consider to be a vulnerability in the first place, because that brings us to another topic: the whole issue of "insecure by design" features - features that are simply needed by plant operators or in OT, but which could potentially be defined as vulnerabilities in other areas.

A good example is always the question: "Can I actually update my PLC during operation?" From a security perspective, I naturally throw up my hands and say: "For heaven's sake, no! That opens the door to attackers. Wide-open barn doors!" But this is possible in almost every plant because the operator needs it. They can't shut down the system every time they want to change the logic, because they change it surprisingly often. These are classic cases where you have to strike a balance. You can say: "Yes, no more weak points" - that sounds like a great goal. But what does it actually mean? What actually counts as a weak point? The manufacturer can't decide that completely on its own. A dialog is needed.

Klaus Mochalski

In other words, as is so often the case, it's about balancing responsibility between the operator, the manufacturer and the integrator. The roles may well merge or overlap. But that the responsibility is distributed among all these parties. And that a project can only be successful if the problems in this tense relationship between these parties are discussed and first defined through communication and then solved during implementation. This means that none of the parties can solve the problem alone.

A manufacturer cannot build the perfect, secure product and then it is solved for the operator. Instead, the operator is never completely free of responsibility. And the important thing here is always communication between the parties in order to precisely define the requirements from the operator's point of view. The manufacturer must then have a product that can implement these requirements and the integrator must of course bring the two together and then build a system that meets these requirements.

Sarah Fluchs

I think we need to somehow reduce the blame game. At the moment, the operators are saying: "Manufacturers, just build me something that's secure!" and the manufacturers are saying "Just pay for it" or "Then tell me what you want". And I don't think it's important that we work out who has what responsibility down to the last detail. It's simply important that we start to accept that everyone has a bit of responsibility and that it makes sense to talk about it transparently, and early enough. Above all, "security by design" also means "communication by design". In other words, starting early enough to talk about certain things, to get a common picture of what you actually want and that both sides can contribute to this.

Klaus Mochalski

This means cooperation and transparency, instead of communication barriers and finger-pointing at the end when things don't work out.

Sarah Fluchs

Yes, and especially not such an arms race. "I would like to have security level four, and that includes these X requirements." And everyone actually knows that this is not possible for certain products. Then the manufacturer comes along, and of course they want to stonewall somehow and explain that they can do it all somehow. And if they can't, they're not transparent about that either. Then they come up with whatever they're trying to do somehow.

And then the operator says again: "Yes, but you can't do this and you can't do that. And it's too expensive". That doesn't help in the end. It helps if you don't even start with: "I would like security level X. It's right there in IEC 62443. Do it like this!" But if you start one step earlier with: "What do I actually want to achieve?"

Then the manufacturer who receives such a requirement has a chance to say: "Look, I know my systems. If you want to achieve this, this requirement doesn't make sense at all, but let's talk about whether we want to do it differently". And yes, this is more tedious than simply pushing over a feature list. But in the end, the result is cheaper because you're not constantly implementing features that perhaps nobody wanted and that cost a lot of money.

Klaus Mochalski

Yes, that's a nice closing statement. We are calling for more cooperation. You advocate more cooperation between the parties involved and less aggressive language and fewer arms races. In our experience too, a tender in particular often feels as if the parties are facing off against each other, as if the other side were the enemy. But we actually need to be allies from the outset, and cooperation needs to be at the forefront right from the start. Otherwise, it is much more difficult for a successful project to emerge in the end.

Sarah Fluchs

Lower expectations and increase transparency. That would be nice, even if we know that transparency is always extra difficult with security.

Klaus Mochalski

Very nice, thank you very much for that. I enjoyed it. I think we can go much deeper into this topic. I was delighted that you were here on the show today.

Sarah Fluchs

Thank you. It was fun.