Our past and history are perspectives that can help us to predict, forecast, and strategize around future events. But just because something happened in the past does not mean it will happen in the future.

The past can, however, help us learn how to prepare for and take on new things approaching on the future horizon. Predicting or forecasting the future in absolute terms is impossible. But that does not mean we should ignore what is approaching us, i.e. what is moving towards us from a distant horizon. Ignoring things moving towards us that could potentially harm us is a highly sub-optimal way of operating.

In this article, I will speak about how we, as security professionals, can better prepare our organizations for things emerging on the future horizon. I will explain the term “emerging risk” in the context of security: what it is and why I see it as something organizations should spend resources on. Unfortunately, emerging risk management is one of those things I have noticed very few organizations invest time and resources in.

If you want to understand what risk is in the context of security, I have explained it in my article What is Risk? Modeled & Explained. I think it makes a good starting point for the subject of “risk”.



An emerging risk is a new or unforeseen form of risk that has not yet been fully contemplated and understood. It is a risk that should be on our risk radar.

An emerging risk usually carries an uncertain trajectory due to its rapid evolution and nonlinear progression. The potential negative impact, harm, damage, or other forms of negative consequence are not fully known and understood.

So what does this explanation mean? It is roughly comparable to what you will find in textbooks, frameworks, and standards on what an emerging risk is. I think the explanation is good, and I will not argue against it or try to cut through the semantics of it. I am less about that stuff (as you as a reader might have figured out from my previous articles).

Throughout this article, I will elaborate on the explanation and exemplify the meaning with the help of some analogies, past scenarios, and actual ongoing emerging risks (as of the date this article was written). Stay tuned!

Emerging risk is not a concept isolated to, or only applicable to, security. But as digitization reaches higher and higher penetration in organizations and in our society, many emerging risks are, in general, related to new emerging technologies. Keep in mind, though, that emerging risks can also derive from, for example, geopolitical, regulatory, and economic perspectives. Emerging risk is not a “security concept/tool/method”.


Take a look at the illustration and think about the explanation I provided above of what an emerging risk is.

The illustration shows a boat approaching an iceberg. Only the top of the iceberg is exposed; the boat does not see the mass hiding below sea level.

Emerging risks are like icebergs

This analogy, the iceberg and boat thing, is a pretty good one for what emerging risks are.

My analogy does not, though, cover the “future” perspective. It can, to some extent, be related to “horizon scanning”: the earlier the iceberg is discovered, the better the conditions for the captain and crew to take action and avoid a potential negative impact. The same thinking applies to how emerging risks should be managed.

Let us put some more context and analogies around emerging risks and what they mean from a future perspective in the context of security. But before that, I will explain what an emerging risk is not.


Icebergs are large chunks of ice that break off from glaciers. This process is called calving. Icebergs float in the ocean, but are made of frozen freshwater, not saltwater.

Some icebergs near Antarctica can be as big as the Italian island of Sicily, the largest island in the Mediterranean Sea. As little as one-eighth of an iceberg is visible above the water. Most of the mass of an iceberg lies below the surface of the water. This is where the phrase "tip of the iceberg" came from, meaning only part of an idea or problem is known.



Emerging risks are not those crazy science fiction things that we see in movies or in our dreams. Or that we fantasize about when drunk. Emerging risks are not things totally out of proportion to realism. Those things are fantasies and dreams without attachment to reality.

I am not speaking about emerging risk as something in terms of a super-fast and evil-looking alien spacecraft flying in from another galaxy through a wormhole that opens up in the sky.

It lands on our planet and the aliens jump out; they are savant-level hacker geniuses. They plug their brains into the core network of our infrastructure and wipe out our planet’s digital technology, right as we have entered the state of technological singularity. The aliens pull off an evil laugh, smirk at us, and send us back to the stone age. <MOOOOHAAAAHAHAHAHA>.

They levitate back into the spacecraft and warp back into their own dimension. At the top of a mountain, in a desert in the middle of f-ing nowhere, they leave a secret and encrypted message. We humans do not manage to put our brains together to understand what it means. The message is written in the aliens’ language and we do not have the capability to decipher it. But we clearly see it, written in green neon text, carved into the mountaintop: the only piece of technology left on planet Earth. The message is the answer to all the mysteries and questions of mankind.

The message says:

“We did this to let you start over and give you a new chance to build up the earth to a wonderful place. Please do not f@*k this up again dear friends.”.

This is a less likely emerging risk. Something like it might happen in the future, but the scenario is not realistic. It plays out more like a bad science-fiction movie. Like Sharknado. Who even comes up with, and produces, a movie like that? Sharks whirling up in a tornado and going bananas. And there are six of these movies! The last one was released in 2018! No offense to those who produced them. Obviously, sharks in tornados are a thing when it comes to the movie screen.



Maybe my emerging risk analysis skills about those alien hackers are worth something. A Hollywood production about these alien hacker dudes saving planet after planet that is being destroyed by its native species. If you have any good suggestions for a movie title, or feel like cooperating on the project, let me know! I think I am on to something big here! Hollywood, if you read this, this might be the next big thing after Sharknado.



Ok, so now I have laid out for you what emerging risks are not. The scenario I described of those aliens does not, at least as I see it, feel like one that is very likely to play out in the near future.

To give you a few examples of the hottest topics of recent years around emerging risks, these are:

  • Quantum computing
  • Nanotechnology
  • Artificial intelligence
  • Increased regulations

This article was first written in 2023. For obvious reasons, the list will change as the years go by. I guess this is quite obvious, but I want to make it super clear. You never know; crazier things have happened. Like Sharknado.

The list could be made longer; it is not by any means a list of “all emerging risks” out there. But maybe it provides some inspiration for you and your organization to start a discussion around these topics and how they may affect you?

The quantum threat is an emerging risk

I strongly recommend spending time researching these topics and getting some feeling for each of them. There is a ton of useful information available on the internetz. Maybe just give each topic 60 minutes of reading; it will probably open up some more questions and lead to some interesting discussions in your organization.

The quantum topic might lead to something like this:

“I read about this thing called quantum computing. We as an organization have loads of digital solutions and security capabilities relying on encryption methods that are threatened from a quantum perspective. Wouldn’t it be a good idea to start with an inventory of our encryption landscape, to better understand and document our current situation?”

“Oh, I had not even thought about that! It sounds like a complicated exercise, but we definitely should do it to ensure we and our customers are prepared in the best way possible for that coming threat and risk.”
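If your organization decides to run such an encryption inventory, even a very simple one beats none. Below is a minimal, hypothetical sketch in Python; the systems and algorithm names are made-up examples, and the classification only reflects the commonly cited quantum threat model (Shor’s algorithm breaks today’s public-key algorithms such as RSA and elliptic curves, while Grover’s algorithm roughly halves the effective strength of symmetric keys and hashes):

```python
# Hypothetical sketch of a cryptographic inventory. The systems and
# algorithms below are made-up examples for illustration only.

# Public-key algorithms broken by a large-scale quantum computer (Shor)
QUANTUM_BROKEN = {"RSA-2048", "ECDSA-P256", "DH-2048"}
# Symmetric/hash primitives only weakened (Grover); larger sizes hold up
QUANTUM_WEAKENED = {"AES-128", "SHA-256"}

inventory = [
    {"system": "Customer portal TLS", "algorithm": "RSA-2048"},
    {"system": "VPN gateway",         "algorithm": "ECDSA-P256"},
    {"system": "Database at rest",    "algorithm": "AES-256"},
    {"system": "Legacy file share",   "algorithm": "AES-128"},
]

def classify(algorithm: str) -> str:
    """Rough quantum-risk category for an algorithm name."""
    if algorithm in QUANTUM_BROKEN:
        return "migrate (broken by Shor)"
    if algorithm in QUANTUM_WEAKENED:
        return "review (weakened by Grover)"
    return "ok for now"

for entry in inventory:
    print(f"{entry['system']}: {entry['algorithm']} -> {classify(entry['algorithm'])}")
```

A real inventory would of course pull this data from certificate stores, code scans, and configuration management rather than a hard-coded list, but even a spreadsheet-level start makes the discussion concrete.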


A relevant emerging risk scenario from the past, but still valid to speak about from a security perspective, is the cloud. As the cloud came into the industry as a digital enabler, loads of organizations dived in head first without even thinking about the risks involved. The cloud did not just appear out of nowhere, though. It was a hot topic for years! And still is! It was there on the horizon, approaching organizations and emerging.

Many organizations, even today, do not assess the risks involved when investing in cloud services. At this point, though, the cloud in its current form is no longer categorized as an emerging risk.

I think that if organizations had approached the cloud early and started analyzing the potential risks related to the “cloud things”, the situation may have looked a bit different here and there. The penetration of cloud services might have been higher in our world. Maybe we would already be approaching Cloud-vNext security things? Maybe we would have seen fewer security breaches related to the cloud? Maybe laws and legislation would have been more tailored toward the cloud?

Past emerging risk of cloud computing
Data and information skyrocket into the cloud.

Lots of those debates and risks we speak about today would maybe have been managed by now. Or maybe not. Maybe we would be in exactly the same situation, but I do not believe so. I think things would have looked a bit different.

The cloud transformation was unstoppable and still is. It is today a commodity service. This is the form of digital service that organizations expect. And do not get me wrong, I am a strong supporter of cloud services. And I am a strong supporter of risk management. Not only when it comes to my profession but in life in general. That is just who I am.

And let us put this example into the perspective of what we see taking place at the moment, as ChatGPT came alive. Do you see the similarities? Are we sliding down the mountaintop, trying to hold on to our asses? Are we jumping onto the bandwagon too fast, not reflecting on and contemplating the implications of AI? Are we, as security professionals/humans/organizations/society, moving a bit too fast because we blind ourselves by only looking at the positive effects, without spending the same amount of time and energy thinking about the potential negative effects?

Some of those who know me have, for a couple of years now, heard my reflections on how I think AI will impact us once its capabilities reach a higher penetration in our organizations, in society, and in us as humans. Not only from a security perspective but in general. I do not say this with a negative or positive tone; it is just how I see it. It will make an impact in many places for us as humans. Some of it might be negative and some might be positive. But I do not think the solution is to prohibit things or work against them. That did not really work with the cloud. Yes, AI is a different animal, and I am totally aware of that. But the principles should remain the same. The evolution of technology is exponential; we cannot fight against it. It will only become more advanced and fast-paced.

We as security practitioners/humans/organizations/society should analyze, reflect on, discuss, and contemplate the effects (negative and positive) that a change (AI or not) will have on us. If we start doing this as soon as we identify those emerging technologies, or at least in their earlier phases, it will for sure be to our advantage.

And personally, I think the risks we see today related to AI are just the very tip of the iceberg. Data leakage and unethical usage of AI to conduct adversarial attacks are only the beginning. And these are nothing new; these forms of risk have manifested before, just through different methods, scenarios, and contexts.

But what other forms of risk are there when it comes to AI? And what may AI mean for us as humans from perspectives such as:

  • Psychological
  • Existential
  • Ethical
  • Financial
  • Intellectual
  • <the list can go on>

And as you can see, the list above does not mention the technical things related to AI, or “AI will take my job”. Those are also risks, but why not zoom out and look at the bigger picture first, and from there zoom in on the details when needed? Technology often comes down to those more detailed, lower-level things. Anyway, this is more about how my own brain works.

I like to analyze and contemplate things. I do not see risks as something “bad” or things to get paranoid about. Risks are everywhere. And once they are identified, there is usually a better way to approach them: an intelligent response and treatment option that reduces a potentially negative outcome or event.

I think that “sitting and waiting for someone else to come up with the answers” is one way to do things, but not what we as a human collective should do. I would rather take an active approach and seek knowledge, to gain wisdom from those uncertainties. Or at least close the gap by getting a holistic understanding of what is approaching from the unknown. And doing so together, with the help of a group of people, amplifies our powers. The more brain power we can bring together, the stronger we (in general) become.

For those who think that I am against AI and a member of the “Nay-AI team”: no, I am not against it at all. I am a fan of it. But I am contemplative about it, and as an individual security dude writing articles on my blog, I will not change the course or direction of where things are going. If we want to do that, we need to do it as a team.

We, as humans (independent of our profession), need to come together and have those wider discussions around the subject. It is not enough if a small group of experts sits in one corner, a guy like me writes about it on a website, or people rant in general on the internetz. All these forms of initiative have their place, but I do not think they are how we will best prepare our organizations, society, and ourselves for AI or the other things approaching from a future horizon.

And yes, there are many organizations (non-profit and for-profit) and individuals who are leading the way around AI and these discussions. To you, the organizations and individuals leading the way: thank you for all you are doing!

But, unfortunately, I think that too few of the general population take an interest in AI or future things. There is nothing wrong with that; it is just how it is. But this low interest in future technology I see as a risk of its own: that we as humans are a bit too uninterested in how emerging technology will shape our future and the generations to come. For example, what will AI mean, from both a positive and a negative intellectual perspective, for those who grow up with it? Will it make them smarter? Will it make them better at asking questions? Will it make them <insert your thoughts here>?

If we take an active approach towards emerging technology and risks, we can better prepare for the future. Not predict or forecast it in absolute terms or forms, but we will for sure have better preparedness.


Remember, emerging risks are like icebergs. They are risks where we do not understand what is hidden “underneath”. We do not understand the harm, negative impact, or damage they may cause if they materialize. We might only see the tip of them.

And emerging risks, in the context of security, come in many different forms: AI, quantum, regulations, cybercrime, cyber war, emerging technology. The list goes on.

Emerging risks are like icebergs

One of the takeaway messages from today’s article, for you who are reading it, is to ask yourself and your organization how you are working with emerging risks.

Take a couple of minutes and reflect on the subject of this article and ask yourself these questions:

  • How are we working with emerging risks in our organization?
  • What emerging risks apply to our organization?
  • Who owns the process for emerging risk management in our organization?
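The questions above can be made concrete with something as lightweight as an emerging-risk register. Here is a minimal sketch in Python; the field names and example entries are my own assumptions, not taken from any standard:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class EmergingRisk:
    """One entry on the organization's emerging-risk radar (illustrative)."""
    name: str
    category: str          # e.g. technology, regulatory, geopolitical
    owner: str             # who drives analysis and follow-up
    horizon_years: int     # rough estimate of when impact may materialize
    next_review: date      # forces the radar to be revisited
    notes: list = field(default_factory=list)

radar = [
    EmergingRisk("Quantum threat to encryption", "technology",
                 "CISO office", 7, date(2024, 6, 1)),
    EmergingRisk("AI-assisted adversarial attacks", "technology",
                 "Security operations", 2, date(2024, 1, 15)),
]

# Surface the entries with the nearest horizon first
for risk in sorted(radar, key=lambda r: r.horizon_years):
    print(f"{risk.horizon_years}y  {risk.name}  (owner: {risk.owner})")
```

The exact fields matter less than having an owner and a review date: those two force the radar to be looked at regularly, instead of being written once and forgotten.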

Henrik Parkkinen