4 September 2019 • Arie Verhoef
Cyber crime and organisational culture
Organisations and SMEs often don’t believe they will fall victim to cyber crime. But are they right? To what extent is organisational culture a weakness in the defence against cyber criminals? Nineteen groups of students from the ICT programme at The Hague University of Applied Sciences each planned an attack. They dispelled the sense of safety at many organisations. What do organisations think of this experience?
Using a ladder to gain entry to an organisation. Carrying a cake and following someone through a security gate. Sneaking around and taking pictures of information displayed on unlocked computer screens. Attempting to obtain login details. The students had a field day with their assignment. They tested the cyber security of organisations on at least 80 different occasions. At the request of the organisations themselves.
The 2019 student project of the ICT degree programme was its tenth edition. Michelle Ancher is one of the initiators. She is an Information Security Management lecturer at the ICT degree programme and a researcher at the research group ‘Cyber security in SMEs’. As a social psychologist, she focuses on the human factor in information security. Criminals know there are often weaknesses in that area and they fully exploit them. The fancy term for this is social engineering.
Four times a winner
Michelle sees the student project as a wonderful example of a win-win situation. “Students love to participate in this. By experiencing how easy it is to influence behaviour, they will be better equipped to protect themselves as professionals later on.”
“The degree programme incorporates these experiences into its teaching. The research group uses them to build knowledge and to conduct follow-up studies, for example setting up cyber interventions. The participating companies can safely test their cyber security and gain insight into their weak spots.”
The latter issue is what Eric Dittmar wants to know about. This was the third time he participated in the student project. As a system administrator at a government agency, he is responsible for operational security, among other things. “We have a legal obligation to test our cyber security every year and we must at least comply with the national minimum standard. This student project is very interesting to us. Unlike commercial researchers, the students take an unconventional approach.”
Up a ladder
Students were able to enter Eric Dittmar’s organisation by using a ladder. Eric: “This is not the kind of scenario that our security plans take into account.”
Another group of students was able to walk around inside the organisation for a few hours, after entering the building by slipping in behind someone going through the security gates. Unnoticed, they went around taking pictures of unstaffed desks with unlocked computer screens.
The students also sent out a phishing email to a small group of staff members, using the name of a person in authority. Eric: “Someone clicked on a link they weren’t supposed to click on.”
Eric’s organisation didn’t perform significantly worse than other companies. Michelle: “The students used many of the persuasion tactics that cyber criminals apply as well. For example, taking advantage of the herd mentality of employees: if others are doing it, it must be ok. Or authoritative behaviour: if an email comes from my superior, it must be fine.
“If you ask someone in a friendly but urgent tone a few minutes before the end of the day to provide their password ‘for security reasons’, they will rush to comply. Because it must be important and they want to go home.”
Michelle explains that they took a closer look at the impact of organisational culture on cyber security this year. “To what extent do the way people interact with each other (the social norm) and a certain type of leadership affect security within an organisation? Before cyber criminals strike, they often take their time getting to know the organisation. They will uncover the weak spots in the organisational culture.”
She notices that SMEs have a misguided risk perception. Michelle: “Companies think that they are cyber secure because they have taken security measures and staff members have responded positively. But we now know that human error often leads to cyber risks. For example, when someone accidentally clicks on a link in a phishing email.”
“Even if the policy is translated into good measures, the company is still at risk. Not only because there is something to be gained in small or medium-sized enterprises, such as client data. Criminals can also hack computers in the SME sector to create a botnet, using your ‘safe’ computer for criminal purposes.”
Facing the facts
The students forced Eric Dittmar and many other participants to face the facts. Eric sees it as a positive experience. “Every year we are quite surprised by the outcome of the project. We will be happy to participate again next year.”
Michelle: “We are looking at starting up a so-called ‘human security behaviour lab’ at The Hague University of Applied Sciences. Here we can work with our students and external partners to set up experiments to measure unsafe behaviour and design and test interventions.”
These experiments would be a great fit with the goal and mission of the research group Cyber security in SMEs: “We want to give companies some practical ways to improve their digital security, especially by focusing on the human factor. The student project is very helpful here. The students point out who is vulnerable in which way and give companies advice on how to strengthen their weak spots.”