ChatGPT has seemingly graduated to dinner table conversation status over the past month. It is hard to go anywhere without hearing or reading about how the emerging technology responds to questions or prompts in human-like text.
Its ability to generate text has been used to make digital lives more efficient, spark creativity, and even write music, plan events, and support technical tasks. But as with any new technology, there is concern about its potential misuse, with cyber criminals taking advantage of these advancements to create sophisticated phishing emails, ransomware, and malware.
Emerging cyber security threat landscape
While tools like ChatGPT attempt to limit malicious input and output, threat actors are always exploring new ways to leverage emerging technology for nefarious purposes. In the wrong hands, computer-generated content can be used to spread misinformation, create fake identities, and manipulate public opinion, thereby exacerbating social and political unrest, especially in countries like Australia. Trellix’s latest Threat Report: February 2023 examined the already fragile state the country is in. Malicious activity in Q4 2022 revealed Australia was the sixth-most affected country by LockBit 3.0, one of the most aggressive forms of ransomware.
The report also highlights that the transportation and shipping sector has been hit particularly hard by nation-state activity, with 69 per cent of attacks targeting this critical infrastructure industry. This has significant implications for supply chain security, as disruptions to transportation and shipping can have a ripple effect on other industries. The energy, oil, and gas sectors are also at risk, emphasising the importance of strong cyber defences across these critical infrastructure industries.
Business email compromise (BEC) is another growing threat, with fake chief executive emails a common tactic among cyber criminals recently. According to the Trellix Threat Report, 78 per cent of BEC attacks involve fake CEO emails using common CEO phrases. This represents a 64 per cent increase over the last three months, indicating that the tactic is becoming more prevalent. In some cases, cyber criminals are using vishing schemes to extract sensitive information from employees. This involves calling employees from a fake phone number and asking them to confirm their direct number, which can then be used in further attacks. Perhaps most concerning is that 82 per cent of these attacks are sent using free email services, meaning threat actors require no special infrastructure to execute their campaigns. This lowers the barrier for cyber criminals and makes it harder for organisations to defend themselves.
As we explore new technological advancements such as ChatGPT, organisations need to make the most effective use of scarce resources to protect themselves against emerging threats. Cyber security must be a top priority during this exploration era, with businesses and individuals alike taking steps to protect themselves against the growing threat of cyber attacks. This includes implementing strong security protocols, regularly updating software, and educating employees across all industries on how to recognise and respond to potential threats.
The hidden truth
These possibilities suggest that computer-generated content has become a battleground between benign and malicious intent in Australia. However, it is essential to remember that ChatGPT in and of itself is not malicious. In fact, the innovative tool has the potential to make Australians’ lives easier and more efficient, and even to support cyber security professionals in a variety of ways.
As a language model, it has been trained to understand natural language and can make difficult concepts easy to grasp by explaining them in simpler terms. The powerful artificial intelligence (AI) tool can also potentially help manage cyber security complexity by generating code, remediation steps, guided investigations, and response plans to help combat possible threats. Its ability to understand and respond to natural language queries makes it a valuable asset for tackling sophisticated challenges, while its scalability and adaptability make it well suited to helping large organisations manage their systems.
In addition to supporting cyber security professionals, ChatGPT can help people generate text for routine tasks such as responding to emails, drafting social posts, creating blog posts, or scheduling meetings, freeing up time for more important tasks. It can also be used to provide personalised recommendations based on a user’s preferences and past behaviour, making it easier to find the information and services that are most relevant to them.
As ChatGPT is exposed to more data, it will become better at understanding and responding to natural language queries. This means it can adapt to new opportunities and challenges over time, making it a valuable asset for society, like many previous technological advancements. As the world becomes more reliant on technology, innovations like ChatGPT will play an increasingly important role in making our lives easier across the board.
Untapped success on the horizon
The fact of the matter is we’re entering a new era of technology. It’s time to start looking at how to integrate these services and learn new skills so we can be more effective. While we explore those opportunities, we must remain vigilant about potential risks and remember it’s still an evolving technological tool with much room for growth.
Incorporating security solutions, such as email security, endpoint security, data loss prevention, and network detection and response, can help keep Australian organisations protected. There is a real opportunity to harness AI technologies to make what we do more efficient and effective if we can keep systems protected while we explore this new frontier.
Luke Power, ANZ managing director at Trellix