InfoSec and Compliance – With 20 years of blogging experience, DISC InfoSec blog is dedicated to providing trusted insights and practical solutions for professionals and organizations navigating the evolving cybersecurity landscape. From cutting-edge threats to compliance strategies, this blog is your reliable resource for staying informed and secure. Dive into the content, connect with the community, and elevate your InfoSec expertise!
For the third time in the past four months, LinkedIn seems to have experienced another massive data scrape conducted by a malicious actor. Once again, an archive of data collected from hundreds of millions of LinkedIn user profiles surfaced on a hacker forum, where it's currently being sold for an undisclosed sum.
The growing reliance on public cloud services as both a source and repository of mission-critical information means data owners are under pressure to deliver effective protection for cloud-resident applications and data. Indeed, cloud is now front of mind for many IT organisations. According to recent research by Enterprise Strategy Group (ESG), cloud is "very well-perceived by data protection decision makers", with 87% of respondents saying it has made a positive impact on their data protection strategies.
However, many organisations are unclear about what levels of data protection are provided by public cloud infrastructure and SaaS solutions, increasing the risk of data loss and compliance breaches. At the same time, on-premises backup and disaster recovery strategies are increasingly leveraging cloud infrastructure, resulting in hybrid data protection strategies that deliver inconsistent service levels.
Despite these challenges, a significant number of organizations still don't use a third-party data protection solution or service. This should be cause for concern, considering that everything an organization stores in the cloud, from emails and files to chat history and sales data (among many other datasets), is its responsibility and is subject to the same recoverability challenges and requirements as traditional data. In fact, only 13% of survey respondents see themselves as solely responsible for protecting all their SaaS-resident application data.
Records and Information Management: Fundamentals of Professional Practice, Fourth Edition presents principles and practices for systematic management of recorded information. It is an authoritative resource for newly appointed records managers and information governance specialists as well as for experienced records management and information governance professionals who want a review of specific topics. It is also a textbook for undergraduate and graduate students of records management or allied disciplines, such as library science, archives management, information systems, and office administration, that are concerned with the storage, organization, retrieval, retention, or protection of recorded information.
The fourth edition has been thoroughly updated and expanded to:
Set the professional discipline of RIM in the context of information governance, risk mitigation, and compliance and indicate how it contributes to those initiatives in government agencies, businesses, and not-for-profit organizations
Provide a global perspective, with international examples and a discussion of how records management issues differ in different parts of the world.
Emphasize best practices and relevant standards.
The book is organized into seven chapters that reflect the scope and responsibilities of records and information management programs in companies, government agencies, universities, cultural and philanthropic institutions, professional services firms, and other organizations. Topics covered include the conceptual foundations of systematic records management, the role of records management as a business discipline, fundamentals of record retention, management of active and inactive paper records, document imaging technologies and methods, concepts and technologies for organization and retrieval of digital documents, and protection of mission-critical records. In every chapter, the treatment is practical rather than theoretical. Drawing on the author's extensive experience supplemented by insights from records management publications, the book emphasizes key concepts and proven methods that readers can use to manage electronic and physical records.
The role of a Data Protection Officer (DPO) is a fairly new one in many companies. What's more, the need to hire a DPO often comes as a response to the General Data Protection Regulation (GDPR), which came into force back in 2018. As such, the responsibilities, reporting and structure of the role are primarily defined by GDPR guidelines.
But though it might be a fairly new role, it can be a very exciting and rewarding one. So if you're considering a career as a data protection officer, this guide is for you. Below, we'll take a look at what the role entails and what you need to do to get a job as a DPO.
What is a Data Protection Officer and What Do They Do?
In a nutshell, a data protection officer is a steward for data protection and privacy within a business. They must implement effective data protection strategies and facilitate a culture of data protection throughout the company, to ensure companywide compliance with the GDPR. The appointment of a DPO is mandatory in some businesses, particularly those in the public sector or those that process a large amount of personal data. That being said, some businesses choose to appoint a DPO even though they are not legally required to, as it pays to have someone in charge of compliance and data privacy.
The GDPR states that the DPO should report directly to the highest management level. A DPO's key responsibilities include:
Ensuring that the business applies data protection laws appropriately and effectively, and that it follows these regulations and legislation.
Educating and training management and all other employees on the GDPR, other data protection statutes, and compliance, and demonstrating effective measures and strategies for data handling and processing.
Conducting regular security audits.
Acting as the point of contact between the company and any supervisory authorities (SAs). For example, if there is a data breach, it is the job of the DPO to report this to the relevant authorities.
With this in mind, here's how you can tailor your career path to lead to the role of a data protection officer.
What Skills Do You Need to Become a DPO?
She alleges that TikTok is violating the GDPR (General Data Protection Regulation) by collecting excessive data and failing to explain what it's used for.
Children's data is subject to special protections under the GDPR, including the requirement that privacy policies must be written in a way that's understandable to the service's target audience.
Today I'm launching a legal claim against @tiktok_uk on behalf of millions of children whose data was illegally taken and transferred to unknown third parties for profit. Learn more about our fight to protect children's privacy @TikTokClaimUK for updates https://t.co/eSCxj4Jwql pic.twitter.com/LBvNHq7Oth
Ata Hakcil led the team of white hat hackers from WizCase in identifying a major data leak on online trading broker FBS' websites.
The data from FBS.com and FBS.eu comprised millions of confidential records including names, passwords, email addresses, passport numbers, national IDs, credit cards, financial transactions and more.
Were such detailed personally identifiable information (PII) to fall into the wrong hands, it could have been used to execute a wide range of cyber threats. The data leak was unearthed as part of WizCase's ongoing research project that randomly scans for unsecured servers and seeks to establish who their owners are. We notified FBS of the breach so they could take appropriate action to secure the data. They got back to us a few days later and secured the server within 30 minutes.
What's Going On
Forex, short for foreign exchange, is the process of converting one currency into another for a wide range of reasons, including finance, commerce, trading and tourism. The forex trading market averages more than US$5 trillion in daily trading volume. Forex trading may be dominated by banks and global financial services but, thanks to the Internet, the average person can today dabble directly in forex, securities and commodities trading.
In the rush toward online trading though, users have entrusted terabytes of confidential data to online forex trading platforms. With financial transactions being at the core of forex trading, the nature of user data held in these trading databases is highly sensitive. This has made online trading sites a lucrative target for cybercriminals.
FBS, a major online forex trading site, left an unsecured ElasticSearch server containing almost 20TB of data and over 16 billion records. Despite containing very sensitive financial data, the server was left open without any password protection or encryption. The WizCase team found that the FBS information was accessible to anyone. The breach is a danger to both FBS and its customers. User information on online trading platforms should be well secured to prevent similar data leaks.
Data hygiene consists of actions that organizations can, and should, take not only to follow compliance requirements, but also as part of basic risk management practices. Consistent, risk-specific data hygiene practices support a very wide range of data protection compliance requirements, and performing data hygiene activities also demonstrably improves an organization's data security effectiveness without significantly increasing IT or information security costs. Most of these actions involve activities that all personnel within an enterprise can perform. No specialized tools are typically needed: just some training and ongoing awareness reminders, or periodic use of data management tools.
These actions serve to:
limit the amount of data collected to only that which is necessary to support the purposes of the data collection
keep data from being modified in unauthorized ways, or accidentally
destroy/delete data when it is no longer needed to support the purpose(s) for which it was collected and to meet legal retention requirements
restrict access to data to only those entities (devices, individuals, accounts, etc.) that have a business/validated need to access the data
not share data with others unless necessary and with the consent of those about whom the data applies, as applicable
keep your own personal and business data from being used and posted in ways to which you did not consent or that are not necessary to support the purposes for which you originally allowed the data to be collected or derived
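The retention-related action in the list above lends itself to simple automation. The sketch below is a hypothetical illustration (the category names and retention periods are invented for the example) of purging records once their retention period has elapsed:

```python
# Hypothetical data hygiene sketch: drop records that have outlived
# their category's retention period.
from datetime import datetime, timedelta

# Assumed retention schedule; a real one comes from legal/compliance.
RETENTION = {
    "chat_history": timedelta(days=365),
    "sales_data": timedelta(days=365 * 7),
}

def expired(record: dict, now: datetime) -> bool:
    """True if the record is past its category's retention window."""
    return now - record["created"] > RETENTION[record["category"]]

def purge(records: list, now: datetime) -> list:
    """Return only the records still within their retention window."""
    return [r for r in records if not expired(r, now)]

now = datetime(2021, 6, 1)
records = [
    {"category": "chat_history", "created": datetime(2019, 1, 1)},  # expired
    {"category": "sales_data", "created": datetime(2019, 1, 1)},    # retained
]
kept = purge(records, now)
assert [r["category"] for r in kept] == ["sales_data"]
```

A periodic job running logic like this keeps the data footprint, and therefore the breach exposure, no larger than the business purpose requires.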
Jean Le Bouthillier, CEO of Canadian data security startup Qohash, says that organizations have had many issues with solutions that generate large volumes of often irrelevant and non-actionable data.
"My first piece of advice for organizations looking for the right data security solutions would be to consider whether they provide valuable metrics and information for reducing enterprise data risks. It sounds obvious, but you'd be surprised at the irrelevance and noisiness of some leading solutions, a problem that is becoming a nightmare with data volumes and velocity multiplying," he told Help Net Security.
Organizations should also analyze the pricing model of solutions and ensure that it does not present an unwelcome dilemma.
"If the pricing model for protecting your data is volume-adjusted, it will mean that over time, as data volumes increase, you'll be tempted to reduce the scope of your protection to avoid cost overruns," he noted. Such a situation should ideally be avoided.
Another important point: consider returning to basics and ensuring that you have a solid data classification policy and the means to automate it.
"Data classification is the fundamental root of any data security governance because it provides clarity and authority to support standards and other programs like user awareness efforts. In the context of data governance, data visibility and, ultimately, data-centric controls can't work without data classification," he explained.
"Think back on the millions of dollars spent on artificial intelligence projects that didn't result in operational capabilities because little attention was paid to data quality, and accept that data protection projects, like any other ambitious project, can't succeed without rock-solid foundations."
OVH, one of the largest hosting providers in the world, has suffered this week a terrible fire that destroyed its data centers located in Strasbourg. The French site in Strasbourg includes four data centers, SBG1, SBG2, SBG3, and SBG4, all of which were shut down due to the incident; the fire started in SBG2.
The fire impacted the services of a large number of OVH's customers; for this reason, the company urged them to implement their disaster recovery plans.
Nation-state groups were also impacted by the incident. Costin Raiu, the Director of the Global Research and Analysis Team (GReAT) at Kaspersky Lab, revealed that 36% of 140 OVH servers used by various threat actors as C2 servers went offline. The servers were used by cybercrime gangs and APT groups, including the Iran-linked Charming Kitten and APT39 groups, the Bahamut cybercrime group, and the Vietnam-linked OceanLotus APT.
Out of the 140 known C2 servers we are tracking at OVH that are used by APT and sophisticated crime groups, approximately 64% are still online. The affected 36% include several APTs: Charming Kitten, APT39, Bahamut and OceanLotus.
Of course, the incident only impacted a small portion of the command and control infrastructure used by multiple threat actors in the wild; almost every group leverages multiple service providers and bulletproof hosting to increase the resilience of its C2 infrastructure against takedowns operated by law enforcement agencies with the help of security firms. "In the top of ISPs hosting command and control infrastructure, OVH is in the 9th position, according to our tracking data. Overall, they are hosting less than 2% of all the C2s used by APTs and sophisticated crime groups, way behind other hosts such as CHOOPA," Raiu told The Reg.
"I believe this unfortunate incident will have a minimal impact on these groups' operations; I'm also taking into account that most sophisticated malware has several C2s configured, especially to avoid take-downs and other risks. We're happy to see nobody was hurt in the fire and hope OVH and their customers manage to recover quickly from the disaster."
In this post, we are going to talk about MITRE ATT&CK® technique T1001, Data Obfuscation. As the name indicates, this technique consists of making data, usually sent over Command and Control (C&C) communications, more difficult to detect and decode. There are countless ways to do that, but here we are going to focus on disguising payloads (which can be simple information, but also files such as malware or scripts) as images.
Why would someone do that? Mainly because lots of images are downloaded every day as users surf the internet. Downloading an image-like file therefore blends perfectly into regular traffic and does not stand out to a network security control that, for instance, blocks the download of Windows binaries or PowerShell scripts, or does not look for malicious content in an image file. Since these files do not show up as executables, they can fly under the radar of an antivirus or endpoint detection and response (EDR) capability more easily.
Below we will show three examples of how to obfuscate data into image files, namely:
Adding a JPEG header to the data;
Appending the data to a JPEG image; and
Embedding the data into a PNG image using Least Significant Bit (LSB) steganography.
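As a minimal sketch of the first technique in the list above, a payload can simply be prefixed with a JPEG signature so that naive file-type checks, which inspect only the leading magic bytes, classify it as an image. The helper names here are our own; a real implant would produce a more complete file structure.

```python
# Sketch: disguise an arbitrary payload as a JPEG by prepending the
# JPEG magic bytes (SOI marker plus a minimal JFIF/APP0 header).
JPEG_MAGIC = b"\xff\xd8\xff\xe0\x00\x10JFIF\x00"

def add_jpeg_header(payload: bytes) -> bytes:
    """Return the payload prefixed with a JPEG signature."""
    return JPEG_MAGIC + payload

def strip_jpeg_header(disguised: bytes) -> bytes:
    """Recover the original payload on the receiving end."""
    return disguised[len(JPEG_MAGIC):]

data = b"example C2 payload"
disguised = add_jpeg_header(data)
assert disguised.startswith(b"\xff\xd8\xff")    # sniffers see a JPEG signature
assert strip_jpeg_header(disguised) == data     # round-trips cleanly
```

A signature check alone is easily fooled this way, which is why defenders should also validate that a supposed image actually parses as one.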
For the second time in as many years, Google is working to fix a weakness in its Widevine digital rights management (DRM) technology used by online streaming sites like Disney, Hulu and Netflix to prevent their content from being pirated.
The latest cracks in Widevine concern the encryption technology's protection for L3 streams, which is used for low-quality video and audio streams only. Google says the weakness does not affect L1 and L2 streams, which encompass more high-definition video and audio content.
"As code protection is always evolving to address new threats, we are currently working to update our Widevine software DRM with the latest advancements in code protection to address this issue," Google said in a written statement provided to KrebsOnSecurity.
In January 2019, researcher David Buchanan tweeted about the L3 weakness he found, but didn't release any proof-of-concept code that others could use to exploit it before Google fixed the problem.
5 Common Accidental Sources of Data Leaks – Nightfall AI
In cybersecurity and infosec, it's common to assume that criminals are behind all data breaches and major security events. Bad actors are easy to blame for information leaks or account takeovers, because they're the ones taking advantage of vulnerabilities in systems to worm their way in and cause massive damage. But how do they gain access in the first place? Most of the time, well-meaning everyday people are the real source of data insecurity.
A study of data from 2016 and 2017 indicated that 92% of security data incidents and 84% of confirmed data breaches were unintentional or inadvertent. Accidental data loss continues to plague IT teams, especially as more organizations are rapidly moving to the cloud. While it's important to prioritize action against outside threats, make sure to include a strategy to minimize the damage from accidental breaches as well.
This list of five common sources of accidental data leaks will help you identify the problems that could be lurking in your systems, apps, and platforms. Use these examples to prepare tighter security controls and keep internal problems from becoming major issues across your entire organization.