Dataguise Presents 10 Best Practices for Securing Sensitive Data in Hadoop
Guidance Aimed at Protecting Hadoop Deployments Against Data Exposure Risks
(businesspress24) - FREMONT, CA -- (Marketwired) -- 04/03/13 -- Dataguise, a leading innovator of data security intelligence and protection solutions, today released ten security best practices for organizations considering or implementing Hadoop. By following these procedures to manage privacy risk, data management and security professionals can prevent costly exposure of sensitive data, reduce their risk profiles and better adhere to compliance mandates. Dataguise developed these practices from significant experience securing Hadoop deployments among the Fortune 200, environments that are both large and diverse.
The explosion in information technology tools and capabilities has enabled advanced analytics using Big Data. However, the benefits of this new technology area are often coupled with data privacy issues. These large information repositories may contain personally identifiable information (PII) such as names, addresses and social security numbers. Financial data such as credit card and account numbers might also be found in large volumes across these environments, posing serious concerns about access. Through careful planning, testing, pre-production preparation and the appropriate use of technology, many of these concerns can be alleviated.
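To make the discovery step concrete, the following is a minimal sketch of pattern-based sensitive data detection. The patterns and the `find_pii` helper are illustrative assumptions only; commercial discovery tools (including Dataguise's) use far more sophisticated detection such as checksums, context analysis and dictionaries.

```python
import re

# Illustrative patterns only (an assumption for this sketch); real
# discovery engines validate matches with checksums and context.
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def find_pii(text):
    """Return a list of (kind, matched_text) pairs found in a chunk of text."""
    hits = []
    for kind, pattern in PII_PATTERNS.items():
        for match in pattern.finditer(text):
            hits.append((kind, match.group()))
    return hits

record = "Customer John Doe, SSN 123-45-6789, card 4111 1111 1111 1111"
print(find_pii(record))
```

In a Hadoop context, a scanner along these lines would be run across files in each directory to build the inventory of sensitive elements that the later risk-analysis and remediation steps depend on.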
The following best practices provide valuable guidance throughout Hadoop project implementations, but are especially important in the early planning stages:
1. Start early! Determine the data privacy protection strategy during the planning phase of a deployment, preferably before moving any data into Hadoop. This prevents damaging compliance exposure for the company and avoids unpredictability in the rollout schedule.
2. Identify which data elements are defined as sensitive within your organization. Consider company privacy policies, pertinent industry regulations and governmental regulations.
3. Discover whether sensitive data is embedded in the environment, assembled or will be assembled in Hadoop.
4. Determine the compliance exposure risk based on the information collected.
5. Determine whether business analytic needs require access to real data or whether desensitized data can be used. Then choose the right remediation technique (masking or encryption). If in doubt, remember that masking provides the most secure remediation, while encryption provides the most flexibility should future needs evolve.
6. Ensure the data protection solutions under consideration support both masking and encryption remediation techniques, especially if the goal is to keep both masked and unmasked versions of sensitive data in separate Hadoop directories.
7. Ensure the data protection technology used implements consistent masking across all data files (Joe becomes Dave in all files) to preserve the accuracy of data analysis across every data aggregation dimension.
8. Determine whether tailored protection for specific data sets is required, and consider dividing Hadoop directories into smaller groups where security can be managed as a unit.
9. Ensure the selected encryption solution interoperates with the company's access control technology, and that both allow users with different credentials the appropriate, selective access to data in the Hadoop cluster.
10. Ensure that when encryption is required, the proper technology (Java, Pig, etc.) is deployed to allow for seamless decryption and expedited access to data.
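The consistent-masking requirement in practice 7 can be sketched with a deterministic mapping: the same input always yields the same pseudonym, so joins and aggregations across files remain valid. This is a minimal illustration under stated assumptions (the HMAC-based scheme, the pseudonym list and the key are all hypothetical, not any particular product's implementation):

```python
import hashlib
import hmac

# Hypothetical pseudonym pool and secret key for this sketch; a real
# masking solution would manage keys outside the Hadoop cluster.
PSEUDONYMS = ["Alice", "Bob", "Carol", "Dave", "Erin", "Frank"]
SECRET_KEY = b"masking-key-kept-outside-the-cluster"

def mask_name(name: str) -> str:
    """Deterministically map a real name to a pseudonym.

    Keyed hashing (HMAC) makes the mapping repeatable across files
    while keeping it infeasible to reverse without the key.
    """
    digest = hmac.new(SECRET_KEY, name.encode("utf-8"), hashlib.sha256).digest()
    index = int.from_bytes(digest[:4], "big") % len(PSEUDONYMS)
    return PSEUDONYMS[index]

# "Joe" receives the same mask in every file it appears in.
assert mask_name("Joe") == mask_name("Joe")
print(mask_name("Joe"), mask_name("Jane"))
```

Because the mapping is a function of the value and the key alone, two files masked independently still agree on every masked value, which is exactly what preserves cross-file analytics.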
By starting early and establishing processes that define sensitive data, detect that data in the Hadoop environment, analyze the exposure risk and apply the proper protection using either masking or encryption, enterprises can remain confident their data is protected from unauthorized access. In following these guidelines, data management, security and compliance officers cognizant of the sensitive information in Hadoop can not only lower exposure risks but also achieve a greater return on Big Data initiatives.
"Thousands of firms are working on big data projects, from small startups to large enterprises. New technologies enable any company to collect, manage, and analyze incredibly large data sets. As these systems become more common, the repositories are increasingly likely to be stuffed with sensitive data," said Adrian Lane, Analyst and CTO, Securosis. "Only after companies find themselves reliant on Big Data do they ask how to secure it. Having a plan in place to secure these unique environments during the planning phase is essential."
"Enforcing security and compliance in Hadoop is not a simple matter and requires the right combination of people, processes and technology. The best practices presented here illuminate the important procedures required to maintain the privacy of sensitive data stored in Hadoop. As indicated above, it is critical that organizations prioritize protecting the data first, providing a strong line of defense against unlawful exposures before moving forward," said Manmeet Singh, CEO, Dataguise. "With significant experience in securing Fortune 200 environments, we encourage practitioners to consult with experts when data exposure and non-compliance are not an option. This is the value beyond software provided by Dataguise."
Built for enterprise deployments of Hadoop, DG for Hadoop™ helps evaluate exposure risks and enforces the most appropriate remediation to prevent unauthorized access to sensitive data. This protects organizations against severe financial penalties and the negative impacts to brand that can result from exposure. The solution allows the user to define and detect the data in a Hadoop installation that is sensitive in nature (credit card numbers, social security numbers, account numbers, personally identifiable information, etc.), analyze the company's risk from the exposure of that data and protect the information with masking or encryption so the data can be used safely.
Dataguise is the leading provider of data privacy protection and compliance intelligence for sensitive data assets stored in both Big Data and traditional repositories. Dataguise's comprehensive, centrally managed solutions allow companies to maintain a 360-degree view of their sensitive data, evaluate their compliance exposure risks and enforce the most appropriate remediation policies, whether the data is stored on premises or in the cloud.