
Data Protection - Science topic

Explore the latest questions and answers in Data Protection, and find Data Protection experts.
Questions related to Data Protection
  • asked a question related to Data Protection
Question
7 answers
Courts are notoriously overloaded. Added to this are long breaks during ongoing proceedings, which require repeated re-reading of the case. Furthermore, while judges are formally independent and external interference is hardly verifiable (although in the case of the Federal Constitutional Court, many now doubt this), even Don Corleone's blackmail was not verifiable – he only made offers to people they couldn't refuse. Thus, there are a number of factors that can lead a court to take a shortcut to reaching a verdict.
The problem here is the sheer power with which judges are ultimately endowed. While there are legal remedies for appeal and revision, this does not necessarily mean that the next instance will deal with the case more responsibly, nor can all defendants afford to take legal action. How can we better control the arbitrariness that is often suspected, especially in political matters (as suggested in this case by an expert opinion-like article: https://www.achgut.com/artikel/Gefaehrden_Karikaturen_den_oeffentlichen_frieden)?
AI may offer a way forward. The article in question sets out the aspects that the courts should have considered. Since the article was written by a judge, it can be assumed that he described the framework correctly. If an AI were trained to evaluate cases and compare them with the judgments, it would likely reveal that certain aspects were not taken into account in the judgment. The AI is not intended to reach a judgment itself, but rather to analyze the completeness of the evidence.
If the AI evaluation is made public, the court would at least be required to explain why it reached conclusions that only partially harmonize with that evaluation, or would have to face unpleasant criticism in an appeal/review (the opposite is also true, of course: if the appellate/review court wants to view certain cases differently, it would also have to provide more detailed reasons than is currently the case). At the very least, politically motivated judgments would be significantly more difficult to enforce, and presumably the burden on appeal and review courts would even be reduced, because a lack of due diligence at the lower level would hardly be an effective justification.
How would such an AI be trained? Of course, such training should not end up in the hands of anyone with any ties to politics, as this would open the door to bias. Two possibilities are available:
(1) Training based on judgments in completed proceedings, taking all instances into account. During testing, the AI should already notice that the above-mentioned judgment differs from other judgments because it does not take certain procedural aspects into account. Since judgments are always public, there would be no data protection issues.
Perhaps somewhat more specific, but also conceivable, would be the inclusion of case files, although this would be assessed more strictly from a data protection perspective. However, this would also reveal sloppy work by the investigating authorities in certain cases.
(2) Training based on "case reports." For this purpose, universities can prepare exam papers that analyze cases according to all legal rules, specifically highlighting procedural omissions, and are evaluated by professors for completeness and accuracy. Such reports could be prepared independently of time constraints and other pressures, would not represent a significant cost factor, and could potentially have a positive impact on future careers as judges or prosecutors.
In principle, such approaches already exist, although they are often met with critical comments that an AI could not ethically compete with or replace humans. However, that is not what it is intended to do, and there are different opinions about the ethical standards of some state lawyers. If such tools exist, we should consider using them sensibly instead of rejecting them outright.
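The completeness check described above can be sketched in miniature. In this hypothetical sketch, a plain keyword checklist stands in for a trained model, and the aspect names and keywords are invented for illustration rather than taken from any real case:

```python
# Illustrative sketch only: a trained model would extract the legal aspects
# actually addressed in a judgment; a keyword checklist stands in for it here.
# All aspect names and keywords below are hypothetical examples.

REQUIRED_ASPECTS = {
    "artistic freedom": ["artistic freedom", "artistic"],
    "freedom of expression": ["freedom of expression", "free speech"],
    "public peace": ["public peace", "public order"],
    "proportionality": ["proportionality", "proportionate"],
}

def missing_aspects(judgment_text: str) -> list[str]:
    """Return the required aspects that the judgment never mentions."""
    text = judgment_text.lower()
    return [aspect for aspect, keywords in REQUIRED_ASPECTS.items()
            if not any(k in text for k in keywords)]

judgment = ("The court weighed freedom of expression against public peace "
            "and found the measure proportionate.")
print(missing_aspects(judgment))  # ['artistic freedom']
```

The output is exactly the kind of flag described above: not a verdict, but a list of aspects the court would then have to address or explain away.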
Relevant answer
Answer
Besnik Sulmataj I think the word "ethics" should be fundamentally removed from the discussion, because a binding code of ethics simply doesn't exist. In Germany, for example, you can now get into serious legal trouble if you refuse to hate certain ethnic groups with a passion and instead stand up for their rights. Depending on the issue, it will probably be no different in other countries.
  • asked a question related to Data Protection
Question
2 answers
Should Big Data Analytics be used more for personalising services or improving cybersecurity systems?
Currently, it is assumed that Big Data Analytics is a key tool for both personalising services and strengthening cybersecurity. The dilemma is which of these areas to invest more resources in and what the consequences of these decisions may be.
Companies and institutions face the challenge of choosing a strategy for using big data analysis. Personalisation allows for the creation of more attractive products and services, which leads to an increase in sales and customer satisfaction. On the other hand, investments in cybersecurity are crucial in the face of the growing number of cyberattacks and threats to users' privacy. The challenge is to find a balance between the benefits of better personalisation and the need to ensure data protection. In a world of growing digital threats, organisations must decide whether to invest more in protection against cyberattacks or rather in the development of tools to better tailor products to customer expectations.
In view of this, two opposing theses can be advanced: that personalising services through Big Data brings greater business benefits than using it in the area of cybersecurity, and that Big Data should be used primarily to improve cybersecurity, as this is a fundamental prerequisite for the development of the digital economy. The optimal approach therefore requires the simultaneous development of both areas, but with a priority depending on the specifics of the industry.
The issue of the role of information, information security, including business information transferred via social media, and the application of Industry 4.0/5.0 technologies to improve systems for the transfer and processing of data and information in social media is described in the following articles:
THE QUESTION OF THE SECURITY OF FACILITATING, COLLECTING AND PROCESSING INFORMATION IN DATA BASES OF SOCIAL NETWORKING
APPLICATION OF DATA BASE SYSTEMS BIG DATA AND BUSINESS INTELLIGENCE SOFTWARE IN INTEGRATED RISK MANAGEMENT IN ORGANISATION
The role of Big Data and Data Science in the context of information security and cybersecurity
Cybersecurity of Business Intelligence Analytics Based on the Processing of Large Sets of Information with the Use of Sentiment Analysis and Big Data
And what is your opinion on this topic?
Please answer,
I invite everyone to the discussion,
Thank you very much,
Best wishes,
I invite you to scientific cooperation,
Dariusz Prokopowicz
Relevant answer
Answer
Analytics have long been used for commercial purposes, where the return on investment is easier to see and justify. Now the ISF is urging business to use the same concepts to secure its networks. “Few organizations currently recognize the benefits for information security, yet many are already using data analytics to support their core business,” says Michael de Crespigny, CEO at ISF. “With the speed and complexity of the threat landscape constantly evolving and the prevalence of combined threats, organizations need to start moving away from being retrospective and reactive to being proactive and preventative.”
The digital age is awash in data. The amount of information organizations collect is growing exponentially, creating a vast data repository with immense potential. This data goldmine holds the key to not only understanding customers but also fortifying security.
Regards,
Shafagat
  • asked a question related to Data Protection
Question
3 answers
Hi,
I have worked with healthcare customers for several years and have experience with the typical data protection and privacy challenges.
By combining IAM with AI/ML models, we can create a robust health security system. Please check this research and let me know your thoughts...
Relevant answer
Answer
Yes, that's an emerging issue that needs addressing. Building ethical AI is as necessary as building new AI models.
  • asked a question related to Data Protection
Question
3 answers
I want to integrate ChatGPT into my website, but I wonder about the risks. I know the benefits of adding it to help me with some processes. Are there any recommendations for the right setup? What would you say are the key priorities for data protection?
All ideas are welcome!
Relevant answer
Answer
Thanks Mfundiso Nongqwenga, "How AI Alters CRM Processes" is my biggest concern. I will review best practices regarding the priorities for keeping the data protected at all times. Thanks again!
  • asked a question related to Data Protection
Question
3 answers
I’m exploring AI-powered tools to analyze both qualitative (text, interviews) and quantitative (statistical, numerical) data. I’m particularly looking for tools that are user-friendly, efficient, and safe for academic research, ensuring data privacy and security. Recommendations for platforms or software that prioritize data protection would be greatly appreciated.
Relevant answer
Answer
For qualitative analysis, ATLAS.ti, MAXQDA, and NVivo now all incorporate generative AI to different degrees, so if you already have access to one of these, it should serve your purposes.
For quantitative analyses, I think contemporary AIs are likely to be less practical, because they are most reliable on patterns they have "seen" before. For example, ChatGPT can fail at complex addition problems that do not resemble anything in its training data.
  • asked a question related to Data Protection
Question
3 answers
Hi everyone,
For a qualitative research study, we are scoping for potential transcription software to transcribe our interviews. We have the following criteria in mind:
  • We are looking for affordable software (maximum cost of $200 for 16 hours/960 minutes of transcripts).
  • Software should be good at transcribing East-African accents in English (specifically, Ugandan accents).
  • Software should have high data protection mechanisms in place. At minimum, it should be compliant with GDPR legislation.
I already came across Otter.ai, Trint, Sonix.ai, and Rev.com. I am wondering if you have used any of this software before and can provide feedback? Other suggestions for software that meets the aforementioned criteria are also welcome.
Thank you in advance for your responses!
Relevant answer
Answer
For transcription software suitable for research in Uganda, consider the following recommendations:
  1. Otter.ai: Offers real-time transcription and is user-friendly, with a free version available.
  2. Rev: Provides accurate transcription services, though it requires payment for human transcriptions.
  3. Sonix: An automated transcription service that supports multiple languages and offers a trial period.
  4. Descript: Combines audio editing with transcription and allows easy editing of audio based on text.
  5. Trint: An AI-powered transcription tool that provides editing features and collaboration options.
These tools can enhance transcription efficiency for research projects.
  • asked a question related to Data Protection
Question
3 answers
I am writing to request assistance in obtaining the contact information of a researcher whose profile is listed on ResearchGate. The researcher in question is Abhinav Gupta.
As part of my ongoing research project on [Review on Design and Analytical Model of Thermoelectric Generator], I am very keen to discuss potential collaboration opportunities and exchange insights with [Abhinav Gupta]. Having access to their contact details would greatly facilitate this process.
I understand the importance of privacy and data protection and assure you that any contact information provided will be used solely for the purpose of academic collaboration and will not be shared with third parties.
Thank you very much for your assistance. I look forward to your positive response.
Relevant answer
Answer
Satayu,
I am not sure which research area you are trying to connect to Abhinav Gupta in. My guess based on your inquiry is Electrical Engineering or Computers and Information Technology. Not here, but I found two possible contacts for your topic:
Abhinav Gupta - Carnegie Mellon University
Abhinav Gupta - Biochemist/Machine Learning
Hope this helps!
  • asked a question related to Data Protection
Question
1 answer
I am conducting research as part of my academic work, focusing on legal awareness in cyberspace.
Throughout this process, I have appreciated the importance of diverse perspectives that enrich the dialogue on these issues. Your expertise and insights would be invaluable in understanding the complexities of this issue from different perspectives.
I’m sharing the link to the survey. I appreciate your consideration and participation in this project. I am available to discuss any questions or provide more information about the research.
Relevant answer
Answer
The complexities of data protection legislation, such as the CCPA in the US and the GDPR in Europe, are critical to understanding when it comes to legal awareness in cyberspace. This entails understanding the rights of individuals with regard to their personal data, the responsibilities of organisations that handle such data, and the possible repercussions of non-compliance. Investigating the ways in which legal systems in various jurisdictions handle data protection issues might yield important information for improving legal awareness in cyberspace.
  • asked a question related to Data Protection
Question
1 answer
What challenges affect adherence to data protection policies?
  • asked a question related to Data Protection
Question
3 answers
What measures should be implemented when utilizing artificial intelligence within institutional educational frameworks to ensure the protection of minors' personal data? Moreover, how should these measures be aligned with broader regulatory standards such as the European General Data Protection Regulation (GDPR)?
Relevant answer
Answer
Here are simplified strategies for implementing AI in education while protecting minors' data:
Limit Data Collection: Only collect necessary data and avoid collecting personal information unless absolutely required.
Obtain Consent: Get permission from parents or guardians before collecting any data from minors.
Anonymise Data: Remove personal identifiers whenever possible to protect students' identities.
Encrypt Data: Use encryption to safeguard data both in transit and storage.
Access Control: Restrict access to data to only authorized individuals who need it for educational purposes.
Regular Checks: Conduct regular checks and assessments to ensure data protection measures are effective.
Transparency: Clearly communicate data protection policies to parents, students, and staff.
By following these simple steps, educational institutions can use AI responsibly while ensuring minors' data remains safe and secure.
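As a rough illustration of the minimisation and anonymisation steps above, here is a minimal Python sketch that pseudonymises a student record before it reaches an analytics pipeline. The field names, key handling, and record shape are all hypothetical:

```python
import hashlib
import hmac

# Illustrative sketch: pseudonymise a student record before it reaches an
# AI analytics pipeline. The secret key would be held by the school's data
# controller, never by the AI vendor. All field names are hypothetical.

SECRET_KEY = b"school-controller-secret"  # in practice: from a key vault

def pseudonymise(record: dict) -> dict:
    """Replace the direct identifier with a keyed hash and drop fields
    the educational purpose does not require (data minimisation)."""
    pseudonym = hmac.new(SECRET_KEY, record["name"].encode(),
                         hashlib.sha256).hexdigest()[:12]
    return {"student_id": pseudonym,
            "grade_level": record["grade_level"],
            "quiz_score": record["quiz_score"]}

record = {"name": "Alice Example", "grade_level": 7,
          "quiz_score": 83, "home_address": "..."}
safe = pseudonymise(record)
assert "name" not in safe and "home_address" not in safe
```

A keyed hash (HMAC) rather than a plain hash means an outsider cannot recover the pseudonym by simply hashing a guessed name; only the key holder can re-link records.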
  • asked a question related to Data Protection
Question
1 answer
Hi Fellow Researchers,
I, Vinden Wylde, would greatly appreciate your participation and/or feedback in an anonymous Data Protection questionnaire. Your insights in particular (like-minded individuals) are of crucial value in validating my data protection framework, "(VDaaS): A Novel SLEPT Data Protection Framework", and ultimately for the successful completion of my PhD project/thesis.
The questionnaire should take approximately 20 minutes, and I understand the value of your time in contributing to this important project.
Benefits:
  • Gain awareness of current data protection practices and upcoming legislation.
  • Provide framework guidance in the Technical Application of Data Protection by Default and Design (GDPR: Article 25).
  • Enable greater customisable control over data transmission and collection.
  • Foster a better overall data protection culture.
Thank you very much in advance for your valuable contributions.
N.B. If there are any other avenues that you deem appropriate for this questionnaire, then please let me know.
Vinden Wylde.
Context/Rationale behind questionnaire: Why is this important?
Regardless of age group, social status, purchasing behaviour, transaction methods (e.g., post-pandemic cashless society trends), intrinsic data protection levels, or external frameworks like the General Data Protection Regulations (GDPR), Digital Footprints (DF) are traceable and globally shared. This often leads to unsolicited approaches, potentially exposing personal information without consent to unwarranted users and adversaries.
For instance, our active DF—manifested through social media, emails, blogs, 'likes,' and published photos—reveal our activities. Conversely, passive DF consists of unintentionally left data recorded by cookies during website visits, including search history and viewed content. This information is frequently leveraged to enhance products and services, serving as a tool to track and analyse the behaviours and demands of online users.
Given the limited control individuals often have over their DF, these digital identities can be exploited for revenue generation without their knowledge. This exploitation encompasses targeted advertising, the use of cookies, and implications for browsing and shopping industries. Moreover, DF visibility by potential employers, schools, creditors, etc., can have enduring effects on reputations, relationships, and employment opportunities.
Relevant answer
Answer
Only two more weeks to go of data gathering 😎 Thank you again to all for your help so far. However, we still need additional participation to reach the necessary sample size and successfully validate the research undertaken to date.
So, if you are working with the technical application of Data Protection (i.e., privacy, security, cybersecurity), governance, ethics, and compliance, then please take part in the anonymous questionnaire (https://lnkd.in/eXU9wmSD). Your contributions are appreciated, and fundamental in providing support to my research/framework development efforts. #GDPR, #iso27001, #privacy, #gdprcompliance, #datagovernance, #datasecurity, #cybersecurity,
  • asked a question related to Data Protection
Question
3 answers
Artificial-intelligence-augmented governance of healthcare raises ethical issues that need attention to ensure data protection.
Relevant answer
Answer
Absolutely, there are ethical and privacy concerns with AI-augmented healthcare.
AI can process and analyze vast amounts of medical data, which is great for diagnosis and treatment. But, it also raises questions about data privacy. Patients' health information must be protected from breaches or misuse.
Then there's the issue of bias in AI algorithms. If the data used to train AI models isn't diverse, it can lead to biased recommendations or diagnoses, which isn't fair or ethical.
So, while AI has incredible potential in healthcare, it's essential to ensure patient privacy, fairness, and ethical use of AI to make it a win-win for everyone.
  • asked a question related to Data Protection
Question
6 answers
Hi,
I'm a master's degree student interested in Access Control Management and Data Protection. Any good topic suggestions and materials will be appreciated.
Relevant answer
Answer
For your master's degree dissertation in cybersecurity, with a focus on Access Control Management and Data Protection, here are some compelling topic ideas: First, consider exploring Advanced Access Control Models in Cloud Computing. This area involves developing or assessing innovative access control mechanisms suited for cloud environments, addressing the unique security challenges of cloud-based data. Another intriguing topic is the application of Blockchain Technology in enhancing secure access control systems. Investigating how blockchain's inherent features can revolutionize access management offers a rich research avenue. Additionally, the integration of Machine Learning for Predictive Access Control presents an opportunity to design systems that adaptively modify access rights based on behavioral analysis and threat prediction. The specific challenges of IoT Security and Access Control also make for an important study, focusing on developing scalable and efficient solutions for the diverse IoT landscape. Lastly, analyzing the interplay between Data Protection Laws and Access Control Compliance could provide valuable insights into aligning organizational policies with legal requirements while safeguarding sensitive data. Each of these topics not only aligns with your interest but also contributes significantly to the evolving field of cybersecurity.
  • asked a question related to Data Protection
Question
3 answers
Here are some key points for a discussion forum
1. Data Protection and Privacy
2. Preventing Unauthorized Access
3. Protection Against Cyber Attacks
Relevant answer
Answer
In the realm of intelligent transportation systems (ITS), cybersecurity plays a crucial role, primarily due to the extensive reliance on technology for traffic management, vehicle communication, and data handling.
Key discussion points include Data Protection and Privacy, where the focus lies on safeguarding the vast amount of personal data collected by ITS. Techniques like encryption and anonymization are essential, alongside ensuring compliance with data protection laws like GDPR.
Preventing Unauthorized Access is another vital aspect, emphasizing the need for robust authentication protocols. It also explores the potential of blockchain for secure, decentralized communication in ITS networks.
The third significant point, Protection Against Cyber Attacks, delves into ITS's susceptibility to various cyber threats such as hacking and malware. This involves deploying advanced intrusion detection systems, conducting regular security audits, and establishing a rapid response framework to effectively mitigate such threats.
Overall, this discussion underscores the evolving nature of cyber threats and the imperative for continuous advancement in cybersecurity strategies to safeguard these essential systems.
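As a concrete, if simplistic, illustration of the anonymisation point above, the sketch below coarsens GPS points from a hypothetical ITS trace by rounding. Real deployments would use stronger techniques such as k-anonymity or differential privacy; this only shows the generalisation idea:

```python
# Illustrative sketch: generalise GPS points collected by an intelligent
# transportation system before analysis, so individual trips are harder to
# re-identify. Rounding to 2 decimal places (roughly 1 km) is a simplistic
# stand-in for proper k-anonymity or differential-privacy techniques.

def generalise_point(lat: float, lon: float, places: int = 2) -> tuple:
    """Round coordinates to a coarser grid cell."""
    return (round(lat, places), round(lon, places))

trace = [(52.52437, 13.41053), (52.53122, 13.40891)]  # made-up trip
coarse = [generalise_point(lat, lon) for lat, lon in trace]
print(coarse)  # [(52.52, 13.41), (52.53, 13.41)]
```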
  • asked a question related to Data Protection
Question
1 answer
The Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011, fall under the Information Technology Act, 2000, and govern the collection, use, storage, and sharing of sensitive personal data or information (SPDI) in India. These rules protect individuals' privacy and regulate how organizations handle sensitive personal information.
According to the rules, any unauthorized sharing, disclosure, or misuse of sensitive personal data or information can have legal consequences. The punishment for sharing clinical data, private data, or personal details in violation of the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011, can include:
  1. Civil Liabilities: Individuals or organizations found guilty of violating the rules may be subject to civil liabilities, including payment of damages or compensation to the affected parties.
  2. Criminal Liabilities: In serious breaches, criminal liabilities may be imposed on those responsible. This can include imprisonment or fines, depending on the severity of the offence.
It's important to note that penalties and consequences may vary depending on the nature and extent of the data breach and any applicable laws or regulations related to data protection and privacy.
To ensure compliance with the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011, organizations and individuals handling sensitive personal data or clinical information should implement adequate security measures, obtain explicit consent from data subjects for data sharing, and adhere to the principles of data protection and confidentiality.
Relevant answer
Answer
In this regard, the IT Act also prescribes criminal penalties that include both imprisonment of up to three years and fines for persons that disclose personal information without the consent of the person to whom the data relates, where such disclosure is in breach of a contract or results in wrongful loss or gain.
  • asked a question related to Data Protection
Question
3 answers
How do I report the authors of a journal paper? I have been communicating with them for over 3 months to be granted access to the dataset used in their published paper. I sent them my CV and proposal report, filled in several forms, and signed agreements with them, only to get this message at the end.
Dear XXXXX,
Thank you for your interest in using data from our institution.
Unfortunately due to data protection issues the data cannot be shared.
We apologize for this inconvenience and wish you all the best with your research/publication.
It was stated in their paper that the corresponding author should be contacted for dataset availability
Relevant answer
I know it's hard. I would not share my data either. I think you should forget about these people and produce your own data, but copyright it 100% :)
  • asked a question related to Data Protection
Question
3 answers
Is there any platform that can be secured and compliant with relevant privacy laws and regulations, such as the General Data Protection Regulation (GDPR) or the California Consumer Privacy Act (CCPA)?
Relevant answer
Answer
Another non-traditional approach would be the use of blockchain technology, but it would not qualify under the standards you mentioned. It could be an elegant solution to the problem, though.
  • asked a question related to Data Protection
Question
2 answers
The impact of digital health technologies on the healthcare system is becoming more relevant by the day, and their adoption is making patient care more easily accessible.
Introducing deep learning into the security of EHR data is one of the areas being looked into, to ascertain better protection of this sensitive, personally identifiable data.
Your input will help a lot to ensure great achievement of this project.
Relevant answer
Answer
Is there a specific view of security you are thinking of, or is this just a wide and general question?
Thinking of security with respect to network/cyber attacks, Electronic Health Records simply inherit the same deep learning techniques that can be applied to any networked systems/applications (perhaps with some different "value" calculations as to the monetary expression of the damage done in such an attack).
If this question is about privacy concerns and data use (who has access to, and who uses, your data), then mostly the access controls and logging of data access are within the EHR applications themselves (EHR, personal health record, or any system/application that looks after health information).
Also, in some networked environments with multiple actors, there are audit logs that can be inspected (the IHE ATNA profile being an example). I know of a few real-world examples (but with no academic publications) where such logs were inspected and the evidence of unauthorised access (or misuse of authorised access) was proven. Since this activity is looking for patterns in a log file (or a set of log files with some correlation), it looks to me exactly like the kind of problem that deep learning could be applied to. Note that "who" is looking at "what" data implies you need to know the identities of the "who" in a real-world context that would confirm or deny that they are allowed to look at whatever it is.
In economies based on private healthcare, your insurance premiums are based on your declared health. Health insurance companies may use deep learning tools in the analysis of their customers' health records, but this is outside my experience, and it is of course linked to the legality of doing so (which I am sure varies from country to country).
Effectively, though, this kind of deep learning would look like a specialisation of the kind of deep learning applied to fraud (specialised for healthcare insurance premiums and declarations).
I am sure there are other areas. Computer architectures that result in multiple copies of healthcare data items and records lead to ambiguity in records: where is the source of truth? You can see how, in the above fraud example, any ambiguity might lead to loopholes for fraudulent activity, or put a consumer in a position where they find it difficult to prove their health records. I am not sure if deep learning would be applicable here, when inspection of the architecture would tell you about this potential-duplicates problem.
Your question seems quite wide; I hope my contribution is useful for you and that I haven't missed the point of your question.
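Before reaching for deep learning on such audit logs, a simple rule-based baseline already illustrates the "who is looking at what" check described above. Everything here, the log format and the care-team mapping, is a made-up example:

```python
# Illustrative baseline, not deep learning: flag audit-log entries where a
# staff member accesses the record of a patient who is not under their care.
# The log format and the care-team mapping are hypothetical.

care_team = {
    "patient-17": {"dr.lee", "nurse.ali"},
    "patient-42": {"dr.rahman"},
}

audit_log = [
    {"user": "dr.lee", "patient": "patient-17"},
    {"user": "clerk.sam", "patient": "patient-42"},   # not on the care team
    {"user": "dr.rahman", "patient": "patient-42"},
]

# Keep every access by someone outside the patient's care team.
suspicious = [entry for entry in audit_log
              if entry["user"] not in care_team.get(entry["patient"], set())]
print(suspicious)  # [{'user': 'clerk.sam', 'patient': 'patient-42'}]
```

A learned model would only earn its keep where the "allowed" relationship is too fuzzy to encode as rules, e.g. on-call cover or emergency break-glass access.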
  • asked a question related to Data Protection
Question
9 answers
As an important factor of production, data is actively protected by criminal law in most countries, such as Germany and the United States.
But where is the boundary of criminal law protection?
How can protection be achieved for the entire data life cycle (fetch, process, expose, store, and destroy)?
What are the differences between criminal law obligations undertaken by different subjects (platforms, countries and individuals)?
please give me your answer
Relevant answer
Answer
Laws will definitely govern the life cycle, as you put it, of any data collected. Needless to say, different jurisdictions/countries will have their own respective laws on the matter. One concept of relatively recent development in "destroying" data collected is the "right to be forgotten."
  • asked a question related to Data Protection
Question
3 answers
I obtained a dataset from an official body. The data is tabulated, so the cells contain actual case numbers. When a number is lower than three but higher than zero, it is intentionally removed for data protection. I wanted to run an imputation model in SPSS to impute those numbers. However, the model imputes values lower than zero (even minus 56). When I set options such as one for the lowest value and two for the highest value, the process ends with empty cells. Does anyone recommend anything to overcome this obstacle?
Kind regards.
Relevant answer
Answer
@Erdem Erkoyun, you could also use the median as a replacement for the missing numbers and rerun the model.
This link maybe useful
"Hands-on with Feature Engineering Techniques: Imputing Missing Values | by Younes Charfaoui | Heartbeat" https://heartbeat.comet.ml/hands-on-with-feature-engineering-techniques-imputing-missing-values-6c22b49d4060
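Since the suppressed cells are known to lie strictly between zero and three, an alternative to a general imputation model is to draw each missing value directly from {1, 2}. A minimal sketch outside SPSS (in Python, with made-up counts standing in for the real column):

```python
import random

# Illustrative sketch: suppressed counts are known to be 1 or 2, so impute
# by drawing from exactly that range instead of an unbounded model.
# The counts below are made-up; None marks a suppressed cell.

random.seed(42)  # reproducible draws

def impute_suppressed(column):
    """Replace suppressed (None) counts with a random draw from {1, 2}."""
    return [random.choice([1, 2]) if v is None else v for v in column]

cases = [12, None, 7, None, 0, 25]
imputed = impute_suppressed(cases)
# Every imputed value is guaranteed to respect the known bounds.
assert all(v in (1, 2) for v, orig in zip(imputed, cases) if orig is None)
```

If the relative frequency of 1s versus 2s can be estimated from unsuppressed strata, the draw could be weighted accordingly instead of uniform.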
  • asked a question related to Data Protection
Question
5 answers
My name is Laura Lomax and I am an undergraduate student at the University of Bolton completing my 3rd-year project, under the supervision of Professor Jerome Carson. My study is examining whether adverse childhood experiences (ACES), affect an individual’s level of flourishing in adulthood.
Your participation should take approximately 10-15 minutes and will require minimal demographic information such as age, gender, and country. There are two questionnaires to complete, with eight short questions at the end. The first questionnaire consists of 10 questions regarding Adverse Childhood Experiences and the second questionnaire has a total of 23 questions that all relate to one’s flourishing.
Participation in the study is voluntary and you are free to withdraw from the study at any time before the last question. Completing the survey constitutes your consent to participate. Once you have completed the questionnaire and pressed ‘submit’, you will no longer be able to withdraw from the study. Please note that all responses to this study are completely anonymous and your identity will remain unknown throughout. Once submitted, all data will be stored in line with the General Data Protection Regulation. The only people with access to the results are the researcher and her supervisor.
This study has been approved by the Psychology Department’s ethics committee, which adheres to the British Psychological Society’s guidelines. It is not my intention to cause you any psychological distress; however, some questions are of a sensitive nature and you may find them distressing. If this is the case, or if you are interested in receiving support with any issues raised by this study, please contact the following helplines:
Samaritans (UK & Ireland) – Call 116 123
CALM – 0800 58 58 58
Thank you for taking the time to complete the study. Should you feel you require any additional information concerning this study either before completing it or afterwards, please do not hesitate to contact me via email at ll6eps@bolton.ac.uk My supervisor’s details are J.Carson@bolton.ac.uk
Warm Regards
Laura
Relevant answer
Answer
Hi, I am looking for participants for the study, if you are able to take part? x
  • asked a question related to Data Protection
Question
6 answers
The EU General Data Protection Regulation (GDPR) is among the world’s toughest data protection laws. Under the GDPR, the EU’s data protection authorities can impose fines of up to €20 million (roughly $23.7 million), or 4 percent of worldwide turnover for the preceding financial year, whichever is higher.
Since the GDPR took effect in May 2018, we’ve seen over 800 fines issued across the European Economic Area (EEA) and the U.K. Enforcement started off somewhat slow. But between July 18, 2020, and July 18, 2021, there was a significant increase in the size and quantity of fines, with total penalties surging by around 113.5%. And that was before the record-breaking fine against Amazon—announced by the company in its July 30 earnings report—which dwarfed the cumulative total of all GDPR fines up until that date.
Top 10 fines so far:
  1. Amazon – €746 million
  2. Google – €50 million
  3. H&M – €35 million
  4. TIM – €27.8 million
  5. British Airways – €22 million
  6. Marriott – €20.4 million
  7. Wind – €17 million
  8. Vodafone Italia – €12.3 million
  9. Notebooksbilliger.de – €10.4 million
  10. Eni – €8.5 million
Relevant answer
Answer
Dear Mr. Sekulovic!
You pointed to an important issue. There might be a need for studies depicting the context and impact of this regulation package:
1) Karen Yeung, Lee A. Bygrave (2021). Demystifying the modernized European data protection regime: Cross-disciplinary insights from legal and regulatory governance scholarship, Regulation & Governance Early View, 04 May 2021, Open access:
2) Hallinan D. (2021) Biobank Oversight and Sanctions Under the General Data Protection Regulation. In: Slokenberga S., Tzortzatou O., Reichel J. (eds) GDPR and Biobanking. Law, Governance and Technology Series, vol 43. Springer, Cham. https://doi.org/10.1007/978-3-030-49388-2_8 Available at:
3) Ilse Heine (2021). 3 Years Later: An Analysis of GDPR Enforcement, Center for Strategic & International Studies, Sept., 13, 2021, Free access: https://www.csis.org/blogs/strategic-technologies-blog/3-years-later-analysis-gdpr-enforcement
Yours sincerely, Bulcsu Szekely
  • asked a question related to Data Protection
Question
3 answers
Hello. I am conducting a psychological research study with participants in the UK, and I want to know what privacy and data protection rules there are about recording participants' microphone audio and camera video for research purposes. We have a video-conferencing simulation with a virtual human (recorded actor) and would like to record short clips of participants' audio and video during the video interaction. Is this possible, and, if so, what types of informed consent and data protection need to be provided?
Thank you so much.
Relevant answer
Answer
Your question is very interesting, thank you for taking me into account
  • asked a question related to Data Protection
Question
1 answer
Given the privacy and data protection concerns, is there a good place to get anonymous runners' training data for their previous marathons for academic research purposes? What are the proper ways to ask companies such as Garmin, Strava, etc. for a dataset for research purposes?
Relevant answer
Answer
As a runner, user of Garmin and Strava, and someone who has dabbled in the analytics space, I think that this is a really interesting question. First, Kaggle has a number of data sets on marathon prediction and some kernels that attempt to solve that problem. You can find them with this query:
Second, getting data from Strava would be more challenging. While they have an API you can use with athletes' permission, described here:
their ToS prohibits doing analytics on their data:
I suggest asking their developer forum to see how to get permissions for doing analytics.
Third and last, I found this fitness database called fitabase that is de-identified. Never used it, but thought I would mention it is there:
  • asked a question related to Data Protection
Question
3 answers
Data Protection and Privacy- Comparative perspective
Relevant answer
Answer
Send details to my inbox.
  • asked a question related to Data Protection
Question
5 answers
Even though the idea of a "pay with money not with data" principle is not new in the literature, so far this idea has not translated into domestic legislation. Why? Are there hurdles or obstacles to it? If so, of what nature (legal, political, financial)? If such a principle were to be implemented, what should be the features of a corresponding legal provision under domestic law? Regarding the international regulatory framework of data protection, you are welcome to look at my latest research on the matter at:
Relevant answer
Answer
Interesting question for sure. There is a semantic difficulty, since "paying" somewhat presupposes a financial transaction with money. While data is certainly something that is worth money, turning it into money requires quite a lot of intermediate steps.
Any basic website automatically collects a lot of data (connection logs, information on browser and location, sessions...). Distinguishing data collected as payment from data collected not as payment might not be very easy (you would need to carefully examine the use of the data at many steps).
  • asked a question related to Data Protection
Question
3 answers
The compatibility of DLT-based applications with the GDPR has been reviewed in the past years, but the conclusions were in general not very sharp. Often, scholars underscored the fact that compatibility or the lack thereof can only be assessed on a case-by-case basis. This is at least the conclusion I drew in my article on the matter, available at:
Yet, I wonder if, with the recent developments in technology and applications, and with the better understanding of how the GDPR is implemented, the time has come for a renewed assessment of the relationship between the two. Are there ways to make DLT applications a priori GDPR-compatible? If so, how? Or, to the contrary, are DLTs a priori not meeting the GDPR requirements? And if so, why, and what should be fixed when it comes to concrete use cases?
Many thanks for a lively discussion.
Christian Pauletto
Relevant answer
Answer
It's definitely tricky, as you have two opposing principles:
1) What happens on the blockchain, stays on the blockchain
2) The GDPR "right to be forgotten"
This implies that you can certainly never place any form of personal information on a blockchain, but only links to or hashes of such information. However, this in turn breaks another principle, namely that you should never sign something you don't know the content of. What happens if the information that is pointed to changes? Is it possible to perform a birthday attack on the hash by preparing two different messages with the same hash?
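A small, hypothetical Python sketch of the hash-pointer idea discussed above: a salted SHA-256 commitment to off-chain data. It illustrates both why any change in the pointed-to information is immediately detectable and why a 256-bit hash makes the birthday attack impractical (roughly 2**128 work to find any collision):

```python
import hashlib
import os

def commit(data, salt=None):
    """Return (salt, hexdigest): a salted SHA-256 commitment to off-chain data.

    The salt stops an observer from brute-forcing low-entropy personal
    data (names, birthdays) by hashing guesses; SHA-256's 256-bit output
    makes finding two messages with the same hash infeasible in practice.
    """
    salt = salt if salt is not None else os.urandom(16)
    return salt, hashlib.sha256(salt + data).hexdigest()

salt, digest = commit(b"Jane Doe, 1980-05-01")

# Verification succeeds only while the off-chain record is unchanged;
# if the pointed-to information changes, the on-chain digest no longer matches.
assert commit(b"Jane Doe, 1980-05-01", salt)[1] == digest
assert commit(b"Jane Doe, 1980-06-01", salt)[1] != digest
```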
  • asked a question related to Data Protection
Question
4 answers
Hello everyone, right now I am working on the very initial phase of a dissertation proposal revolving around Articles 16 and 17 of the EU GDPR, and the compliance of a blockchain model with them.
Relevant answer
Answer
Whether the blockchain is permissioned or not does not make any difference regarding Art. 16 and 17 GDPR. As long as the blockchain has the standard property that you cannot modify any data once they have been included in the blockchain, you can by definition not satisfy the requirements regarding rectification or erasure.
A workaround is to store the personal data outside the blockchain, so that the blockchain only contains a link to the personal data. Then you cannot modify the link, but you can modify the data itself, since they are outside the blockchain.
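The workaround described above can be sketched as follows (a hypothetical `OffChainStore` class, with a plain list standing in for the append-only ledger). Erasure under Art. 17 then means deleting the off-chain record, leaving only an orphaned digest on chain:

```python
import hashlib

class OffChainStore:
    """Mutable store holding the actual personal data (erasable)."""
    def __init__(self):
        self._records = {}

    def put(self, record):
        key = hashlib.sha256(record).hexdigest()
        self._records[key] = record
        return key          # only this digest goes on the immutable chain

    def get(self, key):
        return self._records.get(key)

    def erase(self, key):
        # Art. 17 "right to erasure": deleting here orphans the
        # on-chain digest, which by itself reveals nothing.
        self._records.pop(key, None)

chain = []                  # stand-in for an append-only ledger
store = OffChainStore()

key = store.put(b"name=Jane Doe; email=jane@example.com")
chain.append(key)           # immutable reference recorded on chain

store.erase(key)            # data subject requests erasure
assert store.get(key) is None   # personal data gone
assert chain == [key]           # ledger unchanged, only the digest remains
```

Note that a real design would store a salted digest: an unsalted hash of low-entropy personal data can itself be brute-forced by hashing guesses, and so arguably remains personal data.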
  • asked a question related to Data Protection
Question
16 answers
Over 10000 people from 84 countries have already participated. Survey available in 24+ languages. In order to join, please visit: https://www.yashchawla.in/corona-virus
Description: Researchers, policymakers and societies alike are increasingly discussing the New Corona Virus (COVID-19) situation, globally and in their respective countries of residence. The European Commission, national governments and private foundations invest large amounts of money for research, focusing on finding the potential cure for the virus.
In line with these developments, our international research consortium is conducting a research survey to better understand public awareness and opinions of COVID-19, and the role of various communication channels in the propagation of myths and facts. We invite you to participate in this anonymous survey, as learning about your opinions will help us provide recommendations to both research institutions and policymakers regarding the process of effective communication with society. The results of this survey will be published in international academic outlets, with no identifiers of individual respondents. The data collection procedures are in line with the General Data Protection Regulation (GDPR).
Relevant answer
Answer
I have sent the survey to some colleagues and encouraged their participation.
On another note my work has focused on global citizenship education and competence, as well as diversity, equity, and inclusion. I find that this Covid-19 issue has highlighted the continued need to develop Global Citizenship Education program, as well as global citizenship competencies in today's students. There have been numerous examples from around the world (and so many in the US) of people ignoring the well being of their families and communities in order to go to Spring Break, go to a ski resort, or have 1 last party before leaving campus. In each of these cases participants got sick and brought this home to their loved ones and their community. This has led to increases in the spread of the virus, which has directly led to an increase in the number of deaths.
If anything, this pandemic has shown that unchecked globalization, without the global citizenship competencies needed to balance it, is dangerous to us all; a silver lining would be an increase in the promotion of these competencies across higher education institutions.
I would like to discuss this with you and see if we can collaborate on some future research. I will email you.
  • asked a question related to Data Protection
Question
8 answers
A need for Data Protection Officers is emerging very fast. After the adoption of the GDPR, organizations worldwide need hundreds of thousands of DPOs. Are universities ready? Are there enough data privacy programs/courses that put together information security and law?
Relevant answer
Answer
Agree with Ralf's views on this. Universities can look at industry linked programs in Risk and Compliance space and privacy can be covered under that.
  • asked a question related to Data Protection
Question
12 answers
Do you think the Data Protection Officer should be a lawyer or an infosec expert? Since it is very hard to get two in one in a single person, do you think the DPO should be a team of at least two people?
GDPR says:
The data protection officer shall be designated on the basis of professional qualities and, in particular, expert knowledge of data protection law and practices and the ability to fulfil the tasks referred to in Article 39.
Article 39
Tasks of the data protection officer
1.   The data protection officer shall have at least the following tasks:
(a) to inform and advise the controller or the processor and the employees who carry out processing of their obligations pursuant to this Regulation and to other Union or Member State data protection provisions;
(b) to monitor compliance with this Regulation, with other Union or Member State data protection provisions and with the policies of the controller or processor in relation to the protection of personal data, including the assignment of responsibilities, awareness-raising and training of staff involved in processing operations, and the related audits;
(c) to provide advice where requested as regards the data protection impact assessment and monitor its performance pursuant to Article 35;
(d) to cooperate with the supervisory authority;
(e) to act as the contact point for the supervisory authority on issues relating to processing, including the prior consultation referred to in Article 36, and to consult, where appropriate, with regard to any other matter.
2.   The data protection officer shall in the performance of his or her tasks have due regard to the risk associated with processing operations, taking into account the nature, scope, context and purposes of processing.
Relevant answer
Answer
Thank you very much for a comprehensive and helpful answer! I myself am an Information Security Manager, and also an Operational Risk Manager in my institution and I am pretty much familiar with your experiences. So I think, definitely, a lawyer and an infosec manager should make a DPO team. Lawyers know the legislation and infosec managers know standards and data protection side of the story. It's like two sides of the same coin, both necessary for a good privacy management.
Dear Mr.
Syed Hassan
Thank you very much for good wishes! All the best to you too in the coming year!
Best Regards,
Rajko Sekulović
  • asked a question related to Data Protection
Question
3 answers
I am looking for case studies of actual privacy risks. At the core of privacy and data protection impact assessments, we find the concept of 'risk' meaning - in this case - the probability of a threat to personal data and the possible harm or damage caused by this threat. E.g. I fall victim to a phishing attack and the attacker gains access to my bank account, the actual harm being that my account is emptied. Another example would be that my account at a social media platform is hacked and my identity is used to "go shopping".
Now, one finds a lot of literature on privacy impact assessments (PIA) and data protection impact assessments (e.g. the edited volume by Wright and De Hert (2012) on PIA), on the potential risks of low levels of data security (e.g. Rosner, Kenneally (2018): Clearly Opaque: Privacy Risks of the Internet of Things), on technological and organizational standards (e.g. ISO 27001 on information security management), and on the regulatory frameworks of privacy and data protection (e.g. everything on the details of the GDPR in the EU). But I have a hard time finding research results evaluating actual risks, similar to your risk of falling victim to a traffic accident, having your home broken into, or getting cancer.
I would welcome any hint to empirical publications on actual privacy risk analysis be it from medical, social, internet-based or any other research that you consider as most important. I am *not* looking for literature on how to conduct privacy and data protection impact assessments or standards for this purpose. Thank you.
Relevant answer
Answer
This is a great question, and it inspired me to look for some quantification of the risk and probability of data breaches and harm. I found the following reports, which may be of interest. They are largely from security companies and insurance companies, which would have access to this kind of data and might need it to set insurance policies.
  • asked a question related to Data Protection
Question
8 answers
Has the GDPR effectuated the application and enforcement of data protection controls in non-European countries? How has it impacted your own country's legal system (if applicable)?
Relevant answer
Answer
Japan changed its legislation in order to get an adequacy decision from the European Commission, and South Korea is on the way to doing the same. There is a clear convergence trend among countries that aspire to a free flow of personal data with the EU. In my opinion, the EC rushed the Japanese decision, which should have set an example as it was the first one after the GDPR became applicable; but the EU wanted to adopt it in parallel with the EU/Japan trade agreement, so it could not wait any longer. I am finishing a paper on the topic; if you want to check the draft, you can send me a message.
  • asked a question related to Data Protection
Question
3 answers
Hi,
I am participating in a scientific study. Our general aim is to collect financial information about the amount of the funds that natural protected areas are using (and a breakdown of the spending of those funds), and what is the estimated budget needed for the PA system to achieve its objectives (the ideal budget). At the country level, or broken down in individual protected areas. 
Apart from that, we have more specific questions, like how much would it cost to create new protected area, and how much is the positive economic value of the protected area, taking into account ecosystem services and visitor-based income. 
If you were so kind as to help me, I would be very grateful. Many thanks and best regards
Relevant answer
Answer
Hard to find the data on the Internet.
Ecosystem service approach is still not used properly, but hopefully some rough estimates exist. Perhaps Robertina Brajanoska can help.
Visitors don't pay for entrance yet, so that income is 0.
  • asked a question related to Data Protection
Question
11 answers
The General Data Protection Regulation (GDPR) has been in force since May 2018 and thus for almost a year. Do you know of cases in which fines were imposed for violating the requirements of the GDPR? How high were these fines and which companies were affected? Background of the question are the fears of the fines at that time (up to 20 million Euro or in the case of a company up to 4% of the total worldwide annual turnover of the previous business year). To what extent were these fears justified?
Relevant answer
Answer
The report is in German but it covers other EU countries as well.
  • asked a question related to Data Protection
Question
4 answers
Implementation of the GDPR will reduce corporations' access to personal data in several respects. The effects will have several pros and cons for the economy and business. Which prospects will the GDPR affect?
Relevant answer
Answer
Please read my papers below for my take on the GDPR's impact on non-EU businesses, in particular on the controversial topics of extraterritorial applicability and cross-border data transfer:
  • asked a question related to Data Protection
Question
10 answers
I was wondering what the feelings were (especially for those in the EU) about the new General Data Protection Regulation (GDPR) that comes online in 4 months...
Relevant answer
Answer
I have set out some analyses of the GDPR from the perspective of non-EU businesses, in the following research pieces, feel free to have a read:
  • asked a question related to Data Protection
Question
19 answers
In my opinion, information contained in posts and comments on social media portals can be used as research material for scientific research, provided that appropriate standards of ethics and personal data protection are maintained.
In view of the above, I am asking you: can the information contained in posts and comments on social media portals be used for scientific research?
Please reply.
I invite you to the discussion
Thank you very much
Best wishes
Dear Friends and Colleagues of RG
The problems of the analysis of information contained on social media portals for marketing purposes are described in the publication:
I invite you to discussion and cooperation.
Best wishes
Relevant answer
Answer
For me, a part of this kind of information can be used for scientific research, under the context of the question. Of course, scrutiny is necessary.
Regards
  • asked a question related to Data Protection
Question
8 answers
The ongoing saga of improper access to personal medical data continues despite the multitude of computing standards for data protection.
Could Blockchain ensure the security of patient data while allowing access to appropriate healthcare staff?
Relevant answer
Answer
The answer is yes!
  • asked a question related to Data Protection
Question
3 answers
Dear colleagues,
I will be really thankful if you share your experience, research, or pre-prints on the impact of the GDPR on planning and executing surveys. How do you achieve consent for data protection? What is the respondents' reaction? What is the response rate after the GDPR? We expect some decrease, for now more visible in online surveys than in face-to-face ones.
Relevant answer
Answer
Dear Ekaterina, I enclose a copy of the Baker McKenzie report titled, GDPR National Legislation Survey issued in January 2018 and it covers a number of your queries.
  • asked a question related to Data Protection
Question
3 answers
The GDPR seems to be more of a protectionist initiative for large and rich publishers. They say that "the GDPR improves transparency and data privacy rights of individuals", but it seems to be an initiative that restricts science and reduces access to information. But is it? Please share your opinion.
Relevant answer
Answer
The GDPR can be considered in many aspects.
For small organizations, it will involve many new responsibilities. It will also affect those who are honest and hold high standards, as there will be a need to document compliance with those standards, whereas until now it has been sufficient simply to comply with them.
Larger organizations are likely to feel it less, because they are more formalized and bureaucratic anyway.
There will be fear of penalties, which may limit some activities. While most probably agree that abuses and uncontrolled trade in personal data should be limited, the problem of borderline actions, taken in good faith, will also appear; these, however, can also be interpreted as abuse and transgression.
Will this improve the protection of the right to privacy? It really depends on people and their awareness. The last affair with Facebook showed how easily private data can be used, but on the other hand, people should be able to anticipate such a situation when sharing their data on the Internet. No regulation can replace reason and caution.
Will it affect research and the flow of information? I do not think so, not much. If these regulations had been in force many years ago, today we would probably still know that the AIDS patient traveled a lot and was homosexual, because these facts influenced the way the disease spread and whom it affected in the first place; we simply would not know the name of this patient, which does not matter for understanding the mechanism of disease spread.
I work in data recovery. This is a very sensitive area when it comes to confidentiality and data security. If someone entrusts me with their medium, they expect that the data I recover will not be disclosed to anyone else, be it a jealous wife or the police, even if I suspect the data may be evidence of a crime. On the other hand, if the client is, for example, the police, it is not my role to protect someone's intimate secrets concerning legal but very personal matters.
Usually I do not analyze the contents of the media unless I am explicitly asked to, so I do not even know about many ethically doubtful situations. However, over the years I have several times seen something that gave me serious doubts as to how to proceed. Loyalty to the client always prevailed. The client is the owner of the data and is responsible for its use.
  • asked a question related to Data Protection
Question
5 answers
I am currently working on a catalogue of good practices in the sphere of privacy and data protection learning and this information will help me decide if to include your project in the catalogue.
Thank you in advance,
Dilyana
Relevant answer
Answer
Thank you very much and all the best with the project,
Dilyana
  • asked a question related to Data Protection
Question
11 answers
The use of big data to target groups with advertising is a mild form of manipulation, but considering all the things it could be used for, good and bad, how can a set of ethics be maintained until there is some way of policing the internet? There is a very fine line between using big data to protect people and using it to control people.
Relevant answer
Answer
  • Almost fifty years ago, it was clearly acknowledged by leading authorities of that time (1967) that data and publications had already got out of hand -- "surfeit-data syndrome".
  • Today even the accomplished specialist cannot claim to be remaining abreast of all developments in her/his field.
  • Big data, if applied to science and scientific discovery, will lead to a catastrophic nightmare of confusion and abolition of straight thinking. This is not just a warning, it is a guarantee from a person at the frontier of medical research for almost three decades now.
  • People will then think through numbers and statistics.
  • asked a question related to Data Protection
Question
3 answers
I am wondering, particularly in light of the new General Data Protection Regulation coming into force across Europe shortly. We found data protection presented a significant challenge in our long-term cohort follow-up study.
Relevant answer
Answer
This is a really good question, and the general issue has recently come up in my work. I looked around for studies and found a couple of analyses of its effects. They seem to point out that there is an exception for "research," but the definition of "research" can vary by EU member state. Anonymization is said to help with the restriction, but pseudonymization, where data can be re-identified, has particular issues. Here are links I saw that may be helpful:
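The anonymization/pseudonymization distinction mentioned above can be illustrated with a keyed hash (hypothetical key and identifiers). Because whoever holds the key can always re-link records, pseudonymized data remains personal data under the GDPR, unlike truly anonymized data:

```python
import hashlib
import hmac

SECRET_KEY = b"held-by-the-controller-only"   # hypothetical key

def pseudonymize(identifier):
    """Keyed hash (HMAC-SHA256): a stable pseudonym for an identifier.

    The same subject always maps to the same pseudonym, so records stay
    linkable for research; without SECRET_KEY the pseudonym cannot be
    brute-forced back to the identifier, but the key holder can always
    re-identify, which is why the GDPR still applies to such data.
    """
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

p1 = pseudonymize("patient-00123")
p2 = pseudonymize("patient-00123")
assert p1 == p2          # same subject -> same pseudonym (linkable data)
```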
  • asked a question related to Data Protection
Question
7 answers
Today, in the UK, the Department for Digital, Culture, Media and Sport (DCMS) announced that organisations could face a fine of 4% of global turnover or £17 million for the failure of critical infrastructure, including within energy, water, transport, and health. Overall it is part of the UK's response to an EU directive on Network and Information Systems (NIS), and levies the same levels of fines as the GDPR (which focuses on data protection).
This also comes on the back of recent power-related outages at BA and Capita, which led to serious problems with their systems. A key focus is that organisations will be required to prove that they have a disaster recovery plan in place, and have plans to enact it on serious incidents.
But, will fines actually improve things or will auditors and the legal industry be rubbing their hands with the increasing fees for the work?
Relevant answer
Answer
 I'm tempted to say "let's wait and see", but I suspect that the GDPR may end up being just so much smoke - everyone is currently in a frenzy, worrying about the 4% of global turnover fines, but will they actually ever be levied? Or will we just see the customary wrist-slapping from the respective DPAs?
  • asked a question related to Data Protection
Question
3 answers
When working with collaborators in Switzerland it became obvious that the government shot itself in the foot when regulating the use and transfer of patient data. Even some of the simple anonymized statistical data sets require permission from the ethics committee.
These country-specific regulations also serve as a barrier to entry when asking someone for a data set from a published paper in order to replicate their experiments to see if these experiments are reproducible. 
In your experience, what are the best countries (including Asia), where data transactions for research purposes are not regulated and most fluid? 
Relevant answer
Answer
Here is an article discussing the privacy laws between the EU and the US: 
As for the determination of the best country, it will depend on your preset determining factors.
  • asked a question related to Data Protection
Question
3 answers
I am doing research on electronic solutions to substitute the Excel/Word checklist to facilitate the Inspectors' tasks, and then a possible second phase to support the centres' preparations. Do members have any suggestions of solutions that you have seen that could work as an alternative to the current Excel file? One requirement is that the solution be hosted in the EU to meet data protection requirements.
Relevant answer
Answer
Very kind of you Bob. I will read it with interest.
Thank you.
Eoin
  • asked a question related to Data Protection
Question
11 answers
Hello! I am doing a Master of Laws in Criminal Procedure in São Paulo (Brazil) and my dissertation involves data protection with regard to its sensitivity aspect. The main objective of my dissertation is to study the classification of data, mainly the sensitive category, i.e., when this category came up, in which context it came up, how sensitive data can be defined, which parameters could be used to conclude that data are sensitive, etc. Does anyone have/know any article/book related to this subject? Thanks
Relevant answer
Answer
Dear Arthur, you might consider what he says about his profession: everything should be monetized.
The sensitivity of a document is then directly linked to the value it has, or to the harm it would cause if disclosed.
In my profile you will find many articles on privacy, data loss and data leakage.
I am in Campinas if you need me.
  • asked a question related to Data Protection
Question
2 answers
Relevant answer
Answer
I assume you are referring to web app security assessment.
  • asked a question related to Data Protection
Question
8 answers
Hello people,
How can I evaluate security in IoT?
Any suggestion, resource or comment will help.
Thank you! 
Relevant answer
Answer
The evaluation of any security device is in fact an evaluation of its security algorithms and the security protocols it implements; the evaluation is not mainly tied to the device. For example, we evaluate the AES or RSA cipher algorithms regardless of whether they are implemented in a smart card or in a router.
  • asked a question related to Data Protection
Question
2 answers
I need to use the Weka tool to analyse anonymization algorithms.
Relevant answer
Answer
Can I use Weka for data anonymization? Please explain.
  • asked a question related to Data Protection
Question
8 answers
I am interested in finding the advantages of each method, because a lot of researchers have used k-anonymity and made many enhancements to it. Others have also worked on l-diversity for protecting the privacy of sensitive data. If anyone knows of other techniques, I would appreciate it.
Relevant answer
Answer
Authentication User’s Privacy: An Integrating Location Privacy Protection Algorithm for Secure Moving Objects in Location Based Services
Read this paper and its related references; it will help you.
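For concreteness, the two properties discussed in the question, k-anonymity (every quasi-identifier combination is shared by at least k rows) and l-diversity (every such group contains at least l distinct sensitive values), can be checked with a few lines of Python on a toy, hypothetical table:

```python
from collections import defaultdict

# Toy records: (age_band, zip_prefix) are quasi-identifiers,
# the last field is the sensitive attribute.
records = [
    ("30-39", "537**", "flu"),
    ("30-39", "537**", "cancer"),
    ("30-39", "537**", "flu"),
    ("40-49", "537**", "flu"),
    ("40-49", "537**", "flu"),
]

def group_by_qi(records):
    """Group sensitive values by their quasi-identifier combination."""
    groups = defaultdict(list)
    for *qi, sensitive in records:
        groups[tuple(qi)].append(sensitive)
    return groups

def k_anonymity(records):
    """Smallest equivalence-class size: k >= 2 means no row is unique."""
    return min(len(v) for v in group_by_qi(records).values())

def l_diversity(records):
    """Smallest number of distinct sensitive values within any class."""
    return min(len(set(v)) for v in group_by_qi(records).values())

assert k_anonymity(records) == 2   # the "40-49" class has only 2 rows
assert l_diversity(records) == 1   # that class is not diverse: both "flu"
```

The example also shows why l-diversity was proposed as an enhancement: the table is 2-anonymous, yet anyone known to be in the "40-49" group is revealed to have flu.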
  • asked a question related to Data Protection
Question
9 answers
Hello everybody,
Which data anonymization algorithm is most useful for protecting big data from a data analyzer? For example, I want to outsource a database for data mining. What is an effective way to anonymize big data so as to protect personal information?
Relevant answer
Answer
The first issue with any public-use file (open data) is providing a file that allows reproduction of 1-2 (but hopefully more) analyses that might be performed on the original, non-public file prior to the 'masking' of certain fields to prevent re-identification. If the file has valid analytic properties, then the data producer should (attempt to) justify that re-identification of a small proportion of individuals is exceptionally difficult or impossible.
 This has been an open area of research for 35+ years: do a number of tabulations (queries) on the underlying microdata in a manner such that individuals cannot reconstruct the underlying microdata from the tabulations (even when noise (epsilon) is added in a suitable manner). At one point, major database groups and IBM concluded that it was an impossible problem (even after millions of dollars of NSF support).
 In the mid-1990s, Latanya Sweeney (then a CS Ph.D. student at MIT, now a Harvard professor) took an 'anonymized' set of health data on Massachusetts state employees and showed how to re-identify most of them using a Massachusetts voter registration database. The health data had been anonymized by removing individuals' names, SSNs, health insurance IDs, doctors' names, hospital names, and just about any other identifiers people could think of. For analytic purposes, ZIP codes, sex, and date of birth were left in the file, and these fields were used to re-identify more than 70% of the individuals in the file, including the Governor.
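The linkage attack described above can be sketched in a few lines of Python; all records below are invented for illustration.

```python
# Toy linkage attack: join an "anonymized" health table to a voter roll
# on the quasi-identifiers (zip, sex, date of birth).
health = [
    {"zip": "02138", "sex": "F", "dob": "1945-07-31", "diagnosis": "flu"},
    {"zip": "02139", "sex": "M", "dob": "1962-02-13", "diagnosis": "asthma"},
]
voters = [
    {"name": "J. Doe", "zip": "02138", "sex": "F", "dob": "1945-07-31"},
    {"name": "R. Roe", "zip": "02144", "sex": "M", "dob": "1971-10-02"},
]

def link(health_rows, voter_rows):
    """Re-identify health rows whose quasi-identifiers match a voter."""
    index = {(v["zip"], v["sex"], v["dob"]): v["name"] for v in voter_rows}
    return [
        {"name": index[k], **h}
        for h in health_rows
        if (k := (h["zip"], h["sex"], h["dob"])) in index
    ]

print(link(health, voters))  # J. Doe's diagnosis is re-identified
```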
 In 2003, Dinur and Nissim provided methods with rigorous privacy guarantees.
Dinur, I., and Nissim, K. (2003), “Revealing Information while Preserving Privacy,” ACM PODS Conference, 202-210.
Much of the work in recent years has been on improving analytic properties.
Dwork, C. (2008), “Differential Privacy: A Survey of Results,” in (M. Agrawal et al., eds.) TAMC 2008, LNCS 4978, 1-19.
Barak, B., Chaudhuri, K., Dwork, C., Kale, S., McSherry, F., and Talwar, K. (2007), “Privacy, Accuracy, and Consistency Too: A Holistic Solution to Contingency Table Release,” PODS ’07, Beijing, China.
Hardt, M., Ligett, K., and McSherry, F. (2010), "A Simple and Practical Algorithm for Differentially Private Data Release," available at http://arxiv.org/abs/1012.4763 .
Over the last seven years, the American Statistical Association has had three invited paper sessions at its Annual Meeting on differential privacy. Different groups would like to provide 'protected' microdata with guarantees both on privacy and on valid analytic properties.
The answer is that there are presently no systematic methods of assuring both analytic properties and privacy. As long as your original data are suitably clean (this is not always assured in many databases), your tabulations should be valid. Unfortunately, there are very sophisticated methods of working backwards from the tabulations to subsets of the microdata. Look at the following for improved methods of working backwards from tabulations to microdata.
Dwork, C. and Yekhanin, S. (2008), “New Efficient Attacks on Statistical Disclosure Control Mechanisms,” Advances in Cryptology—CRYPTO 2008, to appear, also at http://research.microsoft.com/research/sv/DatabasePrivacy/dy08.pdf .
Many CS and other researchers have shown how to re-identify individuals using seemingly innocuous files. They used public IMDb information to re-identify individuals in the Netflix public-use dataset, which Netflix subsequently took down.
Narayanan, A. and Shmatikov, V. (2008), Robust De-anonymization of Large Sparse Datasets, Proceedings of the 2008 IEEE Symposium on Security and Privacy, 111-125.
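The Laplace mechanism at the heart of the differential-privacy literature cited above can be sketched as follows; the epsilon value and count are illustrative, and the sampler is a textbook inverse-transform construction rather than production-grade code.

```python
# Sketch of the Laplace mechanism: a count query has sensitivity 1 (one
# person changes the count by at most 1), so adding Laplace(1/epsilon)
# noise yields epsilon-differential privacy.
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) by inverse-transform sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count: int, epsilon: float) -> float:
    """Release a noisy count; smaller epsilon means more noise, more privacy."""
    return true_count + laplace_noise(1.0 / epsilon)

random.seed(0)
print(private_count(1000, epsilon=0.5))  # close to 1000, but perturbed
```

This is the accuracy/privacy trade-off the answer describes: each released tabulation leaks a little, and the noise scale must be chosen against the total query budget.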
  • asked a question related to Data Protection
Question
5 answers
The future data protection package includes a General Regulation and a Directive on the protection of individuals with regard to the processing of personal data by competent authorities for the purposes of prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and the free movement of such data.
However, the data protection package initially leaves the Prüm regime unaffected, as was pointed out by the European Data Protection Supervisor (Opinion of the European Data Protection Supervisor on the data protection reform package, 7 March 2012, 443, page 68).
Amendment 6 of the EU Parliament (14 March 2014) introduced it (EP legislative resolution of 12 March 2014, COM(2012)0010 – C7-0024/2012 – 2012/0010(COD)). Today (4 December 2014) it is under discussion within the Council (http://eur-lex.europa.eu/procedure/EN/201285).
I would be interested in any comments or articles regarding this question. Thanks!
Relevant answer
Answer
This paper may not answer your question directly, but it is a good overview of the efforts to find common ground and to identify minimum standards between US and EU privacy law.
  • asked a question related to Data Protection
Question
3 answers
Hi all. I'm Khairil from Malaysia.
I am currently working on developing a model/framework for Data Leakage Protection in the government sector. Does anyone have references? Thank you so much for your help.
Relevant answer
Answer
These are reference articles that discuss data leakage protection and related models; hopefully they can help you.
  • asked a question related to Data Protection
Question
7 answers
Patient clinical data are private, and acquiring them for research, especially by researchers without an affiliation to a healthcare organisation, can be close to impossible, even when the data are anonymised.
Can patient data be simulated? Would simulated data then be considered applicable for use in research?
Are there any samples of simulated data that can be tested?
Relevant answer
Answer
I also agree with Lazaridis, Ngafeeson and Aghatise. Simulated data are possible, and results based on them are even publishable, as long as the work is clearly presented as a simulation. Their connection to real clinical outputs, however, remains doubtful, precisely because they are based on a simulation.
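As a hedged illustration of what simulated patient data can look like, here is a toy Python generator. All field names, distributions, and parameters are invented for the example and carry no clinical validity, which is exactly the caveat raised above.

```python
# Toy generator for synthetic patient records: values are drawn from
# made-up distributions, so the output contains no real patient data.
import random

def simulate_patients(n: int, seed: int = 42) -> list:
    rng = random.Random(seed)  # seeded for reproducibility
    diagnoses = ["hypertension", "diabetes", "asthma", "none"]
    return [
        {
            "patient_id": f"SIM-{i:05d}",        # clearly synthetic IDs
            "age": rng.randint(18, 90),
            "sex": rng.choice(["F", "M"]),
            "systolic_bp": round(rng.gauss(125, 15)),
            "diagnosis": rng.choice(diagnoses),
        }
        for i in range(n)
    ]

cohort = simulate_patients(5)
print(cohort[0])
```

For methods papers this is often enough; for clinical claims, the simulation's distributions would need to be justified against real, ethics-approved data.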
  • asked a question related to Data Protection
Question
4 answers
Dear colleagues, we are looking for data on informality among firms in Latin America, and comparisons to other emerging regions.
In particular, we would like to know estimates of the share of firms not registered (while they should); or firms paying taxes, etc...
Apparently, some household surveys include the question, but we would favor firm surveys.
Many thanks!
Note: The WB Enterprise Survey only surveys formal firms, and how they compete with informal ones.
Relevant answer
Answer
Gracias Fabio! I'll take a look (although I doubt that most workers know whether their employers are registered or pay taxes...)
  • asked a question related to Data Protection
Question
4 answers
Data protection can be studied from two different angles: technical and legal. Can we consider data security a technical issue and data protection a legal issue?
Relevant answer
Answer
The challenges of data protection/security in an organization include:
-Availability of trained staff (from user to administrator)
-Regular training of master trainers, as new threats are always emerging
-Bringing all machines onto the network
-Disabling USB data storage devices
-Controlled access to the internet
-A SAN for data storage and archiving
-Effective group policies
-Regular updating/upgrading of the OS
-Installation of antivirus software and its regular updating
-Legal support / supporting policies and laws
-Physical security as an additional protection layer
Data protection is the bigger picture and can be seen as an institutional issue, whereas data security is an organizational matter that deals more with the technicalities, i.e. the implementation of data protection policies.
  • asked a question related to Data Protection
Question
1 answer
Using file entropy or the chi-square test seems to generate too many misclassifications (i.e., encrypted files are reported as unencrypted).
Perhaps one can use the FRSS score mentioned on page 12 here (https://www.utica.edu/academic/institutes/ecii/publications/articles/A0B3DC9E-F145-4A89-36F7462B629759FE.pdf), but I'm not sure how to apply that patch to SleuthKit.
Any ideas?
Relevant answer
Answer
I can understand the opposite: unencrypted data being classified as encrypted. This can happen when data is compressed, which pushes the entropy toward its maximum. One thing that might help is to first identify the encoding of the source (e.g., raw, base64, quoted-printable, ...); once this is known, the upper limit on entropy is also known. Another practical point is to skip the initial blocks of the source file (e.g., the header of a zip file will have low entropy, while the actual encrypted content that follows will be as random as it should be).
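The entropy heuristic described above, including the header-skipping trick, can be sketched in Python. The 7.9 bits-per-byte threshold and the 512-byte skip are illustrative choices, not validated parameters; real classifiers combine several statistics, as the FRSS approach in the linked paper does.

```python
# Heuristic encryption detector: estimate Shannon entropy of a byte
# stream, optionally skipping a low-entropy header first.
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Entropy in bits per byte (max 8.0 for uniformly random bytes)."""
    if not data:
        return 0.0
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def looks_encrypted(data: bytes, skip: int = 512, threshold: float = 7.9) -> bool:
    """Skip a presumed header, then test the remainder against a threshold."""
    body = data[skip:] if len(data) > 2 * skip else data
    return shannon_entropy(body) >= threshold

print(looks_encrypted(os.urandom(64 * 1024)))  # random bytes look encrypted
print(looks_encrypted(b"A" * 64 * 1024))       # constant bytes do not
```

As noted above, well-compressed data will also clear this threshold, which is exactly why entropy alone misclassifies.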
  • asked a question related to Data Protection
Question
2 answers
My main concern is how to identify individuals who are prone to manifesting psychological or social problems even in a well-managed and friendly working environment, without infringing on their personal space and while taking data protection issues into consideration.
Relevant answer
Answer
I have done research and published on spatial position and movement in groups as non-verbal behavior. There is some evidence that, controlling for cultural definitions of permitted spatial closeness, those who infringe such boundaries, or who move around a lot in a stable group such as a classroom, business, or military group (after four sessions), may be alien to or uncomfortable with the group. There are also differences depending on seat position and on dominant or minority group status, and possible detection of attitudes for or against authority or peers in the group. Experiments on spatial distance from others, controlling for culture and perhaps class, might indicate emotional disturbance. Marking territory and defending territory in invasion experiments also identify dominants and "strangers" in the group. I would like to do more research on this topic, which could benefit the field as well as business and security, but would want to be recompensed. Many other nonverbal behaviors also indicate cultural, class, and other origins. Gilda Haber, PhD
  • asked a question related to Data Protection
Question
6 answers
Is jurisprudence of privacy law different from jurisprudence of data protection law?
Relevant answer
Answer
Confidentiality assumes full protection of any information relating to a person. Data protection legislation, by contrast, is intended to ensure the integrity of an exhaustive list of personal information about a person that is strictly enumerated in a specific legislative act.
  • asked a question related to Data Protection
Question
2 answers
For example, techniques for ensuring the integrity of data on the server side (at the service provider) or during the transfer of the data.
Relevant answer
Answer
Hi Ahmed. You can search for "provable data possession" for cloud data integrity.
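Full provable-data-possession (PDP) protocols are more involved, but the baseline idea they improve on can be sketched quickly: keep a keyed digest locally, then verify the copy the provider returns. The key and data below are illustrative; real PDP schemes let the client verify possession without downloading the whole file.

```python
# Baseline integrity check for outsourced data: a keyed digest (HMAC)
# kept by the client, verified against the copy the server returns.
import hashlib
import hmac

def fingerprint(data: bytes, key: bytes) -> str:
    """Keyed digest: the provider cannot forge it without the key."""
    return hmac.new(key, data, hashlib.sha256).hexdigest()

key = b"client-side secret"  # illustrative key, never sent to the server
original = b"record set v1"
tag = fingerprint(original, key)  # stored locally before outsourcing

# Later: verify whatever the provider hands back.
assert hmac.compare_digest(tag, fingerprint(original, key))         # intact
assert not hmac.compare_digest(tag, fingerprint(b"tampered", key))  # modified
print("integrity check passed")
```

For data in transit, the same keyed-digest idea underlies the MACs used inside TLS.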
  • asked a question related to Data Protection
Question
3 answers
In a clinical database management system, researchers handle enormous numbers of patient records, which may contain sensitive information. How can the individual privacy of patients be preserved in clinical data management and in biobanks?
Relevant answer
Answer
This topic is of high-interest at OHRP right now and at every conference I've been to where a representative from OHRP is present, this issue is addressed by referring back to http://www.hhs.gov/ohrp/policy/reposit.html and specifically the guidance at http://www.hhs.gov/ohrp/policy/cdebiol.html. OHRP views personally identifiable private information stored as data synonymous with tissue samples, even if it's coded but could possibly be linked to specific individuals by the investigator(s) either directly or indirectly through a key (available to the investigator). There are exceptions spelled out in the guidance so take a close look to see if your project meets one of the exclusions.
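As a hedged sketch of the "coded" identifiers mentioned in the OHRP guidance above, here is one common approach: keyed pseudonymization, where only the holder of the key can re-link codes to patients. The key handling and identifiers shown are illustrative only.

```python
# Keyed pseudonymization: replace a direct identifier with a code that is
# deterministic for the key holder but infeasible to reverse without the key.
import hashlib
import hmac

def pseudonymize(patient_id: str, key: bytes) -> str:
    """Deterministic coded ID: same input and key always yield the same code."""
    return hmac.new(key, patient_id.encode(), hashlib.sha256).hexdigest()[:16]

link_key = b"held-by-honest-broker"  # illustrative; store apart from the data
code = pseudonymize("MRN-0012345", link_key)
print(code)
# The same record coded twice yields the same pseudonym, so tables can
# still be joined on the code without exposing the MRN itself.
```

Under the guidance cited above, whether such coded data counts as identifiable turns on whether the investigator can access the key, so key custody matters as much as the coding itself.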