Privacy Impact Assessment (PIA) Sample Answers
Executive Summary
Consumer choices to accept the risk inherent in a given transaction depend on trust in the vendor. ‘Decisions regarding the trustworthiness of an e-vendor result from accumulated transactions in the past (cognitive trust) or stem from more emotive bases (affective trust).’
The cybersecurity breach reduced cognitive trust in Deltex; the company must therefore act in ways that visibly protect customer data. Compensatory perks such as discounts can restart the relationship, but without trust, customers will only take up offers that do not put them in a compromising position.
In order to regain this trust, Deltex must display behaviours and cues that instil consumer trust in their ability, integrity, predictability, and benevolence when dealing with or serving online shoppers.
Trust is also a prerequisite for social license and should be factored into best-practice considerations, given the primary need to repair damage in order to increase revenue. Companies should empower stakeholders to be as informed as possible so that choices are made on objective rather than subjective assessments of the company.
It is therefore argued that Deltex should avoid the strategy of ‘knowing more than the user knows about themselves’ due to the risk that it poses to a revamped reputation of accountability.
It is proposed that the company adopt the converse strategy and provide individuals with access to all information that Deltex sources from other sites so that individuals are attracted to Kartman for its privacy-promoting functions.
While this act may bring in minimal revenue, the benefits of regaining trust will outweigh any potential lost revenue from dishonest practices such as understaffing the complaints department and covert keystroke surveillance.
Other key recommendations include simplifying the privacy policy to aid understanding and encourage transparency. The policy should include a single clear opt-out mechanism for targeted advertising, as well as individual requests, each of which must receive a positive response, for every financial transaction and for every disclosure to third parties, specifying the content of the data to be sent.
The ‘boiling-frog’ strategy and policy of keystroke/online monitoring should be abandoned, and all data whose use has expired should be de-identified or destroyed.
Project designers should be wary of differing jurisdictional requirements in overseas data centres and the need to encrypt transmitted data, and should implement an adequately funded complaints process that provides proper access to data.
Threshold Assessment: Description
Amy Hofsteder’s Kartman project involves the development of an electronic personal assistant that will collect large amounts of personal information, including sensitive medical information, financial details, other personal information, social media profiles and data imputed to the user through keystrokes and online activity.
Data collected will be retained as trade secrets and held overseas after unencrypted transfers, and some will be disclosed to third parties for advertising purposes. The data is collected for many uses, some of which are employed without express permission, such as taking out loans in the person’s name.
This combination of collection, storage, use and disclosure renders it necessary to conduct a PIA. The project’s key milestone is the point at which the project is successful enough to market to other facets of the ‘corporate foodchain.’
Methodology: Stakeholders
Whilst Deltex must comply with the Australian Privacy Principles (‘APPs’), in order to increase public confidence, this PIA must also address wider privacy concerns through public consultation tailored towards key stakeholders, particularly since ‘consulted stakeholders are less likely to criticize a project than those who were not consulted.’
The sensitive nature of the data and previous cybersecurity breach carry the consequential obligation to demonstrate that the company is ‘forthright and careful with information.’
As the initial target market, CEOs should be prioritised through interviews and online surveys, followed by focus groups and surveys of employees who, on the basis of their workload, would appreciate this service, in order to assess attitudes trickling down ‘the corporate foodchain.’
Public submissions should be encouraged and meetings arranged with representatives of civil liberties groups and the Federal Privacy Commission. Other relevant stakeholders include designers and manufacturers who may provide further privacy-enhancing recommendations. This PIA must be reviewed and updated throughout the project.
The following list of ranked risks is not exhaustive. It is assumed that Deltex, as a publicly listed company, has an annual turnover exceeding $3 million, but it would likely also constitute a ‘health service provider,’ since the profile of calorific intake and exercise habits, the PT appointment booking service and contact with doctors would constitute recording and maintaining the individual’s health, particularly since the OAIC identifies weight loss clinics as health service providers.
Thus Deltex must comply with the APPs, breach of which may result in enforceable undertakings, injunctions and civil penalty orders.
For ethical considerations, reference will be made to the Ireland Technical Consultancy Group (TCG) Ethical awareness framework, which takes account of the context and use of the data, consent and choice of the subject, reasonableness of the use, fairness of the outcome, ownership of rights, access to the data and accountability of the entity.
1. PROCEDURAL RISKS
Encryption: HIGH
Deltex must take reasonable steps to protect information from misuse, interference and unauthorised access. In AAPT and Melbourne IT, the OAIC refers to its Guide to Securing Personal Information to clarify ‘reasonable steps’.
This highlights the importance of encrypting ‘data in transit,’ indicating that the current transmission plan will render Deltex non-compliant. If financial information is transmitted unencrypted this will breach requirements for encryption under the PCI Data Security Standard.
If this design is not revised, Kartman will not obtain social license, due to failure to establish trust and social legitimacy, which requires compliance with community norms. Society generally expects that sensitive information will be well protected and may even expect encryption.
In 2006, a US Judge declined to find that ‘encryption should be used as a routine security precaution,’ however technological development may render this notion outdated. To foster trust, Deltex must display accountability for its previous breach through heightened security.
Notably, Microsoft includes encryption as a ‘best practice’ in relation to ‘sensitive data, including personally identifiable information and financial information’ in transit.
Consequently Deltex should encrypt data in transit, accompanied by proper key management, which is the responsibility of administering employees.
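The recommendation to encrypt data in transit can be illustrated with a minimal sketch. This shows the general idea, not any actual Deltex system: in Python, the standard-library `ssl` module can build a transport context that refuses unencrypted, unverified or legacy-protocol connections to a data centre.

```python
import ssl

def make_transfer_context() -> ssl.SSLContext:
    """Build a TLS context that refuses unencrypted or unverified links.

    Illustrative only: a real deployment would pair this with proper
    key and certificate management by administering employees.
    """
    ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # reject legacy protocols
    ctx.check_hostname = True                     # verify certificate identity
    ctx.verify_mode = ssl.CERT_REQUIRED           # no anonymous endpoints
    return ctx

# Any socket wrapped with this context will only transmit over verified TLS.
ctx = make_transfer_context()
```

A connection wrapped with this context fails outright rather than silently falling back to plaintext, which is the behaviour the ‘data in transit’ guidance calls for.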
Complaints Process: HIGH (individual), MEDIUM (Deltex)
Deltex is obliged to provide individuals with access to their information on request, and it is unlikely any APP 12 exceptions will apply.
Plans to avoid permitting user access to information place Deltex in breach of its legal obligations and bar achievement of the credibility element of social license, due to the purposeful decision not to deliver on the promise of an adequate complaints system.
The entrenchment of this privacy principle in international human rights norms, indicated by its inclusion in the OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, shows that failure to provide a complaints/access system would be unethical under community standards.
This is high risk for the individual, whose civil liberties are undermined. The CEO should direct more funding towards this process to, at a minimum, meet APP standards.
Privacy Policy: MEDIUM
While the ‘initial privacy-friendly’ policy would likely adhere to APP requirements, the ‘boiling-frog’ strategy, though not expressly outlawed, breaches the ethical principle of credibility due to post-breach desires for transparency.
The settlement between Facebook and the FTC, requiring that Facebook obtain express consent before enacting policy changes, indicates that non-consensual privacy policy changes, and automatically changing opt-outs upon updates may amount to deceptive trade practices in breach of section 18 of the Australian Consumer Law.
The fact that Deltex is drawing upon knowledge that most users do not read terms and conditions is probative of an intention to deceive, which is relevant to determinations of liability. However, Re Apple Inc may indicate that statements in privacy policies will not result in liability, since the Court held that the blatant lies in Apple’s privacy policy were not important factors in the plaintiffs’ decisions to purchase their iPhones.
Regardless, media coverage of social networking site policy changes encroaching upon privacy indicates that the ‘boiling-frog’ strategy will not necessarily go unnoticed and could tarnish Deltex’s reputation at a time requiring behavioural cues that instil consumer trust in its integrity.
Examples of this uproar include the challenge by six major consumer privacy groups to Facebook’s policy change in 2013. This ‘boiling-frog’ strategy is not only an issue in its offence to community values but may result in liability and should be abandoned.
When websites ask users what information they want to share, users are empowered and tend to share more information than they would have if not given control over sharing.
Thus options for control should be embedded in the product design and specified in the policy to account for varying preferences in relation to privacy protection. Additionally, simplicity and clarity are key concepts to be mindful of when redesigning the privacy policy to simultaneously avoid information overload and comply with APP requirements of consent.
Retained Information: MEDIUM
Deltex will also be in breach of APP 11.2 for retaining unused data, such as license and/or passport details obtained for initial identification purposes. It may be that BANK requirements under the Anti-Money Laundering and Counter-Terrorism Financing Act 2006 (Cth) excuse the license/passport information from this requirement and satisfy the exception to APP 9 prohibiting use of a ‘government related identifier.’
However in relation to other data kept after being used for its consented purpose, it is recommended that Deltex be mindful of its obligation to destroy data rather than retain it as ‘trade secrets.’
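The obligation to destroy or de-identify data whose consented purpose is exhausted could be implemented in many ways; the following is a hedged sketch, with hypothetical field names, of salted one-way hashing that strips direct identifiers while preserving non-identifying fields.

```python
import hashlib
import secrets

def de_identify(record: dict, id_fields=("name", "email", "passport_no")) -> dict:
    """Return a copy of `record` with direct identifiers irreversibly masked.

    Illustrative only: field names are assumptions, and a real scheme
    would be assessed against re-identification risk for the data set.
    """
    salt = secrets.token_hex(16)  # discarded after use, so hashes are unlinkable
    out = dict(record)
    for field in id_fields:
        if field in out:
            digest = hashlib.sha256((salt + str(out[field])).encode()).hexdigest()
            out[field] = digest[:12]  # truncated hash keeps record shape
    return out
```

Because the salt is never stored, the masked values cannot be matched back to individuals, which is the substance of de-identification under APP 11.2; outright destruction remains the stronger option where no residual use exists.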
Overseas Cloud Storage: LOW
Applicable APPs when sending personal information overseas depend on whether it is a ‘use’ or a ‘disclosure’ of information.
The APP Guidelines (‘AG’) provide that ‘use’ occurs when the entity has effective control in handling and managing the information. Since the data centres are in-house and cloud-based, Deltex retains rights of data access and control as well as choice of security measures; thus APP 8 and the Privacy Act 1988 (Cth) (‘PA’) s 16C will not apply.
Whilst the in-house factor satisfies the requirement that personal information only be transmitted across international borders with consent or if the organisation believes that the recipient is subject to comparable privacy regulation, Deltex will be subject to different requirements in each jurisdiction.
As the Brussels data centre is an ‘establishment in the EU’ and if Kartman is sold in the EU, the General Data Protection Regulation will apply from 2018, obliging Deltex to comply with the right to erasure which would conflict with business objectives to keep user data as trade secrets.
If Deltex retains overseas storage, it must comply with each law and be wary of differences that may affect privacy rights: for example, inadvertent disclosure arising from Chinese network providers scanning Shanghai networks for content harmful to the state and monitoring content potentially infringing another’s civil rights.
The PA does not prevent cloud storage; however, multiplication of data makes it difficult to ascertain whether information has been permanently destroyed, which may conflict with obligations to destroy or de-identify data no longer needed.
Google has overcome this issue by including a warning statement in relation to its Google Docs program, a practice approved of by the Acting Assistant Privacy Commissioner. It is recommended this statement be included in Deltex’s privacy policy.
2. DESIGN RISKS
Indirectly-disclosed Information: HIGH
Whilst there may be social license for companies to use cookies for advertising, customers generally do not condone advertising companies holding deeply personal information, as illustrated by the backlash to Target’s 2012 prediction of a customer’s pregnancy.
Keystroke surveillance has the potential to pick up personal information as well as highly-sensitive information such as bank passwords.
There is no ethical justification for intrusive covert surveillance, as indicated by media backlash and legal action over comparable behaviours such as the activation of webcams to photograph students and keystroke monitoring by employers.
The lack of consent for this renders it ethically unacceptable and places Deltex in breach of obligations to notify individuals when information is collected about them.
Even if Deltex can successfully argue that the information obtained from keystroke logging is reasonably necessary for, or directly related to functions of marketing services to the user, it would breach requirements to collect personal information only by lawful and fair means.
In ‘LP’ and The Westin Sydney, the Commissioner referred to the AG, noting that a ‘fair means’ of collection does not involve intimidation or deception and is not unreasonably intrusive, and that it would usually be unfair to collect personal information covertly, subject to the circumstances.
Seeing as the keystroke surveillance is non-consensual and could be incredibly intrusive in data collected, this would not constitute ‘fair’ collection.
Due to the immense breach of both ethical standards and APPs that would occur, it is strongly recommended that this be abandoned from the design.
Utilising location services to suggest nearby restaurants/services may be more feasible, given public acquiescence to Apple’s use of location services.
Autonomous Action: HIGH (customers), MEDIUM (Deltex)
It is not stated whether Kartman contacts doctors and takes out loans with express user approval for each transaction, similar to the Acorn program, or whether the privacy policy provides blanket approval for Kartman to invest on the user’s behalf at its discretion.
In regard to the latter, this practice may breach requirements for data accuracy, since such information is inferred: for example, diagnosing a cold on the basis of heart rate and sleep habits, which may be attributable to other causes.
From an ethical perspective, disclosure of information without direct consent, particularly of incorrect data, may result in prejudicial treatment from insurance companies, similar to John Hancock offering discounts proportional to exercise recorded on a Fitbit, and has the potential to undermine credibility and trust in the company, particularly in conjunction with the understaffed complaints system.
To avoid any prejudicial mistakes, it is recommended that Kartman display notifications to the user seeking consent for each action and confer with professionals only with de-identified information to obtain general (as opposed to user-specific) advice.
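The recommendation that Kartman seek consent for each autonomous action amounts, in implementation terms, to a gate that refuses to execute any transaction lacking express approval. A minimal sketch follows; the `Action` type and its fields are illustrative assumptions, not part of any stated Kartman design.

```python
from dataclasses import dataclass

@dataclass
class Action:
    """A single autonomous step Kartman proposes (hypothetical model)."""
    description: str
    approved: bool = False  # no blanket approval: each action starts unconsented

def record_consent(action: Action, user_said_yes: bool) -> Action:
    """Store the user's express response to a per-action notification."""
    action.approved = user_said_yes
    return action

def execute(action: Action) -> str:
    """Run the action only if express consent was recorded for it."""
    if not action.approved:
        raise PermissionError(f"no express consent for: {action.description}")
    return f"executed: {action.description}"
```

The design choice is that consent is attached to the individual action rather than granted once in a policy, which mirrors the per-transaction approval recommended above.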
Advertising Disclosure: MEDIUM
While the prevalence of targeted marketing may indicate its normative status, and accordingly a social license permitting extraction of data for this purpose, it could also be attributed to the lack of opt-out functions and the ubiquity of programs such as Facebook that exclude those without accounts.
Socialist privacy legislation requires companies to advertise only once users opt in, to maximise self-determination. This contradicts plans to split opt-outs across different menus, which offends the TCG principles of consent and choice due to limited opportunities to decline and the overriding of choices upon upgrade.
Since the PA only applies to ‘personal information’, or information able to be used for identification, sifting through ‘online activity’ might only collect general information about the subject’s interests and browsing history, which would release Deltex from PA obligations.
However, the large quantity of Kartman data collected may rule out de-identification. In this instance, Deltex may not use/disclose the information for the purpose of direct marketing without consent of the user.
Kartman activities constitute ‘direct marketing’ due to use of personal information to select which advertisements are displayed, thus consent of users must be sought.
The opt-out policy would breach the requirement to provide a simple means to request ceasing communication, since the AG clarify simplicity as a ‘process for opting out requiring minimal time and effort’ and that ‘the individual should be able to easily find out how to opt out.’
If an individual has already requested not to receive advertising material, automatic opt-ins when updating the app would breach APP 7.3(e). If an opt-in policy is too risky in terms of losing advertising profits, it is recommended that the opt-out policy be simplified and that opt-out settings remain unchanged as the product is upgraded.
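Keeping opt-out settings stable across upgrades is, in implementation terms, a migration rule that never resets a recorded choice to its default. A minimal sketch, with hypothetical setting names:

```python
# Hypothetical default preferences for a fresh install; names are assumptions.
DEFAULTS = {"targeted_ads_opt_out": False, "share_with_partners": False}

def migrate_preferences(stored: dict) -> dict:
    """Merge stored choices over the new version's defaults.

    A new install starts at DEFAULTS, but an upgrade never overrides a
    choice the user has already recorded (cf. APP 7.3(e)).
    """
    prefs = dict(DEFAULTS)
    prefs.update(stored)  # recorded choices always win over defaults
    return prefs
```

The key point is the update order: defaults first, stored choices second, so an upgrade can add new settings without silently opting anyone back in.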
Correcting Inconsistencies: LOW
Kartman’s function of correcting inconsistencies will be subject to the requirement that Deltex ensure the information is in fact incorrect before taking reasonable steps to correct it and notifying third parties, where the information being corrected is ‘held’, that is, in the possession or control of Deltex.
The difficulty of discerning what data is attached to one’s name indicates that there may be a market, following the practices of Acxiom and Rapleaf, in revealing all information found in other databases, since ‘up to 30% of a person’s profile may be wrong at any given time’, which can be harmful when data is used for employment screening and the like.
While charges for this access may not be excessive, revenue from this function would be grounded in the social value of transparency, allowing individuals to correct misinformation before it has a prejudicial effect, and would help regain trust. The converse notion of ‘knowing more than the user knows’ does not accord with the accountability element of the ethical framework and is too dangerous to adopt when seeking to regain trust.
Bibliography
A Articles/Books/Reports
Anderson, Collin, ‘Iranian Internet Infrastructure and Policy Report’ (Policy Report, Small Media, July-August 2013)
Australian Privacy Commissioner, ‘AAPT and Melbourne IT’ (Own Motion Investigation Report, Office of the Australian Information Commissioner, 6 August 2012)
Beckett, Lois ‘Everything We Know About What Data Brokers Know About You’, ProPublica (online) 13 June 2014
Carolan, Eoin and Rosario Castillo-Mayen, ‘Why More User Control Does Not Mean More User Privacy: An Empirical (and Counter-Intuitive) Assessment of European E-Privacy Laws’ (2015) 19(2) Virginia Journal of Law and Technology 324
Fang, Yulin et al, ‘The Moderating Role of Perceived Effectiveness of Third-Party Control on Trust and Online Purchasing Intentions’ (Paper presented at Reaching New Heights. 13th Americas Conference on Information Systems, Colorado, 9 August 2007)
Federal Trade Commission, ‘Facebook Settles FTC Charges That It Deceived Consumers By Failing To Keep Privacy Promises’ (Press Release, 29 November 2011)
Fuchs, Christian, ‘The Political Economy of Privacy on Facebook Author’ (Research Paper No 9, Unified Theory of Information Research Group, Vienna, 13 January 2011)
Goel, Vindu, ‘Privacy Groups Aim to Stop Facebook Policy Changes’ New York Times (New York) 4 September 2013
Lewis, JD and Andrew Weigert, ‘Trust as a Social Reality’ (1985) 63(4) Social Forces 967
Lipman, Rebecca, ‘Online Privacy and the Invisible Market for Our Data’ (2016) 120 Pennsylvania State Law Review 777
Livingston, Scott and Graham Greenleaf, ‘Tort Liability for Online Privacy Violations in China: The 2014 SPC Regulation’ (2015) 136 Privacy Laws & Business International Report 24
Lyon, David and Elia Zureik (eds), Computers, Surveillance, and Privacy (University of Minnesota Press, 1996)
Macnish, Kevin, The Ethics of Surveillance: An Introduction (Routledge, 2017)
McCole, Patrick, Elaine Ramsey and John Williams ‘Trust Considerations on Attitudes Towards Online Purchasing: The Moderating Effect of Privacy and Security Concerns’ (2010) 63(9) Journal of Business Research 1018
Morrison, John, The Social License: How to Keep Your Organization Legitimate (Springer, 2014)
Office of the Australian Information Commissioner, ‘Australian businesses and the EU General Data Protection Regulation’ (Business Resource No 21, May 2017)
Guide to Securing Personal Information (January 2015) Office of the Australian Information Commissioner <https://www.oaic.gov.au/agencies-and-organisations/guides/guide-to-securing-personal-information>
‘Guide to Information Security: ‘Reasonable Steps’ to Protect Personal Information’ (Commissioner Report, Office of the Australian Information Commissioner, April 2013)
‘Guide to Undertaking Privacy Impact Assessments’ (Report, Office of the Australian Information Commissioner, May 2014)
‘Online Behavioural Advertising — Know Your Options’ (Privacy Fact Sheet No 4, Office of the Australian Information Commissioner, December 2011)
Office of the Australian Information Commissioner, ‘Sending Personal Information Overseas’ (Business Resource No 8, May 2015)
Schaub, Florian, Rebecca Balebako and Lorrie Faith Cranor, ‘Designing Effective Privacy Notices and Controls’ (2017) 21(3) IEEE Internet Computing 70
Sloan, Robert and Richard Warner, Unauthorized Access: The Crisis in Online Privacy and Security (CRC Press, 2016)
Solomon, Andrew, ‘Privacy and the Cloud’ (Speech delivered at the Cloud Computing Conference and Expo, 9 September 2010)
Wright, David, ‘Making Privacy Impact Assessment More Effective’ (2013) 29(5) The Information Society 307
Wright, David, Rachel Finn and Rowena Rodrigues, ‘A Comparative Analysis of Privacy Impact Assessment in Six Countries’ (2013) 9(1) Journal of Contemporary European Research 160
B Cases
Google Inc v Australian Competition and Consumer Commission (2013) 249 CLR 43
‘LP’ and The Westin Sydney [2017] AICmr 53 (7 June 2017)
Re Apple Inc. iPhone/iPad Application Consumer Privacy Litigation (US District Court, Northern District of California, 11-md- 02250, 25 November 2013)
C Legislation
Privacy Act 1988 (Cth)
General Data Protection Regulation [2016] OJ L 119/29
D Other
Bishop, Bryan, Phone Location-Tracking Lawsuit Against Apple is Dismissed (27 November 2013) The Verge <https://www.theverge.com/2013/11/27/5153954/iphone-location-tracking-lawsuit-against-apple-is-dismissed>
Fleishman, Glenn, ‘How iOS 11 changes location tracking on your iPhone and iPad’ MacWorld (online) 10 July 2017
<https://www.macworld.com/article/3203365/ios/how-ios-11-changes-location-tracking-on-your-iphone-and-ipad.html>
Gemalto Solutions, Enterprise Data Encryption Best Practices <https://safenet.gemalto.com/protect-sensitive-data-enterprise- encryption/>
Glasgow, Seth, PCI Encryption Requirements (28 December 2015) Secure State <https://www.securestate.com/blog/2015/12/28/pci-encryption-requirements-part-1>
Hicken, Melanie, ‘Find Out What Big Data Knows about You (It may Be Very Wrong)’, CNN Money (online), 5 September 2013
<http://money.cnn.com/2013/09/05/pf/acxiom-consumer-data/index.html>
Mearian, Lucas, ‘Insurance Company now Offers Discounts — if you Let it Track your Fitbit’, Computerworld (online), 17 April 2015 <https://www.computerworld.com/article/2911594/insurance-company-now-offers-discounts-if-you-let-it-track-your-fitbit.html>
McCullagh, Declan, Judge: Firm not Negligent in Failure to Encrypt Data (16 February 2006) C-Net <https://www.cnet.com/au/news/judge-firm-not-negligent-in-failure-to-encrypt-data/>
Newton-Sims, Timo, ‘Could your Fitbit Data be Used to Deny You Health Insurance?’, The Conversation (online) 17 February 2017 <http://theconversation.com/could-your-fitbit-data-be-used-to-deny-you-health-insurance-72565>
Office of the Australian Information Commissioner, Australian Privacy Principles Guidelines (31 March 2015) <https://www.oaic.gov.au/resources/agencies-and-organisations/appguidelines/APP_guidelines_complete_version_1_April_2015.pdf>
Business Resource: Key Health Privacy Concepts (2015) Office of the Australian Information Commissioner <https://www.oaic.gov.au/engage-with-us/consultations/health-privacy-guidance/business-resource-key-health-privacy-concepts>
Privacy Law, Office of the Australian Information Commissioner <https://www.oaic.gov.au/privacy-law/>
Organisation for Economic Co-operation and Development, OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data (23 September 1980) <https://www.oecd.org/sti/ieconomy/2013-oecd-privacy-guidelines.pdf>
Petru, Alexis, ‘Can Companies Restore Consumer Confidence After a Data Breach?’, Triple Pundit (online) 8 July 2014
Randtronics, Key Management and PCI DSS <https://www.randtronics.com/blog/item/key-management-and-pci-dss>
Selby, Judy, Ethical Considerations in Using Social Data (30 September 2015) Judy Selby Consulting <https://judyselbyconsulting.com/2015/09/30/considerations-in-social-data/>
Thompson, Ian, What Is the Social License? (2017) Shinglespit Consultants Inc <https://socialicense.com/definition.html>
Vinton, Kate, How Companies Can Rebuild Trust After A Security Breach (1 July 2014) Forbes <https://www.forbes.com/sites/katevinton/2014/07/01/how-companies-can-rebuild-trust-after-a-security-breach/#732dd4de5e6c>
Whitney, Lance, ‘School Escapes Charges in Webcam Spying Case’ C-Net (online), 18 August 2010
<https://www.cnet.com/news/school-escapes-charges-in-webcam-spying-case/>