I’ve attached the rubric, the ethical principles, and a sample paper.
The paper would be based on this reading:
SCENARIO 5-5: Data Mining at the XYZ Credit Union

Jane, a senior real estate professional at CBA Real Estate, wishes to purchase a condominium, and she has recently applied for a mortgage at the XYZ Credit Union. To be considered for this loan, Jane is required to fill out a number of mortgage-related forms, which she willingly completes. For example, on one form, she discloses that she has been employed by CBA for more than seven years and that her current annual salary is $95,000. On another form, Jane discloses that she has $50,000 in her savings account at a local bank (much of which she plans to use for the down payment on the house she hopes to purchase). Additionally, she discloses that she has $1,000 of credit card debt and still owes $3,000 on an existing car loan. The amount of the loan for the mortgage she hopes to secure is $100,000 over a 30-year period.

After Jane has completed the forms, the credit union's computing center runs a routine data mining program on information in its customer databases and discovers a number of patterns. One reveals that real estate professionals earning more than $80,000 but less than $120,000 annually are also likely to leave their current employers and start their own businesses after 10 years of employment. A second data mining algorithm reveals that the majority of female real estate professionals declare bankruptcy within two years of starting their own businesses. The data mining algorithms can be interpreted to suggest that Jane is a member of a group that neither she nor possibly even the mortgage officers at the credit union had ever known to exist, namely, the group of female real estate professionals likely to start a business and then declare bankruptcy within two years. With this newly inferred information about Jane, the credit union determines that Jane, because of the newly created category into which she fits, is a long-term credit risk. So, Jane is denied the mortgage.
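To make concrete how mined patterns like these could end in an automated denial, here is a minimal Python sketch of rule-based scoring. The two rules mirror the two patterns in the scenario, but the function name, field names, and decision logic are all assumptions for illustration; the reading says nothing about how XYZ's software actually works.

```python
# Hypothetical sketch of the kind of rule-based inference the scenario
# describes. The rules come from the scenario; everything else is invented.

def mined_risk_flag(applicant: dict) -> bool:
    """Return True if the mined patterns place the applicant in the
    newly inferred 'long-term credit risk' group."""
    # Pattern 1: real estate professionals earning $80k-$120k tend to
    # leave their employers and start their own businesses.
    likely_founder = (
        applicant["profession"] == "real estate"
        and 80_000 < applicant["salary"] < 120_000
    )
    # Pattern 2: most female real estate professionals who start a
    # business declare bankruptcy within two years.
    return likely_founder and applicant["gender"] == "female"

jane = {
    "profession": "real estate",
    "salary": 95_000,
    "gender": "female",
    "savings": 50_000,
    "credit_card_debt": 1_000,
    "car_loan_balance": 3_000,
}

if mined_risk_flag(jane):
    print("Application denied: inferred long-term credit risk")
```

Note that nothing in Jane's own record (savings, debts, payment history) drives the outcome; the denial follows entirely from her membership in an inferred group.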
Above is the scenario the paper is about. Below is the extra reading that’ll help with the paper.
Does the credit union's mining of data about Jane raise any significant privacy concerns? At one level, the transaction between Jane and the credit union seems appropriate. To secure the mortgage from XYZ Credit Union, Jane has authorized the credit union to have the information about her, that is, her current employment, salary, savings, outstanding loans, and so forth, that it needs to make an informed decision as to whether or not to grant her the mortgage. So, if we appeal to Nissenbaum's framework of privacy as contextual integrity, it would seem that there is no breach of privacy in terms of norms of appropriateness. However, Jane gave the credit union information about herself for use in one context, namely, to make a decision about whether or not she should be granted a mortgage for her condominium. She was also assured that the information given to the credit union would not be exchanged with a third party without first getting Jane's explicit consent. So, no information about Jane was either exchanged or cross-referenced between external databases; that is, there is no breach of the norms of distribution (in Nissenbaum's model, described in Section 5.2.5).

However, it is unclear whether the credit union had agreed not to use the information it now has in its databases about Jane for certain in-house analyses. Although Jane voluntarily gave the credit union information about her annual salary, previous loans, and so forth, she gave each piece of information for a specific purpose and use, in order that the credit union could make a meaningful determination about Jane's request for a mortgage. However, it is by no means clear that Jane authorized the credit union to use disparate pieces of that information for more general data mining analyses that would reveal patterns involving Jane that neither she nor the credit union could have anticipated at the outset. Using Jane's information for this purpose would now raise questions about appropriateness in the context involving Jane and the XYZ Credit Union.

The mining of personal data in Jane's case is controversial from a privacy perspective for several reasons. For one thing, the information generated by the data mining algorithms, suggesting that Jane is someone likely to start her own business, which would also likely lead to her declaring bankruptcy, was not information that was explicit in any of the data (records) about Jane per se; rather, it was implicit in patterns of data about people similar to Jane in certain respects but also vastly different from her in other respects. For another thing, Jane's case illustrates how data mining can generate new categories and groups such that the people whom the data mining analysis identifies with those groups would very likely have no idea that they would be included as members. And we have seen that, in the case of Jane, certain decisions can be made about members of these newly generated groups simply by virtue of those individuals being identified as members. For example, it is doubtful that Jane would have known that she was a member of a group of professional individuals likely to start a business and that she was a member of a group whose businesses were likely to end in bankruptcy. The discovery of such groups is, of course, a result of the use of data mining tools. Even though no information about Jane was exchanged with databases outside XYZ, the credit union did use information about Jane internally in a way that she had not explicitly authorized.
And it is in this sense (unauthorized internal use by data users) that many believe data mining raises serious concerns for personal privacy. Note also that even if Jane had been granted the mortgage she requested, the credit union's data mining practices would still have raised privacy concerns with respect to the contextual integrity of her personal information. Jane was merely one of many credit union customers who had voluntarily given certain personal information about themselves to XYZ for use in one context (in this example, a mortgage request) and subsequently had that information used in ways that they did not specifically authorize.
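Nissenbaum's norms of appropriateness can also be made concrete in code: tag each datum with the context it was collected for, and refuse any other use. The sketch below is purely illustrative; the PersonalDatum class and the use function are hypothetical names, not part of any system described in the reading.

```python
# Illustrative sketch of purpose-limited data use, in the spirit of
# Nissenbaum's contextual integrity. All names here are hypothetical.

from dataclasses import dataclass

@dataclass
class PersonalDatum:
    value: object
    collected_for: str  # the context/purpose the subject consented to

def use(datum: PersonalDatum, purpose: str) -> object:
    """Release the value only for the purpose it was collected for."""
    if purpose != datum.collected_for:
        raise PermissionError(
            f"collected for {datum.collected_for!r}, requested for {purpose!r}"
        )
    return datum.value

salary = PersonalDatum(95_000, collected_for="mortgage decision")
use(salary, "mortgage decision")  # fine: matches the original context

try:
    use(salary, "data mining")  # secondary, unauthorized in-house use
except PermissionError as err:
    print("Blocked:", err)
```

On this model, XYZ's routine data mining run is exactly the blocked call: the same data, used internally, but outside the context for which Jane supplied it.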