How can technology be exploited?

PowerPoint is designed to confer authority on the presenter, but giving a third or fourth grader that sense of presumed authority is often counterproductive. The PowerPoint aesthetic of bullet points does not easily encourage the give-and-take of ideas, some of them messy and unformed. The opportunity here is to acknowledge that PowerPoint, like so many other computational technologies, is not just a tool but an evocative object that affects our habits of mind.

We need to meet the challenge of using computers to develop the kinds of mind tools that will support the most appropriate and stimulating conversations possible in elementary and middle schools. But the simple importation of a technology perfectly designed for the sociology of the boardroom does not meet that challenge.

If a technology as simple as PowerPoint can raise such difficult questions, how are people going to cope with the really complex issues waiting for us down the road—questions that go far more to the heart of what we consider our specific rights and responsibilities as human beings? Would we want, for example, to replace a human being with a robot nanny? A robot nanny would be more interactive and stimulating than television, the technology that today serves as a caretaker stand-in for many children.

Indeed, the robot nanny might be more interactive and stimulating than many human beings. Yet the idea of a child bonding with a robot that presents itself as a companion seems chilling.

We are ill prepared for the new psychological world we are creating. The image of the nanny robot raises a question: Is such a robot capable of loving us? Let me turn that question around. In Steven Spielberg's film A.I. Artificial Intelligence, the robot boy David expresses his love to a woman who has adopted him as her child. In the discussions that followed the release of the film, emphasis usually fell on the question of whether such a robot could really be developed. Was this technically feasible? And if it were feasible, how long would we have to wait for it?

The question is not what computers can do or what computers will be like in the future, but rather, what we will be like. What we need to ask is not whether robots will be able to love us but rather why we might love robots. Some things are already clear. We create robots in our own image, we connect with them easily, and then we become vulnerable to the emotional power of that connection.

That readiness to connect is not surprising; evolution has programmed us to respond to creatures that appear to make eye contact and track our motion as though they were sentient. Interestingly, in the so-called theory of object relations in psychoanalysis, the "objects" people form relationships with have always been other people. From that point of view, in the computer we have created a very powerful object: an object that offers the illusion of companionship without the demands of intimacy, an object that allows you to be a loner and yet never be alone.

In this sense, computers add a new dimension to the power of the traditional teddy bear or security blanket. So how exactly do the robot toys that you are describing differ from traditional toys? Well, if a child plays with a Raggedy Ann or a Barbie doll or a toy soldier, the child can use the doll to work through whatever is on his or her mind.

Some days, the child might need the toy soldier to fight a battle; other days, the child might need the doll to sit quietly and serve as a confidante. Some days, Barbie gets to attend a tea party; other days, she needs to be punished. Relational robots present themselves quite differently: you might say that they seem to have their own lives, psychologies, and needs. Indeed, for this reason, some children tire easily of the robots; they simply are not flexible enough to accommodate childhood fantasies.

These children prefer to play with hand puppets and will choose simple robots over complicated ones. If we can relate to machines as psychological beings, do we have a moral responsibility to them?

Children who nurture a robot, and feel nurtured by it in return, often feel that they owe it something: some loyalty, some respect. So once again, I want to turn your question around. The sight of children and the elderly exchanging tenderness with robotic pets brings philosophy down to earth. In the end, the question is not whether children will come to love their toy robots more than their parents, but what will loving itself come to mean?

So could machines take over specific managerial functions? For example, might it be better to be fired by a robot, one that could deliver the message neutrally: "It is not good for the organization for you to be doing X right now"? We need to know what business functions can be better served by a machine. There are aspects of training that machines excel at, such as providing information, but there are aspects of mentoring that are about encouragement and creating a relationship, so you might want to have a person in that role.

The remainder of this article focuses on the relationship between information technology and privacy. We will also discuss the role of emerging technologies in this debate, and account for the way in which moral debates are themselves affected by IT.

Discussions about privacy are intertwined with the use of technology. The publication that began the debate about privacy in the Western world was occasioned by the introduction of the newspaper printing press and photography: Samuel D. Warren and Louis Brandeis published "The Right to Privacy" in the Harvard Law Review in 1890, partly in protest against the intrusive activities of the journalists of those days. Since the publication of that article, the debate about privacy has been fuelled by claims regarding the right of individuals to determine the extent to which others have access to them (Westin 1967) and claims regarding the right of society to know about individuals.

Information being a cornerstone of access to individuals, the privacy debate has co-evolved with — and in response to — the development of information technology. It is therefore difficult to conceive of the notions of privacy and discussions about data protection as separate from the way computers, the Internet, mobile computing and the many applications of these basic technologies have evolved. Inspired by subsequent developments in U.S. law, a distinction can be made between constitutional (or decisional) privacy and tort (or informational) privacy; the latter concerns the interest of individuals in exercising control over access to information about themselves. Think here, for instance, about information disclosed on Facebook or other social media.

All too easily, such information might be beyond the control of the individual. Statements about privacy can be either descriptive or normative, depending on whether they are used to describe the way people define situations and conditions of privacy and the way they value them, or are used to indicate that there ought to be constraints on the use of information or information processing. These conditions or constraints typically involve personal information regarding individuals, or ways of information processing that may affect individuals.

Informational privacy in a normative sense refers typically to a non-absolute moral right of persons to have direct or indirect control over access to (1) information about oneself, (2) situations in which others could acquire information about oneself, and (3) technology that can be used to generate, process or disseminate information about oneself.

Debates about privacy almost always revolve around new technology, ranging from genetics, the extensive study of bio-markers, brain imaging, drones, wearable sensors and sensor networks, social media, smart phones and closed-circuit television to government cybersecurity programs, direct marketing, RFID tags, Big Data, head-mounted displays and search engines.

One reaction to these developments is that we have zero privacy in the digital age, that there is no way we can protect it, and that we should therefore get used to the new world and get over it. The other reaction is that our privacy is more important than ever and that we can and must attempt to protect it. In the philosophical literature, reductionist accounts at one end of the spectrum argue that privacy claims are really about other values and other things that matter from a moral point of view. According to these views the value of privacy is reducible to these other values or sources of value (Thomson 1975). Proposals that have been defended along these lines mention property rights, security, autonomy, intimacy or friendship, democracy, liberty, dignity, or utility and economic value.

Reductionist accounts hold that the importance of privacy should be explained and its meaning clarified in terms of those other values and sources of value. At the other end of the spectrum, non-reductionist accounts hold that privacy is valuable in itself, its value and importance not derived from other considerations; views that construe privacy and the personal sphere of life as a human right are an example of this non-reductionist conception. More recently a type of privacy account has been proposed in relation to new information technology, which acknowledges that there is a cluster of related moral claims underlying appeals to privacy, but maintains that there is no single essential core of privacy concerns.

This approach is referred to as cluster accounts (DeCew; Solove; van den Hoven; Allen; Nissenbaum). From a descriptive perspective, a recent further addition to the body of privacy accounts is the family of epistemic accounts, where the notion of privacy is analyzed primarily in terms of knowledge or other epistemic states.

An important aspect of this conception of having privacy is that it is seen as a relation (Rubel; Matheson; Blaauw) with three argument places: a subject S, a set of propositions P, and a set of individuals I. Here S is the subject who has a certain degree of privacy, P is composed of the propositions the subject wants to keep private, and I is composed of the individuals with respect to whom S wants to keep those propositions private. Another distinction that is useful to make is the one between a European and a US American approach. A bibliometric study suggests that the two approaches are separate in the literature.
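To make the three-place structure concrete, the epistemic reading can be put formally. The following is an illustrative sketch of the idea, not a formula taken from the cited accounts:

```latex
% S has privacy in P with respect to I iff no individual in I
% knows any of the personal propositions in P.
\[
\mathrm{Priv}(S, P, I) \iff \forall i \in I \;\; \forall p \in P \;\; \neg K_i\, p
\]
% Here $K_i\, p$ reads ``individual $i$ knows proposition $p$''; the degree
% of S's privacy can then be measured by how few propositions in P are
% known by members of I.
```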

In discussing the relationship of privacy matters with technology, the notion of data protection is most helpful, since it leads to a relatively clear picture of what the object of protection is and by which technical means the data can be protected. At the same time it invites answers to the question why the data ought to be protected, pointing to a number of distinctive moral grounds on the basis of which technical, legal and institutional protection of personal data can be justified.

Informational privacy is thus recast in terms of the protection of personal data (van den Hoven). Personal information or data is information or data that is linked or can be linked to individual persons. In addition, personal data can also be more implicit, in the form of behavioural data (for example from social media) that can be linked to individuals.

Personal data can be contrasted with data that is considered sensitive, valuable or important for other reasons, such as secret recipes, financial data, or military intelligence. Data used to secure other information, such as passwords, are not considered here: although such security measures (passwords) may contribute to privacy, their protection is only instrumental to the protection of other, more private information, and the quality of such security measures is therefore out of the scope of our considerations here.

A relevant distinction that has been made in philosophical semantics is that between the referential and the attributive use of descriptive labels of persons (van den Hoven). Personal data is defined in the law as data that can be linked with a natural person. There are two ways in which this link can be made: a referential mode, in which a description is used to pick out a particular person the speaker is acquainted with, and a non-referential (attributive) mode, in which a description applies to whoever happens to satisfy it. In the attributive case, the user of the description is not — and may never be — acquainted with the person he is talking about or intends to refer to.

The following types of moral reasons for the protection of personal data, and for providing direct or indirect control over access to those data by others, can be distinguished (van den Hoven): the prevention of harm, informational inequality, informational injustice and discrimination, and encroachment on moral autonomy and human dignity. These considerations all provide good moral reasons for limiting and constraining access to personal data and providing individuals with control over their data.

Acknowledging that there are moral reasons for protecting personal data, data protection laws are in force in almost all countries. The basic moral principle underlying these laws is the requirement of informed consent for processing by the data subject, providing the subject, at least in principle, with control over potential negative effects as discussed above. Furthermore, processing of personal information requires that its purpose be specified, its use be limited, individuals be notified and allowed to correct inaccuracies, and the holder of the data be accountable to oversight authorities (OECD). The challenge with respect to privacy in the twenty-first century is to assure that technology is designed in such a way that it incorporates privacy requirements in the software, architecture, infrastructure, and work processes, making privacy violations unlikely to occur.
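To see how purpose specification and use limitation can be carried into software, consider the following minimal sketch. The class and field names are hypothetical, invented here for illustration; they are not drawn from the OECD guidelines or from any actual data-protection library:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ConsentRecord:
    """Hypothetical record tying stored personal data to an explicit purpose."""
    subject_id: str        # the data subject who gave consent
    purpose: str           # purpose specification: why the data may be used
    granted_at: datetime
    expires_at: datetime   # use limitation: consent is not open-ended

    def permits(self, requested_purpose: str, now: datetime) -> bool:
        # Access is allowed only for the stated purpose, within the window.
        return (requested_purpose == self.purpose
                and self.granted_at <= now < self.expires_at)

# Consent given for order fulfilment does not cover marketing.
consent = ConsentRecord("user-42", "order-fulfilment",
                        datetime(2024, 1, 1), datetime(2025, 1, 1))
print(consent.permits("order-fulfilment", datetime(2024, 6, 1)))  # True
print(consent.permits("marketing", datetime(2024, 6, 1)))         # False
```

The point of the sketch is that the purpose travels with the data, so a use outside the consented purpose can be refused mechanically rather than by policy alone.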

New generations of privacy regulations (e.g., the European GDPR) now standardly require a "privacy by design" approach: the data ecosystems and socio-technical systems, supply chains, organisations (including incentive structures, business processes, and technical hardware and software) and the training of personnel should all be designed in such a way that the likelihood of privacy violations is as low as possible.

The impact of some of these new technologies, with a particular focus on information technology, is discussed in this section. Information technology here refers to automated systems for storing, processing, and distributing information; typically, this involves the use of computers and communication networks. The amount of information that can be stored or processed in an information system depends on the technology used, and this holds for storage capacity, processing capacity, and communication bandwidth.

We are now capable of storing and processing data on the exabyte level (one exabyte is 10^18 bytes, or a million terabytes). These developments have fundamentally changed our practices of information provisioning, and the rapidity of the changes has increased the need for careful consideration of the desirability of their effects.

Some even speak of a digital revolution as a technological leap similar to the industrial revolution, or of a digital revolution as a revolution in the understanding of human nature and the world, similar to the revolutions of Copernicus, Darwin and Freud (Floridi). In both the technical and the epistemic sense, emphasis has been put on connectivity and interaction.

Physical space has become less important, information is ubiquitous, and social relations have adapted as well. As connectivity increases access to information, it also increases the possibility for agents to act based on the new sources of information.

When these sources contain personal information, risks of harm, inequality, discrimination, and loss of autonomy easily emerge. For example, your enemies may have less difficulty finding out where you are, users may be tempted to give up privacy for perceived benefits in online environments, and employers may use online information to avoid hiring certain groups of people. Furthermore, systems rather than users may decide which information is displayed, thus confronting users only with news that matches their profiles.

Although the technology operates on a device level, information technology consists of a complex system of socio-technical practices, and its context of use forms the basis for discussing its role in changing possibilities for accessing information, and thereby impacting privacy. We will discuss some specific developments and their impact in the following sections. The Internet, originally conceived in the 1960s and developed in the 1970s as a scientific network for exchanging information, was not designed for the purpose of separating information flows (Michener). The World Wide Web of today was not foreseen, and neither was the possibility of misuse of the Internet.

Social network sites emerged for use within a community of people who knew each other in real life — at first, mostly in academic settings — rather than being developed for a worldwide community of users (Ellison). It was assumed that sharing with close friends would not cause any harm, and privacy and security only appeared on the agenda when the network grew larger.

This means that privacy concerns often had to be dealt with as add-ons rather than by design. A major theme in the discussion of Internet privacy revolves around the use of cookies (Palmer): small pieces of data that web sites store on the user's computer, for instance to remember preferences or keep a user logged in. However, some cookies can be used to track the user across multiple web sites (tracking cookies), enabling, for example, advertisements for a product the user has recently viewed on a totally different site.
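As a concrete illustration of the difference between a first-party and a tracking cookie, here is a small sketch using Python's standard http.cookies module; the cookie names and domains are hypothetical:

```python
from http.cookies import SimpleCookie

# First-party cookie: set by the site the user is actually visiting and
# sent back only to that site (e.g., to keep the user logged in).
session = SimpleCookie()
session["session_id"] = "abc123"
session["session_id"]["domain"] = "shop.example"   # hypothetical domain
session["session_id"]["path"] = "/"
print(session.output())   # a Set-Cookie header scoped to shop.example

# Third-party (tracking) cookie: set by an ad network whose content is
# embedded in many unrelated sites. Because it is scoped to the network's
# own domain, the browser returns it on every page that embeds the
# network, letting the network link one user's visits across those sites.
tracker = SimpleCookie()
tracker["visitor_id"] = "u-98765"
tracker["visitor_id"]["domain"] = "ads.tracker.example"   # hypothetical
tracker["visitor_id"]["path"] = "/"
print(tracker.output())   # the same header shape, but for the tracker
```

The privacy-relevant asymmetry is not in the header format, which is identical in both cases, but in how widely the setting domain is embedded across the web.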

Again, it is not always clear what the generated information is used for. Similarly, features of social network sites embedded in other sites (e.g., "like" buttons) may allow the social network site to identify the sites visited by the user. The shift to cloud computing raises similar questions. Previously, whereas information would be available from the web, user data and programs would still be stored locally, preventing program vendors from having access to the data and usage statistics.

In cloud computing, both data and programs are online (in the cloud), and it is not always clear what the user-generated and system-generated data are used for. Moreover, as data are located elsewhere in the world, it is not even always obvious which law is applicable, and which authorities can demand access to the data.

Data gathered by online services and apps such as search engines and games are of particular concern here. Which data are used and communicated by applications (browsing history, contact lists, etc.) is not always clear, and even when it is, the only choice available to the user may be not to use the application. Some special features of Internet privacy (social media and big data) are discussed in the following sections. Social media pose additional challenges: the question is not merely about the moral reasons for limiting access to information, it is also about the moral reasons for limiting the invitations to users to submit all kinds of personal information.

Users are tempted to exchange their personal data for the benefits of using services, and provide both this data and their attention as payment for the services. When the service is free, the data is needed as a form of payment.

One way of limiting the temptation of users to share is requiring default privacy settings to be strict. At the same time, such restrictions limit the value and usability of the social network sites themselves, and may reduce positive effects of such services. A particular example of privacy-friendly defaults is the opt-in as opposed to the opt-out approach.
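A minimal sketch of what opt-in defaults can look like in code follows; the setting names are hypothetical and not taken from any actual platform:

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    """Hypothetical account settings with privacy-friendly (opt-in) defaults."""
    profile_public: bool = False     # nothing is shared until the user opts in
    share_location: bool = False
    personalized_ads: bool = False

    def opt_in(self, setting: str) -> None:
        # Sharing is enabled only by an explicit user action, never by default.
        if setting not in self.__dataclass_fields__:
            raise ValueError(f"unknown setting: {setting}")
        setattr(self, setting, True)

# A new account starts fully private; the user must act to share anything.
settings = PrivacySettings()
settings.opt_in("share_location")
print(settings)   # share_location=True, all other settings remain False
```

Under an opt-out regime the same fields would simply default to True, which is exactly what makes the choice of defaults a moral rather than a merely technical decision.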

Consider the problem of fake news, for instance. Misinformation on the internet has severely impacted political campaigns and electoral results in several countries in recent years, including the U.S. Then there is the problem of infringement of privacy, the recent Facebook data scandal being the most prominent example.

There are always two sides to a coin, and that is the case with information technology too. While it allows for the faster spread of information, safe spaces, and access to information for all, it also enables the spread of misinformation that can instigate riots, as well as online bullying and easier access to child pornography.

While technology alone cannot mitigate these evils, there are initiatives that technology companies can undertake to improve society. India, for example, has seen a rise in mob violence in recent months, much of it linked to the spread of misinformation on Facebook-owned WhatsApp.

Even news channels have fallen prey to false stories circulated via social media. Education can play a key role in tackling the menace of fake news, and WhatsApp has started a user education drive in India at the grassroots level: it plans to organize workshops in several Indian cities, educating community leaders on how to distinguish between news and opinion. The less educated, and people over the age of 55 who are fairly new to the world of social media, are particularly susceptible to misinformation.

Big companies have the resources to invest in user education at the ground level, but even a small social media company can be proactive in educating people without spending too many resources; for instance, it could build an onboarding process with videos and tutorials on how to spot fake news. Finally, cutting-edge technology invites new cyber risks of its own: emerging technologies like the Internet of Things (IoT) and 5G create opportunities for businesses to implement innovative solutions that boost productivity, expedite data flow for faster decisions, and improve customer experiences, but they also create new openings for attackers, and security is never one size fits all.



