A few observations and a request.
Technology development and policy (public or corporate) are quite separable.
Chasing both at the same time often serves neither well. Technology
developers can best help policy makers by being very clear about the
consequences of their innovations, so that policy makers can avoid
unintended ones.
In the early stages of development I find it better to talk of consequences
rather than risks because risk implies harm and "harm" is a judgment call
(e.g., chemotherapy harms cells in a good way). In later stages of
development technologists can also provide input on issues affecting
diffusion or adoption of their innovations.
For SNA (social network analysis) the discussion can quickly degenerate
into a privacy debate. My
experience in international consulting is that views on "privacy" vary
greatly from country to country, and that most of the world doesn't share
the U.S. view. Privacy is constantly traded for goods and services. For
example, an appendectomy is invasive and personal, but most patients don't
consider it an invasion of their privacy. (Sidebar: For an interesting
discussion of the use of information transparency in policy making, see
"Regulation by Shame," Mary Graham, The Atlantic Monthly, April 2000.)
Here's a request:
What are common, unintended consequences of SNA?
What are the barriers to widespread adoption in corporate settings?
----- Original Message -----
From: "Sam Friedman" <[log in to unmask]>
To: <[log in to unmask]>
Sent: Thursday, August 15, 2002 7:58 AM
Subject: Re: Social Network Research Partners
> I for one have ethical qualms about working for an organization whose
> stated aim is to kill people