“Ideology of use: How software is deployed, particularly in a world that is hypersensitive to global security concerns, has much farther reaching implications and consequences than the ideologies used to create and select it.” (PNNOnline, February 2003)
Open source software has intensified the ideological debate over what technology to deploy in a given circumstance. The public sector, always price-sensitive to any technology solution, has embraced the idea of open source as a cheaper alternative to commercial applications. Open source refers to software whose source code is available to the general public for use and/or modification from its original design, free of charge. It is typically created as a collaborative effort in which programmers improve upon the code and share the changes within the community. There is also a strong ideological lobby that sees open source as the alternative to commercial dominance by any one player in the software industry, and as an equalizer with the potential to wrest control away from US predominance in that industry.
I wonder, however, whether we aren't having the wrong debate about technology and ideology, particularly in these troubling times. Ideology and technology cohabit the same plane of existence, but on three distinct levels:
- Development ideology: How is the technology developed?
- Selection ideology: Why is the technology chosen?
- Ideology of use: What is the technology ultimately used for?
My experience is that the most important and thorniest ideological consideration is the ideology of use. Unfortunately, far too much time is spent obsessing about the ideology of software selection to meet a particular need and far too little time considering the effects of its application. How software is deployed, particularly in a world that is hypersensitive to global security concerns, has much farther-reaching implications and consequences than the ideologies used to create and select it.
Development Ideology: How is the technology developed?
Ideological considerations occur early in the development process. Is software developed for free, on a commercial basis or as a hybrid of the two? Is an application designed to meet a social mission, a personal interest or a business requirement? On the legal front, should applications be fully available to the public for the purposes of modification, or hidden behind proprietary legal constructs? From a standards point of view, are considerations purely technical, or are the needs of the disabled and disadvantaged taken into account when designing new technology specifications?
Developers ultimately decide why they build applications. They decide whether they wish to generate profit, simply sustain ongoing development and maintenance costs, or whether contributing a piece of code to the world is payment enough for their efforts. In our current reality, lower price points, mass distribution networks and a proliferation of useful tool sets have given software developers a far more significant range of ideological decisions to make when they create software. They have a plethora of commercial and open source languages, tools, operating systems and even legal frameworks to choose from in order to develop and distribute their creations.
In this new environment it is also far easier to develop tools for the social sector than it ever has been. The advent of the PC in the 1980s made technology affordable for the first time to many nonprofits. The PC created a market for the social sector that in large part did not exist in the costlier mainframe context. In the 1990s, the Internet once again lowered the barriers by providing a technology that allowed nonprofits to reach out and extend their constituencies at a far lower cost. Open source tools have unlocked even more development opportunities for this market. They have spurred commercial software developers to rethink their price structures in order not to lose this relatively new market, consisting of literally millions of social-purpose nonprofits, educational institutions and health facilities globally.
Developers of commercial software maintain a straightforward profit-based ideology for any market they sell to. However, that does not preclude them from doing pro bono work or developing applications for the social sector that are heavily discounted or distributed freely. Salesforce.com has a foundation and distributes discounted and free licenses of its products to nonprofits. Techsoup.org provides a variety of software aggregated from different vendors who are interested in providing discounted commercial applications to the nonprofit sector. Open source developers operate on a number of levels as well. Some have strong ideological convictions that tools should be developed free of charge for the social sector as well as for any non-commercial user. Others are driven by a need to limit the dominance of a single, perceived commercial player. Still others simply wish to demonstrate their creativity to the world and to build a better mousetrap.
Both the commercial and the open source developer communities may operate on development ideologies that are purely technical, focusing on building software tools for other developers that allow them, in turn, to build end-user tools. Developers may also choose to build generic end-user products that meet the needs of any sector. Word processors and spreadsheet products, for example, can be built using either commercial or open source tools, and following either commercial or open source principles of distribution. All sectors, including the social sector, have a need for these basic tools in whatever form they are built.
The social sector also needs specialized, mission-focused applications, which are often not available as mass-produced, shrink-wrapped products. There are software designers who follow a development ideology specifically focused on designing tools for the social sector. As part of my philanthropic activities with the Open Society Institute, I am the co-founder and President of a 501(c)(3) called Aspiration. Aspiration helps mission-focused software tools reach a broader market of nonprofit users by marrying selected tool builders with nonprofit technology support organizations. The objective is to develop better strategies of cooperation in order to enhance the successful design, rollout and support of applications to the social sector. Aspiration's mission mandates an ideological position supporting only tools that directly satisfy social mission agendas. However, it works with tool builders who develop their applications using a variety of tools and methodologies. For Aspiration the tools are secondary to the purpose of developing the applications. The supported applications reflect this diversity:
- Martus: an open source human rights monitoring tool.
- FACTS: a food distribution and management tool for humanitarian relief organizations, built by Microsoft.
- ActionStudio: a web publishing and advocacy tool for grassroots organizations, developed in Macromedia's Dreamweaver and Cold Fusion.
ActionStudio introduces us to a lesser-known cousin of open source, called community source. "Community software licenses" essentially facilitate the same goals as the open source model. Proprietary software development tools (Macromedia's Cold Fusion, Microsoft ASP, etc.) are used as the underlying technology to build freely distributable, open source end-user applications. The social purpose applications developed with these tools are made available to other social purpose organizations through a community source license. All coding improvements are given back to the community for adoption and enhancement by others, which accelerates the creation, adoption and ongoing evolution of software to effect social change. Examples of this include NPower's TechAtlas, a technology assessment and planning tool. This approach adopts the spirit and intent of open source but makes the underlying technology tool used to develop socially responsible applications a secondary consideration.
Finally, there are also destructive software development ideologies. Some developers create viruses, worms, trojans and other harmful applications for no other purpose than to cause disruption. Destructive developer ideologies aside, commercial, non-commercial and socially responsible development ideologies are all equally valid. They represent the products of their developers' creative interests in solving a particular problem.
Selection Ideology: Why is the technology chosen?
As a user, how do I choose a software application that best meets my particular requirements? Unfortunately, the questionable practice of applying software development ideology as the primary decision point in the software selection process is becoming far too common and has created an unnecessary complication for nonprofits trying to employ technology to meet their mission objectives. The idea that software selection choices should be made based primarily on the premise of free versus commercial technology and open source versus proprietary technology is entirely misguided.
I profess to be an agnostic when it comes to software selection ideology. Here I diverge from those who equate open source with open society. Years spent managing systems operations and satisfying the needs of real users meeting long-term organizational objectives have made me a pure pragmatist in this regard. Personally, I am very happy that software developers make decisions to create free, commercial, proprietary and open source solutions. I applaud their various ideologies in developing these products, because it gives me what I want most: the freedom to choose the best solution from a diversity of options for the job at hand. When I make a software selection decision, ideology goes right out the window; or rather, the operational considerations behind an ideological choice (cheaper, more secure, etc.) become part of the consideration along with many other variables that I must prioritize. My decision is based on these operational prerequisites:
- Do the application’s functions meet the user specifications?
- Do the design considerations meet project requirements?
- Do the cost considerations meet project requirements?
- Do the security considerations meet project requirements?
- Do the networking considerations meet the project requirements?
- Are the necessary resources there to program or deploy the application?
- Are the necessary resources there to maintain the application?
- Are the necessary training and documentation resources available to satisfy project requirements?
- Is the hardware available and appropriate to meet the needs of the software application?
- Is there a facility to convert data?
- Are the necessary integration points there if the application must interface with other applications?
- What is the evolutionary trajectory of the software I choose?
Answering these questions may lead me to select applications built on particular development ideologies. However, the selection process is based purely on an objective set of operational criteria to deliver the most effective solution to satisfy a stated need.
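By way of illustration only, the checklist above can be reduced to a simple weighted scoring exercise. The sketch below is a hypothetical example in Python; the criteria weights, candidate names and scores are invented for the illustration and are not drawn from any real evaluation.

```python
# Hypothetical weighted-criteria scoring for software selection.
# Every weight and score below is illustrative; a real evaluation would
# derive them from the project's own requirements and research.

CRITERIA_WEIGHTS = {
    "functional_fit": 5,        # meets user specifications
    "design_fit": 3,            # meets project design requirements
    "total_cost": 4,            # license plus deployment and support costs
    "security": 4,
    "deployment_resources": 3,  # staff available to program or deploy
    "maintenance_resources": 3,
    "training_and_docs": 2,
    "data_conversion": 2,
    "integration": 2,
    "evolution": 2,             # evolutionary trajectory of the product
}


def weighted_score(scores: dict[str, int]) -> int:
    """Sum of weight * score (scores on a 0-5 scale) across all criteria."""
    return sum(CRITERIA_WEIGHTS[c] * scores.get(c, 0) for c in CRITERIA_WEIGHTS)


# Two hypothetical candidates; note that neither records how the software was built.
open_source_candidate = {
    "functional_fit": 4, "design_fit": 4, "total_cost": 5, "security": 3,
    "deployment_resources": 2, "maintenance_resources": 2, "training_and_docs": 2,
    "data_conversion": 3, "integration": 3, "evolution": 4,
}
commercial_candidate = {
    "functional_fit": 5, "design_fit": 4, "total_cost": 3, "security": 4,
    "deployment_resources": 4, "maintenance_resources": 4, "training_and_docs": 4,
    "data_conversion": 3, "integration": 4, "evolution": 3,
}

print("open source:", weighted_score(open_source_candidate))
print("commercial:", weighted_score(commercial_candidate))
```

The point of the sketch is not the arithmetic but the absence of a "development ideology" row: ideology enters only indirectly, through criteria such as cost, security or maintainability.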
When I work with the NGO community, I know price is a very sensitive factor that can influence the selection of applications. However, I also know that ease of installation and use and continued high-touch support are also important factors to take into consideration when satisfying this sector. I know that when applications don't work in this environment and there is no support around to provide basic assistance, people become very reluctant to use the technology again, much more so than in the commercial context. I weigh these and other factors in making final technology selection decisions, as any good project manager would.
I can think of three good reasons not to select software based primarily on an ideological preference:
A software implementation is a costly and complex affair that involves a sophisticated behavioral interplay between people and technology. Often it means changing the way departments or whole institutions do things as they adapt to frequently less-than-intuitive automated processes. Most people are naturally resistant to these changes. Technologists who manage software implementations will tell you that there are many pitfalls to watch out for even in the best of circumstances. Choosing an application for any reason other than how it meets specified business requirements is a tremendous gamble.
If you were to build a house, would you select your tools based on the alloys they were made from? Their craftsmanship? Their cost? The method that went into forging them? Most probably your primary consideration would be to select the right tools to complete the building project. Craftsmanship, cost, alloys and method of creation might all be considerations, but these factors would be weighted based on how they contributed to the tool's success in helping you complete your building project. As attractive as it might be, you would not use a hammer forged on Thor's anvil if what you needed was a screwdriver from The Home Depot.
Many organizations find open source applications more attractive than commercial ones because they are free to use. Often there is never any real plan to actually tinker with the application code to modify how it works, which is a major benefit of open source products. Limited technical resources already constrain further development of a product in the nonprofit environment. If cost of purchase is the main motivation, let the buyer beware. The real costs of any application deployment, beyond initial purchase, relate to installation, training, data conversion, ongoing maintenance, support and new version upgrades. These must all be taken into consideration whether using commercial or open source applications. What is free now may also have a cost later. The once free, open source StarOffice revision now has a price attached to it. This often happens as an application gains significant market share: the need arises to better support its continued development and maintenance for an increasing and more demanding end-user market in an organized and timely fashion.
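To make the "free now, cost later" point concrete, here is a back-of-the-envelope total cost of ownership sketch. Every figure in it is hypothetical and will vary widely from project to project; the function and scenario names are invented for the illustration.

```python
# Back-of-the-envelope total cost of ownership; every figure is hypothetical.
def total_cost(license_fee: int, installation: int, training: int,
               data_conversion: int, annual_support: int,
               annual_upgrades: int, years: int = 3) -> int:
    """One-time deployment costs plus recurring costs over the given period."""
    one_time = license_fee + installation + training + data_conversion
    recurring = (annual_support + annual_upgrades) * years
    return one_time + recurring


# A "free" open source package with paid consulting and support (hypothetical).
free_to_acquire = total_cost(license_fee=0, installation=4000, training=3000,
                             data_conversion=2000, annual_support=2500,
                             annual_upgrades=500)

# A discounted commercial package with bundled vendor support (hypothetical).
discounted_commercial = total_cost(license_fee=1500, installation=2500,
                                   training=2000, data_conversion=2000,
                                   annual_support=1500, annual_upgrades=1000)

# Under these made-up numbers the license fee is a small slice of either total.
print(free_to_acquire, discounted_commercial)
```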
I am also not convinced that the "social value" case that some argue for open source software is compelling enough to influence a selection decision (e.g. that because open source is free and open to redesign, nonprofits end up with access to richer, less costly and more reliable applications, freeing themselves up to spend their limited resources elsewhere). In fact, I could argue just the opposite. Consider this:
The social benefit of most open source applications lies primarily in their free use and less so in their extensibility. The benefit of free, modifiable code would constitute a far more significant social benefit if most nonprofits took advantage of it, but most cannot because of resource constraints. There are also training and documentation costs associated with any new and significant software modification. Commercial software is typically closed and de facto has an expense connected with its purchase. However, it is often deeply discounted for the nonprofit and educational environments, although not everywhere in the world, as it should be.
Commercial software developers that discount for their nonprofit customer base may create far more social value if they also convert some of their commercial sales revenue directly to philanthropic purposes. I deal with philanthropic institutions on a regular basis. Some are funded by significant commercial software profits and are contributing to the global fight against AIDS, the reform of microlending and economic development, training and education, library support, children's programs, media development and a plethora of other social value activities. The Gates Foundation has the largest endowment of any US foundation, dwarfing the Ford, Rockefeller and MacArthur endowments and the Open Society Institute's yearly allocations. It must allocate at least 5% of that endowment, about one billion dollars, in grant funding annually. One cannot ignore the direct connection between revenue generated from commercial software and the work of the Gates Foundation, the Microsoft Community Affairs Department, the Time-Warner AOL Foundation, The Real Foundation and Glaser Family Fund, The Paul Allen Foundation, etc. I am familiar with all these institutions, the quality of their work and the dedication of the people who are committed to the value proposition of assisting civil society. It is disingenuous to compare the social value of commercial and open source applications without recognizing this other dimension of social benefit that accrues from commercial application development.
Applying ideology to the selection process in either a commercial or open source context is a tricky business. Personally, I have worked with and supported a mixed environment of commercial and open source applications for years. Let software developers choose the ideology they are most comfortable developing applications in. When it comes to selecting an application to meet a particular user need, always select the application based solely on the operational criteria that best satisfy the need.
Ideology of Use: What is the technology ultimately used for?
The deployment of any technology is by far the most interesting ideological concern but often the one least focused upon. Most software is built to solve a particular problem or to create a new functionality. All technology development is informed by values. However, a technology tool, once developed, can be applied in many ways that reinforce the original intention, run counter to it or spur new possibilities never thought of by the developer. Ideological debates around technology development and selection are easier to have because the issues are far more limited, and revolve around technology choices and objective operational requirements. The genie is let out of the bottle only once a technology, any technology, is deployed. The ideology of use poses far more serious ethical issues than the development and selection ideologies previously discussed. Here are five examples in the current global context:
Case #1 Ideology and Terms of Use: What if I have a technology that allows me to encrypt hidden messages in a digital image and then pass them to an intended recipient who has the key to unlock the message? This application can be used by the Otpor student movement in Serbia to clandestinely pass information between members in its bid to change the autocratic Milosevic government by democratic means, or it can be used by Al Qaeda to communicate its next major terrorist attack against an innocent target. Should the usage of such tools be somehow regulated?
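To see how little technology the scenario actually requires, here is a minimal, purely illustrative sketch of the general idea: a short message is encrypted with a shared key and the ciphertext is hidden in the least significant bits of an image. It assumes Python with the widely used third-party Pillow and cryptography packages; the hide() and reveal() helpers are toy functions written for this illustration and are not any real product's implementation.

```python
# Illustrative only: symmetric encryption plus least-significant-bit (LSB)
# image steganography. Assumes the third-party Pillow and cryptography
# packages (pip install pillow cryptography); hide()/reveal() are toy helpers.
from cryptography.fernet import Fernet
from PIL import Image


def hide(image: Image.Image, payload: bytes) -> Image.Image:
    """Embed a length-prefixed payload in the LSBs of an image's raw bytes."""
    data = len(payload).to_bytes(4, "big") + payload
    bits = [(byte >> i) & 1 for byte in data for i in range(7, -1, -1)]
    raw = bytearray(image.tobytes())
    if len(bits) > len(raw):
        raise ValueError("image too small for payload")
    for i, bit in enumerate(bits):
        raw[i] = (raw[i] & 0xFE) | bit
    return Image.frombytes(image.mode, image.size, bytes(raw))


def reveal(image: Image.Image) -> bytes:
    """Recover a length-prefixed payload from the LSBs of an image."""
    raw = image.tobytes()

    def read_bytes(bit_offset: int, nbytes: int) -> bytes:
        value = 0
        for i in range(nbytes * 8):
            value = (value << 1) | (raw[bit_offset + i] & 1)
        return value.to_bytes(nbytes, "big")

    length = int.from_bytes(read_bytes(0, 4), "big")
    return read_bytes(32, length)


# The sender and recipient share this key out of band.
key = Fernet.generate_key()
cover = Image.new("RGB", (200, 200), color=(120, 120, 120))
stego = hide(cover, Fernet(key).encrypt(b"example message"))
stego.save("innocuous.png")  # PNG is lossless, so the hidden bits survive
print(Fernet(key).decrypt(reveal(Image.open("innocuous.png"))))
```

The tool itself is ideologically neutral; everything interesting happens in who holds the key and what the message says.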
And what if they are regulated? Some years ago, then-Russian President Yeltsin issued a decree that the keys to all encryption designed into software and distributed in Russia must be provided to the FSB (the Russian successor to the KGB). That would cover the example above, but it would also cover a securely encrypted, open source, human rights application. A Chechen NGO in Russia using such an application to track human rights abuses would not necessarily be as fully protected by the laws in that country as a similar organization tracking abuses against Islamic citizens in the US. However, what if this application did fall into the wrong hands and was used by a Chechen terrorist organization?
Here the ethical dilemma takes on an interesting twist. Reporting the encryption keys to the appropriate authorities could put a legitimate human rights organization in jeopardy, given the anti-terrorist, anti-Chechen environment. However, not reporting the keys might allow the application to fall into the wrong hands, enabling secure encrypted communications in a country where doing so without giving the government a key is clearly illegal. What is the responsibility of the developer who makes a secure application freely available on SourceForge (the online open source software repository)?
Hacktivismo has taken a crack at this type of ethical dilemma by developing an ideology-based licensing regime, the Hacktivismo Enhanced-Source Software License.
This modified open source license regime requires that applications be used for their intended purpose: to support Hacktivismo's political agenda of asserting liberty in support of an uncensored Internet. Martus, the secure, open source human rights monitoring application referred to above, uses strengthened "anti-hacking" clauses in a standard open source software license to protect its application and users.
Making the application available with a license for intended use and clear instructions that it should be used legally in the environment in which it is deployed is probably the best solution for the developer to avoid both extremes. It creates a contract between the developer and the end user but leaves it up to the in-country user to abide by both prerequisites. Restricting the application's use in Russia altogether would probably be as ineffective as the PGP encryption software ban was here in the US. On the other hand, providing pre-assigned keys is not really an option, as neither the FSB nor the developer would have the processes and resources in place to track every user who could pull it off an open source application catalog like SourceForge.
This example is not as extreme as it sounds. Commercial vendors are making their software code available to governments in order to meet their national security concerns in light of the global terrorist threat. In making the code available, however, trust is being placed in the various governments not to abuse or exploit this information.
Case #2 Ideology and Hacktivism: Denial of service attacks have brought down major web sites like Yahoo and eBay, causing millions of dollars in lost business and annoying service disruptions. They have even precipitated arrests for criminal mischief. However, the famous Chiapas denial of service (DoS) attack attributed to the Electronic Disturbance Theater was an act of civil disobedience, commonly referred to as hacktivism. Hacktivism promotes social causes online, in this case the plight of the indigenous people of Chiapas, Mexico. In the current world context, what application of technology constitutes criminal behavior, terrorism or hacktivism/civil disobedience?
The originator of the Chiapas DoS attack argues that it was technologically full of holes. It was acknowledged as easy to get around and obviously flawed as DoS attacks go. It was designed as an act of civil disobedience to send a message clearly related to an issue of social importance. Finally, it was attributed to an organization with known credibility in the hacktivist community, a community driven to advocate for social justice through the creative use of technology. Given the new threats we face today, can we distinguish the nature and intent of these attacks by the sophistication of the software involved, the nature of the cause, the amount of damage done or the entity from which they emanate?
Just as we must be able to distinguish activism and civil disobedience from criminal behavior and environmental terrorism, we must learn to distinguish between hacktivism and cybercrime or cyberterrorism. Billions of dollars of national security technology R&D, coupled with a push to standardize privacy and surveillance laws internationally, have the potential to make the Internet a much less open and democratic place than it has been. It may be far easier to mislabel hacktivism as cyberterrorism, or at least criminal mischief, in the future. Yet activism and civil disobedience are valid forms of protest and should be protected civil liberties even as we address very valid national and global security concerns. There may well come a time when we have to include "traditional" hacktivists as arbiters of what constitutes hacktivism and what does not in a society that is more sensitive to national security concerns.
Case #3 Ideology and the Technical Fix: There are two parts to the Martus human rights application: a secure client and a secure server. The latter can sit in a different country to securely store human rights reports. The developer wishes to make Martus an open source application along with a modified open source license. However, doing so might open Martus up to dangerous hacking by those who would undermine the application and get to the human rights data it is designed to protect. Is the Hacktivismo modified licensing agreement the application's only protection against people who already have no qualms about violating human rights? Doesn't the nature of the application disqualify its submission as an open source product?
In this case, the intelligent design of the application has informed not only its use but also its security. The basic application can be modified as open source software. However, the security it uses to protect users against access to their records is the same strong encryption protocol employed by secure tools such as PGP. This encapsulated module within the Martus product cannot be modified. On the server side, the application designed to store information does nothing but authenticate users and store their data. It cannot even read the encrypted messages. There is not a whole lot of sophistication built into the server side beyond very discrete and simple tasks. The processing decisions are made on the client side. Hence there is far less reason to release the server-side software as open source, because it would not be particularly useful to build upon. The entire application speaks to both development and use ideologies focused on two objectives: making it secure enough for the human rights constituency to be able to trust it, and freely available as open source so they can afford to use it.
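The general pattern described here, where the client encrypts locally and the server does nothing more than authenticate users and store opaque blobs it cannot read, can be sketched in a few lines. The following is an illustrative sketch only, not Martus's implementation; it assumes Python with the third-party cryptography package, and the class and account names are invented for the example.

```python
# Illustrative sketch of the client-encrypts / server-only-stores pattern;
# this is NOT Martus code. Assumes the third-party cryptography package.
import hashlib
import secrets
from cryptography.fernet import Fernet


class BlindStorageServer:
    """Authenticates users and stores opaque blobs; never sees keys or plaintext."""

    def __init__(self) -> None:
        self._token_hashes: dict[str, str] = {}     # username -> hash of auth token
        self._records: dict[str, list[bytes]] = {}  # username -> encrypted blobs

    def register(self, username: str, token: str) -> None:
        self._token_hashes[username] = hashlib.sha256(token.encode()).hexdigest()
        self._records[username] = []

    def _check(self, username: str, token: str) -> None:
        if self._token_hashes.get(username) != hashlib.sha256(token.encode()).hexdigest():
            raise PermissionError("authentication failed")

    def store(self, username: str, token: str, blob: bytes) -> None:
        self._check(username, token)
        self._records[username].append(blob)  # stored as-is; the server cannot decrypt it

    def fetch(self, username: str, token: str) -> list[bytes]:
        self._check(username, token)
        return list(self._records[username])


# Client side: the encryption key never leaves the client machine.
server = BlindStorageServer()
auth_token = secrets.token_urlsafe(16)
server.register("field-monitor", auth_token)

client_key = Fernet.generate_key()
report = b"placeholder incident report text"
server.store("field-monitor", auth_token, Fernet(client_key).encrypt(report))

# Later, the client retrieves its records and decrypts them locally.
for blob in server.fetch("field-monitor", auth_token):
    print(Fernet(client_key).decrypt(blob))
```

Because the server holds only ciphertext and token hashes, publishing its code gives an attacker little to work with, which is the point the paragraph above makes about where the real sophistication lives.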
Case #4 Ideology and Destructive Technology: We assume viruses are all bad. But what if, for national security purposes, a democratic government creates a virus that infiltrates a terrorist's PC and captures his keystrokes, so that important information is uncovered that prevents an attack and saves thousands of innocent lives?
It sounds like a good idea, and is technically quite feasible. Can we be sure that such a virus does not fall into the wrong hands, or that it is not used improperly in the right hands? Just as a socially responsible application can be used for destructive purposes, so can a typically destructive application be used for benevolent purposes. The ideology of use and the user often determine the context. It is more logical to regulate applications typically used for destructive purposes than those intended for benevolent use.
Case #5 Free Market Ideology and Technology: What is the responsibility of any commercial corporation that has developed its technology in a free and democratic society not to sell this same technology to repressive governments in order to censor, secretly monitor or otherwise oppress their people? What is its obligation, once in a repressive country, not to use its software to help a government harass or detain its citizens in contravention of international conventions or treaties on human rights?
At this crucial intersection between social welfare and free enterprise we have not found the appropriate answer in many contexts. The debates around the Publish What You Pay movement, conflict diamonds, generic drugs for the developing world and breaking the technology filtering regimes of oppressive countries all have their roots in better defining the traffic lights for this intersection. Often governments are left to regulate business interests as a result of public outcry, after the damage has already been done.
Summary
Technology is neither an enabler nor a facilitator of civil society in its own right. Nor is it the decider of its own ethical or unethical use. The mechanism that ultimately decides the ideology behind any given technology is its application. It should not be surprising that software development, an area of computer science, presents the same range of ethical dilemmas that most of the other sciences do. In this new environment, which seeks to strike a balance between civil liberties and national security, we must begin focusing the software ideology debate on the more important issue of what software is developed and deployed for. Software selection should be left to the same operational criteria that have always facilitated successful application deployment: meeting a defined user need.
– Jonathan Peizer –