digital media musings

Thursday, November 30, 2006

technology meets snow

I'm going to keep this short, but Blackboard (learning management software we use that basically creates an online classroom "shell") was a lifesaver for communicating with students following Monday's snowstorm. It allowed me to:

* Teach my online class without interruption
* E-mail my face-to-face class groups with an update that the college was closed
* Administer a final exam online that had been scheduled to be given face-to-face on Wednesday. Thirteen of 15 students completed the exam during the scheduled class time, which I thought was phenomenal, given that not everyone had power and/or internet access.
* Issue surveys to two classes asking whether they wanted to present final projects as scheduled or extend class by one day to make up for the missing lab hours. (Unsurprisingly, they voted overwhelmingly for the extra time.)

You know, I do my share of complaining about technology, but computer-mediated education has its merits!

Into the wilds of IE7

"Think of some piece of technology that you have recently purchased, or a piece of software or feature of an existing piece of technology that you have recently started using. Think about why you made this decision - what factors caused the "adoption." Now - talk about your experience ... and then think about the theories we've talked about in class. Which theories help explain your behavior?"

So I bit the bullet and downloaded IE7 on one of the five computer stations I regularly use (one in my office, one in the classroom, two desktops at home, and a laptop--if anyone's counting). I wasn't terribly keen to do so, and I'm sure some of you are wondering why I use IE in the first place, since Firefox has had a better product for quite some time. The answer is that most students use Internet Explorer, and when I'm demo-ing in front of a full class of 30, confusion will reign for less technologically adept students if their screen looks different from mine. Thinking about multiple user interfaces is tiring (trust me, I used to switch teaching software packages AND platforms every three hours, and I thought my brain would explode when I couldn't remember the shortcut that had worked 20 minutes before in another class!).

However, I've been annoyed with the IE6 interface for a long time, so I read two or three industry reviews of IE7, which convinced me that it would be a big improvement and a relatively painless conversion. Nonetheless, I decided to use only one station as a guinea pig.

Surprise! It was painless! Gone are all the stupid pull-down menus (though advanced internet options are still squirreled away where Microsoft figures most people won't find them). The interface is clean and minimalist, which translates to valuable screen real estate freed up.

How about applying Everett Rogers' characteristics of innovations theory to my little upgrade experiment?
1) relative advantage: "better than the idea it supersedes." It wasn't hard to improve on IE6.
2) compatibility: "consistent with existing values, past experiences, and needs of potential adopters." Since I was already comfortable with websurfing, this was not a big risk.
3) complexity: "degree to which an innovation is perceived as difficult to understand and use." I was pretty confident from the industry reviews I read that this wouldn't be an issue.
4) trialability: "degree to which an innovation may be experimented with on a limited basis." Since I could try the new software on a single computer, and didn't need to uninstall IE6, my experiment had a high degree of trialability.
5) observability: "degree to which the results of an innovation are visible to others." Again, since the reviews were by technogeeks whom I trusted, and screenshots of the new interface were provided, I was willing to sample the new product. Next, over the upcoming break I plan to upgrade the classroom computers, which will in turn expose 100+ students to the new environment. I would definitely avoid upgrading the classroom computers if I weren't confident that students will quickly and easily adapt.
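Just for fun, the five attributes above can be sketched as a rough scoring checklist. This is a toy illustration of my own, not anything from Rogers: the attribute names are his, but the 1-5 ratings and the simple averaging are entirely made up for the sake of the example.

```python
# Toy sketch: scoring a prospective upgrade against Rogers' five
# attributes of innovations. Attribute names come from Rogers; the
# 1-5 scores are my own subjective ratings of the IE7 upgrade.

ROGERS_ATTRIBUTES = [
    "relative advantage",
    "compatibility",
    "complexity",       # scored inversely here: 5 = low complexity
    "trialability",
    "observability",
]

def adoption_score(scores):
    """Average the 1-5 ratings; a higher average suggests likelier adoption."""
    unknown = set(scores) - set(ROGERS_ATTRIBUTES)
    if unknown:
        raise ValueError(f"unknown attributes: {unknown}")
    return sum(scores.values()) / len(scores)

ie7_scores = {
    "relative advantage": 4,  # easy to improve on IE6
    "compatibility": 5,       # already comfortable with websurfing
    "complexity": 4,          # reviews suggested a painless switch
    "trialability": 5,        # one guinea-pig machine, no uninstall needed
    "observability": 4,       # trusted reviews with screenshots
}

print(adoption_score(ie7_scores))  # 4.4 -- a pretty safe bet
```

On this (admittedly rigged) scorecard, my upgrade was a near-sure thing, which matches how it actually went.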

Tuesday, November 21, 2006

Wash me, dry me, spin me. . .

A dirty little secret lives: the VNR.

Three years out from an undergrad advertising course and 18 months out from a public relations course, today's reading of Gillmor's reference to what he calls video press releases [more commonly called video news releases] (p. 184) reminded me that I had studied them from a practitioner's point of view.

So I dug out my textbooks, and found these references:

Marketers like to have as much control as possible over the time and place where information is released. One way to do this is with the video news release, a publicity piece produced by publicists so that stations can air it as a news story. The videos almost never mention that they are produced by the subject organization, and most news stations don't mention it either. Many pharmaceutical companies like Pfizer, Aventis, and AstraZeneca have used VNRs, as have GNC, Mercedes, Neiman Marcus, and others (Belch, 2004, p. 581).

Even more insidious:

News releases in video form, known as VNRs, have become standard tools in the practice of public relations. The best VNRs are those that cover "breaking" news--a press conference or news announcement that broadcasters would cover themselves if they had the resources. Such "breaking" news VNRs are delivered by satellite directly to TV newsrooms. [Caption reads:] The real thing. Once a VNR makes it into a newsroom like CNN's and over the airwaves, few viewers question the story's authenticity or origination.

Satellite feeds of unedited footage, called B-roll, include a written preamble-story summary and sound bites from appropriate spokespersons. The TV stations then assemble the stories themselves, using as much or as little of the VNR footage as they see fit.

Questions practitioners are asked to consider include: Is the VNR needed? How much time do we have? How much do we have to spend to make the VNR effective? What obstacles must be considered, including bad weather, unavailability of key people, and so on? Is video really the best way to communicate this story?

Oh, yeah, and one more thing is mentioned in the text: Then, too, there is the controversy surrounding VNRs in general. . . . TV Guide's researchers reported that, although broadcasters used elements from VNRs, rarely were they labeled so that viewers could know their sponsor's identity.

. . . The fact remains that if an organization has a dramatic and visual story, using VNRs may be a most effective and compelling way to convey its message to millions of people (Seitel, 2004, pp. 247-249).


Anybody besides me feel like they want a shower?



Belch, G. and Belch, M. (2004). Advertising and Promotion, 6th ed. New York: McGraw-Hill.

Seitel, F. (2004). The Practice of Public Relations, 9th ed. Upper Saddle River, NJ: Pearson.



Additional reading:

http://www.sourcewatch.org/index.php?title=Video_news_releases

http://www.democracynow.org/article.pl?sid=06/04/06/1432244



Thursday, November 16, 2006

A perspective on Web 3.0, Net Neutrality and the DMCA

Online media columnist Sean Carton explains why he thinks web-based software services will not catch on unless Net Neutrality and digital media copyright issues are sorted out. I thought this tied nicely to our discussion about Net Neutrality in last Tuesday's class.

Read it @:

http://www.publish.com/article2/0,1895,2060291,00.asp

Monday, November 13, 2006

The Tragedy of the Commons

I'm glad we read this. I encountered it earlier in an environmental science class, and always wanted to come back to it.

Hardin ties the idea of a shared pasture, with individual herdsmen vying for ever-increasing shares of the resource, to overpopulation, and argues that the population problem "has no technical solution, but requires a fundamental extension in morality." His scientific paper was notable for its discussion of morality (Wikipedia, 2006).
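The logic of Hardin's pasture can be boiled down to a tiny payoff calculation. This is my own toy model, not anything from Hardin's paper, and the gain and cost numbers are arbitrary: the point is only that each herdsman pockets the full gain from adding an animal while the overgrazing cost is split among everyone, so the rational individual keeps adding animals.

```python
# Toy model of Hardin's commons (my own illustration; the numbers
# are arbitrary): private gain is kept whole, shared cost is divided.

def individual_payoff(my_animals, total_animals, n_herdsmen,
                      gain_per_animal=1.0, cost_per_animal=0.5):
    """One herdsman's private gain minus his share of the shared grazing cost."""
    shared_cost = cost_per_animal * total_animals / n_herdsmen
    return gain_per_animal * my_animals - shared_cost

# With 10 herdsmen, adding one more animal nets me the full +1 gain
# but only 1/10 of the extra cost -- so adding always pays, for everyone,
# and the pasture is grazed to ruin.
before = individual_payoff(my_animals=5, total_animals=50, n_herdsmen=10)
after = individual_payoff(my_animals=6, total_animals=51, n_herdsmen=10)
print(after - before)  # ~ +0.95: the individual always comes out ahead
```

Every herdsman reasons the same way, which is exactly the trap: individually rational, collectively ruinous.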

Last year I wrote a paper for an international relations class examining the population problem and have posted some excerpts here. Current projections indicate human population may top out at about 9 billion by mid-century and then begin declining (PRB, 2004).

"Historically, human population has been remarkably stable, with high birth and death rates ranging from 30 to 50 per thousand per year (PRB, 2004). High birth rates were necessary to maintain population numbers; this pattern held true regardless of culture or ethnicity. For example, in England in the 17th century, 60% of children did not survive to the age of five, and only 30% survived to age 15—old enough to procreate (PRB, 2004). For most of human history, the combination of high birth and death rates resulted in very slow population growth.

The first well-known attempt to describe human population trends was by Thomas Malthus. At the end of the 18th century, he published a book, An Essay on the Principle of Population, which postulated that “war, famine and disease” were inevitable to counteract the tendency of population to grow geometrically while food production could only grow arithmetically. This theory gained credence because it was mathematically formulated, at a time when the rational approach to intellectual inquiry was first becoming fashionable. He believed that the poor in particular should exercise “self-restraint” to keep from increasing in number (Montgomery, no date). His theory of population growth being tied to war, famine, and disease became known as the Malthusian trap (Peterson, 1999).

Beginning in the 1920s, demographic transition theory took root as a way to explain population trends. This theory posits that industrializing nations pass through four stages: stage one, with high birth and death rates; stage two, with high birth rates and falling death rates (due mainly to decreasing death rates of children 0-5 as sanitation and medical care improve); stage three, in which birth rates fall since children become an economic liability instead of an asset, and access to contraception improves; and stage four, characterized by the low birth and death rates seen in industrialized countries (Montgomery, no date). Large increases in population are seen in stage two since birth rates far exceed death rates at this juncture. This is the state that many developing nations are currently in, which explains the continuing rapid increase in world population even though birth rates are stabilizing or even falling.

Interestingly, it has been noted that developing nations appear to be going through the demographic transition much more rapidly than the developed nations did. A comparison of Mexico’s transition with Sweden’s shows that while Sweden’s fertility and mortality declined fairly slowly over a period of 150 years, Mexico experienced a rapid increase in population when the death rate fell quickly, which led to government policies encouraging contraception and education (PRB, 2004).

Mexico’s birthrate fell from 7 children per woman in 1965 to 2.5 in 1999. The government encouraged this trend with a 25-year ad campaign extolling that “small families live better.” Traditionally, Mexicans held a belief that one of the reasons they lost the southwest to the U.S. was that the sparse population impeded their ability to defend the territory, and as a result of this loss, population growth was encouraged. The government’s 1974 about-face on population growth expanded women’s access to contraception and dramatically affected the birthrate (Dillon, 1999). Likewise, China’s well-known “one-couple, one-child” policy has dramatically dropped the birthrate, although not without controversy or unintended consequences—such as the gender imbalance brought on by the cultural preference for sons and consequent abortion of female fetuses (PRB, 2004). Nonetheless, these examples show that government policies can have dramatic effects on population numbers.

A more recent concern is the implications of declining national populations. Whole new issues arise in nations with fertility rates below the replacement level: low birth and death rates imply an aging population, with a set of problems the world has not seen before (Longman, 2005). The capitalist model assumes continued economic growth; what happens to this growth if fewer people need goods and services? Who will care for an aging population, and how will government entitlement programs deal with a declining worker-to-retiree ratio? Governments are just beginning to address the far-ranging implications of these trends, which are evident in countries such as Japan, Russia, and Germany.

Post-industrial societies have factors which work in favor of very low fertility. Contraceptives, career choices, the economic drain of childrearing, “biological clock” factors, lack of a desirable partner, unstable employment, and high housing prices can all cause women to reconsider how many children they have, if they have them at all. Even in developing countries, rising education levels cause women to delay marriage, use more contraception, and have fewer children (PRB, 2004).

There are also positive benefits of declining populations, called the “demographic dividend.” Some experts believe that having fewer children frees resources for investment, labor, and other pursuits (Longman, 2005). Certainly, the environment benefits when fewer humans compete for resources."
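The Malthusian arithmetic mentioned in the excerpt above is easy to make concrete. The numbers below are purely illustrative (my own, not Malthus's): a population doubling each generation versus a food supply adding one fixed increment per generation.

```python
# Sketch of the Malthusian trap (illustrative numbers of my own choosing):
# population grows geometrically, food supply grows arithmetically.

population = 1.0   # arbitrary starting units
food = 1.0

for generation in range(1, 7):
    population *= 2    # geometric: 2, 4, 8, 16, 32, 64
    food += 1          # arithmetic: 2, 3, 4, 5, 6, 7
    print(f"generation {generation}: population {population:g}, food {food:g}")

# By generation 6, population (64) has far outrun food (7) -- the gap
# Malthus argued would be closed by war, famine, and disease.
```

Of course, as the demographic transition discussion shows, real populations didn't behave this way once death rates and then birth rates fell; the sketch just shows why the model felt so mathematically compelling in its day.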

While it could be argued that Hardin is correct that a "moral" solution (we could have quite a discussion over the morality of China's policy, in particular) rather than a technical one is needed to solve the problem of overpopulation, he did not foresee the benefits, and therefore the motivation, to individual women of having fewer children (better educational and economic opportunity), nor did he predict that later-industrializing countries would progress more rapidly through the cycles.

I'll have to further ponder the relationship between the tragedy of the commons and today's technological environment, but I thought a brief review of the human population problem might provide an interesting background.


Dillon, S. (June 8, 1999). Small Families Bring Big Change in Mexico. The New York Times. Retrieved 8/9/05 from the World Wide Web. http://www.uwmc.uwc.edu/geography/Demotrans/mexpop.htm

Longman, P. (2005). The Global Baby Bust. Annual Editions: Developing World. Dubuque, IA: McGraw-Hill/Dushkin.

Montgomery, K. (no date). Demographic Transition. University of Wisconsin, Department of Geography and Geology. Retrieved 8/9/05 from the World Wide Web. http://www.uwmc.uwc.edu/geography/Demotrans/demtran.htm

Montgomery, K. (no date). Thomas Malthus. University of Wisconsin, Department of Geography and Geology. Retrieved 8/9/05 from the World Wide Web. http://www.uwmc.uwc.edu/geography/Demotrans/malbox.htm

Population Bulletin (March 2004). Population Reference Bureau. Retrieved 8/9/05 from the World Wide Web. http://www.prb.org/Template.cfm?Section=Population_Bulletin2&template=/ContentManagement/ContentDisplay.cfm&ContentID=12488

Projected Population of the United States, 2000 to 2050. (2004). U.S. Census Bureau. Retrieved 8/12/05 from the World Wide Web.
http://www.census.gov/ipc/www/usinterimproj/natprojtab01a.pdf

World population growth rate continues to plummet. (May 2, 2005). Mongabay.com. http://news.mongabay.com/2005/0502

World Population Prospects: The 2004 Revision and World Urbanization Prospects. (May 2, 2005). Population Division of the Department of Economic and Social Affairs of the United Nations Secretariat. Retrieved 8/12/05 from the World Wide Web. http://esa.un.org/unpp

Tragedy of the Commons. (October 2006). Wikipedia. Retrieved 11/13/06 from the World Wide Web. http://en.wikipedia.org/wiki/Tragedy_of_the_commons

Tuesday, November 07, 2006

More Friedman-related talk

So I emailed my nephew, who is living in Bangalore teaching Indian tech support engineers how to deal with American customers, and asked him what he sees. Here's some of his response:

"My sense is that the pace of outsourcing is slowing . . . I've also read articles that have discussed companies bringing business back to the States because they were losing customers. Further, [large international software company] seems to be trying to shift a lot of their services from voice/phones to writing/email. Training Indian employees in cross-cultural sensitivity definitely has its limits. There are some cultural sensibilities that simply can't be taught but must be experienced and learned through socialization. Even language training has its limits. There are definitely a lot of issues with outsourcing here and costs are going up dramatically, so it'll be interesting to see what happens over the next 5 years or so."

All is not rosy in the flat world. . .

Online community

I'm in a hurry, but I'm going to try to dash this off before my next meeting. I wanted to share my own positive experience with online community. In 1996, when the web was just taking off, a family member was diagnosed with a life-threatening form of leukemia. I immediately searched the web for information, and within a short time met a guy online in England with the same disease, who had just started a listserve for patients.

Keep in mind that listserves were not widely used by the public at that time. Soon, others joined, and one brave leukemia survivor named Barb Lackritz became our de facto leader (sadly, she passed away from this stupid disease a few years ago). This kind of leukemia is often slow, but deadly, so patients often have a good deal of time to research treatments. Through this listserve we found the best center in the world for this kind of leukemia (not, as you might think, Fred Hutch here in Seattle, but M.D. Anderson in Houston). We traveled there for a consultation (on Sept. 11, 2001, ironically--but that's another story) and my family member ended up using their experimental protocol. Today, that protocol has become the gold standard and people enjoy longer remissions because of it.

Our listserve was a precursor to ACOR, which today has listserves covering many types of cancers. Things I learned from the listserve that would have been impossible before the internet included: real patient stories from around the world of treatment successes, failures, and side effects; information directly from leading international researchers who monitor the list; and information that contradicted "standard medical practice" of the day, but which in the intervening 10 years has been shown to improve outcomes. Patients and families from our list have testified before the FDA and founded multiple research organizations. List participants range from patients who happen to be biology professors and statisticians (both handy people to have onboard, by the way) to all manner of scared but determined everyday people, to highly respected researchers who have a unique opportunity to connect directly with patients living with this disease. Believe me, courage lives on this list.

The listserve has the usual problems of flame wars and misinformation (people's tempers run a tad high when they have an incurable illness), but a savvy listmember can find information unavailable anywhere else.

And that's why I get really irritated when medical doctors pooh-pooh the internet. Some medical doctors are themselves unlucky enough to get this kind of cancer, and they tend to seek out our little listserve and participate actively.

So--Vaun's drawing of the toilet stall is accurate (and I agree with him in many ways). But there is also a noble--and sometimes lifesaving--side to internet community.

Monday, November 06, 2006

Breaking up with Winston

Dear Winston,

These kinds of letters are always hard to write. When we first met, I thought you were pretentious, eccentric, and oh-so-British, but undoubtedly smart. I soon fell for your British accent (what American woman doesn't?) and began to lend credence to your talk of supervening social necessity, the law of suppression of radical potential, and the diffusion of innovations. Your insider accounts of the "real" story of technological innovations were fascinating.

We spent quiet nights together--you, me, and my 15-watt reading light--while my husband slept, oblivious to our trysts.

Your conviction that today's technological upheaval was really nothing new was so reassuring to a woman buffeted by technology overload. "This has happened before," you whispered. "There's really nothing so revolutionary about it." Oh, how comforting. The new technology's not going to overwhelm us after all! If it's been seen before, we have tools to deal with it!

Mistresses always want to believe pretty things, and it was a pretty thing indeed to think that we could predict technology's effects on society, and therefore be proactive, instead of reactive.

Last night, when we settled in together, I was so ready to hear your take on the Internet. "Prototypes and Ideation," the chapter head read importantly. "This Grand System," read the page headers--okay, I'm probably the only person who reads headers in books; it's a failing born of my years spent as a typesetter.

It started so promisingly--with a thorough analysis of the genesis of Arpanet, born of ARPA, soon-to-be-rechristened the faintly evil-sounding DARPA. Vannevar Bush was duly noted, the players and the plays painstakingly recounted. Oh, you had me last evening, Winston, as soon as we said hello.

But affairs usually end badly, and this one was no exception. Your painstaking analysis of the past broke down completely when you tried to predict the future. Hindsight's 20/20, all right, but you must cringe when you read your 1998 prophecies: "the limited Boolean logic of the search engines," "the disastrous application of the concept of commoditisation of information" [ed.: British spelling intact], "little support [for] the idea that the net will become a crucial method for selling goods and services," "the creation of a virtual social community seems to have less, if any, purpose except as a sort of hobby." The evidence is damning.

Like a mistress being told "I love you, you're beautiful, but I'm going back to my wife," your words predicting the future of holography fell on deaf ears, Winston. How could any of the pretty tales you'd spun be believed, since you so clearly missed the implications of the Internet?

Well, I'll get over you, Winston. You led me down a garden path, but I should have trusted my instincts at the beginning. Like all relationships, I've come out of this one a changed, perhaps wiser woman. I will say this: I'll never forget you, and maybe there was something to your beloved concepts of supervening social necessity, and suppression of radical ideas.