digital media musings

Monday, October 30, 2006

just in time for halloween. . .

. . . a search engine hosted by a hot-looking black woman.

Go ahead, check her out:

http://www.msdewey.com/

Type in "digital media" as a search term, and get back a diatribe on mainstream media.

By the way, if you leave her up on the screen, the more you ignore her, the angrier she gets. Sort of like Glenn Close in Fatal Attraction.

Funny, but creepy. . .

Saturday, October 28, 2006

"I told you so"

I noticed that the 2nd ed. of the Friedman book has a different cover. The first one has this iconic image by Ed Miracle, titled "I told you so." There is something about this painting I find compelling; the rowers frantically trying to avoid the inevitable, the ships about to plunge into the abyss. I'm not really sure how it ties to the content of the book, since Friedman's point, as I see it, is essentially that a flat world is not, after all, the catastrophe for exploration (a metaphor for progress?) predicted by the conventional wisdom of Columbus' day. Maybe that's why it was pulled from the second edition.

So as long as we're going to talk about Friedman, Highline Community College professor T.M. Sell recently wrote an op-ed piece for the Seattle Times in which Friedman's book does not fare well. His criticism includes the following:

"Friedman never really says what he means by the world is flat, but apparently it's a metaphor for the world getting smaller and the playing field getting more even.
This metaphorical machete has a couple of problems. Think about it: The shortest distance between two points on a globe is a curve. In a flat world, those curves would get straightened and hence be farther apart.
The second problem is the metaphor treats world trade like a competition between nations, which it demonstrably isn't."


Sell goes on to point out that lack of infrastructure and corruption in places like India hamper these countries' ability to compete for ever-higher-level outsourced jobs.

Both Friedman and Sell recognize the decline of high-pay, low-skill jobs that were a mainstay of the American economy in the 20th century. Both stress the importance of Americans teaching their kids to value education as a way to protect themselves from the downsides of globalization; predictions are that our children will change careers much more frequently than we did. "Lifelong learning" is a buzzword I see in action on my campus every day. People who come to our college range in age from 16-70, and many are retraining from all sorts of careers. It takes a lot of courage to come be a multimedia student in your 40s or 50s if you lack a technical background, and I have many hard-working students who fit that profile.

When I first read Friedman's book, I found it explained many seemingly disconnected events in my own personal & professional life, but his outlook was a little rosy for my taste. Since the release of the 1st ed., I have the sense that companies have pulled back somewhat from international outsourcing. Two cases in point: a friend of mine is a lead scientist at a large division of an international company, and several months ago told me that their local production facility had been shut down and the jobs sent to Pennsylvania. I expressed surprise that the jobs weren't outsourced to China. "No," he replied; "our predictions are that within five years it will not be economically attractive to manufacture in China." The other case: a start-up flat-screen TV company called Olevia has decided to manufacture 50 miles outside of L.A., instead of dealing with the set-up, distribution headaches, and tariffs involved in offshore manufacturing.

Which is not to say that globalization won't continue, but perhaps trends such as locally grown produce instead of peaches from Chile represent a recognition that the wage savings realized by globalization are at least partially offset by the costs of transportation, cross-cultural communication, and the need for complex international business structures.

Wednesday, October 25, 2006

Apropos of nothing, 2


It's come up a couple of times in class now: the idea that reading is much faster than listening. I'm curious, is there research that shows that? It might kind of shoot down my grand plan to post lectures as podcasts.

Monday, October 23, 2006

Unintended consequences, distributed networks, and sushi in Spokane

One paragraph in this week's reading of the third metamorphosis jumped right off the page at me:
"The focus of the [ARPANET, precursor to the internet] research was to design an 'internetwork' of computers that would continue functioning even if major segments were knocked out by nuclear bombs or saboteurs. Thus, the network itself was assumed to be inherently unreliable, with a high probability that any portion could fail at any moment. . . . they adopted a 'headless' distributed network approach modeled after the postal system. . . communication always takes place between a source and a destination."

This explains precisely why the Internet has been so powerful, and why traditional institutions such as governments (think porn and gambling regulations in the U.S., or ongoing Chinese efforts at political suppression) and corporations (think entertainment conglomerate efforts to protect their content--i.e., profits--by squishing Napster and YouTube) have been largely unsuccessful at controlling it to date. The Internet is headless; no entity has the overarching ability to direct it. Therein lies its freedom and ability to empower citizens--not to mention vexing problems such as how governments can prosecute online child pornography rings that may span the globe.

Talk about unintended consequences! I doubt the designers envisioned just how profound their invention would be.

The effects of the distributed network model adopted by the internet founders continue to transform society in ways trivial and profound. This weekend, my husband and I traveled to Spokane to visit our freshman son at Gonzaga. As mentioned by someone in class last week, Spokane has free internet access throughout downtown (more progressive than Seattle??), so I was conveniently able to put out a work-related fire or two while online. When we walked into our son's room, he was listening to music online via Ruckus (I'd never heard of it, but I'm sure many of our classmates have), and showed us a video parody of a Tupac song set at Gonzaga by two students--who promptly went on to get a contract with NBC, he told us proudly. (I can just hear their parents: we spent $150K on their education for this?)

Our son wanted to take us to a sushi place, so I kept my doubts about ordering raw fish while east of the Cascades to myself, and much to my surprise, the atmosphere, clientele and food rivaled a place in Santa Monica I'd been to. I kept thinking to myself, in Spokane?

But as we drove home past Moses Lake, where housing developments are sprouting up like cornstalks, I recalled that either Google or Microsoft is building a huge server farm there. Because of the distributed network created by the internet, it's perfectly plausible that tech-savvy people would choose to live somewhere like Spokane, where housing is affordable. Hence the market for good sushi.

The distributed network is no longer merely virtual; it's also transforming geography, redefining where and how people live and work.

Wednesday, October 18, 2006

Apropos of nothing

A random thought occurred to me as I was trying to locate the logout button on one of the 3 online banking sites I use: has any standards body, such as ISO, looked at standardizing web interfaces?

Sunday, October 15, 2006

Uses and Gratifications

Uses and Gratifications is a new concept to me, but worth considering. Had a bit of trouble swallowing the academia-speak in "Determining the Uses & Gratifications for the Internet." (Does the article really need to quote multiple social scientist studies to back up assertions such as "the Internet is revolutionary; it represents a paradigm shift in the way we do business"?) In some ways the article seems naive. "Users of AOL can be a reasonably representative sample of consumer Internet use." Really, in 2004, when it was written? I'd argue that AOL ceased being representative of Internet users by the millennium, at the latest. I've been trying to get my 83-year-old dad to dump AOL for a long time.

"Around the World Wide Web in 80 Ways" suffered from some of the same conceits (again, do we need four studies to validate the statement that "entertainment is a major motivation for going online"?). However, reading the two articles made me reflect for probably the first time on why I spend so much time online. The reasons are complicated, sometimes conflicting, and heretofore quite unconscious. One primary reason is work email; gratification results from responding promptly to student, administrator, or business community representatives, and thereby increasing the reputation and visibility of our program. A second would be to interact with my online students. A third is just to "zone out" at the end of the day or sometimes midday, surfing for general or industry-specific news and entertainment. Social connections are another reason; research for my own student activities another; banking and checking stocks quite another; staying informed on community and political events a distinctly different reason.

What I'd never considered from a uses & gratifications perspective, or really any conscious one at all, is "what motivates me?" What I was struck by as I pondered that (@ 2 am last night because I couldn't sleep) is that all of those motivations might occur in rapid succession or simultaneously. I regularly switch effortlessly between work and personal online activities. Without really ever stopping to consider it, I think I've taken the approach that if I need to check my online bank account during work hours, I should be able to, because the advent of 24/7 computing means I'm also likely to talk to students @ 6 am or on the weekend.

The lines blur; the motivations for being online are interconnected and hazy. No wonder studies have a hard time parsing the data (as the "80 Ways" article seemed to admit at the conclusion).

The "80 Ways" article also had some questionable conclusions, in my view. It tried to separate "serious activities" (such as politically motivated surfing) from casual ones (downloading music/videos); but today downloads would rank much higher, thanks to Apple & YouTube. I don't think this reflects a less serious audience, but rather the fact that technology has made downloading so easy.

Finally, how has the political landscape changed due to the Internet? I think the study misses the boat by trying to separate "serious" political surfers from others. This summer my 20- and 18-year-old sons were both home from college and regularly watched Jon Stewart and Stephen Colbert online. Young people are getting their news from sources like these because of the disconnect between what they experience as draft-age Americans and how the traditional media portrays the U.S. political environment. I watched with interest as my 18-year-old took in Colbert's "Better Know a District" profile of Bellingham, which included a thoughtful (if funny) analysis of U.S./Canada border security. I fear these kids will become quite cynical, with Leno, Letterman, Stewart and Colbert as their primary political news sources.

Warming up to Winston

So at first I found him a little too abstruse, and a little too British, but the more I read him, the more I find to sink my teeth into.

I think we have a tendency to idealize both political history and scientific history. Certainly my own recollection of the invention of the telephone was a romanticized version of "Come here, Mr. Watson, I need you" (probably a relic of 4th grade science instruction). The reality is much more nuanced. I was struck by the idea that Alexander Graham Bell's "amateur status is in complete contrast to the way in which things are done these days, not the least the mighty research laboratories that bore his name." When I worked in telecommunications many moons ago, our company occasionally hired Bell Labs veterans. They were indeed respected in our company; recently I read an article lamenting the lack of industrial research like that which used to come out of Bell Labs and Xerox's Palo Alto Research Center (PARC), to name a couple. Today the tradition is being renewed by Google's and Microsoft's research departments.

I hadn't known that Western Electric, which manufactured for AT&T for many years, was founded by Bell's rival, Elisha Gray--both men and their companies apparently profited nicely from their agreement. The rather sordid legal battles and backroom negotiations described in the text actually made me feel that the era we're living through now is not so unique. This is somehow comforting. Perhaps we've been here before after all.

Monday, October 09, 2006

The Telegraph

So I think my first “a-ha” from the Media & Technology book is that it’s getting harder for me to read “real” books, especially those with dense content, without my trusty computer as a sidekick. First thing I wanted to do was Google Paul Klee’s Angelus Novus. Funny, I envisioned the painting as very gothic, even though I was pretty sure he was involved in Bauhaus and therefore his style would hardly be gothic. The image itself did not impress me in the least—my imagination, and Benjamin’s words, painted a much more vivid, tragic, picture of a determined angel fighting against the blowback of events. Klee’s image looked like a childish cartoon. So what does that say about “the rise of the image, the fall of the word” (to crib from Mitchell Stephens, whom I’ll return to in my research)?

I found the reading slow going. I wanted to define apposite, so tantalizingly close to opposite, but clearly separate in meaning. So that required a trip to dictionary.com, where I learned that it means relevant. Quickly followed by a sidetrip into jeremiad, meaning bitter lament. Mmm, good words. Hadn’t consciously thought much about vocabulary till taking the GRE this summer. At 46, how many words should you expect to add to your vocabulary per week?

Well, I digress. The telegraph chapter was illuminating. We see the suppression of radical potential today, do we not, in the media companies’ fight to keep free creative content off the web (recently Napster, now YouTube)? I’ve been dimly aware that history attributes many scientific breakthroughs to the wrong player, but this case study was interesting.

Finally, a typographic rant about this book: it's set in a font called Perpetua, which has a very small x-height (the height of lowercase letters without ascenders). At the very least it needs to be set larger. Perpetua also lacks a property we look for in choosing body copy fonts: invisibility. Its personality gets in the way of delivering the message.

Social Aspects of New Media Technologies, pt. 2

We can see why the internet is so compelling, under uses & gratifications theory. It easily outperforms TV in at least three of the four gratifications. Entertainment? Far more choices, on demand. Personal relationships? TV strikes me as more of an impediment than a gratifier in personal relationships, while the internet allows people to connect with old friends, find romance, and participate in worldwide listservs based on personal or professional interests. Surveillance? Both media offer extensive opportunities to be aware of local, regional, national or international events of interest, but the internet allows individuals to express opinion, and perhaps more easily get involved.

The critical mass piece got me wondering: how long until traditional "paper" banking is rendered entirely obsolete by online banking? How will slow adopters and low-resource individuals be served?

The diffusion of innovation steps look a lot like those used in advertising: first consumers must be made aware, then persuaded; then the item is purchased and the decision confirmed.

Something I'm troubled by with much of the technology we're currently surrounded by is the increasing speed of obsolescence. (Is there a companion graph to level of diffusion called level of dissolution?) The entire lifespan of VHS was about 25 years; recently I went through our family's collection of home movies and Disney tapes. The home movies will need to be recorded to a newer medium; the Disney tapes, which a few years ago were worth as much as $200 each as collector items--well, I'm not sure what to do with them. How about all those digital .jpg pictures we've all been snapping the past few years? All the information in .pdf's? Adobe Systems, Inc. recently introduced a file format called PDF/A, the "A" standing for archive. They have committed to supporting the format for 50 years. But given corporate mergers, bankruptcy, etc., how can we really know if this file format will survive that long?

Sunday, October 08, 2006

Social Aspects of New Media Technologies

Right off the bat, this article leads me to recognize a problem I'm struggling with at work. We are revising curriculum for our technical college's two-year Multimedia Design & Production degree, and I'm researching potential name changes. The particular degree I'm working on revising has been known as the Multimedia degree, but now that we offer another degree focusing on video production for the web, and a third in 3D animation, the name needs to differentiate from those. This degree is really a degree for print and web designers--but how sexy does an A.A.S. in "Web & Print" look on a transcript? So I've been looking at "Graphic Design," which by current U.S. Department of Labor job descriptions is accurate--but I fear sounds so stodgy that young people will avoid it. "Multimedia" is technically inaccurate, because print publishing does not fulfill the common definition of multimedia, which requires at least 3 of the following: text, graphics, interactivity, sound, animation. I love the UW's "Digital Media" moniker, but since our video production degree is titled "Interactive Digital Media," any similar reference is bound to cause confusion. "New Media" is just too vague, as the article points out.

So here I am, teaching in a discipline undergoing so much transformation that we're not quite sure how to label it. Almost, but not quite, makes me wish I was teaching Nursing. Ideas for names, anybody?

the telecom biz

I had never heard the term "disruptive technology" until reading the Economist article, but it certainly made sense in the given context. I couldn't help but contrast how VoIP is transforming phone service with the telecom industry environment of 20+ years ago, when I worked for a local telecommunications company, Teltone (then based in Kirkland, now in Bothell). I remember a great deal of talk about UMS (Universal Measured Service), where phone companies were expecting to generate revenue by charging for local calls on a usage basis just like long distance, instead of charging a flat fee. Instead, nearly the opposite has happened--long distance is nearly free, and most young people I know dispense with land lines entirely. When we dropped our second kid off at college at Gonzaga this summer, it seemed almost archaic that dorm rooms had land line phones that students could rent.
The article also clarified for me why, when our family went in for our "forced" cell phone upgrade after two years this summer, Verizon pushed so hard for us to add services such as texting and internet access. They need the revenue stream generated by the value-adds.

Tuesday, October 03, 2006

Day 1

Well, this is a homecoming of sorts for me, here in the Communications Building at the UW. Many years ago I worked in this very building, back-in-the-day of typesetting-before-desktop-publishing-was-invented.