14 September 2010

Traditional Publishers Still Hidebound

"The idea that something that appeared in print is automatically worth paying for is nonsense," says Mark Coatney in "Evaluating Time Magazine's New Online Pay Wall."

This is an example of thinking from the traditional publishing world, where if something made it into print or was "published," it meant the content went through a lengthy process of adding value and checking quality: editing, fact-checking and proofreading. This was thought in the olden days to mean something. It did, but not always. That editors and fact-checkers were available, or that they had a hand in the content, did not mean puff pieces, fabricated stories, falsehoods, mistakes and typos never made it into that published content, polished to shine like your grandmother's countertops.

Publishing was a measure of trust and quality from the pre-network world. The network has a new set of criteria and indicators of trust and quality.

I find that often writers who do not get paid, who are passionate about a subject or cause and write on their own, are more timely, accurate and effective than authors working for a magazine. There is something about how writers are hired, directed and influenced within the publishing world that biases, distorts and subverts them and their content. It's not always money, or being paid to ghostwrite, pander, produce a puff piece or write the certain kind of article the editor wants. It's something inherent in the process. It may be a consequence of the time it takes to polish a piece to perfection. It may be the idea that a piece needs to be polished. These requirements place their own burdens and biases on writing. Of course, it seems rational that a more polished piece is better, but that is not always true. Sometimes diamonds are more useful and beautiful in the rough than cut and polished.

Often published print authors come across as hired guns: slick, indifferent, arrogant. They often know so much, they know too much, and become arrogant, infusing their writing with their opinions and indifference to readers' and others' views on the subject. The very act of being a "filter" means possibly useful information may be omitted. What if the filter is wrong? What if an author is giving advice, carefully researched and polished so it looks good, but has drawn the wrong conclusions, used the wrong sources of information, or become obsolete by the time it is published, yet speaks with an authoritative voice? This goes on without much accountability in the slow-moving print world.

My own idea for payment, which I proposed back in the late '90s, was to adopt the "PBS model" of threatening to "take Big Bird away": in other words, remove content after a certain time if not enough people viewing it online paid for it. The page would display the number of days remaining until the content is pulled, and the number of people paying for it against the threshold, like one of those donation thermometers. Perhaps the PledgeMusic model would work for publishing content as well as it does for music. Authors could offer additional items, such as autographed books, handwritten manuscripts or donations to charity, in exchange for payment.
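The "take Big Bird away" model above can be sketched in a few lines. This is a hypothetical illustration only: the class name, thresholds and status strings are all invented, not part of any real paywall system.

```python
from dataclasses import dataclass

@dataclass
class PledgeDrive:
    """Hypothetical 'PBS model' paywall state for one piece of content."""
    threshold: int      # pledges needed to keep the content online
    days_left: int      # days remaining before the content is pulled
    pledges: int = 0

    def pledge(self) -> None:
        self.pledges += 1

    def status(self) -> str:
        # The "donation thermometer" a page might display.
        if self.pledges >= self.threshold:
            return "funded: content stays online"
        if self.days_left <= 0:
            return "expired: content removed"
        return (f"{self.pledges}/{self.threshold} pledges, "
                f"{self.days_left} days until removal")

drive = PledgeDrive(threshold=100, days_left=14)
drive.pledge()
print(drive.status())  # 1/100 pledges, 14 days until removal
```

The interesting design point is that the deadline and the thermometer are both public, so readers can see exactly how close a piece is to being pulled.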

02 September 2010

Angry Diggers and the Death of the Author

Veteran users of Digg are upset with changes to the site aimed at reducing their influence. They have begun gaming the "voting" system.

What is interesting about this is:

When I first encountered and thought about sites using voting systems to surface desirable information, I understood that all algorithms for voting can be gamed.

To deter gaming, very sophisticated and arcane algorithms were required. These discourage contribution, because contributors never know where their work will rank, nor why it ranks low or high (this is similar to authors puzzling over Amazon's ranking system).

I was surprised when sites based on user voting systems began to succeed by simplifying their voting to the thumbs up/down basic counts or other simple and easily gamed voting systems.

I believe that when users are satisfied with the outcome of their vote, which for Digg means contributors get their links or comments surfaced and readers feel that the surfaced content is useful, there is no reason to game the system. A little childish game playing might go on, but as long as Digg was a useful tool to most of its contributors and readers (perhaps the same individual, but I would guess the standard 2% participate as contributors and the rest are readers), the system was in equilibrium, running on "social balance."

Only a minority of pranksters might want to game the voting system, at least until the contributors become dissatisfied.

I think it shows that simple voting systems can work, something I believed was unrealistic when I first encountered them. With good social engineering, strict controls are not required and sophisticated, opaque voting algorithms are not necessary. The "helpful/unhelpful," "interesting/uninteresting," "thumbs up/thumbs down" type of voting system is transparent, intuitive and robust, as long as users are happy with the results.

The controversy also reveals differences in the "publishing" model. Digg operated as a kind of newspaper edited by a small group of contributors, opinion makers and gatekeepers who controlled the mindshare of an audience, much like traditional media. It may appear democratic, but in reality it was a traditional publishing model, where a small number of contributors and editors create a filtered flow of content, like publishing or broadcast media.

The opposing model is the "grapevine" or social model, in which information flows organically, laterally, potentially exponentially, through social connections.

Authors do not like readers having a "custom experience" because it reduces the influence of authors; it makes their work pointless. But I would not describe a customized information flow, such as one provided by software agents or through user personalization, the same way as a social information flow. Social media may reduce the influence of gatekeepers, authors, editors, publishers and broadcasters, but that is probably beneficial, and I would call that flow social, not customized. The social flow does reduce the influence of authors, but it also makes everyone an author, and the more influential authors will build social audiences, as they already do with followers on Twitter or through posts to pages on Facebook.

Authors won't go away, because they put time and effort into understanding something others don't have the time for, or are unwilling or unable to put in the effort to learn. Authors generally notice things going on, and have something to say about events. Writers like Michael Pollan, for example, are not going to vanish because your Facebook friend told you all you need to know about humanity's divorce from nature and how important local food is to restoring your connection to the natural world. That friend probably isn't noticing, isn't thinking about things going on in the world in a careful way; that's what authors do for a living. And they often become evangelists for their point of view, something few people are going to become.

That's all I've got to say for now, readers.

01 September 2010

How I got started writing haiku

When I was about five years old I began having experiences of things that stuck in my mind. I would see something, encounter something, and I would freeze for a moment. When I think of it now, I realize this was "noticing" the whatever-it-was, but very intensely, compared to other things, for a moment.

I noticed the freshly washed sheets my grandmother had hung on the clothesline to dry, billowing in the breeze. I saw this from my vantage point sitting in the sandbox. It was memorable for some reason I did not consciously think about then.

I was never bored riding in the car on family trips, because I was constantly entertained by noticing all the details of everything along the road. There were always things that raised interesting questions in my mind and drew out my curiosity: the light on the window of a shop in a strip mall, the neon lights at night, the stars reflected in the window, the hum of the tires on the highway.

On one of my first trips to the beach, I ran down the big dune at Delaware Seashore towards the ocean. As I got closer to the water, I spotted something in the sand and came to an abrupt halt: a jellyfish half buried in the sand. For some reason this first encounter with death, and the fear of treading on it and being stung, stuck with me.

Over the years, I tried to write poems about my experiences or even make them into lyrics, but nothing ever worked out right. It never felt right. I always had to add something to the experience to make it a poem or song. In 2008, I was reintroduced to haiku by reading Basho's Narrow Road to the Interior while recovering from illness.

I was familiar with haiku from my school English classes, but never took it seriously. Even though I enjoyed haiku, I got the not-so-subtle message that haiku is a trivial form, not valued in Western poetry. For some reason, when I think of making something, I feel the gaze or regard of this invisible audience.

For example, when I get the urge to make a guitar, because it would be rewarding in and of itself without any external reward, I immediately think: yeah, by the time you build one, guitars will be going out of favor and no one will care; you'll be stuck building obsolete stuff no one cares about during a period of declining respect and interest for the instrument (ironically, this did not happen, but the opposite happened in the last decade). I thought I would be throwing away any effort on a poetic form that was not even recognized as poetry (I have an irrational need for fame, not celebrity).

It didn't take long after reading Oku for me to realize that I'd been barking up the wrong tree all those years. I'd wasted twenty years trying to fit my experiences into forms that didn't fit. I started translating my experiences and unfinished poems into haiku similar to Basho's. It worked. The haiku was perfectly suited to capturing and conveying the kind of experiences I've had since childhood.

I said to myself at the time, I don't know if my haiku are haiku, or if they are really good haiku, but I was happy to have found a form for expressing my experiences in a satisfying way. This was more important than fame. It would have been wasteful for my experiences to have never seen the light of day, and they just kept bugging me to write them down.

I was given a book on haiku writing. This is when I first learned that what is central to haiku is not form, but experience. That my childhood experiences had a name, they are called "haiku moments" in the community of writers.

I always say I don't write haiku. I get asked sometimes by people to "write a haiku" and I can't. A haiku starts with an experience, and if I don't have the experience (or can't borrow one), I have nothing to go on. So I don't get haiku contests or challenges. Art is an expression; it's not a craft, although it may involve craft. I am sometimes inspired by other people's lives, so I steal their experiences for my haiku when I've none of my own worth writing about at the moment. But to write a haiku to fit a description or theme provided by a contest seems to undermine what haiku is about: experience.

I've learned through this. I learned that sometimes you have to wait for the right form to come along for your expression; Western poetry didn't cut it. I've learned that expression had to come from myself. I've learned that I can only do what I love and enjoy, otherwise it will never amount to anything. I suppose the other thing is that there was some element of obsession, and I had to recognize that, whatever it was, there had to be some creative impulse driving me, perhaps even against my will, to make these things.

I am always fascinated by Minnie Evans who began to draw after a voice said to her in a dream, "Why don't you draw or die?" I wish I could draw like Minnie Evans, but I am held back by my scientific nature. I could never see elephants dancing around the moon and if I did I would keep it secret lest people think I'm crazy. I find in writing haiku a way of expressing the mysterious nature of life in concrete terms, hiding the mysteries in juxtapositions, which must be unraveled in the mind of the reader.

03 August 2010

Bring the Island to You Instead of You Going to the Island

To those of us who are blind to the night sky, and deaf to the language of clouds, currents and ocean swells, it seems like a mystical or superhuman act.

I've always been fascinated by the ideas involved in Polynesian wayfinding. The idea of moving the island to you, instead of you moving toward the island, is so novel to anyone raised on Western thinking. We take so much of our rational, reductionist, scientific beliefs for granted, our coordinates and maps and compasses, as if they are the only way to navigate, while we ignore the most powerful navigational "computer" of all, the human brain. We forget in our "rationality" that there are other, equally valid ways of "reasoning" about and coping with the world around us, ways that do not involve precise "facts," numbers and calculation, but use our powers of observation, pattern recognition and cleverness. The NY Times has an article on the passing of an important traditional Pacific navigator, who helped restore navigational folkways.

28 May 2010

Psychology and Politics

I am disappointed by seeing a significant number of articles on Psychology Today (mostly in the blogs, such as this article that makes an assertion and then follows with several anecdotes about mental patients to justify the assertion) where the author is: employing psychology to support negative characterizations of persons holding political views different from the author's own; employing psychoanalysis at a distance to explain the political beliefs and policy opinions of others; using psychology to support speculation about the intentions of others whose beliefs about the world and policy differ from the author's; or labeling people with opposing views as suffering from diagnosable mental illness, arguing or implying those views are a result of mental illness. Doing so without holding the same mirror of analysis up to themselves seems hypocritical and intellectually sloppy on its face.

I prefer inquiry that follows the rule of curiosity. Instead of characterizing or questioning, the curious person wishes to learn what another person's views are. An excellent example of curiosity-driven inquiry is Brian Lamb's interviews conducted for C-SPAN.

I believe in the importance of free intellectual inquiry, which is driven by curiosity about the nature of things, so it is not the political content of these articles that offends me, but their undermining of the grand project of enlightened inquiry, to which I am dedicated.

As someone who has a lot of contact with the world of folk studies, I am aware that all people, without exception, possess a "folk wisdom" about the world, which they absorb by osmosis from their family, neighbors, and community. Our opinions and decisions, in life and as policy makers (whether you are a politician, a voter, or an opinion shaper), are affected by this personal viewpoint, almost without thinking. I believe this is why so many articles of this type are emerging, and I would like to remind people to follow their curiosity, in order to be led beyond the prejudices and ideologies we may hold without even being aware of them. We also have to be careful about assuming our beliefs are automatically real or true, rather than conclusions about the world, possibly shaped by our temperament, folk culture and education.

22 April 2010

Twitter's Game of Telephone

I find the criticisms of Twitter, especially by literate people or authors, tiresome. They are so wrapped up in their own cherished conception of what literacy, writing and authorship are that they can't see the creativity and value of Twitter's social sharing mechanism.

At its best, Twitter is like the game of telephone, where a child tells the child next to them something, that child tells the next child, and after the story passes through several children, a slightly different story emerges. I believe this is a _good_ thing. What I loved about "retweeting" when I first discovered it on Twitter was how it was an editorializer's paradise. Tweets in the process of being retweeted simply begged me to rewrite them, reorganize them, expand or comment on the idea, adding my own ideas and thoughts to the original tweet, perhaps even shifting it entirely into my own framework. I posted my retweet in the glorious knowledge that someone else might take my words and reformulate them. I welcomed this lateral change.

The social retweeting created a kind of sideways motion as a tweet passed through many hands, unlike anything media has seen before. It is not commentary, nor sharing, but a process only a social network could produce. It was not an author reacting to another's essay, or a commentator commenting on an original with an original of their own. It was more like the wiki process, only moving sideways through time and information space. Each person contributed a small effort and made a small change, but the results were not collected into a single document; they were spread out through time and place.

This was something completely unexpected by me. (I had worked on a blog system in the late '90s that enforced a limited content length, but gave up on it as impractical: who would want to use it? Combined with social following and sharing, though, it became something entirely different. The brevity of the post lowered the barrier to participation, but the retweet and the resulting game of telephone were unexpected.) It's a shame the new built-in retweet system discourages this fabric of editorializing. I suppose it was done in the name of efficiency, or perhaps out of fears about copyright arising from the game of telephone. It would be a shame to bring such a wonderful experiment to an end over such absurd fears.

(By the way, this game of "telephone" adds value, not noise to the signal.)

21 April 2010


The post print project is thinking about how mobile devices and networked media "could redefine how we do a couple of very basic things: how we tell stories and how we learn."

I'm fascinated with this.

I believe storytelling is bound to the way our brains evolved and isn't really going to change much no matter what technology does. The networked and mobile space we inhabit could change how we learn and use information; I think it already has. I've been reading Jane Austen's novels as Gutenberg etexts on the iPhone. The iPhone is passable as a reader; I haven't gotten eyestrain yet. I find it hasn't done anything new, but it has restored reading as a regular activity for me. I hate reading at the computer. I'm too lazy to go to the library (I'd have to drive across town to the central library, where all the really good books are). It's just too easy to pick up the iPhone, download a new book and start reading. That is different. I am unwilling to do this on the desktop computer.

I think the uniqueness will come from how mobile network devices let us assemble small bits of information together, get timely information, communicate with friends easily, or keep to ourselves in private moments of reading or listening to music. It might influence learning by enabling people to draw together different sources right in the palm of their hand while they are experiencing something, which is often important for learning.

18 April 2010

People want their life to tell a story

People want their life to tell a story. When life diverges from the story they wish it to tell, they become anxious and frustrated. Zen Buddhism teaches that desire is the cause of suffering: we suffer when the fulfillment of the desire is threatened. By avoiding attachment to the story our life tells, we can be free of the suffering caused by our life failing to live up to its story. We are then able to enjoy our real life, the one that just happens, without requiring it to tell a story. This is the true story of our life.

This is not a passive attitude toward life. Life happens to us, and we make life happen through what we do and our choices. Life happens, we make things happen, and chance and the cards we are dealt govern our lives.

07 April 2010

Stop the Excuses for School Bullying

Although I doubt prosecution will do any good, that is not the real question; it is just the only response a failed society has to clean up the mess it has made, to lessen the shame of failing to provide a safe learning environment for Phoebe. Stalking, assaulting and verbally abusing an adult is a crime. It ought to be treated as seriously when one child commits violence on another.

Bullying is a serious violation of human and civil rights of the individual. Those rights do not disappear just because a person is a child. Ensuring the right of an individual to autonomy and safety requires greater vigilance when a child is concerned, because they are less capable of defending their self or even prohibited from self-defense by school rules, which the bully does not care to follow, but the victim must to avoid being doubly victimized, first by the bully and second by the clueless school administrators.

The bullied child is often put in a situation with no way out. They are forced by law to attend the place of torture (school). They must choose between suffering the assaults of the bully or the punishments of the place of torture (school). They may see the only way out of their predicament to be suicide or violence. It is the school that ties the hands of the victim for the bully. In that way they are responsible.

Although children must be given room to make mistakes, it seems absurd on its face to classify bullying, a premeditated, systematic and consistent assault on a child, as a mistake. We are told teenagers' brains are not fully developed. The victims' brains are not fully developed either, yet there is no protection for them, no consideration for them; the bullies are adept at manipulating the school rules, pulling the wool over the eyes of administrators and teachers, while the victim may be without the social tools to deal with them. Most children are not true bullies, and they make mistakes in teasing, but bullying is not teasing. If we can't recognize the difference between teasing and bullying, perhaps the adults need to go to school. Our concern should not be for bullies, but for the victim of bullying. The bully has made their bed; let them lie in it. That is the best way for them to learn the folly of their ways.

It is claimed the bullying was no more than that experienced by other children at the school. Are we to say she did not have a right to a safe learning environment because others did not either? Are we to judge and determine Phoebe's experience, moreover, by those who would bully her, who may have bullied her, who were indifferent to her and who are said to not be as sensitive to bullying as her? It only matters what the experience was to her.

Research into bullying at school and in the adult workplace shows a connection between bullying and the prevalence of Post Traumatic Stress Disorder. This is a serious matter with life long psychological and health consequences. Who will pay for these consequences over the lifetime of the victim, society? Are those who say a certain amount of bullying is just part of growing up willing to compensate victims?

Each child has the right to a safe, peaceful, comfortable learning environment without exception and it is the responsibility of society to ensure it as long as children are required by society to attend public school. If society cannot secure such an environment, then children should not be required to attend public places of torture (so-called schools).

(In reply to Should we be criminalizing bullies?)

25 March 2010

This blog has moved

This blog is now located at http://brandymorecastle.blogspot.com/.
You will be automatically redirected in 30 seconds, or you may click here.

For feed subscribers, please update your feed subscriptions to

02 March 2010

Twitter and Facebook

When I went to Twitter today, it displayed a dialog

We were hoping you could help us make it easier for people to discover
their friends and colleagues on Twitter. Review your settings below to
make sure the people you care about can easily find you.

asking me to update my name, bio, location and email fields.

This suggests that both Twitter and Facebook are insecure about each other, seeing strengths in the other and weaknesses in their own service. Twitter feels threatened by Facebook's focus on a true circle of friends and colleagues. Facebook feels threatened by Twitter's capacity for marketing and building followers in public.

It suggests they may eventually become very similar in the features they offer, with Twitter integrating photos, video, circles of friends and Facebook making their content more public (which they are doing). Perhaps both sites will give users more control about who can see what content.

05 January 2010

Irons in the Fire

The blacksmith knows, when you have too many irons in the fire, the iron you leave in the fire will burn before you have time to hammer the iron you're working on. The expression 'having too many irons in the fire' comes from blacksmithing and stands for having too many tasks competing for your attention. I just realized how accurately it describes being overwhelmed by stressful commitments.

The trouble is, in life, we often need to put several irons in the fire. For example, you may need to go back to school for continuing education, but you can't go right away, so you make plans, you make an appointment for the required tests and schedule of classes, you anticipate months of class work. This one task, going back to school, becomes an 'iron in the fire.'

While you're anticipating going back to school, other things will come up: daily life, a new person, a new project. But all the worries of going back to school will still be on your mind. As life goes on, we collect more tasks and responsibilities that stretch out over time or will need to be done in the future, along with the tasks we've already begun. Each becomes an iron in the fire, until we are overwhelmed by anticipation.

The irons, the things we need to do, but can't do right away but must eventually complete, the things we start but can't or won't finish, build up in the fire until we become overwhelmed, knowing we will have to abandon some of the irons to burn or abandon the iron we're working on.

Blacksmiths have a way of dealing with too many irons in the fire. They keep some of the irons out of the fire until they are ready, until they've hammered the irons they're working on. Maybe there is some way in life to keep some of those irons beside the fire, waiting, until the ones you're hammering are done, and the ones in the fire are ready. I'm going to try to mentally pigeonhole those tasks and responsibilities, setting them down beside the fire, but out of it, waiting their turn.

By the way, I've learned (to my surprise, since it is such a traditional, low-technology craft) that blacksmithing is an art that can teach us a lot of important lessons. It teaches that some things can only be learned through experience. Getting good at smithing requires using the hammer. It requires creating a muscle memory of simple moves before you can make more complicated or sophisticated ones. It requires building up sufficient muscle before you can wield the hammer effectively. It is impossible to become a blacksmith just by being an educated person and following a series of instructions in a book, at least not without going through the actual practice of making things. Blacksmithing is a lot like Zen: it requires practice to realize.

I don't mean the kind of practice your piano teacher had you do as a child, although that is related, but the kind of practice that means actually doing something, not as a study, but as a reality. You could purposefully make simple things to teach and strengthen your muscles, but the point is that you have to do it in order to learn it, to realize it, to gain the benefits of it. Talk won't get you there. Reading won't get you there. Knowing won't get you there.

Twitter and A Flock of Seagulls, Publishing in a Networked World

I'm not going to name the site that got me started writing this post. It's a sentiment I've seen on many sites with a traditional publishing orientation. They follow the old tradition from the age of print, where all submitted works are required to be "not published elsewhere," requiring "first print" rights and demanding every "reprint" (copy) cite the publisher as the place of first publication (what is this, vanity?).

These guidelines ignore the reality of the new age of immediacy, of information abundance, of venue abundance: the network. There is no scarcity in publication; there is no value in "first publication" or artificial scarcity on the network. The document is the conversation; the conversation is the document. The old publishing world is gone. Stop trying to hang on.

The attitude simply does not fit with a universe of networked information being shared and reshared by millions of people, winding its way in bits and pieces and fits and starts through the social network of friends, family, colleagues. The network is the world of social publishing.

Why? Because it is too difficult to find works online among billions of documents and uncounted trillions of ever-expanding words. You just can't search for things you do not know exist. The social network trades in attention, which is necessary to discover what exists, through your social contacts.

It just does not make sense to "publish" a work to a certain location (or a physical book), then try to get everyone to come read it through clever marketing. It makes no sense to prevent copying, since copies are the method by which information spreads through a social network. The idea of scarcity and exclusivity makes no sense at all in a socially networked world, unless by exclusive you mean being friends with the author.

The network, by the way, does not really need to worry about this issue of citation, since there is usually a trail back to the original author, through a 'retweet path' (if dutifully or automatically maintained) or through carrying authorship information with the work through the social network (as I've talked about here before).

As a poet, nearly every poem I write is immediately published to the social network, so I can't give anyone "first rights" to it, and moreover, that is meaningless. I noticed the Haiku Society of America states, at least for some submissions, "The appearance of poems in online discussion lists or personal Web sites is not considered publication," a much more adaptive policy.

What happens on Twitter is more like a flock of seagulls, making all references to publication, first publication and second publication utterly meaningless, as we tweet to others and they tweet back at us, retweeting and retweeting. I suppose the next thing is they will want "first tweet" rights. I understand the goal is to keep your publication fresh, but that simply does not fit reality. It says something about a publishing world where the consumer needs to be reassured they are not being "cheated" by receiving old goods turned over from elsewhere, similar to the way "shovelware" became a problem in the 1990s CD-ROM publishing era. I suppose the same problem exists with bloggers and twitterers who merely repeat what others write, but I just don't see the problem. In a networked world it costs nothing to unfollow or unfriend a source.

03 January 2010

"Tyler Cowen: I don't think it's a useful description to say autistics are only focused on one thing, but I would say there's a lot of tasks you can give autistics, like picking out small details in locked patterns, or picking out different musical pitches, where autistics seem especially good at attention to small detail. So if you think of the web as giving us small bits, like a tweet or a blog post is shorter than a novel, if you think of that as the overall trend, like an iPod, a song is shorter than an album. It seems that we're now all living in a world where we manipulate small bits effectively. It doesn't mean any of us is just interested in one thing, but we manipulate these small bits to create bigger ideas that we're interested in, and those bigger ideas are synthetic, and I think it's another way in which we are using information technology to mirror or mimic capabilities of autistics without usually people knowing it."


This is what I suspected when I envisioned Strands in the late '90s, before Twitter existed: that shortening the length of information might be another instance of the medium being the message, that it might broaden the number of people writing by lowering the barrier (less memory and organization required to write), and that there might be some way of using the "many small pieces loosely joined" to create some better, larger paradigm of writing than the book. And perhaps we could give writing the same kind of flexibility we give to data in relational databases, for combining and recombining in novel ways, mining and analyzing.

What if we could create a Twitter Query Language, enabling virtual documents consisting of projections and selections of real-time status streams?
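As a thought experiment, the relational idea above can be sketched as selection and projection over a stream of status updates. Everything here is invented for illustration: the field names, the sample statuses and the two operators are a toy model, not any real Twitter API or query language.

```python
# A toy "Twitter Query Language": relational selection and projection
# over a stream of status updates, yielding a "virtual document."
# All field names and sample data are hypothetical.

statuses = [
    {"user": "alice", "text": "haiku at dawn", "tag": "poetry"},
    {"user": "bob", "text": "new blog post", "tag": "writing"},
    {"user": "alice", "text": "frost on the pane", "tag": "poetry"},
]

def select(stream, predicate):
    """Relational selection: keep only statuses matching a predicate."""
    return [s for s in stream if predicate(s)]

def project(stream, *fields):
    """Relational projection: keep only the named fields of each status."""
    return [{f: s[f] for f in fields} for s in stream]

# A "virtual document": the text of every poetry-tagged status.
doc = project(select(statuses, lambda s: s["tag"] == "poetry"), "text")
print(doc)  # [{'text': 'haiku at dawn'}, {'text': 'frost on the pane'}]
```

Because the operators compose, a virtual document is just a query that could be re-run against the live stream, which is exactly the flexibility relational databases give to data.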