tag:blogger.com,1999:blog-61052378262080608102024-02-06T22:08:43.110-05:00Matthew Putman on the Arts and SciencesFor more blogs by Matthew Putman go to http://insearchoflabs.blogspot.com/ and http://mcputman.blogspot.com/Anonymoushttp://www.blogger.com/profile/07835248533705400567noreply@blogger.comBlogger43125tag:blogger.com,1999:blog-6105237826208060810.post-6369578928808757052011-02-27T14:36:00.000-05:002011-02-27T14:36:02.409-05:00Doctor Joan Who?<div class="MsoNormal"></div><div style="background-color: transparent; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px;"><span id="internal-source-marker_0.6941803588997573" style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">I had one of those dreams that are made of equal parts thought experiment and unconscious chaos. I imagined, or dreamed of, a well-known woman in her early fifties. Let’s call her Joan. In this dream/thought experiment I read a biography of her, which was, I must admit, so thick that I had expected more from it. The book was packed with tiny type for nearly 1000 pages, yet it contained almost no personal detail about what Joan was like. It was more a detailed and annotated account of her accomplishments, which is why, despite its vastness, I could fill in the emotional blanks that the author left out. 
Since Joan is an invention of my imagination, I will not bore you with the same monotonous 1000 pages, but will instead offer just bullet points of a few milestones in her 52-year-old life.</span></div><div style="background-color: transparent; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px;"><span id="internal-source-marker_0.6941803588997573" style="background-color: transparent; color: black; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline;"></span><span class="Apple-style-span" style="font-family: Arial;"><span class="Apple-style-span" style="font-size: 15px; white-space: pre-wrap;"><br />
</span></span><span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">1. Her birth was not so exceptional, so I am not sure why I am including it, but she was born around the summer solstice of 1960, though that makes no difference to anything. Sorry.</span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">2. She learned to read at 3, and by the age of 5 had snuck a copy of that long Victor Hugo book about the history of the world off the shelf of her Francophile parents, and unlike even her parents, she read it, which was to shape her worldview by skewing it towards unrealistic dreams of empirical grandeur.</span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">3. At 6 she admitted to being able to read, but only “Ted and Jane” stories. Her parents quickly ordered a subscription to “Lisette”, a French children’s magazine, in French. Rather than returning to that long Hugo book to read it in the original French instead of in translation, she chose the only French-language book she could find in the living room, “Madame Bovary”. Why the author of the biography did not dig deeper into the long-term psychological effects of numbers 2 and 3, I do not know.</span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: italic; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">4. <span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">Her mathematical and musical abilities formed at around the same time (also age 6), leading legions of unmathematical and unmusical people to make that tired argument that music and math are the same thing, which I personally have never understood. This could of course be due to my own lack of ability to do either with the prodigiousness of Joan. Anyway, she played the usual Bach, the “Well-Tempered Clavier”, but preferred to do improvisational deconstructions of Elvis Presley hit songs. Mathematically she learned arithmetic, but found it boring. She again sought out the library of her academic parents and found a book on linear algebra, where she succeeded in conquering 2-dimensional matrices nearly immediately, and moved on to the 3-dimensional variety, at this young age when most children didn’t even know there were 3 spatial dimensions</span><span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: italic; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">. As a sidenote, but of relevance here, she later admitted that she did suspect even at that time that there might be dimensions other than the 3 spatial ones we were familiar with, but that they must be too small for us to see. This would mean that her mathematical intuition predated string theory by nearly 20 years, but who would have listened to such ramblings from a 6-year-old?</span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: italic; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">5. <span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">In high school, the immensity of her intellect was still mostly unknown, though she had shared it with one person, a boy, whom the book describes only as average, which makes me (both the me of my dream and the me reading the bio) suspicious, as I see no reason for an exceptional person such as Joan to share her secret genius with an average person. </span><span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: italic; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">As another sidenote, this may be exactly what an exceptional person does, which is why exceptional people are so hard to relate to, or so my thinking goes at this point in the biography, and actually at this point in this story.</span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">6. At age 21 Joan had the only episode of extremely normal excess and decadence recorded in all 1000 pages, which strikes me as strange for 2 reasons. First, that she would be so normal as to celebrate such a common birthday with such common things as beer keg parties and shots of gin, but also because, as my thinking goes, 21 wasn’t really 21 for Joan. I justify this with a calculation similar to the way people calculate dog years, though maybe not for the same reason. The dog-years thing is depressing when you actually think about it. It is a way for children, and I guess all of us who love dogs, to justify their demise. As if that one year of the dog’s life was really worth 7 of ours. Which is also curious, as it leads me to wonder whether that means it is actually the dog that is living the full life, as it takes us 7 years to do what we should be doing in 1.</span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">7. At age 21.5 Joan made the major decision of her life, the one she is now famous for. Famous is overstating it, of course, but it is what she is best known for among those who know of her at all. She decided that despite her deity-like ability to do anything, she was going to use her skills where they were most needed, and that was going to take her on a long journey across the ocean from her native Westchester County to London, England. Once in London she followed a long line of European women of intellectual import and changed her name to the masculine Georges, and though she was indeed in England she kept the French spelling in honor of her Francophile parents, who were Stateside, locked deeply in the confines of tenured professorships, never to leave for the promised Old World.</span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">8. Between the ages of 21.5 and 36 she went from hope to despair and back again, working a number of survival jobs while she tried to find an approach to that ultimate release of her mind and body. Of these jobs, the one that earned her enough of a living wage to move into a rather nice flat near Piccadilly Circus was a busker show in which she calculated the day of the week each passerby was born on simply by knowing their birthday.</span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">9. At the age of 36 she was finally approached by the BBC, for the very purpose for which she had emigrated and suffered all those years. That is, she was asked to be a writer on the BBC show “Doctor Who”.</span></div><div style="background-color: transparent; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px;"><span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: italic; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-style-span" style="font-style: normal;">10. Despite the 16 years between the old series and the new, Joan/Georges was able to use her “Doctor Who” knowledge and her own genius to predict everything from Hurricane Katrina to the upset victory of the Red Sox in the 2004 World Series, though at the time she kept this information in the confines of an anonymous blog (yes, predating the widespread use of blogs by 3 years) by the name of quantumfluxgirl. She is now happily back at work on the 6th season of the new series.</span> </span></div>Anonymoushttp://www.blogger.com/profile/07835248533705400567noreply@blogger.com0tag:blogger.com,1999:blog-6105237826208060810.post-53851241498341343342011-02-07T20:33:00.000-05:002011-02-07T20:33:15.862-05:00How I Came to Hate Oprah<div style="background-color: transparent; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px;"><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 12pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgg9ebaN8E9oSwTtFleFlOKfagBA7GDVsIys4RBLJPhTt-dl86LivGJCjPpEPcZ3aqpLQHDARcBI6XHrS-YQ4AcaNZon5GaE3VD6GVkYc4_EbXDo0y7jIePZG42KuT4F7IwLXXiW3huGRo/s1600/oprah_october.jpg" imageanchor="1" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" height="200" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgg9ebaN8E9oSwTtFleFlOKfagBA7GDVsIys4RBLJPhTt-dl86LivGJCjPpEPcZ3aqpLQHDARcBI6XHrS-YQ4AcaNZon5GaE3VD6GVkYc4_EbXDo0y7jIePZG42KuT4F7IwLXXiW3huGRo/s200/oprah_october.jpg" style="cursor: move;" width="164" /></a><span class="Apple-style-span" style="font-family: Arial;"><span class="Apple-style-span" style="white-space: pre-wrap;"><br />
</span></span><span style="background-color: transparent; color: black; font-family: Arial; font-size: 12pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">I did something that, as contrarian and controversial as I can be at times, embarrassed even me. I wrote a comment on a website that I like very much called </span><a href="http://blackartinamerica.com/"><span style="background-color: transparent; color: #000099; font-family: Arial; font-size: 12pt; font-style: normal; font-weight: normal; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;">Black Art In America</span></a><span style="background-color: transparent; color: black; font-family: Arial; font-size: 12pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">, where I criticized a perfectly talented and well-intentioned artist who had organized a commemorative book of art for Oprah Winfrey. I did this with the following comment, among others: “Oprah succeeds based on a basic and I believe dangerous delusion that fame and riches (or even happiness) can be achieved through magical thinking, like that which is promoted in books like ‘The Secret’. This is relevant to artists, as the success of an artist would appear to be in the sole power of the artist’s thoughts. This discounts so much of the great art that has not made the artist rich and famous, or even happy.”</span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 12pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 12pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">After being spanked by many members of the website community, and even by my friends, for insensitivity, I was asked by a very close friend whether any of this had changed my mind on Oprah, and I had to reply no. What it has done, though, is make me think about myself a bit more, which was not likely the goal of the creators of this book. Would the sensitive Oprah crowd want to see a vehement Oprah hater turned into a narcissist? Though it is nothing to be proud of, that is indeed what has happened.</span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 12pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 12pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">Oprah represents more to me than I think I have let on in my blogging blasts at her, or in my family fights with my Oprah-adoring mom around the Thanksgiving table. It actually goes back to high school for me, where, like most adolescents, my ideas emerged from intuition into full-fledged viewpoints. Oprah’s show was new when I was in high school, and though I am of a different generation than Oprah, her show was in many ways trying to find its identity just as I was trying to discover my own. I have looked through some of those early season topics and can understand why every day in the winter (in the fall and spring I was at cross country and track practice) I would curl up on the sofa and watch Oprah at 4:00. Titles like “Trouble Getting a Date?”, or, more provocatively, “Snobby, overbearing, or just plain hard to deal with...Rude, obnoxious people turn off their friends”. I had both of these issues as a teenager, as many teenagers do. To hear an adult host a serious forum to discuss those issues legitimized my anxieties. It is no wonder that my generation has helped to make Oprah a billionaire. She was there for us when we needed her</span><span style="background-color: transparent; color: #333333; font-family: Arial; font-size: 12pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">. </span><br />
<span style="background-color: transparent; color: #333333; font-family: Arial; font-size: 12pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 12pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">In the television business, when a drama has lost its original intention it is said to have “jumped the shark”. Examples of this can be seen everywhere, from “House” dating Cuddy to “Chuck” hooking up with the pretty CIA girl. This is why HBO and Showtime remain so good: they never jump the shark. In “The Sopranos” Tony never leaves the mafia. In “The Wire” drug dealers still kill, and cops still get drunk and drive around town hurling bourbon bottles from their cruisers. Oprah, on the other hand, was not content connecting with us anxious but rather normal, rational people. Instead she jumped the shark in the most flagrant of ways. She started moralizing. She grew a heart, but that heart was so bogged down by superstition, and by the realization that her every word would be followed, that the show became a religion. The problem, in my view, came in 1998, when she announced that she would de-tabloid her show. This is not such a bad thing in itself, but instead of de-tabloidization she created a new type of sensation, which I would call pseudoscientification. Oprah became the equivalent of L. Ron Hubbard, who went from writing sci-fi novels to founding Scientology. America followed her down that path and enriched her along the way. By 2000 her spiritual quest had fermented, and that evangelism became the focus of her career. The Wall Street Journal called this effect “Oprahfication”, and explained that Oprah had embraced a public form of therapy.</span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 12pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 12pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">What I hadn’t realized is that what she also did, by jumping the shark, was leave the rest of us in a tank full of piranhas. She had perfected a type of TV, but when she left to become a cult icon, we were left first with Jerry Springer, and then with a myriad of reality-show hell. There was no room for a 16-year-old sitting on the sofa listening to honest and interesting people talk about not being able to get a date. Instead it is now Oprah telling the world how to find an elusive inner being. Or lending credibility to the now-proven-fraudulent anti-vaccine movement. Or Oprah peddling a delusion of free will and personal success while the viewers grow ever more frustrated with their own unresolved lives. Perhaps this is why I am so anti-Oprah. She has replaced the simple questioning of normal issues with a disaster of false hope and naivety. That said, “Skins” did not exist on MTV when I was 16, so perhaps I would have had something to watch after all.</span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 12pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 12pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">Thinking of Oprahfication as a new cultural norm has become an excuse for me, and by its very popularity has made me an elitist. It is tempting to think that this was not the case. Oprah programming of old was mainstream programming. Still, it is not those early programs but the extremity of Oprah's success and popularity in the days since that separates me from large portions of American society. In fact, tracing my own personal development, I can start in 1991, when Oprah had the dating and fighting-friends episodes, and watch as my life has unfolded in direct opposition to hers. Of course she has become a billionaire, but that is only the most obvious of the divergences. The greater one is a move away from banality towards two opposing yet profound worldviews that separate my psyche from an Oprah psyche more than politics, class or even religion. </span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 12pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 12pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">Aging is of course different for everyone, but the struggle of acknowledging physical limitations, and how we deal with illness and fear, may be the biggest divide between the Oprah worldview and my own. Like most people I have faced sickness and tragedy, to a greater extent than some but to a much lesser extent than others, Oprah included, who suffered so greatly as a child that I can’t even fathom the pain and its repercussions. Still, these sicknesses and anxieties have shaped me, as I am sure her suffering has shaped her. The difference is that her suffering has given her a belief that everyone has a power that extends beyond themselves. I am every day humbled by the opposite: by the fact that I am powerless, yet still loved and alive. I don’t need an inner strength, or a strength from God. This is where Oprah’s world and mine differ. We both want to conquer fear and mortality, but while I think the only hope comes through physics and material action, she believes it comes from spiritual and supernatural strength. These views are so different that they form a line in the sand of existence, which Oprah herself has helped to draw, and I am happy to take my place on the other side.</span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 12pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 12pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">I titled this blog “How I Came to Hate Oprah”. This is meant to be provocative, and a little ridiculous. There is no reason for me to hate Oprah, and I don’t. She is clearly a deeply sensitive and caring person. Maybe a better title would have been “How I Came to Hate Oprahfication”, but that is too sociological, and not personal enough. So in retrospect I think the best title would be “How I Came to Know Myself in the Age of Oprah.”</span></div>Anonymoushttp://www.blogger.com/profile/07835248533705400567noreply@blogger.com2tag:blogger.com,1999:blog-6105237826208060810.post-29912743430026088202011-01-22T16:47:00.000-05:002011-01-22T16:47:29.485-05:00Designing my Epiphany<div class="separator" style="clear: both; text-align: center;"></div><table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhzTOO09gc9Vml1h7Zv_BSLO81nJMCW2xYeuDzJtZrKkcs5XQUTo_5BTOnJ6LkSOR4Scjjx-ahyphenhyphenzoKtoqoH0j4hvS0huZasVK7DtcSPpZqr67UW-N5nL724nwM98aEU3Ef72TGp032YiZeV/s1600/RW-stained-glass-closeup-150x150.jpg" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhzTOO09gc9Vml1h7Zv_BSLO81nJMCW2xYeuDzJtZrKkcs5XQUTo_5BTOnJ6LkSOR4Scjjx-ahyphenhyphenzoKtoqoH0j4hvS0huZasVK7DtcSPpZqr67UW-N5nL724nwM98aEU3Ef72TGp032YiZeV/s1600/RW-stained-glass-closeup-150x150.jpg" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">By Karen Starr</td></tr>
</tbody></table><span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">There is a design department at the Museum of Modern Art in New York. This is not recent news, as it has been a department since 1943, yet for some of us it has taken years to move seamlessly from Van Gogh to Stickley. The same is true within my own home, where I tend nearly universally to ignore design while embracing art. I have contemporary art and Chagall, all on boring white walls, with a range of furniture bought from Ikea, Sears and Jennifer Convertibles. I don’t take pride in this, or even consider it a grand aesthetic choice. I like nice furniture, but for the price of that nice furniture I could buy a painting by a young artist whom I admire. For me that choice has always been easy to make. When my long, long, longtime friend Karen Starr started her business, I began to change the way I thought. Suddenly it was possible for me to think about both design and art, without necessarily a compromise on either. I have done nothing about this, because I tend to over-analyze the decision. Last week when I visited <a href="http://hazeltreeinteriors.com/">Hazel Tree</a>, Karen's business, I had a bit of a catharsis regarding design as not only an aesthetic choice, but also an environmental and sociological statement. <a href="http://karenstarrredesign.com/contact-me/">Karen’s site</a> explains this so clearly, and so often, that I must have been especially thick not to have recognized its point. I think, though, that it came, like so many things, from a convergence of momentary experiences. </span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">I have worked for many years in industries that have questionable reputations. For instance, I have worked with, and I teach, polymer technologies; that is, plastics and rubber, which we all know have a detrimental impact on our environment. I have justified my involvement because I work on research and on instrumentation that can make these products better, and on educating students who can come up with newer and more sustainable ideas. At the same time I have rejected recycling as a useful solution for a world with too much plastic. I have said that small moves like recycling can do one of two things. They can make people feel like they are being productive when they are really doing very little. This is not so terrible, but it feels a bit like we as a society are lying to ourselves. The larger problem is when recycling actually does more harm than good, such as paper recycling in rural areas, where the carbon footprint of the recycling process is larger than that of using new paper. So knowing all of this, I have had to think of ways in which a future of polymer manufacturing, and new polymer products, will result in a net gain for society, both in lifestyle and environmentally. </span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">After visiting Hazel Tree and after speaking extensively with Karen, I realized that our goals were not only similar in an abstract way, but even symbiotic. After all, furniture is made of materials, and the use of those materials is not insignificant. Karen has done something very different from recycling, which is reusing. I don't like antiques or old things in general, so I have not been attracted to them, even though they broadly fall into the category of reused. What Karen does is to re-imagine and revive while reusing. This is the type of system that benefits society and the environment, and it serves as an inspiration to me. Before long I will have a lab where we reuse old concepts and materials, and a house with redesigned, and finally beautiful, furniture.</span>Anonymoushttp://www.blogger.com/profile/07835248533705400567noreply@blogger.com0tag:blogger.com,1999:blog-6105237826208060810.post-11153368027287819572010-12-08T20:36:00.000-05:002010-12-08T20:36:02.901-05:00Facing Eternity In San Francisco<div style="background-color: transparent; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px;"><a href="http://upload.wikimedia.org/wikipedia/commons/thumb/b/b8/Aubrey_de_Grey.jpg/180px-Aubrey_de_Grey.jpg" imageanchor="1" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" src="http://upload.wikimedia.org/wikipedia/commons/thumb/b/b8/Aubrey_de_Grey.jpg/180px-Aubrey_de_Grey.jpg" /></a><span id="internal-source-marker_0.2383074308745563" style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">Last night I was walking to a technology event in San 
Francisco, and being early and thirsty from a reckless run up and down Lombard Street, I found a pub to grab a beer. Though it was dark, I noticed an absolutely impossible-not-to-notice character in the science world, </span><a href="http://en.wikipedia.org/wiki/Aubrey_de_Grey"><span style="background-color: transparent; color: #000099; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;">Aubrey De Grey.</span></a><span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"> De Grey has a beard that goes below the waist, making the fact that I wear Mickey Mouse socks and play the banjo look much less eccentric. I have written here before about De Grey </span><a href="http://mcputman.blogspot.com/2010/03/jellyfish-frog-and-man.html"><span style="background-color: transparent; color: #000099; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;">(The Jellyfish)</span></a><span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"> and the philosophical struggle I have, which wavers between a desire to embrace his research to engineer immortality and an urge to brush it aside as harmful delusion. I introduced myself, and, unsurprisingly, he is a major intellect who enjoys a pint and has an excellent sense of humor. When I told him how much I liked living in Paris, he said something insightful that is not usually pointed out: “Why did you like it? 
Certainly it wasn’t for the beer.” So within 20 minutes I was again thinking that it might be possible to live forever, even though we never talked about it directly.</span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">The evening was intriguing, mind-boggling, and encouraging, and the food was delicious. This was a group of 200 or so Silicon Valley insiders who were there to talk about a range of foundations that are especially forward thinking, one of which was </span><a href="http://www.sens.org/"><span style="background-color: transparent; color: #000099; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;">The SENS Foundation</span></a><span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">, of which De Grey is a founder. The noble pursuits of the group were on the border of science, economics and possibly science fiction, which is a place I am comfortable inhabiting. With all of the interesting people I talked to, I saw every conversation through the lens of immortality, as represented by De Grey. Suddenly my reliance on quantitative data became a little less stringent than it usually is amongst scientists. All of the science presented was well researched and intelligent, but much of it exists just beyond the complete technological, or even mathematical, grasp of our time. This is important, I think. I write about feeling on the fringe of science and music, and the respect I have for the fringe. What I realized from this event is that some of the smartest and most successful thinkers and entrepreneurs in the country are not so much on the fringe as off the table completely. In this frame of mind a singularity, nanobots and even living forever are technical challenges, not fantasies. 
For many of these people, science and technology fill the place in their lives that the combination of science and art fills in mine. So is a garage start-up mentality dedicated to eternal rejuvenation (or super humans, or nano robots) a way of dealing with existential dread, or more simply an act of the curious inventor? Is either of these more noble than the other? If an unexamined life is not worth living, does it mean that the search for immortality is examining life more or less? The filmmaker Darren Aronofsky made a film I liked very much called “</span><a href="http://www.mrqe.com/movie_reviews/the-fountain-m100007118"><span style="background-color: transparent; color: #000099; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;">The Fountain”</span></a><span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">. When asked in an interview why he made this film about the search for the fountain of youth, set in the past, the present, and a future in a floating bubble in space, he said that it was to show that at some point, regardless of how long we</span><span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><i> can</i></span><span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"> live, humans need to come to terms with death. 
It is perhaps the greatest emotional and intellectual chasm I have: that I believe De Grey and Aronofsky at the same time.</span></div>Anonymoushttp://www.blogger.com/profile/07835248533705400567noreply@blogger.com1tag:blogger.com,1999:blog-6105237826208060810.post-29258493922868062002010-10-28T09:06:00.000-04:002010-10-28T09:06:44.140-04:00Will Thinking in Pop Culture<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhikJn5aeYGsyuG4ZVSTG84yiZTqSZr93_uUAjh1Ic60G_wf61HwdUCDFh6amdj_CaNfCm6r9tF3cvpjJYpbPKHCXUaASA62PBsohyphenhyphenOgT3F1NCWWTFtbspnwhC69amwRTMRUiHgojJw08i9/s1600/freedom.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="200" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhikJn5aeYGsyuG4ZVSTG84yiZTqSZr93_uUAjh1Ic60G_wf61HwdUCDFh6amdj_CaNfCm6r9tF3cvpjJYpbPKHCXUaASA62PBsohyphenhyphenOgT3F1NCWWTFtbspnwhC69amwRTMRUiHgojJw08i9/s200/freedom.jpg" width="133" /></a></div><br />
<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgHuipq2Oa16jqOhq4nv3JClzWz8JZQn0cDvDbTuzio1ge-_g5PAiOs0b3nP6O7kyTlazo78Pjb5LshiQkdYGdrG-FYTbKuCxDWHKf3LWkEuSoC2Pr171W9gPgRnHPK4yT5Etpoajp9hajH/s1600/fringe-poster-1.jpg" imageanchor="1" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" height="200" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgHuipq2Oa16jqOhq4nv3JClzWz8JZQn0cDvDbTuzio1ge-_g5PAiOs0b3nP6O7kyTlazo78Pjb5LshiQkdYGdrG-FYTbKuCxDWHKf3LWkEuSoC2Pr171W9gPgRnHPK4yT5Etpoajp9hajH/s200/fringe-poster-1.jpg" width="134" /></a></div><div style="background-color: transparent; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px;"><span id="internal-source-marker_0.49347927630878985" style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">I have felt pushed into a prison of disconnect with pop philosophy, which can lead down such promising paths, only to have them destroyed by pseudo-science, superstition, and religiosity. Most obviously this is “</span><a href="http://www.amazon.com/Secret-Rhonda-Byrne/dp/1582701709"><span style="background-color: transparent; color: #000099; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;">The Secret”</span></a><span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">, that best seller of self-delusion, and Deepak Chopra, the man most responsible for teaching the world the words quantum mechanics without having the slightest idea of what he is talking about. 
I know that I, and thousands of bloggers, make this complaint so often that we have become like street-corner preachers, who are unheard even though they boom their voices through megaphones and amps. This post, however, is not about that, but rather about signs that American culture may be as polarized as American politics. This is actually a relief to me, as I have not heard scientific reason applied to popular culture in any major way in a long time. I will refer to the practitioners of the movement mentioned above as the willful thinkers. They tie various unrelated elements in science, which they don’t understand, into a basic theory that puts free will in a more powerful place in society than it has ever held historically. Never before have people connected neuroscience, physics, and free will in such broad ways. This type of self-determination likely has some commercial motives, like convincing people to use a lot of credit on useless books, because they are capable of earning enough money to pay it back. Or it could be politically motivated, meant to make people feel they are never stuck in despair and instead vote for candidates who promise that they will change things. It basically puts the responsibility for happiness and success on the individual, via some spiritual connection with a universal energy. The reasoning goes that even if your individual energy is limited, your </span><span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: italic; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">will</span><span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"> can allow you access to a larger life force that can aid in success. 
This is a lot of science-like talk which really is just explaining free will the same way it is explained, rather unconvincingly to me, in the Bible. </span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">There are books that I consider rational alternatives to this thought, but they have a significantly smaller readership. Just this week, though, I had two examples, one light on substance but entertaining, the other much more profound. </span><a href="http://www.fox.com/fringe/"><span style="background-color: transparent; color: #000099; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;">"Fringe"</span></a><span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"> last week had a compelling plot line in which (in a parallel universe, but that is not so important) a man was in a drug trial designed to increase the intellectual range of very low IQ people. The trial worked better than expected, and this particular man ended up being hundreds of times more mentally capable than other humans. I know it already seems hypocritical for me to blame the will thinkers for scientific faking, when the entirety of “Fringe” is so clearly unscientific. That is the case with this episode, as I don’t believe the human brain has superpower capabilities, but it still provided a nice metaphor for free will. The character was not only smart, but extremely proficient in probability, so much so that he could predict future events. In other words, he understood the deterministic nature of existence and could connect the dots from the past and present out into the future. This is more thought-provoking than the average prime-time sci-fi episode. 
It makes us think about a relevant question: how much information would we, or a computer, need to have in order to statistically know future events? That then leads to the question of whether, if such events can be mathematically predicted, there is any role for free will.</span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">The other, deeper look at free will is the large and brilliant new novel</span><a href="http://www.google.com/products/catalog?rlz=1C1SKPC_enFR372FR372&q=freedom+franzen&um=1&ie=UTF-8&cid=2874931126921248933&ei=y3HJTMLOHoT7lweSxuyAAg&sa=X&oi=product_catalog_result&ct=result&resnum=3&ved=0CCoQ8wIwAg#"><span style="background-color: transparent; color: #000099; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;"> “Freedom”</span></a><span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"> by Jonathan Franzen. It is the story, through various perspectives, of a woman, a family, and the ideals of recent generations. There is too much here to talk about, other than to refer back to the title, “Freedom,” where we see Franzen’s characters forever unable to escape the past in order to create an independent future. They are, like all of us, trapped by causality, and therefore freedom itself eludes them.</span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">It may be that answers such as will thinking are the most satisfying, which is why they may always remain in society. That said, a slow enlightenment seems to have risen on the horizon of mainstream culture, at least enough to start making us all question what it means to be free.</span></div>Anonymoushttp://www.blogger.com/profile/07835248533705400567noreply@blogger.com0tag:blogger.com,1999:blog-6105237826208060810.post-29198807272053086232010-09-27T14:50:00.002-04:002010-09-27T14:50:29.776-04:00Specializing in Everything<div style="background-color: transparent; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px;"><span id="internal-source-marker_0.9519708207808435" style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">I read biographies too much perhaps. They tend to make me feel a bit inferior, but I always consider that the inspiration from reading about Benjamin Franklin, Mark Twain or Joe Louis far outweighs the sting of knowing that I will never be a founder of the world’s largest democracy or win the heavyweight championship. It is not unusual to be more admiring of historical heroes, as they are no longer around to let us down. In many ways it also makes me feel privileged to be living in a technological age that many of these people, the three above included, were not lucky enough to be a part of. It would be difficult to invent aboard a ship crossing through pirate-infested waters on the way to France as Franklin did, or write books without spell check like Twain, or box... well, that is pretty much the same as always. 
Where genius becomes more complicated in the modern world is not in the areas that can be aided by science and technology, but in the fields of science and technology themselves. </span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">I decided to work for a Ph.D. when I was 29 years old. I had already worked several different career-type jobs, from producing plays to managing sales and marketing for my parents' business. Luckily for me, that business was a technological one, where I was exposed to the exciting worlds of chemistry, physics and computer science. Exposure is nice, but when I mentioned to real scientists that I wanted to get a Ph.D., they were encouraging, with a caveat. They said that in modern science it was important to be specialized, and I tended to be a rather scattered generalist. This was, and I think still is, the common wisdom, which is easy to understand if you look at academia. Knowledge in each small field has become so great that it takes years to learn everything about a single problem. </span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">The Nobel Prize-winning neuroscientist Eric Kandel, when asked at a conference how a young scientist should choose an area of research, said to pick something that takes a lifetime to solve. This statement seemed like a call for focus and specialization until I considered Kandel’s career. Kandel is in his seventies, working hard on a problem. It is true that he has been focused, but that focus is on something extremely large: understanding memory. Kandel’s approach to this was to use theory, experiment and even Freudian psychoanalysis to get there. In other words, he is a specialist in everything it means to be a thinking being.</span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">Looking at the faculty of Columbia alone, I found another very well known example of the same type of contemporary specialization. Brian Greene, the author of three best-selling books, is a theoretical physicist who works in the highly specialized field of String Theory. Over lunch last month, a friend and I came to a rather obvious realization about Brian and String Theorists in general. The goal of this science is to find a link between quantum mechanics and gravity. It is often called the theory of everything, as it would be truly fundamental to our understanding of the entire universe. So how specialized can it be to be working on everything? </span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"></span><br />
<span style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-weight: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">All of this gives me perspective on the biographies of my heroes from the past, and on the innovators of today. Perhaps the advice to be specialized is both right and wrong. We need to be specialists on the big questions, because we have the time, the technology and the work of those geniuses of the past to help us.</span></div>Anonymoushttp://www.blogger.com/profile/07835248533705400567noreply@blogger.com0tag:blogger.com,1999:blog-6105237826208060810.post-85517542397683162502010-08-22T18:29:00.002-04:002010-08-22T18:33:35.960-04:00Hair and Bass in an Age of Apathy<span xmlns=''><p>I just moved back to New York from France, and it is my 36<sup>th</sup> birthday. I say this because there is a mist of unconscious nostalgia permeating the air around me these last two weeks, which certainly influences the ideas in this blog. There is a natural result of being back in August in the States, and that is that I am in my car more often, going to work with partners and clients in nearby states. My European friends and family stay at beaches until the start of September. I was happy to discover that I could get XM Satellite radio in my car, which meant for me (so I thought) a chance to listen to NPR continuously, rather than surfing for new stations between cities. I have done some of this, but listening to tales of the end of the Iraq war for hours made me feel sad and old at the same time, which is difficult on a birthday. So I switched to music stations, and instead of listening to my favorite jazz and classical stations I listened to 80's metal and 80's rap. These stations must exist to transport people of my generation, and they have worked to do that. They have not really worked to get me out of the aging and moving funk, though. 
The reason is that the music was so original. The contradictory crispness and saturation of <em>Guns N' Roses;</em> the revolutionary, sad, yet hilarious raps of N.W.A. When this music came out I listened to it, of course. Eventually I was even a DJ and played a lot of it. The 80's and 90's were looked at as a musical cesspool, while a large portion of society looked back to Beatles-era rock and Dylan protest music as the last throes of civil consciousness in popular culture. This made some sense, as my generation was more politically apathetic than the previous one, and wars were only being fought in secret, leaving no official regime to fight. Also the economy appeared to be strong, at least as it was presented by Reagan and Bush I. Growing homelessness and the rampant spread of AIDS were mostly ignored by popular music. I feel nostalgia then not for a time of progress, but for a time when certain segments, like metal and rap, were innovating, expressing not necessarily politically useful anger, but instead personal rage against loss, emptiness and marginalization. This made it perfect teenage music.<br />
</p><p>It seems now that perhaps contemporary serious jazz musicians and classical performers are revisiting some of this music, by deconstructing, reinterpreting, and in a sense calming the fire to find the remains of red-hot embers. I have heard Vijay Iyer play M.I.A. and Michael Jackson; I have heard Yaron Herman play Nirvana. I have heard Brad Mehldau play countless 80's and 90's rock, punk and rap classics his own way. The band <em>Wake Up!</em>, with whom I was proud to perform last week, doesn't dissect directly but refers with full force to those genres, bringing us backward into the past and forward into the future at the same time. I am not sure if this is a nostalgic journey for them, but for me taking the morsels of interest from the past and finding a musically relevant voice for them gives us a history while influencing the present. This is not new of course, as Dvorak, Stravinsky, Chopin and Liszt all used folk music as a basis for creating contemporary music. I guess the sad part is that the music of my youth is now the ruins of a time past. It is a folk history of big hair bands with killer guitar solos, and bouncy cars with giant subwoofers. In other words, I am OLD.<br />
</p></span>Anonymoushttp://www.blogger.com/profile/07835248533705400567noreply@blogger.com0tag:blogger.com,1999:blog-6105237826208060810.post-73244511596925057732010-08-18T23:55:00.000-04:002010-08-18T23:55:28.382-04:00POÄNG or Wassily<div class="MsoNormal">Could IKEA be the utopia of 20<sup>th</sup>-century modernism? Is it the populist achievement of revolutionary Bauhaus design, the architectural industrialization of Mies van der Rohe, and the physical embodiment of Mondrian minimalism? At first glance, or from some distant academic watchtower, it would appear so. IKEA would also seem to be an internationalist victory of sorts. The Swedish behemoth offers sleek design at cheap prices, and nearly everyone goes there at some point to buy furnishings for a dorm room, a first apartment, a baby’s room, or, for some of us, a seeming lifetime of bookshelves and dressers. </div><div class="MsoNormal">This summer I visited Weimar, Germany, where my main tourist goals were to see the Goethe and Schiller homes. Still, having taken a great Bauhaus class at MoMA in 2005, my friends and I visited the Bauhaus museum, which was the site of the original Bauhaus school and studios. The Bauhaus is interesting because it really combined industrial means and high art for the purpose of providing design for all of society. It moved away from its original arts and crafts ideas to do this, and produced some of the most recognizable furniture and architecture that we associate with the 20<sup>th</sup> century. The museum was interesting in both its contrast to Goethe’s romanticism, and its big-picture similarities. That is, Goethe was a singular artist and scientist, but was a populist in many ways. Bauhaus did the same, but for a new age in which individualism was being replaced by political group efforts such as communism, and by consumer industrialization such as cars. 
The Bauhaus artists were futurists as much as modernists, in that they were predicting a future of modularity, simplicity and raw form. How nice would it be to see them as the prophets of this institution, IKEA, which so many of us use?</div><div class="MsoNormal"><br />
</div><div class="MsoNormal">I believe, however, that IKEA represents one of two options where the Bauhaus prophecy is concerned. In the first, IKEA is the future that the Bauhaus predicted and influenced, and it manages to fill me with emptiness and anxiety. In the second, it is not at all what the Bauhaus actually wanted, and I would therefore be drinking schnapps with Walter Gropius, complaining of long days shopping and weekends with Allen wrenches. </div><div class="MsoNormal"><br />
</div><div class="MsoNormal">My dislike for IKEA comes with a certain amount of both guilt and plain old self-doubt. After all, I should be happy for IKEA and all of the shoppers who have filled their homes with those products. The stuff looks nice and it’s cheap. The problem for me is that it sucks the creativity out of furnishing a living place, creating instead a delusion of choice. We feel that we are going to IKEA, which is a gigantic warehouse, and can choose the furniture that is right for us. In fact, though, everyone who is even a little bit like us will buy many of the same things. We have friends with the same pieces we have. My daughter’s bed is the same as her friend’s bed. As Pete Seeger lamented in “Little Boxes”, the song about suburbanization, “they all look just the same”.<span style="mso-spacerun: yes;"> </span></div><div class="MsoNormal"><span style="mso-spacerun: yes;"><br />
</span></div><div class="MsoNormal">The possibility that this is not what the Bauhaus envisioned is also very convincing. The need to assemble cheap particle board for hours is not the same as mass producing a Bauhaus chair and selling them as a complete chair. Another key difference to me is the IKEA inclusive look, which I do not relate to Bauhaus. That is, people buy all of their furniture from one store, so the styles are basically all the same, even though the designs are called something different. Bauhaus and other twentieth century minimalism stressed repeatability and simplicity, but every artist had a unique interpretation of what that was. Mondrian and Malevich were geometrical but nothing alike, as are Eames chairs and Wassily chairs.</div><div class="MsoNormal"><br />
</div><div class="MsoNormal">This all may be me again putting off IKEA assembly, while my wife slaves away at it. It might also be that I am a snob, and would like to buy more expensive furniture. I don’t think, though, that either of these is the main reason. Mostly IKEA causes me anxiety, and I am trying to understand how such a nice place with such a nice philosophy can do that to me, even though I love to eat meatballs and drink lingonberry juice. I think it is because I recognize that there is something cynical about IKEA. It is a dream, an idea, and now a way of life, based not on creativity and people but on the perception that it is. The Bauhaus may have been for the masses, but it was designed with care and creativity by individuals. IKEA is a mega-company of committee design, led by market analysis and quarterly stock valuations. This is not to say that it isn’t useful. It is just not the dream store of the Bauhaus, or of me.</div>Anonymoushttp://www.blogger.com/profile/07835248533705400567noreply@blogger.com3tag:blogger.com,1999:blog-6105237826208060810.post-41555048156909751992010-07-14T09:05:00.000-04:002010-07-14T09:05:22.921-04:00An Eternal Thinker<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgRm5wuZqeNeHhsmBCWrspnHs9BM3McMOmIL7-RjK_CKW9rvYTMD5GsdbKK6P0jQcYlR25rnaUo2NSQ5WROUkzGE9yK-cS2zCjJL2-sA0Mf9nOcs0_lgf7trkpvap3PdUWGo8WqRnUeSPb_/s1600/thinker.JPG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgRm5wuZqeNeHhsmBCWrspnHs9BM3McMOmIL7-RjK_CKW9rvYTMD5GsdbKK6P0jQcYlR25rnaUo2NSQ5WROUkzGE9yK-cS2zCjJL2-sA0Mf9nOcs0_lgf7trkpvap3PdUWGo8WqRnUeSPb_/s320/thinker.JPG" width="165" /></a></div><div class="MsoNormal"><br />
</div><div class="MsoNormal">There must be hundreds of reasons why people collect art. It can be as an investment or as inspiration. It can be as a way to see a reflection of yourself or society. It can be a way to remind you of human potential, or of human folly. It can be to surround yourself with beauty, or to contrast with nature’s beauty in order to appreciate the space in which the art rests. One thing that I think this wide dispersion of reasons has in common is speculation. That is, speculation on the monetary value of a work of art or on its influence on you. Both of these are ways of peering into the future. There is also something unique about bronze sculptures. Bronze is as close to eternal as humans know. Bronze will outlive not only us, but outlive canvas, outlive paint and could be one of the few archeological treasures of future alien visitors who will find only this art as a reminder of humanity.</div><div class="MsoNormal"><br />
</div><div class="MsoNormal">My family bought a sculpture from my friend, the artist Mark Pilato, which is titled “The Modern Day Thinker”. If you had not already seen the photo of it here on this blog, your first thought would be to notice the reference to Rodin’s famous work, called simply the “Thinker”. Rodin’s “Thinker” needs no description, as it is one of the most famous sculptures of any period. Rodin must have created this work in a moment where time stood still long enough for it to exist neither in modernity nor in antiquity, but rather in universality. The “Thinker” is a strong man. He is a worker, hardened by labor, but confounded by self-reflection, not by action. I can think of nothing more meaningful to the struggles of humans, who more than any other animal are lost in their own silent ideas after the labors of days that fail to fill the empty space which surrounds consciousness. <span style="mso-spacerun: yes;"> </span>“The Modern Day Thinker” is equally timeless, but seems to push beyond physical constraints in ways that Rodin did not intend with his own work. In this work the Thinker is feminine, but not a woman or a man. The Modern Day Thinker is sexually provocative, but without sex. I refer to the piece as she because I do not want to reduce her form to an “it” merely because we cannot readily acknowledge gender. Her thinking differs from that of the Rodin "Thinker". It is not the thinking of someone filling the void left after physical effort, as in Rodin’s sculpture, but thinking that fills the entirety of existence. <span style="mso-spacerun: yes;"> </span>This may very well come from Mark’s admiration for Dante’s “Divine Comedy”. This Modern Day Thinker is like Virgil trapped in Limbo, eternally gazing upward to paradise, while forever unable to make that journey. 
This gaze is not quite a gaze, since “The Modern Day Thinker” has no eyes; it is instead an ethereal gesture of anticipation through inward reflection.</div><div class="MsoNormal"><br />
</div><div class="MsoNormal">If it is true that collecting art is about speculation, and that bronze is near eternal, how can we reconcile our finite lives with the near-infinity of the art? “The Modern Day Thinker” embraces the paradox inherent in this question. Her geometry is both a minimalist reduction and mathematically complex. Unlike sharp edges, whose geometry is easily solved, “The Modern Day Thinker” wraps, warps and curves its way through space. Maybe this is why she is so beautiful to me. She makes me contemplate and speculate on the future, but leaves enough mystery to make that future ambiguous and exciting.</div><div class="MsoNormal"><br />
</div>Anonymoushttp://www.blogger.com/profile/07835248533705400567noreply@blogger.com0tag:blogger.com,1999:blog-6105237826208060810.post-80326486042970011692010-07-12T09:58:00.000-04:002010-07-12T09:58:04.893-04:00How to win a chili making contest<div class="MsoNormal">Sammy Davis Jr. loved to cook. When I first heard how dedicated he was to cooking it surprised me, because he was always traveling on tours, and filming. Usually this lifestyle is a restaurant-based existence, but Sammy traveled with all of his pots and pans and knives. Many of his friends commented on what a good cook he was, including Bill Cosby, who said that Sammy was a true gourmand. He never used recipes or wrote down what he had done. Instead, Cosby said, if Sammy made a truly remarkable meal you had to simply live with the memory of it, as he would never be able to recreate a dish. I heard this quote when I was a teenager, which was the same time I was having similar experiences at home, where my father (who was new to the kitchen) began to approach cuisine with a gusto of invention inspired by his travels to Mexico, Asia, Europe and Israel. At the time there was only one repeatable thing in Dad’s cooking, which was that everything was extremely spicy. This served to separate the men from the boys, or in our case those who had ulcers already from those who would soon be getting them. Dad was not the only improvisational cook around. I was soon to discover underground chefs in my home town of Akron, many of them men, who were not simply weekend barbecue beer drinkers. This was before the cooking television craze, which seems to have made gourmets of couch-potato corn dog eaters. In my view, though, this small unsung group of hard-working friends from my community was much more interesting, as was the food they made. Someone who comes to mind was a long-time engineer at our family company Tech Pro. 
This engineer, Don Watson, and I would spend a lot of time at company parties talking not about sports or technology, but about cooking. Don had tweaked traditional Akron cuisine the way Picasso did African masks, making it his own expression. Though I should in good taste keep Don’s reputation intact, as well as that of the other great cooks at Tech Pro in the same tradition, such as Harold Vunderlink, I cannot go without mentioning that I married someone who was more than up to the challenge of competing in an area in which these others were truly expert: chili making. </div><div class="MsoNormal"><br />
</div><div class="MsoNormal">I had of course eaten chili my whole life, but my wife Marine, being from France, had not, nor had either of us ever made it before. We entered our company chili making contest as extreme underdogs for the Halloween contest of 2005. To our surprise, Marine and I won. A competition like this is subjective of course, and it is not certain that we deserved the award against such formidable competitors, but I did learn something which I keep in mind in most things I do. Marine didn’t have preconceptions of a good chili, only a rough idea of the ingredients normally used. Therefore she made impromptu substitutions, which made the chili unique. She used black beans instead of red. She used cilantro and crème fraîche. She used tofu burgers instead of hamburger. This was a proud day for us.</div><div class="MsoNormal"><br />
</div><div class="MsoNormal">This is why I don’t like cooking shows. They are like those old painting shows I remember from childhood television, where a very boring artist teaches you how to paint a beautiful landscape. They are false, and lack spontaneity. Cooking is now like every prepackaged food, only slightly longer to prepare. My suggestion is an outing of the closeted cooks in companies, who labor by day and invent masterpieces in the kitchen by night. As I have said about free jazz, poetry, origami, graffiti, science and living in general, true innovation will come from a mix of intellect, intuition and chance. For this reason I keep picking out new vegetables and meats, hoping to stumble across the next great meal, or even the next great chili.</div>Anonymoushttp://www.blogger.com/profile/07835248533705400567noreply@blogger.com1tag:blogger.com,1999:blog-6105237826208060810.post-57245170592718018772010-06-30T04:21:00.000-04:002010-06-30T04:21:12.474-04:00Is the Margin the Whole Page?<div class="MsoNormal">In 2008 I wrote a <a href="http://mcputman.blogspot.com/2008/11/fringe.html">blog</a> about the feeling of being marginalized by the things I do, such as polymer physics, free jazz, and experimental theatre and poetry. I wasn’t so much feeling sorry for myself as just confused by how humans can be so alike, yet have such a broad dispersion of interests. Since I wrote this I have started to see that marginalization is being moved from its long history outside the box of societal norms into a new type of box, where the most bizarre, funny, brilliant and creative people go. This box is not full of financial rewards. It is instead a place of self-esteem, minor recognition and community. It is easiest to see this when looking at the connectivity made possible through social networks. 
Clay Shirky in his new book <a href="http://www.amazon.com/Cognitive-Surplus-Creativity-Generosity-Connected/dp/1594202532">“Cognitive Surplus”</a> provides dozens of examples of groups who have found homes on Facebook, Nings, and fan sites. Through this he says that whether you are interested in macramé or sci-fi comic books, you will have a group of like-minded friends to share with virtually. Even <a href="http://www.jaronlanier.com/">Jaron Lanier</a>, who is critical of Web 2.0-style mob-mentality networking, is part of a rare-instruments forum, where he can share his own music and collections with others who respect this music as he does. Shirky and Lanier, like bloggers and traditional journalists, differ about how far this should and does go. Shirky feels strongly, and cites academic studies showing that people are perfectly willing to do things they care about without financial reward. If we replace the time we spend watching TV with time spent on our hobbies, there is no financial loss, just a gain in self-esteem. Lanier is not convinced that taking professionalism out of all media, and creation in general, is a good idea, as it lowers the overall quality and gives wealth to those who are not doing the creating, such as large corporations and advertisers. Both make strong points, and this is an internal argument I will continue to have.</div><div class="MsoNormal"><br />
</div><div class="MsoNormal">What interests me even more is the underlying psychology of finding deeper meaning in things that society has generally considered fringe behavior. This may very well be internet-enabled, but it is not strictly an internet phenomenon. My wife and I saw an intriguing Argentinean film yesterday called “<a href="http://www.imdb.com/title/tt1517238/">Puzzle”</a>. The premise of the movie is extremely simple. A 50-year-old woman, who has taken care of her husband and late-teenage sons, receives a jigsaw puzzle for her birthday. Not being able to sleep, she tries the puzzle, and realizes how fulfilling it is to do puzzles. She then searches out more puzzles, and finds a wealthy puzzle master who teaches her how to be a competitive jigsaw puzzle player. I won’t give away the rest, but there really isn’t much more to it, yet this film is strangely moving, and speaks to the contemporary sentiment of self-realization through formerly marginalized activities. </div><div class="MsoNormal"><br />
</div><div class="MsoNormal">A margin is in general a small portion of a page. Maybe, though, margins in society are not margins at all. Maybe they actually are the whole page. Perhaps I am not so odd in the activities I like, or maybe I am common because everyone has odd activities that appeal to them? I hope that this continues to lead to self-empowerment for everyone, and if I am lucky others will join me in my passion for polymer physics and free jazz!</div>Anonymoushttp://www.blogger.com/profile/07835248533705400567noreply@blogger.com1tag:blogger.com,1999:blog-6105237826208060810.post-83481383556283199432010-06-28T05:51:00.000-04:002010-06-28T05:51:34.198-04:00The Skeptical Anarchist<div class="MsoNormal">If I look over my blogs and Facebook posts, I realize that two topics tend to come up more often than others: superstition and free expression. Though I have made attempts to connect a number of ideas together in the past, I think I have failed to unite these two accurately, mainly because there are enough internal contradictions that I am disposed to the very thing I condemn: belief. The belief I am speaking of is the creation of order through chaos. The music I like, and play, is free, even anarchistic in form. There is very little applied structure, yet when it is analyzed later by critics, musicians and listeners of any kind, cohesion can be found, even if that unity wasn’t intended. This happens to me all of the time. I speak with a musician and compliment them on the use of spectral dissonance through harmonic clustering, or something pretentious like that, only to hear “we were just jamming at midnight.” Now that doesn’t make my analysis wrong, or the intent wrong, but it does suggest that I am deeply involved in trying to mentally connect dots, even though the dots were actually laid out randomly.</div><div class="MsoNormal"><br />
</div><div class="MsoNormal">The wonderful documentary film “Between The Folds” <a href="http://www.greenfusefilms.com/">http://www.greenfusefilms.com/</a> explores a world of extremely serious origami artists and scientists, which I had no idea existed. This group includes a range of people, from the compulsive paper folder who creates lifelike animals with over 1000 folds, to an MIT mathematician who uses origami techniques to solve some of math’s most difficult questions, to a group of improvisers of the form, whom the film calls The Anarchists. The absolutely contradictory styles of the Mathematician and the Anarchists especially appeal to me. The Mathematician concentrates on the perfection of each fold in relation to consecutive folds, using computer models to enhance this. While this may seem crazy, in doing so he was able to solve not only some esoteric mathematical problems, but practical ones as well, like the most efficient way to fold an airbag for car safety. The Anarchists by contrast bunch and fold paper in completely random, chaotic ways. It is the free jazz of origami, and like free jazz it creates something that is both highly interesting and complex when analyzed. The forms they create may or may not resemble figures, but at their best they have inherently enlightening results. The Anarchists even perform improvised experiments on the completed forms, like seeing the effect of sunlight over time, or water, or heat. You in a sense viscerally learn things that the Mathematician would have trouble formulating. </div><div class="MsoNormal"><br />
</div><div class="MsoNormal">The fascination with these two approaches is what leaves my time split between the quantitative, experimental world of applied physics and the anarchistic world of free jazz and surrealist poetry. Somehow I think that by doing both I will be able to recognize patterns that are unique and surprising. Though I suppose there is nothing wrong with this, there may be nothing right about it either. In his recent TED Talk <a href="http://www.ted.com/talks/michael_shermer_the_pattern_behind_self_deception.html">http://www.ted.com/talks/michael_shermer_the_pattern_behind_self_deception.html</a>, the great skeptic Michael Shermer points out that animals (humans included, of course) are predisposed to search for patterns, even when they do not exist. There are good Darwinian reasons for this, because the consequences of not seeing patterns in life-and-death situations are immediately life-threatening. He describes a predator/prey situation. When an animal has heard a predator rustling leaves before, it learns to run. Therefore even if it merely hears the wind rustle the leaves, it is likely to run. It may have been wrong, but it was safer to be wrong. With more highly evolved pattern recognition we can arrive at spurious correlations, which in turn can do harm. Decisions about education, food, weapons and drugs that are based on false pattern recognition can do a great deal of harm, and do, all of the time. In fact Shermer points out that we are not even all that good at recognizing patterns. We see patterns in everything, and not always the correct ones, as he shows with slides of dots, some of which have embedded figures and some nothing. People will see figures where none exist, see the wrong ones where they do, and sometimes get it right. All is possible with our limited abilities.</div><div class="MsoNormal"><br />
</div><div class="MsoNormal">So what is the point of all of this introspection about my own abilities at pattern recognition? Perhaps it is to enjoy the anarchy even when there is no pattern to be found. There is truth in its own right in chaos, where our thoughts and anxieties so often reside. Then, perhaps, at some lucky moment, useful patterns will emerge that allow a communication between reasoning and freedom. </div>Anonymoushttp://www.blogger.com/profile/07835248533705400567noreply@blogger.com2tag:blogger.com,1999:blog-6105237826208060810.post-87893138899771957052010-06-12T15:32:00.000-04:002010-06-12T15:32:43.467-04:00The Life of The Party<div class="MsoNormal">It should have been that the most offensive thing I did for the World Science Festival was to write a blog criticizing the Templeton Foundation and the invitation of Francis Collins (<a href="http://my.technologyreview.com/mytr/social/blog/post.aspx?wuid=121430&bpid=746">http://my.technologyreview.com/mytr/social/blog/post.aspx?wuid=121430&bpid=746</a>). Though I don’t feel much differently about the fact that a religious fundamentalist should not be the most powerful government funder of science, I do think that I may have been too hard on the science festival. I went to the Collins event “Our Genomes, Ourselves” and found it thought-provoking, and completely scientific. Collins did not appear to be an extremist. What I really regret, though, is that even though this year’s festival was so wonderful, and my family was involved in a number of ways, I still managed to be on the obnoxious side at the Jazz Party that my wife and I were hosting. The issue was that our featured band “Wake Up!”, whom I consider the best live band in New York, was well liked, but several people complained that the music was too loud. Instead of being a gracious host I said things like “if you don’t want to stay, no one is forcing you.” I did this for two reasons, as inexcusable as they may be. 
The first is that the music is so important to me that I wanted to share it with the World Science Festival, which is also very important to me. It made me think that the people who wanted it quieter did not appreciate the brilliance of the band, and that frustrated me for artistic reasons. The other reason is more unreasonable, which is that I think music should be loud. </div><div class="MsoNormal"><br />
</div><div class="MsoNormal">My visceral response to loud, high-quality music is certainly not unique to me. The desire to create music at a level that physically alters its surroundings through power and volume goes back at least as far as Vivaldi and Bach. At St. Thomas in Leipzig, Germany, where Bach was music director for much of his life, he spent a great deal of time raising funds for large pipes for the organ. The purpose of large pipes is similar to the purpose of the large subwoofers that people put in their cars: it is so that you feel the low tones. Bach combined this with as many pipes of various sizes as possible, in order to have such a sound that the “Passions” were truly passionate. The floors, walls and ceiling of St. Thomas shook with beautiful, stunning and yes…loud music. </div><div class="MsoNormal"><br />
</div><div class="MsoNormal">I do want to make it clear that I don’t think Bach was equivalent to a car stereo system. There is something moving about the live experience. I also think this is true of “Wake Up!” which played unamplified at this party. I also don’t think that Bach or “Wake Up!” wanted to inflict pain. If the experience is actually painful, as it may have been for some, then that is a problem. Instead it is more a question of focus. While a party is about being with other people, if music is the centerpiece of that party it must also be the most present participant in the room.</div><div class="MsoNormal"><br />
</div><div class="MsoNormal">Another example of music that is felt, rather than just heard, comes from the inventor of the phonograph. Thomas Edison had very poor hearing to begin with, yet he was able to go to concerts and hear the music. Just hearing it was not enough for him. Once he invented the phonograph, he would literally bite down on the frame of the machine in order to feel the sound. As strange as this sounds, we know that the human ear in its present form is a rather recent evolutionary development. Mammals used to do much of their hearing through interpreting vibrations in their bones, eventually (like Edison) in their jaws. This was naturally a survival mechanism at the time, as mammals needed to hear dangers coming from nocturnal predators. Since instruments came so early to humans, it is likely that music has always been both felt and heard.</div><div class="MsoNormal"><br />
</div><div class="MsoNormal">This doesn’t excuse my rudeness, but it does explain my priorities a bit. The World Science Festival is successful because of its highly engaging blend of arts and sciences, and its ability to do this without dumbing down either the science or the art. Furthermore, the audiences really do come from various backgrounds, unlike at most conferences. At the Armitage dance event associated with the Festival, Steve Mirsky, the moderator of the science discussion following the performance, took an informal poll of the audience, asking who came from the arts and who from science. The audience appeared almost equally divided. This is such an inspiring and hopeful sign. My desire for people to feel the power of a new form of musical expression comes from the same place as my desire for people to hear Stephen Hawking, or watch Armitage’s dance. Like those two things it is important, and can’t be done quietly.</div>Anonymoushttp://www.blogger.com/profile/07835248533705400567noreply@blogger.com1tag:blogger.com,1999:blog-6105237826208060810.post-4164935915713673912010-05-07T09:40:00.000-04:002010-05-07T09:40:13.545-04:00An Open Apple a Day<div class="MsoNormal">Probably the most over-debated topic of the last month has been Apple’s computer culture versus a more open computing culture. I have wasted my time on this on a number of websites. I think the reason I became so interested is that I was trying to work out for myself some more fundamental questions about technology, art and modern society in general, which happen to come together in this particular debate. The broader discussion has not been so much about the iPad versus a range of existing and potential competitors, but about something we all feel much more personally invested in: expression. The argument can be boiled down to two philosophies, which I now maintain are less transparent than we think. 
Apple’s philosophy is based on control and intellectual property. This can be seen in three famous ways: the fee-based system for buying media content, the Apple App Store system which prevents any application not approved by Apple from being used, and Apple’s choice not to support Adobe Flash. This has been portrayed as an opposition to Google, which promotes a more open culture in most ways. My instinct was to oppose the Apple closed structure. After all, I love the idea that anyone can develop anything. I like the idea of an open internet culture. The problem I find, though, is that neither model is completely satisfying, as the debate over the fruits of contemporary content and creativity is now about opposing business models rather than about the creations themselves. There are two examples, both of which are personally frustrating to me. The first is scientific knowledge.</div><div class="MsoNormal"><br />
</div><div class="MsoNormal">Yesterday I spoke at a seminar in Paris with five other speakers, all of whom work in a field very similar to mine, the physics of polymers. I left the meeting with one thought: everyone should be required to go to a scientific conference, regardless of whether they are scientists. In this group, despite the daily work I do and the Ph.D. I have, I was challenged and confused. The basic science of my field is so complex that no matter how much I work in it, there is an infinity of knowledge to be gained. By the way, it is not irrelevant that polymers like plastic and rubber (the things we were presenting on) make up so much of our world. It may not be important for everyone to understand the models, equations and experiments that all of these scientists were working with, but knowing that there are people working on them at least gives the possibility of a deeper appreciation for them. </div><div class="MsoNormal"><br />
</div><div class="MsoNormal">This reminds me of the Apple-Google debate because our debating is basically from an uninformed user perspective. We don’t know how Apple or Google do what they do. We use these products every day, and even debate the validity of their ideologies, without understanding how the engineers and business leaders at these companies manipulate data and perception in order to profit. This is not to say that they shouldn’t be profiting, just that it is hard to judge a particular philosophy as superior if we come from a place of ignorance. Even if we look at what we do know, the situation becomes more complicated than at first glance. Apple makes its money by selling products, whether iPads, iPhones and computers, or paid downloads of music, movies, apps and books. How and by whom these are made may be secretive, but what it is we are buying is not. Google by contrast gives away most of what it has. Google makes money through advertisement. This may seem innocent enough, but everything Google offers, from search to books, is controlled by proprietary algorithms, which, if they are doing their job, will serve Google’s real customer: the advertiser. This example is to point out the complexity of choosing a moral high ground on the basis of openness. It is also to say that we should all dig a little deeper, because it is interesting and enlightening to do so.</div><div class="MsoNormal"><br />
</div><div class="MsoNormal">The second point is not scientific or economic at its core, but artistic. I am reading the book “You Are Not a Gadget” by Jaron Lanier <a href="http://www.amazon.com/dp/0307269647">http://www.amazon.com/dp/0307269647</a>. I am not sure how much of this book I agree with, but it is very thought-provoking. One of the points he makes is a criticism of open computing culture and its ability to generate and promote originality. He mentions a belief that the last decade has failed to produce a unique musical style, something that had not been true for one hundred years or more. I agree with this, even though it is impossible to quantify. Generally the access to music has been good for exposure, but it has created such a mix and remix of styles that even new music sounds retro. That is not to say that there is no new music, or no new styles, available. I work with a group called Wake Up! <a href="http://www.wakeupnyc.com/">http://www.wakeupnyc.com</a> that I think is very new. The problem is how that will be perceived when there is so much to sample. It is a game of statistics: the more music that is available, the more everything tends toward the average. The outliers will be ignored, rather than seen as visionaries. This applies to the Apple-Google debate as well, since the two models of music distribution will push listeners in one direction or the other. Wake Up! seems to be trying to be a part of both camps, which is what it must do in order to be responsible to its own mission. They are giving their music away for free by playing in parks, and on their website. They are also selling it on iTunes. My worry is that neither system, neither the hive mentality nor the micromanaged authorities, is supportive of the new. 
This makes the struggle to get new music heard exciting, as the possibility exists to work both with and without the system, but it makes the problem of finding an audience more difficult, as there is a new paradigm which is more concerned with the system of distribution than with the content itself.</div><div class="MsoNormal"><br />
</div><div class="MsoNormal">This is all to say that I, and many others on the internet, am spending too much time arguing about how something is presented to us rather than about what is being presented. We care whether the revenue is advertiser-, stock-market- or consumer-based, rather than about the technology or the content. I don’t have answers, other than that we need to think and listen in retro ways. That is, we need to look at the technology not just as users but as scientists. We need to listen to music, and make music, like people who love discovering the music that represents us best. We need, however, to take chances in new ways. We need to ignore the new common wisdom, whether it is Google “openness” or Apple “sleekness,” and focus instead on what we want to say, see and hear. </div>Anonymoushttp://www.blogger.com/profile/07835248533705400567noreply@blogger.com0tag:blogger.com,1999:blog-6105237826208060810.post-43509021321909762152010-04-26T12:52:00.000-04:002010-04-26T12:52:41.010-04:00Acting Out in the Age of The Cloud<div class="MsoNormal">It is a strange time, in which life has become a stage for most of us in ways that are more concrete and less metaphorical than Shakespeare intended in “As You Like It.” We may always have been mere actors. Now, as I write these blogs nearly every week, as millions of other people do, I am performing, or at least expressing myself, for audiences all of the time, even though I never actually go on a stage anymore. This has not always been the case for me. Actually, I tried to be an actor and found that the experience was completely different depending on the size of the audience, and who was in that audience, more so than on my own performance. This may very well have been due to my inadequacies as an actor.</div><div class="MsoNormal"><br />
</div><div class="MsoNormal">In 1995 I had just finished two years of study at a music conservatory. Mostly that means taking classes and practicing alone in a room (that is the school part at least; I did my fair share of college dorm parties as well). I also had the chance to be in operas, concerts and musicals. I loved being on stage, despite the fact that I never played major roles, and looked for every chance the college, or the community in Ohio where I lived, gave me. I started to look outside of music, and did a few plays, rather unsuccessfully. Acting is a very personal art form when done correctly. It is not so much about performing as about empathy. It is important to relate to your character and to the others on stage. I am an empathetic guy, but I am also easily distracted by my own anxieties. I could never get the audience out of my mind. A close friend, Jenn Gambatese, who was going to NYU, had recently left the formal study of musical theatre and started training in an intense acting method called the Meisner technique. She explained the technique, and together we did some of the exercises. It was inspiring to me, because in those exercises I wasn’t acting at all, but rather connecting with someone else in a moment that was often emotionally charged. That summer I went to New York to study the technique further at a small acting school called the Neighborhood Playhouse. There were very few college acting students in my class. Mostly they were models trying to transition to acting, or former childhood television stars trying to grow up and be serious adult actors. The classes were everything I had hoped they would be. In fact I did very well in them, as I took my teacher’s advice and didn’t act at all. Rather, I was myself in either an improvisation or a scene. 
Since it was only a summer class, we never had the chance to go as far as actually becoming a character so different from ourselves that research was required. Still, I went back for a final year in Ohio, prepared to finish my degree and hurry to New York for a career on the stage. Though I would finish and move to New York, I had an experience during that last year which made me realize that I would likely not be an actor.</div><div class="MsoNormal"><br />
</div><div class="MsoNormal">Because I had such a good experience at the Neighborhood Playhouse, I had picked up an acting agent who was not quite ready to take me on, but was at least willing to try me out with some auditions. He called me in Ohio and asked if I could be in New York the next day for a terrific opportunity: an audition for a lead role in a new Neil Simon play that was going to Broadway. The agent faxed the portion of the script I would be using for the audition to me at my school. I picked up my mother, who agreed to help me out, got in the car and drove to New York. My mom and I practiced the scene in the car. I was unreasonably confident that the role was perfect for me. I went to the audition, and in front of me were the director and Neil Simon himself, as well as two other people I didn’t recognize. I did the scene very poorly with another actor, and the director gave me some notes and asked me to try again. The second time I was even worse. They thanked me for coming, and I left the room. I wouldn’t admit this to anyone out of embarrassment, but I knew then that I would never be a professional actor. While I was good in class, I could not be convincing in front of even that small audience, let alone an entire theatre.</div><div class="MsoNormal"><br />
</div><div class="MsoNormal">I think that being a good actor is a very rare talent, as it is so unlike what we are used to. If an actor is empathetic, as he must be, how can he take into account all of the feelings of the other actors on stage, and of the audience as well? It is just too much to think about simultaneously when you need to be impulsively in the moment of a scene. This kind of exposure is not for everyone, yet many of us now do it much more frequently than we used to, even when we are not seeking acting careers. The speed at which we blog, and tweet, and group IM, and receive comments and responses requires an emotional and intellectual vulnerability that only artists, such as actors, were exposed to in the past. This is often criticized, just as actors are criticized, as being self-indulgent. Perhaps it is in a sense, but it also, like acting, requires empathy. Millions of us are forced to think about audiences and collaborators in new ways, which in turn makes our writing more profound at its best. Even at its worst it is an attempt to live more fully. Perhaps the part of us that is attracted to actors responds to those emotions and that vulnerability. The technology available to us lets us all do that without subjecting audiences in a theatre, or playwrights in an audition, to painfully bad performances. On the other hand, we may very well be compensating for a lack of direct contact. </div><div class="MsoNormal"><br />
</div><div class="MsoNormal">Last week I went to the Abramović show at MoMA in New York, which in so many ways succeeded in being art and theatre better than any class or stage production. The first piece is simply the artist herself sitting at a table. Visitors can wait in line to sit across the table from her. Once you are in that seat, it is a wordless communication, unique to each coupling, even though the setting never changes. I was told that one visitor sat through an entire 8 hours with the artist. </div><div class="MsoNormal"><br />
</div><div class="MsoNormal">So can we achieve anything similar to this through blogs and comments, or is there something about being alive that manifests itself only in person? Is this what I had hoped to achieve as a college actor? I am not sure. This is one reason that I also play improvisational piano with groups of other musicians. Finding connections through art and writing offers many new opportunities. Whether we take them or not is the challenge.</div>Anonymoushttp://www.blogger.com/profile/07835248533705400567noreply@blogger.com0tag:blogger.com,1999:blog-6105237826208060810.post-25547419718589291202010-03-20T05:59:00.000-04:002010-03-20T05:59:41.857-04:00Does Creativity Require a Day Job<div class="MsoNormal">About ten years ago I had a conversation over a beer (or a few) with the poet John Greiner, who is also a very close friend of mine. Actually, some of the most thought-provoking conversations I have had on art, philosophy, food and drink have been with John on a Friday afternoon in a bar in Manhattan. This particular lament was over the fate of the poet as an occupation. I had assumed that John was unhappy about the fact that it is impossible to make a living as a poet. To me this seems unfair from a values perspective. Poetry has always been one of humanity’s purest ways of communicating the personal, the natural, the political and the spiritual. If you were to judge a culture, its sculptures, paintings and poetry would be of fairly equal importance. Doesn’t Homer tell us more about the ancient Greek imagination, and Dante about late medieval Italy, than any business of their time? In 20<sup>th</sup> century life, cinema has also made a lasting impression on the artistic landscape. All of these other art forms, though, have the potential of making the artist a lot of money. The artist Jeff Koons is a multimillionaire from his contemporary sculptures. 
Even playwrights like David Mamet are very wealthy due to royalties from plays. Steven Spielberg is a billionaire. There is an obvious difference with poetry, which is that the sales of poetry don’t fit into the capitalist incentive structure of these other arts. It is hard to build a reputation with a poem that can be monetized, even with sufficient hype. Sotheby’s doesn’t auction off the latest book of poetry. Poetry readings aren’t shown on prime-time television. So, the inability of poets to make a living writing poetry is fair in a market system, but such economic theory rarely gets in the way of John’s and my utopian dreams of the purpose of art. What he said surprised me. He said that it is better that the poet can’t earn a living with poetry. T.S. Eliot worked as a banker and editor. Many poets work as teachers. John said that even though he is a poet, he didn’t resent not being paid much for his poetry. From what I remember (sorry, John, if I get this wrong), he said that without a financial motivation, the writing was uncompromised by money. Also, working other jobs keeps you an active part of society, which feeds the expression in the poems. This conversation, which we have had many more times throughout the years, has not only stuck with me, but in some ways inspired me to publish my own poems.</div><div class="MsoNormal"><br />
</div><div class="MsoNormal">Since that time ten years ago, much has changed due to technology, which puts other, more traditional types of writing in much the same boat poetry has been in. Journalism is no longer what it used to be, as newspaper revenues suffer and staffs are being eliminated. Writers are turning to new media, like blogs, where they are not paid. There is original thought, and a very democratic freedom to this expression, but for millions of people writing essays, commentary and criticism, it is for the pure love of doing it, not to make a living. Photography is another example: once a high art in which photographers were well rewarded, it has become a vehicle for amateurs. Or perhaps the amateurs are becoming professionals, but just aren’t getting paid for it. The open source movement in software design is much the same. All of this concerns me in some ways, as I have always said that the ability to sell art is important in validating art. This is not to say that the amount attached to the acquisition is equal to the quality, just that it is one way to show that the artist is dedicated to an audience. This is one of the biggest questions of our time. How do incentives affect quality, and how do they reflect our values as a society? I don’t know the answer to this. Perhaps John and I will solve this over a few pints when I return from Paris.</div>Anonymoushttp://www.blogger.com/profile/07835248533705400567noreply@blogger.com0tag:blogger.com,1999:blog-6105237826208060810.post-34947968142126819852010-03-10T16:09:00.000-05:002010-03-10T16:09:19.213-05:00You Are Not Your AvatarI responded to an article in the New York Review of Books about James Cameron, <br />
<a href="http://www.technologyreview.com/blog/post.aspx?bid=354&bpid=24897">http://www.technologyreview.com/blog/post.aspx?bid=354&bpid=24897</a> and his exploration of what it means to be human. This is not something I have heard only in connection with Cameron lately; it is an old, tired, but still somehow mainstream philosophy-light concept. I have mentioned in my blogs before that I think there is no place in modern thought for mind-body duality. We know one thing with near certainty, which is that the mind (the brain) is an organ in the body. A separation from it has been speculated on by philosophers such as Plato and Descartes, but neither of these brilliant men had the tools for understanding the brain the way we do now. While the big question, the one of why we are conscious at all, is still being debated and studied, the neuro-physical partnership is well understood. I was in Athens this week and had the strong feeling that despite mythological gods and beasts, and Plato’s elevation of the mind over the body, the Greeks in general did understand the physical nature of being. The sculptures depict athletic beauty in ways that are so convincing that it is impossible to remove the mind from the physicality. In fact I would argue that theatre itself is a dedication to mind-body singularity. The transformation of characters into people is an example of muscular and memory cognition. It is also why two actors never play a role the same way.<br />
<br />
<br />
It is understandable that we are questioning these ideas again. For once, a virtual world seems actually possible. Even downloading the entire brain seems likely one day, as computer memory increases. The avatar in Cameron’s film is far-fetched, but not impossible. I would like to propose, though, that a very different outcome would occur if it were possible to separate mind from body in the Avatar sense. The resulting person would be nothing like us. Imagine how we change even in our own bodies: when we are sick, for instance, or when we are drunk, or when we break a bone. Speculating on having a whole new body, other than the brain, is hard but not impossible. A paraplegic who was paralyzed in an accident essentially takes on a new body. The one thing they don’t do, however, is take on a new brain, whether that brain is biological, as in Avatar, or a computer. If this were to happen, axons would be farther from certain receptors, and synapses would fire differently. Memory would last for different amounts of time, as all tissue behaves differently. Perception would be different. In essence, we would not be ourselves. We could not remove our body from our mind. This doesn’t mean it would not be a fun thing to try, and I am game if anyone wants to try after my demise, but I just don’t think the new me will be my charming self.<br />
<br />
This actually came to me in a rather decadent moment, while I was sitting in a spa in Athens. Sitting in spas in Athens is a great experience, because wrapped in those towels, with a foot bath, you really do not feel so far away from the baths Sophocles may have been taking while listening to Plato ramble on about a better republic. What I thought about, though, was how much the relaxation of my body affected my mind. Surely the Greeks thought of this too. When the water is bubbling, or you are having a massage, it is nice to be yourself, not an Avatar.Anonymoushttp://www.blogger.com/profile/07835248533705400567noreply@blogger.com0tag:blogger.com,1999:blog-6105237826208060810.post-15277382816173177972010-02-16T05:44:00.003-05:002010-02-16T12:09:41.396-05:00Insanity or Persistence?<span xmlns=""></span><br />
The power of a word to evoke emotions is certainly evident in the word insanity. There are ten ordinary definitions for it, all of which are familiar, having to do with lack of mental health, courtroom pleas and run-of-the-mill "craziness". The definition that I have known for about 6 years now is one that was first credited to one of my heroes, Albert Einstein. He said that the "definition of insanity is doing the same thing over and over again and expecting different results". The reason I am familiar with this connotation of insanity is that it is the one explored in a movie that my wife and I were Associate Producers for, which is just now available on DVD, called "The Definition of Insanity". The film deals with the stubborn passion of a talented actor who endures a torturous loss of integrity, family and even mental stability in the pursuit of succeeding at the only thing he feels he must do. In one important scene, he compares his acting to a disability. This is so self-analytical that the character reveals both intelligence and an insightfulness that give his personality a profound depth. <br />
<br />
The fact that this particular definition of insanity was originally Einstein's is not acknowledged in the film, but since the film was made, I haven't been able to get it off of my mind. I often wonder why Einstein addressed insanity in this way, as his most famous contributions, in special and general relativity, were not insane at all. In fact both have been shown to be accurate in many experiments. So he didn't fail at this by doing the same thing over and over. Though this is true in looking at a snapshot of that particular success, when looking at a long shot of Einstein's life we see some of the insanity he described, and not just in his wild hair. Amongst people interested in 20<sup>th</sup> century science, Einstein is not only known for his successes. He is also known for his insistent denial of the century's other biggest breakthrough, which is the probabilistic nature of Quantum Mechanics. Einstein actually won a Nobel Prize for his contribution to Quantum Mechanics. Still he could never take the ultimate step, which was theorized by Bohr, Heisenberg, Dirac and Born. They had theorized that the momentum and position of electrons and other subatomic particles could never be known simultaneously and with certainty. This theory has been tested thousands of times, leaving little doubt about its validity. Still, Einstein, despite his rigor and genius, famously said of the theory that "God does not play dice with the universe". He did not mean God as a deity; rather, a belief in the deterministic beauty of the cosmos was key to how Einstein viewed the universe. He could not break with this view, no matter how many times he tried. In other words, by his own definition, he was "insane". When challenged about this seemingly denialist view, Einstein would say that there were hidden variables that Quantum Uncertainty was missing. He wanted to find those, but even if he didn't he felt they were there. <br />
<br />
Finding the hidden variables for the meaning of life is both what Einstein wanted and what the main character in "Definition of Insanity" wants. In fact that desire, without the label of insanity, is often considered a kind of persistence that is admired: the actor trying to understand himself and others through characters, and the scientist trying to understand the universe through mathematics and observation. The difficulty becomes knowing when to stop. At what point does daily reality, like family and happiness, trump eternal questioning? More importantly, at what point is the questioning pointless because the question is already solved, or may never be solved? There is a philosophical strangeness to this whole question, and it is one that scientists seem to be aware of. In Brian Greene's book "The Fabric of the Cosmos", the introduction is mentioned to me by more people than anything else in the long and very engaging book. In it Greene discusses finding a copy of the Albert Camus book "The Myth of Sisyphus" as a child. Sisyphus is a book which uses the Greek legend as a backdrop to explain modern existentialism: a man endlessly pushing a boulder up a mountain, never to reach the peak. Why did this story of hopeless persistence make Greene want to be a scientist? The philosophy seems to suggest that the goal of reaching a full understanding of the universe will never be achieved. Perhaps this shows Greene's self-awareness. By knowing that life will be only process and repetition, we can embrace the climb rather than the goal. <br />
<br />
So what of insanity? I have been accused of being insane for producing plays and films, which always lose money. I have been accused of insanity for arguing about religion with religious people, as no one has ever changed their views from these arguments. The list goes on and on, and those making the accusations certainly have a point. I would say though that in the Einstein sense we are all insane, and that those of us that acknowledge it may actually be on the journey that Brian Greene has taken. It is a pointless persistence of trying and failing that is the reality of living.<br />
By the way, please do buy "The Definition of Insanity"; I am persistently trying to make this film the success it deserves. <a href="http://www.amazon.com/Definition-Insanity-Robert-Margolis/dp/B0030EFZZ8">http://www.amazon.com/Definition-Insanity-Robert-Margolis/dp/B0030EFZZ8</a>Anonymoushttp://www.blogger.com/profile/07835248533705400567noreply@blogger.com1tag:blogger.com,1999:blog-6105237826208060810.post-62954460879414845462010-02-05T03:41:00.001-05:002010-02-05T11:19:29.724-05:00The Poetic Life of Scientists<span xmlns=""></span><br />
More than any interview of a physicist of my generation, last Friday's NPR Science Friday interview with Caltech physicist Sean Carroll created a mystique for the life of a theoretical scientist. It happened in one moment, which was just marginally different from the common job description of a theoretical scientist. Ira Flatow asked Carroll if he spent his time thinking up the big ideas about time that his new book "From Eternity to Here" (<a href="http://www.amazon.com/Eternity-Here-Quest-Ultimate-Theory/dp/0525951334/lecturenotesonge">http://www.amazon.com/Eternity-Here-Quest-Ultimate-Theory/dp/0525951334/lecturenotesonge</a>) describes to a general audience. He said that is his job. He goes to the wine bar with a pencil and paper, and thinks of new ways to visualize time and space, and new equations to put the puzzle together. He also said that he was lucky to work in such a dynamic field, where he could discuss his ideas with colleagues, who we get the impression are his friends. In that moment he managed to elevate the image of a Gen X physicist in Pasadena to that of Lost Generation poets in Paris. This is a needed transformation of the imagination. Scientists of my generation and younger have been caught in a historic limbo where social and solitary explorations of the mind have been replaced in large part by social and solitary explorations on-line. We think that science happens only because of computing power, our information-gathering resources, and our mass connectivity, all the while admiring with nostalgia the thought experiments of Einstein, the Eagle Pub of Watson and Crick, or long walks through Copenhagen parks. 
My favorite book of 2009 was Steven Johnson's "Invention of Air" (<a href="http://www.amazon.com/Invention-Air-Science-Revolution-America/dp/B0031MA7UW/ref=sr_1_1?ie=UTF8&s=books&qid=1265357899&sr=1-1">http://www.amazon.com/Invention-Air-Science-Revolution-America/dp/B0031MA7UW/ref=sr_1_1?ie=UTF8&s=books&qid=1265357899&sr=1-1</a>), which told not only of the contribution of Joseph Priestley, but also of how coffeehouse culture in London led to many of the most important ideas in English science. <br />
Over the last year there have been several books about the need for scientists to be better communicators with the public. I like "Don't Be Such a Scientist" by Randy Olson (<a href="http://www.amazon.com/Dont-Be-Such-Scientist-Substance/dp/1597265632/ref=sr_1_1?ie=UTF8&s=books&qid=1265357791&sr=1-1">http://www.amazon.com/Dont-Be-Such-Scientist-Substance/dp/1597265632/ref=sr_1_1?ie=UTF8&s=books&qid=1265357791&sr=1-1</a>), which dealt with this topic by describing the need for scientists to use film and other multimedia tools to demonstrate ideas to a larger public. What I realize now, though, is that there is an essential step missing from the picture of going from the lab to the screen. That is the step where we write, draw and eventually talk with each other, not at seminars, but at wine bars. A Greek symposium was a long night of drinking and discussing. A college symposium usually takes place in a classroom during the day and is much shorter, but for some reason I think I would be much more likely to fall asleep in that daytime class than while drunk on Plato's sofa. While poets and philosophers have searched for ways to explain the human condition, scientists are exploring ways to understand nature in its entirety. Friendship, debate and Pinot Noir are welcome companions in this pursuit.Anonymoushttp://www.blogger.com/profile/07835248533705400567noreply@blogger.com1tag:blogger.com,1999:blog-6105237826208060810.post-16555761631724248782010-01-18T04:33:00.000-05:002010-01-18T04:33:07.204-05:00Unfinished Business?There is a wonderful scene in the film “Six Degrees of Separation”, which as a parent I think about nearly every day. The Kittredge family, who provide the bourgeois Upper East Side backdrop of the film, also provide a number of insights into excesses and ambition in contrast to natural instincts. As modern art dealers, they academically understand that the textural and perspectival flattening of images represents not merely a shift in aesthetics. 
It also characterizes a reduction of experience, emotion and philosophy to a two-dimensional painting. In this particular scene Flan Kittredge, as played by Donald Sutherland, reminisces about his daughter’s second grade class. He remembers how, when visiting the classroom, he was stunned, as if entering a gallery at MoMA. All of the paintings of the children struck him as spectacular. He asked the teacher how she managed to get such profound art from every student. Each piece was like a Matisse, a Cézanne, a Kandinsky or a Picasso. The teacher replied that she did nothing. Only that she knew when to take them away. In other words, all children are modern masters, but by leaving them to continue working on a painting, that masterpiece may be disguised and colored over. <br />
<br />
<br />
Children are experimenting with paints and drawings, as we continue to with blogs, with lovers, with restaurants, with financial instruments, and with scientific experimentation. What we often lack, however, is the equivalent of the teacher who tells us when the experiment is over. Anyone who has played free jazz with a group of musicians for the first time will know that it is nearly impossible to bring a piece to its conclusion. This is one of the things that I love about improvisational music, but also one thing that separates those initial experiments from a band that is fully connected. At the heart of free jazz is an assumption that there is nothing a musician can play that is inherently wrong. If a dissonant interval from one instrument is played against a consonant interval from another, it may not be planned, but it becomes an idea that requires exploration. During those first meetings of a group, every second of playing is packed with these micro experiments, all of which are of interest to the musicians. Resolving those tonal and rhythmic variations without discussion or pause is an infinite process, which leads to long sessions. For me this is often where jazz starts and ends, as I often don’t have time to rehearse or perform regularly with one particular group. I am often the sit-in pianist who comes into a session with a group whose members understand each other in such a metaphysically intense way that they instinctually know one another's movements. It is still a process of experiment for these musicians, but one where a hypothesis has already been stated, and the theory is being tested. I then become a dependent variable in this equation. When it works, the process becomes a calculus, or, more metaphorically accurate, a quantum wave function. When listening to the recording, the results can be heard, but only as an approximation. 
Like a subatomic particle whose position and velocity cannot both be measured with complete certainty, neither can any one moment in the cacophony of the soundscape be isolated and understood. It is an evolving process, which as a whole can be experienced. As with the second graders, it takes discipline, or a leader, to know when to remove your hands from the keyboard. <br />
<br />
A science lab can be much the same as this, and like the examples of the children’s art and the free jazz session, it is not completely clear to me that a solution to an experiment ever truly represents a completion. Perhaps it is merely a disciplined end point, chosen aesthetically, artistically or randomly somewhere in the middle for any number of reasons. A corporate research project must have a point at which a conclusion is made, or a product would never be released. We know that the results are rarely perfect, as all products have some degree of uncertainty built into them. A drug is effective in a percentage, hopefully high, of the users, but not 100%. A Ph.D. dissertation also must have a completion date, or the student would never get the diploma. American innovation is actually tied to this ability to wrap up an experiment. The first personal computers, iPods, cell phones and MRI machines weren’t perfect when released, and the scientists who worked on them knew it. But an entrepreneur or manager knew that the product needed to be released. <br />
<br />
There is a dilemma for me in this question of creation and completion, which I also think about when watching my daughter paint. It is not whether a painting will look better if it is taken away from her at a certain time. It will certainly be more understandable if it is. We could all be like the second grade teacher in “Six Degrees of Separation”. The big question is rather whether, by taking it away, I am stopping a process which for psychological and even artistic reasons should continue to play itself out as long as she wants it to. For my daughter I would want her to continue, as the goal is not for her to be a Matisse (at least not yet), but rather to have fun and express herself. Who am I to say that she is finished? As we consider ourselves more mature when playing in a band or working on nuclear physics experiments, we start to want to master something rather than just express it. Is that mastering or compromising? Of course it is necessary, and in the cases I mentioned it is important. I wouldn’t want a cure for HIV or cancer to be in a lab somewhere with a scientist saying, “I am not satisfied yet. It only works on 99% of test patients”. I also wouldn’t want every recording to be like most pop albums, where every moment is produced and planned. This may all seem very trivial and obvious, but it is a question I face every day, as a professor, as a musician, as a scientist, and as a father.Anonymoushttp://www.blogger.com/profile/07835248533705400567noreply@blogger.com2tag:blogger.com,1999:blog-6105237826208060810.post-75926535848486939172009-12-30T14:51:00.000-05:002009-12-30T14:51:13.889-05:00Nature's Intellectual PropertyI just watched the Charlie Rose interview with Liv Ullmann and Cate Blanchett about the new production of “A Streetcar Named Desire”, playing at BAM in New York. Liv Ullmann, a terrific actress herself, is the director of this play, and she said something that strangely applied to the rest of my day’s conversations. 
She spoke about that moment when, observing an actor you are directing, the actor gets it exactly right. The moment is so moving that words cannot describe why it is so perfect. The director is left with a dilemma. Should she tell the actor that he or she got it right, and try to figure out why, or just leave it and be happy it is there? There is risk in both of these approaches. Speaking of it may intellectualize a purely instinctual and brilliant act of the subconscious mind. On the other hand, not speaking of it may mean that it was simply one moment, which may never again be repeated. I know what she is talking about, both from my days in theatre and now working in science. There is a sense of the complexity of inspiration that is humbly rooted in our knowledge of all we don’t know. Human psychology and character development are deeply intertwined with life experience, theatrical experience, and character interpretation. The same thing happens with invention, and it can be equally fragile.<br />
<br />
<br />
There are different strongly held beliefs about how to handle intellectual property. Invention of a new technology is not entirely different from the process of bringing a character to life. Like the play, the invention is a unification of previous ideas. Views on how to handle these ideas have varied, and distinguished inventors have disagreed on whether the patent system is truly the best place for them to be revealed. There is an idea that, until the open source movement in software, seemed quaint. Benjamin Franklin said after inventing the open stove: “as we enjoy great advantages from the inventions of others, we should be glad of an opportunity to serve others by any invention of ours; and this we should do freely and generously.” This is a highly romantic ideal that has not been very practical. Even nonprofit universities and hospitals now routinely seek patents in order to finance further research. Patents actually do half of what Franklin was suggesting. They do allow others to make and understand the exact invention. They just can’t do it freely for 20 years. By the way, Thomas Jefferson agreed with Franklin on this account. Luckily for them, they made their money in other ways, not relying on science and technology for an income. Most private inventors and corporations don’t have this benefit. There is another way that inventions are handled in modern society, which is through trade secrets. The concept of trade secrets is best known in the food and beverage industry. The secret formulas of Coca-Cola and KFC have been famously guarded. This is true, though, of nearly every product and process, even ones with extremely strong patents. The truth about trade secrets may be much less brilliant, but more mysterious, than a patent. I feel that in most cases a trade secret is something in a process that makes a product unique, even if the company or inventor doesn’t know what it is. 
I think it is very possible that Coca-Cola does have a secret recipe, but that the recipe by now must have made its way to competitors. The only explanation, then, for how Coke is still different is that something in the way they make it is different, so they keep making it the same way. This is not so much invention, but chance.<br />
<br />
Nature works in ways similar to the trade secret method. There is no patent on trees or minerals. They have come into their present form through a process that worked to keep them intact. Recently, while working here in Paris with a very renowned polymer chemist, we discussed a strange natural phenomenon. For 75 years chemists have been able to create a synthetic rubber which has the exact same chemical structure as the natural rubber that comes from the Hevea tree. This was a major development, but strangely, when we look at the properties of the natural rubber and the equivalent synthetic, the results are different. With all of our technical and analytical knowledge, we don’t know why this is. For this reason natural rubber is still used for many applications. When I was discussing this with my father, he suggested that this was somehow nature’s “trade secret”. He is right. There is something that for the last billion or so years has been refined to create the latex that is so unique. Nature is not an intelligent being, so likely it does not know why. It just happened, and continues to happen the same way over and over. The same thing is true of silicon, which has a near-perfect structure. We would love to create something this perfect in a lab, but we haven’t had the billions of years of trial and error yet. I think with the prototype to evaluate, we should be able to do it faster.<br />
<br />
This brings me back to the actors, to instinct and to chance innovation. Perhaps most of what we do is about freezing a process on stage, in a factory or in a lab at the exact right moment. It is also possible that this ability to know when and how to do this is what makes great directors, inventors and companies.Anonymoushttp://www.blogger.com/profile/07835248533705400567noreply@blogger.com4tag:blogger.com,1999:blog-6105237826208060810.post-58337342480484936452009-12-16T11:30:00.001-05:002009-12-16T13:39:52.319-05:00Experimenting<em>Just because something doesn't do what you planned it to do doesn't mean it's useless.</em> <br />
<br />
Thomas A. Edison<br />
<br />
If you are an experimental scientist, your days are likely to be either incredibly frustrating or incredibly exhilarating. Actually, for many of us this oscillation of emotions is the natural bipolar state of the work that we are driven to do. Everyone has a slightly unique process for experimentation. I tend to start with improvisation, while other, more organized scientists begin with systematic preparation. An improvisation is by its nature different from an experiment. It is more like psychoanalysis, with free association of ideas, without any conscious direction. I remember this being called brainstorming in business and school meetings. For me an improvisation can clear my mind, so that I can see what is already in front of me, rather than be trapped by outside thoughts. As I said, though, this is not really an experiment. An experiment requires more than improvisation: it requires an idea, or hypothesis, so that a proper test and set of testing conditions can be designed. In cases like the Large Hadron Collider at the CERN labs in Switzerland, 15 years have been spent preparing for experiments. One of the key experiments at the LHC has been sculpted by the world’s leading physicists over much of this time. The hypothesis is that a unique particle, called the Higgs boson, can be detected by colliding protons at high energies near the speed of light. Most physicists expect this particle, called by many the “god particle”, to be detected, confirming one of the 20th century’s most famous, yet unproven, theories in particle physics. This is what is generally thought of as experimentation. At the 2009 Origins Conference in Arizona, two physicists, Lawrence Krauss and Brian Greene, talked of an even more rewarding, or exciting, possibility. Dr. Greene said, “What would be even better than finding the Higgs at the LHC is not finding it. It would show all of us that there is something else to be discovered. 
Of course this wouldn’t be good for financing another large experiment like this.”<br />
<br />
Greene was on to something that is generally misunderstood about scientists. Even when an experiment is well planned, and a hypothesis well formulated, we are even more enthralled by the possibility that the experiment leads us to entirely new places. The reason for this is that we trust that nature is inherently more interesting than we can first imagine.<br />
<br />
Small technology companies are no less of an experiment than one run in a lab. Like the scientist in the lab, the entrepreneur is putting all of his mental capabilities into a hypothesis, believing that his idea is of value. The good entrepreneur, like the good scientist, is even more moved by the idea he didn’t yet have. In other words, when the experiment of trying an idea fails, he assumes that there must be an even better solution. This can make for difficult days, quarters, and years, but ultimately the openness to reinterpret the experiment can lead to more beautiful places than the original design.<br />
<br />
One area of the start-up which is often misrepresented, or at least not thought of in this light, is staffing. When I was a theatre director I was given a common piece of advice: that “90% of the director’s job is casting.” This is true, of course, for directing and for hiring engineers, but it is not as rigid as might be implied. When the director Mike Nichols cast Dustin Hoffman in “The Graduate,” his choice was mocked throughout Hollywood. Hoffman was too old, too small, and too Jewish. By all of the loose metrics of casting wisdom, the role of Benjamin Braddock should have gone to Robert Redford. Nichols was participating in an expensive Hollywood experiment, and one that ultimately paid off with one of the most successful films of its era. In hindsight Nichols is seen as a genius for this decision. When asked about it, though, he doesn’t see it this way. He claims that the choice of Hoffman was not based on imagined box office success, but simply on the belief that Hoffman was good. He chose to experiment with Hoffman, not knowing for certain how he would fit the role, but believing him to be a good enough actor that somehow he would. <br />
<br />
The casting of “The Graduate” is exactly what plays itself out when hiring the first few engineers in a company, and probably everyone after that. It is not always best to hire the MIT Ph.D. with a specialty in your field. Sometimes that is like casting Robert Redford in “The Graduate”. It would probably work, but it might not be as inventive as you would like. There was also one other small advantage to the casting of Dustin Hoffman, which at first may seem like a compromise. Hoffman was an unknown, and was not as expensive as Redford. I don’t think this was Nichols’s reason for casting him, but in the end it didn’t hurt either. Because the production came in under budget on casting, they were able to reallocate some of that money toward the scenery, which included the famous modern and postmodern monochrome homes of the Braddocks and the Robinsons. It also didn’t hurt Hoffman, who is now one of Hollywood’s top-paid actors.<br />
<br />
The early years of Tech Pro were much leaner than “The Graduate” pre-production days, but there were some similarities. My parents were attempting something that shouldn’t have been possible with a small amount of investment capital. They were trying to open a software and hardware company to create completely new technologies, in order to compete with Monsanto, which was at the time a Fortune 50 company. Although it is obvious that this experiment was a risky one, and that there would be challenges, the challenge of hiring seemed easily approached by following common business wisdom: if a company has only a few dollars, at least those dollars should go to the most obviously qualified person. But what if there aren’t even enough dollars to work with, or if hiring that person means changing your financing model in order to raise additional funds?<br />
<br />
<strong>The story of Jeff</strong><br />
<br />
The summer of 1985 was a period of transition for Tech Pro. The company itself was experimenting through improvisation and hypothesis, starting as a garage refurbishing shop. That is really all it was. My father had worked in Akron, the rubber capital of the world (at least then), in many areas of the industry, from manufacturing rubber, to working in a testing lab, to working for Monsanto, which made testing instrumentation. During this period Monsanto had a near monopoly on a type of instrumentation called rheometers, which were the only practical way of evaluating the vulcanization of rubber. In the mid-1980s a transformation was occurring in the industry. The large tire companies were being acquired by foreign firms, and local factories were being closed. At the same time, smaller US companies were filling some of the gap left by the departure of the major players. These smaller companies couldn’t afford Monsanto’s rheometers, which, by the natural laws of supply and demand, were expensive. My parents made a logical bridge between the factory closures and the need for low-cost instrumentation. They purchased used instruments from shuttered plants at auctions, and rebuilt them to resell to the new companies needing cheaper instruments. This was, not surprisingly, welcome news to the industry. It was also a lot of work. Tech Pro first hired a night maintenance man from Kmart to help with the rebuilding. Joe Bulman was a superb tinkerer, and even though he was hired mainly as a technician, he showed creative interests and abilities. So he became a design partner, and was the first person to design, along with my father, an original rheometer, not just a refurbished old one. Actually, I will digress for a moment on this story, as it is a perfect example of an experiment that needed adjusting.<br />
<br />
In 1985 Tech Pro was actually happy, and even profitable, in its business of refurbishing and reselling rheometers. With Joe building, my mother doing the administration, and my father doing sales and installations, it was a nice, very small business. The process worked like this: Tech Pro would find old, usually nonfunctional instruments at an abandoned factory and acquire them cheaply. They would then strip the instruments down to their bare physical structure. They would buy all new parts from Monsanto, rebuild the instruments, paint them, test them, and resell them. This was the entire business at the time. Then came a shock that could have stopped Tech Pro at this stage. Monsanto refused to sell Tech Pro any more parts. Since Monsanto was the only supplier, there were no other choices. Except, that is, for the one that now seems obvious. Tech Pro started to make its own instruments. At this point my father moved from being a salesman and installation man to a designer, and Joe became an engineer. <br />
<br />
Once Tech Pro had an instrument equivalent to Monsanto’s, but less expensive, the idea of being just a second supplier lost its excitement. My father was an experimenter at heart, and wanted to experiment with the most exciting technology of the day, the personal computer. Personal computers in 1985 had started to find their way into corporations in many ways. The large mainframes of the past were no longer necessary for many applications. Spreadsheets and word processing were being used by nearly everyone. Accounting departments and human resources were starting to use personal computers. In the rubber laboratory, however, analogue devices called recorders were the only way to acquire information from rheometers. Personal computers seemed like a perfect fit. A computer would be able to acquire data from the instrument and store the information. It should also be able to do mathematical calculations to help with the interpretation of that data. The problems in pursuing this line of experimentation were: 1) Joe, my mother, and my father had never programmed before, and 2) computer scientists were scarce and expensive. <br />
<br />
Though Joe was the only engineer at Tech Pro in 1985, the workload for building and rebuilding instruments had increased to the point where some hourly employees were necessary to help with the manual labor involved. When a company is as small as Tech Pro, every hire is important, and risky, no matter how unskilled or low paid the job appears to be. My parents even had a test, based not so much on knowledge as on problem solving and creative manipulation. One example was putting together a pizza box quickly. Another involved an aspect of design. The only knowledge-based questions were ones of electronics. It was important that every early Tech Pro employee know some basics, as everyone needed to multitask. There was also a search for a computer geek: someone who spent his time building his own computers and coding video games. Tech Pro was looking for people who had fun with computers and electronics, not people who were educated in them.<br />
<br />
One of these early shop hands was Jeff Archer. Jeff was in his early twenties, high school educated, and clumsy with tools. In such a small firm, where the ability to use a broom and a drill were more important than the ability to do differential equations, this could have been a problem. Instead, months went by with Jeff working hard, but not extremely effectively, as an assistant of sorts to Joe. Jeff’s potential during this time was growing, as he was indeed leaving work to build his own computers and to program. When my father decided that he wanted to create the first PC applications for rheometers, he did not search for capital and computer scientists; he instead looked for the geek with the broom. Jeff and my father worked together to make the first-ever PC-driven rheometer system. Jeff was not just good technically; he was creative and smart, and understood, as my father did, the psychology of the user. Together they created a system which was such a smooth transition from analogue to digital that within five years the entire industry had embraced it.<br />
<br />
Jeff was an experiment that paid off for him, for Tech Pro, and for the rubber industry in ways that were never hypothesized when he was hired. Still, the flexibility and insight to see him as a potential partner made something unique possible. Only in a small company, where the owner knows the worker, can this discovery be made.<br />
<br />
<div style="text-align: center;"><strong>Simplicity</strong> (December 7, 2009)<br />
</div><br />
<br />
<br />
<br />
<em>Truth is ever to be found in simplicity, and not in the multiplicity and confusion of things.</em> <br />
<br />
— Sir Isaac Newton<br />
<br />
<br />
<br />
Science fiction is a real passion for many scientists and non-scientists. Perhaps it even accounts for the reason many of us work in science at all. For my generation, and my parents’ generation, there are two television series that most represent an idealized technological universe: Star Trek and Doctor Who. These two are markedly different from Orwellian-style futurist fiction, in that they are not meant as a warning against technological advancement, but rather as excitement about its arrival. Something which strikes me as amazing is that the most popular character in Star Trek is Spock, the logical, knowledgeable Vulcan who, until the recent Star Trek film, avoids human emotion in favor of reason. The Doctor, in Doctor Who, is certainly emotional, but he avoids commitment in a way, favoring discovery for its own sake. The Doctor sees nothing more romantic than traveling to the edges of time and the known universe, where even his vast comprehension is challenged, forcing him to learn something new. The reason that Spock and The Doctor are so enticing for the scientist, and the fan, is that they are able to reduce the complexity of the universe into something that is comprehensible for them, and so we feel it is possible for us too. They remind us that while it takes a long time to learn things (The Doctor is over 900 years old!), once we know them those things become simple. The goal for them, and for us, is to have as much simplicity as possible in our lives. If we succeed, quantum chromodynamics and partial differential equations become second nature.<br />
<br />
Business tends to have the inverse value system. Specialization means something different in business ideology. A CEO is not meant to understand the mechanics of the financial models that make up the foundation of his company, or the cultural components that affect the work habits and productivity of his thousands of employees. The job of a CEO is to create layers of complexity in the system, each of which he trusts is handled by others, so that he can focus on the most illogical part of the process, which is vision. A business leader relies on the two things that Spock or The Doctor would never accept: faith and emotion. The faith is in accepting that the system in place works, and that the thousands of employees he doesn’t even know are doing something useful. The perception, even by the leader himself, is often that this is not happening. <br />
<br />
Now more than ever, a CEO is bombarded with data. This data is too much for any one person to fully grasp, yet major decisions need to be made from it. In Malcolm Gladwell’s essay on the Enron crisis, he speaks of the complications of the accounting and structural components of the company. He suggests that there was so much data from the special-entity companies Enron created that it was actually not possible for the CEO, Jeff Skilling, to truly understand what was going on. So while Skilling claimed to be using the numbers, those numbers were useless for really evaluating the situation. The company was just too large and too complex. Like Skilling, many CEOs then revert to an instinctual and rigid evaluation of a company’s health. My mother refers to this as “management by spreadsheet”, which until recently I didn’t completely understand, or even agree with. A spreadsheet appears to be very scientific. Then I revisited the lab environment. During the course of any experiment, enormous amounts of data are generated and put into spreadsheets. Nearly anyone can do this. The creative scientist is not the one who compiles the data, but the one who properly analyzes it. It is possible that a trained CEO could analyze data well. It is not likely, though, that he can analyze well the complete data that is presented to him. A scientist is always trying to narrow the scope of a single evaluation, because looking at multiple things at one time introduces too many variables to properly understand. A CEO, even if somehow very mathematical and analytical by nature, couldn’t possibly do this. He is instead forced to rely on generalizations about the data. This leads to an impersonal management style, and ultimately one that is not quantitative at all.<br />
<br />
<br />
<br />
Being emotional and trusting are not bad qualities. They are what make us different from a Vulcan or a Time Lord. It is risky, though, to be in a situation where you are incapable of returning to the hard facts when necessary. This is a major advantage for small technology companies, which have fewer than 100 employees. In these companies you can read the financial statements monthly, talk with all of the engineers daily, and analyze customer satisfaction on your own. <br />
<br />
The hallmark of an overly complex business community can be seen at corporate headquarters, where business managers spend 90% of their days in meetings. I have been a consultant for large companies, and have found myself in some of these meetings. The goal is a good one: to make grand plans for the business. These meetings usually end up producing documents of notes, and the notes produce documents of a strategy, which, if you wait long enough, trigger another meeting, which in the rare instance results in a document that gets passed down the management chain. Occasionally this may lead to innovation, but it is by nature an isolating process, in which the meeting room and the document-generation process become a bigger part of the work than the product or the customer. This actually tends to happen even at small companies, especially those managed by MBAs. It rarely happens at small companies run by engineers or scientists, because engineers are too curious not to be involved. The Doctor would never delegate a mission to the future in a distant galaxy, because the joy of being a Time Lord is visiting it yourself, or with partners. This is one place where Google, even as large as it has become, succeeds. The founders and all major executives are engineers themselves, and avid users of the product. They understand creativity and how it comes from experimentation, rather than from meetings. So each Google engineer is expected to spend 20% of his or her time working on any idea they have. The other 80% is spent on other Google projects. That is 100%, none of which is in meetings. CEO Eric Schmidt, in a 2006 Charlie Rose interview, admitted that this is getting more and more difficult as Google grows. He claims that founder Larry Page knew the first 2,500 employees personally, and exactly what each made. Page is brilliant, of course, to be able to remember 2,500. For most of us, knowing 100 people is about the maximum. So now that Google has 30,000 employees, even Page is in the dark about most of them.<br />
<br />
My father ran Tech Pro with Google-style ideals. Since Tech Pro was so small, it wasn’t necessary to formalize them. There were small clues that let employees and customers know that it was a business run by curiosity and the thrill of science, rather than by meetings. One was that no one had a title on his business card. While this may have been a deliberate choice, it was more of a practical one. Technician, salesman, engineer, or president, each was expected to be an advocate for the customer. There was a constant feedback loop between customers and product development, without the middle or top bureaucracy which slows down progress. Another system in place at Tech Pro was a loose work schedule for engineers, and especially for programmers. These jobs were as creative as they were scientific, and people work better at different times of day or night. Often a programmer would still be at Tech Pro at midnight, even though he might not arrive the next day until noon. This gave everyone a certain feeling of ownership in the products they were creating, and also in the company itself. The workplace often became an extension of their homes, where they not only worked but made coffee and ordered pizza. This sometimes meant that during the actual work day some engineers did very little work. They talked, had lunch, and walked around the outside of the building smoking. With this strange schedule there was no time for, or really any need for, long formal meetings, because small informal ones were happening all of the time.<br />
<br />
It is obvious to me that this bureaucracy-free system is simpler to maintain, and for small companies more effective and fun. It also has a way of transferring to the products and services. Customers become a natural extension of this environment, as the Tech Pro team is used to seeing the business “universe” as approachable rather than intimidating. Product design also tends to reduce complexity into simpler components. Tech Pro always used off-the-shelf components because they were available quickly, at any time of day or night. For development this meant that you didn’t need to wait for an outside consultant to prepare a proprietary scheme. This lowered the cost of the final product, and also led to innovation that could not have happened otherwise.<br />
<br />
By working simply, Tech Pro was actually more efficient and more interesting. Simplicity was the rational approach to organization that Spock would have wanted, and the adventurous approach that The Doctor would have embarked on. Technology and business can seem very complex, but seen in retrospect, everything that has been done is very simple.<br />
<br />
<strong>Risk</strong> (December 1, 2009)<br />
<br />
<em>For a dying man it is not a difficult decision [to agree to become the world's first heart transplant recipient] ... because he knows he is at the end. If a lion chases you to the bank of a river filled with crocodiles, you will leap into the water convinced you have a chance to swim to the other side. But you would not accept such odds if there were no lion. </em><br />
<em></em><br />
<em><br />
</em><br />
<em>— Christiaan Barnard</em><br />
<br />
<br />
<br />
Scientists suffer from an unfair reputation in America. They are considered to lack instinctual savvy. For this reason, nearly every physicist I know is deeply offended by the stereotypical characters in the CBS sitcom “The Big Bang Theory.” They are socially awkward and tend to quantify even the most banal details of life. What my colleagues miss about “The Big Bang Theory,” though, is that hidden underneath all of the silliness these characters possess something that is unique, and apparently rather attractive, about scientists. After all, even the Aspergian, self-centered Sheldon is likeable. I actually think that there is something about the geekiness of “The Big Bang Theory” that is every bit as attractive as the coolness of the “Sex and the City” ladies. Though this is certainly a project for sociologists and not for me, I have a hypothesis regarding television likeability and culture. I think that somehow we know that the common perception of cool is not really all that important, or even interesting. This is a significant, even if unconscious, realization for modern television viewers, because common wisdom takes a long time to become common. It is also interesting because it gives us a defense for seeing the Universe, and ourselves, differently. In essence, nothing Sheldon does is intuitive. He is a classic clown in this way. He humiliates himself by creating friendship surveys so that he can build a mathematical model of who would best qualify as a friend. Our instincts are the opposite of this. We become friends with people because they are attractive in some, sometimes superficial, way. They make us laugh, or they are handsome, or they are somehow dynamic in a way that we wish we were. For anyone who has had friends they ended up hating, or has been married and divorced, it is clear that human instincts are exceptionally bad.<br />
<br />
This is one area where a geeky physicist may have some insight on business that is different from everyone else’s. Physicists are used to the facts being different from the way they feel. The Copernican revolution is one of the first examples of this: it doesn’t feel right that the earth revolves around the sun. This is even truer of 20th-century physics. Our instincts are absolutely insufficient to understand Einstein. General relativity, with its warped space-time, just isn’t in our normal arsenal of instinctual survival skills. Quantum mechanics is even less intuitive, so much so that one of its discoverers, the Danish Nobel laureate Niels Bohr, famously said, “If quantum mechanics hasn't profoundly shocked you, you haven't understood it yet.” So to be a good physicist, or scientist in general, it is important to be at least a little “Sheldonesque.” We must not trust our instincts, but instead investigate the truth. In facts and friendships we need to consider quantitative data, not just attraction.<br />
<br />
A relatively new field of science is called “evolutionary biology”. You would think that it would have started just after Darwin’s “On the Origin of Species” was published in 1859, but it didn’t. That revolutionary book was so well written that it debunked all of recorded history’s various religious interpretations of biology without being all that difficult to understand. In fact, when you read “On the Origin of Species,” the first thing that comes to mind is, “Why did it take so long to realize this?” Darwin simply observed and recorded nature for many years, and made a leap which was incredibly obvious in retrospect: that species are related, and that certain traits and species survive because of natural selection. This was, of course, controversial for all of the obvious reasons as well. Religion is both very personal and very political. Even though what Darwin was doing was science, it did step on some sacred beliefs. Still, to most people who read the book, it made sense. Theodore Roosevelt read it when he was 14 years old, and for the rest of his life claimed it was the most important book ever written. It is easy to see that Roosevelt’s love of nature shaped his decisions as president, and much of that can be traced back to his admiration for Darwin. So, because of its apparently intuitive qualities, natural selection could be taught in biology classes without there being a separate branch of the discipline devoted to it.<br />
<br />
The surprising thing is that as much as natural selection made sense, it really wasn’t as intuitive as expected. What made most sense to humans about evolution was not the centuries it takes for species to transform themselves. This is hard to visualize. Instead it was the idea that humans could only prosper through a conscious survival of the fittest. The term Social Darwinism was popularized in 1944 by the historian Richard Hofstadter to describe some rather disturbing contemporary human behavior that was carried out with the excuse of Darwinian naturalism. The most obvious example in Hofstadter’s time was the eugenics programs of the Nazis. The concept of superior races, and the linking of human survival to the elimination of the unfit, seemed like an extension of Darwin’s work. In fact, Darwin’s own cousin, Francis Galton, founded eugenics at the end of the 19th century. <br />
<br />
It is only when you let go of this instinctual idea of survival of the fittest that you can really see the true scope of evolutionary biology. Genetic mutations are slow to occur, and happen by natural means, but humans have removed themselves from the natural process. Technology, as wonderful as it is, is only 10,000 years old. Until that time there was no agriculture or architecture. There were no cities or writings. We call everything before this “prehistory,” because until that point history was not created by deliberate societal choices. Darwinism has been in progress since the beginning of life on earth, but Social Darwinism exists only during those times we call “history”. Therefore most scientists feel that we as humans are affecting nature while behaving as if we were outside of the natural process. This is not instinctual, and is not as easy as “On the Origin of Species” initially seemed to be.<br />
<br />
Social Darwinism is also evident when evaluating risk. The 2008 credit crisis and banking collapse can be seen in this light. The majority of the crisis was caused by misunderstanding risk. Was it riskier to give certain loans or not? Was it riskier to make certain credit swaps or not? There are two prevalent points of view about why this happened, and in such dramatic and drastic ways. Most people not working in finance see it as a case of greed. This viewpoint is an unconscious blaming of Social Darwinism. It holds that the wealthy Wall Street bankers wanted to be the strongest, and therefore the richest, so they pushed for larger profits at the expense of the weaker greater population. On the other hand, the banking industry blames poor financial modeling for the collapse. The claim has been that the models did not appropriately explain the risks associated with these transactions. Even the most famous economist and Federal Reserve chief of all time, Alan Greenspan, used this explanation, saying “the models of finance that I have followed were not accurate in predicting this crisis”. When we look at these two explanations we see a huge contrast, while both ignore something that is probably more accurate. It was neither greed nor modeling alone that caused this. It was a deficiency in our ability to instinctually comprehend risk. The numbers were too big, our vision of history too limited, and our confidence too great. It seems that we were not as smart as we thought. Growth seemed inevitable. We thought that credit and finance in general were part of the evolutionary process, but in fact they were removed from it. Derivatives and credit default swaps just happened too fast for natural selection to catch up with them.<br />
<br />
This applies also to the personal and professional risks associated with starting a technology company. The common wisdom on the risks of entrepreneurship is strangely opposed to the way we approach risk in our daily lives. Most people change their risk threshold based on how much money they have. When we have very little money we tend to risk all we have, because we feel there is not much to lose. This is why lottery lines are longest in poor neighborhoods, and why casinos are filled with people who can’t really afford to lose the money they are gambling with. These people feel that the upside is much larger than the downside. It just feels right, but it is completely wrong. The odds of a poor person winning the lottery are just as bad as the odds of a rich person winning the lottery, but the consequences of the poor person playing are so much worse. This wrong instinct extends through the middle classes and even to the wealthy, not for lottery tickets but for ideas and the money it takes to pursue them. Someone who is worth two billion dollars is less likely to risk one billion on a single venture or idea than someone worth $200,000 is to risk $100,000. Even though the billionaire may lose much more money, he is certainly in better shape financially afterward than the person who loses half of his $200,000. Our instincts are just that bad. <br />
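The asymmetry in the paragraph above can be made concrete with a toy calculation. The figures are the ones from the text; the function itself is purely illustrative, a sketch of the fraction-versus-cushion distinction rather than any real risk model:

```python
def after_loss(net_worth, amount_risked):
    """Return the fraction of wealth risked and the wealth
    remaining if the bet is lost entirely."""
    return amount_risked / net_worth, net_worth - amount_risked

# Both bettors risk exactly half of what they have...
billionaire = after_loss(2_000_000_000, 1_000_000_000)
founder = after_loss(200_000, 100_000)

print(billionaire)  # (0.5, 1000000000) -- still a billionaire
print(founder)      # (0.5, 100000)     -- a genuinely altered lifestyle
```

The fraction lost is identical, yet the remaining cushion differs by four orders of magnitude, which is why an instinct tuned to fractions misjudges the absolute consequences.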
<br />
Instinctively, entrepreneurs often work by the motto “never spend your own money”. I have been told this many times, by everyone from the founder of Cirque du Soleil to the inventor of the leveraged buyout concept in the 1980s. These people have undoubtedly been following their instincts, and those instincts have made them a lot of money. The question I need to ask myself is whether that would make me a lot of money too, and more importantly, whether it would be the best thing for my company and my happiness. We have tended to answer no to that question. Seeking other people’s money presents a bigger risk for me than risking my own, for reasons that seem completely unreasonable.<br />
<br />
The main reason why technical entrepreneurs like me and my family may want to take relatively large risks compared to the billionaire, but relatively small risks compared to the lottery addict, is that it is a scale we can relate to. There is an assumption that the risk taken is never so great that it will land us and our children on the street corner, but it is nonetheless a risk that could alter our lifestyle. This is something a scientist can understand. A scientist knows he cannot appreciate, in an instinctual way, those things which he cannot feel, and that further comprehension comes only through a deep knowledge of a subject. For a scientist starting a small company, it isn’t always the most interesting use of his time to fully understand business, so he shouldn’t have to do it. The risk and reward factors change when this is considered. It becomes more risky to raise large sums of money, because large sums of money may be outside the scientist/entrepreneur’s expertise. In order to gain that expertise he would have to sacrifice his true interests, which may very well be the riskiest choice of all.<br />
<br />
<strong>Technically Responsible</strong> (October 12, 2009)<br />
<br />
<span xmlns=''><p>Many years ago my wife and I had dinner with a friend of mine, who is a very talented theatre director and performer. His work was inspiring to me, and I would generally see anything he did. I must admit, though, that as with my free jazz performances, his style of experimental theatre did not attract very large audiences. He made a statement to us which at the time I thought rather pretentious. He said that he did not care if there was an audience at all. He would do his performance every night in an empty theatre if he had to, and still feel the same power and importance in creating a unique art. 
He felt no responsibility to an audience, only to himself. I would later find that many great and even successful artists, whether Miles Davis or Ingmar Bergman, cared little about the audience. In both of those cases the audiences came anyway, as they should have, because the art was so special. Those of us whose art does not appeal to large groups, however, can still continue to perform. The idea of responsibility to an audience is for the most part a huge distraction, as it tempts us to choose quantity over quality. In the arts we also need to be honest with ourselves, especially if we play free jazz or do experimental theatre: whether the audience is relatively big or not makes little difference. Either the work will live on or it will not; the moment is fleeting. It is also not quite right to speak of responsibility to an audience, since even a bad performance is not a genocide, a famine, a world war, or even a hangover. It is just an unpleasant hour or two in a theatre. While this may be true in the arts, for a scientist or a technologist there is much more at stake.<br /></p><p>Many mornings on my way to work I listen to a podcast by the Stanford literature professor Robert Harrison called "Entitled Opinions". I highly recommend it, as it is both entertaining and educational. For those of you who work in academic science or technology it will be a big change, as it was for me. Dr. Harrison is a classicist and literary intellect who, while respecting and even hosting scientists from time to time, can be critical of modern society's manic drive to innovate without particular attention to long-term consequences. Though this may seem obvious, it is far from it for those of us who read and respect some of history's great scientific discoveries.
I have often repeated a famous quote by Richard Feynman, who in addition to being a Nobel laureate physicist also worked on the atomic bomb for the Manhattan Project. This is what Feynman said:<br /></p><blockquote><p>"scientific technology improves production, but we have trouble with automation. It brings about advances in medicine, but then we worry about the number of births and the fact that no one dies from the diseases we have eliminated. It produces rapid air transportation, but it also makes possible the severe horrors of air war. In a sense, science is like a key that can open the gates to heaven or hell. Which portal the key unlocks depends on the humans who employ it."<br /></p></blockquote><p>Often I and many other scientists working on theories or in labs tend to use this as an excuse. In fact I am so convinced that this is a good statement that I think it expresses something almost necessary in our evolution as a species: a desire to understand the universe of the large and the small, and the mechanisms by which it functions. In doing so we uncover new ways to extend our lives, through medicine, bio-engineering, and even more futuristic notions such as neuro-regeneration and the downloadable brain. This seems to me to be what we must do. What Dr. Harrison and his guests put in perspective for me is that we cannot pursue science and technology in intellectual isolation. Isolation has several relevant meanings here. Often when working on complex equations or experiments we physically and politically isolate ourselves, almost out of necessity; we simply cannot afford to lose focus. Maybe more importantly, we isolate ourselves philosophically. I have written in this blog before of a convergence of art, science, and philosophy, and of how our specialized academic world can limit our experience and even hold back progress in the fields we work in.
What I think of more now, however, is how working without recognition and understanding of philosophy can be dangerous and counterproductive to human progress. Certainly the bomb is an obvious example: hard-working, brilliant scientists pursuing an idea that could lead to destruction. But what about the seemingly smaller efforts we make? For example, what about the impact of a new material, not only on our environment but on our interaction with nature and other people? My French father-in-law and I have had many arguments about progress. He is an extremely educated polymath who worked as an engineer but is equally proficient in quoting Victor Hugo and Chateaubriand from memory at the dinner table. With all of this knowledge, and despite having used computers since computers were becoming personal, he has always told me that society should be conscious of the "arbre de connaissance", the tree of knowledge. For many years I thought his idea was anti-democratic. The free flow of information, and the ability for everyone to participate through the internet, is one of the luckiest benefits of being alive in the 21st century. But he is right. Without thought of consequence, knowledge alone is not progress. Advances in science and technology require responsibility. This is truer now than ever because it is no longer just academics who use and create sophisticated technologies: nearly everyone is a technologist, no matter what field they work in. Still, with these new technologies and our constant innovation, we have the same human struggles we have always had. We face our own mortality. Unlike Epicurus and other great philosophers, though, we tend not to address that mortality metaphysically, but rather physically or religiously. The non-religious among us, like me, try to conquer death through invention. Reading great literature or philosophy, though, puts us back in the same boat as every human in recorded history.
Suddenly we realize that we must continue to innovate, but that if we do so only with tools and not with the mind, we will be alone. We must think of technology as a responsibility. We are no longer playing to ourselves in an empty theatre, even if it feels like it.<br /></p></span>