Tuesday 15 December 2009

Tweet Analysis



Micro-blogging via Twitter (and status updates on Facebook) seems to represent the new fad in communication. Being able to declare your thoughts to the world -- without necessarily engaging the brain to process them first -- appears to be gaining favour with politicians, musicians, and, need I add, academics! Communicating in bite-sized thought-summaries is not new, however. SMS and pager messages have been around for some time, and the habit of communicating in short messages had already caught on before Evan Williams and Jack Dorsey launched Twitter in 2006.

Perhaps the most interesting aspect of this can be found at Wikileaks -- a "multi-jurisdictional organization to protect internal dissidents, whistleblowers, journalists and bloggers who face legal or other threats related to publishing." On November 25 this year, Wikileaks released half a million US national text pager intercepts, covering the 24-hour period surrounding the September 11, 2001 attacks in New York and Washington. Here's an example. The archive represents a catalogue of electronic and human "chatter" at a remarkable point in world history. Messages range from texts between machines, between humans and machines, and between multiple humans -- from a variety of providers such as Metrocall, Skytel and Weblink_B. Perhaps some historian in the future will trawl through this archive and try to better interpret its contents.

Could the analysis of Twitter and other micro-blogging sites after a particular event in history (earthquakes, sporting events, elections etc.) provide an insight that is often not available from the "processed" news stories offered by branded publications and media? Perhaps an intelligent analysis of such feeds would provide real insight into what actually happened?
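As a very rough illustration of what such an analysis might look like, the short Python sketch below counts how often a handful of keywords appear in each hour of a message archive. The file name, line format and keyword list are all hypothetical assumptions made for the example; a real study of the Wikileaks archive or a Twitter feed would obviously need far more sophisticated processing.

```python
import re
from collections import Counter
from datetime import datetime

# A minimal sketch of the kind of "intelligent analysis" discussed above:
# counting how often chosen keywords appear in each hour of a message archive.
# It assumes (hypothetically) a plain-text file with one message per line,
# formatted as "YYYY-MM-DD HH:MM:SS<TAB>message text".

KEYWORDS = {"fire", "evacuate", "plane", "collapse"}

def hourly_keyword_counts(path):
    counts = {}  # maps the hour (a truncated datetime) to a Counter of keywords
    with open(path, encoding="utf-8", errors="ignore") as f:
        for line in f:
            try:
                stamp, text = line.rstrip("\n").split("\t", 1)
                when = datetime.strptime(stamp, "%Y-%m-%d %H:%M:%S")
            except ValueError:
                continue  # skip malformed lines
            hour = when.replace(minute=0, second=0)
            words = set(re.findall(r"[a-z']+", text.lower()))
            hits = words & KEYWORDS
            if hits:
                counts.setdefault(hour, Counter()).update(hits)
    return counts

if __name__ == "__main__":
    for hour, counter in sorted(hourly_keyword_counts("pager_archive.txt").items()):
        print(hour.isoformat(), dict(counter))
```

Even something this crude would show spikes of "chatter" around key moments -- which hints at what a more serious historian's toolkit could recover from such archives.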

image from www.change.org

Wednesday 9 September 2009

To charge or not to charge



Newspapers have recently discovered that they can no longer rely on advertising as their sole source of revenue, and therefore need to adopt new business (charging) models for their on-line editions. Whereas such newspapers previously relied on the number of people looking at their site (and therefore also looking at the adverts on those pages), it seems that people are no longer clicking on these adverts. Some newspapers (such as the New York Times) abandoned a subscription-based model to increase the number of people coming to their Web site (from 12 million to 20 million, according to the New York Times' Vivian Schiller) -- but visitor numbers alone no longer seem to be enough. According to an article in the Guardian newspaper (which, interestingly, is still free), News Corp (who own a number of British newspapers, such as The Times) feel that giving away news content for free is no longer viable, and that the subscription-based models used by other successful on-line publications, such as the Wall Street Journal in the US and the Financial Times in the UK, should be the way ahead for internet-based news media. In a BBC article, Rupert Murdoch of News Corp was quoted as saying: "Quality journalism is not cheap, and an industry that gives away its content is simply cannibalising its ability to produce good reporting."

Would people pay for news content -- or does content need to be of a very specialist nature for it to be of interest to subscribers? If it is of a specialist nature, would that not also limit the number of subscribers? The Financial Times, for instance, makes some articles available for free and charges for others -- although it does have a very well-defined readership. Newspapers such as The Times, which appeal to a more general market, may not be able to claim such distinctiveness. Perhaps, when major newspapers do start charging, those that don't may dictate the news agenda in the future -- and, as markets recover, dominate revenue from advertising once again. Similarly, this also raises questions about the role of government-funded news media -- such as the BBC in the UK -- who, for a small licence fee, already provide free content.

This also introduces the need for a new type of RSS-feed aggregator -- one that is able to take blogs from different individuals and compile free news content from them. So, if you live in, say, Cardiff, and write about events in your local area, an aggregator engine (perhaps similar to a search engine) could combine all news stories from Cardiff. Traffic to these stories could then be used to identify their "value" to readers. Such reputation models -- based on the number of readers -- could be one way for an aggregator to select which blog entry to feature when compiling news content. Could news then be dynamically compiled, for free, without having to access subscription-based newspapers? Would this work -- or would the views of experienced reporters and journalists always be of greater value to readers, and therefore result in people paying for such content?
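To make the idea concrete, here is a minimal Python sketch of such a reputation-based aggregator. The entry fields, the scoring weights and the example data are purely hypothetical -- a real aggregator would need to pull entries from RSS feeds and measure traffic properly -- but it shows how reader numbers could drive which entries get featured.

```python
from dataclasses import dataclass

# A minimal sketch of the reader-count "reputation model" described above.
# Entry fields, the scoring rule and the sample data are illustrative assumptions.

@dataclass
class BlogEntry:
    title: str
    location: str          # e.g. "Cardiff"
    views: int = 0         # page views recorded for this entry
    unique_readers: int = 0

    def reputation(self) -> float:
        # Weight unique readers more heavily than raw page views,
        # so repeated visits by one person count for less.
        return self.unique_readers * 2.0 + self.views * 0.5

def compile_front_page(entries, location, top_n=5):
    """Pick the highest-reputation entries for a given location."""
    local = [e for e in entries if e.location == location]
    return sorted(local, key=lambda e: e.reputation(), reverse=True)[:top_n]

if __name__ == "__main__":
    entries = [
        BlogEntry("Road closure on Queen Street", "Cardiff", views=420, unique_readers=300),
        BlogEntry("Local choir wins award", "Cardiff", views=150, unique_readers=120),
        BlogEntry("Festival traffic chaos", "Swansea", views=900, unique_readers=700),
    ]
    for entry in compile_front_page(entries, "Cardiff"):
        print(f"{entry.title}: score {entry.reputation():.1f}")
```

The interesting design question is the scoring rule itself: whether raw traffic, unique readers, or some measure of reader trust should decide what counts as "news".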

One interesting comment from Mr Murdoch -- in the Guardian article above -- relates to advertising revenue from social networking sites (and perhaps also from Internet search engines): "... News Corp revealed that its interactive media division, which includes the social networking site MySpace, had turned in a lower contribution." Mr Murdoch stated: "We're not going for the Facebook model of getting hundreds and hundreds of million of people who don't bring any advertising with them at all." This does raise questions about the future survival of social networking and search engine sites which rely solely on advertising revenue.

Image from University of Indiana

Friday 24 July 2009

Peer-2-Peer Banking



According to a BBC report, the UK national debt stands at £799bn (or 56.6% of GDP) -- the highest since records began in 1974. One of the major causes of this debt, according to the report, is the bank bail-out package in the UK. Another interesting figure was quoted by Alyssa McDonald (New Statesman, July 20, 2009), who indicated that the UK national debt had quadrupled since January 2007. Vincent Cable, the economics spokesperson for the UK Liberal Democrats, although expressing a political viewpoint, talks about the "institutionalised passivity of UK Financial Investments Ltd (UKFI), the Treasury-backed bank shareholder body" in dealing with the activity of large banks, and the conduct of bankers operating through such institutions. In his article (New Statesman, June 29, 2009), he asks three important questions: "(1) How can a semi-nationalised banking system best serve the different but overlapping interests of UK bank borrowers, depositors and taxpayers, as well as private shareholders and bank executives? (2) How should the systemic risks of banking – and the City generally – be managed through regulation, in order to safeguard the wider UK economy? (3) Is it actually possible for the UK to play host to a major financial service sector?" Considering the role that the UK has played in financial services, finding answers to these questions could have a profound impact on the financial markets of the world.

Perhaps one line of thinking behind such questions lies in opening up the banking sector to more innovative players -- albeit ones that are regulated in some way. One approach would be to enable people (borrowers and lenders) to interact with each other directly, providing a more Peer-2-Peer approach to banking. As trust in the centralized banking sector erodes, perhaps trust built through knowledge of people and communities could be used to establish borrowing and lending institutions. Zopa happens to be one such company, operating in the UK, USA, Japan and Italy. The idea is to allow people, rather than large institutions, to lend to and borrow from each other, thereby sidestepping banks. Essentially, individuals decide whom they want to lend their money to, and the rate of return they will see. A number of credit checks are used to establish the financial status of a lender/borrower, and Zopa makes money by charging a fixed fee for each transaction conducted through its site. In April 2009, Zopa transactions added up to £3 million of lending, a 3.5-fold increase on April 2008; since its launch in March 2005, over £40 million has been disbursed. Carpet Bagging provides a good summary of such Peer-2-Peer lending/banking approaches, identifying other market players in this area such as Prosper (US) and Smava (Germany).
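As a back-of-the-envelope illustration of what a lender on such a platform has to weigh up, the short sketch below estimates a net annual return from an agreed interest rate, a platform fee and an expected default rate. All the figures and the fee model are hypothetical -- they are not Zopa's actual terms -- but they show why credit checks and default estimates matter so much in this model.

```python
# Illustrative only: the fee structure and default assumptions below are NOT
# Zopa's actual terms -- they are hypothetical numbers to show how a lender
# on a Peer-2-Peer platform might estimate an effective annual return.

def effective_return(gross_rate, platform_fee_rate, expected_default_rate):
    """Rough annual return to the lender after fees and expected defaults.

    gross_rate            -- annual interest rate agreed with borrowers (e.g. 0.08)
    platform_fee_rate     -- annual fee taken by the platform (e.g. 0.01)
    expected_default_rate -- fraction of lent capital expected to be lost (e.g. 0.02)
    """
    return gross_rate - platform_fee_rate - expected_default_rate

if __name__ == "__main__":
    rate = effective_return(gross_rate=0.08,
                            platform_fee_rate=0.01,
                            expected_default_rate=0.02)
    print(f"Estimated net annual return: {rate:.1%}")  # -> 5.0%
```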

Deciding what role the state should play in the regulation of such entities would help address some of the questions that Vincent Cable asks in his article. Such P2P lending sites can also fail, however -- as demonstrated by the collapse of Boober in the Netherlands, which affected almost 1,200 people. Could technology, and the recent surge of interest in social networks, lead to a new form of community banking -- akin to the ideas proposed by Silvio Gesell in his Natural Economic Order?

image from: http://fc.sharon.k12.ma.us/~soreilly/economics

Thursday 16 July 2009

Where can Autonomic Computing be of benefit ...?


There is now an active research community focusing its energy on Autonomic Computing (and the associated conference series) -- utilizing it, primarily, to support (computer) systems management. The motivation seems to be the observation that, as computer/information systems get more complex, no one person may fully understand how such (complex) systems operate. It therefore becomes necessary to make each component of such a system more intelligent, enabling it to self-adapt and modify its behaviour based on detected changes in its environment. The mechanisms behind this are inspired by autonomic self-regulation in humans -- although there are also significant overlaps with Ashby's notion of a viability zone in his homeostatic system -- a good introduction here (if I may say so myself!).

Critics of autonomic computing, however, say that this vision is unlikely ever to be realized. They often point out that the vision statements in autonomic computing are too high-level, and that unless they are distilled down to very specific outcomes, they make the field vague and uninteresting from a practical perspective. And herein lies the problem. The very nature of autonomics implies that such approaches (often utilizing mechanisms from machine learning (such as reinforcement learning and neural networks), rule-based adaptation and control theory) can only be applied (and, more importantly, validated) on well-defined problems. However, many large-scale systems appear to be "open", and the well-defined ideal state which drives the autonomic mechanism is often hard to specify. Perhaps what is needed is more work on (software) design and methodologies for applying some of these autonomic mechanisms in more open environments. Perhaps this is a research area that needs further attention?
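To make the "well-defined problem" point concrete, here is a small Python sketch of the kind of closed autonomic loop that is easy to build and validate: a component monitors a single metric and uses a simple rule-based policy to keep it within a viability zone, in the spirit of Ashby's homeostat. Everything here (the metric, the thresholds, the adaptation rule) is a hypothetical illustration; the difficulty described above is precisely that open systems rarely offer such a clean metric or ideal state.

```python
import random

# A minimal sketch of a rule-based autonomic loop: a worker pool monitors its
# own utilisation and grows or shrinks itself to stay inside a "viability zone".
# The metric, thresholds and adaptation rule are illustrative assumptions.

VIABILITY_LOW, VIABILITY_HIGH = 0.4, 0.8   # acceptable utilisation band

class AutonomicWorkerPool:
    def __init__(self, workers=4):
        self.workers = workers

    def monitor(self):
        """Stand-in for a real sensor: observed utilisation of the pool."""
        load = random.uniform(1.0, 8.0)          # pretend incoming work
        return min(load / self.workers, 1.0)

    def analyse_and_plan(self, utilisation):
        """Rule-based adaptation: grow when overloaded, shrink when idle."""
        if utilisation > VIABILITY_HIGH:
            return self.workers + 1
        if utilisation < VIABILITY_LOW and self.workers > 1:
            return self.workers - 1
        return self.workers

    def execute(self, new_workers):
        self.workers = new_workers

    def run(self, cycles=10):
        for step in range(cycles):
            u = self.monitor()
            self.execute(self.analyse_and_plan(u))
            print(f"cycle {step}: utilisation={u:.2f}, workers={self.workers}")

if __name__ == "__main__":
    AutonomicWorkerPool().run()
```

The loop works because utilisation is measurable and the target band is explicit; in an "open" system, deciding what to monitor and what counts as the ideal state is exactly the hard part.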

Image from IBM Almaden

Thursday 2 April 2009

Busting Cancer with Gold




Gold nanospheres carrying antibodies that latch on to cancer cells provide a really interesting new way to deal with this disease. Jin Zhang at UC Santa Cruz indicated that "you could send a person home, have them shine a laser on the specific part of the body with cancer for a couple weeks, and they could be cured of cancer". Interestingly, the size of the nanosphere determines how it responds to different wavelengths of light -- and hence how big an area is affected. According to Brown University's Shouheng Sun, "once the nanoparticles have done their job and destroyed the tumour, the kidneys filter them out of a patient's body within a few hours". Most interestingly, the nanospheres can be seen in an MRI scan and may also be used to locate tumours, as they latch on to tumour cells. According to a Science Daily article, such spheres therefore provide a useful sensor for locating cancers, and subsequently treating them using the approach advocated by Jin Zhang. "What makes this structure special is the combination of the spherical shape, the small size, and the strong absorption in visible and near infrared light," Zhang said. "The absorption is not only strong, it is also narrow and tunable. All of these properties are important for cancer treatment."

Image from: Drug Discovery and Development magazine (http://www.dddmag.com/)