Monday, April 07, 2014

Coarse Coding and Coarse Behavior

I confess that I am fascinated with the ongoing saga of Toronto Mayor Rob Ford’s scandalous behavior, his bid for reelection, and the coverage of it all by the Toronto, Canadian, and international media.  I am particularly intrigued when the same event is described completely differently by different news outlets (with competing political aims in their editorial policies).  At the Leafs home game on April 5, was the key story about a happy, lucid, sober Mr. Ford mobbed by a frenzy of adoring supporters?  Or was the news instead about a belligerent, irate, possibly drinking Mayor being warned about his behavior by security?

Regarding this most recent incident, it is of particular interest that social media provided the most detailed account of the Mayor’s evening out.  Posts on Twitter, several including links to video and photographs, tracked his movements from the start of the hockey game to his confrontation with security, alerted followers to his late-night visit to City Hall (What was that mysterious burning-rubber smell detected by security there? Why is that not mentioned in the traditional media?), and established his early-morning presence at the Muzik nightclub.  In this case, it seemed that Twitter was breaking the news and that the traditional media were playing catch-up.  As reports appeared in the media the next day, some posts on Twitter accused traditional outlets of not telling the complete story!

All of this made me think of Twitter as representing reality in what connectionist cognitive scientists call a coarse code.  Many artificial neural networks generate highly accurate responses by pooling the signals of individual elements, each of which has only noisy, sketchy, or inaccurate information about what is going on.  The ‘coarseness’ of this type of representation is reflected in the fact that every processor inside the network is an inaccurate detector.  The surprising power of the representation comes from the fact that combining all of these poor measures generates a highly accurate one.

For coarse coding to work, the individual (inaccurate) measures generally require two different properties.  First, different measuring elements must have overlapping sensitivity: many of them will be measuring similar things.  Second, different measuring elements must also have different perspectives on what is being detected.  In short, their sensitivities overlap, but are not identical.  When these two properties hold, high accuracy can be produced by combining measures.  This is because if each detector has a different perspective, it will be providing different ‘noise’ than is provided by another.  Combining the different noise from different detectors will tend to cancel it all out.  What remains is the amplified ‘signal’ – the ‘truth’ – that is also being sensed to a limited extent by the various processors in the network (due to their overlapping sensitivities).
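To make the mechanics concrete, here is a minimal sketch in Python (with made-up numbers, a toy Gaussian tuning curve, and a simple response-weighted readout; it illustrates the pooling idea, not any specific published model).  A population of broadly tuned, independently noisy detectors is read out by averaging their preferred values, weighted by their responses, and over many trials the pooled estimate stays much closer to the true stimulus than the guess of the single most active detector.

```python
# A toy illustration of coarse coding (hypothetical numbers throughout):
# many broadly tuned, independently noisy detectors are pooled into a
# single estimate that is far more accurate than any one of them.
import numpy as np

rng = np.random.default_rng(0)

stimulus = 0.37                 # the 'truth' the population must recover
n_detectors = 50
n_trials = 1000

# Overlapping sensitivity: preferred values spread across the range,
# with broad (coarse) Gaussian tuning around each one.
preferred = np.linspace(0.0, 1.0, n_detectors)
width = 0.25
tuning = np.exp(-((stimulus - preferred) ** 2) / (2 * width ** 2))

pooled_err, winner_err = [], []
for _ in range(n_trials):
    # Different perspectives: each detector adds its own independent noise.
    responses = tuning + rng.normal(scale=0.2, size=n_detectors)

    # Pooled readout: a response-weighted average of preferred values.
    pooled = np.sum(responses * preferred) / np.sum(responses)

    # Single-detector readout for contrast: the preferred value of the
    # most active detector (a winner-take-all guess).
    winner = preferred[np.argmax(responses)]

    pooled_err.append(abs(pooled - stimulus))
    winner_err.append(abs(winner - stimulus))

print(f"mean error, pooled readout:        {np.mean(pooled_err):.3f}")
print(f"mean error, single noisy detector: {np.mean(winner_err):.3f}")
```

The response-weighted average is just one simple pooling rule; the point is only that the error of the pooled estimate shrinks as independent, overlapping detectors are combined, while any single coarse detector remains unreliable on its own.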

Each individual tweet on Twitter can be viewed as information provided by an inaccurate detector.  If the sources of a large number of these tweets have slightly different perspectives, or provide different kinds of information (statements vs. pictures vs. videos), then their combined effect can yield a surprisingly accurate picture of events.

Not surprisingly, researchers interested in Big Data are actively exploring this characteristic of social media.  For instance, some researchers are using the content of tweets to predict the results of elections, although the accuracy of this approach is the subject of healthy debate.  Importantly, the accuracy of such predictions requires that the two key properties of coarse coding (information that is similar, but not identical) hold.  When they do not – for instance, when many people retweet the same information, so that different perspectives are not provided – social media can misinform, as shown by Twitter being a continual source of celebrity death hoaxes.
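To see why duplication undermines the pooling, here is an even smaller sketch in the same toy spirit (again with made-up numbers): averaging many independently noisy reports drives the error toward zero, but averaging many copies of a single noisy report – which is roughly what mass retweeting produces – leaves the error exactly where it started.

```python
# A toy contrast (hypothetical numbers): independent noise averages away,
# while shared noise -- the 'everyone retweets one source' case -- does not.
import numpy as np

rng = np.random.default_rng(1)
truth = 0.37
n_reports = 500

# Each report carries its own error (distinct perspectives on the same event).
independent = truth + rng.normal(scale=0.2, size=n_reports)

# One noisy report copied n times (no new perspectives, just repetition).
shared = truth + rng.normal(scale=0.2) * np.ones(n_reports)

print(f"pooled error, independent reports: {abs(independent.mean() - truth):.3f}")
print(f"pooled error, repeated report:     {abs(shared.mean() - truth):.3f}")
```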

To me, the parallel between tweets and successful coarse coding in artificial neural networks clearly indicates that Twitter can be a source of a great deal of accurate information, and makes me reflect on how neural network paradigms might be tweaked to explore tweet contents.

The parallel also makes me think that if I were a politician seeking reelection – particularly one who is such a notorious celebrity that my frequent encounters with the public immediately appear on social media – I would strive to be on my best behavior.  The image of me emerging from all of those tweets might be more accurate and telling than the one that the traditional news media feel safe to publish!

 

Sunday, March 23, 2014

Postsecondary Perspective On A Premier’s Problems

Alberta Premier Alison Redford has suddenly resigned; her successor Dave Hancock was sworn in today.  Redford’s resignation was the culmination of several weeks of revolt within the Progressive Conservative party and the government caucus, daily reports of concerns about the premier’s travel expenses, and terrible poll results.  Many detailed analyses of how this whole situation arose, and what it means for the future of the PC party, are appearing in local and national newspapers.  These include an analysis by Professor Ted Morton (who offers a unique perspective given his role in ousting Redford’s predecessor, Premier Ed Stelmach), rumours of a within-government smear campaign, concerns about political misogyny, and pieces chalking it all up to business as usual in a province where the governing party (and not the electorate) decides who should be premier.

Most of these analyses focus on factors within the governing party.  As it embarks on the process of finding a new leader and premier, I would like to point out an obvious fact about the premier’s and the party’s low popularity.  I focus on just one aspect of the government’s mandate, postsecondary education.  Similar points could be easily made using many other government policies.

The fact is this: during the 2012 provincial election, Redford campaigned on a particular platform; she delivered policies contrary to her platform in her March 2013 budget.

To illustrate, Redford’s campaign promises to reinvest in postsecondary education, and to provide stable and predictable funding for this sector, were transformed into a budget cut of over 7% that took university and college administrators by surprise.  Decisions of this type – about-faces on campaign promises – have, not surprisingly, led to enormous decreases in the government’s popularity, not to mention the obliteration of the electorate’s trust in the government.

The many analyses of the current situation in Alberta politics examine dynamics within the government and the governing party, providing possible explanations for the tension between campaign promises and current policies.  It is important to remember, though, that the general electorate that is outside the party is perhaps less interested in explaining this tension, and is simply more interested in the fact that it exists!

Consider an earlier provincial election campaign successfully conducted by the same party, Ralph Klein’s 1993 “Miracle on the Prairie”.  Klein campaigned on a platform that promised harsh cutbacks in government spending, cuts that were likely more far reaching than those delivered by Redford’s 2013 budget.  Klein won a majority (as did Redford), and proceeded with various budget cuts (as did Redford), but Klein’s popularity went up when his fiscal plan was enacted.

The difference, of course, is that Klein delivered what he promised on the campaign trail.  Redford did not.