
A Practical Approach to Risk Management


Following up on my last blog post regarding project management predictions for 2016, I thought I would expand on each of the ‘predictions’ in my first few posts of the year.

Projects Will Still Have Issues in 2016

As I noted in that post, projects will always have issues. And though we know this to be true, it shouldn't prevent project managers and team members alike from being diligent in anticipating risks (that can become issues) and planning for them, as we have a responsibility to do so.

Now, while I'm confident I can regurgitate content from the PMBOK in terms of processes, inputs/outputs and tools & techniques related to risk management as well as anyone, for the purpose of this post I thought I would outline a few practical strategies from my own experience and perspective.

Do a Pre-Project Risk Assessment

Risk management always seems to be a topic of conversation within the realm of project management, but I would argue that it should pre-date the project kick-off. Before a project team is engaged and deemed accountable for the delivery of the project, the organization should have already performed an assessment of the project. Some of the points, questions and analysis to be considered might include:

  • How does this project align with our strategy?
  • Who is the customer? What industry are they in? Do we have any experience in this industry? Is doing work in this industry aligned with our strategy?
  • Have we done business with this customer before? If yes, what was our experience with them? If no, do we have any partners or contacts that have done business with them before and what was their experience with them?
  • Do we have team members with the skills required to do the work? Are these team members available at the allocation required to complete the work in the expected timeline and budget?
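One simple way to make a checklist like this actionable is to score each question and compare the weighted result against an organizational threshold. The sketch below is purely illustrative; the questions, weights, scores and threshold are assumptions, not prescriptions, and every organization would calibrate its own.

```python
# Hypothetical pre-project risk screen: each checklist question is scored
# from 1 (low risk) to 5 (high risk); a weighted average decides whether
# the project clears the organization's assumed risk threshold.

RISK_THRESHOLD = 3.0  # assumed org-specific cutoff, not a standard value

def assess_project(scores):
    """scores maps question -> (weight, score 1-5).
    Returns (weighted_average, proceed?)."""
    total_weight = sum(w for w, _ in scores.values())
    weighted = sum(w * s for w, s in scores.values()) / total_weight
    return weighted, weighted <= RISK_THRESHOLD

# Made-up example: repeat customer, but a new industry for us.
example = {
    "strategic alignment":      (3, 2),  # aligned with strategy -> low risk
    "industry experience":      (2, 4),  # new industry -> higher risk
    "customer history":         (2, 1),  # repeat customer, good experience
    "team skills/availability": (3, 3),
}

score, proceed = assess_project(example)
print(score, proceed)  # -> 2.5 True
```

The point isn't the arithmetic; it's that forcing a number onto each question makes the conversation about risk happen before the team is committed.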

Build in Some Contingency

Based on the pre-project risk assessment, the organization should have a good sense of the level of risk a project has before it starts. If the project is deemed to be aligned to strategy and within the organization’s acceptable risk tolerance, the project team can use the outcomes of the pre-project risk assessment for inputs for the project risk planning.

For example, if we're working in a new industry or with a new customer, or with a customer where there had been issues on previous projects, contingencies can be incorporated into the project budget and/or schedule (and other areas) to mitigate risks. This could mean having additional budget set aside for risks and/or additional time factored into the schedule. On the other hand, if we're working with a familiar customer where we have a strong relationship and an established process proven to be effective with this customer, there should be less of a need for these types of contingencies.
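One common way to size that budget contingency (a sketch, not the only way) is expected monetary value: probability times cost impact, summed across the identified risks. The risks and figures below are entirely made up for illustration.

```python
# Contingency reserve via expected monetary value (EMV).
# Each risk carries an estimated probability and cost impact; the reserve
# is the sum of probability x impact. All numbers here are hypothetical.

risks = [
    # (description,                       probability, cost_impact)
    ("new industry: requirements churn",  0.4, 50_000),
    ("new customer: slow approvals",      0.3, 20_000),
    ("key resource availability",         0.2, 30_000),
]

def contingency_reserve(risks):
    """Sum of probability-weighted cost impacts across identified risks."""
    return sum(p * impact for _, p, impact in risks)

reserve = contingency_reserve(risks)
print(round(reserve))  # amount to set aside on top of the base budget
```

A familiar customer with a proven process would show up here as fewer risks, or lower probabilities, and hence a smaller reserve, which matches the intuition in the paragraph above.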

Apply Your Lessons Learned

Most project management methodologies advocate for a post-project lessons learned (or post-mortem) type of session to identify project 'failures' and 'successes'. From a risk management perspective, ensuring that lessons learned on past projects result in actionable improvements that are incorporated into future projects is key.

Someone famous once defined insanity as ‘doing the same thing, but expecting a different result’, so unless you’re planning for the ‘insanity defense’ when your project goes off the rails, understanding and applying your lessons learned will be a key strategy in mitigating risks on your project.




Filed under Management, Project Management, Strategy, Uncategorized

Project Management Predictions for 2016


You don't have to look far this time of year if you're looking for a view into what people are saying the next year will bring. As sure as there are those making resolutions for the New Year, there are others making claims about what is to come in the New Year, for the purpose of informing, entertaining or somewhere in between. Whether you're into technology, science, politics, or the movies, there is no shortage of predictions, some more serious than others, for what the new year will bring.

In this vein, and walking the line between being informative and being entertaining (as is generally my goal with my posts; this time leaning more to the latter than the former), here are my top 3 project management predictions for 2016.

Projects Will Still Have Issues in 2016

I feel OK about the lack of a **spoiler alert** on my number one prediction for projects in 2016, since I can't imagine that this has come as a surprise to anyone. While a new year comes with new beginnings and (hopefully) a renewed sense of 'this time it will be different', the reality is that projects have issues. Always have. Always will. To be clear, this is not an admission of defeat; not by any stretch. It's simply a recognition of what is real. Despite the best-laid plans, the best of intentions, talented, hardworking teams, solid processes, training and experience, there will be risks and there will be issues. Timelines will sometimes be unrealistic. Requirements will sometimes not be as clear as they could or should be, and the list goes on. All that said, good teams will recognize this, plan for the known-unknowns, anticipate having to deal with the unknown-unknowns and, most importantly, support each other when the going gets tough.

People Will Still be Talking About Agile

In 2001 the Agile Manifesto was created by representatives from various areas of the software development community, as a collection of guiding principles that challenged long-held notions and methodologies for application development. And while the manifesto was created in 2001, the principles and "lightweight methodologies" that it is based on were in use long before then, with "scrum" dating back to the early nineties and concepts around 'iterative development' dating back as early as the 1950s.

To come straight to the point: this stuff isn't new. Yet it is the sort of topic that has remained relevant over many decades. For some perspective on the 2001 date of the manifesto's formalization: 2001 was the year that Microsoft released Internet Explorer 6, Windows XP and the original Xbox; Napster had a user base of 26 million users; and the Compaq Presario was the hottest new computer on the market. You don't hear much talk about those things anymore, do you?

For one reason or another, Agile has been and continues to be a hot topic in the world of software development. Traditionalists like to argue that it's nothing more than an excuse not to plan or document requirements, while proponents are quick to dispel these notions and point out that 'responding to change over following a plan' and valuing 'working software over comprehensive documentation' result in a better product in the end. All the while, the training and certification industries are trying to make everyone 'certified practitioners' and 'scrum masters', spamming your inbox every chance they get and claiming that Agile is the silver bullet you've been looking for.

And as if there isn’t enough to debate, I have even heard people debate over the pronunciation of Agile, as apparently that is a thing. Who knew?

So, whether it’s to defend the viability of the approach/methodology, to defend a principle or a process, or to debate its pronunciation, I predict that people will still be talking about Agile in 2016.

Project Management Will Still Not Be All That Exciting

As this article does a great job of describing, through a conversation between Leonard and Penny on an episode of The Big Bang Theory, project management is a bit like physics:

Penny: “So, what’s new in the world of physics?”

Leonard: “Nothing.”

Penny: “Really, nothing?”

Leonard: "Well, with the exception of string theory, not much has happened since the 1930s, and you can't prove string theory. At best you can say, 'Hey, look, my idea has an internal logical consistency.'"

Penny: “Ah. Well I’m sure things will pick up.”

This is to say, from my perspective, that while there are certainly advances in project management methodologies, tools and techniques (a few for 2016 here), they aren't often flashy and rarely garner much attention (with the possible exception of Agile; see above). That said, behind every project involving Big Data, Artificial Intelligence, Cloud Computing, Virtual Reality, or whatever emerging technology is in play, rest assured that there is a Project Manager, and a project team, working tirelessly behind the scenes.



Filed under Business, General, Innovation, Management, Project Management, Strategy, Uncategorized

Please Stand-by to be Processed


Let’s begin with a quote from Alistair Cockburn, one of the initiators of the Agile movement in software development.

‘While a good process can’t assure delivery, and a good team can deliver despite an unwieldy process, a poor process can get in the way quite neatly’.

In case you haven’t figured it out by now, this post is about process. What it is. What it isn’t. Common misconceptions. Benefits. Implementation strategies.

What is a Process?

An established, step-by-step way of doing things, and a well-articulated set of internal and external documents and tools supporting it.

This is, of course, one definition of any number that may be out there. Another, more succinct definition (of sorts) that I’ve heard used is, ‘applied common sense’.

What isn’t a Process?

Magic. A silver bullet. A panacea. As the quote above notes, a good process can't assure delivery. That said, it's a place to start: a baseline from which to begin and to continuously improve. Even the Agile purists who declare, as part of their Agile Manifesto, that they value 'individuals and interactions over processes and tools' are in fact rather process driven in their own way, through adherence to Agile processes and practices such as user story modeling, iterative development and time-boxing.

Preferred methodology aside, we could all benefit from a little bit of process – though not everyone is convinced.

Common Myths Associated with Process

  • Adds unnecessary steps and time
  • A form of micro-management
  • Adds documentation for documentation’s sake
  • Limits creativity
  • Limits flexibility

These are common reasons why establishing (and documenting where appropriate) processes on a team or within an organization is often met with resistance.

On the other side of the coin, having a solid process that is understood, documented and followed can truly be the difference between success and failure on a project. And while it may not always be readily apparent from the outset, process can have numerous benefits for everyone involved. Here are 10+ that come to mind.

Benefits of Process

  1. Repeatable. Predictable. Manageable. Having a process in place will help to ensure that your delivery process is repeatable, predictable and manageable which has obvious associated benefits – as well as others explored in the subsequent points.
  2. Maximized efficiency. Minimized waste. Being able to hit the ground running on the actual work to be done vs. spending time defining the steps (the 'how') and/or creating the templates/documents/assets etc. to facilitate the work.
  3. Clear expectations. So everyone knows what they need to do, what everyone else needs to do, and what the timing and dependencies are.
  4. Reduces risk. By having processes in place related to the risky areas of delivery, risks are identified and mitigated early.
  5. Reduced re-work. With risks and/or issues caught earlier, the likelihood of costly re-work is reduced.
  6. Better quality product. With risks and/or issues caught earlier and the ‘how’ figured out in advance, the time and energy on the project can be focused on developing and testing the product, and the results will show.
  7. Reduced interruptions. Increased focus. Increased productivity. If there are still any skeptics, hopefully this one will help to convert them. While still having processes and steps in place related to the above points (e.g. risk identification and mitigation), an objective of any process should always be to provide more focused time for the team to do their work.
  8. Better awareness of the big picture. With a process in place and documented, everyone can see how their contributions fit into the larger picture.
  9. Cross training and professional growth. When processes are established and documented, it’s easy to train team members and new hires on how things are done to allow people to advance their career and not be ‘stuck’ having to take care of a certain task because he/she is the only one that knows how to do it.
  10. Increased customer satisfaction. Increased employee morale. With all of the above considered and addressed through the implementation of a process that works for your particular project/business unit/organization, the collective result will be increased customer satisfaction and increased employee morale, which is obviously good for everyone.

So now that we’ve seen the benefits of process, here are a few strategies for successful implementation.

Implementation Strategies

  • Alignment. When developing, documenting and improving your processes, take the time to step back and make sure they align with the goals and values of your organization, project and team members. Much like understanding how a work package or a test case maps back to a requirement in a delivery project, the way you get things done should also map back to individual, team and company goals.
  • Collaborate. Work collaboratively to develop your process with the people that will be using it. Seeking inputs from everyone will go a long way to ensure you have considered all of the areas and will also increase likelihood of adoption.
  • Don't over analyze. While it's important not to create a process in a vacuum, without consulting and collecting input from all the stakeholders, be mindful of slipping into analysis-paralysis mode trying to make it perfect for everyone. Take a page from the 'Lean' start-up playbook and seek to get your MVP (minimum viable product) out there, then improve it over time.
  • Continuously improve. Further to the above point, processes should evolve and improve, so be active in looking for ways to make this happen. Process is not a one-shot deal. Continuous improvement is key.
  • Understand that one size doesn’t fit all. Figure out what works for your team, your customer and your project and tailor accordingly. Consider having a step in the process to evaluate and tailor the process for each project, adjusting up or down based on factors such as the project’s size, type, technology and customer.


Filed under Business, Business Analysis, General, Innovation, Management, People, Project Management, Strategy, Uncategorized

Broken Windows. Software. Process and Disorder.


Studies in the fields of criminology and urban decay have repeatedly found an interesting trigger mechanism – one that turns a clean and intact building, into a smashed and abandoned one. Care to hazard a guess what it could be?

A single broken window.

That single broken window, left unrepaired for a period of time, has invariably led to the rapid decline of the building and its community.

How does this happen?

It happens as a result of the sense of abandonment felt by those who live in and around the building; the sense that the powers that be don't care about the building. So another window gets broken; then another. Later, graffiti appears, followed by damage to the structure of the building. The spiral then continues until it gets to the point where the damage is beyond the building owner's desire or willingness to fix it.

In one experiment in 1969, Stanford psychologist Philip Zimbardo arranged to have a car without license plates parked with its hood up in the Bronx, New York, and another in Palo Alto, California. In the New York example, vandals attacked the car within 10 minutes of its 'abandonment', and within 24 hours everything of any value had been taken. In California, the car sat untouched for more than a week. At that point Zimbardo himself smashed the windshield with a sledgehammer; within minutes, people passing by were joining in, and within a few hours the car had been turned upside down and completely destroyed.

Social scientists Wilson and Kelling wrote about this ‘Broken Windows Theory’ in the March 1982 issue of ‘The Atlantic’. In this article they made these ‘inextricable linkages’ between disorder (or at least the perception of disorder) and crime. The New York vs. California example above – also referenced in the piece – notes how ‘untended property becomes fair game for people out for fun or plunder and even for people who ordinarily would not dream of doing such things and who probably consider themselves law-abiding’, and further, ‘vandalism can occur anywhere once communal barriers – the sense of mutual regard and the obligations of civility – are lowered by actions that seem to signal that “no one cares”’.

The Broken Windows Theory has been credited in part for community-based policing initiatives in various cities, perhaps most notably in New York City under the leadership of Rudolph Giuliani in the 1990s, where the focus was put on small crimes (e.g. vandalism, subway turnstile jumping) as a means of preventing larger crimes by creating an atmosphere of order and lawfulness.

So, what does this have to do with software development?

Since the Broken Windows Theory was first proposed, software development teams have found inspiration in it, as a metaphor for focusing on the small things in order to avoid larger problems down the road. Dealing with the 'broken windows', if you will, since ignoring them can result in 'technical debt' that will eventually have to be paid.

The idea is that, in addition to ensuring you're doing all the right things, sweating the [seemingly] 'small stuff', you're not only developing a higher quality solution, you're also creating an atmosphere on your project/team/company where process and quality matter. The Broken Windows Theory, as it applies to software development, generally asserts that issues should be dealt with before any further code is written. If the build doesn't compile, fix it now. If there are bugs, fix them now. While this sounds simple enough, it's not always the way it goes, for one reason or another.

While each team, and even each project for that matter, will have different sets of processes for different reasons, there are still principles from this theory that can be used. Make it a point to agree on processes and standards for delivery. From unit tests to comments and documentation, continuous integration and bug fixing: establish what your process and expected quality levels are and stick to them; even when you're busy.
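The 'fix it now' principle can be made concrete with a very small quality gate. This is only a sketch (real teams would wire this into their CI server, and the specific checks listed are assumptions about a toolchain, not recommendations): run each agreed-upon check in order, and stop at the first failure.

```python
# A minimal 'fix it now' quality gate. Each check is a command; the first
# failure stops the line, on the principle that a broken build is a
# broken window that must be repaired before any further work proceeds.
import subprocess
import sys

def gate(checks):
    """Run each command in order; return False at the first failure."""
    for cmd in checks:
        if subprocess.run(cmd).returncode != 0:
            return False  # a broken window: stop and fix it now
    return True

# Illustrative checks only; substitute your own team's toolchain.
CHECKS = [
    [sys.executable, "-m", "compileall", "-q", "."],  # does it compile?
    # [sys.executable, "-m", "pytest", "-q"],         # do the tests pass?
]
```

The design choice worth noting is the early return: the gate refuses to evaluate later checks once one has failed, mirroring the advice above that nothing else proceeds until the window is fixed.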

Especially when you’re busy.

In fact, ensure you’re working closely with whoever needs to be involved to make sure that you don’t find yourself constrained to the point where you’re unable to do all of these ‘little’ things, since that is a slippery slope to somewhere you don’t want to be.

I would further assert that this goes far beyond strategies related directly to the software development/coding aspect of solution delivery. Teams should also ensure that the overarching processes for solution delivery are agreed upon and maintained. While adherence to processes around things like communication, status reporting and risk analysis can sometimes seem like easy pickings for corner-cutting, the outcomes of doing so can lead you to a place where you could find yourself wondering how you got there.

One broken window at a time.



Filed under General, Management, People, Project Management, Strategy

The Innovator’s Dilemma


This is a book that I've recently read for my latest MBA course on Managing Innovation. It's kind of one of those 'must-read' books for business/technology types and aspiring super-nerds like me. Without giving it all away, the crux of it is that when it comes to certain types of innovations ("disruptive" innovations), engaging in smart business practices may be the last thing you'll want to do.


It is a bit of a mind bender, but Clayton Christensen goes into a fair bit of detail (don't say I didn't warn you), using predominantly the computer disk drive industry as his testing ground, to outline how doing things like listening to customers, investing in the highest-ROI opportunities and allocating resources to continue to innovate and improve on products and processes has actually led to the demise of many organizations.

As I said, it's a bit of a mind bender.

If you're interested in learning more about how doing these seemingly 'right things' can in some cases end up being the 'wrong things' (hence the dilemma), I'd recommend checking out this book. Alternatively, if you want the Coles Notes version, along with some of my insights on how I think the internet is "disrupting" the education industry, continue reading to check out a short paper that I recently submitted for the class.


This is a book review of 'The Innovator's Dilemma' by Clayton Christensen, prepared and submitted for MBA 7351 – Managing Innovations, at the University of New Brunswick, Saint John. It will begin with a summary of disruptive technologies and innovations, as outlined in the book, to demonstrate an understanding of the concepts. This will be followed by a discussion on how the internet has disrupted, and will continue to disrupt, the education industry. The paper will then end with a brief conclusion.

A Summary of Disruptive Technology & Innovations

In 'The Innovator's Dilemma', Clayton Christensen does a deep dive into the paradox of how, when it comes to dealing with 'disruptive' technologies, following good management practices can, and often does, lead to poor results. He uses the computer disk drive industry as the predominant focus of his research for the book since, as is still largely the case, it is an industry that moves at a staggering pace, making it an easy target for research of an evolutionary (or in this case, revolutionary) nature.

Since ‘The Innovator’s Dilemma’ was first published some ten years ago, the term ‘disruptive’ has become somewhat ubiquitous, but as is often the case, it is used loosely and often incorrectly. As such, before proceeding any further, it will be helpful to first define what exactly a ‘disruptive’ technology is, as defined by Christensen.

Technologies and innovations can fall into one of two categories. The first is what Christensen calls 'sustaining' technologies, which are the most common. These are the types of advances that improve on existing products or services. They can be incremental in nature, but they can also be radical. This is important to note, as sometimes radical sustaining technologies are mistaken for disruptive technologies. Sustaining technologies, whether incremental or radical, are in fact the types that "foster improved performance of established products along dimensions of performance that mainstream customers in major markets have historically valued" (Christensen, page xviii). As Christensen outlines, using examples from the disk drive industry, as well as others such as the excavator and motorcycle industries, sustaining innovations progress along the same performance trajectories, whereas disruptive technologies redefine performance trajectories.

Disruptive innovations, unlike sustaining innovations which continue to advance existing technologies and products, often (at least in the short term) offer worse performance than what is currently in the market. As such, when they first emerge they are of little, if any, interest to the mainstream market. And why should they be? A fundamental tenet of business is to seek out innovation as a way to create and maintain a competitive advantage in the marketplace. Another is to listen to your customers and give them what they want. Yet another is to invest funds and allocate resources where you can get the highest returns. This is how good companies (companies not unlike the organizations featured in the book, including IBM, Seagate, Quantum and others) are typically managed: they seek out innovation (most often sustaining in nature); they listen to their customers; they astutely invest and allocate resources based on their expert knowledge, industry research and best practices. Yet, despite all of this, many wind up losing their industry leadership positions and eventually fail. Why does this happen? This is, in part, the "Innovator's Dilemma".

As Christensen illustrates throughout the book, the undoing of these otherwise well-managed firms comes from their inability to make strategic decisions to embrace disruptive technologies, or at least to do so while there is still time. What has happened time after time, and in industry after industry, is that a disruptive technology will emerge onto the scene and, true to the disruptive attributes noted earlier (a worse, not better, performer by traditionally held standards), will be met with little fanfare from organizations trying to continually one-up the competition with the latest sustaining advancement. Unable to break into traditional markets, the firms behind the disruptive innovation have to figure out an entirely new value proposition for a new, often smaller market. What happens in these new, often fringe markets is that the disruptive innovation gains momentum and advances, through sustaining innovations on this new performance trajectory, until it can move 'up-market' to serve the very industries that weren't initially interested. This then gets the attention of the often larger, established firms, who will sometimes scramble to get on the bandwagon, but often it's too little, too late. It's particularly interesting, and perplexing, how when it comes to disruptive technologies, the attributes that make them "unattractive to mainstream markets are the attributes on which the new markets will be built" (Christensen, page 267).

To illustrate these points using an example from the book, we'll look at the disk drive industry, where Christensen takes readers through the progression of disk drives from 14 inches (diameter) to 8 inches, to 5.25 inches, to 3.5 inches, to 2.5 inches and eventually to 1.8 inches. Initially the 14 inch drives, for use in mainframe computers, were the industry norm. In 1974 their average storage capacity was 130 MB (megabytes). Through sustaining innovations, disk drive makers were able to keep their customers happy by increasing capacities at levels the customers had come to expect. By the early 1980s, several entrant firms had emerged, producing smaller 8 inch drives with much lower capacities, from 10 to 40 MB, nowhere near the requirement for a mainframe computer. As such, the mainframe computer users weren't interested in these drives, despite the fact that they were smaller. Size wasn't a factor for mainframe users; capacity was. So in order to survive, the makers of the smaller drives (Shugart Associates, Micropolis, Priam and Quantum) started selling to firms in a newer, smaller market than mainframes: minicomputers.

As the 8 inch drives gained momentum in the minicomputer market, developing sustaining innovations (at a faster rate than the established firms in the mainframe market) on this new performance trajectory, they were eventually able to begin serving the mainframe market, thus pushing out the established incumbents. Christensen goes on to walk readers through how this cycle repeats for all of the different drive sizes, from 14 inches down to 1.8 inches.

In examining how disruptive technologies have led to the demise of organizations that were once atop their industries, Christensen ensures that his readers understand that it wasn't these organizations' lack of technical capability. He outlines how, in some cases, the industry-leading organizations had working prototypes of the disruptive innovations ready to be taken to market. The reasons for their failures were largely related to what Christensen calls 'value networks' and their related cost structures. Essentially, large organizations in large markets are organized internally, and within their supply chains, to serve large markets. In the same vein, large companies require large markets to meet their growth targets, and the smaller, fringe market opportunities presented in the early days of disruptive innovations are generally not attractive propositions.

Similarly, he goes to great lengths to ensure that his readers understand that it wasn’t their inability to diligently manage their organizations. In fact, the diligent management of their organizations was the reason for their demise. They were innovative. They invested wisely. They listened to their customers. But what did their customers want? Better performance or worse performance? What made sense to invest in? Established markets with high returns, or fringe markets with low returns and little or no market research data?

With disruptive technologies, “doing the right thing is the wrong thing” (Christensen, page xxxiv). This is the “Innovator’s Dilemma”.

Impact of Disruptive Technology/Innovations on the Education Industry

At the risk of stating the obvious, the internet has and continues to have a profound effect on many industries. It has been a driver of both sustaining innovations as well as disruptive innovations.

Some of the more prevalent and talked-about industries impacted by disruptive innovations tend to be the music (iTunes vs. brick-and-mortar music stores), video (Netflix vs. Blockbuster), and print media (online news, Google, blogs, Twitter and others vs. traditional newspapers) industries, as a result of the major changes that have already taken place and are continuing to take place.

There are other industries that are still in the very early days of what may become disruptions, some of which include the electric car (e.g. Tesla) and P2P currency (e.g. Bitcoin); each of these, among others, was a strong candidate for further exploration in this paper.

An industry that is somewhere in the middle is the education industry, which will be explored in more detail in this section. This industry is an interesting one in that the internet is impacting it through both sustaining innovations and disruptive innovations. Further, within the areas that can be defined as disruptive, there are areas of similarity with those explored in "The Innovator's Dilemma" as well as areas of difference, making it a good candidate for inclusion here, since whether certain aspects of the internet's impact are truly disruptive in nature may still be subject to some debate.

Leveraging the internet to provide learning opportunities has been around for almost as long as the internet itself, or at least the internet as we know it today (post-Windows 95 era). While generations past, including my own, had to learn things the 'old fashioned way' (i.e. buying and reading a physical book, or setting foot in a physical classroom), now it is often as simple as performing a Google search for an insightful blog or YouTube video to find the knowledge that you seek. Similarly, universities and educational institutions have been leveraging the power of the internet to extend learning capabilities to their students through learning portals, such as the 'Desire2Learn' portal currently deployed at the University of New Brunswick, Saint John. When the internet is used in this fashion, these are sustaining types of innovations. They further improve on an existing service. They add value, incrementally or radically, in ways that users have come to expect.

Somewhat more recently, as broadband internet has become more accessible – combined with other factors such as rising university tuition costs and a generation of people doing more and more things online (i.e. working, communicating, watching TV, listening to music) – opportunities to learn online have increased and more people are taking advantage. According to a recent study commissioned by The Sloan Consortium, a leading professional online learning institution, "over 6.7 million students were taking at least one online course during the fall 2011 term, an increase of 570,000 students over the previous year" (The Sloan Consortium, 2013).

Where things start to take on disruptive characteristics is in a couple of areas. Firstly, many of the entrant firms providing these online learning platforms are not your traditional, established firms (akin to the "IBMs" of the disk drive industry). They are smaller start-ups such as Coursera and Academic Earth, education companies that partner with top universities and organizations around the world to offer courses online for anyone to take. Secondly – and arguably the most important reason they stand to disrupt this industry in a big way – the courses are free.

Recalling that disruptive technologies are often thought of as inferior offerings when they first arrive on the scene, it's no major surprise that there is still much debate over their merits within academic circles, with many education purists insisting that there is no substitute for the face-to-face interaction and collaboration that happens in a physical classroom.

One of the findings from the earlier-noted Sloan Consortium study was that "only 30.2 percent of chief academic officers believe that their faculty accept the value and legitimacy of online education – a rate lower than recorded in 2004" (The Sloan Consortium, 2013). However, the same study also found evidence that reaction is in fact mixed, with many academic institutions still pondering how they will continue integrating online learning into their long-term strategies.

While traditional brick and mortar institutions continue working toward figuring this out, it's hard to deny how free courses from top universities – the likes of MIT and Harvard – are opening the door to an untapped market, giving these technologies the momentum they require to become a real threat.

Interestingly, two of the major institutions driving online learning agendas are in fact two of the most established brick and mortar institutions: MIT and Harvard. These two academic powerhouses recently teamed up on an initiative called 'edX', a program that builds on MIT's existing OpenCourseWare platform (thousands of free online courses) to provide not only a library of free online courses but also an open-source technology platform free for use by other universities looking to follow suit.


It will now be interesting to see how the traditional brick and mortar universities react to these disruptive innovations. Will they simply write them off as inferior products and continue developing sustaining innovations to serve their existing, mainstream markets – hoping to ignore (or simply unaware of) the disruptive innovations advancing in the smaller markets below them, potentially at greater rates, as we have seen in the past? Or will they take a hard look at what steps may be necessary – within their value networks, cost structures and organizational cultures – to keep history from repeating itself in yet another industry?


The Sloan Consortium (2013) Retrieved from:

Christensen (2000), The Innovator’s Dilemma

Dunn (2012, May 2) Harvard and MIT Introduce edX: The Future Of Online Learning. Retrieved from:

Horn (2012, April 12) Yes, University of Phoenix is Disruptive; No, That Doesn’t Make It the End-All. Retrieved from:

Myers (2011, November 13) Clayton Christensen: Why online education is ready for disruption, now. Retrieved from:

Myers (2011, May 14) How the Internet is Revolutionizing Education. Retrieved from:

Stokes (2011, April 14) Is Online Learning a Disruptive Innovation? Retrieved from:


Buy Local


As an assignment for my MBA B2B Marketing course, I was involved with a project to make a case for private and public sector corporations to procure ICT services locally, as opposed to from outside the region. The idea was to take a closer look at the impacts of doing so and make not only an 'economic argument' but a 'social argument' as well for the merits of such an approach.

At a high level, the argument centers on the fact that when governments and corporations procure services (IT or otherwise) from organizations based in the region, those firms and their employees contribute to the tax base, the economy and society as a whole. While this may sound simple enough, when procurement departments are accustomed (or perhaps even mandated) to making purchasing decisions solely – or even largely – on the basis of lowest cost, without taking a 'total cost of ownership' approach, it's easier said than done.

In putting together the buy local value proposition – taking into account both economic and social factors – we were able to make some extrapolations of our own for the benefits of buying local, based on multiplier effects and assumptions drawn from similar studies. We also looked at the strategies and best practices of global ICT leaders such as Finland and Iceland, nations that have grown their ICT sectors significantly through commitments to R&D investment, education, entrepreneurship and ICT exports, among other things. In doing so we were able to build upon some great work already being done by the New Brunswick Information Technology Council (NBITC) and make recommendations for driving economic development through a strengthened local ICT sector, with a 'buy local' strategy as a key pillar.
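To give a feel for the kind of multiplier extrapolation involved, here is a minimal sketch. The retention shares and recirculation rate below are illustrative assumptions I've picked for this post, not the figures from our actual analysis: the idea is simply that a dollar spent with a local firm is re-spent locally more often, so more of it stays in the provincial economy.

```python
def local_retention(spend, first_round_share, recirculation_rate, rounds=5):
    """Estimate total dollars retained in the local economy as an initial
    `spend` recirculates through successive rounds of local purchasing."""
    retained = 0.0
    in_economy = spend * first_round_share  # share that stays after round one
    for _ in range(rounds):
        retained += in_economy
        in_economy *= recirculation_rate  # portion re-spent locally each round
    return retained

# Assumed shares: a local firm initially keeps ~68% of revenue in-province,
# a non-local firm ~43% (hypothetical numbers for illustration).
local = local_retention(100, 0.68, 0.45)
non_local = local_retention(100, 0.43, 0.45)
print(round(local - non_local, 2))  # extra dollars retained per $100 spent locally
```

With these assumed inputs, the gap works out to roughly $45 per $100 spent – comfortably inside the $30 to $54 range the studies we drew on reported, though of course the result is only as good as the multipliers you feed in.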

Here is an article from one of the co-sponsors of the project, Larry Sampson of the NBITC, that ran in the Telegraph-Journal (a local newspaper) on July 22nd and was subsequently posted on the university blog, where I retrieved it. In it he references the project as well as a few specifics related to my group.

The Need to Buy Local

New Brunswick Telegraph-Journal  Mon Jul 22 2013  Page: B1  Section: B  Byline: Larry Sampson Telegraph-journal

Early last week I was fortunate enough to sit in on the final presentations of the MBA business to business marketing class at UNBSJ. The teams (there were seven in total) had spent the bulk of the last few months trying to determine if it made more sense for government to purchase information technology goods and services from businesses owned and operated in New Brunswick (local), or those who operated here, but were owned elsewhere (non-local). Assuming we’re talking equivalent quality and price, you wouldn’t think there’d be much distinction between the two – both types of businesses employ people in the province, buy goods and services locally, and are active in the community – but apparently there is.

In addition to being struck by the professional level of many of the presentations and the insights they provided, it was clear this was a group of smart, passionate people with a global perspective. There were students from Europe, Asia, and North America – most of whom were “from away” – and many of whom had prior real world experience.

As you might expect there was considerable variation between the team presentations, however a number of consistent themes emerged. All the teams saw ICT as a significant means of growing the economy. The current low levels of investment in research and development by government and the private sector were viewed both as a weakness and opportunity. Multiple teams pointed out the incongruity between government trying to develop the economy on one hand, while ignoring the potential to leverage its own procurement to that end on the other. The need to better equip our education system to foster entrepreneurs and grow the supply of ICT-related talent was also regularly cited, as was the potential for ICT to improve the competitiveness of our more traditional economic sectors.

Every single team suggested there would be a significant financial and social advantage to New Brunswick should government introduce a formal “Buy Local” policy. Many of the teams backed this up with a detailed analysis showing how buying locally keeps more of the money in the province, generates increased government revenue, and produces more jobs. In addition to modelling the financial impact themselves (one team assessed the net economic impact was over 2:1 in favour of local purchasing), teams quoted from a number of Canadian and American studies. These studies found every $100 spent “non-locally” results in $30 to $54 more leaving that economy than if the same $100 was spent with a local firm. On the social side, local businesses gave back $5 to the community for every $1 provided by a multinational.

One of the more recent studies cited was The Power of Purchasing: The Economic Impacts of Local Procurement – a study that was completed this May by the Columbia Institute, LOCO, and the ISIS Research Centre in the Sauder School of Business at UBC. It concluded that local firms recirculate nearly double the amount of revenue when compared to multinationals, and create almost twice as many jobs per dollar of revenue.

When asked why we weren’t doing a better job of leveraging local companies to grow the provincial economy, the teams offered a number of explanations, including: government hasn’t done the math or seen the opportunity; the ICT sector is doing a lousy job of educating government; there is no champion/owner inside of government; and government procurement is fixated on reducing cost to the detriment of the big picture.

Assuming these students have gotten it even vaguely right, there’s an awful lot of smoke here for there not to be a fire.

Larry Sampson is the CEO of the New Brunswick Information Technology Council.

© 2013 Telegraph-Journal (New Brunswick)



Not All Customers Created Equal


At the risk of stating the obvious, there is a fair bit of buzz around entrepreneurship and start-ups these days. I suppose the truth is that this is something I'm interested in and work around, so it could just be that I notice it more than some. Either way, I think it's safe to say it's an exciting time to be in the technology space. It seems every few weeks you hear about another "success story" (editorial note: success can be defined in many different ways) of some start-up that has hit the jackpot by being scooped up by a large corporation – Instagram, Tumblr, or locally in my neck of the woods, Radian6, to name a few examples – realizing their exit strategy (be it planned or unplanned).

Again, at the risk of stating the obvious: regardless of what you're "selling", and regardless of your exit strategy, at some point you're going to need some "customers". (Paying customers would be nice, but that hasn't stopped many of today's start-ups, so don't let it stop you... at least for now.) Knowing this, it's helpful to understand how customer adoption can differ. Some customers like to have the newest, shiniest things just because... well, because they're new and they're shiny, whereas other customers are more pragmatic and will hang back and wait for the "2.0 release", when perhaps the cost is a bit lower and/or some of the bugs have been worked out. This is the case in both the B2C and B2B worlds. Customers exist on a continuum – from the "Innovators" at one end to the "Laggards" at the other. Though there are many ways to slice and dice the types and tendencies of the various "customer adoption types", I've outlined below one way I've seen it done recently (with a B2B focus), with some of my thoughts woven in for good measure.

  • Innovators – these guys are the "techies"; the group even before the "early adopters" that we hear about so often. These guys are the ones that must have the latest and greatest gadgets and technologies. The consumer market equivalent would be the hardcore "geeks" (no disrespect; being a geek is über-cool now) that camp out overnight at a Future Shop whenever Apple releases... well, just about anything (again, no disrespect; kudos to Apple for building the hype and delivering most of the time). If you're trying to appeal to this crowd – and why wouldn't you want to, really – you'll want to focus on the elegance and general coolness of the technology.
  • Early Adopters – hey I’ve heard of these guys! These guys are the “visionaries” and often also leaders in their industry. They prefer technology “revolution over evolution” and are always on the lookout for a new technology to give them a competitive advantage. Landing these guys as customers should be part of any strategy as they can become great case studies for early successes that can help you land customers of a more pragmatic nature.
  • Early Majority – these guys are the "pragmatists". They dig technology but really want to see proof that adopting something new WILL yield business benefits (e.g. lower costs, higher revenue). Show these guys the business case for how this will help them save money, grow revenue or some combination of the two. Show them how finely tuned your product is now as a result of working closely with your innovators and early adopters. In his book "Crossing the Chasm", Geoffrey Moore discusses how bridging the gap from the early adopters to the early majority is a key step for any organization seeking mainstream success.
  • Late Majority – this group is often referred to as the “conservatives”, and as the name suggests, they tend to resist change but are willing to adopt so long as the changes can be integrated into their existing systems and/or processes. Boring!
  • Laggards – these guys are the “skeptics”. They’re just not into it. They only adopt technology when they’re absolutely forced to. (Internet Explorer 6 is no longer supported. Please, please upgrade!)
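For a rough sense of how big each of these groups is, the classic Rogers diffusion curve assigns approximate shares to each segment (2.5% / 13.5% / 34% / 34% / 16%). A quick sketch, using those textbook proportions and a hypothetical addressable market of 10,000 accounts:

```python
# Classic Rogers diffusion-curve proportions (approximate, from the
# diffusion of innovations literature). The market size is hypothetical.
ROGERS_SHARES = {
    "Innovators": 0.025,
    "Early Adopters": 0.135,
    "Early Majority": 0.34,
    "Late Majority": 0.34,
    "Laggards": 0.16,
}

def segment_sizes(total_market):
    """Split a hypothetical addressable market across adoption segments."""
    return {name: round(total_market * share)
            for name, share in ROGERS_SHARES.items()}

for name, size in segment_sizes(10_000).items():
    print(f"{name}: {size}")
```

The takeaway matches Moore's point above: the two groups left of the chasm amount to only about 16% of the market, so mainstream success means winning over the pragmatic 68% in the middle.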

So that’s it*. Know your customers then go get em’.

* That’s not it.

