- While you may be able to reduce development time via rapid tools, can you speed up analysis and design and still maintain quality?
- Does iteration work? When?
- If we didn't have constraints on time/cost, would we really be our own worst enemy?
- Are quality, speed, cost, and learning really like sliders, where pushing one up forces the others down?
There are also quite a few questions the responses raise in my mind that I'm still grappling with...
- What do you do when you are in an organization that only wants low-quality, rapid solutions, even if they only check the box? In other words, do you fight What the Client Wants? (see also Wendy's comments)
- When is SME (or end-user) produced content with rapid tools the right answer?
I plan to do another couple of posts on this topic and related topics, but this has been pretty interesting already.
3 comments:
Thanks for distilling the many useful insights from the January Big Question. I've added a post to my blog that addresses the questions you've raised here: High Performance Networks solve the problems.
These problems with excessive speed, loss of quality, dysfunctional iterations, perfectionism, and imposed compromises occur in all design professions. Architects are accused of being "stuck on themselves" when their building designs become too elaborate, over budget, and less functional. Engineers describe their "failing to keep the customer satisfied" as "designing gold-plated Cadillacs". When I listen to the director commentary on many DVDs, it's evident that creatives in Hollywood experience the same "speed vs. quality" problems as they work out their film production deadlines, budgets, and costly special effects.
The comments on this post are being tracked and aggregated as part of Learning Circuits Blog's The Big Question for January. Thanks for participating, Tony~
We seem to be treating this as an either/or proposition. "Vs." places speed (and presumably cost) at odds with quality. The two are not mutually exclusive...
Consider this, three years ago I recommended to a particular agency that they consider incremental and part-task support elements, and a solid implementation plan prior to building a monolithic ground-breaking simulation with a cost of over $3M and a development schedule of several years.
That solution rolled out last year... and it sits unused (and is slated for removal) because all of the resources were applied to building the solution, not to figuring out what the problem was. Testing the waters with lower-cost, targeted solutions would have left this organization with at least two things: (1) some usable and timely solutions, and (2) an earlier 'doh' moment, or at least the tingling sensation that they were definitely on the right or wrong track (saving the money that was wasted on the single monolithic solution). True, this approach doesn't necessarily meet the usual notion of quality (not that it would have been a horrible solution). The point is that two tracks could have coexisted: try some simpler things to refine the scope at the beginning of the process, and target specific performance elements with low-fidelity solutions. This almost certainly would have been a better expenditure of resources.
In most projects I have been involved with where development took longer than 30 days, there was an immediate need to solve the problem. Just because there's a faster way to test the waters doesn't necessarily mean that the customer will become accustomed to the cheap, fast way.
Though that prospect is scary for those of us who do this for a living.
The stage is being set by many industry players to enable customers to self-service their eTraining needs. There are two choices: fight it (and lose), or embrace it and help the industry progress to the next stepping stone of maturity.