A common thread in my interactions with standards bodies (W3C & IDPF, mainly) is that most of the participants (esp. those who do it as their day job) rarely take incentives into account. Three questions matter for any spec:
1. Is it technically feasible?
2. Is it economically feasible?
3. Does anybody have the incentives to actually implement it?
Most standards people stop at 1. A few stop at 2. A tiny tiny minority goes all the way and realises that without an incentive to implement, the standard will never matter.
A classic example is pretty much any of the many problems in the EPUB ecosystem. No matter which problem you choose, you won't find anybody in a position to affect it that has an incentive to fix it. The money just isn't there in publishing.
This is why a complex standardised solution to a problem that affects regular users, but doesn't really affect corporations, is less likely to be adopted and implemented in a usable manner than a solution that ties directly into a company's incentives.
There are many reasons why standards and specs should favour the simple over the complex.
Most of the time, users benefit more from a simple solution that solves 25% of their problem & is actually well implemented, than a complex solution that solves 100% of the problem but never ever ever gets properly implemented.
The simpler the solution, the less economic incentive is necessary for it to be viably implemented. The last thing you want is to be dependent on the goodwill of multinational corporations.
@baldur Sometimes the success of an overly simplistic solution can set us back a long way too, and make problems of its own, so I don't think there's a single right answer.
Agree about over-complex approaches. I sometimes talk about cost-cutting approaches to building a bridge over a chasm: make a narrower bridge that only lets 70% of the traffic get through, or make a shorter bridge that only goes 70% of the way across...
@baldur A new technology generally has to make something new possible, or to let something be done more cheaply/more quickly/more easily.
You're right that the actual final value usually isn't known in advance. At the consumer end it's not about complexity of implementation, so there's a disconnect there. It's also not about elegance of implementation, something that we technologists often forget and that annoys programmers no end!
Somebody always needs to pay for the implementation. If there isn't enough money at play the only thing you can workably specify is something simple.
My concern is that many at the W3C tend to gloss over this and specify solutions which, while technically correct, are out of proportion to the finances and resources at play.
Many problems actually _do_ lend themselves to simpler piecemeal solutions and the IDPF esp. had a habit of not capitalising on that.
@baldur Then help keep us simple, OK? :-)
I think the requirement for interoperability tests may also help.
@barefootliam Oh, absolutely. Interop tests make a huge huge difference especially since they actually need to be implemented and so can be a useful early warning about problem areas.
Overall, I do think the W3C is considerably better at this now than it used to be. But I am also a bit concerned that the organisation may pick up some bad habits from the IDPF merger.
@baldur it's a fair concern, but I'm seeing so far the opposite - with the IDPF people looking forward to more tests and more reuse of existing Web technologies, which makes me very optimistic.
@barefootliam I'm cautiously optimistic overall but will reserve judgement until I see the final Publishing Working Group Charter. The IDPF folks, esp. management, haven't properly started their tenure at the W3C yet.
@baldur maybe I'll see you again at next year's ebookcraft and you can tell me if you still feel the same way :D :D
@barefootliam Hopefully. In any case, I'm much more optimistic now than I was just a few months ago. ☺