On the blog of Peter Vessenes's Bitcoin Foundation, Gavin Andresen offered an outline of measures he would like to see to improve Bitcoin's ability to scale in the future, one of which is a hard fork to change the maximum block size. The measures mentioned include:
Changes to initial block synchronization so that a chain of block headers is downloaded and verified before the actual blocks (a minimal sketch of the idea follows this list).
A library by Pieter Wuille, 'libsecp256k1', optimized for the elliptic curve Bitcoin uses for transaction signing, to replace the current implementation imported from OpenSSL.
A "pruned" block database implementation for clients that have completed the initial sync, which may eventually be enabled by default.
Initial synchronization by downloading the set of unspent transaction outputs rather than the whole blockchain, mentioned as something that may be explored eventually.
A new initial maximum block size such that a full node may be run by "somebody with a current, reasonably fast computer and Internet connection, running an up-to-date version of Bitcoin Core and willing to dedicate half their CPU power and bandwidth to Bitcoin."
After that, the block size would increase by 50% per year for the next 20 years.
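As a rough illustration of the headers-first idea mentioned above, the sketch below checks that a downloaded header chain links all the way back to the genesis block before any full blocks are fetched. The fetch_headers and fetch_block callbacks are hypothetical placeholders rather than Bitcoin Core APIs, and proof-of-work and difficulty checks are omitted.

```python
import hashlib

def block_hash(header_bytes):
    # Bitcoin block hash: double SHA-256 over the raw 80-byte header
    return hashlib.sha256(hashlib.sha256(header_bytes).digest()).digest()

def headers_first_sync(fetch_headers, fetch_block, genesis_hash):
    # fetch_headers() and fetch_block() are hypothetical network callbacks,
    # not Bitcoin Core APIs; proof-of-work and difficulty checks are omitted.
    headers = fetch_headers()          # list of raw 80-byte headers after genesis

    # Phase 1: verify that each header commits to the hash of its predecessor,
    # so the shape of the chain is known before any full block is downloaded.
    prev = genesis_hash
    for hdr in headers:
        if hdr[4:36] != prev:          # bytes 4..35 hold the previous block hash
            raise ValueError("header chain does not link")
        prev = block_hash(hdr)

    # Phase 2: fetch the full blocks along the verified header chain
    # (Bitcoin Core parallelizes this part across peers).
    return [fetch_block(block_hash(hdr)) for hdr in headers]
```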
Gavin bases this 50% yearly figure on Nielsen's Law of Internet bandwidth, which holds that a high-end user's connection speed grows by about 50% per year. Jumping to a new, larger block size and then committing to 50% annual growth for two decades is likely, in the long run, to present a substantial and increasing barrier to entry for operators of full nodes, particularly in geographic areas where bandwidth growth has been inconsistent.
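To put the compounding in perspective, the sketch below applies 50% annual growth for 20 years to an assumed starting maximum block size and estimates the bandwidth needed just to receive new blocks; the 20 MB starting figure is purely illustrative and is not taken from the proposal.

```python
# Illustrative only: the initial size below is an assumption, not a figure
# from Gavin's proposal.
initial_mb = 20.0          # assumed starting maximum block size, in MB
growth = 1.5               # 50% increase per year
blocks_per_day = 144       # one block roughly every ten minutes

size = initial_mb
for year in range(1, 21):
    size *= growth
    daily_gb = size * blocks_per_day / 1024
    print(f"year {year:2d}: max block {size:10.1f} MB, "
          f"up to {daily_gb:8.1f} GB/day of new blocks")

# After 20 years the cap is initial_mb * 1.5**20, roughly 3,300x the starting
# size -- about 66 GB per block for a 20 MB start.
```

Whatever the starting point, the cap grows by a factor of roughly 3,300 over the two decades, which is the compounding behind the barrier-to-entry concern for full node operators.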
One of the problems with Bitcoin right now is that updates are all controlled by this one person; what is supposed to happen to the project if something happens to him?