51% hashing power, or even 90%, means nothing if clients collectively refuse to accept and relay your blocks.
It is a simple, provable engineering fact that storing data in transaction outputs makes block validation, double-spend checks, and other critical consensus operations more expensive. More RAM is used on average. In general, it burdens the entire network. The UTXO set is currently our most critical resource.
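The UTXO-burden argument above can be sketched with a toy model (illustrative only; the names and structures here are invented, not Bitcoin Core's implementation). Spendable outputs are eventually removed from the set when spent, but unspendable data-carrier outputs linger forever, growing the working set that every double-spend check must consult:

```python
# Toy model of a full node's UTXO set, held in RAM.
class UtxoSet:
    def __init__(self):
        self.utxos = {}  # (txid, vout) -> value

    def add(self, txid, vout, value):
        self.utxos[(txid, vout)] = value

    def spend(self, txid, vout):
        # Double-spend check: the output must exist; it is pruned on spend.
        if (txid, vout) not in self.utxos:
            raise ValueError("double spend or unknown output")
        return self.utxos.pop((txid, vout))

utxo = UtxoSet()

# Normal currency flow: an output is created, then later spent and pruned.
utxo.add("tx1", 0, 50)
utxo.spend("tx1", 0)

# Data stuffing: outputs whose "address" is really arbitrary data can never
# be spent, so they can never be pruned.
for i in range(1000):
    utxo.add("data_tx", i, 0)

print(len(utxo.utxos))  # 1000 -- the set only grows
```

The asymmetry is the whole point: currency transactions shrink the set back down as outputs are consumed; data-carrier outputs are a one-way cost imposed on every validating node.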
Bitcoin is only zero-trust if you can verify the entire transaction history.
Do not premine, or exhibit other scamcoin traits.
- Jeff Garzik (Technology advisor to premined coins PO.ET and Civic)
"They are attempting to ride the coattails of the Bitcoin brand"
Further, you cannot handwave away the problem that, if transactions are infinitesimally cheap, people will abuse the system by sending non-currency data messages. Lots of them. Gigabytes' worth, as other alt-chain field experience has proven. To the point that bitcoin-the-currency transactions are impacted. "I want a system that can process infinite amounts of traffic" is in the land of unicorns. The accusation of dev laziness is particularly rich, given that SatoshiDICE abused the blockchain in this way, by sending informational messages ("You lost a bet") via the blockchain. If you want an infinite number of transactions per 10 minutes, you have just reinvented the Internet… over the blockchain. Poorly.
One cannot ignore a key attribute conferred by a limit like the 1MB limit: it encourages engineering efficiencies to be sought. Programmers have an incentive to actively seek ways to reduce the number of transactions, or reduce transaction size, when faced with a limited resource. Some business models simply don't care about that part of the equation. It's not a conspiracy by Gavin and the Bitcoin Foundation funders; it is simply one facet of some bitcoin businesses. They make money with increased transaction volume. That's fine, but a key economic counter-point is that these businesses are not bearing the costs of the mining/blockchain impact of a million-TX-per-day policy.
It's open source. Fork away. Though the consequence is that you remain at a higher, hardcoded fee level, and people will still dump megabytes worth of non-currency data into the blockchain (wikileaks cables etc.).
More to the point, zero-conf transactions have been double-spent already. It is proven they are not safe today, ignoring any proposed changes.
There have been chains of hashes and chains of digital signatures before. What makes bitcoin different is that it is timestamping these digital messages, and protecting those timestamps against being reversed. The currency aspect of bitcoin is simply a layer on top of the distributed timestamping service
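The timestamping idea above can be sketched minimally. This uses an invented block format, not Bitcoin's actual header layout: each entry commits to the previous entry's hash and a timestamp, so reordering or backdating any entry invalidates everything after it.

```python
import hashlib

def block_hash(prev_hash, timestamp, payload):
    # Commit to the predecessor, the timestamp, and the message.
    return hashlib.sha256(
        f"{prev_hash}|{timestamp}|{payload}".encode()
    ).hexdigest()

# Build a three-entry chain of timestamped messages.
chain = []
prev = "0" * 64  # genesis predecessor
for t, msg in [(1000, "a"), (1010, "b"), (1020, "c")]:
    h = block_hash(prev, t, msg)
    chain.append((prev, t, msg, h))
    prev = h

def valid(chain):
    prev = "0" * 64
    for p, t, msg, h in chain:
        if p != prev or block_hash(p, t, msg) != h:
            return False
        prev = h
    return True

print(valid(chain))  # True

# Backdating the first entry recomputes its hash, but the next entry still
# commits to the old hash, so the whole chain fails validation.
p, t, msg, h = chain[0]
chain[0] = (p, 900, msg, block_hash(p, 900, msg))
print(valid(chain))  # False
```

Proof-of-work (not modeled here) is what makes rewriting such a chain expensive rather than merely detectable.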
Satoshi also intended the subsidy-free, fee-only future to support bitcoin. He did not describe fancy assurance contracts and infinite block sizes; he clearly indicated that fees would be driven in part by competition for space in the next block.
Any miner that increases MAX_BLOCK_SIZE beyond 1MB will select themselves out of the network, because all other validating nodes will ignore that change. Just like if a miner decides to issue themselves 100 BTC per block: all other validating nodes consider that invalid data, and do not relay or process it further.
If the users are not voting (validating), then it is trivial for miners to rewrite the rules. If the users are fully validating, then a miner decision to have each block produce 50 BTC again would be instantly rejected.
In an unfunded open source project, arguing all day about the lack of full-engineering-team rigor is entirely wasted energy. Blame the dev team if that is your favorite target, that will not magically create extra time in the day or extra manpower to accomplish these extra tasks being demanded by non-contributors. The time spent whining about what an unfunded effort fails to do could be better spent, say, creating a test network of full nodes running all known bitcoind versions, 0.3 through present. And test, test, test changes through that.
It is always entertaining to watch non-contributors opine about completely obvious solutions that the devs are silly to have overlooked.
A hard fork is a significant event that knocks legitimate users off the network, makes coins unspendable, or potentially makes the same coins spendable in two different locations, depending on whether or not you're talking to an updated node. It is, to pick a dramatic term, an Extinction Level Event. If done poorly, a hard fork could make it impossible for reasonable merchants to trust the bitcoins they receive, the very foundation of their economic value. Furthermore, a hard fork is akin to a Constitutional Convention: a hard fork implies the ability to rewrite the ground rules of bitcoin, be it block size, 21M limit, SHA256 hash, or other hard-baked behavior. Thus, there is always the risk of unpredictable miners, users and devs changing more than just the block size precisely because it makes the most engineering sense to change other hard-to-change features at the time of hard-fork. It is a nuclear option with widespread economic consequences for all bitcoin users.
Being the person who actually posted a faux-patch increasing the block size limit, it is important to understand why I disagree with that now… I was erroneously assuming that the block size was the whole picture, rather than a simple, lower-layer solution in a bigger picture. The block size is an intentionally limited economic resource, just like the 21,000,000-bitcoin limit.
Boy, that's a shortsighted analysis. Bitcoin will grow layers above the base layer — the blockchain — that will enable instant transactions, microtransactions, and other scaling needs. Do not think that the blockchain is the only way to transfer bitcoins. Larger aggregators will easily compensate for the current maximum block size in a scalable manner. All nation-state/fiat currencies are multi-layer. Too many people look at what bitcoin does now, and assume that those are the only currency services that will ever exist.
Transactions will not always be free. Any time there are a lot of transactions being sent, free transactions get the lowest priority and might have to wait to make it into a block. If blocks are often full, you will need to pay a transaction fee to get priority.
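The fee-priority behavior described above can be sketched as a simple greedy selection. This is an illustrative simplification, not Bitcoin Core's actual transaction selection algorithm: when waiting transactions exceed block capacity, the block fills highest-feerate-first, so free transactions wait.

```python
def select_for_block(mempool, max_block_size):
    # Sort by fee per byte, descending; zero-fee transactions sort last.
    ordered = sorted(mempool, key=lambda tx: tx["fee"] / tx["size"],
                     reverse=True)
    block, used = [], 0
    for tx in ordered:
        if used + tx["size"] <= max_block_size:
            block.append(tx["id"])
            used += tx["size"]
    return block

mempool = [
    {"id": "free", "size": 250, "fee": 0},     # pays nothing
    {"id": "low",  "size": 250, "fee": 250},   # 1 satoshi/byte
    {"id": "high", "size": 250, "fee": 2500},  # 10 satoshis/byte
]

# Room for only two transactions: the free one is left waiting in the mempool.
print(select_for_block(mempool, 500))  # ['high', 'low']
```

With surplus capacity the free transaction would get in eventually; it is persistent congestion that turns fees from optional tips into the price of priority.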
It is not as good when obviously-still-learning people are billing their project as the "future of bitcoin", misleading people into thinking they are bitcoin experts producing high-quality, proven code (and potentially taking thousands of dollars for it). Those who are not coders lack the skills to judge this sort of thing, and only have hype from this thread to go on.
Miners only select (or ignore) transactions provided to them. The bitcoin client you run chooses what transactions and blocks to validate and relay. Miners cannot change the rules without bitcoin user agreement.
I think users with older clients, and holders of older bitcoins, quite appreciate the struggle to maintain backwards compat. Nobody wants to wake up in the morning to discover that their money is unspendable without a forced upgrade.
EDIT: In this post I am trying to be positive; it's a genuine look back at great comments that taught me a lot. I am not trying to do character assassination.