tl;dr Block size issue and opinions

WhaleCalls
5 min read · Mar 18, 2017


There have been a lot of questions asked on this subject due to uninformed media articles, biased opinions and the technical nature of the topic. This is our attempt to offer a simplified explanation of the situation and our opinion to those who want to hear it.

Nomenclature:

Blocks — batches of transactions that are confirmed and recorded on the bitcoin public ledger, a.k.a. the blockchain.

Hardfork — a hard fork is a radical change to the protocol that makes previously invalid blocks/transactions valid (or vice-versa), and as such requires all nodes or users to upgrade to the latest version of the protocol software. It is not backwards compatible. The expectation is that the new software takes over the majority of the work and the old software dies of attrition.

Softfork — a softfork is a change to the bitcoin protocol wherein only previously valid blocks/transactions are made invalid. Since old nodes will recognize the new blocks as valid, a softfork is backward-compatible. This kind of fork requires only a majority of the miners to upgrade in order to enforce the new rules. It is comparable to updating your operating system: generally there is little risk in the operation.

SegWit or segregated witness — separates the signature (witness) data from the transaction data. It contains the fix for transaction malleability and is viewed as a (temporary) fix for the current scaling issue. It is deployed via softfork and is viewed by the current development team as the “safe” approach.

Reference

https://bitcoincore.org/en/2016/01/26/segwit-benefits/ — benefits

https://www.youtube.com/watch?v=QYZv92F2kCw — a solid YouTube explanation.
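To make the SegWit idea concrete, here is a minimal sketch of the “block weight” accounting it introduces (per BIP141): witness bytes are discounted relative to other transaction bytes, so a SegWit transaction occupies less “virtual size” than its raw byte count. The transaction sizes below are hypothetical, chosen only for illustration.

```python
# Sketch of SegWit's weight/virtual-size accounting (BIP141).
# Witness (signature) bytes count 1x; all other bytes count 4x.

def virtual_size(base_size: int, total_size: int) -> int:
    """base_size: tx serialized WITHOUT witness data (bytes)
    total_size: tx serialized WITH witness data (bytes)"""
    weight = base_size * 3 + total_size  # equivalent to non-witness*4 + witness*1
    return (weight + 3) // 4             # vsize = ceil(weight / 4)

# A hypothetical 600-byte transaction where 300 bytes are witness data:
print(virtual_size(300, 600))  # 375 -- cheaper than its raw 600 bytes

# A legacy (non-SegWit) transaction pays full price for every byte:
print(virtual_size(250, 250))  # 250
```

This discount is the “by-product” capacity gain: moving signatures into the witness lets more transactions fit under the effective limit without a hard fork.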

Situation

In 2010, a 1MB cap was placed on the bitcoin block size. Since then, transaction volume has increased 3x and average transaction sizes have increased more than 6x, creating the issue of near-full blocks, noticeable delays and higher transaction fees (relative to the bitcoin price). Our own trials show average confirmation times of roughly 40 minutes at fees of around 35 cents.
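For a rough sense of where a ~35-cent fee comes from, here is some back-of-the-envelope arithmetic. The transaction size, fee rate and price below are illustrative assumptions loosely matching early-2017 conditions, not measurements.

```python
# Back-of-the-envelope fee math. All inputs are illustrative assumptions.

def fee_usd(tx_size_bytes: int, fee_rate_sat_per_byte: float, btc_price_usd: float) -> float:
    fee_sats = tx_size_bytes * fee_rate_sat_per_byte  # total fee in satoshis
    return fee_sats / 1e8 * btc_price_usd             # 1 BTC = 100,000,000 satoshis

# A typical ~250-byte transaction at ~140 sat/byte with BTC around $1,000:
print(fee_usd(250, 140, 1000))  # about $0.35
```

As blocks fill up, users bid up the sat/byte rate to get into the next block, which is exactly the fee pressure described above.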

Full blocks create a backlog of transactions, slowing down confirmation times and increasing the average cost of “priority” transactions. The fear is that nodes would fall behind and start rejecting (some) transactions altogether. There are several proposed solutions to this; the most popular choices are increasing the block size and SegWit.

Some arguments against increasing the block size are that a “contentious” hard fork would damage trust in the 8-year-old bitcoin blockchain (as we’ve seen with Ethereum). There are also resource concerns over just how big the new blocks should be. In some areas of the world high-speed internet (20 Mbps+) is not common, and larger blocks could force out some mining operations and node operators. There are several pitches for 2MB, 8MB, 20MB and a flexible cap (i.e. one that adjusts to the market). All of these would require a hard fork and full cooperation from the community.

The argument in favor is that it offers an immediate (if not lasting) solution to a tangible problem. Some miners don’t find this solution favorable, as they would earn less revenue per block in transaction fees.

Since changing the block size would be a hard fork, all solutions are reviewed critically by the “Core” development team and prioritized against other proposed work and solutions (known as BIPs). For the last two years, SegWit has been given priority over any other scaling solution. This has led other developers to fork and propose their own solutions; some of these are Bitcoin Classic, Bitcoin XT and Bitcoin Unlimited. However, for a solution to be successful, it needs the support of 95% of miners.
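That 95% figure comes from the BIP9 version-bits activation mechanism that SegWit uses: a deployment “locks in” only when at least 95% of the blocks in a single 2016-block retarget window signal readiness. A minimal sketch of the counting rule:

```python
# Sketch of BIP9-style activation counting (the mechanism SegWit uses).

PERIOD = 2016      # blocks per difficulty-retarget window
THRESHOLD = 1916   # mainnet lock-in threshold: 95% of 2016

def locks_in(signaling_blocks: int) -> bool:
    """signaling_blocks: blocks in one window whose version bits signal support."""
    return signaling_blocks >= THRESHOLD

print(locks_in(1915))  # False -- one block short
print(locks_in(1916))  # True
```

Missing the threshold in one window is not fatal; the count simply restarts in the next 2016-block window, which is why activation can drag on for months.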

Enter Bitcoin Unlimited.

Bitcoin Unlimited, fronted by Roger Ver, is the first solution to be backed by a major miner, BITMAIN. However, it has received mixed reviews from subject-matter experts and media alike. The solution itself is questioned because it has not received the same level of critical peer review or followed other commonly practiced software engineering standards (i.e. QA). A recently exposed exploit that took down most of their network is a clear example of this.

Other criticism of this team concerns the manner in which they are choosing to move forward. Rather than reaching a quorum with core developers and those with large investments in bitcoin companies and infrastructure, they have chosen to just do it and accept the results. This has included rejecting the proposed SegWit solution, which is currently seeking consensus to move forward. Some are critical of Roger Ver himself based on his social media outbursts and hold views of BU that reflect that.

There is currently a media blitz of quasi-informed articles that would have you believe that bitcoin is broken and that Bitcoin Unlimited is the solution. These articles are generally written by those who are not very familiar with Bitcoin itself, get major facts incorrect, and largely reference other commentary articles and Reddit threads.

credit to http://xtnodes.com for images

WhaleCalls’ thoughts and opinions

It’s our opinion, as regular bitcoin users and software engineers, that there is clearly an issue to be solved. However, we believe that both solutions are needed in order to accomplish the end goal of scalability. SegWit addresses a completely different problem than increasing the block size, but its by-product is reducing the effective size of the blocks themselves. Limiting the block size with a hard cap at the protocol level should be addressed as well. Imagine if TCP/IP had a hard limit on the packet size it could receive, set by consensus; the world’s internet would be a much different and slower place.

In a perfect world, SegWit would be adopted environment-wide, followed by an immediate shift of priority back to the block size issue. Many reasonable BIPs for block size changes and negotiation systems for the “average” size already exist.

We wish Bitcoin Unlimited had taken a more cooperative approach in proposing their solution and had not lowered themselves to mudslinging. That professionalism would have gone a long way and made it much easier to review their solution without a tainted community perception. Also, if software development standards such as peer review and quality assurance had been followed, resulting in a more polished product, they might very well have a contending product.

As it currently stands, it’s very hard to recommend the product for production use when these standards aren’t followed.
