Last week I discussed market manipulation, and stressed that although Bitcoin is immune to direct centralized governance, it is not impervious, like all markets, to manipulation and interference via social engineering.

One point concerning “social engineering” that deserves its own write-up is the blocksize debate.

The ever-intensifying Bitcoin scaling debate seems to have no end. By design, any fork requires a clear majority one way or the other, which means that major protocol changes are difficult to implement across the board.

While Blockstream’s Core team push Segwit as their ‘scaling’ solution, they refuse on all counts to increase the blocksize in the interim. It can be argued that Segwit increases the effective blocksize capacity to just over 2MB, but according to the latest data on transactions and fees, even this limit would soon be hit, forcing a revisit of the same debate.

Segwit’s main competition at the moment comes from the Bitcoin Unlimited client, which is in many regards the same as the pre-Segwit Core client, except that it allows miners to vote on what the blocksize should be via emergent consensus. The details of this consensus method are best described on Bitcoin Unlimited’s official website, but in short, it means that the miners who form the backbone of the Bitcoin ecosystem can vote on, and with consensus choose to fork to, a bigger blocksize, allowing much greater transaction throughput.
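In rough terms, emergent consensus boils down to two settings each node operator chooses: an excessive block size (EB) and an acceptance depth (AD). A minimal sketch, assuming those standard Bitcoin Unlimited parameters (the function and variable names below are illustrative, not BU’s actual code):

// Illustrative sketch of emergent consensus; not Bitcoin Unlimited's actual code.
// Each node operator configures:
//   excessiveBlockSize (EB): the largest block the node accepts immediately
//   acceptanceDepth (AD):    how many blocks must be built on top of an
//                            "excessive" block before the node accepts it anyway
bool ShouldAcceptBlock(unsigned int blockSize,
                       unsigned int excessiveBlockSize,
                       unsigned int confirmationsOnTop,
                       unsigned int acceptanceDepth)
{
    if (blockSize <= excessiveBlockSize)
        return true;                            // within this node's own limit
    // The block is "excessive", but if enough miners have already built on it,
    // the node follows that emerging majority rather than fork itself off.
    return confirmationsOnTop >= acceptanceDepth;
}

In other words, no single limit is hard-coded for everyone; the effective blocksize emerges from whatever the majority of hashpower is willing to build on.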

It should be noted that there are now a large number of Bitcoin dev teams, and that Core and BU are just two names among many. As time passes, failure to reach consensus means this number will only grow. But this isn’t necessarily a bad thing just yet.

At present, Bitcoin transactions are stifled at a mere 2-3 transactions per second, with a theoretical maximum of around 7 tps under optimal conditions. In order to get a transaction through the system, users are forced to pay ever-increasing, competing fees. For a global payment system, this throughput is at best a joke. For this reason, any serious discussion concerning scalability requires that both on-chain and off-chain scaling solutions take place.
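The arithmetic behind those figures is simple. Assuming an average transaction of roughly 250 bytes (a common rule of thumb, not a protocol constant), the 1MB cap works out as follows:

// Back-of-the-envelope throughput estimate under the 1MB cap.
// The ~250-byte average transaction size is an assumption, not a protocol value.
const unsigned int MAX_BLOCK_SIZE = 1000000;  // bytes (1MB)
const unsigned int AVG_TX_SIZE    = 250;      // bytes, rough assumption
const unsigned int BLOCK_INTERVAL = 600;      // seconds (10-minute target)

unsigned int txPerBlock = MAX_BLOCK_SIZE / AVG_TX_SIZE;         // 4,000 transactions per block
double tps = static_cast<double>(txPerBlock) / BLOCK_INTERVAL;  // ~6.7 tps, the theoretical ceiling

Real-world transactions tend to be larger than 250 bytes, which is roughly why observed throughput sits nearer 2-3 tps than the theoretical 7.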

So how did we end up in this gridlocked mess where consensus is so hard to achieve?

If we go right back to the beginning, there was no blocksize limit set at all, and Bitcoin operated without one just fine. But Satoshi Nakamoto clearly wanted to keep the early implementation at a small scale, and thus, on Thursday, July 15, 2010, added the following code:

static const unsigned int MAX_BLOCK_SIZE = 1000000;

By doing so, he provided Bitcoin with a safeguard against spam and dust transactions: if anyone were to flood the system with useless transactions, blocks would hit the limit and fees would kick in.
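For context, the constant on its own does nothing; it only bites where block validation checks it. A simplified sketch of that enforcement, not an exact reproduction of the historical code, looks like this:

// Simplified sketch of how the limit is enforced during block validation.
// Historically this check lived in CheckBlock(): a block whose serialized size
// exceeds MAX_BLOCK_SIZE is simply rejected by every node enforcing the rule.
bool CheckBlockSize(unsigned int nSerializedBlockSize)
{
    if (nSerializedBlockSize > MAX_BLOCK_SIZE)
        return false;   // oversized block rejected; the miner forfeits the reward
    return true;
}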

Although Satoshi didn’t include anything in the changelogs concerning this line of code, he did, however, leave some information in forums and emails. Notably:

“The dust spam limit is a first try at intentionally trying to prevent overly small micropayments like that.” – August 4, 2010.
“We can phase in a change later if we get closer to needing it.”
“It can be phased in, like: if (blocknumber > 115000) maxblocksize = largerlimit”

There is no doubt that Satoshi intended this artificial limit purely as a temporary spam-prevention measure. At least in Satoshi’s mind, there was never any question that the network would eventually be able to handle all transactions on-chain.
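Fleshing out Satoshi’s one-liner, the phase-in he described could look something like the sketch below. The 115000 height comes from his own example; the larger limit is a placeholder, since he never specified one:

// Sketch of the phase-in Satoshi described: once the chain passes a chosen
// height, upgraded nodes simply apply a larger limit.
unsigned int GetMaxBlockSize(int nBlockHeight)
{
    const unsigned int LARGER_LIMIT = 8000000;   // placeholder value for illustration only
    if (nBlockHeight > 115000)                   // height taken from Satoshi's example
        return LARGER_LIMIT;
    return MAX_BLOCK_SIZE;                       // 1,000,000 bytes until then
}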

Satoshi stated: “The existing Visa credit card network processes about 15 million Internet purchases per day worldwide. Bitcoin can already scale much larger than that with existing hardware for a fraction of the cost. It never really hits a scale ceiling. By Moore’s Law, we can expect hardware speed to be 10 times faster in 5 years and 100 times faster in 10. Even if Bitcoin grows at crazy adoption rates, I think computer speeds will stay ahead of the number of transactions.”

Of course, the loudest chants for small blocks come from the Blockstream Core group. As mentioned, Segwit provides only a very mild increase, and Core refuses to raise the blocksize in the interim.

The main argument against increasing this 1MB limit to anything substantial is that it will threaten Bitcoin’s decentralized state.

But research strongly suggests that Bitcoin’s network can handle much more than the current restrictive 1MB cap.

The BTCSIM Bitcoin simulator by Javed Khan and Michalis Kargakis showed that a 32MB block could successfully hold 167,000 transactions, which translated to roughly 270 tps. A single machine acting as a full node took approximately 10 minutes to verify and process a 32MB block. And this simulation was done back in 2014.

A much more elaborate study was done in 2016 at Cornell University. The Cornell study concluded that a 4MB blocksize would not affect decentralization, and that it would yield a capacity of around 27 transactions per second. That is roughly 10 times the 2-3 tps actually achieved today, even though the blocksize is only quadrupled.

So if the evidence overwhelmingly shows that on-chain scaling is not a bad thing, and that it will only improve Bitcoin’s utility by allowing more transactions for lower fees, then why not undertake this simple change, which literally means editing only a few lines of code?
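To illustrate just how small that change is, an interim bump to, say, 2MB (an illustrative figure, not a specific proposal) would amount to little more than editing the constant itself:

// Before: the limit Satoshi added in 2010
static const unsigned int MAX_BLOCK_SIZE = 1000000;   // 1MB

// After: an illustrative interim bump to 2MB. The code change is trivial;
// the hard part is the coordinated hard-fork activation, which requires consensus.
static const unsigned int MAX_BLOCK_SIZE = 2000000;   // 2MB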

If you try to seek an answer by asking “small blockers” directly, you may find yourself going around in circles, as the answers tend to constantly shift the goalposts. But goalposts only move when an argument lacks substance. The answer lies in hidden agendas.

By not allowing even an interim 2 or 4MB blocksize increase while we await other scaling solutions, Core is effectively sending out the message that 1MB is the right MAX_BLOCK_SIZE for now. The rationale here defies logic: the “1000000” (1MB) figure that Satoshi put into the code reflects no real decision point, but is simply a round-number limit that was never supposed to be reached. Except that we did reach it, and as a result, many users are now paying well over 1 USD per transaction.

This wasn’t Satoshi’s plan.

The idea that 1MB is some magical number we shouldn’t change unless we get Segwit is holding the Bitcoin community hostage. And to prevent anyone from holding Blockstream Core to this charge, Core dev Luke-jr is even on record claiming that the 1MB limit is too high. On what basis? Apparently, Core wants everyone to be paying well over 5 USD per transaction, every time.

So the only logical conclusion one can reach by analysing Core’s behaviour is that they want to implement Segwit sooner rather than later, and at any cost, despite the fact that there are cleaner malleability fixes out there, and better second-layer-ready clients.

Take Wladimir van der Laan’s quote from the dev mailing list in May 2015:

“A mounting fee pressure, resulting in a true fee market where transactions compete to get into blocks, results in urgency to develop decentralized off-chain solutions. I’m afraid increasing the blocksize will kick this can down the road and let people (and the large Bitcoin companies) relax, until it’s again time for a block chain increase, and then they’ll rally Gavin again, never resulting in a smart, sustainable solution but eternal awkward discussions like this.”

This intentional measure of keeping the blocksize low to achieve a desired purpose is known as “the strategy of degradation”. The strategy is commonly used by those seeking a revolution among a people, community, or nation. (I will discuss this specific detail next week.)

Not only is Wladimir (above) attacking the intellect of every Bitcoin user, he is also suggesting that developers cannot build proper scaling solutions unless they are pressured into doing so.

Such assumptions will do nothing but cripple Bitcoin’s growth further. While other cryptocurrencies feature dynamic or simply larger block sizes, Bitcoin remains at an illogical 1MB.

Let’s never forget that Blockstream has raised millions of dollars in venture capital funding, most of which has come from established banking institutions. The very institutions which Bitcoin is liberating users from. After all, where there is money, there is power, and greed. The ‘strategy of degradation’ is just one method that is employed by those seeking such power.

Eli Afram M.IT
Developer/Analyst
@justicemate
