Mining Calculator Bitcoin, Ethereum, Litecoin, Dash and Monero

The Origins of the Blocksize Debate

On May 4, 2015, Gavin Andresen wrote on his blog:
I was planning to submit a pull request to the 0.11 release of Bitcoin Core that will allow miners to create blocks bigger than one megabyte, starting a little less than a year from now. But this process of peer review turned up a technical issue that needs to get addressed, and I don’t think it can be fixed in time for the first 0.11 release.
I will be writing a series of blog posts, each addressing one argument against raising the maximum block size, or against scheduling a raise right now... please send me an email ([email protected]) if I am missing any arguments
In other words, Gavin proposed a hard fork via a series of blog posts, bypassing all developer communication channels altogether and asking for personal, private emails from anyone interested in discussing the proposal further.
On May 5 (1 day after Gavin submitted his first blog post), Mike Hearn published The capacity cliff on his Medium page. 2 days later, he posted Crash landing. In these posts, he argued:
A common argument for letting Bitcoin blocks fill up is that the outcome won’t be so bad: just a market for fees... this is wrong. I don’t believe fees will become high and stable if Bitcoin runs out of capacity. Instead, I believe Bitcoin will crash.
...a permanent backlog would start to build up... as the backlog grows, nodes will start running out of memory and dying... as Core will accept any transaction that’s valid without any limit a node crash is eventually inevitable.
He also, in the latter article, explained that he disagreed with Satoshi's vision for how Bitcoin would mature[1][2]:
Neither me nor Gavin believe a fee market will work as a substitute for the inflation subsidy.
Gavin continued to publish the series of blog posts he had announced while Hearn made these predictions. [1][2][3][4][5][6][7]
Matt Corallo brought Gavin's proposal up on the bitcoin-dev mailing list after a few days. He wrote:
Recently there has been a flurry of posts by Gavin at http://gavinandresen.svbtle.com/ which advocate strongly for increasing the maximum block size. However, there hasnt been any discussion on this mailing list in several years as far as I can tell...
So, at the risk of starting a flamewar, I'll provide a little bait to get some responses and hope the discussion opens up into an honest comparison of the tradeoffs here. Certainly a consensus in this kind of technical community should be a basic requirement for any serious commitment to blocksize increase.
Personally, I'm rather strongly against any commitment to a block size increase in the near future. Long-term incentive compatibility requires that there be some fee pressure, and that blocks be relatively consistently full or very nearly full. What we see today are transactions enjoying next-block confirmations with nearly zero pressure to include any fee at all (though many do because it makes wallet code simpler).
This allows the well-funded Bitcoin ecosystem to continue building systems which rely on transactions moving quickly into blocks while pretending these systems scale. Thus, instead of working on technologies which bring Bitcoin's trustlessness to systems which scale beyond a blockchain's necessarily slow and (compared to updating numbers in a database) expensive settlement, the ecosystem as a whole continues to focus on building centralized platforms and advocate for changes to Bitcoin which allow them to maintain the status quo
Shortly thereafter, Corallo explained further:
The point of the hard block size limit is exactly because giving miners free rule to do anything they like with their blocks would allow them to do any number of crazy attacks. The incentives for miners to pick block sizes are no where near compatible with what allows the network to continue to run in a decentralized manner.
Tier Nolan considered possible extensions and modifications that might improve Gavin's proposal and argued that soft caps could be used to mitigate against the dangers of a blocksize increase. Tom Harding voiced support for Gavin's proposal
Peter Todd mentioned that a limited blocksize provides the benefit of protecting against the "perverse incentives" behind potential block withholding attacks.
Slush didn't have a strong opinion one way or the other, and neither did Eric Lombrozo, though Eric was interested in developing hard-fork best practices and wanted to:
explore all the complexities involved with deployment of hard forks. Let’s not just do a one-off ad-hoc thing.
Matt Whitlock voiced his opinion:
I'm not so much opposed to a block size increase as I am opposed to a hard fork... I strongly fear that the hard fork itself will become an excuse to change other aspects of the system in ways that will have unintended and possibly disastrous consequences.
Bryan Bishop strongly opposed Gavin's proposal, and offered a philosophical perspective on the matter:
there has been significant public discussion... about why increasing the max block size is kicking the can down the road while possibly compromising blockchain security. There were many excellent objections that were raised that, sadly, I see are not referenced at all in the recent media blitz. Frankly I can't help but feel that if contributions, like those from #bitcoin-wizards, have been ignored in lieu of technical analysis, and the absence of discussion on this mailing list, that I feel perhaps there are other subtle and extremely important technical details that are completely absent from this--and other-- proposals.
Secured decentralization is the most important and most interesting property of bitcoin. Everything else is rather trivial and could be achieved millions of times more efficiently with conventional technology. Our technical work should be informed by the technical nature of the system we have constructed.
There's no doubt in my mind that bitcoin will always see the most extreme campaigns and the most extreme misunderstandings... for development purposes we must hold ourselves to extremely high standards before proposing changes, especially to the public, that have the potential to be unsafe and economically unsafe.
There are many potential technical solutions for aggregating millions (trillions?) of transactions into tiny bundles. As a small proof-of-concept, imagine two parties sending transactions back and forth 100 million times. Instead of recording every transaction, you could record the start state and the end state, and end up with two transactions or less. That's a 100 million fold, without modifying max block size and without potentially compromising secured decentralization.
The MIT group should listen up and get to work figuring out how to measure decentralization and its security.. Getting this measurement right would be really beneficial because we would have a more academic and technical understanding to work with.
Gregory Maxwell echoed and extended that perspective:
When Bitcoin is changed fundamentally, via a hard fork, to have different properties, the change can create winners or losers...
There are non-trivial number of people who hold extremes on any of these general belief patterns; Even among the core developers there is not a consensus on Bitcoin's optimal role in society and the commercial marketplace.
there is a at least a two fold concern on this particular ("Long term Mining incentives") front:
One is that the long-held argument is that security of the Bitcoin system in the long term depends on fee income funding autonomous, anonymous, decentralized miners profitably applying enough hash-power to make reorganizations infeasible.
For fees to achieve this purpose, there seemingly must be an effective scarcity of capacity.
The second is that when subsidy has fallen well below fees, the incentive to move the blockchain forward goes away. An optimal rational miner would be best off forking off the current best block in order to capture its fees, rather than moving the blockchain forward...
tools like the Lightning network proposal could well allow us to hit a greater spectrum of demands at once--including secure zero-confirmation (something that larger blocksizes reduce if anything), which is important for many applications. With the right technology I believe we can have our cake and eat it too, but there needs to be a reason to build it; the security and decentralization level of Bitcoin imposes a hard upper limit on anything that can be based on it.
Another key point here is that the small bumps in blocksize which wouldn't clearly knock the system into a largely centralized mode--small constants--are small enough that they don't quantitatively change the operation of the system; they don't open up new applications that aren't possible today
the procedure I'd prefer would be something like this: if there is a standing backlog, we-the-community of users look to indicators to gauge if the network is losing decentralization and then double the hard limit with proper controls to allow smooth adjustment without fees going to zero (see the past proposals for automatic block size controls that let miners increase up to a hard maximum over the median if they mine at quadratically harder difficulty), and we don't increase if it appears it would be at a substantial increase in centralization risk. Hardfork changes should only be made if they're almost completely uncontroversial--where virtually everyone can look at the available data and say "yea, that isn't undermining my property rights or future use of Bitcoin; it's no big deal". Unfortunately, every indicator I can think of except fee totals has been going in the wrong direction almost monotonically along with the blockchain size increase since 2012 when we started hitting full blocks and responded by increasing the default soft target. This is frustrating
many people--myself included--have been working feverishly hard behind the scenes on Bitcoin Core to increase the scalability. This work isn't small-potatoes boring software engineering stuff; I mean even my personal contributions include things like inventing a wholly new generic algebraic optimization applicable to all EC signature schemes that increases performance by 4%, and that is before getting into the R&D stuff that hasn't really borne fruit yet, like fraud proofs. Today Bitcoin Core is easily >100 times faster to synchronize and relay than when I first got involved on the same hardware, but these improvements have been swallowed by the growth. The ironic thing is that our frantic efforts to keep ahead and not lose decentralization have both not been enough (by the best measures, full node usage is the lowest its been since 2011 even though the user base is huge now) and yet also so much that people could seriously talk about increasing the block size to something gigantic like 20MB. This sounds less reasonable when you realize that even at 1MB we'd likely have a smoking hole in the ground if not for existing enormous efforts to make scaling not come at a loss of decentralization.
Peter Todd also summarized some academic findings on the subject:
In short, without either a fixed blocksize or fixed fee per transaction Bitcoin will will not survive as there is no viable way to pay for PoW security. The latter option - fixed fee per transaction - is non-trivial to implement in a way that's actually meaningful - it's easy to give miners "kickbacks" - leaving us with a fixed blocksize.
Even a relatively small increase to 20MB will greatly reduce the number of people who can participate fully in Bitcoin, creating an environment where the next increase requires the consent of an even smaller portion of the Bitcoin ecosystem. Where does that stop? What's the proposed mechanism that'll create an incentive and social consensus to not just 'kick the can down the road'(3) and further centralize but actually scale up Bitcoin the hard way?
Some developers (e.g. Aaron Voisine) voiced support for Gavin's proposal which repeated Mike Hearn's "crash landing" arguments.
Pieter Wuille said:
I am - in general - in favor of increasing the size blocks...
Controversial hard forks. I hope the mailing list here today already proves it is a controversial issue. Independent of personal opinions pro or against, I don't think we can do a hard fork that is controversial in nature. Either the result is effectively a fork, and pre-existing coins can be spent once on both sides (effectively failing Bitcoin's primary purpose), or the result is one side forced to upgrade to something they dislike - effectively giving a power to developers they should never have. Quoting someone: "I did not sign up to be part of a central banker's committee".
The reason for increasing is "need". If "we need more space in blocks" is the reason to do an upgrade, it won't stop after 20 MB. There is nothing fundamental possible with 20 MB blocks that isn't with 1 MB blocks.
Misrepresentation of the trade-offs. You can argue all you want that none of the effects of larger blocks are particularly damaging, so everything is fine. They will damage something (see below for details), and we should analyze these effects, and be honest about them, and present them as a trade-off made we choose to make to scale the system better. If you just ask people if they want more transactions, of course you'll hear yes. If you ask people if they want to pay less taxes, I'm sure the vast majority will agree as well.
Miner centralization. There is currently, as far as I know, no technology that can relay and validate 20 MB blocks across the planet, in a manner fast enough to avoid very significant costs to mining. There is work in progress on this (including Gavin's IBLT-based relay, or Greg's block network coding), but I don't think we should be basing the future of the economics of the system on undemonstrated ideas. Without those (or even with), the result may be that miners self-limit the size of their blocks to propagate faster, but if this happens, larger, better-connected, and more centrally-located groups of miners gain a competitive advantage by being able to produce larger blocks. I would like to point out that there is nothing evil about this - a simple feedback to determine an optimal block size for an individual miner will result in larger blocks for better connected hash power. If we do not want miners to have this ability, "we" (as in: those using full nodes) should demand limitations that prevent it. One such limitation is a block size limit (whatever it is).
Ability to use a full node.
Skewed incentives for improvements... without actual pressure to work on these, I doubt much will change. Increasing the size of blocks now will simply make it cheap enough to continue business as usual for a while - while forcing a massive cost increase (and not just a monetary one) on the entire ecosystem.
Fees and long-term incentives.
I don't think 1 MB is optimal. Block size is a compromise between scalability of transactions and verifiability of the system. A system with 10 transactions per day that is verifiable by a pocket calculator is not useful, as it would only serve a few large bank's settlements. A system which can deal with every coffee bought on the planet, but requires a Google-scale data center to verify is also not useful, as it would be trivially out-competed by a VISA-like design. The usefulness needs in a balance, and there is no optimal choice for everyone. We can choose where that balance lies, but we must accept that this is done as a trade-off, and that that trade-off will have costs such as hardware costs, decreasing anonymity, less independence, smaller target audience for people able to fully validate, ...
Choose wisely.
Mike Hearn responded:
this list is not a good place for making progress or reaching decisions.
if Bitcoin continues on its current growth trends it will run out of capacity, almost certainly by some time next year. What we need to see right now is leadership and a plan, that fits in the available time window.
I no longer believe this community can reach consensus on anything protocol related.
When the money supply eventually dwindles I doubt it will be fee pressure that funds mining
What I don't see from you yet is a specific and credible plan that fits within the next 12 months and which allows Bitcoin to keep growing.
Peter Todd then pointed out that, contrary to Mike's claims, developer consensus had been achieved within Core plenty of times recently. Btc-drak asked Mike to "explain where the 12 months timeframe comes from?"
Jorge Timón wrote an incredibly prescient reply to Mike:
We've successfully reached consensus for several softfork proposals already. I agree with others that hardfork need to be uncontroversial and there should be consensus about them. If you have other ideas for the criteria for hardfork deployment all I'm ears. I just hope that by "What we need to see right now is leadership" you don't mean something like "when Gaving and Mike agree it's enough to deploy a hardfork" when you go from vague to concrete.
Oh, so your answer to "bitcoin will eventually need to live on fees and we would like to know more about how it will look like then" it's "no bitcoin long term it's broken long term but that's far away in the future so let's just worry about the present". I agree that it's hard to predict that future, but having some competition for block space would actually help us get more data on a similar situation to be able to predict that future better. What you want to avoid at all cost (the block size actually being used), I see as the best opportunity we have to look into the future.
this is my plan: we wait 12 months... and start having full blocks and people having to wait 2 blocks for their transactions to be confirmed some times. That would be the beginning of a true "fee market", something that Gavin used to say was his #1 priority not so long ago (which seems contradictory with his current efforts to avoid that from happening). Having a true fee market seems clearly an advantage. What are supposedly disastrous negative parts of this plan that make an alternative plan (ie: increasing the block size) so necessary and obvious. I think the advocates of the size increase are failing to explain the disadvantages of maintaining the current size. It feels like the explanation are missing because it should be somehow obvious how the sky will burn if we don't increase the block size soon. But, well, it is not obvious to me, so please elaborate on why having a fee market (instead of just an price estimator for a market that doesn't even really exist) would be a disaster.
Some suspected Gavin/Mike were trying to rush the hard fork for personal reasons.
Mike Hearn's response was to demand a "leader" who could unilaterally steer the Bitcoin project and make decisions unchecked:
No. What I meant is that someone (theoretically Wladimir) needs to make a clear decision. If that decision is "Bitcoin Core will wait and watch the fireworks when blocks get full", that would be showing leadership
I will write more on the topic of what will happen if we hit the block size limit... I don't believe we will get any useful data out of such an event. I've seen distributed systems run out of capacity before. What will happen instead is technological failure followed by rapid user abandonment...
we need to hear something like that from Wladimir, or whoever has the final say around here.
Jorge Timón responded:
it is true that "universally uncontroversial" (which is what I think the requirement should be for hard forks) is a vague qualifier that's not formally defined anywhere. I guess we should only consider rational arguments. You cannot just nack something without further explanation. If his explanation was "I will change my mind after we increase block size", I guess the community should say "then we will just ignore your nack because it makes no sense". In the same way, when people use fallacies (purposely or not) we must expose that and say "this fallacy doesn't count as an argument". But yeah, it would probably be good to define better what constitutes a "sensible objection" or something. That doesn't seem simple though.
it seems that some people would like to see that happening before the subsidies are low (not necessarily null), while other people are fine waiting for that but don't want to ever be close to the scale limits anytime soon. I would also like to know for how long we need to prioritize short term adoption in this way. As others have said, if the answer is "forever, adoption is always the most important thing" then we will end up with an improved version of Visa. But yeah, this is progress, I'll wait for your more detailed description of the tragedies that will follow hitting the block limits, assuming for now that it will happen in 12 months. My previous answer to the nervous "we will hit the block limits in 12 months if we don't do anything" was "not sure about 12 months, but whatever, great, I'm waiting for that to observe how fees get affected". But it should have been a question "what's wrong with hitting the block limits in 12 months?"
Mike Hearn again asserted the need for a leader:
There must be a single decision maker for any given codebase.
Bryan Bishop attempted to explain why this did not make sense with git architecture.
Finally, Gavin announced his intent to merge the patch into Bitcoin XT to bypass the peer review he had received on the bitcoin-dev mailing list.
submitted by sound8bits to Bitcoin [link] [comments]

The Origins of the (Modern) Blocksize Debate

On May 4, 2015, Gavin Andresen wrote on his blog:
I was planning to submit a pull request to the 0.11 release of Bitcoin Core that will allow miners to create blocks bigger than one megabyte, starting a little less than a year from now. But this process of peer review turned up a technical issue that needs to get addressed, and I don’t think it can be fixed in time for the first 0.11 release.
I will be writing a series of blog posts, each addressing one argument against raising the maximum block size, or against scheduling a raise right now... please send me an email ([email protected]) if I am missing any arguments
In other words, Gavin proposed a hard fork via a series of blog posts, bypassing all developer communication channels altogether and asking for personal, private emails from anyone interested in discussing the proposal further.
On May 5 (1 day after Gavin submitted his first blog post), Mike Hearn published The capacity cliff on his Medium page. 2 days later, he posted Crash landing. In these posts, he argued:
A common argument for letting Bitcoin blocks fill up is that the outcome won’t be so bad: just a market for fees... this is wrong. I don’t believe fees will become high and stable if Bitcoin runs out of capacity. Instead, I believe Bitcoin will crash.
...a permanent backlog would start to build up... as the backlog grows, nodes will start running out of memory and dying... as Core will accept any transaction that’s valid without any limit a node crash is eventually inevitable.
He also, in the latter article, explained that he disagreed with Satoshi's vision for how Bitcoin would mature[1][2]:
Neither me nor Gavin believe a fee market will work as a substitute for the inflation subsidy.
Gavin continued to publish the series of blog posts he had announced while Hearn made these predictions. [1][2][3][4][5][6][7]
Matt Corallo brought Gavin's proposal up on the bitcoin-dev mailing list after a few days. He wrote:
Recently there has been a flurry of posts by Gavin at http://gavinandresen.svbtle.com/ which advocate strongly for increasing the maximum block size. However, there hasnt been any discussion on this mailing list in several years as far as I can tell...
So, at the risk of starting a flamewar, I'll provide a little bait to get some responses and hope the discussion opens up into an honest comparison of the tradeoffs here. Certainly a consensus in this kind of technical community should be a basic requirement for any serious commitment to blocksize increase.
Personally, I'm rather strongly against any commitment to a block size increase in the near future. Long-term incentive compatibility requires that there be some fee pressure, and that blocks be relatively consistently full or very nearly full. What we see today are transactions enjoying next-block confirmations with nearly zero pressure to include any fee at all (though many do because it makes wallet code simpler).
This allows the well-funded Bitcoin ecosystem to continue building systems which rely on transactions moving quickly into blocks while pretending these systems scale. Thus, instead of working on technologies which bring Bitcoin's trustlessness to systems which scale beyond a blockchain's necessarily slow and (compared to updating numbers in a database) expensive settlement, the ecosystem as a whole continues to focus on building centralized platforms and advocate for changes to Bitcoin which allow them to maintain the status quo
Shortly thereafter, Corallo explained further:
The point of the hard block size limit is exactly because giving miners free rule to do anything they like with their blocks would allow them to do any number of crazy attacks. The incentives for miners to pick block sizes are no where near compatible with what allows the network to continue to run in a decentralized manner.
Tier Nolan considered possible extensions and modifications that might improve Gavin's proposal and argued that soft caps could be used to mitigate against the dangers of a blocksize increase. Tom Harding voiced support for Gavin's proposal
Peter Todd mentioned that a limited blocksize provides the benefit of protecting against the "perverse incentives" behind potential block withholding attacks.
Slush didn't have a strong opinion one way or the other, and neither did Eric Lombrozo, though Eric was interested in developing hard-fork best practices and wanted to:
explore all the complexities involved with deployment of hard forks. Let’s not just do a one-off ad-hoc thing.
Matt Whitlock voiced his opinion:
I'm not so much opposed to a block size increase as I am opposed to a hard fork... I strongly fear that the hard fork itself will become an excuse to change other aspects of the system in ways that will have unintended and possibly disastrous consequences.
Bryan Bishop strongly opposed Gavin's proposal, and offered a philosophical perspective on the matter:
there has been significant public discussion... about why increasing the max block size is kicking the can down the road while possibly compromising blockchain security. There were many excellent objections that were raised that, sadly, I see are not referenced at all in the recent media blitz. Frankly I can't help but feel that if contributions, like those from #bitcoin-wizards, have been ignored in lieu of technical analysis, and the absence of discussion on this mailing list, that I feel perhaps there are other subtle and extremely important technical details that are completely absent from this--and other-- proposals.
Secured decentralization is the most important and most interesting property of bitcoin. Everything else is rather trivial and could be achieved millions of times more efficiently with conventional technology. Our technical work should be informed by the technical nature of the system we have constructed.
There's no doubt in my mind that bitcoin will always see the most extreme campaigns and the most extreme misunderstandings... for development purposes we must hold ourselves to extremely high standards before proposing changes, especially to the public, that have the potential to be unsafe and economically unsafe.
There are many potential technical solutions for aggregating millions (trillions?) of transactions into tiny bundles. As a small proof-of-concept, imagine two parties sending transactions back and forth 100 million times. Instead of recording every transaction, you could record the start state and the end state, and end up with two transactions or less. That's a 100 million fold, without modifying max block size and without potentially compromising secured decentralization.
The MIT group should listen up and get to work figuring out how to measure decentralization and its security.. Getting this measurement right would be really beneficial because we would have a more academic and technical understanding to work with.
Gregory Maxwell echoed and extended that perspective:
When Bitcoin is changed fundamentally, via a hard fork, to have different properties, the change can create winners or losers...
There are non-trivial number of people who hold extremes on any of these general belief patterns; Even among the core developers there is not a consensus on Bitcoin's optimal role in society and the commercial marketplace.
there is a at least a two fold concern on this particular ("Long term Mining incentives") front:
One is that the long-held argument is that security of the Bitcoin system in the long term depends on fee income funding autonomous, anonymous, decentralized miners profitably applying enough hash-power to make reorganizations infeasible.
For fees to achieve this purpose, there seemingly must be an effective scarcity of capacity.
The second is that when subsidy has fallen well below fees, the incentive to move the blockchain forward goes away. An optimal rational miner would be best off forking off the current best block in order to capture its fees, rather than moving the blockchain forward...
tools like the Lightning network proposal could well allow us to hit a greater spectrum of demands at once--including secure zero-confirmation (something that larger blocksizes reduce if anything), which is important for many applications. With the right technology I believe we can have our cake and eat it too, but there needs to be a reason to build it; the security and decentralization level of Bitcoin imposes a hard upper limit on anything that can be based on it.
Another key point here is that the small bumps in blocksize which wouldn't clearly knock the system into a largely centralized mode--small constants--are small enough that they don't quantitatively change the operation of the system; they don't open up new applications that aren't possible today
the procedure I'd prefer would be something like this: if there is a standing backlog, we-the-community of users look to indicators to gauge if the network is losing decentralization and then double the hard limit with proper controls to allow smooth adjustment without fees going to zero (see the past proposals for automatic block size controls that let miners increase up to a hard maximum over the median if they mine at quadratically harder difficulty), and we don't increase if it appears it would be at a substantial increase in centralization risk. Hardfork changes should only be made if they're almost completely uncontroversial--where virtually everyone can look at the available data and say "yea, that isn't undermining my property rights or future use of Bitcoin; it's no big deal". Unfortunately, every indicator I can think of except fee totals has been going in the wrong direction almost monotonically along with the blockchain size increase since 2012 when we started hitting full blocks and responded by increasing the default soft target. This is frustrating
many people--myself included--have been working feverishly hard behind the scenes on Bitcoin Core to increase the scalability. This work isn't small-potatoes boring software engineering stuff; I mean even my personal contributions include things like inventing a wholly new generic algebraic optimization applicable to all EC signature schemes that increases performance by 4%, and that is before getting into the R&D stuff that hasn't really borne fruit yet, like fraud proofs. Today Bitcoin Core is easily >100 times faster to synchronize and relay than when I first got involved on the same hardware, but these improvements have been swallowed by the growth. The ironic thing is that our frantic efforts to keep ahead and not lose decentralization have both not been enough (by the best measures, full node usage is the lowest its been since 2011 even though the user base is huge now) and yet also so much that people could seriously talk about increasing the block size to something gigantic like 20MB. This sounds less reasonable when you realize that even at 1MB we'd likely have a smoking hole in the ground if not for existing enormous efforts to make scaling not come at a loss of decentralization.
Peter Todd also summarized some academic findings on the subject:
In short, without either a fixed blocksize or fixed fee per transaction Bitcoin will will not survive as there is no viable way to pay for PoW security. The latter option - fixed fee per transaction - is non-trivial to implement in a way that's actually meaningful - it's easy to give miners "kickbacks" - leaving us with a fixed blocksize.
Even a relatively small increase to 20MB will greatly reduce the number of people who can participate fully in Bitcoin, creating an environment where the next increase requires the consent of an even smaller portion of the Bitcoin ecosystem. Where does that stop? What's the proposed mechanism that'll create an incentive and social consensus to not just 'kick the can down the road'(3) and further centralize but actually scale up Bitcoin the hard way?
Some developers (e.g. Aaron Voisine) voiced support for Gavin's proposal which repeated Mike Hearn's "crash landing" arguments.
Pieter Wuille said:
I am - in general - in favor of increasing the size blocks...
Controversial hard forks. I hope the mailing list here today already proves it is a controversial issue. Independent of personal opinions pro or against, I don't think we can do a hard fork that is controversial in nature. Either the result is effectively a fork, and pre-existing coins can be spent once on both sides (effectively failing Bitcoin's primary purpose), or the result is one side forced to upgrade to something they dislike - effectively giving a power to developers they should never have. Quoting someone: "I did not sign up to be part of a central banker's committee".
The reason for increasing is "need". If "we need more space in blocks" is the reason to do an upgrade, it won't stop after 20 MB. There is nothing fundamental possible with 20 MB blocks that isn't with 1 MB blocks.
Misrepresentation of the trade-offs. You can argue all you want that none of the effects of larger blocks are particularly damaging, so everything is fine. They will damage something (see below for details), and we should analyze these effects, and be honest about them, and present them as a trade-off made we choose to make to scale the system better. If you just ask people if they want more transactions, of course you'll hear yes. If you ask people if they want to pay less taxes, I'm sure the vast majority will agree as well.
Miner centralization. There is currently, as far as I know, no technology that can relay and validate 20 MB blocks across the planet, in a manner fast enough to avoid very significant costs to mining. There is work in progress on this (including Gavin's IBLT-based relay, or Greg's block network coding), but I don't think we should be basing the future of the economics of the system on undemonstrated ideas. Without those (or even with), the result may be that miners self-limit the size of their blocks to propagate faster, but if this happens, larger, better-connected, and more centrally-located groups of miners gain a competitive advantage by being able to produce larger blocks. I would like to point out that there is nothing evil about this - a simple feedback to determine an optimal block size for an individual miner will result in larger blocks for better connected hash power. If we do not want miners to have this ability, "we" (as in: those using full nodes) should demand limitations that prevent it. One such limitation is a block size limit (whatever it is).
Ability to use a full node.
Skewed incentives for improvements... without actual pressure to work on these, I doubt much will change. Increasing the size of blocks now will simply make it cheap enough to continue business as usual for a while - while forcing a massive cost increase (and not just a monetary one) on the entire ecosystem.
Fees and long-term incentives.
I don't think 1 MB is optimal. Block size is a compromise between scalability of transactions and verifiability of the system. A system with 10 transactions per day that is verifiable by a pocket calculator is not useful, as it would only serve a few large bank's settlements. A system which can deal with every coffee bought on the planet, but requires a Google-scale data center to verify is also not useful, as it would be trivially out-competed by a VISA-like design. The usefulness needs in a balance, and there is no optimal choice for everyone. We can choose where that balance lies, but we must accept that this is done as a trade-off, and that that trade-off will have costs such as hardware costs, decreasing anonymity, less independence, smaller target audience for people able to fully validate, ...
Choose wisely.
Mike Hearn responded:
this list is not a good place for making progress or reaching decisions.
if Bitcoin continues on its current growth trends it will run out of capacity, almost certainly by some time next year. What we need to see right now is leadership and a plan, that fits in the available time window.
I no longer believe this community can reach consensus on anything protocol related.
When the money supply eventually dwindles I doubt it will be fee pressure that funds mining
What I don't see from you yet is a specific and credible plan that fits within the next 12 months and which allows Bitcoin to keep growing.
Peter Todd then pointed out that, contrary to Mike's claims, developer consensus had been achieved within Core plenty of times recently. Btc-drak asked Mike to "explain where the 12 months timeframe comes from?"
Jorge Timón wrote an incredibly prescient reply to Mike:
We've successfully reached consensus for several softfork proposals already. I agree with others that hardfork need to be uncontroversial and there should be consensus about them. If you have other ideas for the criteria for hardfork deployment all I'm ears. I just hope that by "What we need to see right now is leadership" you don't mean something like "when Gaving and Mike agree it's enough to deploy a hardfork" when you go from vague to concrete.
Oh, so your answer to "bitcoin will eventually need to live on fees and we would like to know more about how it will look like then" it's "no bitcoin long term it's broken long term but that's far away in the future so let's just worry about the present". I agree that it's hard to predict that future, but having some competition for block space would actually help us get more data on a similar situation to be able to predict that future better. What you want to avoid at all cost (the block size actually being used), I see as the best opportunity we have to look into the future.
this is my plan: we wait 12 months... and start having full blocks and people having to wait 2 blocks for their transactions to be confirmed some times. That would be the beginning of a true "fee market", something that Gavin used to say was his #1 priority not so long ago (which seems contradictory with his current efforts to avoid that from happening). Having a true fee market seems clearly an advantage. What are supposedly disastrous negative parts of this plan that make an alternative plan (ie: increasing the block size) so necessary and obvious. I think the advocates of the size increase are failing to explain the disadvantages of maintaining the current size. It feels like the explanation are missing because it should be somehow obvious how the sky will burn if we don't increase the block size soon. But, well, it is not obvious to me, so please elaborate on why having a fee market (instead of just an price estimator for a market that doesn't even really exist) would be a disaster.
Some suspected Gavin/Mike were trying to rush the hard fork for personal reasons.
Mike Hearn's response was to demand a "leader" who could unilaterally steer the Bitcoin project and make decisions unchecked:
No. What I meant is that someone (theoretically Wladimir) needs to make a clear decision. If that decision is "Bitcoin Core will wait and watch the fireworks when blocks get full", that would be showing leadership
I will write more on the topic of what will happen if we hit the block size limit... I don't believe we will get any useful data out of such an event. I've seen distributed systems run out of capacity before. What will happen instead is technological failure followed by rapid user abandonment...
we need to hear something like that from Wladimir, or whoever has the final say around here.
Jorge Timón responded:
it is true that "universally uncontroversial" (which is what I think the requirement should be for hard forks) is a vague qualifier that's not formally defined anywhere. I guess we should only consider rational arguments. You cannot just nack something without further explanation. If his explanation was "I will change my mind after we increase block size", I guess the community should say "then we will just ignore your nack because it makes no sense". In the same way, when people use fallacies (purposely or not) we must expose that and say "this fallacy doesn't count as an argument". But yeah, it would probably be good to define better what constitutes a "sensible objection" or something. That doesn't seem simple though.
it seems that some people would like to see that happening before the subsidies are low (not necessarily null), while other people are fine waiting for that but don't want to ever be close to the scale limits anytime soon. I would also like to know for how long we need to prioritize short term adoption in this way. As others have said, if the answer is "forever, adoption is always the most important thing" then we will end up with an improved version of Visa. But yeah, this is progress, I'll wait for your more detailed description of the tragedies that will follow hitting the block limits, assuming for now that it will happen in 12 months. My previous answer to the nervous "we will hit the block limits in 12 months if we don't do anything" was "not sure about 12 months, but whatever, great, I'm waiting for that to observe how fees get affected". But it should have been a question "what's wrong with hitting the block limits in 12 months?"
Mike Hearn again asserted the need for a leader:
There must be a single decision maker for any given codebase.
Bryan Bishop attempted to explain why this did not make sense with git architecture.
Finally, Gavin announced his intent to merge the patch into Bitcoin XT to bypass the peer review he had received on the bitcoin-dev mailing list.
submitted by sound8bits to sound8bits [link] [comments]

Lies, FUD, and hyperbole

https://medium.com/@octskyward/the-resolution-of-the-bitcoin-experiment-dabb30201f7#.obcepgw0g
Lies, FUD, and hyperbole Part 1
With apologies to the length but Hearn does pack a lot of misrepresentations and lies into this article.
a system completely controlled by just a handful of people. Worse still, the network is on the brink of technical collapse.
This is patently untrue as power dynamics within bitcoin are a complex interwoven level of game theory shared by miners, nodes, developers, merchants and payment processors, and users. Even if one were to make the false assumption that Miners control all the power, the reality is mining pools are either made up of thousands of individual miners who can and do redirect their hashing power or private pools with companies controlled by multiple investors and owners.
Worse still, the network is on the brink of technical collapse.
If and when a fee event happens, bitcoin will be just fine. Wallets already can adjust for fees and tx fee pressures will be kept reasonable because they still need to compete with free off the chain solutions. Whether the Block size is raised to 2, 4, or 8 MB it will also be fine(in the short term) as long as corresponding sigop protections are included. The blocksize debate more has to do with bikeshedding and setting a long term direction for bitcoin than preventing a short term technical collapse.
Couldn’t move your existing money
Bitcoin functions as a payment rails system just fine, just ask Coinbase and bitpay.
Had wildly unpredictable fees that were high and rising fast
False, I normal pay 3-5 pennies , and tx instantly get to their destination and confirm between 5 min to 1 hour like normal. CC txs take weeks to months to confirm.
Allowed buyers to take back payments they’d made after walking out of shops, by simply pressing a button (if >you aren’t aware of this “feature” that’s because Bitcoin was only just changed to allow it)
RBF is opt in , and therefore payment processors won't accept this if they do 0 conf tx approvals.
Is suffering large backlogs and flaky payments
The block chain is full.
Blocks are 60-70% full on average . We have yet to see a continuous backlog lasting more than a few hours max. This conf backlog doesn't prevent tx from being processed unlike when the Visa/paypal network goes down and you cannot make a payment at all.
… which is controlled by China
People in China [b]partially [/b]Control one small aspect of the bitcoin ecosystem and why shouldn't they? They do represent 19% of the worlds population. This comment is both misleading and xenophobic.
… and in which the companies and people building it were in open civil war?
Most people are passionate but still friendly behind closed doors. The Blocksize debate has spurred decentralization of developer groups and new ideas which are good things. Sure there has been some unproductive infighting , but we will get through this and be stronger for it. "Civil wars" exist within and between all currencies anyways so this is nothing surprising.
Once upon a time, Bitcoin had the killer advantage of low and even zero fees, but it’s now common to be asked >to pay more to miners than a credit card would charge.
Credit cards charge 2.8% to 7% in the US and 5-8% in many other countries. Bitcoins once had fees up to 40 cents a tx , and for the past few years normal fees have been consistently between 2-8 pennies per tx on the chain and free off the chain.
Because the block chain is controlled by Chinese miners, just two of whom control more >than 50% of the hash >power.
At a recent conference over 95% of hashing power was controlled by a handful of guys sitting on a single stage.
Mining pools are controlled by many miners and interests , not individuals. Miners also share the control with many other competing interests and are limited in their ability to harm the bitcoin ecosystem if they so choose.
They have chosen instead to ignore the problem and hope it goes away.
Bitcoin core has already come to a consensus on a scaling proposal - https://bitcoincore.org/en/2015/12/21/capacity-increase/ https://bitcoincore.org/en/2015/12/23/capacity-increases-faq/ and various other implementations are developing theirs to propose to the community. Bitcoin Classic is another interesting implementations that appears to have found consensus around BIP102.
This gives them a perverse financial incentive to actually try and stop Bitcoin becoming popular.
The Chinese miners want bitcoin to scale to at least 2MB in the short term, something that both Core and Classic accommodate. Bitcoin will continue to scale with many other solutions and ultimately payment channels will allow it to scale to Visa like levels of TPS.
The resulting civil war has seen Coinbase — the largest and best known Bitcoin startup in the USA — be erased >from the official Bitcoin website for picking the “wrong” side and banned from the community forums.
Coinbase was re-added to bitcoin.org. Mike conveniently left that important datapoint off.
has gone from being a transparent and open community to one that is dominated by rampant censorship
There are more subreddits, more forums , and more information than ever before. The blocksize debate does sometimes create divisions in our ecosystem but the information is all there and easy for anyone to investigate.
But the inability to get news about XT or the censorship itself through to users has some problematic effects.
The failure of XT has nothing to do with the lack of information. If anything there is too much information available , being repeated over and over , in many different venues.
One of them, Gregory Maxwell, had an unusual set of views: he once claimed he had mathematically proven >Bitcoin to be impossible. More problematically, he did not believe in Satoshi’s original vision.
Satoshi never intended to be used as an argument from authority and if he does he can always come back and contribute. We should not depend upon an authority figure but evidence, valid reasoning, and testing.
And indeed back-of-the-envelope calculations suggested that, as he said to me, “it never really hits a scale >ceiling” even when looking at more factors than just bandwidth.
Hearn's calculations are wrong. More specifically they do not take into account TOR, decentralization in locations with bandwidth limitations, bandwidth softcaps imposed by ISP's, the true scale of historical bandwidth increases, and malicious actors attacking the system with sophisticated attacks.
Once the 5 developers with commit access to the code had been chosen and Gavin had decided he did not want >to be the leader, there was no procedure in place to ever remove one.
The 45 developers who contributed to Bitcoin Core in 2015 could be replaced instantly if the community wanted with little effort. Ultimately, the nodes, miners and users control which code they use and no group of developers can force them to upgrade. In fact Bitcoin Core deliberately avoids and auto-update feature with their releases at the cost of usability to specifically insure that users have to actively choose all new features and can opt out simply by not upgrading.
... end of part one...
submitted by bitusher to Bitcoin [link] [comments]

Is anyone else freaked out by this whole blocksize debate? Does anyone else find themself often agreeing with *both* sides - depending on whichever argument you happen to be reading at the moment? And do we need some better algorithms and data structures?

Why do both sides of the debate seem “right” to me?
I know, I know, a healthy debate is healthy and all - and maybe I'm just not used to the tumult and jostling which would be inevitable in a real live open major debate about something as vital as Bitcoin.
And I really do agree with the starry-eyed idealists who say Bitcoin is vital. Imperfect as it may be, it certainly does seem to represent the first real chance we've had in the past few hundred years to try to steer our civilization and our planet away from the dead-ends and disasters which our government-issued debt-based currencies keep dragging us into.
But this particular debate, about the blocksize, doesn't seem to be getting resolved at all.
Pretty much every time I read one of the long-form major arguments contributed by Bitcoin "thinkers" who I've come to respect over the past few years, this weird thing happens: I usually end up finding myself nodding my head and agreeing with whatever particular piece I'm reading!
But that should be impossible - because a lot of these people vehemently disagree!
So how can both sides sound so convincing to me, simply depending on whichever piece I currently happen to be reading?
Does anyone else feel this way? Or am I just a gullible idiot?
Just Do It?
When you first look at it or hear about it, increasing the size seems almost like a no-brainer: The "big-block" supporters say just increase the blocksize to 20 MB or 8 MB, or do some kind of scheduled or calculated regular increment which tries to take into account the capabilities of the infrastructure and the needs of the users. We do have the bandwidth and the memory to at least increase the blocksize now, they say - and we're probably gonna continue to have more bandwidth and memory in order to be able to keep increasing the blocksize for another couple decades - pretty much like everything else computer-based we've seen over the years (some of this stuff is called by names such as "Moore's Law").
On the other hand, whenever the "small-block" supporters warn about the utter catastrophe that a failed hard-fork would mean, I get totally freaked by their possible doomsday scenarios, which seem totally plausible and terrifying - so I end up feeling that the only way I'd want to go with a hard-fork would be if there was some pre-agreed "triggering" mechanism where the fork itself would only actually "switch on" and take effect provided that some "supermajority" of the network (of who? the miners? the full nodes?) had signaled (presumably via some kind of totally reliable p2p trustless software-based voting system?) that they do indeed "pre-agree" to actually adopt the pre-scheduled fork (and thereby avoid any possibility whatsoever of the precious blockchain somehow tragically splitting into two and pretty much killing this cryptocurrency off in its infancy).
So in this "conservative" scenario, I'm talking about wanting at least 95% pre-adoption agreement - not the mere 75% which I recall some proposals call for, which seems like it could easily lead to a 75/25 blockchain split.
But this time, with this long drawn-out blocksize debate, the core devs, and several other important voices who have become prominent opinion shapers over the past few years, can't seem to come to any real agreement on this.
Weird split among the devs
As far as I can see, there's this weird split: Gavin and Mike seem to be the only people among the devs who really want a major blocksize increase - and all the other devs seem to be vehemently against them.
But then on the other hand, the users seem to be overwhelmingly in favor of a major increase.
And there are meta-questions about governance, about about why this didn't come out as a BIP, and what the availability of Bitcoin XT means.
And today or yesterday there was this really cool big-blockian exponential graph based on doubling the blocksize every two years for twenty years, reminding us of the pure mathematical fact that 210 is indeed about 1000 - but not really addressing any of the game-theoretic points raised by the small-blockians. So a lot of the users seem to like it, but when so few devs say anything positive about it, I worry: is this just yet more exponential chart porn?
On the one hand, Gavin's and Mike's blocksize increase proposal initially seemed like a no-brainer to me.
And on the other hand, all the other devs seem to be against them. Which is weird - not what I'd initially expected at all (but maybe I'm just a fool who's seduced by exponential chart porn?).
Look, I don't mean to be rude to any of the core devs, and I don't want to come off like someone wearing a tinfoil hat - but it has to cross people's minds that the powers that be (the Fed and the other central banks and the governments that use their debt-issued money to run this world into a ditch) could very well be much more scared shitless than they're letting on. If we assume that the powers that be are using their usual playbook and tactics, then it could be worth looking at the book "Confessions of an Economic Hitman" by John Perkins, to get an idea of how they might try to attack Bitcoin. So, what I'm saying is, they do have a track record of sending in "experts" to try to derail projects and keep everyone enslaved to the Creature from Jekyll Island. I'm just saying. So, without getting ad hominem - let's just make sure that our ideas can really stand scrutiny on their own - as Nick Szabo says, we need to make sure there is "more computer science, less noise" in this debate.
When Gavin Andresen first came out with the 20 MB thing - I sat back and tried to imagine if I could download 20 MB in 10 minutes (which seems to be one of the basic mathematical and technological constraints here - right?)
I figured, "Yeah, I could download that" - even with my crappy internet connection.
And I guess the telecoms might be nice enough to continue to double our bandwidth every two years for the next couple decades – if we ask them politely?
On the other hand - I think we should be careful about entrusting the financial freedom of the world into the greedy hands of the telecoms companies - given all their shady shenanigans over the past few years in many countries. After decades of the MPAA and the FBI trying to chip away at BitTorrent, lately PirateBay has been hard to access. I would say it's quite likely that certain persons at institutions like JPMorgan and Goldman Sachs and the Fed might be very, very motivated to see Bitcoin fail - so we shouldn't be too sure about scaling plans which depend on the willingness of companies Verizon and AT&T to double our bandwith every two years.
Maybe the real important hardware buildout challenge for a company like 21 (and its allies such as Qualcomm) to take on now would not be "a miner in every toaster" but rather "Google Fiber Download and Upload Speeds in every Country, including China".
I think I've read all the major stuff on the blocksize debate from Gavin Andresen, Mike Hearn, Greg Maxwell, Peter Todd, Adam Back, and Jeff Garzick and several other major contributors - and, oddly enough, all their arguments seem reasonable - heck even Luke-Jr seems reasonable to me on the blocksize debate, and I always thought he was a whackjob overly influenced by superstition and numerology - and now today I'm reading the article by Bram Cohen - the inventor of BitTorrent - and I find myself agreeing with him too!
I say to myself: What's going on with me? How can I possibly agree with all of these guys, if they all have such vehemently opposing viewpoints?
I mean, think back to the glory days of a couple of years ago, when all we were hearing was how this amazing unprecedented grassroots innovation called Bitcoin was going to benefit everyone from all walks of life, all around the world:
...basically the entire human race transacting everything into the blockchain.
(Although let me say that I think that people's focus on ideas like driverless cabs creating realtime fare markets based on supply and demand seems to be setting our sights a bit low as far as Bitcoin's abilities to correct the financial world's capital-misallocation problems which seem to have been made possible by infinite debt-based fiat. I would have hoped that a Bitcoin-based economy would solve much more noble, much more urgent capital-allocation problems than driverless taxicabs creating fare markets or refrigerators ordering milk on the internet of things. I was thinking more along the lines that Bitcoin would finally strangle dead-end debt-based deadly-toxic energy industries like fossil fuels and let profitable clean energy industries like Thorium LFTRs take over - but that's another topic. :=)
Paradoxes in the blocksize debate
Let me summarize the major paradoxes I see here:
(1) Regarding the people (the majority of the core devs) who are against a blocksize increase: Well, the small-blocks arguments do seem kinda weird, and certainly not very "populist", in the sense that: When on earth have end-users ever heard of a computer technology whose capacity didn't grow pretty much exponentially year-on-year? All the cool new technology we've had - from hard drives to RAM to bandwidth - started out pathetically tiny and grew to unimaginably huge over the past few decades - and all our software has in turn gotten massively powerful and big and complex (sometimes bloated) to take advantage of the enormous new capacity available.
But now suddenly, for the first time in the history of technology, we seem to have a majority of the devs, on a major p2p project - saying: "Let's not scale the system up. It could be dangerous. It might break the whole system (if the hard-fork fails)."
I don't know, maybe I'm missing something here, maybe someone else could enlighten me, but I don't think I've ever seen this sort of thing happen in the last few decades of the history of technology - devs arguing against scaling up p2p technology to take advantage of expected growth in infrastructure capacity.
(2) But... on the other hand... the dire warnings of the small-blockians about what could happen if a hard-fork were to fail - wow, they do seem really dire! And these guys are pretty much all heavyweight, experienced programmers and/or game theorists and/or p2p open-source project managers.
I must say, that nearly all of the long-form arguments I've read - as well as many, many of the shorter comments I've read from many users in the threads, whose names I at least have come to more-or-less recognize over the past few months and years on reddit and bitcointalk - have been amazingly impressive in their ability to analyze all aspects of the lifecycle and management of open-source software projects, bringing up lots of serious points which I could never have come up with, and which seem to come from long experience with programming and project management - as well as dealing with economics and human nature (eg, greed - the game-theory stuff).
So a lot of really smart and experienced people with major expertise in various areas ranging from programming to management to game theory to politics to economics have been making some serious, mature, compelling arguments.
But, as I've been saying, the only problem to me is: in many of these cases, these arguments are vehemently in opposition to each other! So I find myself agreeing with pretty much all of them, one by one - which means the end result is just a giant contradiction.
I mean, today we have Bram Cohen, the inventor of BitTorrent, arguing (quite cogently and convincingly to me), that it would be dangerous to increase the blocksize. And this seems to be a guy who would know a few things about scaling out a massive global p2p network - since the protocol which he invented, BitTorrent, is now apparently responsible for like a third of the traffic on the internet (and this despite the long-term concerted efforts of major evil players such as the MPAA and the FBI to shut the whole thing down).
Was the BitTorrent analogy too "glib"?
By the way - I would like to go on a slight tangent here and say that one of the main reasons why I felt so "comfortable" jumping on the Bitcoin train back a few years ago, when I first heard about it and got into it, was the whole rough analogy I saw with BitTorrent.
I remembered the perhaps paradoxical fact that when a torrent is more popular (eg, a major movie release that just came out last week), then it actually becomes faster to download. More people want it, so more people have a few pieces of it, so more people are able to get it from each other. A kind of self-correcting economic feedback loop, where more demand directly leads to more supply.
(BitTorrent manages to pull this off by essentially adding a certain structure to the file being shared, so that it's not simply like an append-only list of 1 MB blocks, but rather more like an random-access or indexed array of 1 MB chunks. Say you're downloading a film which is 700 MB. As soon as your "client" program has downloaded a single 1-MB chunk - say chunk #99 - your "client" program instantly turns into a "server" program as well - offering that chunk #99 to other clients. From my simplistic understanding, I believe the Bitcoin protocol does something similar, to provide a p2p architecture. Hence my - perhaps naïve - assumption that Bitcoin already had the right algorithms / architecture / data structure to scale.)
The efficiency of the BitTorrent network seemed to jive with that "network law" (Metcalfe's Law?) about fax machines. This law states that the more fax machines there are, the more valuable the network of fax machines becomes. Or the value of the network grows on the order of the square of the number of nodes.
This is in contrast with other technology like cars, where the more you have, the worse things get. The more cars there are, the more traffic jams you have, so things start going downhill. I guess this is because highway space is limited - after all, we can't pave over the entire countryside, and we never did get those flying cars we were promised, as David Graeber laments in a recent essay in The Baffler magazine :-)
And regarding the "stress test" supposedly happening right now in the middle of this ongoing blocksize debate, I don't know what worries me more: the fact that it apparently is taking only $5,000 to do a simple kind of DoS on the blockchain - or the fact that there are a few rumors swirling around saying that the unknown company doing the stress test shares the same physical mailing address with a "scam" company?
Or maybe we should just be worried that so much of this debate is happening on a handful of forums which are controlled by some guy named theymos who's already engaged in some pretty "contentious" or "controversial" behavior like blowing a million dollars on writing forum software (I guess he never heard that reddit.com software is open-source)?
So I worry that the great promise of "decentralization" might be more fragile than we originally thought.
Scaling
Anyways, back to Metcalfe's Law: with virtual stuff, like torrents and fax machines, the more the merrier. The more people downloading a given movie, the faster it arrives - and the more people own fax machines, the more valuable the overall fax network.
So I kindof (naïvely?) assumed that Bitcoin, being "virtual" and p2p, would somehow scale up the same magical way BitTorrrent did. I just figured that more people using it would somehow automatically make it stronger and faster.
But now a lot of devs have started talking in terms of the old "scarcity" paradigm, talking about blockspace being a "scarce resource" and talking about "fee markets" - which seems kinda scary, and antithetical to much of the earlier rhetoric we heard about Bitcoin (the stuff about supporting our favorite creators with micropayments, and the stuff about Africans using SMS to send around payments).
Look, when some asshole is in line in front of you at the cash register and he's holding up the line so they can run his credit card to buy a bag of Cheeto's, we tend to get pissed off at the guy - clogging up our expensive global electronic payment infrastructure to make a two-dollar purchase. And that's on a fairly efficient centralized system - and presumably after a year or so, VISA and the guy's bank can delete or compress the transaction in their SQL databases.
Now, correct me if I'm wrong, but if some guy buys a coffee on the blockchain, or if somebody pays an online artist $1.99 for their work - then that transaction, a few bytes or so, has to live on the blockchain forever?
Or is there some "pruning" thing that gets rid of it after a while?
And this could lead to another question: Viewed from the perspective of double-entry bookkeeping, is the blockchain "world-wide ledger" more like the "balance sheet" part of accounting, i.e. a snapshot showing current assets and liabilities? Or is it more like the "cash flow" part of accounting, i.e. a journal showing historical revenues and expenses?
When I think of thousands of machines around the globe having to lug around multiple identical copies of a multi-gigabyte file containing some asshole's coffee purchase forever and ever... I feel like I'm ideologically drifting in one direction (where I'd end up also being against really cool stuff like online micropayments and Africans banking via SMS)... so I don't want to go there.
But on the other hand, when really experienced and battle-tested veterans with major experience in the world of open-souce programming and project management (the "small-blockians") warn of the catastrophic consequences of a possible failed hard-fork, I get freaked out and I wonder if Bitcoin really was destined to be a settlement layer for big transactions.
Could the original programmer(s) possibly weigh in?
And I don't mean to appeal to authority - but heck, where the hell is Satoshi Nakamoto in all this? I do understand that he/she/they would want to maintain absolute anonymity - but on the other hand, I assume SN wants Bitcoin to succeed (both for the future of humanity - or at least for all the bitcoins SN allegedly holds :-) - and I understand there is a way that SN can cryptographically sign a message - and I understand that as the original developer of Bitcoin, SN had some very specific opinions about the blocksize... So I'm kinda wondering of Satoshi could weigh in from time to time. Just to help out a bit. I'm not saying "Show us a sign" like a deity or something - but damn it sure would be fascinating and possibly very helpful if Satoshi gave us his/hetheir 2 satoshis worth at this really confusing juncture.
Are we using our capacity wisely?
I'm not a programming or game-theory whiz, I'm just a casual user who has tried to keep up with technology over the years.
It just seems weird to me that here we have this massive supercomputer (500 times more powerful than the all the supercomputers in the world combined) doing fairly straightforward "embarassingly parallel" number-crunching operations to secure a p2p world-wide ledger called the blockchain to keep track of a measly 2.1 quadrillion tokens spread out among a few billion addresses - and a couple of years ago you had people like Rick Falkvinge saying the blockchain would someday be supporting multi-million-dollar letters of credit for international trade and you had people like Andreas Antonopoulos saying the blockchain would someday allow billions of "unbanked" people to send remittances around the village or around the world dirt-cheap - and now suddenly in June 2015 we're talking about blockspace as a "scarce resource" and talking about "fee markets" and partially centralized, corporate-sponsored "Level 2" vaporware like Lightning Network and some mysterious company is "stess testing" or "DoS-ing" the system by throwing away a measly $5,000 and suddenly it sounds like the whole system could eventually head right back into PayPal and Western Union territory again, in terms of expensive fees.
When I got into Bitcoin, I really was heavily influenced by vague analogies with BitTorrent: I figured everyone would just have tiny little like utorrent-type program running on their machine (ie, Bitcoin-QT or Armory or Mycelium etc.).
I figured that just like anyone can host a their own blog or webserver, anyone would be able to host their own bank.
Yeah, Google and and Mozilla and Twitter and Facebook and WhatsApp did come along and build stuff on top of TCP/IP, so I did expect a bunch of companies to build layers on top of the Bitcoin protocol as well. But I still figured the basic unit of bitcoin client software powering the overall system would be small and personal and affordable and p2p - like a bittorrent client - or at the most, like a cheap server hosting a blog or email server.
And I figured there would be a way at the software level, at the architecture level, at the algorithmic level, at the data structure level - to let the thing scale - if not infinitely, at least fairly massively and gracefully - the same way the BitTorrent network has.
Of course, I do also understand that with BitTorrent, you're sharing a read-only object (eg, a movie) - whereas with Bitcoin, you're achieving distributed trustless consensus and appending it to a write-only (or append-only) database.
So I do understand that the problem which BitTorrent solves is much simpler than the problem which Bitcoin sets out to solve.
But still, it seems that there's got to be a way to make this thing scale. It's p2p and it's got 500 times more computing power than all the supercomputers in the world combined - and so many brilliant and motivated and inspired people want this thing to succeed! And Bitcoin could be our civilization's last chance to steer away from the oncoming debt-based ditch of disaster we seem to be driving into!
It just seems that Bitcoin has got to be able to scale somehow - and all these smart people working together should be able to come up with a solution which pretty much everyone can agree - in advance - will work.
Right? Right?
A (probably irrelevant) tangent on algorithms and architecture and data structures
I'll finally weigh with my personal perspective - although I might be biased due to my background (which is more on the theoretical side of computer science).
My own modest - or perhaps radical - suggestion would be to ask whether we're really looking at all the best possible algorithms and architectures and data structures out there.
From this perspective, I sometimes worry that the overwhelming majority of the great minds working on the programming and game-theory stuff might come from a rather specific, shall we say "von Neumann" or "procedural" or "imperative" school of programming (ie, C and Python and Java programmers).
It seems strange to me that such a cutting-edge and important computer project would have so little participation from the great minds at the other end of the spectrum of programming paradigms - namely, the "functional" and "declarative" and "algebraic" (and co-algebraic!) worlds.
For example, I was struck in particular by statements I've seen here and there (which seemed rather hubristic or lackadaisical to me - for something as important as Bitcoin), that the specification of Bitcoin and the blockchain doesn't really exist in any form other than the reference implementation(s) (in procedural languages such as C or Python?).
Curry-Howard anyone?
I mean, many computer scientists are aware of the Curry-Howard isomorophism, which basically says that the relationship between a theorem and its proof is equivalent to the relationship between a specification and its implementation. In other words, there is a long tradition in mathematics (and in computer programming) of:
And it's not exactly "turtles all the way down" either: a specification is generally simple and compact enough that a good programmer can usually simply visually inspect it to determine if it is indeed "correct" - something which is very difficult, if not impossible, to do with a program written in a procedural, implementation-oriented language such as C or Python or Java.
So I worry that we've got this tradition, from the open-source github C/Java programming tradition, of never actually writing our "specification", and only writing the "implementation". In mission-critical military-grade programming projects (which often use languages like Ada or Maude) this is simply not allowed. It would seem that a project as mission-critical as Bitcoin - which could literally be crucial for humanity's continued survival - should also use this kind of military-grade software development approach.
And I'm not saying rewrite the implementations in these kind of theoretical languages. But it might be helpful if the C/Python/Java programmers in the Bitcoin imperative programming world could build some bridges to the Maude/Haskell/ML programmers of the functional and algebraic programming worlds to see if any kind of useful cross-pollination might take place - between specifications and implementations.
For example, the JavaFAN formal analyzer for multi-threaded Java programs (developed using tools based on the Maude language) was applied to the Remote Agent AI program aboard NASA's Deep Space 1 shuttle, written in Java - and it took only a few minutes using formal mathematical reasoning to detect a potential deadlock which would have occurred years later during the space mission when the damn spacecraft was already way out around Pluto.
And "the Maude-NRL (Naval Research Laboratory) Protocol Analyzer (Maude-NPA) is a tool used to provide security proofs of cryptographic protocols and to search for protocol flaws and cryptosystem attacks."
These are open-source formal reasoning tools developed by DARPA and used by NASA and the US Navy to ensure that program implementations satisfy their specifications. It would be great if some of the people involved in these kinds of projects could contribute to help ensure the security and scalability of Bitcoin.
But there is a wide abyss between the kinds of programmers who use languages like Maude and the kinds of programmers who use languages like C/Python/Java - and it can be really hard to get the two worlds to meet. There is a bit of rapprochement between these language communities in languages which might be considered as being somewhere in the middle, such as Haskell and ML. I just worry that Bitcoin might be turning into being an exclusively C/Python/Java project (with the algorithms and practitioners traditionally of that community), when it could be more advantageous if it also had some people from the functional and algebraic-specification and program-verification community involved as well. The thing is, though: the theoretical practitioners are big on "semantics" - I've heard them say stuff like "Yes but a C / C++ program has no easily identifiable semantics". So to get them involved, you really have to first be able to talk about what your program does (specification) - before proceeding to describe how it does it (implementation). And writing high-level specifications is typically very hard using the syntax and semantics of languages like C and Java and Python - whereas specs are fairly easy to write in Maude - and not only that, they're executable, and you state and verify properties about them - which provides for the kind of debate Nick Szabo was advocating ("more computer science, less noise").
Imagine if we had an executable algebraic specification of Bitcoin in Maude, where we could formally reason about and verify certain crucial game-theoretical properties - rather than merely hand-waving and arguing and deploying and praying.
And so in the theoretical programming community you've got major research on various logics such as Girard's Linear Logic (which is resource-conscious) and Bruni and Montanari's Tile Logic (which enables "pasting" bigger systems together from smaller ones in space and time), and executable algebraic specification languages such as Meseguer's Maude (which would be perfect for game theory modeling, with its functional modules for specifying the deterministic parts of systems and its system modules for specifiying non-deterministic parts of systems, and its parameterized skeletons for sketching out the typical architectures of mobile systems, and its formal reasoning and verification tools and libraries which have been specifically applied to testing and breaking - and fixing - cryptographic protocols).
And somewhat closer to the practical hands-on world, you've got stuff like Google's MapReduce and lots of Big Data database languages developed by Google as well. And yet here we are with a mempool growing dangerously big for RAM on a single machine, and a 20-GB append-only list as our database - and not much debate on practical results from Google's Big Data databases.
(And by the way: maybe I'm totally ignorant for asking this, but I'll ask anyways: why the hell does the mempool have to stay in RAM? Couldn't it work just as well if it were stored temporarily on the hard drive?)
And you've got CalvinDB out of Yale which apparently provides an ACID layer on top of a massively distributed database.
Look, I'm just an armchair follower cheering on these projects. I can barely manage to write a query in SQL, or read through a C or Python or Java program. But I would argue two points here: (1) these languages may be too low-level and "non-formal" for writing and modeling and formally reasoning about and proving properties of mission-critical specifications - and (2) there seem to be some Big Data tools already deployed by institutions such as Google and Yale which support global petabyte-size databases on commodity boxes with nice properties such as near-real-time and ACID - and I sometimes worry that the "core devs" might be failing to review the literature (and reach out to fellow programmers) out there to see if there might be some formal program-verification and practical Big Data tools out there which could be applied to coming up with rock-solid, 100% consensus proposals to handle an issue such as blocksize scaling, which seems to have become much more intractable than many people might have expected.
I mean, the protocol solved the hard stuff: the elliptical-curve stuff and the Byzantine General stuff. How the heck can we be falling down on the comparatively "easier" stuff - like scaling the blocksize?
It just seems like defeatism to say "Well, the blockchain is already 20-30 GB and it's gonna be 20-30 TB ten years from now - and we need 10 Mbs bandwidth now and 10,000 Mbs bandwidth 20 years from - assuming the evil Verizon and AT&T actually give us that - so let's just become a settlement platform and give up on buying coffee or banking the unbanked or doing micropayments, and let's push all that stuff into some corporate-controlled vaporware without even a whitepaper yet."
So you've got Peter Todd doing some possibly brilliant theorizing and extrapolating on the idea of "treechains" - there is a Let's Talk Bitcoin podcast from about a year ago where he sketches the rough outlines of this idea out in a very inspiring, high-level way - although the specifics have yet to be hammered out. And we've got Blockstream also doing some hopeful hand-waving about the Lightning Network.
Things like Peter Todd's treechains - which may be similar to the spark in some devs' eyes called Lightning Network - are examples of the kind of algorithm or architecture which might manage to harness the massive computing power of miners and nodes in such a way that certain kinds of massive and graceful scaling become possible.
It just seems like a kindof tiny dev community working on this stuff.
Being a C or Python or Java programmer should not be a pre-req to being able to help contribute to the specification (and formal reasoning and program verification) for Bitcoin and the blockchain.
XML and UML are crap modeling and specification languages, and C and Java and Python are even worse (as specification languages - although as implementation languages, they are of course fine).
But there are serious modeling and specification languages out there, and they could be very helpful at times like this - where what we're dealing with is questions of modeling and specification (ie, "needs and requirements").
One just doesn't often see the practical, hands-on world of open-source github implementation-level programmers and the academic, theoretical world of specification-level programmers meeting very often. I wish there were some way to get these two worlds to collaborate on Bitcoin.
Maybe a good first step to reach out to the theoretical people would be to provide a modular executable algebraic specification of the Bitcoin protocol in a recognized, military/NASA-grade specification language such as Maude - because that's something the theoretical community can actually wrap their heads around, whereas it's very hard to get them to pay attention to something written only as a C / Python / Java implementation (without an accompanying specification in a formal language).
They can't check whether the program does what it's supposed to do - if you don't provide a formal mathematical definition of what the program is supposed to do.
Specification : Implementation :: Theorem : Proof
You have to remember: the theoretical community is very aware of the Curry-Howard isomorphism. Just like it would be hard to get a mathematician's attention by merely showing them a proof without telling also telling them what theorem the proof is proving - by the same token, it's hard to get the attention of a theoretical computer scientist by merely showing them an implementation without showing them the specification that it implements.
Bitcoin is currently confronted with a mathematical or "computer science" problem: how to secure the network while getting high enough transactional throughput, while staying within the limited RAM, bandwidth and hard drive space limitations of current and future infrastructure.
The problem only becomes a political and economic problem if we give up on trying to solve it as a mathematical and "theoretical computer science" problem.
There should be a plethora of whitepapers out now proposing algorithmic solutions to these scaling issues. Remember, all we have to do is apply the Byzantine General consensus-reaching procedure to a worldwide database which shuffles 2.1 quadrillion tokens among a few billion addresses. The 21 company has emphatically pointed out that racing to compute a hash to add a block is an "embarrassingly parallel" problem - very easy to decompose among cheap, fault-prone, commodity boxes, and recompose into an overall solution - along the lines of Google's highly successful MapReduce.
I guess what I'm really saying is (and I don't mean to be rude here), is that C and Python and Java programmers might not be the best qualified people to develop and formally prove the correctness of (note I do not say: "test", I say "formally prove the correctness of") these kinds of algorithms.
I really believe in the importance of getting the algorithms and architectures right - look at Google Search itself, it uses some pretty brilliant algorithms and architectures (eg, MapReduce, Paxos) which enable it to achieve amazing performance - on pretty crappy commodity hardware. And look at BitTorrent, which is truly p2p, where more demand leads to more supply.
So, in this vein, I will close this lengthy rant with an oddly specific link - which may or may not be able to make some interesting contributions to finding suitable algorithms, architectures and data structures which might help Bitcoin scale massively. I have no idea if this link could be helpful - but given the near-total lack of people from the Haskell and ML and functional worlds in these Bitcoin specification debates, I thought I'd be remiss if I didn't throw this out - just in case there might be something here which could help us channel the massive computing power of the Bitcoin network in such a way as to enable us simply sidestep this kind of desperate debate where both sides seem right because the other side seems wrong.
https://personal.cis.strath.ac.uk/neil.ghani/papers/ghani-calco07
The above paper is about "higher dimensional trees". It uses a bit of category theory (not a whole lot) and a bit of Haskell (again not a lot - just a simple data structure called a Rose tree, which has a wikipedia page) to develop a very expressive and efficient data structure which generalizes from lists to trees to higher dimensions.
I have no idea if this kind of data structure could be applicable to the current scaling mess we apparently are getting bogged down in - I don't have the game-theory skills to figure it out.
I just thought that since the blockchain is like a list, and since there are some tree-like structures which have been grafted on for efficiency (eg Merkle trees) and since many of the futuristic scaling proposals seem to also involve generalizing from list-like structures (eg, the blockchain) to tree-like structures (eg, side-chains and tree-chains)... well, who knows, there might be some nugget of algorithmic or architectural or data-structure inspiration there.
So... TL;DR:
(1) I'm freaked out that this blocksize debate has splintered the community so badly and dragged on so long, with no resolution in sight, and both sides seeming so right (because the other side seems so wrong).
(2) I think Bitcoin could gain immensely by using high-level formal, algebraic and co-algebraic program specification and verification languages (such as Maude including Maude-NPA, Mobile Maude parameterized skeletons, etc.) to specify (and possibly also, to some degree, verify) what Bitcoin does - before translating to low-level implementation languages such as C and Python and Java saying how Bitcoin does it. This would help to communicate and reason about programs with much more mathematical certitude - and possibly obviate the need for many political and economic tradeoffs which currently seem dismally inevitable - and possibly widen the collaboration on this project.
(3) I wonder if there are some Big Data approaches out there (eg, along the lines of Google's MapReduce and BigTable, or Yale's CalvinDB), which could be implemented to allow Bitcoin to scale massively and painlessly - and to satisfy all stakeholders, ranging from millionaires to micropayments, coffee drinkers to the great "unbanked".
submitted by BeYourOwnBank to Bitcoin [link] [comments]

BlockTorrent: The famous algorithm which BitTorrent uses for SHARING BIG FILES. Which you probably thought Bitcoin *also* uses for SHARING NEW BLOCKS (which are also getting kinda BIG). But Bitcoin *doesn't* torrent *new* blocks (while relaying). It only torrents *old* blocks (while sync-ing). Why?

This post is being provided to further disseminate an existing proposal:
This proposal was originally presented by jtoomim back in September of 2015 - on the bitcoin_dev mailing list (full text at the end of this OP), and on reddit:
https://np.reddit.com/btc/comments/3zo72i/fyi_ujtoomim_is_working_on_a_scaling_proposal/cyomgj3
Here's a TL;DR, in his words:
BlockTorrenting
For initial block sync, [Bitcoin] sort of works [like BitTorrent] already.
You download a different block from each peer. That's fine.
However, a mechanism does not currently exist for downloading a portion of each [new] block from a different peer.
That's what I want to add.
~ jtoomim
The more detailed version of this "BlockTorrenting" proposal (as presented by jtoomim on the bitcoin_dev mailing list) is linked and copied / reformatted at the end of this OP.
Meanwhile here are some observations from me as a concerned member of the Bitcoin-using public.
Questions:
Whoa??
WTF???
Bitcoin doesn't do this kind of "blocktorrenting" already??
But.. But... I thought Bitcoin was "p2p" and "based on BitTorrent"...
... because (as we all know) Bitcoin has to download giant files.
Oh...
Bitcoin only "torrents" when sharing one certain kind of really big file: the existing blockchain, when a node is being initialized.
But Bitcoin doesn't "torrent" when sharing another certain kind of moderately big file (a file whose size, by the way, has been notoriously and steadily growing over the years to the point where the system running the legacy "Core"/Blockstream Bitcoin implementation is starting to become dangerously congested - no matter what some delusional clowns "Core" devs may say): ie, the world's most wildly popular, industrial-strength "p2p file sharing algorithm" is mysteriously not being used where the Bitcoin network needs it the most in order to get transactions confirmed on-chain: when a a newly found block needs to be shared among nodes, when a node is relaying new blocks.
https://np.reddit.com/Bitcoin+bitcoinxt+bitcoin_uncensored+btc+bitcoin_classic/search?q=blocktorrent&restrict_sr=on
How many of you (honestly) just simply assumed that this algorithm was already being used in Bitcoin - since we've all been told that "Bitcoin is p2p, like BitTorrent"?
As it turns out - the only part of Bitcoin which has been p2p up until now is the "sync-ing a new full-node" part.
The "running an existing full-node" part of Bitcoin has never been implemented as truly "p2p2" yet!!!1!!!
And this is precisely the part of the system that we've been wasting all of our time (and destroying the community) fighting over for the past few months - because the so-called "experts" from the legacy "Core"/Blockstream Bitcoin implementation ignored this proposal!
Why?
Why have all the so-called "experts" at "Core"/Blockstream ignored this obvious well-known effective & popular & tested & successful algorithm for doing "blocktorrenting" to torrent each new block being relayed?
Why have the "Core"/Blockstream devs failed to p2p-ize the most central, fundamental networking aspect of Bitcoin - the part where blocks get propagated, the part we've been fighting about for the past few years?
This algorithm for "torrenting" a big file in parallel from peers is the very definition of "p2p".
It "surgically" attacks the whole problem of sharing big files in the most elegant and efficient way possible: right at the lowest level of the bottleneck itself, cleverly chunking a file and uploading it in parallel to multiple peers.
Everyone knows torrenting works. Why isn't Bitcoin using it for its new blocks?
As millions of torrenters already know (but evidently all the so-called "experts" at Core/Blocsktream seem to have conveniently forgotten), "torrenting" a file (breaking a file into chunks and then offering a different chunk to each peer to "get it out to everyone fast" - before your particular node even has the entire file) is such a well-known / feasible / obvious / accepted / battle-tested / highly efficient algorithm for "parallelizing" (and thereby significantly accelerating) the sharing of big files among peers, that many people simply assumed that Bitcoin had already been doing this kind of "torrenting of new-blocks" these whole past 7 years.
But Bitcoin doesn't do this - yet!
None of the Core/Blockstream devs (and the Chinese miners who follow them) have prioritized p2p-izing the most central and most vital and most resource-consuming function of the Bitcoin network - the propagation of new blocks!
Maybe it took someone who's both a miner and a dev to "scratch" this particular "itch": Jonathan Toomim jtoomim.
  • A miner + dev who gets very little attention / respect from the Core/Blockstream devs (and from the Chinese miners who follow them) - perhaps because they feel threatened by a competing implementation?
  • A miner + dev who may have come up with the simplest and safest and most effective algorithmic (ie, software-based, not hardware-consuming) scaling proposal of anyone!
  • A dev who who is not paid by Blockstream, and who is therefore free from the secret, undisclosed corporate restraints / confidentiality agreements imposed by the shadowy fiat venture-capitalists and legacy power elite who appear to be attempting to cripple our code and muzzle our devs.
  • A miner who has the dignity not to let himself be forced into signing a loyalty oath to any corporate overlords after being locked in a room until 3 AM.
Precisely because jtoomim is both a indepdendent miner and an independent dev...
  • He knows what needs to be done.
  • He knows how to do it.
  • He is free to go ahead and do it - in a permissionless, decentralized fashion.
Possible bonus: The "blocktorrent" algorithm would help the most in the upload direction - which is precisely where Bitcoin scaling needs the most help!
Consider the "upload" direction for a relatively slow full-node - such as Luke-Jr, who reports that his internet is so slow, he has not been able to run a full-node since mid-2015.
The upload direction is the direction which everyone says has been the biggest problem with Bitcoin - because, in order for a full-node to be "useful" to the network:
  • it has to able to upload a new block to (at least) 8 peers,
  • which places (at least) 8x more "demand" on the full-node's upload bandwidth.
The brilliant, simple proposed "blocktorrent" algorithm from jtoomim (already proven to work with Bram Cohen's BitTorrent protocol, and also already proven to work for initial sync-ing of Bitcoin full-nodes - but still un-implemented for ongoing relaying among full-nodes) looks like it would provide a significant performance improvement precisely at this tightest "bottleneck" in the system, the crucial central nexus where most of the "traffic" (and the congestion) is happening: the relaying of new blocks from "slower" full-nodes.
The detailed explanation for how this helps "slower" nodes when uploading, is as follows.
Say you are a "slower" node.
You need to send a new block out to (at least) 8 peers - but your "upload" bandwidth is really slow.
If you were to split the file into (at least) 8 "chunks", and then upload a different one of these (at least) 8 "chunks" to each of your (at least) 8 peers - then (if you were using "blocktorrenting") it only would take you 1/8 (or less) of the "normal" time to do this (compared to the naïve legacy "Core" algorithm).
Now the new block which your "slower" node was attempting to upload is already "out there" - in 1/8 (or less) of the "normal" time compared to the naïve legacy "Core" algorithm.[ 1 ]
... [ 1 ] There will of course also be a tiny amount of extra overhead involved due to the "housekeeping" performed by the "blocktorrent" algorithm itself - involving some additional processing and communicating to decompose the block into chunks and to organize the relaying of different chunks to different peers and the recompose the chunks into a block again (all of which, depending on the size of the block and the latency of your node's connections to its peers, would in most cases be negligible compared to the much-greater speed-up provided by the "blocktorrent" algorithm itself).
Now that your block is "out there" at those 8 (or more) peer nodes to whom you just blocktorrented it in 1/8 (or less) of the time - it has now been liberated from the "bottleneck" of your "slower" node.
In fact, its further propagation across the net may now be able to leverage much faster upload speeds from some other node(s) which have "blocktorrent"-downloaded it in pieces from you (and other peers) - and which might be faster relaying it along, than your "slower" node.
For some mysterious reason, the legacy Bitcoin implementation from "Core"/Blockstream has not been doing this kind of "blocktorrenting" for new blocks.
It's only been doing this torrenting for old blocks. The blocks that have already been confirmed.
Which is fine.
But we also obviously need this sort of "torrenting" to be done for each new block is currently being confirmed.
And this is where the entire friggin' "scaling bottleneck" is occurring, which we just wasted the past few years "debating" about.
Just sit down and think about this for a minute.
We've had all these so-called "experts" (Core/Blockstream devs and other small-block proponents) telling us for years that guys like Hearn and Gavin and repos like Classic and XT and BU were "wrong" or at least "unserious" because they "merely" proposed "brute-force" scaling: ie, scaling which would simply place more demands on finite resources (specifically: on the upload bandwidth from full-nodes - who need to relay to at least 8 peer full-nodes in order to be considered "useful" to the network).
These "experts" have been beating us over the head this whole time, telling us that we have to figure out some (really complicated, unproven, inefficient and centralized) clever scaling algorithms to squeeze more efficiency out of existing infrastructure.
And here is the most well-known / feasible / obvious / accepted / battle-tested algorithm for "parallelizing" (and thereby massively accelerating) the sharing of big file among peers - the BitTorrent algorithm itself, the gold standard of p2p relaying par excellence, which has been a major success on the Internet for decades, at one point accounting for nearly 1/3 of all traffic on the Internet itself - and which is also already being used in one part of Bitcoin: during the phase of sync-ing a new node.
And apparently pretty much only jtoomim has been talking about using it for the actual relaying of new blocks - while Core/Blockstream devs have so far basically ignored this simple and safe and efficient proposal.
And then the small-block sycophants (reddit users or wannabe C/C++ programmers who have beaten into submission and/or by the FUD and "technological pessimism" of the Core/Blockstream devs, and by the censorhip on their legacy forum), they all "laugh" at Classic and proclaim "Bitcoin doesn't need another dev team - all the 'experts' are at Core / Blockstream"...
...when in fact it actually looks like jtoomim (an independent minedev, free from the propaganda and secret details of the corporate agenda of Core/Blockstream - who works on the Classic Bitcoin implementation) may have proposed the simplest and safest and most effective scaling algorithm in this whole debate.
By the way, his proposal estimates that we could get about 1 magnitude greater throughput, based on the typical latency and blocksize for blocks of around 20 MB and bandwidth of around 8 Mbps (which seems like a pretty normal scenario).
So why the fuck isn't this being done yet?
This is such a well-known / feasible / obvious / accepted / battle-tested algorithm for "parallelizing" (and thereby significantly accelerating) the sharing of big files among peers:
  • It's already being used for the (currently) 65 gigabytes of "blocks in the existing blockchain" itself - the phase where a new node has to sync with the blockchain.
  • It's already being used in BitTorrent - although the BitTorrent protocol has been optimized more to maximize throughput, whereas it would probably be a good idea to optimize the BlockTorrent protocol to minimize latency (since avoiding orphans is the big issue here) - which I'm fairly sure should be quite doable.
This algorithm is so trivial / obvious / straightforward / feasible / well-known / proven that I (and probably many others) simply assumed that Bitcoin had been doing this all along!
But it has never been implemented.
There is however finally a post about it today on the score-hidden forum /Bitcoin, from eragmus:
[bitcoin-dev] BlockTorrent: Torrent-style new-block propagation on Merkle trees
https://np.reddit.com/Bitcoin/comments/484nbx/bitcoindev_blocktorrent_torrentstyle_newblock/
And, predictably, the top-voted comment there is a comment telling us why it will never work.
And the comment after that comment is from the author of the proposal, jtoomim, explaining why it would work.
Score hidden on all those comments.
Because the immature tyrant theymos still doesn't understand the inherent advantages of people using reddit's upvoting & downvoting tools to hold decentralized, permissionless debates online.
Whatever.
Questions:
(1) Would this "BlockTorrenting" algorithm from jtoomim really work?
(2) If so, why hasn't it been implemented yet?
(3) Specifically: With all the "dev firepower" (and $76 million in venture capital) available at Core/Blockstream, why have they not prioritized implementing this simple and safe and highly effective solution?
(4) Even more specifically: Are there undisclosed strategies / agreements / restraints imposed by Blockstream financial investors on Bitcoin "Core" devs which have been preventing further discussion and eventual implementation of this possible simple & safe & efficient scaling solution?
Here is the more-detailed version of this proposal, presented by Jonathan Toomim jtoomim back in September of 2015 on the bitcoin-dev mailing list (and pretty much ignored for months by almost all the "experts" there):
https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-Septembe011176.html
As I understand it, the current block propagation algorithm is this:
  1. A node mines a block.
  2. It notifies its peers that it has a new block with an inv. Typical nodes have 8 peers.
  3. The peers respond that they have not seen it, and request the block with getdata [hash].
  4. The node sends out the block in parallel to all 8 peers simultaneously. If the node's upstream bandwidth is limiting, then all peers will receive most of the block before any peer receives all of the block. The block is sent out as the small header followed by a list of transactions.
  5. Once a peer completes the download, it verifies the block, then enters step 2.
(If I'm missing anything, please let me know.)
The main problem with this algorithm is that it requires a peer to have the full block before it does any uploading to other peers in the p2p mesh. This slows down block propagation to:
O( p • log_p(n) ) 
where:
  • n is the number of peers in the mesh,
  • p is the number of peers transmitted to simultaneously.
It's like the Napster era of file-sharing. We can do much better than this.
Bittorrent can be an example for us.
Bittorrent splits the file to be shared into a bunch of chunks, and hashes each chunk.
Downloaders (leeches) grab the list of hashes, then start requesting their peers for the chunks out-of-order.
As each leech completes a chunk and verifies it against the hash, it begins to share those chunks with other leeches.
Total propagation time for large files can be approximately equal to the transmission time for an FTP upload.
Sometimes it's significantly slower, but often it's actually faster due to less bottlenecking on a single connection and better resistance to packet/connection loss.
(This could be relevant for crossing the Chinese border, since the Great Firewall tends to produce random packet loss, especially on encrypted connections.)
Bitcoin uses a data structure for transactions with hashes built-in. We can use that in lieu of Bittorrent's file chunks.
A Bittorrent-inspired algorithm might be something like this:
  1. (Optional steps to build a Merkle cache; described later)
  2. A seed node mines a block.
  3. It notifies its peers that it has a new block with an extended version of inv.
  4. The leech peers request the block header.
  5. The seed sends the block header. The leech code path splits into two.
  6. (a) The leeches verify the block header, including the PoW. If the header is valid,
  7. (a) They notify their peers that they have a header for an unverified new block with an extended version of inv, looping back to 2. above. If it is invalid, they abort thread (b).
  8. (b) The leeches request the Nth row (from the root) of the transaction Merkle tree, where N might typically be between 2 and 10. That corresponds to about 1/4th to 1/1024th of the transactions. The leeches also request a bitfield indicating which of the Merkle nodes the seed has leaves for. The seed supplies this (0xFFFF...).
  9. (b) The leeches calculate all parent node hashes in the Merkle tree, and verify that the root hash is as described in the header.
  10. The leeches search their Merkle hash cache to see if they have the leaves (transaction hashes and/or transactions) for that node already.
  11. The leeches send a bitfield request to the node indicating which Merkle nodes they want the leaves for.
  12. The seed responds by sending leaves (either txn hashes or full transactions, depending on benchmark results) to the leeches in whatever order it decides is optimal for the network.
  13. The leeches verify that the leaves hash into the ancestor node hashes that they already have.
  14. The leeches begin sharing leaves with each other.
  15. If the leaves are txn hashes, they check their cache for the actual transactions. If they are missing it, they request the txns with a getdata, or all of the txns they're missing (as a list) with a few batch getdatas.
Features and benefits
The main feature of this algorithm is that a leech will begin to upload chunks of data as soon as it gets them and confirms both PoW and hash/data integrity instead of waiting for a fully copy with full verification.
Inefficient cases, and mitigations
This algorithm is more complicated than the existing algorithm, and won't always be better in performance.
Because more round trip messages are required for negotiating the Merkle tree transfers, it will perform worse in situations where the bandwidth to ping latency ratio is high relative to the blocksize.
Specifically, the minimum per-hop latency will likely be higher.
This might be mitigated by reducing the number of round-trip messages needed to set up the BlockTorrent by using larger and more complex inv-like and getdata-like messages that preemptively send some data (e.g. block headers).
This would trade off latency for bandwidth overhead from larger duplicated inv messages.
Depending on implementation quality, the latency for the smallest block size might be the same between algorithms, or it might be 300% higher for the torrent algorithm.
For small blocks (perhaps < 100 kB), the BlockTorrent algorithm will likely be slightly slower.
Sidebar from the OP: So maybe this would discourage certain miners (cough Dow cough) from mining blocks that aren't full enough:
Why is [BTCC] limiting their block size to under 750 all of a sudden?
https://np.reddit.com/Bitcoin/comments/486o1u/why_is_bttc_limiting_their_block_size_to_unde

For large blocks (e.g. 8 MB over 20 Mbps), I expect the BlockTorrent algo will likely be around an order of magnitude faster in the worst case (adversarial) scenarios, in which none of the block's transactions are in the caches.

One of the big benefits of the BlockTorrent algorithm is that it provides several obvious and straightforward points for bandwidth saving and optimization by caching transactions and reconstructing the transaction order.

Future work: possible further optimizations
A cooperating miner [could] pre-announce Merkle subtrees with some of the transactions they are planning on including in the final block.
Other miners who see those subtrees [could] compare the transactions in those subtrees to the transaction sets they are mining with, and can rearrange their block prototypes to use the same subtrees as much as possible.
In the case of public pools supporting the getblocktemplate protocol, it might be possible to build Merkle subtree caches without the pool's help by having one or more nodes just scraping their getblocktemplate results.
Even if some transactions are inserted or deleted, it [might] be possible to guess a lot of the tree based on the previous ordering.
Once a block header and the first few rows of the Merkle tree [had] been published, they [would] propagate through the whole network, at which time full nodes might even be able to guess parts of the tree by searching through their txn and Merkle node/subtree caches.
That might be fun to think about, but probably not effective due to O( n2 ) or worse scaling with transaction count.
Might be able to make it work if the whole network cooperates on it, but there are probably more important things to do.
Leveraging other features from BitTorrent
There are also a few other features of Bittorrent that would be useful here, like:
  • prioritizing uploads to different peers based on their upload capacity,
  • banning peers that submit data that doesn't hash to the right value.
Sidebar from the OP: Hmm...maybe that would be one way to deal with the DDoS-ing we're experiencing right now? I know the DDoSer is using a rotating list of proxies, but still it could be a quick-and-dirty way to mitigate against his attack.
DDoS started again. Have a nice day, guys :)
https://np.reddit.com/Bitcoin_Classic/comments/47zglz/ddos_started_again_have_a_nice_day_guys/d0gj13y
(It might be good if we could get Bram Cohen to help with the implementation.)
Using the existing BitTorrent algorithm as-is - versus tailoring a new algorithm optimized for Bitcoin
Another possible option would be to just treat the block as a file and literally Bittorrent it.
But I think that there should be enough benefits to integrating it with the existing bitcoin p2p connections and also with using bitcoind's transaction caches and Merkle tree caches to make a native implementation worthwhile.
Also, BitTorrent itself was designed to optimize more for bandwidth than for latency, so we will have slightly different goals and tradeoffs during implementation.
Concerns, possible attacks, mitigations, related work
One of the concerns that I initially had about this idea was that it would involve nodes forwarding unverified block data to other nodes.
At first, I thought this might be useful for a rogue miner or node who wanted to quickly waste the whole network's bandwidth.
However, in order to perform this attack, the rogue needs to construct a valid header with a valid PoW, but use a set of transactions that renders the block as a whole invalid in a manner that is difficult to detect without full verification.
However, it will be difficult to design such an attack so that the damage in bandwidth used has a greater value than the 240 exahashes (and 25.1 BTC opportunity cost) associated with creating a valid header.
Related work: IBLT (Invertible Bloom Lookup Tables)
As I understand it, the O(1) IBLT approach requires that blocks follow strict rules (yet to be fully defined) about the transaction ordering.
If these are not followed, then it turns into sending a list of txn hashes, and separately ensuring that all of the txns in the new block are already in the recipient's mempool.
When mempools are very dissimilar, the IBLT approach performance degrades heavily and performance becomes worse than simply sending the raw block.
This could occur if a node just joined the network, during chain reorgs, or due to malicious selfish miners.
Also, if the mempool has a lot more transactions than are included in the block, the false positive rate for detecting whether a transaction already exists in another node's mempool might get high for otherwise reasonable bucket counts/sizes.
With the BlockTorrent approach, the focus is on transmitting the list of hashes in a manner that propagates as quickly as possible while still allowing methods for reducing the total bandwidth needed.
Remark
The BlockTorrent algorithm does not really address how the actual transaction data will be obtained because, once the leech has the list of txn hashes, the standard Bitcoin p2p protocol can supply them in a parallelized and decentralized manner.
Thoughts?
-jtoomim
submitted by ydtm to btc [link] [comments]

Forkology 301: The Three Tiers of Investor Control over Bitcoin

DanielKrawisz's article Who Controls Bitcoin is a must-read for anyone wanting to understand how Bitcoin is governed.
This post builds on Krawisz's point - that investors hold all the cards - by describing in more detail how Bitcoin investors can exercise their control over Bitcoin through a tiered or layered structure of increasing directness and radicalness.
Tier 1: Expression of Intent
Investors simply make it known, in a credible way, that they support some change (say a bigger blocksize cap), meaning they intend to buy more BTC if the change is made in good time, and sell BTC if it is not. Then there are three ways the ecosystem can react:
(i) Core Capitulates: The Core dev team is pressured to up the blocksize cap in Core and does so in a way that satisfies investors.
(ii) Competing Implementations Arise: If Core refuses or raises the cap too slowly, other implementations like BitcoinXT spring up and miners - enticed by the additional gains through a higher BTC price - adopt it.
(iii) Bitcoin Unlimited Renders the Previous Two Moot: Bitcoin Unlimited is another implementation in development that attempts to dispense with centralized blocksize planning entirely by allowing each user to set their own blocksize cap through a pulldown menu. Set the cap too low and your node might fail to track consensus as larger blocks get into the chain; set it too high and you might waste resources dealing with blocks that will end up orphaned. Users can also set a block depth after which they will accept a block higher than their set limit only if the block gets deep enough in the chain.
This mechanism constitutes a kind of built in fork-tolerant logic.
Instead of a preset group of developers opining over the "correct" blocksize cap or an ivory-tower scheme of centrally planned "Flexcaps," the blocksize limit is an emergent property of each individual node and miner's cost/benefit analysis and priorities for their own situation, much like the price of graphite. The concept of consensus becomes more fluid, with nodes sometimes objecting to bigger blocks by refusing to relay them, thereby assuming a risk of temporarily falling out of consensus. Somewhat like the English language, consensus on the rules is emergent rather than consensus rules being handed down from Core dev.
Instead of "Concur with Core or go pound sand," Bitcoin Unlimited's consensus on blocksize is an aggregate product of each node and miner positioning themselves favorably in the market due to their own calculations of the trade-offs for their unique circumstances.
The result is expected to be a soft blocksize limit that grows dynamically as market forces (orphan rates and other incentives), transaction demand, and technology levels change, in a way that maximizes investor satisfaction and therefore BTC price and miner revenue. Miners will up the size of the blocks they mine as transaction demand grows, and as long as they do so conservatively other miners and nodes (all interested in seeing the BTC price rise) will approvingly build on and propagate these blocks. Blocks over the soft limit will be discouraged by most nodes (by definition of the term "soft limit"), but if they manage to get several blocks deep into the chain most nodes will accept them. Miners a take a risk (orphan risk) in producing these slightly oversized blocks, edging forward carefully when they believe nodes will respond approvingly because investors and users are demanding it.
If Bitcoin Unlimited catches on, Core and XT's centralized blocksize plans become relics. Investors announce their intent, ideally through a prediction market or futures market but cruder measures would also have an effect, and miners react (conservatively!) through adjusting blocksize cap (and chain depth at which they'll give in and accept an oversized block) through the pulldown menu to rake in those juicy profits. Nodes also have a voice in what they help propagate, with an interest to aid bigger blocks because of their stake in the BTC price as business owners, holders, etc.
Tier 2: Fork Arbitrage on Exchanges
This case is more radical, but it is only required if a change is too controversial for something like XT's 75% threshold to be relied upon. Here, several weeks/months before the fork is to occur, Bitcoin exchanges prepare futures contracts for, say, coins in Core and coins in XT, and let investors effectively sell their coins in Core to buy more coins in XT, or vice versa.
For example if you have 10 BTC, you would of course have 10 Core bitcoins and 10 XT bitcoins after the fork if you took no action, but if you choose to participate in the arbitrage you might sell your 10 future Core bitcoins and use them to increase your future XT bitcoin count to 15 or 20 BTC. Why would it ever be only 15 BTC? This would be the case where you entered the arbitraging late and Core bitcoin futures had already fallen to half the price of XT bitcoin futures, meaning your 10 Core BTC only buys you 5 XT BTC. [For more technical details, see Meni Rosenfeld's How I learned to stop worrying and love the fork, though he doesn't address the futures contract innovation, which further streamlines the process by giving a very strong indication of the winner before the fork even happens.]
In almost all conceivable cases a definitive winner emerges (and if not, no other method is going to do any better at determining the winner), and the other fork either dies or becomes a niche alt-protocol coin (not really an "altcoin," since it shares Bitcoin's ledger). The niche coin would likely only arise and persist if there truly were a key tradeoff being made, as some small block adherents argue. In any case, hodler purchasing power is completely preserved by default if they choose not to bet in the "forkbitrage" process, even in the event of a persistent split.
This forkbitrage process represents a more direct expression of investor will than in Tier 1. (Also, it may be possible that this process starting up would kick off Tier 1 effects that would allow the more radical measure of forbitrage to be halted early, with the exchanges returning investors' bets.)
Tier 3: Spinoff with New Hashing Algorithm
This is the most radical, because it is only required in the scenario where "miners go insane" and do something ridiculous like upping the block reward or refusing to implement obvious necessary changes like blocksize cap increases, despite investor support, and where the miners would threaten to 51% attack the investors' chosen fork in the above forkbitrage process. Of course this can only be a short term threat, since the fork winning the Tier 2 forkbitrage process would soon have far more hashpower thanks to far greater market cap, but short term matters when you could be 51% attacked.
Here the Bitcoin ledger is copied over to the investors' chosen protocol, so that all holders have the same number of coins (and same percentage of all outstanding coins) in the "new" coin, say a larger blocksize cap coin. The World Wide Ledger is preserved, which is all that should matter to investors, and the "old" Bitcoin is again sold off to nothing or goes niche. Hodler purchasing power is preserved, of course.
This is the very purest expression of investor will. Miners can be called a kind of investor, but with some complications. Spinoffs allow investors to circumvent even the miners - a radical measure for outlandish scenarios.
Tier 1 lets investors deal with attempted developer control, Tier 2 lets investors deal with controversy, and Tier 3 lets investors deal with pervasive miner irrationality. This is how investors rule the roost.

Previous Forkology posts and discussions:
Forkology 101
Forkology 201 (guest post by Peter__R)
submitted by ForkiusMaximus to btc [link] [comments]

The Big Blocks Mega Thread

Since this is a pressing and prevalent issue, I thought maybe condensing the essential arguments into one mega thread is better than rehashing everything in new threads all the time. I chose a FAQ format for this so a certain statement can be answered. I don't want to re-post everything here so where appropriate I'm just going to use links.
Disclaimer: This is biased towards big blocks (BIP 101 in particular) but still tries to mention the risks, worries and fears. I think this is fair because all other major bitcoin discussion places severely censor and discourage big block discussion.
 
What is the block size limit?
The block size limit was introduced by Satoshi back in 2010-07-15 as an anti-DoS measure (though this was not stated in the commit message, more info here). Ever since, it has never been touched because historically there was no need and raising the block size limit requires a hard fork. The block size directly limits the number of transactions in a block. Therefore, the capacity of Bitcoin is directly limited by the block size limit.
 
Why does a raise require a hard fork?
Because larger blocks are seen as invalid by old nodes, a block size increase would fork these nodes off the network. Therefore it is a hard fork. However, it is possible to downsize the block limit with a soft fork since smaller blocks would still be seen as valid from old nodes. It is considerably easier to roll out a soft fork. Therefore, it makes sense to roll out a more ambitious hard fork limit and downsize as needed with soft forks if problems arise.
 
What is the deal with soft and hard forks anyways?
See this article by Mike Hearn: https://medium.com/@octskyward/on-consensus-and-forks-c6a050c792e7#.74502eypb
 
Why do we need to increase the block size?
The Bitcoin network is reaching its imposed block size limit while the hard- and software would be able to support more transactions. Many believe that in its current phase of growth, artificially limiting the block size is stifling adoption, investment and future growth.
Read this article and all linked articles for further reading: http://gavinandresen.ninja/time-to-roll-out-bigger-blocks
Another article by Mike Hearn: https://medium.com/@octskyward/crash-landing-f5cc19908e32#.uhky4y1ua (this article is a little outdated since both Bitcoin Core and XT now have mempool limits)
 
What is the Fidelity Effect?
It is the Chicken and Egg problem applied to future growth of Bitcoin. If companies do not see how Bitcoin can scale long term, they don't invest which in turn slows down adoption and development.
See here and here.
 
Does an increase in block size limit mean that blocks immediately get larger to the point of the new block size limit?
No, blocks are as large as there is demand for transactions on the network. But one can assume that if the limit is lifted, more users and businesses will want to use the blockchain. This means that blocks will get bigger, but they will not automatically jump to the size of the block size limit. Increased usage of the blockchain also means increased adoption, investment and also price appreciation.
 
Which are the block size increase proposals?
See here.
It should be noted that BIP 101 is the only proposal which has been implemented and is ready to go.
 
What is the long term vision of BIP 101?
BIP 101 tries to be as close to hardware limitations regarding bandwidth as possible so that nodes can continue running at normal home-user grade internet connections to keep the decentralized aspect of Bitcoin alive. It is believed that it is hard to increase the block size limit, so a long term increase is beneficial to planning and investment in the Bitcoin network. Go to this article for further reading and understand what is meant by "designing for success".
BIP 101 vs actual transaction growth visualized: http://imgur.com/QoTEOO2
Note that the actual growth in BIP 101 is piece-wise linear and does not grow in steps as suggested in the picture.
 
What is up with the moderation and censorship on bitcoin.org, bitcointalk.org and /bitcoin?
Proponents of a more conservative approach fear that a block size increase proposal that does not have "developeexpert consensus" should not be implemented via a majority hard fork. Therefore, discussion about the full node clients which implement BIP 101 is not allowed. Since the same individuals have major influence of all the three bitcoin websites (most notably theymos), discussion of Bitcoin XT is censored and/or discouraged on these websites.
 
What is Bitcoin XT anyways?
More info here.
 
What does Bitcoin Core do about the block size? What is the future plan by Bitcoin Core?
Bitcoin Core scaling plan as envisioned by Gregory Maxwell: http://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-Decembe011865.html
 
Who governs or controls Bitcoin Core anyways? Who governs Bitcoin XT? What is Bitcoin governance?
Bitcoin Core is governed by a consensus mechanism. How it actually works is not clear. It seems that any major developer can "veto" a change. However, there is one head maintainer who pushes releases and otherwise organizes the development effort. It should be noted that the majority of the main contributors to Bitcoin Core are Blockstream employees.
BitcoinXT follows a benevolent dictator model (as Bitcoin used to follow when Satoshi and later Gavin Andresen were the lead maintainers).
It is a widespread believe that Bitcoin can be separated into protocol and full node development. This means that there can be multiple implementations of Bitcoin that all follow the same protocol and overall consensus mechanism. More reading here. By having multiple implementations of Bitcoin, single Bitcoin implementations can be run following a benevolent dictator model while protocol development would follow an overall consensus model (which is enforced by Bitcoin's fundamental design through full nodes and miners' hash power). It is still unclear how protocol changes should actually be governed in such a model. Bitcoin governance is a research topic and evolving.
 
What are the arguments against a significant block size increase and against BIP 101 in particular?
The main arguments against a significant increase are related to decentralization and therefore robustness against commercial interests and government regulation and intervention. More here (warning: biased Wiki article).
Another main argument is that Bitcoin needs a fee market established by a low block size limit to support miners long term. There is significant evidence and game theory to doubt this claim, as can be seen here.
Finally, block propagation and verification times increase with an increased block size. This in turn increases the orphan rate of miners which means reduced profit. Some believe that this is a disadvantage to small miners because they are not as well connected to other big miners. Also, there is currently a large miner centralization in China. Since most of these miners are behind the Great Firewall of China, their bandwidth to the rest of the world is limited. There is a fear that larger block propagation times favor Chinese miners as long as they have a mining majority. However, there are solutions in development that can drastically reduce block propagation times so this problem will be less of an issue long term.
 
What is up with the fee market and what is the Lightning Network (LN)?
Major Bitcoin Core developers believe that a fee market established by a low block size is needed for future security of the bitcoin network. While many believe fundamentally this is true, there is major dispute if a fee market needs to be forced by a low block size. One of the main LN developers thinks such a fee market through low block size is needed (read here). The Lightning Network is a non-bandwidth scaling solution. It uses payment channels that can be opened and closed using Bitcoin transactions that are settled on the blockchain. By routing transactions through many of these payment channels, in theory it is possible to support a lot more transactions while a user only needs very few payment channels and therefore rarely has to use (settle on) the actual blockchain. More info here.
 
How does LN and other non-bandwidth scaling solutions relate to Bitcoin Core and its long term scaling vision?
Bitcoin Core is headed towards a future where block sizes are kept low so that a fee market is established long term that secures miner incentives. The main scaling solution propagated by Core is LN and other solutions that only sometimes settle transactions on the main Bitcoin blockchain. Essentially, Bitcoin becomes a settlement layer for solutions that are built on top of Bitcoin's core technology. Many believe that long term this might be inevitable. But forcing this off-chain development already today seems counterproductive to Bitcoin's much needed growth and adoption phase before such solutions can thrive. It should also be noted that no major non-bandwidth scaling solution (such as LN) has been tested or even implemented. It is not even clear if such off-chain solutions are needed long term scaling solutions as it might be possible to scale Bitcoin itself to handle all needed transaction volumes. Some believe that the focus on a forced fee market by major Bitcoin Core developers represents a conflict of interest since their employer is interested in pushing off-chain scaling solutions such as LN (more reading here).
 
Are there solutions in development that show the block sizes as proposed via BIP 101 are viable and block propagation times in particular are low enough?
Yes, most notably: Weak Blocks, Thin Blocks and IBLT.
 
What is Segregated Witness (SW) and how does it relate to scaling and block size increases?
See here. SW among other things is a way to increase the block size once without a hard fork (the actual block size is not increased but there is extra information exchanged separately to blocks).
 
Feedback and more of those question/answer type posts (or revised question/answer pairs) appreciated!
 
ToDo and thoughts for expansion:
@Mods: Maybe this could be stickied?
submitted by BIP-101 to btc [link] [comments]

(({{{Bitcoin$Free}})) Best. Cryptocurrency. Trading. Platform.

[+#Bitcoin$Free}}))Best Cryptocurrency Trading Platform THE BITCOIN MANIA Trading on crypto currencies, is the new trend taking over of the digital money market. Bitcoin, the crypto currency is breaking records with 4,500% growth in the last year, proving that the future is in crypto currencies. BITCOIN TRAINING DAY Join our risk free "Bitcoin Club Marathon", where you will get a demo account to perfect your trading skills. Here you receive an opportunity to learn from others without risking your money. Crypto Signals Crypto currencies exchange market could be highly fluctuant, presenting numerous opportunities 24/7. The club provides you Signals system that will send you alerts whenever an opportunity arises.
bitcoin,.,price,.,bitcoin,.,mining,.,bitcoin,.,to gbp,.,bitcoin,.,news,.,bitcoin,.,wallet,.,bitcoin,.,to usd,.,bitcoin,.,price chart,.,bitcoin,.,exchange rate,.,bitcoin,.,cash,.,bitcoin,.,price history,.,bitcoin,.,bitcoin,.,atm,.,bitcoin,.,account,.,bitcoin,.,atm london,.,bitcoin,.,alternative,.,bitcoin,.,address,.,bitcoin,.,atm near me,.,bitcoin,.,august 1st,.,bitcoin,.,atm uk,.,bitcoin,.,asic,.,bitcoin,.,app,.,a,.,bitcoin,.,to naira,.,a,.,bitcoin,.,miner,.,a,.,bitcoin,.,address,.,a,.,bitcoin,.,worth,.,a,.,bitcoin,.,governance network,.,a,.,bitcoin,.,atm,.,a,.,bitcoin,.,faucet,.,a,.,bitcoin,.,transaction,.,a,.,bitcoin,.,wallet,.,a,.,bitcoin,.,to a dollar,.,bitcoin,.,buy,.,bitcoin,.,blockchain,.,bitcoin,.,bubble,.,bitcoin,.,buy uk,.,bitcoin,.,broker,.,bitcoin,.,block explorer,.,bitcoin,.,billionaire,.,bitcoin,.,bank,.,bitcoin,.,blockchain size,.,bitcoin,.,bbc,.,bitcoin,.,b font,.,bitcoin,.,miner.b,.,bitcoin,.,b symbol,.,mel b,.,bitcoin,.,buy,.,bitcoin,.,capital b,.,bitcoin,.,b-eleven,.,bitcoin,.,plan b,.,bitcoin,.,b-wallet,.,bitcoin,.,b&h,.,bitcoin,.,bitcoin,.,chart,.,bitcoin,.,converter,.,bitcoin,.,core,.,bitcoin,.,crash,.,bitcoin,.,currency,.,bitcoin,.,current value,.,bitcoin,.,calculator uk,.,bitcoin,.,creator,.,bitcoin,.,casino,.,c,.,bitcoin,.,miner,.,c,.,bitcoin,.,library,.,c't,.,bitcoin,.,c,.,bitcoin,.,wallet,.,bitcoin,.,ac id,.,bitcoin,.,c sharp,.,bitcoin,.,miner-c pup,.,bitcointalk c-cex,.,bitcoin,.,c==,.,bitcoin,.,debit card,.,bitcoin,.,difficulty,.,bitcoin,.,dollar,.,bitcoin,.,debit card uk,.,bitcoin,.,dark,.,bitcoin,.,documentary,.,bitcoin,.,drop,.,bitcoin,.,define,.,bitcoin,.,dark web,.,bitcoin,.,download,.,bitcoin.d,.,bitcoin,.,xt d,.,bitcoind backup,.,d-wave,.,bitcoin,.,mining,.,the d,.,bitcoin,.,atm,.,d'angelo,.,bitcoin,.,system d,.,bitcoin,.,init.d,.,bitcoind,.,d las vegas,.,bitcoin,.,/etc/init.d/bitcoind,.,bitcoin,.,exchange,.,bitcoin,.,explained,.,bitcoin,.,etf,.,bitcoin,.,exchange uk,.,bitcoin,.,ethereum,.,bitcoin,.,explorer,.,bitcoin,.,exchange rate uk,.,bitcoin,.,euro,.,bitcoin,.,exchange rate chart,.,bitcoin,.,exchange rate history,.,bitcoin,.,e wallet,.,,.,bitcoin,.,e.g. crossword,.,bitcoin,.,e.g. crossword clue,.,bitcoin,.,e commerce,.,bitcoin,.,e-currency,.,bitcoin,.,e money,.,bitcoin,.,e card,.,bitcoin,.,ebook,.,bitcoin,.,e-voucher,.,bitcoin,.,e pill,.,bitcoin,.,fork,.,bitcoin,.,forecast,.,bitcoin,.,faucet,.,bitcoin,.,forum,.,bitcoin,.,for dummies,.,bitcoin,.,farm,.,bitcoin,.,fees,.,bitcoin,.,founder,.,bitcoin,.,future,.,bitcoin,.,fund,.,f#,.,bitcoin,.,price of,.,bitcoin,.,brother john f,.,bitcoin,.,bh f,.,bitcoin,.,msil,.,bitcoin,.,miner-f,.,bitcoin,.,gbp,.,bitcoin,.,graph,.,bitcoin,.,growth,.,bitcoin,.,generator,.,bitcoin,.,gambling,.,bitcoin,.,github,.,bitcoin,.,games,.,bitcoin,.,guide,.,bitcoin,.,google finance,.,bitcoin,.,global capital,.,g,.,bitcoin,.,price,.,g,.,bitcoin,.,charts,.,g,.,bitcoin,.,value,.,g,.,bitcoin,.,mining calculator,.,bitcoin,.,guiminer,.,ghash,.,bitcoin,.,g coin,.,bitcointalk,.,g cash to,.,bitcoin,.,bitcoin,.,miner.g,.,bitcoin,.,hard fork,.,bitcoin,.,hardware wallet,.,bitcoin,.,history,.,bitcoin,.,historical price,.,bitcoin,.,hack,.,bitcoin,.,how to buy,.,bitcoin,.,halving,.,bitcoin,.,how it works,.,bitcoin,.,hashrate,.,bitcoin,.,hardware wallet uk,.,bitcoin,.,h/s,.,c&h,.,bitcoin,.,main.h,.,bitcoin,.,hash.h,.,bitcoin,.,coins.h,.,bitcoin,.,best b#$h -,.,bitcoin,.,best b#$h -,.,bitcoin,.,lyrics,.,h&r block,.,bitcoin,.,h-not-zero,.,bitcoin,.,bitcoin,.,investment,.,bitcoin,.,in gbp,.,bitcoin,.,investment trust,.,bitcoin,.,index,.,bitcoin,.,inventor,.,bitcoin,.,in usd,.,bitcoin,.,india,.,bitcoin,.,inflation,.,bitcoin,.,in dollars,.,bitcoin,.,investment uk,.,i,.,bitcoin,.,in usd,.,i,.,bitcoin,.,in inr,.,i,.,bitcoin,.,to pkr,.,i,.,bitcoin,.,to dollar,.,i,.,bitcoin,.,to naira,.,i,.,bitcoin,.,in rs,.,i,.,bitcoin,.,= satoshi,.,i,.,bitcoin,.,is equal to,.,i,.,bitcoin,.,berapa rupiah,.,i,.,bitcoin,.,in inr in 2009,.,bitcoin,.,japan,.,bitcoin,.,jobs,.,bitcoin,.,japan legal,.,bitcoin,.,jesus,.,bitcoin,.,jobs london,.,bitcoin,.,jobs uk,.,bitcoin,.,july 2017,.,bitcoin,.,jokes,.,bitcoin,.,june 2017,.,bitcoin,.,jihan,.,bitcoin,.,j,.,bitcoinj tutorial,.,bitcoinj micropayments,.,mary j,.,bitcoin,.,belle,.,mary j,.,bitcointalk,.,j maurice,.,bitcoin,.,mary j,.,bitcoin,.,j p morgan,.,bitcoin,.,,.,bitcoin,.,j vty,.,обменник,.,bitcoin,.,bitcoin,.,kurs,.,bitcoin,.,kraken,.,bitcoin,.,koers,.,bitcoin,.,knots,.,bitcoin,.,key,.,bitcoin,.,kopen,.,bitcoin,.,korea,.,bitcoin,.,knowledge,.,bitcoin,.,kaufen,.,bitcoin,.,kurz,.,bitcoin,.,k line,.,bitcoin,.,k,.,bitcoin,.,k value,.,bitcoin,.,k chart,.,john k,.,bitcoin,.,bitcoin,.,k-market,.,k-market jätkäsaari,.,bitcoin,.,k čemu,.,bitcoin,.,bitcoin,.,live price,.,bitcoin,.,latest news,.,bitcoin,.,login,.,bitcoin,.,logo,.,bitcoin,.,ledger,.,bitcoin,.,live,.,bitcoin,.,local,.,bitcoin,.,lottery,.,bitcoin,.,london,.,bitcoin,.,loan,.,bitcoin,.,l-39,.,l-39,.,bitcoin,.,jet,.,bitcoin,.,l'altra faccia della moneta,.,l'ambassade,.,bitcoin,.,l'avenir du,.,bitcoin,.,l'histoire du,.,bitcoin,.,l'inventeur du,.,bitcoin,.,l'évolution du,.,bitcoin,.,l'avenir des,.,bitcoins,.,l'origine du,.,bitcoin,.,bitcoin,.,market,.,bitcoin,.,millionaire,.,bitcoin,.,mining software,.,bitcoin,.,meaning,.,bitcoin,.,mining hardware,.,bitcoin,.,machine,.,bitcoin,.,mining pool,.,bitcoin,.,magazine,.,bitcoin,.,mining rig,.,m,.,bitcoin,.,meaning,.,m.bitcoin2048,.,bitcoin,.,m of n,.,bitcoin,.,m of n transactions,.,siriusxm,.,bitcoin,.,triple m,.,bitcoin,.,m lhuillier,.,bitcoin,.,m pesa vs,.,bitcoin,.,m.bitcoin2048.com отзывы,.,mercado,.,bitcoin,.,bitcoin,.,news uk,.,bitcoin,.,network,.,bitcoin,.,net worth,.,bitcoin,.,news reddit,.,bitcoin,.,nodes,.,bitcoin,.,network fee,.,bitcoin,.,near me,.,bitcoin,.,nedir,.,bitcoin,.,news india,.,bitcoin.n,.,bitcoin,.,n.ireland,.,n&p,.,bitcoin,.,consulting,.,shares in,.,bitcoin,.,piotr_n,.,bitcointalk,.,piotr_n,.,bitcoin,.,m of n,.,bitcoin,.,bitcoinspot.n,.,bitcoin,.,or ethereum,.,bitcoin,.,owner,.,bitcoin,.,online,.,bitcoin,.,original price,.,bitcoin,.,offline wallet,.,bitcoin,.,online wallet,.,bitcoin,.,outlook,.,bitcoin,.,official site,.,bitcoin,.,on amazon,.,o,.,bitcoin,.,e seguro,.,o,.,bitcoinu,.,bitcoin,.,o'reilly,.,bitcoin,.,to aud,.,bitcoin,.,o'reilly pdf,.,bitcoin,.,to euro,.,bitcoin,.,to btc,.,sve o,.,bitcoin,.,o'reilly,.,bitcoin,.,and the blockchain,.,bitcoin,.,price gbp,.,bitcoin,.,predictions,.,bitcoin,.,price uk,.,bitcoin,.,price prediction,.,bitcoin,.,paper wallet,.,bitcoin,.,pizza,.,,.,bitcoin,.,price live,.,p np,.,bitcoin,.,r.i.p.,.,bitcoin,.,p-free,.,bitcoin,.,win32/bitcoinminer.p,.,bitcoin,.,qt,.,bitcoin,.,qr code,.,bitcoin,.,quote,.,bitcoin,.,quantum computing,.,bitcoin,.,que es,.,bitcoin,.,quora,.,bitcoin,.,questions,.,bitcoin,.,qt update,.,bitcoin,.,qt wallet location,.,bitcoin,.,quantum,.,bitcoin,.,q,.,bitcoin,.,q es,.,q son,.,bitcoins,.,q es un,.,bitcoin,.,q son los,.,bitcoins,.,q es el,.,bitcoin,.,q comprar con,.,bitcoins,.,bitcoins que significa,.,bitcoin,.,q significa,.,bitcoin,.,rate,.,bitcoin,.,reddit,.,bitcoin,.,review,.,bitcoin,.,rival,.,bitcoin,.,rate gbp,.,bitcoin,.,rise,.,bitcoin,.,regulation,.,bitcoin,.,rich list,.,bitcoin,.,rate history,.,bitcoin,.,regulation uk,.,r,.,bitcoinmarkets,.,r,.,bitcoin,.,uk,.,r,.,bitcoin,.,canada,.,r,.,bitcoin,.,cash,.,r,.,bitcoin,.,package,.,r,.,bitcointalk,.,r,.,bitcoin,.,mining,.,r,.,bitcoin,.,abc,.,r,.,bitcoin,.,analysis,.,bitcoinxt,.,bitcoin,.,share price,.,bitcoin,.,stock,.,bitcoin,.,split,.,bitcoin,.,segwit,.,bitcoin,.,stock price,.,bitcoin,.,shares,.,bitcoin,.,symbol,.,bitcoin,.,suisse,.,bitcoin,.,scams,.,bitcoin,.,stock market,.,bitcoins value,.,bitcoin,.,s curve,.,bitcoin,.,miners,.,gh/s,.,bitcoin,.,th/s,.,bitcoin,.,th/s,.,bitcoin,.,miner,.,mh/s,.,bitcoin,.,1th/s,.,bitcoin,.,miner,.,10th/s,.,bitcoin,.,miner,.,20th/s,.,bitcoin,.,miner,.,bitcoin,.,trading,.,bitcoin,.,to dollar,.,bitcoin,.,transaction,.,bitcoin,.,to £,.,bitcoin,.,ticker,.,bitcointalk,.,bitcoin,.,transaction fee,.,bitcoin,.,t shirt,.,bitcoin,.,t shirt uk,.,bitcoin,.,t shirt india,.,bitcoin,.,t shirt store,.,alpha-t,.,bitcointalk,.,bb&t,.,bitcoin,.,t-110,.,bitcoin,.,mining system,.,bitcoin,.,miner t720,.,bitcoin,.,usd,.,bitcoin,.,uk,.,bitcoin,.,unlimited,.,bitcoin,.,unconfirmed transaction,.,bitcoin,.,usd price,.,bitcoin,.,uk price,.,bitcoin,.,uasf,.,bitcoin,.,uk tax,.,bitcoin,.,update,.,bitcoin,.,uk exchange,.,why u,.,bitcoin,.,billionaire,.,bitcoin,.,u bosni,.,bitcoin,.,miner.u,.,bitcoin,.,u crnoj gori,.,bitcoin,.,youtube,.,bitcoin,.,u dinarima,.,wii u,.,bitcoin,.,utorrent,.,bitcoin,.,u.s.,.,bitcoin,.,exchange,.,bitcoin,.,u kune,.,bitcoin,.,value,.,,.,bitcoin,.,value chart,.,bitcoin,.,value history,.,bitcoin,.,value gbp,.,bitcoin,.,vs ethereum,.,bitcoin,.,vs usd,.,bitcoin,.,volatility,.,bitcoin,.,vs litecoin,.,bitcoin,.,value 2010,.,bitcoin,.,vs gold,.,bitcoin,.,v litecoin,.,bitcoin,.,v dollar,.,bitcoin,.,v euro,.,bitcoin,.,v gold,.,bitcoin,.,v blockchain,.,bitcoin,.,v onecoin,.,bitcoin,.,hack v.2,.,bitcoin,.,worth,.,bitcoin,.,wiki,.,bitcoin,.,wallet uk,.,bitcoin,.,what is it,.,bitcoinwisdom,.,bitcoin,.,whitepaper,.,bitcoin,.,wallet online,.,bitcoin,.,wallet address,.,bitcoin,.,wallet download,.,bitcoin,.,miner.w,.,bitcoin,.,w polsce,.,bitcoiny w polsce,.,bitcoin,.,w niemczech,.,bitcoin,.,w chmurze,.,bitcoin,.,w żabce,.,bitcoin,.,w polsce legalny,.,bitcoin,.,w chinach,.,bitcoin,.,w prawie polskim,.,bitcoin,.,w górę,.,bitcoin,.,xe,.,bitcoin,.,xbt,.,bitcoin,.,xt,.,bitcoin,.,xbte,.,bitcoin,.,xapo,.,bitcoin,.,xrp,.,bitcoin,.,xt price,.,bitcoin,.,xpub,.,x,.,bitcoin,.,generator,.,bitcoin,.,yahoo finance,.,bitcoin,.,year chart,.,bitcoin,.,year,.,bitcoin,.,yield,.,bitcoin,.,ytd,.,bitcoin,.,yubikey,.,bitcoin,.,yoda,.,bitcoin,.,yahoo finance chart,.,ybitcoin,.,magazine,.,bitcoin,.,y control de cambio,.,y combinator,.,bitcoin,.,ecuador y,.,bitcoin,.,bitcoin,.,by paypal,.,bitcoin,.,y el lavado de dinero,.,bitcoin,.,y deep web,.,bitcoin,.,y lavado de dinero,.,bitcoin,.,y litecoin,.,bitcoin,.,and blockchain,.,bitcoin,.,zebra,.,bitcoin,.,zerohedge,.,bitcoin,.,zimbabwe,.,bitcoin,.,zar,.,bitcoin,.,zcash,.,bitcoin,.,zapwallettxes,.,bitcoin,.,zarabianie,.,bitcoin,.,zug,.,bitcoin,.,zero,.,bitcoin,.,zero confirmations,.,bitcoin,.,z value,.,titan z,.,bitcoin,.,mining,.,titan z,.,bitcoin,.,z cash,.,bitcoin,.,nvidia titan z,.,bitcoin,.,mining,.,nvidia titan z,.,bitcoin,.,nakup zlata z,.,bitcoini,.,sklep z,.,bitcoinami,.,trgovanje z,.,bitcoini,.,co z,.,bitcoinem,.,bitcoin,.,0 confirmations,.,bitcoin,.,0.1,.,bitcoin,.,0.1.0,.,bitcoin,.,0 active connections,.,bitcoin,.,0 transaction fee,.,bitcoin,.,0 fee,.,0.15,.,bitcoins,.,0 25,.,bitcoins,.,0.05,.,bitcoin,.,in euro,.,bitcoin,.,2.0,.,0.1,.,bitcoins,.,0.21,.,bitcoins,.,bitcoin,.,1st august,.,bitcoin,.,1 million,.,bitcoin,.,101,.,bitcoin,.,10 year chart,.,bitcoin,.,10000,.,bitcoin,.,148,.,,.,bitcoin,.,10 year prediction,.,bitcoin,.,100k,.,bitcoin,.,100 dollars,.,bitcoin,.,10 years ago,.,1,.,bitcoin,.,in gbp,.,1,.,bitcoin,.,in pounds,.,1,.,bitcoin,.,in £,.,1,.,bitcoin,.,to dollar,.,1,.,bitcoin,.,in inr,.,1,.,bitcoin,.,to euro,.,1,.,bitcoin,.,in gdp,.,1,.,bitcoin,.,in eur,.,1,.,bitcoin,.,to myr,.,1,.,bitcoin,.,in sterling,.,bitcoin,.,2010,.,bitcoin,.,2017,.,bitcoin,.,2020,.,bitcoin,.,2018,.,bitcoin,.,2009,.,bitcoin,.,2013,.,bitcoin,.,21 million,.,bitcoin,.,2012,.,bitcoin,.,2014,.,2,.,bitcoin,.,to usd,.,2,.,bitcoin,.,price,.,2,.,bitcoin,.,to inr,.,2,.,bitcoin,.,wallets,.,2,.,bitcoins to dollars,.,2,.,bitcoins free,.,2,.,bitcoins a month,.,2,.,bitcoin,.,qt,.,bitcoin,.,2 year chart,.,bitcoin,.,2 paypal,.,bitcoin,.,3000,.,bitcoin,.,31st july,.,bitcoin,.,3 confirmations,.,bitcoin,.,3.0,.,bitcoin,.,3 year chart,.,bitcoin,.,3 month chart,.,bitcoin,.,300,.,bitcoin,.,365 club,.,bitcoin,.,3000 usd,.,bitcoin,.,30 confirmations,.,3,.,bitcoins in gbp,.,3,.,bitcoins,.,3,.,bitcoins to usd,.,3,.,bitcoin,.,in euro,.,3,.,bitcoin,.,to eur,.,bitcoin,.,3 unlimited,.,bitcoin,.,3 day chart,.,bitcoin,.,3 address,.,bitcoin,.,4000,.,bitcoin,.,4chan,.,bitcoin,.,4 billion,.,bitcoin,.,401k,.,bitcoin,.,4 backpage,.,bitcoin,.,43,.,bitcoin,.,40000,.,bitcoin,.,4k,.,bitcoin,.,4 year chart,.,bitcoin,.,48,.,4,.,bitcoins,.,4,.,bitcoins to usd,.,4,.,bitcoins in gbp,.,4,.,bitcoin,.,to eur,.,bitcoins 4 backpage,.,bitcoin,.,4 igaming,.,bitcoin,.,4 u,.,bitcoin,.,4 november,.,bitcoin,.,4 cash,.,bitcoin,.,5 year chart,.,bitcoin,.,51 attack,.,bitcoin,.,500,.,bitcoin,.,5 year,.,bitcoin,.,500 000,.,bitcoin,.,5000,.,bitcoin,.,50000,.,bitcoin,.,5 year price,.,bitcoin,.,5 years ago,.,bitcoin,.,5 year forecast,.,5,.,bitcoins in pounds,.,5,.,bitcoins,.,5,.,bitcoins to usd,.,5,.,bitcoin,.,free,.,5,.,bitcoin,.,in euro,.,bitcoin,.,5 years,.,bitcoin,.,5 minutes,.,bitcoin,.,5 min,.,bitcoin,.,5 unlimited generator,.,bitcoin,.,666,.,bitcoin,.,6 months,.,bitcoin,.,6 confirmations,.,bitcoin,.,6 month chart,.,bitcoin,.,6000,.,bitcoin,.,60 minutes,.,bitcoin,.,6 confirmations time,.,bitcoin,.,6 month price,.,bitcoin,.,6 years ago,.,bitcoin,.,60 day chart,.,6,.,bitcoin,.,network confirmations,.,,.,
submitted by besterse to BestCryptoPlatform [link] [comments]

How to buy bitcoins for beginners 2017 - YouTube Nick Szabo's Remarks on the Bitcoin XT Fork Bitcoin Mining Calculator Bitcoin mining is a full time job EB82 – Mike Hearn - Blocksize Debate At The Breaking Point

Height Age Transactions Total Sent Total Fees Block Size (in bytes) 654077: 2020-10-24T07:54:33.84Z: 1,769: 2,703.104 BTC: 0.251 BTC: 862,610: 654076: 2020-10-24T07 ... Bitcoin Block Size Economics. In the last two weeks, the topic of increasing the Bitcoin block size limit has made its way into the Bitcoin community’s spotlight. The discussion began on October 6, 2014, when Gavin Andresen—the Chief Scientist at the Bitcoin Foundation—made a post on the Foundation’s official blog regarding the issue. Andresen said that he wanted to increase the hard ... BCD Bitcoin Diamond. BCD 178 MH/s, 600 W. 0.03 USD-1.44 USD-1.41 USD. 18. ZP BCD Zergpool. BCD 178 MH/s, 600 W. 0.03 USD-1.44 USD-1.41 USD. 19. zpool BCD zpool. BCD 178 MH/s, 600 W. 0.03 USD-1.44 USD-1.41 USD. 20. MPH Lyra2REv2 Mining Pool Hub. Lyra2REv2 216 MH/s, 600 W. 0.02 USD-1.44 USD-1.42 USD. Load all coins. Include coins with alerts. Disclaimer: Results from mining calculator are ... Accurate Bitcoin mining calculator trusted by millions of cryptocurrency miners since May 2013 - developed by an OG Bitcoin miner looking to maximize on mining profits and calculate ROI for new ASIC miners. Updated in 2020, the newest version of the Bitcoin mining calculator makes it simple and easy to quickly calculate mining profitability for your Bitcoin mining hardware. Bitcoin Transaction Fees Explained in Detail. Bitcoin fees are a fascinating component of the network’s game theory and an indispensable element without which the whole project’s economic sustainability becomes questionable.. Whenever a transaction is sent, miners demand for an arbitrary amount of bitcoin fractions (denominated in satoshis, the hundred millionth part of 1 BTC) so that they ...

[index] [38203] [1728] [49095] [3646] [44729] [24380] [821] [49987] [21901] [42722]

How to buy bitcoins for beginners 2017 - YouTube

How to choose the right Renko box size - Duration: 14:32. Chartist Ranga 40,443 views. 14:32. RENKO Trading Strategy ... Bitcoin Trading Challenge 4,999 views. 10:17. The Ultimate Candlestick ... Crypto exchange rate calculator helps you convert prices online between two currencies in real-time. Online CryptoCurrency Calculator with multi-Cryptocurrencies. Cryptocurrency converter, calculator. In this video we answer the very famous question: Why Will There Be Only 21 Million Bitcoins? Calculation involved for 21 million: Calculate the number of blocks per 4 year cycle: 6 blocks per ... Block Operations 12,581 views. 30:58. What is Bitcoin Mining and is it worth doing? - Duration: 10:46. The Bitcoin Project 2,844 views. 10:46. What It Was Like MINING Cryptocurrency Full-Time For ... Demo of our Bitcoin Mining Calculator. Estimate what you can earn mining Bitcoin. To download the calculator and get started with our recommended mining prov...

#