Long live decentralized bitcoin(!) A reading list

Newbs might not know this, but bitcoin recently came out of an intense internal drama. Between July 2015 and August 2017, bitcoin was attacked by external forces hoping to destroy the very properties that made it valuable in the first place. This culminated in the creation of segwit and the UASF (user activated soft fork) movement. The UASF was successful: segwit was added to bitcoin, and with that the anti-decentralization side left bitcoin altogether and created their own altcoin called bcash. Bitcoin's price was $2500; soon after segwit was activated, the price doubled to $5000 and kept rising to a top of $20000 before correcting to where we are today.
During this drama, I took time away from writing open source code to help educate and argue on reddit, twitter and other social media. I put together a reading list for quick copy-pasting. It may be interesting today for newbs, or for anyone who wants a history lesson on what exactly happened during those two years when bitcoin's very existence as a decentralized low-trust currency was questioned. Now that the fight has essentially been won, I try not to comment on reddit that much anymore. There's nothing left to do except wait for Lightning and similar tech to mature (or better yet, help code and test it).
In this thread you can learn about block sizes, latency, decentralization, segwit, ASICBOOST, lightning network and all the other issues that were debated endlessly for over two years. So when someone tries to get you to invest in bcash, remind them of the time they supported Bitcoin Unlimited.
For more threads like this see UASF

Summary / The fundamental tradeoff

A trip to the moon requires a rocket with multiple stages by gmaxwell (must read) https://www.reddit.com/Bitcoin/comments/438hx0/a_trip_to_the_moon_requires_a_rocket_with/
Bram Cohen, creator of bittorrent, argues against a hard fork to a larger block size https://medium.com/@bramcohen/bitcoin-s-ironic-crisis-32226a85e39f#.558vetum4
gmaxwell's summary of the debate https://bitcointalk.org/index.php?topic=1343716.msg13701818#msg13701818
Core devs please explain your vision (see luke's post which also argues that blocks are already too big) https://www.reddit.com/Bitcoin/comments/61yvvv/request_to_core_devs_please_explain_your_vision/
Mod of btc speaking against a hard fork https://www.reddit.com/btc/comments/57hd14/core_reaction_to_viabtc_this_week/d8scokm/
It's becoming clear to me that a lot of people don't understand how fragile bitcoin is https://www.reddit.com/Bitcoin/comments/59kflj/its_becoming_clear_to_me_that_a_lot_of_people/
Blockchain space must be costly, it can never be free https://www.reddit.com/Bitcoin/comments/4og24h/i_just_attended_the_distributed_trade_conference/
Charlie Lee with a nice analogy about the fundamental tradeoff https://medium.com/@SatoshiLite/eating-the-bitcoin-cake-fc2b4ebfb85e#.444vr8shw
gmaxwell on the tradeoffs https://bitcointalk.org/index.php?topic=1520693.msg15303746#msg15303746
jratcliff on the layering https://www.reddit.com/btc/comments/59upyh/segwit_the_poison_pill_for_bitcoin/d9bstuw/

Scaling on-chain will destroy bitcoin's decentralization

Peter Todd: How a floating blocksize limit inevitably leads towards centralization [Feb 2013] https://bitcointalk.org/index.php?topic=144895.0 mailing list https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2013-February/002176.html with discussion on reddit in Aug 2015 https://www.reddit.com/Bitcoin/comments/3hnvi8/just_a_little_history_lesson_for_everyone_new_the/
Nick Szabo's blog post on what makes bitcoin so special http://unenumerated.blogspot.com/2017/02/money-blockchains-and-social-scalability.html
There is academic research showing that even small (2MB) increases to the blocksize result in a drastic drop in node count due to the non-linear increase in RAM needed. http://bravenewcoin.com/assets/Whitepapers/block-size-1.1.1.pdf
Reddit summary of the above link. The table estimates an immediate 40% drop in node count with a 2MB upgrade and a 50% drop over 6 months; at 4MB it becomes 75% immediately and 80% over 6 months, and at 8MB, 90% and 95%. https://www.reddit.com/Bitcoin/comments/5qw2wa_future_led_by_bitcoin_unlimited_is_a/dd442pw/
Larger block sizes make centralization pressures worse (mathematical) https://petertodd.org/2016/block-publication-incentives-for-miners
Talk at scalingbitcoin montreal: initial blockchain synchronization puts serious constraints on any increase in the block size https://www.youtube.com/watch?v=TgjrS-BPWDQ&t=2h02m06s with transcript https://scalingbitcoin.org/transcript/montreal2015/block-synchronization-time
Bitcoin's P2P Network: The Soft Underbelly of Bitcoin https://www.youtube.com/watch?v=Y6kibPzbrIc someone's notes: https://gist.github.com/romyilano/5e22394857a39889a1e5 reddit discussion https://www.reddit.com/Bitcoin/comments/4py5df/so_f2pool_antpool_btcc_pool_are_actually_one_pool/
In adversarial environments blockchains don't scale https://scalingbitcoin.org/transcript/hongkong2015/in-adversarial-environments-blockchains-dont-scale
Why miners will not voluntarily individually produce smaller blocks https://scalingbitcoin.org/transcript/hongkong2015/why-miners-will-not-voluntarily-individually-produce-smaller-blocks
Hal Finney: bitcoin's blockchain can only be a settlement layer (mostly interesting because it's Hal Finney and it's from 2010) https://www.reddit.com/Bitcoin/comments/3sb5nj/most_bitcoin_transactions_will_occur_between/
petertodd's 2013 video explaining this https://www.youtube.com/watch?v=cZp7UGgBR0I
luke-jr's summary https://www.reddit.com/Bitcoin/comments/61yvvv/request_to_core_devs_please_explain_your_vision/dficjhj/
Another jratcliff thread https://www.reddit.com/Bitcoin/comments/6lmpll/explaining_why_big_blocks_are_bad/

Full blocks are not a disaster

Blocks must always be full; there must always be a backlog https://medium.com/@bergealex4/bitcoin-is-unstable-without-the-block-size-size-limit-70db07070a54#.kh2vi86lr
Same as above: the mining gap means there must always be a backlog. Talk: https://www.youtube.com/watch?time_continue=2453&v=iKDC2DpzNbw transcript: https://scalingbitcoin.org/transcript/montreal2015/security-of-diminishing-block-subsidy
Backlogs aren't that bad https://www.reddit.com/Bitcoin/comments/49p011/was_the_fee_event_really_so_bad_my_mind_is/
Examples where scarce block space causes people to use precious resources more efficiently https://www.reddit.com/Bitcoin/comments/4kxxvj/i_just_singlehandedly_increased_bitcoin_network/
https://www.reddit.com/Bitcoin/comments/47d4m2/why_does_coinbase_make_2_transactions_pe
https://www.reddit.com/Bitcoin/comments/53wucs/why_arent_blocks_full_yet/d7x19iv
Full blocks are fine https://www.reddit.com/Bitcoin/comments/5uld1a/misconception_full_blocks_mean_bitcoin_is_failing/
High miner fees imply a sustainable future for bitcoin https://www.reddit.com/BitcoinMarkets/comments/680tvf/fundamentals_friday_week_of_friday_april_28_2017/dgwmhl7/
gmaxwell on why full blocks are good https://www.reddit.com/Bitcoin/comments/6b57ca/full_blocks_good_or_bad/dhjxwbz/
The whole idea of the mempool being "filled" is wrongheaded. The mempool doesn't "clog" or get stuck, or anything like that. https://www.reddit.com/Bitcoin/comments/7cusnx/to_the_people_still_doubting_that_this_congestion/dpssokf/

Segwit

What is segwit

luke-jr's longer summary https://www.reddit.com/Bitcoin/comments/6033h7/today_is_exactly_4_months_since_the_segwit_voting/df3tgwg/?context=1
Charlie Shrem on upgrading to segwit https://twitter.com/CharlieShrem/status/842711238853513220
Original segwit talk at scalingbitcoin hong kong + transcript https://youtu.be/zchzn7aPQjI?t=110
https://scalingbitcoin.org/transcript/hongkong2015/segregated-witness-and-its-impact-on-scalability
Segwit is not too complex https://www.reddit.com/btc/comments/57vjin/segwit_is_not_great/d8vos33/
Segwit does not make it possible for miners to steal coins, contrary to what some people say https://www.reddit.com/btc/comments/5e6bt0/concerns_with_segwit_and_anyone_can_spend/daa5jat/?context=1
https://keepingstock.net/segwit-eli5-misinformation-faq-19908ceacf23#.r8hlzaquz
Segwit is required for a useful lightning network. It's now known that without a malleability fix, useful indefinite channels are not really possible.
https://www.reddit.com/Bitcoin/comments/5tzqtc/gentle_reminder_the_ln_doesnt_require_segwit/ddqgda7/
https://www.reddit.com/Bitcoin/comments/5tzqtc/gentle_reminder_the_ln_doesnt_require_segwit/ddqbukj/
https://www.reddit.com/Bitcoin/comments/5x2oh0/olaoluwa_osuntokun_all_active_lightning_network/deeto14/?context=3
Clearing up SegWit Lies and Myths: https://achow101.com/2016/04/Segwit-FUD-Clearup
Segwit is bigger blocks https://www.reddit.com/Bitcoin/comments/5pb8vs/misinformation_is_working_54_incorrectly_believe/dcpz3en/
Typical usage results in segwit allowing capacity equivalent to 2mb blocks https://www.reddit.com/Bitcoin/comments/69i2md/observe_for_yourself_segwit_allows_2_mb_blocks_in/

Why is segwit being blocked

Jihan Wu (head of the largest bitcoin mining group) is blocking segwit because of perceived loss of income https://www.reddit.com/Bitcoin/comments/60mb9e/complete_high_quality_translation_of_jihans/
or because he wants his mining enterprise to have control over bitcoin https://www.reddit.com/Bitcoin/comments/6jdyk8/direct_report_of_jihan_wus_real_reason_fo
Witness discount creates aligned incentives https://segwit.org/why-a-discount-factor-of-4-why-not-2-or-8-bbcebe91721e#.h36odthq0 https://medium.com/@SegWit.co/what-is-behind-the-segwit-discount-988f29dc1edf#.sr91dg406

Segwit is being blocked because it breaks ASICBOOST, a patented optimization used by the ASIC manufacturer Bitmain

Details and discovery by gmaxwell https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2017-April/013996.html
Reddit thread with discussion https://www.reddit.com/Bitcoin/comments/63otrp/gregory_maxwell_major_asic_manufacturer_is/
Simplified explanation by jonny1000 https://www.reddit.com/Bitcoin/comments/64qq5g/attempted_explanation_of_the_alleged_asicboost/
http://www.mit.edu/~jlrubin/public/pdfs/Asicboost.pdf
https://medium.com/@jimmysong/examining-bitmains-claims-about-asicboost-1d61118c678d
Evidence https://www.reddit.com/Bitcoin/comments/63yo27/some_circumstantial_evidence_supporting_the_claim/
https://www.reddit.com/Bitcoin/comments/63vn5g/please_dont_stop_us_from_using_asicboost_which/dfxmm75/
https://www.reddit.com/Bitcoin/comments/63soe3/reverse_engineering_an_asic_is_a_significant_task/dfx9nc
Bitmain admits their chips have asicboost but they say they never used it on the network (haha a likely story) https://blog.bitmain.com/en/regarding-recent-allegations-smear-campaigns/
Worth $100m per year to them (also in gmaxwell's original email) https://twitter.com/petertoddbtc/status/849798529929424898
Other calculations show less https://medium.com/@vcorem/the-real-savings-from-asicboost-to-bitmaintech-ff265c2d305b
This also blocks all these other cool updates, not just segwit https://www.reddit.com/Bitcoin/comments/63otrp/gregory_maxwell_major_asic_manufacturer_is/dfw0ej3/
Summary of bad consequences of asicboost https://www.reddit.com/Bitcoin/comments/64qq5g/attempted_explanation_of_the_alleged_asicboost/dg4hyqk/?context=1
Luke's summary of the entire situation https://www.reddit.com/Bitcoin/comments/6ego3s/why_is_killing_asicboost_not_a_priority/diagkkb/?context=1
Price goes up because now segwit looks more likely https://twitter.com/TuurDemeester/status/849846845425799168
Asicboost discovery made the price rise https://twitter.com/TuurDemeester/status/851520094677200901
A pool was caught red-handed doing asicboost; by this time it seemed fairly certain that segwit would get activated, so it didn't attract as much interest as earlier https://www.reddit.com/Bitcoin/comments/6p7lr5/1hash_pool_has_mined_2_invalid_blocks/ and https://www.reddit.com/Bitcoin/comments/6p95dl/interesting_1hash_pool_mined_some_invalid_blocks/ and https://twitter.com/petertoddbtc/status/889475196322811904
This btc user is outraged at the entire forum because they support Bitmain and ASICBOOST https://www.reddit.com/btc/comments/67t43y/dragons_den_planned_smear_campaign_of_bitmain/dgtg9l2/
Antbleed: it turns out Bitmain can shut down all its ASICs by remote control: http://www.antbleed.com/

What if segwit never activates

What if segwit never activates? https://www.reddit.com/Bitcoin/comments/6ab8js/transaction_fees_are_now_making_btc_like_the_banks/dhdq3id/ with https://www.reddit.com/Bitcoin/comments/5ksu3o/blinded_bearer_certificates/ and https://www.reddit.com/Bitcoin/comments/4xy0fm/scaling_quickly/

Lightning

bitcoinmagazine's series on what lightning is and how it works https://bitcoinmagazine.com/articles/understanding-the-lightning-network-part-building-a-bidirectional-payment-channel-1464710791/ https://bitcoinmagazine.com/articles/understanding-the-lightning-network-part-creating-the-network-1465326903/ https://bitcoinmagazine.com/articles/understanding-the-lightning-network-part-completing-the-puzzle-and-closing-the-channel-1466178980/
The Lightning Network ELIDHDICACS (Explain Like I Don’t Have Degrees in Cryptography and Computer Science) https://letstalkbitcoin.com/blog/post/the-lightning-network-elidhdicacs
Lightning will increase fees for miners, not lower them https://medium.com/lightning-resources/the-lightning-paradox-f15ce0e8e374#.erfgunumh
Cost-benefit analysis of lightning from the point of view of miners https://medium.com/@rusty_lightning/miners-and-bitcoin-lightning-a133cd550310#.x42rovlg8
Routing blog post by rusty https://medium.com/@rusty_lightning/routing-dijkstra-bellman-ford-and-bfg-7715840f004 and reddit comments https://www.reddit.com/Bitcoin/comments/4lzkz1/rusty_russell_on_lightning_routing_routing/
Lightning protocol rfc https://github.com/lightningnetwork/lightning-rfc
Blog post with screenshots of ln being used on testnet https://medium.com/@btc_coach/lightning-network-in-action-b18a035c955d video https://www.youtube.com/watch?v=mxGiMu4V7ns
Video of sending and receiving ln on testnet https://twitter.com/alexbosworth/status/844030573131706368
Lightning tradeoffs http://www.coindesk.com/lightning-technical-challenges-bitcoin-scalability/
Beer sold for testnet lightning https://www.reddit.com/Bitcoin/comments/62uw23/lightning_network_is_working_room77_is_accepting/ and https://twitter.com/MrHodl/status/848265171269283845
Lightning will result in far fewer coins being stored on third parties because it supports instant transactions https://medium.com/@thecryptoconomy/the-barely-discussed-incredible-benefit-of-the-lightning-network-4ce82c75eb58
jgarzik argues strongly against LN, he owns a coin tracking startup https://twitter.com/petertoddbtc/status/860826532650123264 https://twitter.com/Beautyon_/status/886128801926795264
luke's great debunking / answer of some misinformation questions https://www.reddit.com/Bitcoin/comments/6st4eq/questions_about_lightning_network/dlfap0u/
Lightning centralization doesn't happen https://www.reddit.com/Bitcoin/comments/6vzau5/reminder_bitcoins_key_strength_is_in_being/dm4ou3v/?context=1
roasbeef on hubs and charging fees https://twitter.com/roasbeef/status/930209165728825344 and https://twitter.com/roasbeef/status/930210145790976000

Immutability / Being a swiss bank in your pocket / Why doing a hard fork (especially without consensus) is damaging

A downside of hard forks is damaging bitcoin's immutability https://www.reddit.com/Bitcoin/comments/5em6vu/what_happens_if_segwit_doesnt_activate/dae1r6c/?context=3
Interesting analysis of miners' incentives and how failure is possible; don't trust the miners for the long term https://www.reddit.com/Bitcoin/comments/5gtew4/why_an_increased_block_size_increases_the_cost_of/daybazj/?context=2
waxwing on the meaning of cash and settlement https://www.reddit.com/Bitcoin/comments/5ei7m3/unconfirmed_transactions_60k_total_fees_14btc/dad001v/
maaku on the cash question https://www.reddit.com/Bitcoin/comments/5i5iq5/we_are_spoiled/db5luiv/?context=1
Digital gold fundamentalists gain nothing from supporting a hard fork to larger block sizes https://www.reddit.com/Bitcoin/comments/5xzunq/core_please_compromise_before_we_end_up_with_bu/dem73xg/?context=1
Those asking for a compromise don't understand the underlying political forces https://www.reddit.com/Bitcoin/comments/6ef7wb/some_comments_on_the_bip148_uasf_from_the/dia236b/?context=3
Actually, nobody wants a contentious hard fork; anti-core people got emotionally manipulated https://www.reddit.com/Bitcoin/comments/5sq5ocontentious_forks_vs_incremental_progress/ddip57o/
The hard work of the core developers has kept bitcoin scalable https://www.reddit.com/Bitcoin/comments/3hfgpo/an_initiative_to_bring_advanced_privacy_features/cu7mhw8?context=9
Recent PRs to improve bitcoin scalability ignored by the debate https://twitter.com/jfnewbery/status/883001356168167425
gmaxwell against hard forks since 2013 https://bitcointalk.org/index.php?topic=140233.20
maaku: hard forks are really bad https://www.reddit.com/Bitcoin/comments/5zxjza/adam_greg_core_devs_and_big_blockers_now_is_the/df275yk/?context=2

Some metrics on what the market thinks of decentralization and hostile hard forks

The price history shows that the exchange rate drops every time a hard fork threatens: https://i.imgur.com/EVPYLR8.jpg
and this example from 2017 https://twitter.com/WhalePanda/status/845562763820912642
http://imgur.com/a/DuHAn btc users lose money
price supporting theymos' moderation https://i.imgur.com/0jZdF9h.png
old version https://i.imgur.com/BFTxTJl.png
older version https://pbs.twimg.com/media/CxqtUakUQAEmC0d.jpg
about 50% of nodes updated to the soft-fork client quite quickly https://imgur.com/O0xboVI

Bitcoin Unlimited / Emergent Consensus is badly designed, changes the game theory of bitcoin

Bitcoin Unlimited was a proposed hard-fork client, made with the intention of stopping segwit from activating
A Future Led by Bitcoin Unlimited is a Centralized Future https://blog.sia.tech/a-future-led-by-bitcoin-unlimited-is-a-centralized-future-e48ab52c817a#.p1ly6hldk
Flexible transactions are bugged https://www.reddit.com/Bitcoin/comments/57tf5g/bitcoindev_bluematt_on_flexible_transactions/
Bugged BU software mines an invalid block, wasting 13 bitcoins or $12k
https://www.reddit.com/Bitcoin/comments/5qwtr2/bitcoincom_loses_132btc_trying_to_fork_the/
https://www.reddit.com/btc/comments/5qx18i/bitcoincom_loses_132btc_trying_to_fork_the/
bitcoin.com employees are moderators of btc https://medium.com/@WhalePanda/the-curious-relation-between-bitcoin-com-anti-segwit-propaganda-26c877249976#.vl02566k4
miners don't control stuff like the block size http://hackingdistributed.com/2016/01/03/time-for-bitcoin-user-voice/
even gavin agreed that economic majority controls things https://www.reddit.com/Bitcoin/comments/5ywoi9/in_2010_gavin_predicted_that_exchanges_ie_the/
fork clients are trying to steal bitcoin's brand and network effect; they're no different from altcoins https://medium.com/@Coinosphere/why-bitcoin-unlimited-should-be-correctly-classified-as-an-attempted-robbery-of-bitcoin-not-a-9355d075763c#.qeaynlx5m
BU being active makes it easier to reverse payments and increases wasted work, making the network less secure and giving an advantage to bigger miners https://www.reddit.com/Bitcoin/comments/5g1x84/bitcoin_unlimited_bu_median_value_of_miner_eb/
bitcoin unlimited takes power away from users and gives it to miners https://medium.com/@alpalpalp/bitcoin-unlimiteds-placebo-controls-6320cbc137d4#.q0dv15gd5
bitcoin unlimited's accepted depth https://twitter.com/tdryja/status/804770009272696832
BU's lying propaganda poster https://imgur.com/osrViDE

BU is bugged, poorly-reviewed and crashes

bitcoin unlimited allegedly funded by coins stolen from kraken
https://www.reddit.com/btc/comments/55ajuh/taint_analysis_on_bitcoin_stolen_from_kraken_on/
https://www.reddit.com/btc/comments/559miz/taint_analysis_on_btc_allegedly_stolen_from_kraken/
Other funding stuff
https://www.reddit.com/Bitcoin/comments/5zozmn/damning_evidence_on_how_bitcoin_unlimited_pays/
A serious bug in BU https://www.reddit.com/Bitcoin/comments/5h70s3/bitcoin_unlimited_bu_the_developers_have_realized/
A summary of what's wrong with BU: https://www.reddit.com/Bitcoin/comments/5z3wg2/jihanwu_we_will_switch_the_entire_pool_to/devak98/

Bitcoin Unlimited Remote Exploit Crash 14/3/2017

https://www.reddit.com/Bitcoin/comments/5zdkv3/bitcoin_unlimited_remote_exploit_crash/ https://www.reddit.com/Bitcoin/comments/5zeb76/timbe https://www.reddit.com/btc/comments/5zdrru/peter_todd_bu_remote_crash_dos_wtf_bug_assert0_in/
BU devs calling it a disaster https://twitter.com/SooMartindale/status/841758265188966401 also btc deleted a thread about the exploit https://i.imgur.com/lVvFRqN.png
Summary of incident https://www.reddit.com/Bitcoin/comments/5zf97j/i_was_undecided_now_im_not/
More than 20 exchanges will list BTU as an altcoin
https://www.reddit.com/Bitcoin/comments/5zyg6g/bitcoin_exchanges_unveil_emergency_hard_fork/
Again a few days later https://www.reddit.com/Bitcoin/comments/60qmkt/bu_is_taking_another_shit_timberrrrr

User Activated Soft Fork (UASF)

site for it, including list of businesses supporting it http://www.uasf.co/
luke's view
https://www.reddit.com/Bitcoin/comments/5zsk45/i_am_shaolinfry_author_of_the_recent_usedf1dqen/?context=3
the threat of UASF makes the miners fall into line in litecoin
https://www.reddit.com/litecoin/comments/66omhlitecoin_global_roundtable_resolution/dgk2thk/?context=3
UASF delivers the goods for vertcoin
https://www.reddit.com/Bitcoin/comments/692mi3/in_test_case_uasf_results_in_miner_consensus/dh3cm34/?context=1
UASF coin is more valuable https://www.reddit.com/Bitcoin/comments/6cgv44/a_uasf_chain_will_be_profoundly_more_valuable/
All the links together in one place https://www.reddit.com/Bitcoin/comments/6dzpew/hi_its_mkwia_again_maintainer_of_uasfbitcoin_on/
p2sh was a uasf (see the sketch after this list) https://github.com/bitcoin/bitcoin/blob/v0.6.0/src/main.cpp#L1281-L1283
jgarzik annoyed at the strict timeline that segwit2x has to follow because of bip148 https://twitter.com/jgarzik/status/886605836902162432
Committed intolerant minority https://www.reddit.com/Bitcoin/comments/6d7dyt/a_plea_for_rational_intolerance_extremism_and/
alp on the game theory of the intolerant minority https://medium.com/@alpalpalp/user-activated-soft-forks-and-the-intolerant-minority-a54e57869f57
The risk of UASF is less than the cost of doing nothing https://www.reddit.com/Bitcoin/comments/6bof7a/were_getting_to_the_point_where_a_the_cost_of_not/
uasf delivered the goods for bitcoin: it forced antpool and others to signal (May 2016) https://bitcoinmagazine.com/articles/antpool-will-not-run-segwit-without-block-size-increase-hard-fork-1464028753/ "When asked specifically whether Antpool would run SegWit code without a hard fork increase in the block size also included in a release of Bitcoin Core, Wu responded: “No. It is acceptable that the hard fork code is not activated, but it needs to be included in a ‘release’ of Bitcoin Core. I have made it clear about the definition of ‘release,’ which is not ‘public.’”"
Screenshot of peter rizun capitulating https://twitter.com/chris_belcher_/status/905231603991007232
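Since p2sh-as-a-UASF comes up in the list above, here is a schematic Python sketch of what such a flag-day activation check looks like. The structure mirrors the linked C++ lines; treat it as an illustration rather than an exact transcription:

```python
# P2SH (BIP16) switched on at a fixed timestamp: blocks mined at or
# after the flag day are validated under the new, stricter rules by
# every upgraded node, regardless of what miners signal.
P2SH_SWITCH_TIME = 1333238400  # 1 April 2012 00:00 UTC

def enforce_p2sh_rules(block_time: int) -> bool:
    """True if a block at this timestamp must satisfy P2SH rules."""
    return block_time >= P2SH_SWITCH_TIME

assert not enforce_p2sh_rules(1333238399)
assert enforce_p2sh_rules(1333238400)
```

This is the essence of a user-activated soft fork: nodes simply start enforcing the tighter rule set at a known point, and miners who ignore it produce blocks those nodes reject.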

Fighting off 2x HF

https://twitter.com/MrHodl/status/895089909723049984
https://www.reddit.com/Bitcoin/comments/6h612o/can_someone_explain_to_me_why_core_wont_endorse/?st=j6ic5n17&sh=cc37ee23
https://www.reddit.com/Bitcoin/comments/6smezz/segwit2x_hard_fork_is_completely_useless_its_a/?st=j6ic2aw3&sh=371418dd
https://www.reddit.com/Bitcoin/comments/6sbspv/who_exactly_is_segwit2x_catering_for_now_segwit/?st=j6ic5nic&sh=1f86cadd
https://medium.com/@elliotolds/lesser-known-reasons-to-keep-blocks-small-in-the-words-of-bitcoin-core-developers-44861968185e
b2x is most of all about firing core https://twitter.com/WhalePanda/status/912664487135760384
https://medium.com/@StopAndDecrypt/thats-not-bitcoin-this-is-bitcoin-95f05a6fd6c2

Misinformation / sockpuppets

https://www.reddit.com/Bitcoin/comments/6uqz6k/markets_update_bitcoin_cash_rallies_for_three/dlurbpx/
three year old account, only started posting today https://archive.is/3STjH
Why we should not hard fork after the UASF worked: https://www.reddit.com/Bitcoin/comments/6sl1qf/heres_why_we_should_not_hard_fork_in_a_few_months/

History

Good article that covers virtually all the important history https://bitcoinmagazine.com/articles/long-road-segwit-how-bitcoins-biggest-protocol-upgrade-became-reality/
Interesting post with some history pre-2015 https://btcmanager.com/the-long-history-of-the-fight-over-scaling-bitcoin/
The core scalability roadmap + my summary from 3/2017 https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-December/011865.html my summary https://www.reddit.com/Bitcoin/comments/5xa5fa/the_core_development_scalability_roadmap/
History from summer 2015 https://www.reddit.com/Bitcoin/comments/5xg7f8/the_origins_of_the_blocksize_debate/
Brief reminders of the ETC situation https://www.reddit.com/Bitcoin/comments/6nvlgo/simple_breakdown_of_bip91_its_simply_the_miners/dkcycrz/
Longer writeup of ethereum's TheDAO bailout fraud https://www.reddit.com/ethereumfraud/comments/6bgvqv/faq_what_exactly_is_the_fraud_in_ethereum/
Point that the bigblocker side is only blocking segwit to hold it hostage https://www.reddit.com/BitcoinMarkets/comments/5sqhcq/daily_discussion_wednesday_february_08_2017/ddi3ctv/?context=3
jonny1000's recall of the history of bitcoin https://www.reddit.com/Bitcoin/comments/6s34gg/rbtc_spreading_misinformation_in_rbitcoinmarkets/dl9wkfx/

Misc (mostly memes)

libbitcoin's Understanding Bitcoin series (another must read, most of it) https://github.com/libbitcoin/libbitcoin/wiki/Understanding-Bitcoin
github commit where satoshi added the block size limit https://www.reddit.com/Bitcoin/comments/63859l/github_commit_where_satoshi_added_the_block_size/
hard fork proposals from some core devs https://bitcoinhardforkresearch.github.io/
blockstream hasn't taken over the entire bitcoin core project https://www.reddit.com/Bitcoin/comments/622bjp/bitcoin_core_blockstream/
blockstream is one of the good guys https://www.reddit.com/Bitcoin/comments/6cttkh/its_happening_blockstream_opens_liquid_sidechain/dhxu4e
Forkers, we're not raising a single byte! Song lyrics by belcher https://gist.github.com/chris-belcher/7264cd6750a86f8b4a9a
Some stuff here along with that cool photoshopped poster https://medium.com/@jimmysong/bitcoin-realism-or-how-i-learned-to-stop-worrying-and-love-1mb-blocks-c191c35e74cb
Nice graphic https://twitter.com/RNR_0/status/871070843698380800
gmaxwell saying how he is probably responsible for the most privacy tech in bitcoin, while mike hearn screwed up privacy https://www.reddit.com/btc/comments/6azyme/hey_bu_wheres_your_testnet/dhiq3xo/?context=6
Fairly cool propaganda poster https://twitter.com/urbanarson/status/880476631583924225
btc tankman https://i.redd.it/gxjqenzpr27z.png https://twitter.com/DanDarkPill/status/853653168151986177
asicboost discovery meme https://twitter.com/allenscottoshi/status/849888189124947971
https://twitter.com/urbanarson/status/882020516521013250
gavin wanted to kill the bitcoin chain https://twitter.com/allenscottoshi/status/849888189124947971
stuff that btc believes https://www.reddit.com/Bitcoin/comments/6ld4a5/serious_is_the_rbtc_and_the_bu_crowd_a_joke_how/djszsqu/
after the segwit2x NYA was agreed, all the fee pressure disappeared; laurenmt found it had been artificial spam https://twitter.com/i/moments/885827802775396352
theymos saying why victory isn't inevitable https://www.reddit.com/Bitcoin/comments/6lmpll/explaining_why_big_blocks_are_bad/djvxv2o/
with ignorant enemies like these it's no wonder we won https://bitco.in/forum/threads/gold-collapsing-bitcoin-up.16/page-999 "So, once segwit2x activates, from that moment on it will require a coordinated fork to avoid the upcoming 'baked in' HF."
a positive effect of bcash: it made blockchain utxo spammers move away from bitcoin https://www.reddit.com/btc/comments/76lv0b/cryptograffitiinfo_now_accepts_bitcoin_cash/dof38gw/
summary of craig wright, jihan wu and roger ver's positions https://medium.com/@HjalmarPeters/the-big-blockers-bead6027deb2
Why is bitcoin so strong against attack?!?! (because we're motivated and awesome) https://www.reddit.com/btc/comments/64wo1h/bitcoin_unlimited_is_being_blocked_by_antivirus/dg5n00x/
what happened to #oldjeffgarzik https://www.reddit.com/Bitcoin/comments/6ufv5x/a_reminder_of_some_of_jeff_garziks_greatest/
big blockers fully deserve to lose every last bitcoin they ever had and more https://www.reddit.com/BitcoinMarkets/comments/756nxf/daily_discussion_monday_october_09_2017/do5ihqi/
gavinandresen brainstorming how to kill bitcoin with a 51% in a nasty way https://twitter.com/btcdrak/status/843914877542567937
Roger Ver as bitcoin Judas https://imgur.com/a/Rf1Pi
A bunch of tweets and memes celebrating UASF
https://twitter.com/shaolinfry/status/842457019286188032 | https://twitter.com/SatoshiLite/status/888335092560441345 | https://twitter.com/btcArtGallery/status/887485162925285377 | https://twitter.com/Beautyon_/status/888109901611802624 | https://twitter.com/Excellion/status/889211512966873088 | https://twitter.com/lopp/status/888200452197801984 | https://twitter.com/AlpacaSW/status/886988980524396544 | https://twitter.com/BashCo_/status/877253729531162624 | https://twitter.com/tdryja/status/865212300361379840 | https://twitter.com/Excellion/status/871179040157179904 | https://twitter.com/TraceMayer/status/849856343074902016 | https://twitter.com/TraceMayer/status/841855022640033792 | https://fs.bitcoinmagazine.com/img/images/Screen_Shot_2017-08-18_at_01.36.47.original.png
submitted by belcher_ to Bitcoin

Are we here because we agree with the 8MB blocks or because of censorship?

Just trying to wrap my head around what the crap is going on...and why this is happening.
submitted by youvebeenliedto to bitcoinxt

An Overview of What's Happening.

This is an overview for those who may have not been following events closely, so unfortunately a TL;DR is not possible. However, I have added headings to make navigation easier. Depending on your interest, you can navigate to: The Debate, Decision Time or How to Run a Node
Introduction
China is having some huge celebrations right now for their new year, which to China is like Christmas for us. So they're enjoying their time with their families and friends, enjoying their food and I should think they are enjoying some wine. I wish them all a very happy new year and may it bring them good health and good fortune.
Hopefully they'll have some time during their holidays to consider the matters summarized here.
The Debate
As you know, we have been having a debate for about three years now - and a massive debate for the past 10 months - on the question of whether onchain transaction capacity should increase. After two conferences, then a third in Miami, then a fourth in China, a wide economic and mining consensus has been achieved: almost all prominent bitcoin companies support classic, 54% of real hashing power has expressed clear support for classic so far, 80% of users support it by most polls, and renowned users such as Roger Ver and "loaded", who between them hold some 340k btc, have expressed support for classic.
We do not however have 100% agreement. Some developers have maintained for quite some time that onchain capacity should not be increased or should be increased by the smallest amount that is absolutely necessary and that most, if not almost all, transactions should instead happen on layer2 settlement systems.
Chief amongst them is Gregory Maxwell who has maintained the same view over the past 3 years and has formed a company with a number of other developers to provide lightning and other products to operate as the settlement system they envision.
Another developer who has maintained the same view is Peter Todd, who argued in a propaganda video some three years ago that increasing onchain scalability means nodes will have to run in datacenters, which affects decentralization; therefore everyone should pay $100 or $1000 per onchain transaction instead, with most transactions going through centralized hubs which do not employ proof of work (commentary: no proof of work - no decentralization).
Gavin, Jeff Garzik, many other developers, academics, and almost all wallet developers take a different view. They argue that bitcoin can remain decentralized while scaling onchain. I would add two points to their argument. Bitcoin is efficient because it automates away man. Software is very cheap; employees - the tens of thousands that Visa has and the millions that banks have, including their rented space etc. - are very, very expensive. Calculations have been made showing that running a node, even if blocks were full right now at 20MB per block, would cost only $100 per year. I'd argue that is a tiny price to pay for the huge benefits that onchain scalability provides. And even if it costs $1k or $10k, it should be the node operator who pays the market price for the extra security that his node provides, rather than asking users to pay $100 or $1k per transaction instead.
A node provides indirect, but in some circumstances necessary, value to businesses, miners, researchers, hobbyists and whoever wants to run one. Asking 5k nodes to pay $1k each makes far more sense than asking 200k transactions a day to pay $100 or $1k per transaction, and it is of course far cheaper too.
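For what it's worth, here is a rough back-of-envelope version of the cost calculation above in Python; the unit prices and the 8-peer relay factor are illustrative assumptions, not measured figures:

```python
# Rough yearly resource estimate for a full node, assuming every
# block is full at 20 MB. All unit prices below are assumptions.
BLOCK_MB = 20          # hypothetical block size
BLOCKS_PER_DAY = 144   # one block every ~10 minutes
RELAY_PEERS = 8        # each block also gets uploaded to ~8 peers

download_tb = BLOCK_MB * BLOCKS_PER_DAY * 365 / 1e6   # ~1.05 TB/yr
upload_tb = download_tb * RELAY_PEERS                  # ~8.4 TB/yr

COST_PER_TB = 2.0      # assumed $/TB of transfer
DISK_PER_TB = 25.0     # assumed $/TB-year of storage

yearly = (download_tb + upload_tb) * COST_PER_TB + download_tb * DISK_PER_TB
print(f"~{download_tb:.1f} TB down, ~{upload_tb:.1f} TB up, ~${yearly:.0f}/yr")
```

Numbers in that ballpark support the order-of-magnitude claim above; plug in your own prices.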
However, after all the conferences, after all the debates, after all the never ending arguments, they retain their position. Therefore, developer agreement is not possible.
Decision Time
As such, the decision has now been opened to everyone, to decide one question alone: whether you agree with Satoshi's vision that we should scale onchain - and thus support classic - or whether you agree with Gregory's vision of a settlement system, and thus keep running a core node.
The way this works is in stages. Nodes have to upgrade first, so that everyone has the time and opportunity to get onto the new rules. This is extremely important because old nodes (including core nodes) won't be able to work anymore if classic succeeds. Therefore the economy (which includes businesses, miners and individuals) has to be given the opportunity to upgrade first, with miners voting second through the blocks they mine.
An indirect result of the upgrade process is that it gives an indication of where the sentiment and public opinion stands. Of course, this can be played, so it is not a perfect indication, but there are ways of telling when this indication is played to a significant degree. I'd rather not go into the details just as google doesn't go into the details of how it ranks pages, but this is an important time for bitcoin and the world so I would ask all who have the true interest of bitcoin at heart to refrain from any immature behavior so that we can learn where the community stands, giving the losing side, whichever it may be, the opportunity to accept the will of the people, and the results, thus allowing us to move forward as one.
I'll give some instructions on how to run or upgrade a node in a way that can be counted. First I want to say that you should run a classic node for many reasons, but chiefly three. It implements Satoshi's vision, the vision of that genius who knows how bitcoin works best and who with that knowledge designed it to scale considerably.
The second is in regards to our community. Many individuals, and not just Theymos, have engaged in censorship and banning in almost all the main communication channels, including the Hong Kong conference, the mailing list, the IRC channels and of course on bitcoin. I hope you agree with me that it is extremely important for us to show fundamental disapproval with such unprincipled and immoral behavior by running classic, and not reward them for it.
Thirdly, because cheap, fast and convenient is an incredibly important competitive quality of bitcoin. I would argue that the only reason why the mainstream media and silicon valley is abuzz about bitcoin is because it can save billions in fees and other ways such as making value transfer faster and more convenient. I want to change the world and offer to all bitcoin's supreme qualities of an open, permissionless, decentralized, global ledger, with fast if not almost instant and very cheap transactions, while having a gold like retention of value.
We know this works. It has been working for 7 years and works right now (although blocks are full making it not work as well as it can). Let's keep it working by upgrading to welcome the new wave of people and companies and miners.
How to run a node:
If you are not already running a node, you can download classic here: https://github.com/bitcoinclassic/bitcoinclassic/releases/tag/v0.11.2.cl1.b2
For windows, it is as easy as downloading any normal software. You just have to wait for it to synchronize and you are good to go.
You'll then need to open port 8333 for your node to be counted https://bitcoin.org/en/full-node#daemon-peer-info
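If you want to verify the port yourself, a minimal Python reachability check looks like this (run it from outside your own network; the IP below is a placeholder, not a real node):

```python
import socket

def port_open(host: str, port: int = 8333, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Replace with your node's public IP address (placeholder shown).
print(port_open("203.0.113.5"))
```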
If you already run a node you can easily upgrade without having to re-download the whole chain.
You can use cloud servers as well if you wish. Cloud nodes do still provide all the functionalities, but the cloud admin may have some control over your node, so it is best you run it on hardware you control. If nonetheless you still wish to use cloud, it is best that you do so through your own account rather than through someone else.
I think that covers all the ground. Obviously the above overview necessarily reflects my own views, but nullc and petertodd are welcome to express their views at this important time, and I hope they do not get downvoted simply because we disagree with them, but are allowed to have their say.
submitted by aquentin to btc

My username ydtm refers to a foundational principle behind Bitcoin: You Do The Math. Regarding the Craig Wright spectacle, I must say that Theymos and Luke-Jr are the ones who best reflect this idea of "you do the math" - while Gavin's blog post and comments (for whatever mysterious reasons) do not.

Craig Wright just performed a public spectacle, not a mathematical proof
It is totally irrelevant whether someone, anyone - be it Gavin or Satoshi or Galileo or the Pope - writes some blog post saying they personally "witnessed the keys signed and then verified on a clean computer that could not have been tampered with". [emphasis added]
Even the further detail which Gavin provides here...
https://np.reddit.com/btc/comments/4hfyyo/gavin_can_you_please_detail_all_parts_of_the/d2plygg
...might be interesting from a sociological perspective, but from the perspective of mathematics, it is utterly meaningless.
People who know my post history know that I have supported Gavin's approach for "simpler and safer scaling now" via things like bigger blocks and Classic - and I have vehemently criticized Theymos for being tyrannical and Luke-Jr for being doctrinaire.
But regarding Craig Wright's extraordinary claims and his unorthodox methods for supposedly "proving" them, Gavin is wrong (for believing them - or, more precisely, for expecting us to believe his hearsay testimony about them) and people like Theymos and Luke-Jr - as well as many other people on these threads - are right (for questioning or simply ignoring Craig's claims and "demos").
This little demo in London is not, and has never been, the way a mathematical proof is done.
And, frankly, aside from any particular details of this so-called irrelevant pseudo-"proof", it is shocking that Gavin does not know this basic underlying fact about the methods of mathematics - which go back for centuries, long before we started doing mathematics with the assistance of electronic computing machines.
Someone (in this case, Gavin) talking about having witnessed some pixels on a screen driven by a heap of metal and silicon stirring "a vast sea of binary soup" on a machine running a von Neumann architecture manufactured by Intel or AMD using some "funky OpenSSL procedure" is not and has never been "mathematical proof" - and it is shocking that Gavin suddenly seems to have forgotten this well-known mathematical fact.
Gavin may "believe" that he "witnessed" a mathematical proof. And it's fine for him to write about this on his blog. But he should not present his witnessing and his blogging as some kind of "mathematical proof" for the rest of us.
Because (as many of us might remember from our high school geometry classes): mathematical proof is not and cannot be provided by a mere human witness or blog report or reddit comment.
As we all know, a person or a comment might talk about a proof, and might even (for convenience) provide a reference or link to the proof itself - so that we could all reproduce it.
But the "proof itself" must involve a publicly available method or algorithm which any interested party can access and repeat / reproduce on their own, to their own satisfaction.
And it would be shocking and appalling for someone who supposedly knows a bit about math (Gavin) to not understand this basic fact about mathematics. I don't know what the hell happened to Gavin here, but this sure is yet one more fascinating event in the ongoing drama of Bitcoin!
Proofs vs politics
If you're inclined towards tinfoil theories, then what we're seeing could also possibly be interpreted as an economic or political event (ie: a stunt?), when we remember that the proposition whose truth is supposedly being "proven" in this case happens to involve:
There could be enough drama and mystery here for us to engage in wild speculation and theorizing until the last Bitcoin is mined. But that's not what this post is mainly about. This post is about proof.
Everyone who took high school geometry knows what a "proof" is
As we know, a mathematical proof, unlike a political stunt or a public spectacle, is essentially an abstract artifact (in math it's often called a "theorem" or an "assertion") in association with one or more concrete (but, most importantly: public and reproducible) "realizations" or implementations demonstrating the "truth" of that theorem or assertion. By the way, all these various realizations or implementations (or proofs) are in some sense equivalent - even if they might happen to use different "languages" or formats.
The important thing of course is that a proof must be arbitrarily reproducible by anyone, using their own methods and tools - and hardware!
For example, some people might prefer to go through the steps of a proof on a laptop using a library written in C++ or Python, others might use the Coq theorem prover, and others might use pen-and-paper. Some people might use an algebraic approach, others might use a geometric approach, etc.
But the point is: a proof is just an abstract idea (theorem or assertion) plus the concrete implementation(s) which demonstrate its "truth" - with the implementation(s) getting done again and again, by anyone in the public who wants to - not just once on some laptop during some event in London before some hand-picked witness who got specially flown in for the occasion.
A "proof" must be public, repeatable, and reproducible
A proof is something you (can and should) do yourself.
You. The public. Everyone in their own way, using their own language, to repeatedly prove the same proposition, in their own way, to their own satisfaction, on their own device.
In order to verify that 3² + 4² = 5² you don't rely on a blog post from some guy who got flown to London and who personally "witnessed" it.
You prove it yourself, whichever way you know best - using a calculator, your laptop, your smartphone, chalk on a blackboard, pen and ink on back of a cocktail napkin, or scribbles drawn in the sand.
Or, in another situation, to verify that some software you downloaded is authentic, you grab some public keys off servers and you run some code to check some signatures, while of course taking reasonable steps to avoid man-in-the-middle attacks, ensure that your computer is virus-free, etc.
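In the spirit of doing the math yourself, here is a minimal Python sketch of both kinds of self-verification; the filename and digest are placeholders, to be compared against whatever the project actually publishes:

```python
import hashlib

# Verify the arithmetic identity directly -- no witness required.
assert 3**2 + 4**2 == 5**2

def sha256_of(path: str) -> str:
    """SHA-256 digest of a file, read in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Placeholders: use the real filename and the digest published in the
# project's signed checksum file.
print(sha256_of("downloaded-release.tar.gz") == "published_digest_here")
```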
You Do The Math
What you do not do is "believe" the mathematical proof of the Pythagorean Theorem or the Quadratic Formula or someone's cryptographic signature simply because some well-known guy got flown to London and personally "witnessed" it "on a clean computer that could not have been tampered with".
Also, by the way, that "well-known guy" should be very careful how he writes about what he "witnessed":
Those are just reports of something that Gavin says he saw. Mildly interesting - but mathematically irrelevant.
And it is very strange that Gavin is even posting them. You would think that he has enough mathematical background to know that such communications, devoid of reproducible results, are meaningless.
We all need to be able to repeat the proof ourselves
Sometimes, if a proof involves lots of details or some tricky concepts, we could alternatively watch someone else do it - but there can't be anything "up their sleeve". They have to "show all their work" - to us.
For example, you can watch some Khan Academy YouTube videos that provide nice, easy-to-follow proofs of the Pythagorean Theorem or the Quadratic Formula.
These videos are quite satisfyingly convincing. For example, to prove the Pythagorean Theorem, they use a geometric approach where they break up a triangle into chunks, and then they move the chunks around to reposition them, so you, me, anyone, can literally (geometrically) see, and "prove", that a² + b² = c² (where a and b are the "legs" and c is the "hypotenuse" of a right triangle).
In this approach, we are verifying everything ourselves. There is nothing "hidden" - there is nothing that even could be tampered with behind the scenes: the little triangles are all there in front of us. Those proofs on the Khan Academy YouTube channel are done in chalk before our eyes. Not behind the scenes, spitting out some result, on some computer that we merely believe "could not have been tampered with".
So, the essence of the meaning of "proof" is that anyone who is interested must be able to conceptually go through the actual steps themselves - it's not about taking someone else's word for it.
Proof, like Bitcoin itself, is permissionless
"Proof" isn't about doing something behind a curtain (or on a chip on a computer in London) for a specially chosen audience.
"Proof" is about me and you and anyone else being able to repeat and reproduce the results ourselves.
Maybe Gavin himself did indeed "see" something, and as far as that goes, it's fine - for him. And of course he's entitled to write a post expressing his opinions and beliefs.
But that has nothing to do with mathematical proof for us, and it would be crazy (and very un-mathematical) of him to expect us to give any mathematical weight to his personal experiences and opinions and beliefs as expressed on his blog or in his comments.
Real mathematicians and programmers (and, presumably, Satoshi) already know all this
All over these subs, many people are saying that if Craig Wright wants to prove that he is Satoshi, then he should simply follow the standard procedures for proving this (from mathematics and public-key cryptography). And if not, GTFO.
And they're absolutely right.
Satoshi Nakamoto certainly knows the standard procedures and requirements of science and mathematics and public-key cryptography - and none of them have been followed in this weird farce: most importantly, the requirements that scientific and mathematical proof must be based on a permissionless, repeatable, reproducible procedure (and not some private performance).
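That standard procedure is ordinary public-key signature verification, which anyone can reproduce. As a generic sketch using the third-party Python ecdsa package (plain ECDSA over secp256k1, not Bitcoin's exact signmessage encoding):

```python
from ecdsa import SigningKey, SECP256k1  # pip install ecdsa

# The claimant signs a fresh message chosen by the verifiers with the
# private key in question (a stand-in key is generated here for the sketch)...
sk = SigningKey.generate(curve=SECP256k1)
message = b"text chosen publicly by the verifiers, not the claimant"
signature = sk.sign(message)

# ...and then anyone, anywhere, can check the signature against the
# already-known public key. No witnesses, no London demo required.
vk = sk.get_verifying_key()
assert vk.verify(signature, message)  # raises BadSignatureError if forged
```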
A bizarre episode
Maybe eventually we'll get to the bottom of all the fascinating social or political or economic details behind this bizarre episode.
And if Bitcoin does turn out to be anti-fragile the way many of us believe, then hopefully someday we all might be able to look back on this strange day as yet another twist in the history of Bitcoin and cryptocurrency.
Bitcoin is about trusting math, not humans
I have no idea what's going on with Gavin. The fact that someone so central to Bitcoin development (and so prominent on one side of the scaling debates) has gotten involved with this whole weird Craig Wright spectacle is, shall we say, "very interesting" - and could be the basis for any number of wild speculative theories.
My own (admittedly somewhat tinfoil) theory would be that, even though we don't know what specifically is happening here, we can at least take this as one more suggestive indication that certain people seem to be trying very hard to do various things to the publicly visible developers of Bitcoin. Many devs seem to have been "neutralized" in various ways - whether co-opted by a corporation (like most of the Core devs now at Blockstream), or ostracized and hounded into rage-quitting (like Mike Hearn), or now (apparently) publicly duped and discredited (like Gavin).
Meanwhile, right now I'm just happy that people like Theymos and Luke-Jr (both of whom I've vehemently disagreed with in the past) - as well as many other people on these threads - understand and insist that the only way you can prove something in Bitcoin is if "you do the math" yourself.
submitted by ydtm to btc

Sold everything and you should too!

Who I am: Engineer w/ MBA in Finance who has studied the subject for hours every day over the last 3 years. Furthermore, I just liquidated my $100,000 USD Bitcoin position.
I honestly believe that meaningful change cannot happen until there is a massive price drop. Change is not going to happen until the pain of sticking to the status quo is greater than the perceived pain of changing. Bitcoin's current path is toxic and will likely lead to it permanently losing its market leadership position. Bitcoin in its current form and capacity has no chance of revolutionizing anything or making a difference for people of the world.
Problems:
As a result of the aforementioned points, I've sold my holdings: the fundamentals do not justify the current price, nor do they justify a higher price in the future. Realistically, fundamentals point to a continued decline as the negative fundamentals of the business and investment environment start to kick in. I feel that the only reason Bitcoin could go up further is blind speculation, or Classic being adopted. Since I can't foresee and bank on further speculation causing the price to detach from the fundamentals, I'll have to discount that chance and focus on the event of a BitcoinClassic hardfork. If Bitcoin forks to BitcoinClassic in the near future, before Bitcoin loses its dominant position, I'll jump back in during the 28 day grace period. If it forks after losing its dominant position, I'll never touch it again. Last but not least, exiting Bitcoin only puts more pressure on the system to bring about change.
TL/DR: Sold $100,000 USD in BTC. Bitcoin is screwed on its current path and only a price drop will bring about change. If bitcoin forks I'll jump back in, but only if it's still in a market leader position. Exiting Bitcoin creates the catalyst to bring about change.
Edit/Disclaimer: I'm not sitting in USD. I'm now sitting in another alt coin which I have no intention of discussing.
submitted by Paperempire1 to btc

BlockTorrent: The famous algorithm which BitTorrent uses for SHARING BIG FILES. Which you probably thought Bitcoin *also* uses for SHARING NEW BLOCKS (which are also getting kinda BIG). But Bitcoin *doesn't* torrent *new* blocks (while relaying). It only torrents *old* blocks (while sync-ing). Why?

This post is being provided to further disseminate an existing proposal:
This proposal was originally presented by jtoomim back in September of 2015 - on the bitcoin_dev mailing list (full text at the end of this OP), and on reddit:
https://np.reddit.com/btc/comments/3zo72i/fyi_ujtoomim_is_working_on_a_scaling_proposal/cyomgj3
Here's a TL;DR, in his words:
BlockTorrenting
For initial block sync, [Bitcoin] sort of works [like BitTorrent] already.
You download a different block from each peer. That's fine.
However, a mechanism does not currently exist for downloading a portion of each [new] block from a different peer.
That's what I want to add.
~ jtoomim
The more detailed version of this "BlockTorrenting" proposal (as presented by jtoomim on the bitcoin_dev mailing list) is linked and copied / reformatted at the end of this OP.
Meanwhile here are some observations from me as a concerned member of the Bitcoin-using public.
Questions:
Whoa??
WTF???
Bitcoin doesn't do this kind of "blocktorrenting" already??
But.. But... I thought Bitcoin was "p2p" and "based on BitTorrent"...
... because (as we all know) Bitcoin has to download giant files.
Oh...
Bitcoin only "torrents" when sharing one certain kind of really big file: the existing blockchain, when a node is being initialized.
But Bitcoin doesn't "torrent" when sharing another certain kind of moderately big file (a file whose size, by the way, has been notoriously and steadily growing over the years to the point where the system running the legacy "Core"/Blockstream Bitcoin implementation is starting to become dangerously congested - no matter what some delusional clowns - er, "Core" devs - may say): ie, the world's most wildly popular, industrial-strength "p2p file sharing algorithm" is mysteriously not being used where the Bitcoin network needs it the most in order to get transactions confirmed on-chain: when a newly found block needs to be shared among nodes, ie when a node is relaying new blocks.
https://np.reddit.com/Bitcoin+bitcoinxt+bitcoin_uncensored+btc+bitcoin_classic/search?q=blocktorrent&restrict_sr=on
How many of you (honestly) just simply assumed that this algorithm was already being used in Bitcoin - since we've all been told that "Bitcoin is p2p, like BitTorrent"?
As it turns out - the only part of Bitcoin which has been p2p up until now is the "sync-ing a new full-node" part.
The "running an existing full-node" part of Bitcoin has never been implemented as truly "p2p2" yet!!!1!!!
And this is precisely the part of the system that we've been wasting all of our time (and destroying the community) fighting over for the past few months - because the so-called "experts" from the legacy "Core"/Blockstream Bitcoin implementation ignored this proposal!
Why?
Why have all the so-called "experts" at "Core"/Blockstream ignored this obvious well-known effective & popular & tested & successful algorithm for doing "blocktorrenting" to torrent each new block being relayed?
Why have the "Core"/Blockstream devs failed to p2p-ize the most central, fundamental networking aspect of Bitcoin - the part where blocks get propagated, the part we've been fighting about for the past few years?
This algorithm for "torrenting" a big file in parallel from peers is the very definition of "p2p".
It "surgically" attacks the whole problem of sharing big files in the most elegant and efficient way possible: right at the lowest level of the bottleneck itself, cleverly chunking a file and uploading it in parallel to multiple peers.
Everyone knows torrenting works. Why isn't Bitcoin using it for its new blocks?
As millions of torrenters already know (but evidently all the so-called "experts" at Core/Blockstream seem to have conveniently forgotten), "torrenting" a file (breaking a file into chunks and then offering a different chunk to each peer to "get it out to everyone fast" - before your particular node even has the entire file) is such a well-known / feasible / obvious / accepted / battle-tested / highly efficient algorithm for "parallelizing" (and thereby significantly accelerating) the sharing of big files among peers, that many people simply assumed that Bitcoin had already been doing this kind of "torrenting of new-blocks" these whole past 7 years.
But Bitcoin doesn't do this - yet!
None of the Core/Blockstream devs (and the Chinese miners who follow them) have prioritized p2p-izing the most central and most vital and most resource-consuming function of the Bitcoin network - the propagation of new blocks!
Maybe it took someone who's both a miner and a dev to "scratch" this particular "itch": Jonathan Toomim jtoomim.
  • A miner + dev who gets very little attention / respect from the Core/Blockstream devs (and from the Chinese miners who follow them) - perhaps because they feel threatened by a competing implementation?
  • A miner + dev who may have come up with the simplest and safest and most effective algorithmic (ie, software-based, not hardware-consuming) scaling proposal of anyone!
  • A dev who is not paid by Blockstream, and who is therefore free from the secret, undisclosed corporate restraints / confidentiality agreements imposed by the shadowy fiat venture-capitalists and legacy power elite who appear to be attempting to cripple our code and muzzle our devs.
  • A miner who has the dignity not to let himself be forced into signing a loyalty oath to any corporate overlords after being locked in a room until 3 AM.
Precisely because jtoomim is both an independent miner and an independent dev...
  • He knows what needs to be done.
  • He knows how to do it.
  • He is free to go ahead and do it - in a permissionless, decentralized fashion.
Possible bonus: The "blocktorrent" algorithm would help the most in the upload direction - which is precisely where Bitcoin scaling needs the most help!
Consider the "upload" direction for a relatively slow full-node - such as Luke-Jr, who reports that his internet is so slow, he has not been able to run a full-node since mid-2015.
The upload direction is the direction which everyone says has been the biggest problem with Bitcoin - because, in order for a full-node to be "useful" to the network:
  • it has to be able to upload a new block to (at least) 8 peers,
  • which places (at least) 8x more "demand" on the full-node's upload bandwidth.
The brilliant, simple proposed "blocktorrent" algorithm from jtoomim (already proven to work with Bram Cohen's BitTorrent protocol, and also already proven to work for initial sync-ing of Bitcoin full-nodes - but still un-implemented for ongoing relaying among full-nodes) looks like it would provide a significant performance improvement precisely at this tightest "bottleneck" in the system, the crucial central nexus where most of the "traffic" (and the congestion) is happening: the relaying of new blocks from "slower" full-nodes.
The detailed explanation for how this helps "slower" nodes when uploading, is as follows.
Say you are a "slower" node.
You need to send a new block out to (at least) 8 peers - but your "upload" bandwidth is really slow.
If you were to split the file into (at least) 8 "chunks", and then upload a different one of these (at least) 8 "chunks" to each of your (at least) 8 peers - then (if you were using "blocktorrenting") it would only take you 1/8 (or less) of the "normal" time to do this (compared to the naïve legacy "Core" algorithm).
Now the new block which your "slower" node was attempting to upload is already "out there" - in 1/8 (or less) of the "normal" time compared to the naïve legacy "Core" algorithm.[ 1 ]
... [ 1 ] There will of course also be a tiny amount of extra overhead involved due to the "housekeeping" performed by the "blocktorrent" algorithm itself - involving some additional processing and communicating to decompose the block into chunks, to organize the relaying of different chunks to different peers, and then to recompose the chunks into a block again (all of which, depending on the size of the block and the latency of your node's connections to its peers, would in most cases be negligible compared to the much-greater speed-up provided by the "blocktorrent" algorithm itself).
Now that your block is "out there" at those 8 (or more) peer nodes to whom you just blocktorrented it in 1/8 (or less) of the time - it has now been liberated from the "bottleneck" of your "slower" node.
In fact, its further propagation across the net may now be able to leverage much faster upload speeds from some other node(s) which have "blocktorrent"-downloaded it in pieces from you (and other peers) - and which might be faster relaying it along, than your "slower" node.
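To put rough numbers on that 1/8 claim (a back-of-the-envelope sketch - the block size, bandwidth and peer count below are illustrative assumptions, not measurements):

    # Back-of-the-envelope: how long a "slower" node spends pushing one new block.
    # All numbers are illustrative assumptions.
    BLOCK_MB    = 1.0   # size of the newly found block
    UPLOAD_MBPS = 1.0   # the slow node's upload bandwidth (megabits per second)
    PEERS       = 8     # minimum peers a "useful" full-node must relay to

    block_megabits = BLOCK_MB * 8

    # Legacy relay: the full block goes to all 8 peers over the one slow uplink.
    legacy_seconds = block_megabits * PEERS / UPLOAD_MBPS    # 64 seconds

    # "Blocktorrent" relay: one distinct 1/8th chunk per peer; the peers then
    # trade the chunks among themselves, off the slow node's uplink entirely.
    torrent_seconds = block_megabits / UPLOAD_MBPS           # 8 seconds

The "housekeeping" overhead from footnote [ 1 ] gets added on top of the 8 seconds, but the 8x factor is the point.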
For some mysterious reason, the legacy Bitcoin implementation from "Core"/Blockstream has not been doing this kind of "blocktorrenting" for new blocks.
It's only been doing this torrenting for old blocks. The blocks that have already been confirmed.
Which is fine.
But we also obviously need this sort of "torrenting" to be done for each new block as it is being confirmed.
And this is where the entire friggin' "scaling bottleneck" is occurring, which we just wasted the past few years "debating" about.
Just sit down and think about this for a minute.
We've had all these so-called "experts" (Core/Blockstream devs and other small-block proponents) telling us for years that guys like Hearn and Gavin and repos like Classic and XT and BU were "wrong" or at least "unserious" because they "merely" proposed "brute-force" scaling: ie, scaling which would simply place more demands on finite resources (specifically: on the upload bandwidth from full-nodes - who need to relay to at least 8 peer full-nodes in order to be considered "useful" to the network).
These "experts" have been beating us over the head this whole time, telling us that we have to figure out some (really complicated, unproven, inefficient and centralized) clever scaling algorithms to squeeze more efficiency out of existing infrastructure.
And here is the most well-known / feasible / obvious / accepted / battle-tested algorithm for "parallelizing" (and thereby massively accelerating) the sharing of big files among peers - the BitTorrent algorithm itself, the gold standard of p2p relaying par excellence, which has been a major success on the Internet for decades, at one point accounting for nearly 1/3 of all traffic on the Internet itself - and which is also already being used in one part of Bitcoin: during the phase of sync-ing a new node.
And apparently pretty much only jtoomim has been talking about using it for the actual relaying of new blocks - while Core/Blockstream devs have so far basically ignored this simple and safe and efficient proposal.
And then the small-block sycophants (reddit users or wannabe C/C++ programmers who have been beaten into submission by the FUD and "technological pessimism" of the Core/Blockstream devs, and by the censorship on their legacy forum) all "laugh" at Classic and proclaim "Bitcoin doesn't need another dev team - all the 'experts' are at Core / Blockstream"...
...when in fact it actually looks like jtoomim (an independent miner and dev, free from the propaganda and secret details of the corporate agenda of Core/Blockstream - who works on the Classic Bitcoin implementation) may have proposed the simplest and safest and most effective scaling algorithm in this whole debate.
By the way, his proposal estimates that we could get about one order of magnitude greater throughput, based on typical latency, a blocksize of around 20 MB, and bandwidth of around 8 Mbps (which seems like a pretty normal scenario).
So why the fuck isn't this being done yet?
This is such a well-known / feasible / obvious / accepted / battle-tested algorithm for "parallelizing" (and thereby significantly accelerating) the sharing of big files among peers:
  • It's already being used for the (currently) 65 gigabytes of "blocks in the existing blockchain" itself - the phase where a new node has to sync with the blockchain.
  • It's already being used in BitTorrent - although the BitTorrent protocol has been optimized more to maximize throughput, whereas it would probably be a good idea to optimize the BlockTorrent protocol to minimize latency (since avoiding orphans is the big issue here) - which I'm fairly sure should be quite doable.
This algorithm is so trivial / obvious / straightforward / feasible / well-known / proven that I (and probably many others) simply assumed that Bitcoin had been doing this all along!
But it has never been implemented.
There is, however, finally a post about it today on the score-hidden forum r/Bitcoin, from eragmus:
[bitcoin-dev] BlockTorrent: Torrent-style new-block propagation on Merkle trees
https://np.reddit.com/Bitcoin/comments/484nbx/bitcoindev_blocktorrent_torrentstyle_newblock/
And, predictably, the top-voted comment there is a comment telling us why it will never work.
And the comment after that comment is from the author of the proposal, jtoomim, explaining why it would work.
Score hidden on all those comments.
Because the immature tyrant theymos still doesn't understand the inherent advantages of people using reddit's upvoting & downvoting tools to hold decentralized, permissionless debates online.
Whatever.
Questions:
(1) Would this "BlockTorrenting" algorithm from jtoomim really work?
(2) If so, why hasn't it been implemented yet?
(3) Specifically: With all the "dev firepower" (and $76 million in venture capital) available at Core/Blockstream, why have they not prioritized implementing this simple and safe and highly effective solution?
(4) Even more specifically: Are there undisclosed strategies / agreements / restraints imposed by Blockstream financial investors on Bitcoin "Core" devs which have been preventing further discussion and eventual implementation of this possible simple & safe & efficient scaling solution?
Here is the more-detailed version of this proposal, presented by Jonathan Toomim jtoomim back in September of 2015 on the bitcoin-dev mailing list (and pretty much ignored for months by almost all the "experts" there):
https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-September/011176.html
As I understand it, the current block propagation algorithm is this:
  1. A node mines a block.
  2. It notifies its peers that it has a new block with an inv. Typical nodes have 8 peers.
  3. The peers respond that they have not seen it, and request the block with getdata [hash].
  4. The node sends out the block in parallel to all 8 peers simultaneously. If the node's upstream bandwidth is limiting, then all peers will receive most of the block before any peer receives all of the block. The block is sent out as the small header followed by a list of transactions.
  5. Once a peer completes the download, it verifies the block, then enters step 2.
(If I'm missing anything, please let me know.)
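Sidebar: here is that legacy flow condensed into pseudocode. "inv" and "getdata" are the real Bitcoin p2p message names; the node/peer objects and their send helpers are hypothetical glue, for illustration only.

    # Legacy block relay (steps 1-5 above), condensed. Hypothetical glue code.
    def announce(node, block):
        for peer in node.peers:                  # step 2: typically 8 peers
            peer.send("inv", block.hash)         # announce the new block by hash

    def on_inv(node, peer, block_hash):
        if block_hash not in node.known_blocks:
            peer.send("getdata", block_hash)     # step 3: request the full block

    def on_getdata(node, peer, block_hash):
        block = node.known_blocks[block_hash]
        peer.send("block", block)                # step 4: the whole block, to every
                                                 # requesting peer, over one uplink

    def on_block(node, block):
        if verify(block):                        # step 5: FULL verification first...
            announce(node, block)                # ...only then does any relaying start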
The main problem with this algorithm is that it requires a peer to have the full block before it does any uploading to other peers in the p2p mesh. This slows down block propagation to:
O( p • log_p(n) ) 
where:
  • n is the number of peers in the mesh,
  • p is the number of peers transmitted to simultaneously.
It's like the Napster era of file-sharing. We can do much better than this.
Bittorrent can be an example for us.
Bittorrent splits the file to be shared into a bunch of chunks, and hashes each chunk.
Downloaders (leeches) grab the list of hashes, then start requesting their peers for the chunks out-of-order.
As each leech completes a chunk and verifies it against the hash, it begins to share those chunks with other leeches.
Total propagation time for large files can be approximately equal to the transmission time for an FTP upload.
Sometimes it's significantly slower, but often it's actually faster due to less bottlenecking on a single connection and better resistance to packet/connection loss.
(This could be relevant for crossing the Chinese border, since the Great Firewall tends to produce random packet loss, especially on encrypted connections.)
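Sidebar: a toy model of that O( p • log_p(n) ) claim, with assumed (not measured) parameters, just to show the shape of the difference:

    import math

    # Toy model of network-wide block propagation time. Parameters are assumptions.
    n = 6000     # reachable full-nodes in the mesh
    p = 8        # peers each node uploads to
    S = 8.0      # block size in MB
    B = 20.0     # per-node upload bandwidth in Mbps

    copy_time = S * 8 / B                 # seconds to upload one full copy (3.2 s)
    hops = math.ceil(math.log(n, p))      # ~log_p(n) store-and-forward hops (5)

    # Store-and-forward: each hop finishes p full uploads before the next hop
    # can relay at all => O(p * log_p(n)) full-copy times.
    legacy = p * copy_time * hops                  # ~128 s

    # Chunked relay: verified chunks start flowing downstream immediately, so
    # total time approaches one full-copy time plus per-hop chunk housekeeping.
    chunk_overhead = 0.2                           # assumed seconds per hop
    torrent = copy_time + chunk_overhead * hops    # ~4.2 s

    print(f"legacy ~{legacy:.0f} s, torrent ~{torrent:.1f} s")

Which lands in the same ballpark as the order-of-magnitude estimate mentioned above.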
Bitcoin uses a data structure for transactions with hashes built-in. We can use that in lieu of Bittorrent's file chunks.
A Bittorrent-inspired algorithm might be something like this:
  1. (Optional steps to build a Merkle cache; described later)
  2. A seed node mines a block.
  3. It notifies its peers that it has a new block with an extended version of inv.
  4. The leech peers request the block header.
  5. The seed sends the block header. The leech code path splits into two.
  6. (a) The leeches verify the block header, including the PoW. If the header is valid,
  7. (a) They notify their peers that they have a header for an unverified new block with an extended version of inv, looping back to 2. above. If it is invalid, they abort thread (b).
  8. (b) The leeches request the Nth row (from the root) of the transaction Merkle tree, where N might typically be between 2 and 10. That corresponds to about 1/4th to 1/1024th of the transactions. The leeches also request a bitfield indicating which of the Merkle nodes the seed has leaves for. The seed supplies this (0xFFFF...).
  9. (b) The leeches calculate all parent node hashes in the Merkle tree, and verify that the root hash is as described in the header. (See the sketch just after this list.)
  10. The leeches search their Merkle hash cache to see if they have the leaves (transaction hashes and/or transactions) for that node already.
  11. The leeches send a bitfield request to the node indicating which Merkle nodes they want the leaves for.
  12. The seed responds by sending leaves (either txn hashes or full transactions, depending on benchmark results) to the leeches in whatever order it decides is optimal for the network.
  13. The leeches verify that the leaves hash into the ancestor node hashes that they already have.
  14. The leeches begin sharing leaves with each other.
  15. If the leaves are txn hashes, they check their cache for the actual transactions. If they are missing it, they request the txns with a getdata, or all of the txns they're missing (as a list) with a few batch getdatas.
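Sidebar: the heart of steps 8-9 is "check an entire row of the Merkle tree against the root committed to in the PoW-verified header". A toy sketch of just that step - Bitcoin really does use double-SHA256 and duplicates the last node on odd-length levels, but everything else here is simplified:

    import hashlib

    def dsha(b: bytes) -> bytes:
        # Bitcoin-style double SHA-256
        return hashlib.sha256(hashlib.sha256(b).digest()).digest()

    def row_matches_root(row: list, merkle_root: bytes) -> bool:
        # Given ALL node hashes at one depth of the transaction Merkle tree,
        # hash pairs upward and compare against the root from the block header.
        level = list(row)
        while len(level) > 1:
            if len(level) % 2:               # Bitcoin duplicates the last node
                level.append(level[-1])      # on odd-length levels
            level = [dsha(level[i] + level[i + 1])
                     for i in range(0, len(level), 2)]
        return level[0] == merkle_root

Once a row checks out, every hash in it becomes an independently verifiable "chunk handle": any leaves a peer later sends under that node can be hashed and checked against it before being relayed onward.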
Features and benefits
The main feature of this algorithm is that a leech will begin to upload chunks of data as soon as it gets them and confirms both PoW and hash/data integrity, instead of waiting for a full copy with full verification.
Inefficient cases, and mitigations
This algorithm is more complicated than the existing algorithm, and won't always be better in performance.
Because more round trip messages are required for negotiating the Merkle tree transfers, it will perform worse in situations where the bandwidth to ping latency ratio is high relative to the blocksize.
Specifically, the minimum per-hop latency will likely be higher.
This might be mitigated by reducing the number of round-trip messages needed to set up the BlockTorrent by using larger and more complex inv-like and getdata-like messages that preemptively send some data (e.g. block headers).
This would trade off latency for bandwidth overhead from larger duplicated inv messages.
Depending on implementation quality, the latency for the smallest block size might be the same between algorithms, or it might be 300% higher for the torrent algorithm.
For small blocks (perhaps < 100 kB), the BlockTorrent algorithm will likely be slightly slower.
Sidebar from the OP: So maybe this would discourage certain miners (cough Dow cough) from mining blocks that aren't full enough:
Why is [BTCC] limiting their block size to under 750 all of a sudden?
https://np.reddit.com/Bitcoin/comments/486o1u/why_is_bttc_limiting_their_block_size_to_unde

For large blocks (e.g. 8 MB over 20 Mbps), I expect the BlockTorrent algo will likely be around an order of magnitude faster in the worst case (adversarial) scenarios, in which none of the block's transactions are in the caches.

One of the big benefits of the BlockTorrent algorithm is that it provides several obvious and straightforward points for bandwidth saving and optimization by caching transactions and reconstructing the transaction order.

Future work: possible further optimizations
A cooperating miner [could] pre-announce Merkle subtrees with some of the transactions they are planning on including in the final block.
Other miners who see those subtrees [could] compare the transactions in those subtrees to the transaction sets they are mining with, and can rearrange their block prototypes to use the same subtrees as much as possible.
In the case of public pools supporting the getblocktemplate protocol, it might be possible to build Merkle subtree caches without the pool's help by having one or more nodes just scraping their getblocktemplate results.
Even if some transactions are inserted or deleted, it [might] be possible to guess a lot of the tree based on the previous ordering.
Once a block header and the first few rows of the Merkle tree [had] been published, they [would] propagate through the whole network, at which time full nodes might even be able to guess parts of the tree by searching through their txn and Merkle node/subtree caches.
That might be fun to think about, but probably not effective due to O(n^2) or worse scaling with transaction count.
Might be able to make it work if the whole network cooperates on it, but there are probably more important things to do.
Leveraging other features from BitTorrent
There are also a few other features of Bittorrent that would be useful here, like:
  • prioritizing uploads to different peers based on their upload capacity,
  • banning peers that submit data that doesn't hash to the right value.
Sidebar from the OP: Hmm...maybe that would be one way to deal with the DDoS-ing we're experiencing right now? I know the DDoSer is using a rotating list of proxies, but still it could be a quick-and-dirty way to mitigate against his attack.
DDoS started again. Have a nice day, guys :)
https://np.reddit.com/Bitcoin_Classic/comments/47zglz/ddos_started_again_have_a_nice_day_guys/d0gj13y
(It might be good if we could get Bram Cohen to help with the implementation.)
Using the existing BitTorrent algorithm as-is - versus tailoring a new algorithm optimized for Bitcoin
Another possible option would be to just treat the block as a file and literally Bittorrent it.
But I think that there should be enough benefits to integrating it with the existing bitcoin p2p connections and also with using bitcoind's transaction caches and Merkle tree caches to make a native implementation worthwhile.
Also, BitTorrent itself was designed to optimize more for bandwidth than for latency, so we will have slightly different goals and tradeoffs during implementation.
Concerns, possible attacks, mitigations, related work
One of the concerns that I initially had about this idea was that it would involve nodes forwarding unverified block data to other nodes.
At first, I thought this might be useful for a rogue miner or node who wanted to quickly waste the whole network's bandwidth.
However, in order to perform this attack, the rogue needs to construct a valid header with a valid PoW, but use a set of transactions that renders the block as a whole invalid in a manner that is difficult to detect without full verification.
Still, it will be difficult to design such an attack so that the damage in bandwidth used has a greater value than the 240 exahashes (and 25.1 BTC opportunity cost) associated with creating a valid header.
Related work: IBLT (Invertible Bloom Lookup Tables)
As I understand it, the O(1) IBLT approach requires that blocks follow strict rules (yet to be fully defined) about the transaction ordering.
If these are not followed, then it turns into sending a list of txn hashes, and separately ensuring that all of the txns in the new block are already in the recipient's mempool.
When mempools are very dissimilar, the IBLT approach performance degrades heavily and performance becomes worse than simply sending the raw block.
This could occur if a node just joined the network, during chain reorgs, or due to malicious selfish miners.
Also, if the mempool has a lot more transactions than are included in the block, the false positive rate for detecting whether a transaction already exists in another node's mempool might get high for otherwise reasonable bucket counts/sizes.
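Sidebar: for readers unfamiliar with IBLTs, here is a toy sketch of the mechanism and of the failure mode just described - not the actual proposal, and with assumed table sizes:

    import hashlib

    K, M = 3, 64                              # assumed: 3 hash functions, 64 cells

    def hs(key: int, salt: int) -> int:
        d = hashlib.sha256(f"{salt}:{key}".encode()).digest()
        return int.from_bytes(d[:8], "big")

    def new_table():
        return [[0, 0, 0] for _ in range(M)]  # each cell: [count, keyXor, hashXor]

    def insert(t, key: int):
        for i in range(K):
            cell = t[hs(key, i) % M]
            cell[0] += 1
            cell[1] ^= key
            cell[2] ^= hs(key, 99)            # checksum of the key

    def subtract(ta, tb):
        # Cell-wise difference: what survives encodes the symmetric difference.
        return [[a[0] - b[0], a[1] ^ b[1], a[2] ^ b[2]] for a, b in zip(ta, tb)]

    def decode(t):
        # Peel "pure" cells (holding exactly one key) until done - or give up.
        only_a, only_b = set(), set()
        progress = True
        while progress:
            progress = False
            for cell in t:
                cnt, kx, hx = cell
                if cnt in (1, -1) and hs(kx, 99) == hx:
                    (only_a if cnt == 1 else only_b).add(kx)
                    for i in range(K):        # remove the peeled key everywhere
                        c = t[hs(kx, i) % M]
                        c[0] -= cnt
                        c[1] ^= kx
                        c[2] ^= hs(kx, 99)
                    progress = True
        ok = all(c == [0, 0, 0] for c in t)   # False: difference too big to decode
        return only_a, only_b, ok

    # Two "mempools" (txids as ints) differing by only 4 keys decode fine:
    A, B = new_table(), new_table()
    for tx in range(1, 101): insert(A, tx)
    for tx in range(3, 103): insert(B, tx)
    print(decode(subtract(A, B)))             # ({1, 2}, {101, 102}, True)

Make the two sets very dissimilar, though, and ok comes back False - which is exactly the "worse than simply sending the raw block" regime described above.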
With the BlockTorrent approach, the focus is on transmitting the list of hashes in a manner that propagates as quickly as possible while still allowing methods for reducing the total bandwidth needed.
Remark
The BlockTorrent algorithm does not really address how the actual transaction data will be obtained because, once the leech has the list of txn hashes, the standard Bitcoin p2p protocol can supply them in a parallelized and decentralized manner.
Thoughts?
-jtoomim
submitted by ydtm to btc [link] [comments]

The history of the "opt-in" full RBF merged into core

1) Peter Todd (Bitcoin core developer, the author of the merged "opt-in" full RBF) convinced Chun Wang (co-owner and chief administrator of F2Pool) to adopt full RBF at F2Pool
Yesterday F2Pool, currently the largest pool with 21% of the hashing power, enabled full replace-by-fee (RBF) support after discussions with me.
Peter Todd Fri, 19 Jun 2015 03:41:51 -0700
2) Many got disappointed, including Gavin Andresen (the chief scientist at the Bitcoin Foundation and the former lead developer of Bitcoin core) and Adrian Macneil (Director of Engineering at Coinbase at that time):
Extremely disappointed to hear this. This change turns double spending from a calculable (and affordable) risk for merchant payment processors into certain profit for scammers, and provides no useful benefit for consumers.
Adrian Macneil, Fri, 19 Jun 2015 07:06:36 -0700
I was disappointed to see Peter Todd claiming that you have (or will?) run his replace-by-fee patch. I strongly encourage you to wait until most wallet software supports replace-by-fee before doing that, because until that happens replace-by-fee just makes it easier to steal from bitcoin-accepting merchants.
Gavin Andresen, Fri, 19 Jun 2015 06:35:14 -0700
3) F2Pool switched from full RBF to FSS (first-seen-safe) RBF
I know how bad the full RBF is. We are going to switch to FSS RBF in a few hours. Sorry.
Chun Wang, Fri, 19 Jun 2015 06:54:18 -0700
4) Founder and CEO of GreenAddress IT Limited tried to convince miners that full RBF is better for them
FSS RBF is better than no RBF but we think it is better to use full RBF. <..> And why would miner pick the option paying less when other miners run the option paying more? It may be soon more than 1-5% of block reward.
Lawrence Nahum, Fri, 19 Jun 2015 07:19:16 -0700 (Founder and CEO of GreenAddress IT Limited)
5) Jeff Garzik, a Bitcoin core developer, agreed however that FSS RBF is better
Yes, FSS RBF is far better.
Jeff Garzik Fri, 19 Jun 2015 08:45:18 -0700
6) After a long discussion (moderated by Theymos in order to remove "distraction" from "poor trolls"), the so-called "opt-in" full RBF got merged into Core.
Am I missing anything?
Oh, the weird "opt-in" part of it (that doesn't opt-in anything). Why did they merge it?
"Because there's too much FUD to get it merged without." - Luke-jr, Bitcoin core developer
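For reference, the "opt-in" mechanism that was merged (BIP 125) keys off the nSequence field: a transaction explicitly signals replaceability if at least one of its inputs carries an nSequence below 0xfffffffe. A minimal sketch - the tx/txin objects are hypothetical, and BIP 125's "inherited" signaling via unconfirmed ancestors is omitted:

    # BIP 125 explicit opt-in signaling, minimal sketch. Hypothetical tx objects.
    def signals_rbf(tx) -> bool:
        # Replaceable iff ANY input's nSequence is below 0xfffffffe.
        return any(txin.n_sequence < 0xfffffffe for txin in tx.inputs)

Note that it is the sender who sets nSequence - which is the substance of the complaint above that the "opt-in" part doesn't let a merchant opt out of anything.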
How will Bitcoin become mainstream with 1 MB blocks, high fees and lack of 0-conf transactions?
Bitcoin core developers think it is inefficient to let every transaction be stored in the blockchain forever (and it is true).
They envision Bitcoin not as a public payment system available for everybody, but as a settlement layer for other complex technical solutions, so they are optimizing it for them. I have nothing against complex technical solutions, I am looking forward to them.
But I am against freezing the temporary 1 MB block size limit (that was introduced back in 2010) and adopting RBF today, because these changes to bitcoin policy may harm the mainstream adoption of Bitcoin.
Bitcoin payment processors and merchants that are crucial for the adoption are likely to suffer and buyers are likely to get repelled by high fees and long confirmation times.
submitted by arsenische to Bitcoin [link] [comments]

Is anyone else freaked out by this whole blocksize debate? Does anyone else find themself often agreeing with *both* sides - depending on whichever argument you happen to be reading at the moment? And do we need some better algorithms and data structures?

Why do both sides of the debate seem “right” to me?
I know, I know, a healthy debate is healthy and all - and maybe I'm just not used to the tumult and jostling which would be inevitable in a real live open major debate about something as vital as Bitcoin.
And I really do agree with the starry-eyed idealists who say Bitcoin is vital. Imperfect as it may be, it certainly does seem to represent the first real chance we've had in the past few hundred years to try to steer our civilization and our planet away from the dead-ends and disasters which our government-issued debt-based currencies keep dragging us into.
But this particular debate, about the blocksize, doesn't seem to be getting resolved at all.
Pretty much every time I read one of the long-form major arguments contributed by Bitcoin "thinkers" who I've come to respect over the past few years, this weird thing happens: I usually end up finding myself nodding my head and agreeing with whatever particular piece I'm reading!
But that should be impossible - because a lot of these people vehemently disagree!
So how can both sides sound so convincing to me, simply depending on whichever piece I currently happen to be reading?
Does anyone else feel this way? Or am I just a gullible idiot?
Just Do It?
When you first look at it or hear about it, increasing the size seems almost like a no-brainer: The "big-block" supporters say just increase the blocksize to 20 MB or 8 MB, or do some kind of scheduled or calculated regular increment which tries to take into account the capabilities of the infrastructure and the needs of the users. We do have the bandwidth and the memory to at least increase the blocksize now, they say - and we're probably gonna continue to have more bandwidth and memory in order to be able to keep increasing the blocksize for another couple decades - pretty much like everything else computer-based we've seen over the years (some of this stuff is called by names such as "Moore's Law").
On the other hand, whenever the "small-block" supporters warn about the utter catastrophe that a failed hard-fork would mean, I get totally freaked by their possible doomsday scenarios, which seem totally plausible and terrifying. So I end up feeling that the only way I'd want to go with a hard-fork would be if there were some pre-agreed "triggering" mechanism, where the fork itself would only actually "switch on" and take effect provided that some "supermajority" of the network (of who? the miners? the full nodes?) had signaled (presumably via some kind of totally reliable p2p trustless software-based voting system?) that they do indeed "pre-agree" to actually adopt the pre-scheduled fork - thereby avoiding any possibility whatsoever of the precious blockchain somehow tragically splitting into two and pretty much killing this cryptocurrency off in its infancy.
So in this "conservative" scenario, I'm talking about wanting at least 95% pre-adoption agreement - not the mere 75% which I recall some proposals call for, which seems like it could easily lead to a 75/25 blockchain split.
But this time, with this long drawn-out blocksize debate, the core devs, and several other important voices who have become prominent opinion shapers over the past few years, can't seem to come to any real agreement on this.
Weird split among the devs
As far as I can see, there's this weird split: Gavin and Mike seem to be the only people among the devs who really want a major blocksize increase - and all the other devs seem to be vehemently against them.
But then on the other hand, the users seem to be overwhelmingly in favor of a major increase.
And there are meta-questions about governance, about why this didn't come out as a BIP, and about what the availability of Bitcoin XT means.
And today or yesterday there was this really cool big-blockian exponential graph based on doubling the blocksize every two years for twenty years, reminding us of the pure mathematical fact that 2^10 is indeed about 1000 - but not really addressing any of the game-theoretic points raised by the small-blockians. So a lot of the users seem to like it, but when so few devs say anything positive about it, I worry: is this just yet more exponential chart porn?
On the one hand, Gavin's and Mike's blocksize increase proposal initially seemed like a no-brainer to me.
And on the other hand, all the other devs seem to be against them. Which is weird - not what I'd initially expected at all (but maybe I'm just a fool who's seduced by exponential chart porn?).
Look, I don't mean to be rude to any of the core devs, and I don't want to come off like someone wearing a tinfoil hat - but it has to cross people's minds that the powers that be (the Fed and the other central banks and the governments that use their debt-issued money to run this world into a ditch) could very well be much more scared shitless than they're letting on. If we assume that the powers that be are using their usual playbook and tactics, then it could be worth looking at the book "Confessions of an Economic Hitman" by John Perkins, to get an idea of how they might try to attack Bitcoin. So, what I'm saying is, they do have a track record of sending in "experts" to try to derail projects and keep everyone enslaved to the Creature from Jekyll Island. I'm just saying. So, without getting ad hominem - let's just make sure that our ideas can really stand scrutiny on their own - as Nick Szabo says, we need to make sure there is "more computer science, less noise" in this debate.
When Gavin Andresen first came out with the 20 MB thing - I sat back and tried to imagine if I could download 20 MB in 10 minutes (which seems to be one of the basic mathematical and technological constraints here - right?)
I figured, "Yeah, I could download that" - even with my crappy internet connection.
And I guess the telecoms might be nice enough to continue to double our bandwidth every two years for the next couple decades – if we ask them politely?
On the other hand - I think we should be careful about entrusting the financial freedom of the world into the greedy hands of the telecoms companies - given all their shady shenanigans over the past few years in many countries. After decades of the MPAA and the FBI trying to chip away at BitTorrent, lately PirateBay has been hard to access. I would say it's quite likely that certain persons at institutions like JPMorgan and Goldman Sachs and the Fed might be very, very motivated to see Bitcoin fail - so we shouldn't be too sure about scaling plans which depend on the willingness of companies like Verizon and AT&T to double our bandwidth every two years.
Maybe the real important hardware buildout challenge for a company like 21 (and its allies such as Qualcomm) to take on now would not be "a miner in every toaster" but rather "Google Fiber Download and Upload Speeds in every Country, including China".
I think I've read all the major stuff on the blocksize debate from Gavin Andresen, Mike Hearn, Greg Maxwell, Peter Todd, Adam Back, Jeff Garzik and several other major contributors - and, oddly enough, all their arguments seem reasonable - heck, even Luke-Jr seems reasonable to me on the blocksize debate, and I always thought he was a whackjob overly influenced by superstition and numerology - and now today I'm reading the article by Bram Cohen - the inventor of BitTorrent - and I find myself agreeing with him too!
I say to myself: What's going on with me? How can I possibly agree with all of these guys, if they all have such vehemently opposing viewpoints?
I mean, think back to the glory days of a couple of years ago, when all we were hearing was how this amazing unprecedented grassroots innovation called Bitcoin was going to benefit everyone from all walks of life, all around the world:
...basically the entire human race transacting everything into the blockchain.
(Although let me say that I think that people's focus on ideas like driverless cabs creating realtime fare markets based on supply and demand seems to be setting our sights a bit low as far as Bitcoin's abilities to correct the financial world's capital-misallocation problems which seem to have been made possible by infinite debt-based fiat. I would have hoped that a Bitcoin-based economy would solve much more noble, much more urgent capital-allocation problems than driverless taxicabs creating fare markets or refrigerators ordering milk on the internet of things. I was thinking more along the lines that Bitcoin would finally strangle dead-end debt-based deadly-toxic energy industries like fossil fuels and let profitable clean energy industries like Thorium LFTRs take over - but that's another topic. :=)
Paradoxes in the blocksize debate
Let me summarize the major paradoxes I see here:
(1) Regarding the people (the majority of the core devs) who are against a blocksize increase: Well, the small-blocks arguments do seem kinda weird, and certainly not very "populist", in the sense that: When on earth have end-users ever heard of a computer technology whose capacity didn't grow pretty much exponentially year-on-year? All the cool new technology we've had - from hard drives to RAM to bandwidth - started out pathetically tiny and grew to unimaginably huge over the past few decades - and all our software has in turn gotten massively powerful and big and complex (sometimes bloated) to take advantage of the enormous new capacity available.
But now suddenly, for the first time in the history of technology, we seem to have a majority of the devs, on a major p2p project - saying: "Let's not scale the system up. It could be dangerous. It might break the whole system (if the hard-fork fails)."
I don't know, maybe I'm missing something here, maybe someone else could enlighten me, but I don't think I've ever seen this sort of thing happen in the last few decades of the history of technology - devs arguing against scaling up p2p technology to take advantage of expected growth in infrastructure capacity.
(2) But... on the other hand... the dire warnings of the small-blockians about what could happen if a hard-fork were to fail - wow, they do seem really dire! And these guys are pretty much all heavyweight, experienced programmers and/or game theorists and/or p2p open-source project managers.
I must say, that nearly all of the long-form arguments I've read - as well as many, many of the shorter comments I've read from many users in the threads, whose names I at least have come to more-or-less recognize over the past few months and years on reddit and bitcointalk - have been amazingly impressive in their ability to analyze all aspects of the lifecycle and management of open-source software projects, bringing up lots of serious points which I could never have come up with, and which seem to come from long experience with programming and project management - as well as dealing with economics and human nature (eg, greed - the game-theory stuff).
So a lot of really smart and experienced people with major expertise in various areas ranging from programming to management to game theory to politics to economics have been making some serious, mature, compelling arguments.
But, as I've been saying, the only problem to me is: in many of these cases, these arguments are vehemently in opposition to each other! So I find myself agreeing with pretty much all of them, one by one - which means the end result is just a giant contradiction.
I mean, today we have Bram Cohen, the inventor of BitTorrent, arguing (quite cogently and convincingly to me), that it would be dangerous to increase the blocksize. And this seems to be a guy who would know a few things about scaling out a massive global p2p network - since the protocol which he invented, BitTorrent, is now apparently responsible for like a third of the traffic on the internet (and this despite the long-term concerted efforts of major evil players such as the MPAA and the FBI to shut the whole thing down).
Was the BitTorrent analogy too "glib"?
By the way - I would like to go on a slight tangent here and say that one of the main reasons why I felt so "comfortable" jumping on the Bitcoin train back a few years ago, when I first heard about it and got into it, was the whole rough analogy I saw with BitTorrent.
I remembered the perhaps paradoxical fact that when a torrent is more popular (eg, a major movie release that just came out last week), then it actually becomes faster to download. More people want it, so more people have a few pieces of it, so more people are able to get it from each other. A kind of self-correcting economic feedback loop, where more demand directly leads to more supply.
(BitTorrent manages to pull this off by essentially adding a certain structure to the file being shared, so that it's not simply like an append-only list of 1 MB blocks, but rather more like a random-access or indexed array of 1 MB chunks. Say you're downloading a film which is 700 MB. As soon as your "client" program has downloaded a single 1-MB chunk - say chunk #99 - your "client" program instantly turns into a "server" program as well - offering that chunk #99 to other clients. From my simplistic understanding, I believe the Bitcoin protocol does something similar, to provide a p2p architecture. Hence my - perhaps naïve - assumption that Bitcoin already had the right algorithms / architecture / data structure to scale.)
The efficiency of the BitTorrent network seemed to jibe with that "network law" (Metcalfe's Law?) about fax machines. This law states that the more fax machines there are, the more valuable the network of fax machines becomes: the value of the network grows on the order of the square of the number of nodes.
This is in contrast with other technology like cars, where the more you have, the worse things get. The more cars there are, the more traffic jams you have, so things start going downhill. I guess this is because highway space is limited - after all, we can't pave over the entire countryside, and we never did get those flying cars we were promised, as David Graeber laments in a recent essay in The Baffler magazine :-)
And regarding the "stress test" supposedly happening right now in the middle of this ongoing blocksize debate, I don't know what worries me more: the fact that it apparently is taking only $5,000 to do a simple kind of DoS on the blockchain - or the fact that there are a few rumors swirling around saying that the unknown company doing the stress test shares the same physical mailing address with a "scam" company?
Or maybe we should just be worried that so much of this debate is happening on a handful of forums which are controlled by some guy named theymos who's already engaged in some pretty "contentious" or "controversial" behavior like blowing a million dollars on writing forum software (I guess he never heard that reddit.com software is open-source)?
So I worry that the great promise of "decentralization" might be more fragile than we originally thought.
Scaling
Anyways, back to Metcalfe's Law: with virtual stuff, like torrents and fax machines, the more the merrier. The more people downloading a given movie, the faster it arrives - and the more people own fax machines, the more valuable the overall fax network.
So I kindof (naïvely?) assumed that Bitcoin, being "virtual" and p2p, would somehow scale up the same magical way BitTorrent did. I just figured that more people using it would somehow automatically make it stronger and faster.
But now a lot of devs have started talking in terms of the old "scarcity" paradigm, talking about blockspace being a "scarce resource" and talking about "fee markets" - which seems kinda scary, and antithetical to much of the earlier rhetoric we heard about Bitcoin (the stuff about supporting our favorite creators with micropayments, and the stuff about Africans using SMS to send around payments).
Look, when some asshole is in line in front of you at the cash register and he's holding up the line so they can run his credit card to buy a bag of Cheeto's, we tend to get pissed off at the guy - clogging up our expensive global electronic payment infrastructure to make a two-dollar purchase. And that's on a fairly efficient centralized system - and presumably after a year or so, VISA and the guy's bank can delete or compress the transaction in their SQL databases.
Now, correct me if I'm wrong, but if some guy buys a coffee on the blockchain, or if somebody pays an online artist $1.99 for their work - then that transaction, a few bytes or so, has to live on the blockchain forever?
Or is there some "pruning" thing that gets rid of it after a while?
And this could lead to another question: Viewed from the perspective of double-entry bookkeeping, is the blockchain "world-wide ledger" more like the "balance sheet" part of accounting, i.e. a snapshot showing current assets and liabilities? Or is it more like the "cash flow" part of accounting, i.e. a journal showing historical revenues and expenses?
When I think of thousands of machines around the globe having to lug around multiple identical copies of a multi-gigabyte file containing some asshole's coffee purchase forever and ever... I feel like I'm ideologically drifting in one direction (where I'd end up also being against really cool stuff like online micropayments and Africans banking via SMS)... so I don't want to go there.
But on the other hand, when really experienced and battle-tested veterans with major experience in the world of open-source programming and project management (the "small-blockians") warn of the catastrophic consequences of a possible failed hard-fork, I get freaked out and I wonder if Bitcoin really was destined to be a settlement layer for big transactions.
Could the original programmer(s) possibly weigh in?
And I don't mean to appeal to authority - but heck, where the hell is Satoshi Nakamoto in all this? I do understand that he/she/they would want to maintain absolute anonymity - but on the other hand, I assume SN wants Bitcoin to succeed (both for the future of humanity - or at least for all the bitcoins SN allegedly holds :-) - and I understand there is a way that SN can cryptographically sign a message - and I understand that as the original developer of Bitcoin, SN had some very specific opinions about the blocksize... So I'm kinda wondering if Satoshi could weigh in from time to time. Just to help out a bit. I'm not saying "Show us a sign" like a deity or something - but damn, it sure would be fascinating and possibly very helpful if Satoshi gave us his/her/their 2 satoshis worth at this really confusing juncture.
Are we using our capacity wisely?
I'm not a programming or game-theory whiz, I'm just a casual user who has tried to keep up with technology over the years.
It just seems weird to me that here we have this massive supercomputer (500 times more powerful than all the supercomputers in the world combined) doing fairly straightforward "embarrassingly parallel" number-crunching operations to secure a p2p world-wide ledger called the blockchain, to keep track of a measly 2.1 quadrillion tokens spread out among a few billion addresses. A couple of years ago you had people like Rick Falkvinge saying the blockchain would someday be supporting multi-million-dollar letters of credit for international trade, and people like Andreas Antonopoulos saying the blockchain would someday allow billions of "unbanked" people to send remittances around the village or around the world dirt-cheap. And now suddenly in June 2015 we're talking about blockspace as a "scarce resource" and about "fee markets" and partially centralized, corporate-sponsored "Level 2" vaporware like Lightning Network; some mysterious company is "stress testing" or "DoS-ing" the system by throwing away a measly $5,000; and suddenly it sounds like the whole system could eventually head right back into PayPal and Western Union territory again, in terms of expensive fees.
When I got into Bitcoin, I really was heavily influenced by vague analogies with BitTorrent: I figured everyone would just have a tiny little utorrent-type program running on their machine (ie, Bitcoin-QT or Armory or Mycelium etc.).
I figured that just like anyone can host their own blog or webserver, anyone would be able to host their own bank.
Yeah, Google and Mozilla and Twitter and Facebook and WhatsApp did come along and build stuff on top of TCP/IP, so I did expect a bunch of companies to build layers on top of the Bitcoin protocol as well. But I still figured the basic unit of bitcoin client software powering the overall system would be small and personal and affordable and p2p - like a bittorrent client - or at the most, like a cheap server hosting a blog or email server.
And I figured there would be a way at the software level, at the architecture level, at the algorithmic level, at the data structure level - to let the thing scale - if not infinitely, at least fairly massively and gracefully - the same way the BitTorrent network has.
Of course, I do also understand that with BitTorrent, you're sharing a read-only object (eg, a movie) - whereas with Bitcoin, you're achieving distributed trustless consensus and appending it to a write-only (or append-only) database.
So I do understand that the problem which BitTorrent solves is much simpler than the problem which Bitcoin sets out to solve.
But still, it seems that there's got to be a way to make this thing scale. It's p2p and it's got 500 times more computing power than all the supercomputers in the world combined - and so many brilliant and motivated and inspired people want this thing to succeed! And Bitcoin could be our civilization's last chance to steer away from the oncoming debt-based ditch of disaster we seem to be driving into!
It just seems that Bitcoin has got to be able to scale somehow - and all these smart people working together should be able to come up with a solution which pretty much everyone can agree - in advance - will work.
Right? Right?
A (probably irrelevant) tangent on algorithms and architecture and data structures
I'll finally weigh with my personal perspective - although I might be biased due to my background (which is more on the theoretical side of computer science).
My own modest - or perhaps radical - suggestion would be to ask whether we're really looking at all the best possible algorithms and architectures and data structures out there.
From this perspective, I sometimes worry that the overwhelming majority of the great minds working on the programming and game-theory stuff might come from a rather specific, shall we say "von Neumann" or "procedural" or "imperative" school of programming (ie, C and Python and Java programmers).
It seems strange to me that such a cutting-edge and important computer project would have so little participation from the great minds at the other end of the spectrum of programming paradigms - namely, the "functional" and "declarative" and "algebraic" (and co-algebraic!) worlds.
For example, I was struck in particular by statements I've seen here and there (which seemed rather hubristic or lackadaisical to me - for something as important as Bitcoin), that the specification of Bitcoin and the blockchain doesn't really exist in any form other than the reference implementation(s) (in procedural languages such as C or Python?).
Curry-Howard anyone?
I mean, many computer scientists are aware of the Curry-Howard isomorphism, which basically says that the relationship between a theorem and its proof is equivalent to the relationship between a specification and its implementation. In other words, there is a long tradition in mathematics (and in computer programming) of stating the specification (the "theorem") separately from - and prior to - the implementation (the "proof"), so that the latter can be checked against the former.
And it's not exactly "turtles all the way down" either: a specification is generally simple and compact enough that a good programmer can usually simply visually inspect it to determine if it is indeed "correct" - something which is very difficult, if not impossible, to do with a program written in a procedural, implementation-oriented language such as C or Python or Java.
So I worry that we've got this tradition, from the open-source github C/Java programming tradition, of never actually writing our "specification", and only writing the "implementation". In mission-critical military-grade programming projects (which often use languages like Ada or Maude) this is simply not allowed. It would seem that a project as mission-critical as Bitcoin - which could literally be crucial for humanity's continued survival - should also use this kind of military-grade software development approach.
And I'm not saying we should rewrite the implementations in these kinds of theoretical languages. But it might be helpful if the C/Python/Java programmers in the Bitcoin imperative programming world could build some bridges to the Maude/Haskell/ML programmers of the functional and algebraic programming worlds, to see if any kind of useful cross-pollination might take place between specifications and implementations.
For example, the JavaFAN formal analyzer for multi-threaded Java programs (developed using tools based on the Maude language) was applied to the Remote Agent AI program aboard NASA's Deep Space 1 probe, written in Java - and it took only a few minutes of formal mathematical reasoning to detect a potential deadlock which would otherwise only have surfaced later during the mission, with the spacecraft already millions of miles out in deep space.
And "the Maude-NRL (Naval Research Laboratory) Protocol Analyzer (Maude-NPA) is a tool used to provide security proofs of cryptographic protocols and to search for protocol flaws and cryptosystem attacks."
These are open-source formal reasoning tools developed by DARPA and used by NASA and the US Navy to ensure that program implementations satisfy their specifications. It would be great if some of the people involved in these kinds of projects could contribute to help ensure the security and scalability of Bitcoin.
But there is a wide abyss between the kinds of programmers who use languages like Maude and the kinds of programmers who use languages like C/Python/Java - and it can be really hard to get the two worlds to meet. There is a bit of rapprochement between these language communities in languages which might be considered as being somewhere in the middle, such as Haskell and ML.
I just worry that Bitcoin might be turning into an exclusively C/Python/Java project (with the algorithms and practitioners traditionally of that community), when it could be more advantageous if it also had some people from the functional and algebraic-specification and program-verification community involved as well.
The thing is, though: the theoretical practitioners are big on "semantics" - I've heard them say stuff like "Yes, but a C/C++ program has no easily identifiable semantics". So to get them involved, you really have to first be able to talk about what your program does (specification) - before proceeding to describe how it does it (implementation). And writing high-level specifications is typically very hard using the syntax and semantics of languages like C and Java and Python - whereas specs are fairly easy to write in Maude - and not only that, they're executable, and you can state and verify properties about them - which provides for the kind of debate Nick Szabo was advocating ("more computer science, less noise").
Imagine if we had an executable algebraic specification of Bitcoin in Maude, where we could formally reason about and verify certain crucial game-theoretical properties - rather than merely hand-waving and arguing and deploying and praying.
And so in the theoretical programming community you've got major research on various logics such as Girard's Linear Logic (which is resource-conscious) and Bruni and Montanari's Tile Logic (which enables "pasting" bigger systems together from smaller ones in space and time), and executable algebraic specification languages such as Meseguer's Maude (which would be perfect for game theory modeling, with its functional modules for specifying the deterministic parts of systems and its system modules for specifying non-deterministic parts of systems, and its parameterized skeletons for sketching out the typical architectures of mobile systems, and its formal reasoning and verification tools and libraries which have been specifically applied to testing and breaking - and fixing - cryptographic protocols).
And somewhat closer to the practical hands-on world, you've got stuff like Google's MapReduce and lots of Big Data database languages developed by Google as well. And yet here we are with a mempool growing dangerously big for RAM on a single machine, and a 20-GB append-only list as our database - and not much debate on practical results from Google's Big Data databases.
(And by the way: maybe I'm totally ignorant for asking this, but I'll ask anyways: why the hell does the mempool have to stay in RAM? Couldn't it work just as well if it were stored temporarily on the hard drive?)
And you've got CalvinDB out of Yale which apparently provides an ACID layer on top of a massively distributed database.
Look, I'm just an armchair follower cheering on these projects. I can barely manage to write a query in SQL, or read through a C or Python or Java program. But I would argue two points here: (1) these languages may be too low-level and "non-formal" for writing and modeling and formally reasoning about and proving properties of mission-critical specifications - and (2) there seem to be some Big Data tools already deployed by institutions such as Google and Yale which support global petabyte-size databases on commodity boxes with nice properties such as near-real-time and ACID - and I sometimes worry that the "core devs" might be failing to review the literature (and reach out to fellow programmers) out there to see if there might be some formal program-verification and practical Big Data tools out there which could be applied to coming up with rock-solid, 100% consensus proposals to handle an issue such as blocksize scaling, which seems to have become much more intractable than many people might have expected.
I mean, the protocol solved the hard stuff: the elliptic-curve stuff and the Byzantine Generals stuff. How the heck can we be falling down on the comparatively "easier" stuff - like scaling the blocksize?
It just seems like defeatism to say "Well, the blockchain is already 20-30 GB and it's gonna be 20-30 TB ten years from now - and we need 10 Mbps bandwidth now and 10,000 Mbps bandwidth 20 years from now - assuming the evil Verizon and AT&T actually give us that - so let's just become a settlement platform and give up on buying coffee or banking the unbanked or doing micropayments, and let's push all that stuff into some corporate-controlled vaporware without even a whitepaper yet."
So you've got Peter Todd doing some possibly brilliant theorizing and extrapolating on the idea of "treechains" - there is a Let's Talk Bitcoin podcast from about a year ago where he sketches the rough outlines of this idea out in a very inspiring, high-level way - although the specifics have yet to be hammered out. And we've got Blockstream also doing some hopeful hand-waving about the Lightning Network.
Things like Peter Todd's treechains - which may be similar to the spark in some devs' eyes called Lightning Network - are examples of the kind of algorithm or architecture which might manage to harness the massive computing power of miners and nodes in such a way that certain kinds of massive and graceful scaling become possible.
It just seems like a kindof tiny dev community working on this stuff.
Being a C or Python or Java programmer should not be a pre-req to being able to help contribute to the specification (and formal reasoning and program verification) for Bitcoin and the blockchain.
XML and UML are crap modeling and specification languages, and C and Java and Python are even worse (as specification languages - although as implementation languages, they are of course fine).
But there are serious modeling and specification languages out there, and they could be very helpful at times like this - where what we're dealing with is questions of modeling and specification (ie, "needs and requirements").
One just doesn't often see the practical, hands-on world of open-source github implementation-level programmers and the academic, theoretical world of specification-level programmers meeting very often. I wish there were some way to get these two worlds to collaborate on Bitcoin.
Maybe a good first step to reach out to the theoretical people would be to provide a modular executable algebraic specification of the Bitcoin protocol in a recognized, military/NASA-grade specification language such as Maude - because that's something the theoretical community can actually wrap their heads around, whereas it's very hard to get them to pay attention to something written only as a C / Python / Java implementation (without an accompanying specification in a formal language).
They can't check whether the program does what it's supposed to do - if you don't provide a formal mathematical definition of what the program is supposed to do.
Specification : Implementation :: Theorem : Proof
You have to remember: the theoretical community is very aware of the Curry-Howard isomorphism. Just like it would be hard to get a mathematician's attention by merely showing them a proof without also telling them what theorem the proof is proving - by the same token, it's hard to get the attention of a theoretical computer scientist by merely showing them an implementation without showing them the specification that it implements.
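To make the analogy concrete with a deliberately tiny example (nothing Bitcoin-specific here - the property names are mine, purely for illustration): in a functional language you can state the "theorem" as executable properties, separately from the "proof" that is the implementation.

```haskell
import Data.List (sort)

-- Specification ("theorem"): properties any correct sort must satisfy.
specOrdered :: [Int] -> Bool
specOrdered xs = and (zipWith (<=) ys (drop 1 ys))
  where ys = sort xs

specIdempotent :: [Int] -> Bool
specIdempotent xs = sort (sort xs) == sort xs

-- The library's `sort` is one "proof" of this little theorem; a property
-- checker like QuickCheck would test these over thousands of random inputs.
main :: IO ()
main = print (specOrdered [3,1,2] && specIdempotent [3,1,2])
```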
Bitcoin is currently confronted with a mathematical or "computer science" problem: how to secure the network while getting high enough transactional throughput, while staying within the RAM, bandwidth and hard-drive limitations of current and future infrastructure.
The problem only becomes a political and economic problem if we give up on trying to solve it as a mathematical and "theoretical computer science" problem.
There should be a plethora of whitepapers out by now proposing algorithmic solutions to these scaling issues. Remember, all we have to do is apply the Byzantine Generals consensus-reaching procedure to a worldwide database which shuffles 2.1 quadrillion tokens among a few billion addresses. The 21 company has emphatically pointed out that racing to compute a hash to add a block is an "embarrassingly parallel" problem - very easy to decompose among cheap, fault-prone, commodity boxes, and recompose into an overall solution - along the lines of Google's highly successful MapReduce.
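For what it's worth, here's a minimal Haskell sketch of that "embarrassingly parallel" structure - `toyHash` is a made-up stand-in for double-SHA256, and none of this is Bitcoin's real block format; the point is only that disjoint nonce slices need zero coordination:

```haskell
import Data.Bits (shiftR, xor)
import Data.Word (Word64)

-- Toy stand-in for double-SHA256 (illustrative only).
toyHash :: Word64 -> Word64
toyHash x = let y = x * 0x9E3779B97F4A7C15 in y `xor` (y `shiftR` 31)

-- Each worker scans its own disjoint slice of the nonce space; the first
-- hit anywhere wins. Decomposes perfectly across cheap commodity boxes.
searchSlice :: Word64 -> (Word64, Word64) -> Maybe Word64
searchSlice target (lo, hi) =
  case [n | n <- [lo .. hi], toyHash n < target] of
    (n:_) -> Just n
    []    -> Nothing

main :: IO ()
main = print (searchSlice 0x000FFFFFFFFFFFFF (0, 1000000))
```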
I guess what I'm really saying (and I don't mean to be rude here) is that C and Python and Java programmers might not be the best qualified people to develop and formally prove the correctness of (note I do not say: "test", I say "formally prove the correctness of") these kinds of algorithms.
I really believe in the importance of getting the algorithms and architectures right - look at Google Search itself, it uses some pretty brilliant algorithms and architectures (eg, MapReduce, Paxos) which enable it to achieve amazing performance - on pretty crappy commodity hardware. And look at BitTorrent, which is truly p2p, where more demand leads to more supply.
So, in this vein, I will close this lengthy rant with an oddly specific link - which may or may not be able to make some interesting contributions to finding suitable algorithms, architectures and data structures which might help Bitcoin scale massively. I have no idea if this link could be helpful - but given the near-total lack of people from the Haskell and ML and functional worlds in these Bitcoin specification debates, I thought I'd be remiss if I didn't throw this out - just in case there might be something here which could help us channel the massive computing power of the Bitcoin network in such a way as to enable us to simply sidestep this kind of desperate debate where both sides seem right because the other side seems wrong.
https://personal.cis.strath.ac.uk/neil.ghani/papers/ghani-calco07
The above paper is about "higher dimensional trees". It uses a bit of category theory (not a whole lot) and a bit of Haskell (again not a lot - just a simple data structure called a Rose tree, which has a wikipedia page) to develop a very expressive and efficient data structure which generalizes from lists to trees to higher dimensions.
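For the curious, the rose tree in question is a two-line definition (Data.Tree in GHC's containers package has the same shape) - and the degenerate one-child-per-node case is exactly a blockchain-like list:

```haskell
-- A rose tree: a value plus an arbitrary list of subtrees.
data Rose a = Node a [Rose a]
  deriving Show

-- Degenerate case: at most one child per node gives a list - i.e. a chain.
chain :: [a] -> Maybe (Rose a)
chain []       = Nothing
chain (x : xs) = Just (Node x (maybe [] (: []) (chain xs)))

main :: IO ()
main = print (chain [1, 2, 3 :: Int])
```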
I have no idea if this kind of data structure could be applicable to the current scaling mess we apparently are getting bogged down in - I don't have the game-theory skills to figure it out.
I just thought that since the blockchain is like a list, and since there are some tree-like structures which have been grafted on for efficiency (eg, Merkle trees) and since many of the futuristic scaling proposals seem to also involve generalizing from list-like structures (eg, the blockchain) to tree-like structures (eg, side-chains and tree-chains)... well, who knows, there might be some nugget of algorithmic or architectural or data-structure inspiration there.
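(For reference, the Merkle-tree trick itself is tiny: hash leaves pairwise up to a single root. Bitcoin uses double-SHA256 and duplicates the last hash of an odd-length level; the `combine` function below is a toy stand-in so the sketch needs no crypto library.)

```haskell
import Data.Bits (xor)
import Data.Word (Word64)

-- Toy stand-in for double-SHA256 over a pair of hashes (illustrative only).
combine :: Word64 -> Word64 -> Word64
combine a b = (a * 31 + b) `xor` 0x5DEECE66D

-- Fold each level pairwise until one root remains.
merkleRoot :: [Word64] -> Word64
merkleRoot []  = 0
merkleRoot [h] = h
merkleRoot hs  = merkleRoot (level hs)
  where
    level (a : b : rest) = combine a b : level rest
    level [a]            = [combine a a]  -- duplicate the odd hash out
    level []             = []

main :: IO ()
main = print (merkleRoot [1, 2, 3, 4, 5])
```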
So... TL;DR:
(1) I'm freaked out that this blocksize debate has splintered the community so badly and dragged on so long, with no resolution in sight, and both sides seeming so right (because the other side seems so wrong).
(2) I think Bitcoin could gain immensely by using high-level formal, algebraic and co-algebraic program specification and verification languages (such as Maude - including Maude-NPA, Mobile Maude, parameterized skeletons, etc.) to specify (and possibly also, to some degree, verify) what Bitcoin does - before translating to low-level implementation languages such as C and Python and Java, which say how Bitcoin does it. This would help us communicate and reason about programs with much more mathematical certitude - and possibly obviate the need for many political and economic tradeoffs which currently seem dismally inevitable - and possibly widen the collaboration on this project.
(3) I wonder if there are some Big Data approaches out there (eg, along the lines of Google's MapReduce and BigTable, or Yale's CalvinDB), which could be implemented to allow Bitcoin to scale massively and painlessly - and to satisfy all stakeholders, ranging from millionaires to micropayments, coffee drinkers to the great "unbanked".
submitted by BeYourOwnBank to Bitcoin [link] [comments]

u/brg444's "reasonable" post "The Artificial Blocksize Limit" was already rebutted by an earlier comment from u/Noosterdam: "4MB is the minimum size where exceeding it could cause any problems... Eventually we hit a limit... but we have no reason to believe that point is even in the ballpark of 1MB"

u/brg444 just posted a very reasonable-sounding and persuasive article on medium.com and on r/bitcoin:
The artificial block size limit
https://np.reddit.com/Bitcoin/comments/5e5ecv/the_artificial_block_size_limit/
https://medium.com/@bergealex4/the-artificial-block-size-limit-1b69aa5d9d4#.f9194hcwl
It's well written and he makes a lot of good points about the risks of allowing miners to determine the blocksize.
However, you can see the fatal flaw in u/brg444's arguments when you notice that he is tacitly assuming that his buddies at Core/Blockstream should be allowed to determine the blocksize (and 1 MB just happens to be the right "magic" number).
As u/tsontar said several months ago:
He [Greg Maxwell] is not alone. Most of his team shares his ignorance.
Here's everything you need to know: The team considers the limit simply a question of engineering, and will silence discussion on its economic impact since "this is an engineering decision."
It's a joke. They are literally re-creating the technocracy of the Fed through a combination of computer science and a complete ignorance of the way the world works.
If ten smart guys in a room could outsmart the market, we wouldn't need Bitcoin.
https://np.reddit.com/btc/comments/46052e/adam_back_greg_maxwell_are_experts_in_mathematics/
As we know, the current "1 MB" blocksize was just a random accident of history - a temporary anti-spam kludge which everyone expected would be removed.
Then in late 2014, along came that "shitty startup" Blockstream - getting $76 million in funding from some of the most powerful companies in the legacy world of "fiat finance" - paying off most of the Core devs - attempting to hijack the Bitcoin codebase to serve the agenda of their corporate masters - leading to artificially high fees, periodic and worsening congestion and delays - which is suppressing Bitcoin's price, adoption and market cap.
As people like u/jessquit have recently started pointing out, we now know that Blockstream's business plan is 100% dependent on two things:
We also know that u/brg444 previously worked for a "viral marketing" firm in Canada. Now he's putting his propaganda talents to use to serve the agenda of Blockstream's corporate masters:
Because u/brg444 posted his article in a subreddit which is notorious for heavy-handed corporate censorship, I thought it would be useful to cross-post it here, so we could have a more open discussion, since anything critical of Core/Blockstream would probably get deleted in that other subreddit.
To start the discussion off, here's an earlier comment by u/Noosterdam which actually happens to pre-emptively destroy u/brg444's implicit argument - reminding us that Blockstream simply pulled the "1 MB" number out of their ass, while empirical studies (such as the Cornell study) have shown that the network could definitely handle blocks of at least 4 MB - and possibly much bigger:
https://np.reddit.com/btc/comments/5dxe42/i_am_a_longtime_btc_hodler_since_2010_this_is/da9pmkk/?context=1
The only academic study I've seen puts a floor of 4MB as the minimum size where exceeding it could cause any problems. It's only a study to determine a lower bound, i.e., "the network could safely support at least this big of blocks." That says nothing about 10 or 100MB blocks being a problem.
And remember that's the current network, with Bitcoin being only as big of a deal as it is now. By the time we have 10MB blocks, Bitcoin will be a much bigger deal and far more economically important, so many more people and businesses will want to be running nodes. And by the time we are craving 100MB blocks, all the more so.
Eventually we hit a limit where off-chain scaling starts to be a worthwhile tradeoff, but we have no reason to believe that point is even in the ballpark of 1MB. It would be a spectacular coincidence if it were, and yet this is what we're asked to believe. Most of all, to even calculate where that tradeoff would be, you would need to provide a minimum node spec you want the network to maintain support for. So far I don't know that even that first step has been done, so it's constant moving goalposts.
So... be careful when reading posts by u/brg444. He works for a "viral marketing" firm, so he's got a lot of training in Public Relations - in making soothing, "reasonable-sounding" arguments to manipulate people's opinion and get them to submit to the agenda of his corporate masters at Blockstream.
submitted by ydtm to btc [link] [comments]

The Big Blocks Mega Thread

Since this is a pressing and prevalent issue, I thought maybe condensing the essential arguments into one mega thread is better than rehashing everything in new threads all the time. I chose a FAQ format for this so a certain statement can be answered. I don't want to re-post everything here so where appropriate I'm just going to use links.
Disclaimer: This is biased towards big blocks (BIP 101 in particular) but still tries to mention the risks, worries and fears. I think this is fair because all other major bitcoin discussion places severely censor and discourage big block discussion.
 
What is the block size limit?
The block size limit was introduced by Satoshi on 2010-07-15 as an anti-DoS measure (though this was not stated in the commit message, more info here). It has never been touched since, because historically there was no need and because raising the block size limit requires a hard fork. The block size directly limits the number of transactions in a block. Therefore, the capacity of Bitcoin is directly limited by the block size limit.
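A back-of-the-envelope sketch of what the limit implies for throughput (the ~250-byte average transaction size is a commonly cited assumption; real averages vary with the transaction mix):

```haskell
blockLimitBytes, avgTxBytes, blockIntervalSec :: Double
blockLimitBytes  = 1000000  -- the 1 MB limit
avgTxBytes       = 250      -- assumed average transaction size
blockIntervalSec = 600      -- target: one block every ten minutes

txPerBlock, txPerSecond :: Double
txPerBlock  = blockLimitBytes / avgTxBytes   -- ~4000 transactions per block
txPerSecond = txPerBlock / blockIntervalSec  -- ~7 transactions per second

main :: IO ()
main = print (txPerBlock, txPerSecond)
```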
 
Why does a raise require a hard fork?
Because larger blocks are seen as invalid by old nodes, a block size increase would fork these nodes off the network. Therefore it is a hard fork. However, it is possible to downsize the block limit with a soft fork, since smaller blocks would still be seen as valid by old nodes. It is considerably easier to roll out a soft fork. Therefore, it makes sense to roll out a more ambitious hard-fork limit and downsize as needed with soft forks if problems arise.
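A toy validity predicate makes the asymmetry obvious (sizes in bytes; the numbers are illustrative):

```haskell
validUnder :: Int -> Int -> Bool
validUnder limit blockSize = blockSize <= limit

main :: IO ()
main = do
  -- Raised limit (8 MB): a 2 MB block is valid to new nodes but invalid
  -- to old 1 MB nodes -> old nodes fork off -> hard fork.
  print (validUnder 8000000 2000000, validUnder 1000000 2000000)
  -- Lowered limit (500 kB): a 400 kB block is valid to both -> soft fork.
  print (validUnder 500000 400000, validUnder 1000000 400000)
```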
 
What is the deal with soft and hard forks anyways?
See this article by Mike Hearn: https://medium.com/@octskyward/on-consensus-and-forks-c6a050c792e7#.74502eypb
 
Why do we need to increase the block size?
The Bitcoin network is reaching its imposed block size limit while the hardware and software would be able to support more transactions. Many believe that in its current phase of growth, artificially limiting the block size is stifling adoption, investment and future growth.
Read this article and all linked articles for further reading: http://gavinandresen.ninja/time-to-roll-out-bigger-blocks
Another article by Mike Hearn: https://medium.com/@octskyward/crash-landing-f5cc19908e32#.uhky4y1ua (this article is a little outdated since both Bitcoin Core and XT now have mempool limits)
 
What is the Fidelity Effect?
It is the Chicken and Egg problem applied to future growth of Bitcoin. If companies do not see how Bitcoin can scale long term, they don't invest which in turn slows down adoption and development.
See here and here.
 
Does an increase in block size limit mean that blocks immediately get larger to the point of the new block size limit?
No, blocks are as large as there is demand for transactions on the network. But one can assume that if the limit is lifted, more users and businesses will want to use the blockchain. This means that blocks will get bigger, but they will not automatically jump to the size of the block size limit. Increased usage of the blockchain also means increased adoption, investment and also price appreciation.
 
Which are the block size increase proposals?
See here.
It should be noted that BIP 101 is the only proposal which has been implemented and is ready to go.
 
What is the long term vision of BIP 101?
BIP 101 tries to track hardware limitations regarding bandwidth as closely as possible, so that nodes can continue running on normal home-user-grade internet connections to keep the decentralized aspect of Bitcoin alive. Because raising the block size limit is hard (it requires a hard fork), a long-term growth schedule is beneficial to planning and investment in the Bitcoin network. Go to this article for further reading and to understand what is meant by "designing for success".
BIP 101 vs actual transaction growth visualized: http://imgur.com/QoTEOO2
Note that the actual limit in BIP 101 grows piece-wise linearly, not in steps as the picture suggests.
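A sketch of that piece-wise linear schedule as I read the BIP text - 8 MB at activation, doubling every two years with linear interpolation in between, capped after ten doublings (treat the exact activation timestamp here as an assumption):

```haskell
-- Block size limit in bytes at unix time t, per my reading of BIP 101.
bip101Limit :: Integer -> Integer
bip101Limit t
  | t < t0    = 1000000  -- before activation: the old 1 MB limit
  | otherwise = base * 2 ^ doublings * (period + frac) `div` period
  where
    t0        = 1452470400  -- assumed activation: 2016-01-11 00:00:00 UTC
    base      = 8000000     -- 8 MB starting limit
    period    = 63072000    -- two years, in seconds
    doublings = min 10 ((t - t0) `div` period)
    frac | doublings >= 10 = 0  -- growth stops after ten doublings (~8 GB)
         | otherwise       = (t - t0) `mod` period

main :: IO ()
main = mapM_ (print . bip101Limit)
  [1452470400, 1452470400 + 31536000, 1452470400 + 63072000]
  -- 8 MB at activation, 12 MB one year in, 16 MB at the first doubling
```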
 
What is up with the moderation and censorship on bitcoin.org, bitcointalk.org and /bitcoin?
Proponents of a more conservative approach believe that a block size increase proposal which does not have "developer/expert consensus" should not be implemented via a majority hard fork. Therefore, discussion about the full node clients which implement BIP 101 is not allowed. Since the same individuals have major influence over all three bitcoin websites (most notably theymos), discussion of Bitcoin XT is censored and/or discouraged on these websites.
 
What is Bitcoin XT anyways?
More info here.
 
What does Bitcoin Core do about the block size? What is the future plan by Bitcoin Core?
Bitcoin Core scaling plan as envisioned by Gregory Maxwell: http://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-December/011865.html
 
Who governs or controls Bitcoin Core anyways? Who governs Bitcoin XT? What is Bitcoin governance?
Bitcoin Core is governed by a consensus mechanism. How it actually works is not clear. It seems that any major developer can "veto" a change. However, there is one head maintainer who pushes releases and otherwise organizes the development effort. It should be noted that the majority of the main contributors to Bitcoin Core are Blockstream employees.
BitcoinXT follows a benevolent dictator model (as Bitcoin used to follow when Satoshi and later Gavin Andresen were the lead maintainers).
It is a widespread belief that Bitcoin can be separated into protocol and full node development. This means that there can be multiple implementations of Bitcoin that all follow the same protocol and overall consensus mechanism. More reading here. By having multiple implementations of Bitcoin, single Bitcoin implementations can be run following a benevolent dictator model while protocol development follows an overall consensus model (which is enforced by Bitcoin's fundamental design through full nodes and miners' hash power). It is still unclear how protocol changes should actually be governed in such a model. Bitcoin governance is a research topic and still evolving.
 
What are the arguments against a significant block size increase and against BIP 101 in particular?
The main arguments against a significant increase are related to decentralization and therefore robustness against commercial interests and government regulation and intervention. More here (warning: biased Wiki article).
Another main argument is that Bitcoin needs a fee market established by a low block size limit to support miners long term. There is significant evidence and game theory to doubt this claim, as can be seen here.
Finally, block propagation and verification times increase with block size. This in turn increases miners' orphan rates, which means reduced profit. Some believe that this disadvantages small miners because they are not as well connected to the big miners. Also, mining is currently heavily centralized in China. Since most of these miners are behind the Great Firewall of China, their bandwidth to the rest of the world is limited. There is a fear that larger block propagation times favor Chinese miners as long as they have a mining majority. However, there are solutions in development that can drastically reduce block propagation times, so this problem should be less of an issue long term.
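To put a rough number on the propagation cost: block discovery is approximately a Poisson process with a 600-second mean interval, so the chance that a competitor finds a block while yours is still propagating for t seconds is roughly 1 - e^(-t/600). This is a simplified model that ignores network topology, but it shows the shape of the problem:

```haskell
-- Approximate probability of being orphaned given t seconds of
-- propagation delay, under a Poisson model of block discovery.
orphanRisk :: Double -> Double
orphanRisk t = 1 - exp (-t / 600)

main :: IO ()
main = mapM_ (print . orphanRisk) [1, 10, 60]  -- ~0.2%, ~1.7%, ~9.5%
```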
 
What is up with the fee market and what is the Lightning Network (LN)?
Major Bitcoin Core developers believe that a fee market established by a low block size is needed for the future security of the bitcoin network. While many believe this is fundamentally true, there is major dispute over whether a fee market needs to be forced by a low block size. One of the main LN developers thinks such a fee market through low block size is needed (read here). The Lightning Network is a non-bandwidth scaling solution. It uses payment channels that can be opened and closed using Bitcoin transactions that are settled on the blockchain. By routing payments through many of these channels, it is in theory possible to support far more transactions, while a user only needs a few payment channels and therefore rarely has to use (settle on) the actual blockchain. More info here.
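The accounting skeleton of a payment channel is simple to sketch (this toy model omits the hard parts - signatures, revocation and routing - and the names are mine, not from any LN implementation):

```haskell
-- Balances in satoshis for the two channel parties.
data Channel = Channel { aliceSat :: Int, bobSat :: Int }
  deriving Show

-- Opening locks funds with one on-chain transaction.
openChannel :: Int -> Int -> Channel
openChannel = Channel

-- Each payment is just a signed balance update - no blockchain use at all.
pay :: Int -> Channel -> Channel
pay amount (Channel a b)
  | amount <= a = Channel (a - amount) (b + amount)
  | otherwise   = Channel a b  -- insufficient balance: no-op

-- Closing settles the final balances with one on-chain transaction.
settle :: Channel -> (Int, Int)
settle (Channel a b) = (a, b)

main :: IO ()
main = print (settle (iterate (pay 1000) (openChannel 100000 0) !! 50))
  -- 50 payments, yet only two on-chain transactions (open + close)
```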
 
How does LN and other non-bandwidth scaling solutions relate to Bitcoin Core and its long term scaling vision?
Bitcoin Core is headed towards a future where block sizes are kept low so that a fee market is established long term that secures miner incentives. The main scaling solution promoted by Core is LN and other solutions that only occasionally settle transactions on the main Bitcoin blockchain. Essentially, Bitcoin becomes a settlement layer for solutions that are built on top of Bitcoin's core technology. Many believe that long term this might be inevitable. But forcing this off-chain development already today seems counterproductive to Bitcoin's much-needed growth and adoption phase before such solutions can thrive. It should also be noted that no major non-bandwidth scaling solution (such as LN) has been tested or even implemented. It is not even clear whether such off-chain solutions are needed as long-term scaling solutions, as it might be possible to scale Bitcoin itself to handle all needed transaction volumes. Some believe that the focus on a forced fee market by major Bitcoin Core developers represents a conflict of interest, since their employer is interested in pushing off-chain scaling solutions such as LN (more reading here).
 
Are there solutions in development that show the block sizes as proposed via BIP 101 are viable and block propagation times in particular are low enough?
Yes, most notably: Weak Blocks, Thin Blocks and IBLT.
 
What is Segregated Witness (SW) and how does it relate to scaling and block size increases?
See here. SW among other things is a way to increase the block size once without a hard fork (the actual block size is not increased but there is extra information exchanged separately to blocks).
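As SegWit was eventually specified (BIP 141), the effective increase comes from a new accounting rule rather than a bigger base block: block weight = 3 × base size + total size must stay at or under 4,000,000, so witness bytes are discounted relative to base bytes. A quick sketch:

```haskell
-- BIP 141 accounting: witness bytes cost 1 weight unit, base bytes cost 4.
blockWeight :: Int -> Int -> Int
blockWeight baseBytes witnessBytes = 3 * baseBytes + (baseBytes + witnessBytes)

fitsLimit :: Int -> Int -> Bool
fitsLimit baseBytes witnessBytes = blockWeight baseBytes witnessBytes <= 4000000

main :: IO ()
main = do
  print (fitsLimit 1000000 0)        -- legacy-style 1 MB block: exactly fits
  print (fitsLimit  700000 1200000)  -- 1.9 MB total, base under 1 MB: fits
```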
 
Feedback and more of those question/answer type posts (or revised question/answer pairs) appreciated!
 
ToDo and thoughts for expansion:
@Mods: Maybe this could be stickied?
submitted by BIP-101 to btc [link] [comments]
