The Limits to Blockchain Scalability on vitalik.ca

Ethereum Reddit

At the end of the article, Vitalik writes this:

Quantifying this risk is easy. Take the blockchain's data capacity in MB/sec, and multiply by ~30 to get the amount of data stored in terabytes per year. The current sharding plan has a data capacity of ~1.3 MB/sec, so about 40 TB/year. If that is increased by 10x, this becomes 400 TB/year. If we want the data to be not just accessible, but accessible conveniently, we would also need metadata (eg. decompressing rollup transactions), so make that 4 petabytes per year, or 40 petabytes after a decade. The Internet Archive uses 50 petabytes. So that's a reasonable upper bound for how large a sharded blockchain can safely get.

Hence, it looks like on both of these dimensions, the Ethereum sharding design is actually already roughly targeted fairly close to reasonable maximum safe values. The constants can be increased a little bit, but not too much.
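For reference, the arithmetic behind those figures is just unit conversion plus a couple of ×10 assumptions stated in the quote, and can be reproduced in a few lines of Python (a quick sketch of the quoted estimate, not code from the article):

```python
# Reproduce the back-of-the-envelope numbers from the quoted passage.
# The x10 capacity increase and x10 metadata overhead are Vitalik's
# assumptions from the quote; the rest is unit conversion.

SECONDS_PER_YEAR = 365 * 24 * 3600        # ~31.5 million, hence the "~30" factor
MB_PER_TB = 1_000_000
TB_PER_PB = 1_000

def tb_per_year(mb_per_sec):
    """Yearly chain growth in TB for a given data capacity in MB/sec."""
    return mb_per_sec * SECONDS_PER_YEAR / MB_PER_TB

base = tb_per_year(1.3)                   # ~41 TB/year  ("about 40 TB/year")
scaled = 10 * base                        # ~410 TB/year ("400 TB/year" at 10x)
with_metadata = 10 * scaled / TB_PER_PB   # ~4 PB/year   (10x for convenient metadata)
after_decade = 10 * with_metadata         # ~41 PB       ("40 petabytes after a decade")

print(f"{base:.0f} TB/yr -> {scaled:.0f} TB/yr -> "
      f"{with_metadata:.1f} PB/yr -> {after_decade:.0f} PB in 10 years")
```

The small mismatches (41 vs. 40, 4.1 vs. 4) come from the quote rounding the seconds-per-year factor of ~31.5 down to "~30".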

I don't really understand how this is going to work. If the blockchain gets this big, then only archive.org or other large storage providers will be able to run full nodes. I'm sure there's something I'm missing, but I'm not sure what it is. Is it actually planned that in the future of Ethereum you'll need petabytes of storage to run a full node?

submitted by /u/aemmeroli