An Overview of 7 Bitcoin Full Node Products

Proposal: The Sia Foundation

Vision Statement

A common sentiment is brewing online: a shared desire for the internet that might have been. After decades of corporate encroachment, you don't need to be a power user to realize that something has gone very wrong.
In the early days of the internet, the future was bright. In that future, when you sent an instant message, it traveled directly to the recipient. When you needed to pay a friend, you announced a transfer of value to their public key. When an app was missing a feature you wanted, you opened up the source code and implemented it. When you took a picture on your phone, it was immediately encrypted and backed up to storage that you controlled. In that future, people would laugh at the idea of having to authenticate themselves to some corporation before doing these things.
What did we get instead? Rather than a network of human-sized communities, we have a handful of enormous commons, each controlled by a faceless corporate entity. Hey user, want to send a message? You can, but we'll store a copy of it indefinitely, unencrypted, for our preference-learning algorithms to pore over; how else could we slap targeted ads on every piece of content you see? Want to pay a friend? You can—in our Monopoly money. Want a new feature? Submit a request to our Support Center and we'll totally maybe think about it. Want to back up a photo? You can—inside our walled garden, which only we (and the NSA, of course) can access. Just be careful what you share, because merely locking you out of your account and deleting all your data is far from the worst thing we could do.
You rationalize this: "MEGACORP would never do such a thing; it would be bad for business." But we all know, at some level, that this state of affairs, this inversion of power, is not merely "unfortunate" or "suboptimal" – No. It is degrading. Even if MEGACORP were purely benevolent, it is degrading that we must ask its permission to talk to our friends; that we must rely on it to safeguard our treasured memories; that our digital lives are completely beholden to those who seek only to extract value from us.
At the root of this issue is the centralization of data. MEGACORP can surveil you—because your emails and video chats flow through their servers. And MEGACORP can control you—because they hold your data hostage. But centralization is a solution to a technical problem: How can we make the user's data accessible from anywhere in the world, on any device? For a long time, no alternative solution to this problem was forthcoming.
Today, thanks to a confluence of established techniques and recent innovations, we have solved the accessibility problem without resorting to centralization. Hashing, encryption, and erasure encoding got us most of the way, but one barrier remained: incentives. How do you incentivize an anonymous stranger to store your data? Earlier protocols like BitTorrent worked around this limitation by relying on altruism, tit-for-tat requirements, or "points" – in other words, nothing you could pay your electric bill with. Finally, in 2009, a solution appeared: Bitcoin. Not long after, Sia was born.
Cryptography has unleashed the latent power of the internet by enabling interactions between mutually-distrustful parties. Sia harnesses this power to turn the cloud storage market into a proper marketplace, where buyers and sellers can transact directly, with no intermediaries, anywhere in the world. No more silos or walled gardens: your data is encrypted, so it can't be spied on, and it's stored on many servers, so no single entity can hold it hostage. Thanks to projects like Sia, the internet is being re-decentralized.
Sia began its life as a startup, which means it has always been subjected to two competing forces: the ideals of its founders, and the profit motive inherent to all businesses. Its founders have taken great pains to never compromise on the former, but this often threatened the company's financial viability. With the establishment of the Sia Foundation, this tension is resolved. The Foundation, freed of the obligation to generate profit, is a pure embodiment of the ideals from which Sia originally sprung.
The goals and responsibilities of the Foundation are numerous: to maintain core Sia protocols and consensus code; to support developers building on top of Sia and its protocols; to promote Sia and facilitate partnerships in other spheres and communities; to ensure that users can easily acquire and safely store siacoins; to develop network scalability solutions; to implement hardforks and lead the community through them; and much more. In a broader sense, its mission is to commoditize data storage, making it cheap, ubiquitous, and accessible to all, without compromising privacy or performance.
Sia is a perfect example of how we can achieve better living through cryptography. We now begin a new chapter in Sia's history. May our stewardship lead it into a bright future.
 

Overview

Today, we are proposing the creation of the Sia Foundation: a new non-profit entity that builds and supports distributed cloud storage infrastructure, with a specific focus on the Sia storage platform. What follows is an informal overview of the Sia Foundation, covering two major topics: how the Foundation will be funded, and what its funds will be used for.

Organizational Structure

The Sia Foundation will be structured as a non-profit entity incorporated in the United States, likely a 501(c)(3) organization or similar. The actions of the Foundation will be constrained by its charter, which formalizes the specific obligations and overall mission outlined in this document. The charter will be updated on an annual basis to reflect the current goals of the Sia community.
The organization will be operated by a board of directors, initially comprising Luke Champine as President and Eddie Wang as Chairman. Luke Champine will be leaving his position at Nebulous to work at the Foundation full-time, and will seek to divest his shares of Nebulous stock along with other potential conflicts of interest. Neither Luke nor Eddie personally owns any siafunds or significant quantities of siacoin.

Funding

The primary source of funding for the Foundation will come from a new block subsidy. Following a hardfork, 30 KS per block will be allocated to the "Foundation Fund," continuing in perpetuity. The existing 30 KS per block miner reward is not affected. Additionally, one year's worth of block subsidies (approximately 1.57 GS) will be allocated to the Fund immediately upon activation of the hardfork.
As detailed below, the Foundation will provably burn any coins that it cannot meaningfully spend. As such, the 30 KS subsidy should be viewed as a maximum. This allows the Foundation to grow alongside Sia without requiring additional hardforks.
The Foundation will not be funded to any degree by the possession or sale of siafunds. Siafunds were originally introduced as a means of incentivizing growth, and we still believe in their effectiveness: a siafund holder wants to increase the amount of storage on Sia as much as possible. While the Foundation obviously wants Sia to succeed, its driving force should be its charter. Deriving significant revenue from siafunds would jeopardize the Foundation's impartiality and focus. Ultimately, we want the Foundation to act in the best interests of Sia, not to grow its own budget.

Responsibilities

The Foundation inherits a great number of responsibilities from Nebulous. Each quarter, the Foundation will publish the progress it has made over the past quarter, and list the responsibilities it intends to prioritize over the coming quarter. This will be accompanied by a financial report, detailing each area of expenditure over the past quarter, and forecasting expenditures for the coming quarter. Below, we summarize some of the myriad responsibilities towards which the Foundation is expected to allocate its resources.

Maintain and enhance core Sia software

Arguably, this is the most important responsibility of the Foundation. At the heart of Sia is its consensus algorithm: regardless of other differences, all Sia software must agree upon the content and rules of the blockchain. It is therefore crucial that the algorithm be stewarded by an entity that is accountable to the community, transparent in its decision-making, and has no profit motive or other conflicts of interest.
Accordingly, Sia’s consensus functionality will no longer be directly maintained by Nebulous. Instead, the Foundation will release and maintain an implementation of a "minimal Sia full node," comprising the Sia consensus algorithm and P2P networking code. The source code will be available in a public repository, and signed binaries will be published for each release.
Other parties may use this code to provide alternative full node software. For example, Nebulous may extend the minimal full node with wallet, renter, and host functionality. The source code of any such implementation may be submitted to the Foundation for review. If the code passes review, the Foundation will provide "endorsement signatures" for the commit hash used and for binaries compiled internally by the Foundation. Specifically, these signatures assert that the Foundation believes the software contains no consensus-breaking changes or other modifications to imported Foundation code. Endorsement signatures and Foundation-compiled binaries may be displayed and distributed by the receiving party, along with an appropriate disclaimer.
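To make the endorsement flow concrete, here is a minimal sketch of how a third party might verify such a signature. This assumes an ed25519 scheme via PyNaCl; the proposal does not name a signature algorithm, and the Foundation key below is a hypothetical placeholder.

```python
# Minimal sketch of endorsement-signature verification, assuming an
# ed25519 scheme via PyNaCl (the proposal does not name an algorithm).
# FOUNDATION_PUBKEY is a hypothetical placeholder, not a real key.
import hashlib

from nacl.exceptions import BadSignatureError
from nacl.signing import VerifyKey

FOUNDATION_PUBKEY = bytes.fromhex("00" * 32)  # placeholder 32-byte key

def verify_endorsement(artifact: bytes, signature: bytes) -> bool:
    """Return True if the Foundation signed the SHA-256 digest of the
    artifact (a commit hash or a compiled binary)."""
    digest = hashlib.sha256(artifact).digest()
    try:
        VerifyKey(FOUNDATION_PUBKEY).verify(digest, signature)
        return True
    except BadSignatureError:
        return False
```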
A minimal full node is not terribly useful on its own; the wallet, renter, host, and other extensions are what make Sia a proper developer platform. Currently, the only implementations of these extensions are maintained by Nebulous. The Foundation will contract Nebulous to ensure that these extensions continue to receive updates and enhancements. Later on, the Foundation intends to develop its own implementations of these extensions and others. As with the minimal node software, these extensions will be open source and available in public repositories for use by any Sia node software.
With the consensus code now managed by the Foundation, the task of implementing and orchestrating hardforks becomes its responsibility as well. When the Foundation determines that a hardfork is necessary (whether through internal discussion or via community petition), a formal proposal will be drafted and submitted for public review, during which arguments for and against the proposal may be submitted to a public repository. During this time, the hardfork code will be implemented, either by Foundation employees or by external contributors working closely with the Foundation. Once the implementation is finished, final arguments will be heard. The Foundation board will then vote whether to accept or reject the proposal, and announce their decision along with appropriate justification. Assuming the proposal was accepted, the Foundation will announce the block height at which the hardfork will activate, and will subsequently release source code and signed binaries that incorporate the hardfork code.
Regardless of the Foundation's decision, it is the community that ultimately determines whether a fork is accepted or rejected – nothing can change that. Foundation node software will never automatically update, so all forks must be explicitly adopted by users. Furthermore, the Foundation will provide replay and wipeout protection for its hardforks, protecting other chains from unintended or malicious reorgs. Similarly, the Foundation will ensure that any file contracts formed prior to a fork activation will continue to be honored on both chains until they expire.
Finally, the Foundation also intends to pursue scalability solutions for the Sia blockchain. In particular, work has already begun on an implementation of Utreexo, which will greatly reduce the space requirements of fully-validating nodes (allowing a full node to be run on a smartphone) while increasing throughput and decreasing initial sync time. A hardfork implementing Utreexo will be submitted to the community as per the process detailed above.
As this is the most important responsibility of the Foundation, it will receive a significant portion of the Foundation’s budget, primarily in the form of developer salaries and contracting agreements.

Support community services

We intend to allocate 25% of the Foundation Fund towards the community. This allocation will be held and disbursed in the form of siacoins, and will pay for grants, bounties, hackathons, and other community-driven endeavours.
Any community-run service, such as a Skynet portal, explorer or web wallet, may apply to have its costs covered by the Foundation. Upon approval, the Foundation will reimburse expenses incurred by the service, subject to the exact terms agreed to. The intent of these grants is not to provide a source of income, but rather to make such services "break even" for their operators, so that members of the community can enrich the Sia ecosystem without worrying about the impact on their own finances.

Ensure easy acquisition and storage of siacoins

Most users will acquire their siacoins via an exchange. The Foundation will provide support to Sia-compatible exchanges, and pursue relevant integrations at its discretion, such as Coinbase's new Rosetta standard. The Foundation may also release DEX software that enables trading cryptocurrencies without the need for a third party. (The Foundation itself will never operate as a money transmitter.)
Increasingly, users are storing their cryptocurrency on hardware wallets. The Foundation will maintain the existing Ledger Nano S integration, and pursue further integrations at its discretion.
Of course, all hardware wallets must be paired with software running on a computer or smartphone, so the Foundation will also develop and/or maintain client-side wallet software, including both full-node wallets and "lite" wallets. Community-operated wallet services, i.e. web wallets, may be funded via grants.
Like core software maintenance, this responsibility will be funded in the form of developer salaries and contracting agreements.

Protect the ecosystem

When it comes to cryptocurrency security, patching software vulnerabilities is table stakes; there are significant legal and social threats that we must be mindful of as well. As such, the Foundation will earmark a portion of its fund to defend the community from legal action. The Foundation will also safeguard the network from 51% attacks and other threats to network security by implementing softforks and/or hardforks where necessary.
The Foundation also intends to assist in the development of a new FOSS software license, and to solicit legal memos on various Sia-related matters, such as hosting in the United States and the EU.
In a broader sense, the establishment of the Foundation makes the ecosystem more robust by transferring core development to a more neutral entity. Thanks to its funding structure, the Foundation will be immune to various forms of pressure that for-profit companies are susceptible to.

Drive adoption of Sia

Although the overriding goal of the Foundation is to make Sia the best platform it can be, all that work will be in vain if no one uses the platform. There are a number of ways the Foundation can promote Sia and get it into the hands of potential users and developers.
In-person conferences are understandably far less popular now, but the Foundation can sponsor and/or participate in virtual conferences. (In-person conferences may be held in the future, circumstances permitting.) Similarly, the Foundation will provide prizes for hackathons, which may be organized by community members, Nebulous, or the Foundation itself. Lastly, partnerships with other companies in the cryptocurrency space—or the cloud storage space—are a great way to increase awareness of Sia. To handle these responsibilities, one of the early priorities of the Foundation will be to hire a marketing director.

Fund Management

The Foundation Fund will be controlled by a multisig address. Each member of the Foundation's board will control one of the signing keys, with the signature threshold to be determined once the final composition of the board is known. (This threshold may also be increased or decreased if the number of board members changes.) Additionally, one timelocked signing key will be controlled by David Vorick. This key will act as a “dead man’s switch,” to be used in the event of an emergency that prevents Foundation board members from reaching the signature threshold. The timelock ensures that this key cannot be used unless the Foundation fails to sign a transaction for several months.
On the 1st of each month, the Foundation will use its keys to transfer all siacoins in the Fund to two new addresses. The first address will be controlled by a high-security hot wallet, and will receive approximately one month's worth of Foundation expenditures. The second address, receiving the remaining siacoins, will be a modified version of the source address: specifically, it will increase the timelock on David Vorick's signing key by one month. Any other changes to the set of signing keys, such as the arrival or departure of board members, will be incorporated into this address as well.
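As a rough illustration of this rollover, consider the following sketch. The FundAddress type and helper names are our hypothetical simplifications; real Sia addresses are hashes of unlock conditions managed by siad.

```python
# Sketch of the monthly Fund rollover described above. The FundAddress
# type and its fields are hypothetical simplifications; real Sia
# addresses are hashes of unlock conditions managed by siad.
from dataclasses import dataclass, replace

BLOCKS_PER_MONTH = 4320  # ~144 ten-minute blocks per day * 30 days

@dataclass(frozen=True)
class FundAddress:
    board_keys: tuple           # one signing key per board member
    threshold: int              # signatures required to spend
    deadman_key: str            # David Vorick's timelocked key
    deadman_unlock_height: int  # height at which that key becomes usable

def monthly_rollover(src: FundAddress, balance: int, monthly_budget: int):
    """Split the Fund into a hot-wallet tranche covering ~one month of
    expenditures and a new cold address whose dead-man's-switch
    timelock is pushed one month further into the future."""
    hot_amount = min(monthly_budget, balance)
    new_cold_address = replace(
        src,
        deadman_unlock_height=src.deadman_unlock_height + BLOCKS_PER_MONTH,
    )
    return hot_amount, new_cold_address, balance - hot_amount
```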
The Foundation Fund is allocated in SC, but many of the Foundation's expenditures must be paid in USD or other fiat currency. Accordingly, the Foundation will convert, at its discretion, a portion of its monthly withdrawals to fiat currency. We expect this conversion to be primarily facilitated by private "OTC" sales to accredited investors. The Foundation currently has no plans to speculate in cryptocurrency or other assets.
Finally, it is important that the Foundation adds value to the Sia platform well in excess of the inflation introduced by the block subsidy. For this reason, the Foundation intends to provably burn, on a quarterly basis, any coins that it cannot allocate towards any justifiable expense. In other words, coins will be burned whenever doing so provides greater value to the platform than any other use. Furthermore, the Foundation will cap its SC treasury at 5% of the total supply, and will cap its USD treasury at 4 years’ worth of predicted expenses.
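A simple sketch of the quarterly burn decision implied by these rules follows; this is our reading of the policy (ignoring the separate USD cap), not official Foundation code.

```python
# Sketch of the quarterly burn decision described above. This is our
# interpretation of the policy, not Foundation code; inputs are in SC.
def coins_to_burn(sc_treasury: float, total_supply: float,
                  justifiable_expenses: float) -> float:
    """Burn whatever exceeds the 5%-of-supply treasury cap, or whatever
    cannot be allocated to a justifiable expense, whichever is greater."""
    over_cap = max(0.0, sc_treasury - 0.05 * total_supply)
    unallocatable = max(0.0, sc_treasury - justifiable_expenses)
    return max(over_cap, unallocatable)
```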
 
Addendum: Hardfork Timeline
We would like to see this proposal finalized and accepted by the community no later than September 30th. A new version of siad, implementing the hardfork, will be released no later than October 15th. The hardfork will activate at block 293220, which is expected to occur around 12pm EST on January 1st, 2021.
 
Addendum: Inflation specifics
The total supply of siacoins as of January 1st, 2021 will be approximately 45.243 GS. The initial subsidy of 1.57 GS thus increases the supply by 3.47%, and the total annual inflation in 2021 will be at most 10.4% (if zero coins are burned). In 2022, total annual inflation will be at most 6.28%, and will steadily decrease in subsequent years.
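These figures can be reproduced with a few lines of arithmetic, assuming ~10-minute blocks and counting both the existing miner reward and the new subsidy as new coins.

```python
# Reproducing the stated inflation figures (Python as a calculator),
# assuming ~10-minute blocks; small differences from the quoted numbers
# come from rounding the block count.
supply_2021 = 45.243e9            # SC in circulation on Jan 1, 2021
initial_subsidy = 1.57e9          # one year of subsidies, paid up front
blocks_per_year = 6 * 24 * 365    # 52,560 blocks
foundation_per_year = 30_000 * blocks_per_year  # ~1.58 GS
miner_per_year = 30_000 * blocks_per_year       # unchanged 30 KS reward

print(initial_subsidy / supply_2021)             # 0.0347 -> 3.47%
new_coins_2021 = initial_subsidy + foundation_per_year + miner_per_year
print(new_coins_2021 / supply_2021)              # 0.104  -> 10.4% max
supply_2022 = supply_2021 + new_coins_2021
new_coins_2022 = foundation_per_year + miner_per_year
print(new_coins_2022 / supply_2022)              # ~0.063 -> the quoted 6.28%
```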
 

Conclusion

We see the establishment of the Foundation as an important step in the maturation of the Sia project. It provides the ecosystem with a sustainable source of funding that can be exclusively directed towards achieving Sia's ambitious goals. Compared to other projects with far deeper pockets, Sia has always punched above its weight; once we're on equal footing, there's no telling what we'll be able to achieve.
Nevertheless, we do not propose this change lightly, and have taken pains to ensure that the Foundation will act in accordance with the ideals that this community shares. It will operate transparently, keep inflation to a minimum, and respect the user's fundamental role in decentralized systems. We hope that everyone in the community will consider this proposal carefully, and look forward to a productive discussion.
submitted by lukechampine to siacoin

Adoption!

submitted by bit_igu to btc

Gridcoin 5.0.0.0-Mandatory "Fern" Release

https://github.com/gridcoin-community/Gridcoin-Research/releases/tag/5.0.0.0
Finally! After over ten months of development and testing, "Fern" has arrived! This is a whopper. 240 pull requests merged. Essentially, the complete rewrite that was started with the scraper (the "neural net" rewrite) in "Denise" has now been completed. Practically the ENTIRE Gridcoin-specific codebase resting on top of the vanilla Bitcoin/Peercoin/Blackcoin PoS code has been rewritten. This removes the team requirement at last (see below), although there are many other important improvements besides that.
Fern was a monumental undertaking. We had to encode all of the old rules active for the v10 block protocol in new code and ensure that the new code was 100% compatible. This had to be done in such a way as to clear out all of the old spaghetti and ring-fence it with tightly controlled class implementations. We then wrote an entirely new, simplified ruleset for research rewards and reengineered contracts (which includes beacon management, polls, and voting) using properly classed code. The fundamentals of Gridcoin with this release are now on a very sound and maintainable footing, and the developers believe the codebase as updated here will serve as the fundamental basis for Gridcoin's future roadmap.
We have been testing this for MONTHS on testnet in various stages. The v10 (legacy) compatibility code has been running on testnet continuously as it was developed to ensure compatibility with existing nodes. During the last few months, we have done two private testnet forks and then the full public testnet testing for v11 code (the new protocol which is what Fern implements). The developers have also been running non-staking "sentinel" nodes on mainnet with this code to verify that the consensus rules are problem-free for the legacy compatibility code on the broader mainnet. We believe this amount of testing is going to result in a smooth rollout.
Given the amount of changes in Fern, I am presenting TWO changelogs below. One is high level, which summarizes the most significant changes in the protocol. The second changelog is the detailed one in the usual format, and gives you an inkling of the size of this release.

Highlights

Protocol

Note that the protocol changes will not become active until we cross the hard-fork transition height to v11, which has been set at 2053000. Given current average block spacing, this should happen around October 4, about one month from now.
Note that to get all of the beacons in the network on the new protocol, we are requiring ALL beacons to be validated. A two-week (14-day) grace period is provided by the code, starting at the time of the transition height, for people currently holding a beacon to validate the beacon and prevent it from expiring. That means that EVERY CRUNCHER must advertise and validate their beacon AFTER the v11 transition (around Oct 4th) and BEFORE October 18th (or more precisely, 14 days from the actual date of the v11 transition).

If you do not advertise and validate your beacon by this time, your beacon will expire and you will stop earning research rewards until you advertise and validate a new beacon. This process has been made much easier by a brand new beacon "wizard" that helps manage beacon advertisements and renewals.

Once a beacon has been validated and is a v11 protocol beacon, the normal 180-day expiration rules apply. Note, however, that the 180-day expiration on research rewards has been removed with the Fern update. This means that while your beacon might expire after 180 days, your earned research rewards will be retained and can be claimed by advertising a beacon with the same CPID and going through the validation process again. In other words, you do not lose any earned research rewards if you do not stake a block within 180 days and keep your beacon up-to-date.
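For those who like it spelled out in code, here is a minimal sketch of these validity rules. It is our simplification: times are in days rather than block heights, and the helper is not the wallet's actual API.

```python
# Sketch of the beacon validity rules above (our simplification; times
# in days, helper name hypothetical, not the wallet's actual API).
GRACE_DAYS = 14    # re-validation window after the v11 transition
EXPIRY_DAYS = 180  # normal beacon lifetime once validated under v11

def beacon_active(last_validated_day, today, v11_transition_day):
    """A legacy beacon must be re-validated within 14 days of the v11
    transition; once validated, it expires 180 days after its last
    validation or renewal. Earned rewards survive expiration."""
    if last_validated_day is None or last_validated_day < v11_transition_day:
        return today < v11_transition_day + GRACE_DAYS
    return today - last_validated_day < EXPIRY_DAYS
```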
The transition height is also when the team requirement will be relaxed for the network.

GUI

Besides the beacon wizard, there are a number of improvements to the GUI, including new UI transaction types (and icons) for staking the superblock, sidestake sends, beacon advertisement, voting, poll creation, and transactions with a message. The main screen has been revamped with a better summary section, and better status icons. Several changes under the hood have improved GUI performance. And finally, the diagnostics have been revamped.

Blockchain

The wallet sync speed has been DRASTICALLY improved. A decent machine with a good network connection should be able to sync the entire mainnet blockchain in less than 4 hours. A fast machine with a really fast network connection and a good SSD can do it in about 2.5 hours. One of our goals was to reduce or eliminate the reliance on snapshots for mainnet, and I think we have accomplished that goal with the new sync speed. We have also streamlined the in-memory structures for the blockchain which shaves some memory use.
There are so many goodies here it is hard to summarize them all.
I would like to thank all of the contributors to this release, but especially thank @cyrossignol, whose incredible contributions formed the backbone of this release. I would also like to pay special thanks to @barton2526, @caraka, and @Quezacoatl1, who tirelessly helped during the testing and polishing phase on testnet with testing and repeated builds for all architectures.
The developers are proud to present this release to the community and we believe this represents the starting point for a true renaissance for Gridcoin!

Summary Changelog

Accrual

Changed

Most significantly, nodes calculate research rewards directly from the magnitudes in EACH superblock between stakes instead of using a two- or three-point average based on a CPID's current magnitude and the magnitude for the CPID when it last staked. For those long-timers in the community, this has been referred to as "Superblock Windows," and was first done in proof-of-concept form by @denravonska.
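As an illustration only (the rate constant and data shapes are ours, not Gridcoin's actual implementation), the new accrual is conceptually a sum over superblock intervals rather than an endpoint average:

```python
# Illustration only: "Superblock Windows" accrual integrates the
# magnitude recorded in EACH superblock between stakes, instead of
# averaging two or three endpoint magnitudes. The rate constant and
# data shapes here are ours, not Gridcoin's actual implementation.
SECONDS_PER_DAY = 86_400

def accrue_rewards(superblocks, now, grc_per_mag_per_day=0.25):
    """superblocks: time-ordered (timestamp, magnitude) pairs for one
    CPID since its last stake. Returns accrued GRC."""
    total = 0.0
    for (t0, mag), (t1, _) in zip(superblocks, superblocks[1:]):
        total += mag * (t1 - t0) / SECONDS_PER_DAY * grc_per_mag_per_day
    if superblocks:  # accrue from the latest superblock up to now
        t_last, mag_last = superblocks[-1]
        total += mag_last * (now - t_last) / SECONDS_PER_DAY * grc_per_mag_per_day
    return total
```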

Removed

Beacons

Added

Changed

Removed

Unaltered

As a reminder:

Superblocks

Added

Changed

Removed

Voting

Added

Changed

Removed

Detailed Changelog

[5.0.0.0] 2020-09-03, mandatory, "Fern"

Added

Changed

Removed

Fixed

submitted by jamescowens to gridcoin

Bob The Magic Custodian



Summary: Everyone knows that when you give your assets to someone else, they always keep them safe. If this is true for individuals, it is certainly true for businesses.
Custodians always tell the truth and manage funds properly. They won't have any interest in taking the assets as an exchange operator would. Auditors tell the truth and can't be misled. That's because organizations that are regulated are incapable of lying and don't make mistakes.

First, some background. Here is a summary of how custodians make us more secure:

Previously, we might give Alice our crypto assets to hold. There were risks:

But "no worries", Alice has a custodian named Bob. Bob is dressed in a nice suit. He knows some politicians. And he drives a Porsche. "So you have nothing to worry about!". And look at all the benefits we get:
See - all problems are solved! All we have to worry about now is:
It's pretty simple. Before we had to trust Alice. Now we only have to trust Alice, Bob, and all the ways in which they communicate. Just think of how much more secure we are!

"On top of that", Bob assures us, "we're using a special wallet structure". Bob shows Alice a diagram. "We've broken the balance up and store it in lots of smaller wallets. That way", he assures her, "a thief can't take it all at once". And he points to a historic case where a large sum was taken "because it was stored in a single wallet... how stupid".
"Very early on, we used to have all the crypto in one wallet", he said, "and then one Christmas a hacker came and took it all. We call him the Grinch. Now we individually wrap each crypto and stick it under a binary search tree. The Grinch has never been back since."

"As well", Bob continues, "even if someone were to get in, we've got insurance. It covers all thefts and even coercion, collusion, and misplaced keys - only subject to the policy terms and conditions." And with that, he pulls out a phone-book sized contract and slams it on the desk with a thud. "Yep", he continues, "we're paying top dollar for one of the best policies in the country!"
"Can I read it?' Alice asks. "Sure," Bob says, "just as soon as our legal team is done with it. They're almost through the first chapter." He pauses, then continues. "And can you believe that sales guy Mike? He has the same year Porsche as me. I mean, what are the odds?"

"Do you use multi-sig?", Alice asks. "Absolutely!" Bob replies. "All our engineers are fully trained in multi-sig. Whenever we want to set up a new wallet, we generate 2 separate keys in an air-gapped process and store them in this proprietary system here. Look, it even requires the biometric signature from one of our team members to initiate any withdrawal." He demonstrates by pressing his thumb into the display. "We use a third-party cloud validation API to match the thumbprint and authorize each withdrawal. The keys are also backed up daily to an off-site third-party."
"Wow that's really impressive," Alice says, "but what if we need access for a withdrawal outside of office hours?" "Well that's no issue", Bob says, "just send us an email, call, or text message and we always have someone on staff to help out. Just another part of our strong commitment to all our customers!"

"What about Proof of Reserve?", Alice asks. "Of course", Bob replies, "though rather than publish any blockchain addresses or signed transaction, for privacy we just do a SHA256 refactoring of the inverse hash modulus for each UTXO nonce and combine the smart contract coefficient consensus in our hyperledger lightning node. But it's really simple to use." He pushes a button and a large green checkmark appears on a screen. "See - the algorithm ran through and reserves are proven."
"Wow", Alice says, "you really know your stuff! And that is easy to use! What about fiat balances?" "Yeah, we have an auditor too", Bob replies, "Been using him for a long time so we have quite a strong relationship going! We have special books we give him every year and he's very efficient! Checks the fiat, crypto, and everything all at once!"

"We used to have a nice offline multi-sig setup we've been using without issue for the past 5 years, but I think we'll move all our funds over to your facility," Alice says. "Awesome", Bob replies, "Thanks so much! This is perfect timing too - my Porsche got a dent on it this morning. We have the paperwork right over here." "Great!", Alice replies.
And with that, Alice gets out her pen and Bob gets the contract. "Don't worry", he says, "you can take your crypto-assets back anytime you like - just subject to our cancellation policy. Our annual management fees are also super low and we don't adjust them often".

How many holes have to exist for your funds to get stolen?
Just one.

Why are we taking a powerful offline multi-sig setup, widely used globally in hundreds of different/lacking regulatory environments with 0 breaches to date, and circumventing it with a demonstrably weak third-party layer? And paying a great expense to do so?
If you go through the list of breaches in the past 2 years to highly credible organizations, you go through the list of major corporate frauds (only the ones we know about), you go through the list of all the times platforms have lost funds, you go through the list of times and ways that people have lost their crypto from identity theft, hot wallet exploits, extortion, etc... and then you go through this custodian with a fine-tooth comb and truly believe they have value to add far beyond what you could, sticking your funds in a wallet (or set of wallets) they control exclusively is the absolute worst possible way to take advantage of that security.

The best way to add security for crypto-assets is to make a stronger multi-sig. With one custodian, what you are doing is giving them your cryptocurrency and hoping they're honest, competent, and flawlessly secure. It's no different than storing it on a really secure exchange. Maybe the insurance will cover you. Didn't work for Bitpay in 2015. Didn't work for Yapizon in 2017. Insurance has never paid a claim in the entire history of cryptocurrency. But maybe you'll get lucky. Maybe your exact scenario will buck the trend and be what they're willing to cover. After the large deductible and hopefully without a long and expensive court battle.

And you want to advertise this increase in risk, the lapse of judgement, an accident waiting to happen, as though it's some kind of benefit to customers ("Free institutional-grade storage for your digital assets.")? And then some people are writing to the OSC that custodians should be mandatory for all funds on every exchange platform? That this somehow will make Canadians as a whole more secure or better protected compared with standard air-gapped multi-sig? On what planet?

Most of the problems in Canada stemmed from one thing - a lack of transparency. If Canadians had known what a joke Quadriga was - it wouldn't have grown to lose $400m from hard-working Canadians from coast to coast to coast. And Gerald Cotten would be in jail, not wherever he is now (at best, rotting peacefully). EZ-BTC and mister Dave Smilie would have been a tiny little scam to his friends, not a multi-million dollar fraud. Einstein would have got their act together or been shut down BEFORE losing millions and millions more in people's funds generously donated to criminals. MapleChange wouldn't have even been a thing. And maybe we'd know a little more about CoinTradeNewNote - like how much was lost in there. Almost all of the major losses with cryptocurrency exchanges involve deception with unbacked funds.
So it's great to see transparency reports from BitBuy and ShakePay where someone independently verified the backing. The only thing we don't have is:
It's not complicated to validate cryptocurrency assets. They need to exist, they need to be spendable, and they need to cover the total balances. There are plenty of credible people and firms across the country that have the capacity to reasonably perform this validation. Having more frequent checks by different, independent, parties who publish transparent reports is far more valuable than an annual check by a single "more credible/official" party who does the exact same basic checks and may or may not publish anything. Here's an example set of requirements that could be mandated:
There are ways to structure audits such that neither crypto assets nor customer information are ever put at risk, and both can still be properly validated and publicly verifiable. There are also ways to structure audits such that they are completely reasonable for small platforms and don't inhibit innovation in any way. By making the process as reasonable as possible, we can completely eliminate any reason/excuse that an honest platform would have for not being audited. That is arguably far more important than any incremental improvement we might get from mandating "the best of the best" accountants. Right now we have nothing mandated and tons of Canadians using offshore exchanges with no oversight whatsoever.

Transparency does not prove crypto assets are safe. CoinTradeNewNote, Flexcoin ($600k), and Canadian Bitcoins ($100k) are examples where crypto-assets were breached from platforms in Canada. All of them were online wallets and used no multi-sig as far as any records show. This is consistent with what we see globally - air-gapped multi-sig wallets have an impeccable record, while other schemes tend to suffer breach after breach. We don't actually know how much CoinTrader lost because there was no visibility. Rather than publishing details of what happened, the co-founder of CoinTrader silently moved on to found another platform - the "most trusted way to buy and sell crypto" - a site that has no information whatsoever (that I could find) on the storage practices and a FAQ advising that “[t]rading cryptocurrency is completely safe” and that having your own wallet is “entirely up to you! You can certainly keep cryptocurrency, or fiat, or both, on the app.” Doesn't sound like much was learned here, which is really sad to see.
It's not that complicated or unreasonable to set up a proper hardware wallet. Multi-sig can be learned in a single course. Something the equivalent complexity of a driver's license test could prevent all the cold storage exploits we've seen to date - even globally. Platform operators have a key advantage in detecting and preventing fraud - they know their customers far better than any custodian ever would. The best job that custodians can do is to find high integrity individuals and train them to form even better wallet signatories. Rather than mandating that all platforms expose themselves to arbitrary third party risks, regulations should center around ensuring that all signatories are background-checked, properly trained, and using proper procedures. We also need to make sure that signatories are empowered with rights and responsibilities to reject and report fraud. They need to know that they can safely challenge and delay a transaction - even if it turns out they made a mistake. We need to have an environment where mistakes are brought to the surface and dealt with. Not one where firms and people feel the need to hide what happened. In addition to a knowledge-based test, an auditor can privately interview each signatory to make sure they're not in coercive situations, and we should make sure they can freely and anonymously report any issues without threat of retaliation.
A proper multi-sig has each signature held by a separate person and is governed by policies and mutual decisions instead of a hierarchy. It includes at least one redundant signature. For best results, 3of4, 3of5, 3of6, 4of5, 4of6, 4of7, 5of6, or 5of7.
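The appeal of those thresholds is easy to quantify; here is a quick sketch comparing redundancy (how many keys can be lost before funds freeze) against collusion resistance (how many signers must conspire to steal):

```python
# Quick look at the trade-off behind the recommended thresholds: how
# many keys can be lost before funds freeze (redundancy) versus how
# many signers must collude to steal (collusion resistance).
for m, n in [(3, 4), (3, 5), (3, 6), (4, 5), (4, 6), (4, 7), (5, 6), (5, 7)]:
    print(f"{m}of{n}: survives losing {n - m} key(s); "
          f"theft requires {m} colluding signer(s)")
```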

History has demonstrated over and over again the risk of hot wallets even to highly credible organizations. Nonetheless, many platforms have hot wallets for convenience. While such losses are generally compensated by platforms without issue (for example Poloniex, Bitstamp, Bitfinex, Gatecoin, Coincheck, Bithumb, Zaif, CoinBene, Binance, Bitrue, Bitpoint, Upbit, VinDAX, and now KuCoin), the public tends to focus more on cases that didn't end well. Regardless of what systems are employed, there is always some level of risk. For that reason, most members of the public would prefer to see third party insurance.
Rather than trying to convince third party profit-seekers to provide comprehensive insurance and then relying on an expensive and slow legal system to enforce against whatever legal loopholes they manage to find each and every time something goes wrong, insurance could be run through multiple exchange operators and regulators, with the shared interest of having a reputable industry, keeping costs down, and taking care of Canadians. For example, a 4 of 7 multi-sig insurance fund held between 5 independent exchange operators and 2 regulatory bodies. All Canadian exchanges could pay premiums at a set rate based on their needed coverage, with a higher price paid for hot wallet coverage (anything not an air-gapped multi-sig cold wallet). Such a model would be much cheaper to manage, offer better coverage, and be much more reliable to payout when needed. The kind of coverage you could have under this model is unheard of. You could even create something like the CDIC to protect Canadians who get their trading accounts hacked if they can sufficiently prove the loss is legitimate. In cases of fraud, gross negligence, or insolvency, the fund can be used to pay affected users directly (utilizing the last transparent balance report in the worst case), something which private insurance would never touch. While it's recommended to have official policies for coverage, a model where members vote would fully cover edge cases. (Could be similar to the Supreme Court where justices vote based on case law.)
Such a model could fully protect all Canadians across all platforms. You can have a fiat coverage governed by legal agreements, and crypto-asset coverage governed by both multi-sig and legal agreements. It could be practical, affordable, and inclusive.

Now, we are at a crossroads. We can happily give up our freedom, our innovation, and our money. We can pay hefty expenses to auditors, lawyers, and regulators year after year (and make no mistake - this cost will grow to many millions or even billions as the industry grows - and it will be borne by all Canadians on every platform because platforms are not going to eat up these costs at a loss). We can make it nearly impossible for any new platform to enter the marketplace, forcing Canadians to use the same stagnant platforms year after year. We can centralize and consolidate the entire industry into 2 or 3 big players and have everyone else fail (possibly to heavy losses of users of those platforms). And when a flawed security model doesn't work and gets breached, we can make it even more complicated with even more people in suits making big money doing the job that blockchain was supposed to do in the first place. We can build a system which is so intertwined and dependent on big government, traditional finance, and central bankers that its future depends entirely on that of the fiat system, of fractional banking, and of government bail-outs. If we choose this path, as history has shown us over and over again, we cannot go back, save for revolution. Our children and grandchildren will still be paying the consequences of what we decided today.
Or, we can find solutions that work. We can maintain an open and innovative environment while making the adjustments we need to make to fully protect Canadian investors and cryptocurrency users, giving easy and affordable access to cryptocurrency for all Canadians on the platform of their choice, and creating an environment in which entrepreneurs and problem solvers can bring those solutions forward easily. None of the above precludes innovation in any way, or adds any unreasonable cost - and these three policies would demonstrably eliminate or resolve all 109 historic cases as studied here - that's every single case researched so far going back to 2011. It includes every loss that was studied so far not just in Canada but globally as well.
Unfortunately, finding answers is the least challenging part. Far more challenging is to get platform operators and regulators to agree on anything. My last post got no response whatsoever, and while the OSC has told me they're happy for industry feedback, I believe my opinion alone is fairly meaningless. This takes the whole community working together to solve. So please let me know your thoughts. Please take the time to upvote and share this with people. Please - let's get this solved and not leave it up to other people to do.

Facts/background/sources (skip if you like):



Thoughts?
submitted by azoundria2 to QuadrigaInitiative

Scaling Reddit Community Points with Arbitrum Rollup: a piece of cake

Submitted for consideration to The Great Reddit Scaling Bake-Off
Baked by the pastry chefs at Offchain Labs
Please send questions or comments to [email protected]
1. Overview
We're excited to submit Arbitrum Rollup for consideration to The Great Reddit Scaling Bake-Off. Arbitrum Rollup is the only Ethereum scaling solution that supports arbitrary smart contracts without compromising on Ethereum's security or adding points of centralization. For Reddit, this means that Arbitrum can not only scale the minting and transfer of Community Points, but it can foster a creative ecosystem built around Reddit Community Points enabling points to be used in a wide variety of third party applications. That's right -- you can have your cake and eat it too!
Arbitrum Rollup isn't just Ethereum-style. Its Layer 2 transactions are byte-for-byte identical to Ethereum, which means Ethereum users can continue to use their existing addresses and wallets, and Ethereum developers can continue to use their favorite toolchains and development environments out-of-the-box with Arbitrum. Coupling Arbitrum’s tooling-compatibility with its trustless asset interoperability, Reddit not only can scale but can onboard the entire Ethereum community at no cost by giving them the same experience they already know and love (well, certainly know).
To benchmark how Arbitrum can scale Reddit Community Points, we launched the Reddit contracts on an Arbitrum Rollup chain. Since Arbitrum provides full Solidity support, we didn't have to rewrite the Reddit contracts or try to mimic their functionality using an unfamiliar paradigm. Nope, none of that. We launched the Reddit contracts unmodified on Arbitrum Rollup complete with support for minting and distributing points. Like every Arbitrum Rollup chain, the chain included a bridge interface in which users can transfer Community Points or any other asset between the L1 and L2 chains. Arbitrum Rollup chains also support dynamic contract loading, which would allow third-party developers to launch custom ecosystem apps that integrate with Community Points on the very same chain that runs the Reddit contracts.
1.1 Why Ethereum
Perhaps the most exciting benefit of distributing Community Points using a blockchain is the ability to seamlessly port points to other applications and use them in a wide variety of contexts. Applications may include simple transfers such as a restaurant that allows Redditors to spend points on drinks. Or it may include complex smart contracts -- such as placing Community Points as a wager for a multiparty game or as collateral in a financial contract.
The common denominator between all of the fun uses of Reddit points is that it needs a thriving ecosystem of both users and developers, and the Ethereum blockchain is perhaps the only smart contract platform with significant adoption today. While many Layer 1 blockchains boast lower cost or higher throughput than the Ethereum blockchain, more often than not, these attributes mask the reality of little usage, weaker security, or both.
Perhaps another platform with significant usage will rise in the future. But today, Ethereum captures the mindshare of the blockchain community, and for Community Points to provide the most utility, the Ethereum blockchain is the natural choice.
1.2 Why Arbitrum
While Ethereum's ecosystem is unmatched, the reality is that fees are high and capacity is too low to support the scale of Reddit Community Points. Enter Arbitrum. Arbitrum Rollup provides all of the ecosystem benefits of Ethereum, but with orders of magnitude more capacity and at a fraction of the cost of native Ethereum smart contracts. And most of all, we don't change the experience for users. They continue to use the same wallets, addresses, languages, and tools.
Arbitrum Rollup is not the only solution that can scale payments, but it is the only developed solution that can scale both payments and arbitrary smart contracts trustlessly, which means that third party users can build highly scalable add-on apps that can be used without withdrawing money from the Rollup chain. If you believe that Reddit users will want to use their Community Points in smart contracts--and we believe they will--then it makes the most sense to choose a single scaling solution that can support the entire ecosystem, eliminating friction for users.
We view being able to run smart contracts in the same scaling solution as fundamentally critical since if there's significant demand in running smart contracts from Reddit's ecosystem, this would be a load on Ethereum and would itself require a scaling solution. Moreover, having different scaling solutions for the minting/distribution/spending of points and for third party apps would be burdensome for users as they'd have to constantly shuffle their Points back and forth.
2. Arbitrum at a glance
Arbitrum Rollup has a unique value proposition as it offers a combination of features that no other scaling solution achieves. Here we highlight its core attributes.
Decentralized. Arbitrum Rollup is as decentralized as Ethereum. Unlike some other Layer 2 scaling projects, Arbitrum Rollup doesn't have any centralized components or centralized operators who can censor users or delay transactions. Even in non-custodial systems, centralized components provide a risk as the operators are generally incentivized to increase their profit by extracting rent from users often in ways that severely degrade user experience. Even if centralized operators are altruistic, centralized components are subject to hacking, coercion, and potential liability.
Massive Scaling. Arbitrum achieves order of magnitude scaling over Ethereum's L1 smart contracts. Our software currently supports 453 transactions-per-second for basic transactions (at 1616 Ethereum gas per tx). We have a lot of room left to optimize (e.g. aggregating signatures), and over the next several months capacity will increase significantly. As described in detail below, Arbitrum can easily support and surpass Reddit's anticipated initial load, and its capacity will continue to improve as Reddit's capacity needs grow.
Low cost. The cost of running Arbitrum Rollup is quite low compared to L1 Ethereum and other scaling solutions such as those based on zero-knowledge proofs. Layer 2 fees are low, fixed, and predictable and should not be overly burdensome for Reddit to cover. Nobody needs to use special equipment or high-end machines. Arbitrum requires validators, which is a permissionless role that can be run on any reasonable on-line machine. Although anybody can act as a validator, in order to protect against a “tragedy of the commons” and make sure reputable validators are participating, we support a notion of “invited validators” that are compensated for their costs. In general, users pay (low) fees to cover the invited validators’ costs, but we imagine that Reddit may cover this cost for its users. See more on the costs and validator options below.
Ethereum Developer Experience. Not only does Arbitrum support EVM smart contracts, but the developer experience is identical to that of L1 Ethereum contracts and fully compatible with Ethereum tooling. Developers can port existing Solidity apps or write new ones using their favorite and familiar toolchains (e.g. Truffle, Buidler). There are no new languages or coding paradigms to learn.
Ethereum wallet compatibility. Just as in Ethereum, Arbitrum users need only hold keys, but do not have to store any coin history or additional data to protect or access their funds. Since Arbitrum transactions are semantically identical to Ethereum L1 transactions, existing Ethereum users can use their existing Ethereum keys with their existing wallet software such as Metamask.
Token interoperability. Users can easily transfer their ETH, ERC-20 and ERC-721 tokens between Ethereum and the Arbitrum Rollup chain. As we explain in detail below, it is possible to mint tokens in L2 that can subsequently be withdrawn and recognized by the L1 token contract.
Fast finality. Transactions complete with the same finality time as Ethereum L1 (and it's possible to get faster finality guarantees by trading away trust assumptions; see the Arbitrum Rollup whitepaper for details).
Non-custodial. Arbitrum Rollup is a non-custodial scaling solution, so users control their funds/points and neither Reddit nor anyone else can ever access or revoke points held by users.
Censorship Resistant. Since it's completely decentralized, and the Arbitrum protocol guarantees progress trustlessly, Arbitrum Rollup is just as censorship-proof as Ethereum.
Block explorer. The Arbitrum Rollup block explorer allows users to view and analyze transactions on the Rollup chain.
Limitations
Although this is a bake-off, we're not going to sugar coat anything. Arbitrum Rollup, like any Optimistic Rollup protocol, does have one limitation, and that's the delay on withdrawals.
As for the concrete length of the delay, we've done a good deal of internal modeling and have blogged about this as well. Our current modeling suggests a 3-hour delay is sufficient (but as discussed in the linked post there is a tradeoff space between the length of the challenge period and the size of the validators’ deposit).
Note that this doesn't mean that the chain is delayed for three hours. Arbitrum Rollup supports pipelining of execution, which means that validators can keep building new states even while previous ones are “in the pipeline” for confirmation. As the challenge delays expire for each update, a new state will be confirmed (read more about this here).
So activity and progress on the chain are not delayed by the challenge period. The only thing that's delayed is the consummation of withdrawals. Recall though that any single honest validator knows immediately (at the speed of L1 finality) which state updates are correct and can guarantee that they will eventually be confirmed, so once a valid withdrawal has been requested on-chain, every honest party knows that the withdrawal will definitely happen. There's a natural place here for a liquidity market in which a validator (or someone who trusts a validator) can provide withdrawal loans for a small interest fee. This is a no-risk business for them as they know which withdrawals will be confirmed (and can force their confirmation trustlessly no matter what anyone else does) but are just waiting for on-chain finality.
3. The recipe: How Arbitrum Rollup works
For a description of the technical components of Arbitrum Rollup and how they interact to create a highly scalable protocol with a developer experience that is identical to Ethereum, please refer to the following documents:
Arbitrum Rollup Whitepaper
Arbitrum academic paper (describes a previous version of Arbitrum)
4. Developer docs and APIs
For full details about how to set up and interact with an Arbitrum Rollup chain or validator, please refer to our developer docs, which can be found at https://developer.offchainlabs.com/.
Note that the Arbitrum version described on that site is older and will soon be replaced by the version we are entering in Reddit Bake-Off, which is still undergoing internal testing before public release.
5. Who are the validators?
As with any Layer 2 protocol, advancing the protocol correctly requires at least one validator (sometimes called block producers) that is honest and available. A natural question is: who are the validators?
Recall that the validator set for an Arbitrum chain is open and permissionless; anyone can start or stop validating at will. (A useful analogy is to full nodes on an L1 chain.) But we understand that even though anyone can participate, Reddit may want to guarantee that highly reputable nodes are validating their chain. Reddit may choose to validate the chain themselves and/or hire third-party validators. To this end, we have begun building a marketplace for validator-for-hire services so that dapp developers can outsource validation services to reputable nodes with high up-time. We've announced a partnership in which Chainlink nodes will provide Arbitrum validation services, and we expect to announce more partnerships shortly with other blockchain infrastructure providers.
Although there is no requirement that validators are paid, Arbitrum’s economic model tracks validators’ costs (e.g. amount of computation and storage) and can charge small fees on user transactions, using a gas-type system, to cover those costs. Alternatively, a single party such as Reddit can agree to cover the costs of invited validators.
6. Reddit Contract Support
Since Arbitrum contracts and transactions are byte-for-byte compatible with Ethereum, supporting the Reddit contracts is as simple as launching them on an Arbitrum chain.
Minting. Arbitrum Rollup supports hybrid L1/L2 tokens which can be minted in L2 and then withdrawn onto the L1. An L1 contract at address A can make a special call to the EthBridge which deploys a "buddy contract" to the same address A on an Arbitrum chain. Since it's deployed at the same address, users can know that the L2 contract is the authorized "buddy" of the L1 contract on the Arbitrum chain.
For minting, the L1 contract is a standard ERC-20 contract which mints and burns tokens when requested by the L2 contract. It is paired with an ERC-20 contract in L2 which mints tokens based on whatever programmer-provided minting facility is desired and burns tokens when they are withdrawn from the rollup chain. Given this base infrastructure, Arbitrum can support any smart-contract-based method for minting tokens in L2, and indeed we directly support Reddit's signature/claim-based minting in L2.
Batch minting. What's better than a mint cookie? A whole batch! In addition to supporting Reddit’s current minting/claiming scheme, we built a second minting design, which we believe outperforms the signature/claim system in many scenarios.
In the current system, Reddit periodically issues signed statements to users, who then take those statements to the blockchain to claim their tokens. An alternative approach would have Reddit directly submit the list of users/amounts to the blockchain and distribute the tokens to the users without the signature/claim process.
To optimize the cost efficiency of this approach, we designed an application-specific compression scheme to minimize the size of the batch distribution list. We analyzed the data from Reddit's previous distributions and found that the data is highly compressible since token amounts are small and repeated, and addresses appear multiple times. Our function groups transactions by size, and replaces previously-seen addresses with a shorter index value. We wrote client code to compress the data, wrote a Solidity decompressing function, and integrated that function into Reddit’s contract running on Arbitrum.
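As a rough illustration of the address-indexing idea, here is a minimal Python sketch with a hypothetical wire format (our actual encoding also groups entries by amount size and differs in its details):

    def compress_batch(mints):
        # mints: list of (20-byte address, integer amount) pairs
        seen = {}            # address -> index of its first appearance
        out = bytearray()
        for addr, amount in mints:
            if addr in seen:
                # repeated address: a 3-byte back-reference replaces 20 bytes
                out += b"\x01" + seen[addr].to_bytes(3, "big")
            else:
                seen[addr] = len(seen)
                out += b"\x00" + addr
            # token amounts are small and repetitive, so 2 bytes suffice here
            out += amount.to_bytes(2, "big")
        return bytes(out)

    mints = [(bytes.fromhex("aa" * 20), 100), (bytes.fromhex("aa" * 20), 100)]
    print(len(compress_batch(mints)))   # 29 bytes, vs 44 for the naive encoding

The savings grow with the repetition rate, which is exactly what the historical distribution data exhibits.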
When we ran the compression function on the previous Reddit distribution data, we found that we could compress batched minting data down to 11.8 bytes per minting event (averaged over a 6-month trace of Reddit's historical token grants), compared with roughly 174 bytes of on-chain data needed for the signature/claim approach to minting (roughly 43 for an RLP-encoded null transaction + 65 for Reddit's signature + 65 for the user's signature + roughly 8 for the number of Points).
The relative benefit of the two approaches with respect to on-chain call data cost depends on the percentage of users that will actually claim their tokens on chain. With the above figures, batch minting will be cheaper if roughly 5% of users redeem their claims. We stress that our compression scheme is not Arbitrum-specific and would be beneficial in any general-purpose smart contract platform.
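The break-even arithmetic can be sketched in a couple of lines. This is the naive per-byte comparison; the ~5% figure quoted above comes from the full gas model, which prices zero and non-zero calldata bytes differently, while the naive ratio lands nearby at about 7%:

    batch_bytes_per_mint = 11.8   # batch minting posts data for every user
    claim_bytes = 174             # signature/claim posts data only for redeemers

    # batch cost ~ N * 11.8; claim cost ~ N * p * 174 at redemption rate p
    p = batch_bytes_per_mint / claim_bytes
    print(f"naive break-even redemption rate: {p:.1%}")   # about 6.8%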
8. Benchmarks and costs
In this section, we give the full costs of operating the Reddit contracts on an Arbitrum Rollup chain, including the L1 gas costs for the Rollup chain, the costs of computation and storage for the L2 validators, and the capital lockup requirements for staking.
Arbitrum Rollup is still on testnet, so we did not run mainnet benchmarks. Instead, we measured the L1 gas cost and L2 workload for Reddit operations on Arbitrum and calculated the total cost assuming current Ethereum gas prices. As noted below in detail, our measurements do not assume that Arbitrum is consuming the entire capacity of Ethereum. We will present the details of our model now, but for full transparency you can also play around with it yourself and adjust the parameters by copying the spreadsheet found here.
Our cost model is based on measurements of Reddit’s contracts, running unmodified (except for the addition of a batch minting function) on Arbitrum Rollup on top of Ethereum.
On the distribution of transactions and frequency of assertions. Reddit's instructions specify the following minimum parameters that submissions should support:
Over a 5 day period, your scaling PoC should be able to handle:
  • 100,000 point claims (minting & distributing points)
  • 25,000 subscriptions
  • 75,000 one-off points burning
  • 100,000 transfers
We provide the full costs of operating an Arbitrum Rollup chain with this usage under the assumption that tokens are minted or granted to users in batches, but other transactions are uniformly distributed over the 5 day period. Unlike some other submissions, we do not make unrealistic assumptions that all operations can be submitted in enormous batches. We assume that batch minting is done in batches that use only a few percent of an L1 block's gas, and that other operations come in evenly over time and are submitted in batches, with one batch every five minutes to keep latency reasonable. (Users are probably already waiting for L1 finality, which takes at least that long to achieve.)
We note that assuming that only 300,000 transactions arrive, uniformly over the 5 day period, will make our benchmark numbers lower, but we believe that this will reflect the true cost of running the system. To see why, say that batches are submitted every five minutes (20 L1 blocks) and there's a fixed overhead of c bytes of calldata per batch, the cost of which gets amortized over all transactions executed in that batch. Assume that each individual transaction adds a marginal cost of t. Lastly, assume the capacity of the scaling system is high enough that it can support all of Reddit's 300,000 transactions within a single 20-block batch (i.e. that there are more than c + 300,000*t bytes of calldata available in 20 blocks).
Consider what happens if c, the per-batch overhead, is large (which it is in some systems, but not in Arbitrum). In the scenario that transactions actually arrive at the system's capacity and each batch is full, c gets amortized over 300,000 transactions. But if we assume that the system is not running at capacity, and only receives 300,000 transactions arriving uniformly over 5 days, then each 20-block assertion will contain about 200 transactions, and each transaction will pay a nontrivial share of c.
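To make the effect concrete, here is the amortization arithmetic with hypothetical values of c and t (the constants are illustrative, not Arbitrum measurements):

    c, t = 10_000, 50                   # per-batch overhead, per-tx cost (bytes)
    total_tx = 300_000
    batches = 5 * 24 * 60 // 5          # one batch every five minutes = 1440

    # at capacity: the entire workload amortizes c across one giant batch
    at_capacity = (c + total_tx * t) / total_tx

    # uniform arrival: ~208 txs per batch, each paying a real share of c
    per_batch = total_tx / batches
    uniform = (c + per_batch * t) / per_batch

    print(f"{per_batch:.0f} txs/batch; bytes per tx: "
          f"{at_capacity:.2f} at capacity vs {uniform:.2f} uniform")

With a large c, the per-transaction cost nearly doubles under uniform arrival; with Arbitrum's small per-batch overhead, the gap is modest.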
We are aware that other proposals presented scaling numbers assuming that 300,000 transactions arrived at maximum capacity and were executed in a single mega-transaction, but according to our estimates, for at least one such report, this led to a reported gas price that was 2-3 orders of magnitude lower than it would have been assuming uniform arrival. We make more realistic batching assumptions, and we believe Arbitrum compares well when batch sizes are realistic.
Our model. Our cost model includes several sources of cost:
  • L1 gas costs: This is the cost of posting transactions as calldata on the L1 chain, as well as the overhead associated with each batch of transactions, and the L1 cost of settling transactions in the Arbitrum protocol.
  • Validator’s staking costs: In normal operation, one validator will need to be staked. The stake is assumed to be 0.2% of the total value of the chain (which is assumed to be $1 per user who is eligible to claim points). The cost of staking is the interest that could be earned on the money if it were not staked.
  • Validator computation and storage: Every validator must do computation to track the chain’s processing of transactions, and must maintain storage to keep track of the contracts’ EVM storage. The cost of computation and storage are estimated based on measurements, with the dollar cost of resources based on Amazon Web Services pricing.
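As a toy illustration of how the model's three components combine (every dollar figure below is a placeholder, not one of our measurements; the real parameters are in the spreadsheet referenced above):

    users = 100_000
    chain_value = users * 1.00             # $1 per user eligible to claim points
    stake = 0.002 * chain_value            # one staked validator at 0.2%
    staking_cost = stake * 0.05 * 5 / 365  # forgone interest over the 5-day trial

    l1_calldata = 1500.0                   # placeholder: the dominant term
    l2_compute_storage = 25.0              # placeholder: AWS-priced estimate

    total = l1_calldata + staking_cost + l2_compute_storage
    print(f"stake ${stake:.0f}, staking cost ${staking_cost:.2f}, "
          f"total ${total:.2f}")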
It’s clear from our modeling that the predominant cost is for L1 calldata. This will probably be true for any plausible rollup-based system.
Our model also shows that Arbitrum can scale to workloads much larger than Reddit’s nominal workload, without exhausting L1 or L2 resources. The scaling bottleneck will ultimately be calldata on the L1 chain. We believe that cost could be reduced substantially if necessary by clever encoding of data. (In our design any compression / decompression of L2 transaction calldata would be done by client software and L2 programs, never by an L1 contract.)
9. Status of Arbitrum Rollup
Arbitrum Rollup is live on Ethereum testnet. All of the code written to date, including everything in the Reddit demo, is open source and permissively licensed under the Apache V2 license. The first testnet version of Arbitrum Rollup was released in February. Our current internal version, which we used to benchmark the Reddit contracts, will be released soon and will be a major upgrade.
Both the Arbitrum design and the implementation are being heavily audited by independent third parties. The Arbitrum academic paper was published at USENIX Security, a top-tier peer-reviewed academic venue. For the Arbitrum software, we have engaged Trail of Bits for a security audit, which is currently ongoing, and we are committed to having a clean report before launching on Ethereum mainnet.
10. Reddit Universe Arbitrum Rollup Chain
The benchmarks described in this document were all measured using the latest internal build of our software. When we release the new software upgrade publicly we will launch a Reddit Universe Arbitrum Rollup chain as a public demo, which will contain the Reddit contracts as well as a Uniswap instance and a Connext Hub, demonstrating how Community Points can be integrated into third party apps. We will also allow members of the public to dynamically launch ecosystem contracts. We at Offchain Labs will cover the validating costs for the Reddit Universe public demo.
If the folks at Reddit would like to evaluate our software prior to our public demo, please email us at [email protected] and we'd be more than happy to provide early access.
11. Even more scaling: Arbitrum Sidechains
Rollups are an excellent approach to scaling, and we are excited about Arbitrum Rollup which far surpasses Reddit's scaling needs. But looking forward to Reddit's eventual goal of supporting hundreds of millions of users, there will likely come a time when Reddit needs more scaling than any Rollup protocol can provide.
While Rollups greatly reduce costs, they don't break the linear barrier. That is, all transactions have an on-chain footprint (because all calldata must be posted on-chain), albeit a far smaller one than on native Ethereum, and the L1 limitations end up being the bottleneck for capacity and cost. Since Ethereum has limited capacity, this linear use of on-chain resources means that costs will eventually increase superlinearly with traffic.
The good news is that we at Offchain Labs have a solution in our roadmap that can satisfy this extreme-scaling setting as well: Arbitrum AnyTrust Sidechains. Arbitrum Sidechains are similar to Arbitrum Rollup, but deviate in that they name a permissioned set of validators. When a chain’s validators agree off-chain, they can greatly reduce the on-chain footprint of the protocol and require almost no data to be put on-chain. When validators can't reach unanimous agreement off-chain, the protocol reverts to Arbitrum Rollup. Technically, Arbitrum Sidechains can be viewed as a hybrid between state channels and Rollup, switching back and forth as necessary, and combining the performance and cost that state channels can achieve in the optimistic case, with the robustness of Rollup in other cases. The core technical challenge is how to switch seamlessly between modes and how to guarantee that security is maintained throughout.
Arbitrum Sidechains break through this linear barrier, while still maintaining a high level of security and decentralization. Arbitrum Sidechains provide the AnyTrust guarantee, which says that as long as any one validator is honest and available (even if you don't know which one will be), the L2 chain is guaranteed to execute correctly according to its code and guaranteed to make progress. Unlike in a state channel, offchain progress does not require unanimous consent, and liveness is preserved as long as there is a single honest validator.
Note that the trust model for Arbitrum Sidechains is much stronger than for typical BFT-style chains, which introduce a consensus "voting" protocol among a small permissioned group of validators. BFT-based protocols require a supermajority (more than 2/3) of validators to agree. In Arbitrum Sidechains, by contrast, all you need is a single honest validator to achieve guaranteed correctness and progress. Notice that in Arbitrum adding validators strictly increases security, since the AnyTrust guarantee provides correctness as long as any one validator is honest and available. By contrast, in BFT-style protocols, adding nodes can be dangerous, as a coalition of dishonest nodes can break the protocol.
Like Arbitrum Rollup, the developer and user experiences for Arbitrum Sidechains will be identical to those of Ethereum. Reddit would be able to choose a large and diverse set of validators, and all it would need to guarantee in order to break through the scaling barrier is that a single one of them remains honest.
We hope to have Arbitrum Sidechains in production in early 2021, and thus when Reddit reaches the scale that surpasses the capacity of Rollups, Arbitrum Sidechains will be waiting and ready to help.
While the idea to switch between channels and Rollup to get the best of both worlds is conceptually simple, getting the details right and making sure that the switch does not introduce any attack vectors is highly non-trivial and has been the subject of years of our research (indeed, we were working on this design for years before the term Rollup was even coined).
12. How Arbitrum compares
We include a comparison to several other categories, as well as specific projects when appropriate, and explain why we believe that Arbitrum is best suited for Reddit's purposes. We focus our attention on other Ethereum projects.
Payment only Rollups. Compared to Arbitrum Rollup, ZK-Rollups and other Rollups that only support token transfers have several disadvantages:
  • As outlined throughout the proposal, we believe that the entire draw of Ethereum is its rich smart contract support, which is simply not achievable with today's zero-knowledge proof technology. Indeed, scaling with a ZK-Rollup will add friction to the deployment of smart contracts that interact with Community Points, as users will have to withdraw their coins from the ZK-Rollup and transfer them to a smart contract system (like Arbitrum). The community will be best served if Reddit builds on a platform that has built-in, frictionless smart-contract support.
  • All other Rollup protocols of which we are aware employ a centralized operator. While it's true that users retain custody of their coins, the centralized operator can often profit from censoring, reordering, or delaying transactions. A common misconception is that because these protocols are non-custodial, a centralized sequencer poses no risk; this is incorrect, as the sequencer can wreak havoc or shake down users for side payments without directly stealing funds.
  • Sidechain-type protocols can eliminate some of these issues, but they are not trustless. Instead, they require trust in some quorum of a committee, often requiring two-thirds of the committee to be honest, compared to rollup protocols like Arbitrum that require only a single honest party. In addition, not all sidechain-type protocols have committees that are diverse, or even non-centralized, in practice.
  • Plasma-style protocols have a centralized operator and do not support general smart contracts.
13. Concluding Remarks
While it's ultimately up to the judges' palate, we believe that Arbitrum Rollup is the bake-off choice that Reddit kneads. We far surpass Reddit's specified workload requirement at present, have much room to optimize Arbitrum Rollup in the near term, and have a clear path to get Reddit to hundreds of millions of users. Furthermore, we are the only project that gives developers and users an interface identical to the Ethereum blockchain while being fully interoperable and tooling-compatible, and we do this all without any new trust assumptions or centralized components.
But no matter how the cookie crumbles, we're glad to have participated in this bake-off and we thank you for your consideration.
About Offchain Labs
Offchain Labs, Inc. is a venture-funded New York company that spun out of Princeton University research, and is building the Arbitrum platform to usher in the next generation of scalable, interoperable, and compatible smart contracts. Offchain Labs is backed by Pantera Capital, Compound VC, Coinbase Ventures, and others.
Leadership Team
Ed Felten
Ed Felten is Co-founder and Chief Scientist at Offchain Labs. He is on leave from Princeton University, where he is the Robert E. Kahn Professor of Computer Science and Public Affairs. From 2015 to 2017 he served at the White House as Deputy United States Chief Technology Officer and senior advisor to the President. He is an ACM Fellow and member of the National Academy of Engineering. Outside of work, he is an avid runner, cook, and L.A. Dodgers fan.
Steven Goldfeder
Steven Goldfeder is Co-founder and Chief Executive Officer at Offchain Labs. He holds a PhD from Princeton University, where he worked at the intersection of cryptography and cryptocurrencies including threshold cryptography, zero-knowledge proof systems, and post-quantum signatures. He is a co-author of Bitcoin and Cryptocurrency Technologies, the leading textbook on cryptocurrencies, and he has previously worked at Google and Microsoft Research, where he co-invented the Picnic signature algorithm. When not working, you can find Steven spending time with his family, taking a nature walk, or twisting balloons.
Harry Kalodner
Harry Kalodner is Co-founder and Chief Technology Officer at Offchain Labs, where he leads the engineering team. Before the company he attended Princeton as a Ph.D. candidate, where his research explored economics, anonymity, and incentive compatibility of cryptocurrencies; he has also worked at Apple. When not up at 3:00am writing code, Harry occasionally sleeps.
submitted by hkalodner to ethereum

RiB Newsletter #16 – Secure Enclaves à la Crab

For the last few months we’ve been following new zero-knowledge proof projects in Rust. This month, with Secret Network upgrading their mainnet with secret contracts, it seems like a good opportunity to explore Rust blockchains that are using a completely different privacy-preserving technology: secure enclaves.
Secure enclaves are processes whose environment is protected from inspection by other processes, even the kernel, by special hardware. This protection particularly involves the encryption of a process's memory. Software that wants to compute in secret can put those computations inside a secure enclave and, if everything works as expected, neither a local user nor the hosting provider can snoop on the computations being performed. The most notable implementation of secure enclaves is Intel's SGX (Software Guard Extensions).
Secure enclaves are an attractive way to perform private computation primarily because they don't impose any limitations on what can be computed — code that runs inside SGX is more-or-less just regular x86 code, just running inside a special environment. But depending on SGX for privacy does have some special risks: software that runs in an SGX enclave must be signed (at least transitively) by Intel's own cryptographic keys, which means that Intel must approve of any software running in SGX, that Intel can revoke permission to use SGX, and that there is a risk of the signing keys being compromised. And it's not obvious that secure enclaves are actually secure; there have already been a number of attacks against SGX. Regardless, as of now, hardware enclaves provide security features that aren't feasible any other way.
There are two prominent Rust blockchains relying on SGX:
Outside of the blockchain world there are some other Rust projects using SGX, the most notable being:
Whether it’s secure enclaves or zk-SNARKs, Rust blockchains are walking the bleeding edge of privacy tech.
In unrelated RiB news, we recently received two donations.
Thanks so much to our anonymous donors. We don’t often receive donations, so this was a nice surprise! We intend to put all monetary contributions to use funding events or new contributors, and we’ll let you know what we do with the funds when we spend them.

Project Spotlight

Each month we like to shine a light on a notable Rust blockchain project. This month that project is…
Aleo.
Aleo is a zero-knowledge blockchain, with its own zero-knowledge programming language, Leo.
We don’t have a lot to say about it, but we think it looks cool. We hope they blog more.

Interesting Things

News

Blog Posts

Papers

Projects


Read more: https://rustinblockchain.org/newsletters/2020-09-30-secure-enclaves-a-la-crab/
submitted by Aimeedeer to rust

u/nullc wanted me to remake the law of Substitute Goods chart to use Blockchain.info that includes Segwit (removed signatures) data. No data is manipulated. BTC adoption was manipulated.

submitted by masterD3v to btc

The Future Of The Accounting Industry Using Blockchain

Link to our website: https://block.co/the-future-of-the-accounting-industry-using-blockchain/
Blockchain, as an immutable and incorruptible ledger of different types of data, finds its natural expression in accounting. Or we could reverse the concept: accounting is the most natural application for blockchain technology. Modern accounting is based on the double-entry system which, since the Renaissance, has allowed managers to know whether they could trust their own books.
In the double-entry system, transactions are recorded in terms of debits and credits. Since a debit in one account neutralizes a credit in another, the sum of all debits must equal the sum of all credits. Double-entry bookkeeping allows firms to maintain records that show what the firm owns and owes, and also what the firm has earned and spent over any given period of time. Triple entry accounting is a more recent enhancement of the traditional double-entry system in which all accounting entries, including purchases of inventory and supplies, sales, taxes, utility expenses, and so forth, are cryptographically sealed by a third entry.
With blockchain, these entries occur in the same distributed, public ledger rather than in separate books, creating an interlocking system of immutable accounting records. Because the entries are distributed and cryptographically sealed, manipulating or destroying them is practically impossible. Companies using blockchain triple-entry bookkeeping acquire major benefits from adoption. First, it helps auditors quickly and easily access and verify financial data, thus considerably reducing the cost and time necessary to conduct an audit. Second, it helps preserve the integrity of a company's financial statements, since an encrypted signature of the counterparty is required for an entry to be accepted as valid, drastically reducing the risk of counterfeits.
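A minimal sketch of the "third entry" idea in Python (a hash-chained seal over each debit/credit pair; real systems would use counterparty signatures rather than a bare hash, and this record format is hypothetical):

    import hashlib, json

    def seal(entry, prev_seal):
        # the "third entry": a cryptographic seal over the debit/credit pair,
        # chained to the previous seal
        payload = json.dumps(entry, sort_keys=True).encode() + prev_seal
        return hashlib.sha256(payload).digest()

    ledger, prev = [], b""
    for entry in [{"debit": "inventory", "credit": "cash", "amount": 500},
                  {"debit": "cash", "credit": "sales", "amount": 700}]:
        prev = seal(entry, prev)
        ledger.append((entry, prev))

    # altering any earlier entry changes every later seal, which is what makes
    # tampering with a distributed copy of the ledger detectable
    print(ledger[-1][1].hex()[:16])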
While most operations are still performed manually, or remain labor-intensive and far from automated even when delivered via software, blockchain appears to be the technology needed to simplify and improve regulatory compliance, enhance the prevalent double-entry bookkeeping, and reduce fraud, auditing effort, and errors. In essence, blockchain technology enables complete verification without the need for a trusted party.
On 10th August, Block.co CEO Alexis Nicolaou appeared on The Future of Accounting Industry using Blockchain webcast to discuss upcoming changes in accounting along with Dr. Maria Papadaki, Managing Director at BUiD Dubai Center for Risk and Innovation and Raymond Abrea, President & CEO of Philippine Abrea Consulting Group, hosted by Legal Solutions expert and Senior Executive of CACI, Edward Logan.
Maria Papadaki has more than 10 years of experience in Risk Management from both academia and industry, with numerous years in the implementation, development, improvement, and management of risk frameworks, tools, and techniques. She believes that “Blockchain is going to introduce a new component in accounting that’s going to make the work easier. It will provide a link between the two double-entry books in an open way and will put things in order. Auditing will also become easier, while trust, fraud, and compliance issues will all become obsolete with the open and distributed blockchain, because it’s hard to cheat when everybody is watching. It will involve less human interface, with immutable, accurate, and easy-to-verify transactions. Accounting needs to be innovated, with international links between institutions, organizations, and businesses”.
Raymond Abrea, based in the Philippines, is also Co-Chair of the Ease of Doing Business (EODB) Task Force on Paying Taxes and the creator of the TaxWhizPH mobile app. He was recognized as one of the 2017 Outstanding Young Persons of the World. “As a tax consultant, I choose integrity over profit with our game-changing strategy to do what is right and help the client pay the right taxes while pushing for genuine tax reform as an important contribution to the nation-building of the Philippines”. “I am not a blockchain expert — continues Raymond — but someone who will benefit from it, and as a company we collaborate with various institutions to help implement blockchain to fight corruption, fraud, errors, and so forth. Tax compliance review, tax audit, and assessment will all gain efficiency with the blockchain thanks to smoother processes, while saving time and money. It normally takes about one month to go through all the procedures of tax registration and payment before it’s all approved by the government, and we believe blockchain will cut that time considerably.”
So, do accountants need to fear for their jobs?
Whenever new technologies appear, there is widespread worry among different sectors that jobs might be impacted and specific professions abolished. As webcast guests repeatedly affirmed during the event, blockchain will surely disrupt accounting, in that both professionals and clients will be offered safer, tamper-resistant records while processes become easier and faster, but the responsibilities of accountants will largely remain intact. Auditing will also be disrupted by the disuse of paper-trail documents and the adoption of encrypted key data verification supporting financial statements, thus reducing costs and time for the audit payer. Also, regulatory compliance can be verified more efficiently.

Alexis Nicolaou has over 25 years’ experience in C-suite positions in Accountancy, Finance, Electronic Banking, Electronic Banking Software, Media, and he currently also serves on the board of Directors of Grant Thornton Cyprus’ Distributed Ledger Technologies business unit. “I am an accountant myself and one thing accountants should not worry about is that they would lose their jobs with blockchain. On the contrary, their jobs will evolve and will be enhanced. As all entries in blockchain are distributed and cryptographically sealed, it is virtually impossible to destroy or manipulate that information. This so-called triple entry bookkeeping model will be accessible to all relevant parties. The auditor, the regulator, the client will all have an identical copy of the ledger at all times; it will be distributed across a peer to peer network of nodes and shared in multiple sites”.
Despite having all the information agreed by all parties and timestamped in the blockchain, businesses will still need to hire good accountants to interpret and categorize that information, and they will have to implement and maintain the system. So, no, the accountant’s job is not at risk. As the info is secured, encrypted, and transmitted to a network of members, accountants will be able to provide more real-time advice and guidance to the client and their role will actually be more consistent. “The future accountant will need to be more skilled in IT and not just with numbers” carries on Alexis.
Block.co has introduced electronic correspondence and filing for a while now, and that translates into a major reduction of costs, paper waste, and time. It all started in 2014 when the University of Nicosia began to issue certificates on the blockchain. They anchored the fingerprint of that document (certificate) on the blockchain. They then shared it with their students, and all a student had to do when they went for a job interview was to present that PDF file, while all the employer had to do is drag and drop that PDF file.
“Many businesses — continues Alexis — shy away from electronic archiving systems but using a blockchain makes it possible to prove the integrity of a file and the way we do it at Block.co is by generating the hash, a fingerprint of that file, and timestamping it on the open and public Bitcoin blockchain. We developed a platform where all we have to do is drag and drop a PDF file and this can be done with any type of document, an invoice, a legal document, a medical certificate. I remember back in the ’90s, when I started the profession, everything was much longer and burdensome because it was done on paper. This is what I mean when I say that the accounting profession needs to evolve and embrace more IT skills in order to stand up to the competition with other more innovative companies.”
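The fingerprinting step Alexis describes can be sketched in a few lines of Python (the hashing only; Block.co's actual platform and the Bitcoin timestamping transaction are not shown, and the file name is hypothetical):

    import hashlib

    def fingerprint(path):
        # stream the file so arbitrarily large PDFs hash in constant memory
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    # the resulting digest, not the document itself, is anchored on-chain
    # print(fingerprint("certificate.pdf"))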
During the webcast, Raymond asked a compelling question about how blockchain will be implemented. Will it be powered by governments or by private initiative? Although the Philippines has started adopting digital systems, it still has issues with government compliance, so any technological innovation takes a long time to be implemented. “In Dubai for example — advises Dr. Papadaki — blockchain adoption is mandated by the government, which makes everything easier. They are driving all initiatives to report improvement and adapt it in an excellent way. I think Cyprus is going that way too; therefore, it ultimately depends on the individual country”.
Alexis also believes that it’s extremely important for the government to be on board. Malta, for instance, was the first country to introduce legislation concerning blockchain, and Cyprus is following suit; these initiatives have been particularly successful because local governments actively participated in and promoted the adoption of the technology. “Private institutions are running it, but to achieve global acceptance governments will need to embrace it too. Governments have come to understand, especially during the pandemic, that we need to push innovation in technology, so hopefully this will put pressure on them to adopt blockchain more quickly”.
For more info, contact Block.co directly or email at [email protected].
Tel +357 70007828
submitted by BlockDotCo to u/BlockDotCo

DFINITY Research Report

Author: Gamals Ahmed, CoinEx Business Ambassador
ABSTRACT
The DFINITY blockchain computer provides a secure, performant and flexible consensus mechanism. At its core, DFINITY contains a decentralized randomness beacon, which acts as a verifiable random function (VRF) that produces a stream of outputs over time. The novel technique behind the beacon relies on the existence of a unique-deterministic, non-interactive, DKG-friendly threshold signature scheme. The only known examples of such a scheme are pairing-based and derived from BLS.
The DFINITY blockchain is layered on top of the DFINITY beacon and uses the beacon as its source of randomness for leader selection and leader ranking. A “weight” is attributed to a chain based on the ranks of the leaders who propose the blocks in the chain, and that weight is used to select between competing chains. The blockchain is further hardened by a notarization process which dramatically improves the time to finality and eliminates the nothing-at-stake and selfish mining attacks.
The DFINITY consensus algorithm is designed to scale through continuous quorum selections driven by the random beacon. In practice, DFINITY achieves block times of a few seconds and transaction finality after only two confirmations. The system gracefully handles temporary losses of network synchrony, including network splits, and is provably secure under synchrony.

1.INTRODUCTION

DFINITY is building a new kind of public decentralized cloud computing resource. The platform uses blockchain technology to provide effectively unlimited capacity, high performance, and algorithmic governance shared by the world, with the capability to power autonomous, self-updating software systems. This enables organizations to design and deploy custom-tailored cloud computing projects, thereby reducing enterprise IT system costs by 90%.
DFINITY aims to explore new territory and prove that the blockchain opportunity is far broader and deeper than anyone has hitherto realized, unlocking the opportunity with powerful new crypto.
Although a standalone project, DFINITY is not maximalist minded and is a great supporter of Ethereum.
DFINITY’s consensus mechanism has four layers: notary (provides fast finality guarantees to clients and external observers), blockchain (builds a blockchain from validated transactions via the Probabilistic Slot Protocol driven by the random beacon), random beacon (provides the source of randomness for all higher layers like smart contract applications), and identity (provides a registry of all clients).
Figure 1: DFINITY’s consensus mechanism layers
1. Identity layer:
Active participants in the DFINITY Network are called clients. Clients are registered with permanent identities under a pseudonym. Moreover, DFINITY supports open membership by providing a protocol for registering new clients by depositing a stake with an insurance period. This is the responsibility of the first layer.
2. Random Beacon layer:
Provides the source of randomness (VRF) for all higher layers including applications (smart contracts). The random beacon in the second layer is an unbiasable, verifiable random function (VRF) that is produced jointly by registered clients. Each random output of the VRF is unpredictable by anyone until just before it becomes available to everyone. This is a key technology of the DFINITY system, which relies on a threshold signature scheme with the properties of uniqueness and non-interactivity.

3. Blockchain layer:
The third layer deploys the “probabilistic slot protocol” (PSP). This protocol ranks the clients for each height of the chain, in an order that is derived deterministically from the unbiased output of the random beacon for that height. A weight is then assigned to block proposals based on the proposer’s rank, such that blocks from clients at the top of the list receive a higher weight. Forks are resolved by giving favor to the “heaviest” chain in terms of accumulated block weight — quite similar to how traditional proof-of-work consensus is based on the highest accumulated amount of work.
The first advantage of the PSP protocol is that the ranking is available instantaneously, which allows for a predictable, constant block time. The second advantage is that there is always a single highest-ranked client, which allows for homogeneous network bandwidth utilization. By contrast, a race between clients would favor bursty usage.
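A toy sketch of the ranking-and-weighting step follows (hash-based ordering stands in for the actual derivation from the random beacon, and the weight function is illustrative, not DFINITY's):

    import hashlib

    def rank_clients(beacon_output, clients):
        # deterministic ordering derived from the beacon output for this height
        key = lambda c: hashlib.sha256(beacon_output + c.encode()).digest()
        return sorted(clients, key=key)

    def block_weight(rank, n):
        return n - rank          # the top-ranked proposer's block weighs the most

    clients = ["alice", "bob", "carol", "dave"]
    ranking = rank_clients(b"beacon-height-42", clients)
    # fork choice: sum the weights of the proposers along each candidate chain
    chain_weight = sum(block_weight(ranking.index(p), len(clients))
                       for p in ranking[:2])
    print(ranking, chain_weight)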
4. Notarization layer:
Provides fast finality guarantees to clients and external observers. DFINITY deploys the novel technique of block notarization in its fourth layer to speed up finality. A notarization is a threshold signature under a block created jointly by registered clients. Only notarized blocks can be included in a chain. RSA-based alternatives exist but suffer from the impracticality of setting up the threshold keys without a trusted dealer.
DFINITY achieves its high speed and short block times exactly because notarization is not full consensus.
DFINITY does not suffer from selfish mining attacks or the nothing-at-stake problem, because notarization makes it impossible for an adversary to build and maintain a chain of linked, trusted blocks in secret.
DFINITY’s consensus is designed to operate on a network of millions of clients. To enable scalability to this extent, the random beacon and notarization protocols are designed such that they can be safely and efficiently delegated to a committee.

1.1 OVERVIEW ABOUT DFINITY

DFINITY is a blockchain-based cloud-computing project that aims to develop an open, public network, referred to as the “internet computer,” to host the next generation of software and data: a decentralized, non-proprietary network to run the next generation of mega-applications. DFINITY has dubbed this public network “Cloud 3.0”.
DFINITY is a third-generation virtual blockchain network that sets out to function as an “intelligent decentralised cloud,” strongly focused on delivering a viable corporate cloud solution. The DFINITY project is overseen, supported and promoted by DFINITY Stiftung, a not-for-profit foundation based in Zug, Switzerland.
DFINITY is a decentralized network design whose protocols generate a reliable “virtual blockchain computer” running on top of a peer-to-peer network upon which software can be installed and can operate in the tamperproof mode of smart contracts.
DFINITY introduces algorithmic governance in the form of a “Blockchain Nervous System” that can protect users from attacks and help restart broken systems, dynamically optimize network security and efficiency, upgrade the protocol and mitigate misuse of the platform, for example by those wishing to run illegal or immoral systems.
DFINITY is an Ethereum-compatible smart contract platform that is implementing some revolutionary ideas to address blockchain performance, scaling, and governance. Although DFINITY could pose a credible existential threat to Ethereum, the project is pursuing a coevolutionary strategy by contributing funding and effort to Ethereum projects and freely offering its technology to Ethereum for adoption. DFINITY has labeled itself Ethereum’s “crazy sister” to express its close genetic resemblance to Ethereum, differentiated by its obsession with performance and neuron-inspired governance model.
Dfinity raised $61 million from Andreessen Horowitz and Polychain Capital in a February 2018 funding round. At the time, Dfinity said it wanted to create an “internet computer” to cut the costs of running cloud-based business applications. A further $102 million funding round in August 2018 brought the project’s total funding to $195 million.
In May 2018, Dfinity announced plans to distribute around $35 million worth of Dfinity tokens in an airdrop. It was part of the company’s plan to create a “Cloud 3.0.” Because of regulatory concerns, none of the tokens went to US residents.
DFINITY would broaden and strengthen the EVM ecosystem by giving applications a choice of platforms with different characteristics. However, if DFINITY succeeds in delivering a fully EVM-compatible smart contract platform with higher transaction throughput, faster confirmation times, and governance mechanisms that can resolve public disputes without causing community splits, then it will represent a clearly superior choice for deploying new applications and, as its network effects grow, an attractive place to bring existing ones. Of course, the challenge for DFINITY will be to deliver on these promises while meeting the security demands of a public chain with significant value at risk.

1.1.1 DFINITY FUTURE

  • DFINITY aims to explore new blockchain territory related to the original goals of the Ethereum project and is sometimes considered “Ethereum’s crazy sister.”
  • DFINITY is developing blockchain-based infrastructure to support a new style of the internet (akin to Ethereum’s “World Computer”), one in which the internet itself will support software applications and data rather than various cloud hosting providers.
  • The project suggests this reinvented software platform can simplify the development of new software systems, reduce the human capital needed to maintain and secure data, and preserve user data privacy.
  • Dfinity aims to reduce the costs of cloud services by creating a decentralized “internet computer” which may launch in 2020
  • Dfinity claims transactions on its network are finalized in 3–5 seconds, compared to 1 hour for Bitcoin and 10 minutes for Ethereum.

1.1.2 DFINITY’S VISION

DFINITY’s vision is its new internet infrastructure can support a wide variety of end-user and enterprise applications. Social media, messaging, search, storage, and peer-to-peer Internet interactions are all examples of functionalities that DFINITY plans to host atop its public Web 3.0 cloud-like computing resource. In order to provide the transaction and data capacity necessary to support this ambitious vision, DFINITY features a unique consensus model (dubbed Threshold Relay) and algorithmic governance via its Blockchain Nervous System (BNS) — sometimes also referred to as the Network Nervous System or NNS.

1.2 DFINITY COMMUNITY

The DFINITY community brings people and organizations together to learn and collaborate on products that help steward the next generation of internet software and services. The Internet Computer allows developers to take on the monopolization of the internet and return the internet to its free and open roots. We’re committed to connecting those who believe the same through our events, content, and discussions.


1.3 DFINITY ROADMAP (TIMELINE)

February 15, 2017
Ethereum based community seed round raises 4M Swiss francs (CHF)
The DFINITY Stiftung, a not-for-profit foundation entity based in Zug, Switzerland, raised the round. The foundation held $10M of assets as of April 2017.
February 8, 2018
Dfinity announces a $61M fundraising round led by Polychain Capital and Andreessen Horowitz
The $61M round, led by Polychain Capital and Andreessen Horowitz, along with a DFINITY Ecosystem Venture Fund (which will be used to support projects developing on the DFINITY platform) and an Ethereum-based raise in 2017, brings the total funding for the project to over $100 million. This is the first cryptocurrency token that Andreessen Horowitz has invested in, in a deal led by Chris Dixon.
August 2018
Dfinity raises a $102,000,000 venture round from Multicoin Capital, Village Global, Aspect Ventures, Andreessen Horowitz, Polychain Capital, Scalar Capital, Amino Capital and SV Angel.
January 23, 2020
Dfinity launches an open source platform aimed at the social networking giants

2.DFINITY TECHNOLOGY

Dfinity is building what it calls the internet computer, a decentralized technology spread across a network of independent data centers that allows software to run anywhere on the internet rather than in server farms that are increasingly controlled by large firms, such as Amazon Web Services or Google Cloud. This week Dfinity is releasing its software to third-party developers, who it hopes will start making the internet computer’s killer apps. It is planning a public release later this year.
At its core, the DFINITY consensus mechanism is a variation of the Proof of Stake (PoS) model, but offers an alternative to traditional Proof of Work (PoW) and delegated PoS (dPoS) networks. Threshold Relay intends to strike a balance between inefficiencies of decentralized PoW blockchains (generally characterized by slow block times) and the less robust game theory involved in vote delegation (as seen in dPoS blockchains). In DFINITY, a committee of “miners” is randomly selected to add a new block to the chain. An individual miner’s probability of being elected to the committee proposing and computing the next block (or blocks) is proportional to the number of dfinities the miner has staked on the network. Further, a “weight” is attributed to a DFINITY chain based on the ranks of the miners who propose blocks in the chain, and that weight is used to choose between competing chains (i.e. resolve chain forks).
A decentralized random beacon manages the random selection process of temporary block producers. This beacon is a verifiable random function (VRF), which is a pseudo-random function that provides publicly verifiable proofs of its outputs’ correctness. A core component of the random beacon is the use of Boneh-Lynn-Shacham (BLS) signatures. By leveraging the BLS signature scheme, the DFINITY protocol ensures no actor in the network can determine the outcome of the next random assignment.
Dfinity is introducing a new standard, which it calls the internet computer protocol (ICP). These new rules let developers move software around the internet as well as data. All software needs computers to run on, but with ICP the computers could be anywhere. Instead of running on a dedicated server in Google Cloud, for example, the software would have no fixed physical address, moving between servers owned by independent data centers around the world. “Conceptually, it’s kind of running everywhere,” says Dfinity engineering manager Stanley Jones.
DFINITY also features a native programming language, called ActorScript (name may be subject to change), and a virtual machine for smart contract creation and execution. The new smart contract language is intended to simplify the management of application state for programmers via an orthogonal persistence environment (which means active programs are not required to retrieve or save their state). All ActorScript contracts are eventually compiled down to WebAssembly instructions so the DFINITY virtual machine layer can execute the logic of applications running on the network. The advantage of using the WebAssembly standard is that all major browsers support it and a variety of programming languages can compile down to Wasm (not just ActorScript).
Dfinity is moving fast. Recently, Dfinity showed off a TikTok clone called CanCan. In January it demoed a LinkedIn-alike called LinkedUp. Neither app is being made public, but they make a convincing case that apps made for the internet computer can rival the real things.

2.1 DFINITY CORE APPLICATIONS

The DFINITY cloud has two core applications:
  1. Enabling the re-engineering of business: DFINITY ambitiously aims to facilitate the re-engineering of mass-market services (such as Web Search, Ridesharing Services, Messaging Services, Social Media, Supply Chain, etc) into open source businesses that leverage autonomous software and decentralised governance systems to operate and update themselves more efficiently.
  2. Enable the re-engineering of enterprise IT systems to reduce costs: DFINITY seeks to re-engineer enterprise IT systems to take advantage of the unique properties that blockchain computer networks provide.
At present, computation on blockchain-based computer networks is far more expensive than traditional, centralised solutions (Amazon Web Services, Microsoft Azure, Google Cloud Platform, etc). Despite increasing computational cost, DFINITY intends to lower net costs “by 90% or more” through reducing the human capital cost associated with sustaining and supporting these services.
Whilst conceptually similar to Ethereum, DFINITY employs original and new cryptography methods and protocols (crypto:3) at the network level, in concert with AI and network-fuelled systemic governance (Blockchain Nervous System — BNS) to facilitate Corporate adoption.
DFINITY recognises that different users value different properties and sees itself as more of a fully compatible extension of the Ethereum ecosystem rather than a competitor of the Ethereum network.
In the future, DFINITY hopes that much of their “new crypto might be used within the Ethereum network and are also working hard on shared technology components.”
As the DFINITY project develops over time, the DFINITY Stiftung foundation intends to steadily increase the BNS’ decision-making responsibilities over time, eventually resulting in the dissolution of its own involvement entirely, once the BNS is sufficiently sophisticated.
The DFINITY consensus mechanism is a heavily optimized proof of stake (PoS) model. It places a strong emphasis on transaction finality through implementing a Threshold Relay technique in conjunction with the BLS signature scheme and a notarization method to address many of the problems associated with PoS consensus.

2.2 THRESHOLD RELAY

As a public cloud computing resource, DFINITY targets business applications by substantially reducing cloud computing costs for IT systems. They aim to achieve this with a highly scalable and powerful network with potentially unlimited capacity. The DFINITY platform is chock-full of innovative designs and features, like its Blockchain Nervous System (BNS) for algorithmic governance.
One of the primary components of the platform is its novel Threshold Relay Consensus model from which randomness is produced, driving the other systems that the network depends on to operate effectively. The consensus system was first designed for a permissioned participation model but can be paired with any method of Sybil resistance for an open participation model.
“The Threshold Relay is the mechanism by which Dfinity randomly samples replicas into groups, sets the groups (committees) up for threshold operation, chooses the current committee, and relays from one committee to the next.”
Threshold Relay consists of four layers (as mentioned previously):
  1. Notary layer, which provides fast finality guarantees to clients and external observers and eliminates nothing-at-stake and selfish mining attacks, providing Sybil attack resistance.
  2. Blockchain layer that builds a blockchain from validated transactions via the Probabilistic Slot Protocol driven by the random beacon.
  3. Random beacon, which as previously covered, provides the source of randomness for all higher layers like the blockchain layer smart contract applications.
  4. Identity layer that provides a registry of all clients.

2.2.1 HOW DOES THRESHOLD RELAY WORK?

Threshold Relay produces an endogenous random beacon, and each new value defines random group(s) of clients that may independently try to form into a “threshold group”. The composition of each group is entirely random, such that groups can intersect and clients can be present in multiple groups. In DFINITY, each group is comprised of 400 members. When a group is defined, the members attempt to set up a BLS threshold signature system using a distributed key generation protocol. If they are successful within some fixed number of blocks, they then register the public key (“identity”) created for their group on the global blockchain using a special transaction, such that it will become part of the set of active groups in a following “epoch”. The network begins at “genesis” with some number of predefined groups, one of which is nominated to create a signature on some default value. Such signatures are random values — if they were not, then the group’s signatures on messages would be predictable and the threshold signature system insecure — and each random value produced thus is used to select a random successor group. This next group then signs the previous random value to produce a new random value and select another group, relaying between groups ad infinitum and producing a sequence of random values.
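A toy relay in Python makes the sequencing concrete before we turn to how the group signatures themselves are formed. A hash stands in for the group's BLS threshold signature, preserving the two properties the relay depends on, determinism and unpredictability without the group's cooperation (the group count and round count are arbitrary):

    import hashlib

    groups = [f"group-{i}" for i in range(12)]     # registered threshold groups

    def group_sign(group, message):
        # stand-in for a 201-of-400 BLS threshold signature on `message`
        return hashlib.sha256(group.encode() + message).digest()

    value = b"genesis"
    for _ in range(5):
        # the previous random value selects the next group...
        current = groups[int.from_bytes(value, "big") % len(groups)]
        # ...which signs it to produce the next random value
        value = group_sign(current, value)
        print(current, value.hex()[:12])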
In a cryptographic threshold signature system a group can produce a signature on a message upon the cooperation of some minimum threshold of its members, which is set to 51% in the DFINITY network. To produce the threshold signature, group members sign the message individually (here the preceding group’s threshold signature), creating individual “signature shares” that are then broadcast to other group members. The group threshold signature can be constructed upon combination of a sufficient threshold of signature shares. So, for example, if the group size is 400 and the threshold is set at 201, any client that collects that many shares will be able to construct the group’s signature on the message. Other group members can validate each signature share, and any client using the group’s public key can validate the single group threshold signature produced by combining them. The magic of the BLS scheme is that it is “unique and deterministic,” meaning that from whatever subset of group members the required number of signature shares are collected, the single threshold signature created is always the same and only a single correct value is possible.
Consequently, the sequence of random values produced is entirely deterministic and unmanipulable, and signatures generated by relaying between groups produces a Verifiable Random Function, or VRF. Although the sequence of random values is pre-determined given some set of participating groups, each new random value can only be produced upon the minimal agreement of a threshold of the current group. Conversely, in order for relaying to stall because a random number was not produced, the number of correct processes must be below the threshold. Thresholds are configured so that this is extremely unlikely. For example, if the group size is set to 400, and the threshold is 201, 200 or more of the processes must become faulty to prevent production. If there are 10,000 processes in the network, of which 3,000 are faulty, the probability this will occur is less than 10e-17.
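That failure probability can be checked directly with a hypergeometric tail, since the committee is a uniform random 400-subset of the clients (a sketch using scipy):

    from scipy.stats import hypergeom

    pop, faulty, group = 10_000, 3_000, 400
    # the relay stalls only if 200 or more of the 400 sampled members are faulty
    p_stall = hypergeom(pop, faulty, group).sf(199)   # P[faulty members >= 200]
    print(p_stall)   # consistent with the < 10e-17 bound quoted above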

2.3 DFINITY TOKEN

The DFINITY blockchain also supports a native token, called dfinities (DFN), which perform multiple roles within the network, including:
  1. Fuel for deploying and running smart contracts.
  2. Security deposits (i.e. staking) that enable participation in the BNS governance system.
  3. Security deposits that allow client software or private DFINITY cloud networks to connect to the public network.
Although dfinities will end up being assigned a value by the market, the DFINITY team does not intend for DFN to act as a currency. Instead, the project has envisioned PHI, a “next-generation” crypto-fiat scheme, to act as a stable medium of exchange within the DFINITY ecosystem.
Neuron operators can earn Dfinities by participating in network-wide votes, which could be concerning protocol upgrades, a new economic policy, etc. DFN rewards for participating in the governance system are proportional to the number of tokens staked inside a neuron.

2.4 SCALABILITY

DFINITY is constantly developing, with a structure that separates consensus, validation, and storage into separate layers. The storage layer is divided into multiple shards, each of which is responsible for processing transactions that occur in that shard’s state. The validation layer is responsible for combining the hashes of all shards in a Merkle-like structure whose result, a digest of the global state, is stored in blocks in the top-level chain.

2.5 DFINITY CONSENSUS ALGORITHM

The single most important aspect of the user experience is certainly the time required before a transaction becomes final. This is not solved by a short block time alone — Dfinity’s team also had to reduce the number of confirmations required to a small constant. DFINITY moreover had to provide a provably secure proof-of-stake algorithm that scales to millions of active participants without compromising any bit on decentralization.
Dfinity soon realized that the key to scalability lay in having an unmanipulable source of randomness available. Hence they built a scalable decentralized random beacon, based on what they call the Threshold Relay technique, right into the foundation of the protocol. This strong foundation drives a scalable and fast consensus layer: on top of the beacon runs a blockchain which utilizes notarization by threshold groups to achieve near-instant finality. Details can be found in the team’s overview paper.
The roots of the DFINITY consensus mechanism date back to 2014, when their Chief Scientist, Dominic Williams, started looking for more efficient ways to drive large consensus networks. Since then, much research has gone into the protocol, and it took several iterations to reach its current design.
For any practical consensus system, the difficulty lies in navigating the narrow terrain between the boundaries imposed by theoretical impossibility results and practical performance limitations.
The first key milestone was the novel Threshold Relay technique for decentralized, deterministic randomness, which is made possible by certain unique characteristics of the BLS signature system. The next breakthrough was the notarization technique, which allows DFINITY consensus to solve the traditional problems that come with proof-of-stake systems. Getting the security proofs sound was the final step before publication.
DFINITY consensus makes deliberate trade-offs between the practical side (realistic threat models and security assumptions) and the theoretical side (provable security). The result is a flexible, tunable algorithm that the team expects will establish itself as the best-performing proof-of-stake algorithm. In particular, the built-in random beacon should prove indispensable when building out sharding and scalable validation techniques.

2.6 LINKEDUP

The startup has rather cheekily called this “an open version of LinkedIn,” the Microsoft-owned social network for professionals. Unlike LinkedIn, LinkedUp, which runs on any browser, is not owned or controlled by a corporate entity.
LinkedUp is built on Dfinity’s so-called Internet Computer, its name for the platform it is building to distribute the next generation of software and open internet services.
The software is hosted on an independent Switzerland-based data center, but under the Internet Computer concept it could just as well be hosted at your house or mine. The compute power to run the application (LinkedUp, in this case) comes not from Amazon AWS, Google Cloud, or Microsoft Azure, but from the distributed architecture that Dfinity is building.
Specifically, Dfinity notes that when enterprises and developers run their web apps and enterprise systems on the Internet Computer, the content is decentralized across a minimum of four nodes, with no upper limit, in Dfinity’s global network of independent data centers.
Dfinity has open-sourced LinkedUp so that developers can create other types of open internet services on the architecture it has built.
“Open Social Network for Professional Profiles” suggests that, on Dfinity’s model, one could create an “Open WhatsApp”, “Open eBay”, “Open Salesforce” or “Open Facebook”.
The tools include a Canister Software Developer Kit and a simple programming language called Motoko that is optimized for Dfinity’s Internet Computer.
“The Internet Computer is conceived as an alternative to the $3.8 trillion legacy IT stack, and empowers the next generation of developers to build a new breed of tamper-proof enterprise software systems and open internet services. We are democratizing software development,” Williams said. “The Bronze release of the Internet Computer provides developers and enterprises a glimpse into the infinite possibilities of building on the Internet Computer — which also reflects the strength of the Dfinity team we have built so far.”
Dfinity says its “Internet Computer Protocol” allows for a new type of software called autonomous software, which can guarantee permanent APIs that cannot be revoked. When all these open internet services (e.g. open versions of WhatsApp, Facebook, eBay, Salesforce, etc.) are combined with other open software and services it creates “mutual network effects” where everyone benefits.
Since 1 November, DFINITY has released 13 new public versions of the SDK, leading to its second major milestone [at WEF Davos]: demoing a decentralized web app called LinkedUp on the Internet Computer. Subsequent milestones on the way to the public launch of the Internet Computer will involve:
  1. Onboarding a global network of independent data centers.
  2. A fully tested economic system.
  3. Fully tested Network Nervous Systems for configuration and upgrades.

2.7 WHAT IS MOTOKO?

Motoko is a new software language being developed by the DFINITY Foundation, with an accompanying SDK, that is designed to help the broadest possible audience of developers create reliable and maintainable websites, enterprise systems and internet services on the Internet Computer with ease. By developing the Motoko language, the DFINITY Foundation will ensure that a language that is highly optimized for the new environment is available. However, the Internet Computer can support any number of different software frameworks, and the DFINITY Foundation is also working on SDKs that support the Rust and C languages. Eventually, it is expected there will be many different SDKs that target the Internet Computer.

Related videos:
  - How to mine bitcoins (solo mining) with the core client
  - Bcoin with JJ
  - Bitcoin NodeJS Part 1 - Hello World
  - What is a Bitcoin Node? - Step by Step Explanation
  - Full Node vs. Light Client

A node looks at each transaction as it arrives and runs a series of checks to verify it. Each node builds its own transaction pool, and these pools are mostly the same across nodes. The conditions can change and evolve over time, and the current list can be inspected in the AcceptToMemoryPool, CheckTransaction, and CheckInputs functions of the bitcoin client.
Bitcoin Core. The base of a sovereign Bitcoin node is a fully validating Bitcoin client. Bitcoin Core is the reference implementation, though not the only option available. This application downloads the whole blockchain from other peers and validates every single transaction that ever happened.
Bitcoin Cash Node (BCHN) is a full node implementation for Bitcoin Cash (BCH); BCHN supports the Bitcoin Cash November 2020 upgrade.
Running a Bitcoin full node comes with certain costs and can expose you to certain risks, which are worth understanding before deciding whether you’re able to help the network. As special cases, miners, businesses, and privacy-conscious users rely on particular behavior from the full nodes they use, so they will often run their own full nodes and take special safety precautions.
Most full Bitcoin nodes also act as a client, which allows users to send transactions to the network (the node works as a personal interface for communication with the Bitcoin network). Running a node ensures that transactions are verified before being sent to the person you are conducting the transaction with. This makes it possible to send money around the world without censorship and contributes to the ...
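As a deliberately simplified, hypothetical sketch of what such acceptance checks might look like (not Bitcoin Core’s actual logic; the real checks in AcceptToMemoryPool, CheckTransaction, and CheckInputs are far more extensive, covering scripts, fees, standardness, and more; the transaction format and function name below are invented):

```python
# Toy mempool-acceptance checks; transaction format and helper names invented.
def accept_to_mempool(tx, mempool, utxo_set):
    if not tx["inputs"] or not tx["outputs"]:
        return False                            # must have inputs and outputs
    if tx["txid"] in mempool:
        return False                            # already known
    spent = [i["outpoint"] for i in tx["inputs"]]
    if len(spent) != len(set(spent)):
        return False                            # no duplicate inputs
    if any(op not in utxo_set for op in spent):
        return False                            # inputs must be unspent outputs
    in_value = sum(utxo_set[op] for op in spent)
    out_value = sum(o["value"] for o in tx["outputs"])
    if out_value > in_value:
        return False                            # cannot create coins from nothing
    mempool[tx["txid"]] = tx                    # implied fee: in_value - out_value
    return True

utxo_set = {("prev_txid", 0): 50_000}           # outpoint -> value in sats (toy data)
mempool = {}
tx = {"txid": "abc",
      "inputs": [{"outpoint": ("prev_txid", 0)}],
      "outputs": [{"value": 49_000}]}           # 1,000 sats left as a fee
assert accept_to_mempool(tx, mempool, utxo_set)
```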


Video descriptions:
  - A full node contains a full copy of the blockchain, while a light client references a trusted full node’s copy of the blockchain. This allows users to transact on the blockchain without ...
  - *****UPDATE***** Solo mining has been removed from the client. I’ll keep the video up to show how it used to work; it might still work for some alt coins (unsure) ...
  - In the second part we are going to actually build a blockchain and simulate the behaviour of the bitcoin network by creating different nodes and different clients sending their transactions to the ...
  - Bitcoin NodeJS Tutorial: intro to NodeJS including installation, creating a basic HTTP server, and fetching the current bitcoin price.
  - JJ, the CTO of Purse.io, spoke about "bcoin", a robust and powerful full node bitcoin implementation, written in JS for node.js and browsers.
