Making a full node : BitcoinBeginners

The Crypto Fee Wars of '20-'21 Have Begun...

submitted by BCHPleaseOrg to btc

Proposal: The Sia Foundation

Vision Statement

A common sentiment is brewing online; a shared desire for the internet that might have been. After decades of corporate encroachment, you don't need to be a power user to realize that something has gone very wrong.
In the early days of the internet, the future was bright. In that future, when you sent an instant message, it traveled directly to the recipient. When you needed to pay a friend, you announced a transfer of value to their public key. When an app was missing a feature you wanted, you opened up the source code and implemented it. When you took a picture on your phone, it was immediately encrypted and backed up to storage that you controlled. In that future, people would laugh at the idea of having to authenticate themselves to some corporation before doing these things.
What did we get instead? Rather than a network of human-sized communities, we have a handful of enormous commons, each controlled by a faceless corporate entity. Hey user, want to send a message? You can, but we'll store a copy of it indefinitely, unencrypted, for our preference-learning algorithms to pore over; how else could we slap targeted ads on every piece of content you see? Want to pay a friend? You can—in our Monopoly money. Want a new feature? Submit a request to our Support Center and we'll totally maybe think about it. Want to back up a photo? You can—inside our walled garden, which only we (and the NSA, of course) can access. Just be careful what you share, because merely locking you out of your account and deleting all your data is far from the worst thing we could do.
You rationalize this: "MEGACORP would never do such a thing; it would be bad for business." But we all know, at some level, that this state of affairs, this inversion of power, is not merely "unfortunate" or "suboptimal" – No. It is degrading. Even if MEGACORP were purely benevolent, it is degrading that we must ask its permission to talk to our friends; that we must rely on it to safeguard our treasured memories; that our digital lives are completely beholden to those who seek only to extract value from us.
At the root of this issue is the centralization of data. MEGACORP can surveil you—because your emails and video chats flow through their servers. And MEGACORP can control you—because they hold your data hostage. But centralization is a solution to a technical problem: How can we make the user's data accessible from anywhere in the world, on any device? For a long time, no alternative solution to this problem was forthcoming.
Today, thanks to a confluence of established techniques and recent innovations, we have solved the accessibility problem without resorting to centralization. Hashing, encryption, and erasure encoding got us most of the way, but one barrier remained: incentives. How do you incentivize an anonymous stranger to store your data? Earlier protocols like BitTorrent worked around this limitation by relying on altruism, tit-for-tat requirements, or "points" – in other words, nothing you could pay your electric bill with. Finally, in 2009, a solution appeared: Bitcoin. Not long after, Sia was born.
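To make the "erasure encoding" step concrete, here is a toy sketch of the idea. Sia actually uses Reed-Solomon codes; this XOR-parity scheme is a simplified stand-in showing how data survives the loss of a host:

    def make_shares(a: bytes, b: bytes) -> tuple[bytes, bytes, bytes]:
        # Split data into two shares plus an XOR parity share:
        # any 2 of the 3 shares reconstruct the original.
        assert len(a) == len(b)
        parity = bytes(x ^ y for x, y in zip(a, b))
        return a, b, parity

    def recover_a(b: bytes, parity: bytes) -> bytes:
        # Lost share 'a'? XOR the two surviving shares to rebuild it.
        return bytes(x ^ y for x, y in zip(b, parity))

    a, b, parity = make_shares(b"hello wo", b"rld !!!!")
    assert recover_a(b, parity) == a   # the host holding 'a' went offline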
Cryptography has unleashed the latent power of the internet by enabling interactions between mutually-distrustful parties. Sia harnesses this power to turn the cloud storage market into a proper marketplace, where buyers and sellers can transact directly, with no intermediaries, anywhere in the world. No more silos or walled gardens: your data is encrypted, so it can't be spied on, and it's stored on many servers, so no single entity can hold it hostage. Thanks to projects like Sia, the internet is being re-decentralized.
Sia began its life as a startup, which means it has always been subjected to two competing forces: the ideals of its founders, and the profit motive inherent to all businesses. Its founders have taken great pains to never compromise on the former, but this often threatened the company's financial viability. With the establishment of the Sia Foundation, this tension is resolved. The Foundation, freed of the obligation to generate profit, is a pure embodiment of the ideals from which Sia originally sprung.
The goals and responsibilities of the Foundation are numerous: to maintain core Sia protocols and consensus code; to support developers building on top of Sia and its protocols; to promote Sia and facilitate partnerships in other spheres and communities; to ensure that users can easily acquire and safely store siacoins; to develop network scalability solutions; to implement hardforks and lead the community through them; and much more. In a broader sense, its mission is to commoditize data storage, making it cheap, ubiquitous, and accessible to all, without compromising privacy or performance.
Sia is a perfect example of how we can achieve better living through cryptography. We now begin a new chapter in Sia's history. May our stewardship lead it into a bright future.
 

Overview

Today, we are proposing the creation of the Sia Foundation: a new non-profit entity that builds and supports distributed cloud storage infrastructure, with a specific focus on the Sia storage platform. What follows is an informal overview of the Sia Foundation, covering two major topics: how the Foundation will be funded, and what its funds will be used for.

Organizational Structure

The Sia Foundation will be structured as a non-profit entity incorporated in the United States, likely a 501(c)(3) organization or similar. The actions of the Foundation will be constrained by its charter, which formalizes the specific obligations and overall mission outlined in this document. The charter will be updated on an annual basis to reflect the current goals of the Sia community.
The organization will be operated by a board of directors, initially comprising Luke Champine as President and Eddie Wang as Chairman. Luke Champine will be leaving his position at Nebulous to work at the Foundation full-time, and will seek to divest his shares of Nebulous stock along with other potential conflicts of interest. Neither Luke nor Eddie personally own any siafunds or significant quantities of siacoin.

Funding

The primary source of funding for the Foundation will come from a new block subsidy. Following a hardfork, 30 KS per block will be allocated to the "Foundation Fund," continuing in perpetuity. The existing 30 KS per block miner reward is not affected. Additionally, one year's worth of block subsidies (approximately 1.57 GS) will be allocated to the Fund immediately upon activation of the hardfork.
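As a quick sanity check on these figures (assuming Sia's nominal 10-minute block time, and 1 GS = 10^6 KS):

    BLOCK_TIME_MIN = 10                  # Sia targets one block every ~10 minutes
    SUBSIDY_KS = 30                      # proposed Foundation subsidy per block

    blocks_per_year = 365 * 24 * 60 // BLOCK_TIME_MIN   # 52,560 blocks
    annual_subsidy_gs = blocks_per_year * SUBSIDY_KS / 1e6

    print(blocks_per_year)               # 52560
    print(annual_subsidy_gs)             # ~1.577 GS, matching the quoted ~1.57 GS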
As detailed below, the Foundation will provably burn any coins that it cannot meaningfully spend. As such, the 30 KS subsidy should be viewed as a maximum. This allows the Foundation to grow alongside Sia without requiring additional hardforks.
The Foundation will not be funded to any degree by the possession or sale of siafunds. Siafunds were originally introduced as a means of incentivizing growth, and we still believe in their effectiveness: a siafund holder wants to increase the amount of storage on Sia as much as possible. While the Foundation obviously wants Sia to succeed, its driving force should be its charter. Deriving significant revenue from siafunds would jeopardize the Foundation's impartiality and focus. Ultimately, we want the Foundation to act in the best interests of Sia, not in growing its own budget.

Responsibilities

The Foundation inherits a great number of responsibilities from Nebulous. Each quarter, the Foundation will publish the progress it has made over the past quarter, and list the responsibilities it intends to prioritize over the coming quarter. This will be accompanied by a financial report, detailing each area of expenditure over the past quarter, and forecasting expenditures for the coming quarter. Below, we summarize some of the myriad responsibilities towards which the Foundation is expected to allocate its resources.

Maintain and enhance core Sia software

Arguably, this is the most important responsibility of the Foundation. At the heart of Sia is its consensus algorithm: regardless of other differences, all Sia software must agree upon the content and rules of the blockchain. It is therefore crucial that the algorithm be stewarded by an entity that is accountable to the community, transparent in its decision-making, and has no profit motive or other conflicts of interest.
Accordingly, Sia’s consensus functionality will no longer be directly maintained by Nebulous. Instead, the Foundation will release and maintain an implementation of a "minimal Sia full node," comprising the Sia consensus algorithm and P2P networking code. The source code will be available in a public repository, and signed binaries will be published for each release.
Other parties may use this code to provide alternative full node software. For example, Nebulous may extend the minimal full node with wallet, renter, and host functionality. The source code of any such implementation may be submitted to the Foundation for review. If the code passes review, the Foundation will provide "endorsement signatures" for the commit hash used and for binaries compiled internally by the Foundation. Specifically, these signatures assert that the Foundation believes the software contains no consensus-breaking changes or other modifications to imported Foundation code. Endorsement signatures and Foundation-compiled binaries may be displayed and distributed by the receiving party, along with an appropriate disclaimer.
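A minimal sketch of how such an endorsement signature could work, using Ed25519 via the Python cryptography package. The message format and key handling are illustrative assumptions, not the Foundation's actual scheme:

    # pip install cryptography
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # Foundation signing key (in practice this would be kept offline).
    foundation_key = Ed25519PrivateKey.generate()
    foundation_pub = foundation_key.public_key()

    # Hypothetical endorsement message: reviewed commit hash plus a claim.
    commit_hash = "9f2c4e..."  # hypothetical commit hash
    message = f"endorse:no-consensus-breaking-changes:{commit_hash}".encode()
    signature = foundation_key.sign(message)

    # Anyone can check the endorsement against the Foundation's public key;
    # verify() raises InvalidSignature if the message or signature was altered.
    foundation_pub.verify(signature, message)
    print("endorsement verified")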
A minimal full node is not terribly useful on its own; the wallet, renter, host, and other extensions are what make Sia a proper developer platform. Currently, the only implementations of these extensions are maintained by Nebulous. The Foundation will contract Nebulous to ensure that these extensions continue to receive updates and enhancements. Later on, the Foundation intends to develop its own implementations of these extensions and others. As with the minimal node software, these extensions will be open source and available in public repositories for use by any Sia node software.
With the consensus code now managed by the Foundation, the task of implementing and orchestrating hardforks becomes its responsibility as well. When the Foundation determines that a hardfork is necessary (whether through internal discussion or via community petition), a formal proposal will be drafted and submitted for public review, during which arguments for and against the proposal may be submitted to a public repository. During this time, the hardfork code will be implemented, either by Foundation employees or by external contributors working closely with the Foundation. Once the implementation is finished, final arguments will be heard. The Foundation board will then vote whether to accept or reject the proposal, and announce their decision along with appropriate justification. Assuming the proposal was accepted, the Foundation will announce the block height at which the hardfork will activate, and will subsequently release source code and signed binaries that incorporate the hardfork code.
Regardless of the Foundation's decision, it is the community that ultimately determines whether a fork is accepted or rejected – nothing can change that. Foundation node software will never automatically update, so all forks must be explicitly adopted by users. Furthermore, the Foundation will provide replay and wipeout protection for its hard forks, protecting other chains from unintended or malicious reorgs. Similarly, the Foundation will ensure that any file contracts formed prior to a fork activation will continue to be honored on both chains until they expire.
Finally, the Foundation also intends to pursue scalability solutions for the Sia blockchain. In particular, work has already begun on an implementation of Utreexo, which will greatly reduce the space requirements of fully-validating nodes (allowing a full node to be run on a smartphone) while increasing throughput and decreasing initial sync time. A hardfork implementing Utreexo will be submitted to the community as per the process detailed above.
As this is the most important responsibility of the Foundation, it will receive a significant portion of the Foundation’s budget, primarily in the form of developer salaries and contracting agreements.

Support community services

We intend to allocate 25% of the Foundation Fund towards the community. This allocation will be held and disbursed in the form of siacoins, and will pay for grants, bounties, hackathons, and other community-driven endeavours.
Any community-run service, such as a Skynet portal, explorer or web wallet, may apply to have its costs covered by the Foundation. Upon approval, the Foundation will reimburse expenses incurred by the service, subject to the exact terms agreed to. The intent of these grants is not to provide a source of income, but rather to make such services "break even" for their operators, so that members of the community can enrich the Sia ecosystem without worrying about the impact on their own finances.

Ensure easy acquisition and storage of siacoins

Most users will acquire their siacoins via an exchange. The Foundation will provide support to Sia-compatible exchanges, and pursue relevant integrations at its discretion, such as Coinbase's new Rosetta standard. The Foundation may also release DEX software that enables trading cryptocurrencies without the need for a third party. (The Foundation itself will never operate as a money transmitter.)
Increasingly, users are storing their cryptocurrency on hardware wallets. The Foundation will maintain the existing Ledger Nano S integration, and pursue further integrations at its discretion.
Of course, all hardware wallets must be paired with software running on a computer or smartphone, so the Foundation will also develop and/or maintain client-side wallet software, including both full-node wallets and "lite" wallets. Community-operated wallet services, i.e. web wallets, may be funded via grants.
Like core software maintenance, this responsibility will be funded in the form of developer salaries and contracting agreements.

Protect the ecosystem

When it comes to cryptocurrency security, patching software vulnerabilities is table stakes; there are significant legal and social threats that we must be mindful of as well. As such, the Foundation will earmark a portion of its fund to defend the community from legal action. The Foundation will also safeguard the network from 51% attacks and other threats to network security by implementing softforks and/or hardforks where necessary.
The Foundation also intends to assist in the development of a new FOSS software license, and to solicit legal memos on various Sia-related matters, such as hosting in the United States and the EU.
In a broader sense, the establishment of the Foundation makes the ecosystem more robust by transferring core development to a more neutral entity. Thanks to its funding structure, the Foundation will be immune to various forms of pressure that for-profit companies are susceptible to.

Drive adoption of Sia

Although the overriding goal of the Foundation is to make Sia the best platform it can be, all that work will be in vain if no one uses the platform. There are a number of ways the Foundation can promote Sia and get it into the hands of potential users and developers.
In-person conferences are understandably far less popular now, but the Foundation can sponsor and/or participate in virtual conferences. (In-person conferences may be held in the future, circumstances permitting.) Similarly, the Foundation will provide prizes for hackathons, which may be organized by community members, Nebulous, or the Foundation itself. Lastly, partnerships with other companies in the cryptocurrency space—or the cloud storage space—are a great way to increase awareness of Sia. To handle these responsibilities, one of the early priorities of the Foundation will be to hire a marketing director.

Fund Management

The Foundation Fund will be controlled by a multisig address. Each member of the Foundation's board will control one of the signing keys, with the signature threshold to be determined once the final composition of the board is known. (This threshold may also be increased or decreased if the number of board members changes.) Additionally, one timelocked signing key will be controlled by David Vorick. This key will act as a “dead man’s switch,” to be used in the event of an emergency that prevents Foundation board members from reaching the signature threshold. The timelock ensures that this key cannot be used unless the Foundation fails to sign a transaction for several months.
On the 1st of each month, the Foundation will use its keys to transfer all siacoins in the Fund to two new addresses. The first address will be controlled by a high-security hot wallet, and will receive approximately one month's worth of Foundation expenditures. The second address, receiving the remaining siacoins, will be a modified version of the source address: specifically, it will increase the timelock on David Vorick's signing key by one month. Any other changes to the set of signing keys, such as the arrival or departure of board members, will be incorporated into this address as well.
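A sketch of this monthly rotation logic in Python. The data shapes and the 4,320-blocks-per-month figure (30 days of 10-minute blocks) are illustrative assumptions, not the actual Sia implementation:

    from dataclasses import dataclass, replace

    BLOCKS_PER_MONTH = 4_320   # ~30 days of 10-minute blocks

    @dataclass
    class FundAddress:
        board_keys: list[str]        # one signing key per board member
        threshold: int               # signatures required to spend
        deadman_key: str             # David Vorick's timelocked key
        deadman_unlock_height: int   # height at which the switch becomes usable

    def monthly_rotation(src: FundAddress) -> FundAddress:
        # Each month the fund moves to a fresh address whose dead man's
        # switch is pushed one month further into the future; board-key
        # changes (arrivals/departures) would be applied here as well.
        return replace(src,
                       deadman_unlock_height=src.deadman_unlock_height + BLOCKS_PER_MONTH)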
The Foundation Fund is allocated in SC, but many of the Foundation's expenditures must be paid in USD or other fiat currency. Accordingly, the Foundation will convert, at its discretion, a portion of its monthly withdrawals to fiat currency. We expect this conversion to be primarily facilitated by private "OTC" sales to accredited investors. The Foundation currently has no plans to speculate in cryptocurrency or other assets.
Finally, it is important that the Foundation adds value to the Sia platform well in excess of the inflation introduced by the block subsidy. For this reason, the Foundation intends to provably burn, on a quarterly basis, any coins that it cannot allocate towards any justifiable expense. In other words, coins will be burned whenever doing so provides greater value to the platform than any other use. Furthermore, the Foundation will cap its SC treasury at 5% of the total supply, and will cap its USD treasury at 4 years’ worth of predicted expenses.
 
Addendum: Hardfork Timeline
We would like to see this proposal finalized and accepted by the community no later than September 30th. A new version of siad, implementing the hardfork, will be released no later than October 15th. The hardfork will activate at block 293220, which is expected to occur around 12pm EST on January 1st, 2021.
 
Addendum: Inflation specifics
The total supply of siacoins as of January 1st, 2021 will be approximately 45.243 GS. The initial subsidy of 1.57 GS thus increases the supply by 3.47%, and the total annual inflation in 2021 will be at most 10.4% (if zero coins are burned). In 2022, total annual inflation will be at most 6.28%, and will steadily decrease in subsequent years.
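These percentages can be reproduced from the stated figures, assuming the existing 30 KS miner reward continues alongside the new 30 KS Foundation subsidy (~52,560 blocks per year). A quick check in Python:

    supply_2021 = 45.243                      # GS, projected supply on Jan 1, 2021
    initial_fund = 1.57                       # GS, one-time allocation at the fork
    annual_subsidy = 52_560 * 30_000 / 1e9    # ~1.577 GS/yr each (miner + Foundation)

    print(initial_fund / supply_2021)                          # ~0.0347 -> 3.47%
    print((initial_fund + 2 * annual_subsidy) / supply_2021)   # ~0.104  -> 10.4% max
    supply_2022 = supply_2021 + initial_fund + 2 * annual_subsidy
    print(2 * annual_subsidy / supply_2022)   # ~0.063, close to the quoted 6.28%
                                              # (exact value depends on block counts)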
 

Conclusion

We see the establishment of the Foundation as an important step in the maturation of the Sia project. It provides the ecosystem with a sustainable source of funding that can be exclusively directed towards achieving Sia's ambitious goals. Compared to other projects with far deeper pockets, Sia has always punched above its weight; once we're on equal footing, there's no telling what we'll be able to achieve.
Nevertheless, we do not propose this change lightly, and have taken pains to ensure that the Foundation will act in accordance with the ideals that this community shares. It will operate transparently, keep inflation to a minimum, and respect the user's fundamental role in decentralized systems. We hope that everyone in the community will consider this proposal carefully, and look forward to a productive discussion.
submitted by lukechampine to siacoin

[OWL WATCH] Waiting for "IOTA TIME" 27;

Disclaimer: This is my editing, so there could always be some misunderstandings and exaggerations; plus, many convos are from the 'spec channel', so take it with a grain of salt, pls.
+ I added some recent convos afterward.
--------------------------------------------------
Luigi Vigneri [IF] (yesterday 8:26 PM)
Giving everybody the opportunity to set up/run nodes is one of IOTA's priorities. A minimum amount of resources is obviously required to prevent easy attacks, but we are making sure that being an active part of the IOTA network is possible without crazy investments.
we are building our solution in such a way that the protocol is fair and lightweight.

Hans Moog [IF] (yesterday 11:24 PM)
IOTA is not "free to use" but it's fee-less
you have tokens? you can send them around for free

Hans Moog [IF] (yesterday 11:25 PM)
you have no tokens? you have to pay to use the network

lekanovic (yesterday 11:25 PM)
I think it is a smart way to avoid the network spamming problem

Hans Moog [IF] (yesterday 11:26 PM)
owning tokens is essentially like owning a share of the actual network
and the throughput it can process

Hans Moog [IF] (yesterday 11:26 PM)
if you don't need all of that yourself, you can rent it out to people and earn money

Hans Moog [IF] (yesterday 11:27 PM)
mana = tokens * time since you own them
simplified

Hans Moog [IF] (yesterday 11:27 PM)
the longer you hold your tokens and the more you have, the more mana you have
but every now and then you have to move them to "realize" that mana

lekanovic (yesterday 11:28 PM)
Is there any other project that is using a mana solution to the network fee problem?

Hans Moog [IF] (yesterday 11:28 PM)
nah
the problem with current protocols is that they are leader based
Hans Moog [IF] (yesterday 11:29 PM)
you need absolute consensus on who the current leaders are and what their influence in the network is
that's how blockchains work

Hans Moog [IF] (yesterday 11:29 PM)
if two block producers produce 2 blocks at the same time, then you have to choose which one wins
and where everybody attaches their next block to
IOTA works differently and doesn't need to choose a single leader
we therefore have a much bigger flexibility in designing our sybil protection mechanisms
in a way, mana is also supposed to solve the problem of "rewarding" the infrastructure instead of the validators
in blockchain only the miners get all the money
running a node, even one that is used by a lot of people, will only cost you
you won't get anything back
no fees, nothing
the miners get it all

Hans Moog [IF] (yesterday 11:31 PM)
in IOTA, the node operators receive the mana
which gives them a share of the network throughput

Hans Moog [IF] (yesterday 11:32 PM)
because in blockchain you need to decide whose txs become part of the blocks
and it's not really based on networking protocols like AIMD
lekanovic (yesterday 11:33 PM)
And the more mana your node has, the more trust your node has and the more you have to say in the FPC, is that correct?

Hans Moog [IF] (yesterday 11:33 PM)
yeah
a node that has processed a lot of txs of its users will have more mana than other nodes
and therefore a bigger say in deciding conflicts
it's a direct measure of "trust" by its users

lekanovic (yesterday 11:34 PM)
And choosing the committee for dRNG would be done on the L1 protocol level?
Everything regarding mana will be L1 level, right?

Hans Moog [IF] (yesterday 11:35 PM)
Yeah
Mana is layer 1, but will also be used as weight in L2 solutions like smart contracts

lekanovic (yesterday 11:35 PM)
And you are not dependent on using SC to implement this

Hans Moog [IF] (yesterday 11:35 PM)
No, you don't need smart contracts
That's all the base layer

Hans Moog [IF] (yesterday 11:37 PM)
'Time' actually takes into account things like decay
So it doesn't just increase forever
It's close to "demurrage" in monetary theory
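A minimal sketch of the simplified formula described above (mana = tokens * time, with decay so it doesn't grow forever). The decay constant and update rule are illustrative assumptions, not the actual IOTA implementation:

    import math

    DECAY_PER_HOUR = 0.001  # illustrative decay constant

    def pending_mana(tokens: float, hours_held: float) -> float:
        # Integral of tokens * e^(-decay*t) over the holding period:
        # grows with time but saturates at tokens/decay ("demurrage"-like).
        return tokens * (1 - math.exp(-DECAY_PER_HOUR * hours_held)) / DECAY_PER_HOUR

    for hours in (24, 24 * 30, 24 * 365):
        print(f"{hours:>5} h -> {pending_mana(10_000, hours):,.0f} mana")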
lekanovic (yesterday 11:36 PM)
For projects to be able to connect to Polkadot or Cosmos, you need to get the state of the ledger.
Will it be possible to get the Tangle state?
If this would be possible, then I think it would be SUPER good for IOTA

Hans Moog [IF] (yesterday 11:38 PM)
Yeah but Polkadot is not connecting other DLTs
Just in-house stuff

Hyperware (yesterday 11:39 PM)
Is there still a cap on mana so that the rich don't get richer?

Hans Moog [IF] (yesterday 11:39 PM)
Yes mana is capped

TangleAccountant (yesterday 11:39 PM)
u/Hans Moog [IF] My first thought is that the evolution of this renting system will lead to several big mana renting companies that pool together tons of token holders' mana. That way businesses looking to rent mana just need to deal with a reliable mana renting company for years instead of a new individual every couple of months (because life happens and you don't know if that individual will need to sell their IOTAs due to personal reasons). Any thoughts on this?

Hans Moog [IF] (yesterday 11:41 PM)
u/TangleAccountant yes that is likely - but also not a bad thing - token holders will have a place to get their monthly payout and the companies that want to use the tangle without having tokens have a place to pay

TangleAccountant (yesterday 11:42 PM)
Oh I completely agree. That's really cool. I'll take a stab at creating one of those companies in the US.

Hans Moog [IF] (yesterday 11:42 PM)
And everybody who wants to run a node themselves or has tokens and wants to use the tangle for free can do so
But "leechers" that would want to use the network for free won't be able to do so
I mean ultimately there will always be "fees", as there is no "free lunch".
You have a certain amount of resources that a network can process and you have a certain demand.
And that will naturally result in fees based on supply / demand
what you can do however is to build a system where the actual users of that system that legitimately want to use it can do so for free,
just because they already "invest" enough by having tokens
or running infrastructure
they are already contributing to the well-being of the network through these two aspects alone
it would be stupid to ask those guys for additional fees
and mana essentially tries to be such a measure of honesty among the users

Hyperware (yesterday 11:47 PM)
It's interesting from an investment perspective that having tokens/mana is like owning a portion of the network.

Hans Moog [IF] (yesterday 11:48 PM)
Yeah, you are owning a certain % of the throughput and whatever the price will ultimately be to execute on this network - you will earn proportionally
but you have to keep in mind that we are trying to build the most efficient DLT that you could possibly ever build
semibaron (yesterday 11:48 PM)
The whole mana (tokens) = share of network throughput sounds very much like EOS tbh
Just that EOS uses DPoS

Hans Moog [IF] (yesterday 11:50 PM)
yeah i mean there are really not too many new things under the sun - you can just tweak a few things here and there, when it comes to distributing resources
DPoS is simply not very nice from a centralization aspect

Hans Moog [IF] (yesterday 11:50 PM)
at least not the way EOS does it
delegating weights is 1 thing
but assuming that the weight will always be in a way that 21 "identities" run the whole network is bad
in the current world you see a centralization of power
but ultimately we want to build a future where the wealth is more evenly distributed
and the same goes for voting power

Hans Moog [IF] (yesterday 11:52 PM)
blockchain needs leader selection
it only works with such a centralizing component
IOTA doesn't need that
it's delusional to say that IOTA wouldn't have any such centralization
but maybe we get better than just hand-selected nodes

Phantom3D (yesterday 11:52 PM)
How would this affect a regular hodler without a node. Should i keep my tokens elsewhere to generate mana and put the tokens to use?

Hans Moog [IF] (yesterday 11:53 PM)
you can do whatever you want with your mana
just make an account at a node you regularly use and use it to build up a reputation with that node
to be able to use your funds for free
or run a node yourself
or rent it out to companies if you just hodl

semibaron (yesterday 11:54 PM)
Will there be a built-in function in the node software / wallet to delegate ("sell") my mana?

Hans Moog [IF] (yesterday 11:55 PM)
u/semibaron not from the start - that would happen on a 2nd layer
------------------------------------------------------------------------------------------------------------
dom (yesterday 9:49 PM)
suddenly there's an incentive to hold iota?
to generate mana

Hyperware (today 4:21 AM)
The only thing I can really do, is believe that the IF have smart answers and are still building the best solutions they can for the sake of the vision

dom (today 4:43 AM)
100% - which is why we're spending so much effort to communicate it more clearly now
we'll do an AMA on this topic very soon

M [s2] (today 4:54 AM)
u/dom please accept my question for the AMA: will IOTA remain a permissionless system and if so, how?

dom (today 4:57 AM)
of course it remains permissionless

dom (today 5:20 AM)
what is permissioned about it?
is ETH or Bitcoin permissioned because you have to pay a transaction fee in their native token?
Gerrit (today 5:24 AM)
What did your industry partners think about the mana solution and the fact that they need to hold the token to ensure network throughput?

dom (today 5:26 AM)
u/Gerrit considering how the infrastructure, legal and regulatory frameworks are improving around the adoption and usage of crypto-currencies within large companies, I really think that we are introducing this concept exactly at the right time. It should make enterprise partners comfortable in using the permissionless network without much of a hurdle. They can always launch their own network if they want to ...

Gerrit (today 5:27 AM)
Launching their own network can't be what you want

dom (today 5:27 AM)
exactly
but that is what's happening with Ethereum and all the other networks
they don't hold Ether tokens either.

Gerrit (today 5:32 AM)
Will be very exciting to see if ongoing regulation will "allow" companies to invest in and hold the tokens. With upcoming custody solutions that would be a fantastic play.

Hans Moog [IF] (today 5:34 AM)
It's still possible to send transactions even without mana - mana is only used in times of congestion to give the people that have more mana more priority
there will still be sharding to keep the network free most of the time

Hans Moog [IF] (today 5:35 AM)
but without a protection mechanism, somebody could just spam a lot of bullshit and you could break the network (edited)
you need some form of protection from this

M [s2] (today 5:36 AM)
u/Hans Moog [IF] so when I have 0 mana, I can still send transactions? This is actually the point where it got strange...

Hans Moog [IF] (today 5:37 AM)
yes you can
unless the network is close to its processing capabilities / being attacked by spammers
then the nodes will favor the mana holders

Hans Moog [IF] (today 5:37 AM)
but having mana is not a requirement for many years to come
currently even people having FPGAs can't spam that many tps
and we will also have sharding implemented by then

M [s2] (today 5:39 AM)
Thank you u/Hans Moog [IF]! This is the actually important piece of info!
Basha (today 5:38 AM)
ok, i thought it was communicated that you need at least 1 mana to process a transaction.
from the blogpost: "... a node with 0 mana can issue no transactions."
maybe they meant during congestion, but if that's the case maybe you should add that

Hans Moog [IF] (today 5:42 AM)
it's under the point "Congestion control:"
yeah this only applies to spam attacks
network not overloaded = no mana needed

Hans Moog [IF] (today 5:43 AM)
if congested => favor txs from people who have the most skin in the game
but sharding will try to keep the network non-congested most of the time - but there might be short periods of time where an attacker might bring the network close to its limits
and of course it's going to take a while to add this, so we need a protection mechanism till sharding is supported (edited)
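A toy sketch of the behavior described here (no congestion: every transaction is processed, mana not needed; congestion: transactions are ordered by the issuer's mana). Capacity and data shapes are invented for illustration:

    from dataclasses import dataclass, field

    CAPACITY_PER_ROUND = 3  # txs a node can process per round (illustrative)

    @dataclass(order=True)
    class Tx:
        issuer_mana: float
        payload: str = field(compare=False)

    def schedule(mempool: list[Tx]) -> list[Tx]:
        if len(mempool) <= CAPACITY_PER_ROUND:
            return mempool                      # uncongested: 0 mana works fine
        # congested: favor issuers with the most "skin in the game"
        return sorted(mempool, reverse=True)[:CAPACITY_PER_ROUND]

    pool = [Tx(0, "zero-mana tx"), Tx(50, "small holder"), Tx(900, "node op"),
            Tx(300, "exchange"), Tx(10, "possible spam")]
    for tx in schedule(pool):
        print(tx.issuer_mana, tx.payload)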
Hans Moog [IF] (today 6:36 AM)
I don't have a particular problem with EOS or their amount of validators - the reason why I think blockchain is inferior has really nothing to do with the way you do sybil protection
and with validators I mean "voting nodes"
I mean even bitcoin has fewer mining pools
and you could compare mining pools to DPoS in some sense
where people assign their weight (in that case hashing power) to the corresponding mining pools
so EOS is definitely not less decentralized than any other tech
but having more identities having weight in the decision process definitely makes it harder to corrupt a reasonable fraction of the system and makes it easier to shard
so it's desirable to have this property (edited)

-------------------------------------------------

Antonio Nardella [IF] (today 3:36 AM)
https://twitter.com/cmcanalytics/status/1310866311929647104?s=19
u/C3PO [92% Cooless] They could also add more git repos instead of the wallet one, and we would probably be #1 there too..
----------------------------------------------------------------------------------
Disclaimer:
I'm sorry, maybe I'm fueling some confusion by posting this mana-thing too soon,
but, instead of erasing this posting, I'm adding recent convos.
Certain things about mana seem not to be clear yet.
It would be better to wait for some official clarification.
But, I hope the community gives its full support to IF, 'cause
there could always be some bumps along the untouched, uncharted way.
--------------------------------------------------------------------------------------
Recent Addition;

Billy Sanders [IF] (today 1:36 PM)
> It's still possible to send transactions even without mana - mana is only used in times of congestion to give the people that have more mana more priority
u/Hans Moog [IF] I'm sorry Hans, but this is false in the current congestion control algorithm. No mana = no transactions. To be honest, we haven't really tried to make it work so that you can send transactions with no mana during times with no congestion, but I don't see how you can enable this and still maintain the sybil protection required. u/Luigi Vigneri [IF] What do you think?

Dave [EF] (today 2:19 PM)
Suggestion: Sidebar, then get back to us with the verdict. (edited)

dom (today 2:27 PM)
No mana no tx will definitely not be the case (edited)
[2:28 PM] Billy probably means the previous rate control paper, as it was written by Luigi. I'll clarify with them

Hans Moog [IF] (today 2:29 PM)
When was this decided u/Billy Sanders [IF] and by whom? Was this discussed at the last resum when I wasn't there? The last info that I had was that the congestion control should only kick in when there is congestion?!?

Navin Ramachandran [IF] (today 2:30 PM)
Let's sidebar this discussion and return when we have agreement. Dave has the right idea

submitted by btlkhs to Iota

Bob The Magic Custodian



Summary: Everyone knows that when you give your assets to someone else, they always keep them safe. If this is true for individuals, it is certainly true for businesses.
Custodians always tell the truth and manage funds properly. They won't have any interest in taking the assets as an exchange operator would. Auditors tell the truth and can't be misled. That's because organizations that are regulated are incapable of lying and don't make mistakes.

First, some background. Here is a summary of how custodians make us more secure:

Previously, we might give Alice our crypto assets to hold. There were risks:

But "no worries", Alice has a custodian named Bob. Bob is dressed in a nice suit. He knows some politicians. And he drives a Porsche. "So you have nothing to worry about!". And look at all the benefits we get:
See - all problems are solved! All we have to worry about now is:
It's pretty simple. Before we had to trust Alice. Now we only have to trust Alice, Bob, and all the ways in which they communicate. Just think of how much more secure we are!

"On top of that", Bob assures us, "we're using a special wallet structure". Bob shows Alice a diagram. "We've broken the balance up and store it in lots of smaller wallets. That way", he assures her, "a thief can't take it all at once". And he points to a historic case where a large sum was taken "because it was stored in a single wallet... how stupid".
"Very early on, we used to have all the crypto in one wallet", he said, "and then one Christmas a hacker came and took it all. We call him the Grinch. Now we individually wrap each crypto and stick it under a binary search tree. The Grinch has never been back since."

"As well", Bob continues, "even if someone were to get in, we've got insurance. It covers all thefts and even coercion, collusion, and misplaced keys - only subject to the policy terms and conditions." And with that, he pulls out a phone-book sized contract and slams it on the desk with a thud. "Yep", he continues, "we're paying top dollar for one of the best policies in the country!"
"Can I read it?' Alice asks. "Sure," Bob says, "just as soon as our legal team is done with it. They're almost through the first chapter." He pauses, then continues. "And can you believe that sales guy Mike? He has the same year Porsche as me. I mean, what are the odds?"

"Do you use multi-sig?", Alice asks. "Absolutely!" Bob replies. "All our engineers are fully trained in multi-sig. Whenever we want to set up a new wallet, we generate 2 separate keys in an air-gapped process and store them in this proprietary system here. Look, it even requires the biometric signature from one of our team members to initiate any withdrawal." He demonstrates by pressing his thumb into the display. "We use a third-party cloud validation API to match the thumbprint and authorize each withdrawal. The keys are also backed up daily to an off-site third-party."
"Wow that's really impressive," Alice says, "but what if we need access for a withdrawal outside of office hours?" "Well that's no issue", Bob says, "just send us an email, call, or text message and we always have someone on staff to help out. Just another part of our strong commitment to all our customers!"

"What about Proof of Reserve?", Alice asks. "Of course", Bob replies, "though rather than publish any blockchain addresses or signed transaction, for privacy we just do a SHA256 refactoring of the inverse hash modulus for each UTXO nonce and combine the smart contract coefficient consensus in our hyperledger lightning node. But it's really simple to use." He pushes a button and a large green checkmark appears on a screen. "See - the algorithm ran through and reserves are proven."
"Wow", Alice says, "you really know your stuff! And that is easy to use! What about fiat balances?" "Yeah, we have an auditor too", Bob replies, "Been using him for a long time so we have quite a strong relationship going! We have special books we give him every year and he's very efficient! Checks the fiat, crypto, and everything all at once!"

"We used to have a nice offline multi-sig setup we've been using without issue for the past 5 years, but I think we'll move all our funds over to your facility," Alice says. "Awesome", Bob replies, "Thanks so much! This is perfect timing too - my Porsche got a dent on it this morning. We have the paperwork right over here." "Great!", Alice replies.
And with that, Alice gets out her pen and Bob gets the contract. "Don't worry", he says, "you can take your crypto-assets back anytime you like - just subject to our cancellation policy. Our annual management fees are also super low and we don't adjust them often".

How many holes have to exist for your funds to get stolen?
Just one.

Why are we taking a powerful offline multi-sig setup, widely used globally in hundreds of different/lacking regulatory environments with 0 breaches to date, and circumventing it by a demonstrably weak third party layer? And paying a great expense to do so?
If you go through the list of breaches in the past 2 years to highly credible organizations, you go through the list of major corporate frauds (only the ones we know about), you go through the list of all the times platforms have lost funds, you go through the list of times and ways that people have lost their crypto from identity theft, hot wallet exploits, extortion, etc... and then you go through this custodian with a fine-tooth comb and truly believe they have value to add far beyond what you could, sticking your funds in a wallet (or set of wallets) they control exclusively is the absolute worst possible way to take advantage of that security.

The best way to add security for crypto-assets is to make a stronger multi-sig. With one custodian, what you are doing is giving them your cryptocurrency and hoping they're honest, competent, and flawlessly secure. It's no different than storing it on a really secure exchange. Maybe the insurance will cover you. Didn't work for Bitpay in 2015. Didn't work for Yapizon in 2017. Insurance has never paid a claim in the entire history of cryptocurrency. But maybe you'll get lucky. Maybe your exact scenario will buck the trend and be what they're willing to cover. After the large deductible and hopefully without a long and expensive court battle.

And you want to advertise this increase in risk, the lapse of judgement, an accident waiting to happen, as though it's some kind of benefit to customers ("Free institutional-grade storage for your digital assets.")? And then some people are writing to the OSC that custodians should be mandatory for all funds on every exchange platform? That this somehow will make Canadians as a whole more secure or better protected compared with standard air-gapped multi-sig? On what planet?

Most of the problems in Canada stemmed from one thing - a lack of transparency. If Canadians had known what a joke Quadriga was - it wouldn't have grown to lose $400m from hard-working Canadians from coast to coast to coast. And Gerald Cotten would be in jail, not wherever he is now (at best, rotting peacefully). EZ-BTC and mister Dave Smilie would have been a tiny little scam to his friends, not a multi-million dollar fraud. Einstein would have got their act together or been shut down BEFORE losing millions and millions more in people's funds generously donated to criminals. MapleChange wouldn't have even been a thing. And maybe we'd know a little more about CoinTradeNewNote - like how much was lost in there. Almost all of the major losses with cryptocurrency exchanges involve deception with unbacked funds.
So it's great to see transparency reports from BitBuy and ShakePay where someone independently verified the backing. The only thing we don't have is:
It's not complicated to validate cryptocurrency assets. They need to exist, they need to be spendable, and they need to cover the total balances. There are plenty of credible people and firms across the country that have the capacity to reasonably perform this validation. Having more frequent checks by different, independent, parties who publish transparent reports is far more valuable than an annual check by a single "more credible/official" party who does the exact same basic checks and may or may not publish anything. Here's an example set of requirements that could be mandated:
There are ways to structure audits such that neither crypto assets nor customer information are ever put at risk, and both can still be properly validated and publicly verifiable. There are also ways to structure audits such that they are completely reasonable for small platforms and don't inhibit innovation in any way. By making the process as reasonable as possible, we can completely eliminate any reason/excuse that an honest platform would have for not being audited. That is arguable far more important than any incremental improvement we might get from mandating "the best of the best" accountants. Right now we have nothing mandated and tons of Canadians using offshore exchanges with no oversight whatsoever.

Transparency does not prove crypto assets are safe. CoinTradeNewNote, Flexcoin ($600k), and Canadian Bitcoins ($100k) are examples where crypto-assets were breached from platforms in Canada. All of them were online wallets and used no multi-sig as far as any records show. This is consistent with what we see globally - air-gapped multi-sig wallets have an impeccable record, while other schemes tend to suffer breach after breach. We don't actually know how much CoinTrader lost because there was no visibility. Rather than publishing details of what happened, the co-founder of CoinTrader silently moved on to found another platform - the "most trusted way to buy and sell crypto" - a site that has no information whatsoever (that I could find) on the storage practices and a FAQ advising that “[t]rading cryptocurrency is completely safe” and that having your own wallet is “entirely up to you! You can certainly keep cryptocurrency, or fiat, or both, on the app.” Doesn't sound like much was learned here, which is really sad to see.
It's not that complicated or unreasonable to set up a proper hardware wallet. Multi-sig can be learned in a single course. Something the equivalent complexity of a driver's license test could prevent all the cold storage exploits we've seen to date - even globally. Platform operators have a key advantage in detecting and preventing fraud - they know their customers far better than any custodian ever would. The best job that custodians can do is to find high integrity individuals and train them to form even better wallet signatories. Rather than mandating that all platforms expose themselves to arbitrary third party risks, regulations should center around ensuring that all signatories are background-checked, properly trained, and using proper procedures. We also need to make sure that signatories are empowered with rights and responsibilities to reject and report fraud. They need to know that they can safely challenge and delay a transaction - even if it turns out they made a mistake. We need to have an environment where mistakes are brought to the surface and dealt with. Not one where firms and people feel the need to hide what happened. In addition to a knowledge-based test, an auditor can privately interview each signatory to make sure they're not in coercive situations, and we should make sure they can freely and anonymously report any issues without threat of retaliation.
A proper multi-sig has each signature held by a separate person and is governed by policies and mutual decisions instead of a hierarchy. It includes at least one redundant signature. For best results, 3of4, 3of5, 3of6, 4of5, 4of6, 4of7, 5of6, or 5of7.
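The trade-off behind those recommended configurations can be made explicit: an M-of-N multisig tolerates N - M lost or unavailable keys while still requiring M colluding signers to move funds. A tiny sketch:

    def multisig_properties(m: int, n: int) -> dict:
        return {
            "scheme": f"{m}of{n}",
            "keys_that_can_be_lost": n - m,   # redundancy against loss
            "colluders_needed_to_steal": m,   # resistance against fraud
        }

    for m, n in [(3, 4), (3, 5), (4, 6), (5, 7)]:
        print(multisig_properties(m, n))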

History has demonstrated over and over again the risk of hot wallets even to highly credible organizations. Nonetheless, many platforms have hot wallets for convenience. While such losses are generally compensated by platforms without issue (for example Poloniex, Bitstamp, Bitfinex, Gatecoin, Coincheck, Bithumb, Zaif, CoinBene, Binance, Bitrue, Bitpoint, Upbit, VinDAX, and now KuCoin), the public tends to focus more on cases that didn't end well. Regardless of what systems are employed, there is always some level of risk. For that reason, most members of the public would prefer to see third party insurance.
Rather than trying to convince third party profit-seekers to provide comprehensive insurance and then relying on an expensive and slow legal system to enforce against whatever legal loopholes they manage to find each and every time something goes wrong, insurance could be run through multiple exchange operators and regulators, with the shared interest of having a reputable industry, keeping costs down, and taking care of Canadians. For example, a 4 of 7 multi-sig insurance fund held between 5 independent exchange operators and 2 regulatory bodies. All Canadian exchanges could pay premiums at a set rate based on their needed coverage, with a higher price paid for hot wallet coverage (anything not an air-gapped multi-sig cold wallet). Such a model would be much cheaper to manage, offer better coverage, and be much more reliable to payout when needed. The kind of coverage you could have under this model is unheard of. You could even create something like the CDIC to protect Canadians who get their trading accounts hacked if they can sufficiently prove the loss is legitimate. In cases of fraud, gross negligence, or insolvency, the fund can be used to pay affected users directly (utilizing the last transparent balance report in the worst case), something which private insurance would never touch. While it's recommended to have official policies for coverage, a model where members vote would fully cover edge cases. (Could be similar to the Supreme Court where justices vote based on case law.)
Such a model could fully protect all Canadians across all platforms. You can have a fiat coverage governed by legal agreements, and crypto-asset coverage governed by both multi-sig and legal agreements. It could be practical, affordable, and inclusive.

Now, we are at a crossroads. We can happily give up our freedom, our innovation, and our money. We can pay hefty expenses to auditors, lawyers, and regulators year after year (and make no mistake - this cost will grow to many millions or even billions as the industry grows - and it will be borne by all Canadians on every platform because platforms are not going to eat up these costs at a loss). We can make it nearly impossible for any new platform to enter the marketplace, forcing Canadians to use the same stagnant platforms year after year. We can centralize and consolidate the entire industry into 2 or 3 big players and have everyone else fail (possibly to heavy losses of users of those platforms). And when a flawed security model doesn't work and gets breached, we can make it even more complicated with even more people in suits making big money doing the job that blockchain was supposed to do in the first place. We can build a system which is so intertwined and dependent on big government, traditional finance, and central bankers that its future depends entirely on that of the fiat system, of fractional banking, and of government bail-outs. If we choose this path, as history has shown us over and over again, we cannot go back, save for revolution. Our children and grandchildren will still be paying the consequences of what we decided today.
Or, we can find solutions that work. We can maintain an open and innovative environment while making the adjustments we need to make to fully protect Canadian investors and cryptocurrency users, giving easy and affordable access to cryptocurrency for all Canadians on the platform of their choice, and creating an environment in which entrepreneurs and problem solvers can bring those solutions forward easily. None of the above precludes innovation in any way, or adds any unreasonable cost - and these three policies would demonstrably eliminate or resolve all 109 historic cases as studied here - that's every single case researched so far going back to 2011. It includes every loss that was studied so far not just in Canada but globally as well.
Unfortunately, finding answers is the least challenging part. Far more challenging is to get platform operators and regulators to agree on anything. My last post got no response whatsoever, and while the OSC has told me they're happy for industry feedback, I believe my opinion alone is fairly meaningless. This takes the whole community working together to solve. So please let me know your thoughts. Please take the time to upvote and share this with people. Please - let's get this solved and not leave it up to other people to do.

Thoughts?
submitted by azoundria2 to QuadrigaInitiative

How EpiK Protocol “Saved the Miners” from Filecoin with the E2P Storage Model?

On October 20, Eric Yao, Head of EpiK China, and Leo, Co-Founder & CTO of EpiK, visited the Deep Chain Online Salon and discussed "How EpiK saved the miners eliminated by Filecoin by launching the E2P storage model". The following is a transcript of the sharing.
Sharing Session
Eric: Hello, everyone, I'm Eric. I graduated from the School of Information Science at Tsinghua University. My Master's research was on data storage and big data computing, and I published a number of papers at top industry conferences.
Since 2013, I have invested in Bitcoin, Ethereum, Ripple, Dogecoin, EOS and other well-known blockchain projects, and have settled into the chain circle as an early technology-based investor and industry observer with 2 years of blockchain experience. I am also a blockchain community initiator and technology evangelist.
Leo: Hi, I'm Leo, the CTO of EpiK. Before I co-founded EpiK, I spent 3 to 4 years working on blockchain: public chains, wallets, browsers, decentralized exchanges, task distribution platforms, smart contracts, etc., and I've made some great products. EpiK is an answer to the question we've been asking for years about how blockchain should land in the real world, and we hope that EpiK is fortunate enough to be an answer for you as well.
Q & A
Deep Chain Finance:
First of all, let me ask Eric, on October 15, Filecoin’s main website launched, which aroused everyone’s attention, but at the same time, the calls for fork within Filecoin never stopped. The EpiK protocol is one of them. What I want to know is, what kind of project is EpiK Protocol? For what reason did you choose to fork in the first place? What are the differences between the forked project and Filecoin itself?
Eric:
First of all, let me answer the first question, what kind of project is EpiK Protocol.
With the Fourth Industrial Revolution already upon us, comprehensive intelligence is one of the core goals of this stage, and the key to comprehensive intelligence is how to make machines understand what humans know and learn new knowledge based on what they already know. Building the knowledge graph at scale is a key step towards full intelligence.
In order to solve the many challenges of building large-scale knowledge graphs, the EpiK Protocol was born. EpiK Protocol is a decentralized, hyper-scale knowledge graph that organizes and incentivizes knowledge through decentralized storage technology, decentralized autonomous organizations, and generalized economic models. Members of the global community will expand the horizons of artificial intelligence into a smarter future by organizing all areas of human knowledge into a knowledge map that will be shared and continuously updated as the eternal knowledge vault of humanity.
And then, for what reason was the fork chosen in the first place?
EpiK’s project founders are all senior blockchain industry practitioners and have been closely following the industry development and application scenarios, among which decentralized storage is a very fresh application scenario.
However, during Filecoin's development, the team found that, due to certain design mechanisms and historical reasons, Filecoin had deviated in some ways from the project's original intention: for example, the overly harsh penalty mechanism weakens security rather than strengthening it, and the computing power competition has led to a monopoly on computing power by large miners, who thereby monopolize packaging rights and can inflate their computing power by uploading useless data themselves.
These problems will cause the data environment on Filecoin to get worse and worse, leaving the on-chain data without real value and highly redundant, and making commercial landing of the project difficult.
In response to these problems, the EpiK team proposes to introduce multi-party roles and a decentralized collaboration platform (a DAO) to ensure the high value of on-chain data through a reasonable economic model and incentive mechanism, and to store that high-value data, the knowledge graph, on the blockchain through decentralized storage. This largely solves both the lack of value of on-chain data and the computing power monopoly of large miners.
Finally, what differences exist between the forked project and Filecoin itself?
Building on the issues above, EpiK's design is very different from Filecoin's. First of all, EpiK is more focused in terms of business model: it targets a different market and track from the cloud storage market where Filecoin sits, because decentralized storage has no advantage over professional centralized cloud storage in storage cost or user experience.
EpiK focuses on building a decentralized knowledge graph, which reduces data redundancy and safeguards the value of data in the distributed storage chain while preventing the knowledge graph from being tampered with by a few people, thus making the commercialization of the entire project reasonable and feasible.
From the perspective of ecosystem building, EpiK treats miners more fairly and solves Filecoin's pain points to a large extent. First, it replaces Filecoin's storage collateral and commitment collateral with a single one-time pledge.
Miners participating in EpiK Protocol only need to pledge 1,000 EPK per miner, and only once before mining begins, not for each sector.
To put 1,000 EPK in perspective: you only need to participate in pre-mining for about 50 days to earn the tokens used for the pledge. The EPK pre-mining campaign is currently underway, running from early September to December, with a daily release of 50,000 ERC-20 EPK. Pre-mining nodes whose applications are approved divide these tokens according to each day's mining ratio, and the tokens can be exchanged 1:1 for mainnet tokens once the mainnet launches. This will keep expanding the number of miners eligible to participate in EPK mining.
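As a back-of-the-envelope check on those figures, the sketch below computes how long a miner needs to pre-mine before it has earned the 1,000 EPK pledge. It assumes, purely for illustration, that a miner's share of the 50,000 EPK daily release equals its share of network mining power:

```python
# Back-of-the-envelope sketch of the EPK pre-mining math quoted above.
# Assumption (hypothetical): a miner's share of daily rewards equals
# its share of total network mining power.

DAILY_RELEASE_EPK = 50_000   # daily ERC-20 EPK released during pre-mining
PLEDGE_EPK = 1_000           # one-time pledge required per miner

def days_to_earn_pledge(power_share: float) -> float:
    """Days a miner needs to accumulate the 1,000 EPK pledge,
    given its fractional share of total network mining power."""
    daily_income = DAILY_RELEASE_EPK * power_share
    return PLEDGE_EPK / daily_income

# e.g. a miner holding a 0.04% share of network power:
print(f"{days_to_earn_pledge(0.0004):.0f} days")  # -> 50 days, matching the quote
```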
Secondly, EpiK has a more lenient penalty mechanism than Filecoin's official consensus, storage and contract penalties. Because data can only be uploaded by field experts, in the "Expert to Person" (E2P) mode, every piece of data is backed up by multiple miners, so one or more miners going offline has little impact on the network. A miner who fails to submit the proof of spacetime in time because it is offline only forfeits the effective computing power of that sector, not the pledged coins.
If the miner re-submits the proof of spacetime within 28 days, it regains that computing power.
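A minimal sketch of that penalty rule, assuming (hypothetically) that per-sector power simply switches off when a proof is missed and switches back on if the proof is re-submitted within the 28-day window:

```python
from datetime import date, timedelta

RESUBMIT_WINDOW = timedelta(days=28)  # grace period quoted in the interview

class Sector:
    """One sector; missing a proof forfeits its power, never the pledge."""
    def __init__(self, power: int):
        self.power = power          # effective computing power while online
        self.offline_since = None   # date the proof-of-spacetime was first missed

    def effective_power(self) -> int:
        return 0 if self.offline_since else self.power

    def miss_proof(self, today: date):
        # Going offline only zeroes this sector's effective power.
        if self.offline_since is None:
            self.offline_since = today

    def resubmit_proof(self, today: date) -> bool:
        # Re-submitting within 28 days restores the sector's power.
        if self.offline_since and today - self.offline_since <= RESUBMIT_WINDOW:
            self.offline_since = None
            return True             # power regained
        return False                # window expired; power stays forfeited
```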
Unlike Filecoin's 32 GB sectors, EpiK's sealed sectors are much smaller, only 8 MB each, which largely solves Filecoin's problem of wasted sector space and lets every miner complete sealing quickly. This is very friendly to miners with little computing power.
Constraints on data volume and quality also ensure that the gap in effective computing power between large and small miners stays small.
Finally, unlike Filecoin's P2P data-uploading model, EpiK switches data uploading and maintenance to the E2P model: field experts upload the data and vouch for its quality and value on-chain. A rational economic model also sets up a game between the data-storing roles and the data-generating roles, ensuring the stability of the whole system and a continuous, high-quality flow of data onto the chain.
Deep Chain Finance:
Eric, on the eve of Filecoin's mainnet launch, issues such as Filecoin's pre-collateral aroused a lot of controversy among miners. In your opinion, what impact will the mainnet launch have on Filecoin itself and on the whole distributed storage ecosystem? And do you think the current chaotic FIL price is reasonable? What should a normal FIL price be?
Eric:
Filecoin's mainnet has launched and many latent problems have been exposed, such as the aforementioned high pre-collateral, the wasted storage resources and computing-power monopoly caused by unreasonable sector sealing, and the harsh penalty mechanism. These problems are quite serious and will greatly affect the development of the Filecoin ecosystem.
Let me give two examples. First, the monopoly of computing power by big miners. Once big miners have monopolized computing power, a very delicate situation arises: when a miner stores a file for an ordinary user, there is no way to verify on-chain whether the stored data was uploaded by someone else or by the miner itself, because a miner can fake another identity and upload data for itself. So when a miner chooses which data to store, it has only one goal: to inflate its computing power as fast as possible.
In terms of computing power, there is no difference between storing other people's data and storing my own. But when I store a stranger's data, I know nothing about it, and the uploader may be somewhere in the world where the bandwidth between us is poor.
The best option is therefore to store my own local data, which is individually rational, but the result is that no one stores anyone else's data on the chain at all. Everyone stores only their own data, because that is the most economical choice, and the network ends up with essentially no storage utility: no one is providing storage for the mass of retail users.
The harsh penalty mechanism also severely erodes miners' profits. DDoS attacks are a very common technique for attackers, and a big miner can earn a very high profit in a short time by attacking other miners, so such attacks are profitable for every big miner.
As things stand, the vast majority of miners are not well maintained and so are poorly protected even against low-grade DDoS attacks. For them, the penalty regime is grim.
The contradiction between an unreasonable system and real demand will inevitably push the system to evolve in a more reasonable direction, so there will be many forked projects with more sensible mechanisms, attracting Filecoin miners and diverting storage power.
Since these projects all sit on the decentralized storage track, their requirements for miners are similar or even mutually compatible, so miners will gravitate toward the forks with better economics and business scenarios, filtering out the projects with real, deliverable value.
As for the chaotic FIL price: FIL is a project that has been years in the making and carries too many expectations, so one can only say that the current situation has its reasons. There is no way to predict a reasonable price for FIL, because in the long run it depends on whether the project can be commercialized and whether the on-chain data has real value. In other words, we need to keep watching whether Filecoin becomes a mere computing-power game or a real carrier of value.
Deep Chain Finance:
Leo, we just mentioned that Filecoin's pre-collateral caused dissatisfaction among miners, and after the mainnet launch the test coins from the second round of the Space Race were converted directly into real coins, with the Filecoin team selling FIL straight onto the market, so many miners said they were betrayed. What I want to know is: EpiK's motto is "save the miners eliminated by Filecoin", so how does EpiK deal with Filecoin's various problems, and how will it achieve this "rescue"?
Leo:
Filecoin's tacit approval of computing-power padding was effectively a declaration that the team had chosen to abandon small miners. And converting test coins into real coins hurt the interests of the loyal big miners in one stroke. We don't know why such basic mistakes were made; we can only regret them.
EpiK didn't fork Filecoin for its own sake. To build a shared knowledge-graph ecosystem, EpiK had to integrate decentralized storage, so it chose Filecoin's PoRep and PoSt, the most battle-hardened decentralized verification technology. To guarantee the quality of the knowledge-graph data, EpiK only allows community-voted field experts to upload data, so it naturally prevents miners from padding computing power, and worthless data has no reason to occupy such expensive decentralized storage.
With power padding impossible, and while the volume of knowledge-graph data is still small, the difference between big and small miners is minimal.
We can't claim to save the big miners, but we are definitely the best choice for the small miners currently being squeezed out by Filecoin.
Deep Chain Finance:
Let me ask Eric: according to the EpiK protocol, EpiK adopts the E2P model, in which only voted-in field experts may upload data. This is very different from Filecoin's P2P model, where individuals upload data as they wish. In your opinion, what are the advantages of the E2P model? And if only voted experts can upload data, does that mean the EpiK protocol is not open to everyone?
Eric:
First, let me explain the advantages of the E2P model over the P2P model.
There are five roles in the DAO ecosystem: miners, coin holders, field experts, bounty hunters and gateways. Once the mainnet launches, the EPK generated each day is allocated among these five roles.
Miners receive 75% of the EPK, field experts receive 9%, and voting users share 1%.
The remaining 15% fluctuates with the network's daily traffic, and that 15% is partly a game between miners and field experts.
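Putting those percentages together, here is a minimal sketch of one day's allocation. The absolute daily mainnet emission is a placeholder, since the interview does not state it:

```python
# Illustrative split of one day's EPK emission among the DAO roles,
# using the percentages quoted above.
DAILY_EMISSION = 100_000  # hypothetical daily mainnet EPK emission

shares = {
    "miners":        0.75,  # fixed
    "field_experts": 0.09,  # fixed
    "voters":        0.01,  # fixed
    "floating":      0.15,  # fluctuates with daily network traffic
}
assert abs(sum(shares.values()) - 1.0) < 1e-9  # shares must cover 100%

allocation = {role: DAILY_EMISSION * s for role, s in shares.items()}
print(allocation)
# {'miners': 75000.0, 'field_experts': 9000.0, 'voters': 1000.0, 'floating': 15000.0}
```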
Let me first describe the relationship between these two roles.
The first group of field experts is selected by the Foundation and covers different areas of knowledge (knowledge in a broad sense, including not only serious subjects but also home, food, travel, and so on). This group can recommend the next group of field experts, and a recommended expert only needs to receive 100,000 EPK in votes to become a field expert.
The field expert’s role is to submit high-quality data to the miner, who is responsible for encapsulating this data into blocks.
Network activity is judged by the amount of EPKs pledged by the entire network for daily traffic (1 EPK = 10 MB/day), with a higher percentage indicating higher data demand, which requires the miner to increase bandwidth quality.
If data demand decreases, field experts must instead provide higher-quality data. The analogy is a library: more visitors means more seats are needed, i.e., miners are paid to upgrade bandwidth; fewer visitors means more money should go into better books to attract them, i.e., into paying bounty hunters and field experts to produce more high-quality knowledge-graph data. This game between miners and field experts is the most important game in the ecosystem, unlike the game between the officials and big miners in the Filecoin ecosystem.
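A small sketch of that traffic gauge, using the 1 EPK = 10 MB/day rate from above; the 50% threshold for switching between the two payouts is purely an assumption for illustration:

```python
# Sketch of the traffic gauge described above: network activity is judged
# by how much EPK the whole network has pledged for daily traffic.
MB_PER_EPK_PER_DAY = 10  # 1 EPK = 10 MB/day, from the interview

def pledged_traffic_mb(pledged_epk: float) -> float:
    """Daily traffic allowance implied by the network's EPK pledge."""
    return pledged_epk * MB_PER_EPK_PER_DAY

def demand_signal(pledged_epk: float, circulating_epk: float) -> str:
    """Higher pledge ratio = higher data demand -> pay miners for bandwidth;
    lower ratio -> pay experts/bounty hunters for better data.
    The 0.5 cutoff is a made-up illustrative threshold."""
    ratio = pledged_epk / circulating_epk
    return "pay miners for bandwidth" if ratio > 0.5 else "pay experts for data"
```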
With this game between data producers and data storers and a more rational economic model, the E2P model will inevitably generate on-chain data of far higher quality than the P2P model, and the bandwidth for accessing that data will also be better, resulting in greater business value and better real-world applications.
Let me then answer whether this means the EpiK protocol is not open to everyone.
The E2P model only constrains the quality of the data that is generated and stored; it does not restrict who can play a role in the ecosystem. On the contrary, with the introduction of the DAO model, the EpiK ecosystem offers a variety of roles open to ordinary people (for example, anyone competent can take on bounty-hunter tasks), giving everyone a sensible way to participate in the system.
For example, someone with computing power can provide storage as a miner, someone with knowledge of a certain domain (history, technology, travel, comics, food, and so on) can apply to become a field expert, and someone willing to label and correct data can become a bounty hunter.
The project team also provides efficient support tools that lower the barriers to entry for each role, allowing different people to play their part in the system and jointly contribute to the continuous production of a high-quality decentralized knowledge graph.
Deep Chain Finance:
Leo, some time ago EpiK released a technical white paper and an economic white paper, explaining the EpiK concept from the perspectives of technology and economic model respectively. What I would like to ask is: what are the shortcomings of current distributed storage projects, and how will the EpiK protocol improve on them?
Leo:
Distributed storage is easily confused with systems like Ali's OceanDB, but in the blockchain field we should focus first on decentralized storage.
There is a big problem with the decentralized storage on the market today, which can be summed up by the idiom "why not eat meat porridge", that is, being out of touch with what users actually need.
How so? By its technical principles, decentralized storage cannot be cheaper than centralized storage; if it ever is, the centralized storage being compared must be rubbish.
So what incentive does the average user have to spend more money storing data on decentralized storage?
Is it safer?
Not really: miners on decentralized storage can shut down at any time, so it is by no means safer than keeping a copy each with Ali and Amazon.
More private?
Encrypted data stored on decentralized storage is no more private than encrypted data stored on Amazon.
Faster?
The bandwidth available to decentralized storage nodes simply doesn't compare to the fiber in a centralized server room. This is the root problem of the business model: no one uses it and no one buys it, so what good is the grand vision?
The goal of EpiK is to guide all community participants in jointly building and sharing domain knowledge-graph data, which is the best way for machines to understand human knowledge. The more knowledge-graph data there is, the more a robot knows, and its intelligence grows exponentially. In other words, EpiK uses decentralized storage technology to capture exponentially growing data value at linearly growing hardware cost, and that is where the buy-in for EPK comes from.
Organized data is worth far more than organized hard drives, and demand for EPK will follow as soon as robots need intelligence.
Deep Chain Finance:
Let me ask Leo: roughly how many forks of Filecoin are there so far? Do you think the waves of forks will grow or shrink now that the mainnet has launched? And have the requirements of miners at large changed when it comes to participation?
Leo:
We don't have specific statistics, but now that the mainnet has launched we expect forked projects to increase; there are many sidelined miners in the market who need to be organized efficiently.
However, most of the forks we currently see simply tweak the parameters of Filecoin's economic model, which is undesirable. Changes at that level cannot stop miners from padding computing power; they merely make some big miners more comfortable mining, and do nothing to help decentralized storage reach real-world adoption.
We need more reasonable real-world scenarios that turn idle mining resources into effective productivity, rather than pitching a 100x coin and stoking one wave of FOMO after another.
Deep Chain Finance:
How far along is the EpiK Protocol project, Eric? What other big moves are coming in the near future?
Eric:
The development of the EpiK Protocol is divided into five major phases:
Phase I: the "Obelisk" testnet.
Phase II: Mainnet 1.0, "Rosetta".
Phase III: Mainnet 2.0, "Hammurabi".
Phase IV: enriching the knowledge-graph toolkit.
Phase V: enriching the knowledge-graph application ecosystem.
We are currently in Phase I, the "Obelisk" testnet: anyone can sign up to join the testnet pre-mining and earn ERC-20 EPK tokens, which will be exchangeable one-to-one after the mainnet launches.
We recently launched ERC-20 EPK on Uniswap; you can trade it freely there or download our EpiK mobile wallet.
In addition, we will soon launch the EpiK Bounty platform, and we welcome all community members to take on tasks together and build the EpiK community. At the same time, we are pushing forward listings on centralized exchanges.
Users’ Questions
User 1:
Some KOLs say that Filecoin has already consumed the value of the next few years and will therefore plunge. What do you think?
Eric:
First of all, market judgments track cycles. Before concluding you are bearish on FIL, you should decide whether you are bearish on the project's economic model or on the distributed storage track itself.
We, for our part, are very confident in the distributed storage track. It will certainly go through phases of growth and decline, and that is precisely how better projects get selected.
Since the existing pool of miners and the computing power already deployed are fixed, and since EpiK miners and FIL miners are compatible, miners can at any time switch to the more promising and more economically rewarding projects.
As for the claim that Filecoin has consumed the next few years' value and will therefore plunge: a plunge is not something we would predict. In this industry you have to keep learning, iterating and judging value. Market sentiment is only one factor, and there are more important ones, such as the big washout in March this year. So we can only say that such events will slow the development of the FIL community; prices themselves are genuinely unpredictable.
User 2:
In the end, if there are no applications and no one actually uploads data, the market value will drop. So what are EpiK's real-world applications?
Leo: The best and most direct application of EpiK's knowledge graph is question-answering systems: an intelligent legal advisor, an intelligent medical advisor, an intelligent chef, an intelligent tour guide, an intelligent game-strategy assistant, and so on.
submitted by EpiK-Protocol to u/EpiK-Protocol [link] [comments]

NEAR PROJECT REPORT

Author: Gamals Ahmed, CoinEx Business Ambassador

ABSTRACT

The network effects of the web have allowed a handful of companies to capture a large number of users, and these companies hold on to users' data to keep them from seeking alternatives. Likewise, these huge platforms have attracted applications to build on their ecosystems, only to cut off access or actively oppose the applications' interests once they became too successful. As a result, these walled gardens have effectively hindered innovation and monopolized large sections of the web. With the emergence of blockchain technology and decentralized cryptocurrencies, the need for applications that support decentralization has grown, and several blockchain-based companies, applications and platforms now pursue it. In this research report, we explain the approach adopted by the NEAR decentralization platform in designing and implementing its core technology. NEAR is a community-managed platform for decentralized cloud computing and storage, designed to enable the open web of the future: a web on which everything from new currencies to new applications to new industries can be created, opening the door to an entirely new future.

1. INTRODUCTION

The richness of the web grows day by day through the combined efforts of millions of people who benefit from "innovation without permission": content and applications are created without asking anyone. Yet today's lack of data freedom has produced an environment hostile to the interests of its participants; as explained in the abstract, web hosting companies have hindered innovation and largely monopolized the web.
In the future we can fix this by using new technologies to re-enable the permissionless innovation of the past, creating a more open web where users are free and applications support rather than oppose their interests.
Decentralization gained momentum after the global financial crisis of 2008, which exposed fundamental problems of confidence in a heavily indebted banking system. Since 2009, a decentralized financial sector based on blockchain technology has emerged.
Decentralized blockchain technology has made it easy for decentralized digital currencies like Bitcoin to move billions of dollars in peer-to-peer transfers at a fraction of the cost of the traditional banking system. It allows participants in the over $50 billion virtual-goods economy to track, own and trade these goods without permission, and it lets real-world goods cross into the digital domain, with ownership verified and tracked just like digital goods.
An internet where freedom of data enables innovation will naturally lead to a new form of software development. On this web, developers can quickly create applications from open-state components and sustain their efforts with new business models enabled from within the program itself, rather than relying on parasitic relationships with their users. This not only accelerates the creation of applications that have a more honest and cooperative relationship with their users, but also allows entirely new businesses to be built on top of them.
Enabling these new applications and this open web requires the appropriate infrastructure. The new web platform cannot be controlled by a single entity, and its use must not be limited by insufficient scalability. It must be decentralized in design, like the web itself, and supported by a broad community of operators so that the value it stores cannot be monitored, modified or removed without permission from the users who store it there.
The cost of storing data or performing a computation on the Ethereum blockchain is thousands to millions of times higher than the cost of the same functionality on Amazon Web Services. Yet a developer can always build a "centralized" app, or even a centralized currency, for a fraction of the cost of doing the same on a decentralized platform, because a decentralized platform, by definition, replicates its operations and storage many times over.
Bitcoin can be thought of as the first, very basic, version of this global community-run cloud, though it is primarily used only to store and move the Bitcoin digital currency.
Ethereum is the second and slightly more sophisticated version, which expanded the basic principles of Bitcoin to create a more general computing and storage platform, though it is a raw technology, which hasn’t achieved meaningful mainstream adoption.

1.1 WHY IS IT IMPORTANT TO PAY THE EXTRA COST TO SUPPORT DECENTRALIZATION?

Because some elements of value, for example bits representing digital-currency ownership, personal identity, or asset notes, are very sensitive. In a centralized system, the following players can change the value of any balances they come into direct contact with:
  1. The developer who controls the release or update of the application’s code
  2. The platform where the data is stored
  3. The servers which run the application’s code
Even if none of these players intend to operate with bad faith, the actions of governments, police forces and hackers can easily turn their hands against their users and censor, modify or steal the balances they are supposed to protect.
A typical user will trust a typical centralized application, despite its potential vulnerabilities, with everyday data and computation. Typically, only banks and governments are trusted sufficiently to maintain custody of the most sensitive information — balances of wealth and identity. But these entities are also subject to the very human forces of hubris, corruption and theft.
This is especially true after the 2008 global financial crisis, which demonstrated the fundamental problems of confidence in a highly indebted banking system, and after governments around the world imposed significant capital controls on citizens during times of crisis. After such examples, it has become a truism that hackers now hold much or all of your sensitive data.
These decentralized applications operate on a more complex infrastructure than today’s web but they have access to an instantaneous and global pool of currency, value and information that today’s web, where data is stored in the silos of individual corporations, cannot provide.

1.2 THE CHALLENGES OF CREATING A DECENTRALIZED CLOUD

A community-run system like this faces very different challenges from centralized "cloud" infrastructure, which is run by a single entity or a group of known entities. For example:
  1. It must be both inclusive to anyone and secure from manipulation or capture.
  2. Participants must be fairly compensated for their work while avoiding creating incentives for negligent or malicious behavior.
  3. It must be both game theoretically secure so good actors find the right equilibrium and resistant to manipulation so bad actors are actively prevented from negatively affecting the system.

2. NEAR

NEAR is a global community-run computing and storage cloud which is organized to be permissionless and which is economically incentivized to create a strong and decentralized data layer for the new web.
Essentially, it is a platform for running applications which have access to a shared — and secure — pool of money, identity and data which is owned by their users. More technically, it combines the features of partition-resistant networking, serverless compute and distributed storage into a new kind of platform.
NEAR is thus a community-managed, decentralized cloud storage and computing platform, built on the same core blockchain technology that powers Bitcoin, and designed to enable the open web of the future. On this web, everything from new currencies to new applications to new industries can be created, opening the door to a brand new future.
NEAR is a scalable computing and storage platform with the potential to change how systems are designed, how applications are built and how the web itself works.
Under the hood it is complex technology, but it allows developers and entrepreneurs to easily and sustainably build applications that reap the benefits of decentralization and participate in the Open Web, while minimizing the associated costs for end users.
NEAR creates the only community-managed cloud that is strong enough to power the future of the open web, as NEAR is designed from the ground up to deliver intuitive experiences to
end users, expand capacity across millions of devices, and provide developers with new and sustainable business models for their applications.
The NEAR Platform uses a token — also called “NEAR”. This token allows the users of these cloud resources, regardless of where they are in the world, to fairly compensate the providers of the services and to ensure that these participants operate in good faith.

2.1 WHY NEAR?

Looking closely, we find that platforms based on blockchain technologies like Bitcoin and Ethereum have made great progress and enriched the world with thousands of innovative applications spanning from games to decentralized finance.
However, neither these original networks nor any that followed have been able to bridge the gap toward mainstream adoption of the applications built on top of them, and none provides the kind of standard that could fully support the web.
This is a result of two key factors:
  1. System design
  2. Organization design
System design is relevant because the technical architecture of other platforms creates substantial problems with both usability and scalability which have made adoption nearly impossible by any but the most technical innovators. End-users experience 97–99% dropoff rates when using applications and developers find the process of creating and maintaining their applications endlessly frustrating.
Fixing these problems requires substantial and complex changes to current protocol architectures, something which existing organizations haven’t proven capable of implementing. Instead, they create multi-year backlogs of specification design and implementation, which result in their technology falling further and further behind.
NEAR’s platform and organization are architected specifically to solve the above-mentioned problems. The technical design is fanatically focused on creating the world’s most usable and scalable decentralized platform so global-scale applications can achieve real adoption. The organization and governance structure are designed to rapidly ship and continuously evolve the protocol so it will never become obsolete.

2.1.1 Features that address these problems:

1. USABILITY FIRST
The most important problem that needs to be addressed is how to allow developers to create useful applications that users can use easily and that will capture the sustainable value of these developers.
2. End-User Usability
Developers will only build applications that their end users can actually use. NEAR's "progressive security" model allows developers to create experiences that more closely resemble familiar web experiences, by delaying onboarding, removing the need for users to learn "blockchain" concepts, and limiting the number of permission-asking interactions a user must go through to use the application.
1. Simple Onboarding: NEAR allows developers to take actions on behalf of their users, so they can onboard users without requiring them to set up a wallet or interact with tokens immediately upon reaching an application. Because accounts keep track of application-specific keys, user accounts can also be used for the kind of "Single Sign On" (SSO) functionality users are familiar with from the traditional web (e.g., "Login with Facebook/Google/GitHub/etc.").
2. Easy Subscriptions: Contract-based accounts allow for easy creation of subscriptions and custom permissioning for particular applications.
3. Familiar Usage Styles: The NEAR economic model allows developers to pay for usage on behalf of their users in order to hide the costs of infrastructure in a way that is in line with familiar web usage paradigms.
4. Predictable Pricing: NEAR prices transactions on the platform in simple terms, which allow end-users to experience predictable pricing and less cognitive load when using the platform.

2.1.2 Design principles and development NEAR’s platform

1. Usability: Applications deployed to the platform should be seamless to use for end users and seamless to create for developers. Wherever possible, the underlying technology itself should fade to the background or be hidden completely from end users. Wherever possible, developers should use familiar languages and patterns during the development process. Basic applications should be intuitive and simple to create while applications that are more robust should still be secure.
2. Scalability: The platform should scale with no upper limit as long as there is economic justification for doing so in order to support enterprise-grade, globally used applications.
3. Sustainable Decentralization: The platform should encourage significant decentralization in both the short term and the long term in order to properly secure the value it hosts. The platform — and community — should be widely and permissionlessly inclusive and actively encourage decentralization and participation. To maintain sustainability, both technological and community governance mechanisms should allow for practical iteration while avoiding capture by any single parties in the end.
4. Simplicity: The design of each of the system’s components should be as simple as possible in order to achieve their primary purpose. Optimize for simplicity, pragmatism and ease of understanding above theoretical perfection.

2.2 HOW NEAR WORKS?

NEAR's platform provides a community-operated cloud infrastructure for deploying and running decentralized applications. It combines the features of a decentralized database with those of a serverless compute platform, and the token that powers the platform enables the applications built on top of it to interact with each other in new ways. Together, these features allow developers to create censorship-resistant back-ends for applications that deal with high-stakes data like money, identity and assets, plus open-state components that interact seamlessly with each other. These application back-ends and components are called "smart contracts", though we will often refer to them all simply as "applications" here.
The infrastructure, which makes up this cloud, is created from a potentially infinite number of “nodes” run by individuals around the world who offer portions of their CPU and hard drive space — whether on their laptops or more professionally deployed servers. Developers write smart contracts and deploy them to this cloud as if they were deploying to a single server, which is a process that feels very similar to how applications are deployed to existing centralized clouds.
Once the developer has deployed an application, called a “smart contract”, and marked it unchangeable (“immutable”), the application will now run for as long as at least a handful of members of the NEAR community continue to exist. When end users interact with that deployed application, they will generally do so through a familiar web or mobile interface just like any one of a million apps today.
In the centralized clouds hosted by companies like Amazon or Google, developers pay each month based on usage, for example the number of requests generated by users visiting their web pages. The NEAR platform similarly requires users or developers to compensate the community operators of the infrastructure for their usage. Like today's cloud infrastructure, NEAR prices usage with easy-to-understand metrics that are not heavily influenced by factors like system congestion, factors which make alternative blockchain-based systems so complicated for developers today.
In a centralized cloud, the controlling corporation makes decisions unilaterally. NEAR's community-run cloud is decentralized, so updates must ultimately be accepted by a sufficient quorum of network participants. Proposals for its future come from the community and pass through an inclusive governance process that balances efficiency and security.
In order to ensure that the operators of nodes — who are anonymous and potentially even malicious — run the code with good behavior, they participate in a staking process called “Proof of Stake”. In this process, they willingly put a portion of value at risk as a sort of deposit, which they will forfeit if it is proven that they have operated improperly.

2.2.1 Elements of the NEAR’s Platform

The NEAR platform is made up of many separate elements. Some of these are native to the platform itself while others are used in conjunction with or on top of it.
1. THE NEAR TOKEN
The NEAR token is the fundamental native asset of the NEAR ecosystem, and its functionality is enabled for all accounts. The token is a digital asset similar to Ether, which can be used to:
a) Pay the system for processing transactions and storing data.
b) Run a validating node as part of the network by participating in the staking process.
c) Help determine how network resources are allocated and where its future technical direction will go by participating in governance processes.
The NEAR token enables the economic coordination of all participants who operate the network plus it enables new behaviors among the applications which are built on top of that network.
2. OTHER DIGITAL ASSETS
The platform is designed to easily store unique digital assets, which may include, but aren’t limited to:
  • Other Tokens: Tokens bridged from other chains (“wrapped”) or created atop the NEAR Platform can be easily stored and moved using the underlying platform. This allows many kinds of tokens to be used atop the platform to pay for goods and services. “Stablecoins,” specific kinds of token which are designed to match the price of another asset (like the US Dollar), are particularly useful for transacting on the network in this way.
  • Unique Digital Assets: Similar to tokens, digital assets (sometimes called "Non-Fungible Tokens" (NFTs)), ranging from in-game collectibles to representations of real-world asset ownership, can be stored and moved using the platform.
3. THE NEAR PLATFORM
The core platform, which is made up of the cloud of community-operated nodes, is the most basic piece of infrastructure provided. Developers can permissionlessly deploy smart contracts to this cloud and users can permissionlessly use the applications they power. Applications, which could range from consumer-facing games to digital currencies, can store their state (data) securely on the platform. This is conceptually similar to the Ethereum platform.
Operations that use accounts, the network, or storage on the platform require payment in the form of transaction fees, which the platform then distributes to its community of validating nodes. These operations include creating new accounts, deploying new contracts, executing contract code, and storing or modifying data in contracts.
As long as the rules of the protocol are followed, any independent developer can write software, which interfaces with it (for example, by submitting transactions, creating accounts or even running a new node client) without asking for anyone’s permission first.
4. THE NEAR DEVELOPMENT SUITE
Set of tools and reference implementations created to facilitate its use by those developers and end users who prefer them. These tools include:
  • NEAR SDKs: The NEAR platform supports Rust and AssemblyScript for writing smart contracts. To provide a great developer experience, NEAR ships a full SDK for both languages, including standard data structures, examples and testing tools.
  • Gitpod for NEAR: NEAR uses existing technology Gitpod to create zero time onboarding experience for developers. Gitpod provides an online “Integrated Development Environment” (IDE), which NEAR customized to allow developers to easily write, test and deploy smart contracts from a web browser.
  • NEAR Wallet: A wallet is a basic place for developers and end users to store the assets they need to use the network. NEAR Wallet is a reference implementation that is intended to work seamlessly with the progressive security model that lets application developers design more effective user experiences. It will eventually include built-in functionality to easily enable participation by holders in staking and governance processes on the network.
  • NEAR Explorer: To aid with both debugging of contracts and the understanding of network performance, Explorer presents information from the blockchain in an easily digestible web-based format.
  • NEAR Command Line Tools: The NEAR team provides a set of straightforward command line tools to allow developers to easily create, test and deploy applications from their local environments.
All of these tools are being created in an open-source manner so they can be modified or deployed by anyone.

3. ECONOMICS

Economic forces primarily drive the ecosystem that makes up the NEAR platform. This economy creates the incentives that allow participants to permissionlessly organize to drive the platform's key functions, while creating strong disincentives for undesirable, irresponsible or malicious behavior. For the platform to be effective, these incentives need to exist in both the short term and the long term.
The NEAR platform is a market among participants interested in two aspects:
  • On the supply side, operators of validating nodes and other core infrastructure must be motivated to provide the services that make up the community cloud.
  • On the demand side, developers and end users of the platform need to be able to pay for their usage in a simple, clear and consistent way.
Further, economic forces can also be applied to support the ecosystem as a whole. They can be used at a micro level to create new business models by directly compensating the developers who create its most useful applications. They can also be used at a macro level by coordinating the efforts of a broader set of ecosystem participants who participate in everything from education to governance.

3.1 NEAR ECONOMY DESIGN PRINCIPLES

NEAR’s overall system design principles are used to inform its economic design according to the following interpretations:
1. Usability: End users and developers should have predictable and consistent pricing for their usage of the network. Users should never lose data forever.
2. Scalability: The platform should scale at economically justified thresholds.
3. Simplicity: The design of each of the system’s components should be as simple as possible in order to achieve their primary purpose.
4. Sustainable Decentralization: The barrier for participation in the platform as a validating node should be set as low as possible in order to bring a wide range of participants. Over time, their participation should not drive wealth and control into the hands of a small number. Individual transactions made far in the future must be at least as secure as those made today in order to safeguard the value they modify.

3.2 ECONOMIC OVERVIEW

The NEAR economy is optimized to provide developers and end users with the easiest possible experience while still providing proper incentives for network security and ecosystem development.
Summary of the key ideas that drive the system:
  • Thresholded Proof of Stake: Validating node operators provide scarce and valuable compute resources to the network. In order to ensure that the computations they run are correct, they are required to “stake” NEAR tokens, which guarantee their results. If these results are found to be inaccurate, the staker loses their tokens. This is a fundamental mechanism for securing the network. The threshold for participating in the system is set algorithmically at the lowest level possible to allow for the broadest possible participation of validating nodes in a given “epoch” period (½ of a day).
  • Epoch Rewards: Node operators are paid for their service a fixed percentage of total supply as a “security” fee of roughly 4.5% annualized. This rate targets sufficient participation levels among stakers in order to secure the network while balancing with other usage of NEAR token in the ecosystem.
  • Protocol treasury: In addition to validators, protocol treasury received a 0.5% of total supply annually to continuously re-invest into ecosystem development.
  • Transaction Costs: Usage of the network consumes two separate kinds of resources — instantaneous and long term. Instantaneous costs are generated by every transaction because each transaction requires the usage of both the network itself and some of its computation resources. These are priced together as a mostly-predictable cost per transaction, which is paid in NEAR tokens.
  • Storage Costs: Storage is a long term cost because storing data represents an ongoing burden to the nodes of the network. Storage costs are covered by maintaining minimum balance of NEAR tokens on the account or contract. This provides indirect mechanism of payment via inflation to validators for maintaining contract and account state on their nodes.
  • Inflation: Inflation is determined as the combination of payouts to validators and the protocol treasury, minus collected transaction fees and a few other NEAR-burning mechanics (like the name auction). Overall, maximum inflation is 5%, which can fall over time as the network gets more usage and more transaction fees are burned. Inflation can even turn negative (total supply decreases) if enough fees are burned. A worked sketch of this arithmetic follows this list.
  • Scaling Thresholds: In a network, which scales its capacity relative to the amount of usage it receives, the thresholds, which drive the network to bring on additional capacity are economic in nature.
  • Security Thresholds: Some thresholds, which provide for good behavior among participants are set using economic incentives. For example, “Fishermen” (described separately).
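Since several of the bullets above are quantitative, here is a minimal worked sketch of the issuance arithmetic. The 4.5% validator rate and 0.5% treasury rate come from the text; the total-supply and fee figures are hypothetical placeholders, and real NEAR issuance is computed per epoch rather than annually:

```python
# Worked sketch of the NEAR issuance arithmetic described above.
TOTAL_SUPPLY   = 1_000_000_000  # hypothetical NEAR supply (placeholder)
VALIDATOR_RATE = 0.045          # ~4.5% annualized "security" fee to validators
TREASURY_RATE  = 0.005          # 0.5% annualized to the protocol treasury

def annual_net_inflation(fees_burned: float) -> float:
    """Gross issuance is capped at 5%/yr; burned transaction fees offset it,
    and can push net inflation negative when usage is high enough."""
    gross = TOTAL_SUPPLY * (VALIDATOR_RATE + TREASURY_RATE)  # 5% of supply
    return (gross - fees_burned) / TOTAL_SUPPLY

print(f"{annual_net_inflation(fees_burned=10_000_000):+.2%}")  # +4.00%
print(f"{annual_net_inflation(fees_burned=60_000_000):+.2%}")  # -1.00% (deflation)
```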
Full Report
submitted by CoinEx_Institution to Coinex [link] [comments]

Scaling Reddit Community Points with Arbitrum Rollup: a piece of cake

Submitted for consideration to The Great Reddit Scaling Bake-Off
Baked by the pastry chefs at Offchain Labs
Please send questions or comments to [[email protected] ](mailto:[email protected])
1. Overview
We're excited to submit Arbitrum Rollup for consideration to The Great Reddit Scaling Bake-Off. Arbitrum Rollup is the only Ethereum scaling solution that supports arbitrary smart contracts without compromising on Ethereum's security or adding points of centralization. For Reddit, this means that Arbitrum can not only scale the minting and transfer of Community Points, but it can foster a creative ecosystem built around Reddit Community Points enabling points to be used in a wide variety of third party applications. That's right -- you can have your cake and eat it too!
Arbitrum Rollup isn't just Ethereum-style. Its Layer 2 transactions are byte-for-byte identical to Ethereum, which means Ethereum users can continue to use their existing addresses and wallets, and Ethereum developers can continue to use their favorite toolchains and development environments out-of-the-box with Arbitrum. Coupling Arbitrum’s tooling-compatibility with its trustless asset interoperability, Reddit not only can scale but can onboard the entire Ethereum community at no cost by giving them the same experience they already know and love (well, certainly know).
To benchmark how Arbitrum can scale Reddit Community Points, we launched the Reddit contracts on an Arbitrum Rollup chain. Since Arbitrum provides full Solidity support, we didn't have to rewrite the Reddit contracts or try to mimic their functionality using an unfamiliar paradigm. Nope, none of that. We launched the Reddit contracts unmodified on Arbitrum Rollup complete with support for minting and distributing points. Like every Arbitrum Rollup chain, the chain included a bridge interface in which users can transfer Community Points or any other asset between the L1 and L2 chains. Arbitrum Rollup chains also support dynamic contract loading, which would allow third-party developers to launch custom ecosystem apps that integrate with Community Points on the very same chain that runs the Reddit contracts.
1.1 Why Ethereum
Perhaps the most exciting benefit of distributing Community Points using a blockchain is the ability to seamlessly port points to other applications and use them in a wide variety of contexts. Applications may include simple transfers such as a restaurant that allows Redditors to spend points on drinks. Or it may include complex smart contracts -- such as placing Community Points as a wager for a multiparty game or as collateral in a financial contract.
The common denominator between all of the fun uses of Reddit points is that it needs a thriving ecosystem of both users and developers, and the Ethereum blockchain is perhaps the only smart contract platform with significant adoption today. While many Layer 1 blockchains boast lower cost or higher throughput than the Ethereum blockchain, more often than not, these attributes mask the reality of little usage, weaker security, or both.
Perhaps another platform with significant usage will rise in the future. But today, Ethereum captures the mindshare of the blockchain community, and for Community Points to provide the most utility, the Ethereum blockchain is the natural choice.
1.2 Why Arbitrum
While Ethereum's ecosystem is unmatched, the reality is that fees are high and capacity is too low to support the scale of Reddit Community Points. Enter Arbitrum. Arbitrum Rollup provides all of the ecosystem benefits of Ethereum, but with orders of magnitude more capacity and at a fraction of the cost of native Ethereum smart contracts. And most of all, we don't change the experience from users. They continue to use the same wallets, addresses, languages, and tools.
Arbitrum Rollup is not the only solution that can scale payments, but it is the only developed solution that can scale both payments and arbitrary smart contracts trustlessly, which means that third party users can build highly scalable add-on apps that can be used without withdrawing money from the Rollup chain. If you believe that Reddit users will want to use their Community Points in smart contracts--and we believe they will--then it makes the most sense to choose a single scaling solution that can support the entire ecosystem, eliminating friction for users.
We view being able to run smart contracts in the same scaling solution as fundamentally critical since if there's significant demand in running smart contracts from Reddit's ecosystem, this would be a load on Ethereum and would itself require a scaling solution. Moreover, having different scaling solutions for the minting/distribution/spending of points and for third party apps would be burdensome for users as they'd have to constantly shuffle their Points back and forth.
2. Arbitrum at a glance
Arbitrum Rollup has a unique value proposition as it offers a combination of features that no other scaling solution achieves. Here we highlight its core attributes.
Decentralized. Arbitrum Rollup is as decentralized as Ethereum. Unlike some other Layer 2 scaling projects, Arbitrum Rollup doesn't have any centralized components or centralized operators who can censor users or delay transactions. Even in non-custodial systems, centralized components provide a risk as the operators are generally incentivized to increase their profit by extracting rent from users often in ways that severely degrade user experience. Even if centralized operators are altruistic, centralized components are subject to hacking, coercion, and potential liability.
Massive Scaling. Arbitrum achieves order of magnitude scaling over Ethereum's L1 smart contracts. Our software currently supports 453 transactions-per-second for basic transactions (at 1616 Ethereum gas per tx). We have a lot of room left to optimize (e.g. aggregating signatures), and over the next several months capacity will increase significantly. As described in detail below, Arbitrum can easily support and surpass Reddit's anticipated initial load, and its capacity will continue to improve as Reddit's capacity needs grow.
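As a quick sanity check on those numbers, the sketch below converts the quoted 453 tx/s at 1,616 gas per transaction into an equivalent L1 gas rate and compares it against a plain ETH transfer at the standard 21,000 gas (a general Ethereum figure, not taken from this document):

```python
# Quick arithmetic on the throughput claim above.
ARB_TPS         = 453      # Arbitrum transactions per second (quoted)
ARB_GAS_PER_TX  = 1_616    # Ethereum gas per basic Arbitrum tx (quoted)
L1_TRANSFER_GAS = 21_000   # gas for a plain L1 ETH transfer (standard figure)

gas_per_second = ARB_TPS * ARB_GAS_PER_TX          # 732,048 gas/s equivalent
l1_equiv_tps   = gas_per_second / L1_TRANSFER_GAS  # ~34.9 plain transfers/s

print(gas_per_second, round(l1_equiv_tps, 1))
# Each Arbitrum transaction consumes ~13x less gas than a basic L1 transfer.
```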
Low cost. The cost of running Arbitrum Rollup is quite low compared to L1 Ethereum and other scaling solutions such as those based on zero-knowledge proofs. Layer 2 fees are low, fixed, and predictable and should not be overly burdensome for Reddit to cover. Nobody needs to use special equipment or high-end machines. Arbitrum requires validators, which is a permissionless role that can be run on any reasonable on-line machine. Although anybody can act as a validator, in order to protect against a “tragedy of the commons” and make sure reputable validators are participating, we support a notion of “invited validators” that are compensated for their costs. In general, users pay (low) fees to cover the invited validators’ costs, but we imagine that Reddit may cover this cost for its users. See more on the costs and validator options below.
Ethereum Developer Experience. Not only does Arbitrum support EVM smart contracts, but the developer experience is identical to that of L1 Ethereum contracts and fully compatible with Ethereum tooling. Developers can port existing Solidity apps or write new ones using their favorite and familiar toolchains (e.g. Truffle, Buidler). There are no new languages or coding paradigms to learn.
Ethereum wallet compatibility. Just as in Ethereum, Arbitrum users need only hold keys, but do not have to store any coin history or additional data to protect or access their funds. Since Arbitrum transactions are semantically identical to Ethereum L1 transactions, existing Ethereum users can use their existing Ethereum keys with their existing wallet software such as Metamask.
Token interoperability. Users can easily transfer their ETH, ERC-20 and ERC-721 tokens between Ethereum and the Arbitrum Rollup chain. As we explain in detail below, it is possible to mint tokens in L2 that can subsequently be withdrawn and recognized by the L1 token contract.
Fast finality. Transactions complete with the same finality time as Ethereum L1 (and it's possible to get faster finality guarantees by trading away trust assumptions; see the Arbitrum Rollup whitepaper for details).
Non-custodial. Arbitrum Rollup is a non-custodial scaling solution, so users control their funds/points and neither Reddit nor anyone else can ever access or revoke points held by users.
Censorship Resistant. Since it's completely decentralized, and the Arbitrum protocol guarantees progress trustlessly, Arbitrum Rollup is just as censorship-proof as Ethereum.
Block explorer. The Arbitrum Rollup block explorer allows users to view and analyze transactions on the Rollup chain.
Limitations
Although this is a bake-off, we're not going to sugar coat anything. Arbitrum Rollup, like any Optimistic Rollup protocol, does have one limitation, and that's the delay on withdrawals.
As for the concrete length of the delay, we've done a good deal of internal modeling and have blogged about this as well. Our current modeling suggests a 3-hour delay is sufficient (but as discussed in the linked post there is a tradeoff space between the length of the challenge period and the size of the validators’ deposit).
Note that this doesn't mean that the chain is delayed for three hours. Arbitrum Rollup supports pipelining of execution, which means that validators can keep building new states even while previous ones are “in the pipeline” for confirmation. As the challenge delays expire for each update, a new state will be confirmed (read more about this here).
So activity and progress on the chain are not delayed by the challenge period. The only thing that's delayed is the consummation of withdrawals. Recall though that any single honest validator knows immediately (at the speed of L1 finality) which state updates are correct and can guarantee that they will eventually be confirmed, so once a valid withdrawal has been requested on-chain, every honest party knows that the withdrawal will definitely happen. There's a natural place here for a liquidity market in which a validator (or someone who trusts a validator) can provide withdrawal loans for a small interest fee. This is a no-risk business for them as they know which withdrawals will be confirmed (and can force their confirmation trustlessly no matter what anyone else does) but are just waiting for on-chain finality.
3. The recipe: How Arbitrum Rollup works
For a description of the technical components of Arbitrum Rollup and how they interact to create a highly scalable protocol with a developer experience that is identical to Ethereum, please refer to the following documents:
Arbitrum Rollup Whitepaper
Arbitrum academic paper (describes a previous version of Arbitrum)
4. Developer docs and APIs
For full details about how to set up and interact with an Arbitrum Rollup chain or validator, please refer to our developer docs, which can be found at https://developer.offchainlabs.com/.
Note that the Arbitrum version described on that site is older and will soon be replaced by the version we are entering in Reddit Bake-Off, which is still undergoing internal testing before public release.
5. Who are the validators?
As with any Layer 2 protocol, advancing the protocol correctly requires at least one honest and available validator (validators are sometimes called block producers). A natural question is: who are the validators?
Recall that the validator set for an Arbitrum chain is open and permissionless; anyone can start or stop validating at will. (A useful analogy is to full nodes on an L1 chain.) But we understand that even though anyone can participate, Reddit may want to guarantee that highly reputable nodes are validating their chain. Reddit may choose to validate the chain themselves and/or hire third-party validators. To this end, we have begun building a marketplace for validator-for-hire services so that dapp developers can outsource validation to reputable nodes with high uptime. We've announced a partnership in which Chainlink nodes will provide Arbitrum validation services, and we expect to announce more partnerships shortly with other blockchain infrastructure providers.
Although there is no requirement that validators are paid, Arbitrum’s economic model tracks validators’ costs (e.g. amount of computation and storage) and can charge small fees on user transactions, using a gas-type system, to cover those costs. Alternatively, a single party such as Reddit can agree to cover the costs of invited validators.
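A hypothetical sketch of how such a cost-covering fee might be set (all numbers are made-up placeholders, not Arbitrum's actual fee schedule):

```python
# Toy fee calculation: set a per-transaction fee that covers measured
# validator costs. All inputs are illustrative assumptions.
def fee_per_tx(cpu_cost_per_day: float, storage_cost_per_day: float,
               tx_per_day: int, margin: float = 0.0) -> float:
    daily_cost = cpu_cost_per_day + storage_cost_per_day
    return daily_cost * (1 + margin) / tx_per_day

# e.g. $2/day of compute + $1/day of storage spread over 60,000 tx/day
print(f"${fee_per_tx(2.0, 1.0, 60_000):.6f} per transaction")  # $0.000050
```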
6. Reddit Contract Support
Since Arbitrum contracts and transactions are byte-for-byte compatible with Ethereum, supporting the Reddit contracts is as simple as launching them on an Arbitrum chain.
Minting. Arbitrum Rollup supports hybrid L1/L2 tokens which can be minted in L2 and then withdrawn onto the L1. An L1 contract at address A can make a special call to the EthBridge which deploys a "buddy contract" to the same address A on an Arbitrum chain. Since it's deployed at the same address, users can know that the L2 contract is the authorized "buddy" of the L1 contract on the Arbitrum chain.
For minting, the L1 contract is a standard ERC-20 contract which mints and burns tokens when requested by the L2 contract. It is paired with an ERC-20 contract in L2 which mints tokens based on whatever programmer-provided minting facility is desired and burns tokens when they are withdrawn from the rollup chain. Given this base infrastructure, Arbitrum can support any smart-contract-based method for minting tokens in L2, and indeed we directly support Reddit's signature/claim-based minting in L2.
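To make the flow concrete, here is a toy Python model of the hybrid-token lifecycle sketched above. It mirrors the shape of the protocol (same-address buddy, mint in L2, burn-and-withdraw to L1), but the class and method names are illustrative inventions, not the EthBridge API:

```python
# Toy model of the hybrid L1/L2 token flow described above.
# This mirrors the *shape* of the protocol, not the real EthBridge API.
class L2Token:
    def __init__(self, l1_address: str):
        self.l1_address = l1_address      # the buddy lives at the SAME address
        self.balances: dict[str, int] = {}

    def mint(self, user: str, amount: int):
        """Any developer-provided minting facility (e.g. a signature/claim
        scheme) ultimately calls something like this in L2."""
        self.balances[user] = self.balances.get(user, 0) + amount

    def withdraw(self, user: str, amount: int) -> tuple[str, str, int]:
        """Burn in L2 and emit a message for the L1 buddy to mint."""
        assert self.balances.get(user, 0) >= amount, "insufficient balance"
        self.balances[user] -= amount
        return (self.l1_address, user, amount)   # message destined for L1

class L1Token:
    def __init__(self, address: str):
        self.address = address
        self.balances: dict[str, int] = {}

    def handle_withdrawal(self, msg_from: str, user: str, amount: int):
        """Mint on L1 only if the message comes from the buddy, i.e. the
        L2 contract deployed at our own address."""
        assert msg_from == self.address, "not from our L2 buddy"
        self.balances[user] = self.balances.get(user, 0) + amount

l1 = L1Token(address="0xA")
l2 = L2Token(l1_address=l1.address)       # buddy deployed at the same address
l2.mint("alice", 100)                     # minted cheaply in L2
l1.handle_withdrawal(*l2.withdraw("alice", 40))
print(l2.balances["alice"], l1.balances["alice"])  # 60 40
```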
Batch minting. What's better than a mint cookie? A whole batch! In addition to supporting Reddit’s current minting/claiming scheme, we built a second minting design, which we believe outperforms the signature/claim system in many scenarios.
In the current system, Reddit periodically issues signed statements to users, who then take those statements to the blockchain to claim their tokens. An alternative approach would have Reddit directly submit the list of users/amounts to the blockchain and distribute the tokens to the users without the signature/claim process.
To optimize the cost efficiency of this approach, we designed an application-specific compression scheme to minimize the size of the batch distribution list. We analyzed the data from Reddit's previous distributions and found that the data is highly compressible since token amounts are small and repeated, and addresses appear multiple times. Our function groups transactions by size, and replaces previously-seen addresses with a shorter index value. We wrote client code to compress the data, wrote a Solidity decompressing function, and integrated that function into Reddit’s contract running on Arbitrum.
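The following is a minimal sketch of this style of compression. The encoding widths are illustrative stand-ins rather than the deployed format, but it shows why repeated amounts and addresses compress so well:

```python
# Minimal sketch of the batch-mint compression idea: group events by
# amount, and replace previously-seen addresses with a short index.
# Encoding widths here are illustrative, not the deployed format.
from collections import defaultdict

ADDR_BYTES, INDEX_BYTES, AMOUNT_BYTES, COUNT_BYTES = 20, 3, 4, 3

def compressed_size(events: list[tuple[str, int]]) -> int:
    """events: (address, amount) pairs. Returns encoded size in bytes."""
    groups = defaultdict(list)
    for addr, amount in events:
        groups[amount].append(addr)

    seen: dict[str, int] = {}          # address -> index of first appearance
    total = 0
    for amount, addrs in groups.items():
        total += AMOUNT_BYTES + COUNT_BYTES     # one header per amount group
        for addr in addrs:
            if addr in seen:
                total += INDEX_BYTES            # back-reference to earlier use
            else:
                seen[addr] = len(seen)
                total += ADDR_BYTES             # first sighting: full address
    return total

# A toy distribution: many users receiving a handful of repeated amounts.
events = [(f"0x{u:040x}", amt) for u in range(1000) for amt in (5, 5, 25)]
naive = len(events) * (ADDR_BYTES + AMOUNT_BYTES)
print(naive, compressed_size(events), compressed_size(events) / len(events))
# 72000 bytes naive vs ~26000 compressed, i.e. under 9 bytes per event.
```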
When we ran the compression function on the previous Reddit distribution data, we found that we could compress batched minting data down to 11.8 bytes per minting event (averaged over a 6-month trace of Reddit's historical token grants), compared with roughly 174 bytes of on-chain data needed for the signature/claim approach to minting (roughly 43 for an RLP-encoded null transaction + 65 for Reddit's signature + 65 for the user's signature + roughly 8 for the number of Points).
The relative benefit of the two approaches with respect to on-chain calldata cost depends on the percentage of users who will actually claim their tokens on chain. With the above figures, batch minting will be cheaper whenever more than roughly 5% of users would otherwise redeem their claims. We stress that our compression scheme is not Arbitrum-specific and would be beneficial in any general-purpose smart contract platform.
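A back-of-the-envelope version of this comparison, using raw byte counts (the exact breakeven shifts once per-batch overhead and the gas discount on zero calldata bytes are accounted for, which is why the figure above is quoted as "roughly 5%"):

```python
# Back-of-envelope breakeven between batch minting and signature claims,
# comparing raw calldata bytes; gas pricing of zero vs. non-zero bytes
# and per-batch overhead shift this somewhat.
BATCH_BYTES_PER_USER = 11.8     # measured average from the 6-month trace
CLAIM_BYTES_PER_CLAIM = 174.0   # measured cost of one signature claim

def batch_cheaper(claim_fraction: float) -> bool:
    """Batch minting pays for *every* eligible user; claims pay only for
    the fraction of users who actually redeem on chain."""
    return BATCH_BYTES_PER_USER < claim_fraction * CLAIM_BYTES_PER_CLAIM

breakeven = BATCH_BYTES_PER_USER / CLAIM_BYTES_PER_CLAIM
print(f"byte-count breakeven: {breakeven:.1%}")   # ~6.8% of users claiming
print(batch_cheaper(0.05), batch_cheaper(0.10))   # False True
```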
7. Benchmarks and costs
In this section, we give the full costs of operating the Reddit contracts on an Arbitrum Rollup chain, including the L1 gas costs for the Rollup chain, the costs of computation and storage for the L2 validators, and the capital lockup requirements for staking.
Arbitrum Rollup is still on testnet, so we did not run mainnet benchmarks. Instead, we measured the L1 gas cost and L2 workload for Reddit operations on Arbitrum and calculated the total cost assuming current Ethereum gas prices. As noted below in detail, our measurements do not assume that Arbitrum is consuming the entire capacity of Ethereum. We will present the details of our model now, but for full transparency you can also play around with it yourself and adjust the parameters by copying the spreadsheet found here.
Our cost model is based on measurements of Reddit’s contracts, running unmodified (except for the addition of a batch minting function) on Arbitrum Rollup on top of Ethereum.
On the distribution of transactions and frequency of assertions. Reddit's instructions specify the following minimum parameters that submissions should support:
Over a 5 day period, your scaling PoC should be able to handle:
  • 100,000 point claims (minting & distributing points)
  • 25,000 subscriptions
  • 75,000 one-off points burning
  • 100,000 transfers
We provide the full costs of operating an Arbitrum Rollup chain with this usage under the assumption that tokens are minted or granted to users in batches, but other transactions are uniformly distributed over the 5 day period. Unlike some other submissions, we do not make unrealistic assumptions that all operations can be submitted in enormous batches. We assume that batch minting is done in batches that use only a few percent of an L1 block’s gas, and that other operations come in evenly over time and are submitted in batches, with one batch every five minutes to keep latency reasonable. (Users are probably already waiting for L1 finality, which takes at least that long to achieve.)
We note that assuming only 300,000 transactions arriving uniformly over the 5 day period makes our benchmark numbers look worse (i.e., higher cost per transaction), but we believe this reflects the true cost of running the system. To see why, say that batches are submitted every five minutes (20 L1 blocks) and there's a fixed overhead of c bytes of calldata per batch, the cost of which gets amortized over all transactions executed in that batch. Assume that each individual transaction adds a marginal cost of t. Lastly, assume the capacity of the scaling system is high enough that it can support all of Reddit's 300,000 transactions within a single 20-block batch (i.e. that there are more than c + 300,000*t bytes of calldata available in 20 blocks).
Consider what happens if c, the per-batch overhead, is large (which it is in some systems, but not in Arbitrum). In the scenario where transactions actually arrive at the system's capacity and each batch is full, c gets amortized over 300,000 transactions. But if we assume that the system is not running at capacity, and only receives 300,000 transactions arriving uniformly over 5 days, then each 20-block assertion will contain only about 200 transactions, and thus each transaction will pay a nontrivial cost due to c.
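A quick worked example of this amortization effect, with a purely illustrative overhead of c = 10,000 bytes:

```python
# Worked example of the amortization argument above, with c the per-batch
# calldata overhead in bytes.
BATCH_INTERVAL_MIN = 5
PERIOD_DAYS = 5
TOTAL_TX = 300_000

batches = PERIOD_DAYS * 24 * 60 // BATCH_INTERVAL_MIN   # 1,440 batches
tx_per_batch = TOTAL_TX / batches                       # ~208 tx per batch

def overhead_per_tx(c_bytes: float, txs_in_batch: float) -> float:
    return c_bytes / txs_in_batch

# With a large per-batch overhead, say c = 10,000 bytes (illustrative):
print(overhead_per_tx(10_000, TOTAL_TX))      # ~0.03 bytes/tx, one mega-batch
print(overhead_per_tx(10_000, tx_per_batch))  # ~48 bytes/tx, uniform arrival
# Under uniform arrival the per-transaction overhead is ~1,440x larger
# than the single-mega-batch assumption would suggest.
```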
We are aware that other proposals presented scaling numbers assuming that 300,000 transactions arrived at maximum capacity and were executed in a single mega-transaction, but according to our estimates, for at least one such report, this led to a reported gas price that was 2-3 orders of magnitude lower than it would have been assuming uniform arrival. We make more realistic batching assumptions, and we believe Arbitrum compares well when batch sizes are realistic.
Our model. Our cost model includes several sources of cost (a simplified executable version is sketched after this list):
  • L1 gas costs: This is the cost of posting transactions as calldata on the L1 chain, as well as the overhead associated with each batch of transactions, and the L1 cost of settling transactions in the Arbitrum protocol.
  • Validator’s staking costs: In normal operation, one validator will need to be staked. The stake is assumed to be 0.2% of the total value of the chain (which is assumed to be $1 per user who is eligible to claim points). The cost of staking is the interest that could be earned on the money if it were not staked.
  • Validator computation and storage: Every validator must do computation to track the chain’s processing of transactions, and must maintain storage to keep track of the contracts’ EVM storage. The cost of computation and storage are estimated based on measurements, with the dollar cost of resources based on Amazon Web Services pricing.
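Here is the simplified executable version promised above. Every parameter is an illustrative placeholder; the linked spreadsheet contains the measured values:

```python
# Deliberately simplified cost model combining the three components above.
# All parameters are illustrative; the linked spreadsheet has real numbers.
def total_cost_usd(
    calldata_bytes: int,          # total L1 calldata posted over the period
    gas_per_byte: float,          # ~16 gas per non-zero calldata byte
    gas_price_gwei: float,        # assumed L1 gas price
    eth_usd: float,               # assumed ETH price
    eligible_users: int,          # stake = 0.2% of $1 per eligible user
    annual_interest: float,       # opportunity cost of the staked capital
    period_days: float,
    validator_usd_per_day: float, # AWS-style compute + storage estimate
) -> float:
    l1_gas = calldata_bytes * gas_per_byte
    l1_cost = l1_gas * gas_price_gwei * 1e-9 * eth_usd
    stake = 0.002 * (1.0 * eligible_users)
    staking_cost = stake * annual_interest * (period_days / 365)
    validator_cost = validator_usd_per_day * period_days
    return l1_cost + staking_cost + validator_cost

print(total_cost_usd(
    calldata_bytes=300_000 * 15,  # e.g. ~15 bytes/tx for 300k transactions
    gas_per_byte=16, gas_price_gwei=50, eth_usd=400,
    eligible_users=100_000, annual_interest=0.05,
    period_days=5, validator_usd_per_day=3.0,
))  # L1 calldata dominates; staking and validator costs are tiny by comparison
```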
It’s clear from our modeling that the predominant cost is for L1 calldata. This will probably be true for any plausible rollup-based system.
Our model also shows that Arbitrum can scale to workloads much larger than Reddit’s nominal workload, without exhausting L1 or L2 resources. The scaling bottleneck will ultimately be calldata on the L1 chain. We believe that cost could be reduced substantially if necessary by clever encoding of data. (In our design any compression / decompression of L2 transaction calldata would be done by client software and L2 programs, never by an L1 contract.)
8. Status of Arbitrum Rollup
Arbitrum Rollup is live on Ethereum testnet. All of the code written to date, including everything in the Reddit demo, is open source and permissively licensed under the Apache 2.0 license. The first testnet version of Arbitrum Rollup was released on testnet in February. Our current internal version, which we used to benchmark the Reddit contracts, will be released soon and will be a major upgrade.
Both the Arbitrum design and its implementation have been heavily audited by independent third parties. The Arbitrum academic paper was published at USENIX Security, a top-tier peer-reviewed academic venue. For the Arbitrum software, we have engaged Trail of Bits for a security audit, which is currently ongoing, and we are committed to having a clean report before launching on Ethereum mainnet.
9. Reddit Universe Arbitrum Rollup Chain
The benchmarks described in this document were all measured using the latest internal build of our software. When we release the new software upgrade publicly we will launch a Reddit Universe Arbitrum Rollup chain as a public demo, which will contain the Reddit contracts as well as a Uniswap instance and a Connext Hub, demonstrating how Community Points can be integrated into third party apps. We will also allow members of the public to dynamically launch ecosystem contracts. We at Offchain Labs will cover the validating costs for the Reddit Universe public demo.
If the folks at Reddit would like to evaluate our software prior to our public demo, please email us at [email protected] and we'd be more than happy to provide early access.
10. Even more scaling: Arbitrum Sidechains
Rollups are an excellent approach to scaling, and we are excited about Arbitrum Rollup, which far surpasses Reddit's scaling needs. But looking forward to Reddit's eventual goal of supporting hundreds of millions of users, there will likely come a time when Reddit needs more scaling than any Rollup protocol can provide.
While Rollups greatly reduce costs, they don't break the linear barrier. That is, all transactions have an on-chain footprint (because all calldata must be posted on-chain), albeit a far smaller one than on native Ethereum, and the L1 limitations end up being the bottleneck for capacity and cost. Since Ethereum has limited capacity, this linear use of on-chain resources means that costs will eventually increase superlinearly with traffic.
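A rough capacity calculation shows why calldata is the ceiling. The gas limit, block time, and bytes-per-transaction figures below are illustrative assumptions:

```python
# Rough rollup throughput ceiling imposed by L1 calldata (illustrative).
BLOCK_GAS_LIMIT = 12_500_000      # assumed Ethereum block gas limit
BLOCK_TIME_SEC = 13               # assumed average block time
GAS_PER_CALLDATA_BYTE = 16        # non-zero calldata byte cost

def max_tps(bytes_per_tx: float, share_of_block: float) -> float:
    """Max rollup TPS if it may use `share_of_block` of every L1 block."""
    gas_budget = BLOCK_GAS_LIMIT * share_of_block
    tx_per_block = gas_budget / (bytes_per_tx * GAS_PER_CALLDATA_BYTE)
    return tx_per_block / BLOCK_TIME_SEC

# Even at only 12 bytes/tx and 10% of each block, the ceiling is finite:
print(f"{max_tps(bytes_per_tx=12, share_of_block=0.10):.0f} tps")  # ~500 tps
```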
The good news is that we at Offchain Labs have a solution in our roadmap that can satisfy this extreme-scaling setting as well: Arbitrum AnyTrust Sidechains. Arbitrum Sidechains are similar to Arbitrum Rollup, but deviate in that they name a permissioned set of validators. When a chain’s validators agree off-chain, they can greatly reduce the on-chain footprint of the protocol and require almost no data to be put on-chain. When validators can't reach unanimous agreement off-chain, the protocol reverts to Arbitrum Rollup. Technically, Arbitrum Sidechains can be viewed as a hybrid between state channels and Rollup, switching back and forth as necessary, and combining the performance and cost that state channels can achieve in the optimistic case, with the robustness of Rollup in other cases. The core technical challenge is how to switch seamlessly between modes and how to guarantee that security is maintained throughout.
Arbitrum Sidechains break through this linear barrier, while still maintaining a high level of security and decentralization. Arbitrum Sidechains provide the AnyTrust guarantee, which says that as long as any one validator is honest and available (even if you don't know which one will be), the L2 chain is guaranteed to execute correctly according to its code and guaranteed to make progress. Unlike in a state channel, offchain progress does not require unanimous consent, and liveness is preserved as long as there is a single honest validator.
Note that the trust model for Arbitrum Sidechains is much stronger than for typical BFT-style chains, which introduce a consensus "voting" protocol among a small permissioned group of validators. BFT-based protocols require a supermajority (more than 2/3) of validators to agree. In Arbitrum Sidechains, by contrast, all you need is a single honest validator to achieve guaranteed correctness and progress. Notice that in Arbitrum, adding validators strictly increases security, since the AnyTrust guarantee provides correctness as long as any one validator is honest and available. By contrast, in BFT-style protocols, adding nodes can be dangerous, as a coalition of dishonest nodes can break the protocol.
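The contrast can be quantified with a toy reliability model, under the (strong and purely illustrative) assumption that each validator is independently honest with probability p:

```python
# Toy comparison of the AnyTrust guarantee vs. a BFT supermajority,
# assuming each of n validators is independently honest with prob. p.
from math import comb

def anytrust_ok(n: int, p: float) -> float:
    """Correct as long as at least ONE of n validators is honest."""
    return 1 - (1 - p) ** n

def bft_ok(n: int, p: float) -> float:
    """Correct only if a >2/3 supermajority of validators is honest."""
    threshold = (2 * n) // 3 + 1
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k)
               for k in range(threshold, n + 1))

for p in (0.9, 0.5):
    print(f"p={p}: AnyTrust={anytrust_ok(10, p):.4f}  BFT={bft_ok(10, p):.4f}")
# p=0.9: AnyTrust=1.0000  BFT=0.9872
# p=0.5: AnyTrust=0.9990  BFT=0.1719
# Adding one more validator can only raise the AnyTrust probability,
# whereas in the BFT model added validators who turn out dishonest push
# the system toward its 1/3 failure bound.
```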
Like Arbitrum Rollup, the developer and user experiences for Arbitrum Sidechains will be identical to that of Ethereum. Reddit would be able to choose a large and diverse set of validators, and the only guarantee needed to break through the scaling barrier is that a single one of them remains honest.
We hope to have Arbitrum Sidechains in production in early 2021, and thus when Reddit reaches the scale that surpasses the capacity of Rollups, Arbitrum Sidechains will be waiting and ready to help.
While the idea to switch between channels and Rollup to get the best of both worlds is conceptually simple, getting the details right and making sure that the switch does not introduce any attack vectors is highly non-trivial and has been the subject of years of our research (indeed, we were working on this design for years before the term Rollup was even coined).
11. How Arbitrum compares
We include a comparison to several other categories, as well as specific projects where appropriate, and explain why we believe that Arbitrum is best suited for Reddit's purposes. We focus our attention on other Ethereum projects.
Payment-only Rollups. Compared to Arbitrum Rollup, ZK-Rollups and other Rollups that only support token transfers have several disadvantages:
  • As outlined throughout the proposal, we believe that the entire draw of Ethereum is its rich smart contract support, which is simply not achievable with today's zero-knowledge proof technology. Indeed, scaling with a ZK-Rollup will add friction to the deployment of smart contracts that interact with Community Points, as users will have to withdraw their coins from the ZK-Rollup and transfer them to a smart contract system (like Arbitrum). The community will be best served if Reddit builds on a platform that has built-in, frictionless smart-contract support.
  • All other Rollup protocols of which we are aware employ a centralized operator. While it's true that users retain custody of their coins, the centralized operator can often profit from censoring, reordering, or delaying transactions. A common misconception is that since these are non-custodial protocols, a centralized sequencer poses no risk; this is incorrect, as the sequencer can wreak havoc or shake down users for side payments without directly stealing funds.
  • Sidechain-type protocols can eliminate some of these issues, but they are not trustless. Instead, they require trust in some quorum of a committee, often requiring two-thirds of the committee to be honest, compared to rollup protocols like Arbitrum that require only a single honest party. In addition, not all sidechain-type protocols have committees that are diverse, or even non-centralized, in practice.
  • Plasma-style protocols have a centralized operator and do not support general smart contracts.
12. Concluding Remarks
While it's ultimately up to the judges’ palate, we believe that Arbitrum Rollup is the bake-off choice that Reddit kneads. We far surpass Reddit's specified workload requirement at present, have much room to optimize Arbitrum Rollup in the near term, and have a clear path to get Reddit to hundreds of millions of users. Furthermore, we are the only project that gives developers and users the identical interface as the Ethereum blockchain and is fully interoperable and tooling-compatible, and we do this all without any new trust assumptions or centralized components.
But no matter how the cookie crumbles, we're glad to have participated in this bake-off and we thank you for your consideration.
About Offchain Labs
Offchain Labs, Inc. is a venture-funded New York company that spun out of Princeton University research, and is building the Arbitrum platform to usher in the next generation of scalable, interoperable, and compatible smart contracts. Offchain Labs is backed by Pantera Capital, Compound VC, Coinbase Ventures, and others.
Leadership Team
Ed Felten
Ed Felten is Co-founder and Chief Scientist at Offchain Labs. He is on leave from Princeton University, where he is the Robert E. Kahn Professor of Computer Science and Public Affairs. From 2015 to 2017 he served at the White House as Deputy United States Chief Technology Officer and senior advisor to the President. He is an ACM Fellow and member of the National Academy of Engineering. Outside of work, he is an avid runner, cook, and L.A. Dodgers fan.
Steven Goldfeder
Steven Goldfeder is Co-founder and Chief Executive Officer at Offchain Labs. He holds a PhD from Princeton University, where he worked at the intersection of cryptography and cryptocurrencies including threshold cryptography, zero-knowledge proof systems, and post-quantum signatures. He is a co-author of Bitcoin and Cryptocurrency Technologies, the leading textbook on cryptocurrencies, and he has previously worked at Google and Microsoft Research, where he co-invented the Picnic signature algorithm. When not working, you can find Steven spending time with his family, taking a nature walk, or twisting balloons.
Harry Kalodner
Harry Kalodner is Co-founder and Chief Technology Officer at Offchain Labs, where he leads the engineering team. Before co-founding the company, he attended Princeton as a Ph.D. candidate, where his research explored the economics, anonymity, and incentive compatibility of cryptocurrencies; he has also worked at Apple. When not up at 3:00am writing code, Harry occasionally sleeps.
submitted by hkalodner to ethereum [link] [comments]

How to Run a Qtum Full Node On A Raspberry Pi
Light Node Wallet VS Full Node Wallet
Tutorial: Setting up Your Casa Node
Who Pays the Bitcoin Mining Reward? - George Levy
Bitcoin Q&A: Mining incentives after 2140

I understand that a 51% attack is very costly and probably won't provide much financial incentive to the one who orchestrates it. I came across a blog post called How China can kill bitcoin and thought that the author does have quite a good point (despite the tone of the blog post). He argued that the top 4 Chinese mining pools alone represent more than 51% of the hashrate and if the Chinese ...

In that case, miners definitely do have an incentive to change Bitcoin's rules in their favor. It is only reasonably secure to use a lightweight node because most of the Bitcoin economy uses full nodes. Therefore, it is critical for Bitcoin's survival that the great majority of the Bitcoin economy be backed by full nodes, not lightweight nodes. This is especially important for Bitcoin ...

If you're interested in signing up to possibly receive bitcoin payouts for running a well-connected node, you can check out the Bitnodes Incentive Program. If you are not willing to run a node on your home machine / Internet connection but are willing to pay to rent a machine, you can find instructions for setting up a cheap VPS on this Reddit post.

Overall, more Bitcoin nodes translate into a faster, more stable, and more decentralized network. To that end, we've compiled a list of 6 reasons to run a Bitcoin full node. 1) Helps the network. Running your own full node is the only way to have full control and to ensure that all the rules of Bitcoin are being followed. Nodes do this by ...

A mining node is a full node: it needs a complete, validated copy of the blockchain so that it can mine on the correct chain, earning bitcoins and contributing to the security of the network.



Vipnode is an economic incentive to run Ethereum full nodes. We provide a protocol and an open-source implementation of an agent that allows your full node to earn money by serving light clients who ...

What are the differences between Bitcoin and Lightning nodes and wallets? These questions are from the MOOC 10.5 session and the December monthly subscriber session, which took place on October ...

Wallet Series - https://www.youtube.com/playlist?list=PLyawcAUcwe11oO92sMxJ2K2XZyjkv1RWN Get Bitcoin worth Rs 50 free: https://koinex.in/?ref=f2b486 Do...

Casa Node Bitcoin Full and Lightning Node Unboxing and Setup, December 2018; 03 - myNode series - Hardware requirements.

Bitcoin full nodes are computers that are running the Bitcoin software. A full node will contain a full copy of the Bitcoin blockchain. Full nodes talk to each other, verify transactions, and make ...
