
Bitcoin Newcomers FAQ - Please read!

Welcome to the /Bitcoin Sticky FAQ

You've probably been hearing a lot about Bitcoin recently and are wondering what the big deal is. Most of your questions should be answered by the resources below, but if you have additional questions feel free to ask them in the comments.
It all started with the release of Satoshi Nakamoto's whitepaper; however, that will probably go over the head of most readers, so we recommend the following videos as a good starting point for understanding how bitcoin works and a little about its long-term potential:
Some other great resources include Lopp.net, the Princeton crypto series and James D'Angelo's Bitcoin 101 Blackboard series.
Some excellent writing on Bitcoin's value proposition and future can be found at the Satoshi Nakamoto Institute.
Some Bitcoin statistics can be found here and here. Developer resources can be found here. Peer-reviewed research papers can be found here.
Potential upcoming protocol improvements and scaling resources here and here.
The number of times Bitcoin was declared dead by the media can be found here (LOL!)

Key properties of Bitcoin

Where can I buy bitcoins?

Bitcoin.org and BuyBitcoinWorldwide.com are helpful sites for beginners. You can buy or sell any amount of bitcoin (even just a few dollars worth) and there are several easy methods to purchase bitcoin with cash, credit card or bank transfer. Some of the more popular resources are below, also check out the bitcoinity exchange resources for a larger list of options for purchases.
Here is a listing of local ATMs. If you would like your paycheck automatically converted to bitcoin use Bitwage.
Note: Bitcoins are valued at whatever market price people are willing to pay for them in a balancing act of supply and demand. Unlike traditional markets, bitcoin markets operate 24 hours per day, 365 days per year. Preev is a useful site that shows how much various denominations of bitcoin are worth in different currencies. Alternatively you can just Google "1 bitcoin in (your local currency)".

Securing your bitcoins

With bitcoin you can "Be your own bank" and personally secure your bitcoins OR you can use third party companies aka "Bitcoin banks" which will hold the bitcoins for you.
Note: For increased security, use Two Factor Authentication (2FA) everywhere it is offered, including email!
2FA requires a second confirmation code to access your account making it much harder for thieves to gain access. Google Authenticator and Authy are the two most popular 2FA services, download links are below. Make sure you create backups of your 2FA codes.
Google Authenticator: Android, iOS
Authy: Android, iOS
OTP Auth: iOS only

Watch out for scams

As mentioned above, Bitcoin is decentralized, which by definition means there is no official website, Twitter handle, spokesperson or CEO. However, all money attracts thieves. This combination unfortunately results in scammers using official-sounding names or pretending to be an authority on YouTube or social media. Many scammers throughout the years have claimed to be the inventor of Bitcoin. Websites like bitcoin(dot)com and the btc subreddit are active scams. Almost all altcoins (shitcoins) are marketed heavily with big promises but are really just designed to separate you from your bitcoin. So be careful: any resource, including those linked in this document, may in the future turn evil. Don't trust, verify. Also, as they say in our community, "Not your keys, not your coins".

Where can I spend bitcoins?

Check out spendabit or bitcoin directory for millions of merchant options. Also, you can spend bitcoin anywhere Visa is accepted with bitcoin debit cards such as the CashApp card. Some other useful sites are listed below.
Store | Product
Gyft | Gift cards for hundreds of retailers including Amazon, Target, Walmart, Starbucks, Whole Foods, CVS, Lowes, Home Depot, iTunes, Best Buy, Sears, Kohls, eBay, GameStop, etc.
Spendabit, Overstock and The Bitcoin Directory | Retail shopping with millions of results
ShakePay | Generate one-time-use Visa cards in seconds
NewEgg and Dell | For all your electronics needs
Bitwa.la, Coinbills, Piixpay, Bitbill.eu, Bylls, Coins.ph, Bitrefill, LivingRoomofSatoshi, Coinsfer, and more | Bill payment
Menufy, Takeaway and Thuisbezorgd NL | Takeout delivered to your door
Expedia, Cheapair, Destinia, Abitsky, SkyTours, the Travel category on Gyft and 9flats | For when you need to get away
Cryptostorm, Mullvad, and PIA | VPN services
Namecheap, Porkbun | Domain name registration
Stampnik | Discounted USPS Priority, Express, First-Class mail postage
Coinmap and AirBitz are helpful to find local businesses accepting bitcoins. A good resource for UK residents is at wheretospendbitcoins.co.uk.
There are also lots of charities which accept bitcoin donations.

Merchant Resources

There are several benefits to accepting bitcoin as a payment option if you are a merchant;
If you are interested in accepting bitcoin as a payment method, there are several options available;

Can I mine bitcoin?

Mining bitcoins can be a fun learning experience, but be aware that you will most likely operate at a loss. Newcomers are often advised to stay away from mining unless they are only interested in it as a hobby, similar to Folding@home. If you want to learn more about mining you can read more here. Still have mining questions? The crew at /BitcoinMining would be happy to help you out.
If you want to contribute to the bitcoin network by hosting the blockchain and propagating transactions you can run a full node using this setup guide. If you would prefer to keep it simple there are several good options. You can view the global node distribution here.

Earning bitcoins

Just like any other form of money, you can also earn bitcoins by being paid to do a job.
Site | Description
WorkingForBitcoins, Bitwage, Cryptogrind, Coinality, Bitgigs, /Jobs4Bitcoins, BitforTip, Rein Project | Freelancing
Lolli | Earn bitcoin when you shop online!
OpenBazaar, Purse.io, Bitify, /Bitmarket, 21 Market | Marketplaces
/GirlsGoneBitcoin | NSFW Adult services
A-ads, Coinzilla.io | Advertising
You can also earn bitcoins by participating as a market maker on JoinMarket by allowing users to perform CoinJoin transactions with your bitcoins for a small fee (this requires you to already have some bitcoins).

Bitcoin-Related Projects

The following is a short list of ongoing projects that might be worth taking a look at if you are interested in current development in the bitcoin space.
Project | Description
Lightning Network | Second layer scaling
Blockstream, Rootstock and Drivechain | Sidechains
Hivemind and Augur | Prediction markets
Tierion and Factom | Records & Titles on the blockchain
BitMarkets, DropZone, Beaver and Open Bazaar | Decentralized markets
JoinMarket and Wasabi Wallet | CoinJoin implementation
Coinffeine and Bisq | Decentralized bitcoin exchanges
Keybase | Identity & Reputation management
Abra | Global P2P money transmitter network
Bitcore | Open source Bitcoin JavaScript library

Bitcoin Units

One Bitcoin is quite large (hundreds of £/$/€) so people often deal in smaller units. The most common subunits are listed below:
Unit | Symbol | Value | Info
bitcoin | BTC | 1 bitcoin | one bitcoin is equal to 100 million satoshis
millibitcoin | mBTC | 1,000 per bitcoin | used as default unit in recent Electrum wallet releases
bit | bit | 1,000,000 per bitcoin | colloquial "slang" term for microbitcoin (μBTC)
satoshi | sat | 100,000,000 per bitcoin | smallest unit in bitcoin, named after the inventor
For example, assuming an arbitrary exchange rate of $10,000 for one bitcoin, a $10 meal would equal 0.001 BTC, 1 mBTC, 1,000 bits, or 100,000 satoshis.
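If you want to double-check conversions like this yourself, here is a tiny Python sketch. The helper name and the sample rate are just for illustration; it simply applies the unit ratios from the table above.

```python
# Convert a USD amount into the common bitcoin sub-units listed above.
SATS_PER_BTC = 100_000_000   # satoshis per bitcoin

def usd_to_units(usd_amount, usd_per_btc):
    sats = round(usd_amount / usd_per_btc * SATS_PER_BTC)
    return {
        "BTC": sats / SATS_PER_BTC,
        "mBTC": sats / 100_000,   # 100,000 satoshis per millibitcoin
        "bits": sats / 100,       # 100 satoshis per bit
        "sats": sats,
    }

# A $10 meal at the arbitrary example rate of $10,000 per bitcoin:
print(usd_to_units(10, 10_000))
# {'BTC': 0.001, 'mBTC': 1.0, 'bits': 1000.0, 'sats': 100000}
```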
For more information check out the Bitcoin units wiki.
Still have questions? Feel free to ask in the comments below or stick around for our weekly Mentor Monday thread. If you decide to post a question in /Bitcoin, please use the search bar to see if it has been answered before, and remember to follow the community rules outlined on the sidebar to receive a better response. The mods are busy helping manage our community so please do not message them unless you notice problems with the functionality of the subreddit.
Note: This is a community created FAQ. If you notice anything missing from the FAQ or that requires clarification you can edit it here and it will be included in the next revision pending approval.
Welcome to the Bitcoin community and the new decentralized economy!
submitted by BitcoinFan7 to Bitcoin

NEAR PROJECT REPORT

Author: Gamals Ahmed, CoinEx Business Ambassador

ABSTRACT

The network effects of the web have allowed a handful of companies to capture a large number of users, and these companies keep users' data locked in to prevent them from seeking alternatives. Likewise, these huge platforms have attracted applications to build on top of their ecosystems, only to cut off access or actively oppose those applications' interests once they became successful. As a result, these walled gardens have effectively hindered innovation and monopolized large sections of the web. With the emergence of blockchain technology and decentralized cryptocurrencies, the need for applications that support decentralization has grown, and several blockchain-based companies, applications and platforms have appeared to serve it. In this research report, we will explain the approach adopted by the NEAR decentralization platform in designing and implementing the basic technology for its system. NEAR is a community-managed platform for cloud computing and decentralized storage, designed to enable the open web of the future. On this web, everything can be created, from new currencies to new applications to new industries, opening the door to an entirely new future.

1. INTRODUCTION

The richness of the web is increasing day by day through the combined efforts of millions of people who have benefited from "innovation without permission", as content and applications are created without asking anyone. However, this freedom has not extended to data, and that lack of data freedom has led to an environment hostile to the interests of its participants. As we explained in the abstract, the companies that host the web have hindered innovation and greatly monopolized it.
In the future, we can fix this by using new technologies to re-enable the permissionless innovation of the past in a way, which creates a more open web where users are free and applications are supportive rather than adversarial to their interests.
Interest in decentralization grew after the global financial crisis of 2008, which exposed fundamental problems of confidence in a heavily indebted banking system; a decentralized financial sector based on blockchain technology has been emerging since 2009.
Decentralized blockchain technology has made it easy for decentralized digital currencies like Bitcoin to exchange billions of dollars in peer-to-peer transfers for a fraction of the price of a traditional banking system. This technology allows participants in the over $50 billion virtual goods economy to track, own and trade these goods without permission. It allows real-world goods to cross into the digital domain, with verified ownership and tracking just like that of the digital.
By default, the Internet where freedom of data enables innovation will lead to the development of a new form of software development. On this web, developers can quickly create applications from open state components and boost their efforts by using new business models that are enabled from within the program itself rather than relying on parasitic relationships with their users. This not only accelerates the creation of applications that have a more honest and cooperative relationship with its users, but also allows the emergence of completely new business built on them.
To enable these new applications and the open web, it needs the appropriate infrastructure. The new web platform cannot be controlled by a single entity and its use is not limited due to insufficient scalability. It should be decentralized in design like the web itself and supported by a community of distributors widely so that the value they store cannot be monitored, modified or removed without permission from the users who store this value on their behalf.
The cost of storing data or performing a computation on the Ethereum blockchain is thousands to millions of times higher than the cost of performing the same functionality on Amazon Web Services. A developer can always create a "centralized" app, or even a centralized currency, for a fraction of the cost of doing the same on a decentralized platform, because a decentralized platform, by definition, replicates its operations and storage across many machines.
Bitcoin can be thought of as the first, very basic, version of this global community-run cloud, though it is primarily used only to store and move the Bitcoin digital currency.
Ethereum is the second and slightly more sophisticated version, which expanded the basic principles of Bitcoin to create a more general computing and storage platform, though it is a raw technology, which hasn’t achieved meaningful mainstream adoption.

1.1 WHY IS IT IMPORTANT TO PAY THE EXTRA COST TO SUPPORT DECENTRALIZATION?

Because some elements of value, for example bits representing digital currency ownership, personal identity, or asset notes, are very sensitive. While in the central system, the following players can change the value of any credits they come into direct contact with:
  1. The developer who controls the release or update of the application’s code
  2. The platform where the data is stored
  3. The servers which run the application’s code
Even if none of these players intend to operate in bad faith, the actions of governments, police forces and hackers can easily turn them against their users, censoring, modifying or stealing the balances they are supposed to protect.
A typical user will trust a typical centralized application, despite its potential vulnerabilities, with everyday data and computation. Typically, only banks and governments are trusted sufficiently to maintain custody of the most sensitive information — balances of wealth and identity. But these entities are also subject to the very human forces of hubris, corruption and theft.
This was especially clear after the 2008 global financial crisis, which demonstrated the fundamental problems of confidence in a heavily indebted banking system, and governments around the world apply significant capital controls to their citizens during times of crisis. Beyond these examples, it has become a truism that hackers now hold most or all of your sensitive data.
These decentralized applications operate on a more complex infrastructure than today’s web but they have access to an instantaneous and global pool of currency, value and information that today’s web, where data is stored in the silos of individual corporations, cannot provide.

1.2 THE CHALLENGES OF CREATING A DECENTRALIZED CLOUD

A community-run system like this has very different challenges from centralized "cloud" infrastructure, which is run by a single entity or a group of known entities. For example:
  1. It must be both inclusive to anyone and secure from manipulation or capture.
  2. Participants must be fairly compensated for their work while avoiding creating incentives for negligent or malicious behavior.
  3. It must be both game theoretically secure so good actors find the right equilibrium and resistant to manipulation so bad actors are actively prevented from negatively affecting the system.

2. NEAR

NEAR is a global community-run computing and storage cloud which is organized to be permissionless and which is economically incentivized to create a strong and decentralized data layer for the new web.
Essentially, it is a platform for running applications which have access to a shared — and secure — pool of money, identity and data which is owned by their users. More technically, it combines the features of partition-resistant networking, serverless compute and distributed storage into a new kind of platform.
NEAR is a community-run, decentralized cloud computing and storage platform, built on the same core blockchain technology that underpins Bitcoin and designed to enable the open web of the future. On this web, everything from new currencies to new applications to new industries can be created, opening the door to a brand new future.
NEAR is a scalable computing and storage platform with the potential to change how systems are designed, how applications are built and how the web itself works.
Under the hood it is complex technology, but it allows developers and entrepreneurs to easily and sustainably build applications which reap the benefits of decentralization and participate in the Open Web while minimizing the associated costs for end users.
NEAR creates the only community-managed cloud that is strong enough to power the future of the open web, as NEAR is designed from the ground up to deliver intuitive experiences to
end users, expand capacity across millions of devices, and provide developers with new and sustainable business models for their applications.
The NEAR Platform uses a token — also called “NEAR”. This token allows the users of these cloud resources, regardless of where they are in the world, to fairly compensate the providers of the services and to ensure that these participants operate in good faith.

2.1 WHY NEAR?

Through focus, we find that Platforms based on blockchain technologies like Bitcoin and Ethereum have made great progress and enriched the world with thousands of innovative applications spanning from games to decentralized financing.
However, neither these original networks nor any of the networks that followed have been able to bridge the gap toward mainstream adoption of the applications built on top of them, and none provide the kind of standard platform that fully supports the web.
This is a result of two key factors:
  1. System design
  2. Organization design
System design is relevant because the technical architecture of other platforms creates substantial problems with both usability and scalability which have made adoption nearly impossible by any but the most technical innovators. End-users experience 97–99% dropoff rates when using applications and developers find the process of creating and maintaining their applications endlessly frustrating.
Fixing these problems requires substantial and complex changes to current protocol architectures, something which existing organizations haven’t proven capable of implementing. Instead, they create multi-year backlogs of specification design and implementation, which result in their technology falling further and further behind.
NEAR’s platform and organization are architected specifically to solve the above-mentioned problems. The technical design is fanatically focused on creating the world’s most usable and scalable decentralized platform so global-scale applications can achieve real adoption. The organization and governance structure are designed to rapidly ship and continuously evolve the protocol so it will never become obsolete.

2.1.1 Features, which address these problems:

1. USABILITY FIRST
The most important problem that needs to be addressed is how to allow developers to create useful applications that users can use easily and that will capture the sustainable value of these developers.
2. End-User Usability
Developers will only build applications, which their end users can actually use. NEAR’s “progressive security” model allows developers to create experiences for their users which more closely resemble familiar web experiences by delaying onboarding, removing the need for user to learn “blockchain” concepts and limiting the number of permission-asking interactions the user must have to use the application.
1. Simple Onboarding: NEAR allows developers to take actions on behalf of their users, which allows them to onboard users without requiring these users to provide a wallet or interact with tokens immediately upon reaching an application. Because accounts keep track of application-specific keys, user accounts can also be used for the kind of “Single Sign On” (SSO) functionality that users are familiar with from the traditional web (eg “Login with Facebook/Google/Github/etc”).
2. Easy Subscriptions: Contract-based accounts allow for easy creation of subscriptions and custom permissioning for particular applications.
3. Familiar Usage Styles: The NEAR economic model allows developers to pay for usage on behalf of their users in order to hide the costs of infrastructure in a way that is in line with familiar web usage paradigms.
4. Predictable Pricing: NEAR prices transactions on the platform in simple terms, which allow end-users to experience predictable pricing and less cognitive load when using the platform.

2.1.2 Design principles and development NEAR’s platform

1. Usability: Applications deployed to the platform should be seamless to use for end users and seamless to create for developers. Wherever possible, the underlying technology itself should fade to the background or be hidden completely from end users. Wherever possible, developers should use familiar languages and patterns during the development process. Basic applications should be intuitive and simple to create while applications that are more robust should still be secure.
2. Scalability: The platform should scale with no upper limit as long as there is economic justification for doing so in order to support enterprise-grade, globally used applications.
3. Sustainable Decentralization: The platform should encourage significant decentralization in both the short term and the long term in order to properly secure the value it hosts. The platform — and community — should be widely and permissionlessly inclusive and actively encourage decentralization and participation. To maintain sustainability, both technological and community governance mechanisms should allow for practical iteration while avoiding capture by any single parties in the end.
4. Simplicity: The design of each of the system’s components should be as simple as possible in order to achieve their primary purpose. Optimize for simplicity, pragmatism and ease of understanding above theoretical perfection.

2.2 HOW NEAR WORKS?

NEAR’s platform provides a community-operated cloud infrastructure for deploying and running decentralized applications. It combines the features of a decentralized database with others of a serverless compute platform. The token, which allows this platform to run also, enables applications built on top of it to interact with each other in new ways. Together, these features allow developers to create censorship resistant back-ends for applications that deal with high stakes data like money, identity, assets, and open-state components, which interact seamlessly with each other. These application back-ends and components are called “smart contracts,” though we will often refer to these all as simply “applications” here.
The infrastructure, which makes up this cloud, is created from a potentially infinite number of “nodes” run by individuals around the world who offer portions of their CPU and hard drive space — whether on their laptops or more professionally deployed servers. Developers write smart contracts and deploy them to this cloud as if they were deploying to a single server, which is a process that feels very similar to how applications are deployed to existing centralized clouds.
Once the developer has deployed an application, called a “smart contract”, and marked it unchangeable (“immutable”), the application will now run for as long as at least a handful of members of the NEAR community continue to exist. When end users interact with that deployed application, they will generally do so through a familiar web or mobile interface just like any one of a million apps today.
In the centralized clouds hosted by companies like Amazon or Google today, developers pay for their apps every month based on the amount of usage needed, for example based on the number of requests created by users visiting their webpages. The NEAR platform similarly requires that either users or developers provide compensation for their usage to the community operators of this infrastructure. Like today's cloud infrastructure, NEAR prices usage based on easy-to-understand metrics that aren't heavily influenced by factors like system congestion. Such factors make things very complicated for developers on alternative blockchain-based systems today.
In a centralized cloud, the controlling corporation makes decisions unilaterally. NEAR's community-run cloud is decentralized, so updates must ultimately be accepted by a sufficient quorum of the network participants. Updates about its future are generated from the community and subject to an inclusive governance process, which balances efficiency and security.
In order to ensure that the operators of nodes — who are anonymous and potentially even malicious — run the code with good behavior, they participate in a staking process called “Proof of Stake”. In this process, they willingly put a portion of value at risk as a sort of deposit, which they will forfeit if it is proven that they have operated improperly.

2.2.1 Elements of the NEAR’s Platform

The NEAR platform is made up of many separate elements. Some of these are native to the platform itself while others are used in conjunction with or on top of it.
1. THE NEAR TOKEN
NEAR token is the fundamental native asset of the NEAR ecosystem and its functionality is enabled for all accounts. Each token is a unique digital asset similar to Ether, which can be used to:
a) Pay the system for processing transactions and storing data.
b) Run a validating node as part of the network by participating in the staking process.
c) Help determine how network resources are allocated and where its future technical direction will go by participating in governance processes.
The NEAR token enables the economic coordination of all participants who operate the network plus it enables new behaviors among the applications which are built on top of that network.
2. OTHER DIGITAL ASSETS
The platform is designed to easily store unique digital assets, which may include, but aren’t limited to:
  • Other Tokens: Tokens bridged from other chains (“wrapped”) or created atop the NEAR Platform can be easily stored and moved using the underlying platform. This allows many kinds of tokens to be used atop the platform to pay for goods and services. “Stablecoins,” specific kinds of token which are designed to match the price of another asset (like the US Dollar), are particularly useful for transacting on the network in this way.
  • Unique Digital Assets: Similar to tokens, digital assets (sometimes called "Non-Fungible Tokens" (NFTs)) ranging from in-game collectibles to representations of real-world asset ownership can be stored and moved using the platform.
3. THE NEAR PLATFORM
The core platform, which is made up of the cloud of community-operated nodes, is the most basic piece of infrastructure provided. Developers can permissionlessly deploy smart contracts to this cloud and users can permissionlessly use the applications they power. Applications, which could range from consumer-facing games to digital currencies, can store their state (data) securely on the platform. This is conceptually similar to the Ethereum platform.
Operations that use an account, the network, or storage on top of the platform require payment to the platform in the form of transaction fees, which the platform then distributes to its community of validators. These operations could include creating new accounts, deploying new contracts, executing contract code and storing or modifying data in a contract.
As long as the rules of the protocol are followed, any independent developer can write software, which interfaces with it (for example, by submitting transactions, creating accounts or even running a new node client) without asking for anyone’s permission first.
4. THE NEAR DEVELOPMENT SUITE
Set of tools and reference implementations created to facilitate its use by those developers and end users who prefer them. These tools include:
  • NEAR SDKs: The NEAR platform supports two languages for writing smart contracts: Rust and AssemblyScript. To provide a great experience for developers, NEAR has a full SDK which includes standard data structures, examples and testing tools for these two languages.
  • Gitpod for NEAR: NEAR uses existing technology Gitpod to create zero time onboarding experience for developers. Gitpod provides an online “Integrated Development Environment” (IDE), which NEAR customized to allow developers to easily write, test and deploy smart contracts from a web browser.
  • NEAR Wallet: A wallet is a basic place for developers and end users to store the assets they need to use the network. NEAR Wallet is a reference implementation that is intended to work seamlessly with the progressive security model that lets application developers design more effective user experiences. It will eventually include built-in functionality to easily enable participation by holders in staking and governance processes on the network.
  • NEAR Explorer: To aid with both debugging of contracts and the understanding of network performance, Explorer presents information from the blockchain in an easily digestible web-based format.
  • NEAR Command Line Tools: The NEAR team provides a set of straightforward command line tools to allow developers to easily create, test and deploy applications from their local environments.
All of these tools are being created in an open-source manner so they can be modified or deployed by anyone.

3. ECONOMIC

Economic forces primarily drive the ecosystem that makes up the NEAR platform. This economy creates the incentives which allow participants to permissionlessly organize to drive the platform's key functions, while creating strong disincentives for undesirable, irresponsible or malicious behavior. In order for the platform to be effective, these incentives need to exist both in the short term and in the long term.
The NEAR platform is a market among participants interested in two aspects:
  • On the supply side, validating node operators and other core infrastructure providers must be motivated to provide the services that make up the community cloud.
  • On the demand side, platform developers and end-users who pay for their usage need to be able to do so in a simple, clear and consistent way.
Further, economic forces can also be applied to support the ecosystem as a whole. They can be used at a micro level to create new business models by directly compensating the developers who create its most useful applications. They can also be used at a macro level by coordinating the efforts of a broader set of ecosystem participants who participate in everything from education to governance.

3.1 NEAR ECONOMY DESIGN PRINCIPLES

NEAR’s overall system design principles are used to inform its economic design according to the following interpretations:
1. Usability: End users and developers should have predictable and consistent pricing for their usage of the network. Users should never lose data forever.
2. Scalability: The platform should scale at economically justified thresholds.
3. Simplicity: The design of each of the system’s components should be as simple as possible in order to achieve their primary purpose.
4. Sustainable Decentralization: The barrier for participation in the platform as a validating node should be set as low as possible in order to bring a wide range of participants. Over time, their participation should not drive wealth and control into the hands of a small number. Individual transactions made far in the future must be at least as secure as those made today in order to safeguard the value they modify.

3.2 ECONOMIC OVERVIEW

The NEAR economy is optimized to provide developers and end users with the easiest possible experience while still providing proper incentives for network security and ecosystem development.
Summary of the key ideas that drive the system:
  • Thresholded Proof of Stake: Validating node operators provide scarce and valuable compute resources to the network. In order to ensure that the computations they run are correct, they are required to “stake” NEAR tokens, which guarantee their results. If these results are found to be inaccurate, the staker loses their tokens. This is a fundamental mechanism for securing the network. The threshold for participating in the system is set algorithmically at the lowest level possible to allow for the broadest possible participation of validating nodes in a given “epoch” period (½ of a day).
  • Epoch Rewards: Node operators are paid for their service out of a fixed percentage of total supply, a "security" fee of roughly 4.5% annualized. This rate targets sufficient participation levels among stakers in order to secure the network while balancing against other uses of the NEAR token in the ecosystem.
  • Protocol treasury: In addition to validators, the protocol treasury receives 0.5% of total supply annually to continuously re-invest in ecosystem development.
  • Transaction Costs: Usage of the network consumes two separate kinds of resources — instantaneous and long term. Instantaneous costs are generated by every transaction because each transaction requires the usage of both the network itself and some of its computation resources. These are priced together as a mostly-predictable cost per transaction, which is paid in NEAR tokens.
  • Storage Costs: Storage is a long-term cost because storing data represents an ongoing burden to the nodes of the network. Storage costs are covered by maintaining a minimum balance of NEAR tokens on the account or contract. This provides an indirect mechanism of payment, via inflation, to validators for maintaining contract and account state on their nodes.
  • Inflation: Inflation is determined as the combination of payouts to validators and the protocol treasury, minus collected transaction fees and a few other NEAR-burning mechanics (like the name auction). The maximum overall inflation is 5%, which can go down over time as the network gets more usage and more transaction fees are burned. It is even possible for inflation to become negative (total supply decreases) if enough fees are burned; a small numeric sketch of this appears after this list.
  • Scaling Thresholds: In a network that scales its capacity relative to the amount of usage it receives, the thresholds that drive the network to bring on additional capacity are economic in nature.
  • Security Thresholds: Some thresholds that provide for good behavior among participants are set using economic incentives, for example "Fishermen" (described separately).
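To make these parameters concrete, the following minimal Python sketch combines the validator reward (roughly 4.5% of supply), the protocol treasury (0.5%) and burned transaction fees into a single annual issuance figure. The starting supply and fee volumes used here are illustrative assumptions, not protocol constants.

```python
# Minimal sketch of NEAR's annual issuance under the parameters described above.
# initial_supply and fees_burned are illustrative inputs, not protocol values.

def annual_issuance(initial_supply, fees_burned,
                    validator_reward_rate=0.045,  # ~4.5% of supply to validators
                    treasury_rate=0.005):         # 0.5% of supply to the treasury
    minted = initial_supply * (validator_reward_rate + treasury_rate)  # max ~5%
    net_inflation = minted - fees_burned          # burned fees offset new issuance
    return {
        "minted": minted,
        "net_inflation": net_inflation,
        "net_inflation_rate": net_inflation / initial_supply,
    }

# Low usage: few fees burned, so inflation sits near the 5% maximum.
print(annual_issuance(initial_supply=1_000_000_000, fees_burned=5_000_000))
# Heavy usage: enough fees burned that net inflation turns negative (supply shrinks).
print(annual_issuance(initial_supply=1_000_000_000, fees_burned=60_000_000))
```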
Full Report
submitted by CoinEx_Institution to Coinex

Scaling Reddit Community Points with Arbitrum Rollup: a piece of cake

Submitted for consideration to The Great Reddit Scaling Bake-Off
Baked by the pastry chefs at Offchain Labs
Please send questions or comments to [email protected]
1. Overview
We're excited to submit Arbitrum Rollup for consideration to The Great Reddit Scaling Bake-Off. Arbitrum Rollup is the only Ethereum scaling solution that supports arbitrary smart contracts without compromising on Ethereum's security or adding points of centralization. For Reddit, this means that Arbitrum can not only scale the minting and transfer of Community Points, but it can foster a creative ecosystem built around Reddit Community Points enabling points to be used in a wide variety of third party applications. That's right -- you can have your cake and eat it too!
Arbitrum Rollup isn't just Ethereum-style. Its Layer 2 transactions are byte-for-byte identical to Ethereum, which means Ethereum users can continue to use their existing addresses and wallets, and Ethereum developers can continue to use their favorite toolchains and development environments out-of-the-box with Arbitrum. Coupling Arbitrum’s tooling-compatibility with its trustless asset interoperability, Reddit not only can scale but can onboard the entire Ethereum community at no cost by giving them the same experience they already know and love (well, certainly know).
To benchmark how Arbitrum can scale Reddit Community Points, we launched the Reddit contracts on an Arbitrum Rollup chain. Since Arbitrum provides full Solidity support, we didn't have to rewrite the Reddit contracts or try to mimic their functionality using an unfamiliar paradigm. Nope, none of that. We launched the Reddit contracts unmodified on Arbitrum Rollup complete with support for minting and distributing points. Like every Arbitrum Rollup chain, the chain included a bridge interface in which users can transfer Community Points or any other asset between the L1 and L2 chains. Arbitrum Rollup chains also support dynamic contract loading, which would allow third-party developers to launch custom ecosystem apps that integrate with Community Points on the very same chain that runs the Reddit contracts.
1.1 Why Ethereum
Perhaps the most exciting benefit of distributing Community Points using a blockchain is the ability to seamlessly port points to other applications and use them in a wide variety of contexts. Applications may include simple transfers such as a restaurant that allows Redditors to spend points on drinks. Or it may include complex smart contracts -- such as placing Community Points as a wager for a multiparty game or as collateral in a financial contract.
The common denominator between all of the fun uses of Reddit points is that it needs a thriving ecosystem of both users and developers, and the Ethereum blockchain is perhaps the only smart contract platform with significant adoption today. While many Layer 1 blockchains boast lower cost or higher throughput than the Ethereum blockchain, more often than not, these attributes mask the reality of little usage, weaker security, or both.
Perhaps another platform with significant usage will rise in the future. But today, Ethereum captures the mindshare of the blockchain community, and for Community Points to provide the most utility, the Ethereum blockchain is the natural choice.
1.2 Why Arbitrum
While Ethereum's ecosystem is unmatched, the reality is that fees are high and capacity is too low to support the scale of Reddit Community Points. Enter Arbitrum. Arbitrum Rollup provides all of the ecosystem benefits of Ethereum, but with orders of magnitude more capacity and at a fraction of the cost of native Ethereum smart contracts. And most of all, we don't change the experience from users. They continue to use the same wallets, addresses, languages, and tools.
Arbitrum Rollup is not the only solution that can scale payments, but it is the only developed solution that can scale both payments and arbitrary smart contracts trustlessly, which means that third party users can build highly scalable add-on apps that can be used without withdrawing money from the Rollup chain. If you believe that Reddit users will want to use their Community Points in smart contracts--and we believe they will--then it makes the most sense to choose a single scaling solution that can support the entire ecosystem, eliminating friction for users.
We view being able to run smart contracts in the same scaling solution as fundamentally critical since if there's significant demand in running smart contracts from Reddit's ecosystem, this would be a load on Ethereum and would itself require a scaling solution. Moreover, having different scaling solutions for the minting/distribution/spending of points and for third party apps would be burdensome for users as they'd have to constantly shuffle their Points back and forth.
2. Arbitrum at a glance
Arbitrum Rollup has a unique value proposition as it offers a combination of features that no other scaling solution achieves. Here we highlight its core attributes.
Decentralized. Arbitrum Rollup is as decentralized as Ethereum. Unlike some other Layer 2 scaling projects, Arbitrum Rollup doesn't have any centralized components or centralized operators who can censor users or delay transactions. Even in non-custodial systems, centralized components provide a risk as the operators are generally incentivized to increase their profit by extracting rent from users often in ways that severely degrade user experience. Even if centralized operators are altruistic, centralized components are subject to hacking, coercion, and potential liability.
Massive Scaling. Arbitrum achieves order of magnitude scaling over Ethereum's L1 smart contracts. Our software currently supports 453 transactions-per-second for basic transactions (at 1616 Ethereum gas per tx). We have a lot of room left to optimize (e.g. aggregating signatures), and over the next several months capacity will increase significantly. As described in detail below, Arbitrum can easily support and surpass Reddit's anticipated initial load, and its capacity will continue to improve as Reddit's capacity needs grow.
Low cost. The cost of running Arbitrum Rollup is quite low compared to L1 Ethereum and other scaling solutions such as those based on zero-knowledge proofs. Layer 2 fees are low, fixed, and predictable and should not be overly burdensome for Reddit to cover. Nobody needs to use special equipment or high-end machines. Arbitrum requires validators, which is a permissionless role that can be run on any reasonable on-line machine. Although anybody can act as a validator, in order to protect against a “tragedy of the commons” and make sure reputable validators are participating, we support a notion of “invited validators” that are compensated for their costs. In general, users pay (low) fees to cover the invited validators’ costs, but we imagine that Reddit may cover this cost for its users. See more on the costs and validator options below.
Ethereum Developer Experience. Not only does Arbitrum support EVM smart contracts, but the developer experience is identical to that of L1 Ethereum contracts and fully compatible with Ethereum tooling. Developers can port existing Solidity apps or write new ones using their favorite and familiar toolchains (e.g. Truffle, Buidler). There are no new languages or coding paradigms to learn.
Ethereum wallet compatibility. Just as in Ethereum, Arbitrum users need only hold keys, but do not have to store any coin history or additional data to protect or access their funds. Since Arbitrum transactions are semantically identical to Ethereum L1 transactions, existing Ethereum users can use their existing Ethereum keys with their existing wallet software such as Metamask.
Token interoperability. Users can easily transfer their ETH, ERC-20 and ERC-721 tokens between Ethereum and the Arbitrum Rollup chain. As we explain in detail below, it is possible to mint tokens in L2 that can subsequently be withdrawn and recognized by the L1 token contract.
Fast finality. Transactions complete with the same finality time as Ethereum L1 (and it's possible to get faster finality guarantees by trading away trust assumptions; see the Arbitrum Rollup whitepaper for details).
Non-custodial. Arbitrum Rollup is a non-custodial scaling solution, so users control their funds/points and neither Reddit nor anyone else can ever access or revoke points held by users.
Censorship Resistant. Since it's completely decentralized, and the Arbitrum protocol guarantees progress trustlessly, Arbitrum Rollup is just as censorship-proof as Ethereum.
Block explorer. The Arbitrum Rollup block explorer allows users to view and analyze transactions on the Rollup chain.
Limitations
Although this is a bake-off, we're not going to sugar coat anything. Arbitrum Rollup, like any Optimistic Rollup protocol, does have one limitation, and that's the delay on withdrawals.
As for the concrete length of the delay, we've done a good deal of internal modeling and have blogged about this as well. Our current modeling suggests a 3-hour delay is sufficient (but as discussed in the linked post there is a tradeoff space between the length of the challenge period and the size of the validators’ deposit).
Note that this doesn't mean that the chain is delayed for three hours. Arbitrum Rollup supports pipelining of execution, which means that validators can keep building new states even while previous ones are “in the pipeline” for confirmation. As the challenge delays expire for each update, a new state will be confirmed (read more about this here).
So activity and progress on the chain are not delayed by the challenge period. The only thing that's delayed is the consummation of withdrawals. Recall though that any single honest validator knows immediately (at the speed of L1 finality) which state updates are correct and can guarantee that they will eventually be confirmed, so once a valid withdrawal has been requested on-chain, every honest party knows that the withdrawal will definitely happen. There's a natural place here for a liquidity market in which a validator (or someone who trusts a validator) can provide withdrawal loans for a small interest fee. This is a no-risk business for them as they know which withdrawals will be confirmed (and can force their confirmation trustlessly no matter what anyone else does) but are just waiting for on-chain finality.
3. The recipe: How Arbitrum Rollup works
For a description of the technical components of Arbitrum Rollup and how they interact to create a highly scalable protocol with a developer experience that is identical to Ethereum, please refer to the following documents:
Arbitrum Rollup Whitepaper
Arbitrum academic paper (describes a previous version of Arbitrum)
4. Developer docs and APIs
For full details about how to set up and interact with an Arbitrum Rollup chain or validator, please refer to our developer docs, which can be found at https://developer.offchainlabs.com/.
Note that the Arbitrum version described on that site is older and will soon be replaced by the version we are entering in Reddit Bake-Off, which is still undergoing internal testing before public release.
5. Who are the validators?
As with any Layer 2 protocol, advancing the protocol correctly requires at least one validator (sometimes called block producers) that is honest and available. A natural question is: who are the validators?
Recall that the validator set for an Arbitrum chain is open and permissionless; anyone can start or stop validating at will. (A useful analogy is to full nodes on an L1 chain.) But we understand that even though anyone can participate, Reddit may want to guarantee that highly reputable nodes are validating their chain. Reddit may choose to validate the chain themselves and/or hire third-party validators. To this end, we have begun building a marketplace for validator-for-hire services so that dapp developers can outsource validation services to reputable nodes with high up-time. We've announced a partnership in which Chainlink nodes will provide Arbitrum validation services, and we expect to announce more partnerships shortly with other blockchain infrastructure providers.
Although there is no requirement that validators are paid, Arbitrum’s economic model tracks validators’ costs (e.g. amount of computation and storage) and can charge small fees on user transactions, using a gas-type system, to cover those costs. Alternatively, a single party such as Reddit can agree to cover the costs of invited validators.
6. Reddit Contract Support
Since Arbitrum contracts and transactions are byte-for-byte compatible with Ethereum, supporting the Reddit contracts is as simple as launching them on an Arbitrum chain.
Minting. Arbitrum Rollup supports hybrid L1/L2 tokens which can be minted in L2 and then withdrawn onto the L1. An L1 contract at address A can make a special call to the EthBridge which deploys a "buddy contract" to the same address A on an Arbitrum chain. Since it's deployed at the same address, users can know that the L2 contract is the authorized "buddy" of the L1 contract on the Arbitrum chain.
For minting, the L1 contract is a standard ERC-20 contract which mints and burns tokens when requested by the L2 contract. It is paired with an ERC-20 contract in L2 which mints tokens based on whatever programmer-provided minting facility is desired and burns tokens when they are withdrawn from the rollup chain. Given this base infrastructure, Arbitrum can support any smart-contract-based method for minting tokens in L2, and indeed we directly support Reddit's signature/claim based minting in L2.
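As a rough illustration of how these two contracts pair up, the toy Python model below mimics the described flow: the L2 buddy contract mints points under an application-specific rule and burns them on withdrawal, while the L1 contract mints only when its authorized L2 counterpart requests it. The class and method names here are ours for illustration; the real implementation consists of Solidity contracts plus the EthBridge.

```python
# Toy model of the hybrid L1/L2 token flow described above (illustrative only).

class L1PointsContract:
    """ERC-20-style contract on Ethereum: mints/burns only at the request of
    its authorized L2 buddy contract (deployed at the same address on L2)."""
    def __init__(self):
        self.balances = {}

    def mint_from_l2(self, user, amount):
        self.balances[user] = self.balances.get(user, 0) + amount

    def deposit_to_l2(self, user, amount, l2_contract):
        assert self.balances.get(user, 0) >= amount, "insufficient L1 balance"
        self.balances[user] -= amount            # burn on L1 ...
        l2_contract.mint(user, amount)           # ... and mint on L2

class L2PointsContract:
    """Buddy contract on the Arbitrum chain: mints per application-specific
    rules and burns when a user withdraws back to L1."""
    def __init__(self):
        self.balances = {}

    def mint(self, user, amount):                # e.g. Reddit's signature/claim logic
        self.balances[user] = self.balances.get(user, 0) + amount

    def withdraw_to_l1(self, user, amount, l1_contract):
        assert self.balances.get(user, 0) >= amount, "insufficient L2 balance"
        self.balances[user] -= amount            # burn on L2 ...
        l1_contract.mint_from_l2(user, amount)   # ... and mint on L1

# Points minted on L2 can later be withdrawn and recognized by the L1 contract:
l1, l2 = L1PointsContract(), L2PointsContract()
l2.mint("alice", 100)
l2.withdraw_to_l1("alice", 40, l1)
print(l2.balances["alice"], l1.balances["alice"])  # 60 40
```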
Batch minting. What's better than a mint cookie? A whole batch! In addition to supporting Reddit’s current minting/claiming scheme, we built a second minting design, which we believe outperforms the signature/claim system in many scenarios.
In the current system, Reddit periodically issues signed statements to users, who then take those statements to the blockchain to claim their tokens. An alternative approach would have Reddit directly submit the list of users/amounts to the blockchain and distribute the tokens to the users without the signature/claim process.
To optimize the cost efficiency of this approach, we designed an application-specific compression scheme to minimize the size of the batch distribution list. We analyzed the data from Reddit's previous distributions and found that the data is highly compressible since token amounts are small and repeated, and addresses appear multiple times. Our function groups transactions by size, and replaces previously-seen addresses with a shorter index value. We wrote client code to compress the data, wrote a Solidity decompressing function, and integrated that function into Reddit’s contract running on Arbitrum.
When we ran the compression function on the previous Reddit distribution data, we found that we could compress batched minting data down to 11.8 bytes per minting event (averaged over a 6-month trace of Reddit's historical token grants), compared with roughly 174 bytes of on-chain data needed for the signature/claim approach to minting (roughly 43 for an RLP-encoded null transaction + 65 for Reddit's signature + 65 for the user's signature + roughly 8 for the number of Points).
The relative benefit of the two approaches with respect to on-chain call data cost depends on the percentage of users that will actually claim their tokens on chain. With the above figures, batch minting will be cheaper if roughly 5% of users redeem their claims. We stress that our compression scheme is not Arbitrum-specific and would be beneficial in any general-purpose smart contract platform.
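The sketch below illustrates the two compression ideas mentioned above: grouping mint events by amount and replacing repeated addresses with short indexes into an address table. It is our own illustrative Python, not the actual client-side encoder or the Solidity decompressor, and it does not attempt to reproduce the measured 11.8-byte figure.

```python
# Illustrative sketch of the batch-minting compression ideas described above.

def compress_batch(mints, known_addresses):
    """mints: list of (address, amount) pairs.
    known_addresses: previously-seen addresses in a canonical order, so a
    short index can stand in for a full 20-byte address."""
    index_of = {addr: i for i, addr in enumerate(known_addresses)}
    groups = {}                                   # amount -> encoded recipients
    for addr, amount in mints:
        if addr in index_of:
            encoded = ("idx", index_of[addr])     # a few bytes instead of 20
        else:
            encoded = ("addr", addr)              # full address, first occurrence
            index_of[addr] = len(index_of)        # later occurrences use the index
        groups.setdefault(amount, []).append(encoded)
    # Each repeated amount is written once per group rather than once per user.
    return list(groups.items())

batch = [("0xAaa1", 5), ("0xBbb2", 5), ("0xAaa1", 12)]
print(compress_batch(batch, known_addresses=["0xAaa1"]))
# [(5, [('idx', 0), ('addr', '0xBbb2')]), (12, [('idx', 0)])]
```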
8. Benchmarks and costs
In this section, we give the full costs of operating the Reddit contracts on an Arbitrum Rollup chain including the L1 gas costs for the Rollup chain, the costs of computation and storage for the L2 validators as well as the capital lockup requirements for staking.
Arbitrum Rollup is still on testnet, so we did not run mainnet benchmarks. Instead, we measured the L1 gas cost and L2 workload for Reddit operations on Arbitrum and calculated the total cost assuming current Ethereum gas prices. As noted below in detail, our measurements do not assume that Arbitrum is consuming the entire capacity of Ethereum. We will present the details of our model now, but for full transparency you can also play around with it yourself and adjust the parameters, by copying the spreadsheet found here.
Our cost model is based on measurements of Reddit’s contracts, running unmodified (except for the addition of a batch minting function) on Arbitrum Rollup on top of Ethereum.
On the distribution of transactions and frequency of assertions. Reddit's instructions specify the following minimum parameters that submissions should support:
Over a 5 day period, your scaling PoC should be able to handle:
  • 100,000 point claims (minting & distributing points)
  • 25,000 subscriptions
  • 75,000 one-off points burning
  • 100,000 transfers
We provide the full costs of operating an Arbitrum Rollup chain with this usage under the assumption that tokens are minted or granted to users in batches, but other transactions are uniformly distributed over the 5 day period. Unlike some other submissions, we do not make unrealistic assumptions that all operations can be submitted in enormous batches. We assume that batch minting is done in batches that use only a few percent of an L1 block's gas, and that other operations come in evenly over time and are submitted in batches, with one batch every five minutes to keep latency reasonable. (Users are probably already waiting for L1 finality, which takes at least that long to achieve.)
We note that assuming that there are only 300,000 transactions that arrive uniformly over the 5 day period will make our benchmark numbers lower, but we believe that this will reflect the true cost of running the system. To see why, say that batches are submitted every five minutes (20 L1 blocks) and there's a fixed overhead of c bytes of calldata per batch, the cost of which will get amortized over all transactions executed in that batch. Assume that each individual transaction adds a marginal cost of t. Lastly assume the capacity of the scaling system is high enough that it can support all of Reddit's 300,000 transactions within a single 20-block batch (i.e. that there is more than c + 300,000*t bytes of calldata available in 20 blocks).
Consider what happens if c, the per-batch overhead, is large (which it is in some systems, but not in Arbitrum). In the scenario that transactions actually arrive at the system's capacity and each batch is full, then c gets amortized over 300,000 transactions. But if we assume that the system is not running at capacity--and only receives 300,000 transactions arriving uniformly over 5 days-- then each 20-block assertion will contain about 200 transactions, and thus each transaction will pay a nontrivial cost due to c.
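Here is a quick back-of-the-envelope version of that argument in Python. The per-batch overhead c and the marginal per-transaction size t are placeholder values; the point is only to show how c is amortized over the transactions in each batch.

```python
# Back-of-the-envelope check of the batching argument above.
# c and t are placeholder byte counts, not measured Arbitrum figures.

def per_tx_calldata(total_txs, days, batch_interval_min, c, t):
    batches = days * 24 * 60 / batch_interval_min
    txs_per_batch = total_txs / batches
    return txs_per_batch, t + c / txs_per_batch

# Reddit's 300,000 transactions arriving uniformly over 5 days,
# batched every five minutes (about 20 L1 blocks per batch):
txs_per_batch, cost = per_tx_calldata(300_000, 5, 5, c=500, t=12)
print(round(txs_per_batch), round(cost, 1))  # ~208 txs per batch; c adds only ~2.4 bytes/tx
# If c were large, as it is in some other systems, that overhead term would
# dominate unless every batch ran at full capacity.
```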
We are aware that other proposals presented scaling numbers assuming that 300,000 transactions arrived at maximum capacity and were executed in a single mega-transaction, but according to our estimates, for at least one such report, this led to a reported gas price that was 2-3 orders of magnitude lower than it would have been assuming uniform arrival. We make more realistic batching assumptions, and we believe Arbitrum compares well when batch sizes are realistic.
Our model. Our cost model includes several sources of cost:
  • L1 gas costs: This is the cost of posting transactions as calldata on the L1 chain, as well as the overhead associated with each batch of transactions, and the L1 cost of settling transactions in the Arbitrum protocol.
  • Validator’s staking costs: In normal operation, one validator will need to be staked. The stake is assumed to be 0.2% of the total value of the chain (which is assumed to be $1 per user who is eligible to claim points). The cost of staking is the interest that could be earned on the money if it were not staked; a small numeric sketch of this line item appears after this list.
  • Validator computation and storage: Every validator must do computation to track the chain’s processing of transactions, and must maintain storage to keep track of the contracts’ EVM storage. The cost of computation and storage are estimated based on measurements, with the dollar cost of resources based on Amazon Web Services pricing.
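A minimal sketch of the staking line item above; the eligible-user count and the interest rate are illustrative assumptions rather than figures from our model.

```python
# Sketch of the validator staking cost: 0.2% of total chain value is staked,
# and the cost is the interest foregone on that capital.
# eligible_users and annual_interest_rate below are assumptions for illustration.

def annual_staking_cost(eligible_users, value_per_user=1.0,
                        stake_fraction=0.002,        # 0.2% of total chain value
                        annual_interest_rate=0.05):  # assumed opportunity cost
    chain_value = eligible_users * value_per_user    # $1 per eligible user
    stake = chain_value * stake_fraction
    return stake, stake * annual_interest_rate

stake, cost = annual_staking_cost(eligible_users=20_000_000)
print(stake, cost)  # 40000.0 staked, 2000.0 per year in foregone interest
```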
It’s clear from our modeling that the predominant cost is for L1 calldata. This will probably be true for any plausible rollup-based system.
Our model also shows that Arbitrum can scale to workloads much larger than Reddit’s nominal workload, without exhausting L1 or L2 resources. The scaling bottleneck will ultimately be calldata on the L1 chain. We believe that cost could be reduced substantially if necessary by clever encoding of data. (In our design any compression / decompression of L2 transaction calldata would be done by client software and L2 programs, never by an L1 contract.)
9. Status of Arbitrum Rollup
Arbitrum Rollup is live on the Ethereum testnet. All of the code written to date, including everything in the Reddit demo, is open source and permissively licensed under the Apache V2 license. The first testnet version of Arbitrum Rollup was released on testnet in February. Our current internal version, which we used to benchmark the Reddit contracts, will be released soon and will be a major upgrade.
Both the Arbitrum design and the implementation have been heavily audited by independent third parties. The Arbitrum academic paper was published at USENIX Security, a top-tier peer-reviewed academic venue. For the Arbitrum software, we have engaged Trail of Bits for a security audit, which is currently ongoing, and we are committed to having a clean report before launching on Ethereum mainnet.
10. Reddit Universe Arbitrum Rollup Chain
The benchmarks described in this document were all measured using the latest internal build of our software. When we release the new software upgrade publicly we will launch a Reddit Universe Arbitrum Rollup chain as a public demo, which will contain the Reddit contracts as well as a Uniswap instance and a Connext Hub, demonstrating how Community Points can be integrated into third party apps. We will also allow members of the public to dynamically launch ecosystem contracts. We at Offchain Labs will cover the validating costs for the Reddit Universe public demo.
If the folks at Reddit would like to evaluate our software prior to our public demo, please email us at [email protected] and we'd be more than happy to provide early access.
11. Even more scaling: Arbitrum Sidechains
Rollups are an excellent approach to scaling, and we are excited about Arbitrum Rollup which far surpasses Reddit's scaling needs. But looking forward to Reddit's eventual goal of supporting hundreds of millions of users, there will likely come a time when Reddit needs more scaling than any Rollup protocol can provide.
While Rollups greatly reduce costs, they don't break the linear barrier. That is, all transactions have an on-chain footprint (because all calldata must be posted on-chain), albeit a far smaller one than on native Ethereum, and the L1 limitations end up being the bottleneck for capacity and cost. Since Ethereum has limited capacity, this linear use of on-chain resources means that costs will eventually increase superlinearly with traffic.
The good news is that we at Offchain Labs have a solution in our roadmap that can satisfy this extreme-scaling setting as well: Arbitrum AnyTrust Sidechains. Arbitrum Sidechains are similar to Arbitrum Rollup, but deviate in that they name a permissioned set of validators. When a chain’s validators agree off-chain, they can greatly reduce the on-chain footprint of the protocol and require almost no data to be put on-chain. When validators can't reach unanimous agreement off-chain, the protocol reverts to Arbitrum Rollup. Technically, Arbitrum Sidechains can be viewed as a hybrid between state channels and Rollup, switching back and forth as necessary, and combining the performance and cost that state channels can achieve in the optimistic case, with the robustness of Rollup in other cases. The core technical challenge is how to switch seamlessly between modes and how to guarantee that security is maintained throughout.
Arbitrum Sidechains break through this linear barrier, while still maintaining a high level of security and decentralization. Arbitrum Sidechains provide the AnyTrust guarantee, which says that as long as any one validator is honest and available (even if you don't know which one will be), the L2 chain is guaranteed to execute correctly according to its code and guaranteed to make progress. Unlike in a state channel, offchain progress does not require unanimous consent, and liveness is preserved as long as there is a single honest validator.
Note that the trust model for Arbitrum Sidechains is much stronger than for typical BFT-style chains, which introduce a consensus "voting" protocol among a small permissioned group of validators. BFT-based protocols require a supermajority (more than 2/3) of validators to agree. In Arbitrum Sidechains, by contrast, all you need is a single honest validator to achieve guaranteed correctness and progress. Notice that in Arbitrum adding validators strictly increases security since the AnyTrust guarantee provides correctness as long as any one validator is honest and available. By contrast, in BFT-style protocols, adding nodes can be dangerous, as a coalition of dishonest nodes can break the protocol.
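A small sketch of the fault-tolerance contrast described above, assuming n validators: the AnyTrust guarantee holds as long as at least one validator is honest and available, while a protocol requiring a supermajority of more than 2/3 tolerates fewer than one third faulty validators. This is illustrative arithmetic only, not Arbitrum or BFT code.

```python
# Maximum number of faulty (dishonest or offline) validators tolerated,
# per the comparison above. Illustrative arithmetic only.

def anytrust_fault_tolerance(n: int) -> int:
    """AnyTrust: correctness and progress require only one honest, available validator."""
    return n - 1

def bft_fault_tolerance(n: int) -> int:
    """BFT with a >2/3 supermajority requirement tolerates fewer than n/3 faults."""
    return (n - 1) // 3

for n in (4, 10, 100):
    print(f"n={n:>3}: AnyTrust tolerates {anytrust_fault_tolerance(n):>2} faults, "
          f"BFT tolerates {bft_fault_tolerance(n):>2}")
```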
Like Arbitrum Rollup, the developer and user experiences for Arbitrum Sidechains will be identical to those of Ethereum. Reddit would be able to choose a large and diverse set of validators, and the only guarantee needed to break through the scaling barrier is that at least one of them remains honest.
We hope to have Arbitrum Sidechains in production in early 2021, and thus when Reddit reaches the scale that surpasses the capacity of Rollups, Arbitrum Sidechains will be waiting and ready to help.
While the idea to switch between channels and Rollup to get the best of both worlds is conceptually simple, getting the details right and making sure that the switch does not introduce any attack vectors is highly non-trivial and has been the subject of years of our research (indeed, we were working on this design for years before the term Rollup was even coined).
12. How Arbitrum compares
We include a comparison to several other categories, as well as specific projects where appropriate, and explain why we believe that Arbitrum is best suited for Reddit's purposes. We focus our attention on other Ethereum projects.
Payment only Rollups. Compared to Arbitrum Rollup, ZK-Rollups and other Rollups that only support token transfers have several disadvantages:
  • As outlined throughout the proposal, we believe that the entire draw of Ethereum is its rich smart contract support, which is simply not achievable with today's zero-knowledge proof technology. Indeed, scaling with a ZK-Rollup will add friction to the deployment of smart contracts that interact with Community Points, as users will have to withdraw their coins from the ZK-Rollup and transfer them to a smart contract system (like Arbitrum). The community will be best served if Reddit builds on a platform that has built-in, frictionless smart-contract support.
  • All other Rollup protocols of which we are aware employ a centralized operator. While it's true that users retain custody of their coins, the centralized operator can often profit from censoring, reordering, or delaying transactions. A common misconception is that since they're non-custodial protocols, a centralized sequencer does not pose a risk, but this is incorrect: the sequencer can wreak havoc or shake down users for side payments without directly stealing funds.
  • Sidechain-type protocols can eliminate some of these issues, but they are not trustless. Instead, they require trust in some quorum of a committee, often requiring two-thirds of the committee to be honest, compared to rollup protocols like Arbitrum that require only a single honest party. In addition, not all sidechain-type protocols have committees that are diverse, or even non-centralized, in practice.
  • Plasma-style protocols have a centralized operator and do not support general smart contracts.
13. Concluding Remarks
While it's ultimately up to the judges’ palate, we believe that Arbitrum Rollup is the bakeoff choice that Reddit kneads. We far surpass Reddit's specified workload requirement at present, have much room to optimize Arbitrum Rollup in the near term, and have a clear path to get Reddit to hundreds of millions of users. Furthermore, we are the only project that gives developers and users the identical interface as the Ethereum blockchain and is fully interoperable and tooling-compatible, and we do this all without any new trust assumptions or centralized components.
But no matter how the cookie crumbles, we're glad to have participated in this bake-off and we thank you for your consideration.
About Offchain Labs
Offchain Labs, Inc. is a venture-funded New York company that spun out of Princeton University research, and is building the Arbitrum platform to usher in the next generation of scalable, interoperable, and compatible smart contracts. Offchain Labs is backed by Pantera Capital, Compound VC, Coinbase Ventures, and others.
Leadership Team
Ed Felten
Ed Felten is Co-founder and Chief Scientist at Offchain Labs. He is on leave from Princeton University, where he is the Robert E. Kahn Professor of Computer Science and Public Affairs. From 2015 to 2017 he served at the White House as Deputy United States Chief Technology Officer and senior advisor to the President. He is an ACM Fellow and member of the National Academy of Engineering. Outside of work, he is an avid runner, cook, and L.A. Dodgers fan.
Steven Goldfeder
Steven Goldfeder is Co-founder and Chief Executive Officer at Offchain Labs. He holds a PhD from Princeton University, where he worked at the intersection of cryptography and cryptocurrencies including threshold cryptography, zero-knowledge proof systems, and post-quantum signatures. He is a co-author of Bitcoin and Cryptocurrency Technologies, the leading textbook on cryptocurrencies, and he has previously worked at Google and Microsoft Research, where he co-invented the Picnic signature algorithm. When not working, you can find Steven spending time with his family, taking a nature walk, or twisting balloons.
Harry Kalodner
Harry Kalodner is Co-founder and Chief Technology Officer at Offchain Labs where he leads the engineering team. Before the company he attended Princeton as a Ph.D candidate where his research explored economics, anonymity, and incentive compatibility of cryptocurrencies, and he also has worked at Apple. When not up at 3:00am writing code, Harry occasionally sleeps.
submitted by hkalodner to ethereum [link] [comments]

Blockchain Oracles: Connecting The Worlds. Part 1

Blockchain Oracles: Connecting The Worlds. Part 1
Many blockchain-based cryptocurrencies, and most importantly Bitcoin and Ether, have become notorious for transaction failures, primarily due to scalability issues. For plain-vanilla Bitcoin and its copycats like Litecoin, or its clones like Bitcoin Cash, that is pretty much the whole story. But with blockchains like Ethereum, which expand into the far reaches of a new territory known as smart contracts and decentralized applications (dApps), this is only the tip of the iceberg. In other words, only a beginning.
by StealthEX
What hides below the surface, and thus rarely emerges in public discussions about cryptocurrencies, is the closed-off nature of smart contract-enabled blockchains. To be even slightly useful, smart contracts and the dApps running on top of them must have access to real-world data, which lives almost entirely off-chain, while they are permanently stuck within the constraints of their tiny on-chain world. So how do they escape the straitjackets put on them by their restrictive environments? That’s where blockchain oracles come into play.

And what role do they play, exactly?

Enabling smart contracts and dApps to interact with the outside world opens both endless possibilities and a big can of worms. Now that the entire world is made available to a smart contract, it can take an input from an external source of information, make some calculations that require this data or arrive at a decision based on it, and then get down to some work like moving contractually-locked funds from Alice to Bob. Or from Bob to Alice, depending on the verdict. Sounds cool, huh?
But here’s the catch. As transactions on a smart contract blockchain are supposed to be irreversible (while the blockchain itself is immutable), it can lead to catastrophic consequences if the input has been tampered with or just happens to be incorrect for some arbitrary reason, not necessarily malicious in intent. This fundamental problem of internalizing the outside world for on-chain execution of contractual agreements on smart contract blockchains has become most apparent with the advent of Decentralized Finance (or simply DeFi) a few years ago.
DeFi is a promising newcomer in the blockchain arena. It hinges on the idea of decentralizing most financial services that we use today, but without a bank or other financial institution in the middle. It is envisioned that with the help of smart contracts and dApps using them we will be able to lend money and borrow against collateralized digital assets, offer and receive banking services including mortgages and insurance, buy and sell digital assets safely on decentralized marketplaces, as well as issue stablecoins and user tokens. Pretty impressive list, isn’t it?
However, for all of this to work properly we need trustless and reliable sources of information outside the blockchain that provide inputs to dApps running on that blockchain. DeFi requires trustless data feeds about the state of the world to ensure correct on-chain execution of smart contracts powering dApps. But how do we get these and manage to retrieve external data that cannot be verified through cryptography but that we can still trust and rely on? Entities that provide off-chain data for on-chain consumption are called blockchain oracles. Technically, an oracle is an interface through which a smart contract queries and retrieves information from an external source of truth.
As it turns out, DeFi is not the only field of application where blockchain oracles turn up quite handy, but since their use is most indispensable there, it makes sense to delve deeper into this area.

Oracles of DeFi

Today, the space is crowded with a plethora of players that aim to provide DeFi with much-needed real-time market information, for example, digital asset prices. There are many forms of oracles, but the most important distinction is drawn between centralized and decentralized ones. As DeFi is supposed to be a trustless, decentralized environment, decentralized oracles are the flesh and blood of this ecosystem, so we are mostly concerned here with this type of blockchain oracle (just in case, there are centralized oracles too).
DeFi platforms deploy various oracle solutions in their pursuit of retrieving real-time information about the market price of digital assets. The best known among these is the MakerDAO lending platform, which uses an oracle module called the Medianizer to obtain real-time exchange prices. Technically, it is a smart contract that accepts price updates from independent data feeds, discards false ones along with outliers, and calculates the median price (hence the name) to be used as a reference by other smart contracts.
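As a rough, off-chain sketch of the aggregation idea described here (accept independent feeds, drop obvious outliers, take the median), and not the actual Medianizer contract code:

```python
# Simplified illustration of median-style price aggregation, in the spirit of
# the Medianizer described above. This is not MakerDAO's actual contract logic.
from statistics import median

def aggregate_price(feed_prices, max_deviation=0.05):
    """Drop feeds that stray too far from the raw median, then take the median again."""
    if not feed_prices:
        raise ValueError("no price feeds supplied")
    raw = median(feed_prices)
    kept = [p for p in feed_prices if abs(p - raw) / raw <= max_deviation]
    return median(kept)

# Hypothetical ETH/USD reports from five independent feeds (one of them bogus).
feeds = [350.2, 351.0, 349.8, 350.5, 9999.0]
print(aggregate_price(feeds))   # -> 350.35, with the outlier discarded
```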
Another blockchain-based borrowing and lending platform, Compound, uses administrators who are holders of the platform’s native COMP token. They manage and control price feeds through aggregator contracts they create. Authorized sources of information called reporters are then queried for reference prices by aggregators which verify the data and calculate median values to be used internally. A somewhat similar approach is utilized by AmpleForth, a developer of a stablecoin with elastic supply, and Synthetix, a platform for creating crypto-backed synthetic versions of assets like commodities, stocks, indices, cryptocurrencies, and fiat.
On the other hand, there are a few blockchain projects that provide decentralized oracle services to other platforms and blockchains, mostly Ethereum and EOS. Projects such as Provable (formerly Oraclize), ChainLink, Band Protocol, and Tellor aim to provide blockchain-agnostic protocols that allow querying and retrieval of virtually any type of reference data in a standardized manner. As the authenticity and veracity of the information retrieved are of crucial importance to its consumers, these projects run their own decentralized blockchains whose primary task is to validate data feeds and check that the information received is authentic and has not been tampered with.
Realistically, these special-purpose blockchains come closest to the implementation of a blockchain oracle idea in a truly decentralized and trustless way.

To be continued

In the second part of this two-part article we will look into other uses of blockchain oracles beyond DeFi, and talk about potential problems and pitfalls of this nascent technology, as well as approaches used to deal with them. Stay with us and remain in the know!
And remember, if you need to exchange your coins, StealthEX is here for you. We provide a selection of more than 250 coins and are constantly updating the list so that our customers will find a suitable option. Our service does not require registration and allows you to remain anonymous. Why don’t you check it out? Just go to StealthEX and follow these easy steps:
✔ Choose the pair and the amount for your exchange. For example COMP to DAI.
✔ Press the “Start exchange” button.
✔ Provide the recipient address to which the coins will be transferred.
✔ Move your cryptocurrency for the exchange.
✔ Receive your coins.
Follow us on Medium, Twitter, Facebook, and Reddit to get StealthEX.io updates and the latest news about the crypto world. For all requests message us via [[email protected]](mailto:[email protected]).
The views and opinions expressed here are solely those of the author. Every investment and trading move involves risk. You should conduct your own research when making a decision.
Original article was posted on https://stealthex.io/blog/2020/08/12/blockchain-oracles-connecting-the-worlds-part-1/
submitted by Stealthex_io to StealthEX [link] [comments]

Polkadot Launch AMA Recap

Polkadot Launch AMA Recap

The Polkadot Telegram AMA below took place on June 10, 2020

https://preview.redd.it/4ti681okap951.png?width=4920&format=png&auto=webp&s=e21f6a9a276d35bb9cdec59f46744f23c37966ef
AMA featured:
Dieter Fishbein, Ecosystem Development Lead, Web3 Foundation
Logan Saether, Technical Education, Web3 Foundation
Will Pankiewicz, Master of Validators, Parity Technologies
Moderated by Dan Reecer, Community and Growth, Polkadot & Kusama at Web3 Foundation

Transcription compiled by Theresa Boettger, Polkadot Ambassador:

Dieter Fishbein, Ecosystem Development Lead, Web3 Foundation

Dan: Hey everyone, thanks for joining us for the Polkadot Launch AMA. We have Dieter Fishbein (Head of Ecosystem Development, our business development team), Logan Saether (Technical Education), and Will Pankiewicz (Master of Validators) joining us today.
We had some great questions submitted in advance, and we’ll start by answering those and learning a bit about each of our guests. After we go through the pre-submitted questions, then we’ll open up the chat to live Q&A and the hosts will answer as many questions as they can.
We’ll start off with Dieter and ask him a set of some business-related questions.

Dieter could you introduce yourself, your background, and your role within the Polkadot ecosystem?

Dieter: I got my start in the space as a cryptography researcher at the University of Waterloo. This is where I first learned about Bitcoin and started following the space. I spent the next four years or so on the investment team for a large asset manager where I primarily focused on emerging markets. In 2017 I decided to take the plunge and join the space full-time. I worked at a small blockchain-focused VC fund and then joined the Polkadot team just over a year ago. My role at Polkadot is mainly focused on ensuring there is a vibrant community of projects building on our technology.

Q: Adoption is one of the important factors that all projects need to focus on to become more attractive to the industry. So, what is Polkadot's plan to gain more adoption?

A (Dieter): Polkadot is fundamentally a developer-focused product so much of our adoption strategy is focused around making Polkadot an attractive product for developers. This has many elements. Right now the path for most developers to build on Polkadot is by creating a blockchain using the Substrate framework which they will later connect to Polkadot when parachains are enabled. This means that much of our adoption strategy comes down to making Substrate an attractive tool and framework. However, it’s not just enough to make building on Substrate attractive, we must also provide an incentive to these developers to actually connect their Substrate-based chain to Polkadot. Part of this incentive is the security that the Polkadot relay chain provides but another key incentive is becoming interoperable with a rich ecosystem of other projects that connect to Polkadot. This means that a key part of our adoption strategy is outreach focused. We go out there and try to convince the best projects in the space that building on our technology will provide them with significant value-add. This is not a purely technical argument. We provide significant support to projects building in our ecosystem through grants, technical support, incubator/accelerator programs and other structured support programs such as the Substrate Builders Program (https://www.substrate.io/builders-program). I do think we really stand out in the significant, continued support that we provide to builders in our ecosystem. You can also take a look at the over 100 Grants that we’ve given from the Web3 Foundation: https://medium.com/web3foundation/web3-foundation-grants-program-reaches-100-projects-milestone-8fd2a775fd6b

Q: On moving forward through your roadmap, what are your most important next priorities? Does the Polkadot team have enough fundamentals (Funds, Community, etc.) to achieve those milestones?

A (Dieter): I would say the top priority by far is to ensure a smooth roll-out of key Polkadot features such as parachains, XCMP and other key parts of the protocol. Our recent Proof of Authority network launch was only just the beginning, it’s crucial that we carefully and successfully deploy features that allow builders to build meaningful technology. Second to that, we want to promote adoption by making more teams aware of Polkadot and how they can leverage it to build their product. Part of this comes down to the outreach that I discussed before but a major part of it is much more community-driven and many members of the team focus on this.
We are also blessed to have an awesome community to make this process easier 🙂

Q: Where can a list of Polkadot's application-specific chains can be found?

A (Dieter): The best list right now is http://www.polkaproject.com/. This is a community-led effort and the team behind it has done a terrific job. We’re also working on providing our own resource for this and we’ll share that with the community when it’s ready.

Q: Could you explain the differences and similarities between Kusama and Polkadot?

A (Dieter): Kusama is fundamentally a less robust, faster-moving version of Polkadot with less economic backing by validators. It is less robust since we will be deploying new technology to Kusama before Polkadot so it may break more frequently. It has less economic backing than Polkadot, so a network takeover is easier on Kusama than on Polkadot, lending itself more to use cases without the need for bank-like security.
In exchange for lower security and robustness, we expect the cost of a parachain lease to be lower on Kusama than on Polkadot. Polkadot will always be 100% focused on security and robustness, and I expect that applications that deal with high-value transactions, such as those in the DeFi space, will always want a Polkadot deployment. But I think there will be a market for applications that are willing to accept lower security and robustness in exchange for cheap, high throughput, such as those in the gaming, content distribution or social networking sectors. Check out - https://polkadot.network/kusama-polkadot-comparing-the-cousins/ for more detailed info!

Q: and for what reasons would a developer choose one over the other?

A (Dieter): Firstly, I see some earlier-stage teams who are still iterating on their technology choosing to deploy to Kusama exclusively because of its lower-stakes, faster-moving environment, where it will be easier for them to iterate on their technology and build their user base. These will likely come from the sectors I identified earlier. For these teams, Polkadot becomes an eventual upgrade path if, and when, they are able to perfect their product, build a larger community of users and start to need the increased stability and security that Polkadot will provide.
Secondly, I suspect many teams who have their main deployment on Polkadot will also have an additional deployment on Kusama to allow them to test new features, either their tech or changes to the network, before these are deployed to Polkadot mainnet.

Logan Saether, Technical Education, Web3 Foundation

Q: Sweet, let's move over to Logan. Logan - could you introduce yourself, your background, and your role within the Polkadot ecosystem?

A (Logan): My initial involvement in the industry was as a smart contract engineer. During this time I worked on a few projects, including a reboot of the Ethereum Alarm Clock project originally by Piper Merriam. However, I had some frustrations at the time with the limitations of the EVM environment and began to look at other tools which could help me build the projects that I envisioned. This led to me looking at Substrate and completing a bounty for Web3 Foundation, after which I applied and joined the Technical Education team. My responsibilities at the Technical Education team include maintaining the Polkadot Wiki as a source of truth on the Polkadot ecosystem, creating example applications, writing technical documentation, giving talks and workshops, as well as helping initiatives such as the Thousand Validator Programme.

Q: The first technical question submitted for you was: "When will an official Polkadot mobile wallet appear?"

A (Logan): There is already an “official” wallet from Parity Technologies called the Parity Signer. Parity Signer allows you to keep your private keys on an air-gapped mobile device and to interactively sign messages using web interfaces such as Polkadot JS Apps. If you’re looking for something that is more of an interface to the blockchain as well as a wallet, you might be interested in PolkaWallet which is a community team that is building a full mobile interface for Polkadot.
For more information on Parity Signer check out the website: https://www.parity.io/signe

Q: Great thanks...our next question is: If someone already developed an application to run on Ethereum, but wants the interoperability that Polkadot will offer, are there any advantages to rebuilding with Substrate to run as a parachain on the Polkadot network instead of just keeping it on Ethereum and using the Ethereum bridge for use with Polkadot?

A (Logan): Yes, the advantage you would get from building on Substrate is more control over how your application will interact with the greater Polkadot ecosystem, as well as a larger design canvas for future iterations of your application.
Using an Ethereum bridge will probably have more cross chain latency than using a Polkadot parachain directly. The reason for this is due to the nature of Ethereum’s separate consensus protocol from Polkadot. For parachains, messages can be sent to be included in the next block with guarantees that they will be delivered. On bridged chains, your application will need to go through more routes in order to execute on the desired destination. It must first route from your application on Ethereum to the Ethereum bridge parachain, and afterward dispatch the XCMP message from the Polkadot side of the parachain. In other words, an application on Ethereum would first need to cross the bridge then send a message, while an application as a parachain would only need to send the message without needing to route across an external bridge.

Q: DOT transfers won't go live until Web3 removes the Sudo module and token holders approve the proposal to unlock them. But when will staking rewards start to be distributed? Will that have to wait until after token transfers unlock? Or will accounts be able to accumulate rewards (still locked) once the network transitions to NPoS?

A (Logan): Staking rewards will be distributed starting with the transition to NPoS. Transfers will still be locked during the beginning of this phase, but reward payments are technically different from the normal transfer mechanism. You can read more about the launch process and steps at http://polkadot.network/launch-roadmap

Q: Next question is: I'm interested in how Cumulus/parachain development is going. ETA for when we will see the first parachain registered working on Kusama or some other public testnet like Westend maybe?

A (Logan): Parachains and Cumulus are a current high-priority development objective of the Parity team. There have already been PoC parachains running with Cumulus on local testnets for months. The current work now is making the availability and validity subprotocols production-ready in the Polkadot client. The best way to stay up to date would be to follow the project boards on GitHub that have delineated all of the tasks that should be done. Ideally, we can start seeing parachains on Westend soon, with the first real parachains being deployed on Kusama thereafter.
The projects board can be viewed here: https://github.com/paritytech/polkadot/projects
Dan: Also...check out Basti's tweet from yesterday on the Cumulus topic: https://twitter.com/bkchstatus/1270479898696695808?s=20

Q: In what ways does Polkadot support smart contracts?

A (Logan): The philosophy behind the Polkadot Relay Chain is to be as minimal as possible, but allow arbitrary logic at the edges in the parachains. For this reason, Polkadot does not support smart contracts natively on the Relay Chain. However, it will support smart contracts on parachains. There are already a couple major initiatives out there. One initiative is to allow EVM contracts to be deployed on parachains, this includes the Substrate EVM module, Parity’s Frontier, and projects such as Moonbeam. Another initiative is to create a completely new smart contract stack that is native to Substrate. This includes the Substrate Contracts pallet, and the ink! DSL for writing smart contracts.
Learn more about Substrate's compatibility layer with Ethereum smart contracts here: https://github.com/paritytech/frontier

Will Pankiewicz, Master of Validators, Parity Technologies


Q: (Dan) Thanks for all the answers. Now we’ll start going through some staking questions with Will related to validating and nominating on Polkadot. Will - could you introduce yourself, your background, and your role within the Polkadot ecosystem?

A (Will): Sure thing. Like many others, Bitcoin drew me in back in 2013, but it wasn't until Ethereum came along that I took the deep dive into working in the space full time. It was the financial infrastructure aspects of cryptocurrencies I was initially interested in, and I first worked on dexes, algorithmic trading, and crypto funds. I really liked the idea of "Generalized Mining" that CoinFund came up with, and started to explore the wacky ways that crypto funds and others can both support ecosystems and be self-sustaining at the same time. This drew me to a lot of interesting experiments in what later became DeFi, as well as running validators on Proof of Stake networks. My role in the Polkadot ecosystem as “Master of Validators” is ensuring the needs of our validator community get met.

Q: Cool thanks. Our first community question was "Is it still more profitable to nominate the validators with lesser stake?"

A (Will): It depends on their commission, but generally yes, it is more profitable to nominate validators with lesser stake. When validators have less stake, nominating them makes your nomination a higher percentage of their total stake. This means that when rewards get distributed, they will be split more favorably toward you, as rewards are split by total stake percentage. Our entire rewards scheme is that every era (6 hours in Kusama, 24 hours in Polkadot), a certain amount of rewards gets distributed, where that amount of rewards depends on the total amount of tokens staked for the entire network (50% of all tokens staked is currently optimal). These rewards from the end of an era get distributed roughly equally to all validators active in the validator set. The reward given to each validator is then split between the validator and all their nominators, determined by the total stake that each entity contributes. So if you contribute a higher percentage of the total stake, you will earn more rewards.
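A minimal sketch of the split Will describes: an equal era reward per active validator, a commission taken off the top, and the remainder divided pro-rata by stake. The reward amount, commission and stake figures below are hypothetical.

```python
# Toy model of the nominator reward split described above. All figures hypothetical.

ERA_REWARD_PER_VALIDATOR = 100.0   # tokens paid to one active validator per era
COMMISSION = 0.03                  # validator's commission rate (assumed)

def nominator_reward(own_stake: float, validator_total_stake: float) -> float:
    """A nominator's share of one validator's era reward, after commission."""
    pool = ERA_REWARD_PER_VALIDATOR * (1 - COMMISSION)
    return pool * own_stake / validator_total_stake

# The same nomination behind a heavily staked vs. a lightly staked validator.
print(nominator_reward(own_stake=1_000, validator_total_stake=50_000))  # ~1.94
print(nominator_reward(own_stake=1_000, validator_total_stake=10_000))  # ~9.70
```

The second call pays roughly five times more for the same nomination, which is the effect behind the answer above.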

Q: What does priority ranking under nominator addresses mean? For example, what does it mean that nominator A has priority 1 and nominator B has priority 6?

A (Will): Priority ranking is just the index of the nomination that gets stored on chain. It has no effect on how stake gets distributed in Phragmen or how rewards get calculated. It is only the order in which the nominator chose their validators. The way that stake gets distributed from a nominator to validators is via Phragmen, which is an algorithm that will optimally put stake behind validators so that stake is spread roughly equally among those that will get into the validator set. It will try to maximize the total amount at stake in the network and maximize the stake behind minimally staked validators.

Q: On Polkadot.js, what does it mean when there are nodes waiting on Polkadot?

**A (Will):** In Polkadot there is a fixed validator set size that is determined by governance. The way validators get in the active set is by having the highest amount of total stake relative to other validators. So if the validator set size is 100, the top 100 validators by total stake will be in the validator set. Those not active in the validator set will be considered “waiting”.
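A toy illustration of that selection rule: rank candidates by total backing stake and take the top `set_size`; the rest are "waiting". The names, stakes and tiny set size are made up for the example.

```python
# Toy illustration of active vs. waiting validators, per the answer above.

SET_SIZE = 3  # governance-set validator set size (kept tiny for the example)

backing_stake = {            # candidate validator -> total backing stake (hypothetical)
    "alice": 12_000,
    "bob":    9_500,
    "carol": 15_250,
    "dave":   4_100,
    "erin":   8_900,
}

ranked = sorted(backing_stake, key=backing_stake.get, reverse=True)
active, waiting = ranked[:SET_SIZE], ranked[SET_SIZE:]
print("active: ", active)    # ['carol', 'alice', 'bob']
print("waiting:", waiting)   # ['erin', 'dave']
```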

Q: Another question...Is it necessary to become a waiting validator node right now?

A (Will): It's not necessary, but highly encouraged if you actively want to validate on Polkadot. The longer you are in the waiting tab, the more exposure you get to nominators who may nominate you.

Q: Will current validators for Kusama also validate for Polkadot? How strongly should I consider their history (with Kusama) when looking to nominate a good validator for DOTs?

A (Will): A lot of Kusama validators will also be validators for Polkadot, as KSM was initially distributed to DOT holders. The early Kusama Validators will also likely be the first Polkadot validators. Being a Kusama validator should be a strong indicator for who to nominate on Polkadot, as the chaos that has ensued with Kusama has allowed validators to battle test their infrastructure. Kusama validators by now are very familiar with tooling, block explorers, terminology, common errors, log formats, upgrades, backups, and other aspects of node operation. This gives them an edge against Polkadot validators that may be new to the ecosystem. You should strongly consider well known Kusama validators when making your choices as a nominator on Polkadot.

Q: Can you go into more details about the process for becoming a DOT validator? Is it similar as the KSM 1000 validators program?

A (Will): The process for becoming a DOT validator is first to have DOTs. You cannot be a validator without DOTs, as DOTs are used to pay transaction fees, and the minimum amount of DOTs you need is enough to create a validate transaction. After obtaining enough DOTs, you will need to set up your validator infrastructure. Ideally you should have a validator node with specs that match what we call standard hardware, as well as one or more sentry nodes to help isolate the validator node from attacks. After the infrastructure is up and running, you should have your Polkadot accounts set up correctly, with a stash bonded to a controller account, and then submit a validate transaction, which will tell the network your nodes are ready to be a part of the network. You should then try to build a community around your validator to let others know you are trustworthy so that they will nominate you. The 1000 Validators Programme for Kusama is a programme that gives a certain amount of nominations from the Web3 Foundation and Parity to help bootstrap a community and reputation for validators. There may eventually be a similar type of programme for Polkadot as well.
Dan: Thanks a lot for all the answers, Will. That’s the end of the pre-submitted questions and now we’ll open the chat up to live Q&A, and our three team members will get through as many of your questions as possible.
We will take questions related to business development, technology, validating, and staking. For those wondering about DOT:
DOT tokens do not exist yet. Allocations of Polkadot's native DOT token are technically and legally non-transferable. Hence any publicized sale of DOTs is unsanctioned by Web3 Foundation and possibly fraudulent. Any official public sale of DOTs will be announced on the Web3 Foundation website. Polkadot’s launch process started in May; with full network decentralization later this year, holders of DOT allocations will determine issuance and transferability. For those who participated in previous DOT sales, you can learn how to claim your DOTs here (https://wiki.polkadot.network/docs/en/claims).


Telegram Community Follow-up Questions Addressed Below


Q: Polkadot looks good but it confuses me that there are so many other Blockchain projects. What should I pay attention in Polkadot to give it the importance it deserves? What are your planning to achieve with your project?

A (Will): Personally, what I think differentiates it is the governance process. Coordinating forkless upgrades and social coordination helps stand it apart.
A (Dieter): The wiki is awesome - https://wiki.polkadot.network/

Q: Over 10,000 ETH was paid as a transaction fee; what if this happens on Polkadot? Is it possible we can go through governance to return it to the owner?

A: Anything is possible with governance including transaction reversals, if a network quorum is reached on a topic.
A (Logan): Polkadot transaction fees work differently than the fees on Ethereum so it's a bit more difficult to shoot yourself in the foot as the whale who sent this unfortunate transaction. See here for details on fees: https://w3f-research.readthedocs.io/en/latest/polkadot/Token%20Economics.html?highlight=transaction%20fees#relay-chain-transaction-fees-and-per-block-transaction-limits
However, there is a tip that the user can input themselves, which they could accidentally set to a large amount. In that case, yes, they could propose to governance that the amount paid as the tip be returned.

Q: What is the minimum ideal amount of DOT and KSM to have if you want to become a validator and how much technical knowledge do you need aside from following the docs?

A (Will): It depends on what the other validators in the ecosystem are staking, as well as the validator set size. You just need to be in the top staking amount of the validator set size. So if it's 100 validators, you need to be in the top 100 validators by stake.

Q: Will Web3 nominate validators? If yes, which criteria to be elected?

A (Will): Web 3 Foundation is running programs like the 1000 validators programme for Kusama. There's a possibility this will continue on for Polkadot as well after transfers are enabled. https://thousand-validators.kusama.network/#/
You will need to be an active validator to earn rewards. Only those active in the validator set earn rewards. I would recommend checking out parts of the wiki: https://wiki.polkadot.network/docs/en/maintain-guides-validator-payout

Q: Is it possible to implement hashtables or a DAG with Substrate?

A (Logan): Yes.

Q: Polkadot project looks very futuristic! But, could you tell us the main role of DOT Tokens in the Polkadot Ecosystem?

A (Dan): That's a good question. The short answer is Staking, Governance, Bonding. More here: http://polkadot.network/dot-token

Q: How did you manage to prove that the consensus protocol is safe and unbreakable mathematically?

A (Dieter): We have a research team of over a dozen scientists with PhDs and post-docs in cryptography and distributed computing who do thorough theoretical analyses of all the protocols used in Polkadot.

Q: What are the prospects for NFT?

A: Already being built 🙂

Q: What will be Polkadot next roadmap for 2020 ?

A (Dieter): Building. But seriously - we will continue to add many more features and upgrades to Polkadot as well as continue to strongly focus on adoption from other builders in the ecosystem 🙂
A (Will): https://polkadot.network/launch-roadmap/
This is the launch roadmap. Ideally adding parachains and xcmp towards the end of the year

Q: How Do you stay active in terms of marketing developments during this PANDEMIC? Because I'm sure you're very excited to promote more after this settles down.

A (Dan): The main impact of covid was the impact on in-person events. We have been very active on Crowdcast for webinars since 2019, so it was quite the smooth transition to all-online events. You can see our 40+ past event recordings and follow us on Crowdcast here: https://www.crowdcast.io/polkadot. If you're interested in following our emails for updates (including online events), subscribe here: https://info.polkadot.network/subscribe

Q: Hi, who do you think is your biggest competitor in the space?

A (Dan): Polkadot is a metaprotocol that hasn't been seen in the industry up until this point. We hope to elevate the industry by providing interoperability between all major public networks as well as private blockchains.

Q: Is Polkadot a friend or competitor of Ethereum?

A: Polkadot aims to elevate the whole blockchain space with serious advancements in interoperability, governance and beyond :)

Q: When will there be hardware wallet support?

A (Will): Parity Signer works well for now. Other hardware wallets will be added pretty soon

Q: What are the attractive features of the DOT project that can attract new users?

A: https://polkadot.network/what-is-polkadot-a-brief-introduction/
A (Will): Buidling parachains with cross chain messaging + bridges to other chains I think will be a very appealing feature for developers

Q: According to you how much time will it take for Polkadot to get into mainstream adoption and execute all the plans set for this project?

A: We are solving many problems that have held back the blockchain industry up until now. Here is a summary in basic terms:
https://preview.redd.it/ls7i0bpm8p951.png?width=752&format=png&auto=webp&s=a8eb7bf26eac964f6b9056aa91924685ff359536

Q: When will bitpie or imtoken support DOT?

A: We are working on integrations on all the biggest and best wallet providers. ;)

Q: What event/call can we track to catch a switch to nPOS? Is it only force_new_era call? Thanks.

A (Will): If you're on riot, useful channels to follow for updates like this are #polkabot:matrix.org and #polkadot-announcements:matrix.parity.io
A (Logan): Yes this is the trigger for initiating the switch to NPoS. You can also poll the ForceEra storage for when it changes to ForceNew.

Q: What strategy will the Polkadot Team use to make new users trust its platform and be part of it?

A (Will): Pushing bleeding edge cryptography from web 3 foundation research
A (Dan): https://t.me/PolkadotOfficial/43378

Q: What technology stands behind and What are its advantages?

A (Dieter): Check out https://polkadot.network/technology/ for more info on our tech stack!

Q: What problems do you see occurring in the blockchain industry nowadays and how does your project aims to solve these problems?

A (Will): Governance I see as a huge problem. For example upgrading Bitcoin and making decisions for changing things is a very challenging process. We have robust systems of on-chain governance to help solve these coordination problems

Q: How involved are the Polkadot partners? Are they helping with the development?

A (Dieter): There are a variety of groups building in the Polkadot ecosystem. Check out http://www.polkaproject.com/ for a great list.

Q: Can you explain the role of the treasury in Polkadot?

A (Will): The treasury is for projects or people that want to build things, but don't want to go through the formal legal process of raising funds from VCs or grants or what have you. You can get paid by the community to build projects for the community.
A: There’s a whole section on the wiki about the treasury and how it functions here https://wiki.polkadot.network/docs/en/mirror-learn-treasury#docsNav

Q: Any plan to introduce Polkadot on Asia, or rising market on Asia?

**A (Will):** We're globally focused

Q: What kind of impact do you expect from the Council? Although it would be elected by token holders, what kind of people do you wish to see there?

A (Will): Community focused individuals like u/jam10o that want to see cool things get built and cool communities form

If you have further questions, please ask in the official Polkadot Telegram channel.
submitted by dzr9127 to dot [link] [comments]

The retail investor community has the memory and logic of a literal goldfish.

It has been just under three years since I've joined the subreddit, and just over three years since I started being interested in cryptocurrency. After countless hours of research on the various industries being disrupted, after reading whitepapers, parsing through the profile of team members, watching recorded conferences and making calculations based on demand and market size, I can write with full confidence that the overwhelming majority of the community of crypto investors is retarded. Never have I seen a market that is less focused on growth objectives, or that circlejerks harder to news and announcements whose impact they don't even try to measure or fully understand.
In all honesty, it's not that surprising. The tech itself is more or less difficult to understand (anyone with an engineering, computer science, economics or finance background should be able to grasp the underlying concepts and realize why distributed ledgers have value, despite not being familiar with every key concept) but the extremely hard part is picturing exactly how efficiencies are bound to be created on an industrial level, and how impactful they will be. In other markets, this is why there exist ratings, panels of experts, consulting groups. They perpetually gauge the health of various industries and make predictions based on trends, to help others make informed decisions. Even then, a layer of speculation (and thus, manipulation) always exists, but the ratio of speculation vs. real demand tends to be in a different league entirely.
Despite knowing or suspecting this, however, the same shit happens every time here. Bitcoin starts to drop, and the trolls come out of the woodwork to spread their depressive thoughts onto everyone else. Altcoins start to shine, and the morons who bought during the frenzy and are seeing 10% gains for the first time in weeks, months, if not ever, start tugging each other and patting themselves on the back and writing essays as to why DPoS experiment #898 (which doesn't even have a main net yet) is on track to being massively adopted and is definitely the best buy opportunity of the decade. Bitcoin starts to rise again, and suddenly all money floods from the rest of the market to feed Big Daddy and every single other project is a shitcoin again and they'll never amount to anything ever.
It is beyond frustrating to see how easily everyone is being manipulated. Short-term fluctuations in the market doesn't make you right, and it doesn't make you wrong either. There are basically zero candid discussions on the state of the technology and how it stands relative to adoption. There are basically zero attempts to map out the ecosystem in a comprehensive way so that 'investors' can have a clear picture of how some projects might interact with each other, allowing us to formulate enlightened guesses as to how much market share each might capture. Nobody seems to give a shit about decentralization at all, but I suspect for the vast majority, it was never about freedom and fairness, only profits.
Everybody strictly only celebrates when the market moves up without ever questioning the validity of that movement. When an asset inflates without any fundamental reason, its only achievement is the creation of a bubble that will inevitably result in severe losses for the majority of investors.
With the size of this community (this subreddit alone), we have the power to warn others and to influence the market by infusing some rationality into it. So please, take the time to encourage others to look beyond immediate gains and to research the field. Take the time to criticize twitter "influencers". Take the time to reach out to professors, industry professionals and other knowledgeable experts to ask them their thoughts about specific projects and the market as a whole. We desperately need to create a global stage on which there is informed discourse, else this market will never truly grow.
submitted by bLbGoldeN to CryptoCurrency [link] [comments]

Ethereum 2.0: Why, How And Then?

Ethereum 2.0: Why, How And Then?
Why update Ethereum? One problem of the Ethereum network that the update should solve is scalability. At the moment, its blockchain can process up to 15 transactions per second, which is over two times more than that of Bitcoin. However, this speed is still not enough for a large number of users. For example, the Visa payment system can perform up to 24 thousand transactions per second.
Adding an Optimistic Rollup technology will help to solve the scalability problem. According to Vitalik Buterin, the creator of Ethereum, its implementation will occur after the network’s update and will increase its throughput to 1000 transactions per second.
by StealthEX
Another solution to this problem is a change in the algorithm. Currently, Ethereum runs on the same protocol as Bitcoin, Proof-of-Work, in which confirmation of transactions in the cryptocurrency network occurs using the computing power of processors.
Using the Proof-of-Work algorithm limits the growth of the Ethereum network bandwidth. To withstand a large load, more miners are needed, but the growth of their number slows down since it becomes more difficult to mine cryptocurrency and, consequently, less profitable.
This is the reason the Ethereum development team is planning to switch to the Proof-of-Stake algorithm. Unlike the PoW, it does not require the use of computing power to confirm blocks. Instead of miners, transactions will be confirmed by validators. To become a validator, the user should have 32 ETH and install a special client. From a technical point of view, this is easier than buying mining devices and maintaining their functionality, as well as looking for access to cheap electricity. Thus, the system will no longer need expensive hardware.
The main solution to the scalability problem will be to implement sharding. The current Ethereum network is a single unified database. After the update, the blockchain will be divided into autonomous, interacting segments called shards, each of which will process particular transactions and smart contracts that will nevertheless be recognized by the entire Ethereum blockchain. Nodes that form a shard process information separately, which allows the principle of decentralization to be maintained. This is important since the risk of centralization is another big problem of the old algorithm.
Since the complexity of mining has increased over time, and now this process requires having expensive equipment and access to cheap electricity, small participants can not afford to stay in the game. In such conditions, big pools of miners that can provide higher productivity have a decisive advantage. For example, in April, more than 50% of the computing power of the Ethereum network was provided by only two mining pools. This creates a significant risk of centralization and “51% attacks”.
Validators will confirm transactions and get rewards in the form of passive income. According to the project’s roadmap, this amount will vary from 1.81% to 18.1%. The profitability of staking will depend on the number of validators: the more of them there are, the smaller the amount each gets. However, there will be some costs. In the same Ethereum 2.0 roadmap, the developers mentioned that the cost of validating transactions, based on rough calculations, will be about $180 per year. One of the developers of the project, Justin Drake, predicts that on average a validator will receive an income of 5% per year.
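Putting the quoted figures together in a back-of-the-envelope calculation (a 5% average yield on a 32 ETH stake, roughly $180 per year of running costs); the ETH price used below is a hypothetical placeholder, not a forecast.

```python
# Back-of-the-envelope validator economics, using the figures quoted above.
# ETH_PRICE_USD is a hypothetical placeholder, not a price prediction.

STAKE_ETH = 32
REWARD_RATE = 0.05          # ~5% per year, per Justin Drake's estimate above
ANNUAL_COST_USD = 180       # rough validation cost cited from the roadmap
ETH_PRICE_USD = 400         # assumed price, for illustration only

stake_usd = STAKE_ETH * ETH_PRICE_USD
gross_usd = stake_usd * REWARD_RATE
net_usd = gross_usd - ANNUAL_COST_USD

print(f"stake value:  ${stake_usd:,.0f}")
print(f"gross reward: ${gross_usd:,.0f}/year")
print(f"net reward:   ${net_usd:,.0f}/year (~{net_usd / stake_usd:.1%})")
```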

What is the estimated Ethereum 2.0 release date?

The launch of Ethereum 2.0 will take place gradually, in six stages, the “zero” phase of which is expected this summer. However, it is worth noting that due to vulnerabilities that were found, the dates have already been shifted several times; initially, the transition to the new version was planned for 2019.
One of the developers of the project, Afri Schoedon, said that the launch could be postponed to 2021. According to him, under favourable circumstances, the main network can be presented in November of this year, but there are certain difficulties in this.
Schoedon explained that before launching ETH 2.0, all of its clients must be brought to the same specifications. After that, the development team needs to open a unified deposit contract so that users can transfer their assets from the old chain to the new one. Between these stages, the developers also need additional time so that they can test all aspects of the new system.
As usually happens with hard forks, there are going to be two parallel blockchains as a result. The first one, ETH1, will continue to work using the old protocol, while the update will be implemented on ETH2. Users will be able to transfer their coins from the old blockchain to the new one, but not vice versa. The introduction of sharding will allow developers to move to phase 1.5, during which ETH1 will merge with ETH2, becoming one of the 64 “shards” of the updated blockchain. In the second phase, smart contracts become available on ETH2, which can be considered the full start of its economic activity.

And what are expectations?

Updating the Ethereum network will increase its technical capabilities: it will speed up transactions and reduce their cost, as well as make the blockchain less vulnerable to centralization.
Currently, the absolute majority of decentralized finance projects are developed using the Ethereum platform. The Ethereum 2.0 release will probably attract even more partners who will use the blockchain for their projects.
Ryan Watkins, a researcher at the analysis company Messari, rates the importance of the update highly.
“ETH 2.0 is a much stronger catalyst than the Bitcoin halving simply because it’s an uncertain and fundamental change.” — Ryan Watkins wrote on his Twitter account
And the part about uncertainty is hard to disagree with. Of course, there are some concerns about Ethereum's bright future. The coming hard fork carries with it potential negative consequences. For example, after switching to the PoS algorithm, the US Securities and Exchange Commission (SEC) may well classify Ethereum as a security, which would lead to legal complications similar to those Pavel Durov faced when trying to launch his TON blockchain platform.
For now, ETH is the most popular coin for mining at home, and most of these miners will probably just leave the network.
There is also a risk that the price of Ethereum may fall. To receive passive income for storing ETH, the user will not only need to have 32 coins but also lock them through a special transaction. They will not be able to withdraw these locked funds immediately. As stated in the project roadmap, the cryptocurrency withdrawal process will take at least 18 hours. It could take even more time if many users request the return of tokens at the same time. Thus, if ETH falls in price, it will be impossible to sell it immediately, and there is a risk of losing some capital and all the income received from staking.
Nevertheless, investors are mostly optimistic — the volume of Ethereum options on the Deribit exchange has grown to a historical high, which indicates confidence in the future of Ethereum project. The ETH price is also growing, having overcome the consequences of the March collapse of cryptocurrencies.
Most experts agree that Ethereum price will grow after the update. On the one hand, the altcoin will become more expensive, as it will become a more attractive investment. On the other hand, the offer will decrease, as users will start transferring coins from the first version of the network to the second, to block them for passive income.
If you want to participate in the future fate of the ETH project, you can buy Ethereum using our service. We provide fast, anonymous and limitless swaps between over 250 cryptocurrencies. Just go to StealthEX and follow these easy steps:
✔ Choose the pair and the amount for your exchange. For example BTC to ETH.
✔ Press the “Start exchange” button.
✔ Provide the recipient address to which the coins will be transferred.
✔ Move your cryptocurrency for the exchange.
✔ Receive your coins.
Follow us on Medium, Twitter, Facebook, and Reddit to get StealthEX.io updates and the latest news about the crypto world. For all requests message us via [[email protected]](mailto:[email protected]).
The views and opinions expressed here are solely those of the author. Every investment and trading move involves risk. You should conduct your own research when making a decision.
Original article was posted on https://stealthex.io/blog/2020/06/30/ethereum-2-0-why-how-and-then/.
submitted by Stealthex_io to StealthEX [link] [comments]

Ethereum 2.0: Why, How And Then?

Why update Ethereum?

One problem the update is meant to solve is scalability. At the moment, the Ethereum blockchain can process up to 15 transactions per second, more than twice Bitcoin's throughput. However, this is still not enough for a large number of users. For comparison, the Visa payment system can process up to 24 thousand transactions per second.
Adding Optimistic Rollup technology will also help address the scalability problem. According to Vitalik Buterin, the creator of Ethereum, it will be implemented after the network update and will increase throughput to 1,000 transactions per second.
Another solution to this problem is a change of consensus algorithm. Currently, Ethereum runs on the same protocol as Bitcoin, Proof-of-Work, in which transactions are confirmed using the computing power of miners' hardware.
The Proof-of-Work algorithm limits the growth of the Ethereum network's throughput. To handle a larger load, more miners are needed, but growth in their numbers slows down as mining becomes more difficult and, consequently, less profitable.
This is the reason the Ethereum development team is planning to switch to the Proof-of-Stake algorithm. Unlike PoW, it does not require computing power to confirm blocks. Instead of miners, transactions will be confirmed by validators. To become a validator, a user needs 32 ETH and a special client. From a technical point of view, this is easier than buying mining hardware, keeping it running, and finding access to cheap electricity. Thus, the system will no longer need expensive hardware.
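To make the Proof-of-Stake idea a bit more concrete, here is a minimal, purely illustrative sketch of stake-weighted proposer selection. It is not the actual Ethereum 2.0 algorithm; the validator names and balances are assumptions, and only the 32 ETH deposit figure comes from the article.
```python
import random

MIN_DEPOSIT = 32  # ETH required to activate a validator (figure from the article)

# Hypothetical validator set: address -> staked ETH (assumed numbers)
validators = {"alice": 32, "bob": 64, "carol": 32}

def pick_proposer(stakes, seed):
    """Choose the next block proposer with probability proportional to stake."""
    rng = random.Random(seed)                       # deterministic for a given seed
    eligible = {a: s for a, s in stakes.items() if s >= MIN_DEPOSIT}
    point = rng.uniform(0, sum(eligible.values()))  # pick a point on the "stake line"
    running = 0.0
    for addr, stake in eligible.items():
        running += stake
        if point <= running:
            return addr

print(pick_proposer(validators, seed=42))
```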
The main solution to the scalability problem will be sharding. The current Ethereum network is a single, unified database. After the update, the blockchain will be divided into autonomous, interacting segments called shards, each of which will process its own transactions and smart contracts while still being recognized by the entire Ethereum blockchain. The nodes that form a shard process information separately, which preserves the principle of decentralization. This matters because the risk of centralization is another big problem of the old algorithm.
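As a rough illustration of the sharding idea (not the real Ethereum 2.0 assignment rules), the sketch below maps accounts deterministically to one of the 64 shards, so every node independently agrees on which shard handles which account; the hashing scheme and sample addresses are assumptions.
```python
import hashlib

NUM_SHARDS = 64  # number of shards mentioned in the article

def shard_for(account: str) -> int:
    """Deterministically map an account to one of the 64 shards."""
    digest = hashlib.sha256(account.encode()).digest()
    return int.from_bytes(digest[:8], "big") % NUM_SHARDS

# Every node computes the same mapping, so the whole network agrees on
# which shard is responsible for which account's transactions.
for account in ["0xaaa1", "0xbbb2", "0xccc3"]:
    print(account, "-> shard", shard_for(account))
```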
Since mining difficulty has increased over time, and the process now requires expensive equipment and access to cheap electricity, small participants cannot afford to stay in the game. Under such conditions, big mining pools that can deliver higher output have a decisive advantage. For example, in April more than 50% of the Ethereum network's computing power was provided by only two mining pools. This creates a significant risk of centralization and "51% attacks".
Validators will confirm transactions and earn rewards as passive income. According to the project's roadmap, the rate will vary from 1.81% to 18.1%. The profitability of staking will depend on the number of validators: the more there are, the smaller the reward each one receives. However, there will be some costs. In the same Ethereum 2.0 roadmap, the developers estimate that validating transactions will cost roughly $180 per year. One of the project's developers, Justin Drake, predicts that a validator will earn about 5% per year on average.
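A back-of-the-envelope calculation with the figures quoted above (32 ETH deposit, Justin Drake's roughly 5% estimate, about $180 per year in costs); the ETH price used here is an assumption purely for illustration.
```python
DEPOSIT_ETH = 32
REWARD_RATE = 0.05          # ~5% per year estimate quoted in the article
ANNUAL_COST_USD = 180       # rough validation cost from the roadmap
ETH_PRICE_USD = 230         # assumed price, for illustration only

gross_eth = DEPOSIT_ETH * REWARD_RATE          # ETH earned per year
gross_usd = gross_eth * ETH_PRICE_USD
net_usd = gross_usd - ANNUAL_COST_USD          # costs eat into the reward
net_rate = net_usd / (DEPOSIT_ETH * ETH_PRICE_USD)

print(f"Gross reward: {gross_eth:.2f} ETH (~${gross_usd:.0f})")
print(f"Net return after costs: ~${net_usd:.0f} ({net_rate:.1%})")
```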

What is the estimated Ethereum 2.0 release date?

The launch of Ethereum 2.0 will take place gradually, in six stages, the "zero" phase of which is expected this summer. It is worth noting, however, that because of vulnerabilities found along the way, the dates have already been pushed back several times; initially, the transition to the new version was planned for 2019.
One of the project's developers, Afri Schoedon, has said the launch could be postponed to 2021. According to him, under favourable circumstances the main network could be presented in November of this year, but there are several hurdles to clear first.
Schoedon explained that before ETH 2.0 launches, all of its clients must be brought up to the same specification. After that, the development team needs to open a unified deposit contract so that users can transfer their assets from the old chain to the new one. Between these stages, the developers also need additional time to test all aspects of the new system.
As usually happens with a hard fork, two parallel blockchains will exist as a result. The first, ETH1, will continue to run on the old protocol, while the update will be implemented on ETH2. Users will be able to transfer their coins from the old blockchain to the new one, but not vice versa. The arrival of sharding will allow developers to move to phase 1.5, during which ETH1 will merge with ETH2 and become one of the 64 "shards" of the updated blockchain. In the second phase, smart contracts become available on ETH2, which can be considered the full start of its economic activity.

And what are expectations?

Updating the Ethereum network will increase its technical capabilities: it will speed up transactions, reduce their cost, and make the blockchain less vulnerable to centralization.
Currently, the vast majority of decentralized finance projects are built on the Ethereum platform. The Ethereum 2.0 release will probably attract even more partners who will use the blockchain for their projects.
Ryan Watkins, a researcher at the analysis firm Messari, rates the importance of the update highly.
“ETH 2.0 is a much stronger catalyst than the Bitcoin halving simply because it’s an uncertain and fundamental change,” Ryan Watkins wrote on his Twitter account.
The part about uncertainty is hard to disagree with, and there are some concerns about Ethereum's bright future. The coming hard fork carries potential negative consequences. For example, after the switch to the PoS algorithm, the US Securities and Exchange Commission (SEC) may well classify Ethereum as a security, which would lead to legal complications similar to those Pavel Durov faced when trying to launch his TON blockchain platform.
ETH is currently the most popular coin for home mining, and after the switch most of these miners will probably just leave the network.
There is also a risk that the price of Ethereum may fall. To receive passive income for holding ETH, a user will not only need to have 32 coins but also lock them up through a special transaction, and these locked funds cannot be withdrawn immediately. As stated in the project roadmap, the withdrawal process will take at least 18 hours, and it could take even longer if many users request their tokens back at the same time. Thus, if ETH falls in price, it will be impossible to sell it immediately, and there is a risk of losing some capital along with the income earned from staking.
Nevertheless, investors are mostly optimistic: the volume of Ethereum options on the Deribit exchange has grown to an all-time high, which signals confidence in the future of the Ethereum project. The ETH price is also rising, having recovered from the March collapse of the cryptocurrency markets.
Most experts agree that the Ethereum price will rise after the update. On the one hand, the altcoin will become a more attractive investment; on the other, the circulating supply will shrink as users transfer coins from the first version of the network to the second and lock them up for passive income.
If you want to participate in the future fate of the ETH project, you can buy Ethereum using our service. We provide fast, anonymous and limitless swaps between over 250 cryptocurrencies. Just go to StealthEX and follow these easy steps:
✔ Choose the pair and the amount for your exchange. For example, BTC to ETH.
✔ Press the “Start exchange” button.
✔ Provide the recipient address to which the coins will be transferred.
✔ Move your cryptocurrency for the exchange.
✔ Receive your coins.
Follow us on Medium, Twitter and Reddit to get StealthEX.io updates and the latest news about the crypto world. For all requests message us via [[email protected]](mailto:[email protected]).
The views and opinions expressed here are solely those of the author. Every investment and trading move involves risk. You should conduct your own research when making a decision.
Original article was posted on https://stealthex.io/blog/2020/06/30/ethereum-2-0-why-how-and-then/.
submitted by Stealthex_io to conspiracy [link] [comments]

Recap of Binance English Kava AMA (May 2020)

This AMA was conducted within the Binance English Telegram channel prior to Kava's June 10th launch of its DeFi Lending Platform.

Q1:

Can you give us a little history of KAVA?

Q2:

Could you please tell me what KAVA cryptocurrency is? What problem does it solve?

  • Answer - KAVA is the staking, governance, and reserve asset of the Kava DeFi platform. KAVA is required by node operators to secure transactions on the blockchain. Additionally, when lending fees are paid, they are converted to KAVA and burned, reducing the overall supply of KAVA tokens. As more users use the Kava lending platform, KAVA should become more scarce over time.
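A minimal sketch of the burn dynamic described in this answer, using made-up numbers for supply and fee volume (they are not Kava parameters): as platform usage grows, burned fees shrink the outstanding supply.
```python
supply = 100_000_000                            # assumed starting KAVA supply (illustrative)
weekly_fees_in_kava = [10_000, 20_000, 40_000]  # assumed fee volume as usage grows

for week, fees in enumerate(weekly_fees_in_kava, start=1):
    supply -= fees                              # fees are converted to KAVA and burned
    print(f"week {week}: burned {fees:,} KAVA, supply now {supply:,}")
```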

Q3:

What is the advantage of keeping the KAVA token for a long and short term?

  • Answer - In the short term, if you stake KAVA you can earn additional block rewards every day, block by block. This provides a nice steady return on your KAVA, usually in the range of 3-20%, depending on the number of people staking.
  • We will be opening the gates of DeFi to many top-tier assets such as BNB, XRP, ATOM, and BTC, which have never been able to use lending, stablecoins, or other DeFi services. If you are a KAVA hodler, you can benefit from owning and having a stake in the network as we grow, because as the network grows, KAVA is burned and becomes more scarce as a resource.

Q4:

Chainlink is KAVA’s partner, can you explain more about this partnership?

  • Answer - Yes, this is not the usual chainlink partnership where a blockchain consumes data from Chainlink’s oracle solution.
  • No oracle solution adequate for DeFi applications on Cosmos was available. For this reason, Kava has teamed up with Chainlink to bring its data and reliable oracle solution to the Cosmos ecosystem. Chainlink nodes now will be able to securely publish data directly on the Kava blockchain where it can be used or easily transported to other Cosmos-based blockchains and applications. Chainlink oracles on Kava utilize all the industry-leading technologies of Chainlink, while enabling more frequent price updates and improving the reach and distribution of where that data can be used.
  • Since Kava’s blockchain is built using Tendermint, Tendermint-based blockchains within the Cosmos ecosystem (Binance, Terra, OKChain, Cosmos Hub, Agoric, Aragon, and others) will now be able to retrieve market data such as cryptocurrency, FX, and commodity prices. For DEXs like Binance, this will enable them to create futures, options, and other derivative products they were not able to offer before.
  • TLDR: Kava + Chainlink data creates the ideal hub for all blockchains and applications to get their DeFi services and data, and as a result makes Kava a natural hub for the growing Cosmos ecosystem.

Q5:

What is the KAVA CDP product? Do you have any exciting things down the pipeline that you can share?

  • Answer - First, let me clarify that CDP simply means "collateralized debt position", similar to the CDOs that exist in the traditional finance world. In practice it means a loan that uses collateral to back it.
  • Kava’s lending platform offers collateralized loans to users who have crypto. Getting a loan with Kava’s platform is great if you don’t want to sell your crypto position but need short-term cash for payments, or if you want to use the loan to take a levered/margin position without going through KYC.
  • As for news! Kava’s lending platform is scheduled to officially launch on the mainnet June 10th.
  • At this time, DeFi will be made available to BNB for the first time ever. The Kava DeFi platform will also be awarding the first users who use BNB extremely high rewards for being early adopters.
  • Each week, 74,000 KAVA will be given out to all the users who have taken out loans on Kava. Yes, you get free KAVA for taking out a loan using BNB!
  • If you want to participate, you can learn more about how to do it here!
  • Medium

Q6:

Why should BNB users use KAVA’s lending platform and take out USDX? And how to mint USDX with BNB on KAVA CDP?

  • Answer - Free- maybe let's call it rewards for being good users 😉
  • The rewards are platform growth incentives so that we can grow the platform quickly.
  • Well at launch, definitely the KAVA rewards are a huge reason for BNB users to use it.
  • As for the product long-term, the major use case for our lending platform is to get a levered position without needing an exchange or to go through KYC.
  • How it works is that a BNB holder can deposit their BNB and take out a USDX loan, then use that capital to buy more BNB. Most people will use the loan this way to get 2-3x their original BNB exposure. If the price of BNB goes up, they win 2-3x the gains!
  • Of course, if the price goes down and they cannot repay their loan, the BNB collateral might get liquidated, so be careful: it works just like a margin trading account.
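A rough, self-contained sketch of the mechanics described in this answer: deposit BNB, draw a USDX loan against it, and face liquidation if the collateral value falls below a required ratio. The 200% minimum ratio and the prices are assumptions for illustration, not Kava's actual parameters.
```python
MIN_COLLATERAL_RATIO = 2.0   # assumed 200% minimum ratio, for illustration only

def max_loan(bnb_amount: float, bnb_price: float) -> float:
    """Largest USDX debt the position can carry at the current price."""
    return bnb_amount * bnb_price / MIN_COLLATERAL_RATIO

def is_safe(bnb_amount: float, bnb_price: float, debt_usdx: float) -> bool:
    """A position is safe while collateral value stays above debt * minimum ratio."""
    return bnb_amount * bnb_price >= debt_usdx * MIN_COLLATERAL_RATIO

# Deposit 100 BNB at an assumed price of $17 and draw the maximum loan.
collateral, price = 100.0, 17.0
debt = max_loan(collateral, price)                   # 850 USDX
print(f"debt drawn: {debt:.0f} USDX")

# If BNB drops to $15 the position falls below the ratio and can be liquidated.
print("safe at $15?", is_safe(collateral, 15.0, debt))
```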

Q7:

Brian do you have any more information or links for our community about this?

Q8:

KAVA was initially planned to launch on the Ripple network but later switched to Cosmos Tendermint Core. Is there something you see in Tendermint Core that is not available anywhere else?

  • Answer - For clarification, Kava was never planned to be on Ripple. However, Ripple is a Kava investor, shareholder, and partner.
  • We selected the Cosmos-SDK with the Tendermint BFT consensus because, during our past work with Ripple, MakerDao, ETH, and other layer 2 projects, we learned the value of “finality” in blockchains. For example, on ETH, finality of a block does not happen right away. You need to wait 15+ confirmations on Ethereum to really know a transaction has gone through. This results in really slow user experiences that aren’t acceptable in finance or any application really.
  • Tendermint solves this because it makes every transaction final, and that happens in seconds.
  • Additionally, we chose the Cosmos-SDK as the framework to build our standalone blockchain, Kava, because it allowed us to create our own security model and design. This enables Kava, a DeFi platform responsible for millions of dollars of collateral, to be very secure in a way we could not get if we built it on any other network.

Q9:

KAVA supports cross-chain assets. Compared to other DeFi platforms, KAVA offers collateralized loans and stablecoins to users too. How will volatility be managed with so many different collateral types in the CDP system?

  • Answer - Volatility is an important consideration and accurate and timely price reference data is needed to make sure the system works.
  • All collateral positions rely on price feeds from oracles to determine whether they are safe or need to be liquidated. Kava has created a novel partnership with Chainlink, where Chainlink oracles that normally run on Ethereum operate nodes directly on Kava, where they can post prices. This allows Kava to avoid network congestion, high gas fees, and other less desirable issues found on Ethereum, while giving the oracles Kava’s fast block times and finality so they can actually deliver price updates 10-20x more frequently than is possible elsewhere. This makes Kava’s price feed data very reliable.
  • In times of volatility, if liquidations occur, the Kava platform automatically auctions collateral off for USDX on the market and burns the USDX. This mechanism keeps the system balanced and USDX algorithmically stable and always fully collateralized by real assets.
  • And it does this transparently, unlike the real-world CDOs that caused the world so many issues in 2008 due to the lack of transparency around their assets and risk.

Q10:

Recently, Binance released a white paper on BSC, the Binance Smart Chain. So, what can I get by staking Binance Coin (BNB)?

  • Answer - Yay for smart contracts!
  • What can we get by staking bnb?
  • Staking BNB on Kava, or depositing it in a CDP and creating USDX from it, earns users KAVA rewards every week. A lot of rewards. In addition, you get USDX to hold, which also pays out a savings rate each block that is much better than, say, what USD in a checking account could do.
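A small illustration of the per-block savings rate idea (not actual Kava parameters): an assumed 4% annual rate accrued every block, assuming roughly 6-second block times, compounds to slightly more than 4% over a year.
```python
ANNUAL_SAVINGS_RATE = 0.04                    # assumed rate, for illustration only
BLOCKS_PER_YEAR = 365 * 24 * 60 * 60 // 6     # assuming ~6-second block times

deposit = 1_000.0                             # USDX held
per_block = ANNUAL_SAVINGS_RATE / BLOCKS_PER_YEAR

# The rate accrues block by block, so it compounds continuously in effect.
after_one_year = deposit * (1 + per_block) ** BLOCKS_PER_YEAR
print(f"balance after one year: {after_one_year:.2f} USDX")
```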

Q11:

Various platforms are in Ethereum. So why is Kava not at Ethereum?

  • Answer - I could speak about this for ages, but there is a reason for Ethereum being the home to many hacks and bugs.
  • Kava is not on Ethereum because we couldn’t build our system there. The main reasons, as I have mentioned, are:
  • (1) Ethereum has congestion, oracle issues, high fees, and slow block times.
  • (2) Ethereum’s open smart contracting system can do anything. This is great for building crypto kitties, but horrible for financial software as it makes all code have infinite attack vectors that hackers can use which are impossible to test for. We built our own chain so we could scope the code and limit what attack vectors are possible.
  • (3) Building in Solidity, the language of Ethereum, is horrible. The development environment is bad, testnets don’t work, and many other things are painful. Kava is primarily built in Go, which is far superior for financial applications in most respects.
  • (4) The future is Cosmos. Binance, OKChain, Terra, Cosmos Hub (ATOM), and Kava are all created using the Cosmos-SDK framework. I believe this is the future, and blockchain developers are moving to it en masse. Over 110 projects are now building with the Cosmos-SDK.

Q12:

In what ways does the Kava project generate profit/revenue to maintain the project? What is your revenue model?

  • Answer - Kava is a for-profit financial DAO with over 80 different businesses staking KAVA and voting on its evolution. They want to see Kava succeed, so they vote to fund operations and developments that drive user growth in Kava. Due to fees paid in KAVA and the burning mechanism, as the system grows in users the KAVA supply decreases, rewarding those who hold KAVA through scarcity.

Q13:

Lending/borrowing has been introduced by Binance. How can this affect Kava, since people can directly borrow BUSD from Binance with BNB used as collateral rather than going to Kava?

  • Answer - Kava will be featured on Binance as well. The main benefit of Kava is that there is no counterparty. The capital is minted on demand, not sourced from somewhere. Binance and other centralized parties, on the other hand, need to find capital to provide loans, creating a cost of capital. Kava is much more efficient at providing capital and avoids a lot of regulatory issues.
  • I'll add that I think BUSD might be usable as collateral for Kava's loans in the future as well. It would be cool 🙂

Q14:

What are your opinions on the future of DeFi and DApps? Do you think DeFi is the future of the financial world? Also, how do you see the future of KAVA?

  • Answer - I believe centralized finance and the existing infrastructure have a place. But they have a lot of issues that cause things like the 2008 crisis and the insolvency problems happening across the world today, driven by trust-based debt with no actual backing other than the people, who end up bailing out banks and other financial institutions that have made poor decisions.
  • DeFi's future is bright because it solves this fundamental issue. It removes trust and adds transparency. Kava is right at the foundation for all of DeFi as things grow and mature.

Q15:

Recently, we have seen some big hacks of DeFi platforms. How will KAVA deal with these bad actors in crypto, and what security measures has KAVA taken for the safety of users' funds?

  • Answer - Unlike a lot of DeFi startups, we take things seriously. We don't "move fast and break things", as Mark Zuckerberg would say.
  • We do a thorough analysis before suggesting to deploy code. Our internal team works very hard to run tests and simulations; once it passes internally, we give it to 3rd party auditors who try to game it and break the code. If it passes there, we give the code to the community to review and vote into the mainnet. In this way, I’d estimate 100+ people review and test our code before it goes live and consumers can touch it. I don't know many other project teams that do things with such diligence.

Q16:

Binance is a very valuable partner for KAVA in terms of increasing the number of users, but what is KAVA ready to give Binance users in return? What applications will be integrated into Binance to expand the ecosystem?

  • Answer - Kava gives BNB users loans. It gives the DEX a stablecoin and the ability to offer margin products. Kava’s connection to Binance Chain and Chainlink data also enables Binance DEX to offer trustless derivatives like options and futures products going forward.

Q17:

Cosmos has limitations on working with PoW coins. How do you technically solve the problem of implementing DeFi products for bitcoin?

  • Answer - Cosmos is great for hard-to-work-with blockchains like BTC. It's flexible in how you can construct bridges. For example, the validator set can have a multisig private key split up into pieces in order to create a trustless escrow and control of assets on other blockchains. In this way, we can create peg zones with Cosmos for the best assets in the world. Once a zone is established, it can be used on Kava and other Cosmos chains.
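A highly simplified sketch of the m-of-n idea behind such a peg zone, where escrowed funds only move when a threshold of validator key-holders approves; the threshold, validator names, and approval sets below are assumptions, not Kava's actual bridge design.
```python
VALIDATORS = {"v1", "v2", "v3", "v4", "v5"}
THRESHOLD = 4   # assumed m-of-n threshold for releasing escrowed funds

def release_approved(approvals: set) -> bool:
    """Escrowed BTC moves only if enough validator key shares sign off."""
    return len(approvals & VALIDATORS) >= THRESHOLD

print(release_approved({"v1", "v2", "v3"}))          # False: not enough signers
print(release_approved({"v1", "v2", "v3", "v5"}))    # True: threshold reached
```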

Q18:

USDX is currently a little-known stablecoin. Do you plan to add it to the top exchanges with good liquidity, including Binance?

  • Answer - USDX will be growing quickly. We have a plan to have it listed and get liquidity across several known exchanges shortly after launch.

Q19:

There are several options for using USDX on the KAVA platform, one of which is margin trading/leverage. Is this an optional feature or a compulsory one? Wondering, since there are some investors who don't like margin. What is the level of leverage, and how does a CDP auction work?

  • Answer - Using Kava for Margin trading is 100% optional. You can choose how you want to use the margin loan. You don’t have to spend the USDX unless you want to. It could be used for everyday payments as well in the case you simply don’t want to sell your underlying collateral. If you don’t want the risk, do small loans with lots of collateral.

Q20:

Will your team have a plan to implement the DAO module on your platform, as it provides autonomy, decentralization and transparency?

  • Answer - DAO - Kava is a for-profit DAO and it’s fully functional already. We have on-chain governance and have undergone several votes and evolutions you can look at. You actually can see some current voting processes taking place here: https://kava.mintscan.io/proposals
  • We recently implemented a cool feature called committees, which enables the DAO to elect a small group of experts to make decisions without needing a vote of the whole user base. This enables the experts to have control over a small portion of the protocol - such as monitoring the debt limit, fees, etc and enables Kava to operate faster and be more adaptable in volatile market conditions.

Q21:

How can we address the possible overloads and security threats caused by increased users in the DeFi scene?

  • Answer - Yes, this is a huge issue for Ethereum, MakerDAO, and everyone in the space. I don’t see a bright future for DeFi on Ethereum, unfortunately. You can’t have a blockchain do everything well. Tether alone congests most of Ethereum and makes oracle price feeds lag the market. This can cause liquidations that should not happen, and real people will lose real funds. It’s a huge issue.
  • The hope is for a dedicated system like Kava to provide a better backbone for DeFi applications going forward.
  • I should point out that Kava is not just a MakerDao for Cosmos or a CDP for Bitcoin. Kava is designed to be a foundational layer for DeFi services that every new blockchain and application will need.
  • Every blockchain will need DeFi services like lending, stablecoins, and data, and they need them to be very secure. Kava does all this with its cross-chain lending platform, USDX stablecoin, and Chainlink data in an incredibly secure but accessible manner.
  • In this way, Kava aims to connect and serve all the major cryptocurrency communities and build its place at the center, where every developer can get what they need to build the financial applications of the future.

Q22:

What distinguishes Kava from your existing competitors like Synthetix?

  • Answer - Synthetix isn't really a competitor, but it is an interesting project in terms of mechanism design. We share a lot of common investors and have similar token economic ideas. The only blockchain project that could be considered a competitor is MakerDAO, but they can only work with ETH assets due to their design. We are focused on the major-cap assets: BTC, BNB, XRP, ATOM and others have a much larger market than ETH to address. BTC alone is 10x the size. Currently no one serves them with DeFi. We’re going after this opportunity and believe it to be a huge one.

Q23:

Why is the KAVA coin not used for minting? I ask because I see it could also make the value of KAVA coins grow naturally.

  • Answer - Why is Kava not used as collateral? Well, it could be, I suppose. The community might vote for this in the near future if they want us to be more like Synthetix. It would make the KAVA token more valuable and incentivize much more locked-up KAVA, reducing the overall circulating supply, which is fairly favorable. The main reason we have not done this yet is that we (Kava and its community) are still weighing the risks, given that KAVA also functions as a reserve asset. I think it's likely KAVA gets added as collateral at some point, but it will likely require a high collateralization ratio to address those risks, similar to Synthetix's 750%.

Q24:

How do you prevent someone from manipulating KAVA minting just to take advantage of the token rewards?

  • Answer - Minting rewards and manipulation. We’ve thought of this. Each week, the blockchain counts all the blocks, counts how many people had a loan in that period, then takes the average loan amount over time to calculate the rewards. If you open and close a loan - you will get very little rewards. You only get a large reward if you keep the loan open the full period.
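An illustrative sketch of the time-averaging idea described above (not Kava's actual reward code): each borrower's share of the weekly 74,000 KAVA is weighted by the average size of their loan across the week's blocks, so a briefly opened loan earns very little. The block count and loan sizes are assumed.
```python
WEEKLY_REWARD = 74_000          # KAVA distributed per week (figure from the AMA)
BLOCKS_PER_WEEK = 100           # simplified block count for the example

# Loan size observed at each block for two hypothetical users.
loans = {
    "long_term_user": [1_000] * 100,            # loan open the whole week
    "quick_flipper":  [1_000] * 5 + [0] * 95,   # opened and closed quickly
}

avg = {user: sum(sizes) / BLOCKS_PER_WEEK for user, sizes in loans.items()}
total = sum(avg.values())

for user, a in avg.items():
    print(f"{user}: {WEEKLY_REWARD * a / total:,.0f} KAVA")
```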

Q25:

Who are your oracle providers? Are you also an oracle provider?

  • Answer - Kava may run one oracle in the future, but we will always have many and be the minority. Most Chainlink oracle node operators are large players in the space that run staking infrastructure companies, like Cosmostation, Chainlayer, Chorus One, Figment Networks, etc. Binance will also be one of our oracles.

Q26:

If we look at all the different types of DeFi products (decentralized exchanges, stablecoins, atomic swaps, insurance products, loan platforms, trade financing platforms, custody platforms, and crowdfunding platforms) currently covering important areas of traditional finance... where does Kava fit in?

  • Answer - To make any interesting financial product work you need capital, a stable store of value, and price data. These are really hard to get on current blockchain environments. Kava provides all of these.

Q27:

Many people describe Kava as similar to Maker (MKR). How is Kava different? Why do you think Kava has more potential?

  • Answer - MakerDAO is a smart contract with a singular purpose: to serve ETH. It sadly inherited the problems of Ethereum. Kava is designed from the ground up for security and interoperability. We are targeting bigger and better assets and have more capabilities to serve them with what their developers and ecosystems need.

Q28:

What is unique about the KAVA project that cannot be found in other projects released so far?

  • Answer - Well, on June 10th we will be the first blockchain project ever to bring DeFi to another blockchain in a real way. BNB users will have loans, stablecoins, and much more.

Q29:

Gas fees are an issue for blockchains, besides scalability. Does Kava provide a solution for gas?

  • Answer - Gas fees are very low on Kava, only high enough to prevent spam. We don't need high fees per transaction because validators are paid in block rewards. Additionally, we don't have competing transactions from crypto-kitties or other non-financial applications. This leaves all of Kava's throughput 100% dedicated to scaling financial transactions.

Q30:

The Kava project works on DeFi (decentralized finance), but what are the benefits of a decentralized financial system? What advantages does DeFi have over a centralized financial system?

  • Answer - Open access, no need for trust, and no censorship by singular governments or parties. Kava is accessible anywhere in the world, by anyone.

Q31:

Data supplied by oracles is sometimes false; how do you prevent this? How reliable is the data received by KAVA?

  • Answer - This is why using premium / credentialed APIs is important for oracles. These data sources tend to be more accurate and better managed. Wrong prices can happen - for liquidation systems like Kava, we factor this into our design by using an average of data over time from all oracles as part of the calculation.
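A minimal sketch of the aggregation idea mentioned here, averaging each oracle's reports over a window and then taking the median across oracles so one bad print cannot move the reference price much; this is illustrative and not Kava's or Chainlink's actual code, and all prices are assumed.
```python
from statistics import median

# Recent price reports per oracle over a short window (assumed data).
reports = {
    "oracle_a": [9700.0, 9710.0, 9705.0],
    "oracle_b": [9702.0, 9708.0, 9704.0],
    "oracle_c": [12000.0, 9706.0, 9703.0],   # one clearly wrong print
}

# Average each oracle's reports over the window, then take the median
# across oracles so a single outlier cannot skew the reference price.
per_oracle_avg = [sum(prices) / len(prices) for prices in reports.values()]
reference_price = median(per_oracle_avg)
print(round(reference_price, 2))
```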

Q32:

Can anyone become a KAVA validator, or is it by invitation from the project itself? What are the requirements for becoming a KAVA validator?

  • Answer - Anyone can become a validator, but you will need to stake or have enough stake delegated to you from others to be in the top 100 validators to earn block rewards.
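A toy sketch of the "top 100 by stake" rule described above (candidates ranked by self-stake plus delegations, with only the top of the list earning block rewards); the candidate set and stake figures are invented for illustration.
```python
ACTIVE_SET_SIZE = 100   # only the top 100 validators by bonded stake earn rewards

# Simulate 150 candidates with assumed stake (self-stake + delegations).
candidates = {f"val-{i}": 1_000 * i for i in range(1, 151)}

ranked = sorted(candidates.items(), key=lambda kv: kv[1], reverse=True)
active_set = dict(ranked[:ACTIVE_SET_SIZE])

print("val-150 in active set?", "val-150" in active_set)   # highest stake -> True
print("val-30 in active set?",  "val-30" in active_set)    # too little stake -> False
```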

Q33:

DeFi Pulse says that a total of $902M is currently locked. In your view, how will this number change in the next few years, and how will KAVA position itself as a top player in this market segment?

  • Answer - DeFi will only grow through 2020. And likely grow massively.
  • All projects on DeFi Pulse are "Ethereum" based. Kava is going to shake the blockchain world in the next few weeks by being the first "multi-chain" project on DeFi Pulse, and by my estimations we should quickly surpass a lot of the projects on that list.

Q34:

I am a testnet minter and the process seems simple. Now I want to know if minting of USDX will continue when you launch mainnet, and do you have plans to build your own KAVA wallet for easy minting on your mainnet?

  • Answer - Simple blockchain experience?! High praise! Yes, the process will be the same. Kava will not provide interfaces or wallets. Kava Labs builds software for the blockchain; our community members like Cosmostation, Frontier, and Trust Wallet build support for people to interact with it.

Q35:

What business plans does Kava have with Seoul (South Korea) after partnering with Cosmostation? Do you plan to expand your products beyond Asia? Have you thought about harnessing the potential of South America?

  • Answer - South Korea is a perfect market for Kava's DeFi. Regulations there prohibit fiat-backed stablecoins and margin trading. Kava's platform uses crypto-backed stablecoins and can enable users to get loans to margin trade. I am looking forward to further developing the Korean market for Kava, working with close partners like Cosmostation, and showing the world real use cases of DeFi.

Q36:

Thank you for taking the time to conduct this AMA. Do you have any parting words, and where can the people go to keep up with all of the new happenings regarding Kava Labs?

  • Answer - Thanks for all the awesome questions! Amazingly thoughtful!
  • I've been promising the world cross-chain DeFi since June of last year. The IEO and mainnet went live Nov 2019. It's been a year of hard work - but an industry first is coming on June 10th. I'm excited. I hope you guys are.
  • Thanks for having me, I hope you become a USDX minter and get KAVA rewards. And last but not least, I love Binance - it's Kava's first home and I'm really happy to open up DeFi to BNB first.
  • To keep up to date w/ all things Kava: Website - Telegram - Telegram for Kava Trading Chat - Twitter - Medium
submitted by Kava_Mod to KavaUSDX [link] [comments]
