
Dragonchain Great Reddit Scaling Bake-Off Public Proposal

Dragonchain Public Proposal TL;DR:

In a 24-hour demo on an operational network in January 2020, Dragonchain processed twice Reddit’s total daily volume of votes, comments, and posts (per the Reddit 2019 Year in Review). Every single transaction on Dragonchain is decentralized immediately through 5 levels of Dragon Net, and then secured with combined proof on Bitcoin, Ethereum, Ethereum Classic, and Binance Chain, via Interchain. At the time, the entire cost of the demo was approximately $25K on a single system (transaction fees locked at $0.0001/txn). With current fees (lowest fee $0.0000025/txn), this would cost as little as $625.
Watch Joe walk through the entire proposal and answer questions on YouTube.
This proposal is also available on the Dragonchain blog.

Hello Reddit and Ethereum community!

I’m Joe Roets, Founder & CEO of Dragonchain. When the team and I first heard about The Great Reddit Scaling Bake-Off we were intrigued. We believe we have the solutions Reddit seeks for its community points system and we have them at scale.
For your consideration, we have submitted our proposal below. The team at Dragonchain and I welcome and look forward to your technical questions, philosophical feedback, and fair criticism, to build a scaling solution for Reddit that will empower its users. Because our architecture is unlike other blockchain platforms out there today, we expect to receive many questions while people try to grasp our project. I will answer all questions here in this thread on Reddit, and I've answered some questions in the stream on YouTube.
We have seen good discussions so far in the competition. We hope that Reddit’s scaling solution will emerge from The Great Reddit Scaling Bake-Off and that Reddit will have great success with the implementation.

Executive summary

Dragonchain is a robust open source hybrid blockchain platform that has proven itself since our inception in 2014. We have continued to evolve to harness the scalability of private nodes while taking full advantage of the security of public decentralized networks, like Ethereum. We have a live, operational, and fully functional Interchain network integrating Bitcoin, Ethereum, Ethereum Classic, and ~700 independent Dragonchain nodes. Every transaction is secured to Ethereum, Bitcoin, and Ethereum Classic. Transactions are immediately usable on chain, and the first decentralization is seen within 20 seconds on Dragon Net. Security increases further on the public networks ETH, BTC, and ETC within 10 minutes to 2 hours. Smart contracts can be written in any executable language, offering full freedom to existing developers. We invite any developer to watch the demo, play with our SDKs, review the open source code, and help us move forward. Dragonchain specializes in scalable loyalty & rewards solutions and has built a decentralized social network on chain, with very affordable transaction costs. This experience can be combined with the insights Reddit and the Ethereum community have gained in the past couple of months to roll out the solution at a rapid pace.

Response and PoC

In The Great Reddit Scaling Bake-Off post, Reddit has asked for a series of demonstrations, requirements, and other considerations. In this section, we will attempt to answer all of these requests.

Live Demo

A live proof of concept showing hundreds of thousands of transactions
On Jan 7, 2020, Dragonchain hosted a 24-hour live demonstration during which a quarter of a billion (250 million+) transactions executed fully on an operational network. Every single transaction on Dragonchain is decentralized immediately through 5 levels of Dragon Net, and then secured with combined proof on Bitcoin, Ethereum, Ethereum Classic, and Binance Chain, via Interchain. This means that every single transaction is secured by, and traceable to these networks. An attack on this system would require a simultaneous attack on all of the Interchained networks.
24 hours in 4 minutes (YouTube)
The demonstration was of a single business system, and any user is able to scale this further, by running multiple systems simultaneously. Our goals for the event were to demonstrate a consistent capacity greater than that of Visa over an extended time period.
Tooling to reproduce our demo is available here:
https://github.com/dragonchain/spirit-bomb

Source Code

Source code (for on & off-chain components as well as tooling used for the PoC). The source code does not have to be shared publicly, but if Reddit decides to use a particular solution it will need to be shared with Reddit at some point.

Scaling

How it works & scales

Architectural Scaling

Dragonchain’s architecture attacks the scalability issue from multiple angles. Dragonchain is a hybrid blockchain platform, wherein every transaction is protected on a business node to the requirements of that business or purpose. A business node may be held completely private or may be exposed or replicated to any level of exposure desired.
Every node has its own blockchain and is independently scalable. Dragonchain established Context Based Verification as its consensus model. Every transaction is immediately usable on a trust basis, and in time is provable to an increasing level of decentralized consensus. A transaction will have a level of decentralization to independently owned and deployed Dragonchain nodes (~700 nodes) within seconds, and full decentralization to BTC and ETH within minutes or hours. Level 5 nodes (Interchain nodes) function to secure all transactions to public or otherwise external chains such as Bitcoin and Ethereum. These nodes scale the system by aggregating multiple blocks into a single Interchain transaction on a cadence. This timing is configurable based upon average fees for each respective chain. For detailed information about Dragonchain’s architecture, and Context Based Verification, please refer to the Dragonchain Architecture Document.
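As an illustration of the cadence-based aggregation described above, here is a minimal, hypothetical sketch of an Interchain-style aggregator; the combination scheme and names are assumptions for the example, not Dragonchain's actual implementation:

```python
import hashlib
import time
from typing import List, Optional

class InterchainAggregator:
    """Illustrative sketch only (not Dragonchain's implementation): collect
    block hashes from lower levels and, on a configurable cadence, combine
    them into a single digest that would be anchored to an external chain
    (e.g. Bitcoin or Ethereum) as one transaction."""

    def __init__(self, cadence_seconds: int = 600):
        # The cadence would be tuned per chain based on average fees.
        self.cadence_seconds = cadence_seconds
        self.pending: List[str] = []
        self.last_anchor = time.time()

    def add_block_hash(self, block_hash: str) -> None:
        self.pending.append(block_hash)

    def maybe_anchor(self) -> Optional[str]:
        """Combine all pending block hashes into one proof when the cadence elapses."""
        if time.time() - self.last_anchor < self.cadence_seconds or not self.pending:
            return None
        combined = hashlib.sha256("".join(sorted(self.pending)).encode()).hexdigest()
        self.pending.clear()
        self.last_anchor = time.time()
        return combined  # in practice, embedded in a Bitcoin/Ethereum transaction
```

Because many blocks share one external transaction, the external-chain fee is amortized across all of them, which is what keeps per-transaction costs low.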

Economic Scaling

An interesting feature of Dragonchain’s network consensus is its economics and scarcity model. Since Dragon Net nodes (L2-L4) are independent staking nodes, deployment to cloud platforms would allow any of these nodes to scale to take on a large percentage of the verification work. This is great for scalability but not good for the economics: without scarcity, pricing would spiral downward and result in fewer verification nodes. For this reason, Dragonchain uses TIME as scarcity.
TIME is calculated as the number of Dragons held, multiplied by the number of days held. TIME influences the user’s access to features within the Dragonchain ecosystem. It takes into account both the Dragon balance and length of time each Dragon is held. TIME is staked by users against every verification node and dictates how much of the transaction fees are awarded to each participating node for every block.
TIME also dictates the transaction fee itself for the business node. TIME is staked against a business node to set a deterministic transaction fee level (see transaction fee table below in Cost section). This is very interesting in a discussion about scaling because it guarantees independence for business implementation. No matter how much traffic appears on the entire network, a business is guaranteed to not see an increased transaction fee rate.
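As a rough illustration of the stated formula (Dragons held multiplied by days held), the arithmetic looks like the following sketch; any additional weighting used in production is not modeled here:

```python
from datetime import date

def time_score(holdings, as_of=None):
    """TIME as described above: Dragons held multiplied by days held, summed
    over each holding. `holdings` is a list of (dragon_amount, acquired_date)."""
    as_of = as_of or date.today()
    return sum(amount * (as_of - acquired).days for amount, acquired in holdings)

# Example: 10,000 Dragons held since April 1 and 5,000 held since June 10
print(time_score([(10_000, date(2020, 4, 1)), (5_000, date(2020, 6, 10))],
                 as_of=date(2020, 7, 31)))  # 10_000*121 + 5_000*51 = 1_465_000
```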

Scaled Deployment

Dragonchain uses Docker and Kubernetes to allow the use of best-practice traditional system scaling. Dragonchain offers managed nodes with an easy-to-use web-based console interface. The user may also deploy a Dragonchain node within their own datacenter or favorite cloud platform. Users have deployed Dragonchain nodes on premises and on Amazon AWS, Google Cloud, MS Azure, and other hosting platforms around the world. Any executable code, anything you can write, can be written into a smart contract. This flexibility is what allows us to say that developers with no blockchain experience can use any code language to access the benefits of blockchain. Customers have used NodeJS, Python, Java, and even BASH shell script to write smart contracts on Dragonchain.
With Docker containers, we achieve better separation of concerns, faster deployment, higher reliability, and lower response times.
We chose Kubernetes for its self-healing features, ability to run multiple services on one server, and its large and thriving development community. It is resilient, scalable, and automated. OpenFaaS allows us to package smart contracts as Docker images for easy deployment.
Contract deployment time is now bounded only by the size of the Docker image being deployed but remains fast even for reasonably large images. We also take advantage of Docker’s flexibility and its ability to support any language that can run on x86 architecture. Any image, public or private, can be run as a smart contract using Dragonchain.
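Because any executable code in a Docker image can act as a contract, a "smart contract" can be as small as a program that maps a transaction payload to a result. The sketch below assumes a simple stdin/stdout JSON convention purely for illustration; Dragonchain's actual contract invocation interface (via OpenFaaS) may differ:

```python
#!/usr/bin/env python3
"""Illustrative containerized 'smart contract': any program packaged in a
Docker image can serve as one. The stdin/stdout JSON convention used here is
an assumption for this example, not Dragonchain's actual invocation interface."""
import json
import sys

def handle(payload: dict) -> dict:
    # Example business logic: turn a vote transaction into a karma delta.
    delta = {"up": 1, "down": -1}.get(payload.get("vote"), 0)
    return {"user": payload.get("user"), "karma_delta": delta}

if __name__ == "__main__":
    print(json.dumps(handle(json.load(sys.stdin))))
```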

Flexibility in Scaling

Dragonchain’s architecture considers interoperability and integration as key features. From inception, we had a goal to increase adoption via integration with real business use cases and traditional systems.
We envision the ability for Reddit, in the future, to be able to integrate alternate content storage platforms or other financial services along with the token.
  • LBRY - to allow users to deploy content natively to LBRY
  • MakerDAO - to allow users to lend small amounts backed by their Reddit community points
  • STORJ/SIA - to allow decentralized on-chain storage of portions of content
These integrations, or any others, are relatively straightforward to implement on Dragonchain with an Interchain implementation.

Cost

Cost estimates (on-chain and off-chain)
For the purpose of this proposal, we assume that all transactions are on chain (posts, replies, and votes).
On the Dragonchain network, transaction costs are deterministic/predictable. By staking TIME on the business node (as described above) Reddit can reduce transaction costs to as low as $0.0000025 per transaction.
Dragonchain Fees Table

Getting Started

How to run it
Building on Dragonchain is simple and requires no blockchain experience. Spin up a business node (L1) in our managed environment (AWS), run it in your own cloud environment, or on-prem in your own datacenter. Clear documentation will walk you through the steps of spinning up your first Dragonchain Level 1 Business node.
Getting started is easy...
  1. Download Dragonchain’s dctl
  2. Input three commands into a terminal
  3. Build an image
  4. Run it
More information can be found in our Get started documents.
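Once a Level 1 node is available, posting a transaction from Python looks roughly like the sketch below, which uses the open-source dragonchain-sdk package; the method and parameter names are written from memory of the public SDK and should be verified against the current documentation:

```python
# pip install dragonchain-sdk
import dragonchain_sdk

# Credentials come from the managed console or your node's configuration.
client = dragonchain_sdk.create_client(
    dragonchain_id="YOUR_CHAIN_ID",   # placeholder
    auth_key_id="YOUR_AUTH_KEY_ID",   # placeholder
    auth_key="YOUR_AUTH_KEY",         # placeholder
)

# Post a transaction; it is usable on chain immediately and then begins
# decentralization through Dragon Net as described above.
response = client.create_transaction(
    transaction_type="reddit_vote",   # hypothetical user-defined type
    payload={"user": "u/example", "post": "abc123", "vote": "up"},
)
print(response)
```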

Architecture
Dragonchain is an open source hybrid platform. Through Dragon Net, each chain combines the power of a public blockchain (like Ethereum) with the privacy of a private blockchain.
Dragonchain organizes its network into five separate levels. A Level 1, or business node, is a totally private blockchain only accessible through the use of public/private keypairs. All business logic, including smart contracts, can be executed on this node directly and added to the chain.
After creating a block, the Level 1 business node broadcasts a version stripped of sensitive private data to Dragon Net. Three Level 2 Validating nodes validate the transaction based on guidelines determined from the business. A Level 3 Diversity node checks that the level 2 nodes are from a diverse array of locations. A Level 4 Notary node, hosted by a KYC partner, then signs the validation record received from the Level 3 node. The transaction hash is ledgered to the Level 5 public chain to take advantage of the hash power of massive public networks.
Dragon Net can be thought of as a “blockchain of blockchains”, where every level is a complete private blockchain. Because an L1 can send to multiple nodes on a single level, proof of existence is distributed among many places in the network. Eventually, proof of existence reaches level 5 and is published on a public network.
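The "version stripped of sensitive private data" step can be pictured as replacing each transaction's payload with its hash before the block leaves the business node, so higher levels can verify integrity without seeing the data. A conceptual sketch (not Dragonchain's wire format):

```python
import hashlib
import json

def strip_block_for_dragon_net(block: dict) -> dict:
    """Conceptual illustration only: replace each transaction payload with its
    SHA-256 digest so Level 2-5 nodes can verify and ledger the block without
    ever seeing the private data."""
    stripped = []
    for txn in block["transactions"]:
        payload_bytes = json.dumps(txn["payload"], sort_keys=True).encode()
        stripped.append({
            "txn_id": txn["txn_id"],
            "payload_hash": hashlib.sha256(payload_bytes).hexdigest(),
        })
    return {"block_id": block["block_id"], "transactions": stripped}
```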

API Documentation

APIs (on chain & off)

SDK Source

Nobody’s Perfect

Known issues or tradeoffs
  • Dragonchain is open source, and even though the platform is easy enough for developers to code in any language they are comfortable with, our developer community is not as large as Ethereum’s. We would like to see the Ethereum developer community (and any other communities) become familiar with our SDKs, our solutions, and our platform, to unlock the full potential of our Ethereum Interchain. Long ago we decided to prioritize both Bitcoin and Ethereum Interchains. We envision an ecosystem that encompasses different projects to give developers the ability to take full advantage of all the opportunities blockchain offers to create decentralized solutions, not only for Reddit but for all of our current platforms and systems. We believe that together we will take the adoption of blockchain further. We currently have an additional Interchain with Ethereum Classic and look forward to Interchains with other blockchains in the future. We invite all blockchain projects that believe in decentralization and security to Interchain with Dragonchain.
  • We only have ~700 nodes, compared to roughly 8,000 Ethereum and 10,000 Bitcoin nodes; however, through Interchain we harness those 18,000 nodes to reach extremely high levels of security. See Dragonchain metrics.
  • Some may consider the centralization of Dragonchain’s business nodes as an issue at first glance, however, the model is by design to protect business data. We do not consider this a drawback as these nodes can make any, none, or all data public. Depending upon the implementation, every subreddit could have control of its own business node, for potential business and enterprise offerings, bringing new alternative revenue streams to Reddit.

Costs and resources

Summary of cost & resource information for both on-chain & off-chain components used in the PoC, as well as cost & resource estimates for further scaling. If your PoC is not on mainnet, make note of any mainnet caveats (such as congestion issues).
Every transaction on the PoC system had a transaction fee of $0.0001 (one-hundredth of a cent USD). At 256MM transactions, the demo cost $25,600. With current operational fees, the same demonstration would cost $640 USD.
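As a quick sanity check of the arithmetic above:

```python
transactions = 256_000_000
print(round(transactions * 0.0001, 2))     # demo fee rate      -> 25600.0 USD
print(round(transactions * 0.0000025, 2))  # current lowest fee ->   640.0 USD
```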
For the demonstration, to achieve throughput to mimic a worldwide payments network, we modeled several clients in AWS and 4-5 business nodes to handle the traffic. The business nodes were tuned to handle higher throughput by adjusting memory and machine footprint on AWS. This flexibility is valuable to implementing a system such as envisioned by Reddit. Given that Reddit’s daily traffic (posts, replies, and votes) is less than half that of our demo, we would expect that the entire Reddit system could be handled on 2-5 business nodes using right-sized containers on AWS or similar environments.
Verification was accomplished on the operational Dragon Net network with over 700 independently owned verification nodes running around the world at no cost to the business other than paid transaction fees.

Requirements

Scaling

This PoC should scale to the numbers below with minimal costs (both on & off-chain). There should also be a clear path to supporting hundreds of millions of users.
Over a 5 day period, your scaling PoC should be able to handle:
  • 100,000 point claims (minting & distributing points)
  • 25,000 subscriptions
  • 75,000 one-off points burning
  • 100,000 transfers
During Dragonchain’s 24 hour demo, the above required numbers were reached within the first few minutes.
Reddit’s total activity is 9,000% more than Ethereum’s total transaction volume. Even excluding votes, it is still 700% more than Ethereum’s current volume. Dragonchain has demonstrated that it can handle 250 million transactions a day, and its architecture allows multiple systems to work at that level simultaneously. In our PoC, we demonstrated double Reddit’s full daily volume, and every transaction was proven all the way to Bitcoin and Ethereum.
Reddit Scaling on Ethereum
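To put the requirement in perspective against the demonstrated capacity, a quick back-of-the-envelope comparison using the figures above:

```python
required_ops = 100_000 + 25_000 + 75_000 + 100_000  # Reddit's 5-day requirement
required_rate = required_ops / (5 * 24 * 3600)       # ~0.7 transactions/second
demo_rate = 250_000_000 / (24 * 3600)                # ~2,894 transactions/second sustained
print(required_rate, demo_rate, demo_rate / required_rate)  # headroom of roughly 4,000x
```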

Decentralization

Solutions should not depend on any single third-party provider. We prefer solutions that do not depend on specific entities such as Reddit or another provider, and solutions with no single point of control or failure in off-chain components, but recognize there are numerous trade-offs to consider.
Dragonchain’s architecture calls for a hybrid approach. Private business nodes hold the sensitive data while the validation and verification of transactions for the business are decentralized within seconds and secured to public blockchains within 10 minutes to 2 hours. Nodes could potentially be controlled by owners of individual subreddits for more organic decentralization.
  • Billing is currently centralized - there is a path to federation and decentralization of a scaled billing solution.
  • Operational multi-cloud
  • Operational on-premises capabilities
  • Operational deployment to any datacenter
  • Over 700 independent Community Verification Nodes with proof of ownership
  • Operational Interchain (Interoperable to Bitcoin, Ethereum, and Ethereum Classic, open to more)

Usability

Scaling solutions should have a simple end user experience.

Users shouldn't have to maintain any extra state/proofs, regularly monitor activity, keep track of extra keys, or sign anything other than their normal transactions
Dragonchain and its customers have demonstrated extraordinary usability in many applications, where users do not need to know that the system is backed by a live blockchain. Lyceum is one example: the progress of academy courses is tracked, and successful completion of courses is rewarded with certificates on chain. Our @Save_The_Tweet bot is popular on Twitter; when used with one of the hashtags #please, #blockchain, #ThankYou, or #eternalize, the tweet is saved through Eternal to multiple blockchains, and a proof report is available for future reference. Other examples in use are Den, our decentralized social media platform, and our console, where users can track their node rewards, view their TIME, and operate a business node.
Examples:

Transactions complete in a reasonable amount of time (seconds or minutes, not hours or days)
All transactions are immediately usable on chain by the system. A transaction begins the path to decentralization at the conclusion of a 5-second block when it gets distributed across 5 separate community run nodes. Full decentralization occurs within 10 minutes to 2 hours depending on which interchain (Bitcoin, Ethereum, or Ethereum Classic) the transaction hits first. Within approximately 2 hours, the combined hash power of all interchained blockchains secures the transaction.

Free to use for end users (no gas fees, or fixed/minimal fees that Reddit can pay on their behalf)
With transaction pricing as low as $0.0000025 per transaction, it may be considered reasonable for Reddit to cover transaction fees for users.
All of Reddit's Transactions on Blockchain (month)
Community points can be earned by users and distributed directly to their Reddit account in batch (per Reddit’s minting plan), and users can withdraw rewards to their Ethereum wallet whenever they wish. Withdrawal fees can be paid by either the user or Reddit. This model has been operating inside the Dragonchain system since 2018, and many security and financial compliance features can optionally be added. We feel that this capability greatly enhances the user experience because it is seamless for a regular user without cryptocurrency experience, yet flexible for a tech-savvy user. Currency or token transactions would occur on the Reddit network, verified to BTC and ETH, and would incur the $0.0000025 transaction fee. To estimate this fee we use the monthly active Reddit user count (Statista) with a 60% adoption rate and an estimated 10 transactions per user per month, resulting in an approximate $720 cost across the system. Reddit could feasibly cover all associated internal network charges (mining/minting, transfer, burn) as these are very low and controllable fees.
Reddit Internal Token Transaction Fees
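The ~$720 figure can be re-derived from the stated assumptions. The monthly-user count below is an illustrative assumption chosen to reproduce the proposal's estimate (the proposal cites Statista, and the exact figure is not restated here):

```python
monthly_users = 48_000_000          # assumed for illustration only
adoption_rate = 0.60
txns_per_user_per_month = 10
fee_per_txn = 0.0000025             # USD

monthly_cost = monthly_users * adoption_rate * txns_per_user_per_month * fee_per_txn
print(round(monthly_cost, 2))       # -> 720.0
```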

Reddit Ethereum Token Transaction Fees
When we consider further the Ethereum fees that might be incurred, we have a few choices for a solution.
  1. Offload all Ethereum transaction fees (user withdrawals) to interested users as they wish to withdraw tokens for external use or sale.
  2. Cover Ethereum transaction fees by aggregating them on a timed schedule. Users would request withdrawal (from Reddit or individual subreddits), and the withdrawals would be transacted on the Ethereum network every hour (or on some other schedule); a sketch of this batching approach follows this list.
  3. In a combination of the above, customers could cover aggregated fees.
  4. Integrate with alternate Ethereum roll up solutions or other proposals to aggregate minting and distribution transactions onto Ethereum.
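As an illustration of option 2, here is a minimal, hypothetical batching sketch (all names invented for this example); the actual submission to the Ethereum token contract is out of scope here:

```python
import time
from collections import defaultdict

class WithdrawalBatcher:
    """Roughly option 2 above: queue user withdrawal requests and flush them
    once per interval as a single aggregated batch, so one set of Ethereum
    gas fees is paid per cycle instead of one per user. Conceptual only."""

    def __init__(self, interval_seconds: int = 3600):
        self.interval_seconds = interval_seconds
        self.pending = defaultdict(int)   # eth_address -> token amount
        self.last_flush = time.time()

    def request_withdrawal(self, eth_address: str, amount: int) -> None:
        self.pending[eth_address] += amount

    def flush_if_due(self):
        if time.time() - self.last_flush < self.interval_seconds or not self.pending:
            return None
        batch = dict(self.pending)
        # The batch would then be submitted to the Ethereum token contract in
        # a single transaction (omitted here), amortizing gas across all users.
        self.pending.clear()
        self.last_flush = time.time()
        return batch
```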

Bonus Points

Users should be able to view their balances & transactions via a blockchain explorer-style interface
From interfaces for users who have no knowledge of blockchain technology to users who are well versed in blockchain terms such as those present in a typical block explorer, a system powered by Dragonchain has flexibility on how to provide balances and transaction data to users. Transactions can be made viewable in an Eternal Proof Report, which displays raw data along with TIME staking information and traceability all the way to Bitcoin, Ethereum, and every other Interchained network. The report shows fields such as transaction ID, timestamp, block ID, multiple verifications, and Interchain proof. See example here.
Node payouts within the Dragonchain console are listed in chronological order and can be further seen in either Dragons or USD. See example here.
In our social media platform, Dragon Den, users can see, in real-time, their NRG and MTR balances. See example here.
A new influencer app powered by Dragonchain, Raiinmaker, breaks down data into a user friendly interface that shows coin portfolio, redeemed rewards, and social scores per campaign. See example here.

Exiting is fast & simple
Withdrawing funds on Dragonchain’s console requires three clicks; withdrawal flows with more enhanced security features are also obtainable, at Reddit’s discretion.

Interoperability

Compatibility with third party apps (wallets/contracts/etc) is necessary.
We have proven interoperability at scale that surpasses the required specifications. Our entire platform consists of interoperable blockchains connected to each other and to traditional systems. APIs are well documented. Third-party permissions are possible with a simple smart contract, without the end user being aware. There is no need to learn any specialized proprietary language: any code base (not a subset) is usable within a Docker container, and the platform is interoperable with any blockchain or traditional API. We’ve witnessed relatively complex systems built by engineers with no blockchain or cryptocurrency experience, and we’ve demonstrated the creation of smart contracts within minutes, built with BASH shell and Node.js. Please see our source code and API documentation.

Scaling solutions should be extensible and allow third parties to build on top of it
Open source and extensible
APIs should be well documented and stable

Documentation should be clear and complete
For full documentation, explore our docs, SDKs, GitHub repos, architecture documents, original Disney documentation, and the other links and resources provided in this proposal.

Third-party permissionless integrations should be possible & straightforward
Smart contracts are Docker based, can be written in any language, use the full language (not subsets), and can therefore be integrated with any system including traditional system APIs. Simple is better. Learning an uncommon or proprietary language should not be necessary.
Advanced knowledge of mathematics, cryptography, or L2 scaling should not be required. Compatibility with common utilities & toolchains is expected.
Dragonchain business nodes and smart contracts leverage Docker to allow the use of literally any language or executable code. No proprietary language is necessary. We’ve witnessed relatively complex systems built by engineers with no blockchain or cryptocurrency experience. We’ve also demonstrated the creation of smart contracts within minutes built with BASH shell and Node.js.

Bonus

Bonus Points: Show us how it works. Do you have an idea for a cool new use case for Community Points? Build it!

TIME

Community points could also be awarded to Reddit users based upon TIME, whereby the longer someone is part of a subreddit, the more community points they naturally gain, even if not actively commenting or sharing new posts. A daily login could be required for these community points to be credited. This rewards readers too and incentivizes readers to create an account on Reddit if they browse the website often. This concept could also be leveraged to provide some level of reputation based upon duration and consistency of contribution to a community subreddit.
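A minimal sketch of this idea, assuming the points system has access to each user's join date and daily-login history (all names hypothetical):

```python
from datetime import date
from typing import Set

def time_based_points(join_date: date, login_days: Set[date],
                      points_per_day: float = 0.1) -> float:
    """Award points for membership duration, crediting only days on which the
    user actually logged in, per the daily-login requirement described above."""
    days_as_member = (date.today() - join_date).days
    eligible_days = sum(1 for d in login_days if d >= join_date)
    return min(eligible_days, days_as_member) * points_per_day
```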

Dragon Den

Dragonchain has already built a social media platform that harnesses community involvement. Dragon Den is a decentralized community built on the Dragonchain blockchain platform. Dragon Den is Dragonchain’s answer to fake news, trolling, and censorship. It incentivizes the creation and evaluation of quality content within communities. It could be described as being a shareholder of a subreddit or Reddit in its entirety. The more your subreddit is thriving, the more rewarding it will be. Den is currently in a public beta and in active development, though the real token economy is not live yet. There are different tokens for various purposes. Two tokens are Lair Ownership Rights (LOR) and Lair Ownership Tokens (LOT). LOT is a non-fungible token for ownership of a specific Lair. LOT will only be created and converted from LOR.
Energy (NRG) and Matter (MTR) work jointly. Your MTR determines how much NRG you receive in a 24-hour period. Providing quality content, or evaluating content will earn MTR.

Security. Users have full ownership & control of their points.
All community points awarded based upon any type of activity or gift, are secured and provable to all Interchain networks (currently BTC, ETH, ETC). Users are free to spend and withdraw their points as they please, depending on the features Reddit wants to bring into production.

Balances and transactions cannot be forged, manipulated, or blocked by Reddit or anyone else
Users can withdraw their balance to their ERC20 wallet, directly through Reddit. Reddit can cover the fees on their behalf, or the user covers this with a portion of their balance.

Users should own their points and be able to get on-chain ERC20 tokens without permission from anyone else
Through our console, users can withdraw their ERC20 rewards, and this could be achieved on Reddit too. Here is a walkthrough of our console; although it does not show the quick-withdrawal functionality, a user can withdraw at any time: https://www.youtube.com/watch?v=aNlTMxnfVHw

Points should be recoverable to on-chain ERC20 tokens even if all third-parties involved go offline
If necessary, signed transactions from the Reddit system (e.g. Reddit + Subreddit) can be sent to the Ethereum smart contract for minting.

A public, third-party review attesting to the soundness of the design should be available
To our knowledge, at least two large corporations, including a top 3 accounting firm, have conducted positive reviews. These reviews have never been made public, as Dragonchain did not pay or contract for these studies to be released.

Bonus points
Public, third-party implementation review available or in progress
See above

Compatibility with HSMs & hardware wallets
For the purpose of this proposal, all tokenization would be on the Ethereum network using standard token contracts and as such, would be able to leverage all hardware wallet and Ethereum ecosystem services.

Other Considerations

Minting/distributing tokens is not performed by Reddit directly
This operation can be automated by a smart contract on Ethereum. Subreddits can, if desired, have a role to play.

One off point burning, as well as recurring, non-interactive point burning (for subreddit memberships) should be possible and scalable
This is possible and scalable with interaction between the Dragonchain-based Reddit system and the Ethereum token contract(s).

Fully open-source solutions are strongly preferred
Dragonchain is fully open source (see section on Disney release after conclusion).

Conclusion

Whether it is today, or in the future, we would like to work together to bring secure flexibility to the highest standards. It is our hope to be considered by Ethereum, Reddit, and other integrative solutions so we may further discuss the possibilities of implementation. In our public demonstration, 256 million transactions were handled in our operational network on chain in 24 hours, for the low cost of $25K, which if run today would cost $625. Dragonchain’s interoperable foundation provides the atmosphere necessary to implement a frictionless community points system. Thank you for your consideration of our proposal. We look forward to working with the community to make something great!

Disney Releases Blockchain Platform as Open Source

The team at Disney created the Disney Private Blockchain Platform. The system was a hybrid interoperable blockchain platform for ledgering and smart contract development, geared toward solving problems with blockchain adoption and usability. By any objective evaluation, the team’s output was a success. We released a list of use cases that we explored in some capacity at Disney, along with our input on blockchain standardization, as part of our participation in the W3C Blockchain Community Group.
https://lists.w3.org/Archives/Public/public-blockchain/2016May/0052.html

Open Source

In 2016, Roets proposed to release the platform as open source to spread the technology outside of Disney, as others within the W3C group were interested in the solutions that had been created inside of Disney.
Following a long, step-by-step process, the team met the requirements for release. Among the requirements, the team had to:
  • Obtain VP support and approval for the release
  • Verify ownership of the software to be released
  • Verify that no proprietary content would be released
  • Convince the organization that there was a value to the open source community
  • Convince the organization that there was a value to Disney
  • Offer the plan for ongoing maintenance of the project outside of Disney
  • Itemize competing projects
  • Verify no conflict of interest
  • Preferred license
  • Change the project name to not use the name Disney, any Disney character, or any other associated IP - proposed Dragonchain - approved
  • Obtain legal approval
  • Approval from corporate, parks, and other business units
  • Approval from multiple Disney patent groups
  • Copyright holder defined by Disney (Disney Connected and Advanced Technologies)
  • Trademark searches conducted for the selected name Dragonchain
  • Obtain IT security approval
  • Manual review of OSS components conducted
  • OWASP Dependency and Vulnerability Check Conducted
  • Obtain technical (software) approval
  • Offer management, process, and financial plans for the maintenance of the project.
  • Meet list of items to be addressed before release
  • Remove all Disney project references and scripts
  • Create a public distribution list for email communications
  • Remove Roets’ direct and internal contact information
  • Create public Slack channel and move from Disney slack channels
  • Create proper labels for issue tracking
  • Rename internal private Github repository
  • Add informative description to Github page
  • Expand README.md with more specific information
  • Add information beyond current “Blockchains are Magic”
  • Add getting started sections and info on cloning/forking the project
  • Add installation details
  • Add uninstall process
  • Add unit, functional, and integration test information
  • Detail how to contribute and get involved
  • Describe the git workflow that the project will use
  • Move to public, non-Disney git repository (Github or Bitbucket)
  • Obtain Disney Open Source Committee approval for release
On top of meeting the above criteria, as part of the process, the maintainer of the project had to receive the codebase on their own personal email and create accounts for maintenance (e.g. Github) with non-Disney accounts. Given the fact that the project spanned multiple business units, Roets was individually responsible for its ongoing maintenance. Because of this, he proposed in the open source application to create a non-profit organization to hold the IP and maintain the project. This was approved by Disney.
The Disney Open Source Committee approved the application known as OSSRELEASE-10, and the code was released on October 2, 2016. Disney decided to not issue a press release.
Original OSSRELEASE-10 document

Dragonchain Foundation

The Dragonchain Foundation was created on January 17, 2017. https://den.social/l/Dragonchain/24130078352e485d96d2125082151cf0/dragonchain-and-disney/
submitted by j0j0r0 to ethereum

Brief Comments on Goguen: Q4 2020, Q1 2021, utility, Marlowe, DSL, Glow, Plutus, IELE, smart contracts, thanksgiving to you, sidechains and Hydra, Goguen rollout and additions to product update

Smart contracts (origins in 80s, 90s vs. 2013 ETH and 2020s Cardano)
We had a pretty interesting product update. We laughed, we cried, we all learned a little bit. Two and a half hours, lots of stuff, and I hope this gives you guys a good window into all the things that are happening. There's an enormous amount of complexity in Cardano and Goguen is no different. In fact, that one slide showing all the interlocking dependencies, the moving pieces, and just the sheer volume of things that are going on is an indication of not only the quality of the team but also the commercial reality of being a smart contract platform. In 2013, when I co-founded Ethereum, our reference material was paper. We looked at things that Nick Szabo and people from the 1990s and 1980s wrote about, and whether you were a Ricardian contract fan, or you had programmed in Eiffel, or you understood things like FpML, basically it was an open field, which gave us a kind of freedom to just do whatever we wanted to do, but it also didn't give us a commercial reality of who's going to buy it? Who's going to use it? What do you need to do? The expectations in 2020 are vastly different from the expectations in 2013, and the reality is that there are massive deficits with Ethereum as designed today, which is why Tezos exists and Algorand exists and why ETH2 is being constructed. It's why there are so many different players, from Polkadot and others on down, who have deep and detailed opinions about the things we need to do. Back then the ICO revolution hadn't happened, there was no notion of an ERC20 token, and we were in just a different world.
We didn't have DeFi, any of these things and now in 2020 if you are to be competitive and build great things and actually invite real use and utility at a scale of millions and billions of people or government or Fortune 500 you need to have real good answers about a lot of different threats and things. For example, Marlowe, what it does is it leverages 20 years of history from domain experts like Willi Brammertz and over 30 years of history in domain-specific language (DSL) design from professor Simon Thompson and his team and it puts them together. It says for the first time ever we're going to have semantical clarity between the entrepreneur, the developer, the writer and the financial services infrastructure whether that be the banker, the insurance agent, the exchange, whoever that might be. Up until the totality of human history till today we have never had that semantical clarity. All four of those actors speak different languages and what we're doing with Marlowe as a DSL is an example of how you can unify and create a common language and experience between all of them today.
Marlowe, DSL, Glow, Plutus, IELE
Right now, you guys can go to the Marlowe playground and you can start using it and start building things and start having that semantical clarity and work with us and over a period of six months or so that will continue to evolve. Templates will evolve, applications will be constructed and those applications will work their way into Cardano applications and eventually they'll become cross-platform and work on things like (Hyperledger) fabric and other such things as we see industry and commercial adoption but it requires a starting point and Marlowe has evolved over a four year period through the hard labors of so many people to actually give us a great starting point. You can visually look at contracts and talk about their design. You can write them in JavaScript, you can write them in the Marlowe programming language. There's a Haskell side to things and you can see the power of this approach because of its design. You can prove things are correct, you can use theory that has existed for over 40 years like SAT solvers and reachability to actually show that you're not going to have a parity bug and that's just one example of one DSL of which many more will come. The point of DSLs is to give clarity to people in the industry. For example if we get into the health business and we start talking about medical records that will become a DSL to broker their movement and that same clarity and semantical unification will occur between doctors and hospitals, patients, governments, regulators and business professionals and they will now have a common language. So, Marlowe is an entry point and it's an example of how to build a DSL and evolve a DSL and bring the right people to the table.
When we look to things like Glow, from MuKn, this is an example of a team that's highly motivated and intrinsically across blockchain. When we look to the future and we say what happens when Bitcoin gets smart contracts? What happens when ETH2 comes out? What happens when people want to build cross-blockchain applications? Wouldn't it be nice to have a unification language and that's what Glow is basically all about. By strategic investments in that ecosystem, what Glow does for us is it ensures that we won't be left behind that Cardano has that and all Cardano infrastructure can benefit from that and Glow in turn will benefit from its embedding in our ecosystem. More users, more technology and ultimately because Cardano's the best. If you deploy in that direction it's the best experience. When you look to Plutus, Plutus is the unification language, it's the conductor of the orchestra and it pulls all of these things together and there were a lot of design requirements with Plutus that were quite hard from a theory viewpoint. We really cared a lot about resource determinism. We wanted to make sure that it was always predictable or at least as predictable as it can be to know how much it costs to do things because at the end of the day this is not a science experiment. These are not toys back in 2013. We had the luxury with Ethereum of just seeing what happened and the market makes strategic investments and they have to know how much their operating cost is going to be for their business model. We designed Plutus so that it would be one of the best programming languages on a long arc agenda of being a very practical on and off chain language to unify all the Cardano ecosystem. There are many objects in the ecosystem to operate, manipulate, instruments of value like native assets, identity, smart contracts onto themselves, DAOs, off chain infrastructure and you need a conductor that's capable of living in between all of these things and you need certainty that the code you're writing is going to work.
This is why we based it on an ecosystem that has 35 years of history, and we as a company have invested millions of dollars in that ecosystem to modernize it and bring it into the 21st century, especially for things like Windows support, working with partners like Tweag on WebAssembly support, and working on projects like compilation to JavaScript so that we can share what's there. Our commitment is going to continue beyond that: we are a founding member of the Haskell Foundation, working with Simon Peyton Jones, and we're going to ensure that Haskell has compilation to ARM and that all of the technology that's required to keep that language competitive, and actually make the language even more competitive, will happen. It's very nice that Plutus is deeply ingrained in that ecosystem, and that makes it a perfect conductor language. In the coming months we're going to talk a lot more about our relationship with K and IELE. If you live in the imperative object-oriented world and you want to do things a bit differently than the way things are done in the Haskell functional world, then it makes sense to have an option that has the same principles as us, which is why we reached out to Grigore years ago and established a commercial relationship with him. It's been the privilege of my career finding a way to resurrect that relationship, so in the coming months we're going to talk a lot about how IELE fits into the Cardano ecosystem and the value it's going to bring in addition to the value of Marlowe, Glow, and Plutus.
Native assets
One of the single most important things about all of this is the native asset standard. One of the things we did not anticipate when we created Ethereum is just how pervasive the user's ability to issue an asset would be. We figured this would be an important thing; it's why we put it on a T-shirt back at the Miami conference in January of 2014, and we realized, from the Colored Coins project and the Mastercoin project, that one of the most important things is the ability to issue not just a utility token but non-fungible assets, security tokens, and a litany of other instruments that hold value. Some ephemeral, some permanent, some with flexible monetary policies, some with fixed monetary policies, some from a central issuer, some from a decentralized issuer, some managed by a foundation, some managed by the community, some managed by fixed code that's immutable. The point of the native asset standard and the ERC20 converter is to establish a co-evolution of the technology and the commercialization of the technology. What we've been doing with the ERC20 converter is using it as a way to create a conversation with those who want to migrate to or build on Cardano, and thinking through how we are going to create practical standards with our native assets. We already have enormous advantages with this standard over Ethereum, in particular the fact that the assets you issue on Cardano are treated the way that ADA is treated, whereas in Ethereum you're a second-class citizen: ETH is treated differently from smart contract tokens. This first-class-citizen approach means that your assets will have the same governance, access layer, portfolio access, and infrastructure that ADA itself has. Easier listing experiences, an easier time with hardware wallets, an easier time with wallet software. In general, a better user experience, faster transactions, lower transaction costs, and then eventually, for higher-value tokens, even the possibility of paying transaction fees over the long term in the native asset itself, as if you were your own cryptocurrency.
Goguen rollout
You just simply cannot do this with the design of Ethereum and Ethereum 2. It's a huge advantage we have in our ecosystem and it's one that will become more pervasive over time. Goguen has already started: as a launch agenda, the very first update to enable some Goguen-era functionality was the metadata standard, which meant that you could go from just moving ADA around to a whole litany of applications in the identity space and the metadata space, some of which we're aggressively negotiating on in commercial deals that we'll announce at a later date. The rollout of Goguen in terms of the system, as we mentioned in the presentation, will be principally done for the first iteration over a series of three hard fork combinator (HFC) events. The first of these begins this year in the November-December time frame and is going to lay a lot of the foundations that will enable us to get to the second hard fork combinator event, which will occur in Q1 of next year (we'll announce that specific date likely at the next product update), and then the third one will happen shortly thereafter. They have to be spaced this way because it's simply too cumbersome on our developers, and also on our partners such as wallet infrastructure and exchanges, to try to do too much too quickly. Furthermore, there's an enormous amount of work, as you've noticed on that slide, to roll out Goguen. You have to do two things at once: you have to deploy the infrastructure, but then you also have to populate the infrastructure, and what's nice about the way that we've done things, as you now see with the Marlowe playground, is that the population of that infrastructure is occurring now, today, and with the ERC20 converter and the mint testnet that's coming.
That's going to occur in November, which gives people time to start building and playing in our ecosystem in a safe sandbox, so that when they deploy to the mainnet they do it right the first time and they don't make an existential failure as we have seen in the DeFi space, because at the end of the day, once you go live you have a huge adversarial surface and everybody in the world is going to try to break the things you've done. It's very important that you do it right, which means that you need time as a commercial partner and an application deployer to do it correctly. Parts of Goguen are indeed shipping this year; some have already shipped, and we'll have another HFC event at the end of next month or early in December, and throughout the first quarter of next year, and likely the second quarter, we'll complete the other two HFC events, which will roll out full support for native assets, extended UTxO, the Plutus infrastructure, and the Marlowe infrastructure. In the meantime we're also working on strategies for how we can ensure the best integration of Glow and IELE into the Cardano ecosystem, and as you've noticed, there are three parallel teams that are working very hard. The Shelley team continues to upgrade the Shelley experience. Just today we've received a lot of concerns over, for example, the stake pool ranking in Daedalus. Let me be very clear about something: there's no problem with the ranking software, the problem is the k parameter. It needs to be increased, and the fact that things are getting grayed out is an indication that the ranking parameter is actually working right for the first time. So k needs to go up, but there are consequences of that, and we need to improve the software to reflect those consequences. It is my goal to get k to 1,000 before, ideally, d hits 0, because we really do want to have over 1,000 well-functioning stake pools, but by no means is that the end of the story.
Improvements + project Catalyst
We need partial delegation and delegation portfolios. We need means for stake pool operators to communicate effectively and efficiently with those who delegate to them. We need improvements in SMASH. We need an identity center, we need a litany of improvements to Daedalus itself. Right now, today, there are more than four companies working full-time at doing just these things in addition to the Goguen updates that are occurring right now. That research thread and that development thread will continue. We've already seen seven CIPs including CIPs related to the reward function. We take them very seriously, we review them and there's enormous amount of discussion about how to create a fair and balanced system and we appreciate this feedback. It's a process and we ask for patience and we also remind people that we launched Shelley just at the end of July and despite that the ecosystem has more than doubled in size and it's been growing at an incredible pace and it's only going to continue and we're only going to see our best days ahead of us. Good things are coming down the pipe and it's becoming a much more holistic ecosystem from in performance improvements, to usability improvements, to better overall software for everyone.
There's no greater example of that than what we've been able to accomplish in the last three months for the exchanges in general. We're really proud of what we've done with the Adrestia stack and we're really proud of working with great partners like Binance and Bittrex throughout the last few months. We've had some challenges there, but as a result of overcoming those challenges we have left behind an incredible enterprise-grade listing experience that continues to get faster, continues to get higher quality, and is secure and reliable 24 hours a day, seven days a week, and we'll continue investing heavily to ensure that it only gets better for all of those partners, whether they be an external wallet or infrastructure like an exchange operator. We've had a lot of wins also on the governance side with the Voltaire Catalyst project. We have seen huge wins in participation, going from small focus groups to now over 3,500 people every single day coming into cardano.ideascale.com, competing for 2,250,000 worth of ADA with Fund2. That's just the beginning, and every six to eight weeks that's going to increase in scale, in terms of the money, the people, and the quality. When we ask what our developer acquisition strategy is, that's a major part of it, because people know that there's money to be made in building on Cardano and that you have the right incentives to go realize your dreams and add value. So, just as these frameworks like the Marlowe playground and the Plutus playground, and other such things like Glow and IELE, come online, the ability to build will be matched by the ability to discuss what to build and fund, through a community-driven process that includes greater and greater inclusivity. For example, the next fund will include a voting center built right into Daedalus, in addition to the cell phone application that we've already launched for voting, and we will continue refining that experience relentlessly; that's one of our fastest-moving teams, and I will remind you we are doing this in parallel to the Shelley workstream and the Goguen workstream that we showed you guys today. Finally, there's Basho: not the next hard fork combinator event, but HFC#3, which we anticipate in Q1 2021.
Sidechains, Hydra
I would like to include a sidechain protocol that allows the movement of value between independent systems through some form of locking mechanism. We are currently examining and designing a protocol that we think fits very nicely into the way that our system works, with mild modifications to the ledger rules. Should this be successful, it helps with one of the pillars of Basho, interoperability; the other pillar is scalability. Rob is hard at work with technical architects, scaling up a team to start de-risking the Hydra protocol, and others are hard at work evolving the science behind the Hydra protocol. We have seen great progress on all fronts to de-risk Hydra's rollout, and what's so beautiful about Hydra is that it is our belief that the majority, if not all, of Hydra can be implemented in Plutus. As Plutus rolls out, we have a natural constituency to run this infrastructure, the stake pool operators, and we have a natural way of rolling out Hydra without an HFC event or special accommodation.
Hydra is not really needed at this level of scaling capacity. We have enormous throughput already, 10 times greater than Ethereum as it is today, and room to make it a hundred times greater than what Ethereum is today without Hydra. However, as we de-risk this infrastructure, solidify the protocols, and work out all the kinks, what's so beautiful about it is that when the time comes, the community can roll out multiple implementations of Hydra so that there is diversity, and there will be a natural group of actors to run those channels, as we have seen, for example, with the BOLT spec and the Lightning ecosystem on Bitcoin. The contrasting difference between Lightning and Bitcoin on the one hand, and Hydra, extended UTxO, and Cardano on the other, is that we designed Cardano for Hydra.
Bitcoin was not designed for Lightning, and as a consequence it's always more difficult for them to try to make meaningful progress, whereas for us there's no friction in that relationship; it just fits very nicely. So the roadmap is coming together, and Cardano 2020 has definitely started to evolve into quite a mature ecosystem. What's really exciting is we're going from an ecosystem of potential to one of reality, and instead of asking what we could do, we're showing people what has been done, and people are actually doing things every day.
Our commercial team is inundated with requests for coordination, cooperation, and deployment. I get numerous emails every single day, from the well-intended to the very serious, from people wanting to build on the platform, and we're really excited about that. We're going to keep up this steady, systematic, relentless march, as you saw with the enormity of the news today. It's business as usual, and it'll be exactly the same in November, only there'll be more, and more every month. The velocity increases, we burn down the remaining story points to get these things done, and things are happening very quickly. We just keep releasing and releasing and releasing, and it's a very different time than it was even six months ago.
Community rules
What's so reassuring is that we continue to have the best community in all of the cryptocurrency space. It's the final point, but it's the one that I'm most proud of. You see, people get to decide where they want to live, what infrastructure they want to deploy, and who they want to work with, and when you have a welcoming, warm, and friendly community that is constructive and productive and whose job is to help you get to where you need to go, you want to work with those people. When you have a destructive or toxic community that's exclusive, hierarchical, and not-invented-here in its mentality, people don't want to work with that community. Money can't buy that. I don't care if you have a bank account with four billion dollars or you're a central bank. You can't buy character and you can't buy culture; you have to make it and you have to earn it. If we've accomplished anything over these last five years, from the 90 papers now and the million-plus lines of code and the incredible releases that have happened and continue to happen, we accomplished the greatest thing of all: we built a community to rival that of Bitcoin's.
I believe with that community we can realize the dream, in the coming years, of Cardano becoming the financial operating system for those who don't have one, and of giving open and free economic identity to those who need it. I am astounded by just how easy it is to roll these things out. They're super hard and complex under the hood, but they just feel right and fit right, and all the pieces are starting to come together in just the right way. I'm astounded by the fact that when we roll them out, community members are there to receive them and take them to the next level.
Thank you all for attending the product update at the end of the month. This was a real good one, just as good as the Shelley one, and we are now in the Goguen era, with the first HFC event coming at the end of November, and we're going to keep pushing them out. Every single one of them will add more capabilities, and I encourage everyone to check out the Marlowe playground and start building with it today. Things are happening really fast. When the mint comes online at the end of November, start playing around with that and start talking about the multi-token standard. If you're interested in a project, our commercial division is always open, and you're going to see more and more progress from all entities in this ecosystem, and some potentially major announcements before you can think it. Thanks guys, it was a good day, and thanks to the entire team that made all this happen. I'm real proud of all of you.
Video: https://www.youtube.com/watch?v=l5wADba8kCw
submitted by stake_pool to cardano

Bob The Magic Custodian



Summary: Everyone knows that when you give your assets to someone else, they always keep them safe. If this is true for individuals, it is certainly true for businesses.
Custodians always tell the truth and manage funds properly. They won't have any interest in taking the assets as an exchange operator would. Auditors tell the truth and can't be misled. That's because organizations that are regulated are incapable of lying and don't make mistakes.

First, some background. Here is a summary of how custodians make us more secure:

Previously, we might give Alice our crypto assets to hold. There were risks:

But "no worries", Alice has a custodian named Bob. Bob is dressed in a nice suit. He knows some politicians. And he drives a Porsche. "So you have nothing to worry about!". And look at all the benefits we get:
See - all problems are solved! All we have to worry about now is:
It's pretty simple. Before we had to trust Alice. Now we only have to trust Alice, Bob, and all the ways in which they communicate. Just think of how much more secure we are!

"On top of that", Bob assures us, "we're using a special wallet structure". Bob shows Alice a diagram. "We've broken the balance up and store it in lots of smaller wallets. That way", he assures her, "a thief can't take it all at once". And he points to a historic case where a large sum was taken "because it was stored in a single wallet... how stupid".
"Very early on, we used to have all the crypto in one wallet", he said, "and then one Christmas a hacker came and took it all. We call him the Grinch. Now we individually wrap each crypto and stick it under a binary search tree. The Grinch has never been back since."

"As well", Bob continues, "even if someone were to get in, we've got insurance. It covers all thefts and even coercion, collusion, and misplaced keys - only subject to the policy terms and conditions." And with that, he pulls out a phone-book sized contract and slams it on the desk with a thud. "Yep", he continues, "we're paying top dollar for one of the best policies in the country!"
"Can I read it?' Alice asks. "Sure," Bob says, "just as soon as our legal team is done with it. They're almost through the first chapter." He pauses, then continues. "And can you believe that sales guy Mike? He has the same year Porsche as me. I mean, what are the odds?"

"Do you use multi-sig?", Alice asks. "Absolutely!" Bob replies. "All our engineers are fully trained in multi-sig. Whenever we want to set up a new wallet, we generate 2 separate keys in an air-gapped process and store them in this proprietary system here. Look, it even requires the biometric signature from one of our team members to initiate any withdrawal." He demonstrates by pressing his thumb into the display. "We use a third-party cloud validation API to match the thumbprint and authorize each withdrawal. The keys are also backed up daily to an off-site third-party."
"Wow that's really impressive," Alice says, "but what if we need access for a withdrawal outside of office hours?" "Well that's no issue", Bob says, "just send us an email, call, or text message and we always have someone on staff to help out. Just another part of our strong commitment to all our customers!"

"What about Proof of Reserve?", Alice asks. "Of course", Bob replies, "though rather than publish any blockchain addresses or signed transaction, for privacy we just do a SHA256 refactoring of the inverse hash modulus for each UTXO nonce and combine the smart contract coefficient consensus in our hyperledger lightning node. But it's really simple to use." He pushes a button and a large green checkmark appears on a screen. "See - the algorithm ran through and reserves are proven."
"Wow", Alice says, "you really know your stuff! And that is easy to use! What about fiat balances?" "Yeah, we have an auditor too", Bob replies, "Been using him for a long time so we have quite a strong relationship going! We have special books we give him every year and he's very efficient! Checks the fiat, crypto, and everything all at once!"

"We used to have a nice offline multi-sig setup we've been using without issue for the past 5 years, but I think we'll move all our funds over to your facility," Alice says. "Awesome", Bob replies, "Thanks so much! This is perfect timing too - my Porsche got a dent on it this morning. We have the paperwork right over here." "Great!", Alice replies.
And with that, Alice gets out her pen and Bob gets the contract. "Don't worry", he says, "you can take your crypto-assets back anytime you like - just subject to our cancellation policy. Our annual management fees are also super low and we don't adjust them often".

How many holes have to exist for your funds to get stolen?
Just one.

Why are we taking a powerful offline multi-sig setup, used globally in hundreds of different (and often lacking) regulatory environments with 0 breaches to date, and circumventing it with a demonstrably weaker third-party layer? And paying a great expense to do so?
If you go through the list of breaches in the past 2 years at highly credible organizations, the list of major corporate frauds (only the ones we know about), the list of all the times platforms have lost funds, the list of times and ways people have lost their crypto to identity theft, hot wallet exploits, extortion, etc., and then go through this custodian with a fine-tooth comb and still truly believe they have value to add far beyond what you could add yourself, even then, sticking your funds in a wallet (or set of wallets) they control exclusively is the absolute worst possible way to take advantage of that security.

The best way to add security for crypto-assets is to make a stronger multi-sig. With one custodian, what you are doing is giving them your cryptocurrency and hoping they're honest, competent, and flawlessly secure. It's no different than storing it on a really secure exchange. Maybe the insurance will cover you. Didn't work for Bitpay in 2015. Didn't work for Yapizon in 2017. Insurance has never paid a claim in the entire history of cryptocurrency. But maybe you'll get lucky. Maybe your exact scenario will buck the trend and be what they're willing to cover. After the large deductible and hopefully without a long and expensive court battle.

And you want to advertise this increase in risk, the lapse of judgement, an accident waiting to happen, as though it's some kind of benefit to customers ("Free institutional-grade storage for your digital assets.")? And then some people are writing to the OSC that custodians should be mandatory for all funds on every exchange platform? That this somehow will make Canadians as a whole more secure or better protected compared with standard air-gapped multi-sig? On what planet?

Most of the problems in Canada stemmed from one thing - a lack of transparency. If Canadians had known what a joke Quadriga was - it wouldn't have grown to lose $400m from hard-working Canadians from coast to coast to coast. And Gerald Cotten would be in jail, not wherever he is now (at best, rotting peacefully). EZ-BTC and mister Dave Smilie would have been a tiny little scam to his friends, not a multi-million dollar fraud. Einstein would have got their act together or been shut down BEFORE losing millions and millions more in people's funds generously donated to criminals. MapleChange wouldn't have even been a thing. And maybe we'd know a little more about CoinTradeNewNote - like how much was lost in there. Almost all of the major losses with cryptocurrency exchanges involve deception with unbacked funds.
So it's great to see transparency reports from BitBuy and ShakePay where someone independently verified the backing. The only thing we don't have is:
It's not complicated to validate cryptocurrency assets. They need to exist, they need to be spendable, and they need to cover the total balances. There are plenty of credible people and firms across the country that have the capacity to reasonably perform this validation. Having more frequent checks by different, independent, parties who publish transparent reports is far more valuable than an annual check by a single "more credible/official" party who does the exact same basic checks and may or may not publish anything. Here's an example set of requirements that could be mandated:
There are ways to structure audits such that neither crypto assets nor customer information are ever put at risk, and both can still be properly validated and publicly verifiable. There are also ways to structure audits such that they are completely reasonable for small platforms and don't inhibit innovation in any way. By making the process as reasonable as possible, we can completely eliminate any reason/excuse that an honest platform would have for not being audited. That is arguably far more important than any incremental improvement we might get from mandating "the best of the best" accountants. Right now we have nothing mandated and tons of Canadians using offshore exchanges with no oversight whatsoever.

Transparency does not prove crypto assets are safe. CoinTradeNewNote, Flexcoin ($600k), and Canadian Bitcoins ($100k) are examples where crypto-assets were breached from platforms in Canada. All of them were online wallets and used no multi-sig as far as any records show. This is consistent with what we see globally - air-gapped multi-sig wallets have an impeccable record, while other schemes tend to suffer breach after breach. We don't actually know how much CoinTrader lost because there was no visibility. Rather than publishing details of what happened, the co-founder of CoinTrader silently moved on to found another platform - the "most trusted way to buy and sell crypto" - a site that has no information whatsoever (that I could find) on the storage practices and a FAQ advising that “[t]rading cryptocurrency is completely safe” and that having your own wallet is “entirely up to you! You can certainly keep cryptocurrency, or fiat, or both, on the app.” Doesn't sound like much was learned here, which is really sad to see.
It's not that complicated or unreasonable to set up a proper hardware wallet. Multi-sig can be learned in a single course. Something the equivalent complexity of a driver's license test could prevent all the cold storage exploits we've seen to date - even globally. Platform operators have a key advantage in detecting and preventing fraud - they know their customers far better than any custodian ever would. The best job that custodians can do is to find high integrity individuals and train them to form even better wallet signatories. Rather than mandating that all platforms expose themselves to arbitrary third party risks, regulations should center around ensuring that all signatories are background-checked, properly trained, and using proper procedures. We also need to make sure that signatories are empowered with rights and responsibilities to reject and report fraud. They need to know that they can safely challenge and delay a transaction - even if it turns out they made a mistake. We need to have an environment where mistakes are brought to the surface and dealt with. Not one where firms and people feel the need to hide what happened. In addition to a knowledge-based test, an auditor can privately interview each signatory to make sure they're not in coercive situations, and we should make sure they can freely and anonymously report any issues without threat of retaliation.
A proper multi-sig has each signature held by a separate person and is governed by policies and mutual decisions instead of a hierarchy. It includes at least one redundant signature. For best results, 3of4, 3of5, 3of6, 4of5, 4of6, 4of7, 5of6, or 5of7.
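As a rough illustration of these guidelines, here is a minimal Python sketch (the function and checks are invented for this example, not taken from the post) that verifies an M-of-N configuration requires a mutual decision and leaves at least one redundant signature:

```python
def check_multisig_policy(m: int, n: int) -> list[str]:
    """Sanity-check an M-of-N multi-sig configuration against the
    guidelines above: mutual decisions (m >= 2) and at least one
    redundant signature (n > m)."""
    issues = []
    if m < 2:
        issues.append("a single signer can move funds (no mutual decision)")
    if n - m < 1:
        issues.append("no redundant key: one lost or withheld key freezes the funds")
    return issues


if __name__ == "__main__":
    for m, n in [(2, 2), (2, 3), (3, 5), (4, 7)]:
        problems = check_multisig_policy(m, n)
        print(f"{m}-of-{n}: {'ok' if not problems else '; '.join(problems)}")
```

Running it shows, for instance, that 2-of-2 fails the redundancy check (one lost key locks the funds forever), while the configurations listed above all pass.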

History has demonstrated over and over again the risk of hot wallets even to highly credible organizations. Nonetheless, many platforms have hot wallets for convenience. While such losses are generally compensated by platforms without issue (for example Poloniex, Bitstamp, Bitfinex, Gatecoin, Coincheck, Bithumb, Zaif, CoinBene, Binance, Bitrue, Bitpoint, Upbit, VinDAX, and now KuCoin), the public tends to focus more on cases that didn't end well. Regardless of what systems are employed, there is always some level of risk. For that reason, most members of the public would prefer to see third party insurance.
Rather than trying to convince third party profit-seekers to provide comprehensive insurance and then relying on an expensive and slow legal system to enforce against whatever legal loopholes they manage to find each and every time something goes wrong, insurance could be run through multiple exchange operators and regulators, with the shared interest of having a reputable industry, keeping costs down, and taking care of Canadians. For example, a 4 of 7 multi-sig insurance fund held between 5 independent exchange operators and 2 regulatory bodies. All Canadian exchanges could pay premiums at a set rate based on their needed coverage, with a higher price paid for hot wallet coverage (anything not an air-gapped multi-sig cold wallet). Such a model would be much cheaper to manage, offer better coverage, and be much more reliable to payout when needed. The kind of coverage you could have under this model is unheard of. You could even create something like the CDIC to protect Canadians who get their trading accounts hacked if they can sufficiently prove the loss is legitimate. In cases of fraud, gross negligence, or insolvency, the fund can be used to pay affected users directly (utilizing the last transparent balance report in the worst case), something which private insurance would never touch. While it's recommended to have official policies for coverage, a model where members vote would fully cover edge cases. (Could be similar to the Supreme Court where justices vote based on case law.)
Such a model could fully protect all Canadians across all platforms. You can have a fiat coverage governed by legal agreements, and crypto-asset coverage governed by both multi-sig and legal agreements. It could be practical, affordable, and inclusive.
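As a back-of-the-envelope sketch of how premiums under such a shared fund might be set, here is a small Python example. All of the rates, coverage amounts, and platform names below are made-up assumptions for illustration, not figures from this post; the only idea taken from the text is that hot-wallet coverage is priced higher than air-gapped multi-sig cold storage:

```python
# Hypothetical premium schedule for a shared, industry-run insurance fund.
COLD_RATE = 0.010   # assumed annual premium per dollar of cold (air-gapped multi-sig) coverage
HOT_RATE = 0.025    # assumed higher annual rate for hot-wallet coverage

def annual_premium(cold_coverage: float, hot_coverage: float) -> float:
    """Premium owed by one platform, priced by the coverage it needs."""
    return cold_coverage * COLD_RATE + hot_coverage * HOT_RATE

platforms = {
    "ExchangeA": (40_000_000, 2_000_000),   # (cold, hot) coverage in dollars, illustrative
    "ExchangeB": (10_000_000, 1_000_000),
    "ExchangeC": (5_000_000, 500_000),
}

total = 0.0
for name, (cold, hot) in platforms.items():
    premium = annual_premium(cold, hot)
    total += premium
    print(f"{name}: ${premium:,.0f} per year")
print(f"Pooled fund inflow: ${total:,.0f} per year")
```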

Now, we are at a crossroads. We can happily give up our freedom, our innovation, and our money. We can pay hefty expenses to auditors, lawyers, and regulators year after year (and make no mistake - this cost will grow to many millions or even billions as the industry grows - and it will be borne by all Canadians on every platform because platforms are not going to eat up these costs at a loss). We can make it nearly impossible for any new platform to enter the marketplace, forcing Canadians to use the same stagnant platforms year after year. We can centralize and consolidate the entire industry into 2 or 3 big players and have everyone else fail (possibly to heavy losses of users of those platforms). And when a flawed security model doesn't work and gets breached, we can make it even more complicated with even more people in suits making big money doing the job that blockchain was supposed to do in the first place. We can build a system which is so intertwined and dependent on big government, traditional finance, and central bankers that its future depends entirely on that of the fiat system, of fractional banking, and of government bail-outs. If we choose this path, as history has shown us over and over again, we cannot go back, save for revolution. Our children and grandchildren will still be paying the consequences of what we decided today.
Or, we can find solutions that work. We can maintain an open and innovative environment while making the adjustments we need to make to fully protect Canadian investors and cryptocurrency users, giving easy and affordable access to cryptocurrency for all Canadians on the platform of their choice, and creating an environment in which entrepreneurs and problem solvers can bring those solutions forward easily. None of the above precludes innovation in any way, or adds any unreasonable cost - and these three policies would demonstrably eliminate or resolve all 109 historic cases as studied here - that's every single case researched so far going back to 2011. It includes every loss that was studied so far not just in Canada but globally as well.
Unfortunately, finding answers is the least challenging part. Far more challenging is to get platform operators and regulators to agree on anything. My last post got no response whatsoever, and while the OSC has told me they're happy for industry feedback, I believe my opinion alone is fairly meaningless. This takes the whole community working together to solve. So please let me know your thoughts. Please take the time to upvote and share this with people. Please - let's get this solved and not leave it up to other people to do.

Facts/background/sources (skip if you like):



Thoughts?
submitted by azoundria2 to QuadrigaInitiative [link] [comments]

r/Bitcoin recap - March 2018

Hi Bitcoiners!
I’m back with the fifteenth monthly Bitcoin news recap.
For those unfamiliar, each day I pick out the most popular/relevant/interesting stories in Bitcoin and save them. At the end of the month I release them in one batch, to give you a quick (but not necessarily the best) overview of what happened in bitcoin over the past month.
And a lot has happened. It's easy to forget with so much focus on the price. Take a moment and scroll through the list below. You'll find an incredibly eventful month.
You can see recaps of the previous months on Bitcoinsnippets.com
A recap of Bitcoin in March 2018
submitted by SamWouters to Bitcoin [link] [comments]

Cryptocurrency NEO-review and analysis of prospects

Cryptocurrency NEO-review and analysis of prospects

https://preview.redd.it/92i8bo3tm1v31.png?width=800&format=png&auto=webp&s=392f964144975e5e2e11a6ea784f6f03923017b3
The NEO digital asset platform was previously called Antshares. Recently, a complete rebranding was carried out. In addition to the name change, the startup updated its blockchain nodes and technical documentation, as well as its ticker. The official website and social media were also redesigned, and the transition to a new version of the smart contract system, called NEO-2.0, was completed.
The NEO cryptocurrency has shown steady, uninterrupted growth for a long time. Very quickly, the Chinese project took seventh place in the Coinmarketcap rankings. This is, without a doubt, a serious statement of intent, given the stiff competition in the cryptocurrency market. Ether, for its part, confidently holds second place behind the famous Bitcoin. So NEO clearly has every chance of climbing well above seventh.
At the moment, the price fluctuates around $45; over the past three months it has increased 20-fold.
The NEO supply is clearly defined and limited to 100 million tokens. So far, only half of that potential, 50 million tokens, is available on the market.
The project is actively developing. OnChain cooperates with other players in the cryptocurrency and blockchain space; at the moment it has ties with blockchain startups such as Coindash, Bancor, and Agrello. The Chinese project Red Pulse has announced a financial research platform built on the NEO-2.0 smart contract system, and the Elastos blockchain-based operating system is being developed intensively in cooperation with NEO.

The history of the emergence and development of the NEO cryptocurrency

https://preview.redd.it/2f7c6ryop1v31.png?width=1280&format=png&auto=webp&s=300b03be2a471d857d7d22d5659f2a4ef74c5e8b
The project's origins can be traced to 2014. NEO creator Da Hongfei is a director of Shanghai-based OnChain. In 2014, OnChain, at Da Hongfei's initiative, launched the AntShares blockchain project, and a cryptocurrency of the same name was created on top of that platform.
Da Hongfei set the company a simple but global task: to build a fundamentally new system of financial interaction. This system should unite the real and virtual sectors of the economy into a single whole with the help of high-tech contracts, and OnChain's cryptocurrency should become the unit of payment for these contracts.
Soon OnChain enters into a contract for cooperation with the Wings blockchain project, as well as contracts with economic giants Microsoft and Alibaba.
In August 2017, the story of NEO in its current form, under its current name, begins. Da Hongfei carried out a complete rebranding and technical modernization of the project. The rebranding was a huge success, and the price of OnChain's cryptocurrency soared 40-fold.
But it was not without problems. On the fourth of September, the Chinese authorities adopted a package of laws sanctioning cryptocurrencies and ICOs. It was a heavy blow that at the time nearly halved the price of Hongfei's brainchild. However, NEO soon recovered and began to confidently win back lost ground. At the moment, OnChain is actively upgrading the product while trying to reach a compromise with the Chinese authorities so that its offspring can be legalized and operate in peace.

Features and operating principles of NEO

https://preview.redd.it/tj1goppoq1v31.png?width=800&format=png&auto=webp&s=0c39d14754ba9dd99e2c6bfb692f0f7bdd6c1838
From a technical point of view, the Chinese cryptocurrency is very similar to Ethereum. The basis of the platform is the construction of smart contracts and their subsequent payment with tokens. Also an important part of the project is the ability to create new technologies based on the platform, as well as easy integration with other services.
Even though NEO is often called the "Chinese Ether", and Ether still occupies a higher position in the rankings, the product from OnChain has advantages that Ether lacks. NEO is more practical and functional, which no doubt gives it the potential to overtake Ether in the rankings in the near future.
Let's look in detail at how everything works. Transactions within the system require paying a fee, and the fee is paid in an in-system currency. In other words, each transaction needs extra "fuel" added to the system. The developers at OnChain created a separate in-system currency, called GAS, to serve as that fuel (the means of paying fees).
NEO itself cannot be mined. The supply is capped at 100 million coins: 50 million were released to the market during the ICO, and the developers hold the other half. GAS, however, can be "mined", and it accrues simply by holding coins in a wallet. The more tokens you hold, the more GAS you receive to pay fees. Today, 2,000 coins in a wallet accumulate 1 GAS every twenty-four hours. This kind of generation comes from the network's Proof-of-Stake algorithm: the coins generate GAS by themselves, without farms of video cards or megawatts of electricity.
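As a rough illustration of that accrual rate, here is a tiny Python sketch based on the figure quoted above (2,000 NEO generating about 1 GAS per day); the function is hypothetical and ignores real-world details such as the generation rate declining over time:

```python
GAS_PER_DAY_PER_2000_NEO = 1.0  # rate quoted above; the real rate decays over the long run

def gas_generated(neo_balance: float, days: int) -> float:
    """Estimate GAS accrued by simply holding `neo_balance` NEO for `days` days."""
    return neo_balance / 2000.0 * GAS_PER_DAY_PER_2000_NEO * days

# Example: holding 500 NEO for 30 days
print(f"{gas_generated(500, 30):.2f} GAS")  # about 7.50 GAS under this assumption
```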

Like any cryptocurrency, NEO has advantages and disadvantages.
The benefits of NEO:
  • the publicity of the company and experienced team;
  • contracts and cooperation with corporate giants;
  • wide functionality, much superior to that of Ether (it is difficult for a layperson to grasp what the point is, but for a specialist NEO opens up the widest horizons for development and operations);
  • activity in meetings and seminars;
  • OnChain's active push for legalization (although there are some problems with this in China right now, there is a high probability that all issues with the government will soon be settled, which would attract large investors and significantly increase NEO's already considerable capitalization).
The shortcomings of NEO:
  • all GAS storage nodes belong to OnChain; that is, NEO is a centralized structure even though it is presented as decentralized, which means the blockchain is in the hands of a narrow circle;
  • OnChain has the technical ability to monitor coin owners' transactions, pass information to the authorities, and even block funds in users' accounts and influence the exchange rate.
However, there are great economic and technical prospects for the development and increase in the price of the coin.

Difference between NEO and Bitcoin

https://preview.redd.it/qm9q0kmft1v31.png?width=1024&format=png&auto=webp&s=81bc03a4521b12f6e517b9ef8f905d8271041355
The main points that distinguish NEO from Bitcoin:

  • Direct mining of NEO is not possible; you can only mine GAS to pay fees.
  • Bitcoin mining depends on the power of the miner's hardware: the larger the pool of GPU farms, the more active the production. In the NEO system, GAS generation happens purely because coins sit in the wallet.
  • Organizing large-scale Bitcoin mining requires major hardware purchases and operational setup (high-capacity power lines, cooling systems, and so on). For GAS, a direct investment is enough: every 2,000 purchased coins of "Chinese Ether" will steadily accumulate 1 GAS per day.
  • Bitcoin has the most decentralized blockchain, as opposed to NEO's pseudo-decentralization.
  • The processing time for one NEO block is only 15 seconds; for Bitcoin it is as much as ten minutes. In the future, NEO block processing is predicted to accelerate to 1 second.
Despite the risks associated with the organization of blockchains, NEO remains a very promising platform in the cryptocurrency market.

NEO storage wallets

On the official NEO website you can find links to the following wallets.
  1. NEON Wallet, from the independent developer group City of Zion. Quite good, but third-party development and the presence of bugs carry their own risks.
  2. NEO-CLI. This wallet is recommended only for programmers and people who are comfortable with the command line.
  3. NEO-GUI. The best option for the average user. To use it, you need to download the application, synchronize the blockchain, and back up the wallet. That's all; you can now safely carry out financial transactions with the Chinese cryptocurrency.
There is also the option of storing directly on the exchanges, however it is risky. Also, holding coins on an exchange rather than in a personal NEO wallet will not generate GAS.

NEO: buying and exchanging

https://preview.redd.it/5ovlowr3v1v31.png?width=750&format=png&auto=webp&s=ea4c20d0be463cbaaf14846db2355b77975cd296
NEO can be bought and sold on exchanges or exchanged in multi-currency wallets. The most famous exchanges:

  • Bittrex;
  • Binance;
  • CoinSpot;
  • YONBI;
  • JUBI;
  • Yuanbao;
  • 51szzc;
  • Yobtc.
As the value and popularity of NEO increases, a massive increase in trading platforms where you can buy or sell "Chinese Ether" is predicted.

Ways to get NEO

Unfortunately, at the moment there is no way to mine NEO directly in the manner of Bitcoin or Ether.
However, there is a way out: NEO faucets can be used. Faucets are sites where the user receives a cryptocurrency reward for performing certain tasks or participating in lotteries.
There is a high probability that, if the legalization negotiations succeed, OnChain will provide additional ways to obtain its tokens.
As you can see, NEO is a promising and rapidly developing cryptocurrency. And although the Chinese government has created some difficulties, Bitcoin's example shows how high a cryptocurrency's price can rise once the factors holding back its development disappear. So the outlook for NEO is optimistic, and it may be worth the risk of investing.
submitted by AVAY11 to u/AVAY11 [link] [comments]

batching in Bitcoin

On May 6th, 2017, Bitcoin hit an all-time high in transactions processed on the network in a single day: it moved 375,000 transactions which accounted for a nominal output of about $2.5b. Average fees on the Bitcoin network had climbed over a dollar for the first time a couple days prior. And they kept climbing: by early June average fees hit an eye-watering $5.66. This was quite unprecedented. In the three-year period from Jan. 1 2014 to Jan. 1 2017, per-transaction fees had never exceeded 31 cents on a weekly average. And the hits kept coming. Before 2017 was over, average fees would top out at $48 on a weekly basis. When the crypto-recession set in, transaction count collapsed and fees crept back below $1.
During the most feverish days of the Bitcoin run-up, when normal users found themselves with balances that would cost more to send than they were worth, cries for batching — the aggregation of many outputs into a single transaction — grew louder than ever. David Harding had written a blog post on the cost-savings of batching at the end of August and it was reposted to the Bitcoin subreddit on a daily basis.
The idea was simple: for entities sending many transactions at once, clustering outputs into a single transaction was more space- (and cost-) efficient, because each transaction has a fixed data overhead. David found that if you combined 10 payments into one transaction, rather than sending them individually, you could save 75% of the block space. Essentially, batching is one way to pack as many transactions as possible into the finite block space available on Bitcoin.
When fees started climbing in mid-2017, users began to scrutinize the behavior of heavy users of the Bitcoin blockchain, to determine whether they were using block space efficiently. By and large, they were not — and an informal lobbying campaign began, in which these major users — principally exchanges — were asked to start batching transactions and be good stewards of the scarce block space at their disposal. Some exchanges had been batching for years, others relented and implemented it. The question faded from view after Bitcoin’s price collapsed in Q1 2018 from roughly $19,000 to $6000, and transaction load — and hence average fee — dropped off.
But we remained curious. A common refrain, during the collapse in on-chain usage, was that transaction count was an obfuscated method of apprehending actual usage. The idea was that transactions could encode an arbitrarily large (within reason) number of payments, and so if batching had become more and more prevalent, those payments were still occurring, just under a regime of fewer transactions.

“hmmm”
Some sites popped up to report outputs and payments per day rather than transactions, seemingly bristling at the coverage of declining transaction count. However, no one conducted an analysis of the changing relationship between transaction count and outputs or payments. We took it upon ourselves to find out.
Table Of Contents:
Introduction to batching
A timeline
Analysis
Conclusion
Bonus content: UTXO consolidation
  1. Introduction to batching
Bitcoin uses a UTXO model, which stands for Unspent Transaction Output. In comparison, Ripple and Ethereum use an account/balance model. In bitcoin, a user has no balances, only UTXOs that they control. If they want to transfer money to someone else, their wallet selects one or more UTXOs as inputs that in sum need to add up to the amount they want to transfer. The desired amount then goes to the recipient, which is called the output, and the difference goes back to the sender, which is called change output. Each output can carry a virtually unlimited amount of value in the form of satoshis. A satoshi is a unit representing a one-hundred-millionth of a Bitcoin. This is very similar to a physical wallet full of different denominations of bills. If you’re buying a snack for $2.50 and only have a $5, you don’t hand the cashier half of your 5 dollar bill — you give him the 5 and receive some change instead.
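To make the UTXO and change mechanics concrete, here is a deliberately simplified Python sketch of how a wallet might pick inputs and compute the change output (greedy largest-first selection, with fees ignored); it is illustrative only, not how any particular wallet actually selects coins:

```python
def select_utxos(utxos, amount):
    """Pick UTXOs (largest first) until they cover `amount`; return
    the selected inputs and the change that goes back to the sender.
    Fees are ignored for simplicity."""
    selected, total = [], 0
    for value in sorted(utxos, reverse=True):
        selected.append(value)
        total += value
        if total >= amount:
            change = total - amount
            return selected, change
    raise ValueError("insufficient funds")

# Wallet holding three UTXOs (values in satoshis), paying 0.7 BTC
inputs, change = select_utxos([30_000_000, 50_000_000, 10_000_000], 70_000_000)
print(inputs)   # [50000000, 30000000] -> two inputs consumed
print(change)   # 10000000 satoshis returned to the sender as a change output
```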
Unknown to some, there is no hardcoded limit to the number of transactions that can fit in a block. Instead, each transaction has a certain size in bytes and carries a fee, which constitutes an economic incentive for miners to include it in their block. Because miners have limited space of 2 MB to sell to transactors, larger transactions (in size, not bitcoin!) will need to pay higher fees to be included. Additionally, each transaction can have a virtually unlimited number of inputs or outputs — the record stands at transactions with 20,000 inputs and 13,107 outputs.
So each transaction has at least one input and one output, but often more, as well as some additional boilerplate stuff. Most of that space is taken up by the input (often 60% or more, because of the signature that proves they really belong to the sender), while the output(s) account for 15–30%. In order to keep transactions as small as possible and save fees, Bitcoin users have two major choices:
Use as few inputs as possible. In order to minimize inputs, you can periodically send your smaller UTXOs to yourself in times when fees are very low, getting one large UTXO back. That is called UTXO consolidation or consolidating your inputs.
Users who frequently make transfers (especially within the same block) can include an almost unlimited amount of outputs (to different people!) in the same transaction. That is called transaction batching. A typical single output transaction takes up 230 bytes, while a two output transaction only takes up 260 bytes, instead of 460 if you were to send them individually.
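These byte figures can be approximated with a commonly used rule of thumb for legacy transactions: roughly 10 bytes of overhead, about 148 bytes per input, and about 34 bytes per output. The constants vary by script type, so the absolute numbers below come out slightly different from the 230/260 bytes quoted above, but the relative saving from batching is what matters. A minimal sketch, assuming those approximate constants:

```python
def tx_size(n_inputs: int, n_outputs: int) -> int:
    """Approximate legacy transaction size: ~10 bytes overhead,
    ~148 bytes per input, ~34 bytes per output."""
    return 10 + 148 * n_inputs + 34 * n_outputs

def batching_savings(n_payments: int) -> float:
    # Individually: n transactions, each with 1 input and 1 payment + 1 change output
    individual = n_payments * tx_size(1, 2)
    # Batched: 1 transaction with 1 input and n payments + 1 change output
    batched = tx_size(1, n_payments + 1)
    return 1 - batched / individual

print(tx_size(1, 1))                  # ~192 bytes for a one-output transaction
print(tx_size(1, 2))                  # ~226 bytes for a two-output transaction
print(f"{batching_savings(10):.0%}")  # ~76%, in line with the ~75% savings quoted earlier
```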
This is something that many casual commentators overlook when comparing Bitcoin with other payment systems — a Bitcoin transaction can aggregate thousands of individual economic transfers! It’s important to recognize this, as it is the source of a great deal of misunderstanding and mistaken analysis.
We’ve never encountered a common definition of a batched transaction — so for the purposes of this study we define it in the loosest possible sense: a transaction with three or more outputs. Commonly, batching is understood as an activity undertaken primarily by mining pools or exchanges who can trade off immediacy for efficiency. It is rare that a normal bitcoin user would have cause to batch, and indeed most wallets make it difficult to impossible to construct batched transactions. For everyday purposes, normal bitcoiners will likely not go to the additional effort of batching transactions.
We set the threshold at three for simplicity’s sake — a normal unbatched transaction will have one transactional output and one change output — but the typical major batched transaction from an exchange will have dozens if not hundreds of outputs. For this reason we are careful to provide data on various different batch sizes, so we could determine the prevalence of three-output transactions and colossal, 100-output ones.
We find it helpful to think of a Bitcoin transaction as a mail truck full of boxes. Each truck (transaction) contains boxes (outputs), each of which contains some number of letters (satoshis). So when you’re looking at transaction count as a measure of the performance and economic throughput of the Bitcoin network, it’s a bit like counting mail trucks to discern how many letters are being sent on a given day, even though the number of letters can vary wildly. The truck analogy also makes it clear why many see Bitcoin as a settlement layer in the future — just as mail trucks aren’t dispatched until they’re full, some envision that the same will ultimately be the case for Bitcoin.

Batching
  2. A timeline
So what actually happened in the last six months? Let’s look at some data. Daily transactions on the Bitcoin network rose steadily until about May 2017, when average fees hit about $4. This precipitated the first collapse in usage. Then began a series of feedback loops over the next six months in which transaction load grew, fees grew to match, and transactions dropped off. This cycle repeated itself five times over the latter half of 2017.

more like this on coinmetrics.io
The solid red line in the above chart is fees in BTC terms (not USD) and the shaded red area is daily transaction count. You can see the cycle of transaction load precipitating higher fees which in turn cause a reduction in usage. It repeats itself five or six times before the detente in spring 2018. The most notable period was the December-January fee crisis, but fees were actually fairly typical in BTC terms — the rising BTC price in USD however meant that USD fees hit extreme figures.
In mid-November when fees hit double digits in USD terms, users began a concerted campaign to convince exchanges to be better stewards of block space. Both Segwit and batching were held up as meaningful approaches to maximize the compression of Bitcoin transactions into the finite block space available. Data on when exchanges began batching is sparse, but we collected information where it was available into a chart summarizing when exchanges began batching.

Batching adoption at selected exchanges
We’re ignoring Segwit adoption by exchanges in this analysis; as far as batching is concerned, the campaign to get exchanges to batch appears to have persuaded Bitfinex, Binance, and Shapeshift to batch. Coinbase/GDAX have stated their intention to begin batching, although they haven’t managed to integrate it yet. As far as we can tell, Gemini hasn’t mentioned batching, although we have some mixed evidence that they may have begun recently. If you know about the status of batching on Gemini or other major exchanges please get in touch.
So some exchanges have been batching all along, and some have never bothered at all. Did the subset of exchanges who flipped the switch materially affect the prevalence of batched transactions? Let’s find out.
  3. Analysis
3.1 How common is batching?
We measured the prevalence of batching in three different ways, by transaction count, by output value and by output count.

The tl;dr.
Batching accounts for roughly 12% of all transactions, 40% of all outputs, and 30–60% of all raw BTC output value. Not bad.
3.2 Have batched transactions become more common over time?
From the chart in 3.1, we can already see a small, but steady uptrend in all three metrics, but we want to dig a little deeper. So we first looked at the relationship of payments (all outputs that actually pay someone, so total outputs minus change outputs) and transactions.

More at transactionfee.info/charts
The first thing that becomes obvious is that the popular narrative — that the drop in transactions was caused by an increase in batching — is not the case; payments dropped by roughly the same proportion as well.
Dividing payment count by transaction count gives us some insight into the relationship between the two.

In our analysis we want to zoom into the time frame between November 2017 and today, and we can see that payments per transactions have actually been rallying, from 1.5 payments per transaction in early 2017 to almost two today.
3.3 What are popular batch sizes?
In this next part, we will look at batch sizes to see which are most popular. To determine which transactions were batched, we downloaded a dataset of all transactions on the Bitcoin network between November 2017 and May 2018 from Blockchair.
We picked that period because the fee crisis really got started in mid-November, and with it, the demands for exchanges to batch. So we wanted to capture the effect of exchanges starting to batch. Naturally a bigger sample would have been more instructive, but we were constrained in our resources, so we began with the six month sample.
We grouped transactions into “batched” and “unbatched” groups with batched transactions being those with three or more outputs.
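As an illustration of how such a grouping could be reproduced, here is a hedged pandas sketch. The CSV file name and the `output_count` / `output_total_btc` column names are hypothetical stand-ins; Blockchair's actual export format may differ:

```python
import pandas as pd

# Hypothetical per-transaction export covering Nov 2017 - May 2018
txs = pd.read_csv("transactions_nov2017_may2018.csv")

# The loose definition used here: batched = three or more outputs
txs["batched"] = txs["output_count"] >= 3
batched = txs[txs["batched"]]

print("share of transactions:", len(batched) / len(txs))
print("share of outputs:     ", batched["output_count"].sum() / txs["output_count"].sum())
print("share of BTC output:  ", batched["output_total_btc"].sum() / txs["output_total_btc"].sum())
```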

We then divided batched transactions into roughly equal groups on the basis of how much total output in BTC they had accounted for in the six-month period. We didn’t select the batch sizes manually — we picked batch sizes that would split the sample into equal parts on the basis of transaction value. Here’s what we ended up with:

All of the batch buckets have just about the same fraction of total BTC output over the period, but they account for radically different transaction and output counts over the period. Notice that there were only 183,108 “extra large” batches (with 41 or more outputs) in the six-month period, but between them there were 23m outputs and 30m BTC worth of value transmitted.
Note that output value in this context refers to the raw or unadjusted figure — it would have been prohibitively difficult for us to adjust output for change or mixers, so we’re using the “naive” estimate.
Let’s look at how many transactions various batch sizes accounted for in the sample period:


Batched transactions steadily increased relative to unbatched ones, although the biggest fraction is the small batch with between 3 and 5 outputs. The story for output counts is a bit more illuminating. Even though batched transactions are a relatively small fraction of overall transaction count, they contain a meaningful number of overall outputs. Let’s see how it breaks down:


Lastly, let’s look at output value. Here we see that batched transactions represent a significant fraction of value transmitted on Bitcoin.


As we can see, even though batched transactions make up an average of only 12% of all transactions, they move between 30%-60% of all Bitcoins, at peak times even 70%. We think this is quite remarkable. Keep in mind, however that the ‘total output’ figure has not been altered to account for change outputs, mixers, or self-churn; that is, it is the raw and unadjusted figure. The total output value is therefore not an ideal approximation of economic volume on the Bitcoin network.
3.4 Has transaction count become an unreliable measure of Bitcoin’s usage because of batching?
Yes. We strongly encourage any analysts, investors, journalists, and developers to look past mere transaction count from now on. The default measure of Bitcoin’s performance should be “payments per day” rather than transaction count. This also makes Bitcoin more comparable with other UTXO chains. They generally have significantly variable payments-per-transaction ratios, so just using payments standardizes that. (Stay tuned: Coinmetrics will be rolling out tools to facilitate this very soon.)
More generally, we think that the economic value transmitted on the network is its most fundamental characteristic. Both the naive and the adjusted figures deserve to be considered. Adjusting raw output value is still more art than science, and best practices are still being developed. Again, Coinmetrics is actively developing open-source tools to make these adjustments available.
  4. Conclusion
We started by revisiting the past year in Bitcoin and showed that while the mempool was congested, the community started looking for ways to use the blockspace more efficiently. Attention quickly fell on batching, the practice of combining multiple outputs into a single transaction, for heavy users. We showed how batching works on a technical level and when different exchanges started implementing the technique.
Today, around 12% of all transactions on the Bitcoin network are batched, and these account for about 40% of all outputs and between 30–60% of all transactional value. The fact that such a small set of transactions carries so much economic weight makes us hopeful that Bitcoin still has a lot of room to scale on the base layer, especially if usage trends continue.
Lastly, it’s worth noting that the increase in batching on the Bitcoin network may not be entirely due to deliberate action by exchanges, but rather a function of its recessionary behavior in the last few months. Since batching is generally done by large industrial players like exchanges, mixers, payment processors, and mining pools, and unbatched transactions are generally made by normal individuals, the batched/unbatched ratio is also a strong proxy for how much average users are using Bitcoin. Since the collapse in price, it is quite possible that individual usage of Bitcoin decreased while “industrial” usage remained strong. This is speculation, but one explanation for what happened.
Alternatively, the industrial players appear to be taking their role as stewards of the scarce block space more seriously. This is a significant boon to the network, and a nontrivial development in its history. If a culture of parsimony can be encouraged, Bitcoin will be able to compress more data into its block space and everyday users will continue to be able to run nodes for the foreseeable future. We view this as a very positive development. Members of the Bitcoin community that lobbied exchanges to add support for Segwit and batching should be proud of themselves.
  5. Bonus content: UTXO consolidation
Remember that we said that a second way to systematically save transaction fees in the Bitcoin network was to consolidate your UTXOs when fees were low? Looking at the relationship between input count and output count allows us to spot such consolidation phases quite well.

Typically, inputs and outputs move together. When the network is stressed, they decouple. If you look at the above chart carefully, you’ll notice that when transactions are elevated (and block space is at a premium), outputs outpace inputs — look at the gaps in May and December 2017. However, prolonged activity always results in fragmented UTXO sets and wallets full of dust, which need to be consolidated. For this, users often wait until pressure on the network has decreased and fees are lower. Thus, after transactions decrease, inputs become more common than outputs. You can see this clearly in February/March 2018.

Here we’ve taken the ratio of inputs to outputs (which have been smoothed on a trailing 7 day basis). When the ratio is higher, there are more inputs than outputs on that day, and vice versa. You can clearly see the spam attack in summer 2015 in which thousands (possibly millions) of outputs were created and then consolidated. Once the ratio spikes upwards, that’s consolidation. The spike in February 2018 after the six weeks of high fees in December 2017 was the most pronounced sigh of relief in Bitcoin’s history; the largest ever departure from the in/out ratio norm. There were a huge number of UTXOs to be consolidated.
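For anyone wanting to reproduce a chart like this, a minimal pandas sketch of the smoothed input/output ratio might look as follows; the CSV file and its `date`, `inputs`, and `outputs` columns are assumed stand-ins for whatever daily aggregates you have on hand:

```python
import pandas as pd

daily = pd.read_csv("daily_io_counts.csv", parse_dates=["date"], index_col="date")

# 7-day trailing smoothing of inputs and outputs, then the consolidation ratio:
# values above 1 mean more inputs than outputs (UTXO consolidation),
# values below 1 mean more outputs than inputs (UTXO fragmentation).
smoothed = daily[["inputs", "outputs"]].rolling(window=7).mean()
daily["in_out_ratio"] = smoothed["inputs"] / smoothed["outputs"]

print(daily["in_out_ratio"].tail())
```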
It’s also interesting to note where inputs and outputs cluster. Here we have histograms of transactions with large numbers of inputs or outputs. Unsurprisingly, round numbers are common which shows that exchanges don’t publish a transaction every, say, two minutes, but instead wait for 100 or 200 outputs to queue up and then publish their transaction. Curiously, 200-input transactions were more popular than 100-input transactions in the period.


We ran into more curiosities when researching this piece, but we’ll leave those for another time.
Future work on batching might focus on:
Determining batched transactions as a portion of (adjusted) economic rather than raw volume
Looking at the behavior of specific exchanges with regards to batching
Investigating how much space and fees could be saved if major exchanges were batching transactions
Lastly, we encourage everyone to run their transactions through the service at transactionfee.info to assess the efficiency of their transactions and determine whether exchanges are being good stewards of the block space.
Update 31.05.2018
Antoine Le Calvez has created a series of live-updated charts to track batching and batch sizes, which you can find here.
We’d like to thank 0xB10C for their generous assistance with datasets and advice, the people at Blockchair for providing the core datasets, and David A. Harding for writing the initial piece and answering our questions.
submitted by miguelfranco1412 to 800cc [link] [comments]

Crypto and the Latency Arms Race: Crypto Exchanges and the HFT Crowd

Crypto and the Latency Arms Race: Crypto Exchanges and the HFT Crowd


News by Coindesk: Max Boonen
Carrying on from an earlier post about the evolution of high frequency trading (HFT), how it can harm markets and how crypto exchanges are responding, here we focus on the potential longer-term impact on the crypto ecosystem.
First, though, we need to focus on the state of HFT in a broader context.

Conventional markets are adopting anti-latency arbitrage mechanisms

In conventional markets, latency arbitrage has increased toxicity on lit venues and pushed trading volumes over-the-counter or into dark pools. In Europe, dark liquidity has increased in spite of efforts by regulators to clamp down on it. In some markets, regulation has actually contributed to this. Per the SEC:
“Using the Nasdaq market as a proxy, [Regulation] NMS did not seem to succeed in its mission to increase the display of limit orders in the marketplace. We have seen an increase in dark liquidity, smaller trade sizes, similar trading volumes, and a larger number of “small” venues.”
Why is non-lit execution remaining or becoming more successful in spite of its lower transparency? In its 2014 paper, BlackRock came out in favour of dark pools in the context of best execution requirements. It also lamented message congestion and cautioned against increasing tick sizes, features that advantage latency arbitrageurs. (This echoes the comment to CoinDesk of David Weisberger, CEO of Coinroutes, who explained that the tick sizes typical of the crypto market are small and therefore do not put slower traders at much of a disadvantage.)
Major venues now recognize that the speed race threatens their business model in some markets, as it pushes those “slow” market makers with risk-absorbing capacity to provide liquidity to the likes of BlackRock off-exchange. Eurex has responded by implementing anti-latency arbitrage (ALA) mechanisms in options:
“Right now, a lot of liquidity providers need to invest more into technology in order to protect themselves against other, very fast liquidity providers, than they can invest in their pricing for the end client. The end result of this is a certain imbalance, where we have a few very sophisticated liquidity providers that are very active in the order book and then a lot of liquidity providers that have the ability to provide prices to end clients, but are tending to do so more away from the order book”, commented Jonas Ullmann, Eurex’s head of market functionality. Such views are increasingly supported by academic research.
XTX identifies two categories of ALA mechanisms: policy-based and technology-based. Policy-based ALA refers to a venue simply deciding that latency arbitrageurs are not allowed to trade on it. Alternative venues to exchanges (going under various acronyms such as ECN, ATS or MTF) can allow traders to either take or make, but not engage in both activities. Others can purposefully select — and advertise — their mix of market participants, or allow users to trade in separate “rooms” where undesired firms are excluded. The rise of “alternative microstructures” is mostly evidenced in crypto by the surge in electronic OTC trading, where traders can receive better prices than on exchange.
Technology-based ALA encompasses delays, random or deterministic, added to an exchange’s matching engine to reduce the viability of latency arbitrage strategies. The classic example is a speed bump where new orders are delayed by a few milliseconds, but the cancellation of existing orders is not. This lets market makers place fresh quotes at the new prevailing market price without being run over by latency arbitrageurs.
As a practical example, the London Metal Exchange recently announced an eight-millisecond speed bump on some contracts that are prime candidates for latency arbitrageurs due to their similarity to products trading on the much bigger CME in Chicago.
Why 8 milliseconds? First, microwave transmission between Chicago and the US East Coast is 3 milliseconds faster than fibre optic lines. From there, the $250,000 a month Hibernia Express transatlantic cable helps you get to London another 4 milliseconds faster than cheaper alternatives. Add a millisecond for internal latencies such as not using FPGAs and 8 milliseconds is the difference for a liquidity provider between investing tens of millions in speed technology or being priced out of the market by latency arbitrage.
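To illustrate the asymmetric-delay idea in the abstract (this is a toy model, not any venue's actual implementation), the sketch below delays new orders by a speed bump while letting cancellations through immediately, which is exactly what allows a resting quote to be repriced before a latency arbitrageur's order reaches the book:

```python
import heapq

SPEED_BUMP_MS = 8  # illustrative; mirrors the LME example above

def schedule(events):
    """Each event is (arrival_ms, kind, order_id). New orders are delayed
    by the speed bump; cancellations are processed immediately, letting
    resting quotes escape before a latency arbitrageur's taker order lands."""
    queue = []
    for arrival, kind, order_id in events:
        effective = arrival + (SPEED_BUMP_MS if kind == "new" else 0)
        heapq.heappush(queue, (effective, arrival, kind, order_id))
    while queue:
        effective, arrival, kind, order_id = heapq.heappop(queue)
        print(f"t={effective:>4} ms  process {kind:<6} {order_id} (arrived {arrival} ms)")

schedule([
    (0, "new", "maker-quote"),       # resting quote, placed earlier
    (100, "new", "arb-taker"),       # latency arbitrageur hits the stale quote
    (101, "cancel", "maker-quote"),  # market maker reprices 1 ms later...
])
# ...the cancel (t=101) is processed before the delayed taker order (t=108),
# so the stale quote is pulled in time.
```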
With this in mind, let’s consider what the future holds for crypto.

Crypto exchanges must not forget their retail roots

We learn from conventional markets that liquidity benefits from a diverse base of market makers with risk-absorption capacity.
Some have claimed that the spread compression witnessed in the bitcoin market since 2017 is due to electronification. Instead, I posit that it is greater risk-absorbing capacity and capital allocation that has improved the liquidity of the bitcoin market, not an increase in speed; in fact, being a fast exchange with colocation, such as Gemini, has not translated into higher volumes. Old-timers will remember Coinsetter, a company that, per the Bitcoin Wiki, “was created in 2012, and operates a bitcoin exchange and ECN. Coinsetter’s CSX trading technology enables millisecond trade execution times and offers one of the fastest API data streams in the industry.” The Wiki page should use the past tense, as Coinsetter failed to gain traction, was acquired in 2016 and subsequently closed.
Exchanges that invest in scalability and user experience will thrive (BitMEX comes to mind). Crypto exchanges that favour the fastest traders (by reducing jitter, etc.) will find that winner-takes-all latency strategies do not improve liquidity. Furthermore, they risk antagonising the majority of their users, who are naturally suspicious of platforms that sell preferential treatment.
It is baffling that the head of Russia for Huobi vaunted to CoinDesk that: “The option [of co-location] allows [selected clients] to make trades 70 to 100 times faster than other users”. The article notes that Huobi doesn’t charge — but of course, not everyone can sign up.
Contrast this with one of the most successful exchanges today: Binance. It actively discourages some HFT strategies by tracking metrics such as order-to-trade ratios and temporarily blocking users that breach certain limits. Market experts know that Binance remains extremely relevant to price discovery, irrespective of its focus on a less professional user base.
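As a hedged sketch of what such policy-based screening could look like (the threshold and logic below are invented for the example and are not Binance's actual rules), an exchange might track an order-to-trade ratio per account and throttle accounts that exceed a limit:

```python
from collections import defaultdict

MAX_ORDER_TO_TRADE = 100  # illustrative threshold, not any exchange's real limit

class OrderToTradeMonitor:
    """Counts orders placed vs. trades filled per account over a window
    and flags accounts whose ratio exceeds the limit."""

    def __init__(self):
        self.orders = defaultdict(int)
        self.trades = defaultdict(int)

    def on_order(self, account: str):
        self.orders[account] += 1

    def on_trade(self, account: str):
        self.trades[account] += 1

    def accounts_to_throttle(self):
        flagged = []
        for account, n_orders in self.orders.items():
            ratio = n_orders / max(self.trades[account], 1)
            if ratio > MAX_ORDER_TO_TRADE:
                flagged.append((account, ratio))
        return flagged

monitor = OrderToTradeMonitor()
for _ in range(5000):
    monitor.on_order("quote-stuffer")
monitor.on_trade("quote-stuffer")
for _ in range(200):
    monitor.on_order("retail-user")
for _ in range(50):
    monitor.on_trade("retail-user")
print(monitor.accounts_to_throttle())  # [('quote-stuffer', 5000.0)]
```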
Other exchanges, take heed.
Coinbase closed its entire Chicago office where 30 engineers had worked on a faster matching engine, an exercise that is rumoured to have cost $50mm. After much internal debate, I bet that the company finally realised that it wouldn’t recoup its investment and that its value derived from having onboarded 20 million users, not from upgrading systems that are already fast and reliable by the standards of crypto.
It is also unsurprising that Kraken’s Steve Hunt, a veteran of low-latency torchbearer Jump Trading, commented to CoinDesk that: “We want all customers regardless of size or scale to have equal access to our marketplace”. Experience speaks.
In a recent article on CoinDesk, Matt Trudeau of ErisX points to the lower reliability of cloud-based services compared to dedicated, co-located and cross-connected gateways. That much is true. Web-based technology puts the emphasis on serving the greatest number of users concurrently, not on serving a subset of users deterministically and at the lowest latency possible. That is the point. Crypto might be the only asset class that is accessible directly to end users with a low number of intermediaries, precisely because of the crypto ethos and how the industry evolved. It is cheaper to buy $500 of bitcoin than it is to buy $500 of Microsoft shares.
Trudeau further remarks that official, paid-for co-location is better than what he pejoratively calls “unsanctioned colocation,” the fact that crypto traders can place their servers in the same cloud providers as the exchanges. The fairness argument is dubious: anyone with $50 can set up an Amazon AWS account and run next to the major crypto exchanges, whereas cheap co-location starts at $1,000 a month in the real world. No wonder “speed technology revenues” are estimated at $1 billion for the major U.S. equity exchanges.
For a crypto exchange, to reside in a financial, non-cloud data centre with state-of-the-art network latencies might ironically impair the likelihood of success. The risk is that such an exchange becomes dominated on the taker side by the handful of players that already own or pay for the fastest communication routes between major financial data centres such as Equinix and the CME in Chicago, where bitcoin futures are traded. This might reduce liquidity on the exchange because a significant proportion of the crypto market’s risk-absorption capacity is coming from crypto-centric funds that do not have the scale to operate low-latency strategies, but might make up the bulk of the liquidity on, say, Binance. Such mom-and-pop liquidity providers might therefore shun an exchange that caters to larger players as a priority.

Exchanges risk losing market share to OTC liquidity providers

While voice trading in crypto has run its course, a major contribution to the market’s increase in liquidity circa 2017–2018 was the risk appetite of the original OTC voice desks such as Cumberland Mining and Circle.
Automation really shines when it brings together risk-absorbing capacity tailored to each client (impossible on anonymous exchanges) and seamless electronic execution. In contrast, latency-sensitive venues can see liquidity evaporate in periods of stress, as happened on 26 June to a well-known and otherwise successful exchange, whose bitcoin order book became $1,000 wide for an extended period as liquidity providers switched their systems off. The problem is compounded by the general unavailability of credit on cash exchanges, an issue the OTC market’s settlement model avoids.
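To put a number on an episode like that, here is a minimal sketch that computes the top-of-book spread from an order-book snapshot. The snapshot layout (lists of [price, size] pairs under "bids" and "asks") follows the common exchange REST convention but is an assumption rather than any particular venue’s exact schema, and the prices are invented for illustration.

```python
def top_of_book_spread(book):
    """Return the bid-ask spread in quote currency from an order-book
    snapshot shaped like {"bids": [[price, size], ...],
                          "asks": [[price, size], ...]}."""
    best_bid = max(float(price) for price, _ in book["bids"])
    best_ask = min(float(price) for price, _ in book["asks"])
    return best_ask - best_bid

# A stressed book of the kind described above: roughly $1,000 wide.
snapshot = {
    "bids": [["10500.00", "0.8"], ["10450.00", "2.0"]],
    "asks": [["11500.00", "0.5"], ["11550.00", "1.2"]],
}
print(top_of_book_spread(snapshot))  # -> 1000.0
```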
As the crypto market matures, the business model of today’s major cash exchanges will come under pressure. In the past decade, the FX market has shown that retail traders benefit from better liquidity when they trade through different channels than institutional speculators. Systematic internalizers demonstrate the same in equities. This fact of life will apply to crypto. Exchanges have to pick a side: either cater to retail (or retail-driven intermediaries) or court HFTs.
Now that an aggregator like Tagomi runs transaction cost analysis for its clients, it will become plainly obvious to investors with medium- and long-term horizons (i.e. anyone not focused on the next two seconds) that their price impact on exchange is worse than against electronic OTC liquidity providers.
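Transaction cost analysis of this kind boils down to comparing the average fill price with a benchmark such as the mid-price at arrival. Below is a minimal sketch under that assumption, expressed in basis points; the fills and prices are made up purely for illustration and do not come from any venue.

```python
def implementation_shortfall_bps(fills, arrival_mid, side):
    """Slippage of a set of fills versus the arrival mid-price, in basis
    points. `fills` is a list of (price, quantity) tuples; `side` is
    'buy' or 'sell'."""
    qty = sum(q for _, q in fills)
    avg_price = sum(p * q for p, q in fills) / qty
    sign = 1 if side == "buy" else -1
    return sign * (avg_price - arrival_mid) / arrival_mid * 1e4

# Illustrative numbers only: buying 5 BTC in three clips on an exchange
# versus a single OTC quote, both measured against a $10,000 arrival mid.
exchange_fills = [(10_005, 2), (10_012, 2), (10_020, 1)]
otc_fill = [(10_006, 5)]
print(implementation_shortfall_bps(exchange_fills, 10_000, "buy"))  # ~10.8 bps
print(implementation_shortfall_bps(otc_fill, 10_000, "buy"))        # 6.0 bps
```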
Today’s exchange fee structures are awkward because exchanges must charge small users a lot to make up for crypto’s exceptionally high compliance and onboarding costs. Onboarding a single, small-value user simply does not make sense unless fees are quite elevated. As a result, exchanges end up over-charging large-volume traders such as B2C2’s clients, another incentive to switch to OTC execution.
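The awkwardness is easy to quantify with a rough break-even calculation; the onboarding cost and volumes below are assumptions for illustration, not any exchange’s actual economics.

```python
# Illustrative assumptions, not any exchange's actual figures.
ONBOARDING_COST = 30.0            # KYC/compliance cost per new user, in USD
ANNUAL_VOLUME_SMALL = 2_000       # yearly traded volume of a small retail user
ANNUAL_VOLUME_LARGE = 50_000_000  # yearly volume of a large trading client

def breakeven_fee_bps(onboarding_cost, annual_volume):
    """Fee (in basis points of traded volume) needed just to recover the
    onboarding cost within one year."""
    return onboarding_cost / annual_volume * 1e4

print(breakeven_fee_bps(ONBOARDING_COST, ANNUAL_VOLUME_SMALL))  # 150 bps
print(breakeven_fee_bps(ONBOARDING_COST, ANNUAL_VOLUME_LARGE))  # 0.006 bps
```

A flat fee schedule high enough to recover compliance costs on the small account over-charges the large one by orders of magnitude, which is precisely the incentive to move size to OTC execution.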
What about the alternative: what if crypto exchanges focused on HFT firms instead? In my opinion, the CME is a much better venue for institutional takers, as fees are much lower and conventional trading firms will already be connected to it. My hypothesis is that most exchanges will not be able to compete with the CME for fast traders (after all, the CBOE itself gave up) and must cater to their retail user base instead.
In a future post, we will explore other microstructures beyond all-to-all exchanges and bilateral OTC trading.
submitted by GTE_IO to u/GTE_IO [link] [comments]
