I want to share with you a tool that I've been continuously developing over the last couple of months: https://github.com/polakowo/vectorbt
As a data scientist, when I first started flirting with quant trading, I quickly realized that there is a shortage of Python packages that would let me iterate quickly over a long list of possible strategies and hyper-parameters. Most open-source backtesting libraries are feature-rich, but simply lack speed. Questions like "Which strategy is better: X or Y?" require fast computation and transformation of data. Slow tooling not only prolongs the strategy-design lifecycle, it's dangerous: a limited number of tests is like tunnel vision - it prevents you from seeing the bigger picture and makes you dive into the market blindly.
After trying to tweak pandas, use multiprocessing, and even evaluate my strategies on a Spark cluster, I finally found myself using Numba - a Python library that compiles slow Python code to run at native machine code speed. And since there were no packages in the Python ecosystem that could even closely match the speed of my own backtests, I made vectorbt.
vectorbt combines pandas, NumPy, and Numba sauce to obtain an orders-of-magnitude speedup over other libraries. It builds upon the idea that each instance of a trading strategy can be represented in a vectorized form, so multiple strategy instances can be packed into a single multi-dimensional array. In this form, they can be processed in a highly efficient manner and compared easily. It also integrates Plotly and ipywidgets to display complex charts and dashboards akin to Tableau right in the Jupyter notebook. You can find basic examples and explanations in the documentation.
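To illustrate the core idea (this is a toy NumPy sketch, not vectorbt's actual internals): each strategy instance becomes one column of a 2D time-by-parameter array, so all instances are computed and compared in a single vectorized pass.

```python
import numpy as np

# Toy illustration of "one column per strategy instance": an SMA signal
# tested across several window lengths at once. Names and data here are
# made up for the example.
np.random.seed(42)
price = np.cumsum(np.random.randn(1000)) + 100  # synthetic price series
windows = [10, 20, 50]

def rolling_mean(a, w):
    # Simple moving average; the first w-1 values stay NaN
    out = np.full(a.shape, np.nan)
    cs = np.cumsum(a)
    out[w - 1:] = (cs[w - 1:] - np.concatenate(([0.0], cs[:-w]))) / w
    return out

# Pack all instances into one (time x param) array
smas = np.column_stack([rolling_mean(price, w) for w in windows])

# Entry signal per instance: price above its own SMA (broadcasting)
entries = price[:, None] > smas
print(entries.shape)  # (1000, 3): one column per window length
```

All three instances are produced by the same array operations, which is exactly the shape of computation that NumPy and Numba accelerate well.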
Below is an example that runs 67,032 tests in total on three different time ranges of Bitcoin price history to explore how the performance of a MACD strategy depends on various combinations of fast, slow, and signal windows:
import vectorbt as vbt
import numpy as np
import yfinance as yf
from itertools import combinations, product

# Fetch daily price of Bitcoin
price = yf.Ticker("BTC-USD").history(period="max")['Close']
price = price.vbt.split_into_ranges(n=3)

# Define hyper-parameter space
# 49 fast x 49 slow x 19 signal
fast_windows, slow_windows, signal_windows = vbt.indicators.create_param_combs(
    (product, (combinations, np.arange(2, 51, 1), 2), np.arange(2, 21, 1)))

# Run MACD indicator
macd_ind = vbt.MACD.from_params(
    price,
    fast_window=fast_windows,
    slow_window=slow_windows,
    signal_window=signal_windows,
    hide_params=['macd_ewm', 'signal_ewm']
)

# Long when MACD is above zero AND signal
entries = macd_ind.macd_above(0) & macd_ind.macd_above(macd_ind.signal)

# Short when MACD is below zero OR signal
exits = macd_ind.macd_below(0) | macd_ind.macd_below(macd_ind.signal)

# Build portfolio
portfolio = vbt.Portfolio.from_signals(
    price.vbt.tile(len(fast_windows)), entries, exits,
    fees=0.001, freq='1D')

# Draw all window combinations as a 3D volume
fig = portfolio.total_return.vbt.volume(
    x_level='macd_fast_window',
    y_level='macd_slow_window',
    z_level='macd_signal_window',
    slider_level='range_start',
    template='plotly_dark',
    trace_kwargs=dict(
        colorscale='Viridis',
        colorbar=dict(
            title='Total return',
            tickformat='%'
        )
    )
)
fig.show()
From signal generation to data visualization, the example above takes roughly a minute to run.
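The 67,032 figure follows from simple combinatorics: 49 window values (2 through 50) yield C(49, 2) unordered (fast, slow) pairs, each combined with 19 signal windows and 3 date ranges. A quick sanity check:

```python
from itertools import combinations
from math import comb
import numpy as np

# Unordered (fast, slow) pairs from windows 2..50
pairs = list(combinations(np.arange(2, 51), 2))   # C(49, 2) = 1176
signals = np.arange(2, 21)                        # 19 signal windows
ranges = 3                                        # 3 price ranges

total = len(pairs) * len(signals) * ranges
print(comb(49, 2), total)  # 1176 67032
```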
vectorbt lets you:
- Analyze and engineer features for any time series data
- Supercharge pandas and your favorite tools to run much faster
- Test thousands of strategies, configurations, assets, and time ranges in one go
- Test machine learning models
- Build interactive charts/dashboards without leaving Jupyter
The current implementation has limitations though:
- It's still experimental and fast-evolving, so the API can change quickly.
- Fast processing comes with higher memory requirements. The example above creates multiple DataFrames, each taking 46 MB of RAM (price, signals, cash, shares, equity, returns, etc.). This can be mitigated by deleting artifacts as soon as they are no longer needed and by disabling caching.
- Usage requires intermediate knowledge of pandas and NumPy to understand what's going on. Numba is quicker to pick up because it mimics NumPy. I tried to include lots of small examples in the documentation to show how everything is glued together.
- The approach of merging vectorized and iterative code differs significantly from the classic OOP approach to designing strategies, and will require you to rethink how strategies are formulated and implemented (which is kind of fun).
- Finally, if you're looking for a pure backtesting solution, this isn't it. It's more of a data-mining tool for getting to know your market and your approach better.
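On the memory point: a rough rule of thumb for a dense float64 DataFrame's payload is rows x columns x 8 bytes (ignoring the index and pandas overhead). The row count below is a made-up illustration, not the exact shape from the example above:

```python
# Rule-of-thumb payload of a dense float64 DataFrame.
# The 90-row figure is hypothetical, chosen only to illustrate the scale;
# the 67,032 columns match the number of tests in the example above.
def df_megabytes(rows, cols, itemsize=8):
    # rows x cols cells, itemsize bytes each, converted to MB
    return rows * cols * itemsize / 1e6

print(round(df_megabytes(90, 67_032), 1))
```

With dimensions like these, each intermediate DataFrame lands in the tens of megabytes, which is why several of them alive at once adds up quickly.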
If it sounds cool enough, try it out! I would love it if you gave me some feedback and contributed to it at some point, as the codebase has grown very fast. Cheers.
Apologies if this has been addressed elsewhere, or if the information I'm responding to here is out of date.

TL;DR: Are Ethereum (full, validating) nodes incentivized to vote no on gas limit increases that would lead to centralization?

Context: I just read this old post, its follow-up post, and this response post, all discussing whether or not Ethereum will trend towards centralization over time. As far as I can tell, all of that long discussion really boils down to disagreements about this paragraph from Gustav's post:
As miners are incentivized to act in ways that maximize the value of the tokens they receive in each mined block, they are unlikely to vote for a blocksize increase that would break the network. If the author trusts Bitcoin miners to act in ways that maximize the value of their bitcoins (such as not censoring transactions, generally prioritizing txs by their fee and otherwise act in ways that are beneficial for the network) the author should trust Ethereum miners to only vote for block sizes that can be handled by reasonable hardware, as the decentralized verification done by full nodes underpins the security of the network.
The network doesn't break because validators drop off and peers are lost. The network functions with two datacenters. What breaks is decentralization. The connected nodes have no incentive to care about other, less-connected nodes' validation abilities.

Summary:
Vitalik even agreed that [the number of full validating nodes] would shrink over time if the gas limits kept going up, and there’s nothing stopping that from happening. Right now miners are being altruistic, but what happens when mining doesn’t even exist? What happens when it’s just staking and the people doing it don’t care about other people’s blocks getting orphaned? Why would they keep the gas limit down? Remember they can manually adjust this, so why would they intentionally keep it low if they’re hyper-connected to each other and fully capable of processing that data? What happens when they start compounding their staking earnings, setting up more nodes, and gain more control of the network?
As block sizes increase, the number of full validating nodes decreases. Bitcoin solves this by simply saying block sizes never go up, period, and if that limits the amount of TPS the L1 layer can handle, so be it; just scale on L2. Ethereum's answer, according to Vitalik, is the gas limit (and, indirectly through that, the block size limit) that each individual miner/validator enforces. Implicit in this, I believe, is that sooner or later the block size limit has to stop growing (approximating the Bitcoin approach), at least until hardware costs drop enough. This relies on an incentive mechanism that prevents validators from constantly increasing the per-block gas limit and letting the network trend towards centralization. Without such a mechanism, I'd think each individual validator wouldn't necessarily care if increasing block sizes caused other validators to drop off the network, so long as they themselves are able to keep up, which does indeed seem like it would eventually lead to a small number of powerful validators controlling the base layer.

Core issue and my actual question:
So I think the real question at the core of this disagreement is: in what cases would (full, validating) nodes reject a block for being too big (using too much gas)? What incentive mechanisms keep individual validators aligned with the goal of keeping the core validator pool decentralized?
Please note I'm only talking about the centralization of fully validating nodes with the power to reject new blocks: light clients don't help here, and sharding helps by a constant factor in the short term but doesn't address the fundamental long-term trend. PoW vs. PoS is orthogonal and doesn't really affect the issues being discussed here.
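For concreteness on "they can manually adjust this": under Ethereum's gas-limit voting rule, each block's producer may move the gas limit by at most parent_limit // 1024 relative to its parent. A quick sketch (the starting and target limits below are illustrative, not current network values) shows how fast a unanimous "vote up" compounds:

```python
# Gas-limit voting sketch: each block may raise the limit by at most
# parent_limit // 1024. If every producer votes "up", how many blocks
# does it take to double the limit? (Limits below are illustrative.)
limit = 15_000_000
target = 30_000_000
blocks = 0
while limit < target:
    limit += limit // 1024
    blocks += 1
print(blocks)
```

Doubling takes on the order of 700 blocks, so at pre-merge block times of roughly 13 seconds, a colluding (or merely indifferent) majority of producers could double the limit within a few hours, which is why the question of their incentives matters so much.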