Data / trading / Python — https://samchaaa.substack.com/ · https://github.com/samchaaa
Adjust your risk intelligently as the market changes. This dashboard automatically downloads free tick data, measures “rotations” of each instrument, and writes to Google Sheets on an hourly or daily basis.
Hourly-updated statistics on market rotations
GitHub repo: https://github.com/samchaaa/avg_rotations_dash/
If you’re a manual trader, chances are your risk is based on trial and error, rounded to the nearest 10. If you trade algorithmically, you may use quantitative measures, such as ADR (average daily range) or volatility.
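For reference, ADR is just the average of each day's high-to-low range. A minimal sketch, with toy prices assumed:

```python
import pandas as pd

# Toy daily OHLC data (hypothetical prices)
daily = pd.DataFrame({'high': [10.5, 11.0, 10.8], 'low': [9.5, 10.2, 10.0]})

# ADR = mean of each day's high-to-low range
adr = (daily['high'] - daily['low']).mean()
print(round(adr, 4))
```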
This is the most important point of this article: The point of risk is to not get stopped out if you’re actually right on a trade. But if you’re wrong, you want to be stopped out as soon as possible.
Here is an approach to risk, looking at markets from the bottom-up — starting with tick data, measuring each “rotation”, then delivering statistics based on the rotations.
Let’s define our terms first. A rotation is simply one “leg” of the zig-zag motion of the markets. In other words, consecutive up or down bars, without retracements.
GBPUSD, 4h (rotations calculated in MS Paint)
Let’s say you measure your risk with ADR. Hypothetically, the market could move the same range across two days, but have completely different internal movements.
If your timeline for trading is intraday, the movement inside the day may have many trade opportunities for you, and is valuable information to understand.
Previously I made a dashboard using only daily-level data in Google Sheets, which resulted in fewer data points and ignored the valuable information inside each daily bar.
On a data level, measuring only daily bars gives us fewer data points, and a less reliable outcome. Using intraday rotations, we can build a bigger dataset and get more consistent results.
Enough theory, let’s talk methodology.
How do we measure rotations quantitatively? Because the markets are fractal, rotations occur on every level — from ticks, to hours, to months.
Two rotations (up and down) on a daily timeframe become 50+ rotations on a 30 minute timeframe.
To begin, we pick an arbitrary way to group ticks. One way is to group by number (say 100 ticks), but in this case I will use time-based bars (5 minutes).
First, was the bar an up-bar, down-bar, or unchanged? We label these +1, -1, or 0.
Second, we iterate through the bars. If there are multiple bars of the same sign, group them together. If the sign changes, a new group starts. (This forms “legs” of the rotations.)
Third, we iterate through the groups. If it’s an up-rotation, we get the high of the group. If it’s a down-rotation, we get the low. This forms the end points of the rotations.
Finally, we get the difference in prices between end points. This results in a list of rotations, with the exact measurement of each rotation from high to low.
Note: This does not account for edge cases of up bars with lows exceeding the close of the prior down bar, or down bars with highs exceeding the close of the prior up bar. (Example below.)
Edge case (left, yellow box)… how the rotation is calculated (right).
The solution would be to group the high of the down bar with the previous up group, and the low of the up bar with the previous down group… but I’m just going to move ahead with the basic method for now. Just keep in mind, the overall rotation statistics may be very slightly understated due to these rare cases.
(data is a DataFrame with OHLC prices.)
```python
import numpy as np
import pandas as pd

def get_sign(data):
    # Get sign of each bar (-1, 0, 1)
    data['sign'] = np.sign(data['close'] - data['open'])
    # Group consecutive bars of the same sign (0's join the current group)
    prev = 0
    g = 0
    for i, s in zip(data.index, data['sign']):
        if s == 1:
            if prev == 1:
                data.loc[i, 'g'] = g  # continue up group
            else:
                g += 1                # new up group
                data.loc[i, 'g'] = g
        if s == -1:
            if prev == -1:
                data.loc[i, 'g'] = g  # continue down group
            else:
                g += 1                # new down group
                data.loc[i, 'g'] = g
        if s == 0:
            data.loc[i, 'g'] = g      # add to last group
        prev = s
    return data

def get_rotations(data):
    r = []
    for x in range(int(max(data['g']))):
        # Look at groups x and x+1 together:
        # if group x is an up group, the rotation ends at the high;
        # if it is a down group, the rotation ends at the low.
        sign = data.loc[data['g'] == x, 'sign']
        if sign.empty:
            continue
        if sign.iloc[0] > 0:
            h = data.loc[(data['g'] == x) | (data['g'] == x + 1), 'high']
            i = h.idxmax()
            h = h.max()
            r.append([i, h])
        if sign.iloc[0] < 0:
            l = data.loc[(data['g'] == x) | (data['g'] == x + 1), 'low']
            i = l.idxmin()
            l = l.min()
            r.append([i, l])
    return pd.DataFrame(r)
```
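The sign-and-group step can also be sketched with vectorized pandas. This is a simplified version (my own sketch, not the repo's code): unlike the loop above, it starts a new group on any sign change, so 0-sign bars are not merged into the prior group. Toy prices assumed:

```python
import numpy as np
import pandas as pd

# Eight toy 5-minute bars: three up, two down, three up (hypothetical prices)
bars = pd.DataFrame({
    'open':  [100, 101, 102, 103, 102, 101, 102, 103],
    'close': [101, 102, 103, 102, 101, 102, 103, 104],
})
bars['sign'] = np.sign(bars['close'] - bars['open'])         # +1 up, -1 down, 0 unchanged
bars['g'] = (bars['sign'] != bars['sign'].shift()).cumsum()  # new group on each sign change
print(bars['g'].tolist())                                    # group label per bar
print(bars.groupby('g')['sign'].first().tolist())            # one sign per group
```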
Here’s the step-by-step map of how the whole repo downloads data, calculates, then writes to Sheets.
1.) Download data
1a.) Delete what you don’t need (to manage storage).
1b.) Check what data you already have.
1c.) Download what you don’t already have, as far back as specified in settings.py, and up to the last complete hour (UTC).
2.) Calculate rotations
2a.) Run the calculations (explained above).
2b.) If the instrument is forex, scale pips to a proper number. For example, you could get rotations of 0.00010 for GBPUSD, or 0.10 for GBPJPY… but you just want to see “10 pips”. Everything else gets scaled to one point.
3.) Write to Google Sheets
3a.) Write to Google Sheets using gspread.
3b.) Update last updated datetime.
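For step 1c, "up to the last complete hour (UTC)" can be computed like this (an illustrative sketch, not the repo's exact code):

```python
from datetime import datetime, timezone, timedelta

now = datetime.now(timezone.utc)
# Truncate to the top of the current hour; everything before this is complete
end = now.replace(minute=0, second=0, microsecond=0)
# The last complete hourly bar covers [end - 1 hour, end)
last_bar_start = end - timedelta(hours=1)
print(last_bar_start, '->', end)
```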
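Step 2b can be illustrated with a toy helper (an assumption for illustration — the repo's actual scaling lives in rotations.py, and only JPY-quoted pairs vs. everything else is handled here):

```python
def to_pips(rotation, symbol):
    # Hypothetical pip scaling: 0.01 for JPY-quoted pairs, 0.0001 otherwise
    pip = 0.01 if symbol.endswith('JPY') else 0.0001
    return rotation / pip

print(to_pips(0.0010, 'GBPUSD'))  # a 0.00100 rotation reads as ~10 pips
print(to_pips(0.10, 'GBPJPY'))    # a 0.10 rotation also reads as ~10 pips
```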
This repo is optimized to run on PythonAnywhere, which comes with a lot of dependencies pre-installed. For PythonAnywhere, you need a paid account (the download URL is not whitelisted on the free tier). I use a $5/month account. Sign up here.
The code also runs locally, but downloads may fail (I think I was blacklisted for downloading months of tick data from my current IP). Another reason I prefer PythonAnywhere.
I haven’t tried AWS or GCP, but I’m sure those work as well (although you may need to install dependencies).
Steps for installation are explained in the README, but here they are again:
1.) Copy the Google Sheet (it comes with several formulae and formatting done for you).
2.) Download and unzip this repo to where you’ll be running it (local, cloud, or your dog’s Neuralink).
3.) Run pip install gspread.
4.) Create service account credentials for Google Sheets (explained also in gspread docs):
4a.) Create a new project in Google Cloud Console.
4b.) Enable Drive API.
4c.) Enable Sheets API.
4d.) Create a service account (name it and click through rest of options).
4e.) Once you’re back on the credentials page, click “Service Account” > “Keys” > “Add key” > “Create new key”. Select JSON (default) and it will download. Copy the service account e-mail.
5.) Click and drag the JSON credentials into your repo folder (just inside the main folder; the script will find it automatically).
6.) Share your Google Sheet with the service account e-mail (as editor).
7.) In the main repo folder, open settings.py to add your Sheet ID and adjust your settings (explained below).
Pay attention to steps #5, #6, and #7 to make it work. The most common mistakes are not moving the .json into the repo, not adding the Sheet ID in settings.py, and not sharing the Sheet with the service account email.
settings.py is very self-explanatory. The details are in the README.
The point needing the most explanation is Dukascopy instrument names. This took some trial and error to figure out. To format the instrument name for downloading, you just need to write it as it appears on Dukascopy, without any special characters.
“USA500.IDX/USD” => “USA500IDXUSD”
“BTC/USD” => “BTCUSD”
“AAPL.US/USD” => “AAPLUSUSD”
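The renaming rule above can be captured with a tiny helper (my own sketch, not from the repo):

```python
import re

def dukas_name(symbol):
    # Drop every non-alphanumeric character from the displayed name
    return re.sub(r'[^A-Za-z0-9]', '', symbol)

print(dukas_name('USA500.IDX/USD'))  # USA500IDXUSD
```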
While python handles finding the rotations, once the data is in Google Sheets it is very easy to customize.
You can easily implement different statistics, like =AVERAGE(xyz) + STDEV(xyz) for avg + 1 std rotations, or =PERCENTILE(xyz, 0.99) to get the 99th-percentile rotation.
Other ideas are additional moving averages, or more detailed charts (to see exact highs and lows of rotations over recent sessions).
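If you'd rather compute the same statistics in Python before writing to Sheets, the equivalents are one-liners with numpy (toy rotation sizes assumed):

```python
import numpy as np

rots = np.array([3.1, 4.2, 2.8, 6.0, 5.5, 3.9])  # hypothetical rotation sizes in points

print(rots.mean() + rots.std())   # avg + 1 std rotation
print(np.percentile(rots, 99))    # 99th-percentile rotation
```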
If you are checking instruments that are not forex, crypto, the S&P500, or Nasdaq (for example, Hong Kong ETFs), you may get results that are not scaled properly. This may be handled by adding your instrument name (and appropriate scaling) to download_dukas.py > normalize_tick() and rotations.py > final().
I have it set in PythonAnywhere to run every hour, about 10 minutes after the hour.
Let’s say I’m trading E-Mini S&P500 futures (/ES). I usually use about a 3.5-point stop loss. Right now, this is how my dashboard looks:
“USA500IDXUSD” is the Dukascopy name for the S&P500 CFD, which matches the futures reasonably well.
Right off the bat, I can see the average rotation of 3.1 actually understates my expectations (3.5 points). I can see over the last 20 rotations, the average was 6.0 points. So using my normal 3.5 points may actually stop me out (assuming I’m right on the direction, which isn’t guaranteed).
Back to the most important point of this article: The point of risk is to not get stopped out if you’re actually right on a trade. But if you’re wrong, you want to be stopped out as soon as possible.
This tells me recent volatility has increased (it’s the end of the week, right at Friday close). On the other hand, I can look at GBPUSD or BTCUSD and see the recent rotations average has dropped below the overall average (probably because those Friday sessions have already ended). This shows me the volatility is contracting for those instruments.
The bottom line is, I can look at this dashboard and find useful benchmarks in a single glance.
Speaking of GBPUSD, this reminds me of one caveat: markets behave differently. I use this primarily for /ES (USA500.IDX/USD in Dukascopy). But I also trade GBPUSD, and from experience the reading of 8.1 pips overall average may be pretty tight.
This is because GBPUSD rotates back and forth very often, while /ES is a slower-moving market, with fewer back-and-forth rotations and more clean moves.
Knowing this, you may still use the 95th percentile reading (20.6 pips), which is plenty of room for short term, intraday trading.
So this is not a substitute for actually knowing a market, but it can help you maintain reasonable expectations around risk.
The code for downloading ticks from Dukascopy came originally from these repos: