The Cup With Handle pattern, developed by William O'Neil, is a technical indicator for identifying the continuation of a trend after a period of consolidation [1]. It consists of an initial uptrend that's ideally not too mature, a U-shaped consolidation (the cup), followed by a final, shorter shakeout (the handle). After a rally, the price starts to consolidate along a smooth slope, then bounces back toward the previous highs as it finds support at lower price levels. When the previous highs are touched, investors who bought shares before the consolidation and other less committed investors sell their shares, pushing the price down one last time. Eventually, the price reverses from a second support level (above the previous one) and breaks out of the resistance. Traders use different rules to identify Cup With Handle patterns and gauge their strength, but the base usually lasts 6–65 weeks with depths ranging from 8% to 50%. When trading Cup With Handles, the profit target is usually 20–25% above the initial resistance (pivot point), and the stop-loss range is 5–8% below that line [2].

Pattern Recognition, part of the IBD MarketSmith premium trading toolkit, identifies seven different chart patterns in daily and weekly time frames: Cup and Cup With Handle, Saucer and Saucer With Handle, Double Bottom, Flat Base, Ascending Base, Consolidation, and IPO Base. This article focuses on using the Pattern Recognition API to identify and trade Cup With Handle patterns. To learn more about the other properties of Pattern Recognition, check its user manual.

Prerequisites

A basic understanding of Python is required to get the most out of the article. We'll use pydantic to validate and serialize data, zipline-reloaded and pyfolio to backtest the strategy, pandas to load and access data, python-dotenv to read environment variables, yfinance to fetch benchmark price data, and requests to make API calls. A premium MarketSmith account is required to access Pattern Recognition. Symbol data and the list of Dow Jones Industrial Average (DJIA) constituents will be fetched from the Financial Modeling Prep (FMP) v3 API. To retrieve the historical price data of the constituents, you need to ingest a zipline data bundle.

Please make sure to use the following versions:

- python 3.6.12
- pyfolio 0.8.0
- pandas 0.22.0
- matplotlib 3.0.3
- numpy 1.19.5

Alternatively, you can follow this answer and update a line in the pyfolio source code to make it work with the latest stack.

Load and Store Data

DJIA Constituents

With a free FMP account, we can access the list of DJIA names from this endpoint. First of all, create src/price/endpoints.py to store the FMP endpoints.

```python
# src/price/endpoints.py
DJIA_CONSTITUENTS = "https://financialmodelingprep.com/api/v3/dowjones_constituent"
NASDAQ100_CONSTITUENTS = "https://financialmodelingprep.com/api/v3/nasdaq_constituent"
```
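Before modeling the response, it helps to peek at the raw payload the constituents endpoint returns. A minimal sketch, assuming FMP_API_KEY is already set in the environment (the printed record is only indicative; the exact fields depend on your FMP plan):

```python
import os

import requests

from src.price.endpoints import DJIA_CONSTITUENTS

# fetch the raw constituents list and inspect the first record
res = requests.get(DJIA_CONSTITUENTS, params={"apikey": os.environ["FMP_API_KEY"]})
print(res.json()[0])
# e.g. {'symbol': 'HON', 'name': 'Honeywell International Inc.', 'sector': 'Industrials', ...}
```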
Define the Constituent model to serialize the data received from the FMP API.

```python
# src/models/constituent.py
from typing import Union

from pydantic import BaseModel


class Constituent(BaseModel):
    """Represents a ticker received from FMP API when retrieving constituents
    of an index; see `price.load_tickers` method."""
    symbol: str
    name: str
    sector: str
    subSector: str
    headQuarter: Union[str, None]
    dateFirstAdded: str
    cik: Union[str, None]
    founded: Union[str, None]
```

Define load_tickers to fetch and store the data.

```python
# src/price/ticker.py
import os
import csv
from typing import List

import requests
from dotenv import load_dotenv
from pydantic import parse_obj_as

from src.price.endpoints import NASDAQ100_CONSTITUENTS, DJIA_CONSTITUENTS
from src.models import Constituent

load_dotenv()


def load_tickers(endpoint: str, api_key: str = os.environ["FMP_API_KEY"]) -> None:
    """Fetches and loads the list of tickers to the `data/tickers.csv` file.

    Uses the FMP API to get the latest data and requires the `FMP_API_KEY` env
    variable to be set. Fetches the data from the passed endpoint."""
    params = {"apikey": api_key}
    res = requests.get(endpoint, params=params)
    res = res.json()

    # parse and validate data
    tickers = parse_obj_as(List[Constituent], res)
    tickers = [constituent.dict() for constituent in tickers]

    # write data to file
    keys = tickers[0].keys()
    with open("data/tickers.csv", 'w', newline='') as output_file:
        dict_writer = csv.DictWriter(output_file, keys)
        dict_writer.writeheader()
        dict_writer.writerows(tickers)


if __name__ == "__main__":
    load_tickers(DJIA_CONSTITUENTS)
```

We first load the FMP_API_KEY environment variable, pass it to the given endpoint (defined in endpoints.py), and convert the response to a dictionary by calling the .json() method. We then use pydantic's parse_obj_as utility to serialize the response into a list of Constituent instances. In the end, the data is converted back to a list of dictionaries and stored in data/tickers.csv.

Make sure to store the FMP_API_KEY key in the .env file and set it to the key you received from the FMP dashboard.
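For reference, the .env file might look like the sketch below (values are placeholders; FMP_API_KEY is used here, while the MarketSmith credentials are read by the authenticated session defined in the next section):

```
# .env (placeholder values)
FMP_API_KEY=your_fmp_api_key
USERNAME=your_marketsmith_login_email
PASSWORD=your_marketsmith_password
API_KEY=your_ibd_api_key
```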
Now we can dispatch load_tickers from the command line. If it runs properly, we'll have a data/tickers.csv file with one row per DJIA constituent.

Cup With Handle Data

We should now load the history of Cup With Handle patterns for all symbols in data/tickers.csv. Let's first define the MarketSmith endpoints we're going to call.

```python
# src/ms/endpoints.py
GET_LOGIN = "https://login.investors.com/accounts.login"
HANDLE_LOGIN = "https://myibd.investors.com/register/raas/loginhandler.aspx"
SEARCH_INSTRUMENTS = "https://marketsmith.investors.com/mstool/api/chart/search-instruments"
GET_PATTERNS = "https://marketsmith.investors.com/WONServices/MSPatternRec/MSPatternRec.svc/json/getPatterns"
GET_USER_INFO = "https://marketsmith.investors.com/mstool/api/tool/user-info"
```

The AuthSession class passes environment variables to the IBD API to generate an authenticated session.

```python
# src/ms/auth.py
import os
import json

from requests import Session
from dotenv import load_dotenv

from src.ms.endpoints import HANDLE_LOGIN, GET_LOGIN

load_dotenv()


class AuthSession:
    def __init__(self, username: str = os.environ["USERNAME"],
                 password: str = os.environ["PASSWORD"],
                 api_key: str = os.environ["API_KEY"],
                 include: str = "profile,data,"):
        """Generates a session authenticated into MarketSmith"""
        session = Session()
        payload = {
            "loginID": username,
            "password": password,
            "ApiKey": api_key,
            "include": include,
            "includeUserInfo": "true",
        }
        # make auth payload accessible to class consumers
        self.payload = payload

        # make a request to GET_LOGIN endpoint to get login info
        login = session.post(GET_LOGIN, data=payload).json()
        login["action"] = "login"

        # pass the login info to HANDLE_LOGIN endpoint to get .ASPXAUTH cookies
        res = session.post(HANDLE_LOGIN, json=login)
        self.session = session
```

It first sends the user credentials to the GET_LOGIN endpoint to receive the user object, which is then passed (along with an extra action key) to HANDLE_LOGIN. The response includes the Set-Cookie headers necessary to authenticate the session for future requests. Don't forget to define the USERNAME, PASSWORD, and API_KEY values (according to your MarketSmith account credentials) in .env.

Before fetching patterns, we need to load Instrument and User objects. Let's start with the latter. Define the User model to serialize the object we'll receive from the MarketSmith backend.

```python
# src/models/user.py
from pydantic import BaseModel


class User(BaseModel):
    """Represents a MarketSmith `User` object"""
    CSUserID: int
    DisplayName: str
    EmailAddress: str
    IsSpecialAccount: bool
    RemainingTrialDays: int
    SessionID: str
    UserDataInitializationFailed: bool
    UserEntitlements: str
    UserID: int
    UserType: int
```

The get_user method receives an authenticated session and returns the authenticated user's information.

```python
# src/ms/user.py
from pydantic import validate_arguments

from src.ms.auth import AuthSession
from src.ms.endpoints import GET_USER_INFO
from src.models import User


@validate_arguments(config=dict(arbitrary_types_allowed=True))
def get_user(session: AuthSession) -> User:
    """Gets information of the authenticated user in a session"""
    response = session.session.get(GET_USER_INFO)
    user = User(**response.json())
    return user
```

The validate_arguments decorator parses and validates arguments before the function is called. arbitrary_types_allowed permits arguments whose types don't extend pydantic's BaseModel class (in this case, an AuthSession instance).

It's time to load instrument data from MarketSmith. The MS API passes dates in this format: /Date(1536303600000-0700)/. The first number is the date in milliseconds since the epoch, and the second number is the timezone offset from GMT. The convert_msdate_to_date method converts MS API date strings to the built-in datetime.date object.

```python
# src/ms/utils.py
from datetime import date


def convert_msdate_to_date(ms_date: str) -> date:
    """Converts a date string passed by the MarketSmith API to a `date` object

    Parameters
    ----------
    ms_date : `str`
        e.g., "/Date(1536303600000-0700)/"

    Returns
    -------
    `date`

    Raises
    -------
    `ValueError`
        Invalid input type
    """
    try:
        str_btwn_paranthesis = ms_date[ms_date.find("(") + 1:ms_date.find(")")]
        if (str_btwn_paranthesis[0] == "-"):
            # negative epoch: keep the sign and drop the timezone suffix
            millis = int(str_btwn_paranthesis.split("-")[1]) * -1
        else:
            millis = int(str_btwn_paranthesis.split("-")[0])
        date_obj = date.fromtimestamp(millis / 1000)
        return date_obj
    except TypeError:
        raise ValueError(
            "Invalid date received from MS. Must be like /Date(1536303600000-0700)/")
```
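As a quick check, the sample string from the docstring decodes to early September 2018 (the exact day can shift by one depending on the machine's timezone, since date.fromtimestamp uses local time):

```python
from src.ms.utils import convert_msdate_to_date

convert_msdate_to_date("/Date(1536303600000-0700)/")
# -> datetime.date(2018, 9, 7) on most machines
```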
Nice. Now, use the method in the Instrument model to convert the dates during validation.

```python
# src/models/instrument.py
from datetime import date

from pydantic import BaseModel, validator


class Instrument(BaseModel):
    """Represents a financial `Instrument` object passed by MarketSmith API"""
    mSID: int
    type: int
    instrumentID: int
    symbol: str
    name: str
    earliestTradingDate: date
    latestTradingDate: date
    hasComponents: bool
    hasOptions: bool
    isActive: bool

    @validator("earliestTradingDate", "latestTradingDate", pre=True, always=True)
    def validate_date(cls, v):
        from src.ms.utils import convert_msdate_to_date
        return convert_msdate_to_date(v)
```

And fetch the instruments.

```python
# src/ms/instrument.py
import logging

from pydantic import validate_arguments

from src.ms import AuthSession
from src.ms.endpoints import SEARCH_INSTRUMENTS
from src.models import Instrument


@validate_arguments(config=dict(arbitrary_types_allowed=True))
def get_instrument(session: AuthSession, symbol: str) -> Instrument:
    """Given a symbol (ticker), gets the corresponding `Instrument` from the
    MarketSmith API

    Parameters
    ----------
    session : `AuthSession`
        authenticated session
    symbol : `str`
        ticker of Instrument

    Raises
    ----------
    `AssertionError`
        if the length of search results for the ticker is more than one

    Returns
    -------
    `Instrument`
    """
    # search in instruments
    search_results = session.session.post(SEARCH_INSTRUMENTS, json=symbol)
    search_results = search_results.json()["content"]

    # in search results, find the exact match
    instrument = list(filter(
        lambda result: result['symbol'] == symbol, search_results))

    # there shouldn't be less or more than 1 exact match
    try:
        assert len(instrument) == 1
    except AssertionError:
        logging.error(f"Only 1 exact match should be found. Found {len(instrument)}")
        raise

    instrument = Instrument(**instrument[0])
    return instrument
```

get_instrument searches for a symbol in the MarketSmith database and then looks for an exact match in the search results. If the number of exact matches for the symbol is not one, it raises an AssertionError. In the end, it serializes the received dictionary into an Instrument instance.
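A quick smoke test of the pieces so far might look like this (a hypothetical snippet; it assumes valid MarketSmith credentials in .env and that the queried ticker resolves to exactly one instrument):

```python
from src.ms import AuthSession, get_user, get_instrument

session = AuthSession()
user = get_user(session)
instrument = get_instrument(session, "AAPL")

# both objects are plain pydantic models
print(user.UserID, instrument.instrumentID, instrument.symbol)
```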
We're getting to the meat of the matter. Let's load, parse, and store Cup With Handle patterns. First, define a model to serialize the data.

```python
# src/models/pattern.py
from typing import Literal, List, Optional
from datetime import date

from pydantic import BaseModel, validator


class CupWithHandle(BaseModel):
    """Represents a cup with handle pattern object passed by MarketSmith API"""
    baseID: int
    baseStartDate: date
    baseEndDate: date
    baseNumber: int
    baseStage: str
    baseStatus: int
    pivotPriceDate: date
    baseLength: int
    periodicity: int
    versionID: str
    leftSideHighDate: date
    patternType: int
    firstBottomDate: date
    handleLowDate: date
    handleStartDate: date
    cupEndDate: date
    UpBars: int
    BlueBars: int
    StallBars: int
    UpVolumeTotal: int
    DownBars: int
    RedBars: int
    SupportBars: int
    DownVolumeTotal: int
    BaseDepth: float
    AvgVolumeRatePctOnPivot: float
    VolumePctChangeOnPivot: float
    PricePctChangeOnPivot: float
    HandleDepth: float
    HandleLength: int
    CupLength: int

    @validator("baseStartDate", "baseEndDate", "pivotPriceDate", "leftSideHighDate",
               "firstBottomDate", "handleLowDate", "handleStartDate", "cupEndDate",
               pre=True, always=True)
    def validate_date(cls, v):
        from src.ms.utils import convert_msdate_to_date
        return convert_msdate_to_date(v)
```

Next, we need a few methods to handle the extraction and storage of patterns.

```python
# src/ms/pattern.py
import json
import csv
from typing import Literal, List

from pydantic import validate_arguments, BaseModel

from src.ms import AuthSession, get_instrument, get_user
from src.models import Instrument, User, CupWithHandle
from src.ms.endpoints import GET_PATTERNS


@validate_arguments(config=dict(arbitrary_types_allowed=True))
def get_patterns(instrument: Instrument, user: User, session: AuthSession,
                 start: int, end: int) -> dict:
    """Gets all patterns for an instrument in a given period

    Parameters
    ----------
    instrument : `Instrument`
        Instrument object of the target name
    user : `User`
        Authenticated user
    session : `AuthSession`
        Authenticated session
    start : `int`
        Start in millis
    end : `int`
        End in millis

    Returns
    -------
    `dict`
    """
    start_date = f"/Date({start})/"
    end_date = f"/Date({end})/"
    payload = {
        "userID": user.UserID,
        "symbol": instrument.symbol,
        "instrumentID": instrument.instrumentID,
        "instrumentType": instrument.type,
        "dateInfo": {
            "startDate": start_date,
            "endDate": end_date,
            "frequency": 1,
            "tickCount": 0
        }
    }
    res = session.session.post(GET_PATTERNS, json=payload)
    res = res.json()
    return res


def flattern_pattern_properties(patterns: List[dict]) -> List[dict]:
    """Each Pattern instance received from MS includes a `properties` field,
    which is a list of dictionaries with `Key` and `Value` fields and contains
    extra properties of the pattern. This method flattens a Pattern instance by
    removing the `properties` field and adding its keys as separate fields of
    the instance.

    Parameters
    ----------
    patterns : `List[dict]`
        list of patterns fetched from MS

    Returns
    -------
    `List[dict]`
        flattened patterns
    """
    # add properties field as separate keys
    pattern_properties = [pattern.pop("properties", None) for pattern in patterns]
    for index, props in enumerate(pattern_properties):
        for prop in props:
            patterns[index][prop["Key"]] = prop["Value"]
    return patterns


def filter_cup_with_handles(patterns) -> List[CupWithHandle]:
    """Given the response object of the `GET_PATTERNS` endpoint, filters cup
    with handle patterns from it

    Parameters
    ----------
    patterns : `object`
        response of `GET_PATTERNS` endpoint

    Returns
    -------
    List[CupWithHandle]
        list of cup with handle patterns
    """
    # cups w/ or w/o a handle
    cups: List[CupWithHandle] = patterns.get("cupWithHandles", None)
    if (cups == None):
        return None

    # cups w/ handle
    cup_with_handles = [cup for cup in cups if cup["patternType"] == 1]
    cup_with_handles = flattern_pattern_properties(cup_with_handles)
    cup_with_handles = [CupWithHandle(**cup) for cup in cup_with_handles]
    return cup_with_handles


def store_patterns(patterns: List[BaseModel], ticker: str) -> None:
    """Stores a given list of patterns to `data/patterns.csv`

    Parameters
    ----------
    patterns : `List[BaseModel]`
        list of pydantic models (records) of the patterns to be stored
    ticker : `str`
        ticker that the data belongs to
    """
    filepath = "data/patterns.csv"

    # convert to dict
    patterns = [{**pattern.dict(), "symbol": ticker} for pattern in patterns]
    keys = patterns[0].keys()

    # check if the file is empty (it is assumed to already exist)
    with open(filepath, "r") as patterns_file:
        csv_dict = [row for row in csv.DictReader(patterns_file)]
        is_empty = len(csv_dict) == 0

    with open(filepath, 'a') as patterns_file:
        dict_writer = csv.DictWriter(patterns_file, keys)
        if is_empty:
            dict_writer.writeheader()
        dict_writer.writerows(patterns)
```

get_patterns makes a request to the patterns endpoint and receives all chart patterns for an instrument during a certain period. Note that if you want to get patterns for the weekly chart, set the frequency value in the payload to 2.

MarketSmith passes a properties attribute with each pattern object that includes the pattern's custom properties as a list. Since we only care about Cup With Handle patterns, and they share the same properties, we use flattern_pattern_properties to flatten the object by removing the properties key and adding the elements of its list value to our initial pattern object.
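To illustrate the flattening (the field names and values beside properties, Key, and Value below are made up for brevity):

```python
raw = [{"baseID": 42, "properties": [{"Key": "BaseDepth", "Value": 17.2},
                                     {"Key": "HandleDepth", "Value": 6.1}]}]

flattern_pattern_properties(raw)
# -> [{"baseID": 42, "BaseDepth": 17.2, "HandleDepth": 6.1}]
```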
filter_cup_with_handles receives a list of pattern objects and returns the Cup With Handle patterns amongst them. One gotcha with this method is that MS passes all cup patterns under the cupWithHandles key, but only those with a patternType of 1 are Cups With Handles.

Finally, store_patterns receives a list of pattern instances and appends them to a local CSV file.

To wrap things up, let's write some controller functions to orchestrate all the previously defined methods. But first, define a utility function to serialize CSV records.

```python
# src/ms/utils.py
# ...

def convert_csv_to_records(filepath: str, klass: BaseModel) -> List[BaseModel]:
    """Converts a CSV file to a list of models

    Parameters
    ----------
    filepath : `str`
        filepath of CSV file
    klass : `BaseModel`
        pydantic model to use for serializing the CSV records

    Returns
    -------
    `List[BaseModel]`
        serialized CSV records
    """
    with open(filepath) as f:
        records = [
            klass(**{k: v for k, v in row.items()})
            for row in csv.DictReader(f, skipinitialspace=True)]
    return records
```

convert_csv_to_records reads the rows of a CSV file and serializes them with a pydantic model. We'll later use it to read and parse the data in the tickers.csv file.

```python
# src/ms/controller.py
from datetime import datetime
import logging
from typing import List

import src.ms as ms
from src.ms.utils import convert_csv_to_records
from src.models import Constituent
from src.ms.pattern import filter_cup_with_handles

logging.basicConfig(level=logging.INFO)


def extract_patterns(ticker: str, filter_method: callable, start: int, end: int,
                     session=ms.AuthSession()) -> list:
    """Extracts a set of patterns, given a filter method, from the MarketSmith API

    Parameters
    ----------
    ticker : `str`
        symbol of Instrument to get the data for
    filter_method : callable
        method that filters target patterns from the `GET_PATTERNS` endpoint response
    start : `int`
        start date in millis
    end : `int`
        end date in millis
    session : `AuthSession`, optional
        authenticated session, by default ms.AuthSession()

    Returns
    -------
    `list`
        List of filtered patterns
    """
    user = ms.get_user(session)
    instrument = ms.get_instrument(session, ticker)
    patterns = ms.get_patterns(instrument, user, session, start, end)
    filtered_patterns = filter_method(patterns)
    return filtered_patterns


def extract_n_store_cup_with_handles(start: int, end: int,
                                     tickers: List[Constituent]) -> None:
    """Loads tickers from `data/tickers.csv`, calls `extract_patterns` for each
    ticker to load Cup With Handle patterns, and then stores them in
    `data/patterns.csv`

    Parameters
    ----------
    start : `int`
        start date in millis
    end : `int`
        end date in millis
    """
    for ix, ticker in enumerate(tickers):
        logging.info(f"Fetching data for {ticker.symbol}")
        logging.info(f"{ix} / {len(tickers)}")
        patterns = extract_patterns(
            ticker=ticker.symbol,
            filter_method=filter_cup_with_handles,
            start=start,
            end=end)
        ms.store_patterns(patterns=patterns, ticker=ticker.symbol)
        logging.info("––––––––––––––")
```
extract_patterns receives a ticker, a filter method for a pattern type, start and end dates, and an authenticated session. It then orchestrates the other methods to fetch and serialize the filtered patterns. extract_n_store_cup_with_handles accepts the start and end dates in milliseconds since the epoch along with a list of Constituent objects, retrieves Cup With Handle patterns for them, and stores those patterns in the data/patterns.csv file. Now, call the method with the required arguments.

```python
# src/ms/controller.py
tickers: List[Constituent] = convert_csv_to_records("data/tickers.csv", Constituent)

dt_to_milli = lambda dt: datetime.timestamp(dt) * 1000
start = dt_to_milli(datetime(2018, 1, 1))
end = dt_to_milli(datetime(2020, 1, 1))

extract_n_store_cup_with_handles(start, end, tickers)
```

Awesome! We're done with the data collection part. Let's define a trading algorithm based on these patterns and evaluate the results.

Strategy

Create a Jupyter Notebook to develop, backtest, and analyze the strategy. First, import the requirements.

```python
from datetime import datetime

import pandas as pd
import zipline as zp
import yfinance as yf
import pyfolio as pf
```

The algorithm, at each tick, loops through the patterns and, if all of the following conditions are met, orders the asset:

- The current date has passed the handleLowDate property of the pattern, but not by more than 30 days;
- The current price has broken out of the pivot price level (the second high of the cup) by more than 1%;
- The 50-day simple moving average (SMA) is above the 200-day SMA.

The algorithm subsequently closes a position in any of these situations:

- The trade has generated a profit of 15% or more;
- The trade has led to a loss of 5% or more;
- Twenty-one days or more have passed since the opening of the position.

We use SPY (S&P 500 Trust ETF) returns as the benchmark, run the algorithm from 2016 to 2018, and use ten million dollars of capital. Let's store all these parameters in a cell to facilitate tweaking or optimizing them (a worked example of the thresholds follows the cell).

```python
WATCHLIST_WINDOW_DAYS = 30
ABOVE_PIVOT_PCT = 1.01
TAKE_PROFIT_PCT = 1.15
STOP_LOSS_PCT = .95
PATIENCE_WINDOW_DAYS = 21
START = datetime(2016, 1, 1)
END = datetime(2018, 1, 1)
BENCHMARK = "SPY"
SHORT_MA_LEN = 50
LONG_MA_LEN = 200
CAPITAL_BASE = 10000000
```
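To make the thresholds concrete, here is the arithmetic for a hypothetical $100 pivot price (purely illustrative; in the backtest the profit and loss ratios are applied to the actual cost basis of each position):

```python
pivot = 100.0                                # hypothetical pivot price
buy_trigger = pivot * ABOVE_PIVOT_PCT        # 101.0  -> breakout must clear this level
take_profit = buy_trigger * TAKE_PROFIT_PCT  # 116.15 -> close with roughly a 15% gain
stop_loss = buy_trigger * STOP_LOSS_PCT      # 95.95  -> close with roughly a 5% loss
```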
Before defining the logic, we need a utility function that adds timezone information to the date columns of a dataframe, which allows us to compare dates in the patterns.csv file with zipline's built-in dates.

```python
def convert_date_cols(df: pd.DataFrame) -> pd.DataFrame:
    """Given a dataframe, adds the UTC timezone to all columns that have "date"
    in their names."""
    for col in df.columns:
        if "date" in col.lower():
            df[col] = pd.to_datetime(df[col]).dt.tz_localize("UTC")
    return df
```

Zipline requires two functions: initialize and handle_data. The former sets up the backtesting context by receiving a context argument and adding global variables to it. The latter gets called at each tick and accepts two arguments, context (the global variables) and data (the information specific to the current tick), and makes trades based on the current market conditions. By hiding future price data, zipline ensures that there's no look-ahead bias in the logic.

```python
def initialize(context):
    # avoid out of bounds error by dropping firstBottomDate col
    patterns = pd.read_csv("data/patterns.csv").drop(["firstBottomDate"], axis=1)
    patterns = convert_date_cols(patterns)
    context.patterns = patterns

    tickers = pd.read_csv("data/tickers.csv")
    tickers = convert_date_cols(tickers)
    context.stocks = [zp.api.symbol(ticker) for ticker in tickers.symbol]
    context.position_dates = {}
```

Note that the zipline.api.symbol method receives a ticker and returns the corresponding Equity object.

```python
def handle_data(context, data):
    current_dt = zp.api.get_datetime()
    prices = data.history(context.stocks, "price", bar_count=200, frequency="1d")

    # look for new trades
    for ix, pattern in context.patterns.iterrows():
        open_positions = set(context.portfolio.positions.keys())
        symbol = zp.api.symbol(pattern["symbol"])

        # skip if asset is already in portfolio
        is_open = symbol in open_positions
        if (is_open):
            continue

        # check date window from handleLowDate to N days after
        is_in_window = (pattern["handleLowDate"] <= current_dt) and (
            pattern["handleLowDate"] >= (current_dt - pd.DateOffset(WATCHLIST_WINDOW_DAYS)))
        if (not is_in_window):
            continue

        # get symbol price history and the price at the pivot date
        price_history = prices[symbol]
        pivot_price_date = pattern["pivotPriceDate"]
        try:
            pivot_price = price_history[pivot_price_date]
        except KeyError:
            # pivot date is outside the 200-bar window; skip this pattern
            continue

        # check price above pivot
        current_price = data.current(symbol, "price")
        if (current_price / pivot_price < ABOVE_PIVOT_PCT):
            continue

        # check short MA above long MA
        short_ma = price_history.tail(SHORT_MA_LEN).mean()
        long_ma = price_history.tail(LONG_MA_LEN).mean()
        if (long_ma > short_ma):
            continue

        # add new position and update previous ones
        open_positions.add(symbol)
        target_pct = 1 / len(open_positions)
        for position in open_positions:
            zp.api.order_target_percent(position, target_pct)
        context.position_dates[symbol] = current_dt

    # look for closing positions
    open_positions = context.portfolio.positions
    for position in open_positions.values():
        current_price = position.last_sale_price
        buy_price = position.cost_basis
        should_take_profit = (current_price / buy_price) > TAKE_PROFIT_PCT
        should_stop_loss = (current_price / buy_price) < STOP_LOSS_PCT
        does_exceed_patience = (current_dt - pd.DateOffset(PATIENCE_WINDOW_DAYS)) >= \
            context.position_dates[position.asset]
        should_close_position = should_take_profit or does_exceed_patience or should_stop_loss
        if (should_close_position):
            zp.api.order_target_percent(position.asset, 0)
```

First, data.history loads the price data of the stocks list for the past 200 trading days. The method then loops through the patterns and finds the instances that satisfy all the requirements and are not already in the portfolio. When a new position is opened, the capital is re-allocated equally amongst all positions using zp.api.order_target_percent, and the current date is stored in the context.position_dates dictionary for future reference. Finally, it loops over the open positions and, if any of the sell requirements are satisfied, sells the asset.

Almost done. Define a method to fetch benchmark price data from yfinance and process it into the format pyfolio accepts (a pandas Series with a date index).
```python
def get_benchmark_returns() -> pd.Series:
    bench = yf.Ticker(BENCHMARK)
    bench_hist = bench.history(start=START, end=END, auto_adjust=True).tz_localize("UTC")
    returns = pd.Series(bench_hist["Close"].pct_change().values,
                        index=bench_hist.index).dropna()
    returns.index.names = ["date"]
    return returns
```

Note that the returns are calculated by calling the pct_change method on the Close column of the price history dataframe. Now we need to handle the analysis of the algorithm.

```python
def analyze(perf: pd.DataFrame, bench: pd.Series) -> None:
    returns, positions, transactions = pf.utils.extract_rets_pos_txn_from_zipline(perf)
    pf.create_full_tear_sheet(returns=returns, benchmark_rets=bench)
```

analyze receives two arguments: perf, the return value of zipline's run_algorithm function, and bench, the benchmark returns retrieved from the previously defined method. pf.utils.extract_rets_pos_txn_from_zipline extracts the daily returns, the positions history, and the list of all transactions made by the algorithm from the performance dataframe. We pass the benchmark and backtest returns to pf.create_full_tear_sheet to generate a comprehensive strategy analysis.

In the end, let's call zp.run_algorithm and inspect the results. Make sure to convert the start and end dates to localized pandas Timestamp objects.

```python
# format start and end
to_localized_ts = lambda dt: pd.Timestamp(dt).tz_localize("UTC")
start, end = to_localized_ts(START), to_localized_ts(END)

# get benchmark returns
benchmark = get_benchmark_returns()

# run strategy
results = zp.run_algorithm(
    start=start,
    end=end,
    initialize=initialize,
    handle_data=handle_data,
    benchmark_returns=benchmark,
    capital_base=CAPITAL_BASE,
    bundle='quandl',
    data_frequency='daily')

# analyze results
analyze(results, benchmark)

# store results to CSV
results.to_csv("results.csv")
```

Results

It's time to receive our just deserts. After running the analyze method, pyfolio generates a tear sheet that includes several tables and charts presenting a detailed analysis of the results.

```
Start date              2016-01-04
End date                2017-12-29
Total months            23
                        Backtest
Annual return           9.7%
Cumulative returns      20.2%
Annual volatility       7.5%
Sharpe ratio            1.27
Calmar ratio            1.96
Stability               0.91
Max drawdown            -4.9%
Omega ratio             1.62
Sortino ratio           2.4
Skew                    3.63
Kurtosis                45.14
Tail ratio              1.6
Daily value at risk     -0.9%
Alpha                   0.08
Beta                    0.1
```

With a 0.08 alpha and 0.1 beta, the strategy seems too passive, which could be improved by increasing the number of watchlist stocks. But the risk-return measures of the strategy look solid; notably, the Sharpe, Sortino, and Calmar ratios display acceptable returns given the low exposure.

Improvements

The strategy could be enhanced in many ways; let's discuss some of them.

- % of up bars: By taking the ratio of green bars to red bars during the pattern formation, particularly in the latter half of the cup, we can gauge the strength of the bullish pattern and the potential breakout.
- % of up volume: Similarly, above-average volume during up days (skyscrapers of accumulation) may confirm that institutions are interested in the asset [3].
- Volume on breakout: Another option is to buy the name only when the volume is above average on the breakout day.
- The volatility of the cup: The cup shouldn't be volatile and V-shaped [4]; using the Average True Range or the standard deviation of the price action, we can gauge the smoothness of the price movement while the cup forms (see the sketch after this list).
- Prior uptrend strength: By making sure that the pattern follows a strong and established uptrend, using the height and length of the prior rally, we can ensure that a strong move backs the base.
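As a rough illustration of the volatility idea (my own sketch, not part of the MarketSmith API; the 3% threshold is arbitrary), one could reject cups whose daily returns are too dispersed between the base start and the cup end:

```python
import pandas as pd


def cup_is_smooth(cup_closes: pd.Series, max_daily_std: float = 0.03) -> bool:
    """`cup_closes` holds the daily closes between baseStartDate and cupEndDate."""
    daily_returns = cup_closes.pct_change().dropna()
    return daily_returns.std() <= max_daily_std
```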
References:

[1] D. Saito-Chung, When To Buy The Best Growth Stocks: How To Analyze A Stock's Cup With Handle (2020), Investor's Business Daily

[2] Cup With Handle, StockCharts ChartSchool

[3] S. Lehtonen, Roku, One Of The Top Stocks Of 2019, Built 'Skyscrapers' Of Accumulation Before A Breakout (2019), Investor's Business Daily

[4] W. J. O'Neil, How to Make Money in Stocks: A Winning System in Good Times and Bad (2009)

You can find the source code here.