Algorithmic Trading
Algorithmic trading refers to the use of computer programs that execute buy and sell orders in financial markets based on a predefined set of rules, such as price thresholds, timing, volume conditions, or mathematical models. In U.S. markets, algorithmic trading now accounts for a substantial share of daily equity volume across NYSE and NASDAQ; most industry estimates put the figure well above half.
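To make the idea concrete, the sketch below shows the kind of conditional rule such a program evaluates before submitting an order. It is a minimal, purely illustrative example: the ticker, the $50.00 threshold, and the evaluate() helper are hypothetical, not part of any real trading system.

```python
# Minimal sketch of a rule-based order trigger. The ticker, threshold, and
# evaluate() helper are hypothetical and purely illustrative.
from dataclasses import dataclass
from typing import Optional

@dataclass
class PriceRule:
    symbol: str
    buy_below: float   # buy when the last trade price is at or below this level
    quantity: int

def evaluate(rule: PriceRule, last_price: float) -> Optional[dict]:
    """Return an order instruction if the rule's condition is met, else None."""
    if last_price <= rule.buy_below:
        return {"side": "BUY", "symbol": rule.symbol, "qty": rule.quantity,
                "type": "LIMIT", "limit_price": last_price}
    return None

# Example: a hypothetical ticker trading at $49.95 satisfies a $50.00 rule.
order = evaluate(PriceRule("XYZ", 50.00, 100), last_price=49.95)
print(order)
```

A real system evaluates many such rules against a live market-data feed and routes the resulting orders automatically, but the underlying logic is the same conditional pattern.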
Algorithmic trading, sometimes called algo trading or automated trading, emerged as a dominant force in U.S. equity markets following the SEC's adoption of Regulation NMS in 2005, whose Order Protection Rule required trading centers to honor the best automated quotations displayed on other venues, accelerating the shift to electronic execution and fragmenting order flow across multiple trading venues. Rather than relying on human traders to manually route and execute orders, algorithmic systems process real-time market data and submit orders in milliseconds, a speed no human participant can match.
The core mechanics of an algorithm involve a set of conditional instructions: for example, a volume-weighted average price (VWAP) algorithm might instruct the system to buy a specified number of shares of a given stock over a trading day, distributing the order across time intervals in proportion to historical intraday volume patterns. This approach aims to minimize the observable market footprint of a large institutional order. Other algorithms, often grouped under statistical arbitrage, monitor relationships between correlated securities and execute offsetting trades when those relationships deviate from historical norms.
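The order-slicing step of such a VWAP-style algorithm can be sketched as follows. The volume profile and share counts below are hypothetical placeholders rather than real market data; a production system would estimate the profile from historical trades and adjust it intraday.

```python
# Sketch of the order-slicing step of a VWAP-style execution algorithm:
# a parent order is split across intraday time buckets in proportion to a
# historical volume profile. The profile below is a hypothetical placeholder.

def vwap_schedule(total_shares: int, volume_profile: list[float]) -> list[int]:
    """Split total_shares across buckets proportionally to volume_profile."""
    total_volume = sum(volume_profile)
    slices = [int(total_shares * v / total_volume) for v in volume_profile]
    # Put any rounding remainder in the last bucket so the slices sum exactly.
    slices[-1] += total_shares - sum(slices)
    return slices

# Hypothetical U-shaped profile (heavier volume near the open and close),
# one weight per 30-minute bucket of the 6.5-hour U.S. trading session.
profile = [0.12, 0.09, 0.07, 0.06, 0.05, 0.05, 0.05,
           0.05, 0.06, 0.07, 0.08, 0.10, 0.15]
child_orders = vwap_schedule(100_000, profile)
print(child_orders)  # e.g. [12000, 9000, 7000, ...]
```

Each child order is then worked during its bucket, so the overall execution tracks the day's natural trading rhythm rather than hitting the market all at once.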
Institutional participants — including mutual funds, hedge funds, and pension funds — rely heavily on algorithmic execution to handle large orders in U.S.-listed equities without moving the market against themselves. Retail brokers such as Fidelity and Charles Schwab also use algorithms internally to achieve best-execution standards mandated by FINRA and the SEC. The prevalence of algorithmic trading has compressed bid-ask spreads significantly since the early 2000s, broadly reducing transaction costs for all market participants.
Algorithmic trading is subject to regulatory scrutiny from both the SEC and FINRA. Rules such as the SEC's Market Access Rule (Rule 15c3-5) require firms operating algorithmic strategies to maintain risk controls, including order size limits and kill switches that can halt a malfunctioning algorithm in real time. The May 6, 2010 Flash Crash, during which the Dow Jones Industrial Average briefly fell nearly 1,000 points in minutes before recovering, prompted regulators to strengthen these safeguards. Subsequent SEC rulemaking introduced circuit breakers at the individual stock level (Limit Up-Limit Down, or LULD) to prevent runaway algorithmic cascades.
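The sketch below illustrates the flavor of two such controls, a per-order size limit and a kill switch. The limits and symbols are hypothetical, and production controls under the Market Access Rule are considerably more extensive (credit limits, duplicate-order checks, price-collar checks, and so on).

```python
# Illustrative sketch of two pre-trade risk controls: a per-order size limit
# and a kill switch that blocks all further order flow. Limits and symbols
# are hypothetical; real-world controls are far more extensive.

class RiskGate:
    def __init__(self, max_order_shares: int):
        self.max_order_shares = max_order_shares
        self.killed = False            # set True to halt all order flow

    def kill(self) -> None:
        """Activate the kill switch, e.g. when an algorithm misbehaves."""
        self.killed = True

    def approve(self, symbol: str, shares: int) -> bool:
        """Return True only if the order passes all pre-trade checks."""
        if self.killed:
            return False
        if shares > self.max_order_shares:
            return False
        return True

gate = RiskGate(max_order_shares=10_000)
print(gate.approve("XYZ", 5_000))   # True: within the size limit
gate.kill()                         # operator halts a runaway algorithm
print(gate.approve("XYZ", 5_000))   # False: all orders now rejected
```

The essential design point is that these checks sit between the strategy and the market, so a single switch can stop order flow without depending on the malfunctioning algorithm itself.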
Beyond execution algorithms used by institutional investors, a separate category of algorithmic strategies attempts to generate alpha through systematic, rules-based models applied to large historical data sets. Quantitative hedge funds such as Renaissance Technologies and Two Sigma have pioneered these approaches in U.S. equity markets, employing teams of mathematicians and engineers to develop and maintain portfolios driven entirely by algorithmic models. The boundary between pure execution algorithms and alpha-generating algorithmic strategies is not always sharp, as some firms integrate both functions within a single unified system.
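As a toy illustration of what a systematic, rules-based signal looks like, not a representation of any firm's actual model, the sketch below computes a simple moving-average crossover over a hypothetical price history; the 5-day and 20-day windows are arbitrary choices made for the example.

```python
# Toy example of a systematic, rules-based trading signal computed from
# historical prices. Illustrative only; the 5-day/20-day windows and the
# price series are hypothetical and do not represent any firm's model.

def moving_average(prices: list[float], window: int) -> float:
    return sum(prices[-window:]) / window

def crossover_signal(prices: list[float]) -> int:
    """Return +1 (long), -1 (short), or 0 (flat) from a moving-average crossover."""
    if len(prices) < 20:
        return 0                       # not enough history to form a signal
    fast = moving_average(prices, 5)   # short-term trend
    slow = moving_average(prices, 20)  # longer-term trend
    if fast > slow:
        return 1
    if fast < slow:
        return -1
    return 0

# Hypothetical 20-day price history trending upward, so the signal is +1.
history = [float(100 + i) for i in range(20)]
print(crossover_signal(history))
```

Real quantitative strategies combine many such signals, estimated and validated on large historical data sets, but the rules-based character is the same: the model, not a human trader, decides what to hold.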