Perspectives

Ideas at the intersection of capital and technology.

We don't write press releases. We write what we actually think about where markets, AI, and private capital are heading — and what it means for those operating in them.

The Algo Decade: Why Systematic Trading Has Become Table Stakes for Serious Capital

Algorithmic strategies now account for over 70% of U.S. equity volume. What started as a hedge fund arms race has quietly become the infrastructure of modern markets. The question is no longer whether algorithms belong in a portfolio — it's whether you can afford to be on the wrong side of the machines that run it.

Read article →
AI Is Not a Tool. It's an Operating System.

The firms winning with AI aren't using it as a faster Google. They've rebuilt their workflows around it. The distinction matters more than most people realize — especially in private equity, where the bottleneck has always been human bandwidth.

Read article →
The Deal Team of 2028 Will Look Nothing Like Today's

A senior associate at a top-decile PE fund will spend less time modeling and more time deciding. The firms building that infrastructure now aren't just getting efficient — they're building a structural advantage that compounds every deal cycle.

Read article →
Quant Technology · April 2026 · 7 min read

The Algo Decade: Why Systematic Trading Has Become Table Stakes for Serious Capital

By Barenberg Capital Partners

In 2005, algorithmic trading accounted for roughly 25% of U.S. equity volume. By 2010, it had crossed 50%. Today, conservative estimates put it above 70% — with some asset classes, particularly futures and options, running closer to 80–90% automated. This is not a trend. It is the new baseline.

And yet, the conversation in most private wealth and family office circles still frames algorithmic strategies as exotic — something reserved for Renaissance Technologies and Two Sigma, not for the individual allocating $500,000 or the family office managing $50 million. That framing is outdated, and it's costing people money.

How We Got Here

The first wave of algorithmic trading was purely about speed. High-frequency traders colocated servers next to exchange matching engines, shaved microseconds off execution, and extracted basis points at scale. That game still exists, but it's a closed loop — the arms race for sub-millisecond advantages is now dominated by firms spending tens of millions on custom FPGA hardware and dedicated fiber optic routes.

The second wave — the one that matters for everyone else — is about signal intelligence. Not speed, but the ability to process more information, more consistently, with fewer cognitive errors, than a human analyst making discretionary decisions under pressure. This is where the real edge lives for non-HFT systematic strategies.

"The human brain is not built for markets. It's built to survive. Those are different optimization functions."

Discretionary trading relies on pattern recognition, intuition, and experience — all of which are genuinely valuable, but all of which are also subject to loss aversion, recency bias, overconfidence, and the simple fact that human beings need to sleep. A systematic strategy doesn't have a bad week because of a difficult personal situation. It doesn't average down on a losing position because it's emotionally invested in the original thesis. It executes the same logic — consistently, repeatedly, at scale — regardless of conditions.

What "Systematic" Actually Means at the Signal Level

The most common misconception is that algorithmic trading means "following moving averages and RSI." That's 1990s quant finance. Modern systematic strategies layer multiple independent signal sources — price action, volume dynamics, market microstructure, regulatory intelligence, options market positioning — and look for confluence: the condition where multiple independent data streams agree on the same directional conclusion simultaneously.

The logic is elegant: any single indicator can generate false positives. A rising RSI can mean momentum, or it can mean an overextended move about to reverse. But when five independent signals converge on the same conclusion, the probability of a false positive drops dramatically. It's the difference between one witness and five independent witnesses telling the same story.
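The arithmetic behind the five-witnesses intuition is worth making explicit. If each signal's false positives occur independently, the joint false-positive rate is the product of the individual rates. The 30% per-signal rate below is purely illustrative, and full independence is an assumption that real signals rarely satisfy:

```python
# Joint false-positive rate under an independence assumption.
# The 30% per-signal rate is illustrative, not an empirical figure.
per_signal_fp = 0.30   # chance any single signal fires spuriously
n_signals = 5

joint_fp = per_signal_fp ** n_signals
print(f"one signal: {per_signal_fp:.1%}; five in agreement: {joint_fp:.3%}")
# Real signals are rarely fully independent, so the true joint rate
# sits somewhere between per_signal_fp and this lower bound.
```

Even partial independence moves the needle: correlated signals don't multiply all the way down to this lower bound, but they still beat any single indicator.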

  • Multi-factor signal confluence reduces noise entries by eliminating low-probability setups
  • Dynamic position sizing scales capital deployment proportionally to signal conviction
  • Volatility-adjusted stop placement ensures risk is defined relative to actual instrument behavior, not arbitrary percentages
  • SEC filing intelligence layers regulatory data into position bias before the broad market has processed the information
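A minimal sketch of how those four mechanics compose into one decision. Everything here is an illustrative assumption on our part, including the signal names, the 0.6 conviction threshold, the 1% risk budget, and the 2x ATR stop multiple; it is not a description of any production system:

```python
from dataclasses import dataclass

@dataclass
class Signal:
    name: str
    direction: int   # +1 long, -1 short, 0 neutral
    weight: float    # relative trust in this signal

def confluence_decision(signals, price, atr, capital,
                        min_conviction=0.6, risk_per_trade=0.01):
    """Combine independent signals; size and stop the trade from the result."""
    total_weight = sum(s.weight for s in signals)
    # Net conviction in [-1, +1]: weighted agreement across all signals.
    conviction = sum(s.direction * s.weight for s in signals) / total_weight
    if abs(conviction) < min_conviction:
        return None  # low-probability setup: stand aside

    direction = 1 if conviction > 0 else -1
    # Volatility-adjusted stop: 2x ATR from entry (multiple is illustrative).
    stop = price - direction * 2 * atr
    # Risk a fixed fraction of capital, scaled by signal conviction.
    risk_capital = capital * risk_per_trade * abs(conviction)
    size = risk_capital / abs(price - stop)
    return {"direction": direction, "size": size, "stop": stop}

decision = confluence_decision(
    [Signal("price_action", +1, 1.0), Signal("volume", +1, 0.8),
     Signal("microstructure", +1, 0.6), Signal("options_positioning", 0, 0.5),
     Signal("sec_filings", +1, 0.7)],
    price=100.0, atr=1.5, capital=1_000_000)
```

Note the shape of the logic: disagreement doesn't produce a smaller trade, it produces no trade at all below the conviction threshold, and position size is derived from the stop distance rather than chosen first.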

The addition of real-time regulatory intelligence is underappreciated. SEC EDGAR processes thousands of filings daily — Form 4 insider transactions, 8-K material event disclosures, Schedule 13D ownership changes. The market is inefficient at processing this data quickly. A system that can parse an 8-K the moment it hits EDGAR and extract a directional signal has a genuine edge window — typically 15 to 90 minutes — before the information is priced in by the broader market.
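As an illustration of just the parsing step (not the full ingestion pipeline), 8-K filings declare their material events via standardized item numbers, which makes a crude first-pass directional read mechanical. The item-to-bias mapping below is our simplified assumption; a real system would weight these against the filing's actual content:

```python
import re

# Simplified directional bias per 8-K item number. The mapping is our
# assumption for illustration; production systems read the substance.
ITEM_BIAS = {
    "1.03": -1,  # bankruptcy or receivership
    "4.02": -1,  # non-reliance on previously issued financials
    "5.02": -1,  # departure of directors or principal officers
    "1.01": +1,  # entry into a material definitive agreement
    "2.02":  0,  # results of operations (direction needs the numbers)
    "8.01":  0,  # other events
}

def eight_k_bias(filing_text: str) -> int:
    """Crude first-pass signal: sum the bias of every item the 8-K declares."""
    items = re.findall(r"Item\s+(\d\.\d{2})", filing_text)
    return sum(ITEM_BIAS.get(item, 0) for item in items)

sample = "Item 4.02 Non-Reliance on Previously Issued Financial Statements"
print(eight_k_bias(sample))  # negative bias from a restatement-risk item
```

The edge window described above comes from doing this the moment the filing hits the feed, not from the sophistication of any single rule.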

The Self-Improvement Problem — and Its Solution

The weakest point of most algorithmic strategies is model decay. Markets evolve. Regime changes — from trending to mean-reverting environments, from low to high volatility, from bull to bear — can render a strategy that worked for three years suddenly ineffective. The traditional response is manual re-optimization by a quant team, which is expensive, slow, and often reactive rather than proactive.

The more sophisticated response is to build the re-optimization process into the system itself: an AI critique layer reviews every closed trade — identifying which signals contributed to wins, which contributed to losses, and what adjustments to the decision logic would have improved outcomes — and proposes targeted rule modifications between trading cycles. This is not autonomous rewriting, but a structured feedback loop that surfaces insights a human analyst would take weeks to identify.
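One way to make that feedback loop concrete: attribute each closed trade back to the signals that voted for it, then surface signals whose hit rate falls below a threshold for human review. The data shape, the 45% threshold, and the 20-trade minimum are all illustrative assumptions:

```python
from collections import defaultdict

def review_cycle(closed_trades, min_hit_rate=0.45, min_samples=20):
    """Post-cycle critique: flag signals whose attributed trades win too rarely.

    closed_trades: iterable of (signals_that_fired, pnl) tuples.
    """
    wins = defaultdict(int)
    total = defaultdict(int)
    for signals, pnl in closed_trades:
        for sig in signals:
            total[sig] += 1
            wins[sig] += pnl > 0

    proposals = []
    for sig, n in total.items():
        hit_rate = wins[sig] / n
        if n >= min_samples and hit_rate < min_hit_rate:
            proposals.append(
                f"Reduce weight of '{sig}': {hit_rate:.0%} hit rate over {n} trades")
    return proposals  # a human reviews these before any rule change ships

closed = ([(["rsi_divergence", "volume"], +1.0)] * 10
          + [(["rsi_divergence"], -1.0)] * 15)
print(review_cycle(closed))
```

The minimum-sample guard is what keeps this "auditable" rather than twitchy: a signal isn't flagged on a handful of unlucky trades, and nothing changes without sign-off.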

This is the concept of recursive self-improvement applied to trading: a system that learns from its own history in a structured, auditable way, rather than repeating the same errors indefinitely.

The Access Question

For decades, the infrastructure required to run a serious systematic strategy — co-location, data feeds, execution management systems, risk frameworks — cost millions of dollars annually. It was genuinely inaccessible to anyone outside of institutional finance.

That has changed. Cloud computing, institutional-grade API-connected brokerages, and AI-powered development tools have compressed the infrastructure cost by orders of magnitude. The question is no longer whether serious systematic strategies can be deployed outside of a hedge fund structure. They can. The question is whether you're accessing that capability — or leaving it to others who are.

The market doesn't care whether your capital is deployed algorithmically or discretionarily. It only cares about the quality of your decisions at the moment you make them. Systematic strategies are not a guarantee of performance. But they are a structural approach to making better decisions, more consistently, at scale — and in a market where 70%+ of volume is already algorithmic, that matters.

Obsidian Quant licenses its algorithmic trading technology to qualified individuals and institutions. Your capital stays in your own account — we never custody it.

Explore Licensing →
Deal Technology · March 2026 · 5 min read

AI Is Not a Tool. It's an Operating System.

By Barenberg Capital Partners

Most organizations are using AI wrong. Not because they're using the wrong models or the wrong vendors — but because they're treating AI as a productivity tool rather than as infrastructure. The firms that will look back on this decade as transformational are the ones that figured out the difference early.

A productivity tool is something you reach for when you have a specific task. You open it, use it, close it. A spreadsheet is a productivity tool. A calculator is a productivity tool. If you're using AI the same way — asking it to summarize a document, draft an email, or explain a concept, then moving on — you are capturing approximately 5% of the available value.

An operating system is something your entire workflow runs on top of. You don't use it for a task. It enables every task. The difference in leverage is not incremental. It's a different order of magnitude.

Where This Matters Most in Private Capital

Private equity and growth equity due diligence is, at its core, an information processing problem. A typical deal process generates hundreds of pages of documents — CIM, financial model, management presentation, quality of earnings, legal data room — that need to be synthesized into a coherent investment thesis, risk assessment, and decision framework under significant time pressure.

The firms that close the best deals are not necessarily the ones with the best judgment. They are the ones with the best judgment and the fastest information processing. An investment team that can synthesize a CIM into an initial thesis in two hours rather than two days has a materially different decision advantage — they can run more deals, go deeper faster, and walk into an IC meeting better prepared than the competition.

"Speed doesn't just save time. In competitive deal processes, it changes which deals you can pursue."

This is not a hypothetical efficiency gain. An AI system trained on the structure of investment committee memos, LBO modeling conventions, risk frameworks, and deal dynamics can process a 150-page CIM and produce a full IC memo draft — with LBO model outputs, risk register, data gap analysis, and executive summary — in a fraction of the time a senior associate would spend on the same task.
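For readers outside the deal world, the "LBO model outputs" referenced above reduce, at their most skeletal, to a few lines of arithmetic: buy with leverage, grow EBITDA, pay down debt, sell, and measure what the equity did. Every input below is purely illustrative:

```python
# Back-of-envelope LBO return math (all inputs are illustrative).
entry_ebitda   = 50.0    # $50M EBITDA at entry
entry_multiple = 8.0     # 8x EV/EBITDA purchase multiple
leverage       = 0.60    # 60% of EV funded with debt
hold_years     = 5
exit_ebitda    = 70.0    # assumes operational growth over the hold
exit_multiple  = 8.0     # flat multiple: no multiple-expansion credit
debt_paydown   = 0.50    # fraction of entry debt repaid during the hold

entry_ev   = entry_ebitda * entry_multiple              # 400.0
entry_debt = entry_ev * leverage                        # 240.0
equity_in  = entry_ev - entry_debt                      # 160.0

exit_ev    = exit_ebitda * exit_multiple                # 560.0
equity_out = exit_ev - entry_debt * (1 - debt_paydown)  # 440.0
moic = equity_out / equity_in                           # 2.75x
irr  = moic ** (1 / hold_years) - 1                     # ~22.4% per year
```

A real model layers in cash sweeps, interest schedules, working capital, and fees, but the point stands: the mechanics are structured enough to generate from extracted parameters, which is exactly why they automate well.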

The System vs. The Tool

Here's the distinction that matters: using ChatGPT to summarize a document is a tool interaction. Building a workflow where your deal origination pipeline feeds documents directly into an AI system that extracts the deal parameters, runs the financial model, flags the risk factors, generates the memo, and routes it to the IC — that is an operating system.

The difference is that the first interaction helps you one time with one document. The second interaction scales. Every deal that enters the pipeline benefits from the same system. The marginal cost of processing deal number 50 is essentially identical to deal number 1. You are compressing the economics of a 10-person deal team into a 2-person deal team — without sacrificing output quality.

  • AI-powered CIM extraction eliminates 80% of the manual data entry in the early-stage diligence process
  • LBO and growth equity models generated automatically from extracted deal parameters
  • IC memo sections — executive summary, investment thesis, risk register, 100-day plan — produced as drafts for human refinement, not human origination
  • Diligence checklists, IC conditions, and document request management integrated into the same workflow
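Structurally, the "operating system" pattern described above is a pipeline: each stage's output is the next stage's input, and humans review the final artifact rather than producing the intermediates. A skeleton under obvious assumptions; every function and field name here is hypothetical, with stubs standing in for the AI extraction and modeling stages:

```python
from dataclasses import dataclass, field

@dataclass
class DealPackage:
    """Accumulates artifacts as a deal moves through the pipeline."""
    source_doc: str
    parameters: dict = field(default_factory=dict)
    model: dict = field(default_factory=dict)
    memo_draft: str = ""

def extract_parameters(doc: str) -> dict:
    # Stub for AI extraction of revenue, EBITDA, growth, etc. from the CIM.
    return {"ebitda": 50.0, "revenue": 300.0}

def build_model(params: dict) -> dict:
    # Stub for automated model construction from extracted inputs.
    return {"entry_ev": params["ebitda"] * 8.0}

def draft_memo(pkg: DealPackage) -> str:
    # Stub: a memo template populated from structured model outputs.
    return (f"Entry EV ${pkg.model['entry_ev']:.0f}M on "
            f"${pkg.parameters['ebitda']:.0f}M EBITDA. [DRAFT - for IC review]")

def run_pipeline(doc: str) -> DealPackage:
    pkg = DealPackage(source_doc=doc)
    pkg.parameters = extract_parameters(pkg.source_doc)
    pkg.model = build_model(pkg.parameters)
    pkg.memo_draft = draft_memo(pkg)
    return pkg  # the team reviews and refines; nobody starts from blank
```

The economic property claimed in the text falls out of this shape: once the stages exist, deal number 50 flows through the same functions as deal number 1 at essentially zero marginal cost.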

The Human Role Doesn't Disappear. It Upgrades.

The most common objection is that AI-generated analysis will miss the nuance that experienced deal professionals bring to a process. This objection misunderstands what AI systems are being asked to do. A well-designed system doesn't replace judgment. It handles the information structuring, the model mechanics, and the memo scaffolding — so that human judgment can be applied to the questions that actually require it.

What requires human judgment: is this management team credible? Is this market narrative defensible? Does the exit path make sense given current buyer appetite? Is the business actually performing the way the numbers suggest?

What does not require human judgment: extracting EBITDA figures from page 47 of a CIM. Building an LBO model from those figures. Formatting the executive summary. Organizing the diligence checklist. Tracking which data room documents have been received.

The firms building AI-native deal workflows are not automating judgment. They are automating the scaffolding around judgment — freeing their people to do more of the work that actually requires them.

The Window Is Closing

Technology adoption curves in private equity are slow — slower than in most industries, because the asset class is relationship-driven and conservative by nature. But they're not infinitely slow. The firms building these capabilities now are establishing a structural advantage that will compound across every deal cycle. In three years, AI-native deal infrastructure will not be a differentiator. It will be a baseline expectation, the same way Excel models and Bloomberg terminals eventually became table stakes.

The question is not whether to build on AI infrastructure. It's whether you do it now, while it's still an advantage — or later, when it's a catch-up exercise.

Dealithic is Barenberg Capital's AI-powered deal engine — built for PE and growth equity teams who want to move faster without sacrificing rigor.

Explore Dealithic →
Private Capital · March 2026 · 6 min read

The Deal Team of 2028 Will Look Nothing Like Today's

By Barenberg Capital Partners

Private equity has always been a talent-intensive business. The model — hire the best analysts from investment banking, work them relentlessly, filter for the ones who develop deal judgment, promote slowly — is as old as the industry. It produced extraordinary results for decades. It is also, quietly, becoming a structural liability.

The problem is not that the people are bad. The problem is the ratio of what they spend their time on to what they are actually worth. A third-year associate at a top-quartile fund commands a $350,000 total compensation package and spends approximately 40% of their time on tasks that, within five years, will be performed entirely by AI systems. The economics of that arrangement are going to shift — dramatically, and faster than most firms expect.

Where the Time Actually Goes

Break down the workflow of a deal team from CIM receipt to IC meeting and you find that the hours are heavily concentrated in a handful of tasks: initial CIM analysis and thesis formation, financial model construction, memo drafting, data room organization, diligence tracking, and management preparation. Of these, the first two have the highest human-judgment content. The rest are largely mechanical.

Memo drafting — the act of translating a financial model and a set of diligence findings into a structured investment committee document — is a template-driven process. It has a defined structure, a defined set of sections, and a defined logic for how information flows between them. A senior associate does it better than a junior one because they have more pattern recognition from having done it many times. An AI system trained on thousands of IC memos can develop that pattern recognition faster, apply it more consistently, and produce a first draft in minutes rather than days.

"The bottleneck in private equity has never been judgment. It's always been bandwidth. AI eliminates the bandwidth problem."

This is not a marginal productivity improvement. Cutting 40% of a senior associate's time on mechanical tasks doesn't mean you need 40% fewer associates. It means each associate can cover 40% more deals — and at a fund actively sourcing 200+ opportunities per year to close 4–6, that capacity expansion is the difference between passing on something great because you didn't have time to diligence it properly and getting there first.

What Changes First

The initial phase of AI adoption in deal teams is already underway. It looks like this: a junior analyst uses an AI tool to produce a first-pass summary of a CIM. A senior associate cleans it up and adds their own analysis. The IC memo takes three days instead of six. The financial model takes four hours instead of twelve. The team can run two processes simultaneously instead of one.

This is the productivity tool phase — real value, but not the structural shift. The structural shift happens when the workflow is redesigned around the AI capability rather than bolted on top of an existing workflow. When the CIM comes in and is automatically ingested, the deal parameters are extracted and fed directly into the financial model, the model outputs flow into the memo template, the memo template populates with deal-specific language, and the team's first interaction with the deal is reviewing and refining a nearly complete package — not starting from a blank spreadsheet.

  • Origination teams spending more time on relationship-driven sourcing, less on document processing
  • IC processes accelerating from weeks to days without sacrificing depth of analysis
  • Portfolio monitoring becoming partially automated through KPI dashboards and AI-generated variance analysis
  • Investor reporting — quarterly letters, LP decks — assembled in hours from structured data, not days from scratch

The New Division of Labor

The deal team of 2028 is not necessarily smaller — though some firms will go that direction. It is differently structured. The ratio of senior judgment to junior execution shifts dramatically. You need fewer people doing the mechanical work and more people with the experience to ask the right questions of the output — to identify when the model is making an assumption that doesn't hold, when the thesis has a fatal flaw that the AI missed because it was embedded in a footnote on page 83, when management's narrative doesn't reconcile with the customer concentration data on the data tape.

Those are not things that AI systems will do well by 2028. They require exactly the kind of pattern-matching, skepticism, and domain experience that takes years to develop. The professionals who develop those skills and layer them on top of AI-native workflow tools will be the most valuable people in the industry. The ones who resist the tools, or who allow their skills to atrophy because the tools are doing too much, will find themselves on the wrong side of a widening capability gap.

For Smaller Funds, the Opportunity Is Larger

The conventional wisdom is that AI will primarily benefit large institutions with the resources to build proprietary tools. The opposite is closer to the truth. A $100 million fund with a three-person deal team, using AI-native deal infrastructure, can punch significantly above its weight class — sourcing, diligencing, and executing on deals that would have been impossible to process with a three-person team five years ago.

The technology doesn't care about the size of your AUM. It delivers the same leverage to the two-person emerging manager as it does to the billion-dollar platform. What it rewards is the willingness to redesign the workflow rather than layer the tool on top of the old one. The funds that figure that out — particularly at the smaller end of the market, where each incremental unit of deal capacity matters most — will compound that structural advantage deal by deal.

2028 is not far away. The teams building this infrastructure today are not future-proofing. They're winning now.

Barenberg Capital's advisory practice works with PE and growth equity teams navigating transactions where speed and rigor both matter.

Speak with Our Team →