AI Consultant & Strategist
Bridging the gap between cutting-edge AI and the operational realities of finance, data, and enterprise systems.
// About
I bring a rare combination of quantitative rigor, financial expertise, and design thinking to AI consulting. With a background spanning astrophysics research at NASA, prime brokerage at Goldman Sachs, foreign exchange at JP Morgan, and analytics leadership in consulting, I understand how organizations actually work — and where AI can genuinely move the needle.
I help companies cut through the AI hype and implement practical, high-impact solutions in operations, reporting, and data systems. No buzzwords. Just results.
// Services
Assess where AI fits in your operations, identify high-value opportunities, and build a practical implementation plan your team can actually execute.
Map your existing processes and replace manual, repetitive work with intelligent automation — freeing your team to focus on what matters.
Build intelligent dashboards and reporting pipelines that surface insights automatically, replacing slow manual reporting with real-time clarity.
Develop bespoke AI-powered tools tailored to your specific business problems — from internal assistants to predictive models and data pipelines.
Specialized expertise in applying AI to financial operations, quantitative analysis, risk reporting, and investment data workflows.
Help your team understand, trust, and effectively use AI tools — turning skeptics into advocates and ensuring lasting organizational change.
// Case Studies
A mid-sized hedge fund was producing daily portfolio reports manually — a process taking analysts 3+ hours each morning, prone to errors, and leaving portfolio managers waiting for data when markets were already moving.
Audited the full reporting workflow and identified key data sources — prime brokerage feeds, internal risk systems, Bloomberg. Built an automated pipeline that ingested, reconciled, and normalized data nightly, generating reports with natural language summaries of portfolio movements, risk exposures, and attribution.
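The reconciliation step at the heart of a pipeline like this can be sketched in a few lines. This is a simplified illustration, not the engagement's actual code: the feed names, tickers, and quantities are invented stand-ins.

```python
# Minimal sketch of a nightly reconciliation step: compare position
# quantities reported by two feeds and flag any mismatches ("breaks").
# Field names and figures are illustrative, not from the engagement.
def reconcile(prime_feed: dict, risk_system: dict, tolerance: float = 0.0) -> list:
    """Return (ticker, qty_prime, qty_risk) for positions that disagree."""
    breaks = []
    for ticker in sorted(set(prime_feed) | set(risk_system)):
        qty_prime = prime_feed.get(ticker, 0.0)
        qty_risk = risk_system.get(ticker, 0.0)
        if abs(qty_prime - qty_risk) > tolerance:
            breaks.append((ticker, qty_prime, qty_risk))
    return breaks

prime = {"AAPL": 100.0, "MSFT": 250.0}   # e.g. prime brokerage feed
risk = {"AAPL": 100.0, "TSLA": 40.0}     # e.g. internal risk system
breaks = reconcile(prime, risk)
# MSFT and TSLA appear in only one feed; AAPL matches in both.
```

In practice each break would be routed into the morning report's exception section rather than silently dropped, so analysts review only the handful of positions that actually disagree.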
Reporting time reduced from 3 hours to under 10 minutes. Portfolio managers received actionable morning briefs before market open. Analysts were freed to focus on analysis rather than data wrangling.
A family office with multi-currency, multi-asset holdings had no unified view of FX exposure across portfolios. Risk reporting was fragmented across spreadsheets maintained by different teams, making consolidated analysis slow and unreliable.
Designed a centralized data model consolidating positions across asset classes and currencies. Built an AI layer that calculated real-time FX exposure, generated plain-language risk summaries, and flagged threshold breaches automatically — replacing manual weekend reconciliation.
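The exposure-and-threshold logic described above reduces to a small aggregation. This is a hedged sketch under simplified assumptions: the holdings, conversion rates, and threshold below are invented, and a real system would source live rates and per-portfolio limits.

```python
# Minimal sketch of consolidated FX exposure: convert positions held in
# several currencies into a base currency, aggregate per currency, and
# flag threshold breaches. All rates and amounts are illustrative.
def fx_exposure(positions, rates_to_base, threshold):
    """Return (exposure per currency in base units, breaches over threshold)."""
    exposure = {}
    for ccy, amount in positions:
        exposure[ccy] = exposure.get(ccy, 0.0) + amount * rates_to_base[ccy]
    breaches = {ccy: val for ccy, val in exposure.items() if abs(val) > threshold}
    return exposure, breaches

positions = [("EUR", 2_000_000), ("JPY", 500_000_000), ("EUR", -500_000)]
rates = {"EUR": 1.08, "JPY": 0.0066}  # illustrative rates to a USD base
exposure, breaches = fx_exposure(positions, rates, threshold=2_000_000)
# Net EUR exposure is ~1.62m USD (under threshold); JPY is ~3.3m USD (breach).
```

The plain-language summaries and weekly briefs sit on top of output like this: once exposure and breaches are structured data, generating readable commentary from them is the easy part.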
The investment team gained a live, consolidated view of currency exposure for the first time. Weekend reconciliation was eliminated. Principals received automated weekly briefs with actionable hedging considerations derived directly from current positions.
A consulting firm was sitting on years of client engagement data, project outcomes, and market research — but had no way to synthesize it. Consultants spent hours rebuilding analysis from scratch for each new engagement.
Designed and implemented a retrieval-augmented AI system that indexed the firm's full knowledge base and allowed consultants to query it conversationally. Built automated insight reports summarizing relevant past engagements, benchmark data, and methodology frameworks on demand.
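The retrieval step in a system like this can be illustrated in miniature. A production build would use learned embeddings and a vector store; bag-of-words cosine similarity stands in here so the sketch is self-contained, and the documents are invented examples, not client data.

```python
# Minimal sketch of retrieval for a retrieval-augmented setup: score an
# index of past-engagement summaries against a query and return the top
# matches, which would then be passed to a language model as context.
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def top_k(query: str, docs: list, k: int = 2) -> list:
    """Rank docs by similarity to the query; keep the k best nonzero hits."""
    q = Counter(query.lower().split())
    scored = [(cosine(q, Counter(d.lower().split())), d) for d in docs]
    return [d for score, d in sorted(scored, reverse=True)[:k] if score > 0]

docs = [
    "supply chain cost benchmarking for retail client",
    "post merger integration playbook for industrials",
    "retail pricing strategy and benchmark data",
]
hits = top_k("retail benchmarking engagement", docs)
# The retail benchmarking summary ranks first; the merger doc is filtered out.
```

The conversational layer is then a prompt template: the user's question plus the retrieved summaries, so answers are grounded in the firm's own past work rather than the model's general knowledge.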
Consultants cut research and prep time by over 50% per engagement. The firm's institutional knowledge became accessible and actionable for the first time. New hires onboarded significantly faster with AI-assisted context on past work.
The Met's digital archive of 200,000+ works was largely inaccessible to younger, diverse audiences — particularly young people from nearby neighborhoods who had never engaged with the museum. The data existed; the entry point didn't.
Conducted ethnographic research to identify 13 culturally influential rappers. Built a system integrating the Rap Genius API with the Met's own digital collection API, using words drawn from rap lyrics as keyword searches against the archive — generating curated, personalized "tours" of the collection through the lens of hip hop. Created for the Met Media Lab.
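The lyric-to-archive mechanic can be sketched offline. The real project queried the Rap Genius API and the Met's public collection API; the catalog, lyric fragment, and stopword list below are invented stand-ins so the sketch runs without network access.

```python
# Minimal sketch of the lyric-to-tour mechanic: pull salient words from
# a lyric and use each as a keyword search over a catalog, assembling
# the hits into a "tour". Catalog entries and the lyric are illustrative.
STOPWORDS = {"the", "a", "in", "on", "my", "and", "to", "i", "got"}

def keywords(lyric: str) -> list:
    """Lowercase, strip punctuation, and drop stopwords."""
    words = [w.strip(".,!?").lower() for w in lyric.split()]
    return [w for w in words if w and w not in STOPWORDS]

def build_tour(lyric: str, catalog: dict) -> dict:
    """Map each lyric keyword to the catalog entries whose text mentions it."""
    tour = {}
    for word in keywords(lyric):
        hits = [title for title, desc in catalog.items() if word in desc.lower()]
        if hits:
            tour[word] = hits
    return tour

catalog = {
    "Bronze Crown of a Nubian King": "royal crown, bronze, gold inlay",
    "Samurai Armor": "lacquered iron armor with silk lacing",
    "Gold Chain, Byzantine": "heavy gold chain necklace",
}
tour = build_tour("Got my gold chain and a crown", catalog)
# keywords "gold", "chain", and "crown" each pull matching works into the tour
```

The interesting design choice is the inversion: instead of asking visitors to learn museum taxonomy, the lyrics they already know become the search vocabulary.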
Demonstrated how combining open cultural datasets with an unexpected interface layer can unlock entirely new audiences. The project launched at rappersdelight.nyc — a model for how institutions with large, underutilized data archives can use API integration to drive discovery and engagement.
Cutting-edge microbiome research from the Mason Lab at Weill Cornell was generating rich scientific data about the microbial life of NYC — but it was invisible to the public and locked in academic formats. How do you make complex biological data not just understandable, but viscerally felt?
Led design and visualization for a cross-disciplinary collaboration with MIT Media Lab, Weill Cornell, The Cooper Union, and the Extrapolation Factory. Combined DNA sequencing data collected via honeybees with thermal imaging of NYC streets and web-based 3D rendering to create a live data art installation — reframing the city as a biological superstructure.
Featured in Wired, National Geographic, and Creative Applications. Results contributed to peer-reviewed publications in environmental microbiology. Demonstrated that rigorous scientific data, when designed with intention, can shift public perception at scale.
// Contact