Matt Owen

Work Experience

Moat, acquired by Oracle Data Cloud

  • Software Engineering Manager, Senior Principal Software Engineer, Data Engineer
New York, New York
  • Software Engineering Manager
    • Managed team of 7 Data Engineers
    • Worked with other Engineering Manager stakeholders to design a flexible core data pipeline
    • Built multiyear Software Roadmap with Engineering Managers and Product Owners
    • Organized and led "Agile" rituals - Sprint Planning, Sprint Review, and Backlog Grooming
    • Promoted from Lead Realtime Systems Engineer
  • Lead Realtime Systems Engineer
    • Built realtime event processing applications handling 1,000,000+ events per second - Python, Go
    • Scoped core data pipeline changes for use across entire company
    • Managed weekly software releases for core business logic
    • Maintained large-timescale statistical databases, importing 100,000,000+ rows per day - heavily modified TimescaleDB and Postgres 10
    • Evaluated realtime data pipelines - Kafka and Amazon Kinesis
    • Evaluated high-write-throughput databases - Apache Ignite, Cassandra
    • Evaluated in-memory data stores - Apache Ignite, VoltDB
    • Specced realtime system changes for product-oriented features
    • Specced database architecture for write-heavy workloads using Cassandra
  • Senior Principal Software Engineer
    • Managed weekly software releases for core business logic in a codebase contributed to by 4 distinct teams
    • Deployed software running on over 800 machines
    • Designed software to run highly distributed systems - 11,000+ running Python processes
    • Designed and built stream-processing applications - Amazon Kinesis, Java
    • Designed and built a system-wide wire protocol - Google protobuf
    • Built and maintained software end-to-end across 800+ AWS instances (ranging from c5.xl to x1e.8xl)
    • Built and maintained realtime processing systems (250+ r5.8xl EC2 instances) on Amazon AWS
    • Built and maintained an nginx pixel-serving system (400+ c5.xl EC2 instances) on Amazon AWS
    • Replaced the original realtime processing system with an event-driven system - Amazon Kinesis
    • Replaced the original realtime aggregation and querying system - Apache Ignite


Chartbeat

  • Team Lead, Senior Software Engineer, Data Engineer
New York, New York
  • Team Lead
    • Led a 7 person interdisciplinary scrum team - Data Science, Design, and Software Engineering
    • Completed Certified Scrum Master Training
    • Designed the core historical data pipeline:
      • Organized development of the historical data pipeline - moving data from Kafka to Postgres
      • Designed systems to fail gracefully in the event of zone unavailability
      • Designed systems to recover from failure - that is, pipeline failures do not cascade downstream
      • Designed systems to retain original data in compliance with Chartbeat's MRC accreditation
    • Organized and ran scrum rituals: Planning, Backlog Grooming, Sprint Review, Retrospective
    • Collaborated with Product Managers on assessing risk and value in feature development
    • Communicated pipeline to downstream teams
    • Worked with Marketing and Customer Success teams to organize product release timelines
    • Worked with Marketing team to create accurate Value Propositions for new products
  • Senior Data Engineer
    • Designed wire protocol using Google's protobuf format
    • Built software that, in production, processes over 300,000 messages per second
    • Administered two 20-node Redshift clusters, multiple RDS instances, and 50+ EC2 instances directly
    • Project lead, building a data pipeline across Kafka and Redshift
  • Senior Backend Engineer
    • Built Kafka data pipeline in Clojure and Java
    • Built and maintained Clojure libraries with performance-oriented Java interoperability
    • Wrote Nagios checks, measuring pipeline health and recording instances of data-loss
    • Configured production machines using Puppet
    • Deployed changes using Fabric
    • Developed software on the Kafka platform
    • Developed Kafka Consumers in Clojure
    • Developed Java libraries for data sanitization and munging
    • Developed a fast, cheap pipeline processing 300,000+ messages per second, mapping to 40,000+ stored sessions per second - Kafka to Redshift

Cengage Learning

  • Senior Software Engineer
Boston, MA
  • Senior Engineer:
    • Developed REST services in NodeJS and MongoDB
    • Developed content APIs in JavaScript for use by client-side developers
    • Developed reusable modules (AMD and Angular) for use across multiple teams
    • Member of agile team, using test-driven development and pair-programming
  • Core Team Contributor:
    • Migrated outdated core functionality to modern technologies (NodeJS, Angular)
    • Seamlessly integrated new software into in-service products
    • Navigated extremely brittle software while shipping modernizing changes
  • Technical Lead:
    • Dependency Planning: planned cross-team dependencies and development priorities for a team of 7 engineers
    • Engineering Planning: collaborated with product owners to translate product needs into engineering deliverables
    • Scope Analysis: produced and updated attainable deadlines
    • Cross-team Communication: interfaced with outside software teams to schedule service outages and plan dependencies

Harvard University, Institute for Quantitative Social Sciences

  • Lead Developer
  • Software Maintainer
  • Statistical Programmer
Cambridge, MA
  • Lead developer for Zelig:
    • Developed and extended a framework for unifying statistical interfaces across web and R-scripts
    • Coded over 50 statistical models in R
    • Contributed 7 statistical packages to CRAN, the Comprehensive R Archive Network
  • Statistical Programming Instructor:
    • Beginner Statistical Programming
    • Intermediate Statistical Programming
    • Developing Statistical Software
    • Git
  • Developer for Dataverse Network:
    • Interfaced statistical analysis tools with GlassFish EE in Java
    • Integrated tools for statistical analysis into the Dataverse Network
  • Miscellaneous:
    • Collaborated with graduate students on developing statistical software
    • Consulted with graduate students on cleaning data for statistical analysis

Planwork, LLC

  • Co-Founder
  • Manager
Somerville, MA
  • Developed and managed the creation of generative visual art in OpenGL and CocoaTouch
  • Developed a series of parametric logos in WebGL and HTML5 Canvas
  • Developed creative software applications using NodeJS, WebGL, OpenGL, and CocoaTouch
  • Developed CocoaTouch applications to mix video in realtime for art installations
  • Developed realtime streaming video apps for Instagram in NodeJS
  • Created non-repeating patterns used in laser-etching leather products
  • Authored a series of fine-art pieces geared toward interaction on social media networks
  • Ran multiple advertising campaigns
  • Used streaming data from Instagram™ and Twitter™ to build fine art pieces


Zelig: Statistical Software

Harvard University, Institute for Quantitative Social Sciences
  • Lead Developer
  • Software Maintainer
  • Statistical Programmer
Zelig is a single, easy-to-use program that can estimate, help interpret, and present the results of a large range of statistical methods. It literally is "everyone's statistical software" because Zelig uses (R) code from many researchers.
  • Developed Statistical Software based on original research by Gary King
  • Coded software to unify:
    • Imputing missing data
    • Fitting statistical models
    • Setting counterfactuals
    • Predicting observations
    • Evaluating accuracy of results
    • Displaying likelihood of predictions
  • Developed unified interface to allow for simple, automated input of data into a model
  • Developed API to simplify the implementation of Statistical models
  • Began development on a cloud-based Statistics stack
  • Implements over 50 statistical models:
    • Generalized Linear Models
    • Multi-level Models
    • Multinomial Choice Regressions
    • etc.
  • Interfaces with external software to automatically impute data-sets
  • Coded a general framework for simulating statistical data across a variety of models in R
  • Coded a general framework for imputing data in R

Dataverse Network

  • Java Developer
  • Statistical Programmer
The Dataverse Network is an open source application to publish, share, reference, extract, and analyze research data. It facilitates making data available to others, and allows you to replicate others' work. Researchers, data authors, publishers, data distributors, and affiliated institutions all receive appropriate credit.
  • Collaborated with a team of Java developers on an Enterprise project
  • Developed Statistical Software to work alongside Web Application
  • Developed software to:
    • Convert Rdata files to an archival format
    • Run tests to ensure consistency between Rdata and archival formats
    • Apply statistical models to remote data sets
  • Developed unit tests to compare newly added modules with the results of previous working models
  • Integrated Statistical analysis software into an existing data-storage web application
  • Developed a cloud-based Statistics Stack

  • Co-Founder
  • Lead Developer
  • Creative Director
This project is internet cultural tourism. Specifically, it is a Cocoa application written in C++ (with OpenGL). It is a relational art piece, exhibited at a monthly event. The piece abstractly navigates through public Instagram streams. It applies modern data science, OpenGL programming, Instagram API programming, digital signal processing, and QuickTime video rendering to produce a silly audio visualizer. This project is currently being ported to WebGL. The software was built to output its content directly to Tumblr as a GIF or MPEG-4 video.
  • Developed a series of technology art-pieces focused on "Internet Cultural Tourism"
  • Developed event-based software for use with the Instagram API
  • Visualized Instagram Data atop 3D models
  • Developed generative art CocoaTouch application
  • Integrated QuickTime video, FMOD (FFT), 3D objects, and projection shading in a single C++ application
  • Exports content to:
    • Animated GIF
    • MPEG-4 Video
    • Tumblr

Skills, Education, and Experience

Computer Graphics and Design

  • Digital Design and Streamlining Dataflow to Visual Output
  • OpenGL
  • GLSL
  • GLUT
  • Cinder
  • Collaborated with designers working in KeyShot
  • Collaborated with designers working in Photoshop
  • Created video in realtime from 3D-rendered objects
  • Photoshop, etc.
  • Created software to produce animated GIFs from an OpenGL app
  • Created software to asynchronously create QuickTime movies from an app

Web Programming

  • 7 years of professional coding experience, with a focus on data science and graphics programming
  • OpenGL & GLSL
  • Cocoa Touch/iOS
  • Python, Perl, PHP
  • R
  • WebGL
  • WebSockets
  • NodeJS
  • MongoDB
  • JavaScript
  • LAMP
  • Instagram API
  • Twitter API
  • Facebook SDK

Data Science and Machine Learning

  • Use high-performance machine learning and big data to create interesting and artistic applications.
  • Machine Learning:
    • Neural Networks
    • Generalized Linear Models
    • Support Vector Machines
  • Scientific Computing:
    • MATLAB: numerical analysis and differential equations
    • R: statistics
    • Python/scikit-learn: machine learning
  • Numerical Analysis for parallelizable and fast computation
  • Developed online-learning algorithms
  • R for fast prototyping of machine learning algorithms
  • Python with NumPy, SciPy, and scikit-learn


Mathematics

  • Apply mathematics to real-world computing problems.
  • Numerical Analysis: Finite Element Method, Ordinary Differential Equations
  • Optimization and Operations Research
  • KKT Conditions
  • Linear Programming
  • Vector-space optimization
  • Topology
  • Abstract Algebra

In conclusion,

Mathematics and Art. Pizza. Pool water. Thanks. 🍕💀