Catalyst Cooperative is a data engineering and analysis consultancy, specializing in energy system and utility financial data. Our current focus is on the US electricity and natural gas sectors. We primarily serve non-profit organizations, academic researchers, journalists, climate policy advocates, public policymakers, and occasionally smaller business users.
We believe public data should be freely available and easy to use for those working in the public interest. Whenever possible, we release our software under the MIT License and our data products under the Creative Commons Attribution 4.0 License.
If you're interested in hiring us, email [email protected]. Our current rate is $155/hr. We can often make accommodations for smaller/grassroots organizations and frequently collaborate with open source contributors.
- For general support, questions, or other conversations about our work that might be of interest to others, head over to our GitHub Discussions
- If you'd like to get (very) occasional updates about our work, sign up for our email list.
- Want to schedule a time to chat with us one-on-one about our software or data? Have ideas for improvement, or need some personalized support? Join us for Office Hours
- Follow us on Twitter: @CatalystCoop
- Follow us on Mastodon: @[email protected]
- Play with our data and notebooks on Kaggle
- Combine our data with ML models on HuggingFace
- Learn more about us on our website: https://catalyst.coop
- Programmatic acquisition, cleaning, and integration of public data sources.
- Data-oriented software development.
- Compilation of new machine-readable data sources from regulatory filings, legislation, and other public information.
- Data warehousing and dashboard development.
- Both ad-hoc and replicable production data analysis.
- Translation of existing ad-hoc data wrangling workflows into replicable data pipelines written in Python.
- Python is our primary language for everything.
- Pandas, the Swiss Army knife of tabular data manipulation in Python.
- Dask to scale up data wrangling tasks we do with Pandas beyond what can be done in memory.
- Dagster for orchestrating and parallelizing our data pipelines.
- SQLite for local storage and distribution of tabular, relational data.
- Apache Parquet to persist larger data tables to disk.
- JupyterLab for interactive data wrangling, exploration, and visualizations.
- Pydantic for managing and validating settings and our collection of metadata.
- scikit-learn to construct machine learning pipelines.
- Splink for fast, generalized entity matching.
- Google BigQuery to warehouse finished data products for live access.
- Google Batch to minimize the infrastructure we need to manage for our nightly builds.
- Pandera to specify dataframe schemas and data validations in conjunction with Dagster (see the sketch below).
- Hypothesis for more robust data-oriented unit testing.
- Zenodo provides long-term, programmatically accessible, versioned archives of all our raw inputs.
- Sphinx for building our documentation, incorporating much of our structured metadata directly using Jinja templates.
- The Frictionless Framework as a standard interchange model for tabular data.
- Tableau for producing dashboards and interactive data visualizations for client projects.
- VS Code is our primary code editor, ever more deeply integrated with GitHub.
- pre-commit to enforce code formatting and style standards.
- GitHub Actions to run our continuous integration and coordinate our nightly builds and data scraping jobs.
- DuckDB as a performant, columnar, analysis-oriented alternative to SQLite.
- Pixi, a fast, ergonomic command-line tool for conda package management.
- Evidence, Rill, Apache Superset, and Streamlit as open source BI tools that play nice with revision control.
- SQLModel to more easily unify our metadata and database schema definitions with SQLAlchemy.
- dbt to manage pure SQL data transformations where appropriate within our larger Python-based workflows.
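
As a rough sketch of how a couple of these pieces fit together, here's a minimal example of a Dagster asset that validates a pandas dataframe against a Pandera schema before handing it downstream. The table and column names are hypothetical, not taken from our actual pipeline:

```python
import pandas as pd
import pandera as pa
from dagster import asset

# Hypothetical schema for a small generator-level table.
generators_schema = pa.DataFrameSchema(
    {
        "plant_id": pa.Column(int, pa.Check.ge(0)),
        "capacity_mw": pa.Column(float, pa.Check.ge(0), nullable=True),
    },
    strict=True,  # reject any unexpected columns
)


@asset
def validated_generators() -> pd.DataFrame:
    """Build a (made-up) generators table and validate it at the asset boundary."""
    raw = pd.DataFrame({"plant_id": [1, 2, 3], "capacity_mw": [50.0, 120.5, None]})
    # Pandera raises a SchemaError if the dataframe violates the schema.
    return generators_schema.validate(raw)
```

In a real pipeline the dataframes would come from upstream assets rather than being built inline, and the dagster-pandera integration can turn schemas like this into Dagster types, but the basic pattern of validating at asset boundaries is the same.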
- GridStatus
- Interconnection.fyi
- GridEmissions
- PowerGenome from @gschivley
- The Open Grid Emissions Initiative from @grgmiller & Singularity Energy
- Pangeo Forge
- DSIRE at North Carolina State University
- The Open Energy Modeling Initiative
- Open Energy Transition
- CarbonPlan
- The Open Knowledge Foundation
- 2i2c: The International Interactive Computing Collaboration
- The Open Collective Foundation
- The Open Energy Outlook
- Code for Science & Society
- The US Research Software Engineering Association
- Diagonal Works
- The Environmental Data & Governance Initiative (EDGI)
- Technology Cooperatives Everywhere!
- The Alfred P. Sloan Foundation Energy & Environment Program
- RMI
- GridLab
- Climate Change AI
- The Mozilla Foundation
- Carbon Tracker
- Climate Policy Initiative
- Energy Innovation
- Lawrence Berkeley Lab Energy Technologies Area
- Invenia Labs
- Western Interstate Energy Board
- Flora Family Foundation
- The Deployment Gap Education Fund
Catalyst is a democratic workplace and a member of the US Federation of Worker Cooperatives. We exist to help our members earn a decent living while working for a more just, livable, and sustainable world. Our income comes from a mix of grant funding and client work. We only work with mission-aligned clients.
We are an entirely remote organization, and have been since well before the coronavirus pandemic. Our members are scattered all across North America from Alaska to Mexico. We enjoy a great deal of autonomy and flexibility in determining our own work-life balance and schedules. Membership entails working a minimum of 1000 hours each year for the co-op.
As a small 100% employee-owned cooperative, we are able to compensate members through an unusual mix of wages and profit sharing, including:
- An hourly wage (currently $36.75/hr)
- Tax-deferred employer retirement plan contributions (proportional to wages, up to 25%)
- Tax-advantaged patronage dividends (proportional to hours worked, unlimited but subject to profitability)
We also reimburse ourselves for expenses related to maintaining a home office, and provide a monthly health insurance stipend.
Candidates must complete at least 500 hours of contract work for the cooperative over six months, at which point they will be considered for membership.