Carp over-budget + maintenance
Current Project Status

This proposal will pay back dcSpark for the over-budget work and fund continued development of the project, which is now used by multiple companies inside the Cardano ecosystem.


Carp is a more flexible alternative to the cardano-db-sync tool. It was created as the result of a previously funded Catalyst proposal; however, that work ended up over budget.

Impact / Alignment



Please describe your proposed solution.

Blockchains contain a lot of information — often hundreds of gigabytes worth of data. Applications like dApps and wallets need to quickly filter through this data to provide the fast user experience that users expect. This is what indexers are meant to solve. There are many different kinds of indexers and the best one for the job tends to depend on what you’re trying to do. SQL databases are a popular kind of indexer because they provide a lot of flexibility to users to query what they want while maintaining decent performance.
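To make the SQL-indexer idea concrete, here is a minimal sketch using Python's built-in sqlite3 module. The schema and the data are purely illustrative and do not reflect Carp's actual database layout: the point is only that storing a small, indexed projection of chain data lets an application answer address-level queries without scanning the whole chain.

```python
import sqlite3

# Minimal sketch of the SQL-indexer idea: store a small projection of
# chain data and query it by address. This schema is hypothetical and
# is NOT Carp's actual schema.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE tx_output (address TEXT, tx_hash TEXT, value INTEGER)")
db.execute("CREATE INDEX idx_address ON tx_output (address)")

# Pretend these rows were extracted while syncing blocks.
db.executemany(
    "INSERT INTO tx_output VALUES (?, ?, ?)",
    [
        ("addr1_alice", "tx_aaa", 5_000_000),
        ("addr1_bob", "tx_bbb", 2_000_000),
        ("addr1_alice", "tx_ccc", 1_000_000),
    ],
)

# A wallet only needs this one address's history, not the whole chain.
rows = db.execute(
    "SELECT tx_hash, value FROM tx_output WHERE address = ? ORDER BY tx_hash",
    ("addr1_alice",),
).fetchall()
print(rows)  # [('tx_aaa', 5000000), ('tx_ccc', 1000000)]
```

The index on `address` is what makes the lookup fast regardless of chain size, which is the trade-off indexers make: extra storage and sync time in exchange for cheap queries.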

Carp has the following core pillars:

  • Flexible: allow wallets and dApps full freedom in how they handle data
  • Modular: only index what you need
  • Fast: queries should be fast
  • Type safe: developers should know exactly what data to expect, avoiding critical issues in their products
  • Documented: it should be easy for any developer to get started with Carp

You can learn more about Carp in the links provided below this Catalyst proposal.

This project is now being used by multiple projects/companies such as dcSpark, Milkomeda, ErgoDEX, and NFT Maker, with interest from other projects such as Flint and Yoroi. As such, we plan to continue developing it (see some of the things we have planned here) and to implement features requested by projects (new REST endpoints, new Carp tasks, etc.).

We also plan to maintain this tool across hardforks such as Vasil and through large updates to underlying libraries (Pallas/Oura/CML).

We will also update this tool to include safety improvements and performance improvements, each covered by separate Catalyst proposals.

Please describe how your proposed solution will address the Challenge that you have submitted it in.

Carp allows any developer to easily write their own indexer for the specific data they need. This gives developers fast, space-efficient indexers and frees up more of their time to focus on the tools and products they want to deliver.
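The modular "write your own indexer" idea can be sketched as a pipeline of small tasks, each extracting only the data it cares about from each block. The names and data shapes below are illustrative only and do not match Carp's real Rust task API:

```python
# Hypothetical sketch of a modular indexer: each "task" extracts only
# the data it needs from a block. Names here are illustrative and do
# not match Carp's actual task API.
from dataclasses import dataclass


@dataclass
class Tx:
    hash: str
    outputs: list  # list of (address, value) pairs


@dataclass
class Block:
    height: int
    txs: list


def address_balance_task(block, state):
    """Only index address -> total received value; ignore everything else."""
    for tx in block.txs:
        for address, value in tx.outputs:
            state[address] = state.get(address, 0) + value


def run_indexer(blocks, tasks):
    state = {}
    for block in blocks:  # in a real indexer, blocks stream from a node
        for task in tasks:
            task(block, state)
    return state


blocks = [
    Block(1, [Tx("tx_a", [("addr_alice", 3), ("addr_bob", 2)])]),
    Block(2, [Tx("tx_b", [("addr_alice", 4)])]),
]
state = run_indexer(blocks, [address_balance_task])
print(state)  # {'addr_alice': 7, 'addr_bob': 2}
```

Because only the registered tasks run, an application that never needs, say, metadata or Plutus data pays no storage or sync cost for it, which is the "Modular" pillar in practice.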

What are the main risks that could prevent you from delivering the project successfully and please explain how you will mitigate each risk?

The risk is low because part of this proposal pays us for work we have already completed that went over budget. Additionally, our company depends on Carp for multiple projects, so we are committed to its continued maintenance and improvement.

Please provide a detailed plan, including timeline and key milestones for delivering your proposal.

Part of the proposal pays us for previously over-budget work; for the core improvements, notable items include:

Q3: Update to Vasil, update to the latest CML that includes the safer CBOR codegen, add faster re-indexing support, and add more of the tasks mentioned in the GitHub issues, notably with some emphasis on the Plutus functionality required for price-feed indexing.

Please provide a detailed budget breakdown.

All funds will be used for continued development of the project.

Please provide details of the people who will work on the project.

Given the modular nature of this project, we expect multiple engineers to add specific components over the course of the Catalyst work. However, GitHub handles SebastienGllmt and rooooooooob will be the ones overseeing the project.

If you are funded, will you return to Catalyst in a later round for further funding? Please explain why / why not.

If Carp continues to gain adoption, we will return to Catalyst for further funding to continue improving it.

Please describe what you will measure to track your project's progress, and how will you measure these?

Merging of GitHub PRs, closing of GitHub issues, and completion of the core Q3 tasks outlined above.

What does success for this project look like?

Many projects leveraging Carp to get fast and efficient indexing of the data they need for their products.

Please provide information on whether this proposal is a continuation of a previously funded project in Catalyst or an entirely new one.

Yes: <>


