Step 1: Sync Data Source

The following are the high-level steps describing how the algorithm is implemented in an openly verifiable manner.

Sourcing Data

For Karma3 Labs' use cases on Lens Protocol, we periodically export data from Lens BigQuery and use it as our source of truth. Similarly, for Farcaster we pull from a locally running Hub (see Hubble) using their Replicator, which converts the RocksDB data into a Postgres database. For native on-chain data from Ethereum, Optimism, and Base, we download a portion of the data from Google BigQuery's public datasets and merge it with datasets from Dune, Nexandria, Airstack, and Moralis, as demonstrated on our Onchain Demo site.
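As a rough illustration of what one sync step can look like, the sketch below exports a slice of follow events from a BigQuery dataset and upserts them into a local Postgres instance. The dataset, table, and column names, as well as the connection string, are illustrative assumptions and not the actual Karma3 Labs schema.

```python
# A minimal sketch of one sync step, assuming access to Google BigQuery and a
# local Postgres instance. Table and column names are illustrative only.
from datetime import datetime, timezone

import psycopg2
import psycopg2.extras
from google.cloud import bigquery

BQ_QUERY = """
    SELECT profile_id, follower_profile_id, block_timestamp
    FROM `lens-public-data.v2_polygon.profile_follow`  -- hypothetical table name
    WHERE block_timestamp > @since
"""

def sync_follows(since: datetime) -> None:
    """Export follow events newer than `since` from BigQuery and upsert into Postgres."""
    bq = bigquery.Client()
    job = bq.query(
        BQ_QUERY,
        job_config=bigquery.QueryJobConfig(
            query_parameters=[
                bigquery.ScalarQueryParameter("since", "TIMESTAMP", since)
            ]
        ),
    )
    rows = [
        (r["profile_id"], r["follower_profile_id"], r["block_timestamp"])
        for r in job.result()
    ]

    # Upsert into the local copy that sits next to the compute layer.
    with psycopg2.connect("dbname=lens user=postgres") as conn, conn.cursor() as cur:
        psycopg2.extras.execute_values(
            cur,
            """
            INSERT INTO profile_follows (profile_id, follower_profile_id, block_timestamp)
            VALUES %s
            ON CONFLICT DO NOTHING
            """,
            rows,
        )

if __name__ == "__main__":
    sync_follows(datetime(2024, 1, 1, tzinfo=timezone.utc))
```

A job like this would be run on a schedule, with `since` advanced to the timestamp of the previous successful run so each pass only pulls the newest events.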

Data Selection

We pull only the subset of data relevant to profile rankings, content recommendations, the nearest relevant EOAs and smart contracts, and so on. The data needs to be post-processed and kept close to the compute layer so that aggregates such as follower counts, posts/casts, comments, and NFT mints can be computed quickly. This data is refreshed periodically to ensure that we don't fall too far behind the latest events.
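To illustrate the kind of post-processing aggregation this enables, the sketch below recomputes per-profile counts (followers, casts, comments) in a single pass over the locally replicated Postgres database. The table and column names are assumptions for illustration, not the exact replicator schema.

```python
# A minimal sketch of the periodic post-processing step, assuming a local
# Postgres replica. Table and column names are illustrative only.
import psycopg2

AGGREGATE_SQL = """
    CREATE TABLE profile_stats AS
    SELECT
        p.profile_id,
        COUNT(DISTINCT f.follower_profile_id) AS followers,
        COUNT(DISTINCT c.cast_id)             AS casts,
        COUNT(DISTINCT r.reply_id)            AS comments
    FROM profiles p
    LEFT JOIN profile_follows f ON f.profile_id = p.profile_id
    LEFT JOIN casts c           ON c.author_profile_id = p.profile_id
    LEFT JOIN replies r         ON r.parent_author_profile_id = p.profile_id
    GROUP BY p.profile_id;
"""

def refresh_profile_stats() -> None:
    """Rebuild the per-profile aggregate table used by the ranking compute."""
    with psycopg2.connect("dbname=farcaster user=postgres") as conn, conn.cursor() as cur:
        cur.execute("DROP TABLE IF EXISTS profile_stats;")
        cur.execute(AGGREGATE_SQL)

if __name__ == "__main__":
    # Scheduled periodically so the aggregates stay close to the latest events.
    refresh_profile_stats()
```

Keeping this aggregation in the same database as the replicated data is what makes the refresh cheap enough to run frequently.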
