A service that exports transfers from multiple blockchains to a Kafka topic.
The Docker image is pinned to Node.js 21.4.0 in `docker/Dockerfile`, and the checked-in `.tool-versions` file pins the same version, so asdf users pick it up automatically.
- Install the `nodejs` plugin once on your machine:

  ```bash
  asdf plugin add nodejs https://github.com/asdf-vm/asdf-nodejs.git
  ```

- Install the version required by this repository:

  ```bash
  asdf install
  asdf current nodejs
  ```

- Install dependencies:

  ```bash
  npm ci
  ```

Useful asdf commands:
```bash
# Show the version selected for this directory
asdf current nodejs

# Temporarily force a specific version in the current shell
asdf shell nodejs 21.4.0

# Rewrite .tool-versions in the current directory
asdf local nodejs 21.4.0
```

node-rdkafka is a native module. If you switch Node.js versions after installing dependencies, rerun `npm ci` (or at least rebuild that module) so the native bindings match the active Node.js version.
You can export from any of the blockchains by starting one of the scripts in the bin directory. See bin/README.md for Docker-based workflows.
For local development:
```bash
npm ci
npm run build
npm start
```

You can make health-check GET requests to the service. The health check verifies that the Kafka connection is not lost, tries to fetch the current block number from the blockchain, and checks the time since data was last pushed to Kafka (or since service start, if no data has been pushed yet):

```bash
curl http://localhost:3000/healthcheck
```

If the health check passes you get response code 200 and the response message `ok`.
If the health check does not pass you get response code 500 and a message describing what failed.
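The staleness part of the health check described above can be sketched as follows. This is a hypothetical illustration: the function name, parameters, and threshold are not taken from this repository's code.

```javascript
// Hypothetical sketch: the service is considered stale if no data has been
// pushed to Kafka within the allowed lag window. Before the first push, the
// service start time is used as the reference instead.
function isStale(lastPushMs, startMs, nowMs, maxLagMs) {
  const reference = lastPushMs !== null ? lastPushMs : startMs;
  return nowMs - reference > maxLagMs;
}

// Last push 10 s ago with a 60 s allowed lag: healthy
console.log(isStale(90_000, 0, 100_000, 60_000)); // false
// No push yet, service started 2 min ago, 60 s allowed lag: stale
console.log(isStale(null, 0, 120_000, 60_000));   // true
```

A real health check would combine this with the Kafka connection probe and the block-number request, returning 500 if any of the three fails.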
Prometheus metrics are exposed on:

```bash
curl http://localhost:3000/metrics
```

You can control the log level during development with the following environment variables:

- `LOG_LEVEL`. Severity of messages that will be produced. Available values are `trace`, `debug`, `info`, `warn`, `error`, `fatal`.
- `RDKAFKA_DEBUG`. Determines which rdkafka debug contexts will be enabled. The value corresponds to the `debug` configuration value of rdkafka; see https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md for possible values. By default no contexts are enabled.
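Resolving `LOG_LEVEL` might look like the sketch below. The fallback to `info` and the function name are assumptions for illustration, not this repository's actual logger setup.

```javascript
// Hypothetical sketch: pick a log level from the environment, falling back
// to "info" when LOG_LEVEL is unset or not one of the known severities.
const LEVELS = ['trace', 'debug', 'info', 'warn', 'error', 'fatal'];

function resolveLogLevel(env) {
  const requested = (env.LOG_LEVEL || '').toLowerCase();
  return LEVELS.includes(requested) ? requested : 'info';
}

console.log(resolveLogLevel({ LOG_LEVEL: 'debug' })); // debug
console.log(resolveLogLevel({}));                     // info
```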
You can run the unit tests in one of the following ways:

- Without Docker. Use Node.js 21.4.0 locally to match the Docker image and `.tool-versions`:

  ```bash
  npm ci
  npm test
  ```

- With Docker:

  ```bash
  ./bin/test.sh
  ```

The integration tests run the exporter against a running node and compare the output to expected values:

```bash
npm run integration_test
```
When writing a data exporter, you need to make sure of the following:

- All logging should go to stdout. Do not create any temp files, as they will most probably disappear in the event of a restart.
- All config should come from ENV variables.
- An exporter should continue from where it was interrupted in case of a restart. You can save the current position using the `savePosition(position)` API.
- Encode the data as JSON.
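A restart-safe export loop built on these rules might look like the following sketch. The `fetchBlock` helper and the `{ blockNumber }` position shape are illustrative assumptions, not code from this repository.

```javascript
// Hypothetical sketch of a restart-safe export loop. `exporter`,
// `fetchBlock`, and the position shape are stand-ins for this
// repository's actual worker code.
async function runLoop(exporter, fetchBlock, latestBlock) {
  // Resume from the last saved position, or start from block 0.
  let position = (await exporter.getLastPosition()) || { blockNumber: 0 };
  while (position.blockNumber <= latestBlock) {
    const events = await fetchBlock(position.blockNumber);
    await exporter.sendData(events);       // events are JSON-encoded downstream
    position = { blockNumber: position.blockNumber + 1 };
    await exporter.savePosition(position); // persist progress for restarts
  }
  return position;
}
```

Because the position is saved after every block, a restarted exporter resumes from `getLastPosition()` instead of re-exporting from the beginning.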
The main Exporter API used by workers and tests currently includes:

- `connect`: establish connection to the dependent services. Returns a Promise.
- `disconnect`: close Zookeeper and Kafka connections. Returns a Promise.
- `initTransactions`: initialize Kafka transactions. Returns a Promise.
- `getLastPosition`: fetch the last saved position. Returns a Promise.
- `getLastBlockTimestamp`: fetch the last saved block timestamp. Returns a Promise.
- `savePosition`: update the last saved exporter position. Returns a Promise.
- `saveLastBlockTimestamp`: update the last saved block timestamp. Returns a Promise.
- `sendData`: push a single event or an array of events. Returns a Promise.
- `storeEvents`: write events inside a Kafka transaction. Returns a Promise.
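Putting the API together, one transactional batch might be sequenced as in this sketch. The call order shown is an assumption about typical usage, not code from this repository.

```javascript
// Hypothetical sketch of one transactional batch using the Exporter API
// listed above; every method returns a Promise, so each step is awaited.
async function exportBatch(exporter, events, position, blockTimestamp) {
  await exporter.connect();              // Kafka (and Zookeeper) connections
  await exporter.initTransactions();     // required before storeEvents
  await exporter.storeEvents(events);    // written inside a Kafka transaction
  await exporter.savePosition(position); // remember where to resume
  await exporter.saveLastBlockTimestamp(blockTimestamp);
  await exporter.disconnect();
}
```

A long-running worker would of course connect once and loop over batches rather than reconnecting per batch; the linear sequence here is only to show the dependency order of the calls.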