This quickstart describes the initial setup required to run an instance of Crossfeed on your local computer.

Initial Setup

  1. Install Node.js 14 and Docker Compose. Make sure the Docker daemon is running.
  2. Copy the root dev.env.example file to a .env file:

    cp dev.env.example .env
  3. Build the crossfeed-worker Docker image:

    cd backend && npm run build-worker
  4. Start the entire environment from the root directory:

    npm start
  5. Generate the initial DB schema and populate it with sample data:

    cd backend
    # Generate schema
    npm run syncdb
    # Populate sample data
    npm run syncdb -- -d populate

    If you ever need to drop and recreate the database, you can run npm run syncdb -- -d dangerouslyforce.

  6. Navigate to http://localhost in a browser.
  7. Hot reloading is enabled for source files, but changes to non-source files require stopping and restarting Docker Compose. Examples of changes that require restarting the environment:

    • Frontend or backend dependency changes
    • Backend changes to serverless.yml or env.yml
    • Environment variables in root .env
  8. Install Prettier in your dev environment to format code on save.
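Prettier can also be installed and run from the command line; editor format-on-save additionally requires your editor's Prettier extension. A minimal sketch (the glob below is a hypothetical example path, not one mandated by the project):

```shell
# Install Prettier as an exact-version dev dependency (run from the repo root)
npm install --save-dev --save-exact prettier

# Check formatting without changing files
# (the glob is an illustrative example; adjust to the files you work on)
npx prettier --check "frontend/src/**/*.tsx"

# Rewrite files in place
npx prettier --write "frontend/src/**/*.tsx"
```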

Running tests

To run tests, first make sure you have already started Crossfeed with npm start (or, at bare minimum, that the database container is running). Then run:

cd backend
npm test

If snapshot tests fail, update snapshots by running npm test -- -u.

To run tests for the subset of worker code that is written in Python, first install its dependencies:

pip install -r worker/requirements.txt
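With the requirements installed, the Python tests still need to be invoked. The exact runner is an assumption here (check backend/package.json for the project's actual test script), but with pytest the invocation would look like:

```shell
# From the backend directory, after installing worker/requirements.txt.
# pytest as the runner is an assumption; the project may wrap this in an npm script.
pytest worker
```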

To view a code coverage report (a minimum code coverage threshold is checked in CI), run npm test -- --collectCoverage. You can then view an HTML coverage report in the coverage/lcov-report directory.
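For example, to generate the report and open it in a browser (the open command is macOS-specific; use xdg-open on Linux):

```shell
cd backend
# Run the test suite and collect coverage
npm test -- --collectCoverage
# Open the generated HTML report (macOS; on Linux use xdg-open instead)
open coverage/lcov-report/index.html
```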

Monitoring Docker containers

To see which Docker containers are running, you can run:

docker ps
CONTAINER ID        IMAGE                                                 COMMAND                  CREATED             STATUS              PORTS                                            NAMES
2a155c5bb9ce        crossfeed_backend                                     "docker-entrypoint.s…"   13 minutes ago      Up 13 minutes       0.0.0.0:3000->3000/tcp                           crossfeed_backend_1
0177dff83a80                                                              "/tini -- /usr/local…"   13 minutes ago      Up 13 minutes       0.0.0.0:9200->9200/tcp, 0.0.0.0:9300->9300/tcp   crossfeed_es_1
c0f3dee36d5e        crossfeed_frontend                                    "docker-entrypoint.s…"   13 minutes ago      Up 13 minutes       0.0.0.0:80->3000/tcp                             crossfeed_frontend_1
f3491e1b547e        matomo:3.14.1                                         "/entrypoint.sh apac…"   13 minutes ago      Up 13 minutes       80/tcp                                           crossfeed_matomo_1
c3ed457a71d2        postgres:latest                                       "docker-entrypoint.s…"   13 minutes ago      Up 13 minutes       0.0.0.0:5432->5432/tcp                           crossfeed_db_1
98c14a4f8886        mariadb:10.5                                          "docker-entrypoint.s…"   13 minutes ago      Up 13 minutes       3306/tcp                                         crossfeed_matomodb_1
9f70dbdbe867        crossfeed_docs                                        "docker-entrypoint.s…"   13 minutes ago      Up 13 minutes       0.0.0.0:4000->4000/tcp                           crossfeed_docs_1
746956c514ed        bitnami/minio:2020.9.26                               "/opt/bitnami/script…"   13 minutes ago      Up 13 minutes       0.0.0.0:9000->9000/tcp                           crossfeed_minio_1

You can then check the logs of a particular container by specifying a container's name with the docker logs command. For example:

docker logs crossfeed_backend_1 --follow
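docker logs has a few other commonly useful flags for trimming the output:

```shell
# Show only the last 100 lines instead of the full log
docker logs --tail 100 crossfeed_backend_1

# Show entries from the last 10 minutes, with timestamps
docker logs --since 10m --timestamps crossfeed_backend_1
```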

Further information

To see more information about the design and development of each component of Crossfeed, see the following links:

  • Frontend for the React frontend.
  • REST API for the REST API.
  • Database for the database models stored in Postgres.
  • Worker for the worker system and adding new scans and data sources.
  • Search for the search infrastructure and setup with Elasticsearch.
  • Analytics for the analytics setup with Matomo.


The documentation files are stored in the docs directory and served from a Gatsby site. To work on them, run npm start from the root directory as described above, then open http://localhost:4000 in your browser to view the docs.

The docs are based on the federalist-uswds-gatsby theme. See that repository for more information on additional theme customizations that can be done.