Overview
GreenFerries is a project aimed at making the environmental impact of passenger ships more transparent. It provides a public website, a Datasette instance for API access and data exploration, and a Python library for fetching and processing raw data from THETIS and Wikidata. The project also includes Jupyter notebooks for exploring the different datasets.
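As a taste of the raw Wikidata side of that data, here is a minimal sketch that queries the public Wikidata SPARQL endpoint for items carrying an IMO ship number (property P458). It deliberately uses plain `requests` rather than the greenferries library, whose internal API is not shown here.

```python
# Minimal sketch: list ships known to Wikidata by their IMO number.
# This illustrates the raw Wikidata data, not the greenferries library API.
import requests

SPARQL_ENDPOINT = "https://query.wikidata.org/sparql"

QUERY = """
SELECT ?ship ?shipLabel ?imo WHERE {
  ?ship wdt:P458 ?imo .          # P458 = IMO ship number
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
LIMIT 20
"""

def fetch_ships_with_imo():
    response = requests.get(
        SPARQL_ENDPOINT,
        params={"query": QUERY, "format": "json"},
        headers={"User-Agent": "greenferries-example/0.1"},
    )
    response.raise_for_status()
    bindings = response.json()["results"]["bindings"]
    return [
        {"label": b["shipLabel"]["value"], "imo": b["imo"]["value"]}
        for b in bindings
    ]

if __name__ == "__main__":
    for ship in fetch_ships_with_imo():
        print(ship["imo"], ship["label"])
```

IMO numbers are the natural key for joining such Wikidata entries with THETIS emission records.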
Features
- Public website: greenferries.org
- Data platform (Datasette API & data explorer): data.greenferries.org
- Download SQLite files: files.greenferries.org
- Python library: greenferries
- Jupyter notebooks for exploring the datasets
Installation
To install the GreenFerries project, follow these steps:
- Install Node.js by running `make install-node`.
- Deploy the master branch of the main monorepository to Netlify to automatically build and deploy the 11ty website from the `www` subdirectory.
- Install the greenferries Python library by running `make install-python`.
- Register on Wikidata and add your credentials to `greenferries/.env` (see the dotenv sketch after this list).
- Run the `full_pipeline` make command from the root of the repository to download, convert, and process all files needed by the frontend website and the Datasette platform (a minimal conversion sketch also follows this list).
- Explore the different datasets in the Jupyter notebooks with the `make notebooks-server` command.
- Access the data platform at data.greenferries.org to explore the processed data, built with the datasette and csvs-to-sqlite tools.
- To run the Datasette platform locally, use the `make datasette-dev` command from the repository root.
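The credentials step can be wired up with python-dotenv, as in the sketch below. The variable names are illustrative assumptions, not necessarily the keys the library expects; check `greenferries/.env` against the repository's documentation.

```python
# Minimal sketch of reading Wikidata credentials from greenferries/.env.
# The variable names below are assumptions for illustration only.
import os

from dotenv import load_dotenv  # pip install python-dotenv

load_dotenv("greenferries/.env")

WIKIDATA_USERNAME = os.environ.get("WIKIDATA_USERNAME")
WIKIDATA_PASSWORD = os.environ.get("WIKIDATA_PASSWORD")

if not WIKIDATA_USERNAME or not WIKIDATA_PASSWORD:
    raise SystemExit("Missing Wikidata credentials in greenferries/.env")

print(f"Loaded credentials for Wikidata user {WIKIDATA_USERNAME}")
```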
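The data platform is built by converting processed CSV files into SQLite databases that Datasette serves; the project lists csvs-to-sqlite for this purpose. The sketch below shows the same kind of conversion using pandas and the standard-library sqlite3 module, with made-up file and table names.

```python
# Minimal sketch of a CSV -> SQLite conversion like the one csvs-to-sqlite
# performs. File and table names are illustrative, not the repository's
# actual artefacts.
import sqlite3

import pandas as pd

def csv_to_sqlite(csv_path: str, db_path: str, table: str) -> None:
    """Load one CSV file into a SQLite table, replacing it if it exists."""
    df = pd.read_csv(csv_path)
    conn = sqlite3.connect(db_path)
    try:
        df.to_sql(table, conn, if_exists="replace", index=False)
    finally:
        conn.close()

if __name__ == "__main__":
    csv_to_sqlite("ships.csv", "greenferries.db", "ships")
```

The resulting database can then be explored locally with the Datasette CLI, e.g. `datasette greenferries.db`.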
Summary
GreenFerries is a project focused on making the environmental impact of passenger ships more transparent. It offers a public website, a Datasette instance for API access and data exploration, a Python library for fetching and processing the raw data, and Jupyter notebooks for exploring the datasets. The processed data is published on the data platform at data.greenferries.org, and installation amounts to setting up the dependencies, running the data pipeline, and deploying the website and data platform.