
Book lovers are in luck: here is a recommended website for free Chinese e-book downloads.


Recommended Free Chinese eBook Download Website

  • A free eBook download platform
  • Includes 130,000 eBooks
  • The author also provides the data and source code so you can build your own instance

Live site: https://24hbook.com
Author's Twitter: https://twitter.com/deepsixnone/status/1739976131267465482
Project address: https://github.com/mrdoing/24hbookcom

Building Tutorial

A simple and super-fast book searcher to create and search your personal library.

The book searcher can index the metadata of over 10 million books in under a minute, and a single search completes in about 30 microseconds.

Usage

We currently offer two options: desktop version and command line version. For individual users, we recommend using the desktop version.

  • Desktop Version
  1. Download the pre-compiled desktop installer from the Release page.

Alternatively, you can compile it yourself. Please refer to the instructions in the Building from Source Code section below.

Windows: Book-Searcher-desktop_version_x64.msi
macOS: Book-Searcher-desktop_version_x64.dmg
Linux:
Deb: Book-Searcher-desktop_version_amd64.deb
AppImage: Book-Searcher-desktop_version_amd64.AppImage

  2. Prepare the index

Please refer to the instructions in the Prepare Index section.

  3. Run book-searcher-desktop

Specify the path of the index folder in the settings menu.

  • Command Line Version
  1. Download the pre-compiled binary from the Release page.

Alternatively, you can compile it yourself. Please refer to the instructions in the Building from Source Code section.

  2. Prepare the index

Please refer to the instructions in the Prepare Index section.

  3. Run book-searcher run

It listens on 127.0.0.1:7070.

Visit http://127.0.0.1:7070/ to use the web user interface, or query the original search API described below.
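
For example, assuming the default address above, a quick query from the command line could look like this (TITLE is just a placeholder for the string you want to match):

curl "http://127.0.0.1:7070/search?limit=30&title=TITLE"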

Deploying with Docker

mkdir book-searcher && cd book-searcher
wget https://raw.githubusercontent.com/mrdoing/book-searcher/master/docker-compose.yml

Prepare the index: put the CSV files in the directory, then run the following command to create the index:

docker-compose run --rm -v "$PWD:$PWD" -w "$PWD" book-searcher /book-searcher index -f *.csv

Start book-searcher:

docker-compose up -d
Now, book-searcher will listen on 0.0.0.0:7070.
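
To check that the service is reachable from the host, you can send a test query with curl (this assumes the compose file maps container port 7070 to the host; the query value is only a placeholder):

curl "http://127.0.0.1:7070/search?limit=5&query=rust"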

Original Search API

  • You can search by the following fields:

Title
Author
Publisher
Extension
Language
ISBN
ID

Example queries:
/search?limit=30&title=TITLE
/search?limit=30&title=TITLE&author=AUTHOR
/search?limit=30&isbn=ISBN
/search?limit=30&query=title extension publisher
Currently, there are two search modes. An explore-mode query looks like this:

/search?limit=30&mode=explore&title=TITLE&author=AUTHOR

Filter: results must satisfy all of the given restrictions (the default mode).
Explore: results only need to satisfy some of the given restrictions.
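
As an illustration of the difference, the same title/author pair can be queried in both modes with curl (TITLE and AUTHOR are placeholders, and the address assumes the default 127.0.0.1:7070 listener):

curl "http://127.0.0.1:7070/search?limit=30&title=TITLE&author=AUTHOR"
curl "http://127.0.0.1:7070/search?limit=30&mode=explore&title=TITLE&author=AUTHOR"

The first call returns only books matching both fields; the second may also return books that match only one of them.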

Building from Source Code

Building the Command Line Version

  1. Set up the frontend

make frontend_preinstall frontend

  2. Build book-searcher

TARGET=release make

Move the compiled binary to the project root directory:
mv target/release/book-searcher .

Building the Desktop Version

  1. Install frontend dependencies

make frontend_preinstall

  2. Build book-searcher-desktop

cargo tauri build

Prepare the Index

  1. Prepare the raw data

Prepare the raw book metadata and save the CSV file in the project root directory.

The raw data is used to generate the index, please refer to the Raw Data section for detailed information.

  2. Create the index

If an index folder already exists, you may need to remove it first with rm -rf index.

book-searcher index -f *.csv
The final folder structure should look like this:

book_searcher_dir
├── index
│   ├── some index files...
│   └── meta.json
└── book-searcher
Raw Data

The raw data is used to generate the index and should be a CSV file containing the following fields:

id, title, author, publisher, extension, filesize, language, year, pages, isbn, ipfs_cid, cover_url, md5
You need to export and maintain the metadata of the books you purchased, as this project only provides fast search functionality.
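
For illustration only, a file with those columns might look like this (one made-up record; none of the values refer to a real book):

id,title,author,publisher,extension,filesize,language,year,pages,isbn,ipfs_cid,cover_url,md5
1,Example Book,Jane Doe,Example Press,epub,1048576,English,2020,321,9780000000000,bafyexamplecid,https://example.com/cover.jpg,0123456789abcdef0123456789abcdef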
