
I am generating very large EPUB3 books (almost all text) and I have noticed severe performance issues when trying to read them. Are there any format changes I can make to improve reading performance?

The files in my EPUB are as follows:

│   mimetype (1KB)
│
├───EPUB
│   │   cover.xhtml (1KB)
│   │   nav.xhtml (36KB)
│   │   package.opf (2KB)
│   │   s04.xhtml (5.3MB)
│   │
│   ├───css
│   │       epub.css (1KB)
│   │       nav.css (1KB)
│   │
│   └───images
│           cover.png (21KB)
│
└───META-INF
        container.xml (1KB)
  • Break the HTML up into chapters. – mmmmmm Jun 12 '21 at 13:54
  • Do you actually have ~5 MB of text in the s04.xhtml file? How much is actual text that you've input, and how much is XML? Is all that needed? – DrMoishe Pippik Jul 30 '21 at 02:00
  • This EPUB was the result of scraping hundreds of pages, and was pretty well reduced to just … and a few … tags. Breaking it up was the only real solution in this case. – gfaster Aug 01 '21 at 07:33

1 Answer


As @mmmmmm suggested, breaking the file up into chapters worked and made loading much faster. I capped each individual file at 256 KB, and that worked well.
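For reference, a minimal sketch of that kind of split in Python might look like the script below. This is not the script actually used: it assumes the body is a flat run of <p> elements (plus the occasional heading), as described in the comments, and the output file names, the 256 KB threshold, the regex-based tag list, and the XHTML header template are all illustrative assumptions. The package.opf manifest and spine, and nav.xhtml, still have to be updated to reference the new files.

#!/usr/bin/env python3
"""Split one oversized XHTML content file into roughly 256 KB chapter files.

Sketch only: assumes the body is a flat sequence of block elements
(mostly <p> tags) and that names like s04_000.xhtml are acceptable.
The package.opf manifest/spine and nav.xhtml must be updated separately.
"""

import re
from pathlib import Path

SOURCE = Path("EPUB/s04.xhtml")   # the 5.3 MB file from the question
TARGET_SIZE = 256 * 1024          # ~256 KB per output file

HEADER_TEMPLATE = """<?xml version="1.0" encoding="utf-8"?>
<html xmlns="http://www.w3.org/1999/xhtml" xmlns:epub="http://www.idpf.org/2007/ops">
<head>
  <title>{title}</title>
  <link rel="stylesheet" type="text/css" href="css/epub.css"/>
</head>
<body>
"""
FOOTER = "</body>\n</html>\n"


def split_body(xhtml: str) -> list:
    """Return the top-level block elements between <body> and </body>."""
    body = re.search(r"<body[^>]*>(.*)</body>", xhtml, re.S).group(1)
    # Split after each closing block tag; adjust the tag list to match your markup.
    return [c for c in re.split(r"(?<=</p>)|(?<=</h1>)|(?<=</h2>)", body) if c.strip()]


def flush(part: list, index: int) -> None:
    """Write one accumulated group of blocks as a complete XHTML file."""
    out = SOURCE.with_name(f"{SOURCE.stem}_{index:03d}.xhtml")
    out.write_text(
        HEADER_TEMPLATE.format(title=f"Part {index + 1}") + "".join(part) + FOOTER,
        encoding="utf-8",
    )


def write_chunks(blocks: list) -> None:
    """Accumulate blocks until the target size is reached, then start a new file."""
    part, size, index = [], 0, 0
    for block in blocks:
        part.append(block)
        size += len(block.encode("utf-8"))
        if size >= TARGET_SIZE:
            flush(part, index)
            part, size, index = [], 0, index + 1
    if part:
        flush(part, index)


if __name__ == "__main__":
    write_chunks(split_body(SOURCE.read_text(encoding="utf-8")))

Each resulting file then gets its own manifest item and spine entry in package.opf, which is what lets the reading system load one chapter at a time instead of the whole 5.3 MB document.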
