I won't post an actual example, but it's things like "summarise this 5000-word research paper". In that case, feeding GPT-4 the paper in chunks gets you summaries and critiques of the individual chunks, not of the whole thing, even if you explicitly tell it that the chunks are parts of the same text. It's a bit annoying because Plus was advertised as having a significantly longer token limit.
ChatGPT is not designed for document summarisation. You need a system that automatically chunks the document and creates an embedding for each chunk. I don't know of any public API that does this out of the box, but there are several open-source projects, plus articles on Medium etc., on setting this up.
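The chunk-and-embed step mentioned above can be sketched in plain Python. This is a minimal illustration, not any particular project's implementation: the `embed()` function here is a toy hashed bag-of-words vector standing in for a real embedding model, and the chunk sizes are arbitrary.

```python
def chunk_text(text, chunk_size=200, overlap=20):
    """Split text into overlapping word chunks so context isn't cut mid-passage."""
    words = text.split()
    chunks = []
    step = chunk_size - overlap
    for start in range(0, len(words), step):
        chunk = words[start:start + chunk_size]
        if chunk:
            chunks.append(" ".join(chunk))
        if start + chunk_size >= len(words):
            break
    return chunks

def embed(chunk, dim=64):
    """Toy embedding: normalised hashed bag-of-words.
    A real pipeline would call an embedding model here instead."""
    vec = [0.0] * dim
    for word in chunk.lower().split():
        vec[hash(word) % dim] += 1.0
    norm = sum(v * v for v in vec) ** 0.5 or 1.0
    return [v / norm for v in vec]

# Stand-in for a long paper; each chunk gets its own vector,
# which can then be indexed and retrieved/summarised per chunk.
document = "word " * 500
index = [(c, embed(c)) for c in chunk_text(document)]
```

The overlap between chunks is the usual trick to avoid losing sentences that straddle a chunk boundary; the per-chunk vectors are what lets a retrieval layer pull only the relevant chunks into the model's limited context.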