If you search for something within view, it will work.
I bet if you scroll down, you’ll find more than 8 or 2, respectively.
Try searching for “changed to” or “turned on”. There should be around 100 entries. I only get the top page’s worth, and once I scroll down it’s clear how many it has missed…
Chrome on Windows, MS Edge (Chromium) on Mac, and if I’m not mistaken also Chrome on iOS.
If I open the logbook and want to quickly find anything pertaining to Justin (e.g. Justin’s Phone, Justin’s rodents, Justin’s piano, etc.) and I hit command-F and type Justin, nothing shows up until I scroll, scroll, scroll. There are no other options.
Let me ask you this:
If you do a search on Google (or any other paginated site, for that matter), do you get a hit with Ctrl+F on page 2+ while looking at page 1?
That is what’s happening here: the things you do not see are pages 2, 3, 4, and so on.
Well, no, because it’s on a completely different page. If Google is showing me 100 search results on the first page, I can ^F to my heart’s delight. Whether I’m currently looking at result 1 or result 100, I can search them all. When I switch to Google results page 2, then I can ^F and search all the results from 101–200, not just the 5 results in the section of the page I’m currently staring at.
My question to you: when you do a search on Google and hit Ctrl+F, do you expect it to look for text only within the scope of the 5–6 results you’re currently scrolled to, or do you expect it to search the entire page?
Take Facebook, Twitter, heck even this forum.
You only get hits for loaded items. That is the same behavior as multiple pages, but it gives a better flow.
It’s still paginated
I’m not saying the request is bad, just explaining why.
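That “only loaded items” behavior can be sketched in a few lines. This is a hypothetical model of an infinite-scroll feed, not any real site’s code; the names and numbers are made up. The point is that the browser’s find can only match entries the client has fetched so far.

```python
# Hypothetical model of an infinite-scroll page -- not Home Assistant's or
# any real site's code. The point: Ctrl+F can only match what's loaded.

PAGE_SIZE = 25
# Pretend the server holds 200 logbook entries; entry 150 mentions a piano.
SERVER_ENTRIES = [
    "entry 150: Justin's piano" if i == 150 else f"entry {i}" for i in range(200)
]

loaded = []  # what the browser has fetched and rendered so far

def scroll_down():
    """Fetch the next page, as an infinite-scroll handler would."""
    start = len(loaded)
    loaded.extend(SERVER_ENTRIES[start:start + PAGE_SIZE])

def ctrl_f(needle):
    """The browser's find only sees rendered text, i.e. `loaded`."""
    return [e for e in loaded if needle in e]

scroll_down()                    # initial page load: entries 0-24
print(ctrl_f("piano"))           # [] -- entry 150 hasn't been fetched yet
while len(loaded) < len(SERVER_ENTRIES):
    scroll_down()                # keep scrolling to the bottom
print(ctrl_f("piano"))           # ["entry 150: Justin's piano"]
```

Same data, same page; whether the search hits depends entirely on how far you have scrolled.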
I fully understand what you were getting at, but it felt like you were waving away my UI frustrations with a technical explanation, so I wanted you to feel my frustrations too!
In my opinion your question is the wrong one, since we are talking about two completely different things.
Google’s pages are more or less just HTML. They’re very basic, and everything is downloaded to your computer when the page loads.
But let’s take a more comparable page: YouTube.
You can’t view the part of the video that hasn’t buffered yet.
That is more comparable.
However I understand your question and frustration.
I don’t like those kinds of pages either, but they save a lot of bandwidth.
Most likely what you are looking for is at the top and not 10 pages down, so the data transmission can be a tenth of what it would otherwise be.
I would rather have a “download raw data” button that gives you everything in the format you want.
That way you keep the low bandwidth for everyday usage, but can get it all if you want.
Or “paginate after x entries…”; then people with bandwidth concerns can save their 2 MB of text (a total guess, but compressed I bet that’s about right) and I can do my searches till the cows come home.
Edit #2: But I will say this. I understand pagination and lazy loading of pages. This doesn’t seem like lazy loading, because I can load the page (NOT scroll down), turn off Wi-Fi, and then scroll all the way down without any problem or missing text.
Edit #3: Or open your browser’s developer tools, go to the Network tab, and hit refresh. Watch the waterfall: it doesn’t change after you start scrolling. I don’t think this is about conserving network resources, but perhaps about dynamically loading things into RAM (this I don’t know how to test).
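For what it’s worth, the symptoms in these edits (all data arrives up front, no network traffic while scrolling, yet find misses off-screen text) match list virtualization: every row is in memory, but only the ones near the viewport are placed in the DOM, and the browser’s find only searches the DOM. A rough sketch of the windowing math, with made-up numbers:

```python
def visible_range(scroll_top, viewport_height, row_height, total_rows, overscan=5):
    """Rows a virtualized list would actually render into the DOM.

    Rows outside [first, last) exist only as data in memory,
    so the browser's find has no text there to match.
    """
    first = max(0, scroll_top // row_height - overscan)
    last = min(total_rows, (scroll_top + viewport_height) // row_height + overscan)
    return first, last

# 1000 entries in memory, but at the top of the page only ~30 rows
# (one viewport plus a little overscan) are in the DOM:
print(visible_range(scroll_top=0, viewport_height=800, row_height=32, total_rows=1000))
# -> (0, 30): rows 30-999 are invisible to Ctrl+F until you scroll
```

This would also explain the Wi-Fi test: scrolling still works offline because the data is already local; only the DOM is filled in lazily.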
And if you have 3 months of logging? Everyone thinks history is slow with a day’s worth of data; your Pi would die trying to serve up all that. It would be better to just apply a filter instead of leaning on find.
But as I made perfectly clear in my most recent post (see the edits), everything is already served up! The server saves nothing by doing what it’s doing.
@petro, load your log, turn off Wi-Fi, and scroll down. What happens? What does this tell you about what is happening with the server? Serious question. I’m not a programmer. Thanks
OK, you are implying that somehow the stress on the server is related to my ability or inability to search beyond the first 100 entries (I say 100, but in reality it seems to be however many entries fit in your window; mine is about 25). I don’t understand how that’s the case. My server sends the entirety of the log to my browser; it does not break it into chunks as I scroll down. So where does the comment “Your Pi would die trying to serve up all that” come into play in this conversation?
I don’t know about you, but mine only sends what’s between the specified dates. It’s an SQL lookup, and more data means more time. It’s not going to search outside the date range, and the current filters only work off entity_id. A solution many would get behind is a filter on friendly names, applied prior to the lookup, to speed things up.
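A date-bounded lookup with a friendly-name filter pushed into the query might look roughly like this. The table and column names here are invented for illustration; this is not the actual recorder schema:

```python
import sqlite3

# Hypothetical schema -- NOT the real recorder database layout -- just to
# illustrate a date-bounded lookup with a friendly-name filter applied in
# SQL, before any data is sent to the browser.
db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE logbook (entity_id TEXT, friendly_name TEXT, ts TEXT, message TEXT)"
)
db.executemany("INSERT INTO logbook VALUES (?, ?, ?, ?)", [
    ("light.kitchen", "Kitchen Light", "2020-05-01", "turned on"),
    ("device_tracker.justin", "Justin's Phone", "2020-05-01", "left home"),
    ("device_tracker.justin", "Justin's Phone", "2020-06-01", "arrived home"),
])

rows = db.execute(
    """SELECT ts, friendly_name, message FROM logbook
       WHERE ts BETWEEN ? AND ?            -- only the requested day(s)
         AND friendly_name LIKE ?          -- filter before anything is returned
       ORDER BY ts""",
    ("2020-05-01", "2020-05-31", "%Justin%"),
).fetchall()
print(rows)   # only the May entry for Justin's Phone
```

Filtering in the query means the June entry and the kitchen light never leave the database, which is where the bandwidth and lookup-time savings would come from.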
Please re-read the thread. We’re arguing different points.
I want to be able to hit command-F in my browser, type in a string, and if it matches the last entry on the page, have it show up in my in-browser search. I’m not suggesting the logbook load 30 days’ worth of data. I’m fine with 1 day, but I feel like I should be able to use my browser to search to the bottom of that page. Does that make sense?
Oh, I understand, but that’s not how these web pages work, and we don’t have control over what Chrome (insert other browser here) finds. I guess that’s worth mentioning.