Tested how long it takes to resolve the AJAX call on different pages.
Tesla Google Search (344 kB):
- Run 1: 4043 ms, 1912 ms
- Run 2: 1881 ms, 1639 ms
- Run 3: 1696 ms, 1574 ms
Asda Google Search (371 kB):
- Run 1: 1679 ms, 2971 ms
- Run 2: 1145 ms, 1205 ms
- Run 3: 1588 ms, 1779 ms
Tesla Motors Wikipedia Page (<body> string size 884 kB):
- Run 1: 5472 ms, 6455 ms
- Run 2: 6059 ms, 7458 ms
- Run 3: 6959 ms, 6059 ms
Asda Wikipedia Page (<body> string size 264 kB):
- Run 1: 6872 ms, 1870 ms
- Run 2: 1598 ms, 2075 ms
- Run 3: 2383 ms, 2478 ms
KNE Article (84 kB):
- Run 1: 1677 ms, 659 ms
- Run 2: 2260 ms, 379 ms
- Run 3: 744 ms, 371 ms
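The per-run millisecond figures above could be captured with a small timing wrapper like the following (a minimal sketch; the label and the operation being timed are illustrative, not the project's actual code):

```javascript
// Time any async operation (e.g. the AJAX round trip) and report
// elapsed wall-clock time in milliseconds.
async function timed(label, fn) {
  const start = Date.now();
  const result = await fn();
  const elapsed = Date.now() - start;
  console.log(`${label}: ${elapsed} ms`);
  return { result, elapsed };
}
```

Wrapping the fetch of each test page in `timed(...)` would produce the numbers logged above.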
Have configured the server to parse the script and get the keywords. There seems to be hardly any performance improvement when the parsing is done on the server. I thought sending back a smaller packet of information would save time, but apparently there is little difference. Looking at the server console, what takes the longest, for example for an 850 kB string upload, is the initial upload (about 2/3 of the time). The parsing of the string to strip the HTML and return text only takes the rest (about 1/3 of the time). Getting the keywords from that string is very fast.
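The server-side pipeline described above (strip the HTML from the uploaded `<body>` string, then pull out keywords) could look roughly like this. This is an assumed shape, not the actual implementation; the regex-based stripping and the 3-letter minimum are illustrative choices:

```javascript
// Reduce an uploaded HTML string to plain text.
function stripHtml(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, " ") // drop inline scripts
    .replace(/<style[\s\S]*?<\/style>/gi, " ")   // drop inline styles
    .replace(/<[^>]+>/g, " ")                    // drop remaining tags
    .replace(/\s+/g, " ")
    .trim();
}

// Count word frequencies in the plain text as candidate keywords.
function keywordCounts(text) {
  const counts = {};
  for (const word of text.toLowerCase().match(/[a-z]{3,}/g) || []) {
    counts[word] = (counts[word] || 0) + 1;
  }
  return counts;
}
```

Consistent with the observation above, the tag stripping is a few linear passes over the string and the counting is a single pass, so both are cheap next to the network upload.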
That means the download of the string from the server must be very fast, much faster than the upload.
So what about the technique of sending the data in smaller strings? Can the server handle multiple requests, or will it just take ten times as long if I make ten requests?
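The smaller-strings idea would mean splitting the page text into fixed-size chunks and uploading them as separate requests. A sketch of the splitting step (the chunk size and endpoint in the comment are assumptions, not measured values):

```javascript
// Split a long string into fixed-size chunks for separate uploads.
function chunkString(str, size) {
  const chunks = [];
  for (let i = 0; i < str.length; i += size) {
    chunks.push(str.slice(i, i + size));
  }
  return chunks;
}

// The chunks could then be posted in parallel, e.g.:
// await Promise.all(chunkString(bodyText, 100 * 1024)
//   .map((c) => fetch("/parse", { method: "POST", body: c })));
```

Whether this actually beats one large upload depends on whether the server and connection handle the requests concurrently, which is exactly the open question above.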
Ok so I think the speed is actually fine as is. Big Wikipedia articles take a bit longer, but news and journal articles get a 1-2 second response time, which I think is fine.
8pm: Updated some of the CSS. Looks pretty good now. It would be good to update the styling of the checkbox buttons as well; that doesn't seem to be a very straightforward task, but I will work on it. Also need to update the script so that the content data resets when the URL changes. Some pages are Single Page Applications: they change the URL on navigation but do not reload, so the extension keeps the old page's keywords because it never gets a refresh signal. To overcome this, I need to tell it to refresh when the URL changes.
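The SPA fix above boils down to remembering the last seen URL and resetting the keyword state when it changes. A minimal sketch, where the reset callback and the polling driver are illustrative assumptions rather than the extension's real code:

```javascript
// Track the last seen URL; fire onChange (e.g. a keyword reset)
// whenever a new URL is observed.
function makeUrlWatcher(onChange) {
  let lastUrl = null;
  return function check(currentUrl) {
    if (currentUrl !== lastUrl) {
      lastUrl = currentUrl;
      onChange(currentUrl);
      return true; // URL changed: state was reset
    }
    return false;  // same page: keep existing keywords
  };
}

// In a content script this could be driven by polling, for example:
// const check = makeUrlWatcher(() => resetKeywords());
// setInterval(() => check(location.href), 500);
```

Polling is the simplest driver; hooking the history API or a tab-update event would also work, depending on where the extension runs the check.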
Need to remove keywords whose frequency is only one; those should not appear because they are not meaningful. Then maybe make some more tweaks to the parsing algorithm, and I should be nearly done.
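Dropping the frequency-one keywords is a one-line filter, assuming the keyword data is a word-to-frequency map (the exact data shape is an assumption):

```javascript
// Keep only keywords that occur more than once.
function dropSingletons(counts) {
  return Object.fromEntries(
    Object.entries(counts).filter(([, freq]) => freq > 1)
  );
}
```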