Official Everybody Edits Forums



#1 2017-04-29 18:54:35

Anatoly
Guest

A way faster?

Everyone remembers this large image.

It took me one hour: I went through every page, selecting all the posts (1204x1301 pixels)...

I want to know: is there a faster way?

The image is around 15000 pixels by exactly 1024 pixels, and... that is a bit off-topic...

Help me!

#2 2017-04-29 19:07:39, last edited by AlphaJon (2017-04-29 19:16:32)

AlphaJon
Member
From: Who knows
Joined: 2015-07-21
Posts: 1,297

Re: A way faster?

So, you want numbers on each line, + line colors for non-members?
Brb

EDIT: Do you also need the whole data at once?

Offline

#3 2017-04-29 19:51:04

Gosha
Member
From: Russia
Joined: 2015-03-15
Posts: 6,206

Offline

#4 2017-04-29 22:06:55, last edited by AlphaJon (2017-04-29 22:11:51)

AlphaJon
Member
From: Who knows
Joined: 2015-07-21
Posts: 1,297

Re: A way faster?

Hi again. Had stuff to do, but here you go:

Piece of code

Execute this on the page with the user list, either by pasting it into the console or by using a browser add-on such as Greasemonkey. Uncomment the first line if you go the browser-extension route, because the script uses jQuery.

EDIT: It only works page by page, so I wish you good luck if you intend to run it on all the results. Unless you somehow convince Diff to give you the whole list nicely on one page.
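A sketch of a script along these lines: the `#users1` table id, the `tc2` group-column class, and the `isNonMember` helper are all assumptions about FluxBB's default userlist markup, not the actual script.

```javascript
// Hedged sketch: number each row of a FluxBB user list and highlight
// rows whose group is anything other than "Member".
// The selectors below are assumptions about the forum's markup.

// Pure helper: does this group label denote a non-member?
function isNonMember(groupLabel) {
  return groupLabel.trim().toLowerCase() !== "member";
}

// Browser glue; run on the userlist page (uncomment a jQuery loader
// first if your userscript manager does not inject jQuery for you):
// $("#users1 tbody tr").each(function (i) {
//   $(this).prepend("<td>" + (i + 1) + "</td>");   // row number
//   var group = $(this).find("td.tc2").text();     // assumed group column
//   if (isNonMember(group)) {
//     $(this).css("background-color", "#fdd");     // highlight non-members
//   }
// });
```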

Offline

#5 2017-04-29 23:11:14

LukeM
Member
From: England
Joined: 2016-06-03
Posts: 3,009
Website

Re: A way faster?

Well... You could probably write a JavaScript snippet that loads the HTML of all the pages fairly easily. Then you could just put the parts that represent the list one after another, load the result as a web page (using CSS and maybe some more JavaScript for the highlighting), and then use the thing Gosha suggested:

I might do it for you some time if I have nothing better to do; otherwise, someone else might be able to do it (or you, if you know some JavaScript).

I don't think the post or topic count things would affect the number of users, so I'm guessing that would be the fastest way.
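A sketch of the page-loading half of this idea; the `userlist.php?p=N` URL pattern and the page count are assumptions about how FluxBB paginates the list.

```javascript
// Hedged sketch: build the URL for each page of the user list, then
// fetch them all and keep the raw HTML for later processing.

// Pure helper: assumed FluxBB pagination URL for page N.
function userListUrl(page) {
  return "https://forums.everybodyedits.com/userlist.php?p=" + page;
}

// Browser glue (run in the console on the forum):
// async function loadAllPages(pageCount) {
//   const urls = Array.from({ length: pageCount }, (_, i) => userListUrl(i + 1));
//   const pages = await Promise.all(urls.map(u => fetch(u).then(r => r.text())));
//   return pages; // one HTML string per page, ready to be stitched together
// }
```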

Offline

#6 2017-04-30 01:19:15, last edited by AlphaJon (2017-04-30 01:52:08)

AlphaJon
Member
From: Who knows
Joined: 2015-07-21
Posts: 1,297

Re: A way faster?

destroyer123 wrote:

You could probably have a javascript thing that loads the HTML of all the pages fairly easily

I agree with that. Straightforward for the most part.

destroyer123 wrote:

then you could just put the part which represents the list one after another

That requires parsing the HTML within JavaScript (because the API is disabled, IIRC), which is the part I don't like, and then putting the relevant bits into the current page. Not to mention that loading 48 pages is going to take some time.
Oh well, I have some spare time right now, so I'm going to do it anyway. Never mind, I'm tired and going to bed; maybe tomorrow.
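If the 48 requests turn out to be the slow part, one mitigation is to fetch them in small batches rather than all at once. A sketch, with the batch size picked arbitrarily and `pageUrl` a hypothetical helper:

```javascript
// Hedged sketch: split work into fixed-size batches so only a few
// requests are in flight at a time.

// Pure helper: chunk an array into slices of at most `size` items.
function chunk(items, size) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Browser glue (assumes a pageUrl(n) helper exists):
// const pageNumbers = Array.from({ length: 48 }, (_, i) => i + 1);
// for (const batch of chunk(pageNumbers, 6)) {
//   await Promise.all(batch.map(n => fetch(pageUrl(n)).then(r => r.text())));
// }
```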

Offline

#7 2017-04-30 06:47:02, last edited by LukeM (2017-04-30 06:50:32)

LukeM
Member
From: England
Joined: 2016-06-03
Posts: 3,009
Website

Re: A way faster?

AlphaJon wrote:
destroyer123 wrote:

then you could just put the part which represents the list one after another

That requires parsing the HTML within JavaScript (because the API is disabled, IIRC), which is the part I don't like, and then putting the relevant bits into the current page. Not to mention that loading 48 pages is going to take some time.
Oh well, I have some spare time right now, so I'm going to do it anyway. Never mind, I'm tired and going to bed; maybe tomorrow.

If there isn't any way to do it directly through JS, you could add it to the page that is currently loaded and remove it afterwards. Then the HTML would be parsed by the browser and could just be removed after it's used.

Edit: I just Googled it, and you don't need to add it to the page: just create an HTML DOM element, then set its innerHTML to the page you want parsed.

Offline


Board footer

Powered by FluxBB
