RoboWiki/Site Issues

Dudes. I don't know if you have noticed it, but in recent weeks the wiki has been under serious attack from wiki spammers. They try to replace page content with lots of links to other sites, to boost the Google presence of those sites I guess. Anyway, I think it's something like 400 attempts in the last two weeks. That's a huge increase, even considering the attacks earlier this year.

Those previous attacks made me patch the wiki script so that it doesn't allow posting pages with more than just a few external links on them. It works; we would have 400 spammed pages without this "filter". Just letting you know, since it could explain why some of your legitimate edits don't stick. Usually I add the attacker's IP address to the list of banned addresses. Often I need to block entire subnets. But these latest attacks are so many and from so many different subnets that I think I happened to block people from our community. So I have removed the IP blocks entirely now, which means my link filter is the only protection at the moment. Let's hope it's enough.

-- PEZ

Clever idea with the limit on the number of external links! -- Pulsar

Won't you get trouble with the RoboRumble/Participants page if more and more robocoders stop using the RobocodeRepository and link to their own pages instead? --GrubbmGait

No, because the filter considers only the difference in the number of external links. That means if you add a new external link (or change one) it's OK. Or it should be; you could verify it if you like. The real downside is that, in order not to give the spam bots a clue about what's going on, the filter is "silent". To a spam bot without the necessary checks it looks like the edit has succeeded. But that makes it a bit less user friendly for legit human users. -- PEZ
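
A minimal sketch of that delta check, for illustration only. This is not the actual wiki script (and the script isn't written in Java); the threshold, class name, and URL pattern are all assumptions:

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class LinkDeltaFilter {
        // Hypothetical limit on how many new external links one edit may add.
        private static final int MAX_NEW_LINKS = 3;
        private static final Pattern URL = Pattern.compile("https?://\\S+");

        static int countLinks(String text) {
            Matcher m = URL.matcher(text);
            int count = 0;
            while (m.find()) count++;
            return count;
        }

        // An edit passes if it adds at most MAX_NEW_LINKS external links.
        // Links already on the page don't count against the editor, which is
        // why a link-heavy page like RoboRumble/Participants stays editable.
        static boolean acceptEdit(String oldText, String newText) {
            return countLinks(newText) - countLinks(oldText) <= MAX_NEW_LINKS;
        }
    }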

BTW, the idea for that filter isn't mine. The implementation is, though. -- PEZ

Can't you measure the time between clicking "Edit text of this page" and pressing the submit button? If it's under, say, 2 seconds, then it is either a spambot or a very fast typer... -- Dummy

Possibly. But actually I don't know how to do it. And I would need to include exceptions for my own bots and any other bot posting legit content. -- PEZ
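
A rough sketch of how Dummy's timing idea could be implemented, assuming the edit form records the server time it was served (for example in a hidden field) and that trusted bots can be exempted, as PEZ needs. The class, the whitelist entry, and the 2-second threshold are illustrative, not from the actual script:

    import java.util.Set;

    public class SubmitTimingCheck {
        private static final long MIN_EDIT_MILLIS = 2000; // "under, say, 2 seconds"
        // Hypothetical whitelist so PEZ's bots and other legit bots pass.
        private static final Set<String> TRUSTED_AGENTS = Set.of("pez-bot");

        // formServedAt and submittedAt are epoch milliseconds recorded server-side.
        static boolean looksLikeSpamBot(long formServedAt, long submittedAt,
                                        String userAgent) {
            if (TRUSTED_AGENTS.contains(userAgent)) return false;
            return submittedAt - formServedAt < MIN_EDIT_MILLIS;
        }
    }

The served-at timestamp would also need to be tamper-proof (signed, say), or a spam bot could simply submit a fake old one.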

Also, what if you're only changing a spelling error that happens to be at the top of the page? --UnderDark

That shouldn't be a problem. -- ~~~~

Hi, I'm unable to edit the Ascendant page. It obviously works fine with other pages though. In case the filter described above is still in use: I was not adding external links. Is it just me, or is there something wrong with that page? --mue

I just tried to edit the Ascendant page and it didn't work either. -- Florent

OK. I guess I tightened it up a bit too much. We (voidious and I) are looking into a smarter solution. Meanwhile I have loosened the restrictions again so that Ascendant and other pages containing URLs will be editable again. It means our RevertingVandalism efforts will be needed again. -- PEZ

I have no idea how the wiki script works, but can we force people to sign in before editing any page? -- Florent

That part of the script is quite strange. That is actually what I thought I had done with the last tightening-up of things. But obviously I hadn't understood it right. -- PEZ

Thank you, it's working again. --mue

