I've looked for a while, but my google-fu is failing me.
I want to have the BBS web pages present, but not allow anyone
to browse the message areas unless logged in. Perhaps allow one or two
areas, like a local/main, if possible. I want to keep the network
areas from being web crawling/indexing targets.
You can stop the web crawlers with your robots.txt.
I'm not sure, but I think the default robots.txt that comes with
Synchronet will do this. My own robots.txt looks like this:
User-agent: *
Disallow: /bbbs
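If the goal is to leave one or two areas crawlable and hide the rest, something along these lines might do it. This is only a sketch: the /msgs/main path is a made-up example, so substitute whatever URL paths your web pages actually use, and note that Allow: is an extension honored by the major crawlers rather than part of the original robots.txt standard:

User-agent: *
Allow: /msgs/main
Disallow: /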
Re: webv4 questions
By: Hemo to All on Wed Apr 01 2020 03:45 pm
I want to have the BBS web pages present, but not allow anyone
to browse the message areas unless logged in. Perhaps allow one or two
areas, like a local/main, if possible. I want to keep the network
areas from being web crawling/indexing targets.
The security levels of the message groups determine what can be seen on
the web. The guest user's security level controls what unauthenticated
visitors can see.
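If you want to check what the guest account actually ends up with, a short script run as that user can enumerate it. A sketch using Synchronet's JS message-area object model; worth double-checking the property names against the JS object documentation:

// List every sub-board and whether the current user (e.g. Guest)
// clears its security requirements for reading.
for (var g in msg_area.grp_list) {
    var grp = msg_area.grp_list[g];
    for (var s in grp.sub_list) {
        var sub = grp.sub_list[s];
        writeln(grp.name + '/' + sub.code + ': '
            + (sub.can_read ? 'readable' : 'hidden'));
    }
}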
Re: webv4 questions
By: Hemo to Al on Wed Apr 01 2020 18:39:33
Hemo> every minute or so, something comes in and goes directly to a specific
Hemo> file and tries to download it. Most of these seem to come from
Hemo> cn-northwest-1.compute.amazonaws.com.cn
look in your /sbbs/data/logs directory for the http logs (if you have
them enabled) and you will see a traditional apache-style log format...
the last field contains the user agent which will generally tell you if
the visitor really is a spider or not... what you're seeing from that
amazon cloud domain may be a spider or it may be someone's file getter
or possibly even an indexer (which is like a spider or crawler)...
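To skim just the user agents out of one of those logs, the last quoted field of each apache-style line is the one to grab. A rough sketch in Synchronet JS; the log file name here is invented, so use whatever actually appears in your /sbbs/data/logs directory:

// Print the user-agent (the final quoted field) of each request
// in an apache-style http log.
var f = new File('/sbbs/data/logs/http.log');  // assumed file name
if (f.open('r')) {
    var line;
    while ((line = f.readln()) !== null) {
        var m = line.match(/"([^"]*)"\s*$/);
        if (m)
            writeln(m[1]);
    }
    f.close();
}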
Rampage wrote to Hemo <=-
look in your /sbbs/data/logs directory for the http logs (if you have
them enabled) and you will see a traditional apache-style log format...
the last field contains the user agent which will generally tell you if
the visitor really is a spider or not... what you're seeing from that
amazon cloud domain may be a spider or it may be someone's file getter
or possibly even an indexer (which is like a spider or crawler)...
Re: webv4 questions
By: Al to Hemo on Wed Apr 01 2020 02:28 pm
I've looked for a while, but my google-fu is failing me.
I want to have the BBS web pages present, but not allow anyone
to browse the message areas unless logged in. Perhaps allow one or two
areas, like a local/main, if possible. I want to keep the network
areas from being web crawling/indexing targets.
You can stop the web crawlers with your robots.txt.
I'm not sure, but I think the default robots.txt that comes with
Synchronet will do this. My own robots.txt looks like this:
User-agent: *
Disallow: /bbbs
I've got this:
User-agent: *
Disallow: /
It's not stopping things that are not identifying as a crawler, I think.
A legitimate crawler starts by looking for the robots.txt file; I see
some of those requests too.
Here are snips of what I see in the log:
Every minute or so, something comes in and goes directly to a specific
file and tries to download it. Most of these seem to come from
cn-northwest-1.compute.amazonaws.com.cn
--
H
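Since robots.txt is only advisory, the impolite visitors can also be turned away inside the page scripts themselves. A sketch, assuming the web server's SSJS exposes the request headers through an http_request.header object; confirm that property name against your Synchronet JS docs before relying on it:

// Refuse anything whose user agent looks like a crawler,
// whether or not it bothered to read robots.txt first.
var ua = (http_request.header && http_request.header['user-agent']) || '';
if (/bot|crawl|spider|index/i.test(ua)) {
    exit();
}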
I wonder if adding if(user.alias === 'Guest') { writeln('You must be
logged in to view files!'); exit(); } to /sbbs/webv4/root/api/files.ssjs
would help? Or something like that.
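Fleshed out slightly, that guard might sit at the top of the script like this. A sketch: the alias match comes straight from the idea above, and the UFLAG_G restriction test is an alternative that doesn't depend on what the guest account happens to be named, assuming sbbsdefs.js defines that constant as expected:

load('sbbsdefs.js');  // defines the UFLAG_G guest-restriction bit

// Don't serve the file list to unauthenticated (guest) visitors.
if (user.alias === 'Guest' || (user.security.restrictions & UFLAG_G)) {
    writeln('You must be logged in to view files!');
    exit();
}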