[Request] Add robots.txt parsing #177
Comments
Yeah, that's something to consider. I would opt for https://github.com/spatie/robots-txt instead, as it's better maintained. What exactly do you want to achieve with the information?
Personally, I am looking for sitemaps declared in robots.txt, but I think there's also value in checking for crawling rules.
Fair enough, that's definitely another use-case. I'll see how we can get both working.
It would be nice to have the ability to parse robots.txt, similar to how RSS feeds are handled:
$web->robots
https://github.com/bopoda/robots-txt-parser is one such library. I'm not sure if it's the right one to use here, but it seems to do the job.
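To illustrate the two use-cases discussed in this thread (checking crawl rules and discovering sitemaps declared in robots.txt), here is a minimal sketch using Python's standard-library `urllib.robotparser`. This is only an illustration of the information such a feature would expose, not the phpscraper API; the robots.txt content, bot name, and URLs below are made up.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt file for illustration.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Use-case 1: crawl rules -- may a given user agent fetch a given URL?
print(parser.can_fetch("MyBot", "https://example.com/admin/login"))  # False
print(parser.can_fetch("MyBot", "https://example.com/blog/post"))    # True

# Use-case 2: sitemaps declared in robots.txt (Python 3.8+).
print(parser.site_maps())  # ['https://example.com/sitemap.xml']
```

A PHP equivalent would come from one of the libraries mentioned above (spatie/robots-txt or bopoda/robots-txt-parser), which expose comparable allow/disallow checks.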