Robots.txt allow all

In the robots.txt file on my domain I see this

User-agent: *
Disallow: /
Noindex: /

How can I remove these lines to allow all user-agents?

How are you getting that for your robots.txt?

AFAIK this forum uses the default and it doesn’t look like what you posted.


See: allow index in robots txt in your site settings, and make sure it is ticked, as it is out of the box.


Yes, I have allow index in robots txt checked.
Ok, I've got it. It happens when I add something to whitelisted crawler user agents.
So, to keep it open for all crawlers I should leave whitelisted crawler user agents empty, correct?
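For reference, a robots.txt that allows all crawlers to index everything typically looks like this (an empty Disallow value means nothing is blocked):

```
User-agent: *
Disallow:
```

This is equivalent to having no robots.txt at all as far as standard crawlers are concerned.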