Google has indexed a couple thousand of our pages under both http and https, and of course it's flagging them as duplicate content. So, two questions.
1) How do we redirect all the indexed https pages to http? Is there a way to use a pattern-matching rule that redirects every https page except specific ones like the myaccount and checkout pages, or will I have to set up a redirect for each URL? (There's a rough sketch of what I mean below.)
2) How do we prevent this going forward? Should we block indexing of all https pages in robots.txt? (Also sketched below.)
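
For question 1, this is roughly what I'm picturing, assuming an Apache setup with mod_rewrite in .htaccess (the /myaccount and /checkout paths are placeholders for our real ones):

    RewriteEngine On
    # Request came in over https...
    RewriteCond %{HTTPS} on
    # ...and is not under the account or checkout areas (placeholder paths)...
    RewriteCond %{REQUEST_URI} !^/(myaccount|checkout)(/|$)
    # ...so 301-redirect it to the same URL on plain http.
    RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]

Is a single rule set like this the right approach, or does each indexed URL need its own redirect?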
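For question 2, if robots.txt is the way to go, my understanding is that it would have to be a separate robots.txt served only on the https host, since robots.txt is fetched per protocol and host. Something like this, where robots_https.txt and the rewrite are just my guess at one way to do it (the normal http robots.txt would stay as-is):

    # robots_https.txt: served only for https requests, e.g. via
    #   RewriteCond %{HTTPS} on
    #   RewriteRule ^robots\.txt$ robots_https.txt [L]
    User-agent: *
    Disallow: /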