XML sitemap

  • Lol, you're not going to give up, are you? Don't take this as a flame or an offense, I just find it funny how driven you are :D

    I love Woltlab and often admire where the development is going. :)


    But sometimes they do fall short and drop the ball. I don't know if it's simply because they don't want to code something (less work) or they truly believe it's not important (short-sighted).


    But other developers either have sitemaps, are adding them, or in one case are re-adding them after previously taking them out... and that is happening for a reason.


    This is one of those no brainers. It's completely obvious.


    1) You have both Google and Bing, who offer this as an advantage for them and for you.


    2) There is consumer demand for it (mainly because of #1).


    Why alienate yourself and avoid it?

  • The perseverance of @Aslan encouraged me to post... (I had thrown in the towel)...


    My forum isn't big (8,000 threads and 10,000 posts) and Google is crawling 2,740 pages per day, but I still have many pages that aren't indexed. When I submit a URL manually, Google indexes it perfectly (within minutes).


    My forum has a daily news section, and in that case being first matters. Normally my rivals (who use sitemaps) are indexed faster than I am. Being first is important both for Google Search and for Google Alerts (Google doesn't usually repeat the same alerts).


    Sitemaps are important to both Google and Bing, and both recommend using them. (This guide was published 5 months ago; 5 months ago is not "the past".)


    For optimal crawling, we recommend using both XML sitemaps and RSS/Atom feeds. XML sitemaps will give Google information about all of the pages on your site. RSS/Atom feeds will provide all updates on your site, helping Google to keep your content fresher in its index.
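    For reference, the sitemap format that Google and Bing share is just a small XML file following the sitemaps.org protocol. A minimal example might look like this (the forum URL, date, and values here are made up for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- hypothetical thread URL; a real sitemap lists every indexable page -->
    <loc>https://example.com/thread/1234-daily-news/</loc>
    <lastmod>2015-03-01</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

    Only `<loc>` is required; `<lastmod>`, `<changefreq>`, and `<priority>` are optional hints for the crawler.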


    With RSS it's impossible to check your progress: the number of submitted pages is the same every time (391 pages in my case), so you can't use that tool to track indexing.



    Google doesn't like "human bots" or "human sitemaps". I'm temporarily "banned" because I submitted a lot of URLs manually to get them indexed faster: hours (automatically) vs. minutes (manually). Google wants sitemaps.


    Sitemaps are unnecessary for popular sites? They require a lot of resources on big sites? Maybe an "on/off" option is the solution; other software offers one.
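    To illustrate that generating one is cheap, here is a rough Python sketch of what such an optional sitemap feature might produce. This is not WoltLab's code; the thread URLs and dates are hypothetical stand-ins for a real database query:

```python
from datetime import date
from xml.sax.saxutils import escape


def build_sitemap(urls):
    """Build a sitemaps.org-style XML document from (url, last_modified) pairs."""
    entries = []
    for loc, lastmod in urls:
        entries.append(
            "  <url>\n"
            f"    <loc>{escape(loc)}</loc>\n"
            f"    <lastmod>{lastmod.isoformat()}</lastmod>\n"
            "  </url>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>\n"
    )


# Hypothetical thread list; a real plugin would query the forum database.
threads = [
    ("https://example.com/thread/1-welcome/", date(2015, 3, 1)),
    ("https://example.com/thread/2-daily-news/", date(2015, 3, 2)),
]
print(build_sitemap(threads))
```

    For a forum of 8,000 threads this is a single query and a few kilobytes of XML, which is why an on/off switch for big sites that don't want it seems like a reasonable compromise.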



    Summary: Google doesn't know all the URLs of my site, the indexing of my forum is very slow, Google and Bing recommend using sitemaps, other software uses sitemaps, and sitemaps are important for small sites too.

  • I'm not majorly fussed about the sitemap because my site is small and it isn't really needed. But saying they are last decade's tech and totally defunct, etc., is just plain ridiculous.


    Google uses them to stay updated on new content so it can be crawled and indexed fast, and as a pointer to deep content; on a big site the bot can often overlook new content and older, deeper content. They are still very much used and current.

  • @Marcel Werk I really feel this was dismissed in haste, and I would strongly suggest it be reevaluated.


    Both Google and Bing, the two top search engines, strongly recommend a sitemap. There is plenty of documentation backing up the validity of using one, both from the search engines themselves and from third parties.


    XenForo
    Ip.Board
    vBulletin


    All include sitemaps. This isn't something that people dismiss or think is unimportant.

  • A week later...



    I can't submit new content to Google, and indexing is veeeery slow.



    Google uses them to stay updated on new content so it can be crawled and indexed fast, and as a pointer to deep content; on a big site the bot can often overlook new content and older, deeper content. They are still very much used and current.


    Absolutely agree! Deep content is another of my problems.