【Question Title】: How to schedule IIS SEO toolkit to run daily
【Posted】: 2011-02-19 08:18:43
【Question】:

I have the Microsoft SEO Toolkit installed in IIS: http://www.iis.net/download/seotoolkit

I would like to schedule it to run daily and generate a report.

Does anyone know how to do this?

【Comments】:

    Tags: windows iis seo scheduling


    【Solution 1】:

    You can do this in a couple of ways:

    1) Use a PowerShell script: http://blogs.iis.net/carlosag/archive/2008/02/10/using-microsoft-web-administration-in-windows-powershell.aspx

    PS C:\> $iis = new-object Microsoft.Web.Administration.ServerManager

    PS C:\> $iis.Sites | foreach { $_.Applications | where { $_.ApplicationPoolName -eq 'DefaultAppPool' } | select-object Path, @{Name="AnonymousEnabled"; Expression = { $_.GetWebConfiguration().GetSection("system.webServer/security/authentication/anonymousAuthentication").GetAttributeValue("enabled") }} }

    2) Or you can create a small C# program like this one:

    using System;
    using System.IO;
    using System.Linq;
    using System.Net;
    using System.Threading;
    using Microsoft.Web.Management.SEO.Crawler;

    namespace SEORunner {
        class Program {

        static void Main(string[] args) {
    
            if (args.Length != 1) {
                Console.WriteLine("Please specify the URL.");
                return;
            }
    
            // Create a URI class
            Uri startUrl = new Uri(args[0]);
    
            // Run the analysis
            CrawlerReport report = RunAnalysis(startUrl);
    
            // Run a few queries...
            LogSummary(report);
    
            LogStatusCodeSummary(report);
    
            LogBrokenLinks(report);
        }
    
        private static CrawlerReport RunAnalysis(Uri startUrl) {
            CrawlerSettings settings = new CrawlerSettings(startUrl);
            settings.ExternalLinkCriteria = ExternalLinkCriteria.SameFolderAndDeeper;
            // Generate a unique name
            settings.Name = startUrl.Host + " " + DateTime.Now.ToString("yy-MM-dd hh-mm-ss");
    
            // Use the same directory as the default used by the UI
            string path = Path.Combine(
                Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments),
                "IIS SEO Reports");
    
            settings.DirectoryCache = Path.Combine(path, settings.Name);
    
            // Create a new crawler and start running
            WebCrawler crawler = new WebCrawler(settings);
            crawler.Start();
    
            Console.WriteLine("Processed - Remaining - Download Size");
            while (crawler.IsRunning) {
                Thread.Sleep(1000);
                Console.WriteLine("{0,9:N0} - {1,9:N0} - {2,9:N2} MB",
                    crawler.Report.GetUrlCount(),
                    crawler.RemainingUrls,
                    crawler.BytesDownloaded / 1048576.0f);
            }
    
            // Save the report
            crawler.Report.Save(path);
    
            Console.WriteLine("Crawling complete!!!");
    
            return crawler.Report;
        }
    
        private static void LogSummary(CrawlerReport report) {
            Console.WriteLine();
            Console.WriteLine("----------------------------");
            Console.WriteLine(" Overview");
            Console.WriteLine("----------------------------");
            Console.WriteLine("Start URL:  {0}", report.Settings.StartUrl);
            Console.WriteLine("Start Time: {0}", report.Settings.StartTime);
            Console.WriteLine("End Time:   {0}", report.Settings.EndTime);
            Console.WriteLine("URLs:       {0}", report.GetUrlCount());
            Console.WriteLine("Links:      {0}", report.Settings.LinkCount);
            Console.WriteLine("Violations: {0}", report.Settings.ViolationCount);
        }
    
        private static void LogBrokenLinks(CrawlerReport report) {
            Console.WriteLine();
            Console.WriteLine("----------------------------");
            Console.WriteLine(" Broken links");
            Console.WriteLine("----------------------------");
            foreach (var item in from url in report.GetUrls()
                                 where url.StatusCode == HttpStatusCode.NotFound &&
                                       !url.IsExternal
                                 orderby url.Url.AbsoluteUri ascending
                                 select url) {
                Console.WriteLine(item.Url.AbsoluteUri);
            }
        }
    
        private static void LogStatusCodeSummary(CrawlerReport report) {
            Console.WriteLine();
            Console.WriteLine("----------------------------");
            Console.WriteLine(" Status Code summary");
            Console.WriteLine("----------------------------");
            foreach (var item in from url in report.GetUrls()
                                 group url by url.StatusCode into g
                                 orderby g.Key
                                 select g) {
                Console.WriteLine("{0,20} - {1,5:N0}", item.Key, item.Count());
            }
        }
    }
    

    }
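To turn the code above into a console executable, one option is compiling it with the .NET Framework command-line compiler and referencing the toolkit's crawler assembly. This is a minimal sketch; the DLL path below is an assumption, so locate `Microsoft.Web.Management.SEO.Crawler.dll` wherever your SEO Toolkit installation put it:

```shell
# Compile SEORunner.cs into SEORunner.exe, referencing the SEO Toolkit crawler assembly.
# The assembly path is a typical install location, not guaranteed on every machine.
csc.exe /out:SEORunner.exe /r:"C:\Program Files\Reference Assemblies\Microsoft\IIS\Microsoft.Web.Management.SEO.Crawler.dll" SEORunner.cs
```

The resulting executable takes the start URL as its single argument, e.g. `SEORunner.exe http://www.example.com/`.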

    Then configure Windows Task Scheduler to run it.
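For example, a task can be created from the command line with schtasks (the task name, executable path, and site URL below are placeholders for illustration):

```shell
# Create a scheduled task that runs the crawler daily at 03:00.
# "C:\Tools\SEORunner.exe" and the URL are placeholder values; adjust to your setup.
schtasks /Create /TN "IIS SEO Daily Crawl" /TR "\"C:\Tools\SEORunner.exe\" http://www.example.com/" /SC DAILY /ST 03:00
```

The Task Scheduler GUI (taskschd.msc) can be used instead if you prefer to configure the trigger interactively.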

    We use the same toolkit at http://www.seo-genie.com and could run those tests for you on a weekly basis, if you want to check it out. Otherwise, just use the code I pasted above together with the Windows scheduler, or take the PowerShell approach.

    【Comments】:

      【Solution 2】:

      I posted a blog article on how to build a command-line tool that uses the engine. You can then schedule it to run with the Task Scheduler in Windows.

      http://blogs.msdn.com/b/carlosag/archive/2009/11/18/iis-seo-toolkit-start-new-analysis-automatically-through-code.aspx

      【Comments】:
