Documentation ¶
Index ¶
- Constants
- type CrawlingUserAgent
- type IndexStatusInspectionResult
- type IndexingState
- type InspectionResult
- type PageFetchState
- type RobotsTxtState
- type SearchConsole
- func (sc *SearchConsole) AddSite(siteUrl string) error
- func (sc *SearchConsole) DeleteSite(siteUrl string) error
- func (sc *SearchConsole) GetSite(siteUrl string) (Site, error)
- func (sc *SearchConsole) InspectURL(inspectionUrl, siteUrl string) (InspectionResult, error)
- func (sc *SearchConsole) ListSites() ([]Site, error)
- type Site
- type Verdict
Constants ¶
const (
	VerdictUnspecified = "VERDICT_UNSPECIFIED" // Unknown verdict.
	VerdictPass        = "PASS"                // Equivalent to "Valid" for the page or item in Search Console.
	VerdictPartial     = "PARTIAL"             // Reserved, no longer in use.
	VerdictFail        = "FAIL"                // Equivalent to "Error" or "Invalid" for the page or item in Search Console.
	VerdictNeutral     = "NEUTRAL"             // Equivalent to "Excluded" for the page or item in Search Console.
)

const (
	RobotsTxtStateUnspecified = "ROBOTS_TXT_STATE_UNSPECIFIED" // Unknown robots.txt state, typically because the page wasn't fetched or found, or because robots.txt itself couldn't be reached.
	RobotsTxtStateAllowed     = "ALLOWED"                      // Crawl allowed by robots.txt.
	RobotsTxtStateDisallowed  = "DISALLOWED"                   // Crawl blocked by robots.txt.
)

const (
	IndexingStateUnspecified         = "INDEXING_STATE_UNSPECIFIED" // Unknown indexing status.
	IndexingStateAllowed             = "INDEXING_ALLOWED"           // Indexing allowed.
	IndexingStateBlockedByMetaTag    = "BLOCKED_BY_META_TAG"        // Indexing not allowed, 'noindex' detected in 'robots' meta tag.
	IndexingStateBlockedByHttpHeader = "BLOCKED_BY_HTTP_HEADER"     // Indexing not allowed, 'noindex' detected in 'X-Robots-Tag' http header.
	IndexingStateBlockedByRobotsTxt  = "BLOCKED_BY_ROBOTS_TXT"      // Reserved, no longer in use.
)

const (
	PageFetchStateUnspecified        = "PAGE_FETCH_STATE_UNSPECIFIED" // Unknown fetch state.
	PageFetchStateSuccessful         = "SUCCESSFUL"                   // Successful fetch.
	PageFetchStateSoft404            = "SOFT_404"                     // Soft 404.
	PageFetchStateBlockedRobotsTxt   = "BLOCKED_ROBOTS_TXT"           // Blocked by robots.txt.
	PageFetchStateNotFound           = "NOT_FOUND"                    // Not found (404).
	PageFetchStateAccessDenied       = "ACCESS_DENIED"                // Blocked due to unauthorized request (401).
	PageFetchStateServerError        = "SERVER_ERROR"                 // Server error (5xx).
	PageFetchStateRedirectError      = "REDIRECT_ERROR"               // Redirection error.
	PageFetchStateAccessForbidden    = "ACCESS_FORBIDDEN"             // Blocked due to access forbidden (403).
	PageFetchStateBlocked4xx         = "BLOCKED_4XX"                  // Blocked due to other 4xx issue (not 403, 404).
	PageFetchStateInternalCrawlError = "INTERNAL_CRAWL_ERROR"         // Internal error.
	PageFetchStateInvalidUrl         = "INVALID_URL"                  // Invalid URL.
)

const (
	CrawlingUserAgentUnspecified = "CRAWLING_USER_AGENT_UNSPECIFIED" // Unknown user agent.
	CrawlingUserAgentDesktop     = "DESKTOP"                         // Desktop user agent.
	CrawlingUserAgentMobile      = "MOBILE"                          // Mobile user agent.
)

const (
	WebmastersScope     = "https://www.googleapis.com/auth/webmasters"          // Read/write access.
	WebmastersReadScope = "https://www.googleapis.com/auth/webmasters.readonly" // Read-only access.
)
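As a sketch of how these string constants are typically consumed, the snippet below mirrors a few of the Verdict values locally (so it is self-contained) and translates an API verdict into the label the constant comments say Search Console shows in its UI; `uiLabel` is an illustrative helper, not part of this package.

```go
package main

import "fmt"

// Local mirrors of a few Verdict constants above, for a self-contained example.
const (
	VerdictUnspecified = "VERDICT_UNSPECIFIED"
	VerdictPass        = "PASS"
	VerdictFail        = "FAIL"
	VerdictNeutral     = "NEUTRAL"
)

// uiLabel maps an API verdict to the wording Search Console uses in its UI,
// following the comments on the constants above.
func uiLabel(verdict string) string {
	switch verdict {
	case VerdictPass:
		return "Valid"
	case VerdictFail:
		return "Error"
	case VerdictNeutral:
		return "Excluded"
	default:
		return "Unknown"
	}
}

func main() {
	fmt.Println(uiLabel(VerdictPass))    // Valid
	fmt.Println(uiLabel(VerdictNeutral)) // Excluded
}
```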
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type CrawlingUserAgent ¶
type CrawlingUserAgent string
type IndexStatusInspectionResult ¶
type IndexStatusInspectionResult struct {
	Sitemap         []string       `json:"sitemap"`
	ReferringUrls   []string       `json:"referringUrls"`
	Verdict         Verdict        `json:"verdict"`
	CoverageState   string         `json:"coverageState"`
	RobotsTxtState  RobotsTxtState `json:"robotsTxtState"`
	IndexingState   IndexingState  `json:"indexingState"`
	LastCrawlTime   time.Time      `json:"lastCrawlTime"`
	PageFetchState  PageFetchState `json:"pageFetchState"`
	GoogleCanonical string         `json:"googleCanonical"`
	UserCanonical   string         `json:"userCanonical"`
}
type IndexingState ¶
type IndexingState string
type InspectionResult ¶
type InspectionResult struct {
	ResultLink        string                      `json:"inspectionResultLink"`
	IndexStatusResult IndexStatusInspectionResult `json:"indexStatusResult"`
}
InspectionResult is the URL inspection result, including all inspection results.
TODO:
- ampResult
- richResultsResult
type PageFetchState ¶
type PageFetchState string
type RobotsTxtState ¶
type RobotsTxtState string
type SearchConsole ¶
type SearchConsole struct {
// contains filtered or unexported fields
}
func NewSearchConsole ¶
func NewSearchConsole(auth authentication.Authenticator) *SearchConsole
func NewSearchConsoleFromJSON ¶
func NewSearchConsoleFromJSON(data []byte) (*SearchConsole, error)
NewSearchConsoleFromJSON parses data and returns a new *SearchConsole.
func NewSearchConsoleFromJSONFile ¶
func NewSearchConsoleFromJSONFile(name string) (*SearchConsole, error)
NewSearchConsoleFromJSONFile reads the file at path name and returns a new *SearchConsole.
func (*SearchConsole) AddSite ¶
func (sc *SearchConsole) AddSite(siteUrl string) error
AddSite adds a site to the set of the user's sites in Search Console.
If successful, this method returns an empty response body; if the body is not empty, it is returned as an error.
This function sets the Oauth2 scope to WebmastersScope.
See more: https://developers.google.com/webmaster-tools/v1/sites/add
func (*SearchConsole) DeleteSite ¶
func (sc *SearchConsole) DeleteSite(siteUrl string) error
DeleteSite removes a site from the set of the user's Search Console sites.
If successful, this method returns an empty response body; if the body is not empty, it is returned as an error.
This function sets the Oauth2 scope to WebmastersScope.
See more: https://developers.google.com/webmaster-tools/v1/sites/delete
func (*SearchConsole) GetSite ¶
func (sc *SearchConsole) GetSite(siteUrl string) (Site, error)
GetSite retrieves information about a specific site.
This function sets the Oauth2 scope to WebmastersReadScope.
See more: https://developers.google.com/webmaster-tools/v1/sites/get
func (*SearchConsole) InspectURL ¶
func (sc *SearchConsole) InspectURL(inspectionUrl, siteUrl string) (InspectionResult, error)
InspectURL inspects inspectionUrl within the siteUrl property and returns the InspectionResult.
func (*SearchConsole) ListSites ¶
func (sc *SearchConsole) ListSites() ([]Site, error)
ListSites lists the user's Search Console sites.
This function sets the Oauth2 scope to WebmastersReadScope.
See more: https://developers.google.com/webmaster-tools/v1/sites/list
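Putting the methods above together, a typical session might look like the sketch below. It uses only the constructors and methods documented on this page, but the credentials file name and the site/page URLs are placeholders, and the snippet omits the package import, so treat it as illustrative rather than copy-pasteable.

	// Illustrative sketch: file name and URLs are placeholders.
	sc, err := NewSearchConsoleFromJSONFile("service-account.json")
	if err != nil {
		log.Fatal(err)
	}

	// List the user's properties (read-only scope).
	sites, err := sc.ListSites()
	if err != nil {
		log.Fatal(err)
	}
	for _, site := range sites {
		fmt.Println(site)
	}

	// Inspect one page within a property.
	res, err := sc.InspectURL("https://example.com/page", "https://example.com/")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(res.IndexStatusResult.Verdict)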