Options in the General section

The commands in this section set a series of parameters that help optimize the website's pages: you can activate the SiteMap, reduce page loading times and include a robots.txt file.

Specifically, you can apply the following options:

Automatically create a SiteMap: this option is active by default, so the SiteMap is automatically created and linked.

For the SiteMap to be created correctly, you must enter a valid URL in the WebSite Address (URL) field in the Step 1 - Website Settings window.

The sitemap.xml file is also automatically created and linked for websites built with the Evo edition of WebSite X5.
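
For reference, the SiteMap is a sitemap.xml file that follows the standard sitemaps.org protocol. A minimal example with a placeholder URL is shown below; the entries actually generated by the program may differ in detail:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://www.example.com/index.html</loc>
      <lastmod>2024-01-01</lastmod>
      <priority>0.5</priority>
    </url>
  </urlset>

Each page of the website is listed as a <url> entry, with its full address in the <loc> tag.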

Meanwhile, the File size section offers options for reducing page loading times and therefore improving users' browsing experience:

Image Optimization: optimizes the project's images by reducing their file size. You can choose between different levels of compression: minimal compression doesn't degrade the images at all, while the other compression levels may yield more significant reductions in file size but also lead to a progressive loss of image quality. The compressed images are saved in .JPG format.
Convert images in WebP format: if you activate this option, the images are saved in WebP instead of .JPG format. Saving in WebP format leads to an additional and significant size reduction for the selected images.

WebP compression can be applied to any image except GIF and SVG files, whereas JPG compression can't be applied to PNG, GIF or SVG images.

Enable file minification: during project export, the project's JS and CSS files are optimized to make them smaller. In practice, this removes information the file doesn't need in order to work, such as spaces and line breaks (see the examples below).
Enable gzip compression: files are sent to the browser in gzip-compressed form, which speeds up page loading times.
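
To give a concrete idea of what these two options do, here are purely generic illustrations rather than the exact files or server configuration the program produces. Minification strips whitespace and line breaks out of a CSS or JS file, for example:

  /* before minification */
  body {
      margin: 0;
      color: #333;
  }

  /* after minification */
  body{margin:0;color:#333}

gzip compression is applied by the web server: on an Apache server, for instance, it is typically enabled with mod_deflate directives such as the following, although the exact mechanism depends on the hosting provider:

  <IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
  </IfModule>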

As well as enabling a statistics engine, you can choose from the following options:

Include the robots.txt file: sets up the robots.txt file, which indicates which parts of the website are to be excluded from search engine indexing. By default, the instructions in the robots.txt file exclude the contents of some subfolders, such as Admin and Res, from indexing by all robots. You can edit the robots.txt file by hand or paste in new instructions.

For more information on the robots.txt file, click on the button to go to the official site: http://www.robotstxt.org/robotstxt.html.
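
For illustration, instructions of the kind described above, which exclude the contents of certain subfolders from indexing by all robots, look like this in robots.txt syntax (the paths are examples and should be adapted to your own site):

  User-agent: *
  Disallow: /admin
  Disallow: /res

The User-agent: * line addresses all robots, and each Disallow line indicates a folder whose contents should not be visited.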

 


Read the guides:

- How to create and link the website's SiteMap
- How to write and use the robots.txt file