A Hassle-free Approach to Clearing Cloudflare Cache: Using Webhooks

To make websites faster, Cloudflare stores static website content in its cache on servers in globally distributed data centers. Because copies of the data are kept close to your visitors' geographical locations, pages load faster.

Sometimes you make changes in the backend but the website keeps serving the cached (previous) version, so the update does not appear on the frontend. In that case, you need to clear the cache for the affected pages.

If you do this manually in Cloudflare, the steps to clear (purge) the cache are as follows:

  1. Log in to your Cloudflare dashboard, and select your account and domain.
  2. Select Caching > Configuration.
  3. Under Purge Cache, select Custom Purge. The Custom Purge window appears; complete the form as required.
    Or, if you want to purge everything,
    under Purge Cache, select Purge Everything. A warning window appears.
  4. Select Purge.

The problem is that not everybody on the team has access to Cloudflare, and logging in and performing these steps takes time.

To save time and make this functionality widely available, we can create a chatbot in Microsoft Teams using webhooks and connectors. Authors, editors, marketers, and other team members can then use it easily, with or without technical knowledge.

To clear Cloudflare cache with a chatbot, an Outgoing Webhook is the right option. In the rest of this blog, I will walk you through clearing Cloudflare cache using an outgoing webhook.

Create Outgoing Webhook

  • Log in to Microsoft Teams and create a team.
  • Click the three dots next to the team you just created, then select the Manage team option (refer to the image below).

  • On the Manage team page, click the outgoing webhook option in the lower-right corner (refer to the image below).

  • A window with three fields will appear. Add the name of your webhook (users will use this name to call it), the Callback URL (here, the Azure function app URL), and a short description in the Description field; a profile picture is optional.

  • After adding this information, click Create. Another window with an HMAC (Hash-based Message Authentication Code) security token will appear. Save it, because it will be required to ensure the authenticity and integrity of webhook requests: the HMAC token helps verify that incoming requests to your webhook endpoint genuinely originated from the expected source.
The HMAC token is unique per configuration and does not expire.

Set up the Visual Studio Azure Functions project

Pre-requisites:

  • Visual Studio 2022 version 17.3
  • .NET Core 6+
  • An Azure Functions project created in Visual Studio

By selecting Azure Functions and clicking Next, you will get several options to choose: the name, authorization level (Function), Functions worker, function type (for the chatbot, we will use the Http trigger), etc. Select as required.

After creating the project, you will get the following:

In the .cs file, add the code for the action you want to perform when the webhook is called.

All input/output related details should be added to host.json file.

Code for Cloudflare cache clear is present here.
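For orientation, here is an illustrative sketch (not the linked code) of the purge call itself. Cloudflare's v4 API exposes a purge_cache endpoint per zone; the zone ID and an API token with cache-purge permission come from your Cloudflare dashboard, and the helper names below are my own:

```python
import json
import urllib.request

CLOUDFLARE_API = "https://api.cloudflare.com/client/v4"

def build_purge_payload(urls=None):
    # No URLs means purge the whole zone; otherwise purge only the given files.
    if not urls:
        return {"purge_everything": True}
    return {"files": list(urls)}

def purge_cache(zone_id, api_token, urls=None):
    # POST to the purge_cache endpoint; zone_id and api_token are the
    # values from your Cloudflare dashboard.
    req = urllib.request.Request(
        f"{CLOUDFLARE_API}/zones/{zone_id}/purge_cache",
        data=json.dumps(build_purge_payload(urls)).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The Azure function would call something like purge_cache with the URL passed in the Teams message, or with no URLs to purge everything.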

To verify that requests are sent from a legitimate client, follow the steps below:

  1. Retrieve the Secret Key: When you set up the outgoing webhook, you should have generated or obtained a secret key. Retrieve this key, as it will be used for HMAC validation.
  2. Extract the Request Data: Extract the relevant data from the outgoing webhook request, which typically includes the payload and any additional headers or parameters.
  3. Compute the HMAC: Use the secret key obtained in step 1 and the request data to compute the HMAC with the appropriate hashing algorithm (e.g., HMAC-SHA1, HMAC-SHA256). Check here for the hashing algorithm used.
  4. Compare the HMACs: Compare the computed HMAC with the HMAC received in the request. If they match, the request is considered valid and can be trusted.
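The four validation steps above can be sketched as follows. This is a hedged sketch, assuming Teams' usual scheme of signing the raw request body with HMAC-SHA256 using the base64-decoded security token; the function names are illustrative:

```python
import base64
import hashlib
import hmac

def compute_hmac(secret_b64, body_bytes):
    # The security token saved during webhook creation is base64; decode it
    # to get the key, then sign the raw request body with HMAC-SHA256.
    key = base64.b64decode(secret_b64)
    digest = hmac.new(key, body_bytes, hashlib.sha256).digest()
    return "HMAC " + base64.b64encode(digest).decode("ascii")

def is_valid_request(secret_b64, body_bytes, auth_header):
    # Constant-time comparison of the computed and received HMAC values
    # guards against timing attacks.
    return hmac.compare_digest(compute_hmac(secret_b64, body_bytes), auth_header)
```

If the computed value matches the Authorization header sent with the request, the request can be trusted and the cache purge can proceed.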

Once all the code and configuration are done, create a publishing profile to push the code to the Azure function.

Publish it to Azure

You require Microsoft credentials to perform this.

  • Select the publishing target for this function app. Since we are moving it to Azure, select Azure.

  • Based on your selection, choose the required Azure service.

  • Login with Microsoft credentials.

After logging in, select the subscription, an existing resource group, and the Azure function app.

Or you can create a new one:

Click Create and start the publish action.

Add the function URL from the Azure function to the outgoing webhook's Callback URL field.

For logs, check the monitor column.

Finally, test and use your chatbot

Once all this is done, @mention your outgoing webhook by name, followed by any parameters. To clear the Cloudflare cache, pass the URL whose cache needs to be cleared (refer to the example below).

This example clears the Cloudflare cache, but we can create chatbots using webhooks and connectors for any purpose by building custom solutions in Azure Functions.

Hope you like it!


Creating chatbot in Microsoft Teams with Webhooks and connectors

Webhooks are user-defined HTTP callbacks used to trigger automated messages, notifications, and transfers of data from one web application to another via a specific URL (the webhook endpoint).

For example, a source application sends an HTTP POST request to a specific URL or endpoint defined by the recipient application; the recipient processes the request, takes the required action, and sends back a response once that action is complete. This lets multiple applications communicate and transfer data in real time without delays.

Webhooks are becoming increasingly popular as a way for web applications to integrate with each other in a more streamlined and efficient way. They are often used in conjunction with APIs (Application Programming Interfaces) to allow different systems to work together seamlessly.

APIs must be called explicitly every time you want to perform a task or get certain information, whereas webhooks are triggered automatically by an event, without any request from the consumer.

If we consider Microsoft Teams webhooks for creating chatbots, there are two types:

Outgoing Webhooks:

  • Outgoing webhooks are used to send data from an application to an external service.
  • Outgoing webhooks are triggered by events or actions that occur within a web application or service.
  • When a specific event or action takes place, the application initiates an HTTP request to send data (payload) to a predefined URL (webhook endpoint).
  • The receiving application or service then processes the data sent by the outgoing webhook and takes appropriate actions based on the received payload.
  • Outgoing webhooks are often used to push data or notifications to external systems or services, such as sending updates to a chat platform, triggering actions in another application, or delivering data to a remote server.
Image 1

Incoming Webhooks:

  • Incoming webhooks are used to receive data from external sources into an application.
  • Incoming webhooks are endpoints that are provided by a web application or service to accept and process incoming HTTP requests.
  • External systems or services can send HTTP requests to the webhook endpoint, typically with a payload containing relevant data or information.
  • The receiving application or service processes the incoming webhook request, extracts the payload data, and performs actions based on the received information.
  • Incoming webhooks are commonly used to receive data or notifications from external sources and trigger actions within the receiving application or service. For example, an application might use an incoming webhook to receive data from an external form submission, process it, and store it in a database.
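As a sketch of the incoming side: once Teams gives you an incoming webhook URL, posting a message to a channel is a plain HTTP POST with a small JSON payload (the helper names below are illustrative, and the simplest payload is just a text field):

```python
import json
import urllib.request

def build_message(text):
    # Minimal incoming-webhook payload: a simple text message.
    return {"text": text}

def send_teams_message(webhook_url, text):
    # POST the message to the incoming webhook URL provided by Teams.
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps(build_message(text)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")
```

Any external system (a build server, a form handler, a monitoring tool) can call send_teams_message to push a notification into the channel.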

Go to More apps in Image 1 and then search for incoming webhook:

Image 2

Apart from this, we also have connectors in Microsoft Teams:

Connectors

Connectors are a feature that allows you to integrate external services and receive updates or notifications directly within Teams. Connectors enable you to bring information from various sources into your Teams channels, improving collaboration and keeping your team informed. Here are some key points about connectors in Microsoft Teams:

  • Integration with External Services: Microsoft Teams provides a wide range of pre-built connectors that allow you to connect with external services and applications. These connectors enable you to bring information and updates from external systems directly into Teams channels.
  • Connector Cards: When a connector is configured, it can send messages or updates in the form of Connector Cards. Connector Cards are richly formatted messages that provide information, images, links, and actions related to the external service or system. These cards are displayed in the Teams channel, providing a clear and consistent way to present information from different sources.
  • Connector Configuration: To use a connector in Microsoft Teams, you need to configure it for a specific channel. This configuration involves selecting the desired connector from the available options and specifying the necessary settings or credentials to connect to the external service.
  • Available Connectors: Microsoft Teams offers a wide range of connectors, including connectors for popular services such as GitHub, Trello, SharePoint, Azure DevOps, Salesforce, Jira, and more. These connectors enable you to receive updates, notifications, or specific events from these services directly within Teams.
  • Custom Connectors: In addition to the pre-built connectors, Microsoft Teams also provides the capability to create custom connectors. With custom connectors, you can integrate your own applications or services with Teams, allowing you to send updates, notifications, or information specific to your organization or business processes.
  • Connector Actions: Some connectors also provide actionable buttons or options within the Connector Cards, allowing users to take specific actions directly from Teams. For example, a connector for a project management tool might include buttons to create tasks or update project status.

Go to More apps in Image 1 and search for connector; you will see the list of connectors available in your Teams account:

Image 3

Hope you like this blog!

Part 3: All about Sitecore Serialization using Sitecore CLI

Series of Sitecore CLI blogs

You can check if you have the serialization plugin:

> dotnet sitecore plugin list

If the plugin is not there, install it by running:

> dotnet sitecore plugin add -n Sitecore.DevEx.Extensibility.Serialization

To get the information about serialization commands run:

> dotnet sitecore ser -h

You will get all the information about parameters like below:

Once the Sitecore CLI setup and Sitecore Content Serialization configuration are done, you are ready to serialize items.

First, log in via the interactive or non-interactive flow (check the link for more details):

Here, we will go with the non-interactive client login:

> dotnet sitecore login --authority https://<sitecore-identity-server> --cm http://<sitecore-instance> --allow-write true --client-credentials true --client-id <client-id> --client-secret <client-secret>

Login information will be saved in .sitecore/user.json.

Pull:

Serialize items from Sitecore to disk

You can see the usage of individual commands with the -h or --help option.

  • Pull all items as configured in *.module.json.
> dotnet sitecore ser pull

After pulling, you will see a .yml file on disk corresponding to each item in Sitecore.
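For illustration, a serialized item file is YAML of roughly this shape (a sketch: the GUIDs below are placeholders, and real files carry more fields and per-language versions):

```yaml
---
ID: "110d559f-dea5-42ea-9c1c-8a5df7e70ef9"
Parent: "0de95ae4-41ab-4d01-9eb0-67441b7c2450"
Template: "76036f5e-cbce-46d1-af0a-4143f9b557aa"
Path: /sitecore/content/Home
DB: master
SharedFields:
- ID: "25bed78c-4957-4165-998a-ca1b52f67497"
  Hint: __Created
  Value: 20230101T000000Z
```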

  • Pull items from specific modules only (by tags, separated by commas; for adding tags, check the link)
> dotnet sitecore ser pull -i tags:[tagA,tagB]
  • Pull items from specific modules only (by module namespace)
> dotnet sitecore ser pull -i ModuleA ModuleB
  • Pull all items except certain modules (by tags):
> dotnet sitecore ser pull -e tags:[tagA,tagB]
  • Pull all items except certain modules (by module namespace)
> dotnet sitecore ser pull -e ModuleA ModuleB
  • Pull items without integrity validation
> dotnet sitecore ser pull -s
  • Check the differences/changes in disk and Sitecore without actually pulling it
> dotnet sitecore ser pull --what-if

Validate

Ensures file system integrity of serialized items in disk and their paths

It performs the following checks on serialized items:

  • Invalid physical path.
  • Orphaned parent ID.
  • Non-included item.
  • Empty folder.
  • Duplicate item ID.
  • Non-unique path.

  • Run the validation on serialized items
> dotnet sitecore ser validate
  • Fix the issues found in validation above
> dotnet sitecore ser validate --fix

If the --fix command finds duplicate content items in your file system, it keeps the last updated one and deletes the oldest one.

Package

  • Create package of serialized items in disk
> dotnet sitecore ser pkg create -o FILE_PATH

A package with the extension .itempackage will be created at the file path given in the command.

To create the package at the root path location:

> dotnet sitecore ser pkg create -o <Package name>
  • Install the created package in Sitecore
> dotnet sitecore ser pkg install -f FILE_PATH

Push

Serialize items from disk to Sitecore

Push commands can be executed just like pull commands.

  • Push all items as configured in *.module.json.
> dotnet sitecore ser push
  • Push items from specific modules only (by tags, separated by commas; for adding tags, check the link)
> dotnet sitecore ser push -i tags:[tagA,tagB]
  • Push items from specific modules only (by module namespace)
> dotnet sitecore ser push -i ModuleA ModuleB
  • Push all items except certain modules (by tags):
> dotnet sitecore ser push -e tags:[tagA,tagB]
  • Push all items except certain modules (by module namespace)
> dotnet sitecore ser push -e ModuleA ModuleB
  • Push items without integrity validation
> dotnet sitecore ser push -s
  • Check the differences/changes in disk and Sitecore without actually pushing it
> dotnet sitecore ser push --what-if

Explain

To check whether a specific path is included in any module:
> dotnet sitecore ser explain --path "PATH"

It returns information about the module that includes the given path.

Watch

Pulls changed items into the file system automatically.
  • Activate watch
> dotnet sitecore ser watch

Part 2: Sitecore Content Serialization Configuration(Sitecore CLI)

Series of Sitecore CLI blogs

After this, you will get the following folders and files:

The .sitecore folder contains the schemas folder and the user.json file.

The schemas folder contains three schemas:

  • ModuleFile: the schema for each module configuration (<module name>.module.json)
  • RootConfiguration: required when setting up sitecore.json
  • UserConfiguration: the schema used in user.json when logging in to a Sitecore instance via the interactive or non-interactive flow

user.json saves the login information once the user has logged in successfully via the interactive or non-interactive flow. Do not commit this file to source control.

Configure sitecore.json

This file is common to the whole project. We can configure the following information in sitecore.json:

  • The RootConfigurationFile.schema.json schema is referenced at the top.
  • Modules: add the paths of your modules, or a wildcard (as given above) to include all modules automatically, based on the architecture of the project.
  • Plugins: the plugins installed for serialization, indexing, publishing, resource packaging, and database operations are listed here.
  • Serialization: contains the maximum allowed relative path length, the module-relative serialization path (where item YMLs are created), whether to continue if serialization fails on some item, which fields to exclude, etc.

Several item fields do not contain important information and do not need to be serialized. You can add an array of excluded fields, each with two properties (fieldId and description), to the excludedFields property.

For Example:

"excludedFields": [
      {
        "fieldId": "badd9cf9-53e0-4d0c-bcc0-2d784c282f6a",
        "description": "__Updated by"
      }
]

NOTE:

- Excluding revision fields from serialization can result in issues when publishing with the serialization command.
- You can also specify excluded fields in individual *.module.json files to exclude them from particular modules only.

Settings: contains boolean entries to enable/disable telemetry and to enable checking Sitecore Management Services version compatibility.
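Putting the bullets above together, a minimal sitecore.json might look roughly like this. This is a sketch with illustrative values; check the file generated by dotnet sitecore init for the exact schema and key names:

```json
{
  "$schema": "./.sitecore/schemas/RootConfigurationFile.schema.json",
  "modules": [
    "src/*/*.module.json"
  ],
  "serialization": {
    "defaultMaxRelativeItemPathLength": 100,
    "defaultModuleRelativeSerializationPath": "serialization",
    "continueOnItemFailure": false,
    "excludedFields": []
  },
  "settings": {
    "telemetryEnabled": false
  }
}
```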

Configure <module name>.module.json

  • Create a module file named <module name>.module.json.
  • Add the Module schema at the top.

Update the path as per the location of the file.

  • After schema, add following:
{
  "namespace": "",
  "references": [""],
  "items": ""
}
From dev.sitecore.com
  • In the items property, specify the Sitecore items' include paths to sync items, descendants, children, etc., as given below:
"items": {
    "includes": [
        {
            "name": "<name>",
            "path": "<sitecore item path>",
            "scope": "<scope>",
            "allowedPushOperations": "<allowedPushOperations>",
            "rules": [
                {
                    "path": "<sitecore item path>",
                    "scope": "<scope>"
                }
            ]
        }
    ]
}

Details about include properties:

From dev.sitecore.com
  • The order of the include paths is important. For example, templates, layouts, and renderings should be added first in the include path list to avoid conflicts when syncing content items that depend on them.
  • You can also create a separate Base.module.json file for base items like templates and branches, and then add it as references (refer to the code snippet below) in the rest of the files. This way, we can order include paths inside each *.module.json and also set the hierarchy in which the module files are serialized.

  • You can set the relative path at the module level(refer below code snippet) inside items. This will override the defaultModuleRelativeSerializationPath given in sitecore.json.

  • You can also set different excluded fields in different modules rather than using the same excluded fields for all modules.
...
"items": {
    "includes": [
        {
            "name": "Apikey",
            "path": "/sitecore/system/Settings/Services/API Keys"
        },
        {
            "name": "Media",
            "path": "/sitecore/media library/my-first-jss-app"
        }
    ],
    "excludedFields": [
        {
            "fieldId": "{EB504D1B-B612-4FFF-B239-CA3BD7273D1B}",
            "description": "FieldsForExclude1"
        },
        {
            "fieldId": "{3C2C061E-F61F-4DF6-89EA-0B7A56348737}",
            "description": "FieldsForExclude2"
        }
    ]
}
  • Like the include paths of content items, you can exclude certain sets of items by adding rules to each include path. For example:
"items": {
  "includes": [
    {
      "name": "content",
      "path": "/sitecore/content/home",
      "rules": [
        {
           "path": "/products/legacy",
           "scope": "ignored"
        },
        {
           "path": "/products",
           "scope": "ItemAndDescendants",
           "allowedPushOperations": "createUpdateAndDelete"
        },
        {
           "path": "*",
           "scope": "ignored"
        }
      ]
    }
  ]
}

Details about rule properties:

From dev.sitecore.com

NOTE:

- The scope Ignored is only valid when configuring rules.
- Sitecore CLI does not support duplicate item names; they will break push/pull operations.
  • In the initial serialization, you will need to sync everything. In subsequent serializations, however, you may only need to serialize one module or a couple of modules. You can add tags in the individual module files and then use those tags with the push and pull commands to sync more targeted changes rather than syncing everything.

Pull command with tags:

dotnet sitecore ser pull --include tags:[global-208]
  • You can also serialize roles by adding a roles property to module.json. The roles property is an array of role predicate items with two properties: domain (the Sitecore role domain) and pattern (a regex pattern that determines which roles under the domain to include). For example:
{
    ...
    "items": {
        ...
    },
    "roles": [
      {
        "domain": "sitecore",
        "pattern": "Developer"
      },
      {
        "domain": "custom",
        "pattern": "Role*"
      },
      {
        "domain": "extranet",
        "pattern": "^MySite.*$"
      }
    ]
}

Part 1: Getting started with Sitecore CLI

Series of Sitecore CLI blogs

Following are the steps to install and use Sitecore CLI for your project:

Prerequisites:

  • .NET Core installed
  • A Sitecore instance
  • A working Sitecore Identity server

Steps to install:

  • Install the Sitecore Management Services package (a few DLLs and configs) on your Sitecore instance. Make sure to check compatibility here.
  • Open PowerShell in admin mode.
  • Run the following commands in your root directory:

> dotnet new tool-manifest
> dotnet nuget add source -n Sitecore https://sitecore.myget.org/F/sc-packages/api/v3/index.json
> dotnet tool install Sitecore.CLI
Use the -g option with the install command to install the CLI globally. However, this is not recommended, because different instances may need different versions of Sitecore CLI.
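The tool manifest created by dotnet new tool-manifest is what pins the CLI version per project. It looks roughly like this (the version number is illustrative):

```json
{
  "version": 1,
  "isRoot": true,
  "tools": {
    "sitecore.cli": {
      "version": "5.2.113",
      "commands": [
        "sitecore"
      ]
    }
  }
}
```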
  • Initialize a new project
> dotnet sitecore init
  • Install the required Publishing and Serialization plugins
> dotnet sitecore plugin add -n Sitecore.DevEx.Extensibility.Serialization
> dotnet sitecore plugin add -n Sitecore.DevEx.Extensibility.Publishing

You can check the list of installed plugins with the following command:

> dotnet sitecore plugin list

After this, your folder will look like the following:

To verify that Sitecore CLI is successfully installed and working (-h for help):

> dotnet sitecore -h

Login to Sitecore with Sitecore CLI:

Sitecore CLI supports two flows of authentication and authorization.

  • An interactive user login, using a device code flow. Follow the steps below for interactive login:
> dotnet sitecore login --authority https://<sitecore-identity-server> --cm https://<sitecore-instance> --allow-write true

This will take you to the browser to authorize, and it updates ~/.sitecore/user.json with access tokens. Make sure not to commit user.json, as it contains sensitive information. After this, you are ready to serialize items.

  • A non-interactive client login, using a client credentials flow. This is used by clients such as continuous integration servers. Follow the steps below for non-interactive login:

Make configuration changes:

Create a file named Sitecore.IdentityServer.DevEx.xml and add the following:

<?xml version="1.0" encoding="utf-8"?>
<Settings>
  <Sitecore>
    <IdentityServer>
      <Clients>
        <!-- used to authenticate servers with client id and client secret -->
        <CliServerClient>
            <ClientId>SitecoreCLIServer</ClientId>
            <ClientName>SitecoreCLIServer</ClientName>
            <AccessTokenType>0</AccessTokenType>
            <AccessTokenLifetimeInSeconds>3600</AccessTokenLifetimeInSeconds>
            <IdentityTokenLifetimeInSeconds>3600</IdentityTokenLifetimeInSeconds>
            <RequireClientSecret>true</RequireClientSecret>
            <AllowOfflineAccess>false</AllowOfflineAccess>
            <AllowedGrantTypes>
                <!--
                    client_credentials authenticates with client ID and client secret
                    which is good for CI, tools, etc. However, it's not tied to a USER,
                    it's tied to a client ID.
                -->
                <AllowedGrantType1>client_credentials</AllowedGrantType1>
            </AllowedGrantTypes>
            <ClientSecrets>
                <!--<ClientSecret1>SUPERLONGSECRETHERE</ClientSecret1>-->
            </ClientSecrets>
            <AllowedScopes>
                <!-- this is required even if not a 'user' for Sitecore to like us -->
                <AllowedScope1>sitecore.profile.api</AllowedScope1>
            </AllowedScopes>
        </CliServerClient>
      </Clients>
    </IdentityServer>
  </Sitecore>
</Settings>

Add the client ID and client secret (max length = 100) in this file. Place the file in the Config folder of the Sitecore Identity Server.

Create another file named Sitecore.Owin.Authentication.ClientCredentialsMapping.config containing the following:

<?xml version="1.0" encoding="utf-8"?>
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/" xmlns:role="http://www.sitecore.net/xmlconfig/role/" xmlns:set="http://www.sitecore.net/xmlconfig/set/">
  <sitecore role:require="Standalone or ContentDelivery or ContentManagement">
    <federatedAuthentication>
      <identityProviders>
        <identityProvider id="SitecoreIdentityServer" type="Sitecore.Owin.Authentication.IdentityServer.IdentityServerProvider, Sitecore.Owin.Authentication.IdentityServer" resolve="true">
          <transformations hint="list:AddTransformation">
            <transformation name="admin-ify client credentials users" type="Sitecore.Owin.Authentication.Services.DefaultTransformation, Sitecore.Owin.Authentication">
              <sources hint="raw:AddSource">
                <claim name="client_id" value="SitecoreCLIServer" />
              </sources>
              <targets hint="raw:AddTarget">
                <claim name="name" value="sitecore\superuser" />
                <claim name="http://www.sitecore.net/identity/claims/isAdmin" value="true" />
				<claim name="http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress" value="" />

              </targets>
              <keepSource>true</keepSource>
            </transformation>
          </transformations>
          
        </identityProvider>
      </identityProviders>
    </federatedAuthentication>
  </sitecore>
</configuration>

Add the email ID (using the same client ID as in the first file) in the email address claim value, and place the file in the App_Config/Include/ folder of the Sitecore content management server.

Do an IIS reset.

Open PowerShell in admin mode.

Now login:

> dotnet sitecore login --authority https://<sitecore-identity-server> --cm http://<sitecore-instance> --allow-write true --client-credentials true --client-id <client-id> --client-secret <client-secret>

Login information will be saved in .sitecore/user.json

The next step is Sitecore serialization configuration: the structure, rules, and items pulled all depend on the decisions taken when defining the configuration.

How do you want the items to be pulled: site-wise folders, component-wise folders, page-wise folders? Which fields should be excluded when pulling and pushing items?

Everything depends on how you plan to do Sitecore serialization configuration. Please find the details in next blog.

Sitecore CLI login without Identity server

Find the listener by running dotnet Sitecore.IdentityServer.Host.dll in the Identity Server root folder:

Now log in using the listener obtained in the step above, passing https://localhost:5000 (or http://localhost:5000) as the authority parameter (bypassing the identity server) in the login command:

dotnet sitecore login --authority https://localhost:5000 --cm https://<sitecore-instance> --allow-write true

HTTP Error 500.19 – Internal Server Error on identity server

In our QA environment, the Sitecore login page always navigated to ?fbc=1 and bypassed the identity server. When I hit the identity server URL directly, it gave me the 500.19 error (screenshot above).

First, I checked the event logs for errors and started troubleshooting. Since I am on Sitecore version 10.2, I checked all its prerequisites to verify whether anything was missing.

After some troubleshooting, I found that the problem was with IIS support for .NET Core (the .NET Core runtime) and URL Rewrite. Installing the latest version of the .NET Core Windows hosting bundle and URL Rewrite 2.1 resolved the issue.

For more information on HTTP Error 500.19, check out https://learn.microsoft.com/en-us/troubleshoot/developer/webapps/iis/health-diagnostic-performance/http-error-500-19-webpage

Configure the editing host for local XM Cloud

After setting up my local environment, when I tried to open item in experience editor, it gave me below error:

Unable to connect to the remote server

To resolve this, we need to configure the rendering host using items, because the solution is based on Sitecore Experience Accelerator (SXA).

App Name:

Make sure that the name of the app in the Settings item under your site matches the app name (config.appName) in the src/<app-folder>/package.json file of the front-end app.

Predefined application rendering host:

Check that the Predefined application rendering host field has the value ‘default’ in the /content/<SiteName>/<AppName>/Settings/Site Grouping/<your-site> item.

Application name:

Verify that the application name is correct in /System/Settings/Services/Rendering Hosts/Default.

If not, change it to match the config.appName setting in the package.json file of your rendering app.

If the field is left blank, it defaults to the value you configured in step 1 (App Name).

Local Configuration:

Create a rendering host item under /System/Settings/Services/Rendering Hosts/ and add the values.

In the Server side rendering engine endpoint URL field, enter the URL of the front-end app’s API route for rendering in editing mode (check the docker-compose.override.yml file in the root of your project).

In the Server side rendering engine application URL field, enter the value of the RENDERING_HOST_INTERNAL_URI environment variable (also in docker-compose.override.yml).

In the Application name field, enter the name of your front-end application as configured in the package.json file.

Select the local predefined application rendering host:

In the /content/<sitename>/<appname>/Settings/Site Grouping/<appname> item, under the Settings section, set the Predefined application rendering host field to the Local rendering host definition and save the item.

Run front-end app in connected mode:

Go to your rendering host directory (say sxastarter) and start the front-end application in connected mode with the command:

npm run start:connected

Now, open any item in the Experience Editor.

Getting started with XM Cloud(Local)

In this blog, I will walk through how to set up a local XM Cloud development environment.

Prerequisites:

  1. A valid Sitecore license file
  2. Windows PowerShell 5.1. (PowerShell 7 is not supported at this time)
  3. The current long-term support (LTS) version of Node.js
  4. .NET Core 6.0 SDK
  5. .NET Framework 4.8 SDK
  6. Visual Studio 2022
  7. Docker for Windows, with Windows containers enabled (make sure you have all the components required to run containers)

Prepare Local Environment to run containers:

Open Powershell in admin mode.

Make sure that Internet Information Services (IIS) is not running on port 443:

Get-Process -Id (Get-NetTCPConnection -LocalPort 443).OwningProcess

If it is, stop IIS:

iisreset /stop

Check if you have Apache Solr or any other service running on port 8984:

Get-Process -Id (Get-NetTCPConnection -LocalPort 8984).OwningProcess

If yes, stop it:

Stop-Service -Name "<the name of your service>"

or

nssm stop "<the name of your service>"

Set up the XM cloud development solution

The starter template has scripts for the following:

Clone the repository you configured for the XM Cloud project, open PowerShell in admin mode in the same folder, and run the commands below.

Prepare the Sitecore Container environment:

.\init.ps1 -InitEnv -LicenseXmlPath "<C:\path\to\license.xml>" -AdminPassword "<desired password>"

Restart your terminal or VS Code after this, as instructed.

Now, run up.ps1 to download the Sitecore Docker images and to install and configure the containers and the client application:

.\up.ps1

Follow the instructions in your browser while this script runs: log in to the Sitecore XM instance and accept the device authorization.

The following screenshot shows the Docker images that were created:

The starter XM Cloud template is just an empty instance.

Now, configure item serialization to synchronize items between environments.

Create a serialization *.module.json file under /src for syncing templates, renderings, content items, media items, and so on. Each include rule follows the syntax below, wrapped in the standard Sitecore Content Serialization module structure:

{
  "namespace": "<module name>",
  "items": {
    "includes": [
      {
        "name": "<name of type of data>",
        "path": "<sitecore item path>",
        "allowedPushOperations": "CreateUpdateAndDelete"
      }
    ]
  }
}

I created a headless site and configured its content tree, media, and templates in a *.module.json file for serialization.

In order to serialize items, execute the following:

Authorize the local project in your XM Cloud organization

dotnet sitecore cloud login

Follow the instructions to log in and authorize your device; the connectivity information is stored in the .sitecore/user.json file.

Connect to the local environment

dotnet sitecore connect --ref xmcloud --cm https://xmcloudcm.localhost --allow-write true -n local

To pull serialized items from your locally running XM instance into the project

dotnet sitecore ser pull -n "local"

To push the serialized items into your locally running XM instance

dotnet sitecore ser push -n "local"

Configure the editing host

After this, you are all set to use your local XM Cloud environment. In the next blog, we will connect this local environment with the remote environment.

Mastering Image Optimization: Best Practices for Fast-Loading Websites

Images play a crucial role in the performance of websites. Large images take longer to load, which results in a poor user experience and, in turn, poor search engine rankings. So it is very important to use optimized images. There are several ways to reduce the size of an image, depending on the specific use case and the desired quality of the resulting image.

Compress Images


Image compression is a process that reduces the file size of an image without significantly impacting its quality. Compression not only reduces loading time on the site but also makes images easier to store and share. You can use free online tools such as TinyPNG, CompressJPEG, or Kraken.io to compress your images before uploading them to your website. However, choose the compression tool carefully so that you do not lose specific details and color accuracy.
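As a rough sketch of scripted compression, here is what the Pillow library does with two tuning knobs; the sample image and file names are stand-ins for your own assets:

```python
from PIL import Image

# Stand-in for a real photo; in practice you would Image.open() your file.
img = Image.new("RGB", (800, 600), (180, 120, 60))

# Re-save with a lower JPEG quality. quality=75 is a common balance of
# size versus visible detail; optimize=True spends extra CPU time to
# shave off a few more bytes.
img.save("photo_compressed.jpg", "JPEG", quality=75, optimize=True)
```

Online compressors apply a similar quality reduction (plus format-specific tricks) behind the scenes.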

Resize Images


Make sure that you are using the appropriate size of image for the location where it is placed. Avoid using an image that is much larger than you need and then forcing it down with width and height attributes.
Images should be made responsive, with the rendered size matched to the device type. You can use several online and offline tools to resize images for different components on web pages, such as Cloudflare Workers, custom code, content-aware fill, or seam carving.
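Server-side resizing can be sketched in a few lines with Pillow; the dimensions and file names here are illustrative, not prescriptive:

```python
from PIL import Image

# Sample large image standing in for an uploaded photo.
img = Image.new("RGB", (3000, 2000), (90, 140, 200))

# Shrink to the largest size actually rendered on the page rather than
# letting the browser scale a huge file down with width/height attributes.
# thumbnail() preserves the aspect ratio and never upscales.
img.thumbnail((1200, 1200))
img.save("hero_1200.jpg", "JPEG", quality=80)

print(img.size)  # (1200, 800) – the 3:2 aspect ratio is kept
```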

Use Lazy Loading and image caching


Lazy loading means that images are only loaded when the user scrolls to the part of the page where the image is located, instead of loading all images at once. This can significantly reduce initial page load times.

Image caching stores images locally, reducing the need to download them every time they are requested. It also reduces the amount of data transferred over the network, resulting in faster loading times and lower bandwidth usage.

Use a Content Delivery Network (CDN)


A CDN distributes your website’s content across a network of servers, which can speed up the loading of your images. When a user sends a request, the CDN serves the images from the server that is geographically closest to them.

Use Image Sprites


Image sprites combine multiple images into a single image, which helps reduce the number of HTTP requests made to the server and improves page load times. For example: the social media icons in a footer.
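A sprite sheet can be assembled with a few lines of Pillow; the solid-color squares below are placeholders for real social-media icons:

```python
from PIL import Image

# Three sample 32x32 "icons" standing in for separate social-media images.
icons = [Image.new("RGB", (32, 32), c) for c in ("red", "green", "blue")]

# Paste them side by side into one sprite sheet; the page then loads a
# single file and shows each icon via a CSS background-position offset.
sprite = Image.new("RGB", (32 * len(icons), 32))
for i, icon in enumerate(icons):
    sprite.paste(icon, (i * 32, 0))

sprite.save("icons_sprite.png")
```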

Reduce the number of images


Consider using fewer images on your pages, or reducing the size of your image galleries, as this can also help to speed up your page loading times.

Use images in next-generation formats


Modern image formats, such as WebP and AVIF, are designed to be more efficient than older formats, such as JPEG and PNG. These formats produce smaller files at comparable quality, which improves performance.
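Converting an existing asset to WebP can also be sketched with Pillow (the gradient image is a stand-in for a real asset, and WebP support depends on Pillow being built with libwebp, which standard wheels are):

```python
import os
from PIL import Image

# Sample gradient image standing in for a real PNG asset.
img = Image.new("RGB", (640, 480))
img.putdata([(x % 256, y % 256, 128) for y in range(480) for x in range(640)])
img.save("asset.png")

# Re-encode as WebP. Pillow's WebP encoder is lossy by default;
# pass lossless=True when exact pixels matter (logos, UI graphics).
img.save("asset.webp", "WEBP", quality=80)

print(os.path.getsize("asset.png"), os.path.getsize("asset.webp"))
```

Serve the WebP variant with a JPEG/PNG fallback for older browsers.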

Remove Metadata


Image metadata such as EXIF data (camera settings, date, time, location of the image, etc.) adds extra bytes to the image. Removing this data decreases the image size without impacting the visual quality. You can use online or offline tools and plugins such as Adobe Photoshop, ExifTool, or GIMP to remove image metadata. Make sure that you do not remove metadata that is essential for maintaining the quality, such as the ICC color profile.
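One simple, tool-free way to strip metadata is to rebuild the image from its raw pixels, since ancillary data is not carried over on save. A Pillow sketch follows; the EXIF tag value and file names are invented for the demo:

```python
from PIL import Image

# Build a sample JPEG carrying an EXIF "camera model" tag (0x0110)
# to stand in for a real photo with metadata.
src = Image.new("RGB", (100, 100), (200, 180, 160))
exif = Image.Exif()
exif[0x0110] = "DemoCamera"
src.save("photo_with_exif.jpg", exif=exif)

# Re-creating the image from its raw pixel data drops EXIF and other
# ancillary metadata when it is saved again.
img = Image.open("photo_with_exif.jpg")
clean = Image.new(img.mode, img.size)
clean.putdata(list(img.getdata()))
clean.save("photo_clean.jpg", "JPEG", quality=90)
```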

Use Vector Graphics


Scaling raster images without losing quality is very difficult; with vector graphics it is easy. Vector graphics are defined by mathematical equations, so they can be scaled freely without losing quality. Choosing vector graphics (such as SVG) instead of JPEG or PNG, where appropriate, can also help improve load time.

Use the correct file format


The image format you choose affects both the size and the quality of your images. JPEG is generally the best format for photographic images, while PNG is better for graphics with few colors or images with transparency and sharp edges. GIF is good for simple animations.




By optimizing your images, you can reduce page loading times, lower bandwidth usage, improve SEO, and provide a better user experience for your website visitors. However, using the right tools and methods is essential so as to not impact the details and quality of your images.